id | guest | title | text | start | end |
---|---|---|---|---|---|
1 | Max Tegmark | Life 3.0 | Yeah, and that's, I think, why it's really exciting | 1:20:44.040 | 1:20:46.640 |
1 | Max Tegmark | Life 3.0 | that you and others are connected | 1:20:46.640 | 1:20:49.440 |
1 | Max Tegmark | Life 3.0 | with some of the work Elon Musk is doing, | 1:20:49.440 | 1:20:51.880 |
1 | Max Tegmark | Life 3.0 | because he's literally going out into that space, | 1:20:51.880 | 1:20:54.480 |
1 | Max Tegmark | Life 3.0 | really exploring our universe, and it's wonderful. | 1:20:54.480 | 1:20:57.000 |
1 | Max Tegmark | Life 3.0 | That is exactly why Elon Musk is so misunderstood, right? | 1:20:57.000 | 1:21:02.000 |
1 | Max Tegmark | Life 3.0 | People misconstrue him as some kind of pessimistic doomsayer. | 1:21:02.000 | 1:21:05.000 |
1 | Max Tegmark | Life 3.0 | The reason he cares so much about AI safety | 1:21:05.000 | 1:21:07.640 |
1 | Max Tegmark | Life 3.0 | is because he more than almost anyone else appreciates | 1:21:07.640 | 1:21:12.080 |
1 | Max Tegmark | Life 3.0 | these amazing opportunities that we'll squander | 1:21:12.080 | 1:21:14.280 |
1 | Max Tegmark | Life 3.0 | if we wipe out here on Earth. | 1:21:14.280 | 1:21:16.640 |
1 | Max Tegmark | Life 3.0 | We're not just going to wipe out the next generation, | 1:21:16.640 | 1:21:19.680 |
1 | Max Tegmark | Life 3.0 | all generations, and this incredible opportunity | 1:21:19.680 | 1:21:23.320 |
1 | Max Tegmark | Life 3.0 | that's out there, and that would really be a waste. | 1:21:23.320 | 1:21:25.400 |
1 | Max Tegmark | Life 3.0 | And AI, for people who think that it would be better | 1:21:25.400 | 1:21:30.080 |
1 | Max Tegmark | Life 3.0 | to do without technology, let me just mention that | 1:21:30.080 | 1:21:33.600 |
1 | Max Tegmark | Life 3.0 | if we don't improve our technology, | 1:21:34.680 | 1:21:36.320 |
1 | Max Tegmark | Life 3.0 | the question isn't whether humanity is going to go extinct. | 1:21:36.320 | 1:21:39.320 |
1 | Max Tegmark | Life 3.0 | The question is just whether we're going to get taken out | 1:21:39.320 | 1:21:41.160 |
1 | Max Tegmark | Life 3.0 | by the next big asteroid or the next super volcano | 1:21:41.160 | 1:21:44.800 |
1 | Max Tegmark | Life 3.0 | or something else dumb that we could easily prevent | 1:21:44.800 | 1:21:48.280 |
1 | Max Tegmark | Life 3.0 | with more tech, right? | 1:21:48.280 | 1:21:49.840 |
1 | Max Tegmark | Life 3.0 | And if we want life to flourish throughout the cosmos, | 1:21:49.840 | 1:21:53.160 |
1 | Max Tegmark | Life 3.0 | AI is the key to it. | 1:21:53.160 | 1:21:54.760 |
1 | Max Tegmark | Life 3.0 | As I mentioned in a lot of detail in my book right there, | 1:21:56.120 | 1:21:59.840 |
1 | Max Tegmark | Life 3.0 | even many of the most inspired sci fi writers, | 1:21:59.840 | 1:22:04.840 |
1 | Max Tegmark | Life 3.0 | I feel have totally underestimated the opportunities | 1:22:04.880 | 1:22:08.120 |
1 | Max Tegmark | Life 3.0 | for space travel, especially to other galaxies, | 1:22:08.120 | 1:22:11.240 |
1 | Max Tegmark | Life 3.0 | because they weren't thinking about the possibility of AGI, | 1:22:11.240 | 1:22:15.360 |
1 | Max Tegmark | Life 3.0 | which just makes it so much easier. | 1:22:15.360 | 1:22:17.520 |
1 | Max Tegmark | Life 3.0 | Right, yeah. | 1:22:17.520 | 1:22:18.440 |
1 | Max Tegmark | Life 3.0 | So that goes to your view of AGI that enables our progress, | 1:22:18.440 | 1:22:23.440 |
1 | Max Tegmark | Life 3.0 | that enables a better life. | 1:22:24.080 | 1:22:25.760 |
1 | Max Tegmark | Life 3.0 | So that's a beautiful way to put it | 1:22:25.760 | 1:22:28.320 |
1 | Max Tegmark | Life 3.0 | and then something to strive for. | 1:22:28.320 | 1:22:29.960 |
1 | Max Tegmark | Life 3.0 | So Max, thank you so much. | 1:22:29.960 | 1:22:31.440 |
1 | Max Tegmark | Life 3.0 | Thank you for your time today. | 1:22:31.440 | 1:22:32.560 |
1 | Max Tegmark | Life 3.0 | It's been awesome. | 1:22:32.560 | 1:22:33.560 |
1 | Max Tegmark | Life 3.0 | Thank you so much. | 1:22:33.560 | 1:22:34.400 |
1 | Max Tegmark | Life 3.0 | Thanks. | 1:22:34.400 | 1:22:35.240 |
1 | Max Tegmark | Life 3.0 | Have a great day. | 1:22:35.240 | 1:22:40.240 |
1 | Max Tegmark | Life 3.0 | got cleaned out of the gene pool, right? | 20:00.560 | 20:02.960 |
1 | Max Tegmark | Life 3.0 | But if you build an artificial general intelligence | 20:02.960 | 20:06.880 |
1 | Max Tegmark | Life 3.0 | the mind space that you can design is much, much larger | 20:06.880 | 20:10.040 |
1 | Max Tegmark | Life 3.0 | than just a specific subset of minds that can evolve. | 20:10.040 | 20:14.440 |
1 | Max Tegmark | Life 3.0 | So an AGI mind doesn't necessarily have | 20:14.440 | 20:17.280 |
1 | Max Tegmark | Life 3.0 | to have any self preservation instinct. | 20:17.280 | 20:19.880 |
1 | Max Tegmark | Life 3.0 | It also doesn't necessarily have to be | 20:19.880 | 20:21.600 |
1 | Max Tegmark | Life 3.0 | so individualistic as us. | 20:21.600 | 20:24.040 |
1 | Max Tegmark | Life 3.0 | Like, imagine if you could just, first of all, | 20:24.040 | 20:26.080 |
1 | Max Tegmark | Life 3.0 | or we are also very afraid of death. | 20:26.080 | 20:27.960 |
1 | Max Tegmark | Life 3.0 | You know, I suppose you could back yourself up | 20:27.960 | 20:29.920 |
1 | Max Tegmark | Life 3.0 | every five minutes and then your airplane | 20:29.920 | 20:32.000 |
1 | Max Tegmark | Life 3.0 | is about to crash. | 20:32.000 | 20:32.840 |
1 | Max Tegmark | Life 3.0 | You're like, shucks, I'm gonna lose the last five minutes | 20:32.840 | 20:36.680 |
1 | Max Tegmark | Life 3.0 | of experiences since my last cloud backup, dang. | 20:36.680 | 20:39.520 |
1 | Max Tegmark | Life 3.0 | You know, it's not as big a deal. | 20:39.520 | 20:41.520 |
1 | Max Tegmark | Life 3.0 | Or if we could just copy experiences between our minds | 20:41.520 | 20:45.680 |
1 | Max Tegmark | Life 3.0 | easily like we, which we could easily do | 20:45.680 | 20:47.640 |
1 | Max Tegmark | Life 3.0 | if we were silicon based, right? | 20:47.640 | 20:50.360 |
1 | Max Tegmark | Life 3.0 | Then maybe we would feel a little bit more | 20:50.360 | 20:54.040 |
1 | Max Tegmark | Life 3.0 | like a hive mind actually, that maybe it's the, | 20:54.040 | 20:56.560 |
1 | Max Tegmark | Life 3.0 | so I don't think we should take for granted at all | 20:56.560 | 20:59.960 |
1 | Max Tegmark | Life 3.0 | that AGI will have to have any of those sort of | 20:59.960 | 21:03.000 |
1 | Max Tegmark | Life 3.0 | competitive alpha male instincts. | 21:04.880 | 21:07.360 |
1 | Max Tegmark | Life 3.0 | On the other hand, you know, this is really interesting | 21:07.360 | 21:10.160 |
1 | Max Tegmark | Life 3.0 | because I think some people go too far and say, | 21:10.160 | 21:13.840 |
1 | Max Tegmark | Life 3.0 | of course we don't have to have any concerns either | 21:13.840 | 21:16.680 |
1 | Max Tegmark | Life 3.0 | that advanced AI will have those instincts | 21:16.680 | 21:20.800 |
1 | Max Tegmark | Life 3.0 | because we can build anything we want. | 21:20.800 | 21:22.680 |
1 | Max Tegmark | Life 3.0 | That there's a very nice set of arguments going back | 21:22.680 | 21:26.280 |
1 | Max Tegmark | Life 3.0 | to Steve Omohundro and Nick Bostrom and others | 21:26.280 | 21:28.560 |
1 | Max Tegmark | Life 3.0 | just pointing out that when we build machines, | 21:28.560 | 21:32.280 |
1 | Max Tegmark | Life 3.0 | we normally build them with some kind of goal, you know, | 21:32.280 | 21:34.680 |
1 | Max Tegmark | Life 3.0 | win this chess game, drive this car safely or whatever. | 21:34.680 | 21:38.520 |
1 | Max Tegmark | Life 3.0 | And as soon as you put a goal into a machine, | 21:38.520 | 21:40.960 |
1 | Max Tegmark | Life 3.0 | especially if it's kind of open ended goal | 21:40.960 | 21:42.760 |
1 | Max Tegmark | Life 3.0 | and the machine is very intelligent, | 21:42.760 | 21:44.640 |
1 | Max Tegmark | Life 3.0 | it'll break that down into a bunch of sub goals. | 21:44.640 | 21:47.000 |
1 | Max Tegmark | Life 3.0 | And one of those goals will almost always | 21:48.280 | 21:51.280 |
1 | Max Tegmark | Life 3.0 | be self preservation because if it breaks or dies | 21:51.280 | 21:54.200 |
1 | Max Tegmark | Life 3.0 | in the process, it's not gonna accomplish the goal, right? | 21:54.200 | 21:56.120 |
1 | Max Tegmark | Life 3.0 | Like suppose you just build a little, | 21:56.120 | 21:58.040 |
1 | Max Tegmark | Life 3.0 | you have a little robot and you tell it to go down | 21:58.040 | 22:01.000 |
1 | Max Tegmark | Life 3.0 | the supermarket here and get you some food, | 22:01.000 | 22:04.040 |
1 | Max Tegmark | Life 3.0 | make you cook an Italian dinner, you know, | 22:04.040 | 22:06.200 |
1 | Max Tegmark | Life 3.0 | and then someone mugs it and tries to break it | 22:06.200 | 22:08.400 |
1 | Max Tegmark | Life 3.0 | on the way. | 22:08.400 | 22:09.480 |
1 | Max Tegmark | Life 3.0 | That robot has an incentive to not get destroyed | 22:09.480 | 22:12.920 |
1 | Max Tegmark | Life 3.0 | and defend itself or run away, | 22:12.920 | 22:14.720 |
1 | Max Tegmark | Life 3.0 | because otherwise it's gonna fail in cooking your dinner. | 22:14.720 | 22:17.720 |
1 | Max Tegmark | Life 3.0 | It's not afraid of death, | 22:17.720 | 22:19.560 |
1 | Max Tegmark | Life 3.0 | but it really wants to complete the dinner cooking goal. | 22:19.560 | 22:22.960 |
1 | Max Tegmark | Life 3.0 | So it will have a self preservation instinct. | 22:22.960 | 22:25.040 |
1 | Max Tegmark | Life 3.0 | Continue being a functional agent somehow. | 22:25.040 | 22:27.920 |
1 | Max Tegmark | Life 3.0 | And similarly, if you give any kind of more ambitious goal | 22:27.920 | 22:32.920 |
1 | Max Tegmark | Life 3.0 | to an AGI, it's very likely they wanna acquire | 22:33.720 | 22:37.000 |
1 | Max Tegmark | Life 3.0 | more resources so it can do that better. | 22:37.000 | 22:39.840 |
1 | Max Tegmark | Life 3.0 | And it's exactly from those sort of sub goals | 22:39.840 | 22:42.720 |
1 | Max Tegmark | Life 3.0 | that we might not have intended | 22:42.720 | 22:43.800 |