id (int64, 1–325) | guest (string, 3–45 chars, nullable) | title (string, 3–75 chars, nullable) | text (string, 1–528 chars) | start (string, 9–11 chars) | end (string, 9–11 chars) |
---|---|---|---|---|---|
1 | Max Tegmark | Life 3.0 | that some of the concerns about AGI safety come. | 22:43.800 | 22:47.160 |
1 | Max Tegmark | Life 3.0 | You give it some goal that seems completely harmless. | 22:47.160 | 22:50.600 |
1 | Max Tegmark | Life 3.0 | And then before you realize it, | 22:50.600 | 22:53.360 |
1 | Max Tegmark | Life 3.0 | it's also trying to do these other things | 22:53.360 | 22:55.480 |
1 | Max Tegmark | Life 3.0 | which you didn't want it to do. | 22:55.480 | 22:56.920 |
1 | Max Tegmark | Life 3.0 | And it's maybe smarter than us. | 22:56.920 | 22:59.160 |
1 | Max Tegmark | Life 3.0 | So it's fascinating. | 22:59.160 | 23:01.000 |
1 | Max Tegmark | Life 3.0 | And let me pause just because I am in a very kind | 23:01.000 | 23:05.680 |
1 | Max Tegmark | Life 3.0 | of human centric way, see fear of death | 23:05.680 | 23:08.720 |
1 | Max Tegmark | Life 3.0 | as a valuable motivator. | 23:08.720 | 23:11.840 |
1 | Max Tegmark | Life 3.0 | So you don't think, you think that's an artifact | 23:11.840 | 23:16.440 |
1 | Max Tegmark | Life 3.0 | of evolution, so that's the kind of mind space | 23:16.440 | 23:19.120 |
1 | Max Tegmark | Life 3.0 | evolution created that we're sort of almost obsessed | 23:19.120 | 23:22.120 |
1 | Max Tegmark | Life 3.0 | about self preservation, some kind of genetic flow. | 23:22.120 | 23:24.400 |
1 | Max Tegmark | Life 3.0 | You don't think that's necessary to be afraid of death. | 23:24.400 | 23:29.400 |
1 | Max Tegmark | Life 3.0 | So not just a kind of sub goal of self preservation | 23:29.480 | 23:32.920 |
1 | Max Tegmark | Life 3.0 | just so you can keep doing the thing, | 23:32.920 | 23:34.920 |
1 | Max Tegmark | Life 3.0 | but more fundamentally sort of have the finite thing | 23:34.920 | 23:38.720 |
1 | Max Tegmark | Life 3.0 | like this ends for you at some point. | 23:38.720 | 23:43.080 |
1 | Max Tegmark | Life 3.0 | Interesting. | 23:43.080 | 23:44.160 |
1 | Max Tegmark | Life 3.0 | Do I think it's necessary for what precisely? | 23:44.160 | 23:47.440 |
1 | Max Tegmark | Life 3.0 | For intelligence, but also for consciousness. | 23:47.440 | 23:50.920 |
1 | Max Tegmark | Life 3.0 | So for those, for both, do you think really | 23:50.920 | 23:55.040 |
1 | Max Tegmark | Life 3.0 | like a finite death and the fear of it is important? | 23:55.040 | 23:59.120 |
1 | Max Tegmark | Life 3.0 | So before I can answer, before we can agree | 23:59.120 | 24:04.120 |
1 | Max Tegmark | Life 3.0 | on whether it's necessary for intelligence | 24:05.160 | 24:06.960 |
1 | Max Tegmark | Life 3.0 | or for consciousness, we should be clear | 24:06.960 | 24:08.360 |
1 | Max Tegmark | Life 3.0 | on how we define those two words. | 24:08.360 | 24:09.800 |
1 | Max Tegmark | Life 3.0 | Cause a lot of really smart people define them | 24:09.800 | 24:11.960 |
1 | Max Tegmark | Life 3.0 | in very different ways. | 24:11.960 | 24:13.320 |
1 | Max Tegmark | Life 3.0 | I was on this panel with AI experts | 24:13.320 | 24:17.080 |
1 | Max Tegmark | Life 3.0 | and they couldn't agree on how to define intelligence even. | 24:17.080 | 24:20.080 |
1 | Max Tegmark | Life 3.0 | So I define intelligence simply | 24:20.080 | 24:22.000 |
1 | Max Tegmark | Life 3.0 | as the ability to accomplish complex goals. | 24:22.000 | 24:24.760 |
1 | Max Tegmark | Life 3.0 | I like your broad definition, because again | 24:25.640 | 24:27.280 |
1 | Max Tegmark | Life 3.0 | I don't want to be a carbon chauvinist. | 24:27.280 | 24:29.040 |
1 | Max Tegmark | Life 3.0 | Right. | 24:29.040 | 24:30.400 |
1 | Max Tegmark | Life 3.0 | And in that case, no, certainly | 24:30.400 | 24:34.600 |
1 | Max Tegmark | Life 3.0 | it doesn't require fear of death. | 24:34.600 | 24:36.480 |
1 | Max Tegmark | Life 3.0 | I would say alpha go, alpha zero is quite intelligent. | 24:36.480 | 24:40.120 |
1 | Max Tegmark | Life 3.0 | I don't think alpha zero has any fear of being turned off | 24:40.120 | 24:43.080 |
1 | Max Tegmark | Life 3.0 | because it doesn't understand the concept of it even. | 24:43.080 | 24:46.320 |
1 | Max Tegmark | Life 3.0 | And similarly consciousness. | 24:46.320 | 24:48.440 |
1 | Max Tegmark | Life 3.0 | I mean, you could certainly imagine very simple | 24:48.440 | 24:52.240 |
1 | Max Tegmark | Life 3.0 | kind of experience. | 24:52.240 | 24:53.920 |
1 | Max Tegmark | Life 3.0 | If certain plants have any kind of experience | 24:53.920 | 24:57.200 |
1 | Max Tegmark | Life 3.0 | I don't think they're very afraid of dying | 24:57.200 | 24:58.560 |
1 | Max Tegmark | Life 3.0 | or there's nothing they can do about it anyway much. | 24:58.560 | 25:00.920 |
1 | Max Tegmark | Life 3.0 | So there wasn't that much value in, but more seriously | 25:00.920 | 25:04.560 |
1 | Max Tegmark | Life 3.0 | I think if you ask, not just about being conscious | 25:04.560 | 25:09.200 |
1 | Max Tegmark | Life 3.0 | but maybe having what you would, we might call | 25:09.200 | 25:14.200 |
1 | Max Tegmark | Life 3.0 | an exciting life where you feel passion | 25:14.320 | 25:16.400 |
1 | Max Tegmark | Life 3.0 | and really appreciate the things. | 25:16.400 | 25:21.400 |
1 | Max Tegmark | Life 3.0 | Maybe there somehow, maybe there perhaps it does help | 25:21.480 | 25:24.440 |
1 | Max Tegmark | Life 3.0 | having a backdrop that, Hey, it's finite. | 25:24.440 | 25:27.880 |
1 | Max Tegmark | Life 3.0 | No, let's make the most of this, let's live to the fullest. | 25:27.880 | 25:31.200 |
1 | Max Tegmark | Life 3.0 | So if you knew you were going to live forever | 25:31.200 | 25:33.800 |
1 | Max Tegmark | Life 3.0 | do you think you would change your? | 25:34.880 | 25:37.400 |
1 | Max Tegmark | Life 3.0 | Yeah, I mean, in some perspective | 25:37.400 | 25:39.560 |
1 | Max Tegmark | Life 3.0 | it would be an incredibly boring life living forever. | 25:39.560 | 25:43.960 |
1 | Max Tegmark | Life 3.0 | So in the sort of loose subjective terms that you said | 25:43.960 | 25:47.360 |
1 | Max Tegmark | Life 3.0 | of something exciting and something in this | 25:47.360 | 25:50.480 |
1 | Max Tegmark | Life 3.0 | that other humans would understand, I think is, yeah | 25:50.480 | 25:53.240 |
1 | Max Tegmark | Life 3.0 | it seems that the finiteness of it is important. | 25:53.240 | 25:57.120 |
1 | Max Tegmark | Life 3.0 | Well, the good news I have for you then is | 25:57.120 | 25:59.560 |
1 | Max Tegmark | Life 3.0 | based on what we understand about cosmology | 25:59.560 | 26:02.120 |
1 | Max Tegmark | Life 3.0 | everything is in our universe is probably | 26:02.120 | 26:05.120 |
1 | Max Tegmark | Life 3.0 | ultimately probably finite, although. | 26:05.120 | 26:07.960 |
1 | Max Tegmark | Life 3.0 | Big crunch or big, what's the, the infinite expansion. | 26:07.960 | 26:11.560 |
1 | Max Tegmark | Life 3.0 | Yeah, we could have a big chill or a big crunch | 26:11.560 | 26:13.840 |
1 | Max Tegmark | Life 3.0 | or a big rip or that's the big snap or death bubbles. | 26:13.840 | 26:18.440 |
1 | Max Tegmark | Life 3.0 | All of them are more than a billion years away. | 26:18.440 | 26:20.040 |
1 | Max Tegmark | Life 3.0 | So we should, we certainly have vastly more time | 26:20.040 | 26:24.600 |
1 | Max Tegmark | Life 3.0 | than our ancestors thought, but there is still | 26:24.600 | 26:27.920 |
1 | Max Tegmark | Life 3.0 | it's still pretty hard to squeeze in an infinite number | 26:29.160 | 26:32.360 |
1 | Max Tegmark | Life 3.0 | of compute cycles, even though there are some loopholes | 26:32.360 | 26:36.560 |
1 | Max Tegmark | Life 3.0 | that just might be possible. | 26:36.560 | 26:37.720 |
1 | Max Tegmark | Life 3.0 | But I think, you know, some people like to say | 26:37.720 | 26:41.960 |
1 | Max Tegmark | Life 3.0 | that you should live as if you're about to | 26:41.960 | 26:44.760 |
1 | Max Tegmark | Life 3.0 | you're going to die in five years or so. | 26:44.760 | 26:46.720 |
1 | Max Tegmark | Life 3.0 | And that's sort of optimal. | 26:46.720 | 26:47.960 |
1 | Max Tegmark | Life 3.0 | Maybe it's a good assumption. | 26:47.960 | 26:50.560 |
1 | Max Tegmark | Life 3.0 | We should build our civilization as if it's all finite | 26:50.560 | 26:54.680 |
1 | Max Tegmark | Life 3.0 | to be on the safe side. | 26:54.680 | 26:55.680 |
1 | Max Tegmark | Life 3.0 | Right, exactly. | 26:55.680 | 26:56.960 |
1 | Max Tegmark | Life 3.0 | So you mentioned defining intelligence | 26:56.960 | 26:59.720 |
1 | Max Tegmark | Life 3.0 | as the ability to solve complex goals. | 26:59.720 | 27:02.960 |
1 | Max Tegmark | Life 3.0 | Where would you draw a line or how would you try | 27:02.960 | 27:05.440 |
1 | Max Tegmark | Life 3.0 | to define human level intelligence | 27:05.440 | 27:08.200 |
1 | Max Tegmark | Life 3.0 | and superhuman level intelligence? | 27:08.200 | 27:10.680 |
1 | Max Tegmark | Life 3.0 | Where is consciousness part of that definition? | 27:10.680 | 27:13.280 |
1 | Max Tegmark | Life 3.0 | No, consciousness does not come into this definition. | 27:13.280 | 27:16.640 |
1 | Max Tegmark | Life 3.0 | So, so I think of intelligence as it's a spectrum | 27:16.640 | 27:20.280 |
1 | Max Tegmark | Life 3.0 | but there are very many different kinds of goals | 27:20.280 | 27:21.960 |
1 | Max Tegmark | Life 3.0 | you can have. | 27:21.960 | 27:22.800 |
1 | Max Tegmark | Life 3.0 | You can have a goal to be a good chess player | 27:22.800 | 27:24.000 |
1 | Max Tegmark | Life 3.0 | a good goal player, a good car driver, a good investor | 27:24.000 | 27:28.520 |
1 | Max Tegmark | Life 3.0 | good poet, et cetera. | 27:28.520 | 27:31.160 |
1 | Max Tegmark | Life 3.0 | So intelligence that by its very nature | 27:31.160 | 27:34.320 |
1 | Max Tegmark | Life 3.0 | isn't something you can measure by this one number | 27:34.320 | 27:36.680 |
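The start and end columns appear to encode elapsed time as minutes:seconds.milliseconds, possibly with an hours prefix for longer recordings (consistent with the 9–11 character lengths noted in the header). Below is a minimal sketch of converting such timestamps to seconds, assuming that format; the `timestamp_to_seconds` helper is hypothetical and not part of the dataset.

```python
# Hypothetical helper, assuming timestamps look like "22:43.800" (MM:SS.mmm)
# or "1:02:43.800" (H:MM:SS.mmm); this format is inferred from the rows above.
def timestamp_to_seconds(ts: str) -> float:
    parts = ts.split(":")
    if len(parts) == 2:          # "MM:SS.mmm"
        minutes, seconds = parts
        return int(minutes) * 60 + float(seconds)
    if len(parts) == 3:          # "H:MM:SS.mmm"
        hours, minutes, seconds = parts
        return int(hours) * 3600 + int(minutes) * 60 + float(seconds)
    raise ValueError(f"Unrecognized timestamp: {ts!r}")

# Example: duration of the first row shown above (~3.36 seconds).
print(timestamp_to_seconds("22:47.160") - timestamp_to_seconds("22:43.800"))
```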