WEBVTT

1
00:00:00.090 --> 00:00:01.270
- So Elon Musk

2
00:00:02.160 --> 00:00:04.410
buys Twitter, do you
think he regrets it now?

3
00:00:04.410 --> 00:00:05.610
All the effort that went into it,

4
00:00:05.610 --> 00:00:08.010
all the money that he's lost,
or is it worth it because

5
00:00:08.010 --> 00:00:09.870
it's such a bigger platform for him?

6
00:00:09.870 --> 00:00:11.430
- You know, there are times he regrets it.

7
00:00:11.430 --> 00:00:13.980
There are times he says,
"Man, AI is so important,

8
00:00:13.980 --> 00:00:16.110
I should be focusing on that.

9
00:00:16.110 --> 00:00:19.890
I wish I weren't having to
spend so much time on Twitter."

10
00:00:19.890 --> 00:00:21.810
But then there's times, like,
he's a giddy child.

11
00:00:21.810 --> 00:00:23.970
He's got the biggest
playground in the world.

12
00:00:23.970 --> 00:00:27.090
He used to be beaten up when
he was a kid on the playground.

13
00:00:27.090 --> 00:00:31.020
Now he owns it, and he's doing
things that he really likes

14
00:00:31.020 --> 00:00:34.980
at Twitter, turning it
into a payments platform,

15
00:00:34.980 --> 00:00:37.830
a place for content, like video.

16
00:00:37.830 --> 00:00:41.460
So, a lot of the times he's
truly excited about it.

17
00:00:41.460 --> 00:00:43.260
- Is part of changing the name to X

18
00:00:43.260 --> 00:00:44.910
sort of erasing what happened before

19
00:00:44.910 --> 00:00:46.230
and putting his own stamp on it,

20
00:00:46.230 --> 00:00:48.990
almost as if he's starting
it like from scratch?

21
00:00:48.990 --> 00:00:52.110
- Well, partly it's, he wants to create

22
00:00:52.110 --> 00:00:54.060
what he tried to do 20 years ago.

23
00:00:54.060 --> 00:00:57.690
When he had a company called
X that morphed into PayPal,

24
00:00:57.690 --> 00:01:00.420
and he wanted it to not
just be a payments platform,

25
00:01:00.420 --> 00:01:02.490
but a social media, and payments,

26
00:01:02.490 --> 00:01:06.240
and content creation
platform, sort of like WeChat

27
00:01:06.240 --> 00:01:07.407
or some of the Asian-
- The everything app.

28
00:01:07.407 --> 00:01:08.850
- The everything app.

29
00:01:08.850 --> 00:01:11.670
And now he thinks he's
making the everything app.

30
00:01:11.670 --> 00:01:14.490
- Artificial intelligence,
it's the hot topic right now.

31
00:01:14.490 --> 00:01:17.430
I feel like it's one of the
top topics on Elon Musk's mind.

32
00:01:17.430 --> 00:01:18.930
Obviously he has his own AI companies,

33
00:01:18.930 --> 00:01:22.233
he's invested in them from very
early on, including OpenAI.

34
00:01:23.190 --> 00:01:26.520
He has this vision of AI
as being almost apocalyptic

35
00:01:26.520 --> 00:01:27.630
if misused.

36
00:01:27.630 --> 00:01:29.880
Has he sold you in your conversations

37
00:01:29.880 --> 00:01:32.670
on his view of AI being

38
00:01:32.670 --> 00:01:35.550
dangerous if not carefully
controlled or regulated?

39
00:01:35.550 --> 00:01:37.710
- Well, I think it may be dangerous

40
00:01:37.710 --> 00:01:40.740
but am I ever gonna be as
apocalyptic as Elon Musk?

41
00:01:40.740 --> 00:01:42.780
I mean, he gets apocalyptic about things.

42
00:01:42.780 --> 00:01:45.930
He read Isaac Asimov novels far too often

43
00:01:45.930 --> 00:01:48.660
when he was a young kid
sitting in the corner

44
00:01:48.660 --> 00:01:51.060
of the bookstore with no friends.

45
00:01:51.060 --> 00:01:55.860
And ever since he and Demis
Hassabis and then Sam Altman

46
00:01:55.860 --> 00:01:57.900
were getting involved
in this 10 years ago,

47
00:01:57.900 --> 00:02:01.800
he felt that artificial
intelligence could wander off

48
00:02:01.800 --> 00:02:04.830
without us if we didn't put guardrails in.

49
00:02:04.830 --> 00:02:06.240
And that's what he's worried about now.

50
00:02:06.240 --> 00:02:08.760
- If we didn't make it sort
of connect with us in a way.

51
00:02:08.760 --> 00:02:10.590
- Right, and that's
what Neuralink is about.

52
00:02:10.590 --> 00:02:12.690
A chip in the brain that
connects you directly

53
00:02:12.690 --> 00:02:15.630
to your computer so that
there's an alignment,

54
00:02:15.630 --> 00:02:18.300
as you would say, between
your will and agency

55
00:02:18.300 --> 00:02:19.860
and what your computers are doing.

56
00:02:19.860 --> 00:02:22.140
- What we want as people
and what the computers want

57
00:02:22.140 --> 00:02:23.310
need to be in alignment.

58
00:02:23.310 --> 00:02:24.750
- Totally in alignment.

59
00:02:24.750 --> 00:02:27.240
And if you read a lot of science fiction,

60
00:02:27.240 --> 00:02:29.340
you worry that that ain't gonna happen.

61
00:02:29.340 --> 00:02:30.330
- A hundred years from now,

62
00:02:30.330 --> 00:02:33.000
another biographer is
writing about Elon Musk from

63
00:02:33.000 --> 00:02:34.080
further away in the future.

64
00:02:34.080 --> 00:02:36.630
Have we sanded off the rough edges?

65
00:02:36.630 --> 00:02:38.910
Has he become seen as a net force for good

66
00:02:38.910 --> 00:02:42.060
or do his darker impulses, that dark mode,

67
00:02:42.060 --> 00:02:44.280
does that win out in our historical view?

68
00:02:44.280 --> 00:02:46.830
- Well, in the end, the historical view

69
00:02:46.830 --> 00:02:48.810
is about the impact you had.

70
00:02:48.810 --> 00:02:53.580
His dark mode, him being a jerk
or even an asshole at times,

71
00:02:53.580 --> 00:02:55.170
that'll be known.

72
00:02:55.170 --> 00:03:00.170
But if he truly gets rocket
ships that can get astronauts

73
00:03:00.390 --> 00:03:03.870
into orbit very regularly,
eventually gets one to Mars,

74
00:03:03.870 --> 00:03:07.260
eventually brings us into
self-driving vehicles.

75
00:03:07.260 --> 00:03:10.290
Brings us into the era of all EVs.

76
00:03:10.290 --> 00:03:14.460
I think he'll be seen, you
can go back to Henry Ford,

77
00:03:14.460 --> 00:03:17.430
horrible guy in many
ways, very rough edges.

78
00:03:17.430 --> 00:03:20.100
But we remember Henry
Ford more for the Model T

79
00:03:20.100 --> 00:03:21.540
and the assembly line,

80
00:03:21.540 --> 00:03:25.164
even though we should
remember the dark sides, too.

81
00:03:25.164 --> 00:03:28.247
(airy digital music)