WEBVTT

1
00:00:00.150 --> 00:00:01.320
- [Narrator] This is Future Tech

2
00:00:01.320 --> 00:00:03.570
where each week we
discuss the good, the bad

3
00:00:03.570 --> 00:00:05.730
and the ugly of where tech is headed

4
00:00:05.730 --> 00:00:07.803
in 2023 and beyond.

5
00:00:09.090 --> 00:00:11.122
- AI continues to be the biggest story,

6
00:00:11.122 --> 00:00:13.620
and we have a lot to break
down for you this week.

7
00:00:13.620 --> 00:00:15.840
Our fearless leader over at Gizmodo,

8
00:00:15.840 --> 00:00:17.550
Dave Ewalt joins us now to break down

9
00:00:17.550 --> 00:00:19.097
some of the things that
we've seen this past week.

10
00:00:19.097 --> 00:00:22.920
Notably some of the top
leaders in tech and AI

11
00:00:22.920 --> 00:00:24.840
actually published some
sort of open letter, right?

12
00:00:24.840 --> 00:00:28.170
Saying, "Can we just calm
down on the AI for now?"

13
00:00:28.170 --> 00:00:29.430
What more can you tell us?

14
00:00:29.430 --> 00:00:31.980
- So there's a group called
the Future of Life Institute,

15
00:00:31.980 --> 00:00:35.250
and they're very focused
on the development of AI

16
00:00:35.250 --> 00:00:36.150
and issues related to it.

17
00:00:36.150 --> 00:00:37.530
They put out an open letter this week

18
00:00:37.530 --> 00:00:40.650
signed by more than 500, at this point,

19
00:00:40.650 --> 00:00:42.990
different tech luminaries, you know,

20
00:00:42.990 --> 00:00:44.580
real engineering experts.

21
00:00:44.580 --> 00:00:48.418
People like Steve Wozniak,
Elon Musk also signed it,

22
00:00:48.418 --> 00:00:50.700
a bunch of MIT people.

23
00:00:50.700 --> 00:00:53.130
So these are very thoughtful
people who know a lot

24
00:00:53.130 --> 00:00:54.570
about this area of technology,

25
00:00:54.570 --> 00:00:56.216
and they're expressing some concern

26
00:00:56.216 --> 00:00:59.310
that we're moving too fast
in the development of AI

27
00:00:59.310 --> 00:01:01.170
and suggesting that we slow down.

28
00:01:01.170 --> 00:01:02.370
- It seems like they've been

29
00:01:02.370 --> 00:01:03.960
kind of "can't stop, won't stop," right?

30
00:01:03.960 --> 00:01:05.700
The folks in the AI world.

31
00:01:05.700 --> 00:01:09.180
Let's see if that temporary
moratorium actually does work.

32
00:01:09.180 --> 00:01:12.061
But in that letter, I think
they even use the quote,

33
00:01:12.061 --> 00:01:14.580
"potential risks to society."

34
00:01:14.580 --> 00:01:16.620
I mean, what are some of the concerns?

35
00:01:16.620 --> 00:01:17.453
- Yeah, so here's the thing.

36
00:01:17.453 --> 00:01:19.140
When we talk about risks from AI,

37
00:01:19.140 --> 00:01:22.140
people automatically go
to "Terminator," right?

38
00:01:22.140 --> 00:01:23.850
They're like, "Oh, we're creating Skynet.

39
00:01:23.850 --> 00:01:26.831
AI is gonna kill all of us puny humans."

40
00:01:26.831 --> 00:01:29.010
And I don't think anybody really

41
00:01:29.010 --> 00:01:31.290
takes that scenario too seriously.

42
00:01:31.290 --> 00:01:32.910
What they're really
talking about is the fact

43
00:01:32.910 --> 00:01:34.110
that we're creating these tools

44
00:01:34.110 --> 00:01:35.670
that we really don't understand yet,

45
00:01:35.670 --> 00:01:37.175
and we don't understand the consequences.

46
00:01:37.175 --> 00:01:41.094
So for instance, are we
creating automations for things

47
00:01:41.094 --> 00:01:43.440
that are going to take
all of our jobs away?

48
00:01:43.440 --> 00:01:45.480
Not just the small jobs
nobody cares about,

49
00:01:45.480 --> 00:01:48.420
but, like, are we developing
these tools so fast,

50
00:01:48.420 --> 00:01:50.070
that 10 years from now we're gonna find

51
00:01:50.070 --> 00:01:52.140
out we've completely
destroyed our economy?

52
00:01:52.140 --> 00:01:54.397
'Cause now 50% of the
population doesn't have jobs

53
00:01:54.397 --> 00:01:56.108
'cause AI does those jobs.

54
00:01:56.108 --> 00:01:57.870
They're also very worried

55
00:01:57.870 --> 00:02:00.630
about disinformation and propaganda.

56
00:02:00.630 --> 00:02:02.610
And that's some of what
we've seen already

57
00:02:02.610 --> 00:02:04.440
with these AIs: it's very easy

58
00:02:04.440 --> 00:02:06.744
to make fake videos, fake pictures.

59
00:02:06.744 --> 00:02:09.600
- Yeah, I mean, definitely
some real concerns there.

60
00:02:09.600 --> 00:02:11.455
I mean, with AI, some of
the images and videos

61
00:02:11.455 --> 00:02:13.080
can be fun, though.

62
00:02:13.080 --> 00:02:14.820
And then, you know, some
can be a bit damaging.

63
00:02:14.820 --> 00:02:15.876
I mean, we saw

64
00:02:15.876 --> 00:02:19.212
the Pope recently sporting
some sort of, like,

65
00:02:19.212 --> 00:02:21.750
dapper white puffer coat.

66
00:02:21.750 --> 00:02:22.800
- Yeah, you know, they're funny,

67
00:02:22.800 --> 00:02:26.639
and it's amazing what
these AIs can produce.

68
00:02:26.639 --> 00:02:29.790
That one you're referring to of the Pope

69
00:02:29.790 --> 00:02:32.160
in the big white puffer
coat went super viral,

70
00:02:32.160 --> 00:02:32.993
'cause it was really funny,

71
00:02:32.993 --> 00:02:34.230
and because it looked completely real.

72
00:02:34.230 --> 00:02:35.102
Everyone was like, "Oh, look,

73
00:02:35.102 --> 00:02:38.040
the Pope wears this awesome jacket."

74
00:02:38.040 --> 00:02:39.720
It was totally fake though.

75
00:02:39.720 --> 00:02:42.820
And that was a harmless
example of everybody falling

76
00:02:42.820 --> 00:02:45.075
for one of these fake images.

77
00:02:45.075 --> 00:02:47.991
But the fact that you can
create those sorts of things

78
00:02:47.991 --> 00:02:52.710
as fast as you can now, it
doesn't take a lot of Photoshop,

79
00:02:52.710 --> 00:02:54.240
doesn't take a ton of time,

80
00:02:54.240 --> 00:02:57.840
or the videos too, when suddenly
there could be a news event

81
00:02:57.840 --> 00:02:59.296
and someone says, "Hey, look at this video

82
00:02:59.296 --> 00:03:03.540
from this shooting or from the
president's speech backstage,

83
00:03:03.540 --> 00:03:04.530
what he said."

84
00:03:04.530 --> 00:03:07.343
The fact that you can produce fake content

85
00:03:07.343 --> 00:03:11.910
as quickly and as
accurately as these AIs do,

86
00:03:11.910 --> 00:03:13.380
that's what's really got people worried.

87
00:03:13.380 --> 00:03:15.185
- Right, I mean, a lot of
us already have trust issues

88
00:03:15.185 --> 00:03:18.330
just in general. And now,
apparently, the AI is

89
00:03:18.330 --> 00:03:21.525
getting better at doing hands.
I think that was kind of,

90
00:03:21.525 --> 00:03:24.810
you know, a sign that something
was generated artificially,

91
00:03:24.810 --> 00:03:27.000
and it's getting better.

92
00:03:27.000 --> 00:03:28.770
But regulation's on the table, I guess.

93
00:03:28.770 --> 00:03:30.960
- Yeah, I mean, over the past couple years

94
00:03:30.960 --> 00:03:32.430
just being involved in journalism,

95
00:03:32.430 --> 00:03:34.290
I've been in working
groups where we've looked

96
00:03:34.290 --> 00:03:36.870
at things, like, well, we've
got a news clip coming in,

97
00:03:36.870 --> 00:03:39.030
and how do we know this
isn't created by AI?

98
00:03:39.030 --> 00:03:41.700
And it's very rapidly,
just in the past two years,

99
00:03:41.700 --> 00:03:44.760
gone from "Oh, here's an obvious problem,"

100
00:03:44.760 --> 00:03:46.980
like the person depicted has six fingers,

101
00:03:46.980 --> 00:03:48.678
or the ears, which were
often wrong

102
00:03:48.678 --> 00:03:50.433
on AI-created stuff,

103
00:03:50.433 --> 00:03:52.590
to now it's very, very difficult to tell.

104
00:03:52.590 --> 00:03:53.557
So people are talking about,

105
00:03:53.557 --> 00:03:55.170
"Well, maybe we need to regulate this."

106
00:03:55.170 --> 00:03:57.210
- Because we've seen, of course,

107
00:03:57.210 --> 00:03:58.357
throughout our journalism careers,

108
00:03:58.357 --> 00:04:02.040
doctored, you know, videos
that have been touched up,

109
00:04:02.040 --> 00:04:03.960
and, you know, like,
I know Project Veritas

110
00:04:03.960 --> 00:04:05.160
and other companies have been famous

111
00:04:05.160 --> 00:04:08.640
for kind of piecing bits
of videos together.

112
00:04:08.640 --> 00:04:12.510
But yeah, I mean, with AI, you
can't as easily decipher

113
00:04:12.510 --> 00:04:13.920
whether something is real or not.

114
00:04:13.920 --> 00:04:15.570
- Yeah, it's getting
harder and harder to tell

115
00:04:15.570 --> 00:04:16.650
what's real and what's not.

116
00:04:16.650 --> 00:04:18.900
And, you know, at the rate we're moving,

117
00:04:18.900 --> 00:04:20.970
I was about to say, give
it another three to five years,

118
00:04:20.970 --> 00:04:22.172
but at this point it's,
like, give it another-

119
00:04:22.172 --> 00:04:23.043
- Isn't this all a sim, Dave?

120
00:04:23.043 --> 00:04:24.048
- [Dave] Three to five months.

121
00:04:24.048 --> 00:04:24.881
- Is this a simulation?

122
00:04:24.881 --> 00:04:25.714
- Totally.

123
00:04:25.714 --> 00:04:27.750
We are definitely living in a simulation,

124
00:04:27.750 --> 00:04:29.820
and, you know, the
technology is gonna increase

125
00:04:29.820 --> 00:04:31.410
to the point where
whoever's playing this sim

126
00:04:31.410 --> 00:04:33.270
decides to hit us all
with a massive hurricane

127
00:04:33.270 --> 00:04:34.170
and burn down our city.

128
00:04:34.170 --> 00:04:36.330
So let's do what we can to stay alive

129
00:04:36.330 --> 00:04:38.010
until the gamer gets bored.

130
00:04:38.010 --> 00:04:41.970
- Well, in the meantime, we'll
hopefully play the game fair.

131
00:04:41.970 --> 00:04:43.620
Dave Ewalt, thank you so much for that.

132
00:04:43.620 --> 00:04:46.320
And for more on AI and the simulation

133
00:04:46.320 --> 00:04:48.902
that we're all living
in, check out gizmodo.com

134
00:04:48.902 --> 00:04:52.152
(gentle ambient music)