WEBVTT

1
00:00:00.180 --> 00:00:01.320
- [Host] This is Future Tech,

2
00:00:01.320 --> 00:00:03.570
where each week we
discuss the good, the bad,

3
00:00:03.570 --> 00:00:07.831
and the ugly of where tech
is headed in 2023 and beyond.

4
00:00:07.831 --> 00:00:08.970
(swoosh music)

5
00:00:08.970 --> 00:00:12.120
- AI is in the spotlight again,
this time on Capitol Hill.

6
00:00:12.120 --> 00:00:14.910
Sam Altman testified
before Congress yesterday

7
00:00:14.910 --> 00:00:17.040
and here to talk about
it is senior reporter

8
00:00:17.040 --> 00:00:18.060
Thomas Germain.

9
00:00:18.060 --> 00:00:21.210
Tom, what did Sam Altman
say yesterday to the Senate?

10
00:00:21.210 --> 00:00:22.650
- Yeah, it was a pretty weird hearing.

11
00:00:22.650 --> 00:00:24.762
He showed up and
essentially the argument was

12
00:00:24.762 --> 00:00:28.650
please regulate the AI
industry to protect the public

13
00:00:28.650 --> 00:00:32.010
from the technology that I am creating.

14
00:00:32.010 --> 00:00:34.200
And it was kind of unusual
for a number of reasons,

15
00:00:34.200 --> 00:00:35.550
one of the things that was weird about it

16
00:00:35.550 --> 00:00:38.029
was all of the congressmen

17
00:00:38.029 --> 00:00:42.750
and the witnesses, including
the CEO got along really well,

18
00:00:42.750 --> 00:00:44.640
like they were all little buddies

19
00:00:44.640 --> 00:00:45.990
and essentially they all agree

20
00:00:45.990 --> 00:00:48.600
that AI is scary and
it should be regulated

21
00:00:48.600 --> 00:00:49.500
but exactly how?

22
00:00:49.500 --> 00:00:51.450
They have no real idea.

23
00:00:51.450 --> 00:00:52.800
- So it was pretty different from hearings

24
00:00:52.800 --> 00:00:55.341
with Mark Zuckerberg where
they're at loggerheads.

25
00:00:55.341 --> 00:00:57.990
- Yeah, Cory Booker was calling Sam Altman

26
00:00:57.990 --> 00:01:00.090
by his first name, for
example, which is really

27
00:01:00.090 --> 00:01:02.100
not the way that these things usually go.

28
00:01:02.100 --> 00:01:04.050
They all were very impressed with him,

29
00:01:04.050 --> 00:01:05.220
it was kind of smarmy.

30
00:01:05.220 --> 00:01:06.240
It was really different.

31
00:01:06.240 --> 00:01:08.850
Usually it's very hostile when
they bring these people in.

32
00:01:08.850 --> 00:01:11.820
- Did Congress seem to understand AI?

33
00:01:11.820 --> 00:01:13.193
I mean, that's been a huge problem

34
00:01:13.193 --> 00:01:16.320
with hearings on social
media and other things.

35
00:01:16.320 --> 00:01:18.930
- I think, no, and I don't think really

36
00:01:18.930 --> 00:01:20.130
anyone understands AI

37
00:01:20.130 --> 00:01:22.650
but Congress definitely
didn't do an impressive job

38
00:01:22.650 --> 00:01:23.662
for the most part.

39
00:01:23.662 --> 00:01:26.310
Marsha Blackburn, for
example, was very concerned

40
00:01:26.310 --> 00:01:29.580
about this deep fake of Garth
Brooks that she heard about.

41
00:01:29.580 --> 00:01:32.700
I'm not sure what Sam
Altman has to do with that

42
00:01:32.700 --> 00:01:35.730
but when it got into the
specifics, it really was vague.

43
00:01:35.730 --> 00:01:38.640
And it says a lot about how little we know

44
00:01:38.640 --> 00:01:40.620
about AI and where the technology is going

45
00:01:40.620 --> 00:01:42.600
or even what it might be useful for.

46
00:01:42.600 --> 00:01:43.860
- Yeah, I did see Josh Hawley

47
00:01:43.860 --> 00:01:45.090
call it either a printing press

48
00:01:45.090 --> 00:01:45.960
or the atomic bomb

49
00:01:45.960 --> 00:01:48.754
and there's a lot of distance
in between those two.

50
00:01:48.754 --> 00:01:51.360
If Sam Altman's the one
proposing regulation,

51
00:01:51.360 --> 00:01:53.160
what does he want to happen?

52
00:01:53.160 --> 00:01:54.690
- So Sam Altman's idea is that we might

53
00:01:54.690 --> 00:01:58.080
have a new regulatory
agency that's just for AI,

54
00:01:58.080 --> 00:02:00.330
which sounds kind of weird, right?

55
00:02:00.330 --> 00:02:03.690
Why would a tech CEO want
this government intervention?

56
00:02:03.690 --> 00:02:05.250
But it's actually a pretty normal thing.

57
00:02:05.250 --> 00:02:06.660
We've seen this for years

58
00:02:06.660 --> 00:02:09.060
on privacy is a really good example.

59
00:02:09.060 --> 00:02:10.110
Some of the biggest advocates

60
00:02:10.110 --> 00:02:13.980
for a privacy law are
Microsoft and Google and Meta.

61
00:02:13.980 --> 00:02:16.020
And the reason is, if there is a law

62
00:02:16.020 --> 00:02:19.218
but it's not that
restrictive, it's a real help

63
00:02:19.218 --> 00:02:22.020
to these companies because
if something goes wrong

64
00:02:22.020 --> 00:02:24.630
they can say, oh, well we
were complying with the law.

65
00:02:24.630 --> 00:02:26.472
The problem is that the
government did a bad job

66
00:02:26.472 --> 00:02:29.310
of regulating us, be mad at them.

67
00:02:29.310 --> 00:02:31.470
And another thing that an agency would do,

68
00:02:31.470 --> 00:02:35.160
is it would make it really
expensive to build AI.

69
00:02:35.160 --> 00:02:37.230
One of the proposals was
maybe you'd need a license

70
00:02:37.230 --> 00:02:38.776
to develop the technology.

71
00:02:38.776 --> 00:02:41.460
If you needed that, it
would be really hard to do.

72
00:02:41.460 --> 00:02:43.860
Only a big company like Microsoft

73
00:02:43.860 --> 00:02:45.990
or Google or OpenAI
would be able to do it.

74
00:02:45.990 --> 00:02:48.750
So it's a little more
cynical than it seems.

75
00:02:48.750 --> 00:02:49.980
- So they would be pulling up the ladder

76
00:02:49.980 --> 00:02:54.000
behind them if they did
establish this agency probably

77
00:02:54.000 --> 00:02:55.050
with guidance from Sam?

78
00:02:55.050 --> 00:02:56.220
Input from Sam Altman.

79
00:02:56.220 --> 00:02:58.440
- Exactly, and it looks
like they really care about

80
00:02:58.440 --> 00:03:00.690
all the terrible things that
their technology might do

81
00:03:00.690 --> 00:03:02.430
and they're on the side
of justice and good.

82
00:03:02.430 --> 00:03:04.320
- Right, he said "I'm scared too."

83
00:03:04.320 --> 00:03:06.000
If you say so, maybe.

84
00:03:06.000 --> 00:03:08.136
So what question would you ask Sam Altman

85
00:03:08.136 --> 00:03:10.080
if you were on the Senate panel?

86
00:03:10.080 --> 00:03:12.240
- Yeah, I think the
important thing to think

87
00:03:12.240 --> 00:03:14.194
about is like what are the consequences

88
00:03:14.194 --> 00:03:16.830
of the technology as it exists right now?

89
00:03:16.830 --> 00:03:17.850
What could go wrong?

90
00:03:17.850 --> 00:03:22.050
Not in some hypothetical,
"Robocop" future world

91
00:03:22.050 --> 00:03:23.490
where AI is destroying everything

92
00:03:23.490 --> 00:03:25.440
but with the tech as it currently works.

93
00:03:25.440 --> 00:03:26.940
And it could make a lot
of the problems that we

94
00:03:26.940 --> 00:03:29.430
already have significantly worse.

95
00:03:29.430 --> 00:03:31.470
One thing that I think we need more

96
00:03:31.470 --> 00:03:34.353
of is government mandated transparency.

97
00:03:35.280 --> 00:03:37.350
Tech companies should have to at least

98
00:03:37.350 --> 00:03:40.620
let regulators know about the data sets

99
00:03:40.620 --> 00:03:42.270
that this technology is being trained on,

100
00:03:42.270 --> 00:03:44.700
if it's going to be
deployed on this mass scale.

101
00:03:44.700 --> 00:03:46.560
And perhaps there should
be people who know

102
00:03:46.560 --> 00:03:49.170
how this stuff works,
who go in and investigate

103
00:03:49.170 --> 00:03:50.820
and prod at the algorithms

104
00:03:50.820 --> 00:03:52.620
and make sure they're not
doing anything wrong.

105
00:03:52.620 --> 00:03:55.080
But I think the lesson that
we've learned is you want

106
00:03:55.080 --> 00:03:56.314
to get out in front of this stuff

107
00:03:56.314 --> 00:03:59.190
before we have all these
problems in housing

108
00:03:59.190 --> 00:04:01.710
and Civil Rights Act
violations, instead

109
00:04:01.710 --> 00:04:04.140
of waiting for the problems to happen.

110
00:04:04.140 --> 00:04:06.930
We know what could go wrong,
there's experts who have a lot

111
00:04:06.930 --> 00:04:09.210
of ideas that we could
appeal to right now.

112
00:04:09.210 --> 00:04:10.130
We could start.

113
00:04:10.130 --> 00:04:10.973
- There was one on the panel yesterday.

114
00:04:10.973 --> 00:04:12.136
- There was one, they're right there.

115
00:04:12.136 --> 00:04:13.620
All you have to do is ask him

116
00:04:13.620 --> 00:04:15.903
or you could ask Sam Altman, right?

117
00:04:17.070 --> 00:04:19.110
He has this idea for a regulatory agency

118
00:04:19.110 --> 00:04:22.770
and John Kennedy, who's a
senator from Louisiana,

119
00:04:22.770 --> 00:04:24.960
asked, sort of in a jokey way,

120
00:04:24.960 --> 00:04:26.520
whether Sam would be interested

121
00:04:26.520 --> 00:04:28.642
in running the agency, which is like,

122
00:04:28.642 --> 00:04:31.230
I get kind of cutesy, but it speaks

123
00:04:31.230 --> 00:04:33.720
to how unprepared they are
to deal with this problem.

124
00:04:33.720 --> 00:04:35.430
- And Sam Altman has a job.

125
00:04:35.430 --> 00:04:37.770
- It's like asking the CEO of Shell Oil

126
00:04:37.770 --> 00:04:40.080
if he wants to run the
Environmental Protection Agency,

127
00:04:40.080 --> 00:04:41.463
it's just absurd.

128
00:04:42.720 --> 00:04:44.640
- Well, hopefully Sam Altman does not run

129
00:04:44.640 --> 00:04:46.530
our AI regulatory agency.

130
00:04:46.530 --> 00:04:49.066
To read more, go to gizmodo.com.

131
00:04:49.066 --> 00:04:50.144
(swoosh music)

132
00:04:50.144 --> 00:04:52.811
(upbeat music)