WEBVTT

1
00:00:04.560 --> 00:00:06.690
- Are we ready for self-driving vehicles?

2
00:00:06.690 --> 00:00:08.855
Of course we are. They're safer than ever.

3
00:00:08.855 --> 00:00:11.460
And when we have human drivers
on the road who don't know

4
00:00:11.460 --> 00:00:13.380
what they're doing, why
wouldn't we take the shot?

5
00:00:13.380 --> 00:00:16.050
- Absolutely not. It's not worth the risk.

6
00:00:16.050 --> 00:00:18.840
The data isn't there,
the tests aren't there.

7
00:00:18.840 --> 00:00:20.370
These things are not safe.

8
00:00:20.370 --> 00:00:24.330
They're contributing to
accidents, deaths, and lawsuits.

9
00:00:24.330 --> 00:00:27.780
They're just, all in all, a bad way forward

10
00:00:27.780 --> 00:00:29.850
for the road and for drivers.

11
00:00:29.850 --> 00:00:31.590
- Actually, we do have the data.

12
00:00:31.590 --> 00:00:34.165
Waymo, for example, back in
December put out a study

13
00:00:34.165 --> 00:00:38.430
of about 7.5 million miles of user data.

14
00:00:38.430 --> 00:00:41.100
They found that when you
have an AI driver

15
00:00:41.100 --> 00:00:43.050
behind the wheel, not a human,

16
00:00:43.050 --> 00:00:45.420
it's actually 90% safer
than having just a

17
00:00:45.420 --> 00:00:48.510
regular person behind the wheel.
Why wouldn't we take that?

18
00:00:48.510 --> 00:00:51.960
- Well, that's funny because
GM's Cruise back in 2020 said

19
00:00:51.960 --> 00:00:53.670
that they had 2 million miles

20
00:00:53.670 --> 00:00:56.010
of driver data over five years,

21
00:00:56.010 --> 00:00:58.590
and then they started rolling
out these driverless cars on

22
00:00:58.590 --> 00:01:01.500
the road in San Francisco
and other cities,

23
00:01:01.500 --> 00:01:03.330
and there were accidents.

24
00:01:03.330 --> 00:01:06.300
A woman got dragged 20
feet by a Cruise car

25
00:01:06.300 --> 00:01:08.610
because the software just didn't register

26
00:01:08.610 --> 00:01:11.130
that she was stuck under
its wheel and didn't stop.

27
00:01:11.130 --> 00:01:15.120
So how is that safety risk
outweighed by what little data

28
00:01:15.120 --> 00:01:16.950
they're willing to show us when

29
00:01:16.950 --> 00:01:20.670
that data doesn't actually
result in more safety?

30
00:01:20.670 --> 00:01:23.130
- You can't tell me that
humans don't get into accidents,

31
00:01:23.130 --> 00:01:24.540
that humans don't make mistakes.

32
00:01:24.540 --> 00:01:28.350
In 2021 alone, we had 42,000
people die in car crashes.

33
00:01:28.350 --> 00:01:31.715
That is way more than
any AV has caused in the past 15

34
00:01:31.715 --> 00:01:33.000
years or so of testing.

35
00:01:33.000 --> 00:01:36.420
- Yeah, because way fewer people
are using driverless cars

36
00:01:36.420 --> 00:01:37.500
and robotaxis.

37
00:01:37.500 --> 00:01:39.840
Yes, there have only been maybe hundreds

38
00:01:39.840 --> 00:01:42.720
or very low thousands of accidents

39
00:01:42.720 --> 00:01:44.820
for all robotaxis across brands.

40
00:01:44.820 --> 00:01:47.160
But there are very few people

41
00:01:47.160 --> 00:01:49.590
that are using these
driverless vehicles compared

42
00:01:49.590 --> 00:01:51.810
to the millions and millions
of people in America

43
00:01:51.810 --> 00:01:52.980
that drive cars.

44
00:01:52.980 --> 00:01:54.990
- And once we get more on the

45
00:01:54.990 --> 00:01:56.490
road, more people will drive them.

46
00:01:56.490 --> 00:01:58.920
And we'll have more data. But
nothing happens if we don't

47
00:01:58.920 --> 00:02:00.240
innovate, if we don't test things,

48
00:02:00.240 --> 00:02:01.920
and if we don't keep expanding.

49
00:02:01.920 --> 00:02:03.570
- But at what point does innovation

50
00:02:03.570 --> 00:02:08.280
and testing, you know, reach
a certain safety threshold?

51
00:02:08.280 --> 00:02:11.190
How can a company comfortably say, yes,

52
00:02:11.190 --> 00:02:12.570
we've done enough testing,

53
00:02:12.570 --> 00:02:15.605
we are ready to roll out these
driverless vehicles on the

54
00:02:15.605 --> 00:02:18.900
road, when there's actually
no defined threshold for

55
00:02:18.900 --> 00:02:21.480
what safety even means in this case?

56
00:02:21.480 --> 00:02:23.880
- That's on the federal
government, that's on states.

57
00:02:23.880 --> 00:02:25.800
They need to make the regulations.

58
00:02:25.800 --> 00:02:27.450
They need to make the laws.

59
00:02:27.450 --> 00:02:29.310
The companies have to follow the law.

60
00:02:29.310 --> 00:02:31.290
If there's no law for
them to follow, they have

61
00:02:31.290 --> 00:02:33.960
to do their best to
keep people safe,

62
00:02:33.960 --> 00:02:35.760
to keep drivers safe, consumers safe.

63
00:02:35.760 --> 00:02:36.760
- Right?

64
00:02:36.760 --> 00:02:37.593
But companies doing their best,

65
00:02:37.593 --> 00:02:41.220
when has that ever resulted
in the best for people?

66
00:02:41.220 --> 00:02:42.720
- Part of the reason businesses are going

67
00:02:42.720 --> 00:02:44.250
after this technology is not only

68
00:02:44.250 --> 00:02:45.480
'cause of the economic benefits,

69
00:02:45.480 --> 00:02:46.980
they're a company, they have that motive.

70
00:02:46.980 --> 00:02:48.540
Of course they do. But

71
00:02:48.540 --> 00:02:50.250
'cause of what they can do for people.

72
00:02:50.250 --> 00:02:52.740
It's just a tool to help
make your life easier,

73
00:02:52.740 --> 00:02:54.480
safer, and more comfortable.

74
00:02:54.480 --> 00:02:55.805
When you're on a highway, for example,

75
00:02:55.805 --> 00:02:59.320
going 80 miles an hour on
an eight-hour road trip,

76
00:02:59.320 --> 00:03:01.390
do you want to be at the
wheel the entire time, fully,

77
00:03:01.390 --> 00:03:03.370
a hundred percent paying attention?

78
00:03:03.370 --> 00:03:06.430
No. You want a minute, just
a minute to drink water,

79
00:03:06.430 --> 00:03:08.530
take a breath, look away, close your eyes.

80
00:03:08.530 --> 00:03:10.420
- Right? But that's not
how it's being marketed.

81
00:03:10.420 --> 00:03:13.065
And I think that's where
the biggest disconnect is,

82
00:03:13.065 --> 00:03:15.910
where these companies are
saying, this is a fully driverless,

83
00:03:15.910 --> 00:03:18.070
self-driving car, fully autonomous.

84
00:03:18.070 --> 00:03:20.410
And people are taking that at face value

85
00:03:20.410 --> 00:03:21.820
and not thinking critically about it.

86
00:03:21.820 --> 00:03:24.310
Even if the companies put
it in their fine print

87
00:03:24.310 --> 00:03:26.685
that, you know, this is just a tool.

88
00:03:26.685 --> 00:03:27.730
This is just a crutch.

89
00:03:27.730 --> 00:03:30.370
Like you said, people are
not getting that message

90
00:03:30.370 --> 00:03:32.980
and that's not how they're
treating these technologies.

91
00:03:32.980 --> 00:03:35.530
And then that's what's resulting in all

92
00:03:35.530 --> 00:03:37.660
of these accidents and problems.

93
00:03:37.660 --> 00:03:39.700
- What they're doing is they're
showing you the technology

94
00:03:39.700 --> 00:03:41.860
that we can use now in the modern day.

95
00:03:41.860 --> 00:03:43.930
Things like lane assist and cruise control

96
00:03:43.930 --> 00:03:47.320
to help keep you safe right
now. But in the near future,

97
00:03:47.320 --> 00:03:49.150
and we're getting closer every day,

98
00:03:49.150 --> 00:03:52.270
that technology will help
you stay safe all the time.

99
00:03:52.270 --> 00:03:54.040
It will drive your car for you.

100
00:03:54.040 --> 00:03:55.600
You can sit back and relax

101
00:03:55.600 --> 00:03:57.430
and just let it take you on a trip.

102
00:03:57.430 --> 00:03:58.430
- Right?

103
00:03:58.430 --> 00:04:00.880
But how long is it going to
take to get to that point

104
00:04:00.880 --> 00:04:05.080
where not only can people trust
this technology to be able

105
00:04:05.080 --> 00:04:07.780
to fully sit back, relax,
and enjoy the ride,

106
00:04:07.780 --> 00:04:09.735
but also feel, you know, safe

107
00:04:09.735 --> 00:04:12.160
and secure in making those
decisions for themselves.

108
00:04:12.160 --> 00:04:15.130
And where this technology
won't result in, you know,

109
00:04:15.130 --> 00:04:16.150
more accidents

110
00:04:16.150 --> 00:04:18.220
and disproportionately higher levels

111
00:04:18.220 --> 00:04:19.935
of accidents than maybe human drivers?

112
00:04:19.935 --> 00:04:21.910
We don't have that data
yet. We don't know yet.

113
00:04:21.910 --> 00:04:24.190
Thanks for watching.
Check out qz.com for more.