WEBVTT

1
00:00:00.150 --> 00:00:01.950
- [Narrator] This is "Future
Tech", where each week

2
00:00:01.950 --> 00:00:03.480
we discuss the good, the bad,

3
00:00:03.480 --> 00:00:07.470
and the ugly of where tech
is headed in 2023 and beyond.

4
00:00:07.470 --> 00:00:10.770
- There's a new push to ban
facial recognition tech

5
00:00:10.770 --> 00:00:12.990
on the federal level in Congress.

6
00:00:12.990 --> 00:00:14.160
Blake Montgomery joins us now.

7
00:00:14.160 --> 00:00:16.252
He's a deputy editor at Gizmodo and Blake,

8
00:00:16.252 --> 00:00:18.270
I mean, did Democrats think

9
00:00:18.270 --> 00:00:20.580
that they'll actually be
successful this go round?

10
00:00:20.580 --> 00:00:22.800
- I imagine Ed Markey thinks
he's going to be successful

11
00:00:22.800 --> 00:00:24.180
reintroducing this bill.

12
00:00:24.180 --> 00:00:26.280
He introduced it in 2021.

13
00:00:26.280 --> 00:00:27.750
It died in committee.

14
00:00:27.750 --> 00:00:30.060
We still don't have a federal regulation

15
00:00:30.060 --> 00:00:31.830
that governs facial recognition.

16
00:00:31.830 --> 00:00:35.220
This bill would prohibit any
federal agencies like the FBI

17
00:00:35.220 --> 00:00:39.240
or the FDA or what have you
from identifying Americans

18
00:00:39.240 --> 00:00:40.350
or someone on American soil

19
00:00:40.350 --> 00:00:42.540
using facial recognition software.

20
00:00:42.540 --> 00:00:45.030
- Okay, so we do know that
the FBI has been testing it.

21
00:00:45.030 --> 00:00:46.890
Right, we did a report on that recently.

22
00:00:46.890 --> 00:00:47.940
What did we find there?

23
00:00:47.940 --> 00:00:51.360
- We found that the FBI,
according to documents released

24
00:00:51.360 --> 00:00:53.820
in response to a FOIA lawsuit by the ACLU,

25
00:00:53.820 --> 00:00:56.610
it has been partnering
with a lot of researchers

26
00:00:56.610 --> 00:00:59.970
to test drastically
expanding the scope

27
00:00:59.970 --> 00:01:02.850
of facial recognition software's
use in the United States.

28
00:01:02.850 --> 00:01:06.120
There was testing on using
it in cameras, in subways,

29
00:01:06.120 --> 00:01:08.670
on public streets, in
hospitals, in schools,

30
00:01:08.670 --> 00:01:11.280
and it would just really widen the scope

31
00:01:11.280 --> 00:01:13.650
of how often Americans would be subjected

32
00:01:13.650 --> 00:01:15.810
to identification via facial recognition

33
00:01:15.810 --> 00:01:18.990
in a way that we haven't
seen in this country at all.

34
00:01:18.990 --> 00:01:21.390
It was a multiple years
long research effort

35
00:01:21.390 --> 00:01:23.490
between the FBI and these academics.

36
00:01:23.490 --> 00:01:25.560
It is over, it was
called the Janus Program.

37
00:01:25.560 --> 00:01:26.520
It ended in 2020

38
00:01:26.520 --> 00:01:29.910
but was folded into another
program called Project Horus.

39
00:01:29.910 --> 00:01:32.640
So it may still be going
on, we're not sure.

40
00:01:32.640 --> 00:01:34.740
The documents end at 2020

41
00:01:34.740 --> 00:01:39.087
but what they do show is
a really vast experiment

42
00:01:40.140 --> 00:01:42.300
in facial recognition
in the United States.

43
00:01:42.300 --> 00:01:44.520
- So of course there's
privacy concerns, right?

44
00:01:44.520 --> 00:01:46.890
There's also concerns about racial bias.

45
00:01:46.890 --> 00:01:49.383
We've seen that in
various algorithms before.

46
00:01:50.280 --> 00:01:51.990
I mean, can you talk a
little bit more about that?

47
00:01:51.990 --> 00:01:54.150
Some of the concerns
about this technology?

48
00:01:54.150 --> 00:01:56.370
- Yeah, researchers have
found that facial recognition

49
00:01:56.370 --> 00:01:59.820
is much less accurate
when looking at faces

50
00:01:59.820 --> 00:02:01.350
that are darker skinned

51
00:02:01.350 --> 00:02:04.410
and that has led to people
being wrongfully detained,

52
00:02:04.410 --> 00:02:08.130
wrongfully jailed for being
identified as suspects in crimes

53
00:02:08.130 --> 00:02:09.510
that they were nowhere near.

54
00:02:09.510 --> 00:02:12.210
A man was even identified
as a suspect in a crime

55
00:02:12.210 --> 00:02:15.000
in Georgia and detained when he was, no,

56
00:02:15.000 --> 00:02:16.410
he lived in Louisiana.

57
00:02:16.410 --> 00:02:19.590
He wasn't near
the crime scene at all,

58
00:02:19.590 --> 00:02:21.120
and was detained for more than a week.

59
00:02:21.120 --> 00:02:22.950
- So, of course it might
not be a surprise, you know,

60
00:02:22.950 --> 00:02:24.600
that the federal government has tested

61
00:02:24.600 --> 00:02:26.820
out this tech that law
enforcement have, you know,

62
00:02:26.820 --> 00:02:30.000
been using in some instances.

63
00:02:30.000 --> 00:02:32.070
But even at Madison Square Garden,

64
00:02:32.070 --> 00:02:34.740
you're going to go watch
a concert or a show,

65
00:02:34.740 --> 00:02:37.980
they have used facial recognition as well.

66
00:02:37.980 --> 00:02:41.910
- Yeah, we've seen the
owner of MSG really put

67
00:02:41.910 --> 00:02:44.430
the software to use in an aggressive way

68
00:02:44.430 --> 00:02:45.900
that we haven't seen
anywhere else in the country.

69
00:02:45.900 --> 00:02:48.900
So Madison Square Garden is
this enormous entertainment

70
00:02:48.900 --> 00:02:51.180
conglomerate that owns some
of the most iconic stadiums

71
00:02:51.180 --> 00:02:53.850
in New York City and by
extension in the country

72
00:02:53.850 --> 00:02:56.160
and we've seen the stadium,

73
00:02:56.160 --> 00:02:58.320
put facial recognition
software to use,

74
00:02:58.320 --> 00:03:01.260
identifying lawyers who
are involved in lawsuits

75
00:03:01.260 --> 00:03:03.570
against the company,
not even as plaintiffs.

76
00:03:03.570 --> 00:03:06.000
These are just lawyers who
work for enormous firms.

77
00:03:06.000 --> 00:03:08.892
They may not even be
involved in the suit at all

78
00:03:08.892 --> 00:03:11.910
but by scraping the
headshots of those lawyers

79
00:03:11.910 --> 00:03:13.290
from the firm's website,

80
00:03:13.290 --> 00:03:15.450
feeding them into the
facial recognition algorithm

81
00:03:15.450 --> 00:03:18.120
and putting it to use in
line for basketball games,

82
00:03:18.120 --> 00:03:19.380
for concerts.

83
00:03:19.380 --> 00:03:21.060
One woman was kicked out of the stadium

84
00:03:21.060 --> 00:03:23.430
when she was attending a
concert with her daughter.

85
00:03:23.430 --> 00:03:24.810
She was trying to go to the Rockettes

86
00:03:24.810 --> 00:03:27.870
and was unceremoniously barred from entry

87
00:03:27.870 --> 00:03:31.770
and it's this petty use
by a powerful person

88
00:03:31.770 --> 00:03:35.190
of facial recognition
software that kind of heralds

89
00:03:35.190 --> 00:03:36.510
an unpleasant future.

90
00:03:36.510 --> 00:03:39.330
- Yeah, of course there
could be a lot of misuse

91
00:03:39.330 --> 00:03:40.560
for this type of technology.

92
00:03:40.560 --> 00:03:42.150
Of course there could be some great uses

93
00:03:42.150 --> 00:03:42.983
for this technology.

94
00:03:42.983 --> 00:03:45.330
I think the concern
among many Democrats

95
00:03:45.330 --> 00:03:47.670
probably is that we don't
really have the oversight.

96
00:03:47.670 --> 00:03:49.920
We don't have the safeguards just yet

97
00:03:49.920 --> 00:03:51.960
and of course all that data collected

98
00:03:51.960 --> 00:03:54.448
on Americans is gonna
be a concern as well.

99
00:03:54.448 --> 00:03:55.950
- And what we don't
have are safeguards

100
00:03:55.950 --> 00:03:56.880
at the national level.

101
00:03:56.880 --> 00:04:00.690
This is what I think the
reintroduction

102
00:04:00.690 --> 00:04:04.500
of the attempted 2021 ban
is supposed to rectify.

103
00:04:04.500 --> 00:04:07.290
I don't think it's
going to pass yet again.

104
00:04:07.290 --> 00:04:09.480
So what we have instead
is like a patchwork

105
00:04:09.480 --> 00:04:11.700
of regulation that governs
facial recognition.

106
00:04:11.700 --> 00:04:15.450
16 municipalities, including
San Francisco, Boston, Oakland,

107
00:04:15.450 --> 00:04:18.720
have banned the use of facial
recognition by city officials

108
00:04:18.720 --> 00:04:19.590
like the cops.

109
00:04:19.590 --> 00:04:22.653
That's not to say that private
businesses can't use it.

110
00:04:24.120 --> 00:04:27.660
So some places prohibit it
but most places allow it.

111
00:04:27.660 --> 00:04:29.700
16 cities is not that many.

112
00:04:29.700 --> 00:04:30.660
- Yeah, well, it's certainly gonna be

113
00:04:30.660 --> 00:04:32.100
an ongoing conversation.

114
00:04:32.100 --> 00:04:33.660
Blake, we appreciate you joining us.

115
00:04:33.660 --> 00:04:36.390
For more information on
facial recognition tech,

116
00:04:36.390 --> 00:04:38.516
you can check out gizmodo.com.

117
00:04:38.516 --> 00:04:41.016
(funky music)