WEBVTT

1
00:00:00.150 --> 00:00:02.490
- [Narrator] This is Future
Tech, where each week we discuss

2
00:00:02.490 --> 00:00:04.650
the good, the bad, and the ugly

3
00:00:04.650 --> 00:00:07.740
of where tech is headed
in 2023 and beyond.

4
00:00:07.740 --> 00:00:10.680
- Today we're talking
about a Balenciaga meme.

5
00:00:10.680 --> 00:00:13.110
It's AI generated
characters talking about how

6
00:00:13.110 --> 00:00:15.600
much they love fashion, and
we've learned how it works

7
00:00:15.600 --> 00:00:17.760
behind the scenes, what tools
are used to generate it.

8
00:00:17.760 --> 00:00:18.960
It's a cobbled-together mix

9
00:00:18.960 --> 00:00:20.490
of a bunch of different AI tools,

10
00:00:20.490 --> 00:00:23.790
a voice generator, an image
generator, a video generator

11
00:00:23.790 --> 00:00:25.140
and it's really uncanny valley.

12
00:00:25.140 --> 00:00:28.080
And we're wondering, are
these characters fair use?

13
00:00:28.080 --> 00:00:28.913
I think so.

14
00:00:28.913 --> 00:00:31.260
They seem to be akin
to a political cartoon

15
00:00:31.260 --> 00:00:34.260
and the video makers aren't
making so much money.

16
00:00:34.260 --> 00:00:36.390
They seem like parody more so

17
00:00:36.390 --> 00:00:38.880
than co-optation of the characters

18
00:00:38.880 --> 00:00:42.210
for nefarious uses that
would damage the brand,

19
00:00:42.210 --> 00:00:44.280
the franchise, et cetera, et cetera.

20
00:00:44.280 --> 00:00:46.980
We haven't seen any
legal cases involving the

21
00:00:46.980 --> 00:00:48.510
Balenciaga meme, so we're not quite sure

22
00:00:48.510 --> 00:00:51.870
how a court would find it
as it pertains to copyright

23
00:00:51.870 --> 00:00:52.830
and trademark law.

24
00:00:52.830 --> 00:00:55.920
So far, the legal precedents
for AI are unsettled and it's a bit

25
00:00:55.920 --> 00:00:58.080
of a wild west and we're not
quite sure what will happen.

26
00:00:58.080 --> 00:01:01.440
Reddit just started charging AI companies

27
00:01:01.440 --> 00:01:04.980
for using its giant corpus
of text as training data

28
00:01:04.980 --> 00:01:07.530
and we'll see how these
image cases play out.

29
00:01:07.530 --> 00:01:10.470
Whether image repositories can keep their

30
00:01:10.470 --> 00:01:13.323
proprietary databases, their
proprietary content away

31
00:01:13.323 --> 00:01:16.500
from the prying eyes of
AI scraping software,

32
00:01:16.500 --> 00:01:18.136
which needs tons and tons and tons

33
00:01:18.136 --> 00:01:21.990
of data to train things
like ChatGPT and Midjourney

34
00:01:21.990 --> 00:01:24.840
and video generators and
voice generators, even.

35
00:01:24.840 --> 00:01:27.270
We saw Getty Images sue Stable Diffusion,

36
00:01:27.270 --> 00:01:29.100
which is an image generator, for allegedly

37
00:01:29.100 --> 00:01:33.120
copying 12 million
copyrighted images to train

38
00:01:33.120 --> 00:01:34.320
the image generation.

39
00:01:34.320 --> 00:01:36.300
Experts are concerned that deep fakes

40
00:01:36.300 --> 00:01:38.910
herald a new era of
political misinformation.

41
00:01:38.910 --> 00:01:40.380
These memes seem a little different.

42
00:01:40.380 --> 00:01:41.850
They're not quite so similar

43
00:01:41.850 --> 00:01:44.190
to the viral images of
Trump getting arrested

44
00:01:44.190 --> 00:01:46.170
on the day that he appeared in court

45
00:01:46.170 --> 00:01:48.570
which showed the former
president surrounded

46
00:01:48.570 --> 00:01:52.320
by police officers or leading
a phalanx of supporters.

47
00:01:52.320 --> 00:01:55.830
These are silly memes
making fun of fashion.

48
00:01:55.830 --> 00:01:58.290
These are not so ideologically
motivated, it seems,

49
00:01:58.290 --> 00:02:01.140
and so would probably fall
under the fair use doctrine,

50
00:02:01.140 --> 00:02:03.990
which holds that
modification of the original

51
00:02:03.990 --> 00:02:06.810
does not necessarily constitute
copyright infringement.

52
00:02:06.810 --> 00:02:08.520
If it is modified

53
00:02:08.520 --> 00:02:11.610
and distinguishable from
the original, it's a parody.

54
00:02:11.610 --> 00:02:14.901
Would I be okay with my
own image being used?

55
00:02:14.901 --> 00:02:19.110
Yeah, probably. I've made them myself,

56
00:02:19.110 --> 00:02:20.760
isn't he beautiful?

57
00:02:20.760 --> 00:02:23.040
For more stories about
AI generated images,

58
00:02:23.040 --> 00:02:25.653
text, and video, check out gizmodo.com.