WEBVTT

1
00:00:00.120 --> 00:00:03.240
- This is Techmodo, a Gizmodo
series where we take a dive

2
00:00:03.240 --> 00:00:05.070
into some of the most recent tech news

3
00:00:05.070 --> 00:00:07.230
with some expert guests.

4
00:00:07.230 --> 00:00:10.050
This week, mathematician
Hannah Fry talks to me

5
00:00:10.050 --> 00:00:13.680
about how to study a city
in a computer program.

6
00:00:13.680 --> 00:00:15.600
Humans are creatures of habit,

7
00:00:15.600 --> 00:00:17.730
a phenomenon that is so reliable

8
00:00:17.730 --> 00:00:21.360
that researchers are able to
model our movements, actions,

9
00:00:21.360 --> 00:00:23.400
and decisions using computers.

10
00:00:23.400 --> 00:00:24.810
For what, you might ask?

11
00:00:24.810 --> 00:00:26.040
Well, lots of things.

12
00:00:26.040 --> 00:00:27.300
Just ask Hannah Fry,

13
00:00:27.300 --> 00:00:30.540
who is a mathematician from
University College London.

14
00:00:30.540 --> 00:00:31.920
By working with physicists,

15
00:00:31.920 --> 00:00:33.960
computer programmers, and geographers,

16
00:00:33.960 --> 00:00:37.170
Hannah and her colleagues are
able to build digital networks

17
00:00:37.170 --> 00:00:40.110
of human behavior in
order to study the effects

18
00:00:40.110 --> 00:00:42.240
of things like terror attacks, dating,

19
00:00:42.240 --> 00:00:45.390
and as we know all too well, pandemics.

20
00:00:45.390 --> 00:00:47.940
I got to talk to Hannah
all about her research

21
00:00:47.940 --> 00:00:50.310
in modeling the ins and
outs of human behavior

22
00:00:50.310 --> 00:00:51.930
using mathematics and computers,

23
00:00:51.930 --> 00:00:55.080
and it's just about as
cool as you would expect.

24
00:00:55.080 --> 00:00:58.260
So you are someone who uses digital models

25
00:00:58.260 --> 00:01:00.660
to study urban spaces, would
you say that's correct?

26
00:01:00.660 --> 00:01:01.590
- Yeah, yeah.
- Cool.

27
00:01:01.590 --> 00:01:04.890
- So what are you using those
models to actually study?

28
00:01:04.890 --> 00:01:07.740
- So to be honest, I think
if you looked at like,

29
00:01:07.740 --> 00:01:09.840
the history of things that I've looked at,

30
00:01:11.295 --> 00:01:14.820
it's like the most mad shopping list.

31
00:01:14.820 --> 00:01:16.950
- Okay, cool.
- Of like, human beings.

32
00:01:16.950 --> 00:01:20.449
- So I've done lots of work on terrorism,

33
00:01:20.449 --> 00:01:21.282
- Okay.

34
00:01:21.282 --> 00:01:25.650
- I've done work on serial killers.

35
00:01:25.650 --> 00:01:29.250
There's some work in there on pandemics.

36
00:01:29.250 --> 00:01:32.250
Also on badgers biting cattle.

37
00:01:32.250 --> 00:01:33.875
I mean, you know, it's eclectic.

38
00:01:33.875 --> 00:01:35.105
- [Kevin] A lot of light fluffy stuff.

39
00:01:35.105 --> 00:01:36.570
- A lot of light fluffy stuff.

40
00:01:36.570 --> 00:01:39.000
But generally speaking, all
of them have these patterns

41
00:01:39.000 --> 00:01:41.130
that move around in space and time,

42
00:01:41.130 --> 00:01:43.803
generally driven by humans or badgers.

43
00:01:45.690 --> 00:01:47.460
But, you know, mathematically speaking,

44
00:01:47.460 --> 00:01:49.290
humans and badgers are
almost indistinguishable

45
00:01:49.290 --> 00:01:50.140
from one another.

46
00:01:51.630 --> 00:01:55.140
It's not about trying to
predict what one person will do,

47
00:01:55.140 --> 00:01:57.660
it's about looking at the
patterns that we create

48
00:01:57.660 --> 00:02:00.270
without even necessarily
knowing that we're doing so,

49
00:02:00.270 --> 00:02:05.160
and using these very well-worn
mathematical techniques

50
00:02:05.160 --> 00:02:07.290
to try and understand the directions

51
00:02:07.290 --> 00:02:08.250
that things are gonna move in.

52
00:02:08.250 --> 00:02:10.890
- How do you then, when
you go to model a problem,

53
00:02:10.890 --> 00:02:13.230
let's say we're modeling like,
the pandemic, for example,

54
00:02:13.230 --> 00:02:14.190
the COVID-19 pandemic.

55
00:02:14.190 --> 00:02:15.990
How do you go about building a model

56
00:02:15.990 --> 00:02:19.260
that can capture kinda all
the nuances of human behavior?

57
00:02:19.260 --> 00:02:20.580
- Well, I mean,

58
00:02:20.580 --> 00:02:22.830
there's a couple of different
schools of thought here.

59
00:02:22.830 --> 00:02:24.240
And on the one hand,

60
00:02:24.240 --> 00:02:26.940
I think that you have to
say that you can't, right?

61
00:02:26.940 --> 00:02:30.390
There's no way that you will
ever be able to get all of it.

62
00:02:30.390 --> 00:02:33.570
And so as a result, you
know, you have to,

63
00:02:33.570 --> 00:02:35.760
I think, have intellectual humility

64
00:02:35.760 --> 00:02:37.680
over the things that you are creating,

65
00:02:37.680 --> 00:02:39.630
because these things, these
models that you're building,

66
00:02:39.630 --> 00:02:41.130
they're not crystal balls.

67
00:02:41.130 --> 00:02:44.190
They don't like, allow you
to peer into the future,

68
00:02:44.190 --> 00:02:47.830
but what they do do is
give you the best forecast

69
00:02:49.380 --> 00:02:50.520
that you can possibly have, right?

70
00:02:50.520 --> 00:02:52.950
They are much better than having nothing,

71
00:02:52.950 --> 00:02:55.230
but you have to be careful about thinking

72
00:02:55.230 --> 00:02:56.910
that they are telling
you the absolute truth

73
00:02:56.910 --> 00:02:57.900
of what's to come.

74
00:02:57.900 --> 00:02:59.760
But I think that there's different
levels, really, you know,

75
00:02:59.760 --> 00:03:02.460
like, there are some models of the pandemic

76
00:03:02.460 --> 00:03:04.350
where you don't even
really think, necessarily,

77
00:03:04.350 --> 00:03:06.630
about individual people,
you sort of think of like,

78
00:03:06.630 --> 00:03:09.930
people as sort of,
kind of great masses

79
00:03:09.930 --> 00:03:13.170
that are sort of transitioning
from being infected

80
00:03:13.170 --> 00:03:16.650
to infecting others and
then being susceptible.
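
NOTE
What Hannah describes here is, roughly, a compartmental model in the SIR family: the population is split into Susceptible, Infected, and Recovered masses that flow between compartments. A minimal sketch in Python; the rates beta and gamma are illustrative assumptions, not values from the interview.
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    # beta: transmission rate, gamma: recovery rate (assumed for illustration)
    new_infections = beta * s * i  # susceptible people meeting infected people
    new_recoveries = gamma * i
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries
s, i, r = 0.99, 0.01, 0.0  # fractions of the population
for _ in range(160):  # 160 daily steps
    s, i, r = sir_step(s, i, r)
print(f"S={s:.3f}, I={i:.3f}, R={r:.3f}")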

81
00:03:16.650 --> 00:03:18.720
And then there's other
much more detailed models

82
00:03:18.720 --> 00:03:20.820
where you really start
to take into account

83
00:03:20.820 --> 00:03:23.040
the type of jobs that people have,

84
00:03:23.040 --> 00:03:24.870
how far they move around,

85
00:03:24.870 --> 00:03:26.190
all of those different kind of things.

86
00:03:26.190 --> 00:03:29.790
And for that, you need a
lot, a lot, a lot of data.
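
NOTE
The "much more detailed models" mentioned just above are agent-based: each simulated person carries attributes such as job type and how far they move, and infection spreads through individual contacts. A minimal hypothetical sketch; every attribute, contact count, and probability below is an assumption for illustration.
import random
random.seed(0)
agents = [{"job": random.choice(["office", "retail", "remote"]),
           "infected": random.random() < 0.01} for _ in range(1000)]
CONTACTS_PER_DAY = {"office": 10, "retail": 20, "remote": 2}  # assumed
for _ in range(30):  # 30 simulated days
    for agent in agents:
        if agent["infected"]:
            for _ in range(CONTACTS_PER_DAY[agent["job"]]):
                other = random.choice(agents)  # random mixing, for simplicity
                if not other["infected"] and random.random() < 0.02:
                    other["infected"] = True
print(sum(a["infected"] for a in agents), "of 1000 agents infected after 30 days")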

87
00:03:29.790 --> 00:03:30.810
- Yeah, with that,

88
00:03:30.810 --> 00:03:33.330
what is the most complex
model you've built

89
00:03:33.330 --> 00:03:35.580
and what were you studying with it?

90
00:03:35.580 --> 00:03:37.650
- Oh, so I think that
actually, the most complex one

91
00:03:37.650 --> 00:03:40.380
that I've been involved in was COVID-19.

92
00:03:40.380 --> 00:03:42.300
- Okay. Why is that?

93
00:03:42.300 --> 00:03:45.000
- So, I mean, because just
the vastness of the data

94
00:03:45.000 --> 00:03:46.890
and the importance of it, too,

95
00:03:46.890 --> 00:03:50.430
but that urgency to work out

96
00:03:50.430 --> 00:03:52.620
what the next right thing to do was,

97
00:03:52.620 --> 00:03:53.910
you know, in that moment in time,

98
00:03:53.910 --> 00:03:56.100
it was just incredibly pressing.

99
00:03:56.100 --> 00:03:58.230
I mean, I have to say that my colleagues,

100
00:03:58.230 --> 00:04:00.270
the epidemiologists, basically,

101
00:04:00.270 --> 00:04:02.026
I think they didn't sleep for two years.

102
00:04:02.026 --> 00:04:04.487
- It's extreme.
- Yeah, probably. Sure.

103
00:04:04.487 --> 00:04:06.420
- But, you know, lots of them,

104
00:04:06.420 --> 00:04:07.830
there's varying levels of complexity.

105
00:04:07.830 --> 00:04:12.830
I think some of the models that I've built

106
00:04:13.860 --> 00:04:17.910
look at the
patterns that are created

107
00:04:17.910 --> 00:04:19.680
with terror events.

108
00:04:19.680 --> 00:04:22.050
So how, and particularly,

109
00:04:22.050 --> 00:04:23.430
there was some work that I did

110
00:04:23.430 --> 00:04:25.260
that looked at what happened

111
00:04:25.260 --> 00:04:27.150
in the Troubles in Northern Ireland.

112
00:04:27.150 --> 00:04:29.400
So looking at how, because it's sort of,

113
00:04:29.400 --> 00:04:30.480
they follow this pattern

114
00:04:30.480 --> 00:04:31.830
that's a little bit like earthquakes,

115
00:04:31.830 --> 00:04:33.750
so an earthquake happens,

116
00:04:33.750 --> 00:04:35.520
and then you get these
subsequent aftershocks,

117
00:04:35.520 --> 00:04:38.130
and you actually see a
really similar signature

118
00:04:38.130 --> 00:04:41.130
when you look at incidents

119
00:04:41.130 --> 00:04:44.610
in sort of fractious settings, right?

120
00:04:44.610 --> 00:04:47.520
So you have kind of one big
event, one spark, as it were,

121
00:04:47.520 --> 00:04:50.340
and then these subsequent
events that happen.
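
NOTE
The earthquake-and-aftershocks pattern corresponds to a self-exciting point process (a Hawkes-style model): each event temporarily raises the rate of follow-on events, and that boost decays over time. A minimal sketch of the intensity function; mu, alpha, and decay are illustrative assumptions, and the interview does not specify the exact model used.
import math
def intensity(t, past_events, mu=0.1, alpha=0.5, decay=1.0):
    # mu: baseline rate; each past event adds an exponentially decaying boost
    return mu + sum(alpha * math.exp(-decay * (t - s)) for s in past_events if s < t)
events = [0.0, 0.4, 1.1]  # hypothetical event times
print(f"rate just after the burst: {intensity(1.2, events):.3f}")
print(f"rate after things settle:  {intensity(10.0, events):.3f}")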

122
00:04:50.340 --> 00:04:53.130
And I don't think that you can
use that to make predictions

123
00:04:53.130 --> 00:04:56.370
or to sort of change the environment,

124
00:04:56.370 --> 00:04:58.710
but I do think that what
it does retrospectively

125
00:04:58.710 --> 00:05:01.020
is it allows you to understand
the different phases

126
00:05:01.020 --> 00:05:03.720
that a potential conflict
has gone through.

127
00:05:03.720 --> 00:05:06.510
- Wow. That is amazing work, thank you.

128
00:05:06.510 --> 00:05:07.704
Well, Hannah, thank you for joining us.

129
00:05:07.704 --> 00:05:08.537
- Of course.
- Cool.

130
00:05:08.537 --> 00:05:09.370
- [Hannah] Thank you so much.

131
00:05:09.370 --> 00:05:11.100
- It would appear that
there's lots to study

132
00:05:11.100 --> 00:05:14.250
using computer models, and
lots of outcomes to learn from.

133
00:05:14.250 --> 00:05:16.530
While a computer model
might not look as flashy

134
00:05:16.530 --> 00:05:17.550
as a computer game,

135
00:05:17.550 --> 00:05:20.370
these models are, in
essence, the same thing.

136
00:05:20.370 --> 00:05:21.330
With all that said,

137
00:05:21.330 --> 00:05:24.780
not every human action can
be modeled realistically.

138
00:05:24.780 --> 00:05:26.790
We make mistakes, we go against the grain,

139
00:05:26.790 --> 00:05:28.380
and we can be irrational.

140
00:05:28.380 --> 00:05:29.880
But the fact that a computer

141
00:05:29.880 --> 00:05:31.890
can't necessarily account for those things

142
00:05:31.890 --> 00:05:33.990
is kind of beautiful.

143
00:05:33.990 --> 00:05:36.453
Check out more videos here on gizmodo.com.