WEBVTT

1
00:00:00.000 --> 00:00:01.140
- Let's talk a little bit about

2
00:00:01.140 --> 00:00:03.090
being a steward of good tech.

3
00:00:03.090 --> 00:00:03.923
- Yeah.

4
00:00:03.923 --> 00:00:06.240
- So your tenure at IBM
was marked in particular

5
00:00:06.240 --> 00:00:10.110
by moving to the cloud, a
lot of investment in AI.

6
00:00:10.110 --> 00:00:11.970
You helped develop Watson-

7
00:00:11.970 --> 00:00:13.530
- Very early.
- Not just the systems, I mean,

8
00:00:13.530 --> 00:00:16.230
but issues which are
incredibly important right now,

9
00:00:16.230 --> 00:00:17.670
especially AI.

10
00:00:17.670 --> 00:00:19.770
What are the lessons that you've learned

11
00:00:19.770 --> 00:00:22.140
from your history in
developing that technology?

12
00:00:22.140 --> 00:00:23.580
When you look at what's happening now,

13
00:00:23.580 --> 00:00:26.136
how can we be a steward
of good tech in 2023?

14
00:00:26.136 --> 00:00:28.230
- 'Cause I would like to, I hope,
convince anybody watching,

15
00:00:28.230 --> 00:00:30.090
not just as individuals,
as users, about

16
00:00:30.090 --> 00:00:31.290
what we do with our tech

17
00:00:31.290 --> 00:00:32.850
but anybody who builds it, fine,

18
00:00:32.850 --> 00:00:34.890
but actually since so
many people use it now,

19
00:00:34.890 --> 00:00:36.240
it's like any company, right?

20
00:00:36.240 --> 00:00:40.140
So, in my mind, the number
one thing is whatever you do

21
00:00:40.140 --> 00:00:42.300
you've got to build
trust for the long term.

22
00:00:42.300 --> 00:00:45.240
So, maybe even said better,
being a steward of good tech

23
00:00:45.240 --> 00:00:48.660
means can you manage the
upside and the downside

24
00:00:48.660 --> 00:00:51.030
in parallel, not distanced by time?

25
00:00:51.030 --> 00:00:53.070
So in other words, "hey
look, isn't this great?

26
00:00:53.070 --> 00:00:54.840
Oh, it's causing problems.

27
00:00:54.840 --> 00:00:56.220
No, no, no,"

28
00:00:56.220 --> 00:00:57.090
from the beginning.

29
00:00:57.090 --> 00:00:58.440
And I even think about, you know,

30
00:00:58.440 --> 00:01:00.180
like when ChatGPT came out,

31
00:01:00.180 --> 00:01:02.527
the first thing you heard
about was from educators.

32
00:01:02.527 --> 00:01:04.020
"Oh no, kids can cheat now," right?

33
00:01:04.020 --> 00:01:05.730
They can have their paper written.

34
00:01:05.730 --> 00:01:08.400
I would've loved to have
seen the app at the same time

35
00:01:08.400 --> 00:01:09.397
come out that said,

36
00:01:09.397 --> 00:01:10.980
"you know, let me give
you a confidence level

37
00:01:10.980 --> 00:01:12.720
that that paper was written with ChatGPT."

38
00:01:12.720 --> 00:01:14.880
I mean, just a subtle
difference would've kind of

39
00:01:14.880 --> 00:01:18.450
telegraphed, "hey," or, I
mean, you've got to educate.

40
00:01:18.450 --> 00:01:20.490
So the first thing I think about is

41
00:01:20.490 --> 00:01:21.900
manage up and down in parallel.

42
00:01:21.900 --> 00:01:24.300
The second is you're gonna
have to have some guardrails.

43
00:01:24.300 --> 00:01:26.010
And to me, the guardrails,

44
00:01:26.010 --> 00:01:27.540
okay, yes, there can be regulation,

45
00:01:27.540 --> 00:01:29.110
but the real issue is

46
00:01:30.630 --> 00:01:32.040
think about its usage.

47
00:01:32.040 --> 00:01:33.840
I mean, I think the
regulation that would be effective

48
00:01:33.840 --> 00:01:35.250
would be on usage

49
00:01:35.250 --> 00:01:38.730
because people have different
tolerance levels for errors.

50
00:01:38.730 --> 00:01:40.950
You and I were just both talking
about Wikipedia for a second

51
00:01:40.950 --> 00:01:42.570
and when things are right
and when they're wrong.

52
00:01:42.570 --> 00:01:45.360
Like, okay, if you found a
few wrong things about me,

53
00:01:45.360 --> 00:01:46.193
you don't care.

54
00:01:46.193 --> 00:01:47.910
I mean, you're gonna ask
me and we'll clear that up.

55
00:01:47.910 --> 00:01:49.260
But if you're doing medical research,

56
00:01:49.260 --> 00:01:50.490
and I learned that through our work,

57
00:01:50.490 --> 00:01:52.830
when I work on really important problems,

58
00:01:52.830 --> 00:01:55.830
even though a human might
be wrong 30% of the time,

59
00:01:55.830 --> 00:01:58.500
when I ask a medical
question, I expect 100%.

60
00:01:58.500 --> 00:02:00.540
So there are really
different tolerance levels

61
00:02:00.540 --> 00:02:02.220
based on the problem.

62
00:02:02.220 --> 00:02:05.700
And now I think for the
public to not fear it,

63
00:02:05.700 --> 00:02:08.400
we have to be really careful
about giving good guidance

64
00:02:08.400 --> 00:02:10.890
on how it's trained, where it's used,

65
00:02:10.890 --> 00:02:13.710
and, you know, at this time,

66
00:02:13.710 --> 00:02:15.210
what are the right things to use it for?

67
00:02:15.210 --> 00:02:16.620
And I'm a little worried
about that right now

68
00:02:16.620 --> 00:02:19.080
because that's gotta come through quickly.

69
00:02:19.080 --> 00:02:22.470
Or, you know, Dr. Kissinger
had a great quote,

70
00:02:22.470 --> 00:02:24.900
he said, "history says when people

71
00:02:24.900 --> 00:02:26.670
don't understand something,

72
00:02:26.670 --> 00:02:28.410
they fear it or they revolt," right?

73
00:02:28.410 --> 00:02:31.440
And I'd hate to see that
sort of really hold back AI

74
00:02:31.440 --> 00:02:34.290
'cause I think there's
so much promise with it,

75
00:02:34.290 --> 00:02:35.460
but we're in that moment.

76
00:02:35.460 --> 00:02:38.430
So to me, good tech meant
anything you can do with trust,

77
00:02:38.430 --> 00:02:41.370
you know, these all fall under
trust, long term, short term,

78
00:02:41.370 --> 00:02:44.460
in parallel, the idea of being explainable

79
00:02:44.460 --> 00:02:47.820
with the technology and then
teaching people how to use it.

80
00:02:47.820 --> 00:02:50.430
And so they think, "hey, I'm
gonna have a better life."

81
00:02:50.430 --> 00:02:52.770
Not all this stuff about,
I learned at the beginning,

82
00:02:52.770 --> 00:02:56.430
it was back in 2012, "oh no,
it's gonna destroy all jobs."

83
00:02:56.430 --> 00:02:59.430
I kept saying, "no, no, no,
it's gonna change a lot of jobs,

84
00:02:59.430 --> 00:03:01.620
but you're focusing on the wrong thing."

85
00:03:01.620 --> 00:03:03.240
But if people think it's gonna mean

86
00:03:03.240 --> 00:03:04.530
they're gonna have a bad future,

87
00:03:04.530 --> 00:03:06.300
this is not good for a technology either.

88
00:03:06.300 --> 00:03:10.080
So, I think it can make you
and me better at our jobs.

89
00:03:10.080 --> 00:03:11.880
And there are a lot of people
that talk about that now,

90
00:03:11.880 --> 00:03:13.020
being a co-pilot.

91
00:03:13.020 --> 00:03:13.853
But I gotta tell you,

92
00:03:13.853 --> 00:03:15.300
I was talking about AI
ethics a decade ago.

93
00:03:15.300 --> 00:03:16.290
I couldn't get anybody to listen. (laughs)

94
00:03:16.290 --> 00:03:18.033
- Now everybody wants to.
- So I was, you know,

95
00:03:18.033 --> 00:03:20.070
now, I'm like, "okay, oh, okay."

96
00:03:20.070 --> 00:03:21.330
Like usually I'm a little bit early,

97
00:03:21.330 --> 00:03:22.890
but I was actually right on that one.

98
00:03:22.890 --> 00:03:25.020
Not everything, that one I was right on.

99
00:03:25.020 --> 00:03:26.880
- So if there's somebody
who's just starting out

100
00:03:26.880 --> 00:03:28.680
their career in technology right now,

101
00:03:28.680 --> 00:03:30.150
maybe they're a recent college graduate

102
00:03:30.150 --> 00:03:31.740
or they're changing industries,

103
00:03:31.740 --> 00:03:33.660
and they're facing these thorny issues

104
00:03:33.660 --> 00:03:36.840
from AI to privacy to everything
else, what's your advice

105
00:03:36.840 --> 00:03:39.323
for somebody just starting out
in the tech world right now?

106
00:03:40.590 --> 00:03:42.123
- I think my advice would be,

107
00:03:43.680 --> 00:03:46.740
you know, do your work
with values in mind,

108
00:03:46.740 --> 00:03:48.600
with whatever your values are.

109
00:03:48.600 --> 00:03:50.370
Apply that to your work.

110
00:03:50.370 --> 00:03:54.300
And that's because I'd love
to see these technologies

111
00:03:54.300 --> 00:03:56.610
built from the beginning
with trust in 'em.

112
00:03:56.610 --> 00:03:58.290
That would be my advice, right?

113
00:03:58.290 --> 00:04:00.040
And just be ever curious with them.