1
00:00:17,680 --> 00:00:18,620
Hello, world.

2
00:00:19,040 --> 00:00:19,780
I'm Tomasino.

3
00:00:20,400 --> 00:00:22,060
This is Solar Punk Prompts,

4
00:00:22,500 --> 00:00:25,060
a series for writers where we discuss solar punk,

5
00:00:25,580 --> 00:00:28,700
a movement that imagines a world where technology is used

6
00:00:28,700 --> 00:00:29,660
for the good of the planet.

7
00:00:30,280 --> 00:00:32,299
Or, as one guy on Reddit describes it,

8
00:00:32,860 --> 00:00:36,020
riding the catabolic collapse to some kind of sustainable

9
00:00:36,020 --> 00:00:36,620
society.

10
00:00:37,400 --> 00:00:38,140
In this series,

11
00:00:38,740 --> 00:00:42,820
we spend each episode exploring a single solar punk story

12
00:00:42,820 --> 00:00:46,040
prompt, adding some commentary, some inspirations,

13
00:00:46,480 --> 00:00:47,260
and some considerations.

14
00:00:48,440 --> 00:00:49,200
Most importantly,

15
00:00:49,440 --> 00:00:52,260
we consider how that story might help us to better envision

16
00:00:52,260 --> 00:00:53,800
a sustainable civilization.

17
00:00:54,780 --> 00:00:56,280
If this is your first time here,

18
00:00:56,480 --> 00:00:58,980
I'd recommend checking out our introduction episode first,

19
00:00:59,280 --> 00:01:01,180
where we talk about what solar punk is,

20
00:01:01,520 --> 00:01:04,239
why you should care, and why this series came into being.

21
00:01:05,160 --> 00:01:07,480
Today's prompt is The Beekeepers.

22
00:01:08,640 --> 00:01:11,980
A team of environmentalists and neural network researchers

23
00:01:11,980 --> 00:01:17,360
are training new bee-like AIs by having them coexist with

24
00:01:17,360 --> 00:01:20,240
the animal populations of a local ecosystem,

25
00:01:20,700 --> 00:01:23,060
calling themselves the Beekeepers.

26
00:01:23,660 --> 00:01:27,920
The core idea of this prompt is not the novel technology,

27
00:01:27,960 --> 00:01:31,860
but rather the shifting perception of AI from a sort of

28
00:01:31,860 --> 00:01:33,880
singularity to a beast of burden.

29
00:01:34,500 --> 00:01:37,380
It is a dumb animal to be trained and used,

30
00:01:37,680 --> 00:01:38,960
cared for and herded.

31
00:01:39,760 --> 00:01:41,260
To carry forward the bee analogy,

32
00:01:41,600 --> 00:01:45,440
a wild AI may flit around harmlessly, or it may sting you,

33
00:01:45,640 --> 00:01:49,540
but a colony well trained and cared for can be healthy and

34
00:01:49,540 --> 00:01:50,840
provide much for the community.

35
00:01:52,000 --> 00:01:56,820
A proper orientation of AI in our lives and communities is

36
00:01:56,820 --> 00:01:57,260
essential.

37
00:01:57,260 --> 00:02:01,300
One of the biggest dangers is the possibility of

38
00:02:01,300 --> 00:02:02,000
dependency.

39
00:02:02,380 --> 00:02:06,300
John Havens works at the forefront of promoting ethical AI,

40
00:02:06,740 --> 00:02:08,440
which prioritizes human well-being.

41
00:02:09,039 --> 00:02:12,300
He is the executive director of the IEEE Global

42
00:02:12,300 --> 00:02:15,940
Initiative on Ethics of Autonomous and Intelligent Systems.

43
00:02:16,900 --> 00:02:18,140
He states, quote,

44
00:02:18,720 --> 00:02:22,620
the biggest risk of AI that anyone faces is the loss of

45
00:02:22,620 --> 00:02:24,160
ability to think for yourself.

46
00:02:25,020 --> 00:02:27,740
We're already seeing people are forgetting how to read

47
00:02:27,740 --> 00:02:29,580
maps, they're forgetting other skills.

48
00:02:30,300 --> 00:02:32,660
If we've lost the ability to be introspective,

49
00:02:32,800 --> 00:02:34,280
we've lost human agency,

50
00:02:34,300 --> 00:02:36,220
and we're spinning around in circles.

51
00:02:37,400 --> 00:02:40,380
There are other deep psychological questions as well.

52
00:02:40,940 --> 00:02:43,980
Our relationships with these systems can change our

53
00:02:43,980 --> 00:02:45,540
perceptions of ourselves.

54
00:02:46,380 --> 00:02:49,080
In his article, The March of the Robot Dogs,

55
00:02:49,620 --> 00:02:51,940
published in Ethics and Information Technology,

56
00:02:52,500 --> 00:02:53,600
Robert Sparrow states,

57
00:02:54,120 --> 00:02:56,960
to truly benefit from relationships with artificial

58
00:02:56,960 --> 00:02:59,200
intelligences, a person would have to quote,

59
00:02:59,540 --> 00:03:03,020
systematically delude themselves regarding the real nature

60
00:03:03,020 --> 00:03:04,700
of their relation with that AI.

61
00:03:05,760 --> 00:03:06,120
Further,

62
00:03:06,000 --> 00:03:10,000
he calls it a sentimentality of a morally deplorable sort.

63
00:03:10,280 --> 00:03:13,500
Vulnerable people would be especially at risk of falling

64
00:03:13,500 --> 00:03:14,700
prey to this deception.

65
00:03:15,780 --> 00:03:19,200
A misplacement of relationships with AI can also affect how

66
00:03:19,200 --> 00:03:20,060
we see one another.

67
00:03:20,980 --> 00:03:21,780
Nicholas A.

68
00:03:21,980 --> 00:03:24,140
Christakis writes in his article in The Atlantic,

69
00:03:24,480 --> 00:03:27,580
How AI Will Rewire Us, quote,

70
00:03:28,080 --> 00:03:32,080
machines made to look and act like us could also affect the

71
00:03:32,080 --> 00:03:35,300
social suite of capacities we have evolved to cooperate

72
00:03:35,300 --> 00:03:39,160
with one another, including love, friendship, cooperation,

73
00:03:39,960 --> 00:03:40,440
and teaching.

74
00:03:41,380 --> 00:03:44,800
And these concerns are only confounded when we begin to

75
00:03:44,800 --> 00:03:47,220
question whether AIs have personhood.

76
00:03:47,880 --> 00:03:50,860
Do they have moral or legal agency?

77
00:03:51,680 --> 00:03:55,420
In 2017, the EU Parliament invited the European Commission,

78
00:03:55,660 --> 00:03:57,400
quote, to explore, analyze,

79
00:03:57,620 --> 00:04:00,620
and consider the implications of all possible legal

80
00:04:00,620 --> 00:04:01,280
solutions,

81
00:04:01,460 --> 00:04:05,940
including creating a specific legal status for robots in

82
00:04:05,940 --> 00:04:09,060
the long run so that at least the most sophisticated

83
00:04:09,060 --> 00:04:12,300
autonomous robots could be established as having the status

84
00:04:12,300 --> 00:04:16,300
of electronic persons responsible for making good any

85
00:04:16,300 --> 00:04:17,320
damage they may cause,

86
00:04:17,579 --> 00:04:21,500
and possibly applying electronic personality to cases where

87
00:04:21,500 --> 00:04:25,100
robots make autonomous decisions or otherwise interact with

88
00:04:25,100 --> 00:04:26,400
third parties independently.

89
00:04:27,380 --> 00:04:29,740
This sounds good on the surface level,

90
00:04:29,900 --> 00:04:32,200
respecting the intelligence of these creations,

91
00:04:32,720 --> 00:04:35,500
but it sparked a number of immediate objections from the

92
00:04:35,500 --> 00:04:35,820
field.

93
00:04:36,480 --> 00:04:36,980
In response,

94
00:04:37,280 --> 00:04:40,240
an open letter from several artificial intelligence and

95
00:04:40,240 --> 00:04:43,580
robotics experts stated that the creation of a legal status

96
00:04:43,580 --> 00:04:47,140
of electronic personhood for autonomous, unpredictable,

97
00:04:47,740 --> 00:04:50,560
and self-learning robots should be discarded from

98
00:04:50,560 --> 00:04:52,960
technical, legal, and ethical perspectives.

99
00:04:53,840 --> 00:04:57,040
Attributing electronic personhood to robots risks

100
00:04:57,040 --> 00:05:00,960
misplacing moral responsibility, causal accountability,

101
00:05:01,640 --> 00:05:05,600
and legal liability regarding their mistakes and misuses.

102
00:05:06,000 --> 00:05:07,100
In short,

103
00:05:07,520 --> 00:05:10,420
it's more of an excuse for creators to hide their mistakes

104
00:05:10,420 --> 00:05:14,140
in shaping these AIs than it is to respect the creation.

105
00:05:15,120 --> 00:05:18,500
Blame the robot, we will shout, not Robot Corp.

106
00:05:19,440 --> 00:05:22,620
If these systems are to persist in our future worlds,

107
00:05:23,020 --> 00:05:25,680
it is imperative that we establish our relationships with

108
00:05:25,680 --> 00:05:28,400
them in ways that are sustainable to our psyches,

109
00:05:28,860 --> 00:05:31,460
the environment, and our economic systems.

110
00:05:32,480 --> 00:05:34,900
Leveraging AIs as beasts of burden, then,

111
00:05:35,280 --> 00:05:38,820
helps us think about one aspect of that relationship in a

112
00:05:38,820 --> 00:05:39,520
healthier way.

113
00:05:40,440 --> 00:05:43,920
Your story may also serve as a playground to address the

114
00:05:43,920 --> 00:05:47,280
other issues, like job losses, environmental damage,

115
00:05:47,540 --> 00:05:48,720
energy use, and so on.

116
00:05:49,740 --> 00:05:50,740
But for this prompt,

117
00:05:51,000 --> 00:05:53,860
we are mostly focusing on the human relationships.

118
00:05:54,360 --> 00:06:00,020
We are explicit about the role of the AI bees by placing them in

119
00:06:00,020 --> 00:06:01,480
context with other animals.

120
00:06:02,160 --> 00:06:04,580
They are trained to socialize with those creatures,

121
00:06:04,840 --> 00:06:08,620
whether they be domesticated farm animals or the local

122
00:06:08,620 --> 00:06:09,220
wildlife.

123
00:06:10,180 --> 00:06:12,680
The bees are not friends who make us tea in our homes,

124
00:06:12,920 --> 00:06:16,140
they are animals learning to coexist in harmony with their

125
00:06:16,140 --> 00:06:16,580
environment.

126
00:06:17,700 --> 00:06:18,840
As beekeepers,

127
00:06:19,260 --> 00:06:22,240
the human role is to shape that learning through training.

128
00:06:23,180 --> 00:06:25,340
Now what shapes might that take?

129
00:06:26,380 --> 00:06:35,160
One idea is that these AI bees are the first step in

130
00:06:35,160 --> 00:06:38,440
halfway training them, not to a specific job,

131
00:06:38,700 --> 00:06:40,040
but to jobs in general.

132
00:06:40,900 --> 00:06:44,600
Recent studies in invertebrate neurobiology are providing

133
00:06:44,600 --> 00:06:48,120
new insights into the ways neurons are organized into

134
00:06:48,120 --> 00:06:50,580
functional networks to generate behavior.

135
00:06:51,360 --> 00:06:54,960
They are sort of building blocks of intelligence in a

136
00:06:54,960 --> 00:06:55,640
general sense.

137
00:06:56,300 --> 00:07:04,260
One could consider that these bees are being farmed into

138
00:07:04,260 --> 00:07:07,280
half-products that could later be sold or shared with other

139
00:07:07,280 --> 00:07:08,560
groups and communities.

140
00:07:09,480 --> 00:07:12,780
If that feels a little too fine a distinction,

141
00:07:13,160 --> 00:07:16,060
or you're excited to write about AIs in a collective,

142
00:07:16,960 --> 00:07:19,500
there's the idea of hive intelligence to consider,

143
00:07:19,980 --> 00:07:23,200
where one bee may be little more than a collection of

144
00:07:23,200 --> 00:07:24,780
sensors and basic processing,

145
00:07:25,160 --> 00:07:28,560
while the hive may accomplish more complex tasks.

146
00:07:29,280 --> 00:07:33,320
A point of consideration here would be to really nail down

147
00:07:33,320 --> 00:07:36,600
why a hive makes more sense than a single entity.

148
00:07:37,300 --> 00:07:39,440
The distinction should be important to the action,

149
00:07:39,740 --> 00:07:42,000
otherwise it's simply descriptive coloring.

150
00:07:42,820 --> 00:07:46,220
One possible perspective is that the bees have a role in

151
00:07:46,220 --> 00:07:46,780
the environment.

152
00:07:47,600 --> 00:07:50,360
They monitor the ecosystem by living in it,

153
00:07:50,780 --> 00:07:51,880
acting as caretakers.

154
00:07:52,580 --> 00:07:55,480
Perhaps they watch out for pollution or poachers.

155
00:07:55,860 --> 00:07:59,160
Perhaps they augment and encourage pollination.

156
00:08:00,140 --> 00:08:03,460
Each bee on its own has individual tasks,

157
00:08:03,900 --> 00:08:05,760
while the collective, as a whole, oversees

158
00:08:06,000 --> 00:08:07,220
and shapes the environment.

159
00:08:08,340 --> 00:08:12,260
The beekeeper's role in training these bees is as stewards

160
00:08:12,260 --> 00:08:13,480
to the stewards.

161
00:08:14,120 --> 00:08:18,140
They must learn the ways to help and coexist through trial

162
00:08:18,140 --> 00:08:20,500
and error, testing, and verification.

163
00:08:21,700 --> 00:08:25,180
The beekeepers become something between a teacher and an

164
00:08:25,180 --> 00:08:25,700
artist then.

165
00:08:26,680 --> 00:08:30,300
The ecosystem and its connection to the bees and beekeepers

166
00:08:30,300 --> 00:08:32,080
is an essential part of the story.

167
00:08:33,059 --> 00:08:36,419
A lovely setting for this prompt might be within a forest

168
00:08:36,419 --> 00:08:36,880
or park.

169
00:08:37,860 --> 00:08:40,280
Do the bees draw their power from that setting?

170
00:08:40,799 --> 00:08:44,140
Is there geothermal energy or are they solar machines

171
00:08:44,140 --> 00:08:44,940
working by daylight?

172
00:08:45,840 --> 00:08:48,120
How do the humans interact with the environment?

173
00:08:49,020 --> 00:08:51,820
Is this a place where the community spends a lot of time?

174
00:08:52,540 --> 00:08:53,580
A respite from the city?

175
00:08:54,160 --> 00:08:55,940
An ecological preserve perhaps?

176
00:08:56,940 --> 00:09:00,360
In addition to the scientists and teachers of the bees,

177
00:09:00,680 --> 00:09:02,180
who else spends time here?

178
00:09:02,660 --> 00:09:05,120
What sort of relationship do the characters have

179
00:09:05,120 --> 00:09:06,160
with the place itself?

180
00:09:06,820 --> 00:09:10,280
Is this a place of spiritual power or an agricultural

181
00:09:10,280 --> 00:09:10,760
center?

182
00:09:11,560 --> 00:09:12,720
In my own imagination,

183
00:09:13,400 --> 00:09:15,760
I've created a spiritually active community,

184
00:09:16,000 --> 00:09:19,400
perhaps with roots in an indigenous culture from the global

185
00:09:19,400 --> 00:09:19,760
south.

186
00:09:20,620 --> 00:09:24,020
They maintain a large forest near their settlement and

187
00:09:24,020 --> 00:09:27,660
encourage the biodiversity and natural habitats there with

188
00:09:27,660 --> 00:09:35,860
the support of the bees. It is a place of healing, literally.

189
00:09:36,740 --> 00:09:38,320
From here they source their medicines,

190
00:09:39,080 --> 00:09:40,960
and in here they find their mental peace.

191
00:09:41,920 --> 00:09:43,680
And then there's young Carlos,

192
00:09:44,160 --> 00:09:45,940
who despite his reverence for his elders,

193
00:09:46,180 --> 00:09:48,000
cares only about the science at work.

194
00:09:48,580 --> 00:09:51,540
He wants to be a researcher and to leverage the bees for

195
00:09:51,540 --> 00:09:55,140
deeper learning about this place and what secrets it holds

196
00:09:55,140 --> 00:09:55,780
for their future.

197
00:09:57,100 --> 00:09:59,920
These different outlooks can begin as a point of difficulty

198
00:09:59,920 --> 00:10:03,180
and eventually opportunity for the characters to see the

199
00:10:03,180 --> 00:10:04,340
world from each other's viewpoint.

200
00:10:05,040 --> 00:10:06,560
Or maybe, instead of Carlos,

201
00:10:07,060 --> 00:10:09,900
my lens settles on a group of young children who

202
00:10:09,900 --> 00:10:12,080
are exploring the forest, building forts,

203
00:10:12,420 --> 00:10:13,640
and having playful adventures.

204
00:10:14,680 --> 00:10:16,500
Deep in its sheltering trees,

205
00:10:17,100 --> 00:10:18,820
they encounter the strangest creatures.

206
00:10:19,580 --> 00:10:22,540
They find them sometimes moving through the forest or

207
00:10:22,540 --> 00:10:23,780
inside a rare flower.

208
00:10:24,540 --> 00:10:27,240
They have been here for as long as any child remembers,

209
00:10:28,300 --> 00:10:29,940
though they don't know what they are.

210
00:10:30,720 --> 00:10:33,480
They care for the trees, for the animals, and plants.

211
00:10:34,160 --> 00:10:35,720
They are always kind to children.

212
00:10:36,400 --> 00:10:38,420
Maybe they are spirits or sprites.

213
00:10:38,960 --> 00:10:41,180
Maybe they are animals themselves.

214
00:10:42,340 --> 00:10:45,520
Maybe the children have grown up with Shinto beliefs about

215
00:10:45,520 --> 00:10:47,540
local gods, and they wonder.

216
00:10:49,080 --> 00:10:51,660
This prompt gives us a lot to play with.

217
00:10:52,020 --> 00:10:54,520
There is room within the theme for social criticism,

218
00:10:54,920 --> 00:10:56,620
as well as artful dreaming.

219
00:10:57,380 --> 00:10:59,740
So many little elements are there for you.

220
00:10:59,740 --> 00:11:03,320
They just need to be taught to work together and set in the

221
00:11:03,320 --> 00:11:06,380
right direction, like a single hive.

222
00:11:07,320 --> 00:11:10,640
Until next time, I'm Tomasino.

223
00:11:11,160 --> 00:11:14,400
I hope you'll join me for the next Solar Punk Prompt.

224
00:11:15,980 --> 00:11:18,940
Music in this recording is bio-field,

225
00:11:19,500 --> 00:11:22,680
from Global Patterns Compilation, Solar Punk,

226
00:11:22,960 --> 00:11:24,540
A Brighter Perspective.