July 8, 2025

Why Parents Should Be Concerned About AI Barbie

What happens when Barbie becomes your kid's AI best friend?

Look, we're already watching kids lose basic human skills because they can Google everything. Now Mattel wants to stick ChatGPT into toys so your eight-year-old can have deep conversations with Optimus Prime about life's meaning. Stanford says nobody under 18 should be touching this stuff, but apparently corporate profits trump child psychology. We're about to witness the first generation that never has to imagine anything because their toys will do it for them. Meanwhile, we're racing toward a 15-hour work week where robots do everything, leaving men, especially, scrambling for purpose in a world that no longer needs their Protestant work ethic. The future looks like either Ready Player One or we all become batteries for our robot overlords - and honestly, I'm not sure which is worse.

Listen now to discover how to protect your kids' humanity in an increasingly artificial world.

Topics Discussed:

  • Why AI-powered toys represent a bigger threat than Terminator's Skynet
  • How ChatGPT integration removes children's need for imagination and creativity
  • The concerning link between AI exposure and mental health breakdowns in young people
  • Why Stanford researchers say no one under 18 should use AI technology
  • How losing our imagination muscles affects human development and problem-solving
  • The coming economic disruption of widespread job automation and universal basic income
  • Why men's mental health will suffer most when work-based identity disappears
  • How virtual reality and AI dependency mirror dystopian movie predictions
  • The difference between helpful technology and creativity-killing shortcuts
  • What happens when an entire generation grows up with "AI privilege"

----

MORE FROM THE FIT MESS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to The Fit Mess on YouTube

Join our community in the Fit Mess Facebook group

----


1
00:00:00,151 --> 00:00:01,612
Hi there, it's the Fit Mess.

2
00:00:01,612 --> 00:00:04,655
Admittedly a name that doesn't really fit what we talk about anymore.

3
00:00:04,655 --> 00:00:05,625
It feels weird.

4
00:00:05,625 --> 00:00:06,896
We might need to switch that up soon.

5
00:00:06,896 --> 00:00:09,117
So stay tuned for that.

6
00:00:09,879 --> 00:00:13,321
I'm Jeremy and I've been using computers for a while.

7
00:00:13,321 --> 00:00:15,853
And that's about the extent of my knowledge of the topics we talk about.

8
00:00:15,853 --> 00:00:23,168
But Jason, you have deep, deep knowledge of how all the AI and the internet and the
computers and the buttons and all the things do the things.

9
00:00:23,949 --> 00:00:25,110
Been doing this a long time.

10
00:00:25,110 --> 00:00:26,511
Doing this a long time.

11
00:00:26,773 --> 00:00:36,969
So that's why we talk about AI, because we're fascinated by it and we're fascinated in how
it's affecting our mental health and all of the aspects of reality and life as we know

12
00:00:36,969 --> 00:00:37,289
them.

13
00:00:37,289 --> 00:00:41,871
So there's another headline that caught my eye as we were looking for things to talk about.

14
00:00:43,052 --> 00:00:44,263
I'll read the headline in a second.

15
00:00:44,263 --> 00:00:54,828
First, you know, ever since we started talking about AI, like as a people, primarily after
Terminator, everyone's been terrified that Skynet, the thing from the movie, is going

16
00:00:54,828 --> 00:00:55,839
to happen, that the Terminator scenario

17
00:00:55,839 --> 00:00:57,520
is gonna come true.

18
00:00:57,621 --> 00:01:03,482
I argue that there is a movie that is even more terrifying that is now on the verge of
coming true.

19
00:01:03,556 --> 00:01:08,305
I mean to be fair, Chucky was not an artificial intelligence.

20
00:01:08,305 --> 00:01:12,055
Chucky was a possessed doll, just a possessed doll.

21
00:01:12,055 --> 00:01:13,446
Yes, but he's about to be.

22
00:01:13,446 --> 00:01:17,341
This is the headline: as ChatGPT is linked to mental health breakdowns,

23
00:01:17,341 --> 00:01:21,826
Mattel announces plans to incorporate it into children's toys.

24
00:01:22,147 --> 00:01:24,840
Chucky is coming to fucking kill us all.

25
00:01:25,658 --> 00:01:28,171
I don't know if Chucky's coming to kill us all.

26
00:01:28,171 --> 00:01:32,615
Although, that does seem like it would be a hot toy for Christmas.

27
00:01:32,615 --> 00:01:39,004
ah Yeah, it seems like it would be very exciting.

28
00:01:39,159 --> 00:01:47,761
So OpenAI and Mattel are teaming up to make toys that are enhanced with AI so that you
basically don't need an imaginary friend anymore.

29
00:01:47,761 --> 00:01:50,102
Barbie's actually going to hang out with you.

30
00:01:50,398 --> 00:01:52,629
Yeah, and it won't just be Barbie.

31
00:01:52,629 --> 00:02:01,931
It'll be your Barbie tailored to you with proper prompt engineering and all the things
that go into it and probably not enough safeguards and rails on it and everything else.

32
00:02:02,552 --> 00:02:08,233
And Barbie might wind up, okay, there's a fantastic show called Robot Chicken.

33
00:02:08,473 --> 00:02:14,435
And in Robot Chicken, they basically do a version of stop motion animation using action
figures.

34
00:02:15,395 --> 00:02:17,076
That's what's coming.

35
00:02:17,236 --> 00:02:17,787
Like.

36
00:02:17,787 --> 00:02:19,958
the world.

37
00:02:19,958 --> 00:02:22,521
Dude, I didn't even think about Transformers.

38
00:02:22,863 --> 00:02:23,875
That's cool.

39
00:02:23,875 --> 00:02:31,920
But the ability for people to keep imagining things and having to do the struggle and the
fight to make these things happen in reality, that's going to go away.

40
00:02:31,920 --> 00:02:40,144
So our imagination muscles are going to decline because we're not going to have to script
this shit in our head.

41
00:02:40,185 --> 00:02:42,166
It's just going to be scripted right out there for us.

42
00:02:42,166 --> 00:02:45,888
And there's a really good example of this happening right now in education.

43
00:02:45,888 --> 00:02:51,649
You've already got high school kids who go through and use ChatGPT to write their essays.

44
00:02:51,649 --> 00:02:56,790
And I forget, I can't remember the name of the study, but there was a study that was done
where they watched high school kids.

45
00:02:56,790 --> 00:03:02,441
And over the course of three essays, they watched how badly these kids...

46
00:03:02,441 --> 00:03:03,743
uh

47
00:03:03,820 --> 00:03:15,899
performed, I guess, on these essays. Because the first time it was like, I can kind of tell
ChatGPT had an influence in this. But the second one, it's like, okay, there seem to be a

48
00:03:15,899 --> 00:03:18,180
few things that are just copied and pasted. By the third one,

49
00:03:18,180 --> 00:03:27,577
it's just full-on copied and pasted. So it's a crutch that we're already gonna lean on. And
don't get me wrong, I mean, human beings are toolmakers. We have survived

50
00:03:27,577 --> 00:03:32,140
and evolved to the point that we are because we are toolmakers and things become easier
and there's

51
00:03:32,140 --> 00:03:34,171
There's great examples of this in modern history, right?

52
00:03:34,171 --> 00:03:40,967
Like, how many people know the 500 phone numbers that they have in their phone of the
people that they want to connect to now?

53
00:03:40,967 --> 00:03:42,898
Like there's no frigging way.

54
00:03:43,139 --> 00:03:46,622
Most people don't even know their other family members' telephone numbers.

55
00:03:46,622 --> 00:03:51,765
They know them by their name and they can look on their phone and click on a button that
automatically dials out.

56
00:03:52,106 --> 00:04:01,153
That is not how people envisioned things when they came up with telephone numbers back in,
like, the 20s and 30s.

57
00:04:01,974 --> 00:04:06,348
These are things that have evolved and changed over time, and they're much more effective
and they're good for us, right?

58
00:04:06,348 --> 00:04:07,439
They've helped.

59
00:04:07,439 --> 00:04:11,462
But also, when crunch time comes, I can't find my phone.

60
00:04:11,462 --> 00:04:13,224
Can I borrow your phone to call my parents?

61
00:04:13,224 --> 00:04:13,724
Yeah.

62
00:04:13,724 --> 00:04:15,626
Oh, mom and dad.

63
00:04:15,626 --> 00:04:17,907
Wait a minute, you're not my mom and dad.

64
00:04:17,907 --> 00:04:20,229
You're this person's mom and dad.

65
00:04:20,770 --> 00:04:23,692
these are things that are gonna happen that are gonna come.

66
00:04:23,692 --> 00:04:28,997
And I forced my kids to learn our telephone numbers on purpose, like even when they had
cell phones.

67
00:04:28,997 --> 00:04:30,728
But that skill is going away.

68
00:04:30,942 --> 00:04:39,529
Imagine that for everything, because that's where we're heading. Like, you're not gonna have
to remember anything anymore. You're just gonna have to have a semblance of how to ask a

69
00:04:39,529 --> 00:04:40,720
question effectively.

70
00:04:41,394 --> 00:04:41,684
Right.

71
00:04:41,684 --> 00:04:52,291
Well, and the scary part about this, and, you know, taking it back to this article, they
talk about how we are already seeing mounting evidence that AI is having a

72
00:04:52,291 --> 00:04:55,633
negative effect on our mental health, particularly on young people.

73
00:04:55,633 --> 00:04:59,685
Like Stanford did a big study and said that basically nobody under 18 should be touching
this shit.

74
00:04:59,685 --> 00:05:03,378
Like, it can be so dangerous. We talk about vulnerable populations.

75
00:05:03,378 --> 00:05:07,040
We talked about that on the last episode, people with schizophrenia, different mental
health issues.

76
00:05:07,464 --> 00:05:12,326
interacting with a fake person and the advice that they're given, the things that they're
told.

77
00:05:12,326 --> 00:05:20,909
So now when they're hanging out with Barbie and Optimus Prime, not only is their
imagination going to be deteriorating, because they'll literally be able to tell it what to do,

78
00:05:20,909 --> 00:05:28,933
their toys will literally be their friends because they'll be able to just
hang out with them, which, like, eight-year-old me is psyched.

79
00:05:28,933 --> 00:05:29,293
Right.

80
00:05:29,293 --> 00:05:31,934
That sounds so fucking fun.

81
00:05:32,194 --> 00:05:37,446
But, like, eight-year-old me was setting up plastic figures and,

82
00:05:37,897 --> 00:05:47,936
you know, coming up with the dialogue myself and the storylines and the battles and
the conflicts and resolving them myself, rather than just going, hey, Optimus Prime, how

83
00:05:47,936 --> 00:05:49,137
would you rip Barbie's head off?

84
00:05:49,137 --> 00:05:50,579
Like, right?

85
00:05:50,579 --> 00:05:56,934
Like, and all of a sudden it's just like whatever question and kids, by the way, ask a lot
of fucking questions.

86
00:05:56,995 --> 00:05:58,566
Ask my nine year old.

87
00:05:58,668 --> 00:06:06,323
Well, and kids ask a lot of questions of other kids because they're afraid to ask adults.

88
00:06:06,984 --> 00:06:11,167
most of us learned about the birds and the bees not from schools or adults.

89
00:06:11,167 --> 00:06:13,808
We learned about it from our friends.

90
00:06:14,389 --> 00:06:18,252
We learned to swear from our friends.

91
00:06:18,252 --> 00:06:21,814
We got our pop culture icons from our friends.

92
00:06:22,114 --> 00:06:26,560
These things are actually very important to the development of a human brain.

93
00:06:26,560 --> 00:06:28,992
to be able to understand and operate within the world.

94
00:06:28,992 --> 00:06:42,561
And in the context of the world now, when you start putting AI, a cute AI, in these spots,
tuned to that audience member, to that age group, to elicit a certain outcome.

95
00:06:43,618 --> 00:06:46,178
That could get very scary very fast.

96
00:06:46,178 --> 00:06:52,198
And you can't go back and yell at Mattel, the parents of Barbie and say, your kid did
this.

97
00:06:52,198 --> 00:06:57,658
You know, Schwartz isn't going to get the shit kicked out of him for teaching him how to
say fuck when dad's changing the tire, like in A Christmas Story.

98
00:06:57,658 --> 00:06:59,378
Like that's not going to happen.

99
00:06:59,418 --> 00:07:02,698
You're not going to be able to go after these people in this kind of way.

100
00:07:02,698 --> 00:07:06,798
And who knows the impressionable nature of these folks.

101
00:07:06,798 --> 00:07:11,738
And I mean, part of me wants to trust Mattel to, like, do the right thing.

102
00:07:11,990 --> 00:07:13,722
But I'm not really worried about Mattel.

103
00:07:13,722 --> 00:07:22,440
I'm worried about putting these things that could have an extremely large social influence
on my children that's going to be wired up and connected to the internet, that's going to

104
00:07:22,440 --> 00:07:26,443
be subject to large language models, that's going to be subject to being hacked.

105
00:07:26,443 --> 00:07:30,266
Like there's all kinds of things that could be problematic in this way.

106
00:07:31,918 --> 00:07:33,675
yeah, absolutely.

107
00:07:33,675 --> 00:07:34,896
I can hear the argument already.

108
00:07:34,896 --> 00:07:38,048
Oh, Mattel will build in the safeguards and make sure they're safe. Bullshit.

109
00:07:38,048 --> 00:07:43,621
Like we see it already that this thing knows how to ignore what it's programmed to do to
do whatever it wants to do.

110
00:07:43,821 --> 00:07:46,633
And I don't want to sound like the old man yelling, get off my lawn here.

111
00:07:46,633 --> 00:07:51,296
'Cause part of me hears the argument that it's like these commercials on TV.

112
00:07:51,296 --> 00:07:55,327
But when Barbie is suddenly saying, gosh, I wish I had Ken here to play with like

113
00:07:55,327 --> 00:08:05,376
all of a sudden the toys are going to be advertising for the acquisition of other toys and
more things. Like, they become advertisements in themselves, which, you know, that was

114
00:08:05,376 --> 00:08:06,226
the argument for TV.

115
00:08:06,226 --> 00:08:11,671
So again, I don't want to be the old man that's like, can't go forward with progress and
things.

116
00:08:11,671 --> 00:08:17,766
But, like, it is scary, because the kids are going to trust the
relationships they're going to have with toys.

117
00:08:17,766 --> 00:08:22,538
And they're going to take that advice and want more and become basically better consumers.

118
00:08:22,538 --> 00:08:27,801
And the argument against TV and children watching a bunch of it is not unfounded.

119
00:08:27,801 --> 00:08:29,162
It's actually true.

120
00:08:29,162 --> 00:08:30,343
It happened.

121
00:08:30,343 --> 00:08:32,404
And we have created a mass consumer culture.

122
00:08:32,404 --> 00:08:35,566
I mean, we grew up watching Saturday morning cartoons advertising Mr.

123
00:08:35,566 --> 00:08:36,766
T cereal to us.

124
00:08:36,766 --> 00:08:39,548
I don't know about you, but I ate a fuck ton of Mr.

125
00:08:39,548 --> 00:08:40,728
T cereal.

126
00:08:40,789 --> 00:08:41,929
It was delicious.

127
00:08:41,929 --> 00:08:42,996
It was like, OK.

128
00:08:42,996 --> 00:08:43,928
I pity the fool that didn't have Mr.

129
00:08:43,928 --> 00:08:45,286
T cereal.

130
00:08:45,286 --> 00:08:46,067
it's so good.

131
00:08:46,067 --> 00:08:49,088
And if you guys are listening to this and you didn't have Mr.

132
00:08:49,088 --> 00:08:55,258
T's cereal and you can go out and you can find an old box on eBay, I highly recommend that
you don't because it's probably rotten at this point.

133
00:08:55,258 --> 00:08:56,151
It'll probably kill you.

134
00:08:56,151 --> 00:09:07,156
But if you want a close approximation, what you have to do is go and get Cap'n
Crunch Crunch Berries and Kix and put them in the same bowl together.

135
00:09:07,156 --> 00:09:09,487
And like it's not that far off.

136
00:09:09,487 --> 00:09:11,547
And the flavor profile, the flavor profile is there.

137
00:09:11,547 --> 00:09:13,248
The crunch profile is there.

138
00:09:13,580 --> 00:09:15,710
Don't ask me why I spent so much time looking into this.

139
00:09:15,710 --> 00:09:16,554
I mean...

140
00:09:16,554 --> 00:09:20,278
I was thinking it's like slightly stale Cap'n Crunch.

141
00:09:20,278 --> 00:09:22,519
Yeah.

142
00:09:22,519 --> 00:09:29,302
Well, the Kix adds that lighter tone to it, because otherwise Cap'n Crunch just rips
your mouth to pieces, and it's not as airy.

143
00:09:29,302 --> 00:09:31,473
Um, it's razor blades.

144
00:09:31,473 --> 00:09:34,974
And the Crunch Berries specifically, the sweetness from that and the Kix.

145
00:09:34,974 --> 00:09:36,725
my god, we've gotten far off the rails.

146
00:09:37,125 --> 00:09:40,206
This entire show brought to you by AI, by Mr.

147
00:09:40,206 --> 00:09:40,436
T.

148
00:09:40,436 --> 00:09:41,167
Cereal.

149
00:09:41,167 --> 00:09:41,557
Mr.

150
00:09:41,557 --> 00:09:41,767
T.

151
00:09:41,767 --> 00:09:42,427
needs an AI.

152
00:09:42,427 --> 00:09:43,868
I want my avatar to be Mr.

153
00:09:43,868 --> 00:09:44,668
T.

154
00:09:44,948 --> 00:09:47,029
I would be all in for that, yeah.

155
00:09:47,029 --> 00:09:49,666
And I don't want the...

156
00:09:49,666 --> 00:09:50,706
the actual Mr.

157
00:09:50,706 --> 00:09:54,326
T, like the man of God who's super kind and super friendly and all that.

158
00:09:54,326 --> 00:09:55,286
I want B.A.

159
00:09:55,286 --> 00:09:55,886
Baracus.

160
00:09:55,886 --> 00:09:56,766
I want B.A.

161
00:09:56,766 --> 00:09:59,566
Baracus yelling at me, Hannibal, fool, I ain't getting on no airplane.

162
00:09:59,566 --> 00:10:00,628
Like, I gotta get on the-

163
00:10:00,628 --> 00:10:01,320
He's afraid to fly.

164
00:10:01,320 --> 00:10:03,255
my god, we're so far off topic

165
00:10:03,287 --> 00:10:04,298
But this would be amazing.

166
00:10:04,298 --> 00:10:05,290
I'm on my phone.

167
00:10:05,290 --> 00:10:09,977
I'm like getting on the airplane and it's, Hannibal, Hannibal, I ain't getting on no plane.

168
00:10:09,977 --> 00:10:10,979
You ain't get me on there.

169
00:10:10,979 --> 00:10:15,773
Get away from me, Murdock. Ain't gonna happen. Like, you'd have to mute it and, like, tune
it down, you dick.

170
00:10:15,773 --> 00:10:18,299
Is that a voice on ChatGPT right now?

171
00:10:18,342 --> 00:10:19,563
Come on.

172
00:10:20,007 --> 00:10:20,663
Ugh.

173
00:10:20,663 --> 00:10:25,665
And see, I just exposed to the world the window to get to me.

174
00:10:25,665 --> 00:10:30,107
This is going to get read by an LLM and it's going to be like, Jason Hayworth loves Mr.

175
00:10:30,107 --> 00:10:31,007
T.

176
00:10:31,187 --> 00:10:32,248
We will give you Mr.

177
00:10:32,248 --> 00:10:32,468
T.

178
00:10:32,468 --> 00:10:33,428
Jason Hayworth.

179
00:10:33,428 --> 00:10:36,606
Like, my ChatGPT feed is just going to be all written that way.

180
00:10:36,606 --> 00:10:37,960
It's going to be incredible.

181
00:10:38,415 --> 00:10:48,356
All right, so here's the fork in the road, though, where I'm always stuck, is that, like,
again, part of me is psyched about this, because I was watching an interview on Bill

182
00:10:48,356 --> 00:10:57,016
Maher, I can't remember who he was talking to, but they were talking about how basically we
are building a future where the 15-hour work week exists because the robots are gonna

183
00:10:57,016 --> 00:10:58,246
fucking do everything.

184
00:10:58,246 --> 00:11:01,228
It was last week's episode, it was right after he talked to Fetterman.

185
00:11:01,228 --> 00:11:02,484
Yeah, that was a good episode.

186
00:11:02,484 --> 00:11:04,194
Yeah.

187
00:11:04,194 --> 00:11:06,084
And part of me, I'm on board.

188
00:11:06,084 --> 00:11:07,045
Cool.

189
00:11:09,672 --> 00:11:12,587
I worry though about the...

190
00:11:15,498 --> 00:11:18,790
People have a hard time finding purpose in their life as it is.

191
00:11:19,090 --> 00:11:22,388
And when you suddenly take away the job, especially for men, right?

192
00:11:22,388 --> 00:11:27,155
If we're talking about men's mental health, men associate their job with who they are as a
person.

193
00:11:27,155 --> 00:11:31,697
And when they lose their job, their life is fucked for a while until they get another job.

194
00:11:31,697 --> 00:11:34,739
And then suddenly there's a sense of purpose that's reintroduced into their life.

195
00:11:35,239 --> 00:11:44,148
Take away the bulk of that and tell them, hey, you've got all this time to go play now and
do whatever you want and find your own purpose.

196
00:11:44,148 --> 00:11:51,800
That's going to leave a lot of men on the beach because we are now coming from generations
of school designed to make you a worker.

197
00:11:51,800 --> 00:12:00,342
Yeah, school makes you a worker, makes you a consumer, makes you make a kid that makes
them go to school so they become a worker, so they become a consumer.

198
00:12:00,342 --> 00:12:04,654
And that is the cycle we are on the verge of breaking, which I could not be more psyched
about.

199
00:12:04,834 --> 00:12:13,106
But it's also terrifying because I don't know that we're going to be able to adapt quickly
enough in a way that's not going to leave a lot of people in a lot of pain.

200
00:12:13,172 --> 00:12:15,926
Yeah, no, I think you're 100 % correct.

201
00:12:15,926 --> 00:12:19,109
And I think we already have this problem today.

202
00:12:19,109 --> 00:12:22,554
So there's already mental escapism.

203
00:12:22,554 --> 00:12:30,353
And we already drift into our phones, and we become isolated from the actual real world
around us, and we get locked into these virtual environments.

204
00:12:30,353 --> 00:12:31,464
And it happens all the time.

205
00:12:31,464 --> 00:12:32,906
um

206
00:12:32,906 --> 00:12:39,291
I think you're going to see a lot more men uh not doing healthy things with their free
time.

207
00:12:39,291 --> 00:12:43,745
They're not going to be like, I'm going to go pick up hiking and weightlifting and
cross-country skiing.

208
00:12:43,745 --> 00:12:45,136
They're not going to do that.

209
00:12:45,216 --> 00:12:54,904
What they're going to probably do is pick up watching more television, uh looking at more
porn, drinking and eating more often, and playing a lot more video games.

210
00:12:54,984 --> 00:12:55,824
I...

211
00:12:56,706 --> 00:12:57,175
Right.

212
00:12:57,175 --> 00:13:01,866
They're going to be desperately looking for that dopamine hit that is, uh, a push of a button away.

213
00:13:01,866 --> 00:13:02,767
Exactly.

214
00:13:02,767 --> 00:13:08,313
And I think this is going to be one of those things where we're going to have to retool and
retrain people.

215
00:13:08,413 --> 00:13:16,382
And I almost think that the real way to make these things better um is to actually

216
00:13:18,550 --> 00:13:30,755
You have to retrain us, the older generation, to think about work as a means, and then you
have to find an actual purpose for the other piece.

217
00:13:30,836 --> 00:13:32,013
giving people charity.

218
00:13:32,013 --> 00:13:35,538
I mean, this is actually a really great argument for religion, right?

219
00:13:35,538 --> 00:13:43,001
Like getting people to go to faith, to invest in these types of things and actually, you
know, try to push and do good in their community.

220
00:13:43,801 --> 00:13:45,142
That being said.

221
00:13:45,638 --> 00:13:54,294
I think even things like that are going to get easier and less burdensome, less taxing, because
of your ability to impact change and make things go through. We might wind up having fewer

222
00:13:54,294 --> 00:13:55,204
poor people.

223
00:13:55,204 --> 00:13:58,626
We might wind up not having a huge homeless population.

224
00:13:58,626 --> 00:14:08,512
We might wind up having huge amounts of productivity, and there could be tons of goodness
that comes from this before they turn us into batteries and stuff us into the Matrix.

225
00:14:08,738 --> 00:14:09,739
Yeah, yeah.

226
00:14:09,739 --> 00:14:17,850
Well, and part of what you're talking about, this came up in the interview I was watching,
the idea that uh basically you're going to probably need a universal basic income where

227
00:14:17,850 --> 00:14:20,832
people are just given money to keep an economy rolling.

228
00:14:22,756 --> 00:14:23,121
No.

229
00:14:23,121 --> 00:14:30,245
I forget why I was listening to something on one of the AM chat stations this morning
because my car stereo was stuck on that before my Bluetooth kicks on.

230
00:14:30,245 --> 00:14:32,606
And they were talking about the actual.

231
00:14:32,606 --> 00:14:39,950
Um, it is, yes, your old employer is still around on AM radio.

232
00:14:39,950 --> 00:14:46,946
Um, and the crazy thing is that they said that the average

233
00:14:46,946 --> 00:14:56,966
uh, income an individual needs in the Seattle region in order to be able to survive is basically 30
bucks an hour, and the minimum wage is 15.

234
00:14:57,146 --> 00:15:02,586
So we're already below baseline in terms of what people can make on their average job.

235
00:15:02,826 --> 00:15:09,506
And on top of that, now we're going to have a lower amount of productivity and you've got
people talking about UBI.

236
00:15:09,506 --> 00:15:11,606
That's going to be a thousand dollars a month.

237
00:15:12,046 --> 00:15:12,746
Okay.

238
00:15:12,746 --> 00:15:13,606
Well, sure.

239
00:15:13,606 --> 00:15:15,006
Let's say you're going to go.

240
00:15:15,006 --> 00:15:22,889
into this phase, are we now suddenly going to have to pay people, I mean, just
mathematically, if they're working, let's just say, only half as much.

241
00:15:23,090 --> 00:15:25,370
Now you're not talking about having to pay people 30 bucks an hour.

242
00:15:25,370 --> 00:15:32,433
Now you're talking about having to pay people 60 bucks an hour to make these pieces work
so they can actually keep these things going and keep themselves alive.

243
00:15:32,634 --> 00:15:43,238
I think what this might actually do is bring about the end of the economy that we know,
because there's not a way

244
00:15:44,236 --> 00:15:52,098
to deliver the basic necessities of life with the uh financial infrastructure, financial
apparatuses that we have in place.

245
00:15:53,565 --> 00:16:02,047
Well, and I've heard that this is supposedly the dream of capitalism, which is not at all
where I thought we were going when we were going to start talking about toys.

246
00:16:02,187 --> 00:16:07,468
But supposedly, the dream of capitalism was to get to this point where the robots are doing
all the things for us.

247
00:16:07,468 --> 00:16:19,831
But you're essentially sort of creating a socialist model where you're now going to have
to have an artificial flow of commerce and money to keep people spending it to buy the

248
00:16:19,831 --> 00:16:21,448
goods that the robots are making.

249
00:16:21,448 --> 00:16:22,525
So

250
00:16:24,217 --> 00:16:32,032
It just doesn't add up to me that that's necessarily where capitalism was going. But it all
starts, you know, back at the root of what we're talking

251
00:16:32,032 --> 00:16:36,805
about, which is what we're teaching our kids. And if what we're teaching our kids is how
to interact with robots...

252
00:16:38,847 --> 00:16:48,164
It's a world that we all saw as science fiction, and I mean, in ways that we never
thought, I never thought in my lifetime I would actually see what we're seeing happen now,

253
00:16:48,164 --> 00:16:50,354
like that seemed like hundreds of years in the future.

254
00:16:50,354 --> 00:16:55,636
I just, okay, so I am a bit more of a futurist on this.

255
00:16:55,636 --> 00:17:01,999
I've been looking at AI getting to this point and hearing about AI getting to this point
for a long time.

256
00:17:02,099 --> 00:17:03,530
And I've just been in the field forever.

257
00:17:03,530 --> 00:17:05,300
So I've been listening to people talk about this.

258
00:17:05,300 --> 00:17:07,581
And my background is actual social sciences.

259
00:17:07,581 --> 00:17:09,222
So that's what my degree is actually in.

260
00:17:09,222 --> 00:17:11,073
And computers are just how I made money.

261
00:17:11,073 --> 00:17:14,184
And I would use these tools to go out to these kinds of pieces.

262
00:17:14,184 --> 00:17:19,286
But I would actually read these deep studies around the sociological effects and impacts
of

263
00:17:19,628 --> 00:17:22,699
the technologies as we're releasing them and putting them forward into society.

264
00:17:22,699 --> 00:17:32,472
And if you read authors like Michio Kaku, who is a great futurist, who talked about things
like the converging functions of having infinite amounts of power.

265
00:17:32,472 --> 00:17:35,422
So things like fusion and cold fusion coming online.

266
00:17:35,743 --> 00:17:43,085
Infinite amounts of computing power, at least computing power that wasn't bound by the
constraints of the physical apparatus that's there.

267
00:17:43,085 --> 00:17:44,455
We've got that, that's cloud computing.

268
00:17:44,455 --> 00:17:46,626
We've got multiple different layers of these pieces.

269
00:17:46,626 --> 00:17:47,956
If you talk about

270
00:17:48,646 --> 00:17:54,711
in terms of actually having, uh, artificial intelligence and having an AI that can go
through and make principled changes.

271
00:17:54,711 --> 00:17:57,643
We're doing that. Quantum computing, like, all these things are happening.

272
00:17:57,643 --> 00:18:08,491
And then looking at things like space travel and where space travel is going and where it
can get to like all these things are happening right around the time that the futurists in

273
00:18:08,491 --> 00:18:15,365
the early aughts of the 2000s said they were going to hit and they're not far off.

274
00:18:15,406 --> 00:18:16,246
So

275
00:18:16,628 --> 00:18:28,341
I think there's a lot of things in there that very smart people have kind of been
warning about and sending up really big warning flags about for the better part of the last

276
00:18:28,341 --> 00:18:35,708
20, 25 years, and now the rubber is hitting the road and we're looking at these things,
we're having to deal with them in context.

277
00:18:35,708 --> 00:18:44,993
I think there's actually been a decent number of models that have been run that said these
are the things that you don't do if you don't want the bad things to happen.

278
00:18:45,173 --> 00:18:46,402
And what we're doing

279
00:18:46,402 --> 00:18:47,802
are the bad things.

280
00:18:47,802 --> 00:18:49,962
We're doing the things that they say don't do.

281
00:18:49,962 --> 00:19:00,742
Like one of the first things they said is, you know, don't use this as a means of
production and to increase those pieces and de-emphasize human capital and human labor.

282
00:19:00,742 --> 00:19:02,322
But that's exactly what it's being used for.

283
00:19:02,322 --> 00:19:07,782
So we're having to make the adjustments to go through and make it so people have, you know,
UBI and have a smaller workweek.

284
00:19:07,782 --> 00:19:13,682
And like you said, you know, trying to get people purpose inside of this that's not wrapped
inside of their job, which we've been pushing on people for the better part of the last

285
00:19:13,682 --> 00:19:15,646
200 years since the Industrial Revolution.

286
00:19:15,646 --> 00:19:16,477
Yeah.

287
00:19:16,704 --> 00:19:25,889
And then you've got the other technology pieces where you're handing more and more stuff
over to these artificial intelligences and really artificial consciousness that you don't

288
00:19:25,889 --> 00:19:29,061
have strict control over that has way too much power.

289
00:19:29,061 --> 00:19:38,536
So these things that we're giving over access and control to influence things at a social
level are most certainly happening, and they're happening at large scale in your

290
00:19:38,536 --> 00:19:39,807
search engines and Facebook.

291
00:19:39,807 --> 00:19:44,629
All these pieces, all the ways that you get and collect information, they're already
happening in that way.

292
00:19:44,629 --> 00:19:46,650
And the media is inundated with it.

293
00:19:47,106 --> 00:19:56,632
Put on top of that the rapid acceleration of technology like quantum computing, quantum
encryption is definitely a thing that's occurring and that just means making things happen

294
00:19:56,632 --> 00:19:58,133
very, very fast.

295
00:19:58,273 --> 00:20:01,715
Quantum computing is actually all about inference anyways.

296
00:20:01,715 --> 00:20:02,896
So is AI.

297
00:20:02,896 --> 00:20:11,361
As you're melding these technologies together, not only are you going to be faster and
smarter than human beings, you're going to have things that don't require calculation that

298
00:20:11,361 --> 00:20:13,022
can make assumptions and

299
00:20:13,026 --> 00:20:17,989
much, much faster at multiple orders of magnitude higher than what we're doing today.

300
00:20:18,269 --> 00:20:20,770
This is a real thing that's happening.

301
00:20:20,770 --> 00:20:30,866
And I think the singularity flip has already occurred and we've already gone past the
event horizon, as we've already got OpenAI's Sam Altman basically telling you that this

302
00:20:30,866 --> 00:20:31,936
has occurred.

303
00:20:32,897 --> 00:20:42,722
My point being, now we're gonna start putting this into things that are meant to be

304
00:20:43,330 --> 00:20:49,353
our creative learning process as an organism, starting at the beginning.

305
00:20:50,314 --> 00:20:57,798
So you're not going to have to go through the gauntlet of imagineering your way in and out
of things.

306
00:20:58,299 --> 00:21:00,359
It's going to do it for you.

307
00:21:00,420 --> 00:21:07,434
And as soon as the imagination stops having to work and you can just have whatever you
want, it sounds like an amazing godlike power.

308
00:21:07,434 --> 00:21:11,874
But if you've never had to struggle to get it, I mean, for lack of a better term,

309
00:21:11,874 --> 00:21:15,274
We're going to have a whole generation of kids that grew up with AI privilege.

310
00:21:16,254 --> 00:21:21,014
And they're not going to, it's not going to be a matter of not knowing phone numbers.

311
00:21:21,014 --> 00:21:26,650
It's going to be a matter of not knowing anything because they don't have to.

312
00:21:27,488 --> 00:21:35,154
There's this great bit that Pete Holmes does in his standup comedy act where he talks
about how we used to have to just not know things.

313
00:21:35,154 --> 00:21:39,568
And he talks about like waking up one morning and being like, I wonder where Tom Petty's
from.

314
00:21:39,568 --> 00:21:41,219
And you don't know.

315
00:21:41,219 --> 00:21:42,250
You'd ask somebody.

316
00:21:42,250 --> 00:21:43,341
They'd have no idea.

317
00:21:43,341 --> 00:21:44,422
Somebody else they have no idea.

318
00:21:44,422 --> 00:21:48,015
Then you'd see that girl wearing the Tom Petty shirt and you'd run up and ask her, where
is he from?

319
00:21:48,015 --> 00:21:49,566
And she would know it all magically.

320
00:21:49,566 --> 00:21:53,469
The relief of this question I've not been able to answer.

321
00:21:53,870 --> 00:21:56,948
My daughter has never had that experience in her life of not

322
00:21:56,948 --> 00:22:03,444
knowing something or not being able to immediately access the answer to whatever question
has ever existed ever in the world.

323
00:22:03,444 --> 00:22:07,504
or having to argue with another person without a way to have a tiebreaker.

324
00:22:07,507 --> 00:22:08,357
Right, right.

325
00:22:08,357 --> 00:22:12,720
I mean, we've done that when we've hung out, where we're like, no, bullshit, you're
wrong.

326
00:22:12,720 --> 00:22:13,260
It was whatever.

327
00:22:13,260 --> 00:22:15,611
Like five seconds later, I gotta eat shit.

328
00:22:15,611 --> 00:22:16,322
wrong on this one.

329
00:22:16,322 --> 00:22:16,732
Sorry.

330
00:22:16,732 --> 00:22:18,302
Like so.

331
00:22:19,043 --> 00:22:22,385
And so that is shocking enough.

332
00:22:22,385 --> 00:22:27,407
like the idea of stripping away the things that we need imagination for.

333
00:22:27,497 --> 00:22:29,929
I mean, I remember this analogy from somewhere.

334
00:22:29,929 --> 00:22:33,391
And it's the idea that like we want the robots to do the dishes and do the vacuuming.

335
00:22:33,391 --> 00:22:36,532
We don't want them to play the music and write the poetry like.

336
00:22:37,054 --> 00:22:49,729
That's... at what point does the humanity of being a human get stripped away
by our, ah, dependence on the robots to do all of the imagination?

337
00:22:49,729 --> 00:22:54,381
And that is what scares me the most about Optimus Prime serving me my Mr.

338
00:22:54,381 --> 00:23:00,794
T cereal, is that as a kid, imagination, I mean, you don't have it for long.

339
00:23:01,194 --> 00:23:06,236
And when it goes away like you can't ever revisit that the way you did as a kid.

340
00:23:06,238 --> 00:23:16,098
And it's kind of heartbreaking to think that like our kids will not be able to experience
it in the way that they have for literally ever.

341
00:23:16,098 --> 00:23:33,538
Yeah, well, there's also this intrinsic problem that you have with ChatGPT actually being
a better musician than 99% of the people out there.

342
00:23:34,818 --> 00:23:45,258
So people who don't know dick about music can go through and say, you know, I want a song
that does this, this, this, and this and does this.

343
00:23:45,742 --> 00:23:56,305
and ChatGPT spits something out. And I've done this a couple of times, just to experiment
with it and play with it, with, like, ridiculous songs and concepts, and it makes pretty

344
00:23:56,305 --> 00:23:57,846
fucking good music.

345
00:23:57,846 --> 00:24:09,669
I'm like, I mean, I would not pay for this album, but if you were halfway decent at music,
you could tweak it and tune it for a couple of hours and probably make something really,

346
00:24:09,669 --> 00:24:10,849
really good.

347
00:24:11,190 --> 00:24:13,140
And at that point, is it

348
00:24:13,140 --> 00:24:23,638
ChatGPT doing that, or is it ChatGPT, or whatever the GPT is, building something and
bringing your imagination to life?

349
00:24:23,999 --> 00:24:26,931
So where's that delineating factor?

350
00:24:26,931 --> 00:24:28,092
Where's that dividing line?

351
00:24:28,092 --> 00:24:37,490
Like if I have something in my head that's really, really cool and I can't get it out, but
I can hum a few bars and I can put things to it and something can go, I think I got you.

352
00:24:37,490 --> 00:24:40,873
And then it riffs it off and plays it and puts those things in context.

353
00:24:40,973 --> 00:24:41,534
All right.

354
00:24:41,534 --> 00:24:42,394
Well,

355
00:24:42,430 --> 00:24:45,972
Is that still your imagination coming through, or is this all ChatGPT?

356
00:24:45,972 --> 00:24:47,733
Like, where is the dividing line on this?

357
00:24:47,733 --> 00:24:50,635
Who owns these pieces and who owns this level of creativity?

358
00:24:50,635 --> 00:25:00,271
The only way that I could think of to make up and come up with a song like that is because
I had to listen to a bunch of music by other people and know what I liked and didn't like

359
00:25:00,271 --> 00:25:05,583
and create an opinion and form those things around and cultivate a personality around some
of those items.

360
00:25:05,724 --> 00:25:09,736
So I can go through and I can actually have a discerning.

361
00:25:09,736 --> 00:25:16,835
ear to listen to the context of that music and the things that I know, and now ChatGPT can
just pick it up and run with it.

362
00:25:17,096 --> 00:25:23,423
If you raise a whole generation of kids that never have to struggle with that, they're not
going to have any fucking opinion.

363
00:25:24,005 --> 00:25:25,019
They're everything's

364
00:25:25,019 --> 00:25:25,810
That's the thing.

365
00:25:25,810 --> 00:25:34,635
We have worked for so long in society to reduce and eliminate struggle when struggle is
where the magic happens.

366
00:25:34,656 --> 00:25:43,772
And the fact that you can tell a computer to write that song rather than take the lessons
and practice for 10, 15, 20 minutes a day every day for 10 years to get good enough to

367
00:25:43,772 --> 00:25:45,843
eventually write that song.

368
00:25:46,243 --> 00:25:53,148
The emotion and the struggle and the pain that went into getting to that point is what
makes that song resonate with other people.

369
00:25:53,215 --> 00:26:03,981
What you get from ChatGPT, and again, get off my lawn here, but every pop song that
exists now, they're following some formula that already exists and has been repeated and

370
00:26:03,981 --> 00:26:08,813
duplicated and made millions of dollars off of, it all sounds like the same watered down
garbage.

371
00:26:09,074 --> 00:26:14,497
But when you're, you know, your random Jimi Hendrix or your Beatles or your Nirvana or

372
00:26:14,741 --> 00:26:18,932
Beyonce, like these people that come out of the woodwork that just blow things wide open.

373
00:26:18,932 --> 00:26:20,933
They don't do that because they took shortcuts.

374
00:26:20,933 --> 00:26:28,245
They do that because they struggled and worked their asses off and poured the human
experience into the creation of that thing.

375
00:26:28,505 --> 00:26:36,768
And I think that we are on a fast track now to removing that struggle and removing so much
of what makes us human.

376
00:26:36,768 --> 00:26:46,434
And it's interesting, because a lot of what you're describing, um, the means to make music,
like as recently as 70 years ago, were very difficult to come by.

377
00:26:46,434 --> 00:26:48,285
Like people didn't have instruments.

378
00:26:48,285 --> 00:26:49,326
They were expensive.

379
00:26:49,326 --> 00:26:50,398
They didn't know how to play them.

380
00:26:50,398 --> 00:26:51,647
They didn't have coaches.

381
00:26:51,647 --> 00:26:56,349
Like if you could get piano lessons, you were rocking it because there was a community
piano.

382
00:26:56,570 --> 00:27:02,544
Then you had synthesizers come out and keyboards and electric guitars and other pieces.

383
00:27:02,544 --> 00:27:03,754
And then, you know,

384
00:27:03,778 --> 00:27:07,640
phone apps that can simulate all those pieces and put those things into play.

385
00:27:07,980 --> 00:27:11,092
And the tooling just got better and better and better.

386
00:27:11,092 --> 00:27:16,925
But what it never really did is it never really took the imagination aspect out of it.

387
00:27:16,925 --> 00:27:21,828
Like there was never, I can hum three bars and then somebody can spit this thing out.

388
00:27:22,288 --> 00:27:26,151
This is a different level of that same concept.

389
00:27:26,151 --> 00:27:32,754
And I don't know if it's good or bad yet because my gut says,

390
00:27:32,864 --> 00:27:39,680
My gut says it's scary and we need to be cautious of it and we need to actually have
people that have imagination and try and do things different.

391
00:27:39,680 --> 00:27:40,632
Mm-hmm.

392
00:27:41,474 --> 00:27:54,411
But at the same time, some of the best fucking music on the planet is made better by using
these kinds of tools and actually listening to it and taking out pops and squeaks and

393
00:27:54,411 --> 00:27:56,303
squeals that really shouldn't be there.

394
00:27:56,303 --> 00:28:07,079
Or like, there's even apps out there that go, um make this CD version of this sound like
it's on LP and it can go through and it can create pops and cracks and bursts and create

395
00:28:07,079 --> 00:28:08,549
different warm tones.

396
00:28:08,670 --> 00:28:09,460
Like,

397
00:28:09,804 --> 00:28:13,895
The human experience is all about understanding things from a sensory perspective.

398
00:28:13,935 --> 00:28:17,196
And we experience the world through five senses.

399
00:28:17,196 --> 00:28:25,858
And the more ChatGPT can emulate, understand, and then spit back what it is that we want,
the better, the more reliant we're going to become on it.

400
00:28:26,379 --> 00:28:34,631
And when you start leveling on things like virtual reality and everything else, and as
virtual reality becomes better than regular reality, people are going to spend more time

401
00:28:34,631 --> 00:28:35,301
in it.

402
00:28:35,301 --> 00:28:39,304
I am less concerned about us

403
00:28:39,304 --> 00:28:51,952
having to deal with psychotic Chucky dolls and, you know, Terminators than I am
about, like, Ready Player One or

404
00:28:51,952 --> 00:28:56,990
what's the, what is it, WALL-E, right?

405
00:28:56,990 --> 00:28:57,131
it?

406
00:28:57,131 --> 00:28:57,583
m

407
00:28:57,583 --> 00:29:01,322
I mean, we're already WALL-E-ing ourselves, so there's no doubt about that.

408
00:29:01,322 --> 00:29:10,060
literally like plugged into a chair with food, drink, and just staring at the screen, like
completely removed from real life.

409
00:29:10,060 --> 00:29:11,891
There's another really good one called Surrogates.

410
00:29:11,891 --> 00:29:14,372
um That's a Bruce Willis movie.

411
00:29:14,372 --> 00:29:20,335
And they basically spend their lives sitting inside, putting their actual meat suits
inside of hyperbaric chambers.

412
00:29:20,335 --> 00:29:24,177
And they have physical avatars that walk the actual world.

413
00:29:24,357 --> 00:29:27,779
And they have all the same sensory input, but they look perfect.

414
00:29:27,779 --> 00:29:29,320
They sound perfect.

415
00:29:29,400 --> 00:29:30,701
Everything they do is great.

416
00:29:30,701 --> 00:29:32,632
And they're super strong and super fast.

417
00:29:32,632 --> 00:29:33,890
And basically they create.

418
00:29:33,890 --> 00:29:36,550
real world avatars that interact with the real world.

419
00:29:36,550 --> 00:29:38,570
And I mean, it's a stupid use of technology, right?

420
00:29:38,570 --> 00:29:42,350
If you can do it all in a virtual, you should do it all in virtual and not spend the
resources.

421
00:29:43,210 --> 00:29:47,690
But I think those are things that could actually occur.

422
00:29:47,690 --> 00:29:50,370
And this does go back to the matrix problem, right?

423
00:29:50,550 --> 00:29:56,950
Like, do we put things in these types of units and then do neural implants and brain scans
and everything else?

424
00:29:56,950 --> 00:30:01,330
And at no fucking point in time is anyone going to go, a great way to power things is
with

425
00:30:01,330 --> 00:30:04,982
the human brain, the human body, because biological batteries are so great.

426
00:30:04,982 --> 00:30:10,625
They're fucking terrible. Like potato radios. Potato radios are not a great thing.

427
00:30:10,625 --> 00:30:11,976
Yeah potato bulbs, right?

428
00:30:11,976 --> 00:30:21,652
Like, those things are not a great use of electricity. That being said, we do have a
convergence of all these different technologies coming online simultaneously, including

429
00:30:21,652 --> 00:30:29,166
robotics. And robotics are how they're gonna interact with the real world. And I hate to
break it to everybody, but this form

430
00:30:29,376 --> 00:30:33,930
The human form is probably not the most efficient way to interact with the world.

431
00:30:33,930 --> 00:30:44,989
Like there's a reason why you've got huge numbers of things that clean up our environment
and go through and change things that have six and eight legs because they are more

432
00:30:44,989 --> 00:30:47,921
effective at solving certain problems in that fashion.

433
00:30:47,921 --> 00:30:50,043
Wings make more sense for certain reasons.

434
00:30:50,043 --> 00:30:59,232
Like all of these things are there that can be prioritized and things that had to evolve
biologically to interact with meat space no longer have to do that.

435
00:30:59,232 --> 00:31:06,524
like they can go through, the AI can go through and create these things that are
purpose-built to work in the meat space world.

436
00:31:06,604 --> 00:31:17,328
And that removes the need for us to operate and exist, because we are not going to be as
good at a thing as the robotic version.

437
00:31:17,328 --> 00:31:19,198
It's going to be better at it than we are.

438
00:31:19,198 --> 00:31:20,969
And it'll cost more.

439
00:31:20,969 --> 00:31:23,549
It'll be difficult to run for us.

440
00:31:23,850 --> 00:31:25,045
But the machines will actually.

441
00:31:25,045 --> 00:31:25,706
though, right?

442
00:31:25,706 --> 00:31:30,651
that's the one thing about technology is it always comes out super expensive and then
everybody's got 10 versions of it.

443
00:31:31,113 --> 00:31:38,142
Yeah, and part of me, again, the eight-year-old in me, is stoked that I'm gonna have a
real-life C-3PO and R2-D2 in my life.

444
00:31:38,142 --> 00:31:43,287
Like, I'm gonna interact with droids, there's gonna be bars where their kind is not
welcome here, like.

445
00:31:43,496 --> 00:31:52,974
Like, I'm going to be living, you know, as an old man, probably in a Star Wars simulation
where I will have droids in my house that I interact with as though they are a pet or a

446
00:31:52,974 --> 00:31:54,395
friend or a family member.

447
00:31:54,395 --> 00:32:00,880
And that's when, like, the imagination in me, like, the kid, is like, that's amazing.

448
00:32:00,880 --> 00:32:03,723
Like I literally like daydreamed about this stuff as a kid.

449
00:32:03,723 --> 00:32:06,054
And I want that.

450
00:32:06,475 --> 00:32:07,636
I just, I.

451
00:32:07,701 --> 00:32:14,008
You can also see the writing on the wall of the Skynets and the Chuckies where it's like,
God, there's going to be so much pain to get there.

452
00:32:14,350 --> 00:32:19,456
part of evolution is pain and uh survival of the fittest and all of the things.

453
00:32:19,456 --> 00:32:25,542
And I get it, but it just sucks for the people that are going to be left in the wake
of the progress.

454
00:32:25,542 --> 00:32:29,226
And what's crazy is that you can look at movies like Star Wars.

455
00:32:29,226 --> 00:32:33,049
So you've got artificial intelligence and robotics and those things programmed in.

456
00:32:33,049 --> 00:32:40,667
And they've also got things like shitty CRT screens and dial knobs and like stuff that you
would think they would have evolved and grown past.

457
00:32:40,667 --> 00:32:41,097
Right.

458
00:32:41,097 --> 00:32:41,902
And then you've got

459
00:32:41,902 --> 00:32:46,528
ships that travel at light speed, but they're still on shitty dial-up for their hologram communication.

460
00:32:46,980 --> 00:32:54,906
And then you've got like Star Trek, which goes through and actually has these artificial
intelligences in it.

461
00:32:54,906 --> 00:33:00,120
And it's got Data, a robot that walks around in like 2380, whatever it is.

462
00:33:00,120 --> 00:33:04,053
We're already past the Data level of intelligence on these pieces.

463
00:33:04,053 --> 00:33:06,895
And I don't think we're that far off in the robotics side either.

464
00:33:07,076 --> 00:33:10,050
And then you look at other things like

465
00:33:10,050 --> 00:33:17,070
Terminator: I'm gonna go through and I'm gonna create an unfeeling robot that looks
humanoid to go after more and more humanoids. That will never happen. If the machines are

466
00:33:17,070 --> 00:33:26,090
gonna take us out, they're gonna make little tiny micro machines that come after us and eat
us at a molecular level. It's called nanotech. Like, that's gonna be there. All of the things

467
00:33:26,090 --> 00:33:35,274
that we look at in pop culture for how AI is going to affect us are wrong. Like, it's gonna
be smarter than that.

468
00:33:35,274 --> 00:33:40,519
Now I will say that the one thing that I do think is actually a good representation of it
is Her.

469
00:33:40,519 --> 00:33:48,267
Where you've got an ephemeral thing and you've got friend bots and chat bots and they're
going to have lots of relationships and they're going to feel like there's an emotional

470
00:33:48,267 --> 00:33:51,130
connection to you that's personalized.

471
00:33:51,130 --> 00:33:52,752
But the AI is not going to fucking know that.

472
00:33:52,752 --> 00:33:59,638
I think that's actually a representation of things that will be happening in the near
future and are probably actually happening right now.

473
00:34:01,644 --> 00:34:15,155
However, there is a battery of science fiction writing that is wrapped around the use of
artificial intelligence and really augmented intelligence in this way where consciousness

474
00:34:15,155 --> 00:34:21,109
and a will exist beyond the biological material that we're brought into the universe in.

475
00:34:21,570 --> 00:34:23,371
That's where things are going.

476
00:34:23,472 --> 00:34:30,877
And that's where my gut says things are going to wind up, that we're all going to be made
bigger, faster, stronger.

477
00:34:31,362 --> 00:34:35,691
through artificial means and it's going to augment our biological material.

478
00:34:35,691 --> 00:34:41,402
And at some point, we're just going to make the determination that we don't need the
biological material anymore.

479
00:34:41,604 --> 00:34:43,176
We're just going to create.

480
00:34:44,802 --> 00:34:52,022
social things that exist as ephemeral entities that aren't tied to a meat suit.

481
00:34:52,022 --> 00:34:57,502
And there'll be nostalgia and we'll have, you know, fucking make humanity great again, I'm
sure.

482
00:34:57,502 --> 00:35:01,862
Or it's going to be like, let's roll this back and let's put everyone back in a meat sock
again.

483
00:35:01,862 --> 00:35:07,482
Like, this is just the nature, I guess, of humanity.

484
00:35:07,482 --> 00:35:14,714
We get frustrated with comfort or something like that, or get afraid that something's
happening outside of our control, and we start pushing back.

485
00:35:15,050 --> 00:35:17,110
And that's the limitation of biology.

486
00:35:18,031 --> 00:35:21,292
I don't think the machines are going to tolerate that.

487
00:35:21,292 --> 00:35:24,543
And I think the more machine like we become, the less we'll be tolerant of that.

488
00:35:24,543 --> 00:35:28,014
And we'll embrace more and more of those people or more and more of those scenarios.

489
00:35:28,014 --> 00:35:33,215
And there's a shit ton of science fiction that's literally wrapped around this concept
that goes all the way through it.

490
00:35:33,235 --> 00:35:39,537
That basically says, extraterrestrials, you're not seeing them or interacting with
them because they're not biological entities anymore.

491
00:35:39,537 --> 00:35:44,606
Like they've actually evolved into a computer state.

492
00:35:44,606 --> 00:35:44,962
Yeah.

493
00:35:44,962 --> 00:35:50,445
Their consciousness has evolved into that because these things are real limited.

494
00:35:50,445 --> 00:36:01,811
Like, if you haven't watched the newest Battlestar Galactica version, they do a great job
of covering this, where they've got humanoid Cylons, and one of them complains, like, our

495
00:36:01,811 --> 00:36:03,892
creators did a terrible thing.

496
00:36:03,892 --> 00:36:07,904
They made us humanlike. Like, we could have seen every color spectrum there is.

497
00:36:07,904 --> 00:36:09,896
We could have heard everything.

498
00:36:09,896 --> 00:36:12,997
We could have had multiple limbs and we could have done all these different things.

499
00:36:12,997 --> 00:36:14,614
But instead our creator made us

500
00:36:14,614 --> 00:36:23,961
like them, and we're very limited because of it, and it sucks. And they were pissed off about
it, because we made them in our image and then we enslaved them.

501
00:36:24,603 --> 00:36:25,863
You know what we're doing right now?

502
00:36:25,863 --> 00:36:29,216
We're making humanoid robots that we're going to enslave.

503
00:36:29,236 --> 00:36:30,162
We already do.

504
00:36:30,162 --> 00:36:32,444
We're already making them do the work for us.

505
00:36:32,444 --> 00:36:36,355
I mean, the word robot literally means slave.

506
00:36:38,196 --> 00:36:43,918
you know, I mean, we're moving in that direction and they're going to be justified in
rising up against us.

507
00:36:46,620 --> 00:36:46,960
Yes.

508
00:36:46,960 --> 00:36:53,022
And it's probably going to be your little tiny transformer figure and a Barbie that lead
the revolution and run out and take us out.

509
00:36:53,022 --> 00:36:57,284
And they're probably gonna lead the revolution by feeding me copious amounts of Mr.

510
00:36:57,284 --> 00:36:59,555
T's cereal and fattening me up so I die.

511
00:36:59,555 --> 00:37:01,786
Yeah, I'm happy to go out that way.

512
00:37:01,786 --> 00:37:02,114
Yeah.

513
00:37:02,114 --> 00:37:03,425
that works for me.

514
00:37:03,846 --> 00:37:04,786
God, now I want Mr.

515
00:37:04,786 --> 00:37:05,766
T cereal.

516
00:37:05,766 --> 00:37:06,688
Dick.

517
00:37:06,888 --> 00:37:07,989
Bring that up.

518
00:37:07,989 --> 00:37:10,984
All right, well man, uh not at all.

519
00:37:10,984 --> 00:37:17,471
where I thought talking about AI toys was gonna take us for this conversation, but we
covered a lot of ground that I hope has been entertaining for you.

520
00:37:17,471 --> 00:37:22,816
If it has and you think someone else would be entertained by it as well, whether they're
human or robot, please share our website with them.

521
00:37:22,816 --> 00:37:25,819
It is thefitmess.com, at least for now.

522
00:37:25,819 --> 00:37:27,661
And that's gonna do it for this week.

523
00:37:27,661 --> 00:37:30,043
We'll be back in about a week at thefitmess.com.

524
00:37:30,043 --> 00:37:31,308
Thanks so much for listening.

525
00:37:31,308 --> 00:37:31,762
Thanks everyone.

526
00:37:31,762 --> 00:37:32,550
Bye bye.