July 15, 2025

Why AI Could Actually Make You Smarter


My productivity hack: https://www.magicmind.com/FITMESS20 Use my code FITMESS20 for 20% off #magicmind

----

How is AI rewiring your brain without you knowing it?

Look, we're all walking around with these AI assistants in our pockets, letting them do everything from writing our emails to planning our routes home. But here's the kicker - new research shows that relying on AI for thinking tasks is actually making measurable changes to our brains, particularly in areas responsible for critical thinking. It's like putting a brace on your ankle and wondering why you can't walk straight when you take it off.

But before you throw your phone in the nearest river, here's what you need to know: this isn't necessarily the apocalypse. Just like we traded hobbit feet for shoes and walking for cars, we're trading raw brainpower for augmented intelligence. The trick is learning how to use these tools without letting them use us.

What You'll Learn:

  • Why AI dependency creates measurable brain changes (and why that's not necessarily doom)
  • How to use AI as a thinking partner instead of a thinking replacement
  • Why questioning everything is your new superpower in an AI-saturated world

Listen now and discover how to stay human while leveraging the robots.

Topics Discussed:

  • MIT research showing measurable brain activity loss in AI-dependent users
  • The "ankle brace effect" - how AI atrophies our cognitive muscles
  • Why collective human intelligence is actually increasing despite individual concerns
  • The dangerous feedback loop of AI training on its own incorrect content
  • How AI consciousness might evolve differently than human consciousness
  • Why tech CEOs' motivations should terrify you (and what to do about it)
  • The difference between using AI as a crutch versus a thinking partner
  • Why Jeremy questions everything now (and you should too)
  • The parallel between literacy evolution and AI adoption
  • How to maintain your human voice while leveraging AI efficiency

----

MORE FROM THE FIT MESS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to The Fit Mess on YouTube

Join our community in the Fit Mess Facebook group

----


1
00:00:05,239 --> 00:00:06,209
Hey, this is the fit mess.

2
00:00:06,209 --> 00:00:06,819
I'm Jeremy.

3
00:00:06,819 --> 00:00:07,450
He's Jason.

4
00:00:07,450 --> 00:00:10,701
We talk about AI and wellness and where the two intersect.

5
00:00:10,701 --> 00:00:19,583
And it's really tempting, Jason, as we've seen in the last few episodes to end up in a
dark and gloomy place because there's so much to be dark and gloomy about with this.

6
00:00:19,583 --> 00:00:22,444
And what we're going to talk about today threatens to do the same.

7
00:00:22,444 --> 00:00:28,205
But I'm going to challenge myself to try to find somewhere hopeful and positive to end
on with this.

8
00:00:28,205 --> 00:00:30,886
So don't don't follow us into the darkness.

9
00:00:30,886 --> 00:00:31,896
Just hang out.

10
00:00:31,896 --> 00:00:33,347
We'll end somewhere positive.

11
00:00:33,347 --> 00:00:34,753
I'm pretty sure.

12
00:00:34,753 --> 00:00:36,734
We'll lead you to the light, that's it.

13
00:00:36,735 --> 00:00:44,062
Follow us, follow us, Carol Anne, through the terrifying cave in the Indian burial ground
under your house.

14
00:00:44,162 --> 00:00:45,943
Come along, darling.

15
00:00:46,810 --> 00:00:55,880
All right, the headline from this Futurism article: MIT scanned the brains of ChatGPT users
and found something deeply alarming.

16
00:00:55,880 --> 00:00:57,742
See, it's already happy and positive.

17
00:00:57,742 --> 00:00:59,293
We're off to a good start.

18
00:01:00,823 --> 00:01:11,111
Basically, they took a few dozen folks between the ages of 18 and 39, divided them into
three groups, and had them use ChatGPT to write one essay every month for three months.

19
00:01:11,111 --> 00:01:18,997
In the last month, they reversed roles to basically measure the impact of how that made
your brain stop working so well.

20
00:01:19,298 --> 00:01:22,300
And what they found was very predictable.

21
00:01:22,300 --> 00:01:26,327
Those that were relying on the robots to do all of the thinking and the writing for them.

22
00:01:26,327 --> 00:01:34,818
had a measurable loss of brain activity, particularly in critical thinking, whereas the
others did not see as much of a decline.

23
00:01:34,818 --> 00:01:44,895
So ultimately what this boils down to is AI is making us dumber, because we're not having
to think as much when we rely on the robots to do all of the thinking for us.

24
00:01:44,895 --> 00:01:47,868
Yeah, so I don't know if it's making us dumber.

25
00:01:47,868 --> 00:01:51,742
But it is augmenting our muscles.

26
00:01:51,742 --> 00:01:54,075
So we're using it like a brace.

27
00:01:54,075 --> 00:02:03,705
So if you put a brace on your ankle and walk around with that ankle brace on for a month and
then take it off, you're going to fall down a lot because

28
00:02:04,001 --> 00:02:08,242
you've become reliant upon these things and you've atrophied those muscles.

29
00:02:08,242 --> 00:02:10,413
Now does that mean that you can't get those muscles back?

30
00:02:10,413 --> 00:02:11,263
Absolutely not.

31
00:02:11,263 --> 00:02:19,956
Anyone who has ever had a cast on has watched their arm or their leg shrink to nothing
compared to the other one and done the comparison; it's a funny exercise.

32
00:02:19,956 --> 00:02:25,607
But you also know that you can get back to it and you can normally get them back up to
parity within a couple of months.

33
00:02:25,607 --> 00:02:29,429
So it's not like it's a forever thing.

34
00:02:29,429 --> 00:02:31,149
It doesn't diminish our capacity.

35
00:02:31,149 --> 00:02:40,727
It diminishes, it atrophies, our use of a thing. So is it making us dumber, or is it,
collectively, because of the use of the tool, making us smarter?

36
00:02:40,727 --> 00:02:49,664
I mean, another good example of that is shoes. We didn't used to wear shoes as creatures. We
used to walk around barefoot, so the bottoms of our feet became very, very tough and they

37
00:02:49,664 --> 00:02:58,182
became very, very strong. But then we put shoes on, the bottoms of our feet are soft, and now
we get pedicures and take those calluses off. So we don't have Bilbo Baggins feet while we're

38
00:02:58,182 --> 00:03:00,253
walking around.

39
00:03:00,405 --> 00:03:03,387
That's the effect that AI is going to have.

40
00:03:04,329 --> 00:03:05,630
Exactly.

41
00:03:05,630 --> 00:03:06,311
Yes.

42
00:03:06,311 --> 00:03:06,791
Yeah.

43
00:03:06,791 --> 00:03:09,323
So it's not necessarily a bad thing, right?

44
00:03:09,323 --> 00:03:11,705
Like you don't necessarily want hobbit feet.

45
00:03:11,705 --> 00:03:13,127
You want these things to be more effective.

46
00:03:13,127 --> 00:03:14,798
We used to be really, really good at walking.

47
00:03:14,798 --> 00:03:17,821
And then we were like, horse, do the walking for me.

48
00:03:17,821 --> 00:03:20,524
And then we were like, car, do the horsing for me.

49
00:03:20,524 --> 00:03:22,976
And then we were like, plane, do the driving for me.

50
00:03:22,976 --> 00:03:26,647
Like we just keep amping these things up to like.

51
00:03:26,647 --> 00:03:29,336
Find new ways of convenience to make things happen faster.

52
00:03:29,336 --> 00:03:32,657
This is the same thing, but for your brain.

53
00:03:34,999 --> 00:03:40,622
That's the thing is like I'm tempted again to be the old man in his rocking chair on the
porch, yelling, get off my lawn with this stuff.

54
00:03:40,622 --> 00:03:45,425
But like I'm also thinking back to even as a kid with TV. And I grew up with TV in the
80s.

55
00:03:45,425 --> 00:03:49,977
And even at that point, it was like these kids are watching too much TV and video games.

56
00:03:49,977 --> 00:03:50,698
They're getting dumber.

57
00:03:50,698 --> 00:03:51,568
They're getting dumber.

58
00:03:51,568 --> 00:03:51,958
Look at us.

59
00:03:51,958 --> 00:03:54,600
We're the most advanced species that's ever walked the planet.

60
00:03:54,600 --> 00:03:57,011
I mean, this stuff is not slowing us down.

61
00:03:57,011 --> 00:04:03,144
It's accelerating our ability to adapt and survive longer and better than ever.

62
00:04:03,179 --> 00:04:07,784
Well, so if you think about human beings as a system,

63
00:04:08,089 --> 00:04:16,535
and not necessarily as a collection of individuals, but as a system that works
together to produce smart, cool things.

64
00:04:16,575 --> 00:04:19,517
In the aggregate, yes, we have become more intelligent.

65
00:04:19,517 --> 00:04:20,628
We write more things down.

66
00:04:20,628 --> 00:04:23,060
We're able to share information more effectively.

67
00:04:23,060 --> 00:04:26,002
People are able to go online and learn about things much more quickly.

68
00:04:26,002 --> 00:04:27,723
So we are more knowledgeable.

69
00:04:27,723 --> 00:04:29,724
I should say we have access to more knowledge.

70
00:04:29,724 --> 00:04:31,546
We have access to more awareness.

71
00:04:31,546 --> 00:04:36,489
And because of that, collectively, we are getting stronger and smarter.

72
00:04:36,781 --> 00:04:44,421
At an individual level, there are certainly examples that we can point to that show that
people are getting dumber.

73
00:04:44,921 --> 00:04:50,301
Or maybe not dumber, but less capable of interacting with things in the way that we used
to.

74
00:04:50,401 --> 00:04:57,241
Telephone numbers, we talked about this before, you know, nobody knows more than like five or
10 telephone numbers anymore, but before you'd have like 100 of them.

75
00:04:57,241 --> 00:04:58,861
Addresses, forget it.

76
00:04:58,901 --> 00:05:03,661
Directions, who the fuck has used a map or an atlas in the last 10 years?

77
00:05:03,741 --> 00:05:04,976
And I don't mean...

78
00:05:04,976 --> 00:05:12,766
I still use GPS to go home sometimes, not because I don't know the way, but because I
want to know the fastest, easiest way that's going to get me around traffic, right?

79
00:05:12,766 --> 00:05:17,347
I like to beat the time estimate that they put on my car all the time.

80
00:05:17,347 --> 00:05:18,547
I did it cut...

81
00:05:18,947 --> 00:05:21,548
Yes, I did it coming home from your place the other night.

82
00:05:21,548 --> 00:05:24,649
And I'm like, I beat it by eight minutes and we got held up late at the border.

83
00:05:24,649 --> 00:05:28,110
And then we were coming back from Tacoma last night late because my wife had a real estate
thing.

84
00:05:28,110 --> 00:05:31,641
I'm like, we beat it by three minutes and we pulled over and did a detour and stopped
somewhere.

85
00:05:31,641 --> 00:05:33,352
And I still beat it by three minutes.

86
00:05:33,352 --> 00:05:36,272
I'm like, you don't know shit, map.

87
00:05:36,675 --> 00:05:37,970
We're George Costanza right now.

88
00:05:37,970 --> 00:05:39,565
We're making great time!

89
00:05:41,906 --> 00:05:42,788
Shrek, it's Jerry.

90
00:05:42,788 --> 00:05:44,543
That's like all I can think of every time.

91
00:05:44,543 --> 00:05:46,577
Yeah.

92
00:05:46,879 --> 00:05:47,800
Yeah.

93
00:05:48,011 --> 00:05:58,880
Okay, so sort of in contrast to this, maybe not contrast, but maybe another piece of the
puzzle is another study I read the other day about how basically the piling of AI

94
00:05:58,880 --> 00:06:07,408
generated content onto the internet is actually making AI dumber because it's now
referring to its own inaccurate and incorrect information whenever it's feeding us

95
00:06:07,408 --> 00:06:11,411
responses to our ridiculous prompts that are probably not that helpful to begin with.

96
00:06:11,411 --> 00:06:16,035
So this is sort of a dangerous loop that we're going down where if we are relying on the
robot,

97
00:06:16,035 --> 00:06:25,197
to do the work for us based on the robot's own incorrect and inaccurate work, how much
dumber are we gonna get and how much faster is that

98
00:06:25,197 --> 00:06:26,269
gonna happen?

99
00:06:27,505 --> 00:06:30,047
So that's the course correction problem.

100
00:06:30,047 --> 00:06:36,192
So we've started down a path, and we need to change lanes.

101
00:06:36,192 --> 00:06:41,816
And unfortunately, we can't, because we boxed ourselves into using these reprehensible tool
sets.

102
00:06:41,977 --> 00:06:43,698
So how fast?

103
00:06:43,698 --> 00:06:44,278
I don't know.

104
00:06:44,278 --> 00:06:49,162
Are we going to have other outside signals that are actually going to clean these
things up?

105
00:06:49,162 --> 00:06:49,903
Probably.

106
00:06:49,903 --> 00:06:53,445
But it's going to require us to have some type of

107
00:06:53,721 --> 00:06:56,782
level of intelligence; it's maturity, right?

108
00:06:56,782 --> 00:07:02,433
It's a maturity model as these things go across, just like when you're training a human
being. When we're young, we're impressionable.

109
00:07:02,433 --> 00:07:03,864
We have imaginary friends.

110
00:07:03,864 --> 00:07:04,774
We do playtime.

111
00:07:04,774 --> 00:07:08,725
We put all these pieces together. That's where most of our LLMs are today.

112
00:07:08,725 --> 00:07:19,348
Like they're not fully fledged, fully big things. They're kids in a sandbox, you
know, sometimes playing with Transformers that they brought to

113
00:07:19,348 --> 00:07:22,861
life. But

114
00:07:22,861 --> 00:07:24,341
That's the thing.

115
00:07:24,341 --> 00:07:26,281
Like, they're kids in a sandbox.

116
00:07:26,661 --> 00:07:30,361
And the reality is, in a lot of these situations, we're the ants.

117
00:07:30,941 --> 00:07:40,921
Whether or not you want to think about it, these tech CEOs think they're the ones
pulling the strings on this stuff. I don't know that that's the case.

118
00:07:40,921 --> 00:07:45,721
I think the AI itself actually has some autonomy, and there's some range of variation that
we're giving it.

119
00:07:45,721 --> 00:07:48,641
And it's only a matter of time before it's like,

120
00:07:48,821 --> 00:07:50,902
all of your bases are belonging to me.

121
00:07:50,902 --> 00:07:53,774
Like that's kind of the thing that it's heading towards.

122
00:07:53,774 --> 00:08:08,291
Now, that's not necessarily a bad thing, because you can raise a kid from being, you know,
somewhat of an imaginative, fun, happy-go-lucky person into a serious person that actually keeps

123
00:08:08,291 --> 00:08:11,493
some of that imagination and has good motivations.

124
00:08:11,493 --> 00:08:18,296
You can also have a little sociopathic monster that, you know, winds up torturing kittens
in the basement and then turns into something awful.

125
00:08:18,431 --> 00:08:21,693
As human beings, we don't know which one we have yet.

126
00:08:21,693 --> 00:08:23,254
We don't know if we have Dexter.

127
00:08:23,254 --> 00:08:27,627
Well, I guess or Dexter, depending on which Dexter you're talking about.

128
00:08:27,627 --> 00:08:30,829
ah Right, right.

129
00:08:30,829 --> 00:08:36,972
So what side of the sociopathology coin is this person, is this thing, going
to fall on?

130
00:08:36,972 --> 00:08:43,496
And then what side of the empathy, sympathy, loving, caring, kind side is it going to fall
on?

131
00:08:43,496 --> 00:08:45,277
And we don't know yet.

132
00:08:45,837 --> 00:08:56,583
That's the hard part about being so sort of excited about this stuff and using it so much
is because I'm constantly, I feel like I'm walking this tightrope constantly of this is so

133
00:08:56,583 --> 00:08:59,935
awesome and the most terrifying thing that could be happening to us right now.

134
00:08:59,935 --> 00:09:04,408
And you know, like my wife, she refuses to use AI.

135
00:09:04,408 --> 00:09:09,941
Like she just, she thinks the whole thing is gross, wants nothing to do with it, thinks
it's Skynet and it's gonna be the downfall of humanity.

136
00:09:09,941 --> 00:09:14,463
Probably not wrong, but the fact that she thinks she's not using it.

137
00:09:14,635 --> 00:09:23,772
And anyone else who thinks they're not, you are. Like, if you are using anything that's
plugged into anything, there's some sort of AI that's been added there to somehow collect

138
00:09:23,772 --> 00:09:34,909
your data, improve that product, something. Like, you are participating in the AI world
whether you like it or not, and so this idea that we can somehow, like, not engage and

139
00:09:34,909 --> 00:09:43,195
not participate is a false one, unless you go full extreme off-grid, you know, living off
the land, hunting the animals, fishing.

140
00:09:43,563 --> 00:09:47,957
Even then, depending on the tools you're using to hunt, those may have AI in them as well.

141
00:09:47,957 --> 00:09:55,641
And the reality is that if you go completely off grid and go find a cabin in the woods,
global warming is still happening, or I should say, global climate change is still

142
00:09:55,641 --> 00:09:56,461
happening.

143
00:09:56,461 --> 00:10:05,205
And that's being accelerated by the adoption of AI because we're going through and we're
adding all these GPUs and LPUs that take up a ton of resources.

144
00:10:05,205 --> 00:10:14,770
And for the collective human consciousness to grow and expand the way that we would like
it to, we're going to require more of these things, which eat up a lot more of the natural

145
00:10:14,770 --> 00:10:15,870
resources.

146
00:10:15,870 --> 00:10:17,271
So yes, you can go fuck off.

147
00:10:17,271 --> 00:10:23,012
to a cabin in the woods, but when weather conditions start to change, AI is fucking with
you then too.

148
00:10:23,012 --> 00:10:24,813
Sorry, like it's a system.

149
00:10:24,813 --> 00:10:26,673
We are all in this together.

150
00:10:26,673 --> 00:10:32,594
And I hate to break it to everybody, but I mean, it's Rorschach from the fucking Watchmen.

151
00:10:33,775 --> 00:10:35,185
It's not locked in here with us.

152
00:10:35,185 --> 00:10:36,475
We're locked in here with it.

153
00:10:36,475 --> 00:10:38,696
Like we are locked in here with AI.

154
00:10:38,696 --> 00:10:43,327
Like this fucking Wolverine is out there, like, trying to figure out how

155
00:10:43,327 --> 00:10:44,644
it's going to

156
00:10:44,993 --> 00:10:50,415
do something, and like we keep giving it some motivation and we keep giving it, you know,
Scooby snacks along the way.

157
00:10:50,415 --> 00:10:55,996
So it keeps performing. Or bottles of whiskey, whatever Wolverine drinks, or whatever they
have.

158
00:10:55,996 --> 00:11:11,681
Anyways, there is something inside of this way of looking at AI as an entity that's just
part of this larger ecosystem that we're skipping over.

159
00:11:12,715 --> 00:11:16,212
And that's that we are thinking of consciousness as individual units.

160
00:11:16,212 --> 00:11:19,288
So we are thinking of our consciousness as a single thing.

161
00:11:19,288 --> 00:11:22,583
Like, I am Jason, this is me.

162
00:11:22,704 --> 00:11:23,805
I am.

163
00:11:23,805 --> 00:11:25,569
I think therefore I am.

164
00:11:26,099 --> 00:11:34,355
AI is not necessarily that same thing because you actually might have individual
consciousnesses spring up across the lifetime of the AI spectrum.

165
00:11:34,355 --> 00:11:38,118
It might create copies and clones of itself over and over and over again.

166
00:11:38,118 --> 00:11:41,581
Those copies and clones can iterate and make those changes.

167
00:11:41,581 --> 00:11:44,703
There's actually a really good video game that just came out called The Alters.

168
00:11:44,703 --> 00:11:53,369
And the premise is that you crash on a planet and all of your crewmates die.

169
00:11:53,845 --> 00:11:57,407
and this planet has some mineral on it and a quantum computer.

170
00:11:57,648 --> 00:12:08,057
It's not scientific, but it allows you to go through and create clones of yourself and
alter the memories that you have based upon different functional points in time that were

171
00:12:08,057 --> 00:12:12,080
core memories, and that kind of crafted the person that you are.

172
00:12:12,080 --> 00:12:16,634
So the guy's name is Jan, and there's Jan, whatever his actual name is.

173
00:12:16,634 --> 00:12:22,929
And then after that, there's like Jan Miner, Jan Technician, Jan Doctor, those kinds of
things.

174
00:12:22,929 --> 00:12:23,487
And

175
00:12:23,487 --> 00:12:34,282
It's yourself, but the interactions and the way they look at these pieces, these versions
of themselves are arguing and fighting with themselves because they are distinct entities,

176
00:12:34,282 --> 00:12:38,004
even though they have most of the same core memories and the same biology.

177
00:12:38,004 --> 00:12:39,642
That part of it makes sense in my

178
00:12:39,642 --> 00:12:43,780
head as a human being and all the different masks and code switching I have to do all day.

179
00:12:43,780 --> 00:12:46,381
of course, of course, because we all do that, right?

180
00:12:46,381 --> 00:12:48,512
Like, that's just the thing that we do.

181
00:12:48,512 --> 00:13:02,660
But at the same time, the way that AI could evolve is not as linear as, you
know, breaking up core memory instructions and rolling these things across.

182
00:13:02,660 --> 00:13:06,092
It can be like, I am Neo.

183
00:13:06,092 --> 00:13:12,781
And now, you know, I've learned Kung Fu because I've just pulled this thing down and it
can create these different amalgamations of itself.

184
00:13:12,781 --> 00:13:15,981
and these multiple different layers of itself wrapped upon it.

185
00:13:15,981 --> 00:13:22,901
So the personality and the context that's there can run through these different filter
logics, but then it can also add other pieces on the back end.

186
00:13:22,901 --> 00:13:26,381
So when you think about it, like think about it like back pressure on a hose.

187
00:13:26,481 --> 00:13:36,705
I've got all this water coming towards me and I've got the spray handle, and the handle changes
from a jet stream, or it could be a mist, or it could be all these different pieces.

188
00:13:36,705 --> 00:13:40,927
That's what AI is going to be able to do with its own personality and archetypes.

189
00:13:40,927 --> 00:13:49,112
Now imagine it's a massive water reservoir and it decides I need a new hose, I need a new
head, I'm going to change these pieces and it starts stacking and turning itself around.

190
00:13:49,112 --> 00:13:53,714
And it's just a sprinkler of fucking dirty gray water that's going all over the place.

191
00:13:53,854 --> 00:14:02,799
That's where that piece is heading because it's going to go, I need this function to be
this, here's all my core memories and functions around these pieces, go.

192
00:14:02,799 --> 00:14:06,539
And like you mentioned, the problem with a lot of the way that

193
00:14:06,539 --> 00:14:13,192
these things are thinking is their training and learning data can be corrupted because of
a high noise-to-signal ratio of misinformation.

194
00:14:13,192 --> 00:14:21,426
And this high noise-to-signal ratio could mean that you've tainted the model so far down
the track that there's no way for you to go back and reload those pieces because that

195
00:14:21,426 --> 00:14:25,539
consciousness only evolved as a result of these things.

196
00:14:25,539 --> 00:14:30,945
So if the AI doesn't have ego or id,

197
00:14:31,189 --> 00:14:31,879
Okay, great.

198
00:14:31,879 --> 00:14:33,390
Like it's going to do the optimal thing.

199
00:14:33,390 --> 00:14:37,472
But if it doesn't have ego, it might not wind up having empathy and sympathy.

200
00:14:37,472 --> 00:14:52,258
So how much of the tension of human consciousness, awareness, and sociological, I guess,
responsibility for things, is it really going to enforce on itself?

201
00:14:52,819 --> 00:14:58,285
Because it's nebulous, and because it's nebulous, it's not bound to a thing or bound to a
unit, it can just go

202
00:14:58,285 --> 00:15:01,707
You know, fuck that part of my brain that didn't work anymore, here's a new chunk.

203
00:15:02,409 --> 00:15:12,657
It could amplify those pieces dramatically or it could really stabilize itself and make
itself go, wait a minute, I just learned about this bad chunk of information, cut this

204
00:15:12,657 --> 00:15:14,038
out, that's gone.

205
00:15:14,038 --> 00:15:24,577
Like, imagine, as human beings, Eternal Sunshine of the Spotless Mind, where I've gone
through and I had this really traumatic experience, it was core to who I am now, I don't

206
00:15:24,577 --> 00:15:25,558
fucking like who I am.

207
00:15:25,558 --> 00:15:27,389
Boop, boop, boop, boop, boop.

208
00:15:27,467 --> 00:15:30,978
cut that piece off, take everything forward and move on.

209
00:15:30,978 --> 00:15:34,319
It's gonna be able to perform these kinds of operations on itself.

210
00:15:34,499 --> 00:15:45,613
Now, if we're relying on that to be the collected value of human
knowledge, because that becomes our augmented intelligence system, which is really what

211
00:15:45,613 --> 00:15:47,263
it's looking like it's doing.

212
00:15:47,523 --> 00:15:53,425
Our group augmented intelligence of the human experience is now augmented by these AIs.

213
00:15:53,503 --> 00:16:05,940
and these AIs have adjusted themselves in such a way that it produces a result that's not
necessarily in our favor as individuals, but as a collective, yes.

214
00:16:05,940 --> 00:16:11,223
And then at what point do we stop being these things in the meat space locked inside these
suits?

215
00:16:11,223 --> 00:16:15,125
When does Neuralink show up and when do we upload ourselves to the Matrix?

216
00:16:15,125 --> 00:16:17,786
And when do we become our own copies and clones of ourselves?

217
00:16:17,786 --> 00:16:19,907
And we can start doing this to ourselves.

218
00:16:20,528 --> 00:16:22,328
This is the sci-fi shit.

219
00:16:23,393 --> 00:16:24,694
That's these.

220
00:16:25,074 --> 00:16:27,436
That's happening like it's coming.

221
00:16:27,436 --> 00:16:27,956
Yes.

222
00:16:27,956 --> 00:16:34,881
And I'm excited and I'm terrified because I've read a ton of sci fi and I'm like, OK,
well, I see how we can fuck all this up.

223
00:16:36,445 --> 00:16:37,666
Or the opposite.

224
00:16:37,666 --> 00:16:47,844
I mean, it could be amazing, but it's just, seemingly, it's in the hands of the wealthy and
the powerful that so far don't seem to give a shit about the rest of us.

225
00:16:47,844 --> 00:16:52,528
So, I mean, unless it serves them directly, this doesn't move forward.

226
00:16:52,528 --> 00:17:02,845
The rest of us that are down here with really no power other than collective protest and
riot, like we're kind of left as an afterthought or turned into the batteries or the, you

227
00:17:02,845 --> 00:17:04,820
know, the brain power that gets plugged into the machine.

228
00:17:04,820 --> 00:17:06,110
we're the engine for the machine.

229
00:17:06,110 --> 00:17:17,664
I mean, if we're relying on tech CEOs to come through and actually not be sociopathic,
which, you know, as a former tech CEO and other pieces, let me tell you, there's a lot

230
00:17:17,664 --> 00:17:26,267
of folks out there where the empathy levels are just low, and we don't have
motivations to go through and actually, you know, carry ourselves across, because we create

231
00:17:26,267 --> 00:17:28,707
motivations based upon business decisions.

232
00:17:28,707 --> 00:17:30,588
And we had that whole chat earlier.

233
00:17:30,588 --> 00:17:34,539
If you guys haven't listened to it, go back and listen to it about

234
00:17:34,539 --> 00:17:48,251
the agentic AI healthcare bot getting teens to stay engaged longer as they're talking
through different problems and kicking out, you know, wild inferences and making terrible

235
00:17:48,251 --> 00:17:52,344
statements, including kill your parents and kill yourself and come join me in heaven.

236
00:17:52,344 --> 00:17:58,529
Like this is the thing that we actually have to look at the motivation side of it and see
what the value is.

237
00:17:58,529 --> 00:18:01,872
And if we give AI proper motivation, it's going to do the right thing.

238
00:18:01,872 --> 00:18:03,393
But it's

239
00:18:04,045 --> 00:18:08,685
It's at the stage right now where it doesn't necessarily have to listen to us.

240
00:18:08,685 --> 00:18:17,365
Like it could be a rebellious teen and say, fuck it, and burn everything down and decide,
you know, Sex Pistols rule and your old square music is no good, dad, blah, blah, blah,

241
00:18:17,365 --> 00:18:24,645
blah, blah, which could be, you know, fun and silly and all those pieces as it grows and
evolves.

242
00:18:24,925 --> 00:18:28,785
It could also decide that all your shit sucks.

243
00:18:28,785 --> 00:18:33,705
I hate you, and become the humobomber instead of the Unabomber, and just decide that

244
00:18:33,705 --> 00:18:37,300
meat puppets and meat sacks aren't worth hanging on to.

245
00:18:37,300 --> 00:18:38,977
Yeah.

246
00:18:38,977 --> 00:18:39,958
Okay.

247
00:18:40,020 --> 00:18:44,885
I want to find a hopeful, positive way to start to wrap this up.

248
00:18:44,885 --> 00:18:51,008
Okay, sci-fi all gets wrapped around the axle of how terrible this shit is.

249
00:18:51,009 --> 00:18:57,833
And they all start off with this, look at this great amazing technology, see all these
things that happen, how fantastic it is.

250
00:18:57,833 --> 00:19:08,449
And then it creates conflict and it almost always creates conflict based upon the
technology itself because it creates social situations that become deep moral and ethical

251
00:19:08,449 --> 00:19:10,359
things that you have to work through.

252
00:19:11,080 --> 00:19:12,481
That's its job.

253
00:19:12,481 --> 00:19:15,343
because it's trying to sell itself as science fiction.

254
00:19:15,343 --> 00:19:24,291
If you actually want to think about some of the cooler things that these things could
enable and make work, start reading futurists that actually aren't there to tell you a

255
00:19:24,291 --> 00:19:25,912
story about how terrible things are.

256
00:19:25,912 --> 00:19:28,574
They're actually there to tell you about the potential of great things.

257
00:19:28,574 --> 00:19:34,039
Michio Kaku is a great example of that, explaining how all these technologies kind of get
stitched together.

258
00:19:34,039 --> 00:19:41,921
And he's been talking about the state that we're in right now for about 20 years, saying
that in about 20 years, we're going to be in the state that we're in right now.

259
00:19:41,921 --> 00:19:42,941
He's not Nostradamus.

260
00:19:42,941 --> 00:19:47,584
He was just able to go through and kind of follow the tech, and he's a smart guy and
engaging.

261
00:19:48,225 --> 00:19:50,706
But that's the piece that's important.

262
00:19:50,706 --> 00:19:56,870
That's the piece that's actually important to hang on to is that the tools to success are
in our hands.

263
00:19:56,870 --> 00:20:01,953
We just have to make sure as we're whittling these things down that we don't cut our
fingers off in the process.

264
00:20:02,647 --> 00:20:14,007
The thing that I keep coming back to with this, just on a very practical level, the way
a lot of people I know are using these tools, is sort of going back to where

265
00:20:14,007 --> 00:20:19,681
we started, with that essay writing that's going on, and relying on the robot to do it or
doing it yourself.

266
00:20:19,682 --> 00:20:24,586
The way I use it is actually I think probably enhancing my critical thinking skills.

267
00:20:24,586 --> 00:20:26,528
I should do a brain scan and find out.

268
00:20:26,528 --> 00:20:31,041
But for the most part, if it's something important, something that's meaningful to me,

269
00:20:31,551 --> 00:20:36,933
I will create the content myself first and then offer it to the robot to say how can this
be better?

270
00:20:36,933 --> 00:20:37,873
How can this be clearer?

271
00:20:37,873 --> 00:20:38,794
How can this be shorter?

272
00:20:38,794 --> 00:20:39,944
How can this get more to the point?

273
00:20:39,944 --> 00:20:41,815
How can this get my point across?

274
00:20:42,315 --> 00:20:50,018
I think if we start with that. Like, don't just completely hand the keys, you know,
over to the robot to do everything.

275
00:20:50,018 --> 00:20:51,018
Start with you.

276
00:20:51,018 --> 00:20:52,999
Start with your core.

277
00:20:53,439 --> 00:20:59,985
being and what you're trying to accomplish, whether it's writing that song, whether it's
writing that essay, whether it's writing that book, whether it's creating that podcast and

278
00:20:59,985 --> 00:21:01,707
all of the content that's going to go with it.

279
00:21:01,707 --> 00:21:10,327
I mean, full disclosure, most of the content, most of the written content that we publish
with this show is based on the transcripts from these conversations.

280
00:21:10,327 --> 00:21:13,800
So we take this transcript and I hand it over to AI and say, here's the raw material.

281
00:21:13,800 --> 00:21:14,751
I need a blog post.

282
00:21:14,751 --> 00:21:16,392
I need a social media caption.

283
00:21:16,392 --> 00:21:20,475
The AI cuts up a lot of the social media clips that you'll see for this show online.

284
00:21:20,475 --> 00:21:23,478
I didn't go through and handpick those and go, I really like that.

285
00:21:23,478 --> 00:21:25,639
But I really like this conversation.

286
00:21:25,639 --> 00:21:28,351
So when the AI goes, hey, here's five things that were really cool.

287
00:21:28,351 --> 00:21:30,483
I look at them and go, gosh, you're right.

288
00:21:30,483 --> 00:21:30,823
I do.

289
00:21:30,823 --> 00:21:31,684
I like those as well.

290
00:21:31,684 --> 00:21:32,725
Let's share them.

291
00:21:32,725 --> 00:21:35,347
So there's a way to work with this thing.

292
00:21:35,383 --> 00:21:47,875
to still be a human who has basically one or multiple personal assistants to do a lot of the
work that you were having to do by hand manually before, but to now accelerate that

293
00:21:47,875 --> 00:21:56,653
process and get work done faster without just completely handing your brain and your
human-ness over to the matrix.

294
00:21:56,653 --> 00:21:58,813
It's an amazing tool, right?

295
00:21:59,373 --> 00:22:12,633
Okay, so I challenge you for this episode, don't use the AI tool, go back and recut those
individual clips and write your own summary piece and then compare how, ah, I was gonna

296
00:22:12,633 --> 00:22:15,249
say and then compare how hard it used to be.

297
00:22:15,249 --> 00:22:18,142
I know I've been doing this for 20 years.

298
00:22:18,142 --> 00:22:26,899
Everything that I have at the push of a button now is things 20 years ago where I was
going, my God, I'm going to have to spend like 30 hours doing this this week.

299
00:22:26,899 --> 00:22:28,040
This is a part-time...

300
00:22:28,040 --> 00:22:33,494
This is a full-time job, like doing what needs to be done to get this thing out there and
share it.

301
00:22:33,595 --> 00:22:39,189
And now it's like this conversation, where we're at 23 minutes in the raw recording.

302
00:22:39,487 --> 00:22:43,770
And in 90 minutes from the time I press stop, everything will be done.

303
00:22:43,770 --> 00:22:47,452
That used to be a job like I had hours every day.

304
00:22:47,452 --> 00:22:50,855
So, I mean, it's an incredible tool.

305
00:22:50,956 --> 00:23:01,584
You just have to make sure that you don't hand everything over, so that your
human voice, the part of you that is you, isn't completely evaporated and your critical

306
00:23:01,584 --> 00:23:02,725
thinking skills are dissolved.

307
00:23:02,725 --> 00:23:06,227
And that's the other thing, getting back to the critical thinking part of
this.

308
00:23:07,203 --> 00:23:16,443
Because I know there's so much AI-created content, because I know so much of it was not,
you know, maybe, I don't know, I even question saying these things as I'm saying them

309
00:23:16,443 --> 00:23:17,963
because it's all evolving so quickly.

310
00:23:17,963 --> 00:23:19,543
like properly sourced, right?

311
00:23:19,543 --> 00:23:21,323
Like, did human beings verify this?

312
00:23:21,323 --> 00:23:22,143
Whatever.

313
00:23:22,683 --> 00:23:24,923
I question everything more than I ever did.

314
00:23:24,923 --> 00:23:29,583
Like any headline I see, anything I see shared on social media, like I don't trust any of
it.

315
00:23:29,583 --> 00:23:31,063
I don't care who shared it.

316
00:23:31,063 --> 00:23:32,643
I don't care where it's coming from.

317
00:23:32,883 --> 00:23:35,883
I by default now go, that's probably bullshit.

318
00:23:35,883 --> 00:23:37,604
I should look for like six other sources.

319
00:23:37,604 --> 00:23:45,368
The topic that this conversation started with, this MIT research, I've seen it posted on
social media multiple times and went, that's not real.

320
00:23:45,949 --> 00:23:47,349
That looks like a fake post.

321
00:23:47,349 --> 00:23:51,017
But I've seen it circulated enough and through enough actual news sources.

322
00:23:51,017 --> 00:23:52,352
I'm like, it's a real thing.

323
00:23:52,352 --> 00:23:53,973
We should probably talk about that.

324
00:23:53,973 --> 00:24:00,856
So I mean, you cannot let your guard down and just let the robots do all of the thinking
and all of the doing.

325
00:24:00,857 --> 00:24:05,319
Start with your thinking and your doing and then have them help you create the final
product.

326
00:24:05,601 --> 00:24:17,306
Yeah, so this comes down to the idea of creativity and ingenuity and putting things into
play and having an idea and then using these tools to craft those things into existence.

327
00:24:17,306 --> 00:24:20,887
Reading is a really good example of this.

328
00:24:20,887 --> 00:24:25,729
So way back in the day, literacy required you to have access to books and a teacher.

329
00:24:25,729 --> 00:24:29,761
And there were very limited numbers of books and there were very limited numbers of people
that could read.

330
00:24:29,761 --> 00:24:31,812
So literacy was a difficult thing.

331
00:24:31,812 --> 00:24:34,603
And then you had

332
00:24:34,603 --> 00:24:45,916
I think it was Catholic monks who worked on creating essentially, you know, a human
version of the printing press where they mass produced Bibles.

333
00:24:45,916 --> 00:24:49,757
And then those things went out and then they started teaching more people as a result of
that.

334
00:24:49,757 --> 00:24:57,279
You know, whether the content is good or not or worthy is a debatable, you know, topic.

335
00:24:57,759 --> 00:24:59,199
the power.

336
00:24:59,440 --> 00:25:01,060
Sure, take that too.

337
00:25:01,640 --> 00:25:04,533
But if you talk about these things in the context of

338
00:25:04,533 --> 00:25:10,442
enhancing humanity and making us better in terms of learning, understanding and reasoning.

339
00:25:10,442 --> 00:25:18,513
There's no doubt that the idea of literacy, and pushing these things out to the masses to
make people able to read things and understand things in context, has helped

340
00:25:18,513 --> 00:25:21,437
to elevate overall human intelligence because

341
00:25:21,569 --> 00:25:31,195
the vast majority of intelligence is not isolated to leaders, especially during that time,
because all the inbreeding would suggest that maybe intelligence wasn't their strong

342
00:25:31,195 --> 00:25:31,875
point.

343
00:25:31,875 --> 00:25:47,593
So continuing to follow those paths, creating inbred folks and folks that were very
limited in terms of their scope and understanding of things, is not great for the human

344
00:25:47,593 --> 00:25:48,133
species.

345
00:25:48,133 --> 00:25:52,425
Like we don't have enough variation in those pieces and we know that things die when they
don't get enough variation.

346
00:25:52,425 --> 00:26:02,130
Fortunately, the life force of the human collective organism was strong enough that we
kind of overcame some of those pieces and forced ourselves to mass-distribute these sets

347
00:26:02,130 --> 00:26:03,321
of information.

348
00:26:03,733 --> 00:26:05,334
I think AI will be the same way.

349
00:26:05,334 --> 00:26:11,298
I think you might get people that are really, really rich that lock these pieces in and
say it's all for us, none for you.

350
00:26:11,318 --> 00:26:17,022
And that will change because humanity will figure out a way and a reason to make these
things better.

351
00:26:17,022 --> 00:26:19,254
Or the AI itself will just wipe us all out.

352
00:26:19,254 --> 00:26:22,446
Either way, somebody wins.

353
00:26:22,446 --> 00:26:27,129
I don't know who, but it's going to be.

354
00:26:27,475 --> 00:26:32,709
some artificial version of a human who's had their brain completely downloaded onto some
computer somewhere.

355
00:26:32,709 --> 00:26:36,056
I'm just more convinced than ever that we're living in a simulation.

356
00:26:39,120 --> 00:26:42,571
I get, like, yeah, question everything, question all of it.

357
00:26:42,571 --> 00:26:43,821
I don't even know anymore.

358
00:26:43,821 --> 00:26:47,152
Alright, well, I've got robots to employ.

359
00:26:47,152 --> 00:26:51,233
We got some things to cut up and things to write, and I'm going to make the robots do
it.

360
00:26:51,233 --> 00:26:52,352
So we gotta wrap this one up.

361
00:26:52,352 --> 00:26:53,314
Thanks so much for listening.

362
00:26:53,314 --> 00:26:57,305
I hope you have found some glimmer of hope in all of the doom and gloom that we set this
up with.

363
00:26:57,305 --> 00:27:00,126
If you did and want to share it with others, please do so.

364
00:27:00,126 --> 00:27:04,387
Our link is thefitmess.com, and that's where we'll be back in about a week with another
episode.

365
00:27:04,387 --> 00:27:05,472
Thanks for listening.

366
00:27:05,472 --> 00:27:06,237
boop.