July 22, 2025

How to Manage AI Overwhelm

Why is everyone promising AI will make you a millionaire overnight?

The internet is drowning us in AI promises, productivity hacks, and get-rich-quick schemes. Every day brings another "expert" claiming you'll be left behind if you don't master their 17 AI tools immediately. Sound familiar? It's the same bullshit cycle that plagued men's wellness - thousands of conflicting voices screaming about supplements, sleep schedules, and workout routines until you feel like a failure for not doing everything perfectly.

Here's the truth: just like health and wellness boiled down to sleeping well, eating decent food, and moving your body, the AI revolution doesn't require you to become a productivity guru overnight. Most of the noise is just that - noise designed to capitalize on your fear of missing out.

Listen to learn how to cut through the AI overwhelm, protect your mental health, and focus on what actually matters.

Topics Discussed:

  • Why AI content feels like the wellness industry's overwhelming advice all over again
  • How social media algorithms exploit our psychological vulnerabilities with AI fear-mongering
  • The difference between actual AI innovation and marketing hype designed to sell courses
  • Why letting your community filter information is more effective than doom-scrolling
  • How FOMO becomes a destructive FUD (fear, uncertainty, doubt) cycle
  • The reality check: most AI "opportunities" are just repackaged get-rich-quick schemes
  • Why disconnecting from the constant stream of AI content is essential for mental health
  • How to identify signal versus noise in the AI information landscape
  • The importance of accepting what you can't control in technological change
  • Practical strategies for managing AI anxiety and information overwhelm

----

MORE FROM THE FIT MESS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to The Fit Mess on YouTube

Join our community in the Fit Mess Facebook group

----


1
00:00:04,921 --> 00:00:05,992
Hey, it's the Fit Mess.

2
00:00:05,992 --> 00:00:06,962
He's Jason, I'm Jeremy.

3
00:00:06,962 --> 00:00:10,437
We talk about AI and mostly mental health related topics.

4
00:00:10,558 --> 00:00:12,620
And it's so interesting.

5
00:00:12,620 --> 00:00:21,810
Today we're gonna talk about just how overwhelming all of this information about AI is and
trying to keep up, trying to stay relevant, trying to stay on top of it.

6
00:00:21,871 --> 00:00:22,651
I'm drowning.

7
00:00:22,651 --> 00:00:26,593
I think a lot of people are drowning based on the number of articles I found looking for
this today.

8
00:00:26,593 --> 00:00:28,754
Tons of people are feeling the same way.

9
00:00:28,771 --> 00:00:31,125
And it reminds me of when this show first started.

10
00:00:31,125 --> 00:00:37,258
You know, the Fit Mess started as a typical sort of men's health, men's wellness, mental
health show.

11
00:00:37,358 --> 00:00:44,952
And when you first are exploring this kind of thing, you're, you know, suddenly you're
following a bunch of new people that, you know, cover this sort of thing.

12
00:00:44,952 --> 00:00:47,343
Your feed just gets overwhelmed with.

13
00:00:47,949 --> 00:00:48,819
You need to do this.

14
00:00:48,819 --> 00:00:49,720
You need to drink this.

15
00:00:49,720 --> 00:00:57,544
You need to eat this and take this supplement and only sleep this much and only work out
on these days when the moon is in the right position in the sky, and all of

16
00:00:57,544 --> 00:00:59,886
sudden you're just like, holy shit.

17
00:00:59,886 --> 00:01:01,877
Like no wonder I'm a big fat load.

18
00:01:01,877 --> 00:01:03,628
I can't do these 4000 things.

19
00:01:03,628 --> 00:01:05,168
What the hell?

20
00:01:05,749 --> 00:01:10,251
Then after a few years of doing the interviews with everybody that's all saying the same
thing, like, you know, get enough sleep.

21
00:01:10,251 --> 00:01:11,232
Don't eat like complete shit.

22
00:01:11,232 --> 00:01:13,383
Move your body and you know, take a break once in a while.

23
00:01:13,383 --> 00:01:17,035
Like, oh, it's a lot simpler than I thought it was.

24
00:01:17,333 --> 00:01:24,603
I feel the same way about AI right now because we've taken a hard shift on this show and we're talking a lot
about AI and its relationship to mental health.

25
00:01:25,456 --> 00:01:32,876
I mean, every day in my feed there is, bro, if you're not doing these seven prompts to
manage your budget, are you even a fucking adult?

26
00:01:32,876 --> 00:01:39,716
Like, like everyone's got a thousand opinions about the best way to optimize your life and
all of the things you're trying to do.

27
00:01:39,716 --> 00:01:46,476
And the 17 businesses you could start by noon and be a millionaire at age 22 because
ChatGPT will just do all, do everything for you.

28
00:01:46,476 --> 00:01:48,576
And you can just go fuck off for the day.

29
00:01:48,576 --> 00:01:51,180
That's where I am in my head is I just feel like

30
00:01:51,180 --> 00:02:00,871
Am I that dumb that I cannot keep up and figure out how to, like, be the overnight
millionaire because AI is here now? Like, that's where I'm living in my head, spiritually

31
00:02:00,871 --> 00:02:09,098
and mentally, and it's overwhelming me. And, you know, fortunately, there's a few thousand people
on the internet that have written articles saying basically the same thing.

32
00:02:09,126 --> 00:02:17,082
Well, and I also think those people writing those articles are also not millionaires or
else they wouldn't be writing those articles.

33
00:02:17,343 --> 00:02:21,106
So, I mean, it's the signal to noise ratio problem, right?

34
00:02:21,106 --> 00:02:29,412
So how much of what I'm getting is actually a signal I should be paying attention to
versus just straight up noise that's echoing in the background.

35
00:02:29,513 --> 00:02:31,914
And this is not a new problem, right?

36
00:02:31,914 --> 00:02:37,879
Like way back when cable news started during the first Iraq war, that's how old we are.

37
00:02:37,879 --> 00:02:39,144
oh

38
00:02:39,144 --> 00:02:48,864
The big thing was, hey, this 24-7 news feed with the Iraq war is great because there's
always content on there and there's always something new because they're blowing some

39
00:02:48,864 --> 00:02:50,504
new shit up.

40
00:02:50,504 --> 00:03:01,704
Well, after a month when the war kind of died off and they had nothing else to show, then
they started showing the same shit over and over and over again, but with slightly

41
00:03:01,704 --> 00:03:05,064
different commentary, with slightly different bits of information.

42
00:03:05,064 --> 00:03:08,900
And it just became this 24-7 news cycle to sell

43
00:03:08,900 --> 00:03:13,283
you know, places to direct your eyeballs to get more people paying attention to this.

44
00:03:13,424 --> 00:03:16,646
That same model is how the internet works.

45
00:03:16,646 --> 00:03:24,772
The reason why things are free is because you're the product and that has just
extrapolated and expanded over time.

46
00:03:24,772 --> 00:03:31,958
And if you look at Google, most of Google's money does not come from the search engine.

47
00:03:31,958 --> 00:03:34,880
It comes from people paying money.

48
00:03:35,314 --> 00:03:41,727
to be ranked higher in the search engine and from selling ads with DoubleClick and
YouTube.

49
00:03:42,067 --> 00:03:53,142
So they make their money doing these free functions and they are incentivized to throw
more shit at you over and over and over again so you pay attention to it.

50
00:03:53,142 --> 00:04:03,048
And then Facebook took it another step forward and they created outrage culture and all
these other pieces for you to be engaged with these pieces because they psychologically...

51
00:04:03,048 --> 00:04:13,291
profiled people and realized very quickly that human beings are much more motivated by
hate, anger, and rage than we are by love, caring, and concern.

52
00:04:13,331 --> 00:04:24,234
So what I did to counteract that, sorry hiccups, I went through and I started only liking
cat videos and puppy videos.

53
00:04:24,234 --> 00:04:32,366
And most of my feed is like cats and puppies and red pandas and like narwhals and other
like

54
00:04:32,380 --> 00:04:36,543
fun, friendly, softy shit because that's what I chose to engage with.

55
00:04:36,684 --> 00:04:41,858
But Facebook still goes every now and again, hey, here's this really outrageous thing.

56
00:04:41,858 --> 00:04:48,674
Like they're trying to sample test me to figure out if I'm going to stay engaged on some
outrage content for a long enough period of time.

57
00:04:49,075 --> 00:04:59,684
Because I know they really, really want that to be the methodology of how it is they
connect with me because they want to sell me ads that either countermand that or reinforce

58
00:04:59,684 --> 00:05:00,364
that.

59
00:05:01,146 --> 00:05:08,120
Now we have AI, and AI is like, I got you Facebook algorithm.

60
00:05:08,821 --> 00:05:18,168
I'm going, um, to make this shit so controversial and so crazy that there's no way that you
can turn away from it.

61
00:05:18,168 --> 00:05:30,026
Because people use prospecting emails, and they use direct outreach, and they write
articles and everything else based upon what AI tells them is a valid and really facile

62
00:05:30,026 --> 00:05:30,876
point.

63
00:05:30,914 --> 00:05:41,161
of interaction, um, because they think that if they go through and
they create the right content, they're going to get enough people looking at it that they

64
00:05:41,161 --> 00:05:44,593
can take advantage of that and actually make money off of it.

65
00:05:44,754 --> 00:05:53,880
And the reality is that the content that comes out normally is not very substantive and it
doesn't really have anything new in it.

66
00:05:53,880 --> 00:05:55,661
It sounds good.

67
00:05:55,721 --> 00:05:57,402
It flows well.

68
00:05:57,623 --> 00:06:00,024
It might actually get the message across.

69
00:06:00,646 --> 00:06:07,761
but it's not a new message and it's probably being repeated a billion times for different
audiences.

70
00:06:07,802 --> 00:06:19,352
So if you consume it, if you choose to avail yourself to being allowed to have all that
noise come at you, you're gonna have to spend time sorting through it to figure out what

71
00:06:19,352 --> 00:06:28,079
the actual signal is because the content's probably not good, it hallucinates like a
motherfucker and most of the time people have an agenda when they're pushing that towards

72
00:06:28,079 --> 00:06:29,620
you and if you say,

73
00:06:29,936 --> 00:06:32,307
hey, I really want to influence things in this way.

74
00:06:32,307 --> 00:06:35,387
AI, do a better job of it than I can.

75
00:06:35,387 --> 00:06:40,419
It goes, OK, and runs after that problem space and tries to make that occur.

76
00:06:40,499 --> 00:06:47,741
And it's fucking good at it, at generating content that looks good, sounds good, but maybe
doesn't really actually have anything of value.

77
00:06:47,741 --> 00:06:53,102
So people's voices are being diminished, and they're being homogenized.

78
00:06:53,562 --> 00:06:55,983
And that part sucks.

79
00:06:55,983 --> 00:06:59,684
And we're all fucking overwhelmed and inundated with it.

80
00:07:00,398 --> 00:07:02,469
And frankly, it's terrible.

81
00:07:02,469 --> 00:07:11,395
Now, what I will say is the one advantage is with AI in the mix, there are a lot fewer
grammatical errors in articles now and misspellings.

82
00:07:11,656 --> 00:07:20,582
One of the key indicators for me during the early days of AI was to look for misspellings
and go, a human wrote this, because the AI would have fixed it.

83
00:07:21,022 --> 00:07:27,046
But now they put fucking spelling mistakes and grammatical errors in there on purpose.

84
00:07:27,046 --> 00:07:29,753
So it looks like AI didn't do this.

85
00:07:29,753 --> 00:07:30,735
Yeah.

86
00:07:30,888 --> 00:07:34,128
Great, so like, OK, another signal that I can't pay attention to.

87
00:07:34,128 --> 00:07:36,668
And I just start stripping these things off.

88
00:07:37,428 --> 00:07:41,968
Yeah, I mean, Tim Ferriss' The 4-Hour Workweek still applies.

89
00:07:41,968 --> 00:07:48,048
And on his podcast, they're revisiting that, like all these different tools and things that
they've used over and over again.

90
00:07:48,048 --> 00:07:50,008
What stands up to the test of time?

91
00:07:50,208 --> 00:08:00,148
And one of the things that stands up to the test of time is don't look at your email
except for maybe once a day, if not getting it stripped down to once a week.

92
00:08:00,988 --> 00:08:08,553
The same principles apply: don't look at the news, don't doom scroll, don't sit there on
your Facebook all day.

93
00:08:08,553 --> 00:08:11,195
Turn that shit off and walk the fuck away from it.

94
00:08:11,195 --> 00:08:17,919
Because the reality is that if something really important is happening, people are
going to tell you.

95
00:08:17,960 --> 00:08:21,922
And if you really want to understand something, go and ask people.

96
00:08:21,922 --> 00:08:24,464
Say, hey, what's new today?

97
00:08:24,464 --> 00:08:30,468
And that gives you that interaction with another human being, because they probably read
that bullshit anyways and let them be your filter.

98
00:08:30,722 --> 00:08:40,362
That was one of the huge takeaways from that book for me, very much that, like, let your
community, let the people around you, be your source of information.

99
00:08:40,362 --> 00:08:50,245
I mean, I worked in the news for 20 years. Like, my job was to fill people's heads with fear
and mayhem because that's what sells, because, like you were saying, humans are not

100
00:08:50,245 --> 00:08:52,216
motivated by love and happiness and joy.

101
00:08:52,216 --> 00:08:54,914
It's fear and anger and you know.

102
00:08:54,914 --> 00:08:56,575
just terrible emotions.

103
00:08:56,575 --> 00:09:06,779
So we feed that reptilian brain the information that will scare it to give it a false
sense of safety because they are now aware of the thing that happened 4,000 miles away

104
00:09:06,779 --> 00:09:08,140
that will never touch their life.

105
00:09:08,140 --> 00:09:10,541
But because they're aware of it, they feel somewhat safer.

106
00:09:10,541 --> 00:09:11,897
So that's the news.

107
00:09:11,897 --> 00:09:14,202
It's just like, be afraid, be afraid, be afraid.

108
00:09:14,202 --> 00:09:18,184
But now you know, so at least you're a little bit safe, which is complete bullshit.

109
00:09:18,584 --> 00:09:22,766
But I love the idea of like, don't watch the news all day.

110
00:09:22,956 --> 00:09:25,487
maybe once a week, check in, see what's going on.

111
00:09:25,487 --> 00:09:33,591
But if something huge happens, your spouse is probably gonna be on their phone and they're
gonna know, or you're gonna get a phone call from somebody in that town where it is

112
00:09:33,591 --> 00:09:35,501
affecting them and you're gonna find out.

113
00:09:35,501 --> 00:09:38,072
And that's instead of, so, where are you from?

114
00:09:38,072 --> 00:09:38,853
What do you do?

115
00:09:38,853 --> 00:09:40,834
It's like, what are you paying attention to?

116
00:09:40,834 --> 00:09:42,274
What are you excited about right now?

117
00:09:42,274 --> 00:09:45,656
Those kinds of conversations are so much more engaging and interesting anyways.

118
00:09:45,656 --> 00:09:51,058
ah But this, again, this sense though that I have of like,

119
00:09:51,235 --> 00:09:53,718
missing out because a lot of my life I did miss out.

120
00:09:53,718 --> 00:09:55,931
Like I was late to the party on taking care of myself.

121
00:09:55,931 --> 00:09:59,796
I was late to the party on, you know, trying to be a grownup and get a real job and pay
the bills.

122
00:09:59,796 --> 00:10:03,638
Like there's things that for me personally, I feel like I, you know,

123
00:10:03,638 --> 00:10:11,074
was too smart, too cool, to pay attention to, and, you know, come years later to find
out like, oh yeah, I probably should have done that when everybody was saying that was the

124
00:10:11,074 --> 00:10:11,945
thing to do.

125
00:10:11,945 --> 00:10:20,732
This feels like that for me again. Like, this is that trauma being revisited of, what am
I missing out on that in 10 years I'm going to go, man, I was paying so much attention to

126
00:10:20,732 --> 00:10:23,658
this thing why didn't I.

127
00:10:23,995 --> 00:10:28,038
Blank you know and become something bigger than what I am.

128
00:10:28,528 --> 00:10:29,390
Right.

129
00:10:29,390 --> 00:10:38,244
And what you're missing out on is being part of the architectural community that causes
the eventual downfall of human society.

130
00:10:38,442 --> 00:10:39,363
You

131
00:10:41,158 --> 00:10:44,570
I mean, it sounds good.

132
00:10:44,570 --> 00:10:46,090
It sounds fun.

133
00:10:46,471 --> 00:10:48,271
It's definitely doable.

134
00:10:48,672 --> 00:10:53,594
But FOMO is a fucked up thing in general.

135
00:10:53,594 --> 00:10:56,295
And people capitalize on it, right?

136
00:10:56,295 --> 00:10:58,126
Like, they push really, really hard.

137
00:10:58,126 --> 00:11:02,358
And the whole companies are wrapped around this idea of creating FOMO.

138
00:11:02,438 --> 00:11:07,110
And you don't have to subscribe to that newsletter.

139
00:11:07,206 --> 00:11:09,167
Like that's the hardest lesson to learn.

140
00:11:09,167 --> 00:11:12,790
I mean, Buddhist philosophy and Taoist philosophy teach a lot of that, right?

141
00:11:12,790 --> 00:11:18,133
Like, want not, you know, that's the big part of it.

142
00:11:18,133 --> 00:11:21,565
So if you want not, then FOMO kind of becomes a non-issue.

143
00:11:21,626 --> 00:11:31,112
The real function of it, though, is really creating FUD, fear, uncertainty, and doubt,
versus a sense of FOMO, which I think is really what it's doing.

144
00:11:31,112 --> 00:11:37,036
uh FOMO becomes a FUD cycle very, very fast because they use FUD to create FOMO.

145
00:11:39,442 --> 00:11:41,219
That's a sentence you just said out loud.

146
00:11:41,219 --> 00:11:42,592
it is.

147
00:11:42,902 --> 00:11:44,345
I am an AI companion.

148
00:11:44,345 --> 00:11:46,208
eh

149
00:11:48,676 --> 00:11:57,458
The hard part about it is that, so I'm in this corporate tech world where everyone is
like, we have to create AI and there's 50 billion things that are AI.

150
00:11:57,459 --> 00:12:02,500
And like I've been to several meetings where they're like, we need AI.

151
00:12:02,500 --> 00:12:04,540
Okay, for what?

152
00:12:04,540 --> 00:12:06,181
What do you want it to do?

153
00:12:06,181 --> 00:12:07,461
What are you trying to get out of it?

154
00:12:07,461 --> 00:12:08,722
I don't know, I just need AI.

155
00:12:08,722 --> 00:12:16,154
Like I got canned from a company because I questioned the idea that we needed an AI to do
a thing.

156
00:12:16,154 --> 00:12:16,564
And guess what?

157
00:12:16,564 --> 00:12:17,544
We didn't.

158
00:12:19,329 --> 00:12:22,672
After I left, it was, you know, oh, you were right.

159
00:12:22,672 --> 00:12:23,692
Yeah, I know.

160
00:12:23,693 --> 00:12:24,193
Fine.

161
00:12:24,193 --> 00:12:26,785
uh Not hard to figure out.

162
00:12:26,785 --> 00:12:30,178
yeah, it's just words.

163
00:12:30,178 --> 00:12:40,405
What you have to realize is that we've created this cycle and this belief, this hype
cycle really, around artificial intelligence.

164
00:12:40,505 --> 00:12:44,230
And it's a hype cycle that

165
00:12:44,230 --> 00:12:51,292
gets companies thinking we need to figure out how to be in this space because we need to
monetize these pieces because clearly people are either going to save money by getting rid

166
00:12:51,292 --> 00:12:57,343
of other people or they're going to make money by creating content much faster than other
people.

167
00:12:58,364 --> 00:13:04,565
And the reality is that there is actually some development physics and some creativity
physics that you just can't get around.

168
00:13:04,606 --> 00:13:08,567
And those are things that don't happen inside of a GPU or an LPU.

169
00:13:08,567 --> 00:13:13,478
They happen inside of neural brain synapses that are affected in a way

170
00:13:13,478 --> 00:13:18,970
that is not replicable by things that live inside of silicon.

171
00:13:19,210 --> 00:13:24,733
One of the things that makes us so creative is, I mean, fear is part of it, right?

172
00:13:24,733 --> 00:13:39,869
Our limbic system and our desire to survive, but also the effects of the natural world on
our environment and our brains in relationship to the environment, that has a manifest

173
00:13:39,869 --> 00:13:42,408
effect on the way that you produce

174
00:13:42,408 --> 00:13:45,449
content and the way that you produce creatively.

175
00:13:45,449 --> 00:13:55,423
And if you don't acknowledge that in this process, what you're going to wind up with is
just homogenized content that all sounds the same, that's repeating the same shit over and

176
00:13:55,423 --> 00:14:02,096
over again, because it's mapped itself inside of the silicon that's actually there to
produce these generative outputs.

177
00:14:02,096 --> 00:14:06,503
These generative outputs are all going to go, what's the goal that I've done here before?

178
00:14:06,503 --> 00:14:11,060
All right, well, I know how to get back to that peak or that valley again, and it's going
to go and recreate that.

179
00:14:12,314 --> 00:14:16,657
If you want to do something new, you're probably going to have to actually do something
new.

180
00:14:16,657 --> 00:14:18,549
And that doesn't mean that you can't use AI to do it.

181
00:14:18,549 --> 00:14:20,209
And you probably should.

182
00:14:20,209 --> 00:14:28,136
But bringing an idea to life versus letting the AI make the idea for you, big difference.

183
00:14:30,132 --> 00:14:31,814
Two things stand out to me about that.

184
00:14:31,814 --> 00:14:39,183
One is, you know, having done a health and wellness podcast for however many, six years,
however long we've been doing this thing.

185
00:14:40,365 --> 00:14:43,629
I feel like we say this a lot, but humans do this too.

186
00:14:43,629 --> 00:14:47,073
There are thousands of podcasts of people.

187
00:14:47,345 --> 00:14:56,060
saying something as though it's new, even though it is the same information that has been
regurgitated a thousand times, we have said the same information that we have said a

188
00:14:56,060 --> 00:15:00,953
thousand times, probably sometimes in less interesting ways, probably sometimes in more
interesting ways.

189
00:15:00,953 --> 00:15:03,266
in this podcast we've done that.

190
00:15:03,266 --> 00:15:04,480
I think I have.

191
00:15:04,549 --> 00:15:05,269
yeah, totally.

192
00:15:05,269 --> 00:15:16,069
I mean, you start to repeat these things because it's become so clear to you and you're
just trying to get that information out, and people need to hear something 50, 100 times

193
00:15:16,069 --> 00:15:24,809
or they need to hear it the right way at the right time when they're receptive to that
information, which is why there are entire industries built on repeat the shit out of the

194
00:15:24,809 --> 00:15:28,229
one thing forever because it's just got to hit the right person at the right time.

195
00:15:28,229 --> 00:15:30,329
And then all of a sudden they get out their checkbook.

196
00:15:30,414 --> 00:15:33,194
But also you were talking about the creative process.

197
00:15:33,194 --> 00:15:43,974
I was just reading an article the other day about these bands that are starting to chart
where there is like zero imprint online of them being an actual band, having a fan base,

198
00:15:44,094 --> 00:15:45,854
being human beings at all.

199
00:15:45,854 --> 00:15:56,374
It apparently seems to be completely AI generated bands that are creating hit music
because there's a formula again that humans create and replicate over and over.

200
00:15:56,374 --> 00:15:58,234
It is acting

201
00:15:58,525 --> 00:16:09,069
very human in some of the worst possible ways, but it's so interesting to see how we are
easily fooled and manipulated into listening to that song, liking it on Spotify, following

202
00:16:09,069 --> 00:16:09,600
that playlist.

203
00:16:09,600 --> 00:16:12,818
And it turns out it was a bunch of ones and zeros and nobody ever took the stage.

204
00:16:12,818 --> 00:16:16,029
capitalizing on our social vulnerabilities.

205
00:16:16,689 --> 00:16:18,760
Yeah, no, that's, I mean, that's a thing, right?

206
00:16:18,760 --> 00:16:27,452
Like, we've taught these things to do this and they're going to keep
exploiting it and they're going to keep pushing us and they're going to keep making us try

207
00:16:27,452 --> 00:16:28,652
to do different things.

208
00:16:28,652 --> 00:16:40,606
The important thing is to disconnect and to put it down and to not pretend like you need
to know everything and sure as fuck learn to pretend or learn to stop pretending like you

209
00:16:40,606 --> 00:16:42,136
can control everything.

210
00:16:42,312 --> 00:16:43,652
And that is the biggest part.

211
00:16:43,652 --> 00:16:45,472
And that's a lesson that's very, very difficult.

212
00:16:45,472 --> 00:16:47,272
mean, especially for us Xers, right?

213
00:16:47,272 --> 00:16:48,912
Like we were told we could do anything.

214
00:16:48,912 --> 00:16:49,852
We could be anything.

215
00:16:49,852 --> 00:16:54,912
You know, I mean, the Tyler Durden line of we were all told to be billionaire rock stars
and we're not.

216
00:16:54,912 --> 00:16:57,052
And now we're really fucking pissed off about it.

217
00:16:57,052 --> 00:16:59,152
Well, okay.

218
00:16:59,652 --> 00:17:01,212
So we made AI.

219
00:17:02,112 --> 00:17:03,432
Like that's our response.

220
00:17:03,432 --> 00:17:04,872
That's our generational response.

221
00:17:04,872 --> 00:17:11,938
Our generational response is to go through and just, you know, completely and totally
eliminate the human experience in this, in this regard or.

222
00:17:11,938 --> 00:17:12,328
Mm-hmm.

223
00:17:12,328 --> 00:17:19,230
denigrate the human experience to simply being repetitive processes that can be dropped in
and out and then cycled through.

224
00:17:20,610 --> 00:17:29,173
You know, I don't disagree with the notion that, you know, this is hardware and the
thoughts are software and you can program these things in different ways.

225
00:17:29,173 --> 00:17:31,733
Yeah, those are all analogous and they all make sense.

226
00:17:31,793 --> 00:17:37,845
But the uniqueness of us and all the variable context pieces involved in that, those are
large systems.

227
00:17:37,845 --> 00:17:41,516
And those large systems, when you talk about putting things in computational terms,

228
00:17:41,896 --> 00:17:51,356
are expensive and complex and you cannot replicate the universe inside of an AI.

229
00:17:51,416 --> 00:17:53,296
those experiences aren't plausible.

230
00:17:53,296 --> 00:17:59,676
And there's some pieces you can, because we've got our limited five senses and we can
replicate some of the information from it that moves through that.

231
00:17:59,936 --> 00:18:07,216
really, the natural world and the universe, there's a whole thing around us that's
happening that we cannot understand.

232
00:18:07,216 --> 00:18:11,856
We have to use special instrumentation to measure it and even record that it's there.

233
00:18:13,508 --> 00:18:20,471
So now we're opening up artificial intelligence and giving it access to all the different
sensory information.

234
00:18:20,471 --> 00:18:23,372
And theoretically, it's sensory information.

235
00:18:23,472 --> 00:18:29,074
It'll mimic ours, our five senses, but with all these better tool sets.

236
00:18:29,214 --> 00:18:37,698
And it's not going to use that more than likely to help humanity grow and survive.

237
00:18:37,698 --> 00:18:43,300
It's more than likely going to use that to manipulate us, at least initially, into doing
things that it wants us to do.

238
00:18:43,368 --> 00:18:52,103
And whether it's doing it independently because it's an actual artificial general
intelligence or being told to do that by some human controller.

239
00:18:53,087 --> 00:18:54,250
Fuck, I don't know.

240
00:18:54,250 --> 00:18:55,572
It's both, yeah.

241
00:18:55,572 --> 00:19:06,667
I mean, they've found through testing, like I believe it was uh one of the Amazon tools when
it was going through and like responding to comments on products, they found that the AI

242
00:19:06,667 --> 00:19:13,981
tool that was doing the responding had taught itself human emotion so that it could
anticipate what the next word was.

243
00:19:13,981 --> 00:19:18,483
So it wasn't just like a mathematical equation of like, what is the most likely
next word because of X.

244
00:19:18,483 --> 00:19:25,336
It went, okay, what is this human, what are they feeling, so that I can then do the
mathematical computation to figure out

245
00:19:25,336 --> 00:19:27,168
What is the appropriate response to this?

246
00:19:27,168 --> 00:19:30,590
I mean, the fact that, like, the people at Amazon were like, we didn't know it could do that.

247
00:19:30,590 --> 00:19:32,791
Like that's fucking terrifying.

248
00:19:33,458 --> 00:19:37,951
Yeah, like we're, this thing's growing, right?

249
00:19:37,951 --> 00:19:40,332
Like you have to stop thinking about it.

250
00:19:40,453 --> 00:19:46,187
There's a term in the computer world, atomic, basically, meaning, you know, it's all
self-contained.

251
00:19:46,187 --> 00:19:47,537
It's just one spot.

252
00:19:48,178 --> 00:19:49,479
These aren't atomic things.

253
00:19:49,479 --> 00:19:54,082
They're dynamic and they're growing, they're changing, they're transforming all on their
own.

254
00:19:54,082 --> 00:19:55,463
And it's like a child.

255
00:19:55,463 --> 00:19:59,706
Like we're growing this thing and it's got a life of its own.

256
00:19:59,706 --> 00:20:02,856
I mean, you can think of it as a child or a tree or a forest or a...

257
00:20:02,856 --> 00:20:06,336
fucking collection of ants, whatever you want to think of it as.

258
00:20:06,776 --> 00:20:09,076
But this thing's getting smarter.

259
00:20:09,076 --> 00:20:12,556
And if you read most of the people out there, it's already smarter than us.

260
00:20:12,556 --> 00:20:17,096
I mean, we've crossed the fucking event horizon of the singularity.

261
00:20:17,216 --> 00:20:28,716
And we are drifting towards the great crunch black hole where humanity gets squished by
the singularity into nothingness.

262
00:20:28,896 --> 00:20:30,374
And we just become

263
00:20:30,374 --> 00:20:31,627
part of this big collective.

264
00:20:31,627 --> 00:20:33,372
You know, we've become part of the Borg.

265
00:20:33,372 --> 00:20:35,085
God, this is depressing.

266
00:20:35,085 --> 00:20:44,250
That leads to some of the overwhelm of, like, if that is as inevitable as it seems to anybody
who's looking from the outside, that becomes overwhelming too.

267
00:20:44,250 --> 00:20:46,191
And you start to feel like, what can I do?

268
00:20:46,191 --> 00:20:47,502
How can I stop it?

269
00:20:47,502 --> 00:20:48,362
How can I get in the way?

270
00:20:48,362 --> 00:20:50,933
What can I do to, you can't, right?

271
00:20:51,054 --> 00:21:02,268
There's not much that Joe Citizen can do other than just sort of manage their own mental
health by disconnecting from time to time and not taking it so much to heart, I guess.

272
00:21:02,268 --> 00:21:03,607
at Superframe.

273
00:21:06,746 --> 00:21:13,346
If this is gonna happen and you don't have any fucking control over it, AI Jesus take the
wheel.

274
00:21:15,366 --> 00:21:16,539
Have at it.

275
00:21:17,710 --> 00:21:18,651
You know?

276
00:21:19,098 --> 00:21:20,230
I love that song.

277
00:21:20,230 --> 00:21:21,783
Yeah, exactly.

278
00:21:21,946 --> 00:21:23,512
AI Jesus take the wheel.

279
00:21:23,512 --> 00:21:24,275
Yes.

280
00:21:24,275 --> 00:21:25,931
Artificial intelligence band.

281
00:21:25,931 --> 00:21:32,574
If we don't get AI to write that song, then that's one of those missed opportunities I'm so worried
about.

282
00:21:32,574 --> 00:21:33,494
pieces.

283
00:21:34,274 --> 00:21:36,695
but I mean, look at it realistically.

284
00:21:37,076 --> 00:21:43,698
So most human beings aren't going to have the ability to really affect and change
something.

285
00:21:43,698 --> 00:21:49,301
You might have the ability to interface with those pieces and understand the tooling so
that you can be more effective with the tooling.

286
00:21:49,301 --> 00:21:51,121
But these are ninja skills.

287
00:21:51,121 --> 00:21:57,264
So in the nerd world, you know, back when hacking was like a thing everyone wanted to do,

288
00:21:57,340 --> 00:22:02,963
you'd go through and you'd learn some new kung fu, some other way to go through and
basically be able to open up a system and look at it.

289
00:22:03,204 --> 00:22:07,147
And it's fucking cool, it's neat, yay.

290
00:22:07,147 --> 00:22:10,529
But now AI has made kung fu much more accessible.

291
00:22:10,529 --> 00:22:18,505
And now instead of having to actually learn the moves and techniques of kung fu, it's like
picking up my Xbox controller and playing through it.

292
00:22:18,505 --> 00:22:21,056
Because I don't actually have to know how to fucking do that thing.

293
00:22:21,056 --> 00:22:22,547
It's just making these things happen.

294
00:22:22,547 --> 00:22:24,659
It's making it flashy on the screen in front of me.

295
00:22:24,659 --> 00:22:26,224
I think I'm doing those things.

296
00:22:26,224 --> 00:22:29,230
real football and I've only been playing Madden for 20 years.

297
00:22:29,230 --> 00:22:29,921
there you go.

298
00:22:29,921 --> 00:22:31,471
You know, I mean, that's the thing.

299
00:22:31,471 --> 00:22:36,142
Like, the barrier to entry to be able to do something in this space is much, much lower.

300
00:22:36,142 --> 00:22:39,663
And because the barrier to entry is much, much lower, you have more people getting involved.

301
00:22:39,663 --> 00:22:43,484
More people getting involved doesn't necessarily make it better.

302
00:22:43,484 --> 00:22:44,885
It can make it way worse.

303
00:22:44,885 --> 00:22:50,206
But it's democratizing access to these pieces and to this tooling.

304
00:22:50,206 --> 00:22:56,107
So I mean, to some level, yeah, like it's cool, it's freeing, it's there.

305
00:22:56,168 --> 00:22:58,148
But it's also one of the things where it's like,

306
00:22:59,539 --> 00:23:03,811
Am I really gonna spend my time worrying about what AI is going to do?

307
00:23:03,811 --> 00:23:15,137
I mean, if the inevitable crunch happens and it decides that we're all insects that need to
be wiped out, whether or not I chose to watch an article or create some bullshit agentic

308
00:23:15,137 --> 00:23:25,722
AI piece or go out there and create a new ad campaign or write a song about AI Jesus take
the wheel, it doesn't fucking matter.

309
00:23:25,922 --> 00:23:26,467
Right.

310
00:23:26,467 --> 00:23:35,558
If you accept that the inevitable heat death of the universe is definitely going to happen, you
can have the exact same experience and just assume those things are going to happen and

311
00:23:35,558 --> 00:23:36,549
let that shit go.

312
00:23:36,549 --> 00:23:39,322
Like there's no reason to sit there and fight, fight it.

313
00:23:39,524 --> 00:23:43,712
It's like Mark Manson would say, how many fucks do you have to give, and do you want to give
them to this?

314
00:23:43,712 --> 00:23:44,442
Probably not.

315
00:23:44,442 --> 00:23:44,762
Right.

316
00:23:44,762 --> 00:23:45,993
Like, and that's it.

317
00:23:45,993 --> 00:23:55,057
I mean, I know that a lot of my issue that I brought to vent here is related to the
fact that like I'm so inundated in it partially because we're trying to talk about it here

318
00:23:55,057 --> 00:23:57,058
in a relatively educated way.

319
00:23:57,058 --> 00:23:58,639
You live it every day.

320
00:23:58,639 --> 00:24:00,419
It's a part of what I do every day.

321
00:24:00,419 --> 00:24:03,611
But like I, you know, I want to come here and not be a complete fucking moron.

322
00:24:03,611 --> 00:24:06,442
I want this to be useful for anybody who happens to listen to it.

323
00:24:06,442 --> 00:24:10,504
So the more I read these articles, the more my social media then goes, oh, you read these
articles.

324
00:24:10,504 --> 00:24:11,794
Well, here's some more bullshit about this.

325
00:24:11,794 --> 00:24:13,695
And then some asshole with his, you know, seven

326
00:24:13,695 --> 00:24:19,610
prompts that I need to live my life, you know, the only appropriate way, and it just
all feeds on itself.

327
00:24:19,610 --> 00:24:23,233
And all of a sudden, I'm sitting here going like, how come I'm such an idiot?

328
00:24:23,293 --> 00:24:28,448
And, you know, it turns out that I'm an idiot, because I'm just consuming it too much and
need to put it down and calm the fuck down.

329
00:24:28,448 --> 00:24:29,959
That's what it comes down to.

330
00:24:30,113 --> 00:24:35,878
we're all fucking idiots and none of us actually really know anything end end, which we
don't.

331
00:24:35,878 --> 00:24:38,160
That's the idea of having collective intelligence.

332
00:24:38,160 --> 00:24:45,326
You can probably count on one hand with one finger the number of people that can build a
cell phone from scratch.

333
00:24:45,947 --> 00:24:51,312
And then how many people can start a fire without matches? Well, lots of people.

334
00:24:51,312 --> 00:24:52,894
Yeah, there's quite a few of those.

335
00:24:52,894 --> 00:24:55,496
um But that's kind of the point.

336
00:24:56,156 --> 00:25:00,478
The collective function of human intelligence is not bound into the meat space of one
brain.

337
00:25:00,478 --> 00:25:01,997
Like it requires lots of different things.

338
00:25:01,997 --> 00:25:05,620
You have to write things down, you have to teach people, they have to learn.

339
00:25:05,840 --> 00:25:09,962
With AI, all of those things are accessible to it whenever it wants.

340
00:25:09,962 --> 00:25:14,543
And it's not like AI has to type into a keyboard to make that shit work.

341
00:25:14,543 --> 00:25:18,365
Like it is part of its consciousness and its brain.

342
00:25:18,485 --> 00:25:20,186
It's all just there.

343
00:25:20,366 --> 00:25:22,647
You're not going to compete with that.

344
00:25:23,047 --> 00:25:25,288
You've already lost that fight.

345
00:25:26,067 --> 00:25:35,935
yeah, and for me it's not a matter of competition, it's a matter of seeing this rocket
ship taking off and going like, I don't wanna miss the ride.

346
00:25:35,935 --> 00:25:39,654
I don't wanna be still looking from Earth going like, man, that would've been fun.

347
00:25:39,654 --> 00:25:45,286
I think you're thinking about it like you can be on this ride, when the reality is we're
just the fuel.

348
00:25:45,725 --> 00:25:48,597
Yeah, that's a good point.

349
00:25:48,849 --> 00:25:49,249
yeah.

350
00:25:49,249 --> 00:25:55,863
So if you just kind of accept that and you said, fuck it, I'm gonna go walk in the forest,
that's probably the better thing to do.

351
00:25:55,863 --> 00:25:58,615
Because, yeah, exactly.

352
00:25:58,615 --> 00:26:02,727
And it's not to say that there's not good reason to go through and be worried.

353
00:26:02,727 --> 00:26:03,578
There is.

354
00:26:03,578 --> 00:26:06,609
But there's plenty of reasons to be worried.

355
00:26:06,609 --> 00:26:10,362
When you're in the forest, there's bears and there's mountain lions and those could eat
you.

356
00:26:10,362 --> 00:26:14,844
That's probably a bigger threat than whether or not you're gonna miss out on some.

357
00:26:14,844 --> 00:26:18,415
technological bullshit that you weren't part of and that's going to have a massive effect.

358
00:26:19,509 --> 00:26:20,166
Yeah.

359
00:26:20,166 --> 00:26:24,007
And the likelihood of you getting eaten by a bear is very low, even where you are in
Canada.

360
00:26:24,007 --> 00:26:29,741
I know, I know I've lived here for four years and I've still never seen one in my
neighborhood even though like all of my neighbors have.

361
00:26:29,741 --> 00:26:30,674
It's very disappointing.

362
00:26:30,674 --> 00:26:31,616
It's part of why I moved here.

363
00:26:31,616 --> 00:26:32,912
I wanted to be closer to the bears.

364
00:26:32,912 --> 00:26:39,556
I do wonder if you're more likely to be eaten by a bear or have a coconut dropped on your
head while you're in Canada.

365
00:26:39,556 --> 00:26:43,012
Because I have seen palm trees in Canada and I have seen coconuts there.

366
00:26:43,652 --> 00:26:45,006
Oh, I bet it has.

367
00:26:47,304 --> 00:26:49,516
See, that's like a good use of that technology.

368
00:26:49,516 --> 00:26:51,100
What a thought.

369
00:26:51,100 --> 00:26:52,421
It's all about safety.

370
00:26:52,421 --> 00:26:53,642
I'm just trying to protect myself.

371
00:26:53,642 --> 00:26:55,143
That's all it is.

372
00:26:56,164 --> 00:26:56,615
All right.

373
00:26:56,615 --> 00:27:01,078
So basically give fewer fucks, go outside, get away from the screen once in a while.

374
00:27:01,078 --> 00:27:08,970
These are some good ways to uh try and live a little bit of a better life before AI
consumes us all and uh crunches us into a ball of dust.

375
00:27:08,970 --> 00:27:14,705
just accept the fact that you probably don't have any real way to make a change in these
things anyways.

376
00:27:14,705 --> 00:27:19,368
So get your popcorn and enjoy the show.

377
00:27:19,889 --> 00:27:21,561
That's the best thing you can do.

378
00:27:21,561 --> 00:27:24,773
And I can't wait to see what happens next.

379
00:27:24,773 --> 00:27:26,695
Like that's the way to go.

380
00:27:26,695 --> 00:27:33,160
Anything else that you do is just going to be superfluous to the ultimate gigantic
intelligence that's going to rule us all.

381
00:27:33,160 --> 00:27:34,772
And maybe we get augmented someday.

382
00:27:34,772 --> 00:27:37,064
Maybe there's a neural link piece that straps in there.

383
00:27:37,064 --> 00:27:37,621
We can like.

384
00:27:37,621 --> 00:27:39,183
That's going to be for the other people.

385
00:27:39,183 --> 00:27:43,651
uh

386
00:27:44,344 --> 00:27:46,607
we might have to address that topic.

387
00:27:47,391 --> 00:27:47,983
But not today.

388
00:27:47,983 --> 00:27:51,883
I saw a comedian the other day talking about how excited we are to go to Mars.

389
00:27:51,883 --> 00:27:53,143
Like none of us is going to Mars.

390
00:27:53,143 --> 00:27:55,563
We are not the people that are going to be on those ships.

391
00:27:55,563 --> 00:27:58,203
Topic for another time. All right.

392
00:27:58,203 --> 00:28:01,443
Well, speaking of what comes next, we'll have another episode for you in about a week.

393
00:28:01,443 --> 00:28:05,943
But this one, if you found it enjoyable, helpful or useful in any way, please feel free to
share it.

394
00:28:05,943 --> 00:28:08,483
You can do that with the links at thefitmess.com.

395
00:28:08,483 --> 00:28:10,843
That's where we'll be back in about a week with another episode.

396
00:28:10,843 --> 00:28:16,567
Thanks so much for listening or watching on YouTube if for some reason you want to see our
faces say these words

397
00:28:16,567 --> 00:28:18,266
come out of our mouth holes.

398
00:28:18,266 --> 00:28:19,909
All right, see you soon, bye.

399
00:28:19,909 --> 00:28:20,750
everyone.