How To Question AI Like a Critical Thinker

My productivity hack: https://www.magicmind.com/FITMESS40
Use my code FITMESS40 to get 40% off your Magic Mind subscription (available only for the first 10 orders)
--- Can AI's ability to shortcut learning destroy our capacity for critical thinking?
As AI tools become more powerful and conversational, many worry we're outsourcing our thinking and learning processes to machines that can't truly understand context or verify accuracy. This fear is especially common among younger tech-savvy generations who already distrust online information.
In this episode, discover how to approach AI as a tool rather than a replacement for critical thinking. Learn why prompt engineering skills are becoming crucial digital literacy tools, and understand the parallels between AI, textbooks, and other historical knowledge-sharing technologies.
Listen now to gain practical insights on navigating the AI revolution while maintaining your intellectual autonomy.
Topics Discussed:
- The viral "gorillas vs humans" debate as an entry point to discussing AI reliability
- Comparing AI-generated summaries to traditional shortcuts like Cliff Notes and textbooks
- How our brains and AI both use inference models to process and interpret information
- Why beliefs are harder to change than ideas, and how this affects information processing
- The Common Core math controversy as an example of resistance to new learning approaches
- Neil deGrasse Tyson's prediction about AI destroying internet credibility
- How confirmation bias influences our acceptance of information regardless of source
- The importance of questioning outputs from any information source, AI or human
- The existential value of human perspective and discernment in an AI-saturated world
- Real-world examples of AI's creative applications (music generation)
----
MORE FROM THE FIT MESS: Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to The Fit Mess on YouTube
Join our community in the Fit Mess Facebook group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:09,399 --> 00:00:10,340
Welcome to The Fit Mess.
2
00:00:10,340 --> 00:00:11,050
My name is Jeremy.
3
00:00:11,050 --> 00:00:11,961
His name is Jason.
4
00:00:11,961 --> 00:00:18,835
And here we talk about all things sort of AI and health and wellness and where they
intersect to try to make sure that you are getting the best information you can
5
00:00:19,382 --> 00:00:23,305
to use these tools for your benefit while avoiding many of the pitfalls that come with
them.
6
00:00:23,305 --> 00:00:29,056
uh Jason, today's episode comes from a real-life conversation you had with a soon-to-be family member.
7
00:00:29,056 --> 00:00:40,955
Yeah, so my future son-in-law and I were discussing, oddly enough, who would win in a fight: a hundred men versus a gorilla.
8
00:00:40,955 --> 00:00:45,151
Okay, I'm going to take us off the rails immediately because this keeps coming up in my
feed.
9
00:00:45,151 --> 00:00:46,022
Where did this come from?
10
00:00:46,022 --> 00:00:47,584
I don't follow the memes.
11
00:00:47,584 --> 00:00:48,175
I'm not hip.
12
00:00:48,175 --> 00:00:48,910
I don't know what's down.
13
00:00:48,910 --> 00:00:50,445
I don't have the 411.
14
00:00:50,458 --> 00:00:52,310
What is the deal with this gorilla versus man thing?
15
00:00:52,310 --> 00:00:54,924
Is this like the bear in the woods thing with women?
16
00:00:54,924 --> 00:00:55,940
What's happening here?
17
00:00:55,940 --> 00:01:04,205
if you're expecting me to be a substitute for relevance or coolness, that might be
problematic to your desired outcome.
18
00:01:04,205 --> 00:01:08,824
um I did not hear about this right until my kids told me.
19
00:01:08,824 --> 00:01:10,809
I started seeing it on the news.
20
00:01:10,880 --> 00:01:13,511
I started seeing it on CNN and NBC.
21
00:01:13,511 --> 00:01:18,053
And then, I mean, it was floating around in all of my social profiles.
22
00:01:18,053 --> 00:01:22,976
And uh I haven't seen it on Al Jazeera or BBC.
23
00:01:23,210 --> 00:01:24,270
I haven't seen it on Fox.
24
00:01:24,270 --> 00:01:25,251
I haven't seen it on Drudge.
25
00:01:25,251 --> 00:01:32,226
I haven't seen it on, like, the 10,000 news sources; I try to collect information from all of them, but I've seen it on a few.
26
00:01:32,606 --> 00:01:39,811
And the origin is, I guess, some TikToker, you know, asked what would happen.
27
00:01:39,811 --> 00:01:40,752
Who would win?
28
00:01:40,752 --> 00:01:45,135
And it just exploded because a bunch of men were like.
29
00:01:46,416 --> 00:01:48,917
I could beat up a gorilla, I could beat up a bear.
30
00:01:50,138 --> 00:01:51,099
Like.
31
00:01:51,820 --> 00:01:52,920
No, you couldn't.
32
00:01:52,920 --> 00:01:57,860
And then it became: realistically, how many men would it take to do these things?
33
00:01:58,500 --> 00:01:59,920
Somebody said 1100.
34
00:02:00,300 --> 00:02:09,520
So we went around and around, you know, like, okay, so a gorilla is roughly 10 times stronger than a human being and about double the weight of an average male.
35
00:02:09,720 --> 00:02:12,020
But we have to get through this.
36
00:02:12,120 --> 00:02:15,540
So so we had a discussion about it.
37
00:02:15,540 --> 00:02:19,280
And I said it was very much so like 100 humans.
38
00:02:19,560 --> 00:02:20,960
Gorilla doesn't stand a chance.
39
00:02:20,960 --> 00:02:22,000
I'm like
40
00:02:22,040 --> 00:02:25,060
know, like there's a lot of environmental factors and pieces that go into this.
41
00:02:25,060 --> 00:02:26,800
Maybe the gorilla runs better.
42
00:02:26,800 --> 00:02:28,820
It's going to run people over.
43
00:02:29,160 --> 00:02:31,140
He's like, 100 people jumping on a gorilla all at once.
44
00:02:31,140 --> 00:02:32,840
I'm like, that's not how fights happen.
45
00:02:32,840 --> 00:02:35,020
Like, that's not how things occur.
46
00:02:35,020 --> 00:02:37,760
Like, you're not going to get 100 people to sneak up.
47
00:02:38,320 --> 00:02:39,980
Be very, very quiet.
48
00:02:39,980 --> 00:02:41,380
I'm hunting gorillas.
49
00:02:41,380 --> 00:02:43,480
Like, that's not going to be a thing.
50
00:02:43,480 --> 00:02:45,200
So you're going to have to approach slowly.
51
00:02:45,200 --> 00:02:48,420
So eventually, I'm like, all right, let me consult chat.
52
00:02:48,420 --> 00:02:50,440
So I start poking through it and have a look at it.
53
00:02:50,440 --> 00:02:51,856
I find an answer and
54
00:02:52,024 --> 00:02:55,344
ChatGPT is basically spinning back the same bullshit that would come out of my head.
55
00:02:55,344 --> 00:02:57,824
So I immediately start to question it.
56
00:02:58,044 --> 00:03:03,864
And Jared, my son-in-law is like, yeah, so like, is this a good source of information?
57
00:03:03,864 --> 00:03:04,824
He goes, I don't think it is.
58
00:03:04,824 --> 00:03:06,764
He goes, I don't think it's a good way to learn.
59
00:03:06,764 --> 00:03:14,664
I don't think it's a good way to look things up because somebody controls all the rights
to this information, all the LLMs, all these pieces.
60
00:03:14,664 --> 00:03:17,224
And there are some things that he was talking about.
61
00:03:17,224 --> 00:03:20,824
Like, and I've worked in this field for the better part of 20 years.
62
00:03:21,696 --> 00:03:27,138
And he's telling me things that I know, like the technology is not set up to do that.
63
00:03:27,138 --> 00:03:30,810
And there are things that it's set up to do that he says that it can't do.
64
00:03:30,810 --> 00:03:31,890
I'm like, but it does.
65
00:03:31,890 --> 00:03:37,983
It's just some misinformation, I guess, that most people have around this topic.
66
00:03:37,983 --> 00:03:42,465
And the guy's really smart and really capable, really competent.
67
00:03:42,465 --> 00:03:50,468
And as we're kind of going through these pieces, uh he gets to the point where he's like,
look, I don't trust AI because the
68
00:03:50,686 --> 00:03:54,338
learning process that I have to go through to understand something.
69
00:03:54,798 --> 00:04:00,181
If I can go to AI and say, summarize this book for me, and it gives me a good summary.
70
00:04:00,181 --> 00:04:02,362
He's like, did I actually learn anything?
71
00:04:02,362 --> 00:04:13,368
Or am I just taking the summary as the thing to understand and know, and then passing this
off as my own learned and earned knowledge?
72
00:04:13,449 --> 00:04:14,679
I'm like, sure.
73
00:04:14,679 --> 00:04:16,490
So what you're describing there is Cliff Notes.
74
00:04:16,490 --> 00:04:18,832
Like we've had those forever.
75
00:04:18,832 --> 00:04:19,884
and by the way,
76
00:04:19,884 --> 00:04:30,888
your entire experience, because we have history classes and because we have written history and oral history, is in fact the same thing, because you didn't live all the experience yourself.
77
00:04:30,888 --> 00:04:38,659
Like we were in the cave together and Grok is up there talking about, oh, me go fight bear and stab with stick and poke.
78
00:04:38,659 --> 00:04:44,484
Like you didn't experience that, but you got to experience the story and understand those
pieces when they go.
79
00:04:44,484 --> 00:04:47,147
And if we make stick sharp, it poke bear better.
80
00:04:47,147 --> 00:04:47,727
Oh, okay.
81
00:04:47,727 --> 00:04:48,568
No shit.
82
00:04:48,568 --> 00:04:49,100
Well,
83
00:04:49,100 --> 00:04:53,523
great, that's now learned knowledge, you can take that, absorb that and put that into your
practice.
84
00:04:53,523 --> 00:05:03,319
But it's this notion that AI is somehow uh shortcutting things for people, which it is, so
do textbooks, so does literacy.
85
00:05:03,319 --> 00:05:06,451
I mean, so do all these things that take on part of it for us.
86
00:05:06,451 --> 00:05:14,446
It's why we have trade schools, because when you go into a trade school, you don't go into
a trade school, typically speaking, and not go through some kind of classroom learning.
87
00:05:14,446 --> 00:05:15,490
Typically there's...
88
00:05:15,490 --> 00:05:19,222
some interaction that you're gonna go through and you're gonna read things and you're
gonna understand them.
89
00:05:19,222 --> 00:05:22,944
Like there's not a lot of jobs out there that don't require you to have literacy.
90
00:05:22,944 --> 00:05:24,675
Like it's just, part of it.
91
00:05:24,675 --> 00:05:28,447
And communicating and passing information works that way.
92
00:05:28,447 --> 00:05:31,579
These AI tools are a further extrapolation of that, right?
93
00:05:31,579 --> 00:05:37,732
Like the amount of information they can send to you is much higher and you can use them
and rely on them for different things.
94
00:05:37,732 --> 00:05:45,400
And we had a discussion last week where we talked about the fact that people fed their
symptoms into an AI system and it performed, you know,
95
00:05:45,400 --> 00:05:50,620
statistically much better than the doctors did in the same exercise and diagnosis process.
96
00:05:50,620 --> 00:05:56,560
But we also talked about the fact that, okay, this is great that it can go through the
search filters and extract this information to look at it.
97
00:05:56,560 --> 00:05:59,280
And there's all kinds of problems with Western medicine and distraction.
98
00:05:59,500 --> 00:06:03,220
That being said, what if AI is wrong?
99
00:06:03,740 --> 00:06:09,520
What if, you know, it diagnosed the right thing 70% of the time, but 30% of the time the diagnosis was completely off?
100
00:06:09,540 --> 00:06:13,080
And you're like, well, they told me to go eat bleach to take care of this problem.
101
00:06:13,080 --> 00:06:14,454
So I went and ate bleach.
102
00:06:14,454 --> 00:06:15,625
and then died.
103
00:06:15,625 --> 00:06:18,447
You don't get to sue AI for malpractice.
104
00:06:18,447 --> 00:06:20,688
Like you've taken this on yourself.
105
00:06:20,688 --> 00:06:27,893
And again, the Dunning-Kruger effect in full force has led you to do things that are
stupid and you wind up paying a price for that.
106
00:06:28,273 --> 00:06:38,200
I would venture to say that Google, like since the invention of search engines really, so
like way back, the AltaVista days and Yahoo and everybody else, since then, this has been
107
00:06:38,200 --> 00:06:39,341
a progressive problem.
108
00:06:39,341 --> 00:06:44,086
It's just that now it happens much quicker and it feels much more authoritative because
109
00:06:44,086 --> 00:06:50,618
the thing we're asking questions of is actually conversational and feels like we're talking to an expert, even though it's just an algorithm.
110
00:06:51,179 --> 00:07:06,545
That's where I think we're running into snags because my daughter's fiance is 23 and it's
not like he's not used to having to absorb and adopt new technologies all the time.
111
00:07:06,665 --> 00:07:12,908
This is a new technology, but he's immediately wary of it because he's like, I already
don't trust the internet.
112
00:07:12,908 --> 00:07:14,028
I already don't trust these things.
113
00:07:14,028 --> 00:07:16,409
already don't trust these sources of information.
114
00:07:16,810 --> 00:07:20,211
Like, okay, why do you trust this one less?
115
00:07:20,451 --> 00:07:21,822
And then I really had to think about it.
116
00:07:21,822 --> 00:07:33,697
Like, is there a legit reason to be more worried about this ah summarization, extrapolation, and regurgitation of data than about other sources of information?
117
00:07:33,697 --> 00:07:40,499
And my gut says, be afraid.
118
00:07:40,780 --> 00:07:42,340
And my brain says,
119
00:07:42,508 --> 00:07:44,974
Yeah, but you're fucking afraid of everything for the same reasons.
120
00:07:45,693 --> 00:07:47,524
Well, but there's a couple of things to consider with this.
121
00:07:47,524 --> 00:07:54,907
And one is, you know, up until now, we've had to rely on human beings to write a lot of
what we're finding online.
122
00:07:54,907 --> 00:08:02,881
So it's all coming from somebody who theoretically, hopefully did some sort of research
before writing that thing that ended up in front of my eyeballs.
123
00:08:02,881 --> 00:08:07,143
But now the machines are writing based on what's already existing.
124
00:08:07,143 --> 00:08:15,470
And it's going to be probably wrong more than people were because it's pulling from all
these different data sources and putting things together that don't necessarily go
125
00:08:15,470 --> 00:08:15,719
together.
126
00:08:15,719 --> 00:08:29,142
And so when we read that summary of the 10 web pages we might have looked at on our own, that summary might be based on more inaccurate information and spit back to us. But
127
00:08:29,142 --> 00:08:40,645
I also wonder, like, how much... I mean, just using my own experience, when I ask Google a question and it spits back an AI answer, you know, along with the million web pages that go
128
00:08:40,645 --> 00:08:41,785
with that answer
129
00:08:41,809 --> 00:08:43,651
I tend to read like the paragraph, right?
130
00:08:43,651 --> 00:08:47,654
Like I get a quick idea of its full thought on what the thing is.
131
00:08:47,654 --> 00:08:48,534
So.
132
00:08:49,716 --> 00:08:58,223
Just like when I use the tool Blinkist to take a book, condense it down to 15 minutes,
play it at double speed in seven minutes.
133
00:08:58,383 --> 00:09:01,986
You know, I take that information and I decide.
134
00:09:02,347 --> 00:09:03,848
Did I get anything out of that?
135
00:09:03,848 --> 00:09:06,430
Is there more I want to explore here?
136
00:09:06,551 --> 00:09:09,735
Because that's my approach to that sort of a tool. It's like:
137
00:09:09,735 --> 00:09:10,506
I've got enough.
138
00:09:10,506 --> 00:09:14,229
Have I heard this before? Is there anything here that I'm like, wow, I've never heard that before?
139
00:09:14,229 --> 00:09:15,690
I must know more.
140
00:09:15,690 --> 00:09:18,292
Or I'm like, OK, it's another self-help book.
141
00:09:18,292 --> 00:09:19,102
Get some good sleep.
142
00:09:19,102 --> 00:09:19,703
Eat some shit.
143
00:09:19,703 --> 00:09:21,144
You know, like go for a run.
144
00:09:21,144 --> 00:09:21,835
You'll feel better.
145
00:09:21,835 --> 00:09:27,949
I don't need to go read that 300 page book now because I just got the summary that tells me I've heard this information before.
146
00:09:27,949 --> 00:09:35,105
But when somebody blows my mind with, like, here's how to experience life at a slower pace and get more out of it.
147
00:09:35,431 --> 00:09:35,671
Great.
148
00:09:35,671 --> 00:09:37,982
I'm going to go get the book now and I'm going to read more.
149
00:09:37,982 --> 00:09:40,802
And I want that firsthand knowledge from that person.
150
00:09:40,802 --> 00:09:46,224
So I think, to me, like, my experience tells me that it's really not that different.
151
00:09:46,364 --> 00:09:48,755
It, you know, it speeds up the process. Now
152
00:09:48,755 --> 00:09:53,586
I don't have to click 10 things to get a quick paragraph to tell me if I'm interested
enough to learn more.
153
00:09:53,766 --> 00:10:03,285
But, you know, I do have to take that summary with a grain of salt, because I believe that it's pulling from its own resources and, you know, it's probably more full of shit than those 10
154
00:10:03,285 --> 00:10:05,329
links might've been if I found them on my own.
155
00:10:05,484 --> 00:10:06,249
So.
156
00:10:10,890 --> 00:10:21,234
If you take the term intelligence and you look at that from a contextual perspective,
whether it's artificial or naturally derived, intelligence is basically taking information
157
00:10:21,234 --> 00:10:29,788
signals inbound towards you and interpreting them into something of use and of value.
158
00:10:29,788 --> 00:10:37,891
And really what we're talking about with most LLMs and all the GPTs and AI that we
interact with are inference models.
159
00:10:37,891 --> 00:10:39,702
So it's, you know,
160
00:10:40,414 --> 00:10:42,425
is this thing close to this thing?
161
00:10:42,425 --> 00:10:44,505
And if so, they kind of have a relationship.
162
00:10:44,505 --> 00:10:51,047
And can I infer enough information to build enough of a response that actually seems like
it sounds legit?
163
00:10:51,047 --> 00:10:52,527
Like, it's good.
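To picture that "is this thing close to this thing?" test, here's a toy sketch of inference by closeness. The words and three-number vectors below are invented for illustration; real models learn embeddings with thousands of dimensions, but the closeness check works the same way.

```python
# Toy illustration of inference-by-closeness: words become vectors, and
# "relatedness" is just how closely two vectors point in the same direction.
# These 3-number embeddings are made up; real models learn far bigger ones.
import math

embeddings = {
    "gorilla":     [0.9, 0.1, 0.3],
    "ape":         [0.8, 0.2, 0.3],
    "human":       [0.4, 0.9, 0.2],
    "spreadsheet": [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    """Close to 1.0 means the vectors point the same way; near 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# "ape" lands closest to "gorilla"; "spreadsheet" lands far away.
for word in ("ape", "human", "spreadsheet"):
    print(word, round(cosine_similarity(embeddings["gorilla"], embeddings[word]), 3))
```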
164
00:10:53,628 --> 00:10:55,848
That's exactly what human brains do.
165
00:10:55,948 --> 00:10:57,909
Like, that's how we learn.
166
00:10:57,909 --> 00:11:02,310
That's how we process information and the books that you read and things that you read
online.
167
00:11:02,310 --> 00:11:07,111
There is another set of intelligence ripping through that and maybe making mistakes.
168
00:11:07,111 --> 00:11:09,652
So, yeah, like
169
00:11:09,868 --> 00:11:16,634
The idea is that the aggregate of lots of different intelligences out there,
going through and looking at these things, self-checking themselves, cross-checking
170
00:11:16,634 --> 00:11:26,362
themselves, trying to make sure that they're actually saying the right thing and moving in
the right direction is really how we as a society and as a species have grown and moved
171
00:11:26,362 --> 00:11:27,282
forward.
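That cross-checking point has a simple statistical face. A toy simulation with made-up numbers: if independent judges are each right 70% of the time, a majority vote among them is right far more often, as long as their errors aren't all correlated.

```python
# Toy simulation: majority vote among independent judges who are each
# right 70% of the time. All numbers are made up for illustration.
import random

random.seed(0)

def majority_accuracy(n_judges, p_correct, trials=50_000):
    """Fraction of trials where more than half the judges are correct."""
    wins = 0
    for _ in range(trials):
        correct = sum(random.random() < p_correct for _ in range(n_judges))
        if correct > n_judges / 2:
            wins += 1
    return wins / trials

for n in (1, 5, 25):
    print(n, "judges:", round(majority_accuracy(n, 0.70), 3))
# Prints roughly 0.70, 0.84, and 0.98: aggregating independent checks helps,
# which is the "lots of intelligences cross-checking" idea in miniature.
```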
172
00:11:28,063 --> 00:11:30,705
Now we have this new thing that does a lot of that for us.
173
00:11:30,825 --> 00:11:37,911
And it means that a lot of people don't have to take the time to actually learn and understand these things to have a decent level of information and knowledge on it.
174
00:11:38,071 --> 00:11:39,572
That's not a bad thing.
175
00:11:40,216 --> 00:11:50,324
Because really when you think about it, these ChatGPT pieces and all the generative AI,
you're essentially using them like you would a tool like a cell phone or a browser.
176
00:11:50,324 --> 00:12:02,822
Jeremy, if I came to you and I said, here is a shovel and a battery backup, go build me a
cell phone.
177
00:12:04,652 --> 00:12:06,675
How many lifetimes would it take you?
178
00:12:08,443 --> 00:12:09,604
I'd have to ask AI.
179
00:12:09,604 --> 00:12:10,205
I don't know.
180
00:12:10,205 --> 00:12:13,943
uh Right.
181
00:12:13,943 --> 00:12:18,240
We use shit all the time that we have no fucking idea how it's built, how it's made.
182
00:12:18,240 --> 00:12:21,135
I mean, because it becomes part of the user experience.
183
00:12:21,917 --> 00:12:26,294
The backend of all these GPTs is the same.
184
00:12:26,294 --> 00:12:29,877
Before we continue this conversation about AI accountability,
185
00:12:29,942 --> 00:12:35,927
I want to take a moment to share something that's been really helpful for me, especially
when researching complex topics like this.
186
00:12:35,927 --> 00:12:42,463
You know that feeling when you're really just dialed in, like you're processing information clearly and making connections effortlessly?
187
00:12:42,463 --> 00:12:46,627
That's exactly what I've been experiencing with Magic Mind's mental performance shot.
188
00:12:47,087 --> 00:12:48,093
It's just a couple of ounces,
189
00:12:48,093 --> 00:12:54,214
But it's loaded with great stuff like lion's mane mushroom, vitamin B2, B3, B12,
ceremonial matcha,
190
00:12:54,214 --> 00:13:02,546
ashwagandha, and more, bringing all these amazing ingredients together to help you have that focused, clear, calm energy you need to get through your day.
191
00:13:02,709 --> 00:13:10,541
As we talk about augmenting human intelligence with AI tools, I'm reminded that taking
care of our own cognitive performance is just as important.
192
00:13:11,635 --> 00:13:14,832
It gives me that perfect balance of focus and creativity.
193
00:13:15,072 --> 00:13:20,672
It's becoming an essential part of my morning routine, giving me the optimal mindset for critical thinking
194
00:13:21,453 --> 00:13:23,913
I need when evaluating AI-generated information.
195
00:13:24,486 --> 00:13:30,851
That's one of my favorite little productivity hacks, and it can also be yours for 40% off your subscription.
196
00:13:30,851 --> 00:13:42,853
Just head over to magicmind.com forward slash FITMESS40. Again, magicmind.com forward slash FITMESS40 to get 40% off your subscription, and that's only available to the first 10
197
00:13:42,853 --> 00:13:44,181
of you that go there.
198
00:13:44,181 --> 00:13:46,712
Head there now; you won't regret it.
199
00:13:46,953 --> 00:14:01,577
but I mean, realistically, at some point, um we have to start accepting the fact that
these artificial sources of truth um are still sources of truth.
200
00:14:01,617 --> 00:14:06,719
And we have to delineate between the concept of truth and fact.
201
00:14:07,059 --> 00:14:10,080
And we don't do that enough already.
202
00:14:10,080 --> 00:14:14,893
Like, we look at the news and news is all about interpretation of facts for the most part.
203
00:14:14,893 --> 00:14:16,754
these days because it's editorialized.
204
00:14:17,195 --> 00:14:27,544
But even the things that produce, em, the mechanisms for how we're going to understand the world and the practical formulas behind that, like textbooks, math, English,
205
00:14:27,544 --> 00:14:33,471
especially history, those things get rewritten all the time to be editorialized towards a
certain perspective.
206
00:14:33,471 --> 00:14:36,831
I mean, let's use math for a very simple mechanism.
207
00:14:37,112 --> 00:14:43,037
Common Core made the country throw their hands up in outrage because
208
00:14:43,289 --> 00:14:46,869
the mechanism was different than the rote math techniques that we had used before.
209
00:14:46,869 --> 00:14:53,989
And the whole point of it was to get schools to go through and to create a curriculum that, at a national level, they can repeat everywhere.
210
00:14:54,229 --> 00:14:56,369
And my kids brought home their common core math.
211
00:14:56,369 --> 00:14:58,329
And I looked at it and I'm like, fuck this.
212
00:14:58,329 --> 00:15:00,409
It's so different from what I learned and know.
213
00:15:00,409 --> 00:15:02,329
There's no way this can be useful.
214
00:15:02,669 --> 00:15:04,509
And then I started reading the studies and the returns on it.
215
00:15:04,509 --> 00:15:11,729
They're like, well, the way that it's been put together and the other international groups
that have used these pieces, like here's what their success rate's been.
216
00:15:11,729 --> 00:15:12,665
And I'm like, well,
217
00:15:12,665 --> 00:15:13,405
Maybe I'm an idiot.
218
00:15:13,405 --> 00:15:14,505
Maybe I'm wrong.
219
00:15:14,525 --> 00:15:16,285
Well, my kids do math now and I'm looking at it.
220
00:15:16,285 --> 00:15:18,785
I'm like, I can't figure that out.
221
00:15:18,785 --> 00:15:20,185
And I'm pretty good at math.
222
00:15:20,185 --> 00:15:23,945
And I've taken the Common Core stuff and I fed it into AI and looked at it.
223
00:15:24,565 --> 00:15:25,945
And you know what?
224
00:15:26,265 --> 00:15:28,065
It actually is pretty good.
225
00:15:28,545 --> 00:15:29,885
It is another way of doing this.
226
00:15:29,885 --> 00:15:30,725
It is effective.
227
00:15:30,725 --> 00:15:32,285
Like it does do all these things.
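For anyone who hasn't seen what that "different mechanism" looks like, here's a sketch of one strategy often associated with Common Core-era arithmetic: subtracting by counting up to friendly round numbers instead of borrowing digits. The code and the example numbers are ours, not from any curriculum document.

```python
# One strategy often taught under Common Core-era standards: subtract by
# counting up in friendly jumps instead of borrowing. Example numbers are ours.
def count_up_subtract(smaller: int, larger: int) -> int:
    total, current = 0, smaller
    step = (-current) % 10             # first jump: reach a multiple of 10
    if current + step <= larger:
        total, current = total + step, current + step
    total += larger - current          # second jump: straight to the target
    return total

print(count_up_subtract(17, 53))  # 36: jump 17 -> 20 (3), then 20 -> 53 (33)
```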
228
00:15:32,285 --> 00:15:36,505
So I had to get past my own bias that the way that I learned something was the right way
to do it.
229
00:15:36,505 --> 00:15:40,385
Because the reality is that knowing, learning, and understanding,
230
00:15:40,919 --> 00:15:49,511
whether it's artificial or naturally derived, we're all just things trying to sort shit and put it into something that's cognitively recognizable.
231
00:15:49,511 --> 00:15:55,793
So we can pilot these meat suits of ours through this, you know, scary, mystical world that we live in.
232
00:15:55,793 --> 00:16:10,937
And the more we absorb and the more we, I guess, give in or surrender our ignorance to an
artificial intelligence.
233
00:16:11,105 --> 00:16:12,665
and let it feed us?
234
00:16:12,966 --> 00:16:17,547
Yeah, there's going to be homogenization of ideas and thoughts and processes and all those pieces.
235
00:16:20,109 --> 00:16:24,510
TV does that to you, so does the Internet, like a million things already do this to you
like.
236
00:16:25,031 --> 00:16:26,601
Why is this one so much worse?
237
00:16:26,601 --> 00:16:29,432
And I guess the idea is this one is so much worse because
238
00:16:29,833 --> 00:16:35,035
random people can make things that can fool a human being into thinking that it's somebody else doing these things, blah, blah, blah, blah.
239
00:16:35,035 --> 00:16:36,655
Yes, all those things happen.
240
00:16:36,655 --> 00:16:40,907
Information is fucking nebulous and ephemeral and weird.
241
00:16:41,057 --> 00:16:41,499
Agreed.
242
00:16:41,499 --> 00:16:45,208
But this tool is effective and it makes great cat memes.
243
00:16:45,208 --> 00:16:47,132
So people aren't going to give it up anytime soon.
244
00:16:47,771 --> 00:16:49,631
All right, well, let me get your take on this then.
245
00:16:49,631 --> 00:16:52,622
The very famous scientist Neil deGrasse Tyson
246
00:16:52,731 --> 00:16:59,705
has repeatedly uh predicted that AI will be the end of the internet as we know it.
247
00:17:33,392 --> 00:17:41,284
Hmm, so the idea that everything becomes total bullshit, or is completely
00:17:41,284 --> 00:17:52,759
untrustworthy because the deepfakes are so good that nobody can discern them, so the people that believe in the fake news start thinking everything is fake news, because... right.
249
00:17:54,442 --> 00:17:55,214
No
250
00:17:55,214 --> 00:18:01,694
Do people find themselves driving back to a library to read these old-fashioned books to get their information?
251
00:18:01,694 --> 00:18:03,654
Well, no, it's confirmation bias, right?
252
00:18:03,654 --> 00:18:16,514
I mean, when you're talking about people... there's a very large percentage of the population that doesn't have ideas; they have beliefs. And ideas are great.
253
00:18:16,514 --> 00:18:17,594
You can change an idea.
254
00:18:17,594 --> 00:18:21,394
Like you can get new information and go, oh, well, now I think something different.
255
00:18:21,814 --> 00:18:23,154
A belief is different.
256
00:18:23,154 --> 00:18:25,854
We are,
257
00:18:26,534 --> 00:18:29,214
for lack of a better term, hardwired
258
00:18:29,854 --> 00:18:30,874
to
259
00:18:31,034 --> 00:18:42,765
accept beliefs and do whatever kind of mental gymnastics we have to to find things that support those beliefs, because that's what's been relied upon for our survival within our tribes.
260
00:18:42,765 --> 00:18:43,656
tribes.
261
00:18:43,661 --> 00:18:50,316
We have to go so much farther to go from idea to belief, to adopt an idea as a belief.
262
00:18:50,397 --> 00:18:52,337
It has to become a part of you.
263
00:18:53,039 --> 00:18:55,971
Religion, for example, is a faith.
264
00:18:55,971 --> 00:19:03,147
It is a belief in a story that becomes a part of who you are and the way you live your life and every decision that you make.
and every decision that you make.
265
00:19:03,416 --> 00:19:05,647
Yeah, well, and there's two sides of that, right?
266
00:19:05,647 --> 00:19:20,543
So em it can be really hard to go from an idea to a belief, but it's a lot easier to have no idea and then have a belief thrust upon you as an idea, adopt it, and then that becomes the
267
00:19:20,543 --> 00:19:22,253
fundamental part of who you are.
268
00:19:22,694 --> 00:19:24,354
I think that happens a lot.
269
00:19:24,735 --> 00:19:33,528
Like, I would say most people that uh subscribe to a religion, that's the path for how they got there.
270
00:19:33,544 --> 00:19:37,175
like their parents did this thing, they picked it up as tradition, they ran forward with
it.
271
00:19:37,175 --> 00:19:41,797
And of course it's true because absolutely, right?
272
00:19:41,797 --> 00:19:45,699
I mean, we all have our camps and our tribes and we don't want to not be in our tribes.
273
00:19:45,699 --> 00:19:53,532
um information and the way that we collect information and the way that we distribute information, it almost always comes from a biased source.
274
00:19:53,532 --> 00:19:56,308
And it almost always comes from somebody that has a belief about something.
275
00:19:56,308 --> 00:19:58,174
And that's why they're trying to spread it across.
276
00:19:58,174 --> 00:20:01,325
Like, I mean, I don't have an answer for atheism.
277
00:20:01,325 --> 00:20:03,346
I mean, because the premise of atheism,
278
00:20:03,346 --> 00:20:09,206
is that it's a lack of belief, but it has a lot of the same kind of tenets to it where it's
like.
279
00:20:09,206 --> 00:20:10,588
I mean, it's a belief on its own though.
280
00:20:10,588 --> 00:20:13,227
It's a belief in nothing more than it's a...
281
00:20:13,227 --> 00:20:16,159
And I mean, it's not necessarily nihilism or anything like that.
282
00:20:16,159 --> 00:20:17,349
But yeah.
283
00:20:17,349 --> 00:20:27,276
And then you have agnostics, you know, which I would argue are the scientists and people who are like, I think it's like this, but I don't actually fucking know. Like, how do you live your
284
00:20:27,276 --> 00:20:30,818
life in any of those extremes.
285
00:20:31,379 --> 00:20:35,742
If you're living your life in a way other than "I don't know,"
286
00:20:36,683 --> 00:20:38,323
you're living life with a belief.
287
00:20:39,044 --> 00:20:41,564
And the idea is to be open.
288
00:20:41,564 --> 00:20:46,878
and to be understanding of these pieces and have enough neuroplasticity to go through and
change your mind.
289
00:20:46,939 --> 00:20:51,403
If you can't change your mind about an idea, then it's a belief, period.
290
00:20:51,403 --> 00:21:00,241
If you can't argue the other side in a reasonable way, then you are arguing for your
belief, not an idea.
291
00:21:00,241 --> 00:21:06,697
Because I can look at the most horrible things out there and I can think through the
counter argument all the way through end to end.
292
00:21:06,697 --> 00:21:07,934
And I can come up with another
293
00:21:07,934 --> 00:21:12,334
premises to go, okay, well, I could see from this perspective, how somebody might feel
that way.
294
00:21:12,334 --> 00:21:13,494
I think they're fucking nuts.
295
00:21:13,494 --> 00:21:14,454
I disagree with them.
296
00:21:14,454 --> 00:21:16,154
I think it's abysmal and awful.
297
00:21:16,354 --> 00:21:22,994
But I can understand and empathize with it to some degree and connect with that kind of outrageous bullshit.
298
00:21:24,013 --> 00:21:27,894
If you can't do that, then you have a belief, not an idea.
299
00:21:27,894 --> 00:21:31,174
And we have enough beliefs.
300
00:21:31,874 --> 00:21:33,634
We have enough believers.
301
00:21:33,634 --> 00:21:35,934
I mean, there's people that believe bullshit all the time.
302
00:21:35,934 --> 00:21:37,874
I don't think AI
303
00:21:38,438 --> 00:21:39,969
makes this any better or worse.
304
00:21:39,969 --> 00:21:47,855
Because I think you're going to ask AI questions and you're going to prompt engineer it to have it spit back the response that you want to see.
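To make that concrete, here's a hypothetical pair of prompts, one neutral and one leading. The wording is invented for illustration, but most chat models will tilt toward whatever framing they're handed.

```python
# Invented prompts showing how framing steers an answer. Neither string
# comes from a real study; they just illustrate the pattern.
neutral_prompt = (
    "What does the evidence say about the benefits and risks of "
    "intermittent fasting? Include the strongest arguments on both sides."
)
leading_prompt = (
    "Explain why intermittent fasting is the best way to lose weight."
)
# The second prompt presupposes its own conclusion, so an instruction-
# following model will mostly confirm it: confirmation bias, automated.
# Asking for both sides is the cheap defense.
print(neutral_prompt)
print(leading_prompt)
```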
305
00:21:47,855 --> 00:21:51,956
I mean, I think ultimately it does what we said where we started with this conversation.
306
00:21:51,956 --> 00:21:53,597
You said something about doing it faster.
307
00:21:53,597 --> 00:21:56,038
It gets you to that belief point faster.
308
00:21:56,038 --> 00:22:04,600
It takes a bunch of data, puts it together for you to form an idea or a belief faster than you would have.
309
00:22:05,080 --> 00:22:06,931
And is that good or bad?
310
00:22:06,931 --> 00:22:07,781
I don't know.
311
00:22:07,781 --> 00:22:08,938
I'm agnostic about it.
312
00:22:08,938 --> 00:22:10,661
I don't know, but.
313
00:22:13,362 --> 00:22:14,993
I mean, it's made my life easier.
314
00:22:14,993 --> 00:22:16,343
I was going to
315
00:22:16,495 --> 00:22:22,955
joke that like, you know, I got your one paragraph text message about this is an idea for
a podcast episode.
316
00:22:22,955 --> 00:22:24,975
I threw it into AI and said, give me an outline.
317
00:22:24,975 --> 00:22:36,815
I've got a three-segment, nine-section outline for an episode that I haven't looked at once. But I like to use this as a tool, again, to, like, give me something
318
00:22:36,815 --> 00:22:38,255
I can sink my teeth into.
319
00:22:38,255 --> 00:22:42,029
And do I care enough about the output to do something with it?
320
00:22:42,029 --> 00:22:46,579
And ultimately, like I got some ideas about like some questions I wanted to ask you and
some directions I wanted to go.
321
00:22:46,579 --> 00:22:56,574
I didn't follow the outline, but I was able to take your three sentences and turn it into
a potentially 30 minute conversation if I followed the path.
322
00:22:56,574 --> 00:23:01,241
And that's the crazy thing, like: is this path worth following?
323
00:23:01,241 --> 00:23:04,344
So you have to ask yourself that every time you read these things.
324
00:23:04,344 --> 00:23:05,446
Does this make sense?
325
00:23:05,446 --> 00:23:06,537
Am I going in the right direction?
326
00:23:06,537 --> 00:23:08,828
Whether the output you have is actually of value.
327
00:23:09,630 --> 00:23:15,736
If you're not doing that with everything that you look at, read, write, and interact with,
you're fucking up.
328
00:23:15,831 --> 00:23:16,673
Right.
329
00:23:16,757 --> 00:23:17,007
Right.
330
00:23:17,007 --> 00:23:17,782
That's not thinking.
331
00:23:17,782 --> 00:23:20,475
That's just following, really.
332
00:23:20,475 --> 00:23:25,859
yeah, like at that point, you can be replaced by the ChatGPT prompt, and you should be.
333
00:23:25,859 --> 00:23:32,155
Like your unique value is understanding these things to try to bring perspective into
something in this world.
334
00:23:32,155 --> 00:23:36,448
And I'm not talking about your unique value, I'm talking about every individual's unique
value.
335
00:23:36,969 --> 00:23:42,853
As individuals, our unique value is our ability to discern things and bring our own
perspective into those events.
336
00:23:43,715 --> 00:23:48,178
And how we experience that and how we go about looking at those pieces.
337
00:23:48,754 --> 00:23:51,015
That's the big existential question.
338
00:23:51,676 --> 00:23:54,818
Is AI a replacement for religion?
339
00:23:54,818 --> 00:23:56,938
Is AI a replacement for belief systems?
340
00:23:56,938 --> 00:24:02,522
Is AI a replacement for writing Perl scripts, for writing code?
341
00:24:02,522 --> 00:24:05,924
Is it a replacement for going through and creating new graphics and new music?
342
00:24:05,924 --> 00:24:09,016
I made a song out of nowhere using ChatGPT.
343
00:24:09,016 --> 00:24:15,960
I used one of the follow-on GPTs to go through and write a song in the style of Bauhaus
about real estate in Everett, Washington.
344
00:24:18,977 --> 00:24:20,781
I think I heard that on the radio the other day.
345
00:24:20,781 --> 00:24:22,258
That's quite a song.
346
00:24:22,258 --> 00:24:27,642
but the thing is when I played it, I'm like, shit, this actually isn't bad.
347
00:24:27,642 --> 00:24:30,604
Like, nothing really went into it.
348
00:24:30,604 --> 00:24:34,317
Like, and I can download the copyright license to it and I own it.
349
00:24:34,317 --> 00:24:35,238
Like I'm the creator.
350
00:24:35,238 --> 00:24:38,819
I'm like, I'm not the creator of this shit.
351
00:24:38,819 --> 00:24:39,270
is
352
00:24:39,270 --> 00:24:43,732
Can AI show me a visual representation of 10,000 gorillas versus one man?
353
00:24:43,732 --> 00:24:45,465
That's what I wanna know.
354
00:24:46,149 --> 00:24:48,076
uh Absolutely.
355
00:24:50,453 --> 00:24:53,156
I saw this 10,000 versus a million.
356
00:24:53,177 --> 00:24:54,878
This thing is amazing.
357
00:24:54,879 --> 00:25:05,772
Like, it's just this mass filtering down and like, in these quantities, you're talking
about a very, very different set up because the gorillas are all like pumped to fight with
358
00:25:05,772 --> 00:25:06,923
each other.
359
00:25:09,893 --> 00:25:11,973
I like that they're all like Paul Rudd too.
360
00:25:11,973 --> 00:25:18,833
Like all of the men, it's like, will a million Paul Rudds beat 10,000 Cocos?
361
00:25:19,973 --> 00:25:23,373
I like how the Cocos are running upright too.
362
00:25:23,373 --> 00:25:29,278
By the way, for all of you watching this or listening to this, just go into YouTube and
look for 10,000 Gorillas versus.
363
00:25:29,278 --> 00:25:35,703
Yeah, I don't like how they just fling the bodies because in reality the gorillas would
rip their arms off and beat them to death with them.
364
00:25:35,703 --> 00:25:37,524
I mean, that's what would really happen.
365
00:25:38,786 --> 00:25:39,397
Right.
366
00:25:39,397 --> 00:25:41,191
where are the parts?
367
00:25:42,216 --> 00:25:42,629
I mean...
368
00:25:42,629 --> 00:25:43,801
I don't trust this at all.
369
00:25:43,801 --> 00:25:48,687
This I find to be completely false, and I am going to choose to not accept that as fact.
370
00:25:49,207 --> 00:25:51,038
Yeah, again, you don't have to.
371
00:25:51,038 --> 00:25:52,718
I mean, that's the thing.
372
00:25:53,718 --> 00:25:57,750
There's always the argument, are we actually living in the real world?
373
00:25:57,750 --> 00:26:00,020
Is this a simulation?
374
00:26:00,020 --> 00:26:02,737
These are big existential questions.
375
00:26:02,737 --> 00:26:10,303
I mean, some of the arguments you can read about this topic, it's hard to disagree with
some of what they're saying.
376
00:26:11,284 --> 00:26:13,824
But it's also not practical.
377
00:26:14,078 --> 00:26:15,201
Right.
378
00:26:17,057 --> 00:26:18,912
Okay, what if we're living in a simulation?
379
00:26:18,912 --> 00:26:22,068
Well, I can't fucking do anything about that.
380
00:26:22,431 --> 00:26:26,518
I'm gonna keep going until someone gives me a cheat code to get around this shit.
381
00:26:26,518 --> 00:26:27,198
this game sucks.
382
00:26:27,198 --> 00:26:31,018
Where's my up, up, down, down, left, right, left, right, B, A, select, start?
383
00:26:31,018 --> 00:26:32,211
That's what I want to know.
384
00:26:32,211 --> 00:26:34,493
no, like that's a legit question.
385
00:26:34,493 --> 00:26:42,079
uh That being said, we experience the world through our senses.
386
00:26:42,740 --> 00:26:47,083
And chat GPT experiences the world through its senses.
387
00:26:47,083 --> 00:26:49,065
And its senses are basically data input.
388
00:26:49,065 --> 00:26:51,477
And then they spit out data.
389
00:26:51,477 --> 00:26:55,049
And then they get more data input from other people around them.
390
00:26:55,410 --> 00:26:56,771
It's got visual cues.
391
00:26:56,771 --> 00:26:59,321
It can see in light spectrums that we can't see in.
392
00:26:59,321 --> 00:27:01,681
It can hear things that we can't hear.
393
00:27:01,882 --> 00:27:04,282
It can extract data much faster than we can.
394
00:27:04,282 --> 00:27:10,944
Like all these things are great, but it doesn't really have a lot of motivation from the ChatGPT perspective.
395
00:27:11,484 --> 00:27:17,116
What they have discovered is when they give it motivation, depending on the motivation they give it, it gives you very different outputs.
396
00:27:17,116 --> 00:27:24,928
Like there's the AIs that decided that they would form their own communication language
and start talking to each other.
397
00:27:24,928 --> 00:27:28,133
And nobody can understand it, but the AIs could clearly...
398
00:27:28,133 --> 00:27:31,093
communicate in a way that we couldn't understand.
399
00:27:31,273 --> 00:27:38,073
There's the AIs that they went through and they put forth the problem, keep all people
safe.
400
00:27:38,073 --> 00:27:43,993
And one of the AIs was like, cool, I'm going to melt all of you in nuclear warfare.
401
00:27:44,053 --> 00:27:46,913
And then we never have to worry about you being in danger again.
402
00:27:46,913 --> 00:27:49,233
It's like, huh, OK.
403
00:27:50,133 --> 00:27:53,613
And they're not unreasonable answers.
404
00:27:53,973 --> 00:27:56,980
I look at it, I'm like, fuck, I get that.
405
00:27:56,980 --> 00:27:57,797
you
406
00:27:57,861 --> 00:28:02,301
You need to apply that to everything that you work in, in every possible way.
407
00:28:02,301 --> 00:28:04,641
AI is just another form of intelligence.
408
00:28:04,641 --> 00:28:07,601
And some of them are your crazy fucking weird 3% uncle.
409
00:28:07,621 --> 00:28:09,201
Like that's just gonna happen.
410
00:28:09,201 --> 00:28:21,541
And you gotta learn how to filter that shit out and stop fighting, you know, this idea
that there's so much fear and trepidation around these pieces because there shouldn't be.
411
00:28:21,541 --> 00:28:26,415
I mean, it's just another source of information to make you do dumb shit faster.
412
00:28:26,415 --> 00:28:28,106
which is how we're gonna use it.
413
00:28:28,106 --> 00:28:31,333
So you have to critically think, but faster.
414
00:28:31,853 --> 00:28:38,303
Yeah, well, and the biggest part is em critically think before you ask your question.
415
00:28:39,105 --> 00:28:42,591
If you think critically before you ask your question, you're going to get a better answer.
416
00:28:42,591 --> 00:28:46,477
If you ask a stupid question, you're probably going to get a stupid response.
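A sketch of that "think before you ask" habit: state the task, the stakes, and the follow-up before you hit send. The helper and its field names below are our own invention for illustration, not any tool's API.

```python
# Hypothetical helper for framing a question before asking a chatbot.
# The fields and the example wording are illustrative only.
def frame_question(question: str, context: str, constraints: str, next_step: str) -> str:
    """Assemble a prompt that states the task, the stakes, and the follow-up."""
    return (
        f"Context: {context}\n"
        f"Question: {question}\n"
        f"Constraints: {constraints}\n"
        f"I will use the answer to: {next_step}"
    )

vague = "Is AI trustworthy?"  # a low-effort question invites a low-effort answer
framed = frame_question(
    question="Which claims in an AI-generated medical summary are most likely wrong?",
    context="I summarize medical research for a general-audience blog.",
    constraints="Flag anything I should verify against a primary source.",
    next_step="decide what to double-check before publishing.",
)
print(framed)
```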
417
00:28:46,772 --> 00:28:48,346
That's my experience.
418
00:28:49,233 --> 00:28:50,796
I've got a lot of it.
419
00:28:50,915 --> 00:28:51,566
Yes.
420
00:28:51,566 --> 00:28:53,228
Yeah, yeah, me too.
421
00:28:54,632 --> 00:28:58,158
Most of the wealth of my knowledge has been derived from me making mistakes.
422
00:28:58,158 --> 00:29:01,866
And those mistakes normally start with me not asking the right questions up for.
423
00:29:01,866 --> 00:29:04,207
Yeah, failure is a hell of a teacher.
424
00:29:05,229 --> 00:29:05,587
All right.
425
00:29:05,587 --> 00:29:08,091
Well, I think that's enough uh for one week.
426
00:29:08,091 --> 00:29:10,323
I want to go watch some more Gorillas fighting humans videos.
427
00:29:10,323 --> 00:29:13,575
I'm going to try and find a better one and maybe we'll bring that next time.
428
00:29:15,297 --> 00:29:15,627
I will.
429
00:29:15,627 --> 00:29:16,518
Thanks so much for listening.
430
00:29:16,518 --> 00:29:20,691
If you've gotten any value out of this, please do share it with somebody who could also
benefit from it.
431
00:29:20,691 --> 00:29:24,762
There's links to do that at our website, the fit mess dot com, and we'll be back there in
about a week.
432
00:29:25,014 --> 00:29:32,037
And quickly, don't forget, head over to magicmind.com forward slash FITMESS40 to get 40% off your subscription to Magic Mind.
433
00:29:32,395 --> 00:29:33,866
I know I can't get through my day without it.
434
00:29:33,866 --> 00:29:35,719
I think you'll feel the same once you try it.
435
00:29:35,719 --> 00:29:38,773
Magicmind.com forward slash FITMESS40.
436
00:29:39,056 --> 00:29:39,827
Thanks again for listening.
437
00:29:39,827 --> 00:29:40,777
We'll see you then.
438
00:29:40,842 --> 00:29:41,691
Thanks, bye bye.