Are we living in the sci-fi future we always dreamed of?

Look, we spend way too much time doom-scrolling about AI taking our jobs and potentially ending humanity. But let's be real for a hot minute – there's some genuinely mind-blowing tech coming that's going to make our daily lives infinitely better. From glasses that remember where you left your keys to robots that'll do your laundry while you binge Netflix, we're about to enter an era where forgetting stuff and doing chores become optional.
Jeremy and Jason dive into the cool side of AI advancement, exploring smart glasses with photographic memory, prosthetic limbs controlled by thoughts, business-building chatbots, and yes – robots that might replace your spouse (we went there). These aren't just prototypes anymore; this stuff is hitting the market now.
Ready to get excited about technology that doesn't involve robots uprising? Listen now and discover what's possible when AI works for us instead of against us.
Topics Discussed:
- Smart glasses with perfect recall – Never lose your keys again with AI that remembers everything you glance at
- Mind-controlled prosthetics – Wireless robotic limbs that work even when detached from the body
- AI-powered business creation – From idea to functioning website in minutes using ChatGPT
- Firefighting robots – Machines that walk through flames without getting tired or complaining
- Home automation evolution – Why your Roomba is about to get a serious upgrade
- Privacy implications – The trade-off between convenience and constant surveillance
- Relationship robots – Gary Vee's prediction that your grandkids will marry AI partners
- Market research automation – Getting million-dollar analysis for free through AI tools
- Safety applications – How smart glasses could prevent drowsy driving accidents
- The accountability problem – Whether technology can actually make us stick to our goals
----
MORE FROM THE FIT MESS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to The Fit Mess on YouTube
Join our community in the Fit Mess Facebook group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:01,496 --> 00:00:10,191
It's The Fit Mess, where we talk about AI and health and wellness, and there are a lot of
things to talk about with AI on a daily basis that are affecting our lives and our overall
2
00:00:10,191 --> 00:00:12,182
health and certainly our mental health.
3
00:00:12,322 --> 00:00:22,067
And we also talk about a lot of the doom and gloom of AI and how scary it can be and the
potential terrible things that it could bring into the future for all of us.
4
00:00:23,229 --> 00:00:24,339
Probably not good for us to be...
5
00:00:24,339 --> 00:00:26,220
I mean, yeah, look, it's therapy, right?
6
00:00:26,220 --> 00:00:29,712
Like it feels good to vent and to talk about it, but...
7
00:00:29,830 --> 00:00:35,685
There are a lot of things to be excited about too, and a lot of fun things that AI
is going to be bringing into our lives.
8
00:00:39,790 --> 00:00:46,826
So to present this for you today, last night I was trying to think: what are some of the
fun things?
9
00:00:46,826 --> 00:00:48,298
What are the things to be excited about?
10
00:00:48,298 --> 00:00:53,072
And so I did what I do normally these days: I went to AI and I said, hey, why should we
be excited about you?
11
00:00:53,072 --> 00:00:55,234
Share some things that are cool and...
12
00:00:55,234 --> 00:01:00,508
In typical fashion, it gave me really terrible advice, like some really lame, boring,
watered down stuff.
13
00:01:00,728 --> 00:01:06,313
But because I was using my phone, my phone wants to know everything about me, so it
captures everything I type and turns it into great content for me.
14
00:01:06,313 --> 00:01:13,759
So as I then shifted to my doom scrolling for a few minutes to turn my brain off, I found
a few examples of things to be really excited about.
15
00:01:13,759 --> 00:01:19,433
So I thought, Jason, I would share them with you and get your reaction, because you know
the inner workings
16
00:01:19,433 --> 00:01:20,944
of a lot of this stuff, and
17
00:01:20,944 --> 00:01:25,758
I wanna just sort of see how you react to some of these devices and ideas that are coming
our way.
18
00:01:25,758 --> 00:01:29,291
So I'm gonna jump right in and start by sharing my screen here.
19
00:01:29,291 --> 00:01:33,255
So if you're watching this on YouTube, you will get a good kick out of this.
20
00:01:33,255 --> 00:01:38,284
Let me find... okay, how is it possible I have more windows open than I have shared?
21
00:01:38,284 --> 00:01:38,551
There we go.
22
00:01:38,551 --> 00:01:39,618
Okay, let's start with this one.
23
00:01:39,618 --> 00:01:42,574
It's almost like there should be an artificial intelligence that can figure that out for you.
24
00:01:42,574 --> 00:01:44,115
I know it should be so much simpler than this.
25
00:01:44,115 --> 00:01:45,526
All right, let's do this.
26
00:01:45,526 --> 00:01:46,116
All right.
27
00:01:46,116 --> 00:01:50,668
This one I was blown away by, and scared and excited about.
28
00:01:50,668 --> 00:01:55,750
So you guys may have noticed I snuck a peek back at the shelf a moment ago.
29
00:01:55,851 --> 00:01:59,272
I wasn't paying attention, but let's see if Gemini was.
30
00:02:01,190 --> 00:02:07,865
Hey, did you happen to catch the title of the white book that was on the shelf behind me?
31
00:02:09,106 --> 00:02:12,389
The white book is Atomic Habits by James Clear.
32
00:02:12,910 --> 00:02:15,112
That is absolutely right.
33
00:02:15,112 --> 00:02:17,340
So let's try something harder.
34
00:02:17,340 --> 00:02:19,856
I keep losing my hotel key card.
35
00:02:19,856 --> 00:02:22,638
Do you know where I last left the card?
36
00:02:23,719 --> 00:02:27,001
The hotel key card is to the right of the music record.
37
00:02:27,903 --> 00:02:28,883
Great.
38
00:02:30,076 --> 00:02:32,696
For someone as forgetful as me, that's a killer app.
39
00:02:32,696 --> 00:02:38,896
All right, now let's see how the AI can connect the physical world with your digital
content and take action.
40
00:02:39,076 --> 00:02:42,836
This is my first time in Vancouver, and I love going on walks.
41
00:02:42,936 --> 00:02:49,056
So why don't you navigate me to a park nearby with views of the ocean?
42
00:02:51,118 --> 00:02:57,456
Okay, I am starting navigation to Lighthouse Park, which has magnificent views of the
Pacific Ocean.
43
00:02:57,777 --> 00:03:00,100
Is there anything else I can assist you with?
44
00:03:00,100 --> 00:03:06,378
Honestly, with these directions and a 3D map, I should be all set and hopefully I won't
look like a tourist.
45
00:03:06,435 --> 00:03:07,945
That's mind-blowing.
46
00:03:07,945 --> 00:03:10,557
How many times have you lost your keys?
47
00:03:10,858 --> 00:03:12,189
You don't know where you left the hotel key.
48
00:03:12,189 --> 00:03:20,967
I mean, there are so many things that, just by wearing these glasses... our brains can't
capture all of that information, but to have a tool that has, like, 100% immediate
49
00:03:20,967 --> 00:03:23,609
recall of everything you glance at.
50
00:03:23,790 --> 00:03:28,374
is incredibly useful and massively time-saving and frustration-relieving.
51
00:03:28,374 --> 00:03:29,074
It would...
52
00:03:29,315 --> 00:03:42,069
I mean, for lack of a better term, it gives all of us a photographic memory, which is
amazing and awesome and powerful, and is like the meatspace equivalent of Google.
53
00:03:42,069 --> 00:03:47,610
Because you're no longer going to have arguments where you say, I'm going to Google for the
right answer.
54
00:03:47,610 --> 00:03:53,352
You're going to be like, I'm going to look back in time at these camera memories that I took,
which is great.
55
00:03:53,372 --> 00:03:55,512
Make sure you take them off when you go to the bathroom.
56
00:03:56,213 --> 00:03:57,438
That's the first warning.
57
00:03:57,438 --> 00:03:58,898
Why am I not going to?
58
00:03:58,898 --> 00:04:00,262
Just because you'll forget they're on?
59
00:04:00,262 --> 00:04:04,131
Well, how long until there's an app that analyzes your poop in the toilet?
60
00:04:04,133 --> 00:04:05,215
That's the real question.
61
00:04:05,215 --> 00:04:07,550
Like the shit's about to get real.
62
00:04:07,550 --> 00:04:07,750
right?
63
00:04:07,750 --> 00:04:14,588
When it becomes contacts and you sort of forget they're even in, you're going to forget
that you're capturing everything that's happening in your entire life, no matter where you are.
64
00:04:14,637 --> 00:04:19,529
Yeah, privacy concerns are going to be, like, completely... and my God, we're doing
gloomy again.
65
00:04:19,619 --> 00:04:20,349
No, it's fun.
66
00:04:20,349 --> 00:04:20,819
It's fun.
67
00:04:20,819 --> 00:04:21,620
It's really cool.
68
00:04:21,620 --> 00:04:22,580
It's fun.
69
00:04:22,740 --> 00:04:24,332
It's also going to help, right?
70
00:04:24,332 --> 00:04:30,216
Because when you think about it... I remember a friend of mine once said that the Mount St.
71
00:04:30,216 --> 00:04:34,268
Helens eruption was probably the last major thing that wasn't captured,
right?
72
00:04:34,268 --> 00:04:42,824
Because we've all had cell phones in our pockets since then, and we're all capturing and
potentially recording everything that we do in every aspect of our life.
73
00:04:42,848 --> 00:04:45,120
No crime will be uncaptured.
74
00:04:45,120 --> 00:04:47,643
No natural disaster will be uncaptured.
75
00:04:47,643 --> 00:04:55,741
Like every car accident you're in, you'll have complete video recall of what happened and
an automated witness to whatever may have happened.
76
00:04:55,741 --> 00:05:03,098
It's crazy how much we'll be able to hold onto and capture and have forever.
77
00:05:03,137 --> 00:05:03,507
Right.
78
00:05:03,507 --> 00:05:06,639
So, a few things here.
79
00:05:06,639 --> 00:05:16,064
It's amazing because it's going to remove the issues with first-person perspective
witnesses.
80
00:05:16,144 --> 00:05:27,140
So, people that remember things incorrectly. I forget the exact number, but psychologists
say that 90 percent of all witnesses that take the stand are just wrong.
81
00:05:27,140 --> 00:05:29,072
Like their recall is incorrect.
82
00:05:29,072 --> 00:05:31,885
And when they actually... right, because human brains
83
00:05:31,885 --> 00:05:39,927
remap information to put it into a certain format and then we store it, which is great
for us because that's how we allow ourselves enough cognitive dissonance to keep doing the
84
00:05:39,927 --> 00:05:43,928
dumb shit that we keep doing, like going to work every day to pay someone else's bills.
85
00:05:44,789 --> 00:05:52,250
Now we're going to have this thing that records all this stuff, which is going to be
amazing because if my wife wears them, she'll remember where her keys are all the time.
86
00:05:52,331 --> 00:06:01,186
And if I wear them, I'll remember all the terrible, stupid, shitty mistakes that I've
made throughout the day.
87
00:06:01,186 --> 00:06:02,217
You know,
88
00:06:02,217 --> 00:06:05,278
other people are like, hey, you did this and this and this.
89
00:06:05,278 --> 00:06:06,220
I'm like, I did not.
90
00:06:06,220 --> 00:06:09,062
Well, check your feed and look, I'm like, I did do that.
91
00:06:09,062 --> 00:06:09,492
Damn it.
92
00:06:09,492 --> 00:06:10,263
I'm sorry.
93
00:06:10,263 --> 00:06:12,654
Like, maybe I can use it to make my habits better, right?
94
00:06:12,654 --> 00:06:14,095
Because recall is not great.
95
00:06:14,095 --> 00:06:22,461
The brain doesn't do a great job of understanding the information spec, and the ability
for individuals to go through and use this kind of technology on a personal level to
96
00:06:22,461 --> 00:06:23,952
improve ourselves.
97
00:06:24,372 --> 00:06:26,013
It's going to be friggin rad.
98
00:06:26,262 --> 00:06:28,270
Well, and the ability to get...
99
00:06:29,823 --> 00:06:32,404
I saw a tool that Google created.
100
00:06:32,404 --> 00:06:40,067
It's like a screen overlay where you can ask it things. The demonstration that was
given was a video game, and the question was, how should I...
101
00:06:40,067 --> 00:06:41,228
What's the next move I should make?
102
00:06:41,228 --> 00:06:43,949
And it was like, from the lower left, move it to this piece and whatever.
103
00:06:44,049 --> 00:06:51,072
Imagine that in your real life, whether you're looking back on your day or
whatever you're looking at. You know?
104
00:06:51,072 --> 00:06:54,144
How fast should I run on this trail today?
105
00:06:54,144 --> 00:06:55,374
Like what?
106
00:06:55,374 --> 00:06:58,856
What could I have done better today to stick to the habits that I've said
107
00:06:58,856 --> 00:07:07,993
I want to build into my life? Like, all these things... it's going to give you that
mirror held up, saying, here are the mistakes you made: at the time on your calendar
108
00:07:07,993 --> 00:07:09,725
when you said you wanted to go to the gym.
109
00:07:09,725 --> 00:07:11,466
How come you kept looking at Facebook, right?
110
00:07:11,466 --> 00:07:16,649
Like, it's going to be able to help you and keep you accountable for the things you say you
want to do.
111
00:07:16,925 --> 00:07:22,885
Well, and also, like, AI with visual information.
112
00:07:22,885 --> 00:07:26,825
So the ability to process those things and give you certain information and feedback.
113
00:07:27,065 --> 00:07:29,685
Like a lot of it has been expert system defined.
114
00:07:29,925 --> 00:07:32,605
Like, security cameras are a really good example of this.
115
00:07:32,605 --> 00:07:37,805
Like the closed-circuit TV systems that most people run for their businesses.
116
00:07:38,365 --> 00:07:41,805
A lot of them have AI in them and the AI is just not very good.
117
00:07:41,805 --> 00:07:42,645
It gets things wrong.
118
00:07:42,645 --> 00:07:44,365
You have to factor those things in,
119
00:07:44,941 --> 00:07:48,613
because they were never really trained for human beings to use these things.
120
00:07:48,613 --> 00:07:51,404
They don't have as broad an information set.
121
00:07:51,645 --> 00:07:56,868
Google's glasses have all of the information that's happening, right now, right there on a
person's face.
122
00:07:56,868 --> 00:08:00,309
So I'm curious to see her example.
123
00:08:00,309 --> 00:08:04,672
Like, if we take it out into the real world, does it work that way?
124
00:08:04,672 --> 00:08:09,845
Or are these prototypes that are mapped into these expert systems to be able to pull
certain information sets out?
125
00:08:09,845 --> 00:08:14,457
Because the amount of learning and understanding that the human brain has to do
126
00:08:14,701 --> 00:08:15,782
is a lot.
127
00:08:15,782 --> 00:08:26,471
That's why, when we turn around and look at that shelf, our pixelated version of what
goes on only happens in a narrow scope, and our periphery doesn't catch all those things.
128
00:08:26,471 --> 00:08:31,405
So if I didn't turn my head all the way around and stop and focus on the thing, my eyes
never cue in and pick it up.
129
00:08:31,405 --> 00:08:38,971
My brain and my cerebral cortex would never understand those things in context, whereas these
things are just capturing and reading things all the time, and they can post-process and
130
00:08:38,971 --> 00:08:40,582
try to look up information.
131
00:08:40,582 --> 00:08:44,269
And because they're capturing video at a higher resolution,
132
00:08:44,269 --> 00:08:50,871
or, I should say, because they're capturing a broader scope of information with high
resolution, not higher resolution than your brain,
133
00:08:50,871 --> 00:08:57,363
they can look into information more effectively and appropriately, I guess.
134
00:08:57,363 --> 00:09:03,214
And the ability to zoom in and zoom out on those types of things is going to be
phenomenal.
135
00:09:03,214 --> 00:09:05,935
Now, not to get too gloomy here.
136
00:09:05,935 --> 00:09:08,936
Obviously, these things can be used for nefarious purposes.
137
00:09:08,936 --> 00:09:12,857
But the practical use of doing something like this is incredible.
138
00:09:13,003 --> 00:09:25,080
An amazing use case that I can think of that's super practical that probably affects way
more people and causes a lot more deaths than people realize is altered state driving.
139
00:09:25,080 --> 00:09:32,464
Imagine driving in an altered state: not just drunk, but intoxicated or high
or tired.
140
00:09:32,605 --> 00:09:40,669
The effects of lack of sleep quite often have a more deleterious effect on your
reaction time than even being drunk or high.
141
00:09:40,769 --> 00:09:45,011
And the reason why is because your brain cannot process through all the sludge that's in
there.
142
00:09:45,011 --> 00:09:52,996
And if these glasses can, in the middle of the night, actually see things and
point things out... I mean, even regular driving will be amazing, because they'll have the full
143
00:09:52,996 --> 00:09:53,566
spectrum.
144
00:09:53,566 --> 00:09:58,759
But if they can say, watch out, a deer is about to jump in front of you or hey, dick bag,
stay in your fucking lane.
145
00:09:58,759 --> 00:10:01,000
And I want it to talk to me like that.
146
00:10:01,901 --> 00:10:04,022
Because that's how I talk to other people on the road.
147
00:10:04,022 --> 00:10:07,524
I want, like, the road rage package in my glasses to inform me
148
00:10:07,524 --> 00:10:08,605
when I'm being a dick.
149
00:10:08,605 --> 00:10:08,987
Yeah.
150
00:10:08,987 --> 00:10:11,440
You're also assuming that humans will still need to drive cars.
151
00:10:11,440 --> 00:10:14,222
Come on, that's such a dinosaur idea.
152
00:10:14,677 --> 00:10:25,108
That's actually a very valid point and a very good, honest thing to say, but I guarantee
you that cars will not be fully autonomous.
153
00:10:25,108 --> 00:10:30,336
Like, right, we like to get behind the wheel and steer and pretend we're in fucking Grand
Theft Auto.
154
00:10:30,336 --> 00:10:31,636
That's right, it's too much fun.
155
00:10:31,636 --> 00:10:32,387
It's too much fun.
156
00:10:32,387 --> 00:10:35,682
The problem is, now if you act like you're in Grand Theft Auto, there's gonna be a recording.
157
00:10:35,682 --> 00:10:36,374
Yeah, it's true.
158
00:10:36,374 --> 00:10:37,467
Yeah, you're screwed.
159
00:10:37,467 --> 00:10:38,330
All right.
160
00:10:38,330 --> 00:10:40,426
Let's not get into the doom and gloom of it, though.
161
00:10:40,426 --> 00:10:46,238
If I rear-end somebody, you know, they can't be like, you were creeping up on me or you
were doing this or doing that.
162
00:10:46,238 --> 00:10:48,709
Like it's all gonna be there.
163
00:10:48,709 --> 00:10:56,561
So this is the ultimate surveillance state, because we all bitch about the idea of there being
these video cameras watching things all the time, which have been there for a
164
00:10:56,561 --> 00:10:58,812
long time anyways, all over the world.
165
00:10:59,292 --> 00:11:01,609
Now you're gonna put them on everybody's faces.
166
00:11:01,609 --> 00:11:04,297
And willingly, and they're gonna pay for the privilege.
167
00:11:04,540 --> 00:11:04,969
Yes.
168
00:11:04,969 --> 00:11:09,749
We're all going to give money and privacy away to be able to know where our keys were.
169
00:11:09,749 --> 00:11:16,104
There are cameras everywhere, including the one in my pocket and the one on my face that I
paid thousands of dollars for.
170
00:11:16,595 --> 00:11:17,285
Exactly.
171
00:11:17,285 --> 00:11:17,675
Yeah.
172
00:11:17,675 --> 00:11:21,618
Don't track me as I'm writing this post from my glasses.
173
00:11:24,440 --> 00:11:25,821
Exactly.
174
00:11:26,501 --> 00:11:29,603
Here's everything you want to know that I did wrong.
175
00:11:31,265 --> 00:11:32,353
It's going to be awesome.
176
00:11:32,353 --> 00:11:33,726
It's going to be amazing.
177
00:11:35,027 --> 00:11:37,869
It's also going to be the ultimate, like, ego dump.
178
00:11:37,869 --> 00:11:44,299
Because, like, you're not gonna be able to get away from the facts, because there's gonna be
fucking video evidence that you captured.
179
00:11:46,144 --> 00:11:48,086
Yes, yes.
180
00:11:48,087 --> 00:11:49,409
So excited for this.
181
00:11:49,409 --> 00:11:52,112
Alright, so that's one thing that I found pretty fun.
182
00:11:52,112 --> 00:11:53,704
This one's not so much AI.
183
00:11:53,704 --> 00:12:01,563
I mean, I'm sure there's some AI integrated into it, but it's more just sort of technology
helping make lives easier for people with disabilities.
184
00:12:01,563 --> 00:12:04,566
So I wanted to share this one very cool technology.
185
00:12:05,789 --> 00:12:07,120
...when it's not attached to you.
186
00:12:07,120 --> 00:12:12,954
Yes, so that's the thing, and we've posted about it on social media, and that's what
everybody is like: this is crazy.
187
00:12:12,954 --> 00:12:13,855
You've got to see...
188
00:12:13,855 --> 00:12:14,285
Yes.
189
00:12:14,285 --> 00:12:15,099
have.
190
00:12:15,099 --> 00:12:16,453
I'll just help you.
191
00:12:16,453 --> 00:12:16,567
it?
192
00:12:16,567 --> 00:12:19,764
Completely wireless, so you can actually just detach the hand like that.
193
00:12:19,764 --> 00:12:20,830
You're thinking about it.
194
00:12:20,830 --> 00:12:24,072
And I can still operate it the same way as if it was attached.
195
00:12:24,072 --> 00:12:24,853
Make a move.
196
00:12:24,853 --> 00:12:27,274
That is incredible.
197
00:12:28,455 --> 00:12:30,196
That's unbelievable.
198
00:12:30,196 --> 00:12:32,598
Yeah, so I can do little tasks.
200
00:12:33,514 --> 00:12:42,637
Alright, so the video that you just saw or the video that I'm about to show, depending on
how I edit this, there's a woman who has no arms and so she has basically animatronic arms
201
00:12:42,637 --> 00:12:44,658
that she's able to control.
202
00:12:44,918 --> 00:12:54,541
Apparently with her mind. She takes her hand off, puts it on the table, and is still able to
control that hand while it's detached from the animatronic arm.
203
00:12:54,942 --> 00:12:56,122
That has...
204
00:12:56,899 --> 00:12:59,903
implications, but just for people with those disabilities.
205
00:12:59,903 --> 00:13:06,552
Someone who's missing an arm being able to have a fully functioning Luke Skywalker
animatronic hand attached to their body.
206
00:13:07,054 --> 00:13:09,715
The past's version of the future is here.
207
00:13:09,715 --> 00:13:13,934
I'm about to get cancelled, because that's what I call a hand job.
208
00:13:17,343 --> 00:13:18,596
It's amazing.
209
00:13:18,596 --> 00:13:20,453
And you can record it with your Google Glass.
210
00:13:20,453 --> 00:13:21,344
Exactly.
211
00:13:21,344 --> 00:13:23,095
There's no getting away.
212
00:13:23,335 --> 00:13:25,397
Those are incredible.
213
00:13:25,397 --> 00:13:34,042
I mean, and you're right, there's definitely augmented intelligence that goes into that from
a craft and user perspective.
214
00:13:34,083 --> 00:13:40,027
But yeah, this is the sci-fi shit that we're getting into now.
215
00:13:40,027 --> 00:13:48,473
And then the brain-to-physical interface, I mean, direct neural contact into
something that can
216
00:13:48,473 --> 00:13:52,313
control things extends our reach and our grasp and everything else.
217
00:13:52,492 --> 00:13:59,633
And those Google glasses make it so we can see, you know, in multiple different light
spectrums that our brain and our eyes can't capture because you could actually use those
218
00:13:59,633 --> 00:14:03,253
glasses to go through and say, give me an overlaid heat map of what's going on here.
219
00:14:03,253 --> 00:14:09,473
I mean, the practical implications are mind-blowing and phenomenal and super cool.
220
00:14:09,533 --> 00:14:15,233
Again, the risk being that, you know, you don't know who's taking all this information,
what they're going to do with it.
221
00:14:15,693 --> 00:14:17,913
But from a consumer perspective...
222
00:14:19,255 --> 00:14:20,856
Yeah, it's pretty incredible.
223
00:14:20,856 --> 00:14:28,029
And I mean, I tore my distal bicep tendon off a few years back, four or five years
ago.
224
00:14:28,089 --> 00:14:36,883
And if I had had a robotic version to go through and like slap a brace on my arm to keep
it stable and have it do exercises and move things so I didn't have to go to a physical
225
00:14:36,883 --> 00:14:45,857
therapy clinic and spend all that time doing those pieces every day for six weeks,
for an hour, hour and a half, plus the drive time.
226
00:14:45,857 --> 00:14:46,657
I mean...
227
00:14:47,201 --> 00:14:51,663
It would save so much time, money, everything else in all kinds of different directions.
228
00:14:51,744 --> 00:14:52,584
And...
229
00:14:53,445 --> 00:15:02,520
I'm just thinking about having other automated tools in my house, like if the Roomba was
somehow plugged into my cerebral cortex, could I finally get it to stop climbing up over
230
00:15:02,520 --> 00:15:08,473
the ledge of my fucking coffee table to try to get in the middle and get itself stuck
every single time?
231
00:15:09,014 --> 00:15:15,245
From a practical perspective, I feel like iRobot needs to learn something from
these...
232
00:15:15,245 --> 00:15:16,974
These hand thingies.
233
00:15:16,974 --> 00:15:25,949
OK, so it's funny you say that. One of the videos I didn't save, and I'm sure you've
seen this, is the home chore robot that is apparently supposed to be rolled
234
00:15:25,949 --> 00:15:29,941
out into people's homes this year as prototypes.
235
00:15:29,941 --> 00:15:30,851
I believe it is Optimus.
236
00:15:30,851 --> 00:15:38,065
Yeah, I mean this thing does your gardening, does your laundry, washes the car like it
does all of your household chores.
237
00:15:38,305 --> 00:15:39,801
So you don't have to.
238
00:15:39,801 --> 00:15:45,422
Yeah, and it's only $35,000 for the down payment.
239
00:15:45,422 --> 00:15:53,062
But let's talk about in five years, when you're buying the used one, or they get cheaper and
easier to make and all of a sudden they're $1,000, right? Like...
240
00:15:56,182 --> 00:15:57,482
Try to keep it light, man.
241
00:15:57,482 --> 00:15:58,734
I'm trying to keep it light.
242
00:15:58,734 --> 00:16:00,034
They're gonna rise up.
243
00:16:03,830 --> 00:16:04,630
The clock's ticking.
244
00:16:04,630 --> 00:16:06,011
We got five years.
245
00:16:06,071 --> 00:16:09,973
All right, well, before they turn evil, they do have a lot of good to offer.
246
00:16:09,973 --> 00:16:11,642
I wanted to show this one as well.
247
00:16:11,642 --> 00:16:13,335
I was very excited about this one.
248
00:16:13,335 --> 00:16:23,721
When you talk about... you know, I live in British Columbia, where there's a very real
risk every single year of massive wildfires wiping out tons and tons of homes and lives
249
00:16:23,721 --> 00:16:25,102
and all the things.
250
00:16:25,102 --> 00:16:28,003
So imagine, if you will, the firefighting...
251
00:16:43,640 --> 00:16:48,557
He doesn't get hot, he doesn't get tired, he just walks through the forest putting out
fires.
252
00:16:48,557 --> 00:16:57,461
And it's funny, because his movement pattern looks a lot like my movement pattern after
a heavy leg day, where I'm walking stiff and upright, like I can't bend over and
253
00:16:57,461 --> 00:16:58,822
tie my shoes.
254
00:16:58,822 --> 00:17:00,172
Who's going to wipe my butt for me?
255
00:17:00,172 --> 00:17:02,223
He doesn't have to worry about those things.
256
00:17:02,324 --> 00:17:04,204
No, no, he does not poop.
257
00:17:04,204 --> 00:17:06,445
He just expels exhaust.
258
00:17:06,826 --> 00:17:07,726
It's a...
259
00:17:08,466 --> 00:17:12,078
And yeah, I mean, his core strength has got to be pretty incredible.
260
00:17:12,078 --> 00:17:13,757
I have to admit, I'm kind of jealous.
261
00:17:13,757 --> 00:17:14,314
All right.
262
00:17:14,314 --> 00:17:19,978
Yeah, I mean, how heavy are those big fire-extinguishing cannons he's carrying around
effortlessly?
263
00:17:22,661 --> 00:17:23,521
Nope.
264
00:17:23,902 --> 00:17:28,097
Until he turns on us and starts burning the forest down to create a job for himself.
265
00:17:28,097 --> 00:17:32,021
No, I mean, I don't know what he's spraying out of there, but I mean, he could put
anything in those chemical tubes.
266
00:17:32,021 --> 00:17:35,264
You could put friggin aerosol rat poison in there.
267
00:17:35,310 --> 00:17:37,310
Yeah, suddenly it's gonna be walking through the mall with gas.
268
00:17:37,310 --> 00:17:38,950
No, keep it light, keep it fun.
269
00:17:38,950 --> 00:17:40,803
Don't go there, don't go there.
270
00:17:40,803 --> 00:17:44,881
There's a WALL-E joke in here somewhere, I know, I just can't quite pin it down just yet.
271
00:17:46,200 --> 00:17:47,001
We'll find it.
272
00:17:47,001 --> 00:17:48,181
We'll find it.
273
00:17:48,702 --> 00:17:52,204
OK, so the fires have been put out.
274
00:17:52,204 --> 00:17:57,508
Your Google Glasses have told you where to go and your animatronic arms can do everything
for you.
275
00:17:57,508 --> 00:17:59,719
But what if you just hate your job and want a new one?
276
00:17:59,719 --> 00:18:02,451
You want to start a new business and don't exactly know where to start.
277
00:18:02,451 --> 00:18:07,394
This was a pretty cool application for using ChatGPT or some sort of AI tool.
278
00:18:07,394 --> 00:18:08,624
Can you do me a favor?
279
00:18:08,624 --> 00:18:10,095
I've got this crazy idea.
280
00:18:10,095 --> 00:18:12,635
Airbnb, but for dogs.
281
00:18:12,635 --> 00:18:18,077
Make up a name, put up the details, the about page, act as a programmer and write the code
for me.
282
00:18:18,077 --> 00:18:19,417
How about BarkBnB?
283
00:18:19,417 --> 00:18:22,818
Profiles for dogs, booking options, and verified sitters.
284
00:18:22,818 --> 00:18:25,299
Yeah, that sounds great, but I need you to just write the code.
285
00:18:25,299 --> 00:18:26,519
Doctype HTML.
286
00:18:26,519 --> 00:18:28,400
Header, meta name viewport.
287
00:18:28,400 --> 00:18:31,651
Some of you guys are like, I'd start my business, but I'm working on my website.
288
00:18:31,651 --> 00:18:32,071
Guess what?
289
00:18:32,071 --> 00:18:32,961
Your website's done.
290
00:18:32,961 --> 00:18:34,541
You have no idea.
291
00:18:34,541 --> 00:18:35,674
It is done.
292
00:18:35,674 --> 00:18:36,252
Look at...
293
00:18:36,252 --> 00:18:38,945
Well, okay.
294
00:18:39,227 --> 00:18:49,253
Getting a domain name, going through and having it configure DNS and set up a web server
is going to directly impact my job and the things that I do.
295
00:18:49,253 --> 00:18:51,557
So fuck you, this technology has to be destroyed.
296
00:18:51,557 --> 00:18:52,754
Hahaha.
297
00:18:53,564 --> 00:18:56,935
No, it's funny because I actually, I do this all the time.
298
00:18:56,935 --> 00:19:01,648
I'll open up ChatGPT and I'll be like, I'm thinking of doing a business like this, and give
it some information.
299
00:19:04,606 --> 00:19:08,268
It does a DNS search, does a dig, and sees what available domain names there are.
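(A minimal sketch of the kind of check being described here, in Python: it flags names that already resolve in DNS. The BarkBnB domain names are hypothetical examples, and DNS resolution is only a rough first-pass signal; an authoritative availability check would go through whois or a registrar API, along the lines of the dig lookup mentioned above.)

import socket

def likely_registered(domain):
    # If the name resolves, it is almost certainly taken. The absence of a
    # DNS record does NOT guarantee availability, so treat this as a rough
    # filter before a proper whois or registrar lookup.
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

for name in ["barkbnb.com", "barkbnb.net", "bark-bnb.com"]:
    print(name, "taken" if likely_registered(name) else "possibly available")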
300
00:19:08,268 --> 00:19:14,600
You can even tell it to go through and say what the likelihood is of this name being
found versus that one, or which is more SEO-friendly.
301
00:19:14,600 --> 00:19:22,294
Like, all the questions that would take you weeks and months of marketing work to
try to figure out, you can use ChatGPT to do a lot of this shit.
302
00:19:22,294 --> 00:19:31,794
And yes, making your website is rad. A lot of the website services that
are out there that actually host these things,
303
00:19:31,794 --> 00:19:36,217
like Wix and other folks, actually use AI-generated capabilities to go through and build
this code for you.
304
00:19:36,217 --> 00:19:41,892
And even Google Sites, which is free, does something very similar, where you don't
actually have to do anything.
305
00:19:41,892 --> 00:19:43,953
It's just simple form fields to move things across.
306
00:19:43,953 --> 00:19:48,346
But you can actually say, use this template, and have it write the HTML code, upload
it, and have it work.
307
00:19:48,346 --> 00:19:55,392
So I mean, this is cool because it makes actually doing e-commerce
much more accessible for folks.
308
00:19:55,392 --> 00:19:58,386
Now, it doesn't do all the stitching to make ad software work.
309
00:19:58,386 --> 00:20:01,829
which is a real pain in the ass. I used to work for an ad tech company, full
disclosure,
310
00:20:01,829 --> 00:20:04,770
so I know how difficult it is.
311
00:20:04,851 --> 00:20:06,972
And it doesn't do all the security pieces.
312
00:20:07,213 --> 00:20:09,254
But we're not that far from that.
313
00:20:09,254 --> 00:20:11,436
Like, this is going to happen.
314
00:20:11,436 --> 00:20:20,312
And theoretically, we should have a faster, more robust, more secure, and safer
internet as a result.
315
00:20:20,312 --> 00:20:23,575
Except people are going to exploit it and use it for nasty shit.
316
00:20:23,575 --> 00:20:25,117
So stay in your lane.
317
00:20:25,117 --> 00:20:34,952
What's exciting for me about this is that I'm the kind of person that has a million
ideas and never gets past the idea stage, because I have five more after that and I can't
318
00:20:34,952 --> 00:20:38,193
stay committed or excited about any of them long enough.
319
00:20:38,334 --> 00:20:46,737
But if I get to a point where I can hold on to an idea long enough to talk to a computer
like this and say, hey, I just had this idea for a business, what do I need to know about it?
320
00:20:46,817 --> 00:20:54,491
Help me set this up, help me get the first five things I need to do in a list to take
action. Right? Like, all of
321
00:20:54,491 --> 00:21:06,100
these things that, you know, my overactive brain has kept me from can be
funneled into action. And potentially, you know, again going back to the firefighting robots,
322
00:21:06,100 --> 00:21:16,924
if there are automated ways to make the ideas come to life quickly, I can stay excited
about them long enough to see them through and potentially turn them into something useful.
323
00:21:16,924 --> 00:21:19,115
Yeah, like, motivationally...
324
00:21:19,714 --> 00:21:24,540
It directs your motivation in at least what feels like a more productive fashion.
325
00:21:24,581 --> 00:21:36,131
And what's fascinating is that if you look at the way that a lot of people are using this
technology now, especially on the high-end enterprise side, things like
326
00:21:36,131 --> 00:21:41,695
marketing, there are analysis services out there that do market
analysis.
327
00:21:41,996 --> 00:21:44,926
They do competitive analysis, all these different things.
328
00:21:44,926 --> 00:21:49,066
And some of them are hundreds of thousands, if not millions, of dollars a year.
329
00:21:49,766 --> 00:21:58,966
And the LLMs have already captured most of the data repositories that these people already
have out there, whether they publish them online or via free blogs, or they just get
330
00:21:58,966 --> 00:22:00,286
scraped and searched.
331
00:22:00,286 --> 00:22:02,866
It's fucking there now and it's free.
332
00:22:02,866 --> 00:22:11,706
So you don't necessarily need a Gartner or Forrester or any of these other services
when you're starting off, because you can actually go through and get most of the
333
00:22:11,706 --> 00:22:13,854
publicly available information that's out there.
334
00:22:13,854 --> 00:22:18,474
and compile that information and use it in a practical, meaningful way.
335
00:22:18,494 --> 00:22:24,054
And yes, I'm not saying that you shouldn't have a Gartner or a Forrester subscription
because there's going to be all kinds of stuff that they've paywalled and put behind
336
00:22:24,054 --> 00:22:28,494
things to give you better analysis because those companies are actually cool.
337
00:22:28,494 --> 00:22:30,514
Please don't sue us.
338
00:22:31,214 --> 00:22:38,938
But, you know, I've been on multiple of those and done multiple different press tours
and all these other pieces out there.
339
00:22:38,938 --> 00:22:39,878
And they're great.
340
00:22:39,878 --> 00:22:41,339
And the people that I work with are great.
341
00:22:41,339 --> 00:22:47,281
And they are all doing analysis of these kinds of technologies and how they use these
things.
342
00:22:47,561 --> 00:22:57,025
And it's cool because it's like the snake eating its own tail in some regard, because, you
know, it's kind of consuming itself as it goes through, but it's actually not
343
00:22:57,025 --> 00:22:57,785
getting smaller.
344
00:22:57,785 --> 00:22:59,711
It's becoming a bigger snake with more information.
345
00:22:59,711 --> 00:23:01,656
It's actually more effective and more useful.
346
00:23:01,758 --> 00:23:10,358
And I think the real value that AI brings to some of this analysis is that you can
have analysis, meta-analysis, meta-meta-analysis, and so on down the chain.
347
00:23:10,438 --> 00:23:18,218
And this compendium of information does not make things more clear, but it gives you
faster options to make decisions.
348
00:23:18,398 --> 00:23:25,458
And whether it's the right decision or a wrong decision, a decision puts you on a path,
which lets you go through it, A/B test, and then stay motivated.
349
00:23:25,458 --> 00:23:27,978
So you can test other routes and paths.
350
00:23:28,454 --> 00:23:36,997
And it's really cool because if you use ChatGPT effectively, you can say, build me a plan
to execute on my business idea in a step-by-step format.
351
00:23:36,997 --> 00:23:39,218
And let me bite off these steps at this rate.
352
00:23:39,218 --> 00:23:46,301
And you can actually pre-program that into your calendar, your event schedule, or those
types of things and have those things automatically populated.
353
00:23:46,301 --> 00:23:50,723
So, if I need to build a website, I know I need to do these 50 things.
354
00:23:50,723 --> 00:23:51,103
Cool.
355
00:23:51,103 --> 00:23:54,204
ChatGPT will go through, tell you how to do it, and write those things out.
356
00:23:54,204 --> 00:23:56,645
And say, you want to do these things, these things, and these things.
357
00:23:56,645 --> 00:23:58,110
And if you know you've only got an
358
00:23:58,110 --> 00:24:04,790
hour a day to work on this shit: break these things down into one-hour tasks and put them
on my calendar as events.
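(A minimal sketch of that last step, assuming the task list has already come back from ChatGPT: it just assigns each task a one-hour slot at the same time on consecutive days and prints the schedule. The task names and the 9 a.m. start are made-up examples; a real version would write actual calendar events, e.g. an .ics file or a calendar API call.)

from datetime import datetime, timedelta

def one_hour_slots(tasks, start_day, hour=9):
    # Give each task a one-hour block at the same time on consecutive days.
    for i, task in enumerate(tasks):
        yield (start_day + timedelta(days=i)).replace(hour=hour, minute=0), task

tasks = ["Register a domain", "Draft the landing page", "Set up payments"]
for slot, task in one_hour_slots(tasks, datetime(2025, 1, 6)):
    print(f"{slot:%a %b %d, %H:%M} - {task}")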
359
00:24:04,849 --> 00:24:05,663
Yep.
360
00:24:06,000 --> 00:24:06,364
Yep.
361
00:24:06,364 --> 00:24:13,007
That level of organization is something most of us can't do well, and we sure as fuck
can't do fast.
362
00:24:13,007 --> 00:24:20,676
Well, and then the trick, the trick that I don't know it will ever solve
for people, is motivation and accountability.
363
00:24:20,777 --> 00:24:22,399
Can you actually do it?
364
00:24:22,399 --> 00:24:25,682
And will you hold yourself accountable for the things you say you want to do?
365
00:24:25,682 --> 00:24:25,992
Yeah.
366
00:24:25,992 --> 00:24:28,943
And how many of your whimsical ideas are actually of value?
367
00:24:29,063 --> 00:24:31,934
Well, you can test it, and you can A/B test it really, really quickly.
368
00:24:31,934 --> 00:24:33,225
And you can even ask it ahead of time.
369
00:24:33,225 --> 00:24:35,486
Like, does this actually make economic sense?
370
00:24:35,486 --> 00:24:37,847
What's this going to cost me to put money in?
371
00:24:37,847 --> 00:24:41,828
What's my rate of return that I'm actually going to see and how quickly will I see it?
372
00:24:42,269 --> 00:24:45,686
And that's a very, very powerful thing.
373
00:24:45,686 --> 00:24:48,701
So you can set expectations correctly and start moving pieces around.
374
00:24:48,701 --> 00:24:53,033
But we're getting to the point where, you know, it might say, well, what's your financial
situation?
375
00:24:53,033 --> 00:24:54,473
How much money you got?
376
00:24:54,637 --> 00:24:56,291
Right, bitch, I'm broke.
377
00:24:56,291 --> 00:24:57,532
How do we do this?
378
00:24:59,818 --> 00:25:01,021
Alright, one last one.
379
00:25:01,021 --> 00:25:06,381
We've hinted at this before, but I couldn't have this conversation without sharing
this quickly.
380
00:25:07,655 --> 00:25:15,886
I know how you are, Jason, but Gary Vee, a very popular guy, talks about how to get out of
your own way, do these things, be successful.
381
00:25:15,886 --> 00:25:20,131
He was asked a very interesting question, something we've talked about here as well.
382
00:25:20,131 --> 00:25:22,934
What are your most insane AI predictions?
383
00:25:23,135 --> 00:25:25,998
That your grandkids will marry an AI human.
384
00:25:29,243 --> 00:25:31,323
What's your conviction level on that?
385
00:25:31,323 --> 00:25:31,873
100%.
386
00:25:31,873 --> 00:25:33,295
That they're gonna marry an AI?
387
00:25:33,295 --> 00:25:36,126
Your grandchild is going to marry an AI human.
388
00:25:36,126 --> 00:25:41,218
I think people are gonna have relationships with fully AI boyfriends and girlfriends,
indistinguishable.
389
00:25:41,218 --> 00:25:42,189
You know Megan Fox, right?
390
00:25:42,189 --> 00:25:43,359
The thing, like, from that movie?
391
00:25:43,359 --> 00:25:44,900
There was a Megan Fox one that just came out.
392
00:25:44,900 --> 00:25:48,661
And when you say an AI, does that mean a physical body of a robot?
393
00:25:48,661 --> 00:25:51,243
So you're saying that the bodies will get so good.
394
00:25:51,243 --> 00:25:51,803
Yes.
395
00:25:51,803 --> 00:25:54,694
I don't know if you know this, but people have sex with sex dolls.
396
00:25:55,194 --> 00:25:55,815
What?
397
00:25:55,815 --> 00:25:56,975
Jack does?
398
00:25:57,187 --> 00:26:00,031
Do you think this just leads to population collapse?
399
00:26:00,031 --> 00:26:02,333
People are not going to reproduce.
400
00:26:02,475 --> 00:26:03,595
Maybe.
401
00:26:03,634 --> 00:26:07,138
Replace your spouse with the fuckbot.
402
00:26:07,760 --> 00:26:09,091
We've talked about this.
403
00:26:09,467 --> 00:26:10,748
It's terrible.
404
00:26:11,684 --> 00:26:19,844
I mean, instant access to...we already have instant access to porn and now we're gonna
have instant access to a squishy hole.
405
00:26:19,844 --> 00:26:20,664
Okay.
407
00:26:22,928 --> 00:26:23,889
I get it.
408
00:26:23,889 --> 00:26:36,396
But I actually think it's somewhat different than that, because I think the human brain...
we're programmed, we've evolved to be attracted to certain
409
00:26:36,396 --> 00:26:38,696
physical characteristics and feature sets.
410
00:26:39,237 --> 00:26:47,422
Wide hips, large breasts for men, you know, under the assumption that these are good
things that reproduce well, that make babies.
411
00:26:47,422 --> 00:26:52,344
But at the same time, there are certain things that we're kind of repulsed by.
412
00:26:52,844 --> 00:27:03,110
Like the whole clown phobia thing: they believe it's somewhat based on the idea of
decaying corpses, with discolored lips and discolored faces being things that make
413
00:27:03,110 --> 00:27:05,231
people so afraid because they think the dead have come to life.
414
00:27:05,231 --> 00:27:13,216
There's all kinds of psychological reasons why this thing may not pan out, but there's
all kinds of other reasons why it will.
415
00:27:13,216 --> 00:27:20,219
So I don't doubt that young men will have an instance of this, and quite possibly young
women as well.
416
00:27:20,219 --> 00:27:21,884
I don't want to exclude anybody from...
417
00:27:21,884 --> 00:27:23,305
expected.
418
00:27:23,305 --> 00:27:36,890
But whether it's artificial intelligence or augmented intelligence, the meat suit
that you're going to choose to be with, or the fucking rubber-metal suit, whatever, is
419
00:27:36,890 --> 00:27:48,495
going to have some kind of input and some kind of mechanism, where the persona and the
presence that they present themselves with, and who they are, is distinctly tied to these
420
00:27:48,495 --> 00:27:50,095
AI LLM models.
421
00:27:50,336 --> 00:27:51,576
There's no doubt.
422
00:27:51,952 --> 00:27:53,593
And I hate to say it, but...
423
00:27:54,493 --> 00:27:59,136
We're already there. Like, you're already sucked into your phone.
424
00:27:59,136 --> 00:28:03,012
Your intelligence is already augmented by things that you wouldn't know naturally.
425
00:28:03,012 --> 00:28:04,479
I mean, that's recorded history.
426
00:28:04,479 --> 00:28:07,360
We're making these things happen faster, more effectively.
427
00:28:07,661 --> 00:28:11,803
It seems super scary because we don't know what it is or how to control those pieces.
428
00:28:11,803 --> 00:28:13,614
But I can break it to you.
429
00:28:13,614 --> 00:28:15,225
We've been living in that universe forever.
430
00:28:15,225 --> 00:28:16,586
You don't have control over shit.
431
00:28:16,586 --> 00:28:18,507
The illusion of it is nonsense.
432
00:28:18,507 --> 00:28:21,382
And the idea that you're going to be able to make these things happen
433
00:28:21,382 --> 00:28:24,013
one way or the other is silly.
434
00:28:24,013 --> 00:28:38,651
I don't know if my grandkids will get married to artificial humans, maybe, but
they're going to get married to some augmented artificial intelligence of some
435
00:28:38,651 --> 00:28:41,302
kind, whether it's in a meat suit or something else.
436
00:28:42,343 --> 00:28:46,666
And we're just blurring this line between us and those things.
437
00:28:46,666 --> 00:28:51,508
And that's probably the safest thing to ensure that humanity survives.
438
00:28:51,654 --> 00:28:55,169
But we might not survive as biological organisms.
439
00:28:55,169 --> 00:28:59,946
We might get ourselves all uploaded and be in these virtual machines or whatever.
440
00:28:59,946 --> 00:29:03,460
Fucking great, fine.
441
00:29:03,721 --> 00:29:05,844
Does my steak taste good?
442
00:29:05,844 --> 00:29:08,869
Yes? Then I don't mind being in the Matrix.
443
00:29:08,869 --> 00:29:10,820
But this is what I wonder about.
444
00:29:10,820 --> 00:29:12,110
I mean, you're talking about evolution.
445
00:29:12,110 --> 00:29:22,613
You know, on this idea of being repulsed by something: if I
walk into a room and I see something that is clearly a robot, no, I'm not going to ask it
446
00:29:22,613 --> 00:29:23,173
on a date.
447
00:29:23,173 --> 00:29:25,194
That's, you know, that's weird.
448
00:29:25,194 --> 00:29:27,674
But if I walk into a room and I can't tell the difference.
449
00:29:28,755 --> 00:29:36,307
Then yeah, it's incredibly likely that I would end up, you know, connecting with a robot
without knowing it, right? Like...
450
00:29:36,307 --> 00:29:38,029
That would be bizarre.
451
00:29:38,816 --> 00:29:43,837
And so let's imagine the scenario where it is a repulsive thing.
452
00:29:43,837 --> 00:29:46,897
You can tell it's a robot and that's weird for plenty of us.
453
00:29:46,897 --> 00:29:50,897
That doesn't matter because that is an easier relationship than an actual human one.
454
00:29:50,897 --> 00:30:01,337
And so, does this start to... let's imagine a scenario where, through genetics, we're
able to reproduce, you know, basically in a lab, through this robot.
455
00:30:01,497 --> 00:30:04,637
Does evolution suddenly favor the
456
00:30:04,719 --> 00:30:12,499
people that are now attracted to robots and can reproduce with them, because it's an easier
relationship, it's a predictable relationship, it's not one that they have to have with an
457
00:30:12,499 --> 00:30:13,586
actual person.
458
00:30:13,586 --> 00:30:18,324
I think if you're gonna make a meat baby...
459
00:30:20,568 --> 00:30:36,881
it's still going to attract the same kind of people, because willingly making an 18-year
commitment to something that kind of looks like you, and might act like you, is a huge
460
00:30:36,881 --> 00:30:37,452
commitment.
461
00:30:37,452 --> 00:30:48,462
I think, if anything, what this would do is get people to stop reproducing, because there
would not be a motivation to make babies if I'm just there to have this kind of
462
00:30:48,462 --> 00:30:49,064
connection.
463
00:30:49,064 --> 00:30:55,624
And I think that's actually a very important thing to kind of discuss and to kind of look
at.
464
00:30:55,928 --> 00:31:10,635
But I also think about the way that you're describing these things: like, I walk in and I
don't discover this thing, and it's going to be The Crying Game, but with robots.
465
00:31:10,635 --> 00:31:12,180
Yes, it totally is.
466
00:31:12,180 --> 00:31:13,770
That's exactly where I was taking this.
467
00:31:13,770 --> 00:31:21,269
Like, checking for an Adam's apple isn't going to do you any good, because they shaved
that shit down and you can't tell anymore, and you reach in between and you find a cyber
468
00:31:21,269 --> 00:31:28,236
port as opposed to a penis or a vagina. That might change the way that you interact with
things.
469
00:31:28,878 --> 00:31:29,798
Which...
470
00:31:30,660 --> 00:31:31,620
I mean...
471
00:31:33,940 --> 00:31:34,698
Yeah.
472
00:31:34,698 --> 00:31:35,349
for some people.
473
00:31:35,349 --> 00:31:36,702
That's gonna work for some people.
474
00:31:36,702 --> 00:31:38,203
I'm just gonna say it.
475
00:31:42,192 --> 00:31:42,847
Yes.
476
00:31:42,847 --> 00:31:43,978
the couch.
477
00:31:44,159 --> 00:31:45,049
I mean-
478
00:31:45,477 --> 00:31:47,724
Hey, but some of them grew up to be vice president.
479
00:31:48,028 --> 00:31:48,829
They do.
480
00:31:48,829 --> 00:31:51,961
And some of them grew up to have brain worms and shoot bears.
481
00:31:51,961 --> 00:31:55,705
And actually, we should try to get Robert F.
482
00:31:55,705 --> 00:31:56,465
Kennedy Jr.
483
00:31:56,465 --> 00:32:00,459
on this program and see what his thoughts and opinions are on AI
484
00:32:00,459 --> 00:32:04,723
and how fucking robots might affect children with autism.
485
00:32:04,723 --> 00:32:09,706
Because what's the vaccination you're going to give to these kids when they're completely
cyber-built and plugged in?
486
00:32:09,827 --> 00:32:11,489
Is it now a digital virus?
487
00:32:11,489 --> 00:32:12,889
Am I going to?
488
00:32:13,771 --> 00:32:15,992
Right, like shit.
489
00:32:17,865 --> 00:32:18,965
Alright.
490
00:32:19,505 --> 00:32:20,286
Yeah, me too.
491
00:32:20,286 --> 00:32:21,437
Let's go get some more coffee.
492
00:32:21,437 --> 00:32:23,748
We're going to do that and that's going to wrap up this episode.
493
00:32:23,748 --> 00:32:24,418
Thanks for listening.
494
00:32:24,418 --> 00:32:27,739
If you've found any of this fun and want to share it with somebody,
495
00:32:27,739 --> 00:32:29,021
I'm not going to get in your way.
496
00:32:29,021 --> 00:32:29,971
Please go ahead and share it.
497
00:32:29,971 --> 00:32:37,961
You can do that at thefitmess.com, and that's where we'll be back in just about a week with
another episode, with hopefully something terrifying that will change your life forever.
498
00:32:37,961 --> 00:32:39,132
Alright, see you then.
499
00:32:39,132 --> 00:32:40,220
See ya, bye.