Why Your Phone Knows You're Depressed Before You Do
My productivity hack: https://www.magicmind.com/FITMESS20 Use my code FITMESS20 for 20% off #magicmind
----
Will robots decide if you keep your nuts based on cancer predictions?
The world of predictive healthcare is here, and it's not the helpful crystal ball we hoped for. Insurance companies are already using AI to analyze your genetic data, social media posts, and digital footprints to predict everything from mental health crises to testicular cancer. The catch? They're not using this information to help you - they're using it to deny coverage and shift financial responsibility back to you when predictions go wrong.
In this episode:
- Learn how AI is currently being used to predict your health outcomes
- Understand the financial and personal risks of genetic data sharing
- Discover practical steps to protect your data and maintain autonomy
Listen to this episode to understand what's at stake before you become a statistic in someone else's algorithm.
Topics Discussed:
- How genetic testing companies are selling your DNA data to healthcare analytics firms
- The nightmare scenario of preventive surgery based on AI predictions with moderate confidence levels
- Why American healthcare profits are driving global surveillance standards
- How social media monitoring can predict mental health episodes before they happen
- The reality of insurance companies using AI to deny coverage based on "prior knowledge"
- Brain-computer interfaces and the subscription model for your thoughts (Black Mirror style)
- GDPR vs. American data protection laws and what rights you actually have
- Why HIPAA doesn't protect you from insurance company data mining
- The difference between humanitarian AI tools and profit-driven surveillance systems
- Practical steps to minimize your digital health footprint starting today
----
MORE FROM THE FIT MESS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to The Fit Mess on YouTube
Join our community in the Fit Mess Facebook group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt and save money faster!
----
00:00:05,462 --> 00:00:06,262
This is The Fit Mess.
2
00:00:06,262 --> 00:00:07,822
We talk about AI and health and wellness.
3
00:00:07,822 --> 00:00:17,225
Today we're going to talk about what the robots might be able to predict about your
mental health, your physical health in the future and how that could impact the decisions
4
00:00:17,225 --> 00:00:27,390
you make to avoid the possible outcomes predicted by robots and ultimately who's
responsible for your decisions based on what the robots tell you to do.
5
00:00:27,390 --> 00:00:29,100
The future is now.
6
00:00:29,100 --> 00:00:31,660
Minority Report, it's all coming true.
7
00:00:31,660 --> 00:00:33,193
The science fiction is here.
8
00:00:33,193 --> 00:00:34,385
We're living in it.
9
00:00:34,385 --> 00:00:36,347
We're watching it unfold in front of us.
10
00:00:36,347 --> 00:00:41,930
And the scenario you just described, Jason, before we pressed record, literally our nuts
are on the line here.
11
00:00:41,930 --> 00:00:46,734
Yeah, predictive analytics.
12
00:00:46,734 --> 00:00:58,163
So, taking the data sets out there and trying to find patterns that let you say, based
upon these input variables, it looks like I am
13
00:00:58,163 --> 00:01:02,777
highly susceptible to these types of diseases, ailments, those types of things.
14
00:01:02,777 --> 00:01:04,589
And we already do this today, right?
15
00:01:04,589 --> 00:01:10,794
We already do it where we go through and we look at someone's genetic profile and we say,
you're more susceptible
16
00:01:10,794 --> 00:01:19,474
to this kind of cancer risk than others, because we've used large language models to go
through and kind of map these pieces and take the human genome itself, sorry, not
17
00:01:19,474 --> 00:01:22,974
language models, machine learning, to go through and actually understand these pieces in context.
18
00:01:22,974 --> 00:01:35,454
So you can already go to like any of the genetic data services out there and say, you
know, tell me, am I more susceptible to cancer or Alzheimer's or dementia or a million
19
00:01:35,454 --> 00:01:36,694
other diseases?
20
00:01:37,034 --> 00:01:40,194
And invariably it comes back with a score.
21
00:01:40,638 --> 00:01:42,599
And that score is a certain percentage.
22
00:01:42,599 --> 00:01:50,543
And if you don't share this data with anybody and you just keep it to yourself, OK, well,
you know what your risks are and you understand those things in context.
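----
A quick aside for the curious: "it comes back with a score" usually boils down to something like a weighted sum over genetic variants compared against a population baseline. Here's a minimal sketch of that idea in Python; the variant IDs, weights, and baseline are invented for illustration and don't reflect how any particular service actually scores you.

# Hypothetical sketch of a genetic risk score: a weighted sum of risk-allele
# counts compared to a population average. Every number here is made up.

VARIANT_WEIGHTS = {
    "rs0000001": 0.42,   # fictional variant IDs and effect sizes
    "rs0000002": 0.17,
    "rs0000003": -0.08,
}

def risk_score(genotype):
    """genotype maps variant ID -> copies of the risk allele (0, 1, or 2)."""
    return sum(w * genotype.get(v, 0) for v, w in VARIANT_WEIGHTS.items())

def vs_population(score, population_mean=0.35):
    """Express the score as a percentage above or below the population average."""
    return (score - population_mean) / population_mean * 100

me = {"rs0000001": 1, "rs0000002": 2, "rs0000003": 0}
s = risk_score(me)
print(f"score {s:.2f}, {vs_population(s):+.0f}% vs. the population average")

That percentage is the number the rest of this conversation worries about once it leaves your hands.
----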
23
00:01:50,804 --> 00:02:02,990
However, 23andMe being bought by basically a healthcare analytics company is
going to take that data and probably not use it to do anything good for you.
24
00:02:02,990 --> 00:02:10,314
They will probably use it to deny medical care and to make it difficult for you to get
treatment because you had prior awareness.
25
00:02:10,314 --> 00:02:14,976
of a known potential bad outcome.
26
00:02:15,278 --> 00:02:26,767
Let's say it said you're highly susceptible to lung cancer and you decided to have a cigar
and AI catches it on video somewhere and it says, oh, well, you broke the rules.
27
00:02:26,767 --> 00:02:27,778
You know, this might happen.
28
00:02:27,778 --> 00:02:33,352
So you are in trouble or you chose to live in an area that's got higher air pollution
because it's close to your work.
29
00:02:34,033 --> 00:02:40,438
Again, this comes down to the concept that your personal responsibility is no
longer just your business.
30
00:02:40,682 --> 00:02:43,622
And that's kind of the scary part.
31
00:02:43,622 --> 00:02:46,982
And at the end of the day, who holds accountability?
32
00:02:47,262 --> 00:02:57,682
So I mean, what we were talking about before was let's say you're at high risk for
testicular cancer and the prescribed treatment for testicular cancer is to remove your
33
00:02:57,682 --> 00:02:58,602
testicles.
34
00:02:58,602 --> 00:03:04,062
And let's say that balancing act is like 51%.
35
00:03:04,062 --> 00:03:09,314
You're 51% more likely to get cancer than the average person.
36
00:03:09,790 --> 00:03:11,551
If you keep your nuts, okay.
37
00:03:11,551 --> 00:03:15,073
Well, I mean do I lower my odds by only keeping one?
38
00:03:15,073 --> 00:03:18,574
Do I have to remove both?
39
00:03:18,574 --> 00:03:24,477
And by the way, what happens if I choose not to remove them and then I do get the cancer?
40
00:03:24,477 --> 00:03:35,784
Am I now on the hook for paying all the medical bills associated with that, and then
losing my nuts anyway? Or what if I cut my nuts off for no good reason,
41
00:03:35,784 --> 00:03:40,095
And then 10 years later, they come back and they go, our analysis was entirely wrong.
42
00:03:40,095 --> 00:03:42,056
Your genetic code for this does not match that.
43
00:03:42,056 --> 00:03:45,136
You are actually regular like everybody else.
44
00:03:45,417 --> 00:03:46,797
Sorry about the no nuts.
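----
Worth pausing on what "51% more likely than the average person" actually means: relative risk multiplies a baseline, so if the baseline is small, the absolute risk can stay small. A back-of-the-envelope sketch, using a purely illustrative baseline rather than a real statistic:

# Illustrative only: the baseline lifetime risk below is a made-up number,
# not a real epidemiological figure.

baseline_risk = 0.004        # hypothetical 0.4% lifetime risk for the average person
relative_increase = 0.51     # "51% more likely than the average person"

my_risk = baseline_risk * (1 + relative_increase)
print(f"average person: {baseline_risk:.2%}")   # 0.40%
print(f"flagged person: {my_risk:.2%}")         # 0.60%

Framed that way, a "51% higher risk" flag can describe a move from 0.4% to 0.6%, which is a very different conversation than remove-them-or-don't.
----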
45
00:03:46,797 --> 00:04:00,501
I mean, this is the kind of horseshit that we're going to have to figure out how to
deal with, because we're giving a lot of respect and we're paying a lot of attention to
46
00:04:00,501 --> 00:04:02,821
these artificial intelligence sources.
47
00:04:03,102 --> 00:04:04,942
And companies are using them.
48
00:04:05,390 --> 00:04:13,454
in ways to block you from getting access to things or to entice you to do certain things
and follow certain patterns and behaviors.
49
00:04:13,454 --> 00:04:20,628
And they're trying to do it because your medical insurance companies don't make
their money by paying out on your services.
50
00:04:20,628 --> 00:04:24,760
They make their money by collecting your premiums and not paying out.
51
00:04:25,181 --> 00:04:28,553
Well, that's not just going to be true for the U.S.
52
00:04:28,553 --> 00:04:30,113
like with our privatized healthcare system.
53
00:04:30,113 --> 00:04:33,465
That's going to start being true for everybody because all of these things cost extra
money.
54
00:04:33,465 --> 00:04:34,986
And at what point do you
55
00:04:35,178 --> 00:04:44,396
countries start saying, well, we simply can't afford to do these things anymore because
of X, Y, and Z, or we know the survivability rate of these things with comorbidity
56
00:04:44,396 --> 00:04:45,767
factors is much lower.
57
00:04:45,767 --> 00:04:48,229
So we're not even going to bother to treat these pieces.
58
00:04:48,229 --> 00:04:57,577
Like, I think we're getting to those kinds of places where before it took a human being to
go through and sort this data and try to understand your charts and context and pull all
59
00:04:57,577 --> 00:05:04,040
this data in and use those predictions to go through and say, well, it seems like you have
a higher
60
00:05:04,040 --> 00:05:06,382
susceptibility to these types of things.
61
00:05:06,382 --> 00:05:07,893
So maybe you should change your behavior.
62
00:05:07,893 --> 00:05:17,662
Where if a company is using AI to do this, and instead of going to a natural path and
getting all this information, understanding these things in context, if the artificial
63
00:05:17,662 --> 00:05:25,769
path goes through and tracks all these data sets and puts them into context that makes
sense for the end operator, i.e.
64
00:05:25,769 --> 00:05:29,993
the insurance company, that's a whole different ballgame.
65
00:05:29,993 --> 00:05:31,493
And we're heading there.
66
00:05:31,722 --> 00:05:34,283
They're going to use this in that kind of fashion.
67
00:05:34,283 --> 00:05:37,444
The next question becomes, what do I do with this information?
68
00:05:37,444 --> 00:05:49,589
And how do I not make myself more panicked and more scared and more susceptible to
stress-related causes of death or anxiety because of this?
69
00:05:49,589 --> 00:05:52,695
And that's, I think that's the part we should chat about.
70
00:05:52,695 --> 00:05:54,046
Well, that's the thing that's interesting.
71
00:05:54,046 --> 00:06:02,942
I was looking at a study just before we jumped on here talking about how this can play
into mental health and the different patterns and different things that can be detected to
72
00:06:02,942 --> 00:06:04,973
help identify problems before they occur.
73
00:06:04,973 --> 00:06:05,154
Right.
74
00:06:05,154 --> 00:06:13,359
So like the things that you are typing online, whatever you're saying on social media, are
those going to be offering clues to some data source that will help indicate whether or
75
00:06:13,359 --> 00:06:16,792
not there is a likelihood for developing anxiety or depression?
76
00:06:16,792 --> 00:06:19,944
And then does that then lead to a better treatment outcome?
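----
For a concrete feel of the pattern-spotting that study describes, here's a toy sketch. The phrase list, posts, and threshold are all placeholders; real screening research uses models trained on clinical data over far richer language and behavior signals than a keyword count.

# Toy sketch: flag when a high share of recent posts contain worrying phrases.
# Phrase list, posts, and threshold are placeholders, not a clinical tool.

FLAG_PHRASES = {"can't sleep", "hopeless", "exhausted", "so alone", "worthless"}

def flag_rate(posts):
    """Fraction of posts containing at least one flagged phrase."""
    if not posts:
        return 0.0
    hits = sum(any(p in post.lower() for p in FLAG_PHRASES) for post in posts)
    return hits / len(posts)

recent_posts = [
    "Another night where I can't sleep.",
    "Great run this morning!",
    "Feeling pretty hopeless about work lately.",
]

if flag_rate(recent_posts) > 0.4:   # arbitrary illustrative threshold
    print("Pattern noticed: maybe check in with someone you trust.")

Whether that flag reaches you as a gentle nudge or reaches an insurer as a risk signal is exactly the tension in the rest of this conversation.
----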
77
00:06:19,944 --> 00:06:21,857
I mean, that's what you described is
78
00:06:21,857 --> 00:06:25,351
honestly the most likely, but also worst case scenario.
79
00:06:25,351 --> 00:06:31,579
But there is a reality out there, I have to believe, where this actually gets
used for good.
80
00:06:31,579 --> 00:06:41,961
And we see on the horizon these bad things that could happen to somebody, and some,
you know, humanitarian, some good Samaritan
81
00:06:41,961 --> 00:06:45,212
sees this and goes, here's how we stop this before it becomes a problem.
82
00:06:45,212 --> 00:06:46,733
Let's move you out of this region.
83
00:06:46,733 --> 00:06:47,914
Let's get you to quit smoking.
84
00:06:47,914 --> 00:06:48,224
Let's go.
85
00:06:48,224 --> 00:06:52,055
I mean, a lot of this, you know, a lot of these analogies I'm using, human beings do now.
86
00:06:52,055 --> 00:06:53,246
Hey, dummy, don't smoke.
87
00:06:53,246 --> 00:06:54,476
As it turns out, it's bad for you.
88
00:06:54,476 --> 00:06:56,337
We have a bunch of data that backs it up.
89
00:06:56,577 --> 00:07:00,515
The robots will be able to pull all that together more quickly.
90
00:07:00,515 --> 00:07:08,815
and see things we don't see. They will be able to see what you're doing online, other sorts
of digital footprints you're leaving, clues that could
91
00:07:08,815 --> 00:07:11,595
potentially lead to a better outcome for you.
92
00:07:11,595 --> 00:07:20,895
And I'm curious too, because the whole time you were talking there, I was thinking, this
is very true for the American healthcare system, because it is 100 % for profit.
93
00:07:20,915 --> 00:07:23,955
But I live in Canada, where it is much more of a socialized system.
94
00:07:23,955 --> 00:07:27,215
And if they can find a way to go, hey, let's cut costs.
95
00:07:27,233 --> 00:07:34,449
because we can actually prevent a lot of these things from happening rather than waiting
until the fire is raging and there's no way to put it out.
96
00:07:34,529 --> 00:07:44,177
It could, I think, knowing very little about the way insurance companies work and the way
all this stuff gets sorted out, it seems like if we can prevent bad things from happening,
97
00:07:44,177 --> 00:07:46,539
we can also save a lot of money in the long run.
98
00:07:47,178 --> 00:07:47,698
Right.
99
00:07:47,698 --> 00:07:51,998
So the question becomes which motivation has higher precedence in the operator.
100
00:07:52,358 --> 00:08:00,538
And if you're talking about governments, governments are supposed to be there for their
people and to make those pieces actually work in a typical democracy.
101
00:08:00,638 --> 00:08:03,758
And I think Canada has one of those.
102
00:08:03,758 --> 00:08:03,938
Right.
103
00:08:03,938 --> 00:08:06,818
And I think most of Western Europe has one of those.
104
00:08:07,518 --> 00:08:07,738
The U.S.
105
00:08:07,738 --> 00:08:16,198
is not. The U.S.'s primary motivation is actually profits, and companies are people,
people with
106
00:08:16,362 --> 00:08:20,881
extra special rights, and they're elevated above regular people.
107
00:08:20,922 --> 00:08:28,162
And that's why so many people have made themselves corporations, wrapped
themselves in this piece, because that gives them those types of inalienable rights
108
00:08:28,162 --> 00:08:30,542
that corporations have that human beings don't.
109
00:08:30,542 --> 00:08:31,462
It's fucked up.
110
00:08:31,462 --> 00:08:32,242
It's weird.
111
00:08:32,242 --> 00:08:33,162
It's horseshit.
112
00:08:33,162 --> 00:08:36,242
It's some stupid fucking mind game.
113
00:08:36,242 --> 00:08:42,662
And at the end of the day, it's the reality of the people that live in the US and in our
for profit health care system.
114
00:08:43,642 --> 00:08:44,622
So
115
00:08:45,778 --> 00:08:58,227
I do think what winds up happening is the U.S. winds up affecting everybody's healthcare
downstream because we are one of the largest providers and builders of medical care record
116
00:08:58,227 --> 00:09:03,951
systems, medical compliance systems, all of these different component pieces that people
use to make decisions.
117
00:09:04,031 --> 00:09:07,834
And because that's who we are, because that's what we provide and that's what we produce.
118
00:09:07,834 --> 00:09:11,536
As much as you want to think I'm safe in Canada, I don't think you are.
119
00:09:11,536 --> 00:09:13,928
I think our ability to fuck things up
120
00:09:13,928 --> 00:09:17,591
in the US is highly contagious and our blast radius is global.
121
00:09:17,591 --> 00:09:19,642
We do make those kinds of mistakes.
122
00:09:19,742 --> 00:09:23,625
And I think we're running headlong into this one and we're going to do that.
123
00:09:23,625 --> 00:09:25,326
Now, don't get me wrong.
124
00:09:25,326 --> 00:09:34,162
I think people and some companies, some organizations will find a way to use AI in not
nefarious ways, not shitty ways, to do the things that you're talking about.
125
00:09:34,162 --> 00:09:36,004
Like, Hey, let's have you move.
126
00:09:36,004 --> 00:09:37,716
Let's have you get away from these power lines.
127
00:09:37,716 --> 00:09:38,475
Let's have you do this.
128
00:09:38,475 --> 00:09:39,736
Let's have you do that.
129
00:09:39,736 --> 00:09:42,598
But the cost basis to be able to do something like that
130
00:09:42,866 --> 00:09:47,528
in the U.S. can be incredibly prohibitive, and you may not actually have access to those
pieces.
131
00:09:47,528 --> 00:09:51,440
So Obamacare, the ACA comes in and says, everybody has to have healthcare.
132
00:09:51,440 --> 00:09:57,272
Well, it wouldn't be that hard for the ACA to also say everyone's DNA needs to be on file.
133
00:09:57,472 --> 00:10:06,756
And I think we are in a particular administration that is highly susceptible to
implementing that as an executive order.
134
00:10:06,896 --> 00:10:10,278
And it doesn't matter if another administration comes in and says, get rid of all of it.
135
00:10:10,278 --> 00:10:11,248
It's done.
136
00:10:12,052 --> 00:10:14,323
Tough shit dude, once you're in, you're in.
137
00:10:14,323 --> 00:10:18,584
Like, it's a fucking Chinese finger trap, for lack of a better term.
138
00:10:18,585 --> 00:10:20,756
When you're in it, you're in it, there's no way to wiggle out.
139
00:10:20,756 --> 00:10:22,246
You're just kind of stuck.
140
00:10:22,246 --> 00:10:25,487
And yeah, you can slowly try to move yourself out.
141
00:10:25,608 --> 00:10:28,329
Yes, I understand it might not be the best analogy.
142
00:10:28,329 --> 00:10:30,709
Maybe the better analogy is Roach Hotel.
143
00:10:30,870 --> 00:10:38,213
We're in a Roach Hotel and we're stuck and we're glued in there and it's a big fucking
party of genetic mayhem.
144
00:10:38,213 --> 00:10:41,374
And once you check in, you don't check out.
145
00:10:41,374 --> 00:10:44,562
And I think that's what our reality is, especially in the US.
146
00:10:45,014 --> 00:10:54,901
Well, not to pull a talking point from the conservative playbook or anything, but this
does open up a door for private industry to create tools, you know, going back to the
147
00:10:54,901 --> 00:10:58,554
mental health analogy, if there's something that you can attach to your Facebook account.
148
00:10:58,554 --> 00:11:09,052
And every time you have reached the 100th post on why I hate Donald Trump, something can
jump in and say, hey, maybe step away, maybe take a break from the news for a week.
149
00:11:09,052 --> 00:11:10,354
It might do you some good.
150
00:11:10,354 --> 00:11:10,593
Right.
151
00:11:10,593 --> 00:11:12,788
There are ways that, like,
152
00:11:12,788 --> 00:11:17,240
removing the insurance company altogether or removing even the medical industry.
153
00:11:17,240 --> 00:11:22,142
There are some tools that I think could be implemented that will be able to monitor.
154
00:11:22,442 --> 00:11:23,903
God, I'm saying this out loud.
155
00:11:23,903 --> 00:11:24,463
I don't even know.
156
00:11:24,463 --> 00:11:32,306
I don't think I even agree with myself, but you can monitor your behavior and help suggest
things that could make your life better.
157
00:11:32,306 --> 00:11:42,330
Because if you fall into these patterns of constantly whining to Facebook or constantly
interacting with your AI tool as your therapist and it goes, you know,
158
00:11:42,921 --> 00:11:45,011
I'm seeing a pattern here.
159
00:11:45,012 --> 00:11:46,752
You come to me for a lot of stuff.
160
00:11:46,752 --> 00:11:48,532
Have you talked to a person?
161
00:11:48,532 --> 00:11:50,333
Have you gone outside lately?
162
00:11:50,333 --> 00:11:52,813
Maybe just give it a shot.
163
00:11:52,813 --> 00:12:03,036
I just think, the optimist in me wants to believe, there's a way that we can
use this to honestly just try to help people.
164
00:12:03,256 --> 00:12:11,276
And I know it's bullshit because I work with enough people that I know that we live in a
system that is designed to just make money from people, to get
165
00:12:11,276 --> 00:12:15,415
money from your pockets into my pockets in the most efficient way possible.
166
00:12:15,477 --> 00:12:20,926
And the actual helping of people seems to be something that nobody seems to give a shit
about anymore.
167
00:12:20,926 --> 00:12:22,787
Extracting profit from pain.
168
00:12:23,068 --> 00:12:28,753
It's like, AI is a refinement tool to help make those things better and faster.
169
00:12:28,753 --> 00:12:36,239
Automation puts those into place in a cold, sterile way.
170
00:12:36,960 --> 00:12:46,238
Adding these pieces together and then using inference models to go through and make, you
know, try to predict future crime or future health or future death, whatever the fuck you
171
00:12:46,238 --> 00:12:47,429
want to call it.
172
00:12:47,469 --> 00:12:48,710
That's reality.
173
00:12:48,773 --> 00:12:49,418
Yeah.
174
00:12:49,418 --> 00:12:50,479
We're already using it today.
175
00:12:50,479 --> 00:12:54,601
We used it in expert systems before we had AI with human beings doing it.
176
00:12:54,601 --> 00:12:59,583
Now we're going to do it at scale and apply the same type of thing to every use case.
177
00:12:59,583 --> 00:13:04,124
Where before you'd have to go through and put a lot of extra work into it and have
somebody look at all these records.
178
00:13:04,265 --> 00:13:05,565
It's really fucking easy now.
179
00:13:05,565 --> 00:13:09,527
So we'll just use it because it's there, it's available, and it's going to happen.
180
00:13:09,787 --> 00:13:17,010
But I do like the idea that there's an opportunity for private companies to come in and
help make some of these things better.
181
00:13:17,502 --> 00:13:19,093
The Facebook example is a really good one.
182
00:13:19,093 --> 00:13:23,905
Like, I've seen X number of posts, are you putting these pieces in this way?
183
00:13:23,926 --> 00:13:31,009
And that might suggest that, you know, perhaps you need to take this course of action, or
move these things in this direction.
184
00:13:31,430 --> 00:13:32,841
That's a great idea.
185
00:13:32,841 --> 00:13:39,435
Those thresholds, those pieces, those components are highly up to the individual to make
those decisions.
186
00:13:39,435 --> 00:13:46,038
So this is a checks and balances problem where if you really want to do this right and you
want an overwatch
187
00:13:46,204 --> 00:13:48,715
over your body, your mind, your mental health and all those.
188
00:13:48,715 --> 00:13:54,218
And you can predict it and put it into play and say, here are what my thresholds are and
then say, go.
189
00:13:54,559 --> 00:13:59,281
That gives you that kind of autonomy over your own health and your own system pieces.
190
00:13:59,281 --> 00:14:00,222
I think that's great.
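----
If you want to picture the "here are my thresholds, then say go" version of this, here's a small sketch. The metrics, limits, and nudges are placeholder examples; the point is that the user defines every rule, and the tool is only allowed to say what the user pre-approved.

# Sketch of a user-configured overwatch: the user sets the rules and the
# allowed nudges; the tool never escalates beyond them. All values are examples.

from dataclasses import dataclass

@dataclass
class Rule:
    metric: str          # e.g. "angry_posts_per_day"
    limit: float
    above_is_bad: bool   # True: flag when the reading exceeds the limit
    nudge: str           # the only thing the tool is allowed to say

MY_RULES = [
    Rule("angry_posts_per_day", 5, True, "Maybe step away from the feed today."),
    Rule("hours_of_sleep", 6, False, "You've been short on sleep this week."),
]

def check(readings):
    """Return only the nudges whose user-defined limits were crossed."""
    nudges = []
    for rule in MY_RULES:
        value = readings.get(rule.metric)
        if value is None:
            continue
        crossed = value > rule.limit if rule.above_is_bad else value < rule.limit
        if crossed:
            nudges.append(rule.nudge)
    return nudges

print(check({"angry_posts_per_day": 7, "hours_of_sleep": 7.5}))
# ['Maybe step away from the feed today.']
----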
191
00:14:00,222 --> 00:14:03,343
I think 90 % of the people on the planet would not use it that way.
192
00:14:03,343 --> 00:14:05,384
I think they'd press the easy button.
193
00:14:05,384 --> 00:14:13,329
And then the first time this tool says, hey, buddy, maybe we should disconnect from
the screen and take a nap for two hours.
194
00:14:13,329 --> 00:14:16,212
Most people go, fuck you, and turn it off.
195
00:14:16,212 --> 00:14:19,247
They're gonna be like, you don't tell me what to do, I'm the boss of me.
196
00:14:19,268 --> 00:14:21,647
Because that's what we are.
197
00:14:21,647 --> 00:14:29,576
Like when we get in these states of mania, which happens all the time when we're stressed
out and going too hard, we don't know how to go
198
00:14:31,730 --> 00:14:41,414
and just relax, because we're not in that Buddhist state of enlightenment and wokeness
that we can go through and go, none of this actually matters.
199
00:14:41,414 --> 00:14:48,527
Fuck it and push it away because we don't do a good job of treating people or of rewarding
people for,
200
00:14:48,527 --> 00:14:58,401
well, not just self-care, but stoic calmness. We reward outlandish behavior.
201
00:14:59,162 --> 00:15:00,502
Communication.
202
00:15:00,724 --> 00:15:09,980
People lean in on communication when it's variable rather than monotone, which means
either super, super high or super, super low.
203
00:15:09,980 --> 00:15:13,128
And if you do something in the middle, nobody's going to pay attention to you.
204
00:15:13,128 --> 00:15:22,749
Like we are on a podcast and I know that most people will not watch this, but I am making
hand signs and doing shit with my hands because I'm trying to help people understand the
205
00:15:22,749 --> 00:15:23,920
emphasis of what's going on.
206
00:15:23,920 --> 00:15:25,691
That's just my speech pattern.
207
00:15:26,131 --> 00:15:30,434
Like we're ingrained to do this and
208
00:15:31,058 --> 00:15:39,409
Getting somebody's attention and being calm and being rational and having these things go
through this way is not something we're good at.
209
00:15:39,409 --> 00:15:44,795
So we make snap emotional decisions where the artificial intelligence does not do that.
210
00:15:44,976 --> 00:15:47,399
It is cold, calculated, monotone.
211
00:15:47,399 --> 00:15:48,711
This is how things are.
212
00:15:48,711 --> 00:15:50,172
This is the result.
213
00:15:51,881 --> 00:15:54,784
Here's the other problem with my own idea, right?
214
00:15:54,784 --> 00:16:05,475
The overwatch tool that you now have plugged into Facebook, it's going to go from a
humanitarian tool to making suggestions that are sponsored by Nike.
215
00:16:05,475 --> 00:16:08,759
You know, like, hey, what if you went outside and went for a run?
216
00:16:08,759 --> 00:16:12,042
Maybe put on some of these new Nikes and see how they feel.
217
00:16:12,042 --> 00:16:14,714
All of a sudden, it's just another advertising mechanism.
218
00:16:14,714 --> 00:16:18,255
Did you watch the first episode of Black Mirror, the new season?
219
00:16:19,496 --> 00:16:20,997
Literally that.
220
00:16:22,158 --> 00:16:31,922
So a woman gets a brain, sorry for the spoiler alert coming up, if you don't want to learn
about the first episode of the new season of Black Mirror, don't listen.
221
00:16:32,162 --> 00:16:39,585
But a woman has a brain problem and they have to replace part of her brain with some
artificial components.
222
00:16:39,585 --> 00:16:41,290
And the way that it does that,
223
00:16:41,290 --> 00:16:46,010
is it communicates to a backend server in a data center via cell towers.
224
00:16:46,230 --> 00:16:50,530
And if she gets out of range of the cell tower, her brain just shuts off and she falls
asleep.
225
00:16:51,290 --> 00:16:58,370
If she wants to use the service, they have various different subscription levels.
226
00:16:58,490 --> 00:17:06,330
And the entry level is like something ridiculous, like $100 a month, like super cheap,
right?
227
00:17:06,510 --> 00:17:08,190
And these people are struggling to get by.
228
00:17:08,190 --> 00:17:09,610
She's a school teacher.
229
00:17:11,130 --> 00:17:19,950
He's in construction and they do a price change because they're coming out of beta and
they've upgraded to a new network.
230
00:17:19,950 --> 00:17:23,270
Well, now they want 300 bucks a month and then it's 700 and then it's 900.
231
00:17:23,270 --> 00:17:25,630
And suddenly they just can't afford it.
232
00:17:25,989 --> 00:17:31,390
And the way that they get around that is they say, well, instead of that, you can have
ads.
233
00:17:31,390 --> 00:17:35,030
And she just randomly starts spouting off ads while she's talking.
234
00:17:35,030 --> 00:17:37,370
She'll be talking about a topic and it will pick up on it.
235
00:17:37,370 --> 00:17:38,986
And then she'll go, did you know?
236
00:17:38,986 --> 00:17:54,206
Blah, blah, blah, blah, blah.
237
00:17:56,299 --> 00:17:58,517
Game over, man, game over.
238
00:18:08,884 --> 00:18:10,336
Brain-computer interfaces.
239
00:18:10,336 --> 00:18:14,921
And when these brain-computer interfaces come along, you're going to want a really, really
good firewall.
240
00:18:14,921 --> 00:18:18,476
And you're going to want to not accept any of the end user license agreements.
241
00:18:18,476 --> 00:18:23,051
And you're going to want to make sure that somebody can remove it.
242
00:18:23,051 --> 00:18:25,724
And I don't know what's going to happen.
243
00:18:26,741 --> 00:18:28,945
So then I'll take it back to the question that you asked.
244
00:18:28,945 --> 00:18:30,147
What do we do about it?
245
00:18:30,147 --> 00:18:36,758
We've resumed our doom and gloom talk about the inevitable end of society thanks to AI.
246
00:18:36,758 --> 00:18:38,110
What do we do to protect ourselves?
247
00:18:38,110 --> 00:18:39,074
That's true, it's Wednesday.
248
00:18:39,074 --> 00:18:41,094
The doom and gloom one is on hump day.
249
00:18:41,094 --> 00:18:44,234
The happiness one is on Friday, so we go into the weekend feeling good.
250
00:18:44,314 --> 00:18:48,054
This is the "shit, I can't believe I've got to grind through the rest of the week" one.
251
00:18:48,334 --> 00:18:50,314
And then Friday, it's the joyous one.
252
00:18:50,314 --> 00:18:51,974
So what do we do about it?
253
00:18:51,974 --> 00:19:01,494
So short of, like, abandoning society and moving to a cabin in the woods, which I think is
what you could actually do about it, and which would probably solve 99.9% of all of
254
00:19:01,494 --> 00:19:04,614
the first world problems that most of us experience these days.
255
00:19:07,370 --> 00:19:08,050
Check your data.
256
00:19:08,050 --> 00:19:10,932
Don't allow any end user...
257
00:19:10,932 --> 00:19:14,054
Don't accept any end user license agreement on anything.
258
00:19:14,054 --> 00:19:15,344
Disconnect from social media.
259
00:19:15,344 --> 00:19:18,970
Stop listening to podcasts and dumb-dumbs like us who actually don't fucking know
anything.
260
00:19:18,970 --> 00:19:22,839
Or who are just here talking about summaries of information that we've pulled in.
261
00:19:22,839 --> 00:19:25,060
Which, by the way, I think we're not just dumb-dumbs.
262
00:19:25,060 --> 00:19:33,825
But realistically speaking, looking at your social media footprint is probably a really
good starting point.
263
00:19:33,825 --> 00:19:36,616
So if what you're doing,
264
00:19:37,052 --> 00:19:42,705
is out there rage blasting or rage tweeting, whatever you do on your social media
platforms.
265
00:19:42,785 --> 00:19:44,286
Maybe stop that.
266
00:19:44,486 --> 00:19:50,930
And that's good advice in general because things are watching that and they will start to
correlate those pieces and put those pieces into play.
267
00:19:51,050 --> 00:20:03,627
Next, if you have subscribed to a DNA service of some kind that's tracking your
information and has good information on you, start looking into ways to go through and ask
268
00:20:03,627 --> 00:20:05,650
to get your data back and have your data destroyed.
269
00:20:05,650 --> 00:20:08,771
Now, if you're in Europe, GDPR requires them to do that.
270
00:20:08,771 --> 00:20:11,583
In the US, there's no data protection rights around that.
271
00:20:11,583 --> 00:20:13,193
But if you live in California, there is.
272
00:20:13,193 --> 00:20:15,774
And other states are beginning to adopt these laws as well.
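----
If you want to act on that, the request itself is simple. Here's a rough sketch of a deletion-request generator; the names and contact address are placeholders. In the EU the hook is GDPR Article 17 (the right to erasure); in California it's the CCPA's right to delete. Check which law actually applies to you before leaning on either.

# Rough sketch: draft a data-deletion request. Recipient, names, and emails
# are placeholders; confirm the legal basis that applies to you.

TEMPLATE = """To: {privacy_email}
Subject: Request to delete my personal data

I request deletion of all personal data you hold about me, including genetic
data and anything derived from it, under {law}. Please confirm completion and
list any data you are legally required to retain.

Name: {name}
Account email: {account_email}
"""

def deletion_request(name, account_email, privacy_email, eu_resident=False):
    law = "GDPR Article 17 (right to erasure)" if eu_resident else "the CCPA right to delete"
    return TEMPLATE.format(name=name, account_email=account_email,
                           privacy_email=privacy_email, law=law)

print(deletion_request("Jane Doe", "jane@example.com", "privacy@example-dna-service.com"))
----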
273
00:20:15,774 --> 00:20:24,810
I don't know what Canada's digital data footprint laws are, but it's worth
investigating.
274
00:20:24,810 --> 00:20:29,000
But lots of countries are starting to do this because they want people to have autonomy
over their data.
275
00:20:29,220 --> 00:20:33,022
So once you get that, then you can start taking those pieces out.
277
00:20:34,002 --> 00:20:36,143
and try to protect it as best as you can.
278
00:20:36,143 --> 00:20:45,068
And I think that's how these things are going to make sense for people that have to
have medical insurance, where you have to give up your end user rights to your data.
279
00:20:45,348 --> 00:20:46,699
That's a whole different scenario.
280
00:20:46,699 --> 00:20:51,952
And HIPAA is supposed to protect this, but HIPAA only applies to the medical community.
281
00:20:51,952 --> 00:20:57,114
And oddly enough, insurance companies are not considered to be part of the medical
community, even though they have doctors.
282
00:20:57,114 --> 00:21:03,902
And even though they can look at your medical records, in those scenarios they don't
necessarily have to
283
00:21:03,902 --> 00:21:07,105
follow all of the HIPAA rules when they take your data.
284
00:21:07,105 --> 00:21:11,869
Now they might have to go through and anonymize certain data sets, pull those pieces
apart, back them out.
285
00:21:11,869 --> 00:21:15,533
But the compliance legislation around it is not clear.
286
00:21:15,533 --> 00:21:22,019
And all it really covers is the handling of records and who you are and are not
allowed to give them to.
287
00:21:22,019 --> 00:21:29,165
But as an individual, if you sign up for an insurance company, part of their license
agreement says that you have to give them full access to your medical information.
288
00:21:29,306 --> 00:21:30,146
So.
289
00:21:30,686 --> 00:21:36,954
This is one of those things where you're just going to have to be very, very careful with
the input sources that you're allowing people to look at.
290
00:21:36,954 --> 00:21:41,859
And the best way for you to do that, again, is to move out to a cabin in the woods.
291
00:21:42,040 --> 00:21:47,816
Short of that, disconnect as much as possible, and then sanitize your data
streams.
292
00:21:49,280 --> 00:21:58,942
Much of what you just said is terrifying, but I'm gonna echo it: look at what you're
doing on social media, not only for all of these reasons, but also because
293
00:21:59,083 --> 00:22:08,035
you're not gonna change anyone's mind. There is no Facebook post in the history of Facebook
that ever made anyone go, god, you know, I never thought of it that way, huh, you're right. So
294
00:22:08,035 --> 00:22:15,997
just stop, because you're not helping yourself. You're just terrorizing yourself, you're
terrorizing all your friends. Just stop with the rage posting. Let's try and have
295
00:22:15,997 --> 00:22:19,108
some fun, people. Come on, what's wrong with you? All right.
296
00:22:19,232 --> 00:22:20,252
We're going to have some fun.
297
00:22:20,252 --> 00:22:20,952
We just had some fun.
298
00:22:20,952 --> 00:22:21,612
We're going to have some more.
299
00:22:21,612 --> 00:22:22,412
We're going to wrap this one up.
300
00:22:22,412 --> 00:22:26,712
Thanks so much for watching this on YouTube or listening to this on your favorite podcast
player.
301
00:22:26,712 --> 00:22:28,592
Come back in about a week to get another one.
302
00:22:28,592 --> 00:22:31,532
It's also going to be at our website, thefitmess.com.
303
00:22:31,532 --> 00:22:32,712
And yeah, we'll see you there in about a week.
304
00:22:32,712 --> 00:22:33,192
Thanks so much.
305
00:22:33,192 --> 00:22:33,709
See you soon.
306
00:22:33,709 --> 00:22:34,289
Bye bye.