
Great Reset Elites are Planning a Post-Human Future | Whitney Webb | The Glenn Beck Podcast | Ep 269

1h 37m 11s · 16,783 words · 2,443 segments · English

FULL TRANSCRIPT

0:00

The last time I spoke with this week's

0:03

guest, the great reset was in full

0:05

swing. Klaus Schwab and the World

0:06

Economic Forum ruled the world. And my

0:09

guest predicted that the global elites

0:10

were going to use AI and transhumanism

0:12

to create a new class of slaves. Now,

0:16

fast forward 3 years and it seems like

0:17

Donald Trump has destroyed the WEF and

0:20

ESG in America. The rest of the world is

0:23

spiraling towards total government

0:25

control and AI is becoming a part of our

0:28

daily lives. So what's happening now?

0:31

Where does she see us now? Is the great

0:33

reset really dead? Uh or have the global

0:36

elites just pivoted? And and what's

0:39

happening with um digital ID, which has

0:43

just been uh released? Is that part of

0:46

everything kind of spooky that we've

0:48

talked about in the past? Please welcome

0:50

back to the podcast one of the world's

0:52

leading researchers on these issues,

0:54

Whitney Webb.

0:59

[Music]

1:09

Hello, Whitney. Welcome back. Glad to

1:12

have you.

1:12

>> Hi. It's been a while. My pleasure.

1:15

Thanks for having me back on.

1:16

>> You bet. Um, you know, last time we

1:18

spoke it was I think it was right after

1:21

COVID. You had just released your book

1:23

on Epstein, which is fabulous. Um uh I

1:28

just had released a book on the World

1:31

Economic Forum and the Great Reset

1:33

>> and we were talking about this about the

1:37

Great Reset and the World Economic Forum

1:39

and you said that's only really one part

1:45

of this big global octopus.

1:49

And uh and I was so hyperfocused on the

1:52

World Economic Forum and what they were

1:54

doing. I have to ask you,

1:57

has the World Economic Forum been

1:59

sacrificed? Did we win? Because it kind

2:02

of went to the wayside, but I know

2:05

they're not gone. Klaus Schwab was

2:07

exposed. Of

2:08

>> course, they're not,

2:09

>> right? They're not gone. So,

2:11

>> have they just mutated? Is somebody else

2:14

taking their place? Did they pass the

2:16

torch? What What is happening?

2:20

>> Yeah. So I would argue let's look let's

2:21

let's go back to what the WEF is by its

2:24

own description. It's the premier

2:26

promoter of the public private

2:28

partnership. So I think a lot of the

2:30

policies they attempt to sell people

2:32

through the public sector i.e.

2:34

governments were exposed and I think

2:37

they've gone to the other side of the

2:38

public private partnership

2:41

um and are trying to uh market some of

2:43

their policies that are uh unpopular

2:46

with significant segments particularly

2:48

in the west uh you know via the private

2:51

sector I would uh

2:52

>> can you give me examples

2:54

>> I think that's what's happening

2:57

>> uh yeah so I guess one example would be

3:00

uh let's take what's happening in in

3:01

Britain for example with uh the

3:03

so-called Brit card and digital ID.

3:06

So obviously there's been a lot of

3:08

political push back to that um from from

3:11

Keir Starmer

3:13

his intention to uh frame this as

3:17

a way to solve illegal immigration which

3:19

is absolutely a ludicrous idea.

3:20

>> Yeah. Madness.

3:21

>> Uh yeah. Yeah. Completely uh insane. And

3:25

so, um, and then they of course come out

3:27

and said that soon, you know, it wasn't

3:29

just limited to, uh, its use as an

3:32

alleged work permit. It would expand to

3:34

all, uh, facets of life. Um but actually

3:39

um if you look at how the UN has labeled

3:43

or has sort of laid out the the its plan

3:47

really uh to have digital ID implemented

3:50

at a global scale. It's not to have it

3:52

be a centralized digital ID like the

3:55

Brit card has been proposed. Um instead

3:58

it's meant to be a vendor agnostic

4:00

system whereby you would have different

4:02

vendors um sell a digital ID type of

4:07

platform and so to the public the public

4:10

would see it as decentralized and all

4:12

these different private sector uh

4:14

partners in digital ID that they have

4:17

the illusion of choice between them but

4:18

really all that data is meant to be

4:20

interoperable

4:22

um and so that it can all be harvested

4:24

off of any of these um you

4:27

uh different digital ID platforms and

4:29

collated en masse in a single

4:31

database. Uh because ultimately if you

4:34

were to have something like the Brit card for

4:36

example happen, you would have all of

4:37

the data be harvested into a single uh

4:40

library, what Tony Blair's institute for

4:43

example calls the national data library.

4:46

>> Um something like that. So, you could

4:47

have that happen with Keir Starmer's

4:49

Brit card or something else um that it

4:53

comes from, I don't know, five or six

4:55

five or six different companies offering

4:57

different forms of digital ID. Uh but

4:59

all of that data could still be

5:00

harvested um you know, from all of those

5:04

um different vendors because they all

5:06

agreed to specific standards. And if you

5:08

look at some of these alliances about

5:10

digital ID uh that were a focus during

5:14

um COVID for example like the ID 2020

5:17

alliance for example uh they were all

5:19

about getting all these different

5:20

vendors of digital ID to agree to the

5:23

same set of international standards so

5:25

that they could harvest the data from

5:28

any digital ID no matter who makes it

5:30

and have it held in the same global

5:32

centralized database. So um there are

5:35

different ways to get

5:37

what they ultimately want, but it all

5:40

comes down to public perception. So, a

5:42

colleague of mine who I've worked with

5:44

closely on digital ID um uh for a few

5:48

years now um Ian Davis recently wrote

5:50

about what's going on in the UK. He's

5:52

based there. Um and he posited that

5:54

maybe what Keir Starmer is doing is

5:56

actually a bait and switch. um that to

5:59

create all this unpopularity about this

6:01

style of digital ID, but then someone

6:03

later could come in riding the wave of

6:05

the discontent that this is creating and

6:07

then offer a new solution uh which would

6:10

be more along the lines of what I just

6:12

described which is actually how the UN

6:14

itself and SDG 16

6:16

uh which is the SDG that includes uh

6:18

digital ID uh you know the road map laid

6:21

out there uh is not the same as the one

6:24

laid out uh by Keir Starmer. So in

6:28

that you still have a public private

6:29

partnership, right? But it would be the

6:31

private leading as opposed to the public

6:33

leading. Um and what we're seeing come

6:35

out of the UK right now is is being sold

6:38

as a public leading thing and it's

6:39

grossly unpopular. Um and I think

6:42

they're a lot smarter than people give

6:44

them credit for. Um, I mean, they're uh

6:47

fundamentally very uh manipulative and

6:49

they want us to get stuck with the same

6:52

um policy, but they're very adept at

6:54

selling it uh different ways and they

6:57

know that they've become very unpopular

6:58

with large segments of the population.

7:01

Um, and so, you know, like a chameleon,

7:02

they have to take a different form, but

7:04

ultimately the goal is to lead people to

7:06

the same um type of uh you know,

7:09

technocratic uh Orwellian system.

7:13

>> Couple things. First of all, I have for

7:16

years now

7:18

looked at what is being done to us with

7:21

both horror and also

7:25

in a way strange admiration. They they

7:28

they are so thorough. They are so

7:33

well thought out. The structure of this,

7:36

the fallbacks, the the use of behavioral

7:40

scientists and everything else. At some

7:42

point a book is going to be written that

7:45

says look at how all of this was

7:48

designed. I mean it is

7:49

>> probably many books.

7:51

>> Yeah. It is it's really

7:54

it's it's it's

7:57

incredible to me how many great minds

8:01

have spent so much time trying to

8:04

enslave their fellow human beings, you

8:07

know.

8:10

>> Yeah. Uh I I think it's because a lot of

8:12

the people uh that seek to uh enslave

8:16

the vast majority of of humanity uh have

8:19

a lot of capital uh that they uh want

8:22

to uh devote to this unfortunately. Um

8:27

and um unfortunately we also know that

8:30

money can buy you essentially anything

8:32

in today's world including armies of

8:34

behavioral psychologists and any

8:37

other number of other specialists. Um

8:39

but ultimately you know I think a lot of

8:41

them are increasingly relying on um

8:43

artificial intelligence to be able to do

8:45

this uh at scale. And so I think um uh

8:49

the advent of the era of you know AI

8:51

generated content also enables them to

8:55

um you know tweak uh things faster and

8:58

also to um manipulate our attention in

9:01

ways that uh you know are just really

9:03

being uh discovered and maybe won't be

9:06

discovered uh you know for a long time.

9:08

um with you know increasingly

9:11

significant impacts on human behavior

9:13

and also on human uh

9:16

perception. So um yeah I think

9:19

ultimately it's never been more

9:20

important to be a critical thinker and

9:22

to do as much of your own research as

9:24

possible and the best way to do that

9:26

research like what I just talked about

9:28

regarding you know the UN and digital ID

9:30

and how they say it you know it's in

9:32

their own documents you just have to go

9:34

in and read it and not everyone can do

9:35

that. Um but if these are issues that

9:37

particularly concern you, we

9:40

absolutely should uh you know make

9:42

that effort. And also I think uh you

9:44

know in the COVID era for example a lot of

9:47

people were against these particular

9:49

policies digital ID uh being one of

9:51

them. But these people will repackage

9:53

and rename and sell you the same policy

9:57

under different metrics and under a

9:58

different name with a different face

10:00

that they deem, you know, uh, you know,

10:03

polling shows they're more politically

10:05

palatable to that particular demographic

10:07

or what have you. So, I think the more

10:08

we focus on the policies that we don't

10:10

want, uh, the better off we'll be

10:12

instead of the person selling it to us.

10:15

Um,

10:16

>> and you know, the new buzzword that's

10:18

following it around.

10:18

>> And we never seem to learn. I mean, this

10:20

is what they did with the Federal

10:21

Reserve, you know, with the Federal

10:22

Reserve Act 1913.

10:25

>> This is what they did with the Patriot

10:26

Act. That thing was written, you know,

10:27

two years, three years before 9/11. They

10:30

tried to package it. Didn't work. Just

10:33

repackaged, waited for the right moment.

10:35

I mean, this is the way they do it. For

10:37

anybody who is not truly up on digital

10:42

IDs and why this is so important, can

10:46

you explain what digital ID means if we

10:50

begin to implement them?

10:54

>> Yeah. Yeah. Well, digital ID is really

10:56

the linchpin to uh you know the

10:58

sustainable development goals as well as

11:01

this mass surveillance paradigm that's

11:03

being sold to us uh by oligarchs on the

11:06

left and the right. It would be your

11:08

unique identifier for the digital world.

11:10

The goal is to have it be uh the way for

11:12

you to uh offer your credentials to

11:15

every service uh that you access period.

11:18

Everything ranging from healthcare to

11:20

telecommunications, your social media

11:22

accounts. And as things become

11:23

increasingly digitally

11:25

connected, you know, perhaps even your

11:27

appliances, if they're smart appliances,

11:29

at some point won't uh function without

11:31

you having the proper credentials uh to

11:34

show that it's you.

11:37

So ultimately, if people want to uh

11:39

fight against this mass surveillance

11:41

paradigm and these efforts to usher

11:43

us into into you know a very dark I

11:46

would argue a technocratic future, the

11:48

most important thing is to not comply uh

11:51

with digital ID because it's the single

11:52

most important piece of infrastructure

11:55

uh that they need and they need us to

11:57

voluntarily consent because even if they

11:59

roll it out, it will

12:02

fail if people decline to use it. So

12:05

ultimately so much effort is being spent

12:08

on convincing us to adopt it. And so we

12:10

need to be laser focused on that policy

12:12

and say no thank you.

12:13

>> Let me play the devil's advocate that

12:16

you hear every time we take a

12:18

bad step towards more digital

12:20

surveillance. Well, I don't have

12:22

anything to hide. I don't really care. I

12:24

don't have anything to hide. Why why why

12:28

is that,

12:30

you know, a kindergarten answer?

12:33

Well, I would argue because a lot of

12:35

these uh companies that are engaged in

12:37

these uh mass surveillance or the

12:39

contractors really that are engaged in

12:40

in mass surveillance don't ultimately

12:42

have just watching what you're doing uh

12:45

as being enough for them. They're

12:47

ultimately interested in things like

12:49

predictive analytics and predictive

12:51

policing. So based on your behavior now

12:54

and your behavior in the past, they want

12:56

to use artificial intelligence to

12:58

determine what you may do in the future.

13:01

And in the case of predictive policing,

13:03

that would be well, we've determined

13:04

that you may commit a crime in the

13:06

future, and so we're going to uh, you

13:09

know, uh, send you to uh, a

13:11

court-ordered physician or, you know,

13:14

detain you or issue house arrest to

13:17

stop crime before it happens.

13:19

Essentially,

13:20

>> Um, is that where a lot of these companies are headed?

13:22

>> Well, yeah. Uh and unfortunately it is

13:24

that and there's a lot of companies that

13:26

have uh made um massive inroads uh in in

13:31

that type of technology even though it's

13:33

been hugely discredited. Um there's

13:35

several companies I think the most

13:36

notorious at this point is called or was

13:39

called PredPol. They've since rebranded

13:41

but they were uh less accurate than a

13:43

coin toss and people were being uh you

13:46

know deprived of of their liberty uh

13:48

because of a of an algorithm that was

13:50

hugely inaccurate. uh and ultimately you

13:54

know if you look in the UK for example

13:55

some of these algorithms for facial

13:57

recognition have been rolled out even

13:59

though they've been shown over there too

14:00

to be hugely inaccurate and there's no

14:02

interest in changing uh vendors even

14:05

when this inaccuracy is demonstrated. So

14:07

to me that says that their goal is to

14:10

induce obedience in us by the

14:13

fact that you're being watched all the

14:15

time and anything you may do uh could be

14:17

used against you even if you're not

14:19

doing anything wrong now. um an

14:22

algorithm could determine that uh

14:24

certain you know errant behaviors uh

14:26

warrant you being added to a list of

14:28

some type and actually Larry Ellison of

14:30

Oracle who is one of the main funders of

14:32

Tony Blair's uh institute that's one of

14:35

the biggest pushers for digital ID uh in

14:37

the UK said this at an Oracle uh

14:39

shareholder meeting that you know we're

14:41

recording and surveilling everything and

14:42

citizens will be on their best behavior

14:45

>> Terrible.

14:46

>> Because they have to, essentially paraphrasing.

14:48

>> The fact that Donald Trump is

14:50

listening to that guy is terrifying to

14:53

me. I mean, he is he has put some people

14:56

around him on this tech board that are

14:59

not friends of freedom and liberty.

15:01

They're just not. Larry Ellison is

15:04

leading that pack.

15:07

>> Yeah. A lot of them are are are, you

15:08

know, I would argue overtly and also

15:11

covertly globalist. Um you have people

15:13

uh you know in that network you just

15:15

mentioned serving for example on the

15:17

steering committee of the Bilderberg

15:19

group uh which is you know a well-known

15:23

closed-door meeting uh globalist confab

15:26

um and unfortunately um you know I think

15:29

some of them anyway have

15:31

been able to characterize their policies

15:33

as uh libertarian for example uh even

15:37

though some of those same oligarchs are

15:38

on record saying that the free market is

15:40

for losers. uh if you want to get rich

15:43

build a monopoly and build monopolies uh

15:46

they have unfortunately uh but I think

15:48

again this is what uh I was saying

15:50

earlier about um the world economic

15:53

forum you know there's an effort to sell

15:55

this uh since they couldn't sell it from

15:57

the left uh the goal now is to try and

16:01

sell it somehow uh from the right uh and

16:04

to try and frame it under metrics and

16:06

dialectics that'll be more appealing uh

16:09

to the group that was most against these

16:10

policies just a few years ago. Um, and

16:14

unfortunately, you know, with AI and all

16:15

of that, it potentially could happen. It

16:17

could happen if people aren't

16:19

vigilant. You know, just a few years

16:21

ago, someone like Elon Musk was a major

16:23

promoter of things like carbon markets

16:26

>> um and pricing carbon for example. And

16:28

that was actually why he had a falling

16:30

out with Trump in Trump's first

16:32

administration was because uh Trump

16:34

pulled out of the Paris agreements and

16:35

Elon Musk was like, "Well, I can't have

16:37

that." Um, so have these oligarchs

16:40

really changed or have they instead

16:42

tried to make themselves more appealing

16:44

because they've noticed the change in

16:46

public opinion uh and want to uh try and

16:49

get you know us to continue to buy into

16:52

uh their solutions uh that they uh have

16:55

a lot of money to spend convincing us

16:57

are actually good and rebranding them.

16:59

>> So how

16:59

>> and again this is why I say it's

17:01

important to focus on the policies

17:03

specifically. How do we um well wait

17:07

before I get there, let let me go back

17:08

to digital ID. Tie this into a digital

17:12

currency

17:13

because this is the this is the the

17:16

highway system for that, isn't it?

17:19

>> Sure. Yeah. Well, um Larry Fink is now

17:23

running, I believe, the World Economic

17:25

Forum. He's acting chairman.

17:28

>> Yeah.

17:30

In addition to saying that everything

17:32

will be tokenized, he's uh said that

17:35

everything will soon be uh on the same

17:38

universal uh digital ledger or database.

17:42

Um and that everything on that database

17:44

will have a unique identifier number. So

17:47

for you as an individual, your

17:50

identifier number uh will presumably be

17:53

your digital ID or directly linked to

17:55

that. But everything will have a digital

17:58

ID. Uh the tokenization agenda in

18:01

particular seeks to tokenize uh not just

18:04

you know assets that we traditionally

18:06

think of um like real estate for example

18:10

or gold or you know physical assets

18:12

as well as digital

18:14

>> assets like Bitcoin uh there's a major

18:17

effort uh connected with people like

18:19

Fink and also people like Mark Carney

18:21

who's now uh prime minister of Canada

18:24

to tokenize uh the natural world and

18:27

transform it into financial assets.

18:29

And there was an attempt to do

18:30

this to an extent under the Biden

18:32

administration I believe through this

18:34

the department of interior uh with

18:36

natural asset corporations but that has

18:38

not gone away uh and there are groups um

18:42

for example uh one of the creators of

18:44

the ETF uh model originally uh which

18:47

BlackRock now owns iShares his

18:50

name is Peter Kesz I think is how you

18:52

pronounce it he's trying to turn um the

18:55

Amazon rainforest uh into a digital

18:58

commodity uh sort of similar to

19:02

Bitcoin in terms of like the

19:03

scarcity uh idea that you know each

19:06

hectare of the Amazon rainforest would

19:08

represent um you know a token and then

19:11

and then financialize it that way and

19:13

then each hectare would have its

19:16

unique identifier right on on the on the

19:19

blockchain

19:20

>> and would be you know serviced uh by

19:23

surveillance drones and all sorts of

19:25

stuff. So even our most like natural the

19:28

places we conceptualize as the most

19:30

natural places on earth, these people

19:32

want to come and place surveillance

19:34

technology and you know tokenize it and

19:36

put it on a blockchain and use it to um

19:39

you know I would argue in the case

19:41

particularly of natural asset uh

19:43

corporations and the group behind it the

19:45

intrinsic exchange group um they just

19:47

want to open up a huge new asset class

19:49

they call it nature's opportunity so

19:52

that they can continue engaging in the

19:54

same type of uh bad behavior that for

19:56

example brought us the 2008

19:58

financial crisis u by you know uh

20:02

quintupling basically uh the amount of

20:05

assets currently in play um it's um

20:08

>> you know I had a guy

20:09

>> very insane

20:10

>> I had a guy who worked uh very, very

20:14

high up at uh Citibank and he told me

20:16

around 2008 uh he said Glenn you know

20:19

don't worry about the financial system

20:21

I'm like aha and uh he said um you know,

20:26

we're never going to go broke. I mean,

20:27

do you know how much just the national

20:29

parks are worth? And I looked at him and

20:31

said, "Are you seriously telling me that

20:34

we should commoditize the national

20:37

parks?"

20:38

And he said, "It's going to happen."

20:41

>> And I wonder now if this is what he was

20:43

talking about, if it was just a digital,

20:46

not actually selling them, it's just a

20:48

digital

20:50

commoditization of our parks.

20:53

>> Yeah. So apply this now to the the

20:55

phrase that we all heard during the COVID

20:57

era, you'll own nothing and be happy.

20:59

Well, there's certain people that want

21:01

to own everything. And that includes

21:03

things that have never been able to be

21:05

owned before that were considered things

21:08

like the public commons like rivers,

21:10

lakes, the ocean itself, natural

21:13

forests, all sorts of it. These people

21:14

want to put all of that um into the

21:17

financial system, fractionalize it,

21:20

tokenize it, and sell

21:22

pieces of it around. Uh you know, use it

21:24

to speculate on. I mean, it's

21:28

very bonkers. So,

21:30

>> yeah. And so, this is just one aspect of

21:32

the the digital currency play.

21:34

Obviously, there's a lot more than that

21:36

just going on as well. Um, I would argue

21:38

that a lot of this push, particularly in

21:40

the US, um, for dollar stable coin

21:43

supposedly being better than a central

21:45

bank digital currency, also falls into

21:47

this, uh, paradigm we talked about

21:49

earlier of, you know, moving from the

21:52

public to the private of the public

21:54

private partnership because um, a lot of

21:56

these stable coin issuers, you know, if

21:58

the big concerns about uh,

22:01

CBDCs were that they're seizable,

22:04

they're surveillable, and they're

22:05

programmable. Well, all of those three

22:07

things also can apply to stable coins.

22:09

The only difference is that you would

22:11

have a private company issue it

22:14

and control it. But we've seen time and

22:16

again how a lot of these private

22:17

entities are willing to do that. Uh when

22:20

contacted, just look at how Bank of

22:21

America behaved with January 6. Uh

22:23

people accused of wrongdoing on that

22:25

day, for example. Um you know, they have

22:27

no qualms in doing that uh and engaging

22:29

in in those type of activities. And the

22:31

biggest uh dollar stable coin issuer uh

22:34

Tether which just hired Bo Hines uh from

22:38

the White House um they have uh openly

22:42

said that they are uh a close partner of

22:44

the US government for dollar hegemony uh

22:47

globally and have onboarded uh the FBI,

22:49

the Secret Service and other aspects um

22:52

of the US government onto its platform

22:54

directly and have seized uh tethers you

22:57

know from people uh just because you

23:00

know the government told them to and

23:01

this was during the Biden

23:02

administration. So they obviously are

23:04

willing to do that under any

23:05

administration and it's uh essentially

23:07

functioning as a de facto public private

23:09

partnership even though we're being told

23:12

um it's a it's much better than a CBDC

23:14

but in terms of its impacts on civil

23:16

liberties you know that's not

23:17

necessarily true. So again vigilance is

23:19

is important here.

23:23

More with Whitney in just a second, but

23:24

right now the average American is still

23:26

finding it difficult to pay expenses

23:28

every month and in most cases there is

23:29

nothing left over to cover the extras.

23:32

Most aren't getting a big raise and

23:33

expenses are being so high it can be

23:35

really hard to manage without grabbing

23:36

for the credit cards. But listen, if

23:38

you're a homeowner and you are

23:39

frustrated with that endless cycle that

23:41

only produces more debt, I want you to

23:43

take 10 minutes today and give American

23:46

Financing a call. If you're constantly

23:47

carrying a credit card balance every

23:49

single month with an interest rate in

23:51

the 20s or even 30s, American Financing

23:53

can show you how to put your hard-earned

23:55

equity to work and get out of debt. They

23:57

have salary-based mortgage consultants

23:59

that are saving customers an average of

24:00

$800 a month. And that could be you. So

24:03

get started today. You may not have to

24:05

make next month's mortgage payment. No

24:07

upfront fees. Doesn't cost you anything.

24:08

To find out how much you could be

24:10

saving, call American Financing

24:11

800-906-2440.

24:14

800-906-2440

24:16

or AmericanFinancing.net.

24:19

Let me go to AI because it's all

24:22

connected unfortunately.

24:24

Um

24:26

AI AI is one of the most exciting things

24:30

man has ever come up with and also the

24:33

most terrifying thing man has ever I

24:35

mean it it makes uh nuclear weapons look

24:39

like Romper Room or you know some sort

24:41

of preschool game. Um it is uh

24:46

it is frightening in the fact that you

24:50

don't really know who's programming it.

24:53

Um it's going to be ubiquitous. It's

24:56

going to be everywhere.

24:57

>> It will know

24:58

>> everything that you're doing looking

25:00

for, etc., etc.

25:02

>> Um

25:03

>> uh but it is now also crossing the

25:06

lines. Where was it? Was it Albania that

25:09

just put their first

25:12

digital minister

25:14

into place?

25:16

>> It would be like having, you know, Pete

25:18

Hegseth, you know, replaced with an

25:20

avatar and it doesn't seem to be that

25:24

big of a deal to a lot of people. You

25:27

want to tell that story and what that

25:29

means?

25:31

Well, I think people have been

25:32

increasingly normalized uh to sort of

25:35

the dissolution between the digital

25:37

and virtual worlds. And that's not by

25:39

coincidence. So, going back to the World

25:40

Economic Forum, the goal of the

25:42

WEF's so-called fourth industrial

25:44

revolution is to blur those lines uh

25:47

very overtly. And so, you know, what

25:49

we're seeing here are stepping stones

25:50

leading us to an increasingly uh

25:53

encroaching all digital system. And, you

25:57

know, it probably began some time

25:59

ago. Um, I'm sure you remember several

26:01

years ago, uh, Muhammad bin Salman, for

26:03

example, gave citizenship to a robot and

26:06

that was kind of framed as novel, but

26:08

you know, there there's been an effort

26:10

to normalize these kinds of things

26:12

with respect uh to the government. So

26:14

now they're having um,

26:16

>> you know, AI run the government under

26:18

the guise that it's it's more efficient,

26:20

it's more trustworthy and all of that,

26:22

but again, who is accountable if the AI

26:24

makes a mistake? Because AI does make

26:27

mistakes. AI also hallucinates and uh

26:30

returns results that are essentially

26:32

indicative of an irreality, something

26:34

that is completely uh not true. Um and

26:38

so who is accountable in those cases? Uh

26:40

can they hold the AI minister directly

26:42

accountable? Not really. Does the

26:44

accountability fall to the person who

26:46

programmed the AI? uh it obviously

26:49

opens up a pretty sticky uh situation,

26:52

but uh I would in the case of this

26:55

argue that this is in furtherance of

26:57

an agenda that was actually laid out by

26:58

Henry Kissinger and Eric Schmidt in

27:00

their book um oh I forget what it's

27:03

called uh sorry about that but they

27:06

wrote a couple books on AI and the

27:07

earlier one I think it's AI in our human

27:10

future is the subtitle or the age of AI

27:13

something like that um they essentially

27:15

argue that um we should put AI in charge

27:18

of government because they assume they

27:22

they obviously believe that AI is a form

27:24

of a super intelligence. Therefore, it

27:26

knows better than humans do. And so even

27:29

when it returns these uh unreal,

27:31

irreality uh results, we should take

27:34

that as uh as a sign that it can see

27:38

things humans cannot see. We should just

27:41

trust that it's there because we should

27:43

trust that it's super intelligent and

27:45

sort of uh you know offset uh give

27:48

it power over our lives supposedly

27:52

because it's a better arbiter uh of

27:54

what's real and what's not uh than we

27:58

are which is um I think that is uh just

28:03

insane. Also, sorry to keep repeating

28:06

that word, but it's hard. Some of this

28:07

is really

28:08

>> insane.

28:09

>> Um, just bonkers stuff. Yeah. And in

28:12

addition to that, Kissinger and Schmidt

28:13

laid out that their biggest interest in

28:16

AI uh was its impact on human

28:18

perception. And ultimately, if you're

28:20

able to uh completely control how people

28:24

perceive reality, you control their

28:26

behavior. you don't need mind control at

28:29

the end of the day or any of these

28:30

things from back in the day that uh you know the

28:32

CIA and national security agencies were

28:36

experimenting with. You know, you don't

28:37

need that if you can completely control

28:40

uh their perception of what's going on.

28:42

Um and so the goal uh as they lay out

28:44

here or laid out in that book uh is uh

28:47

to have people rely on AI for their

28:50

perception of essentially everything. Um

28:53

and that eventually by doing so uh

28:56

people would be what they uh the term

28:57

they used was cognitively diminished to

29:00

the point that they wouldn't be able to

29:02

understand how AI acts upon them

29:04

anymore. But that wouldn't be true for

29:06

everyone. There would be a small class

29:08

that is not uh affected that way and

29:11

they would be the class that programs

29:13

and maintains the AI determines what it

29:16

does. But the rest of us um a large

29:19

underclass would be acted upon by the AI

29:22

but again uh lose the mental capacity to

29:24

understand what it's doing to them and

29:26

that eventually it would start

29:28

determining their preferences for them

29:30

and all sorts.

29:30

>> This is such evil. I mean there is no

29:32

other way to describe this other than

29:34

evil. When you are taking humans who are

29:38

built to act, not to be acted upon, and

29:41

you purposely

29:44

>> put them into a class that you can act

29:46

upon — there is no better

29:50

word to define it than evil.

29:54

>> Yeah. Well, the term, you know, that

29:56

gets thrown around a lot for this is

29:57

posthuman future. But what what is more

30:00

evil

30:01

>> to humanity than that? just eliminating

30:03

us and and turning us into what some of

30:06

these um libertarian oligarchs called

30:08

technoplastic beings. I mean, some of

30:11

them think that uh humans are nothing

30:13

more than uh bootloaders for digital

30:15

intelligence. I mean, that's how we are

30:18

perceived by a lot of these tech

30:19

oligarchs because again, a lot of their

30:21

goal and they've been relatively open

30:23

about this is to live forever uh but in

30:26

defiance of natural law. So using

30:28

technology to allow them to uh become

30:31

gods. A lot of these uh tech oligarchs

30:34

including like the co-founders of Google

30:36

uh have been pretty uh open about that.

30:39

And even someone like Jeffrey Epstein

30:41

for example who was uh very interested

30:43

in in eugenics and AI and all of that

30:45

was interested in in you know those

30:48

technologies for those same ends. I mean

30:50

there's a whole group of I

30:54

would call them pretty sick billionaires

30:56

uh who want to use this technology uh to

30:59

better themselves in that way and live

31:02

forever while the rest of us become

31:04


31:07

cognitively incapable of questioning

31:09

what ultimately amounts to slavery.

31:14

>> We should say no.

31:15

>> I know we should

31:16

>> I think that should be pretty clear.

31:17

>> I don't know if we do. Um, where do you

31:20

find hope in all of this?

31:23

>> So, uh, yeah, I get asked this question

31:25

a lot because when I'm talking about

31:27

these systems, it's obviously

31:29

dark and it's obviously wrong. But

31:31

again, like I said earlier, I don't

31:33

I don't think it's hopeless because they

31:34

are spending so much money and so much

31:36

energy on getting us to consent to these

31:38

policies. Um, you can build these

31:41

digital systems that once you're in them

31:43

will imprison you. uh but if no one uses

31:46

these systems they can't do anything. Um

31:48

so a lot of there's a lot of efforts for

31:50

example to use them to implement them on

31:53

existing user bases of massive social

31:55

media websites uh for example but if

31:57

people decline to use it or people leave

32:00

these platforms um or stop using these

32:04

uh you know certain digital

32:05

infrastructure tied to these people it

32:07

will collapse. They need people

32:09

>> what are the ones we should be avoiding

32:11

right now?

32:13

Well, I think people should do their own

32:14

research uh and look at who owns what.

32:17

But a lot of these uh billionaires uh

32:19

you have, you know, people like Larry

32:21

Ellison and Eric Schmidt, the Google

32:23

guys, people like Pierre Omidyar who were

32:26

on the left, Reid Hoffman, Bill Gates,

32:28

right? And then you have, you know,

32:29

people like Elon Musk uh and Peter

32:32

Thiel and the PayPal mafia crowd,

32:34

most of them uh frame themselves as

32:36

libertarian. If you look at um uh their

32:40

philosophy, their own words, um they're

32:43

overtly transhumanist, um a lot of them,

32:46

despite saying things to the contrary,

32:49

want global government in some form. Uh

32:53

and you know, the ones on the supposedly

32:55

libertarian side frame it as having a

32:57

CEO in charge of everything, but a CEO

33:00

that would govern as a dictator. So I

33:03

don't ultimately see that as much

33:04

better given all this technology that's

33:07

uh would be in the hands of this one or

33:10

you know these this this very small uh

33:12

group of people. Um but they don't own

33:14

everything. They own a lot of technology

33:17

uh obviously and technological

33:19

platforms, social media um and all sorts

33:23

of things. Uh but it's up to people if

33:25

they want to continue using those

33:26

services and supporting these people uh

33:28

because ultimately they need us to make

33:32

their system work. They want to harvest

33:34

us for data and like I said earlier they

33:36

want to use us as bootloaders for

33:38

their digital intelligence and they

33:40

can't continue to improve and feed the

33:42

AI without us doing it for them. They

33:45

can't do it alone. So,

33:48

>> um I think the more we

33:50

>> But people are not

33:53

they're not likely to leave things that

33:56

make their life easier. They're just

33:57

not.

33:59

>> Yeah. Well, that's that's the price of

34:00

convenience, isn't it? And I think a lot

34:02

of the effort to enslave us has been to

34:05

uh cajole us and uh influence us with

34:09

with convenience and comfort. Uh but

34:12

also in theory, you know, prison is

34:14

comfortable, right? in the sense that

34:16

you have a roof over your head and they

34:18

bring you food. Um, and I mean it it you

34:22

know a digital prison without walls uh

34:24

you know could be similarly comfortable

34:27

and you wouldn't have to lift a finger

34:29

to fight uh you know for your freedom.

34:31

You would just

34:37

willingly walk into the system right um

34:39

but those of us that don't want to

34:41

live in the system have to do something.

34:43

And so I think we're at the at at a at a

34:46

crossroads and have been for several

34:48

years uh where those of us that don't

34:51

want to uh walk into this have to

34:55

actively build alternatives. And if you

34:57

don't have, you know, uh a ton of people

34:59

in your community uh doing that, maybe

35:02

you should reach out and build

35:03

awareness. Uh but if you have people

35:05

that are aware of this around you, um

35:08

it's important to build, I would

35:10

argue, local resilient networks that

35:12

don't depend on on this infrastructure.

35:14

There's still open-source alternatives

35:16

to a lot of the um you know, big tech

35:20

platforms uh out there. Uh and

35:24

I still think I'm still hopeful that

35:26

there is time. Uh but you know

35:29

ultimately at the end of in end of the

35:31

day you know if they're pushing us

35:32

towards a posthuman future I think at

35:34

some point people will realize uh that

35:37

they don't want to lose what makes us

35:38

human. And so so much of what we're

35:40

being pushed to use AI for are things or

35:43

creative pursuits that help define us as

35:46

human right uh making art making music

35:50

writing. um these are the things that

35:52

we're being told to outsource uh to

35:54

artificial intelligence not necessarily

35:56

the tedious stuff right so what's going

35:58

to be left for us when we uh outsource

36:01

all of this to AI will we allow

36:03

ourselves to be cognitively

36:04

diminished to the point that we can't

36:06

even create anymore and then what kind

36:08

of you know humans are we at that point

36:11

so I think it's very important to um

36:14

encourage uh analog alternatives to that

36:16

kind of stuff and to engage in uh in

36:19

creativity And uh there's a lot of

36:22

opportunity for that especially for

36:23

people that have uh children. You know,

36:25

children are very creative and we need

36:27

to uh promote that to them instead of

36:29

being like here's a tablet, learn how to

36:31

scroll by the time you're three or four

36:34

>> um and navigate the the algorithms. You

36:37

know, if we do nothing and we don't

36:39

shift that cultural uh behavior or

36:44

what's being made, you know, common

36:45

cultural behavior now, then yeah, it

36:47

will be very problematic. And so I

36:49

think, you know, it's a very important

36:51

time right now for parents uh to make

36:54

sure your kids are well and anchored

36:55

in in the real world and not just uh you

36:59

know uh checked out to lunch and

37:01

trusting uh you know potentially

37:03

trusting algorithms more than you. I

37:05

mean there's these efforts to have

37:06

domestic robots in the house. A lot of

37:09

the ads show you know young

37:11

children developing emotional

37:13

relationships with these robots saying I

37:15

love you and all of this stuff. It is

37:18

that is not good. I absolutely agree. Uh

37:21

and so you know just because you want to

37:24

focus on yourself or X Y and Z is no

37:27

excuse to have you know the emotional

37:29

connection your child needs be built

37:31

with a machine programmed by who knows

37:33

who. I mean so many of these big tech

37:35

figures also had relationships to

37:38

Jeffrey Epstein a pedophile. Do you want

37:40

to trust those people uh to program

37:43

stuff uh that's around your kids and and

37:45

talks to them and you know potentially

37:47

manipulates them when you're not there?

37:50

So, you know, it's not just with that

37:52

too. I mean that that is the idea of

37:53

taking active responsibility for things

37:55

in your life and we need to do more of

37:57

that. And culturally, Americans have

37:59

been the best at that for a very long

38:01

time. But there have been a lot of

38:03

efforts to condition us out of that. And

38:05

a lot of it has been through this um

38:08

effort to cultivate the importance of

38:10

comfort above all else and convenience.

38:12

you know the idea of rugged

38:13

individualism in the US uh unfortunately

38:16

has been uh you know greatly reduced and

38:19

I think it's important for us to take

38:22

active responsibility because you know

38:24

the pull of AI is

38:27

for us to be passive and do

38:30

nothing and just let it wash over us and

38:33

uh oh you don't have to do that anymore

38:35

AI can do that and AI can do this for

38:37

you and and this and that. Um and if

38:40

we're not uh focused on uh the things

38:42

that we like to create and that we like

38:44

to do um and uh active, you know, we

38:48

will recede and that is how the

38:50

posthuman future will happen. There is

38:51

still a lot of time for agency. Um but

38:54

people just need to be

38:56

>> really aware of what's going on and

38:58

determined to to change it.

39:00

>> Is there anything to I mean do you use

39:03

AI at all for anything?

39:06

>> Nothing. You're completely off it. You

39:09

don't use it.

39:11

>> No, I'm uninterested in using it. I

39:13

mean, it wasn't always

39:15

around. You know, I'm 35 now, and

39:18

you know, when I was in university,

39:20

there was no AI. I learned how to write

39:22

and do what I do now without it. So, why

39:24

would I need it? Especially when I'm

39:25

aware that, you know, the whole idea if

39:28

you don't use it, you lose it. So, I

39:30

stop uh you know, let's say for example,

39:33

a person who does work similar to me uh

39:35

stops researching, has AI do their

39:37

research for them. Well, they'll come

39:38

back in a year or two and be like, "Wow,

39:40

I kind of forgot how to do this. I don't

39:42

remember how to do it anymore. It's

39:43

gotten a lot harder for me." Right? The

39:46

same idea if you stop doing mental math

39:48

because you're constantly relying on a

39:50

calculator. Uh it gets harder. Uh that's

39:53

the idea of cognitive

39:55

>> diminishment. Ray Kurzweil called it. Ray

39:58

Kurzweil told me that uh No, it'll

40:02

just free your mind up to do other

40:04

bigger, more important things. And I

40:06

didn't believe

40:07

>> that's not happening.

40:08

>> Yeah.

40:09

>> Yeah. I didn't think it would.

40:10

>> We can already see that's Yeah. We can

40:13

already see that's not uh that's not

40:15

happening.

40:16

>> So I think people again need to take

40:19

active control of not just their

40:21

physical lives as much as possible, but

40:23

their mental lives too. And have to

40:25

remember that you know uh even on big

40:28

social media platforms like uh X

40:30

formerly Twitter for example they've

40:32

openly said that the AI Grok is going

40:34

to be running the algorithms period uh

40:37

come November you know so AI uh is is

40:42

inescapable in those types of

40:43

environments and we have to remember

40:45

that um we have to be aware that there

40:48

is an effort to influence us towards

40:50

these policies. Um, and a lot of people

40:54

go on to social media assuming it's uh,

40:56

you know, the new public square and you

40:59

know, free, you know, that it's better

41:00

for free speech now and all of that, but

41:02

aren't um, aware that really every time

41:04

you're going on these platforms, it is a

41:07

cognitive battlefield. Uh, and again,

41:10

this is why I really want to stress

41:12

that critical

41:15

thinking has never been more important.

41:16

there's a reason they've tried to breed

41:18

it out of the school systems uh in the

41:20

US and uh social media chat GPT the chat

41:24

bots all of that are meant to further

41:27

eliminate that from us so it's never

41:30

been uh more important to scrutinize uh

41:33

things and go into these

41:35

digital environments uh realizing them

41:37

for what they are um and some people get

41:40

benefits from them but some people uh

41:42

don't necessarily anymore uh and there's

41:45

been a lot um even studies that have

41:47

been leaked from places like Facebook

41:49

where they've manipulated your

41:50

algorithms to depress you to make you

41:53

feel very negative and feel

41:55

despondent um and all of that and yeah I

41:58

mean if we give in to those kind of

42:01

emotions then we we'll just do nothing

42:04

right to change uh our situation and do

42:07

nothing uh while we're at this

42:09

crossroads that we're at that I

42:11

mentioned earlier uh so there's an

42:12

effort to emotionally manipulate us uh

42:15

there as well what you know they can

42:17

determine what you see and they know you

42:19

know you're you're well studied because

42:20

of all the data that has uh you have

42:22

generated during your time in the

42:24

digital environment and they can use

42:26

that to determine exactly what type of

42:28

demographic you are exactly what uh you

42:30

would need to see to shift your

42:32

viewpoint from viewpoint A to viewpoint

42:34

B

42:36

>> um and uh you know the type of

42:37

manipulations they can do um you know

42:39

they can do at a tremendous scale now

42:42

with AI and we also have to keep in mind

42:44

that during the Obama administration

42:46

They lifted the ban on the use of

42:48

propaganda on the domestic American

42:50

population that had been in place for

42:52

many decades. And a lot of people

42:54

unfortunately uh forget that.

42:57

>> I was just talking to a senator the

42:58

other day and saying why haven't you why

43:00

haven't you stood up and said and he

43:02

said I have but nobody wants to listen.

43:03

That that needs to be repealed. That

43:05

needs to be changed back to the way it

43:07

originally was. It it's in it's insane.

43:10

Um uh look, can we talk about the way

43:15

countries are behaving right now? Um

43:19

with all of the flaws of Donald Trump,

43:23

it is he is

43:27

at least appears to be the only one that

43:30

is fighting for

43:33

uh the country or his country. I I see

43:38

some of these others that I think the

43:41

the head of Hungary is doing the same,

43:44

but you see these prime ministers and

43:46

these presidents everywhere and

43:51

they are so disconnected from the people

43:56

and they're all for this global thing

43:58

where everybody is like, "No, I like my

44:01

flag. I like being Italian. I like being

44:05

English. I like being American. I like

44:07

being Canadian.

44:09

And yet that's all being erased. And

44:12

it's all happening in the same language

44:15

at the same time in their political

44:19

systems. We're passing the same laws.

44:21

We're doing the same things. And yet

44:23

we're each of us convinced it's only our

44:26

country because we have this politician.

44:28

We got to get this politician. How do

44:30

you break through to people to say,

44:32

"Look, dummy. Look past the borders.

44:36

Look past our politician. Like or

44:38

dislike, doesn't matter. Look past them.

44:41

It's happening everywhere. This is a

44:43

global movement.

44:48

>> Yeah. I would probably start with

44:49

pointing out that for example in our

44:52

Congress, it's not like the congressmen

44:54

themselves write all of the legislation

44:56

that they pass, right? A lot of that

44:59

comes pre-written from think tanks. And

45:02

a lot of those think tanks have certain

45:03

things in common. They share a lot of

45:05

the same uh oligarch connections for

45:08

example. So the world economic forum

45:10

arguably is one such think tank. Um

45:12

another one would be the Carnegie

45:14

endowment for example that for a long

45:16

time was dominated by uh the Pritzker

45:18

family. Bill Burns, Biden's CIA

45:20

director used to be the head of that for

45:22

example. Um they're another one that has

45:25

a lot of influence uh that way. CSIS,

45:28

which has another, I believe, Pritzker

45:30

uh, as its chairman,

45:33

the one that was most tied up with the

45:35

Epstein scandal, uh, is there. And, you

45:37

know, the Pritzkers, uh, not to pick on

45:40

them, but they're just, you know, one of

45:41

these families that doesn't really get

45:42

talked about very often, instrumental in

45:44

the rise of Barack Obama in Chicago

45:46

along with the Crown family, uh, for

45:49

example. Uh, but, you know, their family

45:51

has ties to organized crime going back

45:53

to like the 30s or something like that.

45:56

So unfortunately a lot of um these

45:58

powerful figures have uh shady

46:01

connections but a lot of money and have

46:03

rebranded as philanthropists and in

46:05

doing so have uh you know

46:08

gotten these uh trustee or influential

46:11

roles at these think tanks which then uh

46:14

you know fund you know various fellows

46:16

and other uh people that write the

46:19

actual legislation that ends up in the

46:21

hand of your congressman right and the

46:23

congressman is told by the different

46:24

lobby groups

46:26

um you know this legislation is uh you

46:29

know covers these topics and and here

46:31

you go you know and I mean I think it's

46:34

only a few uh people in our

46:37

legislature like uh you know maybe Rand

46:39

Paul or a Thomas Massie who will point

46:41

out and be like I just got this

46:43

legislation on my desk uh and I

46:45

have to vote on it in 48 hours and it's

46:48

you know a couple thousand pages right

46:50

>> How many

46:52

congressmen have actually read this

46:53

whole thing, you know?

46:56


47:00

>> and I tend to think that this is a very

47:02

common it appears to be common in other

47:04

countries uh around the world and so you

47:06

would have um you know a lot of these

47:09

think tanks that we know in the US some

47:10

of them have subsidiaries in other

47:12

countries like Latin America or Asia

47:15

or what have you um and so that's how you

47:17

get you know I think the easiest

47:21

example would be you know in the COVID era

47:23

how a lot of the a lot of countries

47:25

regardless of what they had whether they

47:27

had leaders on the left or the right um

47:29

adopted a lot of the same legislation

47:31

and policies in a very very very uh

47:34

short period of time. But also if you

47:36

look at Europe for example and the idea

47:38

uh the policies and ideas that have

47:40

led to the current uh you know uh

47:42

immigration crisis there um you know it

47:45

was coming from left and from right the

47:47

legislation was coming from certain

47:49

think tanks um and I think people need

47:52

to look at these other layers of power

47:54

that are behind uh the politician.

47:57

there's the the the think tanks and

47:59

there's also the people that fund those

48:01

think tanks and a lot of those a lot of

48:03

that money also directly funds campaigns

48:06

of politicians right and um I think

48:10

unfortunately a lot of the media uh for

48:13

a long time obviously mainstream media

48:16

um you know doesn't really look at those

48:17

connections arguably because a lot of

48:19

that same money

48:20

influences the corporations that own

48:23

them right Yeah,

48:27

back with Whitney in just a second.

48:29

First, if you wake up every morning

48:30

wondering how pain is going to affect

48:32

your day, Michael from Connecticut used

48:34

to, not anymore. Let me tell you his

48:36

relief factor story. Uh, Michael was

48:38

dealing with elbow pain, which was, you

48:41

know, everything he used his hand and

48:42

arm for, it made it really, really hard

48:45

to do anything. He tried to work until

48:49

um he just couldn't anymore. And then he

48:50

tried Relief Factor. Nothing worked

48:52

until Relief Factor. And then he started

48:55

working again. He said, "My elbow pain

48:57

went away." Uh, and on top of that, he

48:59

said, "I have more energy." If you're

49:00

living with aches and pain, see how

49:02

Relief Factor, a daily drug-free

49:03

supplement, could help you feel better

49:04

and live better. Join the over 1 million

49:07

people who have turned to Relief Factor,

49:08

just like Michael. Call now. Try the

49:10

3-week quick start. It's $19.95, less

49:13

than a dollar a day. Don't let pain keep

49:14

you from living every day that you want.

49:16

It's relieffactor.com. 800-4-RELIEF. 800, the

49:19

number 4, relief. Relieffactor.com.

49:23

Is there I mean I' I've been looking at

49:26

South Korea since Charlie Kirk died. Um

49:29

I was asked to take on a couple of

49:31

things that he was doing and one of them

49:33

was South Korea and I had no idea what

49:36

was going on in South Korea. I mean, I

49:38

knew somewhat, but I knew that there was

49:40

a president that uh was an awful lot

49:43

like Donald Trump was, you know,

49:45

fighting against a lot of the literal

49:48

Chinese communists that had infiltrated

49:51

his country and they they did all kinds

49:54

of stuff. A lot of the stuff they did to

49:55

Donald Trump, but he was backed into a

49:58

corner and made a huge mistake and he

50:01

went authoritarian and he's like, "I'm

50:03

suspending cuz I don't believe any of

50:05

you guys. you're all you're all part of

50:07

this. I'm suspending the legislature and

50:11

declared martial law until it could be

50:13

sorted out. Well, the people rightfully

50:15

went what? They revolted. They threw him

50:18

out. He was impeached. Uh by the I think

50:21

by the end of the day he was out. Um but

50:24

that swung everything towards the

50:28

revolutionaries on the other side. Uh,

50:32

and you know, they're they've opened

50:33

their border to China, to North Korea,

50:36

letting people just just flow in. Um,

50:40

and they are now starting to persecute

50:42

anybody who had a conservative point of

50:44

view, anybody that was involved, you

50:46

know, from 5 years ago with this

50:48

president, uh, or voted that way. And

50:50

now churches and pastors are going to

50:54

prison.

50:55

And it is

50:58

really frightening to watch this. And

51:01

I've been watching it and I thought,

51:03

"Wow, I think this is the playbook here

51:06

for America and any of these people like

51:10

Donald Trump that, you know, they say,

51:13

well, they have, you know, tendencies

51:14

towards authoritarianism."

51:17

Maybe he does, maybe he doesn't. And I'm

51:18

the I'll be the first to stand up if you

51:20

start breaking the Constitution. But I'm

51:23

I'm watching what's happening for

51:25

instance in Chicago

51:27

and I'm thinking, okay, if I'm the

51:29

average person, I'm like, well,

51:31

something has to be done. And that's

51:33

your first mistake. When something has

51:35

to be done and it doesn't, you don't

51:37

follow it with something constitutional

51:40

must be done. You you find you find

51:43

yourself in a whole different ballgame.

51:46

We're entering a time where the left is

51:49

causing so much chaos on the streets.

51:52

They are I mean they they it has

51:55

something has to be done. You know what

51:58

I mean? And then you have because of

52:00

that you have this growing feeling on

52:03

the right saying, "Yeah, I know

52:05

something has to be done and it just has

52:07

to stop."

52:09

That's where South Korea ended. And I

52:13

fear that if we're not really super

52:16

careful, that's where we're going to

52:17

end. And that's by design.

52:20

>> Does that make sense or is that just

52:22

crazy talk?

52:24

>> Um, yeah. I don't think it's crazy. But

52:26

what it does remind me of is something

52:27

that happened several decades ago,

52:29

mainly in Europe, that was called

52:31

Operation Gladio. I don't know if

52:32

you're familiar, but it basically

52:34

involved intelligence agencies uh

52:36

organized crime and elements of the

52:39

Vatican

52:40

>> um funding uh terror attacks against

52:43

civilians. Um and they were framed in

52:46

that particular case as being terrorist

52:49

attacks from the left. But the ultimate

52:51

goal uh was to create so much terror

52:54

that people would give up their uh

52:56

liberty for feeling of security, feeling

52:58

it was safe to take the bus again, that

53:01

it was safe um

53:03

>> to live a semblance of a normal life.

53:06

It's sort of similar to what happened

53:08

during COVID people would give up so much,

53:10

right? Take um the injections, get the

53:12

vaccine passport just to have a a a

53:14

semblance of a normal life, right? But

53:17

this is the same way to do that but with

53:19

violence. Um, and who ultimately wins at

53:22

the end of of the day, I think, is what

53:24

we should be asking here. And we need to

53:26

keep in mind, too, that particularly in

53:28

the United States, every president since

53:30

September 11th has opted to expand um

53:34

the so-called war on domestic terror.

53:37

>> And uh you know, you'll have a Democrat

53:40

president in and they'll weaponize it

53:42

against the right and vice versa. And we

53:45

have and but either way, the more it

53:47

grows, the more it endangers our

53:49

constitutional rights. Correct.

53:51

>> And so I think it's very important um to

53:55

uh again be extra vigilant um about that

53:59

because ultimately what they want what

54:01

what the powers that be uh want is that

54:04

same Hegelian dialectic of problem,

54:06

reaction, solution. they want to solicit

54:09

that reaction uh which has us consent to

54:12

the solution uh that they wanted to

54:14

implement anyway. And so I fear that

54:18

because of the increased power of an

54:19

entity like Palantir in the US

54:21

government now that the the next shoe to

54:24

drop will there will be a huge push uh

54:27

for pre-crime predictive policing as

54:29

discussed earlier and uh Trump uh nearly

54:33

fell for that trap in uh 2019 when there

54:36

were a spate of mass shootings. So,

54:38

William Barr, who was attorney

54:39

general, uh it got barely any media

54:43

coverage, but he created the uh the

54:45

legal infrastructure for pre-crime in the

54:48

United States through a program called

54:50

DEEP. Um and then after that, uh

54:53

>> explain what for anybody who doesn't

54:55

know what that is, explain it.

54:58

>> Uh DEEP is an acronym. I forget exactly

55:00

what it stands for, but it's like

55:02

deterring. It's something about

55:04

deterrence uh through early detection

55:06

or something like that.

55:08

>> Uh but basically the legal

55:10

infrastructure set up by Bill Barr there

55:11

was that you could ostensibly um arrest

55:14

someone before they committed a crime

55:17

preemptively and there have been only a

55:18

handful of arrests through DEEP to my

55:21

understanding but because it's there

55:23

anything could happen that could make it

55:26

uh be deployed at scale. And so that was

55:29

particularly concerning at the time

55:30

because after that uh because of the

55:33

outrage about the spate of shootings at

55:35

the time that I think began with the El

55:37

Paso uh Walmart shooting of that year.

55:40

Um

55:41

>> Trump said that social media platforms

55:43

need to develop tools uh where uh they

55:47

look at what users are saying and uh

55:49

determine who will be a shooter before

55:51

they can commit an act of violence. I'm

55:53

paraphrase paraphrasing there. Um and

55:56

then uh his uh administration was

55:59

considering but did not implement um a

56:03

uh health focused version of of uh the

56:06

Pentagon's DARPA. They were calling it

56:08

HARPA and that the pilot program of uh

56:11

the proposed HARPA would be another

56:13

acronym and I'm sorry that I don't

56:15

remember what it stands for but it's

56:16

quite long. It's called Safe Homes. And

56:19

uh the biggest uh lobbyists of this to

56:22

the president at the time were Jared

56:23

Kushner and the president's daughter, uh Ivanka. Um

56:26

and basically what that program proposed

56:28

was for an AI to go over all of American

56:32

social media posts and determine what

56:34

they called early warning uh early

56:37

warning signs of neuropsychiatric

56:39

violence. And if that and if a user's

56:42

profile was flagged, all sorts of things

56:45

could be triaged from that, including uh

56:48

you know, court-ordered physician

56:49

appointments and all sorts of things

56:51

that sound terrible. Uh Trump, according

56:55

to the Washington Post, liked the idea,

56:58

but he ultimately didn't pass it. So,

57:00

you can take the post reporting uh

57:02

however you want, I guess. But what did

57:04

happen, the Biden administration did

57:07

create HARPA, but they created it under

57:09

another name. They called it ARPA-H and

57:12

they framed it as uh this is how we're

57:14

going to cure cancer. But a lot of the

57:17

same uh programs are still there. The

57:20

same architects of that HARPA proposed

57:22

to Trump for those purposes in 2019 were

57:25

also involved in the creation of ARPA-H

57:28

uh which has been pushing for uh you

57:30

know uh people to wear wearables for

57:32

example which are you know

57:34

>> Mhm.

57:35

You could theoretically use them as

57:37

surveillance devices, but you wear them

57:39

on your body, right? Um, and they might,

57:42

you know, Palantir runs a lot of that

57:45

same data as well. And if they were ever

57:48

to combine and end the silo between

57:51

health care and law enforcement since

57:53

they contract to both, there is a

57:55

potential for very very um you know

57:58

Orwellian uh terrifying stuff when it

58:01

comes to predictive policing and

58:03

predictive analytics. Uh so you know I

58:07

it it again depends on who is around the

58:09

president and how much he listens to

58:11

them. Uh but I think it's uh since that

58:14

happened in 2019, you know, there was an

58:16

attempt to get him to implement that

58:19

program then and if there is a big

58:21

enough um event again uh that could uh

58:25

lead to huge calls to do something

58:29

um you know we could see that be

58:31

marketed as the quote unquote solution.

58:33

And who wins there? Well, the big tech

58:35

oligarchs that control all of the

58:37

infrastructure that would be behind

58:39

pre-crime and the AI algorithms. And

58:42

what's troubling too about the war on

58:44

domestic terror um is that it the

58:46

definition for it, the government's

58:48

definition for it across uh uh

58:50

administrations is incredibly incredibly

58:52

vague. So one example is that you can be

58:55

defined a domestic terrorist if you feel

58:57

like you have to um uh stand up against

59:01

perceived government

59:03

overreach is the term.

59:06

>> So that could very easily be anyone on

59:08

either side of the political aisle.

59:10

>> Yeah. So, um, again, when we see when we

59:14

want to suspend civil liberties and

59:16

constitutional rights for just one

59:18

segment of the population because we're

59:20

told it's necessary so that we can feel

59:22

safer.

59:24

What ultimately happens historically is

59:27

that those rights go away for everybody

59:30

except the people at the very very very

59:32

top that are controlling these systems.

59:34

Um, for anybody who doesn't know what

59:35

Palantir is, who's running it, why it's

59:39

so dangerous, will you take us down that

59:41

road?

59:44

>> Uh, sure. I would be happy to. So, um,

59:47

my work on Palantir argues that it was

59:49

an effort to privatize this program that

59:52

was pushed on the public after 9/11 that

59:54

was called total information awareness

59:56

that was also housed in the Pentagon

59:58

DARPA. Um

60:00

there was a huge outcry about this

60:02

program at the time because it was uh I

60:05

would argue rightly described as uh

60:07

eliminating uh the constitutional right

60:10

to privacy because everyone's data was

60:12

being sucked in and everyone's data was

60:14

being spied on. And the ultimate goal of

60:17

TIA, total information awareness,

60:20

was to have a pre-crime system in the

60:23

United States that would stop, they said

60:25

at first terrorist attacks before they

60:27

could happen. But they're not just

60:29

looking at terrorists, they were looking

60:30

at everybody. So obviously it was moving

60:33

towards predicting crime before it

60:35

happens. And it also had a health

60:36

component where they said they would uh

60:39

hopefully predict uh bioterror attacks

60:42

before they happen. This is again during

60:44

the anthrax. uh the aftermath of the

60:45

anthrax attacks of 2001 uh but also that

60:48

they would predict pandemics before they

60:51

happen and a lot of that uh renewed

60:53

interest in that you could say uh

60:55

occurred during the COVID era right and so

60:58

as this program was getting into trouble

61:00

and they tried to change their name and

61:03

tried to do all these things to keep

61:04

Congress from defunding them uh

61:06

Palantir uh was incorporated uh by

61:09

Peter Thiel and Peter Thiel and Alex Karp

61:12

who were two of the Palantir

61:13

co-founders

61:14

uh talked to Richard Perle who put them

61:16

in touch with the person who was running

61:18

total information awareness and they

61:20

basically said uh you know they viewed

61:22

him as uh John Poindexter was his

61:25

name they viewed him as the godfather of

61:27

modern surveillance and they wanted u to

61:31

essentially recreate what he was doing

61:33

but they did so as an entirely private

61:35

entity and in doing so because the

61:37

government wasn't directly involved a

61:39

lot of the outcry just dissipated and

61:42

the earliest funders of Palantir um

61:45

were uh Thiel himself, but also the CIA's

61:48

In-Q-Tel and the CIA uh was Palantir's

61:51

first client uh and was their only

61:54

client, I believe, for their first five

61:56

or six years as um a company. Uh Alex

61:59

Karp said the CIA was always the

62:01

intended client of Palantir. Uh you had

62:04

Palantir engineers going to CIA

62:06

headquarters every two weeks having them

62:08

uh tweak their product. Uh, it appears

62:10

to be, I would argue, a CIA front

62:13

company. And the CIA, particularly its

62:15

chief information officer at the time, a

62:18

fellow named Alan Wade, had also been

62:20

one of the biggest cheerleaders of total

62:23

information awareness.

62:25

And he was also um, apparently a

62:27

business partner of Ghislaine Maxwell's

62:29

sister Christine Maxwell. They tried to

62:31

make a homeland security software

62:33

program together um called Chiliad

62:37

>> which is uh you know worthy of scrutiny

62:40

as well and I have uh some writing about

62:42

that or some more information about that

62:44

in my in my book. Well, basically that

62:47

uh there was this scandal in the 80s um

62:50

that involved Robert Maxwell uh her

62:52

father called the PROMIS software

62:54

scandal and it was where the CIA and

62:56

also Israeli intelligence put back doors

62:58

into um this uh software program that

63:02

was marketed to countries and to

63:03

corporations and banks um throughout the

63:06

world. And Christine Maxwell had

63:08

actually been directly involved with the

63:10

front company that her father used to

63:12

market that software. Um and then uh

63:15

actively after his death in 1991 said

63:18

that she and her sister, her twin

63:19

sister, also Ghislaine's sister, um were

63:22

trying to rebuild their father's uh

63:24

legacy. And so they uh created this tech

63:27

company uh that became one of the early

63:30

search engines, but they developed a

63:32

very close working relationship with

63:33

Bill Gates and Microsoft, which is

63:35

probably how Bill Gates actually met

63:37

>> uh Jeffrey Epstein many decades before

63:40

uh they officially met. And there's

63:42

other attestations to that as well. But

63:44

basically um the software that she

63:47

created with Wade, Chiliad, uh was a proto

63:50

Palantir and the PROMIS software was

63:52

actually very similar to um Palantir as

63:56

well. But uh the software had been

63:58

stolen from a fellow named Bill Hamilton

64:00

and his company Inslaw uh Inc. And so uh

64:03

they had been the Hamiltons had been

64:05

suing uh the US government to try and uh

64:09

get payments restored to them for the

64:12

use of their uh software, but it was

64:14

stolen illicitly. And so by turning it

64:17

sort of laundering it into these

64:19

different um companies, they were able

64:21

to avoid ever paying the Hamiltons any

64:23

money for the software that they

64:25

essentially stole. Um, and so anyway, I

64:28

don't want to get too um

64:30

>> off the topic of of Palantir, but you

64:32

know, these are the characters that

64:34

essentially uh created it. And it uh it

64:37

labels people as there's a label you can

64:39

label someone as a subversive

64:42

>> um in the Palantir system. Um and it

64:44

collects essentially everything um about

64:47

you. And so currently it's being used to

64:49

target and uh classify uh immigrants for

64:53

deportation, but it has those same

64:55

capabilities that could be used against

64:57

uh you know actual American citizens

64:59

domestically if the war on domestic

65:01

terror was ever to begin in earnest. Um

65:05

and so I find it an immensely concerning

65:07

company. Particularly its interest in uh

65:10

predictive policing and pre-crime which

65:12

it was one of the earliest uh piloters

65:14

of uh of predictive policing. I believe

65:16

they started in New Orleans. And there's

65:18

also the fact that um you know the

65:20

co-founder of of Palantir, Peter Thiel,

65:23

uh was dis relatively dishonest, I would

65:26

argue, about his meetings with Jeffrey

65:27

Epstein. Uh he was uh trying to get uh

65:31

well, he was involved in funding a

65:33

company that also has pre-crime uh

65:35

pre-crime uh capabilities uh that was uh

65:38

championed by Ehud Barak and

65:40

Epstein. Uh Epstein put a lot of money

65:42

into it. It's called Carbyne. Um and uh

65:46

there were meeting newly released emails

65:48

showed that they were all sort of

65:50

talking to each other about Thiel

65:52

investing directly in Carbyne and Thiel

65:54

invested. Um you know uh I think he one

65:57

of his venture capital uh firms received

65:59

a significant amount of money from

66:01

Epstein and he had not been uh very

66:03

upfront about that um until you know

66:06

relatively recently. Um so I think um

66:10

you know that company too, Carbyne uh

66:12

have it has creeped into a variety of uh

66:16

uh counties across the US taking over

66:19

the 911 emergency call systems. And if

66:21

Congress is to pass legislation uh that

66:24

would federalize the 911 system, make it

66:27

a all national system, which there is a

66:29

push to do, um you know, Carbyne has

66:32

been the top lobbying firm for that. Um,

66:35

but they have a pre-crime component where

66:37

if you um they call it the c-Records

66:39

component, but you can't find it on

66:41

their website anymore after um there

66:43

were reports on it. Um, but essentially

66:46

it would comb all of the data off of

66:48

your smartphone and use it, put it into

66:50

its pre-crime uh analytics to determine

66:53

>> if you might be calling 911 again in the

66:56

future or be the reason 911 is called.

66:59

and that eventually street lights uh in

67:01

smart cities would call 911 uh for you

67:04

on their own.

67:07

>> Sometimes the most powerful innovations

67:09

aren't about adding more stuff. They're

67:11

about taking things away. They're about

67:13

creating less clutter, less confusion,

67:15

less fiddling around. And that's exactly

67:17

what Audien did with the new Atom X

67:20

hearing aid. Instead of tiny little

67:23

buttons and frustrating apps or endless

67:25

configuration screens, they just put a

67:27

simple touchscreen right on the charging

67:29

case. So you have your ears, you put

67:31

them in and you just tap and adjust and

67:34

you hear. It's really simple. No more

67:37

squinting, no more need for tech-savvy

67:39

grandkids. Just a beautifully designed

67:42

ready to go device that is made by

67:44

audiologists who listen to what people

67:47

want. Here's the best part. You don't

67:49

need a prescription. You don't need a

67:50

waiting room. You don't need a $1,000

67:52

loan. The Atom X starts at 98 bucks.

67:55

It's hearing without the hassle. Clarity

67:58

without the cost. And for the first time

68:00

in a long time, you will hear your

68:02

family's laughter at dinner. The

68:04

pastor's message. The punchline on TV.

68:07

And you'll smile because you didn't miss

68:08

a thing. The Atom X from Audien.

68:11

Finally, somebody got it right. Don't

68:13

wait. Visit audienhearing.com.

68:15

audienhearing.com. Take control of your

68:17

hearing today. audienhearing.com.

68:22

Is it um it does it ever amaze you how

68:25

small the circle is? There's not a lot

68:29

of people doing these things. I mean, it

68:32

is, but not when you look at it

68:33

globally.

68:34

>> It's it's like

68:35

>> I think it amazed me at first, but now

68:38

it's like, oh yeah, it's those guys. Um

68:40

you know,

68:41

>> what do you think the number is? What do

68:43

you think the number is that's actually

68:46

knows what they're doing and is doing

68:49

it?

68:50

At most, I would say it's probably a

68:52

couple hundred.

68:54

>> Is that probably smaller than that, but

68:58

well, you know, especially with the

68:59

technology they have today, it's never

69:01

been easier for the few to control the

69:03

many. And they want to make it so that

69:06

um you know, the peasants,

69:09

>> yeah,

69:09

>> uh the serfs can't uh you know, fight

69:12

against their rule anymore.

69:15

Um, and again that's why we have to uh

69:17

resist this as much as possible. But I

69:20

uh unfortunately think that to try and

69:22

get us to consent because again they

69:24

need our consent. They will throw the

69:26

kitchen sink at us to try and get us to

69:28

consent.

69:30

Uh they they could make life very

69:32

difficult. They could I mean you know

69:34

>> Oh yeah. use acts of of terror like they

69:36

did in in something like Operation

69:38

Gladio to make people so afraid for

69:41

their lives that they will give up all

69:43

of their liberties.

69:44

>> They did. This is what the communist did

69:46

feel safer.

69:47

>> This is what the communists did to take

69:48

over. I think it was Hungary. You know,

69:50

the the the NATO thing was we'll have

69:52

peace, but you can't go in unless

69:55

invited. You can't turn any countries uh

69:58

into uh Russian satellite countries

70:02

unless invited in. And so they just went

70:04

in and they they did pretty much what's

70:07

happening now, you know, and built the

70:10

framework for it to fall in and then

70:12

caused chaos in the streets. They had

70:13

tanks parked right on the border and

70:17

when the chaos got to a certain level,

70:19

their people inside the government of

70:21

Hungary said, "We need help." And Russia

70:25

rolled across and they were a communist

70:26

country overnight. I mean, it's it's not

70:29

a it's not a hard thing to figure out.

70:30

They do it over and over again.

70:33

But I would argue too that this is

70:35

bigger than just national governments.

70:37

This is um

70:38

>> oh yeah

70:40

>> people yeah I don't know what to call

70:41

them but oligarchs again it's a small

70:44

number of people and they have their men

70:47

as it were in every government

70:48

everywhere.

70:50

>> Um I'll give you an example that I find

70:52

particularly interesting. So Samuel

70:54

Pisar uh remembered as a human rights

70:56

lawyer, maybe remembered better uh in

70:59

the last administration because he

71:00

helped raise Anthony Blinken who was his

71:02

stepson.

71:03

>> Uh he was also a very close friend of of

71:06

Robert Maxwell. um he testified to

71:09

Congress in the early 70s and he talked

71:12

about something called the rise of the

71:14

trans ideological corporation and he

71:18

said that the western multinational

71:20

corporations of uh yeah in the west

71:23

right um had started making all of these

71:25

joint ventures with the state-owned

71:27

communist companies um of Russia and of

71:30

China and that what was happening is

71:32

that they were basically creating a

71:34

global government of economic power that

71:36

was making the nation state entirely

71:38

irrelevant. This is in the early 1970s

71:42

and a congressman, I forget who it was,

71:44

uh asked Pisar, "Is this a bad thing?"

71:47

And Pisar was like, "Not necessarily."

71:50

>> Yeah.

71:51

>> And at the same time, his pal Robert

71:53

Maxwell was making all of these

71:55

connections uh to entities like the KGB

71:59

uh to uh Israeli National Security

72:02

Agencies in the UK, also in the US,

72:05

across the board. uh and giving them

72:08

this backdoor software while also trying

72:10

to tie together a bunch of organized

72:12

crime families across the world starting

72:14

with the Yakuza in Japan to Semion

72:17

Mogilevich and uh you know Soviet in the

72:21

Soviet Union and to mob bosses in the

72:24

United States. I mean it I I don't mean

72:27

to laugh but it's just truly astounding.

72:29

And this was going I mean this was the

72:30

70s and he just brazenly admitted it to

72:32

Congress. Well, Carroll Quigley,

72:34

>> he said one of the

72:35

>> Do you remember in the 60s Carroll

72:37

Quigley? He did the same thing. They

72:39

made him a pariah for a few years, but

72:41

he was like proud of it. No, we're going

72:43

to end war. We're going to just tie

72:44

everything together financially and

72:46

then,

72:47

>> you know, you'll have these police

72:48

actions and the world changed exactly

72:51

the way he said it was going to change.

72:54

I mean, they're they're proud of it.

72:55

They want to tell you. They're proud of

72:57

what they do.

73:00

>> Yeah. Uh I think he in particular

73:02

Quigley was talking about uh this being

73:04

affected by the so-called round table groups

73:06

like the Council on Foreign Relations,

73:08

the Trilateral Commission

73:10

>> uh of which Keir Starmer is a is a

73:12

member if I'm not mistaken. Um and uh

73:16

yeah, again these these think tanks are

73:18

very powerful. I think actually as it

73:19

relates to the CFR, there's a video of

73:21

Hillary Clinton uh calling it the

73:23

mothership when she was Secretary of

73:26

State.

73:26

>> Wow. where uh where where her where the

73:29

foreign policy directives really come

73:31

from. Something to that effect.

73:34

>> Is it possible to break this

73:36

>> without breaking society? Is it possible

73:39

to break and stop this?

73:43

>> So I think it is. But I think also that

73:47

people have to realize that to untangle

73:51

uh these powerful interests from our

73:53

lives and from our world uh it they

73:56

won't go down easily but they will go

73:59

down more easily if more of us act and

74:01

more of us also know and understand that

74:05

they uh in a lot of cases there are

74:07

efforts to try and make us

74:09

uh resort to violence. Yes, that's what

74:11

they want. And I think um

74:14

>> in the last administration it should

74:15

have been very obvious to conservatives

74:17

that there was an effort to goad them

74:18

towards violence. Um and I think that

74:21

will pingpong from left to right. It

74:23

will go you know it it it they want to

74:26

just get people that want to fight

74:27

against this on both sides and they want

74:30

to demonize them uh so that they can be

74:32

sort of swept up in this war on on

74:35

domestic terror. So violence is

74:37

absolutely not the answer. But what can

74:38

we do? I think it's important again uh

74:42

what we have to focus on what we can

74:44

actually control you know overnight we

74:46

can't um you know a person like me can't

74:49

dismantle uh the WEF or the CFR or any of

74:52

these things but what can I do what can

74:54

I actually control right um and so I can

74:57

control um you know where I uh how I

75:01

live my life how I raise my children uh

75:04

whether I'm dependent on uh the

75:06

infrastructure of people that I know are

75:08

bad whether that's the power grid or how

75:11

I use social media or any of these other

75:14

things you know people need to take

75:15

stock of of their life and what they can

75:17

control but ultimately what it comes

75:19

down to also and I think one of the most

75:22

important uh points I have to convey

75:24

today um is that they want our consent

75:27

so badly and they need it uh for this to

75:30

work and that includes in a lot of

75:34

>> why do they need

75:35

>> I think it comes down to a a user base

75:38

Um, so for example, if uh there's a CBDC

75:41

or a stable coin launched by a

75:43

government somewhere and no one uses it,

75:45

it fails. If digital ID is a linchpin

75:50

to all of this stuff and no one uses it,

75:52

it fails.

75:55

And I think they just don't think that

75:57

we they think uh they can use uh you

76:00

know a carrot you know in the carrot and

76:02

stick analogy to lure us in and then

76:06

once we are in out comes the stick and

76:10

um

76:10

>> I think a lot of this if it's not

76:12

through uh you know fear which is you

76:16

know the go-to way to control people

76:18

whether it's the COVID type of fear or

76:21

you know the domestic terror uh type of

76:23

fear You know, that's one way, but also

76:25

money. Our money is a key way uh to try

76:28

and attack people's wealth in wealth

76:31

transfers. Uh because people that are

76:33

more likely to go into these digital

76:35

prisons, uh they will be desperate. And

76:38

so you want uh desperate people also

76:40

don't think rationally. And so at a

76:43

certain point uh you worry about your

76:45

survival

76:46

>> and you uh stop worrying about you know

76:48

maybe your civil liberties or maybe even

76:50

the constitution and that needs

76:54

Yeah.

76:54

>> And so I think we how do we protect

76:57

ourselves and insulate ourselves and our

76:59

communities from events that would that

77:01

are leading us towards that reaction and

77:03

the problem reaction solution paradigm?

77:05

>> Can I ask you where do we where where do

77:08

we stand on um

77:11

the race for AI and does it matter?

77:16

I mean, I see things I see things that

77:19

are being developed for the Pentagon and

77:22

for China that are terrifying. I don't

77:26

think people understand war. It It's

77:29

going to be as if you fought in the

77:32

Spanish-American War and all of a sudden

77:35

you were transported to, you know, uh,

77:39

World War II. Um, nothing is going to be

77:43

the same. Everything that we have is

77:45

going to be outdated. Every I mean the

77:47

killing that is possible in the very

77:51

near future with AI is breathtaking.

77:57

Am I wrong on this? Please say yes.

78:00

>> No, I don't think you're I don't think

78:01

you're wrong on that. Uh I think it is

78:04

incredibly deeply unsettling. Um it

78:06

allows uh not just war but war crimes to

78:09

be committed at scale with minimal human

78:11

involvement. And uh yeah,

78:14

>> if Hitler had just the technology, just

78:16

if Hitler had the technology just that

78:18

we know of today, there wouldn't be a

78:20

Jew on on the planet. There wouldn't be

78:22

one.

78:23

>> I mean, you can track people, you can

78:25

hunt them down, you can grab them, you

78:27

can you can convince them to do I mean

78:31

the the power. So

78:35

tell me where we are with AI on the

78:39

China and our race towards it and all of

78:42

this stuff. I I don't see us building

78:44

all these power plants. I've I've I've talked

78:47

to the president about, hey, we're going

78:49

to build all these nuclear power plants.

78:51

Well, you better hurry because if you're

78:54

actually fighting that war, we're not

78:57

going to have the power to run these

78:59

places. So where are we on all of this?

79:05

Well, I guess there's a couple different

79:06

ways um to to go here. Um and I'm not

79:11

really sure the best place to start, but

79:13

I guess um what I think of in in you

79:17

asking that question is um there was

79:20

this National Security Commission uh

79:22

called the National Security Commission

79:24

on AI. Eric Schmidt, unsurprisingly,

79:27

um, led it

79:29

>> and yeah, he and he, um, basically some

79:33

of in some of the documentation that

79:35

came out of, uh, of of that commission

79:37

via FOIA request uh, showed that they

79:40

felt that the only way for the US to

79:42

catch up to China, and I'm paraphrasing, this is

79:44

my opinion, um, was essentially to

79:46

become China, right? In the name of

79:49

beating China, we have to do all the

79:51

things. um God that sounds like the very

79:54

thing China gets criticized about. So the idea

79:57

was in China they use AI for everything.

80:00

AI has crept into every um

80:03

>> facet of a person's life in a Chinese

80:05

mega city. And so um we need to make

80:09

Americans uh use AI just as much if not

80:13

more in order to leapfrog uh Chinese AI

80:16

capabilities.

80:18

So, what did they suggest? And this is

80:20

right before COVID, by the way. Uh, they

80:21

suggested an end to in-person shopping

80:24

and an end to in-person doctor visits.

80:27

Um, an end of uh car ownership that we

80:32

should only use fleets of self-driving

80:34

Ubers that we rent. Um, and you know,

80:37

basically live through our phones and

80:39

live through apps uh that are powered by

80:42

AI because uh they argue China has a

80:46

larger population. It has a user base

80:48

that is feeding Chinese AI uh with so

80:51

much more data than Americans are

80:53

feeding American AI. Uh so we have to uh

80:57

harvest more data

80:59

>> from Americans faster

81:02

>> um in order to catch up.

81:05

So, um, at the at at the same time too,

81:10

you have a lot of big tech oligarchs

81:11

that have a lot of ties to China and

81:14

Chinese industry,

81:16

>> um, and Chinese tech companies that run

81:18

those things in China. Um, and you know,

81:22

I would argue, you know, is the AI uh,

81:25

arms race all the fear ginned up about

81:27

it just to sort of get us to acquiesce to

81:30

that same type of system here? And

81:32

sometimes, yeah, I uh that's what it

81:35

sometimes seems like to me.

81:39

>> An amazing final segment uh with Whitney

81:42

here in just a second. First, let me

81:45

tell you about Moxy. The seasons are

81:47

changing, the air is getting cooler,

81:49

days are getting shorter, and while

81:50

you're switching out the wardrobe or

81:52

putting on an extra blanket on the bed,

81:54

something else is happening. Something

81:55

you don't know about until

81:58

they're already inside. And that is uh

82:01

the through the leaves and the crawl

82:03

spaces and and everything else. Pests

82:06

are coming in. Ants, the silverfish, all

82:09

of them looking for the same thing

82:10

you're looking for. Warm place to hide

82:11

out for the winter. So unless you've

82:13

made it clear they're not welcome. That

82:14

place might be your house. Moxy. Moxy

82:17

pest control. They know exactly how this

82:19

plays out. They've been through it a

82:21

thousand times. They don't just show up

82:22

and spray something around your house.

82:24

They plan. They strategize. They build a

82:25

perimeter and they protect what's yours

82:28

from everything that isn't yours. This

82:30

isn't just about bugs. It's about the

82:32

line between what's out there and what

82:34

belongs inside of your home. The seasons

82:36

may change, but your peace of mind

82:37

doesn't have to. Celebrate 25 years in

82:39

business. Now you can get your first

82:40

pest control service for $25. Visit

82:43

moxyservices.com/beck.

82:46

Use the promo code Beck and get it.

82:48

First service $25.

82:51

Well, it is always fascinating to talk

82:53

to you. I can we talk about Jeffrey

82:55

Epstein for just a second? Um because

82:56

you are the foremost expert on that

83:00

whole web. Um

83:02

>> Oh, well thank you.

83:04

>> Well, you are um I mean I I was thinking

83:07

about it today when we were getting

83:08

ready to do the interview. I'm thinking

83:10

I don't there's nobody that knows more

83:11

about it than you. Do you think?

83:15

>> Well, you know, I would say my expertise

83:17

in my book, you know, about Epstein only

83:19

really goes up to um his first arrest.

83:22

And so I don't really consider myself an

83:25

expert in all the litigation that

83:26

followed that and all the civil cases

83:28

between his accusers

83:30

um and a lot of the court stuff and also

83:33

I feel like there's plenty of other

83:34

journalists that have covered um victim

83:36

testimonies and what victims have said

83:39

um

83:39

>> but

83:42

Jeffrey Epstein and where he came from,

83:45

what he was, there's nobody better than

83:48

you.

83:48

>> Um

83:49

>> thank you. So is there a

83:54

is there a black book?

83:57

>> So I would say first of all there is a

84:00

black book that has been published. It

84:02

was published by Gawker in 2015. It was

84:04

obtained by journalist Nick Bryant. Um

84:07

and that is the black book we have. Um

84:11

there is obviously documentation and

84:13

documents that the US government still

84:16

has that it has very openly over the

84:18

past several months uh made various

84:20

excuses for about why it will not

84:22

release them. Um as far as the black

84:27

>> uh no I don't

84:28

>> okay

84:29

>> um but I can I can guess about some

84:31

things. So, but there are also a few

84:33

questions that they could just answer

84:35

that don't necessarily involve document

84:37

releasing. Like, why was Zorro Ranch

84:39

never raided?

84:41

It's one of it's in the continental US.

84:43

It's an Epstein property. The New York

84:45

townhouse was raided. Why were

84:46

there not simultaneous raids on all of

84:48

his properties on US territory? Why not

84:51

coordinate that?

84:53

>> Uh, well, I don't know. I mean, Zorro

84:54

Ranch, there's a lot of um

84:56

>> that's New Mexico, right?

84:57

>> Spec in the New Mexico property. There's

85:00

a lot of speculation about what happened

85:01

there uh with women in particular. Um

85:05

and uh why was it never raided? I just

85:07

find that uh incredibly strange. And

85:10

also, you know, there's attestations

85:12

during the 2019 uh raid on the New York

85:15

townhouse. Uh that there were binders of

85:18

CDs um and you know, hard drives. You

85:21

know, what was what was the content? I

85:23

mean, Pam Bondi has now, more recently,

85:25

after saying she was going to release

85:27

them, turned around and said that

85:28

they're all CP. Um, I don't necessarily

85:31

know if that's true, but again,

85:35

>> Uh, child porn. I just

85:36

>> Okay. All right. Yeah, fine.

85:37

>> Preferred to use the abbreviation.

85:39

Sorry.

85:39

>> Yeah, that's all right.

85:40

>> Yeah. Um, but there's all sorts of um

85:43

things that that could actually be.

85:45

Again, we don't know. Um,

85:48

again, I, as I've said for a long time,

85:50

I think the Jeffrey Epstein case is a

85:52

bipartisan issue. Um, there's a lot of

85:55

powerful people that went to him, and it

85:57

wasn't exclusively, uh, for sexual

85:59

deviancy. There uh I I've argued for a

86:02

long time that Epstein was involved in

86:04

financial criminality uh money

86:06

laundering uh tax evasion and it seems

86:09

that there are a lot of very powerful uh

86:12

oligarch figures and many of them very

86:14

powerful big tech figures whose money he

86:17

was uh managing um and one one example

86:20

of that that came downwind of the USVI

86:23

uh case against u you know JP Morgan uh

86:26

was Sergey Brin in particular the Google

86:28

co-founder um and a lot of uh but those

86:32

cases were settled. Um the son of a

86:34

judge was murdered when she was

86:36

overseeing the Deutsche Bank uh Epstein

86:39

case. I think there's a a major interest

86:41

in not having those financial um uh

86:45

relationships fully untangled. Um and I

86:48

think um you know because of how

86:50

interwoven

86:52

um these these networks are um you know

86:56

it's it's uh not politically uh salient

87:02

for uh the Trump administration to

87:04

release them all for whatever reason. Um

87:08

>> can I can I ask you a question? I don't

87:10

know. How do you h how do you decipher

87:14

between an actual conspiracy

87:18

and I mean one that's been driving me

87:21

crazy is that Charlie Kirk was shot in

87:23

the back by a Mossad agent who used a

87:25

hatch that was in the grass right behind

87:28

him and shot him from behind. I mean

87:30

just crazy stuff. How do you when you're

87:35

looking at something? How do you go

87:37

about going ah that's worth looking into

87:40

that's not

87:43

>> well I think at this point for me it's

87:44

it's uh intuition and also the fact that

87:47

a lot of my work is historical so I look

87:50

back many decades and so if I get

87:52

inkling of something suspect happening

87:54

now and the the parties involved happen

87:57

to be directly connected to people that

87:59

I know engaged in wrongdoing and crime

88:01

in the past

88:02

>> then I I tend to be more inclined

88:04

because there's a there are pattern

88:06

patterns and a lot of these people

88:07

repeat the same tactics and the same

88:09

patterns uh of criminality over and over

88:12

again. Uh but I think also um yeah there

88:16

was a deliberate effort to try and

88:18

undermine the reporting on real

88:19

conspiracies by muddying the waters and

88:22

flooding it with crap language.

88:24

>> It was a CIA operative, wasn't it? It

88:26

was that said discredit people by

88:28

calling them conspiracy theorists.

88:31

>> Uh after the Kennedy assassination. Yes.

88:34

And so, but in addition to that, more

88:36

recently, uh, Samantha Power's husband, Cass

88:39

Sunstein, uh, wrote a bizarre paper. I

88:43

forget exactly when. I think it was in

88:45

the Obama era. Exactly what the quote

88:48

is.

88:48

>> It said, uh, even if it turns out to be

88:51

true,

88:53

discredit.

88:54

That was, it was like it was your first

88:57

go was to call it a conspiracy theory.

89:00

Even if it turns out later to be true,

89:02

it doesn't matter. Discredit, discredit,

89:04

discredit.

89:06

>> Yes. But in addition to that, there was

89:08

an there was something about

89:10

infiltrating conspiracy movements.

89:12

That's right.

89:12

>> In order to push the needle to a

89:14

narrative that's was more favorable uh

89:17

to the powers that be. So he as one

89:20

example he said a lot of conspiracy

89:23

uh the conspiracy movement in the US at

89:26

that time did not trust the government.

89:28

So, how do we infiltrate

89:30

conspiracy movements to make them trust

89:33

the government? And I would argue that

89:35

something like QAnon, it likely was

89:37

downwind of that.

89:38

>> Wow. I never thought of that.

89:41

>> But, but there's very it's very possible

89:45

um that that continues now. Uh I would

89:47

argue it does, especially, you know,

89:49

they know that a lot of this information

89:51

about past conspiracies or even current

89:53

ones, you know, can't always be put back

89:55

in the bottle. But if you muddy the

89:57

waters, you flood the zone, to use one

89:59

of their terms, with things that are are

90:02

dubious, you know, it becomes very hard

90:04

for people uh to sift through the

90:06

content.

90:07

>> And then we're left doing what Eric

90:09

Schmidt and Henry Kissinger proposed,

90:11

relying on AI to sift through all of

90:13

that for us, to tell us the right

90:16

answer. Um, so again, critical thinking

90:20

very important. Um, but I think, you

90:22

know, because trust is at is at an

90:24

all-time low. Um, you know, there's it

90:27

just depends on the person, uh, I mean,

90:29

obviously there's a lot of people that,

90:31

uh,

90:33

um, you know, are terminally online and

90:36

sort of drift into places where they

90:39

might think things are true that, um, I

90:41

I, you know,

90:43

>> certain people would definitely not

90:45

agree with. Um but again I think it just

90:48

comes down to individual discernment and

90:50

critical thinking which are uh qualities

90:52

that are not taught to people uh anymore

90:56

in in schools and uh you know it starts

90:59

with with parents teaching that type of

91:02

um discernment. And for me personally

91:04

you know I I think history adds a lot of

91:06

the necessary context to having that

91:09

ability to discern. Um and so I would

91:12

you know urge people to look at um you

91:15

know what these particular networks um

91:19

have done decade over decade. You know

91:22

what the reason my book is so long and

91:24

is in two volumes is because you know I

91:26

thought that the repeated patterns by

91:28

the repeated individuals that are all

91:30

connected together would show that

91:31

obviously there is something wrong here.

91:34

uh maybe we won't get an admission you

91:36

know from Bill Gates in writing about

91:38

his Epstein relationship or you know

91:40

from uh intelligence agencies that they

91:44

had connections to Jeffrey Epstein in an

91:46

affidavit. It's very unlikely we'll get

91:47

those documents. So what can we look at

91:49

in uh in in you know the public record

91:53

that's publicly available? And uh

91:56

obviously I think you know my book shows

91:58

that there's various instances the same

92:01

individuals repeating the same tactics

92:04

um over and over again using a lot of

92:05

the same institutions to do so. Um and

92:08

how you know the it just stacks so much

92:12

that it becomes to me quite obvious that

92:14

something is is very wrong um with that

92:17

particular network. And when you have so

92:21

many instances of financial crime, arms

92:24

trafficking, sex trafficking

92:25

concentrated with such a in such a small

92:28

group of people, many of whom have ties

92:31

to the organized crime uh gangs from,

92:34

you know, America's not-so-distant past.

92:37

Um, you know, it to me it looks like

92:39

that a lot of those people rebranded and

92:41

basically the main thesis of my book is

92:43

that those organized crime interests got

92:46

in bed with our our intelligence

92:47

agencies. Um, and some of those

92:50

organized crime figures rebranded as

92:52

philanthropists or other things. Um but

92:55

ultimately um you know it's what that

92:59

that entity that fused entity ultimately

93:01

wants is an authoritarian

93:04

uh government

93:06

and we have to u fight against that

93:09

despite you know all the things that

93:11

they could throw on us and all the

93:12

manipulations

93:14

um that they may target us with. Um,

93:17

which again I think over the ne over the

93:20

short term it's going to be more than

93:21

we've probably ever seen uh before but

93:24

people have to be very steadfast in uh

93:27

how much the constitution matters to

93:29

them that constitutional rights are for

93:31

every American not just the American

93:33

that we happen to agree with. Yes. And I

93:36

think that who benefits the most if we

93:39

start hating our neighbor and want to

93:41

kill them

93:42

>> you know. Um, so

93:44

>> last question.

93:46

What keeps you up at night? What are you

93:49

looking at the future over the horizon

93:50

and going, "Oh my gosh." Well, there's

93:53

more than a few things I guess I would

93:55

say right now, but I'm I'm very

93:57

concerned, you know, as as a parent, you

94:00

know, seeing a school um

94:03

kids that go to school with my children

94:07

or that we just know or or seeing other

94:09

kids uh of other people online just how

94:13

sucked into um technology they are and

94:16

some and some of them how much they

94:18

identify with the technology more than

94:21

the real world. That worries me greatly,

94:23

especially considering that we saw this

94:25

push a few years ago um for the the

94:28

so-called metaverse as it were. Um and

94:31

getting people to want to live in a

94:33

virtual

94:35

uh reality.

94:36

>> Um and actually this political

94:39

philosopher who's very close to to Peter

94:41

Thiel, um Curtis Yarvin, he has this

94:45

quote about what should be done with the

94:47

undesirables of society. He calls it a

94:50

humane alternative to genocide. And it

94:52

sounds just like something Klaus Schwab

94:54

would say. It was basically, you know,

94:56

the best. I have the quote. I could read

94:58

it to you. It's on my desk.

95:00

>> It's probably like uh Yuval Harari's

95:04

quote.

95:04

>> It It truly is. But this is somehow

95:07

someone that is popular uh in certain

95:10

right-leaning circles in the US right

95:12

now. Um but I

95:13

>> What's his name? I got to look him up.

95:15

>> Uh Curt Curtis Yarvin.

95:18

>> Okay. Okay. Uh he said the best humane

95:19

alternative to genocide I can think of

95:22

is not to liquidate the wards meaning

95:24

people either metaphorically or

95:26

literally but to virtualize them. A

95:29

virtualized human is in permanent

95:30

solitary confinement waxed like a bee

95:33

larva into a cell which is sealed

95:35

except for emergencies.

95:37

>> Oh my god. This would drive him insane

95:39

except that the cell contains an

95:41

immersive virtual reality interface

95:43

which allows him to experience a rich,

95:46

fulfilling life in a completely

95:48

imaginary world.

95:49

>> It is the Matrix.

95:51

>> It's the Matrix. Uh, and I think I just

95:55

worry about how uh I I know I see

95:59

parents that are my age that probably

96:01

shouldn't be parents at all that just

96:03

pass, you know, tablets or phones to

96:05

their kids and just want to focus on

96:07

their own stuff or their own screens and

96:08

don't parent. And we are and those

96:10

people are inadvertently socially

96:12

engineering their children to live in

96:14

that kind of reality if they are deemed

96:17

undesirable or you know part of this

96:20

underclass that AI is going to act upon.

96:24

Um, and that's what really unsettles me.

96:27

Uh, because I think a lot of this, um,

96:29

if they can't do it, um, you know, now,

96:32

they'll absolutely try in future

96:34

generations. And if we don't prepare

96:36

them for this and prepare them to live

96:38

and stand up for the real world and to

96:40

stand up for what it means to be human,

96:42

if they forget, if they never learn

96:44

Yeah.

96:44

>> Uh, you know, what it means, we could,

96:47

then, yeah, I think we could lose it.

96:48

And so, you know, I think uh there's

96:51

never been a more important time uh to

96:53

to be a good parent.

96:55

>> Wow.

96:56

>> Uh than right now.

96:58

>> Good for you. I I just I really love

97:01

talking to you. You're so bright and so

97:04

centered and that's rare. Thank you.

97:06

>> Thanks.

97:07

>> You bet.
