TRANSCRIPT (English)

DOGE & Palantir WITHOUT Musk | This Changes Everything.

29m 5s · 5,159 words · 770 segments · English

FULL TRANSCRIPT

0:00

So, just so you know, yeah, the New York Times is handing DOGE a win. That's pretty rare, 'cause, well, you know how the Times likes to be. Musk may be gone, but his team burrows in deeper. At the Department of Energy, a former member of DOGE is now serving as the chief of staff. And at the Interior Department, many DOGE employees have been converted into federal employees, and they're now deeply embedded in the agency. Same thing at the EPA. And the story then goes on over here, and I want to see what else they're saying. But what's fascinating is, basically, Elon Musk's DOGE effect might live on even though Musk is gone. Which is kind of cool, because that's what we all wanted: that somebody would bring efficiency to the government. Uh, bring technology, bring artificial intelligence, bring LLMs, you know, whatever. Uh, bring Palantir to the government, whatever, to operate the government more efficiently. And if that could be done with sort of the seed that Elon spread (which is basically the same thing he's doing with children, but in this case it's DOGE employees), it's actually kind of cool.

1:05

Some DOGE members on Friday expressed concern that the president could choose to retaliate against Mr. Musk by firing people associated with the initiatives. Okay, so because of the whole, like, meltdown from Elon, which he's now deleted, people were worried that maybe DOGE could fail because of this. But I don't actually think so. I think Donald Trump likes this. You know, Donald Trump gave Musk an easy win here, because if DOGE can save any money, Trump can turn around and spend it and be net neutral. Everybody's okay with neutral; neutral or cuts, great, you know. They just don't want to see more debt, and that's what Elon was so pissy about in the first place anyway.

1:40

Others could... let's see here. Uh, but the approach that DOGE has embodied at the outset (deep cuts in spending, personnel, and projects) appears to have taken root even with Musk on the sidelines. DOGE on Friday notched two legal victories. The Supreme Court said it can have access to sensitive data, uh, Social Security data. Wow. And it ruled that, for now, the organization does not have to turn over internal records to a government watchdog group as part of a records lawsuit. That's crazy. So you can't FOIA what DOGE is doing right now, and they get access to Social Security data.

2:18

And even with Musk gone, the sort of DOGE virus, if you will (though that makes it sound bad, but we'll just go with it, because it's like the spread is spreading): "Doge staffers are becoming far more institutionalized within government agencies," said the director of the White House Office of Management and Budget. Uh, let's see here. A White House spokesman said, "The mission of eliminating waste, fraud, and abuse is part of the DNA of the federal government now and will continue under the direction of the president." With DOGE, Musk sought to orchestrate an extensive overhaul of the government. He promised to eliminate a third of the federal budget, $2 trillion, and remake federal agencies into a streamlined, tech-oriented entity that would operate like a business. The billionaire adopted the same playbook at Twitter.

3:06

Yep. We know. The group's errors have included posting "billion" instead of "million." Okay, we already know that. We already know the mainstream media has basically gone through a bunch of the contracts that Elon says they cut, and they're like, "Well, some of those you only cut because you killed DEI. Some of those were already cut. Some of those you typoed. Some of those you counted at the full contract value and not the part that was actually assigned." And so, after lots of tweaking, this is where people think DOGE is so far: saved 50 or 60 billion. But I think the legacy of DOGE actually lives on so far.

3:42

Like, I think a lot of people thought Elon's exit from the White House would lead to the failure of DOGE. So far, it sounds like you might have just embedded a bunch of AI tech bros into the government, and they're getting, like, jobs at the government in sort of a non-traditional way. It's almost like, if it was a virus to take tech-bro and AI culture and inject it into the government, it's now spreading. That's kind of cool, actually. I don't know. Like, I'm kind of like, damn. All right. I mean, like, Elon can be out, but, you know, "I've planted a good seed."

4:15

Courts blocked some of DOGE's initiatives. Some dismissed employees were reinstated when their work was proved essential. All told, DOGE has tried to gain entry into more than 80 data systems across at least a dozen different agencies. The data sets include systems that hold personal information about federal workers, detailed financial data about federal procurement and spending, and intimate personal details about the American public.

4:42

Now, the one thing that's a little weird about it all, I will say, is sort of the Palantir-ification of American data. And, I mean, there were a lot of stories on this, but I'll just use the Times of India here; there are endless outlets reporting the same thing. Oh, this is actually just using a New York Times piece. Oh, that's funny. Yeah, here it's just the New York Times piece. Fine.

5:09

They've reposted the Times. Anyway, in March, President Trump signed an executive order calling for the federal government to share data across agencies. And apparently they've now expanded Palantir's work across the federal government in recent months. This, by the way, is something I mentioned when Trump won and DOGE came in. I actually talked about Palantir on the channel quite a bit, and I said this is good for Palantir, because Palantir, as they say in the earnings calls (you know, I think Karp is right when he says it), is like: we don't cost money, we make money. Right? Like, we make these entities money. So Palantir is the perfect DOGE tool. Like, you could literally just be a DOGE worker and go install Palantir everywhere, basically. Okay, I'm oversimplifying, right? But the company has received more than $113 million in federal government spending since Trump took office. Whoa, that's actually big, because I wonder what proportion of their revenue that is. I mean, let's look it up: Palantir investor relations. So: financials, quarterlies, quarterly report.

6:08

Whoa, look at this. So Palantir in the first three months of the year took in $883 million. But apparently, in just about the three months that Trump has been in office, Trump has sent about $113 million to Palantir. $113 million, just roughly (I know the time periods don't align), divided by 883: whoa, about 12.8% of Palantir's revenue so far has come from Trump, between new contracts and existing contracts. And that does not include the DoD contract awarded. Holy smokes.

6:53

Social Security and the IRS: of course they're going to use Palantir. It's a great product for putting all this stuff together, creating detailed portraits of Americans. Like, if you wanted to create the perfect deep state that has all information about everything you've ever done, said, or even what you think you're going to do next, or every entity you want to create, or every entity you want to close, every LLC you want to form, every trust you want to file, every public notice you circulate in the newspaper, Palantir is the perfect way to do it.
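The revenue-share arithmetic above is easy to sanity-check. A minimal sketch, using the rough, time-misaligned figures quoted in the video (~$113 million of federal spending against ~$883 million of quarterly revenue), not audited numbers:

```python
# Back-of-the-envelope: what share of Palantir's quarterly revenue the
# quoted federal spending represents. Both inputs are the rough figures
# read off on screen in the video, and the time periods don't align.
federal_spend_m = 113   # ~$113M in federal spending since Trump took office
q1_revenue_m = 883      # ~$883M Palantir revenue in the first quarter

share = federal_spend_m / q1_revenue_m
print(f"{share:.1%}")   # prints "12.8%"
```

With these inputs the quotient comes out to roughly 12.8% of the quarter's revenue.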

7:22

Like, you know, when you go file, like, a "doing business as," you're supposed to circulate a notice in the newspaper. Dude, who reads that? Nobody reads that crap, you know? Like, I don't even know that I could find one here. Who knows, maybe the USA Today has one somewhere in the back. Usually it'd be, like, the VC Star, but I canceled them because they were scummy to me during the campaign. Um, but anyway, usually there's a section, like, way in the back of the paper where they'll be like, "Oh, so-and-so filed, you know, a petition for a doing-business-as or whatever." Nobody reads it. But if you have Palantir tie it all together and somebody looks up Kevin Paffrath, they could be like, "Oh, filed this trust, and did this, and this entity, whatever, all these licenses; you know, does this with this agency and this with that." Really interesting. I mean, like, it is the perfect George Orwell database. And I'm not saying I promote it; I'm just saying, if you wanted to do that, Palantir would be the perfect product for it.

8:17

The Trump administration has already sought access to hundreds of data points on citizens through other government databases, including bank account numbers, student debt, medical claims, and disability status. Wow.

8:31

Trump could potentially use the information to advance his political agenda by policing immigrants. I definitely believe he's going to do that. 100%, he's going to do that. And punishing critics. Okay, so basically the only way to protect yourself is just YOLO calls into Palantir. Obviously, no. Okay. Palantir's valuation is really, really high right now, but you can see why it's going up, 'cause they keep plowing money into them. Privacy advocates, student unions, and labor organizations have filed lawsuits. Yeah, good luck with that. Palantir's selection as the vendor for the project was driven by DOGE. There it is. See? Yeah. Palantir and DOGE. I'm telling you, all you need is the virus, so to speak, of the tech bros going in with their little USB drives. Wait, wait for it. Hold on. Hold on. Hold on.

9:19

Uh, all right, bros. Uh,

9:25

USBA or

9:27

USBC? Where are we putting

9:31

Palanteer? And then Palunteer distress.

9:34

Obviously way oversimplifying here, but

9:36

this is very interesting. Uh, okay. Some

9:39

current and former Faller employees have

9:41

been unnerved by the work. Okay, get

9:43

over it. the company risk becoming the

9:45

face of Trump's political agenda? Uh,

9:48

whatever. Everybody, people have

9:50

complained about this forever with

9:51

Palanteer. This is old news. Some

9:53

employee, former employees have signed a

9:55

letter urging Palunteer to stop its

9:56

endeavors with Trump. Nobody cares about

9:58

your letter.

10:00

Nobody. The scoring of the shrine. The

10:05

scoring of the shrine. A letter from

10:06

concerned Palanteer alumni to tech

10:09

workers of Silicon Valley. Nobody cares.

10:12

like this is democracy versus big data.

10:15

Palanteer's leadership has abandoned its

10:17

founding ideals. Actually, I think this

10:19

is exactly what Palanteer was built for

10:22

this sort of technology and people will

10:24

pay huge money for this data that is

10:27

collected for one reason should not be

10:28

repurposed for the uses of other. Sorry.

10:30

Combining all that data, even with the

10:32

noblest of intentions, increases the

10:33

risk of misuse. Of course, because

10:35

imagine how much power politicians are

10:38

now going to have during elections when

10:40

they have competitors, you know, like

10:43

Gavin Newsome collects all the data on

10:45

his competitors when he runs for

10:46

president.

10:48

Dude, there's so much access they have here. Palantir declined to comment. "We act as a data processor, not a data controller." Yeah, I mean, that's a great cop-out for Palantir, but I don't blame them; they make a crapload of money. Like, if I were Palantir, I'd be like, "Bro, yeah, I mean, whatever, man. If that's what they do, that's what they do." At the IRS, Palantir's engineers joined in April to use Foundry to organize data on American taxpayers. Yep. Their work began as a way to create a single searchable database for the IRS, but has since expanded. Palantir is in talks for a permanent contract with the IRS.

11:24

Dude. Oh, they're going to know everything about everyone. This is insane. Social Security Administration, Education Department. This is huge. Some people are quitting because of their partnership with ICE. Yeah. And this is even as Musk is gone. That's why I'm saying: Musk only had to push the button to start the process, basically. What, Shri Shrier? I don't know what the hell that is. I don't care. All right.

11:57

So anyway, those data sets include... okay, we read that. Several days before his departure, Mr. Musk was optimistic about the legacy he was leaving, that DOGE will only strengthen over time. By Tuesday, Mr. Musk was fretting that his accomplishments were being washed away by Trump's big, beautiful bill. Yeah, we already know about Trump, or Elon, freaking out. At the EPA, they... let's see here. DOGE employees in hand; we see the policy is continuing on. Fine. Actively listening to the recommendations. That's fine. In recent weeks... that's boring. Canceled contracts with Harvard. Big deal.

12:35

The Social Security Administration, one of the most politically sensitive agencies in government: two members of DOGE are effectively serving as co-chief information officers. Whoa. Two members of DOGE, Aram Moghaddassi and Michael Russo, are effectively serving as co-chief information officers, according to two people with knowledge of the arrangement. Mr. Moghaddassi appeared alongside Mr. Musk and other members of DOGE during a Fox News interview. Still, the continued influence of DOGE could diminish in the coming days if the White House chooses to retaliate. I don't think the White House is going to retaliate against DOGE. See, Trump can take the good of DOGE and still basically cut ties with Elon. If you want to work for Elon, you're not going to go work for DOGE anymore. Well, yeah.

13:26

Okay, that's fine. DOGE's fate could be settled in the courts. A federal judge allowed a lawsuit to proceed challenging the entire DOGE operation. And the opinion of Judge Chutkan (oh, we've heard about her before) noted that DOGE has been accused of seizing control of at least 17 federal agencies. Several federal agencies have been dismantled. Thousands of federal employees have been terminated or placed on leave. Sensitive data has been haphazardly accessed, edited, and disclosed. Blah, blah, blah. The court fight will go on, but honestly, this spread, or the Palantir-ification of American data, is coming whether you like it or not. Like, I don't know how I feel about the government, you know, being able to tie all your stuff together, but, you know, I'm kind of of the mindset that it's going to happen whether I complain about it or not. So I'm just going to, like, consider that, I guess.

14:23

And, um, I don't know. I guess you have to say nice things about the government in power now. No, I'm just kidding. I'm still gonna tell you all the same stuff. But, uh, anything else on that? Isn't that very interesting? I do want to see some reaction. Let's get some reaction on social media on this Palantir thing. Uh, let's see.

14:53

Twitter first. "Big update is a massive overhaul of design language... liquid glass aesthetic." I don't care. "US weapons cannot save Taiwan." Okay, fine. Sarah Dichi, "someone who is still on iOS 18": "I'll happily skip straight to iOS 28." All right, let's do... what is this? This was the Palantir data. Let's just see what people were ranting about. You know where you'll probably see the rants: on, uh, Reddit.

15:19

"Asked about Palantir's compiling of government data, JD Vance says he is more concerned about surveillance via commercial ad-data brokers." What? No way. "So, like, Trade Desk... the Palantir company, right? Yeah. Yeah. And that's about a surveillance thing. So this thing is [ __ ]. This sounds crazy, dude. Like, this sounds like we're only going to be human for, like, two more years, right? So this is the Palantir company, where they're going to build security databases that have all of our information in them. Right."

15:48

That's what they're saying. That's what they're saying. Yeah. And it's going to have, like, everything. JD's like, "That's exactly what we're doing." Look at you... and know, like, if you're good at tennis, or if you've ever, you know... if you have, like, $60. It's more than that, bro. It's auditing. It's: do they want to license you as a financial adviser, as a broker? Do they want you to pass the bar? Do they want you to get a permit for your home? I don't know; it could hit everything, man. The amount of data... like, every agency will be able to tap into this in the future and be like, "Do we want this person as part of our organization?" And every little thing you've ever said on the internet could come back to haunt you, because now, when you want to go get your real estate agent license and they Palantir your ass, the Department of Real Estate is going to be like, "Well, in 2021 you said this about Gavin Newsom. You know, we don't think agents should speak like that. License deferred." And they just put you in, like, this loop of deferral. Now, obviously I'm just coming up with an on-the-spot example here, but, like, every agency getting access to this data... boy, you're basically just going to have to behave a whole lot better, because they're going to know everything in your pocket. Or, you know, it can tell if, you know, your kid has a limp or whatever, if he'll be in the Christmas play, you know what I'm saying? It can tell all of that.

17:18

And that feels real scary, man, like, to a regular guy on the street. That feels like we're going to give up our society... like, we're going to become these... like, they're going to know everything about us. It makes you feel like you won't be a person anymore. I hear you. I don't know what the hell this guy is saying about it. Makes you not feel like you're going to be a person anymore? I'm still going to feel like a human, okay? Coffee's probably still going to hit the damn same. I go on a run, it's still gonna hit the same. I go plant some Irish moss, it's gonna hit the damn same, okay? It's not gonna take away "human." It's going to make it harder for you to do business if you're a critic of the establishment. That's what it is. That's not taking away your humanity; that is instead a form of business censorship. That's what I'm more concerned about. "Oh my gosh, they have my data; I don't feel like a human anymore." What the hell.

18:19

Does that make sense to you? "It definitely makes sense to me. I mean, let me try to explain. So, to be clear, I'm not an expert on this particular deal. I actually just read about it earlier today, or maybe..." Okay, this is the classic politician thing to say: "Oh, well, you know, I actually haven't heard about that yet. I just saw the headline. Uh, sorry, so I don't know." "...yesterday. But the president did an executive order, I don't know, a couple months ago, and the basic idea is: you've got all this different information, but it's not accessible in one place. So, like, let me give an example of where this might be useful. Let's say you catch an illegal immigrant. Okay." And by the way, this is the perfect talking point to defend this: immigration. Because all the Republicans are gonna be like, "Hell yeah, they damn illegal anyway. Get them the hell out!" Forgetting that it's also data on the legal citizens. "That person's using a Social Security number, but the Department of Homeland Security, which arrests the person, can't actually figure out what Social Security number that illegal immigrant is using, what name it's attached to." Okay. Well, because often they're also using taxpayer identification numbers as opposed to SSNs. But... "Or, you know, let's say you're, like, investigating some terrorist, and the FBI arrests the person, but, you know, there's information about, like, where the person lived a couple of years ago that you'd like to have, so you'd maybe like to go, you know, talk to their friends or associates or whatever."

19:51

Yeah. "My understanding is... it does sound odd, but what they're trying to do is take all the information that the Department of Homeland Security has, that the FBI has, and just make it so that it's actually not in some hyper-inefficient system; it's all sort of accessible. And here's the thing: modern technology is just crazy and weird, and it affects our privacy." And I think this is, like, the best vice-presidential defense: "Hey, man, we know it feels uncomfortable, but it's okay. Just let it slide in." I mean, we don't have to think that. Oh, hi, Jack.

20:32

What's up, dude? How can I help you? What happened? Oh, nothing. We're just listening. "It is, like, a reality of the world that we live in, right? Sometimes we get a little romantic about things." Well, I mean, look... "And I think it's going to go back in time. I mean, look, everybody... I kind of agree with you. I think people are sort of going to rebel against technology a little bit; in some ways, they already are. But I mean, look, dude, this has happened to me so many times, where I'm talking to my wife, like, 'Oh, what are we going to make for dinner for the kids tonight? Oh, let's just, like, do DoorDash or Grubhub.' And then you go on, like, X, or you go on, you know, Facebook or Instagram, and there's, like, an advertisement for a DoorDash coupon, and it's like, well, I was just talking about this 10 minutes ago. So we know that big technology spies on us and harvests our data. I honestly worry more about that than about, like, connecting the DHS system to the FBI system. So that's all that this Palantir deal is... like, a lot of that is just connecting information. So I'm hardly an expert, but that is my understanding: it's just taking, okay, DHS has information, FBI has information, the Secretary of the Treasury has information, and making it possible for that information to be searched." Yeah, by whoever is looking for it.

21:43

"That's my understanding." Dude, Theo, what he just said wasn't that surprising. We already knew that. We already knew that. And this is the response... "looking for it." Oh, that's my... Can we meme that? Oh, so you're saying it's not a big database that's going to collect all of our data and make it easier to search; it's just the big database that's going to collect all of our data and make it easier for law enforcement and government agencies to search. "Oh... understanding." Yeah. And again, like, I hear that same story, and my reaction is the same, which is: oh, I don't like the government having my information. "The reality is, the government already has my information, and, more importantly, some of these private technology companies have way more information on me than the government does." Well, "they're doing it, so we need to do it too."

22:36

Oh yeah. I mean, anywhere you shop has

22:39

unbelievable information on you. I mean,

22:40

let me let me tell you like Blockbuster

22:42

two days ago, Blockbuster was like,

22:44

"Happy birthday." They were rolling. So

22:46

I I got to I got to It's like you're out

22:50

of business. There's a there's a Yeah.

22:52

When was the last time you were in a

22:53

Blockbuster?

22:56

But some guy somewhere probably in

22:59

another country, Nepal. I think this is

23:01

why I don't I don't ever watch this guy,

23:03

but I think this is why JD Vance will go

23:06

on the show because he doesn't actually

23:07

have to answer real questions. You and

23:09

FDR somewhere. You and FDR took your

23:11

Celsius and went down to Blockbuster to

23:13

get some get some VHS rentals. I Okay,

23:17

little bit of

23:22

boogie. That's all right. I was okay. I

23:25

I was getting a brief so because that's

23:27

what people when I first became No, no,

23:28

they they are and I get it. And look,

23:30

like all all I tell you is we try to be

23:32

as Oh, yeah. I did go to the last

23:33

blockbuster. I forgot about that. Shut

23:35

up. How old was that? 14 years ago.

23:38

Lived out there. Got stuck in the snow out there. Yeah. "One of the guys who works on the national security team of the Trump administration, um, gave me this brief about how... okay, when you're using an iPad, and let's say, you know, you're reading a story from some random newspaper, and you hover on a particular paragraph, like, your iPad is collecting that information on you." God. "Like, it's actually trying to track what you're doing. Like, that is the stuff that really freaks me out." Well, how do we stop that? "I think it freaks everybody out. I think it just makes people sick." It's like... so, I think that's interesting. It's like...

24:15

Oh, the idea is: okay, advertisers use cookies that follow you around to try to advertise products to you. This isn't a surprise. I actually think JD is probably right in saying that corporations have a lot of data, because I think that, you know, these always-on microphones that you now have in your phones, or Alexas or whatever, they collect a lot of advertising data on you. I mean, to me it's no surprise that when you talk about certain things, you start getting notifications related to those topics. Thank you, Jack. Like, you know, Jack wanted Chipotle the other day, and it was the weirdest thing ever. I don't think I ever get Chipotle notifications, but I was talking to Jack about maybe ordering Chipotle, like, "Oh yeah, we could order Chipotle in a little bit" (you know, in the car, I think we said). And then, as I was still in Home Depot, I get a notification, and it was, like, you know, some kind of Chipotle special. And I'm like, dude, I never get Chipotle notifications. So I don't know if I just, like, noticed it, or if it's part of the Apple ecosystem to listen to you and pitch and promote things that are relevant. And maybe Apple advertises that as, like, yes, "relevant notifications," right? So really, the only way to kind of get away from that is you would just have to have no listening devices anywhere, which is kind of crazy.

25:39

Um, so, yeah. But, I don't think that, like... so, to me, that's weird, and I don't like it. But I don't think saying "companies do it, so the government should do it too" makes me feel better about the government doing it, right? I think that's why people are frustrated about this Palantir-ification.

25:58

That said, it makes sense. You know, again, if I wanted to make the government more efficient and I wanted to fulfill the DOGE mission at the government, I would go install that Palantir USB stick (again, for oversimplification purposes) everywhere I could. And so, you know, absent a recession, you're probably not going to get a good deal on Palantir for a while, because I think it's still trading for, like, a six-pack. What up, homie? Let's look up the valuation. There might be a little bit too much milk. Oh, there can never be too much milk. By the way, I'm starving. Is Mom around? She's taking a shower. Would you mind asking her anyway if she can make some corn, beans? Yes. Thanks, man. And chicken, if there's chicken.

26:47

here. We've got EPS of

26:52

uh 130 bucks divided by 58 cents 244

26:56

times. Growth is expected to

27:00

be 30% growth and it's trading for 244

27:03

times. So that puts me at an 8 peg.

27:07

That's insane. An eight

27:10

peg. So yeah, I mean Palanteer is really

27:13

really rich. I love I want to own

27:15

Palanteer at lower valuations. So, and

27:17

I'm happy for everybody who's made money

27:19

on pounder. Please diversify a house

27:21

hack. But, uh, 40 times one. Let's see.

27:24

If I put them at a 269 time 40 *.58, it

27:28

should be trading for no more than 60

27:29

bucks at a at a fair valuation. It's

27:32

twice its fair more than twice its fair

27:34

valuation. Um, probably two and a half

27:37

times its fair valuation, but anyway.
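The valuation math above can be reproduced in a few lines. This is a sketch using the rough figures quoted in the video (share price ~$130, EPS ~$0.58, ~30% expected growth), not verified fundamentals. For what it's worth, $130 / $0.58 is about 224x, putting the PEG nearer 7.5 than 8, and 40 x $0.58 is about $23 rather than $60 (the speaker grants on air that he "multiplied a little bit too much"); either way the "really rich" conclusion holds:

```python
# PEG-ratio sketch from the rough figures quoted in the video.
price = 130.0           # approximate share price
eps = 0.58              # approximate earnings per share
growth = 30.0           # expected growth rate, in percent

pe = price / eps        # price-to-earnings multiple, ~224x
peg = pe / growth       # PEG ratio, ~7.5
fair_at_40x = 40 * eps  # price at the 40x multiple allowed above, ~$23.20

print(f"P/E ~{pe:.0f}x, PEG ~{peg:.1f}, 40x EPS ~${fair_at_40x:.2f}")
```

A PEG near 1 is the usual rule-of-thumb "fair" level, so anything north of 7 is extremely rich by that crude measure.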

27:39

Uh... yeah. Okay, yeah, I multiplied a little bit too much; I gave a little bit too much credit there. But anyway, that's my take on Palantir, and on Elon and DOGE, which is good. Some legacy. What? Oh, could you put that key there? Thanks. All right. Hey,

27:53

Quick reminder: if you want to earn a 5% yield plus all of the upside in my real estate startup, we are offering a non-accredited investor round where we will pay you a 5% yield, plus you get all of the upside in the stock when your stock converts in the future. It gives you an opportunity to diversify away from rich stock-market valuations, diversify away from most tariff risk, and diversify away from the craziness of the Treasuries market, especially with these Moody's downgrades. You diversify into fixer-upper real estate, which is what I believe I do best. This is my baby; I call it my little Berkshire Hathaway. Now, maybe I'm biased, but I've got over $5 million of my own money invested in this business, and I'm so excited for what I think we're going to be able to do with this business over not just the next years, but the next decades. So if you want to invest in my real estate startup, go to househack.com, and know that I will do everything in my power for this business to make sure we can do the absolute best that we can. Make sure to read the solicitations and disclosures on the HouseHack website, because there is risk with every investment.
