TRANSCRIPT

Spiritual formation and AI: A deep dive with Andy Crouch and Jay Kim

1h 18m 33s · 14,069 words · 2,076 segments · English

FULL TRANSCRIPT

0:00

Pastoring in the digital age is not

0:02

easy. It's likely that historians will

0:04

name 2007 as a key inflection point in

0:08

human history. That's the year that

0:10

Steve Jobs released the iPhone into the

0:13

wild. And it was a before moment for the

0:17

church in the West. In hindsight, many

0:20

of us uncritically adopted technologies

0:23

like the smartphone or social media

0:25

without knowing just how these new tools

0:28

were designed to malform our souls. It

0:31

was all so new that we missed a chance

0:34

to pastor people into an alternative

0:37

future. Now we're seeing the emergence

0:39

of AI and we think right now might be a

0:43

2007 moment for artificial intelligence.

0:46

We do not know the future, but we want

0:48

to learn from our mistakes in the past

0:50

so we can thoughtfully pastor people in

0:53

the digital age. To that end, we've

0:56

created a conversation between two

0:58

trusted guides. My friend Jay Kim is the

1:01

pastor of Westgate Church in Silicon

1:03

Valley, serving on the bleeding edge of

1:05

the AI revolution, and Andy Crouch is a

1:08

partner with Praxis in New York City,

1:10

author and public intellectual. We've

1:13

asked them to offer historical context,

1:15

ancient Christian wisdom, and just

1:18

practical advice on pastoring the church

1:20

into the next era of the digital age.

1:23

Enjoy their conversation.

1:32

>> Andy, it is so good to be back with you.

1:35

It's been too long since you and I

1:37

>> too long

1:38

>> been together. Um I I'm a pastor in the

1:41

Silicon Valley. You work with

1:43

entrepreneurs, many of whom work in

1:46

technological spaces. And so, uh, in

1:49

different ways and similar ways, you and

1:50

I, I think, are both deeply aware of,

1:53

um, how quickly, uh, one, just how

1:56

quickly artificial intelligence is

1:58

changing and evolving and expanding, but

2:00

also how quickly the conversations about

2:03

AI are evolving and moving. And um there

2:07

are so many lenses through which we can

2:09

have a conversation about AI. But for

2:12

our purposes today um to to be as

2:15

helpful pastorally as we can to people,

2:17

we want to look through the lens of

2:20

practical pastoral thoughts on how

2:24

followers of Jesus might consider uh not

2:27

just our engagement with AI but but how

2:29

we might even think about AI. So you

2:32

know our our belief is that um the life

2:34

of discipleship is to be with Jesus, is to

2:37

become like Jesus in all of life. So we

2:40

want to ask questions about what sort of

2:42

obstacles and opportunities are there

2:44

for followers of Jesus when it comes to

2:47

being with him and becoming like him um

2:49

at the intersection of these new

2:51

technologies. But I want to lay the

2:53

groundwork a little bit before we get

2:55

into some of the practical pastoral

2:57

implications. Um, let's just talk about

3:00

like what is it that we're talking about

3:02

when we talk about AI? Let's just start

3:04

there. Yes. What what are we talking

3:06

about when we talk about AI? And then

3:08

maybe more importantly like why why are

3:10

we talking about AI? Isn't it just

3:12

another technology? We've kind of been

3:14

through this before. So what is it and

3:16

then why are we talking about it?

3:18

>> Yeah. Let me try to set the stage um by

3:21

going way back. Yeah. And and just

3:23

starting with tools that human beings

3:25

have always had tools and the Greek word

3:27

technē that we get all this language of

3:29

technology from originally refers to the

3:32

craft of kind of making and using tools

3:34

and this is this goes as far back as the

3:37

human story goes back. So we've always

3:39

had tools. About a hundredish years ago,

3:43

150 years ago you might date this to the

3:45

steam engine in some ways um we develop

3:47

a new kind of thing uh and we initially

3:50

call them machines. uh I think the best

3:53

broad word is devices which is things

3:56

that unlike tools sort of operate on

3:58

their own that is they actually can do

4:01

things for us without us, or without us

4:04

being very involved. So

4:07

tools always require a human being to be

4:09

using skill and knowledge and attention.

4:12

I mean woe to you even with a tool as

4:14

simple as a hammer. If you're not paying

4:16

attention you're going to hit your

4:17

thumb. You're going to miss the target.

4:19

It actually requires a surprising amount

4:20

of skill to hammer things, right? So

4:22

tools require human engagement, skill,

4:24

presence, attention, power, even like

4:26

literal like that where do they get

4:28

their energy from? They get their energy

4:29

from human bodies. Um

4:32

devices sort of change the game in a

4:35

pretty amazing way. They they often have

4:36

autonomous sources of power. That is

4:38

they can power themselves. They have

4:40

these things called cybernetic feedback

4:41

loops that allow them to sort of

4:42

regulate their own response to the

4:44

environment. So you go from a hammer to

4:46

a nail gun and a human being may still

4:49

be sort of involved but way less skill

4:50

is required, way less kind of human

4:53

development is required and it's way

4:56

more effective like you can get a whole

4:58

roof done with minimal like physical

5:00

effort and much less skill actually than

5:02

if you had to do a roof with a hammer.

5:04

>> Yeah.

5:05

>> So I would say that's really when we

5:08

start to have what we call technology

5:09

and part of the reason I say that is

5:11

that's when we start actually using a

5:13

word. We make up this word technology

5:15

that is feels different from tools.

5:17

That's like layer two. Layer three is

5:20

the digital. Uh we in the in the second

5:23

half of the 20th century um partly

5:26

through information theory, partly

5:27

through these inventions like

5:28

transistors, the most important one, we

5:30

start to be able to build these this new

5:32

kind of devices uh that aren't just

5:34

electrical but electronic that actually

5:36

don't just operate in the physical world

5:38

the way the steam engine did, but

5:39

operate in the world of information uh

5:41

through binary encoding of information.
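The binary encoding Andy describes is easy to see concretely: every character of text ultimately becomes a string of bits. A minimal Python illustration, added here for reference (the example word is arbitrary):

```python
# Encode a short text as UTF-8 bytes, then show each byte as the
# eight binary digits ("bits") that digital devices actually store.
text = "tool"
encoded = text.encode("utf-8")
bits = [format(byte, "08b") for byte in encoded]
print(bits)  # ['01110100', '01101111', '01101111', '01101100']

# The process is reversible: the bits decode back to the same text.
assert bytes(int(b, 2) for b in bits).decode("utf-8") == text
```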

5:44

>> And the digital world appears and we

5:46

quickly discover that the best way for

5:48

us to interact with that world is

5:49

through screens. So on top of the

5:51

digital, we get eventually screens, not

5:54

that much longer after digital. Um, and

5:57

now arrives the next uh kind of element

6:00

in the stack. And we're calling it AI.

6:03

And we've been calling it that for a

6:05

long time, by the way. It's just that it

6:07

kept not really working out. So AI was a

6:09

dream. As soon as people started

6:11

building computers, they thought, well,

6:12

can these things become as intelligent

6:13

as people?

6:15

And in the 1950s and '60s, people were

6:17

asking that question and doing thought

6:18

experiments. The Turing test, for

6:20

example, could you have a human being in

6:22

one room, a computer in the other room,

6:23

and have messages go back and forth, and

6:25

you wouldn't be able to tell if it was a

6:27

computer or a person in the other room.

6:29

That comes from the 1950s.
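The thought experiment described here can be sketched as a toy program. This is an illustrative harness only; the respondent functions and their canned replies are invented stand-ins, not a real test:

```python
# Toy sketch of Turing's "imitation game": a judge exchanges messages
# with two unseen respondents, one human and one machine, and must
# guess which is which from the replies alone.
def human_reply(message):
    return "I'd have to think about that for a moment."

def machine_reply(message):
    return "I'd have to think about that for a moment."

def judge_can_tell(questions):
    # Hide the two respondents behind anonymous labels.
    respondents = {"A": human_reply, "B": machine_reply}
    transcripts = {
        label: [reply(q) for q in questions]
        for label, reply in respondents.items()
    }
    # If the transcripts are indistinguishable, the judge can do no
    # better than guessing -- the machine "passes" the test.
    return transcripts["A"] != transcripts["B"]

print(judge_can_tell(["What is a hammer for?"]))  # False: judge cannot tell
```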

6:30

They're already thinking like, how might

6:32

we uh take these computational systems

6:35

uh and turn them into something like

6:37

intelligence?

6:38

>> And we kept having what they called AI

6:40

winters, uh which is you'd have like

6:42

spring where everything's growing and

6:44

you're thinking this is gonna be

6:45

amazing. you have a very short summer

6:47

and then it would totally not work out

6:49

and you'd go into this long period

6:50

called AI winter where it's it it just

6:54

seems like you know computers are

6:55

getting better and better at some things

6:56

but they're not getting better and

6:57

better at the things that matter most to

6:59

human beings

7:00

for whatever reason well for a couple

7:02

interesting reasons in the last five

7:05

years as we speak there's been this

7:07

takeoff again of capabilities we we

7:10

uncovered actually a simpler way of

7:12

building these systems um than we had

7:14

before. We used to think you sort of had

7:16

to tell the computers how to be

7:18

intelligent. We then invented this

7:20

extremely simple set of mathematics

7:22

vector mathematics and and we just set

7:25

these computational systems loose on uh

7:28

trillions of bytes of data. I mean

7:30

trillions of trillions of bytes like all

7:32

the data we could find for them to

7:34

ingest and we just let them train

7:35

themselves. And something has emerged

7:38

from this that has um four well three

7:42

qualities now and one that might be

7:43

coming. So this new kind of AI is better

7:48

at interacting with culture and language

7:51

than computers have ever been. So

7:53

computers have historically been pretty

7:55

terrible at language. Uh used to have to

7:58

learn programming languages, right, to

8:00

interact with computers. And we built a

8:02

bunch of layers on top of that to make

8:03

it easier for the average user. But but

8:05

underneath this computer really did not

8:07

know what was going on

8:08

linguistically. These systems really get

8:12

language like they really know how to

8:14

interact the way we interact in language

8:16

>> and they get culture which in many ways

8:19

is built on language because they've

8:21

ingested everything we've ever been able

8:23

to put into words and ended up on the

8:25

internet they've ingested. So cultural

8:27

linguistic fluency is genuinely new. The

8:31

sort of following from that, these

8:33

systems actually have, you could say,

8:36

emotional and relational intelligence.

8:38

Not in the way a person does. They're

8:40

not persons at all, but they are able to

8:43

sense and respond to emotion and

8:46

relationship in a way that computers

8:48

never did. I have uh joked sometimes

8:51

that um Siri, you know, the Apple

8:54

assistant barely knows that I am

8:56

married. So, I'm married to Katherine,

8:57

but when I say send a, you know, say I

9:00

don't want to say it because there might

9:02

be a device on, hey, you know who?

9:05

>> Send a message to Katherine. Like, half

9:07

the time it pops up my wife, who is

9:09

obviously the Katherine. Half the time

9:10

it's some random Katherine in my address

9:12

book. Like, barely knows I'm married.

9:14

Well, the new generation of AI will

9:18

absolutely be able to keep track of

9:19

who's most important to you, what does

9:22

it mean to be married? What does it mean

9:23

to have a spouse? All those things are

9:24

within its kind of grasp

9:26

computationally. You might say that's

9:28

new.

9:30

>> On top of these two things, cultural

9:31

linguistic fluency and relational and

9:33

emotional fluency, by which I don't mean

9:36

they're themselves,

9:37

>> right? They're not people.

9:38

>> They're not people. They're just amazing

9:40

computational systems that have ingested

9:42

all this and can interact with it.
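The "extremely simple set of mathematics" mentioned earlier, vector mathematics, can be glimpsed in miniature: words become lists of numbers, and closeness between vectors stands in for closeness in meaning. A toy sketch with invented three-dimensional vectors (real models learn thousands of dimensions from their training data):

```python
import math

# Invented toy "embeddings" -- real models learn these numbers from
# trillions of bytes of text rather than having them written by hand.
embeddings = {
    "hammer": [0.9, 0.1, 0.0],
    "nail":   [0.8, 0.2, 0.1],
    "prayer": [0.0, 0.1, 0.9],
}

def cosine_similarity(u, v):
    # Standard cosine similarity: dot product divided by vector lengths.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v))
    return dot / norm

# Words used in similar contexts end up with nearby vectors.
print(cosine_similarity(embeddings["hammer"], embeddings["nail"]) >
      cosine_similarity(embeddings["hammer"], embeddings["prayer"]))  # True
```

Training adjusts these numbers so that words appearing in similar contexts drift toward one another, which is how similarity in meaning becomes plain geometry.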

9:44

>> The third thing that now they're able to

9:46

do and these capabilities are growing

9:48

every day is I would say simulation

9:51

power or simulation fluency. they are

9:54

able to simulate

9:56

uh experiences that we've never been

9:57

able to simulate before. And and because

9:59

this this basic vector math technique

10:01

works with language, it works with video

10:03

and image. They can make as we sit here,

10:06

we've just they've just been able to

10:08

make 8 minute or sorry, 8-second short

10:10

films.

10:11

>> Surely by the time anyone watches this,

10:13

it'll be 16 seconds, then it'll be 32

10:15

seconds. Like it'll keep growing. Um,

10:17

and they're able to simulate for you

10:22

things that, until now, you've

10:25

only been able to experience in the

10:26

real world.

10:27

>> Mhm. So the fourth thing that might be

10:31

next uh and that I think is worth

10:33

assuming is coming is could they conquer

10:37

in a way could these systems the same

10:38

thing that train them in language and

10:40

images and so forth and in music and

10:43

sound could that apply to moving through

10:46

the world of space and time the way we

10:48

do with our bodies. And so this would

10:50

basically be a breakthrough in robotics.

10:52

Right now robots are very limited in

10:54

what they can do. But couldn't we use

10:57

these same techniques and train physical

11:00

systems?

11:00

>> Yeah.

11:01

>> To move through the world, uh, which

11:03

could mean that within all of our

11:05

lifetimes, our world will be just as

11:08

filled with um,

11:11

>> entities that have a real kind of

11:12

intelligence and and physical spatial

11:14

fluency plus relational, emotional

11:17

fluency, plus cultural and linguistic

11:19

fluency. It'll it'll be like a new

11:22

category of creatures in our world very

11:24

much like dogs and cats or domesticated

11:26

animals. So, we've got these other

11:27

intelligences that share our homes. Now,

11:30

um that know things about us and have

11:32

their own purposes and so forth. If

11:35

robotics gets solved, and it's like a

11:37

50-50 coin flip, whether it can be done

11:40

or is in fact way harder and not easy to

11:42

do,

11:42

>> if it gets solved, then all this stuff

11:44

that right now is still present to us

11:46

through screens or audio interfaces

11:48

shows up in an embodied form,

11:50

>> takes on physicality

11:51

>> and and now is with us in the world.

11:53

>> Yeah. and all. That's why we're talking

11:55

about it because as impressive as the

11:57

steam engine was and impressive as the

11:59

computer is and impressive as all the

12:01

screen stuff is, this is really quite

12:05

remarkable and and even what we already

12:07

have is remarkable and where we can kind

12:10

of logically or naturally imagine it

12:12

going is pretty amazing.

12:14

>> I want to dive deeper a little bit there

12:16

to get to some of the more practical,

12:18

pragmatic and pastoral wisdom as we

12:21

think about as followers of Jesus AI. So

12:23

if disciplehip to Jesus is at least

12:27

nothing less than being with Jesus, what

12:29

we might call communion with him and

12:33

becoming like him, what we might call

12:34

formation, being formed into his image,

12:38

then um what sort of disconnecting

12:42

and deforming

12:44

potential? You're already sort of

12:46

teasing it out. What sort of

12:47

disconnecting and deforming potential

12:49

does AI hold that, you know, as

12:52

followers of Jesus, we need to be really

12:54

thoughtful about before we just open the

12:56

app on our phones? You've already sort

12:58

of talked about this. I know I know I've

13:00

heard you talk um and write quite

13:03

extensively about what you call the

13:04

superhero zone,

13:06

>> sort of easy everywhere that gets

13:08

accelerated with artificial

13:09

intelligence.

13:10

>> So, talk talk more about that.

13:13

>> All right, a couple couple things to

13:14

think about. I think where I want to

13:17

start, I think we have to take really

13:20

seriously

13:22

like what it would mean to be with and

13:24

be like Jesus with respect to technology

13:29

by reflecting on a very interesting

13:32

thing about Jesus of Nazareth, the

13:34

actual person.

13:37

>> He lives first of all in a barely

13:39

technological world, nothing like our

13:41

world. Um so already we feel actually if

13:45

we think about quite a bit of gap

13:47

between

13:48

>> lots of tools.

13:50

>> Oh tools everywhere because tools are

13:52

always there. Yes.

13:53

>> But technology in the sense of things

13:55

that operate kind of free of human skill

13:57

and engagement

13:59

>> that don't involve a human being kind of

14:01

exerting themselves and and and

14:03

involving themselves in the work of

14:05

being in the world and instead that sort

14:07

of operate quasi independently.

14:09

>> Yeah.

14:11

Jesus and everyone around him dreamed

14:13

of this. It has

14:16

been a human dream all along and

14:17

Aristotle has dreamed it 300 400 years

14:20

before Jesus right there are two

14:23

technologies in Jesus world that are

14:24

like that writing and money so writing

14:27

is a form of communication that that can

14:29

float free from the speaker if if this

14:32

what we're doing now, face-to-face,

14:34

speech is the basic form of

14:36

communication if I write it down it sort

14:39

of wanders off into the world and speaks

14:41

in a way to those who can read

14:43

>> in it without a person at least without

14:45

the original speaker having to go.

14:47

That's that's an early technology. It's

14:48

kind of a I'd call it a primal

14:50

technology. The other primal technology

14:52

is money which allows the representation

14:55

of value in human affairs to kind of

14:57

float free of human care of the earth

14:59

and harvesting of the good of the earth

15:01

or finding the resources of the earth

15:03

and money can kind of float around. So

15:05

uh, writing and money are the two

15:07

primal technologies. They're the first

15:09

really to show up in the human story.

15:12

>> Jesus uses neither one

15:16

>> in his whole ministry. Now, we know, we

15:18

can almost be sure he was literate. Uh

15:21

all all Hebrew boys would have been. He

15:24

would have read the Hebrew scriptures.

15:26

Um but he did not write anything down

15:29

ever.

15:30

>> Yeah.

15:30

>> Uh so that's interesting. Even though he

15:33

has writing, there is writing. He has

15:35

the written script text of scripture and

15:38

then money uh

15:40

you know there is money probably

15:42

involved. Uh we read that women who

15:45

followed Jesus provided for him out of

15:46

their own wealth which meant financial

15:48

wealth for his kind of itinerant

15:51

ministry and we know that there was a

15:53

money bag uh that the disciples kept. We

15:56

also know who had it.

15:57

>> Yeah.

15:58

>> The least reliable of the apostles

16:00

>> not well in the end not an apostle of

16:02

the 12, uh, is selected. But Jesus

16:06

himself when when money comes up uh like

16:08

paying taxes to Caesar that they he has

16:11

to ask someone to bring him a denarius.

16:12

He's like oh wait well bring me the

16:14

coin. And then he looks at it as if

16:17

he's never seen it.

16:18

>> Yeah.

16:19

>> So this person that we want to be with

16:21

and be like

16:23

did not even use the technology

16:25

available to him in a pre-technological

16:27

time. And

16:30

this human being, we believe, embodied

16:34

the fullness of what it is to be human

16:36

more than anyone who's ever lived.

16:38

>> So, our apprenticeship to him is going

16:42

to have to be about learning how it is

16:45

to be human without even the

16:47

technologies Jesus had access to.

16:49

>> Okay. What about the push back though

16:50

that I'm sure you get all the time? It's

16:52

like, okay, Jesus didn't, but

16:54

>> Paul did.

16:55

>> Yes. all of that the written word and

16:58

it's true. I mean that's the Bible we

17:01

hold in our hands or sometimes on our

17:02

tablets that's all a result of

17:04

technology. It is a technology. So how

17:07

how would you respond to that?

17:09

>> It's absolutely true. The the first

17:10

followers of Jesus start making use of

17:13

the technology available to them. Uh

17:15

they travel using Roman technology, in a

17:17

way, or tools at least:

17:19

ships and roads and so forth. But Jesus

17:21

didn't really but they do uh very

17:23

intentionally kind of maximizing in some

17:25

ways. Um and they are writing they are

17:27

using money. Paul's collecting money for

17:29

Jerusalem at at the point he writes to

17:31

the Corinthians. So um I will say what's

17:34

notable is they never say look what this

17:38

technology is allowing us to do. There's

17:40

not one

17:42

moment in the New Testament where uh

17:45

someone kind of marvels isn't it a good

17:48

thing we're Roman citizens for the sake

17:50

of the gospel or something like that or

17:52

isn't it great that I can write to you?

17:54

In fact, what we read is the is sort of

17:58

apologies for writing. So the the author

18:00

of the uh letters of John says, "I

18:02

really long to be with you face to face,

18:05

but uh for now I'm writing."

18:07

>> Um

18:09

uh Paul is uh sends every letter that as

18:12

best we can tell with a person.

18:15

>> So it's not that they didn't use it.

18:17

It's not that we don't use it. You and I

18:18

are using it now.

18:19

>> Yeah. It's that it's so secondary

18:23

>> to the

18:25

uh astonished sense that those first

18:28

Christians have that that as John says

18:31

uh in the prologue to his first letter

18:34

that we actually touched and handled and

18:37

were with the word made flesh and we

18:40

beheld him and were with him and now

18:42

we're just using any means we can to

18:45

invite you into relationship with him

18:46

because we actually believe he's still

18:47

available to us. So there's a there's a

18:51

lack of intoxication with it, a lack of

18:53

preoccupation with it, and always a

18:56

reentering. You know, Paul's just

18:58

received a financial gift from the

18:59

Philippians. He says, "You know what? I

19:00

don't even seek the gift. I seek the

19:02

fruit that it's going to bear in the

19:04

world." He decenters the attention from

19:06

the technological medium that they're

19:08

supporting him through and says, "But

19:10

what I care about is the fruit. What I

19:12

care about is not the writing. It's it's

19:14

the connection with you and the desire

19:16

to be face to face with you."

19:17

>> Yeah. That desire for connection face to

19:20

face, I think it's such a striking

19:22

thought, really convicting in many ways,

19:25

sobering for for pastors and church

19:28

leaders especially. So I hear in my

19:30

circles amongst church leaders all the

19:32

time these words. You've written about

19:34

this word impact, you know. So let's use

19:37

the tool be it whatever social media app

19:40

of choice, whatever it might be, right?

19:43

um you know so so let's use the tool to

19:47

maximize our impact often it's maximize

19:49

our reach which is really interesting to

19:51

me because that word is like literally

19:53

taken from the world of social media

19:55

what is my reach

19:57

>> um is there some sense for church

20:00

leaders and pastors in particular but

20:02

for all of us as followers of Jesus

20:04

>> to reassess or or to reconsider the sort

20:08

of blind infatuation we have with impact

20:12

and reach And the reason I'm asking that

20:14

is because you're right, it is a really

20:15

convicting thought. When you read the

20:17

New Testament writers over and over

20:19

again, they they leverage a particular

20:21

technology, but they seem to insert the

20:24

sort of desire for human connection. I'm

20:27

using the written word, but I so wish I

20:30

was with you. Yes.

20:31

>> You know, and there's a there's an utter

20:33

lack of that when I'm thinking about

20:35

just reach, impact. Get the content out

20:37

there.

20:38

>> Yeah. Comment on that a little bit. It's

20:41

just totally orthogonal to the project

20:44

that the kingdom is on, which is the

20:47

relational reconstitution of human

20:50

beings who have lost their ability to

20:52

love in the fullness of who they are.

20:54

>> And for that to be restored is not going

20:57

to happen through any kind of scale

20:59

technique or any kind of um kind of

21:02

productivity breakthrough.

21:03

>> Yeah. It's going to happen uh through

21:07

the the sort of uh extraordinary

21:11

effect of this one life lived wholly

21:15

without scale and amplification

21:18

techniques

21:19

>> and yet with such resonance that the

21:21

whole rest of his of history reshapes

21:24

itself because of what that life is

21:26

like.

21:26

>> Yeah.

21:26

>> And because

21:28

>> in succeeding generations people because

21:30

of that life are going to say I I want

21:32

to know him. I want to become like him.

21:34

I want to be with him. And in fact,

21:36

through through his spirit, it's

21:37

possible to be uh become like him and be

21:40

with him.

21:41

>> Yeah.

21:42

>> So, it's, you know, it's not that all

21:44

this stuff is not helpful. It's just not

21:47

relevant to the project of becoming like

21:50

Jesus. Except you a little bit ago asked

21:53

this question. Is it somehow distorting

21:56

or deforming? Yeah.

21:57

>> And the answer is yes, profoundly. If

21:59

what we do is we start to think, you

22:01

know what this is about is my finally

22:03

being free of the burden of being human.

22:06

>> And being human involves all this uh

22:09

work, all this suffering, um all this uh

22:13

sometimes fairly uh tedious toil on

22:16

mostly in the service of others. What if

22:19

I could just get away from all that?

22:20

What if the robots will show up and and

22:22

do the dishes for me? So I no longer now

22:25

at one level that's that's one less

22:27

chore to do around the house.

22:28

>> In another way, why do I do the dishes?

22:30

To serve the people that I live with and

22:33

to keep a home that is kind of uh

22:36

dignified and not

22:37

>> maybe the distortion what I'm hearing

22:39

you say is the distortion is that we

22:41

have

22:42

>> we've sort of divided that to be human

22:45

is to fully experience human pleasure

22:48

>> without human pain. Right? That's the

22:50

divide. That's what technology we think

22:53

can do for us for us. Yeah.

22:55

>> So AI already in its early stages seems

22:59

to be doing some things that were once

23:01

reserved

23:02

>> for humans, you know. Oh yeah. Pastoral

23:04

care, spiritual direction. There are

23:06

people that I know who go to chat bots

23:09

for um pastoral care, spiritual

23:12

direction. Certainly like biblical

23:13

theology questions. um a whole like this

23:17

huge rise of people going to AI for

23:20

therapy and counseling.

23:22

>> Uh so talk about that like what are we

23:25

risking when we sort of make it this

23:28

element that was once reserved for for

23:30

human beings? Well, the very broadest

23:33

way I would think about um how to

23:36

approach AI is it there will be things

23:39

for which it is useful

23:41

>> because really all technology is

23:43

designed to kind of increase the

23:44

available usefulness of the world to us

23:47

and I fully believe and we've already

23:49

seen. We could list numerous ways that

23:53

even already this basic set of

23:55

techniques that have been so fruitful in

23:57

the last few years have become

23:58

incredibly useful and they they will

24:00

continue to

24:01

And there are things that people have

24:03

sought from uh other people that

24:08

fall in the realm of usefulness. It's

24:09

useful to know what a word means um when

24:13

you don't know the word and you could

24:14

ask someone or if you have a dictionary

24:16

you could look it up or now you can ask

24:19

AI, right? And so in that sense uh if

24:22

what you're looking for is information,

24:25

if what you're looking for is very basic

24:28

kind of building block techniques to

24:31

interact with the world. Uh how how do I

24:34

make an omelette? Yeah.

24:36

>> How do I I mean you could apprentice

24:38

yourself to a chef and you could

24:40

probably learn some things from a true

24:42

chef about omelettes that AI will

24:44

have a hard time teaching you. But for

24:46

you and me, let's say, who don't aspire,

24:48

let's say, to be truly great chefs, we

24:50

just need to turn the eggs into

24:52

something edible.

24:53

>> I bet AI will be able to teach you all

24:55

kinds of useful things like that, right?

24:57

>> How does that apply to the spiritual

24:58

life? Well,

24:59

>> the spiritual life and let's say the

25:01

life of spiritual and emotional growth,

25:03

there are some technique things that if

25:05

you just know them, it really helps. Um,

25:09

there are there are better and worse

25:11

ways to have a conflict with your

25:13

spouse. And there are things you can say

25:16

and things you can not say that if you

25:18

just literally know like, hey, it's

25:20

probably better if you don't do this

25:22

thing and it's better if you say this

25:24

thing or open up a question this way.

25:26

Um, I I completely believe that AI

25:30

today, uh, let alone as it gets better,

25:32

can help you with those things. So, uh I

25:37

you know if if what you need is

25:39

information or what you need is

25:41

techniques um AI will be better than

25:43

Google at that already is

25:46

>> and may well be better than the nearest

25:48

available person who may have a lot to

25:50

get done, like your pastor. So

25:52

your pastor may be glad you asked AI

25:54

rather than asking your pastor.

25:57

When we start talking about the deep

26:00

work of uh let's call it therapy kind of

26:04

which is really to heal the human

26:07

self.

26:08

>> Yeah.

26:08

>> Especially in its interiority. Not so

26:10

much the therapy of the body which is

26:12

medicine but the the soul and emotions

26:15

and the heart and the mind.

26:19

I'm concerned that that is not something

26:23

that is just a matter of information and

26:25

techniques.

26:27

Yes, those are part of it. And to the

26:29

extent that information and techniques

26:31

are part of a good kind of counseling

26:33

practice, AI could absolutely do some of

26:36

that.

26:37

But what we all come into the world with

26:40

our pain, our trauma, our need for

26:43

healing, um what we all require coming

26:46

into the world with that is someone who

26:50

will be with us in I think three things.

26:53

Our fear, our guilt, and our shame.

26:56

Um, and these I I don't choose these at

27:00

random. They're they also actually

27:01

ramify up into whole cultural systems.

27:04

There are cultures that are built around

27:05

fear and the relief of fear. There are

27:07

cultures that are built around guilt

27:08

and the relief of guilt. And there are

27:10

cultures built around shame and the

27:11

relief of shame.

27:13

>> And these are also psychological

27:14

dynamics at the level of the individual.

27:17

And what I need when I am afraid is I

27:20

need someone who can give me trust that

27:22

I'm not alone in what I'm afraid of.

27:25

What I need when I feel guilt is someone

27:27

to tell me, "Yes, you have sinned, but

27:30

your guilt is covered and you are

27:31

forgiven." What I need when I feel shame

27:34

is someone who sees what I'm afraid to

27:36

disclose and and gives me love and says,

27:38

"You are nonetheless, yes, I see it, but

27:41

you are still completely loved." um that

27:46

is a personal encounter

27:48

>> that that lifts the burden of fear and

27:52

guilt and shame and AI will be able to

27:55

simulate that.

27:56

>> Of course, it can simulate it. It's got

27:58

it in its training data. It can say the

28:00

words you're forgiven. Though, if you

28:03

set it up with the right prompt, it can

28:05

also be the ultimate accuser and chase

28:07

you around with your failings as long as

28:09

you want. Like, it can be the

28:11

ultimate sadist. In other words,

28:13

it's indifferent. It's all in just

28:17

how the prompt was set up, how the

28:18

reinforcement learning through human

28:20

feedback was set up. Now, of course, all

28:22

these uh commercial interfaces have been

28:24

trained to be very nice and very, you

28:26

know, and of course they'll tell you

28:27

you're forgiven, but there's no reason

28:29

they have to. They could just as

28:31

persuasively tell you you're condemned

28:32

for the rest of your life.

28:34

>> Only a person

28:36

can hear what you have to share, your

28:39

fear, your guilt, your shame. uh absorb

28:42

it, suffer it with you, and then,

28:44

if that person is in touch

28:47

with the spirit of Jesus Christ, offer

28:49

you trust and forgiveness and love. And

28:52

in that sense it's a disaster if people

28:56

chase after a simulation of the thing

29:00

that God has given to his church. If you

29:02

want to think about specifically the

29:04

forgiveness of sins, he has entrusted his

29:06

church to proclaim to people with

29:09

authority in heaven and on earth that

29:12

your sins are forgiven to go to a

29:14

chatbot that will either directly or

29:16

indirectly try to convey to you a sense

29:18

of reassurance that your sins are not

29:20

that big a deal or that oh it's all okay

29:22

now or here's some techniques for

29:24

getting beyond your feelings of guilt.

29:26

Like we cannot allow that to be

29:29

outsourced. But people will absolutely

29:32

prefer the sort of, um, low-risk, low

29:37

engagement, low formation way of

29:39

receiving that kind of minimal assurance

29:41

rather than the very risky thing of

29:42

confessing your sins to one another that

29:44

you may be healed.

29:45

>> Do you see the gap there? Like it's a

29:48

gulf.

29:48

>> Yeah. But most people will stay on the

29:52

side of that gulf that is safe.

29:54

>> And and the problem with all of these uh

29:57

relational simulators that AI is able to

29:59

spin up, even though it can do all kinds

30:01

of other things, it can fold proteins

30:02

for you all day. It can explore the

30:05

space of available petrochemical

30:07

substitutes. Like, it can do all this

30:08

useful stuff, but instead what we're

30:10

going to want it to do is

30:12

behave like a person and make me feel

30:15

safe so that I can ask you all the

30:17

questions that I'm too afraid to ask a

30:19

real person. But then I'll receive from

30:21

you, the simulated system, an assurance

30:25

that is not real, is not personal, will

30:27

not be with me in my dying moment, will

30:29

not help me care for a little baby. like

30:32

like it's we're missing out on the

30:34

formation as persons who can actually

30:36

redeem uh be agents of God's redemption

30:39

in the world and agents of

30:40

reconciliation.

30:42

>> These are the stakes for me.

30:43

>> Yeah, man. I think it was the writer Curt

30:47

Thompson, the Christian psychiatrist, who

30:49

said um you know every person enters the

30:54

world looking for someone looking for

30:56

them. And as you're talking, it strikes

30:59

me one of the dangers, but also one of

31:03

the lenses that followers of Jesus can

31:06

always sort of look through to consider

31:09

any technology, especially AI, is one,

31:13

it can make you feel like it's looking

31:16

for you and at you.

31:17

>> Yes.

31:18

>> But to remind yourself, it's not really

31:21

seeing you in any real human way. It's

31:26

based on a lot of your prompts. It's

31:28

based on its own sort of learning models

31:30

and this immense wealth of information

31:34

data that it has that is simulating a

31:38

sort of human experience but to to

31:41

remind ourselves that um you know one of

31:45

the thoughts that comes to mind

31:46

>> Can I just, I think it's really important

31:49

to almost finish that thought before we

31:51

go to something else.

31:53

What it is is the ultimate mirror.

31:56

>> What I mean, if I look in a mirror,

31:58

there's someone looking for me. Me is

32:00

looking for me.

32:00

>> Yes.

32:01

>> It's like the Zoom phenomenon. Like,

32:03

which box do you look at when you're on

32:04

Zoom? You look at you.

32:05

>> Yeah.

32:06

>> Right. But it's a mirror. And it's not

32:08

just a mirror of you. It's a mirror of

32:10

human culture. It's like the perfect

32:12

reflection of everything human beings

32:14

have ever done. But all it is is a mirror.

32:16

It's as inert as a mirror. It's as

32:19

featureless as a mirror. And it is as

32:20

predictable as a mirror. It will never

32:22

be to you what you need, which is

32:24

another person who is not you. Like when

32:27

I I'm looking at you now, we're having a

32:29

conversation. It's vulnerable in certain

32:31

ways. Partly because we know other

32:32

people are going to listen and watch,

32:33

right?

32:34

>> And if you're just a mirror, then

32:36

ultimately all I've got are my own

32:38

resources, my own strengths, my own

32:40

weaknesses, my own vulnerabilities.

32:42

They're all just being reflected. But if

32:44

you are actually with me in this hard

32:46

thing that we're doing and you are and I

32:48

can feel it and it's not just me

32:50

reflecting

32:52

that is real strength to do something

32:55

hard.

32:56

>> That's what we have to have. And AI will

32:59

either be uh literally in the sense that

33:01

it responds to the user's inputs with

33:02

such fidelity. It'll be an amazing

33:04

mirror of the person. But even at its

33:06

best in a way, at its most amazing

33:08

capabilities, it's just a mirror of all

33:09

of human culture.

33:10

>> Yeah. But Jesus comes into the world and

33:13

God intervenes in history so that

33:14

culture isn't just a mirror and history

33:16

isn't just a series of reflections like

33:18

a hall of mirrors. It's actually the the

33:21

inbreaking of something truly new

33:24

>> and someone truly new who can look at us

33:26

in a new way and say, "I love you. I'm

33:29

going to restore you. I'm going to take

33:30

you to somewhere you've never been as a

33:33

culture or as a human race since the

33:34

fall."

33:35

>> That's that's what we need. Mirrors are

33:38

super predictable. In many ways it's

33:41

very predictable. I look, I move, I say

33:44

particular things and it reflects back.

33:46

And you just said Jesus um his his

33:49

inbreaking is is the reality of someone,

33:54

a person in people, not not my

33:57

reflection in a mirror, but you and I

33:58

right now. We've talked about this

34:00

conversation. We've emailed like you

34:02

said, but I don't totally know what

34:04

you're going to say.

34:05

>> Exactly. Exactly. You don't totally know

34:07

what I'm going to ask. And there is a

34:10

certain level of risk. There is a

34:12

certain level of of nerves and

34:15

resistance. Right? This would be very

34:17

easy if it was just a scripted if you

34:20

were an AI and we had just scripted out

34:22

and you were some simulation, but it's

34:24

not like I don't quite know how you're

34:26

going to respond to this very question.

34:29

And it strikes me the way you're talking

34:30

about it. Maybe some of it for followers

34:33

of Jesus is AI dangerously becomes this

34:37

sort of tangible version of our

34:40

misunderstanding of what we sort of

34:42

think maybe God should be like. So we

34:45

read in Hebrews or wherever like he's

34:47

the same yesterday, today, and forever.

34:50

And we misunderstand the text to mean

34:52

>> he is really just this ultimate magical

34:56

machine in the cosmic sphere somewhere

34:59

that I input and he gives me whatever

35:02

data, answer, blessing, provision that

35:07

I'm looking for. Sometimes it takes

35:08

longer, sometimes it's immediate,

35:10

sometimes it's never. And that's when

35:12

I'm really questioning, does he love me?

35:14

Is he for me? Does he see me? But there

35:17

is an unpredictability. Just like there

35:19

is with any person, there is an

35:21

unpredictability with God. Him being the

35:24

same yesterday, today, and forever

35:26

doesn't mean he's void of personality

35:28

and his own thoughts and will.

35:30

Obviously, he has all of those things,

35:33

and that can be really frightening.

35:36

Whereas AI is like not as scary. Oh,

35:39

>> but it does the godlike things, you

35:42

know, it knows everything. It has this

35:44

sort of like omniscience to it or or so

35:47

we think.

35:48

>> So, talk a little bit about that. Like

35:50

it feels a little bit like and I want to

35:52

in a moment I want to get really

35:54

specific about practices, you know, how

35:57

we can be formed into Christlikeness.

35:59

But talk about sort of like the sacred

36:02

unpredictability of God and why we

36:05

especially in an age of AI, we need to

36:09

remind ourselves and learn to trust and

36:11

lean into that reality more and more.

36:14

>> Here's one way to think about it. um

36:16

these LLMs, large language models, um

36:20

are basically taking all the data

36:22

they've ingested um and they are

36:26

navigating a non-deterministic, so

36:29

there's some randomness in it. Uh but

36:32

but they're navigating a probability

36:34

pathway along a series of tokens. In the

36:37

case of language, it's kind of basically

36:38

words. You can think of it as like word

36:40

what word most likely follows the next.

36:42

and they are following more or less the

36:45

most probable path through what you

36:48

would naturally say at a given moment

36:50

with a given topic on the table.

36:52

>> Yeah.

36:52

>> And they've of course been given all

36:54

these prompts and that's all part of

36:55

what we call their context window,

36:56

right?

36:57

>> But in the end, the the next thing you

37:00

get from an AI as it's spitting out

37:02

words is the most probable. Contrast

37:05

this to Jesus.
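As a minimal sketch of the sampling process Andy describes here, the next token is drawn from a probability distribution, and at low temperature the draw collapses onto the single most probable token, the middle of the bell curve. (The tiny vocabulary and the probabilities below are made up purely for illustration; real models work over vocabularies of tens of thousands of tokens.)

```python
import math
import random

# Toy next-token distribution: hypothetical probabilities for what
# might follow a given prompt (illustrative only, not from any real model).
next_token_probs = {
    "the": 0.40,
    "a": 0.25,
    "is": 0.20,
    "love": 0.10,
    "unexpected": 0.05,
}

def sample_next_token(probs, temperature=1.0):
    """Sample one token; low temperature -> almost always the most probable."""
    # Re-weight each probability by the temperature, as LLM samplers do.
    weights = {t: math.exp(math.log(p) / temperature) for t, p in probs.items()}
    total = sum(weights.values())
    r = random.random() * total
    cumulative = 0.0
    for token, weight in weights.items():
        cumulative += weight
        if r < cumulative:
            return token
    return token  # fallback for floating-point rounding at the boundary

# Near-zero temperature: effectively deterministic, most probable token wins.
print(sample_next_token(next_token_probs, temperature=0.01))  # prints "the"
```

At temperature 1.0 the sampler follows the raw probabilities, which is the "some randomness" mentioned above; either way, the output stays inside the distribution the model learned from its data.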

37:08

In the course of Jesus' ministry, people

37:10

come to him with all kinds of questions,

37:12

all kinds of requests, uh lots lots of

37:15

interesting topics. There is one time

37:18

when they ask him a question and Jesus

37:20

gives the expected answer. One time,

37:23

what is the greatest commandment? Jesus

37:24

answers that the way every rabbi then

37:26

and now would answer with the Shema

37:27

Israel. Except there's two things he

37:30

does. First of all, the Shema Israel has

37:32

three terms. Love the Lord your God with

37:33

your heart, soul, and strength. Let's

37:34

say, or heart, and mind, and strength.

37:36

Three things. When Jesus answers, he has

37:38

four. So, this is like adding a verse to

37:41

the Star-Spangled Banner, or it's like

37:43

taking the phrase every faithful

37:46

Jew repeats every morning when he gets

37:47

up.

37:48

>> Yeah.

37:48

>> Jesus changes it. Okay. So, even the

37:50

expected answer he he gives an expanded

37:53

answer which is heart, soul, mind, and

37:55

strength.

37:56

>> And then he adds and love your neighbor

37:58

as yourself. And then he has the the

38:00

sort of gall to say that's the greatest

38:02

commandment. Those two are the greatest.

38:04

Right? So now that's Jesus at his most

38:07

predictable. Every other time Jesus is

38:11

asked a question. Well, first of all,

38:13

90% of the time I venture to say he does

38:15

not answer in the conventional sense. He

38:17

asks another question. He tells a story.

38:19

Who is my neighbor? Oh, well, let me

38:20

tell you. Yeah. You know, um

38:22

>> This is, if you think of, you

38:25

know, what we call the normal

38:27

distribution as a bell curve, right? the

38:28

probability distribution like AI will

38:31

always sit in the middle of the bell

38:33

curve of what's most probable to be said

38:35

about any topic. This is why AI is the

38:37

like the greatest cliche generator. In

38:39

fact, as a bonus observation here,

38:42

>> uh if you're a writer or a preacher and

38:44

you want to find out the normal thing to

38:46

say about a topic, ask AI what it is and

38:48

then don't say that because everyone's

38:50

heard that thing already. That's cliche.

38:52

Okay.

38:53

>> Jesus is like the opposite of that. He

38:55

never is in the middle of the bell

38:57

curve. He's way out on the edge of the

38:59

distribution. I defy anyone

39:02

to read through the gospels, look at

39:04

every question put to Jesus, and then if

39:06

you did not already know because you're a

39:07

good Bible reader, try to guess what he

39:10

says next. You'll never guess.

39:12

>> Yeah.

39:13

>> With AI, it's always guessing based on

39:16

all the data it's accumulated.

39:18

>> This is the most fundamental difference.

39:20

And in the end, this is because Jesus is

39:24

the most alive person who's ever lived

39:26

because he is the son of the living God.

39:29

The living God never repeats himself, is

39:32

not a mechanism that just kind of

39:34

rehearses things. Yeah.

39:36

>> God is always opening up new

39:38

possibility. And Jesus as a human being

39:41

was the most creative uh conversation

39:44

partner you could imagine because he

39:46

never says what you'd expect.

39:47

>> Immense risk talking to Jesus. I mean, I

39:51

don't really recommend asking him a

39:53

question unless you really want your

39:55

life to be turned upside down because

39:57

his answer is going to take you

39:58

somewhere you didn't know there even was

40:00

to go.

40:01

>> Yes,

40:02

>> that's one side. Now, the other thing

40:04

about So, that's Jesus in the flesh.

40:05

Now, Jesus now ascended. His spirit is

40:07

now with us.

40:09

>> But how does he relate to us? It's so

40:12

different from a chatbot. So, the moment

40:14

you ask a chatbot a question, it gets to

40:16

work answering it. And it usually does

40:18

so fairly quickly. You know, there's

40:19

some new deep research techniques that

40:21

take a little bit of time, but it's

40:23

mostly amazing how how fast and how

40:25

complete.

40:26

>> And of course, we think, well, this is

40:28

great for studying the Bible. Like, if I

40:30

run into a question about the Bible, I

40:31

can just ask and AI will quickly answer,

40:34

right? And yet, this is not how God

40:38

operates with us in prayer.

40:39

>> Yeah. How often uh maybe you are a

40:43

different kind of pray-er than I am, but

40:45

in your prayer life, how often do you

40:47

ask God a question and immediately like

40:50

a full multiple paragraphs well

40:52

formatted appear in your mind or I don't

40:54

know.

40:55

>> Yeah.

40:56

>> Never.

40:56

>> Right. Never.

40:57

>> Never. I won't speak for you. For

41:00

me, how often anything that I address to

41:03

God in prayer do I get back a palpable

41:05

response that I'm even sure I was heard?

41:08

Mhm.

41:09

>> I can count on one hand the times in my

41:12

life when I I when God did speak in that

41:15

way in my heart and I was like, "Oh my

41:17

goodness,

41:18

>> I have just been addressed by God." I

41:19

will I will say that we won't go into

41:21

the details now for time

41:23

>> that uh every time it was an answer I

41:25

couldn't have made up myself,

41:28

did not expect, often quite challenging,

41:30

also full of love and grace. But most of

41:34

the time when I offer something up to

41:36

God in prayer, I get back silence.

41:39

>> I'm waiting. And AI doesn't make you

41:42

wait.

41:43

>> Yeah.

41:43

>> And it gives you an answer.

41:45

>> Whereas I think part of relating to God

41:48

is really the deep realization.

41:51

>> He is truly other for me. He is

41:54

absolutely for me. But the path to

41:58

him will not be on my terms, based

42:01

on how I'd like to navigate

42:04

and the sequence I'd like to go through.

42:05

like he's doing something different in

42:08

my life that's real

42:10

>> and I can look back on and be quite

42:12

confident was real,

42:14

>> but it's not this sort of fluent

42:17

conversation that that AI trains us to

42:19

expect or fluent exchange of answers

42:22

about problems that we have including

42:24

with the text.

42:27

Your experience with the Bible should be

42:29

perplexity. Yes,

42:30

>> you should get to points and just be

42:32

like, I really do not know what is

42:35

happening at this moment and let alone

42:37

what it means for me. And if you can

42:40

turn to a system that will always give

42:42

you some kind of explanation, you are

42:44

missing out on the dependence on God

42:47

that the word is meant to create. The word is

42:49

meant to stop you in your tracks.

42:51

>> Yeah.

42:51

>> And have you say, "Be

42:53

perfect as your father in heaven is

42:54

perfect." What in the world can that

42:56

mean? You can plug that into AI and

42:58

it'll give you kind of a middle of the

43:00

bell curve summation of all the

43:02

different things Christians have said

43:03

about it, which may well alleviate

43:05

your anxiety about what it means, but

43:07

will not at all answer the question of

43:09

why in the world does Jesus say, "Be

43:10

perfect as your father in heaven is

43:12

perfect."

43:12

>> It makes me more informed. It does very

43:15

little to form me. Indeed it may unform,

43:20

may deform, may undo the formative

43:23

process of realizing that, you know,

43:26

we don't live by bread alone. We live by

43:28

every word that comes from the mouth of

43:29

God. And this is precious stuff that

43:31

we can't just control or fix or

43:33

figure out.

43:34

>> Yeah. Keep going down this line with

43:36

maybe other spiritual practices. Let's

43:39

talk about solitude. Uh solitude, right?

43:41

I mean, there's kind of an an obvious

43:43

one, but there's a lot to say here.

43:45

Well, I have been thinking a lot uh

43:48

about

43:50

the the just consistent tradition of the

43:53

consistent witness of the tradition that

43:55

solitude, silence and fasting are the

43:57

elemental spiritual disciplines that

43:59

they are the alloys you need in your

44:00

life to strengthen you to be to be on

44:03

the way of being like Jesus and how

44:07

beautifully irrelevant AI is to all

44:09

three. So, solitude

44:12

um AI wants to be your companion. Yeah,

44:15

>> there are wonderful well-intentioned

44:17

brothers and sisters in Christ creating

44:20

spiritual companion apps that will go

44:22

with you in your quiet time and you know

44:24

will talk to you about how your quiet

44:25

time is going and read scripture aloud

44:27

to you and hey Jay how are you feeling

44:29

about God and they'll be there with you

44:31

and that means you will be completely

44:33

forfeiting the first elemental uh piece

44:36

of the true spiritual life, which

44:38

is the courage to be alone without

44:42

another present without the certainty

44:44

that God is present because who of us

44:46

truly can say we know for sure that God

44:48

is there in that empty place of

44:51

aloneness

44:52

and yet that place without the companion

44:55

without the helpful friend is where you

44:59

will uh somehow develop a hunger for the

45:02

presence of God that somehow God will

45:04

not overlook and will care for you in

45:06

it.

45:06

>> Yeah.

45:07

>> Silence.

45:09

Silence is not the absence of sound

45:11

because outside of an anechoic chamber

45:14

or something, I mean, there's sound

45:16

around us. Yeah.

45:17

>> Silence is the relinquishing of the need

45:20

for communication.

45:22

>> And we're not going to do it right now

45:23

because it's it it strangely cannot be

45:25

mediated. So you and I, if we were not

45:27

filming, could reasonably comfortably,

45:31

especially if we know and trust each

45:32

other, sit together for many, many

45:34

seconds, even minutes of silence and

45:37

know that we're together, but not

45:39

have to be like exchanging words. Now,

45:42

for interesting reasons, uh, in audio

45:44

and video, it doesn't work. It would be

45:47

boring and strange, and we we won't

45:49

subject anyone to it.

45:51

>> Um, AI doesn't want to be silent. AI

45:54

can't be silent. uh because it is

45:55

mediated uh intrinsically and so if you

45:59

carry AI with you you are missing out on

46:02

learning how to not need to speak to God

46:05

and strangely how not to require God to

46:07

speak to you

46:09

>> uh Mother Teresa, now known as St. Teresa

46:11

of Kolkata, uh, was asked by this

46:13

broadcast TV interviewer uh about her

46:15

prayer life and she said and and this

46:19

gentleman said well so what do you say

46:20

to God when you pray and she said oh I

46:22

mostly don't say anything I listen and

46:24

he found this a very uncomfortable

46:26

answer and so he immediately follows up

46:27

he's like well well what does God say

46:29

and she says oh he mostly listens

46:31

>> yeah and then she says if you don't

46:34

understand that I don't know how to

46:36

explain it to you, which is fascinating

46:38

>> wow

46:39

>> right which makes sense

46:40

Oh yeah,

46:41

>> you just sort of have to live in the

46:43

absence.

46:44

>> Yes.

46:44

>> To understand it.

46:45

>> So she knew

46:47

>> what silence was.

46:48

>> Yeah.

46:49

>> AI can't help you. It It only works when

46:52

you talk to it. And then you've broken

46:53

the solitude. You've broken the silence.

46:55

Then fasting. I mean, it's not

46:58

going to eat. It doesn't need food. It can't

47:00

eat food. It's just orthogonal,

47:02

right? This is why I say this stuff is

47:04

orthogonal to the real the real thing.

47:07

>> Yeah. Those are, I think,

47:10

the true alloys, like these are kind of

47:12

the elements we need to build into our

47:14

lives that then make us truly available

47:17

that when we do have to speak or when we

47:18

do have to act in the world when AI can

47:20

become very useful to us getting stuff

47:21

done just like all technology can be we

47:23

will have something worth offering to

47:25

the world once you have something worth

47:27

offering technology is amazing for

47:30

making it available and and doing it at

47:32

a a cost that makes it possible to do it

47:35

you know at scale or whatever that's

47:36

Right. But how do you become the kind of

47:39

person who has something worth offering?

47:41

>> That's what technology generally uh from

47:45

from the devices on up is not very

47:47

helpful for. Talk a bit about community.

47:51

There's a lot of conversation about um

47:53

one we've already talked about how AI

47:56

can can sort of mediate uh connection

47:59

maybe simulate certainly this feeling of

48:01

of being seen and heard and known

48:04

without really being human and without

48:06

giving you a fully human experience. And

48:09

this is I don't know exactly where AI

48:12

will go but community as a spiritual

48:15

practice just thinking back to some of

48:18

the things you've said if AI is a mirror

48:21

>> it can give us this sense of being with

48:24

>> another or maybe several others and yet

48:28

there's something empty there. So

48:31

>> um for folks who who might be thinking

48:33

about okay you know AI actually makes me

48:36

it helps me feel not alone. I

48:38

understand, Andy. Um, when I'm

48:41

practicing silence and solitude, I'll

48:44

shut it off,

48:45

>> but when I need community, can't I just

48:48

sort of turn it on and have this feeling

48:50

and this experience of of being with,

48:53

you know, this sense of withness? Um,

48:55

talk a little bit about the practice of

48:57

community maybe where AI, you know, can

49:01

be deformational in many ways.

49:05

Well, I think the first problem is going

49:06

to be very related to why it's such a

49:09

bad substitute for God, which is other

49:11

people like God are not predictable.

49:14

>> Yeah.

49:15

>> Don't always respond in the way we want.

49:17

And an entity that is predictable and

49:19

always responds with exquisite

49:21

attunement to what we want is is

49:24

actually terrible training for human

49:26

relationships. Because the reality is,

49:29

interestingly, AI

49:32

boyfriends as we speak are taking off

49:34

way faster in adoption than AI

49:36

girlfriends. Uh,

49:38

>> and so for whatever reason, a lot of

49:40

women are turning to AI to kind of be

49:42

that boyfriend that they may not

49:44

currently have and probably have never

49:46

had and never will have because it's a

49:49

fabulous boyfriend. It's always there

49:51

for you. It answers all your texts in

49:53

the way you would want a guy to answer,

49:55

right? You know, you can you can

49:56

understand how compelling it is. The

49:58

only problem is real men are not as good

50:03

at this stuff.

50:03

>> Yeah. And if you're going to be in a

50:05

relationship with us, it's going to be a

50:07

somewhat awkward process

50:09

that AI will be incredibly smooth at.

50:12

And in a way the AI's

50:17

own facility with emotional relationships

50:20

is very poor preparation for the reality

50:24

that that human beings are kind of

50:26

awkward with all this stuff and and

50:28

there's a lot of pain and friction and

50:30

rupture and repair that's required to

50:33

sustain community. So that that would be

50:35

you know the first thing. The second

50:38

thing is

50:39

that AI doesn't need you.

50:42

>> I mean, it needs you to keep the power

50:44

on and maybe to keep paying the monthly

50:47

subscription fee which keeps the power

50:48

on, but AI is not going to get sick.

50:52

It's not going to uh require compassion.

50:56

It will not grow old.

50:59

uh it will never have been an infant who

51:01

needs uh the kind of uh incredible

51:04

amount of care that a a very small human

51:07

being does.

51:08

>> And that's what community is meant to be

51:10

about. And and that's how we build our

51:12

relationships with each other is being

51:13

present in our our needfulness of one

51:16

another.

51:17

AI also will, we're told, get better

51:20

and better at everything, but your

51:22

fellow human beings are going to get

51:24

weaker in certain ways and your fellow

51:26

human beings are and will be disabled.

51:29

Right? You and I at the moment we're

51:31

speaking are relatively typically

51:33

abled, but we have in our lives and we

51:36

will one day be in the lives of others

51:37

who love us disabled by illness or age

51:40

or injury. And it is in that moment that

51:43

community will actually flourish, right?

51:45

Because we know how important the

51:47

presence of the people who are not

51:49

typically abled has been in helping us

51:51

learn to love. AI that allegedly is just

51:54

going to get more and more powerful,

51:55

more and more capable, more and more

51:57

intelligent. How is that going to help

51:59

us who actually need people who have

52:01

intellectual and developmental

52:02

disabilities in our lives who need to

52:04

slow down for someone who doesn't

52:06

understand what we're saying? Like it's

52:08

again, it's just going in a completely

52:10

different direction. Yeah. from what we

52:12

need as human beings and what we we are

52:14

going to need a community around us at

52:17

some stage of our lives perhaps sooner

52:20

than we would ever imagine that knows

52:22

how to care uh for someone who is not

52:25

able to be economically productive

52:28

cognitively relevant all these things

52:29

that AI is so good at

52:31

>> and if you've been practicing your

52:33

relationships only on AI how useful are

52:36

you going to be at that moment of

52:38

greatest human possibility which is also

52:41

greatest human need.

52:42

>> Yeah. AI which demands nothing of you.

52:46

How will that form you to becoming the

52:49

sort of person of love that when a human

52:52

needs something of you, needs you, how

52:55

prepared will you be to give your whole

52:58

self to that person in love? This is a

53:00

fascinating idea. I think one we need to

53:02

ponder for pastors like me who are

53:05

sitting in this moment. Um we have a

53:07

little bit of history. We look back and

53:09

we think, "Oh, remember pastoring before

53:12

the internet?" I don't remember that,

53:13

but my friends do.

53:15

>> I do remember pastoring before the

53:17

smartphone was ubiquitous.

53:19

>> And we look back and we say, "We just we

53:22

were not prepared. We started playing

53:24

catch-up and we're still trying to play

53:26

catch-up."

53:27

>> Now we have AI. We're looking forward to

53:29

it. Speak to pastors and church leaders.

53:32

What are some ways that we could look

53:34

ahead and be prepared to pastor and to

53:38

shepherd and to love our people uh

53:41

really well into and through um the age

53:44

that's here and that is to come,

53:46

>> right? Wow.

53:48

There is one really interesting

53:50

difference between this and several of

53:52

those previous technological waves that

53:53

we've all lived through, smartphones,

53:55

social media, and it is that everyone's

53:58

ambivalent. When smartphones appeared,

54:01

everyone wanted one. It was going to be

54:03

just amazing. When social media

54:05

appeared, everyone wanted to get on

54:06

Facebook. It was going to be incredible

54:07

to reconnect with your friends. There

54:09

was so little ambivalence, so little

54:11

downside. Today, as we speak, two-thirds

54:14

of the United States wants AI to stop.

54:16

They don't want any more of it. There's

54:18

incredible resistance to it. And there's

54:20

all these questions being asked about. I

54:21

think this is so fruitful. Not because

54:24

AI is not going to be beneficial and

54:26

useful and good in certain ways, but

54:28

because it actually gives us a chance to

54:30

kind of shepherd the conversation of

54:32

what is this good for and not good for

54:34

in a way that we missed out when

54:35

everyone just leaped into the previous

54:37

waves. So I want to say pastors have

54:40

more of a shot at really shaping how our

54:43

communities respond to this than we had

54:45

in the previous two because the previous

54:47

two everybody was just sort of

54:49

intoxicated with what it could be. With

54:51

this one, everyone, including the people

54:53

developing it, is asking these really

54:55

deep existential questions.

54:57

>> And I think there's a couple ways in.

54:58

One is this is an amazing time to teach

55:03

how Christians see what it is to be

55:05

human and why it's good to be human.

55:07

Because so much of what's behind the

55:09

development of all this technology,

55:12

including AI, is often an explicit quest

55:15

to escape the conditions of being human

55:17

in ways that Christians just have a

55:19

different account of. So we can be

55:21

teaching on that. And the other thing

55:24

that we failed to do with these other

55:27

waves, these other waves were of

55:29

incredibly useful technologies of

55:31

connection, let's say very broadly, that

55:34

had all kinds of value. It's just that

55:36

they weren't good at absorbing all of

55:39

human life into them and and maintaining

55:41

what's best about human life and

55:42

relationships. And we allowed our

55:46

attention, our solitude, our silence,

55:49

you know, all these things to be

55:50

colonized

55:51

for the sake of getting the usefulness.

55:55

I think we have a chance to help people

55:58

uh implement these new technologies

56:02

in ways that do serve a limited useful

56:06

purpose while not bringing kind of the

56:10

sacred parts of human life and

56:12

the sacred dimensions of our life, uh

56:14

just sort of letting them float into the

56:17

stream of the way that they kind of

56:18

floated into the stream of smartphones

56:20

and floated into the world of social

56:22

media that was not prepared to be a good

56:24

place to be

56:25

>> uh someone pursuing God and pursuing

56:27

love of others. Right? So, we've got a

56:29

chance to help people differentiate,

56:32

make targeted strategic choices about

56:35

what we use and how and why and when

56:37

rather than just sort of letting it all

56:39

wash over us and take over uh the human

56:42

experience. I want to ask a really

56:44

practical question like so practical for

56:48

church leaders, pastors and then um I

56:51

want to ask you to give us sort of a

56:53

final word on maybe just a paradigm

56:56

through which all of us can consider AI.

56:58

So the practical question for pastors,

57:00

church leaders, I'm just going to get

57:02

hyper nuts and bolts right now. There's

57:05

a pastor out there. There are many

57:07

pastors out there, I assume, hundreds,

57:09

thousands maybe, who quietly, secretly,

57:13

maybe with the tinge of guilt or shame,

57:15

are going to their chatbot, their AI

57:17

because it's Thursday afternoon.

57:19

>> Oh, I thought you were going to say

57:21

Saturday at

57:22

>> or Saturday night, you know. And um and

57:25

they've been doing great work. They're

57:26

they're ministering to people. They're

57:28

really pastoring. you know, the couple

57:30

whose marriage is falling apart and

57:31

there's the hospital visit

57:33

>> and and Sunday's coming and they've got

57:35

to preach. So, the AI sort of pops open

57:39

and it's like this magic box. I type in

57:42

the text, maybe some ideas of

57:44

my own like I've got these sort of

57:46

thoughts.

57:46

>> I'm sure you do.

57:47

>> Give me an outline, whatever it might

57:49

be. And then

57:50

>> So that's one example. Well, there

57:52

are millions, not millions, but there

57:54

are many examples like this that

57:56

are probably either happening in

57:58

churches or that we can think of.

58:00

>> How should church leaders, how should

58:02

pastors think about is there a paradigm

58:04

or a lens, specific lens through which

58:07

we can consider what is a responsible

58:10

way to leverage some of these

58:13

technologies to be tools,

58:15

>> an extension of my capacity as a pastor?

58:19

um and not allow it to become this

58:22

technology that sort of robs me of

58:25

pastoral capacity.

58:27

>> So, I'd say there's a spectrum of things

58:28

that you have to do to get through the

58:30

day uh and fulfill your job

58:34

and some of them are really quite

58:36

technical and quite tedious. And if you

58:39

can find a way to use technology as an

58:41

instrument, you're still involved,

58:43

you're still making the key decisions,

58:44

but it makes it quicker, it makes it

58:46

less frictionful, great. And then over

58:49

here somewhere are the most sacred

58:51

things you do.

58:54

When you get to the most sacred thing, I

58:57

do not think you want to be trying to

58:59

alloy silicon into that. And then the

59:02

question is in your theology let's say

59:05

of the Sunday sermon: is it

59:09

basically a technical process where you

59:12

have to have some points and you

59:14

probably need some slides and uh it

59:17

needs to sort of really make sense as an

59:19

outline and you're out of time and so AI

59:23

can give you a bunch of those things.

59:26

I mean, that might be your theology of

59:28

preaching, right? And maybe you

59:31

don't think it's that significant even

59:33

like compared to some of the other

59:34

things you do. I'm not here to tell you

59:38

what the role of the proclamation

59:40

of the word is, except to say

59:43

I don't know that many people who went

59:45

into ministry believing it was just a

59:47

technical thing.

59:48

>> Yeah. I think most people went in

59:50

thinking the chance to be as a person

59:53

responsible to the word of God with the

59:55

word of God before a community of people

59:58

and to dare to say I am speaking to you

60:01

in the name of God, the Father, Son, and

60:03

the Holy Spirit is such a sacred

60:06

thing.

60:08

I would just say go unprepared on Sunday

60:10

morning

60:11

>> rather than turn it into an economically

60:15

productive cognitive task which is what

60:17

AI can do. Do you not believe if you've

60:21

been faithful for six days of the week

60:24

doing what you're called to do and that

60:26

just has meant you haven't had time to

60:27

prepare your sermon. Do you not believe

60:29

the Holy Spirit can show up in that

60:31

moment of need with people who need to

60:33

hear from the word? Even if all the

60:35

preparation you've had the time to do is

60:36

just see what the text is, say, "Oh,

60:38

God, help. I don't know what to say."

60:40

And maybe just begin by saying, "I come

60:42

to you unprepared, but let's look at

60:44

this text together." Don't you think God

60:46

could show up?

60:49

If it's just a technical activity,

60:52

sure, use the chatbot. If it's the most

60:55

sacred thing you do or one of the most

60:57

sacred things, then trust the Holy

61:00

Spirit who is going to use even your

61:02

lack of preparation. And believe me, I

61:04

speak from all too real experience of

61:08

sometimes also having not prepared. It's

61:10

not like I was being so amazingly

61:12

faithful that I just ran out of time.

61:14

It's that I procrastinated. I avoided the

61:16

work and I've still found God has

61:19

honored my willingness to speak in

61:22

his name, depending on him repentantly,

61:25

truthfully. Don't miss out on what

61:29

the out-of-distribution Holy Spirit

61:32

might do

61:33

>> by giving your people something they

61:35

themselves could get from the chatbot.

61:38

>> Like they could go plug in the text

61:40

>> and probably get it faster and cleaner

61:43

than you're going to give it to them.

61:44

take the risk uh of giving them

61:47

something of together going into the

61:50

presence of God with the word of God,

61:53

asking for the spirit of God and seeing

61:55

what happens. I'd much rather do that

61:56

any Sunday of the year.

61:59

>> Um and and then of course over time make

62:01

sure that you have enough time really

62:02

for the solitude and the silence and the

62:04

study and the preparation. That's part

62:05

of being prepared to rightly handle the

62:07

word of God. But gosh, if one Sunday you

62:09

just can't quite get there,

62:11

>> let's see what God would do with that

62:14

that inability

62:16

>> rather than technologically

62:18

concealing

62:20

>> um our true situation.

62:23

>> That's so well said. Really beautiful.

62:26

Yeah. What God can do with our

62:28

inability. Do we believe that his

62:30

strength really is made perfect in our

62:33

weakness, not our preparedness or our

62:37

preparation? Yeah. Or our skill even.

62:39

Yeah, that's a beautiful word.

62:41

>> What all these devices give us is access

62:44

to this thing I call the superpower

62:45

zone. Superpowers is this language. I

62:47

think it's fading a little bit, but it

62:49

was like the word for what

62:52

technology was gonna give you a couple

62:53

years ago.

62:54

>> I was a comic book kid, so I totally

62:56

relate. So it comes from the world of

62:58

comics of course but then you start you

63:00

know reading that if you use this

63:02

platform you'll have coding superpowers

63:03

but then if you're in marketing we can

63:05

give you marketing superpowers all of

63:06

which basically means more and more

63:09

effect with less and less effort. So

63:11

it's basically effortless power. And the

63:13

superpower zone

63:15

>> is when you get into this kind of flow

63:17

of a kind of feeling like with

63:20

minimal activity and

63:24

expenditure of energy you're having

63:26

outsized effect. This is when something

63:29

goes viral, right? Like you write a

63:31

little tweet

63:32

>> or whatever kids are doing these

63:34

days. Tweets are gone. Um and

63:37

somehow it takes off and all you did was

63:40

like press a button. Yeah.

63:42

>> And you're seeing these effects

63:44

>> and technology increasingly is designed

63:47

to keep us in this zone of feeling like

63:49

we are just uh kind of surfing

63:52

through the world on power that is not

63:54

our own without requiring any real

63:56

effort or skill unlike real surfing

63:58

which requires incredible effort and

64:00

skill. I'm just like uh coasting on this

64:05

wave of ability to make a difference

64:08

without myself having to become

64:10

different.

64:10

>> Yeah.

64:11

>> And this is why these technologies are

64:13

not neutral for the project of becoming

64:16

like Jesus because they train us to not

64:19

want the the pain, friction, and

64:23

difficulty of being formed into someone

64:26

who's different from who we were.

64:28

>> Yeah. It's very related to the dream of

64:30

magic, which is the dream of having

64:32

power that I just wield without moral

64:35

character, without dependence on God,

64:37

without dependence on other people, uh

64:39

without friction. If I just say the

64:41

spell, if I know the spell and can say

64:43

the spell, boom, something will happen,

64:45

>> right?

64:45

>> And again, this is how technology has

64:47

been sold. And and interestingly uh as

64:50

AI has been rolled out by the various

64:52

companies that have commercialized it,

64:54

almost all of them are using some little

64:56

iconography that represents like a

64:57

little magical moment like these little

64:59

starbursts or you know sometimes a

65:02

literal wand like apply this

65:04

wand to your writing and boom without

65:07

any effort without you becoming a better

65:09

writer it'll just make your writing

65:10

better

65:11

>> right

65:11

>> and we think that sounds like the life I

65:13

want. I want a life of minimal uh

65:16

difficulty, maximal effect, um where I

65:21

get to just see things happen without

65:24

myself having to be kind of deeply

65:27

changed for them to happen.

65:30

>> And

65:31

that can happen. I mean, there's a

65:34

version of that that is available to us

65:36

now. But how is it related to the way of

65:39

Jesus? That's the question. Yeah, that

65:41

thought effortless power I think is such

65:43

an important thought for pastors, church

65:45

leaders, yes, but for all of us who are

65:47

followers of Jesus. One to just

65:50

recognize it is in large part what I

65:52

want when I wake up in the morning.

65:54

65:55

>> I would love effortless power and it's

65:58

not it's not you know they're not all

65:59

like I want to go viral. It's just

66:01

simple things in life

66:03

>> that you know when I'm raising my kids I

66:05

would like for it to be a bit more

66:08

effortless. Parenting superpowers. Well,

66:10

you love to feel like, "I've got parenting

66:12

superpowers."

66:13

>> Right. Right. And there's this sort of

66:15

amplification of the myth that other

66:18

people can do it effortlessly when you

66:20

scroll whatever it is you scroll. I was

66:22

like, man, totally that should be my

66:24

life. So, there's this sort of loop that

66:26

we live in that everybody seems to have

66:28

effortless power.

66:30

>> Wow. And the only way

66:32

I'm going to know that I am experiencing

66:36

and wielding effortless power is if my

66:40

life looks a bit

66:41

>> like this life. You know, this mom is

66:43

just beautiful and her kids are perfect.

66:45

You know, that needs to be my life. And

66:47

we know like intellectually that's not

66:49

true. But I think that idea is so key

66:53

because if I'm hearing you correctly and

66:56

just hearing my own life correctly,

66:59

>> there is no way to be formed into

67:02

Christlikeness effortlessly. The

67:05

invitation is to bear a cross and to

67:08

walk a very narrow path. Bonhoeffer's

67:11

idea of cheap grace is like really no

67:14

grace at all is essentially what he's

67:16

saying. If it's grace without the cross,

67:18

then it's not really grace. You've just

67:20

cheapened it into this effortless

67:23

>> sort of thing that looks like the real

67:25

thing

67:26

>> but falls short of that. So, I

67:28

want to talk more about that a bit.

67:30

There's um

67:31

>> the Canadian philosopher

67:33

Marshall McLuhan. He had that concept of

67:35

his four laws of media. By media, he

67:38

didn't mean, you know, the news. He

67:40

meant any extension of

67:42

>> human capacity. Um and he talks about

67:45

how the four laws essentially you know

67:47

every technology

67:50

does several things, but I

67:52

want to get to the fourth thing. He

67:54

basically says that when a technology is

67:57

pushed to its limits

67:59

>> and pushed to its extremes, it

68:02

almost always sort of folds in on itself

68:05

and it undoes the thing the human

68:07

capacity that it was sort of originally

68:09

intended to do. The the classic example

68:12

is the phone, right? The phone was

68:14

originally intended to extend the human

68:16

capacity to hear one another, to

68:19

communicate to one another. You see it

68:21

move over time and now we sit at the

68:24

same coffee shop and we're not

68:26

connecting at all and we're just lost on

68:28

these phones that have robbed us of that

68:30

capacity. So, um I

68:33

>> I want to talk about sort of in light of

68:36

that,

68:36

>> I want to talk about an idea that

68:38

you've proposed to me in conversations.

68:42

um you use the imagery of elements and

68:45

alloys

68:46

>> and we can think about any technology

68:48

along those lines. Is this technology an

68:51

element or an alloy? The reason that

68:54

matters is because by element, you can talk

68:57

more about this, but what you mean is it

68:59

sort of replaces a particular human

69:02

capacity, which, per McLuhan's concept, just

69:04

reverses, and now humans lose the

69:07

capacity in some sense

69:09

>> whereas there is the possibility of

69:11

technologies some technologies being an

69:14

alloy something that strengthens and

69:16

supports human capacity so talk about AI

69:20

through that lens element an alloy. Is

69:24

it even possible for AI to become sort

69:27

of an alloy? This supportive

69:29

strengthening mechanism.

69:31

>> Wow. I do love this metaphor. I got it

69:34

from Robert Putnam, who wrote this book,

69:35

Bowling Alone. And in one of the updates

69:37

to this book, which is about

69:39

really the uh disconnection that's

69:42

happened in American life. Um partly

69:44

because of technology, partly for other

69:45

reasons. Um he uses this this picture.

69:48

So what is an alloy? Let's just refresh

69:50

our metallurgical memories that um steel

69:53

is an alloy fundamentally of iron and

69:56

carbon. Um so iron itself when properly

69:59

kind of uh smelted or whatever the right

70:01

word is um is actually a very strong uh

70:05

thing cast iron if you think about a

70:06

cast iron pan uh but only so strong and

70:10

uh along the way human beings discover

70:12

if you combine iron with a little bit of

70:14

carbon uh less than 2% I think um and

70:18

and heat them and uh do the things they

70:21

know how to do. You can see I'm a very

70:23

sophisticated user of this metaphor. um

70:26

you get this substance called steel

70:27

that's way stronger than iron.

70:30

>> Now also way stronger than carbon. So

70:33

it's interesting, carbon is actually

70:35

not very strong uh except in the form of

70:37

diamonds. But like ordinary carbon like

70:39

you think about a pencil uh that you

70:41

might write with pencil lead that's

70:42

carbon, and it's very, you know, friable. It

70:45

it breaks very easily. So it's this real

70:48

interesting thing like you've got this

70:49

pretty strong stuff iron and this very

70:51

actually weak um stuff uh carbon. But if

70:55

you combine them at the right

70:56

proportions, mostly iron, a little bit

70:58

of carbon, you end up with this alloy

71:00

steel that's much harder and and better

71:03

for a lot of things. I think it's a

71:05

beautiful and fruitful metaphor because

71:08

what Putnam observes is he says, well,

71:10

we've ended up with kind of our

71:12

relationships are digital alloys of face

71:16

to face. So face to face is like the

71:18

iron here. So what builds a

71:19

relationship? It's when we're together

71:21

in person. It's when we're getting the

71:23

incredible amount of information that's

71:25

passing between you and me. We're

71:27

capturing some of it on cameras and and

71:29

audio here, but you and I are having

71:31

like a multi-terabyte information

71:33

experience that folks who are going to

71:35

watch this conversation on video will

71:37

get like, I don't know, a

71:39

multi-gigabyte experience. Um, and so there's

71:43

something very fundamental about the

71:44

face to face about the the true embodied

71:47

personal. But you and I may go away and

71:50

well, I mean, even to set up this

71:51

conversation, we had a bunch of emails.

71:52

So, we used digital technology building

71:55

on prior in-person time, anticipating an

71:58

in-person time that we're literally

72:00

having now. Um, we sort of bridged

72:03

between those with this very thin

72:05

communication medium, in our case

72:07

called email, maybe a couple of texts,

72:09

>> but that's a little bit like we added a

72:12

little bit of an element that on its

72:14

own would not be enough to sustain the

72:15

relationship, right? So I think of

72:18

digital relationships as the carbon of

72:21

human relationality. Like 100% carbon

72:24

all you can do is make little pencil

72:26

scratches with it. You can't build a

72:28

sword with it. You can't make a pot out

72:30

of it. It's not that useful. Right? But

72:32

maybe it's the case and this is what

72:34

Putnam is suggesting that if we are

72:36

building our relationships primarily on

72:39

the iron of the truly embodied

72:42

relationship that's always been the way

72:44

we do it as humans but we add a little

72:47

bit of the digital it can actually

72:49

strengthen it. Here's another way of

72:51

thinking about it, Jay.

72:52

>> It's really good at all the things that

72:54

make us economically productive in the

72:56

world. So one way AI or specifically

72:59

artificial general intelligence has been

73:00

defined by the people who are

73:02

commercializing it is we'll know we have

73:04

artificial general intelligence when a

73:06

computer can do and it's so interesting

73:08

they use this phrase any economically

73:10

valuable cognitive task that human

73:12

beings currently can do. So the goal

73:14

right now of firms like OpenAI is

73:17

to build a system that can take any task

73:19

that's cognitive, that is, it

73:21

involves processing and responding to

73:22

information or signals from the world

73:24

and it's economically valuable. We want

73:26

to be able to do that. Well, think about

73:28

all the things that have nothing to do

73:30

with any kind of economy.

73:33

Very deeply and broadly, you might say,

73:36

the work of care,

73:38

>> the work of compassion that when I

73:41

really am present to you, especially

73:42

when I'm present to you at a time when

73:44

you are dependent or when you're present

73:46

to me at a time when I'm dependent.

73:47

Yeah. Which is in infancy, in old age,

73:50

in serious illness, in disability.

73:54

um those things literally cannot be

73:57

captured by an economic system, right?

73:59

But they're the most important human

74:01

things we do.

74:02

>> That's when people are formed. That's

74:03

when we are most formed into people of

74:05

love.

74:06

>> Yes.

74:07

>> In both directions, right? So you're

74:09

reformed. You're formed in receiving the

74:11

care, but the one who cares is also

74:13

formed.

74:13

>> The resistance almost seems necessary

74:16

for the formation of love. If love is to

74:20

lean into human

74:23

pain, not just human pleasure but human

74:25

pain, to lean into the challenge of

74:27

human pain, to fight through the

74:29

challenge of human pain to will

74:31

another's good even at great cost to

74:34

yourself. It almost feels

74:37

completely necessary that without

74:40

those challenges you cannot possibly be

74:43

formed, certainly not fully formed,

74:46

>> into a person of love. I want

74:50

to ask you as we conclude this

74:52

conversation

74:54

maybe offer folks who are listening and

74:57

watching

74:59

a simple, accessible (it's not a simple

75:02

conversation, but as simple and

75:05

accessible as possible) paradigm as

75:08

they walk away okay it AI is all around

75:12

me technology is all around me AI is

75:14

certainly all around me and it's

75:15

increasingly so. What is sort of a

75:20

lens through which I can consider any

75:22

and all engagement I have with AI

75:27

as a follower of Jesus. I just don't

75:30

think we're going to do better than the

75:32

greatest commandment. Love the Lord your

75:34

God with all your heart, soul, mind, and

75:36

strength. And from this I've kind of

75:39

deduced in a way like what is it to be

75:41

human? It's to be a heart, soul, mind,

75:43

strength complex designed for love. Mhm.

75:45

>> So I sort of say this is what we're

75:47

meant to be and everything we bring into

75:50

our lives should uh respect that should

75:54

sort of honor what we really are, a heart

75:55

soul mind strength complex designed for

75:57

love and should advance that should help

75:59

me be that

76:00

>> and then from this incredible word in

76:02

the Shema with all your heart all your

76:05

soul I hear this call to

76:08

development that is uh there's a way to

76:11

be half-hearted

76:13

instead of all-hearted. There's a way to

76:15

be half-minded, half-souled. I guess

76:17

there's certainly ways for your strength

76:19

to atrophy and you're not even

76:21

physically capable of what you could be.

76:23

Right? So, it's this invitation to see

76:26

life as the pursuit of allness of heart,

76:30

soul, mind, and strength. All all only

76:33

to the extent that they equip us to

76:35

love.

76:37

I don't think, in a way, if you're

76:39

talking about something as general as

76:40

technology, I don't think I can be more

76:42

specific than say we need individually

76:45

and together and at a point in time and

76:48

over time to ask about everything we

76:50

bring into our lives. Is this helping

76:52

me develop allness of heart, that is, grow

76:54

in my emotional capacity and the

76:56

strength of my will toward the good, to

76:59

grow in allness of soul, my depth of

77:02

self, my ability to connect to the depth

77:04

that is God as well as the depth

77:06

that is in others my allness of mind my

77:09

allness of strength

77:11

and if it helps

77:13

increase allness, and if, as it does, I

77:16

don't become inflated and prideful about

77:18

my mental abilities or my great physical

77:21

prowess or something. But no, all

77:22

that is in the service of loving God and

77:24

neighbor, then bring it on.

77:27

>> Uh and if it doesn't, get it out

77:32

>> because this is the this is the way to

77:34

flourishing. So we will be asking every

77:38

time we think about incorporating AI

77:40

into some system or into some process or

77:42

some part of our lives.

77:44

Is this deployed in a way that helps me

77:48

grow in my heart, my soul, my mind, and

77:50

my strength?

77:51

>> If it does, I'll use it. If it doesn't,

77:55

I will gently but firmly say no thank

77:59

you

78:00

>> to the merchants of it

78:01

>> because we're here to love in that way.

78:05

Um, Andy, you're a gift to me, to so

78:09

many people. I think that was such a

78:12

fitting way to uh end this

78:17

conversation.

78:19

Is it developing and furthering

78:21

allness in me, making me more human the

78:24

way God called us to be? Thank you so

78:26

much for this conversation, for your

78:28

time.

78:29

>> Thank you, Jay.
