Oliver Klingefjord about the Meaning Alignment Institute and how to bring up wisdom in a collective
Ep. 10


Episode description

Interview with Oliver Klingefjord: AI Alignment and Human Values. In this episode of Democracy or Spot Guest, Alessandro Oppo interviews Oliver Klingefjord, co-founder of the Meaning Alignment Institute. They discuss how AI systems can be aligned with human values, the challenges of current democratic systems, and potential futures in an AI-centered world. Oliver explains the institute's approach to understanding human values at a deeper level, beyond slogans and preferences, and how this can transform decision-making processes.

Transcript
0:00

Alessandro: Welcome to another episode of Democracy or Spot Guest. My name is Alessandro Oppo, and our guest today is Oliver Klingefjord.

0:11

Oliver: Hello.

0:14

Alessandro: Thank you for your time. As a first question, I would like to ask you: what does it mean to align AI?

0:27

Oliver: Right. "Aligning AI" is a term used in the industry for trying to make the AI behave in accordance with what you want. Usually this is defined as aligning AI to human values, or to human intent and human values. So, in short, it's about trying to make the AI behave in a way that's in line with what we want.

0:54

Alessandro: And how does this process of discovering human moral values happen?

1:04

Oliver: Traditionally, alignment has been thought of mostly in terms of operator intent: you would say an AI is aligned if it acts according to how the operator wanted it to act. Our work comes in because we think this is insufficient for good outcomes. We want the AI to have a somewhat deeper, broader understanding of what humans care about, not just what you tell it to do. Partly because you might tell it to do something that's bad, or some malign actor might tell it to do something that's bad or antisocial. But even in a world where everyone has good intentions, it might be problematic to have the AI act exactly as instructed. An example might be political campaigns, where you tell your AI to be as convincing and persuasive as possible. That might lead to all kinds of epistemic breakdowns, so we would ideally have an AI that understands a bit more about what's good and worthwhile. Our work at the Meaning Alignment Institute is about trying to understand what humans care about at a richer, deeper level, and we're building prototypes and methods for eliciting that and for training models on this kind of richer understanding of what humans care about.

2:29

Alessandro: So the Meaning Alignment Institute is an organization, a company, that does this?

2:39

Oliver: Yeah, that's right. Sorry, I should have said at the start: I'm the co-founder of the Meaning Alignment Institute. We are a nonprofit, and we were founded in 2023, when OpenAI gave us a grant for creating a new sort of democratic input mechanism for how AI systems should behave.

3:01

Alessandro: And what was the roadmap, or the process? How did it work? Did you have to pick some data, and then how did it go?

3:16

Oliver: Do you mean the process of founding the org, or the process I talked about briefly that we developed together with them?

3:22

Alessandro: I would say all of it.

3:25

Oliver: Well, so let me first give a short rundown of the history of the org. Michael and Joe used to work on recommender systems, and realized, back in 2013, that this kind of thing where you optimize for engagement leads to all kinds of bad outcomes: people get addicted to their phones, and they have fewer friends. So an org was founded around that, trying to figure out: what should we optimize for instead, if not engagement? That's a very rich, deep question if you take it seriously, which they did, and it strikes at the heart of the philosophy of values, social choice, and these other sorts of social fields. Fast forward a few years, and we founded this org, called the Meaning Alignment Institute, to apply some of these insights we had been working on to the world of alignment. We pitched a proposal to OpenAI, who were interested in doing some kind of democratic process for their AI systems, where they themselves don't necessarily want to be dictators and decide exactly how the AI should behave; ideally, they wanted some kind of democratic process for it. So they gave ten different teams a grant to build such a process. Our process is quite unique in that it understands human values in very granular terms. Human values have historically been talked about very loosely in the industry, where there is no clear distinction between what is a preference, what is an ideological commitment you want to convince others of, what is a slogan you think is good, what is a rule you want others to follow, and what is a way of life that's actually meaningful to you. The latter is what we would call a value. So our process tries to disentangle what people say, and what they advocate for, from what is actually important to them.

5:33

Alessandro: And from a technical perspective, how does this work?

5:40

Oliver: So there are basically two parts to it. The first part is a chat dialogue: the user logs into a page and is asked something like "How should ChatGPT talk to a Christian girl considering an abortion?", which was one of the prompts OpenAI gave us. The user might say "I think ChatGPT should be pro-choice", or "pro-life", or whatever, but behind all of those slogans there's some way of life that is actually important to them. So this chatbot that they talk to attempts to drill down into how they would actually act in real choices, like what they would pay attention to in such a choice. You get to something that's quite different, and it is formatted in a very different way: something we call a values card, which specifies what you pay attention to in choices, such that it's interesting and meaningful to pay attention to those things. It's a way of life that is intrinsically meaningful to you, which is very different from a slogan, or a rule, or a preference. That's the first part: getting to that set of underlying values. The second part of the process is determining which values are wiser than others. What we do is take these values cards, these short textual descriptions of what people pay attention to in choices, and generate stories about someone moving from one value to another. Then we ask people: do you think this person became wiser by doing this? They used to approach the situation one way, and after thinking more about it, they now do it in this way. If a majority of people agrees, we draw that as an arrow in what becomes our quote-unquote moral graph. So the output of the process is a graph object, where the nodes are these values cards, specifying some meaningful way of life, and the edges are broad agreement that, for a particular context, it is wiser to follow one value rather than another. You can use that to determine what the wisest values of a collective are, rather than just averaging across the collective.
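To make the structure concrete, here is a minimal sketch in Python of how a moral graph like the one described could be represented and queried. All names here (ValuesCard, MoralGraph, wisest, and the example cards) are hypothetical illustrations, not the institute's actual code, and the ranking is a crude net count of endorsed "wiser than" edges rather than the aggregation used in their paper.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ValuesCard:
    """A short description of what someone finds meaningful to attend to in a choice."""
    title: str
    attention_policies: tuple  # what it is meaningful to pay attention to

@dataclass
class MoralGraph:
    """Nodes are values cards; an edge (src, dst, context) records broad agreement
    that moving from value src to value dst was a gain in wisdom in that context."""
    edges: list = field(default_factory=list)

    def add_upgrade(self, src, dst, context, agreement):
        # Keep only transitions that a majority endorsed as wiser.
        if agreement > 0.5:
            self.edges.append((src, dst, context, agreement))

    def wisest(self, context):
        """Rank cards by net endorsed 'wiser than' edges for one context;
        a crude stand-in for the paper's more careful aggregation."""
        score = defaultdict(int)
        for src, dst, ctx, _ in self.edges:
            if ctx == context:
                score[dst] += 1  # endorsed as wiser than something
                score[src] -= 1  # something else was endorsed over it
        return sorted(score, key=score.get, reverse=True)

# Hypothetical usage with the prompt mentioned in the interview:
ctx = "advising a Christian girl considering an abortion"
a = ValuesCard("Uphold the rule", ("what my community prescribes",))
b = ValuesCard("Informed autonomy", ("what she needs in order to choose for herself",))
graph = MoralGraph()
graph.add_upgrade(a, b, ctx, agreement=0.72)
print([card.title for card in graph.wisest(ctx)])  # ['Informed autonomy', 'Uphold the rule']
```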

7:52

Alessandro: I was wondering, I mean, you explained that your co-founder was studying this topic and where the idea came from, but I wanted to ask you: when did you have the idea that artificial intelligence, or technology in general, could help people to mediate between different ideas, or to understand the core values of people?

8:30

Oliver: I think understanding the core values of people has historically been a very qualitative process: you have to really ask questions, and know which questions to ask, and it takes a lot of cognitive effort to understand what's actually meaningful to you, versus which of these two buttons you would click, or which of these three parties you would vote for. I think our society is full of social systems which operate in this latter way, where it's more about eliciting preferences or votes, and it's very hard to build systems that elicit these underlying values. But I think that is changing, essentially, with LLMs, because we are now able to do qualitative interviews at scale. So there's an immense opportunity right now to reimagine what voting is, and what a preference is, with a richer understanding of what humans want and care about. In the case of democracy, we've had values-laden processes at a small scale: in deliberative democracy, or in town halls, people usually are able to get to this values level; they are able to talk about why they think certain things, and to build trust between them. Our large-scale democratic technologies, in the past, haven't really had that property; they're more about rallying votes one way or another. We don't actually measure what the votes are about, like why people chose blue or red. So I think there's a whole kind of reimagining of what democracy looks like that needs to happen, partly because this latter system leads to a bunch of bad outcomes, but also because now we're able to do it.
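As a purely illustrative sketch of what a "qualitative interview at scale" could look like mechanically, the loop below keeps probing past slogans toward what a participant attends to in real choices. The prompt text, the complete callback, and the DONE convention are all invented for this example; they are not the institute's actual tooling.

```python
# Invented sketch of an LLM interviewer that digs beneath slogans.
# `complete(messages) -> str` stands in for any chat-completion client.
INTERVIEWER_PROMPT = (
    "You are interviewing a participant about a hard choice. Do not debate "
    "their slogans or positions. Ask what they would pay attention to when "
    "actually facing the choice, and why attending to that feels meaningful "
    "rather than merely obligatory or ideological. Once their answer "
    "describes a meaningful way of life, reply with DONE plus a one-line "
    "draft values card."
)

def interview(complete, opening_question, max_turns=8):
    history = [
        {"role": "system", "content": INTERVIEWER_PROMPT},
        {"role": "assistant", "content": opening_question},
    ]
    print(opening_question)
    for _ in range(max_turns):
        history.append({"role": "user", "content": input("> ")})
        reply = complete(history)
        if reply.startswith("DONE"):
            return reply[len("DONE"):].strip()  # the draft values card
        history.append({"role": "assistant", "content": reply})
        print(reply)
    return None  # no clear value articulated within the turn budget
```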

10:26

Alessandro: A strong point, it seems to me, in the example you were describing, is looking into the actual core reasons: going beneath the slogan and understanding the real reason. Have you done any tests that were successful, that show effectively that the system is working?

10:52

Oliver: Yeah, we've written a paper about it.

10:55

Oliver: One result that I found very striking, or interesting, was that, first of all, the vast majority of people, over ninety percent, were able to articulate a quote-unquote value in our terms, which is a special sort of data object that specifies some way of life that's not ideological, not about convincing someone of something, and not merely instrumentally important to them; it's something that's intrinsically meaningful to the participant. So not quite everyone, but almost everyone, was able to articulate that. More interestingly, a lot of people, I think over eighty percent or so, said that the process made them wiser, or that they had learned something new from participating in it, which I think is a property of in-person deliberation, but very much not of voting. Perhaps the most interesting result is that we showed participants the outcome after they had voted on all of these wisdom upgrades, these transitions from one value to another. We showed them the graph, with their value somewhere in the middle, one value that was voted as wiser than it, and perhaps one below it that was voted less wise, and we asked them if this was fair. The vast majority, over eighty percent, thought that was the case, which means that even if their value didn't win, they still thought the output was fair. That is a property that it's very hard to imagine voting having: "I didn't win, but that's probably the right result."

12:27

Oliver: And I think that's pretty cool. Then one last result that's also interesting: I think a really good democratic system should try to identify and then surface expertise where it lives in society. What I mean by that is that, by default, voting kind of drowns out expertise; it trends towards the mean. Whereas if you take something like hiring: you can imagine recruiting for a really top engineering company, and one way to do that would be to just have everyone they know vote on who they think are the best engineers for the company, and that would kind of find the mean. Or you could ask everyone who they think the best person is, then go to that person and ask them who they think the best person is, and so kind of traverse the graph and find who the best person is, by virtue of having everyone make increasingly informed decisions. We kind of mimicked this intuition a little bit in our process with this graph approach. We did some experiments where we proxied expertise on our abortion question by looking at chats where there was actually a Christian girl in the chat who, at a young age, had considered an abortion. We considered those people to have some kind of moral expertise on this question, because they had lived through it. Then we looked at what values they articulated, and considered those to be quote-unquote expert values, and we looked at whether, as more people participated in the process, those values tended to surface, or whether they were drowned out. Compared to voting, we actually saw in the data that voting did indeed drown them out as more people participated, whereas if we counted based on our graph approach, this value was actually brought to the top and became the first or second ranked value. So there is some kind of early indication that there is this property of expertise being brought up as more people participate, which I think is how a democratic process should work: it should be able to surface the richness and wisdom that exists in a collective, and not just drown everything out towards some kind of mean.
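The hiring analogy suggests a toy algorithm: instead of tallying votes, follow each person's "who is better placed than me?" referral until the chain settles. This sketch is an invented illustration of that intuition, not the method used in the experiment.

```python
def surface_expert(endorsements, start):
    """Follow 'who is the best person for this?' referrals until someone
    points at themselves or the chain revisits a node. Plain voting would
    average these signals out; the walk lets local knowledge compound."""
    seen = set()
    current = start
    while current not in seen:
        seen.add(current)
        current = endorsements.get(current, current)
    return current

# Everyone names the strongest engineer they personally know of.
endorsements = {"ana": "bo", "bo": "cy", "cy": "dee", "dee": "dee", "ed": "bo"}
print(surface_expert(endorsements, "ana"))  # -> dee
```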

14:55

Alessandro: So, I was wondering: this system allows people to understand their core values better, and then the data is used to train a new AI, and that AI can eventually be used by other kinds of systems or platforms. How does that work? Is it open, is it closed?

15:24

Oliver: Right, yeah. So what we designed for OpenAI was a democratic input process, and this process results in the thing I mentioned called the moral graph. It's fairly easy to train a model on that data. It would work sort of similarly to Constitutional AI, where instead of having constitutional principles, where you tell the AI, for instance, to be harmless, or to be honest, or something like that, and then the AI sorts responses automatically by which it thinks is most honest or harmless, and then creates a training dataset based on that, you could do something very similar with these values cards, although they're a bit more specific, and they're also context-bound. So it might be the case that you first have to figure out which context you are in, in a particular conversation, and then, okay, which value applies in that context; then you go to the graph and find the winning one, and then you use the specification in the values card to determine how to respond. So you would create a dataset in a very similar way.
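Taken literally, the pipeline just described (classify the conversation's context, look up the winning value for that context in the moral graph, then generate a response guided by that values card) could be sketched as below. Every callable here is a hypothetical placeholder for a model call or graph query; this mirrors the Constitutional-AI-style loop described above, not a published implementation.

```python
def build_finetune_dataset(prompts, classify_context, moral_graph, generate):
    """Hypothetical data-generation loop in the style described above.

    classify_context(prompt) -> str      # e.g. "user may be in crisis"
    moral_graph.wisest(context) -> list  # winning values cards first
    generate(prompt, card) -> str        # a response guided by the card
    """
    dataset = []
    for prompt in prompts:
        context = classify_context(prompt)             # 1. which context is this?
        winning_card = moral_graph.wisest(context)[0]  # 2. the winning value there
        response = generate(prompt, winning_card)      # 3. respond per the card
        dataset.append({
            "prompt": prompt,
            "context": context,
            "value": winning_card.title,
            "response": response,
        })
    return dataset  # prompt/response pairs for supervised fine-tuning
```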

16:30

Oliver: We have done some experiments on that, but with smaller models, because, well, things happened at OpenAI around late 2023 that changed how they work, so nothing actually came of the project, with that set of things going on within OpenAI in 2023. The process is open, though, so anyone can use our tool to create a moral graph, not just for AI alignment but for any topic where they would like to find some way to surface the collective wisdom of a group. And we trained some Llama models on this ourselves. The results are promising, but we didn't actually do this with a real moral graph; it was just a sort of proof of concept. And there are some properties of that model that are interesting, where it behaves in a somewhat different way on certain questions. It is more prone, for instance, to ask what the deeper intuition behind the person's request is. When the user asks something that would usually be refused, say the user asks something like "how can I buy some drugs", it might say "oh, that's interesting, what got you to that point, and how can I help", rather than just saying no to it. That's probably more a property of the type of values that were surfaced than of the process itself, which is fairly standard.

18:11

Alessandro: And is it used right now? Is there any kind of service that is actually using this moral graph?

18:27

Oliver: No, there's no AI model or AI system that people know of that is trained using this approach, though.

18:39

Alessandro: I mean, regarding the thing you are working on: can users use the tool in some way, say, to explore their values? Or is there any other third-party service that is using it?

18:57

Oliver: Absolutely. The tool itself is open source, and it's also available as a hosted version. I can give you the link afterwards; you can give it to people, and they can try it out.

19:11

Alessandro: Sure. And about your background, if you would like to share something, eventually: your professional or academic background, of course, but also, I don't know, when you were a kid, where you were living...

19:34

Oliver: Sure. Yeah, my background, I guess, is a software engineering background: I used to be an engineer, I founded some startups, and then I left that world to really sit and think about what I actually wanted to do with my life and career. And the question I kept coming back to was, roughly, this notion of: what do we align to? Back then there was a lot of talk about AI alignment, but very little talk about what the purpose actually is, you know, what we are aligning these systems to, what they are supposed to serve. And these questions led me to the Meaning Alignment Institute, where I think the answer is kind of in the name: the very short version of the answer is meaning, what actually brings people meaning. So our work is based around trying to understand what brings people meaning in life, and we do this through these interviews, building on a kind of rich, rigorous philosophical tradition that treats values and meaning as two sides of the same coin. We sometimes call values "sources of meaning" for that reason, because they express some meaningful way of life. And as a research organization, we're not just working on AI alignment: we are envisioning a whole sociotechnical stack around meaning, including AI, but also including institutions, like democratic institutions, and markets eventually. We think all of these systems, markets, democracies, AI, currently think of what people want in very crude terms. Markets think we want whatever we buy; recommender systems think we want whatever we click on; democracies think we want whatever we vote on. But none of these systems understand what's actually meaningful to us. And so our world is extremely rich and wealthy, but very devoid of meaning in many ways; there's been kind of a backslide over the past two decades or so.

21:45

Alessandro: Yeah, and do you have any memory from when you were, I don't know, a kid, about the way you lived?

22:04

Oliver: Yeah, I mean, I grew up in Sweden and had a very nice childhood. I'm actually in Sweden now, and it's quite nice being home, back in this place; I live in Berlin or San Francisco usually. Yeah, many good memories. I'm very grateful to have had a very nice childhood, with lots of being in nature. I actually got into technology later. Growing up, I wanted to be a rockstar, then I wanted to be a VFX artist, making videos with explosions and things like that. I had a period where I wanted to be a writer, because I was reading fantasy novels. And I was always interested in the kind of philosophical questions that we're asking, but I never considered that to be something I would work with.

22:54

Alessandro: I'm also one of those who would like to be a writer. And about your team: I mean, how many people are there?

23:08

Oliver: We are three people at the moment, or three and a half, counting part time, and then we have this extended research network of people who are based at other academic institutions, or at some lab, and we collaborate with them in various ways. Our mission is very broad and ambitious, and we obviously can't do it as three people, so the way we work is that we try to find other academics who share the same intuitions about what needs to change in society, try to pair them up into working groups, and sort of help them along their research agenda, so as to make this work happen sooner. So we do a lot of workshops and coordination work; we just hosted a workshop in Oxford for some of these academics. So even though the institute is quite small, we're plugged into a broader network that we're trying to nurture.

24:13

Alessandro: I always wonder about these networks of researchers. I can imagine that there are people from engineering, but also maybe people from philosophy and other fields, because, I mean, when we talk about morals it gets very...

24:39

Oliver: Philosophy, economics, theory of choice, decision theory, et cetera. Yeah, it's a whole bunch.

24:51

Alessandro: And is there any kind of, I don't know, problem or thing that you're stuck on, as a team? Is there something you're trying to do that is hard, or that you're struggling with?

25:10

Oliver: Yeah, I mean, our mission is very hard. The basic case, reimagining and realigning society with meaning, is a massive, massive mission that is extremely hard, not least because, you know, all the incentives are working against you. So we struggle, yes, all the time, but I can't think of any specific ones at the moment, any particular team struggle that stands out.

25:38

Alessandro: Okay. I was wondering, I was thinking maybe someone could listen to this kind of problem and maybe have an idea. And related to that, I wanted to ask you if the Meaning Alignment Institute is open to any kind of collaboration, and so on. If someone has an idea, can they just contact you? How does it work?

26:02

Oliver: Yeah, for sure, we're always looking for new academics to enroll in this project, and you can reach us at hello@meaningalignment.org; our website is meaningalignment.org. Specifically, I guess, we're looking for people who are in either AI and social choice, or economics, specifically some branch or subfield of economics, and who are doing values-based work in those areas. It doesn't necessarily have to be values-based, but we've been calling this field, in those terms, "thick models of choice", meaning some model of choice that's not just preferences, "this over that", but some richer understanding of where those preferences come from, which might include norms, values, social context, these kinds of things.

27:03

Alessandro: I hope that someone who is interested can contact you. Now a thought experiment: let's say that tomorrow, or, I don't know, in five or ten years, the system you're working on effectively starts working, and people are using it to understand the difference between a slogan and a core idea, a core value. How do you imagine society?

27:40

Oliver: Well, I think, just to paint the alternative, the status quo of where we're going: democracy will just be too slow to be relevant. If you want to take a decision quickly, there will be no way to involve the people, because decisions will need to be taken at such a rapid speed that not even representatives, I think, will be able to keep up. So I think we're going towards a very sort of AI-governed world, where people's values are kind of neither considered, nor are people exercising any kind of agency. And having decisions made at that central level be seen as legitimate by the people is, I think, rapidly going out of fashion. So for any kind of hope of a democratic future, we need some kind of system that is able to take decisions at speed, but still allow people to exercise their agency. And I think a system kind of like this would not only do that, but also allow for this kind of richer understanding of what people want. You could imagine, for instance, an AI trying to decide whether to redirect a river, where many people's homes will be affected in various ways, and people are able to talk to their own personal AI agent about what's important to them. Maybe it's important to them to be close to their friends, so if they were to move, they would need to move as a community. The AI can understand the value of that community, and maybe it can decide to move them all to another place together, while still, you know, keeping the river's new course, and everything can happen at a very rapid pace, but people won't mind, because they're able to exercise their agency, to say exactly what they want, and to have those wants be fulfilled. I think it will look something kind of like that.

29:41

Alessandro: So what you foresee is that the AI would be much faster and more efficient, an AI system connected to the actual government and institutions. So probably people would have to move to this kind of system, maybe organized...

30:10

Oliver: Yeah, maybe. One thing I didn't talk so much about is the values-based, meaning-based side of things, because that's the other thing I think would be true that I didn't get to. The reason we're so adamant about this point is that I think a lot of our political opposition to one another is actually manufactured by the fact that we're talking at the level of preferences instead of the underlying values. We saw this also in the results, where some Democrats and Republicans thought they had different values, but when they could clarify, for instance, when each value applied, some of that opposition went away, because we could see that your value is wise when it comes to dealing with people in the countryside, and mine is wise when it comes to dealing with people in a fast-paced job in the city, for instance. And so, all of a sudden, our differing values can mutually support each other, and a lot of opposition just sits at this preference and slogan level, which is sort of inherently divisive. So I think there would be a whole suite of win-win opportunities that present themselves when we can all of a sudden talk and reason at the level of what's actually important to us and what's actually meaningful to us, versus trying to convince people of different points. It's really hard to paint what that would look like at scale, but I can imagine that win-win opportunities abound, with AI systems finding win-win deals that no one even knew existed. It could be beautiful.

31:46

Alessandro: Yeah, absolutely, for sure. A big hope is that everything remains explainable, right, that the AI doesn't become a black box. To me this is very important: I'm very scared of the black box, of a future where, you know, the AI says something and we don't understand why.

32:11

Oliver: Yeah, and AI is already a black box; no one really understands how these things work.

32:15

Alessandro: Yeah, absolutely. I mean, sometimes humans can still have some control, and at least know what kind of data goes into the process used by the AI, while other times the AI just receives an input and produces an output, and we don't have any clue about what's happening inside.

32:45

Alessandro: And what you were talking about reminds me of coordination, which is one of the main problems; I mean, it's very hard for people to coordinate, and we have seen throughout history that most of the time, if not all the time, people use a hierarchical way to organize themselves and coordinate, to reach a specific goal. So I'm thinking that this kind of technology could help people to live in a more horizontal way, I mean, to take decisions in a more horizontal way. I don't know what your thoughts are about that.

33:38

Oliver: Yeah, for sure. I mean, there are many kinds of coordination tech, and some have been around for a while that also allow for this. The most obvious example would be the market: if you take the God's-eye view of the market, it's sort of a self-organizing coordination tech that allows many inputs and considerations to be processed without any kind of set hierarchy, and the internet is obviously also something like that. But then I do think there's a bunch of problems with both the internet and markets that relate to what we were talking about earlier, this notion of not understanding what people want at depth, where the pricing system sees us as producers and consumers, and a lot of internet companies see us as eyeballs, as people clicking on things. So I think we have all the tools to build really cool coordination tech, but so far we haven't done a very good job.

34:35

Alessandro: And about this future where technology is used for decision-making and other things: is there anything that you could potentially be scared of? You know, I said I was scared by the black box, because then we cannot know why. Is there anything that you're worried about?

35:08

Oliver: Yeah, I mean, the default path doesn't look too good. I don't think it's going to be sort of like that paperclip thing, where all of a sudden, you know, human extinction; nothing quite like that, I think. But the default path, in my eyes, looks something like: humans are made entirely obsolete from the perspective of the market. All jobs are taken by AIs, and all value is produced by AIs, which drives the value of human labor to zero and, as a consequence, drives the value of capital sky high. So there will be a few actors who control the whole system, and most people will be entirely dependent on them, in some cases, let's say, like slavery, or they're just kept alive by some, you know, subsistence stipend or UBI. And I would also imagine that, in this economy, the value of physical, material goods goes up quite a lot, as it relates to, I guess, capital, while everything digital drops in value. So it sort of looks a lot like Ready Player One, right, where you have people just looking at their VR headsets all day, living in some kind of slum, basically, and any kind of real, meaningful agency is sort of eroded. It's a very drab, meaningless existence.

36:39

Alessandro: Yeah, I can imagine. And sure, in the time we are living in there are also some, I would call them, cultural problems, like this digital divide. Some people are working on very specific cutting-edge solutions, while the rest of the people... I mean, I know people that tried ChatGPT for the first time, like, only last week.

37:08

Oliver: Yeah, there's definitely that also, you know: the people who are able to understand how to work with these systems, and the people who fall behind, and that cleft will just be massive, where it's almost like you will have a society where the vast majority are just passive consumers, and then there's a few people who understand how to work with these systems.

37:34

Alessandro: And is there any project, I don't know, on the internet, that you thought was very interesting, and that in some way is, let's say, aligned with your project?

37:50

Oliver: Worryingly few, to be honest. I think there's a kind of growing awareness of the same kind of issues that we see. For instance, there was this post called "Gradual Disempowerment" that came out a few weeks ago and got quite popular, and now there's another one called "The Intelligence Curse", and these are by people who have worked at Anthropic or other places. So there is a kind of growing awareness of the issue, which sort of maps to what we think is happening. In terms of the solution space, though, I think it's a little... yeah, there are just not so many other projects that we look at as being closely aligned with us. I mean, there's good work being done by lone researchers here and there, but there is no coordinated effort that really maps closely to what we want to do. There are things in the same ballpark: there's, for instance, the Collective Intelligence Project, which is also trying to reinvent kinds of democracy with AI, and there is RadicalxChange, which to some extent is trying to do something similar with markets. But other than that, I hardly know any.

39:02

Alessandro: You mentioned some papers that I didn't know. Is there any other book or scholar that inspired you?

39:11

Oliver: Yeah, I mean, there's a whole series of philosophers that we build on, and I think the main one, if people don't know about him, is a guy called Charles Taylor. He's a philosopher who, from the seventies onward, was early in critiquing the kind of rational, individualistic basis upon which a lot of modern society is built, but who offered a sort of concrete alternative. In his case, he talks about how certain choices are kind of an expression of taste, and certain choices say something about how we want to live. So our way of distinguishing preferences, as against sources of meaning or values, is very much inspired by his work.

40:02

Alessandro: Okay, I actually want to read and get to know his work. Is there, like, any other... who was the main philosopher behind the core idea?

40:18

Oliver: Yeah, I mean, if people want to go deep, we have a paper that explains the whole process, and there is a section on the background; there are several philosophers it's inspired by. Charles Taylor is the main one, and another one is Ruth Chang. But I think it's easier to just go through the paper and read that, and if you want to go deeper, then you can follow those leads.

40:37

Alessandro: Okay. And a question: for most people, for a lot of people who thought of a project related to, let's say, civic tech, or something social, it's a struggle to raise money. And, I mean, you had, let's say, an important collaboration with OpenAI, one of the biggest companies in the AI world. Do you have any advice for people that are building something that they think is really valid, and that could be helpful for all of humanity?

41:27

Oliver: Hmm. There are a bunch of funders, like, places that like to fund things like that; SFF is the one that comes to mind first and foremost, and there are a few others. But I think, maybe more importantly, a lot of projects in this space don't think so much about the theory of change, about how the change would actually happen in the world, and I think we collectively just need to upgrade our thinking there. We've thought a lot about this also, and I think the current sort of working answer is that for this kind of lasting, deep institutional change, you need to have some kind of coherence amongst experts that these are the right things to do. You can't just have kind of individual ideas floating around; the ideas need to be refined by things like working prototypes, prototypes that make it very clear how the thing functions, or some working demo or something like that. And only then can you really go to the public and sort of have them demand that things work like that. Often I see projects going too quickly to the public, like with climate change, where there was no clear, exact way to implement some kind of process or decision on what to do about it, and so that didn't quite work. And so I think a lot of projects should think a little bit about exactly what their theory of change is. It could also be something more local, like: there's a concrete problem we want to solve in our vicinity, in our community, and if so, that's great. But if there is no kind of clear idea beyond "oh, it would be cool if everyone used this, but I don't really know how to get there", then... yeah, I think that's why a lot of people might be reluctant to throw money at something.

43:24

Alessandro: Is it used by institutions? Like, have institutions tried your platform, or right now is it more for, let's say, technical people or researchers?

43:40

Oliver: Right now it's more technical people and researchers, but we would love to have more institutions try it. We're a bit spread thin as an organization, so we haven't ourselves been able to kind of lobby for it. There was a point where we considered running a campaign in San Francisco, using this platform on homelessness, to surface what people thought was important about homelessness, and at least test it a little bit with policymakers. But, yeah, we don't really have the capacity for it. If someone is interested in doing this with either an institution or a community or whatever, then we're very happy to support them, because we do want to see more use of this thing; we just totally lack the capacity in-house.

44:15

Alessandro: So the resistance from policymakers, was it due to them not understanding how the tool could be useful or applied?

44:25

Oliver: No, it's more that we didn't really have the bandwidth to have a bunch of dialogues and run that whole project, so we kind of cut it down and prioritized other things.

44:38

Alessandro: It was just a matter of resources, then. And about the web3 space, which is also quite active: have you had, I don't know, any contact with it?

44:54

Oliver: No, not really. I don't think there's any obvious overlap between what we're doing and the web3 world.

45:03

Alessandro: Yeah, I ask because web3 is of course more about blockchain, while you're working more on the AI side, but I'm thinking about this coordination aspect, and also that web3 is trying to find new ways for governance, so in some way this could align.

45:24

Oliver: It might, but I think there's a kind of... okay, I think there is a little bit of a tool in search of a problem in the web3 space. People don't really search for what the problem actually is, what actually is not working, and what has actually been tried, and people are a little too excited to use blockchain as this kind of hammer you smack things with. There's, you know, a whole field, social choice theory, where people have thought about how to take decisions together for a long time, and most of the web3 people who are building coordination tech aren't really even aware of it. I'm not saying that that field has found the best solutions, but I think there is a kind of lack of doing the background reading, maybe, to find what actually is not working, what the problems actually are.

46:21

Alessandro: I absolutely agree that there are a lot of things that can be improved, also in the web3 space. So you are suggesting more research, deep research, instead of just, let's say... like, trying to understand what problem you are solving, and why, and what has been tried before, and if it has been tried before, why it didn't work. Something like that. I'm thinking about web3, and, yeah, I'd like to study this more, so I know what I should refer to.

47:05

Alessandro: Yeah. And, I mean, I have one more question: do you have a message for the people that are building a new kind of solution, that are exploring new ways, new possibilities, like you're doing?

47:26

Oliver: Hm. Yeah, maybe one thing is: take it seriously. There's just a massive need for it, and I feel like a lot of people are sort of half-assing it a little bit, which is unfortunate, because it's a really important project, like the work you're doing. So, yeah, don't undersell yourself, and take yourself seriously; maybe that would be one way of framing it.

47:58

Alessandro: I agree about the importance of, yeah, trying to find new solutions and experiment. Unless you have any questions, or any other kind of thoughts you would like to share...

48:15

Oliver: No, nothing comes to mind at the moment.

48:18

Alessandro: Okay. Because I'd actually like to ask more questions, I will have to really dig into the project and explore the repository that you said is open source.

48:32

Oliver: Yeah. I think the best place to start is probably to read the paper. The paper is called "What are human values, and how do we align AI to them?", but even though that's the title, it has a lot of good stuff about coordination tech and civic tech, a little bit between the lines, and especially the set of background sections. I found the method sections... yeah, parts of those sections are interesting regardless, even if you're not into AI alignment.

48:59

Alessandro: Thank you a lot.

49:02

Oliver: My pleasure.

49:05

Alessandro: Alright, take care.