Alex Blaga from Trollwall about using AI to combat toxic content and misinformation on social media
Ep. 27

Episode description

Alex Blaga works at Trollwall AI, a platform designed to moderate toxic comments on social media. In this episode, Alex explains how harmful content is often used to spread narratives through repetition alone. The conversation also touches on the need for user identification on social media and the balance between freedom of speech and accountability.

If you enjoy our podcast, please consider making a donation. It really helps us to sustain the podcast.

Download transcript (.srt)
0:00

Welcome to another episode of the Democracy Innovator podcast. Our guest today is Alex Blaga.

0:07

And thank you for your time, Alex.

0:11

Thank you, Alessandro.

0:12

It's a pleasure.

0:13

And you're working on Trollwall AI, right?

0:19

And would you like to tell us something about this project?

0:25

Yes, we're getting into it right away.

0:28

I see.

0:28

Yeah.

0:30

So the project that I'm working on at the moment is called Trollwall AI.

0:34

As the name suggests, it's an artificial intelligence platform with an LLM behind it that we've built ourselves.

0:44

In short, and perhaps we can expand on this later, we focus on moderation and community management on social media,

0:54

with a focus on toxic, harmful comments.

1:01

We identify the most toxic, most harmful comments in the comment sections on social media, and we take care of them so that we can make the internet, and

1:13

social media in particular, a better place, one that is more welcoming for discussions, for debates.

1:24

And I can imagine there is a story behind this software.

1:31

I don't know whether you had the idea or one of your co-founders had it, and when it happened.

1:38

So one of our co-founders, who is currently our CEO, was working in the European Parliament.

1:46

He was working for the vice president of the European Parliament.

1:49

He was specifically doing community management on social media.

1:54

And back in 2022, which we all know coincides with Russia's invasion of Ukraine, together with the waves of migrants

2:07

that came to Western Europe, a wave of harmful comments and certain toxic narratives arrived at the same time in the comment sections of our social media platforms.

2:24

Together with the waves and the narratives, the flow and sheer volume of those comments became unbearable for a lot of political actors,

2:35

specifically for the vice president of the European Parliament back then, and for many other actors, but also for businesses.

2:43

So they were seeing comments like, we don't want these Ukrainians.

2:46

They should go back.

2:47

They should stop stealing our jobs.

2:49

Why are they not staying in their countries to fight?

2:54

What was really interesting was that these messages were composed and sent in a really violent manner,

3:04

in what is, in our opinion and the opinion of our clients, an unacceptable manner, a toxic, unacceptable manner.

3:14

So at the time, my now colleagues looked around and they couldn't find a solution that could handle that volume, that toxicity, effectively.

3:29

So they got together.

3:32

The team slowly expanded, and we've built a company around this idea of making social media a better place.

3:42

And it all started in 2022.

3:44

We're now in 2025, with over 70 clients across more than nine countries in Europe and Latin America, and we now support four heads of state.

4:02

Okay, so there are different kinds of clients and customers; they can be political parties, institutions, or also companies.

4:17

Sure.

4:17

So although we started in the political sphere, naturally, we slowly but surely migrated to the commercial side of things, because unfortunately, that toxicity that

4:31

oftentimes

4:34

starts in the political sphere slowly but surely migrates to businesses, to the commercial side of our society.

4:48

We see it with our clients, such as news media, media houses that suffer from the same toxic narratives, from the same toxic comments.

5:02

And the results are

5:04

different and the effects are different, but they're just as harmful for everybody, regardless of whether the organization we work with is political,

5:14

commercial, an NGO, or an institution.

5:19

An interesting thing is the large language model built inside the company.

5:26

You use the...

5:30

I mean, comments that were probably made by trolls or AI bots, and you train the AI on them, right?

5:39

Yeah, so we trained our own large language model specifically in a number of languages, mostly European languages.

5:49

And I'll tell you in a minute why we focused on European languages.

5:54

We still work with linguists in specific languages so that we can understand the context, the local context, of each country and each

6:06

region, each language in particular, really, really well.

6:09

And by understanding that local context, we can identify subtleties in the language and so filter sentences and comments much more effectively than the

6:24

social platforms can do by themselves.

6:27

When I say social platforms, I mean the big Metas, the Googles, the TikToks, and so on,

6:35

which natively, of course, do moderation by themselves.

6:43

But as we all know by now, they do it really, really poorly.

6:47

So all the toxic content and comments present on their platforms are there because they don't want to, or they cannot, do moderation properly.
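To make the moderation step concrete, here is a minimal sketch of the hide-or-keep decision using an open-source multilingual toxicity classifier. Trollwall's own LLM, language coverage, and thresholds are not described in the episode, so the Detoxify library and the 0.8 cutoff below are illustrative assumptions only:

```python
# Minimal comment-moderation sketch with an open-source multilingual
# toxicity model (Detoxify). Not Trollwall's pipeline; it only illustrates
# the hide-or-keep decision discussed above.
from detoxify import Detoxify

model = Detoxify("multilingual")  # XLM-R based; covers several European languages

def moderate(comments, threshold=0.8):
    """Split comments into (kept, hidden) lists based on a toxicity score in [0, 1]."""
    scores = model.predict(comments)["toxicity"]  # one score per comment
    kept, hidden = [], []
    for comment, score in zip(comments, scores):
        (hidden if score >= threshold else kept).append((round(score, 2), comment))
    return kept, hidden

kept, hidden = moderate([
    "I disagree with this policy, and here is why...",  # critical but civil
    "Go back to your country, you thieves!",            # toxic, should be hidden
])
print("kept:", kept)
print("hidden:", hidden)
```

A real deployment would sit behind the platforms' moderation APIs, hiding or holding comments on the client's page rather than printing to a console.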

7:02

I have some thoughts regarding that, but I would like to ask you something about your professional background, if you'd like.

7:12

Sure.

7:12

So I have a very mixed background, in the sense that I have a love for business and politics on the one hand, and communication on the other.

7:28

So my academic background is in politics and international affairs.

7:34

Soon after I finished my studies, I delved into the world of

7:42

business and communication through various roles.

7:47

And I'm happy to say that the project I'm working on right now, Trollwall, combines these three aspects of my life, which I'm really passionate about, really,

8:00

really well.

8:01

The communication part, the social media; the business part, which is an obvious one; but also the political aspect of it.

8:12

And would you tell us something more about your personal background? I don't know, like, where did you come from?

8:23

I'm originally from Romania, but I've now spent probably half of my career in the United Kingdom, where I also completed my academic studies and where I worked for a number of years.

8:39

And now I travel between my home country of Romania and London, England.

8:47

So if you're going to ask me where my home is,

8:51

Unfortunately, I don't have an answer at the moment.

8:55

I'm a citizen of Europe.

8:59

Although my English friends might disagree that the UK is still part of Europe.

9:09

Yeah, it's a complex situation.

9:16

I was thinking about...

9:20

Because comments can be very toxic, but at the same time there can also be a civil way to say that you disagree with something.

9:33

So how can both...

9:39

happen?

9:40

Because I understand that nowadays, with automation, technology, and AI, a country or a political party could actually attack another political party or another country,

9:55

making it look like the population thinks something, when maybe it is not true.

10:01

So how can we protect...

10:04

be it a political party, a politician, or a country, from these kinds of attacks, but at the same time also allow people to say, I disagree about this specific thing?

10:21

Well, the quick and easy answer and probably the obvious one would be moderation.

10:28

Political entities should take care of their moderation.

10:34

For the sake of this discussion, let's focus on social media.

10:39

So you mentioned attacks and these attacks and the...

10:47

influencing of minds by certain political actors happen in a few ways, as far as we've been able to identify.

11:01

So we already mentioned the sheer volume of comments.

11:06

So these attacks always, always come with huge volumes of comments that want to carry a certain narrative.

11:17

In other words, the more you repeat an idea, albeit a lie, the more likely it is that at some point somebody will believe that idea or that lie.

11:31

So that's the volume I was talking about.

11:34

On the other hand, another interesting aspect that happens is the violence that I already mentioned, the toxicity of the comments.

11:46

The same narratives also come in a violent manner so that normal people, normal citizens that would usually engage with political content get deterred from engaging in the

12:02

discussion.

12:04

So they get scared, they get pushed away from getting involved in the discussion and starting an actual debate

12:15

over a set of policies, over certain ideas. Any kind of discussion in a violent environment is less likely to happen.

12:25

So on the one hand, you push them away.

12:26

And on the other hand, you repeat it enough times until somebody believes you.

12:31

And there's so many examples out there in Europe.

12:35

If we look at the Brexit vote, it's an easy example.

12:42

If we look at the elections in 2024 and 2025 all across Europe, we see this pattern on and on and on with certain actors pushing certain narratives through the comment sections.

13:00

And not only that: of course, they rely on things like bot farms and troll farms to push certain narratives enough times.

13:09

But specifically, we focus on

13:11

the comment sections.

13:16

And I was wondering, do these kinds of attacks come from foreign actors, mainly from outside Europe, or also from inside Europe, party against party, I don't know, inside the European

13:30

Union, inside a specific country.

13:35

However, when it comes to the volume and the resources, they usually come from outside of Europe.

13:47

Opposing parties from within a specific country also look at what worked for others, and they slowly adapt the same tactics, but often with smaller resources.

14:00

So they don't have the resources that state actors often have.

14:07

So we see the same kind of attacks, the same kind of narratives,

14:16

opposing narratives between parties from a specific country, but at a lower scale, a smaller volume, compared to state actors.

14:28

But what works for states also works for any kind of actor.

14:35

Yeah, I'm thinking about the future, because nowadays it's so easy to create bots, and now with AI probably everyone will be able to just say, Alexa, create some bots.

14:57

And so, people talk about

15:01

registration with identity, to know the identity of a person.

15:08

I think also to avoid these kinds of trolls.

15:15

At the same time, that kind of profiling is something that, I don't know, not everyone would like: having a sort of identity connected to...

15:30

the internet.

15:32

Yeah.

15:32

So if so far I have been speaking, let's say in the name of the project and the company that I represent, I would also like to give you a personal opinion and a personal view on

15:48

this.

15:49

I do think that both troll and bot farms are an issue, and an issue that we could address, because it's so obvious by now that it's affecting our

16:02

democracy overall.

16:04

And I am, in fact, a big advocate of building social media platforms that rely on identification.

16:14

If banks can do it, if your cell phone provider can do it, and you have to verify yourself in order to open a subscription or a bank account,

16:29

why wouldn't you

16:30

be obliged to do it on social media?

16:33

So it's fine if you have an opinion, maybe even a controversial opinion, but there's an identity behind it.

16:42

And it's so much easier to control what's going on on social media.

16:47

And that's not to say that your opinions, your ideas shouldn't be valued or shouldn't be taken into consideration.

16:56

Quite the opposite.

16:59

So you know who's behind it.

17:01

And if you say something really, really toxic or if you promote certain violent ideas, you could also be held accountable, which is only normal in my opinion.

17:17

But it would also fix a lot of other smaller problems.

17:21

And I do agree that might just be the solution.

17:26

I wonder, because I see both pros and cons, and actually I don't have a clear idea about it, because I...

17:37

The cons?

17:41

As I said, I see the pros. About the cons...

17:45

It could be, but I'm not sure, of course, that people would then not feel free to say what they think, because...

18:00

But I think it's maybe related to...

18:09

If the citizen feels that he or she is safe in that country, in that...

18:17

I think that the person maybe could say what...

18:28

without applying self-censorship.

18:32

Because often I think we end up doing it.

18:38

I don't know if I was clear.

18:40

At the same time, isn't democracy all about protecting personal freedoms?

18:45

So if the democratic system is healthy enough and strong enough, you could say anything, of course, within certain boundaries and nothing would happen to you.

18:59

You would feel protected and you would be in fact protected by the system, by the state, and you would be defended.

19:09

On the other hand, when democracies become less democratic, if we can put it like that, that's when personal freedoms and uh personal rights become an issue.

19:25

Isn't it actually in less democratic and more dictatorial states that citizens are more at risk?

19:35

if they say something? Without giving names

19:39

of states, but we could of course name China, the USSR, and other more tyrannical states.

19:51

Because that is the alternative, right?

19:55

Yeah, exactly. I can suppose that maybe also in those places, I mean, they can receive attacks.

20:06

And so I wonder, in those places that are less democratic, if an identity requirement to access social networks

20:21

is applied, then people would not be able to express their ideas without having...

20:35

Like, in places that are not very democratic, if users express their ideas, then they can be punished.

20:46

And this is where the other really important pillar of a strong democracy comes in, which is the rule of law.

20:58

When you know that institutions will protect you, you won't be scared to voice your opinions.

21:05

But in less democratic systems, when you know that rule-of-law pillar, that umbrella over everything, is not there,

21:15

who's going to be there to protect you?

21:16

So they go hand in hand.

21:19

And I do agree that in states like China, or maybe Russia, something like this maybe wouldn't be wise to implement. In Europe, on the other hand, in some countries in

21:33

Europe,

21:35

I don't see why not.

21:39

But I really like to think about these kinds of topics, because I think that...

21:48

In some way, as we said, there are some pros, there are some cons, and we are the ones thinking about problems and solutions.

21:59

We are not the ones who will decide whether there will be identification on such networks, but I think that talking about it is very helpful.

22:07

Yeah, exactly.

22:09

We do have the power to vote.

22:11

We do have the liberty to debate and to have these kinds of discussions, unlike in many other places in the world.

22:19

But here you go.

22:21

That's another good idea for a third podcast.

22:24

We're going to make a series out of this and make proper discussions.

22:29

Yeah, that will be absolutely awesome.

22:31

And I was thinking, do you know if other countries, like we mentioned China, we mentioned Russia, or whatever, are having some other kinds of attacks, some, I

22:50

don't know...

22:51

between countries, non-European countries, or maybe by European countries.

22:57

Yeah, I don't know.

23:00

Or if there are also companies, maybe similar to Trollwall, in other places, non-European places.

23:14

Are you referring to competitors of ours that offer the same service and come from places like China or Russia?

23:23

Yeah, and I wonder also if Russia has the same problem with other strong countries, or whatever, that are also using these kinds of attacks.

23:43

It's hard to say in my opinion.

23:44

One, because we don't really follow what happens on Russian social media and the Russian internet.

23:55

Also, when it comes to China, if you take into account the great wall of the Chinese internet that blocks everything: it doesn't have any of the big social platforms that we do,

24:09

it doesn't have the Googles, the YouTubes,

24:12

while at the same time we accept TikTok.

24:19

It's hard to say. But having said this, it does look like they have more leverage over what's going on in the West than the leverage that we as Westerners have over what's going on in

24:36

China, Russia, and other similar countries.

24:40

But when it comes to attacks, I honestly wouldn't be able to tell you.

24:45

I'd like to believe that...

24:48

we reciprocate, that what we get is also what we give back in kind, but I wouldn't know for sure.

24:59

Yeah, it would be very interesting to know about all these, I don't know what to call them, bot attacks.

25:08

Because, I mean, people are not aware of them; they just read a comment and they think, okay, that person is quite mad about a certain thing.

25:21

But it could be that it's just AI.

25:26

It often is.

25:27

It often is.

25:28

With all these farms that propagate certain messages, it's more and more difficult to tell if behind a profile or behind a comment there is a real person or just an algorithm.

25:44

It often happens that you end up arguing in the comment sections over a stupid idea with an algorithm, with no person behind it.

25:56

And while at first it might look funny, and you might think, I just spent 15 minutes of my day arguing with nobody, with a computer,

26:10

I think when it comes to decision makers and institutions and political parties and brands, it should make them think that if...

26:23

we can already do these things with artificial intelligence, we should really take artificial intelligence much more seriously.

26:34

And we should be trying harder to fight fire with fire.

26:40

And as a matter of fact, this is what we do.

26:44

So if actors fight us using all sorts of AI tools,

26:52

We have no choice but to build our own AI tools and fight them back.

26:57

And that's what we do, at least in the case of moderation.

27:03

And yeah, I was thinking that now, with a comment, it's quite hard to tell whether there is a person or an algorithm behind it.

27:15

Nowadays also with a video.

27:19

So this is quite problematic.

27:23

And also I'm thinking that...

27:26

arguing with a person on the internet, just chatting.

27:31

I've seen that it doesn't matter what the language is; most of the time it's toxic.

27:37

From my experience, it is not very easy to communicate with a person about complex topics just by chatting, because we cannot really empathize with the other person;

27:52

we don't see the other person.

27:54

So I'm thinking also about places where this can happen; it could be in real life.

28:05

And because of the passions you mentioned, one being technology and the other politics, I wonder if you have thought about places where debates, or a cross-pollination

28:18

between different thoughts and ideas, can happen.

28:27

Yes, so I was mentioning the narratives that are spread.

28:33

And of course, they can happen.

28:36

And you raised two ideas there.

28:39

the...

28:43

conversation that you have with, perhaps oftentimes, an AI system.

28:51

Were you referring strictly to the comment sections in what you were describing?

28:58

I was thinking of someone posting on a Facebook page, where there is the comment section, but it could also be in other places on a social network.

29:09

But yeah, we can think about the specific example of the comment section.

29:16

Yes.

29:18

Of course, minds can be changed.

29:21

Ideas can really be pushed forward.

29:25

Narratives can be spread really easily.

29:30

Now, even in a conversational manner, as you said, you can chat with a bot, and it can carry the same kind of idea and can sometimes almost convince you of a certain idea.

29:45

But as you said, that conversation doesn't feel okay.

29:50

It doesn't feel natural.

29:51

It feels kind of toxic.

29:54

And we do understand that, because often what happens is that a certain AI system, an AI bot, uses the simple algorithms of OpenAI, of ChatGPT.

30:09

There's a ChatGPT interface to it,

30:12

which doesn't really understand local context.

30:15

It doesn't really understand the local history, the local developments.

30:20

It just has simple ideas that it knows it needs to push forward.

30:27

But as I said, the key to that is just the repetition of the same ideas on and on and on and on.

30:34

And we really resonate with that local context. This is why, at the beginning of the conversation, I told you that,

30:41

when it comes to the languages that we've developed, we focused mostly on the European ones.

30:49

On the one hand, this is where we work; on the other hand, we understand the fact that the big players, the Metas of the world, the Googles of the world, usually focus on big

31:03

languages.

31:04

So English.

31:06

German, in some cases maybe even Italian. But when it comes to smaller languages, like Ukrainian, Romanian, Bulgarian, Greek, they don't really spend many

31:20

resources.

31:20

They don't really spend much time on developing those languages.

31:23

But as a result, the moderation they do in-house on those specific languages is really, really poor.

31:32

So they work with very few linguists, they work with very few local experts and very few moderators at the end of the day.

31:40

And this is what we do differently.

31:41

We process enormous sets of data, enormous amounts of data, with local context, so that we understand what people talk about really, really well.

31:59

And that's, I guess, the key to, on the one hand, keeping that space clean of bots, of trolls, of toxicity in general, but also to promoting healthy discussions, healthy debates.

32:18

And maybe to give an idea about the amount of data:

32:26

Do you have an example to share?

32:32

But when it comes to data, what I can tell you is that it fluctuates a lot.

32:38

It's like a sea where waves come and go.

32:43

We've seen the biggest peaks in the data we processed, naturally, around electoral events.

32:57

So in 2024, we probably saw the highest peaks.

33:01

of data, of comments, being dropped on social platforms.

33:09

One interesting case we worked on was the Romanian elections of 2024, with some of our clients having over 500,000 comments on each social account per month, which is an

33:27

enormous amount for a small country, a small language.

33:31

They usually have about three or four social accounts per political party, which would equate to over 2 million comments per month for just one political actor.

33:48

Just so you imagine the kind of scale that was going on.
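A quick back-of-envelope check of that scale (the 500,000-comment and three-to-four-account figures are from the episode; the ten-second human review time is an assumed illustration, not a quoted number):

```python
# Back-of-envelope arithmetic for the Romanian 2024 case described above.
comments_per_account_per_month = 500_000
accounts_per_party = 4                       # "three or four social accounts"

monthly = comments_per_account_per_month * accounts_per_party    # 2,000,000
per_minute = monthly / (30 * 24 * 60)                            # ~46 comments/min

review_seconds_per_comment = 10              # assumed human read-and-decide time
moderator_hours = monthly * review_seconds_per_comment / 3600    # ~5,556 hours
full_timers = moderator_hours / 160          # ~35 people at 160 h/month

print(f"{monthly:,} comments/month = {per_minute:.0f} per minute "
      f"= roughly {full_timers:.0f} full-time moderators")
```

Which is why, as comes up next, manual moderation simply breaks down at this volume.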

33:56

I was thinking about the infrastructure and the AI model that has to analyze that quantity of messages.

34:06

But yeah, with technology it's quite easy to do that; manually, no.

34:14

And I was thinking, how do you imagine the future?

34:19

Okay, let's think about...

34:24

I mean, democracy, like in 10 years, 20 years.

34:29

Have you ever made this kind of prediction?

34:35

I tried a few times, and then a couple of years would pass and my predictions would ruin themselves, especially with the advent of AI and technology.

34:50

I mean, just looking back at how the world looked five, 10, 15 years ago.

34:57

I mean, at the beginning of my career, and how I got into

35:03

politics, international affairs and communication.

35:08

It was back in 2014, when, again, Russia invaded, allegedly invaded, Crimea, and they took over Crimea with their little green men.

35:25

Now, just a little over 10 years later, there's a

35:30

fully fledged war going on in Ukraine.

35:34

And we in Europe back then used to see the United States as the beacon of democracy.

35:43

We would look up to them and think, this is an example, this is what we should aspire to.

35:50

Now, less than 15 years later, we look at what's happening in the United States, what

35:58

Trump is saying, and how their democratic system is becoming more and more fragile.

36:05

It's almost like they can say anything and nobody can dispute them.

36:12

Nobody can really debate them.

36:14

And at the same time, in Europe, we seem to hold on to democracy with our teeth, and we really want to protect it and go forward with it.

36:26

I think.

36:28

Looking forward, inevitably, the biggest players and the most decisive factors for our democracy will be the aspect of war on the one hand, and on the other hand,

36:46

technology.

36:49

How we go about it, what we do about the war that's going on on our doorstep, and how effective we're going to be at fighting it.

36:59

On the other hand, technology.

37:02

Will we be able to protect our democracy?

37:06

Will we be able to use technology to develop our democracy with things, as you said, perhaps developing social networks that

37:16

require you to authenticate yourself, where you can only create one account per social ID?

37:25

Or will we completely crash it by making it worse and worse with the likes of TikTok?

37:33

That is clearly, in my opinion, an enemy of democracy.

37:39

I'm returning to TikTok here.

37:44

So yeah, I would say it depends on these two aspects, and it's ultimately up to us to make the world a better place.

37:54

I can confidently say that what we're doing in our team is exactly this.

38:03

We're trying to make the world a better place by making social media a better place.

38:10

And in turn, as a result,

38:12

opening up democracy and building democracy.

38:16

I'm curious about TikTok.

38:18

Do you think it is a threat to democracy more because it comes from a foreign country, or because it's very addictive for people?

38:29

Because I also tried it and it is a...

38:32

you cannot really stop looking at it.

38:34

Yes.

38:38

So when it comes to TikTok, I see a lot of decision makers, a lot of people on television and podcasts and so on, really hiding behind their words when describing TikTok.

38:53

I'm going to be honest and more direct because that's how I like it.

39:02

I think it's definitely a really nasty drug.

39:06

and it should be avoided.

39:10

So on the topic of addictiveness.

39:13

On the other hand, of course, it's a tool that comes from the outside world, if we can call it that.

39:22

It's made by a state which is not necessarily our friend, maybe not anymore.

39:31

And it's a dangerous tool.

39:35

Let's just look at the way it's used and treated at home, on home ground.

39:41

So Chinese kids have access to a very similar platform, but with educational content.

39:48

Now, have you ever seen the feed on the Chinese TikTok for Chinese kids?

39:53

How it looks compared to what we get in Europe and the West?

39:56

I saw something about it, but I haven't really investigated a lot.

40:01

Yes, so it has very clear limits.

40:04

So they can only use it for a number of hours a day.

40:08

It's not unlimited like we use it here.

40:11

Here, it's 24 hours a day.

40:13

And the content is strictly controlled, and it focuses on educational content.

40:21

So crafts, new languages, engineering.

40:27

And then we look at what we get in Europe and the United States, the nastiest of the content that rots our brains.

40:39

Kidding.

40:42

Joking.

40:42

But, yeah, yeah, I understand that sometimes

40:51

less educational content is what people search for, especially if they are young.

41:01

It is not only my opinion that it's harmful for citizens in general.

41:08

But if we look at, again, I'll go back to the Romanian election case of 2024.

41:16

We had this one candidate, an independent candidate, that received...

41:26

Now, just in the last few weeks, it has been proven that he received support from the Russian state, mostly through TikTok.

41:38

So the Chinese control TikTok, and through meddling with the algorithm and through pumping in millions of dollars,

41:47

they skyrocketed his accounts to number nine globally in 2024.

41:55

So a nobody candidate was skyrocketed to number nine in the algorithms, in the tags used on TikTok, and ended up winning the first round of the presidential elections.

42:11

So going back to democracy and the threat that technology and certain states can pose to it, in a matter of months,

42:22

they can change, they can almost, well, they didn't really succeed, but

42:26

they can almost change the whole structure of a democratic state through one single app, through some algorithms, through some AI narratives, and just a few million euros.

42:42

So it's clear by now that it's not just a simple tool, it's a weapon.

42:50

It's a weapon for hybrid warfare.

42:52

And we shouldn't be hiding behind our politeness, our European politeness.

42:59

We should call it what it is.

43:02

Yeah, yeah, absolutely, like, this system...

43:04

But I'm thinking also about the other big tech, like X/Twitter in some way.

43:11

We don't know the algorithm that is behind it.

43:16

We don't know; the same also for Facebook, for Meta.

43:21

I would say that I actually would like more transparency related to the algorithms.

43:30

Because I think, yeah, absolutely, information is power, and with information you can change the power that is...

43:41

Yeah, I mean, political, institutional power can be changed.

43:45

And if it was just web content, if it wasn't really as important as we say it is, the United States wouldn't fight so hard to get control over TikTok in the United States,

44:03

to take control away from ByteDance and control it themselves. And...

44:10

The heads of state, Trump and Xi Jinping.

44:12

I think it was last week.

44:14

They were supposed to meet, or it was recently in any case, to discuss specifically TikTok, and them giving up control to the United States over, not the app itself, but the algorithm,

44:33

because that's really what's at stake.

44:40

It's incredible how important they can be and how they can change people's lives, because an algorithm, at the end of the day, is like, I don't know, a page with some lines with

44:55

something written inside, but then it keeps people...

45:03

how do you say, in front of the screen for days, for many hours.

45:09

Exactly.

45:12

Exactly.

45:14

And I mean, we saw that technology is now very important; it's in every aspect of our life, and also related to politics.

45:27

And I wonder if you

45:31

thought about any other, let's say, good use of AI for politics, for democracy. I don't know, a sort of assistant?

45:50

Yes. Fortunately and unfortunately at the same time, our roadmap is so long, so big, with so many ideas, that the biggest challenge is picking the one that's most

46:08

relevant and the most important of all.

46:15

Just to name a few, we've recently...

46:19

Actually, a few months ago, we released some new features.

46:23

I would mention the drafting of answers for our clients.

46:30

In very short terms, our clients can build their own databases.

46:37

They can create their own assistants where they can upload certain files.

46:43

And based on those files and that database, we can draft answers for them.

46:49

for the comments they receive on social media.

46:52

Now the key word here is drafting.

46:55

We never reply on their behalf on social media.

46:59

We wouldn't allow that to happen.

47:03

And although it might seem like a basic, easy idea, drafting answers really allows, in our case, political actors to engage more with their audience.

47:18

This is an issue a lot of them face right now.

47:23

So they simply don't have the resources to pay people to sit down in front of computers, read through the comments and reply to comments and engage with the audience, which I

47:37

think is so important and so critical, again, for democracy: when a political actor, when an MP,

47:45

posts about something, a new piece of law that he or she supports.

47:54

And you want to ask them a question.

47:57

Oftentimes people, citizens will do it on social media in the comment sections.

48:02

And again, oftentimes they will never receive an answer because they simply don't have the resources to reply.

48:09

And this is what we try to do.

48:12

We draft the answers for them.

48:14

which they can tweak and really engage with the audience.
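A minimal sketch of this "draft, never auto-post" idea: retrieve the most relevant passage from a client-supplied knowledge base and produce a draft for a human to edit. Trollwall's real pipeline is not public, so the TF-IDF retrieval, the sample documents, and the draft template below are illustrative assumptions:

```python
# Retrieval-grounded draft replies: a human always reviews before posting.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in for the files a client uploads to its assistant (hypothetical content).
knowledge_base = [
    "The proposed bill extends childcare subsidies to part-time workers.",
    "Our party voted against raising the retirement age in 2023.",
]

def draft_reply(comment: str) -> str:
    """Return a DRAFT answer grounded in the knowledge base."""
    vec = TfidfVectorizer().fit(knowledge_base + [comment])
    sims = cosine_similarity(vec.transform([comment]),
                             vec.transform(knowledge_base))[0]
    source = knowledge_base[sims.argmax()]  # most relevant uploaded passage
    return f"DRAFT (review before posting): Thanks for asking. {source}"

print(draft_reply("Does the new law help parents who work part-time?"))
```

In production the template line would presumably be replaced by an LLM call conditioned on the retrieved passages, with the draft surfaced in the client's dashboard for tweaking, matching the workflow described above.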

48:20

Of course, we also have other, more community-management-related features, such as sentiment analysis, which, again, is really important for political actors, so that they

48:36

can better understand the pulse of their voter base, of their audience and what people

48:44

feel, because it's really, really difficult to read through and to gauge the sentiment of hundreds, thousands, hundreds of thousands of comments, and the opinions of so many

48:58

people.
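And a minimal sketch of gauging that pulse: classify each comment and aggregate the label shares. An off-the-shelf English sentiment pipeline stands in here for Trollwall's own multilingual models, which are not public:

```python
# Aggregate per-comment sentiment into an overall "pulse" of the audience.
from collections import Counter
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # default English DistilBERT model

def pulse(comments):
    """Return the share of each sentiment label across all comments."""
    labels = [result["label"] for result in sentiment(comments)]
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: round(count / total, 2) for label, count in counts.items()}

print(pulse([
    "Finally a politician who answers questions!",
    "This policy will ruin small businesses.",
    "Great initiative, long overdue.",
]))  # e.g. {'POSITIVE': 0.67, 'NEGATIVE': 0.33}
```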

49:01

I was thinking about this approach of using social networks also, in this case, for educating people, or not really educating, informing people about a certain thing.

49:18

So if a citizen asks a question, instead of having the president reply, a chatbot replies, but using all the knowledge.

49:32

Because several times, in the civic tech field, there is this thing about building the platform from scratch.

49:45

And while on social networks there is already a user base, most of the time they end up not being used, the software related to civic tech.

49:55

While...

49:58

People are using Facebook, Twitter and so on.

50:02

So tools that run on those platforms probably have a higher probability of actually being used.

50:12

Yes.

50:15

At the same time, I would say I fully agree.

50:19

A lot of political tech ends up being unused.

50:25

But I do appreciate, and I do encourage politicians specifically, to try new tools, to be open-minded and to adopt new tools, going back to the idea that I mentioned earlier,

50:41

fighting fire with fire.

50:46

I don't see politicians being successful in the near future, or even winning election cycles, without using the latest technologies, without using artificial intelligence at a large

51:02

scale, whether it's replying to the audience or segmenting the population and the voter base, they have to adopt and they have to adapt really, really fast.

51:16

Of the many tools that they try, it's bound to happen that some of them will not be very useful and they won't end up adopting them.

51:28

But I do think it's important that they stay open-minded.

51:34

And do you think that nowadays politicians are aware of these kinds of tools?

51:44

Because a lot of people who know about politics don't know about technology.

51:58

That is very true, and somewhat painful for us, if I can put it like that.

52:06

I do agree that politics is a weird industry.

52:16

It often feels like, in Europe...

52:22

you have the political families, you have the conservatives, you have the leftists, and even at a European level, they don't really talk to each other.

52:35

So they will vote together in the European Parliament, the leftists from Italy together with the leftists from France and Germany.

52:45

But when it comes to sharing ideas and sharing tools, that

52:52

doesn't really happen.

52:54

The way it happens in the United States, for instance.

52:58

So you would have the Democrats or the Republicans from one state to another really sharing, exchanging ideas.

53:07

In Europe, this doesn't really happen, and it's quite unfortunate.

53:13

So in our case, we go country by country,

53:20

political party by political party, trying to open their eyes one by one.

53:28

But what I would also say is that I'm happy to see that events such as the Political Tech Summit in Berlin, a much-needed event, bring together

53:44

professionals from the political sphere and let them exchange ideas.

53:49

And I know the organizer of the event was present on your podcast just a few episodes ago.

54:01

Yeah, it was very interesting also talking to Joseph Lensch. And I have just a couple of questions more. So, what is democracy for you?

54:16

From a political science point of view, or just your opinion.

54:40

Very good question.

54:41

I should have known you were going to ask me this.

54:47

It's in the title of your podcast.

54:52

I don't always ask it, but I think it's interesting, because a lot of times we think about a concept,

55:02

and, I don't know, capitalism maybe is different for me and for you, or democracy; it's part of human nature, and I always like to see also the point of view of other

55:15

people.

55:16

I think, without realizing it, I described it earlier on.

55:21

I mentioned liberties.

55:26

I think, first and foremost, democracy is based on personal and civil freedom and liberty.

55:35

But at the same time, on the rule of law; this is just to make sure we stay within certain boundaries.

55:44

And if something happens, if our civil liberties are threatened, somebody or something will protect those liberties.

55:54

And I know this is a really simplistic way to describe it, but this is the way I see it.

56:05

And yeah, I believe that's the core of it.

56:09

Liberties and protection and strong institutions.

56:15

Thank you.

56:17

And do you have any message for the people that are working in the space of political tech?

56:27

People that are, I don't know, finding new ways of governance, maybe using technology, or tools similar to Trollwall?

56:42

Yeah, again, I think I mentioned it just a few minutes ago.

56:47

I would urge them to stay open-minded and to at least try new methods and new tools if they are to stay in the field.

56:59

That's one idea.

57:01

And the second one is to communicate among themselves, to communicate with each other, even if they're from opposing

57:11

sides of the political spectrum.

57:13

There's so much they can learn from each other.

57:17

There's so much they can adopt from each other.

57:21

And at the end of the day, all of us would gain from it.

57:30

So thank you, Alex, thank you a lot.

57:33

I actually have a question for you, if I may, if you're not running out of time.

57:40

You asked me a really interesting question.

57:41

What was democracy for me?

57:44

What does it mean for you?

57:47

Good question, actually.

57:49

I was not prepared.

57:54

No, I...

58:00

As I said, I think there is in some way a lot of confusion related to the term because...

58:10

I mean, demos and kratos, so it's like power to the people.

58:16

But I think that, I mean, nowadays the democracy that we see is quite different from democracy as it was conceived.

58:28

Because nowadays we have the representative system that is...

58:38

It was defined as...

58:42

There is the famous phrase by...

58:45

It was from Churchill, if I'm not wrong.

58:49

That is like...

58:52

I don't remember the exact phrase; it's that democracy is very bad, but all the other systems are even worse.

59:01

And so...

59:03

But I think, as you said, that even if someone was always, I don't know, voting left, for leftist parties, because of his family and so on, and

59:18

another person is, I don't know, voting for another party,

59:24

I think that nowadays, with technology, some paradigms can also change.

59:33

This is why it is so important to talk about what to do with democracy.

59:41

Also because there is AI that is changing everything, and there are new ways of making things worse, maybe with bot farms and so on.

59:55

And so I think that we should really...

1:00:00

sit down at the table.

1:00:01

It doesn't matter if I'm left-wing or right-wing.

1:00:06

We should all sit at the table and decide what to do.

1:00:12

As we said, maybe before the interview, democracy can, in the future, be like a...

1:00:25

like, I don't know, the best place, where everyone is happy and so on, or it could be like the worst place ever.

1:00:31

So, we are humans, and humans sometimes in the past were not really, how do you say, good.

1:00:45

Now we have technology and also using technology we really...

1:00:55

We didn't really make good use of technology, I have to say.

1:00:59

Historically speaking, yes, you're probably right.

1:01:04

Yeah, but are you more pessimistic or more optimistic?

1:01:08

Do you think it's going to turn out really, really well or really, really bad?

1:01:15

I am...

1:01:16

I don't know, but I would say optimistic in the long term and pessimistic in the short one.

1:01:26

Yes, let's leave it like that.

1:01:27

Let's leave it on the optimistic side then.

1:01:30

Okay.

1:01:31

Okay.

1:01:32

So thank you a lot.

1:01:34

It was a...

1:01:35

It's been a real pleasure.

1:01:37

Thank you.