The Seventh Value – Episode Four: Equality

What does AI have to do with equality? In Episode Four of The Seventh Value, we explore how Artificial Intelligence is being used to undermine or promote the ideal of equality within Europe.

Often described as the core of democracy, equality is enshrined as a key value in the European Treaty and in the Declaration of Human Rights. However, many systems and institutions within Europe still reinforce inequality in various forms, built on conscious and unconscious preconceptions. Now, the rise of AI is weaponising many of these underlying prejudices, with severe implications for the future of equality in Europe.

Host Juli Simond and Carina Lopes, head of the technology and social science think tank Digital Future Society, discuss the urgent issue of empathy and equality in AI.

Episode Four Credits
Guest – Carina Lopes, Head, Digital Future Society
Host – Juli Simond, Are We Europe
Producer – Anneleen Ophoff, Are We Europe
Sound Designer – Wederik De Backer, Are We Europe
Coordinator – Federica Mantoan, Evens Foundation

The Seventh Value is a collaboration between the Evens Foundation and Are We Europe that explores the values that make up the fabric and face of the European Union: human dignity, freedom, democracy, equality, the rule of law and respect for human rights. Over 70 years ago these founding pillars were written into the treaty of the European Union, but how have they withstood the test of time? What other values might be important today? Find out more here.

Read the full transcript of Episode Four: Equality

Carina Lopes:
We all travel through airports. It's very common. It's based on our understanding of what being European is; it's about this kind of easy traveling. It's become part of our life. But for many people, it's a moment of anxiety.

Carina Lopes:
Joy is a transgender person. Traveling, for her, always means a lot of anxiety, a night without sleeping. Every time she travels, there is a moment where she has to go through the border, the security section. And there is a machine; she enters the machine. She raises both of her arms and her hands above her head, and then this full-body scan machine starts working. And she can see, on the other side, the security guy or lady looking at her. In front of them, they have these two buttons, and they have to press the pink button if they identify that person as female, or the blue button if they identify that person as male.

Carina Lopes:
Every time, she gets checked by the security guy or lady, and based on whether they pressed boy or girl, and on how she was dressed that day, whether she looked a bit more female or a bit more male, they'll be touching her in different parts of her body, because her body won't match the heat patterns of what appears on the screen.

Carina Lopes:
This is a system that we have essentially designed as a system of exclusion, a system that automatically assumes a non-equal society, because not all citizens will be treated in the same way. Imagine this is the beginning of your holiday: you go through the system at the start and again at the end, so you always get that cherry on top of the cake at both ends of your holiday.

Juli Simond:
Europe is built around six different values that make up the fabric and face of the European Union: human dignity, freedom, democracy, equality, the rule of law and respect for human rights. Over 70 years ago, these founding pillars were written into the treaty of the European Union. But how have they withstood the test of time? From deconstructing their original meaning to re-imagining the future.

Guest:
The six European values?

Guest:
Quite a big question.

Guest:
Does it have to be a proper word?

Juli Simond:
This is The Seventh Value.

Guest:
Equity.

Guest:
Free movement of goods and people.

Guest:
Democracy, I guess.

Guest:
Racial injustice.

Guest:
They're not doing a very good job of this?

Guest:
No, I don't know the other ones.

Guest:
Sorry. I think I only got three.

Juli Simond:
Today, we talk to Carina Lopes, head of Digital Future Society. Hello.

Carina Lopes:
How are you doing?

Juli Simond:
I'm good. How are you?

Carina Lopes:
Good. Good.

Juli Simond:
On our minds is one question. Are all humans equal in the eyes of AI?

Carina Lopes:
You'll see, my voice is quite nasal because I've had a cold.

Juli Simond:
In other words, what to do with the EU's fourth value, equality. How about we play a quick one word game to get to know each other? So I will ask you a question and you will reply in just one word. Where do you live?

Carina Lopes:
I live in Barcelona.

Juli Simond:
What values are important to you?

Carina Lopes:
Equality, empathy and inclusion.

Juli Simond:
What motivates you to work hard?

Carina Lopes:
Putting tech development at the service of society and our planet.

Juli Simond:
What did you want to be when you were small?

Carina Lopes:
Either a journalist or a lawyer.

Juli Simond:
And lastly, if you ruled your own country, what would be the first law that you would introduce?

Carina Lopes:
I'd bring gender equality to the core of my country.

Juli Simond:
Equality. It's enshrined in Article 2 of the Treaty on European Union and in the Declaration of Human Rights, and it's often called the core of democracy. But before we delve any deeper, how would you define equality?

Carina Lopes:
At the European level, we talk about leaving no one behind. That, for me, is about equality. But when we talk about equality, it's also about recognising that we have not always been equal, and recognising that, institutionally, there have been collectives that have been left behind. For example, we've been doing research on the gender perspective and the experience of women in the digital economy. So when you design, for example, a system that doesn't take into account the reality of women, but also, for example, of certain migrants or the LGBT community, you are leaving people behind. You're not really taking their needs into account.

Juli Simond:
Looking at your personal life, how do you apply equality in the day-to-day?

Carina Lopes:
I like to think about equality and empathy together. I think we need to be empathic and try to put ourselves in others' positions in order to understand what it means to be equal. A white male will never question whether they are a white male or what that means in society. You might have that question every day as, for example, a woman of color. What does that mean when you are looking for work? What does that mean when your culture or your traditions are being questioned by your surroundings? Or what does that mean when you're trying to balance work and caring duties? There might even be blind spots of my own, because I've taken so many things for granted. So unless we put ourselves in a position of empathy and try to understand the context of the other, I think it's really hard to achieve equality.

Ursula von der Leyen (pre-recorded, speaking in English):
Racism is around us, in our societies. Racism is in our streets, in our workplaces and at times, even in our institutions. Europe must be better than this.
And there's more to do. Too many women in Europe lack a very fundamental opportunity. This is simply not right.

Juli Simond:
You've just listened to a speech by Ursula von der Leyen, president of the European Commission, at the International Women's Day celebration.

Juli Simond:
We often think about equality in the physical realm. Are places safe and accessible to all? Are my rights the same as another's? But with technology, our lives are increasingly taking place online. How does the digital realm fare in terms of equality, in general?

Carina Lopes:
We shouldn't take for granted that physical spaces are spaces of equality. And maybe if we look at those spaces and understand how many challenges we have already overcome, and still have to overcome, to turn them into spaces of equality, that might also help us understand the challenges we face online. Because the challenges online are the same, but also much more complex.

Carina Lopes:
I'm talking about whether physical spaces are equally safe for everyone, equally accessible for everyone, easy to navigate for everyone. Those are just questions we might put on the table. Do women, LGBT people, children, people with disabilities feel equal in their use of that space? And has that space been designed with their needs in mind?

Carina Lopes:
So, going back to the online space: I think all of these challenges exist there, but with a few new layers added. The challenge of connection is not the same everywhere. I would say it's less of a problem in urban areas and still more of a problem in a lot of rural areas. But then there is access to devices: on what type of devices is access happening, and is our internet world prepared for that type of access? And then affordability, because we assume that the internet is affordable to everyone, but not all of us are at the same income level.

Juli Simond:
Since we are creating digital technologies and we are creating these digital realms, is it in some sense inevitable that whatever inequalities are perpetuated in the physical realm will also be transported to the digital realm?

Carina Lopes:
Not necessarily. When we're designing technology, we are building a reality, we are building in the perspective that we have of the world, and that's very important to have in mind. That's why we talk about needing diverse, inclusive teams to design technologies: unless you create this diversity and inclusivity from the starting point, it's really hard to design a service that doesn't repeat some of the challenges we already have in the physical world.

Carina Lopes:
But technology also has a magnifying aspect, and AI in particular, because once you have data and you use an algorithm to learn what that data is telling you, what you're going to do is amplify the bias that's already in that data, but you're also going to crystallise this bias. So the challenges become much bigger than what you have in the physical world. And it's really, really difficult to challenge the machine. Not because the machine cannot be challenged, but because we've been trained to believe that the machine is always right.

Juli Simond:
Absolutely. We've all seen facial recognition software being used to track criminals in countless TV dramas, and most people know algorithms from what they get to see on TikTok or YouTube, but they are much, much more commonplace. Many have seen their workplaces implement all kinds of software to improve efficiency. Could you give me a few examples of the role that artificial intelligence plays in our everyday lives?

Carina Lopes:
We know that if you ask the bank for a new credit card, a loan or a mortgage, the bank will assess you through AI and will assess your risk. I was also talking with someone from my European country recently about an automated decision-making system used by the police force in cases of domestic violence. When they have a case of domestic violence, they have to answer a series of questions on a screen, and the system then gives the level of risk that the person, in almost all cases a woman, is facing. That would be like a green, yellow or red risk, and they have to follow different procedures according to that risk.

Carina Lopes:
With similar software, in the early stages, for example, it didn't include whether the woman had children or not, but that changes the scenario dramatically, and the type of support system that you need to put in place is very different.

Juli Simond:
This raises the eternal question when it comes to our relationship with technology: are humans too complex for technology?

Carina Lopes:
Well, I think it's a really good question. And intersectionality is a really good concept to deal with this. You might suffer some discrimination as a woman. You might suffer even another layer of discrimination as a black woman, and you might suffer even a new layer of discrimination as a black woman working in a certain industry.

Juli Simond:
Should there be then more transparency between the people who make the algorithms and the people who feed them?

Carina Lopes:
It might be a bumpy road, but I think we'll get there as Europeans. The first step is to really define what we already have at the European level in the draft AI regulation, differentiating between high-risk and medium-risk systems. And this will be the most difficult part: how do you define whether a system really has an impact on your life or not? It will not be the same experience for everyone, and here again equality will be a big challenge. You only really understand the part a system plays in your life when something goes wrong.

Carina Lopes:
For example, I think the most common case is people who had a bill they were not able to pay. They go on a blacklist, and then it becomes a nightmare to understand where you are flagged, in which institutions and which systems, and how you get cleared from being blacklisted in those different systems. And that can follow you in your life for many years afterwards.

Juli Simond:
So what is truly driving AI advancements? Are we currently making AI technology for the sake of productivity, convenience and capitalism, to bring profits up while simultaneously lowering expenses, or is it for the everyday person to increase quality of life?

Carina Lopes:
Look, traditionally and historically, technology has been designed, and the great technological advancements have happened, within industrial and military settings. And I'm sure those are the settings where we'll see great advances in AI. But I think it's the responsibility of the public sector, of the regulator, and of civil society to put pressure and make sure that we design the framework within which we want this research, and I'll call it research, to become part of our daily lives. Because we won't stop this type of investment, research and innovation taking place within these contexts, but we can shape how they become products and how they become part of our fabric, our societal fabric.

Juli Simond:
So there's a question of creation and then a secondary phase of integration into people's lives, when it actively becomes a product that can be used?

Carina Lopes:
What we have had so far, in most contexts at the global level, is that we design technology, we give space for innovation and for new products to be designed, and then they arrive in society. We start using them, and then governments come in at a later stage to mitigate the negative aspects. For me, what is missing from this is the vision: what digital transformation do we want? We have a lot of very clever, smart people looking at the loopholes of the regulation you design, and then you end up spending the next few years trying to close those loopholes. We have moved forward in that sense.

Carina Lopes:
So when Europe comes and says we don't want the Chinese model or the American model, that we don't want social scoring, that it's completely forbidden; or that we don't want a model where the market is free and anything can happen, and then you come in afterwards and regulate the very basic levels of protection and data protection for citizens, that's already a starting point. It's important to understand that money is a way to influence the design of that market. And I think sometimes we haven't been critical enough about where we invest the money in the innovation phase; we could have been a little bit more interventionist in designing what type of funding we're giving, for what, and where we want this innovation to take us.

Juli Simond:
Allow me to be a bit cynical when talking about new tech, and let's dive into the crux of this episode. Algorithms are becoming increasingly ingrained in our lives, but are all humans created equal in the eyes of AI?

Carina Lopes:
I don't think so. We have human rights for a reason. I think it's like a utopia that we have to fight for every single day. We cannot just sit down and expect that it will happen because we've put it on paper. Our societies are full of ingrained discrimination, institutional discrimination. Even within the European space, we talk about citizenship, and that is already a model of exclusion. When we talk about borders, of belonging and not belonging within those borders, it's already a model of exclusion. We have never built a model that is about humans, per se. So I think it's very hard to believe that technology will ever be designed to take into account every single human. That should be our aim, but I don't think it's likely to happen, because that's not the society, the real world, that we have created.

Juli Simond:
While inequality is often rooted in history, technology is now being developed during a period in which many try to actively build a more equal society. So how come the same centuries-old biases and stereotypes are still present in new tech? Why hasn't it changed?

Carina Lopes:
First, I think, it's often that we have many blind spots. Even when we try to be very self-aware, we have to be very proactive to ensure that what we are designing is really that diverse, inclusive and equitable space. And that means we have to bring other people to the table. We also have to generate spaces where you don't feel comfortable, where you move away from your comfort zone, because those are the spaces where you start understanding where your blind spots are.

Juli Simond:
So can AI be a tool that reduces social divides rather than exacerbating them?

Carina Lopes:
Look, it's a technology, it's a tool. You can see where the problems are arising, and then you can start playing around. We can have systems that intentionally give priority to certain collectives, or that are intentionally designed to create other types of positive bias. It's up to us. I really think it's a space with huge potential for creativity, and we are already seeing it. But this is about putting that creativity at the service of society.

Carina Lopes:
On a positive note, I also want to say that when human effort comes together, as in the case of the pandemic, it shows how fast we can advance when we put all these tools at the service of society. At the global level, it also shows how much potential there is in technology when technology and humans come together to put it at the service of society.

Juli Simond:
Yeah. Collective action begets collective potential, which is very exciting. However, can AI be monopolised? And is it being monopolised at the moment?

Carina Lopes:
AI that is developed by a small startup in London, Paris or Barcelona is not the same as AI that is developed by a huge company like Amazon, not only because of the resources, but also because of the level of data they have access to and the context they have to experiment in. Facebook, for example, is a clear case of that: how much social data they have, how much ability they have to divide us into small groups, tweak our settings, and see what happens and how our interactions change. And it's really scary to see the power that these really, really large organisations have behind your back, without you knowing that it's happening or understanding the impact of what is happening. So the challenge is massive. And of course, the potential for it to be completely monopolised is there.

Juli Simond:
Let's think ahead for a second. Do you see a future in which AI is accessible equally to every person who may need it, or is there the potential that we will come to need AI to survive and it could become a future inequality?

Carina Lopes:
I don't really have the answer. AI or technology that thrives on equality requires a global effort. And this will also be seen with the climate emergency, the challenges we face to our survival as a species, really. What measures will be taken? What role will digital technologies play? And will these technologies be accessible to everyone in the same way, in the cases where they can help mitigate or reduce some of the carbon impacts?

Juli Simond:
What should the future look like? Is it a case of humanising technology, digitalising humans, somewhere in between? What do you think?

Carina Lopes:
There's always been this tension between humans and technology, going back to the Greeks and the philosophy of technology: the role of techne and of fire in relation to humans, and our need to tame nature, no? To put nature at our service. So that's a tension that's always been there. But I do think we have to aim for technology that is designed and put at the service of society, that is really designed to achieve this vision of equality, inclusion and sustainability. Otherwise, this design of technology makes very little sense.

Juli Simond:
Today, we've only discussed one of the six values of the European Union: human dignity, freedom, democracy, equality, the rule of law and respect for human rights. If you could add anything to the mix, what would be your seventh value?

Carina Lopes:
Wow, that's a really, really challenging question. It would definitely have to be something related to sustainability and climate change. I think that should be what guides us now and for the coming decades. I cannot think of anything more urgent.

Juli Simond:
Thank you for listening to The Seventh Value, a podcast made in collaboration between Are We Europe and the Evens Foundation. I am Juli Simond, your host, and with me in the studio were our producer Anneleen Ophoff and sound designer Wederik De Backer.