Transcript
00:00:03 Veena McCoole
Hello and welcome to the Human Interface, brought to you by the Oxford Internet Institute at the University of Oxford.
00:00:10 Veena McCoole
This is a podcast about how developments in technology impact our work and life, as told through the research and insights of a brilliant roster of experts and industry practitioners from the university and beyond.
00:00:22 Veena McCoole
I'm your host, Veena McCoole.
00:00:24 Veena McCoole
In today's episode, we're going to pull back the curtain on the often invisible human workforce powering AI, from content moderation to data labeling, which we admittedly don't often think about.
00:00:35 Veena McCoole
I'm joined by two experts from the OII's Fair Work team, Ayca Ergin and Ashly Jiju.
00:00:41 Veena McCoole
Ayca is a researcher and product marketing professional working at the intersection of AI ethics, digital labor, and responsible innovation.
00:00:50 Veena McCoole
Ashly is a researcher and policy professional specializing in geopolitics and technology policy, with a focus on artificial intelligence and data protection.
00:00:59 Veena McCoole
Both Ayca and Ashly support the Fair Work Project at the University of Oxford, developing AI supply chain audits and certification pathways to promote fair labor practices in the digital economy.
00:01:11 Veena McCoole
Ayca and Ashly, welcome to the show.
00:01:13 Ashly Jiju
Thank you, Veena.
00:01:14 Ashly Jiju
Thank you for having us here.
00:01:15 Ayca Ergin
Thank you for having us.
00:01:16 Ayca Ergin
We're so excited to join the podcast's second season.
00:01:19 Veena McCoole
Thanks for coming.
00:01:20 Veena McCoole
So paint us a picture to begin with of what the invisible human workforce behind AI even is.
00:01:26 Ashly Jiju
I'd like to take the example of the Mechanical Turk.
00:01:29 Ashly Jiju
And if you're not aware of what that is, it was this 18th-century machine
00:01:34 Ashly Jiju
that claimed it could play chess on its own.
00:01:37 Ashly Jiju
So the creator would take this machine all over the world and show it, exhibit it to people being like, oh, look at this machine that can just play chess on its own.
00:01:45 Ashly Jiju
But in reality, there was a human being hiding beneath the surface and playing the chess.
00:01:51 Ashly Jiju
But people didn't know that.
00:01:52 Ashly Jiju
So the reason we bring up this example is to explain, you know, why we're discussing what we're discussing today.
00:02:02 Ashly Jiju
It's because AI has a lot of invisible human labor hiding beneath the surface.
00:02:07 Ashly Jiju
We think about AI as this, you know, magical, beautiful thing that floats in the air, that's just all nice and shiny on the surface.
00:02:15 Ashly Jiju
But the thing is that it has a material reality.
00:02:18 Ashly Jiju
There are cables under the sea, there are data centers all over the world.
00:02:21 Ashly Jiju
And just like that, there are human beings that work day and night to ensure that what you get on your screen just a click away is created, is checked, is moderated.
00:02:32 Ashly Jiju
That takes a lot of work.
00:02:33 Ashly Jiju
And often these people are ignored.
00:02:36 Ashly Jiju
Companies don't think about them.
00:02:37 Ashly Jiju
Governments don't factor them in their regulatory discussions.
00:02:41 Ashly Jiju
And they're often forgotten.
00:02:43 Ashly Jiju
But they form the core of whatever we have with AI and tech.
00:02:49 Ayca Ergin
Yeah, to add on top of that, I think
00:02:52 Ayca Ergin
The Mechanical Turk is a great example to give, because that's actually the brand name used by one of the big tech giants of today that provides 24/7 labor services to customers and clients around the world.
00:03:06 Ayca Ergin
So, to speak a little bit more to how that looks when we look at the world today: we see this invisible labor workforce coming into the limelight through the AI supply chains.
00:03:21 Ayca Ergin
So we see a lot of
00:03:22 Ayca Ergin
large AI tech companies coming out of the Global North.
00:03:26 Ayca Ergin
We hear a lot of them coming from the US and Europe and the UK.
00:03:31 Ayca Ergin
But these AI models are, yes, definitely created by these companies; the thing is, a lot of the activity at the baseline of the creation of these models is actually outsourced.
00:03:41 Ayca Ergin
So we see this extended AI supply chain that starts in the Global North and extends into Africa, into Asia,
00:03:49 Ayca Ergin
outsourcing data annotation and content moderation services through BPOs and outsourcing initiatives.
00:03:58 Ayca Ergin
These individuals, situated mostly in, you know, the Global South, end up having to do this baseline work that ends up being invisible not only to the users, but also to the policymakers.
00:04:12 Ayca Ergin
They're not protected, they're not supported, and they're also, you know,
00:04:18 Ayca Ergin
not able to meet the demands of today's world to deliver these AI models at the levels we expect, while at the same time meeting their own personal expectations in life: to earn a living wage and have a certain standard of living.
00:04:38 Veena McCoole
Okay, so there's
00:04:40 Veena McCoole
all these people behind the glossy screens that we see, that we prompt every day to help us in our work and our life.
00:04:47 Veena McCoole
I mean, walk me through what it looks like to be one of these laborers involved in the AI supply chain.
00:04:54 Veena McCoole
You know, what does data annotation look like in practice?
00:04:58 Ashly Jiju
Let's say I am a data annotator in one of these BPOs, which is a business process outsourcing company.
00:05:04 Ashly Jiju
So let's say I'm a worker in Kenya.
00:05:06 Ashly Jiju
I show up right on the dot.
00:05:08 Ashly Jiju
I have a fixed target.
00:05:10 Ashly Jiju
Let's say I have a nine-hour working day; most of these companies have workers working for 8 or 8 1/2 hours.
00:05:17 Ashly Jiju
So let's say, let's be nice, let's say they have just 8 hours.
00:05:20 Ashly Jiju
Within these eight hours, I'm monitored constantly that I have to hit a certain target each day.
00:05:26 Ashly Jiju
The target is set by my team leader, and the team leader sets the target based on the company's requirements.
00:05:33 Ashly Jiju
So AI companies give certain projects
00:05:36 Ashly Jiju
to these BPOs.
00:05:38 Ashly Jiju
And the BPOs then get employees for those project periods.
00:05:43 Ashly Jiju
So a lot of them are not even permanent employees.
00:05:45 Ashly Jiju
They are project-based employees.
00:05:47 Ashly Jiju
And within those four months, they have a set target.
00:05:50 Ashly Jiju
Each day, every day, they come to the office, sometimes six days a week, to work on these targets.
00:05:56 Ashly Jiju
Now, a lot of these companies are very strict about meeting the targets every day.
00:06:01 Ashly Jiju
If you don't meet the targets, you can get penalized in different ways.
00:06:05 Ashly Jiju
Some of the work we do is to ensure
00:06:06 Ashly Jiju
that they're treated as humans after all; just because they work in tech doesn't mean they're robots.
00:06:11 Ashly Jiju
So back to my day in the life.
00:06:14 Ashly Jiju
So you come to the office, you start working, you have a set target, you sit next to your fellow colleagues and you're all looking at these images.
00:06:22 Ashly Jiju
Let's say if you're a data labeler, what you're trying to do is there's an image and you're trying to teach the AI what the image is.
00:06:28 Ashly Jiju
For example, one example is images of
00:06:32 Ashly Jiju
cats and tigers and lions.
00:06:34 Ashly Jiju
So you're trying to teach the AI which one is a cat, which one is a tiger, which one is a lion.
00:06:39 Ashly Jiju
The AI, on its own, fails to recognize that.
00:06:43 Ashly Jiju
So that's one example.
00:06:44 Ashly Jiju
There are a lot of people who go and check these images and label what they are.
00:06:48 Ashly Jiju
And it could be like you have to label X number of images over 8 hours.
00:06:53 Ashly Jiju
And then you might get like a 15-minute break, if you're lucky.
00:06:56 Ashly Jiju
A lot of companies don't give people that luxury too.
00:07:00 Ashly Jiju
So this is just one example.
00:07:02 Ashly Jiju
Another one, you can be a content moderator.
00:07:04 Ashly Jiju
And this is usually the really difficult work, because if you report that there's a video online of someone being killed, these content moderators are the ones who have to watch it and then label it: okay, this is not AI, this is a real video, and we should remove it from our platform, or make the AI system flag it as, you know, do not watch this, et cetera.
00:07:23 Ashly Jiju
So there's, you know, a range of things that they can do.
00:07:26 Ayca Ergin
To add to that, I think coming into the space, like working with the Fair Work Project, reading
00:07:32 Ayca Ergin
the book by Mark Graham, who's the director of Fair Work, Feeding the Machine, gave me some really raw examples that I hadn't been exposed to before.
00:07:39 Ayca Ergin
Like, one of the examples is a content moderator going to work on a regular day, you know, reviewing various types of content, who ends up actually having to see a video of someone from her own family being killed, right there in the workplace.
00:07:53 Ayca Ergin
And as she watches this video, she wants to leave the workplace and go see her family because she's actually watching it live from the platform as she's moderating it.
00:08:04 Ayca Ergin
But if she ends up leaving, that's going to cause the rate she's been keeping up really well to that point during the day to drop.
00:08:10 Veena McCoole
And presumably affect her income and things like that.
00:08:12 Ayca Ergin
Income.
00:08:13 Ayca Ergin
And then the manager says, okay, I think you should continue working for the rest of the day, and you can take tomorrow off so your rate for today won't be impacted.
00:08:21 Ayca Ergin
That's one example.
00:08:23 Ayca Ergin
To go a little bit further: we've spoken to, I think, some of the more common examples, but there are also these really deep and dark examples that are happening in terms of human trafficking.
00:08:32 Ayca Ergin
We've seen the news of, I believe it was, a data annotator in Africa moving from one African country to another based on, you know, better job options in this new country that she moved to.
00:08:43 Ayca Ergin
But she was basically brought there without a visa, and she ended up, you know, having to keep working for this company to receive her paycheck while she was illegally employed.
00:08:52 Ayca Ergin
And she wasn't able
00:08:53 Ayca Ergin
to leave or go back, because there was this company workforce above her telling her: well, you're here on illegal terms, you have to keep working, you can't go anywhere.
00:09:03 Ayca Ergin
So there are just these layers upon layers of elements that put these individuals at risk, which a lot of people don't talk about or even think about, I think, day-to-day.
00:09:11 Ayca Ergin
And these companies that have these AI supply chains sort of get away with it, because there are no regulations, no obligations to meet certain standards as they continue, especially within
00:09:23 Ayca Ergin
the race to win this AI boom.
00:09:26 Veena McCoole
So hearing all of this, I'm curious about whether there are platforms for these workers to voice any concerns or report issues.
00:09:33 Veena McCoole
I mean, what are the consequences here for the large tech companies that outsource to these BPOs?
00:09:39 Ayca Ergin
Yeah, so to speak to that, so maybe I need to take a step back, actually, to talk a little bit about the Fair Work principles and what we look at.
00:09:47 Ayca Ergin
So Fair Work has been around for about 7 to 8 years now.
00:09:50 Ayca Ergin
We are, you know,
00:09:53 Ayca Ergin
newer, I would say, to this AI supply chain space, but we've been doing years of research on identifying the right labor practices within these vulnerable workforce economies.
00:10:04 Ayca Ergin
So think about gig economies, platform work, sex work, and we look at 5 principles.
00:10:10 Ayca Ergin
So one of the big principles that we look at is actually fair representation.
00:10:14 Ayca Ergin
Workers being able to have the right forums to express concerns within the workplace they're working in, having the right support systems,
00:10:23 Ayca Ergin
including labor organizations or unions.
00:10:25 Ayca Ergin
And that is what we see as the baseline to call a workplace a fair workplace.
00:10:32 Ayca Ergin
In today's world, we see that there are not a lot of forums where workers can express their concerns.
00:10:37 Ayca Ergin
And I think that is coming from a place of the abundance of people that can replace these workers.
00:10:43 Ayca Ergin
So imagine yourself going to your manager working for a BPO and saying, hey, I'm not doing well.
00:10:51 Ayca Ergin
I am not in the right mental state
00:10:53 Ayca Ergin
to come into work tomorrow and work a 15-hour shift.
00:10:55 Ayca Ergin
The response that a lot of these people would get, and that we've heard from our interviews, is that you are replaceable.
00:11:01 Ayca Ergin
If you would like to not continue, you can quit your job right now, and we can replace you with somebody else tomorrow.
00:11:07 Ayca Ergin
And I think at the same time, there is also this worry that if you were to raise your concerns, if you were to unionize, you would be retaliated against.
00:11:15 Ayca Ergin
You're already potentially earning below a living wage, and there are not a lot of options in terms of where you
00:11:23 Ayca Ergin
can go next.
00:11:23 Ayca Ergin
So unfortunately, it's a bit of a dire situation right now, unless organizations like ourselves are able to step in and sort of give voice to these individuals behind these AI models and AI supply chains.
00:11:35 Ashly Jiju
And just to add to, you know, how big of an issue this is, and to put it in terms of numbers and percentages: we work with workers across the digital economy.
00:11:44 Ashly Jiju
So we work with workers from the platform economy.
00:11:47 Ashly Jiju
That can be location-based work or cloud work, where they work from home, remotely.
00:11:53 Ashly Jiju
We also work with a lot of sex workers.
00:11:55 Ashly Jiju
And AI comes into all of these different fields, especially a lot of platform work.
00:12:00 Ashly Jiju
Now, platform work is not a small percentage.
00:12:04 Ashly Jiju
They form about 12% of the global workforce, which, if you put it in numbers, is over 404 million people.
00:12:12 Ashly Jiju
And this is an estimate based on our research.
00:12:14 Ashly Jiju
And this is a very conservative estimate.
00:12:16 Ashly Jiju
The numbers can be over 600 million too.
00:12:18 Ashly Jiju
So we are talking about over 600 million people without these rights, without
00:12:23 Ashly Jiju
recourse.
00:12:23 Ashly Jiju
They don't know who to go to.
00:12:25 Ashly Jiju
They might be replaced any moment.
00:12:27 Ashly Jiju
A lot of our workers complain that they don't really have an option to complain because if they complain, they might be replaced the next day.
00:12:36 Ashly Jiju
Because if you think about it, these companies go to Asia and Africa because there's a lot of young people there.
00:12:42 Ashly Jiju
There's a lot of young people who are very educated, who can speak multiple languages fluently.
00:12:47 Ashly Jiju
And something that they look for, especially in BPOs, is that they have a sort of neutral accent, so that they can help customers,
00:12:53 Ashly Jiju
or help with data annotation, from all over the world, because they know English and other languages and they have a neutral accent.
00:13:00 Ashly Jiju
This is available in plenty in countries like the Philippines, India, Kenya, Nigeria, etc.
00:13:06 Ashly Jiju
So what happens is well-educated youth in these countries end up going to these BPOs.
00:13:12 Ashly Jiju
One thing that I want to mention here is that when I often talk to people about this, there's this idea that these young people don't know any better, that they're going to this because they're not aware, but the reality is that they are aware.
00:13:23 Ashly Jiju
It's just that they don't have opportunities available in the countries that they are in to go to any other field.
00:13:29 Ashly Jiju
An example:
00:13:33 Ashly Jiju
One of the workers we interviewed said that he has a really good bachelor's degree, but he's working at a BPO.
00:13:38 Ashly Jiju
Now he's the head of a family of four.
00:13:40 Ashly Jiju
He said that he works over 45 hours a week, but the salary he gets from his 45-hour work week is not enough to sustain him for more than two weeks of the month.
00:13:50 Ashly Jiju
And he says that for the remaining two weeks of the month, he has to take loans from moneylenders.
00:13:55 Ashly Jiju
So what happens is these educated, employed youth from a lot of these countries, despite having a full-time job, fall into this sort of debt
00:14:03 Ashly Jiju
trap that just extends the risks they face beyond workplace issues; it also impacts the rest of their lives.
00:14:11 Ashly Jiju
Another example is that a worker from Latin America was telling us that he can work over 14 hours a day and still not earn minimum wage for a day.
00:14:21 Ashly Jiju
This shows the extent to which a lot of these people are being exploited.
00:14:25 Ashly Jiju
It's not because they don't know any better.
00:14:27 Ashly Jiju
It's because they know what's wrong in the workplace.
00:14:30 Ashly Jiju
They know what's wrong in their countries.
00:14:32 Ashly Jiju
But they just don't
00:14:33 Ashly Jiju
have the outlet.
00:14:35 Ashly Jiju
They don't have support.
00:14:37 Ashly Jiju
They can't complain because they will be kicked out.
00:14:39 Ashly Jiju
So this is, honestly, what we try to do.
00:14:42 Ashly Jiju
We come and be that person.
00:14:43 Ashly Jiju
We come and be a shield for them.
00:14:45 Ashly Jiju
But not just for them.
00:14:46 Ashly Jiju
We come and also be that source of support for companies.
00:14:49 Ashly Jiju
Maybe they haven't thought about this.
00:14:51 Ashly Jiju
Well, it's high time that they think about this, because 400 to 600 million is not a small number, and that is a low estimate.
00:14:58 Veena McCoole
As you're sharing all this, what I'm thinking about is regulation.
00:15:02 Veena McCoole
I mean, what is the regulatory
00:15:03 Veena McCoole
context around this?
00:15:04 Veena McCoole
It sounds like if the problem is structural and at this scale, there's got to be some kind of response or framework that's legally enforceable that companies have to abide by, no?
00:15:14 Ayca Ergin
Yeah, great topic to move into next because I think this is the most exciting part of the research that we've been doing that sort of fueled the fire on why we should expand into the AI supply chain portion of our research at Fair Work.
00:15:27 Ayca Ergin
So there's a couple of different regulations that are ongoing right now, especially coming out of the EU.
00:15:32 Ayca Ergin
I do see
00:15:33 Ayca Ergin
the EU being criticized a lot on the global scene, in terms of their caution and the way they approach how tech gets developed, but, having moved to the UK from the US, I actually do appreciate the approach the EU is taking, because I think it's absolutely necessary.
00:15:48 Ayca Ergin
So, the big regulatory
00:15:50 Ayca Ergin
movement that everyone's kind of talking about right now, especially in our space, is the CS3D.
00:15:56 Ayca Ergin
It's the Corporate Sustainability Due Diligence Directive, which actually focuses on supply chains as an overall sort of context.
00:16:03 Ayca Ergin
So what the CS3D expects, and it's going into effect in 2026, is that any company that has a supply chain that goes through the European Union has to go through a risk analysis to identify, you know:
00:16:20 Ayca Ergin
where are their suppliers located?
00:16:23 Ayca Ergin
Are they meeting certain standards?
00:16:25 Ayca Ergin
Do they have the right contracts in place?
00:16:28 Ayca Ergin
And they have to report that these various stakeholders within their AI supply chains are meeting the EU's and the CS3D's expectations.
00:16:36 Ayca Ergin
It goes back to the idea that, you know, corporate sustainability is no longer a nice-to-have; it's actually mandatory.
00:16:45 Ayca Ergin
You have to do something about it, and you have to report on it, to be able to run your business within the EU.
00:16:50 Ayca Ergin
This is especially important, and I think a lot of people are missing it, within the AI space, because when you think about supply chains, a lot of times you think about manufacturing or agriculture, but those are the more traditional supply chains that we know of.
00:17:04 Ayca Ergin
But the AI supply chain still fits into this framework, and AI companies that have these supply chains will be affected.
00:17:10 Ayca Ergin
The other regulation that, you know, we talk about is the EU AI Act,
00:17:15 Ayca Ergin
which requires companies that build AI models to shed light on how they build these models.
00:17:22 Ayca Ergin
Can you provide explainability, at least at the human level, the level that we expect from humans?
00:17:28 Ayca Ergin
So, you know, it sort of sets the standards on how these models are developed.
00:17:31 Ayca Ergin
The last piece is around platform work.
00:17:35 Ayca Ergin
It's called the Platform Work Directive, which, again, the European Union is working on building.
00:17:40 Ayca Ergin
It's, again, you know, going back to our five principles: making sure
00:17:46 Ayca Ergin
workers within these fields are earning a living wage, have the right forums to raise concerns, have fair management.
00:17:53 Ayca Ergin
So we are seeing all these three different regulations sort of converging, right?
00:17:58 Ayca Ergin
Moving towards an ideal world where workers and different suppliers or stakeholders within supply chains are able to get the right outcomes for what they invest in and what they do well.
00:18:13 Ayca Ergin
So that's, you know, the regulatory
00:18:15 Ayca Ergin
context.
00:18:16 Ayca Ergin
To add on top of that, though: we talked a lot about the EU.
00:18:19 Ayca Ergin
So I think what I also hear a lot is: well, what is the US doing?
00:18:23 Ayca Ergin
And I think everyone kind of knows how the US is approaching it.
00:18:26 Ayca Ergin
Well, in this case, though, if the CS3D goes into effect, along with the EU AI Act and the Platform Work Directive, even if you're not based out of the European Union, you will have to comply with the European Union's rules.
00:18:38 Ayca Ergin
So I do think that the European Union's sort of pioneering efforts in this space are going to raise the standards globally.
00:18:46 Veena McCoole
Got it.
00:18:47 Veena McCoole
And we've talked about regulation, we've talked about the impact on individual workers, and we've also talked about the scale of this problem.
00:18:53 Veena McCoole
Actually, those numbers are staggering: 404 million at a conservative estimate.
00:18:58 Veena McCoole
Why should companies care?
00:18:59 Veena McCoole
Like, is there a business case for improving this?
00:19:02 Ashly Jiju
Well, I think companies should absolutely care, because, just like Ayca was saying, if you have these regulations and they come into place, it's not just going to affect the companies that are located in those areas.
00:19:14 Ashly Jiju
Let's say you're a company in Australia: if you want to sell your product to the European market, you still have to comply.
00:19:21 Ashly Jiju
If you are a European company and you want to sell your product to Australia, again, I'm just taking these two examples,
00:19:28 Ashly Jiju
you still have to comply with the European regulations.
00:19:31 Ashly Jiju
So one factor is that it's there, it's out there, and you have to follow it, because if you don't follow these regulations, it can really badly impact your company's reputation from a regulatory perspective.
00:19:44 Ashly Jiju
Now, another side where it can impact you is that workers, through our work and the work of a lot of civil society organizations, are starting to learn about their importance and their impact in the global supply chain.
00:19:55 Ashly Jiju
They are gaining voice and they know, like I said before,
00:19:58 Ashly Jiju
They know that there are people who can help them, and they know that their voice matters.
00:20:02 Ashly Jiju
The AI supply chain is a very competitive market, and you can't say of any company located in it that they will still be there
00:20:11 Ashly Jiju
10 years later.
00:20:12 Ashly Jiju
It's an incredibly competitive market with a lot of funding.
00:20:16 Ashly Jiju
Funders are pouring in money if you have something AI related.
00:20:19 Ashly Jiju
So to survive in that competitive market, you not only need a good product, you need to ensure that the whole supply chain that forms your product is also ethical.
00:20:30 Ashly Jiju
Because if something comes up, funders are not inclined to fund you.
00:20:34 Ashly Jiju
If they see an article that says, oh, you are not treating your workers well, why would they invest 500 million, or billions, in your company?
00:20:41 Ashly Jiju
Because
00:20:41 Ashly Jiju
If that company's stock crashes, then the funders lose out on their money.
00:20:45 Ashly Jiju
So one side is regulatory, another side is just from a pure financial point of view.
00:20:51 Ashly Jiju
And the third side is ethical.
00:20:53 Ashly Jiju
At the end of the day, we're humans.
00:20:55 Ashly Jiju
You can't just build a product.
00:20:58 Ashly Jiju
and claim it to be an ethical product, but not have ethical supply chains behind it.
00:21:03 Ashly Jiju
People say that, oh, knowledge is human.
00:21:05 Ashly Jiju
AI is human.
00:21:07 Ashly Jiju
We're trying to build an ethical AI product.
00:21:10 Ashly Jiju
But AI is the work of humans.
00:21:12 Ashly Jiju
If the humans who work behind the AI systems are not being treated well, the product, the output, might not be the most ethical product.
00:21:22 Ashly Jiju
Building a product is not just coding.
00:21:23 Ashly Jiju
It's not just putting the pieces together.
00:21:26 Ashly Jiju
It's also about what goes into the product.
00:21:28 Ashly Jiju
With AI, we teach it human behavior.
00:21:31 Ashly Jiju
We teach it human emotions.
00:21:33 Ashly Jiju
If the workers working on this AI model are not being paid well, are being mistreated in their company, do you really think that they will be able to put in a very beautiful, moralistic idea of what humanity should be?
00:21:48 Ashly Jiju
No, that's not going to happen.
00:21:50 Ashly Jiju
It might happen on paper, but AI is very complex and we still don't know how AI really works.
00:21:55 Ashly Jiju
So that's the third thing.
00:21:57 Ashly Jiju
Honestly, those are the three broad things: the regulatory perspective, the financial perspective, and also pure humanity,
00:22:03 Ashly Jiju
an ethical perspective.
00:22:04 Ashly Jiju
And I think these three reasons altogether encompass everything that a company does.
00:22:11 Ayca Ergin
Yeah.
00:22:11 Ayca Ergin
And to add on top of that, I think I want to go for a little bit more positive reinforcement, which is that you can avoid fines.
00:22:19 Ayca Ergin
There's going to be some fines coming out of these regulations.
00:22:22 Ayca Ergin
If you don't meet these regulation standards, especially the CS3D, you're going to be fined for that.
00:22:27 Ayca Ergin
What if you were to meet these standards and invested the amount you would pay in fines
00:22:31 Ayca Ergin
back into your workforce, back into how you develop these AI models?
00:22:34 Ayca Ergin
I think that's definitely an important sort of twist to the situation.
00:22:39 Ayca Ergin
The other piece that I also want to emphasize, you know, building on top of that, is the outcome of the AI models: imagine you're coming into work, and you've worked 18-hour days for the last three weeks.
00:22:49 Ayca Ergin
It's been, you know, really taking a toll on you.
00:22:52 Ayca Ergin
The output that you're going to contribute towards that AI model is also not going to be great.
00:22:56 Ayca Ergin
Like one of the biggest challenges we're seeing right now is the data quality, right?
00:23:00 Ayca Ergin
And the amount of data available
00:23:01 Ayca Ergin
to build AI models.
00:23:03 Ayca Ergin
So it's super, super important how we manage the existing data that we have.
00:23:08 Ayca Ergin
So that also goes back to the idea of, like, okay, if we would like to continue building better AI models, we also need to invest in the workforce behind them, because they are the people leveraging this very important, rare data that we have available in building these models.
00:23:24 Ayca Ergin
And then, I think, going back to the reputation issue that you were raising, and I'm playing the good cop here: I think companies that are
00:23:31 Ayca Ergin
investing in this space are also going to start shining.
00:23:34 Ayca Ergin
That's what we're hoping to do also with, you know, the initiative that we're driving at Fair Work.
00:23:38 Ayca Ergin
We would like the companies within AI supply chains, including lead firms and suppliers, that comply with these, you know, bare-minimum standards to be appreciated and celebrated, and for them to get to drive the future of AI development.
00:23:52 Ayca Ergin
So I think that's the objective that we're trying to drive with our research.
00:23:56 Veena McCoole
Yeah, I appreciate you turning this conversation slightly more positive.
00:24:00 Veena McCoole
Yeah, no need to get carried away with
00:24:01 Veena McCoole
how bleak things are, but it's really heartening also to hear that there is work being done to audit, to certify, to, as you were saying, support the companies that do want to make a change, with exactly how that's done.
00:24:14 Veena McCoole
And I'd love for you to give us a bit of a whistle-stop tour as to, first of all, what that looks like, but also maybe what you've learned so far from this process.
00:24:21 Ayca Ergin
I am very happy to actually be here, because the notion that we've been driving around this AI supply chain actually goes back to some very interesting research and due diligence we have done to come up with this new direction we're taking, as we provide lead firms, as well as suppliers sitting within AI supply chains, a pathway to get certified.
00:24:43 Ayca Ergin
We're a bit ahead of our
00:24:44 Ayca Ergin
times; the CS3D was sort of the fire behind our decision to develop this approach.
00:24:49 Ayca Ergin
It's coming into effect in 2026, and we are in conversation with a lot of potential companies that we can partner with to bring them up to a level where they meet these standards.
00:24:59 Ayca Ergin
So what does the fair work audit and certification process entail?
00:25:04 Ayca Ergin
It's sort of a mirror of all the work that we've been doing over the last 7 to 8 years in the platform work and cloud work space.
00:25:12 Ayca Ergin
We look at lead firms'
00:25:14 Ayca Ergin
supply chains; we do a sort of risk analysis and map the different types of companies that sit within their AI supply chains.
00:25:21 Ayca Ergin
And we do a sort of discovery, and we look at the contracts that they have in place and whether they meet certain principles that we have, the five Fair Work principles that we always speak to.
00:25:34 Ayca Ergin
And then from there, we work with them to identify areas that they need improvement.
00:25:39 Ayca Ergin
So we build an action plan with them.
00:25:41 Ayca Ergin
It could be that, you know, they don't have the right forums where the workers can express concerns.
00:25:46 Ayca Ergin
They might not have fair management practices, or the workers might not be making a living wage.
00:25:51 Ayca Ergin
So we work with them to come up with a plan where they can meet these standards.
00:25:57 Ayca Ergin
So we go into the suppliers' workplaces to provide workshops, trainings, help them change certain terms in their contracts.
00:26:05 Ayca Ergin
And as we complete this process, we look at, you know, long-term growth and their commitment to improving their practices.
00:26:12 Ayca Ergin
It's not a one-off assessment of what they're doing; it's a long-term partnership that we're hoping to drive, to hopefully get them to a place where they meet Fair Work's standards, as well as these upcoming global regulations.
00:26:27 Ayca Ergin
We have the audit option and then the certification option.
00:26:30 Ayca Ergin
Audit is mostly for companies that are trying to get that one-time mark.
00:26:35 Ayca Ergin
But certification is, I think, the pathway we're focusing on the most because that will bring to light which companies are looking to grow and sustain growth within this space in the long term.
00:26:46 Veena McCoole
And walk me through some of the learnings so far from this engagement.
00:26:50 Veena McCoole
You know, a lot of our conversation today has alluded to anecdotes and interviews, things that you've learned by going in and doing this work.
00:26:57 Veena McCoole
I mean, I'm curious to hear what those key learnings look like at this early stage.
00:27:03 Ashly Jiju
Well, one key learning that I've had so far is...
00:27:06 Ashly Jiju
Companies are interested.
00:27:08 Ashly Jiju
Companies are positive about it.
00:27:10 Ashly Jiju
And these companies come from all over the world.
00:27:13 Ashly Jiju
So I've had interest from companies in the US, Northern Africa, Southern Africa, India, Australia, Southeast Asia, and East Asia.
00:27:26 Ashly Jiju
You know, I'm not going to specify which companies, but our team's seen interest from all over the world.
00:27:30 Ashly Jiju
So it clearly speaks to the value of the work that we're trying to do.
00:27:34 Ashly Jiju
And, like I just said, companies recognize the value of this work, and they want to put themselves out in the global supply chain as leaders.
00:27:42 Ashly Jiju
So one way of becoming a leader is to build a good product, but another way is to build the best product.
00:27:49 Ashly Jiju
And like I said, best product also comes with having good ethical practices.
00:27:53 Ashly Jiju
So that's one learning we've had: companies from across the world are genuinely interested in this, and we're actively having discussions with a lot of them to see how it aligns.
00:28:02 Ashly Jiju
And these range from really big to really small companies.
00:28:06 Ashly Jiju
So they're across the spectrum because we are evaluating the supply chain.
00:28:11 Ashly Jiju
We are not just focused on a specific kind of company.
00:28:14 Ashly Jiju
We get requests from data annotation companies, consultancies, large software development companies, big tech companies.
00:28:23 Ashly Jiju
It's all over the world.
00:28:25 Ashly Jiju
And that's the main learning that we've had.
00:28:28 Ayca Ergin
Yeah, and to add to that, I think the big question behind that is: with this AI race, how do you differentiate yourself?
00:28:34 Ayca Ergin
There are so many ways to differentiate. You could be pricing your AI models at a lower threshold, or you could be paying your workforce below a living wage.
00:28:44 Ayca Ergin
But one way to also differentiate yourself is to prove that you're ethical.
00:28:48 Ayca Ergin
And I think people are getting more and more sensitive in this particular space, because we're hearing a lot of news of users or companies impacted by AI models that are not fully developed or not meeting their needs.
00:29:01 Ayca Ergin
So I think there is this growing interest in working with or adopting AI tools that are ethical.
00:29:08 Ayca Ergin
And the way to differentiate yourself in this AI race could be actually pursuing that pathway.
00:29:13 Veena McCoole
Completely. Yeah, why not be a company that is known for doing good and doing right in this way?
00:29:19 Ayca Ergin
Yeah.
00:29:19 Veena McCoole
I feel like we could carry on this conversation forever, talking about the intricacies of the supply chain and things that we're not thinking about on a day-to-day basis.
00:29:29 Veena McCoole
I'd love to conclude by asking both of you: for people who are logging on to use a chatbot for their work or life, what is the one thing you would like them to keep in mind the next time they're doing that?
00:29:42 Ayca Ergin
The one message that I want folks who are listening to leave this podcast with is that you might be looking at a screen, a piece of hardware with software engraved into it, but there are millions of people sitting behind it.
00:29:54 Ayca Ergin
Just because you can't see them doesn't mean that they're not doing the work for you.
00:29:58 Ayca Ergin
So next time you're chatting with either, you know, an agent or just a chatbot, remember that they're the ones carrying that initiative for you.
00:30:06 Ashly Jiju
What I would say is that you might just look at the screen, but that screen, the chatbot, the AI, it's not perfect.
00:30:13 Ashly Jiju
We tend to think that AI is a final product.
00:30:15 Ashly Jiju
We tend to think that, oh, it's all going uphill from here.
00:30:17 Ashly Jiju
But the reality is that we still need humans to build these AI systems, because we still don't know how the AI's brain works.
00:30:27 Ashly Jiju
AI interpretability is, you know, a field that is still being studied.
00:30:31 Ashly Jiju
AI hallucinates, AI is biased.
00:30:33 Ashly Jiju
So we still need humans to cross-check and to develop these AI systems.
00:30:38 Ashly Jiju
And we can't have those humans be mistreated.
00:30:41 Ashly Jiju
We can't build an inclusive internet.
00:30:43 Ashly Jiju
We can't build an ethical AI product without ensuring that the humans behind it are remunerated well, that we think about them and care about them.
00:30:55 Ashly Jiju
That they come to our minds, whether we are governments, companies, or individuals just operating in the space: there are humans behind this.
00:31:04 Ashly Jiju
And an inclusive world, an inclusive internet, inclusive AI cannot happen without including the people who work behind it.
00:31:13 Veena McCoole
Eka and Ashley, this has been such a fascinating conversation.
00:31:15 Veena McCoole
Thank you both.
00:31:16 Veena McCoole
If you enjoyed this episode, we would love for you to share it on social media.
00:31:20 Veena McCoole
Follow along with all of our latest updates from the Oxford Internet Institute by following us online.
00:31:25 Veena McCoole
And stay tuned for the next episodes to come.
00:31:29 Veena McCoole
Thanks so much again, guys.
00:31:30 Ashly Jiju
Thank you.