Algorithms drive online discrimination, academic warns

Social Media

2019-12-12 / www.ft.com




Sandra Wachter says AI uses sensitive personal traits to target or exclude people in ads.

Existing laws are failing to protect the public from discrimination by algorithms that influence decision-making on everything from employment to housing, according to new research from the Oxford Internet Institute.
Sandra Wachter, the academic behind the study, found algorithms are drawing inferences about sensitive personal traits such as ethnicity, gender, sexual orientation and religious beliefs based on our browsing behaviour.
These traits are then used by online advertisers to either target or exclude certain groups from products and services, or to offer them different prices.
Under current data protection regulation, it is illegal for advertisers to target groups of people based on sensitive "special category" information.

The 33-year-old researcher's new paper, which is to be published in the Berkeley Technology Law Journal in early 2020, calls the phenomenon "discrimination by association".
"Grouping people according to their assumed interests rather than solely their personal traits has become commonplace in the online advertising industry," Ms Wachter wrote. "[I]f users are segregated into groups and offered or excluded different products, services, or prices on the basis of affinity, it could raise discrimination issues."
Ms Wachter, an Austrian lawyer who specialises in the ethical and legal implications of emerging technologies such as machine learning and robotics, has recently co-founded a new research programme at the Oxford Internet Institute called the "governance of emerging technologies", an interdisciplinary effort to interrogate artificial intelligence.
In particular, her recent work has examined intelligent systems that make decisions affecting humans, and what recourse people have to take back control when algorithms fail or demonstrate prejudice.
"I grew up with the understanding that the purpose of technology and innovation should be to have a democratising effect," Ms Wachter said.
"Figuring out how to explain algorithmic systems in a meaningful way, and making sure that they are accountable, is what I'm interested in. We have to make sure that innovation is not . . . disadvantaging groups or widening existing gaps."

Civil society organisations have already documented algorithmic discrimination. The non-profit ProPublica published reports showing Facebook allowed companies including Amazon, Goldman Sachs, Verizon and Uber to target adverts for jobs, housing and credit by excluding certain groups based on ethnicity, age and gender. For instance, its survey of 91 Uber ads across a dozen US cities found 87 that exclusively targeted men and just one that exclusively targeted women.
Discrimination by association means that even the most innocuous groupings can have harmful consequences. For instance, a loan-granting algorithm might detect a positive correlation between dog ownership and repaying loans on time, and therefore treat "dog owning" as a seemingly neutral variable that favours certain applicants for loans.
"But, for example, in the UK, most people won't be able to own a dog if they don't own a house because most landlords don't allow pets if you're renting," Ms Wachter said. "So you are preferring people who have their own houses over renters, and that will have a disparate result for certain populations."
From a legal perspective, it is particularly difficult for people to bring non-discrimination cases against companies doing this type of targeting, because individuals are rarely aware they have been profiled or treated differently online, she said.
Whereas in the offline world you could visit three different supermarkets and find that you were being excluded from some or offered higher prices at others, the internet is more opaque. "That's so hard to prove in the online world because how do I know that I've been offered a different price?" Ms Wachter said. "And do I even know that certain ads are not being shown to me, and maybe I'm not even seeing certain products any more?"
"You would also need to show that the differential pricing the company is using has a disadvantage for a protected group," she added. "And that protected group has to be big enough. But how do I know what the criteria are that I'm being profiled on? Who are the other people in that group? How many are we? I don't know anything about that."
Ms Wachter argued that current laws need to be amended to protect against this type of profiling at scale, and proposed greater protections for groups associated with sensitive traits, such as ethnicity or religion, whether those traits are merely inferred or accurate.
"Applying the concept of 'discrimination by association' to online behavioral advertising is unprecedented, but a potentially powerful tool in the quest for more algorithmic accountability," she said.
