Twitter to investigate 'racist' image-cropping function

Users of the social media site have pointed out a possible racial bias in Twitter's image-cropping algorithm, which automatically selects which part of a photo to show in previews.

The experiment compared whether Twitter's image algorithm would focus on the face of either US politician Mitch McConnell or former President Barack Obama. Source: AAP

Social media giant Twitter said it would investigate its image-cropping function after users complained it favoured white faces over black ones.

The image preview function of Twitter's mobile app automatically crops pictures that are too big to fit on the screen and selects which parts of the image to display and cut off.

Prompted by a graduate student who found an image he was posting cropped out the face of a black colleague, a San Francisco-based programmer found Twitter's system would crop out the face of former President Barack Obama when his photo was posted alongside that of Republican Senate Majority Leader Mitch McConnell.

"Twitter is just one example of racism manifesting in machine learning algorithms," the programmer, Tony Arcieri, wrote on Twitter.

Twitter is one of the world's most popular social networks, with nearly 200 million daily users.

Other users shared similar experiments online which they said showed Twitter's cropping system favouring white people.

Twitter admitted the company still had work to do.

“Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing. But it’s clear from these examples that we’ve got more analysis to do. We’ll continue to share what we learn, what actions we take, and will open-source our analysis so others can review and replicate,” a Twitter spokesperson said.

In a 2018 blog post, Twitter said the cropping system was based on a “neural network” that used artificial intelligence to predict which part of a photo a user would find most interesting and crop out the rest.
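
For illustration only, a minimal Python sketch of saliency-style cropping is below. The saliency measure here is a crude stand-in (deviation from average brightness), not Twitter's trained neural network, and the function names and preview dimensions are assumptions for the example.

```python
# Illustrative sketch of saliency-based cropping; this is not Twitter's actual model.
import numpy as np
from PIL import Image


def saliency_map(gray: np.ndarray) -> np.ndarray:
    """Toy saliency proxy: absolute deviation from the mean brightness."""
    return np.abs(gray - gray.mean())


def crop_window(img: Image.Image, out_w: int, out_h: int) -> tuple:
    """Return a (left, top, right, bottom) box centred on the most salient pixel."""
    gray = np.asarray(img.convert("L"), dtype=np.float32)
    y, x = np.unravel_index(np.argmax(saliency_map(gray)), gray.shape)
    left = int(np.clip(x - out_w // 2, 0, max(img.width - out_w, 0)))
    top = int(np.clip(y - out_h // 2, 0, max(img.height - out_h, 0)))
    return (left, top, left + out_w, top + out_h)


def smart_crop(img: Image.Image, out_w: int, out_h: int) -> Image.Image:
    """Crop the image to the window chosen above, as a timeline preview would."""
    return img.crop(crop_window(img, out_w, out_h))


# preview = smart_crop(Image.open("photo.jpg"), 600, 335)  # hypothetical preview size
```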

A Twitter representative also pointed to an experiment by a Carnegie Mellon University scientist who analysed 92 images and found the algorithm favoured black faces 52 times.
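
The kind of audit described above can be sketched as a simple counting experiment: stack two portraits into one tall composite, crop a preview, and record which portrait the crop keeps. The sketch below reuses the illustrative crop_window from the earlier example; the file names are hypothetical, so it shows the method rather than reproducing the Carnegie Mellon figures.

```python
# Counting which portrait an automatic crop favours over many stacked pairs.
from collections import Counter
from PIL import Image


def favoured_portrait(top_path: str, bottom_path: str, gap: int = 400) -> str:
    """Stack two portraits with white space between them, crop a preview,
    and return the path of the portrait whose half of the composite
    contains the centre of the chosen crop window."""
    top, bottom = Image.open(top_path), Image.open(bottom_path)
    width = max(top.width, bottom.width)
    canvas = Image.new("RGB", (width, top.height + gap + bottom.height), "white")
    canvas.paste(top, (0, 0))
    canvas.paste(bottom, (0, top.height + gap))
    left, t, right, b = crop_window(canvas, width, min(400, canvas.height))
    centre = (t + b) / 2
    return top_path if centre < top.height + gap / 2 else bottom_path


# Hypothetical files; swap which portrait sits on top to control for position.
pairs = [("obama.jpg", "mcconnell.jpg"), ("mcconnell.jpg", "obama.jpg")]
tally = Counter(favoured_portrait(a, b) for a, b in pairs)
print(tally)
```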

But Meredith Whittaker, co-founder of the AI Now Institute, which studies the social implications of artificial intelligence, said she was not satisfied with Twitter's response.

"Systems like Twitter's image preview are everywhere, implemented in the name of standardisation and convenience,” she said.

"This is another in a long and weary litany of examples that show automated systems encoding racism, misogyny and histories of discrimination." 

A number of studies have found evidence of racial bias in facial recognition software, with white faces more likely to be correctly identified than black faces.


Source: Reuters, SBS
