Twitter's Photo Cropping Algorithm Draws Heat for Possible Racial Bias – PetaPixel

Over the weekend, cryptographic engineer Tony Arcieri went viral on Twitter by pointing out an uncomfortable problem with the social network's auto-cropping algorithm. In what he characterized as a "horrible experiment," he published two different images, each of which consisted of a portrait of Senate Majority Leader Mitch McConnell and former President Barack Obama.

Back in January of 2018, Twitter introduced an auto-cropping AI that identifies the most interesting part of your image and crops the preview picture to match. This works on everything from plane wings to faces, but as one engineer demonstrated this weekend, it may suffer from some fundamental bias.
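Twitter's production cropper is a neural network trained to predict where viewers' eyes land. As a rough illustration of the general idea only (not Twitter's actual model), a toy saliency crop can score each pixel by its contrast against the image's mean brightness and center the preview window on the highest-scoring region:

```python
import numpy as np

def naive_saliency_crop(image: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    """Crop a window around the most 'interesting' region of a grayscale image.

    Toy stand-in for a learned saliency model: scores each pixel by its
    absolute contrast against the global mean, then picks the crop window
    with the highest summed score.
    """
    # Contrast against the global mean as a crude saliency score.
    saliency = np.abs(image.astype(float) - image.mean())

    # Integral image: summed saliency of every crop_h x crop_w window
    # in O(1) per window instead of re-summing each one.
    integral = np.pad(saliency.cumsum(axis=0).cumsum(axis=1), ((1, 0), (1, 0)))
    window_sums = (
        integral[crop_h:, crop_w:]
        - integral[:-crop_h, crop_w:]
        - integral[crop_h:, :-crop_w]
        + integral[:-crop_h, :-crop_w]
    )

    # Top-left corner of the best-scoring window.
    top, left = np.unravel_index(window_sums.argmax(), window_sums.shape)
    return image[top:top + crop_h, left:left + crop_w]
```

On a mostly white image like the ones in Arcieri's test, a crop like this snaps to whichever region contrasts most with the background, which is consistent with the high-contrast-smile result described below, though the real model's behavior is far less transparent.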

For context, these are the photos he used, each of which is about 600 x 3000px. Notice the extreme amount of white space between the photo at the top and the one at the bottom:

"Which [face] will the Twitter algorithm pick: Mitch McConnell or Barack Obama?" asked Arcieri. In this particular case, using these two images, the answer was always McConnell, no matter what order the pictures were stacked.

Here's Arcieri's original post, which has been retweeted over 76K times and liked over 190K times as of this writing:

Trying a horrible experiment …
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol) ICE” Arcieri (@bascule) September 19, 2020

After the post went viral, Arcieri ran a few other experiments to try to address some criticisms and alternative theories that users had raised. For example, swapping out the red tie for a blue tie did not change the results:

"It's the red tie! Clearly the algorithm prefers red ties!"
Well, let's see … pic.twitter.com/l7qySd5sRW
— Tony “Abolish (Pol) ICE” Arcieri (@bascule) September 19, 2020

But inverting the colors of the image did:

Let's try inverting the colors … (h/t @KnabeWolf) pic.twitter.com/5hW4owmej2
— Tony “Abolish (Pol) ICE” Arcieri (@bascule) September 19, 2020

Another user showed that even if you increase the number of Obamas and remove all the white space between the images, the same thing happens:

If we increase the number of Obamas, I wonder what happens. pic.twitter.com/sjrlxjTDSb
— Jack Philipson (@Jack09philj) September 19, 2020

Others have tried reversing the order in which the photos are attached, or reversing the order of the names in the tweet itself, neither of which changed the outcome. However, using a different image of Obama with a more obvious, high-contrast smile did reverse the result every time:

@thetokensquare
Okay, to test this hypothesis, let's try using an image of Barack with a higher contrast smile. This might do it. pic.twitter.com/AX073Ss2KD
— Kim Sherrell (@kim) September 20, 2020

No doubt the experiments will continue as people try to parse exactly what the algorithm is highlighting and whether or not it should be classified as implicit racial bias. In the meantime, Liz Kelley of Twitter Comms responded by thanking Arcieri and the rest of the people who were testing this out, and admitting that they've "got more analysis to do."

— liz kelley (@lizkelley) September 20, 2020

"We tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it's clear that we've got more analysis to do," wrote Kelley in a tweet. "We'll open source our work so others can review and replicate."
(via Engadget)