Twitter has revealed that its image-cropping algorithm shows a notable bias against black people and men. The company looked into the issue after users complained that the machine-learning-based algorithm was excluding black people's faces from image previews, Reuters reported.
According to Twitter's analysis, the algorithm deviated from demographic parity by 8% in favour of women and by 4% in favour of white people. Demographic parity means there is no bias: both groups being compared have an equal chance of being highlighted by the algorithm.
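Twitter has not published the exact metric it used, but demographic parity is commonly measured as the difference in selection rates between two groups. As a rough sketch (the function name and the example counts below are hypothetical, not Twitter's figures):

```python
def demographic_parity_gap(selected_a, total_a, selected_b, total_b):
    """Difference in selection rates between group A and group B.

    A result of 0 means demographic parity: both groups are
    highlighted at the same rate. A positive value means group A
    is favoured; negative means group B is favoured.
    """
    return selected_a / total_a - selected_b / total_b


# Hypothetical illustration: if 54 of 100 images of women and
# 46 of 100 images of men were highlighted, the gap is 0.08,
# i.e. an 8-percentage-point deviation from parity.
gap = demographic_parity_gap(54, 100, 46, 100)
```

A gap of 0.08 would correspond to the kind of 8% deviation the article describes.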
The company also investigated whether its algorithm sexually objectified women by cropping images to focus on body parts other than faces, but ultimately found no evidence of this.
I’m excited to share that we’re rolling this out to everyone today on iOS and Android. You’ll now be able to view single, standard aspect ratio images uncropped in your timeline. Tweet authors will be able to also see their image as it will appear, before they Tweet it. https://t.co/vwJ2WZQMSk
— Dantley Davis (@dantley) May 5, 2021
Twitter explained that it began using a “saliency algorithm” in 2018 to crop pictures, aiming to standardise the size of images and allow people to see more tweets at a glance. Powered by machine learning, the algorithm estimated what a user might want to see first within an image.
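Twitter has not released the internals of its saliency model, but the general idea of saliency-based cropping is to score each pixel by how likely it is to draw the eye, then crop a window around the highest-scoring region. A minimal sketch, assuming the saliency map is already computed (the function and the clamping strategy below are illustrative, not Twitter's implementation):

```python
def crop_around_peak(saliency, crop_h, crop_w):
    """Return the top-left corner of a crop_h x crop_w crop window
    centred on the most salient pixel, clamped to the image bounds.

    `saliency` is a 2D list of per-pixel saliency scores
    (higher = more likely to attract a viewer's attention).
    """
    h, w = len(saliency), len(saliency[0])
    # Locate the most salient pixel.
    peak_r, peak_c = max(
        ((r, c) for r in range(h) for c in range(w)),
        key=lambda rc: saliency[rc[0]][rc[1]],
    )
    # Centre the window on the peak, clamping so it stays in bounds.
    top = min(max(peak_r - crop_h // 2, 0), h - crop_h)
    left = min(max(peak_c - crop_w // 2, 0), w - crop_w)
    return top, left
```

A crop chosen this way is only as fair as the saliency scores themselves, which is where the biases described above can enter.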
Abandoning that approach, the platform recently shifted to displaying standard aspect ratio photos in full, without the algorithm-backed cropping. Users are also shown a true preview of the image before they click to post.
In the end, the company concluded, “not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.”
(Source: Reuters, Twitter.)