Last Saturday, Twitter user @bascule tested the platform’s image-cropping algorithm by uploading sets of photos containing both Mitch McConnell and Barack Obama. Many other users soon joined in with their own versions of the “experiment”, only to find that the algorithm almost always chose the white person’s face for the image preview (shown before the full image is opened).
The post was prompted by another viral tweet, by @colinmadland, who described how a coworker of colour could not use the video-calling platform Zoom’s virtual backgrounds because the platform’s algorithm failed to detect him.
The rising socio-political tensions in the US surrounding the ongoing Black Lives Matter protests also fuelled this particular Twitter experiment, which quickly went viral. Other users tried techniques such as adjusting the contrast of the photos, inverting their colours, and swapping their positions. In the majority of cases, however, the preview still displayed the white face.
The virality of the post forced Twitter to respond to the accusations of racial bias. Twitter claims to “have tested for bias” in 2017, finding “no significant bias between ethnicities (or genders)”, according to research engineer Dr Zehan Wang.
Twitter has acknowledged the seriousness of the issue and has decided to re-examine its algorithm’s earlier tests. The company has also promised to be transparent throughout this process and to share an update on the issue soon. Zoom’s official Twitter account has likewise reached out to @colinmadland to investigate his complaint.