The Black Lives Matter movement has prompted many tech companies to review their policies to make sure Black voices don’t get drowned out. Instagram head Adam Mosseri said the company is examining its recommendation algorithms and policies to determine whether they treat everyone equally.
Mosseri addressed ‘shadow bans,’ harassment, and disputes over content takedowns raised by Black creators. He said the company will focus on four areas to improve its services: harassment, account verification, content distribution, and algorithmic bias.
Addressing the feedback we get has always been an integral part of how we work, and has helped us build a better Instagram for everyone. We’re going to focus on four areas:
• Harassment
• Account verification
• Content distribution
• Algorithmic bias
Of these four, algorithmic bias and content distribution go hand in hand, as people spend most of their time in their feed or the Explore tab. Mosseri said the company has heard a lot about ‘shadow banning,’ a practice where the platform limits the reach of certain posts, and that it will soon release information on what kinds of content it down-ranks.
Instagram will also examine whether its algorithms carry any unconscious bias, though it didn’t provide details on how it plans to find and correct it. In addition, it will look into safety issues Black people face on the platform.
These efforts won’t stop with the disparities people may experience solely on the basis of race; we’re also going to look at how we can better serve other underrepresented groups that use our product. In the last year alone the feedback we’ve received from communities like LGBTQ+ groups, body positivity activists, and artists has helped us build a more inclusive product.
Ideally, Instagram should ask outside academics to study its algorithms and policies to identify bias, and it needs to include marginalized communities in the discussion about how to solve these problems. Bias against underrepresented groups is not something that can be fixed in a day; it will be a long process.