April 8, 2025
Breaking Alert

Google’s AI Nano Banana Pro accused of generating racialized visuals

AI image generators have repeatedly been shown to replicate – and at times exaggerate – US social biases. Models such as Stable Diffusion and OpenAI’s Dall-E return mostly images of white men when asked to depict “lawyers” or “CEOs”, and mostly images of men of colour when asked to depict “a man sitting in a prison cell”.

Recently, AI-generated images of extreme, racialized poverty have flooded stock photo sites, prompting debate in the NGO community about how AI tools replicate harmful imagery and stereotypes, ushering in an era of “poverty porn 2.0”.

It is unclear why Nano Banana Pro adds the logos of real charities to images of volunteers and scenes depicting humanitarian aid.