See for yourself how biased AI image models are with these new tools


One theory for why this might be is that non-binary brown people may have had more visibility in the press recently, which means their images end up in the data sets used to train AI models, Jernite says.

OpenAI and Stability.AI, the company that built Stable Diffusion, have introduced tweaks to their systems to address ingrained biases, such as blocking certain queries that appear likely to generate offensive images. However, these new tools show how limited those fixes are.

A spokesperson for Stability.AI told us that the company trains its models “on data sets specific to different countries and cultures,” which “should serve to reduce bias caused by over-representation in general data sets.”

An OpenAI spokesperson did not comment specifically on the tools, but referred to a blog post explaining how the company has added techniques to DALL-E 2 to filter out biased, sexual, and offensive images.

As these AI models become more widely adopted and produce ever more realistic images, bias is becoming an increasingly pressing problem. They are already being rolled out in products such as stock photos. Luccioni says she fears the models will reinforce harmful biases on a large scale. She hopes the tools she and her team have developed will bring more transparency to image-generating AI systems and highlight the need to make them less biased.

Part of the problem is that these models are trained on mostly US-centric data, which means they largely reflect American associations, biases, values, and culture, says Aylin Caliskan, an assistant professor at the University of Washington, who studies bias in AI systems and was not involved in this study.

“The end result is that this online American culture … ends up being perpetuated all over the world,” says Caliskan.

Caliskan says Hugging Face's tools can help AI developers understand and reduce bias in their AI models. “I believe that when people see these examples firsthand, they will better understand the importance of these biases,” she says.
