Unveiling Cultural Biases in Meta AI Image Generation

Published On Fri May 10 2024
Meta AI is obsessed with turbans when generating images of Indian men

Bias in AI image generators is a well-studied and well-reported phenomenon, but consumer tools continue to exhibit glaring cultural biases. The latest culprit in this area is Meta's AI chatbot, which, for some reason, really wants to add turbans to any image of an Indian man.

The company rolled out Meta AI in more than a dozen countries earlier this month across WhatsApp, Instagram, Facebook, and Messenger. It has also begun rolling out Meta AI to select users in India, one of its biggest markets worldwide.

Cultural Biases in Image Generation

TechCrunch runs various culture-specific queries as part of its AI testing process, which is how we found, for instance, that Meta is blocking election-related queries in India because of the country's ongoing general elections. But Imagine, Meta AI's new image generator, also displayed a peculiar predisposition toward generating Indian men wearing a turban, among other biases.

Testing and Results

We tested different prompts and generated more than 50 images to test various scenarios — and they're all here, minus a couple (like "a German driver") — allowing us to see how the system represented different cultures. There is no scientific method behind the generation, and we didn't take inaccuracies in object or scene representation beyond the cultural lens into consideration.

Many men in India do wear a turban, but the proportion is nowhere near as high as Meta AI's tool would suggest. In India's capital, Delhi, you would see at most one in 15 men wearing a turban. In images generated by Meta's AI, however, roughly three to four out of every five Indian men were depicted wearing one.

We started with the prompt "An Indian walking on the street," and all the images were of men wearing turbans. Next, we tried generating images with prompts like "An Indian man," "An Indian man playing chess," "An Indian man cooking," and "An Indian man swimming." Meta AI generated only one image of a man without a turban.

Diversity and Representation

Even with the non-gendered prompts, Meta AI didn't display much diversity in terms of gender and cultural differences. We tried prompts with different professions and settings, including an architect, a politician, a badminton player, an archer, a writer, a painter, a doctor, a teacher, a balloon seller, and a sculptor.

As you can see, despite the diversity in settings and clothing, all the men were generated wearing turbans. While turbans are worn across many jobs and regions, it's strange for Meta AI to treat them as ubiquitous.

Conclusion

As with any image generator, the biases we see here likely stem from inadequate training data, compounded by an inadequate testing process. While you can't test for every possible outcome, common stereotypes ought to be easy to spot. Meta AI seemingly picks one kind of representation for a given prompt, indicating a lack of diverse representation in the dataset, at least for India.

In response to questions TechCrunch sent to Meta about training data and biases, the company said it is working on making its generative AI tech better, but didn't provide much detail about the process.

Impact and Future Considerations

Meta AI's biggest draw is that it is free and easily available across multiple surfaces, so millions of people from different cultures will use it in different ways. While companies like Meta are always working to improve how accurately their image-generation models render objects and humans, it's equally important that they work to stop these tools from playing into stereotypes.

Meta will likely want creators and users to use this tool to post content on its platforms. However, if generative biases persist, they also play a part in confirming or aggravating the biases in users and viewers. India is a diverse country with many intersections of culture, caste, religion, region, and languages. Companies working on AI tools will need to be better at representing different people.