Google suspends Gemini from making AI images of people after a controversy
Google has announced plans to address problems with its AI model Gemini after users complained that it was producing historically inaccurate images of people of color. In response, the company has temporarily paused Gemini's ability to generate images of people while it makes the necessary adjustments.
Google has committed to resolving the concerns surrounding Gemini, its rival to OpenAI's GPT-4, after complaints that the multimodal model's image-generation feature was "woke."
Controversy and Social Media Backlash
Social media platforms were flooded with complaints about Gemini producing inaccurate depictions of people of color in historical contexts, with users pointing to specific prompts for which the model failed to generate appropriate images.
BBC News was among the first outlets to report on the issue.
Software engineer Mike Wacker shared examples in which Gemini struggled to produce accurate images, particularly when prompted to depict historical figures such as the Founding Fathers. The results deepened concerns about the tool's reliability.
Google's Response and Future Plans
In response to the growing criticism, Google acknowledged that its image generation needed immediate improvement. The company said that accurately portraying a diverse range of people matters, given Gemini's broad user base.
Gemini's generation of images of people is paused while the updates are implemented, and Google says it aims to re-release an improved version of the feature in the near future.
The company assured users that it is actively working to correct the inaccuracies in Gemini's historical image generation, part of a broader effort to represent people of all backgrounds accurately.