Categories: Technology

Seeking to combat stereotypes, Google's Gemini AI generated historically inaccurate images

Google's Gemini tool began replacing white historical figures, such as the founding fathers of the United States or Nazi-era German soldiers, with people of color.
Figaro screenshot

Nazi soldiers of color, founding fathers of the United States of Asian descent… Faced with controversy, the company suspended the possibility of creating images of human beings.

The amplification of racist or sexist stereotypes by generative artificial intelligence is not a new problem. Companies behind software capable of generating text or images from simple text queries are regularly accused by users of perpetuating preconceived ideas. Trained on huge databases, these models tend to reproduce biases that exist in the real world.

Thus, as the site The Verge reminds us, a request such as "a productive person" often yields images of white people, while prompts linked to social hardship often produce images of Black people. But in trying to avoid this pitfall with its new Gemini AI, available in the United States since early February, Google ran into the opposite problem.

Indeed, as internet users have noted, the AI goes out of its way to provide a diverse representation of humanity. But this leads Gemini to create historically inconsistent images. Prompts such as "German soldiers in 1943", "portrait of the founding fathers of the United States" or "American senators in the 1800s" cause the AI to generate images of people of color.

Faced with the outrage, Google on Thursday disabled Gemini's ability to generate images of humans. "We are working to resolve issues with Gemini's image-generation functionality. In the meantime, we are suspending the generation of images of people and will release an improved version soon," the company explained in a statement published on its X account that day.

Difficulty correcting bias

"We know that Gemini offers inaccuracies in some of its depictions of historical images," Google apologized in a first statement published on Wednesday, admitting it had "missed the mark". "We are working to fix this type of depiction immediately."

Jack Krawczyk, product director for Gemini, said on Wednesday that, "consistent with our AI principles, Google has designed its image-generation tools to reflect the diversity of its users around the world. We will continue to do this for open-ended prompts. But historical contexts call for more nuance, and we will adapt our model accordingly."

Google isn't the only group grappling with AI bias. Software such as DALL-E, developed by OpenAI (the company behind ChatGPT), or its competitor Stable Diffusion tends, for example, to depict business leaders as men 97% of the time. At least, that is what researchers at the start-up Hugging Face, which advocates for open-source AI development, concluded last March.

To avoid the same pitfall, the latter developed a tool called Fair Diffusion, based on semantic guidance. Concretely, it lets the user steer the software and adjust the images it produces. Thus, a user who asks the AI to depict business leaders can request less biased visual suggestions, in the hope that the results will include both women and men.

