Facebook and Instagram have carried advertising material for a deepfake pornography app that used a doctored image of an underage Hollywood actress.
The app advertised that it could undress women using artificial intelligence.
A review of the ad library of Meta, the parent company of the two social networks, showed that the company behind the app had run 11 ads featuring a sexualized, doctored image of actress Jenna Ortega taken when she was 16 years old, NBC reported Tuesday.
The ads appeared on Facebook, Instagram and Messenger throughout much of February but have since been removed.
This is not the company's first ad campaign on Meta's platforms, and Meta did not act until NBC warned it that the image of Ms. Ortega in question constituted child pornography.
According to a Canadian study published in 2021, the majority of deepfakes found online are pornographic, and the people featured in them rarely consent to their creation and publication.
Although most of these deepfakes target adult women, a 2023 Internet Watch Foundation study shows that artificial intelligence is increasingly being used to create pornographic images of minors.