
Meta showed ads for child pornography deepfakes

Facebook and Instagram have displayed advertising material for a deepfake porn app that uses a doctored image of an underage Hollywood actress.

The app in question advertised that it could undress women using artificial intelligence.

A review of the ad library of Meta, the parent company of the two social networks, showed that the company behind the app had run 11 ads featuring a doctored image of series star Jenna Ortega taken when she was 16 years old, NBC reported Tuesday.

The ads appeared on Facebook, Instagram and Messenger throughout much of February but have since been removed.

This was not the company's first ad campaign on Meta's platforms; Meta did not act until NBC warned it that the image of Ms. Ortega in question constituted child pornography.

Rapid rise in pornographic deepfakes

According to a Canadian study published in 2021, the majority of deepfakes found online are pornographic, and the people featured in them rarely consent to their creation or publication.

Although most of these deepfakes target adult women, a 2023 Internet Watch Foundation study shows that artificial intelligence is increasingly being used to create pornographic images of minors.
