[Photo: Suljo / iStock]
By Jackie Snow | 2 minute read
A new AI is designed to make NSFW images safer for work. A recent paper describes how researchers trained an AI algorithm called a generative adversarial network (GAN) to put bikinis on photos of naked women. By learning from nearly 2,000 photos of women, both nude and bikini-clad, the AI can take a photo of a woman in her birthday suit, figure out where a bikini should go, and create a new, more modest shot.
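For readers curious about the mechanics, adversarial training pits a generator (which produces the bikini-clad image) against a discriminator (which tries to tell generated images from real ones). The sketch below shows the two competing loss terms in plain NumPy; the sigmoid scores and binary cross-entropy loss are standard GAN ingredients, not details confirmed by the paper, and the score values are made up for illustration.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    """Binary cross-entropy between discriminator scores and labels."""
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

# Hypothetical discriminator scores in [0, 1], where 1 means "looks real".
d_real = np.array([0.9, 0.8, 0.95])  # scores on real bikini photos
d_fake = np.array([0.2, 0.1, 0.3])   # scores on the generator's output

# The discriminator wants real images scored as 1 and fakes as 0.
d_loss = bce(d_real, np.ones_like(d_real)) + bce(d_fake, np.zeros_like(d_fake))

# The generator wants its fakes scored as real (1), i.e. to fool the critic.
g_loss = bce(d_fake, np.ones_like(d_fake))
```

Training alternates updates to the two networks: as the discriminator gets better at spotting fakes, the generator is pushed to produce ever more convincing bikini placements.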
The team behind the work comes from the Pontifical Catholic University of Rio Grande do Sul in Brazil, a university run by the local Catholic archdiocese and the Jesuits, which may explain some of the inspiration behind the project. The researchers describe their motivation by highlighting the abundance of sexually explicit content on the web, much of it easily accessible to children. They point out that more than 90% of boys and nearly two-thirds of girls report having seen pornography before the age of 18. They write that their software could also be used to combat revenge porn, or to protect the internet experience of anyone who simply wants a PG rating.
Top row: real images (manually censored here to protect the reader). Middle row: results using the 9-block ResNet generator. Bottom row: results using the U-Net 256 generator (blurring applied to unsatisfactory results). [Photo: Machine Intelligence and Robotics Research Group, Pontifícia Universidade Católica do Rio Grande do Sul]
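The two generators named in the caption differ mainly in shape: a ResNet generator stacks residual blocks at a single resolution, while a U-Net downsamples the image and then upsamples it, concatenating each encoder feature map with the matching decoder stage via skip connections so fine detail survives the bottleneck. The toy NumPy sketch below illustrates one such skip connection; the pooling and upsampling choices here are illustrative, not the paper's exact networks.

```python
import numpy as np

def downsample(x):
    """2x2 average pooling: halves height and width."""
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def upsample(x):
    """Nearest-neighbour upsampling: doubles height and width."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def unet_skip(x):
    """One encoder/decoder level with a U-Net skip connection.

    The decoder's upsampled features are concatenated channel-wise
    with the encoder features at the same resolution, letting fine
    detail (edges, skin tones) bypass the lossy bottleneck.
    """
    encoded = downsample(x)    # (H/2, W/2, C)
    decoded = upsample(encoded)  # back to (H, W, C)
    return np.concatenate([x, decoded], axis=-1)  # (H, W, 2C)

img = np.random.rand(256, 256, 3)  # a 256x256 RGB "photo"
out = unet_skip(img)
```

The "256" in "U-Net 256" refers to the input resolution; a real U-Net repeats this encode/decode pattern several times with learned convolutions at each level.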
They argue that internet platforms would not have to simply block images containing nudity; instead, they could seamlessly cover up the naughty content. Previous algorithms, trained on hundreds of thousands of images, were designed to detect adult content, the paper notes, but so far "no work" has attempted to automatically censor nude content. They also argue that approaches simpler than bikinis, such as placing black boxes over the genitals, mean "people who consume the content can still see the censorship."
They write: "The motivation behind this work is to avoid disrupting the user experience when the content being consumed may occasionally contain explicit material." Their study, "Seamless Nudity Censorship: An Image-to-Image Translation Approach Based on Adversarial Training," debuted this month at the IEEE International Joint Conference on Neural Networks in Rio de Janeiro.
The study joins a long history of censoring images of women deemed insufficiently clothed. Facebook and other online platforms use AI software and human moderators to ban nudity, sometimes in overzealous and inconsistent ways. Conservative communities around the world edit photos to add more clothing, from high school yearbooks to album covers. Examples of men being covered up, though they exist, are far rarer. (The researchers, who downloaded their dataset from a torrent website, say they didn't use male images because of time constraints, but plan to add them in future iterations.)
There are some potential pitfalls. The technique could be tweaked and run in reverse, taking a picture of a woman wearing a bikini and putting anatomically plausible body parts in its place. "What we would like to stress is that undressing people was never our goal; it is just a side effect of the method," one of the paper's authors, Rodrigo Barros, reportedly told The Register.
People could end up using such software to create a new kind of deepfake, the doctored media that gained widespread attention after videos began circulating in which AI was used to insert unsuspecting victims (usually celebrities) into porn movies. But given this AI's limitations (the computer-drawn bikinis sometimes look lopsided or glitchy), purveyors of fake porn may be better off hunting for real nude photos than trying to reverse-engineer bikini snapshots, at least for the time being.