An AI that scrawls bikinis over nude photographs of women has been developed by scientists to block racy online images.
The system, built at a Catholic institute in Brazil, automatically seeks out lewd pictures and digitally adds swimwear to speed up the process of censoring images.
Researchers warned that while the AI was designed to be used for good, cyber criminals could one day reverse the process to erase bikinis from people’s photos.
Pictured are some of the AI’s successful (centre row) and unsuccessful (bottom row) attempts to censor nude images (top row)
The AI was trained by software engineers at the Pontifical Catholic University of Rio Grande do Sul using 2,000 images of women.
It is a type of AI known as a generative adversarial network, which ‘learns’ to perform tasks by recognising patterns commonly found in a set of images.
Researchers first fed their system pictures of models in bikinis to teach it what the swimwear looks like.
It was then given photos of nude women to show it where bikinis should be placed on uncensored images.
The AI connected the patterns learned from each set of images to create a system that automatically adds bikinis to racy photos.
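The two-domain training described above can be sketched in miniature. The toy below is our own illustration, not the researchers’ code: a one-parameter ‘generator’ learns to map numbers from one domain (clustered around -1, standing in for the uncensored pictures) into another domain (clustered around +1, standing in for the swimwear pictures), while a ‘discriminator’ tries to tell the generator’s output apart from real samples of the target domain.

```python
import numpy as np

# Toy 1-D generative adversarial set-up (our own simplification; the real
# system operates on full images, not single numbers).
# Domain X: samples around -1.  Domain Y: samples around +1.
# Generator G(x) = w*x + b tries to make X samples look like Y samples.
# Discriminator D(y) = sigmoid(u*y + c) tries to tell real Y from G's output.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = 1.0, 0.0        # generator parameters
u, c = 1.0, 0.0        # discriminator parameters
lr = 0.05

for step in range(2000):
    x = rng.normal(-1.0, 0.1, size=64)       # domain X batch
    y = rng.normal(+1.0, 0.1, size=64)       # domain Y batch
    y_hat = w * x + b                        # generated "Y-like" samples

    # Discriminator step: push D(real) towards 1 and D(fake) towards 0.
    d_real, d_fake = sigmoid(u * y + c), sigmoid(u * y_hat + c)
    u -= lr * (np.mean((d_real - 1) * y) + np.mean(d_fake * y_hat))
    c -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator step: push D(fake) towards 1 (non-saturating loss).
    d_fake = sigmoid(u * (w * x + b) + c)
    g_common = (d_fake - 1) * u              # dLoss/dy_hat for -log D(y_hat)
    w -= lr * np.mean(g_common * x)
    b -= lr * np.mean(g_common)

# After training, feeding a typical X sample (-1) through the generator
# should land it well inside the Y domain, far above its starting point.
print(w * -1.0 + b)
```

At equilibrium the discriminator can no longer distinguish the generated samples from the real ones, which is the point at which the mapping between the two domains has been learned.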
Project scientist Dr Rodrigo Barros told the Register: ‘When we train the network, it attempts to learn how to map data from one domain – nude pictures – to another domain – swimsuit pictures.
‘The distribution of the images from both domains are quite different.’
While the AI worked well at adding bikinis to some images, it was less successful with others.
It struggled to match the backgrounds between nude and swimwear photos, researchers said.
Dr Barros said: ‘One thing that we noticed in particular is that several of our swimsuit images are photo-shoot pictures, shot on a white background, while the background of the nude pictures is often quite complex.
HOW DOES ARTIFICIAL INTELLIGENCE LEARN?
AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn.
ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years.
Conventional AI uses input to ‘teach’ an algorithm about a particular subject by feeding it massive amounts of information.
Practical applications include Google’s language translation services, Facebook’s facial recognition software and Snapchat’s image altering live filters.
The process of inputting this data can be extremely time consuming, and is limited to one type of knowledge.
A newer breed of ANNs, generative adversarial networks, pits the wits of two AI systems against each other, which allows them to learn from one another.
This approach is designed to speed up the process of learning, as well as refining the output created by AI systems.
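The box’s claim that ANNs ‘learn to recognise patterns’ can be shown with the smallest possible example. The snippet below, our own toy rather than anything from the study, trains a single artificial neuron with the classic perceptron learning rule until it has learned the pattern of the logical AND function.

```python
import numpy as np

# A single artificial neuron learning the logical-AND pattern via the
# perceptron learning rule (our own minimal illustration of how ANNs
# "learn to recognise patterns" from examples).

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)   # AND truth table targets

w = np.zeros(2)
bias = 0.0
for _ in range(20):                       # repeat over the training set
    for xi, ti in zip(X, t):
        pred = float(w @ xi + bias > 0)   # neuron's current guess
        w += 0.1 * (ti - pred) * xi       # nudge weights towards target
        bias += 0.1 * (ti - pred)

preds = (X @ w + bias > 0).astype(int)
print(preds)                              # → [0 0 0 1]
```

The same principle, scaled up to millions of weights and fed images instead of truth tables, underlies the pattern recognition in systems like the one built in Brazil.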
‘The network, therefore, implicitly learns to associate “background types” for each domain. Thus, when translating, the background is often disturbed significantly.’
He added that the AI was developed to test out a ‘novel’ way of censoring images on the internet.
Researchers warned that while the system was developed with good intentions, it could be used by cyber criminals to create fake pornography.
The technology could be reversed to remove bikinis from people’s photos.
Dr Barros added: ‘Once training is done, we can safely discard the model that maps from domain Y (swimsuit pictures) to domain X (nude pictures) and keep only the model that we are interested in.
‘We would like to emphasize that removing clothing was never our intended goal, it is merely a side-effect of the method, which is discarded anyway.’
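Dr Barros’s point is that this kind of training produces mappings in both directions, and the unwanted one can simply be thrown away. The sketch below is our own drastic simplification: the two domains are one-dimensional number clouds, each mapping is a simple affine function fitted by moment matching rather than a learned image network, and the Y-to-X direction is discarded after fitting, just as the quote describes.

```python
import numpy as np

# Two 1-D "domains" standing in for the two image collections
# (our own toy; the real system learns image-to-image networks).
rng = np.random.default_rng(1)
X = rng.normal(-1.0, 0.1, 1000)   # domain X ("uncensored")
Y = rng.normal(+1.0, 0.1, 1000)   # domain Y ("swimwear")

def fit_map(src, dst):
    """Affine map sending src's sample mean/std onto dst's."""
    a = dst.std() / src.std()
    b = dst.mean() - a * src.mean()
    return lambda v: a * v + b

g = fit_map(X, Y)   # X -> Y: the direction the researchers keep
f = fit_map(Y, X)   # Y -> X: the direction the quote says is discarded
del f               # keep only the mapping of interest

# g moves typical X samples squarely into the Y domain.
print(round(g(X).mean(), 1))   # → 1.0
```

Discarding `f` loses nothing the censoring task needs, which is exactly the safeguard Dr Barros describes, though, as the researchers concede, anyone retraining the system could just as easily keep the other direction.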