Singapore women fall victim to DeepNude app

Using app to doctor photos of people to make them appear naked is a criminal offence similar to taking a nude photo, lawyers warn

She posted an innocuous selfie on social media more than a year ago. Last week, her photo appeared on a sex forum with a startling difference: she was naked.

Yesterday, the 27-year-old woman, who wanted to be known only as Rose, was aghast when The New Paper told her that her photo had been doctored using artificial intelligence to show her in the nude.

She is not alone. Over the past week, dozens of women in Singapore have had their pictures on social media stolen, doctored and uploaded to the sex forum.

Some of these pictures have been compiled and recirculated on pornographic sites, with more additions every day.

The photos are believed to have been doctored using a version of the DeepNude app, which was launched several months ago.

It uses artificial intelligence to make women appear naked.

Its creators, who listed their location as Estonia, shut down the application last month following an uproar on social media.

They said the app was meant only for entertainment and they had not expected it to go viral.

Several versions of the software have since surfaced online. Versions of the app have been shared via download links on the sex forum, which has a high number of visitors from Singapore.

Forum users could also submit photos and request that those with the software doctor them. The doctored pictures would then be uploaded and circulated.

Rose believes the circulation of her doctored photo led to a recent surge in followers on her social media accounts.

She said: "This is so disgusting, disrespectful and perverted. What if someone else did this to their mothers, sisters, wives or girlfriends?"

She now intends to make her social media accounts private.

Lawyers told TNP that while there has not been any reported prosecution here over the use of the DeepNude app, using it to doctor photos to make people appear naked is a criminal offence under the law.

Lawyer Fong Wei Li said that in the eyes of the law, creating such pictures is no different from taking an actual nude photo.

He said: "It is fundamentally the same, causing the same kind of harassment and alarm. At the end of the day, the content is obscene by objective standards."

While the use of artificial intelligence is a seemingly new aspect, the broad definitions of current law allow for prosecution, even if those responsible hide behind anonymous usernames online, Mr Fong added.

"Anonymity makes it difficult but not impossible to identify them. With their resources, the police can break through the barrier of anonymity to identify the people responsible," he said.

Lawyer Gloria James said that under the Films Act, anyone who creates such pictures can be fined up to $40,000, jailed for up to two years, or both.

The culprits can also be charged with insult of modesty, for which they face a jail term of up to a year, a fine, or both.

Both lawyers said that victims can use the Protection from Harassment Act to take out protection orders against online users, even if they appear to be anonymous.

Mr Fong said: "As long as the person is still identifiable via a username, the victim can still make an application. There are limitations, but it doesn't mean nothing can be done."

National University of Singapore sociologist Tan Ern Ser said there will always be demand for an app such as DeepNude.

He said: "Like foodies who discover some great food and beverage places, consumers of such products are similarly eager to share their discoveries with like-minded people on their social networks."

He added that such an app might also affect how people interact online, making them more cautious about sharing their photos and becoming less trusting of others.

Prof Tan also said that banning such apps might not be the most effective way of dealing with the exploitation of new technologies.

"They can be banned the way pornographic materials are, but the more effective approach is for people to internalise socially acceptable values with respect to sexual exploitation," he added.

This article was first published in The New Paper. Permission required for reproduction.
