You may have heard something about the “deepnudeapp” (the “App”). The App is a piece of software that makes photographs of women (and only women) appear naked.
According to various online reports, the App is a recent creation (written a number of months ago), launched on a website that offered paid downloads for the Windows and Linux platforms. Apparently, the App was created in Estonia and sold at US$50 a copy. In a June 2019 article by Vice, one of the App’s creators (who remained anonymous, going by the pseudonym “Alberto”) had this to say:
“I’m not a voyeur, I’m a technology enthusiast,… Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.”
Unprompted, he said he’s always asked himself whether the program should have ever been made: “Is this right? Can it hurt someone?” he asked.
“I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial),” he said, noting that DeepNude doesn’t transmit images itself, only creates them and allows the user to do what they will with the results.
“I also said to myself: the technology is ready (within everyone’s reach),” he said. “So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”
On 27 June 2019, after the article was published, the deepnudeapp was taken down; its creators announced the decision in a post on the App’s Twitter account, @deepnudeapp.
See the Vice article reporting on the removal of the App here. The creators apparently said that they had greatly underestimated demand for the App and never thought it could go viral.
Really? That’s a little hard to believe.
As reported today in The New Paper (10 July 2019), the terrifying possibilities posed by the App have reached Singapore. According to the report, several App-generated photos of local female victims have been circulating, prompting some to set their social media accounts to private to prevent further access to their photos. It is unclear how many people are presently affected, and whether all the victims are aware that their manipulated pictures are still in circulation.
How does the App work?
First, keep in mind that while the creators may have officially taken down the App, it remains in circulation on the web and can easily be obtained. We have personally trawled the web and come across various links offering the App.
According to an online source, the App works in three steps. First, it creates a mask over the figure of the female subject in the picture it is given. Second, it generates an abstract representation of the body shape, anticipating where the various body parts will be and at what angles. Third, it searches for existing pictures of such body parts, composites them onto the mask, and generates the fake nude photo.
A screenshot of the App shows how its core algorithm generates a deepnude photo through these stages. Apparently, the creators chose to limit the App to women because naked pictures of women were easier to find online to build the bank of images the App needs.
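For readers who find a skeleton easier to follow than prose, the reported three-step flow can be sketched in Python as a set of stubbed stages. To be clear, this is our own illustrative outline of the description above: every name is our assumption, and every stage is deliberately an empty placeholder rather than working code.

```python
# Our own skeletal sketch of the three reported steps; these are not the
# App's actual functions, and every stage is a non-functional stub.
from dataclasses import dataclass


@dataclass
class Image:
    """Stand-in for image data (e.g. an array of pixels)."""
    pixels: bytes = b""


def create_mask(photo: Image) -> Image:
    """Step 1: mask the figure of the subject in the input picture."""
    return Image()  # stub


def estimate_body_shape(mask: Image) -> Image:
    """Step 2: build an abstract representation of the body shape,
    anticipating where the body parts sit and at what angles."""
    return Image()  # stub


def composite_body_parts(shape: Image, mask: Image) -> Image:
    """Step 3: composite matching imagery from a bank of existing
    pictures onto the mask to produce the fake photo."""
    return Image()  # stub


def reported_pipeline(photo: Image) -> Image:
    """The three reported steps, end to end."""
    mask = create_mask(photo)
    shape = estimate_body_shape(mask)
    return composite_body_parts(shape, mask)
```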
Presently, a normal computer (we assume this means a home computer) can process a deepnude photo on the App in approximately 30 seconds.
As we understand from various other reports, the process and algorithm adopted by the App are closely similar to the way deepfake photos and videos are generated. The App is another variant, perhaps better regarded as an evolution, of deepfake software.
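The family resemblance lies in the underlying building block: a neural network (a “generator”, typically trained adversarially, as in pix2pix-style image-to-image translation) that maps an input photograph to a synthesised output image. The toy PyTorch sketch below shows only that generic mapping; the tiny architecture is a placeholder of our own and bears no relation to the App’s actual model.

```python
# A toy illustration of the generic building block behind deepfake-style
# tools: a generator network that maps one image to another. The
# architecture here is a deliberately trivial placeholder.
import torch
import torch.nn as nn


class ToyGenerator(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
            nn.Tanh(),  # scale outputs to [-1, 1], a common convention
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


generator = ToyGenerator()
with torch.no_grad():
    source = torch.rand(1, 3, 256, 256)  # stand-in for an input photo
    output = generator(source)           # stand-in for a synthesised image
print(output.shape)  # torch.Size([1, 3, 256, 256])
```

In real systems, a generator of this shape is trained against a discriminator on large image datasets; the roughly 30-second processing time mentioned above would plausibly correspond to running a trained generator, plus pre- and post-processing, on a single photo.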
The effects and what can be done – in Singapore
Across the world, deep outrage has been expressed at what the App can do.
We note that the earlier-cited 27 June 2019 Vice article engaged in some self-censorship to minimise the harm that could result from misuse of the App. As Vice states:
“Editor’s note, June 27 1:05 p.m. EST: This story originally included five side-by-side images of various celebrities and DeepNude-manipulated images of those celebrities. While the images were redacted to not show explicit nudity, after hearing from our readers, academic experts, and colleagues, we realized that those images could do harm to the real people in them. We think it’s important to show the real consequences that new technologies unleashed on the world without warning have on people, but we also have to make sure that our reporting minimizes harm. For that reason, we have removed the images from the story, and regret the error.”
Quite plainly, the harm that could be caused is psychological and reputational, amongst potentially many other types of harm.
A quick review of various statutes in Singapore indicates that using the App to create a deepnude of a woman, intending that the final product be circulated, is very likely illegal, and a single act may well amount to several offences, such as the following:
- Criminal defamation under the Penal Code:
- Under Section 499 of the Penal Code: Whoever, by words either spoken or intended to be read, or by signs, or by visible representations, makes or publishes any imputation concerning any person, intending to harm, or knowing or having reason to believe that such imputation will harm, the reputation of such person, is said, except in the cases hereinafter excepted, to defame that person.
- Explanation 4.—No imputation is said to harm a person’s reputation, unless that imputation directly or indirectly, in the estimation of others, lowers the moral or intellectual character of that person, or lowers the character of that person in respect of his calling, or lowers the credit of that person, or causes it to be believed that the body of that person is in a loathsome state, or in a state generally considered as disgraceful.
- This is punishable with imprisonment for a term which may extend to 2 years, or with fine, or with both.
- Harassment under the Protection from Harassment Act:
- Sections 3 and 4 of the Protection from Harassment Act plausibly criminalise the communication of an image or visual representation (which could well include a deepnude photograph), either where it is communicated with the intention of causing harassment, alarm or distress to the victim (Section 3), or where it is heard, seen or otherwise perceived by someone likely to be caused harassment, alarm or distress (Section 4).
- Section 3.—(1) No person shall, with intent to cause harassment, alarm or distress to another person, by any means —
(a) use any threatening, abusive or insulting words or behaviour; or
(b) make any threatening, abusive or insulting communication,
thereby causing that other person or any other person (each referred to for the purposes of this section as the victim) harassment, alarm or distress.
- Section 4.—(1) No person shall by any means —
(a) use any threatening, abusive or insulting words or behaviour; or
(b) make any threatening, abusive or insulting communication,
which is heard, seen or otherwise perceived by any person (referred to for the purposes of this section as the victim) likely to be caused harassment, alarm or distress.
- As to punishment: an offence under Section 3 carries a fine not exceeding S$5,000, imprisonment for a term not exceeding 6 months, or both; an offence under Section 4 carries only a fine not exceeding S$5,000.
- Obscene publications under the Undesirable Publications Act:
- Under Section 11 of the Undesirable Publications Act: Any person who —
(a) makes or reproduces, or makes or reproduces for the purposes of sale, supply, exhibition or distribution to any other person;
(b) imports or has in his possession for the purposes of sale, supply, exhibition or distribution to any other person; or
(c) sells, offers for sale, supplies, offers to supply, exhibits or distributes to any other person,
any obscene publication (not being a prohibited publication) knowing or having reasonable cause to believe the publication to be obscene shall be guilty of an offence and shall be liable on conviction to a fine not exceeding $10,000 or to imprisonment for a term not exceeding 2 years or to both.
We note that criminal offences fall within the purview of the Police and the Attorney-General’s Chambers, and there is little more a victim can do once a police report is made. We also stress that the authorities may not agree with us on what constitutes criminal conduct.
However, we would emphasise that a victim may be able to take out proceedings herself under the Protection from Harassment Act.
Under Section 15 of the Protection from Harassment Act, a victim can apply to the District Court for an order to cease publication of, or take down, the photograph from online sites. Whether an application will succeed, however, is hard to say: it turns on whether the term “statement” includes visual representations and photographs made by the App. We believe the grounds for success are strong. It is difficult to believe that the statute intended “statement” to be limited to words alone; a statement can be conveyed in a variety of ways, not necessarily in writing. If a distributed audio recording would be sufficient to trigger the remedies under the Act, then why not a picture?