The AI that “undresses with just one click” doesn’t even spare children

The film "Soul" says that when you go looking for the ocean, you should realize you are already living in the water.

The same seems to be true for AI in human society.

Since the beginning of this year, "iPhone moments" have kept happening across industries. Runway turns still images into motion, Pika redraws selected regions, HeyGen makes foreigners speak fluent Chinese; one act takes the stage as soon as another steps down.

In a corner away from the spotlight, the seemingly mundane face-swapping Deepfake is expanding its territory, hitting everyone from celebrities and presidents to middle school students and ordinary people. It is neither dazzling nor disruptive, but it is enough to make us break out in a cold sweat.

One-click undressing, and getting away with it

"I saw your naked photo."

In September, on the first day back at school, a boy walked up to 14-year-old Isabel and said these words to her. That morning the campus was abuzz with gossip, news spread through group chats, and "explicit pictures" of female classmates were circulating on almost everyone's phone.

This really happened in southwestern Spain, and it made Almendralejo, a small town of only about 30,000 people, known internationally. The town has five middle schools, and the "nude photos" circulated in at least four of them.

The cause of the incident is not complicated. A group of malicious boys fed photos of their female classmates, taken from social media, into a "one-click undressing" AI tool.

▲The AI tools they used.

The tool works through a mobile app or the instant messaging software Telegram. Its slogan is "Undress anybody for free"; all it takes is a photo of the person in your phone's album.

Although it is advertised as free, generating 25 nude images costs 10 euros, payable with Visa, PayPal, Mastercard and so on, as convenient as using WeChat Pay or Alipay in China. These payment channels were deactivated after media exposure.

Spanish police confirmed at least 30 victims, mainly girls aged 12 to 14. Most of the perpetrators knew them and were minors themselves: at least 10 of them, some under 14 and therefore too young to face criminal charges.

They set up group chats on WhatsApp and Telegram to spread the "nude photos", and used Instagram to threaten victims, extorting "ransom" money and real nude photos.

The mother of one victim felt her heart sink when she saw the "nude photo" of her daughter: "This photo looks real."

▲ A mother called on more victims to come forward.

In October, a similar incident occurred at a high school in New Jersey, with about 30 victims; their male classmates had made the "nude photos" over the summer vacation. Many girls cried with anger. One victim had never thought Deepfake could have anything to do with her:

I never imagined this would happen to me as a student; AI never even crossed my mind. It just feels like some monster on the internet.

The principal assured everyone that all the pictures had been deleted and would not be circulated again. The perpetrator was suspended for a few days and then returned to the "scene of the crime" as if nothing had happened.

Many parents were unhappy to let it go, but there was nothing they could do. To this day, Deepfake remains a legal gray area in many places.

In the United States, Deepfake images of minors are banned. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for secretly filming patients and "undressing" them with one click.

But there are exceptions. If the perpetrator is also a minor, the school may handle the matter leniently; at the New Jersey high school, the penalty was only a suspension. A month after the incident, the victims and their parents still did not know the perpetrators' identities, or even how many there were.

If the victim is an adult, presumed able to protect themselves, the situation is more complicated. Legal provisions differ from state to state, some criminalizing the act and others allowing civil suits, but there is currently no federal law prohibiting the production of Deepfake pornography.

One reason legislation is difficult is that some people argue that even if the subject in a Deepfake image looks like you, it is not actually you, so your privacy has not really been violated.

Yet everyone knows that although the pictures are fake, the harm is real, no different from that of real nude photos. For the people involved, and for bystanders who don't know the truth, the point is not whether a photo is 100% real, but whether it looks real.

"Ghost in the Shell", from the end of the last century, already explored such questions: someone who looks exactly like you has done something you would never do, yet it is hard for you to prove it was not you, because the data is you and the traces of your existence are you.

Pranks by ordinary people, smartphones as weapons

Put simply, Deepfake uses AI to generate seemingly real videos, audio or images that simulate things that never actually happened.

Deepfake first emerged in 2017 on Reddit, the "American version of Baidu Tieba", mainly in the form of swapping celebrities' faces onto the leads of pornographic videos, or spoofing political figures.

The earliest "one-click undressing" AI application may have been DeepNude in 2019. Back then you still had to download software for Windows 10 or Linux devices, and the servers would crash when too many people piled in.

But now "one-click undressing" has become easier and more widespread. The services are delivered as apps and monetized through common payment methods. It has become a prank that ordinary people play on ordinary people: it has already happened in middle schools, and it could happen to anyone.

With just a photo, an email address and a few dollars, you can strip the "clothes" off celebrities, classmates and strangers in batches, with results ever harder to tell from the real thing. The images used for "undressing" are often taken from social media without the subject's consent, and then spread without their knowledge.

The father of a 14-year-old Deepfake victim in Spain put it bluntly:

Now, the smartphone can be considered a weapon.

While the law lags behind, technology platforms, pressed by public opinion, have made some remedial moves, but these only put out fires after they start.

Google removed relevant ads and adjusted its ranking system, and allows victims to request removal of such content after providing evidence; TikTok blocked keywords such as "undress"; Reddit banned multiple domain names; Telegram relies on automated monitoring and user reports to delete content that violates its terms of service…

But roadblocks are no deterrent. Deepfake has grown especially fast this year, as if riding Musk's Starship.

According to statistics from the social network analysis company Graphika, in September alone 34 "one-click undressing" platforms drew more than 24 million visitors in total, and 52 Telegram groups offering access to the service contained at least 1 million users.

The teams behind them are also savvy, well versed in marketing and monetization. They advertise on social media to put the service in front of "the destined ones". Since the beginning of this year, the number of "one-click undressing" advertising links on X and Reddit has increased by more than 2,400%.

The maturity of the industry chain led Graphika to a clear conclusion. Deepfake has shed its old identity and become something else entirely:

AI-generated undressing images have moved from niche porn forums to a large-scale, commercialized online business.

Graphika sees another important reason Deepfake took off again this year: open source AI image diffusion models keep gaining capability and getting easier to use.

These open source models have fueled a "high quality, low price" Deepfake market. Without them, developers of "one-click undressing" applications would have to host, maintain and run their own custom image diffusion models, which takes longer and costs more.

The FBI, so formidable in crime films, could no longer sit still. In June this year it issued a carefully worded warning telling the public to beware of pornographic Deepfake images and videos, which are usually spread openly on social media or porn sites and used for harassment and extortion. In the end, the best defense is to stop the loss at the source:

Once content is shared on the internet, it can be extremely difficult, if not impossible, to remove after other parties have circulated or reposted it.

Real opportunities, absurd fun

ABC News once emailed a team that develops "one-click undressing" tools, and their reply left one at a loss for words.

They claim they build this kind of application mainly for fun, to process photos of themselves or their friends; everyone has a laugh, and no one is ashamed of nudity anymore, since it is all done with AI anyway.

Then why, in reality, do the middle school boys and other perpetrators not just prank one another, or at least seek the victims' consent first? If the team truly stands above worldly shame, it needn't play dumb; why not offer the service specifically for corporate team building, so everyone can cast off their shame, get motivated, and truly treat the company as home?

But it is, indeed, a point of view: if you are naked, they are naked, and I am naked, then no one is naked, and there is no need to panic at a nude photo ever again. Just insist it is an AI face swap and smile it off. Once the very concept of pornography is gone, as the Zen verse goes, "originally there is not a single thing, so where could dust alight?"

I don't know how advanced a society would have to be for such harmony. In any case, recent statistics found that even if everyone is supposedly equal before the undressing tool, most services work only on women. So the "one-click undressing" business is not so high minded after all; it comes tinged with gender.

And mere "one-click undressing" is not enough to satisfy every desire.

In September this year, a South Korean man in his 40s was sentenced to two and a half years in prison for using AI to generate approximately 360 child pornography images, charged with violating the Child and Youth Protection Act. It was the first time a South Korean court had handed down a sentence in such an AI case.

The Korea Herald reported that he used prompts such as "10 years old", "nude" and "child" to generate the images. The court ruled they were realistic enough, rejecting the defense's argument that they could not be considered sexual exploitation.

For those who cannot DIY, some websites that share AI models offer more "thoughtful" services: enter a celebrity's name in the text box, add prompts like "nude", and you get a blurry image to whet your appetite; pay to unlock the full version, honest dealing for young and old alike.

Some sharp-eyed practitioners in the adult industry see danger and opportunity at once: AI may replace them, or become their colleague.

Fanvue is a subscription platform similar to OnlyFans. It is bullish on AI-generated creators, because AI simplifies the process and can create characters who do not exist in reality; bolder than an AI girlfriend who only chats under the covers, and more original than a celebrity Deepfake.

▲ Picture from: Fanvue

One of these AI characters, Emily Pellegrini, has nearly 120,000 followers on Instagram and is also an adult content creator. She is already making money on Fanvue, earning more than $9,600 in six weeks through subscriptions, pay-per-view content and more.

If this trend continues, why would Luo Ji, the Wallfacer in "The Three-Body Problem", bother searching a sea of people for his ideal lover Zhuang Yan? He could customize her without much trouble, then explain to everyone that it is all part of the plan.

Besides platforms with large user bases, there are also independent sex workers who actively cooperate with AI, because they understand one truth:

If I don't participate, others will misuse my image.

There are two main forms of cooperation. One is a chatbot that will talk about anything, trained on YouTube videos, podcasts, interviews and restricted content, charging a monthly subscription of tens of dollars. The other is personalized custom images, ranging from PG-13 fare viewable by minors to 18+, with the service tailored to each customer's taste.

Some practitioners are also planning ahead, hoping to offer immersive VR experiences in the future and to meet demands that even real people cannot.

There is nothing wrong with the appetites for food and sex. As Mustapha Mond says in "Brave New World": "Civilized people have no need to endure any discomfort."

Mustapha Mond is the most clear-eyed and ruthless character in that dystopian novel. He understands every ill and every suffering, understands that the Savage is pursuing the right to be unhappy, and knows how to keep the vast majority living comfortably by every means, including routine chemicals that help people simulate the necessary emotions.

If all this is unavoidable, we should at least do our best to protect the children first. And if AI porn really lets humanity achieve a great liberation of the body and of shame, such works might as well carry a reminder: no real humans were harmed in the process.

