An airbrushed webpage that lists many deepfake videos (Asahi Shimbun file photo)

Police have arrested three men on suspicion of posting hundreds of deepfake porn videos in which porn stars' faces were replaced with those of female celebrities.

Masayuki Futamata, 46, a company employee in Tokyo’s Minato Ward, Gaku Yamaguchi, 33, a company employee in Tokyo’s Setagaya Ward, and Kentaro Kubo, 23, a hotel employee in Yokohama, were charged with defamation, the Metropolitan Police Department and Chiba prefectural police announced on Nov. 19.

They apparently used face-swapping software available on the internet.

Police said each of the accused ran an adult video website. Between December 2019 and September this year, they posted links on the sites to altered porn videos in which the faces of female celebrities had been doctored in.

They made a total of 215 such videos available on three publicly accessible sites.

The titles of the videos hinted at the names of the celebrities who appeared in them.

Futamata earned more than 500,000 yen ($4,820) in ad revenue from his website, while Kubo earned more than 1 million yen, police said.

Futamata has denied the defamation charge, arguing that his website states the people featured in the videos are not celebrities, police said. But Yamaguchi and Kubo have admitted their guilt.

Police concluded that the websites run by the three men made the videos visible to an unspecified large number of viewers.

“Making such videos as well as spreading them is unacceptable. We will continue the investigation and prevent further damage,” a police representative said.

In September, the Metropolitan Police Department and Chiba prefectural police arrested two men on defamation and other charges for allegedly creating deepfake porn videos using artificial intelligence (AI).

It was the first time police had arrested the creator of a deepfake video, a type of content that has proliferated in recent years.

NOT JUST CELEBRITIES 

The police's intensified crackdown on creators and distributors of deepfake videos has come as a welcome relief to many in the entertainment industry.

Ever Green Entertainment, a Tokyo-based talent management company, issued a statement shortly after the September arrests, saying that its “talents who are the victims of these illegally created videos have been under attack and become a target of abuse as some people comment on social media that these videos are not fake.”

Shigeru Nakada, who heads the company, which manages popular actors and performers, said, “I hope not only the people who created deepfake videos but also those who have spread them feel a sense of responsibility.”

Nakada said at least one actor belonging to the company fell victim to such a video and was hurt by it.

“It is tough for the actor’s family, too,” Nakada said, pointing out that it is nearly impossible to completely delete data off the internet.

And those who think that being subjected to such attacks is just part of the price of fame are wrong.

Experts say the problem runs much deeper, and even noncelebrities are at risk, because anyone can easily make a deepfake video with readily accessible software.

Lighthouse, a Tokyo-based nonprofit organization that supports victims of pornography and human trafficking, once counseled a junior high school girl who fell victim to “aikora.”

Aikora is a contraction of “idol collage,” a technique in which porn stars’ faces in still images are replaced with those of female idols and celebrities. The technique became known in the late 1990s and exploded in popularity as the internet and personal computers became more widely available.

The student said her face had been superimposed on a naked woman’s body in a picture in which many hands reach out to touch it. She received the collaged image through a social networking site’s direct messaging tool.

She looked distressed and told staff that she had no idea who had created the image or sent it to her.

Experts say it is difficult to identify such a perpetrator, who could be a former boyfriend, a stalker, a friend holding a grudge or a stranger who snapped a picture on the street.

Keiko Segawa, who works at Lighthouse, is concerned that “there must be many uncovered cases” and that deepfake images and videos “can be used to blackmail a victim.” 

“I hope the public becomes more aware that such videos are unacceptable,” she said.