Photo: Children at Meijo University Senior High School in Nagoya’s Nakamura Ward discuss the sexual exploitation of minors through social media on July 20, while students from Fujita Health University listen. (Umeka Yoshinaga)

NAGOYA--A new smartphone app aims to prevent strangers from exploiting youth by luring them into sending nude or risque selfies.

Fujita Health University in Toyoake, Aichi Prefecture, teamed up with Tokyo-based Smartbooks Inc. and the Aichi prefectural police to develop an app that removes sexually explicit photos from phones. It will be available for free by the end of the year, the partners jointly announced.

According to data from the National Police Agency, 1,458 minors under age 18 became victims of child pornography in 2021.

More than 30 percent of them, or 514, were tricked into sharing sexually explicit images online.

The number has doubled since 2012, when counting started, and has topped 500 for the fifth year in a row. About 90 percent of the victims were junior and senior high school students.

Because many minors are victimized through social networking sites, the Aichi prefectural police last fall sought advice on preventive measures from an organization running a program to nurture young entrepreneurs.

Fujita Health University, which was involved in that program, came forward, and Smartbooks, headed by Naoto Tomita, 25, a visiting lecturer at the university, set out to develop a countermeasure.

The app under development, tentatively named “Kodomamo,” will use artificial intelligence to check photo libraries on smartphones for anything that might be sexually explicit.

The app automatically deletes any images it finds that show exposed skin, genitals or undergarments. If it discovers pictures that fall into a gray zone, such as those showing swimwear, it will notify the user’s guardians.

Since junior and senior high school students often share photos taken with their phone cameras through multiple apps, such as the popular photo-sharing service Instagram, Kodomamo will examine all data stored in image folders to remove anything considered inappropriate.
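The article does not describe Kodomamo’s internals, but the behavior outlined above amounts to a simple triage policy over every image folder on the device. The Python sketch below is purely illustrative: the classifier, labels, file types, folder list and notification channel are all hypothetical stand-ins, not the developers’ actual design.

```python
from enum import Enum
from pathlib import Path

class Verdict(Enum):
    EXPLICIT = "explicit"    # exposed skin, genitals, undergarments -> delete
    GRAY_ZONE = "gray_zone"  # e.g. swimwear -> notify a guardian
    SAFE = "safe"            # leave the photo untouched

def classify(path: Path) -> Verdict:
    # Stand-in for the on-device AI model; its real architecture,
    # labels and thresholds are not described in the article.
    return Verdict.SAFE

def notify_guardian(path: Path) -> None:
    # Stand-in for however the app would reach a guardian.
    print(f"Guardian alerted to gray-zone image: {path}")

IMAGE_SUFFIXES = {".jpg", ".jpeg", ".png", ".heic"}

def scan_image_folders(roots: list[Path]) -> None:
    # Walk every image folder (camera roll, downloads from sharing
    # apps, etc.), not only photos taken directly with the camera.
    for root in roots:
        for path in root.rglob("*"):
            if path.suffix.lower() not in IMAGE_SUFFIXES:
                continue
            verdict = classify(path)
            if verdict is Verdict.EXPLICIT:
                path.unlink()          # automatic deletion
            elif verdict is Verdict.GRAY_ZONE:
                notify_guardian(path)  # flag edge cases such as swimwear
```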

But the app still has hurdles to clear, chiefly in how to deal with photos that are difficult for the AI to judge.

Questions remain, for instance, over whether it should remove all images of bare chests regardless of a person’s gender and how much skin needs to be exposed to be considered problematic.

“We are weighing whether to incorporate a function that would tell their guardians to check the images in these kinds of cases,” said Tomita.

The developers are planning to reflect the opinions of high school students in Kodomamo since they would be the app’s main users.

Students at Meijo University Senior High School in Nagoya’s Nakamura Ward tested the app on July 20.

One high schooler said, “Some form of reward should be offered to those who frequently log in to promote its usage,” while another suggested, “A feature should be added for users to contact the developers if it raises alarms over nonproblematic photos.”

Tomita said he will commit to making the app “as easy to use as possible to prevent victimization.”

Nobuhiro Suzuki, a prefectural police captain in the community safety division of Nakamura Police Station, pledged to extend maximum cooperation in developing the app.

“A feature to enable guardians to control their children’s use of smartphones properly would be helpful,” Suzuki said. “We will provide advice.”