Photo/Illustration: A sign at a JR East station says a security system based on facial recognition technology is in operation. (Provided by East Japan Railway Co.)

Since summer, East Japan Railway Co. has been using thousands of security cameras with facial recognition technology at 110 major railway stations and other facilities in the Tokyo metropolitan area to detect suspicious people.

While JR East says the system is part of its safety measures, unregulated use of the technology in public spaces raises concerns about invasions of privacy.

Establishing regulations for the technology is an urgent task. Such rules must clearly define the conditions under which public or private actors may use facial recognition software to identify individuals.

Although the company did not reveal what types of individuals would be monitored, the Yomiuri Shimbun reported last month that JR East’s network of facial recognition cameras was designed to detect certain people who have been released from prison.

Following the report, the company suspended the operation of the system, saying “a social consensus has not yet been reached” on use of the technology.

But JR East says it will continue feeding facial data of people acting in a suspicious manner, such as loitering in stations, into the system's database.

If cameras detect the faces of such people, security guards may question them or check their belongings.

In Japan, collecting and using an individual's facial biometric data for algorithms that detect and identify faces through mathematical calculations does not legally require that person's consent.

In addition, there are no rules governing the management and application of such data collected without the consent of the individuals. This is a highly disturbing situation.

Two years ago, three bookstores in Tokyo made headlines when they jointly established a system to register and share facial data of suspected shoplifters. The system alerts clerks when a suspect is detected in their stores.

The European Union, which is highly sensitive to privacy protection, in 2018 prohibited the collection of facial recognition data in principle. EU member states are required to enact legislation defining the requirements for exceptions to the ban.

In the United States, which places importance on the free distribution of information, some states and cities have laws and regulations that ban or limit the collection of facial data.

Some people are willing to tolerate use of facial recognition systems to prevent crimes, like those introduced by JR East and the bookstores.

But errors in facial recognition or abusive use of the data could lead to dire consequences.

This possibility makes a convincing case for legal and regulatory rules covering the registration of data in a facial recognition system, the permitted period for storing such data, mechanisms to check for violations, and penalties for offenses.

These rules should be backed by effective enforcement measures.

This approach requires a highly independent watchdog to monitor and scrutinize use of facial recognition in both the public and private sectors.

The Personal Information Protection Commission of the Cabinet Office currently performs the watchdog role. The credibility of the commission, however, has been undermined by the revelation that it approved, without much consideration, JR East's use of information about released prisoners, a category of data that requires especially careful handling.

Even if legal shortcomings contributed to the commission's careless decision, there is no disputing the panel's lack of sensitivity to the issue.

With the government now viewing the promotion of digitization as a key policy challenge, proper use of personal data is assuming increasing importance.

The controversy over JR East’s system must have made many people in Japan wary about how their facial and other personal data are actually used.

Lawmakers have a duty to create a clear system to make citizens feel safe about the protection of their privacy.

--The Asahi Shimbun, Oct. 8