Photo/Illustration: The Personal Information Protection Commission has proposed revisions to the Personal Information Protection Law. (Asahi Shimbun file photo)

To accelerate artificial intelligence development, the government plans to relax consent requirements for access to personal information while introducing tougher penalties for intentional misuse. 

The proposal, disclosed on Dec. 4, calls for changes to the Personal Information Protection Law, which currently requires individuals’ consent for the third-party sharing of personal data and for the collection of sensitive information, such as medical or criminal records.

Under the draft of the major amendment to the law, consent would no longer be required if the data is used solely for creating statistical information.

Business leaders have argued that strict consent rules hinder AI research, which depends on vast amounts of training data.

AI developers often collect information by automatically scanning publicly available web pages, where sensitive personal data may appear.

The revision would allow such data to be used without prior approval, provided it is processed into statistical form, rather than shared in a way that could identify individuals.

The government sees the move as part of a broader national strategy to strengthen economic security through AI.

Similar debates are unfolding globally.

Last month, the European Commission proposed allowing certain biometric and personal data to be processed for AI development without consent under specific conditions, signaling a worldwide trend toward easing restrictions.

The amendment also specifies that hospitals and clinics, in addition to research institutions, may use personal data for academic research without consent.

To protect children’s rights, the revision would mandate that consent for collecting data from individuals under age 16 must be obtained from a legal guardian rather than the child.

While easing some rules, the government also plans to introduce a new penalty system to deter violations and abuse.

Businesses that deceive individuals to collect data from more than 1,000 people, then sell that information for profit, would face fines equivalent to the illicit gains.

However, violations resulting from accidental data leaks or inadequate security management would not be subject to these penalties.

The dual approach, pairing regulatory relaxation for AI development with stricter penalties for abuse, reflects a compromise between industry demands and public concerns over privacy and human rights.

The Personal Information Protection Law is reviewed every three years, and the Personal Information Protection Commission has been discussing revisions since 2023.

The government is seeking to submit the amendment to the Diet at the earliest possible date.

(This article was compiled from reports by Kae Kawashima, Naoko Murai and Masako Wakae, a senior staff writer.)