Photo/Illustration: A robot equipped with sensors grasps tofu. It was developed by Thinker, a venture company from Osaka University. (Provided by Thinker)

Artificial intelligence is in the spotlight at the Combined Exhibition of Advanced Technologies (CEATEC), a high-tech trade show that kicked off at Makuhari Messe in Chiba on Oct. 17.

The event was opened to the media a day earlier, and many companies showcased their new AI-driven technologies.

The use of AI is spreading into a wide variety of fields for many different purposes, from solving labor shortages to offering human-like “companionship.”

ROBOTS FOR THE WORKFORCE

Thinker, a venture company from Osaka University, exhibited a robotic arm capable of picking up delicate foods such as tofu and irregularly shaped vegetables, which had previously been considered difficult for robots to grasp.

“We have placed a brain at its fingertips,” said Hiromichi Fujimoto, the CEO of Thinker.

The robotic arm is equipped with four infrared sensors at its tips. The sensors measure the light reflected from an object 200 times per second, and AI accurately calculates the object’s distance and angle based on the amount of reflection.

This allows the robot to quickly discern the position and shape of the object before grasping.
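For illustration, the paragraph above suggests a reflectance-based estimate: the brighter the reflected infrared light, the closer the surface. The sketch below shows one way such a loop might be structured; the calibration constant, sensor spacing, thresholds and interfaces are assumptions for illustration, not Thinker's actual implementation.

```python
import math
import time

# Hypothetical calibration: reflected intensity from a diffuse surface is
# assumed to fall off roughly with the square of distance (illustrative only).
CALIBRATION_K = 4.0e4      # intensity units * cm^2 (assumed)
SENSOR_SPACING_CM = 1.5    # assumed spacing between the fingertip sensors

def intensity_to_distance(intensity: float) -> float:
    """Convert a reflected-light reading into an estimated distance in cm."""
    return math.sqrt(CALIBRATION_K / max(intensity, 1e-6))

def estimate_pose(intensities: list[float]) -> tuple[float, float]:
    """Estimate mean distance and surface tilt from four fingertip sensors."""
    distances = [intensity_to_distance(i) for i in intensities]
    mean_distance = sum(distances) / len(distances)
    # Tilt approximated from the difference between two opposite sensors.
    tilt_rad = math.atan2(distances[0] - distances[2], 2 * SENSOR_SPACING_CM)
    return mean_distance, math.degrees(tilt_rad)

def control_loop(read_sensors, grasp, period_s: float = 1 / 200):
    """Poll the sensors about 200 times per second, as described in the article,
    and trigger the grasp once the object appears close and level."""
    while True:
        distance_cm, tilt_deg = estimate_pose(read_sensors())
        if distance_cm < 1.0 and abs(tilt_deg) < 5.0:
            grasp()
            break
        time.sleep(period_s)
```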

A robot capable of picking up delicate items could help replenish shelves with prepared foods, potentially solving staff shortages that have plagued retailers.

Meanwhile, Lixil Corp., a housing equipment manufacturer, showcased a service that streamlines toilet cleaning.

Toilets in commercial facilities tend to be used at varying frequencies. The service uses sensors on doors and locks to check which restrooms are used most.

Sensors attached to the toilet bowls also detect urgent problems, such as clogs, and the AI then maps out the most efficient cleaning routes for staff via a smartphone app.
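The idea amounts to scoring each restroom by urgency and usage and turning the scores into a visiting order. The snippet below is a simplified stand-in for that routing step; the field names, threshold and tie-breaking rule are assumptions, not Lixil's actual service logic.

```python
from dataclasses import dataclass

@dataclass
class Restroom:
    name: str
    uses_since_cleaning: int   # counted from door and lock sensors
    has_urgent_issue: bool     # e.g. a clog reported by a bowl sensor
    walking_minutes: float     # assumed travel time from the staff station

def cleaning_route(restrooms: list[Restroom], usage_threshold: int = 30) -> list[str]:
    """Order restrooms so urgent issues come first, then the most heavily used,
    breaking ties by the shorter walk (illustrative prioritization only)."""
    due = [r for r in restrooms
           if r.has_urgent_issue or r.uses_since_cleaning >= usage_threshold]
    due.sort(key=lambda r: (not r.has_urgent_issue,
                            -r.uses_since_cleaning,
                            r.walking_minutes))
    return [r.name for r in due]

print(cleaning_route([
    Restroom("2F west", 45, False, 3.0),
    Restroom("1F east", 12, True, 5.0),
    Restroom("3F north", 50, False, 6.0),
]))
# -> ['1F east', '3F north', '2F west']
```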

Hitachi Ltd. exhibited a service that uses AI to visualize water and gas pipes buried underground, whose locations are difficult to determine without skilled workers.

This could prevent delays in construction schedules and ease the strain of the worsening labor shortage, the company said.

HUMAN-LIKE CONVERSATIONS

Many exhibits featured AI capable of providing comfort in everyday life through casual conversation or by serving as tour guides that use local dialects.

The use of generative AI that can create human-like sentences makes interactions with machines feel more natural and friendly.

One of them, a palm-sized robot shaped like a water droplet, smiled cheerfully when a reporter asked: “What should I have for lunch? I feel like fried food might be hard on the stomach.”

“Sure. Then how about pork miso soup?” the robot responded.

This robot, named Romi, was developed by IT giant Mixi Inc. It uses deep learning technology to select words according to the conversation, rather than using fixed phrases.

Romi can display more than 100 different facial expressions and emotions, the company said.

“Home robots tend to focus on convenience, so there are few robots that can engage in natural, back-and-forth conversation,” said Akira Nagaoka, who oversees Romi at Mixi. “This could become something like a human partner.”

Romi has been on sale since 2020. In a survey of users, 90 percent said they felt comforted by conversing with it.

Advanced Media Inc., Japan’s largest provider of voice recognition software, exhibited an avatar built on the generative AI ChatGPT.

The avatar can change its way of speaking and tone of voice, depending on the settings.
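The article says only that the avatar is built on ChatGPT and changes its speech and tone depending on the settings. As a hedged sketch of that "settings" idea, one could pass a dialect and tone through a system prompt to the OpenAI chat API; the model name, prompt wording and integration here are assumptions, not Advanced Media's implementation.

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

def make_guide_reply(user_text: str, dialect: str, tone: str) -> str:
    """Ask the model to answer as an in-office guide in a given dialect and tone."""
    system_prompt = (
        f"You are a guide avatar for a government office. Speak in the {dialect} "
        f"dialect with a {tone} tone, and keep answers to two or three sentences."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice, for illustration only
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_text},
        ],
    )
    return response.choices[0].message.content

# Example: print(make_guide_reply("Where is the citizens' affairs counter?", "Ibaraki", "warm"))
```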

The avatar was introduced on a trial basis to the Ibaraki prefectural government office, where it was incorporated into the prefecture’s official VTuber, “Ibara Hiyori.” It serves as a guide within the office and speaks using the local dialect.

Motoki Imamiya, head of the CTI department at Advanced Media, expressed his confidence, saying, “Adding human warmth to AI makes it more endearing.”

The trade show runs through Oct. 20, with 684 companies and organizations participating, 122 more than last year.

(This article was written by Kanako Tanaka, Ayumi Sugiyama and Kenta Nakamura.)