Photo/Illustration The Gran Turismo Sport video game, which offers lifelike images of existing car models and race tracks, was used as a platform for the latest study on artificial intelligence. (Provided by Sony Group Corp.)

An artificial intelligence system developed by a Sony Group Corp. subsidiary beat world-class players of the Gran Turismo Sport car racing game, overtaking opponents at high speed and avoiding crashes based on split-second decisions.

Gran Turismo Sophy defeated four Japanese players, some of whom had won world championships, in all races at an event held in Tokyo in 2021, according to an article published Feb. 10 in the online edition of the British science journal Nature.

Sony AI Inc. adopted an approach called deep reinforcement learning, and the system acquired driving skills, such as how to efficiently use acceleration and braking and how to respond when the way ahead is blocked by an opponent, based on vast amounts of data.
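The article does not describe Sony AI's system in detail, but the general idea of reinforcement learning can be illustrated with a toy example. The sketch below uses tabular Q-learning on a made-up one-dimensional "track," where an agent learns through trial and error that accelerating toward the goal beats braking; in deep reinforcement learning, as used by Gran Turismo Sophy, the lookup table would be replaced by a neural network trained on far richer game-state data. All names and parameters here are illustrative assumptions, not details of Sony AI's implementation.

```python
import random

# Toy track: positions 0..5, goal at position 5.
# Actions: 0 = brake (stay put), 1 = accelerate (move forward one step).
N_STATES = 6
ACTIONS = [0, 1]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration

# Q-value table: expected future reward for each (state, action) pair.
q = [[0.0, 0.0] for _ in range(N_STATES)]

random.seed(0)
for episode in range(2000):
    state = 0
    while state < N_STATES - 1:
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore at random.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = 0 if q[state][0] > q[state][1] else 1
        next_state = min(state + action, N_STATES - 1)
        # Small cost per step, large reward for reaching the goal.
        reward = 1.0 if next_state == N_STATES - 1 else -0.01
        # Q-learning update rule.
        q[state][action] += ALPHA * (
            reward + GAMMA * max(q[next_state]) - q[state][action]
        )
        state = next_state

# The learned greedy policy: the agent should accelerate at every position.
policy = [0 if q[s][0] > q[s][1] else 1 for s in range(N_STATES - 1)]
print(policy)
```

After training, the greedy policy accelerates at every position, because braking only delays the goal reward while incurring step costs; the same trial-and-error principle, scaled up with neural networks and a physics-accurate simulator, underlies how a system like Sophy can learn when to brake, accelerate, or overtake.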

“Gran Turismo Sophy used innovative ways to race its cars faster, and I could tell at a glance that its moves made perfect sense,” said one player. “There is so much to learn from what it did.”

Technologies behind complex racing maneuvers could be applied to autonomous driving on public roads, an area that Sony Group is expected to work on through an electric vehicle venture to be set up this spring.

Similar know-how could also be used for drones as well as robots designed to work alongside humans.

Gran Turismo Sport, a popular simulation game for the PlayStation 4 home video-game console, allows players across the globe to compete online.

The e-sport platform offers faithful reproductions of racing cars, high-resolution imagery and realistic driving experiences. It has been adopted for racing events certified by the International Automobile Federation (FIA).

The research results on the Gran Turismo Sophy artificial intelligence system are featured on the cover of an issue of the Nature science journal. (Provided by Sony Group Corp.)

At the event in Tokyo, four AI-operated cars competed against the four players in races set along three tracks, including France’s Circuit de la Sarthe. The 13.629-kilometer course is the venue of the 24 Hours of Le Mans endurance race, a leg of the so-called Triple Crown of Motorsport.

The Red Bull X2019 Competition, a fictional car with a top speed in excess of 300 kph, was among the models featured in the virtual races.

Gran Turismo Sophy also beat three top-level players in time trial races on solo runs. Its best time along the Circuit de la Sarthe was 193.080 seconds, about 1.8 seconds faster than the best human time of 194.888 seconds.

“AI has the potential to make racing games more exciting and help discover new maneuvers,” said a member of the research team.

AI systems have already overwhelmed humans in board games.

In 1997, IBM’s supercomputer, Deep Blue, defeated the world chess champion. Another system defeated a professional “shogi” player in 2013. An article published in Nature magazine in 2016 said AlphaGo, developed by a Google Inc. subsidiary, beat Europe’s go champion.

But things are far more complicated when it comes to racing games, where car movements are simulated in accordance with the laws of physics, especially when multiple players are involved.

Competitors have to know, for example, how to pass an opponent using tactical maneuvers and how to block a rival while avoiding excessive contact and penalties.

Those techniques require complicated strategies, real-time decision-making and advanced car control skills all at the same time. That previously made it difficult for AI systems to get the better of humans.

But there is still room for improvement in AI’s strategic decision-making abilities.

Gran Turismo Sophy sometimes failed to follow the racing line immediately after overtaking an opponent on a straight section of track, according to the research team.

The journal article can be read at https://www.nature.com/articles/s41586-021-04357-7.