Sony has developed what it's calling a breakthrough artificial intelligence program for the Gran Turismo series of PlayStation racing games. The software, called Gran Turismo Sophy, is so sophisticated, Sony says, that it handily beat a group of the world's best virtual race car drivers in a test version of the 2017 game Gran Turismo Sport in October.
“Outracing human drivers so skillfully in a head-to-head competition represents a landmark achievement for AI,” Chris Gerdes, a Stanford professor specializing in autonomous driving, wrote in a Nature article published alongside Sony's research. Gerdes said this research could one day affect self-driving car development, according to Wired. “GT Sophy's success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today,” Gerdes wrote.
Using video games to develop and train novel AI systems has become a popular research strategy for the world's most cutting-edge AI labs. In just the last decade or so, board and card games like Go and Texas hold 'em and sophisticated strategy video games like Starcraft and Dota have been "solved," so to speak, after becoming test beds for some of the world's most advanced forms of self-learning AI. Many of these research teams use a form of AI training called deep reinforcement learning, in which an agent improves over time through trial and error, guided by reward signals and built on neural networks loosely modeled on the human brain.
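The trial-and-error loop at the heart of reinforcement learning can be illustrated with a deliberately tiny sketch: an agent repeatedly tries actions, observes noisy rewards, and shifts its estimates toward the actions that pay off. This is a minimal tabular example for illustration only; GT Sophy uses deep neural networks at vastly larger scale, and all names and numbers here are invented for the sketch.

```python
import random

def train_bandit(true_rewards, episodes=5000, epsilon=0.1, alpha=0.1, seed=0):
    """Learn which action pays best via epsilon-greedy trial and error."""
    rng = random.Random(seed)
    q = [0.0] * len(true_rewards)  # estimated value of each action
    for _ in range(episodes):
        # Explore occasionally; otherwise exploit the best-known action.
        if rng.random() < epsilon:
            a = rng.randrange(len(q))
        else:
            a = max(range(len(q)), key=lambda i: q[i])
        # The environment returns a noisy reward for the chosen action.
        r = true_rewards[a] + rng.gauss(0, 0.1)
        # Nudge the estimate toward the observed reward.
        q[a] += alpha * (r - q[a])
    return q

q = train_bandit([0.2, 0.8, 0.5])
print(max(range(3), key=lambda i: q[i]))  # the agent settles on action 1
```

The same loop, scaled up with neural networks standing in for the value table and a racing simulator standing in for the reward function, is the basic recipe behind systems like GT Sophy.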
Gran Turismo Sophy is no different, though Sony said learning to operate a virtual race car as realistic as those in Gran Turismo presented a unique challenge for its research team. While AI programs have become superhuman at numerous games, they often excel at games that allow considerable time to compute the best move. In a racing game, the AI has to make continuous judgment calls with near-perfect reflexes while obeying restrictions Sony put in place to replicate the rules of a real-life race, such as penalties for overly aggressive driving.
“In order to train the agent at massive scale, we developed a novel distributed reinforcement learning platform that can run many instances of GT Sport. We deployed these systems at massive scale using SIE's cloud gaming infrastructure,” said Michael Spranger, the chief operating officer of Sony AI, in a virtual press conference on Wednesday. “Finding the right balance between aggressive but fair racing is one of the defining characteristics of motorsport. Collectively, this is what makes GT Sophy a unique AI achievement.”
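The distributed setup Spranger describes can be sketched in miniature: many game instances run in parallel, each producing experience that a central learner consumes. The worker function below is a stub standing in for a real GT Sport instance, and the worker count and step count are invented for illustration; Sony's actual platform runs on SIE's cloud infrastructure.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def rollout(worker_id, steps=100):
    """One simulated game instance recording (state, action, reward) tuples."""
    rng = random.Random(worker_id)
    return [(rng.random(), rng.randrange(4), rng.random()) for _ in range(steps)]

def collect_experience(num_workers=8):
    """Run many game instances in parallel and pool their experience."""
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        batches = pool.map(rollout, range(num_workers))
    # A central learner would sample from this pooled buffer to update
    # the agent, then push the improved policy back to the workers.
    return [t for batch in batches for t in batch]

buffer = collect_experience()
print(len(buffer))  # 8 workers x 100 steps = 800 transitions
```

Parallelizing the data collection this way is what lets an agent accumulate far more driving experience than any single game session could provide.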
Sony said it plans to incorporate Gran Turismo Sophy into the upcoming Gran Turismo 7, which releases in March, as a training tool for players. As in chess, Go and other games, AI programs have been shown to have a positive effect on human play, highlighting strategies humans had not yet discovered and driving players beyond their limits as they try to keep pace with machine play.
“Sophy takes some racing lines that a human driver would never think of,” Gran Turismo creator Kazunori Yamauchi told Wired. “I think a lot of the textbooks regarding driving skills will be rewritten.”
Correction: An earlier version of this story misstated the year Gran Turismo Sport was released. This story was updated on Feb. 9, 2022.