April 10, 2024

This article was originally published on February 11, 2022. We’re republishing it because Gran Turismo players can now compete against the artificial intelligence Gran Turismo Sophy in the latest edition of the game.

To accelerate along the most efficient “racing line” without losing control, race car drivers must brake, accelerate, and steer in precisely timed sequences. The procedure depends on frictional limits and is governed by well-established physical laws, which means autonomous cars can learn to complete a lap at the fastest possible speed (as many already have). But things become more complicated when the autonomous driver has to share the track with other vehicles. Researchers tackled the problem by training an artificial intelligence program to beat human opponents in the ultra-realistic racing game Gran Turismo Sport. The results could point autonomous-vehicle researchers toward new methods for bringing this technology to real-world driving.

Artificial intelligence has defeated human players in video games such as StarCraft II and Dota 2. But Gran Turismo differs from those games in several ways, according to Peter Wurman, director of Sony AI America and co-author of the study published in Nature. “In most games, the environment defines the rules and protects the users from each other,” Wurman says. “But when racing, the cars are close to one another, and they have a delicate sense of etiquette that must be learned and embodied by the AI agents. To win, they have to be considerate of their competitors, but they also have to preserve their own racing lines and make sure they don’t simply let their opponents pass them.”

To teach their AI the basics, the Sony AI researchers used a method known as deep reinforcement learning. They rewarded the AI for specific behaviors, such as staying on the track, keeping the car under control, and adhering to racing etiquette. Then they let the program loose to experiment with different racing tactics that could help it reach those goals. The Sony AI team trained multiple versions of its program, which it named Gran Turismo Sophy (GT Sophy), each one trained to drive a specific type of car on a particular track. The researchers then pitted the AI against human Gran Turismo champions. In the first test, conducted in July 2021, the humans had the best overall team score. The second time around, in October 2021, the AI made its mark: it dominated its human opponents and set the fastest lap times.
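Sony’s actual training code is not public, but the reward-shaping idea the article describes can be sketched in a few lines. The state fields, function name, and penalty values below are all illustrative assumptions, not GT Sophy’s real design:

```python
from dataclasses import dataclass

# Hypothetical, simplified per-step state; the names and fields here are
# illustrative only -- GT Sophy's real observations are far richer.
@dataclass
class StepState:
    progress_m: float   # meters advanced along the track this step
    off_track: bool     # did the car leave the course?
    collision: bool     # contact with another car
    at_fault: bool      # was this agent responsible for the contact?

def shaped_reward(state: StepState) -> float:
    """Combine course progress with penalties for leaving the track and
    for causing contact -- the kind of reward shaping the article
    describes (rewarding progress, car control, and racing etiquette).
    The penalty magnitudes are arbitrary placeholders."""
    reward = state.progress_m            # encourage fast progress
    if state.off_track:
        reward -= 5.0                    # discourage cutting corners
    if state.collision and state.at_fault:
        reward -= 10.0                   # enforce etiquette: no ramming
    return reward
```

A clean step earns its progress (e.g., `shaped_reward(StepState(3.2, False, False, False))` returns `3.2`), while an at-fault collision turns the step strongly negative; a deep RL agent maximizing the sum of such rewards is pushed toward fast but polite driving.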

Human players took their losses in stride, and some even relished the challenge of facing the AI. “Some of the things we heard from the drivers was that they learned new things from Sophy’s maneuvers,” says Erica Kato Marcus, director of partnerships and strategies at Sony AI. “The strategies the AI used were so difficult, I might attempt them once, but they were so risky that I would never try them in an actual race,” says Emily Jones, a world finalist at the FIA-Certified Gran Turismo Championships 2020 who later competed against GT Sophy. Although Jones says that competing against the AI left her feeling somewhat powerless, she describes the experience as awe-inspiring.


“Racing, like a lot of sports, is all about getting as close to the perfect lap as possible, but you can never actually get there,” Jones says. “With Sophy, it was amazing to see what might be the perfect lap. There was no way to go any faster.”

The Sony team is still working to improve the AI. “We trained an agent, a version of GT Sophy, for each car-track combination,” Wurman says. “And one of the things we’re looking at is: Can we train a single policy that can run on any car on any of the tracks in the game?” On the commercial side, Sony AI is also working with the developer of Gran Turismo, the Sony Interactive Entertainment subsidiary Polyphony Digital, to include a version of GT Sophy in a future release of the game. To do this, the researchers will need to tune the AI’s performance so that it is a formidable adversary but not unbeatable, even for players less skilled than the top competitors who have faced the AI so far.

Because Gran Turismo offers a realistic representation of specific vehicles and tracks, along with the physics parameters that govern each, this research could also be applicable beyond video games. “I think one of the pieces that’s interesting, which does differentiate this from the Dota game, is being in a physics-based environment,” says Brooke Chan, a software engineer at the artificial-intelligence research company OpenAI and a co-author of the OpenAI Five project, which beat human players in Dota 2. “It’s not out in the real world but still is able to emulate characteristics of the real world such that we’re training AI to understand the physical world a little bit more.” (Chan was not involved in the GT Sophy study.)

“Gran Turismo is a very good simulator–it’s gamified in a few ways, but it really does faithfully represent a lot of the differences that you would get with different cars and different tracks,” says J. Christian Gerdes, a professor of mechanical engineering at Stanford University who was not involved in the study. “This is, in my mind, the closest thing out there to anybody publishing a paper that says AI can go toe-to-toe with humans in a racing environment.”

Not everyone agrees, however. “In the real world, you have to deal with things like bicyclists, pedestrians, animals, things that fall off trucks and drop in the road that you have to be able to avoid, bad weather, vehicle breakdowns–things like that,” says Steven Shladover, a research engineer in the California Partners for Advanced Transportation Technology (California PATH) program at the University of California, Berkeley’s Institute of Transportation Studies, who was not involved in the Nature paper. “None of that stuff shows up in the gaming world.”

Still, Gerdes says that GT Sophy’s success could be valuable because it challenges assumptions about how self-driving vehicles should be programmed. A self-driving car can make decisions based on the laws of physics or on its AI training. “If you look at what’s out there in the literature–and, to some extent, what people are putting on the road–the motion planners will tend to be physics-based in optimization, and the perception and prediction parts will be AI,” Gerdes says. In GT Sophy’s case, however, motion planning (such as deciding how to take a corner at the limit of the car’s ability without causing an accident) was handled by the AI side of the equation. “I think the lesson for automated car developers is: there’s a data point here that maybe some of our preconceived notions–that certain parts of this problem are best done in physics–need to be revisited,” he adds. “AI might be able to play there as well.”
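The physics-based side of the contrast Gerdes draws can be illustrated with a textbook bound: the maximum speed a car can carry through a flat, unbanked corner before tire friction is exceeded. The function below is a generic physics sketch (constant friction coefficient, steady-state cornering), not any planner from the study; GT Sophy replaces this kind of hand-derived limit with a learned policy:

```python
import math

def physics_corner_speed(radius_m: float, mu: float = 1.0,
                         g: float = 9.81) -> float:
    """Classic physics-based planning bound for a flat, unbanked corner:
    the centripetal force mu*m*g available from the tires supports at
    most v = sqrt(mu * g * r). Conventional motion planners optimize
    trajectories against limits like this; a learned policy such as
    GT Sophy's discovers its own cornering behavior from reward instead.
    Assumes a constant friction coefficient mu (an idealization)."""
    return math.sqrt(mu * g * radius_m)
```

For example, with `mu = 1.0`, a 100-meter-radius corner yields roughly 31 m/s (about 113 km/h); tightening the corner or reducing grip lowers the bound, exactly the kind of constraint a physics-based optimizer would enforce explicitly.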


Gerdes also suggests that GT Sophy’s achievement could be a model for other areas in which robots and humans interact. In Gran Turismo, he points out, the AI must balance finding the quickest path to the finish line with the tricky task of interacting smoothly with humans, who are often unpredictable. “If we do have an AI system that can make some sophisticated decisions in that environment, that might have applicability–not just for automated driving,” Gerdes says, “but also for interactions like robotic surgery or robots that assist in the home. If you’re faced with a situation requiring a human and a robot to work in tandem to get something moved, it’s more difficult than a robot doing it by itself.”

A version of this article titled “AI Champions” was published in the May 2022 issue of Scientific American.

