An artificial intelligence program has beaten the best players in the world at the famous PlayStation racing game Gran Turismo Sport, and in doing so, it may have helped design better autonomous vehicles in the real world, according to an expert.
The latest development comes after a few interesting decades for AI games.
It started with chess, when world champion Garry Kasparov lost to IBM’s Deep Blue in a 1997 match. Then came Go, when an AI beat South Korean champion Lee Sedol in 2016. And in 2019, an AI program ranked higher than 99.8% of global players in the hugely popular real-time strategy game StarCraft 2.
Today, an AI program has dethroned the best human players in the professional esport Gran Turismo Sport.
In a recent article published in the scientific journal Nature, researchers from a team led by Sony AI detailed how they created a program called Gran Turismo Sophy, which managed to win a race in Tokyo last October.
Peter Wurman, the team leader of the GT Sophy project, said they didn’t manually program the AI to be good at racing. Instead, they trained it race after race, running multiple simulations of the game on a computer system connected to around 1,000 PlayStation 4 consoles.
“It doesn’t know what its controls are doing,” Wurman said. “And through trial and error, it learns that the throttle keeps it going and the steering wheel turns left and right … and if it does the right thing going forward, then it gets a little reward.”
“It takes about an hour for the agent to learn to drive on a track. It takes about four hours to become about as good as the average human driver. And it takes 24 to 48 hours to be as good as the top 1% of drivers who play the game.”
And after another 10 days, it can finally face the best of humanity.
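The trial-and-error learning Wurman describes is the core idea of reinforcement learning. As an illustration only, here is a toy sketch of that idea in Python, using tabular Q-learning on a made-up one-dimensional "track" with ten positions and two actions (coast or throttle); Sony's actual system uses deep reinforcement learning at vastly larger scale, and every name and number below is invented for this sketch.

```python
import random

# Toy "track": states 0..9; action 0 = coast (stay put), 1 = throttle (advance).
# Reaching the final state earns "a little reward," as Wurman puts it.
N_STATES, ACTIONS, GOAL = 10, (0, 1), 9
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    nxt = min(state + action, GOAL)       # throttle moves the agent forward
    reward = 1.0 if nxt == GOAL else 0.0  # reward only at the finish line
    return nxt, reward

random.seed(0)
alpha, gamma, eps = 0.5, 0.9, 0.1         # learning rate, discount, exploration
for episode in range(200):                # "race after race"
    s = 0
    while s != GOAL:
        # Mostly act greedily, but sometimes explore at random (trial and error)
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        nxt, r = step(s, a)
        # Q-learning update: nudge the estimate toward reward + discounted future value
        best_next = max(q[(nxt, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
        s = nxt

# The learned policy: the best action at each position along the track
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(GOAL)]
print(policy)
```

After enough simulated races, the agent learns to throttle at every position, purely from the reward signal, with no hand-coded driving rules.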
After finishing behind two bots controlled by Gran Turismo Sophy in the Tokyo race, champion player Takuma Miyazono said it was actually a rewarding experience.
“I learned a lot from the AI agent,” Miyazono said. “In order to drive faster, the AI drives in ways we never imagined, which made sense when I saw its maneuvers.”
Chris Gerdes, a professor of mechanical engineering at Stanford, reviewed the team’s findings throughout their publication process at Nature. Gerdes also specializes in vehicle physics and races cars himself.
He said he spent a lot of time watching GT Sophy in action, trying to figure out if the AI was actually doing something smart or just learning a faster way around the same track through repetition.
“And it turns out that Sophy is actually doing things that race car drivers would consider very smart, doing maneuvers that it would take a human race car driver an entire career to be able to get… out of their repertoire at just the right time,” he said.
What’s more, Gerdes said this work could have even bigger implications.
“I think you can take the lessons learned from Sophy and think about how they might apply to developing, for example, self-driving vehicles,” he said.
Gerdes should know: he studies and designs autonomous vehicles.
“It’s not like you can just take the results of this article and say, ‘Great, I’m going to try it on a self-driving vehicle tomorrow,’” Gerdes said. “But I really think it’s an eye-opener for people developing self-driving vehicles to sit down and say, well, maybe we need to keep an open mind about the scope of the possibilities here with AI and neural networks.”
Both Wurman and Gerdes said applying this work to real-world cars could still be a long way off.
But in the short term, Wurman’s team is working with the developers of Gran Turismo to create a more engaging AI that normal players can compete against in the next game in the series.
So in the near future, we too might get to try our hand at racing against an AI.
Copyright 2022 NPR. To learn more, visit https://www.npr.org.