Damn it, Sony’s AI can now beat us at Gran Turismo racing, too

A Sony AI system called GT Sophy races in the Gran Turismo Sport video game. (Sony AI/Screenshot by Stephen Shankland/CNET)

Over the last two years, Sony AI trained a computer system to play Polyphony Digital’s Gran Turismo Sport, a popular and realistic car racing game, and beat some of the world’s best human competitors, Sony said on Wednesday.

In a July competition, the AI, named GT Sophy, defeated top humans only in time trials, when there were no other cars on the track. But by October, GT Sophy beat the humans even amid a scrum of virtual race cars.

GT Sophy is the latest experiment demonstrating that AI can be victorious at games such as chess and Go, which were long thought to be the domain of human intelligence. AI has also beaten people at classic Atari video games and the StarCraft real-time strategy game.

AI today generally refers to programming computers with neural networks, a technology loosely modeled on the way human brains work. Sony's achievement is notable enough to warrant a research paper in the prestigious journal Nature.

A car racing video game like Gran Turismo presents open-ended tactical choices within simulated physics. GT Sophy found new ways to handle those choices, one of the human competitors said.

“The AI drives in a way that we would never have come up with,” said Takuma Miyazono, who won three challenges in the FIA Gran Turismo 2020 World Finals, speaking in a video. He said GT Sophy’s tactics made sense when he saw it drive. 

Many AI systems are trained with real-world data through a technique called deep learning, which gives them the ability to recognize faces and spot spam. GT Sophy used a different approach called reinforcement learning, which starts with an entirely untrained system that has no idea what to do. The system raced courses over and over, guided by a human-designed reward function that encouraged better results, and it eventually mastered the game.
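To make that reward-driven trial-and-error loop concrete, here is a minimal sketch in Python of tabular Q-learning on a made-up one-dimensional track with a single corner. The environment, reward values and hyperparameters are all invented for illustration; Sony's actual system used a far more sophisticated deep reinforcement learning algorithm trained inside Gran Turismo Sport itself.

```python
import random

# Illustrative sketch only: tabular Q-learning on a toy corner-braking problem.
# This is not Sony's algorithm; it only shows the reward-driven loop described above.

TRACK_LENGTH = 10                 # positions 0..9; a corner sits at position 7
ACTIONS = ["accelerate", "brake"]

def step(position, speed, action):
    """Advance one timestep; return (new_position, new_speed, reward, done)."""
    speed = min(speed + 1, 3) if action == "accelerate" else max(speed - 1, 1)
    position += speed
    if position >= 7 and speed > 2:       # entered the corner too fast: crash
        return position, speed, -10.0, True
    if position >= TRACK_LENGTH - 1:      # reached the finish line
        return position, speed, 10.0, True
    return position, speed, -0.1, False   # small per-step cost rewards finishing fast

Q = {}                                    # Q-table indexed by (position, speed, action)
alpha, gamma, epsilon = 0.1, 0.95, 0.2    # learning rate, discount, exploration rate

def q(p, s, a):
    return Q.get((p, s, a), 0.0)

for episode in range(5000):
    position, speed, done = 0, 1, False
    while not done:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q(position, speed, a))
        new_p, new_s, reward, done = step(position, speed, action)
        # Q-learning update toward reward plus discounted best future value.
        best_next = 0.0 if done else max(q(new_p, new_s, a) for a in ACTIONS)
        Q[(position, speed, action)] = q(position, speed, action) + alpha * (
            reward + gamma * best_next - q(position, speed, action)
        )
        position, speed = new_p, new_s

# After training, the greedy policy accelerates early and brakes before the corner.
```

In this toy version the agent, like GT Sophy, is never told how to drive; it simply learns which actions lead to higher reward through repeated attempts.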

One particular difficulty was figuring out the unwritten rules of car racing, such as avoiding collisions and not inappropriately cutting off other drivers.

“We all underestimated how hard it would be to get the sportsmanship side right,” said Sony AI Director Peter Wurman in the video. “To do that without being overly aggressive or overly timid in the face of competitors.”
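The balance Wurman describes is usually handled through reward shaping. Purely as an illustration, and not Sony's actual terms or weights, a reward function might combine race progress with penalties for unsporting behavior:

```python
# Hypothetical reward-shaping example; the signals and weights are invented.
def shaped_reward(progress_gain, caused_collision, blocked_unfairly):
    """Combine progress along the track with penalties for unsporting driving."""
    reward = progress_gain        # e.g. meters gained along the racing line
    if caused_collision:
        reward -= 5.0             # discourage at-fault contact
    if blocked_unfairly:
        reward -= 2.0             # discourage weaving to cut off rivals
    return reward
```

Tuning such penalties too high makes an agent timid; too low makes it aggressive, which is the trade-off Wurman points to.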

Sony AI ran simulations on computers connected to a bank of more than 1,000 PlayStation 4 game consoles.

As is common in AI, Sony trained its versions of GT Sophy using fast graphics chips. To run the game simulations, it used computers with conventional processors.

The next installment in the venerable series, Gran Turismo 7, debuts on March 4.
