In context: Most video game artificial intelligence is not all that intelligent, and for good reason: nobody wants to play against an AI that is smarter, faster, and better than they are. However, that has not stopped machine learning developers from creating unbeatable bots.

On Wednesday, Sony's artificial intelligence team published its work on a bot called "GT Sophy" in the journal Nature. The AI is a neural network that strives for perfection on the race tracks of Gran Turismo Sport through reinforcement learning. It has trained for the equivalent of over 45,000 hours of driving and can hold a racing line with millimeter precision.

The team recently tested four GT Sophy bots against four professional esports champions. The AIs came in first, third, fifth, and seventh, beating their respective human counterparts on the eight-car grid.

Second-place finisher and 2021 TGR GT Cup champion Tomoaki Yamanaka might have had a chance had he started from pole position. As it was, the bot on pole, nicknamed GT Sophy Rogue, held first for the entire three-lap race on GT Sport's Autodrome Lago Maggiore course.

The closest Yamanaka ever got to Rogue was about one second. However, constant pressure from the third-place bot, GT Sophy Lavande, forced him to concede first place to Rogue and focus on defending second from the bot on his tail. Rogue defeated Yamanaka by almost six seconds, a massive margin of victory at the professional level.

It is important to note that GT Sophy started out "tabula rasa" (a clean slate) and taught itself to drive from scratch. In its early days, it was almost comical to watch Sophy run into walls and swerve left and right on the straights for no reason. Even the testers, themselves novice racers, could beat the AI 100 times out of 100.

However, the AI eventually figured out how to drive straight, negotiate the turns, and run an orthodox racing line. It even taught itself a technique that the AI team thought was unique to it.
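The Nature paper describes a far more sophisticated deep reinforcement learning system, but the tabula-rasa idea itself can be illustrated with a toy sketch. The example below is an illustration only, not Sony's method: it uses tabular Q-learning on a made-up ten-cell "track" with one corner (every name and number here is an assumption for the demo). An agent that starts knowing nothing learns, through trial, error, and reward, to slow down before the corner and go fast everywhere else, a crude analogue of Sophy learning to brake for turns.

```python
import random

random.seed(0)

# Toy track: cells 0..9, with a corner at cell 5 (purely illustrative).
# Actions: 0 = slow (+1 cell), 1 = fast (+2 cells).
# Driving through the corner cell at full speed crashes the car.
TRACK_LEN, CORNER = 10, 5

def step(pos, action):
    """One move along the track: returns (next_pos, reward, done)."""
    nxt = pos + (2 if action == 1 else 1)
    if action == 1 and CORNER in (pos + 1, pos + 2):
        return pos, -10.0, True        # crashed: entered the corner too fast
    if nxt >= TRACK_LEN:
        return nxt, 10.0, True         # crossed the finish line
    return nxt, -1.0, False            # small time penalty per move

# Tabular Q-learning, starting from an all-zero table ("tabula rasa").
Q = {(p, a): 0.0 for p in range(TRACK_LEN) for a in (0, 1)}
alpha, gamma, eps = 0.5, 0.9, 0.2      # learning rate, discount, exploration

for _ in range(2000):
    pos, done = 0, False
    while not done:
        if random.random() < eps:      # explore: random throttle choice
            a = random.choice((0, 1))
        else:                          # exploit: best known action
            a = max((0, 1), key=lambda x: Q[(pos, x)])
        nxt, r, done = step(pos, a)
        target = r if done else r + gamma * max(Q[(nxt, 0)], Q[(nxt, 1)])
        Q[(pos, a)] += alpha * (target - Q[(pos, a)])
        pos = nxt

def greedy_lap():
    """Drive one lap with the learned greedy policy; True if we finish."""
    pos, done, finished = 0, False, False
    while not done:
        a = max((0, 1), key=lambda x: Q[(pos, x)])
        nxt, r, done = step(pos, a)
        finished = done and r > 0      # finished rather than crashed
        pos = nxt
    return finished

print(greedy_lap())
```

In Gran Turismo the state and action spaces are continuous and the policy is a deep neural network rather than a lookup table, but the learning signal is the same: no driving knowledge up front, only rewards for going fast and penalties for crashing.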

Most racers are taught to employ the "slow-in-fast-out" technique of cornering. The method involves braking in a straight line as you approach a curve so you can enter the corner without losing control, then accelerating out of the apex.

What GT Sophy began doing was a "fast-in-fast-out" approach. It would brake as it entered the turn while maintaining enough speed to pitch the car so that most of the weight sat on three wheels (both fronts and the outside rear). It may not sound like much of a difference, but late braking can shave seconds off a lap, depending on the course.

The AI team thought Sophy was doing something unprecedented, but the technique has been employed for a long time. You can see it used by nearly every sprint car driver. Some Formula One drivers also use this method of cornering.

"We notice that, actually, top drivers such as Lewis Hamilton or Max Verstappen actually are doing that, using three tires, going fast in and fast out, all these things that we thought were unique to GT Sophy," said Polyphony Digital CEO Kazunori Yamauchi, the father of Gran Turismo.

Verstappen was Formula One's world champion last year. Hamilton won the F1 drivers' championship in 2008, 2014, 2015, 2017, 2018, and 2019. His seventh title, in 2020, tied Michael Schumacher's record for the most career F1 world drivers' championships. Hamilton also holds the records for most race wins and pole positions.

If you plan to pick up Gran Turismo 7 next month, don't worry about having your pants beaten off by GT Sophy. The team plans to add the AI to the game in a future update as a fun option, but the vanilla AI, which is challenging but beatable, will remain.