On Wednesday, Sony announced its project Ace, an autonomous robot capable of competing with professional table tennis players. Sony says this breakthrough demonstrates that AI systems have achieved human-like, expert-level performance in a competitive sport in the physical world.
While the world is currently focused on the agentic AI craze — allowing AI agents to perform tasks on your behalf — physical AI is right on its heels. Having a humanoid robot fold your laundry is just the beginning, and Sony’s announcement shows that physical AI could outperform humans in some instances.
“Table tennis is a game of enormous complexity that requires split-second decisions as well as speed and power,” said Peter Dürr, director of Sony AI and project lead for Ace, in the press release.
Ace uses advanced sensory technology, reinforcement learning and precision hardware to play table tennis. The robot has nine active-pixel-sensor cameras that let it determine the ball's position in 3D space, along with additional cameras and systems that measure the ball's velocity and spin.
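Sony hasn't published how Ace fuses its camera feeds, but recovering a 3D ball position from two or more calibrated views is classically done with linear (DLT) triangulation. The sketch below is purely illustrative, not Sony's pipeline: `triangulate` and the toy camera matrices are assumptions for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two camera views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coordinates
    of the same ball in each view. Returns the estimated 3D point."""
    # Each view contributes two linear constraints: u*(P[2]·X) = P[0]·X, etc.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: right singular vector of A
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy setup: two identity-intrinsics cameras, the second shifted 1 unit left.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
ball = triangulate(P1, P2, (0.1, 0.2), (0.0, 0.2))  # recovers (1, 2, 10)
```

With velocity and spin, the same idea extends over time: triangulated positions from successive frames give a velocity estimate, which is presumably one role of Ace's additional cameras.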
Beyond the optics, Ace has a control system based on model-free reinforcement learning, which lets it adapt and make decisions without a preprogrammed model of the game. Pair that control system with high-speed robotic hardware, and the robot is as much a piece of art as it is a machine.
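"Model-free" means the agent never learns the environment's dynamics; it improves purely from sampled experience and rewards. Sony hasn't detailed Ace's algorithm, so here is a deliberately tiny, generic example of the idea: tabular Q-learning on a toy "move the paddle under the ball" task. The task, state encoding, and all constants are assumptions for illustration only.

```python
import random

random.seed(0)
N = 5                    # board width
ACTIONS = (-1, 0, 1)     # move paddle left, stay, move right
# State = (paddle column, ball column); one Q-value per state-action pair.
Q = {((p, b), a): 0.0 for p in range(N) for b in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration

def step(state, action):
    paddle, ball = state
    paddle = min(N - 1, max(0, paddle + action))
    reward = 1.0 if paddle == ball else -0.1
    return (paddle, ball), reward

for _ in range(3000):                    # episodes of random starting positions
    state = (random.randrange(N), random.randrange(N))
    for _ in range(N):
        # Epsilon-greedy: mostly exploit, occasionally explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(state, act)])
        nxt, r = step(state, a)
        # Q-learning update: uses only the sampled transition, no model.
        best_next = max(Q[(nxt, b)] for b in ACTIONS)
        Q[(state, a)] += alpha * (r + gamma * best_next - Q[(state, a)])
        state = nxt

def greedy(state):
    """Learned policy: pick the highest-valued action in this state."""
    return max(ACTIONS, key=lambda act: Q[(state, act)])
```

After training, `greedy((0, 4))` moves the paddle right toward the ball and `greedy((4, 0))` moves it left. Ace's continuous, high-speed setting would call for far more sophisticated methods, but the core trait is the same: decisions improve from experience alone.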
Ace was tested against five elite and two professional-level table tennis players, and it won three of the five matches against the elite players. It also scored 16 direct points on its serves, versus the elite players' eight.
Ace is not just a proof of concept but a glimpse into the future of physical AI. "Once AI can operate at an expert human level under these conditions, it opens the door to an entirely new class of real-world applications that were previously out of reach," said Peter Stone, chief scientist at Sony AI, in the company's press release.
Sony did not immediately respond to a request for further comment.