The biggest announcement at CES was not a new car but a new software platform. Jensen Huang, CEO of NVIDIA, revealed Alpamayo, an autonomous vehicle platform that NVIDIA plans to launch later this year.

The first production vehicle announced to use NVIDIA’s new autonomous stack is the 2026 Mercedes-Benz CLA. Huang described Alpamayo as a breakthrough in artificial intelligence that does more than react to its surroundings—it predicts and reasons about future actions.

"It's trained end-to-end. Literally from camera in to actuation out," Huang said at CES. "It reasons what action it is about to take, the reason, and the trajectory."

Here’s a look at NVIDIA’s autonomous system in action:

Alpamayo centers on a Vision-Language-Action (VLA) model designed to explain its own decisions. NVIDIA says it produces reasoning traces—analogous to the thought traces used by reasoning large language models—showing why it chose a particular maneuver. The company argues this transparency helps mitigate the black-box problem that has long affected autonomous driving systems.
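To make the idea of a reasoning trace more concrete, here is a minimal, purely illustrative Python sketch of what a vision-language-action style decision interface could look like. The names (`VLAPolicy`, `DrivingDecision`) and fields are hypothetical and are not NVIDIA’s actual API; the point is only to show an action paired with a trajectory and a human-readable explanation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DrivingDecision:
    """Hypothetical output of a VLA-style driving policy."""
    action: str                      # e.g. "yield", "lane_change_left", "proceed"
    trajectory: List[Tuple[float, float]]  # planned (x, y) waypoints, ego frame
    reasoning_trace: str             # natural-language explanation of the choice

class VLAPolicy:
    """Illustrative stand-in for a vision-language-action model.

    A real model would consume camera frames and emit controls end to end;
    this sketch only shows the *shape* of an explainable decision.
    """

    def decide(self, camera_frames, route_hint: str) -> DrivingDecision:
        # In a real system this would be one forward pass of a multimodal
        # model; here we return a canned example for illustration.
        return DrivingDecision(
            action="yield",
            trajectory=[(0.0, 0.0), (2.0, 0.1), (4.0, 0.3)],
            reasoning_trace=(
                "Pedestrian detected near the crosswalk on the right; "
                "slowing and yielding before continuing along the route."
            ),
        )

decision = VLAPolicy().decide(camera_frames=[], route_hint="continue straight")
print(decision.action, "-", decision.reasoning_trace)
```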

While the announcement drew widespread attention, Tesla’s team, which has deployed end-to-end neural networks across millions of cars for years, responded with caution.

Exactly What Tesla is Doing

Elon Musk’s response mixed recognition with a reality check. He noted that NVIDIA’s end-to-end approach mirrors what Tesla began deploying in 2023 with FSD V12.

Tesla’s shift to end-to-end learning—where vehicles learn driving behavior from video rather than hand-coded rules—was completed years ago. From Tesla’s perspective, NVIDIA adopting the same architecture is confirmation that Tesla’s direction was correct.
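As a rough illustration of what “end-to-end” means in practice, the sketch below shows a toy supervised training step that maps camera pixels directly to steering and acceleration targets, with no hand-coded driving rules in between. It assumes PyTorch and uses random tensors as stand-ins for real fleet video and driver controls; it is not Tesla’s or NVIDIA’s actual training code.

```python
import torch
import torch.nn as nn

# Toy end-to-end model: camera pixels in, control commands out.
class TinyDrivingNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, 2)  # [steering, acceleration]

    def forward(self, frames):
        return self.head(self.backbone(frames))

model = TinyDrivingNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Stand-ins for a batch of fleet video frames and the human driver's controls.
frames = torch.randn(8, 3, 128, 128)
human_controls = torch.randn(8, 2)

# One supervised step: imitate what the human driver actually did.
pred = model(frames)
loss = nn.functional.mse_loss(pred, human_controls)
loss.backward()
optimizer.step()
print(f"imitation loss: {loss.item():.4f}")
```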

The March of 9’s

The deeper issue isn’t the architecture itself but the difficulty of perfecting it. Elon emphasized that reaching 99% capability is straightforward, while solving the remaining 1%—the long tail of rare scenarios—is exceptionally difficult. Tesla’s Director of AI, Ashok Elluswamy, also commented on the Long Tail Trap on X, highlighting the same problem.

The final 1% involves bizarre, one-in-a-million scenarios: a person in a chicken suit crossing the street; a roundabout with traffic flowing in three directions; or a sign partially covered by snow that reads 5 instead of 50.
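A quick back-of-envelope calculation shows why fleet scale matters for these one-in-a-million events. The vehicle counts and mileage figures below are illustrative assumptions, not official numbers from either company.

```python
# Back-of-envelope: how often is a "one in a million miles" event observed?
EVENT_RATE = 1 / 1_000_000  # events per mile (illustrative assumption)

def daily_encounters(vehicles: int, miles_per_vehicle_per_day: float) -> float:
    """Expected number of rare events the whole fleet sees per day."""
    return vehicles * miles_per_vehicle_per_day * EVENT_RATE

# Illustrative assumptions: a large consumer fleet vs. a small test fleet.
large_fleet = daily_encounters(vehicles=5_000_000, miles_per_vehicle_per_day=30)
small_fleet = daily_encounters(vehicles=500, miles_per_vehicle_per_day=200)

print(f"Large fleet: ~{large_fleet:,.0f} rare events per day")  # ~150 per day
print(f"Small fleet: ~{small_fleet:.2f} rare events per day")   # ~0.1 per day
```

Under these assumptions, a large consumer fleet surfaces the same rare scenario hundreds of times a day, while a small test fleet might wait weeks to see it once.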

Ashok’s remarks reflect the gruelling work Tesla has put in over the past four years. Millions of miles of data are curated daily to surface the rare edge cases needed to train models that can handle real-world complexity. NVIDIA, by contrast, is trying to close this gap with simulation tools and a much smaller real-world dataset from partners.
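The curation step itself can be pictured as a filter over fleet clips: keep only the moments where the driver intervened or the model was unsure. The sketch below is a simplified illustration of that idea; the `Clip` fields and thresholds are invented for the example and do not describe Tesla’s actual pipeline.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Clip:
    clip_id: str
    driver_intervened: bool   # human took over from the system
    model_uncertainty: float  # 0.0 = confident, 1.0 = completely unsure
    scenario_label: str       # e.g. "roundabout", "occluded_sign"

def select_edge_cases(clips: List[Clip], uncertainty_threshold: float = 0.7) -> List[Clip]:
    """Keep clips where the driver intervened or the model was unsure."""
    return [
        c for c in clips
        if c.driver_intervened or c.model_uncertainty >= uncertainty_threshold
    ]

fleet_clips = [
    Clip("a1", driver_intervened=False, model_uncertainty=0.1, scenario_label="highway"),
    Clip("b2", driver_intervened=True,  model_uncertainty=0.4, scenario_label="occluded_sign"),
    Clip("c3", driver_intervened=False, model_uncertainty=0.9, scenario_label="roundabout"),
]
for clip in select_edge_cases(fleet_clips):
    print(clip.clip_id, clip.scenario_label)
```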

The Hardware Lag

There is also a structural challenge for NVIDIA: it does not manufacture cars. It supplies chips and software to automakers rather than building vehicles itself, which adds complexity to broad deployment.

Even though NVIDIA plans to roll Alpamayo into the Mercedes CLA later this year, the initial deployment will likely be limited in volume. Tesla already has millions of cars on the road collecting the long-tail data that helps address that last, elusive 1%.

Welcome to the Grind

NVIDIA’s announcement serves as a strong validation of Tesla’s end-to-end neural network approach. The world’s leading AI chipmaker adopting the same architecture suggests the industry may be coalescing around a common solution.

However, as Elon and Ashok emphasize, architecture is only the starting point. The competitive edge will come from data—finding and training on the rare edge cases that cause reasoning to fail. NVIDIA has produced a powerful engine; now it must prove it in the chaotic and messy environments where Tesla has already spent a decade gathering experience.

Mutual Respect

Jensen Huang acknowledged Tesla’s work and praised its technology stack as one of the most advanced available.
