At CES 2026 on Monday, NVIDIA Corp. (NASDAQ:NVDA) unveiled a shift in autonomous vehicle (AV) development with the release of its open-source Alpamayo model family.
Previous self-driving systems relied on separate modules for "seeing" (perception) and "steering" (planning); Alpamayo instead introduces vision-language-action (VLA) models designed to reason in a human-like way.
A core challenge in autonomy has been the "long tail": rare, unpredictable road scenarios that traditional algorithms fail to navigate.
Nvidia says Alpamayo 1, a 10-billion-parameter model, addresses this by using chain-of-thought reasoning.
“The ChatGPT moment for physical AI is here — when machines begin to understand, reason and act in the real world,” said Jensen Huang, CEO of Nvidia.
“Robotaxis are among the first to benefit. Alpamayo brings reasoning to autonomous vehicles, allowing them to think through rare scenarios, drive safely in complex environments and explain their driving decisions — it’s the foundation for safe, scalable autonomy,” Huang added.
Much like a human driver might think, “There is a ball in the street, so a child might follow,” Alpamayo 1 generates trajectories alongside logical traces.
This transparency is crucial for helping developers and regulators understand why a vehicle made a specific decision.
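To make the idea concrete, here is a toy sketch of a planner that pairs every trajectory with a human-readable reasoning trace, in the spirit of the "ball in the street" example above. This is purely illustrative; `DrivingDecision`, `plan`, and the waypoint values are hypothetical and not part of Nvidia's Alpamayo API.

```python
from dataclasses import dataclass

@dataclass
class DrivingDecision:
    trajectory: list[tuple[float, float]]  # future (x, y) waypoints in meters
    reasoning: str                         # chain-of-thought trace behind the plan

def plan(scene: str) -> DrivingDecision:
    """Toy planner: every trajectory ships with the reasoning that produced it."""
    if "ball" in scene:
        # Hazard cue: shorten the planned path and add lateral margin.
        return DrivingDecision(
            trajectory=[(0.0, 0.0), (4.0, 0.5), (7.0, 1.0)],
            reasoning="A ball is in the street, so a child may follow; slow down.",
        )
    return DrivingDecision(
        trajectory=[(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)],  # maintain lane and speed
        reasoning="Road clear; maintain current speed and lane.",
    )

decision = plan("a ball rolls into the street")
print(decision.reasoning)
```

A real VLA model would produce both outputs from a single learned forward pass, but the coupling shown here, a trajectory that is never emitted without its justification, is what makes the decisions auditable for developers and regulators.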
Nvidia is providing a full-stack open development environment:
| Component | Description |
| --- | --- |
| Alpamayo 1 | An open VLA model that acts as a “teacher,” allowing developers to distill its complex reasoning into smaller, faster models for use in actual cars. |
| AlpaSim | An open-source, high-fidelity simulation framework built to test vehicles in a “closed-loop” digital environment before they hit the pavement. |
| Physical AI Datasets | Over 1,700 hours of diverse driving data specifically curated to include the rare edge cases that have historically hindered Level 4 autonomy. |
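The "teacher" role described for Alpamayo 1 follows the standard knowledge-distillation recipe: a small in-car model is trained to match the large model's temperature-softened output distribution. The sketch below shows that loss term with NumPy; the logits, temperature, and function names are illustrative assumptions, not Nvidia's implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax; higher T exposes more of the teacher's uncertainty."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()                      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, T)    # soft targets from the large teacher model
    q = softmax(student_logits, T)    # predictions from the small in-car student
    return float(np.sum(p * np.log(p / q)))

teacher = [2.0, 0.5, -1.0]            # e.g. scores over candidate maneuvers
student = [1.8, 0.6, -0.9]
print(distillation_loss(teacher, student))
```

Minimizing this loss (usually alongside a hard-label term) lets the student inherit the teacher's reasoning-shaped output behavior while staying small and fast enough for in-vehicle hardware.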
By moving toward end-to-end physical AI, Nvidia is leveraging its dominant hardware position, specifically the DRIVE Thor platform, to run these massive neural networks.
Nvidia said industry leaders such as Lucid Group, Inc. (NASDAQ:LCID) and Uber Technologies, Inc. (NYSE:UBER) are already showing interest in the Alpamayo framework to fast-track their Level 4 roadmaps.
“The shift toward physical AI highlights the growing need for AI systems that can reason about real-world behavior, not just process data,” said Kai Stepper, vice president of ADAS and autonomous driving at Lucid Motors.
“Advanced simulation environments, rich datasets and reasoning models are important elements of the evolution,” Stepper added.
As Huang noted, this could be the “ChatGPT moment” for physical AI, where machines finally begin to understand the nuances of the physical world rather than just reacting to it.
Photo: Shutterstock