
Think of them as jacks of all trades that can also master specific jobs.
Building these robots requires integrated cloud-to-robot workflows that make it seamless to collect and generate data, train and evaluate control policies, and deploy them safely onto physical machines. These generalist-specialist systems depend on reasoning vision language action (VLA) models to perceive, understand and act intelligently across diverse tasks.
To accelerate this shift, the open NVIDIA Isaac platform provides robotics developers with everything they need — models, data pipelines, simulation frameworks, runtime libraries — to build a robot and deploy it at scale with NVIDIA’s three-computer solution. NVIDIA even provides an open VLA model, NVIDIA Isaac GR00T N, which gives developers a powerful foundation to bootstrap and post-train their own robotic intelligence.
These models, libraries and frameworks can run in the cloud or on edge AI infrastructure — and can now be further accelerated with the integration of long-running agents like OpenClaw.
With the latest agent-friendly NVIDIA Isaac GR00T models, Isaac robot simulation and learning frameworks, as well as edge AI systems announced this week at NVIDIA GTC, NVIDIA is giving developers new, powerful tools for the generalist-specialist era of autonomy.
These workflows are open and composable, so developers can mix and match components, bring their own tools and data, and accelerate their pipeline from prototype to real-world deployment.
Agility uses NVIDIA Isaac open frameworks to bring its robots from simulation to reality.
It all starts with data.
Turning Compute Into Data
Just a couple of years ago, scaling a robotics pipeline depended on a developer’s ability to manually collect data: A robot’s learning was limited by its exposure to different scenarios and real-world environments.
NVIDIA open libraries and frameworks change the equation by blending real-world signals — sensor logs and teleoperation demonstrations — with simulation-generated data to quickly turn cloud compute into large quantities of usable data.
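To make the blending idea concrete, here is a minimal, framework-agnostic sketch (not an NVIDIA API) of how a training set might be filled to a target size from a scarce pool of real demonstrations plus a generator of simulated episodes. All names and the 80% synthetic fraction are illustrative assumptions.

```python
import random

def build_training_set(real_episodes, synth_generator, target_size,
                       synth_fraction=0.8, seed=0):
    """Blend real demonstrations with generated synthetic episodes.

    real_episodes: list of collected real-world samples (e.g. teleop logs)
    synth_generator: callable(rng) -> one synthetic episode
    synth_fraction: share of the final set drawn from simulation
    """
    rng = random.Random(seed)
    n_synth = int(target_size * synth_fraction)
    n_real = target_size - n_synth
    # Sample (with replacement) from the scarce real pool so it can be
    # stretched to fill its share of the larger blended dataset.
    real_part = [rng.choice(real_episodes) for _ in range(n_real)]
    synth_part = [synth_generator(rng) for _ in range(n_synth)]
    dataset = real_part + synth_part
    rng.shuffle(dataset)
    return dataset

# Example: 100 real logs stretched into a 1,000-episode set, 80% synthetic.
real = [{"source": "real", "id": i} for i in range(100)]
synth = lambda rng: {"source": "sim", "noise": rng.random()}
data = build_training_set(real, synth, target_size=1000, synth_fraction=0.8)
```

The point of the sketch is the ratio: compute, not human time, becomes the limiting factor, because the synthetic share can be scaled up without collecting another real demonstration.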
Generating high-fidelity, physically accurate synthetic data helps robotics developers overcome the limitations of physical data collection, where it can be difficult or impossible to gather enough information about rare edge cases. These edge cases may be hard or unsafe to capture physically, but they’re essential for a robot to master before deploying at scale in unpredictable, real-world environments.
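One common way to address rare edge cases in synthetic data is to oversample them relative to their real-world frequency. The toy sketch below (not Isaac Sim code; the scenario names, frequencies and weights are invented for illustration) shows the pattern: a scenario catalog records how rare each case is in the wild and how heavily it should be boosted in the generated set.

```python
import random

# Hypothetical scenario catalog: real-world frequency vs. the boosted
# sampling weight used for synthetic generation. Edge cases are
# deliberately over-represented so a policy sees enough of them.
SCENARIOS = {
    "nominal_pick":      {"real_freq": 0.90, "synth_weight": 0.40},
    "occluded_object":   {"real_freq": 0.07, "synth_weight": 0.25},
    "dropped_grasp":     {"real_freq": 0.02, "synth_weight": 0.20},
    "human_in_workcell": {"real_freq": 0.01, "synth_weight": 0.15},
}

def sample_scenarios(n, seed=0):
    """Draw n synthetic scenario labels using the boosted weights."""
    rng = random.Random(seed)
    names = list(SCENARIOS)
    weights = [SCENARIOS[s]["synth_weight"] for s in names]
    return rng.choices(names, weights=weights, k=n)

batch = sample_scenarios(10_000)
rare_share = sum(1 for s in batch if s != "nominal_pick") / len(batch)
```

Here edge cases that occur in roughly 10% of real operation make up about 60% of the generated batch, which is exactly the kind of rebalancing that is hard or unsafe to achieve through physical collection alone.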
While synthetic data today makes up just 20% of AI training data for edge scenarios, it’s expected to constitute more than 90% of edge scenario data by 2030, according to a report by Gartner.
NVIDIA is propelling this shift with libraries and open frameworks that fuel an entire factory for realistic synthetic data based on the physical world.
NVIDIA Omniverse NuRec accelerated 3D Gaussian splatting libraries, now in general availability, turn real-world sensor data into OpenUSD-based interactive simulations in NVIDIA Isaac Sim, an open-source robotics simulation framework. Developers can scan a real environment once, reconstruct it digitally and then safely test robots against conditions grounded in real physical interactions.
