Physical AI: Amazon, NVIDIA, and MassRobotics accelerating development of smart robots

Amazon and NVIDIA are teaming up with MassRobotics to make physical AI a reality. That's atoms-and-molecules robots in the real physical world, powered by electrons-and-chips AI that makes them smart, adaptable, capable, and extendable.

They're doing it via a Physical AI Fellowship, essentially an accelerator for robotics companies. It's part of a growing trend in the evolution of artificial intelligence: from AI that's stuck in a box to embodied intelligence that perceives, moves, and acts.

In this episode of TechFirst, I talk with Taimur Rashid, Head of Generative AI and Innovation Delivery at AWS, about what physical AI really means, why it’s essential now, and how Amazon and NVIDIA are using it to accelerate the future of robotics.

Watch the full episode here (and subscribe to my YouTube channel while you’re at it):

Let’s start here: what is “physical AI”?

According to Rashid, physical AI combines spatial awareness, generative intelligence, and real-world actuation — essentially teaching AI to understand and safely interact with physical environments.

It’s what happens when the power of large language models meets sensors, motors, and cameras. The result is AI that can do things, not just say things: navigate, grasp, manipulate, and move through the world.

Safety is the first principle, Rashid says. Unlike a chatbot that can “hallucinate” with little consequence, a robot misunderstanding its environment could cause harm — so the data, training, and models behind physical AI need to be extraordinarily robust.

The first cohort in the Physical AI Fellowship includes startups working on:

  • Autonomous naval drone ships
  • Robotic construction equipment
  • Robotic hands
  • Warehouse robots
  • Powered exoskeletons
  • Agtech for better food
  • And more …

Why now?

Three forces are converging right now …

  • Labor shortages: Over 2.5 billion people globally do physical labor, representing $50 trillion in annual output. Aging populations are creating major workforce gaps in manufacturing, logistics, and healthcare.

  • Falling hardware costs: Sensors, cameras, and robotics-grade processors have become dramatically cheaper and more efficient.

  • Generative + agentic AI: The same breakthroughs that let ChatGPT reason and act digitally are now being applied to the physical world.

Add it all up, and this is the moment for physical AI: intelligence that doesn’t just compute, but moves.

The startups building the future

The first Physical AI Fellowship cohort includes eight startups tackling everything from construction to caregiving:

  • Bedrock Robotics
    Based in San Francisco, Bedrock Robotics retrofits existing heavy construction machinery with sensors, computers, and AI software, turning ordinary excavators and bulldozers into autonomous equipment that can work safely and efficiently on active job sites.
  • Blue Water Autonomy
    Blue Water Autonomy develops autonomous systems for naval and maritime operations, building AI-powered drone vessels that can navigate, patrol, and perform missions on open water without direct human control.
  • Diligent Robotics
    Austin-based Diligent Robotics builds socially intelligent humanoid robots designed to assist people in hospitals and senior living facilities, performing routine tasks like deliveries and logistics so human staff can focus on care.
  • Generalist AI
    Generalist AI is creating adaptable AI models that can learn and perform a wide range of real-world robotic tasks, aiming to bridge the gap between today’s narrow, task-specific robots and truly multipurpose, general-capability machines.
  • RobCo
    RobCo provides modular robotic arms and automation kits for small and medium-sized manufacturers, enabling flexible, low-cost automation that can be easily reconfigured for different production tasks without expert coding or integration.
  • Tutor Intelligence
    Tutor Intelligence builds AI-powered control software that allows robots to learn and adapt through observation and interaction, reducing the need for manual programming and speeding up deployment across diverse industrial environments.
  • Wandercraft
    Wandercraft, a French robotics company, designs powered exoskeletons that restore or enhance human mobility, giving people with walking impairments the ability to stand and move naturally through advanced balance and motion control systems.
  • Zordi
    Zordi is building autonomous farm robots that use AI and computer vision to monitor, harvest, and manage crops, helping growers improve yield, reduce labor costs, and make farming more sustainable.

Each is blending edge AI, reinforcement learning, and real-time physical interaction to solve some of the hardest problems in robotics.

Generalists, specialists, and the path ahead

Rashid predicts a future with both general-purpose and specialized robots — mirroring how AI models are evolving today. Foundation models can learn broad skills, while specialized models (and robots) handle domain-specific, fine-grained tasks like surgery or precision assembly.

Amazon itself is already seeing results: robotic automation in fulfillment centers has boosted efficiency by 25%, with software intelligence playing just as big a role as the hardware.

“We’re in the early days of agentic AI,” Rashid says. “And we’re in the earliest days of physical AI.”

But if this fellowship is any indication, there’s very cool stuff coming in the near future.