Key Takeaways:

- For decades, robots have been able to perform a fixed set of predefined tasks repetitively but have struggled to handle novelty and changing conditions.
- Now, however, robots and artificial intelligence (AI) are beginning to converge, and we may finally see industrial robots that can recognize patterns and generalize.
- Successfully integrating AI with robotics could also make robots cheaper, since the hardware would no longer need to be as precise.

There’s a palpable excitement in Silicon Valley right now. Tesla is building humanoid robots. Google DeepMind launched Gemini Robotics. Startups are raising millions on pitch decks about robots folding laundry. The goal is a robot that can perceive, decide, and act in the real world, rather than just processing text or images on a server. This is known as “embodied artificial intelligence,” and the term is currently everywhere.

When I started working in robotics again, after years of watching the field promise breakthroughs that never quite arrived, I met this excitement with some skepticism. I’d heard “this time is different” before. Likewise, if you’re outside this world, you might be wondering: What’s the big deal? We’ve had robots for decades. Factories have used robotic arms since the 1960s. Amazon warehouses are full of autonomous machines.

But 2026 does actually feel different. Why? The answer lies in understanding what robots couldn’t do until now, and what has suddenly changed to make the previously impossible feel within reach.

The long history of robots that could only do one thing

To understand where we are, we…