Just posted to my Forbes column about a breakthrough that could fundamentally change how robots interact with people—and how they protect themselves.
Researchers at City University of Hong Kong have developed a neuromorphic robotic “e-skin” that works far more like human skin. Instead of routing every touch signal up to a central CPU or GPU, this skin senses pressure, detects damage, and even triggers instant local reflexes—much like the way your spinal cord yanks your hand away from a hot stove before your brain catches up. It’s a major shift from today’s robotic architectures, where touch is slow, centralized, and often little more than an on/off signal.
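To make the "local reflex" idea concrete, here's a minimal sketch of the concept: each skin patch decides on its own whether to react, and only routine readings travel upstream. All names, classes, and thresholds below are illustrative assumptions, not details from the actual research.

```python
# Illustrative sketch only: a distributed e-skin patch with a local
# reflex, analogous to a spinal reflex arc. Threshold and names are
# hypothetical, not from the City University of Hong Kong design.

REFLEX_THRESHOLD = 50.0  # illustrative pressure limit (arbitrary units)

class SkinPatch:
    def __init__(self, patch_id):
        self.patch_id = patch_id

    def sense(self, pressure, central_log):
        # Local reflex: react immediately, before any central
        # controller ever sees the signal.
        if pressure > REFLEX_THRESHOLD:
            return "retract"  # instant local response
        # Routine readings go upstream for normal processing.
        central_log.append((self.patch_id, pressure))
        return "ok"

log = []
patch = SkinPatch("arm-3")
assert patch.sense(80.0, log) == "retract"  # reflex fires locally
assert patch.sense(10.0, log) == "ok"       # normal touch logged centrally
```

The key design point is that the dangerous-pressure branch never touches the central log at all—the decision and the reaction both happen at the patch.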
What makes this especially interesting is that the skin can also detect when it’s been damaged and pinpoint exactly where the injury occurred, thanks to modular components that constantly signal they’re “alive.” If a section fails, it can be quickly swapped out—no full disassembly required. That’s a big step toward safer, more resilient robots operating inches away from humans in hospitals, homes, hotels, and collaborative workplaces. Aside from a few exceptions like Neura Robotics, most robots today simply aren’t built with this kind of tactile intelligence.