Engineering the next generation of surgical robots...conductive skin gives robots the human touch...how vision integration is transforming collaborative robotics. Read about these innovations and more in this compendium of articles from the editors of Tech Briefs and Motion Design.


Overview

The February 2026 Robotics & Motion Control Special Report showcases advances that are transforming the robotics landscape, spanning tactile sensing, vision systems, soft robotics, autonomous surgery, and AI-driven design.

A key breakthrough is a flexible, durable electronic robotic skin developed by researchers at the University of Cambridge and University College London. Unlike conventional skins that embed multiple distinct sensors, this single-material, gelatine-based conductive hydrogel contains over 860,000 sensory pathways, enabling multi-modal touch detection (pressure, temperature, cuts, and multi-point contact) that mimics human skin more closely. Equipped with only 32 electrodes at its wrist, the skin gathers millions of data points, which machine learning models refine into efficient, nuanced tactile perception. Applications range from humanoid robots and prosthetics to automotive and disaster-relief robotics.
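The report describes the learning pipeline only at a high level. As a toy illustration of how a trained model might map 32-electrode readings to touch categories, the sketch below uses a nearest-centroid classifier over invented signal signatures; only the 32-electrode figure comes from the article, while the prototype patterns, class labels, and classifier choice are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N_ELECTRODES = 32  # the skin is read out through 32 electrodes at the wrist

# Invented prototype signatures: each touch modality is assumed to produce
# a characteristic pattern of readings across the electrodes.
prototypes = {
    "light_press": rng.normal(0.2, 0.05, N_ELECTRODES),
    "hard_press": rng.normal(0.8, 0.05, N_ELECTRODES),
    "hot_contact": rng.normal(0.5, 0.05, N_ELECTRODES) + np.linspace(0, 0.4, N_ELECTRODES),
}

def classify(reading: np.ndarray) -> str:
    """Nearest-centroid rule: label a reading with the closest prototype (L2)."""
    return min(prototypes, key=lambda k: np.linalg.norm(reading - prototypes[k]))

# A noisy measurement near the "hard_press" signature is labelled correctly.
sample = prototypes["hard_press"] + rng.normal(0, 0.02, N_ELECTRODES)
print(classify(sample))  # hard_press
```

In practice the article's skin feeds far richer data into learned models; the point here is only the distillation step, from many raw channels to a handful of touch categories.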

Complementing these tactile advances, new robotic eyes developed at Fuzhou University use quantum dot technology to emulate the rapid light adaptation of human vision. These sensors adjust to extreme lighting within 40 seconds, much faster than the human eye, enabling reliable operation in autonomous vehicles and robots navigating dynamic lighting conditions such as tunnels and direct sunlight. The sensor's nano-engineered layers trap and release charges much as the eye's photopigments do, enhancing responsiveness while reducing redundant visual data and power consumption.
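The article gives an adaptation time but no model. One common way to sketch this kind of charge-trap light adaptation is a first-order exponential approach of the sensor's internal state toward the new illumination level; the 40-second figure is from the article, while the time constant and update rule below are illustrative assumptions.

```python
import math

ADAPT_TIME_S = 40.0        # reported time to adapt to an extreme lighting change
TAU = ADAPT_TIME_S / 5.0   # assumed first-order time constant (~99% settled at 5*tau)

def adapted_level(start: float, target: float, t: float) -> float:
    """First-order (RC-like) model of charge trapping/release: the internal
    adaptation state decays exponentially from `start` toward `target`."""
    return target + (start - target) * math.exp(-t / TAU)

# Entering a dark tunnel from bright sun: by t = 40 s the state has settled
# to within about 1% of the new level.
level = adapted_level(start=1.0, target=0.0, t=ADAPT_TIME_S)
print(f"{level:.3f}")  # 0.007
```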

Soft robotics is advancing as well: novel thin liquid crystal elastomer actuators enable millimeter-scale soft vine robots to navigate delicate environments, such as threading models of human arteries and the interiors of jet engines. These robots grow by everting their own skin and steer via temperature- and pressure-based control, opening prospects for wearable haptics, grippers, and delicate exploration.
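The report only sketches how these vine robots move. As a purely illustrative model (the growth gain, pressure threshold, and control law below are all invented), tip growth can be tied to supply pressure above an eversion threshold, with steering supplied by a differential actuation signal such as asymmetric heating:

```python
from dataclasses import dataclass
import math

@dataclass
class VineTip:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0  # radians

GROWTH_GAIN = 0.5       # invented: mm of growth per kPa of excess pressure per step
EVERSION_THRESH = 5.0   # invented: kPa required before the skin everts

def step(tip: VineTip, pressure_kpa: float, steer: float) -> VineTip:
    """Advance the tip by everting skin (pressure-driven growth) while the
    steering signal bends the heading."""
    growth = GROWTH_GAIN * max(0.0, pressure_kpa - EVERSION_THRESH)
    heading = tip.heading + steer
    return VineTip(tip.x + growth * math.cos(heading),
                   tip.y + growth * math.sin(heading),
                   heading)

tip = VineTip()
for _ in range(10):
    tip = step(tip, pressure_kpa=7.0, steer=0.05)  # grow while curving gently
print(f"({tip.x:.2f}, {tip.y:.2f})")
```

Below the threshold the tip does not advance at all, which captures the article's point that eversion, not pushing, is what lets these robots thread delicate passages.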

On the control and design front, AI-powered frameworks optimize complex actuator configurations and morphing capabilities, reducing the number of control channels without sacrificing function. This human-AI collaboration promises adaptive, scalable robots with shape-changing abilities, potentially revolutionizing everyday objects such as wearable devices and robotic bedsheets.
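The report does not say how channel reduction is achieved. One simple way to illustrate the idea is to merge actuators whose command traces are nearly identical, so each group shares a single channel; the actuator count, command signals, and greedy correlation rule below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented example: 6 actuators, but two pairs always receive near-identical
# commands, so their channels can be merged without changing behaviour.
t = np.linspace(0, 2 * np.pi, 100)
commands = np.stack([
    np.sin(t), np.sin(t) + 0.01 * rng.normal(size=t.size),  # near-duplicate pair
    np.cos(t), np.cos(t) + 0.01 * rng.normal(size=t.size),  # near-duplicate pair
    np.sin(2 * t),
    np.cos(2 * t),
])

def merge_channels(cmds: np.ndarray, corr_thresh: float = 0.99) -> list[list[int]]:
    """Greedily group actuators whose command traces are highly correlated;
    each group can then be driven by one shared control channel."""
    groups: list[list[int]] = []
    for i in range(cmds.shape[0]):
        for g in groups:
            if np.corrcoef(cmds[i], cmds[g[0]])[0, 1] > corr_thresh:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

groups = merge_channels(commands)
print(len(groups))  # 4 channels instead of 6
```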

In surgical robotics, Johns Hopkins University's STAR (Smart Tissue Autonomous Robot) demonstrates autonomous operation on tissue by learning from surgical videos, pointing to a future in which robots assist with or perform surgeries using real-time sensing, machine learning, and advanced control, augmenting human surgeons.

Overall, the report underscores a robotics revolution fueled by the integration of advanced materials, sensory intelligence, AI-driven design, and system-level optimization, paving the way for more adaptable, autonomous, and human-like robots capable of operating in complex, variable, and delicate real-world environments.