Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute (TII) in Abu Dhabi have introduced X1, a multirobot system that pairs a humanoid robot with a transforming drone. The collaboration draws on both institutions’ expertise in autonomous systems, artificial intelligence, robotics, and propulsion.
Aaron Ames, the director of CAST, describes the unique capabilities of the X1 system, stating, “Right now, robots can fly, robots can drive, and robots can walk. Those are all great in certain scenarios. But how do we take those different locomotion modalities and put them together into a single package, so we can excel from the benefits of all these while mitigating the downfalls that each of them have?”
Recently, the team conducted a demonstration of the X1 system on Caltech’s campus. In the scenario presented during the demo, an emergency situation required autonomous agents to quickly reach the scene. The team modified a Unitree G1 humanoid to carry M4, Caltech’s multimodal robot, as a “backpack” for the test.
The demo began with the humanoid walking through the campus, eventually reaching an elevated spot where M4 could be safely deployed. The humanoid then released M4, which launched in drone mode before transforming into driving mode to continue efficiently toward the destination. Along the way, M4 encountered obstacles such as the Turtle Pond, switching back to drone mode to fly over them and reach the site of the “emergency” near Caltech Hall.
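Conceptually, a demo like this hinges on a policy that selects a locomotion mode for each segment of the route. The sketch below is a minimal, hypothetical illustration of such a mode-selection rule in Python; it is not the Caltech/TII software, and every name in it (Mode, choose_mode, the terrain labels) is invented for illustration.

```python
from enum import Enum, auto

class Mode(Enum):
    """Locomotion modes available to the combined humanoid + M4 system."""
    WALK = auto()   # humanoid carries M4 through human-scale spaces
    DRIVE = auto()  # M4 rolls on open ground (energy-efficient)
    FLY = auto()    # M4 flies over obstacles it cannot drive around

def choose_mode(terrain: str, carrying_m4: bool) -> Mode:
    """Pick a locomotion mode from a coarse terrain label.

    Hypothetical decision rule loosely mirroring the demo: walk while the
    humanoid still carries M4, fly over blockers such as water, and
    otherwise drive.
    """
    if carrying_m4:
        return Mode.WALK
    if terrain in ("water", "stairs", "rubble"):
        return Mode.FLY
    return Mode.DRIVE

# Replay the demo's rough sequence of terrain segments.
mission = [("campus path", True), ("open ground", False),
           ("water", False), ("open ground", False)]
for terrain, carrying in mission:
    print(f"{terrain}: {choose_mode(terrain, carrying).name}")
```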
The X1 system’s ability to transition seamlessly between flying, driving, and walking highlights its potential for emergency response and other autonomous operations. Led by Mory Gharib, Ph.D., of Caltech and CAST, the research teams combined expertise in flying and driving robots, locomotion, algorithms, autonomy, and sensing to build the humanoid and M4 into a unified multi-robot response team.
The partnership also includes the expertise of TII in autonomy and sensing in urban environments, as well as Northeastern University’s contributions to morphing robot design. According to Alireza Ramezani, an associate professor at Northeastern, the collaboration brought together researchers with varied skills to tackle challenging robotics problems.
During a visit to Caltech in July 2025, TII engineers worked on enhancing the M4 robot with Saluki, TII’s secure flight controller and onboard computer. Future plans for the collaboration involve equipping the system with sensors, model-based algorithms, and machine learning-driven autonomy so that it can navigate and adapt to its surroundings in real time.
Claudio Tortorici, director of TII, highlighted the installation of various sensors like lidar, cameras, and range finders to enable the robot’s autonomous movement. The goal is for the robot to understand its location and navigate from point to point independently.
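As a rough picture of what “navigate from point to point independently” involves, the sketch below shows a bare-bones waypoint follower that consumes a pose estimate (which, in a real system, would come from fusing lidar, camera, and range-finder data) and steers toward a goal. This is an illustrative assumption, not the team’s actual navigation stack; waypoint_step and its gains are hypothetical.

```python
import math

def waypoint_step(pose, goal, v_max=1.0, k_heading=1.5, dt=0.1):
    """One control step of a bare-bones 2D waypoint follower.

    pose: (x, y, heading in rad), assumed to come from an onboard
          localization stack fusing lidar, camera, and range-finder data.
    goal: (x, y) of the next waypoint.
    Returns the pose after applying a simple go-to-goal steering law.
    """
    x, y, th = pose
    gx, gy = goal
    bearing = math.atan2(gy - y, gx - x)
    # Wrap heading error to [-pi, pi] and turn toward the goal.
    err = (bearing - th + math.pi) % (2 * math.pi) - math.pi
    w = k_heading * err                     # turn-rate command
    v = v_max * max(0.0, math.cos(err))     # slow down when misaligned
    return (x + v * math.cos(th) * dt, y + v * math.sin(th) * dt, th + w * dt)

# Drive from the origin toward a single waypoint until within 0.2 m of it.
pose, goal = (0.0, 0.0, 0.0), (5.0, 3.0)
while math.dist(pose[:2], goal) > 0.2:
    pose = waypoint_step(pose, goal)
print(f"reached waypoint near ({pose[0]:.2f}, {pose[1]:.2f})")
```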
In a demonstration, the humanoid robot showcased capabilities beyond basic mobility. Traditionally, humanoid robots mimic human movements learned from captured motion data, but Ames emphasized the need to generate actions without human references so that robots can be deployed in real-world scenarios. His team focuses on building mathematical models that apply physics principles to robots more broadly.
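A classic textbook example of this kind of model-based approach is stabilizing an inverted pendulum, a common stand-in for balance, with feedback gains derived from the physics rather than from recorded human motion. The sketch below is only an illustrative analogy, not the team’s controller; the model parameters and gains are hypothetical.

```python
import math

# Physical model of an inverted pendulum (a stand-in for balance):
# torque balance about the pivot gives  I * th_dd = m*g*L*sin(th) + u.
g, L, m = 9.81, 1.0, 1.0      # gravity, rod length, point mass (hypothetical)
I = m * L**2                  # moment of inertia about the pivot

# Derive feedback gains from the linearized model (sin(th) ~ th) so the
# closed loop satisfies th_dd + 6*th_d + 9*th = 0 (both poles at -3):
kp = 9.0 * I + m * g * L
kd = 6.0 * I

def simulate(theta0, steps=2000, dt=0.002):
    """Integrate the nonlinear pendulum under the model-derived controller."""
    th, th_d = theta0, 0.0
    for _ in range(steps):
        u = -kp * th - kd * th_d                    # control torque
        th_dd = (m * g * L * math.sin(th) + u) / I  # nonlinear dynamics
        th_d += th_dd * dt
        th += th_d * dt
    return th

print(f"tilt 4 s after a 0.3 rad push: {simulate(0.3):.4f} rad")
```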
The collaboration between these research teams has paved the way for a transformative approach to emergency response protocols, with the potential to revolutionize the field of robotics and autonomous systems.