Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
ICRA@40: 23–26 September 2024, ROTTERDAM, NETHERLANDS
In this three-minute one-take video, the LimX Dynamics CL-1 takes on the challenge of continuously loading heavy objects among shelves in a simulated warehouse, showcasing the advantages of the general-purpose humanoid form factor.
Birds, bats, and many insects can tuck their wings against their bodies when at rest and deploy them to power flight. Whereas birds and bats use well-developed pectoral and wing muscles, how insects control their wing deployment and retraction remains unclear, because the mechanism varies among insect species. Here we demonstrate that rhinoceros beetles can effortlessly deploy their hindwings without requiring muscular activity. We validated the hypothesis using a flapping microrobot that passively deployed its wings for stable, controlled flight and retracted them neatly upon landing, demonstrating a simple yet effective approach to the design of insect-like flying micromachines.
Agility Robotics’ CTO, Pras Velagapudi, talks about data collection: specifically, the different kinds the company collects from its real-world robot deployments, and what that data is used for.
ABB Robotics and Texas Children’s Hospital have developed a groundbreaking lab-automation solution using ABB’s YuMi® cobot to transfer fruit flies (Drosophila melanogaster) used in research into new drugs for neurological conditions such as Alzheimer’s, Huntington’s, and Parkinson’s.
Extend Robotics is building embodied AI that enables highly flexible automation for real-world physical tasks. The system features an intuitive immersive interface for teleoperation, supervision, and AI-model training.
At ICRA 2024, in Tokyo last May, we sat down with the director of Shadow Robot, Rich Walker, to talk about the journey toward developing its newest model. Designed for reinforcement learning, the hand is extremely rugged, has three fingers that act like thumbs, and has fingertips that are highly sensitive to touch.
Food Angel is a food-delivery robot intended to help with the problems of food insecurity and homelessness. Utilizing autonomous wheeled robots for this application may seem like a good approach, especially given a number of successful commercial robotic delivery services. However, beyond technical considerations such as range, payload, operating time, and autonomy, a number of important aspects still need to be investigated, such as how the general public and the recipients may feel about using robots for such applications, or human-robot interaction issues such as how to communicate the robot’s intent to the homeless.
The UKRI FLF RoboHike team, from the Robot Perception and Learning lab at UCL Computer Science, together with Forestry England, demonstrates the ANYmal robot helping to preserve the cultural heritage of a historic mine in the Forest of Dean, Gloucestershire, UK.
This clip is from a reboot of the British TV show “Time Team.” If you’re not already a fan of “Time Team,” let me just say that it is one of the greatest retro reality TV shows ever made, where actual archaeologists wander around the United Kingdom and dig stuff up. If they can find anything. Which they often can’t. And also it has Tony Robinson (from “Blackadder”), who runs everywhere for some reason. Go to Time Team Classics on YouTube for 70+ archived episodes.
UBTECH’s humanoid robot Walker S Lite worked in Zeekr’s intelligent factory for 21 consecutive days, completing handling tasks at the loading workstation and assisting employees with logistics work.
Current visual navigation systems often treat the environment as static, lacking the ability to adaptively interact with obstacles. This limitation leads to navigation failure when encountering unavoidable obstructions. In response, we introduce IN-Sight, a novel approach to self-supervised path planning, enabling more effective navigation strategies through interaction with obstacles.
MIT MechE researchers introduce an approach called SimPLE (Simulation to Pick Localize and placE), a method of precise kitting, or pick and place, in which a robot learns to pick, regrasp, and place objects using the object’s computer-aided design (CAD) model, and all without any prior experience or encounters with the specific objects.
Staff, students (and quadruped robots!) from UCL Computer Science wish the Great Britain athletes the best of luck this summer in the Olympic Games & Paralympics.
Walking in tall grass can be hard for robots, because they can’t see the ground that they’re actually stepping on. Here’s a technique to solve that, published in Robotics and Automation Letters last year.
There is no such thing as excess batter on a corn dog, and there is also no such thing as a defective donut. And apparently, making Kool-Aid drink pouches is harder than it looks.
IROS 2024: 14–18 October 2024, ABU DHABI, UNITED ARAB EMIRATES
We introduce Berkeley Humanoid, a reliable and low-cost mid-scale humanoid research platform for learning-based control. Our lightweight, in-house-built robot is designed specifically for learning algorithms with low simulation complexity, anthropomorphic motion, and high reliability against falls. Capable of omnidirectional locomotion and withstanding large perturbations with a compact setup, our system aims for scalable, sim-to-real deployment of learning-based humanoid systems.
This article presents Ray, a new type of audio-animatronic robot head. All the mechanical structure of the robot is built in one step by 3-D printing... This simple, lightweight structure and the separate tendon-based actuation system underneath allow for smooth, fast motions of the robot. We also develop an audio-driven motion generation module that automatically synthesizes natural and rhythmic motions of the head and mouth based on the given audio.
CSAIL researchers introduce a novel approach allowing robots to be trained in simulations of scanned home environments, paving the way for customized household automation accessible to anyone.
NVIDIA CEO Jensen Huang presented a major breakthrough on Project GR00T with WIRED’s Lauren Goode at SIGGRAPH 2024. In a two-minute demonstration video, NVIDIA explained a systematic approach they discovered to scale up robot data, addressing one of the most challenging issues in robotics.
In this research, we investigated the innovative use of a manipulator as a tail in quadruped robots to augment their physical capabilities. Previous studies have primarily focused on enhancing various abilities by attaching robotic tails that function solely as tails on quadruped robots. While these tails improve the performance of the robots, they come with several disadvantages, such as increased overall weight and higher costs. To mitigate these limitations, we propose the use of a 6-DoF manipulator as a tail, allowing it to serve both as a tail and as a manipulator.
In this end-to-end demo, we showcase how MenteeBot transforms the shopping experience for individuals, particularly those using wheelchairs. Through discussions with a global retailer, MenteeBot has been designed to act as the ultimate shopping companion, offering a seamless, natural experience.
Nature Fresh Farms, based in Leamington, Ontario, is one of North America’s largest greenhouse farms growing high-quality organics, berries, peppers, tomatoes, and cucumbers. In 2022, Nature Fresh partnered with Four Growers, a FANUC Authorized System Integrator, to develop a robotic system equipped with AI to harvest tomatoes in the greenhouse environment.
Honeybee Robotics, a Blue Origin company, is developing Lunar Utility Navigation with Advanced Remote Sensing and Autonomous Beaming for Energy Redistribution, also known as LUNARSABER. In July 2024, Honeybee Robotics captured LUNARSABER’s capabilities during a demonstration of a scaled prototype.
In this video we present results of our lab from the latest field deployments conducted in the scope of the Digiforest EU project, in Stein am Rhein, Switzerland. Digiforest brings together various partners working on aerial and legged robots, autonomous harvesters, and forestry decision-makers. The goal of the project is to enable autonomous robot navigation, exploration, and mapping, both below and above the canopy, to create a data pipeline that can support and enhance foresters’ decision-making systems.
If the Italian Institute of Technology’s iRonCub3 looks this cool while learning to fly, just imagine how cool it will look when it actually takes off!
Hovering is in the works, but this is a really hard problem, which you can read more about in Daniele Pucci’s post on LinkedIn.
Stanford Engineering and the Toyota Research Institute achieve the world’s first autonomous tandem drift. Leveraging the latest AI technology, Stanford Engineering and TRI are working to make driving safer for all. By automating a driving style used in motorsports called drifting—in which a driver deliberately spins the rear wheels to break traction—the teams have unlocked new possibilities for future safety systems.
Researchers at the Istituto Italiano di Tecnologia (Italian Institute of Technology) have demonstrated that under specific conditions, humans can treat robots as coauthors of the results of their actions. The condition that enables this phenomenon is a robot that behaves in a social, humanlike manner. Engaging in eye contact and participating in a common emotional experience, such as watching a movie, are key.
I’m sorry, but this looks exactly like a quadrotor sitting on a test stand.
The 12-pound Quad-Biplane combines four rotors and two wings without any control surfaces. The aircraft takes off like a conventional quadcopter and transitions to a more-efficient horizontal cruise flight, similar to that of a biplane. This combines the simplicity of a quadrotor design, providing vertical flight capability, with the cruise efficiency of a fixed-wing aircraft. The rotors are responsible for aircraft control both in vertical and forward cruise flight regimes.
Humans possess a remarkable ability to react to unpredictable perturbations through immediate mechanical responses, which harness the visco-elastic properties of muscles to maintain balance. Inspired by this behavior, we propose a novel design of a robotic leg utilizing fiber-jammed structures as passive compliant mechanisms to achieve variable joint stiffness and damping.
This video shows a dexterous avatar humanoid robot with VR teleoperation, hand tracking, and speech recognition to achieve highly dexterous mobile manipulation. Extend Robotics is developing a dexterous remote-operation interface to enable data collection for embodied AI and humanoid robots.
I never really thought about this, but wind turbine blades are hollow inside and need to be inspected sometimes, which is really one of those jobs where you’d much rather have a robot do it.
Here’s a full, uncut drone-delivery mission, including a package pickup from our AutoLoader—a simple, nonpowered mechanical device that allows retail partners to utilize drone delivery with existing curbside-pickup workflows.
Daniel Simu and his acrobatic robot competed in “America’s Got Talent,” and even though his robot did a very robot thing by breaking itself immediately beforehand, the performance went really well.
Zoox CEO Aicha Evans and cofounder and chief technology officer Jesse Levinson hosted a LinkedIn Live last week to reflect on the past decade of building Zoox and their predictions for the next 10 years of the autonomous-vehicle industry.
Perching with winged unmanned aerial vehicles has often been solved by means of complex control or intricate appendages. Here, we present a method that relies on passive wing morphing for crash-landing on trees and other types of vertical poles. Inspired by the adaptability of birds’ and bats’ limbs in gripping and holding onto trees, we design dual-purpose wings that enable both aerial gliding and perching on poles.
Pretty impressive to have low enough latency in controlling your robot’s hardware that it can play ping pong, although it makes it impossible to tell whether the robot or the human is the one that’s actually bad at the game.
Have you ever wondered how insects are able to go so far beyond their home and still find their way? The answer to this question is not only relevant to biology but also to making the AI for tiny, autonomous robots. We felt inspired by biological findings on how ants visually recognize their environment and combine it with counting their steps in order to get safely back home.
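The strategy described here, combining visual scene recognition with step counting, is classic path integration. The step-counting half can be sketched in a few lines of Python (the function and its interface are illustrative, not taken from the study):

```python
import math

def path_integrate(steps):
    """Accumulate an outbound vector from (step_length, heading_radians)
    pairs and return the distance and heading of the straight path home."""
    x = y = 0.0
    for length, heading in steps:
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    # Home is simply the negation of the accumulated outbound vector.
    return math.hypot(x, y), math.atan2(-y, -x)

# Outbound trip: 3 units east, then 4 units north.
home_distance, home_heading = path_integrate([(3, 0.0), (4, math.pi / 2)])
```

Because the home vector is just the negated sum of the outbound steps, an insect (or a tiny robot) can wander an arbitrary path and still take a straight line back, using only odometry and a heading reference.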
Team RoMeLa practices with its ARTEMIS humanoid robots, featuring Tsinghua Hephaestus (Booster Alpha): a fully autonomous humanoid robot soccer match, with the official goal of beating the human World Cup champions by the year 2050.
We propose RialTo, a new system for robustifying real-world imitation learning policies via reinforcement learning in “digital twin” simulation environments constructed on the fly from small amounts of real-world data.
There is absolutely no reason to watch this entire video, but Moley Robotics is still working on that robotic kitchen of theirs.
I will once again point out that the hardest part of cooking (for me, anyway) is the prep and the cleanup, and this robot still needs you to do all that.
B-Human has so far won 10 titles at the RoboCup SPL tournament. Can we make it 11 this year? Our RoboCup starts off with a banger game against HTWK Robots from Leipzig!
AMBIDEX is a dual-armed robot with an innovative mechanism developed for safe coexistence with humans. Based on an innovative cable structure, it is designed to be both strong and stable.
As NASA’s Perseverance rover prepares to ascend to the rim of Jezero Crater, its team is investigating a rock unlike any that they’ve seen so far on Mars. Deputy project scientist Katie Stack Morgan explains why this rock, found in an ancient channel that funneled water into the crater, could be among the oldest that Perseverance has investigated—or the youngest.
We present a novel approach for enhancing human-robot collaboration using physical interactions for real-time error correction of large language model (LLM) parameterized commands.
Husky Observer was recently used to autonomously inspect solar panels at a large solar panel farm. As part of its mission, the robot navigated rows of solar panels, stopping to inspect areas with its integrated thermal camera. Images were taken by the robot and enhanced to detect potential “hot spots” in the panels.
Meet Hydrus, the autonomous underwater drone revolutionising underwater data collection by eliminating barriers to entry. Hydrus ensures that even users with limited resources can execute precise and regular subsea missions to meet their data requirements.
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
We present Morphy, a novel compliant and morphologically aware flying robot that integrates sensorized flexible joints in its arms, thus enabling resilient collisions at high speeds and the ability to squeeze through openings narrower than its nominal dimensions.
Morphy represents a new class of soft flying robots that can facilitate unprecedented resilience through innovations both in the “body” and “brain.” The novel soft body can, in turn, enable new avenues for autonomy. Collisions that previously had to be avoided have now become acceptable risks, while areas that are untraversable for a certain robot size can now be negotiated through self-squeezing. These novel bodily interactions with the environment can give rise to new types of embodied intelligence.
Segments of daily training for robots driven by reinforcement learning. Multiple tests are conducted in advance to ensure friendly, safe service to humans. The training includes some extreme tests. Please do not imitate!
Here I am, without the ability or equipment (or desire) required to iron anything that I own, and Flexiv’s got robots out there ironing fancy leather car seats.
We unveiled a significant leap forward in perception technology for our humanoid robot GR-1. The newly adopted pure-vision solution integrates bird’s-eye view, transformer models, and an occupancy network for precise and efficient environmental perception.
LimX Dynamics’ humanoid robot CL-1 was launched in December 2023, climbing stairs based on real-time terrain perception at two steps per stair. Four months later, in April 2024, a second demo video showed CL-1 in the same scenario, now climbing the same stairs at one step per stair.
New research from the University of Massachusetts Amherst shows that programming robots to create their own teams and voluntarily wait for their teammates results in faster task completion, with the potential to improve manufacturing, agriculture, and warehouse automation.
LASDRA (Large-size Aerial Skeleton with Distributed Rotor Actuation, presented at ICRA 2018) is a scalable and modular aerial robot. It can assume a very slender, long, and dexterous form factor and is very lightweight.
We propose augmenting initially passive structures, built from simple repeated cells, with novel active units to enable dynamic, shape-changing, and robotic applications. Inspired by metamaterials that can employ mechanisms, we build a framework that allows users to configure cells of this passive structure to allow it to perform complex tasks.
Testing autonomous exploration at the Exyn Office using Spot from Boston Dynamics. In this demo, Spot autonomously explores our flight space while on the hunt for one of our engineers.
Meet Heavy Picker, the strongest robot in bulky-waste sorting and an absolute pro at lifting and sorting waste. With skills that would make a concert pianist jealous and a work ethic that never needs coffee breaks, Heavy Picker was on the lookout for new challenges.
AI is the biggest and most consequential business, financial, legal, technological, and cultural story of our time. In this panel, you will hear from the underrepresented community of women scientists who have been leading the AI revolution—from the beginning to now.
In this video, you see the start of 1X’s development of an advanced AI system that chains simple tasks into complex actions using voice commands, allowing seamless multi-robot control and remote operation. By starting with single-task models, we ensure smooth transitions to more powerful unified models, ultimately aiming to automate high-level actions using AI.
This video does not contain teleoperation, computer graphics, cuts, video speedups, or scripted trajectory playback. It’s all controlled via neural networks.
As the old adage goes, one cannot claim to be a true man without a visit to the Great Wall of China. XBot-L, a full-sized humanoid robot developed by Robot Era, recently acquitted itself well in a walk along sections of the Great Wall.
The paper presents a novel rotary-wing platform capable of folding and expanding its wings during flight. Our inspiration came from birds’ ability to fold their wings to navigate through small spaces and dive. The design of the rotorcraft is based on the monocopter platform, which is inspired by the flight of samara seeds.
We present a variable stiffness robotic skin (VSRS), a concept that integrates stiffness-changing capabilities, sensing, and actuation into a single, thin modular robot design. Reconfiguring, reconnecting, and reshaping VSRSs allows them to achieve new functions both on and in the absence of a host body.
Heimdall is a new rover design for the 2024 University Rover Challenge (URC). This video shows highlights of Heimdall’s trip during the four missions at URC 2024.
Heimdall features a split body design with whegs (wheel legs), and a drill for sub-surface sample collection. It also has the ability to manipulate a variety of objects, collect surface samples, and perform onboard spectrometry and chemical tests.
The AI system identifies and separates red apples from green apples, after which a robotic arm picks up the identified red apples with a qb SoftHand Industry and gently places them in a basket.
My favorite part is the magnetic apple stem system.
DexNex (v0, June 2024) is an anthropomorphic teleoperation testbed for dexterous manipulation at the Center for Robotics and Biosystems at Northwestern University. DexNex recreates human upper-limb functionality through a near 1-to-1 mapping between Operator movements and Avatar actions.
Motion of the Operator’s arms, hands, fingers, and head are fed forward to the Avatar, while fingertip pressures, finger forces, and camera images are fed back to the Operator. DexNex aims to minimize the latency of each subsystem to provide a seamless, immersive, and responsive user experience. Future research includes gaining a better understanding of the criticality of haptic and vision feedback for different manipulation tasks; providing arm-level grounded force feedback; and using machine learning to transfer dexterous skills from the human to the robot.
Fulfilling a school requirement by working in a Romanian locomotive factory one week each month, Daniela Rus learned to operate “machines that help us make things.” Appreciation for the practical side of math and science stuck with Daniela, who is now Director of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).
For AI to achieve its full potential, non-experts need to be let into the development process, says Rumman Chowdhury, CEO and cofounder of Humane Intelligence. She tells the story of farmers fighting for the right to repair their own AI-powered tractors (which some manufacturers actually made illegal), proposing everyone should have the ability to report issues, patch updates or even retrain AI technologies for their specific uses.
Do you have trouble multitasking? Cyborgize yourself through muscle stimulation to automate repetitive physical tasks while you focus on something else.
By combining a 5,000 frame-per-second (FPS) event camera with a 20-FPS RGB camera, roboticists from the University of Zurich have developed a much more effective vision system that keeps autonomous cars from crashing into stuff, as described in the current issue of Nature.
Mitsubishi Electric has been awarded the GUINNESS WORLD RECORDS title for the fastest robot to solve a puzzle cube. The robot’s time of 0.305 second beat the previous record of 0.38 second, for which it received a GUINNESS WORLD RECORDS certificate on 21 May 2024.
Sony’s AIBO is celebrating its 25th anniversary, which seems like a long time, and it is. But back then, the original AIBO could check your email for you. Email! In 1999!
MIT CSAIL researchers enhance robotic precision with sophisticated tactile sensors in the palm and agile fingers, setting the stage for improvements in human-robot interaction and prosthetic technology.
We present a novel adversarial attack method designed to identify failure cases in any type of locomotion controller, including state-of-the-art reinforcement-learning-based controllers. Our approach reveals the vulnerabilities of black-box neural network controllers, providing valuable insights that can be leveraged to enhance robustness through retraining.
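The black-box attack idea can be illustrated with a simple random search over disturbance inputs. Everything below (function names, the toy scoring function) is hypothetical, standing in for the paper's learned attack:

```python
import random

def find_failure_perturbation(score_fn, dim, budget=200, scale=1.0, seed=0):
    """Black-box random search for the external disturbance that most
    degrades a controller's stability score (a toy stand-in for the
    paper's adversarial attack)."""
    rng = random.Random(seed)
    worst_pert, worst_score = None, float("inf")
    for _ in range(budget):
        pert = [rng.uniform(-scale, scale) for _ in range(dim)]
        score = score_fn(pert)  # lower score = closer to falling over
        if score < worst_score:
            worst_pert, worst_score = pert, score
    return worst_pert, worst_score

# Toy controller whose stability degrades with the size of the push.
def toy_score(push):
    return 1.0 - sum(abs(x) for x in push)

pert, score = find_failure_perturbation(toy_score, dim=3)
```

The perturbations found this way flag the regimes where the controller is weakest, which is exactly the data you would feed back into retraining.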
In this work, we investigate a novel integrated flexible OLED display technology used as a robotic skin interface to improve robot-to-human communication in a real industrial setting at Volkswagen, for a collaborative human-robot interaction task in motor assembly. The interface was implemented in a workcell and validated qualitatively with a small group of operators (n=9) and quantitatively with a large group (n=42). The validation results showed that using flexible OLED technology could improve the operators’ attitude toward the robot; increase their intention to use the robot; enhance their perceived enjoyment, social influence, and trust; and reduce their anxiety.
We introduce InflatableBots, shape-changing inflatable robots for large-scale encountered-type haptics in VR. Unlike traditional inflatable shape displays, which are immobile and limited in interaction areas, our approach combines mobile robots with fan-based inflatable structures. This enables safe, scalable, and deployable haptic interactions on a large scale.
We present a bioinspired passive dynamic foot in which the claws are actuated solely by the impact energy. Our gripper simultaneously resolves the issue of smooth absorption of the impact energy and fast closure of the claws by linking the motion of an ankle linkage and the claws through soft tendons.
In this video, a 3-UPU exoskeleton robot for a wrist joint is designed and controlled to perform wrist extension, flexion, radial-deviation, and ulnar-deviation motions in stroke-affected patients. This is the first time a 3-UPU robot has been used effectively for any kind of task.
“UPU” stands for “universal-prismatic-universal” and refers to the actuators—the prismatic joints between two universal joints.
In this paper, we propose a system for the artist-directed authoring of stylized bipedal walking gaits, tailored for execution on robotic characters. To demonstrate the utility of our approach, we animate gaits for a custom, free-walking robotic character, and show, with two additional in-simulation examples, how our procedural animation technique generalizes to bipeds with different degrees of freedom, proportions, and mass distributions.
The European drone project Labyrinth aims to keep new and conventional air traffic separate, especially in busy airspaces such as those expected in urban areas. The project provides a new drone-traffic service and illustrates its potential to improve the safety and efficiency of civil land, air, and sea transport, as well as emergency and rescue operations.
This Carnegie Mellon University Robotics Institute seminar, by Kim Baraka at Vrije Universiteit Amsterdam, is on the topic “Why We Should Build Robot Apprentices and Why We Shouldn’t Do It Alone.”
For robots to be able to truly integrate into human-populated, dynamic, and unpredictable environments, they will have to have strong adaptive capabilities. In this talk, I argue that these adaptive capabilities should leverage interaction with end users, who know how (they want) a robot to act in that environment. I will present an overview of my past and ongoing work on the topic of human-interactive robot learning, a growing interdisciplinary subfield that embraces rich, bidirectional interaction to shape robot learning. I will discuss contributions on the algorithmic, interface, and interaction design fronts, showcasing several collaborations with animal behaviorists/trainers, dancers, puppeteers, and medical practitioners.
Greetings from the IEEE International Conference on Robotics and Automation (ICRA) in Yokohama, Japan! We hope you’ve been enjoying our short videos on TikTok, YouTube, and Instagram. They are just a preview of our in-depth ICRA coverage, and over the next several weeks we’ll have lots of articles and videos for you. In today’s edition of Video Friday, we bring you a dozen of the most interesting projects presented at the conference.
Enjoy today’s videos, and stay tuned for more ICRA posts!
The following two videos are part of the “Cooking Robotics: Perception and Motion Planning” workshop, which explored “the new frontiers of ‘robots in cooking,’ addressing various scientific research questions, including hardware considerations, key challenges in multimodal perception, motion planning and control, experimental methodologies, and benchmarking approaches.” The workshop featured robots handling food items like cookies, burgers, and cereal, and the two robots seen in the videos below used knives to slice cucumbers and cakes. You can watch all workshop videos here.
“SliceIt!: Simulation-Based Reinforcement Learning for Compliant Robotic Food Slicing,” by Cristian C. Beltran-Hernandez, Nicolas Erbetti, and Masashi Hamaya from OMRON SINIC X Corporation, Tokyo, Japan.
Cooking robots can enhance the home experience by reducing the burden of daily chores. However, these robots must perform their tasks dexterously and safely in shared human environments, especially when handling dangerous tools such as kitchen knives. This study focuses on enabling a robot to autonomously and safely learn food-cutting tasks. More specifically, our goal is to enable a collaborative robot or industrial robot arm to perform food-slicing tasks by adapting to varying material properties using compliance control. Our approach involves using reinforcement learning (RL) to train a robot to compliantly manipulate a knife by reducing the contact forces exerted by the food items and by the cutting board. However, training the robot in the real world can be inefficient and dangerous, and can result in a lot of food waste. Therefore, we propose SliceIt!, a framework for safely and efficiently learning robot food-slicing tasks in simulation. Following a real2sim2real approach, our framework consists of collecting a few real food-slicing data points, calibrating our dual simulation environment (a high-fidelity cutting simulator and a robotic simulator), learning compliant control policies in the calibrated simulation environment, and finally deploying the policies on the real robot.
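The compliance objective in the abstract, keeping the knife's contact forces bounded while it follows a commanded cut, can be illustrated with a toy admittance-style rule. The gains, threshold, and function name here are hypothetical, not from SliceIt!:

```python
def compliant_target(commanded_depth, measured_force,
                     max_force=10.0, stiffness=500.0):
    """If the measured contact force (N) exceeds the limit, back the
    commanded cutting depth (m) off in proportion to the excess force."""
    excess = measured_force - max_force
    if excess <= 0:
        return commanded_depth  # within limits: command passes through
    return commanded_depth - excess / stiffness

within = compliant_target(0.05, 5.0)       # 5 N is under the 10 N limit
backed_off = compliant_target(0.05, 20.0)  # 10 N excess: retreat 0.02 m
```

Running a rule like this at every control step is what lets a stiff industrial arm behave softly against food of varying hardness, while the RL policy decides the nominal cutting motion.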
“Cafe Robot: Integrated AI Skillset Based on Large Language Models,” by Jad Tarifi, Nima Asgharbeygi, Shuhei Takamatsu, and Masataka Goto from Integral AI in Tokyo, Japan, and Mountain View, Calif., USA.
The cafe robot engages in natural language interaction to receive orders and subsequently prepares coffee and cakes. Each action involved in making these items is executed using AI skills developed by Integral, including Integral Liquid Pouring, Integral Powder Scooping, and Integral Cutting. The dialogue for making coffee, as well as the coordination of each action based on the dialogue, is facilitated by the Integral Task Planner.
“Autonomous Overhead Powerline Recharging for Uninterrupted Drone Operations,” by Viet Duong Hoang, Frederik Falk Nyboe, Nicolaj Haarhøj Malle, and Emad Ebeid from University of Southern Denmark, Odense, Denmark.
We present a fully autonomous self-recharging drone system capable of long-duration sustained operations near powerlines. The drone is equipped with a robust onboard perception and navigation system that enables it to locate powerlines and approach them for landing. A passively actuated gripping mechanism grasps the powerline cable during landing after which a control circuit regulates the magnetic field inside a split-core current transformer to provide sufficient holding force as well as battery recharging. The system is evaluated in an active outdoor three-phase powerline environment. We demonstrate multiple contiguous hours of fully autonomous uninterrupted drone operations composed of several cycles of flying, landing, recharging, and takeoff, validating the capability of extended, essentially unlimited, operational endurance.
“Learning Quadrupedal Locomotion With Impaired Joints Using Random Joint Masking,” by Mincheol Kim, Ukcheol Shin, and Jung-Yup Kim from Seoul National University of Science and Technology, Seoul, South Korea, and Robotics Institute, Carnegie Mellon University, Pittsburgh, Pa., USA.
Quadrupedal robots have played a crucial role in various environments, from structured environments to complex harsh terrains, thanks to their agile locomotion ability. However, these robots can easily lose their locomotion functionality if damaged by external accidents or internal malfunctions. In this paper, we propose a novel deep reinforcement learning framework to enable a quadrupedal robot to walk with impaired joints. The proposed framework consists of three components: 1) a random joint masking strategy for simulating impaired joint scenarios, 2) a joint state estimator to predict an implicit status of the current joint condition based on past observation history, and 3) progressive curriculum learning to allow a single network to conduct both a normal gait and various joint-impaired gaits. We verify that our framework enables Unitree’s Go1 robot to walk under various impaired joint conditions in real-world indoor and outdoor environments.
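The random joint masking idea is simple enough to sketch. Below is a minimal, hypothetical Python illustration of the first component described above; the function names, the 12-joint count, and the "at most two impaired joints" cap are our assumptions for illustration, not details from the paper.

```python
import numpy as np

def sample_joint_mask(num_joints=12, max_impaired=2, rng=None):
    """Pick a random subset of joints to treat as impaired this episode."""
    rng = rng or np.random.default_rng()
    num_impaired = int(rng.integers(0, max_impaired + 1))
    mask = np.ones(num_joints)
    if num_impaired > 0:
        impaired = rng.choice(num_joints, size=num_impaired, replace=False)
        mask[impaired] = 0.0  # impaired joints receive no torque command
    return mask

def apply_mask(action, mask):
    """Zero the policy's commands to impaired joints before execution."""
    return np.asarray(action) * mask
```

In a training loop, a fresh mask would be sampled at the start of each episode, so a single policy experiences both the normal gait and many impaired-joint variants.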
“Synthesizing Robust Walking Gaits via Discrete-Time Barrier Functions With Application to Multi-Contact Exoskeleton Locomotion,” by Maegan Tucker, Kejun Li, and Aaron D. Ames from Georgia Institute of Technology, Atlanta, Ga., and California Institute of Technology, Pasadena, Calif., USA.
Successfully achieving bipedal locomotion remains challenging due to real-world factors such as model uncertainty, random disturbances, and imperfect state estimation. In this work, we propose a novel metric for locomotive robustness – the estimated size of the hybrid forward invariant set associated with the step-to-step dynamics. Here, the forward invariant set can be loosely interpreted as the region of attraction for the discrete-time dynamics. We illustrate the use of this metric towards synthesizing nominal walking gaits using a simulation-in-the-loop learning approach. Further, we leverage discrete-time barrier functions and a sampling-based approach to approximate sets that are maximally forward invariant. Lastly, we experimentally demonstrate that this approach results in successful locomotion for both flat-foot walking and multi-contact walking on the Atalante lower-body exoskeleton.
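For readers new to the terminology, the textbook discrete-time barrier-function condition (a standard form, not necessarily the exact variant used in the paper) is:

```latex
% The set S = \{x : h(x) \ge 0\} is forward invariant for the
% step-to-step dynamics x_{k+1} = f(x_k) if, for some 0 < \gamma \le 1,
h\big(f(x_k)\big) - h(x_k) \ge -\gamma\, h(x_k) \quad \forall\, x_k \in S,
% which guarantees x_0 \in S \;\Rightarrow\; x_k \in S \text{ for all } k \ge 0.
```

Intuitively, $h$ may shrink toward the boundary of $S$ but never fast enough to leave it, which is why the size of such a set serves as a robustness measure for the gait.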
“Supernumerary Robotic Limbs to Support Post-Fall Recoveries for Astronauts,” by Erik Ballesteros, Sang-Yoep Lee, Kalind C. Carpenter, and H. Harry Asada from MIT, Cambridge, Mass., USA, and Jet Propulsion Laboratory, California Institute of Technology, Pasadena, Calif., USA.
This paper proposes the utilization of Supernumerary Robotic Limbs (SuperLimbs) for augmenting astronauts during an Extra-Vehicular Activity (EVA) in a partial-gravity environment. We investigate the effectiveness of SuperLimbs in assisting astronauts to their feet following a fall. Based on preliminary observations from a pilot human study, we categorized post-fall recoveries into a sequence of statically stable poses called “waypoints”. The paths between the waypoints can be modeled with a simplified kinetic motion applied about a specific point on the body. Following the characterization of post-fall recoveries, we designed a task-space impedance control with high damping and low stiffness, where the SuperLimbs provide an astronaut with assistance in post-fall recovery while keeping the human in the loop. In order to validate this control scheme, a full-scale wearable analog space suit was constructed and tested with a SuperLimbs prototype. Results from the experiments showed that without assistance, astronauts would impulsively exert themselves to perform a post-fall recovery, which resulted in high energy consumption and instability in maintaining an upright posture, concurring with prior NASA studies. When the SuperLimbs provided assistance, the astronaut’s energy consumption and tracking deviation during post-fall recovery were reduced considerably.
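Task-space impedance control with high damping and low stiffness has a compact standard form. Here is a hedged Python sketch of that control law; the specific gain values are illustrative placeholders, not the paper's tuned parameters.

```python
import numpy as np

def impedance_force(x_des, x, v_des, v, stiffness=50.0, damping=400.0):
    """Task-space impedance law: F = K (x_d - x) + D (v_d - v).

    Low stiffness keeps the assistance gentle (small forces for
    position error), while high damping resists fast, impulsive
    motions -- matching the "high damping, low stiffness" design.
    Gains here are illustrative only.
    """
    K = stiffness * np.eye(3)  # N/m, translational stiffness
    D = damping * np.eye(3)    # N·s/m, translational damping
    return K @ (np.asarray(x_des) - np.asarray(x)) + \
           D @ (np.asarray(v_des) - np.asarray(v))
```

With these gains, a 10 cm position error produces only a few newtons of corrective force, while any rapid motion is strongly damped, which is exactly the behavior that discourages the impulsive recoveries described above.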
“ArrayBot: Reinforcement Learning for Generalizable Distributed Manipulation through Touch,” by Zhengrong Xue, Han Zhang, Jingwen Cheng, Zhengmao He, Yuanchen Ju, Changyi Lin, Gu Zhang, and Huazhe Xu from Tsinghua Embodied AI Lab, IIIS, Tsinghua University; Shanghai Qi Zhi Institute; Shanghai AI Lab; and Shanghai Jiao Tong University, Shanghai, China.
We present ArrayBot, a distributed manipulation system consisting of a 16 × 16 array of vertically sliding pillars integrated with tactile sensors. Functionally, ArrayBot is designed to simultaneously support, perceive, and manipulate tabletop objects. Towards generalizable distributed manipulation, we leverage reinforcement learning (RL) algorithms for the automatic discovery of control policies. In the face of massively redundant actions, we propose to reshape the action space by considering the spatially local action patch and the low-frequency actions in the frequency domain. With this reshaped action space, we train RL agents that can relocate diverse objects through tactile observations only. Intriguingly, we find that the discovered policy can not only generalize to unseen object shapes in the simulator but also transfer to the physical robot without any sim-to-real fine-tuning. Leveraging the deployed policy, we derive more real-world manipulation skills on ArrayBot to further illustrate the distinctive merits of our proposed system.
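The action-space reshaping is the interesting trick here: instead of commanding all 256 pillars independently, the policy outputs a handful of low-frequency coefficients that decode to a smooth height field. Below is our own minimal construction of that idea using a cosine basis; the function name, the 3×3 coefficient size, and the exact basis convention are assumptions for illustration, not the authors' code.

```python
import numpy as np

def lowfreq_action(coeffs, grid=16):
    """Decode a small grid of low-frequency 2D cosine coefficients
    into a full grid x grid pillar-height command (a smooth surface)."""
    k = coeffs.shape[0]  # keep only the k lowest frequencies per axis
    n = np.arange(grid)
    # DCT-II-style cosine basis: column j is frequency j
    basis = np.cos(np.pi * (n[:, None] + 0.5) * np.arange(k)[None, :] / grid)
    return basis @ coeffs @ basis.T  # (grid, grid) height field
```

A 3×3 coefficient grid reduces the action dimension from 256 to 9 while guaranteeing the commanded surface is smooth, which is plausibly why the learned policies transfer so well.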
“SKT-Hang: Hanging Everyday Objects via Object-Agnostic Semantic Keypoint Trajectory Generation,” by Chia-Liang Kuo, Yu-Wei Chao, and Yi-Ting Chen from National Yang Ming Chiao Tung University, in Taipei and Hsinchu, Taiwan, and NVIDIA.
We study the problem of hanging a wide range of grasped objects on diverse supporting items. Hanging objects is a ubiquitous task that is encountered in numerous aspects of our everyday lives. However, both the objects and supporting items can exhibit substantial variations in their shapes and structures, bringing two challenging issues: (1) determining the task-relevant geometric structures across different objects and supporting items, and (2) identifying a robust action sequence to accommodate the shape variations of supporting items. To this end, we propose Semantic Keypoint Trajectory (SKT), an object-agnostic representation that is highly versatile and applicable to various everyday objects. We also propose Shape-conditioned Trajectory Deformation Network (SCTDN), a model that learns to generate SKT by deforming a template trajectory based on the task-relevant geometric structure features of the supporting items. We conduct extensive experiments and demonstrate substantial improvements in our framework over existing robot hanging methods in success rate and inference time. Finally, our simulation-trained framework shows promising hanging results in the real world.
“TEXterity: Tactile Extrinsic deXterity,” by Antonia Bronars, Sangwoon Kim, Parag Patre, and Alberto Rodriguez from MIT and Magna International Inc.
We introduce a novel approach that combines tactile estimation and control for in-hand object manipulation. By integrating measurements from robot kinematics and an image-based tactile sensor, our framework estimates and tracks object pose while simultaneously generating motion plans in a receding-horizon fashion to control the pose of a grasped object. This approach consists of a discrete pose estimator that tracks the most likely sequence of object poses in a coarsely discretized grid, and a continuous pose estimator-controller to refine the pose estimate and accurately manipulate the pose of the grasped object. Our method is tested on diverse objects and configurations, achieving desired manipulation objectives and outperforming single-shot methods in estimation accuracy. The proposed approach holds potential for tasks requiring precise manipulation and limited intrinsic in-hand dexterity under visual occlusion, laying the foundation for closed-loop behavior in applications such as regrasping, insertion, and tool use.
“Out of Sight, Still in Mind: Reasoning and Planning about Unobserved Objects With Video Tracking Enabled Memory Models,” by Yixuan Huang, Jialin Yuan, Chanho Kim, Pupul Pradhan, Bryan Chen, Li Fuxin, and Tucker Hermans from University of Utah, Salt Lake City, Utah, Oregon State University, Corvallis, Ore., and NVIDIA, Seattle, Wash., USA.
Robots need to have a memory of previously observed, but currently occluded objects to work reliably in realistic environments. We investigate the problem of encoding object-oriented memory into a multi-object manipulation reasoning and planning framework. We propose DOOM and LOOM, which leverage transformer relational dynamics to encode the history of trajectories given partial-view point clouds and an object discovery and tracking engine. Our approaches can perform multiple challenging tasks including reasoning with occluded objects, novel object appearance, and object reappearance. Throughout our extensive simulation and real-world experiments, we find that our approaches perform well across different numbers of objects.
“Open Source Underwater Robot: Easys,” by Michikuni Eguchi, Koki Kato, Tatsuya Oshima, and Shunya Hara from University of Tsukuba and Osaka University, Japan.
“Sensorized Soft Skin for Dexterous Robotic Hands,” by Jana Egli, Benedek Forrai, Thomas Buchner, Jiangtao Su, Xiaodong Chen, and Robert K. Katzschmann from ETH Zurich, Switzerland, and Nanyang Technological University, Singapore.
Conventional industrial robots often use two-fingered grippers or suction cups to manipulate objects or interact with the world. Because of their simplified design, they are unable to reproduce the dexterity of human hands when manipulating a wide range of objects. While the control of humanoid hands has evolved greatly, hardware platforms still lack capabilities, particularly in tactile sensing and providing soft contact surfaces. In this work, we present a method that equips the skeleton of a tendon-driven humanoid hand with a soft and sensorized tactile skin. Multi-material 3D printing allows us to iteratively approach a cast skin design which preserves the robot’s dexterity in terms of range of motion and speed. We demonstrate that a soft skin enables firmer grasps and piezoresistive sensor integration enhances the hand’s tactile sensing capabilities.
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
Mobile robots can transport payloads far greater than their mass through vehicle traction. However, off-road terrain features substantial variation in height, grade, and friction, which can cause traction to degrade or fail catastrophically. This paper presents a system that utilizes a vehicle-mounted, multipurpose manipulator to physically adapt the robot with unique anchors suitable for a particular terrain for autonomous payload transport.
Wing posts a video with the title “What Do Wing’s Drones Sound Like,” but it includes only a brief snippet of audio, never without background noise, so curious viewers and listeners still don’t learn exactly what Wing’s drones sound like.
Because, look, a couple seconds of muted audio underneath a voiceover is in fact not really answering the question.
This first instance of ROB 450, in winter 2024, challenged students to synthesize the knowledge acquired in their undergraduate robotics courses at the University of Michigan, applying a systematic and iterative design and analysis process to a real, open-ended robotics problem.
This Microsoft Future Leaders in Robotics and AI Seminar is from Catie Cuan at Stanford, on “Choreorobotics: Teaching Robots How to Dance With Humans.”
As robots transition from industrial and research settings into everyday environments, robots must be able to (1) learn from humans while benefiting from the full range of the humans’ knowledge and (2) learn to interact with humans in safe, intuitive, and social ways. I will present a series of compelling robot behaviors, where human perception and interaction are foregrounded in a variety of tasks.
In this work, we present LocoMan, a dexterous quadrupedal robot with a novel morphology to perform versatile manipulation in diverse constrained environments. By equipping a Unitree Go1 robot with two low-cost and lightweight modular 3-DoF loco-manipulators on its front calves, LocoMan leverages the combined mobility and functionality of the legs and grippers for complex manipulation tasks that require precise 6D positioning of the end effector in a wide workspace.
Object manipulation has been extensively studied in the context of fixed-base and mobile manipulators. However, the overactuated locomotion modality employed by snake robots allows for a unique blend of object manipulation through locomotion, referred to as loco-manipulation. In this paper, we present an optimization approach to solving the loco-manipulation problem based on nonimpulsive implicit-contact path planning for our snake robot COBRA.
Okay, but where that costume has eyes is not where Spot has eyes, so the Spot in the costume can’t see, right? And now I’m skeptical of the authenticity of the mutual snoot-boop.
Here’s some video of Field AI’s robots operating in relatively complex and unstructured environments without prior maps. Make sure to read our article from this week for details!
Is it just me, or is it kind of wild that researchers are now publishing papers comparing their humanoid controller to the “manufacturer’s” humanoid controller? It’s like humanoids are a commodity now or something.
Honey Badger 4.0 is our latest robotic platform, created specifically for traversing hostile environments and difficult terrains. Equipped with multiple cameras and sensors, it will make sure no defect is missed during inspection.
Have an automation task that calls for the precision and torque of an industrial robot arm…but you need something that is more rugged or a nonconventional form factor? Meet the HEBI Robotics H-Series Actuator! With 9x the torque of our X-Series and seamless compatibility with the HEBI ecosystem for robot development, the H-Series opens a new world of possibilities for robots.
EPFL’s team, led by Ph.D. student Milad Shafiee along with coauthors Guillaume Bellegarda and BioRobotics Lab head Auke Ijspeert, has trained a four-legged robot using deep reinforcement learning to navigate challenging terrain, achieving a milestone in both robotics and biology.
At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. Digit handles the tedious and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.
With a wealth of incredible figures and outstanding facts, here’s Jan Jonsson, ABB Robotics veteran, sharing his knowledge and passion for some of our robots and controllers from the past.
I have it on good authority that getting robots to mow a lawn (like, any lawn) is much harder than it looks, but Electric Sheep has built a business around it.
The AI Index, currently in its seventh year, tracks, collates, distills, and visualizes data relating to artificial intelligence. The Index provides unbiased, rigorously vetted, and globally sourced data for policymakers, researchers, journalists, executives, and the general public to develop a deeper understanding of the complex field of AI. Led by a steering committee of influential AI thought leaders, the Index is the world’s most comprehensive report on trends in AI. In this seminar, HAI Research Manager Nestor Maslej offers highlights from the 2024 report, explaining trends related to research and development, technical performance, technical AI ethics, the economy, education, policy and governance, diversity, and public opinion.
This week’s CMU Robotics Institute seminar, from Dieter Fox at Nvidia and the University of Washington, is “Where’s RobotGPT?”
In this talk, I will discuss approaches to generating large datasets for training robot-manipulation capabilities, with a focus on the role simulation can play in this context. I will show some of our prior work, where we demonstrated robust sim-to-real transfer of manipulation skills trained in simulation, and then present a path toward generating large-scale demonstration sets that could help train robust, open-world robot-manipulation models.
DARPA’s Robotic Autonomy in Complex Environments with Resiliency (RACER) program recently conducted its fourth experiment (E4) to assess the performance of off-road unmanned vehicles. These tests, conducted in Texas in late 2023, were the first time the program tested its new vehicle, the RACER Heavy Platform (RHP). The video shows autonomous route following for mobility testing and demonstration, including sensor point cloud visualizations.
The 12-ton RHP is significantly larger than the 2-ton RACER Fleet Vehicles (RFVs) already in use in the program. Using the algorithms on a very different platform helps RACER toward its goal of platform agnostic autonomy of combat-scale vehicles in complex, mission-relevant off-road environments that are significantly more unpredictable than on-road conditions.
In our new Science Robotics paper, we introduce an autonomous navigation system developed for our wheeled-legged quadrupeds, designed for fast and efficient navigation within large urban environments. Driven by neural network policies, our simple, unified control system enables smooth gait transitions, smart navigation planning, and highly responsive obstacle avoidance in populated urban environments.
Generation 7 of “Phoenix” robots includes improved human-like range of motion. Improvements in uptime, visual perception, and tactile sensing increase the capability of the system to perform complex tasks over longer periods. Design iteration significantly decreases build time. The speed at which new tasks can be automated has increased 50x, marking a major inflection point in task automation speed.
We’re proud to celebrate our one millionth commercial delivery—that’s a million deliveries of lifesaving blood, critical vaccines, last-minute groceries, and so much more. But the best part? This is just the beginning.
We propose a novel humanoid TWIMP, which combines a human mimetic musculoskeletal upper limb with a two-wheel inverted pendulum. By combining the benefit of a musculoskeletal humanoid, which can achieve soft contact with the external environment, and the benefit of a two-wheel inverted pendulum with a small footprint and high mobility, we can easily investigate learning control systems in environments with contact and sudden impact.
Ballbots are uniquely capable of pushing wheelchairs—arguably better than legged platforms, because they can move in any direction without having to reposition themselves.
Charge Robotics is building robots that automate the most labor-intensive parts of solar construction. Solar has rapidly become the cheapest form of power generation in many regions. Demand has skyrocketed, and now the primary barrier to getting it installed is labor logistics and bandwidth. Our robots remove the labor bottleneck, allowing construction companies to meet the rising demand for solar, and enabling the world to switch to renewables faster.
The QUT CGRAS project’s robot prototype captures images of baby corals, destined for the Great Barrier Reef, monitoring and counting them in grow tanks. The team uses state-of-the-art AI algorithms to automatically detect and count these coral babies and track their growth over time – saving human counting time and money.
We are conducting research to develop Unmanned Aerial Systems to aid in wildfire monitoring. The hazardous, dynamic, and visually degraded environment of wildfire gives rise to many unsolved fundamental research challenges.
In March 2024, Northwestern University’s Center for Robotics and Biosystems demonstrated the Omnid mobile collaborative robots (mocobots) at MARS, a conference in Ojai, California on Machine learning, Automation, Robotics, and Space, hosted by Jeff Bezos. The “swarm” of mocobots is designed to collaborate with humans, allowing a human to easily manipulate large, heavy, or awkward payloads. In this case, the mocobots cancel the effect of gravity, so the human can easily manipulate the mock airplane wing in six degrees of freedom. In general, human-cobot systems combine the best of human capabilities with the best of robot capabilities.
EELS, or Exobiology Extant Life Surveyor, is a versatile, snake-like robot designed for exploration of previously inaccessible terrain. This talk on EELS was presented at the 2024 Amazon MARS conference.
The convergence of AI and robotics will unlock a wonderful new world of possibilities in everyday life, says robotics and AI pioneer Daniela Rus. Diving into the way machines think, she reveals how “liquid networks”—a revolutionary class of AI that mimics the neural processes of simple organisms—could help intelligent machines process information more efficiently and give rise to “physical intelligence” that will enable AI to operate beyond digital confines and engage dynamically in the real world.
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
Enjoy today’s videos!
In the SpaceHopper project, students at ETH Zurich developed a robot capable of moving in low gravity environments through hopping motions. It is intended to be used in future space missions to explore small celestial bodies.
The exploration of asteroids and moons could provide insights into the formation of the universe, and they may contain valuable minerals that humanity could use in the future. The project began in 2021 as an ETH focus project for bachelor’s students. Now, it is being continued as a regular research project. A particular challenge in developing exploration robots for asteroids is that, unlike larger celestial bodies like Earth, there is low gravity on asteroids and moons. The students have therefore tested their robot’s functionality in zero gravity during a parabolic flight. The parabolic flight was conducted in collaboration with the European Space Agency as part of the ESA Academy Experiments Programme.
It’s still kind of wild to me that it’s now possible to just build a robot like Menteebot. Having said that, at present it looks to be a fairly long way from being able to usefully do tasks in a reliable way.
We are glad to announce the latest updates with our humanoid robot CL-1. In the test, it demonstrates stair climbing in a single stride based on real-time terrain perception. For the very first time, CL-1 accomplishes back and forth running, in a stable and dynamic way!
EEWOC [Extended-reach Enhanced Wheeled Orb for Climbing] uses a unique locomotion scheme to climb complex steel structures with its magnetic grippers. Its lightweight and highly extendable tape spring limb can reach over 1.2 meters, allowing it to traverse gaps and obstacles much larger than other existing climbing robots. Its ability to bend allows it to reach around corners and over ledges, and it can transition between surfaces easily thanks to assistance from its wheels. The wheels also let it drive more quickly and efficiently on the ground. These features make EEWOC well-suited for climbing the complex steel structures seen in real-world environments.
NASA’s Ingenuity Mars helicopter became the first vehicle to achieve powered, controlled flight on another planet when it took to the Martian skies on 19 April 2021. This video maps the location of the 72 flights that the helicopter took over the course of nearly three years. Ingenuity far surpassed expectations—soaring higher and faster than previously imagined.
MERL introduces a new autonomous robotic assembly technology, offering an initial glimpse into how robots will work in future factories. Unlike conventional approaches where humans set pre-conditions for assembly, our technology empowers robots to adapt to diverse scenarios. We showcase the autonomous assembly of a gear box that was demonstrated live at CES2024.
In November, 2023 Digit was deployed in a distribution center unloading totes from an AMR as part of regular facility operations, including a shift during Cyber Monday.
DARPA’s Air Combat Evolution (ACE) program has achieved the first-ever in-air tests of AI algorithms autonomously flying a fighter jet against a human-piloted fighter jet in within-visual-range combat scenarios (sometimes referred to as “dogfighting”). In this video, team members discuss what makes the ACE program unlike other aerospace autonomy projects and how it represents a transformational moment in aerospace history, establishing a foundation for ethical, trusted, human-machine teaming for complex military and civilian applications.
I think suggesting that robots can’t fall is much less useful than suggesting that robots can fall and quickly and easily get back up again.
Sanctuary AI says that this video shows Phoenix operating at “human-equivalent speed,” but they don’t specify which human or under which conditions. Though it’s faster than I would be, that’s for sure.
This is the RAM—robotic autonomous mower. It can be dropped anywhere in the world and will wake up with a mission to make tall grass around it shorter. Here is a quick clip of it working on the Presidio in SF.
This year, our robots braved a Finnish winter for the first time. As the snow clears and the days get longer, we’re looking back on how our robots made thousands of deliveries to S Group customers during the colder months.
Adopting omnidirectional Field of View (FoV) cameras in aerial robots vastly improves perception ability, significantly advancing aerial robots’ capabilities in inspection, reconstruction, and rescue tasks. We propose OmniNxt, a fully open-source aerial robotics platform with omnidirectional perception.
The MAkEable framework enhances mobile manipulation in settings designed around humans by streamlining the process of sharing learned skills and experiences among different robots and contexts. Practical tests confirm its efficiency in a range of scenarios, involving different robots, in tasks such as object grasping, coordinated use of both hands in tasks, and the exchange of skills among humanoid robots.
We conducted trials of Ringbot outdoors on a 400-meter track. With a power source of 2,300 milliamp-hours at 11.1 volts, Ringbot managed to cover approximately 3 kilometers in 37 minutes. We commanded its target speed and direction using a remote joystick controller (Steam Deck), and Ringbot experienced five falls during this trial.
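For context, those trial numbers work out to a modest average pace. A quick sanity check:

```python
distance_m = 3000             # ~3 km covered in the trial
duration_s = 37 * 60          # 37 minutes, in seconds
avg_speed = distance_m / duration_s   # meters per second
avg_kmh = avg_speed * 3.6             # kilometers per hour
# roughly 1.35 m/s, or about 4.9 km/h -- close to a human walking pace
```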
As with every single cooking video, there’s a lot of background prep that’s required for this robot to cook an entire meal, but I would utterly demolish those fries.
Here’s everything you need to know about Wing delivery drones, except for how much human time they actually require and the true cost of making deliveries by drone, because those things aren’t fun to talk about.
This CMU Teruko Yata Memorial Lecture is by Agility Robotics’ Jonathan Hurst, on “Human-Centric Robots and How Learning Enables Generality.”
Humans have dreamt of robot helpers forever. What’s new is that this dream is becoming real. New developments in AI, building on foundations of hardware and passive dynamics, enable vastly improved generality. Robots can step out of highly structured environments and become more human-centric: operating in human spaces, interacting with people, and doing some basic human workflows. By connecting a Large Language Model, Digit can convert natural language high-level requests into complex robot instructions, composing the library of skills together, using human context to achieve real work in the human world. All of this is new—and it is never going back: AI will drive a fast-following robot revolution that is going to change the way we live.
USC, UPenn, Texas A&M, Oregon State, Georgia Tech, Temple University, and NASA Johnson Space Center are teaching dog-like robots to navigate craters of the moon and other challenging planetary surfaces in research funded by NASA.
AMBIDEX is a revolutionary robot that is fast, lightweight, and capable of human-like manipulation. We have added a sensor head, a torso, and a waist to greatly expand the range of movement. Compared to the previous arm-centered version, the overall impression and balance has completely changed.
Experience the future of robotics as UBTECH’s humanoid robot integrates with Baidu’s ERNIE through AppBuilder! Witness robots [that] understand language and autonomously perform tasks like folding clothes and object sorting.
I know the fins on this robot are for walking underwater rather than on land, but watching it move, I feel like it’s destined to evolve into something a little more terrestrial.
The video demonstrates the wave-basin testing of a 43-kilogram (95-pound) amphibious cycloidal-propeller unmanned underwater vehicle (Cyclo-UUV) developed at the Advanced Vertical Flight Laboratory, Texas A&M University. The use of cyclo-propellers allows for 360-degree thrust vectoring, for more robust dynamic controllability compared to UUVs with conventional screw propellers.
Operating robots precisely and at high speeds has been a long-standing goal of robotics research. To enable precise and safe dynamic motions, we introduce a four degree-of-freedom (DoF) tendon-driven robot arm. Tendons allow placing the actuation at the base to reduce the robot’s inertia, which we show significantly reduces peak collision forces compared to conventional motor-driven systems. Pairing our robot with pneumatic muscles allows generating high forces and highly accelerated motions, while benefiting from impact resilience through passive compliance.
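The inertia argument above lends itself to a back-of-the-envelope check: if an unexpected collision is modeled as a linear spring contact of stiffness k, the peak contact force is v·sqrt(k·m_eff), so it scales with the square root of the arm's effective inertia. The numbers below are assumed for illustration, not taken from the paper.

```python
import math

# Toy model: a mass m_eff hitting a spring of stiffness k at speed v.
# Peak force F_peak = v * sqrt(k * m_eff), so halving the effective inertia
# cuts the peak collision force by a factor of sqrt(2). All values assumed.

def peak_collision_force(v, k, m_eff):
    """Peak contact force (N) for a spring-contact collision model."""
    return v * math.sqrt(k * m_eff)

k = 50_000.0   # contact stiffness, N/m (assumed)
v = 2.0        # impact speed, m/s (assumed)
motor_driven = peak_collision_force(v, k, m_eff=4.0)   # motors in the joints
tendon_driven = peak_collision_force(v, k, m_eff=1.0)  # actuation at the base
print(f"{motor_driven:.0f} N vs {tendon_driven:.0f} N")
```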
Rovers on Mars have previously been caught in loose soils, and turning the wheels dug them deeper, just like a car stuck in sand. To avoid this, Rosalind Franklin has a unique wheel-walking locomotion mode to overcome difficult terrain, as well as autonomous navigation software.
MOMO has learned the Bam Yang Gang dance moves with its hand dexterity. :) By analyzing 2D dance videos, we extract detailed hand skeleton data, allowing us to recreate the moves in 3D using a hand model. With this information, MOMO replicates the dance motions with its arm and hand joints.
This UPenn GRASP SFI Seminar is from Eric Jang at 1X Technologies, on “Data Engines for Humanoid Robots.”
1X’s mission is to create an abundant supply of physical labor through androids that work alongside humans. I will share some of the progress 1X has been making towards general-purpose mobile manipulation. We have scaled up the number of tasks our androids can do by combining an end-to-end learning strategy with a no-code system to add new robotic capabilities. Our Android Operations team trains their own models on the data they gather themselves, producing an extremely high-quality “farm-to-table” dataset that can be used to learn extremely capable behaviors. I’ll also share an early preview of the progress we’ve been making towards a generalist “World Model” for humanoid robots.
This Microsoft Future Leaders in Robotics and AI Seminar is from Chahat Deep Singh at the University of Maryland, on “Minimal Perception: Enabling Autonomy in Palm-Sized Robots.”
The solution to robot autonomy lies at the intersection of AI, computer vision, computational imaging, and robotics—resulting in minimal robots. This talk explores the challenge of developing a minimal perception framework for tiny robots (less than 6 inches) used in field operations such as space inspections in confined spaces and robot pollination. Furthermore, we will delve into the realm of selective perception, embodied AI, and the future of robot autonomy in the palm of your hands.
We present Human to Humanoid (H2O), a reinforcement learning (RL) based framework that enables real-time, whole-body teleoperation of a full-sized humanoid robot with only an RGB camera. We successfully achieve teleoperation of dynamic, whole-body motions in real-world scenarios, including walking, back jumping, kicking, turning, waving, pushing, boxing, etc. To the best of our knowledge, this is the first demonstration to achieve learning-based, real-time, whole-body humanoid teleoperation.
Legged robots have the potential to traverse complex terrain and access confined spaces beyond the reach of traditional platforms thanks to their ability to carefully select footholds and flexibly adapt their body posture while walking. However, robust deployment in real-world applications is still an open challenge. In this paper, we present a method for legged locomotion control using reinforcement learning and 3D volumetric representations to enable robust and versatile locomotion in confined and unstructured environments.
Improving the safety of collaborative manipulators necessitates the reduction of inertia in the moving part. We introduce a novel approach in the form of a passive, 3D wire aligner, serving as a lightweight and low-friction power transmission mechanism, thus achieving the desired low inertia in the manipulator’s operation.
Robot Era just launched Humanoid-Gym, an open-source reinforcement learning framework for bipedal humanoids. As you can see from the video, RL algorithms have given the robot, called Xiao Xing, or XBot, the ability to climb up and down haphazardly stacked boxes with relative stability and ease.
More than 80% of stroke survivors experience walking difficulty, significantly impacting their daily lives, independence, and overall quality of life. Now, new research from the University of Massachusetts Amherst pushes forward the bounds of stroke recovery with a unique robotic hip exoskeleton, designed as a training tool to improve walking function. This invites the possibility of new therapies that are more accessible and easier to translate from practice to daily life, compared to current rehabilitation methods.
DJI drones work to make the world a better place, and one of the ways we do this is through conservation work. We partnered with Halo Robotics and the OFI Orangutan Foundation International to showcase just how these drones can make an impact.
The aim of the test is to demonstrate the removal and replacement of satellite modules in a 27U CubeSat format using augmented-reality control of a robot. In this use case, the “client” satellite is being upgraded and refueled using modular componentry. The robot removes the failed computer module and places it in a fixture, does the same with the propellant tank, and then places both back into the satellite correctly.
This video features some of the highlights and favorite moments from the CYBATHLON Challenges 2024 that took place on 2 February, showing so many diverse types of assistive technology taking on discipline tasks and displaying pilots’ tenacity and determination. The Challenges saw new teams, new tasks, and new formats for many of the CYBATHLON disciplines.
Small drones for catastrophic wildfires (ones covering more than [40,470 hectares]) are like bringing a flashlight to light up a football field. This short video describes the major uses for drones of all sizes and why and when they are used, or why not.
DARPA’s Learning Introspective Control (LINC) program is developing machine learning methods that show promise in making that scenario closer to reality. LINC aims to fundamentally improve the safety of mechanical systems—specifically in ground vehicles, ships, drone swarms, and robotics—using various methods that require minimal computing power. The result is an AI-powered controller the size of a cell phone.
HRI 2024: 11–15 March 2024, BOULDER, COLORADO, USA
The time has come for humanoid robots to enter industrial production lines and learn how to assist humans by undertaking repetitive, tedious, and potentially dangerous tasks for them. Recently, UBTECH’s humanoid robot Walker S was introduced into the assembly line of NIO’s advanced vehicle-manufacturing center, as an “intern” assisting in the car production. Walker S is the first bipedal humanoid robot to complete a specific workstation’s tasks on a mobile EV production line.
Henry Evans keeps working hard to make robots better, this time with the assistance of researchers from Carnegie Mellon University.
Henry said he preferred using head-worn assistive teleoperation (HAT) with a robot for certain tasks rather than depending on a caregiver. “Definitely scratching itches,” he said. “I would be happy to have it stand next to me all day, ready to do that or hold a towel to my mouth. Also, feeding me soft foods, operating the blinds, and doing odd jobs around the room.” One innovation in particular, software called Driver Assistance that helps align the robot’s gripper with an object the user wants to pick up, was “awesome,” Henry said. Driver Assistance leaves the user in control while it makes the fine adjustments and corrections that can make controlling a robot both tedious and demanding. “That’s better than anything I have tried for grasping,” Henry said, adding that he would like to see Driver Assistance used for every interface that controls Stretch robots.
Get ready to rip, shear, mow, and tear, as DOOM is back! This April, we’re making the legendary game playable on our robotic mowers as a tribute to 30 years of mowing down demons.
Could an artificial-intelligence system learn language from a child? New York University researchers supported by the National Science Foundation, using first-person video from a head-mounted camera, trained AI models to learn language through the eyes and ears of a child.
The world’s leaders in manufacturing, natural resources, power, and utilities are using our autonomous robots to gather higher-quality data in greater quantities than ever before. Thousands of Spots have been deployed around the world—more than any other walking robot—to tackle this challenge. This release helps maintenance teams tap into the power of AI with new software capabilities and Spot enhancements.
Modular self-reconfigurable robotic systems are more adaptive than conventional systems. This article proposes a novel free-form and truss-structured modular self-reconfigurable robot called FreeSN, containing node and strut modules. This article presents a novel configuration identification system for FreeSN, including connection point magnetic localization, module identification, module orientation fusion, and system-configuration fusion.
The OOS-SIM (On-Orbit Servicing Simulator) is a simulator for on-orbit servicing tasks such as repair, maintenance, and assembly that have to be carried out on satellites orbiting the Earth. It reproduces operational conditions in orbit, such as weightlessness and harsh illumination.
The next CYBATHLON competition breaks down barriers between the public, people with disabilities, researchers, and technology developers. From 25 to 27 October 2024, the CYBATHLON will take place in a global format, in the Arena Schluefweg in Kloten near Zurich and in local hubs all around the world.
George’s story is a testament to the incredible journey that unfolds when passion, opportunity, and community converge. His path from drone enthusiast to someone actively making a difference, both in his local community and globally, serves as a beacon of hope for all who dare to dream and pursue their passions.
ABB’s fifty-year story of robotic innovation began in 1974 with the sale of the world’s first commercial all-electric robot, the IRB 6. Björn Weichbrodt was a key figure in the development of the IRB 6.
Robotics Debate of the Ingenuity Labs Robotics and AI Symposium (RAIS2023) from October 12, 2023: Is robotics helping or hindering our progress on UN Sustainable Development Goals?
Legged robots have the potential to become vital in maintenance, home support, and exploration scenarios. In order to interact with and manipulate their environments, most legged robots are equipped with a dedicated robot arm, which means additional mass and mechanical complexity compared to standard legged robots. In this work, we explore pedipulation—using the legs of a legged robot for manipulation.
This work, by Philip Arm, Mayank Mittal, Hendrik Kolvenbach, and Marco Hutter from ETH Zurich’s Robotic Systems Lab, will be presented at the IEEE International Conference on Robotics and Automation (ICRA 2024) in May, in Japan (see events calendar above).
I learned a new word today: “stigmergy.” Stigmergy is a kind of group coordination that’s based on environmental modification. Like, when insects leave pheromone trails, they’re not directly sending messages to other individuals. But as a group, ants are able to manifest surprisingly complex coordinated behaviors. Cool, right? Researchers at IRIDIA are exploring the possibilities for robots using stigmergy with a cool “artificial pheromone” system using a UV-sensitive surface.
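Stigmergy is easy to play with in simulation. The sketch below is my own toy, not IRIDIA's system: agents never message each other directly; they only deposit "pheromone" into a shared, slowly evaporating grid and climb the local gradient, and that alone is enough for trails to accumulate.

```python
# Toy stigmergy: coordination purely through environmental modification.
# All parameters are assumptions chosen for a quick demonstration.
import random

random.seed(0)
W, EVAP, DEPOSIT = 20, 0.95, 1.0
grid = [[0.0] * W for _ in range(W)]                    # shared "pheromone" field
agents = [(random.randrange(W), random.randrange(W)) for _ in range(5)]

for step in range(100):
    # evaporation: the environment slowly forgets old trails
    grid = [[c * EVAP for c in row] for row in grid]
    new_agents = []
    for x, y in agents:
        grid[y][x] += DEPOSIT                           # the only "message" sent
        # move toward the strongest neighboring pheromone (random tie-break)
        nbrs = [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if 0 <= x + dx < W and 0 <= y + dy < W and (dx, dy) != (0, 0)]
        best = max(nbrs, key=lambda p: (grid[p[1]][p[0]], random.random()))
        new_agents.append(best)
    agents = new_agents

# Trails persist well above a single deposit despite evaporation:
print(max(max(row) for row in grid) > DEPOSIT)
```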
“Automatic Design of Stigmergy-Based Behaviors for Robot Swarms,” by Muhammad Salman, David Garzón Ramos, and Mauro Birattari, is published in the journal Communications Engineering.
Filmed in July 2017, this video shows Atlas walking through a “hatch” on a pitching surface. This skill uses autonomous behaviors, with the robot not knowing about the rocking world. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.
That IHMC video reminded me of the SAFFiR program for Shipboard Autonomous Firefighting Robots, which is responsible for a bunch of really cool research in partnership with the U.S. Naval Research Laboratory. NRL did some interesting stuff with Nexi robots from MIT and made its own videos. I think that effort didn’t get nearly enough credit for being very entertaining while communicating important robotics research.
Large industrial-asset operators increasingly use robotics to automate hazardous work at their facilities. This has led to soaring demand for autonomous inspection solutions like ANYmal. Series production by our partner Zollner enables ANYbotics to supply our customers with the required quantities of robots.
Hawkeye is a real-time multimodal conversation-and-interaction agent for Boston Dynamics’ mobile robot Spot. Leveraging OpenAI’s experimental GPT-4 Turbo and Vision AI models, Hawkeye aims to empower everyone, from seniors to health care professionals, to form new and unique interactions with the world around them.
The rover Artemis, developed at the DFKI Robotics Innovation Center, has been equipped with a penetrometer that measures the soil’s penetration resistance to obtain precise information about soil strength. The video showcases an initial test run with the device mounted on the robot. During this test, the robot was remotely controlled, and the maximum penetration depth was limited to 15 millimeters.
To efficiently achieve complex humanoid loco-manipulation tasks in industrial contexts, we propose a combined vision-based tracker-localization interplay integrated as part of a task-space whole-body-optimization control. Our approach allows humanoid robots, targeted for industrial manufacturing, to manipulate and assemble large-scale objects while walking.
We developed a novel multibody robot (called the Two-Body Bot) consisting of two small-footprint mobile bases connected by a four-bar linkage where handlebars are mounted. Each base measures only 29.2 centimeters wide, making the robot likely the slimmest ever developed for mobile postural assistance.
Filmed in July 2017, this video shows us using Atlas to put out a “fire” on our loading dock. This uses a combination of teleoperation and autonomous behaviors through a single, remote computer. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.
I would say that in the middle of a rainstorm is probably the best time to start a fire that you expect to be extinguished by a robot.
Inspired by caregiving experts, we propose a bimanual interactive robotic dressing-assistance scheme, which is unprecedented in previous research. In this scheme, an interactive robot joins hands with the human, supporting and guiding them through the dressing process, while the dressing robot performs the dressing task. This work represents a paradigm shift in thinking about the dressing-assistance task, from one-robot-to-one-arm to two-robots-to-one-arm.
Tony Punnoose Valayil from the Bulgarian Academy of Sciences Institute of Robotics wrote in to share some very low-cost hand-rehabilitation robots for home use.
In this video, we present a robot-assisted rehabilitation of the wrist joint which can aid in restoring the strength that has been lost across the upper limb due to stroke. This robot is very cost-effective and can be used for home rehabilitation.
In this video, we present an exoskeleton robot which can be used at home for rehabilitating the index and middle fingers of stroke-affected patients. This robot is built at a cost of 50 euros for patients who are not financially independent to get better treatment.
Some very impressive work here from the Norwegian University of Science and Technology (NTNU), showing a drone tracking its position using radar and lidar-based odometry in some nightmare (for robots) environments, including a long tunnel that looks the same everywhere and a hallway full of smoke.
I’m sorry, but people should really know better than to make videos like this for social robot crowdfunding by now.
It’s on Kickstarter for about $300, and the fact that it’s been funded so quickly tells me that people have already forgotten about the social robotpocalypse.
Introducing Orbit, your portal for managing asset-intensive facilities through real-time and predictive intelligence. Orbit brings a whole new suite of fleet management capabilities and will unify your ecosystem of Boston Dynamics robots, starting with Spot.
In this video, we present Ringbot, a novel leg-wheel transformer robot incorporating a monocycle mechanism with legs. Ringbot aims to provide versatile mobility by replacing the driver and driving components of a conventional monocycle vehicle with legs mounted on compact driving modules inside the wheel.
Making money with robots has always been a struggle, but I think ALOHA 2 has figured it out.
Seriously, though, that is some impressive manipulation capability. I don’t know what that freakish panda thing is, but getting a contact lens from the package onto its bizarre eyeball was some wild dexterity.
Highlights from testing our new arms, built by Boardwalk Robotics. Installed in October 2023, these new arms are not just for boxing: they provide much greater speed and power. This matches the mobility and manipulation goals we have for Nadia!
The least dramatic but possibly most important bit of that video is when Nadia uses her arms to help her balance against a wall, which is one of those things that humans do all the time without thinking about it. And we always appreciate being shown things that don’t go perfectly alongside things that do. The bit at the end there was Nadia not quite managing to do lateral arm raises. I can relate; that’s my reaction when I lift weights, too.
We present an avatar system designed to facilitate the embodiment of humanoid robots by human operators, validated through iCub3, a humanoid developed at the Istituto Italiano di Tecnologia.
Multimodal UAVs (unmanned aerial vehicles) are rarely capable of more than two modalities—that is, flying and walking or flying and perching. However, being able to fly, perch, and walk could further improve their usefulness by expanding their operating envelope. For instance, an aerial robot could fly a long distance, perch in a high place to survey the surroundings, then walk to avoid obstacles that could potentially inhibit flight. Birds are capable of these three tasks, and so offer a practical example of how a robot might be developed to do the same.
Nissan announces the concept model of “Iruyo,” a robot that supports babysitting while driving. Iruyo relieves the anxiety of mothers, fathers, and babies when a parent is driving alone. We support safe and secure driving for parents and children. Nissan and Akachan Honpo are working on a project to make life better with cars and babies. Iruyo was born out of the voices of mothers and fathers who said, “I can’t hold my baby while driving alone.”
Building 937 houses the coolest robots at CERN. This is where the action happens to build and program robots that can tackle the unconventional challenges presented by the laboratory’s unique facilities. Recently, a new type of robot called CERNquadbot has entered CERN’s robot pool and successfully completed its first radiation protection test in the North Area.
By blending 2D images with foundation models to build 3D feature fields, a new MIT method helps robots understand and manipulate nearby objects with open-ended language prompts.
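The core lifting step behind such feature fields can be sketched in a few lines: back-project each pixel's 2D feature along its depth ray into a voxel grid, and average whatever lands in each cell. In the toy below the "features" are random stand-ins for foundation-model embeddings and there is only one synthetic view; both are assumptions, since the real method fuses learned features across many views.

```python
# Toy sketch of lifting per-pixel 2D features into a 3D voxel feature field.
# Camera intrinsics, depth map, and features are all assumed values.
import numpy as np

rng = np.random.default_rng(0)
H, W, D_FEAT, VOXELS, VOX_SIZE = 8, 8, 4, 16, 0.25
fx = fy = 10.0
cx, cy = W / 2, H / 2                      # pinhole intrinsics (assumed)

feats = rng.normal(size=(H, W, D_FEAT))    # stand-in per-pixel feature vectors
depth = np.full((H, W), 2.0)               # a flat wall 2 m from the camera

field = np.zeros((VOXELS, VOXELS, VOXELS, D_FEAT))
count = np.zeros((VOXELS, VOXELS, VOXELS))

for v in range(H):
    for u in range(W):
        z = depth[v, u]
        x, y = (u - cx) * z / fx, (v - cy) * z / fy      # back-project pixel
        idx = np.floor(np.array([x, y, z]) / VOX_SIZE).astype(int)
        idx += np.array([VOXELS // 2, VOXELS // 2, 0])   # recenter x and y
        if np.all((0 <= idx) & (idx < VOXELS)):
            field[tuple(idx)] += feats[v, u]
            count[tuple(idx)] += 1

mask = count > 0
field[mask] /= count[mask][:, None]        # per-voxel average feature
print(int(mask.sum()), "voxels carry features")
```

Querying the field with open-ended language then reduces to comparing a text embedding against these per-voxel features, which is the part the foundation model supplies.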
Our current care system does not scale, and our populations are aging fast. Robodies are multipliers for care staff, allowing them to work together with local helpers to provide protection and assistance around the clock while maintaining personal contact with people in the community.
SEAS researchers are helping develop resilient and autonomous deep-space and extraterrestrial habitations by developing technologies to let autonomous robots repair or replace damaged components in a habitat. The research is part of the Resilient ExtraTerrestrial Habitats institute (RETHi), led by Purdue University in partnership with SEAS, the University of Connecticut, and the University of Texas at San Antonio. Its goal is to “design and operate resilient deep-space habitats that can adapt, absorb, and rapidly recover from expected and unexpected disruptions.”
Find out how a bold vision became a success story! The DLR Institute of Robotics and Mechatronics has been researching robotic arms since the 1990s, originally for use in space. It was a long and ambitious journey before these lightweight robotic arms could be used on Earth and finally in operating theaters, a journey that required concentrated robotics expertise, interdisciplinary cooperation, and ultimately a successful technology transfer.
Robotics is changing the world, driven by focused teams of diverse experts. Willow Garage operated with the mantra “Impact first, return on capital second” and through ROS and the PR2 had enormous impact. Autonomous mobile robots are finally being accepted in the service industry, and Savioke (now Relay Robotics) was created to drive that impact. This talk will trace the evolution of Relay robots and their deployment in hotels, hospitals, and other service industries, starting with roots at Willow Garage. As robotics technology is poised for the next round of advances, how do we create and maintain the organizations that continue to drive progress?
GITAI is pleased to announce that its 1.5-meter-long autonomous dual robotic arm system (S2) has successfully arrived at the International Space Station (ISS) aboard the SpaceX Falcon 9 rocket (NG-20) to conduct an external demonstration of in-space servicing, assembly, and manufacturing (ISAM) while onboard the ISS. The success of the S2 tech demo will be a major milestone for GITAI, confirming the feasibility of this technology as a fully operational system in space.
This work presents a comprehensive study on using deep reinforcement learning (RL) to create dynamic locomotion controllers for bipedal robots. Going beyond focusing on a single locomotion skill, we develop a general control solution that can be used for a range of dynamic bipedal skills, from periodic walking and running to aperiodic jumping and standing.
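One common ingredient in such multi-skill controllers is a command-conditioned reward that includes a periodic gait clock for walking and running but switches it off for aperiodic skills like jumping or standing. Here is a hedged sketch of that idea; the terms and weights are invented for illustration and are not taken from the paper.

```python
import math

def locomotion_reward(vel_err, foot_force, phase, periodic,
                      w_vel=1.0, w_gait=0.5):
    """Reward = track commanded velocity + (optionally) match a gait clock.

    phase in [0, 1): the first half-cycle expects stance (high foot force),
    the second half expects swing (low foot force). All weights assumed.
    """
    r = w_vel * math.exp(-vel_err ** 2)          # velocity-tracking term
    if periodic:                                  # gait clock only for gaits
        want_stance = phase < 0.5
        in_stance = foot_force > 0.1
        r += w_gait * (1.0 if want_stance == in_stance else 0.0)
    return r

# Walking: a loaded foot during the stance half-cycle earns the gait bonus...
print(locomotion_reward(0.0, foot_force=1.0, phase=0.2, periodic=True))
# ...while for an aperiodic jump only velocity tracking matters.
print(locomotion_reward(0.0, foot_force=1.0, phase=0.2, periodic=False))
```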
And if you want to get exhausted on behalf of a robot, the full 400-meter dash is below.
NASA’s Ingenuity Mars Helicopter pushed aerodynamic limits during the final months of its mission, setting new records for speed, distance, and altitude. Hear from Ingenuity chief engineer Travis Brown on how the data the team collected could eventually be used in future rotorcraft designs.
In a recent T-RO paper, researchers from Huazhong University of Science and Technology (HUST) describe the design and construction of a novel variable-stiffness spherical joint motor that enables dexterous motion and joint compliance in all directions.
We are told that this new robot from HEBI is called “Mark Suckerberg” and that they’ve got a pretty cool application in mind for it, to be revealed later this year.
Dive into the first edition of our new Real-World-Robotics class at ETH Zürich! Our students embarked on an incredible journey, creating their human-like robotic hands from scratch. In just three months, the teams designed, built, and programmed their tendon-driven robotic hands, mastering dexterous manipulation with reinforcement learning! The result? A spectacular display of innovation and skill during our grand final.
Carnegie Mellon researchers have built a system with a robotic arm atop a RangerMini 2.0 robotic cart from AgileX robotics to make what they’re calling a platform for “intelligent movement and processing.”
Picassnake is our custom-made robot that paints pictures from music. Picassnake consists of an arm and a head, embedded in a plush snake doll. The robot is connected to a laptop for control and music processing, which can be fed through a microphone or an MP3 file. To open the media source, an operator can use the graphical user interface or place a text QR code in front of a webcam. Once the media source is opened, Picassnake generates unique strokes based on the music and translates the strokes to physical movement to paint them on canvas.
In April 2021, NASA’s Ingenuity Mars Helicopter became the first spacecraft to achieve powered, controlled flight on another world. With 72 successful flights, Ingenuity has far surpassed its originally planned technology demonstration of up to five flights. On Jan. 18, Ingenuity flew for the final time on the Red Planet. Join Tiffany Morgan, NASA’s Mars Exploration Program Deputy Director, and Teddy Tzanetos, Ingenuity Project Manager, as they discuss these historic flights and what they could mean for future extraterrestrial aerial exploration.