
Robot Dog Cleans Up Beaches With Foot-Mounted Vacuums



Cigarette butts are the second most common undisposed-of litter on Earth—of the six trillion-ish cigarettes inhaled every year, it’s estimated that over four trillion of the butts are just tossed onto the ground, each one leaching over 700 different toxic chemicals into the environment. Let’s not focus on the fact that all those toxic chemicals are also going into people’s lungs, and instead talk about the ecosystem damage that they can do and also just the general grossness of having bits of sucked-on trash everywhere. Ew.

Preventing those cigarette butts from winding up on the ground in the first place would be the best option, but it would require a pretty big shift in human behavior. Operating under the assumption that humans changing their behavior is a nonstarter, roboticists from the Dynamic Legged Systems unit at the Italian Institute of Technology (IIT), in Genoa, have instead designed a novel platform for cigarette-butt cleanup in the form of a quadrupedal robot with vacuums attached to its feet.


There are, of course, far more efficient ways of at least partially automating the cleanup of litter with machines. The challenge is that most of that automation relies on mobility systems with wheels, which won’t work on the many beautiful beaches (and many beautiful flights of stairs) of Genoa. In places like these, it still falls to humans to do the hard work, which is less than ideal.

This robot, developed in Claudio Semini’s lab at IIT, is called VERO (Vacuum-cleaner Equipped RObot). It’s based around an AlienGo from Unitree, with a commercial vacuum mounted on its back. Hoses go from the vacuum down the leg to each foot, with a custom 3D-printed nozzle that puts as much suction near the ground as possible without tripping the robot up. While the vacuum is novel, the real contribution here is how the robot autonomously locates things on the ground and then plans how to interact with those things using its feet.

First, an operator designates an area for VERO to clean, after which the robot operates by itself. After calculating an exploration path that covers the entire area, the robot uses its onboard cameras and a neural network to detect cigarette butts. This is trickier than it sounds, because there may be a lot of cigarette butts on the ground, and they all probably look pretty much the same, so the system has to filter out all of the potential duplicates. The next step is to plan its next steps: VERO has to put the vacuum side of one of its feet right next to each cigarette butt while calculating a safe, stable pose for the rest of its body. Since this whole process can take place on sand or stairs or other uneven surfaces, VERO has to prioritize not falling over before it decides how to do the collection. The final collecting maneuver is fine-tuned using an extra Intel RealSense depth camera mounted on the robot’s chin.
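For readers who like pseudocode, here’s a rough sketch of that explore-detect-plan loop in Python. Everything here is hypothetical (the class, method, and function names are mine, not from the IIT code); it only illustrates the order of operations described above.

from dataclasses import dataclass

# Hypothetical sketch of a VERO-style clean-up loop:
# explore -> detect -> deduplicate -> plan a stable reach -> vacuum.

@dataclass
class Detection:
    x: float          # estimated world-frame position of a suspected butt (m)
    y: float
    confidence: float  # detector score in [0, 1]

def deduplicate(detections, radius=0.05):
    """Merge detections closer than `radius` meters, keeping the highest score."""
    kept = []
    for d in sorted(detections, key=lambda d: -d.confidence):
        if all((d.x - k.x) ** 2 + (d.y - k.y) ** 2 > radius ** 2 for k in kept):
            kept.append(d)
    return kept

def clean_area(robot, area):
    for waypoint in robot.plan_coverage_path(area):          # cover the whole area
        robot.walk_to(waypoint)
        butts = deduplicate(robot.detect_cigarette_butts())   # onboard detector output
        for butt in butts:
            # Choose a whole-body pose that puts one vacuum nozzle next to the
            # target while the other three feet keep a stable stance.
            pose = robot.plan_stable_reach(target=(butt.x, butt.y))
            if pose is not None:                               # skip if no safe pose exists
                robot.execute(pose)
                robot.vacuum_pulse()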

VERO has been tested successfully in six different scenarios that challenge both its locomotion and detection capabilities. [Image: IIT]

Initial testing with the robot in a variety of different environments showed that it could successfully collect just under 90 percent of cigarette butts, which I bet is better than I could do, and I’m also much more likely to get fed up with the whole process. The robot is not very quick at the task, but unlike me it will never get fed up as long as it’s got energy in its battery, so speed is somewhat less important.

As far as the authors of this paper are aware (and I assume they’ve done their research), this is “the first time that the legs of a legged robot are concurrently utilized for locomotion and for a different task.” This is distinct from other robots that can (for example) open doors with their feet, because those robots stop using the feet as feet for a while and instead use them as manipulators.

So, this is about a lot more than cigarette butts, and the researchers suggest a variety of other potential use cases, including spraying weeds in crop fields, inspecting cracks in infrastructure, and placing nails and rivets during construction.

Some use cases include potentially doing multiple things at the same time, like planting different kinds of seeds, using different surface sensors, or driving both nails and rivets. And since quadrupeds have four feet, they could potentially host four completely different tools, and the software that the researchers developed for VERO can be slightly modified to put whatever foot you want on whatever spot you need.

VERO: A Vacuum‐Cleaner‐Equipped Quadruped Robot for Efficient Litter Removal, by Lorenzo Amatucci, Giulio Turrisi, Angelo Bratta, Victor Barasuol, and Claudio Semini from IIT, was published in the Journal of Field Robotics.

Video Friday: Multitasking



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

Do you have trouble multitasking? Cyborgize yourself through muscle stimulation to automate repetitive physical tasks while you focus on something else.

[ SplitBody ]

By combining a 5,000 frame-per-second (FPS) event camera with a 20-FPS RGB camera, roboticists from the University of Zurich have developed a much more effective vision system that keeps autonomous cars from crashing into stuff, as described in the current issue of Nature.

[ Nature ]

Mitsubishi Electric has been awarded the GUINNESS WORLD RECORDS title for the fastest robot to solve a puzzle cube. The robot’s time of 0.305 second beat the previous record of 0.38 second, for which it received a GUINNESS WORLD RECORDS certificate on 21 May 2024.

[ Mitsubishi ]

Sony’s AIBO is celebrating its 25th anniversary, which seems like a long time, and it is. But back then, the original AIBO could check your email for you. Email! In 1999!

I miss Hotmail.

[ AIBO ]

SchniPoSa: schnitzel with french fries and a salad.

[ Dino Robotics ]

Cloth-folding is still a really hard problem for robots, but progress was made at ICRA!

[ ICRA Cloth Competition ]

Thanks, Francis!

MIT CSAIL researchers enhance robotic precision with sophisticated tactile sensors in the palm and agile fingers, setting the stage for improvements in human-robot interaction and prosthetic technology.

[ MIT ]

We present a novel adversarial attack method designed to identify failure cases in any type of locomotion controller, including state-of-the-art reinforcement-learning-based controllers. Our approach reveals the vulnerabilities of black-box neural network controllers, providing valuable insights that can be leveraged to enhance robustness through retraining.

[ Fan Shi ]

In this work, we investigate a novel integrated flexible OLED display technology used as a robotic skin-interface to improve robot-to-human communication in a real industrial setting at Volkswagen or a collaborative human-robot interaction task in motor assembly. The interface was implemented in a workcell and validated qualitatively with a small group of operators (n=9) and quantitatively with a large group (n=42). The validation results showed that using flexible OLED technology could improve the operators’ attitude toward the robot; increase their intention to use the robot; enhance their perceived enjoyment, social influence, and trust; and reduce their anxiety.

[ Paper ]

Thanks, Bram!

We introduce InflatableBots, shape-changing inflatable robots for large-scale encountered-type haptics in VR. Unlike traditional inflatable shape displays, which are immobile and limited in interaction areas, our approach combines mobile robots with fan-based inflatable structures. This enables safe, scalable, and deployable haptic interactions on a large scale.

[ InflatableBots ]

We present a bioinspired passive dynamic foot in which the claws are actuated solely by the impact energy. Our gripper simultaneously resolves the issue of smooth absorption of the impact energy and fast closure of the claws by linking the motion of an ankle linkage and the claws through soft tendons.

[ Paper ]

In this video, a 3-UPU exoskeleton robot for a wrist joint is designed and controlled to perform wrist extension, flexion, radial-deviation, and ulnar-deviation motions in stroke-affected patients. This is the first time a 3-UPU robot has been used effectively for any kind of task.

“UPU” stands for “universal-prismatic-universal” and refers to the actuators—the prismatic joints between two universal joints.

[ BAS ]

Thanks, Tony!

BRUCE Got Spot-ted at ICRA2024.

[ Westwood Robotics ]

Parachutes: maybe not as good of an idea for drones as you might think.

[ Wing ]

In this paper, we propose a system for the artist-directed authoring of stylized bipedal walking gaits, tailored for execution on robotic characters. To demonstrate the utility of our approach, we animate gaits for a custom, free-walking robotic character, and show, with two additional in-simulation examples, how our procedural animation technique generalizes to bipeds with different degrees of freedom, proportions, and mass distributions.

[ Disney Research ]

The European drone project Labyrinth aims to keep new and conventional air traffic separate, especially in busy airspaces such as those expected in urban areas. The project provides a new drone-traffic service and illustrates its potential to improve the safety and efficiency of civil land, air, and sea transport, as well as emergency and rescue operations.

[ DLR ]

This Carnegie Mellon University Robotics Institute seminar, by Kim Baraka at Vrije Universiteit Amsterdam, is on the topic “Why We Should Build Robot Apprentices and Why We Shouldn’t Do It Alone.”

For robots to be able to truly integrate human-populated, dynamic, and unpredictable environments, they will have to have strong adaptive capabilities. In this talk, I argue that these adaptive capabilities should leverage interaction with end users, who know how (they want) a robot to act in that environment. I will present an overview of my past and ongoing work on the topic of human-interactive robot learning, a growing interdisciplinary subfield that embraces rich, bidirectional interaction to shape robot learning. I will discuss contributions on the algorithmic, interface, and interaction design fronts, showcasing several collaborations with animal behaviorists/trainers, dancers, puppeteers, and medical practitioners.

[ CMU RI ]

Video Friday: A Starbucks With 100 Robots



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
ICSR 2024: 23–26 October 2024, ODENSE, DENMARK
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

NAVER 1784 is the world’s largest robotics testbed. The Starbucks on the second floor of 1784 is the world’s most unique Starbucks, with more than 100 service robots called “Rookie” delivering Starbucks drinks to meeting rooms and private seats, and various experiments with a dual-arm robot.

[ Naver ]

If you’re gonna take a robot dog with you on a hike, the least it could do is carry your backpack for you.

[ Deep Robotics ]

Obligatory reminder that phrases like “no teleoperation” without any additional context can mean many different things.

[ Astribot ]

This video is presented at the ICRA 2024 conference and summarizes recent results of our Learning AI for Dextrous Manipulation Lab. It demonstrates how our learning AI methods allowed for breakthroughs in dextrous manipulation with the mobile humanoid robot DLR Agile Justin. Although the core of the mechatronic hardware is almost 20 years old, only the advent of learning AI methods enabled a level of dexterity, flexibility and autonomy coming close to human capabilities.

[ TUM ]

Thanks, Berthold!

Hands of blue? Not a good look.

[ Synaptic ]

With all the humanoid stuff going on, there really should be more emphasis on intentional contact—humans lean and balance on things all the time, and robots should too!

[ Inria ]

LimX Dynamics W1 is now more than a wheeled quadruped. By evolving into a biped robot, W1 maneuvers slickly on two legs in different ways: non-stop 360° rotation, upright free gliding, slick maneuvering, random collision and self-recovery, and step walking.

[ LimX Dynamics ]

Animal brains use less data and energy compared to current deep neural networks running on Graphics Processing Units (GPUs). This makes it hard to develop tiny autonomous drones, which are too small and light for heavy hardware and big batteries. Recently, the emergence of neuromorphic processors that mimic how brains function has made it possible for researchers from Delft University of Technology to develop a drone that uses neuromorphic vision and control for autonomous flight.

[ Science ]

In the beginning of the universe, all was darkness — until the first organisms developed sight, which ushered in an explosion of life, learning and progress. AI pioneer Fei-Fei Li says a similar moment is about to happen for computers and robots. She shows how machines are gaining “spatial intelligence” — the ability to process visual data, make predictions and act upon those predictions — and shares how this could enable AI to interact with humans in the real world.

[ TED ]

Video Friday: Loco-Manipulation



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

In this work, we present LocoMan, a dexterous quadrupedal robot with a novel morphology to perform versatile manipulation in diverse constrained environments. By equipping a Unitree Go1 robot with two low-cost and lightweight modular 3-DoF loco-manipulators on its front calves, LocoMan leverages the combined mobility and functionality of the legs and grippers for complex manipulation tasks that require precise 6D positioning of the end effector in a wide workspace.

[ CMU ]

Thanks, Changyi!

Object manipulation has been extensively studied in the context of fixed-base and mobile manipulators. However, the overactuated locomotion modality employed by snake robots allows for a unique blend of object manipulation through locomotion, referred to as loco-manipulation. In this paper, we present an optimization approach to solving the loco-manipulation problem based on nonimpulsive implicit-contact path planning for our snake robot COBRA.

[ Silicon Synapse Lab ]

Okay, but where that costume has eyes is not where Spot has eyes, so the Spot in the costume can’t see, right? And now I’m skeptical of the authenticity of the mutual snoot-boop.

[ Boston Dynamics ]

Here’s some video of Field AI’s robots operating in relatively complex and unstructured environments without prior maps. Make sure to read our article from this week for details!

[ Field AI ]

Is it just me, or is it kind of wild that researchers are now publishing papers comparing their humanoid controller to the “manufacturer’s” humanoid controller? It’s like humanoids are a commodity now or something.

[ OSU ]

I, too, am packing armor for ICRA.

[ Pollen Robotics ]

Honey Badger 4.0 is our latest robotic platform, created specifically for traversing hostile environments and difficult terrains. Equipped with multiple cameras and sensors, it will make sure no defect is omitted during inspection.

[ MAB Robotics ]

Thanks, Jakub!

Have an automation task that calls for the precision and torque of an industrial robot arm…but you need something that is more rugged or a nonconventional form factor? Meet the HEBI Robotics H-Series Actuator! With 9x the torque of our X-Series and seamless compatibility with the HEBI ecosystem for robot development, the H-Series opens a new world of possibilities for robots.

[ HEBI ]

Thanks, Dave!

This is how all spills happen at my house too: super passive-aggressively.

[ 1X ]

EPFL’s team, led by Ph.D. student Milad Shafiee along with coauthors Guillaume Bellegarda and BioRobotics Lab head Auke Ijspeert, has trained a four-legged robot using deep reinforcement learning to navigate challenging terrain, achieving a milestone in both robotics and biology.

[ EPFL ]

At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. Digit handles the tedious and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.

[ Agility ]

With a wealth of incredible figures and outstanding facts, here’s Jan Jonsson, ABB Robotics veteran, sharing his knowledge and passion for some of our robots and controllers from the past.

[ ABB ]

I have it on good authority that getting robots to mow a lawn (like, any lawn) is much harder than it looks, but Electric Sheep has built a business around it.

[ Electric Sheep ]

The AI Index, currently in its seventh year, tracks, collates, distills, and visualizes data relating to artificial intelligence. The Index provides unbiased, rigorously vetted, and globally sourced data for policymakers, researchers, journalists, executives, and the general public to develop a deeper understanding of the complex field of AI. Led by a steering committee of influential AI thought leaders, the Index is the world’s most comprehensive report on trends in AI. In this seminar, HAI Research Manager Nestor Maslej offers highlights from the 2024 report, explaining trends related to research and development, technical performance, technical AI ethics, the economy, education, policy and governance, diversity, and public opinion.

[ Stanford HAI ]

This week’s CMU Robotics Institute seminar, from Dieter Fox at Nvidia and the University of Washington, is “Where’s RobotGPT?”

In this talk, I will discuss approaches to generating large datasets for training robot-manipulation capabilities, with a focus on the role simulation can play in this context. I will show some of our prior work, where we demonstrated robust sim-to-real transfer of manipulation skills trained in simulation, and then present a path toward generating large-scale demonstration sets that could help train robust, open-world robot-manipulation models.

[ CMU ]

Video Friday: RACER Heavy



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

DARPA’s Robotic Autonomy in Complex Environments with Resiliency (RACER) program recently conducted its fourth experiment (E4) to assess the performance of off-road unmanned vehicles. These tests, conducted in Texas in late 2023, were the first time the program tested its new vehicle, the RACER Heavy Platform (RHP). The video shows autonomous route following for mobility testing and demonstration, including sensor point cloud visualizations.

The 12-ton RHP is significantly larger than the 2-ton RACER Fleet Vehicles (RFVs) already in use in the program. Using the algorithms on a very different platform helps RACER toward its goal of platform-agnostic autonomy of combat-scale vehicles in complex, mission-relevant off-road environments that are significantly more unpredictable than on-road conditions.

[ DARPA ]

In our new Science Robotics paper, we introduce an autonomous navigation system developed for our wheeled-legged quadrupeds, designed for fast and efficient navigation within large urban environments. Driven by neural network policies, our simple, unified control system enables smooth gait transitions, smart navigation planning, and highly responsive obstacle avoidance in populated urban environments.

[ Github ]

Generation 7 of “Phoenix” robots include improved human-like range of motion. Improvements in uptime, visual perception, and tactile sensing increase the capability of the system to perform complex tasks over longer periods. Design iteration significantly decreases build time. The speed at which new tasks can be automated has increased 50x, marking a major inflection point in task automation speed.

[ Sanctuary AI ]

We’re proud to celebrate our one millionth commercial delivery—that’s a million deliveries of lifesaving blood, critical vaccines, last-minute groceries, and so much more. But the best part? This is just the beginning.

[ Zipline ]

Work those hips!

[ RoMeLa ]

This thing is kind of terrifying, and I’m fascinated by it.

[ AVFL ]

We propose a novel humanoid TWIMP, which combines a human mimetic musculoskeletal upper limb with a two-wheel inverted pendulum. By combining the benefit of a musculoskeletal humanoid, which can achieve soft contact with the external environment, and the benefit of a two-wheel inverted pendulum with a small footprint and high mobility, we can easily investigate learning control systems in environments with contact and sudden impact.

From Humanoids 2018.

[ Paper ] via [ JSK Lab ]

Thanks, Kento!

Ballbots are uniquely capable of pushing wheelchairs—arguably better than legged platforms, because they can move in any direction without having to reposition themselves.

[ Paper ]

Charge Robotics is building robots that automate the most labor-intensive parts of solar construction. Solar has rapidly become the cheapest form of power generation in many regions. Demand has skyrocketed, and now the primary barrier to getting it installed is labor logistics and bandwidth. Our robots remove the labor bottleneck, allowing construction companies to meet the rising demand for solar, and enabling the world to switch to renewables faster.

[ Charge Robotics ]

Robots doing precision assembly is cool and all, but those vibratory bowl sorters seem like magic.

[ FANUC ]

The QUT CGRAS project’s robot prototype captures images of baby corals, destined for the Great Barrier Reef, monitoring and counting them in grow tanks. The team uses state-of-the-art AI algorithms to automatically detect and count these coral babies and track their growth over time – saving human counting time and money.

[ QUT ]

We are conducting research to develop Unmanned Aerial Systems to aid in wildfire monitoring. The hazardous, dynamic, and visually degraded environment of wildfire gives rise to many unsolved fundamental research challenges.

[ CMU ]

Here’s a little more video of that robot elevator, but I’m wondering why it’s so slow—clamp those bots in there and rocket that elevator up and down!

[ NAVER ]

In March 2024, Northwestern University’s Center for Robotics and Biosystems demonstrated the Omnid mobile collaborative robots (mocobots) at MARS, a conference in Ojai, California on Machine learning, Automation, Robotics, and Space, hosted by Jeff Bezos. The “swarm” of mocobots is designed to collaborate with humans, allowing a human to easily manipulate large, heavy, or awkward payloads. In this case, the mocobots cancel the effect of gravity, so the human can easily manipulate the mock airplane wing in six degrees of freedom. In general, human-cobot systems combine the best of human capabilities with the best of robot capabilities.

[ Northwestern ]

There’s something so soothing about watching a lithium battery get wrecked and burn for 8 minutes.

[ Hardcore Robotics ]

EELS, or Exobiology Extant Life Surveyor, is a versatile, snake-like robot designed for exploration of previously inaccessible terrain. This talk on EELS was presented at the 2024 Amazon MARS conference.

[ JPL ]

The convergence of AI and robotics will unlock a wonderful new world of possibilities in everyday life, says robotics and AI pioneer Daniela Rus. Diving into the way machines think, she reveals how “liquid networks”—a revolutionary class of AI that mimics the neural processes of simple organisms—could help intelligent machines process information more efficiently and give rise to “physical intelligence” that will enable AI to operate beyond digital confines and engage dynamically in the real world.

[ TED ]

Video Friday: SpaceHopper



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup German Open: 17–21 April 2024, KASSEL, GERMANY
AUVSI XPONENTIAL 2024: 22–25 April 2024, SAN DIEGO
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

In the SpaceHopper project, students at ETH Zurich developed a robot capable of moving in low gravity environments through hopping motions. It is intended to be used in future space missions to explore small celestial bodies.

The exploration of asteroids and moons could provide insights into the formation of the universe, and they may contain valuable minerals that humanity could use in the future. The project began in 2021 as an ETH focus project for bachelor’s students. Now, it is being continued as a regular research project. A particular challenge in developing exploration robots for asteroids is that, unlike larger celestial bodies such as Earth, asteroids and moons have very low gravity. The students have therefore tested their robot’s functionality in zero gravity during a parabolic flight. The parabolic flight was conducted in collaboration with the European Space Agency as part of the ESA Academy Experiments Programme.

[ SpaceHopper ]

It’s still kind of wild to me that it’s now possible to just build a robot like Menteebot. Having said that, at present it still looks to be a long way from doing useful tasks reliably.

[ Menteebot ]

Look, it’s the robot we all actually want!

[ Github ]

I wasn’t quite sure what made this building especially “robot-friendly” until I saw the DEDICATED ROBOT ELEVATOR.

[ NAVER ]

We are glad to announce the latest updates with our humanoid robot CL-1. In the test, it demonstrates stair climbing in a single stride based on real-time terrain perception. For the very first time, CL-1 accomplishes back and forth running, in a stable and dynamic way!

[ LimX Dynamics ]

EEWOC [Extended-reach Enhanced Wheeled Orb for Climbing] uses a unique locomotion scheme to climb complex steel structures with its magnetic grippers. Its lightweight and highly extendable tape spring limb can reach over 1.2 meters, allowing it to traverse gaps and obstacles much larger than other existing climbing robots. Its ability to bend allows it to reach around corners and over ledges, and it can transition between surfaces easily thanks to assistance from its wheels. The wheels also let it drive more quickly and efficiently on the ground. These features make EEWOC well-suited for climbing the complex steel structures seen in real-world environments.

[ Paper ]

Thanks to its “buttock-contact sensors,” JSK’s musculoskeletal humanoid has mastered(ish) the chair-scoot.

[ University of Tokyo ]

Thanks, Kento!

Physical therapy seems like a great application for a humanoid robot when you don’t really need that humanoid robot to do much of anything.

[ Fourier Intelligence ]

NASA’s Ingenuity Mars helicopter became the first vehicle to achieve powered, controlled flight on another planet when it took to the Martian skies on 19 April 2021. This video maps the location of the 72 flights that the helicopter took over the course of nearly three years. Ingenuity far surpassed expectations—soaring higher and faster than previously imagined.

[ JPL ]

No thank you!

[ Paper ]

MERL introduces a new autonomous robotic assembly technology, offering an initial glimpse into how robots will work in future factories. Unlike conventional approaches where humans set pre-conditions for assembly, our technology empowers robots to adapt to diverse scenarios. We showcase the autonomous assembly of a gear box that was demonstrated live at CES2024.

[ Mitsubishi ]

Thanks, Devesh!

In November 2023, Digit was deployed in a distribution center unloading totes from an AMR as part of regular facility operations, including a shift during Cyber Monday.

[ Agility ]

The PR2 just refuses to die. Last time I checked, official support for it ceased in 2016!

[ University of Bremen ]

DARPA’s Air Combat Evolution (ACE) program has achieved the first-ever in-air tests of AI algorithms autonomously flying a fighter jet against a human-piloted fighter jet in within-visual-range combat scenarios (sometimes referred to as “dogfighting”). In this video, team members discuss what makes the ACE program unlike other aerospace autonomy projects and how it represents a transformational moment in aerospace history, establishing a foundation for ethical, trusted, human-machine teaming for complex military and civilian applications.

[ DARPA ]

Sometimes robots that exist for one single purpose that they only do moderately successfully while trying really hard are the best of robots.

[ CMU ]

Video Friday: Robot Dog Can’t Fall



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup German Open: 17–21 April 2024, KASSEL, GERMANY
AUVSI XPONENTIAL 2024: 22–25 April 2024, SAN DIEGO
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS
Cybathlon 2024: 25–27 October 2024, ZURICH

Enjoy today’s videos!

I think suggesting that robots can’t fall is much less useful than instead suggesting that robots can fall and then quickly and easily get back up again.

[ Deep Robotics ]

Sanctuary AI says that this video shows Phoenix operating at “human-equivalent speed,” but they don’t specify which human or under which conditions. It’s faster than I would be, that’s for sure.

[ Sanctuary AI ]

“Suzume” is an animated film by Makoto Shinkai, in which one of the characters gets turned into a three-legged chair:

Shintaro Inoue from JSK Lab at the University of Tokyo has managed to build a robotic version of that same chair, which is pretty impressive:


[ Github ]

Thanks, Shintaro!

Humanoid robot EVE training for home assistance like putting groceries into the kitchen cabinets.

[ 1X ]

This is the RAM—robotic autonomous mower. It can be dropped anywhere in the world and will wake up with a mission to make tall grass around it shorter. Here is a quick clip of it working on the Presidio in SF.

[ Electric Sheep ]

This year, our robots braved a Finnish winter for the first time. As the snow clears and the days get longer, we’re looking back on how our robots made thousands of deliveries to S Group customers during the colder months.

[ Starship ]

Agility Robotics is doing its best to answer the (very common) question of “Okay, but what can humanoid robots actually do?”


[ Agility Robotics ]

Digit is great and everything, but Cassie will always be one of my favorite robots.

[ CoRIS ]

Adopting omnidirectional field-of-view (FoV) cameras in aerial robots vastly improves perception ability, significantly advancing the capabilities of aerial robotics in inspection, reconstruction, and rescue tasks. We propose OmniNxt, a fully open-source aerial robotics platform with omnidirectional perception.

[ OmniNxt ]

The MAkEable framework enhances mobile manipulation in settings designed around humans by streamlining the process of sharing learned skills and experiences among different robots and contexts. Practical tests confirm its efficiency in a range of scenarios, involving different robots, in tasks such as object grasping, coordinated use of both hands in tasks, and the exchange of skills among humanoid robots.

[ Paper ]

We conducted trials of Ringbot outdoors on a 400 meter track. With a power source of 2300 milliamp-hours and 11.1 Volts, Ringbot managed to cover approximately 3 kilometers in 37 minutes. We commanded its target speed and direction using a remote joystick controller (Steam Deck), and Ringbot experienced five falls during this trial.
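Those figures allow a quick back-of-the-envelope estimate of speed and energy use (my arithmetic, not numbers from the paper, and it assumes the battery ended up close to fully drained):

capacity_ah = 2.3                                      # 2,300 mAh battery
voltage_v = 11.1
distance_km = 3.0
time_min = 37.0

energy_wh = capacity_ah * voltage_v                    # ~25.5 Wh on board
avg_speed_ms = distance_km * 1000 / (time_min * 60)    # ~1.35 m/s average
energy_per_km = energy_wh / distance_km                # ~8.5 Wh/km, an upper bound

print(f"{energy_wh:.1f} Wh, {avg_speed_ms:.2f} m/s, {energy_per_km:.1f} Wh/km")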

[ Paper ]

There is a notable lack of consistency about where exactly Boston Dynamics wants you to think Spot’s eyes are.

[ Boston Dynamics ]

As with every single cooking video, there’s a lot of background prep that’s required for this robot to cook an entire meal, but I would utterly demolish those fries.

[ Dino Robotics ]

Here’s everything you need to know about Wing delivery drones, except for how much human time they actually require and the true cost of making deliveries by drone, because those things aren’t fun to talk about.

[ Wing ]

This CMU Teruko Yata Memorial Lecture is by Agility Robotics’ Jonathan Hurst, on “Human-Centric Robots and How Learning Enables Generality.”

Humans have dreamt of robot helpers forever. What’s new is that this dream is becoming real. New developments in AI, building on foundations of hardware and passive dynamics, enable vastly improved generality. Robots can step out of highly structured environments and become more human-centric: operating in human spaces, interacting with people, and doing some basic human workflows. By connecting a Large Language Model, Digit can convert natural language high-level requests into complex robot instructions, composing the library of skills together, using human context to achieve real work in the human world. All of this is new—and it is never going back: AI will drive a fast-following robot revolution that is going to change the way we live.

[ CMU ]

Video Friday: LASSIE On the Moon



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboCup German Open: 17–21 April 2024, KASSEL, GERMANY
AUVSI XPONENTIAL 2024: 22–25 April 2024, SAN DIEGO
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

USC, UPenn, Texas A&M, Oregon State, Georgia Tech, Temple University, and NASA Johnson Space Center are teaching dog-like robots to navigate craters of the moon and other challenging planetary surfaces in research funded by NASA.

[ USC ]

AMBIDEX is a revolutionary robot that is fast, lightweight, and capable of human-like manipulation. We have added a sensor head, a torso, and a waist to greatly expand the range of movement. Compared to the previous arm-centered version, the overall impression and balance have completely changed.

[ Naver Labs ]

It still needs a lot of work, but the six-armed pollinator, Stickbug, can autonomously navigate and pollinate flowers in a greenhouse now.

I think “needs a lot of work” really means “needs a couple more arms.”

[ Paper ]

Experience the future of robotics as UBTECH’s humanoid robot integrates with Baidu’s ERNIE through AppBuilder! Witness robots [that] understand language and autonomously perform tasks like folding clothes and object sorting.

[ UBTECH ]

I know the fins on this robot are for walking underwater rather than on land, but watching it move, I feel like it’s destined to evolve into something a little more terrestrial.

[ Paper ] via [ HERO Lab ]

iRobot has a new Roomba that vacuums and mops—and at $275, it’s a pretty good deal.

Also, if you are a robot vacuum owner, please, please remember to clean the poor thing out from time to time. Here’s how to do it with a Roomba:

[ iRobot ]

The video demonstrates the wave-basin testing of a 43 kg (95 lb) amphibious cycloidal propeller unmanned underwater vehicle (Cyclo-UUV) developed at the Advanced Vertical Flight Laboratory, Texas A&M University. The use of cyclo-propellers allows for 360 degree thrust vectoring for more robust dynamic controllability compared to UUVs with conventional screw propellers.

[ AVFL ]

Sony is still upgrading Aibo with new features, like the ability to listen to your terrible music and dance along.

[ Aibo ]

Operating robots precisely and at high speeds has been a long-standing goal of robotics research. To enable precise and safe dynamic motions, we introduce a four degree-of-freedom (DoF) tendon-driven robot arm. Tendons allow placing the actuation at the base to reduce the robot’s inertia, which we show significantly reduces peak collision forces compared to conventional motor-driven systems. Pairing our robot with pneumatic muscles allows generating high forces and highly accelerated motions, while benefiting from impact resilience through passive compliance.

[ Max Planck Institute ]

Rovers on Mars have previously been caught in loose soils, and turning the wheels dug them deeper, just like a car stuck in sand. To avoid this, Rosalind Franklin has a unique wheel-walking locomotion mode to overcome difficult terrain, as well as autonomous navigation software.

[ ESA ]

Cassie is able to walk on sand, gravel, and rocks inside the Robot Playground at the University of Michigan.

Aww, they stopped before they got to the fun rocks.

[ Paper ] via [ Michigan Robotics ]

Not bad for 2016, right?

[ Namiki Lab ]

MOMO has learned the Bam Yang Gang dance moves with its hand dexterity. :) By analyzing 2D dance videos, we extract detailed hand skeleton data, allowing us to recreate the moves in 3D using a hand model. With this information, MOMO replicates the dance motions with its arm and hand joints.

[ RILAB ] via [ KIMLAB ]

This UPenn GRASP SFI Seminar is from Eric Jang at 1X Technologies, on “Data Engines for Humanoid Robots.”

1X’s mission is to create an abundant supply of physical labor through androids that work alongside humans. I will share some of the progress 1X has been making towards general-purpose mobile manipulation. We have scaled up the number of tasks our androids can do by combining an end-to-end learning strategy with a no-code system to add new robotic capabilities. Our Android Operations team trains their own models on the data they gather themselves, producing an extremely high-quality “farm-to-table” dataset that can be used to learn extremely capable behaviors. I’ll also share an early preview of the progress we’ve been making towards a generalist “World Model” for humanoid robots.

[ UPenn ]

This Microsoft Future Leaders in Robotics and AI Seminar is from Chahat Deep Singh at the University of Maryland, on “Minimal Perception: Enabling Autonomy in Palm-Sized Robots.”

The solution to robot autonomy lies at the intersection of AI, computer vision, computational imaging, and robotics—resulting in minimal robots. This talk explores the challenge of developing a minimal perception framework for tiny robots (less than 6 inches) used in field operations such as space inspections in confined spaces and robot pollination. Furthermore, we will delve into the realm of selective perception, embodied AI, and the future of robot autonomy in the palm of your hands.

[ UMD ]

Video Friday: Human to Humanoid



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

HRI 2024: 11–15 March 2024, BOULDER, COLO.
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

We present Human to Humanoid (H2O), a reinforcement learning (RL) based framework that enables real-time, whole-body teleoperation of a full-sized humanoid robot with only an RGB camera. We successfully achieve teleoperation of dynamic, whole-body motions in real-world scenarios, including walking, back jumping, kicking, turning, waving, pushing, boxing, etc. To the best of our knowledge, this is the first demonstration to achieve learning-based, real-time, whole-body humanoid teleoperation.

[ CMU ]

Legged robots have the potential to traverse complex terrain and access confined spaces beyond the reach of traditional platforms thanks to their ability to carefully select footholds and flexibly adapt their body posture while walking. However, robust deployment in real-world applications is still an open challenge. In this paper, we present a method for legged locomotion control using reinforcement learning and 3D volumetric representations to enable robust and versatile locomotion in confined and unstructured environments.

[ Takahiro Miki ]

Sure, 3.3 meters per second is fast for a humanoid, but I’m more impressed by the spinning around while walking downstairs.

[ Unitree ]

Improving the safety of collaborative manipulators necessitates the reduction of inertia in the moving part. We introduce a novel approach in the form of a passive, 3D wire aligner, serving as a lightweight and low-friction power transmission mechanism, thus achieving the desired low inertia in the manipulator’s operation.

[ SAQIEL ]

Thanks, Temma!

Robot Era just launched Humanoid-Gym, an open-source reinforcement learning framework for bipedal humanoids. As you can see from the video, RL algorithms have given the robot, called Xiao Xing, or XBot, the ability to climb up and down haphazardly stacked boxes with relative stability and ease.

[ Robot Era ]

“Impact-Aware Bimanual Catching of Large-Momentum Objects.” Need I say more?

[ SLMC ]

More than 80% of stroke survivors experience walking difficulty, significantly impacting their daily lives, independence, and overall quality of life. Now, new research from the University of Massachusetts Amherst pushes forward the bounds of stroke recovery with a unique robotic hip exoskeleton, designed as a training tool to improve walking function. This invites the possibility of new therapies that are more accessible and easier to translate from practice to daily life, compared to current rehabilitation methods.

[ UMass Amherst ]

Thanks, Julia!

The manipulation here is pretty impressive, but it’s hard to know how impressive without also knowing how much the video was sped up.

[ Somatic ]

DJI drones work to make the world a better place and one of the ways that we do this is through conservation work. We partnered with Halo Robotics and the OFI Orangutan Foundation International to showcase just how these drones can make an impact.

[ DJI ]

The aim of the test is to demonstrate the removal and replacement of satellite modules into a 27U CubeSat format using augmented reality control of a robot. In this use case, the “client” satellite is being upgraded and refueled using modular componentry. The robot will then remove the failed computer module and place it in a fixture. It will then do the same with the propellant tank. The robot will then place these correctly back into the satellite.

[ Extend Robotics ]

This video features some of the highlights and favorite moments from the CYBATHLON Challenges 2024 that took place on 2 February, showing so many diverse types of assistive technology taking on discipline tasks and displaying pilots’ tenacity and determination. The Challenges saw new teams, new tasks, and new formats for many of the CYBATHLON disciplines.

[ Cybathlon ]

It’s been a long road to electrically powered robots.

[ ABB ]

Small drones for catastrophic wildfires (ones covering more than [40,470 hectares]) are like bringing a flashlight to light up a football field. This short video describes the major uses for drones of all sizes and why and when they are used, or why not.

[ CRASAR ]

It probably will not surprise you that there are a lot of robots involved in building Rivian trucks and vans.

[ Kawasaki Robotics ]

DARPA’s Learning Introspective Control (LINC) program is developing machine learning methods that show promise in making that scenario closer to reality. LINC aims to fundamentally improve the safety of mechanical systems—specifically in ground vehicles, ships, drone swarms, and robotics—using various methods that require minimal computing power. The result is an AI-powered controller the size of a cell phone.

[ DARPA ]

Video Friday: Pedipulate



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

HRI 2024: 11–15 March 2024, BOULDER, COLO.
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

Legged robots have the potential to become vital in maintenance, home support, and exploration scenarios. In order to interact with and manipulate their environments, most legged robots are equipped with a dedicated robot arm, which means additional mass and mechanical complexity compared to standard legged robots. In this work, we explore pedipulation—using the legs of a legged robot for manipulation.

This work, by Philip Arm, Mayank Mittal, Hendrik Kolvenbach, and Marco Hutter from ETH Zurich’s Robotic Systems Lab, will be presented at the IEEE International Conference on Robotics and Automation (ICRA 2024) in May, in Japan (see events calendar above).

[ Pedipulate ]

I learned a new word today: “stigmergy.” Stigmergy is a kind of group coordination that’s based on environmental modification. Like, when insects leave pheromone trails, they’re not directly sending messages to other individuals. But as a group, ants are able to manifest surprisingly complex coordinated behaviors. Cool, right? Researchers at IRIDIA are exploring the possibilities for robots using stigmergy with a cool “artificial pheromone” system using a UV-sensitive surface.
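To make the mechanism concrete, here’s a toy simulation of stigmergy (my own minimal sketch, not the IRIDIA system): agents never exchange messages directly, they only deposit a virtual “pheromone” into a shared grid that slowly evaporates, and each agent biases its next move toward stronger marks.

import random

# Toy illustration of stigmergy: coordination happens only through marks
# left in a shared, evaporating grid (not the IRIDIA implementation).
SIZE, EVAPORATION, DEPOSIT = 20, 0.95, 1.0
grid = [[0.0] * SIZE for _ in range(SIZE)]

def step(agents):
    moved = []
    for x, y in agents:
        # Candidate moves: the 3x3 neighborhood, wrapped at the edges.
        moves = [((x + dx) % SIZE, (y + dy) % SIZE)
                 for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        # Prefer cells that already carry more pheromone; +0.1 keeps exploration alive.
        weights = [grid[i][j] + 0.1 for i, j in moves]
        nx, ny = random.choices(moves, weights=weights)[0]
        grid[nx][ny] += DEPOSIT            # the only "communication" is this mark
        moved.append((nx, ny))
    for row in grid:                       # pheromone evaporates every step
        for j in range(SIZE):
            row[j] *= EVAPORATION
    return moved

agents = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(5)]
for _ in range(100):
    agents = step(agents)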

“Automatic Design of Stigmergy-Based Behaviors for Robot Swarms,” by Muhammad Salman, David Garzón Ramos, and Mauro Birattari, is published in the journal Communications Engineering.

[ Nature ] via [ IRIDIA ]

Thanks, David!

Filmed in July 2017, this video shows Atlas walking through a “hatch” on a pitching surface. This skill uses autonomous behaviors, with the robot not knowing about the rocking world. Robot built by Boston Dynamics for the DARPA Robotics Challenge in 2013. Software by IHMC Robotics.

[ IHMC ]

That IHMC video reminded me of the SAFFiR program for Shipboard Autonomous Firefighting Robots, which is responsible for a bunch of really cool research in partnership with the U.S. Naval Research Laboratory. NRL did some interesting stuff with Nexi robots from MIT and made their own videos. That effort I think didn’t get nearly enough credit for being very entertaining while communicating important robotics research.

[ NRL ]

I want more robot videos with this energy.

[ MIT CSAIL ]

Large industrial-asset operators increasingly use robotics to automate hazardous work at their facilities. This has led to soaring demand for autonomous inspection solutions like ANYmal. Series production by our partner Zollner enables ANYbotics to supply our customers with the required quantities of robots.

[ ANYbotics ]

This week is Grain Bin Safety Week, and Grain Weevil is here to help.

[ Grain Weevil ]

Oof, this is some heavy, heavy deep-time stuff.

[ Onkalo ]

And now, this.

[ RozenZebet ]

Hawkeye is a real-time multimodal conversation-and-interaction agent for Boston Dynamics’ mobile robot Spot. Leveraging OpenAI’s experimental GPT-4 Turbo and Vision AI models, Hawkeye aims to empower everyone, from seniors to health care professionals, in forming new and unique interactions with the world around them.

That moment at 1:07 is so relatable.

[ Hawkeye ]

Wing would really prefer that if you find one of their drones on the ground, you don’t run off with it.

[ Wing ]

The rover Artemis, developed at the DFKI Robotics Innovation Center, has been equipped with a penetrometer that measures the soil’s penetration resistance to obtain precise information about soil strength. The video showcases an initial test run with the device mounted on the robot. During this test, the robot was remotely controlled, and the maximum penetration depth was limited to 15 millimeters.

[ DFKI ]

To efficiently achieve complex humanoid loco-manipulation tasks in industrial contexts, we propose a combined vision-based tracker-localization interplay integrated as part of a task-space whole-body-optimization control. Our approach allows humanoid robots, targeted for industrial manufacturing, to manipulate and assemble large-scale objects while walking.

[ Paper ]

We developed a novel multibody robot (called the Two-Body Bot) consisting of two small-footprint mobile bases connected by a four-bar linkage where handlebars are mounted. Each base measures only 29.2 centimeters wide, making the robot likely the slimmest ever developed for mobile postural assistance.

[ MIT ]

Lex Fridman interviews Marc Raibert.

[ Lex Fridman ]

Video Friday: Agile but Safe



Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Cybathlon Challenges: 2 February 2024, ZURICH
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN

Enjoy today’s videos!

Is “scamperiest” a word? If not, it should be, because this is the scamperiest robot I’ve ever seen.

[ ABS ]

GITAI is pleased to announce that its 1.5-meter-long autonomous dual robotic arm system (S2) has successfully arrived at the International Space Station (ISS) aboard the SpaceX Falcon 9 rocket (NG-20) to conduct an external demonstration of in-space servicing, assembly, and manufacturing (ISAM) while onboard the ISS. The success of the S2 tech demo will be a major milestone for GITAI, confirming the feasibility of this technology as a fully operational system in space.

[ GITAI ]

This work presents a comprehensive study on using deep reinforcement learning (RL) to create dynamic locomotion controllers for bipedal robots. Going beyond focusing on a single locomotion skill, we develop a general control solution that can be used for a range of dynamic bipedal skills, from periodic walking and running to aperiodic jumping and standing.

And if you want to get exhausted on behalf of a robot, the full 400-meter dash is below.

[ Hybrid Robotics ]

NASA’s Ingenuity Mars Helicopter pushed aerodynamic limits during the final months of its mission, setting new records for speed, distance, and altitude. Hear from Ingenuity chief engineer Travis Brown on how the data the team collected could eventually be used in future rotorcraft designs.

[ NASA ]

BigDog: 15 years of solving mobility problems its own way.

[ Boston Dynamics ]

[Harvard School of Engineering and Applied Sciences] researchers are helping develop resilient and autonomous deep space and extraterrestrial habitations by developing technologies to let autonomous robots repair or replace damaged components in a habitat. The research is part of the Resilient ExtraTerrestrial Habitats institute (RETHi) led by Purdue University, in partnership with [Harvard] SEAS, the University of Connecticut and the University of Texas at San Antonio. Its goal is to “design and operate resilient deep space habitats that can adapt, absorb and rapidly recover from expected and unexpected disruptions.”

[ Harvard SEAS ]

In a recent T-RO paper, researchers from Huazhong University of Science and Technology (HUST) describe and construct a novel variable-stiffness spherical joint motor that enables dexterous motion and joint compliance in all directions.

[ Paper ]

Thanks, Ram!

We are told that this new robot from HEBI is called “Mark Suckerberg” and that they’ve got a pretty cool application in mind for it, to be revealed later this year.

[ HEBI Robotics ]

Thanks, Dave!

Dive into the first edition of our new Real-World-Robotics class at ETH Zürich! Our students embarked on an incredible journey, creating their human-like robotic hands from scratch. In just three months, the teams designed, built, and programmed their tendon-driven robotic hands, mastering dexterous manipulation with reinforcement learning! The result? A spectacular display of innovation and skill during our grand final.

[ SRL ETHZ ]

Carnegie Mellon researchers have built a system with a robotic arm atop a RangerMini 2.0 robotic cart from AgileX robotics to make what they’re calling a platform for “intelligent movement and processing.”

[ CMU ] via [ AgileX ]

Picassnake is our custom-made robot that paints pictures from music. Picassnake consists of an arm and a head, embedded in a plush snake doll. The robot is connected to a laptop for control and music processing, which can be fed through a microphone or an MP3 file. To open the media source, an operator can use the graphical user interface or place a text QR code in front of a webcam. Once the media source is opened, Picassnake generates unique strokes based on the music and translates the strokes to physical movement to paint them on canvas.
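The mapping from sound to strokes is the interesting part. Purely as a hypothetical illustration of one way such a mapping could work (these names and choices are mine, not the Picassnake implementation), louder audio frames could become longer strokes while pitch nudges the stroke direction:

import numpy as np

def audio_to_strokes(samples, rate=44100, frame=4096):
    # Hypothetical music-to-stroke mapping (not the Picassnake code):
    # RMS loudness sets stroke length, spectral centroid bends the heading.
    # `samples` is assumed to be a 1-D NumPy array of mono audio.
    strokes, angle = [], 0.0
    for start in range(0, len(samples) - frame, frame):
        window = samples[start:start + frame].astype(float)
        loudness = np.sqrt(np.mean(window ** 2))
        spectrum = np.abs(np.fft.rfft(window))
        freqs = np.fft.rfftfreq(frame, 1.0 / rate)
        centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9)
        angle += (centroid / 1000.0 - 1.0) * 0.2
        strokes.append((loudness * 100.0, angle))   # (length, heading) for the arm
    return strokes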

[ Picassnake ]

In April 2021, NASA’s Ingenuity Mars Helicopter became the first spacecraft to achieve powered, controlled flight on another world. With 72 successful flights, Ingenuity has far surpassed its originally planned technology demonstration of up to five flights. On Jan. 18, Ingenuity flew for the final time on the Red Planet. Join Tiffany Morgan, NASA’s Mars Exploration Program Deputy Director, and Teddy Tzanetos, Ingenuity Project Manager, as they discuss these historic flights and what they could mean for future extraterrestrial aerial exploration.

[ NASA ]
