Figure 02 Robot Is a Sleeker, Smarter Humanoid

By Evan Ackerman, IEEE Spectrum | 6 August 2024, 15:06


Today, Figure is introducing the newest, slimmest, shiniest, and least creatively named next generation of its humanoid robot: Figure 02. According to the press release, Figure 02 is the result of “a ground-up hardware and software redesign” and is “the highest performing humanoid robot,” which may even be true for some arbitrary value of “performing.” Also notable is that Figure has been actively testing robots with BMW at a manufacturing plant in Spartanburg, S.C., where the new humanoid has been performing “data collection and use case training.”

The rest of the press release is pretty much, “Hey, check out our new robot!” And you’ll get all of the content in the release by watching the videos. What you won’t get from the videos is any additional info about the robot. But we sent along some questions to Figure about these videos, and have a few answers from Michael Rose, director of controls, and Vadim Chernyak, director of hardware.


First, the trailer:

How many parts does Figure 02 have, and is this all of them?

Figure: A couple hundred unique parts and a couple thousand parts total. No, this is not all of them.

Does Figure 02 make little Figure logos with every step?

Figure: If the surface is soft enough, yes.

Swappable legs! Was that hard to do, or easier to do because you only have to make one leg?

Figure: We chose to make swappable legs to help with manufacturing.

Is the battery pack swappable too?

Figure: Our battery is swappable, but it is not a quick swap procedure.

What’s that squishy-looking stuff on the back of Figure 02’s knees and in its elbow joints?

Figure: These are soft stops, which limit the range of motion in a controlled way and prevent robot pinch points.

Where’d you hide that thumb motor?

Figure: The thumb is now fully contained in the hand.

Tell me about the “skin” on the neck!

Figure: The skin is a soft fabric which is able to keep a clean seamless look even as the robot moves its head.

And here’s the reveal video:

When Figure 02’s head turns, its body turns too, and its arms move. Is that necessary, or aesthetic?

Figure: Aesthetic.

The upper torso and shoulders seem very narrow compared to other humanoids. Why is that?

Figure: We find it essential to package the robot to be of similar proportions to a human. This allows us to complete our target use cases and fit into our environment more easily.

What can you tell me about Figure 02’s walking gait?

Figure: The robot is using a model predictive controller to determine footstep locations and forces required to maintain balance and follow the desired robot trajectory.
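
For a sense of what that entails, here is a minimal, hypothetical sketch of a footstep MPC on a one-dimensional linear inverted pendulum: every control cycle it solves a small regularized least-squares problem for upcoming foot (ZMP) placements so the predicted center of mass tracks a reference. All parameters are invented, and a real humanoid controller like the one described also reasons about 3-D dynamics, contact forces, and step timing.

```python
# Minimal, hypothetical sketch of footstep MPC on a 1-D linear inverted
# pendulum (LIP). It picks a foot (ZMP) position for each step of a short
# horizon so the predicted center of mass tracks a reference trajectory.
import numpy as np

g, z_com = 9.81, 0.9          # gravity; assumed CoM height [m]
omega = np.sqrt(g / z_com)    # LIP natural frequency
dt, horizon = 0.1, 10         # control period [s]; prediction steps

# Exact discrete LIP dynamics: state s = [com_pos, com_vel], input p = foot pos.
A = np.array([[np.cosh(omega * dt),         np.sinh(omega * dt) / omega],
              [omega * np.sinh(omega * dt), np.cosh(omega * dt)]])
B = np.array([[1.0 - np.cosh(omega * dt)],
              [-omega * np.sinh(omega * dt)]])

def plan_footsteps(s0, com_ref, reg=1e-3):
    """Regularized least-squares MPC: choose foot positions over the horizon
    so the predicted CoM positions track com_ref."""
    Sx = np.zeros((horizon, 2))        # maps initial state to predicted CoM
    Su = np.zeros((horizon, horizon))  # maps foot positions to predicted CoM
    Ak = np.eye(2)
    for k in range(horizon):
        Ak = A @ Ak                    # A^(k+1)
        Sx[k] = Ak[0]
        for j in range(k + 1):
            Su[k, j] = (np.linalg.matrix_power(A, k - j) @ B)[0, 0]
    # Minimize ||Sx s0 + Su p - com_ref||^2 + reg ||p||^2 in closed form.
    H = Su.T @ Su + reg * np.eye(horizon)
    b = Su.T @ (com_ref - Sx @ s0)
    return np.linalg.solve(H, b)       # foot positions over the horizon

# Example: CoM starts at rest at 0 and should advance 5 cm per prediction step.
footsteps = plan_footsteps(np.array([0.0, 0.0]),
                           com_ref=0.05 * np.arange(1, horizon + 1))
print(np.round(footsteps, 3))
```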

How much runtime do you get from 2.25 kilowatt-hours doing the kinds of tasks that we see in the video?

Figure: We are targeting a 5-hour run time for our product.
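
As a back-of-the-envelope check (my arithmetic, not a number from Figure): a 2.25-kilowatt-hour pack spread over a 5-hour target shift implies an average draw of roughly 450 watts, ignoring charging losses and peak loads.

```python
# Back-of-the-envelope average power implied by the numbers above.
battery_kwh = 2.25       # pack capacity quoted in the article
target_hours = 5.0       # Figure's stated runtime target
avg_watts = battery_kwh * 1000 / target_hours
print(f"Implied average draw: {avg_watts:.0f} W")   # -> 450 W
```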


A photo of a grey and black humanoid robot with a shiny black faceplate standing in front of a white wall. Slick, but also a little sinister? [Image: Figure]

This thing looks slick. I’d say that it’s maybe a little too far on the sinister side for a robot intended to work around humans, but the industrial design is badass and the packaging is excellent, with the vast majority of the wiring now integrated within the robot’s skins and flexible materials covering joints that are typically left bare. Figure, if you remember, raised a US $675 million Series B that valued the company at $2.6 billion, and somehow the look of this robot seems appropriate to that.

I do still have some questions about Figure 02, such as where the interesting foot design came from and whether a 16-degree-of-freedom hand is really worth it in the near term. It’s also worth mentioning that Figure seems to have a fair number of Figure 02 robots running around—at least five units at its California headquarters, plus potentially a couple more at the BMW Spartanburg manufacturing facility.

I also want to highlight this boilerplate at the end of the release: “our humanoid is designed to perform human-like tasks within the workforce and in the home.” We are very, very far away from a humanoid robot in the home, but I appreciate that it’s still an explicit goal that Figure is trying to achieve. Because I want one.


Video Friday: $2.6 Billion

By Evan Ackerman, IEEE Spectrum | 1 March 2024, 22:02


Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

HRI 2024: 11–15 March 2024, BOULDER, COLORADO, USA
Eurobot Open 2024: 8–11 May 2024, LA ROCHE-SUR-YON, FRANCE
ICRA 2024: 13–17 May 2024, YOKOHAMA, JAPAN
RoboCup 2024: 17–22 July 2024, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

Figure has raised a US $675 million Series B, valuing the company at $2.6 billion.

[ Figure ]

Meanwhile, here’s how things are going at Agility Robotics, whose last raise was a $150 million Series B in April of 2022.

[ Agility Robotics ]

Also meanwhile, here’s how things are going at Sanctuary AI, whose last raise was a $58.5 million Series A in March of 2022.

[ Sanctuary AI ]

The time has come for humanoid robots to enter industrial production lines and learn how to assist humans by undertaking repetitive, tedious, and potentially dangerous tasks for them. Recently, UBTECH’s humanoid robot Walker S was introduced into the assembly line of NIO’s advanced vehicle-manufacturing center, as an “intern” assisting in car production. Walker S is the first bipedal humanoid robot to complete a specific workstation’s tasks on a mobile EV production line.

[ UBTECH ]

Henry Evans keeps working hard to make robots better, this time with the assistance of researchers from Carnegie Mellon University.

Henry said he preferred using head-worn assistive teleoperation (HAT) with a robot for certain tasks rather than depending on a caregiver. “Definitely scratching itches,” he said. “I would be happy to have it stand next to me all day, ready to do that or hold a towel to my mouth. Also, feeding me soft foods, operating the blinds, and doing odd jobs around the room.”
One innovation in particular, software called Driver Assistance that helps align the robot’s gripper with an object the user wants to pick up, was “awesome,” Henry said. Driver Assistance leaves the user in control while it makes the fine adjustments and corrections that can make controlling a robot both tedious and demanding. “That’s better than anything I have tried for grasping,” Henry said, adding that he would like to see Driver Assistance used for every interface that controls Stretch robots.

[ HAT2 ] via [ CMU ]
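
The “fine adjustments and corrections” Henry describes are a classic shared-control pattern: the user’s command stays primary while the software adds a bounded nudge toward the detected object. Here is a toy sketch of that idea; the function and parameters are invented for illustration and are not the actual Driver Assistance code.

```python
# Toy sketch of shared control ("blended autonomy"), in the spirit of the
# Driver Assistance feature described above. Hypothetical, not CMU's code.
import numpy as np

def blended_command(user_vel, gripper_pos, object_pos,
                    assist_gain=0.5, max_assist=0.05):
    """Blend the user's commanded gripper velocity with a small correction
    that nudges the gripper toward the detected object (units: m, m/s)."""
    error = np.asarray(object_pos, dtype=float) - np.asarray(gripper_pos, dtype=float)
    correction = assist_gain * error
    # Cap the assistance so the user's command always dominates the motion.
    norm = np.linalg.norm(correction)
    if norm > max_assist:
        correction *= max_assist / norm
    return np.asarray(user_vel, dtype=float) + correction

# Example: the user drives mostly forward; the assist trims the sideways error.
print(blended_command(user_vel=[0.10, 0.0, 0.0],
                      gripper_pos=[0.40, 0.02, 0.30],
                      object_pos=[0.55, 0.00, 0.28]))
```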

Watch this video for the three glorious seconds at the end.

[ Tech United ]

Get ready to rip, shear, mow, and tear, as DOOM is back! This April, we’re making the legendary game playable on our robotic mowers as a tribute to 30 years of mowing down demons.

Oh, it’s HOOSKvarna, not HUSKvarna.

[ Husqvarna ] via [ Engadget ]

Latest developments demonstrated on the Ameca Desktop platform. Having fun with vision- and voice-cloning capabilities.

[ Engineered Arts ]

Could an artificial-intelligence system learn language from a child? New York University researchers supported by the National Science Foundation, using first-person video from a head-mounted camera, trained AI models to learn language through the eyes and ears of a child.

[ NYU ]

The world’s leaders in manufacturing, natural resources, power, and utilities are using our autonomous robots to gather data of higher quality and in higher quantities than ever before. Thousands of Spots have been deployed around the world—more than any other walking robot—to tackle this challenge. This release helps maintenance teams tap into the power of AI with new software capabilities and Spot enhancements.

[ Boston Dynamics ]

Modular self-reconfigurable robotic systems are more adaptive than conventional systems. This article proposes a novel free-form and truss-structured modular self-reconfigurable robot called FreeSN, containing node and strut modules. This article presents a novel configuration identification system for FreeSN, including connection point magnetic localization, module identification, module orientation fusion, and system-configuration fusion.

[ Freeform Robotics ]

The OOS-SIM (On-Orbit Servicing Simulator) is a simulator for on-orbit servicing tasks such as repair, maintenance, and assembly that have to be carried out on satellites orbiting the Earth. It simulates the operational conditions in orbit, such as weightlessness and harsh illumination.

[ DLR ]

The next CYBATHLON competition, which will take place again in 2024, breaks down barriers between the public, people with disabilities, researchers and technology developers. From 25 to 27 October 2024, the CYBATHLON will take place in a global format in the Arena Schluefweg in Kloten near Zurich and in local hubs all around the world.

[ CYBATHLON ]

George’s story is a testament to the incredible journey that unfolds when passion, opportunity, and community converge. His journey from drone enthusiast to someone actively making a difference, not only in his local community but also globally, serves as a beacon of hope for all who dare to dream and pursue their passions.

[ WeRobotics ]

In case you’d forgotten, Amazon has a lot of robots.

[ Amazon Robotics ]

ABB’s fifty-year story of robotic innovation began in 1974 with the sale of the world’s first commercial all-electric robot, the IRB 6. Björn Weichbrodt was a key figure in the development of the IRB 6.

[ ABB ]

Robotics Debate of the Ingenuity Labs Robotics and AI Symposium (RAIS2023) from October 12, 2023: Is robotics helping or hindering our progress on UN Sustainable Development Goals?

[ Ingenuity Labs ]


Figure Raises $675M for Its Humanoid Robot Development

By Evan Ackerman, IEEE Spectrum | 29 February 2024, 14:00


Today, Figure is announcing an astonishing US $675 million Series B raise, which values the company at an even more astonishing $2.6 billion. Figure is one of the companies working toward a multipurpose or general-purpose (depending on whom you ask) bipedal or humanoid (depending on whom you ask) robot. The astonishing thing about this valuation is that Figure’s robot is still very much in the development phase—although they’re making rapid progress, which they demonstrate in a new video posted this week.


This round of funding comes from Microsoft, OpenAI Startup Fund, Nvidia, Jeff Bezos (through Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures, and ARK Invest. Figure says that they’re going to use this new capital “for scaling up AI training, robot manufacturing, expanding engineering head count, and advancing commercial deployment efforts.” In addition, Figure and OpenAI will be collaborating on the development of “next-generation AI models for humanoid robots” which will “help accelerate Figure’s commercial timeline by enhancing the capabilities of humanoid robots to process and reason from language.”

As far as that commercial timeline goes, here’s the most recent update:

[Video: Figure]

And to understand everything that’s going on here, we sent a whole bunch of questions to Jenna Reher, senior robotics/AI engineer at Figure.

What does “fully autonomous” mean, exactly?

Jenna Reher: In this case, we simply put the robot on the ground and hit go on the task with no other user input. What you see is using a learned vision model for bin detection that allows us to localize the robot relative to the target bin and get the bin pose. The robot can then navigate itself to within reach of the bin, determine grasp points based on the bin pose, and detect grasp success through the measured forces on the hands. Once the robot turns and sees the conveyor, the rest of the task rolls out in a similar manner. By doing things in this way we can move the bins and conveyor around in the test space or start the robot from a different position and still complete the task successfully.
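
Read as a pipeline, the loop Reher describes is perceive, approach, grasp, verify by force, and then the same pattern again at the conveyor. The sketch below is a hypothetical rendering of those steps around a toy robot class; none of the names, numbers, or interfaces come from Figure.

```python
# Hypothetical sketch of the autonomy loop described above: detect the bin with
# a (here, faked) vision model, walk within reach, grasp, check success from
# measured hand forces, then repeat the pattern at the conveyor.
import numpy as np

GRASP_FORCE_N = 5.0   # assumed threshold for "the tote is actually in hand"
REACH_M = 0.6         # assumed manipulation reach

class ToyRobot:
    """Stand-in robot so the sketch runs; real perception and control omitted."""
    def __init__(self):
        self.pos = np.zeros(2)
        self.world = {"bin": np.array([2.0, 0.5]), "conveyor": np.array([0.0, 3.0])}
    def detect(self, name):        # learned vision model in the real system
        return self.world[name]
    def step_toward(self, goal):   # footstep planner / MPC in the real system
        self.pos = self.pos + 0.3 * (goal - self.pos) / np.linalg.norm(goal - self.pos)
    def within_reach(self, goal):
        return np.linalg.norm(goal - self.pos) < REACH_M
    def grasp(self, pose):         # grasp points would be derived from the bin pose
        return 8.0                 # pretend measured hand force [N]
    def place(self, pose):
        return None

def run_tote_task(robot):
    for target, act in (("bin", robot.grasp), ("conveyor", robot.place)):
        pose = robot.detect(target)
        while not robot.within_reach(pose):
            robot.step_toward(pose)
            pose = robot.detect(target)        # re-detect as the robot moves
        force = act(pose)
        if target == "bin" and force < GRASP_FORCE_N:
            return "abort: grasp failed"       # force check says we missed
    return "done"

print(run_tote_task(ToyRobot()))
```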

How many takes did it take to get this take?

Reher: We’ve been running this use case consistently for some time now as part of our work in the lab, so we didn’t really have to change much for the filming here. We did two or three practice runs in the morning and then three filming takes. All of the takes were successful, so the extras were to make sure we got the cleanest one to show.

What’s back in the Advanced Actuator Lab?

Reher: We have an awesome team of folks working on some exciting custom actuator designs for our future robots, as well as supporting and characterizing the actuators that went into our current robots.

That’s a very specific number for “speed vs. human.” Which human did you measure the robot’s speed against?

Reher: We timed Brett [Adcock, founder of Figure] and a few poor engineers doing the task and took the average to get a rough baseline. If you are observant, that seemingly overspecific number is just saying we’re at 1/6 human speed. The main point that we’re trying to make here is that we are aware we are currently below human speed, and it’s an important metric to track as we improve.

What’s the tether for?

Reher: For this task we currently process the camera data off-robot while all of the behavior planning and control happens on board in the computer that’s in the torso. Our robots should be fully tetherless in the near future as we finish packaging all of that on board. We’ve been developing behaviors quickly in the lab here at Figure in parallel to all of the other systems engineering and integration efforts happening, so hopefully folks notice all of these subtle parallel threads converging as we try to release regular updates.

How the heck do you keep your robotics lab so clean?

Reher: Everything we’ve filmed so far is in our large robot test lab, so it’s a lot easier to keep the area clean when people’s desks aren’t intruding in the space. Definitely no guarantees on that level of cleanliness if the camera were pointed in the other direction!

Is the robot in the background doing okay?

Reher: Yes! The other robot was patiently standing there in the background, waiting for the filming to finish up so that our manipulation team could get back to training it to do more manipulation tasks. We hope we can share some more developments with that robot as the main star in the near future.

What would happen if I put a single bowling ball into that tote?

Reher: A bowling ball is particularly menacing to this task primarily due to the moving mass, in addition to the impact if you are throwing it in. The robot would in all likelihood end up dropping the tote, stay standing, and abort the task. With what you see here, we assume that the mass of the tote is known a priori so that our whole-body controller can compensate for the external forces while tracking the manipulation task. Reacting to and estimating larger unknown disturbances such as this is a challenging problem, but we’re definitely working on it.
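
To make the “known a priori” point concrete: with a rigid payload of known mass, a torque-controlled arm can cancel the load with a simple feedforward term mapped through the hand Jacobian, and a rolling bowling ball violates exactly that assumption. The fragment below is a generic illustration of the technique, not Figure’s whole-body controller.

```python
# Generic illustration of compensating for a payload of known mass: map the
# payload's gravity wrench at the hand through the arm Jacobian into
# feedforward joint torques. Shapes and values are made up for the example.
import numpy as np

G = 9.81  # m/s^2

def payload_feedforward_torques(hand_jacobian, payload_mass_kg):
    """hand_jacobian: 6 x n_joints spatial Jacobian at the grasp frame
    (rows ordered force-x, force-y, force-z, torque-x, torque-y, torque-z).
    Returns joint torques that cancel the payload's weight."""
    # Wrench the payload applies to the hand: a pure downward force, assuming
    # the grasp is at the payload's center of mass.
    payload_wrench = np.array([0.0, 0.0, -payload_mass_kg * G, 0.0, 0.0, 0.0])
    # tau = J^T w maps a task-space wrench into joint torques; adding the
    # negative of the payload's wrench keeps the arm from sagging under load.
    return -hand_jacobian.T @ payload_wrench

# Example with a made-up 6x7 Jacobian and a 10 kg tote:
J = np.random.default_rng(0).standard_normal((6, 7))
print(np.round(payload_feedforward_torques(J, payload_mass_kg=10.0), 2))
```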

Tell me more about that very Zen arm and hand pose that the robot adopts after putting the tote on the conveyor.

Reher: It does look kind of Zen! If you rewatch our coffee video, you’ll notice the same pose after the robot gets things brewing. This is a reset pose that our controller will go into between manipulation tasks while the robot is awaiting commands to execute either an engineered behavior or a learned policy.

Are the fingers less fragile than they look?

Reher: They are more robust than they look, but not impervious to damage by any means. The design is pretty modular, which is great, meaning that if we damage one or two fingers, there is a small number of parts to swap to get everything back up and running. The current fingers won’t necessarily survive a direct impact from a bad fall, but can pick up totes and do manipulation tasks all day without issues.

Is the Figure logo footsteps?

Reher: One of the reasons I really like the Figure logo is that it has a bunch of different interpretations depending on how you look at it. In some cases it’s just an F that looks like a footstep plan rollout, while some of the logo animations we have look like active stepping. One other possible interpretation could be an occupancy grid.
