Boston Dynamics

Arm you glad to see me, Atlas? | Boston Dynamics




In robotics, dexterity and perception go hand-in-hand, literally! Our Atlas manipulation test stand practices a range of grasps, using RL policies trained with NVIDIA’s DextrAH-RGB.

Learn More: https://developer.nvidia.com/blog/r%C2%B2d%C2%B2-adapting-dexterous-robots-with-nvidia-research-workflows-and-models/



23 thoughts on “Arm you glad to see me, Atlas? | Boston Dynamics”
  1. You have to add a penalty term to the cost function for movement of the object before the grasp happens. It could help the robot with those rapid movements after the grab.

  2. I'm going to be brutally honest: I'm not too worried about losing my job to this guy. By the time this guy is finally ready to compete, I'll have been decomposing in the ground for a while.

  3. Trying to mimic human anatomy in robots is idiotic. There is a reason robot vacuums do not look like a person with a vacuum cleaner and a welding robot in a car factory does not look like a guy with a welding tool. Humanoid robots are good only for hype and maybe a very niche field of prosthetics, but even there you can get much more effective manipulators if you stop trying to mimic human anatomy.

  4. Why not design a movement system that the machine learning iterates on top of? It seems like you could save a lot of time, not to mention make something much better than what this thing is doing.
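The reward-shaping idea in the first comment — penalizing object motion before the grasp is secured — could be sketched roughly as below. This is a hypothetical illustration, not code from Atlas or DextrAH-RGB; the function name, inputs, and the `move_penalty` weight are all assumptions.

```python
def shaped_reward(grasp_success, object_displacement, grasped, move_penalty=10.0):
    """Hypothetical reward shaping for a grasping policy.

    grasp_success: 1.0 on a successful grasp this step, else 0.0 (sparse term)
    object_displacement: distance the object moved this step (metres, assumed)
    grasped: True once the hand has secured the object
    move_penalty: weight on pre-grasp object motion (illustrative value)
    """
    reward = grasp_success
    if not grasped:
        # Penalize knocking the object around before securing it,
        # discouraging the rapid pre-grasp movements the comment describes.
        reward -= move_penalty * object_displacement
    return reward
```

In this sketch the penalty applies only before the grasp, so the policy is still free to move the object quickly once it is held.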

