Robot Videos: Deep Robotics, Surgery Robot, Lunar Exploration

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Humanoids 2024: 22–24 November 2024, NANCY, FRANCE
Humanoids Summit: 11–12 December 2024, MOUNTAIN VIEW, CA

Enjoy today’s videos!

Don’t get me wrong, this is super impressive, but I’m like 95% sure that there’s a human driving it. For robots like these to be useful, they’ll need to be autonomous, and high-speed autonomy over unstructured terrain is still very much a work in progress.

[ Deep Robotics ]

[ Paper ] via [ Advanced Science News ]

Subsurface lava tubes have been detected from orbit on both the Moon and Mars. These natural voids are potentially the best places for long-term human habitation, because they offer shelter against radiation and meteorites. This work presents the development and implementation of a novel Tether Management and Docking System (TMDS) designed to support the vertical rappel of a rover through a skylight into a lunar lava tube. The TMDS connects two rovers via a tether, enabling them to cooperate and communicate during such an operation.

[ DFKI Robotics Innovation Center ]

Ad Spiers at Imperial College London writes, “We’ve developed an $80 barometric tactile sensor that, unlike past efforts, is easier to fabricate and repair. By training a machine learning model on controlled stimulation of the sensor, we have been able to increase the resolution from 6 mm to 0.28 mm. We also implement it in one of our E-Troll robotic grippers, allowing the estimation of object position and orientation.”
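The trick Spiers describes, in outline, is to press a probe at known positions during calibration and train a regressor that maps the raw barometer readings back to contact location at finer-than-tap resolution. Here’s a minimal sketch of that idea on made-up data; the 3-by-3 tap grid, Gaussian response model, and scikit-learn regressor are illustrative stand-ins of mine, not the group’s actual hardware or code.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical calibration setup: a 3x3 grid of barometric taps spaced 9 mm
# apart; pressing a probe at a known (x, y) spot produces a pressure pattern.
taps = np.array([(i * 9.0, j * 9.0) for i in range(3) for j in range(3)])

def pressures(xy):
    # Each tap responds more strongly the closer the probe is to it.
    d2 = np.sum((taps - xy) ** 2, axis=1)
    return np.exp(-d2 / (2 * 6.0 ** 2))

n_samples = 4000
y = rng.uniform(0.0, 18.0, size=(n_samples, 2))   # probe positions, in mm
X = np.array([pressures(p) for p in y])
X += 0.01 * rng.normal(size=X.shape)              # sensor noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A small MLP interpolates between taps, recovering contact position at a
# resolution much finer than the 9 mm tap spacing.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
model.fit(X_tr, y_tr)

err = np.linalg.norm(model.predict(X_te) - y_te, axis=1)
print(f"median contact-localization error: {np.median(err):.2f} mm")
```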

[ Imperial College London ] via [ Ad Spiers ]

Thanks Ad!

A robot, trained for the first time to perform surgical procedures by watching videos of robotic surgeries, executed the same procedures—but with considerably more precision.

[ Johns Hopkins University ]

Thanks, Dina!

This is brilliant but I’m really just in it for the satisfying noise it makes.

[ RoCogMan Lab ]

Fast and accurate physics simulation is an essential component of robot learning, where robots can explore failure scenarios that are difficult to produce in the real world and learn from unlimited on-policy data. Yet, it remains challenging to incorporate RGB-color perception into the sim-to-real pipeline that matches the real world in its richness and realism. In this work, we train a robot dog in simulation for visual parkour. We propose a way to use generative models to synthesize diverse and physically accurate image sequences of the scene from the robot’s ego-centric perspective. We present demonstrations of zero-shot transfer to the RGB-only observations of the real world on a robot equipped with a low-cost, off-the-shelf color camera.
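The paragraph above compresses the whole pipeline into a few sentences, so here is a rough structural sketch of the loop it describes. Every function below is a local stand-in of mine, not CSAIL’s code: physics comes from the simulator, a generative model repaints the ego-centric view as RGB, and the policy only ever consumes those RGB frames, which is what makes zero-shot transfer to a real color camera plausible.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_step(state, action):
    """Stand-in physics step: advances the robot state and returns a geometry
    buffer (think depth/segmentation) of what the ego-centric camera sees."""
    next_state = state + 0.1 * action
    layout = rng.random((64, 64))
    return next_state, layout

def generative_repaint(layout):
    """Hypothetical generative model: turns the geometry buffer into a diverse,
    photo-realistic RGB frame consistent with the simulated scene."""
    return np.stack([layout] * 3, axis=-1) + 0.05 * rng.random((64, 64, 3))

def policy(rgb_frame, params):
    """RGB-only policy: it never sees privileged simulator state, which is what
    allows it to run on a robot with nothing but a cheap color camera."""
    return np.tanh(params @ rgb_frame.reshape(-1))

params = rng.normal(scale=0.01, size=(2, 64 * 64 * 3))
state, action = np.zeros(2), np.zeros(2)
for _ in range(10):                     # on-policy rollout used for learning
    state, layout = simulate_step(state, action)
    rgb = generative_repaint(layout)
    action = policy(rgb, params)
```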

[ MIT CSAIL ]

WalkON Suit F1 is a powered exoskeleton that can walk and balance on its own, offering enhanced mobility and independence. Users with paraplegia can transfer into the suit directly from their wheelchair, making it exceptionally easy to use without assistance.

[ Angel Robotics ]

To promote the development of the global embodied-AI industry, Unitree has open-sourced the G1 robot operation dataset, which is adapted to a variety of open-source solutions and will be continuously updated.

[ Unitree Robotics ]

Spot encounters all kinds of obstacles and environmental changes, but it still needs to safely complete its mission without getting stuck, falling, or breaking anything. While there are challenges and obstacles that we can anticipate and plan for—like stairs or forklifts—there are many more that are difficult to predict. To help tackle these edge cases, we used AI foundation models to give Spot a better semantic understanding of the world.
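For a sense of the pattern being described, here is a toy sketch, entirely my own and not Boston Dynamics code: an unanticipated obstacle is handed to a foundation model (a local placeholder here, not a real API) for a semantic label, and that label drives which recovery behavior the robot picks.

```python
# Illustrative sketch only: semantic labeling of an edge-case obstacle,
# followed by a simple behavior selection step.
from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str        # semantic class from the (stand-in) foundation model
    traversable: bool

def describe_with_foundation_model(image_bytes: bytes) -> Obstacle:
    """Hypothetical placeholder: a real system would send the camera frame to
    a vision-language model and parse its answer."""
    return Obstacle(label="loose cable on floor", traversable=False)

def choose_behavior(obstacle: Obstacle) -> str:
    if obstacle.traversable:
        return "proceed slowly"
    if "cable" in obstacle.label or "debris" in obstacle.label:
        return "step around and flag for cleanup"
    return "replan route"

frame = b"\x00" * 16                      # stand-in camera frame
print(choose_behavior(describe_with_foundation_model(frame)))
```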

[ Boston Dynamics ]

Wing drone deliveries of NHS blood samples are now underway in London between Guy’s and St Thomas’ hospitals.

[ Wing ]

As robotics engineers, we love the authentic sounds of robotics—the metal clinking and feet contacting the ground. That’s why we value unedited, raw footage of robots in action. Although unpolished, these candid captures let us witness the evolution of robotics technology without filters, which is truly exciting.

[ UCR ]

Eight minutes of chill mode thanks to Kuka’s robot DJs, which make up the supergroup the Kjays.

A KR3 AGILUS at the drums loops its beats and sets the tempo. The KR CYBERTECH nano is the nimble DJ with rhythm in its blood. A KR AGILUS performs as a light artist, enchanting with soft, expansive movements. And an LBR Med, mounted on the ceiling, keeps an eye on the unusual robot party.

[ Kuka Robotics Corp. ]

Am I the only one disappointed that this isn’t actually a little mini Ascento?

[ Ascento Robotics ]

This demo showcases our robot performing autonomous table wiping powered by Deep Predictive Learning developed by Ogata Lab at Waseda University. Through several dozen human teleoperation demonstrations, the robot has learned natural wiping motions.
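As a rough illustration of the learning-from-demonstration idea behind this (not the Ogata Lab’s Deep Predictive Learning implementation, which as I understand it predicts upcoming sensorimotor states with a recurrent network), here is a toy behavior-cloning sketch on made-up data: record observation/command pairs from teleoperation, fit a predictor, and run it as the policy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demo set: a few dozen wiping demonstrations, each a sequence of
# (observation, commanded-motion) pairs. Dimensions are placeholders.
n_demos, horizon, obs_dim, cmd_dim = 30, 200, 10, 7
obs = rng.normal(size=(n_demos * horizon, obs_dim))
true_map = rng.normal(size=(obs_dim, cmd_dim))           # pretend teacher
cmds = obs @ true_map + 0.01 * rng.normal(size=(n_demos * horizon, cmd_dim))

# Imitation in its simplest form: fit a map from observation to next command.
W, *_ = np.linalg.lstsq(obs, cmds, rcond=None)

def act(observation):
    """Run-time policy: current observation in, next motion command out."""
    return observation @ W

probe = rng.normal(size=obs_dim)
print("command error on a probe observation:",
      float(np.abs(probe @ true_map - act(probe)).mean()))
```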

[ Tokyo Robotics ]

What’s green, bidirectional, and now driving autonomously in San Francisco and the Las Vegas Strip? The Zoox robotaxi! Give us a wave if you see us on the road!

[ Zoox ]

Northrop Grumman has been pioneering capabilities in the undersea domain for more than 50 years. Now, we are creating a new class of uncrewed underwater vehicles (UUVs) with Manta Ray. Taking its name from the massive “winged” fish, Manta Ray will operate long-duration, long-range missions in ocean environments where humans can’t go.

[ Northrop Grumman ]

I was at ICRA 2024 and I didn’t see most of the stuff in this video.

[ ICRA 2024 ]

A fleet of marble-sculpting robots is carving out the future of the art world. It’s a move some artists see as cheating, but others are embracing the change.

[ CBS ]
