Tag Archives: good
#437935 Start the New Year Right: By Watching ...
I don’t need to tell you that 2020 was a tough year. There was almost nothing good about it, and we saw it off with a “good riddance” and hopes for a better 2021. But robotics company Boston Dynamics took a different approach to closing out the year: when all else fails, why not dance?
The company released a video last week that I dare you to watch without laughing—or at the very least, cracking a pretty big smile. Because, well, dancing robots are funny. And it’s not just one dancing robot, it’s four of them: two humanoid Atlas bots, one four-legged Spot, and one Handle, a bot-on-wheels built for materials handling.
The robots’ killer moves look almost too smooth and coordinated to be real, leading many to speculate that the video was computer-generated. But if you can trust Elon Musk, there’s no CGI here.
This is not CGI https://t.co/VOivE97vPR
— Elon Musk (@elonmusk) December 29, 2020
Boston Dynamics has gone through a lot of changes in the last ten years: it was acquired by Google in 2013, then sold to Japanese conglomerate SoftBank in 2017 before being acquired again by Hyundai just a few weeks ago for $1.1 billion. But this isn’t the first time the company has taught a robot to dance and made a video for all the world to enjoy; Spot tore up the floor to “Uptown Funk” back in 2018.
Four-legged Spot went commercial in June, with a hefty price tag of $74,500, and was put to some innovative pandemic-related uses, including remotely measuring patients’ vital signs and reminding people to social distance.
Hyundai plans to apply its newly acquired robotics prowess to everything from service and logistics robots to autonomous driving and smart factories.
They’ll have their work cut out for them. Besides being hilarious, kind of heartwarming, and kind of creepy all at once, the robots’ new routine is pretty impressive from an engineering standpoint. Compare it to a 2016 video of Atlas trying to pick up a box (I know it’s a machine with no feelings, but it’s hard not to feel a little bit bad for it, isn’t it?), and it’s clear Boston Dynamics’ technology has made huge strides. It wouldn’t be surprising if, in two years’ time, we see a video of a flash mob of robots whose routine includes partner dancing and back flips (which, admittedly, Atlas can already do).
In the meantime, though, this one is pretty entertaining—and not a bad note on which to start the new year.
Image Credit: Boston Dynamics
#437918 Video Friday: These Robots Wish You ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICCR 2020 – December 26-29, 2020 – [Online]
HRI 2021 – March 8-11, 2021 – [Online]
RoboSoft 2021 – April 12-16, 2021 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.
Look who’s baaaack: Jibo! After being sold (twice?), this pioneering social home robot (it was first announced back in 2014!) now belongs to NTT Disruption, which was described to us as the “disruptive company of NTT Group.” We are all for disruption, so this looks like a great new home for Jibo.
[ NTT Disruption ]
Thanks Ana!
FZI's Christmas party was a bit of a challenge this year; good thing robots are perfectly capable of having a party on their own.
[ FZI ]
Thanks Arne!
Do you have a lonely dog that just wants a friend to watch cat videos on YouTube with? The Danish Technological Institute has a gift idea for you.
[ DTI ]
Thanks Samuel!
Once upon a time, not so far away, there was an elf who received a very special gift. Watch this heartwarming story. Happy Holidays from the Robotiq family to yours!
Of course, these elves are not now unemployed; they've instead moved over to toy design full time!
[ Robotiq ]
An elegant Christmas video from the Dynamic Systems Lab; make sure to watch through to the very end for a little extra cheer.
[ Dynamic Systems Lab ]
Thanks Angela!
Usually I complain when robotics companies make holiday videos without any real robots in them, but this is pretty darn cute from Yaskawa this year.
[ Yaskawa ]
Here's our little Christmas gift to the fans of strange dynamic behavior. The gyro will follow any given shape as soon as the tip touches its edge and the rotation is fast enough. The friction between the tip and the shape generates a tangential force, creating a moment such that the gyroscopic reaction pushes the tip toward the shape. The resulting normal force produces a moment that guides the tip along the shape's edge.
[ TUM ]
Happy Holidays from Fanuc!
Okay but why does there have to be an assembly line elf just to put in those little cranks?
[ Fanuc ]
Astrobotic's cute little CubeRover is at NASA busy not getting stuck in places.
[ Astrobotic ]
Team CoSTAR is sharing more of their work on subterranean robotic exploration.
[ CoSTAR ]
Skydio Autonomy Enterprise Foundation (AEF) is a new software product that delivers advanced AI-powered capabilities to assist the pilot during tactical situational-awareness scenarios and detailed industrial asset inspections. Designed for professionals, it offers an enterprise-caliber flight experience through the new Skydio Enterprise application.
[ Skydio ]
GITAI's S1 autonomous robot will conduct two experiments: IVA (Intra-Vehicular Activity) tasks such as switch and cable operations, and assembly of structures and panels to demonstrate its capability for ISA (In-Space Assembly) tasks. This video was recorded in the Nanoracks Bishop Airlock mock-up facility at GITAI's Tokyo office.
[ GITAI ]
It's no Atlas, but this is some impressive dynamic balancing from iCub.
[ IIT ]
The Campaign to Stop Killer Robots and I don't agree on a lot of things, and I don't agree with a lot of the assumptions made in this video, either. But, here you go!
[ CSKR ]
I don't know much about this robot, but I love it.
[ Columbia ]
Most cable-suspended robots have a very well defined workspace, but you can increase that workspace by swinging them around. Wheee!
[ Laval ]
How you know your robot's got some skill: “to evaluate the performance in climbing over the step, we compared the RL result to the results of 12 students who attempted to find the best planning. The RL outperformed all the group, in terms of effort and time, both in continuous (joystick) and partition planning.”
[ Zarrouk Lab ]
In the Spring 2021 semester, mechanical engineering students taking MIT class 2.007, Design and Manufacturing I, will be able to participate in the class’ iconic final robot competition from the comfort of their own home. Whether they take the class virtually or semi-virtually, students will be sent a massive kit of tools and materials to build their own unique robot along with a “Home Alone” inspired game board for the final global competition.
[ MIT ]
Well, this thing is still around!
[ Moley Robotics ]
Manuel Ahumada wrote in to share this robotic Baby Yoda that he put together with a little bit of help from Intel's OpenBot software.
[ YouTube ]
Thanks Manuel!
Here's what Zoox has been working on for the past half-decade.
[ Zoox ]
#437896 Solar-based Electronic Skin Generates ...
Replicating the human sense of touch is complicated—electronic skins need to be flexible, stretchable, and sensitive to temperature, pressure, and texture; they need to be able to read biological data and provide electronic readouts. On top of all that, powering an electronic skin for continuous, real-time use is a big challenge.
To address this, researchers from the University of Glasgow have developed an energy-generating e-skin made of miniaturized solar cells, with no dedicated touch sensors. The solar cells not only generate their own power—and some surplus—but also provide tactile capabilities for touch and proximity sensing. An early-view paper of their findings was published in IEEE Transactions on Robotics.
When exposed to a light source, the solar cells on the e-skin generate energy. If a cell is shadowed by an approaching object, the intensity of the light—and therefore the energy generated—drops, falling to zero when the cell makes contact with the object, confirming touch. In proximity mode, the light intensity indicates how far away the object is from the cell. “In real time, you can then compare the light intensity…and after calibration find out the distances,” says Ravinder Dahiya of the Bendable Electronics and Sensing Technologies (BEST) Group, James Watt School of Engineering, University of Glasgow, where the study was carried out. The team paired the solar cells with infrared LEDs for better proximity-sensing results.
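The readout Dahiya describes amounts to a simple per-cell classifier: full illumination means nothing nearby, a partial drop in output maps (after calibration) to a distance, and near-zero output means contact. Here is a minimal sketch of that logic in Python; the threshold and the calibration table are illustrative assumptions, not values from the paper.

```python
TOUCH_THRESHOLD = 0.02  # normalized intensity below which we report contact (assumed)

# Hypothetical calibration: (normalized cell output, distance in mm) pairs,
# measured under the fixed light source used during calibration.
CALIBRATION = [(1.00, 50.0), (0.75, 30.0), (0.50, 15.0), (0.25, 5.0), (0.05, 1.0)]

def read_cell(intensity):
    """Classify one cell's normalized output as touch, proximity
    (with an estimated distance in mm), or clear (no object nearby)."""
    if intensity <= TOUCH_THRESHOLD:
        return "touch", 0.0
    pts = sorted(CALIBRATION)          # ascending by intensity
    if intensity >= pts[-1][0]:
        return "clear", None           # full illumination: nothing shadowing the cell
    # Interpolate linearly between the two nearest calibration points.
    lo = pts[0]
    for hi in pts[1:]:
        if intensity <= hi[0]:
            frac = (intensity - lo[0]) / (hi[0] - lo[0])
            return "proximity", lo[1] + frac * (hi[1] - lo[1])
        lo = hi
```

In practice each cell's calibration would be measured under the deployment lighting, which is exactly what the infrared LEDs mentioned above help stabilize.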
To demonstrate their concept, the researchers wrapped a generic 3D-printed robotic hand in their solar skin, which was then recorded interacting with its environment. The proof-of-concept tests showed an energy surplus of 383.3 mW from the palm of the robotic arm. “The eSkin could generate more than 100 W if present over the whole body area,” they reported in their paper.
“If you look at autonomous, battery-powered robots, putting an electronic skin [that] is consuming energy is a big problem because then it leads to reduced operational time,” says Dahiya. “On the other hand, if you have a skin which generates energy, then…it improves the operational time because you can continue to charge [during operation].” In essence, he says, they turned a challenge—how to power the large surface area of the skin—into an opportunity—by turning it into an energy-generating resource.
Dahiya envisages numerous applications for BEST’s innovative e-skin, given its material-integrated sensing capabilities, beyond the obvious use in robotics. For instance, in prosthetics: “[As] we are using [a] solar cell as a touch sensor itself…we are also [making it] less bulkier than other electronic skins.” This, he adds, will help create prosthetics of optimal weight and size, making them easier for users. “If you look at electronic skin research, the real action starts after it makes contact… Solar skin is a step ahead, because it will start to work when the object is approaching…[and] have more time to prepare for action.” This could effectively reduce the time lag often seen in brain–computer interfaces.
There are also possibilities in the automotive sector, particularly for electric and interactive vehicles. A car covered with solar e-skin would, thanks to its proximity-sensing capabilities, be able to “see” an approaching obstacle or person. It isn’t “seeing” in the biological sense, Dahiya clarifies, but from the point of view of a machine. The skin can be integrated with other objects, not just cars, for a variety of uses. “Gestures can be recognized as well…[which] could be used for gesture-based control…in gaming or in other sectors.”
In the lab, tests were conducted with a single source of white light at 650 lux, but Dahiya sees interesting possibilities if they could work with multiple light sources that the e-skin could differentiate between. “We are exploring different AI techniques [for that],” he says, “processing the data in an innovative way [so] that we can identify the directions of the light sources as well as the object.”
The BEST team’s achievement brings us closer to a flexible, self-powered, cost-effective electronic skin that can touch as well as “see.” At the moment, however, there are still some challenges. One of them is flexibility. In their prototype, they used commercial solar cells made of amorphous silicon, each 1 cm × 1 cm. “They are not flexible, but they are integrated on a flexible substrate,” Dahiya says. “We are currently exploring nanowire-based solar cells…[with which] we hope to achieve good performance in terms of energy as well as sensing functionality.” Another shortcoming is what Dahiya calls “the integration challenge”—how to make the solar skin work with different materials.