#438294 Video Friday: New Entertainment Robot ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
Let us know if you have suggestions for next week, and enjoy today's videos.
Engineered Arts' latest Mesmer entertainment robot is Cleo. It sings, gesticulates, and even does impressions.
[ Engineered Arts ]
I do not know what this thing is or what it's saying but Panasonic is going to be selling them and I will pay WHATEVER. IT. COSTS.
Slightly worrisome is that Google Translate persistently thinks that part of the description involves “sleeping and flatulence.”
[ Panasonic ] via [ RobotStart ]
Spot Enterprise is here to help you safely ignore every alarm that goes off at work while you're snug at home in your jammies drinking cocoa.
That Spot needs a bath.
If you missed the launch event (with more on the arm), check it out here:
[ Boston Dynamics ]
PHASA-35, a 35-meter-wingspan solar-electric aircraft, successfully completed its maiden flight in Australia in February 2020. Designed to operate unmanned in the stratosphere, above the weather and conventional air traffic, PHASA-35 offers a persistent and affordable alternative to satellites combined with the flexibility of an aircraft, and could be used for a range of valuable applications including forest fire detection and maritime surveillance.
[ BAE Systems ]
As part of the Army Research Lab’s (ARL) Robotics Collaborative Technology Alliance (RCTA), we are developing new planning and control algorithms for quadrupedal robots. The goal of our project is to equip the robot LLAMA, developed by NASA JPL, with the skills it needs to move at operational tempo over difficult terrain to keep up with a human squad. This requires innovative perception, planning, and control techniques to make the robot both precise in execution for navigating technical obstacles and robust enough to reject disturbances and recover from unknown errors.
[ IHMC ]
Watch what happens to this drone when it tries to install a bird diverter on a high voltage power line:
[ GRVC ]
Soldiers navigate a wide variety of terrains to successfully complete their missions. As human/agent teaming and artificial intelligence advance, the same flexibility will be required of robots to maneuver across diverse terrain and become effective combat teammates.
[ Army ]
The goal of the GRIFFIN project is to create something like a robotic bird, which almost certainly won't look like this concept rendering.
While I think this research is great, at what point is it in fact easier to just, you know, train an actual bird?
[ GRIFFIN ]
Paul Newman narrates this video from two decades ago, which is a pretty neat trick.
[ Oxford Robotics Institute ]
The first step towards a LEGO-based robotic McMuffin creator is cracking and separating eggs.
[ Astonishing Studios ] via [ BB ]
Some interesting soft robotics projects at the University of Southern Denmark.
[ SDU ]
Chong Liu introduces Creature_02, his final presentation for Hod Lipson's Robotics Studio course at Columbia.
[ Chong Liu ]
The world needs more robot blimps.
[ Lab INIT Robots ]
Having finished its duty early, the KR CYBERTECH nano uses the spare time to play basketball.
[ Kuka ]
senseFly has a new aerial surveying drone that they call “affordable,” although they don't say what the price is.
[ senseFly ]
In summer 2020, several science teams from ETH Zurich participated in “Art Safiental” in the mountains of Graubünden. The scientists packed their hiking gear and their robots; their only mission was “over hill and dale to the summit.” How difficult is it to reach the summit with a legged robot and an exoskeleton? What is the relationship between synesthetic dance and robotics? How will the hikers react to these projects?
[ Rienerschnitzel Films ]
Thanks Robert!
Karen Liu: How robots perceive the physical world. A specialist in computer animation expounds upon her rapidly evolving specialty, known as physics-based simulation, and how it is helping robots become more physically aware of the world around them.
[ Stanford ]
This week's UPenn GRASP On Robotics seminar is by Maria Chiara Carrozza from Scuola Superiore Sant’Anna, on “Biorobotics for Personal Assistance – Translational Research and Opportunities for Human-Centered Developments.”
The seminar will focus on the opportunities and challenges offered by the digital transformation of healthcare, which was accelerated by the COVID-19 pandemic. In this framework, rehabilitation and social robotics can play a fundamental role as enabling technologies for providing innovative therapies and services to patients, even at home or in remote environments.
[ UPenn ]
#438014 Meet Blueswarm, a Smart School of ...
Anyone who’s seen an undersea nature documentary has marveled at the complex choreography that schooling fish display, a darting, synchronized ballet with a cast of thousands.
Those instinctive movements have inspired researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering. The results could improve the performance and dependability of not just underwater robots, but other vehicles that require decentralized locomotion and organization, such as self-driving cars and robots for space exploration.
The fish collective called Blueswarm was created by a team led by Radhika Nagpal, whose lab is a pioneer in self-organizing systems. The oddly adorable robots can sync their movements like biological fish, taking cues from their plastic-bodied neighbors with no external controls required. Nagpal told IEEE Spectrum that this marks a milestone, demonstrating complex 3D behaviors with implicit coordination in underwater robots.
“Insights from this research will help us develop future miniature underwater swarms that can perform environmental monitoring and search in visually-rich but fragile environments like coral reefs,” Nagpal said. “This research also paves a way to better understand fish schools, by synthetically recreating their behavior.”
The research is published in Science Robotics, with Florian Berlinger as first author. Berlinger said the “Bluebot” robots integrate a trio of blue LED lights, a lithium-polymer battery, a pair of cameras, a Raspberry Pi computer, and four controllable fins within a 3D-printed hull. The fish-lens cameras detect the LEDs of their fellow swimmers and apply a custom algorithm to calculate distance, direction, and heading.
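For a rough sense of how that kind of camera-based localization can work in general, here is a minimal sketch using a pinhole-camera model: if a neighbor carries two LEDs at a known separation, the pixel distance between their detections gives a range estimate and the blob centroid gives a bearing. All names, constants, and camera parameters below are assumptions for illustration, not Blueswarm's actual algorithm.

```python
import math

# Hypothetical camera parameters for illustration (not Blueswarm's real values)
FOCAL_LENGTH_PX = 600.0      # pinhole focal length, in pixels
LED_SEPARATION_M = 0.05      # assumed spacing between a neighbor's two LEDs, in meters

def estimate_neighbor(led_a, led_b, image_center_x=320.0):
    """Estimate range and bearing to a neighbor from two detected LED blobs.

    led_a, led_b: (x, y) pixel coordinates of the neighbor's two LEDs.
    Returns (range_m, bearing_rad) under a simple pinhole-camera model.
    """
    # Pixel separation between the two LEDs of the same neighbor
    pixel_sep = math.dist(led_a, led_b)
    if pixel_sep < 1e-6:
        raise ValueError("LED detections coincide; cannot estimate range")

    # Similar triangles: apparent separation shrinks linearly with distance
    range_m = FOCAL_LENGTH_PX * LED_SEPARATION_M / pixel_sep

    # Horizontal bearing from the blob centroid's offset from the image center
    centroid_x = 0.5 * (led_a[0] + led_b[0])
    bearing_rad = math.atan2(centroid_x - image_center_x, FOCAL_LENGTH_PX)
    return range_m, bearing_rad

# Example: LEDs detected 30 px apart, centered 80 px right of the image center
print(estimate_neighbor((400.0, 200.0), (400.0, 230.0)))
```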
Based on that simple production and detection of LED light, the team showed that Blueswarm could self-organize behaviors including aggregation, dispersal, and circle formation—basically, synchronized clockwise swimming. The researchers also simulated a successful search mission, an autonomous Finding Nemo. Using their dispersion algorithm, the robot school spread out until one robot could detect a red light in the tank. Its blue LEDs then flashed, triggering the aggregation algorithm to gather the school around it. Such a robot swarm might prove valuable in search-and-rescue missions at sea, covering miles of open water and reporting back to its mates.
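That search behavior can be pictured as a simple per-robot state machine: disperse until the target (or a flashing neighbor) is seen, then aggregate. The sketch below is a greatly simplified, hypothetical version of that decentralized logic; the function, thresholds, and state names are assumptions, not the published Blueswarm controller.

```python
import math

DISPERSE, AGGREGATE = "disperse", "aggregate"

def step(state, neighbors, sees_red_light, sees_flashing, comfort_range=1.0):
    """One decision step for a single robot in the swarm.

    neighbors: list of (range_m, bearing_rad) tuples to visible neighbors.
    sees_red_light: True once this robot detects the search target itself.
    sees_flashing: True if a flashing (recruiting) neighbor is in view.
    Returns (new_state, flash_leds, heading_rad), where heading_rad is the
    swim direction relative to the robot's current orientation.
    """
    if sees_red_light or sees_flashing:
        state = AGGREGATE  # the finder flashes its LEDs; the others join in

    flash = state == AGGREGATE and sees_red_light
    if not neighbors:
        return state, flash, 0.0  # nothing in view: hold course

    if state == DISPERSE:
        # Swim away from the nearest neighbor until spacing exceeds comfort_range
        rng, bearing = min(neighbors)
        heading = bearing + math.pi if rng < comfort_range else 0.0
    else:
        # Swim toward the average bearing of visible neighbors to aggregate
        heading = math.atan2(
            sum(math.sin(b) for _, b in neighbors),
            sum(math.cos(b) for _, b in neighbors),
        )
    return state, flash, heading

# Example: still dispersing, nearest neighbor 0.4 m away, dead ahead
print(step(DISPERSE, [(0.4, 0.0)], sees_red_light=False, sees_flashing=False))
```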
“Each Bluebot implicitly reacts to its neighbors’ positions,” Berlinger said. The fish—RoboCod, perhaps?—also integrate a Wifi module to allow uploading new behaviors remotely. The lab’s previous efforts include a 1,000-strong army of “Kilobots,” and a robotic construction crew inspired by termites. Both projects operated in two-dimensional space. But a 3D environment like air or water posed a tougher challenge for sensing and movement.
In nature, Berlinger notes, there’s no scaly CEO to direct the school’s movements. Nor do fish communicate their intentions. Instead, so-called “implicit coordination” guides the school’s collective behavior, with individual members executing high-speed moves based on what they see their neighbors doing. That decentralized, autonomous organization has long fascinated scientists, including in robotics.
“In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient. By using implicit rules and 3D visual perception, we were able to create a system with a high degree of autonomy and flexibility underwater where things like GPS and WiFi are not accessible.”
Berlinger adds that the research could one day translate to anything that requires decentralized robots, from self-driving cars and Amazon warehouse vehicles to the exploration of faraway planets, where long communication delays make it impossible to transmit commands quickly. Today’s semi-autonomous cars face their own technical hurdles in reliably sensing and responding to their complex environments, including when foul weather obscures onboard sensors or road markers, or when they can’t fix position via GPS. An entire subset of autonomous-car research involves vehicle-to-vehicle (V2V) communications that could give cars a hive mind to guide individual or collective decisions—avoiding snarled traffic, driving safely in tight convoys, or taking group evasive action during a crash that’s beyond their sensory range.
“Once we have millions of cars on the road, there can’t be one computer orchestrating all the traffic, making decisions that work for all the cars,” Berlinger said.
The miniature robots could also work long hours in places that are inaccessible to humans and divers, or even to large tethered robots. Nagpal said the synthetic swimmers could monitor and collect data on reefs or underwater infrastructure 24/7, and squeeze into tiny places without disturbing fragile equipment or ecosystems.
“If we could be as good as fish in that environment, we could collect information and be non-invasive, in cluttered environments where everything is an obstacle,” Nagpal said.
#438012 Video Friday: These Robots Have Made 1 ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
Let us know if you have suggestions for next week, and enjoy today's videos.
We're proud to announce Starship Delivery Robots have now completed 1,000,000 autonomous deliveries around the world. We were unsure where the one millionth delivery was going to take place, as there are around 15-20 service areas open globally, all with robots doing deliveries every minute. In the end, it took place in Bowling Green, Ohio, to a student called Annika Keeton, who is a freshman studying pre-health biology at BGSU. Annika is now part of Starship’s history!
[ Starship ]
I adore this little DIY walking robot. With modular feet and little dials that let you easily adjust the walking parameters, it's an affordable kit that's way more nuanced than most.
It's called Bakiwi, and it costs €95. A squee cover made from feathers or fur is an extra €17. Here's a more serious look at what it can do:
[ Bakiwi ]
Thanks Oswald!
Savva Morozov, an AeroAstro junior, works on autonomous navigation for the MIT mini cheetah robot and reflects on the value of a crowded Infinite Corridor.
[ MIT ]
The world's most advanced haptic feedback gloves just got a huge upgrade! HaptX Gloves DK2 achieves a level of realism that other haptic devices can't match. Whether you’re training your workforce, designing a new product, or controlling robots from a distance, HaptX Gloves make it feel real.
They're the only gloves with true-contact haptics, with patented technology that displaces your skin the same way a real object would. With 133 points of tactile feedback per hand, they provide full palm and fingertip coverage. HaptX Gloves DK2 feature the industry's most powerful force feedback, ~2X the strength of other force feedback gloves. They're also the most accurate motion tracking gloves, with 30 tracked degrees of freedom, sub-millimeter precision, no perceivable latency, and no occlusion.
[ HaptX ]
Yardroid is an outdoor robot “guided by computer vision and artificial intelligence” that seems like it can do almost everything.
These are a lot of autonomous capabilities, but so far, we've only seen the video. So, best not to get too excited until we know more about how it works.
[ Yardroid ]
Thanks Dan!
Since, as far as we know, Pepper can't spread COVID, it had a busy year.
I somehow missed seeing that chimpanzee magic show, but here it is:
[ Simon Pierro ] via [ SoftBank Robotics ]
In spite of the pandemic, Professor Hod Lipson’s Robotics Studio persevered and even thrived—learning to work on global teams, to develop protocols for sharing blueprints and code, and to test, evaluate, and refine their designs remotely. Equipped with a 3D printer and a kit of electronics prototyping equipment, our students engineered bipedal robots that were conceptualized, fabricated, programmed, and endlessly iterated around the globe in bedrooms, kitchens, backyards, and any other makeshift laboratory you can imagine.
[ Hod Lipson ]
Thanks Fan!
We all know how much quadrupeds love ice!
[ Ghost Robotics ]
We took the opportunity of the last storm to put the Warthog in the snow of Université Laval. Enjoy!
[ Norlab ]
They've got a long way to go, but autonomous indoor firefighting drones seem like a fantastic idea.
[ CTU ]
Individual manipulators are limited by their vertical total load capacity. This places a fundamental limit on the weight of loads that a single manipulator can move. Cooperative manipulation with two arms has the potential to increase the net weight capacity of the overall system. However, it is critical that proper load sharing takes place between the two arms. In this work, we outline a method that utilizes mechanical intelligence in the form of a whiffletree.
And your word of the day is whiffletree, which is “a mechanism to distribute force evenly through linkages.”
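For intuition, the load split in an idealized two-point whiffletree follows from a moment balance about the pivot; the symbols below are generic textbook statics, not values or notation from the DART Lab work.

```latex
% Idealized two-point whiffletree: a load F hangs from a pivot located a
% distance a from one arm's attachment point and b from the other's.
% Moment balance about the pivot gives F_1 a = F_2 b, and F_1 + F_2 = F, so:
\[
  F_1 = F\,\frac{b}{a+b}, \qquad F_2 = F\,\frac{a}{a+b}
\]
% Equal lever arms (a = b) split the load evenly between the two manipulators.
```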
[ DART Lab ]
Thanks Raymond!
Some highlights of robotic projects at FZI in 2020, all using ROS.
[ FZI ]
Thanks Fan!
iRobot CEO Colin Angle threatens my job by sharing some cool robots.
[ iRobot ]
A fascinating new talk from Henry Evans on robotic caregivers.
[ HRL ]
The ANA Avatar XPRIZE semifinals selection submission for Team AVATRINA. The setting is a mock clinic, with the patient sitting on a wheelchair and nurse having completed an initial intake. Avatar enters the room controlled by operator (Doctor). A rolling tray table with medical supplies (stethoscope, pulse oximeter, digital thermometer, oxygen mask, oxygen tube) is by the patient’s side. Demonstrates head tracking, stereo vision, fine manipulation, bimanual manipulation, safe impedance control, and navigation.
[ Team AVATRINA ]
This five-year-old talk from Mikell Taylor, who wrote for us a while back and is now at Amazon Robotics, is entitled “Nobody Cares About Your Robot.” For better or worse, it really doesn't sound like it was written five years ago.
Robotics for the consumer market – Mikell Taylor
[ Mikell Taylor ]
Fall River Community Media presents this wonderful guy talking about his love of antique robot toys.
If you enjoy this kind of slow media, Fall River also has weekly Hot Dogs Cool Cats adoption profiles that are super relaxing to watch.
[ YouTube ]