#437643 Video Friday: Matternet Launches Urban ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
IROS 2020 – October 25-29, 2020 – [Online]
Bay Area Robotics Symposium – November 20, 2020 – [Online]
ACRA 2020 – December 8-10, 2020 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.
Sixteen teams chose their roster of virtual robots and sensor payloads, some based on real-life subterranean robots, and submitted autonomy and mapping algorithms that SubT Challenge officials then tested across eight cave courses in the cloud-based SubT Simulator. Their robots traversed the cave environments autonomously, without any input or adjustments from human operators. The Cave Circuit Virtual Competition teams earned points by correctly finding, identifying, and localizing up to 20 artifacts hidden in the cave courses within five-meter accuracy.
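As a rough illustration of that scoring rule, here is a minimal Python sketch (not DARPA's actual scoring code, with hypothetical artifact names) of checking reported artifacts against ground truth within the five-meter radius:

import math

# Minimal sketch of the Cave Circuit rule: a report scores a point if the
# artifact type matches and the reported position is within 5 meters of the
# ground-truth location. Each artifact can only be scored once.
SCORING_RADIUS_M = 5.0

def score_reports(reports, ground_truth, radius=SCORING_RADIUS_M):
    """reports and ground_truth are lists of (artifact_type, (x, y, z)) tuples."""
    remaining = list(ground_truth)
    points = 0
    for r_type, r_pos in reports:
        for gt_type, gt_pos in remaining:
            if r_type == gt_type and math.dist(r_pos, gt_pos) <= radius:
                points += 1
                remaining.remove((gt_type, gt_pos))
                break
    return points

# Example: one correct find, one report that is off by more than 5 meters.
truth = [("backpack", (10.0, 2.0, 0.0)), ("phone", (40.0, -3.0, 1.0))]
found = [("backpack", (12.0, 3.0, 0.0)), ("phone", (50.0, -3.0, 1.0))]
print(score_reports(found, truth))  # prints 1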
[ SubT ]
This year, the KUKA Innovation Award’s international jury of experts received a total of more than 40 ideas. The five finalist teams had until November to implement their ideas. A KUKA LBR Med lightweight robot – the first robotic component to be certified for integration into a medical device – was made available to them for this purpose. Beyond this, the teams received training on the hardware and coaching from KUKA experts throughout the competition. At virtual.MEDICA from 16-19.11.2020, the finalists presented their concepts to an international audience of experts and to the Innovation Award jury.
The winner of the KUKA Innovation Award 2020, worth 20,000 euros, is Team HIFUSK from the Scuola Superiore Sant'Anna in Italy.
[ KUKA Innovation Award ]
Like everything else, the in-person Cybathlon event was cancelled, but the competition itself took place, just a little more distributed than it would have been otherwise.
[ Cybathlon ]
Matternet, developer of the world's leading urban drone logistics platform, today announced the launch of operations at Labor Berlin Charité Vivantes in Germany. The program kicked off on November 17, 2020, with permanent operations expected to take flight next year, creating the first urban BVLOS [Beyond Visual Line of Sight] medical drone delivery network in the European Union. The drone network expects to significantly improve the timeliness and efficiency of Labor Berlin’s diagnostics services by providing an option to avoid roadway delays, which will improve patient experience with potentially life-saving benefits and lower costs.
Routine BVLOS over an urban area? Impressive.
[ Matternet ]
Robots playing diabolo!
[ OMRON Sinic X ]
Thanks Thilo!
Anki's tech has been repackaged into this robot that serves butter:
[ Butter Robot ]
Berkshire Grey just announced our Picking With Purpose Program in which we’ve partnered our robotic automation solutions with food rescue organizations City Harvest and The Greater Boston Food Bank to pick, pack, and distribute food to families in need in time for Thanksgiving. Berkshire Grey donated about 40,000 pounds of food, used one of our robotic automation systems to pick and pack that food into meal boxes for families in need, and our team members volunteered to run the system. City Harvest and The Greater Boston Food Bank are distributing the 4,000 meal boxes we produced. This is just the beginning. We are building a sponsorship program to make Picking With Purpose an ongoing initiative.
[ Berkshire Grey ]
Thanks Peter!
We previously posted a video of Cassie learning to skip, but here's a much more detailed look (accompanying an ICRA submission) that includes some very impressive stair descents.
[ DRL ]
From garage inventors to university students and entrepreneurs, NASA is looking for ideas on how to excavate the Moon’s icy regolith, or dirt, and deliver it to a hypothetical processing plant at the lunar South Pole. The NASA Break the Ice Lunar Challenge, a NASA Centennial Challenge, is now open for registration. The competition will take place over two phases and will reward new ideas and approaches for a system architecture capable of excavating and moving icy regolith and water on the lunar surface.
[ NASA ]
Adaptation to various scene configurations and object properties, as well as stability and dexterity in robotic grasping and manipulation, remains far from fully explored. This work presents an origami-based shape-morphing fingertip design that actively tackles the grasping stability and dexterity problems. The proposed fingertip uses origami as its skeleton, providing degrees of freedom at desired positions, and motor-driven four-bar linkages as its transmission components, to achieve a compact fingertip size.
[ Paper ]
“If Roboy crashes… you die.”
[ Roboy ]
Traditionally lunar landers, as well as other large space exploration vehicles, are powered by solar arrays or small nuclear reactors. Rovers and small robots, however, are not big enough to carry their own dedicated power supplies and must be tethered to their larger counterparts via electrical cables. Tethering severely restricts mobility, and cables are prone to failure due to lunar dust (regolith) interfering with electrical contact points. Additionally, as robots become smaller and more complex, they are fitted with additional sensors that require more power, further exacerbating the problem. Lastly, solar arrays are not viable for charging during the lunar night. WiBotic is developing rapid charging systems and energy monitoring base stations for lunar robots, including the CubeRover – a shoebox-sized robot designed by Astrobotic – that will operate autonomously and charge wirelessly on the Moon.
[ WiBotic ]
Watching pick and place robots is my therapy.
[ Soft Robotics ]
It's really, really hard to beat liquid fuel for energy storage, as Quaternium demonstrates with their hybrid drone.
[ Quaternium ]
Thanks Gregorio!
State-of-the-art quadrotor simulators have a rigid and highly specialized structure: they are either really fast, physically accurate, or photo-realistic. In this work, we propose a novel quadrotor simulator: Flightmare.
[ Flightmare ]
Drones that chuck fire-fighting balls into burning buildings, sure!
[ LARICS ]
If you missed ROS World, that's okay, because all of the talks are now online. Here's the opening keynote from Vivian Chu of Diligent Robotics, along with a couple of fun lightning talks.
[ ROS World 2020 ]
This week's CMU RI Seminar is by Chelsea Finn from Stanford University, on “Data Scalability for Robot Learning.”
Recent progress in robot learning has demonstrated how robots can acquire complex manipulation skills from perceptual inputs through trial and error, particularly with the use of deep neural networks. Despite these successes, the generalization and versatility of robots across environment conditions, tasks, and objects remains a major challenge. And, unfortunately, our existing algorithms and training set-ups are not prepared to tackle such challenges, which demand large and diverse sets of tasks and experiences. In this talk, I will discuss two central challenges that pertain to data scalability: first, acquiring large datasets of diverse and useful interactions with the world, and second, developing algorithms that can learn from such datasets. Then, I will describe multiple approaches that we might take to rethink our algorithms and data pipelines to serve these goals. This will include algorithms that allow a real robot to explore its environment in a targeted manner with minimal supervision, approaches that can perform robot reinforcement learning with videos of human trial-and-error experience, and visual model-based RL approaches that are not bottlenecked by their capacity to model everything about the world.
[ CMU RI ]
#437610 How Intel’s OpenBot Wants to Make ...
You could make a pretty persuasive argument that the smartphone represents the single fastest area of technological progress we’re going to experience for the foreseeable future. Every six months or so, there’s something with better sensors, more computing power, and faster connectivity. Many different areas of robotics are benefiting from this on a component level, but over at Intel Labs, they’re taking a more direct approach with a project called OpenBot that turns US $50 worth of hardware and your phone into a mobile robot that can support “advanced robotics workloads such as person following and real-time autonomous navigation in unstructured environments.”
This work aims to address two key challenges in robotics: accessibility and scalability. Smartphones are ubiquitous and are becoming more powerful by the year. We have developed a combination of hardware and software that turns smartphones into robots. The resulting robots are inexpensive but capable. Our experiments have shown that a $50 robot body powered by a smartphone is capable of person following and real-time autonomous navigation. We hope that the presented work will open new opportunities for education and large-scale learning via thousands of low-cost robots deployed around the world.
Smartphones point to many possibilities for robotics that we have not yet exploited. For example, smartphones also provide a microphone, speaker, and screen, which are not commonly found on existing navigation robots. These may enable research and applications at the confluence of human-robot interaction and natural language processing. We also expect the basic ideas presented in this work to extend to other forms of robot embodiment, such as manipulators, aerial vehicles, and watercraft.
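To make the person-following idea concrete, here is a toy steering loop in Python. It illustrates the general approach rather than OpenBot's actual code, and the image width, target box size, and detector output format are all assumptions:

# Toy person-following loop: a phone-side detector returns a bounding box for
# the person, and we steer to keep the box centered and at a target size.
IMAGE_WIDTH = 640          # assumed camera resolution
TARGET_BOX_WIDTH = 200     # how wide the person should appear at the desired range

def follow_command(box):
    """box = (x_min, y_min, x_max, y_max) in pixels; returns (throttle, steering)."""
    if box is None:
        return 0.0, 0.0                     # nobody detected: stop
    x_min, _, x_max, _ = box
    center_x = (x_min + x_max) / 2.0
    width = x_max - x_min
    # Steer proportionally to how far the person sits from the image center.
    steering = 2.0 * (center_x / IMAGE_WIDTH - 0.5)          # range -1 to 1
    # Drive forward until the person fills the target fraction of the frame.
    throttle = max(0.0, min(1.0, (TARGET_BOX_WIDTH - width) / TARGET_BOX_WIDTH))
    return throttle, steering

print(follow_command((400, 100, 520, 400)))  # person right of center: steer right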
One of the interesting things about this idea is how not-new it is. The highest profile phone robot was likely the $150 Romo, from Romotive, which raised a not-insignificant amount of money on Kickstarter in 2012 and 2013 for a little mobile chassis that accepted one of three different iPhone models and could be controlled via another device or operated somewhat autonomously. It featured “computer vision, autonomous navigation, and facial recognition” capabilities, but was really designed to be a toy. Lack of compatibility hampered Romo a bit, and there wasn’t a lot that it could actually do once the novelty wore off.
As impressive as smartphone hardware was in a robotics context (even back in 2013), we’re obviously way, way beyond that now, and OpenBot figures that smartphones now have enough clout and connectivity that turning them into mobile robots is a good idea. You know, again. We asked Intel Labs’ Matthias Mueller why now was the right time to launch OpenBot, and he mentioned things like the existence of a large maker community with broad access to 3D printing as well as open source software that makes broader development easier.
And of course, there’s the smartphone hardware: “Smartphones have become extremely powerful and feature dedicated AI processors in addition to CPUs and GPUs,” says Mueller. “Almost everyone owns a very capable smartphone now. There has been a big boost in sensor performance, especially in cameras, and a lot of the recent developments for VR applications are well aligned with robotic requirements for state estimation.” OpenBot has been tested with 10 recent Android phones, and since camera placement tends to be similar and USB-C is becoming the charging and communications standard, compatibility is less of an issue nowadays.
Image: OpenBot
Intel researchers created this table comparing OpenBot to other wheeled robot platforms, including Amazon’s DeepRacer, MIT’s Duckiebot, iRobot’s Create-2, and Thymio. The top group includes robots based on RC trucks; the bottom group includes navigation robots for deployment at scale and in education. Note that the cost of the smartphone needed for OpenBot is not included in this comparison.
If you’d like an OpenBot of your own, you don’t need to know all that much about robotics hardware or software. For the hardware, you probably need some basic mechanical and electronics experience—think Arduino project level. The software is a little more complicated; there’s a pretty good walkthrough to get some relatively sophisticated behaviors (like autonomous person following) up and running, but things rapidly degenerate into a command line interface that could be intimidating for new users. We did ask about why OpenBot isn’t ROS-based to leverage the robustness and reach of that community, and Mueller said that ROS “adds unnecessary overhead,” although “if someone insists on using ROS with OpenBot, it should not be very difficult.”
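For a sense of what the Arduino-level side of a setup like this usually looks like, here is a hedged sketch of driving a differential-drive motor board from a host over USB serial with pyserial. The "left,right" message format and the device path are hypothetical; OpenBot defines its own protocol in its documentation:

import serial  # pyserial

PORT = "/dev/ttyUSB0"   # placeholder device path
BAUD = 115200

def send_motor_command(ser, left_pwm, right_pwm):
    """Send one differential-drive command as a plain text line."""
    ser.write(f"{left_pwm},{right_pwm}\n".encode("ascii"))

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    send_motor_command(ser, 150, 150)   # drive forward
    send_motor_command(ser, 0, 0)       # stop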
Without building OpenBot to explicitly be part of an existing ecosystem, the challenge going forward is to make sure that the project is consistently supported, lest it wither and die like so many similar robotics projects have before it. “We are committed to the OpenBot project and will do our best to maintain it,” Mueller assures us. “We have a good track record. Other projects from our group (e.g. CARLA, Open3D, etc.) have also been maintained for several years now.” The inherently open source nature of the project certainly helps, although it can be tricky to rely too much on community contributions, especially when something like this is first starting out.
The OpenBot folks at Intel, we’re told, are already working on a “bigger, faster and more powerful robot body that will be suitable for mass production,” which would certainly help entice more people into giving this thing a go. They’ll also be focusing on documentation, which is probably the most important but least exciting part about building a low-cost community focused platform like this. And as soon as they’ve put together a way for us actual novices to turn our phones into robots that can do cool stuff for cheap, we’ll definitely let you know.
#437608 Video Friday: Agility Robotics Raises ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.
Digit is now in full commercial production and we’re excited to announce a $20M funding round co-led by DCVC and Playground Global!
Digits for everyone!
[ Agility Robotics ]
A flexible rover that can both travel long distances and rappel down hard-to-reach areas of scientific interest has undergone a field test in the Mojave Desert in California to showcase its versatility. Composed of two Axel robots, DuAxel is designed to explore crater walls, pits, scarps, vents, and other extreme terrain on the Moon, Mars, and beyond.
This technology demonstration developed at NASA’s Jet Propulsion Laboratory in Southern California showcases the robot’s ability to split in two and send one of its halves — a two-wheeled Axel robot — over an otherwise inaccessible slope, using a tether as support and to supply power.
The rappelling Axel can then autonomously seek out areas to study, safely overcome slopes and rocky obstacles, and then return to dock with its other half before driving to another destination. Although the rover doesn’t yet have a mission, key technologies are being developed that might, one day, help us explore the rocky planets and moons throughout the solar system.
[ JPL ]
A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.
[ Purdue ]
This video shows the latest results in whole-body locomotion control of the humanoid robot iCub, achieved by the Dynamic Interaction Control line at IIT-Istituto Italiano di Tecnologia in Genova, Italy. In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the possibility of harming humans who share the robot’s surrounding environment.
This is super impressive, considering that iCub was only able to crawl and was still tethered not too long ago. Also, it seems to be blinking properly now, so it doesn’t look like it’s always sleepy.
[ IIT ]
This video shows a set of new tests we performed on Bolt. We conducted tests on five different scenarios: 1) walking forward/backward, 2) uneven surface, 3) soft surface, 4) push recovery, 5) slippage recovery. Thanks to our feedback control based on Model Predictive Control, the robot can walk in the presence of all these uncertainties. We will open-source all the code in the near future.
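For readers unfamiliar with the receding-horizon idea behind Model Predictive Control, here is a toy Python example for a 1D double integrator. It is purely illustrative and in no way ODRI's controller: at each step it solves a short-horizon tracking problem and applies only the first input, which is what lets this style of controller keep absorbing pushes and slips.

import numpy as np

# Discrete double integrator: state x = [position, velocity], input u = acceleration.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])

def mpc_step(x0, x_ref, horizon=20, w_u=0.01):
    """Solve an unconstrained finite-horizon tracking problem in closed form
    and return only the first control input (receding horizon)."""
    n, m = 2, 1
    # Stacked prediction matrices: X = Phi x0 + Gamma U over the horizon.
    Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(horizon)])
    Gamma = np.zeros((n * horizon, m * horizon))
    for i in range(horizon):
        for j in range(i + 1):
            Gamma[n*i:n*(i+1), m*j:m*(j+1)] = np.linalg.matrix_power(A, i - j) @ B
    X_ref = np.tile(x_ref, horizon)
    # Minimize ||Phi x0 + Gamma U - X_ref||^2 + w_u ||U||^2 (normal equations).
    H = Gamma.T @ Gamma + w_u * np.eye(m * horizon)
    g = Gamma.T @ (X_ref - Phi @ x0)
    U = np.linalg.solve(H, g)
    return U[:m]  # apply only the first input, then re-plan next step

x = np.array([0.0, 0.0])
target = np.array([1.0, 0.0])
for t in range(200):
    u = mpc_step(x, target)
    # Simulate with a small velocity disturbance to mimic pushes and slippage.
    x = A @ x + B @ u + np.array([0.0, 0.001 * np.random.randn()])
print("final state:", x)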
[ ODRI ]
The title of this video is “Can you throw your robot into a lake?” The title of this video should be, “Can you throw your robot into a lake and drive it out again?”
[ Norlab ]
AeroVironment Successfully Completes Sunglider Solar HAPS Stratospheric Test Flight, Surpassing 60,000 Feet Altitude and Demonstrating Broadband Mobile Connectivity.
[ AeroVironment ]
We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scaled user actions such as pushing or leaning; (2) acting on the users by pulling or transporting them; and (3) carrying multiple potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other.
[ DeepAI ]
In a new video, personnel from Swiss energy supply company Kraftwerke Oberhasli AG (KWO) explain how they were able to keep employees out of harm’s way by using Flyability’s Elios 2 to collect visual data while building a new dam.
[ Flyability ]
Enjoy our Ascento robot fail compilation! With every failure we experience, we learn more and we can improve our robot for its next iteration, which will come soon… Stay tuned for more!
FYI, posting a robot fails video will pretty much guarantee you a spot in Video Friday!
[ Ascento ]
Humans are remarkably good at using chopsticks. Guinness World Records witnessed a person using chopsticks to pick up 65 M&Ms in just a minute. We aim to collect demonstrations from humans and to teach robots to use chopsticks.
[ UW Personal Robotics Lab ]
A surprising amount of personality from these Yaskawa assembly robots.
[ Yaskawa ]
This paper presents the system design, modeling, and control of the Aerial Robotic Chain Manipulator. This new robot design offers the potential to exert strong forces and moments to the environment, carry and lift significant payloads, and simultaneously navigate through narrow corridors. The presented experimental studies include a valve rotation task, a pick-and-release task, and the verification of load oscillation suppression to demonstrate the stability and performance of the system.
[ ARL ]
Whether animals or plants, whether in the water, on land or in the air, nature provides the model for many technical innovations and inventions. This is summed up in the term bionics, which is a combination of the words ‘biology‘ and ‘electronics’. At Festo, learning from nature has a long history, as our Bionic Learning Network is based on using nature as the source for future technologies like robots, assistance systems or drive solutions.
[ Festo ]
Dogs! Selfies! Thousands of LEGO bricks! This video has it all.
[ LEGO ]
An IROS workshop talk on “Cassie and Mini Cheetah Autonomy” by Maani Ghaffari and Jessy Grizzle from the University of Michigan.
[ Michigan Robotics ]
David Schaefer’s Cozmo robots are back with this mind-blowing dance-off!
What you just saw represents hundreds of hours of work, David tells us: “I wrote over 10,000 lines of code to create the dance performance as I had to translate the beats per minute of the song into motor rotations in order to get the right precision needed to make the moves look sharp. The most challenging move was the SpongeBob SquareDance as any misstep would send the Cozmos crashing into each other. LOL! Fortunately for me, Cozmo robots are pretty resilient.”
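As a toy illustration of the beats-to-motion arithmetic David describes (the numbers are hypothetical, and this is not his actual code), converting a song's tempo into wheel speeds might look like this:

# Convert a song's tempo into per-beat motion timing so moves line up with the music.
BPM = 128                       # assumed tempo of the song
SECONDS_PER_BEAT = 60.0 / BPM   # about 0.469 s per beat at 128 BPM

def speed_for_move(distance_mm, beats):
    """Wheel speed (mm/s) needed to cover a distance in a given number of beats."""
    duration_s = beats * SECONDS_PER_BEAT
    return distance_mm / duration_s

# Example: drive 50 mm forward in exactly two beats.
print(speed_for_move(50, 2))    # about 53.3 mm/s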
[ Life with Cozmo ]
Thanks David!
This week’s GRASP on Robotics seminar is by Sangbae Kim from MIT, on “Robots with Physical Intelligence.”
While industrial robots are effective in repetitive, precise kinematic tasks in factories, the design and control of these robots are not suited for the physically interactive tasks that humans do easily. These tasks require ‘physical intelligence’ through complex dynamic interactions with environments, whereas conventional robots are designed primarily for position control. In order to develop a robot with ‘physical intelligence’, we first need a new type of machine that allows dynamic interactions. This talk will discuss how the new design paradigm allows dynamic interactive tasks. As an embodiment of such a robot design paradigm, the latest version of the MIT Cheetah robots and force-feedback teleoperation arms will be presented.
[ GRASP ]
This week’s CMU RI Seminar is by Kevin Lynch from Northwestern, on “Robotics and Biosystems.”
Research at the Center for Robotics and Biosystems at Northwestern University encompasses bio-inspiration, neuromechanics, human-machine systems, and swarm robotics, among other topics. In this talk I will give an overview of some of our recent work on in-hand manipulation, robot locomotion on yielding ground, and human-robot systems.
[ CMU RI ]