Tag Archives: awesome

#437693 Video Friday: Drone Helps Explore ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Clearpath Robotics and Boston Dynamics were obviously destined to partner up with Spot, because Spot 100 percent stole its color scheme from Clearpath, which has a monopoly on yellow and black robots. But seriously, the news here is that thanks to Clearpath, Spot now works seamlessly with ROS.

[ Clearpath Robotics ]

A new video created by Swisscom Ventures highlights a research expedition sponsored by Moncler to explore the deepest ice caves in the world using Flyability’s Elios drone. […] The expedition was sponsored by apparel company Moncler and took place over two weeks in 2018 on the Greenland ice sheet, the second largest body of ice in the world after Antarctica. Research focused on an area about 80 kilometers east of Kangerlussuaq, where scientists wanted to study the movement of water deep underground to better understand the effects of climate change on the melting ice.

[ Flyability ]

Shane Wighton of the “Stuff Made Here” YouTube channel, whose terrifying haircut machine we featured a few months ago, has improved on his robotic basketball hoop. It’s actually more than an improvement: It’s a complete redesign that nearly drove Wighton insane. But the result is pretty cool. It’s fun to watch him building a highly complicated system while always seeking simple and elegant designs for its components.

[ Stuff Made Here ]

SpaceX rockets are really just giant, explosion-powered drones that go into space sometimes. So let's watch more videos of them! This one is sped up, and puts a flight into just a couple of minutes.

[ SpaceX ]

Neato Robotics makes some solid autonomous vacuums, and these incremental upgrades feature improved battery life and better air filters.

[ Neato Robotics ]

A full-scale engineering model of NASA's Perseverance Mars rover now resides in a garage facing the Mars Yard at NASA's Jet Propulsion Laboratory in Southern California.

This vehicle system test bed rover (VSTB) is also known as OPTIMISM, which stands for Operational Perseverance Twin for Integration of Mechanisms and Instruments Sent to Mars. OPTIMISM was built in a warehouse-like assembly room near the Mars Yard – an area that simulates the Red Planet's rocky surface. The rover helps the mission test hardware and software before they are sent to the real rover on Mars. OPTIMISM will share the space with the Curiosity rover's twin MAGGIE.

[ JPL ]

Heavy asset industries like shipping, oil and gas, and manufacturing are grounded in repetitive tasks like locating items on large industrial sites. Finding a critical item like a forklift in an area the size of multiple football fields can take as long as 45 minutes. Not only is this work boring, it's dangerous and inefficient. Robots like Spot, however, love this sort of work.

Spot can provide real-time updates on the location of assets and complete other mundane tasks. In this case, Spot is using software from Cognite to roam the vast shipyard to locate and manage more than 100,000 assets stored across the facility. What used to take humans hours can be managed on an ongoing basis by Spot — leaving employees to focus on more strategic tasks.

[ Cognite ]

The KNEXT Barista system helps high-volume premium coffee providers who want to offer artisan coffee specialties of consistent quality.

[ Kuka ]

In this paper, we study this idea of generality in the locomotion domain. We develop a learning framework that can learn sophisticated locomotion behavior for a wide spectrum of legged robots, such as bipeds, tripeds, quadrupeds and hexapods, including wheeled variants. Our learning framework relies on a data-efficient, off-policy multi-task RL algorithm and a small set of reward functions that are semantically identical across robots.
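One way to read "semantically identical across robots" is a reward that depends only on base state and total actuation effort, never on morphology. Here is a minimal sketch of that idea; the function and weights are illustrative, not DeepMind's actual formulation:

```python
import math

def locomotion_reward(vel, target, torques, w_energy=1e-3):
    """Illustrative morphology-agnostic reward (invented for this sketch):
    track a commanded base velocity and lightly penalize actuation effort.
    It only reads base state and torques, so leg count doesn't matter."""
    tracking = math.exp(-sum((v - t) ** 2 for v, t in zip(vel, target)))
    effort = w_energy * sum(u * u for u in torques)
    return tracking - effort

# The same function scores a biped (6 actuators) and a hexapod (18):
r_biped = locomotion_reward([0.4, 0.0], [0.5, 0.0], [0.0] * 6)
r_hexapod = locomotion_reward([0.4, 0.0], [0.5, 0.0], [0.0] * 18)
```

Because the reward never indexes individual joints, the same scalar signal can drive the shared off-policy multi-task learner across every platform.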

[ DeepMind ]

Thanks Dave!

Even though it seems like the real risk of COVID is catching it from another person, robotics companies are doing what they can with UVC disinfecting systems.

[ BlueBotics ]

Aeditive develops robotic 3D-printing solutions for the production of concrete components. At the heart of its production plant are two large robots that cooperate to manufacture each component. The automation technology they build on is a robotic shotcrete process: concrete is applied layer by layer to manufacture complete components. This means that customers no longer depend on formwork, which is expensive and time-consuming to create. Instead, they can manufacture components directly on a steel pallet without these moulds.

[ Aeditive ]

Something BIG is coming next month from Robotiq!

My guess: an elephant.

[ Robotiq ]

TurtleBot3 is a great little home robot, as long as you have a TurtleBot3-sized home.

[ Robotis ]

How do you calculate the coordinated movements of two robot arms so they can accurately guide a highly flexible tool? ETH researchers have integrated all aspects of the optimisation calculations into an algorithm. The hot-wire cutter will be used, among other things, to develop building blocks for a mortar-free structure.

[ ETH Zurich ]

And now, this.

[ RobotStart ]

Posted in Human Robots

#437689 GITAI Sending Autonomous Robot to Space ...

We’ve been keeping a close watch on GITAI since early last year—what caught our interest initially is the history of the company, which includes a bunch of folks who started in the JSK Lab at the University of Tokyo, won the DARPA Robotics Challenge Trials as SCHAFT, got swallowed by Google, narrowly avoided being swallowed by SoftBank, and are now designing robots that can work in space.

The GITAI YouTube channel has kept us more or less up to date on their progress so far, and GITAI has recently announced the next step in this effort: the deployment of one of their robots on board the International Space Station in 2021.

Photo: GITAI

GITAI’s S1 is a task-specific 8-degrees-of-freedom arm with an integrated sensing and computing system and 1-meter reach.

GITAI has been working on a variety of robots for space operations, the most sophisticated of which is a humanoid torso called G1, which is controlled through an immersive telepresence system. What will be launching into space next year is a more task-specific system called the S1, which is an 8-degrees-of-freedom arm with an integrated sensing and computing system that can be wall-mounted and has a 1-meter reach.

The S1 will be living on board a commercially funded, pressurized airlock-extension module called Bishop, developed by NanoRacks. Mounted on the inside of the Bishop module, the S1 will have access to a task board and a small assembly area, where it will demonstrate common crew intra-vehicular activity, or IVA—tasks like flipping switches, turning knobs, and managing cables. It’ll also do some in-space assembly, or ISA, attaching panels to create a solar array.

Here’s a demonstration of some task board activities, conducted on Earth in a mockup of Bishop:

GITAI says that “all operations conducted by the S1 GITAI robotic arm will be autonomous, followed by some teleoperations from Nanoracks’ in-house mission control.” This is interesting, because from what we’ve seen until now, GITAI has had a heavy emphasis on telepresence, with a human in the loop to get stuff done. As GITAI’s founder and CEO Sho Nakanose commented to us a year ago, “Telepresence robots have far better performance and can be made practical much quicker than autonomous robots, so first we are working on making telepresence robots practical.”

So what’s changed? “GITAI has been concentrating on teleoperations to demonstrate the dexterity of our robot, but now it’s time to show our capabilities to do the same this time with autonomy,” Nakanose told us last week. “In an environment with minimum communication latency, it would be preferable to operate a robot more with teleoperations to enhance the capability of the robot, since with the current technology level of AI, what a robot can do autonomously is very limited. However, in an environment where the latency becomes noticeable, it would become more efficient to have a mixture of autonomy and teleoperations depending on the application. Eventually, in an ideal world, a robot will operate almost fully autonomously with minimum human cognizance.”

“In an environment where the latency becomes noticeable, it would become more efficient to have a mixture of autonomy and teleoperations depending on the application. Eventually, in an ideal world, a robot will operate almost fully autonomously with minimum human cognizance.”
—Sho Nakanose, GITAI founder and CEO
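Nakanose's reasoning above amounts to a latency-dependent choice of control mode, which can be sketched as a toy decision rule (the thresholds here are invented for illustration, not GITAI's actual numbers):

```python
def control_mode(latency_s, negligible_s=0.1, severe_s=2.0):
    """Toy latency-based mode selection (thresholds are illustrative):
    negligible latency favors teleoperation, noticeable latency favors a
    mix, and extreme latency forces autonomy."""
    if latency_s < negligible_s:
        return "teleoperation"  # operator dexterity dominates
    if latency_s < severe_s:
        return "mixed"          # autonomy for motions, human for decisions
    return "autonomous"         # e.g. deep space: round trips too long

# A ground-to-ISS link has modest latency; a Mars link (minutes) does not.
```

The sketch makes the trade explicit: as round-trip delay grows, the share of work delegated to onboard autonomy has to grow with it.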

Nakanose says that this mission will help GITAI to “acquire the skills, know-how, and experience necessary to prepare a robot to be ISS compatible, prov[ing] the maturity of our technology in the microgravity environment.” Success would mean conducting both IVA and ISA experiments as planned (autonomous and teleop for IVA, fully autonomous for ISA), which would be pretty awesome, but we’re told that GITAI has already received a research and development order for space robots from a private space company, and Nakanose expects that “by the mid-2020s, we will be able to show GITAI's robots working in space on an actual mission.”

NanoRacks is scheduled to launch the Bishop module on SpaceX CRS-21 in November. The S1 will be launched separately in 2021, and a NASA astronaut will install the robot and then leave it alone to let it start demonstrating how work in space can be made both safer and cheaper once the humans have gotten out of the way.


#437687 Video Friday: Bittle Is a Palm-Sized ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online]
IROS 2020 – October 25-29, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Rongzhong Li, who is responsible for the adorable robotic cat Nybble, has an updated and even more adorable quadruped that's more robust and agile but only costs around US $200 in kit form on Kickstarter.

Looks like the early bird options are sold out, but a full kit is a $225 pledge, for delivery in December.

[ Kickstarter ]

Thanks Rz!

I still maintain that Stickybot was one of the most elegantly designed robots ever.

[ Stanford ]

With the unpredictable health crisis of COVID-19 continuing to place high demands on hospitals, PAL Robotics have successfully completed testing of their delivery robots in Barcelona hospitals this summer. The TIAGo Delivery and TIAGo Conveyor robots were deployed in Hospital Municipal of Badalona and Hospital Clínic Barcelona following a winning proposal submitted to the European DIH-Hero project. Accerion sensors were integrated onto the TIAGo Delivery Robot and TIAGo Conveyor Robot for use in this project.

[ PAL Robotics ]

Energy Robotics, a leading developer of software solutions for mobile robots used in industrial applications, announced that its remote sensing and inspection solution for Boston Dynamics’s agile mobile robot Spot was successfully deployed at Merck’s thermal exhaust treatment plant at its headquarters in Darmstadt, Germany. Energy Robotics equipped Spot with sensor technology and remote supervision functions to support the inspection mission.

Combining Boston Dynamics’ intuitive controls, robotic intelligence and open interface with Energy Robotics’ control and autonomy software, user interface and encrypted cloud connection, Spot can be taught to autonomously perform a specific inspection round while being supervised remotely from anywhere with internet connectivity. Multiple cameras and industrial sensors enable the robot to find its way around while recording and transmitting information about the facility’s onsite equipment operations.

Spot reads the displays of gauges in its immediate vicinity and can also zoom in on distant objects using an externally-mounted optical zoom lens. In the thermal exhaust treatment facility, for instance, it monitors cooling water levels and notes whether condensation water has accumulated. Outside the facility, Spot monitors pipe bridges for anomalies.

Among the robot’s many abilities, it can detect defects in wires or measure the temperature of pump components using thermal imaging. The robot was put through its paces on a comprehensive course that tested its ability to handle special challenges such as climbing stairs, scaling embankments, and walking over grating.

[ Energy Robotics ]

Thanks Stefan!

Boston Dynamics really should give Dr. Guero an Atlas just to see what he can do with it.

[ DrGuero ]

World's First Socially Distanced Birthday Party: In London, a robotic arm was piloted in real time to light the candles on a birthday cake by Extend Robotics founder Chang Liu, who was sitting 50 miles away in Reading. Team members in Manchester and Reading were also able to join the celebration as the robot accurately lit the candles.

[ Extend Robotics ]

The Robocon in-person competition was canceled this year, but check out Tokyo University's robots in action:

[ Robocon ]

Sphero has managed to pack an entire Sphero into a much smaller sphere.

[ Sphero ]

Squishy Robotics, a small business funded by the National Science Foundation (NSF), is developing mobile sensor robots for use in disaster rescue, remote monitoring, and space exploration. The shape-shifting mobile sensor robots from UC Berkeley spin-off Squishy Robotics can be dropped from airplanes or drones and can provide first responders with ground-based situational awareness during fires, hazardous materials (HazMat) releases, and natural and man-made disasters.

[ Squishy Robotics ]

Meet Jasper, the small girl with big dreams to FLY. Created by the UTS Animal Logic Academy in partnership with the Royal Australian Air Force to encourage girls to soar above the clouds, Jasper was made using a hybrid of traditional animation techniques and technology such as robotics and 3D printing. A KUKA QUANTEC robot was used during filmmaking to help the Royal Australian Air Force tell its story in a unique way: UTS adapted its high-accuracy robot to film consistent camera paths, creating a video with physical sets and digital characters.

[ AU AF ]

Impressive what the Ghost Robotics V60 can do without any vision sensors on it.

[ Ghost Robotics ]

Is your job moving tiny amounts of liquid around? Would you rather be doing something else? ABB’s YuMi got you.

[ Yumi ]

For his PhD work at the Media Lab, biomechatronics researcher Roman Stolyarov developed a terrain-adaptive control system for robotic leg prostheses, as a way to help people with amputations feel as able-bodied and mobile as possible by allowing them to walk seamlessly regardless of the ground terrain.

[ MIT ]

This robot collects data on each cow when she enters to be milked. Milk samples and 3D photos can be taken to monitor the cow’s health status. The Ontario Dairy Research Centre in Elora, Ontario, is leading dairy innovation through education and collaboration. It is a state-of-the-art 175,000 square foot facility for discovery, learning and outreach. This centre is a partnership between the Agricultural Research Institute of Ontario, OMAFRA, the University of Guelph and the Ontario dairy industry.

[ University of Guelph ]

Australia has one of these now, should the rest of us panic?

[ Boeing ]

Daimler and Torc are developing Level 4 automated trucks for the real world. Here is a glimpse into our closed-course testing, routes on public highways in Virginia, and self-driving capabilities development. Our year of collaborating on the future of transportation culminated in the announcement of our new truck testing center in New Mexico.

[ Torc Robotics ]


#437671 Video Friday: Researchers 3D Print ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online]
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The Giant Gundam in Yokohama is actually way cooler than I thought it was going to be.

[ Gundam Factory ] via [ YouTube ]

A new 3D-printing method will make it easier to manufacture and control the shape of soft robots, artificial muscles and wearable devices. Researchers at UC San Diego show that by controlling the printing temperature of liquid crystal elastomer, or LCE, they can control the material’s degree of stiffness and ability to contract—also known as degree of actuation. What’s more, they are able to change the stiffness of different areas in the same material by exposing it to heat.

[ UCSD ]

Thanks Ioana!

This is the first successful reactive stepping test on our new torque-controlled biped robot named Bolt. The robot has 3 active degrees of freedom per leg and one passive joint in the ankle. Since there is no active joint in the ankle, the robot relies only on step location and timing adaptation to stabilize its motion. Not only can the robot perform stepping without active ankles, it is also capable of rejecting external disturbances, as we show in this video.

[ ODRI ]

The curling robot “Curly” is the first AI-based robot to demonstrate competitive curling skills in a real icy environment with high uncertainties. Scientists from seven Korean research institutions, including Prof. Klaus-Robert Müller, head of the machine-learning group at TU Berlin and guest professor at Korea University, have developed the AI-based curling robot.

[ TU Berlin ]

MoonRanger, a small robotic rover being developed by Carnegie Mellon University and its spinoff Astrobotic, has completed its preliminary design review in preparation for a 2022 mission to search for signs of water at the moon’s south pole. Red Whittaker explains why the new MoonRanger Lunar Explorer design is innovative and different from prior planetary rovers.

[ CMU ]

Cobalt’s security robot can now navigate unmodified elevators, which is an impressive feat.

Also, EXTERMINATE!

[ Cobalt ]

OrionStar, the robotics company invested in by Cheetah Mobile, announced the Robotic Coffee Master. Incorporating 3,000 hours of AI learning, 30,000 hours of robotic arm testing and machine vision training, the Robotic Coffee Master can perform complex brewing techniques, such as curves and spirals, with millimeter-level stability and accuracy (reset error ≤ 0.1mm).

[ Cheetah Mobile ]

DARPA OFFensive Swarm-Enabled Tactics (OFFSET) researchers recently tested swarms of autonomous air and ground vehicles at the Leschi Town Combined Arms Collective Training Facility (CACTF), located at Joint Base Lewis-McChord (JBLM) in Washington. The Leschi Town field experiment is the fourth of six planned experiments for the OFFSET program, which seeks to develop large-scale teams of collaborative autonomous systems capable of supporting ground forces operating in urban environments.

[ DARPA ]

Here are some highlights from Team Explorer’s SubT Urban competition back in February.

[ Team Explorer ]

Researchers with the Skoltech Intelligent Space Robotics Laboratory have developed a system that allows easy interaction with a micro-quadcopter with LEDs that can be used for light-painting. The researchers used a 92x92x29 mm Crazyflie 2.0 quadrotor that weighs just 27 grams, equipped with a light reflector and an array of controllable RGB LEDs. The control system consists of a glove equipped with an inertial measurement unit (IMU; an electronic device that tracks the movement of a user’s hand), and a base station that runs a machine learning algorithm.
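As a rough illustration of how a glove-mounted IMU could steer a quadrotor, hand tilt might map to a planar velocity setpoint like this. This is a hypothetical mapping invented for the sketch; the actual Skoltech system runs a learned gesture model at the base station, which is omitted here:

```python
import math

def glove_to_velocity(roll_rad, pitch_rad, gain=0.5, deadband_rad=0.1):
    """Hypothetical mapping from hand tilt (IMU roll/pitch) to a planar
    velocity setpoint (vx, vy) for the quadrotor."""
    def shaped(angle):
        # A deadband rejects IMU jitter when the hand is roughly level.
        if abs(angle) < deadband_rad:
            return 0.0
        return gain * (angle - math.copysign(deadband_rad, angle))
    return shaped(pitch_rad), shaped(-roll_rad)  # pitch -> vx, roll -> vy

# Tilting the hand forward 0.3 rad commands a gentle forward velocity.
vx, vy = glove_to_velocity(roll_rad=0.0, pitch_rad=0.3)
```

The deadband-plus-gain shaping is a common trick in gesture interfaces: it keeps the drone stationary for small, unintentional hand motions while remaining proportional for deliberate ones.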

[ Skoltech ]

“DeKonBot” is the prototype of a cleaning and disinfection robot for potentially contaminated surfaces in buildings such as door handles, light switches or elevator buttons. While other cleaning robots often spray the cleaning agents over a large area, DeKonBot autonomously identifies the surface to be cleaned.

[ Fraunhofer IPA ]

On Oct. 20, the OSIRIS-REx mission will perform the first attempt of its Touch-And-Go (TAG) sample collection event. Not only will the spacecraft navigate to the surface using innovative navigation techniques, but it could also collect the largest sample since the Apollo missions.

[ NASA ]

With all the robotics research that seems to happen in places where snow is more of an occasional novelty or annoyance, it’s good to see NORLAB taking things more seriously.

[ NORLAB ]

Telexistence’s Model-T robot works very slowly, but very safely, restocking shelves.

[ Telexistence ] via [ YouTube ]

Roboy 3.0 will be unveiled next month!

[ Roboy ]

KUKA ready2_educate is your training cell for hands-on education in robotics. It is especially aimed at schools, universities and company training facilities. The training cell is a complete starter package and your perfect partner for entry into robotics.

[ KUKA ]

A UPenn GRASP Lab Special Seminar on Data Driven Perception for Autonomy, presented by Dapo Afolabi from UC Berkeley.

Perception systems form a crucial part of autonomous and artificial intelligence systems since they convert data about the relationship between an autonomous system and its environment into meaningful information. Perception systems can be difficult to build since they may involve modeling complex physical systems or other autonomous agents. In such scenarios, data driven models may be used to augment physics based models for perception. In this talk, I will present work making use of data driven models for perception tasks, highlighting the benefit of such approaches for autonomous systems.

[ GRASP Lab ]

A Maryland Robotics Center Special Robotics Seminar on Underwater Autonomy, presented by Ioannis Rekleitis from the University of South Carolina.

This talk presents an overview of algorithmic problems related to marine robotics, with a particular focus on increasing the autonomy of robotic systems in challenging environments. I will talk about vision-based state estimation and mapping of underwater caves. An application of monitoring coral reefs is going to be discussed. I will also talk about several vehicles used at the University of South Carolina such as drifters, underwater, and surface vehicles. In addition, a short overview of the current projects will be discussed. The work that I will present has a strong algorithmic flavour, while it is validated in real hardware. Experimental results from several testing campaigns will be presented.

[ MRC ]

This week’s CMU RI Seminar comes from Scott Niekum at UT Austin, on Scaling Probabilistically Safe Learning to Robotics.

Before learning robots can be deployed in the real world, it is critical that probabilistic guarantees can be made about the safety and performance of such systems. This talk focuses on new developments in three key areas for scaling safe learning to robotics: (1) a theory of safe imitation learning; (2) scalable reward inference in the absence of models; (3) efficient off-policy policy evaluation. The proposed algorithms offer a blend of safety and practicality, making a significant step towards safe robot learning with modest amounts of real-world data.

[ CMU RI ]


#437667 17 Teams to Take Part in DARPA’s ...

Among all of the other in-person events that have been totally wrecked by COVID-19 is the Cave Circuit of the DARPA Subterranean Challenge. DARPA has already hosted the in-person events for the Tunnel and Urban SubT circuits (see our previous coverage here), and the plan had always been for a trio of events representing three uniquely different underground environments in advance of the SubT Finals, which will somehow combine everything into one bonkers course.

While the SubT Urban Circuit event snuck in just under the lockdown wire in late February, DARPA made the difficult (but prudent) decision to cancel the in-person Cave Circuit event. What this means is that there will be no Systems Track Cave competition, which is a serious disappointment—we were very much looking forward to watching teams of robots navigating through an entirely unpredictable natural environment with a lot of verticality. Fortunately, DARPA is still running a Virtual Cave Circuit, and 17 teams will be taking part in this competition featuring a simulated cave environment that’s as dynamic and detailed as DARPA can make it.

From DARPA’s press releases:

DARPA’s Subterranean (SubT) Challenge will host its Cave Circuit Virtual Competition, which focuses on innovative solutions to map, navigate, and search complex, simulated cave environments, on November 17. Qualified teams have until Oct. 15 to develop and submit software-based solutions for the Cave Circuit via the SubT Virtual Portal, where their technologies will face unknown cave environments in the cloud-based SubT Simulator. Until then, teams can refine their roster of selected virtual robot models, choose sensor payloads, and continue to test autonomy approaches to maximize their score.

The Cave Circuit also introduces new simulation capabilities, including digital twins of Systems Competition robots to choose from, marsupial-style platforms combining air and ground robots, and breadcrumb nodes that can be dropped by robots to serve as communications relays. Each robot configuration has an associated cost, measured in SubT Credits – an in-simulation currency – based on performance characteristics such as speed, mobility, sensing, and battery life.
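The cost mechanic described above can be sketched as a simple budget check. The catalog names and credit values below are invented for illustration; the real platforms and costs are defined in the SubT Tech Repo:

```python
# Hypothetical platform catalog; actual SubT Credit costs come from
# the SubT Tech Repo, not from these made-up numbers.
CATALOG = {
    "ground_ugv": {"credits": 270, "type": "ground"},
    "fast_ugv":   {"credits": 310, "type": "ground"},
    "small_uav":  {"credits": 170, "type": "air"},
}

def roster_cost(roster):
    """Total SubT Credits spent on a team's robot configuration."""
    return sum(CATALOG[name]["credits"] for name in roster)

def within_budget(roster, budget=1000):
    """The budget value here is illustrative, not the competition's cap."""
    return roster_cost(roster) <= budget

team = ["ground_ugv", "ground_ugv", "small_uav"]  # 710 credits total
```

The point of the mechanic is exactly this trade-off: a faster or better-sensed platform consumes more of the fixed credit pool, so teams must balance capability against roster size.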

Each team’s simulated robots must navigate realistic caves, with features including natural terrain and dynamic rock falls, while they search for and locate various artifacts on the course within five meters of accuracy to score points during a 60-minute timed run. A correct report is worth one point. Each course contains 20 artifacts, which means each team has the potential for a maximum score of 20 points. Teams can leverage numerous practice worlds and even build their own worlds using the cave tiles found in the SubT Tech Repo to perfect their approach before they submit one official solution for scoring. The DARPA team will then evaluate the solution on a set of hidden competition scenarios.
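The scoring rules above (one point per correct report, 5-meter accuracy, 20 artifacts, 60-minute run) can be captured in a simplified scorer. This sketch omits details the real competition enforces, such as artifact type matching and report submission limits:

```python
import math

MAX_POINTS = 20        # 20 artifacts per course, one point each
ACCURACY_M = 5.0       # a report must fall within 5 m of the artifact
RUN_SECONDS = 60 * 60  # one 60-minute timed run

def score_run(reports, artifacts):
    """Simplified SubT scorer: a report (x, y, z, t) submitted within the
    run window that lands within 5 m of an unscored artifact earns one
    point; each artifact can score at most once."""
    scored, points = set(), 0
    for (x, y, z, t) in reports:
        if t > RUN_SECONDS:
            continue  # reports after the 60-minute mark don't count
        for i, a in enumerate(artifacts):
            if i not in scored and math.dist((x, y, z), a) <= ACCURACY_M:
                scored.add(i)
                points += 1
                break
    return min(points, MAX_POINTS)
```

For example, a report 5 m from an artifact scores, a report 10 m away does not, and a report filed after the hour is ignored entirely.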

Of the 17 qualified teams (you can see all of them here), there are a handful that we’ll quickly point out. Team BARCS, from Michigan Tech, was the winner of the SubT Virtual Urban Circuit, meaning that they may be the team to beat on Cave as well, although the course is likely to be unique enough that things will get interesting. Some Systems Track teams to watch include Coordinated Robotics, CTU-CRAS-NORLAB, MARBLE, NUS SEDS, and Robotika, and there are also a handful of brand new teams as well.

Now, just because there’s no dedicated Cave Circuit for the Systems Track teams, it doesn’t mean that there won’t be a Cave component (perhaps even a significant one) in the final event, which as far as we know is still scheduled to happen in fall of next year. We’ve heard that many of the Systems Track teams have been testing out their robots in caves anyway, and as the virtual event gets closer, we’ll be doing a sort of Virtual Systems Track series that highlights how different teams are doing mock Cave Circuits in caves they’ve found for themselves.

For more, we checked in with DARPA SubT program manager Dr. Timothy H. Chung.

IEEE Spectrum: Was it a difficult decision to cancel the Systems Track for Cave?

Tim Chung: The decision to go virtual only was heart wrenching, because I think DARPA’s role is to offer up opportunities that may be unimaginable for some of our competitors, like opening up a cave-type site for this competition. We crawled and climbed through a number of these sites, and I share the sense of disappointment that both our team and the competitors have that we won’t be able to share all the advances that have been made since the Urban Circuit. But what we’ve been able to do is pour a lot of our energy and the insights that we got from crawling around in those caves into what’s going to be a really great opportunity on the Virtual Competition side. And whether it’s a global pandemic, or just lack of access to physical sites like caves, virtual environments are an opportunity that we want to develop.

“The simulator offers us a chance to look at where things could be … it really allows for us to find where some of those limits are in the technology based only on our imagination.”
—Timothy H. Chung, DARPA

What kind of new features will be included in the Virtual Cave Circuit for this competition?

I’m really excited about these particular features because we’re seeing an opportunity for increased synergy between the physical and the virtual. The first I’d say is that we scanned some of the Systems Track robots using photogrammetry and combined that with some additional models that we got from the systems competitors themselves to turn their systems robots into virtual models. We often talk about the sim to real transfer and how successful we can get a simulation to transfer over to the physical world, but now we’ve taken something from the physical world and made it virtual. We’ve validated the controllers as well as the kinematics of the robots, we’ve iterated with the systems competitors themselves, and now we have these 13 robots (air and ground) in the SubT Tech Repo that now all virtual competitors can take advantage of.

We also have additional robot capability. Those comms breadcrumbs are common among many of the competitors, so we’ve adopted that in the virtual world, and now you have comms relay nodes that are baked into the SubT Simulator—you can have either six or twelve comms nodes that you can drop from a variety of our ground robot platforms. We have the marsupial deployment capability now, so now we have parent ground robots that can be mixed and matched with different child drones to become marsupial pairs.

And this is something I’ve been planning for a while: we now have the ability to trigger things like rock falls. They still don’t quite look like Indiana Jones with the boulder coming down the corridor, but this comes really close. In addition to being an interesting and realistic consideration, we get to dynamically test and stress the robots’ ability to navigate, recognize that something has changed in the environment, and respond to it.

Image: DARPA

DARPA is still running a Virtual Cave Circuit, and 17 teams will be taking part in this competition featuring a simulated cave environment.

No simulation is perfect, so can you talk to us about what kinds of things aren’t being simulated right now? Where does the simulator not match up to reality?

I think that question is foundational to any conversation about simulation. I’ll give you a couple of examples:

We have the ability to represent wholesale damage to a robot, but it’s not at the actuator or component level. So there’s not a reliability model, although I think that would be really interesting to incorporate so that you could do assessments on things like mean time to failure. But if a robot falls off a ledge, it can be disabled by virtue of being too damaged to continue.

Communications is one that’s near and dear not only to my heart but also to everyone who has lived through developing communication systems and robotic systems. We’ve conducted RF surveys of underground environments to get a better handle on what the propagation effects are. A lot of research has gone into this, and to carry through some of that realism, we do have path loss models for RF communications baked into the SubT Simulator. For example, when you drop a breadcrumb node, it uses a path loss model to represent the degradation of signal as you go farther into a cave. Now, we’re not modeling it at the level of Maxwell’s equations, which I think would be awesome, but we’re not quite there yet.
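Path loss models of the kind described here are commonly approximated with a log-distance formula. Below is a minimal sketch of that idea; all parameter values (reference loss, path loss exponent, shadowing term) are illustrative assumptions, not the actual values used in the SubT Simulator.

```python
import math
import random

def path_loss_db(distance_m, pl0_db=40.0, ref_m=1.0,
                 exponent=2.5, shadowing_sigma_db=0.0):
    """Log-distance path loss: PL(d) = PL(d0) + 10*n*log10(d/d0) [+ shadowing].
    A path loss exponent n > 2 loosely approximates lossy tunnel/cave
    propagation; values here are illustrative only."""
    loss = pl0_db + 10.0 * exponent * math.log10(distance_m / ref_m)
    if shadowing_sigma_db > 0:
        # Optional log-normal shadowing to model local obstructions.
        loss += random.gauss(0.0, shadowing_sigma_db)
    return loss

def received_power_dbm(tx_power_dbm, distance_m, **kwargs):
    """Received power = transmit power minus path loss (all in dB units)."""
    return tx_power_dbm - path_loss_db(distance_m, **kwargs)

# A 20 dBm transmitter heard 100 m down a tunnel, deterministic model:
rx = received_power_dbm(20.0, 100.0)
```

A simulator would compare `rx` against a receiver sensitivity threshold to decide whether a dropped breadcrumb node is still in contact, with signal degrading as robots move deeper into the cave.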

We do have things like battery depletion, sensor degradation to the extent that simulators can degrade sensor inputs, and things like that. It’s just amazing how close we can get in some places, and how far away we still are in others, and I think showing the limits of how far you can take simulation is part and parcel of why the SubT Challenge wants to have both Systems and Virtual tracks. Simulation can be an accelerant, but it’s not going to be the panacea for development and innovation, and I think all the competitors are cognizant of those limitations.

One of the most amazing things about the SubT Virtual Track is that all of the robots operate fully autonomously, without the human(s) in the loop that the System Track teams have when they compete. Why make the Virtual Track even more challenging in that way?

I think it’s one of the defining, delineating attributes of the Virtual Track. Our continued vision for the simulation side is that the simulator offers us a chance to look at where things could be, and allows for us to explore things like larger scales, or increased complexity, or types of environments that we can’t physically gain access to—it really allows for us to find where some of those limits are in the technology based only on our imagination, and this is one of the intrinsic values of simulation.

But finding a way to incorporate human input, or more generally human factors like teleoperation interfaces and the in-situ stress that you might not be able to recreate in a virtual competition, provided a good reason for us to delineate the two competitions. The Virtual Competition is really about fully autonomous, self-sufficient systems going off and executing their solution without human guidance, while also acknowledging that the real world has conditions that would not necessarily be represented by a fully simulated version. Having said that, I think cognitive engineering still has an incredibly important role to play in human-robot interaction.

What do we have to look forward to during the Virtual Competition Showcase?

We have a number of additional features and capabilities that we’ve baked into the simulator that will allow for us to derive some additional insights into our competition runs. Those insights might involve things like the performance of one or more robots in a given scenario, or the impact of the environment on different types of robots, and what I can tease is that this will be an opportunity for us to showcase both the technology and also the excitement of the robots competing in the virtual environment. I’m trying not to give too many spoilers, but we’ll have an opportunity to really get into the details.

Check back as we get closer to the 17 November event for more on the DARPA SubT Challenge.
