Tag Archives: virtual
#439451 12 Robotics Teams Will Hunt For ...
Last week, DARPA announced the twelve teams who will be competing in the Virtual Track of the DARPA Subterranean Challenge Finals, scheduled to take place in September in Louisville, KY. The robots and the environment may be virtual, but the prize money is very real, with $1.5 million of DARPA cash on the table for the teams who are able to find the most subterranean artifacts in the shortest amount of time.
You can check out the list of Virtual Track competitors here, but we’ll be paying particularly close attention to Team Coordinated Robotics and Team BARCS, who have been trading first and second place back and forth across the three previous competitions. But there are many other strong contenders, and since nearly a year will have passed between the Final and the previous Cave Circuit, there’s been plenty of time for all teams to have developed creative new ideas and improvements.
As a quick reminder, the SubT Final will include elements of tunnels, caves, and the urban underground. As before, teams will be using simulated models of real robots to explore the environment looking for artifacts (like injured survivors, cell phones, backpacks, and even hazardous gas), and they’ll have to manage things like austere navigation, degraded sensing and communication, dynamic obstacles, and rough terrain.
While we’re not sure exactly what the Virtual Track is going to look like, one of the exciting aspects of a virtual competition like this is how DARPA is not constrained by things like available physical space or funding. They could make a virtual course that incorporates the inside of the Egyptian pyramids, the Cheyenne Mountain military complex, and my basement, if they were so inclined. We are expecting a combination of the overall themes of the three previous virtual courses (tunnel, cave, and urban), but connected up somehow, and likely with a few surprises thrown in for good measure.
To some extent, the Virtual Track represents the best case scenario for SubT robots, in the sense that fewer things will just spontaneously go wrong. This is something of a compromise, since things very often spontaneously go wrong when you’re dealing with real robots in the real world. This is not to diminish the challenges of the Virtual Track in the least—even the virtual robots aren’t invincible, and their software will need to keep them from running into simulated walls or falling down simulated stairs. But as far as I know, the virtual robots will not experience damage during transport to the event, electronics shorting, motors burning out, emergency stop buttons being accidentally pressed, and that sort of thing. If anything, this makes the Virtual Track more exciting to watch, because you’re seeing teams of virtual robots on their absolute best behavior challenging each other primarily on the cleverness and efficiency of their programmers.
The other reason that the Virtual Track is more exciting is that unlike the Systems Track, there are no humans in the loop at all. Teams submit their software to DARPA, and then sit back and relax (or not) and watch their robots compete all by themselves in real time. This is a hugely ambitious way to do things, because a single human even a little bit in the loop can provide the kind of critical contextual world knowledge and mission perspective that robots often lack. A human in there somewhere is fine in the near to medium term, but full autonomy is the dream.
As for the Systems Track (which involves real robots on the physical course in Louisville), we’re not yet sure who all of the final competitors will be. The pandemic has made travel complicated, and some international teams aren’t yet sure whether they’ll be able to make it. Either way, we’ll be there at the end of September, when we’ll be able to watch both the Systems and Virtual Track teams compete for the SubT Final championship.
#439290 Making virtual assistants sound human ...
There's a scene in the 2008 film “Iron Man” that offers a glimpse of future interactions between human and artificial intelligence assistants. In it, Tony Stark's virtual assistant J.A.R.V.I.S. responds with sarcasm and humor to Stark's commands.
#439230 Using a virtual linkage representation ...
A team of researchers at Yale University has developed a new kind of algorithm to improve the functionality of a robot hand. In their paper published in the journal Science Robotics, the group describes their algorithm and then demonstrates, via videos, how it can be used.
#439220 Video Friday: Virtual Cat Petting
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICRA 2021 – May 30-June 5, 2021 – [Online Event]
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27-October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.
The 2021 Computer-Human Interaction conference (CHI) took place this week, and in amongst the stereo smelling and tooth control were some incredibly creative robotics projects, like this “HairTouch” system that uses robotic manipulation to achieve a variety of haptic sensations using hair.
We propose a pin-based handheld device, HairTouch, to provide stiffness differences, roughness differences, surface height differences and their combinations. HairTouch consists of two pins, one for each of the two finger segments close to the index fingertip. By controlling the brush hairs’ length to change their elasticity and the hairs’ bending direction to change the hair tip direction, each pin renders varying stiffness and roughness.
[ NTU ]
Thanks Fan!
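If you're wondering how a couple of pins covered in brush hairs can render both stiffness and roughness, here's a minimal Python sketch of the control idea as we read it from the abstract: retract the hairs for a stiffer feel, bend them toward the fingertip for a rougher one. Every name, range, and mapping below is our own illustration, not the HairTouch code.

```python
# Rough sketch (not the authors' code): map a target stiffness and roughness
# to a per-pin hair length and bend angle, following the description above.
# Calibration constants are made up for illustration.

def render_haptics(target_stiffness: float, target_roughness: float):
    """Return a (hair_length_mm, bend_angle_deg) command for one pin.

    Assumptions: shorter exposed hair feels stiffer, and bending the hairs
    so the finger contacts their tips feels rougher. Both inputs are
    normalized to [0, 1].
    """
    MAX_HAIR_LENGTH_MM = 20.0   # hypothetical travel of the length mechanism
    MAX_BEND_ANGLE_DEG = 90.0   # hypothetical range of the bending mechanism

    # Stiffer sensation -> retract the hairs (shorter exposed length).
    hair_length = (1.0 - target_stiffness) * MAX_HAIR_LENGTH_MM
    # Rougher sensation -> bend the hairs so the tips face the fingertip.
    bend_angle = target_roughness * MAX_BEND_ANGLE_DEG
    return hair_length, bend_angle


if __name__ == "__main__":
    # e.g. a fairly stiff, moderately rough virtual texture
    print(render_haptics(target_stiffness=0.8, target_roughness=0.4))
```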
Here's another cool thing from CHI: a “Pneumatic Raspberry Pi for Soft Robotics.”
FlowIO is a miniature, modular, pneumatic development platform with a software toolkit for control, actuation, and sensing of soft robots and programmable materials. Five pneumatic ports and multiple fully-integrated modules to satisfy various pressure, flow, and size requirements make FlowIO suitable for most wearable and non-wearable pneumatic applications in HCI and soft robotics.
[ FlowIO ]
Thanks Fan!
NASA’s Ingenuity Mars Helicopter completed its fifth flight with a one-way journey from Wright Brothers Field to a new airfield 423 feet (129 meters) to the south on May 7, 2021.
NASA has 3D-ified Ingenuity's third flight, so dig up your 3D glasses and check it out:
Also, audio!
[ NASA ]
Until we can find a good way of training cats, we’ll have to make do with robots if we want to study their neuromuscular dynamics.
Toyoaki Tanikawa and his supervisors, Assistant Professor Yoichi Masuda and Professor Masato Ishikawa, developed a four-legged robot that enables the motor control of animals to be reproduced using computers. This quadruped robot, which comprises highly back-drivable legs to reproduce the flexibility of animals and torque-controllable motors, can reproduce the muscle characteristics of animals. Thus, various experiments can be conducted using this robot instead of the animals themselves.
[ Osaka University ]
Thanks Yoichi!
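For the curious, the usual trick for getting torque-controllable, back-drivable joints to imitate muscle is impedance control: command each motor to behave like a virtual spring and damper whose parameters stand in for muscle elasticity and viscosity. Here's a minimal Python sketch of that general idea, with invented numbers; it illustrates the technique, not the Osaka team's actual controller.

```python
# Minimal sketch (our own, not the Osaka group's code) of a muscle-like
# joint: a torque-controllable motor driven by a virtual spring-damper.

def muscle_like_torque(q, dq, q_rest, stiffness, damping):
    """Joint torque [Nm] from a virtual muscle model.

    q, dq     : measured joint angle [rad] and velocity [rad/s]
    q_rest    : virtual rest angle of the 'muscle' [rad]
    stiffness : virtual elasticity [Nm/rad]
    damping   : virtual viscosity [Nm*s/rad]
    """
    return -stiffness * (q - q_rest) - damping * dq


# Example control-loop step with made-up sensor values:
tau = muscle_like_torque(q=0.3, dq=-0.5, q_rest=0.0, stiffness=8.0, damping=0.4)
print(f"commanded torque: {tau:.2f} Nm")
```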
Turner Topping is a PhD student and researcher with Kod*lab, a legged robotics group within the GRASP Lab at Penn Engineering. This video profile offers insight into Turner’s participation in the academic research environment and how he overcomes uncertainties and obstacles.
[ Kod*Lab ]
A team led by Assistant Professor Benjamin Tee from the National University of Singapore has developed a smart material known as AiFoam that could give machines a human-like sense of touch, enabling them to better judge human intentions and respond to changes in the environment.
[ NUS ]
Boston University mechanical engineers have developed a unique way to use an ancient Japanese art form for a very 21st-century purpose. In a paper published this week in Science Robotics, Douglas Holmes and BU PhD student Yi Yang demonstrate how they were inspired by kirigami, the traditional Japanese art of paper cutting (a cousin of the origami paper-folding art), to design soft robotic grippers.
[ BU ]
Turns out, if you give robots voices and names and googly eyes and blogs (?), people will try to anthropomorphize them. Go figure!
[ NTNU ]
Domestic garbage management is an important aspect of a sustainable environment. This paper presents a novel garbage classification and localization system for grasping and placement in the correct recycling bin, integrated on a mobile manipulator. In particular, we first introduce and train a deep neural network (namely, GarbageNet) to detect different recyclable types of garbage in the wild. Secondly, we use a grasp localization method to identify the grasp poses of garbage that need to be collected from the ground. Finally, we perform grasping and sorting of the objects by the mobile robot through a whole-body control framework.
[ UCL ]
Thanks Dimitrios!
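The abstract lays out a tidy three-stage pipeline (detect, localize a grasp, execute with whole-body control), and it's worth seeing how little glue such an architecture needs. Below is a hypothetical Python sketch of the data flow; the class and method names are ours, not the paper's.

```python
# Hypothetical sketch of the three-stage pipeline described above: detect
# recyclables, localize a grasp pose, then hand both off to the mobile
# manipulator's whole-body controller. All names here are invented for
# illustration; they are not from the paper's code.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str             # e.g. "plastic_bottle"
    bounding_box: tuple    # (x, y, w, h) in image pixels


def sort_garbage(rgb_image, depth_image, detector, grasp_planner, robot):
    """One pick-and-sort cycle: detect -> localize grasp -> grasp -> bin."""
    detections: list[Detection] = detector.detect(rgb_image)   # stage 1: GarbageNet-style detection
    for det in detections:
        grasp_pose = grasp_planner.localize(det, depth_image)  # stage 2: grasp localization
        if grasp_pose is None:                                 # nothing graspable for this detection
            continue
        robot.whole_body_grasp(grasp_pose)                     # stage 3: whole-body grasp execution
        robot.place_in_bin(det.label)                          # drop into the matching recycling bin
```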
I am 100% here for telepresence robots with emotive antennas.
[ Pollen Robotics ]
We propose a novel robotic system that can improve its semantic perception during deployment. Our system tightly couples multi-sensor perception and localisation to continuously learn from self-supervised pseudo labels.
[ ASL ]
Vandi Verma is one of the people driving the Mars Perseverance rover, and CMU would like to remind you that she graduated from CMU.
[ CMU ]
Pepper is here to offer a “phygital” experience to shoppers.
I had to look up “phygital,” and it's a combination of physical and digital that is used exclusively in marketing, as far as I can tell, so let us never speak of it again.
[ CMU ]
Researchers conduct early mobility testing on an engineering model of NASA’s Volatiles Investigating Polar Exploration Rover, or VIPER, and fine-tune a newly installed OptiTrack motion tracking camera system at NASA Glenn’s Simulated Lunar Operations Lab.
[ NASA ]
Mmm, sorting is satisfying to watch.
[ Dorabot ]
iRobot seems to be hiring, although you’ll have to brave a pupper infestation.
Clean floors, though!
[ iRobot ]
Shadow Robot's bimanual teleoperation system is now commercially available for a price you almost certainly cannot afford!
Converge Robotics Group offers a haptic option, too.
[ Shadow ]