
#437697 These Underwater Drones Use Water ...

Yi Chao likes to describe himself as an “armchair oceanographer” because he got incredibly seasick the one time he spent a week aboard a ship. So it’s maybe not surprising that the former NASA scientist has a vision for promoting remote study of the ocean on a grand scale by enabling underwater drones to recharge on the go using his company’s energy-harvesting technology.

Many of the robotic gliders and floating sensor stations currently monitoring the world’s oceans are effectively treated as disposable devices, because the research community has only limited ship time and funding for retrieving drones once they’ve accomplished their mission of beaming data back home. That’s not only a waste of money, but may also contribute to a growing assortment of abandoned lithium-ion batteries polluting the ocean with their leaking toxic materials—a decidedly unsustainable approach to studying the secrets of the underwater world.

“Our goal is to deploy our energy harvesting system to use renewable energy to power those robots,” says Chao, president and CEO of the startup Seatrec. “We’re going to save one battery at a time, so hopefully we’re not going to dispose of more toxic batteries in the ocean.”

Chao’s California-based startup claims that its SL1 Thermal Energy Harvesting System can already cut the cost of collecting oceanographic data with robotic probes by an order of magnitude. The startup is working on adapting its system to work with autonomous underwater gliders. And it has partnered with defense giant Northrop Grumman to develop an underwater recharging station for oceangoing drones that incorporates Northrop Grumman’s self-insulating electrical connector, which can operate even while its powered electrical contacts are submerged.

Seatrec’s energy-harvesting system works by taking advantage of how certain substances transition from solid-to-liquid phase and liquid-to-gas phase when they heat up. The company’s technology harnesses the pressure changes that result from such phase changes in order to generate electricity.


To make the phase changes happen, Seatrec’s solution taps the temperature differences between warmer water at the ocean surface and colder water at the ocean depths. Even a relatively simple robotic probe can generate additional electricity by changing its buoyancy to either float at the surface or sink down into the colder depths.
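
To get a feel for the scale of energy involved, here is a rough back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption rather than a Seatrec specification: a couple of liters of phase-change material, roughly 10 percent expansion on melting, and a hydraulic conversion stage with 50 percent losses.

```python
# Back-of-the-envelope estimate of the energy one dive cycle of a
# phase-change harvester might yield. All numbers are illustrative
# assumptions, not Seatrec specifications.

pcm_volume_l = 2.0           # liters of phase-change material (assumed)
expansion_fraction = 0.10    # ~10% volume increase on melting (typical of paraffins)
working_pressure_pa = 20e6   # hydraulic pressure developed, ~200 bar (assumed)
conversion_efficiency = 0.5  # hydraulic motor + generator losses (assumed)

# Mechanical work available: pressure times the volume displaced by melting.
delta_v_m3 = pcm_volume_l * 1e-3 * expansion_fraction
energy_j = working_pressure_pa * delta_v_m3 * conversion_efficiency

print(f"Energy per dive cycle: {energy_j / 1e3:.1f} kJ ({energy_j / 3600:.2f} Wh)")
```

A couple of kilojoules per cycle sounds tiny, but it is roughly the same order of magnitude as what a profiling float spends on a dive, which is why this kind of harvesting can plausibly keep one running more or less indefinitely.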

By attaching an external energy-harvesting module, Seatrec has already begun transforming robotic probes into assets that can be recharged and reused more affordably than sending out a ship each time to retrieve the probes. This renewable energy approach could keep such drones going almost indefinitely barring electrical or mechanical failures. “We just attach the backpack to the robots, we give them a cable providing power, and they go into the ocean,” Chao explains.

The early buyers of Seatrec’s products are primarily academic researchers who use underwater drones to collect oceanographic data. But the startup has also attracted military and government interest. It has already received Small Business Innovation Research contracts from both the U.S. Office of Naval Research and the National Oceanic and Atmospheric Administration (NOAA).

Seatrec has also won two $10,000 prizes under the Powering the Blue Economy: Ocean Observing Prize administered by the U.S. Department of Energy and NOAA. The prizes awarded during the DISCOVER Competition phase back in March 2020 included one prize split with Northrop Grumman for the joint Mission Unlimited UUV Station concept. The startup and defense giant are currently looking for a robotics company to partner with for the DEVELOP Competition phase of the Ocean Observing Prize that will offer a total of $3 million in prizes.

In the long run, Seatrec hopes its energy-harvesting technology can support commercial ventures such as the aquaculture industry that operates vast underwater farms. The technology could also support underwater drones carrying out seabed surveys that pave the way for deep sea mining ventures, although those are not without controversy because of their projected environmental impacts.

Among all the possible applications, Chao seems especially enthusiastic about the prospect of Seatrec’s renewable power technology enabling underwater drones and floats to collect oceanographic data for much longer periods of time. He spent the better part of two decades working at the NASA Jet Propulsion Laboratory in Pasadena, Calif., where he helped develop a satellite designed for monitoring the Earth’s oceans. But he and the JPL engineering team that developed Seatrec’s core technology believe that swarms of underwater drones can provide a continuous monitoring network to truly begin understanding the oceans in depth.

The COVID-19 pandemic has slowed production and delivery of Seatrec’s products somewhat given local shutdowns and supply chain disruptions. Still, the startup has been able to continue operating in part because it’s considered to be a defense contractor that is operating an essential manufacturing facility. Seatrec’s engineers and other staff members are working in shifts to practice social distancing.

“Rather than building one or two for the government, we want to scale up to build thousands, hundreds of thousands, hopefully millions, so we can improve our understanding and provide that data to the community,” Chao says.


#437689 GITAI Sending Autonomous Robot to Space ...

We’ve been keeping a close watch on GITAI since early last year—what caught our interest initially is the history of the company, which includes a bunch of folks who started in the JSK Lab at the University of Tokyo, won the DARPA Robotics Challenge Trials as SCHAFT, got swallowed by Google, narrowly avoided being swallowed by SoftBank, and are now designing robots that can work in space.

The GITAI YouTube channel has kept us more or less up to date on their progress so far, and GITAI has recently announced the next step in this effort: the deployment of one of their robots on board the International Space Station in 2021.

Photo: GITAI

GITAI’s S1 is a task-specific 8-degree-of-freedom arm with an integrated sensing and computing system and a 1-meter reach.

GITAI has been working on a variety of robots for space operations, the most sophisticated of which is a humanoid torso called G1, which is controlled through an immersive telepresence system. What will be launching into space next year is a more task-specific system called the S1: an 8-degree-of-freedom arm with an integrated sensing and computing system that can be wall-mounted and has a 1-meter reach.

The S1 will be living on board a commercially funded, pressurized airlock-extension module called Bishop, developed by NanoRacks. Mounted on the inside of the Bishop module, the S1 will have access to a task board and a small assembly area, where it will demonstrate common crew intra-vehicular activity, or IVA—tasks like flipping switches, turning knobs, and managing cables. It’ll also do some in-space assembly, or ISA, attaching panels to create a solar array.

Here’s a demonstration of some task board activities, conducted on Earth in a mockup of Bishop:

GITAI says that “all operations conducted by the S1 GITAI robotic arm will be autonomous, followed by some teleoperations from Nanoracks’ in-house mission control.” This is interesting, because from what we’ve seen until now, GITAI has had a heavy emphasis on telepresence, with a human in the loop to get stuff done. As GITAI’s founder and CEO Sho Nakanose commented to us a year ago, “Telepresence robots have far better performance and can be made practical much quicker than autonomous robots, so first we are working on making telepresence robots practical.”

So what’s changed? “GITAI has been concentrating on teleoperations to demonstrate the dexterity of our robot, but now it’s time to show our capabilities to do the same this time with autonomy,” Nakanose told us last week. “In an environment with minimum communication latency, it would be preferable to operate a robot more with teleoperations to enhance the capability of the robot, since with the current technology level of AI, what a robot can do autonomously is very limited. However, in an environment where the latency becomes noticeable, it would become more efficient to have a mixture of autonomy and teleoperations depending on the application. Eventually, in an ideal world, a robot will operate almost fully autonomously with minimum human cognizance.”
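
The arbitration Nakanose describes—more teleoperation when latency is low, more autonomy as it grows—is easy to picture as a simple policy. Here is a minimal sketch; the thresholds and mode names are invented for illustration, since GITAI hasn’t published its actual logic.

```python
# Minimal sketch of latency-based control-mode arbitration along the
# lines Nakanose describes. Thresholds and mode names are invented for
# illustration; they are not GITAI's actual logic.

def pick_control_mode(round_trip_latency_s: float) -> str:
    if round_trip_latency_s < 0.1:
        return "direct teleoperation"   # operator closes the loop directly
    if round_trip_latency_s < 2.0:
        return "shared autonomy"        # operator sets goals, robot closes the loop
    return "supervised autonomy"        # robot acts alone, operator monitors

for latency in (0.02, 0.6, 5.0):        # LAN, ground-to-LEO, deep-space-ish
    print(f"{latency:5.2f} s -> {pick_control_mode(latency)}")
```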


Nakanose says that this mission will help GITAI to “acquire the skills, know-how, and experience necessary to prepare a robot to be ISS compatible, prov[ing] the maturity of our technology in the microgravity environment.” Success would mean conducting both IVA and ISA experiments as planned (autonomous and teleop for IVA, fully autonomous for ISA), which would be pretty awesome, but we’re told that GITAI has already received a research and development order for space robots from a private space company, and Nakanose expects that “by the mid-2020s, we will be able to show GITAI's robots working in space on an actual mission.”

NanoRacks is scheduled to launch the Bishop module on SpaceX CRS-21 in November. The S1 will be launched separately in 2021, and a NASA astronaut will install the robot and then leave it alone to let it start demonstrating how work in space can be made both safer and cheaper once the humans have gotten out of the way.


#437639 Boston Dynamics’ Spot Is Helping ...

In terms of places where you absolutely want a robot to go instead of you, what remains of the utterly destroyed Chernobyl Reactor 4 should be very near the top of your list. The reactor, which suffered a catastrophic meltdown in 1986, has been covered up in almost every way possible in an effort to keep its nuclear core contained. But eventually, that nuclear material is going to have to be dealt with somehow, and in order to do that, it’s important to understand which bits of it are just really bad, and which bits are the actual worst. And this is where Spot is stepping in to help.

The big open space that Spot is walking through is right next to what’s left of Reactor 4. Within six months of the disaster, Reactor 4 was covered in a sarcophagus made of concrete and steel to try and keep all the nasty nuclear fuel from leaking out more than it already had, and it still contains “30 tons of highly contaminated dust, 16 tons of uranium and plutonium, and 200 tons of radioactive lava.” Oof. Over the next 10 years, the sarcophagus slowly deteriorated, and despite the addition of that gigantic network of steel support beams that you can see in the video, in the late 1990s it was decided to erect an enormous building over the entire mess to try and stabilize it for as long as possible.

Reactor 4 is now snugly inside the massive New Safe Confinement (NSC) structure, and the idea is that eventually, the structure will allow for the safe disassembly of what’s left of the reactor, although nobody is quite sure how to do that. This is all just to say that the area inside of the containment structure offers a lot of good opportunities for robots to take over from humans.

This particular Spot is owned by the U.K. Atomic Energy Authority, and was packed off to Ukraine with the assistance of the Robotics and Artificial Intelligence in Nuclear (RAIN) initiative and the National Centre for Nuclear Robotics. Dr. Dave Megson-Smith, who is a researcher at the University of Bristol, in the U.K., and part of the Hot Robotics Facility at the National Nuclear User Facility, was one of the scientists lucky enough to accompany Spot on its adventure. Megson-Smith specializes in sensor development, and he equipped Spot with a collimated radiation sensor in addition to its mapping payload. “We actually built a map of the radiation coming out of the front wall of Chernobyl power plant as we were in there with it,” Megson-Smith told us, and he was able to share this picture, which shows a map of gamma photon count rate:

Image: University of Bristol

Researchers equipped Spot with a collimated radiation sensor and used one of its data products (gamma photon count rate) to create a map of the radiation coming out of the front wall of the Chernobyl power plant.
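
The basic recipe behind such a map is to tag each collimated count-rate reading with the robot’s pose estimate and accumulate the readings into a grid. Here is a simplified sketch of that binning step using synthetic data; the real pipeline fuses Spot’s onboard SLAM pose estimate, which isn’t modeled here.

```python
# Simplified sketch of binning collimated gamma count-rate readings into
# a 2D wall map, in the spirit of the Bristol survey. Poses and counts
# are synthetic; the real pipeline uses the robot's SLAM pose estimate.
import numpy as np

GRID_RES = 0.5                     # meters per grid cell (assumed)
counts = np.zeros((20, 20))        # summed count rates per cell
visits = np.zeros((20, 20))        # number of readings per cell

def add_reading(x_m, y_m, counts_per_s):
    """Accumulate one reading at the wall point the collimator faces."""
    i, j = int(y_m / GRID_RES), int(x_m / GRID_RES)
    counts[i, j] += counts_per_s
    visits[i, j] += 1

# Fake survey pass along the wall: count rate peaks near a hot spot at x = 5 m.
for x in np.linspace(0.0, 9.5, 50):
    add_reading(x, 2.0, 100 + 400 * np.exp(-(x - 5.0) ** 2))

mean_rate = np.divide(counts, visits, out=np.zeros_like(counts), where=visits > 0)
print(f"Peak mean count rate: {mean_rate.max():.0f} counts/s")
```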

So why would you use a very expensive legged robot to wander around what looks like a very flat and robot-friendly floor? As it turns out, the floor in there is very dusty, and a priority inside the NSC is to keep dust down as much as possible, since the dust is radioactive, gets on everything, and is consequently the easiest way for radioactivity to escape the NSC. “You want to minimize picking up material, so we consider the total contact surface area,” says Megson-Smith. “If you use a legged system rather than a wheeled or tracked system, you have a much smaller footprint and you disturb the environment a lot less.” While it’s nice that Spot is nimble and can climb stairs and stuff, tracked vehicles can do that as well, so in this case the primary factor in choosing a robot to work inside Chernobyl is minimizing those contact points.

Right now, routine weekly measurements in contaminated spaces at Chernobyl are done by humans, which puts those humans at risk. Spot, or a robot like it, could potentially take over from those humans, as a sort of “automated safety checker” able to work in medium-level contaminated environments. As far as more dangerous areas go, there’s a lot of uncertainty about what Spot is actually capable of, according to Megson-Smith. “What you think the problems are, and what the industry thinks the problems are, are subtly different things. We were thinking that we’d have to make robots incredibly radiation proof to go into these contaminated environments, but they said, ‘Can you just give us a system that we can send into places where humans already can go, but where we just don’t want to send humans?’”

Making robots incredibly radiation proof is challenging, and without extensive testing and ruggedizing, failures can be frequent, as many robots discovered at Fukushima. Indeed, Megson-Smith says that in Fukushima there’s a particular section that’s known as a “robot graveyard” where robots just go to die, and they’ve had to up their standards again and again to keep the robots from failing. “So the thing they’re worried about with Spot is, what is its tolerance? What components will fail, and what can we do to harden it?” he says. “We’re approaching Boston Dynamics at the moment to see if they’ll work with us to address some of those questions.”

There’s been a small amount of testing of how robots fare under harsh radiation, Megson-Smith told us, including (relatively recently) a KUKA LBR800 arm, which “stopped operating after a large radiation dose of 164.55(±1.09) Gy to its end effector, and the component causing the failure was an optical encoder.” And in case you’re wondering how much radiation that is, a 1 to 2 Gy dose to the entire body gets you acute radiation sickness and possibly death, while 8 Gy is usually just straight-up death. The goal here is not to kill robots (I mean, it sort of is), but as Megson-Smith says, “if we can work out what the weak points are in a robotic system, can we address those, can we redesign those, or at least understand when they might start to fail?” Now all he has to do is convince Boston Dynamics to send them a Spot that they can zap until it keels over.
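
Numbers like that encoder failure dose make crude mission-budget arithmetic possible. A sketch follows; the ambient dose rate below is an invented placeholder, not a measured value for any part of the NSC.

```python
# Crude survivable-mission estimate from the quoted encoder failure dose.
# The ambient dose rate is an illustrative assumption, not a measurement
# from the Chernobyl NSC.

failure_dose_gy = 164.55           # KUKA encoder failure dose quoted above
ambient_dose_rate_gy_per_h = 0.5   # assumed field strength at the work site

hours = failure_dose_gy / ambient_dose_rate_gy_per_h
print(f"~{hours:.0f} hours of operation before the weakest component "
      f"(the encoder) would be expected to fail")
```

The point of the exercise Megson-Smith describes is exactly to find which component sets that limit, and then harden or replace it.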

The goal for Spot in the short term is fully autonomous radiation mapping, which seems very possible. It’ll also get tested with a wider range of sensor packages, and (happily for the robot) this will all take place safely back at home in the U.K. As far as Chernobyl is concerned, robots will likely have a substantial role to play in the near future. “Ultimately, Chernobyl has to be taken apart and decommissioned. That’s the long-term plan for the facility. To do that, you first need to understand everything, which is where we come in with our sensor systems and robotic platforms,” Megson-Smith tells us. “Since there are entire swathes of the Chernobyl nuclear plant where people can’t go in, we’d need robots like Spot to do those environmental characterizations.”


#437614 Video Friday: Poimo Is a Portable ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Engineers at the University of California San Diego have built a squid-like robot that can swim untethered, propelling itself by generating jets of water. The robot carries its own power source inside its body. It can also carry a sensor, such as a camera, for underwater exploration.

[ UCSD ]

Thanks Ioana!

Shark Robotics, French and European leader in Unmanned Ground Vehicles, is announcing today a disinfection add-on for the Boston Dynamics Spot robot, designed to fight the COVID-19 pandemic. The Spot robot with Shark’s purpose-built disinfection payload can decontaminate up to 2,000 m² in 15 minutes, in any space that needs to be sanitized – such as hospitals, metro stations, offices, warehouses or facilities.

[ Shark Robotics ]

Here’s an update on the Poimo portable inflatable mobility project we wrote about a little while ago; while not strictly robotics, it seems like it holds some promise for rapidly developing different soft structures that robotics might find useful.

[ University of Tokyo ]

Thanks Ryuma!

Pretty cool that you can do useful force feedback teleop while video chatting through a “regular broadband Internet connection.” Although, what “regular” means to you is a bit subjective, right?

[ HEBI Robotics ]

Thanks Dave!

While NASA's Mars rover Perseverance travels through space toward the Red Planet, its nearly identical rover twin is hard at work on Earth. The vehicle system test bed (VSTB) rover named OPTIMISM is a full-scale engineering version of the Mars-bound rover. It is used to test hardware and software before the commands are sent up to the Perseverance rover.

[ NASA ]

Jacquard takes ordinary, familiar objects and enhances them with new digital abilities and experiences, while remaining true to their original purpose — like being your favorite jacket, backpack or a pair of shoes that you love to wear.

Our ambition is simple: to make life easier. By staying connected to your digital world, your things can do so much more. Skip a song by brushing your sleeve. Take a picture by tapping on a shoulder strap. Get reminded about the phone you left behind with a blink of light or a haptic buzz on your cuff.

[ Google ATAP ]

Should you attend the IROS 2020 workshop on “Planetary Exploration Robots: Challenges and Opportunities”? Of course you should!

[ Workshop ]

Kuka makes a lot of these videos where I can’t help but think that if they put as much effort into programming the robot as they did into producing the video, the result would be much more impressive.

[ Kuka ]

The Colorado School of Mines is one of the first customers to buy a Spot robot from Boston Dynamics to help with robotics research. Watch as scientists take Spot into the school's mine for the first time.

[ HCR ] via [ CNET ]

A very interesting soft(ish) actuator from Ayato Kanada at Kyushu University's Control Engineering Lab.

A flexible ultrasonic motor (FUSM) generates linear motion as a novel soft actuator. The motor consists of a single metal cube stator with a hole and an elastic elongated coil spring inserted into the hole. When voltages are applied to piezoelectric plates on the stator, the coil spring moves back and forth as a linear slider. Because the FUSM uses friction drive as its operating principle, the most important parameter for optimizing its output is the preload between the stator and slider. The coil spring has a slightly larger diameter than the stator hole and generates the preload by expanding in the radial direction. The coil spring acts not only as a flexible slider but also as a resistive positional sensor: changes in the resistance between the stator and the coil spring end are converted to a voltage and used for position detection.
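
The position-sensing trick in that last sentence is a classic resistive divider: the farther the spring’s end is from the stator contact, the more spring resistance sits in the divider, and the measured voltage shifts accordingly. A minimal sketch of the inversion, with all component values assumed for illustration:

```python
# Sketch of the resistive position sensing described above: the coil
# spring forms one leg of a voltage divider, so the voltage at the
# stator contact varies with the length of spring between the contact
# and the spring end. All component values are illustrative assumptions.

V_SUPPLY = 3.3      # volts across the divider (assumed)
R_FIXED = 1000.0    # fixed divider resistor, ohms (assumed)
OHMS_PER_MM = 2.5   # spring resistance per unit length (assumed)

def position_mm(v_measured: float) -> float:
    """Invert the divider: measured voltage -> spring resistance -> position."""
    r_spring = R_FIXED * v_measured / (V_SUPPLY - v_measured)
    return r_spring / OHMS_PER_MM

for v in (0.5, 1.0, 2.0):
    print(f"{v:.1f} V -> {position_mm(v):.1f} mm")
```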

[ Control Engineering Lab ]

Thanks Ayato!

We show how to use the limbs of a quadruped robot to identify fine-grained soil, representative of Martian regolith.

[ Paper ] via [ ANYmal Research ]

PR2 is serving breakfast and cleaning up afterwards. It’s slow, but all you have to do is eat and leave.

That poor PR2 is a little more naked than it's probably comfortable with.

[ EASE ]

NVIDIA researchers present a hierarchical framework that combines model-based control and reinforcement learning (RL) to synthesize robust controllers for a quadruped robot (the Unitree Laikago).

[ NVIDIA ]

What's interesting about this assembly task is that the robot is using its arm only for positioning, and doing the actual assembly with just fingers.

[ RC2L ]

In this electronics assembly application, Kawasaki's cobot duAro2 uses a tool changing station to tackle a multitude of tasks and assemble different CPU models.

Okay but can it apply thermal paste to a CPU in the right way? Personally, I find that impossible.

[ Kawasaki ]

You only need to watch this video long enough to appreciate the concept of putting a robot on a robot.

[ Impress ]

In this lecture, we’ll hear from the man behind one of the biggest robotics companies in the world, Boston Dynamics, whose robotic dog, Spot, has been used to encourage social distancing in Singapore and is now getting ready for FDA approval to be able to measure patients’ vital signs in hospitals.

[ Alan Turing Institute ]

Greg Kahn from UC Berkeley wrote in to share his recent dissertation talk on “Mobile Robot Learning.”

In order to create mobile robots that can autonomously navigate real-world environments, we need generalizable perception and control systems that can reason about the outcomes of navigational decisions. Learning-based methods, in which the robot learns to navigate by observing the outcomes of navigational decisions in the real world, offer considerable promise for obtaining these intelligent navigation systems. However, there are many challenges impeding mobile robots from autonomously learning to act in the real world, in particular (1) sample efficiency: how to learn using a limited amount of data; (2) supervision: how to tell the robot what to do; and (3) safety: how to ensure the robot and environment are not damaged or destroyed during learning. In this talk, I will present deep reinforcement learning methods for addressing these real-world mobile robot learning challenges and show results that enable ground and aerial robots to navigate in complex indoor and outdoor environments.

[ UC Berkeley ]

Thanks Greg!

Leila Takayama from UC Santa Cruz (and previously Google X and Willow Garage) gives a talk entitled “Toward a more human-centered future of robotics.”

Robots are no longer only in outer space, in factory cages, or in our imaginations. We interact with robotic agents when withdrawing cash from bank ATMs, driving cars with adaptive cruise control, and tuning our smart home thermostats. In the moment of those interactions with robotic agents, we behave in ways that do not necessarily align with the rational belief that robots are just plain machines. Through a combination of controlled experiments and field studies, we use theories and concepts from the social sciences to explore ways that human and robotic agents come together, including how people interact with personal robots and how people interact through telepresence robots. Together, we will explore topics and raise questions about the psychology of human-robot interaction and how we could invent a future of a more human-centered robotics that we actually want to live in.

[ Leila Takayama ]

Roboticist and stand-up comedian Naomi Fitter from Oregon State University gives a talk on “Everything I Know about Telepresence.”

Telepresence robots hold promise to connect people by providing videoconferencing and navigation abilities in far-away environments. At the same time, the impacts of current commercial telepresence robots are not well understood, and circumstances of robot use, including internet connection stability, odd personalizations, and the interpersonal relationship between a robot operator and people co-located with the robot, can overshadow the benefit of the robot itself. And although the idea of telepresence robots has been around for over two decades, available nonverbal expressive abilities through telepresence robots are limited, and suitable operator user interfaces for the robot (for example, controls that allow for the operator to hold a conversation and move the robot simultaneously) remain elusive. So where should we be using telepresence robots? Are there any pitfalls to watch out for? What do we know about potential robot expressivity and user interfaces? This talk will cover my attempts to address these questions and ways in which the robotics research community can build off of this work.

[ Talking Robotics ]


#437608 Video Friday: Agility Robotics Raises ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Digit is now in full commercial production and we’re excited to announce a $20M funding round co-led by DCVC and Playground Global!

Digits for everyone!

[ Agility Robotics ]

A flexible rover that can both travel long distances and rappel down hard-to-reach areas of scientific interest has undergone a field test in the Mojave Desert in California to showcase its versatility. Composed of two Axel robots, DuAxel is designed to explore crater walls, pits, scarps, vents and other extreme terrain on the moon, Mars and beyond.

This technology demonstration developed at NASA’s Jet Propulsion Laboratory in Southern California showcases the robot’s ability to split in two and send one of its halves — a two-wheeled Axel robot — over an otherwise inaccessible slope, using a tether as support and to supply power.

The rappelling Axel can then autonomously seek out areas to study, safely overcome slopes and rocky obstacles, and then return to dock with its other half before driving to another destination. Although the rover doesn’t yet have a mission, key technologies are being developed that might, one day, help us explore the rocky planets and moons throughout the solar system.

[ JPL ]

A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.

[ Purdue ]

This video shows the latest results in whole-body locomotion control of the humanoid robot iCub, achieved by the Dynamic Interaction Control line at IIT-Istituto Italiano di Tecnologia in Genova, Italy. In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the possibility of harming humans who share the robot’s surrounding environment.

This is super impressive, considering that iCub was only able to crawl and was still tethered not too long ago. Also, it seems to be blinking properly now, so it doesn’t look like it’s always sleepy.

[ IIT ]

This video shows a set of new tests we performed on Bolt. We conducted tests on five different scenarios: 1) walking forward/backward, 2) uneven surface, 3) soft surface, 4) push recovery, and 5) slippage recovery. Thanks to our feedback control based on Model Predictive Control, the robot can walk in the presence of all these uncertainties. We will open-source all the code in the near future.
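
For a sense of what “feedback control based on Model Predictive Control” means for something like push recovery, here is a toy version: plan center-of-pressure inputs over a short horizon so that a linear inverted pendulum model of the center of mass comes back to rest. This is an unconstrained least-squares sketch with assumed parameters, not ODRI’s actual controller.

```python
# Toy MPC for push recovery on a linear-inverted-pendulum model of a
# biped's center of mass. Unconstrained least squares with assumed
# parameters; a sketch of the idea, not ODRI's controller.
import numpy as np

DT, G, H = 0.02, 9.81, 0.35   # timestep, gravity, CoM height (assumed)
w2 = G / H
A = np.array([[1.0, DT], [w2 * DT, 1.0]])   # state: [CoM position, CoM velocity]
B = np.array([[0.0], [-w2 * DT]])           # input: center-of-pressure position

N = 50  # 1-second horizon
# Stack the prediction model: x_k = A^k x0 + sum_j A^(k-1-j) B u_j
Phi = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
Gmat = np.zeros((2 * N, N))
for k in range(N):
    for j in range(k + 1):
        Gmat[2 * k:2 * k + 2, j] = (np.linalg.matrix_power(A, k - j) @ B).ravel()

x0 = np.array([0.0, 0.4])  # just pushed: 0.4 m/s of unwanted CoM velocity
# Least squares: drive all predicted states to zero, with a small input penalty.
W = np.vstack([Gmat, 1e-2 * np.eye(N)])
target = np.concatenate([-Phi @ x0, np.zeros(N)])
u = np.linalg.lstsq(W, target, rcond=None)[0]
print(f"First CoP command: {u[0]:+.3f} m (pressure shifts forward to brake the push)")
```

Only the first command of the plan gets executed before the whole problem is re-solved at the next timestep, which is what makes the scheme robust to pushes and slips the model never anticipated.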

[ ODRI ]

The title of this video is “Can you throw your robot into a lake?” The title of this video should be, “Can you throw your robot into a lake and drive it out again?”

[ Norlab ]

AeroVironment Successfully Completes Sunglider Solar HAPS Stratospheric Test Flight, Surpassing 60,000 Feet Altitude and Demonstrating Broadband Mobile Connectivity.

[ AeroVironment ]

We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scaled user actions such as pushing or leaning; (2) acting on the users by pulling or transporting them; and (3) carrying multiple, potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other.

[ DeepAI ]

In a new video, personnel from Swiss energy supply company Kraftwerke Oberhasli AG (KWO) explain how they were able to keep employees out of harm’s way by using Flyability’s Elios 2 to collect visual data while building a new dam.

[ Flyability ]

Enjoy our Ascento robot fail compilation! With every failure we experience, we learn more and we can improve our robot for its next iteration, which will come soon… Stay tuned for more!

FYI posting a robot fails video will pretty much guarantee you a spot in Video Friday!

[ Ascento ]

Humans are remarkably good at using chopsticks. Guinness World Records witnessed a person using chopsticks to pick up 65 M&Ms in just a minute. We aim to collect demonstrations from humans and to teach robots to use chopsticks.

[ UW Personal Robotics Lab ]

A surprising amount of personality from these Yaskawa assembly robots.

[ Yaskawa ]

This paper presents the system design, modeling, and control of the Aerial Robotic Chain Manipulator. This new robot design offers the potential to exert strong forces and moments to the environment, carry and lift significant payloads, and simultaneously navigate through narrow corridors. The presented experimental studies include a valve rotation task, a pick-and-release task, and the verification of load oscillation suppression to demonstrate the stability and performance of the system.

[ ARL ]

Whether animals or plants, whether in the water, on land or in the air, nature provides the model for many technical innovations and inventions. This is summed up in the term bionics, which is a combination of the words ‘biology‘ and ‘electronics’. At Festo, learning from nature has a long history, as our Bionic Learning Network is based on using nature as the source for future technologies like robots, assistance systems or drive solutions.

[ Festo ]

Dogs! Selfies! Thousands of LEGO bricks! This video has it all.

[ LEGO ]

An IROS workshop talk on “Cassie and Mini Cheetah Autonomy” by Maani Ghaffari and Jessy Grizzle from the University of Michigan.

[ Michigan Robotics ]

David Schaefer’s Cozmo robots are back with this mind-blowing dance-off!

What you just saw represents hundreds of hours of work, David tells us: “I wrote over 10,000 lines of code to create the dance performance as I had to translate the beats per minute of the song into motor rotations in order to get the right precision needed to make the moves look sharp. The most challenging move was the SpongeBob SquareDance as any misstep would send the Cozmos crashing into each other. LOL! Fortunately for me, Cozmo robots are pretty resilient.”
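
The beat-to-motion arithmetic David describes is worth seeing spelled out. Here is a sketch; the wheel dimensions and travel target are assumptions for illustration, not Cozmo’s actual specs.

```python
# Sketch of the beats-per-minute-to-motor-rotations translation David
# describes. Wheel dimensions and the travel target are assumptions
# for illustration, not Cozmo's actual specs.
import math

bpm = 120.0                  # song tempo (assumed)
beat_period_s = 60.0 / bpm   # 0.5 s per beat at 120 bpm

wheel_diameter_mm = 28.0     # assumed
circumference_mm = math.pi * wheel_diameter_mm

travel_per_beat_mm = 90.0    # roll about one body length per beat (assumed)
speed_mm_s = travel_per_beat_mm / beat_period_s
rotations_per_beat = travel_per_beat_mm / circumference_mm

print(f"{speed_mm_s:.0f} mm/s wheel speed, {rotations_per_beat:.2f} rotations per beat")
```

Hitting that timing precisely, move after move and for multiple robots at once, is where the thousands of lines of choreography code come in.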

[ Life with Cozmo ]

Thanks David!

This week’s GRASP on Robotics seminar is by Sangbae Kim from MIT, on “Robots with Physical Intelligence.”

While industrial robots are effective in repetitive, precise kinematic tasks in factories, the design and control of these robots are not suited for the physically interactive performance that humans achieve easily. These tasks require ‘physical intelligence’ through complex dynamic interactions with environments, whereas conventional robots are designed primarily for position control. In order to develop a robot with ‘physical intelligence’, we first need a new type of machine that allows dynamic interactions. This talk will discuss how the new design paradigm allows dynamic interactive tasks. As an embodiment of such a robot design paradigm, the latest version of the MIT Cheetah robots and force-feedback teleoperation arms will be presented.

[ GRASP ]

This week’s CMU RI Seminar is by Kevin Lynch from Northwestern, on “Robotics and Biosystems.”

Research at the Center for Robotics and Biosystems at Northwestern University encompasses bio-inspiration, neuromechanics, human-machine systems, and swarm robotics, among other topics. In this talk I will give an overview of some of our recent work on in-hand manipulation, robot locomotion on yielding ground, and human-robot systems.

[ CMU RI ]
