Tag Archives: forces

#437671 Video Friday: Researchers 3D Print ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online]
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The Giant Gundam in Yokohama is actually way cooler than I thought it was going to be.

[ Gundam Factory ] via [ YouTube ]

A new 3D-printing method will make it easier to manufacture and control the shape of soft robots, artificial muscles and wearable devices. Researchers at UC San Diego show that by controlling the printing temperature of liquid crystal elastomer, or LCE, they can control the material’s degree of stiffness and ability to contract—also known as degree of actuation. What’s more, they are able to change the stiffness of different areas in the same material by exposing it to heat.

[ UCSD ]

Thanks Ioana!

This is the first successful reactive stepping test on our new torque-controlled biped robot named Bolt. The robot has 3 active degrees of freedom per leg and one passive joint in the ankle. Since there is no active ankle joint, the robot relies entirely on step location and timing adaptation to stabilize its motion. Not only can the robot perform stepping without active ankles, it is also capable of rejecting external disturbances, as we show in this video.
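For the curious, here is roughly what ankle-free step adaptation boils down to: a minimal capture-point sketch in Python using the standard linear-inverted-pendulum rule. This is our own illustration, not necessarily the ODRI controller, and the numbers are made up.

```python
# A minimal sketch of reactive step placement in the spirit described above, using
# the standard linear-inverted-pendulum "capture point" rule (our illustration,
# not necessarily ODRI's controller). The step target shifts with the CoM velocity
# so the robot can stabilize itself without any ankle torque.
import math

G = 9.81  # gravity, m/s^2

def capture_point_step(com_pos, com_vel, com_height, desired_pos=0.0):
    """Foot placement (along one axis) that brings the CoM to rest over desired_pos."""
    omega = math.sqrt(G / com_height)      # natural frequency of the pendulum
    icp = com_pos + com_vel / omega        # instantaneous capture point
    return icp - desired_pos               # step toward it, offset by the goal position

# Example: CoM 0.35 m high, 2 cm off-center, drifting sideways at 0.4 m/s.
print(capture_point_step(0.02, 0.4, 0.35))   # step ~0.10 m to that side
```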

[ ODRI ]

The curling robot “Curly” is the first AI-based robot to demonstrate competitive curling skills in a real icy environment, with all of its high uncertainties. Scientists from seven different Korean research institutions, including Prof. Klaus-Robert Müller, head of the machine-learning group at TU Berlin and guest professor at Korea University, developed the robot.

[ TU Berlin ]

MoonRanger, a small robotic rover being developed by Carnegie Mellon University and its spinoff Astrobotic, has completed its preliminary design review in preparation for a 2022 mission to search for signs of water at the moon’s south pole. Red Whittaker explains why the new MoonRanger Lunar Explorer design is innovative and different from prior planetary rovers.

[ CMU ]

Cobalt’s security robot can now navigate unmodified elevators, which is an impressive feat.

Also, EXTERMINATE!

[ Cobalt ]

OrionStar, the robotics company invested in by Cheetah Mobile, announced the Robotic Coffee Master. Incorporating 3,000 hours of AI learning, 30,000 hours of robotic arm testing and machine vision training, the Robotic Coffee Master can perform complex brewing techniques, such as curves and spirals, with millimeter-level stability and accuracy (reset error ≤ 0.1mm).

[ Cheetah Mobile ]

DARPA OFFensive Swarm-Enabled Tactics (OFFSET) researchers recently tested swarms of autonomous air and ground vehicles at the Leschi Town Combined Arms Collective Training Facility (CACTF), located at Joint Base Lewis-McChord (JBLM) in Washington. The Leschi Town field experiment is the fourth of six planned experiments for the OFFSET program, which seeks to develop large-scale teams of collaborative autonomous systems capable of supporting ground forces operating in urban environments.

[ DARPA ]

Here are some highlights from Team Explorer’s SubT Urban competition back in February.

[ Team Explorer ]

Researchers with the Skoltech Intelligent Space Robotics Laboratory have developed a system that allows easy interaction with a micro-quadcopter with LEDs that can be used for light-painting. The researchers used a 92x92x29 mm Crazyflie 2.0 quadrotor that weighs just 27 grams, equipped with a light reflector and an array of controllable RGB LEDs. The control system consists of a glove equipped with an inertial measurement unit (IMU; an electronic device that tracks the movement of a user’s hand), and a base station that runs a machine learning algorithm.

[ Skoltech ]

“DeKonBot” is the prototype of a cleaning and disinfection robot for potentially contaminated surfaces in buildings such as door handles, light switches or elevator buttons. While other cleaning robots often spray the cleaning agents over a large area, DeKonBot autonomously identifies the surface to be cleaned.

[ Fraunhofer IPA ]

On Oct. 20, the OSIRIS-REx mission will make the first attempt at its Touch-And-Go (TAG) sample collection event. Not only will the spacecraft navigate to the surface using innovative navigation techniques, but it could also collect the largest sample since the Apollo missions.

[ NASA ]

With all the robotics research that seems to happen in places where snow is more of an occasional novelty or annoyance, it’s good to see NORLAB taking things more seriously.

[ NORLAB ]

Telexistence’s Model-T robot works very slowly, but very safely, restocking shelves.

[ Telexistence ] via [ YouTube ]

Roboy 3.0 will be unveiled next month!

[ Roboy ]

KUKA ready2_educate is your training cell for hands-on education in robotics. It is especially aimed at schools, universities and company training facilities. The training cell is a complete starter package and your perfect partner for entry into robotics.

[ KUKA ]

A UPenn GRASP Lab Special Seminar on Data Driven Perception for Autonomy, presented by Dapo Afolabi from UC Berkeley.

Perception systems form a crucial part of autonomous and artificial intelligence systems since they convert data about the relationship between an autonomous system and its environment into meaningful information. Perception systems can be difficult to build since they may involve modeling complex physical systems or other autonomous agents. In such scenarios, data driven models may be used to augment physics based models for perception. In this talk, I will present work making use of data driven models for perception tasks, highlighting the benefit of such approaches for autonomous systems.

[ GRASP Lab ]

A Maryland Robotics Center Special Robotics Seminar on Underwater Autonomy, presented by Ioannis Rekleitis from the University of South Carolina.

This talk presents an overview of algorithmic problems related to marine robotics, with a particular focus on increasing the autonomy of robotic systems in challenging environments. I will talk about vision-based state estimation and mapping of underwater caves. An application to coral reef monitoring will also be discussed. I will also talk about several vehicles used at the University of South Carolina, such as drifters, underwater vehicles, and surface vehicles. In addition, a short overview of current projects will be discussed. The work that I will present has a strong algorithmic flavour, while it is validated on real hardware. Experimental results from several testing campaigns will be presented.

[ MRC ]

This week’s CMU RI Seminar comes from Scott Niekum at UT Austin, on Scaling Probabilistically Safe Learning to Robotics.

Before learning robots can be deployed in the real world, it is critical that probabilistic guarantees can be made about the safety and performance of such systems. This talk focuses on new developments in three key areas for scaling safe learning to robotics: (1) a theory of safe imitation learning; (2) scalable reward inference in the absence of models; (3) efficient off-policy policy evaluation. The proposed algorithms offer a blend of safety and practicality, making a significant step towards safe robot learning with modest amounts of real-world data.

[ CMU RI ]

#437635 Toyota Research Demonstrates ...

Over the last several years, Toyota has been putting more muscle into forward-looking robotics research than just about anyone. In addition to the Toyota Research Institute (TRI), there’s that massive 175-acre robot-powered city of the future that Toyota still plans to build next to Mount Fuji. Even Toyota itself acknowledges that it might be crazy, but that’s just how they roll—as TRI CEO Gill Pratt told me a while back, when Toyota decides to do something, they really do go all-in on it.

TRI has been focusing heavily on home robots, which reflects the long-term nature of what TRI is trying to do: homes are both where we’ll need robots the most and where they will be hardest to deploy. The unpredictable nature of homes, and the fact that homes tend to have squishy, fragile people in them, are robot-unfriendly characteristics. But as the population continues to age (an increasingly acute problem in Japan), homes offer an enormous amount of potential for helping us maintain our independence.

Today, Toyota is showing off some of the research that it’s been working on recently, in the form of a virtual reality presentation in lieu of an in-person press event. For journalists, TRI pre-loaded the recording onto a VR headset, which was FedEx’ed to my house. You can watch the entire 40-minute presentation in 360 video on YouTube (or in VR if you have a headset of your own), but if you don’t watch the whole thing, you should at least check out the full-on GLaDOS (with arms) that TRI thinks belongs in your home.

The presentation features an introduction from Gill Pratt, who looks entirely too comfortable embedded inside of one of TRI’s telepresence robots. The event also covers a lot of territory, but the highlight is almost certainly the new hardware that TRI demonstrates.

Soft bubble gripper

Photo: TRI

This is a “soft bubble gripper,” under development at TRI’s Cambridge, Mass., branch. These passively compliant, air-filled grippers make it easier to grasp many different kinds of objects safely, but the nifty thing is that they’ve got cameras inside of them watching a pattern of dots on the interior of the soft membrane.

When the outside of the bubble makes contact with an object, the bubble deforms, and the deformation of the dot pattern on the inside can be tracked by the camera to determine both directions and magnitudes of forces. This is a concept that we’ve seen elsewhere before, but TRI’s implementation is a clever way of making an inherently safe end effector that can still perform all the sensing you need it to do for relatively complex manipulation tasks.
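The dot-tracking idea lends itself to a very simple implementation sketch. Here is a rough Python/OpenCV illustration of the concept as described above, not TRI’s actual pipeline; the calibration constants are made up and the normal-force proxy is our own assumption.

```python
# Hypothetical sketch of dot-pattern force sensing on a bubble gripper (the concept
# described above, not TRI's implementation). Assumes a camera looking at the inside
# of the membrane and two grayscale frames: one at rest and one during contact.
import cv2
import numpy as np

K_SHEAR = 0.05   # N per pixel of mean tangential dot displacement (made-up calibration)
K_NORMAL = 2.0   # N per unit of relative dot-spread change (made-up calibration)

def detect_dots(gray):
    """Find the printed dots on the membrane interior as blob centers."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 5
    detector = cv2.SimpleBlobDetector_create(params)
    return np.array([kp.pt for kp in detector.detect(gray)], dtype=np.float32)

def estimate_contact_force(rest_gray, contact_gray):
    """Estimate tangential (shear) and normal contact force from dot motion."""
    rest_pts = detect_dots(rest_gray).reshape(-1, 1, 2)
    # Track each rest-frame dot into the deformed frame with sparse optical flow.
    moved_pts, status, _ = cv2.calcOpticalFlowPyrLK(rest_gray, contact_gray, rest_pts, None)
    ok = status.ravel() == 1
    rest_ok = rest_pts.reshape(-1, 2)[ok]
    moved_ok = moved_pts.reshape(-1, 2)[ok]
    # Mean tangential displacement gives the shear direction and magnitude.
    shear = K_SHEAR * (moved_ok - rest_ok).mean(axis=0)
    # Dots spread apart as the membrane flattens against an object:
    # use that spread change as a crude normal-force proxy.
    rest_spread = rest_ok.std(axis=0).mean()
    spread_change = moved_ok.std(axis=0).mean() - rest_spread
    normal = K_NORMAL * max(spread_change, 0.0) / rest_spread
    return shear, normal
```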

The bubble gripper was presented at ICRA this year, and you can read the technical paper here.

Ceiling-mounted home robot

Photo: TRI

I don’t know whether robots dangling from the ceiling seemed sinister pre-Portal, but they sure as heck do to me after having played through that game a couple of times, an impression that has since been reinforced by AUTO from WALL-E.

The reason that we generally see robots mounted on the floor or on tables or on mobile bases is that we’re bipeds, not bats, and giving a robot access to a human-like workspace is easiest if you also give that robot a human-like position and orientation. And if you want to be able to reach stuff high up, you do what TRI did with their previous generation of kitchen manipulator, and just give it the ability to make itself super tall. But TRI is convinced that the ceiling is a good place to put our future home robots:

One innovative concept is a “gantry robot” that would descend from an overhead framework to perform tasks such as loading the dishwasher, wiping surfaces, and clearing clutter. By traveling on the ceiling, the robot avoids the problems of navigating household floor clutter and navigating cramped spaces. When not in use, the robot would tuck itself up out of the way. To further investigate this idea, the team has built a laboratory prototype robot that can do all the same tasks as a floor-based mobile robot but with the innovative overhead mobility system.

The obvious problem with the gantry robot is that you have to install all kinds of stuff in your ceiling for this to work, which makes it very impractical (if not totally impossible) to introduce a system like this into a home that wasn’t built specifically for it. If, however, you do build a home with a robot like this in mind, the animation below from TRI shows how it could be extra useful. Suddenly, stairs are a non-issue. Payload is presumably also a non-issue, since loads can be transferred to the ceiling. Batteries become unnecessary, so the whole robot can be much lighter weight, which in turn makes it safer. Sensors get a fantastic view, and obstacle avoidance becomes trivial.

Robots as “time machines”

Photo: TRI

TRI’s presentation covered more than what we’ve highlighted here—our focus has been on the hardware prototypes, but TRI had more to talk about, including learning through demonstration, scaling learning through simulation, and how TRI has been working with users to figure out what research directions should be explored. It’s all available right now on YouTube, and it’s well worth 40 minutes of your time.

“What we’re really focused on is this principle idea of amplifying, rather than replacing, human beings”
—Gill Pratt, TRI

It’s only been five years since Toyota announced the $1 billion investment that established TRI, and it feels like the progress that’s been made since then has been substantial. It’s not often that vision, resources, and long-term commitment come together like this, and TRI’s emphasis on making life better for people is one of the things that helps to keep us optimistic about the future of robotics.

“What we’re really focused on is this principle idea of amplifying, rather than replacing, human beings,” Gill Pratt told us. “And what it means to amplify a person, particularly as they’re aging—what we’re really trying to do is build a time machine. This may sound fanciful, and of course we can’t build a real time machine, but maybe we can build robotic assistants to make our lives as we age seem as if we are actually using a time machine.” He explains that it doesn’t mean building robots for convenience or to do our jobs for us. “It means building technology that enables us to continue to live and to work and to relate to each other as if we were younger,” he says. “And that’s really what our main goal is.”

#437628 Video Friday: An In-Depth Look at Mesmer ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online]
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Bear Robotics, a robotics and artificial intelligence company, and SoftBank Robotics Group, a leading robotics manufacturer and solutions provider, have collaborated to bring a new robot named Servi to the food service and hospitality field.

[ Bear Robotics ]

A literal in-depth look at Engineered Arts’ Mesmer android.

[ Engineered Arts ]

Is your robot running ROS? Is it connected to the Internet? Are you actually in control of it right now? Are you sure?

I appreciate how the researchers admitted to finding two of their own robots as part of the scan, a Baxter and a drone.

[ Brown ]

Smile Robotics describes this as “(possibly) world’s first full-autonomous clear-up-the-table robot.”

We’re not qualified to make a judgement on the world firstness, but personally I hate clearing tables, so this robot has my vote.

Smile Robotics founder and CEO Takashi Ogura, along with chief engineer Mitsutaka Kabasawa and engineer Kazuya Kobayashi, are former Google roboticists. Ogura also worked at SCHAFT. Smile says its robot uses ROS and is controlled by a framework written mainly in Rust, adding: “We are hiring Rustacean Roboticists!”

[ Smile Robotics ]

We’re not entirely sure why, but Panasonic has released plans for an Internet of Things system for hamsters.

We devised a “small animal healthcare device” that can measure the weight and activity of small animals and the temperature and humidity of their living environment, and use that data to manage their health. The device visualizes the animals’ health status and environment to promote early detection of disease. Imagining this device in use with a cherished pet, we hope that our manufacturing can help people get through the current difficult situation.

[ Panasonic ] via [ RobotStart ]

Researchers at Yale have developed a robotic fabric, a breakthrough that could lead to such innovations as adaptive clothing, self-deploying shelters, or lightweight shape-changing machinery.

The researchers focused on processing functional materials into fiber form so they could be integrated into fabrics while retaining their advantageous properties. For example, they made variable-stiffness fibers out of an epoxy embedded with particles of Field’s metal, an alloy that liquifies at relatively low temperatures. When cool, the particles are solid metal and make the material stiffer; when warm, the particles melt into liquid and make the material softer.

[ Yale ]

In collaboration with Armasuisse and SBB, RSL demonstrated the use of a teleoperated Menzi Muck M545 to clean up a rock slide in Central Switzerland. The machine can be operated from a teleoperation platform with visual and motion feedback. The walking excavator features an active chassis that can adapt to uneven terrain.

[ ETHZ RSL ]

An international team of JKU researchers is continuing to develop their vision for robots made out of soft materials. A new article in the journal “Communications Materials” demonstrates how these kinds of soft machines use weak magnetic fields to move very quickly. A triangle-shaped robot can roll itself through the air at high speed and walk forward when exposed to an alternating in-plane square-wave magnetic field (3.5 mT, 1.5 Hz). The diameter of the robot is 18 mm with a thickness of 80 µm. A six-arm robot can grab, transport, and release non-magnetic objects such as a polyurethane foam cube, controlled by a permanent magnet.

Okay but tell me more about that cute sheep.

[ JKU ]

Interbotix has this “research level robotic crawler,” which both looks mean and runs ROS, a dangerous combination.

And here’s how it all came together:

[ Interbotix ]

I guess if you call them “loitering missile systems” rather than “drones that blow things up” people are less likely to get upset?

[ AeroVironment ]

In this video, we show a planner for a master dual-arm robot to manipulate tethered tools with an assistant dual-arm robot’s help. The assistant robot provides assistance to the master robot by manipulating the tool cable and avoiding collisions. The provided assistance allows the master robot to perform tool placements on the robot workspace table to regrasp the tool, which would typically fail since the tool cable tension may change the tool positions. It also allows the master robot to perform tool handovers, which would normally cause entanglements or collisions with the cable and the environment without the assistance.

[ Harada Lab ]

This video shows a flexible and robust robotic system for autonomous drawing on 3D surfaces. The system takes 2D drawing strokes and a 3D target surface (mesh or point clouds) as input. It maps the 2D strokes onto the 3D surface and generates a robot motion to draw the mapped strokes using visual recognition, grasp pose reasoning, and motion planning.

[ Harada Lab ]

Weekly mobility test. This time the Warthog takes on a fallen tree. Will it cross it? The answer is in the video!

And the answer is: kinda?

[ NORLAB ]

One of the advantages of walking machines is their ability to apply forces to the environment in all directions and of various magnitudes. Many multi-legged robots are equipped with point-contact feet, as these simplify the design and control of the robot. The iStruct project focuses on the development of a foot that allows extensive contact with the environment.

[ DFKI ]

An urgent medical transport was simulated in NASA’s second Systems Integration and Operationalization (SIO) demonstration Sept. 28 with partner Bell Textron Inc. Bell used the remotely-piloted APT 70 to conduct a flight representing an urgent medical transport mission. It is envisioned in the future that an operational APT 70 could provide rapid medical transport for blood, organs, and perishable medical supplies (payload up to 70 pounds). The APT 70 is estimated to move three times as fast as ground transportation.

Always a little suspicious when the video just shows the drone flying, and sitting on the ground, but not that tricky transition between those two states.

[ NASA ]

A Lockheed Martin Robotics Seminar on “Socially Assistive Mobile Robots,” by Yi Guo from Stevens Institute of Technology.

The use of autonomous mobile robots in human environments is on the rise. Assistive robots have been seen in real-world environments, such as robot guides in airports, robot police in public parks, and patrolling robots in supermarkets. In this talk, I will first present current research activities conducted in the Robotics and Automation Laboratory at Stevens. I’ll then focus on robot-assisted pedestrian regulation, where pedestrian flows are regulated and optimized through passive human-robot interaction.

[ UMD ]

This week’s CMU RI Seminar is by CMU’s Zachary Manchester, on “The World’s Tiniest Space Program.”

The aerospace industry has experienced a dramatic shift over the last decade: Flying a spacecraft has gone from something only national governments and large defense contractors could afford to something a small startup can accomplish on a shoestring budget. A virtuous cycle has developed where lower costs have led to more launches and the growth of new markets for space-based data. However, many barriers remain. This talk will focus on driving these trends to their ultimate limit by harnessing advances in electronics, planning, and control to build spacecraft that cost less than a new smartphone and can be deployed in large numbers.

[ CMU RI ]

#437608 Video Friday: Agility Robotics Raises ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Digit is now in full commercial production and we’re excited to announce a $20M funding round co-led by DCVC and Playground Global!

Digits for everyone!

[ Agility Robotics ]

A flexible rover that has both the ability to travel long distances and to rappel down hard-to-reach areas of scientific interest has undergone a field test in the Mojave Desert in California to showcase its versatility. Composed of two Axel robots, DuAxel is designed to explore crater walls, pits, scarps, vents and other extreme terrain on the moon, Mars and beyond.

This technology demonstration developed at NASA’s Jet Propulsion Laboratory in Southern California showcases the robot’s ability to split in two and send one of its halves — a two-wheeled Axel robot — over an otherwise inaccessible slope, using a tether as support and to supply power.

The rappelling Axel can then autonomously seek out areas to study, safely overcome slopes and rocky obstacles, and then return to dock with its other half before driving to another destination. Although the rover doesn’t yet have a mission, key technologies are being developed that might, one day, help us explore the rocky planets and moons throughout the solar system.

[ JPL ]

A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.

[ Purdue ]

This video shows the latest results in the whole-body locomotion control of the humanoid robot iCub, achieved by the Dynamic Interaction Control line at IIT-Istituto Italiano di Tecnologia in Genova, Italy. In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the possibility of harming humans who share the robot’s surrounding environment.

This is super impressive, considering that iCub was only able to crawl and was still tethered not too long ago. Also, it seems to be blinking properly now, so it doesn’t look like it’s always sleepy.

[ IIT ]

This video shows a set of new tests we performed on Bolt. We conducted tests on five different scenarios: 1) walking forward/backward, 2) uneven surface, 3) soft surface, 4) push recovery, and 5) slippage recovery. Thanks to our feedback control based on Model Predictive Control, the robot can walk in the presence of all these uncertainties. We will open-source all the code in the near future.

[ ODRI ]

The title of this video is “Can you throw your robot into a lake?” The title of this video should be, “Can you throw your robot into a lake and drive it out again?”

[ NORLAB ]

AeroVironment Successfully Completes Sunglider Solar HAPS Stratospheric Test Flight, Surpassing 60,000 Feet Altitude and Demonstrating Broadband Mobile Connectivity.

[ AeroVironment ]

We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scaled user actions such as pushing or leaning; (2) acting on users by pulling or transporting them; and (3) carrying multiple, potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other.

[ DeepAI ]

In a new video, personnel from Swiss energy supply company Kraftwerke Oberhasli AG (KWO) explain how they were able to keep employees out of harm’s way by using Flyability’s Elios 2 to collect visual data while building a new dam.

[ Flyability ]

Enjoy our Ascento robot fail compilation! With every failure we experience, we learn more and we can improve our robot for its next iteration, which will come soon… Stay tuned for more!

FYI posting a robot fails video will pretty much guarantee you a spot in Video Friday!

[ Ascento ]

Humans are remarkably good at using chopsticks. Guinness World Records documented a person using chopsticks to pick up 65 M&Ms in just one minute. We aim to collect demonstrations from humans and teach a robot to use chopsticks.

[ UW Personal Robotics Lab ]

A surprising amount of personality from these Yaskawa assembly robots.

[ Yaskawa ]

This paper presents the system design, modeling, and control of the Aerial Robotic Chain Manipulator. This new robot design offers the potential to exert strong forces and moments to the environment, carry and lift significant payloads, and simultaneously navigate through narrow corridors. The presented experimental studies include a valve rotation task, a pick-and-release task, and the verification of load oscillation suppression to demonstrate the stability and performance of the system.

[ ARL ]

Whether animals or plants, whether in the water, on land or in the air, nature provides the model for many technical innovations and inventions. This is summed up in the term bionics, which is a combination of the words ‘biology‘ and ‘electronics’. At Festo, learning from nature has a long history, as our Bionic Learning Network is based on using nature as the source for future technologies like robots, assistance systems or drive solutions.

[ Festo ]

Dogs! Selfies! Thousands of LEGO bricks! This video has it all.

[ LEGO ]

An IROS workshop talk on “Cassie and Mini Cheetah Autonomy” by Maani Ghaffari and Jessy Grizzle from the University of Michigan.

[ Michigan Robotics ]

David Schaefer’s Cozmo robots are back with this mind-blowing dance-off!

What you just saw represents hundreds of hours of work, David tells us: “I wrote over 10,000 lines of code to create the dance performance as I had to translate the beats per minute of the song into motor rotations in order to get the right precision needed to make the moves look sharp. The most challenging move was the SpongeBob SquareDance as any misstep would send the Cozmos crashing into each other. LOL! Fortunately for me, Cozmo robots are pretty resilient.”
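To give a flavor of the kind of translation David is describing, here is a trivial sketch of turning a song’s tempo into a wheel-speed command. The numbers are our own and this is not his code; it assumes the Cozmo Python SDK’s drive_wheels(), which takes wheel speeds in mm/s.

```python
# A rough illustration of the BPM-to-motion translation described above (our own
# made-up numbers, not David's code). It assumes the Cozmo Python SDK's
# drive_wheels(), which takes left/right wheel speeds in mm/s, so a beat-synced
# move reduces to picking a speed that covers the move distance in exactly one beat.
def beat_synced_wheel_speed(bpm, move_distance_mm):
    """Wheel speed (mm/s) that completes move_distance_mm in one beat."""
    seconds_per_beat = 60.0 / bpm
    return move_distance_mm / seconds_per_beat

# At 128 BPM, a 50 mm shuffle per beat needs roughly 107 mm/s.
print(beat_synced_wheel_speed(128, 50))   # ~106.7 mm/s
```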

[ Life with Cozmo ]

Thanks David!

This week’s GRASP on Robotics seminar is by Sangbae Kim from MIT, on “Robots with Physical Intelligence.”

While industrial robots are effective in repetitive, precise kinematic tasks in factories, the design and control of these robots are not suited for the physically interactive performance that humans do easily. These tasks require ‘physical intelligence’ through complex dynamic interactions with environments, whereas conventional robots are designed primarily for position control. In order to develop a robot with ‘physical intelligence’, we first need a new type of machine that allows dynamic interactions. This talk will discuss how the new design paradigm allows dynamic interactive tasks. As an embodiment of such a robot design paradigm, the latest version of the MIT Cheetah robots and force-feedback teleoperation arms will be presented.

[ GRASP ]

This week’s CMU RI Seminar is by Kevin Lynch from Northwestern, on “Robotics and Biosystems.”

Research at the Center for Robotics and Biosystems at Northwestern University encompasses bio-inspiration, neuromechanics, human-machine systems, and swarm robotics, among other topics. In this talk I will give an overview of some of our recent work on in-hand manipulation, robot locomotion on yielding ground, and human-robot systems.

[ CMU RI ]

#437603 Throwable Robot Car Always Lands on Four ...

Throwable or droppable robots seem like a great idea for a bunch of applications, including exploration and search and rescue. But such robots do come with some constraints—namely, if you’re going to throw or drop a robot, you should be prepared for that robot to not land the way you want it to land. While we’ve seen some creative approaches to this problem, or more straightforward self-righting devices, usually you’re in for significant trade-offs in complexity, mobility, and mass.

What would be ideal is a robot that can be relied upon to just always land the right way up. A robotic cat, of sorts. And while we’ve seen this with a tail, for wheeled vehicles, it turns out that a tail isn’t necessary: All it takes is some wheel spin.

The reason that AGRO (Agile Ground RObot), developed at the U.S. Military Academy at West Point, can do this is that each of its wheels is both independently driven and steerable. The wheels are essentially reaction wheels, which are a pretty common way to generate forces on all kinds of different robots, but typically you see such reaction wheels kludged onto these robots as sort of an afterthought—using the existing wheels of a wheeled robot is a more elegant way to do it.

Four steerable wheels with in-hub motors provide control in all three axes (yaw, pitch, and roll). You’ll notice that when the robot is tossed, the wheels all toe inwards (or outwards, I guess) by 45 degrees, positioning them orthogonal to the body of the robot. The front left and rear right wheels are spun together, as are the front right and rear left wheels. When one pair of wheels spins in the same direction, the body of the robot twists in the opposite way along an axis between those wheels, in a combination of pitch and roll. By combining different twisting torques from both pairs of wheels, pitch and roll along each axis can be adjusted independently. When the same pair of wheels spin in directions opposite to each other, the robot yaws, although yaw can also be derived by adjusting the ratio between pitch authority and roll authority. And lastly, if you want to sacrifice pitch control for more roll control (or vice versa) the wheel toe-in angle can be changed. Put all this together, and you get an enormous amount of mid-air control over your robot.
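To make the geometry concrete, here is a minimal sketch of how that pairing could map attitude errors to wheel spin commands. This is our own illustration with made-up gains and inertias, not the controller from the West Point paper.

```python
# Hypothetical sketch of the wheel-pair attitude control described above (our
# illustration, not the West Point controller). With all four wheels toed to 45
# degrees, the front-left/rear-right pair twists the body about one diagonal axis,
# the front-right/rear-left pair about the other, and differential spin between
# the two pairs produces yaw.
import numpy as np

KP = np.array([8.0, 8.0, 4.0])   # made-up PD gains on (pitch, roll, yaw) error
KD = np.array([1.5, 1.5, 0.8])
WHEEL_INERTIA = 2.0e-3           # kg*m^2, assumed in-hub rotor plus tire inertia

def wheel_accel_commands(att_err, rate_err):
    """Map (pitch, roll, yaw) attitude and rate errors to spin accelerations for
    the two diagonal wheel pairs (front-left & rear-right, front-right & rear-left)."""
    tau = KP * np.asarray(att_err) + KD * np.asarray(rate_err)   # desired body torque
    # Rotate the pitch/roll demand into the two 45-degree diagonal axes the pairs act on.
    tau_diag_a = (tau[0] + tau[1]) / np.sqrt(2)
    tau_diag_b = (tau[0] - tau[1]) / np.sqrt(2)
    # Same-direction spin of a pair reacts about its diagonal axis; opposite spin
    # between the two pairs contributes the yaw component.
    accel_pair_a = (tau_diag_a + 0.5 * tau[2]) / (2 * WHEEL_INERTIA)
    accel_pair_b = (tau_diag_b - 0.5 * tau[2]) / (2 * WHEEL_INERTIA)
    return accel_pair_a, accel_pair_b
```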

Image: Robotics Research Center/West Point

The AGRO robot features four steerable wheels with in-hub motors, which provide control in all three axes (yaw, pitch, and roll).

According to a paper that the West Point group will present at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the overall objective here is for the robot to reach a state of zero pitch or roll by the time the robot impacts with the ground, to distribute the impact as much as possible. AGRO doesn’t yet have a suspension to make falling actually safe, so in the short term, it lands on a foam pad, but the mid-air adjustments it’s currently able to make result in a 20 percent reduction of impact force and a 100 percent reduction in being sideways or upside-down.

The toss that you see in the video isn’t the most aggressive, but lead author Daniel J. Gonzalez tells us that AGRO can do much better, theoretically stabilizing from an initial condition of 22.5 degrees pitch and 22.5 degrees roll in a mere 250 milliseconds, with room for improvement beyond that through optimizing the angles of individual wheels in real time. The limiting factor is really the amount of time that AGRO has between the point at which it’s released and the point at which it hits the ground, since more time in the air gives the robot more time to change its orientation.

Given enough height, the current generation of AGRO can recover from any initial orientation as long as it’s spinning at 66 rpm or less. And the only reason this is a limitation at all is because of the maximum rotation speed of the in-wheel hub motors, which can be boosted by increasing the battery voltage, as Gonzalez and his colleagues, Mark C. Lesak, Andres H. Rodriguez, Joseph A. Cymerman, and Christopher M. Korpela from the Robotics Research Center at West Point, describe in the IROS paper, “Dynamics and Aerial Attitude Control for Rapid Emergency Deployment of the Agile Ground Robot AGRO.”
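As a back-of-the-envelope check on that time budget (our own arithmetic, not from the paper), a simple free-fall calculation shows how little height 250 milliseconds actually requires.

```python
# Back-of-the-envelope check on the airborne-time budget (our arithmetic, not from
# the IROS paper): how much drop height buys the ~250 ms cited for correcting a
# 22.5-degree tilt, and how much time a typical toss provides.
import math

G = 9.81  # gravity, m/s^2

def min_drop_height(correction_time_s):
    """Free-fall drop height needed to stay airborne at least this long (no drag)."""
    return 0.5 * G * correction_time_s ** 2

def fall_time(height_m):
    """Time from release to impact for a simple free fall from rest."""
    return math.sqrt(2 * height_m / G)

print(min_drop_height(0.25))   # ~0.31 m is enough for the 250 ms correction
print(fall_time(2.0))          # a 2 m drop gives ~0.64 s, more than twice the budget
```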

Image: Robotics Research Center/West Point

AGRO 2 will include a new hybrid wheel-leg and non-pneumatic tire design that will allow it to hop up stairs and curbs.

While these particular experiments focus on a robot that’s being thrown, the concept is potentially effective (and useful) on any wheeled robot that’s likely to find itself in mid-air. You can imagine it improving the performance of robots doing all sorts of stunts, from driving off ramps or ledges to being dropped out of aircraft. And as it turns out, being able to self-stabilize during an airdrop is an important skill that some Humvees could use to keep themselves from getting tangled in their own parachute lines and avoid mishaps.

Before they move on to Humvees, though, the researchers are working on the next version of AGRO named AGRO 2. AGRO 2 will include a new hybrid wheel-leg and non-pneumatic tire design that will allow it to hop up stairs and curbs, which sounds like a lot of fun to us.
