Tag Archives: robot

#437616 Innovative YUJIN 3D LiDAR, Now Shipping!

Yujin Robot recently launched a new 3D LiDAR for indoor service robots, AGVs/AMRs, and smart factories. The YRL3 series is a line of precise laser sensors for vertical and horizontal scanning to detect environments or objects. Designed for indoor applications, the YRL3 series uses an innovative 3D scanning LiDAR to provide a 270° (horizontal) x 90° (vertical) dynamic field of view as a single channel. The fundamental principle is direct ToF (Time of Flight), which measures distances to the surroundings. The YRL3 collects useful data including ranges, angles, intensities, and Cartesian coordinates (x, y, z). Real-time adjustment of the vertical scanning angle is possible, and the sensor is supported by a powerful software package for autonomous driving devices.
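To make those outputs concrete, here is a minimal Python sketch of how a direct-ToF return becomes a range and then the Cartesian coordinates the YRL3 reports. This is not Yujin's SDK; the function names and angle conventions are illustrative assumptions.

import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_s):
    # Direct ToF: the pulse travels to the target and back, so halve the path.
    return C * round_trip_s / 2.0

def to_cartesian(r, azimuth_deg, elevation_deg):
    # Assumed angle convention: azimuth within the 270-degree horizontal
    # FOV, elevation within the 90-degree vertical FOV.
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A return whose round trip takes ~66.7 ns is about 10 m away.
r = tof_to_range(66.7e-9)
print(to_cartesian(r, azimuth_deg=45.0, elevation_deg=10.0))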

“In recent years, our product lineup expanded to include models for the Fourth Industrial Revolution,” shares the marketing team of Yujin Robot. These models are Kobuki, the ROS reference research robot platform used by robotics research labs around the world; the Yujin LiDAR range-finding scanning sensor for LiDAR-based autonomous driving; and the AMS (Autonomous Mobility Solution) for customized autonomous driving. The company continues to push the boundaries of robotics and artificial intelligence, developing game-changing autonomous solutions that give companies around the world an edge over the competition.

Photo: Yujin


Posted in Human Robots

#437614 Video Friday: Poimo Is a Portable ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Engineers at the University of California San Diego have built a squid-like robot that can swim untethered, propelling itself by generating jets of water. The robot carries its own power source inside its body. It can also carry a sensor, such as a camera, for underwater exploration.

[ UCSD ]

Thanks Ioana!

Shark Robotics, French and European leader in Unmanned Ground Vehicles, is announcing today a disinfection add-on for Boston Dynamics Spot robot, designed to fight the COVID-19 pandemic. The Spot robot with Shark’s purpose-built disinfection payload can decontaminate up to 2,000 m² in 15 minutes, in any space that needs to be sanitized – such as hospitals, metro stations, offices, warehouses or facilities.

[ Shark Robotics ]

Here’s an update on the Poimo portable inflatable mobility project we wrote about a little while ago; while not strictly robotics, it seems like it holds some promise for rapidly developing different soft structures that robotics might find useful.

[ University of Tokyo ]

Thanks Ryuma!

Pretty cool that you can do useful force feedback teleop while video chatting through a “regular broadband Internet connection.” Although, what “regular” means to you is a bit subjective, right?

[ HEBI Robotics ]

Thanks Dave!

While NASA's Mars rover Perseverance travels through space toward the Red Planet, its nearly identical rover twin is hard at work on Earth. The vehicle system test bed (VSTB) rover named OPTIMISM is a full-scale engineering version of the Mars-bound rover. It is used to test hardware and software before the commands are sent up to the Perseverance rover.

[ NASA ]

Jacquard takes ordinary, familiar objects and enhances them with new digital abilities and experiences, while remaining true to their original purpose — like being your favorite jacket, backpack or a pair of shoes that you love to wear.

Our ambition is simple: to make life easier. By staying connected to your digital world, your things can do so much more. Skip a song by brushing your sleeve. Take a picture by tapping on a shoulder strap. Get reminded about the phone you left behind with a blink of light or a haptic buzz on your cuff.

[ Google ATAP ]

Should you attend the IROS 2020 workshop on “Planetary Exploration Robots: Challenges and Opportunities”? Of course you should!

[ Workshop ]

Kuka makes a lot of these videos where I can’t help but think that if they put as much effort into programming the robot as they did into producing the video, the result would be much more impressive.

[ Kuka ]

The Colorado School of Mines is one of the first customers to buy a Spot robot from Boston Dynamics to help with robotics research. Watch as scientists take Spot into the school's mine for the first time.

[ HCR ] via [ CNET ]

A very interesting soft(ish) actuator from Ayato Kanada at Kyushu University's Control Engineering Lab.

A flexible ultrasonic motor (FUSM) generates linear motion as a novel soft actuator. This motor consists of a single metal cube stator with a hole and an elastic elongated coil spring inserted into the hole. When voltages are applied to piezoelectric plates on the stator, the coil spring moves back and forth as a linear slider. Because the FUSM operates by friction drive, the most important parameter for optimizing its output is the preload between the stator and slider. The coil spring has a slightly larger diameter than the stator hole and generates the preload by expanding in the radial direction. The coil spring acts not only as a flexible slider but also as a resistive positional sensor: changes in the resistance between the stator and the coil spring end are converted to a voltage and used for position detection.
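That sensing trick is essentially a potentiometer: the stator contact taps the coil spring's resistance at a point proportional to slider position. Here is a minimal sketch of the voltage-to-position conversion, assuming a simple series voltage divider and a linear resistance profile; the component values are illustrative, not from the paper.

def position_from_voltage(v_meas, v_supply=5.0, r_fixed=10_000.0,
                          r_spring=1_000.0, stroke_mm=50.0):
    # Treat the tapped section of the spring as the lower leg of a
    # voltage divider; assume its resistance varies linearly with
    # slider position along the stroke.
    r_tap = v_meas * r_fixed / (v_supply - v_meas)
    fraction = min(max(r_tap / r_spring, 0.0), 1.0)
    return fraction * stroke_mm

# Example: 0.25 V across the tapped section maps to ~26 mm of travel.
print(position_from_voltage(0.25))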

[ Control Engineering Lab ]

Thanks Ayato!

We show how to use the limbs of a quadruped robot to identify fine-grained soil, representative of Martian regolith.

[ Paper ] via [ ANYmal Research ]

PR2 is serving breakfast and cleaning up afterwards. It’s slow, but all you have to do is eat and leave.

That poor PR2 is a little more naked than it's probably comfortable with.

[ EASE ]

NVIDIA researchers present a hierarchical framework that combines model-based control and reinforcement learning (RL) to synthesize robust controllers for a quadruped robot (the Unitree Laikago).

[ NVIDIA ]

What's interesting about this assembly task is that the robot is using its arm only for positioning, and doing the actual assembly with just fingers.

[ RC2L ]

In this electronics assembly application, Kawasaki's cobot duAro2 uses a tool changing station to tackle a multitude of tasks and assemble different CPU models.

Okay but can it apply thermal paste to a CPU in the right way? Personally, I find that impossible.

[ Kawasaki ]

You only need to watch this video long enough to appreciate the concept of putting a robot on a robot.

[ Impress ]

In this lecture, we’ll hear from the man behind one of the biggest robotics companies in the world, Boston Dynamics, whose robotic dog, Spot, has been used to encourage social distancing in Singapore and is now getting ready for FDA approval to be able to measure patients’ vital signs in hospitals.

[ Alan Turing Institute ]

Greg Kahn from UC Berkeley wrote in to share his recent dissertation talk on “Mobile Robot Learning.”

In order to create mobile robots that can autonomously navigate real-world environments, we need generalizable perception and control systems that can reason about the outcomes of navigational decisions. Learning-based methods, in which the robot learns to navigate by observing the outcomes of navigational decisions in the real world, offer considerable promise for obtaining these intelligent navigation systems. However, there are many challenges impeding mobile robots from autonomously learning to act in the real world, in particular (1) sample efficiency: how to learn using a limited amount of data? (2) supervision: how to tell the robot what to do? and (3) safety: how to ensure the robot and environment are not damaged or destroyed during learning? In this talk, I will present deep reinforcement learning methods for addressing these real-world mobile robot learning challenges and show results which enable ground and aerial robots to navigate in complex indoor and outdoor environments.

[ UC Berkeley ]

Thanks Greg!

Leila Takayama from UC Santa Cruz (and previously Google X and Willow Garage) gives a talk entitled “Toward a more human-centered future of robotics.”

Robots are no longer only in outer space, in factory cages, or in our imaginations. We interact with robotic agents when withdrawing cash from bank ATMs, driving cars with adaptive cruise control, and tuning our smart home thermostats. In the moment of those interactions with robotic agents, we behave in ways that do not necessarily align with the rational belief that robots are just plain machines. Through a combination of controlled experiments and field studies, we use theories and concepts from the social sciences to explore ways that human and robotic agents come together, including how people interact with personal robots and how people interact through telepresence robots. Together, we will explore topics and raise questions about the psychology of human-robot interaction and how we could invent a future of a more human-centered robotics that we actually want to live in.

[ Leila Takayama ]

Roboticist and stand-up comedian Naomi Fitter from Oregon State University gives a talk on “Everything I Know about Telepresence.”

Telepresence robots hold promise to connect people by providing videoconferencing and navigation abilities in far-away environments. At the same time, the impacts of current commercial telepresence robots are not well understood, and circumstances of robot use including internet connection stability, odd personalizations, and interpersonal relationships between a robot operator and people co-located with the robot can overshadow the benefit of the robot itself. And although the idea of telepresence robots has been around for over two decades, available nonverbal expressive abilities through telepresence robots are limited, and suitable operator user interfaces for the robot (for example, controls that allow the operator to hold a conversation and move the robot simultaneously) remain elusive. So where should we be using telepresence robots? Are there any pitfalls to watch out for? What do we know about potential robot expressivity and user interfaces? This talk will cover my attempts to address these questions and ways in which the robotics research community can build off of this work.

[ Talking Robotics ]

Posted in Human Robots

#437610 How Intel’s OpenBot Wants to Make ...

You could make a pretty persuasive argument that the smartphone represents the single fastest area of technological progress we’re going to experience for the foreseeable future. Every six months or so, there’s something with better sensors, more computing power, and faster connectivity. Many different areas of robotics are benefiting from this on a component level, but over at Intel Labs, they’re taking a more direct approach with a project called OpenBot that turns US $50 worth of hardware and your phone into a mobile robot that can support “advanced robotics workloads such as person following and real-time autonomous navigation in unstructured environments.”

This work aims to address two key challenges in robotics: accessibility and scalability. Smartphones are ubiquitous and are becoming more powerful by the year. We have developed a combination of hardware and software that turns smartphones into robots. The resulting robots are inexpensive but capable. Our experiments have shown that a $50 robot body powered by a smartphone is capable of person following and real-time autonomous navigation. We hope that the presented work will open new opportunities for education and large-scale learning via thousands of low-cost robots deployed around the world.

Smartphones point to many possibilities for robotics that we have not yet exploited. For example, smartphones also provide a microphone, speaker, and screen, which are not commonly found on existing navigation robots. These may enable research and applications at the confluence of human-robot interaction and natural language processing. We also expect the basic ideas presented in this work to extend to other forms of robot embodiment, such as manipulators, aerial vehicles, and watercraft.

One of the interesting things about this idea is how not-new it is. The highest profile phone robot was likely the $150 Romo, from Romotive, which raised a not-insignificant amount of money on Kickstarter in 2012 and 2013 for a little mobile chassis that accepted one of three different iPhone models and could be controlled via another device or operated somewhat autonomously. It featured “computer vision, autonomous navigation, and facial recognition” capabilities, but was really designed to be a toy. Lack of compatibility hampered Romo a bit, and there wasn’t a lot that it could actually do once the novelty wore off.

As impressive as smartphone hardware was in a robotics context (even back in 2013), we’re obviously way, way beyond that now, and OpenBot figures that smartphones now have enough clout and connectivity that turning them into mobile robots is a good idea. You know, again. We asked Intel Labs’ Matthias Mueller why now was the right time to launch OpenBot, and he mentioned things like the existence of a large maker community with broad access to 3D printing as well as open source software that makes broader development easier.

And of course, there’s the smartphone hardware: “Smartphones have become extremely powerful and feature dedicated AI processors in addition to CPUs and GPUs,” says Mueller. “Almost everyone owns a very capable smartphone now. There has been a big boost in sensor performance, especially in cameras, and a lot of the recent developments for VR applications are well aligned with robotic requirements for state estimation.” OpenBot has been tested with 10 recent Android phones, and since camera placement tends to be similar and USB-C is becoming the charging and communications standard, compatibility is less of an issue nowadays.

Image: OpenBot

Intel researchers created this table comparing OpenBot to other wheeled robot platforms, including Amazon’s DeepRacer, MIT’s Duckiebot, iRobot’s Create-2, and Thymio. The top group includes robots based on RC trucks; the bottom group includes navigation robots for deployment at scale and in education. Note that the cost of the smartphone needed for OpenBot is not included in this comparison.

If you’d like an OpenBot of your own, you don’t need to know all that much about robotics hardware or software. For the hardware, you probably need some basic mechanical and electronics experience—think Arduino project level. The software is a little more complicated; there’s a pretty good walkthrough to get some relatively sophisticated behaviors (like autonomous person following) up and running, but things rapidly degenerate into a command-line interface that could be intimidating for new users. We did ask about why OpenBot isn’t ROS-based to leverage the robustness and reach of that community, and Mueller said that ROS “adds unnecessary overhead,” although “if someone insists on using ROS with OpenBot, it should not be very difficult.”
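For a sense of the general pattern OpenBot follows (the smartphone acts as the brain, a low-cost microcontroller drives the motors), here is a hypothetical Python sketch of a serial drive command. The message format, port name, and baud rate are assumptions for illustration, not OpenBot's actual protocol; the project's repository documents the real interface.

import serial  # pyserial

def drive(port, left, right):
    # left/right are wheel commands in [-1, 1], scaled to 8-bit PWM values.
    # The "c<left>,<right>\n" message format is hypothetical.
    l = int(max(-1.0, min(1.0, left)) * 255)
    r = int(max(-1.0, min(1.0, right)) * 255)
    port.write(f"c{l},{r}\n".encode("ascii"))

with serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1) as port:
    drive(port, 0.5, 0.5)   # drive forward at half speed
    drive(port, 0.5, -0.5)  # spin in place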

Without building OpenBot to explicitly be part of an existing ecosystem, the challenge going forward is to make sure that the project is consistently supported, lest it wither and die like so many similar robotics projects have before it. “We are committed to the OpenBot project and will do our best to maintain it,” Mueller assures us. “We have a good track record. Other projects from our group (e.g. CARLA, Open3D, etc.) have also been maintained for several years now.” The inherently open source nature of the project certainly helps, although it can be tricky to rely too much on community contributions, especially when something like this is first starting out.

The OpenBot folks at Intel, we’re told, are already working on a “bigger, faster and more powerful robot body that will be suitable for mass production,” which would certainly help entice more people into giving this thing a go. They’ll also be focusing on documentation, which is probably the most important but least exciting part about building a low-cost, community-focused platform like this. And as soon as they’ve put together a way for us actual novices to turn our phones into robots that can do cool stuff for cheap, we’ll definitely let you know.

Posted in Human Robots

#437608 Video Friday: Agility Robotics Raises ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Digit is now in full commercial production and we’re excited to announce a $20M funding round co-led by DCVC and Playground Global!

Digits for everyone!

[ Agility Robotics ]

A flexible rover that has both the ability to travel long distances and to rappel down hard-to-reach areas of scientific interest has undergone a field test in the Mojave Desert in California to showcase its versatility. Composed of two Axel robots, DuAxel is designed to explore crater walls, pits, scarps, vents, and other extreme terrain on the moon, Mars, and beyond.

This technology demonstration developed at NASA’s Jet Propulsion Laboratory in Southern California showcases the robot’s ability to split in two and send one of its halves — a two-wheeled Axel robot — over an otherwise inaccessible slope, using a tether as support and to supply power.

The rappelling Axel can then autonomously seek out areas to study, safely overcome slopes and rocky obstacles, and then return to dock with its other half before driving to another destination. Although the rover doesn’t yet have a mission, key technologies are being developed that might, one day, help us explore the rocky planets and moons throughout the solar system.

[ JPL ]

A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.

[ Purdue ]

This video shows the latest results in the whole-body locomotion control of the humanoid robot iCub achieved by the Dynamic Interaction Control line at IIT-Istituto Italiano di Tecnologia in Genova (Italy). In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the possibility of harming humans who share the robot’s surrounding environment.

This is super impressive, considering that iCub was only able to crawl and was still tethered not too long ago. Also, it seems to be blinking properly now, so it doesn’t look like it’s always sleepy.

[ IIT ]

This video shows a set of new tests we performed on Bolt. We conducted tests on five different scenarios: 1) walking forward/backward, 2) uneven surface, 3) soft surface, 4) push recovery, and 5) slippage recovery. Thanks to our feedback control based on Model Predictive Control, the robot can walk in the presence of all these uncertainties. We will open-source all the code in the near future.

[ ODRI ]

The title of this video is “Can you throw your robot into a lake?” The title of this video should be, “Can you throw your robot into a lake and drive it out again?”

[ Norlab ]

AeroVironment Successfully Completes Sunglider Solar HAPS Stratospheric Test Flight, Surpassing 60,000 Feet Altitude and Demonstrating Broadband Mobile Connectivity.

[ AeroVironment ]

We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scaled user actions such as pushing or leaning; (2) acting on users by pulling or transporting them; and (3) carrying multiple potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other.

[ DeepAI ]

In a new video, personnel from Swiss energy supply company Kraftwerke Oberhasli AG (KWO) explain how they were able to keep employees out of harm’s way by using Flyability’s Elios 2 to collect visual data while building a new dam.

[ Flyability ]

Enjoy our Ascento robot fail compilation! With every failure we experience, we learn more and we can improve our robot for its next iteration, which will come soon… Stay tuned for more!

FYI posting a robot fails video will pretty much guarantee you a spot in Video Friday!

[ Ascento ]

Humans are remarkably good at using chopsticks. Guinness World Records witnessed a person using chopsticks to pick up 65 M&Ms in just a minute. We aim to collect demonstrations from humans and to teach robots to use chopsticks.

[ UW Personal Robotics Lab ]

A surprising amount of personality from these Yaskawa assembly robots.

[ Yaskawa ]

This paper presents the system design, modeling, and control of the Aerial Robotic Chain Manipulator. This new robot design offers the potential to exert strong forces and moments to the environment, carry and lift significant payloads, and simultaneously navigate through narrow corridors. The presented experimental studies include a valve rotation task, a pick-and-release task, and the verification of load oscillation suppression to demonstrate the stability and performance of the system.

[ ARL ]

Whether animals or plants, whether in the water, on land or in the air, nature provides the model for many technical innovations and inventions. This is summed up in the term bionics, which is a combination of the words ‘biology’ and ‘electronics’. At Festo, learning from nature has a long history, as our Bionic Learning Network is based on using nature as the source for future technologies like robots, assistance systems or drive solutions.

[ Festo ]

Dogs! Selfies! Thousands of LEGO bricks! This video has it all.

[ LEGO ]

An IROS workshop talk on “Cassie and Mini Cheetah Autonomy” by Maani Ghaffari and Jessy Grizzle from the University of Michigan.

[ Michigan Robotics ]

David Schaefer’s Cozmo robots are back with this mind-blowing dance-off!

What you just saw represents hundreds of hours of work, David tells us: “I wrote over 10,000 lines of code to create the dance performance as I had to translate the beats per minute of the song into motor rotations in order to get the right precision needed to make the moves look sharp. The most challenging move was the SpongeBob SquareDance as any misstep would send the Cozmos crashing into each other. LOL! Fortunately for me, Cozmo robots are pretty resilient.”

[ Life with Cozmo ]

Thanks David!

This week’s GRASP on Robotics seminar is by Sangbae Kim from MIT, on “Robots with Physical Intelligence.”

While industrial robots are effective in repetitive, precise kinematic tasks in factories, the design and control of these robots are not suited for physically interactive performance that humans do easily. These tasks require ‘physical intelligence’ through complex dynamic interactions with environments whereas conventional robots are designed primarily for position control. In order to develop a robot with ‘physical intelligence’, we first need a new type of machine that allows dynamic interactions. This talk will discuss how the new design paradigm allows dynamic interactive tasks. As an embodiment of such a robot design paradigm, the latest version of the MIT Cheetah robots and force-feedback teleoperation arms will be presented.

[ GRASP ]

This week’s CMU RI Seminar is by Kevin Lynch from Northwestern, on “Robotics and Biosystems.”

Research at the Center for Robotics and Biosystems at Northwestern University encompasses bio-inspiration, neuromechanics, human-machine systems, and swarm robotics, among other topics. In this talk I will give an overview of some of our recent work on in-hand manipulation, robot locomotion on yielding ground, and human-robot systems.

[ CMU RI ]

Posted in Human Robots

#437603 Throwable Robot Car Always Lands on Four ...

Throwable or droppable robots seem like a great idea for a bunch of applications, including exploration and search and rescue. But such robots do come with some constraints—namely, if you’re going to throw or drop a robot, you should be prepared for that robot to not land the way you want it to land. While we’ve seen some creative approaches to this problem, or more straightforward self-righting devices, usually you’re in for significant trade-offs in complexity, mobility, and mass.

What would be ideal is a robot that can be relied upon to just always land the right way up. A robotic cat, of sorts. And while we’ve seen this with a tail, for wheeled vehicles, it turns out that a tail isn’t necessary: All it takes is some wheel spin.

The reason that AGRO (Agile Ground RObot), developed at the U.S. Military Academy at West Point, can do this is that each of its wheels is both independently driven and steerable. The wheels are essentially reaction wheels, which are a pretty common way to generate forces on all kinds of different robots, but typically you see such reaction wheels kludged onto these robots as sort of an afterthought—using the existing wheels of a wheeled robot is a more elegant way to do it.

Four steerable wheels with in-hub motors provide control in all three axes (yaw, pitch, and roll). You’ll notice that when the robot is tossed, the wheels all toe inwards (or outwards, I guess) by 45 degrees, positioning them orthogonal to the body of the robot. The front left and rear right wheels are spun together, as are the front right and rear left wheels. When one pair of wheels spins in the same direction, the body of the robot twists in the opposite way along an axis between those wheels, in a combination of pitch and roll. By combining different twisting torques from both pairs of wheels, pitch and roll along each axis can be adjusted independently. When the same pair of wheels spin in directions opposite to each other, the robot yaws, although yaw can also be derived by adjusting the ratio between pitch authority and roll authority. And lastly, if you want to sacrifice pitch control for more roll control (or vice versa) the wheel toe-in angle can be changed. Put all this together, and you get an enormous amount of mid-air control over your robot.
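As a rough illustration of that mixing logic (not the authors' controller; the signs, axes, and scaling below are assumptions), the mapping from desired body torques to the two diagonal wheel pairs might look like this in Python:

import math

def wheel_pair_commands(tau_pitch, tau_roll, tau_yaw):
    # With all four wheels toed to 45 degrees, the front-left/rear-right
    # pair torques the body about one diagonal axis and the front-right/
    # rear-left pair about the other; pitch and roll are projections onto
    # those diagonals. Spinning the wheels of a pair against each other
    # yields yaw. Signs and scaling here are assumptions.
    s = 1.0 / math.sqrt(2.0)             # 45-degree projection factor
    pair_a = s * (tau_pitch + tau_roll)  # common-mode spin, pair A
    pair_b = s * (tau_pitch - tau_roll)  # common-mode spin, pair B
    yaw = tau_yaw / 2.0                  # differential spin within each pair
    return {
        "front_left":  pair_a + yaw,
        "rear_right":  pair_a - yaw,
        "front_right": pair_b + yaw,
        "rear_left":   pair_b - yaw,
    }

print(wheel_pair_commands(tau_pitch=1.0, tau_roll=0.0, tau_yaw=0.2))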

Image: Robotics Research Center/West Point

The AGRO robot features four steerable wheels with in-hub motors, which provide control in all three axes (yaw, pitch, and roll).

According to a paper that the West Point group will present at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the overall objective here is for the robot to reach a state of zero pitch or roll by the time the robot impacts with the ground, to distribute the impact as much as possible. AGRO doesn’t yet have a suspension to make falling actually safe, so in the short term, it lands on a foam pad, but the mid-air adjustments it’s currently able to make result in a 20 percent reduction of impact force and a 100 percent reduction in being sideways or upside-down.

The toss that you see in the video isn’t the most aggressive, but lead author Daniel J. Gonzalez tells us that AGRO can do much better, theoretically stabilizing from an initial condition of 22.5 degrees pitch and 22.5 degrees roll in a mere 250 milliseconds, with room for improvement beyond that through optimizing the angles of individual wheels in real time. The limiting factor is really the amount of time that AGRO has between the point at which it’s released and the point at which it hits the ground, since more time in the air gives the robot more time to change its orientation.
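Since air time is the whole budget here, a quick back-of-the-envelope check (free fall from rest, drag ignored) shows what that 250-millisecond window implies about drop height:

import math

G = 9.81  # gravitational acceleration, m/s^2

def fall_time(height_m):
    # Time to fall from rest through height_m, ignoring drag.
    return math.sqrt(2.0 * height_m / G)

print(fall_time(0.31))  # ~0.25 s: the claimed stabilization window
print(fall_time(2.0))   # ~0.64 s: a 2 m drop leaves comfortable margin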

Given enough height, the current generation of AGRO can recover from any initial orientation as long as it’s spinning at 66 rpm or less. And the only reason this is a limitation at all is because of the maximum rotation speed of the in-wheel hub motors, which can be boosted by increasing the battery voltage, as Gonzalez and his colleagues, Mark C. Lesak, Andres H. Rodriguez, Joseph A. Cymerman, and Christopher M. Korpela from the Robotics Research Center at West Point, describe in the IROS paper, “Dynamics and Aerial Attitude Control for Rapid Emergency Deployment of the Agile Ground Robot AGRO.”

Image: Robotics Research Center/West Point

AGRO 2 will include a new hybrid wheel-leg and non-pneumatic tire design that will allow it to hop up stairs and curbs.

While these particular experiments focus on a robot that’s being thrown, the concept is potentially effective (and useful) on any wheeled robot that’s likely to find itself in mid-air. You can imagine it improving the performance of robots doing all sorts of stunts, from driving off ramps or ledges to being dropped out of aircraft. And as it turns out, being able to self-stabilize during an airdrop is an important skill that some Humvees could use to keep themselves from getting tangled in their own parachute lines and avoid mishaps.

Before they move on to Humvees, though, the researchers are working on the next version of AGRO named AGRO 2. AGRO 2 will include a new hybrid wheel-leg and non-pneumatic tire design that will allow it to hop up stairs and curbs, which sounds like a lot of fun to us.

Posted in Human Robots