Tag Archives: signs

#436546 How AI Helped Predict the Coronavirus ...

Coronavirus has been all over the news for the last couple of weeks. A dedicated hospital sprang up in just eight days, the stock market took a hit, Chinese New Year celebrations were spoiled, and travel restrictions are in effect.

But let’s rewind a bit; some crucial events took place before we got to this point.

A little under two weeks before the World Health Organization (WHO) alerted the public of the coronavirus outbreak, a Canadian artificial intelligence company was already sounding the alarm. BlueDot uses AI-powered algorithms to analyze information from a multitude of sources to identify disease outbreaks and forecast how they may spread. On December 31st 2019, the company sent out a warning to its customers to avoid Wuhan, where the virus originated. The WHO didn’t send out a similar public notice until January 9th, 2020.

The story of BlueDot’s early warning is the latest example of how AI can improve our identification of and response to new virus outbreaks.

Predictions Are Bad News
Global pandemic or relatively minor scare? The jury is still out on the coronavirus. However, the math suggests the worst is yet to come.

Scientists are still working to determine how infectious the virus is. Initial analysis suggests it may fall somewhere between influenza and polio on the reproduction number (R0) scale, which indicates how many new cases a single case gives rise to.

UK- and US-based researchers have published a preliminary paper estimating that the confirmed infections in Wuhan represent only five percent of those who are actually infected. If the models are correct, some 190,000 people in Wuhan are already infected, major Chinese cities are on the cusp of large-scale outbreaks, and the virus will continue to spread to other countries.
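To make the arithmetic concrete, here’s a minimal back-of-envelope sketch in Python. The five percent detection rate comes from the paper above; the confirmed-case count is simply back-solved from the 190,000 estimate, and the R0 value is an assumed midpoint, not a published figure.

```python
# Back-of-envelope check on the numbers above (illustrative values only).

def estimate_true_cases(confirmed: int, detection_rate: float) -> int:
    """Scale confirmed cases up by an assumed detection rate."""
    return round(confirmed / detection_rate)

def project_cases(initial: int, r0: float, generations: int) -> int:
    """Naive unchecked growth: each case causes r0 new cases per generation."""
    return round(initial * r0 ** generations)

# ~9,500 confirmed cases detected at a 5% rate imply ~190,000 infections.
print(estimate_true_cases(9_500, 0.05))  # -> 190000

# Early estimates put the virus between seasonal flu (~1.3) and polio on the
# R0 scale; 2.5 here is an assumed middle value.
print(project_cases(190_000, 2.5, 3))    # -> 2968750 after three generations
```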

Finding the Start
The spread of a given virus is partly linked to how long it remains undetected. Identifying a new virus is the first step towards mobilizing a response and, in time, creating a vaccine. Warning at-risk populations as quickly as possible also helps with limiting the spread.

These are among the reasons why BlueDot’s achievement is important in and of itself. Furthermore, it illustrates how AIs can sift through vast troves of data to identify ongoing virus outbreaks.

BlueDot uses natural language processing and machine learning to scour a variety of information sources, including chomping through 100,000 news reports in 65 languages a day. Data is compared with flight records to help predict virus outbreak patterns. Once the automated data sifting is completed, epidemiologists check that the findings make sense from a scientific standpoint, and reports are sent to BlueDot’s customers, which include governments, businesses, and public health organizations.
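BlueDot’s system itself is proprietary, but the shape of the pipeline described above is easy to sketch. In the toy Python version below, a keyword matcher stands in for the real multilingual NLP models, and every data structure, name, and value is hypothetical.

```python
# Toy outbreak-monitoring pipeline in the spirit of the description above.
# A keyword match stands in for real NLP; all data here is hypothetical.
from dataclasses import dataclass

OUTBREAK_TERMS = {"pneumonia", "outbreak", "unexplained illness"}

@dataclass
class Report:
    location: str
    text: str

def flag_reports(reports: list[Report]) -> list[Report]:
    """Crude stand-in for ML-based outbreak detection over news reports."""
    return [r for r in reports if any(t in r.text.lower() for t in OUTBREAK_TERMS)]

def at_risk_destinations(origin: str, flights: dict[str, list[str]]) -> list[str]:
    """Cross-reference a flagged origin against flight connections."""
    return flights.get(origin, [])

reports = [Report("Wuhan", "Cluster of unexplained illness near seafood market")]
flights = {"Wuhan": ["Bangkok", "Tokyo", "Singapore"]}

for r in flag_reports(reports):  # automated sifting...
    print(r.location, "->", at_risk_destinations(r.location, flights))
# ...followed by human epidemiologists checking the findings before
# reports go out to customers.
```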

AI for Virus Detection and Prevention
Other companies, such as Metabiota, are also using data-driven approaches to track the spread of diseases like the coronavirus.

Researchers have trained neural networks to predict the spread of infectious diseases in real time. Others are using AI algorithms to identify how preventive measures can have the greatest effect. AI is also being used to create new drugs, an approach we may well see applied to the coronavirus.

If the work of scientists Barbara Han and David Redding comes to fruition, AI and machine learning may even help us predict where virus outbreaks are likely to strike—before they do.

The Uncertainty Factor
One of AI’s core strengths in identifying and limiting the effects of virus outbreaks is its tireless persistence. AIs never fatigue, can sift through enormous amounts of data, and can identify possible correlations and causations that humans can’t.

However, there are limits to AI’s ability to both identify virus outbreaks and predict how they will spread. Perhaps the best-known example comes from the neighboring field of big data analytics. At its launch, Google Flu Trends was heralded as a great leap forward in identifying and estimating the spread of the flu—until it overestimated the 2013 flu season by a whopping 140 percent and was quietly put to rest.

Poor data quality was identified as one of the main reasons Google Flu Trends failed. Unreliable or faulty data can wreak havoc on the prediction power of AIs.

In our increasingly interconnected world, tracking the movements of potentially infected individuals (by car, train, bus, or plane) is just one variable surrounded by a great deal of uncertainty.

The fact that BlueDot was able to correctly identify the coronavirus, in part due to its AI technology, illustrates that smart computer systems can be incredibly useful in helping us navigate these uncertainties.

Importantly, though, this isn’t the same as AI being at a point where it unerringly does so on its own—which is why BlueDot employs human experts to validate the AI’s findings.

Image Credit: Coronavirus molecular illustration, Gianluca Tomasello/Wikimedia Commons

Posted in Human Robots

#436530 How Smart Roads Will Make Driving ...

Roads criss-cross the landscape, but while they provide vital transport links, in many ways they represent a huge amount of wasted space. Advances in “smart road” technology could change that, creating roads that can harvest energy from cars, detect speeding, automatically weigh vehicles, and even communicate with smart cars.

“Smart city” projects are popping up in countries across the world thanks to advances in wireless communication, cloud computing, data analytics, remote sensing, and artificial intelligence. Transportation is a crucial element of most of these plans, but while much of the focus is on public transport solutions, smart roads are increasingly being seen as a crucial feature of these programs.

New technology is making it possible to tackle a host of issues including traffic congestion, accidents, and pollution, say the authors of a paper in the journal Proceedings of the Royal Society A. And they’ve outlined ten of the most promising advances under development or in planning stages that could feature on tomorrow’s roads.

Energy harvesting

A variety of energy harvesting technologies integrated into roads have been proposed as ways to power street lights and traffic signals or provide a boost to the grid. Photovoltaic panels could be built into the road surface to capture sunlight, or piezoelectric materials installed beneath the asphalt could generate current when deformed by vehicles passing overhead.
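The paper doesn’t quote output figures, but a rough order-of-magnitude estimate is straightforward; every number in the sketch below is an assumption rather than a measured value.

```python
# Order-of-magnitude output for photovoltaic panels in one lane-km of road
# (all values are assumptions, not figures from the paper).
lane_width_m = 3.5
length_m = 1_000
irradiance_w_per_m2 = 1_000   # bright midday sun
panel_efficiency = 0.10       # road-embedded panels underperform rooftop ones
derating = 0.5                # dirt, shading by traffic, flat mounting angle

peak_power_kw = (lane_width_m * length_m * irradiance_w_per_m2
                 * panel_efficiency * derating) / 1_000
print(f"~{peak_power_kw:.0f} kW peak per lane-km")  # ~175 kW
```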

Musical roads

Countries like Japan, Denmark, the Netherlands, Taiwan, and South Korea have built roads that play music as cars pass by. By varying the spacing of rumble strips, it’s possible to produce a series of different notes as vehicles drive over them. The aim is generally to warn of hazards or help drivers keep to the speed limit.
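The underlying physics is simple: a car crossing strips spaced a distance d apart at speed v produces a tone at frequency v/d, so each note dictates a spacing. A small sketch, with the design speed and note choices assumed:

```python
# Rumble-strip spacing for a melody: strips spaced d meters apart, driven at
# v m/s, sound at v/d Hz, so spacing = speed / note_frequency.
speed_kmh = 60                # assumed design speed
speed_ms = speed_kmh / 3.6

notes_hz = {"C4": 261.6, "E4": 329.6, "G4": 392.0}  # example target pitches
for note, freq in notes_hz.items():
    spacing_cm = speed_ms / freq * 100
    print(f"{note}: a strip every {spacing_cm:.1f} cm at {speed_kmh} km/h")
```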

Automatic weighing

Weigh-in-motion technology that measures vehicles’ loads as they drive slowly through a designated lane has been around since the 1970s, but more recently high-speed weigh-in-motion tech has made it possible to measure vehicles as they travel at regular highway speeds. The latest advance has been integration with automatic licence plate reading and wireless communication to allow continuous remote monitoring, both to enforce weight restrictions and to monitor wear on roads.
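Put together, the monitoring loop might look something like the sketch below; the axle limit, record format, and reporting step are all hypothetical.

```python
# Minimal sketch of remote weigh-in-motion enforcement (values hypothetical).
from dataclasses import dataclass

AXLE_LIMIT_KG = 10_000  # assumed per-axle legal limit

@dataclass
class WimReading:
    plate: str                # from automatic licence plate reading
    axle_loads_kg: list[int]  # from in-road weigh-in-motion sensors
    speed_kmh: float

def check(reading: WimReading) -> None:
    overloaded = [w for w in reading.axle_loads_kg if w > AXLE_LIMIT_KG]
    if overloaded:
        # A deployed system would report this over a wireless link to the
        # road authority rather than print it.
        print(f"Violation: {reading.plate} axles over limit: {overloaded}")

check(WimReading("AB12 CDE", [9_200, 11_500, 10_050], 88.0))
```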

Vehicle charging

The growing popularity of electric vehicles has spurred the development of technology to charge cars and buses as they drive. The most promising of these approaches is magnetic induction, which involves burying cables beneath the road to generate electromagnetic fields that a receiver device in the car then transforms into electrical power to charge batteries.

Smart traffic signs

Traffic signs aren’t always as visible as they should be, and it can often be hard to remember what all of them mean. So there are now proposals for “smart signs” that wirelessly beam a sign’s content to oncoming cars fitted with receivers, which can then alert the driver verbally or on the car’s display. The approach isn’t affected by poor weather and lighting, can be reprogrammed easily, and could do away with the need for complex sign recognition technology in future self-driving cars.

Traffic violation detection and notification

Sensors and cameras can be combined with these same smart signs to detect and automatically notify drivers of traffic violations. Because the notifications are transmitted automatically and a record is stored on the car’s black box, drivers won’t be able to deny they’ve seen the warnings or been notified of any fines.

Talking cars

Car-to-car communication technology and V2X, which lets cars share information with any other connected device, are becoming increasingly common. Inter-car communication can be used to propagate accident or traffic-jam alerts to prevent congestion, while letting vehicles communicate with infrastructure can help traffic signals dynamically manage their timers to keep traffic flowing or automatically collect tolls.

Smart intersections

Combining sensors and cameras with object recognition systems that can detect vehicles and other road users can help increase safety and efficiency at intersections. Such systems can be used to extend green lights for slower road users like pedestrians and cyclists, sense jaywalkers, give priority to emergency vehicles, and dynamically adjust light timers to optimize traffic flow. Information can even be broadcast to oncoming vehicles to highlight blind spots and potential hazards.

Automatic crash detection

There’s a “golden hour” after an accident in which the chance of saving lives is greatly increased. Vehicle communication technology can ensure that notification of a crash reaches the emergency services rapidly, and can also provide vital information about the number and type of vehicles involved, which can help emergency response planning. It can also be used to alert other drivers to slow down or stop to prevent further accidents.

Smart street lights

Street lights are increasingly being embedded with sensors, wireless connectivity, and micro-controllers to enable a variety of smart functions. These include motion activation to save energy, provision of wireless access points, and monitoring of air quality, parking, and litter. The same hardware can send automatic maintenance requests if a light is faulty, and can even allow neighboring lights to be brightened automatically to compensate.

Image Credit: Image by David Mark from Pixabay

Posted in Human Robots

#436174 How Selfish Are You? It Matters for ...

Our personalities impact almost everything we do, from the career path we choose to the way we interact with others to how we spend our free time.

But what about the way we drive—could personality be used to predict whether a driver will cut someone off, speed, or, say, zoom through a yellow light instead of braking?

There must be something to the idea that those of us who are more mild-mannered are likely to drive a little differently than the more assertive among us. At least, that’s what a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is betting on.

“Working with and around humans means figuring out their intentions to better understand their behavior,” said graduate student Wilko Schwarting, lead author on the paper published this week in Proceedings of the National Academy of Sciences. “People’s tendencies to be collaborative or competitive often spills over into how they behave as drivers. In this paper we sought to understand if this was something we could actually quantify.”

The team is building a model that classifies drivers according to how selfish or selfless they are, then uses that classification to help predict how drivers will behave on the road. Ideally, the system will help improve safety for self-driving cars by integrating a degree of ‘humanity’ into how their software perceives its surroundings; right now, human drivers and their cars are just more objects, not much different from a tree or a sign.

But unlike trees and signs, humans have behavioral patterns and motivations. For greater success on roads that are still dominated by us mercurial humans, the CSAIL team believes, driverless cars should take our personalities into account.

How Selfish Are You?
How important is your own well-being to you versus the well-being of other people? It’s a hard question to answer without specifying who the other people are; your answer would likely differ if we’re talking about your friends, loved ones, strangers, or people you actively dislike.

In social psychology, social value orientation (SVO) refers to people’s preferences for allocating resources between themselves and others. The two broad categories people can fall into are pro-social (people who are more cooperative, and expect cooperation from others) and pro-self (pretty self-explanatory: “Me first!”).

Based on drivers’ behavior in two different road scenarios—merging and making a left turn—the CSAIL team’s model classified drivers as pro-social or egoistic. Slowing down to let someone merge into your lane in front of you would earn you a pro-social classification, while cutting someone off or not slowing down to allow a left turn would make you egoistic.
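In the SVO literature, a person’s orientation is often summarized as an angle in the plane of (reward to self, reward to other). Whether CSAIL’s model uses exactly this formulation is an assumption here, and the reward values and threshold below are made up for illustration.

```python
# SVO as an angle in the (reward-to-self, reward-to-other) plane, a common
# convention in social psychology; the exact CSAIL formulation may differ.
# All reward values and the classification threshold are made up.
import math

def svo_angle_deg(reward_self: float, reward_other: float) -> float:
    """Angle of the trade-off a driver appears to make: roughly 0 degrees is
    purely selfish, 45 degrees weighs self and others equally."""
    return math.degrees(math.atan2(reward_other, reward_self))

def classify(angle_deg: float) -> str:
    return "prosocial" if angle_deg > 22.5 else "egoistic"

# Braking to let someone merge: small cost to self, large benefit to other.
print(classify(svo_angle_deg(reward_self=-0.2, reward_other=1.0)))  # prosocial
# Cutting someone off: small gain to self, large cost to other.
print(classify(svo_angle_deg(reward_self=0.3, reward_other=-1.0)))  # egoistic
```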

On the Road
The system then uses these classifications to model and predict drivers’ behavior. The team demonstrated that using their model, errors in predicting the behavior of other cars were reduced by 25 percent.

In a left-turn simulation, for example, their car would wait when an approaching car had an egoistic driver, but go ahead and make the turn when the other driver was prosocial. Similarly, if a self-driving car is trying to merge into the left lane and it’s identified the drivers in that lane as egoistic, it will assume they won’t slow down to let it in, and will wait to merge behind them. If, on the other hand, the self-driving car knows that the human drivers in the left lane are prosocial, it will attempt to merge between them since they’re likely to let it in.
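The behavior in these examples reduces to a simple policy conditioned on the predicted orientation. A toy rendering (in the real system the decision comes from a predictive model, not a lookup):

```python
# Toy policy mirroring the left-turn and merge examples above.
def left_turn_action(oncoming_driver: str) -> str:
    """Wait out egoistic oncoming drivers; turn ahead of prosocial ones."""
    return "make the turn" if oncoming_driver == "prosocial" else "wait"

def merge_action(lane_drivers: list[str]) -> str:
    """Merge between prosocial drivers likely to yield, else fall in behind."""
    if all(d == "prosocial" for d in lane_drivers):
        return "merge between them"
    return "merge behind them"

print(left_turn_action("egoistic"))              # wait
print(merge_action(["prosocial", "prosocial"]))  # merge between them
```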

So how does this all translate to better safety?

It’s essentially a starting point for imbuing driverless cars with some of the abilities and instincts that are innate to humans. If you’re driving down the highway and you see a car swerving outside its lane, you’ll probably distance yourself from that car because you know it’s more likely to cause an accident. Our senses take in information we can immediately interpret and act on, and this includes predictions about what might happen based on observations of what just happened. Our observations can clue us in to a driver’s personality (the swerver must be careless) or simply to the circumstances of a given moment (the swerver was texting).

But right now, self-driving cars assume all human drivers behave the same way, and they have no mechanism for incorporating observations about behavioral differences between drivers into their decisions.

“Creating more human-like behavior in autonomous vehicles (AVs) is fundamental for the safety of passengers and surrounding vehicles, since behaving in a predictable manner enables humans to understand and appropriately respond to the AV’s actions,” said Schwarting.

Though it may feel a bit unsettling to think of an algorithm lumping you into a category and driving accordingly around you, maybe it’s less unsettling than thinking of self-driving cars as pre-programmed, oblivious robots unable to adapt to different driving styles.

The team’s next step is to apply their model to pedestrians, bikes, and other agents frequently found in driving environments. They also plan to look into other robotic systems acting among people, like household robots, and integrating social value orientation into their algorithms.

Image Credit: Image by Free-Photos from Pixabay

Posted in Human Robots

#435681 Video Friday: This NASA Robot Uses ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2019 – July 29-30, 2019 – London, U.K.
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Let us know if you have suggestions for next week, and enjoy today’s videos.

Robots can land on the Moon and drive on Mars, but what about the places they can’t reach? Designed by engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California, a four-limbed robot named LEMUR (Limbed Excursion Mechanical Utility Robot) can scale rock walls, gripping with hundreds of tiny fishhooks in each of its 16 fingers and using artificial intelligence to find its way around obstacles. In its last field test in Death Valley, California, in early 2019, LEMUR chose a route up a cliff, scanning the rock for ancient fossils from the sea that once filled the area.

The LEMUR project has since concluded, but it helped lead to a new generation of walking, climbing and crawling robots. In future missions to Mars or icy moons, robots with AI and climbing technology derived from LEMUR could discover similar signs of life. Those robots are being developed now, honing technology that may one day be part of future missions to distant worlds.

[ NASA ]

This video demonstrates the autonomous footstep planning developed by IHMC. Robots in this video are the Atlas humanoid robot (DRC version) and the NASA Valkyrie. The operator specifies a goal location in the world, which is modeled as planar regions using the robot’s perception sensors. The planner then automatically computes the necessary steps to reach the goal using a Weighted A* algorithm. The algorithm does not reject footholds that have a certain amount of support, but instead modifies them after the plan is found to try to increase that support area.

Currently, the planner succeeds about 50% of the time on narrow terrain, about 90% on rough terrain, and nearly 100% on flat ground. We plan on increasing planner speed and the ability to plan through mazes and to unseen goals by including a body-path planner as the first step. Control, Perception, and Planning algorithms by IHMC Robotics.

[ IHMC ]
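The description above mentions a Weighted A* search. IHMC’s planner searches over footholds on 3D planar regions; the sketch below shows only the algorithmic core on a 2D grid, where a heuristic weight w >= 1 trades optimality for planning speed.

```python
# Weighted A* on a 2D grid: f = g + w*h with w >= 1. Inflating the heuristic
# finds (possibly suboptimal) plans faster -- the core idea behind the
# footstep search, though the real planner works over 3D footholds.
import heapq

def weighted_astar(grid, start, goal, w=2.0):
    """grid: 2D list, 0 = free, 1 = obstacle. Returns path cost or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(w * h(start), 0.0, start)]
    best_g = {start: 0.0}
    while open_set:
        _, g, node = heapq.heappop(open_set)
        if node == goal:
            return g
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best_g.get(nxt, float("inf"))):
                best_g[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + w * h(nxt), g + 1, nxt))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(weighted_astar(grid, (0, 0), (2, 0)))  # -> 6.0
```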

I’ve never really been able to get into watching people play poker, but throw an AI from CMU and Facebook into a game of no-limit Texas hold’em with five humans, and I’m there.

[ Facebook ]

In this video, Cassie Blue is navigating autonomously. Right now her world is very small: the Wavefield at the University of Michigan, where she is told to turn left at intersections. You’re right, that’s not a lot of independence, but it’s a first step away from a human and an RC controller!

Using a RealSense RGBD camera, an IMU, and our version of an InEKF with contact factors, Cassie Blue is building a 3D semantic map in real time that identifies sidewalks, grass, poles, bicycles, and buildings. From the semantic map, occupancy and cost maps are built, with the sidewalk identified as walkable area and everything else considered an obstacle. A planner then sets a goal to stay approximately 50 cm to the right of the sidewalk’s left edge and plans a path around obstacles and corners using D*. The path is translated into waypoints that are achieved via Cassie Blue’s gait controller.

[ University of Michigan ]

Thanks Jesse!
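The semantic-map-to-cost-map step in that pipeline is easy to show in miniature. In the sketch below the labels, grid contents, and cell size are all made up; the real system runs D* over the resulting cost map.

```python
# Miniature semantic map -> cost map conversion (labels and grid made up).
SIDEWALK, GRASS, POLE = 0, 1, 2

semantic_map = [
    [GRASS, SIDEWALK, SIDEWALK, GRASS],
    [GRASS, SIDEWALK, SIDEWALK, POLE],
    [GRASS, SIDEWALK, SIDEWALK, GRASS],
]

# Sidewalk cells are walkable (cost 0); everything else is an obstacle.
cost_map = [[0.0 if cell == SIDEWALK else float("inf") for cell in row]
            for row in semantic_map]
print(cost_map[0])  # [inf, 0.0, 0.0, inf]

# Goal rule from the description: stay ~50 cm right of the sidewalk's left edge.
CELL_M = 0.5  # assumed grid resolution
left_edge_col = min(c for row in semantic_map
                    for c, v in enumerate(row) if v == SIDEWALK)
goal_col = left_edge_col + round(0.5 / CELL_M)
print(f"plan toward column {goal_col} (left edge {left_edge_col} + 50 cm)")
```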

Dave from HEBI Robotics wrote in to share some new actuators that are designed to get all kinds of dirty: “The R-Series takes HEBI’s X-Series to the next level, providing a sealed robotics solution for rugged, industrial applications and laying the groundwork for industrial users to address challenges that are not well met by traditional robotics. To prove it, we shot some video right in the Allegheny River here in Pittsburgh. Not a bad way to spend an afternoon :-)”

The R-Series Actuator is a full-featured robotic component as opposed to a simple servo motor. The output rotates continuously, requires no calibration or homing on boot-up, and contains a thru-bore for easy daisy-chaining of wiring. Modular in nature, R-Series Actuators can be used in everything from wheeled robots to collaborative robotic arms. They are sealed to IP67 and designed with a lightweight form factor for challenging field applications, and they’re packed with sensors that enable simultaneous control of position, velocity, and torque.

[ HEBI Robotics ]

Thanks Dave!

If your robot hands out karate chops on purpose, that’s great. If it hands out karate chops accidentally, maybe you should fix that.

COVR is short for “being safe around collaborative and versatile robots in shared spaces”. Our mission is to significantly reduce the complexity in safety certifying cobots. Increasing safety for collaborative robots enables new innovative applications, thus increasing production and job creation for companies utilizing the technology. Whether you’re an established company seeking to deploy cobots or an innovative startup with a prototype of a cobot related product, COVR will help you analyze, test and validate the safety for that application.

[ COVR ]

Thanks Anna!

EPFL startup Flybotix has developed a novel drone with just two propellers and an advanced stabilization system that allow it to fly for twice as long as conventional models. That fact, together with its small size, makes it perfect for inspecting hard-to-reach parts of industrial facilities such as ducts.

[ Flybotix ]

SpaceBok is a quadruped robot designed and built by a Swiss student team from ETH Zurich and ZHAW Zurich, currently being tested using Automation and Robotics Laboratories (ARL) facilities at our technical centre in the Netherlands. The robot is being used to investigate the potential of ‘dynamic walking’ and jumping to get around in low gravity environments.

SpaceBok could potentially go up to 2 m high in lunar gravity, although such a height poses new challenges. Once it comes off the ground, the legged robot needs to stabilise itself to come down again safely – like a mini-spacecraft. So, like a spacecraft, SpaceBok uses a reaction wheel to control its orientation.

[ ESA ]
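As a quick sanity check on that 2 m figure: for the same take-off speed, ballistic jump height scales inversely with gravity. The Earth-jump height below is an assumed value, not a published SpaceBok spec.

```python
# Jump height h = v^2 / (2g), so for a fixed take-off speed the height
# scales as g_earth / g_moon (Earth jump height below is assumed).
G_EARTH, G_MOON = 9.81, 1.62  # m/s^2

earth_jump_m = 0.33  # assumed Earth test-jump height
takeoff_speed = (2 * G_EARTH * earth_jump_m) ** 0.5
lunar_jump_m = takeoff_speed ** 2 / (2 * G_MOON)
print(f"~{lunar_jump_m:.1f} m in lunar gravity")  # ~2.0 m
```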

A new video from GITAI showing progress on their immersive telepresence robot for space.

[ GITAI ]

Tech United’s HERO robot (a Toyota HSR) competed in the RoboCup@Home competition, and it had a couple of garbage-related hiccups.

[ Tech United ]

Even small drones are getting better at autonomous obstacle avoidance in cluttered environments at useful speeds, as this work from the HKUST Aerial Robotics Group shows.

[ HKUST ]

DelFly Nimbles now come in swarms.

[ DelFly Nimble ]

This is a very short video, but it’s a fairly impressive look at a Baxter robot collaboratively helping someone put a shirt on, a useful task for folks with disabilities.

[ Shibata Lab ]

ANYmal can inspect the concrete in sewers for deterioration by sliding its feet along the ground.

[ ETH Zurich ]

HUG is a haptic user interface for teleoperating advanced robotic systems such as the humanoid robot Justin or the assistive robotic system EDAN. With its lightweight robot arms, HUG can measure human movements and simultaneously display forces from the distant environment. In addition to such teleoperation applications, HUG serves as a research platform for virtual assembly simulations, rehabilitation, and training.

[ DLR ]

This video about “image understanding” from CMU in 1979 (!) is amazing, and even though it’s long, you won’t regret watching until 3:30. Or maybe you will.

[ ARGOS (pdf) ]

Will Burrard-Lucas’ BeetleCam turned 10 this month, and in this video, he recounts the history of his little robotic camera.

[ BeetleCam ]

In this week’s episode of Robots in Depth, Per speaks with Gabriel Skantze from Furhat Robotics.

Gabriel Skantze is co-founder and Chief Scientist at Furhat Robotics and Professor in speech technology at KTH with a specialization in conversational systems. He has a background in research into how humans use spoken communication to interact.

In this interview, Gabriel talks about how the social robot revolution makes it necessary to communicate with humans in a human way, through speech and facial expressions. This is necessary as we expand both the number of people that interact with robots and the types of interaction. Gabriel gives us more insight into the many challenges of implementing spoken communication for cobots, where robots and humans work closely together. They need to communicate about the world, the objects in it, and how to handle them. We also get to hear how having an embodied system using the Furhat robot head helps the interaction between humans and the system.

[ Robots in Depth ]

Posted in Human Robots

#435591 Video Friday: This Robotic Thread Could ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

Eight engineering students from ETH Zurich are working on a year-long focus project to develop a multimodal robot called Dipper, which can fly, swim, dive underwater, and manage that difficult air-water transition:

The robot uses one motor to selectively drive either a propeller or a marine screw depending on whether it’s in flight or not. We’re told that getting the robot to autonomously perform the water-to-air transition is still a work in progress, but that within a few weeks things should be much smoother.

[ Dipper ]

Thanks Simon!

Giving a jellyfish a hug without stressing them out is exactly as hard as you think, but Harvard’s robot will make sure that all jellyfish get the emotional (and physical) support that they need.

The gripper’s six “fingers” are composed of thin, flat strips of silicone with a hollow channel inside bonded to a layer of flexible but stiffer polymer nanofibers. The fingers are attached to a rectangular, 3D-printed plastic “palm” and, when their channels are filled with water, curl in the direction of the nanofiber-coated side. Each finger exerts an extremely low amount of pressure — about 0.0455 kPa, or less than one-tenth of the pressure of a human’s eyelid on their eye. By contrast, current state-of-the-art soft marine grippers, which are used to capture delicate but more robust animals than jellyfish, exert about 1 kPa.

The gripper was successfully able to trap each jellyfish against the palm of the device, and the jellyfish were unable to break free from the fingers’ grasp until the gripper was depressurized. The jellyfish showed no signs of stress or other adverse effects after being released, and the fingers were able to open and close roughly 100 times before showing signs of wear and tear.

[ Harvard ]

MIT engineers have developed a magnetically steerable, thread-like robot that can actively glide through narrow, winding pathways, such as the labyrinthine vasculature of the brain. In the future, this robotic thread may be paired with existing endovascular technologies, enabling doctors to remotely guide the robot through a patient’s brain vessels to quickly treat blockages and lesions, such as those that occur in aneurysms and stroke.

[ MIT ]

See NASA’s next Mars rover quite literally coming together inside a clean room at the Jet Propulsion Laboratory. This behind-the-scenes look at what goes into building and preparing a rover for Mars, including extensive tests in simulated space environments, was captured from March to July 2019. The rover is expected to launch to the Red Planet in summer 2020 and touch down in February 2021.

The Mars 2020 rover doesn’t have a name yet, but you can give it one! As long as you’re not too old! Which you probably are!

[ Mars 2020 ]

I desperately wish that we could watch this next video at normal speed, not just slowed down, but it’s quite impressive anyway.

Here’s one more video from the Namiki Lab showing some high speed tracking with a pair of very enthusiastic robotic cameras:

[ Namiki Lab ]

Normally, tedious modeling of mechanics, electronics, and information science is required to understand how insects’ or robots’ moving parts coordinate smoothly to take them places. But in a new study, biomechanics researchers at the Georgia Institute of Technology boiled down the sprints of cockroaches to handy principles and equations they then used to make a test robot amble about better.

[ Georgia Tech ]

More magical obstacle-dodging footage from Skydio’s still secret new drone.

We’ve been hard at work extending the capabilities of our upcoming drone, giving you ways to get the control you want without the stress of crashing. The result is you can fly in ways, and get shots, that would simply be impossible any other way. How about flying through obstacles at full speed, backwards?

[ Skydio ]

This is a cute demo with Misty:

[ Misty Robotics ]

We’ve seen pieces of hardware like this before, but always made out of hard materials—a soft version is certainly something new.

Utilizing vacuum power and soft material actuators, we have developed a soft reconfigurable surface (SRS) with multi-modal control and performance capabilities. The SRS is comprised of a square grid array of linear vacuum-powered soft pneumatic actuators (linear V-SPAs), built into plug-and-play modules which enable the arrangement, consolidation, and control of many DoF.

[ RRL ]

The EksoVest is not really a robot, but it’ll make you a cyborg! With super strength!

“This is NOT intended to give you super strength but instead give you super endurance and reduce fatigue so that you have more energy and less soreness at the end of your shift.”

Drat!

[ EksoVest ]

We have created a solution for parents, grandparents, and their children who are living separated. This is an amazing tool to stay connected from a distance through the intimacy that comes through interactive play with a child. For parents who travel for work, deployed military, and families spread across the country, the Cushybot One is much more than a toy; it is the opportunity for maintaining a deep connection with your young child from a distance.

Hmm.

I think the concept here is great, but it’s going to be a serious challenge to successfully commercialize.

[ Indiegogo ]

What happens when you equip RVR with a parachute and send it off a cliff? Watch this episode of RVR Launchpad to find out – then go Behind the Build to see how we (eventually) accomplished this high-flying feat.

[ Sphero ]

These omnidirectional crawler robots aren’t new, but that doesn’t keep them from being fun to watch.

[ NEDO ] via [ Impress ]

We’ll finish up the week with a couple of past ICRA and IROS keynote talks—one by Gill Pratt on The Reliability Challenges of Autonomous Driving, and the other from Peter Hart, on Making Shakey.

[ IEEE RAS ]

Posted in Human Robots