
#436209 Video Friday: Robotic Endoscope Travels ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

DARPA SubT Urban Circuit – February 18-27, 2020 – Olympia, WA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Kuka has just announced the results of its annual Innovation Award. From an initial batch of 30 applicants, five teams reached the finals (we were part of the judging committee). The five finalists worked for nearly a year on their applications, which they demonstrated this week at the Medica trade show in Düsseldorf, Germany. And the winner of the €20,000 prize is…Team RoboFORCE, led by the STORM Lab in the U.K., which developed a “robotic magnetic flexible endoscope for painless colorectal cancer screening, surveillance, and intervention.”

The system could improve colonoscopy procedures by reducing pain and discomfort as well as other risks such as bleeding and perforation, according to the STORM Lab researchers. It uses a magnetic field to control the endoscope, pulling rather than pushing it through the colon.

The other four finalists also presented some really interesting applications—you can see their videos below.

“Because we were so pleased with the high quality of the submissions, we will have next year’s finals again at the Medica fair, and the challenge will be named ‘Medical Robotics’,” says Rainer Bischoff, vice president for corporate research at Kuka. He adds that the selected teams will again use Kuka’s LBR Med robot arm, which is “already certified for integration into medical products and makes it particularly easy for startups to use a robot as the main component for a particular solution.”

Applications are now open for Kuka’s Innovation Award 2020. You can find more information on how to enter here. The deadline is 5 January 2020.

[ Kuka ]

Oh good, Aibo needs to be fed now.

You know what comes next, right?

[ Aibo ]

Your cat needs this robot.

It's about $200 on Kickstarter.

[ Kickstarter ]

Enjoy this tour of the Skydio offices courtesy of Skydio 2, which runs into not even one single thing.

If any Skydio employees had important piles of papers on their desks, well, they don’t anymore.

[ Skydio ]

Artificial intelligence is everywhere nowadays, but what exactly does it mean? We asked a group of MIT computer science grad students and postdocs how they personally define AI.

“When most people say AI, they actually mean machine learning, which is just pattern recognition.” Yup.

[ MIT ]

Using event-based cameras, this drone control system can track attitude at 1600 degrees per second (!).

[ UZH ]

Introduced at CES 2018, Walker is an intelligent humanoid service robot from UBTECH Robotics. Below are the latest features and technologies used during our latest round of development to make Walker even better.

[ Ubtech ]

Introducing the Alpha Prime by #VelodyneLidar, the most advanced lidar sensor on the market! Alpha Prime delivers an unrivaled combination of field-of-view, range, high-resolution, clarity and operational performance.

Performance looks good, but don’t expect it to be cheap.

[ Velodyne ]

Ghost Robotics’ Spirit 40 will start shipping to researchers in January of next year.

[ Ghost Robotics ]

Unitree is about to ship the first batch of their AlienGo quadrupeds as well:

[ Unitree ]

Mechanical engineering’s Sarah Bergbreiter discusses her work on microrobotics, how these tiny robots draw inspiration from insects and animals, and how they can help humans in a variety of fields.

[ CMU ]

Learning contact-rich, robotic manipulation skills is a challenging problem due to the high-dimensionality of the state and action space as well as uncertainty from noisy sensors and inaccurate motor control. To combat these factors and achieve more robust manipulation, humans actively exploit contact constraints in the environment. By adopting a similar strategy, robots can also achieve more robust manipulation. In this paper, we enable a robot to autonomously modify its environment and thereby discover how to ease manipulation skill learning. Specifically, we provide the robot with fixtures that it can freely place within the environment. These fixtures provide hard constraints that limit the outcome of robot actions. Thereby, they funnel uncertainty from perception and motor control and scaffold manipulation skill learning.
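
To get a feel for what “funneling uncertainty” means here, here’s a minimal sketch in Python (my own toy example, not the Stanford group’s code; the numbers and function names are made up): a noisy 1-D push either has to land exactly on target, or it can deliberately over-shoot into a fixture that clips the error for it.

```python
# A minimal sketch (not the paper's code) of how a hard fixture constraint
# "funnels" uncertainty: the fixture acts like a wall that clips the outcome
# of a noisy push, so the final object position varies far less than it
# would in free space.
import numpy as np

rng = np.random.default_rng(0)

def noisy_push(commanded_x, noise_std=0.05, fixture_x=None):
    """Simulate a 1-D push with motor/perception noise.

    If a fixture (wall) sits at x = fixture_x, the object cannot slide
    past it, so the outcome is clipped to that boundary.
    """
    outcome = commanded_x + rng.normal(0.0, noise_std)
    if fixture_x is not None:
        outcome = min(outcome, fixture_x)
    return outcome

target = 0.30  # desired object position in meters (hypothetical)

# Without a fixture the robot must hit the target exactly; with a fixture it
# can over-push and let the wall absorb the error.
free_space = [noisy_push(target) for _ in range(1000)]
with_fixture = [noisy_push(target + 0.10, fixture_x=target) for _ in range(1000)]

print(f"std without fixture: {np.std(free_space):.4f} m")
print(f"std with fixture:    {np.std(with_fixture):.4f} m")
```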

[ Stanford ]

Since 2016, Verity's drones have completed more than 200,000 flights around the world. Completely autonomous, client-operated, and designed for live events, Verity's drones make the magic real as flying lights, characters, and props.

[ Verity ]

To monitor and stop the spread of wildfires, University of Michigan engineers developed UAVs that could find, map and report fires. One day UAVs like this could work with disaster response units, firefighters and other emergency teams to provide real-time accurate information to reduce damage and save lives. For their research, the University of Michigan graduate students won first place at a competition for using a swarm of UAVs to successfully map and report simulated wildfires.

[ University of Michigan ]

Here’s an important issue that I haven’t heard talked about all that much: How first responders should interact with self-driving cars.

“To put the car in manual mode, you must call Waymo.” Huh.

[ Waymo ]

Here’s what Gitai has been up to recently, from a Humanoids 2019 workshop talk.

[ Gitai ]

The latest CMU RI seminar comes from Girish Chowdhary at the University of Illinois at Urbana-Champaign on “Autonomous and Intelligent Robots in Unstructured Field Environments.”

What if a team of collaborative autonomous robots grew your food for you? In this talk, I will discuss some key advances in robotics, machine learning, and autonomy that will one day enable teams of small robots to grow food for you in your backyard in a fundamentally more sustainable way than modern mega-farms! Teams of small aerial and ground robots could be a solution to many of the serious problems that modern agriculture is facing. However, fully autonomous robots that operate without supervision for weeks, months, or an entire growing season are not yet practical. I will discuss my group’s theoretical and practical work on the challenging underlying problems in robotic systems, autonomy, sensing, and learning. I will begin with our lightweight, compact, and autonomous field robot TerraSentia and the recent successes of this type of under-canopy robot for high-throughput phenotyping with deep learning-based machine vision. I will also discuss how to make a team of autonomous robots learn to coordinate to weed large agricultural farms under partial observability. These direct applications will help me make the case for the type of reinforcement learning and adaptive control that is necessary to usher in the next generation of autonomous field robots that learn to solve complex problems in harsh, changing, and dynamic environments. I will then end with an overview of our new MURI, in which we are working towards developing AI and control that leverages neurodynamics inspired by the octopus brain.

[ CMU RI ]


#435791 To Fly Solo, Racing Drones Have a Need ...

Drone racing’s ultimate vision of quadcopters weaving nimbly through obstacle courses has attracted far less excitement and investment than self-driving cars aimed at reshaping ground transportation. But the U.S. military and defense industry are betting on autonomous drone racing as the next frontier for developing AI so that it can handle high-speed navigation within tight spaces without human intervention.

The autonomous drone challenge requires split-second decision-making with six degrees of freedom instead of a car’s mere two degrees of road freedom. One research team developing the AI necessary for controlling autonomous racing drones is the Robotics and Perception Group at the University of Zurich in Switzerland. In late May, the Swiss researchers were among nine teams revealed to be competing in the two-year AlphaPilot open innovation challenge sponsored by U.S. aerospace company Lockheed Martin. The winning team will walk away with up to $2.25 million for beating other autonomous racing drones and a professional human drone pilot in head-to-head competitions.

“I think it is important to first point out that having an autonomous drone to finish a racing track at high speeds or even beating a human pilot does not imply that we can have autonomous drones [capable of] navigating in real-world, complex, unstructured, unknown environments such as disaster zones, collapsed buildings, caves, tunnels or narrow pipes, forests, military scenarios, and so on,” says Davide Scaramuzza, a professor of robotics and perception at the University of Zurich and ETH Zurich. “However, the robust and computationally efficient state estimation algorithms, control, and planning algorithms developed for autonomous drone racing would represent a starting point.”

The nine teams that made the cut—from a pool of 424 AlphaPilot applicants—will compete in four 2019 racing events organized under the Drone Racing League’s Artificial Intelligence Robotic Racing Circuit, says Keith Lynn, program manager for AlphaPilot at Lockheed Martin. To ensure an apples-to-apples comparison of each team’s AI secret sauce, each AlphaPilot team will upload its AI code into identical, specially-built drones that have the NVIDIA Xavier GPU at the core of the onboard computing hardware.

“Lockheed Martin is offering mentorship to the nine AlphaPilot teams to support their AI tech development and innovations,” says Lynn. The company “will be hosting a week-long Developers Summit at MIT in July, dedicated to workshopping and improving AlphaPilot teams’ code,” he adds. He notes that each team will retain the intellectual property rights to its AI code.

The AlphaPilot challenge takes inspiration from older autonomous drone racing events hosted by academic researchers, Scaramuzza says. He credits Hyungpil Moon, a professor of robotics and mechanical engineering at Sungkyunkwan University in South Korea, for having organized the annual autonomous drone racing competition at the International Conference on Intelligent Robots and Systems since 2016.

It’s no easy task to create and train AI that can perform high-speed flight through complex environments by relying on visual navigation. One big challenge comes from how drones can accelerate sharply, take sharp turns, fly sideways, do zig-zag patterns and even perform back flips. That means camera images can suddenly appear tilted or even upside down during drone flight. Motion blur may occur when a drone flies very close to structures at high speeds and camera pixels collect light from multiple directions. Both cameras and visual software can also struggle to compensate for sudden changes between light and dark parts of an environment.

To lend AI a helping hand, Scaramuzza’s group recently published a drone racing dataset that includes realistic training data taken from a drone flown by a professional pilot in both indoor and outdoor spaces. The data, which includes complicated aerial maneuvers such as back flips, flight sequences that cover hundreds of meters, and flight speeds of up to 83 kilometers per hour, was presented at the 2019 IEEE International Conference on Robotics and Automation.

The drone racing dataset also includes data captured by the group’s special bioinspired event cameras that can detect changes in motion on a per-pixel basis within microseconds. By comparison, ordinary cameras need milliseconds (each millisecond being 1,000 microseconds) to compare motion changes in each image frame. The event cameras have already proven capable of helping drones nimbly dodge soccer balls thrown at them by the Swiss lab’s researchers.
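
For a sense of how event data differs from ordinary frames, here’s a minimal sketch, assuming a simple (timestamp, x, y, polarity) tuple per event rather than the dataset’s actual format: it bins microsecond-stamped events into a per-pixel count image over a 1-millisecond window, one common way to feed event streams to conventional vision code.

```python
# A minimal sketch of turning an event stream into a per-pixel "event frame".
# The (timestamp_us, x, y, polarity) tuple format and the 180x240 resolution
# are assumptions for illustration, not the actual schema of the UZH dataset.
import numpy as np

HEIGHT, WIDTH = 180, 240  # assumed sensor resolution

def events_to_frame(events, t_start_us, window_us=1000):
    """Sum signed event polarities per pixel over a 1 ms window."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.int32)
    for t_us, x, y, polarity in events:
        if t_start_us <= t_us < t_start_us + window_us:
            frame[y, x] += 1 if polarity > 0 else -1
    return frame

# Toy stream: a bright edge sweeping along one row, one event every 10 us.
toy_events = [(i * 10, i % WIDTH, 90, +1) for i in range(2000)]
frame = events_to_frame(toy_events, t_start_us=0)
print("events landing in the 1 ms window:", int(np.abs(frame).sum()))
```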

The Swiss group’s work on the racing drone dataset received funding in part from the U.S. Defense Advanced Research Projects Agency (DARPA), which acts as the U.S. military’s special R&D arm for more futuristic projects. Specifically, the funding came from DARPA’s Fast Lightweight Autonomy program that envisions small autonomous drones capable of flying at high speeds through cluttered environments without GPS guidance or communication with human pilots.

Such speedy drones could serve as military scouts checking out dangerous buildings or alleys. They could also someday help search-and-rescue teams find people trapped in semi-collapsed buildings or lost in the woods. Being able to fly at high speed without crashing into things also makes a drone more efficient at all sorts of tasks by making the most of limited battery life, Scaramuzza says. After all, most drone battery life gets used up by the need to hover in flight and doesn’t get drained much by flying faster.
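
The hover argument is easy to sanity-check with back-of-envelope math. Here’s a quick sketch using made-up numbers (none of these figures come from the article or the Swiss group): once hover power dominates, flying faster barely shortens flight time but greatly increases the distance covered per charge.

```python
# Back-of-envelope sketch of the hover argument with illustrative numbers
# (the power, drag, and battery figures below are assumptions): when hover
# power dominates, energy per meter traveled drops as speed rises, so a
# faster drone covers more ground per charge.
HOVER_POWER_W = 200.0  # assumed power just to stay airborne
DRAG_COEFF = 0.5       # assumed extra watts per (m/s)^2 of forward speed
BATTERY_WH = 60.0      # assumed usable battery energy

for speed_mps in (2.0, 5.0, 10.0, 15.0):
    total_power_w = HOVER_POWER_W + DRAG_COEFF * speed_mps ** 2
    flight_time_h = BATTERY_WH / total_power_w
    range_km = speed_mps * 3.6 * flight_time_h
    print(f"{speed_mps:4.1f} m/s -> {total_power_w:6.1f} W, "
          f"{flight_time_h * 60:4.1f} min aloft, {range_km:5.2f} km covered")
```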

Even if AI manages to conquer the drone racing obstacle courses, that would be the end of the beginning of the technology’s development. What would still be required? Scaramuzza specifically singled out the need to handle low-visibility conditions involving smoke, dust, fog, rain, snow, fire, and hail as some of the biggest challenges for vision-based algorithms and AI in complex real-life environments.

“I think we should develop and release datasets containing smoke, dust, fog, rain, fire, etc. if we want to allow using autonomous robots to complement human rescuers in saving people’s lives after an earthquake or natural disaster in the future,” Scaramuzza says.


#435219 For less than $200, engineering students ...

Mechanical engineering students challenged themselves to make a robotic fish that not only swims like a real fish, but looks the part too, demonstrating the possibilities inherent to soft robotics.
