Tag Archives: science

#437805 Video Friday: Quadruped Robot HyQ ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Four-legged HyQ balancing on two legs. Nice results from the team at IIT’s Dynamic Legged Systems Lab. And we can’t wait to see the “ninja walk,” currently shown in simulation, implemented with the real robot!

The development of balance controllers for legged robots with point feet remains a challenge when they have to traverse extremely constrained environments. We present a balance controller that has the potential to achieve line walking for quadruped robots. Our initial experiments show the 90-kg robot HyQ balancing on two feet and recovering from external pushes, as well as some changes in posture achieved without losing balance.
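The underlying control problem is the classic one of stabilizing an inverted pendulum, with the robot's center of mass balanced over a narrow support line. For a flavor of the idea, here is a minimal, generic PD balance loop on a linearized pendulum model; the gains and dimensions are hypothetical, and this is emphatically not IIT's actual controller:

```python
# Generic inverted-pendulum balance sketch; NOT the IIT HyQ controller.
# All gains and dimensions are hypothetical.
g = 9.81              # gravity, m/s^2
l = 0.6               # hypothetical hip-to-CoM length, m
dt = 0.001            # control timestep, s
kp, kd = 120.0, 25.0  # hypothetical PD gains

theta, theta_dot = 0.05, 0.0  # start leaning 0.05 rad from vertical

for _ in range(3000):  # simulate 3 seconds
    u = -kp * theta - kd * theta_dot   # stabilizing feedback acceleration
    theta_ddot = (g / l) * theta + u   # linearized pendulum dynamics
    theta_dot += theta_ddot * dt       # forward-Euler integration
    theta += theta_dot * dt

print(f"lean after 3 s: {theta:+.5f} rad")  # settles near zero
```

The real controller must, of course, also handle foot forces, posture changes, and push recovery, which is what makes the video impressive.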

[ IIT ]

Thanks Victor!

Ava Robotics’ telepresence robot has been beheaded by MIT, and it now sports a coronavirus-destroying UV array.

UV-C light has proven to be effective at killing viruses and bacteria on surfaces and in aerosols, but it is unsafe for humans to be exposed to it. Fortunately, Ava's telepresence robot doesn't require any human supervision. Instead of the telepresence top, the team subbed in a UV-C array for disinfecting surfaces. Specifically, the array uses short-wavelength ultraviolet light to kill microorganisms and disrupt their DNA in a process called ultraviolet germicidal irradiation. The complete robot system is capable of mapping the space (in this case, GBFB's warehouse) and navigating between waypoints and other specified areas. In testing the system, the team used a UV-C dosimeter, which confirmed that the robot was delivering the expected dosage of UV-C light predicted by the model.
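For a sense of the model the dosimeter was checking: UV dose is irradiance integrated over exposure time, and irradiance from a compact source falls off roughly with the square of distance. Here's a toy sketch of that dosing math, with all lamp figures and dose targets hypothetical rather than MIT's or Ava's actual numbers:

```python
# Toy UV-C dosing math: dose (mJ/cm^2) = irradiance (mW/cm^2) x time (s).
# All numbers are hypothetical, not the MIT/Ava system's model.
def irradiance_at(distance_m, output_mw_cm2_at_1m=0.2):
    """Approximate a compact lamp with inverse-square falloff."""
    return output_mw_cm2_at_1m / max(distance_m, 0.1) ** 2

def dwell_time_for_dose(target_dose_mj_cm2, distance_m):
    """Seconds the robot must linger near a surface at this distance."""
    return target_dose_mj_cm2 / irradiance_at(distance_m)

# e.g. a hypothetical 10 mJ/cm^2 target on a shelf 2 m from the lamp:
print(f"{dwell_time_for_dose(10.0, 2.0):.0f} s of dwell time needed")
```

A waypoint planner can then trade off dwell times like these against coverage of the whole warehouse.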

[ MIT ]

While it’s hard enough to get quadrupedal robots to walk in complex environments, this work from the Robotic Systems Lab at ETH Zurich shows some impressive whole body planning that allows ANYmal to squeeze its body through small or weirdly shaped spaces.

[ RSL ]

Engineering researchers at North Carolina State University and Temple University have developed soft robots inspired by jellyfish that can outswim their real-life counterparts. More practically, the new jellyfish-bots highlight a technique that uses pre-stressed polymers to make soft robots more powerful.

The researchers also used the technique to make a fast-moving robot that resembles a larval insect curling its body, then jumping forward as it quickly releases its stored energy. Lastly, the researchers created a three-pronged gripping robot – with a twist. Most grippers hang open when “relaxed,” and require energy to hold on to their cargo as it is lifted and moved from point A to point B. But this claw’s default position is clenched shut. Energy is required to open the grippers, but once they’re in position, the grippers return to their “resting” mode – holding their cargo tight.

[ NC State ]

As our control skills increase, we are more and more impressed by what a Cassie bipedal robot can do. Those who have been following our channel know that we always show the limitations of our work. So while there is still much to do, you gotta like the direction things are going. Later this year, you will see this controller integrated with our real-time planner and perception system. Autonomy with agility! Watch out for us!

[ University of Michigan ]

GITAI’s S1 arm is a little less exciting than their humanoid torso, but it looks like this one might actually be going to the ISS next year.

Here’s how the humanoid would handle a similar task:

[ GITAI ]

Thanks Fan!

If you need a robot that can lift 250 kg at 10 m/s across a workspace of a thousand cubic meters, here’s your answer.

[ Fraunhofer ]

Penn engineers, with funding from the National Science Foundation, have developed nanocardboard plates able to levitate when bright light is shone on them. This fleet of tiny aircraft could someday explore the skies of other worlds, including Mars. The thinner atmosphere there would give the flyers a boost, enabling them to carry payloads ten times as massive as they are, making them an efficient, lightweight alternative to the Mars helicopter.

[ UPenn ]

Erin Sparks, assistant professor in Plant and Soil Sciences, dreamed of a robot she could use in her research. A perfect partnership was formed when Adam Stager, then a mechanical engineering Ph.D. student, reached out about a robot he had a gut feeling might be useful in agriculture. The pair moved forward with their research with corn at the UD Farm, using the robot to capture dynamic phenotyping information of brace roots over time.

[ Sparks Lab ]

This is a video about robot spy turtles but OMG that bird drone landing gear.

[ PBS ]

If you have a DJI Mavic, you now have something new to worry about.

[ DroGone ]

I was able to spot just one single person in the warehouse footage in this video.

[ Berkshire Grey ]

Flyability has partnered with the ROBINS Project to help fill gaps in the technology used in ship inspections. Watch this video to learn more about the ROBINS project and how Flyability’s drones for confined spaces are helping make inspections on ships safer, cheaper, and more efficient.

[ Flyability ]

This video presents a mission flown by the Alpha Aerial Scout of Team CERBERUS during the Urban Circuit event of the DARPA Subterranean Challenge. The Alpha Robot operates inside the Satsop Abandoned Power Plant and performs autonomous exploration. The deployment took place during the team's third field trial of the event.

[ ARL ]

More excellent talks from the remote Legged Robots ICRA workshop: we've posted three here, but there are several other good talks this week as well.

[ ICRA 2020 Legged Robots Workshop ]

#437796 AI Seeks ET: Machine Learning Powers ...

Can artificial intelligence help the search for life elsewhere in the solar system? NASA thinks the answer may be “yes”—and not just on Mars either.

A pilot AI system is now being tested for use on the ExoMars mission that is currently slated to launch in the summer or fall of 2022. The machine-learning algorithms being developed will help science teams decide how to test Martian soil samples to return only the most meaningful data.

For ExoMars, the AI system will only be used back on Earth, to analyze data gathered by the ExoMars rover. But if the system proves to be as useful as now suspected, a NASA mission to Saturn's moon Titan (now scheduled for a 2026 launch) could automate the scientific sleuthing process in the field. That mission will rely on the Dragonfly octocopter drone to fly from surface location to surface location through Titan's dense atmosphere and drill for signs of life there.

The hunt for microbial life in another world’s soil, either as fossilized remnants or as present-day samples, is very challenging, says Eric Lyness, software lead of the NASA Goddard Planetary Environments Lab in Greenbelt, Md. There is of course no precedent to draw upon, because no one has yet succeeded in astrobiology’s holy grail quest.

But that doesn’t mean AI can’t provide substantial assistance. Lyness explained that for the past few years he’d been puzzling over how to automate portions of an exploratory mission’s geochemical investigation, wherever in the solar system the scientific craft may be.

Last year he decided to try machine learning. “So we got some interns,” he said. “People right out of college or in college, who have been studying machine learning. … And they did some amazing stuff. It turned into much more than we expected.” Lyness and his collaborators presented their scientific analysis algorithm at a geochemistry conference last month.

Illustration: ESA

The ExoMars rover, named Rosalind Franklin, will be the first that can drill down to 2-meter depths, where living soil bacteria could possibly be found.

ExoMars’s rover—named Rosalind Franklin, after one of the co-discoverers of DNA—will be the first that can drill down to 2-meter depths, beyond where solar UV light might penetrate and kill any life forms. In other words, ExoMars will be the first Martian craft with the ability to reach soil depths where living soil bacteria could possibly be found.

“We could potentially find forms of life, microbes or other things like that,” Lyness said. However, he quickly added, very little conclusive evidence today exists to suggest that there’s present-day (microbial) life on Mars. (NASA’s Curiosity rover has sent back some inexplicable observations of both methane and molecular oxygen in the Martian atmosphere that could conceivably be a sign of microbial life forms, though non-biological processes could explain these anomalies too.)

Less controversially, the Rosalind Franklin rover’s drill could also turn up fossilized evidence of life in the Martian soil from earlier epochs when Mars was more hospitable.

NASA’s contribution to the joint Russian/European Space Agency ExoMars project is an instrument called a mass spectrometer that will be used to analyze soil samples from the drill cores. Here, Lyness said, is where AI could really provide a helping hand.

The spectrometer, which studies the mass distribution of ions in a sample of material, works by blasting the drilled soil sample with a laser and then mapping out the atomic masses of the various molecules and portions of molecules that the laser has liberated. The problem is that any given mass spectrum could originate from any number of source compounds, minerals, and components, which makes analyzing a mass spectrum a gigantic puzzle.
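One classic way to chip away at that puzzle, and a plausible baseline for any machine-learning approach, is to score the observed spectrum against a library of reference spectra, for instance with cosine similarity over binned intensities. The Goddard team's algorithm hasn't been published in detail, so the sketch below is just the textbook technique, with made-up spectra:

```python
# Textbook spectrum matching: rank reference compounds by cosine
# similarity to the observed spectrum. Spectra here are invented.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Binned intensity vectors (hypothetical), one bin per unit of mass:
observed = [0, 3, 9, 1, 0, 7, 2, 0]
references = {
    "montmorillonite-like": [0, 3, 8, 1, 0, 6, 1, 0],
    "organic-like":         [5, 0, 1, 0, 8, 0, 0, 3],
}
for name, ref in sorted(references.items(),
                        key=lambda kv: -cosine(observed, kv[1])):
    print(f"{name}: similarity {cosine(observed, ref):.3f}")
```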

Lyness said his group is studying the mineral montmorillonite, a commonplace component of the Martian soil, to see the many ways it might reveal itself in a mass spectrum. Then his team sneaks in an organic compound with the montmorillonite sample to see how that changes the mass spectrometer output.

“It could take a long time to really break down a spectrum and understand why you’re seeing peaks at certain [masses] in the spectrum,” he said. “So anything you can do to point scientists into a direction that says, ‘Don’t worry, I know it’s not this kind of thing or that kind of thing,’ they can more quickly identify what’s in there.”

Lyness said the ExoMars mission will provide a fertile training ground for his team’s as-yet-unnamed AI algorithm. (He said he’s open to suggestions—though, please, no spoof Boaty McBoatface submissions need apply.)

Because the Dragonfly drone and possibly a future astrobiology mission to Jupiter’s moon Europa would be operating in much more hostile environments with much less opportunity for data transmission back and forth to Earth, automating a craft’s astrobiological exploration would be practically a requirement.

All of which points to a future in the mid-2030s in which a nuclear-powered octocopter on a moon of Saturn flies from location to location to drill for evidence of life on this tantalizingly bio-possible world. And machine learning will help power the science.

“We should be researching how to make the science instruments smarter,” Lyness said. “If you can make it smarter at the source, especially for planetary exploration, it has huge payoffs.”

#437778 A Bug-Sized Camera for Bug-Sized Robots ...

As if it’s not hard enough to make very small mobile robots, once you’ve gotten the power and autonomy all figured out (good luck with that), your robot isn’t going to be all that useful unless it can carry some payload. And the payload that everybody wants robots to carry is a camera, which is of course a relatively big, heavy, power-hungry payload. Great, just great.

This whole thing is frustrating because tiny, lightweight, power efficient vision systems are all around us. Literally, all around us right this second, stuffed into the heads of insects. We can’t make anything quite that brilliant (yet), but roboticists from the University of Washington, in Seattle, have gotten us a bit closer, with the smallest wireless, steerable video camera we’ve ever seen—small enough to fit on the back of a microbot, or even a live bug.

To make a camera this small, the UW researchers, led by Shyam Gollakota, a professor of computer science and engineering, had to start nearly from scratch, primarily because existing systems aren’t nearly so constrained by power availability. Even things like swallowable pill cameras require batteries that weigh more than a gram, but only power the camera for under half an hour. With a focus on small size and efficiency, they started with an off-the-shelf ultra low-power image sensor that’s 2.3 mm wide and weighs 6.7 mg. They stuck on a Bluetooth 5.0 chip (3 mm wide, 6.8 mg), and had a fun time connecting those two things together without any intermediary hardware to broadcast the camera output. A functional wireless camera also requires a lens (20 mg) and an antenna, which is just 5 mm of wire. An accelerometer is useful so that insect motion can be used to trigger the camera, minimizing the redundant frames that you’d get from a robot or an insect taking a nap.
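That motion-trigger idea is simple enough to sketch. Gating logic along the following lines, with hypothetical thresholds and interfaces, keeps the sensor and radio quiet while the insect naps:

```python
# Sketch of accelerometer-gated capture: request a frame only when recent
# motion exceeds a threshold. Thresholds and interfaces are hypothetical.
from collections import deque

MOTION_THRESHOLD = 0.15  # hypothetical, in g of deviation from rest
WINDOW = 10              # accelerometer samples to average

recent = deque(maxlen=WINDOW)

def on_accel_sample(magnitude_g, capture_frame):
    """Call once per accelerometer sample; fires capture_frame() on motion."""
    recent.append(abs(magnitude_g - 1.0))  # deviation from 1 g at rest
    if len(recent) == WINDOW and sum(recent) / WINDOW > MOTION_THRESHOLD:
        capture_frame()
        recent.clear()  # debounce: wait for a fresh window

# Hypothetical usage with a canned sample stream:
samples = [1.0] * 10 + [1.4, 1.5, 0.6, 1.3] * 5
for s in samples:
    on_accel_sample(s, lambda: print("frame captured"))
```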

Photo: University of Washington

The microcamera developed by the UW researchers can stream monochrome video at up to 5 frames per second to a cellphone 120 meters away.

The last bit to make up this system is a mechanically steerable “head,” weighing 35 mg and bringing the total weight of the wireless camera system to 84 mg. If the look of the little piezoelectric actuator seems familiar, you have very good eyes because it’s tiny, and also, it’s the same kind of piezoelectric actuator that the folks at UW use to power their itty bitty flying robots. It’s got a 60-degree panning range, but also requires a 96 mg boost converter to function, which is a huge investment in size and weight just to be able to point the camera a little bit. But overall, the researchers say that this pays off, because not having to turn the entire robot (or insect) when you want to look around reduces the energy consumption of the system as a whole by a factor of up to 84 (!).

Photo: University of Washington

Insects are very mobile platforms for outdoor use, but they’re also not easy to steer, so the researchers also built a little insect-scale robot that they could remotely control while watching the camera feed. As it turns out, this seems to be the smallest, power-autonomous terrestrial robot with a camera ever made.

This efficiency means that the wireless camera system can stream video frames (160×120 pixels monochrome) to a cell phone up to 120 meters away for up to 6 hours when powered by a 0.5-g, 10-mAh battery. A live, first-bug view can be streamed at up to 5 frames per second. The system was successfully tested on a pair of darkling beetles that were allowed to roam freely outdoors, and the researchers noted that they could also mount it on spiders or moths, or anything else that could handle the payload. (The researchers removed the electronics from the insects after the experiments and observed no noticeable adverse effects on their behavior.)
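Those figures pass a quick back-of-envelope check: average current is just capacity over runtime, and the raw video rate follows from the frame geometry. (The 8-bit pixel depth below is our assumption for sizing, not a figure from the paper.)

```python
# Back-of-envelope check on the reported power and data figures.
battery_mah = 10.0
runtime_h = 6.0
print(f"average draw: {battery_mah / runtime_h:.2f} mA")  # ~1.7 mA

width, height, fps = 160, 120, 5
bits_per_px = 8  # assumed bit depth, not from the paper
raw_kbps = width * height * fps * bits_per_px / 1000
print(f"raw (uncompressed) video rate: {raw_kbps:.0f} kb/s")  # 768 kb/s
```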

The researchers are already thinking about what it might take to put a wireless camera system on something that flies, and it’s not going to be easy—a bumblebee can only carry between 100 and 200 mg. The power system is the primary limitation here, but it might be possible to use a solar cell to cut down on battery requirements. And the camera itself could be scaled down as well, by using a completely custom sensor and a different type of lens. The other thing to consider is that with a long-range wireless link and a vision system, it’s possible to add sophisticated vision-based autonomy to tiny robots by doing the computation remotely. So, next time you see something scuttling across the ground, give it another look, because it might be looking right back at you.

“Wireless steerable vision for live insects and insect-scale robots,” by Vikram Iyer, Ali Najafi, Johannes James, Sawyer Fuller, and Shyamnath Gollakota from the University of Washington, is published in Science Robotics.

#437776 Video Friday: This Terrifying Robot Will ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.

The Aigency, which created the FitBot launch video below, is “the world’s first talent management resource for robotic personalities.”

Robots will be playing a bigger role in our lives in the future. By learning to speak their language and work with them now, we can make this future better for everybody. If you’re a creator that’s producing content to entertain and educate people, robots can be a part of that. And we can help you. Robotic actors can show up alongside the rest of your actors.

The folks at Aigency have put together a compilation reel of clips they’ve put on TikTok, which is nice of them, because some of us don’t know how to TikTok because we’re old and boring.

Do googly eyes violate the terms and conditions?

[ Aigency ]

Shane Wighton of the “Stuff Made Here” YouTube channel, who you might remember from that robotic basketball hoop, has a new invention: a haircut robot. This is not the first barber bot, but previous designs typically used hair clippers. Shane wanted his robot to use scissors. Hilarious and terrifying at once.

[ Stuff Made Here ]

Starting in October of 2016, Prof. Charlie Kemp and Henry M. Clever invented a new kind of robot. They named the prototype NewRo. In March of 2017, Prof. Kemp filmed this video of Henry operating NewRo to perform a number of assistive tasks. While visiting the Bay Area for an AAAI Symposium workshop at Stanford, Prof. Kemp showed this video to a select group of people to get advice, including Dr. Aaron Edsinger. In August of 2017, Dr. Edsinger and Dr. Kemp founded Hello Robot Inc. to commercialize this patent-pending assistive technology. Hello Robot Inc. licensed the intellectual property (IP) from Georgia Tech. After three years of stealthy effort, Hello Robot Inc. revealed Stretch, a new kind of robot!

[ Georgia Tech ]

NASA’s Ingenuity Mars Helicopter will make history's first attempt at powered flight on another planet next spring. It is riding with the agency's next mission to Mars (the Mars 2020 Perseverance rover) as it launches from Cape Canaveral Air Force Station later this summer. Perseverance, with Ingenuity attached to its belly, will land on Mars February 18, 2021.

[ JPL ]

For humans, it can be challenging to manipulate thin flexible objects like ropes, wires, or cables. But if these problems are hard for humans, they are nearly impossible for robots. As a cable slides between the fingers, its shape is constantly changing, and the robot’s fingers must constantly sense and adjust the cable’s position and motion. A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the MIT Department of Mechanical Engineering pursued the task from a different angle, in a manner that more closely mimics how humans do it. The team’s new system uses a pair of soft robotic grippers with high-resolution tactile sensors (and no added mechanical constraints) to successfully manipulate freely moving cables.
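The hard part is the tactile sensing and the soft gripper hardware, but the control idea reduces to a regulation loop: estimate where the cable sits between the fingers, then nudge the gripper to keep it centered as the cable slides. Here's a toy proportional version of that loop, with hypothetical sensing and actuation interfaces (this is not MIT's controller):

```python
# Toy cable-following loop: servo the gripper on a tactile estimate of
# the cable's offset from the finger center. Interfaces are hypothetical.
def follow_cable(read_cable_offset_mm, move_gripper_mm, kp=0.4, steps=100):
    """Each cycle: sense where the cable sits, nudge the gripper toward it."""
    for _ in range(steps):
        offset = read_cable_offset_mm()  # +/- mm from finger center
        move_gripper_mm(kp * offset)     # proportional correction

# Hypothetical usage against a toy "cable" whose offset we correct:
class FakeCable:
    offset = 3.0

cable = FakeCable()
follow_cable(
    read_cable_offset_mm=lambda: cable.offset,
    move_gripper_mm=lambda d: setattr(cable, "offset", cable.offset - d),
)
print(f"residual offset: {cable.offset:.3f} mm")  # converges toward zero
```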

The team observed that it was difficult to pull the cable back when it reached the edge of the finger, because of the convex surface of the GelSight sensor. Therefore, they hope to improve the finger-sensor shape to enhance the overall performance. In the future, they plan to study more complex cable manipulation tasks such as cable routing and cable inserting through obstacles, and they want to eventually explore autonomous cable manipulation tasks in the auto industry.

[ MIT ]

Gripping robots typically have trouble grabbing transparent or shiny objects. A new technique from Carnegie Mellon University relies on a color camera system and machine learning to recognize shapes based on color.

[ CMU ]

A new robotic prosthetic leg prototype offers a more natural, comfortable gait while also being quieter and more energy efficient than other designs. The key is the use of new small and powerful motors with fewer gears, borrowed from the space industry. This streamlined technology enables a free-swinging knee and regenerative braking, which charges the battery during use with energy that would typically be dissipated when the foot hits the ground. This feature enables the leg to more than double a typical prosthetic user's walking needs with one charge per day.

[ University of Michigan ]

Thanks Kate!

This year’s Wonder League teams have been put to the test, not only by the challenges set forth by Wonder Workshop and Cartoon Network, helping the creek kids from Craig of the Creek solve the greatest mystery of all, the quest for the Lost Realm, but also by forces outside their control. With a global pandemic separating many teams through lockdowns and quarantines, they continued to push themselves to find new ways to work together, solve problems, communicate more effectively, and complete a journey that they started and refused to give up on. We at Wonder Workshop are humbled and in awe of all these teams have accomplished.

[ Wonder Workshop ]

Thanks Nicole!

Meet Colin Creager, a mechanical engineer at NASA's Glenn Research Center. Colin is focusing on developing tires that can be used on other worlds. These tires use coil springs made of a special shape memory alloy that will let rovers move across sharp jagged rocks or through soft sand on the Moon or Mars.

[ NASA ]

To be presented at IROS this year: “the first on-robot collision detection system using low-cost microphones.”

[ Rutgers ]

Robot and mechanism designs inspired by the art of origami have the potential to generate compact, deployable, lightweight morphing structures, as seen in nature, for potential applications in search and rescue, aerospace systems, and medical devices. However, it is challenging to obtain actuation that is easily patternable, reversible, and made with a scalable manufacturing process for origami-inspired self-folding machines. In this work, we describe an approach to designing reversible self-folding machines that uses a liquid crystal elastomer (LCE), which contracts when heated, as an artificial muscle.

[ UCSD ]

Just in case you need some extra home entertainment, and you’d like cleaner floors at the same time.

[ iRobot ]

Sure, toss it from a drone. Or from orbit. Whatever, it’s squishy!

[ Squishy Robotics ]

The [virtual] RSS conference this week featured an excellent lineup of speakers and panels, and the best part about it being virtual is that you can watch them all at your leisure! Here’s what’s been posted so far:

[ RSS 2020 ]

Lockheed Martin Robotics Seminar: Toward autonomous flying insect-sized robots: recent results in fabrication, design, power systems, control, and sensing, with Sawyer Fuller.

[ UMD ]

In this episode of the AI Podcast, Lex interviews Sergey Levine.

[ AI Podcast ]

#437765 Video Friday: Massive Robot Joins ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AWS Cloud Robotics Summit – August 18-19, 2020 – [Online Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Here are some professional circus artists messing around with an industrial robot for fun, like you do.

The acrobats are part of Östgötateatern, a Swedish theatre group, and the chair bit got turned into its own act, called “The Last Fish.” But apparently the Swedish Work Environment Authority didn’t like that an industrial robot—a large ABB robotic arm—was being used in an artistic performance, arguing that the same safety measures that apply in a factory setting would apply on stage. In other words, the robot had to operate inside a protective cage and humans could not physically interact with it.

When told that their robot had to be removed, the acrobats went to court. And won! At least that’s what we understand from this Swedish press release. The court in Linköping, in southern Sweden, ruled that the safety measures taken by the theater had been sufficient. The group had worked with a local robotics firm, Dyno Robotics, to program the manipulator and learn how to interact with it as safely as possible. The robot—which the acrobats say is the eighth member of their troupe—will now be allowed to return.

[ Östgötateatern ]

Houston Mechatronics’ Aquanaut continues to be awesome, even in the middle of a pandemic. It’s taken the big step (big swim?) out of NASA’s swimming pool and into open water.

[ HMI ]

Researchers from Carnegie Mellon University and Facebook AI Research have created a navigation system for robots powered by common sense. The technique uses machine learning to teach robots how to recognize objects and understand where they’re likely to be found in a house. The result allows the machines to search more strategically.

[ CMU ]

Cassie manages 2.1 m/s, which is uncomfortably fast in a couple of different ways.

Next, untethered. After that, running!

[ Michigan Robotics ]

Engineers at Caltech have designed a new data-driven method to control the movement of multiple robots through cluttered, unmapped spaces, so they do not run into one another.

Multi-robot motion coordination is a fundamental robotics problem with wide-ranging applications, from urban search and rescue to the control of fleets of self-driving cars to formation-flying in cluttered environments. Two key challenges make multi-robot coordination difficult: first, robots moving in new environments must make split-second decisions about their trajectories despite having incomplete data about their future path; second, the presence of larger numbers of robots in an environment makes their interactions increasingly complex (and more prone to collisions).

To overcome these challenges, Soon-Jo Chung, Bren Professor of Aerospace, and Yisong Yue, professor of computing and mathematical sciences, along with Caltech graduate student Benjamin Rivière (MS ’18), postdoctoral scholar Wolfgang Hönig, and graduate student Guanya Shi, developed a multi-robot motion-planning algorithm called “Global-to-Local Safe Autonomy Synthesis,” or GLAS, which imitates a complete-information planner with only local information, and “Neural-Swarm,” a swarm-tracking controller augmented to learn complex aerodynamic interactions in close-proximity flight.
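The key constraint is that each robot acts on local information only; GLAS learns that mapping by imitating a complete-information planner. The hand-tuned rule below is merely a stand-in that shows the shape of such a local policy (goal attraction plus repulsion from locally sensed neighbors, with hypothetical gains), not the learned GLAS network:

```python
# Stand-in for a local-information policy: velocity = unit vector toward
# the goal plus inverse-square repulsion from sensed neighbors.
# Gains and radii are hypothetical; GLAS learns this mapping instead.
import math

def local_policy(pos, goal, neighbors, sense_radius=2.0, k_rep=0.5):
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    norm = math.hypot(gx, gy) or 1.0
    vx, vy = gx / norm, gy / norm            # attraction toward goal
    for nx, ny in neighbors:                 # only locally sensed robots
        dx, dy = pos[0] - nx, pos[1] - ny
        d = math.hypot(dx, dy)
        if 0 < d < sense_radius:             # repel when close
            vx += k_rep * dx / d ** 2
            vy += k_rep * dy / d ** 2
    return vx, vy

print(local_policy(pos=(0, 0), goal=(5, 0), neighbors=[(1.0, 0.2)]))
```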

[ Caltech ]

Fetch Robotics’ Freight robot is now hauling around pulsed xenon UV lamps to autonomously disinfect spaces with UV-A, UV-B, and UV-C, all at the same time.

[ SmartGuard UV ]

When you’re a vertically symmetrical quadruped robot, there is no upside-down.

[ Ghost Robotics ]

In the virtual world, the objects you pick up do not exist: you can see that cup or pen, but it does not feel like you’re touching them. That presented a challenge to EPFL professor Herbert Shea. Drawing on his extensive experience with silicone-based muscles and motors, Shea wanted to find a way to make virtual objects feel real. “With my team, we’ve created very small, thin and fast actuators,” explains Shea. “They are millimeter-sized capsules that use electrostatic energy to inflate and deflate.” The capsules have an outer insulating membrane made of silicone enclosing an inner pocket filled with oil. Each bubble is surrounded by four electrodes that can close like a zipper. When a voltage is applied, the electrodes are pulled together, causing the center of the capsule to swell like a blister. It is an ingenious system because the capsules, known as HAXELs, can move not only up and down, but also side to side and around in a circle. “When they are placed under your fingers, it feels as though you are touching a range of different objects,” says Shea.

[ EPFL ]

Through the simple trick of reversing motors on impact, a quadrotor can land much more reliably on slopes.
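The trick is easy to sketch: watch for the touchdown spike on the accelerometer, then briefly reverse thrust so the vehicle presses into the slope instead of bouncing or tipping over. The thresholds, durations, and interfaces below are all hypothetical, not the Sherbrooke implementation:

```python
# Sketch of impact-triggered motor reversal for slope landing.
# Thresholds, durations, and interfaces are hypothetical.
import time

IMPACT_ACCEL_G = 3.0  # hypothetical touchdown-spike threshold
REVERSE_TIME_S = 0.3  # hypothetical reverse-thrust duration

def landing_monitor(read_accel_g, set_motor_throttle, clock=time.monotonic):
    """Run during descent; pulses reverse thrust once touchdown is felt."""
    while read_accel_g() < IMPACT_ACCEL_G:
        pass                           # still descending
    set_motor_throttle(-0.4)           # reverse: press into the slope
    t0 = clock()
    while clock() - t0 < REVERSE_TIME_S:
        pass                           # hold the pulse
    set_motor_throttle(0.0)            # done; settle

# Hypothetical usage with canned accelerometer readings:
readings = iter([1.0, 1.1, 4.2])
landing_monitor(lambda: next(readings), lambda u: print(f"throttle -> {u}"))
```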

[ Sherbrooke ]

Turtlebot delivers candy at Harvard.

I <3 Turtlebot SO MUCH

[ Harvard ]

Traditional drone controllers are a little bit counterintuitive: one stick commands forward and backward motion and the other commands up and down, yet both sticks physically move along the same axis. How does that make sense?! Here’s a remote that gives you actual z-axis control instead.
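To make the mismatch concrete, here is the mapping in toy form: on a conventional Mode-2 transmitter the same physical gesture on either stick means two different things, while a z-axis remote keeps each commanded axis aligned with a matching physical one. Field names are hypothetical:

```python
# Toy comparison of stick-to-command mappings (field names hypothetical).
def conventional_mapping(left_stick_y, right_stick_y):
    """Both inputs are vertical stick deflections in -1..1."""
    return {"climb": left_stick_y, "forward": right_stick_y}

def z_axis_mapping(stick_forward, slider_up):
    """Each input moves along the axis it actually commands."""
    return {"climb": slider_up, "forward": stick_forward}

print(conventional_mapping(0.5, 0.5))  # same gesture, two meanings
print(z_axis_mapping(0.5, 0.5))        # gesture matches meaning
```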

[ Fenics ]

Thanks Ashley!

Lio is a mobile robot platform with a multifunctional arm explicitly designed for human-robot interaction and personal care assistant tasks. The robot has already been deployed in several health care facilities, where it is functioning autonomously, assisting staff and patients on an everyday basis.

[ F&P Robotics ]

This video shows a ground vehicle autonomously exploring and mapping a multi-story garage building and a connected patio on the Carnegie Mellon University campus. The vehicle runs onboard state estimation and mapping leveraging range, vision, and inertial sensing, local planning for collision avoidance, and terrain analysis. All processing is real-time, with no post-processing involved. The vehicle drives at 2 m/s throughout the exploration run. This work is dedicated to the DARPA Subterranean Challenge.

[ CMU ]

Raytheon UK’s flagship STEM programme, the Quadcopter Challenge, gives 14- to 15-year-olds the chance to participate in a hands-on, STEM-based engineering challenge to build a fully operational quadcopter. Each team is provided with an identical kit of parts, tools and instructions to build and customise their quadcopter, whilst Raytheon UK STEM Ambassadors provide mentoring and technical support and deliver bite-size learning modules to support the build.

[ Raytheon ]

A video on some of the research work that is being carried out at The Australian Centre for Field Robotics, University of Sydney.

[ University of Sydney ]

Jeannette Bohg, assistant professor of computer science at Stanford University, gave one of the Early Career Award Keynotes at RSS 2020.

[ RSS 2020 ]

Adam Savage remembers Grant Imahara.

[ Tested ]
