#436165 Video Friday: DJI’s Mavic Mini Is ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.
DJI’s new Mavic Mini looks like a pretty great drone for US $400 ($500 for a combo with more accessories): It’s tiny, flies for 30 minutes, and will do what you need as far as pictures and video (although not a whole lot more).
DJI seems to have put a bunch of effort into getting the drone down to 249 grams, 1 gram under the 250-gram threshold at which the FAA requires registration. That saves you $5 and a few minutes of your time, but it doesn’t exempt you from the FAA’s rules and regulations governing drone use.
[ DJI ]
Don’t panic, but Clearpath and HEBI Robotics have armed the Jackal:
After locking eyes across a crowded room at ICRA 2019, Clearpath Robotics and HEBI Robotics basked in that warm and fuzzy feeling that comes with starting a new and exciting relationship. Over a conference hall coffee, they learned that the two companies have many overlapping interests. The most compelling was the realization that customers across a variety of industries are hunting for an elusive true love of their own – a robust but compact robotic platform combined with a long-reach manipulator for remote inspection tasks.
After ICRA concluded, Arron Griffiths, Application Engineer at Clearpath, and Matthew Tesch, Software Engineer at HEBI, kept in touch and decided there had been enough magic in the air to warrant further exploration. A couple of months later, Matthew arrived at Clearpath to formally introduce HEBI’s X-Series Arm to Clearpath’s Jackal UGV. It was love.
[ Clearpath ]
Thanks Dave!
I’m really not a fan of the people-carrying drones, but heavy lift cargo drones seem like a more okay idea.
Volocopter, the pioneer in Urban Air Mobility, presented the demonstrator of its VoloDrone. This marks Volocopter’s expansion into the logistics, agriculture, infrastructure, and public services industries. The VoloDrone is an unmanned, fully electric, heavy-lift utility drone capable of carrying a payload of 200 kg (440 lbs) up to 40 km (25 miles). With a standardized payload attachment, the VoloDrone can serve a great variety of purposes, from transporting boxes to liquids to equipment and beyond. It can be remotely piloted or flown in automated mode on pre-set routes.
[ Volocopter ]
JAY is a mobile service robot that projects a display on the floor and plays sound with its speaker. By playing sounds and videos, it provides visual and audio entertainment in various places such as exhibition halls, airports, hotels, department stores and more.
[ Rainbow Robotics ]
The DARPA Subterranean Challenge Virtual Tunnel Circuit concluded this week—it was the same idea as the physical challenge that took place in August, just with a lot less IRL dirt.
The awards ceremony and team presentations are in this next video, and we’ll have more on this once we get back from IROS.
[ DARPA SubT ]
NASA is sending a mobile robot to the south pole of the Moon to get a close-up view of the location and concentration of water ice in the region and, for the first time ever, to actually sample the water ice at the same pole where the first woman and next man will land in 2024 under the Artemis program.
About the size of a golf cart, the Volatiles Investigating Polar Exploration Rover, or VIPER, will roam several miles, using its four science instruments — including a 1-meter drill — to sample various soil environments. Planned for delivery in December 2022, VIPER will collect about 100 days of data that will be used to inform development of the first global water resource maps of the Moon.
[ NASA ]
Happy Halloween from HEBI Robotics!
[ HEBI ]
Happy Halloween from Soft Robotics!
[ Soft Robotics ]
Halloween must be really, really confusing for autonomous cars.
[ Waymo ]
Once a year at Halloween, hardworking JPL engineers put their skills to the test in a highly competitive pumpkin carving contest. The result: a pumpkin gently landed on the Moon, its retrorockets smoldering, while across the room a Nemo-inspired pumpkin explored the subsurface ocean of Jupiter’s moon Europa. Suffice it to say that when the scientists and engineers at NASA’s Jet Propulsion Laboratory compete in a pumpkin-carving contest, the solar system’s the limit. Take a look at some of the masterpieces from 2019.
Now in its ninth year, the contest gives teams only one hour to carve and decorate their pumpkin, though they can prepare non-pumpkin materials – like backgrounds, sound effects, and motorized parts – ahead of time.
[ JPL ]
The online autonomous navigation and semantic mapping experiment presented [below] is conducted with the Cassie Blue bipedal robot at the University of Michigan. The sensors attached to the robot include an IMU, a 32-beam LiDAR, and an RGB-D camera. The whole online process runs in real time on a Jetson Xavier and a laptop with an i7 processor.
[ BPL ]
Misty II is now available to anyone who wants one, and she’s on sale for a mere $2900.
[ Misty ]
We leveraged LiDAR-based SLAM, in conjunction with our specialized relative localization sensor UVDAR, to perform a decentralized, communication-free swarm flight without the units knowing their absolute locations. The swarming and obstacle avoidance control is based on a modified Boids-like algorithm, while the whole swarm is controlled by directing a selected leader unit.
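For a feel of how a Boids-like controller works, here’s a minimal sketch in Python (our own illustration with made-up gains and names, not the MRS group’s code): each unit sums separation, alignment, and cohesion terms over its nearby neighbors, and every follower adds a bias toward the leader.

```python
import numpy as np

def boids_step(pos, vel, leader_idx, dt=0.05,
               r_neigh=3.0, w_sep=1.5, w_ali=1.0, w_coh=1.0, w_lead=0.5):
    """One integration step of a minimal Boids-like swarm controller.

    pos, vel: (N, 2) arrays of unit positions and velocities.
    leader_idx: index of the leader unit that steers the swarm.
    """
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        offsets = pos - pos[i]                     # vectors from unit i to all units
        dists = np.linalg.norm(offsets, axis=1)
        mask = (dists > 0) & (dists < r_neigh)     # neighbors within sensing range
        if mask.any():
            # Separation: steer away from close neighbors (stronger when closer).
            acc[i] -= w_sep * (offsets[mask] / dists[mask, None]**2).sum(axis=0)
            # Alignment: match the neighbors' average velocity.
            acc[i] += w_ali * (vel[mask].mean(axis=0) - vel[i])
            # Cohesion: steer toward the neighbors' centroid.
            acc[i] += w_coh * (pos[mask].mean(axis=0) - pos[i])
        if i != leader_idx:
            # Followers are biased toward the (externally directed) leader.
            acc[i] += w_lead * (pos[leader_idx] - pos[i])
    vel = vel + dt * acc
    return pos + dt * vel, vel
```

Because each unit only needs the relative positions and velocities of its neighbors, the scheme requires no communication and no absolute localization—exactly the constraint the UVDAR sensor is built around.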
[ MRS ]
The MallARD robot is an autonomous surface vehicle (ASV) designed for the monitoring and inspection of wet storage facilities, for example spent fuel pools or wet silos. The MallARD is holonomic, uses a LiDAR for localisation, and features a robust trajectory tracking controller.
Dr Keir Groves, a researcher at the University of Manchester, designed and built the ASV for the challenge; it finished in the top three of the second round in November 2017. The MallARD went on to compete in the final, third round, where it was deployed by the IAEA in a spent fuel pond at a nuclear power plant in Finland, alongside two other entries. The MallARD came second overall in November 2018.
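Since a holonomic vehicle can command surge, sway, and yaw independently, trajectory tracking is conceptually simple: proportional feedback on each axis plus a rotation into the body frame. Here’s a minimal sketch (our own illustration with invented gains, not the actual MallARD controller):

```python
import numpy as np

def track(pose, ref_pose, ref_vel, kp=np.array([1.2, 1.2, 0.8])):
    """Proportional trajectory tracking for a holonomic surface vehicle.

    pose, ref_pose: (x, y, yaw) actual and reference poses in the world frame.
    ref_vel: reference feedforward velocity (vx, vy, yaw_rate), world frame.
    Returns the commanded body-frame velocity (surge, sway, yaw rate).
    """
    err = ref_pose - pose
    err[2] = (err[2] + np.pi) % (2 * np.pi) - np.pi     # wrap yaw error to [-pi, pi]
    cmd_world = ref_vel + kp * err                      # feedforward + P feedback
    c, s = np.cos(pose[2]), np.sin(pose[2])
    rot = np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])  # world -> body rotation
    return rot @ cmd_world                              # holonomic: all 3 DoF tracked
```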
[ RNE ]
Thanks Jennifer!
I sometimes get the sense that in the robotic grasping and manipulation world, suction cups are kinda seen as cheating. But their nature allows you to do some pretty interesting things.
More clever octopus footage please.
[ CMU ]
A Personal, At-Home Teacher For Playful Learning: From academic topics to child-friendly news bulletins, fun facts and more, Miko 2 is packed with relevant and freshly updated content specially designed by educationists and child-specialists. Your little one won’t even realize they’re learning.
As we point out pretty much every time we post a video like this, keep in mind that you’re seeing a heavily edited version of a hypothetical best case scenario for how this robot can function. And things like “creating a relationship that they can then learn how to form with their peers” is almost certainly overselling things. But at $300 (shipping included), this may be a decent robot as long as your expectations are appropriately calibrated.
[ Miko ]
ICRA 2018 plenary talk by Rodney Brooks: “Robots and People: the Research Challenge.”
[ IEEE RAS ]
ICRA-X 2018 talk by Ron Arkin: “Lethal Autonomous Robots and the Plight of the Noncombatant.”
[ IEEE RAS ]
On the most recent episode of the AI Podcast, Lex Fridman interviews Garry Kasparov.
[ AI Podcast ]
#436155 This MIT Robot Wants to Use Your ...
MIT researchers have demonstrated a new kind of teleoperation system that allows a two-legged robot to “borrow” a human operator’s physical skills to move with greater agility. The system works a bit like those haptic suits from the Spielberg movie “Ready Player One.” But while the suits in the film were used to connect humans to their VR avatars, the MIT suit connects the operator to a real robot.
The robot is called Little HERMES, and it’s currently just a pair of little legs, about a third the size of an average adult. It can step and jump in place or walk a short distance while supported by a gantry. While that in itself is not very impressive, the researchers say their approach could help bring capable disaster robots closer to reality. They explain that, despite recent advances, building fully autonomous robots with motor and decision-making skills comparable to those of humans remains a challenge. That’s where a more advanced teleoperation system could help.
The researchers, João Ramos, now an assistant professor at the University of Illinois at Urbana-Champaign, and Sangbae Kim, director of MIT’s Biomimetic Robotics Lab, describe the project in this week’s issue of Science Robotics. In the paper, they argue that existing teleoperation systems often can’t effectively match the operator’s motions to those of a robot. In addition, conventional systems provide no physical feedback to the human teleoperator about what the robot is doing. Their new approach addresses these two limitations, and to see how it would work in practice, they built Little HERMES.
Image: Science Robotics
The main components of MIT’s bipedal robot Little HERMES: (A) Custom actuators designed to withstand impact and capable of producing high torque. (B) Lightweight limbs with low inertia and fast leg swing. (C) Impact-robust and lightweight foot sensors with a three-axis contact force sensor. (D) Ruggedized IMU to estimate the robot’s torso posture, angular rate, and linear acceleration. (E) Real-time computer sbRIO 9606 from National Instruments for robot control. (F) Two three-cell lithium-polymer batteries in series. (G) Rigid and lightweight frame to minimize the robot’s mass.
Early this year, the MIT researchers wrote an in-depth article for IEEE Spectrum about the project, which includes Little HERMES and also its big brother, HERMES (for Highly Efficient Robotic Mechanisms and Electromechanical System). In that article, they describe the two main components of the system:
[…] We are building a telerobotic system that has two parts: a humanoid capable of nimble, dynamic behaviors, and a new kind of two-way human-machine interface that sends your motions to the robot and the robot’s motions to you. So if the robot steps on debris and starts to lose its balance, the operator feels the same instability and instinctively reacts to avoid falling. We then capture that physical response and send it back to the robot, which helps it avoid falling, too. Through this human-robot link, the robot can harness the operator’s innate motor skills and split-second reflexes to keep its footing.
You could say we’re putting a human brain inside the machine.
Image: Science Robotics
The human-machine interface built by the MIT researchers for controlling Little HERMES differs from conventional ones in that it relies on the operator’s reflexes to improve the robot’s stability. The researchers call it the balance-feedback interface, or BFI. The main modules of the BFI include: (A) Custom interface attachments for the torso and feet designed to capture human motion data at high speed (1 kHz). (B) Two underactuated modules to track the position and orientation of the torso and apply forces to the operator. (C) Each actuation module has three DoFs, one of which is a push/pull rod actuated by a DC brushless motor. (D) A series of linkages with passive joints that connect to the operator’s feet and track their spatial translation. (E) Real-time controller cRIO 9082 from National Instruments to close the BFI control loop. (F) Force plate to estimate the operator’s center-of-pressure position and measure the shear and normal components of the operator’s net contact force.
Here’s more footage of the experiments, showing Little HERMES stepping and jumping in place, walking a few steps forward and backward, and balancing. Watch until the end to see a compilation of unsuccessful stepping experiments. Poor Little HERMES!
In the new Science Robotics paper, the MIT researchers explain how they solved one of the key challenges in making their teleoperation system effective:
The challenge of this strategy lies in properly mapping human body motion to the machine while simultaneously informing the operator how closely the robot is reproducing the movement. Therefore, we propose a solution for this bilateral feedback policy to control a bipedal robot to take steps, jump, and walk in synchrony with a human operator. Such dynamic synchronization was achieved by (i) scaling the core components of human locomotion data to robot proportions in real time and (ii) applying feedback forces to the operator that are proportional to the relative velocity between human and robot.
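Read as pseudocode, that policy is quite compact. The sketch below is our simplified rendering of the two ingredients, with hypothetical gains and a single scalar scale factor; the real controller operates on richer locomotion data:

```python
import numpy as np

def bilateral_step(human_com, human_com_vel, robot_com_vel,
                   human_height=1.7, robot_height=0.55, k_fb=80.0):
    """One control tick of a simplified bilateral teleoperation policy.

    human_com / human_com_vel: operator center-of-mass position and velocity.
    robot_com_vel: robot center-of-mass velocity.
    Returns (robot CoM reference, feedback force applied to the operator).
    All heights and gains here are illustrative assumptions.
    """
    scale = robot_height / human_height       # (i) scale human motion to robot size
    robot_com_ref = scale * human_com          # robot tracks the scaled trajectory
    # (ii) feedback force proportional to the human-robot relative velocity,
    # so the operator "feels" the robot lagging or tipping and reacts.
    feedback_force = k_fb * (robot_com_vel / scale - human_com_vel)
    return robot_com_ref, feedback_force
```

When the robot keeps pace, the feedback force vanishes; when it lags or tips, the operator feels a pull and instinctively compensates, and that corrective motion is then fed back to the robot.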
Little HERMES is now taking its first steps, quite literally, but the researchers say they hope to use robotic legs of a similar design as part of a more advanced humanoid. One possibility they’ve envisioned is a fast-moving quadruped robot that could run through various kinds of terrain and then transform into a bipedal robot that would use its hands to perform dexterous manipulations. This could involve merging some of the robots the MIT researchers have built in their lab, possibly creating hybrids between Cheetah and HERMES, or Mini Cheetah and Little HERMES. We can’t wait to see what the resulting robots will look like.
[ Science Robotics ]
#436119 How 3D Printing, Vertical Farming, and ...
Food. What we eat, and how we grow it, will be fundamentally transformed in the next decade.
Already, indoor farming is projected to be a US$40.25 billion industry by 2022, with a compound annual growth rate of 9.65 percent. Meanwhile, the food 3D printing industry is expected to grow at an even higher rate, averaging 50 percent annual growth.
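As a quick check on what that compound annual growth rate implies (assuming a five-year, 2017-2022 projection horizon, which is our reading of the forecast):

```python
# Compound annual growth: value_n = base * (1 + r) ** n
cagr, value_2022, years = 0.0965, 40.25, 5    # 5-year horizon is an assumption
implied_base = value_2022 / (1 + cagr) ** years
print(f"Implied base-year market size: ${implied_base:.2f}B")  # ~$25.4B
```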
And converging exponential technologies—from materials science to AI-driven digital agriculture—are not slowing down. Today’s breakthroughs will soon allow our planet to boost its food production by nearly 70 percent, using a fraction of the real estate and resources, to feed 9 billion by mid-century.
What you consume, how it was grown, and how it will end up in your stomach will all ride the wave of converging exponentials, revolutionizing the most basic of human needs.
Printing Food
3D printing has already had a profound impact on the manufacturing sector. We are now able to print in hundreds of different materials, making anything from toys to houses to organs. However, we are finally seeing the emergence of 3D printers that can print food itself.
Redefine Meat, an Israeli startup, wants to tackle industrial meat production using 3D printers that can generate meat, no animals required. The printer takes in fat, water, and three different plant protein sources, using these ingredients to print a meat fiber matrix with trapped fat and water, thus mimicking the texture and flavor of real meat.
Slated for release in 2020 at a cost of $100,000, their machines are rapidly demonetizing and will begin by targeting clients in industrial-scale meat production.
Anrich3D aims to take this process a step further, 3D printing meals that are customized to your medical records, health data from your smart wearables, and patterns detected by your sleep trackers. The company plans to use multiple extruders for multi-material printing, allowing them to dispense each ingredient precisely for nutritionally optimized meals. Currently in an R&D phase at the Nanyang Technological University in Singapore, the company hopes to have its first taste tests in 2020.
These are only a few of the many 3D food printing startups springing into existence. The benefits from such innovations are boundless.
Not only will food 3D printing grant consumers control over the ingredients and mixtures they consume, but it is already beginning to enable new innovations in flavor itself, democratizing far healthier meal options in newly customizable cuisine categories.
Vertical Farming
Vertical farming, whereby food is grown in vertical stacks (in skyscrapers and buildings rather than outside in fields), marks a classic case of converging exponential technologies. Over just the past decade, the technology has surged from a handful of early-stage pilots to a full-grown industry.
Today, the average American meal travels 1,500-2,500 miles to get to your plate. As summed up by Worldwatch Institute researcher Brian Halweil, “We are spending far more energy to get food to the table than the energy we get from eating the food.” Additionally, the longer foods are out of the soil, the less nutritious they become, losing on average 45 percent of their nutrition before being consumed.
Yet beyond cutting down on time and transportation losses, vertical farming eliminates a whole host of issues in food production. Relying on hydroponics and aeroponics, vertical farms allow us to grow crops with 90 percent less water than traditional agriculture—which is critical for our increasingly thirsty planet.
Currently, the largest player around is Bay Area-based Plenty Inc. With over $200 million in funding from SoftBank, Plenty is taking a smart-tech approach to indoor agriculture. Plants grow on 20-foot-high towers, monitored by tens of thousands of cameras and sensors, optimized by big data and machine learning.
This allows the company to pack 40 plants into the space previously occupied by one. The process also produces yields 350 times greater than outdoor farmland, using less than 1 percent as much water.
And rather than bespoke veggies for the wealthy few, Plenty’s processes allow it to knock 20-35 percent off the costs of traditional grocery stores. To date, Plenty has its home base in South San Francisco, a 100,000-square-foot farm in Kent, Washington, an indoor farm in the United Arab Emirates, and recently started construction on over 300 farms in China.
Another major player is New Jersey-based Aerofarms, which can now grow two million pounds of leafy greens without sunlight or soil.
To do this, Aerofarms leverages AI-controlled LEDs to provide optimized wavelengths of light for each plant. Using aeroponics, the company delivers nutrients by misting them directly onto the plants’ roots—no soil required. Rather, plants are suspended in a growth mesh fabric made from recycled water bottles. And here too, sensors, cameras, and machine learning govern the entire process.
While 50-80 percent of the cost of vertical farming is human labor, autonomous robotics promises to solve that problem. Enter contenders like Iron Ox, a firm that has developed the Angus robot, capable of moving around plant-growing containers.
The writing is on the wall, and traditional agriculture is fast being turned on its head.
Materials Science
In an era where materials science, nanotechnology, and biotechnology are rapidly becoming the same field of study, key advances are enabling us to create healthier, more nutritious, more efficient, and longer-lasting food.
For starters, we are now able to boost the photosynthetic abilities of plants. Using novel techniques to improve a micro-step in the photosynthesis process chain, researchers at UCLA were able to boost tobacco crop yield by 14-20 percent. Meanwhile, the RIPE Project, backed by Bill Gates and run out of the University of Illinois, has matched and improved those numbers.
And to top things off, the University of Essex was even able to improve tobacco yield by 27-47 percent by increasing the levels of a protein involved in photorespiration.
In yet another win for food-related materials science, Santa Barbara-based Apeel Sciences is further tackling the vexing challenge of food waste. Now approaching commercialization, Apeel uses lipids and glycerolipids found in the peels, seeds, and pulps of all fruits and vegetables to create “cutin”—the fatty substance that composes the skin of fruits and prevents them from rapidly spoiling by trapping moisture.
By then spraying fruits with this generated substance, Apeel can preserve foods 60 percent longer using an odorless, tasteless, colorless organic substance.
And stores across the US are already using this method. By leveraging our advancing knowledge of plants and chemistry, materials science is allowing us to produce more food with far longer-lasting freshness and more nutritional value than ever before.
Convergence
With advances in 3D printing, vertical farming, and materials sciences, we can now make food smarter, more productive, and far more resilient.
By the end of the next decade, you should be able to 3D print a fusion cuisine dish from the comfort of your home, using ingredients harvested from vertical farms, with nutritional value optimized by AI and materials science. However, even this picture doesn’t account for all the rapid changes underway in the food industry.
Join me next week for Part 2 of the Future of Food for a discussion on how food production will be transformed, quite literally, from the bottom up.
Join Me
Abundance-Digital Online Community: Stay ahead of technological advancements and turn your passion into action. Abundance Digital is now part of Singularity University. Learn more.
Image Credit: Vanessa Bates Ramirez
#436114 Video Friday: Transferring Human Motion ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.
We are very sad to say that MIT professor emeritus Woodie Flowers has passed away. Flowers will be remembered for (among many other things, like co-founding FIRST) the MIT 2.007 course that he began teaching in the mid-1970s, famous for its student competitions.
These competitions got a bunch of well-deserved publicity over the years; here’s one from 1985:
And the 2.007 competitions are still going strong—this year’s theme was Moonshot, and you can watch a replay of the event here.
[ MIT ]
Looks like Aibo is getting wireless integration with Hitachi appliances, which turns out to be pretty cute:
What is this magical box where you push a button and 60 seconds later fluffy pancakes come out?!
[ Aibo ]
LiftTiles are a “modular and reconfigurable room-scale shape display” that can turn your floor and walls into on-demand structures.
[ LiftTiles ]
Ben Katz, a grad student in MIT’s Biomimetics Robotics Lab, has been working on these beautiful desktop-sized Furuta pendulums:
That’s a crowdfunding project I’d pay way too much for.
[ Ben Katz ]
A clever bit of cable manipulation from MIT, using GelSight tactile sensors.
[ Paper ]
A useful display of industrial autonomy on ANYmal from the Oxford Robotics Group.
This video is of a demonstration for the ORCA Robotics Hub showing the ANYbotics ANYmal robot carrying out an industrial inspection using autonomy software from the Oxford Robotics Institute.
[ ORCA Hub ] via [ DRS ]
Thanks Maurice!
Meet Katie Hamilton, a software engineer at NASA’s Ames Research Center, who got into robotics because she wanted to help people with daily life. Katie writes code for robots, like Astrobee, who are assisting astronauts with routine tasks on the International Space Station.
[ NASA Astrobee ]
Transferring human motion to a mobile robotic manipulator and ensuring safe physical human-robot interaction are crucial steps towards automating complex manipulation tasks in human-shared environments. In this work we present a robot whole-body teleoperation framework for human motion transfer. We validate our approach through several experiments using the TIAGo robot, showing this could be an easy way for a non-expert to teach a rough manipulation skill to an assistive robot.
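One common ingredient in motion-transfer frameworks like this is workspace retargeting: the operator’s hand motion, measured relative to the torso, is rescaled into the robot’s reach. A minimal sketch (hypothetical reach values and names, not the authors’ actual pipeline):

```python
import numpy as np

def retarget_hand(human_hand, human_torso, robot_base,
                  human_reach=0.75, robot_reach=0.87):
    """Map an operator's hand position to a robot end-effector target.

    Positions are 3D points in a shared world frame. The hand offset from
    the torso is rescaled by the ratio of robot to human reach, so motions
    at the edge of the operator's workspace land at the edge of the robot's.
    Reach values here are illustrative assumptions.
    """
    offset = human_hand - human_torso
    return robot_base + (robot_reach / human_reach) * offset
```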
[ Paper ]
This is pretty cool looking for an autonomous boat, but we’ll see if they can build a real one by 2020, since at the moment it’s just a rendering.
[ ProMare ]
I had no idea that asparagus grows like this. But it sure does make it easy for a robot to harvest.
[ Inaho ]
Skip to 2:30 in this Pepper unboxing video to hear the noise it makes when tickled.
[ HIT Lab NZ ]
In this interview, Jean Paul Laumond discusses his movement from mathematics to robotics and his career contributions to the field, especially in regard to motion planning and anthropomorphic motion. Describing his involvement at CNRS and in other robotics projects, such as HILARE, he comments on the distinction in perception between the robotics approach and a mathematics one.
[ IEEE RAS History ]
Here’s a couple of videos from the CMU Robotics Institute archives, showing some of the work that took place over the last few decades.
[ CMU RI ]
In this episode of the Artificial Intelligence Podcast, Lex Fridman speaks with David Ferrucci from IBM about Watson and (you guessed it) artificial intelligence.
David Ferrucci led the team that built Watson, the IBM question-answering system that beat the top humans in the world at the game of Jeopardy. He is also the founder, CEO, and chief scientist of Elemental Cognition, a company working to engineer AI systems that understand the world the way people do. This conversation is part of the Artificial Intelligence podcast.
[ AI Podcast ]
This week’s CMU RI Seminar is by Pieter Abbeel from UC Berkeley, on “Deep Learning for Robotics.”
Programming robots remains notoriously difficult. Equipping robots with the ability to learn would bypass the need for what otherwise often ends up being time-consuming, task-specific programming. This talk will describe recent progress in deep reinforcement learning (robots learning through their own trial and error), in apprenticeship learning (robots learning from observing people), and in meta-learning for action (robots learning to learn). This work has led to new robotic capabilities in manipulation, locomotion, and flight, with the same approach underlying advances in each of these domains.
[ CMU RI ]