
#437796 AI Seeks ET: Machine Learning Powers ...

Can artificial intelligence help the search for life elsewhere in the solar system? NASA thinks the answer may be “yes”—and not just on Mars either.

A pilot AI system is now being tested for use on the ExoMars mission that is currently slated to launch in the summer or fall of 2022. The machine-learning algorithms being developed will help science teams decide how to test Martian soil samples to return only the most meaningful data.

For ExoMars, the AI system will only be used back on Earth to analyze data gathered by the ExoMars rover. But if the system proves as useful as now suspected, a NASA mission to Saturn’s moon Titan (now scheduled for a 2026 launch) could automate the scientific sleuthing process in the field. This mission will rely on the Dragonfly octocopter drone to fly from surface location to surface location through Titan’s dense atmosphere and drill for signs of life there.

The hunt for microbial life in another world’s soil, either as fossilized remnants or as present-day samples, is very challenging, says Eric Lyness, software lead of the NASA Goddard Planetary Environments Lab in Greenbelt, Md. There is of course no precedent to draw upon, because no one has yet succeeded in astrobiology’s holy grail quest.

But that doesn’t mean AI can’t provide substantial assistance. Lyness explained that for the past few years he’d been puzzling over how to automate portions of an exploratory mission’s geochemical investigation, wherever in the solar system the scientific craft may be.

Last year he decided to try machine learning. “So we got some interns,” he said. “People right out of college or in college, who have been studying machine learning. … And they did some amazing stuff. It turned into much more than we expected.” Lyness and his collaborators presented their scientific analysis algorithm at a geochemistry conference last month.

Illustration: ESA

The ExoMars rover, named Rosalind Franklin, will be the first that can drill down to 2-meter depths, where living soil bacteria could possibly be found.

ExoMars’s rover—named Rosalind Franklin, after one of the co-discoverers of the structure of DNA—will be the first that can drill down to 2-meter depths, beyond where solar UV light might penetrate and kill any life forms. In other words, ExoMars will be the first Martian craft with the ability to reach soil depths where living soil bacteria could possibly be found.

“We could potentially find forms of life, microbes or other things like that,” Lyness said. However, he quickly added, very little conclusive evidence today exists to suggest that there’s present-day (microbial) life on Mars. (NASA’s Curiosity rover has sent back some inexplicable observations of both methane and molecular oxygen in the Martian atmosphere that could conceivably be a sign of microbial life forms, though non-biological processes could explain these anomalies too.)

Less controversially, the Rosalind Franklin rover’s drill could also turn up fossilized evidence of life in the Martian soil from earlier epochs when Mars was more hospitable.

NASA’s contribution to the joint Russian/European Space Agency ExoMars project is an instrument called a mass spectrometer that will be used to analyze soil samples from the drill cores. Here, Lyness said, is where AI could really provide a helping hand.


The spectrometer, which studies the mass distribution of ions in a sample of material, works by blasting the drilled soil sample with a laser and then mapping out the atomic masses of the various molecules and molecular fragments that the laser has liberated. The problem is that any given mass spectrum could originate from any number of source compounds, minerals, and components, which makes analyzing a mass spectrum a gigantic puzzle.

Lyness said his group is studying the mineral montmorillonite, a commonplace component of the Martian soil, to see the many ways it might reveal itself in a mass spectrum. Then his team sneaks in an organic compound with the montmorillonite sample to see how that changes the mass spectrometer output.

“It could take a long time to really break down a spectrum and understand why you’re seeing peaks at certain [masses] in the spectrum,” he said. “So anything you can do to point scientists into a direction that says, ‘Don’t worry, I know it’s not this kind of thing or that kind of thing,’ they can more quickly identify what’s in there.”
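NASA hasn’t published the algorithm itself, but the core idea—scoring an unknown spectrum against a library of reference spectra so that implausible compound classes can be ruled out quickly—can be sketched in a few lines. Everything below is hypothetical illustration: the library entries, bin values, and spectra are made up.

```python
import numpy as np

def cosine_score(observed, reference):
    """Cosine similarity between two mass spectra binned on the same m/z axis."""
    return float(np.dot(observed, reference) /
                 (np.linalg.norm(observed) * np.linalg.norm(reference) + 1e-12))

# Hypothetical reference library: compound class -> binned spectrum.
library = {
    "montmorillonite": np.array([0.0, 0.8, 0.1, 0.0, 0.3, 0.0]),
    "montmorillonite + organics": np.array([0.0, 0.7, 0.1, 0.4, 0.3, 0.2]),
}

observed = np.array([0.0, 0.75, 0.12, 0.35, 0.28, 0.15])  # spectrum from the instrument

# Rank candidates; low-scoring classes are the "don't worry, it's not this" cases.
ranked = sorted(library.items(), key=lambda kv: cosine_score(observed, kv[1]), reverse=True)
for name, ref in ranked:
    print(f"{name}: {cosine_score(observed, ref):.2f}")
```

The real training data, as described above, comes from lab spectra of montmorillonite with and without organic compounds mixed in—a much richer version of the same matching problem.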

Lyness said the ExoMars mission will provide a fertile training ground for his team’s as-yet-unnamed AI algorithm. (He said he’s open to suggestions—though, please, no spoof Boaty McBoatface submissions need apply.)

Because the Dragonfly drone and possibly a future astrobiology mission to Jupiter’s moon Europa would be operating in much more hostile environments with much less opportunity for data transmission back and forth to Earth, automating a craft’s astrobiological exploration would be practically a requirement.

All of which points to a future in the mid-2030s in which a nuclear-powered octocopter on a moon of Saturn flies from location to location to drill for evidence of life on this tantalizingly bio-possible world. And machine learning will help power the science.

“We should be researching how to make the science instruments smarter,” Lyness said. “If you can make it smarter at the source, especially for planetary exploration, it has huge payoffs.”


#437778 A Bug-Sized Camera for Bug-Sized Robots ...

As if it’s not hard enough to make very small mobile robots, once you’ve gotten the power and autonomy all figured out (good luck with that), your robot isn’t going to be all that useful unless it can carry some payload. And the payload that everybody wants robots to carry is a camera, which is of course a relatively big, heavy, power-hungry payload. Great, just great.

This whole thing is frustrating because tiny, lightweight, power efficient vision systems are all around us. Literally, all around us right this second, stuffed into the heads of insects. We can’t make anything quite that brilliant (yet), but roboticists from the University of Washington, in Seattle, have gotten us a bit closer, with the smallest wireless, steerable video camera we’ve ever seen—small enough to fit on the back of a microbot, or even a live bug.

To make a camera this small, the UW researchers, led by Shyam Gollakota, a professor of computer science and engineering, had to start nearly from scratch, primarily because existing systems aren’t nearly so constrained by power availability. Even things like swallowable pill cameras require batteries that weigh more than a gram, but only power the camera for under half an hour. With a focus on small size and efficiency, they started with an off-the-shelf ultra low-power image sensor that’s 2.3 mm wide and weighs 6.7 mg. They stuck on a Bluetooth 5.0 chip (3 mm wide, 6.8 mg), and had a fun time connecting those two things together without any intermediary hardware to broadcast the camera output. A functional wireless camera also requires a lens (20 mg) and an antenna, which is just 5 mm of wire. An accelerometer is useful so that insect motion can be used to trigger the camera, minimizing the redundant frames that you’d get from a robot or an insect taking a nap.
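The UW firmware isn’t described in detail here, but the accelerometer-gating idea—only capture and transmit when the host is actually moving—can be sketched as follows. The driver functions, threshold, and loop bound are all stand-ins, not the real system.

```python
import random
import time

MOTION_THRESHOLD = 0.05   # g; hypothetical trigger level
FRAME_INTERVAL = 0.2      # seconds between frames, i.e. at most 5 fps

def read_accel_magnitude():
    # Stand-in for the real accelerometer driver: return a simulated reading.
    return random.uniform(0.0, 0.1)

def capture_and_send_frame():
    # Stand-in for reading the image sensor and pushing the frame over Bluetooth.
    print("frame captured and streamed")

last_frame = 0.0
for _ in range(50):                       # bounded loop so the sketch terminates
    if read_accel_magnitude() > MOTION_THRESHOLD:
        now = time.monotonic()
        if now - last_frame >= FRAME_INTERVAL:
            capture_and_send_frame()
            last_frame = now
    else:
        time.sleep(0.05)                  # no motion: idle instead of wasting frames
```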

Photo: University of Washington

The microcamera developed by the UW researchers can stream monochrome video at up to 5 frames per second to a cellphone 120 meters away.

The last bit to make up this system is a mechanically steerable “head,” weighing 35 mg and bringing the total weight of the wireless camera system to 84 mg. If the look of the little piezoelectric actuator seems familiar, you have very good eyes because it’s tiny, and also, it’s the same kind of piezoelectric actuator that the folks at UW use to power their itty bitty flying robots. It’s got a 60-degree panning range, but also requires a 96 mg boost converter to function, which is a huge investment in size and weight just to be able to point the camera a little bit. But overall, the researchers say that this pays off, because not having to turn the entire robot (or insect) when you want to look around reduces the energy consumption of the system as a whole by a factor of up to 84 (!).


Insects are very mobile platforms for outdoor use, but they’re also not easy to steer, so the researchers also built a little insect-scale robot that they could remotely control while watching the camera feed. As it turns out, this seems to be the smallest power-autonomous terrestrial robot with a camera ever made.

This efficiency means that the wireless camera system can stream video frames (160×120 pixels monochrome) to a cell phone up to 120 meters away for up to 6 hours when powered by a 0.5-g, 10-mAh battery. A live, first-bug view can be streamed at up to 5 frames per second. The system was successfully tested on a pair of darkling beetles that were allowed to roam freely outdoors, and the researchers noted that they could also mount it on spiders or moths, or anything else that could handle the payload. (The researchers removed the electronics from the insects after the experiments and observed no noticeable adverse effects on their behavior.)
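For a rough sense of what those numbers imply, here is a back-of-the-envelope sketch using only the figures quoted above, plus one assumption (8-bit pixels with no compression, which the real system likely improves on):

```python
# Back-of-the-envelope numbers from the figures quoted above.
width, height = 160, 120          # pixels per monochrome frame
fps = 5                           # maximum reported frame rate
bytes_per_pixel = 1               # assumption: 8-bit pixels, no compression

frame_bytes = width * height * bytes_per_pixel
raw_rate_kbps = frame_bytes * fps * 8 / 1000
print(f"raw video rate: {raw_rate_kbps:.0f} kbit/s")   # ~768 kbit/s before any compression

battery_mah = 10                  # the 0.5-gram battery
runtime_h = 6                     # reported streaming endurance
print(f"average current draw: {battery_mah / runtime_h:.1f} mA")  # ~1.7 mA
```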

The researchers are already thinking about what it might take to put a wireless camera system on something that flies, and it’s not going to be easy—a bumblebee can only carry between 100 and 200 mg. The power system is the primary limitation here, but it might be possible to use a solar cell to cut down on battery requirements. And the camera itself could be scaled down as well, by using a completely custom sensor and a different type of lens. The other thing to consider is that with a long-range wireless link and a vision system, it’s possible to add sophisticated vision-based autonomy to tiny robots by doing the computation remotely. So, next time you see something scuttling across the ground, give it another look, because it might be looking right back at you.

“Wireless steerable vision for live insects and insect-scale robots,” by Vikram Iyer, Ali Najafi, Johannes James, Sawyer Fuller, and Shyamnath Gollakota from the University of Washington, is published in Science Robotics.


#437689 GITAI Sending Autonomous Robot to Space ...

We’ve been keeping a close watch on GITAI since early last year—what caught our interest initially is the history of the company, which includes a bunch of folks who started in the JSK Lab at the University of Tokyo, won the DARPA Robotics Challenge Trials as SCHAFT, got swallowed by Google, narrowly avoided being swallowed by SoftBank, and are now designing robots that can work in space.

The GITAI YouTube channel has kept us more or less up to date on their progress so far, and GITAI has recently announced the next step in this effort: the deployment of one of their robots on board the International Space Station in 2021.

Photo: GITAI

GITAI’s S1 is a task-specific 8-degree-of-freedom arm with an integrated sensing and computing system and a 1-meter reach.

GITAI has been working on a variety of robots for space operations, the most sophisticated of which is a humanoid torso called G1, which is controlled through an immersive telepresence system. What will be launching into space next year is a more task-specific system called the S1, an 8-degree-of-freedom arm with an integrated sensing and computing system that can be wall-mounted and has a 1-meter reach.

The S1 will be living on board a commercially funded, pressurized airlock-extension module called Bishop, developed by NanoRacks. Mounted on the inside of the Bishop module, the S1 will have access to a task board and a small assembly area, where it will demonstrate common crew intra-vehicular activity, or IVA—tasks like flipping switches, turning knobs, and managing cables. It’ll also do some in-space assembly, or ISA, attaching panels to create a solar array.

Here’s a demonstration of some task board activities, conducted on Earth in a mockup of Bishop:

GITAI says that “all operations conducted by the S1 GITAI robotic arm will be autonomous, followed by some teleoperations from Nanoracks’ in-house mission control.” This is interesting, because from what we’ve seen until now, GITAI has had a heavy emphasis on telepresence, with a human in the loop to get stuff done. As GITAI’s founder and CEO Sho Nakanose commented to us a year ago, “Telepresence robots have far better performance and can be made practical much quicker than autonomous robots, so first we are working on making telepresence robots practical.”

So what’s changed? “GITAI has been concentrating on teleoperations to demonstrate the dexterity of our robot, but now it’s time to show our capabilities to do the same this time with autonomy,” Nakanose told us last week. “In an environment with minimum communication latency, it would be preferable to operate a robot more with teleoperations to enhance the capability of the robot, since with the current technology level of AI, what a robot can do autonomously is very limited. However, in an environment where the latency becomes noticeable, it would become more efficient to have a mixture of autonomy and teleoperations depending on the application. Eventually, in an ideal world, a robot will operate almost fully autonomously with minimum human cognizance.”

“In an environment where the latency becomes noticeable, it would become more efficient to have a mixture of autonomy and teleoperations depending on the application. Eventually, in an ideal world, a robot will operate almost fully autonomously with minimum human cognizance.”
—Sho Nakanose, GITAI founder and CEO
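GITAI hasn’t described its mode-switching logic, but the tradeoff Nakanose outlines—teleoperate when latency is low, lean on autonomy when it isn’t—can be illustrated with a toy arbiter. The thresholds and mode names below are invented purely for illustration.

```python
# Hypothetical thresholds; GITAI has not published its actual mode-selection logic.
TELEOP_MAX_LATENCY_S = 0.5     # below this, direct teleoperation stays practical
BLENDED_MAX_LATENCY_S = 3.0    # above this, fall back to mostly-autonomous operation

def choose_mode(round_trip_latency_s: float) -> str:
    """Pick an operating mode from the measured ground-to-robot round-trip latency."""
    if round_trip_latency_s < TELEOP_MAX_LATENCY_S:
        return "teleoperation"
    if round_trip_latency_s < BLENDED_MAX_LATENCY_S:
        return "supervised autonomy"   # operator approves or corrects autonomous steps
    return "full autonomy"

for latency in (0.1, 1.2, 8.0):        # e.g. low Earth orbit, relayed link, deep space
    print(f"{latency:>4.1f} s -> {choose_mode(latency)}")
```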

Nakanose says that this mission will help GITAI to “acquire the skills, know-how, and experience necessary to prepare a robot to be ISS compatible, prov[ing] the maturity of our technology in the microgravity environment.” Success would mean conducting both IVA and ISA experiments as planned (autonomous and teleop for IVA, fully autonomous for ISA), which would be pretty awesome, but we’re told that GITAI has already received a research and development order for space robots from a private space company, and Nakanose expects that “by the mid-2020s, we will be able to show GITAI's robots working in space on an actual mission.”

NanoRacks is scheduled to launch the Bishop module on SpaceX CRS-21 in November. The S1 will be launched separately in 2021, and a NASA astronaut will install the robot and then leave it alone to let it start demonstrating how work in space can be made both safer and cheaper once the humans have gotten out of the way.


#437643 Video Friday: Matternet Launches Urban ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

IROS 2020 – October 25 – November 25, 2020 – [Online]
Bay Area Robotics Symposium – November 20, 2020 – [Online]
ACRA 2020 – December 8-10, 2020 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.

Sixteen teams chose their roster of virtual robots and sensor payloads, some based on real-life subterranean robots, and submitted autonomy and mapping algorithms that SubT Challenge officials then tested across eight cave courses in the cloud-based SubT Simulator. Their robots traversed the cave environments autonomously, without any input or adjustments from human operators. The Cave Circuit Virtual Competition teams earned points by correctly finding, identifying, and localizing up to 20 artifacts hidden in the cave courses within five-meter accuracy.

[ SubT ]
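The five-meter scoring rule described above is simple enough to sketch. The toy scorer below only checks artifact type and position against ground truth; DARPA’s real scoring also enforces a limited report budget and other rules not shown here, and all coordinates are made up.

```python
import math

FIVE_METERS = 5.0   # scoring radius stated for the Cave Circuit Virtual Competition

def score_reports(reports, ground_truth):
    """Count reports whose type matches and whose position is within 5 m of an artifact.

    reports and ground_truth are lists of (artifact_type, (x, y, z)) tuples;
    each real artifact can be credited at most once.
    """
    unclaimed = list(ground_truth)
    points = 0
    for rtype, rpos in reports:
        for i, (gtype, gpos) in enumerate(unclaimed):
            if rtype == gtype and math.dist(rpos, gpos) <= FIVE_METERS:
                points += 1
                unclaimed.pop(i)
                break
    return points

reports = [("backpack", (10.0, 2.0, 0.0)), ("helmet", (40.0, 0.0, 0.0))]
truth = [("backpack", (12.0, 3.0, 0.0)), ("helmet", (80.0, 0.0, 0.0))]
print(score_reports(reports, truth))   # 1: only the backpack report is close enough
```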

This year, the KUKA Innovation Award’s international jury of experts received a total of more than 40 ideas. The five finalist teams had until November to implement their ideas. A KUKA LBR Med lightweight robot – the first robotic component to be certified for integration into a medical device – was made available to them for this purpose. The teams also received hardware training and coaching from KUKA experts throughout the competition. At virtual.MEDICA, held 16-19 November 2020, the finalists presented their concepts to an international audience of experts and to the Innovation Award jury.

The winner of the KUKA Innovation Award 2020, worth 20,000 euros, is Team HIFUSK from the Scuola Superiore Sant'Anna in Italy.

[ KUKA Innovation Award ]

Like everything else, the in-person Cybathlon event was cancelled, but the competition itself took place, just a little more distributed than it would have been otherwise.

[ Cybathlon ]

Matternet, developer of the world's leading urban drone logistics platform, today announced the launch of operations at Labor Berlin Charité Vivantes in Germany. The program kicked off November 17, 2020, with permanent operations expected to take flight next year, creating the first urban BVLOS [Beyond Visual Line of Sight] medical drone delivery network in the European Union. The drone network expects to significantly improve the timeliness and efficiency of Labor Berlin’s diagnostics services by providing an option to avoid roadway delays, which will improve patient experience with potentially life-saving benefits and lower costs.

Routine BVLOS over an urban area? Impressive.

[ Matternet ]

Robots playing diabolo!

Thanks Thilo!

[ OMRON Sinic X ]

Anki's tech has been repackaged into this robot that serves butter:

[ Butter Robot ]

Berkshire Grey just announced our Picking With Purpose Program in which we’ve partnered our robotic automation solutions with food rescue organizations City Harvest and The Greater Boston Food Bank to pick, pack, and distribute food to families in need in time for Thanksgiving. Berkshire Grey donated about 40,000 pounds of food, used one of our robotic automation systems to pick and pack that food into meal boxes for families in need, and our team members volunteered to run the system. City Harvest and The Greater Boston Food Bank are distributing the 4,000 meal boxes we produced. This is just the beginning. We are building a sponsorship program to make Picking With Purpose an ongoing initiative.

[ Berkshire Grey ]

Thanks Peter!

We posted a video previously of Cassie learning to skip, but here's a much more detailed look (accompanying an ICRA submission) that includes some very impressive stair descending.

[ DRL ]

From garage inventors to university students and entrepreneurs, NASA is looking for ideas on how to excavate the Moon’s icy regolith, or dirt, and deliver it to a hypothetical processing plant at the lunar South Pole. The NASA Break the Ice Lunar Challenge, a NASA Centennial Challenge, is now open for registration. The competition will take place over two phases and will reward new ideas and approaches for a system architecture capable of excavating and moving icy regolith and water on the lunar surface.

[ NASA ]

Adaptation to various scene configurations and object properties, as well as stability and dexterity in robotic grasping and manipulation, remain far from fully explored. This work presents an origami-based shape-morphing fingertip design to actively tackle grasping stability and dexterity problems. The proposed fingertip uses origami as its skeleton, providing degrees of freedom at desired positions, and motor-driven four-bar linkages as its transmission components to achieve a compact fingertip.

[ Paper ]

“If Roboy crashes… you die.”

[ Roboy ]

Traditionally lunar landers, as well as other large space exploration vehicles, are powered by solar arrays or small nuclear reactors. Rovers and small robots, however, are not big enough to carry their own dedicated power supplies and must be tethered to their larger counterparts via electrical cables. Tethering severely restricts mobility, and cables are prone to failure due to lunar dust (regolith) interfering with electrical contact points. Additionally, as robots become smaller and more complex, they are fitted with additional sensors that require more power, further exacerbating the problem. Lastly, solar arrays are not viable for charging during the lunar night. WiBotic is developing rapid charging systems and energy monitoring base stations for lunar robots, including the CubeRover – a shoebox-sized robot designed by Astrobotic – that will operate autonomously and charge wirelessly on the Moon.

[ WiBotic ]

Watching pick and place robots is my therapy.

[ Soft Robotics ]

It's really, really hard to beat liquid fuel for energy storage, as Quaternium demonstrates with their hybrid drone.

[ Quaternium ]

Thanks Gregorio!

State-of-the-art quadrotor simulators have a rigid and highly specialized structure: they are either really fast, physically accurate, or photo-realistic. In this work, we propose a novel quadrotor simulator: Flightmare.

[ Flightmare ]

Drones that chuck fire-fighting balls into burning buildings, sure!

[ LARICS ]

If you missed ROS World, that's okay, because all of the talks are now online. Here's the opening keynote from Vivian Chu of Diligent Robotics, along with a couple of fun lightning talks.

[ ROS World 2020 ]

This week's CMU RI Seminar is by Chelsea Finn from Stanford University, on “Data Scalability for Robot Learning.”

Recent progress in robot learning has demonstrated how robots can acquire complex manipulation skills from perceptual inputs through trial and error, particularly with the use of deep neural networks. Despite these successes, the generalization and versatility of robots across environment conditions, tasks, and objects remains a major challenge. And, unfortunately, our existing algorithms and training set-ups are not prepared to tackle such challenges, which demand large and diverse sets of tasks and experiences. In this talk, I will discuss two central challenges that pertain to data scalability: first, acquiring large datasets of diverse and useful interactions with the world, and second, developing algorithms that can learn from such datasets. Then, I will describe multiple approaches that we might take to rethink our algorithms and data pipelines to serve these goals. This will include algorithms that allow a real robot to explore its environment in a targeted manner with minimal supervision, approaches that can perform robot reinforcement learning with videos of human trial-and-error experience, and visual model-based RL approaches that are not bottlenecked by their capacity to model everything about the world.

[ CMU RI ]


#437608 Video Friday: Agility Robotics Raises ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Digit is now in full commercial production and we’re excited to announce a $20M funding round co-led by DCVC and Playground Global!

Digits for everyone!

[ Agility Robotics ]

A flexible rover that can both travel long distances and rappel down hard-to-reach areas of scientific interest has undergone a field test in the Mojave Desert in California to showcase its versatility. Composed of two Axel robots, DuAxel is designed to explore crater walls, pits, scarps, vents, and other extreme terrain on the moon, Mars, and beyond.

This technology demonstration, developed at NASA’s Jet Propulsion Laboratory in Southern California, showcases the robot’s ability to split in two and send one of its halves — a two-wheeled Axel robot — over an otherwise inaccessible slope, using a tether as support and to supply power.

The rappelling Axel can then autonomously seek out areas to study, safely overcome slopes and rocky obstacles, and then return to dock with its other half before driving to another destination. Although the rover doesn’t yet have a mission, key technologies are being developed that might, one day, help us explore the rocky planets and moons throughout the solar system.

[ JPL ]

A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models. Why the back flips? Because the goal is to use these robots to transport drugs in humans, whose colons and other organs have rough terrain. Side flips work, too. Why a back-flipping robot to transport drugs? Getting a drug directly to its target site could remove side effects, such as hair loss or stomach bleeding, that the drug may otherwise cause by interacting with other organs along the way.

[ Purdue ]

This video shows the latest results in whole-body locomotion control of the humanoid robot iCub, achieved by the Dynamic Interaction Control line at IIT-Istituto Italiano di Tecnologia in Genova, Italy. In particular, the iCub now keeps its balance while walking and receiving pushes from an external user. The implemented control algorithms also ensure that the robot remains compliant during locomotion and human-robot interaction, a fundamental property for lowering the possibility of harming humans who share the robot’s surrounding environment.

This is super impressive, considering that iCub was only able to crawl and was still tethered not too long ago. Also, it seems to be blinking properly now, so it doesn’t look like it’s always sleepy.

[ IIT ]

This video shows a set of new tests we performed on Bolt. We conducted tests in five different scenarios: 1) walking forward/backward, 2) uneven surfaces, 3) soft surfaces, 4) push recovery, and 5) slippage recovery. Thanks to our feedback control based on Model Predictive Control, the robot can keep walking in the presence of all these uncertainties. We will open-source all the code in the near future.

[ ODRI ]

The title of this video is “Can you throw your robot into a lake?” The title of this video should be, “Can you throw your robot into a lake and drive it out again?”

[ Norlab ]

AeroVironment Successfully Completes Sunglider Solar HAPS Stratospheric Test Flight, Surpassing 60,000 Feet Altitude and Demonstrating Broadband Mobile Connectivity.

[ AeroVironment ]

We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) with the capacity of (1) resisting body-scale user actions such as pushing or leaning; (2) acting on users by pulling or transporting them; and (3) carrying multiple, potentially heavy objects (up to 80 kg) that users can freely manipulate or make interact with each other.

[ DeepAI ]

In a new video, personnel from Swiss energy supply company Kraftwerke Oberhasli AG (KWO) explain how they were able to keep employees out of harm’s way by using Flyability’s Elios 2 to collect visual data while building a new dam.

[ Flyability ]

Enjoy our Ascento robot fail compilation! With every failure we experience, we learn more and we can improve our robot for its next iteration, which will come soon… Stay tuned for more!

FYI posting a robot fails video will pretty much guarantee you a spot in Video Friday!

[ Ascento ]

Humans are remarkably good at using chopsticks. The Guinness World Record witnessed a person using chopsticks to pick up 65 M&Ms in just a minute. We aim to collect demonstrations from humans and to teach robots to use chopsticks.

[ UW Personal Robotics Lab ]

A surprising amount of personality from these Yaskawa assembly robots.

[ Yaskawa ]

This paper presents the system design, modeling, and control of the Aerial Robotic Chain Manipulator. This new robot design offers the potential to exert strong forces and moments to the environment, carry and lift significant payloads, and simultaneously navigate through narrow corridors. The presented experimental studies include a valve rotation task, a pick-and-release task, and the verification of load oscillation suppression to demonstrate the stability and performance of the system.

[ ARL ]

Whether animals or plants, whether in the water, on land or in the air, nature provides the model for many technical innovations and inventions. This is summed up in the term bionics, which is a combination of the words ‘biology‘ and ‘electronics’. At Festo, learning from nature has a long history, as our Bionic Learning Network is based on using nature as the source for future technologies like robots, assistance systems or drive solutions.

[ Festo ]

Dogs! Selfies! Thousands of LEGO bricks! This video has it all.

[ LEGO ]

An IROS workshop talk on “Cassie and Mini Cheetah Autonomy” by Maani Ghaffari and Jessy Grizzle from the University of Michigan.

[ Michigan Robotics ]

David Schaefer’s Cozmo robots are back with this mind-blowing dance-off!

What you just saw represents hundreds of hours of work, David tells us: “I wrote over 10,000 lines of code to create the dance performance as I had to translate the beats per minute of the song into motor rotations in order to get the right precision needed to make the moves look sharp. The most challenging move was the SpongeBob SquareDance as any misstep would send the Cozmos crashing into each other. LOL! Fortunately for me, Cozmo robots are pretty resilient.”

[ Life with Cozmo ]

Thanks David!
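The beats-per-minute-to-motor-rotation bookkeeping David describes is straightforward arithmetic. Here is a toy version of it; the wheel diameter and per-beat travel below are invented values, and none of this is Cozmo’s actual API or David’s code.

```python
import math

bpm = 128                      # song tempo
wheel_diameter_mm = 25.0       # hypothetical wheel size
target_mm_per_beat = 40.0      # how far the robot should move on each beat (made up)

seconds_per_beat = 60.0 / bpm
wheel_circumference = math.pi * wheel_diameter_mm
rotations_per_beat = target_mm_per_beat / wheel_circumference
wheel_speed_mmps = target_mm_per_beat / seconds_per_beat   # speed to command each beat

print(f"{seconds_per_beat:.3f} s per beat, "
      f"{rotations_per_beat:.2f} wheel rotations per beat, "
      f"drive at {wheel_speed_mmps:.0f} mm/s")
```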

This week’s GRASP on Robotics seminar is by Sangbae Kim from MIT, on “Robots with Physical Intelligence.”

While industrial robots are effective in repetitive, precise kinematic tasks in factories, the design and control of these robots are not suited for physically interactive performance that humans do easily. These tasks require ‘physical intelligence’ through complex dynamic interactions with environments, whereas conventional robots are designed primarily for position control. In order to develop a robot with ‘physical intelligence’, we first need a new type of machine that allows dynamic interactions. This talk will discuss how the new design paradigm allows dynamic interactive tasks. As an embodiment of such a robot design paradigm, the latest version of the MIT Cheetah robots and force-feedback teleoperation arms will be presented.

[ GRASP ]

This week’s CMU RI Seminar is by Kevin Lynch from Northwestern, on “Robotics and Biosystems.”

Research at the Center for Robotics and Biosystems at Northwestern University encompasses bio-inspiration, neuromechanics, human-machine systems, and swarm robotics, among other topics. In this talk I will give an overview of some of our recent work on in-hand manipulation, robot locomotion on yielding ground, and human-robot systems.

[ CMU RI ]
