
#438014 Meet Blueswarm, a Smart School of ...

Anyone who’s seen an undersea nature documentary has marveled at the complex choreography that schooling fish display, a darting, synchronized ballet with a cast of thousands.

Those instinctive movements have inspired researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering. The results could improve the performance and dependability of not just underwater robots but also other machines that require decentralized locomotion and organization, such as self-driving cars and robotic space explorers.

The fish collective called Blueswarm was created by a team led by Radhika Nagpal, whose lab is a pioneer in self-organizing systems. The oddly adorable robots can sync their movements like biological fish, taking cues from their plastic-bodied neighbors with no external controls required. Nagpal told IEEE Spectrum that this marks a milestone, demonstrating complex 3D behaviors with implicit coordination in underwater robots.

“Insights from this research will help us develop future miniature underwater swarms that can perform environmental monitoring and search in visually-rich but fragile environments like coral reefs,” Nagpal said. “This research also paves a way to better understand fish schools, by synthetically recreating their behavior.”

The research is published in Science Robotics, with Florian Berlinger as first author. Berlinger said the “Bluebot” robots integrate a trio of blue LED lights, a lithium-polymer battery, a pair of cameras, a Raspberry Pi computer, and four controllable fins within a 3D-printed hull. The fish-eye cameras detect the LEDs of their fellow swimmers and apply a custom algorithm to calculate distance, direction, and heading.
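
The paper spells out the actual vision pipeline; just to make the idea concrete, here's a minimal Python sketch of how range and bearing to a neighbor might fall out of two detected LED blobs, assuming two of a neighbor's LEDs sit a known vertical distance apart (all parameter values here are hypothetical, not Blueswarm's):

```python
import math

# Hypothetical parameters for illustration; not Blueswarm's calibrated values.
FOCAL_PX = 420.0        # camera focal length, in pixels
LED_BASELINE_M = 0.086  # assumed vertical spacing between two of a neighbor's LEDs
CX = 320.0              # principal point (x) of a 640x480 image

def neighbor_range_and_bearing(led_top, led_bottom):
    """Estimate a neighbor's range and bearing from the pixel positions of
    two vertically separated LEDs, using a simple pinhole-camera model."""
    (u1, v1), (u2, v2) = led_top, led_bottom
    pixel_sep = math.hypot(u2 - u1, v2 - v1)         # apparent LED separation
    range_m = FOCAL_PX * LED_BASELINE_M / pixel_sep  # similar triangles
    u_mid = 0.5 * (u1 + u2)
    bearing_rad = math.atan2(u_mid - CX, FOCAL_PX)   # horizontal angle off the optical axis
    return range_m, bearing_rad

# Two blobs about 36 pixels apart read as a neighbor roughly 1 m away
print(neighbor_range_and_bearing((330.0, 200.0), (331.0, 236.0)))
```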

Based on that simple production and detection of LED light, the team showed that Blueswarm could self-organize behaviors including aggregation, dispersal, and circle formation—basically, synchronized clockwise swimming. Researchers also simulated a successful search mission, an autonomous Finding Nemo. Using their dispersion algorithm, the robot school spread out until one robot could detect a red light in the tank. Its blue LEDs then flashed, triggering the aggregation algorithm to gather the school around it. Such a robot swarm might prove valuable in search-and-rescue missions at sea, covering miles of open water and reporting back to its mates.
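
For flavor, here's a toy version of what vision-only aggregation and dispersal can look like, with each robot steering from nothing but the relative neighbor positions its cameras report (purely illustrative; the paper's actual controllers differ in the details):

```python
import math

def aggregate(neighbors):
    """Steer toward the centroid of perceived neighbors (aggregation)."""
    if not neighbors:
        return (0.0, 0.0)
    cx = sum(x for x, _ in neighbors) / len(neighbors)
    cy = sum(y for _, y in neighbors) / len(neighbors)
    norm = math.hypot(cx, cy) or 1.0
    return (cx / norm, cy / norm)  # unit heading toward the group

def disperse(neighbors):
    """Dispersal is just the opposite: steer away from the group."""
    vx, vy = aggregate(neighbors)
    return (-vx, -vy)

# Neighbor positions relative to this robot, e.g. from the LED pipeline above
neighbors = [(1.0, 0.5), (0.8, -0.2), (1.5, 0.1)]
print(aggregate(neighbors), disperse(neighbors))
```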

“Each Bluebot implicitly reacts to its neighbors’ positions,” Berlinger said. The fish—RoboCod, perhaps?—also integrate a Wi-Fi module that allows new behaviors to be uploaded remotely. The lab’s previous efforts include a 1,000-strong army of “Kilobots” and a robotic construction crew inspired by termites. Both projects operated in two-dimensional space. But a 3D environment like air or water poses a tougher challenge for sensing and movement.

In nature, Berlinger notes, there’s no scaly CEO to direct the school’s movements. Nor do fish communicate their intentions. Instead, so-called “implicit coordination” guides the school’s collective behavior, with individual members executing high-speed moves based on what they see their neighbors doing. That decentralized, autonomous organization has long fascinated scientists, including roboticists.

“In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient. By using implicit rules and 3D visual perception, we were able to create a system with a high degree of autonomy and flexibility underwater where things like GPS and WiFi are not accessible.”

Berlinger adds that the research could one day translate to anything that requires decentralized robots, from self-driving cars and Amazon warehouse vehicles to exploration of faraway planets, where long communication delays make it impossible to transmit commands quickly. Today’s semi-autonomous cars face their own technical hurdles in reliably sensing and responding to their complex environments, including when foul weather obscures onboard sensors or road markers, or when they can’t fix position via GPS. An entire subset of autonomous-car research involves vehicle-to-vehicle (V2V) communications that could give cars a hive mind to guide individual or collective decisions—avoiding snarled traffic, driving safely in tight convoys, or taking group evasive action during a crash that’s beyond their sensory range.

“Once we have millions of cars on the road, there can’t be one computer orchestrating all the traffic, making decisions that work for all the cars,” Berlinger said.

The miniature robots could also work long hours in places that are inaccessible to humans and divers, or even to large tethered robots. Nagpal said the synthetic swimmers could monitor and collect data on reefs or underwater infrastructure 24/7, and squeeze into tiny places without disturbing fragile equipment or ecosystems.

“If we could be as good as fish in that environment, we could collect information and be non-invasive, in cluttered environments where everything is an obstacle,” Nagpal said.


#438012 Video Friday: These Robots Have Made 1 ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
Let us know if you have suggestions for next week, and enjoy today's videos.

We're proud to announce Starship Delivery Robots have now completed 1,000,000 autonomous deliveries around the world. We were unsure where the one millionth delivery was going to take place, as there are around 15-20 service areas open globally, all with robots doing deliveries every minute. In the end it took place in Bowling Green, Ohio, delivered to Annika Keeton, a freshman studying pre-health biology at BGSU. Annika is now part of Starship’s history!

[ Starship ]

I adore this little DIY walking robot. With modular feet and little dials that let you easily adjust the walking parameters, it's an affordable kit that's way more nuanced than most.

It's called Bakiwi, and it costs €95. A squee cover made from feathers or fur is an extra €17. Here's a more serious look at what it can do:

[ Bakiwi ]

Thanks Oswald!

Savva Morozov, an AeroAstro junior, works on autonomous navigation for the MIT mini cheetah robot and reflects on the value of a crowded Infinite Corridor.

[ MIT ]

The world's most advanced haptic feedback gloves just got a huge upgrade! HaptX Gloves DK2 achieves a level of realism that other haptic devices can't match. Whether you’re training your workforce, designing a new product, or controlling robots from a distance, HaptX Gloves make it feel real.

They're the only gloves with true-contact haptics: patented technology that displaces your skin the same way a real object would, with 133 points of tactile feedback per hand for full palm and fingertip coverage. HaptX Gloves DK2 feature the industry's most powerful force feedback, ~2X the strength of other force feedback gloves. They're also the most accurate motion tracking gloves, with 30 tracked degrees of freedom, sub-millimeter precision, no perceivable latency, and no occlusion.

[ HaptX ]

Yardroid is an outdoor robot “guided by computer vision and artificial intelligence” that seems like it can do almost everything.

That's a lot of autonomous capability, but so far, we've only seen the video. So, best not to get too excited until we know more about how it works.

[ Yardroid ]

Thanks Dan!

Since, as far as we know, Pepper can't spread COVID, it had a busy year.

I somehow missed seeing that chimpanzee magic show, but here it is:

[ Simon Pierro ] via [ SoftBank Robotics ]

In spite of the pandemic, Professor Hod Lipson’s Robotics Studio persevered and even thrived—learning to work on global teams, to develop protocols for sharing blueprints and code, and to test, evaluate, and refine their designs remotely. Equipped with a 3D printer and a kit of electronics prototyping equipment, our students engineered bipedal robots that were conceptualized, fabricated, programmed, and endlessly iterated around the globe in bedrooms, kitchens, backyards, and any other makeshift laboratory you can imagine.

[ Hod Lipson ]

Thanks Fan!

We all know how much quadrupeds love ice!

[ Ghost Robotics ]

We took the opportunity of the last storm to put the Warthog in the snow of Université Laval. Enjoy!

[ Norlab ]

They've got a long way to go, but autonomous indoor firefighting drones seem like a fantastic idea.

[ CTU ]

Individual manipulators are limited by their vertical total load capacity. This places a fundamental limit on the weight of loads that a single manipulator can move. Cooperative manipulation with two arms has the potential to increase the net weight capacity of the overall system. However, it is critical that proper load sharing takes place between the two arms. In this work, we outline a method that utilizes mechanical intelligence in the form of a whiffletree.

And your word of the day is whiffletree, which is “a mechanism to distribute force evenly through linkages.”
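
To make that definition concrete, here's the static load-sharing arithmetic for a two-arm whiffletree, sketched in Python (illustrative only; the DART Lab paper's geometry and parameters may differ):

```python
def whiffletree_shares(total_load, a, b):
    """Static load sharing for a two-arm whiffletree: the bar pivots at a
    point dividing it into lever arms a and b, so moment balance gives
    F1 * a = F2 * b, with the two shares summing to the total load."""
    f1 = total_load * b / (a + b)
    f2 = total_load * a / (a + b)
    return f1, f2

# A centered pivot splits the load evenly between the two manipulators
print(whiffletree_shares(100.0, 0.5, 0.5))    # (50.0, 50.0)
# Offsetting the pivot biases the share toward the closer attachment
print(whiffletree_shares(100.0, 0.25, 0.75))  # (75.0, 25.0)
```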

[ DART Lab ]

Thanks Raymond!

Some highlights of robotic projects at FZI in 2020, all using ROS.

[ FZI ]

Thanks Fan!

iRobot CEO Colin Angle threatens my job by sharing some cool robots.

[ iRobot ]

A fascinating new talk from Henry Evans on robotic caregivers.

[ HRL ]

The ANA Avatar XPRIZE semifinals selection submission for Team AVATRINA. The setting is a mock clinic, with the patient sitting in a wheelchair and the nurse having completed an initial intake. The avatar enters the room, controlled by the operator (a doctor). A rolling tray table with medical supplies (stethoscope, pulse oximeter, digital thermometer, oxygen mask, oxygen tube) is by the patient’s side. The video demonstrates head tracking, stereo vision, fine manipulation, bimanual manipulation, safe impedance control, and navigation.

[ Team AVATRINA ]

This five-year-old talk from Mikell Taylor, who wrote for us a while back and is now at Amazon Robotics, is entitled “Nobody Cares About Your Robot.” For better or worse, it really doesn't sound like it was written five years ago.

Robotics for the consumer market – Mikell Taylor from Scott Handsaker on Vimeo.

[ Mikell Taylor ]

Fall River Community Media presents this wonderful guy talking about his love of antique robot toys.

If you enjoy this kind of slow media, Fall River also has weekly Hot Dogs Cool Cats adoption profiles that are super relaxing to watch.

[ YouTube ]


#438006 Smellicopter Drone Uses Live Moth ...

Research into robotic sensing has, understandably I guess, been very human-centric. Most of us navigate and experience the world visually and in 3D, so robots tend to get covered with things like cameras and lidar. Touch is important to us, as is sound, so robots are getting pretty good with understanding tactile and auditory information, too. Smell, though? In most cases, smell doesn’t convey nearly as much information for us, so while it hasn’t exactly been ignored in robotics, it certainly isn’t the sensing modality of choice in most cases.

Part of the problem with smell sensing is that we just don’t have a good way of doing it, from a technical perspective. This has been a challenge for a long time, and it’s why we either bribe or trick animals like dogs, rats, and vultures into being our sensing systems for airborne chemicals. If only they’d do exactly what we wanted them to do all the time, this would be fine, but they don’t, so it’s not.

Until we get better at making chemical sensors, leveraging biology is the best we can do, and what would be ideal would be some sort of robot-animal hybrid cyborg thing. We’ve seen some attempts at remote-controlled insects, but as it turns out, you can simplify things if you don’t use the entire insect, but instead just find a way to use its sensing system. Enter the Smellicopter.

There’s honestly not too much to say about the drone itself. It’s an open-source drone project called Crazyflie 2.0, with some additional off-the-shelf sensors for obstacle avoidance and stabilization. The interesting bits are a couple of passive fins that keep the drone pointed into the wind, and then the sensor, called an electroantennogram.

[Image: UW] The drone’s sensor, called an electroantennogram, consists of a “single excised antenna” from a Manduca sexta hawkmoth and a custom signal processing circuit.

To make one of these sensors, you just, uh, “harvest” an antenna from a live hawkmoth. Obligingly, the moth antenna is hollow, meaning that you can stick electrodes up it. Whenever the olfactory neurons in the antenna (which is still technically alive even though it’s not attached to the moth anymore) encounter an odor that they’re looking for, they produce an electrical signal that the electrodes pick up. Plug the other ends of the electrodes into a voltage amplifier and filter, run it through an analog-to-digital converter, and you’ve got a chemical sensor that weighs just 1.5 grams and consumes only 2.7 mW of power. It’s significantly more sensitive than a conventional metal-oxide odor sensor, in a much smaller and more efficient form factor, making it ideal for drones.
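
In software terms, the chain amounts to amplify, filter, and threshold. Here's a rough sketch of that logic (the gain, smoothing factor, and threshold below are made up for illustration, not the UW circuit's values):

```python
import numpy as np

# Made-up values for illustration; the real gain, filter cutoff, and
# detection threshold live in the UW team's circuit and paper.
GAIN = 1000.0      # voltage amplifier gain
ALPHA = 0.1        # first-order low-pass smoothing factor
THRESHOLD_V = 0.3  # odor-detection threshold after amplification

def detect_odor(raw_mv_samples):
    """Mimic the EAG chain in software: amplify the antenna's millivolt
    signal, low-pass filter it, and threshold the result."""
    amplified = np.asarray(raw_mv_samples) * 1e-3 * GAIN  # mV -> V, then gain
    filtered = np.zeros_like(amplified)
    for i, v in enumerate(amplified):
        prev = filtered[i - 1] if i else 0.0
        filtered[i] = prev + ALPHA * (v - prev)           # simple low-pass
    return bool(filtered[-1] > THRESHOLD_V)

# A sustained deflection in the antenna signal registers as a hit
print(detect_odor([0.05, 0.1, 0.9, 1.2, 1.4, 1.3]))  # True
```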

To localize an odor, the Smellicopter uses a simple bioinspired approach called crosswind casting, which involves moving laterally left and right and then forward when an odor is detected. Here’s how it works:

The vehicle takes off to a height of 40 cm and then hovers for ten seconds to allow it time to orient upwind. The smellicopter starts casting left and right crosswind. When a volatile chemical is detected, the smellicopter will surge 25 cm upwind, and then resume casting. As long as the wind direction is fairly consistent, this strategy will bring the insect or robot increasingly closer to a singular source with each surge.
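
Here's that loop as a toy state machine; the 25 cm surge comes from the procedure above, while the casting step size and the cast-widening rule are assumptions of mine:

```python
# Toy cast-and-surge controller following the procedure quoted above.
CAST_STEP_M = 0.1   # lateral crosswind step per tick (assumed)
SURGE_STEP_M = 0.25 # upwind surge on each odor detection (from the article)

def cast_and_surge_step(state, odor_detected):
    """Return (upwind_move, crosswind_move) for one control tick.
    state['direction'] is +1 or -1, the current casting direction."""
    if odor_detected:
        return (SURGE_STEP_M, 0.0)            # surge upwind, then resume casting
    if state['steps_this_cast'] >= state['cast_width']:
        state['direction'] *= -1              # reverse the cast
        state['steps_this_cast'] = 0
        state['cast_width'] += 1              # widen the search a little
    state['steps_this_cast'] += 1
    return (0.0, state['direction'] * CAST_STEP_M)

state = {'direction': 1, 'steps_this_cast': 0, 'cast_width': 5}
for hit in [False, False, True, False]:
    print(cast_and_surge_step(state, hit))
```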

Since odors are airborne, they need a bit of a breeze to spread very far, and the Smellicopter won’t be able to detect them unless it’s downwind of the source. But, that’s just how odors work—even if you’re right next to the source, if the wind is blowing from you towards the source rather than the other way around, you might not catch a whiff of it.

There are a few other constraints to keep in mind with this sensor as well. First, rather than detecting something useful (like explosives), it’s going to detect the smells of pretty flowers, because moths like pretty flowers. Second, the antenna will literally go dead on you within a couple of hours, since it only functions while its tissues are alive and metaphorically kicking. Interestingly, it may be possible to use CRISPR-based genetic modification to breed moths with antennae that do respond to useful smells, which would be a neat trick, and we asked the researchers—Melanie Anderson, a doctoral student of mechanical engineering at the University of Washington, in Seattle; Thomas Daniel, a UW professor of biology; and Sawyer Fuller, a UW assistant professor of mechanical engineering—about this, along with some other burning questions, via email.

IEEE Spectrum, asking the important questions first: So who came up with “Smellicopter”?

Melanie Anderson: Tom Daniel coined the term “Smellicopter.” Another runner-up was “OdorRotor”!

In general, how much better are moths at odor localization than robots?

Melanie Anderson: Moths are excellent at odor detection and odor localization and need to be in order to find mates and food. Their antennae are much more sensitive and specialized than any portable man-made odor sensor. We can't ask the moths how exactly they search for odors so well, but being able to have the odor sensitivity of a moth on a flying platform is a big step in that direction.

Tom Daniel: Our best estimate is that they outperform robotic sensing by at least three orders of magnitude.

How does the localization behavior of the Smellicopter compare to that of a real moth?

Anderson: The cast-and-surge odor search strategy is a simplified version of what we believe the moth (and many other odor searching animals) are doing. It is a reactive strategy that relies on the knowledge that if you detect odor, you can assume that the source is somewhere up-wind of you. When you detect odor, you simply move upwind, and when you lose the odor signal you cast in a cross-wind direction until you regain the signal.

Can you elaborate on the potential for CRISPR to be able to engineer moths for the detection of specific chemicals?

Anderson: CRISPR is already currently being used to modify the odor detection pathways in moth species. It is one of our future efforts to specifically use this to make the antennae sensitive to other chemicals of interest, such as the chemical scent of explosives.

Sawyer Fuller: We think that one of the strengths of using a moth's antenna, in addition to its speed, is that it may provide a path to both high chemical specificity as well as high sensitivity. By expressing a preponderance of only one or a few chemosensors, we are anticipating that a moth antenna will give a strong response only to that chemical. There are several efforts underway in other research groups to make such specific, sensitive chemical detectors. Chemical sensing is an area where biology exceeds man-made systems in terms of efficiency, small size, and sensitivity. So that's why we think that the approach of trying to leverage biological machinery that already exists has some merit.

You mention that the antennae lifespan can be extended for a few days with ice. How feasible do you think this technology is outside of a research context?

Anderson: The antennae can be stored in tiny vials in a standard refrigerator or just with an ice pack to extend their life to about a week. Additionally, the process for attaching the antenna to the electrical circuit is a teachable skill. It is definitely feasible outside of a research context.

Considering the trajectory that sensor development is on, how long do you think that this biological sensor system will outperform conventional alternatives?

Anderson: It's hard to speak toward what will happen in the future, but currently, the moth antenna still stands out among any commercially-available portable sensors.

There have been some experiments with cybernetic insects; what are the advantages and disadvantages of your approach, as opposed to (say) putting some sort of tracking system on a live moth?

Daniel: I was part of a cyber insect team a number of years ago. The challenge of such research is that the animal has natural reactions to attempts to steer or control it.

Anderson: While moths are better at odor tracking than robots currently, the advantage of the drone platform is that we have control over it. We can tell it to constrain the search to a certain area, and return after it finishes searching.

What can you tell us about the health, happiness, and overall welfare of the moths in your experiments?

Anderson: The moths are cold anesthetized before the antennae are removed. They are then frozen so that they can be used for teaching purposes or in other research efforts.

What are you working on next?

Daniel: The four big efforts are (1) CRISPR modification, (2) experiments aimed at improving the longevity of the antennal preparation, (3) improved measurements of antennal electrical responses to odors combined with machine learning to see if we can classify different odors, and (4) flight in outdoor environments.

Fuller: The moth's antenna sensor gives us a new ability to sense with a much shorter latency than was previously possible with similarly-sized sensors (e.g. semiconductor sensors). What exactly a robot agent should do to best take advantage of this is an open question. In particular, I think the speed may help it to zero in on plume sources in complex environments much more quickly. Think of places like indoor settings with flow down hallways that splits out at doorways, and in industrial settings festooned with pipes and equipment. We know that it is possible to search out and find odors in such scenarios, as anybody who has had to contend with an outbreak of fruit flies can attest. It is also known that these animals respond very quickly to sudden changes in odor that is present in such turbulent, patchy plumes. Since it is hard to reduce such plumes to a simple model, we think that machine learning may provide insights into how to best take advantage of the improved temporal plume information we now have available.

Tom Daniel also points out that the relative simplicity of this project (now that the UW researchers have it all figured out, that is) means that even high school students could potentially get involved in it, even if it’s on a ground robot rather than a drone. All the details are in the paper that was just published in Bioinspiration & Biomimetics.


#437992 This Week’s Awesome Tech Stories From ...

ARTIFICIAL INTELLIGENCE
This Chinese Lab Is Aiming for Big AI Breakthroughs
Will Knight | Wired
“China produces as many artificial intelligence researchers as the US, but it lags in key fields like machine learning. The government hopes to make up ground. …It set AI researchers the goal of making ‘fundamental breakthroughs by 2025’ and called for the country to be ‘the world’s primary innovation center by 2030.’ BAAI opened a year later, in Zhongguancun, a neighborhood of Beijing designed to replicate US innovation hubs such as Boston and Silicon Valley.”

ENVIRONMENT
What Elon Musk’s $100 Million Carbon Capture Prize Could Mean
James Temple | MIT Technology Review
“[Elon Musk] announced on Twitter that he plans to give away $100 million of [his $180 billion net worth] as a prize for the ‘best carbon capture technology.’ …Another $100 million could certainly help whatever venture, or ventures, clinch Musk’s prize. But it’s a tiny fraction of his wealth and will also only go so far. …Money aside, however, one thing Musk has a particular knack for is generating attention. And this is a space in need of it.”

HEALTH
Synthetic Cornea Helped a Legally Blind Man Regain His Sight
Steve Dent | Engadget
“While the implant doesn’t contain any electronics, it could help more people than any robotic eye. ‘After years of hard work, seeing a colleague implant the CorNeat KPro with ease and witnessing a fellow human being regain his sight the following day was electrifying and emotionally moving, there were a lot of tears in the room,’ said CorNeat Vision co-founder Dr. Gilad Litvin.”

BIOTECH
MIT Develops Method for Lab-Grown Plants That May Eventually Lead to Alternatives to Forestry and Farming
Darrell Etherington | TechCrunch
“If the work of these researchers can eventually be used to create a way to produce lab-grown wood for use in construction and fabrication in a way that’s scalable and efficient, then there’s tremendous potential in terms of reducing the impact on forestry globally. Eventually, the team even theorizes you could coax the growth of plant-based materials into specific target shapes, so you could also do some of the manufacturing in the lab, by growing a wood table directly for instance.”

AUTOMATION
FAA Approves First Fully Automated Commercial Drone Flights
Andy Pasztor and Katy Stech Ferek | The Wall Street Journal
“US aviation regulators have approved the first fully automated commercial drone flights, granting a small Massachusetts-based company permission to operate drones without hands-on piloting or direct observation by human controllers or observers. …The company’s Scout drones operate under predetermined flight programs and use acoustic technology to detect and avoid drones, birds, and other obstacles.”

SPACE
China’s Surging Private Space Industry Is Out to Challenge the US
Neel V. Patel | MIT Technology Review
“[The Ceres-1] was a commercial rocket—only the second from a Chinese company ever to go into space. And the launch happened less than three years after the company was founded. The achievement is a milestone for China’s fledgling—but rapidly growing—private space industry, an increasingly critical part of the country’s quest to dethrone the US as the world’s preeminent space power.”

CRYPTOCURRENCY
Janet Yellen Will Consider Limiting Use of Cryptocurrency
Timothy B. Lee | Ars Technica
“Cryptocurrencies could come under renewed regulatory scrutiny over the next four years if Janet Yellen, Joe Biden’s pick to lead the Treasury Department, gets her way. During Yellen’s Tuesday confirmation hearing before the Senate Finance Committee, Sen. Maggie Hassan (D-N.H.) asked Yellen about the use of cryptocurrency by terrorists and other criminals. ‘Cryptocurrencies are a particular concern,’ Yellen responded. ‘I think many are used—at least in a transactions sense—mainly for illicit financing.’”

SCIENCE
Secret Ingredient Found to Power Supernovas
Thomas Lewton | Quanta
“…Only in the last few years, with the growth of supercomputers, have theorists had enough computing power to model massive stars with the complexity needed to achieve explosions. …These new simulations are giving researchers a better understanding of exactly how supernovas have shaped the universe we see today.”

Image Credit: Ricardo Gomez Angel / Unsplash


#437990 Video Friday: Record-Breaking Drone Show ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online]
RoboSoft 2021 – April 12-16, 2021 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.

A new parent STAR robot is presented. The parent robot has a tail on which the child robot can climb. By collaborating together, the two robots can reach locations that neither can reach on its own.

The parent robot can also supply the child robot with energy by recharging its batteries. The parent STAR can dispatch and retrieve the child STAR automatically (when aligned). The robots are fitted with sensors and controllers and have automatic capabilities but make no decisions on their own.

[ Bio-Inspired and Medical Robotics Lab ]

How TRI trains its robots.

[ TRI ]

The only thing more satisfying than one SCARA robot is two SCARA robots working together.

[ Fanuc ]

I'm not sure that this is strictly robotics, but it's so cool that it's worth a watch anyway.

[ Shinoda & Makino Lab ]

Flying insects heavily rely on optical flow for visual navigation and flight control. Roboticists have endowed small flying robots with optical flow control as well, since it requires just a tiny vision sensor. However, when using optical flow, the robots run into two problems that insects appear to have overcome. Firstly, since optical flow only provides mixed information on distances and velocities, using it for control leads to oscillations when getting closer to obstacles. Secondly, since optical flow provides very little information on obstacles in the direction of motion, it is hardest to detect obstacles that the robot is actually going to collide with! We propose a solution to these problems by means of a learning process.

[ Nature ]

A new Guinness World Record was set on Friday in north China for the longest animation performed by unmanned aerial vehicles (UAVs), with a swarm of 600 drones.

[ Xinhua ]

Translucency is prevalent in everyday scenes. As such, perception of transparent objects is essential for robots to perform manipulation. In this work, we propose LIT, a two-stage method for transparent object pose estimation using light-field sensing and photorealistic rendering.

[ University of Michigan ] via [ Fetch Robotics ]

This paper reports the technological progress and performance of team “CERBERUS” after participating in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge.

And here's a video report on the SubT Urban Beta Course performance:

[ CERBERUS ]

Congrats to Energy Robotics on 2 million euros in seed funding!

[ Energy Robotics ]

Thanks Stefan!

In just 2 minutes, watch HEBI Robotics spend 23 minutes assembling a robot arm.

HEBI Robotics is hosting a webinar called 'Redefining the Robotic Arm' next week, which you can check out at the link below.

[ HEBI Robotics ]

Thanks Hardik!

Achieving versatile robot locomotion requires motor skills which can adapt to previously unseen situations. We propose a Multi-Expert Learning Architecture (MELA) that learns to generate adaptive skills from a group of representative expert skills. During training, MELA is first initialised by a distinct set of pre-trained experts, each in a separate deep neural network (DNN). Then by learning the combination of these DNNs using a Gating Neural Network (GNN), MELA can acquire more specialised experts and transitional skills across various locomotion modes.
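
Stripped way down, the gating idea looks something like this numpy sketch, with fixed linear maps standing in for the expert policy networks (the real MELA architecture, dimensions, and training procedure are in the paper):

```python
import numpy as np

# A drastically simplified gating sketch in the spirit of MELA.
rng = np.random.default_rng(0)
OBS_DIM, ACT_DIM, N_EXPERTS = 8, 4, 3

experts = [rng.standard_normal((OBS_DIM, ACT_DIM)) for _ in range(N_EXPERTS)]
gate_w = rng.standard_normal((OBS_DIM, N_EXPERTS))  # stand-in gating network

def blended_action(obs):
    """Blend expert outputs using softmax gating weights computed from obs."""
    logits = obs @ gate_w
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                          # softmax over experts
    proposals = np.stack([obs @ w for w in experts])  # each expert's action
    return weights @ proposals                        # weighted combination

print(blended_action(rng.standard_normal(OBS_DIM)))
```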

[ Paper ]

Since the dawn of history, advances in science and technology have pursued “power” and “accuracy.” Initially, “hardness” in machines and materials was sought for reliable operations. In our area of Science of Soft Robots, we have combined emerging academic fields aimed at “softness” to increase the exposure and collaboration of researchers in different fields.

[ Science of Soft Robots ]

A team from the Laboratory of Robotics and IoT for Smart Precision Agriculture and Forestry at INESC TEC – Technology and Science is creating a ROS stack solution using Husky UGV for precision field crop agriculture.

[ Clearpath Robotics ]

Associate Professor Christopher J. Hasson in the Department of Physical Therapy is the director of the Neuromotor Systems Laboratory at Northeastern University. There he is working with a robotic arm to provide enhanced assistance to physical therapy patients, while maintaining the intimate therapist-patient relationship.

[ Northeastern ]

Mobile Robotic telePresence (MRP) systems aim to support enhanced collaboration between remote and local members of a given setting. But MRP systems also put the remote user in positions where they frequently rely on the help of local partners. Getting or ‘recruiting’ such help can be done with various verbal and embodied actions ranging in explicitness. In this paper, we look at how such recruitment occurs in video data drawn from an experiment where pairs of participants (one local, one remote) performed a timed searching task.

[ Microsoft Research ]

A presentation [from Team COSTAR] for the American Geophysical Union annual fall meeting on the application of robotic multi-sensor 3D mapping for scientific exploration of caves. Lidar-based 3D maps are combined with visual/thermal/spectral/gas sensors to provide rich 3D context for scientific measurements.

[ COSTAR ]
