
#439870 Video Friday: TurtleBot 4

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

Silicon Valley Robot Block Party – October 23, 2021 – Oakland, CA, USA
SSRR 2021 – October 25-27, 2021 – New York, NY, USA
Let us know if you have suggestions for next week, and enjoy today's videos.
We'll have more details on this next week, but there's a new TurtleBot, hooray!

Brought to you by iRobot (providing the base in the form of the new Create 3), Clearpath, and Open Robotics.
[ Clearpath ]
Cognitive Pilot's autonomous tech is now being integrated into production Kirovets K-7M tractors, and they've got big plans: “The third phase of the project envisages a fully self-driving tractor control mode without the need for human involvement. It includes group autonomous operation with a 'leader', the movement of a group of self-driving tractors on non-public roads, the autonomous movement of a robo-tractor paired with a combine harvester not equipped with an autonomous control system, and the use of an expanded set of farm implements with automated control and functionality to monitor their condition during operation.”

[ Cognitive Pilot ]
Thanks, Andrey!
Since the start of the year, Opteran has been working incredibly hard to deliver against our technology milestones and we're delighted to share the first video of our technology in action. In the video you can see Hopper, our robot dog (named after Grace Hopper, a pioneer of computer programming) moving around a course using components of Opteran Natural Intelligence, [rather than] a trained deep learning neural net. Our small development kit (housing an FPGA), sitting on top of the robot dog, guides Hopper, using Opteran See to provide 360 degrees of stabilised vision, and Opteran Sense to sense objects and avoid collisions.
[ Opteran ]
If you weren't paying any attention to the DARPA SubT Challenge and are now afraid to ask about it, here are two recap videos from DARPA.

[ DARPA SubT ]
A new control system, designed by researchers in MIT's Improbable AI Lab and demonstrated using MIT's robotic mini cheetah, enables four-legged robots to traverse uneven terrain in real time.
[ MIT ]
Using a mix of 3D-printed plastic and metal parts, a full-scale replica of NASA's Volatiles Investigating Polar Exploration Rover, or VIPER, was built inside a clean room at NASA's Johnson Space Center in Houston. The activity served as a dress rehearsal for the flight version, which is scheduled for assembly in the summer of 2022.
[ NASA ]
What if you could have 100x more information about your industrial sites? Agile mobile robots like Spot bring sensors to your assets in order to collect data and generate critical insights on asset health so you can optimize performance. Dynamic sensing unlocks flexible and reliable data capture for improved site awareness, safety, and efficiency.
[ Boston Dynamics ]
Fish in Washington are getting some help navigating through culverts under roads, thanks to a robot developed by University of Washington students Greg Joyce and Qishi Zhou. “HydroCUB is designed to operate from a distance through a 300-foot-long cable that supplies power to the rover and transmits video back to the operator. The goal is for the Washington State Department of Transportation, which proposed the idea, to use the tool to look for vegetation, cracks, debris and other potential 'fish-barriers' in culverts.”

[ UW ]
Thanks, Sarah!
NASA's Perseverance Mars rover carries two microphones which are directly recording sounds on the Red Planet, including the Ingenuity helicopter and the rover itself at work. For the very first time, these audio recordings offer a new way to experience the planet. Earth and Mars have different atmospheres, which affects the way sound is heard. Justin Maki, a scientist at NASA's Jet Propulsion Laboratory and Nina Lanza, a scientist at Los Alamos National Laboratory, explain some of the notable audio recorded on Mars in this video.
[ JPL ]
A new kind of fiber developed by researchers at MIT and in Sweden can be made into cloth that senses how much it is being stretched or compressed, and then provides immediate tactile feedback in the form of pressure or vibration. Such fabrics, the team suggests, could be used in garments that help train singers or athletes to better control their breathing, or that help patients recovering from disease or surgery to recover their normal breathing patterns.
[ MIT ]
Partnering with Epitomical, Extend Robotics has developed a mobile manipulator and a perception system that let anyone operate it intuitively through a VR interface, over a wireless network.
[ Extend Robotics ]
Here are a couple of videos from Matei Ciocarlie at the Columbia University ROAM lab talking about embodied intelligence for manipulation.

[ ROAM Lab ]
The AirLab at CMU has been hosting an excellent series on SLAM. You should subscribe to their YouTube channel, but here are a couple of their more recent talks.

[ Tartan SLAM Series ]
Robots as Companions invites Sougwen Chung and Madeline Gannon, two artists and researchers whose practices not only involve various types of robots but actually include them as collaborators and companions, to join Maria Yablonina (Daniels Faculty) in conversation. Through their work, they challenge the notion of the robot as an obedient task-execution device, question the ethos of robot arms as tools of industrial production and automation, and ask us to consider the robot as an equal participant in the creative process.
[ UofT ]
These two talks come from the IEEE RAS Seasonal School on Rehabilitation and Assistive Technologies based on Soft Robotics.

[ SofTech-Rehab ]

Posted in Human Robots

#437765 Video Friday: Massive Robot Joins ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AWS Cloud Robotics Summit – August 18-19, 2020 – [Online Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Here are some professional circus artists messing around with an industrial robot for fun, like you do.

The acrobats are part of Östgötateatern, a Swedish theatre group, and the chair bit got turned into its own act, called “The Last Fish.” But apparently the Swedish Work Environment Authority didn’t like that an industrial robot—a large ABB robotic arm—was being used in an artistic performance, arguing that the same safety measures that apply in a factory setting would apply on stage. In other words, the robot had to operate inside a protective cage and humans could not physically interact with it.

When told that their robot had to be removed, the acrobats went to court. And won! At least that’s what we understand from this Swedish press release. The court in Linköping, in southern Sweden, ruled that the safety measures taken by the theater had been sufficient. The group had worked with a local robotics firm, Dyno Robotics, to program the manipulator and learn how to interact with it as safely as possible. The robot—which the acrobats say is the eighth member of their troupe—will now be allowed to return.

[ Östgötateatern ]

Houston Mechatronics’ Aquanaut continues to be awesome, even in the middle of a pandemic. It’s taken the big step (big swim?) out of NASA’s swimming pool and into open water.

[ HMI ]

Researchers from Carnegie Mellon University and Facebook AI Research have created a navigation system for robots powered by common sense. The technique uses machine learning to teach robots how to recognize objects and understand where they’re likely to be found in a house. The result allows the machines to search more strategically.

[ CMU ]

Cassie manages 2.1 m/s, which is uncomfortably fast in a couple of different ways.

Next, untethered. After that, running!

[ Michigan Robotics ]

Engineers at Caltech have designed a new data-driven method to control the movement of multiple robots through cluttered, unmapped spaces, so they do not run into one another.

Multi-robot motion coordination is a fundamental robotics problem with applications ranging from urban search and rescue to the control of fleets of self-driving cars to formation flying in cluttered environments. Two key challenges make multi-robot coordination difficult: first, robots moving in new environments must make split-second decisions about their trajectories despite having incomplete data about their future path; second, the presence of larger numbers of robots in an environment makes their interactions increasingly complex (and more prone to collisions).

To overcome these challenges, Soon-Jo Chung, Bren Professor of Aerospace, and Yisong Yue, professor of computing and mathematical sciences, along with Caltech graduate student Benjamin Rivière (MS ’18), postdoctoral scholar Wolfgang Hönig, and graduate student Guanya Shi, developed a multi-robot motion-planning algorithm called “Global-to-Local Safe Autonomy Synthesis,” or GLAS, which imitates a complete-information planner with only local information, and “Neural-Swarm,” a swarm-tracking controller augmented to learn complex aerodynamic interactions in close-proximity flight.
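The Caltech announcement doesn't include code, but the core idea behind GLAS — acting on purely local observations in place of a complete-information planner — can be illustrated with a toy sketch. The policy below is a hand-written goal-attraction/neighbor-repulsion rule rather than a learned network, and every name and parameter here (`SENSE_RADIUS`, `local_policy`, the repulsion weights) is an illustrative assumption, not Caltech's implementation:

```python
import math

# Toy sketch of local-information multi-robot coordination (NOT the actual
# GLAS algorithm): each robot steers toward its goal while adding a repulsive
# term from neighbors inside its sensing radius. GLAS instead *learns* the
# local policy by imitating a complete-information planner and wraps it in a
# safety module; this sketch only mirrors the "decide from local
# observations" structure.

SENSE_RADIUS = 1.5   # a robot only sees neighbors closer than this
MAX_STEP = 0.05      # per-tick speed limit

def local_policy(pos, goal, neighbors):
    """Compute a velocity from purely local information."""
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    norm = math.hypot(gx, gy) or 1.0
    vx, vy = gx / norm, gy / norm          # unit vector toward goal
    for nx, ny in neighbors:               # repulsion from visible robots
        dx, dy = pos[0] - nx, pos[1] - ny
        d = math.hypot(dx, dy)
        if 1e-9 < d < SENSE_RADIUS:
            w = (SENSE_RADIUS - d) / d     # stronger when closer
            vx += 2.0 * w * dx
            vy += 2.0 * w * dy
    norm = math.hypot(vx, vy) or 1.0
    return (MAX_STEP * vx / norm, MAX_STEP * vy / norm)

def simulate(starts, goals, ticks=2000):
    pos = list(starts)
    min_dist = float("inf")
    for _ in range(ticks):
        # each robot observes only the neighbors within its sensing radius
        obs = [[q for j, q in enumerate(pos)
                if j != i and math.dist(p, q) < SENSE_RADIUS]
               for i, p in enumerate(pos)]
        vel = [local_policy(p, g, nb) for p, g, nb in zip(pos, goals, obs)]
        pos = [(p[0] + v[0], p[1] + v[1]) for p, v in zip(pos, vel)]
        for i in range(len(pos)):
            for j in range(i + 1, len(pos)):
                min_dist = min(min_dist, math.dist(pos[i], pos[j]))
    return pos, min_dist

# Two robots swapping positions: a head-on conflict (slightly offset in y
# to break the symmetric deadlock) that each must resolve locally.
final, min_dist = simulate([(-2.0, 0.05), (2.0, -0.05)],
                           [(2.0, 0.0), (-2.0, 0.0)])
```

Running the swap scenario, both robots reach their goals while the repulsion term keeps them well separated during the crossing; the interesting property, which GLAS gets from imitation learning rather than a hand-tuned rule, is that no robot ever needed the full global state.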

[ Caltech ]

Fetch Robotics’ Freight robot is now hauling around pulsed xenon UV lamps to autonomously disinfect spaces with UV-A, UV-B, and UV-C, all at the same time.

[ SmartGuard UV ]

When you’re a vertically symmetrical quadruped robot, there is no upside-down.

[ Ghost Robotics ]

In the virtual world, the objects you pick up do not exist: you can see that cup or pen, but it does not feel like you’re touching them. That presented a challenge to EPFL professor Herbert Shea. Drawing on his extensive experience with silicone-based muscles and motors, Shea wanted to find a way to make virtual objects feel real. “With my team, we’ve created very small, thin and fast actuators,” explains Shea. “They are millimeter-sized capsules that use electrostatic energy to inflate and deflate.” The capsules have an outer insulating membrane made of silicone enclosing an inner pocket filled with oil. Each capsule is surrounded by four electrodes that can close like a zipper. When a voltage is applied, the electrodes are pulled together, causing the center of the capsule to swell like a blister. It is an ingenious system because the capsules, known as HAXELs, can move not only up and down, but also side to side and around in a circle. “When they are placed under your fingers, it feels as though you are touching a range of different objects,” says Shea.

[ EPFL ]

Through the simple trick of reversing motors on impact, a quadrotor can land much more reliably on slopes.

[ Sherbrooke ]

Turtlebot delivers candy at Harvard.

I <3 Turtlebot SO MUCH

[ Harvard ]

Traditional drone controllers are a little bit counterintuitive, because there’s one stick that’s forwards and backwards and another stick that’s up and down but they’re both moving on the same axis. How does that make sense?! Here’s a remote that gives you actual z-axis control instead.

[ Fenics ]

Thanks, Ashley!

Lio is a mobile robot platform with a multifunctional arm explicitly designed for human-robot interaction and personal care assistant tasks. The robot has already been deployed in several health care facilities, where it is functioning autonomously, assisting staff and patients on an everyday basis.

[ F&P Robotics ]

Video shows a ground vehicle autonomously exploring and mapping a multi-story garage building and a connected patio on the Carnegie Mellon University campus. The vehicle runs onboard state estimation and mapping leveraging range, vision, and inertial sensing, local planning for collision avoidance, and terrain analysis. All processing is real-time, with no post-processing involved. The vehicle drives at 2 m/s throughout the exploration run. This work was developed for the DARPA Subterranean Challenge.

[ CMU ]

Raytheon UK’s flagship STEM programme, the Quadcopter Challenge, gives 14- to 15-year-olds the chance to participate in a hands-on, STEM-based engineering challenge to build a fully operational quadcopter. Each team is provided with an identical kit of parts, tools and instructions to build and customise their quadcopter, whilst Raytheon UK STEM Ambassadors provide mentoring, technical support and deliver bite-size learning modules to support the build.

[ Raytheon ]

A video on some of the research work that is being carried out at The Australian Centre for Field Robotics, University of Sydney.

[ University of Sydney ]

Jeannette Bohg, assistant professor of computer science at Stanford University, gave one of the Early Career Award Keynotes at RSS 2020.

[ RSS 2020 ]

Adam Savage remembers Grant Imahara.

[ Tested ]

Posted in Human Robots