Tag Archives: arm

#439929 GITAI’s Autonomous Robot Arm Finds ...

Late last year, Japanese robotics startup GITAI sent their S1 robotic arm up to the International Space Station as part of a commercial airlock extension module to test out some useful space-based autonomy. Everything moves pretty slowly on the ISS, so it wasn't until last month that NASA astronauts installed the S1 arm and GITAI was able to put the system through its paces—or rather, sit in comfy chairs on Earth and watch the arm do most of its tasks by itself, because that's the dream, right?

The good news is that everything went well, and the arm did everything GITAI was hoping it would do. So what's next for commercial autonomous robotics in space? GITAI's CEO tells us what they're working on.

In this technology demonstration, the GITAI S1 autonomous space robot was installed inside the ISS Nanoracks Bishop Airlock and succeeded in executing two tasks: assembling structures and panels for In-Space Assembly (ISA), and operating switches & cables for Intra-Vehicular Activity (IVA).

One of the advantages of working in space is that it's a highly structured environment. Microgravity can be somewhat unpredictable, but you have a very good idea of the characteristics of objects (and even of lighting) because everything that's up there is excessively well defined. So, stuff like using a two-finger gripper for relatively high precision tasks is totally possible, because the variation that the system has to deal with is low. Of course, things can always go wrong, so GITAI also tested teleop procedures from Houston to make sure that having humans in the loop was also an effective way of completing tasks.

Since full autonomy is vastly more difficult than almost full autonomy, occasional teleop is probably going to be critical for space robots of all kinds. We spoke with GITAI CEO Sho Nakanose to learn more about their approach.

IEEE Spectrum: What do you think is the right amount of autonomy for robots working inside of the ISS?

Sho Nakanose: We believe that a combination of 95% autonomous control and 5% remote judgment and operation is the most efficient way to work. In this ISS demonstration, all of the work was performed with 99% autonomous control and 1% remote decision making. However, in actual operations on the ISS, irregular tasks that cannot be handled by autonomous control will occur, and those should be handled by remote control from the ground, so we believe a final ratio of about 5% remote judgment and control will be the most efficient.
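To make the "mostly autonomous, occasionally teleoperated" idea concrete, here is a purely illustrative sketch of a supervisory control loop in which tasks run autonomously by default and fall back to a ground operator when they are flagged as irregular or when autonomous execution fails. None of these names correspond to GITAI's actual software; the arm and ground-station objects are hypothetical.

```python
# Illustrative sketch only: autonomous-by-default task execution with a
# remote-operation fallback. All names here are hypothetical placeholders.

class TaskFailure(Exception):
    """Raised when autonomous execution of a task does not succeed."""

def run_mission(tasks, arm, ground_station):
    """Run tasks autonomously, deferring irregular or failed ones to remote operators."""
    for task in tasks:
        if task.get("irregular", False):
            # Irregular tasks go straight to remote judgment and operation.
            ground_station.teleoperate(arm, task)
            continue
        try:
            arm.execute_autonomously(task)  # the ~95-99% case
        except TaskFailure as err:
            # Fall back to the ground operator for the remaining few percent.
            ground_station.teleoperate(arm, task, context=err)
```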

GITAI will apply the general-purpose autonomous space robotics technology, know-how, and experience acquired through this tech demo to develop extra-vehicular robotics (EVR) that can execute docking, repair, and maintenance tasks for On-Orbit Servicing (OOS) or conduct various activities for lunar exploration and lunar base construction. -Sho Nakanose

I'm sure you did many tests with the system on the ground before sending it to the ISS. How was operating the robot on the ISS different from the testing you had done on Earth?

The biggest difference between experiments on the ground and on the ISS is the microgravity environment, but that was not very difficult to cope with. However, the ISS is an unknown environment we had never been to before, and we encountered a variety of unexpected situations that were extremely difficult to deal with; for example, an unexpected communication breakdown occurred due to a failed thruster-firing experiment on the Russian module. We were able to solve all of these problems because the development team had carefully prepared for irregularities in advance.

It looked like the robot was performing many tasks using equipment designed for humans. Do you think it would be better to design things like screws and control panels to make them easier for robots to see and operate?

Yes, I think so. Unlike the ISS, which was built in the past, the lunar-orbiting space station Gateway and the lunar base that will be built in the future are expected to be places where humans and robots work together. Therefore, it is necessary to devise and implement an interface that is easy for both humans and robots to use. In 2019, GITAI received an order from JAXA to develop guidelines for such an interface on the ISS and Gateway.

What are you working on next?

We are planning to conduct an on-orbit extra-vehicular demonstration in 2023 and a lunar demonstration in 2025. We are also working on space robot development projects for several customers for which we have already received orders. Continue reading

Posted in Human Robots

#439691 Researchers develop bionic arm that ...

Cleveland Clinic researchers have engineered a first-of-its-kind bionic arm for patients with upper-limb amputations that allows wearers to think, behave and function like a person without an amputation, according to new findings published in Science Robotics. Continue reading

Posted in Human Robots

#439604 Elephant Robotics Expands Lightweight ...

This article is sponsored by Elephant Robotics.

Elephant Robotics is well known for its line of innovative products that help enhance manufacturing, assembly, education, and more. In 2020, Elephant Robotics released the world's smallest 6-axis robot arm: myCobot. Since its release, myCobot has sold over 5,000 units to clients all over the world.

Building on the success of myCobot and to meet demand from more users, Elephant Robotics is now expanding its Lightweight Robot Arm Product Line.

myCobot provides an answer for affordable commercial robot arms
The idea of a lightweight commercial robot arm has been around for a long time, but factories and assembly lines are still the most common settings for robot arms. A traditional robot arm is usually heavy, loud, and difficult to program. Most importantly, the price is high and the cost recovery cycle is unacceptably long. These issues have kept robot arms from entering commercial settings.

Elephant Robotics' myCobot series, for the first time, provides an answer for all these issues.

The myCobot series of lightweight 6-axis robots offers payloads from 250 grams to 2 kilograms and working ranges from 280 to 600 mm. Elephant Robotics' innovative all-in-one design eliminates the traditional control box by integrating all controllers and panels into the base. All myCobot series robots are open source, support multiple ways of programming, and are easy for beginners to use and adapt to their needs.

• myCobot 280, the flagship product, is an open-source robot arm with a 250 g payload. It is an ideal platform for learning ROS, V-REP, myBlockly, MATLAB, CAN, and RS-485 bus control.

• myCobot 320 has a 1 kg payload and a continuous working time of 8 hours, providing an unprecedented option for the service industry.

• myCobot Pro 600, the top-level product of the myCobot series, features a 600 mm reach and a 2 kg payload. It is equipped with three harmonic drives, being used on a commercial robot of this kind for the first time. myCobot Pro 600 is expanding the use of robot arms to medical, catering, manufacturing, and other industries that have not yet benefited from automation.

The myCobot series of robotic arms offers usability, safety, and low-noise operation. Compared to other options, it is a highly competitive choice for a wide range of automation applications: it allows quick deployment, enables human-robot collaboration, increases efficiency for businesses, and is a cost-effective solution.

Traditional industry + robot arm?
The myCobot series can be used for commercial scenarios including production, manufacturing, and assembly. For some more creative ideas, check out the videos of it making coffee, making matcha, giving a robot massage, or helping a photographer work.

myCobot Pro as a photographer's assistant. Image: Elephant Robotics

The myCobot series can also be used for scientific research, educational purposes, and medical purposes.

A couple of other unique examples include using it as a smart barista to expand a coffee business, providing a robot massage, helping in a photographic studio with accurate and stable precision work, and continuously printing out photos in an efficient production line that combines artistic creation and robotics. It can also work as an assistant in a workshop, enabling human-robot collaboration and creative projects. Its all-in-one design also makes it a great fit for automated guided vehicle (AGV) solutions.

All of the products in the myCobot line are open source and work with myStudio, Elephant Robotics' one-stop platform for all of its robots. The platform provides continuous firmware updates, video tutorials, and maintenance and repair information (tutorials, Q&A, and so on). Users can also buy several accessories targeted at collaborative robot applications.

Open source robot arm
The myCobot product line offers various software interfaces and adapts to the majority of development platforms. It can be integrated with tools like the Robot Operating System (ROS) and MoveIt, and offers APIs for Python, C++, C#, Java, and Arduino. It also supports multiple ways of programming, including myBlockly and RoboFlow.
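As a rough illustration of the Python route, here is a minimal sketch assuming the open-source pymycobot package. The import path, serial port name, baud rate, and target angles below are assumptions and may need adjusting for a particular model, firmware, or library version.

```python
# Minimal sketch: joint-space control of a myCobot over USB serial using pymycobot.
# Port name and baud rate are typical defaults and may differ on your setup.
import time

from pymycobot import MyCobot  # older library versions use pymycobot.mycobot

# Connect to the arm over serial.
mc = MyCobot("/dev/ttyUSB0", 115200)

# Move all six joints to a home pose, then to a second pose, at 50% speed.
mc.send_angles([0, 0, 0, 0, 0, 0], 50)
time.sleep(3)
mc.send_angles([30, -20, -45, 15, 60, 0], 50)
time.sleep(3)

# Read back the current joint angles in degrees.
print(mc.get_angles())
```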

Elephant Robotics aims to provide the best development experience and to lower development barriers so that more users can get their hands on myCobots and create useful applications.

“With the new myCobot series products, we are happy to enable customers to create more efficiently on a larger scale than ever before,” said Elephant Robotics cofounder and CEO Joey Song. “We have helped customers from different industries to achieve automation upgrades, like the Tumor Thermal Therapy Robot in medical use.”

“We are hoping to allow more people to use our latest robotic arm,” he added, “to create and enhance their businesses and maker work.” Continue reading

Posted in Human Robots

#439243 Scientists Added a Sense of Touch to a ...

Most people probably underestimate how much our sense of touch helps us navigate the world around us. New research makes this crystal clear: a robotic arm with the ability to feel halved the time it took its user to complete tasks.

In recent years, rapid advances in both robotics and neural interfaces have brought the dream of bionic limbs (like the one sported by Luke Skywalker in the Star Wars movies) within touching distance. In 2019, researchers even unveiled a robotic prosthetic arm with a sense of touch that the user could control with their thoughts alone.

But so far, these devices have typically relied on connecting to nerves and muscles in the patient’s residual upper arm. That has meant the devices don’t work for those who have been paralyzed or whose injuries have caused too much damage to those tissues.

That may be about to change, though. For the first time, researchers have allowed a patient to control a robotic arm using a direct connection to their brain while simultaneously receiving sensory information from the device. And by closing the loop, the patient was able to complete tasks in half the time compared to controlling the arm without any feedback.

“The control is so intuitive that I’m basically just thinking about things as if I were moving my own arm,” patient Nathan Copeland, who has been working with researchers at the University of Pittsburgh for six years, told NPR.

The results, reported in Science, build on previous work from the same team that showed they could use implants in Copeland’s somatosensory cortex to trigger sensations localized to regions of his hand, despite him having lost feeling and control thanks to a spinal cord injury.

The 28-year-old had also previously controlled an external robotic arm using a neural interface wired up to his motor cortex, but in the latest experiment the researchers combined the two strands of research, with impressive results.

In a series of tasks designed to test dexterity, including moving objects of different shapes and sizes and pouring from one cup to another, Copeland was able to reduce the time he took to complete these tasks from a median of 20 seconds to just 10, and his performance was often equivalent to that of an able-bodied person.

The sensory information that Copeland receives from the arm is still fairly rudimentary. Sensors measure torque in the joints at the base of the robotic fingers, which is then translated into electrical signals and transmitted to the brain. He reported that the feedback didn’t feel natural, but more like pressure or a gentle tingling.
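To make the idea of that mapping concrete, here is a purely illustrative sketch of how a joint-torque reading could be clamped and scaled into a stimulation amplitude. The thresholds, units, and function names are hypothetical; the study's actual encoding scheme is not detailed in this article.

```python
# Illustrative sketch only: mapping fingertip joint torque to a stimulation
# amplitude. All ranges and names are hypothetical, not taken from the study.

def torque_to_stim_amplitude(torque_nm: float,
                             torque_max: float = 0.5,
                             amp_min_ua: float = 10.0,
                             amp_max_ua: float = 60.0) -> float:
    """Map a joint torque (N*m) to a stimulation amplitude (microamps)."""
    # Clamp to the sensor's expected range, then scale linearly.
    t = max(0.0, min(torque_nm, torque_max))
    return amp_min_ua + (amp_max_ua - amp_min_ua) * (t / torque_max)

# Example: light contact vs. a firm grasp produce different stimulation levels.
print(torque_to_stim_amplitude(0.05))  # gentle contact -> low amplitude
print(torque_to_stim_amplitude(0.40))  # firm grasp -> high amplitude
```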

But that’s still a lot more information than cab be gleaned from simply watching the hand’s motions, which is all he had to go on before. And the approach required almost no training, unlike other popular approaches based on sensory substitution that stimulate a patch of skin or provide visual or audio cues that the patient has to learn to associate with tactile sensations.

“We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be,” Robert Gaunt, a co-author of the paper, said in a press release.

“When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way.”

An external robotic arm is still a long way from a properly integrated prosthetic, and it will likely require significant work to squeeze all the required technology into a more portable package. But Bolu Ajiboye, a neural engineer from Case Western Reserve University, told Wired that providing realistic sensory signals directly to the brain, and in particular ones that are relayed in real time, is a significant advance.

In a related perspective in Science, Aldo Faisal of Imperial College London said that the integration of a sense of touch may not only boost the performance of prosthetics, but also give patients a greater sense of ownership over their replacement limbs.

The breakthrough, he added, also opens up a host of interesting lines of scientific inquiry, including whether similar approaches could help advance robotics or be used to augment human perception with non-biological sensors.

Image Credit: RAEng_Publications from Pixabay Continue reading

Posted in Human Robots

#439100 Video Friday: Robotic Eyeball Camera

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

What if seeing devices looked like us? Eyecam is a prototype exploring the potential future design of sensing devices. Eyecam is a webcam shaped like a human eye that can see, blink, look around and observe us.

And it's open source, so you can build your own!

[ Eyecam ]

Looks like Festo will be turning some of its bionic robots into educational kits, which is a pretty cool idea.

[ Bionics4Education ]

Underwater soft robots are challenging to model and control because of their high degrees of freedom and their intricate coupling with water. In this paper, we present a method that leverages recent developments in differentiable simulation coupled with a differentiable, analytical hydrodynamic model to assist with the modeling and control of an underwater soft robot. We apply this method to Starfish, a customized soft robot design that is easy to fabricate and intuitive to manipulate.

[ MIT CSAIL ]

Rainbow Robotics, the company that made HUBO, has a new collaborative robot arm.

[ Rainbow Robotics ]

Thanks Fan!

We develop an integrated robotic platform for advanced collaborative robots and demonstrate an application of multiple robots collaboratively transporting an object to different positions in a factory environment. The proposed platform integrates a drone, a mobile manipulator robot, and a dual-arm robot to work autonomously, while also collaborating with a human worker. The platform also demonstrates the potential of a novel manufacturing process, which incorporates adaptive and collaborative intelligence to improve the efficiency of mass customization for the factory of the future.

[ Paper ]

Thanks Poramate!

At Sevastopol State University, the team from the Laboratory of Underwater Robotics and Control Systems, together with the Research and Production Association “Android Technika,” performed tests of an underwater anthropomorphic manipulator robot.

[ Sevastopol State ]

Thanks Fan!

Taiwanese company TCI Gene created a COVID test system based on its fully automated and enclosed gene-testing machine, the QVS-96S. The system includes two ABB robots and carries out 1,800 tests per day, operating 24/7. Every hour, 96 virus sample tests are performed with an accuracy of 99.99%.

[ ABB ]

A short video showing how a Halodi Robotics robot can be used in a commercial guarding application.

[ Halodi ]

During the past five years, under the NASA Early Space Innovations program, we have been developing new design optimization methods for underactuated robot hands, aiming to achieve versatile manipulation in highly constrained environments. We have prototyped hands for NASA’s Astrobee robot, an in-orbit assistive free flyer for the International Space Station.

[ ROAM Lab ]

The new, improved OTTO 1500 is a workhorse AMR designed to move heavy payloads through demanding environments faster than any other AMR on the market, with zero compromise to safety.

[ OTTO Motors ]

Very, very high performance sensing and actuation to pull this off.

[ Ishikawa Group ]

We introduce a conversational social robot designed for long-term in-home use to help with loneliness. We present a novel robot behavior design to have simple self-reflection conversations with people to improve wellness, while still being feasible, deployable, and safe.

[ HCI Lab ]

We are one of the 5 winners of the Start-up Challenge. This video illustrates what we achieved during the Swisscom 5G exploration week. Our proof-of-concept tele-excavation system is composed of a Menzi Muck M545 walking excavator automated & customized by Robotic Systems Lab and IBEX motion platform as the operator station. The operator and remote machine are connected for the first time via a 5G network infrastructure which was brought to our test field by Swisscom.

[ RSL ]

This video shows LOLA balancing on different terrain when being pushed in different directions. The robot is technically blind, not using any camera-based or prior information on the terrain (hard ground is assumed).

[ TUM ]

Autonomous driving when you cannot see the road at all because it's buried in snow is some serious autonomous driving.

[ Norlab ]

A hierarchical and robust framework for learning bipedal locomotion is presented and successfully implemented on the 3D biped robot Digit. The feasibility of the method is demonstrated by successfully transferring the learned policy in simulation to the Digit robot hardware, realizing sustained walking gaits under external force disturbances and challenging terrains not included during the training process.

[ OSU ]

This is a video summary of the Center for Robot-Assisted Search and Rescue's deployments under the direction of emergency response agencies to more than 30 disasters in five countries from 2001 (9/11 World Trade Center) to 2018 (Hurricane Michael). It includes the first use of ground robots for a disaster (WTC, 2001), the first use of small unmanned aerial systems (Hurricane Katrina 2005), and the first use of water surface vehicles (Hurricane Wilma, 2005).

[ CRASAR ]

In March, a team from the Oxford Robotics Institute collected a week of epic off-road driving data, as part of the Sense-Assess-eXplain (SAX) project.

[ Oxford Robotics ]

As a part of the AAAI 2021 Spring Symposium Series, HEBI Robotics was invited to present an Industry Talk on the symposium's topic: Machine Learning for Mobile Robot Navigation in the Wild. Included in this presentation was a short case study on one of our upcoming mobile robots that is being designed to successfully navigate unstructured environments where today's robots struggle.

[ HEBI Robotics ]

Thanks Hardik!

This Lockheed Martin Robotics Seminar is from Chad Jenkins at the University of Michigan, on “Semantic Robot Programming… and Maybe Making the World a Better Place.”

I will present our efforts towards accessible and general methods of robot programming from the demonstrations of human users. Our recent work has focused on Semantic Robot Programming (SRP), a declarative paradigm for robot programming by demonstration that builds on semantic mapping. In contrast to procedural methods for motion imitation in configuration space, SRP is suited to generalize user demonstrations of goal scenes in workspace, such as for manipulation in cluttered environments. SRP extends our efforts to crowdsource robot learning from demonstration at scale through messaging protocols suited to web/cloud robotics. With such scaling of robotics in mind, prospects for cultivating both equal opportunity and technological excellence will be discussed in the context of broadening and strengthening Title IX and Title VI.

[ UMD ] Continue reading

Posted in Human Robots