Tag Archives: Safety

#441055 Industrial Functional Safety Training ...

This is a sponsored article brought to you by UL Solutions.

Invest in building your team’s excellence with functional safety training and certification services from UL Solutions, a global safety science leader.

Our UL Certified Functional Safety Certification programs provide your team opportunities to learn about — or deepen their existing knowledge and skills in — functional safety to achieve professional credentials in this space.

We offer personnel certification at both the professional and expert levels in automotive, autonomous vehicles, electronics and semiconductors, machinery, industrial automation, and cybersecurity.

You can now register for any of the offerings listed below. All our instructor-led, virtual courses provide a deep dive into key functional safety standards.

IEC 61511
UL Certified Functional Safety Professional in IEC 61511 Class with Exam – Virtual

This three-day course provides a comprehensive overview of the IEC 61511:2016 and ANSI/ISA 61511:2018 standards for the process industry. Participants who complete all three days of training can take a two-hour certification exam on the morning of the fourth day. Those who pass the exam earn individual certification as a UL Certified Functional Safety Professional in IEC 61511 (UL-CFSP).

Purchase training→

IEC 61508
Functional Safety Overview and Designing Safety-Related Electronic Control Systems in Accordance with IEC 61508 Standard Class with Exam – Virtual (English)

This three-day course helps engineers, developers and managers successfully apply IEC 61508 to their safety-related electrical systems. IEC 61508 serves as the base functional safety standard for various industries, including process, nuclear and machinery, among others. This course includes a one-hour follow-up Q&A session (scheduled at a later date) with one of UL Solutions’ functional safety experts.

Purchase training→

UL 4600
UL Certified Autonomy Safety Professional Training in UL 4600 2nd Edition Class with Exam – Virtual (English)

This 2.5-day course highlights modern-day autonomous robotics, industrial automation, sensors and semi-automated technologies and how they can apply to safety. The course focuses on UL 4600, the Standard for Evaluation of Autonomous Products, and includes information on related safety standards.

Purchase training→

Functional Safety Training for Earth-Moving Machinery in Agricultural Tractor and Construction Control Systems Per ISO 25119, ISO 13849 and ISO 19014
UL Certified Functional Safety Professional Training in Agriculture and Construction Machinery Class with Exam – Virtual (English)

This 2.5-day course will cover functional safety standards and concepts related to agricultural and construction earth-moving machinery. Applicable standards covered in this training include the EU Machinery Directive; ISO 19014:2018, Earth-Moving Machinery — Functional Safety — Part 1: Methodology to Determine Safety-Related Parts of the Control System and Performance Requirements; and ISO 25119:2018, Tractors and Machinery for Agriculture and Forestry — Safety-Related Parts of Control Systems. UL Solutions’ experts will cover topics such as hazard identification and risk assessment per ISO 12100:2010, Safety of Machinery — General Principles for Design — Risk Assessment and Risk Reduction. Case studies on a range of topics, including motor drives and safety product life cycles, will also help provide examples of how the requirements and concepts of the standards apply.

Purchase training→

ISO 13849, IEC 62061, IEC 61800-5-2, ISO 25119, and the EU Machinery Directive
UL Certified Functional Safety Professional Training in Machinery Class with Exam – Virtual (English)

This 2.5-day course is for engineers working on programmable machinery and control systems. The training course will cover functional safety standards and concepts related to the EU Machinery Directive, including ISO 13849, Safety of Machinery – Safety-Related Parts of Control Systems; IEC 61800-5-2, Adjustable Speed Electrical Power Drive Systems – Part 5-2: Safety Requirements – Functional; and IEC 62061, Safety of Machinery – Functional Safety of Safety-Related Electrical, Electronic and Programmable Electronic Control Systems.

Purchase training→ Continue reading

Posted in Human Robots

#439678 Video Friday: Afghan Girls Robotics Team ...

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – [Online Event]
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 20-21, 2021 – [Online Event]
Let us know if you have suggestions for next week, and enjoy today's videos.
Five members of an all-girl Afghan robotics team have arrived in Mexico, fleeing an uncertain future at home after the recent collapse of the U.S.-backed government and takeover by the Taliban.
[ Reuters ] via [ FIRST Mexico ]
Thanks, Fan!
As far as autonomous cars are concerned, there's suburban Arizona difficulty, San Francisco difficulty, and then Asia rush hour difficulty. This is a 9:38-long video that is actually worth watching in its entirety, because it's a fully autonomous car from AutoX driving through a Shenzhen urban village. Don't miss the astonished pedestrians, the near-miss with a wandering dog, and the comically one-sided human-vehicle interaction on a single-lane road.

The AutoX Gen5 system has 50 sensors in total, as well as a vehicle control unit with 2,200 TOPS of computing power. There are 28 cameras capturing a total of 220 million pixels per second, six high-resolution LiDAR units offering 15 million points per second, and 4D RADAR with 0.9-degree resolution encompassing a 360-degree view around the vehicle. Using camera and LiDAR fusion-based blind-spot perception modules, the Gen5 system covers the entire RoboTaxi body with zero blind spots.
[ AutoX ]
Sometimes, robots do nice things for humans.

[ US Soccer ]
Body babbling? Body babbling.

[ CVUT ]
Thanks, Fan!
Matias from the Oxford Robotics Institute writes, “This is a demonstration of our safe visual teach and repeat navigation system running on the ANYmal robot in the Corsham mines/former Cold War bunker in the UK. This is part of some testing we've been doing for the DARPA SubT challenge as part of the Cerberus team.”

[ Oxford Robotics ]
Thanks, Matias!
We built a robotic chess player with a Universal Robots UR5e, a 2D camera, and a deep-learning neural network to illustrate what we do at the Mechatronics, Automation, and Control System Lab at the University of Washington.
[ MACS Lab ] via [ UW Engineering ]
Thanks, Sarah!
Autonomous inspection of powerlines with quadrotors is challenging. Flights require persistent perception to keep a close look at the lines. We propose a method that uses event cameras to robustly track powerlines. The performance is evaluated in real-world flights along a powerline. The tracker is able to persistently track the powerlines, with a mean tracking lifetime per line 10x longer than existing approaches.
[ ETHZ ]
I could totally do this, I just choose not to.

[ Flexiv ]
Thanks, Yunfan!
Drone Badminton enables people with low vision to play badminton again, using a drone as a ball. This has the potential to diversify physical activity options for people with low vision.
[ Digital Nature Group ]
Even with the batteries installed, the Open Dynamic Robot Initiative's quadruped is still super skinny looking.

[ ODRI ]
At USC's Center for Advanced Manufacturing, we have developed a space for multidisciplinary human-robot interaction. The Baxter robot collaborates with the user to execute their own customizable tie-dye design.
[ USC Viterbi ]
I will never understand the impulse that marketing folks have to add bizarre motor noises to robot videos.

[ DeepRobotics ]
FedEx and Berkshire Grey have teamed up to streamline small package processing.
[ FedEx ]
An ABB robot analyzes COVID tests in a fully automated, unmanned setup, assisting with the delivery of specimens between stations 24 hours a day. Test results for 96 specimens can be completed every 60 minutes, processing more than 1,800 specimens per day.
[ ABB ]
Thanks, Fan!
This is, and I quote, “the best and greatest robot death scene of all time.”

[ The Black Hole ]
Thanks, Mark!
Audrow Nash interviews Melonee Wise for the Sense Think Act podcast.

[ Sense Think Act ]
Tom Galluzzo interviews Andrew Thomaz for the Crazy Hard Robots podcast.

[ Crazy Hard Robots ] Continue reading

Posted in Human Robots

#439311 Amazon develops new technologies to ...

Teams at the Amazon Robotics and Advanced Technology labs in both Seattle, Washington, and northern Italy have begun diligently testing new technology they hope will improve safety for employees by carrying out tasks such as transporting carts, packages, and totes through Amazon facilities. Continue reading

Posted in Human Robots

#439110 Robotic Exoskeletons Could One Day Walk ...

Engineers, using artificial intelligence and wearable cameras, now aim to help robotic exoskeletons walk by themselves.

Increasingly, researchers around the world are developing lower-body exoskeletons to help people walk. These are essentially walking robots users can strap to their legs to help them move.

One problem with such exoskeletons: They often depend on manual controls to switch from one mode of locomotion to another, such as from sitting to standing, or standing to walking, or walking on the ground to walking up or down stairs. Relying on joysticks or smartphone apps every time you want to switch the way you want to move can prove awkward and mentally taxing, says Brokoslaw Laschowski, a robotics researcher at the University of Waterloo in Canada.

Scientists are working on automated ways to help exoskeletons recognize when to switch locomotion modes — for instance, using sensors attached to legs that can detect bioelectric signals sent from your brain to your muscles telling them to move. However, this approach comes with a number of challenges, such as how skin conductivity can change as a person’s skin gets sweatier or dries off.

Now several research groups are experimenting with a new approach: fitting exoskeleton users with wearable cameras to provide the machines with vision data that will let them operate autonomously. Artificial intelligence (AI) software can analyze this data to recognize stairs, doors, and other features of the surrounding environment and calculate how best to respond.

Laschowski leads the ExoNet project, the first open-source database of high-resolution wearable camera images of human locomotion scenarios. It holds more than 5.6 million images of indoor and outdoor real-world walking environments. The team used this data to train deep-learning algorithms; their convolutional neural networks can already automatically recognize different walking environments with 73 percent accuracy “despite the large variance in different surfaces and objects sensed by the wearable camera,” Laschowski notes.
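
To make the idea concrete, here is a rough, hypothetical sketch of how this kind of environment recognition could be set up: a pretrained convolutional network is fine-tuned on labeled wearable-camera frames. This is not the ExoNet code; the class labels, folder paths, and training settings below are assumptions for illustration only.

```python
# Hypothetical sketch: fine-tune a pretrained CNN to classify walking environments
# from wearable-camera frames. Class names, folder layout, and hyperparameters are
# placeholder assumptions, not details taken from the ExoNet project.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

CLASSES = ["level_ground", "stairs_up", "stairs_down", "door", "ramp"]  # assumed labels

# Standard ImageNet preprocessing for the pretrained backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Expects one folder per class, e.g. data/train/stairs_up/*.jpg (hypothetical path)
train_set = datasets.ImageFolder("data/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # swap in a new classification head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # a single pass over the data, for illustration
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

In practice, a system like this would also need to run within the tight compute and power budgets of an exoskeleton, which is part of why the researchers emphasize accuracy under low computational and memory requirements.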

According to Laschowski, a potential limitation of their work is its reliance on conventional 2-D images, whereas depth cameras could also capture potentially useful distance data. He and his collaborators ultimately chose not to rely on depth cameras for a number of reasons, including the fact that the accuracy of depth measurements typically degrades in outdoor lighting and with increasing distance, he says.

In similar work, researchers in North Carolina had volunteers with cameras either mounted on their eyeglasses or strapped onto their knees walk through a variety of indoor and outdoor settings to capture the kind of image data exoskeletons might use to see the world around them. The aim? “To automate motion,” says Edgar Lobaton, an electrical engineering researcher at North Carolina State University. He says they are focusing on how AI software might reduce uncertainty due to factors such as motion blur or overexposed images “to ensure safe operation. We want to ensure that we can really rely on the vision and AI portion before integrating it into the hardware.”

In the future, Laschowski and his colleagues will focus on improving the accuracy of their environmental analysis software with low computational and memory storage requirements, which are important for onboard, real-time operations on robotic exoskeletons. Lobaton and his team also seek to account for uncertainty introduced into their visual systems by movement.

Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system’s analysis of a user's current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says.
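
As a minimal, hypothetical sketch of what such vision-driven mode switching might look like in software, the controller could map the classifier's environment label to a locomotion mode while always deferring to the user. The mode names, confidence threshold, and override mechanism below are invented for illustration and are not taken from the ExoNet controller.

```python
# Hypothetical sketch of vision-driven locomotion-mode switching with a user
# override; names and thresholds are illustrative assumptions only.
from enum import Enum, auto
from typing import Optional

class Mode(Enum):
    STAND = auto()
    LEVEL_WALK = auto()
    STAIR_ASCENT = auto()
    STAIR_DESCENT = auto()

# Assumed mapping from recognized environment classes to locomotion modes
ENV_TO_MODE = {
    "level_ground": Mode.LEVEL_WALK,
    "stairs_up": Mode.STAIR_ASCENT,
    "stairs_down": Mode.STAIR_DESCENT,
}

def select_mode(env_label: str, confidence: float, user_override: Optional[Mode],
                current: Mode, threshold: float = 0.9) -> Mode:
    """Pick the next locomotion mode, always deferring to the user if they intervene."""
    if user_override is not None:   # the user can always override the classifier
        return user_override
    if confidence < threshold:      # low-confidence vision: keep the current mode
        return current
    return ENV_TO_MODE.get(env_label, current)

# Example: the classifier sees stairs ahead with high confidence and no override is given
print(select_mode("stairs_up", 0.95, user_override=None, current=Mode.LEVEL_WALK))
```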

However, Laschowski adds, “User safety is of the utmost importance, especially considering that we're working with individuals with mobility impairments,” resulting perhaps from advanced age or physical disabilities.
“The exoskeleton user will always have the ability to override the system should the classification algorithm or controller make a wrong decision.” Continue reading

Posted in Human Robots

#439100 Video Friday: Robotic Eyeball Camera

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30 – June 5, 2021 – Xi'an, China
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

What if seeing devices looked like us? Eyecam is a prototype exploring the potential future design of sensing devices. Eyecam is a webcam shaped like a human eye that can see, blink, look around and observe us.

And it's open source, so you can build your own!

[ Eyecam ]

Looks like Festo will be turning some of its bionic robots into educational kits, which is a pretty cool idea.

[ Bionics4Education ]

Underwater soft robots are challenging to model and control because of their high degrees of freedom and their intricate coupling with water. In this paper, we present a method that leverages the recent development in differentiable simulation coupled with a differentiable, analytical hydrodynamic model to assist with the modeling and control of an underwater soft robot. We apply this method to Starfish, a customized soft robot design that is easy to fabricate and intuitive to manipulate.

[ MIT CSAIL ]

Rainbow Robotics, the company that made HUBO, has a new collaborative robot arm.

[ Rainbow Robotics ]

Thanks Fan!

We develop an integrated robotic platform for advanced collaborative robots and demonstrate an application of multiple robots collaboratively transporting an object to different positions in a factory environment. The proposed platform integrates a drone, a mobile manipulator robot, and a dual-arm robot to work autonomously, while also collaborating with a human worker. The platform also demonstrates the potential of a novel manufacturing process, which incorporates adaptive and collaborative intelligence to improve the efficiency of mass customization for the factory of the future.

[ Paper ]

Thanks Poramate!

At Sevastopol State University, a team from the Laboratory of Underwater Robotics and Control Systems and the Research and Production Association “Android Technika” performed tests of an underwater anthropomorphic manipulator robot.

[ Sevastopol State ]

Thanks Fan!

Taiwanese company TCI Gene created a COVID test system based on their fully automated and enclosed gene testing machine QVS-96S. The system includes two ABB robots and carries out 1,800 tests per day, operating 24/7. Every hour, tests of 96 virus samples are completed with an accuracy of 99.99%.

[ ABB ]

A short video showing how a Halodi Robotics robot can be used in a commercial guarding application.

[ Halodi ]

During the past five years, under the NASA Early Space Innovations program, we have been developing new design optimization methods for underactuated robot hands, aiming to achieve versatile manipulation in highly constrained environments. We have prototyped hands for NASA’s Astrobee robot, an in-orbit assistive free flyer for the International Space Station.

[ ROAM Lab ]

The new, improved OTTO 1500 is a workhorse AMR designed to move heavy payloads through demanding environments faster than any other AMR on the market, with zero compromise to safety.

[ OTTO Motors ]

It takes very, very high-performance sensing and actuation to pull this off.

[ Ishikawa Group ]

We introduce a conversational social robot designed for long-term in-home use to help with loneliness. We present a novel robot behavior design to have simple self-reflection conversations with people to improve wellness, while still being feasible, deployable, and safe.

[ HCI Lab ]

We are one of the 5 winners of the Start-up Challenge. This video illustrates what we achieved during the Swisscom 5G exploration week. Our proof-of-concept tele-excavation system is composed of a Menzi Muck M545 walking excavator, automated and customized by the Robotic Systems Lab, and an IBEX motion platform as the operator station. The operator and remote machine are connected for the first time via a 5G network infrastructure, which was brought to our test field by Swisscom.

[ RSL ]

This video shows LOLA balancing on different terrain when being pushed in different directions. The robot is technically blind, not using any camera-based or prior information on the terrain (hard ground is assumed).

[ TUM ]

Autonomous driving when you cannot see the road at all because it's buried in snow is some serious autonomous driving.

[ Norlab ]

A hierarchical and robust framework for learning bipedal locomotion is presented and successfully implemented on the 3D biped robot Digit. The feasibility of the method is demonstrated by successfully transferring the learned policy in simulation to the Digit robot hardware, realizing sustained walking gaits under external force disturbances and challenging terrains not included during the training process.

[ OSU ]

This is a video summary of the Center for Robot-Assisted Search and Rescue's deployments under the direction of emergency response agencies to more than 30 disasters in five countries from 2001 (9/11 World Trade Center) to 2018 (Hurricane Michael). It includes the first use of ground robots for a disaster (WTC, 2001), the first use of small unmanned aerial systems (Hurricane Katrina 2005), and the first use of water surface vehicles (Hurricane Wilma, 2005).

[ CRASAR ]

In March, a team from the Oxford Robotics Institute collected a week of epic off-road driving data, as part of the Sense-Assess-eXplain (SAX) project.

[ Oxford Robotics ]

As a part of the AAAI 2021 Spring Symposium Series, HEBI Robotics was invited to present an Industry Talk on the symposium's topic: Machine Learning for Mobile Robot Navigation in the Wild. Included in this presentation was a short case study on one of our upcoming mobile robots that is being designed to successfully navigate unstructured environments where today's robots struggle.

[ HEBI Robotics ]

Thanks Hardik!

This Lockheed Martin Robotics Seminar is from Chad Jenkins at the University of Michigan, on “Semantic Robot Programming… and Maybe Making the World a Better Place.”

I will present our efforts towards accessible and general methods of robot programming from the demonstrations of human users. Our recent work has focused on Semantic Robot Programming (SRP), a declarative paradigm for robot programming by demonstration that builds on semantic mapping. In contrast to procedural methods for motion imitation in configuration space, SRP is suited to generalize user demonstrations of goal scenes in workspace, such as for manipulation in cluttered environments. SRP extends our efforts to crowdsource robot learning from demonstration at scale through messaging protocols suited to web/cloud robotics. With such scaling of robotics in mind, prospects for cultivating both equal opportunity and technological excellence will be discussed in the context of broadening and strengthening Title IX and Title VI.

[ UMD ] Continue reading

Posted in Human Robots