Tag Archives: training
This is a sponsored article brought to you by UL Solutions.
Invest in building your team’s excellence with functional safety training and certification services from UL Solutions, a global safety science leader.
Our UL Certified Functional Safety Certification programs provide your team opportunities to learn about — or deepen their existing knowledge and skills in — functional safety to achieve professional credentials in this space.
We offer personnel certification at both the professional and expert levels in automotive, autonomous vehicles, electronics and semiconductors, machinery, industrial automation, and cybersecurity.
You can now register for any of the offerings listed below. All our instructor-led, virtual courses provide a deep dive into key functional safety standards.
UL Certified Functional Safety Professional in IEC 61511 Class with Exam – Virtual
This three-day course provides a comprehensive overview of the IEC 61511:2016 and ANSI/ISA 61511:2018 standards for the process industry. Participants who complete all three days of training can take a two-hour certification exam on the morning of the fourth day. Those who pass the exam earn individual certification as a UL Certified Functional Safety Professional in IEC 61511 or UL-CFSP.
Functional Safety Overview and Designing Safety-Related Electronic Control Systems in Accordance with IEC 61508 Standard Class with Exam – Virtual (English)
This three-day course helps engineers, developers and managers successfully apply IEC 61508 to their safety-related electrical systems. IEC 61508 serves as the base functional safety standard for various industries, including process, nuclear and machinery, among others. This course includes a one-hour follow-up Q&A session (scheduled at a later date) with one of UL Solutions’ functional safety experts.
UL Certified Autonomy Safety Professional Training in UL 4600 2nd Edition Class with Exam – Virtual (English)
This 2.5-day course highlights modern-day autonomous robotics, industrial automation, sensors and semi-automated technologies and how they can apply to safety. The course focuses on UL 4600, the Standard for Evaluation of Autonomous Products, and includes information on related safety standards.
Functional Safety Training for Earth-Moving Machinery in Agricultural Tractor and Construction Control Systems Per ISO 25119, ISO 13849 and ISO 19014
UL Certified Functional Safety Professional Training in Agriculture and Construction Machinery Class with Exam – Virtual (English)
This 2.5-day course will cover functional safety standards and concepts related to agricultural and construction earth-moving machinery. Applicable standards covered in this training include the EU Machinery Directive; ISO 19014:2018, Earth-Moving Machinery — Functional Safety — Part 1: Methodology to Determine Safety-Related Parts of the Control System and Performance Requirements; and ISO 25119:2018, Tractors and Machinery for Agriculture and Forestry — Safety-Related Parts of Control Systems. UL Solutions’ experts will cover topics such as hazard identification and risk assessment per ISO 12100:2010, Safety of Machinery — General Principles for Design — Risk Assessment and Risk Reduction. Case studies on a range of topics, including motor drives and safety product life cycles, will also help provide examples of how the requirements and concepts of the standards apply.
ISO 13849, IEC 62061, IEC 61800-5-2, ISO 25119, and the EU Machinery Directive
UL Certified Functional Safety Professional Training in Machinery Class with Exam – Virtual (English)
This 2.5-day course is for engineers working on programmable machinery and control systems. The training course will cover functional safety standards and concepts related to the EU Machinery Directive, including ISO 13849, Safety of Machinery – Safety-Related Parts of Control Systems; IEC 61800-5-2, Adjustable Speed Electrical Power Drive Systems – Part 5-2: Safety Requirements – Functional; and IEC 62061, Safety of Machinery – Functional Safety of Safety-Related Electrical, Electronic and Programmable Electronic Control Systems.
A horse, a zebra and artificial intelligence helped a team of Carnegie Mellon University researchers teach a robot to recognize water and pour it into a glass.
Your weekly selection of awesome robot videos
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
IEEE ARSO 2022: 28 May–30 May 2022, LONG BEACH, CALIF.
RSS 2022: 21 June–1 July 2022, NEW YORK CITY
ERF 2022: 28 June–30 June 2022, ROTTERDAM, NETHERLANDS
RoboCup 2022: 11 July–17 July 2022, BANGKOK
IEEE CASE 2022: 20 August–24 August 2022, MEXICO CITY
CLAWAR 2022: 12 September–14 September 2022, AZORES, PORTUGAL
CoRL 2022: 14 December–18 December 2022, AUCKLAND, NEW ZEALAND
Enjoy today's videos!
Finally, after the first Rocky movie in 1976, the Robotic Systems Lab presents a continuation of the iconic series. Our transformer robot visited Philly in 2022 as part of the International Conference on Robotics and Automation.
[ Swiss-Mile ]
Human cells grown in the lab could one day be used for a variety of tissue grafts, but these cells need the right kind of environment and stimulation. New research suggests that robot bodies could provide tendon cells with the same kind of stretching and twisting as they would experience in a real human body. It remains to be seen whether using robots to exercise human cells results in a better tissue for transplantation into patients.
[ Nature ]
Researchers from Carnegie Mellon University took an all-terrain vehicle on wild rides through tall grass, loose gravel and mud to gather data about how the ATV interacted with a challenging, off-road environment.
The resulting dataset, called TartanDrive, includes about 200,000 of these real-world interactions. The researchers believe the data is the largest real-world, multimodal, off-road driving dataset, both in terms of the number of interactions and types of sensors. The five hours of data could be useful for training a self-driving vehicle to navigate off road.
[ CMU ]
Chengxu Zhou from the University of Leeds writes, “we have recently done a demo with one operator teleoperating two legged manipulators for a bottle-opening task.”
[ Real Robotics ]
We recently hosted a Youth Fly Day, bringing together 75 freshman students from ICA Cristo Rey All Girls Academy of San Francisco for a day of hands-on exposure to and education about drones. It was an exciting opportunity for the Skydio team to help inspire the next generation of women pilots and engineers.
[ Skydio ]
Legged robotic systems leverage ground contact and the reaction forces they provide to achieve agile locomotion. However, uncertainty coupled with the discontinuous nature of contact can lead to failure in real-world environments with unexpected height variations, such as rocky hills or curbs. To enable dynamic traversal of extreme terrain, this work introduces the utilization of proprioception to estimate and react to unknown hybrid events and elevation changes and a two-degree-of-freedom tail to improve control independent of contact.
If you like this and are in the market for a new open source quadruped controller, CMU's got that going on, too.
[ Robomechanics Lab ]
A bolt-on 360 camera kit for your drone that costs $430.
[ Insta360 ]
I think I may be too old to have any idea what's going on here.
[ Neato ]
I'm not the biggest fan of the way the Stop Killer Robots folks go about trying to make their point, but they have a new documentary out, so here you go.
[ Immoral Code ]
This symposium hosted by the U.S. Department of Commerce and National Institute of Standards and Technology, Stanford Institute for Human-Centered Artificial Intelligence (HAI), and the FinRegLab, brought together leaders from government, industry, civil society, and academia to explore potential opportunities and challenges posed by artificial intelligence and machine learning deployment across different economic sectors, with a particular focus on financial services and healthcare.
[ Stanford HAI ]
In their efforts to create smart robots, AI researchers have understandably tended to focus on the brains. But a group from MIT says AI can help us design better bodies for them too, and that we should be doing both in parallel.
For a robot to solve a task, its brain and its body have to sync up perfectly to get the job done. That means that an effective AI controller that’s good at piloting one kind of body won’t necessarily work well for one that’s very different.
The standard approach is to simply design a robot body—either by hand or using AI design tools—and then train an AI to control it. But an even better solution is to carry out both processes simultaneously so that the control AI can give feedback on how changes to the body make it easier or more difficult to solve the problem.
This is known as co-design, and it’s not entirely new. But running these two optimization processes in parallel is very complicated, and it can take a long time to reach a useful solution. Because the design algorithm has to try out thousands of different configurations, the approach only works in simulation, and typically, researchers have to build a testing environment from scratch or heavily adapt existing robot training simulations.
All this takes a lot of work, which has led to most co-design environments focusing on a small number of simple tasks. And because most have been developed by separate groups, it’s not easy to compare results across them.
In an attempt to solve these problems, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has created a co-design simulator called Evolution Gym that allows researchers to test out their approaches on a wide range of tasks and terrains using a highly customizable robot design framework. The simulator has also been designed so that groups with fewer computing resources can still use it.
“With Evolution Gym we’re aiming to push the boundaries of algorithms for machine learning and artificial intelligence,” MIT’s Jagdeep Bhatia said in a press release. “By creating a large-scale benchmark that focuses on speed and simplicity, we not only create a common language for exchanging ideas and results within the reinforcement learning and co-design space, but also enable researchers without state-of-the-art compute resources to contribute to algorithmic development in these areas.”
For simplicity, the simulator, which will be presented at the Conference on Neural Information Processing Systems this week, only works in two dimensions. The team has designed 30 unique tasks, which include things like walking, jumping over obstacles, carrying or pulling objects, and crawling under barriers, and researchers can also design their own challenges.
The environment allows design algorithms to build robots by linking together squares that can be soft, rigid, or actuators—essentially muscles that enable the rest of the robot to move. An AI system then learns how to pilot this body and gives the design algorithm feedback on how good it was at different tasks.
By repeating this process many times the two algorithms can reach the best possible combination of body layout and control system to solve the challenge.
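The loop described above — propose a body, score how well a controller can pilot it, and feed that score back into the next round of designs — can be sketched as a toy evolutionary search. This is a minimal illustration, not Evolution Gym's actual API: the voxel encoding matches the article's soft/rigid/actuator squares, but the fitness function here is a stand-in for what would really be a trained reinforcement learning controller's task reward.

```python
import random

random.seed(0)  # make the toy run reproducible

# Voxel types, mirroring the article's building blocks:
# 0 = empty, 1 = rigid, 2 = soft, 3 = actuator ("muscle")
VOXELS = [0, 1, 2, 3]

def random_body(rows=5, cols=5):
    """Sample a random 2D voxel-grid robot body."""
    return [[random.choice(VOXELS) for _ in range(cols)] for _ in range(rows)]

def mutate(body, rate=0.1):
    """Flip a fraction of voxels to a new random type."""
    return [[random.choice(VOXELS) if random.random() < rate else v
             for v in row] for row in body]

def evaluate(body):
    """Stand-in for training a controller and measuring task reward.
    Here we simply reward bodies that mix actuators with structure,
    since both are needed for the body to move at all."""
    flat = [v for row in body for v in row]
    actuators = flat.count(3)
    structure = flat.count(1) + flat.count(2)
    return min(actuators, structure)

def co_design(generations=50, pop_size=20):
    """Evolve bodies: keep the best designs, refill with mutants."""
    population = [random_body() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate, reverse=True)
        elite = population[: pop_size // 4]
        population = elite + [mutate(random.choice(elite))
                              for _ in range(pop_size - len(elite))]
    return max(population, key=evaluate)

best = co_design()
```

In a real co-design run, `evaluate` is the expensive step — each candidate body requires its own round of controller training in simulation — which is why the article notes the approach demands heavy compute and why a fast, shared benchmark like Evolution Gym matters.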
To set some benchmarks for their simulator, the researchers tried out three different design algorithms working in conjunction with a deep reinforcement learning algorithm that learned to control the robots through many rounds of trial and error.
The co-designed bots performed well on the simpler tasks, like walking or carrying things, but struggled with tougher challenges, like catching and lifting, suggesting there’s plenty of scope for advances in co-design algorithms. Nonetheless, the AI-designed bots outperformed ones designed by humans on almost every task.
Intriguingly, many of the co-design bots took on similar shapes to real animals. One evolved to resemble a galloping horse, while another, set the task of climbing up a chimney, evolved arms and legs and clambered up somewhat like a monkey.
The simulator has been open-sourced and is free to use, and the team’s hope is that other researchers will now come and try out their co-design algorithms on the platform, which will make it easier to compare results.
“Evolution Gym is part of a growing awareness in the AI community that the body and brain are equal partners in supporting intelligent behavior,” the University of Vermont’s Josh Bongard said in the press release. “There is so much to do in figuring out what forms this partnership can take. Gym is likely to be an important tool in working through these kinds of questions.”
Image Credit: MIT CSAIL via YouTube
Everywhere you look in technology today, you find buzz about the promise of emergent technologies such as machine learning (ML) and artificial intelligence (AI). From curating the content that we watch on streaming services to finding ways to improve intense logistical processes, ML- and AI-based technologies already impact our lives in many ways. Increasingly, these …
The post Is AI the Future of Training for New Employees? appeared first on TFOT.