
#428140 Singapore International Robotics Expo

Singapore International Robo Expo debuts as the robotics sector is poised for accelerated growth

In partnership with Experia Events, the Singapore Industrial Automation Association sets its sights on boosting the robotics solutions industry with this strategic global platform for innovation and technology

Singapore, 18 October 2016 – The first Singapore International Robo Expo (SIRE), organised by Experia Events and co-organised by the Singapore Industrial Automation Association (SIAA), will be held from 1 to 2 November 2016, at Sands Expo and Convention Centre, Marina Bay Sands.

Themed Forging the Future of Robotics Solutions, SIRE will comprise an exhibition, product demonstrations, networking sessions and conferences. SIRE aims to be the global platform for governments, the private sector and academia to engage in dialogue, share industry best practices, network, forge partnerships, and explore funding opportunities for the adoption of robotics solutions.

“SIRE debuts at a time when robotics has been gaining traction worldwide due to the need for automation and better productivity. The latest World Robotics Report by the International Federation of Robotics has also identified Singapore as a market with one of the highest robot densities in manufacturing – giving us more opportunities for further development in this field, as well as for its extension into the services sector.

With the S$450 million pledged by the Singapore government to the National Robotics Programme to develop the industry over the next three years, SIRE is aligned with these goals of cultivating the adoption of robotics and supporting the growing industry. As an association, we are constantly looking for ways to promote robotics adoption, foster collaboration among partners, and provide funding support for our members. SIRE is precisely the strategic platform for this,” said Mr Oliver Tian, President, SIAA.

SIRE has attracted strong interest from institutes of higher learning (IHLs), research institutes, and local and international enterprises, with innovations and technologies applicable to a vast range of industries, from manufacturing to healthcare.

ST Kinetics, the Title Sponsor for the inaugural edition of the event, is one of the key exhibitors, together with other leading industry players such as ABB, Murata, Panasonic, SICK Pte Ltd and Tech Avenue, amongst others. Emerging SMEs such as H3 Dynamics, Design Tech Technologies and SMP Robotics Singapore will also showcase their innovations at the exhibition. A*STAR’s SIMTech is participating as a research institute, while IHLs supporting the event include Ngee Ann Polytechnic, Republic Polytechnic and the Institute of Technical Education (ITE).

Visitors will also be able to view “live” demonstrations at the Demo Zone and get up close to the latest innovations and technologies. Key highlights at the zone include the world’s only fully autonomous outdoor security robot, developed by SMP Robotics Singapore, as well as ABB’s YuMi (IRB 14000), a collaborative robot designed to work safely in close proximity to humans. Dynamic Stabilization Systems, SIMTech and Design Tech will also be demonstrating the capabilities of their robotic innovations at the zone.

At the Singapore International Robo Convention, key speakers representing regulators, industry leaders and academia will come together to exchange insights and engage in discourse on the various aspects of robotics and automation technology, industry trends and case studies of robotics solutions. There will also be a session discussing the details of the Singapore National Robotics Programme, led by Mr Haryanto Tan, Head, Precision Engineering Cluster Group, EDB Singapore.

SIRE will also host the France-Singapore Innovation Days in collaboration with Business France, the national agency supporting the international development of the French economy. The organisation will lead a delegation of 20 key French companies to explore business and networking opportunities with Singapore firms, and conduct specialized workshops.

To foster a deeper appreciation of robotics and inspire the next generation of robotics and automation experts, the event will also host students from institutes of higher learning on Education Day on 2 November. Students will be able to immerse themselves in the exciting developments of the robotics industry and get a sense of how robotics can be applied in real-world settings by visiting the exhibits and interacting with representatives from participating companies.

Mr Leck Chet Lam, Managing Director, Experia Events, said, “SIRE will be a game changer for the industry. We are expecting the industry’s best and new-to-market players to showcase their innovations, which could potentially add value to operations across a wide spectrum of industry sectors, from manufacturing to retail, services and healthcare. We also hope to inspire the robotics and automation experts of tomorrow with our Education Day programme.

Experia Events prides itself on organising strategic events for the global stage, featuring thought leaders and working with the industry’s best. It is an honour for us to be partnering with SIAA, a recognised body and key player in the robotics industry. We are privileged to be able to help elevate Singapore’s robotics industry through SIRE and are pulling out all the stops to ensure that the event will be a resounding success.”

SIRE is supported by its Strategic Partner, IE Singapore, as well as agencies including EDB Singapore, GovTech Singapore, the Infocomm Media Development Authority, A*STAR’s SIMTech and SPRING Singapore.

###

For further enquiries, please contact:

Marilyn Ho
Experia Events Pte Ltd
Director, Communications
Tel: +65 6595 6130
Email: marilynho@experiaevents.com

Genevieve Yeo
Experia Events Pte Ltd
Assistant Manager, Communications
Tel: +65 6595 6131
Email: genevieveyeo@experiaevents.com


#428053 Omnidirectional Mobile Robot Has Just Two Moving Parts

Spherical Induction Motor Eliminates Robot’s Mechanical Drive System
PITTSBURGH— More than a decade ago, Ralph Hollis invented the ballbot, an elegantly simple robot whose tall, thin body glides atop a sphere slightly smaller than a bowling ball. The latest version, called SIMbot, has an equally elegant motor with just one moving part: the ball.
The only other active moving part of the robot is the body itself.
The spherical induction motor (SIM), invented by Hollis, a research professor in Carnegie Mellon University’s Robotics Institute, and Masaaki Kumagai, a professor of engineering at Tohoku Gakuin University in Tagajo, Japan, eliminates the mechanical drive systems that each had used on previous ballbots. Because of this extreme mechanical simplicity, SIMbot requires less routine maintenance and is less likely to suffer mechanical failures.
The new motor can move the ball in any direction using only electronic controls. These movements keep SIMbot’s body balanced atop the ball.
Early comparisons between SIMbot and a mechanically driven ballbot suggest the new robot is capable of similar speed — about 1.9 meters per second, or the equivalent of a very fast walk — but is not yet as efficient, said Greg Seyfarth, a former member of Hollis’ lab who recently completed his master’s degree in robotics.
Induction motors are nothing new; they use magnetic fields to induce electric current in the motor’s rotor, rather than through an electrical connection. What is new here is that the rotor is spherical and, thanks to some fancy math and advanced software, can move in any combination of three axes, giving it omnidirectional capability. In contrast to other attempts to build a SIM, the design by Hollis and Kumagai enables the ball to turn all the way around, not just move back and forth a few degrees.
Though Hollis said it is too soon to compare the cost of the experimental motor with conventional motors, he said long-range trends favor the technologies at its heart.
“This motor relies on a lot of electronics and software,” he explained. “Electronics and software are getting cheaper. Mechanical systems are not getting cheaper, or at least not as fast as electronics and software are.”
SIMbot’s mechanical simplicity is a significant advance for ballbots, a type of robot that Hollis maintains is ideally suited for working with people in human environments. Because the robot’s body dynamically balances atop the motor’s ball, a ballbot can be as tall as a person, but remain thin enough to move through doorways and in between furniture. This type of robot is inherently compliant, so people can simply push it out of the way when necessary. Ballbots also can perform tasks such as helping a person out of a chair, helping to carry parcels and physically guiding a person.
Until now, moving the ball to maintain the robot’s balance has relied on mechanical means. Hollis’ ballbots, for instance, have used an “inverse mouse ball” method, in which four motors actuate rollers that press against the ball so that it can move in any direction across a floor, while a fifth motor controls the yaw motion of the robot itself.
“But the belts that drive the rollers wear out and need to be replaced,” said Michael Shomin, a Ph.D. student in robotics. “And when the belts are replaced, the system needs to be recalibrated.” He said the new motor’s solid-state system would eliminate that time-consuming process.
The rotor of the spherical induction motor is a precisely machined hollow iron ball with a copper shell. Current is induced in the ball with six laminated steel stators, each with three-phase wire windings. The stators are positioned just next to the ball and are oriented slightly off vertical.
The six stators generate travelling magnetic waves in the ball, causing the ball to move in the direction of the wave. The direction of the magnetic waves can be steered by altering the currents in the stators.
Hollis and Kumagai jointly designed the motor. Ankit Bhatia, a Ph.D. student in robotics, and Olaf Sassnick, a visiting scientist from Salzburg University of Applied Sciences, adapted it for use in ballbots.
Getting rid of the mechanical drive eliminates a lot of the friction of previous ballbot models, but virtually all friction could be eliminated by eventually installing an air bearing, Hollis said. The robot body would then be separated from the motor ball with a cushion of air, rather than passive rollers.
“Even without optimizing the motor’s performance, SIMbot has demonstrated impressive performance,” Hollis said. “We expect SIMbot technology will make ballbots more accessible and more practical for wide adoption.”
The National Science Foundation and, in Japan, Grants-in-Aid for Scientific Research (KAKENHI) supported this research. A report on the work was presented at the May IEEE International Conference on Robotics and Automation in Stockholm, Sweden.

Video by: Carnegie Mellon University
###
About Carnegie Mellon University: Carnegie Mellon (www.cmu.edu) is a private, internationally ranked research university with programs in areas ranging from science, technology and business, to public policy, the humanities and the arts. More than 13,000 students in the university’s seven schools and colleges benefit from a small student-to-faculty ratio and an education characterized by its focus on creating and implementing solutions for real problems, interdisciplinary collaboration and innovation.

Communications Department
Carnegie Mellon University
5000 Forbes Ave.
Pittsburgh, PA 15213
412-268-2900
Fax: 412-268-6929

Contact: Byron Spice
412-268-9068
bspice@cs.cmu.edu
For immediate release: October 4, 2016


#428039 Naturipe Berry Growers Invests in Harvest CROO Robotics

FOR IMMEDIATE RELEASE
CONTACT: Gary Wishnatzki
O: (813)498-4278
C: (813)335-3959
gw@harvestcroo.com

NATURIPE BERRY GROWERS INVESTS IN HARVEST CROO ROBOTICS
Adds to the growing list of strawberry industry investors

Tampa, FL (September 20, 2016) – Naturipe Berry Growers has joined the growing list of strawberry industry investors supporting Harvest CROO Robotics’ mission to answer the need for agricultural labor with technology. Naturipe is one of the largest strawberry growers in North America. With the support of Naturipe, now more than 20% of the U.S. strawberry industry has invested in Harvest CROO Robotics.

“The lack of availability of labor to harvest strawberries is one of the great challenges facing our industry,” said Rich Amirsehhi, President and CEO of Naturipe Berry Growers. “Harvest CROO Robotics’ technology to harvest berries has tremendous promise to solve this critical problem.”

Harvest CROO Robotics continues to develop and test the latest technology for agricultural robotics. The company will test its latest prototype during the Florida strawberry season, which begins in November. Improvements include increased harvest speed and the development of an autonomous mobile platform that will carry the robotic pickers through the field. After berries are picked, they will be transferred overhead to the platform level, where they will be inspected and packed into consumer units by delta robots. The development of the packing robots next year will mark another key milestone in Harvest CROO Robotics’ technological advances.

“The technology is prepared to make a major leap this coming season,” said Bob Pitzer, Co-founder and Chief Technology Officer of Harvest CROO. “We were at commercial speed last March, at a rate of 8 seconds to pick a plant. Now, by using embedded processors and a streamlined picking head design, we expect to easily cut that time in half.”

“Naturipe Berry Growers sees joining this collaborative effort as an important step in ensuring the sustainability of the U.S. strawberry industry and putting our growers in a position to be early adopters of the technology,” said Amirsehhi.

Harvest CROO is currently fundraising in preparation for the next round of prototypes. To learn more about Harvest CROO, including investment opportunities, contact info@harvestcroo.com.
###

About Harvest CROO:

Harvest CROO (Computerized Robotic Optimized Obtainer) began in 2012 with Gary Wishnatzki’s vision of creating a solution to the dwindling labor force in agriculture. With the expertise of Co-founder and Chief Technical Officer Bob Pitzer, they began developing the first Harvest CROO machines. In previous rounds, $1.8 million was raised through qualified investors. Many of these investors are members of the strawberry industry, including Sweet Life Farms, Sam Astin III, California Giant, Inc., Main Street Produce, Inc., Sweet Darling Sales, Inc., Innovative Produce, Inc., DG Berry, Inc., Central West, and Naturipe Berry Growers. In Round C, Harvest CROO is seeking to raise $3 million to build the next version, the Alpha unit, which will be the predecessor to a production model. To learn more about Harvest CROO, including current career opportunities for experienced engineers, contact info@harvestcroo.com.

About Naturipe Berry Growers:

Naturipe Berry Growers (NBG) is a co-op of growers that was founded in 1917 as the Central California Berry Growers Association. NBG markets its fruit through Naturipe Farms LLC, a grower-owned producer and international marketer of healthy, best-tasting, premium berries, with production coming primarily from multi-generation family farms located in prime berry-growing regions throughout North and South America. The diverse grower base ensures year-round availability of “locally grown” and “in-season global” conventional and organic berries. Naturipe Farms, formed in 2000, is a partnership between MBG Marketing, Hortifrut SA, Naturipe Berry Growers and Munger Farms. It has sales and customer service offices located strategically throughout the USA: Salinas, CA (HQ); Grand Junction, MI; Estero, FL; Boston, MA; Wenatchee, WA; and Atlanta, GA.
For more information visit: www.naturipefarms.com or https://www.facebook.com/Naturipe


#426831 Industrial robot runtime programming

Article provided by: www.robotct.ru
In this article, runtime programming is understood as the process of creating an executable program for a robot controller (hereinafter referred to as the “robot”) on an external controller. The robot executes the program iteratively: the external controller sends it the minimum executable command, or a small batch of commands, at a time. In other words, in runtime programming the executable program is delivered to the robot in portions, so the robot does not have, store, or know the entire executable program beforehand. Such an approach allows creating an abstract, parameterized executable program that is generated by the external device “on the fly”, i.e., during runtime.
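To make the idea concrete, here is a minimal Python sketch of such a control loop. The transport (a TCP socket), the JSON message format, the controller address and the "OK" acknowledgement are all assumptions for illustration; a real robot controller defines its own protocol.

import json
import socket

ROBOT_ADDR = ("192.168.0.10", 5000)    # hypothetical address of the robot controller

def run(points):
    """Feed the program to the robot one point at a time (runtime programming)."""
    with socket.create_connection(ROBOT_ADDR) as sock:
        stream = sock.makefile("rw")                     # line-oriented request/acknowledge channel
        for x, y, z in points:
            stream.write(json.dumps({"op": "move", "x": x, "y": y, "z": z}) + "\n")
            stream.flush()
            ack = stream.readline().strip()              # the robot confirms when the move is finished
            if ack != "OK":
                raise RuntimeError("robot rejected command: %r" % ack)

# run([(300.0, 120.0, 0.0), (300.0, 170.0, 0.0)])        # example call with two points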
Below is a description and a real example of how runtime programming works.
Typically, a program for a robot is a sequence of positions of the robot manipulator. Each of these positions is characterized by the position of the TCP (Tool Center Point), the tip of the tool mounted on the manipulator (by default the TCP is at the center of the robot’s flange, see the picture below, but its position may be adjusted, and it is often aligned with the tip of the tool mounted on the manipulator). Therefore, when programming, the TCP position in space is usually specified, and the robot determines the positions of the manipulator’s joints itself. Further in this article we will use the term “TCP position”, or, in other words, the point that the robot shall move to.

The program for the robot may also contain control logic (branching, loops), simple mathematical operations, and commands for controlling peripheral devices – analog and digital inputs/outputs. In the proposed approach to runtime programming, a standard PC is used as the external controller, which can run powerful software that provides the necessary level of abstraction (OOP and other paradigms) and tools that make developing complex logic fast and easy (high-level programming languages). The robot itself only has to handle the logic that is critical to response time and requires the reliability of an industrial controller, for example a prompt and adequate response to an emergency. Control of the peripherals connected to the robot is simply “proxied” by the robot to the PC, allowing the PC to activate or deactivate the corresponding signals on the robot, much like controlling the pins of an Arduino.

As noted earlier, runtime programming enables sending the program to the robot in portions. Usually a set of output-signal states and several points, or even a single point, is sent at a time. Thus, the trajectory of the TCP movement performed by the robot may be built dynamically, and its parts may belong to different technological processes, or even to different robots (connected to the same external controller) when a group of robots is at work.
For example, the robot moves to one of the working areas, performs the required operations, then moves to the next one, then to yet another, and then back to the first one, and so on. In the different working areas the robot performs operations required by different technological processes, whose programs are executed in parallel threads on the external controller, which allocates the robot to the processes that do not require its constant presence. This mechanism is similar to the way an OS allocates processor time (the execution resource) to various threads, and at the same time executors are not bound to particular threads for the whole period of program execution.
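A minimal sketch of this time-sharing idea, assuming a hypothetical robot connection: two task threads share one robot through a lock, so the robot serves each process in short portions, much as an OS shares CPU time between threads.

import threading
import time

class FakeRobot:
    """Stand-in for a real robot connection; only the scheduling pattern matters here."""
    def move(self, waypoint):
        time.sleep(0.1)                  # pretend the move takes some time

robot = FakeRobot()
robot_lock = threading.Lock()            # the robot is the shared resource, like CPU time in an OS

def process(name, waypoints):
    for wp in waypoints:
        with robot_lock:                 # the robot serves one technological process at a time
            robot.move(wp)
            print(name, "used the robot for waypoint", wp)
        # between portions the robot is free to serve another thread

threads = [
    threading.Thread(target=process, args=("process A", [(0, 0), (10, 0)])),
    threading.Thread(target=process, args=("process B", [(50, 50), (60, 50)])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()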
A little more theory, and we will proceed to practice.
Description of the existing methods of programming industrial robots.
Setting aside the runtime programming approach introduced in this article, two ways of programming industrial robots are usually identified: online and offline programming.
Online programming takes place through direct interaction between the programmer and the robot at the place of use. Using the remote control, or by physically moving it, the tool (TCP) mounted on the robot’s flange is brought to the desired point.
The advantage of this method is its low barrier to entry: one does not have to know anything about programming; it is enough to specify the sequence of robot positions.
An important disadvantage of this approach is that it becomes significantly more time-consuming once the program grows to several dozen (not to mention thousands of) points, or when the program is subsequently modified. In addition, during such teaching the robot cannot be used for work.
Offline programming, as the name implies, takes place away from the robot and its controller. The executable program is developed in a programming environment on a PC, after which it is loaded into the robot in its entirety. However, the programming tools for such development are not included in the robot’s basic delivery set; they are additional options that must be purchased separately and are, on the whole, expensive.
The advantage of offline programming is that the robot may be used in production and may work, while the program is being developed. The robot is only needed to debug ready programs. There is no need to go to the automation object and program the robot in person.
A major disadvantage of existing offline programming environments is their high cost. Besides, it is impossible to distribute the executable program dynamically among different robots.
As an example, let us consider creating a robot program in runtime mode that makes the robot write a job ad with a marker.

Result:

ATTENTION! The video is not an advertisement; the vacancy is closed. The article was written after the video had become outdated, to show the proposed approach to programming.

The written text:
HELLO, PEOPLE! WE NEED A DEVELOPER TO CREATE A WEB INTERFACE OF OUR KNOWLEDGE SYSTEM.
THIS WAY WE WILL BE ABLE TO GET KNOWLEDGE FROM YOU HUMANOIDS.
AND, FINALLY, WE’LL BE ABLE TO CONQUER AND IMPROVE THIS WORLD

READ MORE: HTTP://ROBOTCT.COM/HI
SINCERELY YOURS, SKYNET =^-^=
To make the robot write this text, it was necessary to send over 1,700 points to the robot.
For comparison, the spoiler below contains a screenshot of a program, written on the robot’s remote control, that draws a square. It has only 5 points (lines 4-8); each point is in fact a complete expression and takes one line. The manipulator traverses each of the four points and returns to the starting point upon completion.
The screenshot of the remote control with the executable program:

If our program were written this way, it would take at least 1,700 lines of code, one line per point. What if you had to change the text, the height of the characters, or the spacing between them? Edit all 1,700 point lines? That contradicts the spirit of automation!
So, let’s proceed to the solution…
We have a FANUC LR Mate 200iD robot with an R-30iB series cabinet controller. The robot has a preconfigured TCP at the tip of the marker and a user frame for the desktop, so we can send coordinates directly, without worrying about transforming them from the table’s coordinate system into the robot’s coordinate system.
To implement the program that sends the coordinates to the robot and calculates the absolute values of each point, we will use the RCML programming language, which supports this robot and, importantly, is free for anyone to use.
Let’s describe each letter with points, but in relative coordinates inside the frame in which the letter will be inscribed, rather than in real-space coordinates. Each letter will be drawn by a separate function that receives the character’s sequence number in the line, the line number and the letter size as input parameters, and sends the robot a set of points with absolute coordinates calculated for each point.
To write the text, we will have to call a series of functions that draw the letters in the order in which they appear in the text. RCML has a meager set of tools for working with strings, so we will write an external Python script that generates the RCML program – essentially, just the sequence of function calls corresponding to the sequence of letters.
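As an illustration, a hedged sketch of such a generator is shown below. The draw_<letter>() call form mirrors the draw_A function shown later in the article; the exact syntax of the generated main() and the characters-per-line limit are assumptions, not the actual script from the repository.

TEXT = [
    "HELLO, PEOPLE!",
    "WE NEED A DEVELOPER",
]

def generate_rcml(lines, chars_per_line=40):
    """Turn the text into a sequence of RCML draw_<letter>(column, row) calls."""
    calls = []
    for row, line in enumerate(lines):
        for col, ch in enumerate(line[:chars_per_line]):
            if ch.isalpha():                                  # skip spaces, digits and punctuation here
                calls.append("\trobot_fanuc::draw_%s(%d, %d);" % (ch.upper(), col, row))
    return "function main() {\n" + "\n".join(calls) + "\n}"

print(generate_rcml(TEXT))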
The whole code is available in repository: rct_paint_words
Let us consider the output file in more detail; execution starts from function main():

Spoiler: “Let us consider the code for drawing a letter, for example, letter A:”
function robot_fanuc::draw_A(x_cell,y_cell){
// Setting the marker to the point, the coordinates of the point are 5% along X and 95% along Y within the letter frame
robot->setPoint(x_cell, y_cell, 5, 95);
// Drawing a line
robot->movePoint(x_cell, y_cell, 50, 5);
// Drawing the second line
robot->movePoint(x_cell, y_cell, 95, 95);
// We get the “roof” of the letter A

// Moving the marker lifted from the table to draw the cross line
robot->setPoint(x_cell, y_cell, 35, 50);
// Drawing the cross-line
robot->movePoint(x_cell, y_cell, 65, 50);
// Lifting the marker from the table to move to the next letter
robot->marker_up();
}
End of spoiler

Spoiler: “The functions of moving the marker to the point, with or without lifting, are also very simple:”
// Moving the lifted marker to a point, i.e. setting the point where drawing starts
function robot_fanuc::setPoint(x_cell, y_cell, x_percent, y_precent){
	// Calculating the absolute coordinates
	x = calculate_absolute_coords_x(x_cell, x_percent);
	y = calculate_absolute_coords_y(y_cell, y_precent);

	robot->marker_up();      // Lifting the marker from the table
	robot->marker_move(x,y); // Moving
	robot->marker_down();    // Lowering the marker to the table
}

// Moving the marker to a point without lifting it, i.e. actually drawing
function robot_fanuc::movePoint(x_cell, y_cell, x_percent, y_precent){
	x = calculate_absolute_coords_x(x_cell, x_percent);
	y = calculate_absolute_coords_y(y_cell, y_precent);
	// Here everything is clear
	robot->marker_move(x,y);
}
End of spoiler
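The RCML listings above call calculate_absolute_coords_x() and calculate_absolute_coords_y(), whose bodies are not shown. The following Python sketch reconstructs the likely arithmetic from the configuration constants listed further below (CHAR_HEIGHT_MM, CHAR_WIDTH_PERCENT, CHAR_OFFSET_MM, BORDER_X, BORDER_Y); the choice of origin and the axis directions are assumptions.

CHAR_HEIGHT_MM = 50
CHAR_WIDTH_MM = CHAR_HEIGHT_MM * 60 / 100       # CHAR_WIDTH_PERCENT = 60
CHAR_OFFSET_MM = 4                              # spacing between letter frames
BORDER_X = 75                                   # working-area origin along X, mm
BORDER_Y = 120                                  # working-area origin along Y, mm

def calculate_absolute_coords_x(x_cell, x_percent):
    """Left edge of the letter frame plus a percentage of the frame width."""
    frame_left = BORDER_X + x_cell * (CHAR_WIDTH_MM + CHAR_OFFSET_MM)
    return frame_left + CHAR_WIDTH_MM * x_percent / 100

def calculate_absolute_coords_y(y_cell, y_percent):
    """Bottom edge of the letter frame plus a percentage of the frame height."""
    frame_bottom = BORDER_Y + y_cell * (CHAR_HEIGHT_MM + CHAR_OFFSET_MM)
    return frame_bottom + CHAR_HEIGHT_MM * y_percent / 100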

Spoiler: The functions marker_up, marker_down and marker_move contain only the code that sends the changed part of the TCP coordinates (Z or XY) to the robot.
function robot_fanuc::marker_up(){
	robot->set_real_di("z", SAFE_Z);
	er = robot->sendMoveSignal();
	if (er != 0){
		system.echo("error marker up\n");
		throw er;
	}
}

function robot_fanuc::marker_down(){
	robot->set_real_di("z", START_Z);
	er = robot->sendMoveSignal();
	if (er != 0){
		system.echo("error marker down\n");
		throw er;
	}
}

function robot_fanuc::marker_move(x,y){
	robot->set_real_di("x", x);
	robot->set_real_di("y", y);
	er = robot->sendMoveSignal();
	if (er != 0){
		system.echo("error marker move\n");
		throw er;
	}
}
End of spoiler

All configuration constants, including the size of the letters, the number of letters per line, etc., were put into a separate file.
Spoiler: “Configuration file”
define CHAR_HEIGHT_MM 50 // Character height in mm
define CHAR_WIDTH_PERCENT 60 // Character width in percentage of height

define SAFE_Z -20 // Safe position of the tip of the marker along the z-axis
define START_Z 0 // Working position of the tip of the marker along the z-axis

// Working area border
define BORDER_Y 120
define BORDER_X 75

// ON/OFF signals
define ON 1
define OFF 0

// Pauses between sending certain signals, milliseconds
define _SIGNAL_PAUSE_MILLISEC 50
define _OFF_PAUSE_MILLISEC 200

// Euler angles of the initial marker position
define START_W -179.707 // Roll
define START_P -2.500 // Pitch
define START_R 103.269 // Yaw

// Euler angles of marker turn
define SECOND_W -179.704
define SECOND_P -2.514
define SECOND_R -14.699

define CHAR_OFFSET_MM 4 // Spacing between letters

define UFRAME 4 // Table number
define UTOOL 2 // Tool number
define PAYLOAD 4 // Load number
define SPEED 100 // Speed
define CNT 0 // Movement smoothness parameter
define ROTATE_SPEED // Speed in turn

define HOME_PNS 4 // The number of the PNS program for home position return
End of spoiler

In total, we’ve got about 300 lines of high-level code that took no more than an hour to develop and write.
If the problem had been solved the “straightforward” way, by online programming point by point, it would have taken more than 9 hours (at roughly 20-25 seconds per point, with over 1,700 points). The developer’s suffering in that case is hard to imagine :), especially on discovering that the spacing between the letter frames had been forgotten, or that the height of the letters was wrong and the text did not fit.
Conclusion:
The use of runtime programming is one of the ways to create executable software. The advantages of this approach include the following:
The possibility of writing and debugging programs without the need to stop the robot, thus minimizing the downtime for changeover.
A parameterized executable program that’s easy to edit.
Dynamic activation and deactivation of robots in the active technological task, and cooperation between robots from various manufacturers.
Thus, with runtime programming, an executable command may be described so that any robot within the working group can execute it, or it may be written for a particular robot that will be the only one to execute it.
However, this approach has one significant limitation: the robot misinterprets, or simply ignores, the displacement smoothing instruction (CNT), because when only the current point is sent, the robot knows nothing about the next one and cannot calculate a smoothed trajectory through the current point.
Spoiler: “What is trajectory smoothing?”
When moving the robot’s tool, two parameters may be adjusted:
Travel speed
Level of smoothing
Travel speed sets the speed of the tool travel in mm/sec.
The level of smoothing (CNT) allows a group of points to be passed along a trajectory that keeps to the shortest distance between the extreme points of the group, rounding off the intermediate points instead of stopping at each of them.

End of spoiler

The danger of using this instruction in runtime mode is that the robot reports arrival at a smoothed target point while it is in reality still moving towards it; it does so in order to request the next point and calculate the smoothing. Evidently, it is impossible to know exactly where the robot is when it passes such a point, and a tool on the manipulator may need to be activated at that very point. The robot will signal that the point has been reached when it has not, and the tool will then be enabled before it is needed. In the best case, the robot will simply ignore the CNT instruction (depending on the model).
This may be fixed by sending two or more points at a time, so that a CNT point is never the last one sent; however, this increases program complexity and the burden on the programmer.
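A small sketch of that workaround, assuming a hypothetical send_points() call that transmits a batch of points to the robot: points are buffered until one without smoothing arrives, so the robot always knows the point that follows a CNT point.

def stream_with_lookahead(points, send_points):
    """points: iterable of (x, y, z, cnt) tuples; cnt > 0 means smoothing is requested."""
    buffer = []
    for pt in points:
        buffer.append(pt)
        # flush only when the last buffered point does NOT request smoothing,
        # so every CNT point is sent together with the point that follows it
        if pt[3] == 0:
            send_points(buffer)
            buffer = []
    if buffer:                    # trailing CNT points: send whatever is left
        send_points(buffer)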
Article provided by: robotct.ru
Photo Credits: Robotct.ru

