“Safety is vital when deploying a new robot system”

Mar 30, 2020

A quick look at what’s in store for the robotics industry in 2020 together with Paul Szeflinski, Founder & President of IAS Inc. in the US

 
The robotics industry has grown continuously in recent years. In 2018, the International Federation of Robotics estimated that global robot installations surpassed 400,000 units per year. As this number continues to grow, more and more manufacturers are seeing the value in industrial robotics. It also means that many manufacturers are deploying robots for the first time to handle a variety of tasks. For these companies, it can be overwhelming to determine just where to start with their first system. SICK has invited Paul Szeflinski, Founder & President of IAS Inc., to provide some insights on the trends for 2020 in the field of robotics.
“Safety is vital when deploying a new robot system”

 

What trends have you seen in the field of robot automation?

Szeflinski: We are continuing to see growth in robotics within industries that have historically been slower to invest in the field, namely general industry and food & beverage. As we consult with our clients, we hear them speak about robotics with a higher comfort level than ever before. Previously they may have been hesitant to pull the trigger and adopt something new, but more and more they now view robots as a competitive necessity that keeps costs low and improves the quality of their products.
 
The work space between robots and humans is also shrinking. The surge in collaborative robots is one cause behind this, but industrial robots with safety-related work spaces are also being deployed to combine speed and human interaction. Collaborative work space or not, safety is vital when deploying a new robot system. Meeting the Robotic Industries Association (RIA) safety standards has always been a priority for IAS and SICK in the US, but clients are now often bringing up the topic on their own.
 
In addition, we’ve seen that offline programming (OLP) software is becoming more robust and affordable. By importing the CAD of our cell design and programming offline, we can move the programming work earlier in our project schedule rather than waiting for the robot to hit our floor before performing the bulk of the programming. Application-specific add-ons within OLP software are continuing to become more robust as well. For example, in palletizing applications with our Sprinter Series work cells, we can teach box sizes and pallet configuration parameters and have the software generate the line-by-line code. This leaves us more time for real-world testing and ultimately cuts down our delivery times.
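To make that concrete, here is a minimal sketch of the kind of parameter-driven pattern generation such add-ons automate: box dimensions and a layer layout go in, a list of place positions comes out. The function, its parameters, and the simple column-stack layout are illustrative assumptions, not IAS’s Sprinter Series software or any OLP vendor’s actual API.

```python
# Hypothetical sketch of parameter-driven palletizing pattern generation.
# The layout logic (simple column stacking) and all names are illustrative
# assumptions, not any vendor's actual OLP API.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BoxSpec:
    length: float  # mm, along pallet X
    width: float   # mm, along pallet Y
    height: float  # mm, along pallet Z

def generate_place_poses(box: BoxSpec, cols: int, rows: int, layers: int,
                         pallet_origin: Tuple[float, float, float] = (0.0, 0.0, 0.0)
                         ) -> List[Tuple[float, float, float]]:
    """Return (x, y, z) box-center place positions, layer by layer."""
    ox, oy, oz = pallet_origin
    poses = []
    for layer in range(layers):
        z = oz + box.height * layer + box.height / 2.0
        for row in range(rows):
            for col in range(cols):
                x = ox + box.length * col + box.length / 2.0
                y = oy + box.width * row + box.width / 2.0
                poses.append((x, y, z))
    return poses

# Example: 3 x 2 boxes per layer, 5 layers -> 30 place targets that could be
# emitted as line-by-line robot motion statements.
pattern = generate_place_poses(BoxSpec(400, 300, 250), cols=3, rows=2, layers=5)
print(len(pattern), pattern[0])
```

In a real pattern, each target would also carry an orientation, approach offsets, and interlock handshakes, which is much of the boilerplate these add-ons take off the programmer’s plate.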

 

The work space between robots and humans is shrinking

 

How are you seeing data from robotics being used to improve manufacturing processes?

Szeflinski: By looking at data acquired by the robot system over long periods of time, manufacturers can gain insights into its performance that are otherwise difficult or impossible to obtain without a large dataset. Depending on the client, this analysis can be basic or quite advanced. The three largest points of concern we’ve seen are:
  • Cycle time analysis of the robot’s operation to assure the entire system is functioning as planned.
  • Checking system idle time by seeing if the robot system is well supplied with product on the infeed side and not jammed up on the outfeed (a minimal log-analysis sketch follows this list).
  • Predictive maintenance on the robot itself. Each robot application has products, tooling, and motion profiles that wear on the axes differently. By bringing the usage data together with diagnostic tools for the robot’s parts, it is possible to get an idea of which components may lose efficiency or fail and need to be replaced first.
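As a rough illustration of the first two points, the sketch below computes cycle times and the idle gaps between cycles from a list of timestamped events. The event names, the flat log format, and the example numbers are assumptions made for the illustration, not a real controller’s log schema.

```python
# Hypothetical sketch of cycle-time and idle-time analysis from robot log
# events. The (timestamp, event) format and the sample values are assumptions
# for the example only.
from statistics import mean

log = [
    (0.0, "cycle_start"), (8.2, "cycle_end"),
    (8.4, "cycle_start"), (16.9, "cycle_end"),
    (30.0, "cycle_start"), (38.1, "cycle_end"),  # long gap: starved infeed?
]

cycle_times, idle_times = [], []
last_end = None
for (t_start, e1), (t_end, e2) in zip(log[0::2], log[1::2]):
    assert e1 == "cycle_start" and e2 == "cycle_end"
    cycle_times.append(t_end - t_start)
    if last_end is not None:
        idle_times.append(round(t_start - last_end, 1))
    last_end = t_end

print(f"mean cycle time: {mean(cycle_times):.1f} s")   # compare against planned rate
print(f"idle gaps between cycles: {idle_times}")       # hints at infeed starvation
```

Trended over weeks of production data, these same two numbers help show whether a slowdown sits in the robot cell itself or in the equipment feeding it.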

 

Improving manufacturing processes with robotics

 

What have you seen change in the past five years with vision-guided robot automation?

Szeflinski: There have been great strides in the world of vision-guided robotics in the past five years as the technology has become useful to a wide array of applications. We’ve seen four main areas of impact:
  • Evolution of 2D cameras – As 2D vision solutions have become more affordable, compact, and robust, implementing vision-guided robots (VGRs) is easier than ever. An increasing number of options and enhanced capabilities now enable a robot to be guided more accurately, which makes VGRs a viable solution even when parts are moving quickly. Emerging 4K and higher resolutions, coupled with higher frame rates, provide more precise and accurate solutions (a pixel-to-robot coordinate sketch follows this list).
  • 3D analysis is easier and less intensive – Large strides have been made on the software side of 3D vision. Data acquisition, calibration, and stitching of multi-camera data used to be time consuming and cumbersome processes. Now they are all much faster, more accurate, and more user-friendly. With all of the dollars that companies are putting toward making accessible 3D vision a reality, we expect this to progress even further in the future.
  • Automated bin picking – Seeing parts in a random configuration and guiding the robot to unload a parts bin is far more feasible than it was five years ago. Significant testing is still needed, as there are many factors in a successful bin picking implementation, and the best way to assure all parties of reliable results is to run the cycles and find out.
  • A rise in integrated hardware/software/robot packages – Third-party camera systems can now be connected directly to and handled by industrial robots, with guidance packages available from the robot manufacturers.
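As an illustration of the guidance step behind the first point, the sketch below maps a part detected by a calibrated overhead 2D camera from pixel coordinates into robot base coordinates using a planar homography. The calibration correspondences are made-up values, and in practice the camera or robot vendor’s calibration tooling would normally produce this transform.

```python
# Hypothetical sketch of 2D vision guidance: pixel coordinates from an
# overhead camera mapped to robot base XY via a planar homography fitted
# from four made-up calibration correspondences.
import numpy as np

# (pixel u, v) -> (robot x, y) in mm, as collected during camera-to-robot
# calibration. Values are illustrative only.
pixel_pts = np.array([[100, 100], [1180, 100], [1180, 620], [100, 620]], dtype=float)
robot_pts = np.array([[200, -150], [740, -150], [740, 110], [200, 110]], dtype=float)

def fit_homography(src, dst):
    """Direct linear transform: solve for H with dst ~ H @ src (homogeneous)."""
    rows = []
    for (u, v), (x, y) in zip(src, dst):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, vt = np.linalg.svd(np.array(rows))
    return vt[-1].reshape(3, 3)   # null-space vector holds the homography entries

H = fit_homography(pixel_pts, robot_pts)

def pixel_to_robot(u, v):
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# A part blob detected at pixel (640, 360) becomes a pick target in robot XY.
print(pixel_to_robot(640, 360))
```

Higher resolution and frame rate improve the precision and timeliness of this localization step, which is what makes guidance viable for parts that are moving quickly.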

 

Great strides in the world of vision-guided robotics

 

What are unique cases where you’ve seen vision-guided robots being used?

Szeflinski: We’ve noticed three main instances of this:
  • Bin picking and kitting – This seems to be gaining a lot of traction recently in the packaging and logistics industries. When products are already boxed, the handling requirements can be less delicate and less precise than other robotic applications.
  • Stacking and racking – IAS has had applications where the end user was looking to automate the task of racking product into carts. During our automation of a seed processing facility, we had robots at the end of each of the two lines placing trays full of seeds into bakery carts that would then be transferred into control rooms. Another application called for picking stacks of vitamin trays that were loaded and unloaded from push carts. Factory items such as bakery carts and push carts are never uniform and are subject to wear over time. In both applications, cameras mounted on the end-of-arm tooling helped our robots adapt their motion paths to the changing environment they were seeing.
  • Vision to enable robot decision making – In high-mix scenarios, we’ve used vision to identify products via code reading or unique physical features and then used the spatial data to guide the robot to the part to either sort it properly or perform a value-adding process (a simple sketch of this sort logic follows the list).
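A toy sketch of that last pattern: the decoded code decides where the part goes, while the vision system’s located position provides the pick point. The product table, lane coordinates, and detection fields are hypothetical stand-ins for whatever a real vision system would report.

```python
# Hypothetical sketch of vision-driven sorting in a high-mix cell: the decoded
# product code selects a place lane, the located position gives the pick point.
from typing import NamedTuple

class Detection(NamedTuple):
    code: str       # decoded 1D/2D code or classified product ID
    x_mm: float     # part position in the robot base frame
    y_mm: float

SORT_LANES = {
    "SKU-A": (900.0, -300.0),   # place position per product family (illustrative)
    "SKU-B": (900.0, 0.0),
    "SKU-C": (900.0, 300.0),
}
REJECT_LANE = (900.0, 600.0)    # unknown or unreadable parts

def plan_move(det: Detection):
    """Return (pick_xy, place_xy) for one detected part."""
    place = SORT_LANES.get(det.code, REJECT_LANE)
    return (det.x_mm, det.y_mm), place

pick, place = plan_move(Detection("SKU-B", 412.5, -87.0))
print("pick at", pick, "-> place at", place)
```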
 
Any predictions for the future of robotic automation?

Szeflinski: We predict that digital offerings will provide new insights into the equipment on the floor and will speed up engineering processes. This shift will be felt everywhere from new project specifications to programming changes to maintenance. Important information will be more detailed and more accessible as digital models and real-world IIoT data can be merged.
 
For example, if a customer across the country wants to add a slip sheet station to one of our robotic palletizers, we can design it into our CAD model, program the robot digitally, and either send the program along with any drawings or log into the robot remotely to make the changes. The only downtime is the physical installation of the slip sheet station, and there is zero travel time involved. This is all possible today, and adoption of practices like this is growing quickly.
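As a toy illustration of that kind of offline change, the sketch below inserts a slip-sheet step after every completed layer of an existing palletizing cycle, entirely in software, before the updated sequence would ever be pushed to the robot. The step names and the flat sequence format are assumptions made purely for the example.

```python
# Hypothetical sketch of an offline program edit: add a slip-sheet placement
# after each full layer of an existing palletizing sequence. Step names and
# the sequence format are illustrative assumptions.
def add_slip_sheets(sequence, boxes_per_layer):
    """Insert a 'place_slip_sheet' step after each completed layer of box places."""
    updated, placed = [], 0
    for step in sequence:
        updated.append(step)
        if step == "place_box":
            placed += 1
            if placed % boxes_per_layer == 0:
                updated.append("place_slip_sheet")
    return updated

original = ["pick_box", "place_box"] * 6          # 6 boxes, 3 per layer
print(add_slip_sheets(original, boxes_per_layer=3))
```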

 

Digital offerings providing new insights into systems engineering

Sensor solutions for robotics

Working together as equals

Thanks to sensors from SICK, robots perceive more precisely. For all challenges in the field of robotics: Robot Vision, Safe Robotics, End-of-Arm Tooling, and Position Feedback.