Industrial automation needs optimal eyes: 3D cameras with time-of-flight technology (3D ToF). In a cross-divisional project, SICK brought two such vision systems to market at just the right time, each of which sets the standard in its field.
Seeing more together: 3D vision cameras with time-of-flight technology
Action in Industry 4.0 production: twelve packages correctly placed on the pallet: check; twelve packages correctly on the pallet: check; twelve packages correctly on the pallet: check – picking is running like clockwork, a case for blue. Meanwhile, a mobile robot fetches a fresh supply of pallets, and when an employee gets a little too close to it on its way through the factory hall, it immediately slows down – a case for yellow.
Blue stands for the most demanding automation requirements and for the Visionary-T Mini product line, the first line for whose 3D ToF technology SICK is cooperating with Microsoft. And yellow stands for the most demanding safety requirements and for the safeVisionary2 camera, which at first glance looks almost identical to the Mini apart from its color, but which is the first 3D ToF camera in the world to achieve the high safety certification of Performance Level c in accordance with EN 13849. Both are so-called 3D snapshot cameras, used by industrial customers in a wide range of sectors.
Focus on programmability or safety
“What makes our blue camera special is not only its compactness, robustness, and excellent data quality, but also its programmability. Thanks to this so-called edge computing, apps can be programmed directly in the sensor. Depending on the application the sensor is meant to solve, customers can choose the right software for it themselves: software-defined sensing, in other words. To that end, the device brings along not just the camera but also the computing capacity; that is what makes it so versatile,” emphasizes Dr. Anatoly Sherman, Head of 3D Snapshot, Product Management & Applications Engineering.
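To give a rough sense of what such an on-sensor app could look like conceptually, here is a minimal Python sketch of the edge computing idea; the function names, frame format, and threshold are hypothetical illustrations, not SICK's actual programming interface.

```python
# Conceptual sketch of "software-defined sensing": the application logic runs
# directly on the sensor, so only compact results leave the device.
# All names and data formats here are hypothetical illustrations.
import random


def get_next_depth_frame():
    """Stand-in for the sensor delivering one 3D snapshot (distances in meters)."""
    return [[random.uniform(0.5, 3.0) for _ in range(8)] for _ in range(8)]


def count_near_points(frame, threshold_m=1.0):
    """Hypothetical app logic: count measurement points closer than a threshold."""
    return sum(1 for row in frame for distance in row if distance < threshold_m)


def publish(result):
    """Stand-in for reporting the evaluated result to the plant network."""
    print(f"points closer than 1 m: {result}")


# The on-device loop: raw 3D data stays on the sensor,
# only the evaluated result is transmitted.
for _ in range(3):
    publish(count_near_points(get_next_depth_frame()))
```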
With the yellow safeVisionary2, however, the main benefit is right there in the product name: safety. Most of the time, that means preventing collisions in mobile robots, creating a 3D protective field for collaborative robots, or preventing service robots from falling. “Our yellow cameras are certified for that kind of job; in this case, by TÜV. That includes the integrated software. It comes installed on the device, because the safeVisionary2 is a safety product. All the user has to do is adjust a few parameters relating to integration and the relevant application, and then the camera can be put into operation right away,” says Torsten Rapp, Head of the Autonomous Safety business unit, as he explains the various potential applications of the two camera sensors.
However, as well as the technology, there is something else that makes these cameras special, and that is the unprecedented collaboration across multiple departments during work on their development. “Our customers are constantly looking at their productivity and want affordable solutions with maximum additional benefit and ever faster turnaround times,” says Sherman. “We regard it as the perfect time to significantly step up our internal collaboration and, for the first time, develop a highly adaptable programmable ‘blue’ camera and a safety-focused ‘yellow’ camera in one project right from the outset.” The blue team contributed 3D expertise over the course of the multi-year development work, while the yellow team brought safety know-how to the table.
New approach requires a new way of thinking
With a task like this before them, everyone involved had to set about changing their mindsets. Certain freedoms that might be available in the development of a blue automation sensor were subject to limitations imposed by the process for the safety version. “Yes, we did occasionally all have to go the extra mile in pursuit of our goal of a more customer-focused approach, but it was worth it,” says Rapp. The new approach enabled two products to be launched in relatively quick succession, and both immediately met with high demand.
For the team members involved, the project is a prime example of how SICK could approach further sensor development in the future. Speaking of the future, things in industry are moving from automation toward autonomization and complete interconnectivity between machinery and sensors as part of adaptive production.
The goal is for all incoming production data to be evaluated in real time with smart algorithms so that processes can be adjusted continuously, resulting in fully digitalized and interconnected production environments. “Our vision systems, which also work in real time, are a good building block for that,” says Rapp. The cameras act like eyes, delivering exactly the high-quality 3D data that is required for optimized autonomous decision-making. “And we all know that there’s only so much you can do without a good pair of eyes,” adds Sherman.
3D time-of-flight technology: precise 3D data in real time
Time-of-flight (3D ToF) refers to a method that measures, simultaneously for every point in the image, the time it takes a light signal to travel between the camera and a target. Once the time of arrival or the phase shift of the reflected light is known, it is possible to determine the distance to the object. The Visionary-T Mini from SICK, for example, delivers more than 6.5 million 3D distance data points per second, all on a very stable platform. This method, also known as 3D snapshot technology, can use time-of-flight measurement to gain a three-dimensional picture of even static scenes without any need for actuators or moving mechanical parts in the camera.
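To make the relationship between travel time, phase shift, and distance a little more tangible, here is a minimal Python sketch of the underlying arithmetic; the modulation frequency and the sample values are illustrative assumptions, not Visionary-T Mini specifications.

```python
# Minimal sketch of the time-of-flight arithmetic described above.
# It illustrates the measurement principle only; it is not SICK firmware.
import math

C = 299_792_458.0  # speed of light in m/s


def distance_from_travel_time(round_trip_time_s: float) -> float:
    """Distance from the measured round-trip time of a light pulse."""
    # The light covers the camera-to-object path twice, hence the division by 2.
    return C * round_trip_time_s / 2.0


def distance_from_phase_shift(phase_rad: float, modulation_freq_hz: float) -> float:
    """Distance from the phase shift of amplitude-modulated light.

    Unambiguous only up to half the modulation wavelength, i.e. c / (2 * f).
    """
    return C * phase_rad / (4.0 * math.pi * modulation_freq_hz)


if __name__ == "__main__":
    # A round trip of 20 nanoseconds corresponds to roughly 3 m.
    print(f"{distance_from_travel_time(20e-9):.2f} m")
    # A phase shift of pi/2 at an assumed 30 MHz modulation is about 1.25 m.
    print(f"{distance_from_phase_shift(math.pi / 2, 30e6):.2f} m")
```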
Momentum: Magazine from the 2022 Annual Report
It is worthwhile to scrutinize success stories for momentum. What was the moment or occasion that set everything rolling and brought success?
The articles in this magazine show that momentum is no coincidence and is more than just a chain of fortunate circumstances. Momentum comes from intuition, inspiration, experience, competence and passion.
