3D Time-of-Flight camera module for industrial and robotic applications, and a system solution for event-based image processing

LUCID Vision Labs GmbH from Ilsfeld presents the new »Helios2 Wide 3D Time-of-Flight Camera«, which combines a Sony DepthSense™ IMX556PLR back-illuminated ToF image sensor with a wide-angle lens offering a 108° field of view. The camera is particularly suitable for applications with a short working distance and a large working area, such as full-size palletizing. It delivers a depth resolution of 640 x 480 pixels at a working distance of up to 8.3 meters and a frame rate of 30 fps.

The time-of-flight method uses a laser diode to emit light and a camera to record the reflection from the illuminated object, similar to the principle of radar. The system measures the time of flight of the reflected signal, from which the distance to the reflecting surface can be determined for each pixel of the camera sensor. This allows a complete two-dimensional height image to be captured in just a few milliseconds; the image data is then converted into a 3D model of the object.
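To make the principle concrete, the following minimal Python sketch converts a measured round-trip time per pixel into a distance via d = c·t/2. It is an illustration of the general time-of-flight relation only, not LUCID's measurement pipeline; the array size and the example round-trip time of 20 ns are assumptions.

```python
# Minimal sketch of the time-of-flight principle: distance = (speed of light * round-trip time) / 2.
# Illustrative values only; not derived from the Helios2 Wide's actual processing.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) into distances (meters)."""
    return C * round_trip_time_s / 2.0

# Example: a 640 x 480 array of round-trip times of ~20 ns
# corresponds to object distances of roughly 3 m.
times = np.full((480, 640), 20e-9)
depth_map = tof_distance(times)
print(depth_map[0, 0])  # ~2.998 m
```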

The Arena Software Development Kit (SDK) developed by LUCID provides a wide range of controls for the camera. The intensity and depth of a scene can be displayed either as a 2D view or as a 3D point cloud, and manipulated and aligned in real time. Settings such as false-color overlays and depth ranges can likewise be adjusted and displayed live.
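The point-cloud view mentioned above ultimately rests on back-projecting each depth pixel into 3D space. The sketch below is not Arena SDK code; it assumes a simple pinhole camera model with hypothetical intrinsics (fx, fy, cx, cy) to show how a 640 x 480 depth map can be turned into an N x 3 point cloud for display.

```python
# Illustrative only: back-projects a depth map into a point cloud using a pinhole model.
# The intrinsics below are placeholders, not the Helios2 Wide's calibration data.
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Return an (N, 3) array of XYZ points in meters from an (H, W) depth map."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack((x, y, z), axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Hypothetical intrinsics for a 640 x 480 sensor; a flat scene 2 m away.
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=400.0, fy=400.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (307200, 3)
```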

3D Time-of-Flight camera module: the system is particularly suitable for applications with a short working distance and a large working area. © LUCID Vision Labs GmbH

Event-based vision

Camera for event-based vision: due to its innovative sensor technology, the system significantly reduces image processing costs and can be used in a wide range of industrial applications. © LUCID Vision Labs GmbH

Also on display will be the event-based »Triton2 EVS camera«, featuring Sony's IMX636/IMX637 image sensors and the Metavision® Intelligence Suite from PROPHESEE; depending on the system variant, it offers a resolution of 1280 x 720 pixels. Event-based vision opens up new industrial applications, because the sensor technology substantially reduces the amount of image processing required, resulting in higher system performance and lower power consumption. »Event-based vision sensors (EVS)« transmit data only from pixels in which a change in intensity has been detected, which minimizes both the volume of data and the processor resources needed to analyse the image. Event-based image processing systems produce up to 1000 times less data than a conventional sensor while achieving a higher equivalent temporal resolution of more than 10,000 fps.
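To make the data-reduction argument concrete, the following Python sketch simulates the basic EVS behaviour described above: a pixel emits an event only when its log intensity changes by more than a contrast threshold. The threshold value and the frame-to-frame comparison are assumptions for illustration; a real sensor such as the IMX636 operates asynchronously per pixel rather than on full frames.

```python
# Simplified event-generation model: compare the log intensities of two frames and
# emit +1/-1 events only where the change exceeds a contrast threshold.
# Frame-based approximation for illustration; real EVS pixels fire asynchronously.
import numpy as np

def generate_events(prev_frame: np.ndarray, curr_frame: np.ndarray,
                    threshold: float = 0.2) -> np.ndarray:
    """Return (N, 3) events as (row, col, polarity) for pixels whose
    log-intensity change exceeds the threshold."""
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[rows, cols]).astype(int)
    return np.stack((rows, cols, polarity), axis=-1)

# Example: a static 720p scene in which only a small 10 x 10 region brightens.
prev = np.full((720, 1280), 100.0)
curr = prev.copy()
curr[100:110, 200:210] *= 1.5
events = generate_events(prev, curr)
print(len(events), "events out of", prev.size, "pixels")  # 100 events vs 921600 pixels
```

Only the 100 changed pixels produce data; the static remainder of the 921,600-pixel frame stays silent, which is the mechanism behind the data reduction quoted above.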

The camera measures a compact 29 mm x 44 mm and is equipped with IP67 protection, industrial M8/M12 connectors and GigE PoE. It can be used in a variety of industrial applications such as motion analysis, vibration monitoring, object tracking, autonomous driving or high-speed detection.