
FAQ About the 3D Sensors

What is structured light? What is a 3D point cloud? How does triangulation work? On this page, the most frequently asked questions on the topic of 3D sensors have been compiled for both experts and laypeople. 

A 3D point cloud is a set of points, each of which defines a position in space using three spatial coordinates.
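As a minimal sketch (the struct and type names are chosen freely here for illustration, not taken from the SDK), a point cloud can be represented in C++ like this:

#include <vector>

// One point stores its three spatial coordinates; a point cloud is simply a
// collection of such points.
struct Point3D {
    float x;  // spatial coordinate X
    float y;  // spatial coordinate Y
    float z;  // spatial coordinate Z (depth)
};

using PointCloud = std::vector<Point3D>;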

The triangulation angle between the light source and the camera causes illuminated objects to cast shadows. The object obstructs the light and creates what is known as shading.

If the background is hidden behind an object, it is no longer visible to the camera. This is known as obstruction.

Depth information can only be output for areas that are neither shaded nor obstructed. The greater the triangulation angle, the more intense the shading or obstruction.

Illumination via LEDs can be dangerous at high light intensities. Risk groups are used to assess the safety or hazard level.


Exempt group:
Luminaires do not pose a photobiological hazard.

Risk group 1:
Luminaires do not pose a hazard due to the normal behavioral limitations on exposure.

Risk group 2:
Luminaires do not pose a hazard, because the natural aversion responses to very bright light sources and to thermal discomfort limit exposure.

Risk group 3:
Luminaires pose a hazard even with fleeting or short-term exposure. Use in general lighting is not permitted.


ShapeDrive G4 is rated as risk group 2.

Structured light is an illumination technique in which the light projects a defined pattern, such as stripes, lines or grids, onto the object.

Triangulation is a measuring method in which the distance to an object is calculated from angular relationships. The distance can be determined from the known distance between the light source and the camera (the baseline) and the triangulation angle.
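The following sketch illustrates the underlying geometry in a simplified planar form (the angle names, the baseline value and the law-of-sines formulation are illustrative assumptions, not the sensor's internal model):

#include <cmath>
#include <cstdio>

// Simplified planar triangulation: light source and camera are separated by
// the baseline b. The light leaves the source at angle alpha and the camera
// sees the illuminated point at angle beta, both measured against the baseline.
// The law of sines then yields the perpendicular distance z to the baseline.
double triangulate_depth(double baseline, double alpha, double beta) {
    return baseline * std::sin(alpha) * std::sin(beta) / std::sin(alpha + beta);
}

int main() {
    const double pi = 3.14159265358979323846;
    const double b = 0.10;                    // 100 mm baseline (example value)
    const double alpha = 70.0 * pi / 180.0;   // projection angle
    const double beta  = 60.0 * pi / 180.0;   // viewing angle
    std::printf("distance to baseline: %.4f m\n",
                triangulate_depth(b, alpha, beta));
    return 0;
}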

LEDs are most efficient at this wavelength, and the shorter wavelength improves the scattering behavior on metal surfaces.

The 3D sensor operates reliably under standard workshop conditions, but cold light with a color temperature >5500 K (e.g. from fluorescent tubes) can interfere with the image acquisition process and should be avoided.
We use filters to block out as much ambient light as possible. However, the 3D sensor is only suitable for outdoor use to a limited extent.

The blue wavelength used is reflected much less strongly by red and orange surfaces than by most other colors. To compensate, the exposure time, gain and LED power must be adjusted until a suitable setting is found.

SDK stands for Software Development Kit. An SDK is a collection of programming examples, tools or libraries that makes integration or use of the hardware easier.

GigE Vision is an industry standard for uniform data exchange between cameras or 3D sensors and the evaluation system via Ethernet. The standard is maintained by the Association for Advancing Automation (A3).

The sensor should be firmly mounted and not subject to vibrations. Avoid direct sunlight. There should be no direct reflections from the illumination into the camera.

The sensor should be aligned perpendicular to the object and have as clear a view of the object as possible.

ShapeDrive is wenglor’s product brand for 3D sensors. The G4 suffix designates the latest generation of the high-performance hardware platform, which forms the basis for all future 2D/3D profile sensors and 3D sensors. The 3D sensors have the following four performance features:

1. Processing unit: two dual-core Arm® processors at up to 1.3 GHz
2. Field Programmable Gate Array (FPGA)
3. Memory: large (4 GB) and fast (19.2 Gbit/s)
4. Connectivity: integrated Gigabit Ethernet interface

More information on the performance features can be found here.

The specified resolution in the Z measuring range remains within a similar range throughout the measuring volume. The resolution in X and Y decreases towards the end of the measuring volume.

Yes, our 3D sensors can be operated with GigE Vision and support Mono8, Mono16 and Coord3D_ABC32f for the transmission of image, brightness and spatial information.
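As an illustration of how a Coord3D_ABC32f payload can be interpreted on the host side (the layout assumed here – three consecutive 32-bit floats per pixel, with A, B, C mapped to X, Y, Z – follows the GenICam pixel format naming convention; the function name is hypothetical):

#include <cstddef>
#include <cstdint>
#include <vector>

struct Point3D { float x, y, z; };

// Convert a raw Coord3D_ABC32f buffer into a point cloud. Each pixel holds
// three consecutive 32-bit floats (A = X, B = Y, C = Z); invalid pixels are
// typically marked with NaN and can be filtered out afterwards.
std::vector<Point3D> decodeCoord3D_ABC32f(const uint8_t* payload,
                                           size_t width, size_t height) {
    const float* f = reinterpret_cast<const float*>(payload);
    std::vector<Point3D> cloud;
    cloud.reserve(width * height);
    for (size_t i = 0; i < width * height; ++i) {
        cloud.push_back({f[3 * i], f[3 * i + 1], f[3 * i + 2]});
    }
    return cloud;
}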

No, the applicable standard – risk group 2 (EN 62471) – does not require any protective measures for this combination of wavelength and radiation intensity.

Yes, an uncorrected 2D image can be made available via GigE Vision or SDK.

Yes, that's possible. The interface adapts to the speed of the receiving system.

Yes, the sensor is suitable for industrial use and can be used on moving components.

The maximum measuring volume is defined for each 3D sensor during calibration. However, the ROI function can be used to reduce the measuring range.

No, there is no compatibility.

No, the ShapeDrive blue light does not pose a danger to eyesight. Nevertheless, we recommend not looking into the “light engine” for long periods.

Yes, there is an SDK in C++.

The pattern projection method works very well with a single camera. A second camera in the sensor to avoid shadowing has rarely proven necessary in typical applications.

Didn’t find an answer to a question? No problem.
