
What Is a Time of Flight Sensor (ToF)?

The time of flight sensor (ToF) has a peculiar name. It doesn’t calculate how long a flying object stays in the air, nor does it measure the precise moment an object takes off from the ground. Before understanding what a ToF sensor does, it’s essential to understand what ToF actually is. ToF measures the time it takes for an object, particle, or wave – in this case, light – to travel a given distance through a medium. Typically, this measurement determines velocity or path length, but it can also be used to learn about an object’s dimensions.

A Time of Flight sensor can use the information gained from ToF principles for applications such as robot movement, human-machine interfaces – like the second-generation Kinect sensor for the Xbox One – smartphone cameras, machine vision, and even Earth topography. While these uses aren’t exactly the same, the same underlying information can serve all of them. Now that we’ve established what a ToF sensor can do, it’s equally important to determine what it consists of, how it generates that information, and, finally, what specific purposes this information can serve in the world of robotics.

How a ToF sensor works with light reflections. Image via Wikipedia

What Makes Up a ToF Sensor?

From a general perspective, a ToF sensor isn’t a device that requires decades of research to understand. It consists of a few parts, none of which are particularly obscure or hard to piece together.

The first part is the lens, which, given that a ToF sensor is essentially a camera, is pretty easy to understand. The lens itself, like that of any other camera, gathers the reflected light – it cannot produce any light by itself, nor can it acquire a depth signal from ambient light. According to a scientific study by Subhash Chandra Sadhu for Texas Instruments, ToF cameras have “special requirements to be met with while selecting or designing the lenses.” While the rest of the study goes on to explain the specifics (which he explains very well), it’s important to understand these limitations if you want to fabricate your own ToF sensors in the future.

An example of a ToF sensor. Image via Quality Magazine

Also included in the ToF camera package is the integrated light source that keeps the scene, well, lit. Considering that all of the measured light must come from the sensor itself, it’s equally important to make sure that no outside sources of light – like sunlight – disrupt the image intake.

Then there’s the image sensor, the centerpiece of the ToF camera. The sensor does the heavy lifting, storing all the captured image information, including the time it takes for the light to travel from the integrated light source to the object and back.

Finally, there’s the interface, which shows the data captured. It’s the less showy aspect of the ToF, but, hey, it’s still essential!

How Does a Time of Flight Sensor Work?

The Time of Flight sensor is able to capture depth information for every pixel in the captured image. It is mainly used for machine vision applications, and its advantages include compact construction, relative ease of use, an accuracy of approximately 1 cm, and high frame rates.

There are two principal ways in which a ToF sensor can determine distance and depth.

The first is a ToF sensor based on pulsed light sources. This form measures the time it takes for a light pulse to travel from the emitter to the scene and back. Once that time has been measured, through the magic of mathematics and algorithms, the distance and depth of all the objects captured by the sensor are calculated.

At Seeed Studio, they mocked up a graphic that simply yet accurately depicts how the process works.

Easy enough, right?

This graphic shows how the light reflects off the object back to the sensor and how that measurement gives the distance to each relevant point. Image via Seeed Studio.
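
To make the arithmetic concrete, here is a minimal sketch of the pulsed calculation in Python: the measured round-trip time is multiplied by the speed of light and halved, since the pulse travels to the object and back. The function and the example timing value are illustrative, not taken from any particular sensor’s API.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def pulsed_tof_distance(round_trip_time_s: float) -> float:
    """Distance to an object from the measured round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way distance
    is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a pulse that returns after roughly 6.67 nanoseconds corresponds to ~1 metre.
print(f"{pulsed_tof_distance(6.67e-9):.3f} m")  # -> 1.000 m
```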

The second way is a ToF sensor based on continuous waves, which detects the phase shift of the reflected light. The light source’s amplitude is modulated sinusoidally at a known frequency. The detector then determines the phase shift of the reflected light – how far its waveform is shifted, left or right, relative to the emitted one.

Once this shift has been measured, more math follows, determining the distance and depth of all the objects captured by the sensor.
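
As a rough sketch of that math, the distance follows from the measured phase shift and the known modulation frequency: d = c·Δφ / (4π·f_mod). The Python below uses illustrative values – the 20 MHz modulation frequency and the function name are not tied to any specific sensor.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def cw_tof_distance(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """Distance from the phase shift of an amplitude-modulated continuous wave.

    d = c * phase_shift / (4 * pi * f_mod). The result is only unambiguous
    within half a modulation wavelength, i.e. up to c / (2 * f_mod).
    """
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

# Example: a 90-degree (pi/2) phase shift at a 20 MHz modulation frequency.
print(f"{cw_tof_distance(math.pi / 2, 20e6):.3f} m")  # -> 1.874 m
```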

While the end results of both methods are similar, their journeys differ. The illumination of the entire scene, regardless of the method, will make it possible to determine the distance and depth of each object scanned by the sensor – all in a single shot.

The result? A range map in which the pixels encode the distance to each point on the captured scene.

Over at Melexis, they showed a depth image of a man in a car. The colors are represented as follows:
– Blue sections indicate points that are far away from the sensor
– Red sections indicate points that are close to the sensor
Image via Melexis.
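
As an illustration of how a range map can be turned into that kind of depth image, here is a small Python sketch that colors each pixel red when it is close and blue when it is far, in the spirit of the Melexis example. The near/far thresholds and array values are placeholder values, not taken from any real sensor.

```python
import numpy as np

def depth_to_rgb(depth_m: np.ndarray, near: float = 0.2, far: float = 4.0) -> np.ndarray:
    """Colour a range map: red for points near the sensor, blue for far points.

    `depth_m` is a 2-D array of per-pixel distances in metres; the result is
    an (H, W, 3) uint8 RGB image. The near/far limits are placeholder values.
    """
    # Normalise depth into [0, 1], clipping anything outside the chosen range.
    t = np.clip((depth_m - near) / (far - near), 0.0, 1.0)
    rgb = np.zeros(depth_m.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = ((1.0 - t) * 255).astype(np.uint8)  # red channel: close objects
    rgb[..., 2] = (t * 255).astype(np.uint8)          # blue channel: distant objects
    return rgb

# Example: a tiny 2x2 range map (metres) with one near point and one far point.
print(depth_to_rgb(np.array([[0.5, 3.5], [1.0, 2.0]])))
```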

The Advantages and Limitations of Time of Flight Sensors

Like any set of technological tools, there are upsides and downsides.

 

Some of the clear advantages of using ToF sensors for 3D measurements are the following:

  1. Higher resolution captures.
  2. Real-time capabilities – no need to wait days for a result.
  3. Works in low-light conditions – even complete darkness is possible.
  4. The costs aren’t particularly high.
 
On the other hand, its limitations are worth considering, just in case your needs aren’t lined up with what a ToF sensor can do. They are as follows:
  1. The presence of scattered light due to unwanted reflections.
  2. External bright surfaces that are close to the camera can quickly scatter too much light into the lens, creating artifacts.
  3. ToF distance measurement requires light that has been reflected just once.
  4. If light has been reflected multiple times, it can distort the measurements. These multiple reflections are usually caused by corners and concave shapes.
  5. Ambient light and sunlight make it more difficult to capture outdoors (sunlight saturates the sensor’s pixels).

In What Manufacturing Contexts Can You Use ToF Sensors?

ToF sensors are highly practical in numerous applications including logistics, factory automation, and autonomous robotics and vehicles.

For logistics, ToF sensors can help guide robotic arms for packaging assistance, box filling, stacking, volume scanning, and labelling. A pick-and-place case study conducted by Lucid at Pensur, an engineering company, looked into how their 3D vision systems allowed for a far more efficient process and freed up valuable time for employees who were stuck doing the menial job day in and day out.

In the context of factory automation, ToF sensors can guide robots to find and pick up objects and place them where they need to be. Think of a car assembly line: nothing changes from car to car, but the ToF sensors will point out where everything is and where each part needs to go.


ToF sensors can also be used in maritime navigation, where they can feed AI-based object recognition. This is done to improve safety while sailing by detecting objects that may obstruct the ship’s path – such as fishing boats, buoys, and debris – which can’t be detected by using the ship’s radar alone.

IDenTV showed a brief example on their YouTube page of how these cameras can work and how quickly they can detect objects even at far distances.


Finally, for autonomous robots, a ToF sensor can help a robot plan and execute a task all on its own. Whether those processes consist of sanding, powder coating, or batch painting, the ToF sensor can help the robot understand each object’s specific dimensions and, with the help of the right software, can execute each task necessary by knowing where to start and stop.

ToF sensors are at the core of AutonomyOS™. They are the key to the first step: 3D perception, helping autonomous robots figure out what they need to do in real-time.
