How does ToF camera technology measure distance?

Image sensors are gradually evolving from 2D to 3D. The addition of depth information greatly expands what applications such as smartphones, automobiles, and AR can do, and the move from 2D to 3D is a major trend in sensor development. The mainstream 3D sensing technologies today are binocular (stereo) vision, structured light, and ToF cameras. Binocular vision and structured light are based mainly on the principle of triangulation, while a ToF camera ranges by measuring the time of flight of light, as sketched below.
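To make the contrast concrete, here is a minimal sketch of triangulation-based depth as used by stereo and structured-light systems. The focal length, baseline, and disparity values are illustrative assumptions, not real calibration data.

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulation: depth z = f * B / d for a rectified stereo pair."""
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 6 cm baseline, 20 px disparity -> 2.1 m
print(depth_from_disparity(700.0, 0.06, 20.0))
```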
A ToF camera emits an infrared light source toward the target object, measures how long the light travels between the lens and the object, and computes the object's distance from the time of flight of the light pulse. ToF technology offers strong resistance to interference and a higher FPS refresh rate, so it performs well in dynamic scenes. In addition, computing depth from ToF data requires relatively little computation, so the corresponding CPU/ASIC load is low and the demands on the algorithm are lower.
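The distance calculation itself is simple. The sketch below shows the two common ToF ranging formulas: direct (pulsed) ToF, which times the round trip of a light pulse, and indirect ToF, which infers the delay from the phase shift of a modulated signal. The sample timing, phase, and modulation-frequency values are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_pulsed(round_trip_time_s: float) -> float:
    """Direct ToF: the pulse travels to the object and back, so divide by 2."""
    return C * round_trip_time_s / 2.0

def distance_phase(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """Indirect ToF: d = c * phi / (4 * pi * f_mod),
    unambiguous up to a range of c / (2 * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

# A 10 ns round trip corresponds to roughly 1.5 m.
print(distance_pulsed(10e-9))
# A pi/2 phase shift at 20 MHz modulation corresponds to about 1.87 m.
print(distance_phase(math.pi / 2, 20e6))
```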
