Lasers are used in numerous applications in a wide variety of fields. Whether it’s transmitting internet communication via optical fibers, welding, cutting or marking materials in industrial processes, performing sophisticated medical procedures, or even intercepting drones and missiles – lasers seem to be everywhere nowadays.
There is another field in which lasers have become dominant players – laser-assisted measurements. So, what exactly are laser-assisted measurements? As the name suggests, these are measurements of physical quantities enabled or improved by the use of lasers. There are various technologies relying on lasers, each harnessing one or more laser properties. In this article we will present this field and give several notable examples.
How lasers changed the way we measure distances
Let’s begin by discussing how lasers changed the way we measure distances. Interestingly, there are many ways in which lasers are used to determine the distance between objects, and each method is chosen according to the dimensions of the object we need to measure, its distance from the measurement device, and the required resolution. For example, the term Light Detection and Ranging (LiDAR) refers to several techniques used to generate three-dimensional maps of the environment, based on projecting laser light and detecting the reflected signal. LiDAR is widely used in applications such as autonomous vehicle navigation, where the computer that runs the car needs to gather real-time information about its surroundings to determine the optimal driving strategy and avoid accidents.
The best-known LiDAR implementation is based on time-of-flight measurement, in which the time between the emission and detection of a laser pulse is recorded, and the distance between the source and the object is calculated using the known speed of light.
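The time-of-flight calculation itself is a one-liner. Here is a minimal sketch in Python (the function name and numbers are illustrative, not taken from any particular LiDAR product):

```python
# Time-of-flight ranging: distance from the round-trip delay of a laser pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target from the measured round-trip time."""
    # The pulse travels out and back, so the one-way distance is half.
    return C * round_trip_s / 2.0

# A pulse returning after 1 microsecond puts the target roughly 150 m away.
print(tof_distance(1e-6))
```

Note how short the relevant time scales are: resolving distances at the centimeter level requires timing electronics with sub-nanosecond resolution.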
Devices based on a similar operating principle also allow measuring the velocity of objects (e.g., cars) by sending multiple laser pulses at known time intervals towards a moving object and calculating its velocity from the change in the light's round-trip time between pulses. Figure 1 shows a laser speed gun used by law-enforcement agencies. Other LiDAR technologies, such as frequency-modulated continuous-wave (FMCW) LiDAR, rely on continuously sweeping the laser's frequency and deducing the distance from the frequency of the returning light.
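The speed-gun principle can be sketched the same way: two range measurements a known interval apart give the target's speed. A hypothetical illustration in Python (not the firmware of any real device):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """One-way distance from a measured round-trip pulse delay."""
    return C * round_trip_s / 2.0

def target_speed(t1_s: float, t2_s: float, pulse_interval_s: float) -> float:
    """Target speed from two pulses sent pulse_interval_s apart.

    Positive result means the target is receding from the device.
    """
    return (tof_distance(t2_s) - tof_distance(t1_s)) / pulse_interval_s
```

For example, a car whose range grows from 150.0 m to 150.3 m over a 10 ms pulse interval is receding at 30 m/s (108 km/h).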
We will dive much deeper into LiDAR technologies in a future blog post. Returning to distance measurement – if we want to measure small-scale objects with higher resolution, for example in quality assurance of industrial processes, we typically use other measurement techniques, for instance triangulation. Luckily, we already wrote a blog post on non-contact distance measurement technologies, so we will not elaborate more on this topic here.
Moving to a different field in which lasers have proven to be useful for measurements – interferometry is a term used to describe a group of measurement methods that rely on superimposing electromagnetic radiation that traverses different paths in order to extract information from the generated interference pattern.
Though it is possible to perform interferometric measurements using non-coherent light sources, laser-based interferometry has become the tool of choice for many industrial and scientific applications. In these applications, a laser beam is typically split into two beams that traverse different optical paths and then recombine at a detector.
By evaluating the resulting interference pattern, the difference between the optical paths of the two beams can be calculated. Using this information, the shape of objects can be mapped, the refractive index of materials can be evaluated, and even gravitational waves can be detected. Gravitational waves are disturbances in the curvature of space-time originating from accelerated masses, as predicted by Einstein in his general theory of relativity. The most advanced interferometer ever built stands at the heart of the Laser Interferometer Gravitational-Wave Observatory (LIGO) experiment. It detects gravitational waves through the change they exert on a laser beam traveling in its 4-km-long interferometer arms. This device is so sensitive that it can detect a change in the optical path of less than 1/10,000 of the diameter of a proton!
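The core relation behind all of these measurements is that a shift of one full interference fringe corresponds to one wavelength of extra optical path. A minimal sketch in Python (function name is ours, for illustration only):

```python
import math

def path_difference(phase_shift_rad: float, wavelength_m: float) -> float:
    """Optical path difference that produces a given interference phase shift.

    A phase shift of 2*pi (one full fringe) corresponds to exactly one
    wavelength of extra path: delta_L = phase * wavelength / (2*pi).
    """
    return phase_shift_rad * wavelength_m / (2 * math.pi)
```

For a 1064 nm laser (the wavelength used by LIGO), a full-fringe phase shift of 2π corresponds to 1064 nm of path difference; measurable fractions of a fringe resolve correspondingly smaller path changes.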
Interferometry also enables the measurement of angular velocity and orientation using a device called an optical gyroscope. A classical gyroscope consists of a spinning disc mounted in gimbals, and it allows detecting changes in the orientation of moving objects: if placed on a moving object, the gyroscope's disc will continue to spin in the same orientation due to conservation of angular momentum, while the gimbals rotate following the object's change of orientation. An optical gyroscope relies on the Sagnac effect: when two beams originating from the same laser source traverse the same optical path in opposite directions and are then superimposed, they form an interference pattern. When the device is static, the two beams take the same time to complete a round trip, and the generated interference pattern serves as the baseline signal.
The interference pattern changes if the device is spun, since the beam traveling against the device's rotation experiences a shorter delay than the counter-propagating beam. This is because the speed of light is the same regardless of direction, so the times and distances traveled by the two beams change according to Einstein's special theory of relativity. Consequently, the resulting interference fringes are displaced relative to the baseline signal, and the phase shift of the pattern is proportional to the angular velocity of the device. Gyroscopes based on this effect are called fiber-optic gyroscopes, and they consist of a laser coupled into a long optical fiber.

Alternatively, the Sagnac effect can be exploited in a ring laser, in which two independent resonant modes travel the cavity of a ring-shaped laser in opposite directions. Rotation of the device causes the two counter-propagating waves to differ in frequency, due to the difference between the beams' round-trip times and the constraint that each round trip must contain an integer number of each mode's wavelengths in order for the light to build up constructively inside the cavity. A ring laser gyroscope operates by continuously sampling a small fraction of the two beams outside the cavity. When these beams interfere, a beat frequency proportional to their frequency difference can be observed and calibrated against the rotation velocity.

Optical gyroscopes of the types described above are advantageous over classical mechanical ones because they contain no moving parts, require low operating current, are shock resilient, and are extremely precise. They are mostly used in inertial navigation systems in airplanes and satellites.
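The relation between the beat frequency and the rotation rate is captured by the standard ring-laser-gyroscope formula Δf = 4·A·Ω / (λ·P), where A is the area enclosed by the ring, P its perimeter, λ the laser wavelength, and Ω the angular velocity. A quick sanity check in Python, with illustrative numbers of our own choosing (a 10 cm square ring and a 633 nm helium-neon laser):

```python
def sagnac_beat_frequency(area_m2: float, perimeter_m: float,
                          wavelength_m: float, omega_rad_s: float) -> float:
    """Beat frequency of a ring laser gyroscope: df = 4*A*Omega / (lambda*P)."""
    return 4 * area_m2 * omega_rad_s / (wavelength_m * perimeter_m)

# A 10 cm x 10 cm square ring (A = 0.01 m^2, P = 0.4 m) with a 633 nm laser,
# rotating only with the Earth (7.292e-5 rad/s), already beats at ~11.5 Hz.
print(sagnac_beat_frequency(0.01, 0.4, 633e-9, 7.292e-5))
```

Even the Earth's slow rotation produces an easily countable beat frequency, which is why ring laser gyroscopes reach such high precision.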
Another example for the use of lasers in measurements is spectroscopy – the study of the interaction between light and matter. Since atoms and molecules have distinct energy levels, a laser with a well-defined wavelength can be used as a tool to investigate the energy gap between two electronic states of atoms or molecules: for instance, by recording when light is absorbed in matter.
With the development of this field, scientists have created an immense catalog of atomic and molecular transitions of almost every element and molecule known to us. With this knowledge, lasers are frequently used to detect the presence of various materials in gases or even identify condensed explosives.
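The link between an energy gap and the laser wavelength that probes it is the Planck relation ΔE = h·c/λ. A small sketch in Python (the 1.89 eV example is the well-known hydrogen Balmer-alpha transition, used here purely for illustration):

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 299_792_458.0    # speed of light in vacuum, m/s
EV = 1.602176634e-19 # one electron-volt in joules

def transition_wavelength(delta_e_joule: float) -> float:
    """Wavelength of the photon matching an energy gap: lambda = h*c / dE."""
    return H * C / delta_e_joule

# Hydrogen's 1.89 eV Balmer-alpha transition corresponds to ~656 nm (red).
print(transition_wavelength(1.89 * EV) * 1e9, "nm")
```

Scanning a tunable laser across such a wavelength and watching for absorption is, in essence, how laser spectroscopy identifies which species are present in a sample.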
Microscopy is one more field that has benefitted from the development of lasers. At first, microscopes contained only lenses to magnify the object under investigation, using natural light. These classical optical microscopes were limited by the diffraction limit, which arises from the wave nature of light. Since the invention of lasers, scientists have found ways to harness their properties to surpass the diffraction limit. These efforts created a new field, known as super-resolution microscopy, which extends beyond conventional microscopy. For example, a confocal laser scanning microscope contains a small pinhole through which the light passes, so that reflected light reaches the detector only from the focal plane of the system. This allows mapping a cross section of the sample under investigation. Using intense laser light, this technique allows measuring objects much thicker than is possible with conventional optical microscopes, and it is mostly used in biology and medicine.
There are also other microscopic techniques that rely on the laser light’s degree of polarization or on induced nonlinear optical effects, such as polarized-light microscopy and two or multiphoton excitation microscopy, respectively. Figure 2 shows a slide containing cells for examination in a laser scanning microscope.
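To put the diffraction limit mentioned above in numbers, the smallest resolvable feature of a conventional microscope is given by the Abbe limit, d = λ / (2·NA), where NA is the numerical aperture of the objective. A quick illustration in Python (the wavelength and NA values are typical examples, not tied to any specific instrument):

```python
def abbe_limit(wavelength_m: float, numerical_aperture: float) -> float:
    """Smallest resolvable feature of a conventional optical microscope."""
    return wavelength_m / (2 * numerical_aperture)

# Green laser light (532 nm) with a high-end oil-immersion objective (NA = 1.4)
# resolves features no smaller than about 190 nm.
print(abbe_limit(532e-9, 1.4) * 1e9, "nm")
```

Super-resolution techniques are considered "super" precisely because they image features well below this bound.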
Lasers also play a key role in the most precise time-measurement device ever built: the atomic clock. Atomic clocks use the known energy-level differences of certain atoms to generate a highly stable and repeatable signal with a known frequency. Time can be measured from this frequency standard by counting the number of periods.
The unit of the second itself is defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. Small-scale atomic clocks are used in applications ranging from tuning the frequency of cell-phone base stations to setting the time for the Global Positioning System's satellites, but lasers are not required for their operation. However, to achieve the highest possible accuracy, in which a clock is expected to lose no more than 1 second in over 100 million years, the atoms at the heart of the device need to be cooled down using a process called laser cooling. Since the temperature of the ensemble is related to the atoms' velocity, it is possible to cool the atoms to near absolute zero by slowing them down.
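"Counting periods" really is all there is to the timekeeping step. A toy sketch in Python, using the exact caesium frequency from the SI definition of the second:

```python
CS_HYPERFINE_HZ = 9_192_631_770  # SI definition: periods per second

def elapsed_seconds(periods_counted: int) -> float:
    """Elapsed time from the number of caesium hyperfine periods counted."""
    return periods_counted / CS_HYPERFINE_HZ

# Counting exactly 9,192,631,770 periods means exactly one second has passed.
print(elapsed_seconds(9_192_631_770))
```

The entire engineering challenge of an atomic clock lies in keeping the oscillator locked to this atomic frequency, not in the counting itself.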
To cool atoms with lasers, several laser beams are directed onto an ensemble of identical atoms, and their velocities decrease as they absorb and re-emit photons. By setting the laser's frequency slightly below the atomic transition frequency (that is, its wavelength slightly longer than the transition wavelength), we allow only atoms moving towards the laser source to absorb and then emit a photon, since for these atoms the light's frequency appears shifted to exactly the right value for the transition, due to the relativistic Doppler effect. The classical Doppler effect is the reason why the pitch of an ambulance siren driving towards us is higher than that of the same ambulance driving away from us; this happens due to the relative movement between us and the ambulance. The relativistic Doppler effect is a similar phenomenon that occurs in light waves because of relative motion between the source and the observer.
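The size of the Doppler shift involved is easy to estimate with the first-order approximation f_seen ≈ f_laser·(1 + v/c). A rough sketch in Python (the 352 THz figure corresponds to caesium's 852 nm D2 line; the 200 m/s atom velocity is an illustrative room-temperature-scale value):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def doppler_shifted_frequency(f_laser_hz: float, v_toward_ms: float) -> float:
    """Frequency the atom 'sees', to first order in v/c.

    A red-detuned laser therefore looks on-resonance only to atoms that
    are moving toward it -- exactly the atoms we want to slow down.
    """
    return f_laser_hz * (1 + v_toward_ms / C)

# An atom moving at 200 m/s toward a 352 THz laser sees it shifted up
# by roughly 235 MHz -- the scale of red detuning used in laser cooling.
f = 352e12
print(doppler_shifted_frequency(f, 200.0) - f)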
Laser cooling is a fascinating subject worth a blog post of its own, but why do atomic clocks perform better with cold atoms? Mainly because cooler atoms move less, so the frequency of the radiation they emit is less Doppler-shifted, making the frequency standard more stable and hence the clock more accurate. Figure 3 shows a highly precise, laser-enabled laboratory atomic clock built at the American National Institute of Standards and Technology.
Interestingly, the Doppler effect can also be used to measure fluid velocity, using a device called a laser Doppler velocimeter. The laser's light is frequency-shifted when it is reflected from particles moving with the fluid, and by measuring this frequency shift the flow velocity can be calculated with great accuracy.
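For light back-scattered from a particle, the shift is Δf = 2·v/λ, so inverting it gives the velocity. A minimal sketch in Python (the numbers are illustrative; real velocimeters typically use a dual-beam fringe geometry, which yields a relation of the same form):

```python
def ldv_velocity(freq_shift_hz: float, wavelength_m: float) -> float:
    """Particle velocity from the Doppler shift of back-scattered laser light.

    Back-scatter geometry: df = 2*v/lambda  ->  v = df * lambda / 2.
    """
    return freq_shift_hz * wavelength_m / 2.0

# With a 633 nm laser, a measured shift of 3.16 MHz corresponds to ~1 m/s.
print(ldv_velocity(3.16e6, 633e-9))
```

Because megahertz-scale frequency shifts are easy to measure electronically, even slow flows can be resolved without touching the fluid at all.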
Lasers are also heavily used in the field of optical metrology. With the rapid advancement of the microelectronics industry, the size of the elements placed on integrated circuits has shrunk dramatically over the past 50 years: the transistor feature size has been reduced by a factor of about 2,000, from 10 µm in 1971 to about 5 nm today. This reduction has resulted in the ever-increasing complexity of manufactured integrated circuits, triggering the need for precise inspection during the various fabrication steps. Lasers have proven to be a key ingredient of several optical inspection technologies in use today. Some rely on measuring laser light scattered or reflected from the sample to detect defects, while others use pulsed sources to heat the sample for a short period of time, generating acoustic waves that can be detected by another laser beam. The incorporation of lasers is therefore essential for quality control and significantly improves the yield of the automated processes at the heart of the microelectronics industry.
To sum it up…
What we hope you’ve learned from this short article is that lasers have become measurement tools for many physical quantities, and that the development of advanced laser systems constantly improves the accuracy, speed and resolution of these techniques, potentially paving the way to even more novel measurement technologies. Here at Ophir we develop solutions to analyze your laser beam and measure the total power and energy of your laser sources. Want to learn more? Please visit our website.