A monochromator is an optical device that isolates a narrow range of wavelengths from a broader spectrum of light. It works on the principle of wavelength selection through dispersion. Linear dispersion is an important concept for monochromators: it describes how the separation between wavelengths varies with position across the output spectrum.
Linear dispersion (Dλ) measures the change in wavelength (Δλ) per unit change in position (Δx) at the output: Dλ = Δλ/Δx, typically expressed in nm/mm. In other words, it quantifies the rate at which different wavelengths are spread across the output focal plane.
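The definition above can be sketched in a few lines of code. This is a minimal illustration, not a model of any particular instrument: the function names and the example values (two wavelengths 10 nm apart landing 5 mm apart, a 0.25 mm slit) are hypothetical, chosen only to show the arithmetic.

```python
def linear_dispersion(delta_lambda_nm: float, delta_x_mm: float) -> float:
    """Linear dispersion D_lambda = delta_lambda / delta_x, in nm/mm."""
    return delta_lambda_nm / delta_x_mm

def slit_bandpass(dispersion_nm_per_mm: float, slit_width_mm: float) -> float:
    """Wavelength interval passed by an exit slit of the given width,
    assuming the dispersion is constant over the slit."""
    return dispersion_nm_per_mm * slit_width_mm

# Hypothetical example: wavelengths 10 nm apart are imaged 5 mm apart.
d = linear_dispersion(10.0, 5.0)
print(d)                        # 2.0 nm/mm
print(slit_bandpass(d, 0.25))   # 0.5 nm passed by a 0.25 mm exit slit
```

A practical consequence of this relation is that, for a given dispersion, narrowing the exit slit narrows the band of wavelengths transmitted, at the cost of throughput.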