Microscopic Knowledge: Technical Terms Related to Microscopy

August 10, 2023

Familiarity with the technical terminology of microscopy is a good starting point for anyone working with these instruments.


1. DOF (Depth of Field)

Depth of field (DOF) refers to the range of distances, from the nearest to the farthest part of the subject in front of a camera lens or other imager, over which a sharp image can be obtained.

Within a certain tolerance, objects remain in sharp focus even when the distance between the object and the lens changes slightly. This tolerance within which objects stay in focus is called the "depth of field".

The closer the subject, the smaller the depth of field; the farther the subject, the greater the depth of field.

 

2. Numerical Aperture

Numerical aperture, or N.A., is a numerical value that expresses the resolving power of a lens. It is defined by the following equation:

N.A. = n * sin θ

n: the refractive index of the medium between the specimen and the lens [for example, n (air) = 1]

θ: the angle formed by the optical axis and the outermost ray within the effective diameter of the lens
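As a quick numerical check of this definition, the short Python sketch below evaluates N.A. for two hypothetical cases; the media and the 64° half-angle are illustrative assumptions, not values from any specific objective.

```python
import math

def numerical_aperture(n: float, theta_deg: float) -> float:
    """N.A. = n * sin(theta), with theta given in degrees."""
    return n * math.sin(math.radians(theta_deg))

# Illustrative values only: a dry objective (n = 1 for air) and an
# oil-immersion objective (n ~ 1.52 for immersion oil), both with a
# 64-degree half-angle. These numbers are assumptions for the example.
print(numerical_aperture(1.0, 64))   # ~0.90 (dry)
print(numerical_aperture(1.52, 64))  # ~1.37 (oil immersion)
```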

The resolution of a lens is determined by its N.A., but the role of diffraction cannot be ignored. Diffraction is the phenomenon in which light, behaving as a wave, spreads out as it passes through an aperture. Because of diffraction, even the highest-resolution lens cannot focus light to a single point; the focus instead forms a small disc. The smallest resolvable spot of light is called an "Airy disk". The radius of the Airy disk is expressed by the following formula:

 

r = 0.61 * λ / N.A.

λ: wavelength of light

N.A.: numerical aperture

0.61: constant

The value derived from this formula is the "resolution". According to this formula, the larger the numerical aperture (N.A.), the smaller the radius of the Airy disk. Therefore, the larger the numerical aperture (N.A.) of the lens, the smaller the features that can be resolved and the sharper the image.
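To make the relationship concrete, here is a small Python sketch of r = 0.61 * λ / N.A.; the wavelength and N.A. values are illustrative assumptions. Doubling the N.A. halves the Airy disk radius, which is exactly the inverse relationship described above.

```python
def airy_disk_radius_nm(wavelength_nm: float, na: float) -> float:
    """Radius of the Airy disk (resolution), r = 0.61 * lambda / N.A."""
    return 0.61 * wavelength_nm / na

# Illustrative values: green light at 550 nm with two different objectives.
print(airy_disk_radius_nm(550, 0.45))  # ~745 nm
print(airy_disk_radius_nm(550, 0.90))  # ~373 nm -- larger N.A., finer detail resolved
```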

 

3. Color Resolution

Color resolution (also called color depth or bit depth) describes how many gradations of color each pixel can express.

Color resolution is usually expressed as the number of levels in which the intensity of each of the three primary colors (red, green, and blue) can be represented. The number of levels per color depends on the number of bits available to encode each shade.

 

For example, when using an 8-bit color camera, the range of each color is represented by 8 bits or 256 gray levels (that is, 2 to the 8th power).

When evaluating the three primary colors R, G, and B, an 8-bit color camera can represent 256 × 256 × 256 = 16,777,216 colors.
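The arithmetic above can be verified with a few lines of Python; the bit depths shown are just examples.

```python
def color_counts(bits_per_channel: int) -> tuple[int, int]:
    """Return (levels per channel, total RGB colors) for a given bit depth."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

print(color_counts(8))   # (256, 16777216)   -- the 8-bit case above
print(color_counts(10))  # (1024, 1073741824) -- a 10-bit camera, for comparison
```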

The human eye is said to be able to distinguish roughly 10 million colors, so representing 16.77 million colors should be more than enough. In practice, however, the color resolution of the human eye varies from color to color (for example, the eye is more sensitive to green).

 

4. Image Sensor

An image sensor, or photosensitive element, is a device that converts an optical image into an electronic signal. It is widely used in digital cameras and other electro-optical devices.

Image sensors typically consist of many tiny photodiodes that convert light into electrical signals. Light striking the image sensor is focused onto each photodiode through a tiny lens; the sensor then converts the light into an electrical signal and outputs the final image.

Image sensors are able to gather light intensity information, but cannot reproduce color by themselves.

To reproduce a color image, a primary-color (red, green, and blue) filter can be placed in front of the photodiodes. Alternatively, a complementary four-color (cyan, magenta, yellow, and green) filter can be used. However, primary-color filters provide better color reproduction and are better suited to digital imaging.

To display an image through a digital microscope system, an image sensor equipped with a color filter converts the light received by the lens into a digital signal.

After processing, the digital signal is converted into an image and displayed on the screen.
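As a rough illustration of how raw intensity values behind a red/green/blue filter mosaic can be combined into color pixels, here is a toy Python/NumPy sketch. The 2×2 RGGB (Bayer-style) pattern and the crude averaging used here are illustrative assumptions, not a description of any particular sensor or digital microscope system; real cameras use far more sophisticated interpolation.

```python
import numpy as np

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Very crude demosaic of an RGGB mosaic: average each 2x2 cell
    into a single RGB pixel. `raw` has shape (2H, 2W) with pattern:
        R G
        G B
    """
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    g = (g1 + g2) / 2.0
    return np.stack([r, g, b], axis=-1)  # shape (H, W, 3)

# Illustrative 4x4 raw frame (values are arbitrary sensor counts).
raw = np.array([
    [200,  60, 180,  70],
    [ 55,  30,  65,  40],
    [190,  50, 170,  80],
    [ 60,  20,  75,  35],
], dtype=float)

print(demosaic_rggb(raw).shape)  # (2, 2, 3): a 2x2 color image
```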
