Optical and IR filters are key components in imaging devices. They are designed to transmit some wavelengths of light and block others in order to create images, both for digital cameras and thermal imaging devices. An image sensor filter is a type of imaging filter that helps capture and interpret color information, and manufacturers must solve a few common thin film challenges for these filters to meet specific application demands.
What Are Imaging Filters?
Imaging filters include a range of filters that can transmit desired wavelengths through to the photodiode while filtering out other wavelengths. These filters include bandpass filters, color filters and IR filters, and they’re used to enhance an image’s quality based on a number of characteristics. For example, by isolating certain light wavelengths, imaging filters may be used to control camera exposure or image contrast on a digital camera.
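The wavelength-isolating behavior described above can be sketched in a few lines of code. This is a minimal, idealized model (full transmission inside the passband, none outside); the 500–570 nm passband for green light is an illustrative assumption, not a value from this article.

```python
# Idealized bandpass filter: transmits wavelengths inside the passband,
# blocks everything else. Real thin film filters have gradual edges and
# partial transmission, but the principle is the same.

def bandpass_transmission(wavelength_nm, low_nm=500, high_nm=570):
    """Return 1.0 if the wavelength falls in the passband, else 0.0."""
    return 1.0 if low_nm <= wavelength_nm <= high_nm else 0.0

# Apply the filter to a few sample wavelengths (blue, green, red, near-IR).
spectrum_nm = [450, 530, 650, 900]
transmitted = [w for w in spectrum_nm if bandpass_transmission(w) > 0]
print(transmitted)  # only the green wavelength passes: [530]
```

The same model, with the passband shifted into the IR range, describes the filters used in thermographic cameras discussed below.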
Applications for optical imaging filters include digital cameras, smartphones and LED illuminators.
How Image Sensor Filters Are Used
In digital cameras, there are two types of digital image sensors: charge-coupled device (CCD) sensors and complementary metal-oxide semiconductor (CMOS) sensors. Both types convert light (photons) into electrons for each pixel or cell in an image.
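The photon-to-electron conversion at each pixel can be sketched as follows. The quantum efficiency value is an illustrative assumption; real sensors vary by wavelength and design.

```python
# Sketch of per-pixel light conversion: each absorbed photon frees at
# most one electron, and the quantum efficiency (QE) is the fraction of
# incident photons the photodiode actually converts. QE of 0.6 is an
# assumed, illustrative value.

def photons_to_electrons(photon_count, quantum_efficiency=0.6):
    """Return the number of signal electrons collected at one pixel."""
    return int(photon_count * quantum_efficiency)

signal = photons_to_electrons(10_000)
print(signal)  # 6000 electrons
```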
In a CCD sensor, each pixel’s charge is transferred across the chip and read out at a single output node, where it is converted to a digital value. This produces high-quality, low-noise images. The technology is more mature, but it consumes more power than CMOS sensors and is more expensive to manufacture.
In a CMOS sensor, each pixel has several transistors that amplify and transmit its charge, and some incoming photons strike these transistors rather than the photodiode. This makes CMOS sensors more susceptible to noise and less sensitive to light, which can affect image quality; however, they are considerably cheaper to make.
Image sensor filters are used primarily to obtain color information about the incoming light, but they also protect the image sensor, whether it’s a CCD or CMOS. Color filters may work across UV/VIS or IR ranges. On a digital camera, an IR-cut filter is typically used to block IR light from reaching the image sensor; silicon photodiodes are sensitive to near-IR light, which would otherwise distort the image’s colors. In thermographic cameras, however, IR filters are used to isolate and transmit IR wavelengths to create thermal images.
Image sensor filters typically feature an “on-chip” design, meaning the filter is bonded directly to the image sensor. A micro lens above each pixel gathers incoming light so it passes through the color filter and reaches the photodiode.
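Because each photodiode sits under a single color filter, every pixel records only one color channel; the missing channels are interpolated later (demosaicing). A minimal sketch of this arrangement, using the common Bayer mosaic as an example pattern (the article does not specify a particular layout):

```python
# Sketch of an on-chip color filter mosaic. The Bayer pattern, a common
# arrangement, tiles a 2x2 block of red, green, green, blue filters
# across the whole sensor; green appears twice because the eye is most
# sensitive to it.

BAYER_TILE = [["R", "G"],
              ["G", "B"]]

def filter_color(row, col):
    """Return which color channel the pixel at (row, col) records."""
    return BAYER_TILE[row % 2][col % 2]

# The pixel at row 3, column 2 sits under a green filter.
print(filter_color(3, 2))  # prints "G"
print(filter_color(0, 0))  # prints "R"
```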
Thin Film Challenges in Image Sensor Filters
One of the primary concerns for image sensor filters is overall thickness. Because the filter is bonded directly to the sensor, it inherently adds to the sensor’s total stack height. Yet newer applications, such as smartphones, are driving demand for ever smaller form factors and pixel sizes. Meeting these very tight thickness constraints is a challenge for any on-chip image sensor filter.
As pixel sizes shrink, resolution suffers, driving the need for higher-resolution filters. Attaining high resolution depends largely on achieving a smaller pigment size and tight uniformity across the color filter. Other properties, such as optical thickness, transmission, density and adhesion, all affect filter performance and need to be accounted for during thin film deposition.
In-situ controls can help monitor and improve these specs during the deposition process. Read more in our white paper, “Solving 5 Big Thin Film Deposition Challenges Facing Contract Coaters”.