Vadzo Imaging

What Is a Digital Imaging Sensor and What Comprises It?

Updated: Jul 22


Photography has come a long way in terms of capture, storage, and distribution. The days of photographing on silver bromide film are long gone. These are the days of digital photography, which provides better quality, easier handling, and simpler management of our images.

In this blog, we examine the principles of image sensor technology used in machine vision cameras and how these sensors are categorized.


Digital Image Sensors

The digital image sensor is most frequently found in digital cameras, where it captures and stores digital images. The capability of digital imaging sensors has allowed us to take better images than ever before, with many more megapixels. Digital sensors have evolved over time, growing in size, resolution, and capability. As a result, we now have greater freedom to take pictures in a way that better satisfies our creative requirements.

To understand how the sensor is integrated with the camera, it helps to look at some of the camera's main components. The two key parts of a digital camera are the image sensor, which takes the place of traditional film, and the lens, which directs light onto it.


Digital Image Sensors in Data Collection

As cameras continue to improve, so do the ways we capture images and the ways those images are used across industries. In the age of artificial intelligence, imaging systems are taking over the role of the human eye. Imaging is one of the fastest-growing modes of data collection and acquisition. A great deal of information, such as age, gender, and height, can be gathered from images of everyday activities: people standing at a traffic signal, using a vending machine, entering a store, crossing the street, passing a kiosk, registering at a conference, ordering a coffee, boarding a flight, and so on.

Imaging not only helps collect information about people but also about vehicles, packages, animals, birds, fish, medicines, groceries, etc.


Classification of Digital Image Sensors

Sensors are categorized according to their structure (CCD or CMOS), chroma type (color or monochrome), and shutter type (global or rolling). They are also classified based on their resolution, frame rate, pixel size, and sensor format. Understanding these terms will help you decide which sensor is suitable for your application.
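
As a quick illustration of these parameters, here is a minimal sketch in Python of a sensor-specification record built around the categories above. The class, field names, and example values are assumptions for illustration only, not taken from any real sensor's datasheet.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Tuple

class Structure(Enum):
    CCD = "CCD"
    CMOS = "CMOS"

class Chroma(Enum):
    COLOR = "color"
    MONOCHROME = "monochrome"

class Shutter(Enum):
    GLOBAL = "global"
    ROLLING = "rolling"

@dataclass
class SensorSpec:
    """Illustrative record of the sensor parameters discussed in this section."""
    structure: Structure          # building block: CCD or CMOS
    chroma: Chroma                # color or monochrome
    shutter: Shutter              # global or rolling shutter
    resolution: Tuple[int, int]   # active pixel count, width x height
    frame_rate_fps: float         # maximum frame rate in frames per second
    pixel_size_um: float          # pixel pitch in micrometers
    sensor_format: str            # optical format designation, e.g. 1/2.8"

# Hypothetical example values, purely for illustration.
example = SensorSpec(Structure.CMOS, Chroma.COLOR, Shutter.ROLLING,
                     (1920, 1080), 60.0, 2.9, '1/2.8"')
print(example)
```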


Based on the Building Block of the Sensor

Camera modules are systems that combine an image sensor, a lens module, and supporting electronics. The image sensor in the camera module converts the optical image formed by the lens into electrical signals for processing. Two types of image sensors are available:

  1. CCD – Charge-Coupled Device sensors

  2. CMOS – Complementary Metal-Oxide Semiconductor sensors

CMOS sensors are widely preferred due to their low power consumption, ease of integration, faster frame rates, and lower manufacturing cost. A detailed comparison between these sensors is given in the article CCD vs CMOS Sensor: Which one is appropriate for Camera Applications?


Based on the Pixel Scanning Methodology

Sensors in imaging systems collect and store images for different processing and analysis applications. To capture a picture, these sensors employ an electronic shutter. The electronic shutter manages the exposure of the photon wells on the sensor and controls whether the pixels are exposed as a full matrix at once or line by line. The two primary types of electronic shutter are:

  1. Global Shutter

  2. Rolling Shutter

The article Global Shutter vs Rolling Shutter: Appropriate Justification for Choosing the Best Among Them describes the differences between them in detail; a small sketch of the two exposure patterns follows below.
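
As a rough illustration of the difference, the sketch below (in Python) models when each row of a hypothetical sensor begins its exposure under the two shutter types. The row count and line time are assumed values, not figures from any particular sensor.

```python
from typing import List

# Illustrative timing model: when each sensor row begins its exposure.
# Both numbers below are assumptions; real sensors specify them in their datasheets.
ROWS = 1080           # assumed number of sensor rows
LINE_TIME_US = 15.0   # assumed time to read out one row, in microseconds

def exposure_start_times(shutter: str) -> List[float]:
    """Return the exposure start time (in microseconds) of every row."""
    if shutter == "global":
        # Global shutter: every row starts (and ends) exposure at the same instant,
        # so a moving object is frozen consistently across the whole frame.
        return [0.0] * ROWS
    if shutter == "rolling":
        # Rolling shutter: rows start one line time apart, so fast motion is
        # sampled at slightly different instants down the frame (skew/wobble).
        return [row * LINE_TIME_US for row in range(ROWS)]
    raise ValueError(f"unknown shutter type: {shutter}")

global_start = exposure_start_times("global")
rolling_start = exposure_start_times("rolling")
print("Global shutter, first vs last row start (us):", global_start[0], global_start[-1])
print("Rolling shutter, first vs last row start (us):", rolling_start[0], rolling_start[-1])
# The roughly 16 ms spread in the rolling case is what produces skew artifacts.
```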


Based on the Output Format

Most digital sensors function by capturing light in a grid of photosites, much like a grid of buckets collecting raindrops. Each photosite is exposed to capture incoming light as the exposure commences. When the exposure is finished, the electrical signal representing each photosite's fill level is read out, quantified, and recorded as a numerical value in an image file. For visible light (as opposed to infrared, ultraviolet, or X-ray), there are two primary types of sensors:

  1. Color Sensor

  2. Monochrome Sensor

Color cameras have grown in popularity as the most practical solution for many imaging applications. In practice, however, monochrome cameras with reliable sensors are a considerably more practical and efficient choice for some embedded vision applications, because they can capture images with finer detail and higher sensitivity than comparable color cameras. The article Difference between Monochrome Sensors and Colour Sensors explains why monochrome cameras have various advantages over color cameras; a small sketch of the underlying sampling difference follows below.
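
To make the sampling difference concrete, here is a small illustrative sketch (using NumPy, with an assumed RGGB Bayer layout). It shows that a monochrome sensor records useful signal at every photosite, whereas a color sensor's Bayer filter lets each photosite see only one color channel, with the missing values later interpolated (demosaiced). The scene values are random numbers used purely for illustration.

```python
import numpy as np

# Hypothetical 4x4 scene; values are relative intensities per channel (R, G, B).
rng = np.random.default_rng(0)
scene_rgb = rng.random((4, 4, 3))

# Monochrome sensor: every photosite integrates light from the whole spectrum
# (simplified here as the sum of the three channels).
mono_raw = scene_rgb.sum(axis=2)

# Color sensor: an RGGB Bayer filter lets each photosite see only one channel,
# so roughly two thirds of the color information must later be interpolated.
bayer_raw = np.empty((4, 4))
bayer_raw[0::2, 0::2] = scene_rgb[0::2, 0::2, 0]  # red sites
bayer_raw[0::2, 1::2] = scene_rgb[0::2, 1::2, 1]  # green sites
bayer_raw[1::2, 0::2] = scene_rgb[1::2, 0::2, 1]  # green sites
bayer_raw[1::2, 1::2] = scene_rgb[1::2, 1::2, 2]  # blue sites

print("Monochrome raw frame:\n", mono_raw.round(2))
print("Bayer (color) raw frame:\n", bayer_raw.round(2))
# The monochrome frame uses all of the incoming light at every pixel, which is
# one reason monochrome sensors tend to offer better detail and sensitivity.
```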


Based on the Output Interface

The output interface of CMOS image sensors falls into two main categories:

  1. MIPI – Mobile Industry Processor Interface

  2. DVP – Digital Video Port

The fundamental distinction between DVP and MIPI is that MIPI uses a high-speed differential serial interface, while DVP uses a parallel interface. The MIPI interface offers a wider data bandwidth and therefore supports higher resolutions and frame rates than the DVP interface. Because the MIPI interface is more complex and requires additional hardware, however, DVP remains a simpler option for lower-resolution, lower-frame-rate designs. A rough bandwidth comparison is sketched below.
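
To see why interface bandwidth matters, here is a back-of-the-envelope estimate (in Python) of the raw data rate a sensor produces. The resolution, frame rate, bit depth, and per-lane speed are illustrative assumptions, not figures from any specific standard or sensor, and blanking intervals are ignored.

```python
# Back-of-the-envelope data-rate estimate for a sensor output interface.
# All input values are assumptions chosen only for illustration.
width, height = 1920, 1080   # assumed active resolution (pixels)
fps = 60                     # assumed frame rate (frames per second)
bits_per_pixel = 10          # assumed raw bit depth per pixel

raw_bps = width * height * fps * bits_per_pixel
print(f"Raw pixel data rate: {raw_bps / 1e9:.2f} Gbit/s")

# A serial interface such as MIPI CSI-2 spreads this load across a few
# high-speed differential lanes; the per-lane rate here is a placeholder.
assumed_lane_gbps = 1.0
lanes_needed = -(-raw_bps // int(assumed_lane_gbps * 1e9))  # ceiling division
print(f"Lanes needed at {assumed_lane_gbps} Gbit/s per lane: {lanes_needed}")

# A parallel interface (DVP) instead moves one pixel per clock over many wires,
# so the required pixel clock equals the pixel rate.
pixel_clock_mhz = width * height * fps / 1e6
print(f"Equivalent DVP pixel clock: {pixel_clock_mhz:.0f} MHz")
```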


Image Sensor Format

Image sensors are available in a variety of formats (sometimes referred to as optical classes, sensor sizes, or types), as well as packages. The overall size of a sensor is determined by its resolution and pixel size, with larger sensors typically offering higher resolutions or larger pixels than smaller sensors. Choosing a lens and other camera optics requires knowledge of the image sensor format, since each lens is designed for a particular sensor resolution and format. A simple size calculation is sketched below.
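
As a quick worked example (all numbers assumed for illustration), the active area of a sensor can be estimated from its resolution and pixel pitch; the diagonal of that area is what the optical-format designation loosely describes, and it is the dimension the lens image circle must cover.

```python
import math

# Illustrative calculation of active sensor area from resolution and pixel pitch.
# The resolution and pixel size below are assumptions for this example only.
width_px, height_px = 1920, 1080   # assumed resolution
pixel_pitch_um = 2.9               # assumed pixel size in micrometers

width_mm = width_px * pixel_pitch_um / 1000.0
height_mm = height_px * pixel_pitch_um / 1000.0
diagonal_mm = math.hypot(width_mm, height_mm)

print(f"Active area: {width_mm:.2f} mm x {height_mm:.2f} mm")
print(f"Diagonal: {diagonal_mm:.2f} mm")
# The lens must project an image circle at least as large as this diagonal.
```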


Pixel Size

The pixel size is expressed in micrometers (µm) and takes into account the full area of the photodiode and any adjacent circuitry. A CMOS pixel is made up of a photodiode, an amplifier, a reset gate, a transfer gate, and a floating diffusion node. However, because some of these elements can be shared between pixels, they are not necessarily present in every pixel. For a more precise understanding of image sensor sensitivity, it is best to refer to the sensor's spectral response (quantum efficiency) and other measured performance data. A rough sensitivity estimate is sketched below.
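
As a rough sketch of how pixel size and quantum efficiency relate to sensitivity, the calculation below estimates the signal collected by a single pixel. The photon flux, pixel size, quantum efficiency, and exposure time are all assumed values, and real pixels also involve fill factor, microlenses, and noise that this simplification ignores.

```python
# Rough signal estimate for one pixel: more area and higher QE mean more signal.
# Every input value below is an assumption chosen only to illustrate the relationship.
photon_flux = 1.0e4        # assumed photons per square micrometer per second
pixel_pitch_um = 2.9       # assumed pixel size (micrometers)
quantum_efficiency = 0.6   # assumed fraction of photons converted to electrons
exposure_s = 0.01          # assumed exposure time (10 ms)

pixel_area_um2 = pixel_pitch_um ** 2
signal_electrons = photon_flux * pixel_area_um2 * quantum_efficiency * exposure_s
print(f"Collected signal: {signal_electrons:.0f} electrons")

# Doubling the pixel pitch quadruples the area and hence the collected signal,
# which is why larger pixels generally deliver better low-light sensitivity.
```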


Wrapping Up

Each sensor type has a distinct function and use case. No single type can fulfill every requirement, so we must select the sensor that best suits our needs based on the use case.

What is the best way to choose a sensor type? What are the advantages and disadvantages of each type? Are there any other subcategories we should be aware of? 

Understanding the vocabulary and technology used in digital sensors may help you choose the right camera for your application more effectively if you are familiar with the answers to the questions above. For instance, selecting the appropriate lens will depend on certain sensor parameters, such as pixel size and sensor format. Additionally, you’ll be better prepared to assess if new sensor technologies are advantageous for your application when they become available.

If these questions or the sensor types still seem confusing, Vadzo's team will be happy to assist you. Vadzo has had the pleasure of working with imaging sensors from companies such as ON Semiconductor, Sony, Omnivision, and others.

Feel free to contact us.
