Platform

Image Quality

Image Quality (IQ) is a measure of how well an image represents the original scene. It is a combination of many factors, such as sharpness, noise, color accuracy, and more. There are a few ways to improve IQ on OAK cameras, for example:
  1. Changing the Color camera ISP configuration
  2. Keeping camera sensitivity low - see Low-light increased sensitivity
  3. Camera tuning with custom tuning blobs
  4. Reducing Motion blur effects
For best IQ, we suggest testing it yourself for your specific application. You can use RGB Camera Control to try out different ISP configurations and exposure/sensitivity values dynamically (live).
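As a minimal sketch of that workflow (not the full RGB Camera Control example), the snippet below sets up a control input so the host can adjust exposure and sensitivity while the pipeline is running; the exposure/ISO values are placeholders:
Py
import depthai as dai

pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)

# Control input: lets the host send CameraControl messages at runtime
controlIn = pipeline.create(dai.node.XLinkIn)
controlIn.setStreamName('control')
controlIn.out.link(camRgb.inputControl)

with dai.Device(pipeline) as device:
    controlQueue = device.getInputQueue('control')
    ctrl = dai.CameraControl()
    ctrl.setManualExposure(20000, 400)  # exposure time [us], ISO - placeholder values
    controlQueue.send(ctrl)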

Color camera ISP configuration

You can configure ColorCamera ISP values such as sharpness, luma denoise, and chroma denoise, which can improve IQ. We have noticed that sometimes these values provide better results:
Python
import depthai as dai

pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.initialControl.setSharpness(0)     # range: 0..4, default: 1
camRgb.initialControl.setLumaDenoise(0)   # range: 0..4, default: 1
camRgb.initialControl.setChromaDenoise(4) # range: 0..4, default: 1
The zoomed-in image above showcases the IQ difference between ISP configurations (discussion here). Note that for the best IQ, you would need to test and evaluate these values for your specific application.

On the Wide FOV cameras, you can select between the wide FOV IMX378 and the OV9782. In general, the IQ of the OV9782 won't be as good as that of the IMX378: its resolution is much lower, and it's harder to deal with sharpness/noise at low resolutions. With a high-resolution sensor, the image can be downscaled, which makes the noise less visible. And even though the OV9782 has quite large pixels, the noise levels of global shutter sensors are generally higher than those of rolling shutter sensors.
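For example, the on-device downscaling mentioned above can be done with setIspScale; a minimal sketch, assuming a 12MP IMX378 sensor:
Py
import depthai as dai

pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_12_MP)
# Downscale the 12MP ISP output by half (4056x3040 -> 2028x1520),
# which makes per-pixel noise less visible
camRgb.setIspScale(1, 2)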

Low-light increased sensitivity

The image below shows how different sensitivity values affect the IQ. Sensitivity only adds analog gain, which increases image noise. In a low-light environment, one should always increase exposure first, not sensitivity. Note that by default, depthai will always do so - but when running at 30 FPS, the maximum exposure time is 33ms. For the right image below, we set the ColorCamera to 10 FPS, so we were able to increase the exposure to 100ms.

About 15x digitally zoomed-in image of a standard A4 camera tuning target at 420cm (40 lux). We used the 12MP IMX378 (on OAK-D) for this image.
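A minimal sketch of the approach described above - lowering the frame rate so a longer exposure fits within the frame time. Whether the full 100ms is reachable with auto-exposure depends on the tuning blob in use (see Camera tuning below), so the manual-exposure line is shown as one possible way to force it:
Py
import depthai as dai

pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setFps(10)  # at 10 FPS, the frame time allows up to ~100ms exposure
# Force a long exposure with low sensitivity: exposure time [us], ISO
camRgb.initialControl.setManualExposure(100000, 100)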

Camera tuning

Our library supports setting a camera IQ tuning blob, which is used for all cameras. By default, cameras use a general tuning blob that works well in most cases, so changing it is usually not needed.
Py
import depthai as dai

pipeline = dai.Pipeline()
pipeline.setCameraTuningBlobPath('/path/to/tuning.bin')

Available tuning blobs

To tune your own camera sensors, one would need Intel's software, for which a license is needed, so the majority of people will only be able to use pre-tuned blobs. Currently available tuning blobs:
  • Mono tuning for low-light environments here. This allows auto-exposure to go up to 200ms (otherwise limited to 33ms by the default tuning). For 200ms auto-exposure, you also need to limit the FPS (monoRight.setFps(5)).
  • Color tuning for low-light environments here. Comparison below. This allows auto-exposure to go up to 100ms (otherwise limited to 33ms by the default tuning). For 100ms auto-exposure, you also need to limit the FPS (rgbCam.setFps(10)). Known limitation: flicker can be seen with auto-exposure over 33ms; it is caused by auto-focus working in continuous mode. A workaround is to change from CONTINUOUS_VIDEO (default) to AUTO (focusing only once at init, and on further focus trigger commands): camRgb.initialControl.setAutoFocusMode(dai.CameraControl.AutoFocusMode.AUTO). See the sketch after this list.
  • OV9782 Wide FOV color tuning for sunlight environments here. Fixes lens color filtering in direct sunlight, see the blog post here. It also improves LSC (Lens Shading Correction). It currently doesn't work for the OV9282, so when used on e.g. a Series 2 OAK with Wide FOV cams, the mono cameras shouldn't be enabled.
  • Camera exposure limit: max 500us, max 8300us. These tuning blobs limit the maximum exposure time and instead start increasing ISO (sensitivity) after the maximum exposure time is reached. This is a useful approach to reduce Motion blur.
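A minimal sketch combining the color low-light tuning blob with the FPS limit and the auto-focus workaround mentioned above; the blob path is a placeholder for wherever you saved the downloaded file:
Py
import depthai as dai

pipeline = dai.Pipeline()
# Placeholder path to the downloaded color low-light tuning blob
pipeline.setCameraTuningBlobPath('/path/to/tuning_color_low_light.bin')

camRgb = pipeline.create(dai.node.ColorCamera)
camRgb.setFps(10)  # required for auto-exposure up to 100ms
# Avoid flicker from continuous auto-focus at long exposures: focus only once at init
camRgb.initialControl.setAutoFocusMode(dai.CameraControl.AutoFocusMode.AUTO)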

Motion blur

Motion blur appears when the camera shutter is open for a longer time and the object moves during that time.
Rolling shutter sensor timings
The animation shows the difference between a shorter (left) and a longer (right) exposure time. During a shorter exposure time, even a fast-moving object only moves a short distance, which causes less motion blur.

In the image above, the right foot moved about 50 pixels during the exposure time, which results in a blurry image in that region. The left foot was on the ground for the whole exposure, so it's not blurry.

In high-vibration environments we recommend using a Fixed-Focus color camera, as otherwise the Auto-Focus lens will shake and cause blurry images (docs here).

Potential workarounds:
  • Have better (brighter) lighting, which will cause the camera to use a shorter exposure time, and thus reduce motion blur.
  • Limit the shutter (exposure) time - this will decrease the motion blur, but will also decrease the light that reaches the sensor, so the image will be darker. You could either use a larger sensor (so more photons hit it) or use a higher ISO (sensitivity) value. One option to limit the max exposure time is a Camera tuning blob; another is to set the limit directly, as in the snippet below:
Py
import depthai as dai

pipeline = dai.Pipeline()
camRgb = pipeline.create(dai.node.ColorCamera)
# Max exposure limit in microseconds. After this time, ISO will be increased instead of exposure.
camRgb.initialControl.setAutoExposureLimit(10000) # Max 10ms
  • If motion blur negatively affects your model's accuracy, you could fine-tune the model to be more robust to it by including motion-blurred images in your training dataset (a simple augmentation sketch is shown below). Example video:
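As a rough illustration of that last point, the sketch below adds synthetic horizontal motion blur to training images with OpenCV; the kernel size and file paths are arbitrary placeholders, and real augmentation pipelines usually randomize the blur direction and strength:
Py
import cv2
import numpy as np

def add_motion_blur(image, kernel_size=15):
    # Build a horizontal line kernel: averaging along one row simulates linear motion
    kernel = np.zeros((kernel_size, kernel_size), dtype=np.float32)
    kernel[kernel_size // 2, :] = 1.0 / kernel_size
    return cv2.filter2D(image, -1, kernel)

# Placeholder paths - blur a copy of a training image and save it alongside the original
img = cv2.imread('train/sample.jpg')
blurred = add_motion_blur(img, kernel_size=25)
cv2.imwrite('train/sample_blurred.jpg', blurred)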