OakCamera allows users to easily record video streams so the scene can later be fully replayed (see the Replaying documentation), including reconstruction of stereo depth perception.

The script below saves the color, left, and right H.265-encoded streams into video files. Frames are synchronized (via timestamps) before being saved.

from depthai_sdk import OakCamera, RecordType

with OakCamera() as oak:
    color = oak.create_camera('color', resolution='1080P', fps=20, encode='H265')
    left = oak.create_camera('left', resolution='800p', fps=20, encode='H265')
    right = oak.create_camera('right', resolution='800p', fps=20, encode='H265')

    # Synchronize & save all (encoded) streams
    oak.record([color.out.encoded, left.out.encoded, right.out.encoded], './', RecordType.VIDEO)
    # Show color stream
    oak.visualize([], scale=2/3, fps=True)


Recording pipeline of the script above

After 20 seconds I stopped the recording, and the SDK printed where the files were saved (in my case ./1-18443010D116631200):

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         10/3/2022  12:50 PM           9281 calib.json
-a----         10/3/2022  12:51 PM       19172908 color.mp4
-a----         10/3/2022  12:51 PM       15137490 left.mp4
-a----         10/3/2022  12:51 PM       15030761 right.mp4

This depthai-recording can then be used next time to reconstruct the whole scene using the Replaying feature.
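Since each recording lands in its own folder (named `<index>-<MxID>`, e.g. the `1-18443010D116631200` above), you may want to locate past recordings programmatically before replaying them. A minimal stdlib-only sketch; the `find_recordings` helper is hypothetical, not part of the SDK:

```python
from pathlib import Path


def find_recordings(root: Path = Path('.')) -> list:
    """Return depthai-recording folders under `root`, newest first.

    A folder is treated as a recording if it contains calib.json,
    matching the directory listing above.
    """
    folders = [p.parent for p in root.glob('*/calib.json')]
    return sorted(folders, key=lambda p: p.stat().st_mtime, reverse=True)
```

The newest folder's path can then be handed to the Replaying feature.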

Supported recording types

  1. RecordType.VIDEO

  2. RecordType.BAG

In the next release we will also support MCAP.

1. Video

This option writes each stream to a separate video file. There are three options for saving these files:

  1. If we are saving unencoded frames, the SDK will use the cv2.VideoWriter class to save these streams into an .avi file.

  2. If we are saving encoded streams and have av (the PyAV library) installed, the SDK will save the encoded streams directly into an .mp4 container. This allows you to watch the videos with a standard video player. There's no decoding/encoding (or converting) happening on the host computer, so host CPU/GPU/RAM usage is minimal. More information here.

  3. Otherwise, the SDK will save the encoded streams to files (e.g. color.mjpeg), and you can use ffmpeg or mkvmerge to containerize the stream so it's viewable by most video players. More information here.

200 frames from a 4K color camera saved with different encoding options (MJPEG, H.264, H.265) using av:

2. Rosbag

Currently, we only support recording depth to a rosbag (recording.bag). In the future, we will also support the (depth-aligned) color stream and the mono streams. You can open the rosbag with the RealSense Viewer to view the depth:

Got questions?

We’re always happy to help with development or other questions you might have.