OakCamera allows users to easily record video streams so that the scene can later be fully replayed (see the Replaying documentation), including reconstructing the stereo depth perception.

The script below saves the color, left, and right H.265-encoded streams into video files. Frames are synchronized (via timestamps) before being saved.

from depthai_sdk import OakCamera, RecordType

with OakCamera() as oak:
    color = oak.create_camera('color', resolution='1080P', fps=20, encode='H265')
    left = oak.create_camera('left', resolution='800p', fps=20, encode='H265')
    right = oak.create_camera('right', resolution='800p', fps=20, encode='H265')

    # Synchronize & save all (encoded) streams
    oak.record([color.out.encoded, left.out.encoded, right.out.encoded], './', RecordType.VIDEO)
    # Show color stream
    oak.visualize([color.out.camera], scale=2/3, fps=True)

    oak.start(blocking=True)


Recording pipeline of the script above

After 20 seconds, we stopped the recording, and the SDK printed the location of the saved files (./1-18443010D116631200 in our case):

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         10/3/2022  12:50 PM           9281 calib.json
-a----         10/3/2022  12:51 PM       19172908 color.mp4
-a----         10/3/2022  12:51 PM       15137490 left.mp4
-a----         10/3/2022  12:51 PM       15030761 right.mp4

This depthai-recording can then be used later to reconstruct the whole scene using the Replaying feature.
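Replaying boils down to pointing OakCamera at the recorded folder instead of a live device. A minimal sketch, assuming the SDK reconstructs depth from the recorded left/right streams when create_stereo() is used (the folder name is the one printed above; yours will differ per device):

```python
from depthai_sdk import OakCamera

# Replay the depthai-recording created above instead of using a live camera
with OakCamera(replay='./1-18443010D116631200') as oak:
    color = oak.create_camera('color')
    stereo = oak.create_stereo()  # depth is re-computed from recorded left/right
    oak.visualize([color.out.camera, stereo.out.depth])
    oak.start(blocking=True)
```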

Supported recording types

  1. RecordType.VIDEO

  2. RecordType.BAG

  3. RecordType.MCAP

1. Video

This option will write each stream separately to a video file. There are three options for saving these files:

  1. If we are saving unencoded frames, the SDK will use the cv2.VideoWriter class to save each stream into an .avi file.

  2. If we are saving encoded streams and have av (the PyAV library) installed, the SDK will save the encoded streams directly into an .mp4 container, which allows you to watch the videos with a standard video player. No decoding/encoding (or transcoding) happens on the host computer, so host CPU/GPU/RAM usage is minimal. More information here.

  3. Otherwise, the SDK will save the encoded streams to raw files (e.g. color.mjpeg), and you can use ffmpeg or mkvmerge to containerize the stream so it is viewable by most video players. More information here.

200 frames from a 4K color camera recorded with different encoding options (MJPEG, H.264, H.265) using av:

2. Rosbag

Currently, we only support recording depth to the rosbag (recording.bag). In the future, we will also support the (depth-aligned) color stream and mono streams. You can open the rosbag with the RealSense Viewer to view the depth:
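Recording depth to a rosbag can be sketched as follows; the stereo component's out.depth output and the create_stereo() arguments are assumed to match your SDK version:

```python
from depthai_sdk import OakCamera, RecordType

with OakCamera() as oak:
    stereo = oak.create_stereo(resolution='800p', fps=30)
    # Save the depth stream into recording.bag (only depth is supported for now)
    oak.record([stereo.out.depth], './', RecordType.BAG)
    oak.start(blocking=True)
```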

3. MCAP recording

An alternative to rosbags is the MCAP file format, which can be viewed with Foxglove Studio. You can find an MCAP recording example here. Currently supported streams:

  • MJPEG-encoded color/left/right/disparity. Lossless MJPEG/H.264/H.265 aren't supported by Foxglove Studio.

  • Non-encoded color/left/right/disparity/depth frames.

  • Pointcloud, enabled with recorder.config_mcap(pointcloud=True). This converts depth frames to a pointcloud on the host.
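Putting the above together, an MCAP recording with host-side pointcloud conversion might look like this sketch (it assumes the recorder object returned by oak.record() exposes the config_mcap() method mentioned above):

```python
from depthai_sdk import OakCamera, RecordType

with OakCamera() as oak:
    color = oak.create_camera('color', encode='MJPEG')  # MJPEG plays in Foxglove Studio
    stereo = oak.create_stereo()
    recorder = oak.record([color.out.encoded, stereo.out.depth], './', RecordType.MCAP)
    recorder.config_mcap(pointcloud=True)  # convert depth to a pointcloud on the host
    oak.start(blocking=True)
```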

A standalone Foxglove Studio streaming demo can be found here.

Available topics in Foxglove Studio from MCAP recorded by example

Got questions?

Head over to the Discussion Forum for technical support or any other questions you might have.