The script below saves the color, left, and right H.265-encoded streams into video files. Frames are synchronized via timestamps before being saved.
```python
from depthai_sdk import OakCamera, RecordType

with OakCamera() as oak:
    color = oak.create_camera('color', resolution='1080P', fps=20, encode='H265')
    left = oak.create_camera('left', resolution='800p', fps=20, encode='H265')
    right = oak.create_camera('right', resolution='800p', fps=20, encode='H265')
    # Synchronize & save all (encoded) streams
    oak.record([color.out.encoded, left.out.encoded, right.out.encoded], './', RecordType.VIDEO)
    # Show color stream
    oak.visualize([color.out.camera], scale=2/3, fps=True)
    oak.start(blocking=True)
```
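The timestamp-based synchronization mentioned above can be sketched roughly as follows. This is a simplified, hypothetical helper (the SDK's internal sync logic is more involved): pick a reference stream, and for each of its frames find the closest-in-time frame from every other stream, dropping groups where any stream has no frame within a tolerance.

```python
def sync_frames(streams, threshold=0.025):
    """Group one frame per stream by timestamp proximity.

    streams: dict mapping stream name -> list of (timestamp_sec, frame).
    threshold: max allowed time difference (seconds) from the reference frame.
    Returns a list of dicts, each mapping stream name -> frame.
    NOTE: illustrative sketch only, not the actual SDK implementation.
    """
    names = list(streams)
    ref = names[0]  # use the first stream as the reference clock
    groups = []
    for ts, frame in streams[ref]:
        group = {ref: frame}
        ok = True
        for name in names[1:]:
            # Pick the frame whose timestamp is closest to the reference frame.
            closest_ts, closest_frame = min(
                streams[name], key=lambda tf: abs(tf[0] - ts))
            if abs(closest_ts - ts) > threshold:
                ok = False  # no partner close enough -> drop this group
                break
            group[name] = closest_frame
        if ok:
            groups.append(group)
    return groups
```

With a 25 ms tolerance and 20 fps streams (50 ms frame period), each reference frame matches at most one frame per stream, so every saved group holds frames captured at effectively the same moment.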
After 20 seconds I stopped the recording, and the SDK printed where the files were saved. In my case:
```
Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a----         10/3/2022  12:50 PM           9281 calib.json
-a----         10/3/2022  12:51 PM       19172908 color.mp4
-a----         10/3/2022  12:51 PM       15137490 left.mp4
-a----         10/3/2022  12:51 PM       15030761 right.mp4
```
This depthai-recording can later be used to reconstruct the whole scene using the Replaying feature.
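As a rough sketch of how replaying is typically used, you can point `OakCamera` at the recording folder instead of a live device via its `replay` argument (the folder path below is a placeholder for wherever your recording was saved):

```python
from depthai_sdk import OakCamera

# Replay a previously saved depthai-recording instead of streaming from a
# live device. './recording-folder' is a placeholder path.
with OakCamera(replay='./recording-folder') as oak:
    color = oak.create_camera('color')
    oak.visualize([color.out.camera], fps=True)
    oak.start(blocking=True)
```

Since replaying still runs the pipeline on an OAK device, this script requires a connected device even though the frames come from disk.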
Supported recording types
In the next release we will also support MCAP.
This option writes each stream to a separate video file. There are three options for saving these files:
- If we are saving unencoded frames, the SDK uses the `cv2.VideoWriter` class to save these streams into video files.
- If we are saving encoded streams and `av` (the PyAV library) is installed, the SDK saves the encoded streams directly into an `.mp4` container, so you can watch the videos with a standard video player. No decoding/encoding (or converting) happens on the host computer, and host CPU/GPU/RAM usage is minimal. More information here.
- Otherwise, the SDK saves the encoded streams to raw files (e.g. `color.mjpeg`), and you can use ffmpeg or mkvmerge to containerize the stream so it's viewable by most video players. More information here.
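As an example of the last option (exact flags depend on your codec and recording fps, so treat this as a sketch), ffmpeg can wrap a raw stream into an `.mp4` container without re-encoding:

```shell
# Wrap the raw MJPEG stream into an .mp4 container without re-encoding
# (-c copy). Adjust -framerate to match the fps the stream was recorded at.
ffmpeg -framerate 20 -i color.mjpeg -c copy color.mp4
```

Because `-c copy` only remuxes the packets, this is fast and lossless; the same pattern applies to raw H.264/H.265 streams.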
200 frames from the 4K color camera using different encoding options (MJPEG, H.264, H.265):
Currently, we only support recording depth to the rosbag (`recording.bag`). In the future we will also support the color stream (which is depth-aligned) and mono streams. You can open the rosbag with the [RealSense Viewer](https://www.intelrealsense.com/sdk-2/#sdk2-tools) to view the depth:
We’re always happy to help with development or other questions you might have.