RGB Encoding¶
This example shows how to configure the DepthAI video encoder to encode the RGB camera input in H.265 at 4K/2160p (3840x2160) and 30 FPS, the maximum resolution the encoder supports; higher frame rates are possible at lower resolutions, e.g. 1440p at 60 FPS. The encoded bitstream is transferred over XLink to the host and saved to disk as a video file.
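For illustration only (not part of the source code below), a lower-resolution, higher-frame-rate variant might look like the following sketch. It reuses the camRgb and videoEnc nodes defined in the full example and assumes a sensor mode that supports 1080p at 60 FPS:

# Sketch: trade resolution for frame rate (1080p at 60 FPS instead of 4K at 30 FPS)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
camRgb.setFps(60)  # run the color camera at 60 FPS
videoEnc.setDefaultProfilePreset(60, dai.VideoEncoderProperties.Profile.H265_MAIN)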
Pressing Ctrl+C stops the recording; the raw .h265 stream can then be made playable by converting it to .mp4 with ffmpeg (the example prints the exact command on exit). Note that ffmpeg must be installed and runnable for the conversion to succeed.
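The conversion command printed by the example is:

ffmpeg -framerate 30 -i video.h265 -c copy video.mp4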
Be careful: this example saves the encoded video to your host storage, so leaving it running for a long time can fill up your disk.
Encoded bitstream (MJPEG, H.264, or H.265) from the device can also be saved directly into an .mp4 container with no computational overhead on the host computer. See demo here for more information.
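As a rough sketch of that approach (not the method used in this example): assuming the PyAV package (pip install av) and the same pipeline as in the source code below, the encoded packets could be muxed into an .mp4 container on the host roughly like this; stream properties and packet timestamps may need additional handling in practice:

import av
import depthai as dai

# Sketch only: mux H.265 packets from the device straight into an .mp4 container.
# Assumes the same pipeline as in the full example below and the PyAV ("av") package.
with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="h265", maxSize=30, blocking=True)
    container = av.open("video.mp4", "w")            # .mp4 container on the host
    stream = container.add_stream("hevc", rate=30)   # H.265 video stream
    try:
        while True:
            data = q.get().getData()                 # encoded H.265 packet from the device
            packet = av.Packet(data.tobytes())       # wrap the raw bitstream in a PyAV packet
            packet.stream = stream
            container.mux(packet)                    # write the packet into the container
    except KeyboardInterrupt:
        pass
    container.close()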
Matroska
Besides ffmpeg and the .mp4 video container (which is patent encumbered), you can also use mkvmerge (see MKVToolNix for GUI usage) and the .mkv video container to mux the encoded stream into a video file that is supported by all major video players (e.g. VLC):
mkvmerge -o vid.mkv video.h265
Demo¶
Setup¶
Please run the install script to download all required dependencies. Please note that this script must be run from Git context, so you have to download the depthai-python repository first and then run the script:
git clone https://github.com/luxonis/depthai-python.git
cd depthai-python/examples
python3 install_requirements.py
For additional information, please follow the installation guide.
Source code¶
Also available on GitHub
#!/usr/bin/env python3

import depthai as dai

# Create pipeline
pipeline = dai.Pipeline()

# Define sources and output
camRgb = pipeline.create(dai.node.ColorCamera)
videoEnc = pipeline.create(dai.node.VideoEncoder)
xout = pipeline.create(dai.node.XLinkOut)

xout.setStreamName('h265')

# Properties
camRgb.setBoardSocket(dai.CameraBoardSocket.CAM_A)
camRgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_4_K)
videoEnc.setDefaultProfilePreset(30, dai.VideoEncoderProperties.Profile.H265_MAIN)

# Linking
camRgb.video.link(videoEnc.input)
videoEnc.bitstream.link(xout.input)

# Connect to device and start pipeline
with dai.Device(pipeline) as device:

    # Output queue will be used to get the encoded data from the output defined above
    q = device.getOutputQueue(name="h265", maxSize=30, blocking=True)

    # The .h265 file is a raw stream file (not playable yet)
    with open('video.h265', 'wb') as videoFile:
        print("Press Ctrl+C to stop encoding...")
        try:
            while True:
                h265Packet = q.get()  # Blocking call, will wait until new data has arrived
                h265Packet.getData().tofile(videoFile)  # Appends the packet data to the opened file
        except KeyboardInterrupt:
            # Keyboard interrupt (Ctrl + C) detected
            pass

    print("To view the encoded data, convert the stream file (.h265) into a video file (.mp4) using a command below:")
    print("ffmpeg -framerate 30 -i video.h265 -c copy video.mp4")
Also available on GitHub
#include <csignal>
#include <iostream>

// Includes common necessary includes for development using depthai library
#include "depthai/depthai.hpp"

// Keyboard interrupt (Ctrl + C) detected
static std::atomic<bool> alive{true};
static void sigintHandler(int signum) {
    alive = false;
}

int main(int argc, char** argv) {
    using namespace std;
    std::signal(SIGINT, &sigintHandler);

    // Create pipeline
    dai::Pipeline pipeline;

    // Define sources and outputs
    auto camRgb = pipeline.create<dai::node::ColorCamera>();
    auto videoEnc = pipeline.create<dai::node::VideoEncoder>();
    auto xout = pipeline.create<dai::node::XLinkOut>();

    xout->setStreamName("h265");

    // Properties
    camRgb->setBoardSocket(dai::CameraBoardSocket::CAM_A);
    camRgb->setResolution(dai::ColorCameraProperties::SensorResolution::THE_4_K);
    videoEnc->setDefaultProfilePreset(30, dai::VideoEncoderProperties::Profile::H265_MAIN);

    // Linking
    camRgb->video.link(videoEnc->input);
    videoEnc->bitstream.link(xout->input);

    // Connect to device and start pipeline
    dai::Device device(pipeline);

    // Output queue will be used to get the encoded data from the output defined above
    auto q = device.getOutputQueue("h265", 30, true);

    // The .h265 file is a raw stream file (not playable yet)
    auto videoFile = std::ofstream("video.h265", std::ios::binary);
    cout << "Press Ctrl+C to stop encoding..." << endl;

    while(alive) {
        auto h265Packet = q->get<dai::ImgFrame>();
        videoFile.write((char*)(h265Packet->getData().data()), h265Packet->getData().size());
    }

    cout << "To view the encoded data, convert the stream file (.h265) into a video file (.mp4) using a command below:" << endl;
    cout << "ffmpeg -framerate 30 -i video.h265 -c copy video.mp4" << endl;

    return 0;
}