There are two ways to synchronize messages from different sensors (frames, IMU packets, ToF, etc.):
Software syncing (based on timestamps/sequence numbers)
Hardware syncing (multi-sensor sub-ms accuracy, hardware trigger)
This documentation page focuses on software syncing. There are two approaches for it:
Sequence number syncing - for streams set to the same FPS; sub-ms accuracy can be achieved
Timestamp syncing - for streams with different FPS, or for syncing with other sensors either onboard (e.g. IMU) or also connected to the host computer (e.g. a USB ToF sensor)
Sequence number syncing¶
If we want to synchronize multiple messages from the same OAK, such as:
Camera frames from ColorCamera or MonoCamera (color, left and right frames)
Messages generated from camera frames (NN results, disparity/depth, edge detections, tracklets, encoded frames, tracked features, etc.)
We can use sequence number syncing, demos here. Each frame from ColorCamera/MonoCamera gets assigned a sequence number, which is then also copied to messages generated from that frame.
For sequence number syncing, the FPS of all cameras needs to be the same. On the host or inside a Script node you can get a message's sequence number like this:
```python
# Get the message from the queue
message = queue.get()  # message can be ImgFrame, NNData, Tracklets, ImgDetections, TrackedFeatures...
seqNum = message.getSequenceNum()
```
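As a minimal sketch of how host-side sequence number syncing can work (the class and names below are illustrative, not part of the depthai API), one can buffer messages per stream and emit a synced set once every stream has delivered a message with the same sequence number:

```python
from collections import defaultdict

class SeqSync:
    """Illustrative host-side syncer: groups messages from several streams
    by sequence number. Assumes all streams run at the same FPS."""

    def __init__(self, stream_names):
        self.streams = stream_names
        self.buffers = defaultdict(dict)  # stream name -> {seqNum: message}

    def add(self, stream, seq_num, message):
        """Buffer a message; return {stream: message} once all streams
        have a message with this sequence number, else None."""
        self.buffers[stream][seq_num] = message
        if all(seq_num in self.buffers[s] for s in self.streams):
            synced = {s: self.buffers[s].pop(seq_num) for s in self.streams}
            # Drop older, never-completed entries to bound memory usage
            for s in self.streams:
                self.buffers[s] = {k: v for k, v in self.buffers[s].items() if k > seq_num}
            return synced
        return None

sync = SeqSync(["color", "disparity"])
sync.add("color", 325, "rgb-frame-325")          # not complete yet -> None
pair = sync.add("disparity", 325, "disp-frame-325")  # both arrived -> synced dict
```

In a real pipeline the `message` payloads would be the `ImgFrame`/`NNData` objects pulled from the output queues, keyed by `message.getSequenceNum()`.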
Through firmware sync, we’re monitoring for drift and aligning the capture timestamps of all cameras (left, right, color), which are taken at the MIPI Start-of-Frame (SoF) event. The Left/Right global shutter cameras are driven by the same clock, started by broadcast write on I2C, so no drift will happen over time, even when running freely without a hardware sync.
The RGB rolling shutter has a slight difference in clocking/frame-time, so when we detect a small drift, we’re modifying the frame-time (number of lines) for the next frame by a small amount to compensate.
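The drift-compensation idea can be sketched as follows (this is an illustration of the concept, not the actual firmware code; the function name and clamp value are assumptions):

```python
def next_frame_time(nominal_frame_time, drift, max_adjust):
    """Return the frame time (in seconds) to use for the next rolling-shutter
    frame: subtract the measured drift relative to the global-shutter pair,
    clamped so a single frame is never adjusted by more than max_adjust."""
    adjust = max(-max_adjust, min(max_adjust, drift))
    return nominal_frame_time - adjust

# At 30 FPS (nominal frame time ~33.3 ms), a measured 0.5 ms drift
# is cancelled over the next frame; larger drifts are spread over
# several frames by the clamp.
corrected = next_frame_time(1 / 30, 0.0005, 0.001)
```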
If sensors are set to the same FPS (default is 30), the above two approaches are already integrated into depthai and enabled by default, which allows us to achieve sub-ms delay between all frames + messages generated by these frames!
```
[Seq 325] RGB timestamp: 0:02:33.549449
[Seq 325] Disparity timestamp: 0:02:33.549402
-----------
[Seq 326] RGB timestamp: 0:02:33.582756
[Seq 326] Disparity timestamp: 0:02:33.582715
-----------
[Seq 327] RGB timestamp: 0:02:33.616075
[Seq 327] Disparity timestamp: 0:02:33.616031
```
The disparity and color frame timestamps show that synchronization is well within 1 ms (about 50 µs apart in this run).
Timestamp syncing¶
As opposed to sequence number syncing, timestamp syncing can sync:
streams with different FPS
IMU results with other messages
messages with other devices connected to the computer, as timestamps are synced to the host computer clock
Feel free to check the demo here, which uses timestamps to sync IMU, color, and disparity frames together, with all of these streams producing messages at different FPS.
When multiple streams have different FPS, there are two options for syncing them:
Removing some messages from faster streams to get the synced FPS of the slower stream
Duplicating some messages from slower streams to get the synced FPS of the fastest stream
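A minimal sketch of the first option (the function and message format below are illustrative, not the depthai API): for each message of the slower stream, keep only the message from the faster stream whose host-synced timestamp is closest, effectively dropping the rest.

```python
def match_by_timestamp(slow_msgs, fast_msgs):
    """slow_msgs/fast_msgs: lists of (timestamp_seconds, payload) tuples,
    sorted by time. Returns (slow, closest_fast) pairs; fast messages not
    chosen are implicitly dropped, downsampling to the slower stream's FPS."""
    pairs = []
    for ts, payload in slow_msgs:
        closest = min(fast_msgs, key=lambda m: abs(m[0] - ts))
        pairs.append(((ts, payload), closest))
    return pairs

# Example: 10 FPS slow stream matched against a 30 FPS fast stream
slow = [(0.00, "s0"), (0.10, "s1")]
fast = [(0.000, "f0"), (0.033, "f1"), (0.066, "f2"), (0.099, "f3")]
pairs = match_by_timestamp(slow, fast)
```

The duplicating option is symmetric: iterate over the fast stream instead and reuse (duplicate) the nearest slow-stream message for each fast-stream timestamp.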
Timestamps are assigned to the frame at the MIPI Start-of-Frame (SoF) events, more details here.
We’re always happy to help with code or other questions you might have.