ObjectTracker¶
The ObjectTracker node tracks detected objects from ImgDetections messages using a Kalman filter and the Hungarian algorithm.
How to place it¶
pipeline = dai.Pipeline()
objectTracker = pipeline.create(dai.node.ObjectTracker)
dai::Pipeline pipeline;
auto objectTracker = pipeline.create<dai::node::ObjectTracker>();
Inputs and Outputs¶
┌───────────────────┐
inputDetectionFrame │ │passthroughDetectionFrame
───────────────────►│-------------------├─────────────────────────►
│ │ out
│ Object ├─────────────────────────►
inputTrackerFrame │ Tracker │ passthroughTrackerFrame
───────────────────►│-------------------├─────────────────────────►
inputDetections │ │ passthroughDetections
───────────────────►│-------------------├─────────────────────────►
└───────────────────┘
Message types
inputDetectionFrame - ImgFrame
inputTrackerFrame - ImgFrame
inputDetections - ImgDetections
out - Tracklets
passthroughDetectionFrame - ImgFrame
passthroughTrackerFrame - ImgFrame
passthroughDetections - ImgDetections
Zero term tracking¶
Zero-term tracking performs object association only: it does not conduct prediction or tracking based on previous tracking history. Object association means that objects detected by an external detector are matched with tracked objects that were detected and tracked in previous frames.
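To make the idea of object association concrete, here is a minimal host-side sketch: new detections are matched to existing tracks purely by spatial overlap (IoU), with no motion prediction. Note that the firmware uses the Hungarian algorithm; this greedy matcher is a simplified illustrative stand-in, and the boxes, threshold, and function names are assumptions for the example.

```python
# Illustrative sketch of zero-term association: each new detection is
# matched to the best-overlapping existing track. No motion prediction
# is involved - only the current boxes are compared. The firmware uses
# the Hungarian algorithm; this greedy matcher is a simplification.

def iou(a, b):
    # Boxes as (xmin, ymin, xmax, ymax); returns intersection-over-union.
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def associate(tracks, detections, iou_threshold=0.3):
    """Greedily match each detection to the best-overlapping track."""
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True)
    matched_t, matched_d, matches = set(), set(), []
    for score, ti, di in pairs:
        if score < iou_threshold or ti in matched_t or di in matched_d:
            continue
        matched_t.add(ti)
        matched_d.add(di)
        matches.append((ti, di))
    return matches
```

Detections that match no track above the threshold would become new tracklets; tracks with no matching detection would eventually be marked lost.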
Short term tracking¶
Short-term tracking allows tracking objects between frames, thereby reducing the need to run object detection on each frame. This works well with NN models that can't achieve 30 FPS (e.g. YoloV5); the tracker can provide tracklets for frames where no inference was run, so the whole system can run at 30 FPS.
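The core idea can be sketched in a few lines: on frames without detections, the tracklet's bounding box is predicted from its recent motion. This is only an illustration of the concept (simple linear extrapolation); the node's internal model, based on a Kalman filter, is more sophisticated.

```python
# Illustrative sketch of short-term, imageless tracking: when object
# detection was skipped on a frame, predict the next bounding box by
# repeating the last observed frame-to-frame motion. Boxes are
# (xmin, ymin, xmax, ymax). A simplified stand-in for the node's
# Kalman-filter-based prediction.

def extrapolate(prev_box, last_box):
    """Predict the next box by linear extrapolation from two observations."""
    return tuple(2 * last - prev for prev, last in zip(prev_box, last_box))
```

For example, a box that moved 5 px to the right between the last two detected frames is predicted another 5 px further right on the skipped frame.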
Supported object tracker types¶
SHORT_TERM_KCF
: Kernelized Correlation Filter tracking. KCF utilizes the properties of circulant matrices to enhance processing speed. Paper here.
SHORT_TERM_IMAGELESS
: Allows tracking objects on frames where object detection was skipped, by extrapolating the object's trajectory from previous detections.
ZERO_TERM_COLOR_HISTOGRAM
: Utilizes position, shape, and input image information, such as an RGB histogram, to perform object tracking.
ZERO_TERM_IMAGELESS
: Utilizes only the rectangular shape and position of the detected object for tracking; color information is not used. It achieves higher throughput than ZERO_TERM_COLOR_HISTOGRAM. Consider the trade-off between throughput and accuracy when choosing the tracker type.
A similar comparison of object trackers with more information can be found here.
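To illustrate why ZERO_TERM_COLOR_HISTOGRAM can be more accurate than ZERO_TERM_IMAGELESS: comparing per-channel color histograms lets the tracker distinguish two objects whose bounding boxes overlap. The binning and similarity measure below are illustrative choices, not the firmware's actual implementation.

```python
# Sketch of color-histogram matching as used conceptually by
# ZERO_TERM_COLOR_HISTOGRAM: build a coarse per-channel RGB histogram
# for each object crop and compare them. Bin count and the histogram
# intersection measure are illustrative choices.

def color_histogram(pixels, bins=4):
    """Per-channel histogram of (r, g, b) pixels, normalized to sum to 1."""
    hist = [0.0] * (3 * bins)
    for r, g, b in pixels:
        for ch, v in enumerate((r, g, b)):
            hist[ch * bins + min(v * bins // 256, bins - 1)] += 1
    total = len(pixels) * 3
    return [h / total for h in hist]

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

Two crops of the same object yield a similarity near 1.0, while differently colored objects score lower, so an association that position alone would get wrong can be resolved by appearance.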
Maximum number of tracked objects¶
The ObjectTracker node can track up to 60 objects at once; currently, the firmware crashes if there are more than 60 objects to track.
Usage¶
pipeline = dai.Pipeline()
objectTracker = pipeline.create(dai.node.ObjectTracker)
objectTracker.setDetectionLabelsToTrack([15]) # Track only person
# Possible tracking types: ZERO_TERM_COLOR_HISTOGRAM, ZERO_TERM_IMAGELESS, SHORT_TERM_IMAGELESS, SHORT_TERM_KCF
objectTracker.setTrackerType(dai.TrackerType.ZERO_TERM_COLOR_HISTOGRAM)
# Take the smallest ID when new object is tracked, possible options: SMALLEST_ID, UNIQUE_ID
objectTracker.setTrackerIdAssignmentPolicy(dai.TrackerIdAssignmentPolicy.SMALLEST_ID)
# You have to use Object tracker in combination with detection network
# and an image frame source - mono/color camera or xlinkIn node
dai::Pipeline pipeline;
auto objectTracker = pipeline.create<dai::node::ObjectTracker>();
objectTracker->setDetectionLabelsToTrack({15}); // Track only person
// Possible tracking types: ZERO_TERM_COLOR_HISTOGRAM, ZERO_TERM_IMAGELESS, SHORT_TERM_IMAGELESS, SHORT_TERM_KCF
objectTracker->setTrackerType(dai::TrackerType::ZERO_TERM_COLOR_HISTOGRAM);
// Take the smallest ID when new object is tracked, possible options: SMALLEST_ID, UNIQUE_ID
objectTracker->setTrackerIdAssignmentPolicy(dai::TrackerIdAssignmentPolicy::SMALLEST_ID);
// You have to use Object tracker in combination with detection network
// and an image frame source - mono/color camera or xlinkIn node
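The two TrackerIdAssignmentPolicy options mentioned above behave as follows: with UNIQUE_ID every new tracklet gets a fresh, ever-increasing ID, while with SMALLEST_ID a new tracklet reuses the smallest ID not currently held by an active tracklet. The sketch below is a host-side illustration of that policy difference only; the class and method names are made up for the example.

```python
# Illustrative sketch of TrackerIdAssignmentPolicy semantics.
# UNIQUE_ID: IDs grow monotonically and are never reused.
# SMALLEST_ID: the smallest ID not held by an active tracklet is reused.

from itertools import count

class IdAssigner:
    def __init__(self, policy="UNIQUE_ID"):
        self.policy = policy
        self.active = set()      # IDs of currently tracked objects
        self._next = count()     # monotonically increasing counter

    def new_id(self):
        if self.policy == "SMALLEST_ID":
            new = next(i for i in count() if i not in self.active)
        else:  # UNIQUE_ID
            new = next(self._next)
        self.active.add(new)
        return new

    def release(self, tracklet_id):
        # Called when a tracklet is lost/removed.
        self.active.discard(tracklet_id)
```

With SMALLEST_ID, losing tracklet 1 and gaining a new object reassigns ID 1; with UNIQUE_ID the new object would get ID 3 instead.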
Examples of functionality¶
Reference¶
-
class
depthai.node.
ObjectTracker
-
class
Id
Node identifier. Unique for every node on a single Pipeline
-
getAssetManager
(*args, **kwargs) Overloaded function.
getAssetManager(self: depthai.Node) -> depthai.AssetManager
-
getInputRefs
(*args, **kwargs) Overloaded function.
getInputRefs(self: depthai.Node) -> list[depthai.Node.Input]
-
getInputs
(self: depthai.Node) → list[depthai.Node.Input]
-
getName
(self: depthai.Node) → str
-
getOutputRefs
(*args, **kwargs) Overloaded function.
getOutputRefs(self: depthai.Node) -> list[depthai.Node.Output]
-
getOutputs
(self: depthai.Node) → list[depthai.Node.Output]
-
getParentPipeline
(*args, **kwargs) Overloaded function.
getParentPipeline(self: depthai.Node) -> depthai.Pipeline
-
setDetectionLabelsToTrack
(self: depthai.node.ObjectTracker, labels: list[int]) → None
-
setMaxObjectsToTrack
(self: depthai.node.ObjectTracker, maxObjectsToTrack: int) → None
-
setTrackerIdAssignmentPolicy
(self: depthai.node.ObjectTracker, type: depthai.TrackerIdAssignmentPolicy) → None
-
setTrackerThreshold
(self: depthai.node.ObjectTracker, threshold: float) → None
-
setTrackerType
(self: depthai.node.ObjectTracker, type: depthai.TrackerType) → None
-
setTrackingPerClass
(self: depthai.node.ObjectTracker, trackingPerClass: bool) → None
-
class
dai::node
::
ObjectTracker
: public dai::NodeCRTP<Node, ObjectTracker, ObjectTrackerProperties>¶ ObjectTracker node. Performs object tracking using a Kalman filter and the Hungarian algorithm.
Public Functions
-
void
setTrackerThreshold
(float threshold)¶ Specify tracker threshold.
- Parameters
threshold
: Detections with confidence above this threshold will be tracked. Default 0: all image detections are tracked.
-
void
setMaxObjectsToTrack
(std::int32_t maxObjectsToTrack)¶ Specify the maximum number of objects to track.
- Parameters
maxObjectsToTrack
: Maximum number of objects to track. Maximum 60 in the case of SHORT_TERM_KCF, otherwise 1000.
-
void
setDetectionLabelsToTrack
(std::vector<std::uint32_t> labels)¶ Specify detection labels to track.
- Parameters
labels
: Detection labels to track. By default, every label from the image detection network output is tracked.
-
void
setTrackerType
(TrackerType type)¶ Specify tracker type algorithm.
- Parameters
type
: Tracker type.
-
void
setTrackerIdAssignmentPolicy
(TrackerIdAssignmentPolicy type)¶ Specify tracker ID assignment policy.
- Parameters
type
: Tracker ID assignment policy.
-
void
setTrackingPerClass
(bool trackingPerClass)¶ Whether the tracker should take the class label into consideration when tracking.
Public Members
-
Input
inputTrackerFrame
= {*this, "inputTrackerFrame", Input::Type::SReceiver, false, 4, true, {{DatatypeEnum::ImgFrame, false}}}¶ Input ImgFrame message on which tracking will be performed. RGBp, BGRp, NV12, YUV420p types are supported. Default queue is non-blocking with size 4.
-
Input
inputDetectionFrame
= {*this, "inputDetectionFrame", Input::Type::SReceiver, false, 4, true, {{DatatypeEnum::ImgFrame, false}}}¶ Input ImgFrame message on which object detection was performed. Default queue is non-blocking with size 4.
-
Input
inputDetections
= {*this, "inputDetections", Input::Type::SReceiver, false, 4, true, {{DatatypeEnum::ImgDetections, true}}}¶ Input message with image detection from neural network. Default queue is non-blocking with size 4.
-
Output
out
= {*this, "out", Output::Type::MSender, {{DatatypeEnum::Tracklets, false}}}¶ Outputs Tracklets message that carries object tracking results.
-
Output
passthroughTrackerFrame
= {*this, "passthroughTrackerFrame", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}¶ Passthrough ImgFrame message on which tracking was performed. Suitable for when input queue is set to non-blocking behavior.
-
Output
passthroughDetectionFrame
= {*this, "passthroughDetectionFrame", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}¶ Passthrough ImgFrame message on which object detection was performed. Suitable for when input queue is set to non-blocking behavior.
-
Output
passthroughDetections
= {*this, "passthroughDetections", Output::Type::MSender, {{DatatypeEnum::ImgDetections, true}}}¶ Passthrough image detections message from neural network output. Suitable for when input queue is set to non-blocking behavior.
Public Static Attributes
-
static constexpr const char *
NAME
= "ObjectTracker"¶