Main Content

vision.HistogramBasedTracker

Histogram-based object tracking

Description

The histogram-based tracker incorporates the continuously adaptive mean shift (CAMShift) algorithm for object tracking. It uses the histogram of pixel values to identify the tracked object.

To track an object:

  1. Create the vision.HistogramBasedTracker object and set its properties.

  2. Call the object with arguments, as if it were a function.

To learn more about how System objects work, see What Are System Objects?

Creation

Description


hbtracker = vision.HistogramBasedTracker returns a tracker that tracks an object by using the CAMShift algorithm. It uses the histogram of pixel values to identify the tracked object. To initialize the tracking process, you must use the initializeObject function to specify an exemplar image of the object.

hbtracker = vision.HistogramBasedTracker(Name,Value) sets properties using one or more name-value pairs. Enclose each property name in quotes. For example, hbtracker = vision.HistogramBasedTracker('ObjectHistogram',[])
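For example, this sketch constructs a tracker and sets the ObjectHistogram property at creation. The histogram values here are illustrative placeholders, not a real object model:

```matlab
% Create a tracker with a custom, normalized 16-bin histogram.
% A uniform histogram is used here only as a placeholder; in practice
% you would compute it from an exemplar image of the object.
hueHistogram = ones(1,16) / 16;   % normalized: values in [0 1], sums to 1
hbtracker = vision.HistogramBasedTracker('ObjectHistogram',hueHistogram);
```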

Properties


Unless otherwise indicated, properties are nontunable, which means you cannot change their values after calling the object. Objects lock when you call them, and the release function unlocks them.

If a property is tunable, you can change its value at any time.

For more information on changing property values, see System Design in MATLAB Using System Objects.

ObjectHistogram - Normalized pixel value histogram, specified as an N-element vector. This vector specifies the normalized histogram of the object's pixel values. Histogram values must be normalized to a value between 0 and 1. You can use the initializeObject function to set this property.

Tunable: Yes

Usage

Description

bbox = hbtracker(I) returns the bounding box of the tracked object. Before using the tracker, you must identify the object to track and set the initial search window. Use the initializeObject function to do this.

[bbox,orientation] = hbtracker(I) additionally returns the angle between the x-axis and the major axis of the ellipse that has the same second-order moments as the object. The returned angle is between –pi/2 and pi/2.

[bbox,orientation,score] = hbtracker(I) additionally returns the confidence score for the returned bounding box that contains the tracked object.
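The three syntaxes above can be sketched as a single per-frame call. This assumes hbtracker has already been initialized with initializeObject and that hsvFrame (a hypothetical variable) holds the current frame in HSV color space:

```matlab
% Per-frame tracking sketch. hbtracker is an initialized
% vision.HistogramBasedTracker; hsvFrame is the current HSV frame.
hueChannel = hsvFrame(:,:,1);            % 2-D feature map input
% bbox is [x y width height]; orientation is in radians, between
% -pi/2 and pi/2; score is a confidence value in [0 1].
[bbox,orientation,score] = hbtracker(hueChannel);
```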

Input Arguments


I - Video frame, specified as a grayscale image or any 2-D feature map that distinguishes the object from the background. For example, I can be the hue channel of the HSV color space.

Output Arguments


bbox - Bounding box, returned as a four-element vector in the format [x y width height].

orientation - Orientation, returned as an angle between -pi/2 and pi/2. The angle is measured between the x-axis and the major axis of the ellipse that has the same second-order moments as the object.

score - Confidence score, returned as a scalar in the range [0, 1], where 1 corresponds to maximum confidence.

Object Functions

To use an object function, specify the System object™ as the first input argument. For example, to release system resources of a System object named obj, use this syntax:

release(obj)


initializeObject - Set object to track
initializeSearchWindow - Set initial search window
step - Run System object algorithm
release - Release resources and allow changes to System object property values and input characteristics
reset - Reset internal states of System object
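The initialization functions above can be sketched as follows. The region values and the hueChannel variable are illustrative assumptions, not prescribed inputs:

```matlab
% Sketch: set the object model from a region of the first frame, then
% optionally reset the initial search window. hbtracker is a
% vision.HistogramBasedTracker; hueChannel is a 2-D feature map.
region = [40 45 25 25];                      % [x y width height]
initializeObject(hbtracker,hueChannel,region);
initializeSearchWindow(hbtracker,region);    % optional: restrict the search
```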

Examples


Track and display a face in each frame of an input video.

Create System objects for reading and displaying video.

videoReader = VideoReader("vipcolorsegmentation.avi");
videoPlayer = vision.VideoPlayer();

Read the first video frame, which contains the object. Convert the image to HSV color space. Then define and display the object region.

objectFrame = im2single(readFrame(videoReader));
objectHSV = rgb2hsv(objectFrame);
objectRegion = [40, 45, 25, 25];
objectImage = insertShape(objectFrame,"rectangle",objectRegion,Color=[1 0 0]);

figure
imshow(objectImage)
title("Red box shows object region")

Optionally, you can select the object region using your mouse. The object must occupy the majority of the region. Use the following command:

figure; imshow(objectFrame); objectRegion = round(getPosition(imrect))

Set the object, based on the hue channel of the first video frame.

tracker = vision.HistogramBasedTracker;
initializeObject(tracker, objectHSV(:,:,1) , objectRegion);

Track and display the object in each video frame. The while loop reads each image frame, converts the image to HSV color space, then tracks the object in the hue channel where it is distinct from the background. Finally, the example draws a box around the object and displays the results.

while hasFrame(videoReader)
  frame = im2single(readFrame(videoReader));
  hsv = rgb2hsv(frame);
  bbox = tracker(hsv(:,:,1));
  out = insertShape(frame,"rectangle",bbox,Color=[1 0 0]);
  videoPlayer(out);
end

Release the video player.

release(videoPlayer);

References

[1] Bradski, G.R. "Computer Vision Face Tracking For Use in a Perceptual User Interface." Intel Technology Journal. January 1998.

Version History

Introduced in R2012a
