3D ball tracker

This page describes how to build, set up and run PF3DTracker, the particle-filter-based 3D ball tracker. The page is still being written, so it is very incomplete and the information on it may be inaccurate.


Get the source code

The source code of the tracker is not in the iCub repository at the moment; I need to make sure it can be built on Windows before I commit it. Send me an e-mail at mtaiana at isr*ist*utl*pt and I'll give you the code.


Build the tracker

Build the tracker on Linux

I'll call the root directory of the tracker code $PF3DTracker. To build the tracker, enter $PF3DTracker and run the following commands:

  cmake .
  make

I also run a command to create links to the executable files in the $PF3DTracker directory:

  ln -s binary/* .

Build the tracker on Windows

Sorry, I don't know how to do that yet.


Set the tracker up

You need to create a colour model for the specific ball you want to track. This is done by grabbing images with the camera you want to use, cutting out the parts of the images where the ball is seen and pasting them all together in one file. The background of this image should be white, as white pixels are discarded when building the model histogram. The robustness of the tracker will depend on this model: you should include images in which the ball is seen under different lighting conditions. The more images you cut out, the better.

The resulting colour template image is simply a collage of these ball cut-outs on a white background.
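If it helps to see the idea in code, here is a minimal sketch of how such a template could be turned into a colour histogram, written in Python with OpenCV rather than taken from the tracker's own sources; the colour space and bin counts are illustrative assumptions, not necessarily what PF3DTracker uses internally.

  import cv2
  import numpy as np
  
  # Load the colour template (file name as in the example configuration below).
  template = cv2.imread("models/model.bmp")
  
  # Keep only non-white pixels: the white background is discarded.
  non_white = np.any(template < 250, axis=2).astype(np.uint8)
  
  # Build a hue/saturation histogram from the remaining pixels and normalize it.
  hsv = cv2.cvtColor(template, cv2.COLOR_BGR2HSV)
  hist = cv2.calcHist([hsv], [0, 1], non_white, [30, 32], [0, 180, 0, 256])
  hist = cv2.normalize(hist, None, alpha=1.0, norm_type=cv2.NORM_L1)
  
  print(hist.shape)    # 30 hue bins x 32 saturation bins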

You need to create a shape model for the specific ball you want to track. This is done using the Matlab script $PF3DTracker/matlab_files/write_initial_ball_points.m. You should set three parameters inside the script: R, R1 and R2. R is the radius of the ball you want to track, in millimetres. R1 and R2 are the radii used to project the inner and outer contour (see [1] for more details). If you want a precise estimate of the 3D position of the ball, you should set R1 and R2 close to the value of R (e.g. a 10% difference). If you want the tracker to withstand high accelerations of the ball while keeping the number of particles low, you should increase the difference to up to 30% (this is the value I typically use). The script will create a file with a name like initial_ball_points_31mm_30percent.csv.
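For illustration, here is a rough Python counterpart of what such a script might produce: points sampled on two concentric circles of radii R1 and R2 around the ball centre, written out as a CSV file. The number of points and the exact CSV layout expected by the tracker are assumptions; the Matlab script remains the reference.

  import numpy as np
  
  # Hypothetical stand-in for write_initial_ball_points.m: sample points on two
  # concentric circles of radii R1 and R2 around the ball centre and write them
  # to a CSV file (one 3D point per row, coordinates in millimetres).
  R = 46.0                 # ball radius [mm]
  DIFF = 0.30              # 30% difference between R and R1/R2
  R1, R2 = R * (1.0 - DIFF), R * (1.0 + DIFF)
  N_POINTS = 50            # points per contour (an assumption)
  
  angles = np.linspace(0.0, 2.0 * np.pi, N_POINTS, endpoint=False)
  zeros = np.zeros(N_POINTS)
  inner = np.stack([R1 * np.cos(angles), R1 * np.sin(angles), zeros], axis=1)
  outer = np.stack([R2 * np.cos(angles), R2 * np.sin(angles), zeros], axis=1)
  
  np.savetxt("initial_ball_points_46mm_30percent.csv",
             np.vstack([inner, outer]), delimiter=",", fmt="%.3f")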

You need to create a dynamic model for the ball. Basically, you have to fill in the dynamics matrix. I use a constant-velocity model with random acceleration. The data for this is stored in models/motion_model_matrix.csv. I'm not sure that the tracker will work properly with other configurations of the motion model.
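As a point of reference, a constant-velocity model with random acceleration typically uses a state transition matrix of the form sketched below; the state layout, matrix size and time step here are assumptions on my part, so treat the file shipped with the tracker as the ground truth.

  import numpy as np
  
  # Constant-velocity transition matrix for a state [x, y, z, vx, vy, vz].
  # The "random acceleration" part enters as process noise on the velocities,
  # not through this matrix.
  dt = 1.0 / 30.0                     # time between frames [s], assuming 30 fps
  A = np.eye(6)
  A[0, 3] = A[1, 4] = A[2, 5] = dt    # position += velocity * dt
  
  np.savetxt("motion_model_matrix.csv", A, delimiter=",", fmt="%.6f")
  print(A)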

You need to calibrate the camera you use (e.g. with camCalibConf).
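The calibration gives you the focal lengths and principal point that end up as perspectiveFx, perspectiveFy, perspectiveCx and perspectiveCy in the example configuration below. As a sanity check, here is a small sketch of the standard pinhole projection those parameters describe; the tracker's internal conventions (units, image origin) may differ.

  # Standard pinhole (perspective) projection using the intrinsics from the
  # example configuration below.
  def project(X, Y, Z, fx=217.934, fy=218.24, cx=185.282, cy=121.498):
      """Project a 3D point in the camera frame (Z forward) to pixels (u, v)."""
      u = fx * X / Z + cx
      v = fy * Y / Z + cy
      return u, v
  
  # A ball at (0, 0, 500) projects to the principal point, which matches the
  # initialX/initialY/initialZ values used in the example configuration.
  print(project(0.0, 0.0, 500.0))    # -> (185.282, 121.498)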

You need to customize the file that configures the tracker at start-up: $PF3DTracker/initialization.ini. Here is an example:

 ####################################
 #configuration file for PF3DTracker#
 ####################################
 
 
 #############
 #module name#
 #############
 name                        /icub/PF3DTracker
 
 
 #########################
 #port names and function#
 #########################
 inputVideoPort              /icub/PF3DTracker/videoIn 
 #inputVideoPort             receives images from the grabber or the rectifying program.
 outputVideoPort             /icub/PF3DTracker/videoOut
 #outputVideoPort            produces images in which the contour of the estimated ball is highlighted
 outputDataPort              /icub/PF3DTracker/dataOut
 #outputDataPort             produces a stream of data in the format: X, Y, Z, likelihood, U, V, seeing_object
 outputParticlePort          /icub/PF3DTracker/particleOut
 #outputParticlePort         produces data for the plotter. it is usually not active for performance reasons.
 outputAttentionPort         /icub/PF3DTracker/attentionOut
 #outputAttentionPort        produces data for the attention system, in terms of a peak of saliency.
 
 
 #################################
 #projection model and parameters#
 #################################
 #projectionModel [perspective|equidistance|unified]
 projectionModel             perspective
 
 #iCubLisboaLeftEye_2009_03_04
 perspectiveFx               217.934
 perspectiveFy               218.24
 perspectiveCx               185.282
 perspectiveCy               121.498
 
 
 #######################
 #tracked object models#
 #######################
 #trackedObjectType [sphere|parallelogram]
 trackedObjectType           sphere
 trackedObjectColorTemplate  models/model.bmp
 trackedObjectShapeTemplate  models/initial_ball_points_46mm_30percent.csv
 motionModelMatrix           models/motion_model_matrix.csv
 
 
 #######################
 #initialization method#
 #######################
 #initialization method [search|3dEstimate|2dEstimate]
 initializationMethod        3dEstimate
 initialX                       0
 initialY                       0
 initialZ                     500
 
 
 ####################
 #visualization mode#
 ####################
 #only applies to the sphere.
 #circleVisualizationMode    [0=inner and outer circle | 1=one circle with the correct radius] default 0.
 circleVisualizationMode     1
 
 
 #########################
 #attention-related stuff#
 #########################
 #the tracker produces a value of likelihood at each time step.
 #that value can be used to infer if the object it is tracking is the correct one.
 #this procedure is not very robust.
 #20 million is a good threshold level when you have the right colour model; the value below is 5 million.
 likelihoodThreshold         5000000
 attentionOutputMax 300
 attentionOutputDecrease 0.99
 
 
 ##########################
 #image saving preferences#
 ##########################
 #save images with OpenCV?
 saveImagesWithOpencv        false
 #always use the trailing slash here.
 saveImagesWithOpencvDir     ./yarp_result_images/

The number of particles used by the tracker should also become a parameter contained in this file and loaded at start-up time (see the ToDo list below).

Run the tracker

To run the tracker you need to:

  #run an image rectifier, in case you need it (cameras with a non-negligible distortion)
  camCalib --file iCubLisboaLeftEye320x240_2009_03_04.ini --name /icub/camcalib/left
  
  #run the tracker itself
  ./PF3DTrackerMain --file initialization.ini
  
  #start a viewer
  yarpview --name /viewer
  
  #connect all the ports
  yarp connect /icub/cam/left /icub/camcalib/left/in
  yarp connect /icub/camcalib/left/out /icub/PF3DTracker/videoIn
  yarp connect /icub/PF3DTracker/videoOut /viewer
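Once everything is connected, the tracker streams its estimate on /icub/PF3DTracker/dataOut in the format listed in the configuration above (X, Y, Z, likelihood, U, V, seeing_object). As an example of consuming that stream, here is a small reader sketch; it assumes the YARP Python bindings are installed, and the port name /ballReader and the units of X, Y, Z (millimetres) are my assumptions.

  import yarp
  
  yarp.Network.init()
  
  # Open a local port and connect it to the tracker's data output.
  port = yarp.BufferedPortBottle()
  port.open("/ballReader")
  yarp.Network.connect("/icub/PF3DTracker/dataOut", "/ballReader")
  
  while True:
      bottle = port.read()            # blocking read; returns a Bottle
      if bottle is None:
          continue
      # Field order: X, Y, Z, likelihood, U, V, seeing_object
      x = bottle.get(0).asDouble()
      y = bottle.get(1).asDouble()
      z = bottle.get(2).asDouble()
      likelihood = bottle.get(3).asDouble()
      print("ball estimate: (%.1f, %.1f, %.1f), likelihood %.0f" % (x, y, z, likelihood))

From the command line, the same stream can also be inspected with yarp's generic port reader (yarp read).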

Theoretical foundations of the tracker

If you want to know more about the theoretical ideas behind the tracker, please have a look at the papers on this page: [2].


Demo videos

If you want to watch videos and evaluate the performance of the tracker, please have a look at this page: [3].


ToDo: Tracker

Fix $PF3DTracker/conf/Find_IPP.cmake
Turn image size into a parameter loaded at start time (then check if images really have that size)
Turn the number of particles into a parameter loaded at start time
Get rid of IPP dependency, using OpenCV
Build the tracker on Windows
Start writing a wiki-based tutorial


ToDo: Wiki

Write about initialization.ini
Wait for feedback