The whole sequence is recorded and then converted into a series of individual frames before running the program. The advantage of this procedure is that locating interesting features in one frame is not time-constrained by the processing of the others. Although the tracking process is automated, the user can also assist the algorithm when the desired object is not found; this, however, is expected to happen only occasionally. Currently the algorithm is based on colour detection, because colour makes it easier to isolate from the image the objects we are interested in tracking.
Colours are categorised from samples taken from the first frames of the sequence, or from a file of previous similar samples. Once all samples have been taken for all categories, a 3D density estimation model is computed. The distribution of each colour group is calculated according to equations (1). Subsequently, equation (2) is used to calculate the probability that an RGB sample belongs to class k, where k represents the colour we are interested in tracking. This technique reduces the variability caused by different lighting conditions and changes in the scene. At the moment, to distinguish the remaining objects in the scene it is necessary to add all notable regions to the database: for instance, we must create an 'insect' category and take samples from the image, as well as 'ground' and 'shadow' categories. Joints are currently stained with basic colours (red, orange or light blue).
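Equations (1) and (2) are not reproduced in this excerpt; the sketch below assumes the common choice of one 3D Gaussian per colour class (mean and covariance over the RGB samples) and a Bayes-style normalisation over all fitted classes to obtain the probability that a pixel belongs to class k. Function and variable names are illustrative, not the paper's.

```python
import numpy as np

def fit_colour_classes(samples_by_class):
    """Fit a 3D Gaussian to the RGB samples of each colour category.

    `samples_by_class` maps a class name (e.g. 'red', 'insect') to an
    (N, 3) array of RGB samples.  The per-class mean and covariance
    stand in for the distributions of equation (1).
    """
    models = {}
    for name, rgb in samples_by_class.items():
        rgb = np.asarray(rgb, dtype=float)
        models[name] = (rgb.mean(axis=0), np.cov(rgb, rowvar=False))
    return models

def class_probability(pixel, models, k):
    """Probability that an RGB pixel belongs to class k, in the spirit
    of equation (2): class density normalised over all classes."""
    def density(x, mean, cov):
        d = x - mean
        norm = np.sqrt(((2.0 * np.pi) ** 3) * np.linalg.det(cov))
        return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm
    scores = {name: density(np.asarray(pixel, dtype=float), m, c)
              for name, (m, c) in models.items()}
    return scores[k] / sum(scores.values())
```

Training a 'ground', 'shadow' and 'insect' class alongside the joint colours, as the text describes, then corresponds to extra entries in `samples_by_class`.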
Each colour is displayed on the right according to the mean value calculated during colour training. Colour orange used 42 samples, blue 5135 and 'insect' 1282.
For this example, colour red used 242 samples, green 532, blue 484 and 'other' 7346.
For image calibration it is necessary to specify at least 9 real coordinates and then indicate their positions in the displayed image. The more points given, the better the approximation. The algorithm assumes a smooth distortion that can be modelled by a series of quadratic functions whose curvatures vary at different heights. Therefore, at a given 'y', the quadratic function can be determined by equation (3), with each of its coefficients calculated by equation (4).
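Since equations (3) and (4) are not shown here, the following sketch assumes one plausible reading: a quadratic in x whose three coefficients are themselves quadratic in y, giving 9 unknowns, which matches the stated minimum of 9 point correspondences; extra points refine the fit by least squares. All names are illustrative.

```python
import numpy as np

def fit_calibration(image_pts, real_coords):
    """Least-squares fit of a mapping whose x-quadratic coefficients
    vary quadratically with height y:

        x' = sum over i, j in {0, 1, 2} of c_ij * x^i * y^j

    `image_pts` is a list of (x, y) pixel positions and `real_coords`
    the corresponding real coordinate values; with exactly 9 points
    the system is determined, with more it is solved in a
    least-squares sense.
    """
    x = np.asarray([p[0] for p in image_pts], dtype=float)
    y = np.asarray([p[1] for p in image_pts], dtype=float)
    # Design matrix: one column per monomial x^i * y^j.
    A = np.column_stack([x**i * y**j for i in range(3) for j in range(3)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(real_coords, dtype=float),
                                 rcond=None)
    return coeffs

def apply_calibration(coeffs, x, y):
    """Map an image pixel (x, y) to its calibrated coordinate."""
    terms = np.array([x**i * y**j for i in range(3) for j in range(3)])
    return terms @ coeffs
```

The same fit would be performed once per output axis; a pure rotation is itself a (degenerate) polynomial in x and y, which is consistent with the remark below that the calibration also accommodates rotation.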
Pixels on the left were mapped to the right according to equations (3) and (4). Because of compression at the bottom, not all the spaces on the right were filled.
Because the calibration is so flexible, it also takes rotation into account.