Journal "Software Engineering"
A journal on theoretical and applied science and technology
Issue No. 10, 2017
There are many approaches to object localization and tracking, and the most common way to achieve accurate results is to use a group of cameras instead of a single camera. A correct mapping between frames from distinct cameras enables more reliable tracking and more accurate estimation of object motion, and calibration is the only practical way to find such mappings. In this article we propose a new technique of extrinsic self-calibration, based principally on [1, 12], together with upper bounds on the absolute error of the extrinsic parameters and of the reconstructed fundamental matrix. We assume that the scene is located outdoors and is physically inaccessible, so special calibration targets cannot be used. We also assume that the cameras' intrinsic parameters and global positioning data are known up to a given error. Error estimation improves the accuracy of object localization and determines the range of applications of the whole tracking algorithm. Since methods of fundamental matrix reconstruction are highly specific and depend on many additional constraints, we derive error estimates for three such methods: two iterative and one robust. For each method, an upper bound on the absolute error is calculated under the assumption that feature point projections are determined with a fixed error Δx. The absolute error of the extrinsic camera parameters is then estimated as a function of the fundamental matrix and the feature point projections. Finally, we examine several methods for removing feature-matching outliers on synthetic data obtained by rendering an outdoor 3D scene from multiple viewpoints. These methods, namely Lowe's ratio test with several automated threshold-selection techniques and one novel semi-automated technique, perform poorly under certain conditions, which shows the need to improve the proposed algorithm and to adapt other outlier-removal methods.
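The abstract refers to fundamental matrix reconstruction from matched feature points and to Lowe's ratio test for outlier filtering, without spelling out the three methods analyzed in the paper. As a point of reference only, and not the paper's own algorithm, the sketch below shows the classical normalized eight-point estimator of the fundamental matrix F (satisfying x₂ᵀ F x₁ = 0 for matched projections x₁, x₂) and a minimal ratio-test filter; all function names are illustrative.

```python
import numpy as np

def normalize_points(pts):
    # Hartley normalization: move centroid to origin, mean distance sqrt(2).
    c = pts.mean(axis=0)
    scale = np.sqrt(2) / np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    T = np.array([[scale, 0, -scale * c[0]],
                  [0, scale, -scale * c[1]],
                  [0, 0, 1.0]])
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ ph.T).T, T

def eight_point(pts1, pts2):
    # Estimate F from >= 8 matches so that x2^T F x1 = 0 for each pair.
    x1, T1 = normalize_points(pts1)
    x2, T2 = normalize_points(pts2)
    # Each row of A encodes one epipolar constraint on the 9 entries of F.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)                 # enforce rank 2
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    F = T2.T @ F @ T1                           # undo the normalization
    return F / np.linalg.norm(F)

def lowe_ratio_filter(dist_pairs, ratio=0.8):
    # Keep match i only if its best distance is clearly below the second best.
    return [i for i, (d1, d2) in enumerate(dist_pairs) if d1 < ratio * d2]
```

With noise-free correspondences the estimated F satisfies the epipolar constraint to numerical precision; under the paper's assumption of a fixed projection error Δx, the residuals |x₂ᵀ F x₁| grow with Δx, which is what the derived upper bounds quantify.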