A mobile platform with a catadioptric sensor
The aim of this paper is to present a mobile catadioptric omnidirectional vision system, composed of a camera and a cone-shaped mirror, integrated with a direct georeferencing system. The relationship between image and object space is established with either generic/empirical or physical models. These models were implemented and tested with real data, and some results are presented. The results show that planimetric accuracies of around 5 cm can be achieved, which is suitable for several applications, including the generation of control scenes, which was the original motivation of this work.
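To illustrate the kind of image-to-object-space relationship such a physical model encodes, the sketch below maps an image point to a viewing direction for an idealized cone-shaped mirror viewed along its axis. All names and parameters (`u0`, `v0`, `f`, `cone_half_angle`) are illustrative assumptions, not the paper's calibrated model, which would additionally account for lens distortion and mounting misalignment.

```python
import math

def image_to_direction(u, v, u0, v0, f, cone_half_angle):
    """Map an image point (u, v) to an (azimuth, elevation) viewing
    direction for an ideal cone mirror aligned with the camera axis.

    Simplified, hypothetical geometry (apex toward the camera):
      u0, v0           -- principal point (pixels)
      f                -- focal length (pixels)
      cone_half_angle  -- half-angle between cone axis and surface (radians)
    """
    dx, dy = u - u0, v - v0
    # Azimuth is read directly from the radial direction in the image.
    azimuth = math.atan2(dy, dx)
    # Angle of the camera ray from the mirror axis.
    r = math.hypot(dx, dy)
    phi = math.atan2(r, f)
    # Specular reflection at the conical surface: the outgoing ray makes
    # angle (2*alpha - phi) with the axis; elevation is measured from the
    # horizontal plane, so subtract pi/2.
    elevation = 2.0 * cone_half_angle - phi - math.pi / 2.0
    return azimuth, elevation
```

Under this simplified geometry, a 45-degree cone maps the image center to the horizon and increasing image radius to decreasing elevation, which is the behavior that makes conical mirrors attractive for panoramic mapping.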