UAV-LiCAM SYSTEM DEVELOPMENT: CALIBRATION AND GEO-REFERENCING
Over the last decade, applications of unmanned aerial vehicles (UAVs) as remote-sensing platforms have been investigated extensively for fine-scale mapping, modeling, and monitoring of the environment. In recent years, the integration of 3D laser scanners and cameras onboard UAVs has also received considerable attention, as these two sensors provide complementary spatial and spectral information about the environment. Since a lidar performs range and bearing measurements in its body frame, precise GNSS/INS data are required to directly geo-reference the lidar measurements in an object-fixed coordinate system. However, such data come at the price of tactical-grade inertial navigation sensors paired with dual-frequency RTK-GNSS receivers, which also necessitates access to a base station and suitable post-processing software. Consequently, UAV systems equipped with a lidar and a camera (UAV-LiCam systems) are too expensive to be accessible to a wide range of users, and new solutions must be developed to eliminate the need for costly navigation sensors. In this paper, a two-fold solution based on an in-house developed, low-cost system is proposed: 1) a multi-sensor self-calibration approach for calibrating the LiCam system based on planar and cylindrical multi-directional features; and 2) an integrated sensor orientation method for geo-referencing based on unscented particle filtering, which compensates for time-variant IMU errors and eliminates the need for GNSS measurements.
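Direct geo-referencing of a lidar point can be sketched as two rigid-body transformations: one from the lidar frame to the vehicle body frame (the boresight/lever-arm calibration parameters), and one from the body frame to the object-fixed mapping frame (the GNSS/INS attitude and position). The following minimal Python sketch illustrates this chain; the function and parameter names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def direct_georeference(p_lidar, R_body_lidar, t_body_lidar, R_map_body, t_map_body):
    """Transform a lidar point into an object-fixed mapping frame.

    p_lidar       : 3-vector, point in the lidar frame (from range/bearing)
    R_body_lidar  : 3x3 boresight rotation, lidar -> body (from calibration)
    t_body_lidar  : 3-vector lever arm, lidar origin in the body frame
    R_map_body    : 3x3 attitude rotation, body -> mapping (from INS)
    t_map_body    : 3-vector platform position in the mapping frame (from GNSS)
    """
    # Chain the two rigid-body transformations: lidar -> body -> mapping.
    p_body = R_body_lidar @ np.asarray(p_lidar) + t_body_lidar
    return R_map_body @ p_body + t_map_body

# Illustrative use with identity rotations: only the offsets accumulate.
p_map = direct_georeference(
    [1.0, 0.0, 0.0],
    np.eye(3), np.array([0.0, 0.0, 0.2]),
    np.eye(3), np.array([100.0, 200.0, 50.0]),
)
```

Errors in any of these inputs propagate directly into the mapped point, which is why the paper targets both the calibration parameters (solution 1) and the navigation states (solution 2).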