BUILDING EXTRACTION USING MULTI-SENSOR SYSTEMS
In this study, automatic building extraction is performed using an object-based image analysis method with a multi-sensor system comprising LiDAR, a digital camera, and GPS/IMU. Image processing, segmentation, and classification techniques were combined with a defined rule set for automatic object extraction. The proposed method is based on object-based classification in order to overcome limitations of traditional pixel-based classification, such as confusion between classes. A Digital Surface Model (DSM) generated from the LiDAR point cloud was used to separate the building and vegetation classes, and morphological filters were applied to resolve mixed classes. In the proposed building extraction approach, multi-resolution, contrast-difference, and chessboard segmentations were applied, and the segmented objects were classified with defined fuzzy rules. First, the vegetation and ground classes were generated; the building regions were then derived from the combined results of classification and segmentation.

The data set was obtained from the "NABUCCO Gas Pipeline Project" and was originally collected for corridor mapping of the pipeline, which will link the eastern border of Turkey to Baumgarten in Austria via Bulgaria, Romania, and Hungary. The study area is a suburban neighborhood in the city of Sivas, Turkey. A Leica ALS60 LiDAR system, a DiMAC camera with a Dalsa Bayer RGB charge-coupled device (CCD) sensor, and a GPS and CUS6 IMU system were used for data collection. Additional data sets were generated from the LiDAR point cloud and the RGB images from the digital camera. The rule sets for automatic building extraction were developed in the Definiens eCognition Developer 8.64 software. To evaluate the performance of the proposed automatic building extraction approach, a reference data set was generated by digitizing the buildings over the orthoimage.
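The core of the height-based step can be sketched as follows: subtracting a terrain model from the DSM gives a normalized DSM (nDSM), elevated objects are selected by a height threshold, and a morphological opening removes small elevated blobs (e.g. single trees or vehicles) that would otherwise mix with the building class. This is a minimal illustrative sketch, not the study's eCognition rule set; the grid values and the 2.5 m threshold are assumptions chosen for the example.

```python
import numpy as np

# Toy 8x8 DSM/DTM grids in metres (assumed values for illustration only).
dsm = np.full((8, 8), 100.0)   # bare ground at 100 m
dsm[2:5, 2:6] = 106.0          # a 6 m high "building" block
dsm[6, 6] = 104.0              # a small tall object, e.g. a single tree
dtm = np.full((8, 8), 100.0)   # terrain model (flat here)

ndsm = dsm - dtm               # normalized DSM: height above ground
elevated = ndsm > 2.5          # assumed height threshold for man-made objects

# 3x3 binary erosion and dilation written with plain NumPy shifts
# (np.roll wraps at borders, which is harmless on this interior example).
def erode3x3(mask):
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def dilate3x3(mask):
    out = mask.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

# Morphological opening = erosion followed by dilation: the large building
# block survives, the isolated tree pixel is removed.
building_mask = dilate3x3(erode3x3(elevated))
```

In the actual workflow this kind of mask would be intersected with the segmentation objects and refined by the fuzzy rules, rather than used directly per pixel.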
The accuracy assessment was performed with completeness and correctness analyses, which yielded success rates of 83.08% for completeness and 85.51% for correctness.
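Completeness and correctness are the standard extraction-quality measures: completeness = TP/(TP+FN) (the share of the reference that was recovered, i.e. recall) and correctness = TP/(TP+FP) (the share of the extraction that is valid, i.e. precision). A minimal per-pixel sketch, with purely illustrative masks (the study's figures are not reproduced here):

```python
import numpy as np

def completeness_correctness(extracted, reference):
    """Per-pixel completeness/correctness from two boolean masks."""
    tp = np.logical_and(extracted, reference).sum()   # true positives
    fn = np.logical_and(~extracted, reference).sum()  # missed reference pixels
    fp = np.logical_and(extracted, ~reference).sum()  # false extractions
    completeness = tp / (tp + fn)   # recall
    correctness = tp / (tp + fp)    # precision
    return completeness, correctness

# Illustrative 10x10 masks: two 6x6 squares offset by one pixel,
# overlapping in a 5x5 region (25 of 36 pixels each).
reference = np.zeros((10, 10), dtype=bool)
reference[2:8, 2:8] = True
extracted = np.zeros((10, 10), dtype=bool)
extracted[3:9, 3:9] = True

comp, corr = completeness_correctness(extracted, reference)
```

The same formulas apply per object instead of per pixel when extracted buildings are matched one-to-one against digitized reference buildings.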