INDOOR POSITIONING BASED-ON IMAGES AIDED BY ARTIFICIAL NEURAL NETWORKS

Hung, M. C.; Liao, J. K.; Chiang, K. W.

Indoor positioning has attracted much attention in recent years with the rise of the Internet of Things (IoT), enabling numerous applications such as personal tracking, vehicle location, and Location-Based Services (LBS). Positioning and navigation are prerequisites for putting LBS into practice, and smartphones are increasingly used for indoor positioning. The most common inertial-navigation algorithm is Pedestrian Dead Reckoning (PDR), which uses the sensors built into smartphones to cope with GNSS-denied environments. However, PDR errors accumulate over time, so combining PDR with other techniques, for instance regular updates from geospatial information, is needed to bound this drift. This research therefore proposes an image-based aiding algorithm in which an Artificial Neural Network (ANN) is embedded in the system. Self-designed georeferenced markers and an indoor floor plan are produced in advance by an Indoor Mobile Mapping System (IMMS). A Cascade-Correlation neural Network (CCN) is used to estimate the distance between a marker and the smartphone camera, achieving an accuracy of 0.27 m. When at least three marker coordinates and the corresponding distances are obtained simultaneously, the user's position can be calculated by trilateration. In the experiment, the positioning accuracy is about 0.5 m. The approach shows high potential for real-time indoor positioning.
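The trilateration step mentioned in the abstract can be illustrated with a short sketch. The following is not the authors' implementation; it is a minimal 2-D example, assuming each georeferenced marker provides its planar coordinates and the CCN supplies a range to it, that linearizes the range equations and solves them by least squares:

```python
import numpy as np

def trilaterate(markers, distances):
    """Estimate a 2-D position from >= 3 marker coordinates and ranges.

    Subtracting the last range equation from the others removes the
    quadratic terms in the unknown position, leaving a linear system
    that is solved in a least-squares sense.
    """
    markers = np.asarray(markers, dtype=float)
    d = np.asarray(distances, dtype=float)
    xn, yn = markers[-1]
    # Each row i encodes: 2(x_i - x_n) x + 2(y_i - y_n) y = b_i
    A = 2.0 * (markers[:-1] - markers[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + markers[:-1, 0] ** 2 - xn ** 2
         + markers[:-1, 1] ** 2 - yn ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical example: user at (2, 3) with exact ranges to three markers
markers = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
true_pos = np.array([2.0, 3.0])
dists = [np.linalg.norm(true_pos - np.array(m)) for m in markers]
print(trilaterate(markers, dists))  # recovers approximately [2. 3.]
```

With more than three markers the same least-squares formulation averages out range noise, which is how a ~0.27 m ranging error can still yield roughly half-meter positioning.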

Cite

Citation:

Hung, M. C. / Liao, J. K. / Chiang, K. W.: INDOOR POSITIONING BASED-ON IMAGES AIDED BY ARTIFICIAL NEURAL NETWORKS. 2019. Copernicus Publications.


Rights

Rights holder: M. C. Hung et al.

