GCP-Based Automated Fine Alignment Method for Improving the Accuracy of Coordinate Information on UAV Point Cloud Data
Abstract
1. Introduction
2. Framework for Research Methodology
2.1. Phase 1: GCPs Detection
2.2. Phase 2: GCPs Global Coordinate Extraction
2.3. Phase 3: Transformation Matrix Estimation
2.4. Phase 4: Fine Alignment
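Phases 3 and 4 above amount to estimating a rigid transformation from matched GCP pairs (detected GCP centers vs. surveyed global coordinates) and applying it to the UAV point cloud. A minimal sketch of one common way to estimate such a transformation (the SVD-based Kabsch method, chosen here for illustration; the paper's exact estimation procedure may differ):

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t minimizing ||(src @ R.T + t) - dst||
    over matched 3-D point pairs (Kabsch / SVD method)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: rotate and translate four GCP-like points, then recover the motion.
detected = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
surveyed = detected @ R_true.T + np.array([252124.0, 463707.0, 96.9])
R, t = estimate_rigid_transform(detected, surveyed)
aligned = detected @ R.T + t
print(np.abs(aligned - surveyed).max())  # ~0 (machine precision)
```

With exact, noise-free correspondences the recovered transform reproduces the target coordinates; with noisy detections it gives the least-squares rigid fit, which is the usual coarse step before ICP-style fine alignment.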
3. Implementation of GCPs Detection Model
3.1. UAV Image Acquisition
3.2. Image Pre-Processing for Training
3.3. Training and Evaluation
4. Case Study
4.1. UAV Data Acquisition and Pre-Processing
4.2. Experimental Method
4.3. Experimental Results
5. Summary, Contributions and Future Work
5.1. Summary
5.2. Contributions
5.3. Limitations and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Dataset | Images | Tiles | Augmentation | Augmented Tiles |
|---|---|---|---|---|
| Training | 446 | 3197 | Saturation | 3197 |
| | | | Brightness | 3197 |
| | | | Exposure | 3197 |
| | | | Blur | 3197 |
| | | | Noise | 3197 |
| | | | Rotation | 3197 |
| Validation | 125 | 756 | N.A. | |
| Test | 61 | 382 | N.A. | |
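The dataset summary above shows each source image split into multiple tiles before augmentation (e.g., 446 training images yielding 3197 tiles). A sketch of one way such tiling can be implemented; the tile size of 640 px and the non-overlapping scheme are assumptions for illustration, not the paper's confirmed parameters:

```python
import numpy as np

def tile_image(img, tile=640):
    """Split an H x W x C image array into non-overlapping tile x tile patches,
    discarding partial tiles at the right and bottom edges."""
    h, w = img.shape[:2]
    return [img[r:r + tile, c:c + tile]
            for r in range(0, h - tile + 1, tile)
            for c in range(0, w - tile + 1, tile)]

img = np.zeros((1300, 2000, 3), dtype=np.uint8)  # stand-in for a UAV photo
tiles = tile_image(img, tile=640)
print(len(tiles))  # 2 rows x 3 columns = 6 tiles
```

Tiling keeps small GCP markers at a usable pixel scale for the detector instead of shrinking the full-resolution frame to the network input size.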
GCP Number | X Coordinate (m) | Y Coordinate (m) | Z Coordinate (m)
---|---|---|---|
1 | 252,124.345 | 463,707.495 | 96.914 |
2 | 252,228.311 | 463,692.878 | 105.361 |
3 | 252,476.020 | 463,722.543 | 102.988 |
4 | 252,534.588 | 463,724.533 | 98.485 |
5 | 252,654.625 | 463,744.767 | 96.595 |
Data 1 (per-axis error, m)

| GCP Number | GCPs Detection X | GCPs Detection Y | GCPs Detection Z | Fine Alignment X | Fine Alignment Y | Fine Alignment Z |
|---|---|---|---|---|---|---|
| 1 | 0.0240 | 0.0508 | 0.0606 | 0.0019 | 0.0285 | 0.0072 |
| 2 | 0.0217 | 0.0954 | 0.0043 | 0.0038 | 0.0379 | 0.0153 |
| 3 | 0.0150 | 0.0271 | 0.0198 | 0.0380 | 0.0089 | 0.0221 |
| 4 | 0.0635 | 0.0436 | 0.0244 | 0.0397 | 0.0073 | 0.0122 |
| 5 | 0.0262 | 0.0191 | 0.0124 | 0.0040 | 0.0078 | 0.0018 |
| Average RMSE (m) | 0.0666 | | | 0.0333 | | |

Data 2 (per-axis error, m)

| GCP Number | GCPs Detection X | GCPs Detection Y | GCPs Detection Z | Fine Alignment X | Fine Alignment Y | Fine Alignment Z |
|---|---|---|---|---|---|---|
| 3 | 0.0284 | 0.0616 | 0.1562 | 0.0397 | 0.0197 | 0.0126 |
| 4 | 0.0458 | 0.0284 | 0.0407 | 0.0460 | 0.0290 | 0.0174 |
| 5 | 0.0162 | 0.0689 | 0.0707 | 0.0062 | 0.0092 | 0.0048 |
| Average RMSE (m) | 0.1126 | | | 0.0384 | | |

Data 3 (per-axis error, m)

| GCP Number | GCPs Detection X | GCPs Detection Y | GCPs Detection Z | Fine Alignment X | Fine Alignment Y | Fine Alignment Z |
|---|---|---|---|---|---|---|
| 1 | 0.0049 | 0.0869 | 0.0032 | 0.0063 | 0.0115 | 0.0042 |
| 2 | 0.0105 | 0.0181 | 0.0259 | 0.0018 | 0.0162 | 0.0058 |
| 3 | 0.0170 | 0.0173 | 0.0888 | 0.0045 | 0.0050 | 0.0016 |
| Average RMSE (m) | 0.0708 | | | 0.0128 | | |
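The average RMSE values reported for each dataset are consistent with taking the 3-D Euclidean error per GCP and averaging over all GCPs. A short verification for the Data 1 GCP detection column (per-axis values copied from the table above):

```python
import numpy as np

# Per-axis errors (m) for Data 1, GCPs detection results (GCPs 1-5).
errors = np.array([
    [0.0240, 0.0508, 0.0606],
    [0.0217, 0.0954, 0.0043],
    [0.0150, 0.0271, 0.0198],
    [0.0635, 0.0436, 0.0244],
    [0.0262, 0.0191, 0.0124],
])

# 3-D Euclidean error for each GCP, then the mean over all GCPs.
per_gcp = np.linalg.norm(errors, axis=1)
avg = per_gcp.mean()
print(round(avg, 4))  # 0.0666, matching the table's average RMSE for Data 1
```

The same computation on the Data 1 fine-alignment column reproduces 0.0333, supporting this reading of how the averages were formed.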
| Method | Threshold | Data 2 | Data 3 |
|---|---|---|---|
| Fine alignment framework | N.A. | 21.8 s | 18.61 s |
| Point-to-point ICP | 0.1 | 29.35 s | |
| Point-to-point ICP | 0.3 | 34.95 s | |
| Point-to-point ICP | 0.5 | 39.16 s | |
| Plane matching | 0.1 | 64.23 s | |
| Plane matching | 0.3 | 68.32 s | |
| Plane matching | 0.5 | 73.15 s | |
Choi, Y.; Park, S.; Kim, S. GCP-Based Automated Fine Alignment Method for Improving the Accuracy of Coordinate Information on UAV Point Cloud Data. Sensors 2022, 22, 8735. https://doi.org/10.3390/s22228735