Research Article

From photogrammetric modeling to augmented application of a quarry

Year 2026, Issue: 012, 12–25, 30.04.2026
https://izlik.org/JA29LU94HR

Abstract

The study presents a methodological pipeline that runs from three-dimensional photogrammetric model generation with an Unmanned Aerial Vehicle (UAV) to the development of an Augmented Reality (AR) application for a quarry. First, a high-precision 3D model of a real quarry pit was generated from the imagery of a non-RTK UAV. After the area was scanned, the successive photographs taken by the drone were processed in photogrammetry software to build a mesh, which was then optimized to 50,000 polygons. The model was subsequently developed into an interactive holographic application with the Unity engine and the Vuforia platform. Two AR tracking paradigms, Image Target and Ground Plane, were implemented and tested both on mobile platforms and on the Microsoft HoloLens 2 device. In a system evaluation by a group of 17 students and 12 academics who tested the application, average scores of at least 85 and 90 out of 100, respectively, were given for spatial understanding. These ratings covered increased memorability, a positive contribution to learning, more enjoyable learning, improved educational quality, and the likelihood of widespread adoption of such applications. The academics' ratings of the applications in the study were more positive, which qualitatively supports the educational effectiveness of the system. A trade-off between geometric accuracy and real-time rendering performance was also observed on mobile platforms. It was concluded that the Ground Plane method provides workable material for a high-fidelity digital shadow that can be applied to remote inspection. Finally, AR applications running on smartphones/tablets and the HoloLens 2 are foreseen to have the infrastructural potential to enhance engineering, planning, and control processes and to support mine safety.
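Two quantities in the pipeline above lend themselves to a quick back-of-envelope check: the ground sampling distance (GSD) of the nadir UAV imagery, via the standard formula GSD = (altitude × sensor width) / (focal length × image width), and the fraction of triangles retained when the raw photogrammetric mesh is decimated to the 50,000-polygon real-time budget. The sketch below is a minimal illustration; all camera parameters and the raw triangle count are hypothetical assumptions, not figures from the paper.

```python
# Back-of-envelope helpers for the UAV-photogrammetry-to-AR pipeline.
# All parameter values below are illustrative assumptions.

def ground_sampling_distance_cm(altitude_m: float, sensor_width_mm: float,
                                focal_length_mm: float, image_width_px: int) -> float:
    """Ground sampling distance (cm/pixel) for nadir imagery:
    GSD = (altitude * sensor_width) / (focal_length * image_width)."""
    return (altitude_m * 100.0 * sensor_width_mm) / (focal_length_mm * image_width_px)

def decimation_ratio(source_triangles: int, target_triangles: int = 50_000) -> float:
    """Fraction of triangles kept when simplifying a photogrammetric mesh
    down to a real-time rendering budget (50,000 polygons in the study)."""
    return min(1.0, target_triangles / source_triangles)

# Hypothetical small-sensor drone camera at 60 m flight altitude.
gsd = ground_sampling_distance_cm(60, 6.17, 4.5, 4000)
kept = decimation_ratio(2_000_000)  # assumed 2M-triangle raw mesh
print(f"GSD ≈ {gsd:.2f} cm/px; keep {kept:.1%} of triangles")
```

At these assumed settings the imagery resolves roughly 2 cm per pixel, while the mesh budget keeps only a few percent of the raw triangles, which is the accuracy-versus-rendering trade-off the abstract notes on mobile platforms.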

References

  • [1] T. Zhan, K. Yin, J. Xiong, and Z. He, “Augmented reality and virtual reality displays: Perspectives and challenges,” iScience, vol. 23, no. 8, 2020, doi: 10.1016/j.isci.2020.101397.
  • [2] P. Singh, V. Murthy, D. Kumar, and S. Raval, “A comprehensive review on application of drone, virtual reality and augmented reality with their application in dragline excavation monitoring in surface mines,” Geomatics, Nat. Hazards Risk, vol. 15, no. 1, 2024, doi: 10.1080/19475705.2024.2327399.
  • [3] G. Lampropoulos, P. Fernández‐Arias, A. Antón‐Sancho, and D. Vergara, “Examining the role of augmented reality and virtual reality in safety training,” Electronics, vol. 13, no. 19, 2024, doi: 10.3390/electronics13193952.
  • [4] F. Mana, P. Jeannette, E. Theresa, and J. Kietzmann, “Go boldly! Explore augmented reality (AR), virtual reality (VR), and mixed reality (MR) for business,” Bus. Horizons, vol. 61, no. 5, pp. 657–666, 2018, doi: 10.1016/j.bushor.2018.05.009.
  • [5] J. Garzón, “An overview of twenty-five years of augmented reality in education,” Multimodal Technol. Interact., vol. 5, no. 7, 2021, doi: 10.3390/mti5070037.
  • [6] V. Gheorghe, F. Girbacia, D. Mihai, B. Razvan, and G. Carmen, “Mapping the emergent trends in industrial augmented reality,” Electronics, vol. 12, no. 7, 2023, doi: 10.3390/electronics12071719.
  • [7] C. Ke and X. Fan, “The renaissance of augmented reality in construction: history, present status and future directions,” Smart Sustain. Built Environ., 2020, doi: 10.1108/SASBE-08-2020-0124.
  • [8] M. Thakra, A. Devare, and M. H. Devare, “Augmented reality (AR) and virtual reality (VR) for UAV swarm visualization,” in UAV Swarm Visualization, 2025, pp. 207–239, doi: 10.1007/979-8-8688-1047-3_6.
  • [9] D. Drascic and P. Milgram, “Perceptual issues in augmented reality,” Proc. SPIE, vol. 2653, pp. 123–134, 1996, doi: 10.1117/12.237425.
  • [10] A. Bhardwaj. “AR, VR and MR: What’s trending?” Open Source For You. https://www.opensourceforu.com/2024/03/ar-vr-and-mr-whats-trending/ (accessed Apr. 28, 2026).
  • [11] A. Gör and C. Coşkun, “Augmented reality as an exhibition method,” Global J. Arts Educ., vol. 7, no. 3, 2017.
  • [12] M. B. Ibáñez and C. Delgado-Kloos, “Augmented reality for STEM learning: A systematic review,” Comput. Educ., vol. 123, pp. 109–123, 2018, doi: 10.1016/j.compedu.2018.05.002.
  • [13] E. Dzardanova and V. Kasapakis, “Virtual reality: A journey from vision to commodity,” IEEE Ann. Hist. Comput., vol. 45, no. 1, pp. 18–30, 2023, doi: 10.1109/MAHC.2022.3208774.
  • [14] P. Milgram and F. Kishino, “A taxonomy of mixed reality visual displays,” IEICE Trans. Inf. Syst., vol. 77, no. 12, pp. 1321–1329, 1994.
  • [15] Abigail. “The virtual spectrum- Understanding AR, MR, VR and XR.” CreatXR. https://creatxr.com/the-virtuality-spectrum-understanding-ar-mr-vr-and-xr/ (accessed Apr. 28, 2026).
  • [16] C. Javvaji et al., “Immersive innovations: Exploring the diverse applications of virtual reality (VR) in healthcare,” Cureus, vol. 16, no. 3, 2024, doi: 10.7759/cureus.56137.
  • [17] S. Shinde, V. Samale, N. Yede, and A. Pilay, “VR safety training for hazardous work,” Int. J. Multidiscip. Res. (IJFMR), vol. 7, no. 1, 2025, doi: 10.36948/ijfmr.2025.v07i01.
  • [18] K. Erarslan and M. Özdemir, “Utilization of augmented and mixed reality in training mining machines,” ESTUDAM, vol. 5, no. 2, pp. 48–56, 2024, doi: 10.53608/estudambilisim.1583427.
  • [19] L. Perfetti, S. Teruggi, C. Achille, and F. Fassi, “Rapid and low-cost photogrammetric survey of hazardous sites, from measurements to VR dissemination,” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., vol. XLVIII-2/W1-2022, pp. 207–214, 2022, doi: 10.5194/isprs-archives-XLVIII-2-W1-2022-207-2022.
  • [20] ArcGIS. “Virtual reality (VR) and augmented reality (AR) with ArcGIS.” Esri MediaSpace. https://mediaspace.esri.com/media/t/1_e8v5bwla (accessed Apr. 28, 2026).
  • [21] M. Baird, S. Haegler, and R. Hansen. “Virtual reality (VR) and augmented reality (AR) with ArcGIS.” ESRI Event. [suspicious link removed] (accessed Jan. 24, 2024).
  • [22] A. Chaturvedi. “5 ways LiDAR is transforming the world before our eyes.” Geospatial World. https://www.geospatialworld.net/blogs/5-ways-lidar-is-transforming-the-world-before-our-eyes/ (accessed Dec. 22, 2021).
  • [23] L. Duarte, A. C. Teodoro, O. Moutinho, and J. A. Goncalves, “Open-source GIS application for UAV photogrammetry based on MicMac,” Int. J. Remote Sens., vol. 38, no. 8-10, pp. 3181–3202, 2017, doi: 10.1080/01431161.2016.1259685.
  • [24] A. Francois. “QGis for LIDAR: digital surface model (DSM) with CloudCompare and LAStools.” Blog GIS Territ. https://www.sigterritoires.fr/index.php/en/qgis-for-lidar-digital-surface-model-dsm-with-cloudcompare-and-lastools/ (accessed Aug. 20, 2023).
  • [25] J. Jacobs, R. C. W. Webber-Youngman, and E. van Wyk, “Potential augmented reality applications in the mining industry,” 2016, doi: 10.13140/RG.2.2.27751.44961.
  • [26] B. C. Kress and W. J. Cummings, “Towards the ultimate mixed reality experience: hololens display architecture choices,” Symp. Digest Tech. Papers, vol. 48, no. 1, pp. 127–131, 2017, doi: 10.1002/sdtp.11586.
  • [27] D. D. Mascarenas et al., “Augmented reality for next generation infrastructure inspections,” Struct. Health Monit., vol. 20, no. 4, pp. 1957–1979, 2021, doi: 10.1177/1475921720953846.
  • [28] P. Singh, V. Murthy, D. Kumar, and S. Raval, “A comprehensive review on application of drone, virtual reality and augmented reality with their application in dragline excavation monitoring in surface mines,” Geomatics, Nat. Hazards Risk, vol. 15, no. 1, 2024, doi: 10.1080/19475705.2024.2327399.
  • [29] J. Suh, S. Lee, and Y. Choi, “UMineAR: mobile-tablet-based abandoned mine hazard site investigation support system using augmented reality,” Minerals, vol. 7, no. 10, p. 198, 2017, doi: 10.3390/min7100198.
  • [30] Vuforia. “Spatial augmented reality with Vuforia engine in unity.” Unity Technologies. https://resources.unity.com/unitenow/onlinesessions/spatial-augmented-reality-with-vuforia-engine-in-unity (accessed Jan. 10, 2021).
  • [31] Wingtra. “Drones for mining: how to use and choose what’s best.” Wingtra. https://wingtra.com/drone-mapping-applications/mining-and-aggregates (accessed Jan. 21, 2021).
  • [32] J. Jacobs, R. C. W. Webber-Youngman, and E. van Wyk, “Potential augmented reality applications in the mining industry,” pp. 1–8, Jan. 2016, doi: 10.13140/RG.2.2.27751.44961.
  • [33] L. Lattanzi et al., “Digital twin for smart manufacturing: a review of concepts towards a practical industrial implementation,” Int. J. Comput. Integr. Manuf., vol. 34, no. 6, pp. 567–597, 2020, doi: 10.1080/0951192X.2021.1911003.
  • [34] J. Trauer et al., “What is a digital twin?–definitions and insights from an industrial case study in technical product development,” in Proc. Design Soc.: DESIGN Conf., vol. 1, 2020, pp. 757–766, doi: 10.1017/dsd.2020.15.
  • [35] S. Baidya et al., “Digital twin in safety-critical robotics applications: Opportunities and challenges,” in Proc. IEEE Int. Perform. Comput. Commun. Conf. (IPCCC), 2022, pp. 101–107.
  • [36] G. Pronost et al., “Towards a framework for the classification of digital twins and their applications,” IFAC-PapersOnLine, vol. 54, no. 1, 2021, doi: 10.1109/ICE/ITMC52061.2021.9570114.
  • [37] F. von Haxthausen, Y. Chen, and F. Ernst, “Superimposing holograms on real world objects using HoloLens 2 and its depth camera,” Curr. Dir. Biomed. Eng., vol. 7, no. 1, p. 111, 2021, doi: 10.1515/cdbme-2021-1024.
  • [38] S. Teruggi and F. Fassi, “Mixed reality content alignment in monumental environments,” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., p. 901, 2022, doi: 10.5194/isprs-archives-xliii-b2-2022-901-2022.
  • [39] A. Ulvi, V. İzci, and A. Y. Yiğit, “Investigation of the effect of the number and distribution of ground control point (GCP) on map production accuracy,” Erciyes Univ. Fen Bilim. Enst. Fen Bilim. Derg., vol. 40, no. 2, pp. 167–180, 2024. [Online]. Available: https://izlik.org/JA95HP77LS
  • [40] J. Knodt, “Single edge collapse quad-dominant mesh reduction,” arXiv, 2024, doi: 10.48550/arxiv.2411.16874.
  • [41] H. D. Liu, X. Zhang, and C. Yuksel, “Simplifying triangle meshes in the wild,” arXiv, 2024, doi: 10.48550/arxiv.2409.15458.
  • [42] H. K. Chang, J. Choi, and C. M. Yeum, “3D reconstruction by looking: Instantaneous blind spot detector for indoor SLAM through mixed reality,” arXiv, 2024, doi: 10.48550/arxiv.2411.12514.
  • [43] H. C. Gagnon et al., “Gap affordance judgments in mixed reality: Testing the role of display weight and field of view,” Front. Virtual Real., vol. 2, 2021, doi: 10.3389/frvir.2021.654656.
  • [44] N. Biswas, A. Mukherjee, and S. Bhattacharya, “Are you feeling sick? – A systematic literature review of cybersickness in virtual reality,” ACM Comput. Surv., vol. 56, no. 11, p. 1, 2024, doi: 10.1145/3670008.
  • [45] I. Miguel-Alonso, D. Checa, H. Guillen-Sanz, and A. Bustillo, “Evaluation of the novelty effect in immersive virtual reality learning experiences,” Virtual Reality, vol. 28, no. 1, 2024, doi: 10.1007/s10055-023-00926-5.
  • [46] B. Sobota and D. Cvetković, “Mixed reality and three-dimensional computer graphics,” in IntechOpen eBooks, IntechOpen, 2020, doi: 10.5772/intechopen.77405.
  • [47] K. Essmiller et al., “Exploring mixed reality based on self-efficacy and motivation of users,” Res. Learn. Technol., vol. 28, 2020, doi: 10.25304/rlt.v28.2331.
  • [48] N. Ezdina, E. Y. Dotsenko, E. V. Shavina, and Y. S. Valeeva, “Convergent technological and hyperconvergent forms of productivity improvement in the extractive sector of economy,” Int. J. Technol., vol. 15, no. 3, p. 571, 2024, doi: 10.14716/ijtech.v15i3.5661.
  • [49] M. G. Don, T. R. Wanasinghe, R. G. Gosine, and P. Warrian, “Digital twins and enabling technology applications in mining: Research trends, opportunities, and challenges,” IEEE Access, vol. 13, p. 6945, 2025, doi: 10.1109/access.2025.3526881.
  • [50] C. E. Emere, O. A. Oguntona, I. Ohiomah, and E. Ayorinde, “Harnessing emerging technologies in the global mining sector from a bibliometric standpoint,” Mining, vol. 5, no. 1, p. 13, 2025, doi: 10.3390/mining5010013.
  • [51] J. Adams, F. Flavell, and R. Raureti, “Mixed reality results in vocational education: a case study with HoloLens 2,” Res. Learn. Technol., vol. 30, 2022, doi: 10.25304/rlt.v30.2803.
  • [52] J. D. Valencia-Quiceno, V. Kecojević, A. McBrayer, and D. Bogunovic, “Augmented reality system for training of heavy equipment operators in surface mining,” Min. Metall. Explor., vol. 41, pp. 2217–2229, 2024, doi: 10.1007/s42461-024-01047-6.
  • [53] V. Balaska, I. T. Papapetros, K. M. Oikonomou, L. Bampis, and Α. Gasteratos, “UAV object detection and positioning in a mining industrial metaverse with custom geo-referenced data,” arXiv, 2025, doi: 10.48550/arxiv.2506.13505.
  • [54] Z. Niu, H. Xia, P. Tao, and T. Ke, “Accuracy assessment of UAV photogrammetry system with RTK measurements for direct georeferencing,” ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., p. 169, 2024, doi: 10.5194/isprs-annals-x-1-2024-169-2024.
  • [55] C. Tan et al., “Accuracy analysis of UAV aerial photogrammetry based on RTK mode, flight altitude, and number of GCPs,” Meas. Sci. Technol., vol. 35, no. 10, p. 106310, 2024, doi: 10.1088/1361-6501/ad5dd7.
  • [56] S. M. Dlamini and Y. O. Ouma, “Large-scale topographic mapping using RTK-GNSS and multispectral UAV drone photogrammetric surveys: Comparative evaluation of experimental results,” Geomatics, vol. 5, no. 2, p. 25, 2025, doi: 10.3390/geomatics5020025.
  • [57] S. Tadeja, Y. Lu, M. Rydlewicz, W. Rydlewicz, T. Bubas, and P. O. Kristensson, “Exploring gestural input for engineering surveys of real-life structures in virtual reality using photogrammetric 3D models,” Multimedia Tools Appl., vol. 80, no. 20, p. 31039, 2021, doi: 10.1007/s11042-021-10520-z.
  • [58] M. H. A. Yusri, M. Johan, and M. H. M. Ramli, “Preservation of cultural heritage: A comparison study of 3D modelling between laser scanning, depth image and photogrammetry methods,” J. Mech. Eng., vol. 19, no. 2, pp. 125–145, 2022, doi: 10.24191/jmeche.v19i2.19768.
  • [59] Y. Gao, É. Peillard, J. Normand, G. Moreau, Y. Liu, and Y. Wang, “Influence of virtual objects’ shadows and lighting coherence on distance perception in optical see‐through augmented reality,” J. Soc. Inf. Disp., vol. 28, no. 2, p. 117, 2019, doi: 10.1002/jsid.832.
  • [60] J. M. Liu, G. Narasimham, J. K. Stefanucci, S. H. Creem-Regehr, and B. Bodenheimer, “Distance perception in modern mobile augmented reality,” in Proc. 2022 IEEE Conf. Virtual Real. 3D User Interfaces Abstr. Workshops (VRW), 2020, p. 196, doi: 10.1109/vrw50115.2020.00042.
  • [61] M. Poupard, F. Larrue, H. Sauzeon, and A. Tricot, “A systematic review of immersive technologies for education: Learning performance, cognitive load and intrinsic motivation,” Br. J. Educ. Technol., vol. 56, no. 1, pp. 5–41, 2024, doi: 10.1111/bjet.13503.
  • [62] M. Poupard, F. Larrue, H. Sauzeon, and A. Tricot, “A systematic review of immersive technologies for education: effects of cognitive load and curiosity state on learning performance,” HAL Open Science, pp. 1–38, 2025. [Online]. Available: https://hal.archives-ouvertes.fr/hal-03906797
  • [63] K. Cheng, “Reading an augmented reality book: An exploration of learners’ cognitive load, motivation, and attitudes,” Australas. J. Educ. Technol., vol. 33, no. 4, pp. 53–69, 2017, doi: 10.14742/ajet.2820.
  • [64] V. Candido and A. S. Cattaneo, “Applying cognitive theory of multimedia learning principles to augmented reality and its effects on cognitive load and learning outcomes,” Comput. Hum. Behav. Rep., vol. 18, p. 100678, 2025, doi: 10.1016/j.chbr.2025.100678.


Details

Primary Language: English
Subjects: Mining Engineering (Other)
Section: Research Article
Authors

Zayyad Abdul Masanawa 0009-0003-3287-9224

Mehmet Özdemir 0000-0002-8164-8874

Kaan Erarslan 0000-0002-1875-4009

Submission Date: December 17, 2025
Acceptance Date: April 17, 2026
Publication Date: April 30, 2026
IZ https://izlik.org/JA29LU94HR
Published Issue: Year 2026, Issue: 012

Cite

APA Masanawa, Z. A., Özdemir, M., & Erarslan, K. (2026). From photogrammetric modeling to augmented application of a quarry. Journal of Scientific Reports-C, 012, 12-25. https://izlik.org/JA29LU94HR
AMA Masanawa ZA, Özdemir M, Erarslan K. From photogrammetric modeling to augmented application of a quarry. JSR-C. 2026;(012):12-25. https://izlik.org/JA29LU94HR
Chicago Masanawa, Zayyad Abdul, Mehmet Özdemir, and Kaan Erarslan. 2026. "From photogrammetric modeling to augmented application of a quarry". Journal of Scientific Reports-C, no. 012: 12-25. https://izlik.org/JA29LU94HR.
EndNote Masanawa ZA, Özdemir M, Erarslan K (April 01, 2026) From photogrammetric modeling to augmented application of a quarry. Journal of Scientific Reports-C 012 12–25.
IEEE [1] Z. A. Masanawa, M. Özdemir, and K. Erarslan, "From photogrammetric modeling to augmented application of a quarry", JSR-C, no. 012, pp. 12–25, Apr. 2026. [Online]. Available: https://izlik.org/JA29LU94HR
ISNAD Masanawa, Zayyad Abdul - Özdemir, Mehmet - Erarslan, Kaan. "From photogrammetric modeling to augmented application of a quarry". Journal of Scientific Reports-C. 012 (April 01, 2026): 12-25. https://izlik.org/JA29LU94HR.
JAMA Masanawa ZA, Özdemir M, Erarslan K. From photogrammetric modeling to augmented application of a quarry. JSR-C. 2026;(012):12–25.
MLA Masanawa, Zayyad Abdul, et al. "From photogrammetric modeling to augmented application of a quarry". Journal of Scientific Reports-C, no. 012, April 2026, pp. 12-25, https://izlik.org/JA29LU94HR.
Vancouver Masanawa ZA, Özdemir M, Erarslan K. From photogrammetric modeling to augmented application of a quarry. JSR-C [Internet]. April 01, 2026;(012):12-25. Available from: https://izlik.org/JA29LU94HR