
This work is licensed under a Creative Commons Attribution 4.0 International License.
Low-Cost 3D Depth Sensors for Mobile Applications and Control Systems – Accuracy Assessments Using Surveying Techniques
Corresponding Author(s): Daniel Janos
Geomatics and Environmental Engineering,
Vol. 19 No. 1 (2025): Geomatics and Environmental Engineering
Abstract
This article focuses on low-cost LiDAR (light detection and ranging) sensors and 3D depth cameras, with particular attention to their accuracy and their compliance with the technical specifications provided by their respective manufacturers. The following devices were tested: the Stereolabs ZED 2i, Stereolabs ZED, and Intel RealSense D435i depth cameras, and the Intel RealSense L515 LiDAR sensor. An experiment was carried out in a geometrically diverse environment (typical for in-motion imaging) in which both the measurement range and the distortion generated by each device's algorithms on edges, folds, planes, and 3D objects could be evaluated. Depth sensors are often used with excessive confidence in their geometric reliability. The aim of this work is to assess the actual accuracy of such sensors, which may serve as a baseline for the accuracy losses that can occur during the operation of autonomous vehicles. The results show that the accuracy figures quoted by the manufacturers were difficult to achieve under real conditions. The tested low-cost devices can be used in industrial projects, but only under certain conditions and settings; knowing their capabilities and limitations is necessary to take full advantage of what they offer.
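The accuracy assessment described in the abstract rests on comparing a sensor's point cloud against a higher-accuracy reference (e.g., a terrestrial laser scan). A minimal sketch of that idea is shown below; it is an illustration only, not the authors' actual processing pipeline. The synthetic plane, the 5 mm noise level, and the function name `cloud_accuracy` are all hypothetical.

```python
import numpy as np

def cloud_accuracy(measured, reference):
    """Per-point error of a sensor point cloud against a reference cloud:
    distance from each measured point to its nearest reference point.
    Returns (mean error, RMSE). Brute force; fine for small clouds."""
    d = np.linalg.norm(measured[:, None, :] - reference[None, :, :], axis=2)
    errors = d.min(axis=1)  # nearest-neighbour distance per measured point
    return errors.mean(), np.sqrt((errors ** 2).mean())

# Synthetic example: a flat reference plane z = 0, and a "sensor"
# measurement of it with 5 mm Gaussian depth noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(200, 2))
reference = np.column_stack([xy, np.zeros(200)])
noise = rng.normal(0.0, 0.005, 200)  # metres
measured = reference + np.column_stack([np.zeros((200, 2)), noise])

mean_err, rmse = cloud_accuracy(measured, reference)
print(f"mean error {mean_err * 1000:.1f} mm, RMSE {rmse * 1000:.1f} mm")
```

In practice the two clouds must first be registered in a common coordinate frame (the surveying step), and a KD-tree would replace the brute-force distance matrix for realistic cloud sizes.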