Enhanced Autonomous Exploration and Mapping of an Unknown Environment with the Fusion of Dual RGB-D Sensors

Ningbo Yu, Shirong Wang

Engineering ›› 2019, Vol. 5 ›› Issue (1) : 164-172. DOI: 10.1016/j.eng.2018.11.014

Research Robotics—Article
Abstract

The autonomous exploration and mapping of an unknown environment is useful in a wide range of applications and thus holds great significance. Existing methods mostly use range sensors to generate two-dimensional (2D) grid maps. Red/green/blue-depth (RGB-D) sensors provide both color and depth information on the environment, thereby enabling the generation of a three-dimensional (3D) point cloud map that is intuitive for human perception. In this paper, we present a systematic approach with dual RGB-D sensors to achieve the autonomous exploration and mapping of an unknown indoor environment. With the synchronized and processed RGB-D data, location points were generated and a 3D point cloud map and 2D grid map were incrementally built. Next, the exploration was modeled as a partially observable Markov decision process. Partial map simulation and global frontier search methods were combined for autonomous exploration, and dynamic action constraints were utilized in motion control. In this way, the local optimum can be avoided and the exploration efficacy can be ensured. Experiments with single connected and multi-branched regions demonstrated the high robustness, efficiency, and superiority of the developed system and methods.
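The global frontier search mentioned above follows the classic frontier-based exploration idea: on a 2D occupancy grid, any free cell bordering unexplored space is a candidate goal, since driving there is guaranteed to reveal new territory. The sketch below is not the authors' implementation; it is a minimal illustration of frontier detection on a toy grid, assuming the common cell convention of 0 = free, 1 = occupied, −1 = unknown.

```python
FREE, OCC, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) cells that are free and 4-adjacent to unknown space.

    Such frontier cells are the goal candidates of frontier-based
    exploration: reaching one necessarily extends the known map.
    """
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            # A free cell with at least one unknown 4-neighbor is a frontier.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

# Toy 4x4 map: left portion explored (one obstacle), right portion unknown.
grid = [
    [0, 0, -1, -1],
    [0, 0, -1, -1],
    [0, 1, -1, -1],
    [0, 0,  0, -1],
]
print(find_frontiers(grid))  # → [(0, 1), (1, 1), (3, 2)]
```

A real exploration stack would then cluster these cells, score clusters by expected information gain and travel cost (the POMDP formulation in the paper folds this into the decision process), and send the best one to the motion planner.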

Keywords

Autonomous exploration / Red/green/blue-depth / Sensor fusion / Point cloud / Partial map simulation / Global frontier search

Cite this article

Ningbo Yu, Shirong Wang. Enhanced Autonomous Exploration and Mapping of an Unknown Environment with the Fusion of Dual RGB-D Sensors. Engineering, 2019, 5(1): 164‒172 https://doi.org/10.1016/j.eng.2018.11.014

Acknowledgements

The authors would like to thank Mathieu Labbé from Université de Sherbrooke, Canada, and Mikko Lauri from Tampere University of Technology, Finland, for their helpful suggestions.

The authors would like to acknowledge the National Natural Science Foundation of China (61720106012 and 61403215), the Foundation of State Key Laboratory of Robotics (2006-003) and the Fundamental Research Funds for the Central Universities for the financial support of this work.

Compliance with ethics guidelines

Ningbo Yu and Shirong Wang declare that they have no conflict of interest or financial conflicts to disclose.

RIGHTS & PERMISSIONS

© 2019 Chinese Academy of Engineering