This paper proposes a wheel-LiDAR odometry and mapping method (WLOAM), which uses a wheel encoder, a steering encoder, a LiDAR, and an optional GPS to estimate a low-drift pose in real time and build a highly accurate map for autonomous vehicles. The odometry consists of a wheel odometry algorithm and a LiDAR odometry algorithm. The former estimates the 3-DOF ego-motion of the LiDAR at a high frequency based on Ackermann steering geometry; the resulting pose increment is used to de-skew point clouds and serves as a fine initial guess for the LiDAR odometry. The latter performs 6-DOF scan-to-map LiDAR pose optimization at a relatively low frequency to compensate for the pose error accumulated by the wheel odometry; its core is a two-stage feature extraction method with an angle-based metric. The mapping method is based on a factor graph consisting of LiDAR odometry factors, loop closure factors, and optional GPS factors, which is solved via incremental smoothing and mapping (iSAM) to produce a global map online. An auto-aligned GPS factor is proposed to fuse GPS measurements incrementally without explicit initialization. The proposed method was extensively evaluated on datasets gathered from an autonomous vehicle platform and compared with related open-source works. The results show a lower drift rate, which reaches 0.53% in the largest test described in this paper. The implementation of the proposed method is open-sourced at https://github.com/Saki-Chen/W-LOAM.
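To make the factor-graph back end concrete, the following is a minimal C++ sketch built on GTSAM (the library providing iSAM2), not code taken from the WLOAM repository: it anchors the first key pose with a prior, adds a LiDAR-odometry-style relative-pose factor, and shows where a GPS constraint and a loop closure factor would enter the graph before an incremental iSAM2 update. All keys, noise sigmas, and measurement values are placeholder assumptions, and the standard gtsam::GPSFactor stands in only as an illustration for the paper's own auto-aligned GPS factor.

```cpp
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/navigation/GPSFactor.h>
#include <gtsam/nonlinear/ISAM2.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

using namespace gtsam;
using symbol_shorthand::X;  // X(i): i-th key pose

int main() {
  ISAM2Params params;
  params.relinearizeThreshold = 0.1;   // placeholder tuning value
  ISAM2 isam(params);

  NonlinearFactorGraph graph;
  Values initial;

  // Prior on the first key pose anchors the trajectory.
  auto priorNoise = noiseModel::Diagonal::Sigmas(
      (Vector(6) << 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 1e-3).finished());
  graph.add(PriorFactor<Pose3>(X(0), Pose3(), priorNoise));
  initial.insert(X(0), Pose3());

  // LiDAR odometry factor: relative pose between consecutive key poses
  // (in WLOAM this would come from the scan-to-map optimization).
  auto odomNoise = noiseModel::Diagonal::Sigmas(
      (Vector(6) << 0.01, 0.01, 0.01, 0.05, 0.05, 0.05).finished());
  Pose3 delta(Rot3::RzRyRx(0.0, 0.0, 0.01), Point3(1.0, 0.0, 0.0));  // placeholder
  graph.add(BetweenFactor<Pose3>(X(0), X(1), delta, odomNoise));
  initial.insert(X(1), delta);

  // Optional GPS factor constraining the translation of a key pose.
  auto gpsNoise = noiseModel::Isotropic::Sigma(3, 2.0);  // metres, placeholder
  graph.add(GPSFactor(X(1), Point3(1.0, 0.1, 0.0), gpsNoise));

  // A loop closure factor would be another BetweenFactor<Pose3> between
  // non-consecutive key poses once a revisit is detected.

  // Incremental update: iSAM2 relinearizes only the affected variables.
  isam.update(graph, initial);
  Values estimate = isam.calculateEstimate();
  estimate.at<Pose3>(X(1)).print("X(1) estimate:\n");
  return 0;
}
```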