Transactions of Nanjing University of Aeronautics & Astronautics

    Abstract

    Achieving accurate navigation by integrating multiple sensors is key to the safe operation of land vehicles in global navigation satellite system (GNSS)‑denied environments. However, current multi‑sensor fusion methods are based on stovepipe architectures optimized with custom fusion strategies for specific sensors. Seeking to develop adaptable navigation that allows rapid integration of any combination of sensors to obtain robust and high‑precision navigation solutions in GNSS‑denied environments, we propose a generic plug‑and‑play fusion strategy to estimate land vehicle states. The proposed strategy handles different sensors in a plug‑and‑play manner: sensors are abstracted and represented by generic models, which allows rapid reconfiguration whenever a sensor signal is added or lost during operation. Relative estimates are fused with absolute measurements based on an improved factor graph, which includes the sensors’ error parameters in the nonlinear optimization so that the sensors are calibrated online. We evaluate the performance of our approach using a land vehicle equipped with a global positioning system (GPS) receiver as well as an inertial measurement unit (IMU), a camera, a wireless sensor and an odometer. GPS is not integrated into the system but treated as ground truth. Results are compared with the most common filtering‑based fusion algorithm. They show that our strategy can process low‑quality input sources in a plug‑and‑play and robust manner and that it outperforms the filtering‑based method in GNSS‑denied environments.


  • 0 Introduction

    One of the essential technologies that ensure the reliable operation of land vehicles is navigation. Current land vehicles rely heavily on the global navigation satellite system (GNSS). However, when land vehicles run in dense or even GNSS‑denied environments, the GNSS signal degrades or fails entirely to locate the vehicle[1].

    When the GNSS signal is unavailable, accurate navigation solutions can be obtained by integrating multiple sensors. Multi‑sensor fusion methods have been studied in depth and widely applied in the field of land vehicles[2‑4]. However, these navigation systems are based on stovepipe architectures[5], customized for specific sensors and measurement sources, which incurs large costs whenever the navigation system requires changes or updates. To change existing fusion architectures, the Defense Advanced Research Projects Agency (DARPA), USA, launched the All Source Positioning and Navigation (ASPN) project in 2010[6]. ASPN aims to develop adaptable navigation that allows rapid integration of any combination of sensors, enabling low‑cost, seamless navigation solutions for military users on any operational platform and in any environment. Many researchers have since worked on ASPN.

    On the software side, Elsner and Juang designed plug‑and‑play multi‑sensor fusion schemes based on the robot operating system (ROS)[7‑8]. For the fusion architectures and algorithms, filtering‑based estimation methods are mostly used. Soloviev et al. proposed the reconfigurable integration filtering engine (RIFE), in which various sensors are represented by generic classes. Each class is defined by the type of sensor measurement, and the filter can be reconfigured by instantiating a sensor object whenever a new sensor is connected to the system[9]. Lynen et al. proposed the multi‑sensor fusion extended Kalman filter (MSF‑EKF) to process time‑delayed, relative and absolute measurements from a theoretically unlimited number of different sensors; its modular design allows seamless handling of added or lost sensor signals[10]. Groves proposed a modular integrated sensor‑fusion architecture in which different subsystems are constructed to process and integrate different sources[11]. Zhu et al. presented a goal‑driven sensor configuration: CPU time, power, and weight are combined to reconfigure the sensor suite, and all chosen measurements are integrated using an EKF[12]. Although the above research has achieved satisfactory results, the filtering‑based methods all restrict the state vector to the most recent state and marginalize out all old information, which leads to suboptimal performance[13‑14]. In contrast, a graphical model known as the factor graph represents the information fusion problem as a graph‑based nonlinear least‑squares optimization, encoding the connectivity between the unknown variable nodes and the received measurements. Multi‑sensor fusion methods based on factor graphs can handle delayed and asynchronous sources in a flexible way, because past states are kept during the global optimization[15], and they outperform the EKF because of the re‑linearization process[16]. Chiu et al. proposed a constrained optimal selection of sensors based on the factor graph, where the optimal subsets of sensors are selected according to available resources, navigation accuracy and an observability index[17]. Considering real‑time application, Merfels et al. proposed a sliding‑window factor graph method for autonomous vehicles[18]. Watson et al. evaluated the effectiveness of robust optimization techniques within the factor graph framework, showing that the factor graph algorithm in conjunction with robust optimization can achieve reasonable performance in GNSS‑degraded environments[19]. However, the above research is still built around custom fusion solutions, which is inadequate for the flexibility and extensibility needs of land‑vehicle navigation systems.

    Seeking to develop adaptable navigation that allows rapid integration of any combination of sensors for seamless, robust and accurate navigation solutions in GNSS‑denied environments, we propose a generic plug‑and‑play fusion strategy based on the factor graph for land vehicles. The strategy is designed using abstraction: various abstract sensor models are designed by sensor type rather than for a specific sensor. When a sensor is connected to the navigation system, its specific model is built from the abstract model and its error registration is performed. The proposed strategy allows rapid reconfiguration of any combination of sensors, and its modularity keeps the fusion architecture flexible and extensible to new sensors and new capabilities. In addition, time‑delayed sensor data, which present low‑quality characteristics, can be processed in a natural way by the improved factor graph, in which the error parameters of the sensors are added to the graph model so that the sensors are calibrated online. We evaluate the performance of the proposed strategy using a land vehicle equipped with heterogeneous sensors. The results show that our strategy can process low‑quality data in a plug‑and‑play and robust manner and that it outperforms the most common filtering‑based method.

  • 1 Generic Sensor Fusion Strategy

    The proposed strategy is shown in Fig.1. It consists of three parts: the preprocessing layer, the abstracting layer and the fusing layer.

    Fig.1 Generic multi‑sensor fusion strategy

  • 1.1 Preprocessing layer

    In the preprocessing layer, raw measurement sources are processed into usable navigation information. When a sensor is connected to the system, it is recognized and a corresponding ID is attached to its source. Data conversion is then conducted according to the specific sensor type; for example, camera images are converted into pose estimates. Because sensors are mounted at different locations on the vehicle, the spatial offsets among them, obtained from an offline calibration, are compensated during space‑time alignment, and time stamping is performed in this step. Relative and absolute measurements are also aligned by transformation between the different frames.
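
    As an illustration, a minimal Python sketch of this layer follows. All names (`Measurement`, `preprocess`, the lever‑arm handling) are our assumptions for illustration, not the paper's implementation; the full layer additionally performs sensor recognition and frame transformations.

```python
# Sketch of the preprocessing layer: tag a raw sample with its sensor ID,
# time-stamp it, and compensate the offline-calibrated mounting offset.
# Names and the simplified offset correction are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional
import time

import numpy as np


@dataclass
class Measurement:
    sensor_id: str     # ID attached when the sensor is recognized
    stamp: float       # time stamp assigned in this layer
    value: np.ndarray  # converted navigation information (e.g., a position)


def preprocess(sensor_id: str, raw: np.ndarray, lever_arm: np.ndarray,
               stamp: Optional[float] = None) -> Measurement:
    """Convert a raw sample into usable navigation information.

    lever_arm is this sensor's spatial offset in the body frame, obtained
    from offline calibration; subtracting it stands in for the full
    space-time alignment.
    """
    stamp = time.time() if stamp is None else stamp
    return Measurement(sensor_id, stamp, np.asarray(raw) - lever_arm)


# Example: a camera position estimate re-expressed at the body origin.
m = preprocess("camera_0", np.array([10.2, 3.1, 0.0]),
               lever_arm=np.array([0.5, 0.0, 1.2]))
```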

  • 1.2 Abstracting layer

    In the abstracting layer, various abstract sensor models are designed according to sensor type. The layer consists of four abstract models: the dead‑reckoning model, the position model, the velocity model, and the attitude model. The specific model of a sensor is instantiated from these templates by identifying the incoming information's ID. Sensor error registration is also conducted here; for example, a sensor's specific noise and error parameters are added to the built model.

    The dead‑reckoning model represents recursive sensors, such as inertial or other dead‑reckoning sensors. Its abstract model can be conceptually described by the continuous nonlinear differential equation

    $\dot{x} = f_{\mathrm{DR}}(x, \alpha, \Delta)$
    (1)

    where $x$ is the navigation state, representing the vehicle's position, attitude and velocity; $\Delta$ the increment of the vehicle measured by the sensor; and $\alpha$ the calculated model of the sensor errors. The other models represent sensors that provide other measurement information, that is, position, velocity or attitude. Their abstract models can be described in a unified way as

    $z = h_M(x) + n$
    (2)

    where $x$ is the navigation state, representing the vehicle's position, attitude and velocity; $z$ the information measured by the sensor; and $n$ the measurement noise, assumed to be zero‑mean Gaussian. $h_M$ is the measurement function relating the measurement to the navigation state.
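
    A minimal sketch of the abstracting layer follows; the class and registry names are hypothetical. Each abstract model fixes the measurement function $h_M$ of Eq.(2), a concrete model is instantiated from the template matching the source's ID, and error registration amounts to storing the sensor's noise parameters in the built model.

```python
# Sketch of the abstracting layer: abstract models are templates keyed by
# measurement type; a concrete sensor model is instantiated by source ID.
# Class and registry names are illustrative assumptions.
from abc import ABC, abstractmethod

import numpy as np


class AbstractSensorModel(ABC):
    def __init__(self, noise_cov: np.ndarray):
        self.noise_cov = noise_cov  # error registration: sensor-specific noise

    @abstractmethod
    def h(self, x: np.ndarray) -> np.ndarray:
        """Measurement function h_M of Eq.(2), mapping state to measurement."""


class PositionModel(AbstractSensorModel):
    def h(self, x):
        return x[0:3]  # z = position part of the state (+ noise n)


class VelocityModel(AbstractSensorModel):
    def h(self, x):
        return x[3:6]  # z = velocity part of the state (+ noise n)


# Template registry: an incoming source's type/ID selects its abstract model.
TEMPLATES = {"GPS": PositionModel, "UWB": PositionModel, "ODO": VelocityModel}


def instantiate(sensor_type: str, noise_cov: np.ndarray) -> AbstractSensorModel:
    return TEMPLATES[sensor_type](noise_cov)
```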

  • 1.3 Fusing layer

    In the fusing layer, a nonlinear optimization based on the factor graph is formulated. A factor graph is a bipartite graph $G = (\mathcal{F}, \mathcal{X}, \mathcal{E})$ with two types of nodes: factor nodes $f_i \in \mathcal{F}$ and variable nodes $x_i \in \mathcal{X}$. Edges $e_{ij} \in \mathcal{E}$ exist only between factor nodes and variable nodes, and an edge is present if and only if the factor $f_i$ involves the variable $x_i$. The factor graph $G$ defines one factorization of the function $f(X)$ as

    $f(X) = \prod_i f_i(X_i)$
    (3)

    where $X_i$ is the set of all variables $x_i$ connected by an edge to the factor $f_i$[20].

    A factor describes an error between the predicted and actual measurements. Assuming a Gaussian noise model, a measurement factor can be written as

    $f_i(X_i) = d\left[h_i(X_i) - z_i\right]$
    (4)

    where $h_i(X_i)$ is the measurement model as a function of the state variables $X_i$; $z_i$ the actual measurement; and $d(\cdot)$ a cost function, taken as the squared Mahalanobis distance $d(e) \triangleq e^{\mathrm{T}} \Sigma^{-1} e$, with $\Sigma$ being the measurement covariance. Process models can be represented by factors in a similar manner.

    The overall graph error implied by Eq.(3) is minimized by adjusting the estimates of the variables $X$. The optimal estimate is the one that minimizes the error of the entire graph[21]

    $\hat{X} = \arg\min_{X}\sum_i f_i(X_i)$
    (5)

    Different sensor information is added to the factor graph as variable and factor nodes. Time‑delayed and asynchronous measurements can be incorporated into the factor graph in a natural way, leading to better estimates of the current states.
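
    To make Eqs.(3)—(5) concrete, the toy sketch below (our illustration, not the paper's solver) builds a 1‑D graph over three states with two relative (dead‑reckoning) factors and one absolute position factor, and minimizes the stacked whitened residuals with SciPy; the squared norm of that residual vector is exactly the sum of squared Mahalanobis distances of Eq.(4).

```python
# Toy factor graph solved as nonlinear least squares (Eq.(5)).
# Measurement values and noise levels are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

rel_meas = [1.0, 1.1]           # measured increments x1 - x0 and x2 - x1
abs_index, abs_value = 2, 2.05  # one absolute position measurement of x2
sigma_rel, sigma_abs = 0.1, 0.05


def residuals(X):
    # Relative factors: whitened prediction error of each increment.
    r = [(X[k + 1] - X[k] - dz) / sigma_rel for k, dz in enumerate(rel_meas)]
    # Absolute factor: anchors the chain so the solution is unique.
    r.append((X[abs_index] - abs_value) / sigma_abs)
    return np.asarray(r)


sol = least_squares(residuals, x0=np.zeros(3))  # optimize all past states jointly
print(sol.x)  # estimates consistent with both relative and absolute factors
```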

  • 2 An Improved Sensor Fusion Method for Land Vehicles Based on Factor Graph

    The structure of the improved multi‑sensor fusion method is shown in Fig.2. Within the factor graph framework, sensor errors are added to the graph model and included in the global optimization. The optimized error parameters are then used to calibrate the sensor measurements, and owing to this online calibration, better estimates of the whole trajectory can be obtained.

    Fig.2 Structure of the improved fusion method

    Considering the most common sensors in typical land‑vehicle navigation applications, the improved factor graph for land vehicles is built as shown in Fig.3. The considered sensors are the IMU, GPS, odometer, visual sensor, and wireless sensor. A GPS factor is built into the graph model so that the method remains adaptive to various applications; however, the GPS signal is not fused with the other sensors but used as ground truth in the field tests to demonstrate the performance of the proposed algorithm in a GNSS‑denied environment.

    Fig.3 Improved factor graph for land vehicles

    The sensors’ error parameters are added to the graph for the global optimization. In Fig.3, black hollow circles denote navigation states and $f_{\mathrm{IMU}}$ denotes the IMU factor. Jasper hollow circles denote the IMU bias, which is introduced at a lower frequency than the navigation states because it changes slowly during operation. Blue solid circles denote the odometer factor, while grey hollow circles represent the scale factor error of the odometer. Red, yellow and purple solid circles denote the visual odometry, wireless sensor, and GPS factors, respectively, and green hollow circles represent the scale error of the camera. The navigation states of the land vehicle and the error parameters of the sensors are optimized together to improve estimation accuracy, and the error parameters are used to correct the corresponding measurements. The sensor factors are built as follows.

  • 2.1 IMU factor

    The IMU factor connects the navigation states at two sequential times. Considering times $k$ and $k+1$, the IMU factor is derived as

    $f_{\mathrm{IMU}}(x_{k+1}, x_k, \alpha_k) \triangleq d\left(x_{k+1} - h(x_k, \alpha_k, z_k)\right)$
    (6)

    where $x_{k+1}$ and $x_k$ are the navigation states at times $k+1$ and $k$, respectively; $z_k = (a_k, \omega_k)$ the given IMU measurements, that is, the acceleration and angular rate; and $\alpha_k$ the bias of the inertial sensor, which is estimated so that the IMU data can be corrected. An Euler‑integration prediction function with noise is adopted for $h(\cdot)$. In the same way, the bias factor can be described as

    $f_{\mathrm{bias}}(\alpha_{k+1}, \alpha_k) \triangleq d\left(\alpha_{k+1} - g(\alpha_k)\right)$
    (7)

    where $\alpha_{k+1}$ and $\alpha_k$ are the biases at times $k+1$ and $k$, respectively. The bias is modelled as a constant error.
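
    A sketch of Eqs.(6), (7) follows, assuming for illustration a 1‑D state (position, velocity), a single accelerometer bias, and the constant‑bias model $g(\alpha_k) = \alpha_k$; the actual system uses full 3‑D states with attitude and angular‑rate biases.

```python
# Sketch of the IMU and bias factors as whitened residuals (Eqs.(6), (7)).
# The 1-D state and scalar bias are illustrative simplifications.
import numpy as np


def h_imu(x_k, alpha_k, z_k, dt):
    """Euler-integration prediction h(): z_k is the measured acceleration,
    alpha_k its estimated bias."""
    p, v = x_k
    a = z_k - alpha_k                       # bias-corrected acceleration
    return np.array([p + v * dt, v + a * dt])


def f_imu(x_k1, x_k, alpha_k, z_k, dt, sigma):
    return (x_k1 - h_imu(x_k, alpha_k, z_k, dt)) / sigma  # Eq.(6)


def f_bias(alpha_k1, alpha_k, sigma_b):
    return (alpha_k1 - alpha_k) / sigma_b   # Eq.(7) with g(a) = a (constant bias)
```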

  • 2.2 Odometer factor

    The odometer provides velocity information, and its factor can be represented as

    $f_{\mathrm{ODO}}(x_k, \beta_k) \triangleq d\left(z_k^{\mathrm{ODO}} - h_{\mathrm{ODO}}(x_k, \beta_k)\right)$
    (8)

    where $z_k^{\mathrm{ODO}}$ and $x_k$ are the odometer velocity and the navigation state at time $k$, and $\beta_k$ is the scale factor error, which is estimated so that the odometer data can be corrected. In the same way, the scale‑factor‑error factor can be derived as

    $f_{\mathrm{scale}}(\beta_{k+1}, \beta_k) \triangleq d\left(\beta_{k+1} - g(\beta_k)\right)$
    (9)

    where $\beta_{k+1}$ and $\beta_k$ are the scale factor errors at times $k+1$ and $k$, respectively. The scale factor error is modelled as a constant error.

  • 2.3 GPS factor

    The GPS factor provides an absolute position and can be modelled as

    $f_{\mathrm{GPS}}(x_k) \triangleq d\left(z_k^{\mathrm{GPS}} - h_{\mathrm{GPS}}(x_k)\right)$
    (10)

    where $z_k^{\mathrm{GPS}}$ and $x_k$ are the GPS position and the navigation state at time $k$.

  • 2.4 Wireless sensor factor

    The wireless sensor provides ranging information to base stations. When it receives ranging measurements to at least three base stations whose positions are known in advance, it can provide a position in the given frame, and its factor can be modelled as

    $f_{\mathrm{WS}}(x_k) \triangleq d\left(z_k^{\mathrm{WS}} - h_{\mathrm{WS}}(x_k)\right)$
    (11)

    where $z_k^{\mathrm{WS}}$ and $x_k$ are the wireless sensor position and the navigation state at time $k$.

  • 2.5 Visual sensor factor

    The visual sensor provides relative position when a visual odometry (VO) algorithm is used. After the relative and absolute measurements are aligned, it provides pose information in the global frame. Its factors can be represented as

    $f_{\mathrm{VO}}^{P}(x_k, \lambda_k) \triangleq d\left(z_k^{\mathrm{VOP}} - h_{\mathrm{VOP}}(x_k, \lambda_k)\right)$
    (12)
    $f_{\mathrm{VO}}^{H}(x_k) \triangleq d\left(z_k^{\mathrm{VOH}} - h_{\mathrm{VOH}}(x_k)\right)$
    (13)

    where $z_k^{\mathrm{VOP}}$ and $x_k$ are the visual sensor position and the navigation state at time $k$; $z_k^{\mathrm{VOH}}$ the yaw of the visual sensor at time $k$; and $\lambda_k$ the scale error, modelled as a constant error. Its factor can be represented as

    $f_{\mathrm{scale}}(\lambda_{k+1}, \lambda_k) \triangleq d\left(\lambda_{k+1} - g(\lambda_k)\right)$
    (14)

    where $\lambda_{k+1}$ and $\lambda_k$ are the scale errors at times $k+1$ and $k$, respectively.
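
    The measurement factors of Eqs.(8)—(14) all share the residual form of Eq.(4). The sketch below (our illustration) writes them as whitened residual functions for a simplified state $x = (p, v, \psi)$, assuming the odometer and visual scale errors enter the measurement models multiplicatively.

```python
# Sketch of the odometer, GPS/UWB and visual factors as whitened residuals.
# The scalar state layout and multiplicative scale errors are assumptions.
import numpy as np


def f_odo(x, beta, z_odo, sigma):      # Eq.(8): velocity with scale error beta
    p, v, psi = x
    return (z_odo - (1.0 + beta) * v) / sigma


def f_abs_pos(x, z_pos, sigma):        # Eqs.(10), (11): GPS or UWB position
    p, v, psi = x
    return (z_pos - p) / sigma


def f_vo_pos(x, lam, z_vop, sigma):    # Eq.(12): VO position with scale error
    p, v, psi = x
    return (z_vop - (1.0 + lam) * p) / sigma


def f_vo_yaw(x, z_voh, sigma):         # Eq.(13): VO heading
    p, v, psi = x
    return (z_voh - psi) / sigma


def f_scale(e_k1, e_k, sigma_s):       # Eqs.(9), (14): constant scale error
    return (e_k1 - e_k) / sigma_s
```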

  • 3 Experiment

    In the field tests, we use a land vehicle equipped with a GPS receiver as well as an IMU, a stereo camera, a UWB module (a kind of wireless sensor) and an odometer. The land vehicle is shown in Fig.4. The GPS receiver provides centimetre‑level positioning when operating in real‑time kinematic (RTK) mode and is treated as ground truth; it is not integrated into the navigation system and serves only to evaluate the performance of the proposed strategy in a GNSS‑denied environment. The data acquisition module is designed based on ROS.

    Fig.4 Land vehicle used in the field test

    The trajectory of the field test is shown in Fig.5 on a Google map. The starting point is marked with a star and arrows show the driving direction. Each colour of the trajectory marks the section in which a particular combination of sensors is integrated into the navigation system, because some sensors are available only in specific circumstances. For example, the red section is surrounded by base stations, and the UWB is available only there; the roadway in the blue section is an area with sparse features, which leaves the camera unusable, so it is not integrated into the navigation system. In the test, the different information sources are integrated into the system whenever they become available.

    Fig.5 Field test on a general road of the NUAA campus

    When a sensor is connected to the system, its specific model is constructed and the corresponding factors are added to the factor graph. Time‑delayed and asynchronous measurements can then be fused in the factor graph in a truly plug‑and‑play manner, since past states are kept for the global optimization.

    We compare our results with the most common filtering‑based method, the EKF. The drawback of a basic EKF is that linearization happens only once, which can lower performance. The EKF is also sensitive to time‑delayed measurements, which present low‑quality characteristics, because states cannot be propagated backwards in the filter. To evaluate the impact of low‑quality information on the EKF, we add time delays and noise to the sensor data at different times, which is equivalent to injecting faults into the data. The time delay is set to 1 s, which is large enough to excite the errors.
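
    A sketch of this fault injection follows (the helper name and the noise level are our assumptions): each sample keeps its measurement time stamp but is delivered 1 s late, with Gaussian noise added to its value.

```python
# Sketch of the fault injection: delay delivery by 1 s and corrupt the values.
# The helper name and noise level are illustrative assumptions.
import numpy as np


def delay_and_corrupt(stamps, values, delay=1.0, noise_std=0.5, rng=None):
    """Return (arrival_times, noisy_values): measurements keep their original
    time stamps but arrive `delay` seconds late with added Gaussian noise."""
    rng = np.random.default_rng() if rng is None else rng
    arrival = np.asarray(stamps, dtype=float) + delay
    noisy = np.asarray(values, dtype=float) + rng.normal(0.0, noise_std,
                                                         np.shape(values))
    return arrival, noisy
```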

    The trajectory comparison is shown in Fig.6, and the east and north position error comparisons are plotted in Figs.7, 8, in which the time periods with faulted VO measurements are marked with dashed lines. Both fusion methods drift slowly, as there is no absolute position measurement most of the time. The position errors of the EKF increase sharply during the periods when faulted data are fused. In contrast, since past states are kept in the global optimization, delayed information can be added to the graph model according to its time stamp in a plug‑and‑play way, leading to better estimates of the current states. The root‑mean‑square (RMS) position errors are listed in Table 1.

    Fig.6 Trajectory comparison

    Fig.7 East position error comparison

    Fig.8 North position error comparison

    Table 1 RMS comparison (m)

    RMS               East     North
    EKF               10.78    13.80
    Proposed method    5.99     9.44
  • 4 Conclusions

    We propose a generic plug‑and‑play multi‑sensor fusion strategy for land vehicles in GNSS‑denied environments. The strategy handles different sensors in a flexible way, as sensors are represented by generic models. Relative estimates are fused with absolute measurements based on an improved factor graph, in which the sensors’ error parameters are added to the graph optimization so that the sensors are calibrated online. We demonstrate the performance of the system through field tests. The results show that the traditional filtering method is heavily influenced by low‑quality sensor data, whereas our strategy processes time‑delayed input sources in a plug‑and‑play and robust manner and outperforms the EKF in GNSS‑denied environments.

    In future work, the integrated quality of the measurements, not restricted to sensor accuracy, will be considered to measure each sensor's confidence level in the fusion process, further improving the robustness and accuracy of the system.

  • References

    • 1

      MA H, SMART E, AHMED A, et al. Radar image‑based positioning for USV under GPS denial environment[J]. IEEE Transactions on Intelligent Transportation Systems, 2017, 19(1): 72‑80.

    • 2

      LI X, XU Q. A reliable fusion positioning strategy for land vehicles in GPS‑denied environments based on low‑cost sensors[J]. IEEE Transactions on Industrial Electronics, 2017, 64(4): 3205‑3215.

    • 3

      SUHR J K, JANG J, MIN D, et al. Sensor fusion‑based low‑cost vehicle localization system for complex urban environments[J]. IEEE Transactions on Intelligent Transportation Systems, 2017, 18(5): 1078‑1086.

    • 4

      LIN F, WANG H, WANG W, et al. Vehicle state and parameter estimation based on dual unscented particle filter algorithm[J]. Transactions of Nanjing University of Aeronautics and Astronautics, 2014, 31(5): 568‑575.

    • 5

      KAUFFMAN K. Field demonstration of plug and play navigation system using Scorpion and smart sensors/cables[C]//Joint Navigation Conference. Dayton: ION, 2017.

    • 6

      TOMPKINS S. DARPA‑BAA‑11‑14: All source positioning and navigation[EB/OL]. (2010‑1‑1)[2019‑3‑15]. https://www.fbo.gov/index?s=opportunity&mode=form&id=cef82a45ab7d64a5445eadf277f13dbe&tab=core&_cview=1.

    • 7

      ELSNER D L. Universal plug‑n‑play sensor integration for advanced navigation[D]. Dayton, USA: Air Force Institute of Technology, 2012.

    • 8

      JUANG J C, HSIEH W L, CHANG C C. Development of a plug‑and‑play ROS‑based land vehicular navigation suite[C]//30th International Technical Meeting of the Satellite Division of the Institute of Navigation. Portland: ION, 2017: 1959‑1963.

    • 9

      SOLOVIEV A, YANG C. Reconfigurable integration filtering engine (RIFE) for plug‑and‑play (PnP) navigation[C]//26th International Technical Meeting of The Satellite Division of the Institute of Navigation. Nashville: ION, 2013: 16‑20.

    • 10

      LYNEN S, ACHTELIK M W, WEISS S, et al. A robust and modular multi‑sensor fusion approach applied to MAV navigation[C]//2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Tokyo: IEEE, 2013: 3923‑3929.

    • 11

      GROVES P D. The complexity problem in future multisensor navigation and positioning systems: A modular solution[J]. The Journal of Navigation, 2014, 67(2): 311‑326.

    • 12

      ZHU Z, ADAMS K, VENABLE D, et al. Goal‑driven sensor configuration in a navigation system[C]//2016 IEEE/ION Position, Location and Navigation Symposium (PLANS). Savannah: ION, 2016: 527‑534.

    • 13

      STRASDAT H, MONTIEL J M, DAVISON A. Visual SLAM: Why filter?[J]. Image and Vision Computing, 2012, 30(2): 65‑77.

    • 14

      MASCARO R, TEIXEIRA L, HINZMANN T, et al. GOMSF: Graph‑optimization based multi‑sensor fusion for robust UAV pose estimation[C]//2018 IEEE International Conference on Robotics and Automation (ICRA). Brisbane: IEEE, 2018: 1421‑1428.

    • 15

      QIN T, CAO S, PAN J, et al. A general optimization‑based framework for global pose estimation with multiple sensors[EB/OL]. (2019‑1‑11)[2019‑3‑10]. https://arxiv.org/abs/1901.03638?context=cs.

    • 16

      INDELMAN V, WILLIAMS S, KAESS M, et al. Information fusion in navigation systems via factor graph based incremental smoothing[J]. Robotics and Autonomous Systems, 2013, 61(8): 721‑738.

    • 17

      CHIU H P, ZHOU X S, CARLONE L, et al. Constrained optimal selection for multi‑sensor robot navigation using plug‑and‑play factor graphs[C]//2014 IEEE International Conference on Robotics and Automation (ICRA). Hong Kong: IEEE, 2014: 663‑670.

    • 18

      MERFELS C, STACHNISS C. Pose fusion with chain pose graphs for automated driving[C]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Daejeon: IEEE, 2016: 3116‑3123.

    • 19

      WATSON R M, GROSS J N. Robust navigation in GNSS degraded environment using graph optimization[EB/OL]. (2018‑6‑23)[2019‑3‑10]. https://arxiv.org/abs/1806.08899.

    • 20

      KSCHISCHANG F R, FREY B J, LOELIGER H‑A. Factor graphs and the sum‑product algorithm[J]. IEEE Transactions on Information Theory, 2001, 47(2): 498‑519.

    • 21

      DELLAERT F, KAESS M. Square root SAM: Simultaneous localization and mapping via square root information smoothing[J]. The International Journal of Robotics Research, 2006, 25(12): 1181‑1203.

  • Author contributions & Acknowledgements

    Prof. LAI Jizhou designed the study and guided the experiments. Mr. BAI Shiyu conducted the analysis and wrote the manuscript. Mr. XU Xiaowei participated in the experiments. Dr. LÜ Pin contributed to the discussion and background of the study. All authors commented on the draft and approved the submission.

    Acknowledgements: This work was partially supported by the National Natural Science Foundation of China (No. 61703207), the Jiangsu Provincial Natural Science Foundation of China (No. BK20170801), the Aeronautical Science Foundation of China (No. 2017ZC52017), the Jiangsu Provincial Six Talent Peaks Project (No. 2015‑XXRJ‑005), and the Jiangsu Province Qing Lan Project.

    Competing Interests

    The authors declare no competing interests.

LAI Jizhou

Affiliation: Key Laboratory of Navigation, Control and Health‑Management Technologies of Advanced Aerocraft, Ministry of Industry and Information Technology, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, P. R. China

Role: Corresponding author

Email: laijz@nuaa.edu.cn

BAI Shiyu

Affiliation: Key Laboratory of Navigation, Control and Health‑Management Technologies of Advanced Aerocraft, Ministry of Industry and Information Technology, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, P. R. China

Profile: BAI Shiyu is currently a Ph.D. candidate at Nanjing University of Aeronautics and Astronautics. His research is focused on multi‑sensor fusion.

XU Xiaowei

Affiliation: Key Laboratory of Navigation, Control and Health‑Management Technologies of Advanced Aerocraft, Ministry of Industry and Information Technology, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, P. R. China

Profile: XU Xiaowei is currently a Ph.D. candidate at Nanjing University of Aeronautics and Astronautics. His research is focused on downhole autonomous navigation.

LÜ Pin

Affiliation: Key Laboratory of Navigation, Control and Health‑Management Technologies of Advanced Aerocraft, Ministry of Industry and Information Technology, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, P. R. China

Profile: LÜ Pin received the Ph.D. degree in navigation, guidance and control from Nanjing University of Aeronautics and Astronautics, Nanjing, China, in 2015. He is currently a lecturer at Nanjing University of Aeronautics and Astronautics. His research is focused on rotational inertial navigation and dynamic model assisted navigation.

ZHANG Bei

Role: Chinese editor

