Motion Estimation
One of the key issues in autonomous mobile robotics is to keep track of the
robot's position. Usually this problem is addressed by using the on-board
sensors to gather information about the environment for localization and
mapping purposes. Many applications in robotics use techniques to estimate
the robot displacement between successive range measurements. The objective
of scan matching techniques is to compute the relative motion of a vehicle
between two consecutive configurations by maximizing the overlap between
the range measurements obtained at each configuration. They usually assume
an initial estimate of the relative pose of the scans, provided by the
vehicle odometry.
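As an aside, here is a minimal sketch of how such an odometry-based initial estimate is typically used, assuming the guess is given as a planar pose (tx, ty, theta) and that scans are stored as N x 2 NumPy arrays (both assumptions of this sketch, not details from the paper): the new scan is simply expressed in the reference frame before matching begins.

import numpy as np

def apply_initial_estimate(scan_new, tx, ty, theta):
    # Express the new scan in the reference scan's frame using the
    # odometric guess of the relative pose (a rigid-body transformation).
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return scan_new @ R.T + np.array([tx, ty])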
Our contribution
resides in the definition of a new distance measure in the image space of
the sensor that takes into account both translation and rotation at the
same time. The distance between two points is the norm (in a sense we will
define) of the smallest rigid-body transformation that maps one point onto
the other; that is, our distance naturally depends on both translation and
rotation. We use this distance in both steps of the ICP algorithm:
- Matching of each point of one scan with the closest feature of the other
scan in terms of our distance.
- Computation of the relative displacement by least-squares minimization of
the errors (in terms of our distance).
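The following is an illustrative sketch, not the implementation from the paper, of both steps in Python/NumPy. The weighting constant L that trades translation against rotation, the grid search over the rotation angle, and the use of a generic numerical optimizer (Nelder-Mead) for the least-squares step are assumptions made purely for this sketch; a real implementation would use more efficient, possibly closed-form, expressions.

import numpy as np
from scipy.optimize import minimize

L = 3.0  # assumed weighting (metres per radian) between translation and rotation

def rot(theta):
    # 2D rotation matrix.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def point_distance(p, q, thetas=np.linspace(-np.pi, np.pi, 721)):
    # Norm of the smallest rigid-body transformation mapping p onto q.
    # Any transformation (t, theta) with rot(theta) @ p + t == q has
    # t = q - rot(theta) @ p, so it suffices to minimise
    #     ||q - rot(theta) @ p||^2 + L^2 * theta^2
    # over theta (here with a simple grid search for clarity).
    a = p @ q                      # dot(p, q)
    b = p[0] * q[1] - p[1] * q[0]  # cross(p, q)
    f = (p @ p + q @ q - 2.0 * (a * np.cos(thetas) + b * np.sin(thetas))
         + (L * thetas) ** 2)
    return np.sqrt(max(f.min(), 0.0))

def match(scan_new, scan_ref):
    # Step 1: associate each point of the new scan with the closest point
    # of the reference scan in terms of the metric distance (brute force).
    pairs = []
    for q in scan_new:
        d = [point_distance(q, p) for p in scan_ref]
        pairs.append((q, scan_ref[int(np.argmin(d))]))
    return pairs

def estimate_motion(pairs, x0=np.zeros(3)):
    # Step 2: least-squares estimate of the displacement (tx, ty, theta)
    # minimising the sum of squared metric distances between the
    # transformed new points and their matches.
    def cost(x):
        R, t = rot(x[2]), x[:2]
        return sum(point_distance(R @ q + t, p) ** 2 for q, p in pairs)
    return minimize(cost, x0, method="Nelder-Mead").x

Iterating these two steps, re-applying the estimated displacement to the new scan each time until the estimate stops changing, gives the metric-based ICP loop described above.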
With this formulation we obtain results that clearly improve, in terms of
robustness and precision, on the algorithm we were using previously
(proposed by Lu and Milios in 1997 and the most widely used algorithm for
scan matching). Furthermore, we present in the paper the extension to the
3D problem, which could be of use to the robotics, computer vision and
graphics communities that use the ICP algorithm to address sensor motion
estimation, localization and map building, object recognition, pattern
analysis, image registration, and scene understanding, among others.
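For the 3D case mentioned above, the same idea carries over: the distance between two points is again the norm of the smallest rigid-body transformation mapping one onto the other, now minimised over a 3D rotation. The sketch below parameterises the rotation by an axis-angle vector r (so the rotation angle is ||r||) and uses a generic numerical minimiser; the weighting factor L and the optimiser are assumptions of this sketch rather than details from the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

L = 3.0  # assumed translation/rotation weighting

def point_distance_3d(p, q):
    # min over (R, t) with R @ p + t == q of ||t||^2 + L^2 * angle(R)^2,
    # where R is parameterised by the axis-angle vector r and angle(R) = ||r||.
    def cost(r):
        Rm = Rotation.from_rotvec(r).as_matrix()
        return np.sum((q - Rm @ p) ** 2) + (L ** 2) * np.sum(r ** 2)
    res = minimize(cost, np.zeros(3))
    return np.sqrt(max(res.fun, 0.0))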