Modeling Dynamic Environments for Local Sensor-Based Navigation
This page only provides pictures, animations and links to videos. For technical details, see the thesis and publications. References to the relevant papers will be added soon.
Particle filters better approximate the non-linear distribution. The non-linearities arise from the observation process and, especially, from the data association problem. The histogram on the right illustrates the bi-modal distribution along the x-axis. An example of tracking moving objects with a laser sensor using JPDA filters: EKF-based (video 1) and particle-filter-based (video 2).
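To illustrate why the particle representation matters here, the toy sketch below (not the thesis implementation; the function name and all parameters are invented) propagates a 1-D particle set through one update in which the data association is ambiguous between two candidate measurements. The mixture likelihood keeps both modes alive, whereas an EKF would have to collapse the posterior into a single Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, z_candidates, sigma=0.5):
    """One SIR particle-filter step with ambiguous data association:
    the likelihood is a mixture over the candidate measurements, so
    the posterior can remain bi-modal."""
    # Predict: constant-position motion model with process noise.
    particles = particles + rng.normal(0.0, 0.2, size=particles.shape)
    # Update: every candidate association contributes one mixture term.
    lik = np.zeros_like(particles)
    for z in z_candidates:
        lik = lik + np.exp(-0.5 * ((particles - z) / sigma) ** 2)
    weights = weights * lik
    weights = weights / weights.sum()
    # Resample to fight weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

particles = rng.normal(0.0, 3.0, 1000)
weights = np.full(1000, 1.0 / 1000)
# Two plausible associations for the same track -> bi-modal posterior in x.
particles, weights = pf_update(particles, weights, z_candidates=[-2.0, 2.0])
```

After the update, roughly half of the particles cluster around each candidate measurement; a Gaussian filter on the same data would have to commit to one association or average them.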
Improved versions of the ICP algorithm: Metric-Based ICP and probabilistic ICP
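For reference, a minimal point-to-point ICP in 2-D is sketched below; this is the classic baseline that the metric-based and probabilistic variants improve on (by changing the distance measure and the correspondence model, respectively). The function name and the test geometry are illustrative.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Classic point-to-point ICP: alternate nearest-neighbour matching
    with a closed-form least-squares rigid alignment (Kabsch/SVD)."""
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # Correspondences: nearest dst point for every current src point.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # Closed-form rigid transform between the matched sets.
        mc, md = cur.mean(0), matched.mean(0)
        H = (cur - mc).T @ (matched - md)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        Ri = Vt.T @ D @ U.T          # rotation (reflection-safe)
        ti = md - Ri @ mc            # translation
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti   # accumulate the total transform
    return R, t

# Recover a small, known rotation + translation of a synthetic scan.
rng = np.random.default_rng(1)
scan = rng.uniform(-1.0, 1.0, (80, 2))
ang = 0.05
R_true = np.array([[np.cos(ang), -np.sin(ang)],
                   [np.sin(ang),  np.cos(ang)]])
moved = scan @ R_true.T + np.array([0.05, -0.02])
R_est, t_est = icp_2d(scan, moved)
```

The nearest-neighbour/least-squares alternation is exactly the part the improved variants address: the metric-based version changes the error metric used for matching, while the probabilistic version replaces the hard correspondences with a probabilistic model.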
Pictures coming soon.
EM-BASED ROBOT DISPLACEMENT ESTIMATION AND MEASUREMENT CLASSIFICATION
The algorithm combines two maps: a grid map to represent the static parts of the environment and a dynamic map composed of a set of filters. The EM formulation of probabilistic scan matching uses these models to jointly estimate the robot displacement and the measurement classification. For videos, see the links below.
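The joint estimation can be sketched in one dimension (an illustrative toy, not the actual formulation from the thesis; the map, the noise model and all parameters are invented): the E-step computes, for each measurement, the probability that it is explained by the static map given the current displacement estimate, and the M-step re-estimates the displacement from the static-weighted residuals.

```python
import numpy as np

def em_displacement(z, map_pts, sigma=0.2, outlier_span=20.0, iters=10):
    """Toy 1-D EM scan matching: jointly estimate the displacement d
    and classify each measurement as static (Gaussian around the
    nearest map point) or dynamic (uniform outlier model)."""
    d = 0.0
    for _ in range(iters):
        # Residual of each corrected measurement w.r.t. its nearest map point.
        nearest = map_pts[np.abs((z - d)[:, None] - map_pts[None, :]).argmin(1)]
        res = z - d - nearest
        # E-step: responsibility that each measurement is static.
        p_static = np.exp(-0.5 * (res / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        p_dynamic = 1.0 / outlier_span
        w = p_static / (p_static + p_dynamic)
        # M-step: displacement that best explains the static-weighted residuals.
        d += (w * res).sum() / w.sum()
    return d, w

map_pts = np.array([0.0, 2.0, 5.0, 7.0])   # static structure (walls)
z_static = map_pts + 0.4                   # the map seen from the displaced robot
z_dynamic = np.array([3.45, 3.55])         # readings from a moving object
z = np.concatenate([z_static, z_dynamic])
d_hat, w = em_displacement(z, map_pts)
```

The static measurements end up with responsibilities near one and pull the displacement estimate to 0.4, while the readings from the moving object are classified as dynamic and effectively excluded from the match.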
From the robot's point of view.
LOCAL SENSOR-BASED NAVIGATION ARCHITECTURE
The previous model has been integrated into a hybrid architecture that selectively uses the information in each of the maps. The three main modules are:
1. Model Builder Module: constructs a model of the environment using two maps, a grid map and a set of filters.
2. Planner Module: extracts the connectivity of the free space (used to avoid cyclic motions and trap situations). It uses only the static structure of the environment.
3. Obstacle Avoidance Module: computes the collision-free motion using Nearness Diagram Navigation. It uses both the static and the dynamic maps. Collisions with dynamic objects are predicted using linear models.
The system has been evaluated on a robotic wheelchair. Some videos illustrating its performance: architecture description (video 1) and maps in a dynamic environment (video 2).
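The linear-model collision prediction mentioned for the third module can be sketched as a constant-velocity closest-approach test (an illustrative sketch, not the thesis code; the function name, the safety radius and the scenario are invented):

```python
import numpy as np

def time_to_collision(p_rob, v_rob, p_obj, v_obj, radius=0.6):
    """Collision check against a tracked object under a linear
    (constant-velocity) model: return the time at which the relative
    distance first drops below the combined safety radius, or None
    if it never does."""
    dp = np.asarray(p_obj, float) - np.asarray(p_rob, float)  # relative position
    dv = np.asarray(v_obj, float) - np.asarray(v_rob, float)  # relative velocity
    # |dp + t*dv|^2 = radius^2  ->  quadratic  a*t^2 + 2*b*t + c = 0
    a = dv @ dv
    b = dp @ dv
    c = dp @ dp - radius ** 2
    if c <= 0:
        return 0.0                    # already within the safety radius
    if a == 0:
        return None                   # no relative motion
    disc = b * b - a * c
    if disc < 0 or b >= 0:
        return None                   # paths never get close enough / receding
    return (-b - np.sqrt(disc)) / a   # first crossing time

# A pedestrian crossing the robot's intended path from the side.
t = time_to_collision(p_rob=[0, 0], v_rob=[1, 0],
                      p_obj=[4, -4], v_obj=[0, 1])
```

If the predicted time falls inside the planning horizon, the obstacle avoidance module must treat the object as an obstacle along the candidate motion.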
TOWARDS A MOBILITY AID ASSISTANT FOR COGNITIVELY DISABLED CHILDREN
Cognitively disabled children pose new challenges for the development of mobility aid systems. In addition to adapted interfaces, training sessions and games are required to teach them how to drive the vehicle (e.g. right/left, the relation between the displayed information and the real world, and the relation between the commands and the wheelchair motion, …). We collaborate with Colegio Publico Alborada.
Tactile screen
Push button
Display
According to several criteria, the school selected five children to evaluate the system. The field trials also helped us evaluate the performance of our sensor-based navigation system (especially the modeling module) under realistic conditions.
Some videos of the children driving the wheelchair: voice interface (video 1), tactile screen (video 2) and push button (coming soon).
We have studied the minimum information required to localize robots using dynamic features (e.g. the information provided by a tracking system). The minimal setup is the two-robot scenario, which matters because of visibility constraints and coverage. We consider two different types of sensors:
Laser sensor: range and bearing, but unidentifiable features.
Vision: bearing and identities, but no range.
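The two observation models can be written down directly (a sketch under stated assumptions: the (x, y, heading) state convention and the function names are ours, not from the thesis):

```python
import numpy as np

def laser_measurement(x_obs, x_tgt):
    """Range-and-bearing observation of one robot by another
    (the laser case: full geometry, but the detected feature
    carries no identity, so data association is still needed)."""
    dx, dy = x_tgt[0] - x_obs[0], x_tgt[1] - x_obs[1]
    r = np.hypot(dx, dy)
    bearing = np.arctan2(dy, dx) - x_obs[2]   # relative to observer heading
    return r, np.arctan2(np.sin(bearing), np.cos(bearing))  # wrap to (-pi, pi]

def vision_measurement(x_obs, x_tgt):
    """Bearing-only observation (the vision case: the target's
    identity is known, but range is not measured)."""
    dx, dy = x_tgt[0] - x_obs[0], x_tgt[1] - x_obs[1]
    bearing = np.arctan2(dy, dx) - x_obs[2]
    return np.arctan2(np.sin(bearing), np.cos(bearing))

# Robot A at the origin facing +x observes robot B at (2, 2).
A = (0.0, 0.0, 0.0)
B = (2.0, 2.0, 0.0)
r, th = laser_measurement(A, B)
```

The contrast is visible in the return values: the laser model constrains the relative position fully but leaves identity to data association, while the bearing-only model leaves range unobserved, so it must be recovered from the robots' motion over time.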
Pictures and videos coming soon.