Alex Goldhoorn

Publication


Searching and Tracking People with Cooperative Mobile Robots

Alex Goldhoorn, Anaïs Garrell, René Alquézar and Alberto Sanfeliu
Autonomous Robots, vol. 42, pp. 1-25, 2017
PDF | BibTeX | DOI

Abstract

Social robots should be able to search for and track people in order to help them. In this paper we present two techniques for coordinated multi-robot search and tracking of people. A probability map (belief) of the target person's location is maintained; to initialize and update it, two methods were implemented and tested: one based on a reinforcement learning algorithm and the other on a particle filter. The person is tracked while visible; otherwise the robots explore, scoring each candidate location by balancing the belief, the travel distance, and whether nearby locations are already being explored by other robots of the team. The approach was validated through an extensive set of simulations using up to five agents and a large number of dynamic obstacles; furthermore, over three hours of real-life experiments with two robots searching and tracking were recorded and analysed.
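The exploration trade-off described above — belief versus travel distance versus separation from teammates' goals — can be sketched as a simple scoring function. The weights, the penalty form, and the function names below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def exploration_score(candidate, belief, robot_pos, other_goals,
                      w_belief=1.0, w_dist=0.1, w_sep=1.0):
    """Score a candidate exploration location by trading off the belief
    (probability that the person is there), the travel distance, and the
    proximity to locations already being explored by teammates.
    Weights and scoring form are illustrative assumptions."""
    dist = math.dist(candidate, robot_pos)
    # Penalize candidates close to another robot's current goal so the
    # team spreads out instead of converging on the same area.
    sep = min((math.dist(candidate, g) for g in other_goals),
              default=float("inf"))
    penalty = w_sep / (1.0 + sep)
    return w_belief * belief[candidate] - w_dist * dist - penalty

def choose_goal(candidates, belief, robot_pos, other_goals):
    """Pick the candidate location with the highest score."""
    return max(candidates,
               key=lambda c: exploration_score(c, belief, robot_pos,
                                               other_goals))
```

With this form, a high-belief cell can still lose to a closer, uncontested one when a teammate is already heading its way, which is the cooperative behaviour the paper describes.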

Experimental Videos

Videos from experiments with the Multi-agent HB-PF Explorer. Two mobile robots (Tibi and Dabo) cooperatively search for and track a person recognized using AR Markers. Environment and map files: maps page.

Video Legend

Each experiment video shows three sections:

  1. Left: Map and probability maps
  2. Right-top: Video focusing on Tibi
  3. Right-bottom: Video focusing on Dabo

Map Elements

Probability Map Elements

Limitations

The robots detect the person in two phases: first by laser-based leg detection, and second by AR tag. The AR tag alone produces occasional false positives, so a tag is only accepted when a laser detection is close enough to it. When a false positive does occur, the probability at that location temporarily increases, but it recovers quickly because false positives are typically detected only briefly.
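The gating rule above — accept a tag only when a leg detection lies nearby — can be sketched as follows. The 0.5 m threshold and the function name are illustrative assumptions, not values from the paper.

```python
import math

def confirm_tag(tag_pos, leg_detections, max_gap=0.5):
    """Accept an AR-tag detection only when a laser leg detection lies
    within `max_gap` metres of it; otherwise treat it as a likely false
    positive. Threshold and names are illustrative assumptions."""
    return any(math.dist(tag_pos, leg) <= max_gap
               for leg in leg_detections)
```

A confirmed detection would then be used to update the probability map, while unconfirmed tags are discarded before they can bias the belief.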

Exploration

Exploration experiments without the person present, to evaluate cooperative area coverage.

Exploration 1

Both robots explored continuously. One stopped for a minute due to a hardware issue but recovered. Near the end both took the same route as their goals converged, but maintained sufficient spacing to cover it thoroughly.

Exploration 2

Dabo was unable to navigate the ramp autonomously — the narrow passage caused the laser to detect the inclined floor as an obstacle. It was teleoperated up the ramp.

Exploration 3

Both robots explored the area. Dabo again needed manual assistance up the ramp due to limited manoeuvring space.

Following with Static and Dynamic Obstacles

Follow

Both robots follow the person through the environment.

Search and Follow

The robots start by searching, then follow for an extended time. Only one robot's video feed is available at the start. The experiment ended due to a hardware issue with Dabo.

Follow 2

The robots start with the person visible and follow throughout. One robot stopped intermittently due to hardware issues; the other continued searching and tracking.

Follow in a Group

Three other people walk in front of the target as dynamic obstacles. The robots maintain tracking of the correct person throughout.

Experiment Statistics

Summary of over 3 hours of real-life experiments with two robots:

Metric                        Exploration  Search & Track  Tracking     Total
Distance per robot (km)       1.2          1.2             0.7          3.2
Total time (h)                1.1          1.2             0.9          3.2
Avg. visibility (%)           0            16.3            36.4         15.3
Avg. distance to person (m)   -            8.4 ± 6.4       8.4 ± 5.6    8.3 ± 5.9
Avg. time found (s)           -            106.8 ± 138.7   23.5 ± 42.5  72.9 ± 117.6
