CONTENTS

4.1 Description
    4.1.1 Computational Topics
4.2 The Data
    4.2.1 Reading an Entire Log File
    4.2.2 Exploring Log Files
    4.2.3 Visualizing the Path
    4.2.4 Exploring a "Look"
    4.2.5 The Error Distribution for Range Values
4.3 Detecting a Circular Target
    4.3.1 Connecting Segments Behind the Robot
    4.3.2 Determining If a Segment Corresponds to a Circle
4.4 Detecting the Target with Streaming Data in Real Time
Bibliography

4.1 Description

In this case study, we explore robots searching for a circular target in a rectangular course that contains numerous obstacles (see Figure 4.1). The robots use a search strategy to move around the course, avoiding the obstacles and trying to find the target in the shortest time possible. The robot continuously reports its location and what it "sees" all around it. A run ends when the robot determines it has found the target, or after 30 minutes of searching. The robot can detect objects up to 2 meters away. In this chapter, we focus on processing these location and sight records and on developing a classifier to detect whether the robot is "looking at" the target. We use a statistical approach to determine whether what the robot "sees" corresponds to the circular target.
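To make these location and sight records concrete, here is a minimal Python sketch of one way a single record might be represented; the field names and layout are illustrative assumptions, not the actual format of the case-study log files, which is described in Section 4.2.

    from dataclasses import dataclass
    from typing import List

    MAX_RANGE = 2.0  # the robot detects objects up to 2 meters away

    @dataclass
    class LookRecord:
        # Assumed fields for illustration; the real log layout is covered in Section 4.2.
        time: float          # seconds since the start of the run
        x: float             # robot's x position in the course
        y: float             # robot's y position in the course
        ranges: List[float]  # distances to whatever is "seen" at each look angle

        def visible(self) -> List[float]:
            # Keep only readings within the 2-meter detection limit.
            return [r for r in self.ranges if r < MAX_RANGE]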

We look at log files for 100 different experiments (or runs), each containing the complete path information for that robot's search for the target. The data include the location of the robot as it moves and what it "sees" at each of these positions. We explore the characteristics of each of these experiments, e.g., whether the robot found the target, how long the experiment lasted (up to the 30-minute time limit), how fast the robot moved, the locations of the obstacles, and the variability in the measurements. We develop the classifier for detecting the target and explore its operating characteristics, e.g., its type I and type II error rates. We then discuss how to use the functionality for reading lines from the log file to classify this streaming data in real time.
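As a rough sketch of that streaming idea, the Python below reads a log file one line at a time and applies a classifier to each look as it arrives; parse_look and looks_like_target are hypothetical stand-ins for the record-parsing and circle-detection steps developed in this chapter.

    def detect_target_stream(path, parse_look, looks_like_target):
        # Scan the log as a stream of lines, stopping at the first detection.
        # parse_look and looks_like_target are hypothetical stand-ins for the
        # record-parsing and circle-detection functions developed in this chapter.
        with open(path) as log:
            for line_number, line in enumerate(log, start=1):
                look = parse_look(line)          # turn one log line into a look record
                if look is None:                 # skip lines that are not complete looks
                    continue
                if looks_like_target(look):      # does this look appear to contain the target?
                    return line_number           # where in the stream the target was detected
        return None                              # target never detected within this run

In a live run, the same loop would consume lines from the robot's connection rather than a saved file, but the per-look classification step is unchanged.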