The Army Research Laboratory’s Robotics Collaborative Technology Alliance (RCTA) is a program intended to change robots from tools that soldiers use into teammates with which soldiers can work. This requires the integration of fundamental and applied research in perception, artificial intelligence, and human-robot interaction. In October of 2014, the RCTA assessed progress towards integrating this research. This assessment was designed to evaluate the robot's performance when it used new capabilities to perform selected aspects of a mission. The assessed capabilities included the ability of the robot to: navigate semantically outdoors with respect to structures and landmarks, identify doors in the facades of buildings, and identify and track persons emerging from those doors. We present details of the mission-based vignettes that constituted the assessment, and evaluations of the robot’s performance in these vignettes.
This work represents the fifth in a series of studies on safe operations of unmanned ground vehicles in the proximity of
pedestrians. The U.S. Army Research Laboratory (ARL), the National Institute of Standards and Technology (NIST),
and the Robotics Collaborative Technology Alliance (RCTA) conducted the study on the campus of NIST in
Gaithersburg, MD in 2009, the final year of the RCTA.
The experiment assessed the performance of six RCTA algorithms in detecting and tracking moving pedestrians using
sensors mounted on a moving platform. Sensors included 2-D and 3-D LADAR, a 2-D SICK laser scanner, and stereo vision. Algorithms
reported only detected human tracks. NIST ground-truth methodology was used to classify algorithm-reported
detections as true positives, misclassifications, or false positives, and to measure distance to first detection and elapsed tracking
time. A NIST-developed viewer facilitated real-time data checking and subsequent analysis. Factors of the study included
platform speed, pedestrian speed, and clutter density in the environment. Pedestrian motion was choreographed to ensure
similar perspective from the platform regardless of experimental conditions. Pedestrians were upright in the principal
study, but excursions examined group movement, nonlinear paths, occluded paths, and alternative postures.
We present the findings of this study and a benchmark of detection and tracking performance for subsequent robotics research in this
program. We also address the impact of this work on pedestrian avoidance.
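As a rough illustration of this style of scoring, the sketch below summarizes already-matched detection events into probability of detection, misclassification rate, false-positive count, mean distance to first detection, and mean tracking time. The record layout, field names, and category labels are assumptions made for illustration; they are not the NIST methodology itself.

    # Illustrative scoring of algorithm-reported tracks against ground truth.
    # Each event is assumed to have already been associated with ground truth
    # (or flagged as unmatched) upstream; labels and fields are hypothetical.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class DetectionEvent:
        label: str                                 # "true_positive", "misclassification", or "false_positive"
        first_detection_range_m: Optional[float]   # range from platform at first detection, if matched
        tracked_duration_s: float                  # elapsed time the track persisted

    def summarize(events: List[DetectionEvent], n_ground_truth_pedestrians: int) -> dict:
        tp = [e for e in events if e.label == "true_positive"]
        mis = [e for e in events if e.label == "misclassification"]
        fp = [e for e in events if e.label == "false_positive"]
        ranges = [e.first_detection_range_m for e in tp if e.first_detection_range_m is not None]
        return {
            "p_detection": len(tp) / n_ground_truth_pedestrians if n_ground_truth_pedestrians else 0.0,
            "p_misclassification": len(mis) / (len(tp) + len(mis)) if (tp or mis) else 0.0,
            "false_positives": len(fp),
            "mean_range_first_detection_m": sum(ranges) / len(ranges) if ranges else float("nan"),
            "mean_tracking_time_s": sum(e.tracked_duration_s for e in tp) / len(tp) if tp else 0.0,
        }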
Man-portable robots have been fielded extensively on the battlefield to enhance the mission effectiveness of soldiers in
dangerous conditions. The robots that have been deployed to date have been teleoperated. The development of assistive
behaviors for these robots has the potential to alleviate the cognitive load placed on the robot operator. While full
autonomy is the eventual goal, a range of assistive capabilities such as obstacle detection, obstacle avoidance, and waypoint
navigation can be fielded sooner in a stand-alone fashion. These capabilities increase the robots' level of autonomy
and thereby reduce the workload on the soldier.
The focus of this paper is the design and execution of a series of scientifically rigorous experiments to quantitatively
assess operator performance when operating a robot equipped with some of these assistive behaviors. The experiments
established a baseline for teleoperation and evaluated the benefit of Obstacle Detection and Obstacle
Avoidance (OD/OA) versus teleoperation, and of OD/OA with Open Space Planning (OSP) versus teleoperation. The results of
these experiments are presented and analyzed in the paper.
In September 2007 the Army Research Laboratory (ARL) Robotics Collaborative Technology Alliance (CTA)
conducted an assessment of multiple pedestrian detection algorithms based upon LADAR or video sensor data. Eight
detection algorithms developed by the Robotics CTA member organizations, including ARL, were assessed in an
experiment conducted by the National Institute of Standards and Technology (NIST) and ARL to determine the probability
of detection/misclassification and false alarm rate as a function of vehicle speed, degree of environmental clutter, and
pedestrian speeds. The study is part of an ongoing investigation of safe operations for unmanned ground vehicles.
This assessment marked the first time in this program that human movers acted as targets for detection from a moving
vehicle. A focus of the study was to choreograph repeatable human movement scenarios relative to the movement of the
vehicle. The resulting data are intended to support comparative analysis across treatment conditions and to allow
developers to examine performance with respect to specific detection and tracking events. These events include humans
advancing toward and retreating from the vehicle at different angles, humans crossing paths in close proximity, and occlusion
situations in which the sensor system momentarily loses sight of the mover. A detailed operational procedure ensured
repeatable human movement, with independent ground truth supplied by a NIST ultra-wideband wireless tracking
system. Post-processing and statistical analysis reconciled the tracking algorithm results with the NIST ground truth. We
will discuss operational considerations and results.
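One way to picture the reconciliation step is a gated nearest-neighbor match between each algorithm-reported track position and the time-aligned ground-truth position. The sketch below is an illustration under assumptions: the 1 m gate, the data layout, and the function name are invented for clarity and are not the procedure actually used in the study.

    # Hypothetical reconciliation of reported track positions with ground truth.
    # Positions are (x, y) in a common frame at a common timestamp; the gate
    # distance is an assumed value for illustration only.
    import math
    from typing import Dict, Optional, Tuple

    Point = Tuple[float, float]

    def match_track_to_ground_truth(
        track_xy: Point,
        ground_truth: Dict[str, Point],   # mover id -> UWB position at the same timestamp
        gate_m: float = 1.0,
    ) -> Optional[str]:
        """Return the id of the closest ground-truth mover within the gate, else None."""
        best_id, best_d = None, float("inf")
        for mover_id, gt_xy in ground_truth.items():
            d = math.hypot(track_xy[0] - gt_xy[0], track_xy[1] - gt_xy[1])
            if d < best_d:
                best_id, best_d = mover_id, d
        return best_id if best_d <= gate_m else None

    # A reported track that matches no mover within the gate would be scored as a
    # false positive; a mover never matched over the run would be a missed detection.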
ARL is developing the autonomous capability to directly support the Army's future requirements to employ unmanned
systems. The purpose of this paper is to document and benchmark the current ARL Collaborative Technology Alliance
(CTA) capabilities in detecting, tracking and avoiding moving humans and vehicles from a moving unmanned vehicle.
ARL and General Dynamics Robotic Systems (GDRS) conducted an experiment involving an ARL
eXperimental Unmanned Vehicle (XUV) operating in proximity to a number of stationary and moving human surrogates
(mannequins) and moving vehicles. In addition, other objects such as
barrels, fire hydrants, poles, cones, and other clutter were placed along the XUV's route.
The experiment examined the performance of seven algorithms using a variety of sensor modalities to detect stationary
and moving objects. Three of the algorithms showed promise, detecting human surrogates and vehicles with
probabilities ranging from 0.64 to 0.85, while limiting probability of misclassification to 0.14 to 0.37. Moving
mannequins were detected with slightly higher probabilities than fixed mannequins. The distance to the ground-truth
position at the time of detection suggests that, at a speed of 20 kph and a minimum detection distance of 19.38 m, the vehicle
would have at least 3.5 seconds to avoid a mannequin or vehicle detected by one of these three algorithms.
Of the mannequins and vehicles, mannequins were detected more frequently than vehicles.
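The available reaction time follows directly from the vehicle speed and the minimum detection distance, as the small check below illustrates using the figures reported above.

    # Reaction time available = minimum detection distance / vehicle speed.
    speed_kph = 20.0
    speed_mps = speed_kph * 1000.0 / 3600.0        # ~5.56 m/s
    min_detection_range_m = 19.38
    time_to_react_s = min_detection_range_m / speed_mps
    print(f"{time_to_react_s:.1f} s available to react")  # ~3.5 s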
During FY03, the U.S. Army Research Laboratory undertook a series of experiments designed to assess the maturity of autonomous mobility technology for the Future Combat Systems Armed Robotic Vehicle concept. The experiments assessed the technology against a level 6 standard in the technology readiness level (TRL) maturation schedule identified by a 1999 General Accounting Office report. During the course of experimentation, 646 missions were conducted over a total distance of ~560 km and time of ~100 hr. Autonomous operation represented 96% and 88% of total distance and time, respectively. To satisfy the TRL 6 "relevant environment" standard, several experimental factors were varied over the three-site test as part of a formal, statistical experimental design. This paper reports the specific findings pertaining to relevant-environment questions that were posed for the study and lends additional support to the Lead System Integrator decision that TRL 6 has been attained for the autonomous navigation system.