NRL flight-tests autonomous multi-target, multi-user tracking capability

August 17, 2011

WASHINGTON -- The Naval Research Laboratory (NRL) and the Space Dynamics Laboratory (SDL), with the support of the Office of Naval Research (ONR), have demonstrated an autonomous multi-sensor motion-tracking and interrogation system that reduces the workload for analysts by automatically finding moving objects, then presenting high-resolution images of those objects with no human input.

Intelligence, surveillance and reconnaissance (ISR) assets in the field generate vast amounts of data that can overwhelm human operators and severely limit an analyst's ability to generate intelligence reports in operationally relevant timeframes. This multi-user tracking capability enables the system to manage the collection of imagery without continuous monitoring by a ground or airborne operator, requiring fewer personnel and freeing up operational assets.

"These tests display how a single imaging sensor can be used to provide imagery of multiple tracked objects," said Dr. Brian Daniel, research physicist, NRL ISR Systems and Processing Section. "A job typically requiring multiple sensors."

During flight tests in March 2011, multiple real-time tracks generated by a wide-area persistent surveillance sensor (WAPSS) were autonomously cross-cued to a high-resolution narrow field-of-view (NFOV) interrogation sensor via an airborne network. Both sensors were networked by the high-speed Tactical Reachback Extended Communications (TREC) data link provided by the NRL Information Technology Division, Satellite and Wireless Technology Branch.

"The demonstration was a complete success," noted Dr. Michael Duncan, ONR program manager. "Not only did the network sensing demonstration achieve simultaneous real-time tracking, sensor cross cueing and inspection of multiple vehicle-sized objects, but we also showed an ability to follow smaller human-sized objects under specialized conditions."

The network sensing demonstration utilized sensors built under other ONR-sponsored programs. The interrogation sensor was the precision, jitter-stabilized EyePod developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program. EyePod is a dual-band visible-near infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms. The mid-wave infrared nighttime WAPSS (N-WAPSS) served as the wide-area sensor; it features a 16-megapixel, large-format camera that captures single frames at four hertz (cycles per second) and a step-stare capability with a one-hertz refresh rate.

Using precision geo-projection of the N-WAPSS imagery, all moving vehicle-size objects in the FOV were tracked in real-time. The tracks were converted to geodetic coordinates and sent via an air-based network to a cue manager system. The cue manager autonomously tasked EyePod to interrogate all selected tracks for target classification and identification.
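The data flow described above (tracks reported in geodetic coordinates, a cue manager autonomously tasking the interrogation sensor) can be sketched in a few lines. This is an illustrative simplification, not the actual NRL software: the `Track` and `CueManager` names and the first-in, first-out tasking policy are assumptions for the example.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Track:
    """A moving-object track, already converted to geodetic coordinates."""
    track_id: int
    lat_deg: float   # geodetic latitude, degrees
    lon_deg: float   # geodetic longitude, degrees

@dataclass
class CueManager:
    """Hypothetical cue manager: accepts tracks arriving over the
    airborne network from the wide-area sensor and tasks the
    narrow-FOV interrogation sensor with each pending track in turn."""
    pending: List[Track] = field(default_factory=list)

    def receive(self, track: Track) -> None:
        # A new track report arrives from the WAPSS real-time tracker.
        self.pending.append(track)

    def next_cue(self) -> Optional[Track]:
        # Task the interrogation sensor with the oldest pending track,
        # or return None when every track has been interrogated.
        return self.pending.pop(0) if self.pending else None

# Two vehicle-sized objects are tracked; the cue manager hands each
# track to the interrogation sensor without operator input.
cm = CueManager()
cm.receive(Track(1, 38.8210, -77.0250))
cm.receive(Track(2, 38.8305, -77.0101))
first_cue = cm.next_cue()
```

In the demonstrated system the cueing was autonomous end to end; a real implementation would also prioritize tracks and account for the NFOV sensor's slew time, which this sketch omits.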
-end-


Naval Research Laboratory
