Kitware Develops Functional Object Recognition Capabilities

Automatically detecting objects such as people and vehicles performing specific functions or complex activities is one of the most challenging problems in video analytics. Complex functional object recognition is required to detect objects that are defined by their behavior rather than their appearance, such as delivery trucks, police patrol vehicles, buses, and road cleaning vehicles. It is also used to detect complex threat patterns such as IED emplacement.

During Phase II of the project “Vision with a Purpose: Inferring the Function of Objects in Video,” Kitware created a prototype demonstration system to capture the power of human intuition by having users define complex functional objects’ components, as well as the components’ relationships, in a new graphical model. The model is then used to automatically and efficiently scan vast video scenes over long periods of time. In addition, the workflow enables users to share feedback to improve results. When example videos of functional objects are available, the system can also learn the core characteristics of their activities using machine learning techniques. The capability can be applied across diverse domains including wide-area motion imagery (WAMI), aerial full-motion video and ground surveillance video.
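The user-defined graphical model described above can be pictured as components (nodes) linked by behavioral relationships (edges). The following is a minimal illustrative sketch, not Kitware's actual implementation; all names ("delivery_truck", "stops_repeatedly_near", etc.) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str          # e.g. "vehicle", "person"
    attributes: dict   # appearance or motion constraints

@dataclass
class Relationship:
    a: str             # name of the first component
    b: str             # name of the second component
    relation: str      # behavioral link, e.g. "stops_repeatedly_near"

@dataclass
class FunctionalObjectModel:
    label: str
    components: list = field(default_factory=list)
    relationships: list = field(default_factory=list)

    def add_component(self, name, **attrs):
        self.components.append(Component(name, attrs))
        return self

    def relate(self, a, relation, b):
        self.relationships.append(Relationship(a, b, relation))
        return self

# A user might describe a delivery truck by its behavior, not its looks:
model = FunctionalObjectModel("delivery_truck")
model.add_component("vehicle", type="truck")
model.add_component("person", role="driver")
model.relate("vehicle", "stops_repeatedly_near", "buildings")
model.relate("person", "exits_and_reenters", "vehicle")

print(model.label, len(model.components), len(model.relationships))
```

A matcher could then score detected tracks in video against the model's components and relationships, which is what lets the same object definition be reused across large scenes and long time spans.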

Research and development on this workflow has produced crucial state-of-the-art technologies for vision-based recognition of functional objects. Their core capabilities include learning functional object models from examples using machine learning techniques, detecting known functional objects with a complex activity model, detecting anomalous functional objects, modeling relationships between locations and movers, and analyzing multi-scale patterns of life.
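One of the listed capabilities, detecting anomalous movers, can be sketched in a simple statistical form: learn the distribution of a behavioral feature from example tracks, then flag movers that deviate strongly. The feature (stop duration at a location), the z-score test, and the threshold below are illustrative assumptions, not the project's actual method.

```python
import statistics

def fit_normal_model(stop_durations):
    """Learn mean and standard deviation of stop durations
    from example (normal) movers at a location."""
    return statistics.mean(stop_durations), statistics.stdev(stop_durations)

def is_anomalous(duration, model, z_threshold=3.0):
    """Flag a mover whose stop duration is a z-score outlier
    relative to the learned distribution."""
    mean, std = model
    return abs(duration - mean) / std > z_threshold

# Stop durations (seconds) observed for typical movers at one location:
normal_stops = [30, 35, 28, 40, 33, 31, 37]
model = fit_normal_model(normal_stops)

print(is_anomalous(600, model))  # a ten-minute loiter stands out -> True
print(is_anomalous(32, model))   # an ordinary stop -> False
```

In practice a system like this would combine many such features per location and per mover type, which is one way the "relationships between locations and movers" capability could feed the anomaly detector.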

The project’s developments add new dimensions to the utility of WAMI videos and should play a crucial role in advancing the technologies utilized for national security and defense.

This material is based upon work supported by the Defense Advanced Research Projects Agency (DARPA) under Contract Number W31P4Q-10-C-0262.

Approved for Public Release, Distribution Unlimited


Questions or comments are always welcome!