Machine Learning-Based Data Analysis
Navy SBIR 2020.1 - Topic N201-085
SSP - Mr. Michael Pyryt
Opens: January 14, 2020 - Closes: February 26, 2020 (8:00 PM ET)


TITLE: Machine Learning-Based Data Analysis


TECHNOLOGY AREA(S): Human Systems, Information Systems

ACQUISITION PROGRAM: Strategic Weapons Systems: Trident II D5 and D5 Life Extension (LE) ACAT IC

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop, demonstrate, and field an algorithm and process for conducting an automated, real-time scan of navigation subsystem data from a database for disturbances, abnormal trends, and problems. The algorithm should learn to predict future occurrences of these conditions and be implementable to provide real-time fault analysis and failure prediction for inertial navigation systems (INS).

DESCRIPTION: Data analysis of INS performance has historically been labor intensive and heavily reliant on the ability of a person or team of people to perform the analysis in a lab rather than in real time. Typical real-time monitoring of INS performance relies on the system to create discrete error codes based on physical sensors and conditions. While this approach has been successful in the past, it has limitations and carries a risk of human error in the analysis of large data fields. Scanning and evaluation tools based on machine learning (ML) technology would significantly enhance the ability of the human analyst to focus on problems identified from synthesized data rather than sifting through raw data streams or reacting to one of many hundreds of discrete alarms that may occur. ML technology has the potential to dramatically reduce the likelihood of an analyst missing anomalies caused by sensors or equipment whose performance has degraded, but not by enough to exceed a human-established threshold or human pattern-matching ability. ML technology should also offer the ability to detect higher-order abnormalities in INS performance by aggregating a variety of seemingly unrelated direct sensor error codes; to classify errors; and to provide behavior-based or anomaly-based detection of conditions that may otherwise go unnoticed. ML should also offer the ability to conduct extensive data mining to predict a potential system failure and the opportunity to conduct the analysis in real time aboard ship rather than after the fact.
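The sub-threshold degradation case above can be illustrated with a minimal sketch. The example below uses a stdlib-only CUSUM (cumulative sum) drift detector, one of many techniques that could serve here; the simulated gyro channel, noise levels, and all threshold values are invented for illustration. It flags a slow bias drift even though no single reading ever approaches the (hypothetical) fixed discrete-alarm threshold.

```python
import random

def cusum_alerts(stream, target=0.0, slack=0.005, limit=0.12):
    """One-sided tabular CUSUM: accumulate deviations above target + slack
    and alert once the running sum exceeds limit. Small persistent shifts
    that never trip a fixed absolute alarm still accumulate and get flagged."""
    s = 0.0
    alerts = []
    for i, x in enumerate(stream):
        s = max(0.0, s + (x - target - slack))
        if s > limit:
            alerts.append(i)
            s = 0.0  # reset after reporting so repeated alerts stay distinct
    return alerts

# Simulated gyro bias channel: 200 nominal samples, then a slow drift.
# A (hypothetical) discrete alarm at 1.0 would never trip on this data.
random.seed(0)
nominal = [random.gauss(0.0, 0.01) for _ in range(200)]
drift = [random.gauss(0.0, 0.01) + 0.002 * t for t in range(100)]
stream = nominal + drift

alerts = cusum_alerts(stream)
print(alerts[:1])  # alerts begin shortly after the drift starts at index 200
```

The slack and limit parameters trade false-alarm rate against detection delay; in practice they would be tuned against recorded nominal data rather than chosen by hand as they are here.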

Often, anomalies caused by sensors or equipment in this category go undetected because human limitations, such as imperfect memory and fatigue, make analysts reliant on the tripping of an alarm or the crossing of an established threshold to identify issues that a machine can learn to recognize from the large data set on which it was trained. ML tools can be used to classify data sets and recognize abnormal subsystem behaviors to be flagged for further analysis. After these algorithms are developed to improve lab analysis, they may be integrated as a real-time, out-of-band problem monitor on the associated system. Much as Network Intrusion Detection Systems monitor network data flow for problem behaviors, these tools could passively monitor system data for problem trends and behaviors, and then warn operators of more significant systemic faults.
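The out-of-band monitoring pattern described above can be sketched in a few lines. Everything below is illustrative: the nearest-centroid classifier, the per-window features, and the centroid values stand in for a real model trained on labeled lab data, and a fielded system would use far richer features with proper scaling. The key property shown is that the monitor only observes copies of the data and issues warnings; it never sits in the system's data path.

```python
import random
from statistics import mean, pvariance

def features(window):
    # Simple per-window summary features; a real system would use richer ones.
    return (mean(window), pvariance(window))

class PassiveMonitor:
    """Out-of-band monitor: observes copies of system data, never alters the
    data path, and records a warning for each window classified abnormal."""
    def __init__(self, centroids, window=20):
        self.centroids = centroids  # label -> feature centroid, learned offline
        self.window = window
        self.buf = []
        self.warnings = []

    def classify(self, feats):
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(self.centroids, key=lambda lbl: dist(feats, self.centroids[lbl]))

    def observe(self, sample, t):
        self.buf.append(sample)
        if len(self.buf) == self.window:
            label = self.classify(features(self.buf))
            if label != "normal":
                self.warnings.append((t, label))
            self.buf.clear()

# Hypothetical centroids "learned" offline from labeled lab data.
centroids = {"normal": (0.0, 0.0001), "abnormal": (0.05, 0.0004)}

random.seed(1)
mon = PassiveMonitor(centroids)
# 100 nominal samples followed by 40 samples with a shifted, noisier bias.
data = [random.gauss(0.0, 0.01) for _ in range(100)] + \
       [random.gauss(0.06, 0.02) for _ in range(40)]
for t, x in enumerate(data):
    mon.observe(x, t)
print(mon.warnings)
```

Because the monitor consumes a copy of the stream and only appends to a warning list, it can fail or be upgraded without affecting the monitored system, which is the property that makes out-of-band deployment attractive shipboard.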

With a focus on optimizing system affordability, reliability, maintainability, serviceability, and operability, any proposed design concept, demonstration model, or production model must: utilize standard interfaces wherever possible; leverage commercially available components or elements; be diagnosable and serviceable by qualified Navy sailors for any preventive or anticipated corrective maintenance required more frequently than once every nine months; have a mean time between failures in excess of twelve months; be configuration controlled and upgradable; be modular with an open systems architecture; and include the identification of potentially replaceable components or units to be carried as spares.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret-level facility clearance and personnel security clearances, as set forth by DSS and Strategic Systems Programs (SSP), in order to perform on advanced phases of this project and to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Conduct a concept development effort addressing the requirements outlined in the Description. Identify or develop an analysis methodology or ML technology and process to conduct an automated scan of various data streams related to INS that has the ability to learn to predict future disturbances, abnormal trends, and problems. Conduct feasibility studies of the proposed concept. Develop a Phase II plan.

Phase I will be UNCLASSIFIED, and the contractor will not require access to any classified data (in other words, if the research outcome relates to classified data, the work itself can be performed using "dummy" data of the same level of complexity).

PHASE II: Further develop the proposed concept and build a demonstration prototype based on it. Ensure that the prototype can conduct an automated scan of various INS-related data streams using provided data and can learn to predict future disturbances, abnormal trends, and problems. Once the algorithm demonstrates the ability to learn to predict future problems, ensure that it can automate a scan on data streams similar to those used for training. Ensure that the algorithm reports identified anomalies with sufficient background information to simplify root cause analysis of the problem or disturbance by a subject matter expert (SME). Develop a transition plan that identifies the scope, effort, and resources required to extend the prototype algorithm and process to additional analysis tasks, including training for additional combinations of data streams to look for different problems or disturbances, and development of an out-of-band problem detector that could be considered for shipboard installation for real-time disturbance detection. Provide onsite training on the algorithm's design, operation, maintenance, and interfaces with the host system.

Participate in a Preliminary Design Review (PDR) event. Install on a test ship for system performance testing. Deliver a Data Disclosure Package (DDP) that includes, at a minimum: form, fit, function, operation, maintenance, installation, and training data, procedures, and information, plus the data necessary for or related to: overall physical, functional, interface, and performance characteristics; corrections or changes to Government-furnished data or software; and data or software to which the Government has previously received unlimited rights or that is otherwise lawfully available to the Government.

(Note: Though Phase II work may become classified (see Description section for details), the Proposal for Phase II work will be UNCLASSIFIED. If the selected Phase II contractor does not have the required certification for classified work, the SSP program office will work with the contractor to facilitate certification of related personnel and facility.)

PHASE III DUAL USE APPLICATIONS: Work with the Navy to implement the analysis toolkit as described in the Phase III transition plan at a designated Navy lab and as an SSP alteration (SPALT) on designated ships. Provide documentation and support materials to transfer the mature analysis toolkit to Navy SMEs. Ensure sufficient cyber security and software assurance requirements are met in accordance with DFARS Clause 252.204-7012, NIST Special Publication 800-171, NIST Special Publication 800-53, and NIST Special Publication 800-37. In addition, SPALT requirements to enable the software to be deployed at Navy data analysis labs and aboard ships must be met.

Provide an updated DDP prior to fielding that must include at a minimum: any updates to the Phase II DDP and installation and maintenance procedures and processes; cyber security and authority to operate certifications for Navy ship use; qualification requirements and results; demonstrated compliance with SPALT requirements; and testing results.

This ML application has dual-use commercial and military applications in any complex system that uses sensors to detect abnormalities, synthesizes multiple unrelated data streams, and conducts failure analysis or fault localization of the underlying system, such as propulsion or power-generation plants, ships, aircraft, and space systems.


REFERENCES:

1. Witten, Ian H. and Frank, Eibe. “Data Mining: Practical Machine Learning Tools and Techniques.” Morgan Kaufmann, 2011. ISBN 978-0-12-374856-0.

2. MacKay, David J. C. “Information Theory, Inference, and Learning Algorithms.” Cambridge University Press: Cambridge, 2003. ISBN 0-521-64298-1.

3. Duda, Richard O., Hart, Peter E. and Stork, David G. “Pattern Classification (2nd Edition).” Wiley, New York, 2001. ISBN 0-471-05669-3.

4. Bishop, Christopher. “Neural Networks for Pattern Recognition.” Oxford University Press, 1995. ISBN 0-19-853864-2.

5. Hodge, V.J. and Austin, J. “A Survey of Outlier Detection Methodologies.” Artificial Intelligence Review, 22 (2), 2004, pp. 85-126.

KEYWORDS: Data Analysis; Machine Learning; Pattern Matching; Anomaly Detection; Classification; Data Mining; Behavior Based Detection