Unified Operational Picture for Anti-Submarine Warfare

Navy SBIR 21.1 - Topic N211-048
NAVSEA - Naval Sea Systems Command
Opens: January 14, 2021 - Closes: March 4, 2021 (12:00 p.m. ET; extended from February 24, 2021)

N211-048 TITLE: Unified Operational Picture for Anti-Submarine Warfare

RT&L FOCUS AREA(S): Machine Learning/AI

TECHNOLOGY AREA(S): Human Systems

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a fused picture from acoustic and non-acoustic sensors that transforms masses of data into concise, useful information for operators, watch teams, and decision makers.

DESCRIPTION: Undersea warfare (USW) presents a uniquely complex environment to the human operator, involving phenomena not present in the environments that commercial products address. Current systems rely heavily on manual association of contact information across sensors, which can be challenging in cluttered environments. Sensor improvements (resulting in more arrays, more gain, more beams, etc.) compound the problem, which can in turn degrade situational awareness, produce an incorrect contact picture, and possibly lead to loss of tactical control.

The variable nature of the ocean floor, changing currents, unpredictable water temperature and density layers, marine life, and a huge spectrum of vessel traffic create a highly complex tactical picture in which an adversary can hide. Multiple specialized and highly sensitive sensors have been deployed over the years to contend with these conditions and fully penetrate the undersea battlespace. However, under stressing conditions, and taken collectively, the array of sensors employed by the undersea warfighter yields a copious flow of data and information that must be rapidly analyzed and interpreted. A multi-sensor fusion technology is needed to generate a unified and consistent tactical picture. The solution must be capable of analyzing, assimilating, and fusing data in an approach that considers both coherent and incoherent processing across multiple sensors, using kinematic and spectral information, to generate a single, unified, decision-quality tactical picture.
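For illustration only (not part of the topic requirements), the sketch below shows one well-known way to combine kinematic track estimates from two sensors whose error correlation is unknown: covariance intersection. The state vector, covariances, and sensor labels are hypothetical assumptions, not values drawn from this solicitation.

```python
# Minimal sketch (illustrative only): covariance intersection, one common way to
# fuse kinematic track estimates from two sensors whose error correlation is
# unknown. State here is [x, y, vx, vy]; all numbers are hypothetical.
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two (state, covariance) estimates without knowing their cross-correlation."""
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * np.linalg.inv(P1) + (1.0 - w) * np.linalg.inv(P2)
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < np.trace(best[1]):
            x = P @ (w * np.linalg.inv(P1) @ x1 + (1.0 - w) * np.linalg.inv(P2) @ x2)
            best = (x, P)
    return best

# Example: a towed-array-derived track and a hull-array track (hypothetical numbers)
x_a = np.array([1000.0, 2000.0, 5.0, -2.0])   # position (m) and velocity (m/s)
P_a = np.diag([400.0, 900.0, 4.0, 4.0])
x_b = np.array([1050.0, 1980.0, 4.0, -1.5])
P_b = np.diag([900.0, 400.0, 9.0, 1.0])

x_fused, P_fused = covariance_intersection(x_a, P_a, x_b, P_b)
print("fused state:", x_fused)
print("fused position sigma (m):", np.sqrt(np.diag(P_fused)[:2]))
```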

While the technology sought under this topic will need to comply with cybersecurity protocols, cybersecurity per se is not necessarily required as an embedded aspect of the solution. While ideally fusion would involve multiple sensors holding simultaneous contact, there will be times when only one sensor has contact. The fusion desired is an overarching awareness of contacts as they are perceived by different sensors and modes, both when their observations overlap in time and when they do not. A purely illustrative sketch of that idea follows.
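The following sketch is not a required approach; it simply illustrates associating a contact reported by one sensor with a track held by another when their observation windows do not overlap, by predicting the track across the time gap and comparing kinematic and narrowband spectral features. The thresholds, feature choices, and contact values are assumptions for the example.

```python
# Minimal sketch (illustrative only): cross-sensor contact association with and
# without temporal overlap. A constant-velocity prediction bridges any time gap,
# then a kinematic gate plus a simple spectral-line similarity score decide
# whether the two reports are likely the same contact. All values are hypothetical.
import math
from dataclasses import dataclass, field
from typing import List

@dataclass
class Contact:
    t: float                      # time of last update (s)
    x: float; y: float            # position estimate (m)
    vx: float; vy: float          # velocity estimate (m/s)
    lines_hz: List[float] = field(default_factory=list)  # narrowband spectral lines

def kinematic_miss(track: Contact, new: Contact) -> float:
    """Predict the track to the new contact's time and return the position miss (m)."""
    dt = new.t - track.t
    px = track.x + track.vx * dt
    py = track.y + track.vy * dt
    return math.hypot(new.x - px, new.y - py)

def spectral_similarity(a: Contact, b: Contact, tol_hz: float = 0.5) -> float:
    """Fraction of a's lines that have a counterpart in b within tol_hz."""
    if not a.lines_hz or not b.lines_hz:
        return 0.0
    matched = sum(1 for f in a.lines_hz if any(abs(f - g) <= tol_hz for g in b.lines_hz))
    return matched / len(a.lines_hz)

def likely_same_contact(track: Contact, new: Contact,
                        gate_m: float = 2000.0, min_sim: float = 0.5) -> bool:
    return kinematic_miss(track, new) <= gate_m and spectral_similarity(track, new) >= min_sim

# Towed-array track last seen at t=0; a hull-array contact appears 10 minutes later.
old = Contact(t=0.0,   x=0.0,    y=0.0,   vx=4.0, vy=1.0, lines_hz=[60.0, 120.1, 300.2])
new = Contact(t=600.0, x=2500.0, y=550.0, vx=4.2, vy=0.8, lines_hz=[60.1, 120.0, 299.9])
print(likely_same_contact(old, new))   # True: prediction and spectral lines both agree
```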

During Phase II, the technology will be evaluated by Navy subject matter experts and Fleet operators in a prototype sonar system using at-sea test data for validation. It may also be evaluated in an unmanned operation if appropriate for the solution.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DOD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Counterintelligence and Security Agency (DCSA). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances in order to perform on advanced phases of this contract, as set forth by DCSA and NAVSEA, and to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Develop a concept for the unified tactical picture that meets the requirements in the Description section. Demonstrate feasibility through analytical modeling, by developing and documenting the innovative algorithms, concepts, and architectures, and by quantifying achievable performance gains. The Phase I Option, if exercised, will include the initial system specifications and a capabilities description to build a prototype in Phase II.

PHASE II: Develop and deliver the concept for the unified tactical picture into a prototype. The prototype will be evaluated by Navy subject matter experts and Fleet operators in a prototype sonar system using at-sea test data to validate that it is fit for use. Conduct additional laboratory testing, modeling, or analytical methods as appropriate depending on the company's proposed approach.

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the technology to Navy use through system integration and qualification testing for the unified tactical picture prototype developed in Phase II. Assist in transition and integration of the prototype to a future Advanced Capabilities Build (ACB) update to the AN/SQQ-89A(V)15 Combat System. Potentially integrate the technology into other sonar systems and military sensor systems.

Additionally, the technology could be of interest to intelligence, military, law enforcement, or market tracking for situations where a unified view needs to be assembled from a diverse set of sensor measurements or real-time situational awareness must be assembled in dynamic or volatile situations.

REFERENCES:

  1. Moacdieh, Nadine Marie and Sarter, Nadine. "The Effects of Data Density, Display Organization, and Stress on Search Performance: An Eye Tracking Study of Clutter." IEEE Transactions on Human-Machine Systems 47, December 2017, pp. 886-895. https://ieeexplore.ieee.org/document/7971994. Libraries holding this document can be found at https://www.worldcat.org/title/the-effects-of-data-density-display-organization-and-stress-on-search-performance-an-eye-tracking-study-of-clutter/oclc/7252229922&referer=brief_results
  2. Agrawal, Rashmi. "Technologies for Handling Big Data." Handbook of Research on Big Data Clustering and Machine Learning, IGI Global, October 2019. https://www.worldcat.org/title/technologies-for-handling-big-data/oclc/8303222462&referer=brief_results
  3. United States Navy Fact File: AN/SQQ-89(V) Undersea Warfare / Anti-Submarine Warfare Combat System. https://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=318&ct=2

KEYWORDS: Multi-sensor; fusion of tactical sensors; tactical picture; coherent processing; incoherent processing; kinematic information.
