Artificial Intelligence (AI)-Generated Domain Specific Model and Ontology

Navy SBIR 25.1 - Topic N251-030
Naval Sea Systems Command (NAVSEA)
Pre-release 12/4/24   Opens to accept proposals 1/8/25   Closes 2/5/25 12:00pm ET

N251-030 TITLE: Artificial Intelligence (AI)-Generated Domain Specific Model and Ontology

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a concept for a comprehensive assessment methodology to automatically generate a surface Domain Specific Model (DSM) and a model of concepts and their relationships (i.e., ontology) from domain-related technical documentation and generate machine readable interface documentation (e.g., JSON, XML).
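The machine-readable interface documentation called for above might take a form like the following sketch, which serializes a toy ontology fragment to JSON. The concept names, attributes, and relationships shown are purely illustrative assumptions, not drawn from any Navy documentation.

```python
import json

# Hypothetical, minimal ontology fragment: every concept name, attribute,
# and relationship below is illustrative only -- real DSM content would be
# derived automatically from domain-related technical documentation.
ontology = {
    "concepts": {
        "Track": {"attributes": ["track_id", "position", "velocity"]},
        "Sensor": {"attributes": ["sensor_id", "sensor_type", "coverage"]},
    },
    "relationships": [
        {"subject": "Sensor", "predicate": "produces", "object": "Track"},
    ],
}

# Machine-readable interface documentation, serialized as JSON.
interface_doc = json.dumps(ontology, indent=2)
print(interface_doc)
```

An equivalent XML serialization would satisfy the same objective; JSON is used here only for brevity.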

DESCRIPTION: The Integrated Combat System (ICS) operates independently and as part of a netted integrated force with shared sensors, Command and Control (C2), weapons, and communications. A surface ship can have over forty (40) system elements that have unique data models. This data must be normalized through a common ontology to ensure a common understanding that is useable for machine processing (e.g., Artificial Intelligence and Machine Learning [AI/ML]). Manual generation of this DSM is a daunting task that has yet to be successfully accomplished. Once a DSM is established, new sensors, weapons, and communications elements can be integrated with little to no changes to the integration software, thus reducing time and required acquisition funding.

In contemporary military operations, the synergy and interoperability of diverse combat systems are critical for mission success. However, achieving seamless integration remains a formidable challenge due to the disparate data formats and structures employed by various platforms. This SBIR topic proposes harnessing the power of AI to devise a unified common data model (CDM) tailored specifically for combat systems (i.e., DSM). By employing advanced ML algorithms, natural language processing (NLP) techniques, and ontological analysis, the Navy seeks a capability to automatically extract, analyze, and harmonize data schemas from multiple sources. The envisioned AI-driven CDM will serve as a foundational framework for standardizing data representation, facilitating real-time data exchange, and enhancing decision-making processes across heterogeneous combat environments. There is no commercial technology that can generate a combat system CDM from Navy technical documentation sources.
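At its simplest, the harmonization step described above amounts to extracting field definitions from documentation text and mapping element-unique names onto common-model names. The sketch below assumes a hand-written synonym map and regex extraction purely for illustration; a full solution would learn such mappings through NLP and ontological analysis.

```python
import re

# Illustrative only: two element interface descriptions that name the same
# quantity differently, as might appear in (hypothetical) documentation text.
doc_a = "Field: TGT_RNG (meters) - range to target"
doc_b = "Field: TargetRange (m) - distance to the target"

# Hand-written synonym map; in a full solution this mapping would be
# produced by ML/NLP analysis rather than authored manually.
synonyms = {"tgt_rng": "target_range", "targetrange": "target_range"}

def extract_field(line: str) -> str:
    """Pull the field name out of a 'Field: NAME ...' documentation line."""
    match = re.match(r"Field:\s*(\w+)", line)
    return match.group(1).lower() if match else ""

# Harmonize both element-unique names onto the common-model name.
common_a = synonyms[extract_field(doc_a)]
common_b = synonyms[extract_field(doc_b)]
assert common_a == common_b == "target_range"
```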

The solution will employ a comprehensive assessment methodology comprised of simulation-based testing, real-world data integration trials, and user feedback analysis to evaluate the effectiveness, efficiency, AI trustworthiness, and usability of the proposed AI-driven CDM in enhancing combat system interoperability. The results of this assessment will provide valuable insights into the practical implications and potential limitations of implementing AI technologies for combat system integration, thereby informing future research directions and operational strategies in military contexts.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by 32 C.F.R. § 2004.20 et seq., National Industrial Security Program Executive Agent and Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Counterintelligence and Security Agency (DCSA), formerly the Defense Security Service (DSS). The selected contractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances. This will allow contractor personnel to perform on advanced phases of this project as set forth by DCSA and NAVSEA in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material during the advanced phases of this contract IAW the National Industrial Security Program Operating Manual (NISPOM), which can be found at Title 32, Part 2004.20 of the Code of Federal Regulations.

PHASE I: Develop a concept for the AI-based DSM that meets the parameters of the Description. Demonstrate the feasibility of the concept in meeting the Navy’s need by a combination of analysis, modeling, and simulation. The Phase I Option, if exercised, will include initial design specifications and capabilities description to build a prototype solution in Phase II.

PHASE II: Develop and deliver a prototype AI-based DSM based upon the results of Phase I. Demonstrate the prototype’s functionality through ingestion of data from various representative simulated combat system sensor, weapon, and/or communication elements provided by the government. Demonstrate the ability to modify resource settings and send controls.

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the technology to Navy use. The final product will be a set of containerized applications that transform sensor, weapon, and communications data into the DSM and back into element unique interface specifications to send element unique settings and controls. Provide necessary product-level objective quality evidence to support product certification for use. It is anticipated that the DSM can become a standard for future element developments, thus minimizing future data transformations.
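The round-trip transformation described above (element data into the DSM and back into element-unique form) can be sketched as a pair of mappings. The field names and mapping table below are hypothetical; in the envisioned product, such mappings would be loaded from the generated interface documentation rather than hard-coded.

```python
# Hypothetical field mapping for one element; a containerized service would
# load a mapping like this from generated interface documentation.
ELEMENT_TO_DSM = {"TGT_RNG": "target_range", "TGT_BRG": "target_bearing"}
DSM_TO_ELEMENT = {v: k for k, v in ELEMENT_TO_DSM.items()}

def to_dsm(element_msg: dict) -> dict:
    """Transform an element-unique message into the common DSM form."""
    return {ELEMENT_TO_DSM[k]: v for k, v in element_msg.items()}

def to_element(dsm_msg: dict) -> dict:
    """Transform a DSM message back into the element-unique interface."""
    return {DSM_TO_ELEMENT[k]: v for k, v in dsm_msg.items()}

msg = {"TGT_RNG": 12000, "TGT_BRG": 270}
assert to_element(to_dsm(msg)) == msg  # the round trip preserves the message
```

Because the transformation is table-driven, integrating a new element reduces to generating its mapping, with little or no change to the integration software itself.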

Automated generation of a DSM using AI has application beyond military systems. Any industry where there are differences in terminology can use this technology to achieve commonality.

REFERENCES:

1. Murphy, A. and Moreland, J. "Integrating AI Microservices into Hard-Real-Time SoS to Ensure Trustworthiness of Digital Enterprise Using Mission Engineering." JIDPS, 25(1), 2021, pp. 38-54. https://dl.acm.org/doi/10.3233/JID-210013

2. Moore, Ryan. "PEO IWS X Program Overview and ICS Development." April 2022. 1100_ICS Engagement brief 03302022_SAS_Distro A.pdf

3. "Generative AI: Key Opportunities and Research Challenges." Carnegie Mellon University/Software Engineering Institute, 2023.

4. "National Industrial Security Program Executive Agent and Operating Manual (NISP), 32 C.F.R. § 2004.20 et seq. (1993)." https://www.ecfr.gov/current/title-32/subtitle-B/chapter-XX/part-2004

KEYWORDS: Power of Artificial Intelligence; Diverse Combat System; Unique Interfaces; Domain Specific Model; DSM; Combat Management System; Natural Language Processing; Integrated Combat System; ICS


** TOPIC NOTICE **

The Navy Topic above is an "unofficial" copy from the Navy Topics in the DoD 25.1 SBIR BAA. Please see the official DoD Topic website at www.dodsbirsttr.mil/submissions/solicitation-documents/active-solicitations for any updates.

The DoD issued its Navy 25.1 SBIR Topics pre-release on December 4, 2024 which opens to receive proposals on January 8, 2025, and closes February 5, 2025 (12:00pm ET).

Direct Contact with Topic Authors: During the pre-release period (December 4, 2024, through January 7, 2025) proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic. Once DoD begins accepting proposals on January 8, 2025, no further direct contact between proposers and topic authors is allowed unless the Topic Author is responding to a question submitted during the Pre-release period.

DoD On-line Q&A System: After the pre-release period and until January 22 at 12:00 PM ET, proposers may submit written questions through the DoD On-line Topic Q&A at https://www.dodsbirsttr.mil/submissions/login/ by logging in and following instructions. In the Topic Q&A system, the questioner and respondent remain anonymous, but all questions and answers are posted for general viewing.

DoD Topics Search Tool: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk via email at [email protected]

Topic Q & A

1/21/25  Q. Reference:
OBJECTIVE: Develop a concept for a comprehensive assessment methodology to automatically generate a surface Domain Specific Model (DSM) and a model of concepts and their relationships (i.e., ontology) from domain-related technical documentation and generate machine readable interface documentation (e.g., JSON, XML). [DEPARTMENT OF DEFENSE Small Business Innovation Research (SBIR) Program SBIR 25.1 Annual Program Broad Agency Announcement (BAA)]

Question:
In what sense is “assessment methodology” being used when describing the objective? A methodology that is assessing a process might not typically “generate” the output content of that process (like a DSM), whereas a methodology that is executing a process would likely produce some usable output content, other than an assessment. Is the desire for both a process that can “generate” a DSM and a method for “assessing” what elements might need to be part of such a process?
   A. Assessment methodology is intended in the second sense, meaning a process to produce a usable output.
1/20/25  Q.
  1. Can you provide more detail regarding what is meant by a surface Domain Specific Model? Specifically;
    1. What is the "Content of the DSM Model" --- Is it a data model for the data and communication elements, such as the sensor data, radar data, gun control data, etc.?
    2. How is this data currently described --- Is it a set of data definition models, or metadata representations, or schema-based representations, etc.?
  2. What is a "Model of Concepts"?
  3. What is the "Content" of the "Model of Concepts" ?
  4. What is the specific meaning of “demonstrate the feasibility” during Phase I?
    1. How is it Represented ?
    2. How much of technical detail is expected in Phase I in order to demonstrate the feasibility?
   A.
  1. Can you provide more detail regarding what is meant by a surface Domain Specific Model? Specifically;
    1. Yes, it is a data model for Surface Navy data and communication elements.
    2. Currently described in various sensor, weapon, and communication element design descriptions (e.g., IDDs) as well as warfighting publications.
  2. What is a "Model of Concepts"?
    1. Essentially a conceptual data model that shows the data that an organization uses or intends to use in business operations. In this case, this would be sensor, weapon, communications, and command and control data related to shipboard tactical operations.
  3. What is the "Content" of the "Model of Concepts" ?
    1. The content would be the data elements and high level relationships.
  4. What is the specific meaning of “demonstrate the feasibility” during Phase I?
    1. This can be anything from a technical description of the process to create the DSM to examples of what the DSM may look like within the scope of Phase I funding.
    2. This would be a description of approach as well as basic examples. The product should provide confidence that a Phase II funding decision would yield a detailed and defendable DSM.
1/12/25  Q. What is the format of the domain-related technical documentation? Is it provided in a machine-readable format (e.g., JSON, XML, YAML), or in a human-readable format (e.g., text files, PDFs, Word documents)?
   A. While there is a possibility some may be machine-readable, the vast majority would be in human-readable formats, such as Word documents and PDFs.
1/12/25  Q. Are there any restrictions or guidelines regarding the specific technologies, platforms, or tools (e.g., cloud services, open-source software) that can be used in Phase I?
   A. No restrictions, as long as handling of the data being utilized is considered. For example, if the government provides CUI data during Phase I, that data would need to be managed appropriately.
1/5/25  Q.
  1. What specific types of domain-related technical documentation will be provided for generating the DSM and ontology? Are there standard formats (e.g., JSON, XML) that must be supported during Phase I?
  2. What criteria or benchmarks will be used to evaluate the effectiveness, efficiency, and trustworthiness of the AI-generated DSM and ontology?
  3. Should the solution prioritize compatibility with existing Integrated Combat System (ICS) tools or frameworks? Are there specific interfaces or standards to consider during the development?
  4. What scope and scale of simulation-based testing are expected during the prototype demonstration in Phase II? Will government-provided simulated data fully represent real-world conditions?
  5. How critical is the explainability of the AI system? Should the solution include features to explain decision-making processes or ontology relationships to end-users?
  6. For potential non-military applications, are there specific industries or use cases (e.g., manufacturing, healthcare) the solution should address during design?
   A.
  1. Specific types of documentation include interface control documents and NTTPs.
  2. This will vary based on the proposed approach.
  3. It should be compatible with ICS, but there aren’t defined frameworks/schemas to be leveraged yet. JC3IEDM could be a good starting point. Specific interfaces/standards beyond that are not defined.
  4. Simulation in Phase II is not expected to be high fidelity and should instead focus on implementation using realistic elements.
  5. This would be of interest.
  6. None that we are specifically thinking of.
1/3/25  Q.
  1. Is the focus of this effort purely generation of the DSM, i.e. the ontology, or would the actual transformation/integration of the data feeds into the ontology also be in scope?
  2. Does the methodology need to be able to understand both structured data schemas (such as XSD schemas) as well as unstructured technical documentation?
  3. Are changes to the source data schemas, additions of data schemas to the system of systems, and removals of data schemas from the system of systems anticipated? At what frequency?
  4. Are there any particular features or capabilities you consider must-have for the initial concept?
  5. Are there any particular data sources or formats we should focus on initially?
  6. Are there specific scenarios, use cases, or operational environments that the initial concept should simulate during Phase I?
  7. What specific criteria or metrics will be used to determine the success or feasibility of the Phase I concept? How do you envision the AI model being evaluated in terms of effectiveness, efficiency, and accuracy in generating the DSM?
  8. What types of user feedback or input should be collected during Phase I? Are there specific stakeholders (e.g., Navy engineers, system operators) who should be involved in the early-stage evaluation of the concept?
   A.
  1. Actual integration of the data feeds is of interest.
  2. Yes.
  3. Changes are anticipated, but the frequency may be variable, on the order of weeks to months.
  4. Features/capabilities would vary based on proposed solution.
  5. Emphasis will largely be on using technical documentation such as PDF-based documents.
  6. No specific scenarios envisioned.
  7. Specific criteria have not been defined and would be tailored to the proposed solution.
  8. Navy engineers would likely be the user feedback focus for Phase I.
12/20/24  Q. Are there specific computational constraints, hardware requirements, or runtime environments that the solution must be designed around?
   A. There are currently no clearly defined computational constraints.

