Navy STTR 23.A - Topic N23A-T014
NAVSEA - Naval Sea Systems Command
Pre-release 1/11/23   Opens to accept proposals 2/08/23   Closes 3/08/23 12:00pm ET

N23A-T014   TITLE: DIGITAL ENGINEERING - Automated Knowledge Base Extraction and Student Assessment

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Artificial Intelligence (AI)/Machine Learning (ML)

OBJECTIVE: Develop an automated capability to generate exams with answer keys using Artificial Intelligence or Machine Learning (AI/ML)-powered data mining for Undersea Warfare (USW).

DESCRIPTION: When systems are updated, developing training materials for the updated system is labor-intensive and time-consuming. Meanwhile, students are being taught from potentially deprecated information. This is particularly true when the system is complex, and the problem is compounded when updates occur frequently. USW system operation is a highly perishable skill, which increases the importance of accurate, updated training products. The importance of USW to national security necessitates frequent system updates, making the development of up-to-date training materials for the updated capabilities key to harnessing the power of these systems.

Today, over 450 sailors graduate annually from Surface Combat Systems Training Command San Diego (SCSTC-SD). Despite ongoing efforts to continually update USW-related course materials, exams are still manually produced, administered, and graded by hand, consuming numerous man-hours that could be better spent providing course instruction and running tactical scenarios in high-fidelity virtual trainers. Creating a Surface Force USW Knowledge Bank with up-to-date USW references to augment instructional design would significantly decrease this workload.

The Navy seeks a solution that can (1) mine documentation for existing capability associated with the SQQ-89 and UYQ-100, (2) autonomously develop a core USW training knowledge base with associated exam questions and answer keys, and (3) automatically identify deltas to the core knowledge base associated with approved capability improvements and appropriately adjust the core knowledge base to remove deprecated content.
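
As one illustration of requirement (3), the core knowledge base could link every generated exam question to the source passage and document revision it was mined from, so that when a reference is superseded the affected questions can be located and retired. The Python sketch below is a minimal illustration under that assumption; the class and field names are hypothetical, not drawn from the topic.

    # Minimal sketch of a provenance-tracked knowledge base (all names hypothetical).
    from dataclasses import dataclass, field

    @dataclass
    class KnowledgeItem:
        item_id: str
        source_doc: str    # e.g., an FDD or CONOP identifier
        source_rev: str    # document revision the item was mined from
        passage: str       # extracted text the item is based on
        questions: list = field(default_factory=list)  # exam questions with keys

    class KnowledgeBase:
        def __init__(self):
            self.items = {}

        def add(self, item: KnowledgeItem):
            self.items[item.item_id] = item

        def retire_deprecated(self, source_doc: str, current_rev: str):
            """Drop items (and their questions) mined from superseded
            revisions of a document that has since been updated."""
            stale = [k for k, v in self.items.items()
                     if v.source_doc == source_doc and v.source_rev != current_rev]
            for k in stale:
                del self.items[k]
            return stale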

AI/ML techniques such as natural language processing (NLP) and data mining have improved markedly in recent years. The Navy seeks a technology that automatically generates tests and answer keys from functional description documents (FDDs), concepts of employment (CONEMPs), concepts of operation (CONOPs), and other USW references created during capability development and deployment. The AI/ML technology will demonstrate capability at a complexity level analogous to the target USW systems, though the demonstration system need not be a military one. The Government will provide data and specifications as needed to demonstrate the capability.
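
One plausible implementation of the generation step uses an off-the-shelf sequence-to-sequence language model to draft question-and-answer candidates from a source passage. The sketch below is illustrative only: the model choice, prompt, and sample passage are assumptions, and a fielded system would require SME review and models approved for the relevant classification environment.

    # Hedged sketch: draft a quiz question from a passage with an off-the-shelf
    # seq2seq model (model name, prompt, and passage are illustrative assumptions).
    from transformers import pipeline

    generator = pipeline("text2text-generation", model="google/flan-t5-base")

    passage = ("The operator selects the active search mode from the display "
               "console before initiating a transmission.")  # stand-in text

    prompt = f"Write one quiz question and its answer about: {passage}"
    draft = generator(prompt, max_new_tokens=128)[0]["generated_text"]
    print(draft)  # raw draft; an instructor would vet it before it enters the bank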

Utilizing various USW references, including the FDD, CONEMPs, and CONOPs created during capability development, this innovative tool would use AI/ML to search these references and generate an automated testing function. The tool would produce a test bank of applicable, correct questions; generate an answer key; and grade and report test-score results as applicable. Automated exam-generation capability based on materials of interest does not currently exist because of complex, changing system requirements and reference updates. The tool will also appropriately adjust the core knowledge base to remove deprecated content.
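
Of these functions, grading machine-scorable responses against a generated answer key is the most mechanical; a minimal sketch (question IDs and data shapes are illustrative assumptions) might look like the following:

    # Minimal sketch: score a student's responses against a generated answer key.
    def grade_exam(answer_key: dict, responses: dict):
        """Both arguments map question IDs to answer choices."""
        correct = sum(1 for qid, ans in answer_key.items()
                      if responses.get(qid) == ans)
        total = len(answer_key)
        percent = 100.0 * correct / total if total else 0.0
        return correct, total, percent

    key = {"Q1": "B", "Q2": "D", "Q3": "A"}
    student = {"Q1": "B", "Q2": "C", "Q3": "A"}
    print(grade_exam(key, student))  # (2, 3, 66.66...)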

There is currently no technology available that smartly mines data from selected materials to document, categorize, and interpret information, then generates up-to-date exams and administers them. Furthermore, current technologies do not provide the interpretation and evaluation logic to verify that sailors fully understand and experience the intended system improvements.

USW reference resources are drafted and finalized over time, not necessarily in a prescribed order, and they are continuously updated. Vendors provide the FDD outlining the capability of the system: displays, unique features, operating instructions, etc. However, the FDD does not explain how the updated system will function on deployment or how the sailor would employ it specifically for anti-submarine warfare (ASW). Schoolhouse instructors manually translate the FDD into training material by creating a Modernization Training Team PowerPoint deck. AI/ML could be used to glean appropriate training materials from the FDD more efficiently and effectively than an instructor can manually.
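
As a sketch of what such gleaning might mean in practice, the fragment below scores each sentence of an FDD-like text against a small vocabulary of terms of interest and keeps the highest-scoring sentences as training-material candidates. The term list and scoring scheme are illustrative assumptions, far simpler than a production NLP pipeline.

    # Hedged sketch: surface candidate training sentences from an FDD-like text
    # by counting hits against terms of interest (vocabulary is an assumption).
    import re

    TERMS = {"display", "operating", "mode", "operator", "search"}

    def candidate_sentences(text: str, top_n: int = 5):
        sentences = re.split(r"(?<=[.!?])\s+", text)
        scored = [(sum(w.lower().strip(".,;:") in TERMS for w in s.split()), s)
                  for s in sentences]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [s for score, s in scored[:top_n] if score > 0]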

The CONEMP provides a high-level, big-picture look at how the sailor should use the updated system capability: specifically, what the employment is and how it fits into the totality of tools at the sailor's disposal. The information in this document must also be included in the repertoire of training materials. As system engineers evaluate the system, they generate the CONOP, which drills down into details explicitly aimed at how to hunt for submarines; it describes exactly how the operator should use the system for this specific purpose. An AI/ML tool could be used to continuously mine for appropriate course material incorporating the system's operating requirements as they relate to the overall ASW process.

While AI/ML can be utilized to mine course materials from the FDD, CONEMP, CONOPs, and existing training PowerPoint documents, a more tailored and unique AI/ML capability would be necessary to evaluate the more subjective question of whether sailors understand how to use the new system attributes specifically for ASW. AI/ML would need to generate training materials and questions designed to analyze sailor understanding of the system's nuances and of how it works within the larger platform for ASW.

The schoolhouse receives the FDD with a copy of the code associated with the update/upgrade; it does not receive the CONEMP or CONOP immediately. As these documents and additional USW references, such as Operator Employment Guides, become available over time, instructors will need the flexibility to direct the AI/ML tool to incorporate them into ongoing course-material updates. The new technology will be introduced in parallel with capability fielding and will provide training personnel with the most current information to train students. This will increase the speed at which the Fleet can adopt transformational capabilities to maintain a warfighting edge. The threshold requirement is to reduce the delay in developing training products by a factor of two.
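
At its simplest, identifying the deltas when a new revision of a reference arrives could begin with a textual diff: removed passages flag questions to retire, and added passages flag where new questions are needed. The sketch below uses Python's standard difflib for that first pass; a fielded tool would compare at the semantic level rather than line by line.

    # Hedged sketch: first-pass delta detection between two document revisions
    # using a plain textual diff (a real tool would diff semantically).
    import difflib

    def revision_deltas(old_text: str, new_text: str):
        removed, added = [], []
        diff = difflib.unified_diff(old_text.splitlines(),
                                    new_text.splitlines(), lineterm="")
        for line in diff:
            if line.startswith("-") and not line.startswith("---"):
                removed.append(line[1:])  # passages whose questions may be deprecated
            elif line.startswith("+") and not line.startswith("+++"):
                added.append(line[1:])    # passages that may need new questions
        return removed, added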

PHASE I: Develop a concept for an automated capability to generate exams with answer keys using AI/ML-powered data mining that meets the parameters in the Description. Demonstrate that the concept can feasibly meet the requirements through analysis and modeling. The Phase I Option, if exercised, will include the initial design specifications and capabilities description to build a prototype solution in Phase II.

PHASE II: Develop and deliver a prototype automated capability to generate exams with answer keys using AI/ML-powered data mining. Demonstrate functionality under the required service conditions. Demonstrate the prototype performance through the required range of parameters given in the Description. The prototype will be tested by Government subject matter experts (SMEs).

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the technology to Navy use. The final product will consist of a capability to generate exam questions and answer keys from system documentation, generate delta questions and associated keys, and appropriately adjust the core knowledge base to remove deprecated content. The resultant technology will be used and tested by the training and Integrated Logistics Support (ILS) team supporting Undersea Warfare Systems such as AN/SQQ-89A(V)15 and AN/UYQ-100.

The technology developed could also be used to develop training products for any complex skillset subject to rapid change due to policy or technology changes. Examples of such complex skillsets include law enforcement and air traffic control.

REFERENCES:

1. Deshpande, Adit. "Deep Learning Research Review Week 3: Natural Language Processing." 10 Jan 2017. https://adeshpande3.github.io/Deep-Learning-Research-Review-Week-3-Natural-Language-Processing

2. Mooney, Raymond J. and Razvan Bunescu. "Mining Knowledge from Text Using Information Extraction." SIGKDD Explorations, Vol. 7, No. 1, June 2005. https://www.cs.utexas.edu/~ml/papers/text-kddexplore-05.pdf

3. Navy Fact Files. "AN/SQQ-89(V) Undersea Warfare / Anti-Submarine Warfare Combat System." Updated 20 Sep 2021. https://www.navy.mil/Resources/Fact-Files/Display-FactFiles/Article/2166784/ansqq-89v-undersea-warfare-anti-submarine-warfare-combat-system/

4. Navy Fact Files. "AN/UYQ-100 Undersea Warfare Decision Support System (USW-DSS)." Updated 20 Sep 2021. https://www.navy.mil/Resources/Fact-Files/Display-FactFiles/Article/2166791/anuyq-100-undersea-warfare-decision-support-system-usw-dss/

5. Piontek, Mary E. "Best Practices for Designing and Grading Exams." Center for Research on Learning and Teaching (CRLT) Occasional Papers No. 24, 2008. http://www.crlt.umich.edu/sites/default/files/resource_files/CRLT_no24.pdf

KEYWORDS: Automatically generate tests and answer keys; Artificial Intelligence and Machine Learning; AI/ML; natural language processing; NLP; functional description documents; FDDs; concepts of employment; CONEMPs; concepts of operation; CONOPs


** TOPIC NOTICE **

The Navy Topic above is an "unofficial" copy from the Navy Topics in the DoD 23.A STTR BAA. Please see the official DoD Topic website at www.defensesbirsttr.mil/SBIR-STTR/Opportunities/#announcements for any updates.

The DoD issued its Navy 23.A STTR Topics pre-release on January 11, 2023 which opens to receive proposals on February 8, 2023, and closes March 8, 2023 (12:00pm ET).

Direct Contact with Topic Authors: During the pre-release period (January 11, 2023 through February 7, 2023), proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic. Once DoD begins accepting proposals on February 8, 2023, no further direct contact between proposers and topic authors is allowed unless the Topic Author is responding to a question submitted during the pre-release period.

SITIS Q&A System: After the pre-release period, and until February 22, 2023 (at 12:00 PM ET), proposers may submit written questions through SITIS (SBIR/STTR Interactive Topic Information System) at www.dodsbirsttr.mil/topics-app/; log in and follow the instructions. In SITIS, the questioner and respondent remain anonymous, but all questions and answers are posted for general viewing.

Topics Search Engine: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.

Help: If you have general questions about the DoD SBIR/STTR program, please contact the DoD SBIR Help Desk via email at [email protected]

Topic Q & A

1/24/23  Q. What materials (documents, exams, etc.) will be available prior to submitting a proposal? What materials will be available during the project?
   A. Nothing in addition to the published topic and listed references is available prior to submitting a proposal. We are not able to provide classified materials during Phase I, and most of the documentation associated with new capabilities would be classified. However, we are interested in seeing how the proposed technology could take an example expert-written document (e.g., a dissertation, thesis, or highly technical paper) and derive questions and answers for new users from it.
1/24/23  Q. Who are the typical students attending the SCSTC-SD? Do they have additional background knowledge from former training or experiences?
   A. There are a range of students attending SCSTC-SD. Apprentices are learning the system for the first time, but would need exam questions (and answers) related to new capabilities even though all the capabilities are new to them. In similar fashion, journeymen who have trained on prior systems need exam questions (and answers) related to the new capabilities.
1/24/23  Q. What types of exam questions should the system generate? For example: multiple choice, free response, procedural recall, etc.?
   A. The exam questions can range in format, though I don't believe there would be any reason for essay responses. Typical questions would be those where the answer could be graded electronically rather than by human evaluation of long written responses.
1/24/23  Q. Does the Navy anticipate that any of the exams, materials, documents, and/or software will be controlled or classified information?
   A. The materials associated with new capabilities for SQQ-89 and similar systems would be classified. Exams and answer keys could be classified as well.
