N241-043 TITLE: Extended Reality (XR) for Use in Naval Shipyard Industrial Environments
OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Human-Machine Interfaces
OBJECTIVE: Develop extended reality (XR) solutions to reduce safety mishaps, repair/maintenance times, travel expenses, re-work, congestion in work areas, and unnecessary breakdowns and repairs in naval shipyards.
DESCRIPTION: The Navy is seeking the use of XR, which encompasses virtual reality (VR), augmented reality (AR), and mixed reality (MR), to cultivate a digital workforce for driving the digital transformation of Navy Shipyards (NSYs). The Navy is seeking technology to introduce high-velocity learning, first-time quality, and better objective quality evidence (OQE) while removing safety hazards, travel expenses, unplanned work, and the physical costs associated with waste generation, mock-ups, tear-downs, and assembly/disassembly. The Navy currently works and trains in harsh, high-risk environments; lacks operational efficiencies; requires additional travel for troubleshooting, assistance, and inspections; lacks first-time quality (human errors); contends with areas congested by tours and workers; and performs unnecessary disassembly and repairs.
In its current state, shipyard work is carried out by qualified and trained personnel who perform procedures established by technical documentation in physical form. In an industrial environment where losing a single asset can mean million-dollar losses, or where procedures are complex or product timelines time-sensitive, increasing efficiency and eliminating human error wherever possible becomes crucial for the enterprise. Failure of valuable equipment translates to the extra cost of purchasing material and/or the extra man-days required to recover and repair. When equipment in one shop or area is down, it has the potential to create a ripple effect that halts or diminishes work in other areas, leading to even further costs and delays. These issues are currently resolved by following complex procedures for diagnostics and maintenance, some of which are sparsely or poorly documented, or so extensive that they require additional training or explanation when conveyed to maintainers. In some instances, troubleshooting or repair requires an expert to travel to perform the maintenance or repair, incurring additional travel costs. Moreover, record keeping in physical formats is becoming obsolete, so there is a need to transfer physical documents such as standard operating procedures (SOPs), records, and reports into a secure digital format.
XR will cultivate the needed workforce by filling technology gaps, and will unlock new opportunities, drive new efficiencies, and inform analytics-based decision making. Public shipyards are currently working to move design, planning, and execution to computers in order to replace the physical formats currently used. XR will catalyze the workforce to move efficiently in the direction it is seeking, allowing for 2D and 3D interactive digital forms that are easier to understand, work to, and maintain.
The Navy is seeking to develop XR solutions tailored for use in NSY environments, with considerations such as bandwidth availability (e.g., 5G networks), safety effects (e.g., motion sickness), the XR interface (e.g., operation without peripherals such as a mouse, keyboard, or touchscreen), 3D model generation, easy content creation, and IT/cybersecurity zero-trust principles and policies, for use in arenas such as mock-ups, training, surveys, hands-free step-by-step instructions, reality capture (3D modeling), remote assistance, virtual tours, spatial computing, object recognition, dimensional digital twinning, behavioral digital twinning, remote collaboration, digital retention, and record keeping. XR solutions should allow for long-term archival, should not require someone to be physically present for training or assisted applications, and should introduce high-velocity learning, first-time quality, and better objective quality evidence (OQE) while removing safety hazards, travel expenses, unplanned work, and the physical costs associated with waste, mock-ups, tear-downs, and assembly/disassembly. This includes cutting-edge and powerful technologies that provide an engaging environment, leading to useful data capture about the task(s) being performed. Advances in XR in the commercial sector could be adapted for shipyard use.
Augmented Reality (AR):
Problem: During maintenance, inspection, troubleshooting, or any procedural work, unfamiliar situations or anomalies require expert/experienced personnel or engineering intervention over the phone, with verbal or photo descriptions of the situation. The technician or mechanic describes the situation and a solution is implemented for remediation. This all happens without the expert seeing the "true" environment the person is trying to explain or support, and it can delay processes. Step-by-step instructions for complex maintenance or assembly involve steps that can be difficult to remember and are subject to interpretation. Maintainers and operators perform steps by reading from an operating procedure while holding the current step in memory before coming back to the paper copy for further instructions, which can be distracting.
Desired Resolution: The objective is to develop an AR solution that allows for remote assistance and communication in real time using digital tools. These devices would be the eyes and ears of the expert, and the on-site professional can act as the surrogate body to accomplish the work. AR methods would avoid transit and travel costs and time, disseminate knowledge more quickly, and allow less experienced personnel to be accompanied virtually by senior personnel or experts when needed. With AR, users would work through steps without diverting their eyes or hands from the work being performed. As tasks are completed, there would be visual confirmation, and the next step would be presented only when the current step is completed safely and with good quality.
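The gated step-by-step behavior described above (present one step, confirm completion, only then advance) can be sketched as a simple procedure sequencer. This is an illustrative sketch only, not a specified design; the step names and the `confirm` callback (which in practice might be a computer-vision check on the headset) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ProcedureStep:
    instruction: str             # text shown in the AR display for this step
    confirm: Callable[[], bool]  # hypothetical check that the step was completed correctly

def run_procedure(steps: List[ProcedureStep], display=print) -> List[str]:
    """Present one step at a time; advance only after confirmation succeeds."""
    log = []
    for i, step in enumerate(steps, start=1):
        display(f"Step {i}: {step.instruction}")
        if not step.confirm():
            # Halt and escalate (e.g., to remote assistance) rather than continue unsafely.
            log.append(f"step {i}: NOT CONFIRMED")
            break
        log.append(f"step {i}: confirmed")
    return log
```

The returned log doubles as objective quality evidence: each confirmed step is recorded, and an unconfirmed step halts the sequence at a known point.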
Virtual Reality (VR):
Problem: Training is primarily performed in classroom-based sessions, with physical mock-ups, part-task training, just-in-time training, and work familiarization with examinations and evaluations. Mechanics and technicians need a human trainer and rely on rudimentary tools and paper documentation to complete their job functions. Shipyards incur costs associated with providing the right information, by the right people, at the right place, and with the right equipment. Current mock-ups are physically produced, torn down, and lose historical knowledge over time, costing up to millions of dollars depending on the intricacy of the project. Construction requires experts from various organizations and locations to meet for collaboration, and usually entails tours of areas where work will be performed, which congest and distract those areas. Space at NSYs is at a premium: rather than bringing in large, bulky machinery and components for training or mock-ups, or dedicating large spaces to them, VR would save the space consumed by mock-ups, tear-downs, and storage of unused mock-ups by providing access to a 3D virtual model that can be visualized in its intended space at 1:1 scale.
Desired Resolution: The objective is to develop a VR solution to modernize operations and amplify physical mock-up training by creating immersive training environments, streamlining the creation and sustainment of technical documents, and enhancing mechanic, technician, and engineering services. VR methods would reduce costs by decreasing the amount of text to be translated; streamlining training processes for operators of complex equipment; training in safe environments for tasks that would otherwise be harmful, expensive, or dangerous; and providing the opportunity to train on exact configurations rather than "similar" configurations. VR would allow access to features through a 3D virtual model that can be visualized in its intended shop or space at 1:1 scale, removing the need to view it in a reduced format. Additional text, graphics, and videos should be superimposed along with any manuals, procedures, or documentation. The VR solution should allow all parties to collaborate from their current location(s), removing travel expenses. Furthermore, mock-ups and spaces would be viewed digitally and can include an unlimited number of machines or pieces of equipment of any size, enabling realistic demonstrations similar to physical spaces. This removes the need to physically disrupt locations or to build expensive parts for demonstrations. Mock-ups will have the capability of being digitally placed in the intended location to capture any potential issues that may arise during installation or use before physical movement takes place.
Mixed Reality (MR):
Problem: Quality assurance and inspection tasks are subject to human error with a potential for negative outcomes such as equipment failure, injury, pollution, damage, and more. Inspectors perform inspections using checklists in order to confirm quality of work, equipment, or structures.
Desired Resolution: The objective is to develop an MR solution that provides enhanced information to the inspector. The user can be guided through the inspection process, where settings, states, locations, and parts are presented to the inspector, who can compare them against expected values and verify them with automated visual confirmation; the system should alert the user when results are outside expected values. MR methods would reduce inspection errors due to human error, improve the reliability of inspection tasks, assist with interactive checks, and automate storage of results for traceability and reproducibility. MR should also provide the value of monitoring student behaviors to ensure proper job execution and increased procedure accuracy. Instructors can adjust training based on a student's ergonomics, positioning, and time spent in certain areas to help reduce safety risks and exposure while applying best ergonomic practices.
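The expected-value comparison and traceable record keeping described for MR inspection can be sketched as a tolerance check that emits a timestamped, storable record. This is a minimal illustration under assumed inputs; the checklist point names, expected values, and tolerances are hypothetical, not drawn from any Navy procedure.

```python
import datetime

def check_point(name, expected, measured, tol):
    """Compare one inspection reading against its expected value within tolerance."""
    ok = abs(measured - expected) <= tol
    return {"point": name, "expected": expected, "measured": measured,
            "tolerance": tol, "pass": ok}

def run_inspection(checklist, readings):
    """Evaluate every checklist point; emit a timestamped record for traceability."""
    results = [check_point(name, exp, readings[name], tol)
               for name, (exp, tol) in checklist.items()]
    return {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "results": results,
        # Points outside tolerance become alerts for the inspector.
        "alerts": [r["point"] for r in results if not r["pass"]],
    }
```

Storing the full record (not just pass/fail) is what enables reproducibility: a later reviewer can see the measured value, the expected value, and the tolerance that was applied.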
PHASE I: Develop a concept to implement XR solutions into secured industrial workspaces by identifying the highest anticipated risks associated with the concept and proposing viable mitigation strategies for technological and reliability challenges. Feasibility must be demonstrated through modeling and analysis for a useful product to be used in NSYs, including the technical feasibility of integrating virtual or augmented visuals into current processes as well as meeting Risk Management Framework guidelines associated with cybersecurity compliance.
The Phase I Option, if exercised, will include the initial design specifications and capabilities description to build a prototype solution in Phase II.
PHASE II: Develop, test, and deliver an XR prototype for existing networks utilizing current work processes. Develop a hardened system architecture and complete the risk management process for gaining cybersecurity accreditation for system deployment. Develop high-fidelity prototype(s) that are acceptable for use within the current NSY infrastructure, and demonstrate technological competence through evaluation and modeling on systems and processes already available and in place at NSYs. Demonstrate prototype performance in a simulated or realistic/piloted environment. Identify, evaluate, and mitigate risks, roadblocks, and challenges. Create milestones to incorporate this technology into the Phase III development plan.
PHASE III DUAL USE APPLICATIONS: Further refine the prototype(s) and support the Navy in transitioning, testing, validating, and certifying the technology for shipyard use. Introduce training and incorporate the product into NSY processes for sustainment.
Commercial applications may include, but are not limited to, any public industrial environment setting performing common trade work, maintenance, repair, or inspections.
KEYWORDS: Extended Reality; Augmented Reality; Virtual Reality; Mixed Reality; Remote Assistance; Navy Shipyards
** TOPIC NOTICE **
The Navy Topic above is an "unofficial" copy from the Navy Topics in the DoD 24.1 SBIR BAA. Please see the official DoD Topic website at www.defensesbirsttr.mil/SBIR-STTR/Opportunities/#announcements for any updates.
The DoD issued its Navy 24.1 SBIR Topics pre-release on November 28, 2023, opened the BAA to receive proposals on January 3, 2024, and closes it on February 21, 2024, at 12:00 PM ET.
Direct Contact with Topic Authors: During the pre-release period (November 28, 2023 through January 2, 2024), proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic. Once DoD begins accepting proposals on January 3, 2024, no further direct contact between proposers and topic authors is allowed unless the Topic Author is responding to a question submitted during the pre-release period.
SITIS Q&A System: After the pre-release period, until January 24, 2024, at 12:00 PM ET, proposers may submit written questions through SITIS (SBIR/STTR Interactive Topic Information System) at www.dodsbirsttr.mil/topics-app/ by logging in and following the instructions. In SITIS, the questioner and respondent remain anonymous, but all questions and answers are posted for general viewing.
Topics Search Engine: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.
Q: During Phase I, would we already be able to start working and testing directly with the SBIR POC team or with maintenance workers at one of the shipyards recommended by the SBIR POC, to get early feedback and data for the Phase II prototype design? Would the SBIR POC set up contact between the company and a team of your choice, or should the startup bring its own contact?
A: Yes, you would be interfacing with a diverse team from the public shipyards who will provide feedback and data for the prototype design. This is necessary in order to provide enough information for a successful Phase II prototype that fits into the shipyard's infrastructure. The government SBIR POCs will set up regular cadences via Microsoft Teams on FlankSpeed and bring in the necessary and available team members to interface with the company.
Q: There are three distinct requests in this topic:
A: The Navy will consider proposals specialized in one or more of the AR/VR/MR sub-topics. The crucial element is interoperability, so that content is reusable between the domains. The primary objective is a hands-free display (augmented reality) of procedural content and cueing as a performance support tool for industrial maintenance personnel, but there is a continuum from training to task performance support that virtual content maps to very well. Immersive virtual content would be used for basic familiarity training and refresher rehearsals; mixed reality content using AR and physical training devices / part-task trainers would be used for more expert levels of training; and finally, a subset of just the cueing and automated QA functions would be used as the performance support tool. The idea is for the maintainer to see the same cueing across the entire training-to-performance continuum. Procedural content standards the Navy currently uses are MIL-STD-3008 and S1000D. Some visual "look and feel" standards can be found in MIL-STD-1472H (or the latest version), with some new commercial standards being developed by the International Mixed Reality Standards Association (IMRSA) as a subsidiary of the Object Management Group (OMG). To date, commercial simulation engines such as Unity and Unreal Engine have been used to sequence the content, with physical device management addressed via the OpenXR library to the greatest extent possible. The DoD has used both visual content standards: GL Transmission Format (glTF), managed by the Khronos Group, and the Pixar Universal Scene Description (USD) file type.
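Since a glTF 2.0 document is plain JSON, the interoperable content payload the answer refers to can be illustrated with nothing but the standard library. The sketch below builds a minimal glTF skeleton per the Khronos glTF 2.0 specification (the only strictly required property is `asset.version`); the scene and node names ("mockup", "pump_assembly") are hypothetical examples, not part of the standard.

```python
import json

# Minimal glTF 2.0 document skeleton. Per the Khronos spec, only the
# "asset" object with a "version" is required; a scene graph is expressed
# as scenes -> nodes, to which meshes/materials are attached as content grows.
gltf = {
    "asset": {"version": "2.0", "generator": "example-sketch"},
    "scene": 0,                                   # index of the default scene
    "scenes": [{"name": "mockup", "nodes": [0]}], # hypothetical mock-up scene
    "nodes": [{"name": "pump_assembly",           # hypothetical asset node
               "translation": [0.0, 0.0, 0.0]}],
}

# .gltf files are plain JSON text; .glb is the binary container variant.
doc = json.dumps(gltf, indent=2)
```

Because the format is JSON, the same asset can be authored once and loaded by Unity, Unreal Engine, or a custom OpenXR application, which is the interoperability property the Navy emphasizes.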
Q: Phase I objectives reference "secured industrial workspaces" and cybersecurity. Can you please provide more details about the security/cybersecurity compliance the solution must adhere to, including but not limited to: will the end solution be used in areas where forward-facing cameras on the headset must be disabled, will the software need to reside on or use storage on FlankSpeed, and any other factors that would impact headset and platform selection?
A: Use of technologies involving extended reality (XR) requires an authorization to operate (ATO) at impact level (IL) 4 or 5. IL4 accommodates DoD Controlled Unclassified Information (CUI), and IL5 accommodates DoD CUI and National Security Systems (NSS). The cameras on the headset can be used, but they will need to go through the physical security requirements established by the shipyards and will require approvals for use. Although FlankSpeed is encouraged, the software is not required to reside in FlankSpeed; other storage will require ATOs similar to FlankSpeed's. Other factors affecting selection would be determined during Phase I.
Reference for DOD and Navy cybersecurity standards: DODI 8510.01 and SECNAV M-5239.3