Mixed Reality Point Cloud Manipulation

Navy SBIR 25.1- Topic N251-033
Naval Sea Systems Command (NAVSEA)
Pre-release 12/4/24   Opens to accept proposals 1/8/25   Closes 2/5/25 12:00pm ET

N251-033 TITLE: Mixed Reality Point Cloud Manipulation

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Sustainment

OBJECTIVE: Develop a capability to visualize and modify 3-D point cloud models generated by Light Detection and Ranging (LiDAR) and photogrammetry using mixed reality hardware, improving the ability of engineers and technicians to perform virtual ship checks that support design, installation, and modernization and help deliver ships on time at lower cost.

DESCRIPTION: Program Executive Offices (PEOs), shipyards, Original Equipment Manufacturers (OEMs), Alteration Installation Teams (AITs), Regional Maintenance Centers (RMCs), and others perform countless ship checks and inspections throughout a ship’s lifecycle. Investments are currently being made in creating dimensional digital twins with LiDAR, photogrammetry, and other 3-D scanning technologies. These technologies have proven invaluable for generating 3-D models that aid in various maintenance and sustainment functions throughout an asset’s lifecycle, but the Navy does not have an effective environment for visualizing and collaboratively reviewing ship models.

3-D model generators and consumers visit ships, submarines, or other physical objects of interest; 3-D scan the physical asset using LiDAR or photogrammetry; generate a 3-D data model with point cloud software; and then view the 3-D model in a 2-D environment (typically a computer monitor) to support future 3-D work (e.g., installation and modernization). This approach limits user performance and fidelity relative to what fully 3-D models offer, reducing the effectiveness of the technology.
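
For reference, the desktop review step of that workflow reduces to something like the following sketch, which assumes the open-source Open3D library and a hypothetical scan file (scan.ply) rendered on a conventional 2-D monitor:

    # Sketch of today's desktop review step: load a LiDAR/photogrammetry point
    # cloud and inspect it in a 2-D window. File name and voxel size are
    # illustrative assumptions.
    import open3d as o3d

    pcd = o3d.io.read_point_cloud("scan.ply")      # hypothetical compartment scan
    pcd = pcd.voxel_down_sample(voxel_size=0.02)   # thin dense scans for review
    print(pcd)                                     # point count summary
    o3d.visualization.draw_geometries([pcd])       # flat, monitor-bound view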

Immersive, 3-D native environments such as augmented reality (AR), virtual reality (VR), and holographic displays provide the opportunity to experience 3-D models in their native dimensions by allowing users to explore and visualize every aspect of structures and components in a familiar, lifelike environment. This will allow naval architects, engineers, technicians, logisticians, shipyard workers, and others across the NAVSEA enterprise to gain significantly more value from 3-D models, with the ability to collaborate in real time as if physically visiting the ship as a team.

While specific use cases differ in application, the general improvements to visualization are of scale, proportions, spatial relationships, interferences, and overlays of technical data and annotations from previous inspection and work crews. All of these factors will be invaluable to maintenance planning and coordination. A direct return on investment will be seen through improved detection and resolution of physical interferences, design flaws or conflicts, physical damage to equipment or platforms, and other material-condition issues relative to traditional 2-D renderings on computer screens. Finally, mixed reality will offer collaborative touring, viewing, diagnosis, and resolution of the aforementioned issues, helping diverse teams resolve challenges significantly faster; however, these tools are not yet mature enough for wide adoption.

To improve the application, execution, and use of 3-D scanning technologies for shipyard applications, NAVSEA would greatly benefit from the research, development, and transition of software tools that allow the exploration of models in full 3-D. This concept of employment would be directly applicable to two primary user communities for design purposes:

    1. Ship-level inspections, issue documentation, and tagging, which occur on the deck plates of ships and are reviewed by both local and distributed engineering teams. Teams specifically inspect equipment for work and maintenance discrepancies (paint issues, corrosion, loose nuts, bolts, fittings, et al.), which should be annotated, documented, and reported via Navy IT systems. In a 3-D environment those annotations can be made directly in the model, better correlating issue status with the specific physical location and piece of equipment of concern; models can then be shared across multiple teams to maintain a single operations and maintenance picture (see the annotation sketch after this list).
    2. Long-term (multi-year) and short-term (single-year) modernization planning and design work, which occurs at the shipyard, at contractor offices, or at distributed Navy engineering laboratories. Engineers, architects, and technicians will take existing 3-D models and drawings, import CAD models for future installations and redesigns, look for interferences and for poor condition of existing structures and materials, and annotate corrections that need to be performed by other teams. A collaborative environment where these models can be viewed and toured by diverse teams to rapidly resolve issues is critical, as is the ability to compare as-designed drawings to as-built and current-condition models and to take measurements inside those models (see the interference-check sketch after this list).
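
As an illustration of the annotation concept in item 1, the following minimal sketch anchors a discrepancy note to a 3-D coordinate and serializes it for sharing; the class, field names, and status values are assumptions for illustration, not a government data standard:

    # Minimal sketch of a shareable 3-D annotation record. All names and
    # values below are illustrative assumptions, not a Navy data standard.
    import json
    from dataclasses import dataclass, asdict, field
    from datetime import datetime, timezone

    @dataclass
    class ShipCheckAnnotation:
        position_m: tuple     # (x, y, z) in the scan's coordinate frame, meters
        text: str             # discrepancy description
        author: str
        status: str = "open"  # e.g., open / in-work / resolved
        created_utc: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat())

    note = ShipCheckAnnotation(position_m=(12.4, 3.1, 2.0),
                               text="Loose fitting on vent line",
                               author="inspector_01")
    print(json.dumps(asdict(note)))  # JSON payload shared across teams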

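For the interference check in item 2, a hedged sketch follows, assuming the open-source Open3D library and hypothetical file names (as_built.ply for the ship scan, new_equipment.stl for the planned installation's CAD model); scanned points closer to the CAD mesh than a chosen tolerance flag potential clashes, and an in-model measurement is simply the distance between two picked points:

    # Sketch of an interference (clash) check between an as-built scan and a
    # CAD model of a planned installation, using Open3D. File names and the
    # 5 cm tolerance are illustrative assumptions.
    import numpy as np
    import open3d as o3d

    pcd = o3d.io.read_point_cloud("as_built.ply")          # hypothetical ship scan
    mesh = o3d.io.read_triangle_mesh("new_equipment.stl")  # hypothetical CAD model

    scene = o3d.t.geometry.RaycastingScene()
    scene.add_triangles(o3d.t.geometry.TriangleMesh.from_legacy(mesh))

    pts = o3d.core.Tensor(np.asarray(pcd.points), dtype=o3d.core.Dtype.Float32)
    dist = scene.compute_distance(pts).numpy()     # point-to-mesh distance, meters

    clashes = np.asarray(pcd.points)[dist < 0.05]  # points within 5 cm of the model
    print(f"{len(clashes)} as-built points potentially interfere")

    # An in-model measurement is the distance between two picked points:
    p1, p2 = np.asarray(pcd.points)[[0, 100]]      # indices from a hypothetical picker
    print(f"measured span: {np.linalg.norm(p2 - p1):.3f} m")
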
PHASE I: Provide detailed workflows for ingesting 3-D point clouds into vendor software and hardware. Demonstrate similar capability using contractor-provided data to assess feasibility. To support this, the government will provide detailed requirements for interaction functionality and the data specifications and standards for government models (provided at contract award). The Phase I Option, if exercised, will include the initial design specifications, capabilities description, a preliminary timetable, and a budget to build a scaled prototype solution in Phase II.

PHASE II: Demonstrate the ability to ingest, manipulate, and mark up government-generated 3-D models of Navy-representative ships, with annotations that can be shared across teammates. Develop a full-scale prototype and complete a successful demonstration of the prototype’s capabilities.

PHASE III DUAL USE APPLICATIONS: Assist the Navy in transitioning this technology, in the form of a fully operational system (premised on the Phase II prototype), to government use, initially on DDG 51 class ships. The final product delivered at the end of Phase III will be an integrated hardware and software solution usable by industry, academic, or government engineering and operations teams that can benefit from collaboration in 3-D space. This includes operations planning, construction and construction management, surveying, and any other use case with similar requirements.

REFERENCES:

1. Wirth, Florian et al. "PointAtMe: Efficient 3D point cloud labeling in virtual reality." 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2019.

2. Alexiou, Evangelos; Yang, Nanyang and Ebrahimi, Touradj. "PointXR: A toolbox for visualization and subjective evaluation of point clouds in virtual reality." 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX). IEEE, 2020.

3. Garrido, Daniel et al. "Point cloud interaction and manipulation in virtual reality." 2021 5th International Conference on Artificial Intelligence and Virtual Reality (AIVR).

4. Stets, Jonathan Dyssel et al. "Visualization and labeling of point clouds in virtual reality." SIGGRAPH Asia 2017 Posters, Article No. 31, pp. 1-2.

5. Maloca, Peter M. et al. "High-performance virtual reality volume rendering of original optical coherence tomography point-cloud data enhanced with real-time ray casting." Translational Vision Science & Technology, Vol. 7, No. 2, 2018.

KEYWORDS: LiDAR; Photogrammetry; Point-Cloud; Mixed-Reality; Annotation; Virtual Ship Check

TPOC 1: Jason Bickford
(805) 228-8395
Email: [email protected]

TPOC 2: Nicholas Tastad
(202) 781-265
Email: [email protected]


** TOPIC NOTICE **

The Navy Topic above is an "unofficial" copy from the Navy Topics in the DoD 25.1 SBIR BAA. Please see the official DoD Topic website at www.dodsbirsttr.mil/submissions/solicitation-documents/active-solicitations for any updates.

The DoD issued its Navy 25.1 SBIR Topics pre-release on December 4, 2024 which opens to receive proposals on January 8, 2025, and closes February 5, 2025 (12:00pm ET).

Direct Contact with Topic Authors: During the pre-release period (December 4, 2024, through January 7, 2025), proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic. Once DoD begins accepting proposals on January 8, 2025, no further direct contact between proposers and topic authors is allowed unless the topic author is responding to a question submitted during the pre-release period.

DoD On-line Q&A System: After the pre-release period, and until January 22, 2025, at 12:00 p.m. ET, proposers may submit written questions through the DoD On-line Topic Q&A at https://www.dodsbirsttr.mil/submissions/login/ by logging in and following the instructions. In the Topic Q&A system, the questioner and respondent remain anonymous, but all questions and answers are posted for general viewing.

DoD Topics Search Tool: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk via email at [email protected]

