Spatial Data Comparison for Markerless Augmented Reality (AR) Anchoring
Navy SBIR 2020.1 - Topic N201-019
NAVAIR - Ms. Donna Attick - email@example.com
Opens: January 14, 2020 - Closes: February 26, 2020 (8:00 PM ET)
TECHNOLOGY AREA(S): Human Systems, Information Systems
PROGRAM: PMA251 Aircraft Launch & Recovery Equipment (ALRE)
OBJECTIVE: Develop a software solution to localize an augmented reality (AR) headset user within a space by comparing spatial mapping data collected live from the headset against scanned/modeled data collected at an earlier time and stored on the device. The proposed solution should work with an existing, commercially available AR headset.
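The live-versus-stored comparison the objective describes is commonly framed as rigid point-cloud registration: estimate the rotation and translation that best align the headset's live spatial mesh with the stored scan, which simultaneously localizes the user in the stored model's coordinate frame. The topic does not prescribe an algorithm; as one illustrative possibility, the sketch below implements basic iterative closest point (ICP) in NumPy. All function names are hypothetical, and a production system would use a KD-tree for correspondence search and robust outlier rejection rather than the brute-force matching shown here.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping corresponding
    Nx3 points src onto dst: returns R, t with dst ~= src @ R.T + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(live, stored, iters=50, tol=1e-9):
    """Align live headset points (Nx3) to a stored scan (Mx3).
    Returns the accumulated rotation R and translation t such that
    live @ R.T + t lies on the stored scan."""
    src = live.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force for clarity;
        # replace with a KD-tree query in practice).
        d = np.linalg.norm(src[:, None, :] - stored[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        R, t = best_fit_transform(src, stored[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d[np.arange(len(src)), idx].mean()
        if abs(prev_err - err) < tol:            # converged
            break
        prev_err = err
    return R_total, t_total
```

Because ICP only converges from a rough initial alignment, an on-device pipeline would typically seed it with a coarse global estimate (e.g., feature matching against the stored model) before refining the pose; the recovered R and t then anchor holograms in the stored scan's frame without any fiducial marker.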
DESCRIPTION: The Navy and Marine Corps currently have several efforts underway applying AR technology to provide maintainer guidance, improve maintenance-action success rates, and reduce repair times. Many current commercial off-the-shelf (COTS) AR hologram anchoring solutions use the device's onboard camera and fiducial marker detection to localize a user in space and overlay instructions, animations, warnings, schematics, and technical data, but these solutions are limited by the chosen device's camera quality and computational power. Target-based solutions also require that a physical marker be placed on the piece of equipment to be detected, which is unacceptable in a number of maintenance environments. More powerful image and object recognition technology exists that forgoes the need for a fiducial marker, but these solutions depend heavily on uploading government data to proprietary cloud services, which severely restricts their use due to both government data sensitivity and cyber limitations on internet access.
PHASE I: Design, develop, and demonstrate the feasibility of a software solution that meets the requirements provided in the Description. Design a high-level software architecture and use a simplified example of the methodology as a proof of concept. The Phase I effort will include prototype plans to be developed under Phase II.
PHASE II: Build and demonstrate a prototype system for a chosen AR headset and test it in both interior and exterior environments to highlight capability in lighting conditions ranging from bright sunlight to darkness, in all weather conditions.
PHASE III DUAL USE APPLICATIONS: Further develop the solution on the chosen AR device. Transition it as Support Equipment within other Navy-developed applications.
REFERENCES:
1. Liu, L., Li, H., & Gruteser, M. "Edge Assisted Real-time Object Detection for Mobile Augmented Reality." Proceedings of the 25th Annual International Conference on Mobile Computing and Networking, 2019. doi:10.1145/3300061.3300116. www.winlab.rutgers.edu/~luyang/papers/mobicom19_augmented_reality.pdf
2. Dow, E. M., Farr, E. M., Gildein, M. E., II, & Vaughan, M. J. "Augmented Reality Model Comparison and Deviation Detection." U.S. Patent No. US 10,169,384 B2. Washington, DC: U.S. Patent and Trademark Office, 2019. https://www.researchgate.net/profile/Eli_Dow/publication/330090603_Augmented_Reality_Model_Comparison_and_Deviation_Detection/links/5c2ce07192851c22a3554b5c/Augmented-Reality-Model-Comparison-and-Deviation-Detection.pdf?origin=publication_detail
KEYWORDS: Augmented Reality; Mixed Reality; Spatial Data; Fiducial Marker; Hologram