High Dynamic Range Sensor Simulation
Navy SBIR 2008.1 - Topic N08-010
NAVAIR - Mrs. Janet McGovern - [email protected]
Opens: December 10, 2007 - Closes: January 9, 2008

N08-010 TITLE: High Dynamic Range Sensor Simulation

TECHNOLOGY AREAS: Information Systems, Sensors, Human Systems

ACQUISITION PROGRAM: PMA-205, Aviation Training Systems

OBJECTIVE: Establish innovative computer algorithms and associated technologies for creating High Dynamic Range (HDR) sensor simulation that leverages advanced database, rendering, and display capabilities at display-limited resolutions.

DESCRIPTION: With the increased requirements for night operations in all aspects of the military, the use of night imaging devices has grown. As a result, there is greater demand for training systems with an ever-increasing level of accuracy that can no longer be satisfied by the traditional methods of database creation, scene rendering, and display output. Individual advances have been made to increase fidelity, but none have been coordinated in a single effort. For example, the Naval Aviation Simulation Master Plan (NASMP) Portable Source Initiative (NPSI) seeks to standardize archival specifications for high-precision, HDR, and physics-based data types. However, traditional simulation processes, formats, and hardware architectures limit the deployment of emerging HDR display technologies. Solutions should provide generalized ways for the image generator to transition gracefully from stored data resolution to enhanced display-limited resolution beyond the maximum database spatial resolution. In the hardware and rendering software domain, new technologies for processing, storing, and rendering HDR imagery in real time are on the horizon, yet most image generation systems still use the equivalent of traditional fixed-function capabilities, limiting dynamic range to 8 bits per component.
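To illustrate the 8-bit limitation described above, the following minimal sketch (illustrative radiance values, not drawn from the topic) linearly quantizes a scene spanning five orders of magnitude at 8 and 16 bits per component and measures the resulting relative error; note that even 16-bit linear codes lose the darkest values, which is why HDR pipelines typically use floating-point or logarithmic encodings:

```python
def quantize(value, vmax, bits):
    """Linearly quantize a radiance into an unsigned integer code
    of the given bit depth, then reconstruct the radiance."""
    levels = (1 << bits) - 1
    code = round(value / vmax * levels)
    return code / levels * vmax

# Hypothetical scene radiances, from starlit terrain (1e-3)
# to a flare (1e2), in arbitrary units.
radiances = [1e-3, 1e-2, 1e-1, 1.0, 1e1, 1e2]
vmax = max(radiances)

for bits in (8, 16):
    errors = [abs(quantize(r, vmax, bits) - r) / r for r in radiances]
    print(f"{bits}-bit worst relative error: {max(errors):.3f}")
```

At 8 bits the two darkest radiances quantize to code 0 and are lost entirely (relative error 1.0), which is the "limiting dynamic range to 8 bits per component" problem in concrete form.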
Physically representative, high-fidelity, real-time rendering of environmental components such as lighting and atmospherics is just starting to enter the market, yet only a few systems use such technologies. Finally, display systems are coming to market that produce a far greater range of intensities (16 bits per component), yet few programs are investigating how to bring such technology to bear in the simulation of sensor imagery. New techniques and algorithms are required to move sensor simulation from the traditional 8-bit world to support HDR throughout the entire system. Additional requirements are to identify gaps in the traditional workflow and to produce algorithms and techniques that preserve dynamic range within source data, pipeline computation, and display representation. Emerging display and graphics-architecture technologies that are physically as well as perceptually accurate can be exploited in developing advanced sensor simulation systems.

PHASE I: Propose innovative new techniques for creating run-time databases that preserve the dynamic range of a variety of simulated sensor imagery from source data. Demonstrate the feasibility of the proposed approach through a detailed analysis of frame-rate performance and dynamic range preservation. Consider sensor imagery variables and outline scene inference methods for different natural features (vegetation, rocks, etc.) and cultural features (roads, houses, power lines, etc.). Propose new mathematical/physics-based modeling algorithm(s) that derive high dynamic range scene imagery from source data.

PHASE II: Demonstrate an end-to-end HDR sensor simulation that uses the algorithms, techniques, and understanding developed in Phase I. Demonstrate with both specific natural and cultural objects being rendered, and collect data to compare the simulations with actual sensor imagery as a validation of the algorithms' effectiveness.
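One minimal metric for the comparison against actual sensor imagery is dynamic range expressed in photographic stops. The sketch below (hypothetical pixel values, Python for illustration only) computes it for a reference frame and a simulated frame whose shadow detail was crushed by an 8-bit pipeline stage:

```python
import math

def dynamic_range_stops(pixels, floor=1e-6):
    """Dynamic range of an image in stops: log2 of the ratio of the
    brightest usable pixel to the darkest usable pixel."""
    usable = [p for p in pixels if p > floor]
    return math.log2(max(usable) / min(usable))

# Hypothetical data: reference sensor frame vs. a simulated frame
# whose darkest value was lifted by quantization in the pipeline.
reference = [0.002, 0.05, 1.3, 40.0, 900.0]
simulated = [0.02, 0.05, 1.3, 40.0, 900.0]

ref_dr = dynamic_range_stops(reference)
sim_dr = dynamic_range_stops(simulated)
print(f"reference: {ref_dr:.1f} stops, simulated: {sim_dr:.1f} stops")
print(f"dynamic range preserved: {100 * sim_dr / ref_dr:.0f}%")
```

A ratio near 100% indicates the pipeline preserved the range; a shortfall localizes where degradation occurred and can feed the quantification and mitigation analysis the topic calls for.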
Show, through measurement and analysis, that dynamic range was preserved. In cases where it was degraded, quantify the degradation and suggest mitigations.

PHASE III: Finalize and produce the software as a standalone, fully capable sensor simulation application that can be installed at training sites. Transition the new technology into existing training simulation systems.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: There is commercial potential in both the defense and commercial sectors, including Homeland Security, Law Enforcement, Public Safety, and Business Intelligence. Industries that would benefit range from geo-specific imagery for land management to entertainment gaming.

REFERENCES:
2. Roimela, K., T. Aarnio, and J. Itaranta. "High Dynamic Range Texture Compression." SIGGRAPH 2006 Proceedings (August 2006).
3. Mantiuk, R., A. Efremov, K. Myszkowski, and H. Seidel. "Backward Compatible High Dynamic Range MPEG Video Compression." SIGGRAPH 2006 Proceedings (August 2006).
4. Lindsay, C., and E. Agu. "Real-time Wavelength-dependent Rendering Pipeline." SIGGRAPH 2006 Proceedings (August 2006).
5. Olano, M., and B. Kuehne. "SGI OpenGL Shader™ Level-of-Detail White Paper." SGI Document 007-4555-001, 2002.
6. Bloom, C. "Terrain Texture Compositing by Blending in the Frame-Buffer (aka 'Splatting' Textures)." Nov. 2, 2000.
7. Tatarchuk, N. "Practical Parallax Occlusion Mapping with Approximate Soft Shadows for Detailed Surface Rendering." ACM SIGGRAPH 2006 Courses, pp. 81-112.
8. Brawley, Z., and N. Tatarchuk. "Parallax Occlusion Mapping: Self-Shadowing, Perspective-Correct Bump Mapping Using Reverse Height Map Tracing." In ShaderX3: Advanced Rendering with DirectX and OpenGL, W. Engel, Ed., Charles River Media, 2004, pp. 135-154.
9. Heidrich, W., and H.-P. Seidel. "Ray-tracing Procedural Displacement Shaders." In Graphics Interface, 1998, pp. 8-16.
10. Kaneko, T., T. Takahei, M. Inami, N. Kawakami, Y. Yanagida, T. Maeda, and S. Tachi. "Detailed Shape Representation with Parallax Mapping." In Proceedings of ICAT 2001, pp. 205-208.

KEYWORDS: Sensor; Rendering; Simulation; Training; High Dynamic; Visual

TPOC: (407) 380-4631