Optical Perception System for Situational Awareness and Contact Detection for Unmanned Surface Vessels
Navy SBIR 2011.3 - Topic N113-175
NAVSEA - Mr. Dean Putnam - [email protected]
Opens: August 29, 2011 - Closes: September 28, 2011

N113-175 TITLE: Optical Perception System for Situational Awareness and Contact Detection for Unmanned Surface Vessels

TECHNOLOGY AREAS: Sensors

ACQUISITION PROGRAM: PMS 406 Unmanned Influence Sweep System Program of Record - ACAT III

OBJECTIVE: To develop an optical perception system for unmanned surface vessels (USVs) to support the lookout function as defined by COMDTINST M16672.2D, "Navigation Rules" Rule 5.

Transition Path: Littoral Combat Ship (LCS) Mine Warfare Mission Package: Unmanned Influence Sweep System (UISS) and other Navy USVs under PEO LMW PMS420 Unmanned Maritime Systems Program Office

DESCRIPTION: The Unmanned Surface Vehicle (USV) at the heart of the UISS is required to follow Navigation Rule 5: "Every vessel shall at all times maintain a proper look-out by sight and hearing as well as by all available means appropriate in the prevailing circumstances and conditions so as to make a full appraisal of the situation and of the risk of collision." Since the vessel is unmanned, the lookout function must be supported by a perception system consisting of sensors and processing that provide situational appraisal to a remote operator of the USV and to an onboard automated command and control system.

The current capability for providing the lookout function onboard the USV consists of a camera system, a radar, and a microphone. These provide only rudimentary situational awareness, without sufficient data to enable appropriate action based on a full appraisal of the situation and the risk of collision. The focus of this topic is to develop an innovative optical sensor and processor subsystem for the total perception processing system. The optical subsystem will provide a continuous 360 degree field of view, process the raw data, and provide the contact attributes as an output to an operator or an onboard autonomous control system to support obstacle/collision avoidance in accordance with Navigation Rule 5. Current state-of-the-art optical perception systems do not meet the goals of USV operational needs with respect to the Navigation Rules. The Navy has reviewed and used a variety of optical technologies and strategies to provide USVs with optical situational awareness (SA) and contact detection (CD), but to date these approaches lack the ability to satisfactorily capture images and process the digital data, and they fail to meet requirements with respect to performance (stabilization, coverage, range, obstacle detection) and environment (shock, water intrusion, green water impact).

Furthermore, these existing technologies are not suitable for supporting even basic USV operation by a remote operator; they simply overload the operator with information. A human onboard a craft can quickly rotate to get a 360 degree appraisal of the environment and is self-stabilizing. Requiring an operator behind a remote console to control a pan-tilt-zoom camera or to switch between multiple fixed camera views, as current technology does, is an extremely ineffective and fatiguing approach. The processor subsystem should collect all data from the sensor and process it into a usable output format. Output types would include streaming video, still pictures of contacts of interest, and contact attribute data.

Contacts may include power and sailing vessels of all sizes, buoys and other navigation markers, structures on land including lighthouses, and floating, semi-submerged debris (from a log to an ISO container). Attributes may include contact size, height-to-length ratio, range, bearing, and speed/direction. The objective is to provide the contact attributes a person would need to make a full appraisal of the situation and of the risk of collision.
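For illustration only, the sketch below shows one way a per-contact attribute record might be structured at the processor output; the field names, units, and the use of Python are assumptions made for this example and are not specified by this topic.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ContactClass(Enum):
    # Illustrative contact categories drawn from the description above
    POWER_VESSEL = "power_vessel"
    SAILING_VESSEL = "sailing_vessel"
    BUOY_OR_MARKER = "buoy_or_marker"
    LAND_STRUCTURE = "land_structure"
    FLOATING_DEBRIS = "floating_debris"
    UNKNOWN = "unknown"


@dataclass
class ContactReport:
    """Hypothetical per-contact attribute record produced by the optical processor."""
    contact_id: int
    timestamp_utc: float                    # seconds since epoch at time of detection
    bearing_deg: float                      # relative bearing from USV bow, 0-360 degrees
    range_yd: Optional[float]               # estimated range in yards, None if not resolvable
    length_m: Optional[float]               # estimated contact length in meters
    height_m: Optional[float]               # estimated contact height in meters
    height_to_length: Optional[float]       # height-to-length ratio, if both estimates exist
    speed_kt: Optional[float]               # estimated speed over ground in knots
    course_deg: Optional[float]             # estimated direction of travel, true degrees
    contact_class: ContactClass = ContactClass.UNKNOWN
    snapshot_path: Optional[str] = None     # still image of the contact, if captured
```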

The processor shall have the capability to detect navigation lights and day shapes on other vessels (Navigation Rules, Part C) from the raw sensor data and provide their attributes. The processor shall also have the capability to detect navigation aids and provide their attributes, such as color, lights, and shapes.
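As a purely notional sketch of how colored navigation lights might be isolated in a single camera frame, the snippet below applies HSV color thresholding; the color bounds, the helper name detect_navigation_lights, and the choice of OpenCV are assumptions for illustration and do not represent a required approach.

```python
import cv2
import numpy as np

# Approximate HSV bounds for red, green, and white lights (illustrative values only;
# real bounds would be calibrated against the actual sensor and conditions).
LIGHT_COLOR_BOUNDS = {
    "red":   [((0, 120, 180), (10, 255, 255)), ((170, 120, 180), (180, 255, 255))],
    "green": [((45, 80, 180), (90, 255, 255))],
    "white": [((0, 0, 220), (180, 40, 255))],
}


def detect_navigation_lights(frame_bgr, min_area_px=4):
    """Return (color, centroid_x, centroid_y) tuples for bright blobs matching
    typical navigation-light colors in one BGR frame (hypothetical helper)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    detections = []
    for color, bounds in LIGHT_COLOR_BOUNDS.items():
        mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lo, hi in bounds:
            mask |= cv2.inRange(hsv, np.array(lo), np.array(hi))
        # OpenCV 4.x contour API: returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) >= min_area_px:
                m = cv2.moments(c)
                detections.append((color, m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return detections
```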

Environmental effects must be taken into account in developing the optical subsystem. These include water intrusion/impacts and craft motions. State-of-the-art systems have not been operated in higher sea states and thus have not addressed such issues as motion, shock, vibration, water spray, and water impact. The optical subsystem must be capable of both performing and surviving in the intended environment.

The subsystem must be able to receive communications directing it, for example, to zoom in on an image or replay a captured sequence. This communication could come from a remote human operator or an onboard autonomous control system, both of which will be receiving inputs from the radar and audio sensor subsystems. Such communications will allow the optical subsystem to "focus" both the optical sensor and processing power on an indicated area. This would be similar to a human operator who hears something coming from a particular direction and focuses in that direction. Further development and integration into a complete perception processing system could occur under Phase III, but the intent of this topic is only to define such interfaces.
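The sketch below illustrates, under assumed command names and fields, what such a cueing interface could look like; the CueCommand values, CueRequest fields, and handler are hypothetical placeholders intended only to make the concept concrete, since the actual interfaces would be defined later.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class CueCommand(Enum):
    # Hypothetical cueing commands from a remote operator or onboard autonomy
    ZOOM_ON_BEARING = auto()     # concentrate optical and processing resources on a bearing
    REPLAY_SEQUENCE = auto()     # replay a previously captured video sequence
    SNAPSHOT_CONTACT = auto()    # capture and return a still image of a tracked contact


@dataclass
class CueRequest:
    command: CueCommand
    bearing_deg: Optional[float] = None      # required for ZOOM_ON_BEARING
    contact_id: Optional[int] = None         # required for SNAPSHOT_CONTACT
    start_time_utc: Optional[float] = None   # required for REPLAY_SEQUENCE
    duration_s: Optional[float] = None       # optional replay length


def handle_cue(request: CueRequest) -> None:
    """Dispatch a cue to the optical subsystem (placeholder logic for illustration)."""
    if request.command is CueCommand.ZOOM_ON_BEARING:
        print(f"Steering attention to bearing {request.bearing_deg:.1f} deg")
    elif request.command is CueCommand.REPLAY_SEQUENCE:
        print(f"Replaying {request.duration_s or 10.0:.0f} s from t={request.start_time_utc}")
    elif request.command is CueCommand.SNAPSHOT_CONTACT:
        print(f"Capturing still of contact {request.contact_id}")


# Example: the radar subsystem detects a contact at 045 relative and cues the optics.
handle_cue(CueRequest(CueCommand.ZOOM_ON_BEARING, bearing_deg=45.0))
```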

Reference 1, slide 14, provides a picture of the USV and its principal hardware, including the current navigation sensors. The desired camera subsystem should have a field of view (FOV) that provides 360 degrees in the horizontal plane and be able to view contacts on the water surface from within 10 yards of the vessel (man in the water and larger) out to the horizon (12 m long by 3 m high and larger) during operation, which includes significant vessel motions (e.g., incurred during sea state 3 operations) and operations in all visibility conditions (day, night, rain, snow, fog, etc.). The processor shall have the capability to detect a contact on the water or shore from the raw sensor data and provide contact attributes. Maximum detection range for navigation aids, such as buoys, and other vessels is two nautical miles; minimum detection range is 10 yards. Determination of specific requirements for resolution will be the responsibility of the proposer and shall be based on the processor's requirements to perform contact detection as defined above. The camera subsystem would typically be mounted on an arch approximately 10 feet off the water and is subject to sea spray, direct sunlight, and occasional green water impacts.
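To give a feel for the sensing geometry implied by these numbers, the short calculation below estimates the geometric horizon for a camera about 10 feet above the water and the angle subtended by a 12 m by 3 m contact at the 2 nautical mile maximum detection range; the formulas are standard approximations, the pixels-on-target figure is an assumed illustration, and none of the resulting numbers are requirements.

```python
import math

# Assumed values taken from the paragraph above
camera_height_m = 10 * 0.3048        # camera arch height: ~10 ft above the waterline
max_range_m = 2 * 1852.0             # maximum detection range: 2 nautical miles
min_range_m = 10 * 0.9144            # minimum detection range: 10 yards
contact_length_m = 12.0              # smallest "to the horizon" contact length
contact_height_m = 3.0               # smallest "to the horizon" contact height

# Geometric distance to the horizon for an observer at height h (spherical Earth,
# no refraction): d = sqrt(2 * R * h), with R the Earth radius.
earth_radius_m = 6_371_000.0
horizon_m = math.sqrt(2 * earth_radius_m * camera_height_m)
print(f"Geometric horizon from {camera_height_m:.1f} m: {horizon_m / 1852:.1f} nmi")

# Angular extent of a 12 m x 3 m contact at the 2 nmi maximum range.
width_mrad = 1000 * 2 * math.atan(contact_length_m / (2 * max_range_m))
height_mrad = 1000 * 2 * math.atan(contact_height_m / (2 * max_range_m))
print(f"12 m x 3 m contact at 2 nmi subtends ~{width_mrad:.2f} x {height_mrad:.2f} mrad")

# Rough pixel budget: if detection needs N pixels across the contact, a full 360 deg
# panorama needs 2*pi / (subtended angle / N) pixels horizontally.
pixels_on_target = 8  # illustrative assumption, not a requirement
panorama_px = 2 * math.pi / (width_mrad / 1000 / pixels_on_target)
print(f"~{panorama_px / 1e3:.0f} kpx horizontal coverage for {pixels_on_target} px on target")
```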

This SBIR topic is not soliciting the development of computer hardware technology as part of the perception processing system. Ruggedized computing systems exist on the market. Environmental requirements can be met by either using a ruggedized computer able to directly handle the environment or by repackaging the system (shock mounts, cooling, etc.). However, novel optical processing techniques and technologies shall be used to minimize the required processing power and footprint. Hardware selection shall address environmental issues. The processor would normally be installed below decks, in a relatively sheltered compartment not directly exposed to the elements.

PHASE I: Complete preliminary design for the proposed optical sub-system. The design should include details on system hardware and software architecture and should specify key system components and their expected performance. Provide convincing evidence of the feasibility of the system design to meet the objectives of the topic. Perform bench top experimentation where applicable to demonstrate concepts.

PHASE II: Develop detailed hardware and software design for the optical sub-system. Fabricate and test a prototype. In a laboratory environment demonstrate that the prototype meets the performance goals established in Phase I. Verify final prototype operation in a representative environment and provide results. Develop a cost benefit analysis and a Phase III installation, testing, and validation plan.

PHASE III: Construct a full-scale prototype and install on board a selected combatant craft. Conduct extended shipboard testing. Support transition and integration of the subsystem into a full system, including radar and audio subsystems.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The technology developed under this topic will be applicable to any Unmanned Surface Vehicle of similar size and outfitting to the UISS USV. Just as radar has greatly helped the maritime industry with regard to safe navigation, optical perception systems can enhance the safe navigation of manned and unmanned craft alike.

REFERENCES:
1. CAPT Duane Ashton. Unmanned Maritime Systems Overview. Presentation to The Maritime Alliance Conference, 17 November 2010.

2. COMDTINST M16672.2D, Navigation Rules (International-Inland). http://www.navcen.uscg.gov/?pageName=navRulesContent

3. Navigation Rules Frequently Asked Questions, Question 12. http://www.navcen.uscg.gov/?pageName=navRulesFAQ

4. S. Calfee and N. C. Rowe. An Expert System and Tutor for Maritime Navigation Rules. http://faculty.nps.edu/ncrowe/oldstudents/ccrt02b.htm

5. See SITIS under this topic number for Additional Guidance for Compact Autonomous Perception Processing System for Situational Awareness and Contact Detection Unmanned Surface Vessels.

KEYWORDS: camera, sensor, unmanned, USV, optical, perception

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between July 28 and August 28, 2011, you may talk directly with the Topic Authors to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting August 29, 2011, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (11.3 Q&A) during the solicitation period for questions and answers, and other significant information, relevant to the SBIR 11.3 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or by email.