Multimodal Interaction Technologies to Support Small Unit Leaders

Navy SBIR 20.2 - Topic N202-133

Office of Naval Research (ONR) - Ms. Lore-Anne Ponirakis [email protected]

Opens: June 3, 2020 - Closes: July 2, 2020 (12:00 pm ET)


N202-133       TITLE: Multimodal Interaction Technologies to Support Small Unit Leaders


RT&L FOCUS AREA(S): Autonomy

TECHNOLOGY AREA(S): Human Systems


OBJECTIVE: Develop a prototype system that leverages the current state of the art in multimodal input/output (I/O) methodologies to control unmanned systems (UxS) at varying levels (i.e., from issuing broad tasking down to teleoperation) and to monitor status (e.g., video from cameras, position on a map). This system will enable a graceful transition between Human-Computer Interaction (HCI) technologies, including gesture [Ref 1], speech, eyetracking [Ref 2], manual control, teleoperation, and more [Ref 3]. This transition can be initiated by the user or by the system itself upon detecting environmental or operational circumstances.


DESCRIPTION: A number of unmanned systems are being deployed to the Fleet and Force, including to Naval Special Warfare (NSW) operators and U.S. Marine Corps small unit leaders. UxS can provide enhanced command and control (C2) and Intelligence, Surveillance, and Reconnaissance (ISR) capabilities, but how to effectively control these systems remains an open question. Many use cases demand different control schemes. For example, when remotely surveilling a building, direct teleoperation and monitoring through a tablet may suffice. However, in room clearance operations the warfighter's hands will be occupied, so a speech interface and monitoring through a heads-up display (HUD) are ideal. In yet another scenario, eyetracking or a gestural interface may be required. How to gracefully transition between these interaction modalities is unknown, and human-machine teaming is ripe for the integration of interface technologies to support a variety of operations.


One solution is to draw from the communications domain, which uses the Primary, Alternate, Contingency, and Emergency (PACE) model to allow for failover between communications systems. While human-machine interfaces cannot be ranked like communications protocols (e.g., by available bandwidth), different input (control) methods (teleoperation, eyetracking, speech, gestures, etc.) and output (monitoring) methods (weapon-attached screen, tablet, HUD, etc.) each have advantages and disadvantages. It is critical to understand the operator- and environment-centered circumstances under which specific I/O methods work better than others.
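As an illustrative sketch only (not part of the topic requirements), a PACE-style ordering of interaction modalities can be expressed as a priority-ranked failover: walk the list from Primary to Emergency and select the first modality that is currently usable. All class names, modality names, and usability checks below are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Modality:
    """One input or output method, with a check for current usability
    (in a real system: mic signal quality, hands-free state, glare, etc.)."""
    name: str
    is_usable: Callable[[], bool]

def select_modality(pace_order: List[Modality]) -> Optional[Modality]:
    """Return the highest-priority usable modality (Primary first),
    mirroring PACE failover in the communications domain."""
    for modality in pace_order:
        if modality.is_usable():
            return modality
    return None  # nothing usable: degrade to passive alerts only

# Hypothetical ordering for a room-clearance scenario, where the
# warfighter's hands are occupied:
pace = [
    Modality("speech", lambda: True),         # Primary: hands-free
    Modality("eyetracking", lambda: True),    # Alternate
    Modality("gesture", lambda: False),       # Contingency: hands occupied
    Modality("tablet_teleop", lambda: False), # Emergency: hands occupied
]
print(select_modality(pace).name)  # speech
```

In practice the `is_usable` checks would be driven by sensed operator and environmental state, which is exactly the open research question the topic identifies.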


This SBIR topic seeks to integrate existing human-machine interface technologies, minimize the amount of extra equipment needed to be carried by the warfighter, and develop a prototype system that allows for graceful transition between I/O methodologies based on a number of factors (user preference, operational circumstances, system recommendation, etc.). The system should be easy to use by the warfighter and provide flexible interaction modalities with UxS(s) as missions and situations rapidly change.


PHASE I: Determine requirements for how warfighters will use companion UxS(s) in missions, focusing on NSW and Marine Corps squad leader use cases. Collect information on various I/O methodologies and determine how they can be integrated into a holistic UxS control and monitoring system. Phase I deliverables will include: (1) use cases for warfighter and UxS teaming, (2) identification of control and monitoring systems for integration, (3) an understanding of the pros and cons of each I/O modality and associated human factors principles for design, and (4) mock-ups or a prototype of the system.


The Phase I Option, if exercised, should also include the preparation and submission of any required human subjects use protocols. Due to the long review times involved, human subject research is not allowed during Phase I. Phase II plans should include key component technological milestones and plans for at least one operational test and evaluation, including user testing.


PHASE II: Develop a prototype system based on the Phase I effort and conduct a field demonstration between a user and UxS(s). Specifically, the target audience (e.g., NSW operator or Marine Corps squad team leader) will be identified, along with a relevant UxS(s). Technologies identified in Phase I will be integrated with the user's standard equipment. Additional software will be created to manage the various I/O modalities, allowing for smooth user-initiated transitions or software-automated transitions based on detection of environmental or operator workload circumstances. System design will occur in an iterative fashion, with multiple user group interactions feeding back into development. Phase II deliverables include: (1) a working prototype of the system with which a user can control the UxS(s) and switch between modalities, and (2) a field demonstration of a complete or near-complete system to users and stakeholders, with users completing a variety of scenarios and easily switching between input (control) and output (monitoring) methods.
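The modality-management software described above must support both transition paths: user-initiated switching and system-initiated switching triggered by detected circumstances. A minimal sketch of that logic follows; the class, the trigger conditions, and the thresholds are all hypothetical placeholders for illustration.

```python
class TransitionManager:
    """Hypothetical sketch: switch the active I/O modality either on
    user request or automatically when an environmental/workload
    trigger fires, keeping a log of transitions for later analysis."""

    def __init__(self, modalities, initial):
        assert initial in modalities
        self.modalities = modalities
        self.active = initial
        self.log = []  # (from, to, reason) tuples

    def user_request(self, target):
        """User-initiated transition (e.g., a push-to-talk toggle)."""
        if target in self.modalities:
            self._switch(target, reason="user request")

    def sensor_update(self, hands_free: bool, noise_db: float):
        """System-initiated transition based on detected circumstances."""
        if not hands_free and self.active == "gesture":
            self._switch("speech", reason="hands occupied")
        if noise_db > 85 and self.active == "speech":
            self._switch("eyetracking", reason="high ambient noise")

    def _switch(self, target, reason):
        self.log.append((self.active, target, reason))
        self.active = target

# Gesture control becomes unusable (hands occupied) in a loud environment,
# so the system fails over twice: gesture -> speech -> eyetracking.
mgr = TransitionManager({"speech", "gesture", "eyetracking"}, "gesture")
mgr.sensor_update(hands_free=False, noise_db=90)
print(mgr.active)  # eyetracking
```

A real implementation would add the reverse transitions (recovering a preferred modality when conditions improve) and debouncing so the system does not oscillate between modalities, both of which are natural subjects for the iterative user testing the topic calls for.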


PHASE III DUAL USE APPLICATIONS: Support the customer (NSW or Marine Corps) in transitioning the technology for use. Further develop the software and hardware system for evaluation to determine its effectiveness in the field for NSW or Marine Corps scenarios. As appropriate, focus on broadening capabilities and commercialization plans.


Commercially, many companies are exploring different human-machine interface modalities: augmented reality (e.g., Microsoft and Apple), eyetracking (e.g., Tobii and some Samsung Galaxy phones), speech interfaces (e.g., Amazon, Google, and Apple), and gesture control (e.g., Google Pixel phones, Microsoft Kinect). Development of affordable, scalable, and non-proprietary human-machine interfaces is not a current priority in the private sector. However, as new phones, tablets, smart watches, wireless earbuds, AR glasses, and more come to market, the commercial world will need an integrated control scheme to manage these devices without overwhelming the user. Therefore, the technology developed will have broad application to the private sector.


REFERENCES:

1. Cauchard, J. R., Tamkin, A., Wang, C. Y., Vink, L., Park, M., Fang, T., & Landay, J. A. (2019, March). Drone.io: A Gestural and Visual Interface for Human-Drone Interaction. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 153-162). IEEE.


2. Yuan, L., Reardon, C., Warnell, G., & Loianno, G. (2019). Human gaze-driven spatial tasking of an autonomous MAV. IEEE Robotics and Automation Letters, 4(2), 1343-1350.


3. Calhoun, G. L., Draper, M. H., Guilfoos, B. J., & Ruff, H. A. (2005). Tactile and Aural Alerts in High Auditory Load UAV Control Environments. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 49(1), 145-149.


KEYWORDS: Human-computer Interaction, Eyetracking, Speech Interfaces, Gesture Interfaces, Augmented Reality


** TOPIC NOTICE **

The Navy Topic above is an "unofficial" copy from the overall DoD 20.2 SBIR BAA. Please see the official DoD DSIP Topic website at rt.cto.mil/rtl-small-business-resources/sbir-sttr/ for any updates. The DoD issued its 20.2 SBIR BAA on May 6, 2020, which opens to receive proposals on June 3, 2020, and closes July 2, 2020 at 12:00 noon ET.

Direct Contact with Topic Authors. During the pre-release period (May 6 to June 2, 2020) proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic.

Questions should be limited to specific information related to improving the understanding of a particular topic's requirements. Proposing firms may not ask for advice or guidance on solution approach, nor may they submit additional material to the topic author. If information provided during an exchange with the topic author is deemed necessary for proposal preparation, that information will be made available to all parties through SITIS (SBIR/STTR Interactive Topic Information System). After the pre-release period, questions must be asked through the SITIS online system as described below.

SITIS Q&A System. Once DoD begins accepting proposals on June 3, 2020, no further direct contact between proposers and topic authors is allowed unless the topic author is responding to a question submitted during the pre-release period. However, proposers may submit written questions through SITIS at www.dodsbirsttr.mil/submissions/login (log in and follow the instructions). In SITIS, the questioner and respondent remain anonymous, but all questions and answers are posted for general viewing.

Topics Search Engine: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 703-214-1333 or via email at [email protected]
