Methods for Actionable Measures of Absolute Cognitive Workload
Navy STTR 2016.A - Topic N16A-T002
NAVAIR - Monica Clements - navair.sbir@navy.mil
Opens: January 11, 2016 - Closes: February 17, 2016

N16A-T002 TITLE: Methods for Actionable Measures of Absolute Cognitive Workload

TECHNOLOGY AREA(S): Air Platform, Human Systems

ACQUISITION PROGRAM: PMA-202 Aircrew Systems Program Office

OBJECTIVE: Develop an innovative and cost-effective capability that will provide an objective, measurable means of workload for determining impacts on individual operator, crew-level, and/or multi-team system level performance when life support or aircrew systems are added or modified.

DESCRIPTION: Improving affordability is one of the main focus areas in the Naval community, and standardized workload management systems have been deemed an essential component of achieving it. This is partly because, as the capabilities of information technology systems and networks continue to grow, workers are increasingly challenged to process more information, interact with more interconnected systems, and juggle more tasks that compete for their simultaneous attention. It is therefore critical to know human performance limitations when introducing complex cognitive tasks, state-of-the-art technologies, equipment, and new environments to warfighters. Knowledge of these limitations can help researchers and developers understand and evaluate the potentially negative impacts on safety and the efficiency of operations. Frequently, this involves assessing workload impacts; however, current workload assessment methods do not adequately support system development or enhanced decision making with objective measurements.

The current state of the practice is to assess workload, either physical or cognitive, through a variety of assessment methods (Gudipati & Pennathur). The most commonly implemented are subjective measurement techniques (e.g., Bedford, Modified Cooper-Harper, NASA TLX); however, there is an increasing desire for more objective data on which to base decisions. A variety of objective measurement techniques exist for cognitive workload, including performance measures (e.g., reaction time, errors; Kantowitz et al., 1983), psychophysiological measures, and analytical measures. Recent efforts have focused on modeling to help address concerns about limited resources and the impacts of a variety of factors that affect performance (e.g., Pharmer, Paulsen, & Alicia, 2011). New, cost-reducing methods are needed to support systems acquisition decisions, and these methods will need to improve on existing methods in at least three ways, as described below.
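The subjective instruments named above reduce to simple arithmetic, which is part of their appeal and their limitation. As a minimal sketch, the standard weighted NASA-TLX combines six 0-100 subscale ratings with weights derived from 15 pairwise comparisons; the ratings and weights below are invented for illustration, not drawn from any study.

```python
# Minimal sketch of the standard weighted NASA-TLX computation. The
# six subscales are from the instrument; the ratings and pairwise
# weights below are invented for illustration.
SUBSCALES = ("mental", "physical", "temporal",
             "performance", "effort", "frustration")

def tlx_score(ratings, weights):
    """Weighted workload: sum(rating * weight) over the 15 pairwise
    comparisons; ratings are 0-100, each weight is how often that
    subscale won a comparison (0-5)."""
    assert sum(weights[s] for s in SUBSCALES) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 80, "physical": 20, "temporal": 70,
           "performance": 40, "effort": 60, "frustration": 30}
weights = {"mental": 5, "physical": 1, "temporal": 4,
           "performance": 2, "effort": 2, "frustration": 1}
print(tlx_score(ratings, weights))  # -> 62.0
```

Objective measures such as reaction time and error counts would complement or replace these self-report inputs in the hybrid approach this topic seeks.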

This effort seeks to investigate a hybrid approach that would allow real-time measurement of physical and cognitive workload (e.g., measurement results produced as an operator tests new equipment) and, by combining those results with modeling capabilities, support understanding of how variations in the associated factors might impact operator safety and performance. Additionally, as integrated technologies and operations continue to expand, consideration must extend beyond the individual operator to crew-level and multi-team systems.
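One minimal way such a hybrid, real-time capability could be sketched is as a running fusion of a physiological stream and a performance stream into a single index checked against a "red-line" threshold (cf. Grier et al., 2008). Everything below is an illustrative assumption: the two pre-normalized input streams, the equal weights, the smoothing factor, and the threshold are not prescribed by this topic.

```python
# Hypothetical sketch of a real-time composite workload index. Two
# input streams (a physiological proxy and a performance proxy), each
# assumed pre-normalized to 0-1, are fused and smoothed with an
# exponentially weighted moving average, then checked against a
# "red-line" threshold. All weights and thresholds are illustrative.
class WorkloadIndex:
    def __init__(self, alpha=0.2, redline=0.8, w_physio=0.5, w_perf=0.5):
        self.alpha = alpha        # EWMA smoothing factor
        self.redline = redline    # alert threshold on the 0-1 index
        self.w_physio = w_physio  # weight on the physiological stream
        self.w_perf = w_perf      # weight on the performance stream
        self.index = 0.0

    def update(self, physio, perf):
        sample = self.w_physio * physio + self.w_perf * perf
        self.index = self.alpha * sample + (1 - self.alpha) * self.index
        return self.index, self.index >= self.redline

wl = WorkloadIndex()
for physio, perf in [(0.3, 0.2), (0.6, 0.5), (0.9, 0.95), (0.95, 0.9)]:
    idx, alert = wl.update(physio, perf)
print(round(idx, 3), alert)  # the smoothed index lags the raw samples
```

The smoothing illustrates a real design trade-off for any such display: a heavily smoothed index resists sensor noise but lags genuine workload spikes.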

The requested technology should be consistent with research and theory (e.g., Wickens, 2008), assess workload and its effect on performance, and include a strategy for predicting future workload levels once experience is accumulated. As part of this effort, displays that indicate the rate of performance degradation and of workload increases (physical and/or cognitive) should be investigated. Stakeholders should be involved to help shape how the resulting technology highlights when workload levels reach limits that degrade human cognition and performance, so that the information can be used during design, development, testing, and evaluation of operational and training systems in order to support upgrade activities and decision making with objective data.

The measurement tool should also take into account the ways cognitive work changes as expertise develops [2]. Workload measures are frequently sought when a new system or technology is introduced or an existing system is changed. Because operators will have the opportunity to adapt to the system or technology over time, decision makers have yet another reason to discount or doubt the value of the workload measure. Effective cognitive workload assessment tools would take advantage of what is known about expertise acquisition to make sound predictions about the potential for operators in a given domain, with training and practice, to achieve a manageable level of cognitive workload.
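One well-established regularity of expertise acquisition is the power law of practice, under which task time falls as T(N) = a * N^(-b) over N practice trials. A minimal sketch, using hypothetical task-time data (the measure and values are invented for illustration), of fitting the law and extrapolating to a higher practice level:

```python
import math

# Illustrative sketch only: the power law of practice, T(N) = a * N**(-b),
# fit by ordinary least squares in log-log space. The task-time data and
# the choice of measure are hypothetical assumptions, not part of the topic.
def fit_power_law(trials, times):
    xs = [math.log(n) for n in trials]
    ys = [math.log(t) for t in times]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - slope * mx)   # scale (time on first trial)
    return a, -slope                # b > 0 is the learning rate

trials = [1, 2, 4, 8, 16]
times = [10.0, 7.0, 4.9, 3.43, 2.401]   # hypothetical: -30% per doubling
a, b = fit_power_law(trials, times)
# Extrapolate to 32 practice trials:
predicted = a * 32 ** (-b)
print(round(a, 2), round(b, 3), round(predicted, 3))
```

A fit of this kind is one simple way a tool could project whether workload with a new system will settle to a manageable level after training, though extrapolation beyond observed practice levels should be treated cautiously.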

This capability has a range of applicability, from aircrew systems through investigation of life support systems to training systems development and effectiveness evaluations. As we pipe more and more data into our control centers, aircraft cockpits, and automobile consoles, it becomes increasingly critical that we be able to determine when workload affects the operator's ability to compensate safely.

PHASE I: Demonstrate the feasibility, utility, and effectiveness of the proposed approach, as discussed in the Description section, for assessing cognitive workload and its impacts on the degradation of cognition and performance.

PHASE II: Develop a prototype of the absolute cognitive workload technology and refine the underlying cognitive workload assessment method based on research across a range of fast-paced and high-consequence work domains. At least one validation study should evaluate the ability of the technology to make reliable and useful predictions about workload.

PHASE III DUAL USE APPLICATIONS: The company should support the Navy in transitioning the technology by integrating the workload assessment capability into research, development, test and evaluation facilities and programs that support acquisition and training, and should demonstrate cost reduction and benefits to the quality of systems and technology enhancements. This technology can benefit systems development and technology upgrades across the military, Department of Defense, and civilian sectors. For example, the Federal Aviation Administration's (FAA's) NextGen initiative to upgrade and enhance National Airspace System (NAS) operations could benefit from improved workload assessment methods and technologies. The resulting technology will benefit programs by supporting optimization of designs where workload is known to be high and may benefit from automation, artificial intelligence, or enhanced human-machine interfaces, as well as designs where the consequences of degraded performance are high.

REFERENCES:

1. Bi, S., & Salvendy, G. (1994). Analytical modeling and experimental study of human workload in scheduling of advanced manufacturing systems. International Journal of Human Factors in Manufacturing, 4(2), 205-234.

2. Ericsson, K., Charness, N., Feltovich, P., & Hoffman, R. (2006). The Cambridge handbook of expertise and expert performance. New York, NY: Cambridge University Press

3. Fontenelle, G., & Laughery, K. (1988). A workload assessment aid for human engineering design. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 1122-1125). Thousand Oaks, CA: SAGE Publications

4. Grier, R., Wickens, C., Kaber, D., Strayer, D., Boehm-Davis, D., Trafton, J. G., & St. John, M. (2008). The red-line of workload: Theory, research, and design. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 1204-1208).

5. Gudipati, S. & Pennathur, A. Workload Assessment Techniques for Job Design. http://www.semac.org.mx/archivos/6-9.pdf

6. Hollnagel, E. (1998). Cognitive reliability and error analysis method. New York, NY: Elsevier

7. Hollnagel, E., & Woods, D. (2005). Joint cognitive systems: Foundations of cognitive systems engineering. Boca Raton, FL: CRC Press

8. Neville, K., Bisson, R., French, J., Martinez, J., & Storm, W. (1994). A study of the effects of repeated 36-hour simulated missions on B-1B aircrew members. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 51-55). Thousand Oaks, CA: SAGE Publications.

9. Patterson, E. & Miller, J. (2010). Macrocognition metrics and scenarios: design and evaluation for real-world teams. Aldershot, UK: Ashgate.

10. Pharmer, J. A., Paulsen, M., & Alicia, T. J. (2011) Validating Environmental Stressor Algorithms for Human Performance Models. Human Systems Integration Symposium. https://www.navalengineers.org/ProceedingsDocs/HSIS2011/Papers/Pharmer.pdf

11. Wickens, C. D. (2008). Multiple resources and mental workload. The Journal of the Human Factors and Ergonomics Society, 50(3), 449-455. http://www.researchgate.net/profile/Christopher_Wickens/publication/23157812_Multiple_Resources_and_Mental_Workloa

12. Wierwille, W. & Eggemeier, F. (1993). Recommendations for cognitive workload measurement in a test and evaluation environment. Human Factors, 35(2), 263-281

13. Woods, D. (2005). Generic support requirements for cognitive work: laws that govern cognitive work in action. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 317-321). Thousand Oaks, CA: SAGE Publications

14. Xie, B. & Salvendy, G. (2000). Review and reappraisal of modeling and predicting cognitive workload in single- and multi-task environments. Work & Stress, 14(1), 74-99

KEYWORDS: test and evaluation; performance assessment; human-in-the-loop

TPOC-1: 407-380-4773

TPOC-2: 407-380-4528

Questions may also be submitted through the DoD SBIR/STTR SITIS website.

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between December 11, 2015 and January 10, 2016 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 11, 2016, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (16.1 Q&A) during the solicitation period for questions and answers, and other significant information, relevant to the SBIR 16.1 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or sbirhelp@bytecubed.com.