A Systematic Review of Current Adaptive Human-Machine Interface Research

PDF: https://ryanblakeney.com/wp-content/uploads/2020/09/A-Systematic-Review-of-Current-Adaptive-Human-Machine-Interface-Research.pdf

Ryan A. Blakeney

Embry-Riddle Aeronautical University

UNSY 691 Graduate Capstone Paper

Submitted to the Worldwide Campus

in Partial Fulfillment of the Requirements of the Degree of

Master of Science in Unmanned Systems

October 2020

Abstract

Adaptive Human-Machine Interfaces are displays inside a cockpit or control station that change depending on predetermined variables. These interfaces can change the data displayed to the operator to keep workload and situation awareness at levels that support safe and efficient operation of the system. The term Adaptive Human-Machine Interface covers multiple areas of research without indicating a singular topic or design. This study aims to unite future research on Adaptive Human-Machine Interfaces by examining the common categories and sub-categories across multiple research papers to determine the most prominent type of research among Workload Analysis, Situation Awareness, and Autonomy. An extensive search covering areas associated with Adaptive Human-Machine Interfaces produced 108 samples. A Chi-Square goodness of fit test indicated that Autonomy is the most common category in the research of Adaptive Human-Machine Interfaces, χ2 (2, n = 295) = 13.4305, p = 0.0012, while encompassing multiple sub-categories from the three main categories. Of the samples gathered, the research titles fell into five categories; a Chi-Square goodness of fit test on the titles indicated that Adaptive Human-Machine Interface and Adaptive Automation were the most prominent. The title "Adaptive Human-Machine Interface" should be used for future research into this topic, uniting that research under a single term.

            Keywords: adaptive human-machine interface, autonomy, workload, situation awareness, analysis

A Systematic Review of Current Adaptive Human-Machine Interface Research

Significance of the Study

Research on the future of Human-Machine Interfaces (HMI) has led to the development of Adaptive HMI. Adaptive HMI research has forked into multiple efforts with different names that refer to the same topic. For example, the Intelligent Adaptive Interface (IAI) of Hou, Kobierski, and Brown (2007) is an adaptive interface that changes based on the operator state or mission. Another example is the Cognitive Human-Machine Interface (CHMI) of Lim, Ramasamy, Gardi, Kistan, and Sabatini (2018), an adaptive interface that changes based on EEG and eye-gaze measurements.

In some cases, researchers have used the same name and acronym to study similar but distinct methods of Adaptive HMI. Manawadu, Kamezaki, Ishikawa, Kawano, and Sugano (2017) research the use of Adaptive Human-Machine Interfaces in autonomous ground vehicles using operator feedback through hand gestures, whereas Ali, Jain, Lal, and Sharma (2012) research Adaptive Human-Machine Interface changes based on multiple modalities such as operator emotion and eye-gaze. This separation of research under the same name has created numerous unique efforts to define and develop new Adaptive HMI systems.

Statement of the Problem

This study establishes a standard definition for Adaptive HMI to guide current and future research for human factors engineers. Ongoing research on Adaptive HMI carries multiple definitions and understandings of the topic. A single standard definition of Adaptive HMI would provide researchers with a shared vocabulary and a shared understanding of the end goal of their research, and may give human factors engineers a shared mental model when refining or creating new interfaces.

Research Question and Hypotheses

The defined research question is as follows: "What are the most prominent research definitions and standards for Adaptive Human-Machine Interfaces?" The question is designed to address the lack of uniformity in the research of Adaptive HMI. By doing so, the study of the available research into Adaptive HMI presents the most prevalent research and recommends a standard definition for others to consider. This research lends itself to a descriptive qualitative analysis that determines the frequencies of specific definitions and research topics. To determine the most prominent definition and standard for Adaptive Human-Machine Interfaces, the null and alternative hypotheses are as follows.

H1o: There is no statistical significance showing the disparity between definitions of Adaptive Human-Machine Interfaces within the three identified categories (Workload, Situation Awareness, and Autonomy).

H1a: There is statistical significance showing the disparity between definitions of Adaptive Human-Machine Interfaces within the three identified categories (Workload, Situation Awareness, and Autonomy).

In addition to the primary hypothesis, secondary and tertiary hypotheses help determine the most prominent sub-categories and the most prominent title used in research on Adaptive Human-Machine Interfaces. To determine the most prominent sub-category and title, the null and alternative hypotheses are as follows.

H2o: There is no statistical significance showing the disparity between specific types of research of Adaptive Human-Machine Interfaces within the ten identified sub-categories.

H2a: There is statistical significance showing the disparity between specific types of research of Adaptive Human-Machine Interfaces within the ten identified sub-categories.

H3o: There is no statistical significance showing the disparity between titles for Adaptive Human-Machine Interface.

H3a: There is statistical significance showing the disparity between titles for Adaptive Human-Machine Interface.

Review of Relevant Literature

The literature review focuses on Workload Management, Situation Awareness, and Autonomy to match the three categories used in the research. These three categories had the most prominent presence among the samples used for the study. A literature review focused on these topics shows the diversity of research in Adaptive Interfaces.

Interface Adaptation for Workload Management

Human-Machine Interaction research has grown from the study of efficient and ergonomic design to the research and design of multimodal interaction in control stations, where natural human interactions are studied alongside a standard mouse and keyboard (Zander, Kothe, Jatzev, & Gaertner, 2010). In an interview, Mikel Atkins, a Senior Human Factors / Crew Systems Engineer for Lockheed Martin, explained that Human Factors (HF) technology has evolved from simple computer interfaces such as a mouse and keyboard to touchscreens, and that HF engineers have begun to increase the amount of information available and displayed to operators (M. Atkins, personal communication, June 22, 2020). The increase in data requires HF engineers to fuse the available information into one display to condense the amount of data shown while still presenting relevant data to the operator (M. Atkins, personal communication, June 22, 2020).

According to Pankok, Bass, Smith, Storm, Walker, Shepherd, & Spencer (2017), varying levels of automation will benefit the design of future control stations by taking into account the varying degrees of workload experienced when operating in real-world conditions. An increase in information availability in high workload situations can cause an oversaturation of information to the operator and cause a decrease in performance (Zander et al., 2010). Managing the amount of information shown using an adaptive interface can be done by measuring the operator's mental workload to determine how much information should be added to or removed from the interface (Lim, Ramasamy, Gardi, Kistan, & Sabatini, 2018).

Mental workload is defined as the level of mental demand placed on an operator while performing required duties (Vidulich & Tsang, 2015). In contrast, Situation Awareness is the information held in the operator's memory during task performance (Vidulich & Tsang, 2015). Mental workload is typically associated with the work required to complete a task safely and logically, whereas Situation Awareness requires operators to retain information in long-term memory to understand their situation and how they should react to it.

Sensor-equipped control stations capture an operator's activities, preferences, and previous interactions, providing data the control station can use to be proactive and anticipate the operator's actions, needs, and preferences (Zander et al., 2010). The sensors vary with the type of control station and the type of operation; examples include eye-gaze tracking, voice recognition, and the electroencephalogram (EEG), as seen in Figure 1. These technologies can be used together or separately to monitor the operator and measure their mental state. As mental workload is measured, the operator display changes to keep that workload manageable.
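As a minimal illustration of this sensing-and-adaptation loop, the sketch below shows how a fused workload estimate from EEG and eye-gaze measures could drive the amount of information displayed. It is a hypothetical Python example: the names (estimate_workload, adapt_display), the equal sensor weighting, and the thresholds are assumptions for illustration and are not drawn from any specific system in the reviewed literature.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    eeg_engagement: float  # normalized 0-1 engagement index derived from EEG
    fixation_rate: float   # normalized 0-1 eye-gaze fixation rate

def estimate_workload(sample: SensorSample) -> float:
    """Fuse two normalized physiological measures into a single 0-1 workload estimate
    (the equal weighting is purely illustrative)."""
    return 0.5 * sample.eeg_engagement + 0.5 * sample.fixation_rate

def adapt_display(workload: float, panels: list) -> list:
    """Reduce the number of information panels shown as estimated workload rises."""
    if workload > 0.8:   # high workload: keep only mission-critical panels
        return panels[:2]
    if workload > 0.5:   # moderate workload: trim secondary panels
        return panels[:4]
    return panels        # low workload: full display

if __name__ == "__main__":
    panels = ["attitude", "navigation", "datalink", "payload", "systems", "chat"]
    sample = SensorSample(eeg_engagement=0.9, fixation_rate=0.75)
    print(adapt_display(estimate_workload(sample), panels))
```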

Figure 1. Eye-Gaze Tracking and Cognitive Measurement Devices. Adapted from "Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft," by Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., . . . Bolia, R., 2018, Copyright 2018

An efficiently designed interface should be transparent to the operator and focus their mental workload on the appropriate tasks (Hirshfield et al., 2009). Measurements of an operator's ability to complete a task are standard quantitative measures used in interface evaluation; operator workload, however, is typically measured by qualitatively observing subjects or administering subjective surveys (Hirshfield et al., 2009). The combination of sensors and surveys ensures that the operator's personal view of their mental state and workload is captured. These qualitative views of workload can help HF engineers iterate through an interface design to ensure the final product is efficient for operators.

In research by Ayaz, Shewokis, Bunce, Izzetoglu, Willems, and Onaral (2012), functional near-infrared (fNIR) spectroscopy provided a measurement of an operator's mental workload and showed the development of expertise during practice. Multiple studies on UAS operators found that the more training and experience an operator received, the less brain activity was measured (Ayaz et al., 2012). This measurable drop in brain activity indicated that more proficient operators experienced a lower workload, freeing neural resources to attend to other stimuli (Ayaz et al., 2012).

According to Aricò, Borghini, Di Flumeri, Colosimo, Bonelli, Golfetti, and Babiloni (2016), Adaptive Automation (AA) is a process that keeps task workload demand within a level at which the operator maintains sufficient capability to perform their duties. The authors research the use of passive Brain-Computer Interface (pBCI) systems to measure the brain activity of crewmembers while they fly in order to understand the workload they are experiencing (Aricò et al., 2016). Passive Brain-Computer Interfaces and AA describe concepts of an Adaptive Interface but do not share the same terminology.

Interface Adaptation for Situation Awareness

In this paper, Situation Awareness (or Situational Awareness) is defined, per the Federal Aviation Administration (n.d.), as "an accurate perception and understanding of all factors and conditions within the four fundamental risk elements that affect safety before, during, and after the flight." The terms Situation and Situational are used interchangeably in this paper due to a recent change in the Federal Aviation Administration (FAA) definition. To use a single operator to command and coordinate multiple UAVs, a Cognitive Human-Machine Interface (CHMI) is recommended (Lim, Samreeloy, Chantaraviwat, Ezer, Gardi, & Sabatini, 2019). A CHMI allows a human operator to process large quantities of information, enabling quick and efficient tactical decision making based on the operator's situation awareness (Lim et al., 2019). Currently, the Federal Aviation Administration (2019) requires waivers to fly multiple UAS with a single operator under §107.35 – Operation of Multiple Small UAS. A CHMI must be included in the design of a control station to ensure the Unmanned Aircraft System (UAS) design considers the interface's dynamic nature (Lim et al., 2019).

A CHMI monitors the operator's Situational Awareness (SA) and workload in the control station by estimating the human operator's workload, fatigue, and SA during a mission (Lim et al., 2019). These estimates enable real-time changes to the display to facilitate more effective mission management (Lim et al., 2019). As the operator moves through a mission, the CHMI can modify the display to decrease operator workload and increase SA (Lim et al., 2019). An area of concern for future control station design is ensuring that operators expect the changes; according to Atkins, operators can find it disturbing when the interface changes without their knowledge (M. Atkins, personal communication, June 22, 2020).

One of the more critical requirements for efficient UAS operation and the control of multiple UAS is adequate SA of the overall environment, required tasks, and status of the aircraft (Chen, Barnes, & Harper-Sciarini, 2010). Overreliance on a highly autonomous system can decrease SA (Chen et al., 2010). Maintaining high SA of UAS operations and the mission is critical to safe operations. An adaptive interface enables the operator to monitor a flight's essential components based on what is important at that time and the operator's current SA. The three pillars of aviate, navigate, and communicate must be supported for the operator at all times (M. Atkins, personal communication, June 22, 2020).

An area of concern for UAS operators is change blindness, the inability to perceive and attend to changes in an environment (Chen, Barnes, & Harper-Sciarini, 2010). An adaptive interface ensures the display continuously shows the operator important information, changing with the operator's cognitive ability to maintain SA. Research has found that operator SA can be affected by attentional control skills and confidence in one's ability, by informational displays (the timing and relevance of the information presented), and by how the information is displayed (Chen, Barnes, & Harper-Sciarini, 2010).

In a study by Cosenzo, Chen, Reinerman-Jones, Barnes, and Nicholson (2010), aircrew were better able to maintain SA during a mission when the control station used adaptive automation. During the study, the researchers evaluated aircrew members to determine how effectively operators maintained SA throughout a scenario with an adaptive system. Situation Awareness for this study was defined as the operator's understanding of the environment in which their aircraft was flying. The results indicate that when the task load increased from low to high, the operator's SA improved more with the adaptive interface (Cosenzo et al., 2010).

Adaptive Human-Machine Interface Autonomy

Automation in adaptive interfaces enables the system to understand and make decisions without operator input. With a traditional HMI, operators must press keys to modify or change the display. While the design of adaptive interfaces reduces mental workload, automation can result in a reduction in SA, an increase in operator complacency, and atrophy of skill (Solovey, Lalooses, Chauncey, Weaver, Parasi, Scheutz, & Jacob, 2011). Differences in an operator's cognitive workload or SA drive the method of automation in an adaptive interface.

Automation in an adaptive interface must balance the human workload so the display remains challenging but manageable, preventing operator complacency (Solovey et al., 2011). The automation in an adaptive display must use concepts such as Artificial Intelligence (AI) or Machine Learning to understand and predict operator requirements while maintaining an efficient display during operations. Machine learning monitors operator workload and detects specific conditions that drive a change in the adaptive display to manage the operator's cognitive workload (Solovey et al., 2011).

Advances in automation, human cognitive modeling, artificial intelligence, and machine learning can take many forms to allow displays to adapt to the human operator (Madni & Madni, 2018). The automation allows the human to intervene in autonomous operations, redirect resources, reallocate tasks, and modify display parameters (Madni & Madni, 2018). Adaptive interfaces will utilize artificial intelligence and machine learning methods, and an overreliance upon automated systems is likely to increase operator complacency.

Within Autonomy, artificial intelligence and machine learning determine how to change an adaptive interface. According to Wu, Wang, Niu, Hu, and Fan (2018), machine learning can change an interface's autonomy levels based on the operator's workload. The system uses machine learning to identify multiple targets for an operator to help ease the workload. If the operator's workload is determined to be too high for efficient performance, the automation increases the autonomy of the target recognition system using machine learning algorithms (Wu et al., 2018). This type of system uses an adaptive interface to ensure operators are shown relevant information depending on their workload and the system's Autonomy.

Some automation uses machine learning algorithms to classify predictors of operator workload, situation awareness, and task management. In a study by Zhang, Yin, and Wang (2017), a cognitive task load (CTL) algorithm identifies changes in EEG and electrocardiogram (ECG) signals to determine the changes in and current status of an operator. With roughly 80% accuracy, the machine learning algorithm could predict the operator's task load to manage the interface (Zhang et al., 2017). Such algorithms can help understand and classify the operators' measurements and decide how to change the interface.
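The specifics of the CTL algorithm are not reproduced in the reviewed sources, but the general pattern of classifying task load from physiological features can be sketched as below. This is an illustrative example, assuming synthetic EEG/ECG-like features and a generic scikit-learn classifier rather than the model used by Zhang et al. (2017).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for [EEG theta power, EEG alpha power, heart rate];
# a real system would extract these features from physiological recordings.
n = 600
low_load = rng.normal([0.30, 0.45, 68.0], [0.08, 0.08, 5.0], size=(n // 2, 3))
high_load = rng.normal([0.55, 0.30, 84.0], [0.08, 0.08, 5.0], size=(n // 2, 3))
X = np.vstack([low_load, high_load])
y = np.array([0] * (n // 2) + [1] * (n // 2))  # 0 = low task load, 1 = high task load

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# The predicted class is what would trigger an interface change in an adaptive HMI.
print("task-load classification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```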

Adaptive Interface concepts in the literature contain three elements: the ability to assess system and environmental states, the ability to determine the operator's state, and the ability to adapt the interface according to the first two elements (Lim, Gardi, Sabatini, Ramasamy, Kistan, Ezer, & Bolia, 2018). Although adaptive interfaces are a new area of research, the resulting gains in human performance and cognition can greatly enhance the operator's capabilities through varying methods (Lim et al., 2018). To ensure adequate usability, machine learning algorithms must monitor environmental and operator cognitive data (Lim et al., 2018). A challenge in using physiological measurements is ensuring reliable data to train the interface's algorithms so the interface can understand the operator's actual mental state (Lim et al., 2018).
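These three elements can be summarized as a small interface contract. The sketch below is only an illustrative decomposition; the class and method names are assumptions and do not come from Lim et al. (2018).

```python
from abc import ABC, abstractmethod

class AdaptiveInterface(ABC):
    """Illustrative decomposition of the three elements described in the literature."""

    @abstractmethod
    def assess_environment(self) -> dict:
        """Element 1: assess system and environmental states (e.g., flight phase, traffic)."""

    @abstractmethod
    def assess_operator(self) -> dict:
        """Element 2: determine the operator's state (e.g., workload, fatigue, SA)."""

    @abstractmethod
    def adapt(self, environment: dict, operator: dict) -> None:
        """Element 3: adapt the displayed information according to elements 1 and 2."""

    def update(self) -> None:
        # One cycle of the adaptation loop, run continuously during a mission.
        self.adapt(self.assess_environment(), self.assess_operator())
```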

Summary

The literature review provides information on how Adaptive Human-Machine Interfaces can gauge an operator's SA while modifying the display. Changes in the operator's SA during a mission or flight are measured using technology such as voice or EEG. Adaptive Human-Machine Interfaces can also measure operator workload to determine how much information the display should provide at any given time. Automation, supported by machine learning or artificial intelligence, determines how much to change the display, when to change it, and what to show at any given time.

The complexity of the topics in this literature review contributes to the variability in terms, subjects, and definitions of Adaptive Interfaces. Workload, Situation Awareness, and the algorithms to support the two are all labeled in the literature review as an Adaptive Interface. The use of the term Adaptive Interface in the literature, regardless of the subject matter, supports this paper by showing the research differences under the same terminology.

Method

Research Design

The study's design included an extensive search of related material associated with Adaptive Human-Machine Interfaces and a meta-analysis of the pre-existing data. To ensure the research design meets the research question, research related to the subject of Adaptive Interfaces was examined and categorized to determine the prominent definition and standard for adaptive interfaces. As depicted in Appendix A, these categories and sub-categories were the most prominent topics in reviewing the 108 samples found.

Data for this research comprised journal articles and books related to the topic of "Adaptive Human-Machine Interfaces." All published work was peer-reviewed or published by an accredited university and included adaptive interfaces used in manned and unmanned systems. Data were collected using the online journal resources ERAU Hunt Library Eagle Search, Sage Journals Database, ResearchGate, Science Direct, Springer Link, Google Scholar, and IEEE Xplore. The available research on this topic focused on unmanned systems and included other systems that use Adaptive HMI.

Categorizing the journals into the three categories of Workload, Situation Awareness, and Autonomy, using the definitions listed in Appendix A, allows a qualitative data analysis approach. A Chi-square goodness of fit test measured the 108 journals' category counts to indicate whether the journals point equally to the same definition and standards or favor one topic over the others. The data from this test indicated the most predominant definition and standard used. An additional set of categories measured the terminology used for adaptive interfaces in order to recommend a single term for Adaptive Interfaces; a review of the 108 samples indicated that multiple research journals discuss the same topic under different terms. For each of the three categories, additional sub-categories help determine more precisely what each journal discusses, as seen in Appendix A.
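For clarity, the goodness-of-fit statistic used throughout this study compares each observed category count against the count expected under an equal-proportion null hypothesis; with $k$ categories and $n$ coded samples (a standard formulation, stated here for reference),

$$\chi^2 = \sum_{i=1}^{k} \frac{(O_i - E_i)^2}{E_i}, \qquad E_i = \frac{n}{k}, \qquad df = k - 1.$$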

For Workload Management, the sub-categories are the measurement of workload, use of eye-gaze, use of voice, and use of EEG; in the literature review, the measurement of workload appears to differentiate between research efforts. For Situation Awareness, the sub-categories are the measurement of operator SA, loss of SA, and use of the situation to change the HMI; these sub-categories help differentiate the specific SA research in adaptive interfaces. For Autonomy, the sub-categories are AI, automation, and machine learning; Autonomy was divided into these three sub-categories to understand what terminology researchers were using.
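The coding scheme just described can be summarized as a simple mapping from main categories to sub-categories. The sketch below is illustrative only: it restates the sub-category names from this section, omits the letter codes used in the results tables, and the helper function categorize is a hypothetical example of how a coded sample could be rolled up to its main categories.

```python
# Illustrative restatement of the coding scheme; see Appendix A for the full definitions.
CODING_SCHEME = {
    "Workload": [
        "measurement of workload",
        "use of eye-gaze",
        "use of voice",
        "use of EEG",
    ],
    "Situation Awareness": [
        "measurement of operator SA",
        "loss of SA",
        "situation used to change the HMI",
    ],
    "Autonomy": [
        "artificial intelligence",
        "automation",
        "machine learning",
    ],
}

def categorize(sample_tags: set) -> list:
    """Return the main categories whose sub-categories appear among a sample's tags."""
    return [cat for cat, subs in CODING_SCHEME.items() if sample_tags & set(subs)]

print(categorize({"use of EEG", "machine learning"}))  # ['Workload', 'Autonomy']
```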

Assumptions

An assumption for the statistical test is that the journals fit into the three categories. To ensure all of the journals fit, the chosen categories are the predominant topics from an initial literature review. This research designates the three categories and their sub-categories based on the review of the literature samples.

A further assumption is that the categories used in this research are the common research topics for adaptive interfaces. A thorough analysis determined the most common definitions used in the samples, and the three categories used were the most common topics in the data samples.

Limitations

A limitation of this study is the specificity of the topic and the interpretation of the categories and sub-categories. Research into adaptive interfaces is new and still ongoing. Because the topic is small and specific, using only an unmanned systems perspective makes it difficult to acquire all of the samples required for the test. To ensure the samples are significant for this research, journals were required to be closely related to adaptive interfaces in unmanned systems or to utilize the same technology used to monitor the operator.

Another limitation of this research is the qualitative nature of the study. Not all material associated with this topic was found due to limitations in access to available information. Despite a data scrape of known research journal hosting websites, some data are likely missing from this research. Thus, the analysis of the qualitative data in this research is limited to the samples found for this paper.

Delimitations

A delimitation of this study is the focus on unmanned systems. To ensure the research benefits the unmanned systems community, the research focuses on unmanned systems. However, this research also uses published work not directly related to unmanned systems to ensure relevant work is not ignored. Using work beyond unmanned systems ensures the outcome of the project is usable as a definition and standard for all aviation human factors engineering instead of just unmanned systems.

Another delimitation is the personal determination of the categories in which to place the samples for this research. To ensure this delimitation does not negatively affect the results, the main categories were drawn from both a manual literature review and an automated, computer-generated category system to avoid personal bias in the categories.

Results

The hypotheses for this research are as follows:

H1o: There is no statistical significance showing the disparity between definitions of Adaptive Human-Machine Interfaces within the three identified categories (Workload, Situation Awareness, and Autonomy).

H1a: There is statistical significance showing the disparity between definitions of Adaptive Human-Machine Interfaces within the three identified categories (Workload, Situation Awareness, and Autonomy).

H2o: There is no statistical significance showing the disparity between specific types of research of Adaptive Human-Machine Interfaces within the ten identified sub-categories.

H2a: There is statistical significance showing the disparity between specific types of research of Adaptive Human-Machine Interfaces within the ten identified sub-categories.

H3o: There is no statistical significance showing the disparity between titles for Adaptive Human-Machine Interface.

H3a: There is statistical significance showing the disparity between titles for Adaptive Human-Machine Interface.

A sample size of 108 was used for the three main categories. Using G*Power, a post-hoc power analysis yielded a power of 0.80 based on the sample size of 108, a medium effect size of 0.3, and a level of significance of 0.05. The Chi-Square goodness of fit test for the three categories used two degrees of freedom. The Chi-Square goodness of fit test for the sub-categories used nine degrees of freedom with a sample size of 295; a post-hoc power analysis in G*Power yielded a power of 0.80 based on the sample size of 295, a medium effect size of 0.23, and a level of significance of 0.05. The test for the title categories used four degrees of freedom with a sample size of 114; a post-hoc analysis yielded a power of 0.80 based on the sample size of 114, a medium effect size of 0.324, and a level of significance of 0.05.
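The post-hoc power figures reported above came from G*Power; as a cross-check sketch, an equivalent computation can be done in Python with statsmodels (assuming the GofChisquarePower class, with n_bins equal to the number of categories so that df = n_bins - 1):

```python
from statsmodels.stats.power import GofChisquarePower

analysis = GofChisquarePower()

# Sample sizes, effect sizes, and category counts as reported above (alpha = 0.05).
tests = [
    ("main categories", 108, 0.300, 3),
    ("sub-categories", 295, 0.230, 10),
    ("title categories", 114, 0.324, 5),
]
for label, nobs, effect_size, n_bins in tests:
    power = analysis.power(effect_size=effect_size, nobs=nobs, alpha=0.05, n_bins=n_bins)
    print(f"{label}: power = {power:.2f}")  # each is approximately 0.80
```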

The first Chi-square goodness-of-fit test was conducted to examine the main categorical data, resulting in χ2 (2, n = 295) = 13.4305, p = 0.0012, as shown in Table 1. The goal of this analysis was to evaluate the frequencies between categories and determine significance. The p-value was smaller than 0.05, indicating that the frequencies in each category do not follow an equal distribution and were statistically different from what would be expected by chance. According to the results shown in Table 1 and Figure 2, the Autonomy category was disproportionately over-represented, while the Workload and Situation Awareness categories were roughly equally represented. There was enough evidence in this dataset to reject the null hypothesis for H1; there was a clear overrepresentation of research on automation in Adaptive Interface research.

Table 1

Chi-Square Results Categories

Workload Situation Awareness Autonomy
Observed Freq. 83 84 128
Expected Freq. (prop.) 98.33235 (.33) 98.33235 (.33) 98.33235 (.33)
p-value 0.0012
Test Statistic 13.43052
Note. χ2 = 13.4305*, df = 2. Numbers in parentheses, (), are expected proportions. Prop. = proportion. *p < 0.05.

Figure 2. Observed numbers for the three main categories.
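The reported statistic can be verified directly from the observed counts in Table 1; the same call, applied to the observed counts in Tables 2 and 3, reproduces the statistics of the second and third tests. This is a verification sketch using scipy, not the software used for the original analysis.

```python
from scipy.stats import chisquare

# Observed counts from Table 1: Workload, Situation Awareness, Autonomy.
observed = [83, 84, 128]

# With no expected frequencies supplied, chisquare assumes an equal split (295/3 each).
stat, p = chisquare(observed)
print(f"chi-square = {stat:.4f}, p = {p:.4f}")  # chi-square = 13.4305, p = 0.0012
```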

A second Chi-Square goodness-of-fit test was conducted to examine the sub-categorical data, resulting in χ2 (9, n = 295) = 148.627, p = 0.000, as shown in Table 2. The goal of this analysis was to evaluate the frequencies between the sub-categories and determine significance. The p-value was smaller than 0.05, indicating that the frequencies in each sub-category are not equally distributed and were statistically different from what would be expected by chance. According to the results shown in Table 2 and Figure 3, Measurement of Operator Situation Awareness and Automation were over-represented in the data. There was enough evidence in this dataset to reject the null hypothesis for H2.

Table 2

Chi-Square Results for Sub-Categories

Sub-Category Observed Test Proportion Expected Contribution to Chi-Square
1A 7 0.1 29.5 17.1610
1B 14 0.1 29.5 8.1441
1C 34 0.1 29.5 0.6864
1D 28 0.1 29.5 0.0763
2A 65 0.1 29.5 42.7203
2B 10 0.1 29.5 12.8898
2C 9 0.1 29.5 14.2458
3A 36 0.1 29.5 1.4322
3B 68 0.1 29.5 50.2458
3C 24 0.1 29.5 1.0254
  p-value 0.000    
  Test Statistic 148.627    
Note. χ2 = 148.627*, df = 9. Expected frequencies are shown in the Expected column. *p < 0.05.

Figure 3. Observed numbers for the ten sub-categories.

A third Chi-Square goodness-of-fit test was conducted to examine the title data, resulting in χ2 (4, n = 114) = 59.5087, p = 0.00, as shown in Table 3. This analysis aimed to evaluate the most prominent title among the names used to identify Adaptive Human-Machine Interfaces. The p-value was smaller than 0.05, indicating that the frequencies in each title category are not equally distributed and were statistically different from what would be expected by chance. According to the results shown in Table 3 and Figure 4, Adaptive Human-Machine Interface and Adaptive Automation were over-represented, while Context-Adaptive User Interface was under-represented. There was enough evidence in this dataset to reject the null hypothesis for H3. The data indicate that the most prominent titles are Adaptive Human-Machine Interface and Adaptive Automation.

Table 3

Chi-Square Results for titles of the samples

Sub-Category Observed Test Proportion Expected Contribution to Chi-Square
DA 9 0.2 22.8 8.3526
DB 45 0.2 22.8 21.6158
DC 5 0.2 22.8 13.8964
DD 15 0.2 22.8 2.6684
DE 40 0.2 22.8 12.9754
p-value 0.00
  Test Statistic 59.5087    
Note. χ2 = 59.5087*, df = 4. Expected frequencies are shown in the Expected column. *p < 0.05.

Figure 4. Observed numbers for the five title categories.

Conclusion

Chi-Square Test on the Main Categories

The first Chi-Square test results indicated that the Autonomy category is over-represented in the 108 samples collected. With a power of 0.80 and a p-value of 0.0012, the results showed the distribution was significantly different from an equal split. The results indicated that research on Adaptive Human-Machine Interfaces predominantly focuses on the automation of the display for operators. Though the research on Adaptive Interfaces utilized different components of Situation Awareness and Workload Analysis, most of it focused on applying automation such as Artificial Intelligence or Machine Learning to modify the interface autonomously.

Research that focused on Situation Awareness and Workload Analysis was under-represented in the 108 samples. Adaptive Interface research does cover these topics; however, they were typically used to supplement the use of autonomy in managing how the display was changed for an operator. In the review of the 108 samples, the use of automation was evident in most research focusing on SA and Workload. These results supported the research by indicating that, within the 108 samples, there was a predominant category of research.

Chi-Square Test on the Sub-Categories

The second Chi-Square test results indicated that the sub-categories of Measurement of Operator Situation Awareness and Automation were over-represented. Although the main categories indicated that autonomy was the primary focus of Adaptive Human-Machine Interface research, the research drew on many functions of Workload Analysis, Situation Awareness, and Automation. This supported the first Chi-Square results by indicating that, although Autonomy was the over-represented category in the 108 samples, the sampled research utilized functions from all three categories.

Chi-Square Test on the Definition Categories

The results of the third Chi-Square goodness of fit test indicated that two titles, Adaptive Human-Machine Interface and Adaptive Automation, were over-represented in the 108 data samples. As shown in Table A5, these categories were the most prominent titles or terms used to describe the Adaptive Interface in the research. Of the five title categories, only one did not use the term "Adaptive." The results indicated that there are multiple titles for the different categories and sub-categories of research.

Recommendations

The results of this research indicate that future research into Adaptive Interfaces should follow the category of Autonomy. Research on Adaptive Interfaces should indicate how it utilizes or supports the autonomous nature or operations of an adaptive system. Using this common category, future research can use the same terminology so that Human Factors engineers share an understanding of Adaptive Interfaces.

The results from this research show that research on Adaptive Interfaces covers multiple sub-categories despite the common theme of an adaptive interface. Future research should consider relating how it will aid or directly impact an adaptive interface's automation; this anchor gives engineers a baseline on which to build their research. The title of future work should use the terminology "Adaptive Human-Machine Interface." The last test results indicate that there is currently no single term used to describe this type of interface. Uniting all future research under a single common term will enable future engineers to understand how their research relates to others' work, will show future researchers of this topic where to look, and effectively creates a single repository by uniting all research under a single definition.

References

            References marked with an asterisk indicate studies included in the meta-analysis.

*Ahmad, A. R., Basir, O. A., & Hassanein, K. (2004, December). Adaptive User Interfaces for Intelligent E-Learning: Issues and Trends. In ICEB (pp. 925-934).

*Akiki, P. A., Bandara, A. K., & Yu, Y. (2014). Adaptive model-driven user interface development systems. ACM Computing Surveys (CSUR), 47(1), 1-33.

*Ali, S. I., Jain, S., Lal, B., & Sharma, N. (2012). A framework for modeling and designing of intelligent and adaptive interfaces for human computer interaction. International Journal of Applied Information Systems (IJAIS) Volume.

*Aricò, P., Borghini, G., Di Flumeri, G., Colosimo, A., Bonelli, S., Golfetti, A., ... & Babiloni, F. (2016). Adaptive automation triggered by EEG-based mental workload index: a passive brain-computer interface application in realistic air traffic control environment. Frontiers in human neuroscience, 10, 539.

*Ayaz, H., Shewokis, P. A., Bunce, S., Izzetoglu, K., Willems, B., & Onaral, B. (2012). Optical brain monitoring for operator training and mental workload assessment. Neuroimage, 59(1), 36-47.

*Bach-y-Rita, P., & Kercel, S. W. (2003). Sensory substitution and the human–machine interface. Trends in cognitive sciences, 7(12), 541-546.

*Barnes, M. J., & Oron-Gilad, T. (2011, September). Limitations and Advantages of Autonomy in Controlling Multiple Systems: an International View. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 55, No. 1, pp. 2010-2014). Sage CA: Los Angeles, CA: SAGE Publications.

*Behymer, K. J., Mersch, E. M., Ruff, H. A., Calhoun, G. L., & Spriggs, S. E. (2015). Unmanned vehicle plan comparison visualizations for effective human-autonomy teaming. Procedia Manufacturing, 3, 1022-1029.

*Calhoun, G. L., Ruff, H. A., Behymer, K. J., & Mersch, E. M. (2017). Operator-autonomy teaming interfaces to support multi-unmanned vehicle missions. In Advances in Human Factors in Robots and Unmanned Systems (pp. 113-126). Springer, Cham.

*Caridakis, G., Karpouzis, K., & Kollias, S. (2008). User and context adaptive neural networks for emotion recognition. Neurocomputing, 71(13-15), 2553-2562.

*Castillo-Garcia, J., Hortal, E., Bastos, T., Iánez, E., Caicedo, E., & Azorin, J. (2015, June). Active learning for adaptive brain machine interface based on Software Agent. In 2015 23rd Mediterranean Conference on Control and Automation (MED) (pp. 44-48). IEEE.

*Cerny, T., Donahoo, M. J., & Song, E. (2013). Towards effective adaptive user interfaces design. In Proceedings of the 2013 Research in Adaptive and Convergent Systems (pp. 373-380).

*Chen, J. Y., Barnes, M. J., & Harper-Sciarini, M. (2010). Supervisory control of multiple robots: Human-performance issues and user-interface design. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 41(4), 435-454.

*Choi, J. K., Kwon, Y. J., Jeon, J., Kim, K., Choi, H., & Jang, B. (2018, October). Conceptual Design of Driver-Adaptive Human-Machine Interface for Digital Cockpit. In 2018 International Conference on Information and Communication Technology Convergence (ICTC) (pp. 1005-1007). IEEE.

*Cooke, N. J., & Gawron, V. (2016). Human Systems Integration for Remotely Piloted Aircraft Systems. Remotely Piloted Aircraft Systems: A Human Systems Integration Perspective, 1.

*Cosenzo, K., Chen, J., Reinerman-Jones, L., Barnes, M., & Nicholson, D. (2010, September). Adaptive automation effects on operator performance during a reconnaissance mission with an unmanned ground vehicle. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 54, No. 25, pp. 2135-2139). Sage CA: Los Angeles, CA: SAGE Publications.

*Damilano, L., Guglieri, G., Quagliotti, F., & Sale, I. (2012). FMS for unmanned aerial systems: HMI issues and new interface solutions. Journal of Intelligent & Robotic Systems, 65(1-4), 27-42.

*de Graaf, M., Varkevisser, M., Kempen, M., & Jourden, N. (2011, July). Cognitive adaptive man machine interfaces for the firefighter commander: design framework and research methodology. In International Conference on Foundations of Augmented Cognition (pp. 588-597). Springer, Berlin, Heidelberg.

*de Visser, E., Jacobs, B., Chabuk, T., Freedy, A., & Scerri, P. (2012). Design and evaluation of the Adaptive Interface Management System (AIMS) for collaborative mission planning with unmanned vehicles. In Infotech@ Aerospace 2012 (p. 2528).

*de Visser, E., Kidwell, B., Payne, J., Lu, L., Parker, J., Brooks, N., ... & Parasuraman, R. (2013, September). Best of both worlds: Design and evaluation of an adaptive delegation interface. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 57, No. 1, pp. 255-259). Sage CA: Los Angeles, CA: SAGE Publications.

*Dominguez, C., Strouse, R., Papautsky, E. L., & Moon, B. (2015). Cognitive design of an application enabling remote bases to receive unmanned helicopter resupply. Journal of Human-Robot Interaction, 4(2), 50-60.

*Dorneich, M. C., Passinger, B., Hamblin, C., Keinrath, C., Vašek, J., Whitlow, S. D., & Beekhuyzen, M. (2011, September). The crew workload manager: an open-loop adaptive system design for next generation flight decks. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 55, No. 1, pp. 16-20). Sage CA: Los Angeles, CA: SAGE Publications.

*Elm, W. C., Potter, S. S., Gualtieri, J. W., Roth, E. M., & Easter, J. R. (2003). Applied cognitive work analysis: A pragmatic methodology for designing revolutionary cognitive affordances. Handbook of cognitive task design, 357-382.

Federal Aviation Administration. (n.d.). FAA Activities, Courses, Seminars, & Webinars. Retrieved August 10, 2020, from https://www.faasafety.gov/gslac/ALC/course_content.aspx?cID=408

Federal Aviation Administration. (2019). Part 107 waivers. Retrieved from https://www.faa.gov/uas/commercial_operators/part_107_waivers/

*Fern, L. C. (2016). A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies.

*Flach, J. M., Jacques, P. F., Patrick, D. L., Amelink, M., Van Paassen, M. M., & Mulder, M. (2003). A search for meaning: A case study of the approach-to-landing. Handbook of cognitive task design, 171-191.

*Fortmann, F., & Mengeringhausen, T. (2014, September). Development and Evaluation of an Assistant System to Aid Monitoring Behavior during Multi-UAV Supervisory Control: Experiences from the D3CoS Project. In Proceedings of the 2014 European Conference on Cognitive Ergonomics (pp. 1-8).

*Gevins, A., Leong, H., Du, R., Smith, M. E., Le, J., DuRousseau, D., ... & Libove, J. (1995). Towards measurement of brain function in operational environments. Biological Psychology, 40(1-2), 169-186.

*Gombolay, M., Bair, A., Huang, C., & Shah, J. (2017). Computational design of mixed-initiative human–robot teaming that considers human factors: situational awareness, workload, and workflow preferences. The International journal of robotics research, 36(5-7), 597-617.

*Gullà, F., Ceccacci, S., Germani, M., & Cavalieri, L. (2015). Design adaptable and adaptive user interfaces: A method to manage the information. In Ambient Assisted Living (pp. 47-58). Springer, Cham.

*Halme, A., Leppänen, I., Suomela, J., Ylönen, S., & Kettunen, I. (2003). WorkPartner: interactive human-like service robot for outdoor applications. The international journal of robotics Research, 22(7-8), 627-640.

*Hancock, P. A., Jagacinski, R. J., Parasuraman, R., Wickens, C. D., Wilson, G. F., & Kaber, D. B. (2013). Human-automation interaction research: past, present, and future. ergonomics in design, 21(2), 9-14.

*Haritos, T. (2017). A Study of Human-Machine Interface (HMI) Learnability for Unmanned Aircraft Systems Command and Control.

*Hastie, H., Lohan, K., Chantler, M., Robb, D. A., Ramamoorthy, S., Petrick, R., ... & Lane, D. (2018). The ORCA hub: Explainable offshore robotics through intelligent interfaces. arXiv preprint arXiv:1803.02100.

*Heard, J., & Adams, J. A. (2019). Multi-Dimensional Human Workload Assessment for Supervisory Human–Machine Teams. Journal of Cognitive Engineering and Decision Making, 13(3), 146-170.

*Heard, J., Fortune, J., & Adams, J. A. (2019, November). Speech Workload Estimation for Human-Machine Interaction. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 63, No. 1, pp. 277-281). Sage CA: Los Angeles, CA: SAGE Publications.

*Heffner, K., & Hassaine, F. (2011). Towards intelligent operator interfaces in support of autonomous uvs operations. PEGASUS SIMULATION SERVICES INC MONTREAL (CANADA).

*Hervé, S. (2012). Context-adaptive Multimodal User Interfaces. Technical report, University of Fribourg.

*Hilton, S., Sabatini, R., Gardi, A., Ogawa, H., & Teofilatto, P. (2019). Space traffic management: Towards safe and unsegregated space transport operations. Progress in Aerospace Sciences, 105, 98-125.

*Hocraffer, A., & Nam, C. S. (2017). A meta-analysis of human-system interfaces in unmanned aerial vehicle (UAV) swarm management. Applied ergonomics, 58, 66-80.

*Hou, M., Kobierski, R. D., & Brown, M. (2007). Intelligent adaptive interfaces for the control of multiple UAVs. Journal of Cognitive Engineering and Decision Making, 1(3), 327-362.

*Hou, M., Zhu, H., Zhou, M., & Arrabito, G. R. (2010). Optimizing operator–agent interaction in intelligent adaptive interface design: A conceptual framework. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 41(2), 161-178.

*Ilbeygi, M., & Kangavari, M. R. (2018). Comprehensive architecture for intelligent adaptive interface in the field of single‐human multiple‐robot interaction. ETRI Journal, 40(4), 483-498.

*Ilbeygi, M., & Kangavari, M. R. (2019). A New Single-Display Intelligent Adaptive Interface for Controlling a Group of UAVs. Journal of AI and Data Mining, 7(2), 341-353.

*Jeong, H., & Liu, Y. (2017). Modeling of stimulus-response secondary tasks with different modalities while driving in a computational cognitive architecture.

*Kaber, D., Hancock, P., Jagacinski, R., Parasurman, R., Wickens, C., Wilson, G., ... & Ockerman, J. (2011, September). Pioneers in cognitive engineering & decision making research–foundational contributions to the science of human-automation interaction. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 55, No. 1, pp. 321-325). Sage CA: Los Angeles, CA: SAGE Publications.

*Klus, H., Herrling, D., & Rausch, A. (2015). Interface roles for dynamic adaptive systems. Proceedings of ADAPTIVE, 80-84.

*Kosicki, T., & Thomessen, T. (2013). Cognitive human-machine interface applied in remote support for industrial robot systems. International journal of advanced robotic systems, 10(10), 342.

*Langley, P. (1997, September). Machine learning for adaptive user interfaces. In Annual Conference on Artificial Intelligence (pp. 53-62). Springer, Berlin, Heidelberg.

*Lavie, T., & Meyer, J. (2010). Benefits and costs of adaptive user interfaces. International Journal of Human-Computer Studies, 68(8), 508-524.

*Hirshfield, L. M., Solovey, E. T., Girouard, A., Kebinger, J., Jacob, R. J. K., Sassaroli, A., & Fantini, S. (2009). Brain measurement for usability testing and adaptive interfaces: An example of uncovering syntactic workload with functional near infrared spectroscopy. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2185-2194).

*Lee, J. C., & Tan, D. S. (2006, October). Using a low-cost electroencephalograph for task classification in HCI research. In Proceedings of the 19th annual ACM symposium on User interface software and technology (pp. 81-90).

*Lim, Y., Gardi, A., Ezer, N., Kistan, T., & Sabatini, R. (2018, June). Eye-tracking sensors for adaptive aerospace human-machine interfaces and interactions. In 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace) (pp. 311-316). IEEE.

*Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental characterisation of eye-tracking sensors for adaptive human-machine systems. Measurement, 140, 151-160.

*Lim, Y., Gardi, A., Ramasamy, S., Vince, J., Pongracic, H., Kistan, T., & Sabatini, R. (2017, September). A novel simulation environment for cognitive human factors engineering research. In 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC) (pp. 1-8). IEEE.

*Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., . . . Bolia, R. (2018). Avionics human-machine interfaces and interactions for manned and unmanned aircraft. Progress in Aerospace Sciences, 102, 1-46. doi:10.1016/j.paerosci.2018.05.002

*Lim, Y., Gardi, A., Sabatini, R., Ranasinghe, K., Ezer, N., Rodgers, K., & Salluce, D. (2019). Optimal energy-based 4D guidance and control for terminal descent operations. Aerospace Science and Technology, 95, 105436.

*Lim, Y., Liu, J., Ramasamy, S., & Sabatini, R. (2016, August). Cognitive Remote Pilot-Aircraft Interface for UAS Operations. In Proceedings of the 2016 International Conference on Intelligent Unmanned Systems (ICIUS 2016), Xi'an, China (pp. 23-25).

*Lim, Y., Ramasamy, S., Gardi, A., Kistan, T., & Sabatini, R. (2018). Cognitive human-machine interfaces and interactions for unmanned aircraft. Journal of Intelligent & Robotic Systems, 91(3-4), 755-774.

*Lim, Y., Ranasinghe, K., Gardi, A., Ezer, N., & Sabatini, R. (2018, September). Human-machine interfaces and interactions for multi UAS operations. In Proceedings of the 31st Congress of the International Council of the Aeronautical Sciences (ICAS 2018), Belo Horizonte, Brazil (pp. 9-14).

*Lim, Y., Samreeloy, T., Chantaraviwat, C., Ezer, N., Gardi, A., & Sabatini, R. (2019). Cognitive human-machine interfaces and interactions for multi-UAV operations. In AIAC18: 18th Australian International Aerospace Congress (2019): HUMS-11th Defence Science and Technology (DST) International Conference on Health and Usage Monitoring (HUMS 2019): ISSFD-27th International Symposium on Space Flight Dynamics (ISSFD) (p. 40). Engineers Australia, Royal Aeronautical Society.

*Liu, J., Gardi, A., Ramasamy, S., Lim, Y., & Sabatini, R. (2016). Cognitive pilot-aircraft interface for single-pilot operations. Knowledge-based systems, 112, 37-53.

*Llaneras, R. E., Cannon, B. R., & Green, C. A. (2017). Strategies to assist drivers in remaining attentive while under partially automated driving: Verification of human–machine interface concepts. Transportation research record, 2663(1), 20-26.

*Madni, A. M., & Madni, C. C. (2018). Architectural Framework for Exploring Adaptive Human-Machine Teaming Options in Simulated Dynamic Environments. Systems, 6(4), 44.

*Manawadu, U. E., Kamezaki, M., Ishikawa, M., Kawano, T., & Sugano, S. (2017, June). A multimodal human-machine interface enabling situation-Adaptive control inputs for highly automated vehicles. In 2017 IEEE Intelligent Vehicles Symposium (IV) (pp. 1195-1200). IEEE.

*Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, May). Cognitive context detection in UAS operators using eye-gaze patterns on computer screens. In Next-Generation Analyst IV (Vol. 9851, p. 98510F). International Society for Optics and Photonics.

*Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, May). Cognitive context detection using pupillary measurements. In Next-Generation Analyst IV (Vol. 9851, p. 98510Q). International Society for Optics and Photonics.

*Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, September). Cognitive context detection for adaptive automation. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 223-227). Sage CA: Los Angeles, CA: Sage Publications.

*Matsunaga, H., & Nakazawa, H. (1999). Development of adaptive human-machine interface to match human satisfaction. IFAC Proceedings Volumes, 32(2), 6529-6534.

*Mezhoudi, N. (2013, March). User interface adaptation based on user feedback and machine learning. In Proceedings of the companion publication of the 2013 international conference on Intelligent user interfaces companion (pp. 25-28).

*Miller, C., Hamell, J., Ruff, H., Barry, T., Draper, M., & Calhoun, G. (2012). Adaptable operator-automation interface for future unmanned aerial systems control: Development of a highly flexible delegation concept demonstration. In Infotech@ Aerospace 2012 (p. 2529).

*Mouloua, M., Ferraro, J. C., Kaplan, A. D., Mangos, P., & Hancock, P. A. (2019). 9 Human Factors Issues Regarding Automation Trust in UAS Operation, Selection, and Training. Human Performance in Automated and Autonomous Systems: Current Theory and Methods, 169.

*Mueller, J. B., Miller, C., Kuter, U., Rye, J., & Hamell, J. (2017). A human-system interface with contingency planning for collaborative operations of unmanned aerial vehicles. In AIAA Information Systems-AIAA Infotech@ Aerospace (p. 1296).

*Nasoz, F., Lisetti, C. L., & Vasilakos, A. V. (2010). Affectively intelligent and adaptive car interfaces. Information Sciences, 180(20), 3817-3836.

*Neville, K., Blickensderfer, B., Archer, J., Kaste, K., & Luxion, S. P. (2012, September). A cognitive work analysis to identify human-machine interface design challenges unique to uninhabited aircraft systems. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 56, No. 1, pp. 418-422). Sage CA: Los Angeles, CA: SAGE Publications.

*Padmanaban, N., Konrad, R., Stramer, T., Cooper, E. A., & Wetzstein, G. (2017). Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays. Proceedings of the National Academy of Sciences, 114(9), 2183-2188.

Pankok, C., Bass, E. J., Smith, P. J., Bridewell, J., Dolgov, I., Walker, J., . . . Spencer, A. (2017). A7 - UAS Human Factors Control Station Design Standards (Plus Function Allocation, Training, and Visual Observer). Alliance for System Safety of UAS through Research Excellence (ASSURE)—Federal Aviation Administration Center of Excellence for Unmanned Aerial System Research.

Pankok, C., Bass, E. J., Smith, P. J., Storm, R., Walker, J., Shepherd, A., & Spencer, A. (2017). A10–Human Factors Considerations of Unmanned Aircraft System Procedures & Control Stations: Tasks CS-1 through CS-5 (No. DOT/FAA/AR-xx/xx). William J. Hughes Technical Center (US).

*Piuzzi, B., Cont, A., & Balerna, M. (2014, May). The workload sensing for the human machine interface of unmanned air systems. In 2014 IEEE Metrology for Aerospace (MetroAeroSpace) (pp. 50-55). IEEE.

*Raya, R., Rocon, E., Ceres, R., & Pajaro, M. (2012, April). A mobile robot controlled by an adaptive inertial interface for children with physical and cognitive disorders. In 2012 IEEE international conference on technologies for practical robot applications (TePRA) (pp. 151-156). IEEE.

*Rebai, R., Maalej, M. A., Mahfoudhi, A., & Abid, M. (2016). Building and evaluating an adaptive user interface using a Bayesian network approach. International Journal of Computer Science and Information Security, 14(7), 548.

*Reinecke, K., & Bernstein, A. (2011). Improving performance, perceived usability, and aesthetics with culturally adaptive user interfaces. ACM Transactions on Computer-Human Interaction (TOCHI), 18(2), 1-29.

*Reinecke, K., & Bernstein, A. (2013). Knowing what a user likes: A design science approach to interfaces that automatically adapt to culture. Mis Quarterly, 427-453.

*Ross, W., Morris, A., Ulieru, M., & Guyard, A. B. (2013, October). RECON: An adaptive human-machine system for supporting intelligence analysis. In 2013 IEEE International Conference on Systems, Man, and Cybernetics (pp. 782-787). IEEE.

*Rouse, W. B. (1988). Adaptive aiding for human/computer control. Human factors, 30(4), 431-443.

*Schafer, D., & Kaufman, D. (2018). Augmenting Reality with Intelligent Interfaces. Artificial Intelligence: Emerging Trends and Applications, 221.

*Shi, Y., Taib, R., Ruiz, N., Choi, E., & Chen, F. (2007). Multimodal human-machine interface and user cognitive load measurement. IFAC Proceedings Volumes, 40(16), 200-205.

*Solovey, E. T., Lalooses, F., Chauncey, K., Weaver, D., Parasi, M., Scheutz, M., ... & Jacob, R. J. (2011, May). Sensing cognitive multitasking for a brain-based adaptive user interface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 383-392).

*Song, I. J., & Cho, S. B. (2013). Bayesian and behavior networks for context-adaptive user interface in a ubiquitous home environment. Expert Systems with Applications, 40(5), 1827-1838.

*Stark, B., Coopmans, C., & Chen, Y. (2012, August). A framework for analyzing human factors in unmanned aerial systems. In 2012 5th international symposium on resilient control systems (pp. 13-18). IEEE.

*Stowers, K., Oglesby, J., Sonesh, S., Leyva, K., Iwig, C., & Salas, E. (2017). A framework to guide the assessment of human–machine systems. Human factors, 59(2), 172-188.

*Strenzke, R., Uhrmann, J., Benzler, A., Maiwald, F., Rauschert, A., & Schulte, A. (2011, August). Managing cockpit crew excess task load in military manned-unmanned teaming missions by dual-mode cognitive automation approaches. In AIAA guidance, navigation, and control conference (p. 6237).

*Suomela, J., & Halme, A. (2001). Cognitive human machine interface of workpartner robot. IFAC Proceedings Volumes, 34(19), 51-56.

*Szafir, D., Mutlu, B., & Fong, T. (2017). Designing planning and control interfaces to support user collaboration with flying robots. The International Journal of Robotics Research, 36(5-7), 514-542.

*Teo, G., Reinerman-Jones, L., Matthews, G., & Szalma, J. (2015). Comparison of measures used to assess the workload of monitoring an unmanned system in a simulation mission. Procedia Manufacturing, 3, 1006-1013.

*Terwilliger, B. A., Ison, D. C., Vincenzi, D. A., & Liu, D. (2014, June). Advancement and application of unmanned aerial system human-machine-interface (HMI) technology. In International Conference on Human Interface and the Management of Information (pp. 273-283). Springer, Cham.

*Theissing, N., & Schulte, A. (2013). Intent-Based UAV Mission Management Using an Adaptive Mixed-Initiative Operator Assistant System. In AIAA Infotech@ Aerospace (I@ A) Conference (p. 4802).

*Toker, D., Conati, C., Carenini, G., & Haraty, M. (2012, July). Towards adaptive information visualization: on the influence of user characteristics. In International conference on user modeling, adaptation, and personalization (pp. 274-285). Springer, Berlin, Heidelberg.

*Trujillo, A. C., Fan, H., Cross, C. D., Hempley, L. E., Cichella, V., Puig-Navarro, J., & Mehdi, S. B. (2015). Operator informational needs for multiple autonomous small vehicles. Procedia Manufacturing, 3, 936-943.

*Vidulich, M. A., & Tsang, P. S. (2015). The confluence of situation awareness and mental workload for adaptable human–machine systems. Journal of Cognitive Engineering and Decision Making, 9(1), 95-97.

*Villani, V., Sabattini, L., Czerniaki, J. N., Mertens, A., Vogel-Heuser, B., & Fantuzzi, C. (2017, September). Towards modern inclusive factories: A methodology for the development of smart adaptive human-machine interfaces. In 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA) (pp. 1-7). IEEE.

*Vincenzi, D. A., Terwilliger, B. A., & Ison, D. C. (2015). Unmanned aerial system (UAS) human-machine interfaces: new paradigms in command and control. Procedia Manufacturing, 3, 920-927.

*Wallhoff, F., Ablaßmeier, M., Bannat, A., Buchta, S., Rauschert, A., Rigoll, G., & Wiesbeck, M. (2007, July). Adaptive human-machine interfaces in cognitive production environments. In 2007 IEEE international conference on multimedia and expo (pp. 2246-2249). IEEE.

*Wei, Z., Zhuang, D., Wanyan, X., Liu, C., & Zhuang, H. (2014). A model for discrimination and prediction of mental workload of aircraft cockpit display interface. Chinese Journal of Aeronautics, 27(5), 1070-1077.

*Wijayasinghe, I. B., Saadatzi, M. N., Peetha, S., Popa, D. O., & Cremer, S. (2018, August). Adaptive Interface for Robot Teleoperation using a Genetic Algorithm. In 2018 IEEE 14th International Conference on Automation Science and Engineering (CASE) (pp. 50-56). IEEE.

*Wohleber, R. W., Matthews, G., Lin, J., Szalma, J. L., Calhoun, G. L., Funke, G. J., ... & Ruff, H. A. (2019). Vigilance and automation dependence in operation of multiple unmanned aerial systems (UAS): a simulation study. Human factors, 61(3), 488-505.

*Wu, X., Wang, C., Niu, Y., Hu, X., & Fan, C. (2018). Adaptive human-in-the-loop multi-target recognition improved by learning. International Journal of Advanced Robotic Systems, 15(3), 1729881418774222.

*Yazdi, F., Przybysz, K., & Göhner, P. (2014, September). Context-sensitive human-machine interface of automation systems: Introduction of an adaptive concept and prototype. In Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA) (pp. 1-8). IEEE.

*Yigitbas, E. (2019). Model-driven Engineering of Self-adaptive User Interfaces (Doctoral dissertation, Universitätsbibliothek).

*Zander, T. O., Kothe, C., Jatzev, S., & Gaertner, M. (2010). Enhancing human-computer interaction with input from active and passive brain-computer interfaces. In Brain-Computer Interfaces: Applying Our Minds to Human-Computer Interaction (pp. 181–199).

*Zhang, J., Yin, Z., & Wang, R. (2017). Design of an adaptive human-machine system based on dynamical pattern recognition of cognitive task-load. Frontiers in neuroscience, 11, 129.

Appendix A

Coding Scheme and Definitions

This appendix contains the definitions of the categories and sub-categories used throughout the study to code Adaptive Human-Machine Interface research. A table of definition sub-categories (Table A5) is also provided; however, that table is not used in the chi-square test analysis.
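As an illustration only, the following minimal Python sketch shows how a chi-square goodness-of-fit test of the top-level category counts could be reproduced. The counts shown are hypothetical placeholders rather than the values reported in this study, and the sketch assumes the SciPy library is available.

```python
# Minimal sketch of a chi-square goodness-of-fit test over the top-level
# categories tabulated below (Table A1). The counts are hypothetical
# placeholders, NOT the counts reported in this study.
from scipy.stats import chisquare

category_counts = {
    "Workload Management (1)": 400,  # hypothetical
    "Situation Awareness (2)": 420,  # hypothetical
    "Autonomy (3)": 475,             # hypothetical
}

observed = list(category_counts.values())

# With no expected frequencies supplied, chisquare() tests the null
# hypothesis that all categories occur with equal frequency.
statistic, p_value = chisquare(observed)

print(f"chi-square = {statistic:.4f}, p = {p_value:.4f}")
```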

Table A1

Top-Level Categories

Adaptive Interface Category     Code    Definition
Workload Management             1       Measurement of Operator Workload
Situation Awareness             2       Measurements of Operator's Situation Awareness
Autonomy                        3       Automatic Changes in the HMI for the Operator

Table A2

Workload Management Sub-Categories

Workload Sub-Category                 Code    Definition
Measurement of Operator Workload      1A      Any process to measure operator workload
Use of Eye-Gaze                       1B      Use of eye-gaze measuring equipment
Use of Voice                          1C      Use of voice measuring equipment
Use of EEG                            1D      Use of EEG measuring equipment

Table A3

Situation Awareness Sub-Categories

SA Sub-Category                                  Code    Definition
Measurements of Operator Situation Awareness     2A      Any process to measure operator SA
Loss of Situation Awareness                      2B      Any measurement to show a loss or reduction in operator SA
Situation to Change HMI                          2C      Any changes to the HMI based on a situation or scenario

Table A4

Autonomy Sub-Categories

Autonomy Sub-Category        Code    Definition
Artificial Intelligence      3A      Any use of artificial intelligence to change the HMI
Automation                   3B      The use of automation to change the HMI
Machine Learning             3C      The use of machine learning to change the HMI

Table A5

Definition Sub-Categories

Definition Sub-Category                        Code
Cognitive Human-Machine Interface (CHMI)       DA
Adaptive Human-Machine Interface               DB
Intelligent Adaptive Interface (IAI)           DC
Context-Adaptive Operator Interface            DD
Adaptive Automation                            DE

Appendix B

List of Acronyms

AA                              Adaptive Automation

CHMI                          Cognitive Human-Machine Interface

ECG                            Electrocardiogram

EEG                            Electroencephalogram

FAA                            Federal Aviation Administration

fNIR                            functional Near Infrared

HF                               Human Factors

HMI                            Human-Machine Interface

IAI                              Intelligent Adaptive Interface

pBCI                           passive Brain-Computer Interface

UAS                            Unmanned Aircraft System

Appendix C

Table C1

Samples Used for This Research

Category Sub-Category Title APA Reference
1,3 1C,3E,DB A Cognitive Systems Engineering Approach To Developing Hmi Requirements For New Technologies Dissertation Fern, L. C. (2016). A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies.
2 2A A Cognitive Work Analysis To Identify Human-Machine Interface Design Challenges Unique To Uninhabited Aircraft Systems Neville, K., Blickensderfer, B., Archer, J., Kaste, K., & Luxion, S. P. (2012, September). A cognitive work analysis to identify human-machine interface design challenges unique to uninhabited aircraft systems. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 56, No. 1, pp. 418-422). Sage CA: Los Angeles, CA: SAGE Publications.
2 2A A Framework For Analyzing Human Factors In Unmanned Aerial Systems Stark, B., Coopmans, C., & Chen, Y. (2012, August). A framework for analyzing human factors in unmanned aerial systems. In 2012 5th international symposium on resilient control systems (pp. 13-18). IEEE.
3 3A,DB,DC A Framework For Modeling And Designing Of Intelligent And Adaptive Interfaces For Human Computer Interaction Ali, S. I., Jain, S., Lal, B., & Sharma, N. (2012). A framework for modeling and designing of intelligent and adaptive interfaces for human computer interaction. International Journal of Applied Information Systems (IJAIS) Volume.
2,3 2A,2B,3B,DE A Framework To Guide The Assessment Of Human–Machine Systems Stowers, K., Oglesby, J., Sonesh, S., Leyva, K., Iwig, C., & Salas, E. (2017). A framework to guide the assessment of human–machine systems. Human factors59(2), 172-188.
1,2,3 1C,2A,3A,3B,DE A Human-System Interface With Contingency Planning For Collaborative Operations Of Unmanned Aerial Vehicles Mueller, J. B., Miller, C., Kuter, U., Rye, J., & Hamell, J. (2017). A human-system interface with contingency planning for collaborative operations of unmanned aerial vehicles. In AIAA Information Systems-AIAA Infotech@ Aerospace (p. 1296).
1,2,3 1C,2A,2B,3B A Meta-Analysis Of Human-System Interfaces In Unmanned Aerial Vehicle (Uav) Swarm Management Hocraffer, A., & Nam, C. S. (2017). A meta-analysis of human-system interfaces in unmanned aerial vehicle (UAV) swarm management. Applied ergonomics58, 66-80.
1,3 1C,3DB A Mobile Robot Controlled By An Adaptive Inertial Interface For Children With Physical And Cognitive Disorders Raya, R., Rocon, E., Ceres, R., & Pajaro, M. (2012, April). A mobile robot controlled by an adaptive inertial interface for children with physical and cognitive disorders. In 2012 IEEE international conference on technologies for practical robot applications (TePRA) (pp. 151-156). IEEE.
2,3 2A,3B A Model For Discrimination And Prediction Of Mental Workload Of Aircraft Cockpit Display Interface Wei, Z., Zhuang, D., Wanyan, X., Liu, C., & Zhuang, H. (2014). A model for discrimination and prediction of mental workload of aircraft cockpit display interface. Chinese Journal of Aeronautics, 27(5), 1070-1077.
1,2,3 1C,2A,3B,DB,DC A Multimodal Human-Machine Interface Enabling Situation-Adaptive Control Inputs For Highly Automated Vehicles Manawadu, U. E., Kamezaki, M., Ishikawa, M., Kawano, T., & Sugano, S. (2017, June). A multimodal human-machine interface enabling situation-Adaptive control inputs for highly automated vehicles. In 2017 IEEE Intelligent Vehicles Symposium (IV) (pp. 1195-1200). IEEE.
1,2,3 1C,1D,2A,2C,3B,DB,DE A Novel Simulation Environment For Cognitive Human Factors Engineering Research Lim, Y., Gardi, A., Ramasamy, S., Vince, J., Pongracic, H., Kistan, T., & Sabatini, R. (2017, September). A novel simulation environment for cognitive human factors engineering research. In 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC) (pp. 1-8). IEEE.
2,3 2A,3B A Search For Meaning- A Case Study Of The Approach-To-Landing Flach, J. M., Jacques, P. F., Patrick, D. L., Amelink, M., Van Paassen, M. M., & Mulder, M. (2003). A search for meaning: A case study of the approach-to-landing. Handbook of cognitive task design, 171-191.
1,2,3 1C,2A,3A,3B,DE A Study Of Human-Machine Interface (HMI) Learnability For Unmanned Aircraft Systems Command And Control Haritos, T. (2017). A Study of Human-Machine Interface (HMI) Learnability for Unmanned Aircraft Systems Command and Control.
1,3 1D,3A,3C Active Learning For Adaptive Brain Machine Interface Based On Software Agent Castillo-Garcia, J., Hortal, E., Bastos, T., Iánez, E., Caicedo, E., & Azorin, J. (2015, June). Active learning for adaptive brain machine interface based on Software Agent. In 2015 23rd Mediterranean Conference on Control and Automation (MED) (pp. 44-48). IEEE.
1,2,3 1C,2A,3B,DE Adaptable Operator-Automation Interface For Future Unmanned Aerial Systems Control- Development Of A Highly Flexible Delegation Concept Demonstration Miller, C., Hamell, J., Ruff, H., Barry, T., Draper, M., & Calhoun, G. (2012). Adaptable operator-automation interface for future unmanned aerial systems control: Development of a highly flexible delegation concept demonstration. In Infotech@ Aerospace 2012 (p. 2529).
3 3A,3B,DB Adaptive Aiding For Human-Computer Control Rouse, W. B. (1988). Adaptive aiding for human/computer control. Human factors30(4), 431-443.
2,3 2A,3B,DE Adaptive Automation Effects On Operator Performance During A Reconnaissance Mission With An Unmanned Ground Vehicle Cosenzo, K., Chen, J., Reinerman-Jones, L., Barnes, M., & Nicholson, D. (2010, September). Adaptive automation effects on operator performance during a reconnaissance mission with an unmanned ground vehicle. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 54, No. 25, pp. 2135-2139). Sage CA: Los Angeles, CA: SAGE Publications.
1,2,3 1B,1D,2A,2B,3B,3C,DE Adaptive Automation Triggered By EEG-Based Mental Workload Index A Passive Braincomputer Interface Application In Realistic Air Traffic Control Environment Aricò, P., Borghini, G., Di Flumeri, G., Colosimo, A., Bonelli, S., Golfetti, A., ... & Babiloni, F. (2016). Adaptive automation triggered by EEG-based mental workload index: a passive brain-computer interface application in realistic air traffic control environment. Frontiers in human neuroscience10, 539.
2,3 2A,3A,3B,3C,DE Adaptive Human-In-The-Loop Multi-Target Recognition Improved By Learning Wu, X., Wang, C., Niu, Y., Hu, X., & Fan, C. (2018). Adaptive human-in-the-loop multi-target recognition improved by learning. International Journal of Advanced Robotic Systems15(3), 1729881418774222.
1,2,3 1B,2A,3A,DB,DD Adaptive Human-Machine Interfaces In Cognitive Production Environments Wallhoff, F., Ablaßmeier, M., Bannat, A., Buchta, S., Rauschert, A., Rigoll, G., & Wiesbeck, M. (2007, July). Adaptive human-machine interfaces in cognitive production environments. In 2007 IEEE international conference on multimedia and expo (pp. 2246-2249). IEEE.
1 1C,DB Adaptive Interface For Robot Teleoperation Using A Genetic Algorithm Wijayasinghe, I. B., Saadatzi, M. N., Peetha, S., Popa, D. O., & Cremer, S. (2018, August). Adaptive Interface for Robot Teleoperation using a Genetic Algorithm. In 2018 IEEE 14th International Conference on Automation Science and Engineering (CASE) (pp. 50-56). IEEE.
1,3 1C,3C,DD Adaptive Model-Driven User Interface Development Systems Akiki, P. A., Bandara, A. K., & Yu, Y. (2014). Adaptive model-driven user interface development systems. ACM Computing Surveys (CSUR)47(1), 1-33.
1,2,3 1C,2A,3B Advancement And Application Of Unmanned Aerial System Human-Machine-Interface (Hmi) Technology Terwilliger, B. A., Ison, D. C., Vincenzi, D. A., & Liu, D. (2014, June). Advancement and application of unmanned aerial system human-machine-interface (HMI) technology. In International Conference on Human Interface and the Management of Information (pp. 273-283). Springer, Cham.
3 3A,3C,DB Affectively Intelligent And Adaptive Car Interfaces Nasoz, F., Lisetti, C. L., & Vasilakos, A. V. (2010). Affectively intelligent and adaptive car interfaces. Information Sciences180(20), 3817-3836.
2,3 2A,3A,DC Applied Cognitive Work Analysis- A Pragmatic Methodology For Designing Revolutionary Cognitive Affordances Elm, W. C., Potter, S. S., Gualtieri, J. W., Roth, E. M., & Easter, J. R. (2003). Applied cognitive work analysis: A pragmatic methodology for designing revolutionary cognitive affordances. Handbook of cognitive task design, 357-382.
2,3 2A,3A,3B,3C,DB,DE Architectural Framework For Exploring Adaptive Human-Machine Teaming Options In Simulated Dynamic Environments Madni, A. M., & Madni, C. C. (2018). Architectural Framework for Exploring Adaptive Human-Machine Teaming Options in Simulated Dynamic Environments. Systems6(4), 44.
1,3 1C,3A Augmenting Reality With Intelligent Interfaces Schafer, D., & Kaufman, D. (2018). Augmenting Reality with Intelligent Interfaces. Artificial Intelligence: Emerging Trends and Applications, 221.
1,2,3 1D,2A,2C,3A,3B,DA Avionics Human-Machine Interfaces And Interactions For Manned And Unmanned Aircraft Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., . . . Bolia, R. (2018). Avionics human-machine interfaces and interactions for manned and unmanned aircraft. Progress in Aerospace Sciences, 102, 1-46. doi:10.1016/j.paerosci.2018.05.002
3 3A,DB,DD Bayesian And Behavior Networks For Context-Adaptive User Interface In A Ubiquitous Home Environment Song, I. J., & Cho, S. B. (2013). Bayesian and behavior networks for context-adaptive user interface in a ubiquitous home environment. Expert Systems with Applications40(5), 1827-1838.
2,3 2A,3A,3B,DB Benefits And Costs Of Adaptive User Interfaces Lavie, T., & Meyer, J. (2010). Benefits and costs of adaptive user interfaces. International Journal of Human-Computer Studies68(8), 508-524.
2,3 2A,3B,DB,DC,DE Best Of Both Worlds- Design And Evaluation Of An Adaptive Delegation Interface de Visser, E., Kidwell, B., Payne, J., Lu, L., Parker, J., Brooks, N., ... & Parasuraman, R. (2013, September). Best of both worlds: Design and evaluation of an adaptive delegation interface. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 57, No. 1, pp. 255-259). Sage CA: Los Angeles, CA: SAGE Publications.
1,3 1A,1D,3C Brain Measurement For Usability Testing And Adaptive Interfaces- An Example Of Uncovering Syntactic Workload With Functional Near Infrared Spectroscopy Leanne, M., Treacy, E., Robert, J., & Jacob, K. (2009). Brain measurement for usability testing and adaptive interfaces: an example of uncovering syntactic workload with functional near infrared spectroscopy. In Proceedings of the SIGCHI Conference 2009 on Human Factors in Computing Systems (pp. 2185-2194).
1,2,3 1A,1B,1C,1D,2A,3A,3B,3C,DE Brain-Computer Interfaces Applying Our Minds To Human-Computer Interaction Zander, T. O., Kothe, C., Jatzev, S., & Gaertner, M. (2010). Enhancing human-computer interaction with input from active and passive brain-computer interfaces. In Brain-Computer Interfaces: Applying Our Minds to Human-Computer Interaction (pp. 181–199).
1,2,3 1C,2A,3B,DE Cognitive Adaptive Man Machine Interfaces For The Firefighter Commander- Design Framework And Research Methodology de Graaf, M., Varkevisser, M., Kempen, M., & Jourden, N. (2011, July). Cognitive adaptive man machine interfaces for the firefighter commander: design framework and research methodology. In International Conference on Foundations of Augmented Cognition (pp. 588-597). Springer, Berlin, Heidelberg.
1,2,3 1B,1D,2A,3B,3C,DB,DE Cognitive Context Detection For Adaptive Automation Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, September). Cognitive context detection for adaptive automation. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 223-227). Sage CA: Los Angeles, CA: Sage Publications.
1,3 1B,3B,3C Cognitive Context Detection In Uas Operators Using Eye-Gaze Patterns On Computer Screens Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, May). Cognitive context detection in UAS operators using eye-gaze patterns on computer screens. In Next-Generation Analyst IV (Vol. 9851, p. 98510F). International Society for Optics and Photonics.
1,2,3 1B,2A,3C Cognitive Context Detection Using Pupillary Measurements Mannaru, P., Balasingam, B., Pattipati, K., Sibley, C., & Coyne, J. (2016, May). Cognitive context detection using pupillary measurements. In Next-Generation Analyst IV (Vol. 9851, p. 98510Q). International Society for Optics and Photonics.
2,3 2A,3B Cognitive Design Of An Application Enabling Remote Bases To Receive Unmanned Helicopter Resupply Dominguez, C., Strouse, R., Papautsky, E. L., & Moon, B. (2015). Cognitive design of an application enabling remote bases to receive unmanned helicopter resupply. Journal of Human-Robot Interaction4(2), 50-60.
1,3 1C,3A,3B,DA Cognitive Human Machine Interface Of Workpartner Robot Suomela, J., & Halme, A. (2001). Cognitive human machine interface of workpartner robot. IFAC Proceedings Volumes34(19), 51-56.
1,3 1C,3B,DA Cognitive Human-Machine Interface Applied In Remote Support For Industrial Robot Systems Kosicki, T., & Thomessen, T. (2013). Cognitive human-machine interface applied in remote support for industrial robot systems. International journal of advanced robotic systems10(10), 342.
1,2,3 1C,2A,2C,3B,3C,DA,DB,DE Cognitive Human-Machine Interfaces And Interactions For Multi-Uav Operations Lim, Y., Samreeloy, T., Chantaraviwat, C., Ezer, N., Gardi, A., & Sabatini, R. (2019). Cognitive human-machine interfaces and interactions for multi-UAV operations. In AIAC18: 18th Australian International Aerospace Congress (2019): HUMS-11th Defence Science and Technology (DST) International Conference on Health and Usage Monitoring (HUMS 2019): ISSFD-27th International Symposium on Space Flight Dynamics (ISSFD) (p. 40). Engineers Australia, Royal Aeronautical Society.
1,2,3 1D,2A,2C,3A,3B,DA Cognitive Human-Machine Interfaces And Interactions For Unmanned Aircraft Lim, Y., Ramasamy, S., Gardi, A., Kistan, T., & Sabatini, R. (2018). Cognitive human-machine interfaces and interactions for unmanned aircraft. Journal of Intelligent & Robotic Systems91(3-4), 755-774.
1,2,3 1C,1D,2A,2C,3B,3C,DB,DE Cognitive Pilot-Aircraft Interface For Single-Pilot Operations Liu, J., Gardi, A., Ramasamy, S., Lim, Y., & Sabatini, R. (2016). Cognitive pilot-aircraft interface for single-pilot operations. Knowledge-based systems112, 37-53.
2,3 2A,2C,3B,DA,DB,DE Cognitive Remote Pilot-Aircraft Interface For UAS Operations Lim, Y., Liu, J., Ramasamy, S., & Sabatini, R. (2016, August). Cognitive Remote Pilot-Aircraft Interface for UAS Operations. In Proceedings of the 2016 International Conference on Intelligent Unmanned Systems (ICIUS 2016), Xi'an, China (pp. 23-25).
1,2 1D,2A,DE Comparison Of Measures Used To Assess The Workload Of Monitoring An Unmanned System In A Simulation Mission Teo, G., Reinerman-Jones, L., Matthews, G., & Szalma, J. (2015). Comparison of measures used to assess the workload of monitoring an unmanned system in a simulation mission. Procedia Manufacturing3, 1006-1013.
1,2,3 1B,1C,1D,2A,3A,3C,DE,DC Comprehensive Architecture For Intelligent Adaptive Interface In The Field Of Single‐Human Multiple‐Robot Interaction Ilbeygi, M., & Kangavari, M. R. (2018). Comprehensive architecture for intelligent adaptive interface in the field of single‐human multiple‐robot interaction. ETRI Journal40(4), 483-498.
2,3 2A,2B,3A,3B,DE Computational Design Of Mixed-Initiative Human–Robot Teaming That Considers Human Factors- Situational Awareness, Workload, And Workflow Preferences Gombolay, M., Bair, A., Huang, C., & Shah, J. (2017). Computational design of mixed-initiative human–robot teaming that considers human factors: situational awareness, workload, and workflow preferences. The International journal of robotics research36(5-7), 597-617.
1,3 1C,3A,3C,DB Conceptual Design Of Driver-Adaptive Human-Machine Interface For Digital Cockpit Choi, J. K., Kwon, Y. J., Jeon, J., Kim, K., Choi, H., & Jang, B. (2018, October). Conceptual Design of Driver-Adaptive Human-Machine Interface for Digital Cockpit. In 2018 International Conference on Information and Communication Technology Convergence (ICTC) (pp. 1005-1007). IEEE.
1 1C,DD Context-Adaptive Multimodal User Interfaces Hervé, S. (2012). Context-adaptive Multimodal User Interfaces. Technical report, University of Fribourg.
1 1C,3B Context-Sensitive Human-Machine Interface Of Automation Systems- Introduction Of An Adaptive Concept And Prototype Yazdi, F., Przybysz, K., & Göhner, P. (2014, September). Context-sensitive human-machine interface of automation systems: Introduction of an adaptive concept and prototype. In Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA) (pp. 1-8). IEEE.
3 3A,3B,DB Design Adaptable And Adaptive User Interfaces- A Method To Manage The Information Gullà, F., Ceccacci, S., Germani, M., & Cavalieri, L. (2015). Design adaptable and adaptive user interfaces: A method to manage the information. In Ambient Assisted Living (pp. 47-58). Springer, Cham.
2,3 2A,3A,3B,DB,DE Design And Evaluation Of The Adaptive Interface Management System (Aims) For Collaborative Mission Planning With Unmanned Vehicles de Visser, E., Jacobs, B., Chabuk, T., Freedy, A., & Scerri, P. (2012). Design and evaluation of the Adaptive Interface Management System (AIMS) for collaborative mission planning with unmanned vehicles. In Infotech@ Aerospace 2012 (p. 2528).
1,2,3 1D,2A,3B,3C,DB,DE Design Of An Adaptive Human-Machine System Based On Dynamical Pattern Recognition Of Cognitive Task-Load Zhang, J., Yin, Z., & Wang, R. (2017). Design of an adaptive human-machine system based on dynamical pattern recognition of cognitive task-load. Frontiers in neuroscience11, 129.
1,2,3 1C,2A,3A,3B Designing Planning And Control Interfaces To Support User Collaboration With Flying Robots Szafir, D., Mutlu, B., & Fong, T. (2017). Designing planning and control interfaces to support user collaboration with flying robots. The International Journal of Robotics Research36(5-7), 514-542.
2,3 2A,3A,3B,DB Development And Evaluation Of An Assistant System To Aid Monitoring Behavior During Multi-Uav Supervisory Control- Experiences From The D3Cos Project Fortmann, F., & Mengeringhausen, T. (2014, September). Development and Evaluation of an Assistant System to Aid Monitoring Behavior during Multi-UAV Supervisory Control: Experiences from the D3CoS Project. In Proceedings of the 2014 European Conference on Cognitive Ergonomics (pp. 1-8).
1,3 1D,3B,DB Development Of Adaptive Human-Machine Interface To Match Human Satisfaction Matsunaga, H., & Nakazawa, H. (1999). Development of adaptive human-machine interface to match human satisfaction. IFAC Proceedings Volumes32(2), 6529-6534.
1,2,3 1B,1C,1D,2A,2C,3A,3B,3C,DA,DB,DE Experimental Characterisation Of Eye-Tracking Sensors For Adaptive Human-Machine Systems Lim, Y., Gardi, A., Pongsakornsathien, N., Sabatini, R., Ezer, N., & Kistan, T. (2019). Experimental characterisation of eye-tracking sensors for adaptive human-machine systems. Measurement140, 151-160.
1,2,3 1B,1C,1D,2A,3B,DE Eye-Tracking Sensors For Adaptive Aerospace Human-Machine Interfaces And Interactions Lim, Y., Gardi, A., Ezer, N., Kistan, T., & Sabatini, R. (2018, June). Eye-tracking sensors for adaptive aerospace human-machine interfaces and interactions. In 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace) (pp. 311-316). IEEE.
2,3 2A,3B FMS For Unmanned Aerial Systems Hmi Issues And New Interface Solutions Damilano, L., Guglieri, G., Quagliotti, F., & Sale, I. (2012). FMS for unmanned aerial systems: HMI issues and new interface solutions. Journal of Intelligent & Robotic Systems65(1-4), 27-42.
2,3 2A,2B,3B,DE Human Factors Issues Regarding Automation Trust In Uas Operation, Selection, And Training Mouloua, M., Ferraro, J. C., Kaplan, A. D., Mangos, P., & Hancock, P. A. (2019). Human Factors Issues Regarding Automation Trust in UAS Operation, Selection, and Training. Human Performance in Automated and Autonomous Systems: Current Theory and Methods, 169.
2,3 2A,3B Human Systems Integration For Remotely Piloted Aircraft Systems Cooke, N. J., & Gawron, V. (2016). Human Systems Integration for Remotely Piloted Aircraft Systems. Remotely Piloted Aircraft Systems: A Human Systems Integration Perspective, 1.
1,2,3 1D,2A,3B,DB,DE Human-Automation Interaction Research- Past, Present, And Future Hancock, P. A., Jagacinski, R. J., Parasuraman, R., Wickens, C. D., Wilson, G. F., & Kaber, D. B. (2013). Human-automation interaction research: past, present, and future. ergonomics in design21(2), 9-14.
1,2,3 1C,2A,3B,DA,DB,DE Human-Machine Interfaces And Interactions For Multi Uas Operations Lim, Y., Ranasinghe, K., Gardi, A., Ezer, N., & Sabatini, R. (2018, September). Human-machine interfaces and interactions for multi UAS operations. In Proceedings of the 31st Congress of the International Council of the Aeronautical Sciences (ICAS 2018), Belo Horizonte, Brazil (pp. 9-14).
1 1A,DB Improving Performance, Perceived Usability, And Aesthetics With Culturally Adaptive User Interfaces Reinecke, K., & Bernstein, A. (2011). Improving performance, perceived usability, and aesthetics with culturally adaptive user interfaces. ACM Transactions on Computer-Human Interaction (TOCHI)18(2), 1-29.
2,3 2A,2B,3B,DB,DC,DE Intelligent Adaptive Interfaces For The Control Of Multiple Uavs Hou, M., Kobierski, R. D., & Brown, M. (2007). Intelligent adaptive interfaces for the control of multiple UAVs. Journal of Cognitive Engineering and Decision Making1(3), 327-362.
2,3 2A,3B Intent-Based Uav Mission Management Using An Adaptive Mixed-Initiative Operator Assistant System Theissing, N., & Schulte, A. (2013). Intent-Based UAV Mission Management Using an Adaptive Mixed-Initiative Operator Assistant System. In AIAA Infotech@ Aerospace (I@ A) Conference (p. 4802).
1 1A,DB Interface Roles For Dynamic Adaptive Systems Klus, H., Herrling, D., & Rausch, A. (2015). Interface roles for dynamic adaptive systems. Proceedings of ADAPTIVE, 80-84.
3 3A,3C,DB Knowing What A User Likes- A Design Science Approach To Interfaces That Automatically Adapt To Culture Reinecke, K., & Bernstein, A. (2013). Knowing what a user likes: A design science approach to interfaces that automatically adapt to culture. MIS Quarterly, 427-453.
2,3 2A,3B,DB,DC,DE Limitations And Advantages Of Autonomy In Controlling Multiple Systems- An International View Barnes, M. J., & Oron-Gilad, T. (2011, September). Limitations and Advantages of Autonomy in Controlling Multiple Systems: an International View. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 55, No. 1, pp. 2010-2014). Sage CA: Los Angeles, CA: SAGE Publications.
3 3A,3C,DB Machine Learning For Adaptive User Interfaces Langley, P. (1997, September). Machine learning for adaptive user interfaces. In Annual Conference on Artificial Intelligence (pp. 53-62). Springer, Berlin, Heidelberg.
1,2,3 1C,2A,3B,DC,DE Managing Cockpit Crew Excess Task Load In Military Manned-Unmanned Teaming Missions By Dual-Mode Cognitive Automation Approaches Strenzke, R., Uhrmann, J., Benzler, A., Maiwald, F., Rauschert, A., & Schulte, A. (2011, August). Managing cockpit crew excess task load in military manned-unmanned teaming missions by dual-mode cognitive automation approaches. In AIAA guidance, navigation, and control conference (p. 6237).
1,3 1C,3C,DD Model-Driven Engineering Of Self-Adaptive User Interfaces Yigitbas, E. (2019). Model-driven Engineering of Self-adaptive User Interfaces (Doctoral dissertation, Universitätsbibliothek).
1 1C Modeling Of Stimulus-Response Secondary Tasks With Different Modalities While Driving In A Computational Cognitive Architecture Jeong, H., & Liu, Y. (2017). Modeling of stimulus-response secondary tasks with different modalities while driving in a computational cognitive architecture.
1,2,3 1D,2A,3B,3C,DE Multi-Dimensional Human Workload Assessment For Supervisory Human–Machine Teams Heard, J., & Adams, J. A. (2019). Multi-Dimensional Human Workload Assessment for Supervisory Human–Machine Teams. Journal of Cognitive Engineering and Decision Making, 13(3), 146-170.
1 1B Multimodal Human-Machine Interface And User Cognitive Load Measurement Shi, Y., Taib, R., Ruiz, N., Choi, E., & Chen, F. (2007). Multimodal human-machine interface and user cognitive load measurement. IFAC Proceedings Volumes40(16), 200-205.
1,2,3 1B,2A,3B,DB,DC New Single-Display Intelligent Adaptive Interface For Controlling A Group Of Unmanned Aerial Vehicles Ilbeygi, M., & Kangavari, M. R. (2019). A New Single-Display Intelligent Adaptive Interface for Controlling a Group of UAVs. Journal of AI and Data Mining7(2), 341-353.
2,3 2A,3B Operator Informational Needs For Multiple Autonomous Small Vehicles Trujillo, A. C., Fan, H., Cross, C. D., Hempley, L. E., Cichella, V., Puig-Navarro, J., & Mehdi, S. B. (2015). Operator informational needs for multiple autonomous small vehicles. Procedia Manufacturing3, 936-943.
2,3 2A,3B,DC,DE Operator-Autonomy Teaming Interfaces To Support Multi-Unmanned Vehicle Missions Calhoun, G. L., Ruff, H. A., Behymer, K. J., & Mersch, E. M. (2017). Operator-autonomy teaming interfaces to support multi-unmanned vehicle missions. In Advances in Human Factors in Robots and Unmanned Systems (pp. 113-126). Springer, Cham.
1,3 1C,1D,3B,DE Optical Brain Monitoring For Operator Training And Mental Workload Assessment Ayaz, H., Shewokis, P. A., Bunce, S., Izzetoglu, K., Willems, B., & Onaral, B. (2012). Optical brain monitoring for operator training and mental workload assessment. Neuroimage59(1), 36-47.
2 2C,DA Optimal Energy-Based 4D Guidance And Control For Terminal Descent Operations Lim, Y., Gardi, A., Sabatini, R., Ranasinghe, K., Ezer, N., Rodgers, K., & Salluce, D. (2019). Optimal energy-based 4D guidance and control for terminal descent operations. Aerospace Science and Technology95, 105436.
1,2,3 1B,2A,3A,3B,DB,DC Optimizing Operator–Agent Interaction In Intelligent Adaptive Interface Design- A Conceptual Framework Hou, M., Zhu, H., Zhou, M., & Arrabito, G. R. (2010). Optimizing operator–agent interaction in intelligent adaptive interface design: A conceptual framework. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews)41(2), 161-178.
1 1B,DB Optimizing Virtual Reality For All Users Through Gaze-Contingent And Adaptive Focus Displays Padmanaban, N., Konrad, R., Stramer, T., Cooper, E. A., & Wetzstein, G. (2017). Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays. Proceedings of the National Academy of Sciences114(9), 2183-2188.
1,2,3 1D,2A,2B,3B,DB,DE Pioneers In Cognitive Engineering & Decision Making Research – Foundational Contributions To The Science Of Human-Automation Interaction Kaber, D., Hancock, P., Jagacinski, R., Parasurman, R., Wickens, C., Wilson, G., ... & Ockerman, J. (2011, September). Pioneers in cognitive engineering & decision making research–foundational contributions to the science of human-automation interaction. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 55, No. 1, pp. 321-325). Sage CA: Los Angeles, CA: SAGE Publications.
1,2,3 1D,2A,3A,DB Recon- An Adaptive Human-Machine System For Supporting Intelligence Analysis Ross, W., Morris, A., Ulieru, M., & Guyard, A. B. (2013, October). RECON: An adaptive human-machine system for supporting intelligence analysis. In 2013 IEEE International Conference on Systems, Man, and Cybernetics (pp. 782-787). IEEE.
1,2,3 1A,2A,3B,3C,DB Sensing Cognitive Multitasking For A Brain-Based Adaptive User Interface Solovey, E. T., Lalooses, F., Chauncey, K., Weaver, D., Parasi, M., Scheutz, M., ... & Jacob, R. J. (2011, May). Sensing cognitive multitasking for a brain-based adaptive user interface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 383-392).
1 1D Sensory Substitution And The Human–Machine Interface Bach-y-Rita, P., & Kercel, S. W. (2003). Sensory substitution and the human–machine interface. Trends in cognitive sciences7(12), 541-546.
2 2A,2C Space Traffic Management- Towards Safe And Unsegregated Space Transport Operations Hilton, S., Sabatini, R., Gardi, A., Ogawa, H., & Teofilatto, P. (2019). Space traffic management: Towards safe and unsegregated space transport operations. Progress in Aerospace Sciences105, 98-125.
1,3 1C,3C,DB Speech Workload Estimation For Human-Machine Interaction Heard, J., Fortune, J., & Adams, J. A. (2019, November). Speech Workload Estimation for Human-Machine Interaction. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 63, No. 1, pp. 277-281). Sage CA: Los Angeles, CA: SAGE Publications.
2,3 2A,2B,3B Strategies To Assist Drivers In Remaining Attentive While Under Partially Automated Driving Llaneras, R. E., Cannon, B. R., & Green, C. A. (2017). Strategies to assist drivers in remaining attentive while under partially automated driving: Verification of human–machine interface concepts. Transportation research record2663(1), 20-26.
1,2,3 1D,2A,2B,3B,DC,DE Supervisory Control Of Multiple Robots- Human-Performance Issues And User-Interface Design Chen, J. Y., Barnes, M. J., & Harper-Sciarini, M. (2010). Supervisory control of multiple robots: Human-performance issues and user-interface design. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews)41(4), 435-454.
2 2A The Confluence Of Situation Awareness And Mental Workload For Adaptable Human–Machine Systems Vidulich, M. A., & Tsang, P. S. (2015). The confluence of situation awareness and mental workload for adaptable human–machine systems. Journal of Cognitive Engineering and Decision Making9(1), 95-97.
1,2,3 1D,2A,3B,DE The Crew Workload Manager- An Open-Loop Adaptive System Design For Next Generation Flight Decks Dorneich, M. C., Passinger, B., Hamblin, C., Keinrath, C., Vašek, J., Whitlow, S. D., & Beekhuyzen, M. (2011, September). The crew workload manager: an open-loop adaptive system design for next generation flight decks. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 55, No. 1, pp. 16-20). Sage CA: Los Angeles, CA: SAGE Publications.
2,3 2A,3A The Orca Hub- Explainable Offshore Robotics Through Intelligent Interfaces Hastie, H., Lohan, K., Chantler, M., Robb, D. A., Ramamoorthy, S., Petrick, R., ... & Lane, D. (2018). The ORCA hub: Explainable offshore robotics through intelligent interfaces. arXiv preprint arXiv:1803.02100.
1,3 1D,3B The Workload Sensing For The Human Machine Interface Of Unmanned Air Systems Piuzzi, B., Cont, A., & Balerna, M. (2014, May). The workload sensing for the human machine interface of unmanned air systems. In 2014 IEEE Metrology for Aerospace (MetroAeroSpace) (pp. 50-55). IEEE.
1 1A,DB Towards Adaptive Information Visualization- On The Influence Of User Characteristics Toker, D., Conati, C., Carenini, G., & Haraty, M. (2012, July). Towards adaptive information visualization: on the influence of user characteristics. In International conference on user modeling, adaptation, and personalization (pp. 274-285). Springer, Berlin, Heidelberg.
3 3A,DB Towards Effective Adaptive User Interfaces Design Cerny, T., Donahoo, M. J., & Song, E. (2013). Towards effective adaptive user interfaces design. In Proceedings of the 2013 Research in Adaptive and Convergent Systems (pp. 373-380).
1,2,3 1C,2A,3B,DC,DE Towards Intelligent Operator Interfaces In Support Of Autonomous Uvs Operations Heffner, K., & Hassaine, F. (2011). Towards intelligent operator interfaces in support of autonomous uvs operations. PEGASUS SIMULATION SERVICES INC MONTREAL (CANADA).
1 1D,DB Towards Measurement Of Brain Function In Operational Environments Gevins, A., Leong, H., Du, R., Smith, M. E., Le, J., DuRousseau, D., ... & Libove, J. (1995). Towards measurement of brain function in operational environments. Biological Psychology40(1-2), 169-186.
2,3 2A,3B,DB,DE Towards Modern Inclusive Factories- A Methodology For The Development Of Smart Adaptive Human-Machine Interfaces Villani, V., Sabattini, L., Czerniaki, J. N., Mertens, A., Vogel-Heuser, B., & Fantuzzi, C. (2017, September). Towards modern inclusive factories: A methodology for the development of smart adaptive human-machine interfaces. In 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA) (pp. 1-7). IEEE.
1,2,3 1D,2A,3B Unmanned Aerial System (Uas) Human-Machine Interfaces New Paradigms In Command And Control Vincenzi, D. A., Terwilliger, B. A., & Ison, D. C. (2015). Unmanned aerial system (UAS) human-machine interfaces: new paradigms in command and control. Procedia Manufacturing3, 920-927.
3 3B,DC Unmanned Vehicle Plan Comparison Visualizations For Effective Human-Autonomy Teaming Behymer, K. J., Mersch, E. M., Ruff, H. A., Calhoun, G. L., & Spriggs, S. E. (2015). Unmanned vehicle plan comparison visualizations for effective human-autonomy teaming. Procedia Manufacturing3, 1022-1029.
1,3 1B,3A,3C User Interface Adaptation Based On User Feedback And Machine Learning Mezhoudi, N. (2013, March). User interface adaptation based on user feedback and machine learning. In Proceedings of the companion publication of the 2013 international conference on Intelligent user interfaces companion (pp. 25-28).
1,3 1D,3C Using A Low-Cost Electroencephalograph For Task Classification In Hci Research Lee, J. C., & Tan, D. S. (2006, October). Using a low-cost electroencephalograph for task classification in HCI research. In Proceedings of the 19th annual ACM symposium on User interface software and technology (pp. 81-90).
1,3 1D,3B,DE Vigilance And Automation Dependence In Operation Of Multiple Unmanned Aerial Systems (Uas)- A Simulation Study Wohleber, R. W., Matthews, G., Lin, J., Szalma, J. L., Calhoun, G. L., Funke, G. J., ... & Ruff, H. A. (2019). Vigilance and automation dependence in operation of multiple unmanned aerial systems (UAS): a simulation study. Human factors61(3), 488-505.
3 3A,DD User And Context Adaptive Neural Networks For Emotion Recognition Caridakis, G., Karpouzis, K., & Kollias, S. (2008). User and context adaptive neural networks for emotion recognition. Neurocomputing, 71(13-15), 2553-2562.
3 3A,DB Building And Evaluating An Adaptive User Interface Using A Bayesian Network Approach Rebai, R., Maalej, M. A., Mahfoudhi, A., & Abid, M. (2016). Building and evaluating an adaptive user interface using a Bayesian network approach. International Journal of Computer Science and Information Security, 14(7), 548.
3 3A Adaptive User Interfaces For Intelligent E-Learning- Issues And Trends Ahmad, A. R., Basir, O. A., & Hassanein, K. (2004, December). Adaptive User Interfaces for Intelligent E-Learning: Issues and Trends. In ICEB (pp. 925-934).
1 1A,1C,DA WorkPartner- Interactive Human-Like Service Robot For Outdoor Applications Halme, A., Leppänen, I., Suomela, J., Ylönen, S., & Kettunen, I. (2003). WorkPartner: interactive human-like service robot for outdoor applications. The International Journal of Robotics Research, 22(7-8), 627-640.
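As a further illustration, the short sketch below shows how coded rows of the form used in Table C1 could be tallied into the top-level category totals that feed a goodness-of-fit test such as the one sketched in Appendix A. The rows and field names are hypothetical placeholders introduced only for demonstration.

```python
# Illustrative sketch: tallying coded sample rows (as in Table C1) into
# top-level category counts. The rows below are hypothetical placeholders,
# not entries from this study's sample.
from collections import Counter

coded_samples = [
    {"categories": "1,3", "sub_categories": "1C,3B,DB"},       # hypothetical row
    {"categories": "2", "sub_categories": "2A"},                # hypothetical row
    {"categories": "1,2,3", "sub_categories": "1D,2A,3B,DE"},   # hypothetical row
]

# Category codes follow Table A1: 1 = Workload Management,
# 2 = Situation Awareness, 3 = Autonomy.
category_totals = Counter()
for sample in coded_samples:
    for code in sample["categories"].split(","):
        category_totals[code.strip()] += 1

print(dict(category_totals))  # e.g., {'1': 2, '3': 2, '2': 2}
```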
