Workload Analysis of Menu Layering in Unmanned Aircraft Systems Control Stations

PDF Version: https://ryanblakeney.com/wp-content/uploads/2020/05/Workload-Analysis-of-Menu-Layering-in-Unmanned-Aircraft-Systems-Control-Stations.pdf

Abstract

A Control Station (CS) is the interface that allows Unmanned Aircraft Systems (UAS) operators to operate their aircraft. During a flight, the operators must be capable of monitoring the health and status of the unmanned aircraft system. The three main functions of a control station are mission planning, observing, and piloting. Control stations range from small handheld controllers and laptop computers to large mobile containers and complex building-based systems. Most unmanned aircraft accidents are attributed to operator error, part of which is related to excessive workload. To avoid adding to operator workload, the operator’s control station should be comfortable, and the interface should be intuitive. Human factors engineering uses scientific knowledge about human behavior, physical characteristics, and cognitive capability to specify the design and use of a human-machine system. The objective is to improve system efficiency by minimizing human error and optimizing timely performance, comfort, and safety. The operator within the control station must be able to access all relevant information when needed; however, they must not be overwhelmed by a cluttered control station. The number of controls and displays should be limited to what is needed to operate the unmanned aircraft system effectively. The software should be designed around a menu-driven interface that allows the operator to find the necessary information in a timely and error-free manner commensurate with the urgency of the task. Clearly defined standards for control stations would enable human factors engineers to develop efficient and intuitive designs for unmanned aircraft system operators.

Keywords: unmanned aircraft systems, human factors, menu layering, control station, workload

Workload Analysis of Menu Layering in Unmanned Aircraft Systems Control Stations

This research paper focuses on the effects of menu layering in Unmanned Aircraft Systems (UAS) Control Stations (CS). A Control Station is the interface through which a remote pilot controls an Unmanned Aircraft (UA). The pilot uses this interface to perform the multiple tasks associated with operating the UA. Accidents associated with human factors accounted for 69% of all unmanned aircraft accidents in the United States Air Force (Rogers, Palmer, Chitwood, & Hover, 2004). Although Department of Defense (DoD) UAS mishap rates differ between systems, an average of 50 mishaps occur every 100,000 flight hours, in contrast to the Department of Defense manned aircraft average of one mishap per 100,000 flight hours (Weibel & Hansman, 2005). The Department of Defense defines a mishap as an accident in which personnel are injured or the UAS sustains damage or is destroyed (Department of Defense, 2018). The Department of Defense uses the terms mishap and accident interchangeably throughout its reports and research.
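
The normalization behind these rates is straightforward. As an illustration only (the helper function below is mine, not from any of the cited sources):

```python
def mishap_rate(mishaps: int, flight_hours: float) -> float:
    """Mishaps normalized per 100,000 flight hours (the DoD's reporting unit)."""
    return mishaps / flight_hours * 100_000

# Rates cited above (Weibel & Hansman, 2005): roughly 50 UAS mishaps per
# 100,000 flight hours versus 1 for manned aircraft. A UAS fleet logging
# 10,000 hours would therefore expect about 5 mishaps; a manned fleet, 0.1.
expected_uas = 50 * 10_000 / 100_000      # 5.0
expected_manned = 1 * 10_000 / 100_000    # 0.1
```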

Information access, command and control functions, and communication functions for UAS operators should be efficient, easy to learn, and easy to use. According to NATO (2019), the design, location, and access of critical controls that may require immediate action must be compatible with the quick and accurate reaction of the operators during emergency operation. If an interface utilizes pull-down menus, the controls that require immediate reaction of the operators must be accessible at the first level of the menu pull-down (NATO, 2019). The accessibility of information and actions by the operator is critical for the safe use of the UAS.

Department of Defense design standards for UAS were updated with a manual from the Under Secretary of Defense in 2012 and more recently by NATO in 2019. These standards take into account nearly all facets of design considerations for control stations used for unmanned aircraft. The Department of Defense standard for menus is that each menu should have no more than ten and no fewer than three options (Under Secretary of Defense, 2012). Options, in this case, are the selectable functions in a system, and a menu is a window with a list of options. The lower bound prevents a proliferation of menus containing only two or fewer options, while the upper bound avoids overloading the operator with more than ten options at once. The manual also explains that complex menus with deep hierarchies should not exceed four steps (Under Secretary of Defense, 2012). By keeping the number of menus and options per task low, fewer steps are required to complete the task. If the operator must complete many steps to finish a task, it could indicate that options are layered deep in the user interface of the control station.
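
These breadth and depth rules can be expressed as a simple check over a menu tree. The sketch below is illustrative: the tree encoding and function name are my own, while the limits (three to ten options per menu, no more than four levels) come from the guideline cited above.

```python
# A menu is a dict: keys are option labels, values are either None (a leaf
# action) or another dict (a sub-menu).
MIN_OPTIONS, MAX_OPTIONS, MAX_DEPTH = 3, 10, 4  # Under Secretary of Defense (2012)

def violations(menu: dict, path=(), depth=1):
    """Yield (path, reason) pairs for every breach of the breadth/depth rules."""
    if depth > MAX_DEPTH:
        yield path, f"depth {depth} exceeds {MAX_DEPTH}"
    n = len(menu)
    if not (MIN_OPTIONS <= n <= MAX_OPTIONS):
        yield path, f"{n} options outside {MIN_OPTIONS}-{MAX_OPTIONS}"
    for label, sub in menu.items():
        if isinstance(sub, dict):
            yield from violations(sub, path + (label,), depth + 1)

# Example: a "Comms" sub-menu with only two options violates the breadth rule.
root = {
    "Flight":  {"Autopilot": None, "Waypoints": None, "Altitude": None},
    "Payload": {"Camera": None, "Sensor": None, "Record": None},
    "Comms":   {"Radio": None, "Datalink": None},
}
for path, reason in violations(root):
    print("/".join(path), "-", reason)  # Comms - 2 options outside 3-10
```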

The learning outcomes addressed in this research paper are:

  • Analyze the effects of menu layering for unmanned aircraft systems and their impact on the operator's ability to perform flight functions.
  • Evaluate the advantages and disadvantages of unmanned system control stations concerning their current and future intended uses as they relate to human factors. 
  • Identify and describe the significant human factors issues surrounding the design, use, and implementation of unmanned systems in today's commercial and military environments.
  • Evaluate the commonalities and differences of human factors issues surrounding the design, use, and implementation of unmanned systems as compared to manned systems.
  • Describe the issues surrounding crews with a higher workload caused by poor human factors design in the control station and human-machine interface.

Problem Statement

The control station for a UAS presents information to the operator and allows them to fly the aircraft through a Human-Machine Interface (HMI) that includes layered menus on displays. The Human-Machine Interface is the part of a system at which the human and the machine meet and interact, allowing the operators to input controls and read output from the system's displays (Under Secretary of Defense, 2012). The interface for each UAS varies based on the objective, requirements, and complexity of the system.

Menus should be consistent across the design of the control station to ensure ease of use. Information available through the HMI should be easy for operators to access, minimizing the workload imposed by the menus and information displays. Minimizing workload gives operators additional time to build their situation awareness (SA) by freeing them to focus on more than one area. Menu layering through pull-down and sub-menus that require operators to navigate multiple menus can therefore affect their workload.

Significance of Issue

The workload of UAS operators can affect their performance during flight. NATO (2019) explains that operators must have rapid and precise access to critical actions or options when faced with an emergency. Reducing workload would allow operators to focus on the more critical and time-sensitive tasks the environment may demand. With 69% of all unmanned aircraft accidents attributed to human factors, workload reduction may help lower the number of accidents that occur.

Excessive workload has been identified in multiple human factors guidelines, studies, and research efforts as a critical consideration in a human factors engineer's design of the human-machine interface and the control station. NATO (2019), Ahlstrom and Longo (2003), and Yeh, Swider, and Donovan (2016) have gone into great detail in their standards and guidelines for human factors design to ensure that operator workload is minimized and that operators are not oversaturated with tasks or information. If menu design for UAS can reduce user workload, there is the potential to lower the number of accidents or mishaps related to human factors.

Review of Relevant Literature

Guidelines

Human Factors Guidelines for Unmanned Aircraft Systems is an article by Hobbs and Lyall (2016) that focuses on the control station. The authors explain that existing control stations were designed without human factors design principles in mind (Hobbs & Lyall, 2016). Design problems commonly found in control stations include a reliance on textual presentation of information, complicated sequences of menus to perform time-critical or frequent tasks, unguarded safety controls placed where they can be easily activated, controls that perform critical functions yet cannot be reached from the pilot's seat, and multiple displays (Hobbs & Lyall, 2016).

It is believed that some of these design deficiencies were due to rushed production to meet wartime needs (Hobbs & Lyall, 2016). The authors note that, at the time of writing, no human factors guidelines existed for the design of control stations for unmanned systems. In Table 1, the authors list the considerations necessary to ensure adequate design of a UAS. They explain that although the control station of a UAS resembles both an aircraft cockpit and an office, the design must be modified to ensure the crews operating the aircraft have access to relevant information during flight.

The authors also cover five aspects that may be addressed by their proposed guidelines for designing a UAS human-machine interface. The first is the task description. The designers of the interface must understand the intent behind their UAS. If the mission is an Intelligence, Surveillance, and Reconnaissance (ISR) mission, the task description should indicate this, and the designers will take that information into account in the design.

Two more guidelines are display requirements and control requirements. Display guidelines describe what information must be provided to the operators without specifying how that information is displayed (Hobbs & Lyall, 2016). Control guidelines indicate what inputs the control station must be capable of receiving from the operators, again without specifying how they are designed (Hobbs & Lyall, 2016). The fourth guideline covers the physical properties of the interface, including layout, shape, visibility, and the use of color (Hobbs & Lyall, 2016). This guideline describes what the operator will use to fly the aircraft from the control station.

Table 1. Some Unique Human Factors Challenges of Unmanned Aircraft Systems.
Note. Reprinted from “Human Factors Guidelines for Unmanned Aircraft Systems,” by A. Hobbs and B. Lyall, 2016, p. 24. Copyright 2016 by A. Hobbs and B. Lyall.

The final guideline is general human factors principles. This is a broad guideline that covers the entirety of the UAS. The guideline covers all areas to ensure inconsistencies are dealt with in the design of the control station. The guidelines will help when all of the systems in the human-machine interface are put together. If there are mismatches in how information is controlled or displayed, this guideline will help create the path to merging the systems to ensure there is an intuitive and efficient design.

This is relevant to menu layering research in that it explains the areas and guidelines that should be taken into account when designing a system. These guidelines assist designers in ensuring the control station is designed appropriately to allow for the systems to meet the requirements of the user and provide an efficient system for the operators to control their aircraft. The designers for the system would need to ensure the control station is designed to execute the tasks that were listed in the requirements for the UAS.

The Office of the Under Secretary of Defense (2012) developed guidelines and standards titled “Unmanned Aircraft Systems Ground Control Station Human-Machine Interface” for the design of UAS control stations. The guidelines are extensive and cover a large range of design standards for CS. One of the concepts explained is card sorting: the practice of having users assist in the design of the Human Control Interface (HCI). This method lets the users of the UAS help determine the layout by showing designers what is essential to them and how they would group the information being displayed.

The manual also explains that menu hierarchies should favor shallow and broad designs over narrow and deep ones. Menus should require few steps to complete a task, and the design should take into account response time and the display rate of information so that the system can act on the operator's requests quickly (Under Secretary of Defense, 2012). Limiting tasks to no more than four steps allows operators to reach the information they need more quickly. Fewer steps in the menu design also let operators choose the correct menu more quickly: with multiple top-level menus to choose from, the operator can go directly to the menu for a specific function instead of starting from a single menu that leads to numerous functions.
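
The preference for broad, shallow hierarchies can be illustrated with a quick calculation: for a fixed number of reachable functions, a deeper tree forces more selections per task. A hypothetical sketch (the function and numbers below are mine, chosen only to make the trade-off concrete):

```python
def steps_to_reach(total_functions: int, options_per_menu: int) -> int:
    """Menu selections needed to reach any one of `total_functions` leaf
    options when every level of a balanced hierarchy offers
    `options_per_menu` choices."""
    steps, reachable = 0, 1
    while reachable < total_functions:
        reachable *= options_per_menu
        steps += 1
    return steps

# 64 selectable functions:
print(steps_to_reach(64, 8))  # broad and shallow (8 per menu): 2 selections
print(steps_to_reach(64, 2))  # narrow and deep (2 per menu):   6 selections
```

The broad layout stays within the four-step ceiling with room to spare, while the deep layout exceeds it even though both expose the same 64 functions.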

The Federal Aviation Administration published the “Human Factors Design Standard” (Ahlstrom & Longo, 2003), a compilation of human factors practices and principles for the procurement, development, engineering, and testing of Federal Aviation Administration systems, facilities, and equipment. For HMI design, designs should favor visual and spatial representations of information over text or verbal displays to avoid high workload (Ahlstrom & Longo, 2003). To help reduce operator workload, dynamic menus should be used so that the options displayed to users are relevant to the current environment (Ahlstrom & Longo, 2003). In other words, dynamic menus should be used instead of layering options through multiple menus, allowing users to reach their desired information or selections while completing a task.
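
Context-sensitive filtering of the kind the FAA standard recommends can be sketched as a lookup that shows only the options tagged as relevant to the current flight phase. The option names, phase tags, and function below are hypothetical, invented purely for illustration:

```python
# Each option is tagged with the flight phases in which it is relevant.
OPTIONS = {
    "Arm payload":         {"cruise"},
    "Deploy landing gear": {"approach", "landing"},
    "Set cruise altitude": {"climb", "cruise"},
    "Abort / go-around":   {"approach", "landing"},
}

def dynamic_menu(phase: str) -> list[str]:
    """Return only the options relevant to the current phase, so the
    operator never scans entries that cannot apply right now."""
    return sorted(opt for opt, phases in OPTIONS.items() if phase in phases)

print(dynamic_menu("approach"))  # ['Abort / go-around', 'Deploy landing gear']
```

The same four functions exist in every phase, but the menu the operator actually sees stays short, in line with the standard's aim of keeping irrelevant options out of view.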

The North Atlantic Treaty Organization (NATO) (2019) developed the Unmanned Aircraft Systems Airworthiness Requirements (USAR). These guidelines define the requirements a UAS design must meet to achieve airworthiness certification for rotary- and fixed-wing aircraft weighing between 150 and 20,000 kg (NATO, 2019). NATO (2019) defines workload as the amount of work assigned to or expected from a person in a specified time, and states that the workload of an individual UAS operator must take into account operation of the essential elements of the UAS, navigation, flight path control, communication, compliance with ATC, and command decisions.

NATO (2019) explains that safety-critical controls must be available to the operator to allow for the rapid and precise reaction of the UAS crew in an emergency. If the UAS has a "pull-down" menu, the operators must be capable of accessing the relevant information at the first level of the menu to allow for prompt reaction (NATO, 2019). Access to information or options in the first layer of a pull-down menu lets crews reach relevant actions in a short amount of time. If users can access the information more easily, workload is reduced because less time is spent trying to access it.
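
This NATO requirement can be phrased as a testable property of a menu tree: every option tagged safety-critical must appear at the first level. A hypothetical sketch, assuming the same nested-dict menu encoding used above and an invented set of safety-critical labels:

```python
# Menu encoded as nested dicts (sub-menus) with None for leaf actions.
# The safety-critical tag set is a hypothetical design requirement.
SAFETY_CRITICAL = {"Return to base", "Flight terminate"}

def meets_first_level_rule(menu: dict) -> bool:
    """True when every safety-critical option appears at the first menu
    level, as NATO (2019) requires for pull-down interfaces."""
    return SAFETY_CRITICAL <= set(menu)

good = {"Return to base": None, "Flight terminate": None,
        "Payload": {"Camera": None, "Record": None}}
bad = {"Flight": {"Emergency": {"Flight terminate": None}},  # buried two levels down
       "Return to base": None}

print(meets_first_level_rule(good))  # True
print(meets_first_level_rule(bad))   # False
```

A check like this could run as part of an interface review: the `bad` layout still contains the flight-terminate action, but an operator under stress would have to navigate two menus to reach it.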

The Human Factors Considerations in the Design and Evaluation of Flight Deck Displays and Controls, Version 2.0, was created by Yeh, Swider, and Donovan (2016) to identify and prioritize guidance on human factors issues in the design and evaluation of flight deck displays and controls for all types of aircraft. The report explains that hierarchical menu structures should be designed to let operators step through the available menus or options in a logical way to complete their tasks (Yeh, Swider, & Donovan, 2016). This type of navigation allows operators to reach their menu or options logically, based on the grouping and path of the options, ensuring an intuitive and easy-to-use menu design.

The number of sub-menus should be designed to allow for quick access to the desired options without over-reliance on memorizing the steps of the menu structure (Yeh, Swider, & Donovan, 2016). The number of steps required to choose the desired options should be consistent in frequency, importance, and urgency of the operator's task for the environment (Yeh, Swider, & Donovan, 2016). By designing the menus to allow for quick and intuitive access to relevant information, the operator should experience a reduction in workload by having easy access to their desired options or data.

Menu layering of information should not hinder the pilot's ability to identify the location of a desired option or control (Yeh, Swider, & Donovan, 2016). Human factors engineers should consider the location and accessibility of control functions within the various menu layers and how the pilot will navigate the menus (Yeh, Swider, & Donovan, 2016). For menu-based controls, the number and complexity of the steps required to retrieve or access a control or option should be appropriate to the intended task or use of the control (Yeh, Swider, & Donovan, 2016). The number of layered sub-menus should allow for easy access without over-reliance on memorizing the menu structure (Yeh, Swider, & Donovan, 2016). A menu design that is intuitive for its intended use allows operators to access controls or information without memorizing the steps to reach their intended location in the hierarchy.

Mishaps Related to Control Station Design

Marshall et al. (2016) found that human error accounted for roughly half of all UAS mishaps. The authors compared U.S. Air Force, Navy, and Army UAS mishaps and found that mishap rates for U.S. Air Force UAS were higher due to the high autonomy in their systems (Marshall et al., 2016). The human factors contribution for the U.S. Air Force was determined to be higher than that of the other services because of the higher-stress environments in which its UAS are flown and the lower opportunity for human interaction with the UAS (Marshall et al., 2016). Other military services that fly unmanned aircraft regularly maintain manual control of the aircraft throughout the flight, which holds the operator's attention; increased use of manual control also increases the workload on the operator.

The authors explain the use of the Human Factors Analysis and Classification System (HFACS). As seen in Figure 1, HFACS classifies four levels of failure. The system is used to determine which areas contributed most to a mishap, enabling investigators to determine where to focus their attention. It guides investigators in understanding how to avoid future mishaps by fixing the specific failures that occurred. The model looks at the four Ws: Who, What, When, and Why. This system can help determine the operator's workload during an incident, to find contributing factors related to how much work the operator was doing at the time. This type of investigation can help determine whether workload related to the HMI contributed to the mishap.

Marshall et al. (2016) gathered multiple studies regarding UAS mishaps and determined that the impact of human factors on performance is significantly mitigated by operational context, human-system integration, system automation, crew composition, and crew training. Human-system integration is the consideration of the human element in all aspects of a system's lifecycle, reducing resource utilization and the system costs of inefficiency while increasing system performance and productivity. System automation can greatly reduce operator workload; however, it requires large amounts of mission planning, and there is the possibility of distrust of the automation if the system does not respond the way the pilot expects.

Crew size, composition, and training can have a significant impact on the ability of operations teams to execute their mission (Marshall et al., 2016). New UAS are commonly upgraded and do not share similar control stations or crew constructs. This delays crew training, as it can take time to update the training to match the advances or changes in new UAS upgrades, much as a manned-aircraft pilot must learn to fly a new aircraft.

Figure 1. Human Factors Analysis and Classification System (HFACS). Reprinted from “Introduction to Unmanned Aircraft Systems,” by Marshall, D. M., Barnhart, R. K., Shappee, E., & Most, M. T. (Eds.), 2016. Copyright 2016 by Marshall, D. M., Barnhart, R. K., Shappee, E., & Most, M. T.

Department of Defense UAS control stations are full of human factors and ergonomic deficiencies. In studies of UAS mishaps, 69% of mishaps were human factors related, and 24% were directly attributed to the control station (Waraich et al., 2013). In 2005, the Department of Defense started creating roadmaps and designating guidelines for the design and acquisition of UAS. According to the authors, however, the DoD has failed to create a specific standard for the control stations of its UAS. The authors explain that the Government Accountability Office (GAO) interviewed multiple UAS manufacturers and found that development and design lacked the incorporation of human factors engineering. This led to the creation of numerous UAS with different types of control stations.

Control Station Design

In the chapter “Remotely Piloted Aircraft” of the Handbook of Human Factors in Air Transportation Systems, Alan Hobbs (2017) examines requirements and deficiencies in the design of UAS. One deficiency Hobbs (2017) describes is “complicated, multistep sequences required to perform routine or time-critical tasks, often involving menu trees.” This deficiency is key to this research paper, as it shows that layering information inside menus can make even the simplest task difficult to complete. If the operator must navigate these menus often, it adds time to each task while making the task itself tedious. If the operator simply wants to check the oil temperature, menu layering can increase the time required compared with a gauge that is shown continuously throughout the flight.

Another deficiency the author covers is an operator's heavy reliance on memory to keep track of a UAS's status or flight-plan details (Hobbs, 2017). If information is buried deep in menus or behind multiple layers, it can be difficult for the operator to view it often without detriment to the primary task of flying the aircraft. This is relevant because it shows that data fusion, the combination of information in a single location, is essential for displaying information to the operator. If information is buried inside multiple layers of menus or displays, the pilot may be forced to simply write the information down or attempt to memorize it for later recall. This is dangerous for emergency operations: if a pilot is under stress while dealing with an emergency, it is unlikely they would be able to recall the information easily (Pastor, Royo, Santamaria, Prats, & Barrado, 2012).

When designing a new control station for a UAS, it is essential to iterate the design continuously to ensure it meets the intended goal. One method for doing this is the Modified Cooper-Harper Evaluation Tool for Unmanned Vehicle Displays (MCH-UVD) by Cummings, Myers, and Scott (2006). The MCH-UVD is used to evaluate UAS designs to ensure they are operator friendly. Cummings et al. (2006) explain that, unlike losses in productivity due to poor application design in some businesses, a poorly designed UAS interface can be catastrophic enough to cause human casualties. As seen in Figure 2, the Display Qualities Rating Scale helps interface designers ensure their concepts take human factors into account.

Figure 2. Modified Cooper-Harper (MCH-UVD) Scale. Reprinted from “Modified Cooper Harper Evaluation Tool for Unmanned Vehicle Displays,” by Cummings, M. L., Myers, K., & Scott, S. D., 2006, Copyright 2006 by M. L. Cummings, Kevin Myers, Stacey D. Scott.

Using the MCH-UVD, designers can create a product and have a test user attempt to use the design. As the user works through the control station, the diagram is followed from the bottom left to show where the operator interaction begins to reveal deficiencies in the design. The ideal rating on the MCH-UVD is a 1, and the worst case is a 10. A 1 indicates that the display provides adequate information to the operator, with little need for the operator to compensate for missing information. A 10 indicates that the design does not show the operator their desired information.

The scale is relevant to layering in the UAS control station because it helps designers understand where deficiencies exist in the current iteration of the operator display. A caveat in using this diagram for the design and testing of control station displays is that “variation in opinion can be significant and subjective opinions should not necessarily guide interface design because often what users like can be detrimental to their performance” (Cummings et al., 2006).

Research by Fern et al. (2012) examined how a Cockpit Situation Display (CSD) inside a UAS control station can help reduce the workload of Air Traffic Control (ATC) when unmanned aircraft fly in the National Airspace System (NAS). The research used both ATC controllers and UAS operators to measure the controllers' workload when the UAS operators had the capability to see other traffic around them (Fern, Kenny, Shively, & Johnson, 2012). It determined that with a CSD installed on the control station of a UAS, both the pilot and the ATC controller experienced reduced workload from each other, allowing each to focus their attention on other areas during the flights.

Cockpit Situation Displays can show the UAS operator information that would normally be unavailable during flight. The shared picture between ATC and the operator allows the UAS to fly in the NAS without hindering the air traffic controller's ability to integrate the UAS with manned aircraft. Fusing data into a CSD reduces workload by letting the UAS operator see information directly rather than navigating menus to retrieve it. This research shows that presenting information more accessibly to operators reduces their workload.

Alternative Actions

Initial Design

Unmanned aircraft systems require extensive planning and design to ensure an intuitive interface is created for operator use. Human factors engineers must plan and test their designs to ensure they perform the intended function and comply safely with applicable regulations. In an interview, Mikel Atkins, a Senior Human Factors / Crew Systems Engineer for Lockheed Martin, explained that when designing an aircraft, one needs to know what the system will be used for (M. Atkins, personal communication, March 29, 2020). Atkins explained that knowing what the customer is looking for regarding the Concept of Operations (CONOPS), who will be flying, and what the mission is will provide an understanding of where to begin with control station design (M. Atkins, personal communication, March 29, 2020).

The design of a control station and its interface will go through many iterations. Using a performance measurement tool such as the MCH-UVD allows designers to understand where the design may have deficiencies. This is critical in determining, for example, how many layers of information should be displayed directly versus kept in menus. Many initial UAS designs look like engineering test tools, which is not what most end users want in a final product. When creating an interface, engineers want one that provides the end user with proper Situation Awareness (SA) without overloading them with unnecessary information (M. Atkins, personal communication, March 29, 2020). Fusing data on the display can allow the user to see relevant information without having to go looking for it. By designing the system to show an adequate amount of information at a given time, the user will have enough SA to complete their tasks while retaining the capacity to handle any additional tasks that arise during flight.

Initial designs for UAS will typically change through the design process. Engineers take time to determine the usability of their systems by testing with users and measuring their capabilities using a scale similar to the MCH-UVD. These rating scales allow engineers to find where deficiencies exist and modify them, giving the UAS a more human-factors-friendly design. By using an iterative process to design, test, and modify or redesign the interfaces, engineers can ensure that significant deficiencies are removed and that users can perform their mission or task with few or no human factors issues affecting their performance.

Design Standards or Guidelines

With the Under Secretary of Defense (2012) guidelines titled “Unmanned Aircraft Systems Ground Control Station Human-Machine Interface,” the DoD has created guidelines and standards for future UAS designs. These standards define how the DoD will measure future UAS designs and what it expects them to follow. The guide gives engineers best practices for where to start the design process, allowing DoD UAS design to move in the direction of the requested systems (Under Secretary of Defense, 2012). The goal of the guide is not to define every aspect of the design process but to indicate where designs should start, allowing engineers to focus on methods that have proven effective in the eyes of the DoD (Under Secretary of Defense, 2012).

With a guide available from the DoD, private industry has an idea of where to start its process. Despite these starting points, however, there is no existing standard covering all UAS. As far as industry is concerned, there is no definable standard for UAS (M. Atkins, personal communication, March 29, 2020). Atkins explains that many engineers have come up with a so-called standard for what they do, but there is no set design standard for all UAS.

One size does not fit all in design. The amount of information available to an operator has given rise to many data fusion concepts and applications (M. Atkins, personal communication, March 29, 2020). Data fusion is an alternative to layering information inside a control station: by showing multiple pieces of information on a single display, the operator can look at and comprehend whatever information they need while still seeing all of the information available. An example of where this is used is the Head-Up Display (HUD) on some manned aircraft. A head-up display sits in front of the pilot's windscreen so that primary flight information remains in the field of view while the pilot looks out the forward window. The SA gained from this is similar to fusing data on a control station interface.

A similarity between manned and unmanned aircraft with data fusion is displaying information when it is needed and suppressing it when it is not. If the pilot sees too much information while trying to accomplish a task, they can become oversaturated with data. A manned-aircraft pilot can select what is displayed in the HUD to ensure they only see what they want to see; UAS displays should likewise have selectable information so that operators are not overloaded. Ideally, a CS would have adaptive or smart display systems that show relevant information only when it is needed (M. Atkins, personal communication, March 29, 2020).

If the system can present relevant information autonomously, it requires less work from the operator because the system knows what should be displayed during each phase of a mission or flight. To ensure an adequate design, most engineers start with the basics of Aviate, Navigate, and Communicate (M. Atkins, personal communication, March 29, 2020). These are the most basic guidelines and cannot go away during a flight. According to Atkins, the basic methodology that should be followed for new systems or system redesigns is:

  1. Aviate, Navigate, Communicate (Remember, communicate includes the platform, not just people)
  2. CONOPS for mission - What components will I need to fly the mission?
  3. Contingency planning - Loss of Com/Nav, Weather, System Malfunctions, etc.
  4. The Look Ahead - As pilots, we are always thinking several steps ahead.  Does the system help me with this?
  5. Data Fusion - Get the pilots in a room and discuss how they want/need to have data displayed.
  6. Pilot Expectation - Did the system react in the way the pilot expected?  Surprises are not good.

Menu Depth and Complexity

Operators must have access to information relatively quickly. Critical controls that may require immediate action must be designed, located, and made accessible so that the operator can react quickly and accurately in an emergency (NATO, 2019). If an operator has to spend excess time navigating the menu hierarchy, the time it takes to complete the task at hand increases.

If an interface uses pull-down menus, controls that require an immediate reaction from the operator must be accessible at the first level of the pull-down (NATO, 2019). This is similar to how most manned aircraft provide immediate access to emergency switches or handles during flight. UAS operators must be able to access information and actions quickly enough to complete urgent tasks; this is a critical human factors deficiency for unmanned aircraft. A UAS that operates on a computer screen can easily modify the display at the needs or request of the users. That same flexibility is a disadvantage: operators may have to change their displays to match either their own preferences or the standards of the company or unit they fly for.

Over the years, three layers deep has been found to be a good guideline for menu layering (M. Atkins, personal communication, March 29, 2020). SA can be fluid, and depending on the mission, it can require more or fewer components. Elements vital to the mission must be readily available to the operators, yet engineers love to layer elements of information (M. Atkins, personal communication, March 29, 2020). Long-duration UAS operations require crew changes, and the control station must maintain the same HMI display layout between operators so that each crew takes over the aircraft with the same baseline.
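The two guidelines above — menus at most three layers deep, and time-critical controls at the first level — are simple enough to check mechanically. The sketch below is purely illustrative, assuming a menu modeled as a nested dictionary; the control names and structure are hypothetical, not drawn from any fielded control station.

```python
# Illustrative check of two guidelines from the text: menus should be at
# most three layers deep, and time-critical controls must sit at the top
# level. A menu is a dict mapping labels to either None (a leaf control)
# or a nested sub-menu dict. All names here are hypothetical.

MAX_DEPTH = 3  # "three layers deep" guideline

def menu_depth(menu):
    """Number of menu layers; a leaf control contributes no layers."""
    if not isinstance(menu, dict) or not menu:
        return 0
    return 1 + max(menu_depth(child) for child in menu.values())

def violations(menu, critical_controls):
    """List guideline violations: excess depth, or buried critical items."""
    problems = []
    if menu_depth(menu) > MAX_DEPTH:
        problems.append("menu exceeds three layers")
    for control in critical_controls:
        if control not in menu:  # must appear at the first level
            problems.append(f"critical control '{control}' not at first level")
    return problems
```

For instance, a menu with a flight-terminate control at the top level and a sensor sub-menu three layers deep would pass, while the same terminate control buried one level down would be flagged.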

The complexity of the control station is directly proportional to the complexity of the UAS (M. Atkins, personal communication, March 29, 2020). For complex systems, data fusion is important to avoid adding too many displays or showing irrelevant information at the wrong time during the flight. Too many cluttered displays make a mission difficult for operators, especially when they do not need all of the information at a given time.

Recommendations

Understanding human factors in UAS is critical to ensuring efficient and intuitive control station design. Future research should focus on defining standards and guidelines for UAS. A set standard for UAS that fly the same types of missions would allow more focused research on the same types of control stations and interfaces. If there are too many different types of control stations, research will scatter across different areas, and the progress of UAS in the military and civilian sectors may slow.

Clearly defined standards can also set the maximum depth of data or information depending on the active mission or operation. By defining standards for each mission, crews gain a common understanding of where to find information and when it is displayed. Standards for menu depth can also push engineers to fuse information that is similar or relevant to a given mission set. Research should be done to define which standard displays and menu depths would be most effective for each mission set. These standards should be tested by UAS operators from a range of backgrounds and systems to ensure that opinions from each type of UAS are recorded. Once established, these standards can guide all future UAS designs toward common baselines.

References

Ahlstrom, V., & Longo, K. (2003). Human Factors Design Standard (HF-STD-001). Atlantic City International Airport, NJ: Federal Aviation Administration William J. Hughes Technical Center.

Breda, L. V. (2012). Supervisory control of multiple uninhabited systems—Methodologies and enabling human-robot interface technologies (No. AC/323 (HFM-170) TP/451). NATO Research and Technology Organization, Neuilly-sur-Seine, France.

Cooke, N. J., Rowe, L. J., Bennett, W. J., & Joralmon, D. Q. (Eds.). (2016). Remotely piloted aircraft systems: A human systems integration perspective.

Cummings, M. L., Myers, K., & Scott, S. D. (2006, November). Modified Cooper Harper evaluation tool for unmanned vehicle displays. In Proceedings of UVS Canada: Conference on Unmanned Vehicle Systems Canada.

Department of Defense. (2018, August 31). Mishap Notification, Investigation, Reporting, and Record Keeping. Retrieved from https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/605507p.pdf

Fern, L., Kenny, C. A., Shively, R. J., & Johnson, W. (2012). UAS integration into the NAS: An examination of baseline compliance in the current airspace system. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 56(1), 41-45. doi:10.1177/1071181312561029

General Atomics. (n.d.). Certifiable Ground Control Station Controls First End-To-End Flight. Retrieved from http://www.ga.com/certifiable-ground-control-station-controls-first-end-to-end-flight

Haber, J., & Chung, J. (2016). Assessment of UAV operator workload in a reconfigurable multi-touch ground control station environment. Journal of Unmanned Vehicle Systems, 4(3), 203-216. doi:10.1139/juvs-2015-0039

Hobbs, A. (2017). 17 Remotely Piloted Aircraft. Handbook of Human Factors in Air Transportation Systems, 379.

Hobbs, A., & Lyall, B. (2016). Human Factors Guidelines for Unmanned Aircraft Systems. Ergonomics in Design, 24(3), 23–28. https://doi.org/10.1177/1064804616640632

Peschel, J. M., & Murphy, R. R. (2013). On the human–machine interaction of unmanned aerial system mission specialists. IEEE Transactions on Human-Machine Systems, 43(1), 53–62.

Landry, S. J. (2017). Handbook of human factors in air transportation systems. CRC Press.

Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., . . . Bolia, R. (2018). Avionics human-machine interfaces and interactions for manned and unmanned aircraft. Progress in Aerospace Sciences, 102, 1-46. doi:10.1016/j.paerosci.2018.05.002

Marshall, D. M., Barnhart, R. K., Shappee, E., & Most, M. T. (Eds.). (2016). Introduction to Unmanned Aircraft Systems. CRC Press.

Maza, I., Caballero, F., Molina, R., Peña, N., & Ollero, A. (2010). Multimodal interface technologies for UAV ground control stations: A comparative analysis. Journal of Intelligent & Robotic Systems, 57(1-4), 371-391. doi:10.1007/s10846-009-9351-9

Maurino, D. E., Reason, J., Johnston, N., & Lee, R. B. (2017). Beyond aviation human factors: Safety in high technology systems. Routledge.

NATO. (2019). AEP-4671, Unmanned aircraft systems airworthiness requirements (USAR).

Pastor, E., Royo, P., Santamaria, E., Prats, X., & Barrado, C. (2012). In-flight contingency management for unmanned aerial vehicles. Journal of Aerospace Computing, Information, and Communication, 9(4), 144-160. doi:10.2514/1.55109

Rogers, B., Palmer, B., Chitwood, J., & Hover, G. (2004). Human–systems issues in UAV design and operation (Technical Report HSIAC-RA-2004-001). Wright-Patterson AFB, OH: Human Systems Information Analysis Center.

Sadraey, M. (2017). Unmanned aircraft design: A review of fundamentals. Morgan & Claypool Publishers. ProQuest Ebook Central.

Savage-Knepshield, P., & Chen, J. (2017). Advances in Human Factors in Robots and Unmanned Systems Proceedings of the AHFE 2016 International Conference on Human Factors in Robots and Unmanned Systems, July 27-31, 2016, Walt Disney World®, Florida, USA  (1st ed. 2017.). https://doi.org/10.1007/978-3-319-41959-6

Thompson, W. T., Tvaryanas, A. P., & Constable, S. H. (2005). US military unmanned aerial vehicle mishaps: Assessment of the role of human factors using Human Factors Analysis and Classification System (HFACS). Terra Health, Inc. and the 311th Performance Directorate Performance Enhancement Research Division.

Tso, K. S., Tharp, G. K., Tai, A. T., Draper, M. H., Calhoun, G. L., & Ruff, H. A. (2003, October). A human factors testbed for command and control of unmanned air vehicles. In Digital Avionics Systems Conference, 2003. DASC’03. The 22nd (Vol. 2, pp. 8-C). IEEE.

Under Secretary of Defense. (2012). Unmanned aircraft systems ground control station human-machine interface.

Viquerat, A., Blackhall, L., Reid, A., Sukkarieh, S., & Brooker, G. (2008). Reactive collision avoidance for unmanned aerial vehicles using doppler radar. In Field and Service Robotics (pp. 245-254). Springer Berlin/Heidelberg.

Waraich, Q. R. (“Raza”), Mazzuchi, T. A., Sarkani, S., & Rico, D. F. (2013). Minimizing human factors mishaps in unmanned aircraft systems. Ergonomics in Design, 21(1), 25–32. https://doi.org/10.1177/1064804612463215

Yanushevsky, R. (2011). Guidance of unmanned aerial vehicles.

Yeh, M., Swider, C., Jo, Y. J., & Donovan, C. (2016). Human factors considerations in the design and evaluation of flight deck displays and controls: version 2.0 (No. DOT-VNTSC-FAA-17-02). John A. Volpe National Transportation Systems Center (US).
