How Neuroplasticity Affects Perception & Performance

As humans experience events in their lives, the brain reshapes and changes its neurons. These experience-driven changes are called experience-dependent plasticity (Goldstein, 2018). As the brain changes the way it understands objects, it also changes our ability to perceive how things exist or function. These changes can occur through normal life events or, in the case of an Unmanned Aircraft System (UAS) operator, through training.

Training is a critical part of how humans learn to operate UAS. The continuous practice and education associated with training are what allow pilots to learn to fly these aircraft. U.S. Air Force training to become a UAS pilot can last up to nine months (King, 2015). This long and extensive training allows Air Force instructors to provide the repeated experience needed for experience-dependent plasticity to occur. Goldstein (2018) explains that the brain can be tuned by using an environment to teach it what to expect, creating a known environment the brain can draw on for top-down processing.

Control Station Design

Human factors engineers can capitalize on the brain's ability to train itself to understand and expect certain aspects of a known environment. If engineers design a system or control station around how the brain functions and changes in response to a known environment, they can create an intuitive, easy-to-understand control station. Engineers can create an environment in which the operator becomes comfortable and begins to expect certain pieces of information throughout a flight.

An easy-to-use, easy-to-understand control station creates a baseline, or foundation, for top-down processing. By keeping the control station predictable, engineers can modify specific aspects or parameters based on the mission or on how the aircraft is operating. This type of design has the potential to decrease operator workload throughout a mission by ensuring operators can quickly see and understand any changes in the aircraft's health and status.
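As an illustration only, the Python sketch below shows one way a display could keep a fixed, predictable layout while flagging only the health-and-status values that drift outside expected ranges. The parameter names and limits are hypothetical and do not come from any real UAS control station.

```python
# Hypothetical sketch: a fixed-layout health-and-status readout that only
# highlights parameters that leave their expected range. All parameter
# names and limits below are illustrative.

EXPECTED_RANGES = {
    "engine_temp_c": (40.0, 110.0),
    "link_signal_db": (-90.0, -40.0),
    "battery_volts": (22.0, 26.5),
}

def render_status(readings: dict[str, float]) -> list[str]:
    """Return one line per parameter, always in the same order, so the
    display stays predictable and out-of-range values stand out."""
    lines = []
    for name, (low, high) in EXPECTED_RANGES.items():
        value = readings.get(name)
        if value is None:
            lines.append(f"{name:>15}: NO DATA")
        elif low <= value <= high:
            lines.append(f"{name:>15}: {value:8.1f}")
        else:
            lines.append(f"{name:>15}: {value:8.1f}  << OUT OF RANGE")
    return lines

if __name__ == "__main__":
    sample = {"engine_temp_c": 125.0, "link_signal_db": -62.0, "battery_volts": 24.1}
    print("\n".join(render_status(sample)))
```

The design choice this sketch illustrates is consistency: the layout never changes, so the operator's top-down expectations stay valid and only genuine deviations demand attention.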

A study testing methods of presenting information to UAS pilots included multiple days of training specifically to allow neuroplasticity to occur (Cooke, Rowe, Bennett Jr., & Joralmon, 2016). The study found that allowing time for neuroplasticity increased situation awareness and decreased workload for the participants (Cooke et al., 2016). Human factors engineers can therefore build systems that let users train and practice over a period of days or months; this kind of training is shown to increase operators' ability to use their control stations effectively.

Limitations of Perception in Control Stations

One area of perception that can cause issues for UAS operators is the sensor data collected by intelligence-gathering aircraft. In an interview, Technical Sergeant Christopher Hook, an RQ-4B Sensor Operator, explained that imagery analysts and some Sensor Operators (SOs) are trained to read an image based on metrics such as shadow, shape, size, and tone (TSgt C. Hook, personal communication, June 13, 2020). A sensor operator's ability to look at an object and understand what they are seeing requires extensive training over the course of six months and multiple phases of coursework (TSgt C. Hook, personal communication, June 13, 2020).

An example of a perceptual issue that arises during this process is viewpoint invariance: the human ability to recognize an object even when it is seen from different angles. It is difficult for Artificial Intelligence (AI) to determine what an object is from different angles because the computer cannot recognize that it is seeing the same object from a new viewpoint (Goldstein, 2018). Training a computer model therefore requires providing the AI with the same object from multiple sides or viewpoints. An SO does not face this problem, because humans recognize objects from different viewpoints with relative ease.
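To make the training idea concrete, here is a minimal, hedged sketch of building a multi-view training set in Python. The folder path and label are hypothetical, and the in-plane rotations only stand in for true multi-viewpoint imagery captured from different camera positions.

```python
# Illustrative sketch only: expand each labeled image into several views so a
# classifier is not tied to a single vantage point. The dataset path and the
# "vehicle" label are hypothetical; Pillow's rotate() is used as a simple
# stand-in for genuinely different viewpoints.

from pathlib import Path
from PIL import Image

VIEW_ANGLES = [0, 45, 90, 135, 180, 225, 270, 315]  # synthetic viewpoints

def expand_with_views(image_path: Path, label: str) -> list[tuple[Image.Image, str]]:
    """Return (image, label) pairs covering multiple rotations of one object."""
    base = Image.open(image_path)
    return [(base.rotate(angle, expand=True), label) for angle in VIEW_ANGLES]

# Hypothetical usage: every labeled image contributes several views to training.
training_pairs = []
for path in Path("imagery/vehicles").glob("*.png"):  # hypothetical folder
    training_pairs.extend(expand_with_views(path, label="vehicle"))
```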

Human factors engineers can aid the SO's ability to determine what they are looking at through their sensor by providing AI that has been trained to recognize objects from multiple points of view. This type of system would reduce the workload on the SO and allow them to monitor and assist in the collection more quickly over time (TSgt C. Hook, personal communication, June 13, 2020). The AI system could give the SO a candidate answer for what appears in an image. Its accuracy would vary; however, if the accuracy is high enough, it would reduce the time an SO needs to study an image to understand what the sensor is looking at, thus reducing workload.
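One way such assistance could avoid adding workload is to surface a suggestion only when the model is confident. The sketch below assumes a placeholder classify() function and an arbitrary confidence cutoff; neither reflects a real fielded system.

```python
# Hypothetical sketch: show an AI identification to the sensor operator only
# when the model's confidence clears a threshold, so low-confidence guesses
# do not become extra work. classify() is a placeholder, not a real model.

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff; a real system would tune this

def classify(image) -> tuple[str, float]:
    """Stand-in for a trained multi-view model returning (label, confidence)."""
    return ("vehicle", 0.91)  # dummy output for illustration

def suggest_to_operator(image) -> str:
    """Format a suggestion for the SO, or defer to manual review."""
    label, confidence = classify(image)
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"Suggested identification: {label} ({confidence:.0%} confidence)"
    return "No suggestion - operator review required"

if __name__ == "__main__":
    print(suggest_to_operator(image=None))  # placeholder input
```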

References

Cooke, N. J., Rowe, L. J., Bennett, W., Jr., & Joralmon, D. Q. (Eds.). (2016). Remotely piloted aircraft systems: A human systems integration perspective. John Wiley & Sons.

Goldstein, E. B. (2018). Cognitive psychology: Connecting mind, research, and everyday experience. Nelson Education.

King, L. (2015). DoD unmanned aircraft systems training programs. International Civil Aviation Organization, March, 24.
