After gaining his PhD in Cognitive Psychology and Human-Computer Interaction from the University of Wales Swansea, Dale joined QinetiQ (formerly the Defence Evaluation and Research Agency) and worked primarily on defence programmes. After two years working on maritime systems he joined the Aerospace Group at Farnborough, where he attained Technical Focus for Human Factors within QinetiQ Aerospace Division. Here Dale gained experience in applying Human Factors knowledge across programmes ranging from pervasive networks and ubiquitous computing to commercial flight decks and advanced HMI for Future Offensive Air Systems (FOAS). For several years Dale was the Human Factors lead for the UK MoD Applied Research Programme (ARP) on Autonomy & Mission Management for unmanned systems. This work eventually led to the successful demonstration of a fast jet controlling multiple unmanned air systems, in which Dale led the design of the displays implemented in the fast jet cockpit (Tornado F2A). Dale was also Human Factors lead for QinetiQ on the first two phases of the UK civil Unmanned Air Vehicle programme ASTRAEA (Autonomous Systems Technology Related Airborne Evaluation & Assessment). During this time Dale assisted with the creation of the Human Factors chapter of CAP722 – UAS Operations in UK Airspace. Since joining Coventry University in 2012, Dale has led the Human Factors research strategy for UAS and has worked on several UAS research projects, ranging from identifying training requirements in UAS for the UK Royal Navy and demonstrating Human-Agent mission planning systems for emergency response, to providing guidance on design principles for a Ground Control Station that allows the control of multiple UAS. Dale is also involved in a number of initiatives surrounding Human Factors for autonomous cars. He has presented many conference papers on Human Factors for unmanned and autonomous systems, at venues such as AIAA Aviation and ICRAT 2016.
We have witnessed a steady increase in technologies that have expanded the capability of unmanned systems, while at the same time broadening their potential applications. In the past we have tended to focus predominantly on larger UAS, with major projects attempting to better understand the constraints related to operating UAS in different classes of airspace. Previous data indicate that human factors contribute significantly to UAS mishaps and incidents (Williams, 2004; Giese, Carr & Chahl, 2013). There are several important human factors issues to consider when evaluating operator interaction with an unmanned system. The predominant factor in this instance is the remoteness between the platform and the operator. The contextual cues we would normally associate with a manned cockpit are significantly reduced or absent, and so different means are required to ensure the operator is kept in the control loop with the aircraft under his/her control. In some instances we see an increasing reliance on highly automated systems that either provide information to other components of the aircraft (e.g. sensors informing sense-and-avoid actions) or act directly on the flight direction of the aircraft, essentially relegating the operator to a supervisor of the system under their control. This presentation will discuss key human factors issues relating to UAS operation, with particular reference to the growing interest in using advanced automation and autonomy to control the platform(s). This is particularly relevant as we observe changes in the regulations surrounding small UAS.