Human Computer Interfacing (HCI) for Autonomous Detect and Avoid (DAA) Systems on Unmanned Aircraft Systems (UAS)
Navy SBIR 2016.1 - Topic N161-020
NAVAIR - Ms. Donna Attick - [email protected]
Opens: January 11, 2016 - Closes: February 17, 2016

N161-020 TITLE: Human Computer Interfacing (HCI) for Autonomous Detect and Avoid (DAA) Systems on Unmanned Aircraft Systems (UAS)

TECHNOLOGY AREA(S): Air Platform, Human Systems

ACQUISITION PROGRAM: PMA 262 Persistent Maritime Unmanned Aircraft Systems

OBJECTIVE: Develop and validate Human Computer Interfacing (HCI) for governing the interaction of autonomous Detect and Avoid (DAA) maneuvers and human-initiated inputs, creating a holistic DAA capability. This includes the display format for conveying current and impending autonomous maneuvering information. The HCI developed as a result of this project could apply to future DAA algorithm validation for Group 3-5 fixed-wing unmanned aircraft systems (UAS).

DESCRIPTION: Detect and Avoid (DAA) capabilities are a mix of human-in-the-loop (HITL) and autonomous components. For UAS, likely requirements will include DAA systems that provide an autonomous capability for last-resort collision avoidance. Currently, standards and designs do not exist for how these collision avoidance maneuvers (determined and executed by the unmanned aircraft (UA) without human input) will be governed with respect to other human-initiated inputs. This challenge is unique to UAS because the pilot and the algorithms are not co-located and, as such, time delays exist.

Several questions must be addressed with regard to these collision avoidance maneuvers. For example, in which scenarios do algorithm inputs take precedence over human inputs (including a pilot abort option)? What information regarding UA algorithm "intent" (possible impending maneuvers) needs to be presented to the human? Should the human be provided with only the "best" (algorithm-selected) maneuver, or with the full range of maneuvers projected to have acceptable outcomes? What role does the human have in accepting, rejecting, or overriding maneuvers determined by the UA? How is link latency mitigated? Human inputs reflect what the human understood at one point in time, while the UA may currently have different information available; likewise, the UA's "intended" maneuvers presented to the human could be outdated by the time the human views them. How do rules of precedence and intent change when the command and control (C2) downlink and uplink are interrupted, or when the algorithm determines a maneuver is required and no time can be allotted for the human to respond?

Guidance will be solicited from Subject Matter Experts (SMEs) experienced in manned aircraft operational rule sets, UAS operation, pilot displays and display design, human factors, and system safety. Surveys, or other instruments that leverage industry experience and solicit expert preferences, will be considered during guidance development. Display parameters, such as encounter geometry and time delays, should be varied to sufficiently cover a variety of conditions. One-way communication delays can be assumed to be between 2 and 5 seconds. Consideration should be given to displays that closely mimic the 2-D Traffic Alert and Collision Avoidance System (TCAS) displays used in manned aircraft. Human performance measures, such as response latency and collision avoidance success rate, will be used to evaluate the HCI and the interaction of algorithm and human inputs.

The HCI should assume that DAA algorithm outputs are in the form of bank and pitch commands to the flight controls, along with a time until the maneuver needs to be executed.
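As a minimal, non-binding sketch of that assumed interface (Python; the class, field names, and delay values below are illustrative assumptions, not a defined interface control document), a maneuver advisory could be modeled as bank and pitch commands plus a time-to-execute, with the displayed decision window discounted by the assumed 2-5 second one-way link delays:

```python
from dataclasses import dataclass

@dataclass
class ManeuverAdvisory:
    """Assumed form of a DAA algorithm output: bank/pitch commands plus time to execute."""
    bank_deg: float           # commanded bank angle, degrees
    pitch_deg: float          # commanded pitch angle, degrees
    time_to_execute_s: float  # time remaining until the maneuver must begin, seconds

def pilot_decision_window(advisory: ManeuverAdvisory,
                          downlink_delay_s: float,
                          uplink_delay_s: float) -> float:
    """Estimate the time actually available for a pilot response.

    The advisory is already downlink_delay_s old when displayed, and any pilot
    input needs uplink_delay_s to reach the aircraft. One-way delays of 2-5
    seconds are assumed per the topic description.
    """
    return advisory.time_to_execute_s - downlink_delay_s - uplink_delay_s

# Example: a 12-second advisory with 3-second delays each way leaves roughly
# 6 seconds in which a pilot decision can still affect the outcome.
advisory = ManeuverAdvisory(bank_deg=30.0, pitch_deg=0.0, time_to_execute_s=12.0)
window_s = pilot_decision_window(advisory, downlink_delay_s=3.0, uplink_delay_s=3.0)
```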
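Similarly, the precedence questions posed above admit many candidate rule sets; one illustrative (hypothetical, not prescribed) arbitration rule is sketched below, in which the autonomous maneuver proceeds whenever the link is lost or the remaining decision window is too short to honor a pilot response:

```python
from enum import Enum

class PilotInput(Enum):
    NONE = "none"          # no pilot response received
    ACCEPT = "accept"      # pilot concurs with the algorithm-selected maneuver
    REJECT = "reject"      # pilot aborts the algorithm-selected maneuver
    OVERRIDE = "override"  # pilot commands a different maneuver

def arbitrate(decision_window_s: float,
              pilot_input: PilotInput,
              link_available: bool,
              min_window_s: float = 2.0) -> str:
    """Return which input governs the flight controls under this illustrative rule set.

    min_window_s is a hypothetical threshold: below it there is no time to wait
    for (or act on) a pilot response, so the autonomous maneuver executes.
    """
    if not link_available or decision_window_s <= min_window_s:
        return "autonomous maneuver executes"
    if pilot_input is PilotInput.REJECT:
        return "maneuver aborted by pilot"
    if pilot_input is PilotInput.OVERRIDE:
        return "pilot-commanded maneuver executes"
    # ACCEPT or NONE: the algorithm-selected maneuver executes when time expires
    return "algorithm-selected maneuver executes"
```

Any such rule set, including its thresholds and rule ordering, would be subject to the SME guidance and HITL evaluation described above.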
This SBIR will develop HCI that, when integrated into a holistic DAA design, will enable UA platforms to operate in mixed-use airspace, resulting in greater mission capability for future unmanned Naval aviation programs. This will reduce the use of additional assets in accomplishing missions, enabling significant reductions in life-cycle cost through increased time-on-station and fewer requirements for support aircraft and procedures.

PHASE I: Develop an HCI design covering all logical possibilities of autonomous versus human-initiated control, as well as the design for displaying the intended and actual DAA-determined maneuvers. Validate the design with pilot SMEs. Translate the design into algorithms (software) for laboratory simulations. Develop simple displays that show the encounter between the UA and the collision threat aircraft. Determine the most efficient manner in which to display recommendations and alerts to operators. Provide simple algorithm maneuver decisions that are displayed to the pilot.

PHASE II: Prepare design documentation (to include functional requirements and interfaces) and develop prototype software for the HCI. Demonstrate the HCI in a laboratory simulation, including integration of existing DAA algorithms into HITL simulations of realistic collision avoidance scenarios. Phase II should result in a tested, TRL-6 software product with associated design documentation that shows promise for transition to acquisition.

PHASE III DUAL USE APPLICATIONS: Refine and validate the software implementation. Assist with transition of the final software application/suite into Navy (and/or other DoD agency) systems. The HCI will interface with existing DAA algorithms on a Group 3-5 fixed-wing UAS and become part of the ground-based control station and airborne mission computer software. Integration will include the software safety, system safety, and airworthiness evaluations common to military platforms. The Triton UAS may be the first platform to transition this software. The technologies developed in this SBIR are relevant to other government agencies or private companies that operate or develop autonomous systems. While focused on UAS operation, the certification techniques developed in this project would also have applicability to autonomous ground, maritime, and land-based air vehicles. Additionally, the test/demonstration techniques developed in this project would have direct relevance to the FAA's efforts to certify and integrate state and commercial unmanned systems into the National Airspace System.

REFERENCES:
1. Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A Model for Types and Levels of Human Interaction with Automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 30(3).
2. Parasuraman, R., & Riley, V. (1997). Humans and Automation: Use, Misuse, Disuse, Abuse. Human Factors: The Journal of the Human Factors and Ergonomics Society, 39(2), 230-253.

KEYWORDS: Airworthiness; Autonomy; Flight Control; Human Interface; Unmanned Aircraft (UA); Detect and Avoid (DAA)

TPOC-1: 301-342-7717
TPOC-2: 301-342-8406

Questions may also be submitted through the DoD SBIR/STTR SITIS website.