Data Science Driven Aircrew Performance Measurement and Proficiency System
Navy SBIR 2018.1 - Topic N181-026
NAVAIR - Ms. Donna Attick - [email protected]
Opens: January 8, 2018 - Closes: February 7, 2018 (8:00 PM ET)
TECHNOLOGY AREA(S): Air Platform, Human Systems, Weapons

ACQUISITION PROGRAM: PMA 205 Naval Aviation Training Systems

The technology within this
topic is restricted under the International Traffic in Arms Regulation (ITAR),
22 CFR Parts 120-130, which controls the export and import of defense-related
material and services, including export of sensitive technical data, or the
Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls
dual use items. Offerors must disclose any proposed use of foreign nationals
(FNs), their country(ies) of origin, the type of visa or work permit possessed,
and the statement of work (SOW) tasks intended for accomplishment by the FN(s)
in accordance with section 5.4.c.(8) of the Announcement. Offerors are advised
that foreign nationals proposed to perform on this topic may be restricted due
to the technical data falling under U.S. Export Control Laws.

OBJECTIVE: Develop a software
technology to pre-process, fuse, and store data from multiple sources for human
performance assessment and proficiency tracking during training, with the
capability to parse and synchronize disparate data from live, virtual, and
constructive (LVC) aviation training system sources such as range
instrumentation, aircraft systems, virtual simulators, and constructive
applications to output automated performance metrics. Develop a human-machine
interface that provides visualization tools that facilitate data synthesis by
human-in-the-loop users.

DESCRIPTION: Navy leadership
has issued guidance to move from reactive decisions to proactive or predictive
solutions leveraging data-driven analytics to aid in decision-making and
proficiency tracking. Agreement across the Department of Defense for
quantitative, data-driven decisions is an important first step; however,
implementing systems capable of collecting, storing, fusing, analyzing,
interpreting, and safeguarding that information is a difficult challenge.
Leveraging advances in data science for training performance assessment is a
critical domain where technology provides a means to increase accuracy and
reduce workload. Instructors do not currently have enough time for a rigorous
and detailed performance evaluation of each flight. Research has clearly
demonstrated that high workload has the potential to negatively affect the
accuracy and effectiveness of subjective performance ratings and the subsequent
feedback provided to trainees [Ref 2], thereby reducing the quality and
quantity of training data that feeds back to decision-makers within the Naval
Aviation Enterprise (NAE).

PHASE I: Design an
architecture and process for linking available data sources to tactical aircrew
performance in warfighting capabilities based on fleet tactical recommendations
(i.e., Tactics, Techniques, and Procedures (TTP)) and mission-essential tasks
references (e.g., Wing Training Manuals, Training & Readiness Matrices),
that is flexible to incorporate future tactics and scalable to address
individual to multi-team performance. Demonstrate the feasibility of
implementing a software-based solution to process, parse, and fuse disparate
data sources and types (e.g., aircraft data, sensor data, simulator data, video
files, range instrumentation data, and voice communication recordings) for a
single platform. Design advanced data science approaches (e.g., machine
learning, artificial intelligence, voice recognition, image processing) that
produce automated and human-in-the-loop data outputs for performance
assessment, facilitate feedback, and support longitudinal trend analysis
computations.
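As one illustration of the time-synchronization step such fusion requires, the sketch below aligns two differently sampled, timestamped streams (e.g., simulator telemetry and range instrumentation) by nearest-timestamp matching within a tolerance. The stream names, sample rates, and tolerance value are hypothetical; a fielded system would operate on the actual data formats and clocks of the training systems involved.

```python
from bisect import bisect_left

def align_nearest(times_a, times_b, tolerance):
    """For each timestamp in times_a, return the index of the nearest
    timestamp in sorted times_b, or None if no match lies within tolerance."""
    matches = []
    for t in times_a:
        i = bisect_left(times_b, t)
        # The nearest neighbor is either the element at i or the one before it.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times_b)]
        best = min(candidates, key=lambda j: abs(times_b[j] - t), default=None)
        if best is not None and abs(times_b[best] - t) <= tolerance:
            matches.append(best)
        else:
            matches.append(None)
    return matches

# Hypothetical streams: simulator telemetry at 1 Hz, sparser range data.
sim_t   = [0.0, 1.0, 2.0, 3.0, 4.0]
range_t = [0.1, 2.2, 3.9]

print(align_nearest(sim_t, range_t, tolerance=0.5))
# → [0, None, 1, None, 2]
```

Once records are paired on a common timeline, automated metrics (e.g., deviation from a briefed parameter at each aligned sample) can be computed over the fused record, and the same keyed records can be stored for longitudinal trend analysis.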
Risk Management Framework guidelines should be considered and adhered to during
the development to support information assurance compliance. Phase I should
include development of prototype plans for Phase II.

PHASE II: Develop and
demonstrate a prototype software solution based on the designed architecture
and process: one that fuses multiple data sources and types, produces automated
and human-in-the-loop data outputs for performance assessment, facilitates
feedback, and supports longitudinal trend analysis computations. Evaluate the
efficiencies and return on investment gains associated with semi-automated
and/or automated data processing. Demonstrate software scalability to multiple
missions and/or multiple platforms. Develop and evaluate the usability of a
human-machine interface that provides visualization tools to facilitate data
synthesis by human-in-the-loop users and displays automated data outputs. Risk
Management Framework guidelines should be considered and adhered to during the
development to support information assurance compliance.

PHASE III DUAL USE
APPLICATIONS: Conduct the testing and integration necessary to support
transition to a fleet training site. Implement any outstanding Risk Management
Framework guidelines to ensure information assurance compliance; complete the
process to seek a standalone Authority To Operate (ATO) and/or support a
transition training site to incorporate the developed training solution into an
existing ATO, depending on the transition customer's preference. Continue
development to expand the architecture to new data sources and future reference
sources on aircrew performance, and/or software scalability to multiple
missions and/or multiple platforms. Improvements in technology to collect
detailed performance
on operators are applicable to all military and commercial systems where
operator reliability is critical to mission success. Successful technology
development would be applicable to most military systems where it is possible
to take advantage of the large quantities of data produced in training events
and to efficiently process that data into meaningful performance
metrics. Similar applications would be useful in commercial aviation, space,
and maritime industries.

REFERENCES:

1. Ault, F. "Report of the Air-to-Air Missile System Capability Review, July-November 1968, Volume 1." https://www.researchgate.net/publication/235090392_Report_of_the_Air-to-Air_Missile_System_Capability_Review_July-November_1968_Volume_1

2. Bretz, R. D., Milkovich, G. T. & Read, W. "The current state of performance appraisal research and practice: Concerns, directions, and implications." Journal of Management, 1992, 18(2), 321-352. http://dx.doi.org/10.1177/014920639201800206

3. Brobst, W. D., Thompson, K. L. & Brown, A. C. "Air Wing Training Study: Modeling Aircrew Training for Acquiring and Maintaining Tactical Proficiency. A Synthesis of CBA's Work." October 2006. http://www.dtic.mil/dtic/tr/fulltext/u2/a490976.pdf

4. Fan, Jianqing, Han, Fang & Liu, Han. "Challenges of Big Data Analysis." National Science Review, Volume 1, Issue 2, 1 June 2014, pp. 293-314. http://nsr.oxfordjournals.org/content/1/2/293.short

5. Griffin, G. R. & Shull, R. N. "Predicting F/A-18 Fleet Replacement Squadron Performance Using an Automated Battery of Performance-Based Tests." Naval Aerospace Medical Research Laboratory, Naval Air Station, Pensacola, Florida, July 1990. http://www.dtic.mil/dtic/tr/fulltext/u2/a223899.pdf

6. Horowitz, Stanley A., Hammon, Colin P. & Palmer, Paul R. "Relating Flying-Hour Activity to the Performance of Aircrews." Institute for Defense Analyses, Alexandria, Virginia, December 1987. http://www.dtic.mil/docs/citations/ADA199004

7. Kahneman, D. "Attention and Effort." Englewood Cliffs, NJ: Prentice-Hall, 1973. https://www.princeton.edu/~kahneman/docs/attention_and_effort/Attention_hi_quality.pdf

8. Ellett, Jennifer M. and Khalfan, Shaun. "The Transition Begins: DoD Risk Management Framework: An Overview." CHIPS: The Department of the Navy's Information Technology Magazine, April-June 2014. http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5015

KEYWORDS: Proficiency; Performance Assessment; Aircrew; Human Factors; Training; Debrief