Technologies to Aid Real-Time Training, Evaluation of Student Performance, and Capture of Performance Metrics
Navy SBIR 2013.1 - Topic N131-048
NAVSEA - Mr. Dean Putnam - [email protected]
Opens: December 17, 2012 - Closes: January 16, 2013
N131-048 TITLE: Technologies to Aid Real-Time Training, Evaluation of Student Performance, and Capture of Performance Metrics
TECHNOLOGY AREAS: Electronics, Human Systems
ACQUISITION PROGRAM: PMS 339, Surface Training Systems Program Office
OBJECTIVE: To develop innovative techniques for the real-time capture of student performance metrics to aid instructors in training and performance assessments.
DESCRIPTION: Sailors are trained on a wide variety of complex tasks through the use of simulators. For well over two decades, simulation has proven to be a highly cost-effective method for developing important skills and building proficiency. However, the full promise of simulation as a training method remains unrealized. In even the most advanced simulators, an instructor must still be present with the student to provide mentoring, assess performance, and deliver feedback, all of which relies solely on the instructor's visual observation of the student and still requires the instructor to manually record, track, and database those observations. This way of training is costly, and despite significant advances in simulation, student throughput is constrained by the number of instructors available to support the critical assessment dimension of training. Demand for training complex tasks in simulated systems has grown over the past decade and now exceeds the number of available instructors, yet the full promise of simulation remains limited by the lack of a robust, automated, embedded assessment process that could enhance the effectiveness of simulation as a training method and reduce instructor manpower requirements.
The current method of training with simulation is sub-optimal because existing technologies cannot automatically assess performance of complex tasks and provide feedback to the instructor and student. Automated performance assessment of complex tasks is challenging because it requires detailed understanding and modeling of how experts think (Ref 2). Assessment is especially difficult in non-linear simulations where more than one pathway to success or failure may exist, and addressing this problem requires innovation in the modeling process. There is a range of actions a student can take that results in an acceptable demonstration of proficiency, and much of the difficulty lies in interpreting what actions mean in the broader context of the task, rather than judging each discrete event as "right" or "wrong" (Ref 3).
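As a purely notional illustration (not part of the topic requirements), the sketch below shows one way a range-based judgment, rather than a discrete right/wrong judgment, might be computed from simulator metrics. The metric names, tolerance bands, and rating labels are hypothetical and are not drawn from any Navy system.

```python
# Illustrative only: range-based scoring of simulator metrics, where any value
# inside an acceptable band counts as proficient rather than a discrete
# right/wrong judgment. Metric names and tolerances are hypothetical.
from dataclasses import dataclass

@dataclass
class MetricBand:
    name: str
    low: float    # lower bound of the acceptable range
    high: float   # upper bound of the acceptable range

def assess(observations: dict[str, float], bands: list[MetricBand]) -> dict[str, str]:
    """Map each observed metric to 'proficient', 'marginal', or 'deficient'."""
    results = {}
    for band in bands:
        value = observations.get(band.name)
        if value is None:
            results[band.name] = "not observed"
        elif band.low <= value <= band.high:
            results[band.name] = "proficient"
        else:
            # Flag near-misses (within 10% of the band width) as marginal.
            margin = 0.1 * (band.high - band.low)
            close = band.low - margin <= value <= band.high + margin
            results[band.name] = "marginal" if close else "deficient"
    return results

if __name__ == "__main__":
    # Hypothetical metrics for an underway-from-pier evolution.
    bands = [
        MetricBand("approach_speed_kts", 2.0, 5.0),
        MetricBand("lateral_separation_yds", 20.0, 60.0),
    ]
    observed = {"approach_speed_kts": 5.4, "lateral_separation_yds": 35.0}
    print(assess(observed, bands))
```

A fuller treatment would also weight each metric by its importance to the overall evolution and interpret combinations of actions in context, which is the harder modeling problem described above.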
The current "state of the art" simulation system for providing immediate feedback is the intelligent tutoring system (ITS). Most existing intelligent tutoring systems are built for training tasks which have discrete linear, "right" and "wrong" actions associated with them. However, some prototyping has been done in the field of intelligent tutoring for complex systems, (e.g. ship handling training) demonstrating that complex training tasks can be successfully accomplished in a synthetic environment using an ITS. Further, this prototyping suggests that the instructor-to-student ratio can be reduced from 1:1 to 1:2 or greater (Ref 1). Research has shown that an automated assessment system could greatly expand the current capability of an ITS and further reduce instructor requirements (Refs 2 and 3).
An innovative approach is needed to develop a technology that can interface with existing simulators to automatically monitor student performance, identify assessment-relevant aspects of the student's actions, feed that information into an assessment system, alert an instructor to potential knowledge or skill deficiencies, and allow the instructor to engage in real time (in situ) or after action to correct these issues before a student "goes off track" and negative training results. Such a solution must capture what is currently recorded through visual observation, which depends on the instructor's subject matter expertise and is subject to personal bias. The aim is for one instructor to train multiple students simultaneously: instead of the current 1:1 instructor-to-student ratio, the goal is 1:6 with a threshold of 1:3. The technology solution must provide a detailed skill assessment, require minimal intervention by an instructor, and require minimal instruction for individual use. A notional monitoring flow is sketched below.
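Again as a notional illustration only, the following sketch outlines a generic monitoring loop that consumes simulator events, tallies deficient actions per student, and alerts an instructor once a student accumulates repeated deficiencies. The event fields, threshold value, and alert mechanism are hypothetical and are not drawn from the COVE/COSA interface documentation referenced under Phase II.

```python
# Illustrative only: a generic monitoring loop that consumes simulator events
# and raises an instructor alert when a student accumulates repeated
# deficiencies. Event fields and thresholds are hypothetical.
from collections import defaultdict
from typing import Callable, Iterable

Event = dict  # e.g. {"student": "S1", "metric": "rudder_order", "deficient": True}

def monitor(events: Iterable[Event],
            alert: Callable[[str, int], None],
            threshold: int = 3) -> None:
    """Alert the instructor once a student logs `threshold` deficient actions."""
    deficiencies = defaultdict(int)
    for event in events:
        if event.get("deficient"):
            student = event["student"]
            deficiencies[student] += 1
            if deficiencies[student] == threshold:
                alert(student, threshold)

if __name__ == "__main__":
    sample = [
        {"student": "S1", "metric": "rudder_order", "deficient": True},
        {"student": "S2", "metric": "engine_order", "deficient": False},
        {"student": "S1", "metric": "turn_rate", "deficient": True},
        {"student": "S1", "metric": "closure_rate", "deficient": True},
    ]
    monitor(sample, lambda s, n: print(f"ALERT: {s} has {n} deficient actions"))
```

In practice the alert would feed an instructor display covering several students at once, which is how a single instructor could supervise the threshold 1:3 or objective 1:6 ratio described above.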
PHASE I: The small business will identify and define functionality, feasibility, and concepts for a training module that can function in a simulated environment, reduce the number of instructors required per student during simulator training, and include a summative report of student performance and feedback. Required Phase I deliverables will include a determination of the technical feasibility of the concept, development of incremental approaches for achieving this goal, and a detailed analysis of predicted performance, including instructor manpower reductions. The company will provide a Phase II development plan with performance goals, key technical milestones, and technical risk.
PHASE II: Based on the results of Phase I and the Phase II development plan, the small business will develop a prototype system targeted for integration with the Conning Officer Virtual Environment (COVE) ship handling trainer, or a similar ship handling trainer, for evaluation at Surface Warfare Officer School (SWOS). Detailed interface requirements for COVE and the Conning Officer Shiphandling Assessment (COSA) system, as well as performance standards, will be provided to the business developing the prototype system. General interface information for proposal development is provided in Ref 4. The prototype will be evaluated in a simulated, relevant environment to determine its capability to meet the performance goals defined in the Phase II development plan. This evaluation will include an assessment of the degree to which the prototype reduces instructor manpower requirements such that one instructor can effectively administer training to three or more students, generally prospective Officers of the Deck (OOD) and Junior Officers of the Deck (JOOD) / Conning Officers performing complicated ship handling tasks such as getting a ship underway from a pier, performing a twist, conducting a harbor transit in confined waters, coming alongside a replenishment ship, or mooring to a pier. System performance will be demonstrated and evaluated by the Navy. Evaluation results will be used to refine the prototype into an initial design that meets the requirements. The company will prepare a Phase III development plan to transition the technology to full Navy use.
PHASE III: If Phase II is successful, the company will be expected to support the Navy in transitioning the technology for schoolhouse use. The company will develop the full module to be implemented in the COVE system. The company will support the Navy and SWOS for test and validation to certify the system for use and ensure that it meets training objectives.
PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The potential for commercial application and dual use applies to advanced DoD training systems and, potentially, to training systems used in commercial industry. The number of instructors required to teach complex tasks is an issue faced across the DoD. In a fiscally constrained environment, solutions that reduce the number of instructors required for training have re-use potential in simulated training systems across the DoD. Beyond the DoD, the gaming industry would be interested in innovative models that can accurately assess performance and provide feedback. In addition, the commercial maritime industry uses ship handling simulators similar to the technology employed for naval ship handling training, and these systems also lack automatic assessment capabilities. Reducing the need for instructors through improved instructor feedback is applicable to other DoD training commands, and specific improvements to ship handling training have the potential to be used in commercial shipping industry training.
REFERENCES:
1. Peters, S., Bratt, E., & Kirschenbaum, S., "Automated support for learning in simulation: Intelligent tutoring of shiphandling." Paper presented at the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), 2011. 25 April 2012 <http://ntsa.metapress.com/link.asp?id=u1m0845185n46816>
2. Iseli, M. R., Koenig, A. D., Lee, J. J., & Wainess, R., "Automatic assessment of complex task performance in games and simulations." 2010. University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). 25 April 2012 <http://www.cse.ucla.edu/products/reports/R775.pdf>
3. Koenig, A.D., Lee, J., Iseli, M., & Wainess, R., "A conceptual framework for assessing performance in games and simulations", 2009. University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). 25 April 2012 <http://www.cse.ucla.edu/products/reports/R771.pdf>
4. COVE Performance Interface Criteria Sheet. (See Additional Information posted by TPOC on 11/26/12.)
KEYWORDS: Intelligent Tutor System; training performance assessment; Conning Officer Shiphandling Assessment (COSA); training performance feedback; Surface Warfare Officer School (SWOS); Conning Officer Virtual Environment (COVE)
** TOPIC AUTHOR (TPOC) ** DoD Notice: Between 16 November 2012 and 16 December 2012, you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting 17 December 2012, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (13.1 Q&A) during the solicitation period for questions, answers, and other significant information relevant to the SBIR 13.1 topic under which they are proposing.
If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or by email.