DISTRIBUTION / AVAILABILITY STATEMENT: Approved for public release; distribution is unlimited.
ABSTRACT: Standards emerging today position the training research community on the eve of a scientific breakthrough. In the near future, the scientific community will likely benefit from the ability to routinely cross-compare training technologies and techniques across laboratory training study results, various operational training implementations, and possibly even live exercises. To achieve this capability, common standards must exist for the competencies to be assessed, the metrics used to evaluate those competencies, and the technology enablers that implement those assessments across training organizations. The current work aims (a) to discuss how the use of these common standards can afford this cross-comparison capability, (b) to provide a proof-of-concept study relying only on these standards, illustrating how this approach can be applied at numerous training facilities, (c) to highlight where the common standards can be expanded, and (d) to provide baseline distributed-simulation within-simulator learning results. Thirty-five F-16 teams participated in week-long distributed mission operations (DMO) simulator training. The study was conducted successfully using the common standards, with data captured on 31 teams. Minor issues were discovered in the technology-enabler standards, and recommendations are provided. By the end of the training week, F-16 teams had increased their weapons employment effectiveness and kill ratios while launching weapons at longer ranges and permitting fewer enemy strikers to reach their targets. The results suggest that assessing human performance across installations for cross-comparison of results is feasible, but some maturation of the technology-enabler standards is necessary to provide a routine, automatic, and robust inter-organizational cross-comparison capability.