Defense and Aerospace Test and Analysis Workshop

Workshop Overview

The Defense and Aerospace Test and Analysis (DATA) Workshop is the result of a multi-organization collaboration among the Director of Operational Test & Evaluation (DOT&E) within the Office of the Secretary of Defense, the National Aeronautics and Space Administration (NASA), the Institute for Defense Analyses (IDA), and the Section on Statistics in Defense and National Security (SDNS) of the American Statistical Association (ASA). The workshop is designed to strengthen the community by promoting the application of rigorous statistical approaches to test design and data analysis in the fields of defense and aerospace.


The three-day workshop will showcase a combination of applied problems, unique methodological approaches, and tutorials from leading academics.

Our goal is to facilitate collaboration among all involved, including other government agencies.

Design of Experiments

The complexity of modern defense and aerospace systems, and the environments in which they operate, means that “live” testing is often expensive and time-consuming.  Thus, test efficiency is key.  Design of Experiments (DOE) techniques allow test planners to maximize information gain for a given set of resources.  Modeling & Simulation (M&S) and Integrated Test & Evaluation (IT&E) are core practices that the T&E community can implement to improve test efficiency and increase understanding of system performance, but they also require detailed and deliberate planning using DOE best practices.
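
As a toy illustration, the short Python sketch below (all factor names and levels are notional, not drawn from any actual test program) enumerates a full-factorial design for three two-level factors, the kind of structured test matrix that DOE methods begin from and that fractional-factorial or optimal designs then trim when even this many runs is too costly:

    # Minimal sketch of a full-factorial test design.
    # Factor names and levels are notional, for illustration only.
    from itertools import product

    factors = {
        "altitude": ["low", "high"],
        "speed": ["subsonic", "supersonic"],
        "countermeasures": ["off", "on"],
    }

    # Every combination of factor levels: 2 x 2 x 2 = 8 test points.
    design = list(product(*factors.values()))
    for run, point in enumerate(design, start=1):
        print(run, dict(zip(factors.keys(), point)))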


Sessions with this theme will focus on DOE methodologies and applications across all T&E domains.  These include test design techniques for verifying, validating, and accrediting M&S; methods for quantifying the uncertainty in live or M&S outcomes; and methods for integrating, to the extent possible, all available information, including previous test outcomes, to sequentially update and refine future testing. Other methodological topics covered in these sessions may include reliability and Bayesian test design methods.
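
To make the sequential-updating idea concrete, here is a minimal sketch of conjugate Bayesian updating with a Beta-Binomial model; the prior and the per-phase success counts are invented for illustration, and this is one simple approach rather than a prescribed method:

    # Beta-Binomial sketch: each test phase updates the posterior on the
    # system's success probability, which then informs the next phase.
    from scipy.stats import beta

    a, b = 1, 1                   # uniform Beta(1, 1) prior (an assumption)
    phases = [(18, 2), (27, 3)]   # notional (successes, failures) per phase

    for successes, failures in phases:
        a += successes            # conjugate update: posterior is Beta(a, b)
        b += failures
        lo, hi = beta.interval(0.80, a, b)
        print(f"posterior mean {a / (a + b):.3f}, "
              f"80% interval ({lo:.3f}, {hi:.3f})")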

Test & Evaluation Methods for Emerging Technology

The drive to develop new technologies is expanding to more actors, with lower barriers to entry and at an accelerating pace. New threats to our systems will continue to emerge, especially in cyberspace, pressuring the operational T&E community to adopt faster and more broadly scoped strategies. As technology advances, researchers, including those across the DoD, are becoming more interested in how machines can perform without the aid of humans. Autonomy research includes concepts such as environment perception, decision making, operation, and ethics.


Sessions with this theme will: 1) investigate why and how to improve user engagement with systems that now include artificial intelligence, robotic teammates, and augmented reality; 2) explore Test & Evaluation methods and challenges for emerging technologies and domains, including cybersecurity of software-enabled systems, hypersonics, directed energy weapons, electronic warfare, autonomous weapon systems, artificial intelligence, and additive manufacturing; and 3) discuss methods for modeling how the quality of human-system interaction and human-machine teaming impacts operational performance, along with related topics such as survey development, test design, scale validation, and network analysis.

Data Management / Reproducible Research

The goal of data management is to make sure you get what you need from your data.  Achieving this goal requires organizations and test programs to have a plan for acquiring, cleaning, analyzing, documenting, organizing, and archiving data.  Reproducibility is a key feature of successful data management. Reproducible analyses are transparent and easy for reviewers to verify because results and figures can be traced directly to the data and methods that produced them. Poor reproducibility habits result in analyses that are difficult or impossible to review, prone to compounded mistakes, and inefficient to re-run in the future.  They can lead to duplication of effort or even loss of accumulated knowledge when a researcher leaves your organization.

These sessions will 1) share best practices, lessons learned, and open challenges for implementing data management strategies within T&E; and 2) discuss the benefits of reproducible research and demonstrate ways that analysts can introduce reproducible or automated research practices during each phase of the analysis workflow: preparing for an analysis, performing the analysis, and presenting results.
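
As one small illustration of what "reproducible" can mean in practice, the hypothetical Python script below regenerates a summary table and a figure directly from the raw data in a single command; the file paths and column names are invented. Because every artifact comes from the script, a reviewer can rerun it and trace each reported number back to the data:

    # analysis.py -- hypothetical one-command reproducible analysis.
    # Running `python analysis.py` rebuilds every output from the raw data.
    import os
    import pandas as pd
    import matplotlib
    matplotlib.use("Agg")          # render to files; no display needed

    os.makedirs("output", exist_ok=True)

    # Hypothetical raw data with "configuration" and "miss_distance" columns.
    df = pd.read_csv("data/raw_test_results.csv")

    summary = df.groupby("configuration")["miss_distance"].describe()
    summary.to_csv("output/summary_table.csv")

    ax = df.boxplot(column="miss_distance", by="configuration")
    ax.get_figure().savefig("output/miss_distance_by_config.png", dpi=200)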

Analysis Tools and Techniques

Testers should use robust analytical techniques to ensure that conclusions are defensible and to glean maximum information from the data collected under the test design. Statistical analysis techniques allow system evaluators to make rigorous statements about system characteristics by objectively summarizing the data, determining which factors are significant, discovering trends in the data, and quantifying uncertainty in results. Oftentimes these methodologies are complex or not widely understood; user-friendly software tools and case study demonstrations can help make these methods more accessible to the larger T&E community.
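
For instance, here is a hedged sketch of one common technique, an ordinary least squares model fit with the statsmodels Python package, which reports which factors are significant and the uncertainty in each estimate; the data file and column names are hypothetical:

    # Fit a linear model to notional test data and report effect estimates,
    # p-values, and confidence intervals.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("detection_range.csv")   # hypothetical test data
    model = smf.ols("range_km ~ altitude + clutter + altitude:clutter",
                    data=df).fit()

    print(model.summary())         # coefficients, p-values, R-squared
    print(model.conf_int(0.05))    # 95% confidence intervals on each effect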

Sessions with this theme will introduce statistical methodologies for T&E, discuss case study applications of analytical techniques, and showcase software or interactive tools for conducting rigorous analyses.  Specific topics may include statistical modeling, reliability analysis, or M&S analysis.

Special Topics

Special topics sessions will cover areas such as sustainment analyses and data visualization.

Sustainment Analyses: Quantifying the benefits of specific “what if” scenarios is critical to real-world evaluation and to understanding and developing weapon systems. To do this, researchers use end-to-end simulations that model all aspects of weapon system sustainment (spares, manpower, operations, and maintenance); these are powerful tools for understanding how investments in different parts of the sustainment system affect readiness.
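
A real end-to-end sustainment model is far richer than anything that fits here, but the toy Monte Carlo sketch below conveys the "what if" flavor: it estimates how a notional fleet's mission-capable rate responds to buying more spares. Every rate and quantity is invented for illustration:

    # Toy spares-to-readiness simulation (all parameters are notional).
    import random

    def simulate_readiness(n_spares, days=365, fleet=10,
                           fail_prob=0.02, repair_days=14, seed=0):
        random.seed(seed)
        up, spares = fleet, n_spares
        in_repair = []                    # days remaining per broken part
        up_days = 0
        for _ in range(days):
            in_repair = [d - 1 for d in in_repair]
            spares += sum(1 for d in in_repair if d == 0)  # repairs done
            in_repair = [d for d in in_repair if d > 0]
            for _ in range(up):           # each capable aircraft may fail
                if random.random() < fail_prob:
                    up -= 1
                    in_repair.append(repair_days)
            while up < fleet and spares > 0:  # spares restore aircraft
                spares -= 1
                up += 1
            up_days += up
        return up_days / (days * fleet)   # average mission-capable rate

    for n in (0, 2, 5):
        print(n, "spares ->", round(simulate_readiness(n), 3))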

Data Visualization: The best way to effectively communicate one’s ideas is often to provide visual representations of conclusions and data that are easily understood by the audience. Visual perception is fast and efficient, allowing us to detect patterns and quickly distill large amounts of information. Effective data visualization takes advantage of this fact, aiding in comprehension of complex topics or key takeaways from our analyses.
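
As a minimal example, the Python snippet below turns a small table of notional detection rates into a bar chart that can be scanned at a glance; the system names and values are invented:

    # A simple chart often communicates a comparison faster than a table.
    import matplotlib
    matplotlib.use("Agg")          # render to a file; no display needed
    import matplotlib.pyplot as plt

    systems = ["Sys A", "Sys B", "Sys C", "Sys D"]
    detect_rate = [0.71, 0.88, 0.64, 0.92]    # notional values

    fig, ax = plt.subplots()
    ax.barh(systems, detect_rate)
    ax.set_xlabel("Detection rate")
    ax.set_xlim(0, 1)
    ax.set_title("Notional system comparison")
    fig.savefig("detection_rates.png", dpi=200)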

Mark Your Calendar

  • October: Contributed Abstract Submission Opens 
  • December: Registration Opens
  • January: Contributed Abstract Submission Closes
  • April 26-28: DATAWorks 2022!


Subscribe in the menu bar for the latest news and announcements!

Abstract Submissions

Coming Soon!

Location and Lodging

DATAWorks 2022 will be held at IDA's Potomac Yard location.

IDA (Potomac Yard) is located at 730 East Glebe Road, Alexandria, VA, 22301.


ROOM BLOCK:

(TBA)


Room Reservation | IDA on Maps | Transportation | City Guide (Links Coming Soon!)