Defense and Aerospace Test and Analysis Workshop

April 26-28, 2022

Institute for Defense Analyses

Alexandria, VA

Post Workshop Survey

COVID-19 Policy:


In light of the COVID-19 situation, IDA has implemented precautionary measures to ensure the health, safety, and well-being of our employees and visitors to our sites. Please see the Registration page for current policies and precautions. IDA continues to monitor the COVID-19 pandemic closely, and these policies are subject to change.

Workshop Overview

The Defense and Aerospace Test and Analysis (DATA) Workshop is the result of a multi-organization collaboration among the Director of Operational Test & Evaluation (DOT&E) within the Office of the Secretary of Defense, the National Aeronautics and Space Administration (NASA), the Institute for Defense Analyses (IDA), and the Section on Statistics in Defense and National Security (SDNS) of the American Statistical Association (ASA). The workshop is designed to strengthen the community by applying rigorous statistical approaches to test design and data analysis in the fields of defense and aerospace.


The three-day workshop will showcase a combination of applied problems, unique methodological approaches, and tutorials from leading academics.

Our goal is to facilitate collaboration among all involved, including other government agencies.

Design of Experiments

The complexity of modern defense and aerospace systems, and of the environments in which they operate, means that “live” testing is often expensive and time-consuming. Test efficiency is therefore key. Design of Experiments (DOE) techniques allow test planners to maximize information gain for a given set of resources. Modeling & Simulation (M&S) and Integrated Test & Evaluation (IT&E) are core practices that the T&E community can use to improve test efficiency and increase understanding of system performance, but they also require detailed and deliberate planning using DOE best practices.


Sessions with this theme will focus on DOE methodologies and applications across all T&E domains. Topics include test design techniques for verifying, validating, and accrediting M&S; quantifying the uncertainty in live or M&S outcomes; and methods for integrating, to the extent possible, all available information, including previous test outcomes, to sequentially update and refine future testing. Other methodological topics covered in these sessions may include reliability and Bayesian test design methods.
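To make the DOE idea concrete, here is a minimal sketch of the simplest designed experiment, a full factorial, which enumerates every combination of factor levels. The factor names and levels are hypothetical, not drawn from any specific test program.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full factorial design).

    `factors` maps each factor name to its list of levels.
    """
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical two-level factors for an illustrative test event
design = full_factorial({
    "altitude": ["low", "high"],
    "speed": ["subsonic", "supersonic"],
    "countermeasures": ["off", "on"],
})

print(len(design))  # 2^3 = 8 runs
```

In practice test planners rarely run the full enumeration; fractional factorial, optimal, and space-filling designs trade runs for information, which is exactly the efficiency question these sessions address.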

Test and Evaluation Methods for Emerging Technology

The drive to develop new technologies is expanding to more actors, with lower barriers to entry and accelerating speed. New threats to our systems will continue to emerge, especially in cyberspace, putting pressure on the operational T&E community to adopt faster and more broadly scoped strategies. As technology advances, researchers, including those in the DoD, are becoming more interested in how machines can perform without the aid of humans. Autonomy research includes concepts such as environment perception, decision making, operation, and ethics.


Sessions with this theme will: 1) investigate why and how to improve user engagement with systems that now include artificial intelligence, robotic teammates, and augmented reality; 2) explore Test & Evaluation methods and challenges for emerging technologies and domains, including cybersecurity of software-enabled systems, hypersonics, directed energy weapons, electronic warfare, autonomous weapon systems, artificial intelligence, and additive manufacturing; and 3) discuss methods used to model how the quality of human-system interaction and human-machine teaming affects operational performance, along with survey development, test design, scale validation, and network analysis.

Data Management and Reproducible Research

The goal of data management is to ensure that you get what you need from your data. Achieving this goal requires organizations and test programs to have a plan for acquiring, cleaning, analyzing, documenting, organizing, and archiving data. Reproducibility is a key feature of successful data management. Reproducible analyses are transparent and easy for reviewers to verify because results and figures can be traced directly to the data and methods that produced them. Poor reproducibility habits result in analyses that are difficult or impossible to review, prone to compounded mistakes, and inefficient to re-run in the future. They can lead to duplication of effort or even loss of accumulated knowledge when a researcher leaves an organization.

These sessions will 1) share best practices, lessons learned, and open challenges for implementing data management strategies within T&E; and 2) discuss the benefits of reproducible research and demonstrate ways that analysts can introduce reproducible or automated practices during each phase of the analysis workflow: preparing for an analysis, performing the analysis, and presenting results.
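A minimal sketch of what "traceable to the data and methods that produced them" can look like in practice: the raw data and every processing step live in one script, so re-running it reproduces the summary exactly. The column names and values here are hypothetical.

```python
import csv
import io

# Hypothetical raw test data, embedded so the example is self-contained;
# in a real program this would be an archived, version-controlled file.
RAW = """run,miss_distance
1,3.2
2,4.8
3,2.5
4,4.1
"""

def load(text):
    """Parse the raw CSV into a list of records."""
    return list(csv.DictReader(io.StringIO(text)))

def analyze(records):
    """Compute summary statistics directly from the parsed data."""
    values = [float(r["miss_distance"]) for r in records]
    return {"n": len(values), "mean": sum(values) / len(values)}

summary = analyze(load(RAW))
print(summary)  # every reported figure traces back to RAW through these functions
```

Because nothing is edited by hand between the data and the result, a reviewer can verify the analysis simply by re-running it, and a future analyst inherits the full pipeline rather than a disconnected spreadsheet.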

Analysis Tools and Techniques

Testers should use robust analytical techniques to ensure that conclusions are defensible and to glean maximum information from the data collected under the test design. Statistical analysis techniques allow system evaluators to make rigorous statements about system characteristics by objectively summarizing the data, determining which factors are significant, discovering trends, and quantifying uncertainty in results. Oftentimes these methodologies are complex or not widely understood; user-friendly software tools and case-study demonstrations can help make these methods more accessible to the larger T&E community.

Sessions with this theme will introduce statistical methodologies for T&E, discuss case study applications of analytical techniques, and showcase software or interactive tools for conducting rigorous analyses.  Specific topics may include statistical modeling, reliability analysis, or M&S analysis.

Special Topics

Special topics sessions will cover areas such as sustainment analyses and data visualization.

Sustainment Analyses: Quantifying the benefits of specific “what if” scenarios is critical to real-world evaluation and to understanding and developing weapon systems. To do this, researchers use end-to-end simulations that model all aspects of weapon system sustainment (spares, manpower, operations, and maintenance); these are powerful tools for understanding how investments in different parts of the sustainment system affect readiness.

Data Visualization: The best way to effectively communicate one’s ideas is often to provide visual representations of conclusions and data that are easily understood by the audience. Visual perception is fast and efficient, allowing us to detect patterns and quickly distill large amounts of information. Effective data visualization takes advantage of this fact, aiding in comprehension of complex topics or key takeaways from our analyses.

Mark Your Calendar

  • January 28: Contributed Abstract Submission Closes
  • January 28: Invited Abstracts Due
  • January 31: Registration Opens
  • April 15: In-Person Registration Closes
  • April 22: Virtual Registration Closes
  • April 26-28: DATAWorks 2022!


Subscribe in the menu bar for the latest news and announcements!


Abstract Submissions

If you’ve been invited to submit an abstract, please use the Invited Abstracts button.

Keynote Speakers

Featured Panel: Evolving Data-Centric Organizations

Short Course Instructors

Agenda Summary

All times are in Eastern Daylight Time (EDT). Use of “A/B/C/D” in the schedule denotes parallel sessions.

April 26th

  • 9:00 – 10:15  Short Courses
  • 10:15 – 10:30  Break
  • 10:30 – 12:00  Short Courses
  • 12:00 – 1:00  Lunch
  • 1:00 – 2:30  Short Courses
  • 2:30 – 2:45  Break
  • 2:45 – 4:30  Short Courses


April 27th

  • 8:30 – 8:40  Opening Remarks
  • 8:40 – 9:20  Keynote 1
  • 9:20 – 10:00  Keynote 2
  • 10:00 – 10:20  Break
  • 10:20 – 12:20  Session 1A/1B/1C; Mini-Tutorial 1
  • 12:20 – 1:30  Lunch
  • 1:30 – 3:00  Session 2A/2B; Speed Session 1; Mini-Tutorial 2
  • 3:00 – 3:20  Break
  • 3:20 – 4:50  Session 3A/3B; Speed Session 2; Mini-Tutorial 3
  • 5:00 – 7:00  Reception & Poster Session

April 28th

  • 8:30 – 8:40  Opening Remarks
  • 8:40 – 9:20  Keynote 3
  • 9:30 – 10:30  Featured Panel
  • 10:30 – 10:50  Break
  • 10:50 – 12:20  Session 4A/4B/4C; Mini-Tutorial 4
  • 12:20 – 1:30  Lunch
  • 1:30 – 3:00  Session 5A/5B; Mini-Tutorial 5
  • 3:00 – 3:20  Break
  • 3:20 – 4:20  Session 6A/6B/6D
  • 4:20 – 4:50  Awards & Closing Remarks


DATAWorks 2022 will be held at IDA's Potomac Yard Location.

IDA (Potomac Yard) is located at 730 East Glebe Road, Alexandria, VA, 22305.

Parking and Transportation

  • Entrance - Enter through the main IDA doors on East Glebe Road.
    • Check-in will be in the Conference Center lobby
  • Parking - The visitor parking lot has an entrance on Seaton Ave as shown on the map
    • Note: This parking lot is first-come, first-served
    • Please do NOT park in the Target parking lot or in the IDA building garage; vehicles parked there risk being towed

Shuttle Schedule

Please Note: For Renaissance and Hyatt Regency hotel guests only


Hyatt Regency Crystal City

2799 Richmond HWY

Arlington, VA 22202

Telephone: (703) 418-1234

Group Name: DATAWorks

Dates: April 25-28, 2022

Rate: $258 (or prevailing Government Per Diem Rate)

Book your group rate for DATAWorks Room Block

Last Day to Book: Monday, April 4, 2022

*Free Shuttle to IDA from this location

Residence Inn Alexandria Old Town

1456 Duke Street

Alexandria, VA 22314

Telephone: (703) 548-5475

Group Name: DATAWorks April 2022

Dates: April 25-28, 2022

Rate: $219

Book your group rate for DATAWorks Room Block

Last Day to Book: Monday, March 28, 2022

Renaissance Arlington Capital View Hotel

2800 South Potomac Ave

Arlington, VA 22202

Telephone: (703) 413-1300

Group Name: DATAWorks Room Block

Dates: April 25-28, 2022

Rate: $249

Book your group rate for DATAWorks Room Block

Last Day to Book: Monday, March 28, 2022

*Free Shuttle to IDA from this location