Call for Contributed Talk and Poster Abstracts

Young professionals, authors, and researchers are encouraged to submit abstracts for the 2020 workshop that highlight or apply one of the workshop themes in defense and aerospace. Academics, scientists, statisticians, and engineers active in the field will be present to evaluate selected entries. The workshop also provides an opportunity to network and engage with the defense, aerospace, and academic communities, as well as industry practitioners. Authors of accepted abstracts will be invited to give a 15-minute presentation or to participate in a poster session. Contributed abstracts will also be eligible for consideration for a best contributed paper or best contributed poster award.

Workshop Overview

The Defense and Aerospace Test and Analysis (DATA) Workshop is a collaboration among the Director of Operational Test & Evaluation (DOT&E) within the Office of the Secretary of Defense, the National Aeronautics and Space Administration (NASA), the Institute for Defense Analyses (IDA), and the Section on Statistics in Defense and National Security (SDNS) of the American Statistical Association (ASA). The workshop is designed to strengthen the community by promoting rigorous statistical approaches to test design and data analysis in the fields of defense and aerospace. The three-day workshop will showcase a combination of applied problems, unique methodological approaches, and tutorials from leading academics. Our goal is to facilitate collaboration among all involved, including other government agencies.

Progress and Challenges in Program Evaluation

The Department of Defense (DoD) and NASA develop and acquire some of the world’s most sophisticated technological systems. In cooperation with these organizations, we face the challenge of ensuring that these systems undergo testing and evaluation (T&E) that is both adequate and efficient before they are fielded. Sessions with this theme will feature case studies highlighting the advances in T&E methodology that we have made as a community. Topics may include modeling and simulation validation, uncertainty quantification, experimental design, and Bayesian analysis, among others. Sessions will also discuss current challenges and areas in the field that require further research.

Machine Learning and Artificial Intelligence

With the future of automation trending toward machine learning (ML) and artificial intelligence (AI), we aim to provide resources, set precedents, and inspire innovation in these fields. Sessions with this theme will discuss how to quantify confidence that ML and AI algorithms function as intended. To do this, we explore whether these algorithms are free of vulnerabilities introduced, intentionally or unintentionally, through the data or the algorithm itself. Topics may include T&E metrics and methods for AI algorithms, T&E metrics and methods for cyber-physical systems that incorporate AI algorithms, threat portrayal for ML/AI algorithms, and robust ML/AI algorithm design.

Cybersecurity and Software Test & Evaluation

The number of known cyber-attacks has been steadily increasing over the past ten years, exposing millions of data records. With the help of leading minds in the cyber field, we work not only to discover new ways of identifying these attacks, but also to counter future ones. Sessions with this theme will discuss metrics and methodologies for testing cyber-physical systems, with an emphasis on characterizing cyber resiliency. Topics may include identifying cybersecurity threats (IP and non-IP) across critical operational missions, developing and testing secure architectures, identifying critical system components that enable attack vectors, developing and testing countermeasures to cyber-attacks, and testing methods for cyber-physical systems that include machine learning algorithms.

Modeling the Human-System Interaction

Mission outcomes depend heavily on how effectively operators interact with systems under operational conditions, as measured by both their objective performance and their perceived performance. To understand these interactions, we observe team dynamics analytically in order to predict how changing operating conditions will affect those dynamics and, in turn, productivity. Sessions with this theme will discuss methods for collecting and modeling the quality of human-system interaction and its impact on operational performance. Topics covered will include survey development and test design, statistical methods for surveys, scale validation, and network analysis.

Science Applications of Test and Evaluation

Scientists use a variety of tools to perform analyses within their areas of expertise. Our goal is to show how to improve the analyses they already perform and to introduce new tools that increase analysis efficiency. Sessions with this theme will cover simulation, prediction, uncertainty quantification, and inference for physical and physical-statistical modeling of geophysical processes. Topics may include Earth science problems (e.g., ice sheet evolution models, atmospheric and ocean processes, the carbon cycle, land surface processes, natural hazards), astronomy and cosmology (e.g., exoplanet detection, galactic formation, the cosmic microwave background), and planetary science (e.g., planetary atmospheres and formation).

Guide for Authors:

Authors are invited to submit an abstract of no more than 500 words. To submit your abstract, please complete the form below.

Important Dates:

Deadline for submission: January 31, 2020
Notification of acceptance: February 14, 2020

Disclaimer: PHOTOGRAPHY AND/OR VIDEOTAPING

Upon acceptance, you understand and acknowledge that you may be photographed, filmed, videotaped, and/or tweeted, and you hereby grant DATAWorks the right to take pictures and/or recordings of you. You also grant DATAWorks the right to use your image, recording, name, and affiliated institution, without compensation, for exhibition in any medium. If you wish to opt out, please notify Elizabeth Lee and Macy Mathews at dataworks@testscience.org, stating your objection to the above statement.

This form is currently closed for submissions.