Short Course Instructors
Course Title: Using R Markdown & the Tidyverse to Create Reproducible Research
Abstract: R is one of the major platforms for doing statistical analysis and research. This course introduces the powerful and popular R software through the use of the RStudio IDE. This course covers the use of the tidyverse suite of packages to import raw data (readr), do common data manipulations (dplyr and tidyr), and summarize data numerically (dplyr) and graphically (ggplot2). In order to promote reproducibility of analyses, we will discuss how to code using R Markdown - a method of R coding that allows one to easily create PDF and HTML documents that interweave narrative, R code, and results. List of packages to install: tidyverse, GGally, Lahman, tinytex
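As a hedged illustration only (not taken from the course materials), the workflow described above might look like the following minimal R sketch; it assumes the tidyverse packages listed are installed, and the `scores` data frame and its columns are hypothetical stand-ins for data that readr would import from a file.

```r
# Minimal tidyverse sketch: dplyr verbs for grouped summaries,
# tidyr for reshaping. Data are made up for illustration.
library(dplyr)
library(tidyr)

# Hypothetical example data (in place of readr::read_csv("scores.csv"))
scores <- tibble(
  student = c("A", "A", "B", "B"),
  test    = c("midterm", "final", "midterm", "final"),
  score   = c(85, 90, 78, 88)
)

# dplyr: summarize numerically by group
summary_tbl <- scores %>%
  group_by(student) %>%
  summarize(mean_score = mean(score), .groups = "drop")

# tidyr: reshape long data to wide
wide_tbl <- scores %>%
  pivot_wider(names_from = test, values_from = score)
```

In an R Markdown document, a chunk like this would sit alongside narrative text, and knitting would render the code and its results into a single PDF or HTML file.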
Instructor Bio: Justin Post is a Teaching Associate Professor and the Director of Online Education in the Department of Statistics at North Carolina State University. Teaching has always been his passion, and it is his main role at NCSU. He teaches undergraduate and graduate courses in both face-to-face and distance settings. Justin is an R enthusiast and has taught many short courses on R, the tidyverse, R Shiny, and more.

Course Title: Categorical Data Analysis
Workshop Theme: Analysis Tools and Techniques
Abstract: Categorical data is abundant in the 21st century, and its analysis is vital to advancing research across many domains. Thus, data-analytic techniques tailored to categorical data are an essential part of the practitioner’s toolset. The purpose of this short course is to help attendees develop and sharpen their abilities with these tools. Topics covered will include binary and multi-category logistic regression, ordinal regression, and classification, along with methods for assessing the predictive accuracy of these approaches. Data will be analyzed using the R software, and course content loosely follows Alan Agresti’s excellent textbook “An Introduction to Categorical Data Analysis, Third Edition.”
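As an illustrative sketch only (not course material), a binary logistic regression of the kind covered here can be fit in R with the base `glm` function; the data below are simulated, and the true coefficients are arbitrary assumptions chosen for the example.

```r
# Minimal binary logistic regression sketch on simulated data.
set.seed(42)
n <- 200
x <- rnorm(n)

# Assumed true model for the simulation: log-odds = -0.5 + 1.2 * x
p <- plogis(-0.5 + 1.2 * x)
y <- rbinom(n, size = 1, prob = p)

# Fit the logistic regression model
fit <- glm(y ~ x, family = binomial)

# One simple measure of predictive accuracy: in-sample
# classification rate at a 0.5 probability threshold
pred <- ifelse(predict(fit, type = "response") > 0.5, 1, 0)
accuracy <- mean(pred == y)
```

Multi-category and ordinal models require functions beyond base `glm` (e.g., from add-on packages), and honest accuracy assessment would use held-out data rather than the in-sample rate shown here.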
Instructor Bio: Chris Franck is an Assistant Professor in the Department of Statistics at Virginia Tech.
Course Title: A Practitioner’s Guide to Advanced Topics in DOE
Workshop Theme: Design of Experiments
Abstract: Having completed a first course in DOE and begun to apply these concepts, engineers and scientists quickly learn that test and evaluation often demands knowledge beyond the use of classical designs. This one-day short course, taught by an engineer from a practitioner’s perspective, targets this problem. Three broad areas are covered:
- Optimal designs address questions such as how to accommodate constraints in the design space, specify unique models, and fractionate general factorial designs. Choices for methods revolve around objectives. For example, is the priority to best estimate model parameters or to reduce overall response prediction error?
- Split plot designs allow restrictions to randomization for hard-to-change factors. These designs, including optimal adaptations, are now supported by most commercial experimental design software and within the reach of the practitioner.
- Sequential design approaches strive to minimize resource requirements at the onset of a test. A phased approach includes a plan to add runs to achieve specific objectives following an initial test analysis. The approach is especially effective in high-dimensional spaces where the influence of all factors is in question and/or model order is unknown. One example is the use of screening designs where all factors are included in the first test with the expectation that many will be found insignificant.
The course format is to introduce relevant background material, discuss case studies, and provide software demonstrations. Case studies and demonstrations are derived from a variety of sources, including aerospace testing and DOD T&E. Learn design approaches, design comparison metrics, best practices, and lessons learned from the instructor’s experience. A first course in Design of Experiments is a prerequisite.
Instructor Bio: Drew Landman has 34 years of experience in engineering education as a professor at Old Dominion University. Dr. Landman’s career highlights include 13 years (1996-2009) as chief engineer at the NASA Langley Full-Scale Wind Tunnel in Hampton, VA. Landman was responsible for program management, test design, and instrument design and calibration, and served as the lead project engineer for many automotive, heavy truck, aircraft, and unmanned aerial vehicle wind tunnel tests, including the Centennial Wright Flyer and the Boeing X-48B and C. His research interests and sponsored programs are focused on wind tunnel force measurement systems and statistically defensible experiment design, primarily in support of wind tunnel testing. Dr. Landman has served as a consultant and trainer in the area of statistical engineering to test and evaluation engineers and scientists at AIAA, NASA, AeroVironment, Airbus, Aerion, ATI, USAF, US Navy, US Marines, and the Institute for Defense Analyses. Landman founded a graduate course sequence in statistical engineering within the ODU Department of Mechanical and Aerospace Engineering. He is currently co-authoring a text on wind tunnel test techniques.
Course Title: Operational Cyber Resilience in Engineering and Systems Test
Workshop Theme: Test and Evaluation Methods for Emerging Technology
Abstract: Cyber resilience is the ability to anticipate, withstand, recover from, and adapt to adverse conditions, stresses, attacks, or compromises on systems that use or are enabled by cyber resources. As a property defined in terms of system behavior, cyber resilience presents special challenges from a test and evaluation perspective. Typically, system requirements are specified in terms of technology function and can be tested through manipulation of the system’s operational environment, controls, or inputs. Resilience, however, is a high-level property relating to the capacity of the system to recover from unwanted loss of function. There are no commonly accepted definitions of how to measure this system property. Moreover, by design, resilience behaviors are exhibited only when the system has lost critical functions. The implication is that the test and evaluation of requirements for operational resilience will involve creating, emulating, or reasoning about the internal system states that might result from successful attacks.
This tutorial will introduce the Framework for Operational Resilience in Engineering and System Test (FOREST), a framework that supports the derivation of measures and metrics for developmental and operational test plans and activities for cyber resilience in cyber-physical systems. FOREST aims to provide insights to support the development of testable requirements for cyber resilience and the design of systems with immunity to new vulnerabilities and threat tactics. FOREST’s elements range from attack sensing, to the existence and characterization of resilience modes of operation, to operator decisions and forensic evaluation. The framework is meant to be reusable, repeatable, and practical: it calls for system designers to describe a system’s operational resilience design in a designated, partitioned manner that aligns with resilience requirements and relates directly to the development of associated test concepts and performance metrics.
The tutorial introduces model-based systems engineering (MBSE) tools and associated engineering methods that complement FOREST and support the architecting, design, and engineering aspects of cyber resilience. Specifically, it features Mission Aware, an MBSE meta-model and associated requirements and architecture analysis process targeted at decomposing loss scenarios into testable resilience features in a system design. FOREST, Mission Aware, and associated methodologies and digital engineering tools will be applied to two case studies in cyber resilience: (1) Silverfish, a hypothetical networked munition system, and (2) an oil distribution pipeline. The case studies will lead to derivations of requirements for cyber resilience and survivability, along with associated measures and metrics.
Peter A. Beling is a professor in the Grado Department of Industrial and Systems Engineering and associate director of the Intelligent Systems Division in the Virginia Tech National Security Institute. Dr. Beling’s research interests lie at the intersections of systems engineering and artificial intelligence (AI) and include AI adoption, reinforcement learning, transfer learning, and digital engineering. He has contributed extensively to the development of methodologies and tools in support of cyber resilience in military systems. He serves on the Research Council of the Systems Engineering Research Center (SERC), a University Affiliated Research Center for the Department of Defense.
Tom McDermott is the Deputy Director and Chief Technology Officer of the Systems Engineering Research Center at Stevens Institute of Technology in Hoboken, NJ. He leads research on Digital Engineering transformation, education, security, and artificial intelligence applications. Mr. McDermott also teaches system architecture concepts, systems thinking and decision making, and engineering leadership for universities, government, and industry. He serves on the INCOSE Board of Directors as Director of Strategic Integration.
Tim Sherburne is a research associate in the Intelligent Systems Division of the Virginia Tech National Security Institute. Sherburne was previously a member of the systems engineering staff at the University of Virginia, supporting Mission Aware research through rapid prototyping of cyber resilient solutions and model-based systems engineering (MBSE) specifications. Prior to joining the University of Virginia, he worked at Motorola Solutions in various software development and systems engineering roles, defining and building mission-critical public safety communications systems.