Opening Remarks
Saturday, September 27, 2025
8:00 AM–8:10 AM
Embassy Suites Minneapolis; Plymouth Ballroom AB
Please join us for opening remarks with conference co-chair Dr. Timothy Slocum.

A Framework for the Analysis of Single-Case Results: Visual Analysis and Type I Error Control
Saturday, September 27, 2025
8:10 AM–9:50 AM
Embassy Suites Minneapolis; Plymouth Ballroom AB
Area: SCI; Domain: Theory
CE Instructor: John Ferron, Ph.D.
Abstract: This session explores a comprehensive approach to analyzing single-case research by integrating visual analysis with strategies to control Type I error. The first presentation will examine the central role of visual analysis in single-case design, highlighting its flexibility, common applications, and both strengths and limitations. The second presentation will address how to mitigate the risk of drawing incorrect conclusions due to Type I error, outlining analytic methods that complement visual inspection and help ensure valid inferences. Together, these presentations aim to strengthen researchers’ ability to interpret single-case data with greater rigor and confidence by balancing clinical judgment with methodological safeguards.
Instruction Level: Advanced
Target Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Learning Objectives:
1. Attendees will be able to identify different analytic purposes that can be accomplished through visual analysis.
2. Attendees will be able to identify strengths and weaknesses of visual analysis.
3. Attendees will be able to identify the strengths and weaknesses of different approaches to controlling Type I errors.
4. Attendees will be able to select an approach to controlling Type I errors that complements their visual analysis and that is consistent with their design and data.

A Framework for the Analysis of Single-Case Results: Visual Analysis
KATIE WOLFE (University of South Carolina)
Abstract: Visual analysis is foundational to single-case research; it is used throughout the process of conducting a study to inform experimental decisions and at the end of a study to evaluate results and draw conclusions. In this session, we will discuss the flexibility of visual analysis, its numerous applications within single-case research, and strengths and challenges related to the method.
Katie Wolfe is an Associate Professor at the University of South Carolina in the areas of special education and applied behavior analysis. Her research focuses primarily on the factors that influence visual analysis, as well as on the correspondence between visual analysis and other analytic methods for single-case research data. She has developed trainings and protocols to increase the reliability of visual analysis. Her other research interests include teachers’ use of graphed data to inform instructional decision-making and supporting parents and caregivers of children diagnosed with autism and related disabilities.

Controlling Type I Errors in the Analysis of Single-Case Data
JOHN FERRON (University of South Florida)
Abstract: Behavior may change because of an intervention, or it may change because of something other than the intervention. If the change is due to something other than the intervention but we conclude it resulted from the intervention, we have made a Type I error. In this session we will focus on methods that have been developed to limit the chances of making a Type I error when analyzing single-case data. We will consider the options that are available, as well as the circumstances in which each option does and does not work well. The aim is to help participants select an approach that complements their visual analysis and that is appropriate given their single-case research design and their data.
John Ferron is an educational statistician with expertise in the analysis of single-case data. He has contributed to the development and examination of statistical methods for analyzing single-case studies, including the development of masked visual analysis methods, studies of the power of randomization tests, studies of the application of multilevel modeling to the analysis and meta-analysis of single-case studies, and the development of the percentage of goal obtained effect size. He was funded by IES as part of a group examining multilevel modeling approaches for analyzing and meta-analyzing single-case experimental studies, as part of a group providing an advanced training institute on single-case experimental design methods, and as part of a group that developed apps and guides to increase the accessibility of analytic methods for single-case data.

A Framework for the Analysis of Single-Case Results: Statistical Modeling and Quantification Techniques
Saturday, September 27, 2025
10:05 AM–11:45 AM
Embassy Suites Minneapolis; Plymouth Ballroom AB
Area: SCI; Domain: Theory
CE Instructor: John Falligant, Ph.D.
Abstract: This session presents two complementary approaches to advancing the analysis of single-case data through quantitative models and statistical techniques. The first presentation explores theoretical frameworks (Shull’s bout-analytic model and response disequilibrium theory) that reconceptualize behavior and reinforcement in terms of temporally extended patterns. These models offer new analytic tools for understanding response organization and treatment effects, with strong potential for translational research. The second presentation addresses the quantification of intervention effects in single-case experimental designs (SCDs), focusing on selecting and interpreting appropriate effect size measures. Attendees will be introduced to classes of statistical techniques, including non-overlap, regression-based, and mean-based metrics, and guided through effect size calculation using an intuitive R Shiny application. Together, these talks offer a practical and conceptual foundation for improving the precision, utility, and interpretability of single-case research outcomes.
Instruction Level: Advanced
Target Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Learning Objectives:
1. Attendees will be able to identify different analytical purposes that can be accomplished by statistical modeling.
2. Attendees will be able to select a statistical modeling technique that is consistent with their analytic purpose.
3. Attendees will be able to identify the strengths and weaknesses of different approaches to quantifying intervention effectiveness.
4. Attendees will be able to select an approach to estimating intervention effectiveness that complements their visual analysis and that is consistent with their design and data.
5. Attendees will be able to interpret and report the results of the quantitative analysis.
6. Attendees will be able to select appropriate quantitative techniques for assessing the temporal dynamics of behavior.
7. Attendees will be able to interpret graphical and statistical representations of temporal dynamics in behavior, including time-series data, and communicate their implications for research and/or practice.

Behavior and Reinforcement as Temporally Extended Processes: Rethinking Units of Analysis Through Quantitative Models
JOHN FALLIGANT (Auburn University)
Abstract: Quantitative models of behavior provide theory-driven accounts of how behavior unfolds over time and is shaped by environmental contingencies. While such models have a longstanding role in the experimental analysis of behavior, their application in clinical settings remains limited. This presentation highlights two intriguing frameworks—Shull’s (2001) bout-analytic model and Timberlake and Allison’s (1974) response disequilibrium theory—and explores their translational potential for applied behavior analysis. Each model reconceptualizes what counts as a meaningful unit of behavior or reinforcement, shifting focus from discrete to temporally extended events. Shull’s bout model characterizes operant behavior as organized into bouts and pauses, with parameters that correspond to motivational and motoric processes. This structure offers a precise framework for analyzing response patterns and evaluating treatment effects. Response disequilibrium theory reframes reinforcer efficacy in terms of behavioral allocation and constraint, offering a dynamic and testable alternative to traditional notions of reinforcer preference and value. This presentation will provide brief primers on each model, review relevant translational and applied findings, and identify future directions for extending their applications in behavior analysis. Emphasis will be placed on how these frameworks can generate testable hypotheses and enhance conceptual foundations of applied research and practice.
Dr. Falligant is an assistant professor in the Department of Psychological Sciences at Auburn University. His clinical work focuses on the functional assessment and treatment of challenging behavior among children, adolescents, and young adults with neurodevelopmental disorders. His research emphasizes concepts and methods grounded in translational behavior science, aiming to bridge the gap between basic behavioral research and applied practice. This involves the fine-grained analysis of behavioral events, including the microstructure of behavior and its dynamics, and coalesces around neurobehavioral variables underlying dysfunction, persistence, and change. Dr. Falligant’s research is supported by the National Institutes of Health and the Brain and Behavior Research Foundation. He is the 2025 recipient of the B.F. Skinner Foundation New Researcher Award from the American Psychological Association.

Introduction to Single-Case Experimental Design Effect Indices
MARIOLA MOEYAERT (University at Albany)
Abstract: Due to the increased interest in establishing an evidence base for intervention effectiveness using single-case experimental design (SCD) studies, there is a need for appropriate quantification of intervention effectiveness. However, analyzing SCD data is not always straightforward and can be challenging because the choice of an appropriate statistic depends on the research question and SCD data characteristics. During the presentation, a distinction is made between non-overlap statistics, regression-based statistics, mean-based statistics, and others that can be used to quantify intervention effectiveness. A step-by-step demonstration of effect size calculation will be provided using a point-and-click R Shiny application. This presentation is designed to help participants gain a working understanding of appropriate effect size selection and interpretation.
Mariola Moeyaert is an Associate Professor of Statistics in the Division of Educational Psychology and Methodology at the University at Albany. Her primary research interest is methodological considerations in conducting (meta-)analyses of single-case research design data. To date, she has published 84 methodological and applied peer-reviewed manuscripts on this topic in top-tier research outlets. Dr. Moeyaert successfully completed three IES-funded grants related to (meta-)analysis of SCEDs and is currently co-PI on an IES-funded grant (R305D240044, 9/1/2024–8/31/2027) investigating methods for causal mechanisms in SCED (meta-)analysis. For three years (2021–2024), Dr. Moeyaert was an instructor for the Training Institute on Single-Case Intervention Research Design and Analysis, also funded by the Institute of Education Sciences. According to Google Scholar (consulted on February 27, 2025), her h-index is 33, her i10-index is 57, and she has been cited over 3,171 times.

Interactive Workgroups: A Framework for the Analysis of Single-Case Results
Saturday, September 27, 2025
1:15 PM–2:05 PM
Embassy Suites Minneapolis; Topaz/Turquoise/Opal
Area: SCI; Domain: Theory
CE Instructor: John Ferron, Ph.D.
KATIE WOLFE (University of South Carolina), JOHN FERRON (University of South Florida), JOHN FALLIGANT (Auburn University), MARIOLA MOEYAERT (University at Albany – State University of New York)
Description: In this session our focus will be on aligning analysis methods with the purposes they aim to fulfill. We will consider visual analysis and its flexibility in accomplishing a range of analytic purposes, the use of statistical models to quantify specific features of behavioral data, methods for estimating the magnitude of treatment effects, and techniques for controlling Type I errors. We will consider the assumptions underlying these methods and the situations and conditions in which they do and do not fully accomplish the analytic purposes for which they were developed. We aim to engage the audience in discussions and activities that will further thinking on how to develop analytic plans that align with the specific goals of their research and that employ multiple analytic techniques in a complementary manner.
Learning Objectives:
1. Attendees will be able to identify different analytic purposes that can be accomplished through visual analysis.
2. Attendees will be able to identify strengths and weaknesses of visual analysis.
3. Attendees will be able to identify the strengths and weaknesses of different approaches to controlling Type I errors.
4. Attendees will be able to select an approach to controlling Type I errors that complements their visual analysis and that is consistent with their design and data.
5. Attendees will be able to identify different analytical purposes that can be accomplished by statistical modeling.
6. Attendees will be able to select a statistical modeling technique that is consistent with their analytic purpose.
7. Attendees will be able to identify the strengths and weaknesses of different approaches to quantifying intervention effectiveness.
8. Attendees will be able to select an approach to estimating intervention effectiveness that complements their visual analysis and that is consistent with their design and data.
9. Attendees will be able to interpret and report the results of the quantitative analysis.
10. Attendees will be able to select appropriate quantitative techniques for assessing the temporal dynamics of behavior.
11. Attendees will be able to interpret graphical and statistical representations of temporal dynamics in behavior, including time-series data, and communicate their implications for research and/or practice.
Activities: The audience will break into small groups for interactive discussions and applied activities related to Drs. Wolfe, Ferron, Falligant, and Moeyaert's symposia.
Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Content Area: Methodology
Instruction Level: Advanced

Complete Reporting of Methods and Results in Single-Case Published Reports
Saturday, September 27, 2025
2:15 PM–3:05 PM
Embassy Suites Minneapolis; Plymouth Ballroom AB
Area: SCI; Domain: Theory
CE Instructor: Wendy A. Machalicek, Ph.D.
Presenting Authors: WENDY A. MACHALICEK (University of Oregon), KIMBERLY VANNEST (University of Vermont), ANNA PETURSDOTTIR (University of Nevada, Reno)
Abstract: The accumulation of evidence to inform intervention practices requires both methodologically rigorous experimental designs and detailed reporting in published reports. Quality of reporting serves as a proxy for methodological quality, including those aspects that we are unable to observe after the experiment concludes. In single-case research, as in other research traditions, researchers sometimes omit from published reports essential methodological details that readers need to ascertain methodological quality. This session focuses on topics pertinent to the complete reporting (and not cherry-picking) of methods and results in single-case intervention studies. First, we will present an overview of the rationale for enhanced methods and results reporting practices and of existing reporting practices within single-case research (Machalicek). Second, we will present data related to statistical conclusion validity in the adapted alternating treatments design, discuss the importance of replicating differences between conditions to decrease the probability of Type I error, and provide recommendations for reporting replications (Petursdottir). Finally, we will present recommendations for the improved articulation of design, visual analysis, and statistical analysis decisions in published single-case reports (Vannest). Participants in this session will provide input to inform our collective understanding of the feasibility, acceptability, and potential effects of our recommendations.
Instruction Level: Advanced
Target Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Learning Objectives:
1. State the rationale for enhanced methods and results reporting practices within single-case intervention research.
2. Explain the contribution of replication within the adapted alternating treatments design to statistical conclusion validity.
3. List three recommendations for the improved articulation of design, visual analysis, and statistical analysis decisions.

WENDY A. MACHALICEK (University of Oregon)
Dr. Wendy Machalicek is Professor and incoming Chair of the Department of Special Education and Clinical Sciences in the College of Education at the University of Oregon. Her scholarship focuses on the assessment and treatment of challenging behavior in children with developmental disability, with a focus on coaching natural change agents in evidence-based interventions. She has published 132 scholarly works and received federal funding in these areas from the U.S. Institute of Education Sciences, the U.S. Office of Special Education Programs, and the National Institute on Disability and Rehabilitation Research. She is PI of the Institute of Education Sciences-funded Methods Training Institute in Advanced Single-Case Research Design and Analysis and Co-Editor-in-Chief of the Journal of Positive Behavior Interventions.
KIMBERLY VANNEST (University of Vermont)
Dr. Kimberly Vannest is currently the Chair of the Department of Education in the College of Education and Social Services at the University of Vermont. Her scholarship includes interventions for academic and behavioral risk and disability, single-case experimental design methodology, student progress monitoring, and teacher behaviors. She has produced nearly 200 scholarly works, including high-impact peer-reviewed papers and free online software to calculate effect sizes in time-series designs (www.singlecaseresearch.org). Her work has received state and U.S. federal funding from the National Science Foundation, the Department of Defense, the Institute of Education Sciences, the Office of Special Education Programs, the Texas Education Agency, and the Vermont Department of Education.
ANNA PETURSDOTTIR (University of Nevada, Reno)
Dr. Anna Petursdottir is an Associate Professor of Behavior Analysis at the University of Nevada, Reno. Her scholarship seeks to advance knowledge of the operation of basic behavioral processes in human language and cognition, and to translate that knowledge into potential application; for example, in education, training, and language intervention. Additional research interests include single-case experimental designs and associated methods. She has published over 70 scholarly works in these areas, including in flagship behavior analysis peer-reviewed journals. She currently serves on the Board of Editors for a number of highly regarded journals in behavior analysis, including the Journal of Applied Behavior Analysis, the Journal of Behavioral Education, The Analysis of Verbal Behavior, and the European Journal of Behavior Analysis.

Interactive Workgroups: Complete Reporting of Methods and Results in Single-Case Published Reports
Saturday, September 27, 2025
3:20 PM–4:10 PM
Embassy Suites Minneapolis; Topaz/Turquoise/Opal
Area: SCI; Domain: Theory
CE Instructor: Wendy A. Machalicek, Ph.D.
WENDY A. MACHALICEK (University of Oregon), KIMBERLY VANNEST (University of Vermont), ANNA INGEBORG PETURSDOTTIR (University of Nevada, Reno)
Description: The accumulation of evidence to inform intervention practices requires both methodologically rigorous experimental designs and detailed reporting in published reports. Quality of reporting serves as a proxy for methodological quality, including those aspects that we are unable to observe after the experiment concludes. In single-case research, as in other research traditions, researchers sometimes omit from published reports essential methodological details that readers need to ascertain methodological quality. This session focuses on topics pertinent to the complete reporting (and not cherry-picking) of methods and results in single-case intervention studies. First, we will present an overview of the rationale for enhanced methods and results reporting practices and of existing reporting practices within single-case research (Machalicek). Second, we will present data related to statistical conclusion validity in the adapted alternating treatments design, discuss the importance of replicating differences between conditions to decrease the probability of Type I error, and provide recommendations for reporting replications (Petursdottir). Finally, we will present recommendations for the improved articulation of design, visual analysis, and statistical analysis decisions in published single-case reports (Vannest). Participants in this session will provide input to inform our collective understanding of the feasibility, acceptability, and potential effects of our recommendations.
Learning Objectives:
1. State the rationale for enhanced methods and results reporting practices within single-case intervention research.
2. Explain the contribution of replication within the adapted alternating treatments design to statistical conclusion validity.
3. List three recommendations for the improved articulation of design, visual analysis, and statistical analysis decisions.
Activities: The audience will break into small groups for interactive discussions and applied activities related to Drs. Machalicek, Vannest, and Petursdottir's presentation.
Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Content Area: Methodology
Instruction Level: Advanced

SCD+: Utilizing a Mixed Methods Approach to Enhance the Understanding of Single Case Findings
Saturday, September 27, 2025
4:20 PM–5:10 PM
Embassy Suites Minneapolis; Plymouth Ballroom AB
Area: SCI; Domain: Theory
CE Instructor: Angel Fettig, Ph.D.
Presenting Authors: ANGEL FETTIG (University of Washington), SHAWNA HARBIN (Purdue University)
Abstract: Single-case design (SCD) is a well-suited, rigorous methodological approach to understanding practices that work with children with significant and complex support needs. Combining SCD with qualitative methods in a mixed methods approach can answer questions beyond the relationships between practices and outcomes. Mixed methods research answers research questions that consider who practices work for and the contexts and conditions that promote desired outcomes, and it describes discrepant effects in applied settings such as schools, homes, and communities (Love et al., 2023). In this panel, we will discuss the value of mixed methods approaches that include SCD and illustrate, with examples, approaches for doing so.
Instruction Level: Advanced
Target Audience: Single-case researchers
Learning Objectives:
1. Understand the importance and utility of mixed methods approaches in enhancing single-case design research.
2. Understand how single-case research approaches can be applied in the three core mixed methods designs.
3. Recognize how a mixed methods approach could enhance findings from their current and/or previous single-case projects.

ANGEL FETTIG (University of Washington)
Angel Fettig’s research focuses on supporting the social-emotional development of young children with disabilities. She uses mixed methods research approaches, applying single-case and group designs to understand intervention effects, combined with qualitative methods to gain deeper insights into for whom and under what conditions interventions are most impactful. She is currently the PI of Project Mixer, an OSEP-funded leadership grant that trains doctoral students in conducting mixed methods research in the field of special education.
SHAWNA HARBIN (Purdue University)
Shawna Harbin is a Clinical Assistant Professor at Purdue University in the Department of Human Development and Family Science. Her work prepares early childhood educators and early intervention practitioners to provide high-quality, inclusive practices to all children and families. Correspondingly, her teaching and research focus on supporting children’s social-emotional learning through educator training and coaching. Shawna uses mixed methods research that incorporates single-case design with qualitative methods to understand the effectiveness of target interventions as well as the contextual factors that potentially contribute to an intervention’s impact.

Interactive Workgroups: SCD+: Utilizing a Mixed Methods Approach to Enhance the Understanding of Single Case Findings
Saturday, September 27, 2025
5:20 PM–6:10 PM
Embassy Suites Minneapolis; Topaz/Turquoise/Opal
Area: SCI; Domain: Theory
CE Instructor: Angel Fettig, Ph.D.
ANGEL FETTIG (University of Washington), SHAWNA HARBIN (Purdue University)
Description: Single-case design (SCD) is a well-suited, rigorous methodological approach to understanding practices that work with children with significant and complex support needs. Combining SCD with qualitative methods in a mixed methods approach can answer questions beyond the relationships between practices and outcomes. Mixed methods research answers research questions that consider who practices work for and the contexts and conditions that promote desired outcomes, and it describes discrepant effects in applied settings such as schools, homes, and communities (Love et al., 2023). In this panel, we will discuss the value of mixed methods approaches that include SCD and illustrate, with examples, approaches for doing so.
Learning Objectives:
1. Understand the importance and utility of mixed methods approaches in enhancing single-case design research.
2. Understand how single-case research approaches can be applied in the three core mixed methods designs.
3. Recognize how a mixed methods approach could enhance findings from their current and/or previous single-case projects.
Activities: The audience will break into small groups for interactive discussions and applied activities related to Drs. Fettig and Harbin's presentation.
Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Content Area: Methodology
Instruction Level: Advanced