Advancements in Research on Decision-Making in Behavioral Treatment
Sunday, May 24, 2020
8:00 AM–9:50 AM
Walter E. Washington Convention Center, Level 1, Salon H
Area: DDA/AUT; Domain: Applied Research
Chair: Allison Nicole White (Michigan State University)
Discussant: Tyra Paige Sellers (Behavior Analyst Certification Board)
CE Instructor: Allison Nicole White, M.Ed.
Abstract: The purpose of this symposium is to describe recent advancements in research on decision-making in the context of behavioral interventions. The first study will describe a decision tool designed to support behavior analysts and classroom teachers when conducting FBAs to inform function-based interventions. The second study will report on the effectiveness of an instructional package, consisting of a brief online training and a decision-making model, in increasing the accuracy of instructional decisions made by preservice teachers and prospective behavior analysts. The third study will review a framework for modifying the objectives and approach of interventions for challenging behavior based on both in-session and out-of-session outcomes. The fourth study will report results from a translational application of probability discounting that evaluates how data accuracy affects the choices practitioners make during visual analysis.
Instruction Level: Intermediate
Keyword(s): challenging behavior, decision making, FBA, probability discounting
Target Audience: Board Certified Behavior Analysts
Learning Objectives: At the conclusion of this presentation, participants will be able to: (1) describe a tool to select, design, and implement hypothesis-testing strategies for students with persistent challenging behavior; (2) use an instructional package to make informed instructional decisions by analyzing students’ performance data; (3) use a framework to modify objectives and interventions for challenging behavior; and (4) describe how changes in probabilities may affect decision-making during visual analysis.

Piloting a Collaborative Decision Tool to Increase Rigor and Relevance of Functional Behavior Assessments
BLAIR LLOYD (Vanderbilt University), Jessica Torelli (Vanderbilt University), Marney Pollack (Vanderbilt University)
Abstract: School practitioners typically rely on interviews and direct observations to complete functional behavior assessments (FBAs) and inform function-based interventions. While data from descriptive FBAs may be sufficient to inform effective interventions in some cases, other cases warrant more rigorous assessment tools that involve some form of hypothesis testing. We will present a decision tool designed to support behavior analysts and classroom teachers in selecting, designing, and implementing hypothesis-testing strategies for students with persistent challenging behavior. Using a response-guided framework, the decision tool supports practitioner selection of (a) a hypothesis-testing strategy, (b) practical adaptations to maximize efficiency, (c) an assessment location, (d) an assessment implementer, and (e) a method of data collection. We will present data from practitioner teams who used the decision tool for two elementary students with persistent challenging behavior whose initial FBAs were inconclusive. Results support the initial promise of the decision tool in (a) supporting effective collaboration between behavior analysts and classroom teachers and (b) producing interpretable assessment outcomes supported by initial intervention data. We identify supporting roles the research team played throughout this process to highlight important next steps in preparing practitioners to use and apply this decision tool independently.

Evaluation of an Instructional Package for Data-Based Decision Making
MEKA MCCAMMON (University of South Carolina), Katie Wolfe (University of South Carolina), Ashley Holt (University of South Carolina), Lauren LeJeune (University of South Carolina)
Abstract: Adapting interventions based on student progress is paramount to the effectiveness of instruction in special education and applied behavior analysis. There is limited research on effective, time-efficient methods for teaching educators and clinicians to make informed instructional decisions by analyzing students’ performance data. Preliminary evidence from Kipfmiller et al. (2019) suggests that a decision-making model can increase the accuracy of data-based decisions made by front-line employees. The purpose of this study was to evaluate a more complex model in which participants were taught to identify the type of data pattern and then make a corresponding instructional decision. We used a multiple baseline across participants design to evaluate the effectiveness of an instructional package, consisting of a brief online training and a decision-making model, in increasing the accuracy of instructional decisions made by preservice teachers and prospective behavior analysts. All participants increased the percentage of data patterns and instructional decisions they identified correctly across multiple exemplars during assessment sessions. The implications of these findings, along with one-month follow-up data, will be discussed.

Considering the Process and Product of Intensive Intervention Through Data Triangulation
IPSHITA BANERJEE (Peabody College, Vanderbilt University), Joseph Michael Lambert (Vanderbilt University), Nealetta Houchins-Juarez (Vanderbilt University), Bailey Copeland (Vanderbilt University)
Abstract: This study highlights a framework for modifying the objectives and approach of interventions for challenging behavior based on both in-session and out-of-session outcomes. The framework is intended to maximize efficacy and social validity through a flexible but systematic approach to data analysis. Variables considered include child outcomes, basic behavioral processes (e.g., bursts, contrast), shifting caregiver values, implementation fidelity, resource constraints, and preference. Preliminary findings from three child-caregiver dyads suggest that objectives emphasized in contemporary research (e.g., suppression of challenging behavior, increases in functional communication and compliance, discrimination training, delay/denial tolerance) represent desirable and socially valid outcomes. However, the intervention variables responsible for producing them (e.g., prompting techniques, treatment dosage) are far more idiosyncratic.

Visual Analysis With Dynamic Data Sets and Changing Data Accuracy
ALLISON NICOLE WHITE (Michigan State University), Matthew T. Brodhead (Michigan State University), David J. Cox (Johns Hopkins University School of Medicine)
Abstract: Practitioners often decide to continue or modify an intervention using visual analysis of data paths that lengthen from session to session. We used a novel lengthening-data-path procedure to parametrically assess how reducing data accuracy changed decisions to continue or modify an intervention among 30 students in behavior-analytic graduate programs. Additionally, because of potential similarities between data accuracy and probability, we examined how one probability discounting equation described individual choice. We found that decreasing data accuracy systematically reduced the number of sessions participants waited to modify an intervention for 25 of the 30 participants. When data accuracy was 100%, most participants waited 9–10 sessions before intervening. When data accuracy was below 60%, most participants waited 4–6 sessions before intervening. Lastly, the probability discounting equation described patterns of choice well for 16 participants. Data accuracy influenced most participants’ visual analyses in a systematic manner; however, the degree of influence differed between individuals.
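The abstract does not specify which probability discounting equation was fit; as an illustrative assumption only, a common single-parameter form in the probability-discounting literature (Rachlin, Raineri, & Cross, 1991) is the hyperbolic model below, in which the subjective value V of an outcome of nominal value A received with probability p is discounted by the odds against receipt, Θ = (1 − p)/p, at an individual rate h:

\[ V = \frac{A}{1 + h\,\Theta}, \qquad \Theta = \frac{1 - p}{p} \]

Under this reading, data accuracy plays the role of p, and a larger fitted h would correspond to a participant who more steeply devalues, and therefore acts sooner on, less accurate data paths.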