|Improving Outcomes of Intensive Behavioral Intervention
|Tuesday, May 27, 2014
|12:00 PM–1:50 PM
|W183c (McCormick Place Convention Center)
|Area: AUT/EDC; Domain: Applied Research
|Chair: Susan Ainsleigh (Bay Path College)
|Discussant: Catherine R. Green (Simmons College)
|CE Instructor: Susan Ainsleigh, Ed.D.
Administrators who develop and oversee instructional programs are universally concerned with the effectiveness of instructional programming. Monitoring program effectiveness requires planning for and assessing the achievement of specific, desired programmatic outcomes. Effective programs have been described as those that achieve fluency, generalization, and maintenance, and do so more efficiently than less effective instructional programs would (Moran & Malott, 2004). An educational model using an intensive delivery of behavioral intervention has been proposed as an effective means for educating children with autism spectrum disorders (Cummings & Carr, 2009; Cohen, Amerine-Dickens, & Smith, 2006; Smith, 1999). However, intensive programs often require intensive planning and monitoring to achieve desired outcomes; even with intensive monitoring, these desired outcomes can be difficult to achieve. This symposium presents several components of an intensive, clinic-based ABA program that strengthen instructional outcomes through the use of structural analysis to assist in the selection of effective intervention packages, programming for generalization using the most effective and efficient program design, and a systematic model to assure timely data-based decision making.
|Keyword(s): decision-making, program outcomes, promoting generalization, structural analysis
Make it Last! Selecting Effective Strategies to Promote Generalization of Speech Therapy Outcomes
|AMAL AL-NABULSI (Jeddah Institute for Speech and Hearing)
Speech and language therapy is often provided as a component of intensive behavioral intervention and shares with behavioral therapy both the goal of promoting generalization of therapy outcomes and the challenge of achieving this goal. Several strategies have been identified in the speech and behavioral literatures as being associated with improved generalization; however, few studies demonstrate the superiority of one strategy over another with regard to the generalization of targeted skills. Further, although speech and language skills are routinely targeted in both behavioral and speech therapy, few studies suggest which strategies promote improved generalization of specific language targets. This paper reviews several known strategies for promoting generalized outcomes and compares the use of three of them (programming common stimuli, teaching loosely, and programming indiscriminable contingencies) across two common targets of speech therapy: following teachers' directions and imitating the actions of an adult. An alternating treatments design across conditions was used to examine the effectiveness of the different generalization-promoting strategies.
Using Structural Analysis to Increase Active Student Responding
|Chengan Yuan (Jeddah Institute for Speech and Hearing), SANAA IBRAHIM (Jeddah Institute for Speech and Hearing)
In designing behavioral programming, behavior analysts must select the most effective instructional methodologies to achieve instructional outcomes. In addition to selecting specific behavior change procedures, the analyst must also examine antecedent variables associated with the presence or absence of optimal performance. This analytic process, referred to as structural analysis, has been used to examine a variety of instructional and performance conditions for targets such as work performance (Green et al., 1991), but has primarily been utilized to examine the effect of varying antecedent conditions on the exhibition of problem behavior (Vaughn & Horner, 1997; Smith & Iwata, 1997). This presentation demonstrates the use of structural analysis to identify antecedent or instructional variables associated with active responding by two children with autism spectrum disorders. In both cases, analyses were conducted by manipulating instructional variables to determine which were associated with reduced latency to responding and increased rates of correct responding. Instructional variables manipulated included trial sequencing (massed versus distributed discrete-trial teaching), task content (single or mixed types of tasks), and the presence or absence of motor movement incorporated with instructional trials. For both subjects, results of structural analyses suggested that specific trial sequencing was associated with optimal rates of responding (distributed versus massed) and that the presence of motor movement resulted in shorter latencies to responding. Differences in optimal responding were noted across conditions for different types of tasks for one of the subjects (language tasks versus visual tasks). The process of structural analysis is presented, with a focus on the selection of effective instructional models based on the results of structural analyses.
A Decision-Making Model for Improving Behavior Analytic Services
|SUSAN AINSLEIGH (Bay Path College), Shumaila Jaffrey (Jeddah Institute for Speech and Hearing)
Clinical practice is often an uncertain exercise (Gambrill, 1990). Across the field of ABA, wide variations exist in the way services are implemented, monitored, and modified. Decision-making about when to intervene, when to change an intervention, and when to discontinue intervention is part of daily practice; yet, with few guidelines for how to make such decisions, clinicians often rely on instinct, and mistakes are inevitable. Such mistakes are costly, potentially slow progress, and are compounded when decision-making is spread across less experienced or newly trained personnel. This paper presents a decision-making model using visually displayed data that guides interventionists, case managers, clinical supervisors, or families receiving services through the clinical decision-making process. It presents specific guidelines on how to monitor performance data to determine when to make a clinical change (such as when to intervene, when to discontinue an intervention, when to fade prompts or thin reinforcement and punishment schedules, or when to implement strategies to promote generalization). Attendees will learn how to make better, more timely decisions related to the provision of behavior analytic services and how to ensure that those under their supervision do the same. A case study applying the model across four behavior analysts responsible for clinical decision making in an intensive behavior program will be presented, with an examination of how the use of the model in various formats resulted in more accurate, timely, and independent decision making.
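To illustrate how data-based decision rules of this general kind can be made explicit enough for consistent use by newly trained personnel, the sketch below encodes one simple rule over recent session accuracy. The mastery criterion, window size, and outcome labels are assumptions chosen for illustration, not the specific model presented in this talk:

```python
def clinical_decision(scores, mastery=0.9, window=3):
    """Suggest a program action from recent session accuracy data.

    Hypothetical decision rule for illustration only.
    scores: proportions correct per session, most recent last.
    Returns "advance", "modify", or "continue".
    """
    if len(scores) < window:
        return "continue"  # too few sessions to justify a change
    recent = scores[-window:]
    if all(s >= mastery for s in recent):
        # criterion met: e.g., fade prompts, thin schedules, probe generalization
        return "advance"
    if recent[-1] <= recent[0]:
        # flat or descending trend across the window: modify the intervention
        return "modify"
    return "continue"  # ascending trend below criterion: stay the course
```

A rule like this can be checked against the same visually displayed data a supervisor would inspect, so independent decisions by different analysts converge on the same action.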
Fixed Versus Variable Schedules of Performance Monitoring on Program Implementation and Concurrent Effect on Student Performance in Intensive Behavioral Programs
|CHENGAN YUAN (Jeddah Institute for Speech and Hearing), Shumaila Jaffrey (Jeddah Institute for Speech and Hearing)
Previous research has demonstrated that performance monitoring combined with reinforcement contingencies can increase the treatment integrity of instructors in special education and behaviorally based instructional programs. In addition, changes in student performance have been associated with improvements in treatment integrity. DiGennaro et al. (2007), for example, noted that improvements in treatment integrity produced by performance feedback delivered to teachers were also associated with lower rates of student problem behavior. Previous research has noted that performance feedback yields higher rates of treatment integrity, and that feedback schedules thinned to as much as two weeks between sessions can maintain high levels of treatment integrity. Ongoing observation and performance feedback remain a critical aspect of maintaining treatment integrity, however, and in an intensive therapy program, maintaining dense schedules of performance feedback can be costly and inefficient. In addition, evidence of changes in student performance across differing levels of treatment integrity remains scarce. This study examined the effects of fixed versus variable schedules of performance feedback on program implementation by therapists in an intensive ABA program. Concurrent effects on student performance, in relation to both rates of correct responding and rates of problem behavior, were examined.
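To make the contrast between the two monitoring conditions concrete, the sketch below generates hypothetical observation calendars under a fixed schedule and a variable schedule with the same mean interval. The specific interval values and the uniform-random gap distribution are illustrative assumptions, not the study's actual procedure:

```python
import random

def fixed_schedule(total_days, interval):
    """Feedback sessions at a predictable, fixed gap (e.g., every 7 days)."""
    return list(range(interval, total_days + 1, interval))

def variable_schedule(total_days, mean_interval, seed=0):
    """Feedback sessions at unpredictable gaps averaging mean_interval days.

    Gaps are drawn uniformly from 1 to 2*mean_interval - 1 so the expected
    gap equals mean_interval, matching the fixed condition's density.
    """
    rng = random.Random(seed)
    days, day = [], 0
    while day < total_days:
        day += rng.randint(1, 2 * mean_interval - 1)
        if day <= total_days:
            days.append(day)
    return days
```

Under a variable schedule, therapists cannot predict which sessions will be observed, which is the feature hypothesized to sustain treatment integrity between feedback visits at the same overall monitoring cost.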