Advances in Data Collection Techniques: Impact on Clinical Decision-Making
Saturday, May 23, 2009
1:00 PM–2:20 PM
North 124 A
Area: AUT/DDA; Domain: Applied Behavior Analysis
Chair: Ginette Wilson-Bishop (Melmark New England)
Discussant: Arthur Richard Campbell (Melmark New England)
CE Instructor: Anibal Gutierrez, Jr., Ph.D.
Abstract: Clinical decision-making rests, in large part, on the accuracy of the data collected to inform those decisions. The importance of representative and accurate behavioral assessment to guide intervention cannot be overstated. However, these considerations must be balanced against the need for data collection systems that are efficient and user-friendly. This symposium highlights advances in data collection methodology and their impact on clinical decision-making. The first two talks share results from a two-part study that examined the effects of data collection methodology on the mastery and maintenance of skills learned by young children with autism through discrete trial training. The final talk will present an alternative use of conditional probabilities and contingency space analysis for measuring treatment integrity, which refers to consistent and accurate plan implementation by change agents over time. Presenters will summarize existing research, describe advances in data collection techniques, offer empirical examples, and discuss implications for clinical settings.

Comparison of First Trial Probe and Continuous Data Collection Procedures in an Early Childhood Program for Children with Developmental Disabilities
GINETTE WILSON-BISHOP (Melmark New England), Florence D. DiGennaro Reed (Melmark New England)
Abstract: The purpose of this presentation is to share the results of an investigation that sought to extend the findings of a study conducted by Cummings and Carr (in press). These researchers found that continuous and first trial probe data collection procedures did not result in significantly different acquisition data, but that first trial probe data collection resulted in (a) relatively quicker mastery and (b) poorer maintenance. As an extension of Cummings and Carr’s research, which was carried out in an analog setting, the present study was conducted in an applied setting (i.e., students’ classrooms) by teachers during typical instruction, using common classroom materials. A multi-element design was used to evaluate clinical decision-making based on visual analysis of continuous versus first trial probe data collection during implementation of receptive programs. Follow-up probes were conducted for three weeks following mastery, and teachers completed a questionnaire to assess the acceptability of the different data collection procedures. Data are currently being collected.

A Comparison of Three Types of Data Collection Procedures on Skill Acquisition and Maintenance in Children with Developmental Disabilities
Florence D. DiGennaro Reed (Melmark New England), GINETTE WILSON-BISHOP (Melmark New England)
Abstract: Limited research exists to guide the choice of data collection methods used within discrete trial training programs for children with disabilities. To date, only one study (Cummings & Carr, in press) has systematically examined the impact of first trial probe and continuous data collection procedures on mastery and maintenance of skills. This presentation will share findings from the second part of a two-part study extending the findings of Cummings and Carr. The current study replicated Cummings and Carr’s methodology; however, the researchers also examined a third type of data collection technique. Within discrete trial training, the effects of first trial probe, intermittent (e.g., first, fifth, and tenth trial), and continuous (i.e., trial-by-trial) data collection procedures on the skill acquisition and maintenance of receptive programs in children with developmental disabilities were examined using a multi-element design. In addition, teacher acceptability of the data collection methods was assessed using a Likert-type scale. Data collection is presently underway.

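To make the distinction among the three recording schedules concrete, the sketch below shows how the same ten-trial session could be summarized under first trial probe, intermittent, and continuous data collection. It is a minimal illustration only: the session data, the schedule labels, and the summarize function are hypothetical and are not taken from the study described above.

```python
# Illustrative sketch only: session data and function names are hypothetical,
# not taken from the studies described in this symposium.

def summarize(trials, schedule):
    """Percent correct among the trials that a given schedule records (True = correct)."""
    if schedule == "first_trial_probe":
        sampled = trials[:1]                        # record the first trial only
    elif schedule == "intermittent":
        sampled = [trials[i] for i in (0, 4, 9)     # e.g., first, fifth, and tenth trial
                   if i < len(trials)]
    elif schedule == "continuous":
        sampled = trials                            # record every trial
    else:
        raise ValueError(f"unknown schedule: {schedule}")
    return 100.0 * sum(sampled) / len(sampled)

# Hypothetical 10-trial session for one receptive program target.
session = [False, True, True, False, True, True, True, True, True, True]
for schedule in ("first_trial_probe", "intermittent", "continuous"):
    print(schedule, summarize(session, schedule))   # 0.0, 66.7, 80.0
```

In this hypothetical session, the same performance looks quite different depending on which trials are recorded, which is the kind of difference in clinical decision-making the comparison is designed to detect.
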
A Contingency Space Analysis of Treatment Integrity: Assessing Implementation Accuracy and Consistency
Derek D. Reed (Melmark New England), FLORENCE D. DIGENNARO REED (Melmark New England)
Abstract: While the reliable and accurate collection of data on dependent variables has long been a virtue of behavior analytic research, only recently have behavior analysts looked toward improving the degree to which independent variables are delivered in their intended and prescribed manner. The degree of accuracy and consistency in the implementation of behavior change procedures has been termed “treatment integrity” or “procedural fidelity.” The majority of such studies have focused exclusively on improving levels of treatment integrity in behavior change agents and have historically measured treatment integrity as the percentage of treatment steps implemented correctly. In this presentation, we propose that a contingency space analysis of the change agent’s delivery of consequences for clients’ behaviors may provide further insight into the effects of treatment integrity on operant learning. Using data from clinical cases, we will highlight the various ways in which supplementing traditional accuracy measures of treatment integrity with contingency space analyses may provide additional information on plan implementation and treatment efficacy, assisting in decision-making regarding treatment modifications or opportunities to enhance change agent performance.
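As a rough illustration of the general technique named above, a contingency space analysis locates a change agent's behavior by two conditional probabilities: the probability that the programmed consequence is delivered given the client's target behavior, and the probability that it is delivered given the behavior's absence. The sketch below is a minimal example under assumed data; the event log and function names are hypothetical and are not drawn from the clinical cases in the presentation.

```python
# Minimal sketch of a contingency space analysis of treatment integrity.
# The event log below is hypothetical, not data from the cases described above.

def contingency_space(events):
    """events: list of (behavior_occurred, consequence_delivered) pairs,
    one per observation interval. Returns the two conditional probabilities
    that define a point in the contingency space."""
    b = [c for occurred, c in events if occurred]          # intervals with the target behavior
    not_b = [c for occurred, c in events if not occurred]  # intervals without it
    p_c_given_b = sum(b) / len(b) if b else float("nan")
    p_c_given_not_b = sum(not_b) / len(not_b) if not_b else float("nan")
    return p_c_given_b, p_c_given_not_b

# Hypothetical intervals: (behavior occurred?, programmed consequence delivered?)
log = [(True, True), (True, True), (True, False), (False, False),
       (False, True), (False, False), (True, True), (False, False)]

p1, p2 = contingency_space(log)
print(f"P(consequence | behavior)    = {p1:.2f}")   # 0.75
print(f"P(consequence | no behavior) = {p2:.2f}")   # 0.25
```

In this hypothetical log, a percentage-of-steps measure alone would not reveal whether consequences track the target behavior; the contingency space point (0.75, 0.25) suggests they do, whereas a point near the diagonal would indicate consequences delivered largely independently of the client's behavior.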