Association for Behavior Analysis International

The Association for Behavior Analysis International® (ABAI) is a nonprofit membership organization with the mission to contribute to the well-being of society by developing, enhancing, and supporting the growth and vitality of the science of behavior analysis through research, education, and practice.


35th Annual Convention; Phoenix, AZ; 2009

Event Details



Symposium #169
Evidence-based, Empirically Supported, Best Practice: What Does it All Mean?
Sunday, May 24, 2009
9:00 AM–10:20 AM
North 122 BC
Area: EDC/VRB; Domain: Service Delivery
Chair: Ronnie Detrich (Wing Institute)
Discussant: Janet S. Twyman (Headsprout)
Abstract: The evidence-based practice movement has resulted in a proliferation of terms, which are often used interchangeably. This has resulted in confusion for both consumers of services and researchers trying to identify and validate effective practices. The fundamental assumption informing all of these terms is that effective practices are best identified through scientific research. The challenge is to validly identify practices that are most likely to be effective based on the research evidence. Three general approaches have emerged: consensus by an expert panel, identification and validation of principles and broad strategies, and identification and validation of specific procedures. Different terms have been used by different professional organizations to describe these approaches. The lack of specificity can create the impression that the evidence-based practice movement is another fad without substance and can lead consumers and decision makers to ignore attempts to base interventions on scientific evidence. The purpose of this symposium is to describe the range of approaches to the challenge of identifying effective practices, to discuss the strengths and limitations of each approach, and to review the various terms that are commonly used.
Best Practice Guidelines: Standing on the Shoulders of Giants?
RONNIE DETRICH (Wing Institute)
Abstract: In the evidence-based practice movement, one of the primary goals is to provide guidance to practitioners and clients about how best to proceed with a specific problem. Among the approaches for guiding practitioners and clients are best practice guidelines, which are formulated by experts in a particular area who make recommendations regarding assessment and intervention practices for a particular problem or population. The assumption is that these experts have broad knowledge of the research literature and can make informed, coherent statements about best practices. The success of a best practice workgroup depends on how the panel of experts is constructed. If the panel is drawn too narrowly, the resulting guidelines may not reflect knowledge of all of the relevant research and approaches to a population or problem. If the panel is drawn too broadly, the widely divergent perspectives make it difficult for the panel to reach consensus on what constitutes best practices. One of the benefits of best practice guidelines is that they serve a social validity function for various interventions. This is important when the evidence-based approach validates an intervention that is more intrusive than other evidence-based interventions for a particular problem or population.
Research Based Principles: What Practice Can’t Do Without
TRINA D. SPENCER (Utah State University)
Abstract: Research-based principles of behavior provide practitioners with powerful strategies for constructing effective interventions. These principles are based on an enormous, systematic, and diverse literature that has developed across many decades. Effective applications of these principles to specific problems have been empirically demonstrated in thousands of published studies. In Baer, Wolf, and Risley’s (1968) terms, applied behavior analysis should be conceptually systematic, as this allows the behavior analyst to adapt interventions to the unique needs of a particular client in a particular context while staying within a thoroughly research-based framework. This approach has been called the scientist-practitioner model (Barlow & Hayes, 1984). The primary advantage of using this method of deriving effective practices is that practitioners have the freedom and skill to make inferences and generalizations from research findings to construct individualized interventions. However, this approach is not without limitations; to use it effectively, practitioners must have a high level of conceptual skill and extensive applied experience. In addition, the validation of principles remains one step removed from the validation of specific interventions. A variety of terms are used to refer to this approach, such as scientifically based, research-based, and empirically supported.
Evidence-Based Interventions – Validating Specific Interventions
TIMOTHY A. SLOCUM (Utah State University)
Abstract: There is broad consensus that scientific research should inform practice; however, there are numerous strategies that might be employed to strengthen the relationship between research and practitioner behavior. One such strategy is to scientifically validate specific interventions. This approach takes the intervention as the unit of analysis. An operationally defined intervention is validated by identifying scientific research that demonstrates specific positive outcomes for a particular population of clients in specific contexts. Validated interventions are often referred to as evidence-based interventions, evidence-based practices, or empirically validated interventions. The strength of this approach is its specificity and the relatively objective nature of the validation process. The main limitations of this approach are the flip sides of its strengths. As a function of its specificity, this approach limits our ability to generalize research findings to different but related problems, and it does not recognize the validity of broad principles. Its highly operationalized validation processes leave little room for judgment about the research base, which can result in errors in the validation process (false positives and false negatives). Leading examples of this approach include the What Works Clearinghouse, SAMHSA’s National Registry of Evidence-Based Programs and Practices, and the National Autism Center’s National Standards Project.


