Association for Behavior Analysis International

The Association for Behavior Analysis International® (ABAI) is a nonprofit membership organization with the mission to contribute to the well-being of society by developing, enhancing, and supporting the growth and vitality of the science of behavior analysis through research, education, and practice.


32nd Annual Convention; Atlanta, GA; 2006

Event Details



Symposium #105
Behavioral Systems Analysis and Computer-Based Instructional Design: The Reciprocity Between Basic Theory and Instructional Applications
Sunday, May 28, 2006
9:00 AM–10:20 AM
Dunwoody
Area: TPC; Domain: Theory
Chair: Roger D. Ray ((AI)2, Inc.; Rollins College)
Discussant: Patrick S. Williams (University of Houston, Downtown)
Abstract: Kantor’s disregard for experimentation and his advocacy of purely descriptive strategies resulted in his having little influence on the current science of behavior analysis (Verplanck, 1983; Schoenfeld, 1969). The disparity between Kantor’s powerful conceptualization of interbehavioral psychology and the weak or nonexistent efforts to translate his ideas into research led Ray and Brown (1975, 1976) to publish some of the earliest studies addressing this problem. A series of research articles articulating their “systems approach to behavior” followed (Ray & Ray, 1976; Ray, 1977; Ray, Upson, & Henderson, 1977), and a decade later Ray and Delprato (1989) presented a systematic summary of this approach emphasizing three facets of descriptive interbehavioral systems research strategies and tactics: structural analysis, functional analysis, and operations analysis. It may be argued, however, that all descriptive research strategies lack confirmation of their completeness (i.e., they lack tests for the adequacy of their descriptions). Computerized modeling and simulation offer unique solutions to this confirmation problem. Simulations, in the form of interactive computerized instructional and training systems, present a “virtual reality” that provides precisely the affirming phenomenological feedback descriptive researchers need for making adequacy inferences. Software that allows analysis of user histories offers yet another approach to adequacy feedback and analysis.
 
Interbehavioral Systems Analysis, Simulations, Modeling, and Computerized Instructional Systems Design: A Retrospective on Lessons from CyberRat.
ROGER D. RAY ((AI)2, Inc.; Rollins College)
Abstract: Verplanck (1983) noted that, despite the highly significant conceptual power of Kantor’s “interbehavioral” approach to psychology (Kantor, 1959), Kantor shunned laboratory experimentation. Kantor rejected the existence of cause-effect relations among the various participating factors defining his interbehavioral event (Kantor, 1959); thus, instead of analyzing causal relations, he argued for purely descriptive approaches. CyberRat is a “virtual reality” simulation that models rats in operant chambers based on such an approach. It builds on concepts first presented in Ray et al.’s “systems approach to behavior” publication series (Ray & Brown, 1975, 1976; Ray & Ray, 1976; Ray, 1977; Ray, Upson, & Henderson, 1977). Ray and Delprato (1989) summarized the strategies and tactics of descriptive interbehavioral systems research. In their article, a “narrative reconstruction” of coded behaviors was described as one strategy for assessing the completeness, or adequacy, of descriptive data. CyberRat offers another, as yet unexplored, approach to adequacy testing based on “virtual reality” reconstructions. These are accomplished via dynamically and stochastically determined digital video editing algorithms whose parameters are altered by user interactions that simulate operant experimental manipulations and their implications for behavior. Lessons learned about the inadequacies of descriptive research that inspired CyberRat are discussed.
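For illustration only: the minimal Python sketch below suggests one way the stochastically determined clip sequencing described above might be organized, with simulated reinforcement shifting the selection weights. All class, behavior, and parameter names are hypothetical and are not drawn from the actual CyberRat implementation.

import random

# Hypothetical, simplified stand-in for a stochastic video-clip sequencer
# whose parameters change with user interactions; not (AI)2's actual code.
class StochasticClipSequencer:
    def __init__(self, transition_weights):
        # transition_weights: {current_behavior: {next_behavior: weight}}
        self.weights = transition_weights

    def next_clip(self, current_behavior):
        # Stochastically pick the next behavior clip given the current one.
        options = self.weights[current_behavior]
        behaviors = list(options)
        weights = [options[b] for b in behaviors]
        return random.choices(behaviors, weights=weights, k=1)[0]

    def reinforce(self, behavior, amount=0.2):
        # Simulated reinforcement shifts transition weights toward `behavior`.
        for options in self.weights.values():
            if behavior in options:
                options[behavior] += amount

# Example: reinforcing "bar_press" makes clips of that behavior more likely.
sequencer = StochasticClipSequencer({
    "explore":   {"explore": 0.6, "groom": 0.3, "bar_press": 0.1},
    "groom":     {"explore": 0.5, "groom": 0.4, "bar_press": 0.1},
    "bar_press": {"explore": 0.4, "groom": 0.1, "bar_press": 0.5},
})
sequencer.reinforce("bar_press")
print(sequencer.next_clip("explore"))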
 
Modeling an Operations Analysis of Descriptive Research Publications: A New Taxonomy of Observational Procedure Variations.
JESSICA M. RAY (Rollins College), Roger D. Ray ((AI)2, Inc.; Rollins College)
Abstract: Verplanck’s Operations Analysis (http://web.utk.edu/~wverplan/opanalma.html) was developed to define terms in their clearest and most fundamental forms. Most of the experimental operations defined in Verplanck’s Glossary & Thesaurus (cf. Verplanck, 1957; and the expanded online version at http://psych-ai.com/www/WSV.html) reflect this process. Much of the Glossary & Thesaurus presents a taxonomy for describing experimenter behaviors. This empirically founded taxonomic approach to categorizing extant research efforts led Verplanck to articulate operations analysis as a fundamental procedure for making inductive generalizations in the behavioral sciences, a synthesis he eventually referred to as “Operation Analytic Behaviorism” (Verplanck, 1996). Using a similar operations approach, we reviewed as many exemplar publications as possible to capture the differences in observational sampling and recording procedures used by researchers. Attempts to model these operations in a new coding-trainer software system suggested the need for new criteria and terms for classifying and simulating variations in observational procedure. Our operations analysis uses concurrent applications of time and behavior as the defining events for when and what to observe and record in observational research. What emerged from this effort is a general software system that trains users in any method defined by the taxonomy we present.
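For illustration only: the sketch below shows, in much-simplified and assumed terms, how time-defined and behavior-defined criteria might jointly determine when and what an observational procedure records. The class and field names are hypothetical, not terms from the authors’ taxonomy or software.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ObservationProcedure:
    interval: Optional[int] = None                        # record every `interval` seconds (time-defined)
    trigger_behaviors: set = field(default_factory=set)   # record whenever these occur (behavior-defined)
    records: list = field(default_factory=list)

    def observe(self, second, behavior):
        # Record if a scheduled sampling moment arrives or a trigger behavior occurs.
        time_due = self.interval is not None and second % self.interval == 0
        behavior_due = behavior in self.trigger_behaviors
        if time_due or behavior_due:
            self.records.append((second, behavior))

# Momentary time sampling every 10 s combined with event recording of bar presses.
proc = ObservationProcedure(interval=10, trigger_behaviors={"bar_press"})
for t, b in [(3, "groom"), (10, "explore"), (14, "bar_press")]:
    proc.observe(t, b)
print(proc.records)   # [(10, 'explore'), (14, 'bar_press')]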
 
Adapting Adaptive Instruction: The students should guide us (with a nod to the Wiki Way).
DAVID A. ECKERMAN (University of North Carolina, Chapel Hill), Steven M. Kemp (University of North Carolina, Chapel Hill)
Abstract: Feedback from students and colleagues may be used to continuously improve online learning materials such as those provided by (AI)2, Inc. Their MediaMatrix system preserves student responses in a manner that aids this process. Responses to fill-in-the-blank questions from a newly developed “Learning Chapter” provided guidance to a colleague helping to revise the text, the questions, and the acceptable answers to those questions. In this process, effective sections of the current text were distinguished from less effective ones. Further, questions occasioning technical answers (a small response class) were distinguished from those occasioning nontechnical answers (a larger response class). Using MediaMatrix, an author may aspire to develop “expert” rather than merely “stereotyped” behavior. Basic behavioral principles apply to this process of text improvement, which also mimics the “open source” approach to the development of software and information resources sometimes known as The Wiki Way.
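For illustration only: one crude, assumed proxy for the “small versus large response class” distinction described above is the number of distinct answers students actually gave to a question. The functions and threshold below are hypothetical and are not part of MediaMatrix.

from collections import Counter

def response_class_size(responses):
    # Count distinct answers after trivial normalization (case and whitespace).
    return len(Counter(r.strip().lower() for r in responses))

def classify_question(responses, threshold=5):
    # Few distinct answers suggests a technical (small) response class.
    return "technical" if response_class_size(responses) <= threshold else "nontechnical"

# Example with stored fill-in-the-blank responses to a single question.
answers = ["positive reinforcement", "Positive Reinforcement", "reinforcement",
           "adding a stimulus", "positive reinforcement "]
print(response_class_size(answers))   # 3
print(classify_question(answers))     # technical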
 
