Behavior Analysis in Education: Improving Test Performance and Graphic Data Analysis in the Classroom
Sunday, May 28, 2006
1:30 PM–2:50 PM
Techwood
Area: TBA; Domain: Applied Research
Chair: Jennifer L. Austin (California State University, Fresno)
Abstract: This symposium will present contemporary behavior-analytic research on two topics in education: improving university students’ test performance and improving the visual inspection performance of university students and schoolteachers. In the first study, Jennifer Simon will present data demonstrating that a rehearsal intervention can improve university students’ ability to correctly answer written test questions. In the second study, Kelise Stewart will present data demonstrating that use of the conservative dual-criterion visual inspection aid improves the ability of undergraduate students to visually detect behavior change. In the third study, David Richman will present data demonstrating that a treatment package consisting of lectures and written materials can increase agreement between undergraduate student and expert ratings of behavior change. In the final study, Jennifer Austin will present data demonstrating that visual-inspection training incorporating graphic displays improves the ability of teachers to visually detect behavior change.

The Effects of a Written Rehearsal Procedure on Undergraduate Test Performance.
JENNIFER SIMON (University of Kansas), Rachel H. Thompson (University of Kansas)
Abstract: We investigated the effects of a written rehearsal procedure on test performance among 37 college students enrolled in an undergraduate, lecture-based course. The rehearsal procedure involved students writing questions and answers related to lecture material until correct independent performance was achieved. The procedure improved scores on experimental questions relative to control questions both when students rehearsed immediately following each lecture (daily task) and when rehearsal was programmed only immediately prior to each test (unit task). Students continued to implement the procedure after a point contingency for completion was eliminated. Interobserver agreement (IOA) was collected on 40% of the tests for all units; IOA for the unit tests was 97.7% (range, 96% to 99%). IOA was also collected on 40% of all daily and unit tasks for completion and accuracy; IOA for the completion and procedural integrity of all daily and unit tasks was 100%.

Teaching College Students to Use the Conservative Dual-Criterion Method of Visual Inspection.
KELISE STEWART (Western Michigan University), James E. Carr (Western Michigan University), Charlie Brandt (Western Michigan University), Meade M. McHenry (Western Michigan University)
Abstract: We evaluated two interventions for improving the ability of undergraduate students to visually detect behavior change from A-B design graphs. Six students were first exposed to a videotaped lecture that focused on evaluating between-phase changes in data paths by examining their trend, level, and variability. The lecture was ineffective in improving visual inspection performance above baseline (chance) levels. Students were then exposed to a lecture on how to use the conservative dual-criterion (CDC) method, which involves superimposing the mean and regression lines from one data path onto the subsequent data path to assist with change detection. Results indicate that the CDC method improved the visual inspection of all students to near-100% accuracy. Interestingly, performance declined when the CDC lines were removed from the graphs, suggesting a potential barrier to the maintenance of visual inspection performance under this method.
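The CDC decision rule summarized above lends itself to a short computational sketch. In the published version of the method, each superimposed criterion line is additionally shifted by 0.25 baseline standard deviations in the direction of expected change, and the number of treatment-phase points falling beyond both lines is compared against a binomial criterion; the function below is a minimal illustration under those assumptions (the function and variable names are our own, not code from the study):

```python
from math import comb

def cdc_test(baseline, treatment, increase=True, alpha=0.05):
    """Sketch of the conservative dual-criterion (CDC) test.

    Fits a mean line and a least-squares trend line to the baseline
    phase, shifts both by 0.25 baseline standard deviations in the
    direction of expected change, and asks whether the number of
    treatment points beyond BOTH lines meets a binomial criterion.
    """
    n_b = len(baseline)
    mean_b = sum(baseline) / n_b
    sd_b = (sum((y - mean_b) ** 2 for y in baseline) / n_b) ** 0.5
    # Least-squares slope of the baseline trend line.
    x_mean = (n_b - 1) / 2
    sxx = sum((x - x_mean) ** 2 for x in range(n_b))
    sxy = sum((x - x_mean) * (y - mean_b) for x, y in enumerate(baseline))
    slope = sxy / sxx
    shift = 0.25 * sd_b if increase else -0.25 * sd_b
    hits = 0
    for i, y in enumerate(treatment, start=n_b):
        mean_line = mean_b + shift
        trend_line = mean_b + slope * (i - x_mean) + shift
        if increase:
            hits += y > mean_line and y > trend_line
        else:
            hits += y < mean_line and y < trend_line
    # Smallest hit count whose chance probability (p = .5) is < alpha.
    n_t = len(treatment)
    k = next(k for k in range(n_t + 1)
             if sum(comb(n_t, j) for j in range(k, n_t + 1)) / 2 ** n_t < alpha)
    return hits >= k

# Hypothetical data: a clear level increase passes, overlap does not.
cdc_test([1, 1.2, 0.9, 1.1, 1.0], [3, 3.2, 2.9, 3.1, 3.0, 3.3, 2.8, 3.1])   # True
cdc_test([1, 1.2, 0.9, 1.1, 1.0], [1.0, 1.1, 0.9, 1.2, 1.0, 1.1, 0.9, 1.0]) # False
```

The study's finding that accuracy dropped once the lines were removed is consistent with the rule being tied to the drawn criterion lines rather than to unaided inspection.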

Teaching Visual Inspection of Reversal Designs to Undergraduate Students.
DAVID M. RICHMAN (University of Maryland, Baltimore County), Sung Woo Kahng (Johns Hopkins University School of Medicine & Kennedy Krieger Institute), Steven J. Pitts (University of Maryland, Baltimore County)
Abstract: Level of agreement in interpreting 36 ABAB reversal design graphs was assessed between 45 Journal of Applied Behavior Analysis (JABA) editorial board members and 14 undergraduate students taking an upper-level psychology course on functional analysis and treatment of behavior disorders. The undergraduate students rated each of the 36 graphs “yes” or “no” with regard to whether the data demonstrated experimental control over the dependent variable, both before and after instruction. Between the two ratings, the students received two 75-minute lectures on interpreting ABAB reversal designs and were assigned readings on measurement, design, and graphing of direct observation data. The primary dependent variable was the level of agreement between the two groups pre- and post-instruction. On the pre-instruction ratings, the 14 undergraduate students agreed with the JABA editorial board at a kappa of .345 (68% agreement); agreement increased to a kappa of .570 (80% agreement) after instruction. One interesting finding was that the majority of undergraduate students agreed with the editorial board both pre-test and post-test when the board rated the graphs as indicative of experimental control; the gain in agreement came from a substantial reduction in the students’ false positive ratings (from 40% to 24% error).
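Cohen's kappa, the agreement statistic reported above, corrects raw percentage agreement for the agreement two raters would reach by chance given their individual base rates, which is why 68% raw agreement can correspond to a kappa as low as .345. A minimal sketch of the computation for yes/no ratings (the data below are hypothetical, not the study's):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed proportion of trials on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's base rates.
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

experts  = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
students = ["yes", "yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes"]
kappa = cohens_kappa(experts, students)  # 80% raw agreement, kappa ≈ 0.6
```

Note that a rater who over-reports "yes" (as the students did pre-instruction, via false positives) inflates chance agreement on "yes" trials and thereby depresses kappa even when raw agreement looks respectable.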

Using Graphic Displays to Improve Teachers’ Detection of Changes in Student Behavior.
JENNIFER L. AUSTIN (California State University, Fresno), Allana D. Luquette (Families First of Florida)
Abstract: Although the influence of graphic feedback on teacher and student behavior has been demonstrated repeatedly in the behavior-analytic literature, little has been done to evaluate its influence on teachers’ abilities to accurately identify behavior change. Accurate detection of change may prove crucial to a teacher’s willingness to continue implementing behavior programs, especially when changes in behavior are gradual. Typically, teachers rely strictly on their perceptions of behavior change, rather than on actual changes, when evaluating the effects of an intervention; access to graphed data on student behavior may improve their ability to recognize actual treatment effects. This study assessed the effects of graphic displays, and of training in visual inspection of graphed data, on teachers’ abilities to accurately identify changes in student behavior. In addition, we evaluated the effects of these independent variables on participants’ decisions to continue behavioral interventions. For all three participating teachers, accuracy in detecting behavior change improved when graphic displays were provided. Additional improvements were not observed with the introduction of training, possibly due to a ceiling effect. Treatment effects on teachers’ decisions to maintain interventions were variable across conditions.