|Continuous Assessment of Learner Behavior: Performance Monitoring Tools
|Monday, May 30, 2022
|11:00 AM–11:50 AM
|Meeting Level 1; Room 102A
|Area: DEV/TBA; Domain: Applied Research
|Chair: Ji Young Kim (Teachers College, Columbia University)
|CE Instructor: Ji Young Kim, Ph.D.
|Performance monitoring tools can be used to track and enhance the performance of staff and recipients of behavior analytic services. One important dimension of performance monitoring is analyzing the interlocking three-term contingencies between an instructor and a learner, which can be achieved using the Teacher Performance Rate and Accuracy (TPRA) measure. Another important dimension is measuring when sufficient learning has occurred, i.e., applying mastery criteria. Lastly, a decision-tree protocol, which includes, but is not limited to, mastery criteria, can be used to make moment-to-moment decisions regarding learner behavior based on continuous data analysis. In this talk, we review data demonstrating how these three performance monitoring tools can aid instructors in making clinical decisions regarding the performance of their learners. These decisions can enhance educational outcomes for both the instructor and the learner and lead to more efficient rates of acquisition across objectives.
|Instruction Level: Intermediate
|Keyword(s): data analysis, mastery criteria, performance monitoring, treatment integrity
|Audience members should have basic knowledge of three-term contingencies and the components necessary for an effective learning environment. Basic knowledge of data collection, performance monitoring, and data analysis would aid in understanding the material presented.
|Learning Objectives: At the conclusion of the presentation, participants will be able to: 1) monitor learner performance using the Teacher Performance Rate and Accuracy scale and apply the scale within a learning environment as a treatment integrity tool, 2) monitor learner performance using mastery criteria to signal whether sufficient learning on a given objective has occurred, and 3) use a decision-tree protocol to make moment-to-moment decisions and monitor learner performance through continuous data analysis.
|Addressing the Feasibility of the Teacher Performance Rate and Accuracy Scale as a Treatment Integrity Tool
|KIEVA SOFIA HRANCHUK (St. Lawrence College), Michael James Williams (Maltby Centre)
|Abstract: We implemented a multiple probe across participants design to analyze the effects of behavioral skills training (BST) on teaching assistants’ effective delivery of instruction as measured through their performance on the Teacher Performance Rate and Accuracy (TPRA) scale. Effective instruction is defined as instruction that is both accurate and fluent. Three adult teaching assistants, newly hired at a kindergarten readiness program that employed the principles of applied behavior analysis, were selected to participate. The participants had no previous experience implementing three-term contingency trials. Dependent variables included two components of the TPRA scale measured pre- and post-intervention: 1) percent of correctly delivered trials, and 2) rate of trial delivery. Results indicated that BST increased the accurate delivery of correct three-term contingency trials by teaching assistants as measured through TPRA scale observations. The intervention also successfully increased the teaching assistants’ rate of trials delivered per minute.
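The two TPRA-derived dependent variables in this study, percentage of correctly delivered trials and rate of trial delivery, reduce to simple arithmetic over an observation session. A minimal sketch of that computation (the function and field names are illustrative assumptions, not the published scale):

```python
def tpra_summary(correct_trials, incorrect_trials, session_minutes):
    """Summarize a TPRA-style observation: accuracy and rate of trial delivery.

    correct_trials / incorrect_trials: counts of accurately vs. inaccurately
    delivered three-term contingency trials recorded by the observer.
    session_minutes: length of the observation in minutes.
    """
    total = correct_trials + incorrect_trials
    percent_correct = 100.0 * correct_trials / total if total else 0.0
    rate_per_minute = total / session_minutes
    return {"percent_correct": percent_correct, "trials_per_minute": rate_per_minute}

# Example: 27 of 30 trials delivered correctly in a 10-minute observation.
print(tpra_summary(27, 3, 10))  # {'percent_correct': 90.0, 'trials_per_minute': 3.0}
```

Computed pre- and post-BST, the two values correspond directly to the study's accuracy and fluency measures.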
|Mastery Criteria as a Performance Monitoring Tool in Educational Settings
|DANIEL MARK FIENUP (Teachers College, Columbia University), Kristina Wong (Columbia University), Sarah M. Richling (Auburn University)
|Abstract: Performance monitoring tools can be used to track and enhance the performance of staff and recipients of behavior analytic services. One important tool is a set of rules for determining when sufficient learning has occurred, signaling the instructor to change or terminate instruction and move on to teaching new behavior. These performance criteria for “mastery” have implications for the future behavior of the students we serve. In this talk, we review data demonstrating how different performance criteria produce different educational outcomes (e.g., maintenance) and propose a model for using these data to address a fuller range of educational outcomes.
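A mastery criterion of the kind discussed here is typically a threshold over recent session data, e.g., a minimum percentage correct sustained across a set number of consecutive sessions. A hedged sketch (the 90%-across-two-consecutive-sessions rule is an illustrative example, not a criterion the authors evaluated):

```python
def mastery_met(session_scores, threshold=90.0, consecutive=2):
    """Return True when the last `consecutive` session scores (percent correct)
    all meet or exceed `threshold` -- one common form of mastery criterion."""
    if len(session_scores) < consecutive:
        return False
    return all(score >= threshold for score in session_scores[-consecutive:])

print(mastery_met([70, 85, 92, 95]))  # True: last two sessions are at or above 90
print(mastery_met([70, 95, 85, 92]))  # False: the 85 in the final window fails
```

Varying `threshold` and `consecutive` is exactly the kind of manipulation whose downstream effects (e.g., on maintenance) the talk examines.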
|A Decision Protocol for Teachers: A Strategic Science Application to Teacher Training and Performance Outcomes
|Dolleen-Day Keohane (Nicholls State University, Touchstone), JO ANN PEREIRA DELGADO (Teachers College, Columbia University)
|Abstract: Training teachers to make effective instructional decisions utilizing procedures rooted in the strategic science of teaching is paramount to accelerating student learning. In this presentation, we will cover the CABAS® decision tree protocol (Greer and Keohane, 2005), which has been applied and expanded upon in CABAS® Model schools for over two decades. Teachers first learn to utilize a set of rules based on the visual inspection of graphs to signal a decision opportunity to either continue a phase or cease a phase and select a new objective. When instructional problems are encountered, teachers are taught to make higher-level decisions by learning to follow verbally governed algorithms, which involve an analysis of the curriculum and the context for learning. A comprehensive, dynamic training package comprising research-based performance management components (i.e., graph and curriculum checks, decision logs, decision graphs, and verbally governed supervisor learn units) allows for differentiated teacher instruction and the acquisition of contingency-shaped and verbally mediated teacher competencies.
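The decision opportunities described above, continuing a phase versus ceasing it and changing something, hinge on visual-inspection rules applied to recent graphed data. A simplified sketch of one such rule (the three-point trend heuristic and the labels are illustrative assumptions, not the CABAS® protocol itself):

```python
def phase_decision(scores, mastery=90.0, window=3):
    """Classify the most recent `window` data points of a phase graph.

    Returns one of:
      'terminate' - mastery threshold met across the window (select a new objective)
      'continue'  - ascending trend, so current instruction is working
      'decision'  - flat or descending trend: a decision opportunity to analyze
                    the curriculum and context for learning and change tactics
    """
    recent = scores[-window:]
    if len(recent) < window:
        return "continue"  # not enough data points to decide yet
    if all(score >= mastery for score in recent):
        return "terminate"
    ascending = all(a < b for a, b in zip(recent, recent[1:]))
    return "continue" if ascending else "decision"

print(phase_decision([40, 55, 70]))  # 'continue': strictly ascending trend
print(phase_decision([60, 60, 55]))  # 'decision': no ascending trend, no mastery
print(phase_decision([92, 95, 97]))  # 'terminate': mastery met across window
```

In the protocol as described, rules like this handle the routine cases, while the 'decision' branch is where the verbally governed, higher-level analysis takes over.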