This question bank includes questions that teams can use to develop a hypothesis about why an individual or group of students may not be responding to an intervention. The hypothesis should help guide intervention planning and selection of intensification strategies using the Intervention Intensification Strategy Checklist. When developing a hypothesis, teams should consider the intervention design, fidelity of implementation, and learner needs. Intervention fidelity data collected using the Student Intervention Implementation Log and informal diagnostic data may help teams answer the questions included in the question bank.
This Innovation Configuration can serve as a foundation for strengthening existing preparation programs so that educators exit with the ability to use various forms of assessment to make data-based educational and instructional decisions within an MTSS. The expectation is that these skills can be further honed and supported through inservice as practicing teachers.
This guide is a set of strategies and key practices with the ultimate goal of supporting students with the most intensive behavioral needs, their families, and educators in their transitions back to school during and following the global pandemic in a manner that prioritizes their health and safety, social and emotional needs, and behavioral and academic growth.
Successful implementation of a multi-tiered system of supports (MTSS) and, specifically, intensive intervention through the data-based individualization (DBI) process, demands the collection and analysis of data. As teams consider data collection, challenges may occur with assessment administration, scoring, and data entry (Taylor, 2009). This resource reviews three data collection and entry challenges and strategies to ensure data about risk status and responsiveness accurately represent student performance and minimize measurement errors.
Progress monitoring is an essential part of a multi-tiered system of supports (MTSS) and, specifically, the data-based individualization (DBI) process. It allows educators and administrators to understand whether students are responding to intervention and if adaptations are needed. In addition, these data are often used to set high-quality academic and behavioral goals within the individualized education program (IEP) for students with disabilities. With the closure of schools due to the COVID-19 pandemic, educators and administrators need to rethink how they collect and analyze progress monitoring data in a virtual setting. This collection of frequently asked questions is intended to provide a starting place for consideration.
The purpose of this guide is to provide an overview of behavioral progress monitoring and goal setting to inform data-driven decision making within tiered support models and individualized education programs (IEPs).
This tool is designed to help educators collect and graph academic progress monitoring data across multiple measures as a part of the data-based individualization (DBI) process. This tool allows educators to store data for multiple students (across multiple measures), graph student progress, and set individualized goals for a student on specific measures.
In this video, Dr. Devin Kearns, an Assistant Professor of Special Education in the Department of Educational Psychology at the Neag School of Education at the University of Connecticut and NCII Trainer & Coach, discusses the importance of consistency when selecting, administering, and scoring progress monitoring tools.
In this video, Dr. Joe Wehby, Senior Advisor to the National Center for Intensive Intervention and Associate Professor in the Vanderbilt University Department of Special Education, discusses the number of data points needed to make decisions for students with intensive behavior needs.
Norms for oral reading fluency (ORF) can be used to help educators make decisions about which students might need intervention in reading and to help monitor students’ progress once instruction has begun. This paper describes the origins of the widely used curriculum-based measure of ORF and how the creation and use of ORF norms have evolved over time. Using data from three widely used, commercially available ORF assessments (DIBELS, DIBELS Next, and easyCBM), a new set of compiled ORF norms for grades 1-6 is presented here, along with an analysis of how they differ from the norms created in 2006.