This question bank includes questions that teams can use to develop a hypothesis about why an individual or group of students may not be responding to an intervention. The hypothesis should help guide intervention planning and selection of intensification strategies using the Intervention Intensification Strategy Checklist. When developing a hypothesis, teams should consider the intervention design, fidelity of implementation, and learner needs. Intervention fidelity data collected using the Student Intervention Implementation Log and informal diagnostic data may help teams answer the questions included in the question bank.
This rubric uses descriptors of the dimensions of the Taxonomy of Intervention Intensity to support teams in selecting and evaluating validated interventions for small groups or individual students. Teams may consider using data available on the National Center on Intensive Intervention Academic Tools Chart and publishers’ websites, as well as results from previous implementation efforts. Each dimension is rated on a scale from 0 (Fails to Address Standard) to 3 (Addresses Standard Well).
The purpose of this guide is to provide an overview of behavioral progress monitoring and goal setting to inform data-driven decision making within tiered support models and individualized education programs (IEPs).
The 2017 Supreme Court decision Endrew F. v. Douglas County School District highlighted the importance of monitoring students’ progress toward appropriately challenging individualized education program (IEP) annual goals and making changes to students’ educational programs when needed. In this guide, we explain how educators can establish IEP goals that are measurable, ambitious, and appropriate in light of the student’s circumstances.
In this video, Amy McKenna, a special educator in the Bristol Warren Regional School District, shares her experience with data-based individualization (DBI). Amy discusses how she learned about DBI, the impact the DBI process had on the students she worked with, and how DBI helped change her practice as a special educator.
In this article, Dr. Jennifer Ledford shares information about single-case design research and how it relates to intensive intervention, as well as resources from the Council for Exceptional Children Division for Research (CEC DR).
This tool is designed to help educators collect and graph academic progress monitoring data across multiple measures as part of the data-based individualization (DBI) process. It allows educators to store data for multiple students across multiple measures, graph student progress, and set individualized goals for a student on specific measures.
An effective and efficient data system is essential for successful implementation of a multi-tiered system of support (MTSS). Before selecting a system, however, schools and districts must identify what their staff and community need and what resources they have to support an MTSS data system. This two-step tool can help teams both identify their needs and evaluate available tools against those needs. Step 1 helps your team systematically identify and document your MTSS data system needs and current context, and Step 2 focuses on selecting and evaluating a data system for conducting screening and progress monitoring within a tiered system of support, based on the needs and context identified in Step 1.
Norms for oral reading fluency (ORF) can help educators decide which students might need intervention in reading and monitor students’ progress once instruction has begun. This paper describes the origins of the widely used curriculum-based measure of ORF and how the creation and use of ORF norms have evolved over time. Using data from three widely used, commercially available ORF assessments (DIBELS, DIBELS Next, and easyCBM), a new set of compiled ORF norms for grades 1–6 is presented here, along with an analysis of how the new norms differ from those created in 2006.