This question bank includes questions that teams can use to develop a hypothesis about why an individual or group of students may not be responding to an intervention. The hypothesis should help guide intervention planning and selection of intensification strategies using the Intervention Intensification Strategy Checklist. When developing a hypothesis, teams should consider the intervention design, fidelity of implementation, and learner needs. Intervention fidelity data collected using the Student Intervention Implementation Log and informal diagnostic data may help teams answer the questions included in the question bank.
The purpose of this guide is to provide an overview of behavioral progress monitoring and goal setting to inform data-driven decision making within tiered support models and individualized education programs (IEPs).
This tool is designed to help educators collect and graph academic progress monitoring data across multiple measures as a part of the data-based individualization (DBI) process. The tool allows educators to store data for multiple students across multiple measures, graph student progress, and set individualized goals for a student on specific measures.
In this video, Dr. Joe Wehby, Senior Advisor to the National Center on Intensive Intervention and Associate Professor in the Vanderbilt University Department of Special Education, discusses the number of data points needed to make decisions for students with intensive behavior needs.
In this video, Dr. Lynn Fuchs, Nicholas Hobbs Professor of Special Education and Human Development at Vanderbilt University and Senior Advisor to the National Center on Intensive Intervention, shares considerations for adapting interventions when the validated intervention program wasn’t successful.
In this video, Dr. Devin Kearns, an Assistant Professor of Special Education in the Department of Educational Psychology at the Neag School of Education at the University of Connecticut and NCII Trainer & Coach, discusses the importance of consistency when selecting, administering, and scoring progress monitoring tools.
How do you know if an intervention, program, or practice is likely to be effective with a particular subgroup of students? What resources are there to help school, district, and State leaders identify and select evidence-based practices (EBPs)? EBPs play an increasingly prominent role in Federal education policy. In both State Systemic Improvement Plans (SSIPs) and provisions in the Every Student Succeeds Act (ESSA), States are being asked to implement practices and programs that have evidence of effectiveness.
An effective and efficient data system is essential for successful implementation of a multi-tiered system of support (MTSS). However, prior to selecting an appropriate system, schools and districts must identify what their staff and community need and what resources they have to support an MTSS data system. This two-step tool can help teams consider what their needs are and evaluate available tools against those needs. Step 1 helps your team systematically identify and document your MTSS data system needs and current context, and Step 2 focuses on selecting and evaluating a data system for conducting screening and progress monitoring within a tiered system of support, based on the needs and context identified in Step 1.
Norms for oral reading fluency (ORF) can be used to help educators make decisions about which students might need intervention in reading and to help monitor students’ progress once instruction has begun. This paper describes the origins of the widely used curriculum-based measure of ORF and how the creation and use of ORF norms have evolved over time. Using data from three widely used, commercially available ORF assessments (DIBELS, DIBELS Next, and easyCBM), a new set of compiled ORF norms for grades 1–6 is presented here, along with an analysis of how they differ from the norms created in 2006.