Oregon Research Institute, Eugene, United States
Erin A. Chaparro
Oregon Research Institute, Eugene, United States;
Educational and Community Supports, College of Education, University of Oregon
Caroline Dennis
Oregon Research Institute, Eugene, United States
Keith Smolkowski
Oregon Research Institute, Eugene, United States
Acknowledgement: This research was supported by Grant No. R305A150325 from the United States Department of Education, Institute of Education Sciences.
When reviewing options for new curricular and intervention programs, school district administrators may work with a committee or gather input from specialists, including special education teachers and school psychologists. Ideally, the selection of a new intervention would begin with identified objectives; objectives define the important aspects of a decision opportunity (
In this article, we use guidelines and practices from the field of economics to conduct a prospective CE analysis of a curriculum for middle school English learners (EL). A CE analysis involves estimating the costs relative to the effectiveness of alternatives in producing a common outcome and can be an important tool in choosing the alternative with the lowest cost for any given level of educational effectiveness (
The fundamental component in an economic evaluation of public programs is cost analysis, which focuses on capturing the costs entailed in implementing a program without attempting to measure or assign value to its positive impacts or benefits (
Thorough analyses of cost alone have been completed for interventions such as First Step Next, an early intervention for disruptive behavior disorder (
Cost studies regarding students with limited English proficiency status have focused on instructional approaches such as double immersion or sheltered English and on statewide funding and other policies, rather than curricular and professional development decisions (
Direct instruction is the explicit teaching of basic academic skills. This approach is used in a number of programs, including Reading Mastery (
Lack of Research on English Language Curricula
Given the range of languages spoken by students in U.S. schools, schools can teach students English, but to do so, teachers need empirically supported, easy-to-implement instructional tools or curricula. Unfortunately, few English language development (ELD) curricula have been rigorously evaluated in controlled settings and with adequate sample sizes to recommend their use in U.S. schools. Fewer still have been deemed effective for use with middle school EL students, and limited evidence is available on the optimal design and delivery of effective and efficient English language instruction (
This lack of scientifically tested curricula led to this evaluation of DISE, which includes the sequential introduction of new material and clear directions to teachers and is unique among programs on the market. DISE instruction is highly interactive, employing choral response and other effective explicit instructional delivery strategies. Teachers present clear demonstrations and provide continuous opportunities for students to practice, using images displayed on a screen, visible to all students at the same time, as the instructional stimulus. There are no student materials, only a teacher manual, which includes the student placement test and the instructional display materials. Teachers actively monitor students’ understanding and provide immediate corrective feedback to prevent students from learning misinformation or misapplying new skills. The instructional pace keeps students motivated and engaged yet allows sufficient opportunities for students to practice. With these instructional qualities, DISE was developed to teach groups of ELs who may not speak the same language.
Research strongly supports the critical role of English language proficiency and listening comprehension in the academic success of older ELs (e.g.,
High-quality professional development is essential to ensure fidelity of implementation (
Our goal is to provide a comprehensive estimate of the resources needed to implement the DISE program and to assess these program costs relative to their effectiveness, thereby offering a demonstration for leaders responsible for selecting, installing, and sustaining programs. This involves calculating the economic cost of program implementation, which includes, in addition to the budgeted expenditures, the opportunity costs associated with personnel time and the use of existing equipment, materials, and facilities, and then computing cost per unit of effectiveness. We focus on estimating the costs associated with resources needed for DISE implementation above and beyond the resources used in business-as-usual ELD instruction for EL students, and we provide estimates of the incremental CE ratio for the DISE program. Our CE estimates constitute a necessary input for curricular decisions focused on ELs, including future comparisons with similar programs, and our analysis demonstrates a method that can be applied to a range of educational interventions.
The Middle School English Learner Project randomly assigned 29 schools from Texas, Washington, and Oregon, within eight districts, to either implement DISE or conduct business as usual (BAU) in their beginner ELD classrooms (see
Schools and Teachers
All schools included students in Grades 6–8, ranging from 536 to 1,383 total students with an average class size of 12 students. Over the course of the project, 18 teachers and one instructional assistant
Professional Development and Coaching for Teachers
Teachers were trained to deliver the DISE curriculum with their newcomer (i.e., students who had recently immigrated to the United States) EL groups and received coaching from NIFDI during the 2 years of district participation in the study. Initial 2-day DISE training took place in person with the NIFDI coach, and included an orientation to the curricular approach as well as experiential practice delivering DISE lessons. After this training, the NIFDI coach conducted an on-site coaching visit and three follow-up remote coaching sessions via live videoconference; for each session, the coach observed the teacher delivering DISE and offered individualized feedback. During the second year, the coach conducted half-day refresher trainings, on-site coaching visits, and three remote coaching sessions. Teachers also submitted weekly surveys on lesson progress and were encouraged to reach out to the NIFDI coach for support with particular situations.
Curriculum
DISE has two levels designed to develop and accelerate oral ELD for non-English speaking students in Grades 4–12. Teachers are trained to administer a DISE placement test that is embedded within the teacher manual, with no additional materials needed. This study pertains only to DISE Level 1, as we targeted students just beginning to learn English. Students received 45–55 min of DISE instruction as part of their daily ELD instruction, which occurred at times that did not compromise their access to the core curriculum in other content areas.
ELD Instruction in Study Conditions
In both conditions, teachers delivered classroom instruction primarily in English to all students in the classroom. All participating schools had English as the primary language of instruction for the duration of the school day for EL students regardless of English language proficiency level. All teachers taught for their entire class period in middle schools, typically determined at the district level, so the instructional context and time were balanced across conditions. The length of ELD class periods was the same in both conditions, so students received the same amount of instruction each day. Teachers taught all students in their classrooms according to condition assignment.
The IDEA Proficiency Test (IPT II;
In
The differential response by initial skill is critical for interpreting the effects because the investigation targeted only students with beginner to early intermediate English oral language skills with DISE Level 1 lessons. We operationalized baseline performance as IPT pretreatment scores of 12 or below. One quarter of the sample scored higher than 12, and some students scored above the 50th percentile on the IPT. These students performed too highly at baseline to benefit from the DISE lessons under study and therefore obscured the evidence of impact of DISE Level 1.
To estimate effects for the beginner to early intermediate target sample, we estimated the effect size for the lower 75% of students, those who scored 12 or below on the IPT at baseline. For this range, we estimated an effect size of g = 0.19, 95% CI [0.05, 0.33] in the first intervention year. These effect sizes more accurately reflect the efficacy of the DISE lessons under investigation for the intended student sample, so we focus on them in our CE analysis.
We estimate the cost of DISE using the “ingredients method” (
We begin by identifying the resources needed for the program by the stages of implementation, shown in the first column of
We then identify the quantities required for each ingredient and the associated prices, considering both direct expenses (such as fees and the cost of materials) and opportunity costs, that is, the value of the best alternative use of a resource, or what must be given up for that particular use. For example, even when a teacher dedicates time to program activities (e.g., training) within working hours with no extra payment, the value of that time must be accounted for in cost calculations if the teacher spends it on program activities in addition to their usual duties. The value of the next best alternative for a resource is typically captured by its market price; in the example of teacher time, the teacher’s hourly salary and benefits provide an estimate of the value of the teacher’s time.
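The core of the ingredients method, summing quantity times price over every resource, with opportunity costs such as teacher time valued at market rates even when no cash changes hands, can be sketched in a few lines of Python. All quantities and prices below are hypothetical illustrations, not the study's figures.

```python
# Sketch of the ingredients method. Each ingredient carries a quantity
# and a unit price; teacher time is an opportunity cost valued at an
# assumed hourly salary-plus-benefits rate. All numbers are hypothetical.

def total_cost(ingredients):
    """Sum quantity x unit price over all ingredients."""
    return sum(item["quantity"] * item["unit_price"] for item in ingredients)

hourly_rate = 40.0  # assumed teacher salary + benefits per hour
ingredients = [
    {"name": "teacher time (2-day training)", "quantity": 16, "unit_price": hourly_rate},
    {"name": "curriculum materials",          "quantity": 1,  "unit_price": 250.0},
    {"name": "trainer fee",                   "quantity": 1,  "unit_price": 1200.0},
]

print(total_cost(ingredients))  # 16*40 + 250 + 1200 = 2090.0
```

The same structure extends naturally to the full ingredient list: adding a row per resource, with opportunity-cost items priced at their market equivalents.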
Ingredients and Prices: Costs of DISE Curriculum, Training, and Coaching
The DISE curriculum package includes all program-related material needed for both initial student assessment and instruction. The prices of the curriculum and services are the fees and charges from the DISE publisher, obtained from NIFDI (
There were several commercially published instructional tools and software programs used in the BAU classrooms, including Milestones, Language Power, Florida Center for Reading Research Student Activities, Keys to Learning, Texas Primary Reading Inventory, ESL Reading Smart, Rosetta Stone, Imagine Learning, and Read 180. Several teachers used their own materials, other books, or shorter texts. In incremental cost calculations, the cost of these instructional materials should be deducted from the cost of the DISE curriculum to reflect the costs exceeding BAU costs. Because the material used in BAU classrooms varied widely and, in some cases, was free for schools, we ignore the cost of these alternative curricula. The cost of adopting DISE would be lower than our estimate for a school replacing other costly instructional material with DISE.
Ingredients and Prices: Teachers and Other Personnel
We consider teacher time spent in 2-day in-person DISE training in the first year and a half-day refresher in the second year. For teacher time spent on coaching, we consider only the feedback portion of the coaching sessions and exclude the teacher’s time during the observation portion which is a part of the delivery of DISE instruction. We also consider information technology (IT) staff time spent on set-up and ongoing support for remote coaching sessions.
As noted earlier, the length of ELD class periods was the same in DISE and BAU classrooms, and thus teachers spent the same amount of time each day delivering lessons in each condition. Accordingly, there is no incremental cost associated with teacher time delivering DISE because the teaching of DISE material replaces teaching time in BAU. Similarly, incremental cost calculations would include the cost of any extra prep time a teacher devotes to DISE material beyond their typical prep time. The results of the teacher surveys indicate that the average prep time for DISE teachers was 28 min per day (SD = 18.2, range 15–60), whereas the average prep time for BAU teachers was 40 min per day (SD = 15.5, range 15–60). These survey results are based on a subsample of teachers and indicate that no additional teacher preparation time was needed in the treatment condition. Therefore, we argue that no additional prep time should be included in the incremental cost calculation.
Teacher and IT staff time are priced at the total of hourly wages and benefits. For teacher and IT staff wages, we used national median annual pay for “middle school teacher” and “computer support specialist,” respectively, provided in the U.S. Bureau of Labor Statistics Occupational Outlook Handbook (
Ingredients and Prices: Facility Costs
DISE training and the feedback portion of the coaching sessions take place in school spaces available for meetings. In the district-level implementation scenario, we assume that all teachers are trained at the same time in district offices. We follow the approach outlined in
Ingredients and Prices: Equipment Costs
The equipment used for DISE delivery includes a computer and a projector or a smart board. All schools are equipped with these devices, and they are dedicated to classroom instruction; therefore, there is no alternative use or reallocation from one use to another, and their costs are already incurred in the BAU scenario.
Audio–visual equipment for remote coaching includes a device with internet connection and videoconferencing capability (e.g., computer with a webcam, tablet, smart phone). All schools are equipped with an internet connection (see Footnote 2). Although most schools provide teachers with handheld devices or laptops with built-in webcams, we acknowledge that some teachers may have access only to desktop computers, in which case a webcam can be purchased to enable videoconferencing. We include the current average purchase price of a webcam. We ignore the amortization of this investment and its possible useful life beyond the 2 years of DISE activities or any potential use for non-DISE activities.
Sensitivity Analyses
As noted previously, we calculate costs per school assuming one ELD teacher per school, which is typically the case. We also provide cost estimates for a school with two ELD teachers. Additionally, we consider a district-level implementation assuming that there are five middle schools in the district and one ELD teacher per school and that all ELD teachers were trained together. We use the same effect sizes across these different implementation rollout scenarios assuming that the teacher training is equally effective regardless of whether all teachers in a district are trained together or one at a time. We note that, in the study, teachers were typically trained in groups of four to six, and thus, our effect size estimates are likely to reflect the expected effect sizes in the district-level rollout scenario.
The CE ratio is the cost divided by the effect size, measured in the same unit (e.g., per student), and represents the cost per unit of change in the outcome of interest. We calculate the CE ratio for a 1-year implementation of DISE by dividing the per-student costs by the effect sizes from
In addition to our baseline scenario of school-level implementation with one ELD teacher and 12 Level 1 students per school, we provide incremental CE ratios for other scenarios with varying numbers of Level 1 EL students (from 5 to 25), two ELD teachers, and district-level implementation with five schools. We assume that the average effect size observed in the study applies to those scenarios.
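The CE ratio arithmetic can be illustrated in a short Python sketch. The effect size and its confidence interval are the study's figures; the one-year per-student cost of $791 is an inferred value that is consistent with the reported ratios but is not stated directly in the text, so it should be treated as an assumption.

```python
# CE ratio = per-student cost / effect size. A larger effect size yields
# a lower (better) CE ratio, so the CI bounds of the effect size map to
# the opposite bounds of the CE ratio. The $791 cost is inferred, not
# reported directly.

def ce_ratio(cost_per_student, effect_size):
    return cost_per_student / effect_size

cost = 791.0                          # assumed 1-year per-student cost
g, ci_low, ci_high = 0.19, 0.05, 0.33 # Hedges' g and its 95% CI (from the study)

print(round(ce_ratio(cost, g)))        # point estimate of the CE ratio
print(round(ce_ratio(cost, ci_high)))  # lower bound of the CE ratio CI
print(round(ce_ratio(cost, ci_low)))   # upper bound of the CE ratio CI
```

Under these inputs, the three printed values reproduce the $4,163 ratio and its [$2,397, $15,820] interval reported in the Results.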
We provide the list of ingredients and their costs in the first two columns of
The per-year cost of implementing DISE for 2 years is $7,254 per school per year and $604 per student per year in a school with one ELD teacher and 12 Level 1 EL students. The per-year cost of a 2-year implementation is lower than that of a 1-year implementation because the curriculum costs are incurred in the first year and the first-year training is longer than the training in the subsequent year.
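The annualization logic can be sketched in Python. The year-by-year figures below are hypothetical, chosen only to be consistent with the reported $7,254 per-school average; they are not reported in the text.

```python
# First-year costs (curriculum purchase, 2-day training) exceed second-year
# costs (half-day refresher), so averaging over 2 years lowers the per-year
# figure. Year-level costs here are hypothetical illustrations.

year1_cost = 9492.0   # assumed first-year cost per school
year2_cost = 5016.0   # assumed second-year cost per school
students = 12

per_year_per_school = (year1_cost + year2_cost) / 2
print(per_year_per_school)                    # averaged per-school cost per year
print(int(per_year_per_school // students))   # per-student cost per year
```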
Economies of Scale
In the second column of
There are also potential savings to be realized when all middle schools in a district implement the program at the same time and all ELD teachers in the district are pooled in one training session. This is an example of economies of scale, in which relatively high fixed costs (training fees, in this case) are spread across multiple schools, resulting in declining average per-school costs. In the fourth column of
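The economies-of-scale mechanism can be sketched as follows; the fixed and variable cost figures are hypothetical, for illustration only.

```python
# Economies of scale: a fixed cost (e.g., a pooled district training fee)
# divided across n schools lowers the average per-school cost, while
# per-school variable costs (materials, coaching) stay constant.
# All numbers are hypothetical.

def per_school_cost(fixed_cost, variable_cost_per_school, n_schools):
    return fixed_cost / n_schools + variable_cost_per_school

fixed = 5000.0     # assumed one-time district training fee
variable = 2000.0  # assumed per-school materials and coaching
for n in (1, 5):
    print(n, per_school_cost(fixed, variable, n))
# average per-school cost falls from 7000.0 (n=1) to 3000.0 (n=5)
```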
We calculate incremental CE ratios of the 1-year program for the beginner to early intermediate target sample who scored 12 or lower on the IPT, Hedges’ g = 0.19, 95% CI [0.05, 0.33]. In a school with 12 beginner to early intermediate (Level 1) EL students and one ELD teacher, the incremental CE ratio is $4,163, 95% CI [$2,397, $15,820] (
What is the CE of high-quality installation and implementation of an intervention or curriculum targeted at a specific student group? In this article, we have summarized the steps involved in determining the answer. There are guidelines set forth by the Institute of Education Sciences for conducting CE analyses of educational programs and interventions (
Demands on administrators to identify and implement effective practices, protocols, and curricula are multifaceted. In their ever-expanding consultation role, school psychologists are asked to collaborate with other specialists, including ELD teachers and administrators, on intervention and curriculum implementation (
Effective tools and professional development are especially critical for ELD teachers because of the pressure from state policies for older EL students to quickly learn English and master content areas such as science and social studies. School psychologists who evaluate older students for special education eligibility, some of whom also have an EL designation, must determine whether students have had access to effective language instruction. Knowing not only that a program is effective but also that implementation included professional development and coaching can support these evaluations of access to instruction. Education researchers also need to consider the implementation costs of the interventions they develop and evaluate, as implementation costs may be prohibitive for end users even if the intervention itself is effective. For EL students specifically, few resources offer guidance on which programs are effective or their full implementation costs. There are a handful of published CE analyses of education programs, but to our knowledge, none have examined programs supporting EL students at the secondary level. For these reasons, this study offers practical importance for administrators, school psychologists, ELD teachers, and intervention researchers as a model of integrating economic and efficacy evaluations.
In addition to these contributions, some limitations should be noted. The CE of an educational program can be judged by comparing its CE ratio with the willingness to pay for the educational outcome it targets (
Although we consider costs of scenarios with varying numbers of students, teachers, and schools, we do not have sensitivity analyses accounting for every possible source of variability in program implementation or prices. The costs calculated here reflect the resource requirements observed in the context of our study sample. For example, we observed that each district had a different set of eligibility requirements for assigning students to the ELD class and used different assessment tools to identify student levels. Accordingly, we did not include costs associated with administering the IPT in our cost calculations. For schools or districts that choose to adopt the IPT as an assessment tool to select students for intervention, costs would be higher. Similarly, we assume that all teachers in the district are trained at the same time in district-level implementation scenarios. This may not be possible for very large districts, or some teachers may be absent during training; in these scenarios, the district or school would have to incur additional training fees. Another example is that teachers in the DISE and BAU conditions in our study reported similar prep times, suggesting that there was no additional prep time associated with DISE implementation. If teachers spend more prep time on DISE, the costs associated with the additional teacher prep time would have to be included in cost calculations. Overall, we provide sufficient detail on units and prices in
We take the confidence intervals for effect sizes into account when providing confidence intervals for the CE ratios but assume the same costs across schools. This implies that program implementation and the associated resource use, as well as the prices for the resources, are assumed to be fixed and homogeneous across participating schools. We are confident that this is a reasonable assumption considering that 85% of the costs are related to material, training, and coaching, with nearly no room for variation in their use or prices.
We note that we use national prices for personnel (e.g., teacher pay) and facilities, instead of actual compensation or local rates at the study sites, to improve the generalizability of our results and the usefulness of our cost estimates for a broad audience, as suggested by
Finally, our CE ratio estimate reflects the cost of per unit effect size for the beginner to early intermediate target sample. The findings might vary slightly for EL students who are intermediate and advanced and being taught from DISE Level 2 or being taught by a paraprofessional instead of an ELD teacher.
We believe that, despite these limitations, we offer a useful case study that can be replicated with other curricula in combination with professional development and coaching. Further, given the scarcity of published CE analyses on curricula and the lack of those specifically focused on supporting EL students, this study provides an exciting extension to the field’s current understanding of program implementation.
In this cost analysis, we demonstrate how to apply the ingredients method in an educational context. Specifically, we examine the costs associated with implementing a curriculum to support ELD teachers and students with an EL designation in middle schools. This CE analysis provides a framework for identifying relevant implementation costs that can support administrators and other system decision-makers, such as school psychologists, in providing input (
Arens, S. A., Stoker, G., Barker, J., Shebby, S., Wang, X., Cicchinelli, L. F., & Williams, J. M. (2012). Effects of curriculum and teacher professional development on the language proficiency of elementary English language learner students in the Central Region. Final report (NCEE 2012-4013). National Center for Education Evaluation and Regional Assistance.
August, D., & Shanahan, T. (2006). Developing literacy in second-language learners: Report of the national literacy panel on language minority children and youth. Lawrence Erlbaum.
Bahr, M. W., Leduc, J. D., Hild, M. A., Davis, S. E., Summers, J. K., & Mcneal, B. (2017). Evidence for the expanding role of consultation in the practice of school psychologists. Psychology in the Schools, 54(6), 581–595. 10.1002/pits.22020
Baker, S., Lesaux, N., Jayanthi, M., Dimino, J., Proctor, C. P., Morris, J., Gersten, R., Haymond, K., Kieffer, M. J., Linan-Thompson, S., & Newman-Gonchar, R. (2014). Teaching academic content and literacy to English learners in elementary and middle school (NCEE 2014-4012). National Center for Education Evaluation and Regional Assistance (NCEE), Institute of Education Sciences, U.S. Department of Education.
Ballard & Tighe. (2010). IPT II-Oral English technical manual (Forms E & F).
Barrett, C. A., Truckenmiller, A. J., & Eckert, T. L. (2020). Performance feedback during writing instruction: A cost-effectiveness analysis. The School Psychologist, 35(3), 193–200. 10.1037/spq0000356
Blonigen, B. A., Harbaugh, W. T., Singell, L. D., Horner, R. H., Irvin, L. K., & Smolkowski, K. S. (2008). Application of economic analysis to school-wide positive behavior support (SWPBS) programs. Journal of Positive Behavior Interventions, 10(1), 5–19. 10.1177/1098300707311366
Borman, G. D., & Hewes, G. M. (2002). The long-term effects and cost-effectiveness of success for all. Educational Evaluation and Policy Analysis, 24(4), 243–266. 10.3102/01623737024004243
Borman, G. D., Park, S. J., & Min, S. (2015). The district-wide effectiveness of the Achieve3000 program: A quasi-experimental study, (ERIC Document Reproduction Service No. ED558845).
Bowden, A. B., & Belfield, C. (2015). Evaluating the Talent Search TRIO program: A benefit-cost analysis and cost-effectiveness analysis. Journal of Benefit-Cost Analysis, 6(3), 572–602. 10.1017/bca.2015.48
Bowden, A. B., Shand, R., Belfield, C. R., Wang, A., & Levin, H. M. (2016). Evaluating educational interventions that induce service receipt: A case study application of City Connects. The American Journal of Evaluation, 38(3), 405–419. 10.1177/1098214016664983
Bureau of Labor Statistics. U.S. Department of Labor. (2019a). Occupational outlook handbook, middle school teachers. Retrieved October 01, 2018, from
Bureau of Labor Statistics. U.S. Department of Labor. (2019b). Employer costs for employee compensation—December 2017. Retrieved October 07, 2019, from
Carlo, M. S., August, D., McLaughlin, B., Snow, C. E., Dressler, C., Lippman, D., Lively, T. J., & White, C. (2004). Closing the gap: Addressing the vocabulary needs of English language learners in bilingual and mainstream classrooms. Reading Research Quarterly, 39(2), 188–215. 10.1598/RRQ.39.2.3
Catts, H. W., Adlof, S. M., & Ellis Weismer, S. (2006). Language deficits in poor comprehenders: A case for the simple view of reading. Journal of Speech, Language, and Hearing Research, 49(2), 278–293. 10.1044/1092-4388(2006/023)
Catts, H. W., Fey, M. E., Tomblin, J. B., & Zhang, X. (2002). A longitudinal investigation of reading outcomes in children with language impairments. Journal of Speech, Language, and Hearing Research, 45(6), 1142–1157. 10.1044/1092-4388(2002/093)
Chaparro, E. A., Smolkowski, K., Gunn, B., Dennis, C. & Vadasy, P. (2022). Evaluating the efficacy of an English language development program for middle school English learners. Journal of Education for Students Placed at Risk (JESPAR), 27(4), 322–352. 10.1080/10824669.2022.2045993
Cheung, A. C. K., & Slavin, R. E. (2012). Effective reading programs for Spanish-dominant English Language Learners (ELLs) in the elementary grades: A synthesis of research. Review of Educational Research, 82(4), 351–395. 10.3102/0034654312465472
Clarke, B., Cil, G., Smolkowski, K., Sutherland, M., Turtura, J., Doabler, C. T., Fien, H., & Baker, S. K. (2020). Conducting a cost-effectiveness analysis of an early numeracy intervention. School Psychology Review, 49(4), 359–373. 10.1080/2372966X.2020.1761236
Clemen, R. T., & Reilly, T. (2014). Making hard decisions with DecisionTools (3rd ed.). South-Western Cengage Learning.
Council of the Great City Schools. (2008). Raising the achievement of English language learners in the Seattle public schools: Report of the strategic support team of the Council of the Great City Schools, (ERIC Document Reproduction Service No. ED505337).
Crowley, D. M., Dodge, K. A., Barnett, W. S., Corso, P., Duffy, S., Graham, P., Greenberg, M., Haskins, R., Hill, L., Jones, D. E., Karoly, L. A., Kuklinski, M. R., & Plotnick, R. (2018). Standards of evidence for conducting and reporting economic evaluations in prevention science. Prevention Science, 19(3), 366–390. 10.1007/s11121-017-0858-1
Engelmann, S., & Bruner, E. (1983). Reading mastery I. Science Research Associates.
Engelmann, S., & Carnine, D. (2003). Connecting math concepts: Level A (Teacher’s presentation book, student material, and teacher’s guide). Science Research Associates.
Engelmann, S., & Hanner, S. (2008). Reading mastery reading strand Level K (Signature ed.) (Teacher’s presentation book, student material, literature guide, and teacher’s guide). SRA/McGraw-Hill.
Engelmann, S., Johnston, D., Engelmann, O., & Silbert, J. (2011). Direct instruction spoken English (DISE). Voyager Sopris Learning.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.
Francis, D. J., Rivera, M., Lesaux, N., Kieffer, M., & Rivera, H. (2006). Practical guidelines for the education of English language learners: Research-based recommendations for instruction and academic interventions. RMC Research Corporation, Center on Instruction.
Frey, A. J., Kuklinski, M. R., Bills, K., Small, J. W., Forness, S. R., Walker, H. M., Feil, E. G., & Seeley, J. R. (2019). Comprehensive cost analysis of First Step Next for preschoolers with disruptive behavior disorder: Using real-world intervention data to estimate costs at scale. Prevention Science, 20(8), 1219–1232. 10.1007/s11121-019-01035-z
Goldenberg, C. (2008). Teaching English language learners. What research does—And does not—Say. American Educator, 32(2), 42–44.
Gray, L., Thomas, N., & Lewis, L. (2010). Educational technology in U.S. public schools: Fall 2008 (NCES 2010-034). U.S. Department of Education, National Center for Education Statistics. U.S. Government Printing Office.
Gu, L., & Hsieh, C. N. (2019). Distinguishing features of young English language learners’ oral performance. Language Assessment Quarterly, 16(2), 180–195. 10.1080/15434303.2019.1605518
Gunn, B. (2003). Supplemental reading instruction to develop second language literacy. In E.Durán (Ed.), Systematic instruction in reading for Spanish speaking students (pp. 164–180). Charles. C. Thomas.
Gunn, B., Smolkowski, K., Biglan, A., Black, C., & Blair, J. (2005). Fostering the development of reading skill through supplemental instruction: Results for Hispanic and non-Hispanic students. Journal of Special Education, 39(2), 66–85. 10.1177/00224669050390020301
Hammond, J. S., Keeney, R. L., & Raiffa, H. (1999). Smart choices: A practical guide to making better decisions. Harvard Business School Press.
Hollands, F., Bowden, A., Belfield, C., Levin, H., Cheng, H., Shand, R., Pan, Y., & Hanisch-Cerda, B. (2014). Cost-effectiveness analysis in practice: Interventions to improve high school completion. Educational Evaluation and Policy Analysis, 36(3), 307–326. 10.3102/0162373713511850
Hollands, F., Pratt-Williams, J., & Shand, R. (2021). Cost analysis standards and guidelines 1.1. Cost analysis in practice (CAP) project.
Hollands, F. M., Hanisch-Cerda, B., Levin, H. M., Belfield, C. R., Menon, A., Shand, R., Pan, Y., Bakir, I., & Cheng, H. (2015). CostOut—the CBCSE cost tool kit. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University.
Hollands, F. M., Kieffer, M. J., Shand, R., Pan, Y., Cheng, H., & Levin, H. M. (2016). Cost-effectiveness analysis of early reading programs: A demonstration with recommendations for future research. Journal of Research on Educational Effectiveness, 9(1), 30–53. 10.1080/19345747.2015.1055639
Hollands, F. M., & Levin, H. M. (2017). The critical importance of costs for education decisions (REL 2017–274). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Analytic Technical Assistance and Development.
Hunter, L. J., DiPerna, J. C., Hart, S. C., & Crowley, M. (2018). At what cost? Examining the cost effectiveness of a universal social–emotional learning program. School Psychology Quarterly, 33(1), 147–154. 10.1037/spq0000232
Husereau, D., Drummond, M., Petrou, S., Carswell, C., Moher, D., Greenberg, D., Augustovski, F., Briggs, A. H., Mauskopf, J., Loder, E., & the CHEERS Task Force. (2013). Consolidated health economic evaluation reporting standards (CHEERS) statement. International Journal of Technology Assessment in Health Care, 29(2), 117–122. 10.1017/S0266462313000160
Hwang, J. K., Mancilla-Martinez, J., McClain, J. B., Oh, M. H., & Flores, I. (2020). Spanish-speaking English learners’ English language and literacy skills: The predictive role of conceptually-scored vocabulary. Applied Psycholinguistics, 41(1), 1–24. 10.1017/S0142716419000365
Institute of Education Sciences. (2020). Cost analysis: A starter kit (Version 1.1). U.S. Department of Education. Retrieved November 10, 2020, from
Jimenez-Castellanos, O., & Topper, A. M. (2012). The cost of providing an adequate education to English language learners: A review of the literature. Review of Educational Research, 82(2), 179–232. 10.3102/0034654312449872
Kretlow, A., & Helf, S. (2013). Teacher implementation of evidence-based practices in Tier 1: A national survey. Teacher Education and Special Education, 36(3), 167–185. 10.1177/0888406413489838
Levin, H. M., & Belfield, C. (2015). Guiding the development and use of cost-effectiveness analysis in education. Journal of Research on Educational Effectiveness, 8(3), 400–418. 10.1080/19345747.2014.915604
Levin, H. M., & McEwan, P. J. (2001). Cost-effectiveness analysis: Methods and applications (2nd ed.). SAGE Publications.
Levin, H. M., McEwan, P. J., Belfield, C., Bowden, A. B., & Shand, R. (2018). Economic evaluation in education: Cost-effectiveness and benefit-cost analysis. SAGE Publications. 10.4135/9781483396514
Long, K., Brown, J. L., Jones, S. M., Aber, J. L., & Yates, B. T. (2015). Cost analysis of a school-based social and emotional learning and literacy intervention. Journal of Benefit-Cost Analysis, 6(3), 545–571. 10.1017/bca.2015.6
Mathes, P. G., Pollard-Durodola, S. D., Cárdenas-Hagan, E., Linan-Thompson, S., & Vaughn, S. (2007). Teaching struggling readers who are native Spanish speakers: What do we know? Language, Speech, and Hearing Services in Schools, 38(3), 260–271. 10.1044/0161-1461(2007/027)
Morrison, J. Q., Hawkins, R. O., & Collins, T. A. (2020). Evaluating the cost-effectiveness of the Dyslexia Pilot Project: A multitiered system of supports for early literacy. Psychology in the Schools, 57(4), 522–539. 10.1002/pits.22343
Odom, S. L. (2008). The tie that binds: Evidence-based practice, implementation science, and early intervention. Topics in Early Childhood Special Education, 29(1), 53–61. 10.1177/0271121408329171
Parrish, T. B. (1994). A cost analysis of alternative instructional models for limited English proficient students in California. Journal of Education Finance, 19(3), 256–278.
Pashler, H., Bain, P., Bottge, B., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing instruction and study to improve student learning. A practice guide (NCER 2007-2004). National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.
Reynolds, A. J., & Temple, J. A. (2008). Cost-effective early childhood development programs from preschool to third grade. Annual Review of Clinical Psychology, 4(1), 109–139. 10.1146/annurev.clinpsy.3.022806.091411
Rivera, M. O., Francis, D. J., Fernandez, M., Moughamian, A. C., Jergensen, J., & Lesaux, N. K. (2010). Effective practices for English language learners: Principals from five states speak. Center on Instruction, RMC Research Corporation. (ERIC Document Reproduction Service No. ED517795).
Saunders, W., Goldenberg, C., & Marcelletti, D. (2013). English language development: Guidelines for instruction. American Educator, 37(2), 13.
Stockard, J. (2011). Direct Instruction and first grade reading achievement: The role of technical support and time of implementation. Journal of Direct Instruction, 11, 31–50.
Turner, A. J., Sutton, M., Harrison, M., Hennessey, A., & Humphrey, N. (2020). Cost-effectiveness of a school-based social and emotional learning intervention: Evidence from a cluster-randomised controlled trial of the promoting alternative thinking strategies curriculum. Applied Health Economics and Health Policy, 18(2), 271–285. 10.1007/s40258-019-00498-z
Uccelli, P., Galloway, E. P., Barr, C. D., Meneses, A., & Dobbs, C. L. (2015). Beyond vocabulary: Exploring cross-disciplinary academic-language proficiency and its association with reading comprehension. Reading Research Quarterly, 50(3), 337–356. 10.1002/rrq.104
Uccelli, P., & Phillips Galloway, E. (2017). Academic language across content areas: Lessons from an innovative assessment and from students’ reflections about language. Journal of Adolescent & Adult Literacy, 60(4), 395–404. 10.1002/jaal.553
Vuchinich, S., Flay, B. R., Aber, L., & Bickman, L. (2012). Person mobility in the design and analysis of cluster-randomized cohort prevention trials. Prevention Science, 13, 300–313. 10.1007/s11121-011-0265-y
What Works Clearinghouse. (2006). Reading Mastery
What Works Clearinghouse. (2012). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
WIDA Consortium. (2007). ACCESS for ELLs: Scores, reliability and validity [Paper presentation]. ISBE Meeting, Chicago, IL.
Wood, C. L., Goodnight, C. I., Bethune, K. S., Preston, A. I., & Cleaver, S. L. (2016). Role of professional development and multi-level coaching in promoting evidence-based practice in education. Learning Disabilities, 14(2), 159–170.
Submitted: March 4, 2021; Revised: June 21, 2022; Accepted: July 12, 2022