The Cost-Effectiveness of an English Language Curriculum for Middle School English Learners

By: Gulcan Cil, Oregon Research Institute, Eugene, United States
Erin A. Chaparro, Oregon Research Institute, Eugene, United States; Educational and Community Supports, College of Education, University of Oregon
Caroline Dennis, Oregon Research Institute, Eugene, United States
Keith Smolkowski, Oregon Research Institute, Eugene, United States

Acknowledgement: This research was supported by Grant No. R305A150325 from the United States Department of Education, Institute of Education Sciences.

When reviewing options for new curricular and intervention programs, school district administrators may work with a committee or gather input from specialists, including special education teachers and school psychologists. Ideally, the selection of a new intervention would begin with identified objectives; objectives define the important aspects of a decision opportunity (Clemen & Reilly, 2014; Hammond et al., 1999). For intervention adoption, these can include maximizing student learning and minimizing costs, or identifying the most cost-effective approach. Frequently, special education teachers or school psychologists must teach themselves new interventions (Kretlow & Helf, 2013), so a careful consideration of objectives should include the introduction, training, and coaching required to implement evidence-based programs effectively (Wood et al., 2016). Implementation may also entail other considerations, such as the facilities for training and implementation. The federal government has started to encourage education administrators, interventionists, and researchers to consider a more thorough approach to evaluating the cost and cost-effectiveness (CE) of implementing a curriculum (Hollands & Levin, 2017; Levin & Belfield, 2015).

In this article, we use guidelines and practices from the field of economics to conduct a prospective CE analysis of a curriculum for middle school English learners (EL). A CE analysis involves estimating the costs relative to the effectiveness of alternatives in producing a common outcome and can be an important tool in choosing the alternative with the lowest cost for any given level of educational effectiveness (Levin & McEwan, 2001). Our intention is to provide a model demonstration of a CE analysis that could assist educators in the selection and implementation of other evidence-based practices intended to improve outcomes and sustain those outcomes when the systemic structures are fully funded.

Overview of CE Analysis

The fundamental component in an economic evaluation of public programs is cost analysis, which focuses on capturing the costs entailed in implementing a program without attempting to measure or assign value to its positive impact or benefits (Blonigen et al., 2008; Hollands & Levin, 2017; Levin et al., 2018). Cost-effectiveness analysis compares the costs of similar programs to achieve the same unit of change in the target outcome. Researchers often characterize the effectiveness of a program in terms of a standardized effect size, such as the change in standard deviation units. The CE ratio (cost/effect size) describes the CE of a program, where the costs come from the cost analysis.
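In symbols (our notation, not the authors'), for a program with incremental cost C per student and effect size E in standard deviation units, the ratio is simply:

```latex
\text{CE ratio} = \frac{C}{E}
```

A lower ratio indicates a more cost-effective program for achieving the same outcome.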

Thorough analyses of cost alone have been completed for interventions such as First Step Next, an early intervention for disruptive behavior disorder (Frey et al., 2019), a social–emotional learning and literacy intervention (Long et al., 2015), and school-wide positive behavior supports (Blonigen et al., 2008), as well as for comparisons of instructional models for elementary school ELs (Parrish, 1994). There have been rigorous applications of CE analysis for a range of educational interventions from early childhood (Borman & Hewes, 2002; Reynolds & Temple, 2008) to elementary (Barrett et al., 2020; Bowden et al., 2016; Clarke et al., 2020; Hollands et al., 2016; Hunter et al., 2018; Morrison et al., 2020; Turner et al., 2020) and secondary school students (Bowden & Belfield, 2015; Hollands et al., 2014). While there are various precedents for economic evaluations in education, few have focused specifically on students eligible for limited English proficiency status, whom schools often call EL or emergent bilingual students.

Cost studies regarding students with limited English proficiency status have focused on instructional approaches such as double immersion or sheltered English and on statewide funding and other policies, rather than curricular and professional development decisions (Jimenez-Castellanos & Topper, 2012; Parrish, 1994). To our knowledge, no cost or CE analysis has been published for a program focused on middle school ELs in particular. The Middle School English Learner Project was a 5-year evaluation of the efficacy of the Direct Instruction Spoken English (DISE; Engelmann et al., 2011) curriculum among middle school EL students. The Middle School English Learner Project provides a demonstration of a high-quality implementation of DISE as well as an opportunity to document and analyze the costs of that implementation.

Overview of Direct Instruction

Direct instruction is the explicit teaching of basic academic skills. This approach is used in a number of programs, including Reading Mastery (Engelmann & Bruner, 1983; Engelmann & Hanner, 2008), Connecting Math Concepts (Engelmann & Carnine, 2003), and DISE (Engelmann et al., 2011). Direct instruction programs are currently used with over 1 million students in one third of the elementary schools in the United States (What Works Clearinghouse, 2006). Research supports the effectiveness of direct instruction for English monolinguals, and it is also recommended for ELs across a range of policy and research reports (Carlo et al., 2004; Council of the Great City Schools, 2008; Francis et al., 2006; Pashler et al., 2007; Rivera et al., 2010). Direct instruction has been shown to be an important characteristic of reading programs and interventions for EL students in general (August & Shanahan, 2006; Cheung & Slavin, 2012; Goldenberg, 2008; Gunn, 2003; Gunn et al., 2005; Mathes et al., 2007). For this reason, DISE was chosen as a program that had not yet been researched but had great potential to be effective with EL newcomers in middle schools. There are a limited number of English oral language teacher-led curricula that focus exclusively on EL students.

Lack of Research on English Language Curricula

Given the range of languages spoken by students in U.S. schools, schools can teach students English, but to do so, teachers need empirically supported, easy-to-implement instructional tools or curricula. Unfortunately, few English language development (ELD) curricula have been rigorously evaluated under controlled settings and with adequate sample sizes to recommend their use in U.S. schools. Fewer have been deemed effective for use with middle school EL students, and limited evidence is available on the optimal design and delivery of effective and efficient English language instruction (Arens et al., 2012; Saunders et al., 2013). The What Works Clearinghouse (2012), for example, identified only a few programs that met its evidence standards with potentially positive or positive results for EL students in 6th through 8th grade. Borman et al. (2015) reported on Achieve3000, which met What Works Clearinghouse standards with reservations because the evaluation used a quasi-experimental design embedded within one district. This program, along with others reviewed by the What Works Clearinghouse, is not a comprehensive, teacher-led instructional curriculum. Although we have learned a great deal about the constructs of language and literacy for emerging bilingual students, few of those studies include students who have nascent levels of English proficiency, and few focus specifically on teaching oral ELD.

DISE Curriculum and Professional Development/Coaching Model

This lack of scientifically tested curricula led to this evaluation of DISE, which includes the sequential introduction of new material and clear directions to teachers and is unique compared to other programs on the market. DISE instruction is highly interactive, employing choral response and other effective, explicit instructional delivery strategies. Teachers present clear demonstrations and provide continuous opportunities for students to practice, using images displayed on a screen for all students to view at the same time as the instructional stimulus. There are no student materials: only a teacher manual, which includes the student placement test, and the instructional display materials. Teachers actively monitor students’ understanding and provide immediate corrective feedback to prevent students from learning misinformation or misapplying new skills. The instructional pace keeps students motivated and engaged yet allows sufficient opportunities for students to practice. With these instructional qualities, DISE was developed to teach groups of ELs who may not speak the same language.

Research strongly supports the critical role of English language proficiency and listening comprehension in the academic success of older ELs (e.g., Catts et al., 2002, 2006; Hwang et al., 2020; Uccelli et al., 2015). For EL students, there is a relationship between academic language and reading comprehension, with speaking proficiency providing an important step in the language development of EL students (Gu & Hsieh, 2019; Uccelli & Phillips Galloway, 2017). The breadth of research focused on academic language and speaking proficiency contributes to the recommendation that academic vocabulary should be taught explicitly to EL students in middle schools (Baker et al., 2014). Before students are taught more advanced academic vocabulary, they must be taught more basic academic language, which is an important component of the DISE program. The content of DISE aligns with and builds upon the research on language development by teaching the specific aspects of English oral language that ELs need for adequate speaking and listening skills. For example, students learn and practice the application of rules of grammar (e.g., number, tense) in varied sentence types.

High-quality professional development is essential to ensure fidelity of implementation (Fixsen et al., 2005; Odom, 2008). The DISE teachers in the Middle School English Learner Project received training and coaching from the National Institute for Direct Instruction (NIFDI), the developers of DISE and primary implementation-support providers. NIFDI has demonstrated that schools receiving its support significantly outperform schools not receiving such support when both use the same instructional curricula (Stockard, 2011). NIFDI offers training to any school; however, the training is neither required nor provided when a school or district purchases the curriculum as a stand-alone product. The Middle School English Learner Project tested the DISE curriculum with NIFDI training and coaching for teachers, so the results reported are not generalizable to schools or districts that purchase the curriculum only and do not receive teacher training or coaching from the curriculum developers, NIFDI.

Goals of DISE CE Analysis

Our goal is to provide a comprehensive estimate of the resources needed to implement the DISE program and to assess these program costs relative to their effectiveness in order to provide a demonstration for leaders in the position of selecting, installing, and sustaining programs. This involves calculating the economic cost of the program implementation, which includes, in addition to the budgeted expenditures, opportunity costs associated with personnel time and the use of existing equipment, materials, and facilities, and then computing cost per unit of effectiveness. We focus on estimating the costs associated with resources needed for DISE implementation above and beyond the resources used in business-as-usual ELD instruction for EL students and provide estimates of the incremental CE ratio for the DISE program. Our CE estimates constitute a necessary input for curricular decisions focused on ELs, including future comparisons with similar programs, and our analysis demonstrates a method that can be applied to a range of educational interventions.

Method

The Middle School English Learner Project randomly assigned 29 schools from Texas, Washington, and Oregon, within eight districts, to either implement DISE or conduct business as usual (BAU) in their beginner ELD classrooms (see Table 1). Schools were recruited based on their district-reported numbers of EL students served. All participating schools had sheltered English instruction programs where instruction was delivered in English. The participating ELD teacher at schools in the DISE condition received training from a certified DISE trainer on how to teach the program with fidelity. Teachers in the BAU condition used the ELD curriculum approved by their district and adopted by their school. We tested all 6th- and 7th-grade students in the participating teachers’ beginner ELD classrooms in the fall and spring of Years 1 and 2. See Chaparro et al. (2022) for a full description of the project and its efficacy results.
[Table 1]

Participants

Schools and Teachers

All schools served students in Grades 6–8 and ranged from 536 to 1,383 total students, with an average class size of 12 students. Over the course of the project, 18 teachers and one instructional assistant delivered DISE in their EL classrooms. Teachers had an average of 12 years of teaching experience and 8 years of experience teaching English as a second language (ESL). Three of the teachers had master’s degrees, and the rest had bachelor’s degrees.

Professional Development and Coaching for Teachers

Teachers were trained to deliver the DISE curriculum with their newcomer EL groups (i.e., students who had recently immigrated to the United States) and received coaching from NIFDI during the 2 years of district participation in the study. The initial 2-day DISE training took place in person with the NIFDI coach and included an orientation to the curricular approach as well as experiential practice delivering DISE lessons. After this training, the NIFDI coach conducted an on-site coaching visit and three follow-up remote coaching sessions via live videoconference; for each session, the coach observed the teacher delivering DISE and offered individualized feedback. During the second year, the coach conducted half-day refresher trainings, on-site coaching visits, and three remote coaching sessions. Teachers also submitted weekly surveys on lesson progress and were encouraged to reach out to the NIFDI coach for support with particular situations.

Curriculum

DISE has two levels designed to develop and accelerate oral ELD for non-English-speaking students in Grades 4–12. Teachers are trained to administer a DISE placement test that is embedded within the teacher manual, with no additional materials needed. This study pertains only to DISE Level 1, as we targeted students just beginning to learn English. Students received 45–55 min of DISE instruction as part of their daily ELD instruction, which occurred at times that did not compromise their access to the core curriculum in other content areas.

ELD Instruction in Study Conditions

In both conditions, teachers delivered classroom instruction primarily in English to all students in the classroom. All participating schools had English as the primary language of instruction for the duration of the school day for EL students regardless of English language proficiency level. All teachers taught for their entire class period, the length of which was typically determined at the district level, so the instructional context and time were balanced across conditions. The length of ELD class periods was the same in both conditions, so students received the same amount of instruction each day. Teachers taught all students in their classrooms according to condition assignment.

Outcome Measure

The IDEA Proficiency Test (IPT II; Ballard & Tighe, 2010) was the primary measure of oral English language for the Middle School English Learner Project. This measure was collected for research purposes only and is not a required component of the DISE curriculum; thus, the costs associated with its administration were not included in our cost calculations. We administered the test at the beginning of the school year to establish each student’s baseline English oral language proficiency and at the end of the school year to determine language proficiency growth. Internal consistency reliability is .90, and test–retest reliability is .95 (Ballard & Tighe, 2010). The IPT correlated .60–.65 with the listening, speaking, reading, and writing scores of the WIDA ACCESS for English language learners (WIDA Consortium, 2007). All assessors were paid research assistants who did not know the assigned condition of each school. They received training before each data collection timepoint. During both initial and refresher trainings, each assessor established reliability above 80% agreement with the trainer on each test, and newer assessors verified their reliability through shadow scoring with experienced assessors on the first day of each assessment period.

Source of Effect Sizes

In Chaparro et al. (2022), we estimated an overall 1-year effect size of g = 0.05, 95% CI [−0.12, 0.23] based on a cluster-focused, intent-to-treat analysis approach (Vuchinich et al., 2012). As reported in Chaparro et al. (2022), however, due to district protocols, study classrooms included students outside of our targeted language proficiency range. The study and initial lessons of the curriculum are designed for students with little to no English proficiency, but sometimes schools have limitations that require them to place beginner and intermediate EL students in the same instructional grouping. For this reason, about 25% of students had baseline IPT scores much higher than expected (i.e., in the intermediate to advanced range) for the level of English language instruction, while 75% of students scored in the targeted beginner to early intermediate range. We, therefore, tested and found that initial scores on English oral language moderated student gains. Students who began with less proficiency in English oral language benefited more from DISE.

The differential response by initial skill is critical for the interpretation of the effects because the investigation targeted the DISE Level 1 lessons at students with beginner to early intermediate English oral language skills. We operationalized baseline performance as IPT pretreatment scores of 12 or below. One quarter of the sample scored higher than 12, and some students scored above the 50th percentile on the IPT. These students functioned too highly at baseline to benefit from the DISE lessons under study and therefore obscured the evidence of impact of DISE Level 1.

To estimate effects for the beginner to early intermediate target sample, we estimated the effect size for the lower 75% of students who at baseline scored a 12 on the IPT or below. For this range, we estimated an effect size of g = 0.19, 95% CI [0.05, 0.33] in the first intervention year. These effect sizes more accurately reflect the efficacy of the DISE lessons under investigation in this project for the intended student sample, so we focus on these effect sizes in our CE analysis.

Estimating the Cost of DISE Implementation

We estimate the cost of DISE using the “ingredients method” (Levin et al., 2018), focusing on the 14 of the 29 schools that were randomized to deliver DISE and using information collected concurrently with the efficacy trial. We use national prices and estimate the costs from a societal perspective, including all program-related costs regardless of who pays for them. We calculate the incremental cost of DISE implementation relative to BAU, considering the additional resource requirements associated with using DISE among middle school EL students compared to teaching the same students without the DISE curriculum. We exclude the costs associated with developing the program and present costs for 1 year of implementation as well as 2 years of implementation by adding first-year costs and discounted second-year costs. We first estimate costs at the school level, as the school is the unit of implementation in the DISE program, and then convert per-school costs into per-student costs, to be used in CE ratio calculations, by dividing the per-school costs by the number of students. We assume 12 beginner to early intermediate (Level 1) EL students per school based on the average number of EL students per school in our sample. We provide cost estimates for other scenarios with 5–25 students per school.
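As a minimal sketch of the two bookkeeping steps just described, converting per-school costs to per-student costs and combining a first-year cost with a discounted second-year cost, the following Python fragment uses placeholder figures rather than the reported values; the 3% discount rate is our assumption, borrowed from the rate the article applies when annualizing facility costs.

```python
def per_student_cost(per_school_cost: float, n_students: int) -> float:
    """Divide a per-school cost by the number of Level 1 EL students served."""
    return per_school_cost / n_students


def two_year_cost(year1_cost: float, year2_cost: float, rate: float = 0.03) -> float:
    """First-year cost plus the discounted second-year cost."""
    return year1_cost + year2_cost / (1.0 + rate)


# Hypothetical illustration only (placeholder figures, not reported values).
school_cost = 10_000.0
print(per_student_cost(school_cost, 12))    # about 833 per student
print(two_year_cost(school_cost, 5_000.0))  # about 14,854 over two years
```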

We begin by identifying the resources needed for the program by the stages of implementation, shown in the first column of Table 2. We rely on invoices (for fees and other charges), teacher surveys (for teacher time), and field logs and experience of DISE staff and researchers.
[Table 2]

We then identify the quantities required for each ingredient and the associated prices, considering both the direct expenses (such as fees and the cost of materials) and opportunity costs: the value of the best alternative use of the resource, or the value of what must be given up for that particular use of the resource. For example, even when a teacher dedicates time to program activities (e.g., training) within working hours with no extra payment, the value of teacher time must be accounted for in cost calculations if the teacher spends this time on program activities in addition to time spent on their usual duties. The value of the next best alternative for a resource is typically captured by its market price. In the example of teacher time, the teacher’s hourly salary and benefits provide an estimate of the value of that time.

Ingredients and Prices: Costs of DISE Curriculum, Training, and Coaching

The DISE curriculum package includes all program-related material needed for both initial student assessment and instruction. The prices of the curriculum and services are the fees and charges from the DISE publisher. These prices were obtained from NIFDI (https://www.nifdi.org/) and records kept by the research team. The training and coaching fees included in our cost calculations cover the trainer/coach time spent providing training and coaching as well as their travel and lodging expenses, and are expected to be incurred by schools or school districts.

There were several commercially published instructional tools and software programs used in the BAU classrooms, including Milestones, Language Power, Florida Center for Reading Research Student Activities, Keys to Learning, Texas Primary Reading Inventory, ESL Reading Smart, Rosetta Stone, Imagine Learning, and Read 180. Several teachers used their own materials, other books, or shorter texts. In incremental cost calculations, the cost of these instructional materials should be deducted from the cost of the DISE curriculum to reflect the costs exceeding BAU costs. Because the material used in BAU classrooms varied widely and, in some cases, was free for schools, we ignore the cost of these alternative curricula. The cost of adopting DISE would be lower than our estimate for a school replacing other costly instructional material with DISE.

Ingredients and Prices: Teachers and Other Personnel

We consider teacher time spent in the 2-day in-person DISE training in the first year and the half-day refresher in the second year. For teacher time spent on coaching, we consider only the feedback portion of the coaching sessions and exclude the teacher’s time during the observation portion, which is part of the delivery of DISE instruction. We also consider information technology (IT) staff time spent on set-up and ongoing support for remote coaching sessions.

As noted earlier, the length of ELD class periods was the same in DISE and BAU classrooms, and thus, the teachers spent the same amount of time each day delivering lessons in each condition. Accordingly, there is no incremental cost associated with teacher time delivering DISE because the teaching of DISE material replaces teaching time in BAU. Similarly, incremental cost calculations would include the cost of any extra prep time a teacher devotes to DISE material beyond their typical prep time. The results of the teacher surveys indicate that the average prep time for DISE teachers is 28 min per day (SD = 18.2, range 15–60), whereas the average prep time for BAU teachers is 40 min per day (SD = 15.5, range 15–60). These survey results are based on a subsample of teachers and are provided to show that no additional teacher preparation time was needed in the treatment condition. Therefore, we argue that no additional prep time should be included in the incremental cost calculation.

Teacher and IT staff time are priced at the total of hourly wages and benefits. For teacher and IT staff wages, we used national median annual pay for “middle school teacher” and “computer support specialist,” respectively, provided in the U.S. Bureau of Labor Statistics Occupational Outlook Handbook (Bureau of Labor Statistics. U.S. Department of Labor, 2019a). We converted annual salaries to hourly wages assuming 1,440 work hours per year for teachers, as suggested in the Institute of Education Sciences cost analysis guidelines (Institute of Education Sciences, 2020), and 2,080 work hours per year for other staff. The benefit rates are from the Bureau of Labor Statistics Employer Costs for Employee Compensation (Bureau of Labor Statistics. U.S. Department of Labor, 2019b) and represent 33% of total compensation.
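One way to apply these pricing rules, sketched below with hypothetical salary figures rather than the BLS medians used in the article, is to convert the annual salary to an hourly wage and then gross it up so that benefits make up 33% of total compensation.

```python
def hourly_price(annual_salary: float, hours_per_year: float,
                 benefit_share: float = 0.33) -> float:
    """Hourly wage grossed up so benefits equal benefit_share of total compensation."""
    hourly_wage = annual_salary / hours_per_year
    return hourly_wage / (1.0 - benefit_share)


teacher_rate = hourly_price(60_000, hours_per_year=1_440)  # placeholder salary
it_rate = hourly_price(55_000, hours_per_year=2_080)       # placeholder salary
print(round(teacher_rate, 2), round(it_rate, 2))           # 62.19 39.47
```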

Ingredients and Prices: Facility Costs

DISE training and the feedback portion of the coaching sessions take place in school spaces available for meetings. In the district-level implementation scenario, we assume that all teachers are trained at the same time in district offices. We follow the approach outlined in Levin et al. (2018) to calculate the costs associated with the use of these facilities. First, we use the median construction costs of a middle school building and an office building (to proxy for district offices) suggested by CostOut (Hollands et al., 2015). These construction costs are uprated by 21% to include furniture, furnishing, fees, and site preparation. We then annualize these building costs over 30 years using the conventional 3% interest rate to obtain the cost per square foot of space per year. Annual use is assumed to be 1,440 hr (the duration of the academic year) for the school building and 2,080 hr for the district offices. The space needed is assumed to be 500 square feet for training and coaching. We also maintain that there are no additional facility-use costs associated with DISE delivery and the coaching observations, as both take place during instruction time in classrooms otherwise used for BAU instruction of the same subjects and thus are not relevant for incremental cost calculations.
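A minimal sketch of this facility calculation, assuming the standard annuity factor described in Levin et al. (2018) and a placeholder construction cost per square foot (the 21% uprate, 30-year life, 3% rate, 1,440 annual use hours, and 500 square feet come from the text):

```python
def annualized_cost(total_cost: float, years: int = 30, rate: float = 0.03) -> float:
    """Spread a capital cost over its useful life with the standard annuity factor."""
    factor = rate / (1.0 - (1.0 + rate) ** -years)
    return total_cost * factor


def facility_cost(cost_per_sqft: float, sqft: float, hours_used: float,
                  annual_hours: float = 1_440) -> float:
    """Cost of occupying sqft square feet for hours_used hours of the year."""
    uprated = cost_per_sqft * 1.21                  # furniture, furnishing, fees, site prep
    per_sqft_hour = annualized_cost(uprated) / annual_hours
    return per_sqft_hour * sqft * hours_used


# Hypothetical example: $250/sq ft construction cost, 500 sq ft, 16 hours of training.
print(round(facility_cost(250.0, 500, 16), 2))      # roughly 86 dollars
```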

Ingredients and Prices: Equipment Costs

The equipment used for DISE delivery includes a computer and a projector or a smart board. All schools are equipped with these devices, and they are dedicated to classroom instruction; therefore, there is no alternative use or reallocation from one use to another, and these costs are already incurred in the BAU scenario.

Audio–visual equipment for remote coaching includes a device with internet connection and videoconferencing capability (e.g., computer with a webcam, tablet, smart phone). All schools are equipped with an internet connection (see Footnote 2). Although most schools provide teachers with handheld devices or laptops with built-in webcams, we acknowledge that some teachers may have access only to desktop computers, in which case a webcam can be purchased to enable videoconferencing. We include the current average purchase price of a webcam. We ignore the amortization of this investment and its possible useful life beyond the 2 years of DISE activities or any potential use for non-DISE activities.

Sensitivity Analyses

As noted previously, we calculate costs per school assuming one ELD teacher per school, which is typically the case. We also provide cost estimates for a school with two ELD teachers. Additionally, we consider a district-level implementation assuming that there are five middle schools in the district and one ELD teacher per school and that all ELD teachers were trained together. We use the same effect sizes across these different implementation rollout scenarios assuming that the teacher training is equally effective regardless of whether all teachers in a district are trained together or one at a time. We note that, in the study, teachers were typically trained in groups of four to six, and thus, our effect size estimates are likely to reflect the expected effect sizes in the district-level rollout scenario.

CE Analysis

The CE ratio is the cost divided by the effect size, both measured at the same unit of analysis (e.g., per student), and represents the cost associated with a one-unit change in the outcome of interest. We calculate the CE ratio for a 1-year implementation of DISE by dividing the per-student costs by the effect sizes from Chaparro et al. (2022; see above for the adjusted effect size for the target sample). The CE ratios we report are incremental CE ratios, as we consider resource use beyond the resources used in the BAU condition, and the effect sizes reflect the improvement in outcomes relative to the counterfactual. Our incremental CE ratio estimate reflects the cost per unit of effect size for the beginner to early intermediate target sample who scored a 12 on the IPT or below.

In addition to our baseline scenario of school-level implementation with one ELD teacher and 12 Level 1 students per school, we provide incremental CE ratios for other scenarios with varying numbers of Level 1 EL students (from 5 to 25), two ELD teachers, and district-level implementation with five schools. We assume that the average effect size observed in the study applies to those scenarios.

Results
Cost of DISE Implementation

We provide the list of ingredients and their costs in the first two columns of Table 3. We estimate that the cost of 1-year implementation of DISE is $9,943 per school and $791 per student in a school with one ELD teacher and 12 Level 1 EL students. Eighty-five percent of the per-school costs ($8,056) are program-related expenditures, including charges for program material, training, and coaching; 13% ($1,256) is the cost of teacher time, and the remaining 2% is other expenses.
[Table 3]

The per-year cost of implementing DISE for 2 years is $7,254 per school per year and $604 per student per year in a school with one ELD teacher and 12 Level 1 EL students. The per-year cost of a 2-year implementation is lower than that of a 1-year implementation because the curriculum costs are incurred in the first year, and the first-year training is longer than the training in the subsequent year.

Economies of Scale

In the second column of Table 3, per-school and per-teacher costs are equal because we assume that there is only one ELD teacher per school. Schools with additional ELD teachers would experience some savings on per-teacher costs because training fees are fixed at the school level and do not change with the number of teachers to be trained. We calculate that the cost of implementing DISE for a school with two ELD teachers would be $14,847, with a per-teacher cost of $7,424 (third column of Table 3).

There are also potential savings to be realized when all middle schools in a district implement the program at the same time and all ELD teachers in the district are pooled in one training session. This is an example of economies of scale, where relatively high fixed costs (training fees, in this case) are spread across multiple schools, resulting in declining average per-school costs. In the fourth column of Table 3, we provide a hypothetical example of a district with five middle schools and one ELD teacher per school, in which case per-school costs of DISE drop to $6,221, an almost 35% decrease from the cost of single-school implementation. Per-school costs of DISE do not change with the number of EL students in the school (assuming the number of teachers remains constant). Thus, per-student cost decreases in inverse proportion to the number of students per school.
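As a back-of-the-envelope illustration (ours, not the authors' calculation), a simple fixed-plus-variable cost model fit to the two reported per-school figures implies the split computed below; the actual ingredient-level breakdown is in Table 3.

```python
school_level = 9_943.0    # reported per-school cost, single-school rollout
district_level = 6_221.0  # reported per-school cost, five-school district rollout

# Solve F + V = school_level and F/5 + V = district_level for the implied
# shared fixed component F (e.g., training fees) and per-school component V.
fixed = (school_level - district_level) / (1 - 1 / 5)
variable = school_level - fixed


def per_school(n_schools: int) -> float:
    """Per-school cost when the fixed component is shared across n_schools."""
    return fixed / n_schools + variable


print(fixed, variable)  # 4652.5 5290.5 under this simplified model
print(per_school(10))   # 5755.75 for a hypothetical 10-school district
```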

CE of DISE

We calculate incremental CE ratios of the 1-year program for the beginner to early intermediate target sample who scored 12 or lower on the IPT (Hedges’ g = 0.19, 95% CI [0.05, 0.33]). In a school with 12 beginner to early intermediate (Level 1) EL students and one ELD teacher, the incremental CE ratio is $4,163, 95% CI [$2,397, $15,820] (Table 4), which implies that it costs approximately $4,163 to achieve a 1.0 effect size in student performance over the BAU condition. The incremental CE ratio falls to $2,726, that is, the program becomes more cost-effective, when all schools in a district implement DISE, assuming five schools in the district and the same number of Level 1 EL students and ELD teachers per school. Table 4 presents incremental CE ratios for the other scenarios. The DISE program becomes more cost-effective as the number of schools increases in the district-level implementation scenario. We also show how the incremental CE ratio changes with the number of Level 1 EL students per school, within the range of student counts we observe in our study, assuming the same estimated effect size applies to all schools. The DISE program becomes more cost-effective (i.e., the incremental CE ratio declines) as the number of students per school increases because per-student cost decreases in inverse proportion to the number of students per school while the effect size remains the same. We note that the decline in the incremental CE ratio would be smaller if there are fewer Level 1 EL students in some schools in the district or if there is a decline in effect sizes in the district-level implementation scenario.
[Table 4]
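The reported ratio and its confidence interval follow directly from the per-student cost and the effect-size limits; a quick arithmetic check (ours, not part of the article):

```python
cost_per_student = 791.0   # reported 1-year incremental cost per student (12 students per school)
g, g_low, g_high = 0.19, 0.05, 0.33

ce_ratio = cost_per_student / g        # about 4,163
ce_lower = cost_per_student / g_high   # about 2,397 (a larger effect yields a lower ratio)
ce_upper = cost_per_student / g_low    # about 15,820
print(round(ce_ratio), round(ce_lower), round(ce_upper))
```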

Discussion

What is the CE of high-quality installation and implementation of an intervention or curriculum targeted at a specific student group? In this article, we have summarized the steps involved in determining the answer. There are guidelines set forth by the Institute of Education Sciences for conducting CE analysis for educational programs and interventions (Hollands et al., 2021; Institute of Education Sciences, 2020), and generally agreed-upon standards in other fields, including prevention science (Crowley et al., 2018) and medicine (Husereau et al., 2013). Our CE analysis approach follows these guidelines closely and fits well with the standards in other fields. Prospective cost data collection, specification of the perspective from which the cost analysis is conducted, use of the ingredients method with a comprehensive description of resources and units, and estimates of total and per-participant costs are some examples of the standards that establish the rigor of our findings. We provide a detailed, itemized list of ingredients needed for a standard DISE implementation in Table 2, which can be used as a model when conducting other cost analyses. We estimate the cost of 1-year implementation and also demonstrate the economies of scale for district-level implementation, with per-school costs decreasing by almost 35% compared to single-school implementation costs. These results are relevant for district leadership considering scaling up a program across a group of schools within the same year, in contrast to rolling out a program at one school in one school year and then another school the following year. In the case of DISE implementation, we provide a rationale for training all relevant staff at one time.

Practical Importance

Demands on administrators to identify and implement effective practices, protocols, and curricula are multifaceted. In their ever-expanding consultation role, school psychologists are asked to collaborate with other specialists, including ELD teachers and administrators, on intervention and curriculum implementation (Bahr et al., 2017). School psychologists and other specialists work with administrators to evaluate new interventions and curricula for purchase and implementation for groups of students who are at risk for academic and behavioral challenges. The expanding role of school psychologists calls for an understanding of the implementation of interventions from a systems-level perspective: not only the intervention’s effect size but its implementation cost, including the professional development and coaching necessary to achieve the full potential impact.

Effective tools and professional development are especially critical for ELD teachers because of the pressure from state policies for older EL students to quickly learn English and master content areas such as science and social studies. School psychologists who evaluate older students for special education eligibility, some of whom also have an EL designation, must determine whether access to effective language instruction has been delivered. Knowing not only that a program is effective but also that implementation included professional development and coaching can support these evaluations of access to instruction. Education researchers also need to consider the implementation costs of the interventions they develop and evaluate, as implementation costs may be prohibitive for end users even if the intervention itself is effective. For EL students specifically, few resources offer guidance on which programs are effective or their full implementation costs. There are a handful of published CE analyses of education programs, but to our knowledge, none have looked at programs supporting ELs at the secondary level. For these reasons, this study offers practical importance for administrators, school psychologists, ELD teachers, and intervention researchers as a model of integrating economic and efficacy evaluations.

Limitations

In addition to these contributions, some limitations should be noted. The CE of an educational program can be judged by comparing the CE ratio with the willingness to pay for the educational outcome it targets (Levin et al., 2018). CE analysis also informs choices between programs with similar targeted outcomes by enabling comparison based on their relative CE. Given the absence of information on society’s willingness to pay for EL outcomes and the lack of valuation research for potential improvements in oral English proficiency, it is not possible to judge the CE of DISE in absolute terms. We are also unable to evaluate the CE of the DISE program relative to other EL programs because there are no readily available CE data on such programs. While this is a limitation, we are hopeful that the results of our analysis will inform future comparisons and decisions as CE data for other EL programs become available.

Although we consider costs for scenarios with varying numbers of students, teachers, and schools, we do not have sensitivity analyses to account for every possible source of variability in program implementation or prices. The costs calculated here reflect the resource requirements observed in the context of our study sample. For example, we observed that each district had a different set of eligibility requirements for students to be assigned to the ELD class and used different assessment tools to identify student levels. Accordingly, we did not include costs associated with administering the IPT in our cost calculations. For schools or districts that choose to adopt the IPT as an assessment tool to select students for intervention, costs would be higher. Similarly, we assume that all teachers in the district are trained at the same time in district-level implementation scenarios. This may not be possible for very large districts, or some teachers may be absent during training. In these scenarios, the district or school would have to incur additional training fees. Another example is that the teachers in the DISE and BAU conditions in our study reported similar prep times, suggesting that there was no additional prep time associated with DISE implementation. If teachers spend longer prep times on DISE, the costs associated with the additional teacher prep time would have to be included in cost calculations. Overall, we provide sufficient detail on units and prices in Table 2 and on the calculation of overall costs in Table 3 to give the reader an opportunity to account for such possible variability in resource use in other contexts and to identify the costs relevant to their setting.

We take the confidence intervals for effect sizes into account when providing confidence intervals for the CE ratios but assume the same costs across schools. This implies that the program implementation and the associated resource use, as well as the prices for the resources, are assumed to be fixed and homogeneous across participating schools. We are confident that this is a reasonable assumption considering that 85% of the costs are related to material, training, and coaching, with nearly no room for variation in their use or prices.

We note that the use of national prices for personnel (e.g., teacher pay) and facilities, instead of actual compensation or local rates at the study sites, is intended to improve the generalizability of our results and the usefulness of our cost estimates to a broad audience, as suggested by Hollands et al. (2021). To inform decision-making in a specific location, geographic adjustments would need to be made to the portion of the costs associated with school personnel time and facilities, which, as noted earlier, account for a relatively small portion of total costs.

Finally, our CE ratio estimate reflects the cost per unit of effect size for the beginner to early intermediate target sample. The findings might vary slightly for EL students who are intermediate or advanced and taught from DISE Level 2, or who are taught by a paraprofessional instead of an ELD teacher.

We believe that, despite these limitations, we offer a useful case study that can be replicated with other curricula in combination with professional development and coaching. Further, given the scarcity of published CE analyses on curricula and the lack of those specifically focused on supporting EL students, this study provides an exciting extension to the field’s current understanding of program implementation.

Conclusion

In this cost analysis, we demonstrate how to apply the ingredients method in an educational context. Specifically, we examine the costs associated with implementing a curriculum to support ELD teachers and students with an EL designation in middle schools. This CE analysis provides a framework for identifying relevant implementation costs that can support administrators and other system decision-makers, such as school psychologists, who provide input on these decisions (Barrett et al., 2020). We demonstrate how school systems can realize cost savings by training all relevant faculty at once as opposed to rolling out training incrementally. The CE ratio is a valuable metric representing the cost of improved outcomes for a targeted population of students. As the field of economic evaluation of educational programs continues to expand, it is our hope that these DISE CE analysis results move the field forward.

Footnotes

1  One teacher chose to train an instructional aide to deliver DISE to a subgroup of students who were less proficient in English, to allow the teacher to work with students with higher proficiency.

2  As of 2008, 97% of the public schools in the United States had instructional computers and LCD/DLP projectors in classrooms, and 98% had computers with internet access (Gray et al., 2010). We assume that these percentages have reached 100% in the past 13 years.

References

Arens, S. A., Stoker, G., Barker, J., Shebby, S., Wang, X., Cicchinelli, L. F., & Williams, J. M. (2012). Effects of curriculum and teacher professional development on the language proficiency of elementary English language learner students in the central region. Final report. NCEE 2012-4013. National Center for Education Evaluation and Regional Assistance.

August, D., & Shanahan, T. (2006). Developing literacy in second-language learners: Report of the national literacy panel on language minority children and youth. Lawrence Erlbaum.

Bahr, M. W., Leduc, J. D., Hild, M. A., Davis, S. E., Summers, J. K., & Mcneal, B. (2017). Evidence for the expanding role of consultation in the practice of school psychologists. Psychology in the Schools, 54(6), 581–595. 10.1002/pits.22020

Baker, S., Lesaux, N., Jayanthi, M., Dimino, J., Proctor, C. P., Morris, J., Gersten, R., Haymond, K., Kieffer, M. J., Linan-Thompson, S., & Newman-Gonchar, R. (2014). Teaching academic content and literacy to English learners in elementary and middle school (NCEE 2014-4012). National Center for Education Evaluation and Regional Assistance (NCEE), Institute of Education Sciences, U.S. Department of Education. http://ies.ed.gov/ncee/wwc/publications_reviews.aspx

Ballard & Tighe. (2010). IPT II-Oral English technical manual (Forms E & F).

Barrett, C. A., Truckenmiller, A. J., & Eckert, T. L. (2020). Performance feedback during writing instruction: A cost-effectiveness analysis. The School Psychologist, 35(3), 193–200. 10.1037/spq0000356

Blonigen, B. A., Harbaugh, W. T., Singell, L. D., Horner, R. H., Irvin, L. K., & Smolkowski, K. S. (2008). Application of economic analysis to school-wide positive behavior support (SWPBS) programs. Journal of Positive Behavior Interventions, 10(1), 5–19. 10.1177/1098300707311366

Borman, G. D., & Hewes, G. M. (2002). The long-term effects and cost-effectiveness of success for all. Educational Evaluation and Policy Analysis, 24(4), 243–266. 10.3102/01623737024004243

Borman, G. D., Park, S. J., & Min, S. (2015). The district-wide effectiveness of the Achieve3000 program: A quasi-experimental study, (ERIC Document Reproduction Service No. ED558845). https://files.eric.ed.gov/fulltext/ED558845.pdf

Bowden, A. B., & Belfield, C. (2015). Evaluating the Talent Search TRIO program: A benefit-cost analysis and cost-effectiveness analysis. Journal of Benefit-Cost Analysis, 6(3), 572–602. 10.1017/bca.2015.48

Bowden, A. B., Shand, R., Belfield, C. R., Wang, A., & Levin, H. M. (2016). Evaluating educational interventions that induce service receipt: A case study application of City Connects. The American Journal of Evaluation, 38(3), 405–419. 10.1177/1098214016664983

Bureau of Labor Statistics. U.S. Department of Labor. (2019a). Occupational outlook handbook, middle school teachers. Retrieved October 01, 2018, from https://www.bls.gov/ooh/education-training-and-library/middle-school-teachers.htm

Bureau of Labor Statistics. U.S. Department of Labor. (2019b). Employer costs for employee compensation—December 2017. Retrieved October 07, 2019, from https://www.bls.gov/news.release/archives/ecec_03202018.pdf

Carlo, M. S., August, D., McLaughlin, B., Snow, C. E., Dressler, C., Lippman, D., Lively, T. J., & White, C. (2004). Closing the gap: Addressing the vocabulary needs of English language learners in bilingual and mainstream classrooms. Reading Research Quarterly, 39(2), 188–215. 10.1598/RRQ.39.2.3

Catts, H. W., Adlof, S. M., & Ellis Weismer, S. (2006). Language deficits in poor comprehenders: A case for the simple view of reading. Journal of Speech, Language, and Hearing Research, 49(2), 278–293. 10.1044/1092-4388(2006/023)

Catts, H. W., Fey, M. E., Tomblin, J. B., & Zhang, X. (2002). A longitudinal investigation of reading outcomes in children with language impairments. Journal of Speech, Language, and Hearing Research, 45(6), 1142–1157. 10.1044/1092-4388(2002/093)

Chaparro, E. A., Smolkowski, K., Gunn, B., Dennis, C. & Vadasy, P. (2022). Evaluating the efficacy of an English language development program for middle school English learners. Journal of Education for Students Placed at Risk (JESPAR), 27(4), 322–352. 10.1080/10824669.2022.2045993

Cheung, A. C. K., & Slavin, R. E. (2012). Effective reading programs for Spanish-dominant English Language Learners (ELLs) in the elementary grades: A synthesis of research. Review of Educational Research, 82(4), 351–395. 10.3102/0034654312465472

Clarke, B., Cil, G., Smolkowski, K., Sutherland, M., Turtura, J., Doabler, C. T., Fien, H., & Baker, S. K. (2020). Conducting a cost-effectiveness analysis of an early numeracy intervention. School Psychology Review, 49(4), 359–373. 10.1080/2372966X.2020.1761236

Clemen, R. T., & Reilly, T. (2014). Making hard decisions with DecisionTools (3rd ed.). South-Western Cengage Learning.

Council of the Great City Schools. (2008). Raising the achievement of English language learners in the Seattle public schools: Report of the strategic support team of the Council of the Great City Schools, (ERIC Document Reproduction Service No. ED505337). https://eric.ed.gov/?id=ED505337

Crowley, D. M., Dodge, K. A., Barnett, W. S., Corso, P., Duffy, S., Graham, P., Greenberg, M., Haskins, R., Hill, L., Jones, D. E., Karoly, L. A., Kuklinski, M. R., & Plotnick, R. (2018). Standards of evidence for conducting and reporting economic evaluations in prevention science. Prevention Science, 19(3), 366–390. 10.1007/s11121-017-0858-1

Engelmann, S., & Bruner, E. (1983). Reading mastery I. Science Research Associates.

Engelmann, S., & Carnine, D. (2003). Connecting math concepts: Level A (Teacher’s presentation book, student material, and teacher’s guide). Science Research Associates.

Engelmann, S., & Hanner, S. (2008). Reading mastery reading strand Level K (Signature ed.) (Teacher’s presentation book, student material, literature guide and teacher’s guide). SRA/McGraw-Hill.

Engelmann, S., Johnston, D., Engelmann, O., & Silbert, J. (2011). Direct instruction spoken English (DISE). Voyager Sopris Learning.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network. https://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature

Francis, D. J., Rivera, M., Lesaux, N., Kieffer, M., & Rivera, H. (2006). Practical guidelines for the education of English language learners: Research-based recommendations for instruction and academic interventions. RMC Research Corporation, Center on Instruction. https://www.centeroninstruction.org/files/ELL1-Interventions.pdf

Frey, A. J., Kuklinski, M. R., Bills, K., Small, J. W., Fomess, S. R., Walker, H. M., Feil, E. G., & Seeley, J. R. (2019). Comprehensive cost analysis of First Step Next for preschoolers with disruptive behavior disorder: Using real-world intervention data to estimate costs at scale. Prevention Science, 20(8), 1219–1232. 10.1007/s11121-019-01035-z

Goldenberg, C. (2008). Teaching English language learners. What research does—And does not—Say. American Educator, 32(2), 42–44.

Gray, L., Thomas, N., & Lewis, L. (2010). Educational technology in U.S. public schools: Fall 2008 (NCES 2010–034). U.S. Department of Education, National Center for Education Statistics. U.S. Government Printing Office.

Gu, L., & Hsieh, C. N. (2019). Distinguishing features of young English language learners’ oral performance. Language Assessment Quarterly, 16(2), 180–195. 10.1080/15434303.2019.1605518

Gunn, B. (2003). Supplemental reading instruction to develop second language literacy. In E. Durán (Ed.), Systematic instruction in reading for Spanish speaking students (pp. 164–180). Charles C. Thomas.

Gunn, B., Smolkowski, K., Biglan, A., Black, C., & Blair, J. (2005). Fostering the development of reading skill through supplemental instruction: Results for Hispanic and non-Hispanic students. Journal of Special Education, 39(2), 66–85. 10.1177/00224669050390020301

Hammond, J. S., Keeney, R. L., & Raiffa, H. (1999). Smart choices: A practical guide to making better decisions. Harvard Business School Press.

Hollands, F., Bowden, A., Belfield, C., Levin, H., Cheng, H., Shand, R., Pan, Y., & Hanisch-Cerda, B. (2014). Cost-effectiveness analysis in practice: Interventions to improve high school completion. Educational Evaluation and Policy Analysis, 36(3), 307–326. 10.3102/0162373713511850

Hollands, F., Pratt-Williams, J., & Shand, R. (2021). Cost analysis standards and guidelines 1.1. Cost analysis in practice (CAP) project. https://capproject.org/resources

Hollands, F. M., Hanisch-Cerda, B., Levin, H. M., Belfield, C. R., Menon, A., Shand, R., Pan, Y., Bakir, I., & Cheng, H. (2015). CostOut—the CBCSE cost tool kit. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University. https://www.cbcsecosttoolkit.org

Hollands, F. M., Kieffer, M. J., Shand, R., Pan, Y., Cheng, H., & Levin, H. M. (2016). Cost-effectiveness analysis of early reading programs: A demonstration with recommendations for future research. Journal of Research on Educational Effectiveness, 9(1), 30–53. 10.1080/19345747.2015.1055639

Hollands, F. M., & Levin, H. M. (2017). The critical importance of costs for education decisions (REL 2017–274). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Analytic Technical Assistance and Development. https://ies.ed.gov/ncee/edlabs/

Hunter, L. J., DiPerna, J. C., Hart, S. C., & Crowley, M. (2018). At what cost? Examining the cost effectiveness of a universal social–emotional learning program. School Psychology Quarterly, 33(1), 147–154. 10.1037/spq0000232

Husereau, D., Drummond, M., Petrou, S., Carswell, C., Moher, D., Greenberg, D., Augustovski, F., Briggs, A. H., Mauskopf, J., Loder, E., & the CHEERS Task Force. (2013). Consolidated health economic evaluation reporting standards (CHEERS) statement. International Journal of Technology Assessment in Health Care, 29(2), 117–122. 10.1017/S0266462313000160

Hwang, J. K., Mancilla-Martinez, J., McClain, J. B., Oh, M. H., & Flores, I. (2020). Spanish-speaking English learners’ English language and literacy skills: The predictive role of conceptually-scored vocabulary. Applied Psycholinguistics, 41(1), 1–24. 10.1017/S0142716419000365

Institute of Education Sciences. (2020). Cost analysis: A starter kit (Version 1.1). U.S. Department of Education. Retrieved November 10, 2020, from https://ies.ed.gov/seer/pdf/IES_Cost_Analysis_Starter_Kit_V1_1.pdf

Jimenez-Castellanos, O., & Topper, A. M. (2012). The cost of providing an adequate education to English language learners: A review of the literature. Review of Educational Research, 82(2), 179–232. 10.3102/0034654312449872

Kretlow, A., & Helf, S. (2013). Teacher implementation of evidence-based practices in Tier 1: A national survey. Teacher Education and Special Education, 36(3), 167–185. 10.1177/0888406413489838

Levin, H. M., & Belfield, C. (2015). Guiding the development and use of cost-effectiveness analysis in education. Journal of Research on Educational Effectiveness, 8(3), 400–418. 10.1080/19345747.2014.915604

Levin, H. M., & McEwan, P. J. (2001). Cost-effectiveness analysis: Methods and applications (2nd ed.). SAGE Publications.

Levin, H. M., McEwan, P. J., Belfield, C., Bowden, A. B., & Shand, R. (2018). Economic evaluation in education: Cost-effectiveness and benefit-cost analysis. SAGE Publications. 10.4135/9781483396514

Long, K., Brown, J. L., Jones, S. M., Aber, J. L., & Yates, B. T. (2015). Cost analysis of a school-based social and emotional learning and literacy intervention. Journal of Benefit-Cost Analysis, 6(3), 545–571. 10.1017/bca.2015.6

Mathes, P. G., Pollard-Durodola, S. D., Cárdenas-Hagan, E., Linan-Thompson, S., & Vaughn, S. (2007). Teaching struggling readers who are native Spanish speakers: What do we know? Language, Speech, and Hearing Services in Schools, 38(3), 260–271. 10.1044/0161-1461(2007/027)

Morrison, J. Q., Hawkins, R. O., & Collins, T. A. (2020). Evaluating the cost-effectiveness of the Dyslexia Pilot Project: A multitiered system of supports for early literacy. Psychology in the Schools, 57(4), 522–539. 10.1002/pits.22343

Odom, S. L. (2008). The tie that binds: Evidence-based practice, implementation science, and early intervention. Topics in Early Childhood Special Education, 29(1), 53–61. 10.1177/0271121408329171

Parrish, T. B. (1994). A cost analysis of alternative instructional models for limited English proficient students in California. Journal of Education Finance, 19(3), 256–278. https://www.jstor.org/stable/40703855

Pashler, H., Bain, P., Bottge, B., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing instruction and study to improve student learning. A practice guide (NCER 2007-2004). National Center for Education Research, Institute of Education Sciences, U.S. Department of Education. https://files.eric.ed.gov/fulltext/ED498555.pdf

Reynolds, A. J., & Temple, J. A. (2008). Cost-effective early childhood development programs from preschool to third grade. Annual Review of Clinical Psychology, 4(1), 109–139. 10.1146/annurev.clinpsy.3.022806.091411

Rivera, M. O., Francis, D. J., Fernandez, M., Moughamian, A. C., Jergensen, J., & Lesaux, N. K. (2010). Effective practices for English language learners: Principals from five states speak. Center on Instruction, RMC Research Corporation. (ERIC Document Reproduction Service No. ED517795). https://eric.ed.gov/?id=ED517795

Saunders, W., Goldenberg, C., & Marcelletti, D. (2013). English language development: Guidelines for instruction. American Educator, 37(2), 13.

Stockard, J. (2011). Direct Instruction and first grade reading achievement: The role of technical support and time of implementation. Journal of Direct Instruction, 11, 31–50. https://www.nifdi.org/research/journal-of-di/volume-11-winter-2011/494-direct-instruction-and-first-grade-reading-achievement-the-role-of-technical-support-and-time-of-implementation/file.html

Turner, A. J., Sutton, M., Harrison, M., Hennessey, A., & Humphrey, N. (2020). Cost-effectiveness of a school-based social and emotional learning intervention: Evidence from a cluster-randomised controlled trial of the promoting alternative thinking strategies curriculum. Applied Health Economics and Health Policy, 18(2), 271–285. 10.1007/s40258-019-00498-z

Uccelli, P., Galloway, E. P., Barr, C. D., Meneses, A., & Dobbs, C. L. (2015). Beyond vocabulary: Exploring cross-disciplinary academic-language proficiency and its association with reading comprehension. Reading Research Quarterly, 50(3), 337–356. 10.1002/rrq.104

Uccelli, P., & Phillips Galloway, E. (2017). Academic language across content areas: Lessons from an innovative assessment and from students’ reflections about language. Journal of Adolescent & Adult Literacy, 60(4), 395–404. 10.1002/jaal.553

Vuchinich, S., Flay, B. R., Aber, L., & Bickman, L. (2012). Person mobility in the design and analysis of cluster-randomized cohort prevention trials. Prevention Science, 13, 300–313. 10.1007/s11121-011-0265-y

What Works Clearinghouse. (2006). Reading Mastery1/SRS/McGraw-Hill. Institute of Education Sciences. https://ies.ed.gov/ncee/wwc/Docs/InterventionReports/WWC_Reading_Mastery_092806.pdf

What Works Clearinghouse. (2012). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.

WIDA Consortium. (2007). ACCESS for ELLs: Scores, reliability and validity [Paper presentation]. ISBE Meeting, Chicago, IL. https://slideplayer.com/slide/757716/

Wood, C. L., Goodnight, C. I., Bethune, K. S., Preston, A. I., & Cleaver, S. L. (2016). Role of professional development and multi-level coaching in promoting evidence-based practice in education. Learning Disabilities, 14(2), 159–170. https://files.eric.ed.gov/fulltext/EJ1118436.pdf

Submitted: March 4, 2021; Revised: June 21, 2022; Accepted: July 12, 2022

Title: The cost-effectiveness of an English language curriculum for middle school English learners
Authors: Cil, Gulcan; Chaparro, Erin A.; Dennis, Caroline; Smolkowski, Keith
Journal: School Psychology, Vol. 38 (2023), pp. 48–58
Publisher: American Psychological Association (APA), 2023
ISSN: 2578-4226; 2578-4218
DOI: 10.1037/spq0000515
Keywords: Developmental and Educational Psychology; Education
