School characteristics influencing the implementation of a data-based decision making intervention


School Effectiveness and School Improvement

An International Journal of Research, Policy and Practice

ISSN: 0924-3453 (Print) 1744-5124 (Online) Journal homepage: http://www.tandfonline.com/loi/nses20

School characteristics influencing the implementation of a data-based decision making intervention

Marieke van Geel, Adrie J. Visscher & Bernard Teunis

To cite this article: Marieke van Geel, Adrie J. Visscher & Bernard Teunis (2017): School characteristics influencing the implementation of a data-based decision making intervention, School Effectiveness and School Improvement, DOI: 10.1080/09243453.2017.1314972

To link to this article: http://dx.doi.org/10.1080/09243453.2017.1314972

© 2017 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.

Published online: 14 Apr 2017.



ARTICLE

School characteristics influencing the implementation of a data-based decision making intervention

Marieke van Geel (a), Adrie J. Visscher (a) and Bernard Teunis (b)

(a) Department of Teacher Development, University of Twente, Enschede, The Netherlands; (b) PO-Raad, Utrecht, The Netherlands

ABSTRACT

There is an increasing global emphasis on using data for decision making, with a growing body of research on interventions aimed at implementing and sustaining data-based decision making (DBDM) in schools. Yet, little is known about the school features that facilitate or hinder the implementation of DBDM. Based on a literature review, the authors identified 12 potentially critical factors for the successful implementation of DBDM. The characteristics of 16 schools were studied and related to the success of DBDM implementation after the schools had participated in an intensive DBDM intervention for 2 subsequent academic years. Strong instructional leadership, maximum exposure to the intervention, standardization of work processes, as well as staff continuity and a strong academic coach appear to be strongly related to implementation success. The remaining school context features and the culture and structure of the schools were not associated with the success of DBDM implementation.

ARTICLE HISTORY

Received 1 October 2015
Accepted 30 March 2017

KEYWORDS

Data use; data-based decision making; educational reform; intervention

Introduction

In education, there is a growing emphasis on using data to guide decision making, under the assumption that this will enhance student achievement. This increasing interest in data use in education is twofold. On the one hand, there is the context of accountability in which school leaders and teachers are held accountable for the quality of the education they provide (Lai & Schildkamp, 2013). Data such as student achievement scores on standardized tests are used in a summative way to hold schools accountable to external parties such as parents and, in The Netherlands, the Inspectorate of Education. On the other hand, there is a growing recognition that data should not only be used for compliance and accountability but also for continuous improvement (Kingston & Nash, 2011; Lai & Schildkamp, 2013; Mandinach, 2012). In this context, data use is perceived as a way to inform teachers about students’ needs, allowing them to adapt and adjust instruction based on the input. Similarly, school leaders can use data as the basis for their decisions at the school level (Lai & Schildkamp, 2013).

Interest in data use is reflected by the growing body of literature on the topic. Recently, several special journal issues and edited volumes have been dedicated

CONTACT Marieke van Geel, marieke.vangeel@utwente.nl


This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.


specifically to data use (e.g., Coburn & Turner, 2012; Schildkamp, Ehren, & Lai, 2012; Schildkamp, Lai, & Earl, 2013; Turner & Coburn, 2012). Based on such literature, Hamilton et al. (2009) compiled a set of recommendations regarding the use of student achievement data to support instructional decisions. However, they also concluded that only a few studies draw firm conclusions on the effects of data use, and that recommendations are primarily based on case studies, descriptive studies, and expert opinions (Hamilton et al., 2009). In a literature review on data use in education, Marsh (2012) recognized a similar trend: The majority of the studies she reviewed were descriptive (e.g., case study design, interviews, focus groups, observations, document analysis). However, some studies have been more conclusive in showing that the use of data to reflect on and to adapt education can improve student achievement (Campbell & Levin, 2009; Carlson, Borman, & Robinson, 2011; Faber & Visscher, 2014; Lai & McNaughton, 2013). In this paper, we have used the following, broad definition of data-based decision making (DBDM) by Ikemoto and Marsh (2007): “teachers, principals, and administrators systematically collecting and analyzing data to guide a range of decisions to help improve the success of students and schools” (p. 108).

Although the use of data to inform school practice is promoted in various countries, little is known about the (differential) effects of interventions aimed at implementing DBDM in schools. The expectation is that DBDM will lead to better informed instructional decisions, and to instruction that is more closely adapted to students’ needs. Instruction tailored to students’ needs is, in turn, expected to lead to higher student achievement. In order to achieve these goals, schools will have to learn to work in a DBDM manner, which requires professional development interventions, as school staff will have to develop new knowledge and skills. The success of such interventions will depend not only on the quality and nature of the interventions but also on how the characteristics of the school context in which the intervention is introduced affect intervention implementation.

According to Berliner (2002), schools are complex and dynamic networks of social interactions between actors with variable potential to influence each other. Moreover, ordinary life events (e.g., illness, pregnancy, staff/student mobility) are important determinants for the course and impact of interventions. Berliner argued that schools vary greatly, for example, in terms of their internal staff networks, curriculum material, budget, staff professionalism, and the nature of the student population. The combination of these factors leads to unique complexities to deal with at each site. It also complicates the replication of findings across school sites, and the generation of theory, for example, concerning whether an intervention is effective, and why (or why not). Unlike engineers, for instance, educational scientists cannot control the context in which they intervene, while the impact of that context on the intervention’s success may be strong. However, little is known about how school contexts affect implementation success in the case of DBDM interventions. Therefore, in this study we explored which school characteristics at the start of and throughout a DBDM intervention facilitate or hinder its implementation.

School characteristics critical for implementing DBDM

Based on literature on school effectiveness, teacher professionalization, comprehensive school reform, and DBDM, we identified several school characteristics that were deemed potentially critical for implementing DBDM through a school-wide intervention. The selection was based on a subjective assessment of the importance of the factors found in the literature.

Knowledge, skills, and attitude

First, school staff’s motivation to work on DBDM was considered important for its successful implementation (Fullan, 1991; Locke & Latham, 2002). Motivation reflects the extent to which school staff members would consider DBDM worth working on (Visscher & Coe, 2002). Higher levels of DBDM motivation were therefore expected to be positively related to DBDM implementation.

Visscher and Ehren (2011) stressed that DBDM places high demands on school organization. To use a student monitoring system (SMS) and translate resulting data into planned and implemented instructional activities is, in many cases, new to school staff. Blanc et al. (2010) highlighted DBDM knowledge and skills that are essential for making DBDM work: knowledge and skills for evaluating student progress (e.g., how to use a student monitoring system and interpret the collected data) and for setting SMART (specific, measurable, attainable, realistic, and time-bound) performance goals. The DBDM knowledge and skills already available within school teams at the start of the intervention may therefore influence the success of a DBDM intervention.

Leadership, resources, and culture

Aside from individual knowledge, skills, and attitudes, there are also aspects at the school organizational level that may influence the success of the school-wide implementation of DBDM. Instructional school leaders monitor and support (optimize) the required changes. A school leader can influence the implementation process by stressing the value and importance of the intended changes, and by supporting staff in the adoption of the innovative approaches (Leithwood, Harris, & Hopkins, 2008; Levin & Datnow, 2012). This role is not restricted to the formal school leader. Distributed leadership has been shown to have a greater influence on schools and students (Leithwood et al., 2008). So-called “teacher leaders” can also function as role models for colleagues by convincing and supporting colleagues to emulate their example (Berends, Bodilly, & Kirby, 2002).

An important form of support that formal leaders can provide staff with is the allocation of the resources required for innovation. Change requires time and other capital to allow for adaptation to the new approach (Van Veen, Zwart, & Meirink, 2011). School improvement is often resisted due to a shortage of resources, inhibiting the incorporation of new behavior into existing schema (Tyack & Tobin, 1994; Wong & Meyer, 1998). Schools and teachers are often expected to accomplish relatively complex innovations while simultaneously fulfilling their regular duties. Bodilly, Keltner, Purnell, Reichardt, and Schuyler (1998) showed that schools that allocated more time to work on an innovation were more capable of implementing these changes.

Ideally, school team members value cooperation and work together to achieve school-wide DBDM implementation. Cooperation in the context of an innovation can be an important mechanism that allows school staff to share, for example, successful classroom strategies (Keuning, Van Geel, Visscher, Fox, & Moolenaar, 2016). Furthermore, cooperation can also help overcome problems and allows staff to mutually support each other’s learning on how to implement DBDM (Geijsel, Sleegers, Stoel, & Krüger, 2009).

Aside from the degree of cooperation within the school, a school’s performance orientation is also important, as DBDM is a tool to maximize the achievement of all students. In schools with performance-oriented cultures, high performance in core subjects is considered important for all students (Inspectie van het Onderwijs, 2010; Scheerens & Bosker, 1997). Furthermore, staff in such schools work on accomplishing this goal, and encourage and reward high student performance in these subjects. It is likely that a more performance-oriented school will be more receptive to the implementation of DBDM, since such schools are already focused on maximizing student achievement.

Another organizational factor that was expected to potentially influence DBDM implementation and sustainability is the degree of standardization of DBDM activities. Doolaard (1999) found that the standardization of core activities within schools during an educational innovation promoted the successful implementation of those innovations. To illustrate using the context of DBDM, one can, for example, think of following the Plan-Do-Check-Act cycle (Boudett, City, & Murnane, 2005), the standardization of instructional plans, the use of protocols for analyzing student performance with a student monitoring system, and instructional approaches like the direct instruction model (Becker, 1992).

Intervention exposure and continuity

The successful implementation and continuity of DBDM through an intervention may also be influenced by the degree of exposure to the intervention. The collective participation of all school staff has been shown to enhance the effect of teachers’ professional development (Desimone, 2009), which may support the successful implementation of DBDM throughout a school.

Within schools, staff continuity was also expected to promote the sustained implementation of DBDM, which generally involves a considerable change from usual practices (Berends et al., 2002; Bodilly et al., 1998). If the school leader leaves the school, it was considered questionable whether the successor would promote the innovation to the same extent. Potential barriers to implementation could arise as the successor did not participate in the decision to adopt the innovation, may be uninformed about its background, and will need energy and time to adapt to the new position (time and energy that cannot be spent on DBDM) (Hallinger & Heck, 2011). Furthermore, in schools with frequent leadership transitions, these transitions could impede improvement efforts, although this does not have to be the case when both the predecessor and successor share a common vision and when there are strong teacher leaders (Fink & Brayman, 2006). Mobility among other school staff members may also be detrimental for the implementation of DBDM because new teachers and academic coaches may lack the prerequisite knowledge and skills for DBDM, and may require time to learn about and adapt to their new jobs. It should be noted, however, that staff mobility could also positively affect an innovation process, for instance, in cases where new staff members are better professionals than their predecessors were.


The DBDM intervention

The present study was conducted during the implementation of DBDM using a school-wide intervention. In this section, we describe the development and content of the intervention as well as its rationale.

Commissioned by the Dutch Ministry of Education in 2007, a pilot training course was developed at the University of Twente. In this training, seven school teams were trained on how to use their student monitoring system and how to analyze and interpret the resulting student performance data. The results showed that there were other prerequisites for developing DBDM practices, and that further measures were needed to enhance the quality of instruction and student outcomes. At the request of the staff from the schools participating in the pilot study (the school leader, academic coach, and teachers from Grades 2 and 4), the intervention was expanded to all aspects of DBDM. These aspects included setting performance goals and developing instructional plans based on information concerning student progress and the performance goals set.

In the academic years 2010, 2011, and 2012, the Focus intervention was carried out for the first time with 43 schools in the eastern part of The Netherlands. The participants included the school leader, academic coach, and teachers of Grades 1 to 3 in the first intervention year, and the school leader, academic coach, and teachers of Grades 4 to 6 in the second year (Staman, Visscher, & Luyten, 2014). Based on the results of the Focus I intervention and experiences with these schools, in combination with insights from the literature on professional development and comprehensive school reform, the intervention was optimized and implemented in 53 schools during the 2011 to 2013 academic years as the Focus II intervention (Van Geel, Keuning, Visscher, & Fox, 2016). The University of Twente appointed four project trainers, each responsible for training and coaching a group of schools. The study reported here concerns 16 of those schools, trained by one of the trainers. In the following section, the design and content of the Focus II intervention, as well as the rationale behind the intervention, will be described. The intervention is a 2-year training course for entire primary school teams, aimed at helping the teams acquire the knowledge and skills required for DBDM, and at implementing and sustaining DBDM in the school organization by following the DBDM cycle (Figure 1). The training course and the accompanying protocols and documents were developed by the University of Twente, but participating schools were encouraged to adapt them to their specific context. Trainers supported school leaders in fulfilling the required conditions for leadership, school culture, internal professional networks, and internal collaboration. In Figure 2, the intervention outline, as well as the distribution of meetings across the two intervention academic years, is presented.

During the first intervention year, each step of the DBDM process (Figure 1) was introduced sequentially. Practical preconditions had to be met to be able to realize DBDM. Therefore, the availability of assessment tools (standardized tests) and technological tools (a student monitoring system) were requirements for participating in the training. Prior to the first intervention meeting, a meeting with the school leader and school board was organized to stress their roles in encouraging, motivating, and supporting their team. Furthermore, this meeting was organized to assure that they would allocate sufficient time to staff for working on DBDM activities (such as analyzing data, planning, and evaluating instruction) and that other practical preconditions would be fulfilled (e.g., planning team meetings for discussing school results).

The first four meetings were predominantly spent on working on participants’ DBDM knowledge and skills: using the student monitoring system, analyzing and interpreting test score data, diagnosing students’ learning needs, setting goals, and developing instructional plans.

At the individual level, aside from knowledge and skills, beliefs and attitudes also played a role when introducing and implementing DBDM. No special meetings were dedicated to overcoming resistance and motivating participants to use DBDM. However, the benefits of DBDM, as reported in the scientific literature, and participants’ positive experiences in the pilot and Focus I intervention were presented. During Meeting 5 in the first year, the cycle of DBDM was completed for the first time when student achievement results were discussed in a team meeting. The trainers stressed to school leaders and participants that data were supposed to be used to support improvement, not to evaluate staff. This was done to contribute to a school culture of trust and collaboration, in which the school team perceives a shared responsibility for student performance. Meeting 6 focused on collaboration among team members by observing each other’s lessons, either to learn from the colleague they visited and/or to provide each other with feedback on specific topics.

As of Meeting 7 in the first year, the meetings were aimed at internalizing, sustaining, and broadening the scope of DBDM. Participants were also supported in applying their decisions in the classroom, for example, by means of coaching sessions in which the trainers observed teachers in the classroom and provided them with feedback. During team meetings, attention was paid to DBDM-related issues raised by the school.

Aside from team meetings, the trainers met with school leaders and school boards twice a year to discuss their roles in the innovation process, the school’s progression, and the goals to be set for the upcoming period. During these meetings, the importance of encouragement and support by school leaders was emphasized.

Critical factors in implementing DBDM by means of the Focus II intervention

As described in the theoretical framework, a review of the relevant literature led to the identification of several factors that were expected to be important for implementing DBDM through the Focus II intervention:


Knowledge, skills, and attitude

(1) School staff’s motivation to work on DBDM
(2) Prior DBDM knowledge and skills


Leadership, resources, and culture

(3) Instructional leadership

(4) The provision of innovation resources
(5) Teacher leaders

(6) Performance-oriented school culture

(7) Attitude towards school-internal, instruction-oriented cooperation
(8) Standardization of DBDM activities

Intervention exposure and continuity

(9) The extent to which the entire school team is exposed to the intervention
(10) School staff continuity

Furthermore, most primary schools in The Netherlands also have an appointed academic coach. The academic coach can also be important for promoting DBDM by analyzing school-level performance data, and by encouraging and supporting DBDM within the school team. A strong academic coach can help teachers and principals learn how to utilize data for informed decisions by monitoring the school-wide implementation of DBDM and by solving any problems that may occur.

Academic coach

(11) Quality of the academic coach

Finally, schools also vary in terms of instructional quality and student achievement, size, urbanization, and features of the student population; such school-context features must therefore also be considered.

Context

(12) School context features (at the start of the intervention)

The purpose of this study was to identify which of these 12 school characteristics would prove to be related to DBDM implementation success in a school-wide intervention.

Methodology

The data used in this study were collected at the beginning and end of the intervention. In this section, the participants and measures are presented, and data collection and analysis procedures are explained.

Participants

In November 2010, over 500 primary schools in the northern and central parts of The Netherlands were invited to attend a project briefing in their region to determine their interest in participating in the Focus II intervention. Eleven briefings were organized, which led to 53 schools participating. Four full-time trainers were appointed to guide the schools in implementing DBDM for the duration of the two intervention years. Schools were guided by the same trainer over the course of the entire intervention.


For the current study, data from the 16 schools guided by one trainer were used. This subset was used because data collection was intensive and, for practical reasons, could only be carried out by one of the trainers.

Characteristics of these 16 schools are presented in Table 1. School size in the current study ranged from 45 to 727 students, with an average of 204 students per school. School socioeconomic status (SES; categorized as high, medium, and low) was based on the percentage of students in a school who had been assigned extra “weight” based on parental educational levels¹ indicating low SES. As Table 1 shows, the sample did not include low-SES schools. “Exposure” indicates whether the entire school team participated in the intervention during the full two intervention years, which is the case for the majority of schools.

Measures and data collection

At the beginning and end of the intervention, questionnaires were administered to measure the factors related to the intervention starting situation, the intervention processes, and its effects. In the following section, we provide a more detailed description of how measures were collected in this study. For some of the studied constructs, questionnaires were only administered to school leaders, in which case the sample was too small to compute scale reliability.

School staff’s motivation (DBDM and SMS)

At the start of the intervention, a questionnaire was administered to all participants to identify their attitude and motivation regarding DBDM and its components (4 items, score range = 1–4, N = 197, α = .84), and more specifically regarding the analysis of SMS output (5 items, score range = 1–4, N = 190, α = .78).
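The α values reported throughout this section are Cronbach's alpha coefficients. As a minimal sketch of how such a scale reliability is computed (with illustrative scores, not the study's questionnaire data):

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a k-item scale.

    item_scores: list of k columns, one per item, each holding one
    score per respondent (illustrative data, not the study's).
    """
    k = len(item_scores)
    respondents = list(zip(*item_scores))        # rows = respondents
    totals = [sum(row) for row in respondents]   # total scale scores
    sum_item_var = sum(statistics.variance(col) for col in item_scores)
    total_var = statistics.variance(totals)
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Two perfectly correlated items yield alpha = 1.0.
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]), 2))
```

By convention, values of roughly .70 and above, as reported for the scales here, are taken as acceptable reliability.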

DBDM knowledge and skills

To determine the degree to which participants were able to interpret output from the student monitoring system, a knowledge test was conducted at the beginning of the intervention. Since two different student monitoring systems were used by the schools, two versions of this test were developed (System C: 30 items, α = .84, N = 91; System P: 25 items, α = .81, N = 111; score: percentage of correct answers).

Table 1. Context characteristics of sample schools (N = 16).

                                           Number of schools   Percentage
School size            Small (<150)                8             50.00%
                       Medium (150–350)            6             37.50%
                       Large (>350)                2             12.50%
School SES             High                        7             43.75%
                       Medium                      9             56.25%
                       Low                         0              0.00%
Urbanization           Urban                       3             18.75%
                       Rural                      13             81.25%
Intervention exposure  Full                       13             81.25%
                       Partial                     3             18.75%


Instructional school leaders

A questionnaire concerning the school leader was also administered as part of the pre-intervention data collection (13 items, score range = 1–4, N = 160, α = .88). This questionnaire was aimed at measuring instructional leadership and was administered to all participants except for the school leaders.

Innovation resources

To identify the degree to which participants perceived having been facilitated in practicing DBDM, a questionnaire was administered at the end of the intervention about support in terms of provided time (7 items, score range = 0–1, N = 99, α = .69) and resources (1 item, score range = 1–4, N = 188). School leaders were asked whether they provided the team with time to work on DBDM (4 items, score range = 1–4, N = 22: 16 school leaders and 6 deputy heads).

Teacher leaders

At the end of the intervention, all participants were asked whether the implementation process had been managed by the formal school leader and the academic coach, or whether teacher leaders had also promoted the implementation of DBDM in the school (1 item, score range = 1–4, N = 188).

Performance-oriented school culture

When the intervention started, a questionnaire was administered to determine the degree to which the school culture could be characterized as “performance oriented” (10 items, score range = 1–4, N = 178, α = .70).

Cooperation

At the beginning of the intervention, a questionnaire was administered to determine staff’s attitude towards professional cooperation (5 items, score range = 1–4, N = 197, α = .72).

DBDM standardization

By means of a questionnaire, administered to all participants at the end of the intervention, it was determined to what degree work processes had been standardized at the end of the intervention (13 items, score range = 1–4, N = 187, α = .87).

Intervention exposure

Exposure indicates whether the entire school team was exposed to the intervention. Three schools chose to participate with only part of the school team, for example, by excluding teachers from the lower grades from participating or by only participating with one teacher per grade. School exposure as defined here is not the same as the degree to which school staff attended each of the project meetings (the latter, however, was registered, and our data showed that schools with high levels of non-attendance implemented the innovation poorly).

This variable was scored as either one (the entire school team was exposed to the intervention) or zero (only part of the school team was exposed to the intervention).


School staff continuity

After the intervention had been implemented, the degree of continuity (the opposite of staff mobility) of the school leader, academic coach, and teachers was determined per school. This was done by indicating the extent to which the position had been fulfilled by the same person as opposed to multiple people. For teachers, a distinction was made between no or almost no turnover (fewer than 15% turnover, indicated with 0 in Table 3), limited turnover (15% < x < 30%, indicated with 1), and high turnover (>30%, indicated with 2) during the intervention period.
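The teacher turnover coding can be sketched as a small helper function; the handling of a turnover of exactly 15% or 30% is an assumption here, since the text specifies open intervals:

```python
def turnover_category(turnover_pct):
    """Map teacher turnover (%) to the 0/1/2 coding used in Table 3."""
    if turnover_pct < 15:
        return 0   # no or almost no turnover
    if turnover_pct <= 30:   # boundary handling is an assumption
        return 1   # limited turnover
    return 2       # high turnover

print([turnover_category(p) for p in (10, 20, 45)])  # expect [0, 1, 2]
```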

Quality of the academic coach

At the end of the intervention, a questionnaire was administered to teachers to assess their perceptions regarding the level of knowledge and skills of their academic coaches (7 items, score range = 1–4, N = 174, α = .84).

Context

School quality (SQ) was based on the judgement by the school inspectorate at the beginning of the intervention in 2011 and 2 years earlier. The inspectorate in The Netherlands distinguishes between three categories of school performance: basic level (i.e., sufficient), weak, and very weak. All schools were judged as performing at the basic level at the start of the intervention. SES, urbanization, and school-size data were extracted from an inspectorate database at the start of the intervention.

Intervention effects

To determine intervention effects at the school level, three measures were used. First, the trainer evaluated the quality of the instructional plans of Grades 2, 4, and 6 at the end of the intervention according to four criteria (with a scale ranging from 1 to 4). Points were awarded based on the quality of the description of the instructional needs of students, the degree to which goals were set and formulated in a SMART way, the quality of the description of the planned instructional strategies, and the extent to which the instructional plan incorporated the reservation of (extra) instructional time for students with specific needs. The average score was used as an indicator for the quality of the instructional plans.

Second, the trainer observed lessons in each school and used the ICALT (International Comparative Analysis of Learning and Teaching) classroom observation instrument (Van de Grift, 2007) to rate these lessons on 35 items, with a score ranging from 1 through 4. Although the ICALT was not designed for the measurement of DBDM specifically and is somewhat limited as far as the measurement of instructional differentiation is concerned, we decided to use the ICALT as no better alternative existed. Teachers were rated on all ICALT items, that is, on typical DBDM teaching activities as well as on general teaching aspects (e.g., classroom management, clear instruction, etc.). The reason for this holistic evaluation of teaching was that teachers, during trainer classroom visits and by means of student perceptions, had received feedback on all those aspects of teaching. Teachers could apply voluntarily for the classroom observations, so the observations are not necessarily representative of the teaching quality of all teachers in the school. However, more than half of the teachers were observed in all schools, and these teachers were teaching lower, middle, and upper grades. The average score across all observations was used as an indicator of instructional quality at the school level.

Finally, the trainer implementing the intervention graded the schools on a scale from 1 to 10. This grading was based on his overall judgement of the degree to which DBDM had been embedded in the school. Although this measure reflects his subjective opinion, it is based on his experience with all 16 schools, which he had coached intensively for two full consecutive academic years. He based his school grades on the interim reports he had written about schools’ progress during the intervention, the quality of the school meetings on student progress at the end of the project (the trainer attended these meetings), the quality of schools’ planning of DBDM activities after 2 years, and his general impression of how schools had worked on DBDM and how successfully they had implemented DBDM after 2 years.

The three scores per school (for instructional plans, classroom observations, and the DBDM implementation grade from the trainer) were added, leading to a score between 3 and 18. It should be noted that the overall trainer grade has a relatively high weight in this combined score (a score between 1 and 10, whereas the other two DBDM implementation scores have a maximum of 4), since this was expected to be the most “comprehensive” indicator of DBDM implementation. This combined score was used to rank schools; higher scores reflect a more successful implementation of DBDM.
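The aggregation and ranking described above amount to a simple sum of the three measures per school. As a minimal sketch (using the school scores reported in Table 2; this is an illustration, not the authors' actual analysis code):

```python
# Combine the three school-level measures into one implementation score
# (classroom observations and instructional plans: scale 1-4; trainer
# grade: scale 1-10), then rank schools from most to least successful.

scores = {
    # school: (classroom_observations, instructional_plans, trainer_grade)
    "M": (2.9, 3.3, 8.5), "A": (3.0, 3.2, 8.0), "O": (2.9, 2.8, 8.5),
    "N": (2.8, 2.8, 8.5), "K": (2.8, 3.0, 8.0), "D": (3.0, 3.0, 7.8),
    "P": (2.9, 2.8, 8.0), "E": (2.8, 2.9, 7.8), "G": (3.2, 2.9, 6.8),
    "C": (2.6, 3.2, 6.5), "L": (2.7, 2.7, 6.5), "I": (3.3, 2.1, 5.8),
    "H": (2.5, 2.5, 6.0), "J": (2.9, 2.4, 5.5), "B": (2.8, 2.6, 5.0),
    "F": (3.0, 2.3, 4.5),
}

combined = {school: round(sum(parts), 1) for school, parts in scores.items()}
ranking = sorted(combined, key=combined.get, reverse=True)

print(ranking[0], combined[ranking[0]])    # M 14.7 (strongest implementer)
print(ranking[-1], combined[ranking[-1]])  # F 9.8 (weakest implementer)
```

The unequal scales mean the trainer grade dominates the sum, which matches the authors' intention of weighting the most comprehensive indicator most heavily.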

Results

This study aimed to identify school characteristics that contribute to the successful implementation of DBDM. In Table 2, the school scores per measure and the aggregated DBDM implementation scores are presented. For the sake of clarity, schools are ordered based on their overall implementation score.

In Table 3, the schools are again ordered based on their ranking. For each quantitative variable, the top quartile schools (marked with +) and the bottom quartile schools (marked with −) are indicated. Overall, schools with high DBDM implementation scores tended to score in the top quartiles of the explanatory school characteristics. The opposite is true for schools with lower implementation scores.
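The quartile marking used in Table 3 can be reproduced mechanically for any of the explanatory variables. The sketch below uses hypothetical motivation scores and a simple order-statistic cut-off (a statistics package would interpolate quartiles more carefully), purely to illustrate the procedure:

```python
# For one explanatory variable, flag which schools fall in the top
# quartile (+) and which in the bottom quartile (-) of the distribution.

def quartile_flags(scores):
    """scores: dict school -> value; returns dict school -> '+', '-', or ''."""
    values = sorted(scores.values())
    n = len(values)
    q1 = values[n // 4]          # simple lower-quartile cut-off
    q3 = values[(3 * n) // 4]    # simple upper-quartile cut-off
    return {s: "+" if v >= q3 else "-" if v <= q1 else ""
            for s, v in scores.items()}

# Illustrative (hypothetical) motivation scores for eight schools
motivation = {"M": 4.2, "A": 3.1, "O": 4.0, "N": 3.5,
              "J": 2.4, "B": 2.8, "F": 2.1, "H": 3.3}
flags = quartile_flags(motivation)
print(flags["M"], flags["F"])  # '+' '-'
```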

Knowledge, skills, and attitude

There was no clear relationship between a school team’s general level of motivation for the DBDM components at the start of the intervention and the degree to which DBDM was implemented. However, schools that achieved a low degree of DBDM implementation belonged to the lowest quartile of scores for “motivation for using the student monitoring system”. High levels of implementation were characterized by the highest levels of motivation for using the SMS.

Three out of the eight schools with the highest DBDM implementation scores had high scores for DBDM knowledge and skills at the start of the intervention (i.e., they knew which analyses were possible and were able to correctly interpret the results of such analyses), whereas three out of the eight low DBDM implementers performed poorly in terms of DBDM knowledge and skills.


Leadership, resources, and culture

Table 3 shows that schools with the lowest implementation scores all belong to the bottom quartile for instructional leadership at the start of the intervention. In other words, the team members of these schools did not consider their leaders to play an important role in the maintenance and improvement of instructional quality within their schools. School leaders in these schools rarely provided their teams with enough time and/or materials to work on DBDM. Schools with teacher leaders who were more active and supportive toward their teaching colleagues proved to be more successful in implementing DBDM.

The degree to which schools are performance oriented did not distinguish high and low implementers of DBDM, and neither did a cooperative attitude.

Although not all strong DBDM implementers were strong in terms of the standardization of DBDM activities, low DBDM implementers were schools in which the standardization of such activities was low.

Intervention exposure and continuity

All schools in which only part of the school team had been exposed to the intervention proved less successful in implementing DBDM. Furthermore, continuity in three categories of school staff (the school leader, academic coaches, and teachers) was strongly associated with the implementation of DBDM – less mobility among school staff appears to be important for success. Indeed, the top four DBDM implementers all had very stable school teams.

Academic coach

Schools with academic coaches perceived as knowledgeable regarding DBDM proved to be more successful in implementing DBDM.

Table 2. Implementation score per school, ranked from largest to smallest overall implementation score.

School   Classroom observations   Instructional plans   Trainer judgement   Overall implementation score
M        2.9                      3.3                   8.5                 14.7
A        3.0                      3.2                   8.0                 14.2
O        2.9                      2.8                   8.5                 14.2
N        2.8                      2.8                   8.5                 14.1
K        2.8                      3.0                   8.0                 13.8
D        3.0                      3.0                   7.8                 13.8
P        2.9                      2.8                   8.0                 13.7
E        2.8                      2.9                   7.8                 13.5
G        3.2                      2.9                   6.8                 12.9
C        2.6                      3.2                   6.5                 12.3
L        2.7                      2.7                   6.5                 11.9
I        3.3                      2.1                   5.8                 11.2
H        2.5                      2.5                   6.0                 11.0
J        2.9                      2.4                   5.5                 10.8
B        2.8                      2.6                   5.0                 10.4
F        3.0                      2.3                   4.5                 9.8


Table 3. Overview of schools (ranked top to bottom) with highest and lowest scores per variable. For each quantitative variable, schools in the top quartile are marked + and schools in the bottom quartile are marked −. Variables: 1. Motivation for DBDM (DBDM components; analyzing results); 2. DBDM knowledge & skills; 3. Instructional leader; 4. Innovation resources (time; materials); 5. Teacher leaders; 6. Performance-oriented school culture; 7. Cooperation attitude; 8. DBDM standardization; 9. Exposure to intervention; 10. Continuity (leader; coach; teachers); 11. Academic coach (knowledge & skills); 12. Context (recent and previous school quality judgement, size, SES, urbanization).


Context

Pertaining to the contextual variables, a school’s quality as judged by the school inspectorate was not associated with the level of DBDM implementation. However, schools varied very little on these variables. Almost all schools that were successful in implementing DBDM were small schools with a student population with an average SES. Interestingly, the majority of high-SES schools were in the lower end of the DBDM implementation distribution. There was too little between-school variation in terms of urbanization to be able to discern its relationship with the level of DBDM implementation.

Conclusion and discussion

Despite the global interest in DBDM, empirical studies into the process of implementing DBDM in schools are limited. The successful implementation of DBDM is an important prerequisite for investigating its impact on school performance. Thus, our goal was to investigate which school characteristics influence the DBDM implementation process. In the present study, we investigated which of 12 potentially relevant school characteristics found in a literature review influenced DBDM implementation. This was done by studying 16 schools in depth while they participated in an intensive, 2-year-long DBDM intervention.

This qualitative study had strengths (the longitudinal and in-depth study of school context variation from various perspectives) but also a number of limitations. The sample was relatively small, though it was this small size that enabled the in-depth approach. Furthermore, schools were relatively homogeneous in terms of urbanization, school size (few large schools), average student SES, and the inspectorate’s school quality judgement. The dependent variable was heavily influenced by the subjective opinion of the trainer. Some variables measured using self-reports might alternatively be measured using more valid measures (e.g., the resources school leaders provided to their staff for working on the intervention; teaching quality before and after the intervention). For example, the school indicated with “E” shows top-quartile perceptions on almost every variable. An explanation might be that this was a small school in a small town, with a team that had been working together for years, so the staff were part of each other’s social lives as well. Using more objective measures might have shed a different light on the factors we aimed to measure by means of perceptions. The collection of such measures would, however, be difficult, very time consuming, and expensive. Ranking schools is not easy, as it presupposes that a very precise distinction can be made between positions in the ranking (for that reason, we only focused on the extremes of the ranking).

Although the research design of this study does not allow for generalizations and causal claims, the longitudinal, in-depth investigation of how the school characteristics interacted with the DBDM intervention has provided a valuable basis for verification in larger-scale, (quasi-)experimental research. The findings of our exploration should be put to the test in studies with other designs, other samples, and for other DBDM interventions, as some of the schools only differed marginally in terms of implementation score (e.g., the scores for schools ranked 2–8 ranged from 14.2 to 13.5). This complicates exploring differences between those schools and investigating the relation with the independent variables at the school level. However, we were interested in comparing schools with high versus schools with low implementation scores. Given the ranges for those two categories of schools, it is possible to draw inferences about differences between schools with high and low implementation scores. Future research with larger-scale data collection may result in a wider distribution of scores, which may also influence the relations between the independent and dependent variables. Where research budgets allow, other instruments for data collection would also be worthwhile.

Our results confirm that schools with high and low implementation scores vary considerably and in important ways, and that a number of the differences between schools correspond with the varying degrees of success in implementing a DBDM intervention, which impacts and requires much from the school as a whole in terms of organizational structures, professionalism, resources, and effort.

In line with Timperley (2008), the findings showed that a school team’s general motivation towards DBDM was not related to DBDM implementation. It could be that participants can successfully implement DBDM if the intervention leads to meaningful training activities that help them do their job better, regardless of their initial attitude.

Surprisingly, motivation for the use of a student monitoring system was positively associated with DBDM implementation. This could be related to the fact that the use of the student monitoring system was one of the first steps in the intervention. Thus, if participants were not motivated to use it, it would have been unlikely that the subsequent steps, which built on the results of using the student monitoring system, would have been carried out effectively.

Developing DBDM knowledge and skills was an explicit aim of the intervention; therefore, it was remarkable that schools with the highest DBDM implementation scores also had the highest scores for initial DBDM knowledge and skills. Implementing DBDM is a complex endeavor, and therefore it is possible that schools can focus more on the other aspects of DBDM implementation if a core requirement for DBDM (the analysis and interpretation of student performance data) has already been fulfilled. Therefore, they can become more successful in implementing DBDM as a whole. It may also be that the fulfilment of this requirement prior to the intervention reflects that these schools are active, capable, and committed to make DBDM happen.

All schools in which only part of the school team had been exposed to the intervention were less successful in the implementation of DBDM, possibly because the implementation could not succeed without full collaboration. Working with the whole team on the innovation can also have a facilitating and supportive (from colleagues) effect, and stimulates a feeling of school-wide commitment to bringing about the innovation. Continuity of the school leader, academic coaches, and teachers appears to relate strongly to the implementation of DBDM. It is striking that this obvious variable receives little attention in the innovation literature, whereas it is imaginable that staff changes in school teams hinder the implementation of complex interventions that require sustained effort.

Both instructional leadership by the formal school leader and the presence of supportive teacher leaders proved to be related to the successful implementation of DBDM. A knowledgeable academic coach also enhances implementation. This is in line with our expectations: teachers are generally busy with a variety of tasks, and an intervention adds to their burden; leaders who encourage them and support them with vision, expertise, and facilities can make a difference (Lai & Schildkamp, 2013; Lange, Range, & Welsh, 2012; Levin & Datnow, 2012).

School organizational and cultural characteristics such as a school’s performance orientation, and a school’s attitude towards internal cooperation did not seem to be associated with DBDM implementation. However, attitudes towards cooperation might be very different from actual cooperative behavior. Measures of true cooperation structures might have provided more insight into the relevance of cooperation for DBDM.

Low DBDM implementers in our sample were schools in which the standardization of DBDM activities was also low; in other words, they left decisions on how DBDM activities were carried out more to the discretion of school staff, which may result in less consistency in DBDM implementation (Doolaard, 1999).

School-context features overall did not seem to be strongly related to implementation success. However, it was remarkable that small schools (with fewer than 150 students) appeared to be more successful in implementing DBDM than large schools (more than 350 students). In larger school teams, it might be more difficult to standardize DBDM activities and cooperate intensively, simply because more people are involved. On the other hand, small schools often work with multi-grade groups, which might impede DBDM in practice (Little, 2001, 2006) – but this clearly did not emerge in our study.

Furthermore, schools with an average-SES student population were generally more successful implementers than those with a high-SES student population. A possible explanation might be that the need for improvement was smaller in high-SES schools, since their students will achieve relatively well anyway (Scheerens & Bosker, 1997). The overall, global school quality judgement by the inspectorate did not seem to be associated with the level of DBDM implementation, which probably was caused by the fact that the schools in our sample varied very little on this variable (almost all schools had received the “basic” qualification).

What do these findings imply? If our findings are confirmed in future studies that allow causal claims, they may be translated into recommendations for schools that plan to implement DBDM. Although most of the school characteristics that proved to matter in our sample cannot be influenced by schools directly, some definitely can be. Schools could then enhance successful DBDM implementation by exposing the entire school team to the intervention and by reducing staff mobility. School leaders could also promote DBDM implementation by providing their teachers with sufficient time and materials to work on DBDM. The standardization of DBDM processes also seemed to be related to the successful implementation of DBDM. If this finding also holds in other studies, school leaders and trainers should pay attention to developing clear guidelines and agreements on the execution of DBDM activities (such as the analysis of student data, the instruction plan formats used, and when and how to discuss school performance data).

The results of our investigation may also have implications for those who design and execute similar interventions. In schools with poor leadership at the start of the innovation process (including weak teacher leaders and weak academic coaches), they may explicitly work on developing leadership to the required level as part of the intervention. This, in turn, may strengthen the effect of all their other DBDM implementation activities considerably, as without at least a sufficient level of leadership it may be very hard to successfully implement data-based decision making.

Note

1. Students were assigned extra “weight” if their parents completed lower levels of education. Students received an extra weight of 0.3 (maximum parental educational level of lower vocational education) or 1.2 (maximum parental educational level of primary education or special needs education). Schools received additional funding based on student weights, as it was assumed that schools with weighted students faced greater challenges.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes on contributors

Marieke van Geel works as a postdoctoral researcher at the University of Twente. Her current research is about differentiated instruction. She is specifically interested in data use, assessment, school improvement and teacher professional development.

Adrie J. Visscher works as a professor at the University of Twente and the University of Groningen in The Netherlands. His research focuses on the utilization of performance feedback for the improvement of student, teacher, and school performance.

Bernard Teunis used to work at the University of Twente, where he was appointed as one of the trainers who guided schools in implementing DBDM. Currently, he is working as a policy advisor for the Council of Primary Education (PO-Raad). His work is mainly focused on the professionalization of teachers and school leaders.

References

Becker, W. C. (1992). Direct instruction: A twenty year review. In R. P. West & L. A. Hamerlynck (Eds.), Designs for excellence in education: The legacy of B.F. Skinner (pp. 71–112). Longmont, CO: Sopris West.

Berends, M., Bodilly, S., & Kirby, S. N. (2002). Looking back over a decade of whole-school reform: The experience of New American schools. Phi Delta Kappan, 84, 168–175. doi:10.1177/003172170208400214

Berliner, D. C. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18–20. doi:10.3102/0013189X031008018

Blanc, S., Christman, J. B., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. E. (2010). Learning to learn from data: Benchmarks and instructional communities. Peabody Journal of Education, 85, 205–225. doi:10.1080/01619561003685379

Bodilly, S. J., Keltner, B., Purnell, S., Reichardt, R., & Schuyler, G. (1998). Lessons from New American schools’ scale-up phase: Prospects for bringing designs to multiple schools. Retrieved from http://www.rand.org/content/dam/rand/pubs/monograph_reports/1998/MR942.pdf

Boudett, K. P., City, E. A., & Murnane, R. J. (Eds.). (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.


Campbell, C., & Levin, B. (2009). Using data to support educational improvement. Educational Assessment, Evaluation and Accountability, 21, 47–65. doi:10.1007/s11092-008-9063-x

Carlson, D., Borman, G. D., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33, 378–398. doi:10.3102/0162373711412765

Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American Journal of Education, 118, 99–111. doi:10.1086/663272

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38, 181–199. doi:10.3102/0013189X08331140

Doolaard, S. (1999). Schools in change or schools in chains: The development of educational effectiveness in a longitudinal perspective. Enschede, The Netherlands: Twente University Press.

Faber, J. M., & Visscher, A. J. (2014). Digitale leerlingvolgsystemen: Een review van de effecten op leerprestaties [Digital student monitoring systems: A review of the effects on student achievement]. Retrieved from http://archief.kennisnet.nl/uploads/tx_kncontentelements/Digitale_leerlingvolgsystemen_Kennisnet_eindversie_20022014.pdf

Fink, D., & Brayman, C. (2006). School leadership succession and the challenges of change. Educational Administration Quarterly, 42, 62–89. doi:10.1177/0013161X05278186

Fullan, M. (1991). The new meaning of educational change. New York, NY: Teachers College Press.

Geijsel, F. P., Sleegers, P. J. C., Stoel, R. D., & Krüger, M. L. (2009). The effect of teacher psychological and school organizational and leadership factors on teachers’ professional learning in Dutch schools. The Elementary School Journal, 109, 406–427. doi:10.1086/593940

Hallinger, P., & Heck, R. H. (2011). Exploring the journey of school improvement: Classifying and analyzing patterns of change in school improvement processes and learning outcomes. School Effectiveness and School Improvement, 22, 1–27. doi:10.1080/09243453.2010.536322

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from https://ies.ed.gov/ncee/wwc/Docs/PracticeGuide/dddm_pg_092909.pdf

Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the “data-driven” mantra: Different conceptions of data-driven decision making. Yearbook of the National Society for the Study of Education, 106, 105–131. doi:10.1111/j.1744-7984.2007.00099.x

Inspectie van het Onderwijs. (2010). Opbrengstgericht werken in het basisonderwijs [Data-based decision making in primary education]. Utrecht, The Netherlands: Author.

Keuning, T., Van Geel, M., Visscher, A., Fox, J.-P., & Moolenaar, N. M. (2016). The transformation of schools’ social networks during a data-based decision making reform. Teachers College Record, 118(9), 1–33.

Kingston, N., & Nash, B. (2011). Formative assessment: A meta-analysis and a call for research. Educational Measurement: Issues and Practice, 30(4), 28–37. doi:10.1111/j.1745-3992.2011.00220.x

Lai, M. K., & McNaughton, S. (2013). Analysis and discussion of classroom and achievement data to raise student achievement. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 23–47). Dordrecht, The Netherlands: Springer. doi:10.1007/978-94-007-4816-3

Lai, M. K., & Schildkamp, K. (2013). Data-based decision making: An overview. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 9–22). Dordrecht, The Netherlands: Springer. doi:10.1007/978-94-007-4816-3

Lange, C., Range, B., & Welsh, K. (2012). Conditions for effective data use to improve schools: Recommendations for school leaders. International Journal of Educational Leadership Preparation, 7(3), 1–11.

Leithwood, K., Harris, A., & Hopkins, D. (2008). Seven strong claims about successful school leadership. School Leadership & Management, 28, 27–42. doi:10.1080/13632430701800060


Levin, J. A., & Datnow, A. (2012). The principal role in data-driven decision making: Using case-study data to develop multi-mediator models of educational reform. School Effectiveness and School Improvement, 23, 179–201. doi:10.1080/09243453.2011.599394

Little, A. W. (2001). Multigrade teaching: Towards an international research and policy agenda. International Journal of Educational Development, 21, 481–497. doi:10.1016/S0738-0593(01) 00011-6

Little, A. W. (Ed.). (2006). Education for all and multigrade teaching: Challenges and opportunities. Dordrecht, The Netherlands: Springer. doi:10.1007/1-4020-4591-3

Locke, E. A., & Latham, G. P. (2002). Building a practically useful theory of goal setting and task motivation: A 35-year odyssey. American Psychologist, 57, 705–717. doi:10.1037/0003-066X.57.9.705

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47, 71–85. doi:10.1080/00461520.2012.667064

Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

Scheerens, J., & Bosker, R. J. (1997). The foundations of educational effectiveness. Oxford, UK: Elsevier.

Schildkamp, K., Ehren, M., & Lai, M. K. (2012). Editorial article for the special issue on data-based decision making around the world: From policy to practice to results. School Effectiveness and School Improvement, 23, 123–131. doi:10.1080/09243453.2011.652122

Schildkamp, K., Lai, M. K., & Earl, L. (Eds.). (2013). Data-based decision making in education: Challenges and opportunities. Dordrecht, The Netherlands: Springer. doi:10.1007/978-94-007-4816-3

Staman, L., Visscher, A. J., & Luyten, H. (2014). The effects of professional development on the attitudes, knowledge and skills for data-driven decision making. Studies in Educational Evaluation, 42, 79–90. doi:10.1016/j.stueduc.2013.11.002

Timperley, H. (2008). Teacher professional learning and development. Brussels, Belgium: International Academy of Education.

Turner, E. O., & Coburn, C. E. (2012). Interventions to promote data use: An introduction. Teachers College Record, 114(11), 1–13.

Tyack, D., & Tobin, W. (1994). The “grammar” of schooling: Why has it been so hard to change? American Educational Research Journal, 31, 453–479. doi:10.3102/00028312031003453

Van de Grift, W. (2007). Quality of teaching in four European countries: A review of the literature and application of an assessment instrument. Educational Research, 49, 127–152. doi:10.1080/00131880701369651

Van Geel, M., Keuning, T., Visscher, A. J., & Fox, J.-P. (2016). Assessing the effects of a school-wide data-based decision-making intervention on student achievement growth in primary schools. American Educational Research Journal, 53, 360–394. doi:10.3102/0002831216637346

Van Veen, K., Zwart, R., & Meirink, J. (2011). What makes teacher professional development effective? A literature review. In M. Kooy & K. van Veen (Eds.), Teacher learning that matters: International perspectives (pp. 3–21). New York, NY: Routledge.

Visscher, A. J., & Coe, R. (Eds.). (2002). School improvement through performance feedback. Lisse, The Netherlands: Swets & Zeitlinger.

Visscher, A. J., & Ehren, M. (2011). De eenvoud en complexiteit van Opbrengstgericht Werken [The simplicity and complexity of data-driven teaching]. Retrieved from http://www.rijksoverheid.nl/documenten-en-publicaties/rapporten/2011/07/13/de-eenvoud-en-complexiteit-van-opbrengstgericht-werken.html

Wong, K. K., & Meyer, S. J. (1998). Title I schoolwide programs: A synthesis of findings from recent evaluation. Educational Evaluation and Policy Analysis, 20, 115–136. doi:10.3102/01623737020002115
