

Detailed analyses

2.1.1 PISA indicators of educational achievement

PISA conducts internationally standardized and nationally representative tests to measure the performance of 15-year-olds in reading, maths and science.6 It has carried out surveys every three years since 2000 and now has 70 participating countries. Performance is mapped on a scale with a mean of 500 test-score points and a standard deviation of 100 points across the OECD countries. As explained by the OECD (2014a), not all PISA results can be compared over time due to differences in scaling, sampling and testing conditions.7 As an important example, while the reading test has been uniformly scaled since 2000, the maths and science tests have had uniform scales only since 2003 and 2006, respectively. Following the OECD (2014a), we include only those countries with valid data to compare between assessments. This implies, among other things, that we use all five available waves for reading (2000, 2003, 2006, 2009 and 2012) and exclude wave 2000 for maths and waves 2000 and 2003 for science.8

4 We use the OECD's PISA data – rather than the IEA's TIMSS or PIRLS data, or the OECD's PIAAC data – due to its larger coverage. Hanushek and Woessmann (2010) find that PISA and TIMSS scores are highly correlated.
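To make the metric concrete, the sketch below illustrates the final rescaling step behind a scale with an OECD mean of 500 and a standard deviation of 100. It is a simplification under stated assumptions: real PISA scores come from an item-response-theory model with sampling weights and plausible values, which are ignored here, and the raw scores below are simulated.

```python
import numpy as np

# Simulated raw proficiency estimates for a pooled OECD student sample
# (a stand-in for the IRT-based estimates PISA actually produces).
rng = np.random.default_rng(0)
raw = rng.normal(loc=0.0, scale=1.2, size=10_000)

# Rescale so the pooled distribution has mean 500 and standard deviation 100.
pisa_scale = 500 + 100 * (raw - raw.mean()) / raw.std()

print(round(pisa_scale.mean()), round(pisa_scale.std()))  # -> 500 100
```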

Mean PISA test scores

Our first outcome indicator is the mean PISA test score in maths, reading or science. Table 2.2 shows maths scores in 2003 and 2012 for each of the countries studied in this volume; Table 2.3 shows reading scores in 2000 and 2012 (OECD 2014a). Countries are grouped by region and ranked according to their 2012 score. Both tables also document the changes in intermediate years. The bar chart on the right-hand side of each table visually compares countries' performance in 2003 (or 2000) with that in 2012 (for more information on how to read the tables in this report, see page 49).9 As can be seen, test scores vary substantially across the seven regions and 36 countries. To gauge the magnitude of these differences: a difference of 41 score points corresponds to approximately one year of formal schooling (OECD 2014a). A few results stand out in both tables and hence hold for both maths and reading:

1 The countries in Eastern Asia (Korea and Japan) outperformed all European countries in 2012 (as well as all others).

2 Bulgaria, Cyprus and Romania achieved the lowest 2012 scores.

3 Southern European scores in 2012 were considerably lower on average than those of Western and Northern Europe.

4 Western European countries did not perform differently on average from their peers in Northern Europe, Northern America and Oceania in 2012.

5 Intraregional differences in 2012 were largest in Central and Eastern Europe, with scores ranging from around 440 in Bulgaria and Romania to around 520 in Estonia and Poland.

6 Countries with low scores in the first year for which data are available had in most cases improved their scores by 2012, while countries with high initial scores had in most cases seen their scores deteriorate by 2012. This suggests that countries' test scores converge over time.10 A simple check of this pattern is sketched after this list.

7 Romania, Poland and Bulgaria (maths) and Poland, Latvia and Germany (reading) achieved the largest improvements in test scores, while Sweden, Finland and New Zealand (both maths and reading) witnessed the largest declines.
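As a rough illustration of point 6, the sketch below relates countries' first available maths score to their subsequent change, using the subset of countries with 2003 data in Table 2.2; a negative slope is consistent with convergence. This is only an informal check on a partial sample, not the analysis referred to in the note.

```python
import numpy as np

# Mean PISA maths scores from Table 2.2 (OECD 2014a): 2003 and 2012 values
# for a subset of countries with data in both years.
scores_2003_2012 = {
    "Switzerland": (527, 531), "Netherlands": (538, 523), "Belgium": (529, 515),
    "Finland": (544, 519), "Sweden": (509, 478), "Portugal": (466, 487),
    "Italy": (466, 485), "Latvia": (483, 491), "United States": (483, 481),
}

initial = np.array([s[0] for s in scores_2003_2012.values()], dtype=float)
change = np.array([s[1] - s[0] for s in scores_2003_2012.values()], dtype=float)

# Regress the 2003-2012 change on the 2003 score: a negative slope means that
# countries starting low tended to gain while those starting high tended to lose.
slope, intercept = np.polyfit(initial, change, deg=1)
print(f"slope = {slope:.2f}, correlation = {np.corrcoef(initial, change)[0, 1]:.2f}")
```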

We should note that the quality of education, while important, is not the only determinant of test scores. This carries the danger that score differences are attributed to education while in some cases they may stem from different social conditions (Dronkers 2011). Merry (2013), for example, demonstrates that the U.S. deficit in reading relative to Canada already existed at ages 4-5, before formal schooling had a chance to matter. Hence, while the test scores reported here reflect a country's performance in terms of cognitive skills and human capital, they do not necessarily reflect the quality of its education.

8 Results of the 2015 PISA assessment were not yet available at the time of writing.

9 Table A2.1 in the appendix to this chapter (www.scp.nl) reports maths and reading scores for all available years (OECD 2014a). Table A2.2 is used as part of our statistical analysis of the factors that drive educational outcomes.

How to read the tables in this report

[Annotated example: an excerpt of Table 2.2 (mean PISA maths scores for Switzerland, the Netherlands, Belgium, Finland, Denmark and Norway, 2003-2012), with the bar chart axis running from 400 to 560 and coloured arrows marking the largest increase and the largest decrease between 2003 and 2012.]

This study compares 36 countries in seven regions. The different regions and countries each have their own colour, which will be used in all tables and figures throughout this book. In Chapter 9 we make an exception by presenting a separate figure for each region and comparing only the countries within each region.

We use data for the period from 1995 up to the most recent available year. If data for 1995 are not available, we use the first available year after 1995 (in this example 2003). If no score is available for 2003, the first available score is instead reported in the column of the relevant subsequent year (in this example 2006, 2009 or 2012).

The last score column reports data for the most recent available year. In cases where these data are not available, the last available data (if any) are instead reported in the column for the relevant previous year.

The intermediate columns show the increase or decrease compared with the previous score.

We computed the net increase or decrease in a country's maths score over the period from the first available to the last available year. The three countries with the largest net increase (or the largest net decrease) are highlighted by using distinct colours for their arrows. In cases where multiple countries have the same net increase or decrease, we highlight more than three countries.

Table 2.2 Mean PISA maths scores

The 2006, 2009 and 2012 columns show the change relative to the previous assessment; a dot marks a missing score, in which case the next available column shows the score itself rather than a change.

Region                       Country            2003   2006   2009   2012   2012 score
Western Europe               Switzerland         527     +3     +4     –3      531
                             Netherlands         538     –7     –5     –3      523
                             Belgium             529     –9     –5      0      515
                             Germany             503     +1     +9     +1      514
                             Austria             506     –1      .    506      506
                             Ireland             503     –2    –14    +14      501
                             France              511    –15     +1     –2      495
                             United Kingdom        .    495     –3     +2      494
                             Luxembourg          493     –3     –1     +1      490
Northern Europe              Finland             544     +4     –7    –22      519
                             Denmark             514     –1    –10     –3      500
                             Norway              495     –5     +8     –9      489
                             Sweden              509     –7     –8    –16      478
Southern Europe              Portugal            466      0    +21      0      487
                             Italy               466     –4    +21     +2      485
Central and Eastern Europe   Czech Republic      516     –6    –17     +6      499
                             Latvia              483     +3     –4     +9      491
                             Slovak Republic     498     –6     +5    –15      482
                             Lithuania             .    486     –9     +2      479
                             Hungary             490     +1     –1    –13      477
                             Croatia               .    467     –7    +11      471
                             Romania               .    415    +12    +18      445
                             Bulgaria              .    413    +15    +11      439
Oceania                      Australia           524     –4     –6    –10      504
                             New Zealand         523     –1     –3    –19      500
Northern America             Canada              532     –5      0     –9      518
                             United States       483     –9    +13     –6      481

Source: OECD (2014a).

For reading instructions see page 49

Table 2.3 Mean PISA reading scores

The 2003, 2006, 2009 and 2012 columns show the change relative to the previous assessment; a dot marks a missing score, in which case the next available column shows the score itself rather than a change.

Region                       Country            2000   2003   2006   2009   2012   2012 score
Western Europe               Ireland             527    –12     +2    –21    +27      523
                             Netherlands           .    513     –6     +1     +3      511
                             Switzerland         494     +5      0     +2     +8      509
                             Belgium             507      0     –6     +5     +3      509
Northern Europe              Finland             546     –3     +4    –11    –12      524
                             Norway              505     –5    –16    +19     +1      504
Central and Eastern Europe   Czech Republic      492     –3     –6     –5    +15      493
                             Latvia              458    +33    –12     +5     +5      489
                             Hungary             480     +2      0    +12     –6      488
                             Croatia               .      .    477     –1     +9      485
                             Slovenia              .      .    494    –11     –2      481
                             Lithuania             .      .    470     –2     +9      477
                             Slovak Republic       .    469     –3    +11    –14      463
                             Romania             428      .    396    +28    +14      438
                             Bulgaria            430      .    402    +27     +7      436
Oceania                      New Zealand         529     –7     –1      0     –9      512
                             Australia           528     –3    –12     +2     –3      512

Source: OECD (2014a).

For reading instructions see page 49

Table 2.4 Inequality based on parental socioeconomic status

Values are the percentage of variation in students' PISA maths scores explained by parental socioeconomic status. The middle column shows the change between 2003 and 2012; a dot marks a missing 2003 value, in which case the 2012 value is shown instead of a change.

Region                       Country            2003   change   2012
Western Europe               France               20       +2     22
                             Belgium              23       –3     20
                             United Kingdom        .       12     12
                             Netherlands          18       –6     12
Central and Eastern Europe   Slovak Republic      24       +1     25
                             Hungary              26       –3     23
                             Bulgaria              .       22     22
                             Romania               .       19     19
                             Poland               16       +1     17
                             Slovenia              .       16     16
                             Czech Republic       18       –2     16
                             Latvia               12       +3     15
                             Lithuania             .       14     14
                             Croatia               .       12     12
                             Estonia               .        9      9
Oceania                      New Zealand          17       +1     18
                             Australia            14       –2     12
Northern America             United States        19       –4     15
                             Canada               10       –1      9

Notes: For comparability over time, PISA 2003 values on the PISA index of economic, social and cultural status have been rescaled to the PISA 2012 scale of the index. Source: OECD (2014b).

For reading instructions see page 49

The descriptive analysis above is confined to maths and reading because PISA science scores are available only since 2006. One may ask whether the analysis would have produced different results for science if the data had been available. A more general question is to what extent countries' maths, reading and science scores move together. In other words, if a country does well in maths, is it also likely to do well in reading and science? We addressed this question by computing correlations between the mean 2012 maths, reading and science scores and found these to be very high, indicating that countries' average scores across the different subjects do indeed move together to a large extent.11 Table A2.3 in the appendix to this chapter reports the mean science scores for all available years (OECD 2014a).
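By way of illustration, the sketch below computes the cross-subject correlation for the 2012 maths and reading means of the countries that appear in both Table 2.2 and Table 2.3; the correlations reported in note 11 are based on the full set of countries and also include science (Table A2.3).

```python
import numpy as np

# 2012 mean maths and reading scores for countries listed in both
# Table 2.2 and Table 2.3 (OECD 2014a).
maths_reading_2012 = {
    "Ireland": (501, 523), "Netherlands": (523, 511), "Switzerland": (531, 509),
    "Belgium": (515, 509), "Finland": (519, 524), "Norway": (489, 504),
    "Czech Republic": (499, 493), "Latvia": (491, 489), "Hungary": (477, 488),
    "Croatia": (471, 485), "Lithuania": (479, 477), "Slovak Republic": (482, 463),
    "Romania": (445, 438), "Bulgaria": (439, 436), "New Zealand": (500, 512),
    "Australia": (504, 512),
}

maths = np.array([v[0] for v in maths_reading_2012.values()], dtype=float)
reading = np.array([v[1] for v in maths_reading_2012.values()], dtype=float)

# Pearson correlation between countries' mean maths and reading scores.
print(f"maths-reading correlation: {np.corrcoef(maths, reading)[0, 1]:.2f}")
```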

The proportion of variation in PISA test scores explained by socioeconomic status

Our second outcome indicator is the proportion of variation in PISA test scores that is explained by parental socioeconomic status. According to the OECD (2014b), equal opportunity in education does not imply that all students will have the same outcomes from education, but it does mean that students' socioeconomic status has little or no impact on their performance. In line with this definition, the OECD (2014b) constructs an indicator of inequality of educational opportunity by assessing statistically, and for each country separately, how much of the variation in students' PISA maths scores can be explained by their socioeconomic status.12 Table 2.4 shows the 2003 and 2012 values of this indicator for each of the countries studied in this volume. It also documents the changes in inequality between 2003 and 2012. Countries are grouped by region and ranked according to their 2012 level of inequality. The bar chart on the right-hand side of the table visually compares the levels of inequality in 2003 and 2012. A few results stand out in Table 2.4:

1 The Eastern Asian countries not only achieved the highest average 2012 performance in maths (Table 2.2) but were also among the countries with the lowest inequality, the others being the Northern European countries (except Denmark), Estonia, Canada and Italy.

2 Inequality in 2012 was highest in the Slovak Republic, Hungary, France and Bulgaria.

3 Western European countries performed worse on average in 2012 than their peers in Eastern Asia, Northern Europe and Northern America, but not very differently from their peers in Oceania and Southern Europe.

4 Intraregional differences in 2012 were largest in Central and Eastern Europe, with inequality levels ranging from around 9% in Estonia to around 25% in the Slovak Republic and Hungary.

5 Intraregional differences were also fairly large in Western Europe, with inequality being relatively low in the Netherlands, the United Kingdom and Switzerland, but high in France and Belgium.

6 Inequality based on parental socioeconomic status declined in most countries between 2003 and 2012.

7 Germany, the Netherlands and Switzerland achieved the largest reductions in inequality, while Spain, Latvia and France witnessed the largest increases.

11 The correlations were 0.90 for maths and reading, 0.92 for maths and science, and 0.93 for reading and science (N = 35).

12 More precisely, this indicator corresponds to the R-squared of separate bivariate regressions of a student's maths score on his/her parental socioeconomic status (PISA). It is available only for maths and only for the years 2003 and 2012.
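To make the definition in note 12 concrete, the sketch below computes the indicator for simulated student-level data: the R-squared of a bivariate regression of maths scores on socioeconomic status, which for a single predictor equals the squared correlation. The data and coefficients are invented for illustration; the actual indicator is estimated from each country's PISA microdata.

```python
import numpy as np

rng = np.random.default_rng(1)

def inequality_indicator(escs: np.ndarray, maths: np.ndarray) -> float:
    """R-squared of the bivariate regression of maths scores on socioeconomic status."""
    r = np.corrcoef(escs, maths)[0, 1]
    return r ** 2

# Simulated students for one hypothetical country: a socioeconomic status index
# (mean 0, SD 1) and maths scores that partly depend on it.
escs = rng.normal(size=5_000)
maths = 480 + 35 * escs + rng.normal(scale=80, size=5_000)

share = inequality_indicator(escs, maths)
print(f"{share:.0%} of the variation in maths scores is explained by status")
```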

Countries that perform well on average do not always achieve equal opportunity

To assess how inequality based on socioeconomic status is related to average performance, we plotted inequality against countries' mean PISA maths scores in 2012 (see Figure A2.1 in the appendix to this chapter).

It turns out that countries with better average maths performance also seem to do better in providing equal opportunity to students from disadvantaged backgrounds. However, this association is fairly weak.13 This is in line with the observation that some countries with similar average performances have distinctly different levels of inequality.

Examples include the Slovak Republic and Italy, Luxembourg and Norway, France and the United Kingdom, and Belgium and Finland. These results appear to be consistent with the notion that policies aimed at improving average educational performance only go some way towards achieving equal opportunity for students from disadvantaged social backgrounds.14
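A rough version of the comparison behind Figure A2.1 is sketched below, restricted to the countries whose 2012 values appear in both Table 2.2 and Table 2.4; the weak association reported in the text refers to the full sample, not to this subset.

```python
import matplotlib.pyplot as plt
import numpy as np

# 2012 values from Tables 2.2 and 2.4: (mean maths score, % of score variation
# explained by socioeconomic status), for countries present in both tables.
performance_inequality = {
    "France": (495, 22), "Belgium": (515, 20), "United Kingdom": (494, 12),
    "Netherlands": (523, 12), "Slovak Republic": (482, 25), "Hungary": (477, 23),
    "Bulgaria": (439, 22), "Romania": (445, 19), "Czech Republic": (499, 16),
    "Latvia": (491, 15), "Lithuania": (479, 14), "Croatia": (471, 12),
    "New Zealand": (500, 18), "Australia": (504, 12), "United States": (481, 15),
    "Canada": (518, 9),
}

maths = np.array([v[0] for v in performance_inequality.values()], dtype=float)
inequality = np.array([v[1] for v in performance_inequality.values()], dtype=float)

# A negative correlation means better-performing countries tend to have
# lower inequality; a value close to zero indicates a weak association.
print(f"correlation: {np.corrcoef(maths, inequality)[0, 1]:.2f}")

plt.scatter(maths, inequality)
plt.xlabel("Mean PISA maths score, 2012")
plt.ylabel("% of score variation explained by status, 2012")
plt.title("Average performance versus inequality (subset of countries)")
plt.show()
```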

2.1.2 ICCS indicators of civic knowledge, value beliefs and