PISA for Development: Expanding the Global Education Community Esperanto or Developing a Dialect?
By Camilla Addey, Humboldt University in Berlin[1]
In the 1990s, the International Large-Scale Assessment (ILSA) phenomenon took off, and over the following two decades it saw an impressive increase in the number of participating countries and in the ILSA programmes available – the most widely known being the Programme for International Student Assessment (PISA). Although the main ILSA administrators, the Organisation for Economic Co-operation and Development (OECD) and the International Association for the Evaluation of Educational Achievement (IEA), once questioned the validity of international comparisons of learning outcomes, they now rate and rank educational performance and skills, describing such comparisons as an indispensable policy tool. Henry et al. described this as a shift ‘from philosophical doubt to statistical certainty’ (2001: 90).
Lower and middle income countries have taken part in PISA since it was first implemented, but it was not until 2013 that the OECD publicly acknowledged that PISA data were of limited policy relevance in such contexts: the PISA tests were not designed to capture a clustering of performance at the lower levels, nor are the samples sufficiently representative when large proportions of 15 year olds are out of school (and hence do not take the test). Officially initiated in 2012, and unofficially in the making since India’s poor PISA experience in 2009, PISA for Development (PISA-D) is a programme the OECD is determined to make a success story.[2]
To understand the OECD’s recent work (including PISA-D), it is important to acknowledge that in 2012, at the OECD’s 50th Anniversary Council Meeting at Ministerial level, concern was expressed that the OECD membership had become a historical relic. The OECD Member States agreed that the world’s centre of economic gravity had moved over the previous fifty years and that, with it, global economic governance had shifted. The meeting was followed by the publication of an OECD Strategy on Development stating the Organization’s new vision: it would become a more inclusive policy-sharing Organization, sharing its evidence-based approaches to policy with what it defines as ‘developing countries’. In addition, the OECD was aware (and fearful) of losing the global traction it had recently gained in education with PISA (Bloem 2015): it needed to innovate and expand.
The expansion of PISA to include lower and middle income countries raises profound questions about the significance of PISA in such contexts, and about its claim to produce more policy-relevant data whilst ensuring comparability with the main PISA instrument. For PISA-D data to be comparable with PISA data, there are limits to how much PISA-D can be ‘enhanced’ to become ‘meaningful and interpretable in national contexts’ (as described in the initial PISA-D meetings).
Over the last three years, the OECD has been working closely with its PISA-D private contractors (Educational Testing Service, The Learning Bar, cApStAn, Westat, and Pearson), seven countries (Ecuador, Paraguay, Honduras, Guatemala, Cambodia, Zambia and Senegal), aid partners,[3] expert partners,[4] and assessment programmes.[5] So what does PISA for Development look like after three years of negotiations? Has the programme been ‘enhanced’ to make it more relevant to the contexts where PISA-D is being implemented? What will the PISA-D data look like when they are published in late 2018?
Drawing on observations of an international PISA-D meeting[6] and on interviews carried out in 2015 and 2016 at the OECD, at The Learning Bar (a private contractor developing the PISA-D background questionnaires), and with high-level policy actors in Ecuador and Paraguay (two PISA-D countries), I examined how PISA-D’s policy-relevance threshold was established. It appears that many different interests were involved in the making of PISA-D, including sharing policy knowledge beyond the OECD membership, geopolitical expansion, business opportunities, and political ties with the global PISA community. Reading these findings through the notions of epistemic communities (Haas 1992) and socio-material semiotics (Law 2008) suggests that principled, normative and causal beliefs, shared understandings of validity, and shared policy agendas have been accepted temporarily by actors who are piggy-backing their interests on the PISA-D assemblage. The PISA-D assemblage draws greatly on the success and prestige of the main PISA, which automatically transfers to all those involved with it. But that prestige comes at a cost for PISA-D actors.
To benefit from the main PISA’s global prestige, PISA-D actors had to decide what they preferred: contextual policy relevance or comparability with PISA. Interviewees speak of sacrificing data that reflect country realities in order to preserve comparability with PISA, to build stronger ties with the main PISA, and to take part in global rituals of belonging (Addey 2015). This, however, does not mean that PISA-D is not relevant, but that it is relevant in a different way than one might deduce from the OECD’s policy-relevance claim. I suggest two interpretations, which are not mutually exclusive. On the one hand, it might be suggested that in the PISA era, policy relevance is about which knowledge counts rather than which knowledges are relevant. On the other hand, it might be that PISA and PISA-D are not about the data. In December 2015, I interviewed the OECD’s Andreas Schleicher, who very honestly stated that PISA ‘is not really about the scores’. He described PISA as the global education community’s common language. In other words, PISA is an Esperanto. What has been observed with PISA-D might be described as a desire to speak the global education community’s official language, rather than to develop an Esperanto dialect.
Camilla Addey is based at Humboldt University in Berlin where she researches global education policy and international large-scale assessments in lower and middle income countries. Email: Camilla.Addey@hu-berlin.de
[1] This blog draws on data gathered for the author’s ‘PISA for Development for Policy – P4D4Policy’ research project, carried out with the support of the Fritz Thyssen Foundation.
[2] Author’s interviews with OECD staff.
[3] These include: France, the Inter-American Development Bank (IADB), Korea, the World Bank, the Global Partnership for Education (GPE), Norway (Norad), UK (DFID), Germany (BMZ/GIZ), Japan (JICA) and Ireland (Irish Aid).
[4] These include: UNESCO, the UNESCO Institute for Statistics (UIS), the Global Education Monitoring Report (GEMR) team, UNICEF, Education International, and the PISA and PIAAC teams.
[5] These include: ASER – Annual Status of Education Report; EGRA – Early Grade Reading Assessment; EGMA – Early Grade Math Assessment; SACMEQ – Southern and Eastern Africa Consortium for Monitoring Educational Quality; PASEC – Programme d’Analyse des Systèmes éducatifs des États et gouvernements membres de la CONFEMEN; PIRLS – Progress in International Reading Literacy Study; TIMSS – Trends in International Mathematics and Science Study; LLECE – Latin American Laboratory for Assessment of the Quality of Education; LAMP – Literacy Assessment and Monitoring Programme.
[6] The International Advisory Group meeting in Asuncion in March 2016.
References
Addey, C. (2015). Participating in international literacy assessments in Lao PDR and Mongolia: a global ritual of belonging. In M. Hamilton, B. Maddox and C. Addey (Eds.), Literacy as Numbers: Researching the Politics and Practices of International Literacy Assessment. Cambridge: Cambridge University Press.
Bloem, S. (2015). The OECD Directorate for Education as an Independent Knowledge Producer through PISA. In H.-G. Kotthoff and E. Klerides (Eds.), Governing Educational Spaces: Knowledge, Teaching, and Learning in Transition. Rotterdam: Sense Publishers.
Haas, P. (1992). Introduction: Epistemic communities and international policy coordination. International Organization, 46(1).
Henry, M., B. Lingard, et al. (2001). The OECD, Globalization and Education Policy. London: Pergamon.
Law, J. (2008). On Sociology and STS. The Sociological Review, 56(4).
Other NORRAG Blogs about PISA for Development:
- Can the Measurement of Learning Outcomes Lead to Quality Education for All? – By Pablo Zoido (former OECD), Michael Ward, Kelly Makowiecki, Lauren Miller, Catalina Covacevich (OECD) (July 21st, 2016)
- Expanding PISA: The OECD and Global Governance in Education – By Sam Sellar and Bob Lingard, School of Education, The University of Queensland. (June 20th, 2014)
- PISA for Development: One World, One Measure for Learning? – By Angeline M. Barrett, University of Bristol. (January 13th, 2014)
- PISA in Low and Middle Income Countries – By Simone Bloem, formerly OECD. (January 10th, 2014)
- PISA for Development and the Post-2015 Agenda – By Michael Davidson, Michael Ward and Alejandro Gomez Palma, OECD. (January 8th, 2014)
NORRAG (Network for International Policies and Cooperation in Education and Training) is an internationally recognised, multi-stakeholder network which has been seeking to inform, challenge and influence international education and training policies and cooperation for almost 30 years. NORRAG has more than 4,500 registered members worldwide and is free to join.