
Data inaccuracy is a global problem — a plea for decolonizing the debate on the quality of statistics

Policy makers and researchers want their analyses and advice to be evidence-based. Economic and social statistics seem to provide the hard data needed to make decisions, but those statistics are often inaccurate and imprecisely measured. In this blog article, ISS Professor of International Economics and Macroeconomics Peter van Bergeijk points out the biases of academia in the global North, showing that its pretension of statistical superiority perpetuates power imbalances in academic and policy discourses.

In the realm of economic analysis, the pursuit of precision has long been hailed as the ultimate goal. Yet behind the veil of statistical exactitude lies a disturbing reality: the pervasive presence of measurement errors, typically disregarded and sidelined. Data inaccuracy may at first sight look like a very boring issue, but it is highly relevant for development studies, among other reasons because of the data-based nature of the Sustainable Development Goals (SDGs). Large data-driven projects such as the SDGs are seen as a major step forward, but because no attention is paid to the accuracy of the target variables when quantified goals are selected, disappointment is, so to speak, built in.

A good example is the goal to reduce global poverty. Espen Prydz, Dean Jolliffe, and Umar Serajuddin compared per capita income statistics calculated from the national accounts with those derived from nationwide household surveys. The numbers should be the same, but the difference can be as much as 50%. They found that for the year 2011, the World Bank’s target of reducing global poverty to less than 3% was met when national accounts were used, but that the number of people living in poverty was actually twice as high when household surveys were considered. Those who do not recognize or report inaccuracies such as these rest on their laurels before the work is done.
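To see how much such a gap matters, consider a minimal back-of-the-envelope sketch in Python. The numbers are purely hypothetical (they are not the Prydz, Jolliffe, and Serajuddin data); the point is simply that applying the same poverty line to two income measures that differ by roughly 50% yields very different poverty headcounts.

```python
# Illustrative sketch with hypothetical numbers: the same poverty line applied to
# two income measures that differ by ~50% can flip whether a poverty target is met.

poverty_line = 1.90                                       # USD per person per day (hypothetical line)
national_accounts_income = [2.4, 3.1, 1.8, 5.0, 2.2]      # per capita incomes, hypothetical
survey_income = [x / 1.5 for x in national_accounts_income]  # surveys measure ~50% lower, hypothetical

def poverty_rate(incomes, line):
    """Share of people with income below the poverty line."""
    return sum(1 for x in incomes if x < line) / len(incomes)

print(f"Poverty rate (national accounts): {poverty_rate(national_accounts_income, poverty_line):.0%}")
print(f"Poverty rate (household surveys): {poverty_rate(survey_income, poverty_line):.0%}")
```

With these made-up figures the national-accounts measure suggests the target is nearly met, while the survey-based measure implies a poverty rate several times higher: exactly the kind of divergence the comparison above documents.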

It is unfortunately common practice for economists to sweep data inaccuracies under the rug. We are confounded by data that do not behave the way we want them to, so we continue to ignore the problem. Obscuring the existence of these inaccuracies means that we do not know the extent of the inconsistencies. And, as I argue below, the mainstream ignores its own mistakes while emphasizing the statistical problems of developing countries.

A rude awakening

The journey into the murky waters of economic data accuracy often begins with a rude awakening. For many, like me, it was during our early professional endeavours that we encountered an unsettling truth: economic and social statistics, even from reputed sources, are riddled with inconsistencies and inaccuracies. The disillusionment can be profound when young professionals realize that the numbers they rely on to inform critical decisions are far from infallible. I myself was introduced to the issue only when working through the reading list for my final exam. The examiner asked what I thought about this, and I told him that it should have been part of the teaching from day one.

Yet the issue has persisted for many decades. In his seminal work On the Accuracy of Economic Observations, published in 1950 as a discussion paper and in 1963 as a monograph by Princeton University Press, Oskar Morgenstern was one of the first to expose the shadows of data imperfection. Through meticulous case studies, Morgenstern revealed the inaccuracies that plague economic and social statistics; he uncovered measurement errors of 20% to 50%, even for widely quoted numbers such as GDP, international trade, and the current account of countries worldwide. The areas from which Morgenstern drew his examples are wide-ranging and include agriculture, natural resources, (un)employment, prices, and production. In my latest book, which I briefly discuss below, I show that these problems persist by redoing Morgenstern’s research and adding some of the SDG indicators (poverty, health and nutrition, and illegal flows).

Figure 1. Only a few economic analyses have few data inaccuracies; most studies have significant error rates. Source: Peter A.G. van Bergeijk (2024), On the inaccuracies of macroeconomic observations, National Accounting Review, Figure 5.

A wake-up call

Yet, despite the continued relevance of Morgenstern’s findings, nothing has changed, and the discourse around measurement error remains conspicuously absent from mainstream economic narratives. Instead, a discourse developed that presented inaccuracy as a major problem in Africa and Asia. The publication in 2014 of Poor Numbers: How We Are Misled by African Development Statistics and What to Do about It by Morten Jerven, Professor in Development Studies at the Norwegian University of Life Sciences, struck me in particular and served as a wake-up call. Jerven fell into the trap of portraying measurement issues as a problem confined to the global South.

The cost of pursuing perfection

The roots of the ignorance of measurement error lie in the biases embedded within academia in the global North. The training of economists is steeped in a tradition that venerates precision and disregards imperfection, perpetuating a culture of intellectual superiority and exclusion. This obsession with precision not only blinds economists and social scientists to the realities of measurement error; it also means that the global South, all too often relegated to the margins of economic analysis, bears the brunt of this colonial legacy.

The prevailing narrative portrays measurement issues as exclusive to developing countries, conveniently ignoring the systemic inaccuracies that plague advanced economies. Evidence-based policy making will often be counterproductive if it is based on flawed statistics. A trade balance reported as showing a deficit may induce policy measures to boost exports and restrict imports, but the statistics can easily get it wrong and may deprive a nation of important resources.
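A minimal illustration of how this can happen, with purely hypothetical figures: so-called mirror statistics compare what country A reports exporting to country B with what B reports importing from A. In principle the two should broadly match; in practice the gap can be large enough to flip the sign of a reported bilateral balance.

```python
# Illustrative sketch, hypothetical numbers only: mirror statistics compare the two
# sides' reports of the same trade flow. A large gap can turn a reported "deficit"
# into a "surplus", and vice versa.

a_exports_to_b_reported_by_a = 100.0   # billions, A's own customs data (hypothetical)
a_exports_to_b_reported_by_b = 135.0   # billions, B's reported imports from A (hypothetical)
a_imports_from_b = 120.0               # billions, A's own customs data (hypothetical)

balance_own_data = a_exports_to_b_reported_by_a - a_imports_from_b      # negative: a "deficit"
balance_mirror_data = a_exports_to_b_reported_by_b - a_imports_from_b   # positive: a "surplus"

discrepancy = abs(a_exports_to_b_reported_by_a - a_exports_to_b_reported_by_b)
relative_gap = discrepancy / a_exports_to_b_reported_by_b

print(f"Bilateral balance using A's own export data: {balance_own_data:+.1f}")
print(f"Bilateral balance using B's mirror data:     {balance_mirror_data:+.1f}")
print(f"Relative gap between the two export figures: {relative_gap:.0%}")
```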

Doing economics (education) better

The imperative for decolonization of the measurement error debate cannot be overstated. It is time to dismantle the colonial perceptions that have long dictated our approach to economic data accuracy. This necessitates a fundamental re-evaluation of our methodologies and practices, rooted in principles of transparency, accountability, and inclusivity. I decided to write my latest book, On the Inaccuracies of Economic Observations: Why and How We Could Do Better, on this topic because an accessible, up-to-date text is needed to convince students that this is a real problem. I hope that it will help to improve the education of economists and other social scientists.

In the book, which has just been published by Edward Elgar, I pay tribute to Morgenstern and redo much of his analysis, but for more recent times and for a much broader country sample. Most importantly, I provide a concrete methodology and strategy for what I call crowd-researching the extent of measurement error and empowering data users, allowing them to take part in dismantling the power hierarchies that perpetuate colonial perceptions within economic discourse. There have occasionally been calls for fundamental change among the producers of statistics, but these have not led to any change. It is time to try a different route: it is perfectly doable for data users themselves to investigate the measurement errors that plague social and economic statistics, as the sketch below illustrates.
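One simple illustration of such a check (not the book’s methodology, just the kind of comparison any data user can run) is to take the same indicator from two sources, or from two vintages of the same source, and compute the relative discrepancy. The series names and numbers below are hypothetical.

```python
# Minimal sketch of a data-user check: the same indicator from two sources or
# vintages, and the relative discrepancy between them. All values are hypothetical.

source_a = {"2019": 412.3, "2020": 389.1, "2021": 405.7}   # e.g. GDP, source/vintage A
source_b = {"2019": 398.0, "2020": 401.5, "2021": 419.2}   # same indicator, source/vintage B

for year in sorted(source_a):
    a, b = source_a[year], source_b[year]
    gap = abs(a - b) / ((a + b) / 2)   # discrepancy relative to the average of the two figures
    print(f"{year}: source A = {a:.1f}, source B = {b:.1f}, discrepancy = {gap:.1%}")
```

Run systematically across indicators, countries, and years, even a check this simple gives an empirical sense of how large measurement error can be.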

A radical reimagining

Our journey towards progress demands a radical reimagining of our approach to data accuracy. Confronting the legacies and mainstream views that underpin our current discourse helps to pave the way for a more equitable and inclusive understanding of economic and social phenomena. Especially in the current times of disinformation, scientific rigour requires us all to be transparent about measurement error and its impact on our analysis and policy advice.

The implications clearly stretch beyond academia. Policymakers must confront the uncertainties inherent in economic observations, recognizing the limitations of relying on flawed data for decision-making. Similarly, researchers must embrace methodologies that prioritize inclusivity and equity by approaching the problem of statistical quality from the perspective that the problems are similar for all countries. But this starts with recognizing that measurement error is a phenomenon that should not be ignored but ultimately embraced for what it is.

Opinions expressed in Bliss posts reflect solely the views of the author of the post in question.

About the author: Peter van Bergeijk


Peter van Bergeijk is about to become an emeritus professor. His valedictory lecture ‘In Praise of Observations’ is scheduled for 1 October, 16:00.

Link to event: https://www.iss.nl/en/events/valedictory-lecture-professor-peter-van-bergeijk-2024-10-01

Are you looking for more content about Global Development and Social Justice? Subscribe to Bliss, the official blog of the International Institute of Social Studies, and stay updated about interesting topics our researchers are working on.