Kuala Lumpur, 1 November 2023
An Independent Expert Group (IEG) convened by the United Nations University’s International Institute for Global Health (UNU-IIGH) has released a strong statement criticizing the widespread and uncritical use of global university rankings.
The IEG highlights the vital importance of universities in delivering not just education, training, and research, but also in shaping public policy, promoting informed public discourse, and helping advance democracy and human rights.
However, although global university rankings are marketed as tools for improving university performance and informing prospective students, the statement describes how they lead to a variety of perverse behaviours and negative impacts that undermine key aspects of the mission of universities.
According to IEG member Marion Lloyd, a research professor at the Institute for the Study of the University and Education at the National Autonomous University of Mexico, “Global university rankings exert too much influence over higher education, promoting a narrow and simplistic version of success that overlooks many of the rich and vital contributions that universities make to society”.
The Problem with Rankings
The IEG statement highlights nine problems with global university rankings.
Key among them is that the very idea of global university rankings is fundamentally flawed. It is simply not possible to produce a fair and credible global league table of universities given their multiple missions and their diverse social, economic, and political contexts around the world.
Because no adjustment is made for the resources available to universities, rankings inevitably advantage historically privileged institutions and help perpetuate global inequalities in higher education instead of raising academic standards equitably and universally. “It is not appropriate for universities from historically exploited and disadvantaged regions to feel compelled to compete on an un-level playing field with a set of rules that are biased in favour of the Global North,” stated Akosua Adomako Ampofo, Professor of African and Gender Studies at the University of Ghana.
Moreover, the methodologies employed by the major rankers are opaque and demonstrate a clear bias towards the English language, certain types of research, and STEM subjects (Science, Technology, Engineering, and Mathematics), which undermines the importance of teaching and of the humanities and social sciences. Disturbingly, the lack of transparency over the data and scoring systems employed raises serious doubts about the rankings’ reliability and objectivity.
“These rankings perversely incentivise universities to prioritise short-term and sometimes unethical interventions to improve their rankings, rather than the needs of their students, staff, local communities, or of society more generally,” stated Marion Lloyd, adding that “the constant and short-sighted obsession with annual rankings comes at the cost of long-term and broader goals, which is especially harmful given the many serious and complex problems facing society”.
The IEG statement also highlights the extractive nature of the major global rankings and the fact that the rankings industry is dominated by private businesses whose fundamental mission is to produce profits. According to UNU-IIGH’s Professor David McCoy, who helped convene the IEG, “Many of the commercial practices of the rankings industry are simply not in the public interest and result in significant resources being diverted away from core academic functions”.
A Call to Action
The IEG calls for a better understanding of the flaws and limitations of global university rankings and for the adoption of better alternative ways to assess and describe the unique and specific attributes of different universities. It also encourages universities to disengage from the costly and extractive practices of the rankings game and to diminish the influence of unaccountable commercial organizations on higher education.
Read the full statement at this link: https://doi.org/10.37941/PB/2023/2
Media Contact:
For media inquiries or interviews with the authors, please contact:
Gopi Kharel
Knowledge Management and Communications Manager
UNU International Institute for Global Health
Email: gopi.kharel@unu.edu
***
About the Independent Expert Group (IEG):
The IEG is a diverse group of experts convened by the United Nations University International Institute for Global Health to critically examine and address the impact of global university rankings on higher education. Its members come from various fields and backgrounds, working collectively to bring about positive change in the sector. See the bios of the experts in the annex of the statement.
Note to Editors: High-resolution images and additional expert quotes are available upon request.
Appendix: Global University Rankings - Key Facts for Journalists
The first global university ranking was published in 2003.
The number of producers of global and other university rankings has grown steadily over the past two decades, with more than 60 global and regional rankings produced in 2023. For some, ranking universities has become a highly profitable business.
The most influential global rankings are produced by four private companies: the UK-based Times Higher Education (owned by Inflexion Private Equity Partners LLP) and Quacquarelli Symonds; the American U.S. News & World Report; and Shanghai Ranking Consultancy from China.
There are currently approximately 21,000 accredited or recognised higher education institutions in the world. The most comprehensive global university rankings, however, include only up to around 2,000 institutions, located in roughly 100 countries.
The Top 100 in the global rankings by QS, Shanghai Ranking Consultancy, and THE are largely fixed. Very few new institutions ever enter the ‘top’, although universities may shift positions within that range.
Major global university rankings privilege wealthy, research-oriented institutions from the English-speaking countries of the Global North; such institutions make up the majority of those ranked in the top 100.
The business behind rankings
None of the four major global university rankings discloses how its scores are calculated.
Recent research suggests that universities that buy products and services from QS and THE may have better chances of moving up in those rankings.
The four major global rankings rely on Elsevier's and Clarivate's proprietary bibliometric data. This further strengthens the two companies' already dominant position in the academic publishing and data analytics markets. Elsevier's profit margin is almost 40% (with over $3 billion in annual revenue), rivalling that of Apple and Google.
QS, THE, and U.S. News collect vast amounts of data from universities and publicly accessible sources, which they then privatise and market to universities, governments, and other interested parties in the form of performance analytics.
By progressively enlarging the number of universities included in their rankings, major ranking organizations also expand the market of prospective buyers of their data products, analytics, and consulting services—which is key to their business model.
Ranking companies that also sell consulting services to universities and governments face a conflict of interest.
Methodological issues
The academic community generally agrees that global university rankings are methodologically flawed.
Global university rankings offer a dramatically simplified and unrealistic view of university performance, giving a skewed picture of universities' functions and activities and thereby misleading prospective students.
Rankings exaggerate the actual differences in ‘quality’ between universities, as small variations in scores can result in radically different positions in the ranking.
50% of the total score in the QS World University Rankings is based on a survey of subjective opinions provided by anonymous individuals. In the case of Times Higher Education's World University Rankings and the U.S. News Best Global Universities, subjective opinions make up 33% and 25% of the total score, respectively.
60% of the total score in the Shanghai Ranking is based on publications and citations, while the number of Nobel Prizes and Fields Medals accounts for 30%.
In QS and THE global rankings, the weights assigned to different indicators can change every few years. The methodological reasoning behind these adjustments is not fully disclosed.
QS, THE, and U.S. News use different methodologies to calculate their global, regional, and national rankings. As a result, the same university can be ranked 2nd best in its region in the QS World University Rankings while being 7th in the regional ranking published by QS in the same year.
Rankings can in some cases be highly volatile. It may take only one highly cited researcher joining the faculty for a university to climb 100 places in Times Higher Education's World University Rankings.
Pressure and incentives
Participating in a global ranking can put enormous pressure on universities, sometimes requiring significant human and financial resources. This places universities and governments with limited budgets at an even greater disadvantage.
University rankings give universities a strong incentive to manipulate data, misrepresent key statistics, and fabricate information about their performance.
The number of universities withdrawing from various rankings has increased dramatically in the past few years.

END