University rankings have proliferated in the last decade, but every time the world league tables are released, complaints follow. Is it time to have a more objective and useful measure of the worth of universities and the education they provide?
In the light of rising tuition fees, and with graduates in many economies facing underemployment or, worse, unemployment, there needs to be a robust way to measure whether attending university really leads to a gain in knowledge and skills.
Take the two world rankings released in recent weeks, which served up stellar results for two Singapore universities, but also left some scratching their heads about what to really make of it all.
First, last month, London-based education consultancy Quacquarelli Symonds (QS) released its annual world league table of top universities.
The ranking surprised many here, because both the National University of Singapore (NUS) and Nanyang Technological University (NTU) leapt several places into the top 13 in the world. NUS went up from 22 to 12, while NTU's rise was even more dramatic - it went from 39 to 13.
Another surprise was NTU being just one place behind NUS. It usually lags several places behind.
Then QS competitor Times Higher Education released its expanded list of the top 800 universities on Oct 1. Last year, it listed only the top 400.
NUS, despite falling one place to 26th position, emerged as Asia's No. 1 university. Tokyo University, which won top spot last year, fell 20 places to 43rd. NTU moved up six places to 55th.
As for the complaints, on the QS rankings, one ST reader wrote in asking: "Really, NUS and NTU are better than Cornell and Yale?" The latter are Ivy League institutions in the United States; Yale was in 15th position, Cornell was No. 17.
And an NUS alumnus asked: "I know NTU is getting better, but surely it can't get so much better in one year, to be just one place behind NUS."
As I pointed out earlier ("NUS, NTU in top 13 of world ranking"; Sept 15), the dramatic rise of the two Singapore universities was partly due to a change in the methodology for counting research citations.
QS said this was to correct the bias created by the large volume of citations generated in fields such as life sciences and medicine, compared with the arts and humanities.
Still, even without the change in methodology, both NUS and NTU would have moved up the table - NTU by more than 10 places. So, one must credit NUS and NTU for improving, at least in the areas that QS takes into account.
Why give tables such coverage?
What is the takeaway for parents and students pondering university options? Well, the fact is, all league tables have their limitations.
By choosing a particular set of indicators, such as how other academics rate a university and the staff-student ratio, rankers decide what matters in higher education.
What is more, some of the most important aspects of a university education, such as teaching quality, are assessed using proxy measures such as the number of staff with PhDs and the ratio of academic staff to students.
Across the world, academics complain that the rankings, in using a mix of subjective and objective data, are based on "bad social science". Still, universities continue to allocate more resources to maximise their league table performance.
Readers have also written in to ask why The Straits Times accords so much space to these league tables.
The main reason is to serve our readers. However flawed the tables are, students and parents are increasingly using them to select universities, both here and abroad.
Academics, too, use the tables when deciding on positions, and some governments base funding decisions on the rankings of their home institutions.
I put the question on the usefulness of these comparisons to the chiefs of NUS and NTU.
NTU president Bertil Andersson's response: "What was there before these rankings came about? Absolutely nothing. The quality of education provided by universities was assumed.
"Young universities like NTU may be making big improvements, but no one was looking at it, so we went unnoticed. So, it has been good for NTU."
He recalled how when he took over as provost of NTU in 2008, he tried to ignore the rankings - until he saw parents and students coming to the university open house with the rankings in their hands. "They would ask me why is NTU not in the top 50? When I went overseas to recruit academic talents, some had not even heard of us and some used to think we are a university in China."
But he is not blind to the downside of rankings and stressed that NTU is still focused on providing a world-class environment for learning and teaching.
NUS president Tan Chorh Chuan said the rankings can provide useful composite information on a university's progress over time, and a general indication of where it stands in the region and the world.
He admitted that NUS' performance in the rankings has certainly helped to attract top academics and students, but added: "This effect will not be sustained unless we are also able to nurture home-grown talent and to create a vibrant enabling environment for both foreign and local talent to excel."
He advised students deciding on their university education to base their choice on their interests and aspirations and the quality of the educational programmes and experience.
"Rankings may have some impact, as highly ranked universities such as NUS tend to have deeper partnerships with a wider set of high-quality institutions around the world. Our reputation also opens up opportunities for our students and graduates."
Alternatives being developed
But Prof Tan stressed that rankings are just one measure of the university. "Qualitative aspects of a distinctive university education - for instance, innovative learning approaches, global exposure and student experience - are hard to measure, but these are what differentiate a student's educational experience and longer-term prospects."
So despite the rankings benefiting them, the two university chiefs are well aware of their limitations.
Perhaps students using league tables will be better served by the university rankings being developed by the Organisation for Economic Co-operation and Development (OECD).
Mr Stefan Kapferer, OECD's deputy secretary-general who was here for a conference on higher education last week, said the think-tank has been looking at how to objectively measure learning outcomes of undergraduate students. He did not go into the details of how the assessment will be done, but based on a trial test a few years ago, it is likely to assess students' critical thinking skills, plus competencies in specific disciplines.
This could eventually lead to league tables for universities similar to OECD's Programme for International Student Assessment (Pisa).
Mr Kapferer noted that global university rankings rely heavily on research, but this is perhaps "not the most important aspect" for young students. The test is a chance to compare results across universities, and have more transparency.
The question of whether universities really do help young people acquire the required knowledge and skills in a field is also an important one for policymakers, as they spend more tax dollars to fund institutions.
This is especially so in the light of studies in the US showing that students make no significant gains in learning outcomes during their undergraduate years. A widely quoted 2011 study by Richard Arum of New York University and Josipa Roksa of the University of Virginia found that a large proportion of students made no significant improvement in skills during their first two years of college.
A follow-up last year tracked the same cohort of undergraduates as they finished college and entered the working world. The results again were not positive.
When asked if Singapore would take part in the OECD test, the Ministry of Education said it was open to learning more about the scope of the study.
Mr Kapferer admitted that only a handful of countries have shown interest in participating in the test, but noted that in the early years there was also much resistance to the Pisa test that assesses 15-year-olds in mathematics, science and reading. Singapore consistently does well in Pisa.
More than 70 countries take part in the survey, which is conducted every three years.
There have been several suggestions on how the OECD's test can be improved. One worthwhile idea is to survey students at the start of their course and again when they complete their degree studies.
There is also a need to take into account their social background and the level of qualifications when they entered university, to see how much value a university adds.
If done well and on the basis of good, objective data, the OECD benchmarking test may become a game changer in higher education - and parents and students left confused amid the annual rankings onslaught will hopefully be closer to understanding the true worth of a university education.
This article was first published on Oct 22, 2015.