Futility of varsity rankings

The HEC needs to issue its own ranking of universities, giving potential college-goers the opportunity to figure out which university is right for their individual priorities.


In the years before university rankings, students in Pakistan went by the marks/grades of the last student admitted on the "merit lists" of the previous year's admissions to gauge the selectiveness of various university programmes. By the late '90s, a number of "modern" universities, the likes of GIKI, LUMS and NUST, had carved out their niches alongside established public universities.

Most of these universities were unique in a number of respects, including the tuition fees they charged (often orders of magnitude higher than public universities') and the fact that they based admissions decisions in large part on tests they conducted themselves. The latter took them out of the league of universities that could be compared by "merit lists", breaking the method prospective students and their parents had used to rank any and all university programmes against each other.

National level awareness of university rankings in Pakistan developed relatively recently, in the last 15 years, and began around the time the University Grants Commission was rechristened the HEC. It began with rankings of foreign (often US) universities, which were used as a filter on scholarship applications.

Global university rankings are numerous and are compiled at the level of individual countries, regions/continents, as well as globally. Arguably, among the first and most widely known and cited university rankings in Pakistan are the US News rankings of American universities, although their local relevance was limited. Without going into too much detail, their ranking methodology amounts to a weighted score of various quantitative measures of university programmes http://www.usnews.com/education/best-colleges/articles/how-us-news-calculated-the-ranking.

With the HEC's post-2002 expansion in the size, scope and variety of destination countries of its foreign scholarship programmes, more comprehensive global rankings were needed to assess universities outside the US. Beginning in 2004, Quacquarelli Symonds Limited published its QS University Rankings, which were readily picked up in Pakistan a few years later and are still tracked today. Their methodology is similar to US News' but adds at least one amorphous factor, namely employer reputation http://www.topuniversities.com/university-rankings-articles/world-university-rankings/qs-world-university-rankings-methodology. Interestingly though, the cost of education, which affects the cost-to-benefit ratio of a programme, is not included as a factor.

Another ranking that receives some attention in Pakistan is the Times Higher Education (THE) University Ranking https://www.timeshighereducation.com/world-university-rankings/about-the-times-higher-education-world-university-rankings, which is based on self-reported data provided by universities. It is unclear whether the data universities provide are checked for inflation or falsification.

The "Ranking Web of Universities" is yet another ranking http://www.webometrics.info/en/Methodology that briefly received some attention in Pakistan in spite of its sketchy methodology but quickly fell out of view. This ranking is based on data collected from crawling and mining universities’ websites.

After a number of Pakistani universities started making appearances within the top 800 of some of these lists, several local universities' Quality Assurance departments began allocating significant resources to chasing these rankings. Once a year, when the rankings come out, they give a few Pakistani universities something to cheer about -- their names appear in one obscure "international" ranking or another.

Over time, this awareness of university rankings trickled down to local college-goers and their parents, who now increasingly rely on these rather arbitrarily ordered lists. Eventually, in 2010, the HEC began compiling and publishing rankings of local universities. These were first updated in 2013 and have been revised annually since. The HEC rankings are also based on self-reported data solicited from universities, and on factors that seem in large part irrelevant http://www.hec.gov.pk/InsideHEC/Divisions/QALI/Others/RankingofUniversities/Documents/Information%20for%20Ranking%202016%20Proforma.docx. These include:

Number of students receiving External Scholarships excluding HEC need based and University Own Scholarships

Number of trainings received by full time faculty members

Number of full time faculty members recruited

Number of community outreach programmes, civil engagements and community services

Number of international academic olympiad and equivalent awards won (first three positions) by students

Number of national and international awards

These rankings are rather coarse in that they cover only six categories of broad areas of study, namely: agriculture/veterinary, art and design, computer sciences and IT, business education, engineering and technology, and medical. The HEC does not provide rankings down to the level of colleges, departments or programmes. Beyond these six, there are three more rankings that classify universities by size as large, medium and small, and then rank them within each class.

More recently, in early 2016 a research lab at the Information Technology University, Lahore issued its ITU University Ranking http://itu.edu.pk/research/scientometrics-lab. The methodology used by this ranking is unclear.

Regardless of which ranking you look at, they all take a common approach: university ranks are based on a weighted sum of a number of mostly empirical factors. Most of them exclude what is for many the most important and impactful metric influencing admissions decisions -- cost.

Rising college tuition rates have been receiving a lot of attention in the US, and education costs in Pakistan have been following suit. For more and more parents, sending a child to college means they too have to take out loans, like their US counterparts. Why then do none of these rankings acknowledge cost as a factor in college admission decisions? Would the inclusion of college costs address the shortcomings in these rankings? Of course not. There are many other metrics that could be deciding factors, e.g. distance from hometown, male-to-female student ratio, proximity to a big city, cost of living etc. Am I then advocating for the abolition of rankings altogether?

A few years ago, Jeffrey E. Stake at Indiana University Maurer School of Law collected data on US law schools and created the Law School Ranking Game http://monoborg.law.indiana.edu/LawRank. It allows prospective students to weight different school factors according to their own priorities and create a ranking tailored to their unique circumstances. A student can elect to make cost the only factor or no factor at all, can make average earnings after graduation and distance to their hometown equal factors, and so on. Malcolm Gladwell has made the point that the arbitrarily assigned weights in these rankings radically affect the rankings they produce.

When US News' weights are applied to the various factors in the Law School Ranking Game, one obtains the expected ranking, with the University of Chicago, Yale, Harvard, Stanford and Columbia making up the top five. But as soon as just a handful of those weights are changed to reflect different preferences, several universities in the top 10 are displaced by universities that have never made it there.
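This sensitivity to weights is easy to demonstrate. The sketch below (in Python, with entirely made-up universities, factors, scores and weights -- none of this is real ranking data) shows how the same weighted-sum method produces completely different orderings depending on whose preferences the weights encode:

```python
# Illustrative only: hypothetical universities with normalised scores
# (0 to 1, higher is better) on three hypothetical factors. A high
# "cost" score here means the programme is more affordable.
scores = {
    "Univ A": {"reputation": 0.9, "selectivity": 0.6, "cost": 0.3},
    "Univ B": {"reputation": 0.6, "selectivity": 0.9, "cost": 0.8},
    "Univ C": {"reputation": 0.7, "selectivity": 0.7, "cost": 0.9},
}

def rank(weights):
    """Order universities by the weighted sum of their factor scores."""
    totals = {
        name: sum(weights[f] * s for f, s in factors.items())
        for name, factors in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# A publisher's weights: reputation dominates, cost is ignored.
print(rank({"reputation": 0.7, "selectivity": 0.3, "cost": 0.0}))
# → ['Univ A', 'Univ C', 'Univ B']

# A cost-conscious student's weights: the ordering reverses entirely.
print(rank({"reputation": 0.2, "selectivity": 0.2, "cost": 0.6}))
# → ['Univ C', 'Univ B', 'Univ A']
```

The underlying data never changes; only the weights do, yet the "best" university flips from first place to last.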

In 2013, the US Department of Education created the College Scorecard https://collegescorecard.ed.gov after it was tasked with producing a definitive ranking, faced the same challenges, and was unable to come up with one widely acceptable to most colleges. Instead, the DoE chose to do something similar to the Law School Ranking Game: it released the raw data on which the rankings were to be based and made it accessible to the public, letting students and parents make decisions based on their individual needs and circumstances.

The HEC is in the midst of internal reforms, and among the programmes being re-evaluated and revised are the university rankings. Instead of issuing its own rigid, arbitrary and misleading rankings, I argue that the HEC ought to issue an annual Pakistani universities ranking game, and release along with it the raw data on which that game is based.

Give potential college-goers the opportunity to figure out which college or university is right according to their individual priorities. With access to statistics such as graduation rates, employment rates within 12 months of graduating, alumni salary surveys, cost of education, cost of living etc., students would be in a much better position to evaluate trade-offs. Making these numbers public would also allow the market of admission seekers to hold to account universities whose high-price-tag programmes do not add significant value to an individual's future prospects (some will argue here that the purpose of education should not merely be landing a well-paying job, but that is a topic for a different debate).

Far too many graduates are lured by fly-by-night colleges promising well-paying employment on the strength of nothing more than anecdotal evidence, only to discover later that things aren't so. If you think this data is hard to collect, you should know that the alumni offices of several universities in Pakistan, including LUMS and NUST, have been keeping in touch with their graduates and conducting employment surveys for a while now. Is it too much to ask that this be made the norm?
