Legislators have been making a fuss over the inclusion in Hong Kong’s “Top Talent Pass” scheme of a work visa for He Jiankui. Professor He has a criminal record, albeit a rather exotic and unusual one: he was convicted of doing unauthorised research after editing the genes of unborn babies.
This was a controversial experiment which many scientists considered premature and ethically dubious. But in many jurisdictions you wouldn’t end up in jail for it.
Complaints centred round the fact that Prof He had, presumably, declared that he satisfied one of the requirements of the talent-hunting scheme, which is that applicants have been working for three of the last five years.
Prof He has spent most of the last five years in prison. This is not, I suppose, entirely incompatible with paid employment. I don’t know if prisoners in China are invited to sew mailbags, paint road signs or do any of the other useful things sometimes offered as paid work to prisoners.
Anyway his visa has now been cancelled, and it is suggested that all future applicants for the scheme, which is supposed to boost our supply of “top talent” by admitting its possessors on easy visa terms, should be required to produce a certificate of no criminal conviction. That looks a bit like overkill. Such matters are unlikely to come up very often. A swift Googling of the names would suffice to eliminate from contention anyone who has starred in an academic scandal.
A more serious criticism of the whole scheme is its preoccupation with the “world’s top 100 universities”. There are numerous lists ranking universities from which one might choose, so the Immigration Department has compiled its own, with input from four: “Times Higher Education World University Rankings, Quacquarelli Symonds (QS) World University Rankings, US News and World Report’s Best Global Universities Rankings and Shanghai Jiao Tong University Academic Ranking of World Universities.”
Having graduated from a university which was ranked the best in the world by one of those systems last year I hope I can say without being accused of sour grapes that lists of this kind are notoriously worthless.
The idea of ranking universities was launched decades ago by US News and World Report, which originally ranked only American universities. The ranking list has been dogged by controversy for years.
Periodically American institutions have tried to boycott it; it is particularly unpopular with law schools for some reason. Some academics decry the system as a joke, others complain that universities have learnt how to “game the system” to get higher ratings, and some of them have been caught cheating to do so.
The measurement of research output is a contentious area and scoring institutions on “reputation” looks circular. The easiest way to get a reputation is to score well on the ranking table.
Defenders of the system say it gives parents and students the information they need to make informed choices. But it has other purposes. The President of the University of Alberta said it was “time to question these third-party rankings that are actually marketing driven, designed to sell particular issues of a publication with repurposing of their content into even higher sales volume special editions with year-long shelf life.”
The international comparisons are open to similar objections. They also seem to have proliferated categories so that everyone can claim to be good at something: in Hong Kong, for example, HKU says it is “first in Hong Kong” (THES and QS), 10th in “employability rankings” (QS) and 1st in “most international universities” (THES). The UST is 2nd in the list of “young universities” (QS), 1st in Hong Kong in the employability rankings, and so on. CUHK is first in HK in the US News ranking, 2nd in HK (ARWU) and 1st in HK in Reuters Most Innovative Universities. City U is fourth in the young universities list and 1st in HK in “citations per faculty”. Poly U is 2nd in HK in “sustainability” (QS). This is a school in which all children win prizes.
The main focus though, in the lists which count for immigration purposes, is on two areas: research and “reputation”. Both of these have problems. The pursuit of research outputs favours the wealthy and, alas, the unscrupulous, who can resort to the subtle manipulations lamented by Stuart Ritchie in his book “Science Fictions” (a fun read, incidentally) or the blatant cheating recounted in an article in the latest edition of The Economist.
The reputation thing is a bit of a joke. When I was a university teacher I was occasionally asked by one of these ranking organisations to provide a list of universities which in my view were good. Unless you have had a very mobile career this is difficult to do fairly, or at all. I knew which Australian universities could put on a good journalism conference. I had some notion of their journalism teaching. And that was it.
I knew quite a lot about standards of English-language debating in Hong Kong because I was regularly conscripted as a judge. But that only tells you about a small bit of each institution. The HKU team, for example, always seemed to consist of fierce and fluent ladies of a South Asian appearance from the Law Faculty. This didn’t tell you much about the rest of the place.
So I put in a good word for the institutions I had studied at, carefully ignored Cambridge and decided after some thought not to mention the London School of Economics because whenever I write a reference for someone who is applying there they get rejected.
It is difficult to believe that many people put much more thought into this sort of thing, for which you are of course not paid.
In a recent article in “Nature” Elizabeth Gadd (a big wheel in the research assessment industry) said:
“The literature on research management is full of critiques of rankings. Rankings are methodologically challenged — often using inappropriate indicators such as counting Nobel-prizewinning alumni as a proxy for offering a quality education. They favour publications in English, and institutions that did well in past rankings. So, older, wealthier organizations in Europe and North America consistently top the charts. Rankings apply a combination of indicators that might not represent universities’ particular missions, and often overlook societal impact or teaching quality.”
Similar misgivings are expressed by Ellen Hazelkorn, a professor at Technological University Dublin:
“There are over 18,000 university-level institutions worldwide. Those ranked within the top 500 would be within the top 3% worldwide. Yet, by a perverse logic, rankings have generated a perception amongst the public, policymakers and stakeholders that only those within the top 20, 50 or 100 are worthy of being called excellent.
“There is no such thing as an objective ranking nor a reason why indicators should be either weighted (or given particular weights) or aggregated. Although rankings purport to measure higher education quality, they focus on a limited set of attributes for which (internationally) comparable data is available. This means that most global rankings focus unduly on research and reputation. Rankings are not an appropriate method for assessing or comparing quality, or the basis for making strategic decisions by countries or universities.”
The complaints of inherent bias are confirmed by the Immigration Department’s list, which has 54 universities in the USA, a further 19 in the UK, eight in Australia and six in Canada. There are, for reasons we will not go into, nine in Mainland China. This doesn’t leave much room for the rest of the world, and nine countries/territories have only one entry each – Argentina (with the only top university in South America, apparently), Finland, Ireland, Malaysia, Mexico, Norway, Russia, Spain and Taiwan. Europe is well represented but only west of the Oder: there is no room for the famous and ancient institutions in the Czech Republic, Poland or Hungary.
In the department’s view there are no top universities in India or the entire African continent. Another odd feature of the “Aggregate Top 100 universities” list provided is that there are 176 universities on it. Not compiled by a maths major, apparently.
A more subtle complaint about this approach is that being a graduate of a top university doesn’t guarantee that you will be a top graduate. Universities which are working on research and reputation tend to treat teaching as a subsidiary activity, to be fobbed off to the inexperienced, part-time, or semi-retired. Star professors will be recruited on the explicit basis that they will not be required to teach undergraduates.
And some American universities manage to combine a reputation for exclusivity with a variety of back doors through which unlikely scholastic stars like George W Bush can enter.
This is known as ALDC admission. A is for athletes, L is for legacies (which means the offspring of graduates) D is for dean’s list (which means the offspring of potential donors) and C is for Children (of the university’s own staff). These categories account for no less than 30 per cent of the students admitted to Harvard.
Then in the UK we have the system by which duplicitous toffs like Boris Johnson find their way to Oxford. Don’t get me started.