
Times Higher Education–QS World University Rankings

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by EJohn59 (talk | contribs) at 03:11, 30 October 2009 (Undid revision 322851171 by 222.127.16.117 (talk)vandalism). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Times Higher Education–QS World University Rankings is an annual publication that ranks the "Top 200 World Universities", published by Times Higher Education and Quacquarelli Symonds (QS). The full listings are featured on the Times Higher Education website and appear later on the QS website. The rankings have been published since 2004 and are broken down by subject and region.

The ranking weights are:

  • Peer Review Score (40%)
  • Recruiter Review (10%)
  • International Faculty Score (5%)
  • International Students Score (5%)
  • Faculty/Student Score (20%)
  • Citations/Faculty Score (20%).
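
The overall score is thus a weighted sum of six indicators. As a minimal sketch of how such a composite is computed (the function name, dictionary keys, and example values are illustrative, not taken from the published methodology):

```python
# Weights of the six THE-QS indicators, as fractions of the overall score.
WEIGHTS = {
    "peer_review": 0.40,
    "recruiter_review": 0.10,
    "international_faculty": 0.05,
    "international_students": 0.05,
    "faculty_student": 0.20,
    "citations_per_faculty": 0.20,
}

def overall_score(scores):
    """Combine per-indicator scores (each on a 0-100 scale) into a
    single weighted overall score."""
    assert set(scores) == set(WEIGHTS), "one score per indicator required"
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Illustrative example: a university scoring 100 on every indicator.
example = {name: 100.0 for name in WEIGHTS}
print(round(overall_score(example), 2))  # 100.0
```

Note that the six weights sum to exactly 100%, so a perfect score on every indicator yields a perfect overall score.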

2009 Rankings (full data)

The full table of the top 200 universities, together with the accompanying analysis and methodology, was published on the Times Higher Education website at one minute past midnight on 8 October 2009.[1]

[Map: best university per country among the top 100]

Times Higher Education–QS World University Rankings (Top 20)

| 2009[2] | 2008[3] | 2007[4] | 2006[5] | 2005[6] | 2004[7] | University | Country | Average score |
|---|---|---|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 1 | 1 | Harvard University | US | 1 |
| 2 | 3 | 2= | 2 | 3 | 6 | University of Cambridge | UK | 3 |
| 3 | 2 | 2= | 4= | 7 | 8 | Yale University | US | 4 |
| 4 | 7 | 9 | 25 | 28 | 34 | University College London | UK | 18 |
| 5= | 6 | 5 | 9 | 13 | 14 | Imperial College London | UK | 9 |
| 5= | 4 | 2= | 3 | 4 | 5 | University of Oxford | UK | 4 |
| 7 | 8 | 7= | 11 | 17 | 13 | University of Chicago | US | 11 |
| 8 | 12 | 6 | 10 | 9 | 9 | Princeton University | US | 9 |
| 9 | 9 | 10 | 4= | 2 | 3 | Massachusetts Institute of Technology | US | 6 |
| 10 | 5 | 7= | 7 | 8 | 4 | California Institute of Technology | US | 7 |
| 11 | 10 | 11 | 12 | 20 | 19 | Columbia University | US | 14 |
| 12 | 11 | 14 | 26 | 32 | 28 | University of Pennsylvania | US | 21 |
| 13 | 13= | 15 | 23 | 27 | 25 | Johns Hopkins University | US | 19 |
| 14 | 13= | 13 | 13 | 11 | 52 | Duke University | US | 19 |
| 15 | 15 | 20= | 15 | 14 | 23 | Cornell University | US | 17 |
| 16 | 17 | 19 | 6 | 5 | 7 | Stanford University | US | 12 |
| 17 | 16 | 16 | 16 | 23 | 16 | Australian National University | Australia | 17 |
| 18 | 20 | 12 | 21 | 24 | 21 | McGill University | Canada | 19 |
| 19 | 18 | 38= | 29 | 36 | 31 | University of Michigan | US | 29 |
| 20= | 23 | 23 | 33= | 30 | 48 | University of Edinburgh | UK | 30 |
| 20= | 24 | 42 | 24 | 21 | 10 | ETH Zurich (Swiss Federal Institute of Technology) | Switzerland | 24 |

Commentary

Several universities in the UK and the Asia-Pacific region have commented on the rankings. The Vice-Chancellor of Massey University, Professor Judith Kinnear, says the Times Higher Education–QS ranking is a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She says the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[8]

Ian Leslie, the pro-vice-chancellor for research at Cambridge University, said: "It is very reassuring that the collegiate systems of Cambridge and Oxford continue to be valued by and respected by peers, and that the excellence of teaching and of research at both institutions is reflected in these rankings."

The vice-chancellor of Oxford University, Dr. John Hood, said: "The exceptional talents of Oxford's students and staff are on display daily. This last year has seen many faculty members gaining national and international plaudits for their teaching, scholarship and research, and our motivated students continue to achieve in a number of fields, not just academically. Our place amongst the handful of truly world-class universities, despite the financial challenges we face, is testament to the quality and the drive of the members of this university's environment."

The Vice-Chancellor of the University of Wollongong in Australia, Professor Gerard Sutton, said the ranking was a testament to a university's standing in the international community, identifying "an elite group of world-class universities."[9]

Criticism

The Times Higher Education rankings have been criticized[10] for placing too much emphasis on peer review, which accounts for 40% of the overall score. Some have expressed concern about the manner in which the peer review has been carried out. In a report[11], Peter Wills of the University of Auckland, New Zealand, wrote of the Times Higher Education–QS ranking:

"But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions."

Errors have also been reported in the faculty–student ratios used in the ranking. At the 16th Annual New Zealand International Education Conference, held in Christchurch, New Zealand, in August 2007, Simon Marginson presented a paper[12] outlining fundamental flaws underlying the Times Higher Education–QS rankings. A similar article[13] by the same author appeared in The Australian newspaper in December 2006. The points raised include:

"Half of the THES index is comprised by existing reputation: 40 per cent by a reputational survey of academics (‘peer review’), and another 10 per cent determined by a survey of ‘global employers’. The THES index is too easily open to manipulation as it is not specified who is surveyed or what questions are asked. By changing the recipients of the surveys, or the way the survey results are factored in, the results can be shifted markedly."

  1. The pool of responses is heavily weighted in favour of academic ‘peers’ from nations where the Times is well-known, such as the UK, Australia, New Zealand, Malaysia and so on.
  2. Results have been highly volatile. There have been many sharp rises and falls, especially in the second half of the THES top 200, where small differences in metrics can generate large rankings effects. Fudan in China has oscillated between 72 and 195, RMIT in Australia between 55 and 146. In the US, Emory has risen from 173 to 56 and Purdue fell from 59 to 127.

Although THES-QS introduced several changes in methodology in 2007 aimed at addressing some of these criticisms[14], the ranking has continued to attract criticism. An article[15] in the peer-reviewed journal BMC Medicine, authored by several scientists from the US and Greece, pointed out:

"If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times (sic) simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1 600 of 190 000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic."

Alex Usher, Vice President of the Educational Policy Institute in the USA, commented:[16]

"Most people in the rankings business think that the main problem with the Times (sic) is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong's Academic Ranking of World Universities."
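
The flattening effect Usher describes can be sketched as follows. The raw scores, the cap value, and the capping step itself are invented here purely to illustrate how standardizing scores and then truncating the standardized values can erase a runaway leader's raw-score advantage; this is not the actual THES-QS procedure.

```python
from statistics import mean, pstdev

def z_scores(raw):
    """Standardize raw scores to mean 0, population standard deviation 1."""
    mu, sigma = mean(raw), pstdev(raw)
    return [(x - mu) / sigma for x in raw]

def capped(zs, cap=2.0):
    """Truncate standardized scores at an assumed cap, one plausible
    rescaling step that compresses the top end."""
    return [min(z, cap) for z in zs]

# Invented raw peer-review scores: one runaway leader, a close pack behind.
raw = [99, 60, 58, 57, 56, 55, 54, 53, 52, 51]
z = z_scores(raw)

# The leader is nearly 40 raw points clear of the pack, but after
# standardization and capping its edge shrinks to the cap value, so the
# gap to the second-ranked institution becomes much smaller.
print([round(v, 2) for v in z])
print([round(v, 2) for v in capped(z)])
```

Under raw scoring the leader's margin is proportional to the full 39-point gap; under the capped standardized scores it is bounded, which is one way a top institution's advantage on a single indicator can become "functionally equivalent" to that of its nearest rivals.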

Andrew Oswald, Professor of Economics at University of Warwick, laid out the following criticism of the 2008 Times Higher Education-QS league tables:[17]

"This put Oxford and Cambridge at equal second in the world. Lower down, at around the bottom of the world top-10, came University College London, above MIT. A university with the name of Stanford appeared at number 19 in the world. The University of California at Berkeley was equal to Edinburgh at 22 in the world. Such claims do us a disservice. The organisations who promote such ideas should be unhappy themselves, and so should any supine UK universities who endorse results they view as untruthful. Using these league table results on your websites, universities, if in private you deride the quality of the findings, is unprincipled and will ultimately be destructive of yourselves, because if you are not in the truth business what business are you in, exactly? Worse, this kind of material incorrectly reassures the UK government that our universities are international powerhouses. Let us instead, a bit more coolly, do what people in universities are paid to do. Let us use reliable data to try to discern the truth. In the last 20 years, Oxford has won no Nobel Prizes. (Nor has Warwick.) Cambridge has done only slightly better. Stanford University in the United States, purportedly number 19 in the world, garnered three times as many Nobel Prizes over the past two decades as the universities of Oxford and Cambridge did combined."

Some have argued that the Academic Ranking of World Universities by Shanghai Jiao Tong University may be more respectable despite its perceived bias towards the natural sciences. However, this ranking system too has received its fair share of criticism.[11][16]

References

  1. ^ "THE-QS World University Rankings 2009".
  2. ^ "THE-QS World University Rankings 2009".
  3. ^ "THE-QS World University Rankings 2008".
  4. ^ "THES-QS World University Rankings 2007".
  5. ^ "THES World University Rankings 2006".
  6. ^ "THES-QS World University Rankings 2005".
  7. ^ "THES World University Rankings 2004".
  8. ^ Flying high internationally
  9. ^ "UOW listed in Top 200 World University Rankings"
  10. ^ The THES University Rankings: Are They Really World Class? by Richard Holmes
  11. ^ a b Response to Review of Strategic Plan by Peter Wills
  12. ^ Rankings: Marketing Mana or Menace? by Simon Marginson
  13. ^ Rankings Ripe for Misleading by Simon Marginson
  14. ^ Sowter, Ben (1 November 2007). "THES–QS World University Rankings 2007 – Basic explanation of key enhancements in methodology for 2007"
  15. ^ International ranking systems for universities and institutions: a critical appraisal by John Ioannidis et al.
  16. ^ a b The Times Higher Education Rankings and the Dawn of Global Higher Education Data Standards by Alex Usher
  17. ^ There's nothing Nobel in deceiving ourselves by Andrew Oswald, The Independent on Sunday
