International University Rankings - Benefits and limitations


SLIDE 1

International University Rankings - Benefits and limitations

Seminar on international rankings (Seminari sobre rànquings internacionals)
Barcelona, 21 June 2011

Gero Federkeil, CHE Centre for Higher Education, Germany
www.che.de

SLIDE 2

Presentation

  • The CHE – A Short Introduction
  • The rise of international rankings
  • International rankings – indicators & data sources
  • International rankings – a critical view
  • Conclusions

2 ACUP 2011 | Gero Federkeil |

SLIDE 3

The CHE - Centre for Higher Education

  • Private, not-for-profit organisation
  • Founded in 1994 by the Bertelsmann Foundation and the German Rectors' Conference
  • Goal: initiate and promote reforms in German higher education
  • Activities: HE policy issues (e.g. Bologna, funding, …), consulting, communication & training, ranking

SLIDE 4

The CHE - Centre for Higher Education

  • Ranking of German universities among the founding tasks of CHE
  • First ranking published in 1998
  • Extension of fields and indicators
  • Continuous further development of the methodology
  • Internationalisation: extension of the CHE Ranking to Austria, Switzerland and the Netherlands; 2011: ranking in Spain in cooperation with Fundació CYD
  • U-Multirank project to “develop the concept and test the feasibility of a global multi-dimensional university ranking”
  • Founding member of IREG – Observatory on Academic Rankings and Excellence (“Berlin Principles”)

SLIDE 5

Presentation

  • The CHE – A Short Introduction
  • The rise of international rankings
  • International rankings – indicators & data sources
  • International rankings – a critical view
  • Conclusions

SLIDE 6

The rise of international rankings

  • Shanghai Jiaotong University: Academic Ranking of World Universities (ARWU) (2003– )
    Original purpose: comparison of Chinese universities with the rest of the world
    http://www.arwu.org/index.jsp
  • Times Higher Education (THE)/QS World Rankings (2004–2009); 2010: separation of the partners
  • QS World University Rankings (2010– ), run by a private consulting company
    http://www.topuniversities.com/
  • THE/Thomson Reuters World Rankings (2010– ), in co-operation with a leading provider of bibliometric databases
    http://www.timeshighereducation.co.uk/world-university-rankings/

SLIDE 7

The rise of international rankings

  • HEEACT (Taiwan): Performance Ranking of Scientific Papers – purely bibliometric ranking
    http://ranking.heeact.edu.tw/en-us/2010/homepage/
  • Centre for Science and Technology Studies (CWTS): Leiden Ranking – purely bibliometric ranking
    http://www.cwts.nl/ranking/LeidenRankingWebSite.html
  • Scimago Institutions Ranking – purely bibliometric ranking
    http://www.scimagoir.com/index.php
  • Ecole des Mines Paris ranking – analysis of the universities of graduation of the CEOs of the Top 500 companies
    http://www.mines-paristech.fr/Actualites/PR/Ranking2011EN-Fortune2010.html
  • Webometrics – ranking of web presence
    http://www.webometrics.info/

SLIDE 8

The duality of rankings

The emergence of global rankings is a result of growing global competition in higher education; at the same time, these rankings reinforce that competition through their own results.

Global rankings have an impact on:
  • National policies (excellence initiatives, scholarships)
  • Institutional strategies

SLIDE 9

Presentation

  • The CHE – A Short Introduction
  • The rise of international rankings
  • International rankings – indicators & data sources
  • International rankings – a critical view
  • Conclusions

SLIDE 10

World Rankings: Indicators

Shanghai Jiaotong Ranking (ARWU):
  • SCI publications: 20 %
  • Publications in Science & Nature: 20 %
  • Highly cited authors: 20 %
  • Nobel Prizes & Fields Medals: 20 %
  • Alumni with Nobel Prizes: 10 %
  • Size: 10 %

QS World University Rankings:
  • Reputation among scholars: 40 %
  • Reputation among employers: 10 %
  • Citations: 20 %
  • Student–staff ratio: 20 %
  • International students: 5 %
  • International staff: 5 %

  • Measurement of research: due to the indicators/databases used, mainly in science and technology
  • Mixture of different dimensions, mainly reputation
  • What does the total score measure?
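The weighted-sum construction behind these total scores can be sketched in a few lines. The weights below follow the ARWU column above; the per-indicator scores for the example university are hypothetical, made-up numbers.

```python
# Sketch of an ARWU-style composite score: each indicator is normalised to a
# 0-100 scale and combined as a fixed weighted sum. Weights follow the table
# above; the university's indicator scores are hypothetical illustrations.
ARWU_WEIGHTS = {
    "sci_publications": 0.20,
    "nature_science_publications": 0.20,
    "highly_cited_authors": 0.20,
    "nobel_fields_awards": 0.20,
    "nobel_alumni": 0.10,
    "size_adjusted": 0.10,
}

def composite_score(indicator_scores, weights):
    """Weighted sum of per-indicator scores (each on a 0-100 scale)."""
    return sum(weights[k] * indicator_scores[k] for k in weights)

# Hypothetical, already-normalised indicator scores for one university:
example = {
    "sci_publications": 70.0,
    "nature_science_publications": 40.0,
    "highly_cited_authors": 55.0,
    "nobel_fields_awards": 20.0,
    "nobel_alumni": 30.0,
    "size_adjusted": 60.0,
}

print(round(composite_score(example, ARWU_WEIGHTS), 1))  # → 46.0
```

The question the slide raises ("What does the total score measure?") is visible in the sketch: the single number 46.0 mixes six incommensurable dimensions into one scale whose meaning depends entirely on the chosen weights.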

SLIDE 11

World Rankings: Indicators

THE World Rankings:
  • Teaching: 30.0 %
  • Research: 30.0 %
  • Citations: 32.5 %
  • Industrial income: 2.5 %
  • International mix: 5.0 %

  • Mainly research
  • 34.5 % based on reputation
  • What does the total score measure?

HEEACT Ranking:
  • Publications 1999–2009: 20 %
  • Citations 1999–2009: 10 %
  • Research excellence: 50 %
      – h-index (20 %)
      – Highly cited papers (15 %)
      – Papers in high-impact journals (15 %)

  • Research only
  • Bibliometric data = bias towards the sciences
  • Long-term perspective

SLIDE 12

Comparison of Results: Top 10 Position

Rank | QS | THE | ARWU
1 | University of Cambridge | Harvard University | Harvard University
2 | Harvard University | California Institute of Technology | University of California, Berkeley
3 | Yale University | Massachusetts Institute of Technology | Stanford University
4 | UCL (University College London) | Stanford University | Massachusetts Institute of Technology (MIT)
5 | Massachusetts Institute of Technology (MIT) | Princeton University | University of Cambridge
6 | University of Oxford | University of Cambridge | California Institute of Technology
7 | Imperial College London | University of Oxford | Princeton University
8 | University of Chicago | University of California, Berkeley | Columbia University
9 | California Institute of Technology (Caltech) | Imperial College London | University of Chicago
10 | Princeton University | Yale University | University of Oxford
11 | Columbia University | University of California, Los Angeles | Yale University
12 | University of Pennsylvania | University of Chicago | Cornell University
13 | Stanford University | Johns Hopkins University | University of California, Los Angeles
14 | Duke University | Cornell University | University of California, San Diego
15 | University of Michigan | Swiss Federal Institute of Technology Zurich | University of Pennsylvania
16 | Cornell University | University of Michigan | University of Washington
17 | Johns Hopkins University | University of Toronto | University of Wisconsin-Madison
18 | ETH Zurich (Swiss Federal Institute of Technology) | Columbia University | The Johns Hopkins University
19 | McGill University | University of Pennsylvania | University of California, San Francisco
20 | Australian National University | Carnegie Mellon University | The University of Tokyo

University of Barcelona: QS 148, THE 142, ARWU 201–300
Autonomous University of Barcelona: QS 173, THE not among Top 200, ARWU 301–400

SLIDE 13

Presentation

  • The CHE – A Short Introduction
  • The rise of international rankings
  • International rankings – indicators & data sources
  • International rankings – a critical view
  • Conclusions

SLIDE 14

Indicators used – a critical assessment

Bibliometric indicators
  • Central element of measuring research
  • Differences in methodological quality (e.g. field-normalised citation rates)
  • Publications: lack of control for size
  • Field biases (humanities, engineering)
  • Language bias

Reputation
  • Reputation is a social reality
  • Not a performance indicator
  • Highly dependent on the sample
  • Not very reliable in an international perspective
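The field-normalised citation rates mentioned above can be illustrated with a small sketch. The field averages and paper data are made-up; the calculation follows the common mean-normalised-citation-score idea (each paper's citations divided by the world average of its field), which is one way such normalisation is done.

```python
# Sketch of a field-normalised citation rate: each paper's citation count is
# divided by the (hypothetical) world-average citation rate of its field, so
# low-citation fields such as mathematics are not unfairly penalised.
FIELD_AVERAGE_CITATIONS = {"mathematics": 2.0, "biomedicine": 10.0}

def mean_normalised_citation_score(papers):
    """papers: list of (field, citation_count) tuples.
    Returns the mean of per-paper citations / field-average ratios;
    a value of 1.0 means exactly world average."""
    ratios = [cites / FIELD_AVERAGE_CITATIONS[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

papers = [("mathematics", 4), ("biomedicine", 10), ("biomedicine", 30)]
# (4/2 + 10/10 + 30/10) / 3 = (2 + 1 + 3) / 3 = 2.0
print(mean_normalised_citation_score(papers))  # → 2.0
```

Without the normalisation step, the raw average (about 14.7 citations per paper) would be dominated by the biomedicine papers, which is exactly the field bias the slide criticises.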

SLIDE 15

Indicators used – a critical assessment

Nobel prizes
  • High-level excellence
  • Field biases (only a few fields)
  • Time problem / institutional affiliation

“Small indicators”
  • Try to bring in dimensions other than research
  • Problems in definition and data collection (e.g. international students)
  • Problems in validity (e.g. student–staff ratio)

SLIDE 16

General approach – A critical assessment

International rankings differ in their indicators. With regard to the general methodology, however, there is a ranking orthodoxy, alongside a growing number of alternative approaches.

SLIDE 17

Ranking orthodoxy I: Institutional ranking

SLIDE 18

Critique of ranking orthodoxy I: institutional rankings vs. multi-level, field-specific rankings

  • Most target groups/users (prospective students, academic staff) are interested in information about “their” field
  • Universities are heterogeneous units; fields and faculties differ in their performance
  ⇒ Rankings of whole institutions give misleading averages
  • Global rankings have increasingly introduced field-based rankings
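How whole-institution averages mislead can be shown with two hypothetical universities: the one that "wins" on the institutional average is the worse choice for a prospective physics student.

```python
# Sketch of why institutional averages mislead (all scores hypothetical):
# University A beats B on the institution-wide average, yet B is clearly
# stronger in physics, the field a prospective physics student cares about.
field_scores = {
    "University A": {"physics": 50.0, "history": 90.0},
    "University B": {"physics": 80.0, "history": 50.0},
}

def institutional_average(scores):
    """Unweighted mean over a university's field scores."""
    return sum(scores.values()) / len(scores)

avg_a = institutional_average(field_scores["University A"])  # 70.0
avg_b = institutional_average(field_scores["University B"])  # 65.0

print(avg_a > avg_b)  # True: A "wins" the institutional ranking ...
print(field_scores["University B"]["physics"]
      > field_scores["University A"]["physics"])  # True: ... but B wins in physics
```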

SLIDE 19

Ranking orthodoxy II: „Composite indicator“

SLIDE 20

Critique of ranking orthodoxy II: composite overall indicator vs. multi-dimensional ranking

  • Composite indicators blur profiles and strengths & weaknesses
  • There are neither theoretical nor empirical arguments for assigning specific weights to single indicators
  • Preferences regarding indicators are heterogeneous among stakeholders/users (“quality is in the eye of the beholder”); fixed weights patronise users
  ⇒ Rankings should leave the decision about the relevance of indicators to the users
  • Global rankings have started to include elements of personalisation
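The personalisation argument can be made concrete: with the same underlying scores, two hypothetical universities swap places depending on which dimension a user weights more heavily. All names and numbers below are invented for illustration.

```python
# Sketch of user-chosen weights: the same two (hypothetical) universities
# trade first place depending on whether the user values research or teaching,
# which is why a single fixed weighting cannot serve all users.
scores = {
    "Uni X": {"research": 90.0, "teaching": 50.0},
    "Uni Y": {"research": 60.0, "teaching": 85.0},
}

def rank(weights):
    """Order universities by the weighted sum of their dimension scores."""
    totals = {u: sum(weights[d] * s[d] for d in weights)
              for u, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

print(rank({"research": 0.8, "teaching": 0.2}))  # research-minded user: Uni X first
print(rank({"research": 0.2, "teaching": 0.8}))  # teaching-minded user: Uni Y first
```

This is the design behind personalised rankings such as the CHE approach: publish the per-indicator scores and let each user supply the weights, instead of baking one weighting into a single league table.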

SLIDE 21

Ranking orthodoxy III: League tables

SLIDE 22

Conclusions I: Methodology

  • Global rankings have helped to bring higher education into public debate
  • Their methods are flawed:
      – field bias in favour of the (biomedical) hard sciences
      – language bias against non-English-speaking countries
      – problems with the validity and reliability of indicators
  • Despite some recent changes, they still follow the orthodox ranking approach:
      – mainly institutional rankings
      – use of composite indicators
      – league-table approach
  • This approach may be good for media interest, but it does not provide meaningful information to important stakeholders/users

SLIDE 23

Critique of ranking orthodoxy III: league tables vs. group approach (top, middle, bottom)

  • Small differences in the numerical value of an indicator lead to big differences in league-table positions (ignoring statistical error and uncertainty)
  • League tables tend to exaggerate the differences between universities (“7th is better than 12th”, “326 is better than 341”)
  ⇒ Rankings should refer to groups/clusters rather than to exact league-table positions
  • Most rankings still stick to the league-table approach; only a few deviate
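The group approach can be sketched as follows. Grouping by score terciles is one simple way to form the groups (the CHE ranking uses statistically grounded group boundaries; the tercile cut here is a simplification, and the scores are made-up).

```python
# Sketch of the group approach: universities are assigned to top / middle /
# bottom groups (here by simple score terciles) instead of exact positions,
# so tiny, statistically meaningless score gaps no longer create rank order.
scores = {"A": 91.0, "B": 90.5, "C": 74.0, "D": 73.0, "E": 40.0, "F": 38.0}

def tercile_groups(scores):
    """Split universities into three equal-sized groups by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    cut = (len(ordered) + 2) // 3  # size of each group, rounded up
    return {
        "top": ordered[:cut],
        "middle": ordered[cut:2 * cut],
        "bottom": ordered[2 * cut:],
    }

groups = tercile_groups(scores)
print(groups["top"])  # → ['A', 'B']: the 0.5-point gap between A and B
                      #   no longer produces a rank difference
```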

SLIDE 24

Presentation

  • The CHE – A Short Introduction
  • The rise of international rankings
  • International rankings – indicators & data sources
  • International rankings – a critical view
  • Conclusions

SLIDE 25

Conclusions II: The politics of ranking

  • Due to their indicators & data sources, most global rankings focus more or less exclusively on research
  • In fact, they are rankings of one particular type of institution only: internationally oriented, comprehensive research universities
  • This has led to an obsession with the “world-class university”
  • De-valuation of institutional profiles that differ from that model (specialised, teaching-oriented, regional, …)
  • There is a need for an alternative approach that is multi-dimensional and makes visible different fields of excellence

SLIDE 26

SLIDE 27

There might be some limits to rankings in general

“You’re kidding! You count publications?”

SLIDE 28

More information:

www.che-ranking.de
www.u-multirank.eu
www.ireg-observatory.org
gero.federkeil@che-ranking.de