The US News rankings of US graduate programs are out again, and I've heard a fair bit of discussion about them. My department, for example, went up slightly in the rankings, while the chemistry department here slipped a few spots. I want to point out something that US News makes no effort to broadcast: the US News graduate rankings are a popularity contest. What I mean is, the rankings are the result of an opinion survey of department chairs, not of actual quantitative metrics like publication rates, citation rates, research funding, major awards, or graduation rates. Essentially, the rankings give you a snapshot of the perception of the community of department chairs in a discipline, not a real ranking of some defined quality.

This has consequences. Perceptions are very hard to change, so it's unlikely that there will be much movement in these rankings unless there are exceptional circumstances (e.g., a particular department winning a couple of Nobel prizes out of the blue). It is distressing to me to see how much importance some people (prospective students on the one hand, administrators on the other) place on rankings that measure reputation rather than anything truly quantitative.
The NRC, by contrast, does survey real data like those mentioned above. Unfortunately, they seem to keep delaying the release of their "decadal survey". Anyone have any more insight into why that is taking so long?
13 comments:
How about the ranking by Academic Analytics? But it is not freely accessible.
My department isn't ranked all that high. I'm actually very happy about that...it means we don't attract so many of the students who obsess over stupid rankings! So (in *my* field here) we quietly have a lot of funding, often get 8+ first-author papers out in the course of a PhD, and still graduate in very slightly over five years. Meanwhile, the atmosphere is generally pretty friendly. There's a lot to be said for being underrated. :)
I am also curious about the NRC rankings.
Though my department is ranked rather low in this ranking, most of my colleagues are world-class (and are recognized as such). The real difficulty with being ranked low is that the caliber of graduate students we are able to attract is also quite low. This makes the rankings somewhat self-fulfilling since being in an environment with underperforming students does interfere with your own education.
Such is life.
Anon - they seem to use actual data, though I can't get their rankings. From what I found via Google, it looks like their methodology may favor small departments relative to large ones.
Jes, I agree to a large degree, though from the faculty side there are other benefits to higher rankings. For example, fair or not, these things make a difference in getting center grants, attracting postdoc and faculty candidates, etc. (Also, you may want to update the link in your Blogger profile - it points to the old URL for your blog....)
I am actually quite worried about this. These rankings basically say that even in academia, people have preconceived notions of which schools are "top" without knowing about any of the science or research going on there.
So my question is: say you were looking for a PhD position or a postdoc in a research field you found interesting. Would it be better to go to a school ranked in the 20s on US News that has 10-15 faculty working in your area and has lots of publications, grant funding, etc.? Or should you go to a top-5 US News school that is not as strong in your research area?
Because, when one is applying for jobs (be they academia, government, industry, postdocs, whatever), these rankings basically say that people will just be impressed by the name of the school you went to and not look at the research you did.
So is it worth the risk to go to a lower-ranked, less prestigious school for the sake of your research interests? Will it seriously affect you academically?
"Essentially, the rankings give you a snapshot of the perception of the community of department chairs in a discipline, not a real ranking of some defined quality."
Unfortunately, Doug, that matters very little in the end, especially when it comes to the bottom line for a faculty member, a graduate student, or a postdoc. If everyone believes the same thing, then that is the law of the land, and metrics become irrelevant. This is why, to me, USN&WR's rankings are the only ones worth paying any attention to, "quantitative" rankings being for the most part a pastime of a few connoisseurs.
If anything, it tends to work the other way around: perception ends up influencing the "objective" rankings as well. You surely remember how the NRC's "quantitative" rankings of 1995 mimicked very closely those of USN&WR -- and it is a widespread joke that the reason the new ones are being delayed so much is that they have not yet found a way to make them reproduce those of USN&WR satisfactorily.
Hi Doug,
Lemme ask you this. On quantitative metrics, do you think your department would be ranked higher than UC Berkeley, MIT, or even UT Austin and Urbana-Champaign? I think you have only a fraction of their funding and publications...
Sylow - I think if you look at publications, citations, and funding, and you normalize by faculty size, our department is competitive with three or four of the departments ranked above us by US News. No, not MIT or Berkeley.
Doug, I do not understand this "normalization" argument. If I have a department with only 6 faculty members and I divide the total funding (and perhaps publications) by 6, does that mean the department should be ranked higher than Berkeley, which has 70 faculty? This makes no sense... The breadth of a department is very important in ANY ranking, no matter what methodology you use, unless that small department is winning a Nobel every 5 years. If you are only good at a certain subfield of physics or chemistry but have no significant activity in anything else, you will not be ranked high in anything.
Sylow - Breadth is certainly important. However, don't you care about how good the individual researchers are, not just how large the department is? For equal numbers of publications, citations, etc., wouldn't a 40-person department be better in some real sense than an 80-person department?
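To make the per-capita idea concrete, here is a toy sketch in Python (the department names and all the numbers are entirely hypothetical, purely to illustrate the normalization):

```python
# Toy per-capita comparison (all numbers hypothetical, for illustration only).
# The point: raw totals favor the larger department, but dividing by
# faculty size can reverse the comparison.

departments = {
    # name: (faculty_size, total_publications, total_funding_in_$M)
    "Big Dept":   (80, 1600, 40.0),
    "Small Dept": (40, 1000, 28.0),
}

for name, (size, pubs, funding) in departments.items():
    print(f"{name}: {pubs / size:.1f} papers/faculty, "
          f"${funding / size:.2f}M funding/faculty")
```

With these made-up numbers, the larger department wins on raw totals, but the smaller one comes out ahead per faculty member.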
Doug, I think even in individual terms, Berkeley would rank far ahead of your department. Just look at how many National Academy members they have versus how many you have, or how many Guggenheim and APS fellows they have versus you (even after "normalization").
Sylow, I never said our department was as strong as Berkeley or MIT or Harvard. You're just being argumentative.
On the subject of rankings, I'm happy to say that Oxford got hammered in this year's UK research assessment exercise, which ranked Lancaster University the best physics department in the UK -- followed closely by Bath. (I clicked through to the Bath physics department and concluded that I had never even heard of any of their faculty.)
Cambridge did pretty well in the rankings, but Oxford ranked below 10th. Needless to say, when rankings come out so screwy, you feel much better about ignoring them completely.
You can see the whole rankings here: http://www.guardian.co.uk/education/table/2008/dec/18/rae-2008-physics
There is a complicated reason why the rankings came out this way, having to do with who is actually declared to be part of the physics department, and how (and why) you game the system in this respect. It turns out the people deciding how to submit information from Oxford knew full well that we would get hammered, but for funding reasons having to do with how the submitted data were to be used, they laughed all the way to the bank. So no one here shed too many tears over the matter. Nonetheless, it does feel strange to say, "I'm at Oxford, the 14th best physics department in the UK".
At any rate, after this research assessment exercise, the UK decided not to do another one -- a new evaluation system, to be introduced several years down the line, is under discussion.