Admittedly not being all that familiar with the goings-on in Stateside academia, I am amused by what appears to be an ongoing tussle to make academia more relevant to real-world concerns. Instead of staying in their ivory towers, American academics are urged to make [gasp!] tangible contributions to society. I hate to needle our US-based friends again about the Amerocentric nature of the blogosphere, but we bumpkins (1) have already taken a look at this question and (2) are in the process of doing something about it.
In general, British academia is more reliant on state funding than American academia. US institutions--especially elite ones--have far more experience in soliciting contributions from alumni and other donors, which is not necessarily the case here in the UK. Hence the many complaints over Lord Mandelson in the wake of his holiday-season announcement that UK higher education funding will be slashed over the next few years. Even though research-oriented universities may not feel the brunt of these cuts--research is supposed to be spared ahead of teaching--the Russell Group of 20 leading UK research universities is unhappy. See Michael Arthur and Wendy Piatt, both of the Russell Group, commenting on how universities face "meltdown."
In previous years, the mechanism the UK government has used to allocate research funding has been the Research Assessment Exercise (RAE), last held in 2008 and before that in 2001. In those RAEs, experts appointed in each field--Politics, Economics, Sociology, and so on--judged the research output of the various departments. However, some have expressed dissatisfaction on two counts: (1) the potential arbitrariness of the judgements made by examiners and (2) the aforementioned lack of real-world relevance of much research.
And so it has come to pass that the RAE is being replaced by the new Research Excellence Framework (REF) for the next assessment cycle. The innovations of note here are twofold. First, to replace potentially subjective decisions on the merits of others' work, we are moving from a qualitative to a more quantitative measurement system that makes use of bibliometric information like the Social Science Citation Index (SSCI). From the REF guide we have this:
How will citation information be used in assessing outputs?
We conducted a substantive pilot exercise to test how to use citation information in the REF. We concluded that citation information is not sufficiently robust to be used formulaically or as a primary indicator of quality, but there is considerable scope for it to inform and enhance the process of expert review. We propose that:
• Those UOAs [units of assessment] for which robust data is available will make use of citation information. Sub-panels will decide this in advance. We expect that medicine, science and engineering panels will do so, but that the arts, humanities and a number of other panels will not.
• We will provide the relevant panels with citation information about the number of times that submitted outputs have been cited, and with appropriate benchmarks.
• These panels will use the information to inform and supplement their review of the outputs, to assist with achieving consistency, international benchmarking and where possible reducing workloads.
• There will be clear guidelines on using the data robustly to take account of the known limitations and to avoid bias (for example, citations are less meaningful for recently published outputs, and are not available for certain types of output). Panels will not make judgements about the quality of outputs solely on the basis of citation information; expert judgement must be applied. All submitted outputs will be treated equally, whether or not there is citation information available for them.
And here's the kicker as far as real-world relevance is concerned: in addition to the aforementioned outputs, it is proposed that a quarter of the overall excellence score be judged on impact or usage among practitioners. For us in political science, this should mean performing paid work for firms, development agencies, or local, state, or national governments that bolsters our research interests. In other words, there is an emerging emphasis on demonstrating work that is of use outside the usual circuit of refereed journals, conferences, and workshops. The REF guide even offers a nifty graphic on this point.
As someone who does consulting for government, I believe this change is long overdue, as there is much to be gained from dealing with real-world policy issues. I used to have long-running debates with an old professor who claimed that the "n=1" of actually working in the public or private sector didn't matter since researchers could draw on the "n=thousands upon thousands" of academic research. Then, as now, I disagreed: one cannot gain a proper perspective pecking away at a keyboard, divorced from the day-to-day workings of business and government. You must both talk the talk and walk the walk, IMHO.
These upcoming changes are already having an effect on university hiring here in the UK. For instance, the well-respected University of Warwick is currently looking to hire an Assistant Professor in international political economy (IPE). Lo and behold, among the listed duties and responsibilities are the following:
4. Where appropriate and expedient, to secure contract work to benefit (your) research activity and to provide resources to underpin this activity...
8. To work where appropriate with Research Support Services in realising potential commercial benefits of research for the Department and the University.
What can I say? Ditto. Like many universities, Warwick is sprouting a consulting arm of its own that can help prepare it for the REF, here called Research Support Services. Among its aims are:
Aims
* to transfer knowledge and skills from the university to industry
* to develop graduates for industrial careers
* to increase industrial relevance of academic research and teaching
* to encourage investment by industry into innovation
Benefits to the Company
* Highly skilled graduate to work on a strategic company project
* Access to academic expertise and university facilities
* Improved competitiveness and financial benefits from completed projects
Benefits to the Academic and the University
* Development of collaborations with innovative businesses
* Development of business-relevant teaching materials
* Conference material and publish high quality research papers
While I generally applaud these changes, I must point out that left-leaning academics may increasingly find themselves marginalized by their inability to provide evidence of this sort of work. A Marxist instructor I know of is delighted to have found scholarship-related work with China's Communist Party, though others may not be so lucky. For instance, I have no idea how Trotskyites would address a public policy question like, say, garbage collection.
Real-world relevance is increasingly likely to become the litmus test of research worthiness. Even the newer, high-profile academic journals are moving in this direction. At least in this respect, those of us in the UK are somewhat ahead of the curve.