Archive for Research Assessment

On Scholarly Communication

Posted in Open Access on June 9, 2025 by telescoper

My marking duties being over, it’s time once more to take up the cudgels against the academic publishing racket, at least in a small way, by sharing an article from the European University Association called Reclaiming academic ownership of the scholarly communication system. I recommend you read the entire piece, which is an extended briefing note. It can be downloaded as a PDF here. One of the points it makes very strongly is how much the Open Access movement has been hijacked by commercial publishers.

I will share a couple of sections with you here. First, some background information about Open Access Publishing:

Now I’ll cut to the chase and share the key points from the end.

  1. Accelerate the reform of research assessment. Most of the issues in the current publishing system are rooted in how academic staff are evaluated. Research assessment reform is essential to break the cycle of dependence on high-impact commercial journals and related metrics. Universities should consider broadening the criteria used in academic evaluation, to ensure that recognition goes beyond research to include teaching, innovation, leadership, open science practices, and societal outreach. While institutional, regulatory, and cultural factors can either facilitate or hinder reforms, many universities are already taking the initiative and implementing changes (even in countries with centrally regulated academic career assessment processes).
  2. Strengthen institutional publishing services and infrastructures. A robust, sustainable and interoperable scholarly publishing ecosystem requires each university to properly curate their research contributions and outputs, through institutional or shared infrastructure and services (e.g. repositories, publishing platforms, and CRIS systems). Strengthening these institutional capacities may require reallocating resources and cooperation (see points 3 and 4). This should also apply to the various institutional departments (libraries, research management, etc.) and staff needed to support academics and researchers.
  3. Cooperate and coordinate with other universities, research performing and funding organisations, as well as researchers’ associations and learned societies. The challenges of scholarly publishing are systemic, and no single institution can tackle them alone. Universities should align their efforts with other academic organisations, funders and research institutions. Cooperation and coordination can be valuable for advocacy, policy development and implementation, as well as for shared or “horizontal” services and infrastructures. Cooperation can also take place within regional, national, European and global frameworks.
  4. Critically evaluate expenditure on commercial research publishing and information products and services. As new not-for-profit publishing alternatives emerge and consolidate, universities should regularly evaluate their expenditure on commercial products and services, including journal publication costs and research databases. By promoting cost transparency and cost efficiency, institutions can make informed decisions that support innovation and reinvest funds into institutional publishing services and infrastructure (see point 2). Where feasible, preference should be given to not-for-profit solutions, ultimately reducing costs and ensuring sustainability.
  5. Support and promote the use of rights retention by the university community. Rights retention should be used to regain academic ownership of scholarly communication. Universities should actively advocate for legislative reforms that allow researchers to retain their rights and freely share their research. They should also educate and inform their faculty and researchers of the importance of rights retention and provide legal support. Where legally feasible, institutions should implement and enforce rights retention policies to ensure that publicly funded research remains publicly accessible.
  6. Ensure researcher engagement. Any transition toward a more equitable and sustainable scholarly communication system must involve the academic community. Universities should raise awareness of the systemic issues in scholarly publishing and create spaces for dialogue, reflection, and co-design to discuss how to address them at institutional level. Engaging researchers early and consistently can help shift perceptions, foster a sense of shared responsibility and build support for long-term cultural change.

I endorse all of these, and have written about some of them before (e.g. here), but I would add to the first that universities should actively lobby their governments to change research assessment methods that in many cases cause an immense waste of public money by outsourcing assessment to entities, such as Scopus, that are mere fronts for the academic publishing industry.

ResearchFish Again

Posted in Biographical, Science Politics on April 1, 2025 by telescoper

One of the things I definitely don’t miss about working in the UK university system is the dreaded Researchfish. If you’ve never heard of this bit of software, it’s intended to collect data relating to the outputs of research grants funded by the various Research Councils. That’s not an unreasonable thing to want to do, of course, but the interface is – or at least was when I last used it several years ago – extremely clunky and user-unfriendly. That meant that, once a year, along with other academics with research grants (in my case from STFC), I had to waste hours uploading bibliometric and other data by hand. A sensible system would have harvested this automatically, as most of it is available online at various locations, or allowed users simply to upload their own publication list as a file; most of us keep an up-to-date list of publications for various reasons (including vanity!) anyway. Institutions also keep track of all this stuff independently. All this duplication seemed utterly pointless.
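Just to illustrate how straightforward the automatic approach would be: given a DOI, most of the bibliographic metadata Researchfish asked for can be pulled from public sources in a few lines of code. Here is a minimal sketch in Python using the public Crossref REST API and the requests library; the function name and the bare-bones error handling are my own illustration, and a real harvester would of course need rate limiting and fallbacks to arXiv or ADS.

    # Minimal sketch: harvest bibliographic metadata for a single DOI from
    # the public Crossref REST API (https://api.crossref.org). Field names
    # follow Crossref's JSON schema; everything else is illustrative.
    import requests

    def fetch_metadata(doi: str) -> dict:
        """Return basic bibliographic metadata for a given DOI."""
        response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        response.raise_for_status()
        record = response.json()["message"]
        return {
            "title": (record.get("title") or [""])[0],
            "journal": (record.get("container-title") or [""])[0],
            "year": record.get("issued", {}).get("date-parts", [[None]])[0][0],
            "authors": [f"{a.get('given', '')} {a.get('family', '')}".strip()
                        for a in record.get("author", [])],
        }

    # Example: look up the Scopus "museum of errors" paper cited later on.
    print(fetch_metadata("10.1016/j.joi.2015.11.006"))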

I always wondered what happened to the information I uploaded every year, which seemed to disappear without trace into the bowels of RCUK. I assume it was used for something, but mere researchers were never told to what purpose. I guess it was used to assess the performance of researchers in some way.

When I left the UK in 2018 to work full-time in Ireland, I took great pleasure in ignoring the multiple emails demanding that I do yet another Researchfish upload. The automated reminders turned into individual emails threatening that I would never again be eligible for funding if I didn’t do it, to which I eventually replied that I wouldn’t be applying for UK research grants anymore anyway. So there. Eventually the emails stopped.

Then, about three years ago, Researchfish went from being merely pointless to downright sinister, as a scandal erupted about the company that operates it (called Interfolio), involving the abuse of data and the bullying of academics. I wrote about this here. It then transpired that UKRI, the umbrella organization governing the UK’s research councils, had been actively conniving with Interfolio to target critics. An inquiry was promised, but I don’t know what became of that.

Anyway, all that was a while ago and I no longer live or work in the UK, so why mention Researchfish again now?

The reason is something that shocked me when I found out about it a few days ago. Researchfish is now operated by commercial publishing house Elsevier.

Words fail. I can’t be the only person to see a gigantic conflict of interest. How can a government agency allow the assessment of its research outputs to be outsourced to a company that profits hugely from the publication of those outputs? There’s a phrase in British English which I think is in fairly common usage: marking your own homework. It refers to individuals or organizations given responsibility for regulating their own products. It is very apt here.

The acquisition of Researchfish isn’t the only example of Elsevier getting its talons stuck into academic life. Elsevier also “runs” the bibliometric service Scopus, which it markets as a sort of quality indicator for academic articles. I put “runs” in inverted commas because Scopus is hopelessly inaccurate and unreliable; I can certainly speak from experience on that. Nevertheless, Elsevier has managed to dupe research managers – clearly not the brightest people in the world – into thinking that Scopus is a quality product. I suppose the more you pay for something the less inclined you are to doubt its worth, because if you find you have paid for worthless junk you look like an idiot.

A few days ago I posted a piece that included this excerpt from an article in Wired:

Every industry has certain problems universally acknowledged as broken: insurance in health care, licensing in music, standardized testing in education, tipping in the restaurant business. In academia, it’s publishing. Academic publishing is dominated by for-profit giants like Elsevier and Springer. Calling their practice a form of thuggery isn’t so much an insult as an economic observation. 

With the steady encroachment of the likes of Elsevier into research assessment, it is clear that as well as raking in huge profits, the thugs are now also assuming the role of the police. The academic publishing industry is a monstrous juggernaut that is doing untold damage to research and is set to do more. It has to stop.

The Scopus Horror Show

Posted in Open Access on October 31, 2024 by telescoper

Today being Hallowe’en, it seems an appropriate time to tell you a horror story. A few weeks ago I posted about the inaccuracy of the Scopus bibliographic database. I’ve contacted Scopus multiple times to supply them with correct data about the Open Journal of Astrophysics, but the errors persist. It seems I’ll have to take legal action to get them to correct the false and misleading information Scopus is displaying.

I was recently told about a paper with the title The museum of errors/horrors in Scopus. Written by F. Franceschini, D. Maisano & L. Mastrogiacomo and published in 2016, it demonstrates that people have known how poor Scopus is for many years. Yet still it is used.

Here is part of the abstract:

Recent studies have shown that the Scopus bibliometric database is probably less accurate than one thinks. As a further evidence of this fact, this paper presents a structured collection of several weird typologies of database errors, which can therefore be classified as horrors. Some of them concern the incorrect indexing of so-called Online-First paper, duplicate publications, and the missing/incorrect indexing of references. A crucial point is that most of these errors could probably be avoided by adopting some basic data checking systems.

DOI: 10.1016/j.joi.2015.11.006

Eight years on, there’s no sign of Scopus adopting “basic data checking systems”, but then they don’t really have any incentive to improve, do they? It seems the world of research assessment refuses to question the reliability of the product. Critical thinking is an alien concept to the bean counters.
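For what it’s worth, the “basic data checking” the authors call for need not be anything sophisticated. Here is a toy sketch in Python of one such check, flagging likely duplicate records by normalized title and year; needless to say, this is an illustration of the idea, not Scopus’s actual pipeline.

    # Toy example of a basic data check: flag likely duplicate records by
    # comparing normalized titles plus publication year. Illustrative only.
    import re
    from collections import defaultdict

    def normalize(title: str) -> str:
        """Lower-case a title, turn punctuation into spaces, trim whitespace."""
        return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

    def find_duplicates(records: list[dict]) -> list[list[dict]]:
        """Group records that share the same normalized (title, year) key."""
        groups = defaultdict(list)
        for rec in records:
            groups[(normalize(rec["title"]), rec.get("year"))].append(rec)
        return [group for group in groups.values() if len(group) > 1]

    records = [
        {"id": 1, "title": "The museum of errors/horrors in Scopus", "year": 2016},
        {"id": 2, "title": "The Museum of Errors, Horrors, in Scopus", "year": 2016},
        {"id": 3, "title": "An unrelated paper", "year": 2016},
    ]
    print(find_duplicates(records))  # records 1 and 2 come out as duplicates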

P.S. Oíche Shamhna shona daoibh go léir! (Happy Hallowe’en to you all!)

Reformscape – the Video!

Posted in Education, Maynooth on February 2, 2024 by telescoper

Not long ago, I posted an item about the San Francisco Declaration on Research Assessment (DORA). I was interested this week to see the latest initiative from DORA, which is called Reformscape. You can read much more about it here. Here are some excerpts from the introduction:

  • The old ways of assessing the quality of research and progressing the careers of researchers are no longer fit for purpose. These dated approaches are neither fair nor responsible and often leave talented people overlooked, holding back progress in diversity, equity and inclusion. Institutions are increasingly expected to move with the times and update their assessment practices, but making meaningful change isn’t easy. 
  • Luckily, many have gone before you. Institutions around the world have been busy figuring out how to overcome the challenges of reforming academic career assessment, and we are here to help you learn from their experiences.
  • DORA Reformscape is an online tool where you can explore examples of how to bring responsible assessment for hiring, promotion and tenure into your institution, and to share your approach with others.

There’s also an introductory video:

Since my own institution, Maynooth University, is a signatory of DORA, I am sure that it will already be working to incorporate the Reformscape recommendations into its own processes…

The Physics Overview

Posted in Science Politics on January 17, 2009 by telescoper

I found out by accident the other day that the Panels conducting the 2008 Research Assessment Exercise have now published their subject overviews, in which they comment on trends within each discipline.

Heading straight for the overview produced by the panel for Physics (which is available, together with those of two other panels, here), I found some interesting points, some of which relate to comments posted on my previous items about the RAE results (here and here) until I terminated the discussion.

One issue that concerns many physicists is how the research profiles produced by the RAE panel will translate into funding. I’ve taken the liberty of extracting a couple of paragraphs from the report to show what they think. (For those of you not up with the jargon, UoA19 is Unit of Assessment 19, which is Physics.)

The sub-panel is pleased with how much of the research fell into the 4* category and that this excellence is widely spread so that many smaller departments have their share of work assessed at the highest grade. Every submitted department to UoA19 had at least 70% of their overall quality profile at 2* or above, i.e. internationally recognised or above.

Sub-panel 19 takes the view that the research agenda of any group, or of any individual for that matter, is interspersed with fallow periods during which the next phase of the research is planned and during which outputs may be relatively incremental, even if of high scientific quality. In the normal course of events successful departments with a long term view will have a number of outputs at the 3* and 2* level indicating that the groundwork is being laid for the next set of 4* work. This is most obviously true for those teams involved with very major experiments in the big sciences, but also applies to some degree in small science. Thus the quality profile is a dynamic entity and even among groups of very high international standing there is likely to be cyclic variation in the relative amounts of 3* and 4* work according to the rhythm of their research programmes. Most departments have what we would consider a healthy balance between the perceived quality levels. The subpanel strongly believes that the entire overall profile should be considered when measuring the quality of a department, rather than focussing on the 4* component only.

I think this is very sensible, but for more reasons than are stated. For a start, the judgement of what is 4* or 3* must be to some extent subjective, and it would be crazy to allocate funding entirely according to the fraction of 4* work. I’ve heard informally that the error in any of the percentages for any assessment is plus or minus 10%, which also argues for a conservative formula. However one might argue about the outcome, the panels clearly spent a lot of time and effort determining the profiles, so it would seem to make sense to use all the information they provide rather than just a part.
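To make the point concrete, here is a small worked example in Python comparing a funding score based on the full quality profile with one based on the 4* fraction alone. The 7:3:1 weighting and the two department profiles are entirely made up for illustration; they are not HEFCE’s actual formula.

    # Illustrative comparison: full-profile weighting versus rewarding only
    # the 4* fraction. A profile gives the percentage of work rated 4*, 3*,
    # 2*, 1* and unclassified; the 7:3:1 weights are hypothetical.
    def score(profile, weights=(7, 3, 1, 0, 0)):
        """Profile-weighted score per unit of submitted work."""
        return sum(p * w for p, w in zip(profile, weights)) / 100

    dept_a = (20, 45, 30, 5, 0)   # hypothetical department A
    dept_b = (25, 35, 30, 10, 0)  # hypothetical department B

    for name, profile in [("A", dept_a), ("B", dept_b)]:
        print(name, "full profile:", score(profile), "4*-only:", profile[0])
    # A scores 3.05 against B's 3.10 on the full profile (a gap of under 2%),
    # but 20 against 25 on 4* work alone (a 25% gap). With an uncertainty of
    # plus or minus 10 points on any percentage, the 4*-only comparison is
    # well inside the noise.

Under the conservative full-profile formula the two hypothetical departments are nearly indistinguishable, which is exactly why a selective formula based on the 4* fraction alone would amplify the measurement noise.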

Curiously, though, the panel made no comment about why it is that physics came out so much worse than chemistry in the 2008 exercise (about one-third of the chemistry departments in the country had a profile-weighted quality mark higher than or equal to the highest-rated physics department). Perhaps they just think UK chemistry is a lot better than UK physics.

Anyway, as I said, the issue most of us are worrying about is how this will translate into cash. I suspect HEFCE hasn’t worked this out at all yet either. The panel clearly thinks that money shouldn’t just follow the 4* research, but the HEFCE managers might differ. If they do wish to follow a drastically selective policy they’ve got a very big problem: most physics departments are rated very close together in score. Any attempt to separate them using the entire profile would be hard to achieve and even harder to justify.

The panel also made a specific comment about Wales and Scotland, which is particularly interesting for me (being here in Cardiff):

Sub-panel 19 regards the Scottish Universities Physics Alliance collaboration between Scottish departments as a highly positive development enhancing the quality of research in Scotland. South of the border other collaborations have also been formed with similar objectives. On the other hand we note with concern the performance of three Welsh departments where strategic management did not seem to have been as effective as elsewhere.

I’m not sure whether the dig about Welsh physics departments is aimed at the Welsh funding agency HEFCW or at the individual university groups; SUPA was set up with the strong involvement of the SFC, and various other physics groupings in England (such as the Midlands Physics Alliance) were actively encouraged by HEFCE. It is true, though, that the three active physics departments in Wales (Cardiff, Swansea and Aberystwyth) all did quite poorly in the RAE. In the last RAE, HEFCW did not apply as selective a funding formula as its English counterpart HEFCE, with the result that Cardiff didn’t get as much research funding as it would have if it had been in England. One might argue that this affected the performance this time around, but I’m not sure about that, as it’s not clear how any extra funding coming into Cardiff would have been spent. I doubt HEFCW will do anything different this time either. Welsh politics has a strong North-South issue going on, so HEFCW will probably feel it has to maintain a department in the North. It therefore can’t penalise Aberystwyth too badly for its poor RAE showing. The other two departments are larger and had very similar profiles (Swansea’s better than Cardiff’s, in fact) so there’s very little justification for being too selective there either.

The panel remarked on the success of SUPA, which received a substantial injection of cash from the Scottish Funding Council (SFC) and which has led to new appointments in strategic areas in several Scottish universities. I’m a little bit skeptical about the long-term benefits of this, because the universities themselves will have to pick up the tab for these positions when the initial funding dries up. Although it will have bought them extra points on the RAE score, the continuing financial viability of physics departments is far from guaranteed, because nobody yet knows whether they will gain as much cash from the outcome as they spent to achieve it. The same goes for other universities, particularly Nottingham, which have massively increased their research activity with cash from various sources and consequently done very well in the RAE. But will they get back as much as they have put in? It remains to be seen.

What I would say about SUPA is that it has definitely given Scottish physics a higher profile, largely from the appointment of Ian Halliday to front it. He is an astute political strategist and respected scientist who performed impressively as Chief Executive of the now-defunct Particle Physics and Astronomy Research Council and is also President of the European Science Foundation. Having such a prominent figurehead gives the alliance more muscle than a group of departmental heads would ever hope to have.

So should there be a Welsh version of SUPA? Perhaps WUPA?

Well, Swansea and Cardiff certainly share some research interests in the area of condensed-matter physics, but their largest activities (Astronomy in Cardiff, Particle Physics in Swansea) are pretty independent. It seems to me to be well worth thinking about some sort of initiative to pool resources and try to make Welsh physics a bit less parochial, but the question is how to do it. At coffee the other day, I suggested that an initiative in the area of astroparticle physics could bring in genuinely high-quality researchers as well as establish synergy between Swansea and Cardiff, which are only an hour apart by train. The idea went down like a lead balloon, but I still think it’s a good one. Whether HEFCW has either the resources or the inclination to do something like it is another matter, even if the departments themselves were to come round.

Anyway, I’m sure there will be quite a lot more discussion about our post-RAE strategy if and when we learn more about the funding implications. I personally think we could do with a radical re-think of the way physics in Wales is organized and could do with a champion who has the clout of Scotland’s SUPA-man.

The mystery as far as I am concerned remains why Cardiff did so badly in the ratings. I think the first quote may offer part of the explanation because we have large groups in Astronomical Instrumentation and Gravitational Physics, both of which have very long lead periods. However, I am surprised and saddened by the fact that the fraction rated at 4* is so very low. We need to find out why. Urgently.