The iron law of scientific oligarchies

The scientific community has entrenched oligarchies. Science would be better served if they were broken up.

In scientific research, the aim should be to do solid work, and every now and then a bigger story will emerge by itself. The funding to do that work would be distributed equitably based on the quality of the science rather than the topic, and people would get on with the task of reading a little of nature’s infinite book of secrecy.

The problem nowadays is that the ability to secure funding is arguably more important to a scientist’s career than the quality of their science – if you are successful in obtaining money then your scientific output can take its chances, but if you are not successful in obtaining money then your scientific output will dwindle as the resources available to do it drain away. Competing to obtain funds for proposed work is more important in career terms than actually doing the work itself.

The founding idea of awarding scientific funding competitively is to ensure that only good-quality scientific proposals are supported, and that money is not simply divvied out in a nepotistic way. As with a lot of ideas, this sounds good – and is good – at smaller scales, but frays once things have expanded, and especially once demand outstrips supply. In a small community with plentiful resources, competitively awarded funding acts as a quality-control measure. But as the number of good-quality proposals exceeds the funding capacity of the system, funding decisions inevitably become more biased, as is the case now.

There’s another effect that comes into play once things have increased in size beyond a certain point: Michels’ iron law of oligarchy. This asserts that while direct democracy is feasible in small groups (such as pirate ships or, less romantically, parts of Switzerland), it becomes unwieldy once groups reach a certain size, at which point representatives are needed. This delegation of responsibility – and therefore of power – creates a leadership class whose influence becomes entrenched. The leadership class occupies administrative positions within the enlarged group that ensure access is controlled, and their grip over decision-making and the flow of information (think funding panels, editorial boards, tenure committees) guarantees the maintenance of the status quo.

This creates a feedback loop, in which the most important goal is to signal that you belong in the oligarchy rather than to meet the standards that originally generated it. The toxic effects are felt at every level of the career ladder as people scramble to stay on the guide rails, and they explain why people are so obsessed with publishing supposedly “blockbuster” papers in prestige journals. It’s not about the science, it’s about signalling that you either already belong, or deserve to belong, in a wealthy club with a very restricted membership. The defining property of prestige journals is not the quality or originality of the work, but that they are hard to get into. Just like a private club.

As Michael Eisen has observed, we perhaps overrate the influence of prestige journals. For most would-be oligarchs, a lot of their membership credentials are already baked in before that critical first-author postdoc paper comes along (and recent work on how few universities supply the majority of professors in the USA supports that conclusion). But while many of those credentials are extremely hard to access for those outside the circle of privilege (undergraduate university, postgraduate university, postdoctoral lab), the seeming equality of the publication process – and the fact that this may be both the first and last opportunity for many scientists from less advantageous backgrounds – makes it seem like the be-all and end-all of their research careers. And for many, it might well be.

Because once you’re in, you’re in. Even with the frankly obscene charges that prestige journals are currently levying, it’s not uncommon to hear people say that they’d pay such sums in a heartbeat, because getting that publication is the gateway to an ERC grant or a similarly lavish funding award. The 10,000+ article publication fee therefore becomes an investment in a potential 1,000,000+ grant windfall, i.e. a 100-fold return. And each subsequent article creates the possibility of another grant windfall. That leads to the current system, where publishing supposedly “high-impact” papers is the portal to securing yet more funding and a chance for funders, institutions, and individuals alike to trumpet the excellent research that’s being done with taxpayer money.

In science terms, there are two problems with this – and they both lead to an inequitable distribution of funds across a very narrow range of labs. First, funding science is an investment. And just as with the stock market, you don’t really know whether the investment will pay off. Just because somebody once published something labelled a breakthrough does not mean – and it almost never does – that they will continue to publish actual scientific breakthroughs for the rest of their career. In fact, what most scientists do is settle down into the comfortable middle-aged spread of research – a larger group contentedly puzzling out all the little intricacies of the thing they first stumbled upon. Wayne Wahls has eloquently highlighted how risk is minimised by investing heavily in the blue-chip stuff, i.e. groups that are already successful and well-established, instead of distributing the money more widely (fun fact: capping NIH funding at 1,000,000 dollars per group would free up sufficient funds to support 10,000 new groups!).

Second, in a hypercompetitive system where research quality is no longer a guarantor of funding but merely the first hurdle, it’s better to be working in a hot area than a warm or lukewarm one. This produces the now-usual stampedes of groups toward whatever thing is currently trending, be it lipid rafts, autophagy, extracellular vesicles, epigenetics, phase separation, or coronavirus, with prestige journals acting as tastemakers and trendsetters. Prestige journals are not good barometers of quality science but they are excellent barometers of fashionable science. Good science is much more widely distributed. This is the second way of minimising (or rather, justifying) risk – a trendy area must be important because of the attention it’s getting, and so investing in that area demonstrates that money is being directed to important projects.

It’s a comfortable system because funders can point to papers published in prestige journals as evidence of good investment, and papers in prestige journals increasingly require a sizeable investment in personnel and infrastructure to be impressive. The question is what is better served – the science, or the people primarily benefitting from the distribution of funds, i.e. the oligarchs. 

All this suggests a lack of oversight and regulation, and despite all the justified hand-wringing, the fact remains that the funding landscape we see today is the natural endpoint of the current structure of scientific research. If we insist on continuing to judge science and scientists primarily on papers and grants, this is where we arrive. More money won’t help – it will change the absolute numbers but not the percentages, so we’ll have more oligarchs but the same system architecture. Another, very revealing argument is often heard from the oligarch class: that if more money is not available for the system, then we should reduce the number of people and train fewer PhDs. In other words, reduce access (the oligarchs’ own labs being obviously exempt from these restrictions).

It’s common to see funders and institutions engaged in “No, you first” antics when the question of system reform comes up, but ultimately taxpayers should be entitled to find out whether the money invested in research is being used effectively to address societal needs. What would effective investment mean? High quality work, a diverse portfolio of work across a broad spectrum of scientific disciplines and areas, and equitable access to good-quality infrastructure that ensures the maximum possible number of good scientists are being produced. Instead, money is currently being spent on sustaining a leadership class whose activities are geared primarily towards maintaining that status (with advancing knowledge a hoped-for by-product of the process).

If we want systemic change, then we need to break up the oligarchies. Restock the funding panels, editorial boards, and tenure committees with a more-diverse (and ideally younger) membership, make membership on each body time-limited, and ensure there are gaps between consecutive terms. In other words, spread the responsibility within the community to combat the emergence of entrenched groups and maximise inclusivity in the decision-making process. The science won’t suffer – there are far more good scientists than there are oligarchs.

Related postings:
Hat tip – Wayne Wahls – highlights Wahls’ eLife article on funding disparities in the USA.
Beating the Odds – on whether a handicapping system might benefit research.
Pop science – how “high impact” really means “mainstream”.
Blockbuster science vs arthouse science – how good research can be done by auteurs.
The new dubstep – how stampedes toward trendy areas damage quality.
