Isn’t it time we acknowledged that research is only occasionally a pleasure?
Research is an addiction. It’s consuming. It can become the focus of your whole world, but what does it usually mean in practice? Research too often requires long hours, is stressful, and is unavoidably coupled to luck and serendipity…and yet we’re in thrall to it.
Many papers have more authors than citations. Citations don’t mean reads. Reads don’t mean respect.
But it’s research activity that represents how we scientists like to see ourselves – advancing human knowledge, going into the unknown. Going into the unknown is, however, generally not as comfortable as Captain Kirk had it in “Star Trek”. It’d be more like if Spock and Uhura and Bones and Sulu were all competing with each other, and if Kirk had to keep playing a lottery in order to get fuel to keep the Enterprise running, and if Scotty was great but prone to alcoholic binges that limited his working hours, and if his crew was a mixed bag of outstanding young stars and Ensign Rickys who were doomed to screw up immediately and be obliterated. (No, wait, that is kind of what his crew was like.)
The problem with research is that it so seldom makes us happy. Research can be a pretty miserable state of affairs when that’s all you’re doing, because the opportunity for positive feedback is so limited. How often do people tell you your last paper was any good? How often do you even have a paper – a paper that you can really consider yours, instead of being a named but noteless co-author – that you can actually celebrate? The increasing length of scientific publications means that the gap between papers keeps lengthening, so research junkies just get more and more strung out between massive hits. It’s like continually veering between chronic abstinence and acute debauchery, instead of settling into some loving, rewarding, and stable relationship.
But always there’s that cultish mentality in the background that says research is the be-all and end-all.
It’s not helped by the fact that we’re often so embedded in tiny niches that we lose perspective on the importance of our own work. Cognitive bias inevitably inclines us to think our research is more significant than it is, especially if we’re immersed in it to the exclusion of all else, so the inevitable rejections – of manuscripts, of Fellowships, of grants – leave us only ever a step away from an existential crisis, instead of being seen for what they are: a vital part of the system that helps keep it rigorous and meritocratic. And that’s the positive spin. If you start seeing the rejections as political, cliquey, tribalistic, or motivated by personal feelings…well, your blood pressure is going to be spending more time up than down, and your mood the opposite.
So we’re left with the stress of work, the slow trickle of data, the fear of competition, the agony of the review process…counteracted only briefly by a gnat’s-orgasm buzz of a successful publication or grant application.
Another problem with a research-intensive existence is that there’s not (paradoxically) enough mental stimulation. Scientists are intellectuals. Alongside our practical work we should be leading an intellectually rich existence – thinking, talking, reading, writing, debating, collaborating, teaching, mentoring, cogitating, communicating, ruminating. The all-hours-at-the-bench culture of blind data acquisition and a rush to publish success stories is – particularly in the post-genomic era of annotation – often not very mentally rewarding. It’s why conferences are invariably so much fun. Conferences transiently create the kind of intellectually lively environment that we tell ourselves we’re in the whole time.
Instead, research-intensive science too often creates a weird kind of plantation culture. More research means more papers, which means more grants, which means more money, which means more people, which means more research…and so on. The process keeps running as a self-propagating, self-fertilising system, but one in which the plantation owners (who nowadays too often resemble salespeople, self-promoters, and merchants) reap the profits. Meanwhile, the diligent wage workers subscribe to a collective Stockholm syndrome that lets them believe their contributions are uniquely valuable and that they too might one day get to run a plantation of their own, even as they are exploited as a disposable and replaceable resource by the people they aspire to emulate.
Strangely, this wilful mass delusion doesn’t extend to other aspects of the scientific life. Teaching, for example, is acknowledged to be difficult, time-consuming, frequently dull, yielding uncertain rewards, and often accompanied by a large dose of ingratitude from both above and below one’s station. It can be inspiring, and it can make a real impact. But not always.
Research is just the same, but we refuse to see it in the same pragmatic terms. Like theoretical physicists lording it over experimentalists (the “Oompa-Loompas of science” line remains the best joke in “The Big Bang Theory”), research-intensive labs invariably see themselves as superior to labs with a larger teaching component, but it’s an open question who actually contributes more to society. The reflexive insistence of research institutes that they’re more important to science may be a subconscious reaction to the fact that most of the people there are stressed and miserable and seeking validation.
Hobbies are fun precisely because you don’t depend on them to pay the bills. As soon as you turn your hobby into your job, a lot of the fun tends to go out of it. Similarly, research is at its most fun when you’re just pottering around and the pressure is off. These are also the conditions that are most likely to stimulate creativity and bold ideas. Reducing the reliance on research should – paradoxically – benefit research. But an excessive emphasis on research tends to yield lots of high-quality consolidation work rather than bold ideas; unless the incentive structure is put together in an innovative way, you simply get places that fixate on how many “high impact” (yawn) papers they can produce in “top journals” (yawn) in the shortest possible time. This isn’t science, this is Goodhart’s law in action.
Smart mentorship should encourage intellectual engagement and not just data generation. Smart training should encourage young scientists to develop a portfolio of scientific skills (lecturing, communicating, supervising, writing) as well as a facility at benchwork.
It’s important to see the bigger picture and recognise that science is more, much more, than just research. It’s more fun that way.