Prof. Eve Marder, of Brandeis University, was one of the founding editors of eLife, a scientific journal launched in 2012 and one of the few still run by working scientists, as opposed to the so-called “professional” editors employed by most commercial journals.
Dr. Marder recently wrote an opinion article for the journal in which she sharply criticises the kinds of words, often derogatory, that reviewers use when judging research papers, grants and appointments.
She writes: “Over the years I have grown to truly abhor some of the words that are overused and abused when we review manuscripts, job candidates, and grant applications. In particular, I now detest five words: incremental, novelty, mechanism, descriptive, and impact. These words are codes behind which we hide, and are frequently used in lieu of actual explanations of what people think about the subject at hand.”
It can be quite frustrating to have to summarize many years of work in just 150 words, but that is what scientists often have to do when writing the Abstract section of their research papers. However, a well-written Abstract is crucially important, as it is the first thing (sometimes the only thing!) that readers will read, including the journal editors who will decide whether to publish the paper. It can really make or break the success of the article. Yet many young and budding scientists struggle with this section, usually because of an inability to distill the single most important and essential part of the discovery in a clear and simple way.
Once upon a time, science journals were run by scientific societies and their editors were active scientists. Very few of these remain today. Instead, most journals are now owned by private, usually very large, publishing companies and their editors are “professional”. That is, their only job is to be journal editors; they are not active scientists. Most of them were active scientists earlier in their careers, but left academia to become “professional” editors, usually shortly after their postdoctoral studies. Because of this, most professional editors are much younger (no problem there) and considerably more inexperienced (hmmm…) than the principal investigators from whom they receive manuscripts for consideration. Typically, these “youngish” editors can get advice and (one would hope) guidance from more senior editors within the same journal or publishing organization, but they are pretty much in charge of the main decisions on the manuscripts assigned to them.
The other day I ran into O.A., one of my former students, who is now a research group leader. O.A. is not the type that lacks self-confidence, and although he has a bit of a lazy attitude, he has some good ideas and a good feel for where the money is. I asked him how his research was going. He responded with a tepid smile, as if to indicate that I had asked the right question: “Very good. Next week I have a paper coming out in Nature, although I am only second-to-last author on that one. I published a paper in EMBO Journal just a few weeks ago. And we have also made some very interesting observations which will likely lead to a paper in a high-impact journal!”
I do not live in a bubble, so, disappointingly, I was not surprised. But it was difficult to keep myself from venting my frustration: “O., you tell me where you are publishing your work, but you don’t tell me what the work was about, what you have discovered! Isn’t that the important thing?” Well, I did not actually ask that last rhetorical question, but I should have.
Here is O.A., one of my former students, one of the promising ones, describing his research in terms of the journals in which it is getting published, as if that were the only thing that matters in his science. Yes, he may have been trying to make an impression on his former mentor. But… where did the science go? Isn’t that what really counts? The “Journal Syndrome” has advanced to such a point that the title of the journal in which the research is published becomes more important than the research itself, and hence the preferred shorthand description of scientific output.
How did we get to this situation, and can the trend be reversed? Without doubt, it is a direct product of the current addiction to Impact Factors, the mother of most curses in modern science. However, while making Impact Factors disappear seems very difficult at this time, avoiding the Journal Syndrome should be relatively simple. When someone asks about your research, pretend he or she is a distant relative with no inside knowledge and simply tell them what you found, in as few and as simple words as possible. The Journal Syndrome manifests most commonly when the other person is also a scientist; in that case you can allow yourself a bit more jargon and detail, but the key is always to keep the focus on your new findings. Try this next time. And if you are at the other end, as I was with my student O.A., don’t let them get away with the journal babble. Force them to tell you what they found. Hopefully, they’ll know…
A newly recruited member of a research group has her first meeting with the principal investigator, a full Professor, to discuss projects and tasks to carry out in the lab. During the conversation, it becomes apparent that the so-called principal investigator is nothing more than a former clinician turned science administrator who pretends to lead a research group. There are no new projects coming from the mind of this principal investigator.
“Go to PubMed and find something interesting to work on”, says the Professor.
Astonished, the newly recruited lab member falls silent and, after a few awkward minutes, leaves the room in shock.
“Go to PubMed and find something interesting to work on”. Now, we should point out that PubMed is the public repository of all the scientific literature in the life sciences and biomedicine, for the entire planet and since the beginning of time. There are literally millions of papers in the repository. How does one find “something interesting to work on” there? Is this the best advice, the best guidance, that this so-called senior scientist has to offer his newly recruited lab member?
I could not believe it when I first heard this, but it is a true story. It happened at the National University of Singapore, but the characters shall remain anonymous. There are likely people like this in most universities around the world: group leaders who have no clue whatsoever what science is about, or what it is to be an inspiring mentor. How their reputations survive is a total mystery.
I have been around long enough to remember the time when there were no impact factors. (Don’t know what an impact factor is? Read HERE.) We all knew that, say, Nature was more prestigious (or sexy, hot, trendy, impactful, whatever you want…) than, say, JBC. And that JBC was a better journal than many (actually many!) other, lesser journals. We did not need any impact factors to realise that. And of course this “intuitive” information was used to evaluate job candidates and assess tenure. A paper in Nature was very important; we all knew that, and did not need any impact factors.

The problem now is that impact factors put a hard number on what used to be an intuitive, soft process. So now we know that not only is Nature “better” than JBC, it is actually 10.12 times “better”. And PNAS is 2.23 times “better”. That is what has generated so many problems and distortions. The temptation to use those numbers is just too high, irresistible: for the journals, for the papers in them, and for individual scientists. And the numbers change every year. When applied to individual papers this gets totally crazy. Imagine: the “value” of a given paper can be higher (or lower) this year than, say, 3 years ago when it was published. The same paper, the same data. And let’s not get started on what the impact factor has done to innovation and creativity. (For a good view on this, read Sydney Brenner’s interview HERE.)
Marc Kirschner is the John Franklin Enders University Professor and chair of the Department of Systems Biology at Harvard Medical School in Boston. He recently wrote an editorial for Science magazine, published on 14 June 2013.
Kirschner debunks the notion of research “impact”, understood as the likelihood that the proposed work will have a “sustained and powerful influence”. He writes: “Especially in fundamental research, which historically underlies the greatest innovation, the people doing the work often cannot themselves anticipate the ways in which it may bring human benefit. Thus, under the guise of an objective assessment of impact, such requirements invite exaggerated claims of the importance of the predictable outcomes—which are unlikely to be the most important ones. This is both misleading and dangerous.”
Importantly, he says that “One may be able to recognize good science as it happens, but significant science can only be viewed in the rearview mirror. To pretend otherwise distorts science.”
Everyone preoccupied with the “impact” of current and future research should read Marc Kirschner’s full piece HERE.
Open access journals charge fees to their authors for publication of accepted articles. Some of those fees can be quite significant. Cell Reports, a new journal from Cell Press, charges $5,000 per article, the highest among open access research periodicals. There is currently a debate as to whether the journals that charge the most are the most influential. A recent survey appears to indicate that price doesn’t always buy prestige in open access. My friend and colleague M.F. has recently made a prescient comment in this context: “…but apart from the commercial desire to maximize profits, the pricing is probably designed as part of the brand signal, to make the point that this should be in the very top tier of journals. Similar to launching a new “premium” wine to the market, if price on release is low, the consumers will never perceive it as a premium wine… . Time will tell if this self-fulfilling prophecy is indeed true, or if journals like Open Biology or eLife can completely break that model.“
Novel scientific propositions are initially met with skepticism. Eventually, at least some of them become accepted. The transition between heresy and mainstream has been debated ad nauseam. The British geneticist J.B.S. Haldane (1892-1964) was famous for many things, among them his incisive sarcasm. Haldane was an assiduous contributor to the Journal of Genetics, not only of scientific articles but also of many book reviews. One of those reviews, published in 1963 (Journal of Genetics Vol. 58, page 464), is perhaps the best known of the lot, not because of the book being reviewed, but because of Haldane’s now-famous description of the stages in the acceptance of scientific theories. In Haldane’s words, theories invariably pass through the “usual four stages”:
1. worthless nonsense
2. interesting, but perverse
3. true, but quite unimportant
4. I always said so
Kudos to Haldane for his sharp insight into the scientific process, which the passage of time has only helped to confirm.
“Thank you for sending us your paper “Downregulation of HlpxE-mediated transcription doubles median life-span expectancy in humans”, but I am afraid we cannot offer to publish it in The Current Biologist.
We appreciate the interest in the issue you are addressing, and your results sound potentially significant for the field, but our feeling is that at this stage your paper would be better suited to a somewhat more specialised journal.
I am sorry that we cannot give you a more positive response, but thank you for your interest in The Current Biologist.
Regards,
Geoffrey South
Editor”
Many of us —professional scientists writing research articles— have had to confront this type of letter from journal editors. We have grown accustomed to them: a standard cut-and-paste piece of text, used by editors in knee-jerk fashion without much thought or consideration. We file them promptly and move on. After all, there are plenty of journals around, both general and specialized. No big deal, right?
However, the concept of “a more specialized journal” remains as elusive as ever. Admittedly, The Current Biologist is a half-invented journal, but there are still plenty like it that claim to be “generalist” and yet publish papers with titles like “Slicing-Independent RISC Activation Requires the Argonaute PAZ Domain”, “Distinct Roles of Talin and Kindlin in Regulating Integrin α5β1 Function and Trafficking”, or “SUMOylation of the α-Kleisin Subunit of Cohesin Is Required for DNA Damage-Induced Cohesion”, and so forth… How about that for “specialized” knowledge?
A journal that publishes papers containing three or more abbreviations or jargon terms in their titles cannot honestly claim to be a generalist. To ask authors to submit their work to a more specialized journal is –at the very least– disrespectful to authors who have put so much work into a research study. Understandably, however, polite alternatives require more time and effort from journal editors. “We feel that the results presented in your manuscript lack mechanistic insight and therefore seem too preliminary for our journal” would be a more honest and realistic alternative. Or why not simply tell them the truth: “The topic of your study falls outside the scope of our journal”? Alas, either of these requires editors to have read the manuscript, which —sadly— is not always the case.
Perhaps it’s time to launch the “Journal of Specialized Biology”, a forum for all those research papers that —like the one above on doubling human life-span— have been deemed too specialized for die-hard “Argonaute PAZ Domain” generalists.