Category Archives: Science

Making science (part XIV): Journal Syndrome

The other day, I ran into O.A., one of my former students who is now a research group leader. O.A. is not the type who lacks self-confidence, and although he has a bit of a lazy attitude, he has some good ideas and a good feel for where the money is. I asked him how his research was going. He responded with a tepid smile, as if to indicate that I had asked the right question: “Very good. Next week I have a paper coming out in Nature, although I am only second-to-last author on that one. I published a paper in EMBO Journal just a few weeks ago. And we have also made some very interesting observations which will likely lead to a paper in a high-impact journal!”

I do not live in a bubble, so, disappointingly, I was not surprised. But it was difficult to keep myself from venting my frustration: “O., you tell me where you are publishing your work, but you don’t tell me what the work was about, what you have discovered! Isn’t that the important thing?” Well, I did not actually ask that last rhetorical question, but I should have.

Here is O.A., one of my former students, one of the promising ones, explaining his research in terms of the journals in which it is getting published, as if that were the only thing that matters in his science. Yes, he may have been trying to make an impression on his former mentor. But… where did the science go? Isn’t that what really counts? The “Journal Syndrome” has advanced to such a point that the title of the journal in which the research is published becomes more important than the research itself, and hence the preferred shorthand description for scientific output.

How did we get to this situation, and can this trend be reversed? Without doubt, this is a direct product of the current addiction to Impact Factors, the mother of most curses in modern science. However, while making Impact Factors disappear seems very difficult at this time, avoiding the Journal Syndrome should be comparatively simple. When someone asks about your research, pretend he or she is a distant relative with no inside knowledge and simply tell them what you found, in as few and simple words as possible. The Journal Syndrome manifests most commonly when the other person is also a scientist. In that case, you can allow yourself a bit more jargon and specifics, but the key point is always to keep the focus on your new findings. Try this next time. And if you are at the other end, as I was with my student O.A., don’t let them get away with the journal babble. Force them to tell you what they found. Hopefully, they’ll know…

Making science (part XIII): How not to make science

A newly recruited staff member in a research group has her first meeting with the principal investigator, a full Professor, to discuss projects and tasks to carry out in the lab. During the conversation, it becomes apparent that the so-called principal investigator is nothing more than a former clinician turned science administrator who pretends to lead a research group. There are no new projects coming from the mind of this principal investigator.

“Go to PubMed and find something interesting to work on”, says the Professor.

Astonished, the newly recruited lab member becomes silent and after a few awkward minutes leaves the room, in shock.

“Go to PubMed and find something interesting to work on”. Now, we should point out that PubMed is the public repository of all scientific literature in the life sciences and biomedicine, for the entire planet, since the beginning of time. There are literally millions of papers in the repository. How does one find “something interesting to work on” there? Is this the best advice, the best guidance, that this so-called senior scientist has to offer to his newly recruited lab member?

I could not believe it when I first heard this, but it is a true story. It happened at the National University of Singapore, but the characters shall remain anonymous. There are likely people like that in most universities around the world: group leaders who have no clue whatsoever what science is about, or what it is to be an inspiring mentor. How their reputations survive is a total mystery.

Making science (part XII): The problem with impact factors

I have been around long enough to remember the time when there were no impact factors. (Don’t know what an impact factor is? Read HERE). We all knew that, say, Nature was more prestigious (or sexy, hot, trendy, impactful, whatever you want…) than, say, JBC. And that JBC was a better journal than many (actually, many!) other (i.e. lower) journals. We did not need any impact factors to realise that. And of course this “intuitive” information was used to evaluate job candidates and assess tenure. A paper in Nature was very important, we all knew that, and we did not need any impact factors. The problem now is that impact factors put a hard number on what was earlier an intuitive, soft process. So, now we know that not only is Nature “better” than JBC, it is actually 10.12 times “better”. And PNAS is 2.23 times “better”. That is what has generated so many problems and distortions. The temptation to use those numbers is just too high, irresistible. For the journals, for the papers in them, and for individual scientists. And the numbers change every year. When applied to individual papers this gets totally crazy. Imagine: the “value” of a given paper can be higher (or lower) this year than, say, 3 years ago when it was published. The same paper, the same data. And let’s not get started with what the impact factor has done to innovation and creativity. (For a good view on this, read Sydney Brenner’s interview HERE).

Here is an idea. Why don’t we all get together and collectively sue Thomson Reuters for having commercialised (or Eugene Garfield, for having invented) this monster and caused so much havoc?

“A boring Nobel Prize” (or a lesson in mediocre science journalism)

At the end of September each year, science journalists all over the world make their forecasts for the upcoming Nobel Prize announcements, which take place during the first week of October in Stockholm, Sweden. The week begins with the announcement of the winners of the Nobel Prize in Physiology or Medicine, awarded by the Karolinska Institute. It is followed by the Physics, Chemistry and Literature Prizes. As expected, this activity is all the more significant at Swedish newspapers and TV and radio stations, and the year 2013 was no exception. Inger Atterstam, from the Svenska Dagbladet newspaper, is regarded as one of the most established science journalists in Sweden. Her 2013 forecast for the Physiology or Medicine Nobel Prize was vast and broad (to be on the safe side, presumably), and included scientists responsible for discoveries concerning the epidemiology of smoking, cochlear implants, and treatments against malaria, rheumatoid arthritis and leukaemia, and even Bill and Melinda Gates (!) (The nature of the discoveries made by the Gates couple which, according to Ms. Atterstam, deserved such a high honour was, however, not revealed).

Outside Nobel Forum, minutes after the announcement, Ms. Atterstam is visibly upset about the choice made by the Nobel Assembly for the 2013 Nobel Prize in Physiology or Medicine, and she makes no effort to hide her discontent in front of the cameras. The Prize went to James Rothman, Randy Schekman and Thomas Sudhof for their discoveries of the molecular mechanisms that control the specificity of trafficking, fusion and release of vesicles within and from cells. A long overdue award to one of the most influential and fundamental concepts in modern cell biology with direct relevance to a great number of human diseases including diabetes and neurological disorders. Incredibly important, but far away from any of the predictions made by Ms. Atterstam during the previous days. And it shows.

Microphone in hand, she confronts the unforgiving camera, visibly distressed. Her eyes roll from left to right, eluding the lens, her breath is heavy and agitated, her body swings back and forth. She does not pull her punches: “This was a very traditional Nobel Prize, namely to three white, middle-class men coming from three of the USA’s most prestigious and Nobel-awarded universities, Stanford, Berkeley and Yale…” Wow! How about that for a bigoted statement? After a brief (and failed) attempt to explain some of the substance behind the discoveries, Ms. Atterstam revels in her own ignorance: “On the other hand, this is a very traditional and boring Nobel Prize because it is about very basic research that no one really understands and that does not have any relevance, except in the realm of science.” Interesting words, coming from one of the leading science journalists in Sweden. Ms. Atterstam’s concluding remarks say it all: “The Nobel Committee has this time —once again— chosen not to give the Prize to applied research that concerns people [she chokes here] and which could thereby have drawn greater attention. We shall keep our hopes for the Higgs particle tomorrow.” Ms. Atterstam clearly considers the Higgs boson to be a discovery in applied science of immediate concern to people. 😉

Well, what else can be said? Here is one of the most prestigious science journalists in Sweden trying to explain basic research to the general public. As they say, with friends like Ms. Atterstam, who needs enemies?

Making science (part XI): A perverted view of “impact”

Marc Kirschner is the John Franklin Enders University Professor and Chair of the Department of Systems Biology at Harvard Medical School in Boston. He recently wrote an editorial for Science magazine, published on 14 June 2013.

Kirschner debunks the notion that research “impact” can be judged in advance as the likelihood that the proposed work will have a “sustained and powerful influence”. He writes: “Especially in fundamental research, which historically underlies the greatest innovation, the people doing the work often cannot themselves anticipate the ways in which it may bring human benefit. Thus, under the guise of an objective assessment of impact, such requirements invite exaggerated claims of the importance of the predictable outcomes—which are unlikely to be the most important ones. This is both misleading and dangerous.”

Importantly, he says that “One may be able to recognize good science as it happens, but significant science can only be viewed in the rearview mirror. To pretend otherwise distorts science.”

Everyone preoccupied with the “impact” of current and future research should read Marc Kirschner’s full piece HERE.

Making science (part X): What open access journals have in common with premium wine

Open access journals charge fees to their authors for publication of accepted articles. Some of those fees can be quite significant. Cell Reports, a new journal from Cell Press, charges $5,000 per article, the highest among open access research periodicals. There is currently a debate as to whether the journals that charge the most are the most influential. A recent survey appears to indicate that price doesn’t always buy prestige in open access. My friend and colleague M.F. has recently made a prescient comment in this context: “…but apart from the commercial desire to maximize profits, the pricing is probably designed as part of the brand signal, to make the point that this should be in the very top tier of journals. Similar to launching a new ‘premium’ wine to the market: if the price on release is low, the consumers will never perceive it as a premium wine…” Time will tell if this self-fulfilling prophecy is indeed true, or if journals like Open Biology or eLife can completely break that model.

Making science (part IX): The process of acceptance of scientific theories

Novel scientific propositions are initially met with skepticism. Eventually, they become accepted – at least some of them. The transition between heresy and mainstream has been debated ad nauseam. The British geneticist J.B.S. Haldane (1892-1964) was famous for many things, one of them being his incisive sarcasm. Haldane was an assiduous contributor to the Journal of Genetics, not only of scientific articles, but also of many book reviews. One of those reviews, published in 1963 (Journal of Genetics Vol. 58, page 464), is perhaps the best known of the lot, not because of the book being reviewed, but because of Haldane’s now famous description of the stages in the process of acceptance of scientific theories. In Haldane’s words, theories invariably pass through the “usual four stages”:

1. worthless nonsense
2. interesting, but perverse
3. true, but quite unimportant
4. I always said so

Kudos to Haldane for his sharp insight into the scientific process, which the passage of time has only helped to confirm.

I always said so!

Making science (part VIII): A more specialized journal

“Dear Author, 

Thank you for sending us your paper “Downregulation of HlpxE-mediated transcription doubles median life-span expectancy in humans”, but I am afraid we cannot offer to publish it in The Current Biologist.

We appreciate the interest in the issue you are addressing, and your results sound potentially significant for the field, but our feeling is that at this stage your paper would be better suited to a somewhat more specialised journal.

I am sorry that we cannot give you a more positive response, but thank you for your interest in The Current Biologist.

Regards,
Geoffrey South
Editor”

Many of us —professional scientists writing research articles— have had to confront this type of letter from journal editors. We have grown accustomed to them: a standard cut-and-paste piece of text used reflexively by editors without much thought or consideration. We file them promptly and move on. After all, there are plenty of journals around, both general and specialized. No big deal, right?

However, the concept of “a more specialized journal” remains as elusive as ever. Admittedly, The Current Biologist is a half-invented journal, but there are still plenty like it that claim to be “generalists” and yet publish papers with titles like “Slicing-Independent RISC Activation Requires the Argonaute PAZ Domain”, “Distinct Roles of Talin and Kindlin in Regulating Integrin α5β1 Function and Trafficking” or “SUMOylation of the α-Kleisin Subunit of Cohesin Is Required for DNA Damage-Induced Cohesion”, and so forth… How about that for “specialized” knowledge?

A journal that publishes papers containing three or more abbreviations or jargon terms in their titles cannot honestly claim to be a generalist. To ask its authors to submit their work to a more specialized journal is –at the very least– disrespectful to authors who have put so much work into a research study. Understandably, however, polite alternatives require more time and effort from journal editors. “We feel that the results presented in your manuscript lack mechanistic insight and therefore seem too preliminary for our journal” would be a more honest and realistic alternative. Or why not simply tell them the truth: “The topic of your study falls outside the scope of our journal”? Alas, either of these requires editors to have read the manuscript, which —sadly— is not always the case.

Perhaps it’s time to launch the “Journal of Specialized Biology”, a forum for all those research papers that —like the one above on doubling human life-span— have been deemed too specialized for die-hard “Argonaute PAZ Domain” generalists.


Making science (part VII): On the utility of science

In a recent interview for the podcast series of the Proceedings of the National Academy of Sciences of the USA, Ira Mellman of Genentech expressed his views on the utility of science practiced at academic institutions. After an academic career at Rockefeller and Yale University, Mellman joined Genentech in 2007, where he is Vice President of Research Oncology. He has been a member of the National Academy of Sciences since 2011. The interview is about current challenges in the field of cancer immunotherapy. But things get a bit more controversial at the end. Scroll the audio featured below to -0:35 and you’ll hear this:

PNAS: “I asked Mellman whether his move from academia to industry has brought him closer to his goal of practicing people-centered science.”

Mellman: “It gives one a deep feeling of satisfaction that you’ve actually done science that’s meaningful to people’s lives – and not just interesting, which is what one normally does in the academic realm.  You can be a terrific success if you are serially interesting, and it doesn’t really matter if you’re particularly useful. Here you really have to be both.”

So there you have it. In academia, it does not matter whether you are useful or not.

I have to respectfully disagree with Mellman. First of all, I have yet to find an academic scientist who does not care whether his or her discoveries have an impact on people’s lives. Second, the practical consequences of research are always of great importance at academic institutions, particularly in biomedicine. The clinical utility of biomedical research is always looming in the guidelines of research grants. There is no interesting biomedical research –or successful biomedical scientist, for that matter– that is not useful. Coming from Yale and Rockefeller, Mellman should know this very well.

But one point of greater philosophical interest is the general concept of “usefulness” in basic research. What does Mellman mean by being useful? Intriguingly, both of the research programmes mentioned in the interview as examples of people-centered science are rooted in basic discoveries made at academic institutions. Were those original discoveries not useful? Had they never been made, there would not be any “people-centered science” for Mellman. Two good quotes come to mind here as well:

“…the shortest path to medical breakthroughs may not come from a direct attack against a specific disease. Critical medical insights frequently arise from attempts to understand fundamental mechanisms in organisms that are much easier to study than humans”  — Bruce Alberts.

“Translational research is meaningless without something to translate.” “The idea that tens of thousands of scientists are sitting on secret knowledge that could be applied today, if only they were provided with the simple tools and incentives that are needed to create a start-up company, is simply absurd” — Howy Jacobs.

The utility of basic discoveries is difficult to predict, and Mellman has got himself into slippery territory here. Sadly, rather than making a serious statement about utility in science, Mellman simply seems to be justifying himself in front of his academic peers.

UPDATE August 17 2013: Must-read Editorials by Huda Zoghbi and Marc Kirschner in Science magazine on this topic.