Some recent interactions with medical researchers, along with a few conferences I've attended, have caused me to think about incentives in research, and I'm quite worried.
First, I've realized how finicky and sensitive the scientific process is. Final results can be skewed significantly in the "wrong" direction by variations in equipment, ingredient formulations, specific techniques, and parameters. Many intermediate ingredients (cells, RNA, etc.) are available off the shelf, which seems convenient but often comes with quite variable quality (I personally watched researchers re-order an RNA compound because, twice, they received something that failed to work as advertised). The scientific process is complex, difficult, and still remarkably labor-intensive. You would think that in the 21st century the "rote" work would all be outsourced to cheaper labor destinations and/or fully automated with machines, but that's still far from prevalent. In many of the most prominent research universities, science is still being done in a form that's closer to high school chemistry than to 2001: A Space Odyssey.
Second, I've frequently heard about "publish or perish" and the extreme focus on publishing positive, statistically significant results. This makes people care more about quantity than quality, and about "proving" hypotheses right rather than disproving them or trying new techniques even if they don't work. There are no rewards for failure, and you can't get a patent for trying. In professional science, unlike school, there is unfortunately no "A for effort." I think we lose a lot of valuable information and waste a lot of time duplicating techniques instead of sharing with each other by publishing things like, "I tried these 5,000 combinations. They didn't work. If you read this, you might want to try something else."
This creates several complications for the pure pursuit of knowledge and the improvement of the human condition. The best publications are peer-reviewed, and the "peers" are the ones competing with the authors for the same publication slots. The one-way anonymity (reviews are not double-blind) means people scratch the backs of their friends and form "societies" (which like to meet at conferences) that function like old boys' clubs for cheering each other on and publishing each other's work. The focus on publishing also produces so much research that no one can follow it or keep track of it. I'm always shocked when I see a citation noting that an article ran on pages 1,056-1,064. Who out there is reading thousand-page journals? I see the same problem with patents: publishing research and filing a patent make the knowledge accessible to someone searching for it, but they don't make it prevalent, and they don't cue anyone to read the findings on their own.
In addition, the drive to publish quantity and show "results" even when they're suspect pushes the quality of research down and yields false results. John Ioannidis at Stanford wrote about how too much medicine relies on flawed assumptions, explaining why most published research findings are false. The WSJ ran several articles explaining how pharmaceutical companies are unable to reproduce most research findings (see above about scientific complexity and sensitivity to specific conditions and compounds). It's as if we're giving people prizes for trying something a thousand times until they finally get lucky enough (or are careful enough) to produce something statistically significant, instead of rewarding them for working hard, producing truthful results, and sharing their experiences either way. A rough back-of-the-envelope illustration of why that claim isn't as crazy as it sounds follows below.
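To make the Ioannidis point a bit more concrete, here is a minimal sketch, assuming made-up but plausible numbers (this is my own illustration, not his actual model): suppose only a small fraction of the hypotheses being tested are actually true, studies have 80% power, and the significance threshold is 5%. Even with perfectly honest statistics, a large share of the "significant" results that get published are false positives.

# My own rough illustration of the Ioannidis-style arithmetic; the
# prior, power, and alpha values below are assumptions for the example.
def false_positive_share(prior_true=0.10, power=0.80, alpha=0.05):
    """Fraction of statistically significant findings that are false,
    given the prior probability a tested hypothesis is true,
    the study's power, and the significance threshold."""
    true_positives = prior_true * power          # true hypotheses correctly detected
    false_positives = (1 - prior_true) * alpha   # false hypotheses slipping past p < alpha
    return false_positives / (true_positives + false_positives)

for prior in (0.50, 0.10, 0.01):
    print(f"P(hypothesis true) = {prior:.2f} -> "
          f"{false_positive_share(prior_true=prior):.0%} of significant findings are false")

With a 50% prior only about 6% of significant findings are false; at a 10% prior it's already over a third; at 1% the false ones dominate. Selective publication of positives and any bias in how experiments are run only make the picture worse.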
I'm not trying to diss researchers or publications or universities. I know almost all the individuals are honest and extremely hard-working and do believe in the deeper goals of science. I just think the current system is sub-optimal, and I don't know how to fix it. I'm curious to hear what others think.
7 Comments
6/5/2012 10:21:09 am
The scientific method wasn't designed for finding breakthroughs; I thought it was designed for replicating results as reliably as possible once breakthroughs are finally found.
Max
6/5/2012 10:51:19 am
Thanks for the comments and good points. I'm intrigued by your idea of analytics and metrics to help people really absorb new research more efficiently and effectively.
6/6/2012 02:54:39 am
I don't particularly use analytics and metrics in my daily life, but I've watched how they help people and companies identify interesting trends and break out of falsely held beliefs.
Max
6/6/2012 03:48:01 am
Good point on the handling of data vs. simple quantity of data.
Helen
6/7/2012 04:42:34 pm
I couldn't agree with you more about the current publication system. What I hear from my life-science colleagues is that only solid stories get published in high-profile journals. But "to tell a good story," how much negative data is omitted, intentionally or not? If all the data on the negative side of the story were disclosed, I would expect it to bring about a completely new "story" and improve the efficiency of scientific discovery. Another underlying question is whether a complex biological system can really be interpreted as stories at all; I don't think that is the only, or the best, way to present the outputs of scientific research.