Small sample sizes, poor study design, researcher bias, selective reporting, and other problems combine to make most research findings false. For my Master's project, all the first-year grad students were given a paper and asked to critique it. It turned out every one of those papers was wrong. And all of them had been published in leading journals in their respective fields. I recall reading my paper and thinking, "this is utter crap". It was horrible. It was easy to knock the thing down, and I was a stupid first-year grad student. Everyone else had obviously bad papers too. How did these things get published? Having never been a reviewer for an academic paper, I can't answer that. But I'm sure there is a bias against turning a paper down - almost a philanthropic need not to hurt a person's feelings. And of course, who has time to read all those damn papers anyway? There was a lesson the advisors were trying to get across: "Don't let this be you."
Poor study design and poor use of statistics are very prevalent in academic research. My advisor forced me to take advanced courses on both during my Master's program. Knowing what you can and cannot say about a set of data is paramount to your work. It was not uncommon for people to slip up in their defenses because they said something that just wasn't supported by the facts, or presented data that was just plain useless because of the design of the system that produced and captured it.
Bias, however, is the force that I saw most often when I was doing my PhD. It was everywhere. Smart people could look at a set of data and interpret it in completely opposing ways based on their going-in assumptions. It was constant. And we ALL had it. I'm thankful that our group was quite open about discussing experiments, theory, and simulations. It helped overcome this problem. But I can remember lots of times that I ran an experiment, or completed some theory, or ran a simulation, and the answer just perplexed me. There was a creeping tendency in these situations to want to redo what I had done and rejigger something slightly in the hopes that the 'problem' would remedy itself. There were times, when I was stressed and had been working long hours, that I just wanted to change the data. There was no group discussion at those times to sway me away. The only thing that did was a quote I recall. I don't even remember who said it. But it went something like, "It's the moments when your results don't come out like you expected that something interesting could be going on." I wouldn't be surprised if 75% of data is 'massaged', as we would say.
The worst case of this occurred during a defense I attended. Defenses were open where I studied. There was a closed-door portion at the end, but the presentation and a Q&A period were open to anyone who was interested. In this particular case, a PhD candidate from the medical school was defending some work he had done on blood flow that was in the same field I was studying - fluid dynamics. About 5 transparencies in (they were transparencies back then), he showed an equation that he had used. It was a 3-D integral in spherical coordinates. In that coordinate system you must include extra Jacobian terms in the integrand.
The only problem was that he didn't have the right terms. He had a cosine where a sine should have been. I thought to myself, "that must be a typo, but I'll ask anyway". To my surprise (and dismay) it wasn't a typo. An argument ensued between me and the defender until finally his advisor said, "It's wrong. We'll fix it." I felt horrible. Luckily his work wasn't that hard to replicate. I did not attend the second defense. But this guy had done his work and made calculations that were incorrect, and yet he had claimed they were correct by way of comparison to real experimental data. What happened? On what night did this smart man decide to massage the data, or say, "screw it - I'll just make the data up"?
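For readers who want the specifics: the exact equation from that defense isn't reproduced here, but assuming the integral was in spherical coordinates (the usual place a sine term shows up in a 3-D integral), the volume element looks like this:

```latex
% Volume integral in spherical coordinates:
% the Jacobian factor is r^2 * sin(theta), NOT r^2 * cos(theta).
\iiint_V f \, dV
  = \int_0^{2\pi} \int_0^{\pi} \int_0^{R}
      f(r, \theta, \phi)\, r^2 \sin\theta \; dr \, d\theta \, d\phi
```

Swapping the sine for a cosine mis-weights the contributions near the poles versus the equator, so every number computed downstream of the integral would be systematically off.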
While I could come down hard on him for what he did, I also know it is easy to do. Very, very easy. One too many cups of coffee. One too many late nights. One too many instances where your advisor berates you. One too many setbacks. Anyone can do it. And therefore we shouldn't take any article that reports on academic research at face value. And yet we do. We as readers have bias. Watch next time you see a report that supports your bias. You read it and mutter to yourself, "I knew it." And next time you see one that refutes your bias, you won't read it, and you'll think back to this article: most of them are wrong anyway. Bias is just insidious. The researchers have it, and so do the consumers of the research. Be careful out there.