Xin Shen, Hui-fen Zhu, Feng-rong He, Wei Xing, Li Li, Jing Liu, Juan Yang, Xing-fei Pan, Ping Lei, Zhi-hua Wang, and Guan-xin Shen
An anti-transferrin receptor antibody enhanced the growth inhibitory effects of chemotherapeutic drugs on human non-hematopoietic tumor cells
International Immunopharmacology 8:1813-1820, 20 December 2008
Some folk on Twitter were discussing the difficulty of correcting the academic literature when researchers make a mistake which slips past editors and reviewers.
I (used to) have an interest in the area, particularly the (really obvious) technical problems and (really obvious) technical solutions in academic publishing, but escaped the field because I had little interest in the utterly absurd non-technical barriers that stand in their way. More interesting than the tedious process of correcting mistakes, I think, is the tedious process of highlighting deliberate fraud.

There is more reason to tread delicately with research fraud, of course. In many cases, wasting public money and misleading others (especially in fields with important applications, like medicine) amounts to a crime, with far greater repercussions than mere mistakes, and accusations of crime should not be made lightly. But equally, they can't be treated lightly.

The attached shows a couple of Western blots. These are a way to show that treating some cells in a dish with a medicine has changed the amount of a specific protein of interest in the cells. In the first row of each blot, the bands change in intensity, showing that the protein of interest changes in quantity. The second row is just a control, showing actin, an unrelated protein which we know is present in cells at consistent levels and which isn't affected by medicines*. The actin bands do not change in intensity, reassuring us that changes in the first set of bands really are due to relative changes in the quantity of that specific protein, and not simply because the researchers loaded less total protein from those samples.

Except that in this case the bands at the bottom have been copy-pasted in Photoshop. So we don't know that the actin bands really were all the same intensity, we don't know that the researchers really did load the same amount of total protein from each sample, and so we don't know that the medicine they were testing really does do what they claim it does.

Whoever did it took some very basic steps to disguise it, with some rotations and mirroring.
But they neglected to clone-stamp out some of the tell-tale specks of dirt and random warps in the shapes of the copied bands.

All they've really done is waste some money and some time. The worst that can happen is that some more gets wasted pursuing it a little further before folk discover that it doesn't actually work as claimed, and it's chalked up as another line of research that looked promising but for whatever reason couldn't translate to medicine. In the scheme of things, it hardly seems worth getting excited about. It's a fluke that I happened to spot this one: it must be the tip of an iceberg.

The point is not so much what the researchers did, but what happens in the publishing system when it's pointed out what the researchers did. I first saw the figure almost five years ago**, and I've finally got bored of it sitting there with no progress on correction. It's three and a half years since I pointed it out to Elsevier and the journal's academic editor, and two and a half years since they promised they "will move forward with a published note identifying that the figure was suspect," which I've still not seen evidence of. Nothing will be done about the inappropriate behaviour itself: "it happened sufficiently long ago that the student was no longer at the university, and finding original data was challenging."

So if the publishers aren't going to facilitate this sort of thing (and if I'm no longer inside that industry having to play along with their system), I think in this case the balance is in favour of ignoring the absurd non-technical barriers and the need to tread carefully around serious accusations, and in favour of using the obvious technical solution to getting things published.

* Yes, I know I'm oversimplifying actin.
** Yes, that was several months before Elsevier's publication date. Unfortunately, the Elsevier journal to which it was submitted as a duplicate had already published it before I could find a way to deal with it from inside the system.
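The duplicated bands here were spotted by eye, but the underlying check (does one region of the image match another under some rotation or mirroring?) is mechanical enough to sketch in code. The snippet below is purely illustrative, not any published forensic tool: it compares two equally-sized grayscale patches under all eight rotations/flips and flags a suspiciously high correlation. The function names and the 0.95 threshold are my own assumptions; real image-forensics work would also handle resampling, contrast changes, and sliding-window search.

```python
import numpy as np

def normalized_correlation(a, b):
    """Pearson correlation between two equally-sized grayscale patches."""
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def dihedral_variants(patch):
    """Yield the 8 rotations and mirrorings of a patch (the dihedral group D4)."""
    p = patch
    for _ in range(4):
        yield p
        yield np.fliplr(p)
        p = np.rot90(p)

def best_match(patch_a, patch_b, threshold=0.95):
    """Score patch_b against every rotated/mirrored copy of patch_a.

    Returns (best_score, is_suspect). A score near 1.0 under any variant
    suggests possible duplication; genuinely independent blot bands should
    not correlate that highly. Threshold is an illustrative assumption.
    """
    scores = [normalized_correlation(v, patch_b)
              for v in dihedral_variants(patch_a)
              if v.shape == patch_b.shape]
    best = max(scores) if scores else 0.0
    return best, best >= threshold
```

A mirrored-and-rotated copy of a band would score essentially 1.0 under one of the eight variants, while the copier's small clone-stamp omissions (the specks and warps mentioned above) are exactly what keeps such copies detectable by eye as well.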
It really is (or was, in 2008) challenging to deal with research misconduct from provincial Chinese research institutions.
A result inside 4 years is pretty good, and a lot faster than usual. This does show that journals and authors are often not interested in "getting it right". There are, of course, noble exceptions, but these are unusual. At my last count of retractions on Retraction Watch (post entitled "Getting Science Right Side Up"; http://ferniglab.wordpress.com/2013/09/16/getting-science-right-side-up/) just a few retractions fell into the category where authors had been proactive in correcting mistakes/artefacts. The rest were the consequence of blood being squeezed out of a stone…
So congratulations are in order!