Apologies for the gap between posts – I wanted to spend more time on this particular piece to make sure that I captured what is a technical matter accurately.
In our last post, we looked at the 2005 research study published by Coyle on Lance Armstrong, in which he concluded that Armstrong’s efficiency had improved by 8% over the 7-year period during which he was tested.
That finding became a crucial cog in Armstrong’s legal defence after his sixth Tour de France title, because Coyle argued that this physiological adaptation was responsible for Armstrong’s dominance of the Tour. We said in our last post that the study had numerous scientific flaws, not least the fact that Coyle never once tested Armstrong during the Tour de France season. What transpired in the aftermath of the 2005 publication was only really the beginning: those study-design considerations and problems of interpretation were soon overtaken by the realisation that the data were in fact incorrect, thanks to a calculation error made in working out the efficiency. That error is the subject of this post.
The “minor miscalculation”
The story behind this latest revelation is that three scientists and one mathematician from Australia requested access to Coyle’s data, presumably to examine it more thoroughly. At this stage, we must point out that the Coyle finding was surprising and challenged for a number of reasons:
- The numerous flaws in design, questions around calibration and issues of interpretation that we have already explained
- The fact that cycling efficiency had never been shown to change so consistently over a period of time as a result of training or maturation. Take the following quote from Michael Ashenden, one of the scientists who challenged the finding: “They were really concerned, on a scientific level, that Coyle had been able to perpetuate this myth that cycling efficiency changes.”
- There is a theory that cancer treatment and chemotherapy should cause a decrease in efficiency. Why? Because the removal of a testicle and subsequent chemotherapy shift the body towards a greater use of fat as a source of fuel. Fat, however, is not as efficient as carbohydrate – the energy produced per liter of oxygen is lower for fat than for carbohydrate – and so this theory suggests that Armstrong should, if anything, have become less efficient. This is not by itself reason for doubt – a theory is, after all, a hypothesis that exists to be tested and potentially disproven – but it was another reason to view the Coyle data with skepticism.
However, the key point was that this “science” was being propagated as fact, when there were very clearly errors and problems with it. The scientists’ approach was then to ask for the data, so that they could analyse it themselves, interrogate it and discover whether the conclusions were in fact valid.
Getting the data – apparently a problem
Science is, in theory, transparent. It should be, and every scientist should be confident enough in their method and results to make data available at any time for evaluation. If they are not, then the data should not be published. That’s why a scientific paper is so particular about its method – every study should be repeatable by others, to confirm or refute what it has found. If the process is legitimate, then the data should be above “reproach”. It turns out that getting hold of Lance Armstrong’s data was not quite so simple. Apparently, Ashenden and his colleagues were stonewalled when they raised their concerns, and eventually had to lodge a case of scientific misconduct against Coyle, with his University in Texas.
That finally got them SOME data – I must stress that it seems they only got some of the information. If you look at the letter that Gore et al sent to JAP[cite]10.1152/japplphysiol.90459.2008[/cite], they state in the opening paragraph that “Coyle made available raw data from the January 1993 test…” If you then jump to the end of the letter, you read the following: “The magnitude of this error warrants recalculation of the entire data set, but raw data from the remaining test sessions are not available from the author”.
Therefore, despite having gone through an official complaint of scientific misconduct, the scientists are still not able to evaluate ALL the evidence, because its author will not make it available. Now, why is this the case? Could Coyle have “lost” the data? Highly doubtful – I suspect that given the subject and his importance, this data will be backed up numerous times. So then the only conclusion is that some reason exists not to make it available.
The Error and why the complete data set must be evaluated
The Australian group, in their letter, make the point that a calculation error had been made, which is really the catalyst for the latest round of discussion. What was that error? We gave a hint in our previous post when we introduced the following graph:
This graph shows how delta efficiency is calculated – basically, it is the inverse of the slope of this line. Your use of oxygen rises as you do more and more work, and if you take the slope and invert it, you have a measure of how much work is being done per unit of energy consumed (energy expenditure is calculated from oxygen use).
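To make that calculation concrete, here is a minimal sketch in Python. The work rates and energy values are invented for illustration only – they are not Armstrong’s actual data – but they show how the slope of the energy-versus-work line yields delta efficiency:

```python
import numpy as np

# Invented submaximal test data for illustration (not Armstrong's numbers):
# work rate in watts, and the corresponding energy expenditure in watts,
# as would be derived from VO2 via a caloric equivalent.
work = np.array([100.0, 150.0, 200.0, 250.0, 300.0])
energy = np.array([550.0, 770.0, 990.0, 1210.0, 1430.0])

# Fit energy = intercept + slope * work. Delta efficiency is the inverse
# of the slope, expressed as a percentage.
slope, intercept = np.polyfit(work, energy, 1)
delta_efficiency = 100.0 / slope

print(round(delta_efficiency, 2))  # → 22.73
```

Note that the fitted intercept is positive – that is the resting energy expenditure, which becomes important in what follows.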
Now, what Coyle did, you’ll recall, is calculate Armstrong’s efficiency over 7 years from 1993 to 1999, interrupted by the cancer diagnosis and treatment.
However, what the Australian researchers realised when they looked at the data from 1993 is that Coyle had used the wrong equation. Without going into massive detail, if you look at the graph above, you will see that even when you are doing ZERO work, you are still using oxygen (otherwise, you’d be dead). This resting oxygen use at zero load is important, because it reflects a baseline metabolic rate. What Coyle did was to neglect this zero-work point and FORCE the line to go through the origin. Using the wrong equation changed the slope of the relationship between oxygen use and work rate, and so his calculation of delta efficiency is incorrect.
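The effect of forcing the line through the origin can be demonstrated with the same kind of invented data – a sketch of the statistical point, not a reconstruction of Coyle’s actual numbers:

```python
import numpy as np

# Invented data for illustration only (not Armstrong's actual values):
# energy expenditure rises linearly with work rate, with a positive
# intercept representing resting energy expenditure at zero work.
work = np.array([0.0, 100.0, 150.0, 200.0, 250.0, 300.0])  # watts
energy = 110.0 + 4.4 * work                                # watts equivalent

# Correct approach: ordinary least squares with an intercept.
slope_ols, intercept_ols = np.polyfit(work, energy, 1)

# Incorrect approach: force the line through the origin
# (slope = sum(x*y) / sum(x^2)), which ignores the resting baseline.
slope_forced = np.sum(work * energy) / np.sum(work ** 2)

print(100.0 / slope_ols)     # delta efficiency from the correct fit (~22.7%)
print(100.0 / slope_forced)  # the forced fit distorts the slope (~20.5%)
```

Because the true intercept is positive, forcing the line through zero steepens the slope and so understates the efficiency – the same direction of bias the Australians identified in the 1993 data.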
What the Australians realised is that if you use the CORRECT equation, the calculated values change substantially. Their letter provides the numbers – they applied the correct equation and worked out that Armstrong’s delta efficiency in 1993 was actually 23.55%, and NOT the 21.75% that Coyle reported. The graph below shows the effect of this correction.
The question marks are there, of course, because we’re all speculating as to how the correct equation would change the values – that data has yet to be released, so speculation is all we have…
The impact of the change
The change is huge – about 8% – and therefore the rest of the data must be evaluated. In response to this revelation, Coyle has admitted that he made an error. However, he has downplayed its importance, saying that it is minor and makes “no practical difference”. I might point out that his error is in fact LARGER than the change in efficiency he found in Lance Armstrong! The 8% change was significant when it was Lance’s efficiency; apparently it is not when it is the error he made…
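The size of the error can be checked directly from the two published 1993 values:

```python
# The two published 1993 delta efficiency values:
corrected = 23.55  # Gore et al.'s recalculation with the correct equation
reported = 21.75   # the value Coyle originally reported

# Relative size of the error, as a percentage of the original value.
error_pct = 100.0 * (corrected - reported) / reported
print(round(error_pct, 1))  # → 8.3
```

An 8.3% error in a single test, in a study whose headline finding was an 8% change over seven years.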
The other defence put forward by Coyle is that the error is reduced in significance because he calculated efficiency at a high VO2, and so the effect of a resting metabolic rate is expected to be minimal. This is in fact completely incorrect. The higher the VO2, the greater the impact of the calculation on the slope. In other words, the slope actually changes by MORE (and hence the efficiency changes) at a high VO2 than at a low VO2. So Coyle’s defence doesn’t hold there either.
Summing up: The big picture
The relevance of this error, and the whole process of evaluating this paper, extends into the scientific community, perhaps more than it does to the cyclist. These two posts have been much more technical than we usually write, but hopefully you can appreciate the importance of discussing this kind of scientific misinterpretation and error.
In response to it all, Coyle is quoted in the New York Times as saying: “This is a minor waste of my time. However, I don’t understand how they can afford to spend so much time on this. Don’t they have real jobs?”
I suppose one could put this down to opinion, but as I wrote the other day, as a scientist, your “real job” is to pursue scientific truth. So in fact, they did exactly what they were supposed to do. It doesn’t seem that the same applied in 2005 when the study was done.