As promised, we turn our attention to this story, which broke last week, coinciding with the news that Lance Armstrong is coming out of retirement and will try to race the 2009 Tour de France. It’s quite an intricate story, and technical, so forgive the longish post, but we try to go back to the beginning and then work through the sequence of events in order.
The story, reported in the New York Times, is that Ed Coyle, a physiologist at the University of Texas, admitted to making what he calls a “minor error” when calculating Lance Armstrong’s efficiency during his research. That “minor error” happens to have a major impact on the study’s findings, since the main finding was that Armstrong became more efficient between 1993 and 1999.
The paper was, it must be said, widely criticized from the beginning. It drew two separate letters criticizing the methods, debating the scientific rigour of the testing, and questioning the conclusions. It became something of a “shining example” of research without quality control, a running joke of sorts within sections of the scientific community.
I recall attending a conference in the USA soon after its publication – it was the hot topic, of course, because not only was Coyle doing research on an elite subject, it was THE elite subject – the “greatest physiological specimen in the world”. Just as fans will wait years for the chance to see their favourite rock stars or musicians, any exercise physiologist would leap at the chance to publish data on a record-breaking Tour de France cyclist!
So predictably, the paper was something of a conversation starter at scientific conferences and in the media. At the conferences, conversation was not positive, however, with many dismissing it as trivia, rather than science. They were being kind…Few would have expected the next two years to keep the paper quite as much in the public and legal eye, since it became a legal defence vehicle for Armstrong, as we shall see…
With regards to the media, the paper was huge – it was reported widely as “fact” that Armstrong’s success was the result of his never-seen-before increase in efficiency. This is typical of how media spin sometimes contaminates science. In this particular case, that science was not even particularly “clean”, with many holes, but nevertheless, the media lapped it up. This is of course frustrating for most scientists, since the “sensationalization” of science is rarely constructive. When it is also poor science, all the more reason to pursue the truth…
First things first: The Coyle study from 2005. What was found?
To begin with, we have to look back and report on the findings of Coyle in the research study that is now the focal point of the “error”.
The paper[cite]10.1152/japplphysiol.00216.2005[/cite] was called “Improved muscular efficiency displayed as Tour de France champion matures”, which kind of reveals the paper’s hand from the very first line. Here’s a breakdown of what Coyle did (note that we’re focusing only on the efficiency part, and not some of the other measurements made. If interested, you can download the entire paper here. See also the end of this post for the links to the entire series of exchanges in the Journal of Applied Physiology.)
The figure below demonstrates just what Coyle did, and what he found.
The research began in November 1992, when Coyle did his first battery of tests on Armstrong. He then did a second test a few months later, followed by a third in 1993. Then a long break interrupted the testing, and it was in that period that Armstrong was diagnosed with and treated for testicular cancer.
Testing resumed in August 1997, and the final test took place in November 1999. This was the only testing session that coincided with Armstrong’s Tour de France dominance (1999 to 2005), although it must be pointed out that it was done in November, four months AFTER Armstrong won the 1999 Tour. This has relevance for Coyle’s conclusions, as we shall see.
The test: Explaining efficiency measures
Testing consisted of a VO2max test, during which Coyle measured respiratory gases (oxygen in, CO2 out) and took a blood lactate measurement at the end of the test. From these, he calculated two important variables:
- Gross Efficiency – the ratio of work done to energy expended to do the work. The work done is taken from the power output, while the energy expended to do the work is calculated using the respiratory gases and calculations we won’t get into here. But for example, if a cyclist is riding along at 200 W, and their respiratory gases are used to calculate that their energy consumption is 1000 W (or Joules per second), then that cyclist is 20% efficient, according to this method.
- Delta Efficiency – this is a more comprehensive method, because it is calculated as the ratio of the change in work done per minute to the change in energy expended per minute. It is considered a better measure of efficiency because it takes into account the use of oxygen (and energy) at rest and when no work is being performed.
This gets technical, but the simple way to think of this is that as you do more work, your oxygen use rises. We can use that oxygen use to calculate how much energy you are using, and then say that energy use is proportional to work rate (that’s fairly obvious, hopefully).
Now, if we take the inverse of the slope of that line (in other words, the change in work done relative to the change in energy use), we can work out delta efficiency. However, it is critical that this slope take into account the energy use when you are not doing any work – the resting energy use, and also the energy use when cycling at zero load. The graph below is schematic, but I use it to illustrate the point – energy use rises with increasing work rate, but the line must account for energy use when the work rate is zero.
Studies as far back as 1975 (Gaesser & Brooks) have shown that gross efficiency tends to skew the results, because of the failure to account for energy use at zero load. Therefore, delta efficiency is considered the better method, but only if used properly, as we’ll see!
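The two efficiency measures described above can be sketched in a few lines of code. This is a minimal illustration with made-up numbers (not Armstrong’s actual test data): gross efficiency is a simple ratio at one work rate, while delta efficiency comes from the inverse of the slope of a least-squares fit of energy expenditure against work rate, so the baseline (zero-load) energy cost is absorbed into the intercept rather than distorting the result.

```python
def gross_efficiency(work_rate, energy_expenditure):
    """Gross efficiency: work done / total energy expended (both in W)."""
    return work_rate / energy_expenditure

def delta_efficiency(work_rates, energy_expenditures):
    """Delta efficiency: change in work / change in energy.

    Computed as the inverse of the least-squares slope of energy
    expenditure vs. work rate; the fit's intercept soaks up the
    resting / zero-load energy cost.
    """
    n = len(work_rates)
    mean_w = sum(work_rates) / n
    mean_e = sum(energy_expenditures) / n
    slope = (sum((w - mean_w) * (e - mean_e)
                 for w, e in zip(work_rates, energy_expenditures))
             / sum((w - mean_w) ** 2 for w in work_rates))
    return 1.0 / slope

# The worked example from the text: 200 W of cycling work while
# expending 1000 W of metabolic energy -> 20% gross efficiency
print(f"Gross efficiency: {gross_efficiency(200, 1000):.1%}")   # 20.0%

# Hypothetical test: energy rises linearly with work rate, on top
# of ~300 W of baseline cost that gross efficiency would ignore
work = [100, 150, 200, 250, 300]
energy = [300 + w / 0.23 for w in work]   # built assuming 23% delta efficiency
print(f"Delta efficiency: {delta_efficiency(work, energy):.1%}")  # 23.0%
```

Note how the 300 W baseline has no effect on the delta efficiency estimate, which is exactly why Gaesser & Brooks favoured slope-based measures over the gross ratio.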
Coyle’s study found that Armstrong’s efficiency increased progressively over the seven years in which he was tested, as shown in the figure above. His delta efficiency improved from 21.37% in 1992 to 23.12% in 1999. This increase (about 1.8 percentage points) is relatively small and, it must be noted, is actually smaller than the typical error of the equipment used to measure it! In other words, taking nothing but equipment variation into account, this kind of change is possible…
Based on this finding, Coyle named his study accordingly, and the theory entered the literature that Lance Armstrong had seen a progressive increase in his efficiency over the years. This was to become a key part of Armstrong’s “armour” in 2005, when the study would provide some support for his claims that his ascendancy in the world of cycling was “natural”. Coyle speculated on a number of physiological factors to explain this finding (changes in enzyme activity, muscle fibre type switches, etc.). However, the first responses to Coyle’s paper were swift…
The first response: Criticism of methods and overinterpretation of data
The first response and criticism was swift, and came from two sources. First, David Martin and colleagues from Australia wrote a letter[cite]10.1152/japplphysiol.00507.2005[/cite] titled “Has Armstrong’s cycle efficiency improved?”. This was accompanied by a letter from Yorck Olaf Schumacher and his colleagues[cite]10.1152/japplphysiol.00563.2005[/cite] titled “Scientific considerations for physiological evaluations of elite athletes”.
Essentially, these letters criticized the study design, the methods and the scientific process followed, including the conclusions. They raised the following points:
- Timing of testing sessions – Coyle very clearly concluded that his measurements of muscular efficiency were of paramount significance to Armstrong’s Tour victories. He denies this, but the title and his conclusions make it very clear that, in his view, Armstrong’s success was a function of this improved efficiency. Coyle would go on to testify in court that Armstrong’s rise could have been achieved without doping, so it is quite clear that his finding was intended to support Armstrong’s Tour performance. Yet remarkably, NOT A SINGLE testing session coincided with the Tour. All the testing happened out of season, and only the 1999 test even overlapped with Armstrong’s Tour victories.
- Issues around equipment – calibration, reliability, validity etc., which we won’t get into here, other than to say that over a period of seven years, the control of equipment is obviously crucial. Coyle responded to these queries, and they do not seem to have had a major influence on the current debate.
- The conclusion – Coyle’s was really one of the first papers to even suggest that muscular efficiency improves over time and with training. While this would seem intriguing, it disagrees with many other findings, which suggest that extensive endurance training does not improve cycling efficiency. Also, efficiency is not a factor that seems to be associated with performance in elite cyclists, and so the conclusions are ‘liberal’, to say the least.
The next steps: Re-analyzing the data and digging up errors
These issues are primarily behind my earlier observation that the paper was widely criticized, even early on. However, what transpired next is even more significant, because Christopher Gore, Michael Ashenden, Ken Sharpe and David Martin continued their quest for the “truth”, and eventually managed to get hold of (some) data from Coyle’s testing[cite]10.1152/japplphysiol.90459.2008[/cite].
Between the publication of the paper in 2005 and the latest round of debate, there was also the matter of a court case in which Coyle was a paid expert witness on behalf of Armstrong. His testimony was aimed at building a credible case for how Armstrong could have dominated the sport for 7 years thanks to the remarkable physiology put forward in this paper. And so this study, with its holes, flaws and inaccuracies, actually went on to form part of a legal argument despite those problems. It also reveals a big part of Coyle’s incentives, something we’ll look at in our next post.
These holes and flaws in the Coyle study pale into insignificance, however, when compared to the latest revelations, where re-analysis, and some “between the lines” reading of Coyle’s data, revealed outright errors in the research. That is, it is no longer a case of questionable methods and over-interpretation; it is now a matter of miscalculation and wrong results. All the way from the lab, into the media, and on into the court-room!
But in the name of time (and length!) I’m going to call it for today’s post, and leave you with that teaser, which we’ll pick up tomorrow, when we look at the “minor error” and what impact it has on the results.
Join us then!