VO2max Test

James Spragg is a young South African exercise physiologist who has carved out an interesting niche for his research. It is based on the idea that the fastest athlete on fresh legs is not necessarily the fastest athlete on fatigued legs. This is an important distinction because, in most endurance races, it is better to be the guy or gal who is fastest on fatigued legs. Yet conventional fitness testing protocols ignore this reality, which is a problem because it can skew athletes' training too far in the direction of improving fresh-legged performance.

In one of his early studies, Spragg teamed up with several other researchers, including Iñigo Mujika, whose name you might recognize from his work related to the 80/20 intensity balance, to compare power profiles in nine members of a U23 cycling team and five professional cyclists. Interestingly, they found that the U23 riders were able to generate as much power as the pros on fresh legs. Had this experiment been limited to non-fatigued performance testing, we would have been left to wonder why the U23 cyclists were not also on professional teams. But what Spragg and his collaborators also found was that, in the U23 cyclists, achievable power outputs began to decline after 1,500 to 2,000 kilojoules of prior work (roughly 360 to 480 kilocalories of mechanical work) had been completed, whereas in the professional cyclists, performance fell off only after 3,000 kJ of pedaling.
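For readers who like to check the arithmetic: kilojoules of work are simply power integrated over time, and kilocalories follow from the standard joule-to-calorie conversion. Here is a minimal sketch of that calculation; the function name, one-second sampling, and efficiency range are my own illustrative assumptions, not anything taken from the study.

```python
# Minimal sketch (my illustration, not the study's code): cumulative mechanical
# work from a ride's power data, the quantity used to define "prior work done".
def work_kilojoules(power_watts, sample_interval_s=1.0):
    """Cumulative mechanical work in kJ from evenly sampled power readings."""
    return sum(power_watts) * sample_interval_s / 1000.0   # watts x seconds = joules

# Example: 2.5 hours at a steady 200 W
ride_power = [200] * (150 * 60)        # one power reading per second
print(work_kilojoules(ride_power))     # -> 1800.0 kJ, i.e. about 430 kcal of work
# At a typical gross efficiency of 20-25 percent, producing 1,800 kJ of work
# costs the rider very roughly 1,700-2,200 kcal (assumed conversion, for scale).
```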

What’s more, a later study by Dutch and South African researchers found that, among top-tier professional cyclists, those able to do the most work before their power output capacity dropped off performed best in races. So, it appears that the ability to ride fast on tired legs is a key factor separating the best from the rest, both between and within echelons of cycling.

Spragg’s recent study is also his most ambitious to date. It involved collecting power data from every training ride and race completed by 30 U23 professional cyclists over three years. The aim was to determine how individual cyclists’ fresh and fatigued power profiles changed over the course of a competitive season and how these changes related to their training. The main findings were as follows:

  • Fresh power profiles remained relatively stable throughout the season.
  • Fatigued power profiles changed over the course of the season.
  • The difference between fresh and fatigued power profiles also varied as the season unfolded, indicating that the two phenomena are independent.
  • More time spent at low intensity in training predicted better 2-minute power on both fresh and fatigued legs.
  • A shift away from moderate intensity toward high intensity was associated with a stronger fatigued power profile (i.e., a smaller delta between fresh and fatigued power); a rough sketch of how such profiles can be pulled from ride data follows this list.
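To make the fresh-versus-fatigued idea concrete, here is a rough sketch of how such profiles might be extracted from ride power data. The 120-second window corresponds to the 2-minute power mentioned above, but the kilojoule cutoffs, function name, and one-per-second sampling are illustrative assumptions on my part, not Spragg's actual method.

```python
# Illustrative sketch only: best 2-minute power produced "fresh" vs. after a
# chosen amount of prior work, assuming one power reading per second.
def best_2min_power(power_watts, min_prior_kj=0.0, max_prior_kj=float("inf")):
    """Best 120 s average power among efforts whose preceding work (in kJ)
    falls in the range [min_prior_kj, max_prior_kj)."""
    window = 120
    if len(power_watts) < window:
        return 0.0
    best = 0.0
    prior_kj = 0.0                          # work completed before the current window
    rolling = sum(power_watts[:window])     # sum of the current 120 s of power
    for start in range(len(power_watts) - window + 1):
        if min_prior_kj <= prior_kj < max_prior_kj:
            best = max(best, rolling / window)
        prior_kj += power_watts[start] / 1000.0       # 1 s at P watts adds P/1000 kJ
        if start + window < len(power_watts):
            rolling += power_watts[start + window] - power_watts[start]
    return best

# fresh    = best_2min_power(ride, max_prior_kj=500)    # early-ride efforts
# fatigued = best_2min_power(ride, min_prior_kj=2000)   # efforts after 2,000 kJ of work
# A smaller gap between the two corresponds to better fatigue resistance.
```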

An important implication of these findings is that, depending on the type of event an athlete is training for, performing fitness testing in a fresh state may be of limited value. If you specialize in the 400m freestyle event or the 1500m track event, then perhaps testing in a fresh state has greater relevance. But if you’re training for a marathon or an Ironman 70.3, I would imagine that fatigued fitness testing would tell you more. In a narrative review published in October 2021, Spragg, Mujika, and three other colleagues provide detailed recommendations for incorporating fitness testing into training for road cycling events, one of which is to “avoid single effort prediction trials, such as functional threshold power.” As a running and triathlon coach, I personally lean toward using regular workouts to assess fitness. For example, tacking a fast finish onto the end of a long run serves as a good measure of fatigued performance capacity in a marathoner while also functioning as a relevant fitness-builder for the marathon.

Another interesting finding from Spragg's 2022 study is that cyclists who maintained their peak training load through the late season also maintained their fatigue resistance, whereas those who reduced their training load during this period lost fatigue resistance. This finding is consistent with other studies reporting a correlation between training volume and fatigue resistance/endurance. One example is a 2020 study by Thorsten Emig of Paris-Saclay University and Jussi Peltonen of the Polar Corporation, who collected and analyzed training and racing data from devices worn by more than 14,000 runners for a combined 1.6 million exercise sessions. For the purposes of this study, endurance was defined as the percentage of VO2max running velocity that a runner could sustain for one hour, and the data showed a strong positive correlation between training volume and endurance thus defined.
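Read literally, that definition is simple enough to write down; the numbers below are made up purely for illustration.

```python
# Endurance as defined in the Emig and Peltonen analysis, as I read it: the speed
# sustainable for one hour as a percentage of the speed associated with VO2max.
def endurance_percent(one_hour_velocity_kmh, vvo2max_kmh):
    return 100.0 * one_hour_velocity_kmh / vvo2max_kmh

# A runner with a vVO2max of 18 km/h who can hold 14.4 km/h for an hour scores
# 80 percent; higher training volumes were associated with higher scores.
print(endurance_percent(14.4, 18.0))    # -> 80.0
```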

I wish all of this science had been available when I wrote 80/20 Running back in 2014. It would have bolstered the argument I made therein about how the typical exercise science study design puts a thumb on the scale in favor of HIIT-focused training when compared against the type of training elite endurance athletes do. It's less of a problem nowadays, but back then it was common to use fresh-legged VO2max tests as the basis for such comparisons. But we now know that a VO2max test performed after extensive prior exercise is likely to yield different results that are more relevant to real-world race performance, and that high-volume, mostly low-intensity training yields better results in pre-fatigued fitness tests.

Oh, well. That’s what second editions are for, right? In the meantime, you can check out our cycling plans here – some are built to improve your FTP and can be used in your off season.

A study just published in the International Journal of Sports Physiology and Performance caught my attention, and I'd like to tell you about it. Conducted by researchers at the University of Worcester, it compared performance, pacing strategy, perceived exertion, and affect in a 10K solo time trial and a 10K race in a group of 14 male runners.

Half of the runners performed the time trial before the race (on a separate day) and the other half performed the time trial after the race (also on a separate day) to ensure that the order of the two events did not skew the results. As you might expect, most of the runners covered the 10K distance faster in the race context than they did in the solo time trial. The average time in the solo time trial was 40:28, compared to 39:32 in the race, a 2.3 percent difference.

Pacing strategies did not differ between the two events. Most of the runners started and finished both the time trial and the race faster than they ran the middle part. Nor was perceived exertion different. By and large, the runners felt they ran equally hard in the race and the time trial. But there was a significant difference in reported positive affect. Simply put, the runners enjoyed the race more, and the authors of the study believe that this bump in positive affect, supplied by the competitive environment, accounted for their superior race performance.

There’s nothing new in the finding that runners run faster in competition than they do against the clock. One important implication of this fact is that, if I were to ask you to run a solo time trial as a way to gauge your fitness so that I could assign appropriate pace targets for your training, I would get a somewhat inexact picture of your current fitness level. The result wouldn't be completely worthless, as I would know it was about 2.3 percent slower than what you could have done in a race, but a race would still be better.
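If a solo time trial is all you have, one crude adjustment is to apply the average race-versus-trial gap from the study. The sketch below is my own illustration; the 2.3 percent figure is a group average and will not hold for every runner.

```python
# Rough illustration only: estimate what a solo 10K time trial might have been
# worth in a race, using the ~2.3 percent average gap reported in the study.
def race_equivalent_seconds(solo_tt_seconds, tt_penalty=0.023):
    return solo_tt_seconds * (1.0 - tt_penalty)

tt_seconds = 40 * 60 + 28                        # the study's average trial time, 40:28
est = race_equivalent_seconds(tt_seconds)
print(f"{int(est // 60)}:{int(est % 60):02d}")   # -> 39:32, the study's average race time
```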

Time-based time trials (e.g. 30 minutes rather than 10K) I trust even less. They work well enough in cycling, where fitness testing is done mainly indoors, but runners aren’t accustomed to thinking in terms of duration when trying to pace all-out efforts. The typical competitive recreational runner is simply more likely to botch the pacing of a time-based trial than of a distance-based time trial or race.

Lab-based physiological tests such as lactate threshold tests and VO2max tests I trust least of all. They look so scientific, what with the breathing mask and the blood draws and all, but studies have shown that small adjustments to the design of these tests yield significantly different results. For example, a traditional VO2max test features an open-loop design, meaning it continues until the subject quits voluntarily. But a closed-loop alternative created by Lex Mauger and Nick Sculthorpe at the University of Bedfordshire results in far greater VO2max scores in most subjects. (It bears noting that a race itself is a closed loop.)

For all of these reasons, when I want to know how fit a runner is, I either ask the runner for a recent race result or I request that the runner complete a race. In the latter scenario, I specifically ask the runner to do a 5K race. 

The 5K distance is preferable to other standard race distances in a number of ways. For starters, it's by far the most popular race distance, so it's usually no trouble to find a local event to do. Additionally, a 5K race is more doable for runners at all levels of fitness. Many beginners can't even run 10K, let alone race that distance. Advanced runners, meanwhile, need less recovery time after a 5K than they do after a longer race, so jumping into a 5K for testing purposes is less disruptive to the flow of training.

Finally, I find that a 5K race result generally offers a more reliable basis for prescribing appropriate target training paces than do results from longer events. That's because both aerobic and anaerobic fitness factors contribute to 5K performance, whereas anaerobic factors make very little contribution to performance at 10K and up. A 5K performance typically gives me a good sense of where to start an athlete, pace-wise, on everything from short repeats at 1500-meter race pace to sustained steady-state efforts.
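As a rough illustration of how a 5K result can be scaled to other distances and paces, one widely used rule of thumb is Riegel's endurance formula, T2 = T1 × (D2/D1)^1.06. The 20:00 5K below is hypothetical, and the formula is only a starting point, not the whole of how I prescribe paces.

```python
# Riegel's formula scales a known race time to other distances; dividing the
# predicted times by distance gives rough per-kilometer pace targets.
def riegel_time(t1_seconds, d1_km, d2_km, exponent=1.06):
    return t1_seconds * (d2_km / d1_km) ** exponent

def pace_per_km(total_seconds, distance_km):
    sec = total_seconds / distance_km
    return f"{int(sec // 60)}:{int(sec % 60):02d}/km"

five_k_seconds = 20 * 60                            # hypothetical 20:00 5K result
for dist_km in (1.5, 10.0, 21.0975, 42.195):        # 1500 m, 10K, half, marathon
    predicted = riegel_time(five_k_seconds, 5.0, dist_km)
    print(f"{dist_km:>7.3f} km -> {pace_per_km(predicted, dist_km)}")
# Prints roughly 3:43/km for 1500-meter pace down to about 4:32/km for marathon pace.
```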

So, if you want me to create a training plan for you, be prepared to give me a recent 5K time—or to jump into your next local 5K!

$ubscribe and $ave!

For as little as $2.32 USD per week, 80/20 Endurance Subscribers receive:

  • Access to over 600 plans
  • Library of 5,000+ workouts
  • TrainingPeaks Premium
  • An 80/20 Endurance Book
  • 30-Day Money Back Guarantee