Good news for SWs and other runners


Randomguy

Why Don't Runners' Knees Fail More Often?

Biology suggests that decades of running should invariably blow out your knees. Scientists are trying to understand why that doesn’t happen.
First, the bad news. A sophisticated new model shows that if you start running at age 23 and put in less than two miles a day, there’s a 98 percent chance that your knees will fail by the age of 55. Now, the good news: that doesn’t actually happen in real life. In fact, as I and many others have repeatedly pointed out, the evidence suggests that running is neutral at worst, and possibly even helpful, for the long-term health of your knees. So the real question—and it’s an interesting one—is why runners’ knees don’t fail more often than everyone else’s.

The basic problem is that cartilage, the rubbery material that provides shock absorption between the bones in your knee joint, has no blood vessels or nerves. For that reason, it’s generally considered pretty much inert and unable to repair itself. Over a lifetime of loading, it gradually wears down until the bones are grinding against each other. That’s knee osteoarthritis, the most common joint problem in the U.S., which affects about 10 percent of men and 13 percent of women over the age of 60.

In recent years, though, there’s been pushback against the view that cartilage is just an inert lump. Back in 2006, a bioengineering researcher named Bahaa Seedhom hypothesized that cartilage could actually respond and adapt to the stresses imposed by day-to-day activities, an idea he dubbed “cartilage conditioning.” In fact, he suggested that a lack of joint stress might explain why sedentary people develop knee osteoarthritis. More recently, University of California Davis researcher Keith Baar has suggested that connective tissues, including cartilage, do have self-repair abilities that can be triggered by the right combination of exercise and diet.

A new study in the open-access journal PeerJ, by University of Maryland biomechanist Ross Miller and his former doctoral student Rebecca Krupenevich, explores how these various factors might come together to explain why runners don’t all end up in wheelchairs. It’s a modeling study that combines the measured properties of cartilage and the forces involved in running and walking to predict when knees should fail with and without the existence of self-repair and adaptation abilities in the cartilage. The punchline: there’s good reason to think that, like so many other parts of your body, your cartilage really does get stronger the more you use it.

In essence (i.e. glossing over a few thousand words of details and a bunch of equations), the study involved the following steps; a toy sketch of the damage-accumulation logic follows the list:

  1. Analyze the gait of 22 volunteers during walking and running, in order to calculate the forces and loads on their knee cartilage. The results suggest that running puts at least twice as much strain on cartilage as walking.
  2. Estimate how many loading cycles (i.e. steps) the cartilage in a knee can handle, using the loads calculated in step 1 along with data from mechanical testing performed on cow cartilage.
  3. Calculate how long it will take for a knee to fail assuming you either walk 3.7 miles (6 kilometers) each day or walk 1.9 miles (3 kilometers) and run 1.9 miles each day.
  4. Redo the calculations assuming knee cartilage has the ability to either adapt (get stronger so that each subsequent loading cycle does less damage) or repair itself (reverse the damage done by previous loading cycles).
  5. Compare the results to reality and see what seems most reasonable.
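
To make that pipeline concrete, here is a minimal, deterministic sketch in Python of the damage-accumulation idea behind steps 2 through 4. Every constant in it (steps per kilometer, the stress-life curve, the stress values, the repair and adaptation factors) is an illustrative placeholder rather than a value from the paper, which fits a probabilistic stress-life model to its bovine cartilage data:

# A toy version of steps 2-4: accumulate fractional fatigue damage
# (Miner's rule) step by step until it reaches 1 ("knee failure").
# All numbers are illustrative placeholders, NOT values from the paper.

STEPS_PER_KM = 1250          # assumed loading cycles per kilometer
START_AGE, END_AGE = 23, 55  # the age span used in the study

def cycles_to_failure(stress_mpa, a=6.5e12, b=8.0):
    """Hypothetical stress-life curve: higher stress, far fewer cycles.
    The steep exponent b is what makes running so punishing on paper."""
    return a / stress_mpa ** b

def age_at_failure(activities, repair_frac=0.0, stress_scale=1.0):
    """activities: list of (km per day, cartilage stress in MPa) pairs.
    repair_frac: fraction of accumulated damage reversed each year.
    stress_scale: below 1.0 models adaptation that lowers effective
    stress (thicker cartilage, and bone spreading the load wider).
    Returns the age at which cumulative damage reaches 1, or None."""
    damage = 0.0
    for age in range(START_AGE, END_AGE):
        for km_per_day, stress in activities:
            cycles = km_per_day * STEPS_PER_KM * 365
            damage += cycles / cycles_to_failure(stress * stress_scale)
        damage *= 1.0 - repair_frac
        if damage >= 1.0:
            return age + 1
    return None  # cartilage survives to 55

# Step 3's two scenarios: walk 6 km a day, or walk 3 km and run 3 km,
# with running assumed to stress cartilage at twice the walking level.
walk_only = [(6.0, 4.0)]
walk_run = [(3.0, 4.0), (3.0, 8.0)]
print(age_at_failure(walk_only))                  # None: survives
print(age_at_failure(walk_run))                   # 24: fails almost at once
print(age_at_failure(walk_run, repair_frac=0.05,
                     stress_scale=0.55))          # None: survives

Even this toy version reproduces the qualitative pattern the paper reports: the walking-only knee survives, the running knee fails almost immediately without help, and only some mix of repair and lowered effective stress rescues it.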

If you consider the walking-only condition, assuming you start with healthy cartilage at age 23, the model predicts a 36 percent chance of knee failure by age 55 if the cartilage can neither adapt nor repair itself. If you add in some self-repair ability, that probability drops to 13 percent. According to the researchers, this is roughly in line with real-life data on how often non-obese adults with no knee injuries end up developing knee osteoarthritis, which gives us some confidence that the model is plausible.

The picture is bleaker for running, though: a 98 percent chance of knee failure by age 55. Even if you adjust the model so that the cartilage can do some self-repair, the probability is still 95 percent. This is not consistent with real life. The only way to get sensible numbers is to assume that the cartilage can also adapt, presumably thanks to the ability of cells in the cartilage to sense the mechanical pressure imposed by running. The model incorporates three forms of adaptation: thicker cartilage, springier cartilage, and thicker bone that spreads the load out over a wider area. If you rely on any one of those forms of adaptation alone, you have to make wildly unrealistic changes to its parameters to get sensible numbers. But if you assume a combination of modest and realistic changes for each of the three types of adaptation, the probability of failure drops below 13 percent, matching the walking-only scenario.
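
The arithmetic behind that last point is worth spelling out. Because fatigue life depends very steeply on stress, several modest adaptations multiply into a large combined effect. The numbers below are hypothetical (the paper’s actual parameter values aren’t reproduced here), but they show the flavor:

# Hypothetical illustration: fatigue life scales with stress to a
# steep negative power, so small stress reductions multiply fast.
b = 8.0                   # assumed steepness of the stress-life curve
thicker_cartilage = 0.90  # 10% lower peak stress (illustrative)
wider_bone = 0.90         # 10% lower stress via load spreading
springier = 2.0           # 2x more cycles tolerated at a given stress

damage_drop = (thicker_cartilage * wider_bone) ** -b
print(damage_drop)              # ~5.4x less damage per step
print(damage_drop * springier)  # ~10.8x longer fatigue life overall

Each change on its own is modest; multiplied together they cover roughly an order of magnitude, which is the kind of margin the model needs to bring the running knee’s failure risk down to the walking level.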

There’s a smattering of evidence that running can indeed cause positive adaptation in both knee cartilage and bone. But it’s patchy at best. At this point, cartilage conditioning remains a hypothesis. Still, Miller figures it’s the most likely explanation for the puzzlingly healthy knees of long-time runners, if only by process of elimination. Otherwise, he says, you’d have to conclude that either our cartilage is virtually indestructible, or that running puts way less load on it than our current calculations suggest. But all the lab data on isolated bits of cartilage—which can’t adapt or repair themselves because their owner is dead—suggests that cartilage does wear out over years or decades. And we have various independent lines of evidence confirming the estimates of cartilage loads during running, including from knee implants with sensors in them.

The easy takeaway here is the same one we take from all the observational studies that find runners have generally healthy knees: keep on running, and don’t worry that you’re “using up” your joints. (You’re not using up your heartbeats either, just for the record.) The trickier question is what happens if you do develop knee osteoarthritis, which happens to lots of people, both runners and non-runners, especially if they suffered acute knee injuries when they were younger. If loading the joint really does trigger positive cartilage adaptations, that’s an argument in favor of continuing to run to the extent that symptoms permit, rather than switching completely to non-load-bearing activities like swimming. As I wrote a few years ago, there’s a bit of evidence that running doesn’t seem to accelerate the progression of existing osteoarthritis. But this is still a big open question, and one that I hope researchers like Miller will continue to explore.
