Astrocytes may Hold the Key to Exercise-Induced Cognitive Enhancement

Originally published on the PLOS Neuroscience Community

Forget expensive pills or exotic miracle supplements. Exercise may be the most effective – not to mention free and accessible – cognitive enhancer on the market. Research in humans has shown that physical activity can improve cognitive function and may help stave off dementia, yet the biological mechanisms underlying these benefits aren’t fully understood. Animal studies have made substantial progress on this front, demonstrating such positive responses to running as enhanced neurogenesis and elevated levels of neural growth factors. However, much of this research has been relatively narrowly focused, with particular attention devoted to neuronal changes and one notable brain region – the hippocampus. The hippocampus is selectively important for certain functions like learning and episodic memory, but exercise improves a range of cognitive processes, many of which depend on other, non-hippocampal brain regions. Therefore, researchers from Princeton University looked beyond the hippocampus and neurons to more thoroughly characterize the neural events that impart cognitive protection from physical activity. In their study recently published in PLOS ONE, Adam Brockett and colleagues report that running enhances performance on various cognitive tasks, improvements which may be mediated by changes in astrocytes, the lesser appreciated brain cells.

Running selectively boosts some cognitive functions

To manipulate levels of physical activity, rats were divided into a group of runners who were allowed free access to running wheels for 12 days, and another group of sedentary controls. Prior studies have shown that running improves performance on tasks requiring the hippocampus, like learning and memory. Here, the runners and non-runners were subjected to three tests to determine how exercise affects cognitive functions that do not depend solely on the hippocampus. An object-in-place task, which tests how well rats remember the location of previously encountered objects, relies on the medial prefrontal cortex, hippocampus and perirhinal cortex. A novel object task, in which rats distinguish novel from familiar objects, selectively depends on the perirhinal cortex. Lastly, a set-shifting task, supported by the orbitofrontal and medial prefrontal cortices, measures attention and cognitive flexibility.

Compared to their non-runner companions, the runners performed better on the object-in-place test and on several measures of the set-shifting task. However, there were no differences between runners and non-runners in performance on the novel object recognition test. Of course, the cognitive benefits of running don’t end here, since many cognitive domains were not assessed in this test battery. But these findings highlight a striking selectivity of the brain-boosting powers of exercise. In particular, they suggest that running may enhance functions that specifically depend on the medial prefrontal and orbitofrontal cortices, along with the hippocampus, but it does not appear to modulate perirhinal-dependent functions.

Cognitive enhancement is linked to astrocytes

Although behavioral changes provide a window into the underlying neural events, they do not tell the complete mechanistic story. To directly examine how running affects the brain, the researchers assessed changes to both neuronal and non-neuronal brain cells. Running induced widespread neuronal changes, including higher levels of pre- and postsynaptic markers throughout the brain (including in the hippocampus and orbitofrontal, medial prefrontal and perirhinal cortices), and increased density and length of dendritic spines in the medial prefrontal cortex. While these effects suggest that exercise elicits generalized synaptic changes, they do not explain why particular cognitive functions are selectively boosted over others.

The researchers therefore looked for this crucial link to behavior in astrocytes. As Brockett explains, “We hypothesized that all cells likely change as a function of experience. We chose to focus on astrocytes because there is lots of evidence suggesting that astrocytes could be implicated in cognitive behavior. Loss of astrocytes correlate with impairment on a cognitive task and astrocytes connect the majority of neurons to blood vessels. They extend numerous processes that envelop nearby synapses, and gliotransmitters have been implicated directly in LTP-induction.”

Confirming their suspicions, in runners, astrocytes increased in size (Figure, A-B) and showed more contacts with blood vessels (Figure, C-D). But these changes only occurred in the hippocampus, medial prefrontal cortex and orbitofrontal cortex – critically, all regions that support the tasks showing running-related improvement. In contrast, running did not alter astrocytes in the perirhinal cortex, a region necessary for novel object recognition, which did not benefit from running. Thus, while running modified both neurons and astrocytes, the pattern of selective cognitive enhancements corresponded only with changes to astrocytes.

In the hippocampus, medial prefrontal cortex and orbitofrontal cortex, astrocytes were larger and made more contacts with blood vessels for rats who ran than those who did not. Brockett et al., 2015


Implications for the active human

Although the varied and widespread cognitive benefits of exercise have long been appreciated, this study provides some of the first insight into the remarkable selectivity of these enhancements. Follow-up studies will help elucidate why, from both biological and evolutionary perspectives, running would demonstrate such selectivity. Might, for example, attention or task-switching abilities have been more important than object recognition for the efficiency of both animals and our persistence-hunting, distance-running ancestors? Does running more heavily recruit certain brain regions over others, making them more susceptible to remodeling?

Given the cognitive and neurobiological differences between rats and humans, future research will be important to help extrapolate beyond rodents. Currently it’s unclear how different forms of exercise enjoyed by humans – for instance, swimming, yoga or strength training – uniquely influence distinct cognitive functions. According to Brockett,

“There is a lot of evidence that running has numerous beneficial effects on rodent and human cognitive functioning, but it is likely that aerobic exercise in general is responsible for these effects rather than running per se.”

Perhaps most notably, these findings add to the growing pool of studies underscoring the importance of astrocytes in neural processes that support cognition, and reveal a novel role for these cells in experience-dependent plasticity. As Brockett explains:

“Astrocytes are a unique cell type that haven’t been explored as much as neurons by the field of Neuroscience at large. Few studies have directly examined the role of astrocytes in complex behavior, and this was our first attempt at investigating this question.”


Alaei H, et al (2007). Daily running promotes spatial learning and memory in rats. Pathophysiology. 14:105–8. doi:10.1016/j.pathophys.2007.07.001

Brockett AT, LaMarca EA, Gould E (2015) Physical Exercise Enhances Cognitive Flexibility as Well as Astrocytic and Synaptic Markers in the Medial Prefrontal Cortex. PLoS ONE 10(5): e0124859. doi:10.1371/journal.pone.0124859

Gibbs ME, O’Dowd BS, Hertz E, Hertz L (2006) Astrocytic energy metabolism consolidates memory in young chicks. Neuroscience 141(1): 9-13. doi:10.1016/j.neuroscience.2006.04.038

Henneberger C, Papouin T, Oliet SH, Rusakov DA (2010). Long-term potentiation depends on release of D-serine from astrocytes. Nature. 463:232-6. doi:10.1038/nature08673

Kramer AF, Erickson KI (2007). Capitalizing on cortical plasticity: influence of physical activity on cognition and brain function. Trends Cogn Sci. 11: 342–8. doi:10.1016/j.tics.2007.06.009

Marlatt MW, Potter MC, Lucassen PJ, van Praag H (2012). Running throughout middle-age improves memory function, hippocampal neurogenesis, and BDNF levels in female C57BL/6J mice. Dev Neurobiol. 72:943–52. doi:10.1002/dneu.22009

van Praag H, Kempermann G, Gage FH (1999). Running increases cell proliferation and neurogenesis in the adult mouse dentate gyrus. Nat Neurosci. 2:266–70. doi:10.1038/6368


Barefoot Running Workshop 3: Hills, Speed & Precautions

Many thanks to all who attended the final session of our Barefoot Running Workshops! In today’s workshop we built on the fundamentals of running mechanics covered in the first and second workshops. We tweaked our speed and hill running techniques, addressed safety issues unique to barefooting and took running video selfies for gait analysis. Here are some of the highlights of the day’s fun …

Happy, dirty feet, post-hills and sprints!




When running downhill, the impact on the body increases because gravity accelerates your fall. A ball dropped from 10 feet hits the ground faster than one dropped from 5 feet; similarly, your body falls farther and lands faster with each stride on a descent than it does on level ground. The secret to effective downhill running lies in using that acceleration to your advantage, rather than letting gravity get the better of you.
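
To put rough numbers behind that intuition, here’s a minimal sketch (my own illustration, not part of the workshop) using the free-fall relation v = √(2gh): the farther you drop with each stride, the faster you are moving when you land.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def landing_speed(drop_height_m: float) -> float:
    """Vertical speed after falling freely through drop_height_m, ignoring air resistance."""
    return math.sqrt(2 * G * drop_height_m)

# Illustrative per-stride vertical drops (assumed values, not measurements).
for label, drop in [("level stride (~5 cm drop)", 0.05), ("steep downhill stride (~15 cm drop)", 0.15)]:
    print(f"{label}: landing speed ~ {landing_speed(drop):.2f} m/s")
```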

Minimize bouncing. With that extra distance between you and the earth, downhill running comes with additional vertical motion. Try to minimize any unnecessary upwards motions, like jumping or bouncing, that will only exacerbate the stress from the downward fall. Aim to stay low to the ground and level on the horizontal plane.

Bend the knees. The knees serve as shock absorbers, so bent knees can greatly counter the added stress from downhill running. This also facilitates a low, steady stride, making it even easier to avoid bouncing and pounding.

Avoid braking. Embrace gravity, don’t resist it. Steep descents will automatically increase your pace, and a faster than normal clip can feel uncomfortable. A natural – often subconscious – response is to put on the brakes, stiffening the joints to counter the impact. This defense mechanism is far from beneficial, creating unnecessary tension as we clench in resistance, which only opens the door to injury. Take advantage of the acceleration and allow yourself to float. Once you release and embrace the descent, the ride will feel more like flying than a downward crash.

Don’t over-stride. Over-striding is always dangerous, but exceptionally so when running downhill. What’s worse, downhills actually encourage over-striding, as they entice us to extend the leg out in front as a protective mechanism. This only forces you into a heel strike and increases stress on the shins and knees – a dangerous combination when coupled with an already elevated impact from the incline!


In contrast to downhills, uphill running requires us to fight against gravity. Maintaining proper form will keep you strong to efficiently conquer these demands.

Lean into the hill. Exaggerate your forward lean to counteract the incline. But take care to lean not at the waist, but with the entire body. Collapsing forward will only increase your workload and make that hill feel extra torturous!

Stay tall. Since the goal is upward movement, aim to lengthen the body upward. This is where good posture is key, keeping the back tall and long, head high and looking forward.

Steady effort. Powering up a daunting hill may not be the best tactic to smoke your competition. Your strong sprint could easily backfire, leaving you exhausted by the time you summit. Rather than keeping a steady pace, aim to maintain a steady effort. This, of course, means slowing down on those inclines. To track your effort, monitor your breathing rate; regular breathing means regular effort and is a good indication you’re not over-exerting yourself.


Increase forward lean. To run faster, we need to increase the amount of forward motion per step. This extra ground coverage can be achieved relatively easily by simply leaning forward.

Light feet and high cadence. Faster speed does not in fact require a higher cadence (leg turnover rate). You should strive for the same high cadence as always (at least 180 steps per minute); when sprinting, that high cadence works even further to your advantage. Speed can be more strenuous on bare feet, encouraging shearing and friction. Keeping your foot-strike light and cadence high can minimize these effects by reducing your ground contact time (see the quick calculation after these tips).

Open stride. Don’t be afraid to open up your stride. Barefoot running often encourages a shorter stride, but a longer stride can help support speed for any runner. Allow your hips to open a bit more and your leg to lift a touch higher than usual, but remain fluid and never force a gait change.
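
As referenced above, pace is simply cadence multiplied by stride length, so holding cadence at 180 steps per minute and lengthening the stride is enough to account for faster running. Here is a quick back-of-the-envelope sketch (my own illustration, with assumed stride lengths):

```python
def speed_m_per_s(cadence_spm: float, stride_m: float) -> float:
    """Running speed = steps per second x distance covered per step."""
    return (cadence_spm / 60.0) * stride_m

# Hold cadence fixed at 180 steps/min and vary only stride length (assumed values).
for stride_m in (0.9, 1.1, 1.3):
    v = speed_m_per_s(180, stride_m)
    print(f"stride {stride_m:.1f} m -> {v:.2f} m/s ({1000 / v / 60:.1f} min per km)")
```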



Blisters & Abrasions

Blisters and raw skin are relatively common for novice barefoot runners. While unpleasant, these can be valuable training tools as they’re telltale signs of sub-optimal biomechanics. Use their appearance and location to pinpoint your weaknesses. Blisters on your toes? You may be pushing off or gripping excessively. Calluses on your heel? You may be striking too far back on the rearfoot. Abrasion on the ball of your foot? Try not to scrape, shuffle or shear the foot on landing, but lightly place and lift instead.

“Dangerous” debris

The greatest concern for the new barefoot runner is cutting or bruising their feet on all the glass, rocks and dirty needles littering our earth. In truth, such dangers aren’t prevalent and are relatively innocuous to the conditioned bare foot. That said, there are of course certain encounters that are best avoided by even the most experienced barefooters. 

Urban debris. Most obvious are artificial hazards such as shards of glass or rusty nails. Large dangers are easily avoided by scanning the ground, and smaller ones may not even penetrate the thick, tough skin of the foot’s plantar surface. Of course, in the unlikely case you sustain a bad cut or puncture, seek medical attention!

Natural debris. More likely to take down a barefoot runner are hazards lurking naturally in the trails and grass. Thorns and burs love attaching to feet and although painful, are easily pulled out. Landing hard on a stone can bruise, but the feet will become resilient to even the most daunting rocks and pebbles as they strengthen with experience. A less often considered risk, but one that’s taken down yours truly on countless occasions, is bees. Depending on your reaction to bee stings, you may want to seriously reconsider running in grass, especially during the spring and summer, when bees love frolicking through the grass as much as we humans do.


Heat. Depending on your foot conditioning and tolerance, hot ground can pose unique challenges to the barefoot runner. But because the foot spends less time in contact with the ground when running than when walking, hot terrain is surprisingly easier to run on than to walk on. Some surfaces heat more readily than others, so stick to concrete or dirt over pavement. The painted white lines on roads can offer some refuge, as long as you’re careful to avoid cars!

Cold. In some aspects, the cold can be more hazardous to the barefooter than the heat. In extreme cases, the feet can go numb, which reduces sensory feedback and encourages poor biomechanics (not to mention posing a risk of frostbite!). Feet often warm up after just a few minutes of running, but if you do lose sensation, stay smart and stop or put on some protection. Just a pair of socks will often suffice to keep the feet warm while retaining a mostly barefoot feel.

Wet. Running through the rain, mud and puddles can be one of the most exhilarating barefoot experiences. But stay cautious of smooth surfaces, which can become dangerously slick when wet. Water can also soften the skin, making it more likely to rub raw on rough terrain or long runs.

As both a student and teacher, these workshops have been a far more rewarding and educational experience than I originally anticipated. More importantly, they’ve also served as a fantastic tool to connect with the small but passionate community of barefoot runners in San Diego. Given how fun and successful this “pilot” series has been, there will definitely be more! Please don’t hesitate to get in touch with suggestions for what you would like featured in upcoming workshops, and stay tuned for details about future events.

Happy barefoot trails!


The Mitochondrial Hypothesis: Is Alzheimer’s a Metabolic Disease?

Originally published on the PLOS Neuroscience Community

Despite decades of research devoted to understanding its origins, Alzheimer’s disease remains a daunting and devastating neurological mystery, ranking as the sixth leading killer of Americans. Countless therapeutic attempts, each designed with fresh anticipation, have repeatedly failed. A common thread across many of these drugs is their targeting the defining marker of the disease, amyloid plaques – those nasty extracellular deposits of beta-amyloid protein that invariably present in the Alzheimer’s brain and are thought to be toxic to neurons. Given the frustrating loss of research money, time and effort, many scientists agree it’s time to stop running circles around the amyloid hypothesis and begin seriously considering alternative explanations. One such theory showing increasing promise is the “mitochondrial hypothesis”. Its proponents posit that mitochondrial dysfunction lies at the heart of neural degeneration, driven by metabolic abnormalities which lead to classic Alzheimer’s pathology.

Steps by which mitochondrial dysfunction may lead to Alzheimer’s. Based on the model outlined in Swerdlow et al (2010).


Thank mom for your genetic risk

The first hints at this possibility arose from epidemiological observations about the genetic patterns of Alzheimer’s prevalence. These findings suggest that genetic influences may include more nuanced interactions than the better-known contributions from genes such as ApoE and TOMM40. Although both parents determine genetic risk, your likelihood of getting Alzheimer’s is much higher if the affected parent was your mother. This argues strongly that some maternal element underlies the association. Mitochondrial DNA is a logical target, as this subset of DNA is solely passed down from the mother. Many features of Alzheimer’s show this same maternal-dominant inheritance; those whose mother (but not father) had the disease also show reduced glucose metabolism and cognitive function, as well as elevated PIB uptake (a marker of amyloid) and brain atrophy.

Are metabolic enzymes the pathological trigger?

So if mitochondrial dysfunction initiates the Alzheimer’s cascade, what are the steps leading from metabolic disruption to neurodegeneration and ultimately, dementia? Studies point to cytochrome oxidase – a key enzyme for mitochondrial metabolism that’s encoded by both mitochondrial and nuclear DNA – as a likely trigger for early pathological events. The enzyme appears dysfunctional in the earliest disease stages; its activity is reduced not just in those with Alzheimer’s, but even in asymptomatic individuals who are at genetic risk for the disease or had a mother with Alzheimer’s. Furthermore, this stunted activity is linked directly to mitochondrial (or maternal) genetic contributions. Simply replacing the mitochondrial portion of the cytochrome oxidase DNA with DNA from Alzheimer’s patients gives an otherwise normal cell reduced cytochrome oxidase activity.

Bridging metabolism to Alzheimer’s pathology

For the mitochondrial theory to hold water, it must critically account for the classic pathological markers that define Alzheimer’s and have shaped traditional disease models – namely, amyloid plaques, tau tangles and brain atrophy. Indeed, growing evidence is elegantly bridging altered mitochondrial function to these key markers. For instance, disrupting mitochondrial electron transport chain activity (if you’ve forgotten your basic biochemistry, this is essential to cell metabolism) increases phosphorylated tau. What’s more, inhibiting cytochrome oxidase promotes a host of neurotoxic downstream effects including increased oxidative stress, apoptosis and amyloid production. Conversely, there’s also evidence that amyloid disrupts electron transport chain and cytochrome oxidase function, posing a chicken-or-egg conundrum. Amyloid has been found to buddy-up to mitochondria, but which comes first, the amyloid or the mitochondrial dysfunction, isn’t entirely clear. Both events occur early in the disease process, even before individuals show any symptoms of cognitive impairment. Whatever the mechanism, neurons from Alzheimer’s patients show signs of increased mitochondrial degradation. And when a neuron’s “powerhouse” begins to degrade, it cannot possibly support normal cognitive function.

A promising path for progress

It remains to be seen whether metabolic dysfunction is the key to unlocking the mechanisms of Alzheimer’s, and to ultimately developing effective therapeutics. While the current evidence is quite promising, many of the issues underlying the failure of other theories (poor translation of animal findings to humans, the challenge of identifying causal mechanistic pathways, etc.) similarly apply to the mitochondrial hypothesis. But at the very least, the proposal lays new ground for neuroscientists to continue progressing forward after a recent history of frustrating dead-ends. Even if mitochondria don’t hold the answer researchers have been seeking, understanding their contributions to Alzheimer’s pathology can only bring us closer to solving the mystery of this devastating disease.


Crouch PJ et al (2005). Copper-dependent inhibition of human cytochrome c oxidase by a dimeric conformer of amyloid-beta1-42. J Neurosci. 25(3):672-9. doi: 10.1523/JNEUROSCI.4276-04.2005

Debette S et al. (2009). Association of parental dementia with cognitive and brain MRI measures in middle-aged adults. Neurology. 73(24):2071-8. doi: 10.1212/WNL.0b013e3181c67833

Edland SD et al (1996). Increased risk of dementia in mothers of Alzheimer’s disease cases: evidence for maternal inheritance. Neurology. 47:254–6. doi: 10.1212/WNL.47.1.254

Hirai K et al. (2001). Mitochondrial abnormalities in Alzheimer’s disease. J Neurosci. 21(9):3017-23

Höglinger GU et al. (2005). The mitochondrial complex I inhibitor rotenone triggers a cerebral tauopathy. J Neurochem. 95(4):930-9. doi: 10.1111/j.1471-4159.2005.03493

Honea RA et al. (2010). Reduced gray matter volume in normal adults with a maternal family history of Alzheimer disease. Neurology. 74(2):113-20. doi: 10.1212/WNL.0b013e3181c918cb

Kish SJ et al. (1992). Brain cytochrome oxidase in Alzheimer’s disease. J Neurochem. 59(2):776-9. doi: 10.1111/j.1471-4159.1992.tb09439

Mosconi L et al (2007). Maternal family history of Alzheimer’s disease predisposes to reduced brain glucose metabolism. Proc Natl Acad Sci. 104(48):19067-72. doi: 10.1073/pnas.0705036104

Mosconi L et al. (2010). Increased fibrillar amyloid-{beta} burden in normal individuals with a family history of late-onset Alzheimer’s. Proc Natl Acad Sci. 107(13):5949-54. doi: 10.1073/pnas.0914141107

Mosconi L et al. (2011). Reduced Mitochondria Cytochrome Oxidase Activity in Adult Children of Mothers with Alzheimer’s Disease. J Alzheimers Dis. 27(3): 483–490. doi: 10.3233/JAD-2011-110866

Roses AD et al (2010). A TOMM40 variable-length polymorphism predicts the age of late-onset Alzheimer’s disease. Pharmacogenomics J. 10(5): 375–84. doi: 10.1038/tpj.2009.69

Swerdlow RH et al. (1997). Cybrids in Alzheimer’s disease: a cellular model of the disease? Neurology. 49(4):918-25. doi: 10.1212/WNL.49.4.918

Swerdlow RH, Burns JM and Khan SM (2010). The Alzheimer’s Disease Mitochondrial Cascade Hypothesis. J Alzheimers Dis. 20 Suppl 2:S265-79. doi: 10.3233/JAD-2010-100339

Swerdlow RH. (2012). Mitochondria and cell bioenergetics: increasingly recognized components and a possible etiologic cause of Alzheimer’s disease. Antioxid Redox Signal. 16(12):1434-55. doi: 10.1089/ars.2011.4149

Valla J et al. (2010). Reduced posterior cingulate mitochondrial activity in expired young adult carriers of the APOE ε4 allele, the major late-onset Alzheimer’s susceptibility gene. J Alzheimers Dis. 22(1):307-13. doi: 10.3233/JAD-2010-100129


The Vancouver Half, a victorious defeat

When life gives you lemons … suck it up. Isn’t that how the saying goes? Well, at the Scotiabank Vancouver Half Marathon last weekend – my second barefoot half – I sucked it up and it was sour.

The saga began a few weeks prior, when I was spontaneously struck with debilitating chest pain. It gripped me intensely, leaving me barely able to breathe and fearing a heart attack. An X-ray showed a healthy heart, lungs and ribcage, yet the pain persisted for weeks. Massage, active release and chiropractic adjustments brought some temporary relief, and although I’ll never know for sure, I now suspect it was a strained pec or intercostal muscle. Many days, running was impossible. On good days I could eke out a short, slow, uncomfortable trot. To make matters worse, the stress and tension in my chest and back trickled down to knock the rest of my body out of whack. My opposite leg felt weak and limp, as if it were dragging powerless behind me … as if it belonged to someone else, completely out of my neuromuscular control. As race day neared, I began to abandon my hopes of running at all, mentally preparing for a restful vacation exploring a new city.

Come race morning, I convinced myself anything was possible and knew I would regret not at least trying. The gun went off and to my great surprise, my chest quickly loosened up and my breathing was fluid. My right leg, on the other hand, forgot how to move. For the first seven miles, it took every ounce of mental focus to coerce my muscles into lifting and propelling forward my dead leg. The sun blazed as the pack of runners hugged every smidgen of shade to escape the 80 degree heat. My battle to maintain a semblance of a functional stride intensified as I pranced precariously over nasty stretches of gravel. Eight miles in, a tiny stone sent a zinger through my toe and I pulled to the side for several minutes waiting for the ache to subside. I fought the discouraged voices rationalizing an early finish and pushed ahead. The toe pain gradually dissipated and I even enjoyed a brief surge of strength and fluidity.

But by that point, it was too late and the damage from my wonky gait coupled with the hot, rough and canted roads, had been done. My right heel began to burn and I felt an escalating squish as my bare foot struck the pavement with each step. I refused to inspect my foot and acknowledge that a monstrous blood blister had developed, with four miles still remaining. I refused to focus on the distance ahead, allowing myself to think only of the present moment. “Just take one more step. One step is nothing. Then, just take one more.” I convinced myself that the pain was illusory – that it only existed if I gave it life – and somehow, this denial empowered me through, single squishy step by squishy step. As I sprinted to the finish, a huge smile was plastered on my face and a flood of endorphins masked the havoc I had wreaked on my body. And just like Cinderella at midnight, as I crossed the finish line and broke that invisible endorphin wall, my ecstatic sprint transformed into an awkward hobble over to the medical tent.


When I saw my finish time, I was surprisingly unfazed to learn I had raced my slowest half ever. Those 13.1 miles were more painful than any I had raced before, but they hurt far less than a DNF or worse – a DNS. Despite the physical pain and frustration, I genuinely enjoyed almost every moment. There is a reason runners return again and again to race, through heat, injury and fatigue … the energy of the running community, the intoxication of the journey, and the discoveries along the way entice us back as addictive rewards.

Several years ago this race would have devastated me. Indeed, by dwelling on insignificant matters of time and speed, racing can destroy a runner and quench the very passion that fuels us to run. But by embracing each experience as a novel opportunity for growth and self-discovery, we can only evolve into better runners – and better human beings. For me, the aggregate challenges of my years of running have reinforced one invaluable lesson. We runners are so much stronger, and our bodies capable of so much more, than we’re aware. Our power is only bounded by the limits of our mind and the integrity of our spirit. To paraphrase a particularly accomplished marathoner, my fastest days may be behind me, but my best running days lie ahead.


Barefoot Running Workshop 2: Lower and Upper Biomechanics

In the first of our Barefoot Running Workshops, we explored facts and fiction of barefoot running, sensory awareness and mechanics of the foot. In our second workshop today, we introduced basic running kinematics before moving north from the foot to cover mechanics of the lower and upper body. Here’s a recap of the workshop highlights for those who missed it.


Before diving into the nitty-gritty of leg, hip and torso function, it’s essential to understand how one gets from zero to running in the first place. Running has been described in a multitude of ways, from a controlled fall to alternating one-legged hops to a springy, aerial variant of walking. Given this confusing jumble of terminology, what then are the essential movements that convert a stationary body to a running body? The basic motion is far simpler than most runners would imagine. There’s no jumping, bouncing or flying required! In essence, running is nothing more than marching while moving forward.

March. Simply pick up the feet, the ankle gliding parallel to the shin up to the knee. Return the foot to its starting position and repeat with the other leg. That’s it. The 100-ups are a great exercise to reinforce this motor pattern.

Lean. Once you’ve mastered marching in place, it’s time to transform this into forward motion. This too is simpler than it sounds. To move forward, the body must lean forward. This lean should NOT come from bending at the waist; “sitting” or folding forward will cause a host of problems from the back to the hips to the knees. Instead, the lean should originate at the ankles, the entire body angled together along the same plane. By simply adopting a slight lean from the ankles, you will fall forward and be propelled from stationary marching into forward travel. March, lean, and BAM … you’re magically running!


Lift the legs. A constant upward motion should be maintained throughout the gait cycle. This is especially important after striking, when the legs should immediately lift up. The feet should land directly under the hips, neither reaching forward nor crossing over the midline. Both overstriding and a cross-over gait can lead to various injuries. The Gait Guys offer an excellent series of videos on correcting a cross-over gait (part 1, 2 and 3).

Bend the knees. To facilitate a smooth ride, bend and relax the knees. The knees can serve as shock absorbers when allowed to flex, so the greater the bend, the less impact will be sustained upon landing. This is especially helpful when running downhill.

Stable hips. The shin bone’s connected to the thigh bone … the thigh bone’s connected to the hip bone … Yes, it’s all connected, and these chains are particularly notable in the context of how the legs move in response to the hips. The hips are indeed the powerhouse and main driver of a strong running stride. Strong, stable hips are essential, and muscular imbalances or poor hip mechanics are the source of many leg and foot injuries. Don’t let the hips sink or drop, but keep them level on the horizontal plane. The hips serve as the body’s steering wheel, so be sure to keep them facing forward and aligned with the shoulders.


Core rotation. Some rotation is key to balancing the body’s left-right movements, but excessive rotation, or rotation from the wrong place, can be problematic. Most of the rotation should originate in the core. Imagine the pelvis as a chandelier, the torso as its suspension cable and your head as the ceiling. The pelvis should dangle, relaxed, and rotate freely from the waist, supported by the strength of the strong, elongated core. As the right leg swings back, the right pelvis rotates back. It’s not forced or pulled, but swings naturally, allowing greater leg extension without over-stressing the hips. (The chandelier example was adapted from this excellent article.)

Shoulders and arms. Keep the shoulders low and relaxed, but don’t slouch. Some shoulder motion is fine, but be careful not to dip them or overly rotate the chest. After the hips, the shoulders serve as a second steering wheel, so they should remain stable and facing forward. Keep the arms close to your sides, elbows at a 90 degree angle and swinging forward and backward rather than across the chest. The rhythm of your arms directly affects hip and leg motion; a rapid arm pump can encourage faster leg turnover, and fluid forward-backward swinging will minimize inefficient lateral movements.

Head and posture. Your head leads and guides the body below. Keep your head up and neck stretched tall and long. Unlike owls, humans are blessed with eyes that move independently from the head, so you can still look at the ground without tilting the head down. The entire body – from the ankles up to the tip of the head – should form a strong, continuous line, without kinks from poor posture or bending at the waist. Imagine being lifted upwards, suspended by a bird or plane (or pick your favorite flying power-creature) directly above your head.


Now it’s time to integrate these elements into your perfect running form! This video from the Natural Running Center is a beautiful example of a strong, efficient stride. Revisit this video and try to mimic Mark’s fluid, light motion whenever you need a refresher.

The final key to optimizing your stride is to forget everything you just read and just run. Yes, I am (mostly) serious. Sometimes less can be more in terms of tapping into your optimal gait. Running is one of the most natural movements for humans, and a strong, healthy body will readily fall into its own unique running stride. Obsessing about every component of your form will not only take the joy out of running, but can also backfire, inducing unnecessary tension or forced, inefficient motor patterns. If you find this occurring while tweaking your running mechanics, abandon the effort and simply allow your body to move fluidly and aimlessly. You might find that your muscles were one step ahead of your mind, and knew the route to efficient running all along.

Join us for the final session of our Barefoot Running Workshop series Sunday, July 12 at 3 pm. As usual, we’ll meet at the Founder’s Statue at the northwest corner of Balboa and El Prado in Balboa Park. In this final session, we’ll wrap up with how best to run hills and do speed work, as well as safety and practical considerations of running barefoot. More details and RSVP here.


Task Shifting may Shift our Understanding of the Default Network

Originally published on the PLOS Neuroscience Community

Over the past two decades, one of the most impactful discoveries to come from the surge in functional MRI (fMRI) research has been the existence of the brain’s “default network”. Countless studies have found that this system, mainly comprising medial frontal, parietal and temporal, and lateral parietal regions, is most active during rest or passive tasks such as mind-wandering, imagining or self-reflection. A new study, recently published in eLife by Ben Crittenden, Daniel Mitchell and John Duncan, presents a striking finding that may flip our understanding of the role of the default network on its head.

Task-switching: the common thread?

Many of the experiments evoking default network activity compare relatively unconstrained states conducive to rest or mind-wandering against rigid task conditions with targeted cognitive demands. Thus, while these studies contrast active and passive conditions, they also incidentally contrast states of sustained attentional focus with unrestricted, dynamically changing mental landscapes. Crittenden and colleagues argue that these shifting cognitive contexts may be the common thread to default network activity and thus explain its promiscuous involvement across such heterogeneous conditions. First author Crittenden explains how their seemingly radical diversion from classic theories came about through a serendipitous pilot experiment: “I developed an initial version of the current experiment to test the idea of which regions may be involved in orchestrating large switches, and the default network came out as really strong at the individual subject level. If these results held out we could be onto something quite interesting. We tweaked the task a bit and fortunately it followed the pilot data really nicely!”

To test their new hypothesis, the researchers conducted fMRI while participants performed three levels of task switching – a major cognitive switch, a minor switch or no switch. For example, if they were previously asked whether two geometric figures were the same shape, a minor change would be determining if two figures were the same height, whereas a major change would be determining if a dolphin is living or non-living. The minor-switch condition is similar in cognitive load to other tasks that have not shown reliable default network activation. If context changes are driving the default network, then radical task switches should more effectively engage it.

Task conditions. A switch from the red-box to the blue-box tasks would be a minor switch, whereas a switch from the red-box to the green-box task would be a major switch. Adapted from Crittenden et al., 2015


Major task switches recruit the default network

Past studies have found that the default network does not function as a whole, but roughly dissociates into three subnetworks – “core,” medial temporal lobe (MTL) and dorsomedial prefrontal cortex (DMPFC) networks. Suspecting that these subnetworks are not equally involved in switching, the researchers analyzed each subnetwork separately.

Compared to repeating the same task, major task switches activated the core and MTL networks. Small task switches did not activate any of the subnetworks. Using multivoxel pattern analysis, they further showed that the pattern of activity (versus the overall activation level) in all three subnetworks distinguished between the highly dissimilar tasks, but only the DMPFC network discriminated similar tasks. Thus, although both the overall magnitude and pattern of activity signaled contextual shifts, Crittenden raises some caution over interpreting the source of the pattern discrimination. “I imagine that a considerable amount of the classification accuracy between dissimilar tasks will be driven by lower-level visual features. However, it is still interesting that the default network is reliably representing this task information, which given the usual definition of the default network as task-negative, one may not have predicted.”
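
For readers curious what “pattern of activity (versus the overall activation level)” means in practice, here is a rough sketch of the decoding logic with simulated data (my own illustration; the data shapes, labels and classifier choice are assumptions, not the authors’ actual pipeline):

```python
# Hypothetical sketch of ROI-based multivoxel pattern analysis; not the authors' code.
# X holds a (n_trials x n_voxels) matrix of response estimates for one subnetwork ROI,
# y holds each trial's task label (e.g., shape task vs. living/non-living task).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 200
X = rng.normal(size=(n_trials, n_voxels))   # stand-in voxel patterns
y = rng.integers(0, 2, size=n_trials)       # stand-in task labels

# Subtract each trial's mean across voxels, so classification must rely on the spatial
# pattern of activity rather than on the region's overall activation level.
X_pattern = X - X.mean(axis=1, keepdims=True)

scores = cross_val_score(LinearSVC(), X_pattern, y, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")  # ~chance (0.5) for random data
```

Above-chance accuracy for such a mean-removed pattern is what licenses the claim that a subnetwork “represents” task identity even when its average activation does not differ between conditions.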

Activity for regions of the core (yellow), MTL (green) and DMPFC (blue) subnetworks for major (light colors) and minor (dark colors) task switches. Major switches activate many regions of the core and MTL subnetworks. Adapted from Crittenden et al., 2015


A shifting theory

If this finding is replicated, it could be the beginning of a major shift in our understanding of default network function. In contrast to the wealth of prior studies implicating the default network as “task-negative” – shutting down during demanding task conditions – here the default network was maximally engaged during dramatic contextual changes. These large task switches were objectively more challenging (participants responded more slowly) than the small-switch or no-switch conditions, in striking opposition to the notion that task difficulty suppresses the network. This implies that cognitive control or effort aren’t the key factors modulating these regions, but rather changing contextual states.

But does this model fit with the other mental states that reliably recruit the default network? Although it’s not yet clear what aspects of task shifting drive the observed response, the authors convincingly argue that indeed, many common default network activations can be accounted for by changes in cognitive context. At rest, during mind-wandering, imagining or reflecting on one’s past experiences, the mind is relatively free to jump between cognitive states. This contrasts with the constrained task conditions used in most fMRI studies that typically deactivate the default network. This relative cognitive liberty may give rise to radical mental shifts, for example, from thinking about the loud banging of the MRI scanner to planning your afternoon errands. Whether these spontaneous contextual changes are frequent enough to ramp up default network activity as observed remains to be seen. Alternatively, the key factor may not be adoption of a new task, but the attentional release to do so. When switching from one task to another, the brain must let go of its attention to the first task before focusing on the next. In passive cognitive states, attention is relaxed, liberating the mind to focus on various tasks at will.

Until their findings are replicated and expanded, Crittenden explains that these possibilities remain speculative. “I think that switches could be a contributing factor to the signal, however, by its nature the signal that we are envisioning is likely to be quite transient. More sustained activation such as during reminiscing/prospection/navigation etc. is likely to be a strong driver of default network activity. As we all like to say – more experiments are needed!”


Addis DR, Wong AT and Schacter DL (2007). Remembering the past and imagining the future: common and distinct neural substrates during event construction and elaboration. Neuropsychologia. 45(7):1363-77. doi: 10.1016/j.neuropsychologia.2006.10.016

Buckner RL (2012). The serendipitous discovery of the brain’s default network. Neuroimage. 62(2):1137-45. doi: 10.1016/j.neuroimage.2011.10.035

Crittenden BM, Mitchell DJ and Duncan J (2015). Recruitment of the default mode network during a demanding act of executive control. eLife. 4:e06481. doi: 10.7554/eLife.06481.001

Gusnard DA, Akbudak E, Shulman GL and Raichle ME (2001). Medial prefrontal cortex and self-referential mental activity: Relation to a default mode of brain function. PNAS. 98(7):4259-64. doi: 10.1073/pnas.071043098

Mason MF et al. (2007). Wandering Minds: The Default Network and Stimulus-Independent Thought. Science. 315(5810):393-5. doi: 10.1126/science.1131295


A sitting injury in disguise

Six months ago I wrote in jubilation that I had finally overcome my 17-year long struggle with high-hamstring tendinopathy. What first emerged as a nagging high school running injury has since haunted me with its frustratingly sporadic flare-ups. It seemed to rear its ugly head at random, with no clear relation to any aspect of my training – not distance, speed or hills. It must be my form, I reasoned. Since rehabbing with PRP, I’ve devoted these past several months to optimizing my running mechanics to prevent another resurgence of the dreaded hamstring pain. And these efforts have been paying off, as I’ve felt stronger and more fluid in my running than perhaps ever before. I considered myself officially victorious over this decades-long injury.

That is, until one week ago … one week ago, on a rest day (i.e., no running) at the end of an easy, low-mileage recovery week. After some light morning yoga and a day spent sitting at lab, I began to feel an achy spasm and cramping in my hamstring – an all-too familiar sensation that literally appeared out of nowhere on my drive home. The pain escalated over the subsequent hours and I was soon in the throes of my worst hamstring flare-up in a year. Tension and pain radiated from my neck down through the back of my knee and I fantasized about a chiropractic adjustment of my misaligned back and pelvis. I struggled through each run this week, but rest was not an option. In fact, the greatest pain was at rest; sitting, and especially driving, were torturous and even sleeping was a challenge. A week later, I’m finally seeing some light at the end of the tunnel, thanks to some aggressive ART, Graston and dry needling. While encouraging, this does not answer what caused the flare-up in the first place. I had done nothing obvious to exacerbate it, and had even been cautiously respecting my recovery week.

The answer, I’m now convinced, lies in a tiny skin irritation at the ischial tuberosity where the hamstring tendon attaches to the sit bones – the exact spot where I felt the most intense pain. The spot appeared coincidentally – or so I thought – around the same time the hamstring pain first set in. The “coincidence” didn’t faze me until a week later, when I finally began putting two and two together. The irritation was at the epicenter of my pain, which escalated to unbearable when sitting. Incidentally, there have been only two periods of my life when I’ve enjoyed extended relief from the injury: First, during a six-month trip around the world, during which my days were spent walking, hiking and exploring. Second, another six-month period while briefly working in a lab that required me to be on my feet at length. A clear pattern began to emerge. Freedom from desk-work and sitting correlated with symptom relief, whereas excessive sitting (with a sore on my tush as proof) correlated with spontaneous hamstring flare-ups.

Could my running injury have been a sitting injury all along? Perhaps this is wishful thinking. Perhaps there remains a running training error at the heart of the issue that I have yet to discover. But until then, I’m confidently adding hamstring trauma to the growing list of reasons sitting is hazardous to our health and a threat to the sanity of a running addict.


Barefoot Running Workshop 1: Myths, Sensations, Foot-strike

Thanks to the awesome crew who attended my first Barefoot Running Workshop, lessons were learned and loads of barefoot fun was had! We dispelled myths, explored the pleasantness of soft pine needles and the not-so-pleasantness of hot, rough pavement, and most importantly, left with happy, dirty feet.


As a recap for attendees or those interested in future workshops, below is an overview of the highlights from our first session. In this introductory meeting, we covered: 1) the facts and fiction of barefoot running, 2) the importance of sensory feedback and awareness, and 3) mechanics of the foot (don’t fret … we’re not foot-centric and will address mechanics above the foot in the next workshop).


MYTH 1: Barefoot running will cure my injuries.

Fact: Injuries are often the result of training errors, such as overtraining or incorrect form. Taking off your shoes can’t compensate for these mistakes, but the increased awareness and sensations from being barefoot can help you better listen to your body and train smarter.

MYTH 2: Barefoot running causes foot fractures, Achilles tears and calf strains.

Fact: Running carries a risk of injury, regardless of what is or is not on your feet. There are certainly reports of sustaining such injuries when running barefoot, but these are almost always due to transitioning too aggressively, or doing too much too soon (see also here and here). A gradual, conservative transition while respecting your body’s warning signs will let you run safely and injury-free.

MYTH 3: Barefoot running is just another fad and a gimmick.

Fact: Barefoot running is as old as man, and was how humans first began running. Conventional running shoes are a very recent invention (introduced only in the 1970s with the advent of recreational jogging). Despite misleading marketing, the cushioned soles and raised heels of typical running shoes have never been shown to improve running or prevent injury (See Pete Larson’s great book for more on the science of running shoes).

MYTH 4: I will cut my feet on glass, step on rocks or catch a disease.

Fact: Sure, these are possibilities, but the ground is much less dangerous than the fear-mongers will have you believe! Most of the earth is not, in fact, littered with broken glass and dirty needles. You will quickly learn to automatically pay attention to your surroundings to easily avoid such dangers. Your feet will also become more resilient against lesser dangers like stones, twigs or gravel.

MYTH 5: I need to build up calluses to toughen up my feet for barefoot running.

Fact: Calluses result from excess friction and are a sign of poor form. If you develop calluses or blisters, you are likely shearing, shuffling or pounding excessively. Over time your skin will become thicker and more resilient, but should not be rough or callused.

MYTH 6: Barefoot running will make me a faster or more efficient runner.

Fact: While barefoot running will change how you run and is unlikely to impair it, the evidence is mixed on whether it improves running economy or leaves it unchanged. When first learning to run barefoot, the body will naturally demand a slower pace and reduced mileage. But as the body adapts over time, runners will gradually return to their earlier performance level. One’s response to going bare depends on many factors, including training history, running conditions and distance.

MYTH 7: You cannot run competitively or quickly barefoot.

Fact: There have been many exceptional competitive barefoot runners throughout history, including Abebe Bikila (winner of the 1960 Olympic marathon in Rome) and 1980s Olympian Zola Budd.

MYTH 8: It’s best to run barefoot on the grass or sand.

Fact: If you’re looking for a bit of fun, go ahead and frolic barefoot through a grassy park or along the beach. But if your aim is to learn proper running form, stick to firm ground. Soft surfaces – just like cushioned shoes – can encourage lazy technique, particularly heel striking and heavy landing, and may even be more stressful to the body. Firm, even surfaces will provide the best feedback and sensations to train your neuromuscular system to run well.

MYTH 9: I can get the same benefits from minimalist shoes, without the risks of going barefoot.

Fact: Running in footwear – yes, even the most minimal shoe – will change how you run. Zero drop and thin-soled shoes carry certain advantages over conventional shoes, but a key benefit of being barefoot is the rich sensory feedback from your skin. You cannot experience these benefits with rubber between your foot and the earth.

MYTH 10: I can’t run barefoot because I’m flat-footed, overweight, too old, etc …

Fact: Anyone can run barefoot, regardless of age, shape or size. Running barefoot naturally encourages you to run lighter, easing the impact on your joints and tissues. Weak feet result from disuse, and will quickly become stronger with foot exercises and barefoot activities.


Enhanced sensory input lies at the heart of the many benefits of barefoot running. To maximally reap these benefits, we must become aware of our body’s response to the environment. What do you feel when running on concrete, pavement, gravel, dirt or grass? How about on hot, cold or wet surfaces? How do your sensory experience and gait change on various terrains? Note any sensitivity on the skin of your feet, your sense of stability and your proprioception. Do you run more lightly, quickly or fluidly on any particular surface?


Along with intensifying sensory experiences, running barefoot also heightens awareness of your internal and external environments. Running requires constant feedback to the body from its surroundings, and listening to these messages is key to safe, healthy and strong running. Take advantage of all your senses – especially your vision, hearing and touch – to maintain contact with your external environment. With a bit of practice you will begin to automatically scan for hazards (rocks, thorns, traffic, cyclists or playing children!) and for the optimal placement of your next step. At the same time, your internal awareness will naturally increase. Acknowledging your body’s responses to the environment will help refine your form, correct mechanical errors and prevent injury. If something feels off, play with your stride until you regain fluidity. But if you feel you’re pushing too far, listen to your body’s call for rest.


Foot-strike. What part of the foot touches first (forefoot, heel, midfoot)? Barefoot running encourages a mid- to forefoot strike, which research suggests may beneficially redistribute impact forces compared to heel-striking. However, there’s still no clear consensus over the “right” foot strike, or whether it even matters for injury prevention or performance.

Do you land more on the outside or inside of the foot? A natural strike will involve both pronation and supination, beginning with a slight inward roll followed by an outward roll at push-off. As these motions should come naturally, it is best not to force them, but to focus on landing with the whole foot at once. A helpful tool is visualizing the foot as a tripod; it is most stable when all three corners – the base of the big toe, base of the little toe and the heel – all contact the ground together.

Relax. Are the feet tense or relaxed? The feet may clench as a defense mechanism, especially on rough terrain. This can be dangerous and lead to excessive foot slapping, heavy impact and foot or shin pain. Relax the ankle and let the foot land softly.

Lift, don’t push. Do the feet push off or pound the ground? They should instead touch only briefly, followed by an immediate lift. The overall motion of the foot should be upwards, lifting from the ground rather than slamming downwards. This will prevent shuffling, shearing or twisting, which can lead to blisters or calluses.

Over-striding. Where do the feet land relative to your center-of-mass? They should land directly beneath the hips, not in front. Over-striding – or striking with the feet too far forward – is one of the most common sources of running injuries.

Cadence. Are the feet turning over rapidly? Aim for a high cadence (turnover rate), as this may help minimize impact forces and improve efficiency. Roughly 180 steps per minute is considered ideal.

Check out the recap from our second session, in which we covered the fundamentals of running form, including lower and upper body mechanics. In our third and final session July 12, we’ll explore hills, speed work and practical concerns of barefoot running.


How does Sports Training Restructure the Brain?

Originally published on the PLOS Neuroscience Community

The impact of regular exercise on the body is obvious. It improves cardiovascular fitness, increases strength and tones muscle. While these transformations are visible to the naked eye, the changes in brain structure and function driven by physical activity occur behind the scenes and are therefore less understood. It’s not news that the brain is wonderfully plastic, dynamically reorganizing in response to every sensory, motor or cognitive experience. One might imagine, therefore, that elite athletes – who train rigorously to perfect specialized movements – undergo robust neural adaptations that support, or reflect, their exceptional neuromuscular skills. Different sports, invoking different movements, will target unique neural substrates, but most physical activities similarly rely on regions that are key for eliciting, coordinating and controlling movement, such as the motor cortex, cerebellum and basal ganglia. In a new study published in Experimental Brain Research, Yu-Kai Chang and colleagues explored how microstructure in the basal ganglia reflects training and skill specialization of elite athletes.

Runners, martial artists and weekend warriors

The study enrolled groups of elite runners and elite martial artists, along with a control group of non-athletes who only engaged in occasional, casual exercise. Although both groups of athletes were highly trained (averaging over four hours of training daily), their uniquely specialized skills were key for determining whether basal ganglia structure varied by sport or by athletic training generally. The groups did not differ in terms of basic physical attributes, demographics or intelligence, but as expected, the athletes were more physically fit than the controls.

Measuring microstructure

The researchers focused on the basal ganglia, a set of nuclei comprising the caudate, putamen, globus pallidus, substantia nigra and subthalamic nucleus, since these structures serve critical roles in preparing for and executing movements and in learning motor skills.

Structures of the basal ganglia

They used diffusion tensor imaging (DTI), which measures how water diffuses within brain tissue. Since water diffusion is shaped by neural features like axon density and myelination, DTI is more sensitive to fine-scale brain structure than traditional MRI approaches that measure the size or shape of brain regions. Fractional anisotropy (FA) and mean diffusivity (MD) are common metrics that capture, respectively, the directionality and the overall amount of diffusion. Typically, higher FA and lower MD are thought to reflect greater integrity or organization of white matter.
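For readers who want to see how these two metrics relate to the underlying measurements, here is a minimal sketch using the standard textbook definitions of MD and FA in terms of the diffusion tensor’s eigenvalues. This is purely illustrative and is not the analysis pipeline used in the study:

```python
import numpy as np

def dti_metrics(eigenvalues):
    """Compute mean diffusivity (MD) and fractional anisotropy (FA)
    from the three eigenvalues of a voxel's diffusion tensor."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()  # average diffusion across all directions
    # FA: normalized dispersion of the eigenvalues (0 = isotropic, 1 = fully directional)
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return md, fa

# A strongly anisotropic voxel (diffusion mostly along one axis, as in coherent white matter)
print(dti_metrics([1.7e-3, 0.3e-3, 0.3e-3]))  # high FA; MD in mm^2/s
# A nearly isotropic voxel (diffusion similar in all directions)
print(dti_metrics([1.0e-3, 0.9e-3, 0.9e-3]))  # low FA
```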

Globus pallidus restructures in athletes

The basal ganglia microstructure of the athletes and controls was remarkably similar, with one exception. The internal globus pallidus showed lower FA and a trend toward higher MD in the athletes than in the non-athletes, but there were no differences between the runners and the martial artists.

This result is intriguing for two reasons. First, it’s notable that both athletic groups showed a similar magnitude of difference from the non-athletes. Thus, acquiring and refining skilled movements in general, rather than any particular movement pattern unique to running or martial arts, may restructure the globus pallidus. As study author Erik Chang explains,

“With the current results, we can only speculate that the experience of high intensity sport training, but not sport-specific factors, would trigger the localized changes in DTI indices we observed.”

This would make sense, considering that the area is an important output pathway of the basal ganglia, broadly critical for learning and controlling movements. Other regions may well undergo more specialized adaptations to sport-specific training. Chang expects that future studies using a whole-brain approach with “distinctions between sport types and reasonable sample size would find cross-sectional differences or longitudinal changes in brain structure related to motor skill specialization.”

Second, although we might expect athletic training to enhance regional brain structure, the reduced FA and increased MD observed in these elite athletes would commonly be read as signs of reduced white matter integrity. This is somewhat surprising in light of other studies reporting positive correlations between physical fitness and white matter integrity in non-professional athletes and children. But as Chang points out, “Professional sport experience is quite different from leisure training.” Although unexpected, this finding aligns well with reports that intensive training in dancers, musicians and multilinguals is associated with reduced gray or white matter volume or reduced FA. Why would this be? For starters, DTI doesn’t directly measure axonal integrity or myelination–only water diffusion. So while sports training clearly has some reorganizing effect on the basal ganglia, we can’t yet infer what changes are occurring at the neuronal level. One interesting possibility is that the development of such expertise involves neuronal reorganization or pruning as circuits become more specialized and efficient. Chang cautions that their findings “could reflect the manifestation of an array of factors, including increased neural efficiency, altered cortical iron concentration in the elite athletes, or other training-specific/demographic variables.”

In the broader context, this study is a striking example of why care is warranted in interpreting neuroplasticity. Depending on the study conditions, the same intervention–here, athletic training–can apparently remodel the brain in opposing directions. This is an important reminder that although we like to assume that bigger is better in terms of brain structure, this is not always true, highlighting the need to more deeply explore exactly how and why these neural adaptations occur. Chang eagerly anticipates that future studies incorporating “HARDI (High-angular-resolution diffusion imaging) and Q-ball vector analysis, together with larger sample sizes and longitudinal design, will be very helpful in revealing finer microscopic structural differences among different types of elite athletes.”


Chang YK, Tsai JH, Wang CC and Chang EC (2015). Structural differences in basal ganglia of elite running versus martial arts athletes: a diffusion tensor imaging study. Exp Brain Res. doi: 10.1007/s00221-015-4293-x

Chaddock-Heyman L, et al. (2014). Aerobic fitness is associated with greater white matter integrity in children. Cortex. 54:179-89. doi: 10.1016/j.cortex.2014.02.014

Elmer S, Hänggi J and Jäncke L (2014). Processing demands upon cognitive, linguistic, and articulatory functions promote grey matter plasticity in the adult multilingual brain: Insights from simultaneous interpreters. Front Hum Neurosci. 8:584. doi: 10.3389/fnhum.2014.00584

Hänggi J, Koeneke S, Bezzola L and Jäncke L (2010). Structural neuroplasticity in the sensorimotor network of professional female ballet dancers. Hum Brain Mapp. 31(8):1196-206. doi: 10.1002/hbm.20928

Imfeld A, et al. (2009). White matter plasticity in the corticospinal tract of musicians: a diffusion tensor imaging study. Neuroimage. 46(3):600-7. doi: 10.1016/j.neuroimage.2009.02.025

Tseng BY, et al. (2013). White matter integrity in physically fit older adults. Neuroimage. 82:510-6. doi: 10.1016/j.neuroimage.2013.06.011


A call for acceptance of career polygamy in science

Throughout my academic career, from undergrad to my current postdoc, I’ve been perplexed by my atypical relationship with science. Yes, research and I have maintained a long, passionate love affair, but an affair apparently unlike those enjoyed by my colleagues. My unconventional attitude towards my work has been a disconcerting voice suggesting that I’m just not cut out for a serious scientific career. I’ll certainly never win a Nobel, probably won’t publish in Science and may never even hold a faculty position. This reality has never really bothered me, but my lack of bother has been a subtle source of concern.

Only now, as a postdoc years into my Neuroscientific career, am I beginning to understand what makes my love affair with science so unusual. It’s by no means less genuine or less impassioned than those of colleagues madly pursuing tenure-track jobs; rather, it’s set apart by its polygamous nature. I get enthralled by new theories, overwhelmed with the excitement of shiny new data, and bore friends and family with my ecstatic ramblings about my research. I am a scientist for no other reason than I love it. However, it’s not the only object of my affection. I have never been, and probably never will be, able to suppress my love for so many other facets of life. A monogamous relationship with Neuroscience would just never suffice for me.

Since I was a teenager, a certain passage from Sylvia Plath’s The Bell Jar has haunted me. She described her predicament of being unable to choose a single fig – a life path – and as her indecision gripped her, the figs wilted, leaving her starving and without a future. I’ve long been distraught by a similar fear of foregoing any one of my many dreams, wavering among so many enticing options and failing to commit to one whole-heartedly. Like Sylvia, I too considered this a flaw … a characteristic that would hold me back and prevent me from attaining my goals. Now that I finally understand that these scattered passions, or lack of focus – call it what you will – lie at the heart of my atypical approach to my work, I am also finally accepting that this is not necessarily a flaw.

“Good” scientists come in all shapes and sizes, but common to all is a sincere curiosity, a longing for answers and a rigorous devotion to unveiling them. Although these are precisely the factors that originally drew me to Neuroscience, I have always struggled with the conviction that I must not love my work quite enough – or at least not as much as the rock-stars around me, spending grueling hours in the lab, aiming for the highest impact-factor journals and power-networking with the bigwigs in their field. To a certain degree, these are crucial elements of a successful research trajectory, and I too have worked hard, held my research quality to the highest standards, and of course reveled in the rewards of grants and publications. But I have worked equally hard outside of the lab. Throughout grad school and my postdoc, I’ve allowed myself to pick several of those ripe, juicy figs and have savored every one of them. I’m not talking about the conventional concept of work-life balance that we’ve come to accept – at least superficially – is essential for job satisfaction. I’m referring more specifically to work-work balance. I indulge my writing addiction through freelance writing and editing and won’t hesitate to take on other side-projects as I’m so inspired. These endeavors are often neuro-related, but sometimes sprout from my obsession with running and fascination with sports physiology and biomechanics. These extra-neuro pursuits are as much “work” as my research, and I approach them all with the same intensity and devotion. They have not limited my productivity as a Neuroscientist, but have actually fostered it, by keeping me fresh, motivated and engaged with novel perspectives within and beyond the science community.

I’ve been blessed with both graduate and postdoc advisers who’ve been remarkably supportive of my promiscuous work habits, which has doubtlessly contributed to my own recent acceptance of my choices. Yet I suspect my fortune is the exception rather than the rule, and that admitting to this sort of behavior would be met with disapproval or condemnation in many labs. In the current academic environment, time spent outside the lab or even (gasp!) enjoying yourself is too often considered a sign of laziness or lack of drive. Tales of researchers working themselves into poor health or even suicide are rampant. It’s not clear how a field built on incentives as beautiful as curiosity and understanding has become so ugly, but it’s high time this trend was reversed. Outside interests or other professional pursuits should not be sources of guilt, and they are not – contrary to common belief – prohibitive of a flourishing scientific career. Any culture that discourages the nurturing of broad interests can be toxic, stifling both personal growth and, ironically, professional development and productivity.

While there is certainly nothing wrong with the driven pursuit of a focused scientific career – and I strongly admire my dedicated colleagues who have chosen this path – it’s time we reject the myth that this is the only honorable or effective route to scientific success. As a first step, I’m embracing my relationship with Neuroscience, idiosyncrasies and all, and proudly proclaiming that we’ve been polygamous all along.

