Category Archives: Neuro

Task Shifting may Shift our Understanding of the Default Network

Originally published on the PLOS Neuroscience Community

Over the past two decades, one of the most impactful discoveries to come from the surge in functional MRI (fMRI) research has been the existence of the brain’s “default network”. Countless studies have found that this system, mainly comprising medial frontal, parietal and temporal, and lateral parietal regions, is most active during rest or passive tasks such as mind-wandering, imagining or self-reflection. A new study, recently published in eLife by Ben Crittenden, Daniel Mitchell and John Duncan, presents a striking finding that may flip our understanding of the role of the default network on its head.

Task-switching: the common thread?

Many of the experiments evoking default network activity compare relatively unconstrained states conducive to rest or mind-wandering against rigid task conditions with targeted cognitive demands. Thus, while these studies contrast active and passive conditions, they also incidentally contrast states of sustained attentional focus with unrestricted, dynamically changing mental landscapes. Crittenden and colleagues argue that these shifting cognitive contexts may be the common thread to default network activity and thus explain its promiscuous involvement across such heterogeneous conditions. First author Crittenden explains how their seemingly radical diversion from classic theories came about through a serendipitous pilot experiment: “I developed an initial version of the current experiment to test the idea of which regions may be involved in orchestrating large switches, and the default network came out as really strong at the individual subject level. If these results held out we could be onto something quite interesting. We tweaked the task a bit and fortunately it followed the pilot data really nicely!”

To test their new hypothesis, the researchers conducted fMRI while participants performed three levels of task switching: a major cognitive switch, a minor switch or no switch. For example, if participants were previously asked whether two geometric figures were the same shape, a minor switch would be determining whether two figures were the same height, whereas a major switch would be determining whether a dolphin is living or non-living. The minor-switch condition is similar in cognitive load to other tasks that have not shown reliable default network activation. If context changes are driving the default network, then radical task switches should more effectively engage it.
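To make the switch taxonomy concrete, here is a minimal sketch of how switch types could be labeled from one trial to the next. The task names and domain groupings are illustrative stand-ins, not the study’s actual stimulus code.

```python
# Minimal sketch of labeling switch types in a task-switching design.
# Task names and domain groupings are hypothetical, not taken from the study.

TASK_DOMAIN = {
    "same_shape": "geometric",   # are the two figures the same shape?
    "same_height": "geometric",  # are the two figures the same height?
    "living": "semantic",        # does the word name a living thing?
}

def switch_type(previous_task: str, current_task: str) -> str:
    """Classify a trial as a 'no', 'minor' (within-domain) or 'major' (across-domain) switch."""
    if previous_task == current_task:
        return "no"
    if TASK_DOMAIN[previous_task] == TASK_DOMAIN[current_task]:
        return "minor"
    return "major"

print(switch_type("same_shape", "same_height"))  # minor
print(switch_type("same_shape", "living"))       # major
```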

Task conditions. A switch from the red-box to the blue-box tasks would be a minor switch, whereas a switch from the red-box to the green-box task would be a major switch. Adapted from Crittenden et al., 2015

Major task switches recruit the default network

Past studies have found that the default network does not function as a whole, but roughly dissociates into three subnetworks – the “core,” medial temporal lobe (MTL) and dorsomedial prefrontal cortex (DMPFC) networks. Suspecting that these subnetworks are not equally involved in switching, the researchers analyzed each subnetwork separately.

Compared to repeating the same task, major task switches activated the core and MTL networks. Small task switches did not activate any of the subnetworks. Using multivoxel pattern analysis, they further showed that the pattern of activity (versus the overall activation level) in all three subnetworks distinguished between the highly dissimilar tasks, but only the DMPFC network discriminated similar tasks. Thus, although both the overall magnitude and pattern of activity signaled contextual shifts, Crittenden raises some caution over interpreting the source of the pattern discrimination. “I imagine that a considerable amount of the classification accuracy between dissimilar tasks will be driven by lower-level visual features. However, it is still interesting that the default network is reliably representing this task information, which given the usual definition of the default network as task-negative, one may not have predicted.”
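The paper’s decoding analysis is more involved than this, but a minimal sketch of the general multivoxel approach (cross-validated classification of trial-by-voxel activity patterns, here with simulated data and scikit-learn) conveys how task identity can be read out from the spatial pattern of activity rather than its overall level.

```python
# Minimal sketch of multivoxel pattern classification; simulated data, not the authors' pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300
X = rng.normal(size=(n_trials, n_voxels))   # stand-in for trial-wise voxel patterns in one subnetwork
y = rng.integers(0, 4, size=n_trials)       # stand-in for four task labels

# Cross-validated accuracy reliably above chance (0.25 for four tasks) would indicate
# that the spatial pattern of activity, not just its mean level, carries task information.
accuracy = cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()
print(f"mean cross-validated accuracy: {accuracy:.2f}")
```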

Activity for regions of the core (yellow), MTL (green) and DMPFC (blue) subnetworks for major (light colors) and minor (dark colors) task switches. Major switches activate many regions of the core and MTL subnetworks. Adapted from Crittenden et al., 2015

A shifting theory

If this finding is replicated, it could be the beginning of a major shift in our understanding of default network function. In contrast to the wealth of prior studies implicating the default network as “task-negative” – shutting down during demanding task conditions – here the default network was maximally engaged during dramatic contextual changes. These large task switches were objectively more challenging (participants responded more slowly) than the small-switch or no-switch conditions, in striking opposition to the notion that task difficulty suppresses the network. This implies that the key factor modulating these regions is not cognitive control or effort, but rather changing contextual states.

But does this model fit with the other mental states that reliably recruit the default network? Although it’s not yet clear what aspects of task shifting drive the observed response, the authors convincingly argue that many common default network activations can indeed be accounted for by changes in cognitive context. At rest, during mind-wandering, imagining or reflecting on one’s past experiences, the mind is relatively free to jump between cognitive states. This contrasts with the constrained task conditions used in most fMRI studies, which typically deactivate the default network. This relative cognitive liberty may give rise to radical mental shifts, for example, from thinking about the loud banging of the MRI scanner to planning your afternoon errands. Whether these spontaneous contextual changes are frequent enough to ramp up default network activity as observed remains to be seen. Alternatively, the key factor may not be adoption of a new task, but the attentional release to do so. When switching from one task to another, the brain must let go of its attention to the first task before focusing on the next. In passive cognitive states, attention is relaxed, liberating the mind to focus on various tasks at will.

Until their findings are replicated and expanded, Crittenden explains, these possibilities remain speculation. “I think that switches could be a contributing factor to the signal, however, by its nature the signal that we are envisioning is likely to be quite transient. More sustained activation such as during reminiscing/prospection/navigation etc. is likely to be a strong driver of default network activity. As we all like to say – more experiments are needed!”

References

Addis DR, Wong AT and Schacter DL (2007). Remembering the past and imagining the future: common and distinct neural substrates during event construction and elaboration. Neuropsychologia. 45(7):1363-77. doi: 10.1016/j.neuropsychologia.2006.10.016

Buckner RL (2012). The serendipitous discovery of the brain’s default network. Neuroimage. 62(2):1137-45. doi: 10.1016/j.neuroimage.2011.10.035

Crittenden BM, Mitchell DJ and Duncan J (2015). Recruitment of the default mode network during a demanding act of executive control. eLife. 4:e06481. doi: 10.7554/eLife.06481.001

Mason MF et al. (2007). Wandering Minds: The Default Network and Stimulus-Independent Thought. Science. 315(5810):393-5. doi: 10.1126/science.1131295

Gusnard DA, Akbudak E, Shulman GL and Raichle ME (2001). Medial prefrontal cortex and self-referential mental activity: Relation to a default mode of brain function. PNAS. 98(7):4259-64. doi: 10.1073/pnas.071043098

How does Sports Training Restructure the Brain?

Originally published on the PLOS Neuroscience Community

The impact of regular exercise on the body is obvious. It improves cardiovascular fitness, increases strength and tones muscle. While these transformations are visible to the naked eye, changes to brain structure and function induced by physical activity occur behind the scenes and are therefore less understood. It’s not news that the brain is wonderfully plastic, dynamically reorganizing in response to every sensory, motor or cognitive experience. One might imagine, therefore, that elite athletes – who train rigorously to perfect specialized movements – undergo robust neural adaptations that support, or reflect, their exceptional neuromuscular skills. Different sports, involving different movements, will target unique neural substrates, but most physical activities similarly rely on regions that are key for eliciting, coordinating and controlling movement, such as the motor cortex, cerebellum and basal ganglia. In a new study published in Experimental Brain Research, Yu-Kai Chang and colleagues explored how microstructure in the basal ganglia reflects training and skill specialization of elite athletes.

Runners, martial artists and weekend warriors

The study enrolled groups of elite runners and elite martial artists, along with a control group of non-athletes who only engaged in occasional, casual exercise. Although both groups of athletes were highly trained (averaging over four hours of training daily), their uniquely specialized skills were key for determining whether basal ganglia structure varied by sport or by athletic training generally. The groups did not differ in terms of basic physical attributes, demographics or intelligence, but as expected, the athletes were more physically fit than the controls.

Measuring microstructure

The researchers focused on the basal ganglia, a set of nuclei comprising the caudate, putamen, globus pallidus, substantia nigra and subthalamic nucleus, since these structures serve critical roles in preparing for and executing movements and in learning motor skills.

Structures of the basal ganglia

They used diffusion tensor imaging (DTI), which measures how water flows and diffuses within the brain. Since water diffusion is determined by neural features like axon density and myelination, it is more sensitive to finer-scale brain structure than traditional MRI approaches that measure the size or shape of brain regions. Fractional anisotropy (FA) and mean diffusivity (MD) are common metrics to assess, respectively, the directionality and amount of diffusion. Typically, higher FA and lower MD are thought to reflect higher integrity or greater organization of white matter.
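For readers unfamiliar with these metrics, FA and MD are simple functions of the three eigenvalues of the fitted diffusion tensor. The sketch below computes both from hypothetical eigenvalues; it illustrates the definitions, not the study’s processing pipeline.

```python
# Minimal sketch: fractional anisotropy (FA) and mean diffusivity (MD) from the
# three eigenvalues of a diffusion tensor. Eigenvalues here are hypothetical (mm^2/s).
import numpy as np

def fa_md(eigenvalues):
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()                                                 # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))  # fractional anisotropy
    return fa, md

# One dominant diffusion direction yields high FA (directional, organized tissue);
# equal eigenvalues yield FA = 0 (isotropic diffusion).
print(fa_md([1.7e-3, 0.3e-3, 0.3e-3]))
print(fa_md([0.8e-3, 0.8e-3, 0.8e-3]))
```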

Globus pallidus restructures in athletes

The basal ganglia microstructure of the athletes and controls was remarkably similar, with one exception. The internal globus pallidus showed lower FA and a trend toward higher MD in the athletes than in the non-athletes, but there were no differences between the runners and martial artists.

This result is intriguing for two reasons. First, it’s notable that both athletic groups showed a similar magnitude of difference from non-athletes. Thus, acquiring and refining skilled movements in general, rather than any particular movement pattern unique to running or martial arts, may restructure the globus pallidus. As study author Erik Chang explains,

“With the current results, we can only speculate that the experience of high intensity sport training, but not sport-specific factors, would trigger the localized changes in DTI indices we observed.”

This would make sense, considering the area is an important output pathway of the basal ganglia, broadly critical for learning and controlling movements. It’s likely that other regions may undergo more specialized adaptations to sport-specific training. Chang expects that future studies using a whole-brain approach with “distinctions between sport types and reasonable sample size would find cross-sectional differences or longitudinal changes in brain structure related to motor skill specialization.”

Second, although we expect athletic training to enhance regional brain structure, the reduced FA and increased MD observed in these elite athletes would commonly be considered signs of reduced white matter integrity. This is somewhat surprising in light of other studies reporting positive correlations between physical fitness and white matter integrity in non-professional athletes and children. But as Chang points out, “Professional sport experience is quite different from leisure training.” Although unexpected, this finding aligns well with similar reports that intensive training in dancers, musicians and multilinguals is associated with reduced gray or white matter volume or reduced FA. Why would this be? For starters, DTI doesn’t directly measure axonal integrity or myelination–only water diffusion. So while sports training has some clearly reorganizing effect on the basal ganglia, we can’t yet infer what changes are occurring at the neuronal level. One interesting possibility is that the development of such expertise involves neuronal reorganization or pruning as circuits become more specialized and efficient. Chang cautions that their findings “could reflect the manifestation of an array of factors, including increased neural efficiency, altered cortical iron concentration in the elite athletes, or other training-specific/demographic variables.”

In the broader context, this study is a striking example of why care is warranted in interpreting neuroplasticity. Depending on the study conditions, the same intervention–here, athletic training–can apparently remodel the brain in opposing directions. This is an important reminder that although we like to assume that bigger is better in terms of brain structure, this is not always true, highlighting the need to more deeply explore exactly how and why these neural adaptations occur. Chang eagerly anticipates that future studies incorporating “HARDI (High-angular-resolution diffusion imaging) and Q-ball vector analysis, together with larger sample sizes and longitudinal design, will be very helpful in revealing finer microscopic structural differences among different types of elite athletes.”

References

Chang YK, Tsai JH, Wang CC and Chang EC (2015). Structural differences in basal ganglia of elite running versus martial arts athletes: a diffusion tensor imaging study. Exp Brain Res. doi: 10.1007/s00221-015-4293-x

Chaddock-Heyman L, et al. (2014). Aerobic fitness is associated with greater white matter integrity in children. Cortex. 54:179-89. doi: 10.1016/j.cortex.2014.02.014

Elmer S, Hänggi J and Jäncke L (2014). Processing demands upon cognitive, linguistic, and articulatory functions promote grey matter plasticity in the adult multilingual brain: Insights from simultaneous interpreters. Front Hum Neurosci. 8:584. doi: 10.3389/fnhum.2014.00584

Hänggi J, Koeneke S, Bezzola L and Jäncke L (2010). Structural neuroplasticity in the sensorimotor network of professional female ballet dancers. Hum Brain Mapp. 31(8):1196-206. doi: 10.1002/hbm.20928

Imfeld A, et al. (2009). White matter plasticity in the corticospinal tract of musicians: a diffusion tensor imaging study. Neuroimage. 46(3):600-7. doi: 10.1016/j.neuroimage.2009.02.025

Tseng BY, et al. (2013). White matter integrity in physically fit older adults. Neuroimage. 82:510-6. doi: 10.1016/j.neuroimage.2013.06.011

A call for acceptance of career polygamy in science

Throughout my academic career from undergrad to my current postdoc, I’ve been perplexed by my atypical relationship with science. Yes, research and I have maintained a long, passionate love affair, but an affair apparently unlike those enjoyed by my colleagues. My unconventional attitude towards my work has served as a disconcerting voice that I’m just not cut out for a serious scientific career. I’ll certainly never win a Nobel, probably won’t publish in Science and may never even hold a faculty position. This reality has never really bothered me, but my lack of bother has been a subtle source of concern.

Only now, as a postdoc years into my Neuroscientific career, am I beginning to understand what makes my love affair with science so unusual. It’s by no means less genuine or less impassioned than those of colleagues madly pursuing tenure-track jobs; rather, it’s set apart by its polygamous nature. I get enthralled by new theories, overwhelmed with the excitement of shiny new data, and bore friends and family with my ecstatic ramblings about my research. I am a scientist for no other reason than I love it. However, it’s not the only object of my affection. I have never been, and probably never will be, able to suppress my love for so many other facets of life. A monogamous relationship with Neuroscience would just never suffice for me.

Since I was a teenager, a certain passage from Sylvia Plath’s The Bell Jar has always haunted me. She shared her predicament of being unable to choose a single fig – a life path – and as her indecision gripped her, the figs wilted, leaving her starving and without a future. I’ve long been distraught by this similar fear of foregoing any one of my many dreams, wavering among so many enticing options and failing to commit to one whole-heartedly. As did Sylvia, I too considered this a flaw … a characteristic that would hold me back and prevent me from attaining my goals. As I’m finally understanding that these scattered passions or lack of focus – call it what you will – lie at the heart of my atypical approach to my work, I am also finally accepting that this is not necessarily a flaw.

“Good” scientists come in all shapes and sizes, but common to all is a sincere curiosity, a longing for answers and a rigorous devotion to unveiling them. Although these are precisely the factors that originally drew me to Neuroscience, I have always struggled with the conviction that I must not love my work quite enough – or at least not as much as the rock-stars around me, spending grueling hours in the lab, aiming for the highest impact-factor journals and power-networking with the bigwigs in their field. To a certain degree, these are crucial elements of a successful research trajectory, and I too have worked hard, held my research quality to the highest standards, and of course reveled in the rewards of grants and publications. But I have worked equally hard outside of the lab. Throughout grad school and my postdoc, I’ve allowed myself to pick several of those ripe, juicy figs and have savored every one of them. I’m not talking about the conventional concept of work-life balance that we’ve come to accept – at least superficially – is essential for job satisfaction. I’m referring more specifically to work-work balance. I indulge my writing addiction through freelance writing and editing and won’t hesitate to take on other side-projects as I’m so inspired. These endeavors are often neuro-related, but sometimes sprout from my obsession with running and fascination with sports physiology and biomechanics. These extra-neuro pursuits are as much “work” as my research, and I approach them all with the same intensity and devotion. They have not limited my productivity as a Neuroscientist, but have actually fostered it, by keeping me fresh, motivated and engaged with novel perspectives within and beyond the science community.

I’ve been blessed with both graduate and postdoc advisers who’ve been remarkably supportive of my promiscuous work habits, which has doubtlessly contributed to my own recent acceptance of my choices. Yet, I suspect my fortune is the exception rather than the rule, with the admission of this sort of behavior being met with disapproval or condemnation in many labs. In the current academic environment, time spent outside the lab or even (gasp!) enjoying yourself is too often considered a sign of laziness or lack of drive. Tales of researchers working themselves into poor health or even suicide are rampant. It’s not clear how a field built on incentives as beautiful as curiosity and understanding has become so ugly, but it’s high time this trend was reversed. Outside interests or other professional pursuits should not be sources of guilt, and are not – contrary to common belief – prohibitive of a flourishing scientific career. Any culture that discourages the nurturing of broad interests can be toxic, stifling both personal growth and, ironically, professional development and productivity.

While there is certainly nothing wrong with the driven pursuit of a focused scientific career – and I strongly admire my dedicated colleagues who have chosen this path – it’s time we reject the myth that this is the only honorable or effective route to scientific success. As a first step, I’m embracing my relationship with Neuroscience, idiosyncrasies and all, and proudly proclaiming that we’ve been polygamous all along.

A New Mechanism for Neurovascular Coupling in FMRI

Originally published on the PLOS Neuroscience Community

Although fMRI is the most commonly used tool for detecting human brain activity, the blood oxygen level dependent (BOLD) signal does not directly reflect neuronal activity, but instead, measures changes in blood flow and oxygen metabolism. This “neurovascular coupling” – the translation of neural to vascular signals – lies at the core of fMRI’s utility as a proxy for neural activity, yet there’s still uncertainty over exactly how neural processes drive vascular signals. The neural-to-vascular link is largely obscured by the complex cascade of events involved in neural activity, including glucose metabolism, oxygen consumption, neurotransmitter release and recycling, and changing membrane potentials. Past research has pointed to astrocytes as key players in the neurovascular coupling game, as these cells envelop both neurons and blood vessels. A key signaling molecule, both within astrocytes and between astrocytes and other cells, is ATP, best known for its role as the “cellular energy currency.” In their recent paper published in the Journal of Neuroscience, Jack Wells, Isabel Christie and colleagues explored the physiological mechanisms by which astrocytes might serve as the neurovascular interface of fMRI. Their study tested whether astrocytic purines – including ATP and its products ADP and AMP – are critical for the BOLD response.

ATP is key to eliciting the BOLD response

The authors speculated that, if astrocytic ATP mediates the vascular response to neural activity, blocking ATP should impair the BOLD signal. In normal rats, electrically stimulating one forepaw induces a BOLD response and ATP release in the somatosensory cortex on the opposite side of the brain. Therefore, to test whether ATP is required for the BOLD response, they first disrupted ATP on only one side of the somatosensory cortex, and then stimulated both forepaws. They expressed TMPAP, an enzyme that breaks down purines, in the forepaw region on one side of the rats’ somatosensory cortices, and a control vector on the other side. Oddly enough, although these vectors weren’t cell-specific, they were mainly expressed in astrocytes – but not neurons – a convenient pattern for testing the selective role of astrocytes in neurovascular coupling.

As expected, the BOLD response to forepaw stimulation was typical in control somatosensory cortex. But the signal was reduced in cortex expressing TMPAP (see Figure, A left and B top). This suggested that purine signaling is indeed important for a normal BOLD response. But what if the altered signal resulted from some other effect of the TMPAP expression, besides the intended purine reductions? For instance, breaking down ATP and its products could lead to a build-up of the inhibitory neurotransmitter adenosine, which could interfere with normal neural activity. The authors repeated the experiment, this time using an adenosine antagonist to block any effects of adenosine accumulation. The results were the same. The BOLD response was reduced with TMPAP and was not normalized by blocking adenosine (see Figure, A right and B bottom), confirming that the effect wasn’t simply an artifact of adenosine build-up.

Group activation maps (A) and response curves (B) show that the BOLD response to forepaw stimulation is reduced after blocking purine signaling (TMPAP), compared to control (EGFP). The effect remains even after accounting for adenosine build-up with the adenosine antagonist DPCPX. From Wells et al., 2015.

Does ATP support neural and vascular signaling or just their coupling?

If astrocytic purine signaling is truly involved in the translation of neural activity to a cerebrovascular response, interfering with purines should diminish the BOLD effect (as they showed), but neural activity and the background vascular state should remain unchanged. Indeed, multiunit recordings showed that TMPAP did not affect the neural response to forepaw stimulation, and arterial spin labeling indicated no change in resting blood flow or vascular reactivity.

Astrocytic ATP: One piece of the puzzle

Results from each of these experiments provided a critical piece of the neurovascular puzzle, illustrating the role of astrocytic purines in the series of events translating neural activity to the BOLD response. Together, they suggest that ATP signaling in astrocytes is critical for a normal vascular response to neural activity, but importantly, is not needed for either neural or vascular function alone. In other words, astrocytic ATP selectively underlies the coupling of neural and vascular activity.

It’s important to note that, although these findings show that ATP is important for neurovascular coupling, it’s unlikely this is the only mechanism supporting the BOLD response. While this study doesn’t directly trace the intricate events by which ATP mediates neurovascular coupling, the authors offer several plausible pathways. ATP is known to trigger calcium responses in astrocytes, which – through a series of downstream processes – could cause vascular effects like blood vessel dilation that are key to the BOLD response. However, ATP does not just support communication between astrocytes, but is also involved in neuron-to-astrocyte and astrocyte-to-blood vessel signaling. Any of these interactions could feasibly explain why ATP is required for the vascular response to neural activity. Of course, we can’t rule out the influence of ATP in neurons, which also may modulate vascular function independent of astrocytes. Although TMPAP was primarily expressed in astrocytes, this wasn’t exclusive; it’s possible that ATP levels were also reduced in neurons and may have affected the BOLD response in distinct ways.

Many questions remain regarding the physiological origins of the BOLD response to neural activity. However, these findings from Wells, Christie and colleagues help to solidify the role of astrocytes, and to introduce ATP as a key player, in the neurovascular coupling game.

References

Wells JA, Christie IN et al. (2015). A Critical Role for Purinergic Signalling in the Mechanisms Underlying Generation of BOLD fMRI Responses. J Neurosci 35(13):5284-92. doi: 10.1523/JNEUROSCI.3787-14.2015

This is your Brain on Wine: FMRI Signals of Alcohol Content

Originally published on the PLOS Neuroscience Community

In today’s burgeoning wine industry, winemakers are in constant search of ways to perfect their product and achieve an edge over the competition. Complicating the challenge of producing a bottle that we’re sure to select at our next fine dining experience is the variability across palates. The individuality and unpredictability of sensory experiences – which may further be manipulated by context or expectations – make predicting a wine’s appeal a daunting task. In a dream world, winemakers could peer directly into the brain to examine the biological response to a smoky syrah or a spicy zinfandel. Such a tool could theoretically empower producers to target their wine characteristics to not just the psychological, but also the physiological response to a wine. In their study recently published in PLOS ONE, Frost and colleagues sought to accomplish just this, using functional MRI to assess brain responses to a wine’s flavor attributes.

Rather than assess relatively subjective features like fruitiness, tannins or fullness, the researchers focused on alcohol content, a more objective – and therefore easier to quantify – property. Twenty-one “inexperienced” wine drinkers (they imbibed less than once per week) participated in four wine-tasting sessions while undergoing functional MRI. During each session, they alternated among sipping a tasteless solution, a low-alcohol red wine (13-13.5%) and a high-alcohol red wine (14.5-15%). A different pair of low- and high-alcohol wines, matched on flavor, was tasted in each session. A post-scan taste-test confirmed that participants could not tell the difference between the low- and high-alcohol wines of each pair, as they rated their tastes as essentially identical.

Frost and colleagues identified 30 brain regions of interest that were activated by drinking wine, regardless of alcohol content. This set of areas was then further tested for effects of alcohol. Of these regions, only the right insula and right cerebellum were differentially activated by alcohol level, demonstrating greater activity to the low- than high-alcohol wines. Surprisingly, no regions preferentially activated to more alcoholic wines.
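As a rough sketch of this kind of region-of-interest follow-up, the code below runs a paired comparison of low- versus high-alcohol responses in each ROI across subjects, with a simple Bonferroni correction for the 30 tests. The data are simulated and this is not the authors’ exact statistical model.

```python
# Minimal sketch of an ROI-wise paired comparison (low- vs high-alcohol wine),
# Bonferroni-corrected across regions. Data are simulated, not from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects, n_rois = 21, 30
low = rng.normal(0.50, 0.2, size=(n_subjects, n_rois))   # hypothetical ROI responses, low-alcohol
high = rng.normal(0.45, 0.2, size=(n_subjects, n_rois))  # hypothetical ROI responses, high-alcohol

t, p = stats.ttest_rel(low, high, axis=0)                # paired t-test in each ROI
significant = p < 0.05 / n_rois                          # Bonferroni threshold
print("ROIs more active for low- than high-alcohol wine:", np.flatnonzero(significant & (t > 0)))
```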

This is your brain on wine. The right insula (left) and right cerebellum (right) were more active when participants drank low- than high-alcohol wines. Adapted from Frost et al., 2015.

The cerebellum is known to be involved in sensorimotor processing, which could reasonably account for its activation by subtle differences in alcohol perception. However, both the insula and cerebellum have been shown to be modulated by taste, activating to more intense flavors and feelings of satiety. Shouldn’t high-alcohol wines – which are arguably more intense – therefore more heavily engage these regions? The authors dug deeper into the literature to interpret these unexpected findings.

They propose that because these areas are involved in “cognitive modulation of sensory perception” and “coordinating the acquisition of sensory information,” the lower alcohol wines might have “induced a greater attentional orienting and exploration of the sensory attributes.”

Yet there’s one tiny hole in this explanation, at least when considering the current evidence alone. We could reasonably link activation of these regions to flavor intensity or taste perception if there were some associated behavioral indication that the wines elicit distinct sensory experiences. However, the participants in fact reported no perceptible taste difference between the two classes of wines. This discrepancy between the subjective perceptual experiences and brain responses suggests that the observed insular and cerebellar effects may reflect some sensory aspect of wine-tasting that lies below conscious awareness.

Although the researchers don’t directly discuss this possibility, it’s worth exploring. Since the difference in alcohol content between the wine types was notably small (just ~1.5%), it’s not surprising that the participants couldn’t detect a taste difference. It would be interesting to see whether the activations would be more robust to a wider gap in alcohol levels, or might track with a continuum of alcohol content. Furthermore, the study participants were “inexperienced” wine drinkers. Perhaps the taste differences would have been perceptible – or the brain responses stronger – in a sample of connoisseurs with more “refined palates.” As the evidence stands, we can’t conclude whether the BOLD responses indeed reflect effects of wine taste perception that were simply too subtle and hence immeasurable here, or instead relate to lower-level, unconscious sensory processes.

So what do these findings mean for the winemaker looking to neuroscience for a marketing advantage? It’s safe to assume that manipulating the alcohol content of a wine will indeed affect brain physiology (in fact, the known influence of alcohol on the BOLD signal raises concern over confounds between the wine conditions). However, it’s unclear how this brain response relates to a wine drinker’s sensory experience, let alone preference for one wine over another.

As blogger Neuroskeptic points out in his recent commentary on the study, “it’s not clear whether a brain scan is the best way to approach the question of whether high alcohol is overpowering. Surely the same thing could be demonstrated using a taste test.”

Despite these considerations, Frost and colleagues establish a solid stepping-stone to further explore the complex relationship between a wine’s flavor profile and consumers’ gustatory and neural responses. More importantly for wine-lovers everywhere, their study offers a key first step towards unraveling how and why that bold, oaky cabernet beats a merlot any day.

References

Bower JM et al. (1981). Principles of Organization of a Cerebro-Cerebellar Circuit. Brain Behav Evol 18:1-18. doi:10.1159/000121772

Frost R et al. (2015). What Can the Brain Teach Us about Winemaking? An fMRI Study of Alcohol Level Preferences. PLOS ONE. doi: 10.1371/journal.pone.0119220

Plassmann H et al. (2008). Marketing actions can modulate neural representations of experienced pleasantness. Proc Natl Acad Sci 105(3):1050-4. doi:10.1073/pnas.0706929105

Small DM et al. (2003). Dissociation of Neural Representation of Intensity and Affective Valuation in Human Gustation. Neuron 39(4):701-11. doi:10.1016/S0896-6273(03)00467-7

Smeets PAM et al. (2006). Effect of satiety on brain activation during chocolate tasting in men and women. Am J Clin Nutr 83(6):1297-1305.

Mapping Memory Circuits with High-Field FMRI

Originally posted on the PLOS Neuroscience Community

How we create and recall memories has long fascinated scientists, spurring decades of research into the brain mechanisms supporting memory. These studies overwhelmingly point to the hippocampus as an essential structure for memory formation; yet despite these efforts, we still don’t fully understand how hippocampal circuits transform stimulus input into stored memories, in part due to several fundamental methodological challenges.

The most commonly used functional imaging method in humans, fMRI, neither measures neural activity directly nor attains ideal spatiotemporal resolutions. Although more powerful, invasive techniques can be used in animals, it’s arguable whether they can be applied to assess higher cognitive functions like episodic memory, as the jury’s out on whether this process is uniquely human or shared with animals. However, recent neuroimaging advances are rapidly narrowing the power gap between invasive and non-invasive techniques, helping to reconcile findings across animal and human studies. In particular, high-field, high-resolution fMRI in humans is becoming more feasible, permitting sub-millimeter spatial resolution. Although the BOLD signal from fMRI only approximates the neural signal, such methodological advances get us one step closer to imaging neural activity during cognitive functions like memory formation. A team of researchers recently took advantage of high-field fMRI to investigate sub-region and layer-specific memory activity in the medial temporal lobe, an area critical for long-term memory acquisition.

The hippocampal-entorhinal circuit

Within the medial temporal lobe, the entorhinal cortex (EC) and hippocampus (including the subfields dentate gyrus, CA1, CA2 and CA3) make up a well-characterized circuit: superficial EC layers project to the dentate gyrus and CA1 via the perforant path, the dentate gyrus projects on to CA3 via mossy fibers, CA3 projects to CA1 via Schaffer collaterals, and CA1 finally returns to the deep layers of the EC. We know this circuit is important for memory, as the hippocampus is essential for memory encoding and other processes that presumably support memory, including novelty detection or pattern separation and completion. However, the mapping of these functions onto human entorhinal-hippocampal pathways is incomplete.

The entorhinal-hippocampal circuit

Imaging memory with high-field fMRI

To examine how novelty and memory signals are distributed along the EC-hippocampus circuit, Maass and colleagues conducted high-resolution (0.8 mm isotropic voxels) 7T fMRI while participants performed an incidental encoding task. The subjects viewed a series of novel and familiar scenes during scanning, and later completed a surprise memory recall test on the scenes they had previously seen. This allowed the researchers to assess brain activity related to novelty – by comparing novel and familiar trials – as well as activity related to successful memory encoding – by comparing trials that were subsequently remembered with those that were forgotten. On each subject’s structural brain image, they parcellated the EC into superficial input layers and deep output layers, and segmented the hippocampus into CA1 and a combined dentate gyrus/CA2/CA3 region (DG/CA2/3).
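A schematic of how the two contrasts follow from the trial structure is sketched below. The trial labels and activity values are simulated, and the study itself estimated these effects with a full fMRI analysis rather than the simple trial averages used here.

```python
# Minimal sketch of the two contrasts in this design: novelty (novel vs familiar scenes)
# and subsequent memory (later remembered vs forgotten). Simulated data, not the study's model.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 160
novel = rng.random(n_trials) < 0.5          # was the scene novel or familiar?
remembered = rng.random(n_trials) < 0.6     # outcome of the surprise recall test
activity = rng.normal(size=n_trials)        # stand-in for trial-wise responses in one subregion

novelty_effect = activity[novel].mean() - activity[~novel].mean()
encoding_effect = activity[remembered].mean() - activity[~remembered].mean()
print(f"novelty contrast: {novelty_effect:.2f}, subsequent-memory contrast: {encoding_effect:.2f}")
```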

Segmentation of entorhinal cortex layers (left) and hippocampal subfields (right). Maass et al., 2014.

Double-dissociation of novelty and encoding signals

Across participants, novel scenes activated DG/CA2/3, whereas successful encoding activated CA1, and the strength of this CA1 signal predicted retrieval accuracy. Next, Maass and colleagues looked at subject-level voxel-wise activity, which preserves high spatial resolution by eliminating the need for smoothing and across-subject averaging. Using multivariate Bayes decoding, which compares the log evidence that various regions predict a particular cognitive state, they evaluated whether EC or hippocampal regions predict novelty and memory encoding. As illustrated by the relative log evidences in the graphs below, DG/CA2/3 (A right) and CA1 (B right) respectively signaled novelty and encoding, consistent with the group-level findings. But this analysis further showed that superficial EC (the input layers to the hippocampus) and deep EC (the output layers from the hippocampus) also respectively predicted novelty (A left) and encoding (B left). What’s more, superficial EC and DG/CA2/3 were functionally coupled during novelty processing, whereas deep EC and CA1 were coupled during encoding.
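Multivariate Bayes decoding is a specific Bayesian model-comparison procedure; the sketch below is only a loose illustration of the underlying idea, comparing an approximate log model evidence (a BIC-based stand-in of my own choosing, not the method used in the study) for different regions predicting the same cognitive variable, on simulated data.

```python
# Loose illustration of comparing regions by how well their voxel patterns predict a
# cognitive variable, using -BIC/2 as a crude stand-in for log model evidence.
# This is NOT the multivariate Bayes procedure used in the study; data are simulated.
import numpy as np

def approx_log_evidence(X, y):
    """-BIC/2 for a linear model y ~ X, as a rough log-evidence approximation."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, k = len(y), X1.shape[1]
    sigma2 = resid @ resid / n
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return log_lik - 0.5 * k * np.log(n)

rng = np.random.default_rng(3)
n_trials = 100
novelty = rng.integers(0, 2, n_trials).astype(float)                 # novel vs familiar label
region_a = novelty[:, None] * 0.8 + rng.normal(size=(n_trials, 5))   # pattern carries the signal
region_b = rng.normal(size=(n_trials, 5))                            # pattern does not

# A higher relative log evidence suggests that region's pattern better predicts novelty.
print("region A:", round(approx_log_evidence(region_a, novelty), 1))
print("region B:", round(approx_log_evidence(region_b, novelty), 1))
```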

Multivariate Bayes decoding predicts novelty and encoding from entorhinal cortex and hippocampal activity. Maass et al., 2014.

In essence, these findings suggest a division of labor across the EC-hippocampal circuit, where hippocampal input pathways participate in novelty detection, and output pathways transform these signals for memory storage. The researchers offer a model in which information about stimulus identity feeds in from upstream regions such as the perirhinal and parahippocampal cortices, which are known to process object and scene identity. Hippocampal pattern separation or comparator computations might then be performed to both assess novelty and reduce interference between stimulus representations, transforming the novelty signal into output for long-term storage. This explanation for how hippocampal circuits process a stimulus representation is reasonable, considering that DG/CA3 is important for pattern separation, and CA1 has been proposed as a neural comparator, processes which may determine the memory fate of a stimulus representation.

Cautions and caveats

A segregation of function across EC layers and hippocampal subfields does not necessarily imply that these mappings are mutually exclusive. For instance, it’s likely that output pathways still carry a novelty signal, and memory formation may begin earlier in the processing stream than detected here. Despite the impressive resolution in this study, allowing fine segmentation of cortical layers and subregions, noise and artifact are inherent concerns for any fMRI study. As the BOLD signal is a crude estimate of neural activity, there may well be a ceiling to the power of high-field fMRI, even with the most rigorous methods. How accurately these region- and layer-specific signals map onto memory functions therefore remains to be validated. And of course, we can’t infer directionality, causality or any direct relationship to neural activity from fMRI alone. It’s tempting to interpret early circuit activity as an input signal and late activity as an output signal, or to assume that the BOLD response reflects excitatory neural activity; however, we’ll need more direct neuroimaging tools to trace the flow of neural signal and confirm these speculations.

Maass and colleagues’ study advances the field of cognitive neuroscience on two fronts. First, it helps bridge the gap between robust yet invasive imaging tools and the non-invasive but less powerful approaches commonplace in human imaging studies. Their successful application of high-field fMRI demonstrates the feasibility of assessing human brain activity with sub-millimeter resolution, paving the way for the standardization and refinement of these tools. Second, and perhaps most critically, it allows us to peer into the brain at previously impossible scales to view the live hippocampal circuit hard at work, processing and engendering memories. While past fMRI studies have effectively shown where memories are woven together, these findings refine this anatomical precision to bring us one step closer to understanding how hippocampal circuits accomplish this feat.

First author Anne Maass kindly offered to answer a few questions about her research. Here is a brief interview with Maass and her colleagues.

Are there unique methodological concerns to consider when using high-field, high-resolution fMRI?

The increased signal-to-noise ratio provided by MRI at 7T enables us to acquire fMRI data at an unprecedented level of anatomical detail. However, ultra high-field fMRI is also more vulnerable to distortions and susceptibility-related artifacts and the negative effect of motion increases with resolution.

In particular, the anterior medial temporal lobe regions, such as the entorhinal cortex and perirhinal cortex, are often affected by susceptibility artifacts. Nevertheless, an optimized 7T protocol as we used in our study can reduce (but not fully eliminate) these signal dropouts and distortions, e.g. by the very small voxel size, shorter echo times as well as optimized shimming and distortion correction. We therefore had to manually discard functional volumes with visible dropouts and distortions.

The analysis of high-resolution functional data raises additional challenges, for instance the precise coregistration of structural and functional (often partial) images or the normalization into a standard space, which is usually done for group comparisons. In our study, we aimed to evaluate functional differences between entorhinal and hippocampal layers and subregions. We thus manually defined our regions of interest and chose a novel approach that enables us to use the individual (raw) functional data to achieve the highest anatomical precision.

Have other studies examined the hippocampal-entorhinal circuit during memory encoding or novelty detection using more direct neural imaging tools, for example, with intracranial EEG? If so, how do they align with your findings?

Although there have been several intracranial EEG recording studies in humans that investigated functional coupling between hippocampus and EC (i.e. Fernandez et al., 1999), to our knowledge, these studies have not been able to look at deep versus superficial EC or at specific hippocampal subfields.

Your findings have obvious implications for memory disorders. Have you done any work investigating how the hippocampal-entorhinal memory circuit is disrupted in Alzheimer’s or other dementias?

Investigating layer-specific processing in aging or neurodegenerative diseases is of course of particular interest, as aging seems to particularly affect entorhinal input from the superficial EC layers to the dentate gyrus, and tau pathology in Alzheimer’s disease also emerges in the superficial EC layers before subsequently spreading to particular hippocampal subregions or layers (i.e. CA1 apical layers). However, high-resolution fMRI at 7T is particularly challenging in older people. The high probability of exclusion criteria (e.g. implants) complicates subject recruitment and stronger subject movement increases motion artefacts. So far we have collected functional data at 7T in healthy older people with 1mm isotropic resolution that we are currently analyzing. In addition, further studies are planned that focus on changes in intrinsic functional connectivity of the hippocampal-entorhinal network in early Alzheimer’s disease.

What further questions do your results raise regarding hippocampal memory pathways, and do you have plans to follow-up on these questions with future studies?

One further question that we are currently addressing is how hippocampal and neocortical connectivity with the EC is functionally organized in humans.

While the rodent EC shows a functional division into lateral and medial parts based on differential anatomical connectivity with parahippocampal and hippocampal subregions, almost nothing is known about functional subdivisions of the human EC.

In addition to characterizing entorhinal functional connectivity profiles in young adults, we also want to study how these are altered by exercise training. Finally, we aim to resolve how aging affects object vs. scene processing (and pattern separation) in different components of the EC and subfields of the hippocampus.

References

Dere E et al. (2006). The case for episodic memory in animals. Neurosci Biobehav Rev 30(8):1206-24. doi:10.1016/j.neubiorev.2006.09.005

Fernandez G et al. (1999). Real-Time Tracking of Memory Formation in the Human Rhinal Cortex and Hippocampus. Science 285(5433):1582-5. doi:10.1126/science.285.5433.1582

Friston K et al. (2008). Bayesian decoding of brain images. Neuroimage 39(1):181-205. doi:10.1016/j.neuroimage.2007.08.013

Maass A, et al. (2014). Laminar activity in the hippocampus and entorhinal cortex related to novelty and episodic encoding. Nat Commun 5:5547. doi:10.1038/ncomms6547

#PLOS #SfN14 Highlights: Exercise, Energy Intake and the Brain

Originally published on the PLOS Neuroscience Community

This Thanksgiving, many of us will be manipulating our energy balance — in one way or another. Most will be building our energy stores with a hefty dose of turkey and pumpkin pie, while others may tap into those reserves at their local turkey trot. Either way, we’ll need to look no further than the mirror to be reminded how diet and exercise mold the body.

Less obvious, however, is how energy availability regulates brain health. Emerging research is showing that tweaking our energy use through diet and exercise elicits positive metabolic changes that promote better neuronal and mental health. In their symposium “Exercise, Energy Intake, and the Brain” at this year’s recent Society for Neuroscience conference, scientists Henriette van Praag, Monika Fleshner, Michael Schwartz and Mark Mattson discussed the mechanisms underlying the brain-energy relationship.

FLESHNER

In her talk, Fleshner demonstrated the powerful effects of exercise on our response to stress. Not only does physical activity make many organisms — of all shapes and sizes — simply feel good (rats, frogs and even slugs will voluntarily run on wheels in the wild!), but it also does wonders to protect us against the hazards of stress. After only six weeks of regular running, an individual will begin to show signs of stress robustness, becoming more resilient and resistant to stress. But just how does that morning jog help us combat a stressful day at work?

Animals of all types love to run!

According to Fleshner, exercise attenuates the typical stress-induced activation of the dorsal raphe nucleus — a major source of serotonergic projections. Although a logical player in this process might be the medial prefrontal cortex (mPFC), since this area regulates serotonin transmission in the dorsal raphe nucleus, the mPFC isn’t necessary for exercise-related stress resistance. Rather, six weeks of wheel-running increases levels of 5HT1A inhibitory autoreceptors and reduces stress-induced serotonin release. Thus, it appears that exercise effectively puts the brakes on the dorsal raphe nucleus-mediated serotonergic response to stress. What’s more, this stress-robustness likely involves a widespread coordinated response including exercise-induced epigenetic changes. In fact, Fleshner showed that a host of stress-related genes are differentially expressed in physically active and inactive individuals.

MATTSON

Mattson opened his discussion with some inspiring anecdotes on the subjective benefits of exercise and fasting. For instance, celebrated writer Joyce Carol Oates is known to do some of her best writing while running, an experience with which I — a runner and writer myself — am dearly familiar.

Running seems to allow me, ideally, an expanded consciousness in which I can envision what I’m writing as a film or a dream. — Joyce Carol Oates

Sure, it may feel like physical activity makes our thoughts flow more fluidly, but just how and why might exercise spark greater neural efficiency? Exercise promotes mitochondrial growth and development systemically, and it’s well accepted that what’s good for the body is good for the brain; the benefits of exercise aren’t limited to muscle cell mitochondria, but extend to neuronal mitochondria as well. Mattson outlined a molecular pathway by which physical activity influences mitochondrial integrity, neurogenesis and plasticity in the hippocampus. While the nitty-gritty details of the circuit are beyond the scope of this post, two key players are worth mentioning: the protein PGC1-alpha, which is activated by exercise, and SIRT3, levels of which are increased in runners compared to non-runners. Notably, PGC1-alpha is necessary for mitochondrial biogenesis, including in hippocampal neurons, and activates SIRT3, which is important for normal long-term potentiation and synaptic calcium release. In short, exercise triggers a cascade of cellular processes that promote efficient mitochondrial and neuronal function.

Mattson next highlighted some parallels between the neuroprotective effects of exercise and intermittent energy restriction (such as fasting or calorie restriction). Notably, running has been shown to up-regulate BDNF, CREB activation and DNA-repair mechanisms, which may combat the deleterious effects of aging. Energy restriction similarly elicits a host of positive neurobiological effects — for instance, increased synaptic density and neurogenesis — and even promotes longevity (30% greater lifespan in rats isn’t bad!). There’s some evidence that intermittent fasting may actually be more beneficial than calorie restriction, as it more effectively lowers heart rate, a process likely mediated by increased BDNF. Finally, Mattson pointed out that both exercise and fasting enhance production of ketones, a highly robust source of neuronal energy that has also been shown to enhance cognitive function and neural plasticity.

Unfortunately, I was unable to attend the additional talks by van Praag and Schwartz. But fortunately, the symposium speakers compiled a Journal of Neuroscience review highlighting their key points.

Running! If there’s any activity happier, more exhilarating, more nourishing to the imagination, I can’t think what it might be. In running the mind flies with the body; the mysterious efflorescence of language seems to pulse in the brain, in rhythm with our feet and the swinging of our arms. — Joyce Carol Oates

@PLOSNeuro #SfN14 Highlights: Intracranial EEG and Brain Stimulation

Originally published on the PLOS Neuroscience Community

Despite their many advantages, traditional tools for studying neurocognitive function in humans, such as EEG or fMRI, carry several disadvantages compared to those usable in animals. Perhaps the most significant limitation is the challenge of directly measuring neural activity in the live human brain during mental functions, which inherently requires invasive neuroimaging techniques. Recently, the cognitive neuroscientist’s tool-belt has rapidly expanded, with the growing prevalence and usability of powerful imaging methods such as intracranial EEG – or electrocorticography (ECOG) – and electrical brain stimulation, which permit direct recording or stimulation of neuronal activity in live, conscious humans.

The SfN symposium Studying Human Cognition with Intracranial EEG and Electrical Brain Stimulation (previously previewed here, including an interview with speaker Josef Parvizi) explored current advances in these evolving methods along with their applications to the human cognitive experience.

Knight

UC Berkeley’s Bob Knight opened the symposium by highlighting the unique perks of ECOG over more traditional imaging techniques — points which were later recapitulated by other speakers — including its remarkably high spatial and temporal resolution and exceptional signal-to-noise ratio. ECOG is in fact so precise that it can reliably measure signal down to the single-trial level – a feat neither EEG nor fMRI can boast. In just his brief introduction, Knight shared some impressive clinical and cognitive applications of these electrophysiology techniques. For instance, intracranial EEG signal from the auditory cortex was effective (with 99% accuracy!) at reconstructing words, holding clear implications for patients with speech impairments. My personal favorite highlight of the session, however, was the reconstruction of Pink Floyd’s “Another Brick in the Wall” from intracranial auditory cortex recordings.

Parvizi

First up, Josef Parvizi from Stanford University presented his lab’s multimodal approach to neurocognitive assessment, incorporating fMRI, ECOG and electrical brain stimulation. Parvizi shared a series of cases illustrating the powerful – and entertaining — applications of brain stimulation. In response to stimulation of the “salience network”, which had been previously mapped using fMRI, one patient responded that he felt like he was “riding in a storm”, but “felt nothing” after sham stimulation. A second patient reported the sense that “something bad is going to happen,” confirming in both patients emotionally driven reactions to “salience network” stimulation. In a final, particularly compelling, demonstration, Parvizi showed the effects of fusiform face area stimulation: “You just turned into somebody else,” the subject reported. “That was a trip!”

Malach

Next, Rafael Malach of the Weizmann Institute discussed his lab’s use of intracranial EEG to measure spontaneous neural activity at rest. FMRI is most commonly used to study resting-state activity; however, the BOLD signal may be contaminated by non-neural signal, and — due to its poor temporal resolution — is effectively blind to rapid events. Using ECOG, which overcomes both of these hurdles, Malach demonstrated how high frequency gamma activity accurately reflects neuronal firing rate and can assess functional connectivity. Surprisingly, spontaneous activity between recording sites on opposite hemispheres is more highly correlated than between adjacent recording sites. So ECOG may be a powerful tool for measuring spontaneous activity, but this is only valuable if we can identify the signal’s associated mental processes. Using the comical and celebrated example of the entorhinal cortex “Simpsons neuron”, which selectively fired in response to images of the Simpsons or immediately before spontaneous recall of the cartoon, Malach suggested that spontaneous activity exceeding an awareness threshold might indeed represent conscious thoughts.
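As a rough sketch of the signal processing involved (not Malach’s actual pipeline), high-frequency gamma power can be estimated by band-pass filtering each electrode’s trace and taking its Hilbert envelope, and a simple “functional connectivity” measure is the correlation of those envelopes between electrodes. The sampling rate and band edges below are assumptions, and the data are simulated.

```python
# Minimal sketch: high-gamma envelope correlation between two simulated ECOG channels.
# Band edges (70-150 Hz) and sampling rate are assumptions, not taken from the talk.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(4)
shared = rng.normal(size=t.size)               # fluctuation common to both channels
ch1 = shared + 0.5 * rng.normal(size=t.size)
ch2 = shared + 0.5 * rng.normal(size=t.size)

b, a = butter(4, [70, 150], btype="bandpass", fs=fs)

def high_gamma_envelope(x):
    """Band-pass to the assumed high-gamma range and return the analytic amplitude."""
    return np.abs(hilbert(filtfilt(b, a, x)))

env1, env2 = high_gamma_envelope(ch1), high_gamma_envelope(ch2)
print(f"high-gamma envelope correlation: {np.corrcoef(env1, env2)[0, 1]:.2f}")
```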

Lachaux

Jean-Philippe Lachaux, from the Lyon Research Council, took a slightly different angle on the applications of ECOG, highlighting its unique suitability for evaluating naturalistic behavior. Because it is robust against artifacts that plague EEG and fMRI – such as motion, blinking or signal distortion – ECOG can be used more flexibly in a variety of environments, and these applications can be enhanced by integrating it with tools such as eye-tracking to associate natural behavior with neural activity more precisely in real time. Lachaux also illustrated the power of ECOG for unraveling the temporal dynamics of functional interactions, presenting data that question the common assumption that inter-region communication is a one-way street and proposing instead that such interactions may be more akin to reciprocal “shared conversations”.

Kastner

Sabine Kastner of Princeton University wrapped up the session with her lab’s comparative studies of attention in humans and monkeys. Combining human intracranial EEG with single-unit and local field potential (LFP) recordings in monkeys performing a flanker attention task, she reported similar attentional modulation in the human and monkey intraparietal sulcus. Intriguingly, while attention modulated high-gamma activity in both species, it also increased low-frequency oscillations in humans. At the heart of cognitive neuroscience is the question of how neural activity translates into thoughts and behavior. To address this issue directly, Kastner is using electrophysiology to identify the optimal neural code for attention. In both humans and monkeys, she finds that spike phase predicts behavior better than spike rate, inching us one step closer to resolving the brain-cognition relationship.
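
As a purely conceptual illustration – not Kastner’s analysis – comparing two candidate neural codes can be framed as cross-validated decoding of behavior from each feature separately. Everything below is simulated, and phase is represented by its sine and cosine components because it is a circular variable.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials = 200
behavior = rng.integers(0, 2, n_trials)               # e.g., correct vs incorrect trials
spike_rate = rng.normal(10 + 0.5 * behavior, 3.0)     # rate carries weak information
spike_phase = rng.vonmises(np.pi * behavior / 2, 2.0) # phase carries stronger information

def decoding_accuracy(features, labels):
    """Five-fold cross-validated accuracy of predicting behavior from a neural feature."""
    return cross_val_score(LogisticRegression(), features, labels, cv=5).mean()

print("rate :", decoding_accuracy(spike_rate.reshape(-1, 1), behavior))
# Phase is circular, so decode from its sine and cosine components.
print("phase:", decoding_accuracy(np.column_stack([np.sin(spike_phase),
                                                   np.cos(spike_phase)]), behavior))
```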

Judging by the responses to my live-tweeting of this symposium, I’ll conjecture that the Neuro community is as intrigued and excited as yours truly about the potential applications of ECOG and brain stimulation. In the words of @WiringTheBrain,

“This stuff is so COOL! And scary. But mainly COOL!”


What does Work-Related Burnout do to the Brain?

Originally published on the PLOS Neuroscience Community

We’ve all experienced it – the fatigue, stress and irritability after a long day of work. For most, these feelings are fleeting, and are nothing a good night’s sleep or a cup of tea over a good book can’t remedy. But for others, the daily stress extends into weeks and months, and eventually into long-term burnout. The physical toll on the over-worked can be so extreme that occupational burnout is being increasingly recognized as a serious medical condition. While the behavioral symptoms – including problems with memory or concentration, mood imbalances, insomnia and body aches – are well documented, the consequences of chronic burnout on brain function, and how such neural changes give rise to emotional dysregulation, have been inadequately examined. A recent PLOS One study, by Amita Golkar and colleagues from the Karolinska Institute, sought to better understand how chronic work-related stress alters brain function and emotional processing. While their findings confirm that impaired emotional regulation has neurobiological roots, another expert in the field has raised the question of whether stress may affect additional neural circuits undetected here.

Assessing stress

Thirty-two individuals with chronic burnout and 61 healthy controls participated. The patients worked 60-70 hours per week, manifested symptoms including sleeplessness, fatigue, irritability, cognitive impairments or impaired working ability for at least a year, and had lost at least six months of work to sickness. Each participant completed two test sessions, including a startle response task to measure emotional regulation, and resting-state functional MRI to evaluate functional brain connectivity.

During the behavioral task, a series of neutral and negative pictures was shown, with each picture flashed before and after an instruction cue (Figure 1). For negative pictures, subjects were told to either up-regulate, down-regulate or maintain their emotional response to the image (i.e., to experience the second presentation as more, less or similarly emotionally charged as the first presentation). Neutral pictures were always paired with the instruction to maintain the emotional response. To assess how the cues affected participants’ physiological responses to the images, the researchers delivered an acoustic startle probe during each picture presentation and measured eye-blink responses using electromyography. This allowed them to compare stress responses to an identical stimulus, differing only in how the participants regulated their emotional reactions.

Figure 1. Startle responses were measured before and after an emotional regulation cue to the same picture. doi: 10.1371/journal.pone.0104550

Burnout impairs emotional regulation

When told to maintain or up-regulate their emotional responses, the burnout and control groups showed similar startle modulation (the response to the post-cue picture minus the response to the pre-cue picture). But critically, during the down-regulate condition the burnout group not only exhibited a greater stress response than controls (Figure 2), but also reported less success at following the emotional regulation instructions for the negative images. From these behavioral findings alone, it’s clear that chronic stress can dramatically alter how we process negative emotions. In particular, the burnt-out workers demonstrated less control over their reactions to negative experiences, showing signs of elevated distress that they were unable to dampen.
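
In code, the behavioral measure reduces to a simple difference score per participant, which can then be compared between groups. The sketch below uses simulated EMG eye-blink magnitudes and a plain independent-samples t-test – an assumption on my part, not necessarily the authors’ exact statistical model.

```python
import numpy as np
from scipy import stats

def startle_modulation(pre, post):
    """Startle modulation = response to the post-cue picture minus response to the pre-cue picture."""
    return np.asarray(post) - np.asarray(pre)

# Simulated eye-blink magnitudes (arbitrary EMG units) in the down-regulate condition,
# averaged within each participant.
rng = np.random.default_rng(2)
burnout = startle_modulation(rng.normal(50, 5, 32), rng.normal(55, 5, 32))
controls = startle_modulation(rng.normal(50, 5, 61), rng.normal(50, 5, 61))

# Did patients show a larger residual stress response when trying to down-regulate?
t, p = stats.ttest_ind(burnout, controls)
print(f"t = {t:.2f}, p = {p:.3f}")
```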

Figure 2. Patients showed an exaggerated response to negative images when instructed to down-regulate their emotions. doi: 10.1371/journal.pone.0104550

Burnout alters limbic function

Given this strong evidence that something was awry in these patients’ emotional regulation circuitry, Golkar and colleagues next asked whether altered neural function might underlie their symptoms. Naturally, they looked to the limbic system, a brain network involved in processing emotion. They focused particularly on the amygdala, which is known to be critical for evoking fear and anxiety, and is enlarged in people with occupational stress. Here, functional connectivity during rest between the amygdala and several brain regions was altered in patients; most notably, connections were weaker with the prefrontal cortex and stronger with the insula. What’s more, the stronger the correlation of the amygdala with the insula or a thalamic/hypothalamic region, the higher the individual’s perceived stress. Finally, connectivity between the amygdala and the anterior cingulate correlated with participants’ ability to down-regulate their emotional response from the startle-response task.
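
The general recipe behind seed-based findings like these is straightforward: correlate the amygdala’s resting time series with that of each target region, Fisher-transform the correlation, and relate the resulting values to behavior across participants. The sketch below uses simulated ROI time series and hypothetical stress scores; it is not the authors’ pipeline.

```python
import numpy as np
from scipy import stats

def seed_connectivity(seed_ts, target_ts):
    """Fisher-z-transformed Pearson correlation between two ROI time series."""
    return np.arctanh(np.corrcoef(seed_ts, target_ts)[0, 1])

# Simulated resting-state data: one ROI-averaged time series per subject.
rng = np.random.default_rng(3)
n_subjects, n_timepoints = 32, 200
amygdala = rng.standard_normal((n_subjects, n_timepoints))
insula = rng.standard_normal((n_subjects, n_timepoints))
perceived_stress = rng.normal(20, 5, n_subjects)  # hypothetical questionnaire scores

z = np.array([seed_connectivity(a, i) for a, i in zip(amygdala, insula)])

# Across subjects, does stronger amygdala-insula coupling track higher perceived stress?
r, p = stats.pearsonr(z, perceived_stress)
print(f"r = {r:.2f}, p = {p:.3f}")
```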

Figure 3. Differences in functional connectivity with the amygdala between patients and controls. doi: 10.1371/journal.pone.0104550

The findings of Golkar and colleagues help to establish a concrete understanding of the cognitive and neural changes underlying a too-often overlooked serious health condition. These findings add credence to the subjective feeling of being overly sensitive to negativity, or unable to control emotions, when burnt out. Perhaps more importantly, they confirm that such emotional impairments indeed have neurobiological underpinnings – changes that fit in beautifully with our knowledge of how the brain processes emotion. A stress-related disconnect between the amygdala and the prefrontal cortex and anterior cingulate – even at rest – builds upon earlier studies showing reduced volume and altered task-evoked responses in these areas associated with stress. And chronic stress was further related to amygdala hyperconnectivity with the insula and thalamus/hypothalamus, key regions for eliciting a stress response.

Dissociating the neural effects of stress

However, this study leaves several questions unanswered and raises a few more. Given the complexity of the patients’ psychological conditions, there were almost certainly other physical and psychological differences between the groups that went undocumented and uncontrolled. In the future, closer examination of these possible confounds will help identify their unique neural and behavioral effects. Furthermore, in addition to functional changes in several expected regions, altered resting connectivity also appeared in two unexpected regions – the cerebellum and motor cortex. Whether these are false positives, or whether occupational stress has additional, underappreciated motor or cognitive consequences, remains to be seen.

Another perspective

Because of the study’s justifiable focus on connectivity with the amygdala, it’s unclear how specific or broad the neural changes associated with chronic stress may be. Tom Liu, a researcher studying resting-state brain connectivity at UC San Diego, who was not involved in this study, explains,

“This begs the question of what other connections might be different between the two groups or perhaps show even better correlation with the stress scores. The issue there is that because of the large number of potential connections, a researcher is very quickly faced with a large multiple comparisons problem – this is an open issue in the field.”
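
A common, if partial, remedy for the edge-wise multiple comparisons problem Liu describes is false discovery rate control. Below is a minimal Benjamini-Hochberg sketch over hypothetical p-values for every pairwise connection; it is meant only to illustrate the scale of the problem, not to prescribe the field’s solution.

```python
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Return a boolean mask of tests surviving Benjamini-Hochberg FDR correction."""
    p = np.asarray(p_values)
    order = np.argsort(p)
    m = p.size
    thresholds = alpha * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    survivors = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.nonzero(below)[0].max()   # largest rank still under its threshold
        survivors[order[:cutoff + 1]] = True  # reject all hypotheses up to that rank
    return survivors

# Hypothetical p-values for every connection among 90 regions: ~4,000 tests.
rng = np.random.default_rng(4)
edge_p = rng.uniform(0, 1, 90 * 89 // 2)
print(f"{benjamini_hochberg(edge_p).sum()} of {edge_p.size} edges survive FDR correction")
```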

Further work will help clarify whether stress – or other differences between the groups – predominantly affects limbic circuitry or might also contribute to global brain changes. Liu points out,

“One aspect that would have been interesting to look at is whether there were any global differences between the two groups that could have accounted for the differences, as the authors did not perform global signal regression.”

For instance, two recent studies report that the global signal is altered in schizophrenia and varies with vigilance.
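
For readers unfamiliar with the step Liu mentions, global signal regression removes the brain-wide mean signal from every region’s time series before connectivity is computed. A minimal sketch with simulated ROI data – not anything from the study – is shown below.

```python
import numpy as np

def global_signal_regression(roi_ts):
    """Regress the global (mean across ROIs) signal out of every ROI time series.

    roi_ts: array of shape (n_rois, n_timepoints); returns the residual time series.
    """
    global_signal = roi_ts.mean(axis=0)
    design = np.column_stack([global_signal, np.ones_like(global_signal)])  # regressor + intercept
    betas, *_ = np.linalg.lstsq(design, roi_ts.T, rcond=None)
    return roi_ts - (design @ betas).T

# Simulated data: 90 ROIs sharing a common global fluctuation.
rng = np.random.default_rng(5)
roi_ts = rng.standard_normal((90, 200)) + rng.standard_normal(200)
cleaned = global_signal_regression(roi_ts)

# After regression, every ROI is uncorrelated with the original global signal.
global_signal = roi_ts.mean(axis=0)
print(max(abs(np.corrcoef(row, global_signal)[0, 1]) for row in cleaned))
```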

Golkar et al. help to bridge the gap between the emotional dysregulation of workplace burnout and its long-term impact on brain function. Such work is a valuable step towards not only better understanding the brain’s response to stress, but also better equipping us to manage our emotional and brain health – even after a long day of work.

References

Blix E, Perski A, Berglund H and Savic I (2013). Long-Term Occupational Stress Is Associated with Regional Reductions in Brain Tissue Volumes. PLOS One 8(6): e64065. doi:10.1371/journal.pone.0064065

Davis M (1992). The role of the amygdala in fear and anxiety. Annu Rev Neurosci 15:353-75. doi: 10.1146/annurev.ne.15.030192.002033

Flynn FG, Benson DF and Ardila A (1999). Anatomy of the insula – functional and clinical correlates. Aphasiology 13(1): 55-78. http://dx.doi.org/10.1080/026870399402325

Herman JP and Cullinan WE (1997). Neurocircuitry of stress: central control of the hypothalamo-pituitary-adrenocortical axis. Trends Neurosci 20(2):78-84. doi: 10.1016/S0166-2236(96)10069-2

Golkar A, Johansson E, Kasahara M, Osika W, Perski A and Savic I (2014). The influence of work-related chronic stress on the regulation of emotion and on functional connectivity in the brain. PLOS One 9(9): e104550. doi: 10.1371/journal.pone.0104550

Jovanovic H, Perski A, Berglund H and Savic I (2011). Chronic stress is linked to 5-HT(1A) receptor changes and functional disintegration of the limbic networks. Neuroimage 55(3): 1178-88. doi: 10.1016/j.neuroimage.2010.12.060

LeDoux JE (2000). Emotion Circuits in the Brain. Annu Rev Neurosci 23: 155-84. doi: 10.1146/annurev.neuro.23.1.155

Savic I (2013). Structural Changes of the Brain in Relation to Occupational Stress. Cereb Cortex. doi: 10.1093/cercor/bht348

Schutte N, Toppinen S, Kalimo R and Schaufeli W (2000). The factorial validity of the Maslach Burnout Inventory-General Survey (MBI-GS) across occupational groups and nations. J Occup Organ Psych 73(1): 53-66. http://dx.doi.org/10.1348/096317900166877

Wong CW, Olafsson V, Tal O and Liu TT (2013). The amplitude of the resting-state fMRI global signal is related to EEG vigilance measures. Neuroimage 83: 983-90. doi: 10.1016/j.neuroimage.2013.07.057

Yang GJ, Murray JD, Repovs G, Cole MW, Savic A, Glasser MF, Pittenger C, Krystal JH, Wang XJ, Pearlson GD, Glahn DC and Anticevic A (2014). Altered global brain signal in schizophrenia. PNAS 111(20): 7438-43. doi: 10.1073/pnas.1405289111


Brain Connectivity Patterns of Shifting Memory Processes

Originally published on the PLOS Neuroscience Community

At a recent dinner party, the memories flood your mind as you reminisce with an old friend. A woman approaches and your friend introduces you: “I’d like you to meet my wife, Margaret.” Your attention shifts from the past to this present moment, as you focus on making a new association between “Margaret” and the tall, dark-haired woman before you.

As during a dinner party with old friends and new acquaintances, the dynamically shifting stimulus landscape around us may trigger the retrieval of old memories or the formation of novel ones, often in overlap or rapid succession. What’s more, memory does not simply involve compartmentalized processes of the birth or reactivation of memories in isolation. Rather, successful execution of these processes also relies on support from non-mnemonic processing, such as evaluating a recalled memory or paying attention to new information. Although there is some overlap in the brain regions involved in laying down new memories and recovering old ones, the complex coordination of the many sub-processes of encoding and retrieval naturally requires cross-talk across distinct neural systems.

The brain’s medial temporal lobe is commonly considered the seat of memory – with the hippocampus lying at its heart – as encoding and retrieval rely critically on these regions. However, just as memory involves the coordination of many cognitive functions, so does it require the coordination of widespread brain networks. Both small-scale circuits across hippocampal subregions, and long-range brain systems, work together to integrate sensory information, control attention and filter relevant details in support of memory. A recent study from Katherine Duncan, Alexa Tompary, and Lila Davachi at NYU demonstrated just how the hippocampus shifts its communication with the surrounding brain to support its remarkable ability to rapidly switch between memory processes.

The researchers conducted fMRI while participants performed alternating blocks during which they encoded pairs of objects and then recalled those object pairs. A day later, participants returned for an unscanned long-term memory test, in which they reported whether they recognized the objects, and rated how confidently they recalled the pairs. This delayed memory test was used to measure how well the object associations had been encoded the day prior.

A standard analysis confirmed that, across all hippocampal subregions (CA1, DG/CA3 and the subiculum), activity increased during both successful encoding and successful retrieval. Notably, the retrieval effect was strongest in DG/CA3, in line with past studies suggesting that this region might function as an auto-associative network that reactivates stored memory traces. We’ve long known that the hippocampus is engaged during these processes; less certain is how the region interacts with the rest of the brain.

The researchers focused on hippocampal subregion CA1, an important hub along the bidirectional cortex-hippocampus highway, as it both receives input from the medial temporal lobe (via the dentate gyrus and CA3), and also provides output back to the cortex. Connectivity between DG/CA3 and CA1 was stronger during the retrieval than the encoding block, whereas connectivity with CA1 didn’t differ between memory blocks for any of the other medial temporal lobe or midbrain regions they investigated (Figure 1). Thus, not only was DG/CA3 highly activated, but it was also more strongly connected with its downstream hippocampal target, during retrieval.
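
A minimal sketch of the block-wise logic – not the authors’ full pipeline, which involved additional preprocessing – is to correlate the CA1 and DG/CA3 time series within each block, Fisher-transform the correlations, and average them separately for encoding and retrieval blocks. The time series and block structure below are simulated.

```python
import numpy as np

def block_connectivity(ca1, dgca3, blocks, condition):
    """Average Fisher-z correlation between two ROI time series within blocks of one condition.

    blocks: list of (start_volume, stop_volume, label) tuples.
    """
    zs = [np.arctanh(np.corrcoef(ca1[start:stop], dgca3[start:stop])[0, 1])
          for start, stop, label in blocks if label == condition]
    return np.mean(zs)

# Simulated run: 400 volumes of alternating 50-volume encoding and retrieval blocks.
rng = np.random.default_rng(6)
ca1, dgca3 = rng.standard_normal(400), rng.standard_normal(400)
blocks = [(i, i + 50, "encode" if (i // 50) % 2 == 0 else "retrieve")
          for i in range(0, 400, 50)]

print("encoding :", block_connectivity(ca1, dgca3, blocks, "encode"))
print("retrieval:", block_connectivity(ca1, dgca3, blocks, "retrieve"))
```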

Figure 1. Connectivity between CA1 and DG/CA3 is stronger during retrieval than encoding. Adapted from Duncan et al., 2014.

But how might memory-specific communication across regions subserve the brain’s changing cognitive goals? To test whether connectivity patterns in fact support memory success, the researchers correlated functional connectivity measures with encoding and retrieval performance. Supporting their prior findings, CA1-DG/CA3 connectivity correlated with immediate retrieval accuracy, but not with long-term memory (i.e., day 2 retrieval) (Figure 2, left). Conversely, connectivity between CA1 and the ventral tegmentum correlated with long-term memory, but not immediate retrieval accuracy (Figure 2, right). As Davachi explains, “This suggests that whatever this signal represents, it is explaining long-term – not short-term – memory, which arguably suggests that across subject variability in CA1-ventral tegmentum connectivity is related to the consolidation of memories, not just their initial encoding.”

Figure 2. CA1-DG/CA3 connectivity correlates with immediate retrieval, whereas CA1-ventral tegmentum connectivity correlates with memory consolidation. Adapted from Duncan et al., 2014.

Notably, these connectivity patterns emerged when examining activity across each encoding or retrieval block, but disappeared when isolating the trial-evoked responses. It therefore seems possible that these increases in connectivity strength may not directly support isolated moments of memory formation or reactivation, but instead, auxiliary processes that evolve gradually over time. However, Davachi cautions “These null effects do not necessarily imply that there are not important trial-evoked changes in connectivity, but rather, that the trial-evoked data are simply swamped with the incoming perceptual and task signals.”

While the role of DG/CA3 and its connectivity to CA1 in associative retrieval has been well documented, the encoding-specific link between CA1 and the ventral tegmentum is less expected. Regions such as the medial temporal lobe and the prefrontal cortex are traditionally considered the major players in memory encoding; yet, recent research has hinted at a more important role for the ventral tegmentum than previously thought. Furthermore, this finding aligns well with animal studies showing that input to CA1 from the ventral tegmentum is required for synaptic plasticity, and that long-term potentiation – key to long-term memory formation – is dopamine-dependent. But as Davachi emphasizes, “You can never know if the BOLD response is related to long-term potentiation. All we show is that the coordinated activation between the ventral tegmentum and CA1 is related to successful encoding (and not retrieval) but what this represents is unclear.” Indeed, the ventral tegmentum is involved in a host of other, non-memory functions as well, such as novelty detection and motivation, both of which would be critical for the encoding task used here – or when making a new acquaintance at a dinner party. What remains to be determined is how, if at all, hippocampal connectivity with the ventral tegmentum supports memory consolidation, or rather, these adjunct processes that might be important for establishing new memories.

Although this study demonstrated unique hippocampal interactions during encoding and retrieval, it can’t speak to the direction of information flow. For instance, since the hippocampal-ventral tegmentum connection is reciprocal, signaling could feasibly proceed in either direction. Furthermore, their findings don’t show that encoding and retrieval are exclusively associated with CA1-ventral tegmentum and CA1-DG/CA3 connectivity, respectively – only that the strength of these interactions differs depending on the memory manipulation.

While further studies, especially those which more directly measure neural activity, will help clarify questions concerning directionality and causality, these findings build significantly upon our knowledge of human memory. In particular, Duncan and colleagues’ techniques enable the assessment of communication with and across hippocampal subregions while directly evaluating memory, which has been challenging in animals. Their findings not only raise several important questions for follow-up, but critically, also bridge the gap between human and animal studies to help unify our understanding of the brain systems supporting encoding and retrieval.

References

Duncan K, Tompary A and Davachi L (2014). Associative encoding and retrieval are predicted by functional connectivity in distinct hippocampal area CA1 pathways. J Neurosci 34(34): 11188-98. doi: 10.1523/jneurosci.0521-14.2014

Murty VP and Adcock RA (2014). Enriched encoding: Reward motivation organizes cortical networks for hippocampal detection of unexpected events. Cereb Cortex 24(8):2160-8. doi: 10.1093/cercor/bht063

Treves A and Rolls ET (1992). Computational constraints suggest the need for two distinct input systems to the hippocampal CA3 network. Hippocampus 2(2): 189-99. doi: 10.1002/hipo.450020209
