People with Allergies May Have Lower Risk of Brain Tumors

New research adds to the growing body of evidence suggesting that there’s a link between allergies and reduced risk of a serious type of cancer that starts in the brain. This study suggests the reduced risk is stronger among women than men, although men with certain allergy profiles also have a lower tumor risk.

The study also strengthens scientists’ belief that something about having allergies, or a factor related to them, lowers the risk for this cancer. Because these tumors, called gliomas, can suppress the immune system in ways that allow them to grow, researchers have never been sure whether allergies reduce cancer risk or whether, before diagnosis, the tumors themselves interfere with the hypersensitive immune response to allergens.

Scientists conducting this study were able to analyze stored blood samples that were taken from patients decades before they were diagnosed with glioma. Men and women whose blood samples contained allergy-related antibodies had an almost 50 percent lower risk of developing glioma 20 years later compared to people without signs of allergies.

“This is our most important finding,” said Judith Schwartzbaum, associate professor of epidemiology at Ohio State University and lead author of the study. “The longer before glioma diagnosis that the effect of allergies is present, the less likely it is that the tumor is suppressing allergies. Seeing this association so long before tumor diagnosis suggests that antibodies or some aspect of allergy is reducing tumor risk.”

“It could be that in allergic people, higher levels of circulating antibodies may stimulate the immune system, and that could lower the risk of glioma,” said Schwartzbaum, also an investigator in Ohio State’s Comprehensive Cancer Center. “Absence of allergy is the strongest risk factor identified so far for this brain tumor, and there is still more to understand about how this association works.”

Many previous studies of the link between allergies and brain tumor risk have been based on self-reports of allergy history from patients diagnosed with glioma. No previous studies have had access to blood samples collected longer than 20 years before tumor diagnosis.

The current study also suggested that women whose blood samples tested positive for specific allergy antibodies had at least a 50 percent lower risk for the most serious and common type of these tumors, called glioblastoma. This effect for specific antibodies was not seen in men. However, men who tested positive for both specific antibodies and antibodies of unknown function had a 20 percent lower risk of this tumor than did men who tested negative.

Glioblastomas constitute up to 60 percent of adult tumors starting in the brain in the United States, affecting an estimated 3 in 100,000 people. Patients who undergo surgery, radiation and chemotherapy survive, on average, for about one year, with fewer than a quarter of patients surviving up to two years and fewer than 10 percent surviving up to five years.

The study is published online in the Journal of the National Cancer Institute.

Schwartzbaum and colleagues were granted access to specimens from the Janus Serum Bank in Norway. The bank contains samples collected over the last 40 years from citizens during annual medical evaluations and from volunteer blood donors. Norway has also registered every new case of cancer in the country since 1953, and personal identification numbers enable cross-referencing those cases with previously collected blood samples.

The researchers analyzed stored samples from 594 people who were diagnosed with glioma (including 374 diagnosed with glioblastoma) between 1974 and 2007. They matched these samples for date of blood collection, age and sex with 1,177 samples from people who were not diagnosed with glioma for comparison.

The researchers measured levels of two forms of IgE, or immunoglobulin E, in the blood samples. IgE is a class of antibodies produced by white blood cells that mediates immune responses to allergens. Two measures of IgE are relevant to the allergic response: allergen-specific IgE, which recognizes specific components of an allergen, and total IgE, which includes these allergen-specific antibodies as well as IgE antibodies with unknown functions.

In each sample, the scientists determined whether the serum contained elevated levels of IgE specific to the most common allergens in Norway as well as total IgE. The specific respiratory allergens included dust mites; tree pollen and plants; cat, dog and horse dander; and mold.

The researchers then conducted a statistical analysis to estimate how elevated concentrations of allergen-specific IgE and of total IgE were associated with the risk of developing glioma.
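To show how results of this kind translate into the percentages quoted below, here is a minimal sketch using entirely hypothetical counts. The actual analysis used matched case-control sets (by age, sex and date of blood collection), which call for conditional logistic regression; this simplified, unmatched two-by-two version only illustrates how an odds ratio maps onto a reported percentage reduction in risk.

```python
# Illustrative sketch only: hypothetical counts, not the study's data.
# The real analysis matched cases and controls (age, sex, blood-draw date);
# this unmatched 2x2 table just shows how an odds ratio is converted into a
# "percent decreased risk" figure of the kind quoted in the article.

cases_ige_pos, cases_ige_neg = 100, 494        # glioma cases, by elevated-IgE status
controls_ige_pos, controls_ige_neg = 350, 827  # comparison subjects, same split

odds_cases = cases_ige_pos / cases_ige_neg
odds_controls = controls_ige_pos / controls_ige_neg
odds_ratio = odds_cases / odds_controls

# An odds ratio below 1 means lower odds of glioma among IgE-positive subjects;
# 1 minus the odds ratio is the approximate risk reduction (e.g., an odds
# ratio of 0.46 would be reported as roughly "54 percent lower risk").
print(f"odds ratio: {odds_ratio:.2f}")
print(f"approximate risk reduction: {(1 - odds_ratio):.0%}")
```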

Among women, testing positive for elevated levels of allergen-specific IgE was associated with a 54 percent decreased risk of glioblastoma compared to women who tested negative for allergen-specific IgE. The researchers did not see this association in men.

However, the relation between total IgE levels and glioma risk did not differ statistically between men and women. For men and women combined, testing positive for elevated total IgE was linked to a 25 percent decreased risk of glioma compared with testing negative for total IgE.

When the analysis was restricted to glioblastoma, elevated total IgE was associated with a similarly decreased risk for men and women combined, but the finding was of borderline statistical significance, meaning the association could also be attributed to chance.

“There is definitely a difference in the effect of allergen-specific IgE between men and women. And even results for total IgE suggest there still may be a difference between the sexes. The reason for this difference is unknown,” Schwartzbaum said.

The study does, however, provide evidence that the immune systems of people with respiratory allergies may have a protective effect against this type of brain cancer. The ability to investigate this association over four decades between blood sampling and tumor diagnosis gave the researchers better insight into the relationship between allergies and tumor risk, Schwartzbaum said.

For example, a positive test for elevated concentrations of total IgE was associated with a 46 percent decreased risk for developing a glioma 20 years later compared to samples testing negative for elevated IgE, according to the analysis. That decreased risk was only about 25 percent in samples that tested positive for high levels of total IgE taken two to 15 years prior to diagnosis.

“There may be a trend – the closer the samples get to the time of diagnosis, the less help the IgE is in decreasing the risk for glioma. However, if the tumor were suppressing allergy, we would expect to see a bigger difference in risk near the time of diagnosis,” Schwartzbaum said.

Schwartzbaum plans to further analyze the serum samples for concentration of cytokines, which are chemical messengers that promote or suppress inflammation as part of the immune response, to see if these proteins have a role in the relationship between elevated IgE levels and lowered tumor risk.

This work was funded by the National Cancer Institute, the National Institutes of Health and a Research Enhancement and Assistance Program grant from Ohio State’s Comprehensive Cancer Center.

Co-authors include Bo Ding, Anders Ahlbom and Maria Feychting of the Karolinska Institutet in Stockholm, Sweden; Tom Borge Johannesen and Tom Grimsrud of the Cancer Registry of Norway; Liv Osnes of Ulleval University Hospital in Oslo, Norway; and Linda Karavodin of Karavodin Preclinical Consulting in Encinitas, Calif.


Journal Reference:

  1. J. Schwartzbaum, B. Ding, T. B. Johannesen, L. T. N. Osnes, L. Karavodin, A. Ahlbom, M. Feychting, T. K. Grimsrud. Association Between Prediagnostic IgE Levels and Risk of Glioma. JNCI Journal of the National Cancer Institute, 2012; DOI: 10.1093/jnci/djs315
 

Irony seen through the eye of MRI

In the cognitive sciences, the capacity to interpret the intentions of others is called "Theory of Mind" (ToM). This faculty is involved in the understanding of language, in particular by bridging the gap between the meaning of the words that make up a statement and the meaning of the statement as a whole.

In recent years, researchers have identified the neural network dedicated to ToM, but no one had yet demonstrated that this set of neurons is specifically activated by the process of understanding an utterance. This has now been accomplished: a team from L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) has shown that the activation of the ToM neural network increases when an individual is reacting to ironic statements.

Published in Neuroimage, these findings represent an important breakthrough in the study of Theory of Mind and linguistics, shedding light on the mechanisms involved in interpersonal communication.

In our communications with others, we are constantly thinking beyond the basic meaning of words. For example, if asked, "Do you have the time?" one would not simply reply, "Yes." The gap between what is said and what it means is the focus of a branch of linguistics called pragmatics. In this science, "Theory of Mind" (ToM) gives listeners the capacity to fill this gap. In order to decipher the meaning and intentions hidden behind what is said, even in the most casual conversation, ToM relies on a variety of verbal and non-verbal elements: the words used, their context, intonation, "body language," etc.

Within the past 10 years, researchers in cognitive neuroscience have identified a neural network dedicated to ToM that includes specific areas of the brain: the right and left temporal parietal junctions, the medial prefrontal cortex and the precuneus. To identify this network, the researchers relied primarily on non-verbal tasks based on the observation of others' behavior[1]. Today, researchers at L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) have established, for the first time, the link between this neural network and the processing of implicit meanings.

To identify this link, the team focused their attention on irony. An ironic statement usually means the opposite of what is said. In order to detect irony in a statement, the mechanisms of ToM must be brought into play. In their experiment, the researchers prepared 20 short narratives in two versions, one literal and one ironic. Each story contained a key sentence that, depending on the version, yielded an ironic or literal meaning. For example, in one of the stories an opera singer exclaims after a premiere, "Tonight we gave a superb performance." Depending on whether the performance was in fact very bad or very good, the statement is or is not ironic.

The team then carried out functional magnetic resonance imaging (fMRI) analyses on 20 participants who were asked to read 18 of the stories, chosen at random, in either their ironic or literal version. The participants were not aware that the test concerned the perception of irony. The researchers had predicted that the participants' ToM neural networks would show increased activity in reaction to the ironic sentences, and that was precisely what they observed: as each key sentence was read, the network activity was greater when the statement was ironic. This shows that this network is directly involved in the processes of understanding irony, and, more generally, in the comprehension of language.

Next, the L2C2 researchers hope to expand their research on the ToM network in order to determine, for example, whether test participants would be able to perceive irony if this network were artificially inactivated.

Note:

[1] For example, Grèzes, Frith & Passingham (J. Neuroscience, 2004) showed a series of short (3.5 second) films in which actors came into a room and lifted boxes. Some of the actors were instructed to act as though the boxes were heavier (or lighter) than they actually were. Having thus set up deceptive situations, the experimenters asked the participants to determine if they had or had not been deceived by the actors in the films. The films containing feigned actions elicited increased activity in the rTPJ (right temporal parietal junction) compared with those containing unfeigned actions.


Journal Reference:

  1. Nicola Spotorno, Eric Koun, Jérôme Prado, Jean-Baptiste Van Der Henst, Ira A. Noveck. Neural evidence that utterance-processing entails mentalizing: The case of irony. NeuroImage, 2012; 63 (1): 25 DOI: 10.1016/j.neuroimage.2012.06.046
 

Judging adolescents' actions: Teens mature intellectually before they mature emotionally

Determining when a teenage brain becomes an adult brain is not an exact science but it's getting closer, according to an expert in adolescent developmental psychology, speaking at the American Psychological Association's 120th Annual Convention.

Important changes in adolescent brain anatomy and activity take place far later in development than previously thought, and those findings could impact how policymakers and the highest courts are treating teenagers, said Laurence Steinberg, PhD. "Explicit reference to the science of adolescent brain development is making its way into the national conversation," said Steinberg, a professor of psychology at Temple University.

He referred to the recent Supreme Court ruling in Miller v. Alabama, which cited APA's amicus brief explaining the current research. The ruling found that even in cases involving homicide, statutes that provide for mandatory life without parole for juveniles are unconstitutional. APA also filed amicus briefs in two prior Supreme Court cases in which the court ruled that the death penalty and life without parole in non-homicide cases are never constitutional where juveniles are involved.

"The Supreme Court decision that eliminated mandatory life without parole sentences for juveniles in homicide cases was certainly a step in the right direction but might have gone further as it is still possible for an adolescent to receive a sentence of life without parole, even though it isn't mandatory," Steinberg said

Many adolescents developmentally do not have the same control over their actions as mature adults and should be treated differently, according to Steinberg. Specific structural changes occur in the brain during adolescence, as do tremendous changes in how the brain works, he said. For example, from adolescence into early adulthood, there is a strengthening of activity in brain systems involving self-regulation, and functional MRIs have shown that reward centers in the adolescent brain are activated more than in children or adults, he said.

"Heightened sensitivity to anticipated rewards motivates adolescents to engage in risky acts, such as unprotected sex, fast driving or drugs when the potential for pleasure is high. This hypersensitivity to reward is particularly pronounced when they're with their friends," he said.

Policymakers face the question of when teenagers are responsible for their actions or can make reasoned decisions, such as in medical situations, and there is no simple answer because it is possible that an adolescent may be mature enough for some but not all decisions, according to Steinberg. The circumstances under which a 16-year-old makes medical decisions or commits crimes are very different and place different demands on their brains and abilities, he said. Brain systems implicated in basic cognitive processes reach adult levels of maturity by mid-adolescence, whereas those that are active in self-regulation do not fully mature until late adolescence or even early adulthood, he noted.

"In other words, adolescents mature intellectually before they mature socially or emotionally, a fact that helps explain why teenagers who are so smart in some respects sometimes do surprisingly dumb things," he said. "From a neuroscientific standpoint, it therefore makes perfect sense to have a lower age for autonomous medical decision-making than for eligibility for capital punishment, because certain brain systems mature earlier than others."

How the research should be interpreted and applied by policymakers and the courts is an issue behavioral researchers and scientists are considering as their discipline becomes more prominently featured in top legal and policy arguments, Steinberg added. "Some will use this evidence to argue in favor of restricting adolescents' rights, and others will use it to advocate for policies that protect adolescents from harm," he said. "In either case, scientists should welcome the opportunity to inform policy and legal discussions with the best available empirical evidence."

 

Identifying a new target for amyotrophic lateral sclerosis treatment

Amyotrophic lateral sclerosis (ALS) is a progressive disease in which the cells of the central nervous system (CNS) involved in movement and coordination are destroyed. Although the mechanism of ALS is not completely understood, inflammation is believed to play a role in the disease process.

A recent study by Howard Weiner and colleagues at Harvard Medical School and Tufts School of Medicine investigated the role of inflammation in a mouse model of ALS. Weiner and colleagues found that the recruitment of activated immune cells known as monocytes into the spinal cord correlated with increased CNS cell death, and this recruitment was mediated by high expression of the chemoattractant protein CCL2 by resident spinal cord-derived immune cells.

Antibody-mediated depletion of the monocyte population reduced cellular recruitment to the spinal cord, decreased CNS cell death, and extended survival time in the mice. The analogous monocyte population in humans with ALS exhibited a similar inflammatory signature to the ALS model mice, suggesting that this cell population could serve as a marker of disease progression in human ALS patients. Thus, these results identify an inflammatory monocyte population as a potential therapeutic target for ALS.


Journal Reference:

  1. Oleg Butovsky, Shafiuddin Siddiqui, Galina Gabriely, Amanda J. Lanser, Ben Dake, Gopal Murugaiyan, Camille E. Doykan, Pauline M. Wu, Reddy R. Gali, Lakshmanan K. Iyer, Robert Lawson, James Berry, Anna M. Krichevsky, Merit E. Cudkowicz, Howard L. Weiner. Modulating inflammatory monocytes with a unique microRNA gene signature ameliorates murine ALS. Journal of Clinical Investigation, 2012; DOI: 10.1172/JCI62636
 

Dyslexia caused by faulty signal processing in brain; Finding offers clues to potential treatments

Many children and adults have difficulties reading and writing, and the reason is not always obvious. Those who suffer from dyslexia can exhibit a variety of symptoms. Thanks to research carried out by Begoña Díaz and her colleagues at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, a major step forward has been made in understanding the cause of dyslexia.

The scientists have discovered an important neural mechanism underlying dyslexia and shown that many difficulties associated with dyslexia can potentially be traced back to a malfunction of the medial geniculate body in the thalamus. The results provide an important basis for developing potential treatments.

People who suffer from dyslexia have difficulties with identifying speech sounds in spoken language. For example, while most children are able to recognise whether two words rhyme even before they go to school, dyslexic children often cannot do this until late primary school age. Those affected suffer from dyslexia their whole lives, although some people manage to compensate for it. "This suggests that dyslexia can be treated. We are therefore trying to find the neural causes of this learning disability in order to create a basis for improved treatment options," says Díaz.

Between five and ten percent of the world's children suffer from dyslexia, yet very little is known about its causes. Even though those affected do not lack intelligence or schooling, they have difficulties in reading, understanding and explaining individual words or entire texts. The researchers showed that a malfunction in a structure that transfers auditory information from the ear to the cortex is a major cause of the impairment in dyslexic adults: the medial geniculate body in the auditory thalamus does not process speech sounds correctly. "This malfunction at a low level of language processing could percolate through the entire system. This explains why the symptoms of dyslexia are so varied," says Díaz.

Under the direction of Katharina von Kriegstein, the researchers conducted two experiments in which volunteers performed various speech comprehension tasks. When affected individuals performed tasks that required recognizing speech sounds, as opposed to recognizing the voices that pronounced the same sounds, magnetic resonance imaging recordings showed abnormal responses in the area around the medial geniculate body. In contrast, no differences were apparent between controls and dyslexic participants when they simply listened to the speech sounds without having to perform a specific task. "The problem, therefore, has nothing to do with sensory processing itself, but with the processing involved in speech recognition," says Díaz. No differences could be found between the two groups in other areas of the auditory pathway.

The findings of the Leipzig scientists bring together, for the first time, several theoretical approaches to the cause of dyslexia into an overall picture. "Recognising the cause of a problem is always the first step on the way to a successful treatment," says Díaz. The researchers' next project is to study whether current treatment programmes can influence the medial geniculate body and so, in the long term, make learning to read easier.


Journal Reference:

  1. B. Diaz, F. Hintz, S. J. Kiebel, K. von Kriegstein. Dysfunction of the auditory thalamus in developmental dyslexia. Proceedings of the National Academy of Sciences, 2012; DOI: 10.1073/pnas.1119828109
 

Searching for tumors or handguns can be like looking for food

If past experience makes you think there's going to be one more cashew at the bottom of the bowl, you're likely to search through those mixed nuts a little longer.

But what keeps the attention of a radiologist who sees just 70 suspicious lesions in 1,000 mammograms or a baggage screener who hasn't found a handgun in more than a year?

The answer, according to biological theory and a laboratory study conducted by Duke University psychologists, may be to make those professional searchers believe there are more targets to be found.

"In the real world, most of the time you don't have to find absolutely everything," said post-doctoral researcher Matthew S. Cain. Consequently, we tend to search for something until our experience tells us the payoffs are declining.

But baggage screeners and radiologists are expected to find absolutely everything, which research by this group says is a real long shot. In earlier work, they found that humans visually searching for things can miss targets that only appear rarely, and will often miss items after finding a first target, even if there are more to be found.

In a laboratory experiment that taps into the biological theory of foraging behavior, the Duke group found that the shortcomings in our visual searching abilities may be rooted in the evolutionary past.

Test subjects were presented with a series of screens and told to pick out a particular shape among many similar shapes. Some subjects were given an environment where there would be lots of targets to find; others had slimmer pickings. Feedback after each screen showed them what they had missed.

"The basic pattern is very clear," said Cain, a post-doctoral researcher at Brown University who did this work in the Duke Visual Cognition Lab. "Searchers who had found that there could be a lot of targets stayed on task longer. Searchers who had fewer targets to find gave up on a given screen sooner."

The experiment also tested the research group's earlier findings on "satisfaction of search," in which people are unlikely to see a second target after they've found a first one. Again, the people with fewer targets to find were more likely to quit searching after one hit; those with higher-frequency targets stayed on task longer.

The study, which appeared online Aug. 6 in Psychological Science, was supported by the Army Research Office and the Department of Homeland Security. Cain and Stephen Mitroff, a Duke professor of psychology & neuroscience, are part of a regional research collaborative in North Carolina's Research Triangle known as the Institute for Homeland Security Solutions. They are currently working with Transportation Security Administration (TSA) baggage screening officials at Raleigh-Durham International Airport to study real-world expert visual searching.

The researchers say that for crucial searching tasks like airport security and cancer screening, the effectiveness of screeners could be improved by making them think there are more targets to be found.

In fact, through a program known as "threat image projection," the TSA is already doing this by digitally inserting phantom images of contraband into the images seen by baggage screeners to increase the hit rate and improve attention, Cain said.

Now the Duke team thinks it knows why this works. The explanation appears to lie in what biologists call "foraging theory": it is well documented in nature that a foraging creature decides at some point that the patch of food it is working on is no richer than the surrounding environment, and moves on to find something better.
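As a rough illustration of how such a quitting rule could produce the behavior seen in the lab, here is a toy calculation under stated assumptions; it is not the Bayesian model from the paper, and the function and parameter names (coverage_at_quit, quit_threshold, lam) are invented for this sketch. Assume the number of targets on a screen follows a Poisson distribution whose mean reflects the searcher's learned expectation, and that after covering a fraction c of the screen each target present has been found with probability c. The expected number of still-undiscovered targets is then the Poisson mean times (1 - c), and a searcher quits once that expectation drops below the cost of continuing.

```python
# Toy quitting rule for visual search, framed as foraging (all values assumed).
# Targets per screen ~ Poisson(lam); after covering a fraction c of the screen,
# each target present has been found with probability c, so by Poisson thinning
# the expected number of undiscovered targets is lam * (1 - c), no matter how
# many targets have already been found.

def coverage_at_quit(lam, quit_threshold=0.25):
    """Fraction of the screen searched before the expected number of
    remaining targets drops below quit_threshold."""
    if lam <= quit_threshold:
        return 0.0  # expectations so low that a careful search never pays
    return 1.0 - quit_threshold / lam

for lam in (0.5, 1.0, 2.0, 4.0):
    c = coverage_at_quit(lam)
    print(f"expected targets per screen {lam:3.1f} -> quit after covering {c:.0%}")
```

The more targets a searcher has learned to expect, the larger the fraction of each screen it pays to examine before giving up, which is the lever the threat image projection program pulls by inflating screeners' expectations.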

Foraging theory has been supported by field studies on birds, primates, insects and rodents, and even in organisms that don't have brains, such as plants, said Michael Platt, director of the Duke Institute for Brain Sciences and Center for Cognitive Neuroscience. He and his colleagues Ben Hayden, now at the University of Rochester, and John Pearson helped this team apply the foraging model to their work on visual searches. (A study by Platt last year found the areas of a monkey's brain that apparently govern this foraging behavior.)

"This study endorses the idea that the brains of a wide array of animals, including humans, evolved to solve foraging problems in similar ways — whether foraging for food, mates or visual information," Platt said. "What's really fascinating is the implication that our brains are specialized to search in specific ways, and that these biases may have real-world consequences for medicine or national security."

"Applying animal foraging theory to human searches can open new doors for how to improve performance," said Mitroff, who co-authored the study and directs the Duke Visual Cognition Lab. "The key to reducing errors in difficult searches, such as those done by radiologists and airport security officers, is to understand where things go wrong. These new findings suggest we may be able to make searchers better by simply adjusting their expectations."


Journal Reference:

  1. M. S. Cain, E. Vul, K. Clark, S. R. Mitroff. A Bayesian Optimal Foraging Model of Human Visual Search. Psychological Science, 2012; DOI: 10.1177/0956797612440460
 

Learning: Stressed people use different strategies and brain regions

Stressed and non-stressed people use different brain regions and different strategies when learning. This has been reported by the cognitive psychologists PD Dr. Lars Schwabe and Professor Oliver Wolf from the Ruhr-Universität Bochum in the Journal of Neuroscience. Non-stressed individuals applied a deliberate learning strategy, while stressed subjects relied more on their gut feeling. "These results demonstrate for the first time that stress has an influence on which of the different memory systems the brain turns on," said Lars Schwabe.

The experiment: Stress due to ice-water

The data from 59 subjects were included in the study. Half of the participants had to immerse one hand in ice-cold water for three minutes under video surveillance. This stressed the subjects, as hormone assays showed. The other participants only had to immerse one hand in warm water. Both the stressed and the non-stressed individuals then completed the so-called weather prediction task: they looked at playing cards with different symbols and learned to predict which combinations of cards signalled rain and which signalled sunshine. Each combination of cards was associated with a certain probability of good or bad weather, and people apply strategies of differing complexity to master the task. During the weather prediction task, the researchers recorded brain activity with MRI.

Two routes to success

Both stressed and non-stressed subjects learned to predict the weather from the symbols. Non-stressed participants focused on individual symbols rather than on combinations of symbols: they consciously pursued a simple strategy. The MRI data showed that they activated a brain region in the medial temporal lobe — the hippocampus, which is important for long-term memory. Stressed subjects, on the other hand, applied a more complex strategy, basing their decisions on combinations of symbols. They did this subconsciously, however; they were not able to put their strategy into words. The brain scans matched this pattern: in the stressed volunteers, the so-called striatum was activated — a brain region responsible for more unconscious learning. "Stress interferes with conscious, purposeful learning, which is dependent upon the hippocampus," concluded Lars Schwabe. "So that makes the brain use other resources. In the case of stress, the striatum controls behaviour — which preserves the learning achievement."
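To make the difference between the two strategies concrete, the sketch below simulates a simplified weather-prediction-style task. The card probabilities, the two strategy rules and all function names are assumptions made for this illustration, not the parameters of the Bochum experiment, and the combination strategy is idealized in that it already knows the true probabilities.

```python
# Simplified weather prediction task (illustrative parameters only).
import itertools
import random

random.seed(0)

CUES = ("A", "B", "C", "D")
# Assumed probability of rain for every combination of one to three visible cards.
P_RAIN = {combo: random.uniform(0.1, 0.9)
          for r in (1, 2, 3)
          for combo in itertools.combinations(CUES, r)}

def draw_trial():
    combo = random.choice(list(P_RAIN))
    outcome = "rain" if random.random() < P_RAIN[combo] else "sun"
    return combo, outcome

def single_cue_strategy(combo):
    # Simple, easily verbalized rule of the kind non-stressed learners used:
    # predict rain whenever one particular card is present.
    return "rain" if "A" in combo else "sun"

def combination_strategy(combo):
    # More complex rule of the kind stressed learners applied implicitly:
    # judge each full card combination (here, using the true probabilities).
    return "rain" if P_RAIN[combo] > 0.5 else "sun"

def accuracy(strategy, n_trials=20000):
    hits = sum(strategy(combo) == outcome
               for combo, outcome in (draw_trial() for _ in range(n_trials)))
    return hits / n_trials

print("single-cue strategy :", accuracy(single_cue_strategy))
print("combination strategy:", accuracy(combination_strategy))
```

In this toy version the combination rule tends to score higher because it exploits all of the probabilistic structure, while the single-cue rule is easier to state in words; the study's point is that stress shifts which kind of rule, and which memory system, the brain relies on.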


Journal Reference:

  1. L. Schwabe, O. Wolf. Stress modulates the engagement of multiple memory systems in classification learning. Journal of Neuroscience, 2012; DOI: 10.1523/JNEUROSCI.1484-12.2012
 

One week of therapy may help reorganize brain, reduce stuttering

Just one week of speech therapy may reorganize the brain, helping to reduce stuttering, according to a study published in the August 8, 2012, online issue of Neurology®, the medical journal of the American Academy of Neurology.

The Chinese study gives researchers new insights into the role of different brain regions in stuttering, which affects about one percent of adults.

The study involved 28 people with stuttering and 13 people who did not stutter. Fifteen of the people with stuttering received a week of therapy with three sessions per day. The other stutterers and the controls received no therapy. Therapy involved the participants repeating two-syllable words that were spoken to them and then reading words presented to them visually. There was no time limit in either task. The average scores on stuttering tests and percent of stuttered syllables improved for those who received the therapy. There was no change in scores for the stutterers who did not receive therapy.

Brain scans were used to measure the thickness of the cerebral cortex in all participants at the beginning and end of the study. The researchers also measured the interactions between brain areas at rest, known as resting-state functional connectivity. In people who stutter, cortical thickness and the strength of these interactions were reduced, compared with controls, in the pars opercularis, an area of the brain important in speech and language production. By contrast, the strength of interactions in the cerebellum was increased in those who stutter compared with the controls.

For those who received the therapy, the functional connectivity in the cerebellum was reduced to the same level as that of the controls. There was no change in the pars opercularis area of the brain.

"These results show that the brain can reorganize itself with therapy, and that changes in the cerebellum are a result of the brain compensating for stuttering," said study author Chunming Lu, PhD, of Beijing Normal University in China. "They also provide evidence that the structure of the pars opercularis area of the brain is altered in people with stuttering."

Christian A. Kell, MD, of Goethe University in Frankfurt, Germany, who wrote an editorial accompanying the study, said, "These findings should further motivate therapists and researchers in their efforts to determine how therapy works to reorganize the brain and reduce stuttering."

The study was supported by the National Natural Science Foundation of China.


Journal Reference:

  1. C. Lu, C. Chen, D. Peng, W. You, X. Zhang, G. Ding, X. Deng, Q. Yan, P. Howell. Neural anomaly and reorganization in speakers who stutter: A short-term intervention study. Neurology, 2012; 79 (7): 625 DOI: 10.1212/WNL.0b013e31826356d2
 

Hormone in fruit flies sheds light on diabetes cure, weight-loss drug for humans

Manipulating a group of hormone-producing cells in the brain can control blood sugar levels in the body — a discovery that has dramatic potential for research into weight-loss drugs and diabetes treatment.

In a paper published in the October issue of Genetics and available online now, neurobiologists at Wake Forest University examine how fruit flies (Drosophila) react when confronted with a decreased diet.

Reduced diet or starvation normally leads to hyperactivity in fruit flies — a hungry fly buzzes around feverishly, looking for more food. That happens because an enzyme called AMP-activated kinase stimulates the secretion of adipokinetic hormone, the functional equivalent of glucagon. This hormone acts opposite to insulin: it tells the body to release the sugar needed to fuel that hyperactivity, and the body uses up its energy stores until it finds food.

But when Wake Forest's Erik Johnson, an associate professor of biology, and his research team turned off AMP-activated kinase, the cells decreased sugar release and the hyperactive response stopped almost completely — even in the face of starvation.

"Since fruit flies and humans share 30 percent of the same genes and our brains are essentially wired the same way, it suggests that this discovery could inform metabolic research in general and diabetes research specifically," said Johnson, the study's principal investigator. "The basic biophysical, biochemical makeup is the same. The difference in complexity is in the number of cells. Why flies are so simple is that they have approximately 100,000 neurons versus the approximately 11 billion in humans."

Medical advances as a result of this research might include:

Diabetes research: Adipokinetic hormone is the insect equivalent to the hormone glucagon in the human pancreas. Glucagon raises blood sugar levels; insulin reduces them. However, it is difficult to study glucagon systems because the pancreatic cells are hard to pull apart. Studying how this similar system works in the fruit fly could pave the way to a drug that targets the cells that cause glucagon to tell the body to release sugar into the blood — thus reducing the need for insulin shots in diabetics.

Weight-loss drugs: An "exercise drug" would turn on all AMP-activated kinase in the body and trick the body into thinking it was exercising. "Exercise stimulates AMP-activated kinase, so manipulation of this molecule may lead to getting the benefits of exercise without exercising," Johnson said. In previous research published in the online journal PLoS ONE, Johnson and his colleagues found that, when you turn off AMP-activated kinase, you get fruit flies that "eat a lot more than normal flies, move around a lot less, and end up fatter."

Johnson's current study is funded by the National Science Foundation and the National Institutes of Health. Co-authors are Jason Braco, Emily Gillespie and Gregory Alberto of Wake Forest, and Jay Brenman of the University of North Carolina-Chapel Hill.


Journal Reference:

  1. J. T. Braco, E. L. Gillespie, G. E. Alberto, J. E. Brenman, E. C. Johnson. Energy-dependent Modulation of Glucagon-like Signaling in Drosophila via the AMP-activated Protein Kinase. Genetics, 2012; DOI: 10.1534/genetics.112.143610
 

Smelling a skunk after a cold: Brain changes after a stuffed nose protect the sense of smell

Has a summer cold or mold allergy stuffed up your nose and dampened your sense of smell? We take it for granted that once our nostrils clear, our sniffers will dependably rebound and alert us to a lurking neighborhood skunk or a caramel corn shop ahead.

That dependability is no accident. It turns out the brain is working overtime behind the scenes to make sure the sense of smell is just as sharp after the nose recovers.

A new Northwestern Medicine study shows that after the human nose is experimentally blocked for one week, brain activity rapidly changes in olfactory brain regions. This change suggests the brain is compensating for the interruption of this vital sense. The brain activity returns to a normal pattern shortly after free breathing has been restored.

Previous research in animals has suggested that the olfactory system is resistant to perceptual changes following odor deprivation. This new paper focuses on humans to show how that's possible. The study is published in the journal Nature Neuroscience.

"You need ongoing sensory input in order for your brain to update smell information," said Keng Nei Wu, the lead author of the paper and a graduate student in neuroscience at Northwestern University Feinberg School of Medicine. "When your nostrils are blocked up, your brain tries to adjust to the lack of information so the system doesn't break down. The brain compensates for the lack of information so when you get your sense of smell back, it will be in good working order."

For the study, Wu completely blocked the nostrils of 14 participants for a week while they lived in a special low-odor hospital room. At night, participants were allowed to breathe normally while they slept in the room.

After the smell deprivation, researchers found an increase in activity in the orbital frontal cortex and a decrease of activity in the piriform cortex, two regions related to the sense of smell.

"These changes in the brain are instrumental in maintaining the way we smell things even after seven days of no smell," Wu said.

When unrestricted breathing was restored, people were immediately able to perceive odors. A week after the deprivation experience, the brain's response to odors had returned to pre-experimental levels, indicating that deprivation-caused changes are rapidly reversed.

Such a rapid reversal is quite different from other sensory systems, such as sight, which typically have longer-lasting effects due to deprivation. The olfactory system is more agile, Wu suggested, because smell deprivation due to viral infection or allergies is common.

This study also has clinical significance relating to upper respiratory infection and sinusitis, especially when such problems become chronic, at which point ongoing deprivation could cause more profound and lasting changes, Wu noted.

"It also implies that deprivation has a significant impact on the brain, rather than on the nose itself," Wu said. "More knowledge about how the system reacts to short-term deprivation may provide new insights into how to deal with this problem in a chronic context."

Other Northwestern authors include Bruce K. Tan, James D. Howard, David B. Conley and Jay A. Gottfried, the senior author.


Journal Reference:

  1. Keng Nei Wu, Bruce K Tan, James D Howard, David B Conley, Jay A Gottfried. Olfactory input is critical for sustaining odor quality codes in human orbitofrontal cortex. Nature Neuroscience, 2012; DOI: 10.1038/nn.3186