Looking at a tough hill to climb? Depends on your point of view

People tend to overestimate the steepness of slopes — and psychologists studying the phenomenon have made a discovery that refutes common ideas about how we perceive inclines in general.

For more than a decade, researchers thought that our judgment was biased by our fatigue or fear of falling, explained Dennis Shaffer, associate professor of psychology at Ohio State University's Mansfield campus. We perceive climbing or descending hills as difficult or dangerous, so when we look at an incline, our view is clouded by the expected physical exertion or danger of traversing it.

For a study in the current issue of the journal Psychological Science, Shaffer and then-undergraduate student Mariagrace Flint uncovered a contradiction when they compared how we perceive the angle of stairs versus escalators.

"We found that people tend to overestimate a slant even when they are looking at an escalator, and climbing or descending it would require practically no effort at all," Shaffer said.

For the study, 200 passersby were asked to judge the angle of a set of stairs on the Mansfield campus, while another 200 were asked to judge the angle of an escalator in a Mansfield Sears store. In each case, 100 people viewed the angle from the top, and 100 from the bottom.

On average, people consistently overestimated the slant by 18-19 degrees, regardless of whether they were looking at a set of stairs or an escalator, from the top or from the bottom. The actual slope of the steps was 25 degrees, and the slope of the escalator was 30 degrees, but people judged them to be an average of 44 degrees and 48 degrees, respectively.

"In fact, their overestimates were virtually identical," Shaffer said.

The study adds to a growing body of evidence that body-based factors, such as climbing effort or perceived danger, do not have the strong influences on our perception of slant that researchers once thought.

At least, Shaffer added, we can take comfort that our misperceptions are consistent.

"The range of effort required for different activities — walking or running, riding a bike or a ski lift, driving or riding in a car, or riding an escalator — is large, and it depends on whether we're feeling energized or fatigued," he said.

"The constancy of slant perception shown here, like other perceptual constancies such as size, color, lightness, and orientation, guarantees that our perception of important features in the real world remains stable across large variations in viewing conditions."

He suspects that our perception of slant is biased by a more basic misperception: the angle of our gaze. People, he says, tend to think they are looking downward at a sharper angle than they actually are.

"If people believe they are looking more downward towards the bottom of the hill, and the hill looks perpendicular to their line of sight from there, the hill could look steeper to them," he said. "But this hasn't been tested — we're working on that next."

Other research has already shown that people standing above a hill judge it to be less steep when they stand right at the edge and steeper when they step back from it.

Shaffer is also studying a related effect: why, when viewing a hill from below, we overestimate slant more as we stand farther away, up to a distance of 50 meters, at which point our estimates level off.

Despite the apparent constancy of our misperceptions, Shaffer maintains that it's possible for people to learn to judge slopes more accurately.

"I've had roofers take my class before, and they always seem to be accurate with their estimations," he said.


Journal Reference:

  1. D. M. Shaffer, M. Flint. Escalating Slant: Increasing Physiological Potential Does Not Reduce Slant Overestimates. Psychological Science, 2010; DOI: 10.1177/0956797610393744

Crocodile tears don't fool us all: Study gives behavioral clues to spot fabricated versus genuine displays of remorse

How easy is it to fake remorse? Not so easy if your audience knows what to look for.

In the first investigation of the nature of true and false remorse, Leanne ten Brinke and colleagues from the Centre for the Advancement of Psychology and Law (CAPSL), University of British Columbia and Memorial University of Newfoundland in Canada, show that those who fake remorse display a greater range of emotional expressions, swing from one emotion to another very quickly (a phenomenon referred to as emotional turbulence), and speak with more hesitation. These findings have important implications for judges and parole board members, who look for genuine remorse when making their sentencing and release decisions.

Ten Brinke's work is published in Springer's journal Law and Human Behavior.

Deception is a common aspect of human social interaction that can have major implications if it goes undetected, particularly in the context of criminal sentencing and parole hearings, where the perceived credibility of a defendant's emotions during testimony informs decisions about their future.

Ten Brinke and colleagues examined the facial, verbal and body language behaviors associated with emotional deception in videotaped accounts of true personal wrongdoing, recounted with either genuine or fabricated remorse, among 31 Canadian undergraduate students. Their analysis of nearly 300,000 frames showed that participants who feigned remorse displayed more of the seven universal emotions (happiness, sadness, fear, disgust, anger, surprise, and contempt) than those who were genuinely sorry.

The authors grouped the emotions displayed in facial expressions into three categories: positive (happiness), negative (sadness, fear, anger, contempt, disgust) and neutral (neutral, surprise). They found that participants who were genuinely remorseful did not often swing directly from positive to negative emotions, but went through neutral emotions first. In contrast, those who were deceiving the researchers made more frequent direct transitions between positive and negative emotions, with fewer displays of neutral emotions in between. In addition, during fabricated remorse, students had a significantly higher rate of speech hesitations than during true remorse.
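The transition analysis described above can be illustrated with a short sketch. The Python below is not the authors' code; it simply shows, for a hypothetical frame-by-frame sequence of coded emotions, one way to count direct positive-to-negative swings (the "emotional turbulence" pattern) versus swings that pass through a neutral expression first.

```python
# Illustrative sketch, not the study's analysis pipeline: classify coded emotions
# into positive / negative / neutral and count how often the expression swings
# directly between positive and negative versus passing through neutral first.

CATEGORY = {
    "happiness": "positive",
    "sadness": "negative", "fear": "negative", "anger": "negative",
    "contempt": "negative", "disgust": "negative",
    "neutral": "neutral", "surprise": "neutral",
}

def count_transitions(frames):
    """Return (direct positive<->negative swings, transitions involving neutral)."""
    cats = [CATEGORY[f] for f in frames]
    # collapse runs of identical categories into a sequence of distinct states
    states = [c for i, c in enumerate(cats) if i == 0 or c != cats[i - 1]]
    direct, buffered = 0, 0
    for prev, cur in zip(states, states[1:]):
        if {prev, cur} == {"positive", "negative"}:
            direct += 1
        elif "neutral" in (prev, cur):
            buffered += 1
    return direct, buffered

# Hypothetical frame labels; genuine remorse tended toward the second pattern.
print(count_transitions(["happiness", "sadness", "neutral", "sadness"]))  # (1, 2)
print(count_transitions(["happiness", "neutral", "sadness", "neutral"]))  # (0, 3)
```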

The authors conclude: "Our study is the first to investigate genuine and falsified remorse for behavioral cues that might be indicative of such deception. Identifying reliable cues could have considerable practical implications — for example for forensic psychologists, parole officers and legal decision-makers who need to assess the truthfulness of remorseful displays."


Journal Reference:

  1. Leanne ten Brinke, Sarah MacDonald, Stephen Porter, Brian O’Connor. Crocodile Tears: Facial, Verbal and Body Language Behaviours Associated with Genuine and Fabricated Remorse. Law and Human Behavior, 2011; DOI: 10.1007/s10979-011-9265-5

Brief diversions vastly improve focus, researchers find

A new study in the journal Cognition overturns a decades-old theory about the nature of attention and demonstrates that even brief diversions from a task can dramatically improve one's ability to focus on that task for prolonged periods.

The study zeroes in on a phenomenon known to anyone who's ever had trouble doing the same task for a long time: After a while, you begin to lose your focus and your performance on the task declines.

Some researchers believe that this "vigilance decrement," as they describe it, is the result of a drop in one's "attentional resources," said University of Illinois psychology professor Alejandro Lleras, who led the new study. "For 40 or 50 years, most papers published on the vigilance decrement treated attention as a limited resource that would get used up over time, and I believe that to be wrong. You start performing poorly on a task because you've stopped paying attention to it," he said. "But you are always paying attention to something. Attention is not the problem."

Lleras had noticed that a similar phenomenon occurs in sensory perception: The brain gradually stops registering a sight, sound or feeling if that stimulus remains constant over time. For example, most people are not aware of the sensation of clothing touching their skin. The body becomes "habituated" to the feeling and the stimulus no longer registers in any meaningful way in the brain.

In previous studies, Lleras explored the limits of visual perception over time, focusing on a phenomenon called Troxler fading, in which sustained attention to a stationary object in one's peripheral vision can lead to that object's complete "disappearance" from view.

"Constant stimulation is registered by our brains as unimportant, to the point that the brain erases it from our awareness," Lleras said. "So I thought, well, if there's some kind of analogy about the ways the brain fundamentally processes information, things that are true for sensations ought to be true for thoughts. If sustained attention to a sensation makes that sensation vanish from our awareness, sustained attention to a thought should also lead to that thought's disappearance from our mind!"

In the new study, Lleras and postdoctoral fellow Atsunori Ariga tested participants' ability to focus on a repetitive computerized task for about an hour under various conditions. The 84 study subjects were divided into four groups:

  • The control group performed the 50-minute task without breaks or diversions.
  • The "switch" group and the "no-switch" group memorized four digits prior to performing the task, and were told to respond if they saw one of the digits on the screen during the task. Only the switch group was actually presented with the digits (twice) during the 50-minute experiment. Both groups were tested on their memory of the digits at the end of the task.
  • The "digit-ignored" group was shown the same digits presented to the switch group during the task, but was told to ignore them.

As expected, most participants' performance declined significantly over the course of the task. But most critically, Lleras said, those in the switch group saw no drop in their performance over time. Simply having them take two brief breaks from their main task (to respond to the digits) allowed them to stay focused during the entire experiment.

"It was amazing that performance seemed to be unimpaired by time, while for the other groups performance was so clearly dropping off," Lleras said.

This study is consistent with the idea that the brain is built to detect and respond to change, Lleras said, and suggests that prolonged attention to a single task actually hinders performance.

"We propose that deactivating and reactivating your goals allows you to stay focused," he said. "From a practical standpoint, our research suggests that, when faced with long tasks (such as studying before a final exam or doing your taxes), it is best to impose brief breaks on yourself. Brief mental breaks will actually help you stay focused on your task!"


Journal Reference:

  1. Atsunori Ariga, Alejandro Lleras. Brief and rare mental 'breaks' keep you focused: Deactivation and reactivation of task goals preempt vigilance decrements. Cognition, 2011; DOI: 10.1016/j.cognition.2010.12.007

Brain's 'radio stations' have much to tell scientists

Like listeners adjusting a high-tech radio, scientists at Washington University School of Medicine in St. Louis have tuned in to precise frequencies of brain activity to unleash new insights into how the brain works.

"Analysis of brain function normally focuses on where brain activity happens and when," says Eric C. Leuthardt, MD. "What we've found is that the wavelength of the activity provides a third major branch of understanding brain physiology."

Researchers used electrocorticography, a technique for monitoring the brain with a grid of electrodes temporarily implanted directly on the brain's surface. Clinically, Leuthardt and other neurosurgeons use this approach to identify the source of persistent, medication-resistant seizures in patients and to map those regions for surgical removal. With the patient's permission, scientists can also use the electrode grid to experimentally monitor a much larger spectrum of brain activity than they can via conventional brainwave monitoring.

Scientists normally measure brainwaves with a process called electroencephalography (EEG), which places electrodes on the scalp. Brainwaves are produced by many neurons firing at the same time; how often that firing occurs determines the frequency of the activity, measured in hertz, or cycles per second. Neurologists have used EEG to monitor consciousness in patients with traumatic injuries, and in studies of epilepsy and sleep.

In contrast to EEG, electrocorticography records brainwave data directly from the brain's surface.

"We get better signals and can much more precisely determine where those signals come from, down to about one centimeter," Leuthardt, assistant professor of neurosurgery, of neurobiology and of biomedical engineering, says. "Also, EEG can only monitor frequencies up to 40 hertz, but with electrocorticography we can monitor activity up to 500 hertz. That really gives us a unique opportunity to study the complete physiology of brain activity."

Leuthardt and his colleagues have used the grids to watch consciousness fade under surgical anesthesia and return when the anesthesia wears off. They found each frequency gave different information on how different circuits changed with the loss of consciousness, according to Leuthardt.

"Certain networks of brain activity at very slow frequencies did not change at all regardless of how deep under anesthesia the patient was," Leuthardt says. "Certain relationships between high and low frequencies of brain activity also did not change, and we speculate that may be related to some of the memory circuits."

Their results also showed a series of changes that occurred in a specific order during loss of consciousness and then repeated in reverse order as consciousness returned. Activity in a frequency region known as the gamma band, which is thought to be a manifestation of neurons sending messages to other nearby neurons, dropped and returned as patients lost and regained consciousness.

The results appeared in December in the Proceedings of the National Academy of Sciences.

In another paper, to be published Feb. 9 in The Journal of Neuroscience, Leuthardt and his colleagues show that the frequency of brain signals in a particular region can be used to determine what function that region is performing at that time. They analyzed brain activity by focusing on data from a single electrode positioned over a number of different regions involved in speech. Researchers could use higher-frequency bands of activity in this brain area to tell whether patients:

  • had heard a word or seen a word
  • were preparing to say a word they had heard or a word they had seen
  • were saying a word they had heard or a word they had seen.

"We've historically lumped the frequencies of brain activity that we used in this study into one phenomenon, but our findings show that there is true diversity and non-uniformity to these frequencies," he says. "We can obtain a much more powerful ability to decode brain activity and cognitive intention by using electrocorticography to analyze these frequencies."


Journal References:

  1. J. D. Breshears, J. L. Roland, M. Sharma, C. M. Gaona, Z. V. Freudenburg, R. Tempelhoff, M. S. Avidan, E. C. Leuthardt. Stable and dynamic cortical electrophysiology of induction and emergence with propofol anesthesia. Proceedings of the National Academy of Sciences, 2010; 107 (49): 21170 DOI: 10.1073/pnas.1011949107
  2. C. M. Gaona, M. Sharma, Z. V. Freudenburg, J. D. Breshears, D. T. Bundy, J. Roland, D. Barbour, G. Schalk, E. C. Leuthardt. Nonuniform high-gamma (60-500 Hz) power changes dissociate cognitive task and anatomy in human cortex. The Journal of Neuroscience, 2011 (in press)

The brain knows what the nose smells, but how?

Mice know fear. And they know to fear the scent of a predator. But how do their brains quickly figure out with a sniff that a cat is nearby?

It's a complex process that starts with the scent being picked up by specific receptors in their noses. But until now it wasn't clear exactly how these scent signals proceeded from nose to noggin for neural processing.

In a study to be published in Nature, Stanford researchers describe a new technique that makes it possible to map long-distance nerve connections in the brain. The scientists used the technique to map for the first time the path that the scent signals take from the olfactory bulb, the part of the brain that first receives signals from odor receptors in the nose, to higher centers of the mouse brain where the processing is done.

"No one could trace signals across neural connections to a specific type of neuron at a specific location before," said biology Professor Liqun Luo. This is Luo's first study of the mouse olfactory system, but his lab has spent 10 years studying olfactory pathways in the fruit fly. Because mouse brains are so much larger and more complex that those of flies, Luo and postdoctoral researcher Kazunari Miyamichi had to develop an entirely new experimental technique.

These techniques can be used to do more than just study how mice smell. "The tools we've developed can be applied to trace neural connections of any part of the nervous system," Luo said. The tools could be used to understand how mouse brains process information from their other senses, or how the brain controls movement. The tools could also be adapted for use in rats and other mammalian species, he said.

To trace the neural pathways, the researchers injected mouse brains with two viruses, one after the other.

The researchers first injected a low-grade virus into the higher centers of a mouse brain, where it infected nearby neurons.

This first virus left the neurons susceptible to infection by the second virus, which was injected two weeks later. The second virus — fluorescent red in color — was designed by collaborator Edward Callaway at the Salk Institute.

Genes introduced by the first virus allowed the second virus to spread from the higher brain centers back to the olfactory bulb, traveling in the opposite direction to the scent signals. By following the backward progress of the second virus, the scientists could identify the neurons in the olfactory bulb where the virus ended up, thanks to the red fluorescence.

The scientists then sliced each mouse brain into about 60 thin sections and photographed all of them through a microscope. They used a sophisticated algorithm, developed by graduate students Fernando Amat and Farshid Moussavi in Professor Mark Horowitz's electrical engineering group, to combine the images from 35 mice into a 3-D model of the olfactory bulb. This allowed them to look for patterns between where the virus started in the higher brain centers and where in the olfactory bulb it finished its journey.

They found that most of the nerve pathways heading to the higher processing centers that govern the mice's innate liking or disliking of certain odors, and that trigger a response to them, originated from a single region: the top part of the olfactory bulb. This could explain how the mouse brain directs the animal's innate fear response to cat or fox urine.

This is in contrast to the neurons heading to the brain areas which process learned responses to odor. The neurons associated with learned responses are scattered all over the olfactory bulb, and their relative lack of organization could reflect their flexibility in allowing the mice to learn to avoid or be attracted to new smells.

The group also found that each neuron in the brain's higher centers receives signals from at least four neurons in the olfactory bulb, each of which receives input from a large number of odor receptors of the same type. This progressive funneling and processing helps explain how the brain integrates the information from many different odors, Luo said.

In addition, he said, "There might be similar organizational principles in flies and mice, despite the evolutionary distance between them."

Luo said he will use the techniques in this study to take a more detailed look at other parts of the mouse olfactory bulb and brain, with the eventual goal of understanding how the brain processes specific odors. He said he was also working to improve the technique to track neurons across longer distances, allowing him to look in more detail at other pathways in the mouse nervous system.


Journal Reference:

  1. Kazunari Miyamichi, Fernando Amat, Farshid Moussavi, Chen Wang, Ian Wickersham, Nicholas R. Wall, Hiroki Taniguchi, Bosiljka Tasic, Z. Josh Huang, Zhigang He, Edward M. Callaway, Mark A. Horowitz, Liqun Luo. Cortical representations of olfactory input by trans-synaptic tracing. Nature, 2010; DOI: 10.1038/nature09714

Women subject to objectifying gazes show decreased math ability

Women who are looked at as sexual objects not only react as sexual objects but also exhibit less proficiency in math, according to a new study published in the March 2011 issue of the journal Psychology of Women Quarterly.

The study examined the effect of the objectifying gaze (the visual inspection of one's body by another person) on undergraduates' math performance. Motivation to interact with the objectifying person in the future was also measured, as were body image outcomes, including body surveillance, body shame, and body dissatisfaction.

One hundred and fifty undergraduates (67 women and 83 men) from a large U.S. Midwestern university participated in the study.

Researchers found that the objectifying gaze lowered women's math performance, but not men's. The objectifying gaze also increased women's, but not men's, motivation to have further interactions with their partner. Finally, the research found that an objectifying gaze did not influence body surveillance, body shame, or body dissatisfaction for women or men.

"The objectifying gaze is particularly problematic for women," write authors Sarah J. Gervais, Theresa K. Vescio, and Jill Allen. "And it may lead to a vicious cycle in which women are first objectified and, as a result, underperform, confirming the notion that women's looks are more important than what they can do."


Journal Reference:

  1. S. J. Gervais, T. K. Vescio, J. Allen. When What You See Is What You Get: The Consequences of the Objectifying Gaze for Women and Men. Psychology of Women Quarterly, 2011; DOI: 10.1177/0361684310386121

Expectations speed up conscious perception

The human brain works incredibly fast. However, visual impressions are so complex that their processing takes several hundred milliseconds before they enter our consciousness. Scientists at the Max Planck Institute for Brain Research in Frankfurt am Main have now shown that this delay may vary in length. When the brain possesses some prior information, that is, when it already knows what it is about to see, conscious recognition occurs faster. Until now, neuroscientists assumed that the processes leading up to conscious perception were rather rigid and that their timing did not vary.

On their way from the eye, visual stimuli are analysed in manifold ways by different processing stages in the brain. It is not until they have passed several processing steps that the stimuli reach conscious perception. This unconscious processing prior to perception usually takes approximately 300 milliseconds. The Max Planck scientists were now able to demonstrate that the timing of this process, far from being rigid, is in fact variable. In an experiment, participants perceived stimuli more efficiently and faster if they knew what to expect.

To investigate this, the scientists showed the participants images with a background of randomly distributed dots on a monitor. During an image sequence, the distribution of the dots systematically changed such that a symbol gradually appeared. Following each image, the participants indicated if they could see the symbol by pressing a button. As soon as the symbol had appeared fully and was clearly recognisable, the scientists presented the same image sequence in reverse order, such that the symbol gradually faded again. During the entire experiment, electroencephalographic (EEG) activity of the participants was measured.

Whereas the participants took relatively long to recognise the symbol in the first sequence of images with increasing visibility, the threshold of awareness in the second, reverse presentation of images was much lower. The participants were able to recognise the symbol even at very poor resolution. "Expectations based on previously acquired information apparently help to perceive the object consciously," says Lucia Melloni, first author of the study. Once the participants knew which symbol was hiding in the random field of noise, they were able to perceive it better. The scientists have thus confirmed previous studies, according to which people perceive moving objects better if they already know in which direction the objects will move.

Moreover, the measurements of EEG activity produced astonishing results. "We found that the timing of EEG activity for conscious perception changed depending on the person's expectations," says Lucia Melloni. If the participants could predict what they were going to see, the characteristic EEG pattern for conscious perception took place 100 milliseconds earlier than without prior expectations.

The scientists might thus have found a conclusive explanation for the contradictory results of other neuroscientific research groups. Depending on the study, they had sometimes found very early and sometimes very late EEG activity correlating with conscious perception. "Our research explains this variability in timing. Apparently, the brain does not process the stimuli rigidly and at the same speed; rather, it is flexible," explains Wolf Singer. Processing is thus faster if the brain only has to compare the incoming visual information with a previously established expectation. As a result, conscious perception occurs earlier. In contrast, if the brain has to assess a stimulus from scratch due to a lack of prior information, the processing takes longer.

These results may show that previous EEG studies have been interpreted incorrectly. "Since the interpretation depends heavily on the sequence of events, EEG activity may have been incorrectly allocated to consciousness processes," surmises Wolf Singer, the Director of the Department for Neurophysiology at the Max Planck Institute for Brain Research in Frankfurt. "In light of these results, it appears necessary to reinvestigate the neuronal correlates of consciousness."


Journal Reference:

  1. L. Melloni, C. M. Schwiedrzik, N. Muller, E. Rodriguez, W. Singer. Expectations Change the Signatures and Timing of Electrophysiological Correlates of Perceptual Awareness. Journal of Neuroscience, 2011; 31 (4): 1386 DOI: 10.1523/JNEUROSCI.4570-10.2011

Learn more quickly by transcranial magnetic brain stimulation, study in rats suggests

 What sounds like science fiction is actually possible: thanks to magnetic stimulation, the activity of certain brain nerve cells can be deliberately influenced. What happens in the brain in this context has been unclear up to now. Medical experts from Bochum under the leadership of Prof. Dr. Klaus Funke (Department of Neurophysiology) have now shown that various stimulus patterns changed the activity of distinct neuronal cell types. In addition, certain stimulus patterns led to rats learning more easily. The knowledge obtained could contribute to cerebral stimulation being used more purposefully in future to treat functional disorders of the brain.

The researchers have published their studies in the Journal of Neuroscience and in the European Journal of Neuroscience.

Magnetic pulses stimulate the brain

Transcranial magnetic stimulation (TMS) is a relatively new method for painless stimulation of cerebral nerve cells. The method, first presented by Anthony Barker in 1985, is based on the fact that the cortex, the outer layer of the brain lying directly beneath the skull, can be stimulated by means of a magnetic field. TMS is used in diagnostics, in basic research and also as a potential therapeutic tool. In diagnostics, a single magnetic pulse serves to test the excitability of nerve cells in an area of the cortex, in order to assess changes caused by disease, by medication, or by prior artificial stimulation of the brain. A single magnetic pulse can also be used to test the involvement of a particular cortical area in a sensory, motor or cognitive task, since it briefly disturbs the area's natural activity, in effect "switching off" the area temporarily.

Repeated stimuli change cerebral activity

Since the mid-1990s, repetitive TMS has been used to make targeted changes to the excitability of nerve cells in the human cortex: "In general, the activity of the cells drops as a result of low-frequency stimulation, i.e. with one magnetic pulse per second. At higher frequencies, from five to 50 pulses per second, the activity of the cells increases," explained Prof. Funke. The researchers are particularly interested in the effects of specific stimulus patterns such as theta burst stimulation (TBS), in which 50 Hz bursts are repeated at 5 Hz. "This rhythm is based on the natural theta rhythm of four to seven hertz that can be observed in an EEG," says Funke. The effect depends above all on whether such stimulus patterns are delivered continuously (cTBS, attenuating effect) or with interruptions (intermittent iTBS, strengthening effect).
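To make the timing concrete, the sketch below (Python, purely illustrative) lays out pulse timestamps for the two TBS variants using the commonly cited parameters: bursts of three pulses at 50 Hz, bursts repeated at 5 Hz, and iTBS delivered as 2-second trains separated by 8-second pauses. These standard values are an assumption for the example; the exact settings used in the Bochum experiments are not given above.

```python
# Illustrative sketch of theta burst stimulation (TBS) timing, assuming the
# commonly cited parameters (3 pulses at 50 Hz per burst, bursts repeated at 5 Hz,
# iTBS as 2 s trains with 8 s pauses). Not the protocol of the Bochum study.

def tbs_pulse_times(n_bursts, intermittent=False, bursts_per_train=10, train_gap_s=8.0):
    """Return pulse timestamps in seconds for cTBS (continuous) or iTBS (interrupted)."""
    times = []
    for b in range(n_bursts):
        if intermittent:
            train, pos = divmod(b, bursts_per_train)
            burst_start = train * (bursts_per_train * 0.2 + train_gap_s) + pos * 0.2
        else:
            burst_start = b * 0.2              # bursts repeat at 5 Hz, i.e. every 200 ms
        times.extend(burst_start + k * 0.02 for k in range(3))  # 3 pulses, 20 ms apart (50 Hz)
    return times

ctbs = tbs_pulse_times(200)                    # 200 bursts back to back: one 40 s block
itbs = tbs_pulse_times(200, intermittent=True) # the same bursts in 2 s trains with 8 s pauses
print(len(ctbs), len(itbs), round(max(itbs)))  # 600 pulses each; iTBS stretches over about 3 minutes
```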

Contact points between cells are strengthened or weakened

It is largely unknown how exactly repeated stimulation changes the activity of nerve cells. It is assumed that the contact points (synapses) between the cells are strengthened (synaptic potentiation) or weakened (synaptic depression) as a result of the repeated stimulation, a process that also plays an important role in learning. Some time ago, it was also shown that the effects of TMS and learning interact in humans.

Inhibitory cortical cells are particularly sensitive to stimulation

The researchers in Bochum have now shown for the first time that artificial cortical stimulation specifically changes the activity of certain inhibitory nerve cells, depending on the stimulation protocol used. A balanced interaction of excitatory and inhibitory nerve cells is an absolute prerequisite for healthy brain function. Nerve cells specialised in inhibiting other nerve cells show a much greater variety of cell shapes and activity patterns than their excitatory counterparts. Amongst other things, they produce various functional proteins in their cell bodies. In his studies, Prof. Funke has concentrated on the proteins parvalbumin (PV), calbindin-D28k (CB) and calretinin (CR). These are produced by various inhibitory cells as a function of activity, so their quantity gives information about the activity of the nerve cells in question.

Stimulus patterns act specifically on certain cells

For example, the experiments showed that the activating stimulation protocol (iTBS) reduced almost exclusively the PV content of the cells, whereas the activity-attenuating continuous protocol (cTBS), or a likewise attenuating 1 Hz stimulation, mainly reduced CB production. CR formation was not changed by any of the tested protocols. Recordings of the electrical activity of nerve cells confirmed a change in the inhibition of cortical activity.

Learning more quickly after stimulation

In a second study, recently published in the European Journal of Neuroscience, Prof. Funke's group showed that rats also learned more quickly if they were treated with the activating stimulus protocol (iTBS) before each training session, but not if the inhibiting cTBS protocol had been used. The initially reduced production of parvalbumin (PV) was restored by the learning procedure, but only in the areas of the brain involved in the learning process. In animals not engaged in the specific learning task, PV production remained reduced after iTBS. "The iTBS treatment therefore initially reduces the activity of certain inhibitory nerve cells in a rather general way, so that the subsequent learning activities can be stored more easily," concludes Prof. Funke. "This process is termed 'gating.' In a second step, the learning activity restores normal inhibition and PV production."

More purposeful treatment in future

Repetitive TMS is already being used, with limited success, in clinical trials for the treatment of functional disorders of the brain, above all severe depression. In addition, disorders of the inhibitory nerve cells in particular have been shown to play an important role in neuropsychiatric diseases such as schizophrenia. "It is doubtless too early to derive new forms of treatment for functional disorders of the brain from the results of our study, but the knowledge obtained is an important contribution towards a possibly more specific application of TMS in the future," Prof. Funke hopes.


Journal References:

  1. Annika Mix, Alia Benali, Ulf T. Eysel, Klaus Funke. Continuous and intermittent transcranial magnetic theta burst stimulation modify tactile learning performance and cortical protein expression in the rat differently. European Journal of Neuroscience, 2010; 32 (9): 1575 DOI: 10.1111/j.1460-9568.2010.07425.x
  2. Benali, A., Trippe, J., Weiler, E., Mix, A., Petrasch-Parwez, E., Girzalsky, W., Eysel, U.T., Erdmann, R. and Funke, K. Theta-burst transcranial magnetic stimulation alters cortical inhibition. J. Neurosci.,

Brain 'GPS' illuminated in migratory monarch butterflies

A new study takes a close look at the brain of the migratory monarch butterfly to better understand how these remarkable insects use an internal compass and skylight cues to navigate from eastern North America to Mexico each fall. The research, published by Cell Press in the January 27 issue of the journal Neuron, provides key insights into how ambiguous sensory signals can be integrated in the brain to guide complex navigation.

Previous research has shown that migrants use a time-compensated "sun compass" to maintain a southerly direction during flight. "In general, this sun compass mechanism proposes that skylight cues providing directional information are sensed by the eyes and that this sensory information is then transmitted to a sun compass system in the brain," explains senior study author, Dr. Steven Reppert from the University of Massachusetts Medical School. "There, information from both eyes is integrated and time compensated for the sun's movement by a circadian clock so that flight direction is constantly adjusted to maintain a southerly bearing over the day."

Dr. Reppert and coauthor Dr. Stanley Heinze were interested in studying exactly how skylight cues are processed by migrating monarchs and how the skylight pattern of polarized light may provide directional information on cloudy days. "The pattern of linearly polarized skylight is arranged as concentric circles of electric field vectors (E-vectors) around the sun, and they can indicate the sun's position, even when the sun itself is covered with clouds," says Dr. Reppert. "However, the symmetrical nature of the polarized skylight pattern leads to directional uncertainty unless the pattern is integrated with the horizontal position of the sun, called the solar azimuth."
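The directional uncertainty Dr. Reppert describes can be made concrete with a small example. In the simplified case of an E-vector measured at the zenith, which runs perpendicular to the solar azimuth, the polarization angle alone is consistent with two azimuths 180 degrees apart; a coarse, unpolarized estimate of the sun's position is enough to break the tie. The Python below is purely illustrative and is not drawn from the paper.

```python
# Simplified illustration of the 180-degree ambiguity of polarized skylight.
# An E-vector orientation is only defined modulo 180 degrees, so it leaves two
# possible solar azimuths; an unpolarized (brightness-based) cue resolves it.

def candidate_azimuths(e_vector_deg):
    """Two sun azimuths consistent with a zenith E-vector (perpendicular to the solar azimuth)."""
    return ((e_vector_deg + 90) % 360, (e_vector_deg + 270) % 360)

def resolve(e_vector_deg, rough_sun_azimuth_deg):
    """Pick the candidate azimuth closest to a coarse unpolarized sun estimate."""
    def angular_diff(a, b):
        return min((a - b) % 360, (b - a) % 360)
    return min(candidate_azimuths(e_vector_deg),
               key=lambda az: angular_diff(az, rough_sun_azimuth_deg))

print(candidate_azimuths(30))   # (120, 300): ambiguous from polarization alone
print(resolve(30, 290))         # 300: the rough azimuth cue breaks the tie
```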

Dr. Heinze compared the neuronal organization of the monarch brain's sun compass network to that of the well-characterized desert locust and found it to be remarkably similar. He went on to show that individual neurons in the sun compass were tuned to specific E-vector angles of polarized light and also showed azimuth-dependent responses to unpolarized light. Interestingly, the responses of individual neurons to these two different stimuli were mediated through different parts of the monarch eye. The responses were then integrated in the sun compass part of the monarch brain to form an accurate representation of skylight cues throughout the day.

"Our results reveal the general layout of the neuronal machinery for sun compass navigation in the monarch brain and provide insights into a possible mechanism of integrating polarized skylight information and solar azimuth," conclude the authors. "More generally, our results address a fundamental problem of sensory processing by showing how seemingly contradictory skylight signals are integrated into a consistent, neural representation of the environment."


Journal Reference:

  1. Stanley Heinze, Steven M. Reppert. Sun Compass Integration of Skylight Cues in Migratory Monarch Butterflies. Neuron, 2011; 69 (2): 345-358 DOI: 10.1016/j.neuron.2010.12.025