Infants raised in bilingual environments can distinguish unfamiliar languages

Infants raised in households where Spanish and Catalan are spoken can discriminate between English and French just by watching people speak, even though they have never been exposed to these new languages before, according to University of British Columbia psychologist Janet Werker.

Presented at the American Association for the Advancement of Science (AAAS) Annual Meeting in Washington, DC, Werker's latest findings provide further evidence that exposure to two native languages contributes to the development of a perceptual sensitivity that extends beyond the infants' mother tongues.

Werker has previously shown that bilingual infants can discern different native languages at four, six and eight months after birth. While monolingual babies have the ability to discern two languages at four and six months, they can no longer do so at eight months.

In Werker's latest study with Prof. Núria Sebastián-Gallés from the Universitat Pompeu Fabra in Barcelona, infants of four and six months were shown silent videos of talking faces speaking English and French. They found that babies growing up bilingual with Spanish and Catalan — a Romance language spoken in Andorra and Catalonia — were able to distinguish between English and French simply through facial cues, even though they had never before seen speakers of either language.

"The fact that this perceptual vigilance extends even to two unfamiliar languages suggests that it's not just the characteristics of the native languages that bilingual infants have learned about, but that they appear to have also developed a more general perceptual vigilance," says Werker, Canada Research Chair in Psychology and director of UBC's Infant Studies Centre.

"These findings, together with our previous work on newborn infants, provide even stronger evidence that human infants are equally prepared to grow up bilingual as they are monolingual," Werker adds. "The task of language separation is something they are prepared to do from birth — with bilinguals increasingly adept over time."

Crossing borders in language science: What bilinguals tell us about mind and brain

Sonja Kotz leads the Minerva research group "Neurocognition of Rhythm in Communication" at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig. She will present evidence from neuroimaging on the impact of cognitive functions on bilingual processing at the AAAS symposium "Crossing Borders in Language Science: What Bilinguals Tell Us About Mind and Brain."

Rhythm, as the recurrent patterning of events in time, underlies most human behavior, such as speech, music, and body movements. Kotz investigates how temporal patterns in different languages influence the processing of phonological, semantic, and syntactic information. Individuals who learn a new language usually need time to develop a "feel" for its characteristics. With rapid speech it can initially even be difficult to recognize individual words. "This is because the brain has to become accustomed to new speech rhythms," explains Kotz.

Our brain is very good at recognizing patterns in the environment and uses them to create general predictions about the near future.

"We assume there is a neural network permanently engaged in evaluating information about duration, rhythm, tempo and stress of syllables in order to recognize temporal regularities in the stream of speech," says Kotz. During language acquisition, this network could store fundamental regularities of speech in the brain so that later, language processing is more efficient.

Rhythm processing predominantly occurs in brain areas in and just below the cerebral cortex, but also in motor areas and evolutionarily older areas like the cerebellum and basal ganglia. "This points to an early stage of development," says Kotz. "The evolution of language would not have been possible without the development of brain areas which have the ability to structure events temporally." To meet the high communicative demands of Homo sapiens, the motor system in and below the cerebral cortex might have become increasingly sensitive to rhythmic input.

Juggling languages can build better brains

Once likened to a confusing Tower of Babel, speaking more than one language can actually bolster brain function by serving as a mental gymnasium, according to researchers.

Recent research indicates that bilingual speakers can outperform monolinguals (people who speak only one language) in certain mental abilities, such as editing out irrelevant information and focusing on important information, said Judith Kroll, Distinguished Professor of Psychology at Penn State. These skills make bilinguals better at prioritizing tasks and working on multiple projects at one time.

"We would probably refer to most of these cognitive advantages as multi-tasking," said Kroll, director of the Center for Language Science. "Bilinguals seem to be better at this type of perspective taking."

Kroll said that these findings counter previous conclusions that bilingualism hindered cognitive development.

"The received wisdom was that bilingualism created confusion, especially in children," Kroll told attendees Feb. 18 at the annual meeting of the American Association for the Advancement of Science in Washington, D.C. "The belief was that people who could speak two or more languages had difficulty using either. The bottom line is that bilingualism is good for you."

Researchers trace the source of these enhanced multi-tasking skills to the way bilinguals mentally negotiate between the languages, a skill that Kroll refers to as mental juggling.

When bilinguals speak with each other, they can easily slip in and out of both languages, often selecting the word or phrase from the language that most clearly expresses their thoughts. However, fluent bilinguals rarely make the mistake of slipping into another language when they speak with someone who understands only one language.

"The important thing that we have found is that both languages are open for bilinguals; in other words, there are alternatives available in both languages," Kroll said. "Even though language choices may be on the tip of their tongue, bilinguals rarely make a wrong choice."

This language selection, or code switching, is a form of mental exercise, according to Kroll.

"The bilingual is somehow able to negotiate between the competition of the languages," Kroll said. "The speculation is that these cognitive skills come from this juggling of languages."

Kroll's symposium at the meeting included distinguished language scientists who have investigated the consequences of bilingualism across the lifespan. Ellen Bialystok, Distinguished Research Professor of Psychology at York University, Toronto, was instrumental in demonstrating that bilingualism improves certain mental skills.

According to Bialystok, the benefits of bilingualism appear across age groups. Studies of children who grow up as bilingual speakers indicate they are often better at perspective-taking tasks, such as prioritizing, than monolingual children. Experiments with older bilingual speakers indicate that the enhanced mental skills may protect them from problems associated with aging, such as Alzheimer's disease and dementia.

Researchers use MRIs and electroencephalographs to track how the brain operates when it engages in language juggling. They also use eye-movement devices to watch how bilinguals read sentences. When a person reads, the eyes jump through the sentence, stopping to comprehend certain words or phrases. These distinctive eye movements can offer researchers clues on the subtle ways bilinguals comprehend language compared to monolinguals.

Kroll noted that the enhanced brain functions of bilinguals do not necessarily make them more intelligent or better learners.

"Bilinguals simply acquire specific types of expertise that help them attend to critical tasks and ignore irrelevant information," Kroll said.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of NewsPsychology or its staff.

Scientists steer car with the power of thought

You need to keep your thoughts from wandering if you drive using the new technology from the AutoNOMOS innovation labs of Freie Universität Berlin. The computer scientists have developed a system that makes it possible to steer a car with your thoughts. Using new commercially available sensors to measure brain waves — sensors for recording electroencephalograms (EEG) — the scientists were able to distinguish the bioelectrical wave patterns for control commands such as "left," "right," "accelerate" or "brake" in a test subject.

They then succeeded in developing an interface to connect the sensors to their otherwise purely computer-controlled vehicle, so that it can now be "controlled" via thoughts. Driving by thought control was tested on the site of the former Tempelhof Airport.

The scientists from Freie Universität first used the brain-wave sensors in such a way that a person could move a virtual cube in different directions with the power of his or her thoughts. The test subject thought of four situations associated with driving, for example, "turn left" or "accelerate." In this way the person trained the computer to interpret bioelectrical wave patterns emitted from his or her brain and to link them to a command that could later be used to control the car. The computer scientists connected the measuring device to the steering, accelerator, and brakes of a computer-controlled vehicle, which made it possible for the subject to influence the movement of the car using thoughts alone.

"In our test runs, a driver equipped with EEG sensors was able to control the car with no problem — there was only a slight delay between the envisaged commands and the response of the car," said Prof. Raúl Rojas, who heads the AutoNOMOS project at Freie Universität Berlin. In a second test version, the car drove largely automatically, but via the EEG sensors the driver was able to determine the direction at intersections.

The AutoNOMOS Project at Freie Universität Berlin is studying the technology for the autonomous vehicles of the future. With the EEG experiments they investigate hybrid control approaches, i.e., those in which people work with machines.

The computer scientists have made a short film about their research, which is available at: http://tinyurl.com/BrainDriver

Brains of blind people reading in Braille show activity in same area that lights up when sighted readers read

The portion of the brain responsible for visual reading doesn't require vision at all, according to a new study published online on Feb. 17 in Current Biology, a Cell Press publication. Brain imaging studies of blind people as they read words in Braille show activity in precisely the same part of the brain that lights up when sighted readers read. The findings challenge the textbook notion that the brain is divided up into regions that are specialized for processing information coming in via one sense or another, the researchers say.

"The brain is not a sensory machine, although it often looks like one; it is a task machine," said Amir Amedi of The Hebrew University of Jerusalem. "A brain area can fulfill a unique function, in this case reading, regardless of what form the sensory input takes."

Unlike other tasks that the brain performs, reading is a recent invention, about 5400 years old. Braille has been in use for less than 200 years. "That's not enough time for evolution to have shaped a brain module dedicated to reading," Amedi explained.

Nevertheless, study coauthor Laurent Cohen showed previously in sighted readers that a very specific part of the brain, known as the visual word form area or VWFA for short, has been co-opted for this purpose. But no one knew what might happen in the brains of blind people who learn to read even though they've had no visual experience at all.

In the new study, Amedi's team used functional magnetic resonance imaging to measure neural activity in eight people who had been blind since birth while they read Braille words or nonsense Braille. If the brain were organized around processing sensory information, one might expect that Braille reading would depend on regions dedicated to processing tactile information, Amedi explained. If instead the brain is task oriented, you'd expect to find the peak of activity across the entire brain in the VWFA, right where it occurs in sighted readers, and that is exactly what the researchers found.

Further comparison of brain activity in blind and sighted readers showed that the patterns in the VWFA were indistinguishable between the two groups.

"The main functional properties of the VWFA as identified in the sighted are present as well in the blind, are thus independent of the sensory modality of reading, and even more surprisingly do not require any visual experience," the researchers wrote. "To the best of our judgment, this provides the strongest support so far for the metamodal theory [of brain function]," which suggests that brain regions are defined by the tasks they perform. "Hence, the VWFA should also be referred to as the tactile word form area, or more generally as the (metamodal) word form area."

The researchers suggest that the VWFA is a multisensory integration area that binds simple features into more elaborate shape descriptions, making it ideal for the relatively new task of reading.

"Its specific anatomical location and its strong connectivity to language areas enable it to bridge high-level perceptual word representation and language-related components of reading," they wrote. "It is therefore the most suitable region to be taken over during reading acquisition, even when reading is acquired via touch without prior visual experience."

Amedi said the researchers plan to examine brain activity as people learn to read Braille for the first time, to find out how rapidly this takeover happens. "How does the brain change to process information in words?" he asked. "Is it instantaneous?"


Journal Reference:

  1. Lior Reich, Marcin Szwed, Laurent Cohen, and Amir Amedi. A Ventral Visual Stream Reading Center Independent of Visual Experience. Current Biology, 2011; DOI: 10.1016/j.cub.2011.01.040

Brain-machine interfaces make gains by learning about their users, letting them rest, and allowing for multitasking

NewsPsychology (Feb. 21, 2011) — You may have heard of virtual keyboards controlled by thought, brain-powered wheelchairs, and neuro-prosthetic limbs. But powering these machines can be downright tiring, a fact that prevents the technology from being of much use to people with disabilities, among others. Professor José del R. Millán and his team at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have a solution: engineer the system so that it learns about its user, allows for periods of rest, and even multitasking.

In a typical brain-computer interface (BCI) set-up, users can send one of three commands — left, right, or no-command. No-command is the static state between left and right and is necessary for a brain-powered wheelchair to continue going straight, for example, or to stay put in front of a specific target. But it turns out that no-command is very taxing to maintain and requires extreme concentration. After about an hour, most users are spent. Not much help if you need to maneuver that wheelchair through an airport.

In an ongoing study demonstrated by Millán and doctoral student Michele Tavella at the AAAS 2011 Annual Meeting in Washington, D.C., the scientists hook volunteers up to a BCI and ask them to read, speak, or read aloud while delivering as many left and right commands as possible, or while delivering a no-command. Using statistical analysis programmed by the scientists, Millán’s BCI can distinguish between left and right commands and learn when each subject is sending one of these versus a no-command. In other words, the machine learns to read the subject’s mental intention. The result is that users can mentally relax and also execute secondary tasks while controlling the BCI.
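The kind of statistical decoding described above can be pictured as a probabilistic classifier that issues "left" or "right" only when the evidence is strong, and otherwise defaults to no-command. The sketch below is purely illustrative: the single 1-D feature, the Gaussian class models, and the confidence threshold are all invented for this example, not taken from Millán's system, which operates on many EEG channels with user-specific trained models.

```python
import math

# Hypothetical per-class Gaussian models of a single EEG-derived feature,
# fit during the user-specific training phase: (mean, standard deviation).
CLASS_MODELS = {"left": (-1.0, 0.5), "right": (1.0, 0.5)}

def gaussian_pdf(x: float, mean: float, std: float) -> float:
    """Likelihood of the feature value under a 1-D Gaussian class model."""
    return math.exp(-((x - mean) ** 2) / (2 * std**2)) / (std * math.sqrt(2 * math.pi))

def decode(feature: float, threshold: float = 0.9) -> str:
    """Emit 'left'/'right' only when the posterior is confident; else 'no-command'."""
    likelihoods = {c: gaussian_pdf(feature, m, s) for c, (m, s) in CLASS_MODELS.items()}
    best = max(likelihoods, key=likelihoods.get)
    posterior = likelihoods[best] / sum(likelihoods.values())
    return best if posterior >= threshold else "no-command"

print(decode(-1.2))  # left
print(decode(0.05))  # no-command
```

Raising the threshold makes the decoder more conservative, trading responsiveness for fewer false commands; tuning that trade-off per user is part of what makes such systems less tiring, since the resting state no longer has to be actively maintained.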

The so-called Shared Control approach to facilitating human-robot interaction employs image sensors and image processing to avoid obstacles. According to Millán, however, Shared Control isn’t enough to let an operator rest or concentrate on more than one command at a time, which limits long-term use.

Millán’s new work complements research on Shared Control and makes multitasking a reality while at the same time allowing users to catch a break. His trick is in decoding the signals coming from EEG readings on the scalp — readings that represent the activity of millions of neurons and have notoriously low resolution. By incorporating statistical analysis, or probability theory, his BCI allows for both targeted control — maneuvering around an obstacle — and more precise tasks, such as staying on a target. It also makes it easier to give simple commands like “go straight” that need to be executed over longer periods of time (think back to that airport) without having to focus on giving the same command over and over again.

It will be a while before this cutting-edge technology makes the move from lab to production line, but Millán’s prototypes are the first working models of their kind to use probability theory to make BCIs easier to use over time. His next step is to combine this new level of sophistication with Shared Control in an ongoing effort to take BCI to the next level, necessary for widespread use. Further advancements, such as finer-grained interpretation of cognitive information, are being developed in collaboration with the European project TOBI, Tools for Brain-Computer Interaction (http://www.tobi-project.org/). The multinational project is headed by Professor Millán and has moved into the clinical testing phase for several BCIs.



Story Source:

The above story is reprinted (with editorial adaptations by newsPsychology staff) from materials provided by Ecole Polytechnique Fédérale de Lausanne, via EurekAlert!, a service of AAAS.


Dial 5683 for love: Dialing certain numbers on a cell phone changes your emotional state

A psychological scientist in Germany has found a way that cell phones, and specifically texting, have hacked into our brains. Just by typing the numbers that correspond to the letters in a word like "love," we can activate the meaning of that word in our minds.

The results are published in Psychological Science, a journal of the Association for Psychological Science.

For the study, Sascha Topolinski and his students at the University of Würzburg in Germany created a list of German words that can be typed on a cell phone keypad without typing the same digit twice in a row. Also, each number combination could spell only one word.

For one experiment, Topolinski used a set of number sequences that correspond to positive words, like 54323 ("liebe" — love) and 373863 ("freund" — friend), and a set for negative words, like 7245346 ("schleim" — slime) and 26478 ("angst" — fear). Volunteers were handed a cell phone with stickers over the buttons so they could only see the numbers, not the corresponding letters, and were told to type the number sequences. After typing each one, they rated how pleasant it had been to dial the number on the phone. Volunteers believed they were participating in a study on ergonomics — in the debriefing afterward, none had any idea that the numbers might relate to words.

On average, volunteers preferred dialing numbers that related to positive words over those related to negative words. Merely dialing the numbers that corresponded to those letters — not even pushing them multiple times, as you'd usually do to text words on a 10-digit keypad — was enough to activate the concepts in their minds.
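The letter-to-digit correspondence the stimuli rely on is the standard telephone keypad layout. A minimal sketch of the mapping (the helper function is hypothetical, not code from the study):

```python
# Standard telephone keypad (ITU-T E.161): each digit maps to a letter group.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
LETTER_TO_DIGIT = {ch: d for d, letters in KEYPAD.items() for ch in letters}

def word_to_digits(word: str) -> str:
    """Return the digit sequence that spells a word on a phone keypad."""
    return "".join(LETTER_TO_DIGIT[ch] for ch in word.lower())

print(word_to_digits("love"))   # 5683
print(word_to_digits("liebe"))  # 54323 ("love" in German)
print(word_to_digits("angst"))  # 26478
```

Note that the study's stimulus constraint (no digit typed twice in a row, and each number spelling only one word) keeps the digit-to-word mapping unambiguous in the reverse direction.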

This induction of concepts also occurred in another group of volunteers who were asked to dial phone numbers and then identify words on a computer screen immediately afterwards. Volunteers were able to identify words that were implied by the preceding phone number more quickly than words that had nothing to do with the preceding number sequence.

Topolinski relates these findings to a psychology concept called "embodiment" — the idea that certain body movements can make you think about related ideas. Clenching a fist makes people think about power, for example, and holding a heavy clipboard makes them think something is important. "But this is a new door in embodiment research," Topolinski says. "Participants always did some finger movements. They just typed numbers in the cell phone. But I could induce 'slime' or 'love' — any meaning. This was a kind of a motor cipher that you can encode into the muscle system and use to induce a variety of ideas in participants."

The work has practical implications, too. In another experiment, Topolinski had volunteers type numbers that were supposed to go with specific types of businesses: a number implying the German word for "jewelry" for a jeweler, for example, or "apartment" for a rental office. After dialing the phone number and hearing an answering-machine message, volunteers rated the business on its attractiveness. When the number matched the business (for example, a number implying "wealth" for a financial counselor), volunteers gave the business a higher rating than when number and business were mismatched.

Business owners could take this effect into mind when choosing a phone number, Topolinski says. For example, "if you are a lawyer, try to get a phone number which implies the word 'justice,' or if you have a donation hotline, include the sequence 4483 for 'give.'"


Journal Reference:

  1. S. Topolinski. I 5683 You: Dialing Phone Numbers on Cell Phones Activates Key-Concordant Concepts. Psychological Science, 2011; DOI: 10.1177/0956797610397668

Human sentence processing unaffected by sentence structure, eye-movement study finds

 The hierarchical structure of sentences appears to be less important in human sentence processing than previously assumed, according to a new study of readers' eye movements. Readers seem to pay attention to simple word sequences above all. These are the conclusions of research conducted by Dr. Stefan Frank and Prof. Rens Bod from the Institute for Logic, Language and Computation (ILLC) of the University of Amsterdam (UvA).

Their findings, which run counter to the prevailing view, will soon be published in the journal Psychological Science. These results provide new insights into human language cognition: structures that have been considered extremely important in understanding language do not appear to have psychological relevance.

Seen superficially, sentences consist of a series of words. However, sentences also have a deeper hierarchical structure: they consist of phrases, which may themselves consist of phrases, and so on. Since the pioneering work of the linguist Noam Chomsky in the 1950s, most psycholinguists have believed that this structure plays a crucial role in sentence processing. However, Frank and Bod show that the cognitive system of language users is especially sensitive to the superficial, serial structure of sentences, and that the hierarchical structure plays little role.

Expectations during reading

While reading text, expectations are continuously built up with regard to the words to follow. Infringing on these expectations slows down reading, which is detectable in the reader's eye movements. The UvA researchers took a dataset of these types of eye movement patterns and connected that to various statistical models of language. Based on certain assumptions about the sentence structure, these models calculate the degree to which each word was to be expected. It appeared that the eye movements were accurately predicted by models that look only at the superficial word sequences rather than hierarchical structures. This suggests that the reader largely ignores sentence structure and pays attention to the word sequences instead.
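The "degree to which each word was to be expected" is commonly quantified as surprisal, the negative log-probability of a word given its context. A minimal sketch of the word-sequence approach, using a bigram model over a toy corpus (the corpus, the smoothing choice, and the function names are invented for illustration; the study's models were trained on far larger data):

```python
import math
from collections import Counter

# Toy corpus; real models are trained on millions of words.
corpus = "the dog chased the cat . the cat saw the dog .".split()

bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def surprisal(prev: str, word: str) -> float:
    """Surprisal in bits: -log2 P(word | prev), with add-one smoothing."""
    vocab = len(unigrams)
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
    return -math.log2(p)

# A frequently seen continuation is less surprising than an unseen one;
# higher surprisal predicts longer reading times in eye-movement data.
print(surprisal("the", "dog"))  # lower
print(surprisal("the", "saw"))  # higher
```

The contrast in the study was between sequence-based models like this one, which condition only on preceding words, and models that condition on hierarchical phrase structure; the eye-movement data were predicted better by the former.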

The study is part of the Vici-programme 'Integrating Cognition' funded by the Netherlands Organisation for Scientific Research (NWO) and headed by Rens Bod.

Innovative iPhone app developed to carry out psychological and social research

Royal Holloway, University of London has joined an international team of researchers to develop a new way of conducting psychological and social research.

Instead of bringing people into laboratories the team has launched an iPhone/iPad app that people can download for free in English, French and Dutch.

The app, called "Science XL: Test your word power," adapts a classic behavioural psychology experiment for iPhone or iPad use. The user can test his or her word power by deciding whether each word presented is a real word or a non-word. The application measures accuracy and, importantly, the time taken to make such decisions, i.e., reaction time.

The world-wide results obtained will help researchers to advance their understanding of how the brain recognises words and could ultimately help in understanding disorders such as dyslexia. They are aiming to get data from 50,000 individuals for each of the three languages they are studying.

Professor Rastle, from the Department of Psychology at Royal Holloway, is the UK member of the research team. She says: "Using the iPhone or iPad to conduct scientific research is a revolutionary new concept. It could change the way that human social and psychological research is conducted because it allows us to access vast numbers of individuals from a range of demographics relatively inexpensively."

The app is free to download from iTunes AppStore (search for "Science XL") and is non-profit making.

Physical activity linked to political participation

NewsPsychology (Feb. 1, 2011) — How is going for a jog like voting for president? As far as our brains are concerned, physical activity and political activity are two sides of the same coin. Scientists found that people who live in more active states are also more likely to vote. And in an experiment, volunteers who were exposed to active words like “go” and “move” said they were more likely to vote than did people who saw words like “relax” and “stop.”

The study was inspired by research showing that brains lump all kinds of activity together. For instance, a message that’s meant to promote fitness — physical activity — can also trigger people to eat more — another kind of activity, and with the exact opposite result. Kenji Noguchi of the University of Southern Mississippi, Ian M. Handley of Montana State University, and Dolores Albarracín of the University of Illinois at Urbana-Champaign were inspired by the 2008 presidential election to see if the same was true for political activity.

For a new study published in Psychological Science, a journal of the Association for Psychological Science, the researchers pulled together data on how often people exercise, diabetes rates, obesity, and use of amphetamines and other stimulants, to create an “action-tendency index” for the 50 states. They ranked states, from the movers and shakers of Colorado, Alaska, and Oregon to slower-paced Mississippi, West Virginia, and Tennessee. The states’ ranking for physical activity roughly corresponded with voter turnout in the 2004 election.
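A composite measure like the "action-tendency index" is typically built by standardizing each component across states and averaging the z-scores. A sketch of that generic recipe (the state values and the two components here are invented; the study's actual components, directions, and weighting may differ):

```python
from statistics import mean, stdev

# Hypothetical state-level measures, scaled so higher = more "active".
data = {
    "Colorado":    {"exercise": 0.62, "stimulant_use": 0.08},
    "Oregon":      {"exercise": 0.58, "stimulant_use": 0.07},
    "Mississippi": {"exercise": 0.41, "stimulant_use": 0.03},
}

def zscores(values):
    """Standardize a list of values to mean 0, sample standard deviation 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Z-score each component across states, then average per state.
components = list(next(iter(data.values())).keys())
zs = {c: zscores([data[st][c] for st in data]) for c in components}
index = {st: mean(zs[c][i] for c in components) for i, st in enumerate(data)}

ranking = sorted(index, key=index.get, reverse=True)
print(ranking)  # most to least active
```

Components that run in the opposite direction (such as obesity or diabetes rates, where higher values indicate less activity) would be sign-flipped before averaging.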

The researchers also tested this link with experiments. In one experiment before the 2008 election, students completed words with some letters missing — so they were exposed to words like “go” and “active” or “relax” and “paralyze.” Students who had encountered active words were more likely to say they would vote in the presidential election than students who worked with words like “freeze.”

This link to physical activity could be used to encourage people to vote, says Albarracín. “It could be anything from promoting voting in a sports context to connecting voting to a self-help context that encourages being proactive — that’s a big audience that’s thinking about how to improve their own lives and may not otherwise think of doing so politically. This might be easier than getting politically naïve or uninvolved people to vote because they care about politics per se.”



Story Source:

The above story is reprinted (with editorial adaptations by newsPsychology staff) from materials provided by Association for Psychological Science.

Journal Reference:

  1. K. Noguchi, I. M. Handley, D. Albarracin. Participating in Politics Resembles Physical Activity: General Action Patterns in International Archives, United States Archives, and Experiments. Psychological Science, 2010; DOI: 10.1177/0956797610393746
