Attention-deficit/hyperactivity disorder is the most prevalent mental disorder among children under age 18 years in the United States, followed by depression, behavioral or conduct problems, anxiety, substance-use disorders, autism spectrum disorders, and Tourette syndrome, according to a report issued May 16 by the Centers for Disease Control and Prevention.
Millions of children in the United States have mental disorders, which are "a substantial public health concern with considerable associated costs to individuals, families, and society," concludes the report, titled "Mental Health Surveillance Among Children in the United States – 2005-2011." The report estimates that 20% of the children in the United States have mental disorders; the prevalence of mental disorders is increasing; and about $247 billion is spent on mental health in children annually.
This is the first CDC report to track the number of children (under age 18 years) in the United States who have a specific mental disorder, defined as "serious deviations from expected cognitive, social, and emotional development." The definition includes conditions that meet DSM-IV-TR criteria or criteria in the International Classification of Diseases. The report is published as a supplement to the May 17 Morbidity and Mortality Weekly Report (MMWR 2013;62 [Suppl. 2]:1-35).
"This is an important and helpful report," Dr. David Fassler, a child psychiatrist in Burlington, Vt., said in an interview. "Consistent with previous findings, it demonstrates that nearly one child in five has signs and symptoms of a significant psychiatric illness in any given year.
"The report also confirms that the prevalence of these disorders appears to be increasing, although the authors note that this finding may be due to changes in case definition, public perception, or policies regarding access to health care."
Furthermore, he predicted that the data in the report would be useful to parents, advocates, legislators, and regulators. The findings "underscore the growing need for enhanced access to comprehensive mental health and substance abuse treatment services for children, adolescents and families," added Dr. Fassler, clinical professor of psychiatry, University of Vermont, Burlington.
The findings of the report were based on data from different national surveys and studies in the United States. The proportion of children affected by the outcomes studied varied by condition, survey, and age group, the authors said, but they "begin to illustrate the impact of mental disorders among children."
Among the findings was that, in any given year, 13%-20% of children have a mental disorder. Among current diagnoses of mental disorders reported by parents of children aged 3-17 years, attention-deficit/hyperactivity disorder (ADHD) was the most common, at almost 7%. The next most common were behavioral or conduct problems (3.5%), anxiety (3%), depression (2.1%), autism spectrum disorders (ASDs) (1.1%), and Tourette syndrome (0.2% among children aged 6-17 years).
In addition, almost 5% of adolescents aged 12-17 years reported having an "illicit drug use disorder" in the past year, and 4.2% had an alcohol abuse disorder in the past year.
In 2010, suicide was the second-leading cause of death among children aged 12-17 years. Among people aged 10-19 years, the suicide rate was 4.5 suicides/100,000 people. About 8% of adolescents aged 12-17 years said that they had at least 14 "mentally unhealthy days in the past month."
The authors also found that all demographic groups were affected by mental disorders, but the estimated prevalence varied among racial and ethnic groups. For example, the prevalence of ADHD was lowest among Hispanic children, and behavioral or conduct problems were highest among black non-Hispanic children. ASDs "tended to be higher" among white non-Hispanic children, and white non-Hispanic children were affected by anxiety more than were black non-Hispanic children.
In addition, anxiety, ADHD, and ASD were more common among children who had health insurance.
The authors concluded that surveillance "is a critical first step in the public health approach to mental health among children," and that surveillance data "can help prioritize areas for research on risk and protective factors and provide empirical evidence to develop effective interventions that can prevent mental disorders and promote mental health."
The authors of the report are from the CDC, the Health Resources and Services Administration, the National Institute of Mental Health, and the Substance Abuse and Mental Health Services Administration.
The report complements a 2011 CDC report on mental illness in adults.
Despite resistance to bullying from both employers and employees, many workplace bullies achieve high levels of career success, according to a new study from the University at Buffalo School of Management.
Published in the Journal of Managerial Psychology, the study found that some workplace bullies have strong social skills, which they use to strategically abuse their coworkers, yet still receive positive evaluations from their supervisors.
The study marks the first attempt to measure the relationship between being a bully and job performance. It offers an initial explanation of why bullies thrive in the workplace despite organizational attempts to sanction bullying behaviors.
“Many bullies can be seen as charming and friendly, but they are highly destructive and can manipulate others into providing them with the resources they need to get ahead,” says the study’s co-author, Darren Treadway, PhD, associate professor of organization and human resources in the UB School of Management.
Workplace bullying is pervasive. The study notes that as many as half of all employees in the U.S. have witnessed bullying at work, and 35 percent have been the target of bullying.
The researchers collected behavioral and job performance data over two time periods from 54 employees at a mental health organization in the northwest U.S. to capture the individual differences and social perception of bullies in the workplace. Regression analyses were conducted on this sample, whose size is consistent with that of previous studies.
The results showed a strong correlation between bullying, social competence and positive job evaluations.
Treadway says the findings are relevant beyond the health services industry and that companies should limit bullying behavior while rewarding high-performing employees.
“Employers can work to reduce the prevalence by finding organizationally appropriate ways for employees to achieve their goals, by incorporating measures of civility and camaraderie into performance evaluations, and by helping staff to develop the skills needed to manage bullies,” says Treadway.
Future research, he says, should focus on how bullies select their victims.
Cancer survivors sometimes suffer from a condition known as “chemo fog”—a cognitive impairment caused by repeated chemotherapy. A study hints at a controversial idea: that brain-training software might help lift this cognitive cloud.
Various studies have concluded that cognitive training can improve brain function in both healthy people and those with medical conditions, but the broader applicability of these results remains controversial in the field.
In a study published in the journal Clinical Breast Cancer, investigators report that breast cancer survivors who used a brain-training program for 12 weeks were more cognitively flexible, more verbally fluent, and faster-thinking than survivors who did not train.
Patients treated with chemotherapy show changes in brain structure and function in line with diffuse brain injury, and they often report long-term cognitive effects, says Shelli Kesler, a Stanford University clinical neuropsychologist who led the research. The new study “suggests that cognitive training could be one possible avenue for helping to improve cognitive function in breast cancer survivors treated with chemotherapy,” she says.
The results may not convince everyone. “One of the biggest challenges in the cognitive training world is to show an effect that generalizes to real-world functioning,” says Susan Landau, a neuroscientist at the University of California, Berkeley. Several companies offer commercial cognitive training programs that promise improvements in memory, attention, mental agility, and problem-solving skills. The appeal is clear, says Zach Hambrick, a psychologist at Michigan State University in East Lansing, but whether they have lasting general effects is not.
In the study conducted by Kesler and colleagues, the participants trained at home on Lumosity, a collection of gamelike cognitive exercises developed by Lumos Labs in San Francisco. (Lumos Labs did not fund the study.)
Kesler’s project is one of around two dozen efforts using Lumosity software to study human cognition. With 35 million customers worldwide, Lumosity is collecting what it says is the world’s largest database of human cognition, which could be queried for connections between lifestyle and cognitive ability. “Our technology collects a lot of data and makes it easy to run experiments to learn more generally about human cognitive performance,” says Mike Scanlon, cofounder of Lumos Labs. “We track all of the results from the cognitive testing and training, and we can combine that with demographic information to learn about how people’s cognitive performance changes and develops over the years.”
One such finding, he says, is a correlation between outside weather temperature and cognitive performance: “It turned out that the colder it is, the higher people’s performance is, even though generally they are inside doing this on a computer.”
Most of the scientific projects involving Lumosity’s software are exploring the effectiveness of brain training in different populations, from schoolchildren to stroke patients. For the study on breast cancer survivors, 41 women aged 40 and older, who were at least a year and a half past their last chemotherapy treatment, were tested on several cognitive tasks at the beginning of the study. Then half the women used Lumosity training modules for 20 to 30 minutes four times a week for 12 weeks, and all were tested again.
When the investigators tested the participants in verbal memory, processing speed, and cognitive function, they found that the women who had used the brain training program improved in three of five objective measures.
“This is a well-done study—they had not just one transfer test but several,” says Hambrick, who notes that many studies of cognitive training depend on a single test to measure results. “But an issue is the lack of activity within the control group.” Better would be to have the control group do another demanding cognitive task in lieu of Lumosity training, something analogous to a placebo, he says: “The issue is that maybe the improvement in the group that did the cognitive training doesn’t reflect enhancement of basic cognitive processes per se, but could be a motivational phenomenon.”
Even if the effects are due to motivation or some other benefit not related to mental agility, that’s still useful, says Landau. “If [cognitive training] is something that makes people feel good and improves their confidence in their own skills, that’s not trivial at all,” she says. “That could be a big part of the effect that’s observed.”
‘Research has found emotional eaters tend to eat more when happy’, reports the Mail Online website. The news is based on a small study looking at whether experimentally altering mood has an effect on the amount of calories a person eats.
The researchers examined the effects on what they describe as ‘emotional eaters’ – people who reported using food as a coping mechanism for emotions.
A group of 86 students, who said they were either emotional or non-emotional eaters, were shown TV and movie clips to evoke either a positive, negative or neutral mood. The researchers then assessed how much the students ate when provided with bowls of crisps and chocolate, as well as assessing their change in mood.
Emotional eaters who were shown the positive mood-inducing scenes significantly increased their food intake compared to emotional eaters shown the neutral mood-inducing scenes. However, the negative mood-inducing scenes had no effect on the food intake of either emotional or non-emotional eaters.
The common assumption is that emotional eaters eat more when in a negative mood, but this study provides very limited evidence to suggest that this may not always be the case.
However, because this experiment was based in a laboratory and researchers did not measure how hungry people were, even this finding should be viewed with caution. As ever, more and better research is needed if people with eating disorders or weight problems are to be helped effectively.
Cannot control your eating?
It is common for many of us to take solace in scoffing down an entire chocolate bar or pigging out on a pizza at the end of a bad day. But if you find yourself regularly binge eating in order to cope with emotional stress then you may need medical help for binge eating.
Warning signs include:
Eating a large amount of food when you are not hungry
Eating alone or secretly due to being embarrassed about the amount of food you eat
Having feelings of guilt, shame or disgust after binge eating
Where did the story come from?
The study was carried out by researchers from Maastricht University in The Netherlands and was funded by the Netherlands Organization for Scientific Research. It was published in the peer-reviewed journal, Appetite.
The story was picked up by the Mail Online website and it was covered appropriately, although the limitations of the study could have been described in more detail.
What kind of research was this?
This was a laboratory study looking at the effect of experimentally influencing mood changes in a group of students reported to be emotional or non-emotional eaters, and then looking at the effect on their food and calorie intake.
The researchers say emotional eaters are thought to increase their food intake in response to negative emotions, but little is known about the effect of positive emotions on their food intake. Meanwhile, non-emotional eaters are not believed to change their intake levels in response to emotions, and they might even restrict food intake in response.
The main limitation of this research is that a study of a small, select population sample under experimental conditions can only provide very limited indications about the possible influence emotions may have upon the eating patterns of different people in daily life.
For example, if you thought that researchers could be measuring how much you were eating it could make you, perhaps unconsciously, reluctant to eat as much as you normally would. Alternatively, being in this type of study could make you nervous, leading you to eat more than you normally would.
What did the research involve?
The researchers recruited 86 psychology students in their second year at Maastricht University in the Netherlands who received credit points for their participation. The students were predominantly female (75%) and had an average age of 21.6 years (range 19 to 43).
The students answered a series of questionnaires to assess their mental health and eating behaviours. Emotional eating was assessed using a questionnaire called the Dutch Eating Behaviour Questionnaire (DEBQ). Students were asked, ‘Do you have a desire to eat when you’re feeling lonely?’ and provided answers on a five-point Likert scale that ranged from ‘never’ to ‘very often’.
The researchers then carried out a series of experiments in a laboratory setting that aimed to change the students’ moods. Students were randomly allocated to view clips from television or films that aimed to evoke either a positive, negative or neutral mood:
28 students were shown two clips to evoke a positive mood. Firstly, they were shown a scene from the television series Mr. Bean (which showed Mr. Bean struggling to copy answers from his neighbour during an exam). The second clip was taken from the movie ‘When Harry Met Sally’, which showed the famous scene where Meg Ryan’s character simulates an orgasm in front of other diners in a restaurant.
28 students were shown one negative clip from the film ‘The Green Mile’, which showed an innocent man being executed.
30 students were shown part of a documentary about fishing to evoke a neutral mood.
The students were told to give in to the emotions the clips evoked, and were presented with bowls containing 191g of chocolate (white, milk and dark, equivalent to 1,000 kcal), 225g of salted crisps (1,229 kcal) and 225g of ketchup crisps (1,217 kcal). The bowls were weighed before and after the experiment to determine the amount of food eaten and calorie intake.
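The calorie estimate behind this weighing step is simple arithmetic: grams eaten are converted to kilocalories using each bowl's energy density. A minimal sketch of that calculation (the bowl weights and kcal totals come from the study above; the function name and the 30 g example are illustrative):

```python
def kcal_consumed(start_g, end_g, total_g, total_kcal):
    """Estimate calories eaten from pre/post bowl weights,
    assuming a uniform energy density across the bowl."""
    eaten_g = start_g - end_g
    return eaten_g * (total_kcal / total_g)

# Example: 30 g eaten from the 191 g / 1,000 kcal chocolate bowl
print(round(kcal_consumed(191, 161, 191, 1000)))  # → 157
```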
The students were asked to assess their mood using a visual analogue scale (this is essentially a straight line – where the far left of the line represents poor mood and the far right represents very good mood) at five points during the experiment:
Before the experiment began
Immediately after watching the television or movie scenes
5 minutes after the experiment
10 minutes after the experiment
15 minutes after the experiment
The students were told when entering the laboratory that they were taking part in an experiment on the effect of movie clips on taste perception.
The researchers analyzed their results using validated methods and adjusted the results for gender, body mass index (BMI), external eating and dietary restraint as assessed by the DEBQ, and negative mood as assessed by the Positive and Negative Affect Schedule (PANAS).
What were the basic results?
Overall, there was no significant difference in the amount eaten by emotional and non-emotional eaters, whether they were shown positive, negative or neutral clips.
When looking specifically at only the emotional eaters:
Those shown the positive mood-inducing scenes significantly increased their intake of food compared to those shown the neutral mood-inducing scenes
There was no difference in food intake between students shown negative mood-inducing scenes and those shown neutral or positive mood-inducing scenes
How did the researchers interpret the results?
The researchers concluded that self-reported emotional eaters respond in a different way to emotions than non-emotional eaters. They say that emotional eaters ate more in a positive mood compared to a neutral mood, whereas non-emotional eaters ate about the same amount in both conditions.
In discussing the results, the researchers say the findings could be of value for the treatment of obesity.
Overall, this small study provides very limited evidence to suggest emotional eaters eat more when feeling in a positive mood. There are several limitations to this study, some of which are noted by the researchers. These include the facts that:
The laboratory setting may not be an appropriate setting to test emotional eating with different mood feelings. It is possible that students felt uncomfortable in this setting and limited their food intake as they were being watched
The students were told they were partaking in an experiment of taste perceptions, so may have been inclined to eat more than they normally would have because of what they were told the study was looking at
No hunger measurements were taken during the study and how hungry each student was could have greatly affected the results
There was no group included in the study that did not eat, so it is not possible to say from the findings that the changes in mood were due to food intake
All of the participants were students, so findings may not be the same as if the same experiments were carried out in different groups who report being emotional eaters
To draw firmer conclusions about the effects of mood on emotional eating, larger studies of different groups are required that carry out experiments in more natural environments.
Your brain often works on autopilot when it comes to grammar. That theory has been around for years, but University of Oregon neuroscientists have captured elusive hard evidence that people indeed detect and process grammatical errors with no awareness of doing so.
Participants in the study — native-English speaking people, ages 18-30 — had their brain activity recorded using electroencephalography, from which researchers focused on a signal known as the Event-Related Potential (ERP). This non-invasive technique allows for the capture of changes in brain electrical activity during an event. In this case, events were short sentences presented visually one word at a time.
Subjects were given 280 experimental sentences, including some that were syntactically (grammatically) correct and others containing grammatical errors, such as "We drank Lisa's brandy by the fire in the lobby," or "We drank Lisa's by brandy the fire in the lobby." A 50 millisecond audio tone was also played at some point in each sentence. A tone appeared before or after a grammatical faux pas was presented. The auditory distraction also appeared in grammatically correct sentences.
This approach, said lead author Laura Batterink, a postdoctoral researcher, provided a signature of whether awareness was at work during processing of the errors. "Participants had to respond to the tone as quickly as they could, indicating if its pitch was low, medium or high," she said. "The grammatical violations were fully visible to participants, but because they had to complete this extra task, they were often not consciously aware of the violations. They would read the sentence and have to indicate if it was correct or incorrect. If the tone was played immediately before the grammatical violation, they were more likely to say the sentence was correct even if it wasn't."
When tones appeared after grammatical errors, subjects detected 89 percent of the errors. In cases where subjects correctly declared errors in sentences, the researchers found a P600 effect, an ERP response in which the error is recognized and corrected on the fly to make sense of the sentence.
When the tones appeared before the grammatical errors, subjects detected only 51 percent of them. The tone before the event, said co-author Helen J. Neville, who holds the UO's Robert and Beverly Lewis Endowed Chair in psychology, created a blink in their attention. The key to conscious awareness, she said, is based on whether or not a person can declare an error, and the tones disrupted participants' ability to declare the errors. But, even when the participants did not notice these errors, their brains responded to them, generating an early negative ERP response. These undetected errors also delayed participants' reaction times to the tones.
"Even when you don't pick up on a syntactic error your brain is still picking up on it," Batterink said. "There is a brain mechanism recognizing it and reacting to it, processing it unconsciously so you understand it properly."
The brain processes syntactic information implicitly, in the absence of awareness, the authors concluded. "While other aspects of language, such as semantics and phonology, can also be processed implicitly, the present data represent the first direct evidence that implicit mechanisms also play a role in the processing of syntax, the core computational component of language."
It may be time to reconsider some teaching strategies, especially how adults are taught a second language, said Neville, a member of the UO's Institute of Neuroscience and director of the UO's Brain Development Lab.
Children, she noted, often pick up grammar rules implicitly through routine daily interactions with parents or peers, simply hearing and processing new words and their usage before any formal instruction. She likened such learning to "Jabberwocky," the nonsense poem introduced by writer Lewis Carroll in 1871 in "Through the Looking Glass," where Alice discovers a book in an unrecognizable language that turns out to be written inversely and readable in a mirror.
For a second language, she said, "Teach grammatical rules implicitly, without any semantics at all, like with jabberwocky. Get them to listen to jabberwocky, like a child does."
The National Institute on Deafness and Other Communication Disorders of the National Institutes of Health supported the research (grant 5R01DC000128).
An international consortium of researchers has created the largest computer model of human metabolism to date, an astonishingly detailed roadmap that points the way to better understanding of cancer, obesity, diabetes, heart disease and a host of other conditions. It’s a powerful new tool that will speed the development of new drugs and treatments and, eventually, may allow doctors to tailor medicine to each patient’s personal biology.
The model, called Recon 2, details thousands of metabolic functions that occur within humans’ cells. By understanding these functions, their interactions and how they influence cellular activity, scientists can get the big picture of the microscopic cellular universe.
“Metabolism is central to much of our body’s function, and this model captures thousands of different metabolic processes,” explained Jason Papin, a researcher at the University of Virginia School of Medicine involved in the project. “We start with the human genome. This modeling effort is a way to functionalize the genome, a way to make value out of that sequence information.
“With the genome, you have a parts list, the components. What this model does is take the functions associated with those components and put them together in a mathematical way so that you can start to predict how it will behave.”
The model is by far the most complete computer representation of metabolism yet, incorporating several previous models and more than 1,000 papers. It represents a collaborative effort of a substantial percentage of the top metabolism researchers from around the globe. By bringing together so much of science’s understanding of metabolism, the researchers have created a way to better understand the metabolic mistakes that cause disease – and to speed future breakthroughs to battle those diseases.
Take cancer, for example. “The idea would be that if a patient’s tumor becomes resistant to existing therapies, these models of metabolism can help point to new therapies or new pathways that we can target with drugs to help stop growth,” Papin said. “Cancer growth is a function of metabolism. Metabolism is there to help it grow. And we’re hoping this modeling effort will help us know how to inhibit some of those key processes.”
The researchers describe the model in a paper in the May issue of the journal Nature Biotechnology. They have made the model freely available online, at www.humanmetabolism.org, and they’re already at work making it even more comprehensive.
“This is really a starting point,” Papin said. “The model has much, much to be improved, for sure. But in the end what we want to be able to do is have a computer model of the whole cell, and with that computer model hopefully be able to make all kinds of useful predictions and guide new experiments and help interpret new data that’s generated. So while this is a first step, I think it’s an important, big first step.”
Prenatal exposure to the flu virus has previously been linked to schizophrenia, and investigators now say the same exposure may be a risk factor for bipolar disorder.
In a population-based cohort of Californians born between 1959 and 1966, exposure to influenza in utero was associated with a nearly fourfold increase in the risk of bipolar disorder (odds ratio [OR] 4.21, 95% CI 1.60 to 11.05; P=0.004) -- even after adjustment for maternal age and history of psychiatric disorders and other potential confounders, according to Alan Brown, MD, of Columbia University in New York City, and colleagues.
The research, published online May 8 in JAMA Psychiatry, was a continuation of a nested-case control study conducted to examine whether serologically documented prenatal exposure to flu increased the risk for schizophrenia.
In that earlier study, published in 2004, the investigators reported a sevenfold increase in schizophrenia risk among those exposed to influenza during the first trimester, and a threefold increase in risk associated with exposure during mid-pregnancy.
Using the same birth cohort, Brown and colleagues examined whether gestational exposure to influenza may be a risk factor for bipolar disorder. Cases and controls were drawn from the Child Health and Development Study (CHDS), which recruited nearly all pregnant women in northern California receiving obstetric care from Kaiser Permanente between January 1959 and January 1967.
Information about influenza during pregnancy was obtained from maternal medical records.
Potential cases of bipolar disorder were identified through database linkages of identifiers among the study group, Kaiser Permanente's database, and a large county healthcare database. Mailed questionnaires, interviews, and data from an earlier psychiatric follow-up study of the birth cohort were all used to confirm the diagnosis.
The researchers found that offspring exposed to influenza infection in the womb were nearly four times more likely to develop bipolar disorder overall, while exposure in the third trimester was associated with a nearly fivefold increase in risk (unadjusted OR 4.72, 95% CI 1.13 to 19.76; P=0.03). Influenza exposure in the second trimester tended to be associated with a nearly sixfold increase in risk, although it was not statistically significant (unadjusted OR 5.89, 95% CI 0.80 to 43.36; P=0.08).
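The odds ratios and confidence intervals quoted here come from standard 2x2-table arithmetic. A minimal sketch of that calculation (the Woolf log-OR method; the example counts are invented for illustration and are not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI (Woolf method) for a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 8 exposed cases, 20 exposed controls,
# 30 unexposed cases, 300 unexposed controls
or_, lo, hi = odds_ratio_ci(8, 20, 30, 300)
print(f"OR {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")  # OR 4.00, 95% CI 1.62 to 9.86
```

A wide interval like the study's 0.80 to 43.36 for second-trimester exposure is what this formula produces when the cell counts are small, which is why that result did not reach significance.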
The researchers concluded that studies in larger birth cohorts are needed, as is further investigation of the CHDS cohort, including examination of archived serum specimens taken from the mothers during pregnancy for evidence of influenza antibody.
"Further studies with larger sample sizes will be necessary to confirm the present findings and to probe more precisely the specificity of timing of exposure during gestation and bipolar disorder onset," the researchers wrote.
They added that if exposure to influenza infection in utero is demonstrated to be causally associated with bipolar disorder, preventative measures such as prepregnancy vaccination will prove to be particularly important in women who might become pregnant, especially if other risk factors for bipolar disorder are present.
"Maternal influenza may be a risk factor for bipolar disorder," the researchers wrote. "Although replication is required, the findings suggest that prevention of maternal influenza may reduce the risk of bipolar disorder."
A brain-training task that increases the number of items an individual can remember over a short period of time may boost performance in other problem-solving tasks by enhancing communication between different brain areas. The new study being presented this week in San Francisco is one of a growing number of experiments on how working-memory training can measurably improve a range of skills – from multiplying in your head to reading a complex paragraph.
“Working memory is believed to be a core cognitive function on which many types of high-level cognition rely, including language comprehension and production, problem solving, and decision making,” says Brad Postle of the University of Wisconsin-Madison, who is co-chairing a session on working-memory training at the Cognitive Neuroscience Society (CNS) annual meeting today in San Francisco. Work by various neuroscientists to document the brain’s “plasticity”– changes brought about by experience – along with technical advances in using electromagnetic techniques to stimulate the brain and measure changes, have enabled researchers to explore the potential for working-memory training like never before, he says.
The cornerstone brain-training exercise in this field has been the “n-back” task, a challenging working-memory task that requires an individual to mentally juggle several items simultaneously. Participants view a stream of stimuli and must decide whether the current stimulus matches the one presented n trials earlier (the stimulus “1-back,” “2-back,” and so on), with n increasing as performance improves. These tasks can be adapted to include an audio component or to require remembering more than one trait of each stimulus over time – for example, both the color and location of a shape.
Through a number of experiments over the past decade, Susanne Jaeggi of the University of Maryland, College Park, and others have found that participants who train with n-back tasks for about 20 minutes per day over roughly a month not only get better at the n-back task itself but also experience “transfer” to other cognitive tasks on which they did not train. “The effects generalize to important domains such as attentional control, reasoning, reading, or mathematical skills,” Jaeggi says. “Many of these improvements remain over the course of several months, suggesting that the benefits of the training are long-lasting.”
As yet unresolved and controversial, however, has been understanding which factors determine whether working-memory training will generalize to other domains, as well as how the brain changes in response to the training. Work by Postle’s group using a new technique of applying electromagnetic stimulation on the brains of people undergoing working-memory training addresses some of these questions.
Training increases connectivity
Bornali Kundu of the University of Wisconsin-Madison, who works in Postle’s laboratory, used transcranial magnetic stimulation (TMS) with electroencephalography (EEG) to measure activity in specific brain circuits before and after training with an n-back task. “Our main finding was that training on the n-back task increased the number of items an individual could remember over a short period of time,” explains Kundu, who is presenting these new results today. “This increase in short-term memory performance was associated with enhanced communication between distant brain areas, in particular between the parietal and frontal brain areas.”
In the n-back task, Kundu’s team presented stimuli one-at-a-time on a computer screen and asked participants to decide if the current stimulus matched both the color and location of the stimulus presented a certain number of presentations previously. The color varied among seven primary colors, and the location varied among eight possible positions arranged in a square formation. The control task was playing the video game Tetris, which involves moving colored shapes to different locations, but does not require participants to remember anything. Before and after the training, researchers administered a range of cognitive tasks on which subjects did not receive training, and simultaneously delivered TMS while recording EEG, to measure communication between brain areas during task performance.
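The trial logic just described can be sketched in a few lines. This is an illustrative toy, not the study's software: the color names, position coding, and helper-function names below are all assumptions. A trial counts as a match only when both color and location repeat from n presentations back.

```python
import random

# Illustrative sketch of the dual-feature n-back trial logic; the color
# names, position coding, and function names are assumptions, not the
# study's actual software.
COLORS = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]
LOCATIONS = list(range(8))  # eight positions arranged in a square

def make_trials(num_trials, seed=0):
    """Generate a random sequence of (color, location) stimuli."""
    rng = random.Random(seed)
    return [(rng.choice(COLORS), rng.choice(LOCATIONS))
            for _ in range(num_trials)]

def is_match(trials, i, n):
    """A trial matches only if BOTH color and location equal those
    of the stimulus presented n trials earlier."""
    return i >= n and trials[i] == trials[i - n]

trials = make_trials(20)
correct_answers = [is_match(trials, i, 2) for i in range(len(trials))]
```

Because both features must repeat, chance matches are rare, which is part of what makes the dual-feature version demanding.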
After practicing the n-back task 5 hours a day, 5 days a week, for 5 weeks, subjects were able to remember more items over short periods of time. Importantly, for those whose working memory improved, communication between the dorsolateral prefrontal cortex (DLPFC) and parietal cortex also improved. “This is in comparison to the control group, who showed no such differences in neural communication after practicing Tetris for 5 weeks,” Kundu says.
Working-memory training also produced improvement on untrained cognitive tasks that are likewise believed to rely on communication between the parietal cortex and DLPFC. For two of these tasks – detecting a change in a briefly presented array of squares, and detecting a red letter “C” embedded in a field of distracting rotated red “C”s and blue “C”s – those who had trained on the n-back task also showed a decrease in task-related EEG activity, mirroring the decrease seen on the training exercise itself. “The overall picture seems to be that the extent of transfer of training to untrained tasks depends on the overlap of neural circuits recruited by the two,” Kundu says.
Developing future therapies
Moving forward, many cognitive neuroscientists are working to see how working-memory training may specifically help clinical populations, such as patients with ADHD. “If we can learn the ‘rules’ that govern how, why, and when cognitive training can produce improvements that generalize to untrained tasks, it may be that therapies can be developed for patients suffering from neurological or psychiatric disease,” Postle says.
Jaeggi’s team, as well as Torkel Klingberg of the Karolinska Institute in Sweden, who is also presenting at the symposium today in San Francisco, has had success with such training for children with ADHD, decreasing their symptoms of inattention. “Here, the reason working-memory training may transfer to tests of fluid intelligence, as well as to a reduction in ADHD-associated hyperactivity symptoms, may be because both of those complex behaviors use some of the same brain circuits also used in performing the working-memory training tasks,” Kundu says.
“Individual differences in working memory performance have been related to individual differences in numerous real-world skills such as reading comprehension, performance on standardized tests, and much more,” she adds. “I would not expect the same sorts of transfer effects that have been seen with working-memory training to happen if an individual practiced a task that used a minimally overlapping network – such as, for example, shooting three-pointers, which presumably uses different brain areas like primary and secondary motor cortex and the cerebellum.”
Jaeggi says that it is important to understand that cognitive abilities are not as unchangeable as some might think. “Even though there is certainly a hereditary component to mental abilities, that does not mean that there are not also components that are malleable and respond to experience and practice,” she says. “Whereas we try to strengthen participants’ working memory skills in our research, other routes are possible as well, such as physical or musical training, meditation, nutrition, or even sleep.”
Despite all the promising research, Jaeggi says, researchers still need to understand many aspects of this work, such as “individual differences that influence training and transfer effects, the question of how long the effects last, and whether and how the effects translate into more real-world settings and, ultimately, academic achievement.”
Can you "click" with someone after only four minutes? That's the question at the heart of new research by Stanford scholars Dan McFarland and Dan Jurafsky that looks at how meaningful bonds are formed.
McFarland, a sociologist at Stanford's Graduate School of Education, and Jurafsky, a computational linguist, analyzed the conversations of heterosexual couples during speed dating encounters to find out why some people felt a sense of connection after the meeting and others didn't.
Their paper, "Making the Connection: Social Bonding in Courtship Situations," was published this month in the American Journal of Sociology.
"One of the key features of a community, social network or relationship is the sense that it's meaningful, that there is some kind of force behind the relationship," McFarland said. "We wanted to get at what the essence of the connection is, what makes people feel like they bonded."
McFarland said much of the literature on social bonding points to characteristics – traits, status, attributes, motivation, experiences – as reasons why people connect. But, he said, those explanations ignore or downplay the role of communication.
There is a great deal of uncertainty, the paper notes, about the meaning of signals we send to other people, and how that plays into forging interpersonal connections.
"We wanted to see if there is anything about the interaction that matters or is it really just what I look like, what I do, what my motivation is. Is it all things that are psychological or in my head or is there actually something in how we hit it off?"
Their analysis of nearly 1,000 dates found that words, indeed, do matter. How the words are delivered, when and for how long make a difference to how people feel toward each other, and in this case, whether the men and women sensed that they "clicked" during their encounter.
The four-minute date, the study found, was enough time to forge a meaningful relationship – something that seemed to go beyond looks and motivation. But female participants reported lower rates of "clicking" than men, suggesting the women are more selective and, in this particular setting, more powerful.
The participants in the study were graduate students at Stanford, and wore audio recording devices during their dates. The dates lasted four minutes each, and after they were done, the participants filled out a scorecard that, among other things, asked if he or she would like to go out on a real date with the person. If both parties said yes, a real date was set up.
For the purposes of this study, the participants also filled out pre- and post-date surveys.
The dates were transcribed and computer software was used to analyze the words and speech to see if any characteristics of the language corresponded to the participants' reporting of feeling a sense of connection.
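In miniature, counting the kinds of lexical features later linked to "clicking" might look like the sketch below. The phrase lists, function names, and example utterances are illustrative assumptions, not the study's actual lexicon or analysis software.

```python
# Toy lexical-feature counter for date transcripts; the phrase lists
# and names below are illustrative assumptions, not the study's lexicon.
APPRECIATION = ["that's awesome", "good for you", "that's great"]
SYMPATHY = ["that must be tough", "that sounds hard", "i'm sorry"]

def count_feature(utterances, phrases):
    """Count utterances containing at least one target phrase."""
    return sum(
        any(p in u.lower() for p in phrases)
        for u in utterances
    )

transcript = [
    "So you moved out here last year?",
    "That's awesome, good for you!",
    "That must be tough on you.",
]
appreciation_hits = count_feature(transcript, APPRECIATION)  # 1
sympathy_hits = count_feature(transcript, SYMPATHY)          # 1
```

Real analyses would also use the audio itself (timing, pitch, loudness), but the counting logic for word-level features is essentially this simple.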
"We were looking at conversational behaviors or speech features and how they express characteristics of the social experience, how you feel about the other person," Jurafsky said.
Women reported a sense of connection to men who used appreciative language ("That's awesome" or "Good for you") and sympathy ("That must be tough on you").
Women also reported clicking with male partners who interrupted them – not as a way to redirect the conversation but to demonstrate understanding and engagement, for example, by finishing a sentence or adding to it.
Both genders reported clicking when their conversations were mainly about the women.
"You could say men are self-centered and women are always trying to please men and dates will go well if they talk about the guy, but it turns out that's just not true. It's just the opposite," McFarland said. "This is a situation in life where women have the power, women get to decide. So talking about the empowered party is a sensible strategy toward feeling connected."
While interrupting could be viewed as positive, asking a lot of questions tended to have a negative result.
"Women feel disconnected when they have to ask men questions, or when men ask them questions," the paper said. Questions were used by women to keep a lagging conversation going and by men who had nothing to say.
Successful dates, the paper notes, were associated with women being the focal point and engaged in the conversation, and men demonstrating alignment with and understanding of the women.
Shared stories also indicated a sense of connection, as did speakers who showed enthusiasm by varying their speech to get louder and softer.
The researchers said the longer it took for the individuals to decide on a date, the more they reported having a bonding experience, suggesting communication can change someone's feelings about another person and break the association with traits.
Further studies could look at same-sex relationships, for example, or could explore the transitions to other states, like marriage.
Stanford's Institute for Research in the Social Sciences and various grants from the National Science Foundation supported this interdisciplinary research effort.
Lower levels of sweat, as measured by skin conductance activity (SCA), have been linked with conduct disorder and aggressive behavior in children and adolescents.
Leader of the study, Professor Stefanie van Goozen of Cardiff University’s School of Psychology hypothesizes that “aggressive children may have lower levels of physiological arousal because they don’t experience the same level of emotional arousal in response to fearful situations as their less aggressive peers. Because they have a weaker fear response, they are more likely to engage in antisocial behavior.”
The researchers wanted to know whether the link between low SCA and aggressive behaviors could be observed even as early as infancy. To investigate this, researchers attached recording electrodes to infants’ feet at age one and measured their skin conductance at rest, in response to loud noises, and after encountering a fear-inducing remote-controlled robot.
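As a rough illustration of the measurement logic above (not the study's actual pipeline), a skin conductance response can be summarized as the change in mean level from a resting baseline. All values and names below are made-up assumptions.

```python
# Illustrative summary of skin conductance activity (SCA); the values
# and function names are made-up assumptions, not the study's pipeline.
def mean_level(samples):
    """Mean skin conductance level, in microsiemens."""
    return sum(samples) / len(samples)

def response_amplitude(baseline, condition):
    """Change in mean conductance from rest to a test condition."""
    return mean_level(condition) - mean_level(baseline)

rest = [2.1, 2.0, 2.2, 2.1]    # resting baseline (microsiemens)
robot = [2.3, 2.6, 2.8, 2.7]   # during the fear-inducing robot
delta = response_amplitude(rest, robot)  # larger delta = stronger arousal
```

On this logic, the study's finding is that infants with a smaller resting level and a smaller delta during the robot encounter were the ones rated more aggressive two years later.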
Data were also collected on the children’s aggressive behavior at age three, as rated by their mothers.
The results revealed that one-year-old infants with lower SCA at rest and during the robot encounter were more physically and verbally aggressive at age three.
Notably, SCA was the only measure in the study that predicted later aggression. The other measures taken in infancy—mothers’ reports of their infants’ temperament, for instance—did not predict aggression two years later.
These findings suggest that while a physiological measure (SCA) taken in infancy predicts aggression, mothers’ observations do not.
“This runs counter to what many developmental psychologists would expect, namely that a mother is the best source of information about her child,” van Goozen notes.
At the same time, this research has important implications for intervention strategies: “These findings show that it is possible to identify at-risk children long before problematic behavior is readily observable,” van Goozen concludes. “Identifying precursors of disorder in the context of typical development can inform the implementation of effective prevention programs and ultimately reduce the psychological and economic costs of antisocial behavior to society.”
Professor Adrian Raine, chair of the department of criminology at the University of Pennsylvania, says: “Stefanie van Goozen’s latest novel findings are powerful in showing that objective physiological measures predict later aggression over and above other measures.
“If these new results can be replicated and extended to other ages, they have potentially important implications for the future prediction of aggressive and violent behavior. They highlight the promise of biological measures, too, in better understanding the etiology of fearless aggressive behavior.”
Mindfulness — paying attention to one’s current experience in a non-judgmental way — might help us to learn more about our own personalities, according to a new article published in the March 2013 issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science.
Recent research has highlighted the fact that we have many blind spots when it comes to understanding our patterns of thinking, feeling, and behaving. Despite our intuition that we know ourselves the best, other people have a more accurate view of some traits (e.g., intellect) than we do. In some cases, blind spots in self-knowledge can have negative consequences, such as poor decision-making, poor academic achievement, emotional and interpersonal problems, and lower life satisfaction.
In this new article, psychological scientist Erika Carlson of Washington University in St. Louis explores one potential strategy for improving self-knowledge: mindfulness.
Mindfulness — a technique often recognized for its positive effects on mental health — involves paying attention to your current experience (e.g., thoughts, feelings) and observing it in a non-judgmental manner.
According to Carlson, these two components of mindfulness, attention and nonjudgmental observation, can overcome the major barriers to knowing ourselves. She argues that the motivation to see ourselves in a desirable way is one of the main obstacles to self-knowledge. For instance, people may overestimate their virtuous qualities to ward off negative feelings or boost self-esteem. However, non-judgmental observation of one’s thoughts, feelings, and behavior, might reduce emotional reactivity — such as feelings of inadequacy or low self-esteem — that typically interferes with people seeing the truth about themselves.
Lack of information is another barrier to self-knowledge — in some situations, people might not have the information they would need to accurately assess themselves. For instance, we have a hard time observing much of our nonverbal behavior, so we may not know that we’re grimacing or fidgeting during a serious conversation. Mindfulness could also help in this domain, as research has shown that mindfulness training is associated with greater bodily awareness.
Drawing from cognitive, clinical, and social psychology, Carlson outlines a theoretical link between mindfulness and self-knowledge that suggests focusing our attention on our current experiences in a nonjudgmental way could be an effective tool for getting to know ourselves better.
This research was supported by National Science Foundation Grant BCS-1025330 awarded to Simine Vazire.
Prisoners who are psychopaths lack the basic neurophysiological “hardwiring” that enables them to care for others, according to a new study by neuroscientists at the University of Chicago and the University of New Mexico.
“A marked lack of empathy is a hallmark characteristic of individuals with psychopathy,” said the lead author of the study, Jean Decety, the Irving B. Harris Professor in Psychology and Psychiatry at UChicago. Psychopathy affects approximately 1 percent of the United States general population and 20 percent to 30 percent of the male and female U.S. prison population. Relative to non-psychopathic criminals, psychopaths are responsible for a disproportionate amount of repetitive crime and violence in society.
“This is the first time that neural processes associated with empathic processing have been directly examined in individuals with psychopathy, especially in response to the perception of other people in pain or distress,” he added.
The results of the study, which could help clinical psychologists design better treatment programs for psychopaths, are published in the article, “Brain Responses to Empathy-Eliciting Scenarios Involving Pain in Incarcerated Individuals with Psychopathy,” which appears online April 24 in the journal JAMA Psychiatry.
Joining Decety in the study were Laurie Skelly, a graduate student at UChicago; and Kent Kiehl, professor of psychology at the University of New Mexico.
For the study, the research team tested 80 prisoners between ages 18 and 50 at a correctional facility. The men volunteered for the test and were tested for levels of psychopathy using standard measures.
They were then studied with functional MRI technology, to determine their responses to a series of scenarios depicting people being intentionally hurt. They were also tested on their responses to seeing short videos of facial expressions showing pain.
The participants in the high psychopathy group exhibited significantly less activation in the ventromedial prefrontal cortex, lateral orbitofrontal cortex, amygdala and periaqueductal gray parts of the brain, but more activity in the striatum and the insula when compared to control participants, the study found.
The high response in the insula in psychopaths was an unexpected finding, as this region is critically involved in emotion and somatic resonance. Conversely, the diminished response in the ventromedial prefrontal cortex and amygdala is consistent with the affective neuroscience literature on psychopathy. This latter region is important for monitoring ongoing behavior, estimating consequences and incorporating emotional learning into moral decision-making, and plays a fundamental role in empathic concern and valuing the well-being of others.
“The neural response to distress of others such as pain is thought to reflect an aversive response in the observer that may act as a trigger to inhibit aggression or prompt motivation to help,” the authors write in the paper.
“Hence, examining the neural response of individuals with psychopathy as they view others being harmed or expressing pain is an effective probe into the neural processes underlying affective and empathy deficits in psychopathy,” the authors wrote.
Decety is one of the world’s leading experts on the biological underpinnings of empathy. His work also focuses on the development of empathy and morality in children.
The study with prisoners was supported with a $1.6 million grant from the National Institute of Mental Health.
Curiosity is the engine of intellectual achievement — it’s what drives us to keep learning, keep trying, keep pushing forward. But how does one generate curiosity, in oneself or others? George Loewenstein, a professor of economics and psychology at Carnegie Mellon University, proposed an answer in the classic 1994 paper, “The Psychology of Curiosity.”
Curiosity arises, Loewenstein wrote, “when attention becomes focused on a gap in one’s knowledge. Such information gaps produce the feeling of deprivation labeled curiosity. The curious individual is motivated to obtain the missing information to reduce or eliminate the feeling of deprivation.” Loewenstein’s theory helps explain why curiosity is such a potent motivator: it’s not only a mental state but also an emotion, a powerful feeling that impels us forward until we find the information that will fill in the gap in our knowledge.
Here, three practical ways to use information gaps to stimulate curiosity:
1. Start with the question. Cognitive scientist Daniel Willingham notes that teachers — along with parents, managers, and leaders of all kinds — are often “so eager to get to the answer that we do not devote sufficient time to developing the question,” Willingham writes in his book, Why Don’t Students Like School? Yet it’s the question that stimulates curiosity; being told an answer quells curiosity before it can even get going. Instead of starting with the answer, begin by posing for yourself and others a genuinely interesting question — one that opens an information gap.
2. Prime the pump. In his 1994 paper, George Loewenstein noted that curiosity requires some initial knowledge. We’re not curious about something we know absolutely nothing about. But as soon as we know even a little bit, our curiosity is piqued and we want to learn more. In fact, research shows that curiosity increases with knowledge: the more we know, the more we want to know. To get this process started, Loewenstein suggests, “prime the pump” with some intriguing but incomplete information.
3. Bring in communication. Language teachers have long put a similar idea to use in exercises that open an information gap and then require learners to communicate with each other in order to fill it. For example, one student might be given a series of pictures illustrating the beginning of the story, while the student’s partner is given a series of pictures showing how that same story ends. Only by speaking with each other (in the foreign language they are learning, of course) can the students fill in each others’ information gaps.
This technique can be adapted to all kinds of settings: for example, colleagues from different departments could be asked to complete a task together, one that requires the identification of information gaps that the coworkers, with their different areas of expertise, must fill in for each other. Communication solves the problem — and leaves the participants curious to know more.
Theodore Berger, a biomedical engineer and neuroscientist at the University of Southern California in Los Angeles, envisions a day in the not too distant future when a patient with severe memory loss can get help from an electronic implant. In people whose brains have suffered damage from Alzheimer’s, stroke, or injury, disrupted neuronal networks often prevent long-term memories from forming. For more than two decades, Berger has designed silicon chips to mimic the signal processing that those neurons do when they’re functioning properly—the work that allows us to recall experiences and knowledge for more than a minute. Ultimately, Berger wants to restore the ability to create long-term memories by implanting chips like these in the brain.
The idea is so audacious and so far outside the mainstream of neuroscience that many of his colleagues, says Berger, think of him as being just this side of crazy. “They told me I was nuts a long time ago,” he says with a laugh, sitting in a conference room that abuts one of his labs. But given the success of recent experiments carried out by his group and several close collaborators, Berger is shedding the loony label and increasingly taking on the role of a visionary pioneer.
Berger and his research partners have yet to conduct human tests of their neural prostheses, but their experiments show how a silicon chip externally connected to rat and monkey brains by electrodes can process information just like actual neurons. “We’re not putting individual memories back into the brain,” he says. “We’re putting in the capacity to generate memories.” In an impressive experiment published last fall, Berger and his coworkers demonstrated that they could also help monkeys retrieve long-term memories from a part of the brain that stores them.
If a memory implant sounds farfetched, Berger points to other recent successes in neuroprosthetics. Cochlear implants now help more than 200,000 deaf people hear by converting sound into electrical signals and sending them to the auditory nerve. Meanwhile, early experiments have shown that implanted electrodes can allow paralyzed people to move robotic arms with their thoughts. Other researchers have had preliminary success with artificial retinas in blind people.
Still, restoring a form of cognition in the brain is far more difficult than any of those achievements. Berger has spent much of the past 35 years trying to understand fundamental questions about the behavior of neurons in the hippocampus, a part of the brain known to be involved in forming memory. “It’s very clear,” he says. “The hippocampus makes short-term memories into long-term memories.”
What has been anything but clear is how the hippocampus accomplishes this complicated feat. Berger has developed mathematical theorems that describe how electrical signals move through the neurons of the hippocampus to form a long-term memory, and he has proved that his equations match reality. “You don’t have to do everything the brain does, but can you mimic at least some of the things the real brain does?” he asks. “Can you model it and put it into a device? Can you get that device to work in any brain? It’s those three things that lead people to think I’m crazy. They just think it’s too hard.”
Cracking the Code
Berger often speaks in sentences that stretch to paragraph length and have many asides, footnotes, and complete diversions from the point. I ask him to define memory.
“It’s a series of electrical pulses over time that are generated by a given number of neurons,” he says. “That’s important because you can reduce it to this and put it back into a framework. Not only can you understand it in terms of the biological events that happened; that means that you can poke it, you can deal with it, you can put an electrode in there, and you can record something that matches your definition of a memory. You can find the 2,147 neurons that are part of this memory. And what do they generate? They generate this series of pulses. It’s not bizarre. It’s something you can handle. It’s useful. It’s what happens.”
This is the conventional view of memory, but it only scratches the surface. And to Berger’s perpetual frustration, many colleagues who probe this mysterious realm of the brain haven’t attempted to go much deeper. Neuroscientists track electrical signals in the brain by monitoring action potentials, microvolt changes on the surfaces of neurons. But all too often, says Berger, their reports oversimplify what’s actually taking place. “They find an important event in the environment and count action potentials,” he says. “They say, ‘It went up from 1 to 200 after I did something. I’m finding something interesting.’ What are you finding? ‘Activity went up.’ But what are you finding? ‘Activity went up.’ So what? Is it coding something? Is it representing something that the next neuron cares about? Does it make the next neuron do something different? That’s what we’re supposed to be doing: explaining things, not just describing things.”
Berger takes a marker and fills a whiteboard from top to bottom with a line of circles that represent neurons. Next to each one, he draws a horizontal line that has a different pattern of blips on it. “This is you in my brain,” he says. “My hippocampus has already formed a long-term memory of you. I’ll remember you into next week. But how can I distinguish you from the next person? Let’s say there are 500,000 cells in the hippocampus that represent you, and there are all sorts of things that each cell is coding—like how your nose is relative to your eyebrow—and they code that with different patterns. So the reality of the nervous system is really complicated, which is why we’re still asking such basic, limited questions about it.”
In graduate school at Harvard, Berger’s mentor was Richard Thompson, who studied localized, learning-induced changes in the brain. Thompson used a tone and a puff of air to condition rabbits to blink their eyes, aiming to determine where the memory he induced was stored. The idea was to find a specific place in the brain where the learning was localized, says Berger: “If the animal did learn and you removed it, the animal couldn’t remember.”
Thompson, with Berger’s help, managed to do just that; they published the results in 1976. To find the site in the rabbits, they equipped the animals’ brains with electrodes that could monitor the activity of a neuron. Neurons have gates on their membranes, which let electrically charged particles like sodium and potassium in and out. Thompson and Berger documented the electrical spikes seen in the hippocampus as rabbits developed a memory. Both the spikes’ amplitude (representing the action potential) and their spacing formed patterns. It can’t be an accident, Berger thought, that cells fire in a way that forms patterns with respect to time.
This led him to a central question that underlies his current work: as cells receive and send electrical signals, what pattern describes the quantitative relationship between the input and the output? That is, if one neuron fires at a specific time and place, what exactly do the neighboring neurons do in response? The answer could reveal the code that neurons use to form a long-term memory.
But it soon became clear that the answer was extremely complex. In the late 1980s, Berger, working at the University of Pittsburgh with Robert Sclabassi, became fascinated by a property of the neuronal network in the hippocampus. When they stimulated the hippocampus of a rabbit with electrical pulses (the input) and charted how signals moved through different populations of neurons (the output), the relationship they observed between the two wasn’t linear. “Let’s say you put in 1 and get 2,” says Berger. “That’s pretty easy. It’s a linear relation.” It turns out, however, that there’s “essentially no condition in the brain where you get linear activity, a linear summation,” he says. “It’s always nonlinear.” Signals overlap, with some suppressing an incoming pulse and some accentuating it.
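The distinction Berger draws can be made concrete with a toy model. This is purely illustrative, not his actual equations: in a linear system, the response to two inputs together equals the sum of the responses to each alone; adding a second-order interaction term, in the spirit of a Volterra-style expansion, breaks that additivity.

```python
# Toy contrast between linear and nonlinear summation; a drastic
# simplification, assumed for illustration, of the nonlinear models
# used in this research.
def linear_response(inputs, gain=2.0):
    """Linear system: output is a weighted sum of inputs."""
    return sum(gain * x for x in inputs)

def nonlinear_response(inputs, gain=2.0, interaction=0.5):
    """Adds a second-order term in which pairs of inputs interact."""
    first_order = sum(gain * x for x in inputs)
    second_order = interaction * sum(
        inputs[i] * inputs[j]
        for i in range(len(inputs))
        for j in range(i + 1, len(inputs)))
    return first_order + second_order

# Linear: response to two pulses equals the sum of responses to each
assert linear_response([1, 1]) == 2 * linear_response([1])
# Nonlinear: the interaction term makes the combined response non-additive
assert nonlinear_response([1, 1]) != 2 * nonlinear_response([1])
```

Depending on the sign of the interaction term, the combined response can be larger or smaller than the sum of the parts, matching Berger's observation that overlapping signals can either accentuate or suppress an incoming pulse.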
By the early 1990s, his understanding—and computing hardware—had advanced to the point that he could work with his colleagues at the University of Southern California’s department of engineering to make computer chips that mimic the signal processing done in parts of the hippocampus. “It became obvious that if I could get this stuff to work in large numbers in hardware, you’ve got part of the brain,” he says. “Why not hook up to what’s existing in the brain? So I started thinking seriously about prosthetics long before anybody even considered it.”
A Brain Implant
Berger teamed up with Vasilis Marmarelis, a biomedical engineer at USC, to begin making a brain prosthesis. They first worked with hippocampal slices from rats. Knowing that neuronal signals move from one end of the hippocampus to the other, the researchers sent random pulses into the hippocampus, recorded the signals at various locales to see how they were transformed, and then derived mathematical equations describing the transformations. They implemented those equations in computer chips.
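The procedure just described — drive an unknown system with random pulses, record the output, fit equations to the input/output relationship — can be sketched in a few lines. The "system" below is synthetic and stands in for a hippocampal slice; the coefficient values and model form are hypothetical, chosen only to show the fitting step:

```python
import numpy as np

# Hypothetical sketch of the system-identification step: stimulate an
# unknown system with random pulse trains, record its response, and fit
# a second-order polynomial (Volterra-style) model by least squares.

rng = np.random.default_rng(0)

def true_system(x):
    # Stand-in for the slice's unknown transformation: a linear part
    # plus a nonlinear cross-term between current and previous inputs.
    return 0.8 * x + 0.4 * np.roll(x, 1) - 0.3 * x * np.roll(x, 1)

x = rng.integers(0, 2, size=500).astype(float)  # random pulse train (input)
y = true_system(x)                              # recorded response (output)

# Regressors: current input, previous input, and their product.
X = np.column_stack([x, np.roll(x, 1), x * np.roll(x, 1)])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
# The fitted coefficients are the derived "equations" -- in Berger's
# pipeline, these are what get implemented in hardware on a chip.
```

Because the synthetic system lies exactly in the model class, the fit recovers its coefficients; a real slice would demand far richer models, which is what made the work hard.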
Next, to assess whether such a chip could serve as a prosthesis for a damaged hippocampal region, the researchers investigated whether they could bypass a central component of the pathway in the brain slices. Electrodes placed in the region carried electrical pulses to an external chip, which performed the transformations normally done in the hippocampus. Other electrodes delivered the signals back to the slice of brain.
Then the researchers took a leap forward by trying this in live rats, showing that a computer could in fact serve as an artificial component of the hippocampus. They began by training the animals to push one of two levers to receive a treat, recording the series of pulses in the hippocampus as they chose the correct one. Using those data, Berger and his team modeled the way the signals were transformed as the lesson was converted into a long-term memory, and they captured the code believed to represent the memory itself. They proved that their device could generate this long-term memory code from input signals recorded in rats’ brains while they learned the task. Then they gave the rats a drug that interfered with their ability to form long-term memories, causing them to forget which lever produced the treat. When the researchers pulsed the drugged rats’ brains with the code, the animals were again able to choose the right lever.
Last year, the scientists published primate experiments involving the prefrontal cortex, a part of the brain that retrieves the long-term memories created by the hippocampus. They placed electrodes in the monkey brains to capture the code formed in the prefrontal cortex that they believed allowed the animals to remember an image they had been shown earlier. Then they drugged the monkeys with cocaine, which impairs that part of the brain. Using the implanted electrodes to send the correct code to the monkeys’ prefrontal cortex, the researchers significantly improved the animals’ performance on the image-identification task.
Within the next two years, Berger and his colleagues hope to implant an actual memory prosthesis in animals. They also want to show that their hippocampal chips can form long-term memories in many different behavioral situations. These chips, after all, rely on mathematical equations derived from the researchers’ own experiments. It could be that the researchers were simply figuring out the codes associated with those specific tasks. What if these codes are not generalizable, and different inputs are processed in various ways? In other words, it is possible that they haven’t cracked the code but have merely deciphered a few simple messages.
Berger allows that this may well be the case, and his chips may form long-term memories in only a limited number of situations. But he notes that the morphology and biophysics of the brain constrain what it can do: in practice, there are only so many ways that electrical signals in the hippocampus can be transformed. “I do think we’re going to find a model that’s pretty good for a lot of conditions and maybe most conditions,” he says. “The goal is to improve the quality of life for somebody who has a severe memory deficit. If I can give them the ability to form new long-term memories for half the conditions that most people live in, I’ll be happy as hell, and so will be most patients.”
Despite the uncertainties, Berger and his colleagues are planning human studies. He is collaborating with clinicians at his university who are testing the use of electrodes implanted on each side of the hippocampus to detect and prevent seizures in patients with severe epilepsy. If the project moves forward as envisioned, Berger’s group will piggyback on the trial to look for memory codes in those patients’ brains.
“I never thought I’d see this go into humans, and now our discussions are about when and how,” he says. “I never thought I’d live to see the day, but now I think I will.”