Mental Health News

Following are the latest news and information resources for the various mental health topics that we cover. We hope you will find the news educational and the links in the resources section useful for exploring each topic in more depth.

OCD Parenting Good for Pets

Helicopter parenting may not be the best strategy for raising independent kids. But a healthy measure of clinginess and overprotectiveness could actually be advantageous when rearing dogs and cats, according to new research from UC Berkeley and California State University, East Bay.

A Web-based survey of more than 1,000 pet owners nationwide analyzed the key personality traits and nurturing styles of people who identified as a “cat person,” a “dog person,” “both” or “neither.”

Surprisingly perhaps, those who expressed the greatest affection for their pets also rated among the most conscientious and neurotic, suggesting that the qualities that make for overbearing parents might work better for our domesticated canine and feline companions, who tend to require lifelong parenting.

“The fact that higher levels of neuroticism are associated with affection and anxious attachment suggests that people who score higher on that dimension may have high levels of affection and dependence on their pets, which may be a good thing for pets,” said Mikel Delgado, a doctoral student in psychology at UC Berkeley and co-author of the study, recently published in the Journal of Applied Animal Welfare Science.

The results echo those of a 2010 study by University of Texas psychologist Sam Gosling, a UC Berkeley graduate, which showed dog owners to be more extroverted, but less open to new experiences, and cat owners to be more neurotic, but also more creative and adventurous.

While previous studies have focused on people’s attachment to their pets, this is the first U.S. study to incorporate the principles of human attachment theory – which assesses the bond between parents and children or between romantic partners – with pet owners’ personality types, including whether they identify as a “dog person” or “cat person.”

It is also the first to find a positive correlation between neuroticism, anxious attachment and the care of and affection for pets, said CSU-East Bay psychologist Gretchen Reevy, co-author of the paper and a graduate of UC Berkeley.

Delgado and Reevy recruited male and female pet owners of all ages through the Craigslist classified advertising website, their personal Facebook pages and pet-related pages on the Reddit news and social networking site. Nearly 40 percent of those surveyed said they liked dogs and cats equally, while 38 percent identified as dog people and 19 percent as cat people. A mere 3 percent favored neither.

The online questionnaire was based on both human and animal attachment assessments, including one that measures the “Big Five” overarching human characteristics (openness, conscientiousness, extraversion, agreeableness and neuroticism). Pet owners were also rated according to the Lexington Attachment to Pets Scale, which measures affection for pets, and the Pet Attachment Questionnaire, which gauges “anxious attachment” and “avoidant attachment.”
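Instruments like these are typically scored by averaging a respondent’s Likert-style item ratings, with reverse-worded items flipped first. The sketch below illustrates only that general recipe; the items, values and reverse-keyed positions are hypothetical and are not taken from the study’s materials.

```python
# Illustrative sketch of typical Likert-scale scoring; the responses and
# reverse-keyed positions below are hypothetical, not from the study.

def score_scale(responses, reverse_keyed=(), scale_max=5):
    """Average 1..scale_max Likert responses, flipping reverse-keyed items."""
    adjusted = [
        (scale_max + 1 - r) if i in reverse_keyed else r
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) / len(adjusted)

# One respondent's answers to a hypothetical 5-item neuroticism subscale,
# where item 2 is worded in the opposite direction.
print(score_scale([4, 5, 2, 4, 3], reverse_keyed={2}))  # -> 4.0
```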

People who score high on anxious attachment tend to need more reassurance from the objects of their affection, and in the survey those tended to be younger people who chose a cat as a favorite pet.

Conversely, people who rate highly on avoidant attachment, which refers to a less affectionate and more withdrawn temperament – and can inspire such rejoinders as “commitment-phobe” in romantic relationships – are much less needy. Both dog and cat lovers scored low on avoidant attachment, suggesting both personality types enjoy close relationships with their pets.

“We hypothesized that more attentive and affectionate pet owners would receive higher affection scores and lower avoidant attachment scores, as higher levels of avoidant attachment would suggest distancing behaviors between the individual and their pet,” Delgado said.

Delgado and Reevy plan to dig more deeply into the link between neuroticism and affection for and dependence on one’s pet.

“We will investigate further whether greater affection for and greater anxious attachment to one’s pet, and neuroticism, are associated with better care and understanding of the pet’s needs,” Reevy said.

Read article >>

Lack Of Exercise Responsible For Twice As Many Deaths As Obesity

A brisk 20-minute walk each day could be enough to reduce an individual’s risk of early death, according to new research published today. The study of over 334,000 European men and women found that twice as many deaths may be attributable to lack of physical activity as to obesity, but that just a modest increase in physical activity could have significant health benefits.

Physical inactivity has been consistently associated with an increased risk of early death, as well as with a greater risk of diseases such as heart disease and cancer. Although it may also contribute to an increased body mass index (BMI) and obesity, the association with early death holds independently of an individual’s BMI.

To measure the link between physical inactivity and premature death, and its interaction with obesity, researchers analysed data from 334,161 men and women across Europe participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study. Between 1992 and 2000, the researchers measured height, weight and waist circumference, and used self-assessment to measure levels of physical activity. The participants were then followed up over 12 years, during which 21,438 participants died. The results are published today in the American Journal of Clinical Nutrition.

The researchers found that the greatest reduction in risk of premature death occurred in the comparison between inactive and moderately inactive groups, judged by combining activity at work with recreational activity; just under a quarter (22.7%) of participants were categorised as inactive, reporting no recreational activity in combination with a sedentary occupation. The authors estimate that doing exercise equivalent to just a brisk 20-minute walk each day – burning between 90 and 110 kcal (‘calories’) – would take an individual from the inactive to the moderately inactive group and reduce their risk of premature death by between 16% and 30%. The impact was greatest amongst normal-weight individuals, but even those with a higher BMI saw a benefit.

Using the most recent available data on deaths in Europe, the researchers estimate that 337,000 of the 9.2 million deaths amongst European men and women were attributable to obesity (classed as a BMI greater than 30). However, double this number of deaths (676,000) could be attributed to physical inactivity.
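The headline comparison is straightforward arithmetic on the estimates quoted above:

```python
# Back-of-envelope check using only the figures quoted in the article.
total_deaths = 9_200_000      # deaths among European men and women
obesity_deaths = 337_000      # estimated attributable to obesity (BMI > 30)
inactivity_deaths = 676_000   # estimated attributable to physical inactivity

print(f"obesity:    {obesity_deaths / total_deaths:.1%} of all deaths")     # ~3.7%
print(f"inactivity: {inactivity_deaths / total_deaths:.1%} of all deaths")  # ~7.3%
print(f"ratio:      {inactivity_deaths / obesity_deaths:.2f}x")             # ~2.01x
```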

Professor Ulf Ekelund from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge, who led the study, says: “This is a simple message: just a small amount of physical activity each day could have substantial health benefits for people who are physically inactive. Although we found that just 20 minutes would make a difference, we should really be looking to do more than this – physical activity has many proven health benefits and should be an important part of our daily life.”

Professor Nick Wareham, Director of the MRC Unit, adds: “Helping people to lose weight can be a real challenge, and whilst we should continue to aim at reducing population levels of obesity, public health interventions that encourage people to make small but achievable changes in physical activity can have significant health benefits and may be easier to achieve and maintain.”

Read article >>

How A Habit Becomes An Addiction

Research suggests that only 20–30% of drug users actually descend into addiction — defined as the persistent seeking and taking of drugs even in the face of dire personal consequences. Why are some people who use drugs able to do so without turning into addicts, while others continue to abuse, even when the repercussions range from jail time to serious health problems?

In a comprehensive review in the European Journal of Neuroscience, Barry Everitt outlines the neural correlates and learning-based processes associated with the transition from drug use, to abuse, to addiction.

Drug seeking begins as a goal-directed behavior, with an action (finding and taking drugs) leading to a particular outcome (the drug high). This type of associative learning is mediated by the dorsomedial region of the striatum, the area of the brain that is associated with reward processing, which functions primarily through the neurotransmitter dopamine.

In this kind of learning, devaluing the outcome (by decreasing the potency of the drug, for example) tends to decrease the pursuit of the action. When the high is not what it used to be, the motivation to continue seeking it out decreases.

However, in long-term abusers, this devalued outcome does not reduce the action — indeed, researchers have found that in cases of chronic drug use, a parallel associative learning process eventually comes to the fore. This process is one of stimulus–response; the conditioned stimuli in this case are the various environmental cues — the sight of the powdery white stuff, the smell of burning aluminum foil — that users associate with getting high and that compel them to seek out drugs.

As Everitt puts it, the “must have” of the goal-directed behavior eventually develops into a habitual “must do” response. This second kind of learning is mediated in the brain by a separate section of the striatum — the dorsolateral region, which is connected to areas of the cortex that control sensorimotor function. Everitt outlines the neural mechanisms underlying this shift in learning behaviors, which seems to occur due to changes in two different dopamine signaling pathways that involve the striatum.

Although impulsivity is often seen as an effect of stimulant drug use, it may also be a causal factor in the loss of control that occurs when a drug abuser descends into addiction. The author presents evidence suggesting there is an intrinsic element that can make certain people more vulnerable to impulsivity and, consequently, drug addiction — potentially explaining why not all habitual users go on to become addicted.

With these discoveries about the neural and psychological activity of addiction in mind, Everitt also reviews some potential treatment options. These include drugs that block specific dopamine receptors to disrupt the reward-processing circuits in the brain and those that induce plasticity in the brain regions associated with habitual drug-seeking behavior.

Everitt also discusses existing drugs that could be repurposed to treat drug addiction — such as SSRIs, common antidepressant medications that raise serotonin levels in the brain, or atomoxetine, a pharmaceutical treatment for ADHD that tends to reduce impulsivity.

One of the more intriguing prospects is a class of drugs that targets memory reconsolidation, designed specifically for preventing relapse. If memories of previous drug experiences can be disconnected from the environmental cues that normally induce craving, the theory goes, those stimuli will cease to bring about drug-seeking behavior.

As this review shows, a more complete understanding of the neural and learning processes associated with the transition from drug use to addiction can help researchers better identify and treat those most at risk for descending into compulsive drug abuse.

Read article >>

The Curious Science Of When Multitasking Works

Trying to do two things at once is usually a recipe for doing both badly, according to a long line of research. We’re slower and less accurate when we try to juggle two things. Experts came to believe that there wasn’t much that could be done about this, so most of the advice in HBR has been to avoid multitasking as much as possible.

But if giving up multitasking isn’t an option, a new study published in Psychological Science offers some hope: your ability to multitask may depend on whether you were trained to do the two tasks separately or simultaneously.

The first thing to know about multitasking is that the word is a misnomer. You’re not really doing two things at once so much as rapidly switching back and forth between them. That switching process is mentally taxing — your brain has to recall the instructions for how to do one task, then put them aside and recall the instructions for how to do the other, then repeat the whole thing again — and so the result is poor performance on both.
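A toy model makes that switching cost concrete. In the sketch below, every trial takes a fixed base time, and an extra penalty is paid whenever the task changes from the previous trial; all of the timings are invented for illustration and are not drawn from the research.

```python
# Toy model of task-switching cost; all timings are invented for illustration.

def total_time(trials, base_ms=500, switch_cost_ms=200):
    """Sum per-trial times, adding a penalty whenever the task changes."""
    time, previous = 0, None
    for task in trials:
        time += base_ms + (switch_cost_ms if previous not in (None, task) else 0)
        previous = task
    return time

blocked     = ["A"] * 10 + ["B"] * 10   # finish one task, then the other
interleaved = ["A", "B"] * 10           # rapid back-and-forth "multitasking"

print(total_time(blocked))      # 10200 ms: one switch
print(total_time(interleaved))  # 13800 ms: nineteen switches
```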

Cognitive scientists at Brown have now added an interesting wrinkle by drawing a connection between multitasking and the research on learning and memory. Previous studies have demonstrated that context affects our ability to remember information or perform a task. A famous 1975 study taught divers a list of words while they were underwater. Later, the divers recalled more of those words when tested underwater than on land. What if something similar were true for multitasking?

The Brown researchers, Joo-Hyun Song and Patrick Bédard, performed an experiment in which participants completed “visuomotor” exercises on a computer, moving a stylus to guide a cursor on a screen in response to visual prompts. Some of the participants simply moved the cursor in response to a series of dots on the screen. Others were asked to do that and, at the same time, to follow a series of letters that appeared intermittently. In other words, the second group was asked to multitask.

Later, the participants were asked to do the exercises again, except some of the single-taskers were asked to multitask and some of the multitaskers were asked to just do the single task. Surprisingly, the multitaskers didn’t do any worse this time around, on average, than those performing the single task.

Instead, just like the divers recalling words underwater, what mattered was consistent context. Those who performed under the same conditions both times did better than those whose conditions changed. So the multitaskers who started out doing two things at once were able to recall how to complete the task better than the multitaskers who were later asked to just do one thing, or the single-taskers who were later asked to do two.

In a second experiment, the researchers found that it didn’t necessarily matter what the second task even was. This time multitaskers were asked, the second time around, to try a totally new task, along with a practiced one, and performed just as well.

While there’s no guarantee that what works in the lab will hold true in the office, these results suggest the possibility that our ability to juggle tasks and recall information depends on the context in which we learned those things in the first place. If the research does apply to office work, it’s most likely to tasks that require motor skills, like typing, since that’s what the experiment measured. If you’re typing while listening to a conference call, maybe you’re less likely to make mistakes if you were equally distracted when you originally learned to type.

The best advice is still to avoid multitasking whenever possible. But for those who have to do it, consistent context matters. If you’re going to be multitasking when forced to recall information or perform a task, it may be better to practice multitasking when you learn it in the first place.

Read article >>

Human Brain Keeps Memories Tidy By Pruning Inaccurate Ones

Your brain is a memory powerhouse, constantly recording experiences in long-term memory. Those memories help you find your way through the world: Who works the counter each morning at your favorite coffee shop? How do you turn on the headlights of your car? What color is your best friend's house?

But then your barista leaves for law school, you finally buy a new car and your buddy spends the summer with a paint brush in hand. Suddenly, your memories are out of date.

What happens next?

An experiment conducted by researchers from Princeton University and the University of Texas-Austin shows that the human brain uses memories to make predictions about what it expects to find in familiar contexts. When those subconscious predictions are shown to be wrong, the related memories are weakened and are more likely to be forgotten. And the greater the error, the more likely you are to forget the memory.

"This has the benefit ultimately of reducing or eliminating noisy or inaccurate memories and prioritizing those things that are more reliable and that are more accurate in terms of the current state of the world," said Nicholas Turk-Browne, an associate professor of psychology at Princeton and one of the researchers.

The research was featured in an article, "Pruning of memories by context-based prediction error," that appeared in 2014 in the Proceedings of the National Academy of Sciences. The other co-authors are Ghootae Kim, a Princeton graduate student; Jarrod Lewis-Peacock, an assistant professor of psychology at the University of Texas-Austin; and Kenneth Norman, a Princeton professor of psychology and the Princeton Neuroscience Institute.

The researchers' experiment involved 24 adults, who were shown a series of photos one at a time while their brain activity was monitored by a functional magnetic resonance imaging (fMRI) machine. The participants were asked a question about each photo, but the real purpose of the exercise was to monitor their brain activity as the photos were shown.

The photos included three-photo sequences, such as two photos of faces followed by a photo of a scene. In this example, the first two photos would appear again later in the series, but this time would be followed by a new face rather than the scene. The researchers measured how strongly participants were expecting to see a photo of the scene the second time by looking for the pattern of brain activity associated with that scene, around the time when it should have appeared in the sequence.
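The article does not spell out the analysis pipeline, but the general pattern-analysis idea can be sketched as follows: train a classifier to tell scene-related from face-related activity patterns, then read out its “scene” evidence at the moment the scene was expected. Everything below, including the random stand-in data, is a hypothetical illustration rather than the authors’ method.

```python
# Hypothetical sketch of a pattern-based "prediction strength" readout.
# The data are random stand-ins, not real fMRI patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_voxels = 200

# Training data: activity patterns labelled by the photo category shown.
X_train = rng.normal(size=(100, n_voxels))
y_train = rng.choice(["face", "scene"], size=100)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Pattern recorded at the moment the (absent) scene should have appeared:
pattern = rng.normal(size=(1, n_voxels))
scene_idx = list(clf.classes_).index("scene")
strength = clf.predict_proba(pattern)[0, scene_idx]
print(f"scene evidence (proxy for prediction strength): {strength:.2f}")
```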

"We wanted to get direct access to what things people are predicting without asking them," Turk-Browne said. "These kinds of predictions are not necessarily consciously accessible, and if we ask about them, it will change the behavior."

Later, participants were shown photos and asked whether they recognized them from the fMRI portion.

By analyzing the fMRI data and the memory test results, the researchers found that the more strongly participants' brains predicted — incorrectly — the next photo in the sequence, the more likely they were to forget the predicted photo.

"Our specific hypothesis in the context of the experiment is that the bigger the prediction, the more the error and the more likely you are to forget the thing you were predicting," Turk-Browne said. "We think it's the error causing the forgetting. How can we measure the error? That's difficult. But we know the error is proportional to the strength of the prediction. So we use the strength of the prediction as a measure of the prediction error."
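That logic can be written out in one line (our notation, not the paper’s):

```latex
% p is the measured prediction strength, 0 <= p <= 1; the predicted photo
% never appears, so the actual outcome is 0 and the prediction error is
e = \lvert p - 0 \rvert = p ,
% i.e. the error is proportional to the strength of the prediction, so the
% measurable p can stand in for the unmeasurable e.
```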

The researchers say the findings fit a model for how the brain handles memories called the nonmonotonic plasticity hypothesis. The model claims that strong activation of a memory — such as when a remembered object or event is encountered again — will strengthen the memory.

But moderate activation of a memory — such as the activation that occurs when your brain makes an unconfirmed prediction using the memory — can degrade the memory.
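As a rough sketch of the hypothesis’s shape (a toy function of our own, not a model from the paper), the change in a memory’s strength can be drawn as a U-shaped function of how strongly the memory is activated:

```python
# Toy illustration of the nonmonotonic plasticity hypothesis; the function
# and its thresholds are invented for illustration only.

def strength_change(activation):
    """Return a toy memory-strength update for activation in [0, 1]."""
    if activation < 0.2:    # little or no activation: memory unchanged
        return 0.0
    elif activation < 0.7:  # moderate activation, e.g. an unconfirmed
        return -0.5         # prediction: memory is weakened
    else:                   # strong activation, e.g. re-experiencing the
        return +1.0         # remembered event: memory is strengthened

for a in (0.1, 0.5, 0.9):
    print(f"activation {a:.1f} -> change {strength_change(a):+.1f}")
```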

So, Turk-Browne said, if your brain activates a memory in forming a prediction and then doesn't re-experience the remembered object or event, the memory can begin to fade.

"This is a very general mechanism for influencing what people remember and forget on the basis of whether it can be reliably predicted given the situation in which it occurs or not," Turk-Browne said. "People remember and forget a lot of things. This isn't going to explain all of remembering or all of forgetting, but this is an automatic, unconscious way the brain has to figure out when things turn out not to be really reliable in terms of how the world is structured."

Morgan Barense, an assistant professor of psychology at the University of Toronto, said the researchers have found a "powerful learning mechanism" in the brain.

"I think it's an incredibly compelling story that this automatic prediction mechanism is constantly operating under the radar of our conscious experience, optimizing what elements of a memory we should remember based on how likely they are to occur in our environment," said Barense, who studies how memory functions are organized in the brain. "It makes sense that the brain would operate in this way, yet it was only with the development of modern neuroimaging analyses that we were able to observe this mechanism in action."

Turk-Browne said the research opens the door to several avenues of additional research, including working to better understand the brain's predictions.

"What are the physiological mechanisms of generating predictions in the first place? We're investigating that," he said.

The research was supported by funding from the National Institutes of Health.

Read article >>

Do Viruses Make Us Smarter?

A new study from Lund University in Sweden indicates that inherited viruses that are millions of years old play an important role in building up the complex networks that characterize the human brain.

Researchers have long been aware that endogenous retroviruses constitute around five per cent of our DNA. For many years, they were considered junk DNA of no real use, a side-effect of our evolutionary journey.

In the current study, Johan Jakobsson and his colleagues show that retroviruses seem to play a central role in the basic functions of the brain, more specifically in the regulation of which genes are to be expressed, and when. The findings indicate that, over the course of evolution, the viruses took an increasingly firm hold on the steering wheel of our cellular machinery. The reason the viruses are activated specifically in the brain is probably that tumors cannot form in nerve cells, unlike in other tissues.

“We have been able to observe that these viruses are activated specifically in the brain cells and have an important regulatory role. We believe that the role of retroviruses can contribute to explaining why brain cells in particular are so dynamic and multifaceted in their function. It may also be the case that the viruses’ more or less complex functions in various species can help us to understand why we are so different”, says Johan Jakobsson, head of the research team for molecular neurogenetics at Lund University.

The article, based on studies of neural stem cells, shows that these cells use a particular molecular mechanism to control the activation of the retroviruses. The findings provide a detailed insight into the innermost workings of the most basic functions of nerve cells. At the same time, the results open up potential new research paths concerning brain diseases linked to genetic factors.

“I believe that this can lead to new, exciting studies on the diseases of the brain. Currently, when we look for genetic factors linked to various diseases, we usually look for the genes we are familiar with, which make up a mere two per cent of the genome. Now we are opening up the possibility of looking at a much larger part of the genetic material which was previously considered unimportant. The image of the brain becomes more complex, but the area in which to search for errors linked to diseases with a genetic component, such as neurodegenerative diseases, psychiatric illness and brain tumors, also increases”.

Read article >>

Study Finds Experience Of Pain Relies On Multiple Brain Pathways, Not Just One

People’s mindsets can affect their experience of pain. For example, a soldier in battle or an athlete in competition may report that an injury did not feel especially painful in the heat of the moment. But until now it has been unclear how this phenomenon works in the brain.

A new study led by the University of Colorado Boulder finds that when we use our thoughts to dull or enhance our experience of pain, the physical pain signal in the brain—sent by nerves in the area of a wound, for example, and encoded in multiple regions in the cerebrum—does not actually change. Instead the act of using thoughts to modulate pain, a technique called “cognitive self-regulation” that is commonly used to manage chronic pain, works via a separate pathway in the brain.

The findings, published in the journal PLOS Biology this month, show that the processing of pain in our brains goes beyond the mere physical pain signal and underscore a growing understanding among neuroscientists that there is not a single pain system in the brain, as was once believed.

“We found that there are two different pathways in our brains that contribute to the pain experience,” said Choong-Wan Woo, lead author of the study and a doctoral student in CU-Boulder’s Department of Psychology and Neuroscience. 

The first pathway mediates the effects of turning up the intensity of painful stimulation and includes a number of “classic” regions in the brain, such as the anterior cingulate cortex. The second pathway, discovered in the new study, mediates the effects of cognitive regulation, and involves increasing activity in the medial prefrontal cortex and nucleus accumbens—brain regions that are involved in emotion and motivation but do not typically respond to painful events in the body. 

This latter pathway may hold some of the keys to understanding the “emotional” aspects of pain, which can contribute substantially to long-term pain and disability.

Other CU-Boulder co-authors of the study are psychology and neuroscience Associate Professor Tor Wager and postdoctoral researcher Mathieu Roy. Jason Buhle, an adjunct assistant professor at Columbia University, is also a study co-author. 

For the study, participants were given painful heat stimuli on their arms while their brains were scanned using functional magnetic resonance imaging, or fMRI.

For the first scan, the participants were asked to experience the painful heat without thinking of anything in particular. In subsequent scans, the participants were asked first to imagine that the sizzling heat was damaging their skin, a thought that increased their experience of the pain, and then to imagine that the heat was actually a welcome sensation on an extremely cold day, a thought that decreased their experience of the pain.

The scans were then compared. The signal for physical pain remained the same across all three scenarios, regardless of how the participants rated their pain experience. But the signal in the second pathway changed in intensity depending on the type of thoughts, or “cognitive self-regulation,” used.

The researchers were able to disentangle the two pathways based on recent work done in Wager’s Cognitive and Affective Neuroscience Lab. In 2013, Wager and his colleagues published a study in the New England Journal of Medicine that identified for the first time a distinct brain signature for physical pain.

“Previously people did not have this specific brain marker for pain,” Woo said. “Incorporating that measure, and identifying a separate pathway that makes an independent contribution to pain, is a major innovation of this paper.”

Read article >>

MRI Images Detect Brain Abnormalities In Patients With Bipolar Disorder

Sometimes, a new way of looking at something can bring to light an entirely new perspective.

Using a different type of MRI imaging, researchers at the University of Iowa have discovered previously unrecognized differences in the brains of patients with bipolar disorder. In particular, the study, published Jan. 6 in the journal Molecular Psychiatry, revealed differences in the white matter of patients' brains and in the cerebellum, an area of the brain not previously linked with the disorder. Interestingly, the cerebellar differences were not present in patients taking lithium, the most commonly used treatment for bipolar disorder.

"This imaging technique appears to be sensitive to things that just have not been imaged effectively before. So it's really providing a new picture and new insight into the composition and function of the brain [in bipolar disease]," says John Wemmie, UI professor of psychiatry in the UI Carver College of Medicine, and senior study author.

Bipolar disorder affects about 1 percent of the population. Despite being relatively common, scientists do not have a good understanding of what causes this psychiatric condition, which is characterized by sudden mood shifts from normal to depressed or to an abnormally elevated or "manic" mood state.

The study examined 15 patients with bipolar disorder and 25 control subjects matched for age and gender. The bipolar patients were all in normal (euthymic) mood state during the study.

The UI team imaged the participants' brains using an MRI approach known as quantitative high-resolution T1 rho mapping, which is sensitive to certain byproducts of cell metabolism, including levels of glucose and acidity in the brain. Compared to the brains of people without bipolar disorder, the researchers found that the MRI signal was elevated in the cerebral white matter and the cerebellar region of patients affected by bipolar disorder. The elevated signal may be due to either a reduction in pH or a reduction in glucose concentration - both factors influenced by cell metabolism.
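The article does not detail the acquisition, but quantitative T1 rho mapping is generally performed by collecting images at several spin-lock times and fitting an exponential decay in each voxel. Below is a minimal sketch of that generic voxelwise fit; the signal values are made up.

```python
# Generic voxelwise T1-rho fit: S(TSL) = S0 * exp(-TSL / T1rho).
# Spin-lock times and signal intensities below are made-up examples.
import numpy as np

tsl_ms = np.array([0.0, 20.0, 40.0, 80.0])        # spin-lock times (ms)
signal = np.array([1000.0, 780.0, 610.0, 370.0])  # one voxel's intensities

# Linearise: ln(S) = ln(S0) - TSL / T1rho, then fit a straight line.
slope, intercept = np.polyfit(tsl_ms, np.log(signal), 1)
t1rho_ms = -1.0 / slope
print(f"estimated T1rho: {t1rho_ms:.1f} ms")  # -> roughly 80 ms
```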

Previous research has suggested that abnormal cell metabolism may play a role in bipolar disorder. However, investigating metabolic abnormalities in the brain has been hindered by a lack of good imaging tools. Available methods are slow, low-resolution, and require researchers to identify the region of interest at the beginning of the study. In contrast, the new imaging approach can rapidly acquire a high-resolution image of the whole brain. The study is the first time this MRI technique has been used to investigate a psychiatric disease.

One reason researchers didn't know the cerebellum might be important in bipolar disorder is that no one chose to look there, says Casey Johnson, UI postdoctoral researcher and first author on the study.

"Our study was essentially exploratory. We didn't know what we would find," he adds. "The majority of bipolar disorder research has found differences in the frontal region of the brain. We found focal differences in the cerebellum, which is a region that hasn't really been highlighted in the bipolar literature before."

Spurred on by the finding, Johnson and Wemmie conducted an extensive search of the scientific literature on bipolar disorder and began to find pieces of evidence that suggested that the cerebellum may function abnormally in bipolar disorder and that lithium might potentially target the cerebellum and alter glucose levels in this brain region.

"Our paper, with this new technique, starts to bring all these pieces of evidence together for the first time," Johnson says.

Wemmie hopes that the new insights provided by the T1 rho imaging might help refine understanding of the abnormalities that underlie bipolar disease and lead to better ways to diagnose and treat this problem.

While lithium can be an effective mood stabilizer for people with bipolar disorder, it causes numerous unpleasant side effects for patients.

"If lithium's effect on the cerebellum is the key to its effectiveness as a mood stabilizer, then a more targeted treatment that causes the same change in the cerebellum without affecting other systems might be a better treatment for patients with bipolar disorder," Wemmie says.

The study was supported by grants from the National Institutes of Health, the Department of Veterans Affairs, and the National Alliance for Research on Schizophrenia and Depression (NARSAD), the former name of the Brain and Behavior Research Foundation.

Read article >>

Terminating Our Bad Speech Habits

Australians aren’t well known for their articulation. From Kath and Kim to Kylie Mole, we’re the first to poke fun at our poor speech habits. But are our word choices reflecting badly on our common-or-garden intelligence? Should we worry that the degradation of our language will lead to the degradation of our reputation?

Like, you know?

American comedian Taylor Mali, in his YouTube sketch “Totally like whatever, you know?”, attacks inarticulateness in the US, describing new language trends as making the current generation of Americans the most “aggressively inarticulate generation”. He despises the rise of discourse particles or “fillers” such as “like” and “you know”, and of the tools of vagueness, “approximators” such as “sort of” and “and that”.

Take the transmutation of the verb “go”. Time was, this simply meant “to move”, but it has evolved into a synonym for “say”. For many, the best-known example of this usage emerged in Australia in the 1970s and 1980s with The Comedy Company’s character Kylie Mole.

Yet, strangely enough, the first recorded instance of “go” in this new sense is not new at all – it’s from The Pickwick Papers by Charles Dickens in 1836, according to the Oxford English Dictionary.

We can track the “new” meaning of “like” back even further, to the novel Evelina by Fanny Burney, published in 1778, where the word was used to mean “as it were” or “so to speak”. This use may derive from the Old English gelīc, from which we also get adverbs (quickly = quick-like) and adjectives (friendly = friend-like).

Yeah, no

Another major language trend to have emerged is that of “yeah, no”. That too appears to have Australian origins. It was first analysed in a 2002 issue of the Australian Journal of Linguistics by linguists Kate Burridge and Margaret Florey, in a paper called “Yeah-no He’s a Good Kid: A Discourse Analysis of Yeah-No in Australian English”.

“Yeah-no” can be a politeness strategy, especially where conflict might occur - as, for example, if a shop assistant recommends a cheese/coat/lipstick that the customer really doesn’t want, but rather than potentially offend with a straight-out “no”, the customer might say “yeah-no, I was looking for something a bit more…”

It can also be a self-effacing downtoner, used when a person is embarrassed by a compliment. As Burridge and Florey point out, it is often heard in sporting contexts. They give the following example from the 1999 Coolangatta Iron Man contest:

Reporter:

And with me is one champion, a phenomenal effort, Ky Hurst. You said you felt buoyant today, you proved that. Some of the best bodysurfing we’ve ever seen.

Ky Hurst:

Yeah-no, that was pretty incredible I think. It was, you know, in one of the swims, I think it was the first swim leg and also the second swim leg, I picked up some really nice waves coming through.

“Yeah-no” is certainly spreading, and not just within the world of sport: even Bill Clinton seems to have succumbed to this bad habit.

The character Vicky Pollard, from the TV show Little Britain, won a British award in 2010 for her “yeah but no but yeah” catchphrase. Even though many people love the Pollard character, her main characteristic, according to her creators David Walliams and Matt Lucas, is her inarticulateness. Walliams remarks that:

people didn’t talk like that ten years ago […] people constructed sentences, and now it’s getting rarer and rarer.

So what does it all matter?

Is inarticulateness a hanging offence? What’s wrong with these apparently minor weaknesses in expression? Well, articulateness will get you a job, or at least be the first thing an employer will consider. Graduate Careers Australia, in its reports for the past five years, has listed the top ten selection criteria for recruiting graduates. Work experience usually comes sixth; calibre of academic results, in spite of the propaganda of the education industry, comes fourth; while interpersonal and communication skills (written and oral) always come first.

Languages are studied by linguists, who tend to be either descriptivists who prefer to scientifically observe and record language without making any value judgements, or prescriptivists who try to prescribe or lay down rules of usage.

Most linguists are descriptivists, but, to return to Vicky Pollard et al, it’s obvious that David Walliams, Matt Lucas and Taylor Mali are prescriptivists – they argue strongly for articulateness, and recommend changes in the way we speak and write. Burridge and Florey’s interesting analysis notwithstanding, as a closet prescriptivist I think we have a problem with our articulateness.

If Taylor Mali refers to this generation of Americans as the most aggressively inarticulate one in yonks, then we might just be the most passively inarticulate generation of Australians in yonks. If so, what do we do?

It seems fairly simple, according to Walliams and Mali - think before you speak, and then speak in complete, declarative sentences, and say “yes” or “no”, but not both. If Australians started this stuff, let’s finish it.

Baden Eunson is an Adjunct Lecturer from the School of Languages, Literatures, Cultures and Linguistics in the Faculty of Arts at Monash University.

Read article >>

Study Finds Cognitive Impairment, Depression Among Retired NFL Players

A study led by the UT Dallas Center for BrainHealth examining the neuropsychological status of former National Football League players has found heightened incidence of cognitive deficits and depression among retired players.

But researchers from the center and from UT Southwestern Medical Center say their study, published online Monday in JAMA Neurology, also is significant for what it did not find: evidence of cognitive impairment in the majority of ex-players.

“Many former NFL players who took part in our study, even those with extensive concussion histories, are healthy and cognitively normal,” said Dr. John Hart Jr., medical science director at the Center for BrainHealth and director of the BrainHealth Institute for Athletes, which was created to address the long-term effects of sports-related traumatic brain injuries. “In 60 percent of our participants – most of whom had sustained prior concussions – we found no cognitive problems, no mood problems and no structural brain abnormalities. Many former NFL players think that because they played football or had concussions, they are certain to face severe neurological consequences, but that is not always the case.”

Dr. Hart, who is the study’s lead author and holds a joint appointment at UT Southwestern as a professor of neurology and psychiatry, said the investigation is the largest comprehensive study of former NFL players using neuropsychological testing, neurological assessments and neuroimaging.

Former Dallas Cowboy fullback Daryl Johnston, who participated in the study and helped recruit other players to take part, said: “Having played 11 years in the NFL and taken countless hits, I’ve heard about the struggles of the players who came before me and the challenges regarding their quality of life. Through the Center for BrainHealth, former players can find out if there is an issue, and if you catch it early or late, there are things you can do to improve your condition. The brain is regenerative for life, and we can restore faculties that just a few years ago were thought to be lost forever.”

Since 2010, 34 ex-NFL players with a mean age of nearly 62 have undergone detailed neurological and neuropsychological assessments measuring aspects of intelligence, cognitive flexibility, processing speed, language skills, memory and mood. Researchers also gathered detailed retrospective histories of mental status and concussion experiences, and examined motor and sensory functions, gait and reflexes. Twenty-six of the ex-players also underwent detailed diffusion tensor MRIs. All but two of the 34 players reported having experienced at least one concussion, with 13 the highest number reported.

Also noteworthy was that, for the first time, researchers identified a correlation between cognitive impairment and cerebral white-matter abnormalities. Among the players who were found to have cognitive deficits or depression, researchers found disrupted integrity in their brains’ white matter, which is connective tissue that allows information to travel from one brain cell to another. There were also associated brain blood flow changes in those who developed cognitive impairments, providing clues to the active brain changes resulting in deficits.

Some of the players do have some form of cognitive impairment: four were diagnosed with fixed cognitive deficits, eight with mild cognitive impairment, and two with dementia. When compared to healthy, age-matched control subjects, the former football players generally had more difficulty on neuropsychological tests that dealt with naming, word finding, and visual and verbal episodic memory.

About 24 percent of the players were diagnosed with depression, including six who never before had been diagnosed or treated. The rate of depression in an age-matched general population would be about 10 percent to 15 percent, said neuropsychologist Dr. Munro Cullum, the study’s senior author and a professor of psychiatry and neurology at UT Southwestern.

The findings could have implications beyond the football field, particularly in the areas of aging and Alzheimer’s disease, Dr. Cullum said.

“There is still so much we don’t know about concussions and later-life function, nor do we know who is vulnerable to cognitive problems later in life,” Dr. Cullum said. “Severe and moderate head injuries have been identified as a potential risk factor for Alzheimer’s. We’re still learning about concussions.”

Other researchers on the study included Dr. Nyaz Didehbani, Dr. Elizabeth Bartz, Jeremy Strain, Heather Conover and Sethesh Mansinghani, all of UT Dallas; Dr. Kyle Womack, who holds appointments at both UT Dallas and UT Southwestern; Dr. Hanzhang Lu of UT Southwestern; and Dr. Michael A. Kraut, who holds appointments at both UT Dallas and the Johns Hopkins University School of Medicine.

The study was supported by the BrainHealth Institute for Athletes at the Center for BrainHealth.

Read article >>

Did You Know?

Vitamin D Supplementation Does Not Relieve Depressive Symptoms.
