Imposters are us

Health / Myths / Psychology

Have you ever felt like a fraud? That you aren’t good enough, experienced enough or smart enough to be doing what you’re doing? Join the club.

Anxious you’re about to be found out?

Feeling like a phoney

On the outside you appear confident, composed and on top of your game. But on the inside you are wracked with self-doubt. You feel like a fraud and as though someone is about to tap you on the shoulder and ask you what you think you’re doing. You’re sure your inadequacies and incompetency are just about to be revealed. Hello Imposter Syndrome.

In 1978, psychologists Pauline Rose Clance and Suzanne Imes described this phenomenon among high-achieving women. Five years of research had highlighted how many successful women believed their success could be attributed to luck, chance or errors in selection processes. These women were sure their abilities had been overestimated. Clance went on to create a scale to quantify the experience of imposter syndrome.

Researchers have been studying imposter syndrome ever since and we now know a lot about it. Although imposter syndrome was first described in women, we now know men are just as likely to experience it. One study among university academics (a profession in which imposter syndrome is rife) found men were even more likely than women to experience it.

Research suggests 70% of people will experience feelings of being an imposter at some point in their lives. And it’s worth pointing out this phenomenon doesn’t qualify as a syndrome according to the medical definition. According to Clance, if she could go back she would call it the Imposter Experience since nearly everyone experiences it and it’s not a mental illness.

Come one, come all

An important aspect of the imposter syndrome is the fact each of us tends to think we are the only one suffering from it. We listen to the monologue of self-doubt going on in our own heads and mistakenly assume it’s just us. But the truth is most people feel this way at least some of the time. When Olivia Fox Cabane asks incoming students at the notoriously highly-selective Harvard Business School each year ‘How many of you in here feel that you are the one mistake the admissions committee made?’, two-thirds of the students put up their hand.

(And of course, it’s somewhat terrifying to consider the possibility everyone around us is just winging it. We like to think the people out there flying planes, performing surgery, making decisions in court and running our governments are highly competent.)

There are a number of common features among people experiencing imposter syndrome. The vast majority of ‘imposters’ are able to successfully fulfil their work requirements despite their perceptions of incompetency. In fact, many ‘imposters’ are high achievers who fail to internalise their success. Despite ample objective evidence of their achievements, ‘imposters’ still feel like they are making it up as they go along and fear they are about to be unmasked.

Other features of imposter syndrome are a fear of failure; a tendency to attribute success to luck, error or charm; and the feeling of having given others a false impression. People experiencing imposter syndrome often experience anxiety, may be perfectionists and may also fall victim to procrastination. ‘Imposters’ often feel they need to stand out and be the very best compared with their peers. Interestingly, people suffering imposter syndrome crave praise and acknowledgement but feel uncomfortable when they receive it (because they feel they don’t deserve it).

Although men and women experience imposter syndrome in equal numbers, they do respond to it differently: women commonly work harder to try and prove themselves whereas men tend to avoid situations in which their weaknesses might be exposed.

Fake it ’til you make it

The most common advice we hear about how to deal with feelings of inadequacy is to ‘fake it ’til you make it’. Pretend you feel confident and assured and ignore the nagging doubts. It’s not bad advice, but we may be waiting for a long time to feel like we’ve ‘made it’. The frustrating irony of imposter syndrome is the more experienced and senior you become, the more likely you are to find yourself being required to do new things, and therefore feeling like you are winging it. Getting better at your job isn’t a guaranteed way of making your imposter syndrome go away.

There is plenty of other advice for coping with imposter syndrome. For example, learn to accept compliments, talk with the people around you about how you feel and remember that feeling like a fraud is completely normal. It can also be useful to focus on what you’re learning, rather than how you’re performing. This is mindset theory: if you focus on how you’re performing, you see any mistakes you make as evidence of your inadequacy. But if you have a growth mindset, your mistakes are simply part of the inevitable learning process.

And if all else fails, take heart from the words of an expert in the field:

Impostorism is most often found among extremely talented and capable individuals, not people who are true impostors. – Associate Professor Jessica Collett



It’s a seasonal thing

Health / Myths / Psychology

We all know horoscopes are bunk. But research suggests the season you were born may say something about you after all.

What season were you born in?

What’s your star sign?

Is checking your horoscope a guilty pleasure? Most of us agree: there’s no science behind the star sign columns that abound in newspapers, magazines and online. But that doesn’t mean people don’t read them. One study found more than 20% of adults read their horoscope often. There’s also been plenty of research into how many people believe in horoscope predictions, and the numbers are higher than you might think. This is despite the fact we know full well that predictions based on astrology are no better than those based on chance.

Astrological predictions aside, it turns out there are other interesting correlations between the season in which you were born and your personality and health. Research has found a variety of factors including your mood, risk of suffering allergies and how long you are going to live may be influenced by when you were born.

The seasons of life

Researchers have found a number of correlations between season of birth and personality traits. People born in summer are more likely to experience frequent mood swings. Those born in summer and spring are more likely to be excessively positive. But people born in winter are less likely to be irritable than those born at other times of the year.

We know people born during the winter months are more likely to have Seasonal Affective Disorder (‘winter depression’), bipolar disorder and schizophrenia. One fascinating study also found differences in brain structure between people born in different seasons. One very large study found relationships between month of birth and risk of developing a disease for 55 different conditions including MS, diabetes and heart conditions.

When you were born may also influence how long you are going to live. In the US, you are more likely to live to 100 if you were born in September to November. A Danish study found remaining life expectancy at 50 was higher for people born in October or November. In the southern hemisphere, the pattern is shifted by half a year: in Australia, people born in May and June are likely to live longer. Interestingly, British immigrants to Australia fit the pattern for the northern, not southern hemisphere.

Your seasonal brain

We don’t fully understand how the season you were born in influences your personality and health. There are a few different possibilities. Sunlight varies across the year, and the amount of sunlight you are exposed to influences your vitamin D levels, which may play a role. We know people born in winter have lower levels of vitamin D in adulthood than those born at other times of the year. The timing of your birth may also be linked to how likely your mother was to experience seasonal illnesses during pregnancy. Levels of important chemical messengers in our bodies, like serotonin and dopamine, also vary according to season of birth.

There’s also evidence from studies of mice that the season you are born in may have an ongoing effect on your body clock. Mice raised in a winter light cycle had more trouble adapting to a seasonal shift in light than mice raised under summer conditions. This has led to an intriguing suggestion: perhaps the cycles of night and day we experience during the time our brains are developing may affect our personality.

We know the seasons influence us in other ways too: for example, how our eyes perceive different colours and possibly even how effectively we can pay attention and memorise things.

So although I’m not advocating you spend any time reading your star signs, it may be worth considering that the seasons could be having more of an effect on us than we realise.


To eat it or not to eat it?

Biology / Health / Myths

We’ve all heard the five-second rule: as long as the food has been on the ground for less than five seconds, you’re safe to eat it. But are you?

Would you eat this biscuit off the floor?

Birth of an urban myth

When it comes to the origin of the five-second rule, stories abound. Genghis Khan often stars in these accounts, with claims it was originally known as the Khan rule. This rule apparently came into play at 13th-century victory banquets when Khan declared that as long as food had been on the ground for less than twelve hours, it was safe to eat. Another story centres on celebrity chef Julia Child famously advising on her TV show The French Chef what to do if you drop food while cooking. ‘You can always pick it up, and if you’re alone in the kitchen, who is going to see?’

One survey found 70% of women and 56% of men are aware of the five-second rule and use it to make decisions about whether to eat dropped food. Another found 87% of people would, or already have, eaten food that’s been on the ground. The five-second rule was the focus of a Mythbusters TV segment and even made it into a Volkswagen commercial.

But is there any science to back up the claim that it takes more than five seconds for bacteria to attach to dropped food?

Testing, testing

I found a couple of tests of the five-second rule, with conclusions both for and against. In 2003, high school student Jillian Clarke set out to examine the rule as part of her science internship. Her conclusion: university floors are remarkably clean. But if you do drop food on a floor containing germs, your food will become contaminated in less than five seconds.

Another study concluded there is definite truth to the five-second rule: food picked up within a few seconds of being dropped is less likely to be contaminated than food left for longer. But a 2006 study found a sausage dropped on a tile floor picked up 99% of the bacteria present within five seconds.

A detailed study published last year found some food picks up bacteria within a second of landing on the floor. The scientists tested four foods (watermelon, bread, buttered bread and a jelly sweet), four surfaces (stainless steel, ceramic tile, wood and carpet) and a variety of contact times from less than a second to five minutes. Unsurprisingly, the moister the food, the more it got contaminated. The message: next time you drop watermelon on the floor, put it in the compost.

All floors aren’t equal

One of the biggest influences on whether bacteria end up on food is not how long the food has been on the floor, but rather what floor the food falls on. And I’m not just talking about how often you vacuum or mop: different floor surfaces provide more or less friendly surfaces for bacteria. Interestingly, the researchers found much lower contamination rates from carpet onto food than from tiles or stainless steel.

Clearly, what sort of food you’ve dropped and where you’ve dropped it matters just as much, if not more, than how long you’ve left the food on the floor for.

But it’s worth bearing in mind your floor is probably far cleaner than many of the other surfaces around your house. Study after study has highlighted how many bacteria live on the surfaces we touch regularly: money, mobile phones, remote controls, supermarket trolleys, computer keyboards and, often the most contaminated, kitchen cloths. There are a whole heap of bacterial hotspots you are probably touching before you eat without a second thought.

It’s highly likely your floor contains bacteria, and almost certain any food you drop on that floor will very quickly pick up those bacteria. Most bacteria won’t do you any harm, but some will make you very sick.

Whether eating food you’ve dropped on the floor is more likely to make you sick than any other number of things you do every day is a decision only you can make.


Who is watching you?

Health / Myths / Psychology

Ever had the creepy feeling someone was staring at you, only to turn around and discover it was true? Is it possible to ‘feel’ someone watching you?

Feel like someone might be watching you? Image credit: Seth Showalter via Tookapic

I see you

They say the eyes are the windows to the soul: eye contact is one of the most powerful forms of human communication. Unlike all other primates, human eyes make it extremely obvious which direction we are looking: the exposed white area around the coloured iris gives it away, even from a distance. In most other animals, the iris takes up almost all of the eye, or the area around the iris is darker. Either way, it’s very hard to tell which direction animal eyes are looking, which probably evolved as a way for predators to hide the direction of their gaze from potential prey.

From their very first days, babies prefer faces looking directly at them. And from an early age, our brains respond more to a face looking directly into our eyes than one looking away. We have a ‘gaze detection’ system – a network of nerves in our brains sensitive to whether someone is looking right at us, or just past us. Researchers suggest we have evolved to be so tuned into the direct gaze of other people because it forms the basis of human cooperation and social behaviour.

Sixth sense?

But what about the feeling of being watched? According to surveys, up to 94% of people report having experienced the feeling of being stared at, only to look up and discover it was true. If you’ve had the experience, you’ll know it can feel like ESP or a sixth sense. The phenomenon has also been the subject of plenty of research – the earliest paper I could find on the topic was published in 1898!

Early studies dismissed the idea, declaring that the feeling of being watched was simply a result of being nervous about what may be going on behind your back. There are also other straightforward possible explanations. It could be that in your peripheral vision, you’ve noticed the tell-tale signs of someone looking in your direction. We are enormously sensitive to body language. If someone’s body is facing away from us, but their head points towards us, it’s an immediate giveaway they may be looking at us. At the very least, it makes us look up to get more information.

It may also be a self-fulfilling prophecy: if you’re nervous someone is watching you, you may start fidgeting. Your movement alone will make it more likely someone will indeed look at you. There’s also the possibility that as you sit on the train feeling like you’re being watched, the very act of you looking up from your phone makes the person opposite you look up too. When your eyes meet, you wrongly assume this person has been looking at you all along. Research has shown we are hard-wired to assume other people are looking at us, even if they’re not. The argument goes we evolved to think this way because it keeps us alert to danger and ready to interact.

But there’s also a far more intriguing possibility: perhaps your brain has detected someone else’s gaze without your eyes seeing a thing.

Your brain is watching

The evidence for this somewhat unnerving idea comes from studies involving people who have lost their visual cortex due to brain injury. The visual cortex is the part of your brain that processes visual information and maps your view of the world.

One fascinating study reported the experiences of a man who is ‘cortically blind’. Although his eyes are fully functional and send information to the brain, his visual cortex was damaged by two strokes. As a result, he doesn’t have what we think of as sight. But what do our eyes take in beyond what our sight shows us? Quite a lot, as it turns out, and it’s called blindsight.

Although this man (known as TN) can’t see, researchers showed him pictures of faces, some looking right at him, some looking away. At the same time, they measured activity in his amygdala – a part of the brain involved in processing emotions and recognising faces. Even though TN couldn’t consciously see the pictures and said he couldn’t tell the difference, there was more activity in his amygdala when he was presented with a face looking directly at him. His brain knew when someone was looking right at him, even when he couldn’t consciously see it.

What an extraordinary trick our brains are capable of. As long as a person looking at us is within our field of view, we can sense it, without consciously noticing anything.

Perhaps this explains the creepy feeling of being watched, no supernatural explanation required.


Itchy and scratchy

Evolution / Health / Myths

We all know the sweet relief that comes from scratching an itch. But why does scratching feel so good? And why does it make the itch worse?

When you really have to scratch that itch! Image credit: lintmachine via Flickr

Why so itchy?

An itch is simply a sensation on the skin leading to the desire to scratch. But the science of itching is far from simple – it’s been the subject of decades of research (there’s even a scientific research journal called Itch). Researchers discovered the first gene associated with itching ten years ago, but there’s still plenty we don’t understand.

Itching can have many causes: from minor insect bites and skin irritations, through to various cancers and the treatments for those cancers. For most of us, itches are just a minor annoyance, barely noticed throughout the day. But for others, itching is a chronic condition that has a serious impact on quality of life. And chronic itching can lead to continuous scratching and severe skin damage.

Scratching the itch

On the face of it, dealing with an itch is straightforward: you scratch it. And hey presto, all is well.

Scratching an itch with a violence that would cause pain elsewhere may be experienced as one of the most exquisite pleasures. – Dr. George H. Bishop (1948)

But then all too quickly, the relief fades. Only to be replaced by an even more insistent itch. Remember being told as a kid that scratching your itch would make it worse? Well, your parents were right.

The reason scratching temporarily relieves a pesky itch is that scratching causes mild pain. This means your nerves carry pain signals, rather than itch signals, to your brain, and for a brief moment, the itch is gone. The pain distracts from the itch. But the problem is, your brain releases the chemical serotonin in response to the pain.

Serotonin explains how you can end up scratching until you bleed. You scratch because the pain of scratching inhibits the itch. But because your body releases serotonin in response to the pain, it takes more and more vigorous scratching to suppress the itch. At the same time, the serotonin also acts to intensify the itchy feeling. It’s a vicious cycle, which is why the best thing you can do is not start scratching in the first place.

Mice who can’t produce serotonin scratch far less in response to being itchy than normal mice. But if you’re thinking blocking serotonin would be a great way to prevent itching, not so fast. We need serotonin for all sorts of roles in the body: growth, ageing and regulating moods. Getting rid of serotonin would have far more serious consequences than relieving itching.

Catching the itch

Have you started scratching yet? Just reading about scratching can be enough to make you suddenly feel itchy. Audience members at a talk about itching scratched more than those listening to a talk on a more innocuous subject. And if you see someone else scratching, you’ll probably copy them without even being aware of it. It’s not a sign of a lice outbreak (hopefully), you’ve just fallen victim to socially contagious itching. Like yawning, itching is contagious: mice, monkeys and people can all ‘catch’ it.

There’s been plenty of research into why itching may be contagious. For example, research found that contagious scratching is more common among highly neurotic people.

Research published earlier this year showed that in mice, contagious scratching appears to be a hard-wired, instinctive behaviour. If a mouse sees another mouse scratching – even if the mouse is an image on a screen – the first mouse also starts scratching within a few seconds. A chemical is released in an area of the brain responsible for regulating mouse body clocks (the suprachiasmatic nucleus), which causes the scratching. Inject this chemical into another mouse and it too will start scratching. The mouse isn’t consciously deciding to scratch; it’s an automatic response to the change in brain chemistry.

Hard-wired behaviours tend to be important ones, and there’s probably a good explanation for why we’ve evolved to ‘catch’ scratching. We’re highly sensitive to potentially dangerous things like insects touching our skin. If someone near you is scratching, it may well be a sign of something you should respond to, perhaps a swarm of mosquitoes. Better to start scratching now and avoid getting stung.

But then again, if you don’t want the itch to drive you crazy, now may be a good time to look away.


By the light of the full moon

Astronomy / Health / Myths / Psychology

Many people believe a full moon triggers strange behaviour, a phenomenon known as the Transylvania Effect. Crime rates, psychiatric hospital admissions, emergency room visits, dog bites and hyperactivity in kids are all said to increase during a full moon. Is there any truth to the rumours?

Do strange things coincide with the full moon? Image credit: Brad Scruse via Flickr

Bad moon rising

We can thank the Roman goddess of the moon, Luna, for the words lunatic and lunacy. Why the link between the moon and erratic behaviour? Aristotle and Pliny the Elder both suggested that because the human brain is so moist, it responds to the same pull of the moon that drives our tides (which we now know not to be true). In psychiatrist Arnold Lieber’s 1978 bestseller, How the Moon Affects You, he also argued that because our bodies are mostly water, the moon has a strong influence on us. How strong an influence? The story goes that in 19th-century England, lawyers could invoke the defence ‘not guilty by reason of the full moon’. I doubt any lawyers argued their clients had metamorphosed into drooling, snarling werewolves, but the argument is clear: under a full moon, people are no longer in control of their behaviour.

Sounds far-fetched, doesn’t it? But the myth of the Transylvania Effect is alive and well. One survey found 43% of US college students believed people were more likely to behave strangely on a full moon night. The same study found mental health professionals were more likely to believe this than people working in other areas. In another survey, 80% of emergency department nurses and 64% of emergency department doctors believed the moon affected behaviour. Ninety-two percent of the nurses said they found working on the night of a full moon was more stressful, and they should be paid more for working those shifts.

The moon made me do it

In 2007, police in the UK employed more staff to be on patrol on full moon nights claiming ‘Research carried out by us has shown a correlation between violent incidents and full moons’. There are thousands of published research papers exploring the link between the phase of the moon and a huge variety of events and behaviours. The most striking thing about all this research is the almost complete lack of evidence for the moon having any effect on us.

Despite the fact many people believe the time of the full moon is the worst for undergoing surgery, there is no evidence a full moon affects the outcomes of heart surgery, the rate of surgical complications, levels of post-operation nausea and vomiting or post-operative pain. There is no evidence for increased crisis centre calls during full moons and no link between moon phase and frequency of epileptic seizures. Research shows no increase in emergency department admissions or calls for ambulances during a full moon, nor do more people see doctors for anxiety or depression. There is also no relationship between moon phase and birth rates.

Research has found no link between the phase of the moon and psychiatric hospital admissions. A full moon doesn’t lead to increased violence in psychiatric settings, nor is there evidence for a link between moon phase and suicides or suicide attempts. No link has been found between full moons and traffic accidents, and dog bites requiring hospitalisation occur no more often during a full moon than at any other time. Kids aren’t more hyperactive at the full moon, and there are no more murders or other crimes when the moon is full.

The full (moon) truth

If nothing else, research over the past 50 years has provided us with a detailed list of all the human behaviours not influenced by the phase of the moon. Given the lack of evidence, why does the myth persist? It may simply be the result of illusory correlation – if something unusual happens on the night of a full moon, we are more likely to take note and remember it than when nothing unusual happens under a full moon. Confirmation bias probably also plays a role: if you already believe in the Transylvania Effect, you’ll pay particular attention to any information that supports your belief.

But there’s another intriguing possibility. We have some evidence people sleep a little less on full moon nights. One study found people slept an average of 19 minutes less during a full moon; another study recorded a decrease of 25 minutes of sleep on these nights. Study volunteers took longer to fall asleep and slept less deeply at the time of the full moon, even when they couldn’t see the moon and didn’t know the moon was full. They also complained of feeling less rested on waking after a full moon. A 2-year study of nearly 6000 children living on five continents found kids also slept a little less on nights with a full moon.

Perhaps this is the source of the myth. In times gone by, before our body clocks were under the influence of electric lights and screen time, a full moon may have been very disruptive to sleep. Researchers have suggested the Transylvania Effect may have its roots in the fact that someone with bipolar disorder or a seizure disorder may be highly susceptible to mania at times of sleep deprivation.

Seems plausible but fortunately I don’t think we’ll be hearing the defence ‘not guilty by reason of not having had quite enough sleep last night’ anytime soon.


Scaling great heights

Anthropology / Evolution / Genetics / Health / Myths

Sherpas – sometimes called superhumans – are extraordinary mountaineers who are at home in the peaks of the Himalayas. What is it about Sherpas that enables them to power on at such high altitude?

Sherpas perform extraordinary feats of endurance at high altitude. Image credit: Niklassletteland via Wikimedia Commons

The top of the world

Standing at 8,848 m above sea level, the peak of Mt. Everest is not a welcoming place for humans. We need oxygen, and up there, oxygen levels are only a third of those found at sea level. Some people begin to experience mild altitude sickness, including headaches, nausea, dizziness and exhaustion, at only 2,500 m above sea level. Climbers who venture above 8,000 m have entered the ‘death zone’: our bodies, particularly our brains, don’t do well without oxygen.

The way to help our bodies cope with high altitude is to allow them to adjust gradually. Climbers spend many weeks acclimatising to high-altitude conditions, slowly moving higher into the mountains. Research has shown our bodies are reasonably good at adjusting to low-oxygen conditions: spending just two weeks in the mountains causes changes in our blood that may last for months. But fewer than 200 climbers have ever managed to reach the summit of Mt. Everest without supplemental oxygen.

One group of people famous for their ability to thrive in low-oxygen conditions are the Sherpas. Sherpas are an ethnic group from Nepal who have lived in the high altitudes of the Himalayas for generations.


Behind virtually every successful summit of Mt. Everest by foreigners, there are climbing Sherpas. We’ve all heard of Tenzing Norgay, famous for having been the first, along with Sir Edmund Hillary, to reach the top of Mt. Everest in 1953. Sherpas have been called the forgotten heroes of the mountain, but their climbing abilities are incredible: Sherpas hold the world records for reaching the summit of Mt. Everest both the fastest and the most times, and it is Sherpas who act as guides and porters for the foreigners who climb Himalayan peaks.

Researchers have long wondered how Sherpas live and work at altitudes that make the rest of us sick. Scientists have put together pieces of the puzzle over the last few decades, studying Sherpa genetics and the evolutionary history of Himalayan people. We know the Sherpa people have been living at high altitude for centuries and have many evolutionary adaptations to low-oxygen conditions.

One normal response our bodies have to low levels of oxygen is to produce more red blood cells. This is a great way for our blood to carry more oxygen, but it also means our blood gets thicker and is more likely to clog blood vessels. Past research has shown Sherpas actually have fewer red blood cells, but higher levels of nitric oxide, a chemical that opens up blood vessels. But research published this month has given us new insights into Sherpa physiology, and the trick to their high-altitude living is not just in their blood.

Extreme science

In 2013, an Xtreme Everest science expedition headed to Everest Base Camp, located at 5,300 m above sea level. The researchers wanted to learn more about human biology at high altitude and took 180 volunteers with them. Sixty-four were Sherpas, the rest, members of the public who live at low altitudes (‘lowlanders’). The scientists looked at the blood, bones and muscle of each member of the two groups before and after they had reached base camp.

It turns out Sherpas can conserve muscle energy at high altitude because of the way their mitochondria function. Mitochondria are like batteries, the energy centres in each of our cells. The research showed the Sherpas’ mitochondria use oxygen much more efficiently than the lowlanders’ when producing the energy their bodies need. And while the energy reserves in the muscles of the lowlanders decreased the longer they spent at base camp, energy reserves increased in the Sherpas’ muscles, despite the low oxygen.

Sherpas have spent thousands of years living at high altitudes, so it should be unsurprising that they have adapted to become more efficient at using oxygen and generating energy. – Dr Andrew Murray, University of Cambridge

This research isn’t just about helping gung-ho adventurers survive the slopes of Everest. Many critically-ill patients also experience a fall in oxygen levels, for example when suffering heart failure, lung diseases and many cancers. Understanding the way Sherpa physiology has evolved to cope with extreme conditions may just help all of us one day.

Links and stuff

Early learning

comments 2
Anthropology / Evolution / Myths / Zoology

Love spicy food? Find certain songs calming? You started developing preferences for flavours and sounds while you were still in the womb. And it’s not just us: many animals begin learning about the world around them before they’re even born.

Learning by sound and taste. Clockwise from top-left: elephant, horse, kangaroo, cheetah, dolphin and emperor penguin. Image credit Peter Chinn / National Geographic

Listening from the inside

Before you were born, you had a lot to listen to. There was the regular thumping of your mother’s heart, the blood whooshing through her body, and even the rumbling and gurgling of her stomach. You were also getting to know your mother’s voice and from the moment you were born, you preferred her voice over a stranger’s. What’s more, you already preferred listening to your mother’s native language over other languages.

You may have developed more specific preferences too. If your mum liked listening to a particular song or type of music, you would have recognised it and liked it too. One striking example of this comes from a study of women who watched the Australian soap opera Neighbours daily during their pregnancy. Four or five days after birth, babies who had heard the muffled strains of the show’s theme song before they were born became immediately calm upon hearing the song again. The song had no such effect on babies who hadn’t heard the song before birth. The same has been shown for babies who frequently hear particular nursery rhymes before birth.

Other studies have shown we learn words and sounds in the womb and that the memory of these sounds can be detected in our brains. The melodies of our newborn cries have characteristics of our native language, and if your mum spoke multiple languages during her pregnancy with you, you were born liking the sound of each of the languages you were exposed to.

A taste sensation

It’s not only sound we’re exposed to in the womb: we also learn about flavour. Before you were born, you swallowed the amniotic fluid that surrounded you. And this fluid bore the signature flavours of the food your mother ate. If she ate spicy food, you developed a taste for it too. We know a variety of flavours can be easily detected in amniotic fluid, including garlic, mint, aniseed and vanilla. One study showed that mothers who ate lots of garlic during pregnancy gave birth to babies who also liked garlic. Babies who aren’t exposed to garlic before birth generally hate the flavour. The same goes for carrots: mothers who drank carrot juice during the last few months of pregnancy had babies who went on to happily eat carrot and carrot-flavoured cereal. So if you want kids who’ll eat their veggies without complaining, make sure you eat plenty of veggies during pregnancy.

These preferences have also been found in other animals. Rats whose mothers eat a high-fat, high-sugar diet are born preferring the same kind of ‘junk’ foods. Female sheep that eat hay infused with oregano oil give birth to lambs that prefer oregano-flavoured hay over normal hay. Newborn chicks are also influenced by the flavours of their mother’s diet.

What’s it like out there?

There are many amazing examples of unborn animals learning about the world waiting for them on the outside. Quail prefer the sound of whatever call they hear while still inside the egg, even if it is the call of a different species. Superb fairy wrens learn a distinct ‘password’ call from their mothers while still in the egg. After hatching, they mimic this call to get fed by their mothers in the presence of imposter cuckoo birds. Zebra finches use song to communicate information about climate to their unhatched eggs. This, in turn, affects the birth weight of the chicks (it’s better to be small in a hotter climate). There’s little doubt that’s going to come in handy in the face of climate change.

Frog embryos learn to be afraid of salamanders, a common predator, if they are exposed to the smell of danger (a mixture of crushed tadpole and salamander). Tadpoles that are exposed to the smell while still in the egg recognise salamanders as dangerous. Cuttlefish prefer eating crabs over shrimp, but only if they’ve been able to see crabs through the transparent wall of their egg before birth. No wonder animals have evolved such early learning: it’s easy to imagine the benefits of knowing about the world waiting on the outside.

So if you’re pregnant, or you’re near someone who is, be aware: the baby inside is listening to every sound.

Links and stuff

Is training your brain just a game?

comments 2
Health / Medicine / Myths / Psychology

Want to be smarter and better at concentrating? Want to improve your memory and protect yourself against dementia? Brain-training programs promise all this and more – but do they work?

Do brain-training games make you smarter? Image credit: dire schaefer via Flickr

The claims

The logic behind brain training is simple. Carry out a mental task repeatedly, and you get better at it. It could be memorising a string of numbers, fitting together a series of shapes as fast as possible, or any number of other ‘fun games’. If repeating these tasks improves your memory, concentration or critical thinking, it seems reasonable to suggest you’ll also get better at real-life tasks that depend on the same skills. The catchphrase is neuroplasticity: our brains change as a result of how we use them.

You don’t have to spend long exploring brain-training websites to find a huge variety of claims about the benefits awaiting you once you start playing. For a start, you can expect ‘135 percent faster auditory processing’, ‘increased brain activation’, ‘improved cognition’, ‘sustained improvements in working memory’, improved focus and speaking abilities, ‘lower risk of depression’ and ‘more happy days’. All of the brain-training websites I looked at cited long lists of scientific research papers to back up their claims.

Many respected scientists have put their names to brain training, serving as experts for the numerous brain-training companies. And hundreds of peer-reviewed studies have documented the benefits of brain training. One important study published in 2015 looked at the effects of brain training in nearly 7000 people over the age of 50. The results? Within six weeks of beginning training, the study participants showed improved reasoning skills.

It would be easy to conclude that investing time – and money – in brain training would be a wise choice. We’ve certainly embraced the phenomenon: it’s been estimated we’ll spend $US3.38 billion on brain training annually by 2020. But are we putting our money to good use, or are we being conned?


Many scientists have questioned the claims made by brain-training advocates. And while some studies have documented the benefits of brain training, other researchers have looked for improvements in mental functioning after brain training and found none. In 2010, a study of more than 11,000 people found no evidence that the reasoning, memory, planning or attention skills learned playing the training games transferred to any real-life situations. In 2014, a group of more than 70 psychologists and neuroscientists published a ‘consensus statement’ arguing that although, as a community, we are very fearful of losing our memories and other mental skills as we age, ‘claims promoting brain games are frequently exaggerated and at times misleading’. They didn’t suggest we don’t learn new skills playing brain-training games. But importantly, they argued there’s no evidence that playing brain-training games helps us in the real world.

We object to the claim that brain games offer consumers a scientifically grounded avenue to reduce or reverse cognitive decline when there is no compelling scientific evidence to date that they do. Consensus statement (2014)

But later the same year, more than 100 scientists responded with another statement, arguing ‘certain cognitive training regimens can significantly improve cognitive function, including in ways that generalise to everyday life’.

To train or not to train

Last year, in what could be the nail in the coffin, a group of scientists published a review of every research paper that brain-training companies have cited as evidence for the power of brain training – a total of 374 papers. It took them two years to review all of the research. The authors concluded that none of this research is without problems. For example, many of the studies included only a small number of participants. Others failed to take the placebo effect into account (simply telling someone that playing a game will improve their skills can lead to improved skills). They concluded brain training might make you good, even exceptionally good, at a particular game. But there’s essentially no evidence that playing a brain-training game will have any flow-on effects in the rest of your life.

Lumos Labs, the group behind the popular brain training site Lumosity, paid a heavy price for exaggerated claims about brain training: $US2 million. In 2016, the Federal Trade Commission declared Lumos Labs misled consumers with unfounded claims that ‘Lumosity games can help users perform better at work and in school, and reduce or delay cognitive impairment associated with age and other serious health conditions.’ Another smaller brain-training company paid $US200,000 and agreed to stop ‘making a range of false and unsubstantiated claims.’

The fact is, if you want to get good at brain-training games, playing these games is an excellent plan – and there are plenty of free ones available online. But if you want to improve your memory, concentration or ability to think creatively and critically, there are plenty of better things you could do with your time than stare at a screen. Experts recommend a number of more effective approaches. First, get some exercise: your brain will benefit from the increased blood flow. Second, learn something new. This is a sure-fire way of kick-starting your thinking. Finally, hang out with your friends – being sociable is great for our brains.

Anyone care to join me at a salsa class? We know dancing is an excellent way to ward off dementia.

Links and stuff

Take an espresso nap

comments 6
Health / Myths / Psychology

Do you like coffee? Do you like naps? If you answered yes to both of those questions, you’re going to like what’s coming next. Research shows that combining the two – a coffee nap – is even better than each on its own.

Coffee or a nap? Why not both!?

Coffee, glorious coffee

There’s plenty of evidence for the health benefits of coffee (unless, like me, you’re allergic to caffeine!). Among other positive effects, drinking coffee may protect you against some nasty diseases and improve your memory. And of course, a major draw card for many people is the buzz that comes with a coffee. We’ve long known caffeine results in increased alertness: even one cup of coffee can make you feel more switched on. Caffeine can increase your alertness and reaction times when driving, reduce the number of mistakes you make at work, improve your mood, reduce mental and physical fatigue, improve athletic performance, and increase your ability to make the right decisions. No wonder caffeine is considered the world’s most popular and widely-used drug.

Why is caffeine so effective at combating tiredness? Because a molecule called adenosine is what makes you feel tired. It’s a by-product of brain activity and it builds up during the day, making you feel worn out and drowsy. But caffeine blocks adenosine from acting on your brain. So with caffeine in your system, you don’t feel the accumulating tiredness.

Unfortunately, these benefits come with a cost. Drink a coffee too late in the day, and you’ll probably struggle to fall asleep. Caffeine consumed even six hours before you go to bed can interfere with both the quality and quantity of your sleep. As a result, many people recommend a 2 pm cut-off for drinking coffee. And the older you get, the more likely caffeine is going to disrupt your sleep. The message is clear: a coffee will perk you up, but be careful. Sleep is too important to mess around with.

The power of a nap

If you’ve ever thought of seeing how long you can stay awake for, please don’t. A lack of sleep, even over a short period, can have disastrous effects on your physical and mental health. Sleep deprivation will interfere with your memory, cause problems for your immune system, and at its most extreme, a lack of sleep could kill you.

We’ve all heard napping is good for us – it’s one good way to fit a bit more shut-eye into your day. Even a 10-minute nap can increase your ability to concentrate for up to four hours. And a six-minute nap can improve your memory. People who nap for at least half an hour three times a week have a significantly lower risk of suffering heart disease. A 40-minute nap taken during the work day improves the performance of both doctors and astronauts. Research shows that in the case of learning to do new tasks that rely on your powers of perception, a one-hour nap is just as good as a full night’s sleep in consolidating the new skills.


Separately, caffeine and napping can both work wonders for improving alertness, concentration and productivity.

It may seem counterintuitive, but over the last 20 years, researchers have shown that putting the two together is in fact even more powerful. Enter the ‘coffee nap’.

The idea is simple. Drink a coffee and then immediately lie down for a 15–20-minute nap. The timing is essential: it takes about 20 minutes for the caffeine in your coffee to kick in and take effect, so if you get the timing right, you get to reap the combined rewards of a nap and a caffeine buzz.

In one study, a coffee nap improved driving performance and reduced sleepiness better than either a short nap without the coffee, cold air, or a strongly-brewed coffee without the nap. In another study, researchers found a coffee nap could reduce afternoon sleepiness in drivers by more than 90 percent. Even nappers who just dozed during their nap got the same alertness benefits as those who slept more deeply.

In a Japanese study, a group of sleep-deprived people compared the benefits of five different approaches: a 20-minute nap, a coffee nap, being exposed to a very bright light immediately after napping, washing their face immediately after napping and resting, but not sleeping. In terms of how sleepy the study participants felt, and how successfully they were able to carry out cognition tests, the coffee nap won hands down.

So you know what to do, people. Fire up the coffee machine, then have a snooze and look forward to how great you’re going to feel once the caffeine kicks in. In 20 minutes, you’ll be leaping out of bed.

Links and stuff