Trumped-up confidence


Know anyone whose confidence in their own ability far outweighs their actual skills? It’s called the Dunning-Kruger Effect.

The Dunning–Kruger Effect is the ignorance of one’s own ignorance. Image credit Gage Skidmore via Wikimedia Commons

A question of confidence

The whole problem with the world is that fools and fanatics are so sure of themselves, and wiser people so full of doubt. –Nobel Prize Winner Bertrand Russell

A lack of confidence is extremely common: seventy percent of us experience the imposter syndrome. Despite good evidence to the contrary, we worry we’re too inexperienced and incompetent to be doing our jobs. So we work harder to try to prove our abilities, all the while being filled with self-doubt.

But you probably know someone who seems to suffer from the opposite problem: misplaced confidence. Think of the infuriating person who dominates meetings despite clearly knowing very little about the topic being discussed. Or the person who is boring you stupid at a party, waxing lyrical – and apparently knowledgeably – about a topic they are unmistakably ignorant about. Or the terrible driver who thinks they are one of the best. We all know someone.

This flipside to the imposter syndrome is known as the Dunning-Kruger Effect.

How funny is this joke?

In 1999, David Dunning and his then-student Justin Kruger published the paper ‘Unskilled and unaware of it’. It describes a series of experiments designed to uncover the relationship between a person’s skill in a certain area and their perception of their ability in that same area.

Across tests of grammar, humour and logic, students who performed worst also hugely overestimated their abilities. Although, on average, they did worse in these tests than 88% of their peers, these students estimated they had performed better than two-thirds of the other students.

For example, Dunning and Kruger asked 65 students to rate how funny certain jokes were. They then compared the students’ ratings with those of professional comedians. The students who were terrible at predicting what other people would find funny declared themselves to be excellent judges of humour.

In all cases, the confidence of the lowest-performing students well and truly surpassed their skills. It’s easy to understand how this could happen: if you know virtually nothing about grammar, of course you are unlikely to recognise when you make grammatical mistakes. We are simply not good at knowing what we don’t know.

Kruger and Dunning begin their paper with the now-famous example of McArthur Wheeler. Wheeler attempted to rob two banks in broad daylight with no disguise. It turns out he believed that covering his face with lemon juice would make him invisible to security cameras. He wasn’t under the influence of drugs, or delusional; he was just completely wrong about a key part of his robbery plan. The story goes that Wheeler was dumbfounded when police showed him the crystal-clear video footage of himself mid-robbery.

This became the perfect example of the Dunning-Kruger Effect: not only was Wheeler incompetent as a bank robber, his incompetence left him completely unable to recognise his incompetence.

We don’t know what we don’t know

As John Cleese famously put it: ‘If you’re very, very stupid, how can you possibly realise that you’re very, very stupid?’

The Dunning-Kruger Effect has now been widely researched. Whether you look at debating, financial knowledge, chess, firearm safety, emotional intelligence or driving ability, the story is the same. Unskilled people simply don’t have the ability to recognise their lack of skills. If you don’t know how to play chess very well, you are completely ignorant of all the far better chess moves you could be making.

The problem isn’t when we know absolutely nothing about a topic. ‘Most people have no trouble identifying their inability to translate Slovenian proverbs, reconstruct a V-8 engine, or diagnose acute disseminated encephalomyelitis’. The problem arises when we know a little bit, but not enough to realise how little we know (a graph explains it better).

An important point to take away from these studies is that the Dunning-Kruger Effect isn’t about ridiculing the stupidity of others. It’s about recognising the traps we all fall into. How do we tackle the effect? By improving our skills. As we become more competent at something, at the same time we become better at recognising the limits of our abilities.

Students who originally estimated that they got five out of ten logic puzzles correct (when in reality they struggled to get one right) changed their tune after getting training. After being taught the basics of how to solve logic puzzles, the same students now predicted they would score one out of ten.

The challenge for all of us is clear. Next time we feel confident about something, we need to think carefully. Is our confidence a sign of genuine ability, or of complete incompetence?

Links and stuff

Hearing accents


We all speak with an accent, and we judge each other by our accents. Our accents can change gradually, or sometimes dramatically. But if you want to speak an additional language without any foreign accent, you may be facing an uphill battle.

Did you know that some animals – like these Japanese macaques – have accents too? Image via Wikimedia Commons.

Accent, accents everywhere

Everyone has an accent. And I’m not just talking about humans. The clicking sounds sperm whales use to communicate with each other vary by region, and Japanese macaques have local accents too. In fact, a variety of animals have patterns to their calls that are influenced by where the animal lives – for example wolves, other monkeys and cod. There has even been research into whether a cat’s miaows are affected by its owner’s accent.

Our accents can change over time – both as individuals and as whole countries. One study observed the way reality TV participants regularly changed their accents while living in isolation together for three months.

In the case of the rare foreign accent syndrome, a person’s accent can drastically and immediately change as a result of damage to the brain. Well-known examples are an Australian who has spoken with something similar to a French accent since a car accident twelve years ago, an American who has been speaking with a Cockney accent since having a stroke and a Brit who spoke in a thick Russian accent for several years after a brain haemorrhage. One of the earliest recorded cases of foreign accent syndrome dates to 1941, when a Norwegian woman began speaking in a German accent after being hit by shrapnel. The story goes that she was thought to be a German spy and was shunned by the people around her.

Talk like an Egyptian

If you’re hoping to speak a second language and sound like a native, you may be running out of time. The strength of your accent is directly correlated with your age at the time you learned the additional language. Unfortunately, your ability to speak an additional language accent-free has been tapering off since puberty. If you want to speak an additional language accent-free, your best bet is to learn it before the age of six. How much of the time you continue to speak your native language will also affect the accent you have in your second language.

It’s difficult to sound like a native speaker when you’re not, because the sounds you recognise (and will later copy) are established before your first birthday. In one study, researchers played the sounds ‘la’ and ‘ra’ to six-month-old Japanese and English babies. At that age, all the babies could pick the difference between the two sounds. But at the ripe old age of ten months, Japanese babies could no longer hear the difference between ‘l’ and ‘r’ sounds, which don’t exist in Japanese. It’s amazing to think your pattern of speaking was already largely set before you spoke your first word.

Here’s a sobering fact for all of us who speak additional languages with a detectable accent: speaking with an accent makes you harder to understand. And research shows we are less likely to trust someone we find harder to understand. The more severe your accent, the less credible people assume you to be. This accent discrimination may be frustrating for travellers but of course is far more damaging for immigrants.

Can I borrow your accent?

Do people ever tease you about the fact you pick up accents easily? You might not even be aware you’re doing it, but what’s been dubbed a ‘wandering accent’ is quite common. Without intending to, you mimic whatever accent you hear at the time. It’s also known as the chameleon effect. It turns out we even imitate other people’s speaking styles when lip-reading, without being able to hear their voice.

We mimic other people because we want to empathise and bond with them, and imitating someone’s accent is the best way to improve your ability to understand that person. We now know exactly which part of the brain is active when we impersonate someone else, which may help in treating people who have lost their own accent through foreign accent syndrome.

A heads-up: if you suddenly find yourself speaking with a distinctly British accent, a whole new line of acting work may be open to you. In the movies, most villains are portrayed by people with posh British accents – also known as received pronunciation (RP).

We perceive speakers with this accent as highly intelligent but low in morals. Sounds like a villainous combo to me.

Links and stuff

There’s a face in there


Ever seen Elvis in a corn chip? How about a face in a building or a cloud? Spotting faces – even ones that don’t exist – is something your brain is very good at.

Seeing faces? Image credit Mk2010 via Wikimedia Commons

Do you see what I see?

One of my favourite Twitter accounts is Faces in Things. You’d be amazed where people have spied faces and other human and animal shapes.

You’ve probably heard about Jesus on a banana peel, the Man in the Moon and the face of Madonna on a toasted sandwich (which sold for US$28,000). We’ve seen capsicums that look like British politicians and a crab bearing the face of Osama bin Laden.

Melburnians may know the Hume Highway rooster tree, now with its own Facebook page and song.

Since the 1700s, the surface of Mars has been a particularly rich source of these illusions. In 1976, people were captivated by images of a face on Mars, which turned out to be nothing more than a trick of light and shadows. Mars also boasts a smiley face in a crater, a lava flow that resembles Kermit the Frog, the face of Mahatma Gandhi, a rat and Bigfoot.

Famous cases aside, people have spied faces in power points, trees, buildings, rocks and USB drives. If you tend to see faces everywhere, don’t worry. You’re not going crazy: our brains are exceptionally good at spotting faces.

It’s called pareidolia: the tendency to perceive a familiar pattern when one doesn’t exist.

Faces, faces everywhere

There are many different kinds of pareidolia, but seeing faces is the most common. Why are we so quick to see a face where there isn’t one? Simple – because we spend so much of our time looking at faces. And we’ve evolved to depend on our ability to recognise and extract information from these faces.

We’ve over-learned human faces so we see them where they aren’t – Professor Takeo Watanabe, Brown University

We are hard-wired to recognise faces: our social lives have long relied on us being able to spot a face from a distance or in low light. Not only that, it’s extremely useful to be able to deduce other things from a face: mood, age, gender and the direction a person is looking. Is this person a friend, or a threat? Even as very young babies, we prefer to look at faces over non-faces.

It turns out we have an entire brain area dedicated to recognising faces – the Fusiform Face Area (FFA). This area is active when we see a face, even in blind people.

Research shows this area of the brain lights up in the same way when we see an illusory face as when we see a real face. The FFA is active when a person reports seeing a face, even when there is absolutely no pattern (in a pure noise image).

True believers

Identifying patterns is nothing new – it’s the basis of the infamous Rorschach inkblot test. But some of us are more likely to see faces than others.

A Finnish study first asked volunteers whether they saw faces in dozens of objects and landscapes. The researchers then asked about the participants’ belief systems. Did each person believe in God? And how about the paranormal? Religious people and those who believed in the paranormal were much more likely to see faces than atheists and skeptics. ‘Believers’ were also more likely to see emotions in the illusory faces.

And if you’re thinking our ability to see faces in toast, tortillas and toilets is something that makes us uniquely human, think again.

Recent research shows rhesus monkeys see faces that aren’t there too. And it makes perfect sense – if a monkey thinks it sees a tiger when there’s no tiger around, it isn’t a big deal. But the consequences of not spotting a real tiger might not be so pretty. It’s not just humans who have evolved to be highly tuned to faces.

Now given I’m hard-wired to spot faces, excuse me while I waste a bit more time indulging my pareidolia.

Links and stuff


How to rock paper and scissors


We’ve all used rock-paper-scissors to resolve a dispute. Is there a foolproof winning strategy?

Want to win at rock-paper-scissors?

The earwig and the elephant

When all else fails, rock-paper-scissors (also known as RPS) is an excellent way to make a decision. The rules are simple. At an agreed moment – usually after two or three ‘primes’ to get in sync – two people each make a symbol with their hand. The symbols are rock (closed fist), paper (flat hand) and scissors (index and middle fingers form a V to represent scissor blades).

There is an agreed hierarchy to determine who wins each round: rock breaks scissors, scissors cuts paper, paper covers rock. Despite considerable debate as to whether the rock smashes the scissors or blunts them, the outcome is the same. There are three possible results: win, lose or tie.
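
For the programmatically minded, here’s a minimal sketch of that hierarchy in Python (the function and names are my own, purely for illustration):

```python
# A minimal sketch of the RPS hierarchy described above.
# Each action beats exactly one other action; identical actions tie.
BEATS = {
    "rock": "scissors",   # rock breaks (or blunts) scissors
    "scissors": "paper",  # scissors cuts paper
    "paper": "rock",      # paper covers rock
}

def resolve(a: str, b: str) -> str:
    """Return 'A' if player A wins the round, 'B' if player B wins, or 'tie'."""
    if a == b:
        return "tie"
    return "A" if BEATS[a] == b else "B"

print(resolve("rock", "scissors"))   # A
print(resolve("paper", "scissors"))  # B
print(resolve("rock", "rock"))       # tie
```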

There’s plenty of discussion about the origins of RPS – it probably dates back to the Chinese Han Dynasty. The Japanese version, Janken, has a long history and continues to play an important role in Japan today. There are many variations on the theme: for example, the tiger, village chief and village chief’s mother (who trumps the village chief). My personal favourite – earwig-man-elephant – hails from Indonesia. Earwig beats elephant because it crawls up the elephant’s trunk and eats its brain. The game is also sometimes called Rochambeau or Roshambo (remember the South Park episode?)

Just a kid’s game?

At one level, RPS is a kid’s game, good for settling playground disputes. Among adults, it can decide who should pay for a round of drinks. But don’t be fooled: RPS is a cultural phenomenon.

International competitions offer five-figure prize money. The World RPS Society has an Internationally Recognised Throwing System of hand symbols. There’s also a Player’s Responsibility Code including the sage advice “Think twice before using RPS for life-threatening decisions.”

Why do we love RPS so much? Because despite the deceptively simple rules, there’s a whole world of strategy.

RPS is written off as a kids’ game … but when you delve into it, it’s one of the purest forms of competition that two minds can have with each other – Professional rock-paper-scissors player Jason Simmons

Let’s talk strategy

We tend to think RPS is a fair way to resolve disputes because the winner is decided by chance. Each player picks a hand symbol at random, meaning each has an equal likelihood of winning that round.

Except that we are lousy at being random. We follow patterns, and if you understand those patterns, you suddenly have a distinct advantage over your opponent. You can find a variety of RPS strategy guides, and most suggest different tactics depending on your opponent’s gender and level of experience. For example, inexperienced men tend to lead with rock whereas most women lead with scissors. If you’re not convinced experience makes any difference, play against a computer first in Novice mode, then in Veteran mode where the computer “pits over 200,000 rounds of previous experience against you.” Experience matters.

Recent studies have revealed that, on average, we choose each action about a third of the time – which is what you would expect if our choices were random. But closer inspection shows there are predictable patterns to our choices. In a study of 360 students playing 300 rounds of RPS, the patterns were clear.

Players who won the last round will most likely stick with the same action. If it worked once, it may well work again.

Importantly, players who lost the last round will most likely switch to the next action in a clockwise direction (where R → P → S is clockwise). This is the tendency you can exploit to your advantage.

Can you always win?

The latest research led to dozens of claims that with maths on your side, you can always win at rock-paper-scissors.

The latest research findings can definitely up your chances of winning. If your opponent just won with rock (because you played scissors), you should choose paper next because they are likely to play rock again. If your opponent lost, you should respond to the likelihood that they will switch from rock to paper to scissors in that order by playing the actions in the opposite order. For every scenario, the research predicts which action will most likely win you the next round.
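
Pulling those two tendencies together, here’s a hedged sketch in Python of how you might exploit them (my own rendering of the win-stay, lose-shift findings, not code from the study):

```python
# Hypothetical illustration of the counter-strategy described above:
# winners tend to repeat their action; losers tend to shift 'clockwise'
# (rock -> paper -> scissors -> rock). Predict accordingly and counter.
CYCLE = ["rock", "paper", "scissors"]
BEATEN_BY = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def counter_move(opponent_last: str, opponent_won: bool) -> str:
    if opponent_won:
        # Winners most likely stick with the same action, so counter a repeat.
        prediction = opponent_last
    else:
        # Losers most likely shift to the next action in the clockwise cycle.
        prediction = CYCLE[(CYCLE.index(opponent_last) + 1) % 3]
    return BEATEN_BY[prediction]

# The example from the text: your opponent just won with rock,
# so expect rock again and answer with paper.
print(counter_move("rock", opponent_won=True))   # paper
print(counter_move("rock", opponent_won=False))  # scissors (expecting paper)
```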

But the problem here is that if your opponent has also read up on strategy, your plans may be foiled. I don’t think you can ever be guaranteed a win.

Unless, of course, you’re a robot. The Janken robot wins RPS every time because it recognises in one millisecond what shape your hand is forming and almost simultaneously makes the winning action.

Time to channel your inner robot.

Links and stuff

Imposters are us


Have you ever felt like a fraud? That you aren’t good enough, experienced enough or smart enough to be doing what you’re doing? Join the club.

Anxious you’re about to be found out?

Feeling like a phoney

On the outside you appear confident, composed and on top of your game. But on the inside you are wracked with self-doubt. You feel like a fraud and as though someone is about to tap you on the shoulder and ask you what you think you’re doing. You’re sure your inadequacies and incompetency are just about to be revealed. Hello Imposter Syndrome.

In 1978, psychologists Pauline Rose Clance and Suzanne Imes described this phenomenon among high-achieving women. Five years of research had highlighted how many successful women believed their success could be attributed to luck, chance or errors in selection processes. These women were sure their abilities had been overestimated. Clance went on to create a scale to quantify the experience of imposter syndrome.

Researchers have been studying the imposter syndrome ever since and we now know a lot about it. Despite the fact the imposter syndrome was first described in women, we now know men are just as likely to experience it. One study among university academics (a profession in which the imposter syndrome is rife) found men were even more likely than women to experience imposter syndrome.

Research suggests 70% of people will experience feelings of being an imposter at some point in their lives. And it’s worth pointing out this phenomenon doesn’t qualify as a syndrome according to the medical definition. According to Clance, if she could go back she would call it the Imposter Experience since nearly everyone experiences it and it’s not a mental illness.

Come one, come all

An important aspect of the imposter syndrome is the fact each of us tends to think we are the only one suffering from it. We listen to the monologue of self-doubt going on in our own heads and mistakenly assume it’s just us. But the truth is most people feel this way at least some of the time. When Olivia Fox Cabane asks incoming students at the notoriously highly selective Harvard Business School each year ‘How many of you in here feel that you are the one mistake the admissions committee made?’, two-thirds of the students put up their hand.

(And of course, it’s somewhat terrifying to consider the possibility everyone around us is just winging it. We like to think the people out there flying planes, performing surgery, making decisions in court and running our governments are highly competent.)

There are a number of common features among people experiencing imposter syndrome. The vast majority of ‘imposters’ are able to successfully fulfill their work requirements despite their perceptions of incompetency. In fact, many ‘imposters’ are high achievers who fail to internalise their success. Despite ample objective evidence of their achievements, ‘imposters’ still feel like they are making it up as they go along and fear they are about to be unmasked.

Other features of imposter syndrome are a fear of failure; a tendency to attribute success to luck, error or charm; and the feeling of having given others a false impression. People with imposter syndrome often experience anxiety, may be perfectionists and may also fall victim to procrastination. ‘Imposters’ often feel they need to stand out and be the very best compared with their peers. Interestingly, people suffering imposter syndrome crave praise and acknowledgement but feel uncomfortable when they receive it (because they feel they don’t deserve it).

Although men and women experience imposter syndrome in equal numbers, they do respond to it differently: women commonly work harder to try and prove themselves whereas men tend to avoid situations in which their weaknesses might be exposed.

Fake it ’til you make it

The most common advice we hear about how to deal with feelings of inadequacy is to ‘fake it ’til you make it’. Pretend you feel confident and assured and ignore the nagging doubts. It’s not bad advice, but we may be waiting a long time to feel like we’ve ‘made it’. The frustrating irony of imposter syndrome is that the more experienced and senior you become, the more likely you are to find yourself being required to do new things, and therefore feeling like you’re winging it. Getting better at your job isn’t a guaranteed way of making your imposter syndrome go away.

There is plenty of other advice for coping with imposter syndrome. For example, learn to accept compliments, talk with the people around you about how you feel and remember that feeling like a fraud is completely normal. It can also be useful to focus on what you’re learning, rather than how you’re performing. This is mindset theory: if you focus on how you’re performing, you see any mistakes you make as evidence of your inadequacy. But if you have a growth mindset, your mistakes are simply part of the inevitable learning process.

And if all else fails, take heart from the words of an expert in the field:

Impostorism is most often found among extremely talented and capable individuals, not people who are true impostors –Associate Professor Jessica Collett

Links and stuff


It’s a seasonal thing


We all know horoscopes are bunk. But research suggests the season you were born may say something about you after all.

What season were you born in?

What’s your star sign?

Is checking your horoscope a guilty pleasure? Most of us agree: there’s no science behind the star sign columns that abound in newspapers, magazines and online. But that doesn’t mean people don’t read them. One study found more than 20% of adults read their horoscope often. There’s also been plenty of research into how many people believe in horoscope predictions, and the numbers are higher than you might think. This is despite the fact we know full well that predictions based on astrology are no better than those based on chance.

Astrological predictions aside, it turns out there are other interesting correlations between the season in which you were born and your personality and health. Research has found a variety of factors including your mood, risk of suffering allergies and how long you are going to live may be influenced by when you were born.

The seasons of life

Researchers have found a number of correlations between season of birth and personality traits. People born in summer are more likely to experience frequent mood swings. Those born in summer and spring are more likely to be excessively positive. But people born in winter are less likely to be irritable than those born at other times of the year.

We know people born during the winter months are more likely to have Seasonal Affective Disorder (‘winter depression’), bipolar disorder and schizophrenia. One fascinating study also found differences in brain structure between people born in different seasons. One very large study found relationships between month of birth and risk of developing a disease for 55 different conditions including MS, diabetes and heart conditions.

When you were born may also influence how long you are going to live. In the US, you are more likely to live to 100 if you were born between September and November. A Danish study found remaining life expectancy at 50 was higher for people born in October or November. In the southern hemisphere, the pattern is shifted by half a year: in Australia, people born in May and June are likely to live longer. Interestingly, British immigrants to Australia fit the pattern for the northern, not southern, hemisphere.

Your seasonal brain

We don’t fully understand how the season you were born in influences your personality and health, but there are a few possibilities. Sunlight varies across the year, and the amount of sunlight you are exposed to influences your vitamin D levels, which may play a role: we know people born in winter have lower levels of vitamin D in adulthood than those born at other times of the year. The timing of your birth may also be linked to how likely your mother was to experience seasonal illnesses during pregnancy. Levels of important chemical messengers in our bodies, like serotonin and dopamine, also vary according to the season of birth.

There’s also evidence from studies of mice that the season you are born in may have an ongoing effect on your body clock. Mice raised in a winter light cycle had more trouble adapting to a seasonal shift in light than mice raised under summer conditions. This has led to an intriguing suggestion: perhaps the cycles of night and day we experience during the time our brains are developing may affect our personality.

We know the seasons influence us in other ways too: for example, how our eyes perceive different colours, and possibly even how effectively we can pay attention and memorise things.

So although I’m not advocating you spend any time reading your star signs, it may be worth considering that the seasons could be having more of an effect on us than we realise.

Links and stuff

To eat it or not to eat it?


We’ve all heard the five-second rule: as long as the food has been on the ground for less than five seconds, you’re safe to eat it. But are you?

Would you eat this biscuit off the floor?

Birth of an urban myth

When it comes to the origin of the five-second rule, stories abound. Genghis Khan often stars in these accounts, with claims it was originally known as the Khan rule. This rule apparently came into play at 13th-century victory banquets when Khan declared that as long as food had been on the ground for less than twelve hours, it was safe to eat. Another story centres on celebrity chef Julia Child famously advising on her TV show The French Chef what to do if you drop food while cooking. ‘You can always pick it up, and if you’re alone in the kitchen, who is going to see?’

One survey found 70% of women and 56% of men are aware of the five-second rule and use it to make decisions about whether to eat dropped food. Another found 87% of people would, or already have, eaten food that’s been on the ground. The five-second rule was the focus of a Mythbusters TV segment and even made it into a Volkswagen commercial.

But is there any science to back up the claim that it takes more than five seconds for bacteria to attach to dropped food?

Testing, testing

I found a couple of tests of the five-second rule, with conclusions both for and against. In 2003, high school student Jillian Clarke set out to examine the rule as part of her science internship. Her conclusion: university floors are remarkably clean. But if you do drop food on a floor containing germs, your food will become contaminated in less than five seconds.

Another study concluded there is definite truth to the five-second rule: food picked up within a few seconds of being dropped is less likely to be contaminated than food left for longer. But a 2006 study found a sausage dropped on a tile floor picked up 99% of the bacteria present within five seconds.

A detailed study published last year found some food picks up bacteria within a second of landing on the floor. The scientists tested four foods (watermelon, bread, buttered bread and a jelly sweet), four surfaces (stainless steel, ceramic tile, wood and carpet) and a variety of contact times from less than a second to five minutes. Unsurprisingly, the moister the food, the more it got contaminated. The message: next time you drop watermelon on the floor, put it in the compost.

All floors aren’t equal

One of the biggest influences on whether bacteria end up on food is not how long the food has been on the floor, but rather what floor the food falls on. And I’m not just talking about how often you vacuum or mop: different floor surfaces are more or less friendly to bacteria. Interestingly, the researchers found much lower contamination rates from carpet onto food than from tiles or stainless steel.

Clearly, what sort of food you’ve dropped and where you’ve dropped it matter just as much as, if not more than, how long you’ve left the food on the floor.

But it’s worth bearing in mind your floor is probably far cleaner than many of the other surfaces around your house. Study after study has highlighted how many bacteria live on the surfaces we touch regularly: money, mobile phones, remote controls, supermarket trolleys, computer keyboards and, often the most contaminated, kitchen cloths. There are a whole heap of bacterial hotspots you are probably touching before you eat without a second thought.

It’s highly likely your floor contains bacteria, and almost certain any food you drop on that floor will very quickly pick up those bacteria. Most bacteria won’t do you any harm, but some will make you very sick.

Whether eating food you’ve dropped on the floor is more likely to make you sick than any other number of things you do every day is a decision only you can make.

Links and stuff

Who is watching you?


Ever had the creepy feeling someone was staring at you, only to turn around and discover it was true? Is it possible to ‘feel’ someone watching you?

Feel like someone might be watching you? Image credit: Seth Showalter via Tookapic

I see you

They say the eyes are the windows to the soul: eye contact is one of the most powerful forms of human communication. Unlike all other primates, human eyes make it extremely obvious which direction we are looking: the exposed white area around the coloured iris gives it away, even from a distance. In most other animals, the iris takes up almost all of the eye, or the area around the iris is darker. Either way, it’s very hard to tell which direction animal eyes are looking, which probably evolved as a way for predators to hide the direction of their gaze from potential prey.

From their very first days, babies prefer faces looking directly at them. And from an early age, our brains respond more to a face looking directly into our eyes than one looking away. We have a ‘gaze detection’ system – a network of nerves in our brains sensitive to whether someone is looking right at us, or just past us. Researchers suggest we have evolved to be so tuned into the direct gaze of other people because it forms the basis of human cooperation and social behaviour.

Sixth sense?

But what about the feeling of being watched? According to surveys, up to 94% of people report having experienced the feeling of being stared at, only to look up and discover it was true. If you’ve had the experience, you’ll know it can feel like ESP or a sixth sense. The phenomenon has also been the subject of plenty of research – the earliest paper I could find on the topic was published in 1898!

Early studies dismissed the idea, declaring that the feeling of being watched was simply a result of being nervous about what may be going on behind your back. There are also other straightforward possible explanations. It could be that in your peripheral vision, you’ve noticed the tell-tale signs of someone looking in your direction. We are enormously sensitive to body language. If someone’s body is facing away from us, but their head points towards us, it’s an immediate giveaway they may be looking at us. At the very least, it makes us look up to get more information.

It may also be a self-fulfilling prophecy: if you’re nervous someone is watching you, you may start fidgeting. Your movement alone will make it more likely someone will indeed look at you. There’s also the possibility that as you sit on the train feeling like you’re being watched, the very act of you looking up from your phone makes the person opposite you look up too. When your eyes meet, you wrongly assume this person has been looking at you all along. Research has shown we are hard-wired to assume other people are looking at us, even if they’re not. The argument goes we evolved to think this way because it keeps us alert to danger and ready to interact.

But there’s also a far more intriguing possibility: perhaps your brain has detected someone else’s gaze without your eyes seeing a thing.

Your brain is watching

The evidence for this somewhat unnerving idea comes from studies involving people who have lost their visual cortex due to brain injury. The visual cortex is the part of your brain that processes visual information and maps your view of the world.

One fascinating study reported the experiences of a man who is ‘cortically blind’. Although his eyes are fully functional, and send information to the brain, his visual cortex was damaged by two strokes. As a result, he doesn’t have what we think of as sight. But what do our eyes take in beyond what our sight shows us? Quite a lot, as it turns out – a phenomenon called blindsight.

Although this man (known as TN) can’t see, researchers showed him pictures of faces, some looking right at him, some looking away. At the same time, they measured activity in his amygdala – a part of the brain involved in processing emotions and faces. Even though TN couldn’t consciously see the pictures and said he couldn’t tell the difference, there was more activity in his amygdala when he was presented with a face looking directly at him. His brain knew when someone was looking right at him, even when he couldn’t consciously see it.

What an extraordinary trick our brains are capable of. As long as a person looking at us is within our field of view, we can sense it, without consciously noticing anything.

Perhaps this explains the creepy feeling of being watched, no supernatural explanation required.

Links and stuff

Itchy and scratchy


We all know the sweet relief that comes from scratching an itch. But why does scratching feel so good? And why does it make the itch worse?

When you really have to scratch that itch! Image credit: lintmachine via Flickr

Why so itchy?

An itch is simply a sensation on the skin leading to the desire to scratch. But the science of itching is far from simple – it’s been the subject of decades of research (there’s even a scientific research journal called Itch). Researchers discovered the first gene associated with itching ten years ago, but there’s still plenty we don’t understand.

Itching can have many causes: from minor insect bites and skin irritations, through to various cancers and the treatments for those cancers. For most of us, itches are just a minor annoyance, barely noticed throughout the day. But for others, itching is a chronic condition that has a serious impact on quality of life. And chronic itching can lead to continuous scratching and severe skin damage.

Scratching the itch

On the face of it, dealing with an itch is straightforward: you scratch it. And hey presto, all is well.

Scratching an itch with a violence that would cause pain elsewhere may be experienced as one of the most exquisite pleasures. – Dr. George H. Bishop (1948)

But then all too quickly, the relief fades. Only to be replaced by an even more insistent itch. Remember being told as a kid that scratching your itch would make it worse? Well, your parents were right.

The reason scratching temporarily relieves a pesky itch is that scratching causes mild pain. This means your nerves carry pain signals, rather than itch signals, to your brain, and for a brief moment, the itch is gone. The pain distracts from the itch. But the problem is, your brain releases the chemical serotonin in response to the pain.

Serotonin explains how you can end up scratching until you bleed. You scratch because the pain of scratching inhibits the itch. But because your body releases serotonin in response to the pain, it takes more and more vigorous scratching to suppress the itch. At the same time, the serotonin also acts to intensify the itchy feeling. It’s a vicious cycle, which is why the best thing you can do is not start scratching in the first place.

Mice who can’t produce serotonin scratch far less in response to being itchy than normal mice. But if you’re thinking blocking serotonin would be a great way to prevent itching, not so fast. We need serotonin for all sorts of roles in the body: growth, ageing and regulating moods. Getting rid of serotonin would have far more serious consequences than relieving itching.

Catching the itch

Have you started scratching yet? Just reading about scratching can be enough to make you suddenly feel itchy. Audience members in a talk about itching itched more than those listening to a talk about a more innocuous subject. And if you see someone else scratching, you’ll probably copy without even being aware of it. It’s not a sign of a lice outbreak (hopefully), you’ve just fallen victim to socially contagious itching. Like yawning, mice, monkeys and people can all ‘catch’ itching.

There’s been plenty of research into why itching may be contagious. For example, research found that contagious scratching is more common among highly neurotic people.

Research published earlier this year showed that in mice, contagious scratching appears to be a hard-wired, instinctive behaviour. If a mouse sees another mouse scratching – even if the mouse is an image on a screen – the first mouse also starts scratching within a few seconds. A chemical is released in an area of the brain responsible for regulating mouse body clocks (the suprachiasmatic nucleus), which causes the scratching. Inject this chemical into another mouse and it too will start scratching. The mouse isn’t consciously deciding to scratch; it’s an automatic response to the change in brain chemistry.

Hard-wired behaviours tend to be important ones, and there’s probably a good explanation for why we’ve evolved to ‘catch’ scratching. We’re highly sensitive to potentially dangerous things like insects touching our skin. If someone near you is scratching, it may well be a sign of something you should respond to, perhaps a swarm of mosquitoes. Better to start scratching now and avoid getting stung.

But then again, if you don’t want the itch to drive you crazy, now may be a good time to look away.

Links and stuff

By the light of the full moon


Many people believe a full moon triggers strange behaviour – a belief known as the Transylvania Effect. Crime rates, psychiatric hospital admissions, emergency room visits, dog bites and hyperactivity in kids are all said to increase during a full moon. Is there any truth to the rumours?

Do strange things coincide with the full moon? Image credit Brad Scruse via Flickr

Bad moon rising

We can thank the Roman goddess of the moon, Luna, for the words lunatic and lunacy. Why the link between the moon and erratic behaviour? Aristotle and Pliny the Elder both suggested that because the human brain is so moist, it responds to the same pull of the moon that drives our tides (which we now know not to be true). In his 1978 bestseller How the Moon Affects You, psychiatrist Arnold Lieber also argued that because our bodies are mostly water, the moon has a strong influence on us. How strong an influence? The story goes that in 19th-century England, lawyers could invoke the defence ‘not guilty by reason of the full moon’. I doubt any lawyers argued their clients had metamorphosed into drooling, snarling werewolves, but the argument is clear: under a full moon, people are no longer in control of their behaviour.

Sounds far-fetched, doesn’t it? But the myth of the Transylvania Effect is alive and well. One survey found 43% of US college students believed people were more likely to behave strangely on a full moon night. The same study found mental health professionals were more likely to believe this than people working in other areas. In another survey, 80% of emergency department nurses and 64% of emergency department doctors believed the moon affected behaviour. Ninety-two percent of the nurses said they found working on the night of a full moon more stressful, and that they should be paid more for working those shifts.

The moon made me do it

In 2007, police in the UK employed more staff to be on patrol on full moon nights claiming ‘Research carried out by us has shown a correlation between violent incidents and full moons’. There are thousands of published research papers exploring the link between the phase of the moon and a huge variety of events and behaviours. The most striking thing about all this research is the almost complete lack of evidence for the moon having any effect on us.

Despite the fact many people believe the time of the full moon is the worst for undergoing surgery, there is no evidence a full moon affects the outcomes of heart surgery, the rate of surgical complications, levels of post-operation nausea and vomiting or post-operative pain. There is no evidence for increased crisis centre calls during full moons and no link between moon phase and frequency of epileptic seizures. Research shows no increase in emergency department admissions or calls for ambulances during a full moon, nor do more people see doctors for anxiety or depression. There is also no relationship between moon phase and birth rates.

Research has found no link between the phase of the moon and psychiatric hospital admissions. A full moon doesn’t lead to increased violence in psychiatric settings, nor is there evidence for a link between moon phase and suicides or suicide attempts. No link has been found between full moons and traffic accidents, and dog bites requiring hospitalisation occur no more often during a full moon than at any other time. Kids aren’t more hyperactive at the full moon, and there are no more murders or other crimes when the moon is full.

The full (moon) truth

If nothing else, research over the past 50 years has provided us with a detailed list of all the human behaviours not influenced by the phase of the moon. Given the lack of evidence, why does the myth persist? It may simply be the result of illusory correlation – if something unusual happens on the night of a full moon, we are more likely to take note and remember it than when nothing unusual happens under a full moon. Confirmation bias probably also plays a role: if you already believe in the Transylvania Effect, you’ll pay particular attention to any information that supports your belief.

But there’s another intriguing possibility. We have some evidence people sleep a little less on full moon nights. One study found people slept an average of 19 minutes less during a full moon; another study recorded a decrease of 25 minutes of sleep on these nights. Study volunteers took longer to fall asleep and slept less deeply at the time of the full moon, even when they couldn’t see the moon and didn’t know the moon was full. They also complained of feeling less rested on waking after a full moon. A 2-year study of nearly 6000 children living on five continents found kids also slept a little less on nights with a full moon.

Perhaps this is the source of the myth. In times gone by, before our body clocks were under the influence of electric lights and screen time, a full moon may have been very disruptive to sleep. Researchers have suggested the Transylvania Effect may have its root in the fact that someone with bipolar disorder or a seizure disorder may be highly susceptible to mania at times of sleep deprivation.

Seems plausible but fortunately I don’t think we’ll be hearing the defence ‘not guilty by reason of not having had quite enough sleep last night’ anytime soon.

Links and stuff