Songs on repeat

Myths / Psychology

We all do it: find a new favourite song that then gets heavy rotation on our playlists. Mine’s currently Postmodern Jukebox’s ’50s prom-style cover of Closer by The Chainsmokers. Tara Bautista on the science behind why we choose to play songs on repeat and what happens when we do.

Why we smash that replay button. Photo by Bruce Mars on Unsplash.


Play it again, Sam

Sixty percent of participants in a study last year admitted they immediately re-listen to their favourite song, some even playing it up to four times in a row.

But what makes a song worthy of replay and why do we do it? One answer is the emotions they make us feel.

A study by the University of Michigan revealed three broad categories of favourite songs and why people were drawn to them.

People whose favourite songs were happy and energetic liked the beat whereas calm songs prompted listeners to reflect and gain perspective. But the most listened-to songs by far were the bittersweet ones. Unsurprisingly, the greater your emotional connection to a song, the more you hit replay.

Certain songs become our favourites because they resonate with us. Listening to our favourite music repeatedly can reinforce our identities, argues Kenneth Aigen, who heads the music therapy program at NYU.

Another answer is that repetition increases how much we like things.

In the 1960s, Robert Zajonc first described a phenomenon called the ‘mere exposure effect’: we tend to like things – be they pictures or melodies – more the second or third time we come across them.

We rate songs familiar to us as more likeable than unfamiliar ones. Since our brains are faster at processing the sounds during re-listens, we are ‘tricked’ into thinking the ease of recognition is a positive thing. Familiarity also increases our emotional engagement with the music.

What’s more, repetition and familiarity with the music allow us to actively participate, not just passively process the sounds.

Professor Elizabeth Margulis, author of On Repeat: How Music Plays the Mind, says “Musical repetition gets us mentally imagining or singing through the bit we expect to come next.”

Indeed, brain imaging studies show that other parts of the brain besides those which process sound light up when we listen to familiar music. These include brain regions that anticipate where the song is going and those usually associated with body movement. Our mind effectively ‘sings along’.

Neuroscientists have also described two stages when we listen to music that gives us the chills. The brain’s caudate nucleus is excited during the build-up to our favourite part. When the peak hits, another part called the nucleus accumbens releases the feel-good brain chemical dopamine.


Can’t Get You Out of My Head

There are songs you deliberately play again, and then there are earworms.

Earworms, or involuntary musical imagery, are short bits of music that we hear on a seemingly endless loop in our heads. They’re usually songs we’re familiar with, and they can be set off by mental associations. For example, parents of toddlers may see the word ‘shark’ and automatically hear ‘doo doo doo doo’.

It turns out you’re more likely to experience earworms if you listen to music and sing on a daily basis – and also if you show obsessive-compulsive traits and score high in openness to experience.

Psychologists analysed data from 3000 participants who were asked which songs gave them earworms. Popular and recent songs were among the top ten, with Lady Gaga’s Bad Romance taking out top spot.

The researchers then analysed MIDI files of the reported earworm-inducing songs to see what features they had in common. On average, the songs were upbeat, with melodies that were familiar yet unique in terms of the intervals between the notes. In other words, a melody has to sound like something we’ve heard before, but be different enough to remain interesting and memorable.

I Will Always Love You (… or not)

You’ve listened to it more than a dozen times now, but suddenly your favourite new jam isn’t such a hit. What gives?

Most pieces of music that we’ve liked initially, but then are overexposed to, follow the classic Wundt curve. This is when the pleasantness of an experience increases for the first few exposures but then decreases with further exposure. (Pharrell Williams’ Happy, anyone?)

As our brains figure out the complexity of the song, the novelty wears off and we become bored.

This might be why more complex pieces of music like Bohemian Rhapsody by Queen (coincidentally number seven in the list of earworms) have longer-lasting popularity, according to music psychology expert Dr Michael Bonshor.

Although simpler song structures are catchier in the short-term, more complex layers of harmony, rhythm and vocals provide more opportunities to keep our brains interested.

And as for getting rid of earworms?

Some researchers suggest singing another song, like God Save the Queen, or Karma Chameleon.

But then again, karma karma karma karma karma … d’oh!

Links and stuff

How super is your superfood?

Health / Myths

Chia seeds, goji berries, kale, blueberries, turmeric, green tea, quinoa, açaí (which I’ve now learned is pronounced ‘ah-sah-ee’). There’s no shortage of so-called superfoods. But what does that term mean, and are superfoods worth your money?

So much healthy goodness… all in one bowl. Photo by Chef Mick (Michaelangelo) Rosacci via Flickr.

All in a name

Need to lose weight? Want to slow down ageing? Want to minimise your risk of cancer and other diseases? These days you could be forgiven for thinking all you need to do is eat the latest miracle superfood and your ongoing good health is guaranteed.

Superfoods are marketed for their high nutritional value and extraordinary health benefits and the marketing appears to work. A UK survey found 61% of people bought particular foods because of their superfood status. The study also found people are willing to pay more for superfoods. Research suggests we see superfoods as somewhere between food and medicine and that the higher your socioeconomic status, the more likely you are to eat superfoods.

It pays to know there is no scientifically agreed meaning of the term ‘superfood’: it’s not a legal or regulatory term like ‘organic’ or ‘fair trade’. Superfood is a buzzword created to make us believe a particular food has crossed some nutritional threshold of goodness. But it hasn’t.

Interestingly, using the term superfood on food packaging or in marketing isn’t regulated in Australia or the United States. But there have been very tight restrictions around its use in the European Union since 2007.

What’s so super?

One reason superfoods are so alluring is because we know modern western eating habits are often less-than-ideal. We’re well aware of the steadily rising rates of obesity, type 2 diabetes, heart disease and other diet-related conditions. Foods and diets that appear to be more natural, primitive, traditional and authentic are inherently appealing.

It’s no coincidence that many superfoods have a history of traditional use by ancient or indigenous communities. For example, maca powder – often advertised as an ‘Incan superfood’ – comes from the root of a Peruvian vegetable and is said to improve energy, stamina, fertility and sex drive.

Australian researcher Jessica Loyer describes the packaging of one popular brand of maca powder: women in traditional Andean clothing picking vegetables by hand and placing them in woven baskets. In reality, men, women and children wearing jeans all harvest maca into plastic bags, using tools and in some cases, modern machinery.

Another incredibly appealing aspect of superfoods is the notion that by adding them to your diet, you can compensate for your other bad habits. No need to change your doughnut-a-day habit if you add goji berries to your muesli. The muffin you had at morning tea was healthy because it was packed full of blueberries.

And although superfoods may be advertised as green, sustainable products, unfortunately the opposite can be true. As demand for these superfoods increases, so do the potential profits: it’s no wonder some farmers are abandoning their traditional production methods. But the resulting monocultures, soil degradation and increased use of fertilisers and pesticides are resulting in serious environmental and ecological problems.

What’s the evidence?

Why do some foods end up being touted as superfoods? Often because research has shown a particular component of that food does have specific health benefits. But the problem is the evidence generally comes from animal trials or lab experiments that don’t well represent the complexities of the human body. Just because a particular ingredient kills cancer cells in a lab experiment doesn’t mean eating lots of food containing that ingredient will stop you getting cancer.

There are experiments on people, but they tend to test the effect of very high amounts of a particular ingredient over a short period. For example, an ingredient in cinnamon, cinnamaldehyde, has been shown to reduce cholesterol in people with diabetes. But as nutrition research scientist Emma Beckett has calculated, you’d need to eat about half a supermarket jar of cinnamon each day to get the same amount of cinnamaldehyde as in the study.

It’s a similar story with blueberries. Blueberries (like red wine) contain resveratrol, a compound claimed to reduce the risk of heart disease, Alzheimer’s, cancer and diabetes. But to get enough resveratrol for the potential benefits, you’d need to eat roughly 10,000 blueberries a day. That would be both time-consuming and expensive!

So what should we eat?

There’s plenty of information out there on what constitutes a healthy diet. And if you can afford it and like the taste, there’s no reason not to include some ‘superfoods’ in your diet. But if kale doesn’t do it for you, you can also get essentially the same nutrients from cabbage, spinach, broccoli or brussels sprouts. Nutritionists tell us variety is the key to a healthy diet.

Personally, I like the idea of the rainbow diet – making sure I eat fruits and vegetables of a variety of colours. And I reckon Michael Pollan’s adage is worthy of its fame: ‘Eat food. Not too much. Mostly plants’.

Links and stuff


The lowdown on the sunk cost fallacy

Evolution / Myths / Psychology

Waiting in line at the grocery store, you hesitate to move to another counter when one opens up. Even though you’re not looking forward to seeing it, you head to a movie you’ve already paid for. You stay in a career or relationship that no longer ‘sparks joy’. Tara Bautista asks why we cling on when we should let go.

Why do we choose to stay on board? Photo by Jack Blackeye on Unsplash.

Going down with the ship

Humans are weird creatures. We prize our supposedly unrivalled ability to tackle problems and come to decisions through logic and reason. Yet we often fall prey to logical fallacies.

A common psychological trap you’ve probably fallen into is the sunk cost fallacy. It’s also known as the Concorde fallacy.

The sunk cost fallacy is when we continue with an endeavour even though it makes us unhappy or worse off because of the time, money, and effort – i.e. the ‘sunk costs’ – we’ve already invested in it.

In other words, it’s every situation in which you’ve ever thought: “But I’ve already come so far…”

The more you’ve invested, the more you’re likely to stick with it. In a study of US basketball drafts, higher paid draft picks were given more play time than lower paid ones even if their performances weren’t up to par.

Interestingly, one study showed the sunk costs we honour don’t even have to be real, or even our own. In an imaginary situation where people were told they had accidentally been booked on two trips, one to Cancun ($800 in value) and the other to Montreal ($200), people were more likely to choose Cancun even if they preferred Montreal.

In another scenario, people were told to imagine they felt full but hadn’t finished eating cake at a party. More often than not, they would push to finish it if they were told the cake had been bought from an expensive and faraway bakery.

But why?

Plenty of researchers have proposed explanations for this counterproductive behaviour.

One is simply that we don’t like to be wrong. There’s a mental disconnect between the effort we put in and the lower than expected return on our investment. As a result, we exaggerate the benefits to justify the cost.

For example, there’s a greater chance you’ll give five-star ratings for videos of kittens, dancing and bike crashes the longer you’ve spent waiting for them to load.

We also want to convince ourselves and others that we’re not wasteful or ‘quitters’.

We still go ahead with the sinking ship despite feeling bad about it. One study goes so far as to say that the negative feelings we have about our sunk costs actually drive the behaviour. The researchers found the likelihood of people sticking to their guns was predicted by how bad they felt immediately after making a decision about sunk costs.

We may also fear the regret we might feel by withdrawing our commitment.

Some researchers propose there’s an evolutionary advantage to whatever mechanism is driving the phenomenon since mice and rats also exhibit the behaviour. They too experience ‘regret’ about sunk costs and show reluctance to quit lining up for food the longer they’ve waited.

Looking to the future to avoid getting snagged

The thing is, dwelling on sunk costs means being caught up in the past, when we should be focusing on present actions and the potential gains from them.

Economic theory suggests that “…decisions should only be guided by future gains and losses, as prior costs do not affect the objective outcomes of current decisions.”

We ‘feel’ our way into the trap, but we can think ourselves out of it.

Reflecting when we’re not mentally overwhelmed helps us to avoid the sunk cost fallacy. People who justify their decisions feel less bad about poor investments and are less susceptible to the sunk cost fallacy than people who don’t justify them.

And speaking of thinking (and perhaps overthinking): people high in neuroticism are less prone to the sunk cost fallacy. This is because they tend to avoid stressful situations by engaging in tasks other than the one responsible for the stress. On the other hand, people low in neuroticism experience less stress, resulting in less incentive to change their behaviour and greater chance of continuing with bad investments.

When things go wrong, Western society has a tendency towards action – to do something rather than not. In fact, we have a tendency to commit even more resources the more desperate the situation is.

But researchers found we can prevent this by reframing how we think. We are less likely to commit more resources to a failing venture if we are taught to think of committing more resources as ‘inaction’, and of not committing as an ‘action’ in itself.

I don’t know about you but changing the way I think about a sinking ship isn’t such a titanic effort when I imagine doing nothing and being buried in the heart of the ocean!


Links and stuff

The truth about early birds

Anthropology / Biology / Evolution / Genetics / Health / Myths / Psychology

We all know the saying ‘The early bird catches the worm’. But are there any real benefits to being a morning person? What determines whether you’re an early bird or a night owl anyway? And can night owls become early risers if they want to?

Embracing the dawn. Photo by Pablo Heimplatz on Unsplash.

Tick tock

Your body has an internal clock. It’s located in the base of your brain, in the hypothalamus. You’ve probably heard the term circadian rhythm: this is the natural sleep and wake cycle of all animals which is synced to the Earth’s 24-hour cycle.

Unlike nocturnal animals, people are generally awake during the day. But there are differences in when we prefer to sleep. Your preferred sleep and wake times are your chronotype. Most people start their lives as early birds – many of us have personally experienced how early babies tend to wake up. But wake times usually shift later as we age, with teens notorious for their late nights and long sleep-ins.

According to the sentinel hypothesis, a tribe of humans with staggered sleep schedules was at an evolutionary advantage: there was always someone wide awake and ready to stand guard. Today, some of us are early risers, some of us late risers, and many fall somewhere in between.

All in the genes?

We’ve long known genes play a role in determining chronotypes, but a big study published last year made clear how complex the link is. Researchers studied the genomes of nearly 700,000 people in the US and UK and identified 351 areas in the genome that contribute to whether someone is an early riser.

But people who are genetically most likely to be early risers only wake on average 25 minutes earlier than those who aren’t, so there are clearly a lot of other factors at play too.

A study published just last week found that one in 300 people has what’s called Advanced Sleep Phase, waking naturally well before dawn (between 3 and 5am). The researchers found for roughly one in 500 people, this super early rising runs in the family. These numbers are higher than we previously thought and also suggest a genetic role. My Dad and I are classic examples of this condition. People find it hard to believe but I wake up at 3:30am without an alarm.

The benefits of being an early bird

Although we now understand being a night owl has nothing to do with laziness, there are definitely some costs that come with staying up late.

One of the main problems is now known as social jet lag. Normal jet lag is the result of flying to a different time zone and experiencing a mismatch between the time your body thinks it is and the actual time in your new location.

Social jet lag is the mismatch between many people’s body clocks and the waking hours they are forced to keep because of school or work. If your preferred sleep time is 2am to 10am, having to be up at 7am for work isn’t going to do you any favours.

But there are other problems associated with being a night owl. Night owls consume more caffeine, nicotine and alcohol. A large twin study found night owls were much more likely to be current and lifelong smokers.

Morning people also tend to have better mental health. Night owls are at greater risk of suffering both depression and schizophrenia.

A big international review of studies found evidence that night owls may be at higher risk of heart disease and type 2 diabetes. The study found those who stay up later at night tended to have more erratic eating habits (for example, missing breakfast) and ate more unhealthy food.

Research suggests there are also personality differences between morning and night people. Morning people tend to be more proactive, persistent and cooperative, whereas night owls tend to be more creative and better at risk-taking.

Can you make the switch?

If you’re a night owl who’d like to get up earlier, you probably can if you’re willing to change your habits.

A recent study of young people whose average preferred sleep time was 3am to 10.30am showed it’s possible to substantially shift sleep times in the space of only one month.

These study volunteers were asked to follow a series of rules: get up two to three hours earlier, have breakfast as soon as they woke up, avoid caffeine after 3pm, maximise the amount of time spent outdoors in light during the morning, and not sleep in on weekends. Amazingly, all the participants stuck to the program. Quite possibly because they began to feel much better: less sleepy, stressed, anxious and depressed. Instead of feeling at their best in the evening, they now hit their peak in the mid-afternoon.

Other research has shown that when you take people camping and deny them access to all artificial light other than a campfire, their body clocks quickly change. Within a week, former night owls feel sleepy much earlier in the evening and wake with dawn.

If you’re a night owl who can’t or doesn’t want to make the switch, take heart: if you ever move to Mars, you’ll probably cope far better with that planet’s 24-hour, 39-minute solar day. Extreme morning types like me are likely to find it much harder to adapt to the 40-minute shift.

Links and stuff

Emojis at face value

History / Myths / Psychology

If you’re reading this, chances are you’re familiar with emoji and have used them in an 📧, or text. How did this visual ‘language’ come about and how will it impact communication in the future? Tara Bautista explores the history and 🔬 of emojis.

👍 or 👎 them, emojis are the fastest growing ‘language’ in the world. Photo by Lidya Nada on Unsplash.

Smiley faces

“How r u?”

“I’m OK.”

How many of you have received a reply like this and wondered what OK meant?

It could mean anything from “Life’s all good” to “I’m barely surviving, but there’s no need to call emergency services just yet”.

Now how about:

“I’m OK 😊.”

Isn’t that more reassuring?

The simple inclusion of a smiley face emoji adds a depth of emotion that is often lost in text-based messages. This convenience in communication is the reason why these small pictures were introduced in the first place.

The 📈 and 📈 of emojis

The first widely used set of emojis was created in 1999 for the Japanese phone operator NTT DoCoMo. The word ‘emoji’ is an amalgam of the Japanese words for picture (‘e’) and letter/character (‘moji’).

Creator Shigetaka Kurita noticed that despite the potential of digital communication for bringing people closer, written text didn’t provide cues about feelings and intent that we can pick up from hearing people’s voices or seeing their faces and body language.

Kurita devised 176 simple ideograms made of 12 x 12 pixels, which included faces that conveyed emotion. Other emojis represented things we commonly talk about, like traffic and weather.

Today over 90% of the world’s online population uses emojis. There are currently 3053 emojis approved by the Unicode Consortium, the governing body of emojis, whose voting members include IBM, Microsoft, Google and Netflix.

(Fun fact: Anyone can propose new emojis but the approval process is lengthy – up to 18 months. A new emoji must satisfy criteria including that many people are likely to use it, and that it conveys something different to existing emoji.)

Such is the power of this visual ‘language’ that the ‘😂’ emoji was chosen as the 2015 word of the year by Oxford Dictionaries.

The 👍 case for emojis

Emojis act as an addition or substitute to written language. They fulfil the role that tone of voice, facial expressions and body gestures do in face-to-face and verbal communication.

Emojis may be simple pictures, but they can express a nuanced range of emotions. Consider how you would tell someone you’re happy. There are many options, including 🙂, 😃 and 😄. Even using non-face emojis can help convey your mood.

On the other hand, the same emoji can mean many things in different contexts. For example, 😕 might be taken as confused, neutral, sad or unhappy.

But the popularity of emojis is undoubtedly due to the fact they are easy to use and understand. Their meanings largely transcend different cultures, ages (and operating systems!).

A 2015 survey of 2000 British young people found that 72% agreed that it was easier to express their emotions using emoji.

An interesting example that showcases the universality of emojis is their use in food-related market research. The potential for using emojis to gauge consumers’ preferences was first noted by researchers analysing over 12,000 tweets about meals. In a large number of them, people commonly and spontaneously used a range of emojis or emoticons to express their thoughts.

Now, emojis are being used to report the food-related thoughts and feelings of children as young as eight. In one study, children lined up behind the emojis that best reflected their feelings about drinks. In another, they chose emojis that represented their feelings when eating certain foods. A similar ‘select-all-emoji-that-apply’ reporting system has been successfully used with Chinese consumers.

Supporters of emojis suggest they may help us communicate with illiterate or poorly-educated people. In 2019, the mosquito emoji was added at the proposal of malaria researchers who wanted to spread public health messages about malaria prevention.

Not everyone is 😃 about them

For some people, the idea that 😂 is a ‘word’ and emojis are a language is as controversial as the singer Bob Dylan winning the Nobel Prize for Literature.

It is difficult to conduct a deep and meaningful exchange in emoji. This is because emojis are almost all nouns, with few verbs and no adverbs.

One researcher argues that our greatest power as humans is our ability to clearly express complex thoughts. If we rely too much on emoji, he argues, it ‘may push us back towards cave drawings we started with.’

Another opinion is that our increasing dependence on fast, visual-based communication may make us mentally lazy and decrease our ability to think about abstract, long-term and complex issues.

Using emojis can also leave a bad impression when used in formal digital communication.

So, it seems that as with most things, 🤷 … can’t please everyone, right?

Links and stuff

Nature as medicine

Anthropology / Medicine / Myths / Psychology

For years, people have flocked to cities for greater job prospects and convenience. Tara Bautista explores why something in our psyche still yearns to escape the urban jungle and is soothed by the great outdoors.

Forest bathing and other forms of contact with nature improve both mental and physical health. Photo by Luis Del Río Camacho on Unsplash

A few years ago, when I was struggling with some mental health issues, I realised that the cramped city apartment I was living in was not doing me any favours.

The walls were painted a calming pale green that was meant to bring to mind moss and leaves. But the tiny Juliet balcony – the only view to the outside – opened up to the not-so-calming interior of a busy car dealership.

Fast forward to months later in my new apartment.

The tall ceilings and increased space are a welcome relief. From my couch, the view from the French-doored balcony is an almost unobstructed vista of the sky. I can easily spend an hour watching the crawl of clouds, bathed in silence.

I believe this change of scene is one factor that helped me in my healing.

In fact, my experience parallels a famous study: patients who underwent gallbladder surgery healed faster and had fewer complications when they had a room overlooking trees as opposed to a wall.

Cottoning on to the benefits of going au naturel

In 1984, renowned biologist EO Wilson coined the term ‘biophilia’ to describe the human urge to connect with nature and other forms of life.

Two years earlier, the Japanese government introduced a new form of therapy as part of their national health program. ‘Shinrin Yoku’ or forest bathing involves people slowly meandering through a forest and taking in the atmosphere – with clothes on, of course. They emerge from forest bathing feeling calmer and more mindful.

Similar nature-as-therapy programs have been introduced in other countries including Scotland, with good scientific reason. Forest bathing decreases anxiety and blood pressure by activating the part of the nervous system that puts your body in ‘rest and digest’ mode.

Even going for a run or walk outside in nature is associated with feeling more revitalised and energetic than if you’d done the same thing on a treadmill.

Reconnection with nature also has tangible effects on physical health according to a 2018 study that looked at combined data from 290 million participants.

The mega study found that living in greener urban areas is associated with a reduction of stress hormones as well as risk of high blood pressure, cardiovascular disease, obesity, diabetes, asthma hospitalisation and early death. Spending time in greener urban areas also lowers the risk of giving birth to premature babies and rates of shortsightedness in children.

More than just something in the air

A study of 20,000 English people found that you need exposure to nature for at least two hours a week – not necessarily in one block of time – to experience improvements in health and wellbeing.

But there isn’t one simple explanation for nature’s healing power.

One theory is that being in nature exposes us to different air. People living in cities are exposed to a lower variety of germs than people who spend time in natural environments. Being in nature exposes us to a more diverse microbiome that may prime our immune systems to heal ourselves faster, for example by producing more immune cells to stave off infections.

But what about the mental side of things?

The Attention Restoration Theory suggests that we have limited ability to focus attention before we feel mentally fatigued. Taking a short break in natural environments away from our busy urban lives helps restore this ability.

Another hypothesis is that we have evolved to feel less stressed in non-threatening, natural environments since they provided our ancestors optimal conditions for things key to survival, including food.

Lastly, the visual stimulation and silence of an outdoor setting may help distract us from our worries, since being in nature decreases activity in the prefrontal cortex, the brain region that can get fixated on negative thoughts.

But I live smack bang in the middle of the urban jungle!

What if it’s just not feasible to get away from the bright city lights?

The good news is that you don’t even need to venture into far-flung wilderness to feel the benefits. Access to urban green spaces within a few kilometres of your home, or exposure by virtual reality, might be enough.

As for me, I’ll be on my couch, curled up with a hot choccy and contemplating Eastern philosophy as the clouds roll by.

“Nature does not hurry yet everything is accomplished.” – Lao Tzu

Links and stuff

What if practice doesn’t make perfect?

Anthropology / Myths / Psychology

You know the saying: practice makes perfect. But how much practice? If you believe the ‘10,000-hour rule’, you can become an expert at most things with 10,000 hours of practice. What’s the catch?

On the way to being a Grand Master? Image credit Michal Jarmoluk via Pixabay

Grand Masters

Think about a skill you’d love to master: maybe chess, soccer, cello or speaking Spanish. What would it take for you to be truly exceptional at your chosen activity?

Is raw talent essential? Perhaps early nurturing of natural talent makes all the difference (think Tiger Woods). Or maybe anyone willing to put in years of hard work can overcome not being born gifted.

Twenty-six years ago, Swedish psychologist Anders Ericsson and his colleagues investigated the practice habits of violin players undertaking training at the elite West Berlin Music Academy. They wanted to understand what made a brilliant violinist as opposed to a ‘very good’ one. Their conclusion: you don’t get to be a virtuoso with less than ten years of practice under your belt. And the more you practice, the better you are likely to be.

The magic number

So where did 10,000 hours come from? In Ericsson’s study, by age 20, the best violinists had spent on average 10,000 hours studying violin. In contrast, the very good violinists had averaged only 7,800 hours of practice.

Fast forward to 2008 and Malcolm Gladwell published his book Outliers. Gladwell wanted to know why stand-outs like Bill Gates and The Beatles were so successful.

The key? According to Gladwell, 10,000 hours of practice. For example, The Beatles played thousands of live shows in Hamburg between 1960 and 1964, resulting in more than 10,000 hours of performance by the time they returned to England.

With that, the ‘10,000-hour rule’ took on a life of its own. Soon, 10,000 hours of practice became a rule-of-thumb; a magical threshold.

How hard is it to cross the threshold? Four hours of practice on five days of the week across ten years will get you to the magic number.
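That arithmetic is easy to check for yourself. Here's a quick back-of-the-envelope sketch (assuming roughly 50 practice weeks a year, to allow for some time off):

```python
# 4 hours a day, 5 days a week, ~50 weeks a year, for 10 years
hours_per_day = 4
days_per_week = 5
weeks_per_year = 50  # allowing a couple of weeks off each year
years = 10

total_hours = hours_per_day * days_per_week * weeks_per_year * years
print(total_hours)  # 10000
```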

Be deliberate about it

But research has slammed the 10,000-hour rule – unsurprisingly, the path to mastery is not quite so simple.

For a start, the simple rule ignores that there are many forms of practice. Simply doing something over and over again for 10,000 hours doesn’t guarantee you anything – except perhaps boredom.

What’s needed is not just accumulated hours, but committing to what’s called ‘deliberate practice’. Deliberate practice involves getting expert advice: having someone who can give you feedback and correct your mistakes.

And there’s no evidence for a one-size-fits-all recipe for success. Back to the original study of violinists: 10,000 hours was not an exact number of hours reached, but rather an average of the time these top violin students spent practising. It turns out some had practised for far less than 10,000 hours. Others had accumulated more than 25,000 hours of practice.

If not practice, then what?

Researchers have now pooled together the results of many studies in order to understand what role practice plays in success.

For example, a review of 88 studies found that on average, deliberate practice accounted for 12% of the variance in performance across a range of activities. Broken down, the researchers calculated that practice explained about a quarter of the difference in ability in games like chess, about a fifth of the difference in music and even less for sports. When it came to jobs like computer programming or piloting planes, practice barely made any difference to performance – only one per cent.

Research published in 2016 explored the link between deliberate practice and performance across many different sports and found training accounted for only 18% of the variance in performance. And when the researchers looked only at elite sportspeople, practice could only explain one per cent of the differences in success.

Another study found practice could explain about one-third of the differences in success among musicians and chess players.

Practice, although important, is obviously nowhere near the whole story. Things like genes, personality, access to expert advice as well as how good someone is at taking on feedback must all be hugely influential too. The 10,000-hour rule has been debunked.

But it seems to me there are many other reasons to play games, sport and musical instruments. I definitely won’t be giving up all the things I’m mediocre at anytime soon.

Links and stuff

Do we need vitamin D supplements?

Leave a comment
Health / Medicine / Myths

There’s a lot of hype around vitamin D. We’ve long known it’s essential for healthy bones, but over the last decade it’s been claimed low vitamin D levels are linked to a whole host of other illnesses. How clear are those links and how many of us really need vitamin D supplements?

So many supplements to choose from. Image credit Angel Sinigersky via Unsplash

Get some rays

Also known as the sunshine vitamin, there are two ways to get vitamin D. The first is the same way we get all the other vitamins and minerals essential to good health: from the food we eat. Foods high in vitamin D are oily fish and egg yolks. Because many people’s diets are low in vitamin D, in some countries, foods like milk, orange juice and cereal have vitamin D added during processing.

But there’s another way to get vitamin D – your body can make it through a series of steps starting with your skin being exposed to the sun. How easy it is for your body to make enough vitamin D depends on how dark your skin is, and where you live. For example, in Australia in summer, just a few minutes each day of sun exposure in the mid-morning or mid-afternoon is enough for most people.

Why we need vitamin D

We’ve known for a long time that vitamin D is essential for healthy bones. This is because vitamin D plays a vital role in absorbing calcium from your gut. Without enough vitamin D, children develop rickets – weak and soft bones, and in the worst cases, bone deformities. A lack of vitamin D is also associated with osteoporosis and other bone conditions in adults.

The importance of vitamin D in contributing to the growth of healthy bones is not in any dispute. But over the past decade, it’s been claimed low vitamin D levels are also responsible for a variety of other illnesses. Studies have reported links between low vitamin D levels and cancer, heart disease, autism, depression, type 1 diabetes, high blood pressure and rheumatoid arthritis among other conditions.

Popping pills

These research claims led to a massive media focus on vitamin D. For example, in 2010, the New York Times declared ‘a huge part of the population… are deficient in this essential nutrient’. The US-based Vitamin D Council helped spread the word, and the message that we would all be healthier with higher levels of vitamin D became widely accepted. Dr. Oz claimed vitamin D was ‘the number 1 thing you need more of’.

It’s an appealing story: we evolved near the equator where there is plenty of intense sun year-round. Humans also used to spend almost all their time outside wearing minimal clothes. It makes sense that people working in offices, particularly those living far from the equator could become vitamin D deficient.

With all the media hype about vitamin D as a cure-all, it’s no surprise people turned to vitamin D supplements. Pills are a quick fix – much easier than eating sardines or herring every day. Today, millions of people take vitamin D tablets without any evidence it is doing them any good. We know people in the US spend more than a billion dollars a year on vitamin D supplements.

At the same time, millions of people who have no medical complaints or symptoms, and no particular disease risks, are having their vitamin D levels tested. But there’s little consensus about the best form of vitamin D to measure in the body, or how best to measure it. Not only that, there is no general agreement on what level of vitamin D actually constitutes a deficiency.

One osteoporosis researcher has gone on the record claiming vitamin D has become ‘a religion’.

The truth about vitamin D

Recent large-scale studies have upended the idea that a lack of vitamin D is the cause of most of the illnesses it has been linked to. A recent study showed vitamin D supplements don’t prevent heart attacks. Another showed vitamin D supplements didn’t reduce the risk of cancer in older women.

Late last year a large clinical trial involving 25,000 people found no evidence for the benefits of the supplements in protecting against heart disease or cancer. Another large study of 5,000 New Zealanders found no evidence vitamin D supplements prevented heart problems.

Interestingly, a review of 81 existing studies also found no evidence that vitamin D supplements reduced falls or bone fractures. The results of many of the other studies claiming links between low vitamin D and illnesses have been called into question after further review.

On the other hand, there does appear to be evidence that people who live in areas with high sun exposure, and particularly those who spent lots of time in the sun as kids are less likely to develop multiple sclerosis. Recent studies have also supported a possible association between vitamin D levels and depression.

Some researchers are concerned people have assumed that if some vitamin D is good, more must be better. As a result, people are buying high-dose vitamin D tablets online. But unlike the B and C vitamins, vitamin D can be stored in the body and an overdose can cause vomiting, nausea, loss of appetite and weakness.

Please always follow your doctor’s advice. But if you’ve decided to take vitamin D supplements on the basis of what you’ve read in the media or heard on the grapevine, it might be time for a rethink. You may be wiser to save the money and spend it on more real food instead.

Links and stuff



What a coincidence!

comments 2
Myths / Psychology

We’ve all been floored at some point in our lives by what seems an impossible coincidence. It’s tempting to interpret such improbable, even eerie events, as signs from the Universe. But what can science tell us about coincidences? Are they as rare as we think?

What are the chances? Image credit Colton Sturgeon via Unsplash.


She was one in a million, so there’s five more just in New South Wales. – Charlie No. 1 by The Whitlams

One in a million

My husband (who I met in my late 20s) grew up in the same street where my best friend in high school lived. Two of my aunts passed each other on an escalator one day in New York when neither had any idea the other was even in the United States, let alone New York.

They seem like crazy coincidences and I could list plenty more; I’m sure you could too.

A coincidence is defined as ‘a surprising concurrence of events, perceived as meaningfully related, with no apparent causal connection’. Coincidences delight, astound and confound us. They can seem so impossible that we wonder if something mysterious or supernatural must be involved.

A third of us notice coincidences regularly and the frequency of experiencing a coincidence isn’t related to gender, age, occupation or level of education. Coincidences make us ask ourselves ‘what are the chances of that?’ Researchers have studied coincidences for a long time and have in fact calculated the chances of seemingly highly unlikely events happening.

The Law of Truly Large Numbers

Let’s start with The Birthday Paradox. Imagine a room of people. How many people need to be in the room before it’s likely two of them will share a birthday? You’d imagine quite a few, given there are 365 possible days someone could have been born. But statisticians have done the sums and it turns out you only need 23 people before there’s a 50-50 chance that two of them will share the same birthday.
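If you don't trust the statisticians, here's a small sketch of the standard calculation: work out the probability that everyone's birthday is different, and subtract it from one.

```python
def shared_birthday_probability(n, days=365):
    # Probability that at least two of n people share a birthday,
    # computed as 1 minus the probability all n birthdays differ.
    p_all_different = 1.0
    for i in range(n):
        p_all_different *= (days - i) / days
    return 1 - p_all_different

print(round(shared_birthday_probability(22), 3))  # just under 0.5
print(round(shared_birthday_probability(23), 3))  # just over 0.5
```

With 23 people the probability tips past 50 per cent, exactly as the paradox claims.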

On September 6, 2009, the Bulgarian lottery winning numbers were 4, 15, 23, 34, 35 and 42. Four days later exactly the same six numbers were drawn. The repeat was labelled as a freak coincidence – never before seen in the lottery’s 52-year history. The then sports minister ordered an investigation to look for evidence of fraud. But mathematicians calculated that in the space of only 43 years, it is more likely than not that a lottery machine will draw an identical set of six numbers.

According to mathematicians Persi Diaconis and Frederick Mosteller, if you have a large population, very low probability events will happen. This is now known as the Law of Truly Large Numbers. Put a different way, ‘With a large enough sample, any outrageous thing is apt to happen’. With more than 7 billion people alive today, highly improbable events happen every day.

When a woman won a US lottery twice in the space of four months, it was reported as a one in 17 trillion event. In fact, analysis showed that the odds of the same thing happening to someone, somewhere in the United States, were more like one in 30. Of course, the odds of you being that lucky person are unfortunately much longer!

How about the likelihood you’ll have a correct premonition that someone you know is about to die? In a country the size of America, calculations show that 70 people a day will think of a specific person dying five minutes before that person actually dies. Not exactly a rare event.

Similarly, Littlewood’s Law says that you can expect a miracle (defined as a one-in-a-million event) about once a month. The logic is that during the hours you are awake and engaged with the world around you (around eight hours a day), you experience something every second. That’s 28,800 experiences a day. So a one-in-a-million event will happen about every 35 days.
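Littlewood's arithmetic is simple enough to check in a couple of lines:

```python
# One 'experience' per second, across eight alert waking hours a day
experiences_per_day = 8 * 60 * 60  # 28,800
one_in_a_million = 1_000_000

days_per_miracle = one_in_a_million / experiences_per_day
print(round(days_per_miracle))  # about 35 days between 'miracles'
```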

These laws aren’t new ideas: British mathematician Augustus de Morgan wrote back in 1866 ‘Whatever can happen will happen if we make trials enough’.

Why do we love coincidences?

Despite the very good odds of extraordinary coincidences happening every day, we are all impressed and affected by them. One study found that students who believed they shared a birthday, first name or fingerprint similarities with someone asking them to perform a task were much more likely to perform the task than if they didn’t share that similarity.

Part of the explanation is no doubt the fact that we’re no good at probability calculations. We’re astonished by coincidences but pay no attention to the millions of instances in our daily lives when nothing extraordinary happens. Coincidences stay in our memories for a long time and sometimes come to hold great meaning.

Notably, research has shown that although we all notice coincidences, some people are more likely to ascribe meaning to them. People who describe themselves as religious or spiritual and people who spend lots of time seeking meaning in their life are all more prone to experiencing coincidences. We are all more likely to notice coincidences when we are extremely angry, anxious or sad.

It won’t surprise you to hear that we are also far more amazed by our own coincidence experiences than other people’s. That makes sense – it is far more unlikely that a given weird coincidence will happen to you than to some other person on the planet.

We like to believe our life has meaning and identifying connections is a powerful way to find meaning. We also like patterns – we find life much easier to understand if it comprises a series of patterns rather than just random events. Humans also love hearing and telling stories and coincidences lend themselves beautifully to storytelling.

The upshot of all this is that rather than being surprised when one-in-a-million coincidences happen, we should expect them. We can all look forward to the next time something seemingly crazy happens that shakes us out of our day-to-day routine and makes us exclaim ‘what are the chances?’

Links and stuff


Why we procrastinate

comments 6
Myths / Psychology

It’s Friday afternoon. You’ve known for weeks your final report is due by the end of the day. But somehow you’ve managed to put off working on it until now. And you can’t possibly get it done in time. Sound familiar? According to one researcher, procrastination is ‘a common pulse of humanity’. Why do we procrastinate and how can we stop?

I will definitely stop procrastinating… tomorrow. Image credit Lynn Friedman via Flickr.

Just do it… soon

Procrastination is ‘voluntarily delaying an intended course of action despite expecting to be worse off for the delay’. That’s right. We know putting off doing something is going to be bad, but we do it anyway.

It’s easy to imagine procrastination is a recent thing, a result of the constant barrage of social media and other digital distractions. Not at all: procrastination has been around for a long time.

Egyptologist Ronald Leprohon translated some hieroglyphics from around 1400 BC as ‘Friend, stop putting off work and allow us to go home in good time.’ And back in 800 BC, the Greek poet Hesiod advised not to ‘put your work off till tomorrow and the day after’. Ancient Greek philosophers even coined a term for it: akrasia, the state of acting against your better judgement.

In a study of more than 1300 adults from six countries (including Australia), about a quarter of people reported that procrastination was one of their defining personality traits. Other research found one in five people qualify as a chronic procrastinator. In a study of uni students, only one percent reported they never procrastinate.

Is procrastination actually a problem?

One of the first studies attempting to get a handle on the effects of procrastination followed the academic performance, stress, general health and procrastination habits of U.S. college students in 1997.

In the short-term, procrastinators were less stressed than others, presumably because they chose fun stuff over study. But in the long run, procrastinators got lower marks and experienced greater stress and more illness compared with non-procrastinators. Since then evidence has been mounting: putting things off can seriously undermine our wellbeing.

The procrastination war

Procrastination has been the subject of much research over recent decades, and there’s no one cause. In fact procrastination is psychologically complex.

What we do know is that despite being commonly associated with laziness, procrastination doesn’t have much at all to do with time management skills.

Put simply, procrastination is a war between two parts of our brain: the limbic system (think of it as your inner four-year-old) and the prefrontal cortex. The limbic system seeks instant gratification while the prefrontal cortex handles planning and decision making.

The limbic system is one of the oldest parts of our brains and tends to function on auto-pilot. Anytime you’re not consciously engaged with a task, the limbic system leads you to give into what feels immediately good. The limbic system is powerful and wants pleasure now. As a result, we value immediate rewards much more highly than future ones.

In contrast, it takes effort to kick the more recently evolved prefrontal cortex into action.

What this means is that distant rewards, even if they are big ones, don’t have much sway over us compared with immediate pleasure. Procrastination isn’t a bad habit; it’s pretty much hard-wired into your brain.

Research published last year also suggests procrastination may have more to do with managing emotions than time. Brain scans suggest people who procrastinate more are less successful at filtering out emotions and distractions.

How impulsive are you?

Piers Steel from the University of Calgary is one of the world’s experts on procrastination. He analysed more than 200 procrastination studies and found a clear link between impulsiveness and procrastination.

People who tend to act impulsively are also likely to procrastinate a lot, which makes sense. In one case, we should wait but instead act now; in the other, we should act now but instead wait. The common feature is a lapse in self-control.

Another factor is self-confidence. If we doubt our ability to successfully complete a task, we are much more likely to put it off.

Unfortunately it’s unlikely you’ll be able to settle the ‘do it later versus do it now’ war once and for all and never procrastinate again.

So what can you do about it when you find yourself procrastinating?

Ask yourself why

Perhaps the most important thing is to notice you’re procrastinating and ask yourself why. Is the task too big and overwhelming? Are you missing some of the tools you need to get it done? Do you genuinely not care about getting it finished? Are you surrounded by too many distractions?

Once you know there’s a specific problem, you can do something to fix it.

Start and reward yourself… in intervals

One of the best tactics to beat procrastination is just to start. But of course, that’s the whole problem. So instead of telling yourself you have to complete the whole thing, commit five or ten minutes to starting one small part of your task. Building a habit of simply making a start can be one of the most effective ways to tackle procrastination.

And once you work for the specified period, give yourself a reward. Merlin Mann has a good approach for this, called (10 + 2) * 5: work for ten minutes, take a two-minute break, and repeat five times over an hour. It’s a version of the Pomodoro technique.

The idea is that before you know it, you’ll be engrossed in your task and kicking goals.
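If you fancy automating it, the (10 + 2) * 5 routine is simple enough to sketch as a toy command-line timer (this is just an illustration of the cycle, not any official implementation):

```python
import time

WORK_MINUTES = 10
BREAK_MINUTES = 2
ROUNDS = 5  # (10 + 2) * 5 = one hour in total


def ten_plus_two(work=WORK_MINUTES, rest=BREAK_MINUTES, rounds=ROUNDS):
    """Run the (10 + 2) * 5 cycle: short work sprints with tiny breaks."""
    for i in range(1, rounds + 1):
        print(f"Round {i}: work for {work} minutes")
        time.sleep(work * 60)
        print(f"Round {i}: take a {rest}-minute break")
        time.sleep(rest * 60)
    print("Hour's up - reward yourself!")
```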

Stop beating yourself up

Interestingly, one study found forgiving yourself for procrastinating makes it less likely you will procrastinate next time. By forgiving ourselves, we minimise the negative feelings we associate with a task that can lead us to avoid doing it again in the future.

And if you can focus on how good you’ll feel once something is done, you’ll have much more motivation in the here and now to get started on it.

Set deadlines, be specific and remove distractions

It’s a well-known fact that deadlines spur many of us into action.

Having a firm and costly deadline is an excellent way to get stuff done. And contrary to what you might think, it’s fine for this deadline to be self-imposed rather than externally imposed.

It’s also extremely helpful to think about your task in concrete, specific terms. You are significantly more likely to do what you need to do if you focus on the how, when and where of getting it done. Being specific requires your prefrontal cortex to take over.

Finally, commit to getting rid of distractions. Turn off your WiFi and phone, and don’t check email. Your limbic system will always be tempted by whatever distractions are on hand.

So instead of reading this blog post, is there something else you should be doing right now?


Links and stuff