Rethinking daydreaming

Health / Myths / Psychology

Your mind often escapes when you’re in a boring meeting or washing the dishes. Once considered undesirable and unhealthy, letting our thoughts drift is now recognised by science for its silver linings.

Losing yourself in thought ain’t so bad. Image credit: Wes Hicks on Unsplash.

Driven to distraction (in a good way)

Daydreaming, otherwise called mind-wandering or self-generated thought, often gets a bad rap as something indulgent, unwanted and unproductive. It distracts us from the task immediately at hand and can result in errors. In extreme cases, called maladaptive daydreaming, the behavior can severely impair important functions in everyday life.

Yet research conducted in normal adults shows we engage in it anywhere between 30 and 50% of the time in our daily lives. One study found that the only activity during which our mind wanders less than 30% of the time is sex. Even rats engage in mind-wandering.

These observations have convinced scientists that letting your thoughts drift has benefits. Studies have shown several advantages, including greater creativity in solving problems, consolidation of memories, and the opportunity to form a sense of self-identity and continuity over time. Daydreaming also enhances socioemotional skills such as empathy, and aids learning by giving us a brief break from tasks.

In healthy people, tests have shown mind-wandering is positively linked to fluid intelligence (ability to reason and think flexibly) and openness to experience (curiosity and exploration of ideas). In turn, openness to experience is a strong predictor of creative achievement.

What happens in our brains when we daydream?

Self-generated thought is most likely to happen when we’re engaged in an undemanding external task, like waiting for the bus.

Our human condition is such that we are forever in the situation of deciding how much attention to give to self-generated thought and how much to information from the external social or physical environment.

Jerome L. Singer

Focusing on an external task and consciously processing our environment makes use of our brain’s executive control network, which is responsible for impulse control and goal-oriented thinking. By contrast, our default mode network (DMN) is switched on when we’re not actively attending to anything.  The executive control network and DMN are usually described as being in a tug-of-war with one another. Scientists think that daydreaming occurs when the balance is tilted towards the DMN.

You might think that the brain’s activity would be minimal when DMN is ‘winning’ i.e. when you’re daydreaming. But it’s during this time that the DMN performs ‘housekeeping’ duties like sorting out memories, thinking of the future, and categorising new information. Accordingly, brain imaging studies show many brain areas are active, with some even more so than when performing tasks. Daydreaming also recruits additional brain regions, such as the insula, which is involved in awareness of our body’s internal functions as well as emotions.   

Content and timing matters

As far back as the 1960s, American psychologist Jerome L. Singer recognised that not all wandering thoughts were bad. Since then, scientists have identified two major influences on whether or not daydreaming is beneficial.

One factor is content.

Mind-wandering allows us to ‘time travel’ and reinterpret past experiences when we receive new information. However, repetitive negative self-generated thought (‘rumination’) about the past is linked with depression and anxiety. In contrast, daydreaming about the future and weighing possible options can be helpful, since it can lead to better economic and moral choices that rely on patience instead of impulse. Envisioning failure can even motivate us to avoid it.

The other crucial factor is timing.

Obviously, drifting off to la-la-land is bad when you need to pay attention to a difficult task, have an imminent deadline or when the external environment is risky e.g. crossing the road. (Although, some scientists say that if you can stay on-task but simultaneously mind-wander, it may be a sign that you have a greater working memory capacity – a more ‘efficient’ brain.)

But there are some situations when a drifting mind is good.

Research suggests that when solving a problem requiring creative thinking, you can achieve better results by letting your thoughts stray while doing an undemanding task. For me, it’s when I’m showering or brushing my teeth. This strategy is better than working through the problem non-stop, simply resting, or engaging in another demanding task.

Interestingly, not all daydreaming is unintentional – about half the time we allow it to happen.

Scientists think the ability to restrict when mind-wandering occurs – to deliberately allow it only during undemanding situations – is the key to success. In an investigation of 274 college students, those who could focus during demanding tasks but reported frequent daydreaming in other situations tended to be intelligent and were higher achievers. In fact, brain imaging in people who intentionally daydream has revealed they have better connectivity between (and perhaps control of) their brains’ executive control network and DMN…  blah blah blah…

… Have you drifted off amid all this science talk? Well now you have good reason(s) to. Happy daydreaming!


Time flies, but why?

Health / Myths / Psychology

Time is meant to be one of the few completely reliable and objective things in life. But it seems to slow down when we are afraid — think of a slow-motion car crash. And time flies when we’re having fun. Why is it that time seems to pass more quickly the older we get? Does our ability to assess the passing of time change as we age?

Do the days feel like they’re racing past faster and faster? Image by Eduin Escobar from Pixabay

I remember vividly as a child the last few days before Christmas seemed to last an eternity. And on the morning of my 8th birthday party, it felt impossible that I would have to wait until afternoon to see my friends.

At the same time, although I feel like I “just” had my son, it won’t be too long before he’s at high school. And even though in my mind I was an undergraduate student until very recently, I am now teaching undergrads who hadn’t even been born when I started uni. What on earth is going on?

I have hopefully become wiser with age, but that doesn’t mean I’ve learned to disrupt the space-time continuum. It clearly comes down to my perception of time.

We know our perceptions of time change as a result of all sorts of factors. When we feel rejected, or depressed, time seems to slow down. Experiments confirm that when we feel genuinely frightened, we also perceive time to pass slowly. But when listening to music, time seems to go faster.

As we age, people across all cultures share the feeling that time speeds up. Psychologists and philosophers have been trying to explain this phenomenon for at least 130 years.

We all feel time passes quickly

A number of studies report that in fact people of all ages feel time passes quickly. Researchers have asked people of different ages how quickly they feel time has passed during the previous week, month, year and decade. But the only clear age-related pattern is that the older the person, the more likely they are to say that the last 10 years had passed quickly. Over shorter time-scales, people of different ages have similar perceptions of the speed of time passing.

But researchers have identified that regardless of age, the more time-pressured we feel, the more likely we feel the days, weeks and months are passing too fast. This is closely linked to a perception of not having enough time to do all the things we want to do and is true of people in a variety of Western cultures.

Why do we feel like time is racing away from us?

Scientists have a number of different theories. One is called the ‘proportionality theory’. This simply argues that a year seems to pass much faster when you are 40 than when you are four because it constitutes only one fortieth of your life, rather than a whole quarter.
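The arithmetic behind the proportionality theory is easy to sketch. A minimal illustration (the function name is my own, not from the theory’s literature):

```python
def felt_year_fraction(age):
    """Under the proportionality theory, a year is perceived relative to
    the fraction of your whole life it represents: 1/age."""
    if age <= 0:
        raise ValueError("age must be positive")
    return 1 / age

# A year at 40 is 1/40 of your life; at 4 it is a whole 1/4.
# So a year at 40 'feels' one-tenth as long as a year at 4:
ratio = felt_year_fraction(40) / felt_year_fraction(4)
```

On this account the subjective length of a year shrinks smoothly with age, which is one reason critics note the theory predicts a far more dramatic speed-up than people actually report.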

Many theories have to do with how many novel experiences we have at different stages of life. The idea is the first time we experience something, our brains store lots of information about it. This results in our memories of an event being very rich and dense. So upon thinking back to childhood experiences, the many vivid memories we have give the impression that these memories must have formed over a very long time.

The reason we remember our youth so well is that it is a period when we have more new experiences than in our thirties or forties. It’s a time for firsts — first sexual relationships, first jobs, first travel without parents, first experience of living away from home, the first time we get much real choice over the way we spend our days. Claudia Hammond

In contrast, when our days are somewhat indistinguishable from one another, when we are following the same daily routines (as is more common later in life), passages of time seem to go faster. As adults we generally have fewer new experiences that imprint on our minds.

Each passing year converts some of this experience into automatic routine which we hardly note at all, the days and the weeks smooth themselves out in recollection to contentless units, and the years grow hollow and collapse. William James (1890)

How can we slow life down?

If you feel like life is escaping you and passing by far too fast for comfort, the answer may be to fill your time with new experiences. Going to new places and learning new things may very effectively slow down our internal sense of time.

Slowing down and feeling under less time pressure could also help. My favourite solution of all is to have more holidays — a combination of reducing time pressure and maximising new experiences. Although at the time a great holiday may seem to be over all too fast, as we look back we feel like we’ve been away from the daily grind for a long time.

A holiday in the name of research? Yes please.


Blame it on the full moon

Astronomy / Health / Myths / Psychology

Tomorrow is Halloween, which leads to thoughts of ghosts, witches, vampires and scary pumpkin faces. But year round, many people believe a full moon is linked with other spooky stuff. Crime rates, psychiatric hospital admissions, emergency room visits, dog bites, sporting injuries and hyperactivity in kids are all said to increase during a full moon. Is there any truth to the rumours?

Do strange things happen when there’s a full moon? Image by cocoparisienne from Pixabay

Bad moon rising

Also known as the Transylvania Effect, the idea that people behave erratically during a full moon is not a new one. Aristotle and Pliny the Elder both suggested that because the human brain is so moist, it responds to the same pull of the moon that drives our tides (which we now know not to be true). In psychiatrist Arnold Lieber’s 1978 bestseller, How the Moon Affects You, he also argued that because our bodies are mostly water, the moon has a strong influence on us. How strong an influence? The story goes that in 19th Century England, lawyers could invoke the defence ‘not guilty by reason of the full moon’. I doubt any lawyers argued their clients had metamorphosed into drooling werewolves, but the argument is clear: under a full moon, people are no longer in control of their behaviour. And we can thank the Roman goddess of the moon, Luna, for the words lunatic and lunacy.

Sounds far-fetched, doesn’t it? But belief in the Transylvania Effect is alive and well. One survey found 43% of US college students believed people were more likely to behave strangely on a full moon night. The same study found mental health professionals were more likely to believe this than people working in other areas. In another survey, 80% of emergency department nurses and 64% of emergency department doctors believed the moon affected behaviour. Ninety-two percent of the nurses said they found working on the night of a full moon more stressful, and that they should be paid more for working those shifts.

The moon made me do it

In 2007, police in the UK employed more staff to be on patrol on full moon nights, claiming ‘Research carried out by us has shown a correlation between violent incidents and full moons’. There are thousands of published research papers exploring the link between the phase of the moon and a huge variety of events and behaviours. The most striking thing about all this research is the almost complete lack of evidence for the moon having any effect on us.

Despite the fact many people believe the time of the full moon is the worst for undergoing surgery, there is no evidence a full moon affects the outcomes of heart surgery, the rate of surgical complications, levels of post-operation nausea and vomiting or post-operative pain. Professional soccer players are no more likely to be injured when playing or training during a full moon. There is no evidence for increased crisis centre calls during full moons and no link between moon phase and frequency of epileptic seizures. Research shows no increase in emergency department admissions or calls for ambulances during a full moon, nor do more people see doctors for anxiety or depression. There is also no relationship between moon phase and birth rates.

Research has found no link between the phase of the moon and psychiatric hospital admissions. A full moon doesn’t lead to increased violence in psychiatric settings, nor is there evidence for a link between moon phase and suicides or suicide attempts. No link has been found between full moons and traffic accidents, and dog bites requiring hospitalisation occur no more often during a full moon than at any other time. Kids aren’t more hyperactive at the full moon, and there are no more murders or other crimes when the moon is full.

The full (moon) truth

If nothing else, research over the past 50 years has provided us with a detailed list of all the human behaviours not influenced by the phase of the moon. Given the lack of evidence, why does the myth persist? It may simply be the result of illusory correlation – if something unusual happens on the night of a full moon, we are more likely to take note and remember it than when nothing unusual happens under a full moon. Confirmation bias probably also plays a role: if you already believe in the Transylvania Effect, you’ll pay particular attention to any information that supports your belief.

But there’s another intriguing possibility. We have some evidence people sleep a little less on full moon nights. One study found people slept an average of 19 minutes less during a full moon; another study recorded a decrease of 25 minutes of sleep on these nights. Study volunteers took longer to fall asleep and slept less deeply at the time of the full moon, even when they couldn’t see the moon and didn’t know the moon was full. They also complained of feeling less rested on waking after a full moon.  A 2-year study of nearly 6000 children living on five continents found kids also slept a little less on nights with a full moon.

Perhaps this is the source of the myth. In times gone by, before our body clocks were under the influence of electric lights and screen time, a full moon may have been very disruptive to sleep. Researchers have suggested the Transylvania Effect may have its roots in the fact that someone with bipolar disorder or a seizure disorder may be highly susceptible to mania at times of sleep deprivation.

Seems plausible but fortunately I don’t think we’ll be hearing the defence ‘not guilty by reason of not having had quite enough sleep last night’ anytime soon.




Is it just me, or is it the Barnum Effect?

Anthropology / Myths / Psychology

Let me tell you about yourself: You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. While you have some personality weaknesses, you are generally able to compensate for them. You have a great deal of unused capacity which you have not turned to your advantage.

Neither they, nor I, can really read your mind. Photo by Scott Rodgerson on Unsplash.


Did you read the above statements and think they were eerily accurate?

Here’s the catch – they weren’t written about you.

These generic descriptions, called Barnum statements, could be about anyone regardless of their age, gender or nationality. If you were fooled, you wouldn’t be alone.

The unsuspecting millions

The phenomenon you just witnessed underlies the enduring popularity of astrology, psychics, tarot readers and faith healers – despite what scientists have to say.

Roughly 90% of surveyed people know their star sign. Most identify with the broad personality traits ‘bestowed’ by their Zodiac. For example, Sagittarians agree that they’ve been incurably bitten by the travel bug. Or, it’s considered valid to excuse your stubbornness on account of being a Taurus. Yet if you give people 12 unlabelled Zodiac personality descriptions, they pick out their own no more often than chance would predict.

You’re so vain; you probably think this song is about you

The Barnum effect refers to the tendency to believe vague personality descriptions that are presented to be personalised, when in fact they can be almost universally applied. It is otherwise known as the Forer effect, after the American psychologist who conducted the first scientific study on the phenomenon.

In 1948, Professor Bertram R. Forer made unsuspecting first year psychology students take a personality test. (Read the original paper here.) A week later, he gave them ‘individual’ personality descriptions comprising ambiguous statements supposedly based on their answers. When asked how accurately their description applied to them, the students gave an average rating of over 4 out of 5. In reality, however, they had all received an identical statement cobbled together from a news stand astrology book.

Forer and others also showed that the more accurate you find an external description of yourself, the more strongly you believe in the validity of the person or system dispensing the assessment. Fifty million people in the US and UK believe that Zodiac horoscopes accurately predict their personal future.

In other words, if you truly believe you’re stubborn because you’re a Taurus, you’re also more likely to believe next month’s horoscope that you’ll hit a lucky streak in love or in your job.

What makes Barnum statements so believable?

The Barnum Effect gets its name from the winning formula used by the famous showman Phineas Taylor Barnum: to include a “little something for everyone”. We can get sucked into believing Barnum statements because they are vague and often double-headed enough to cover all bases.

Consider the statement: “At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.” It seems believable because everyone is either an extrovert or an introvert, and it’s unlikely that you’ll be one or the other all the time!

Barnum statements seduce us because they also include generally favourable statements that flatter our ego. Most people are less likely to believe them if more negative elements of their personality are described.

Other factors that can make you more susceptible are your belief in the authority of the person giving the assessment, as well as how much detail you provided.

Guarding against gullibility

Don’t like being duped?

The best defense against swindlers who use Barnum statements is a healthy dose of skepticism. Some might say being educated is enough. But even among people with the same level of education, those who exercised greater logical thinking skills were less prone to having the wool pulled over their eyes.

After all, what are the chances that all Taureans around the world will simultaneously get a raise next month?



Why you (probably) can’t multitask

Myths / Psychology

How many different things are you trying to do right now? Reading this blog post while listening to a podcast, posting on Instagram and checking email? It’s easy to think you’ll be more productive by doing lots of things at once. But it turns out for almost all of us, effective multitasking is an illusion.

A different take on driving and texting. Photo by Michiel Gransjean via Flickr.

You’re not really multitasking

With the possible exception of texting while driving, you probably think multitasking is something to aspire to. But the reality is, trying to do multiple things at once actually slows you down and leads to more mistakes. We are simply no good at trying to engage with more than one decision-making process at a time.

In fact, experts tell us even the term multitasking is wrong: we should call it task switching. Researchers suggest we may lose up to 40% of our productivity by trying to multitask. What feels like multitasking is actually rapid switching between tasks and the time it takes to switch between tasks all adds up.

Think it only takes a few seconds to jump between email and the report you urgently need to write? No big deal unless you are switching every time you get a new email notification. And how long does it take you to get to the point where you are writing effectively again? Research shows it takes an average of 23 minutes. In some cases, study volunteers never got back into the flow of writing.

In another study, volunteers were asked to memorise either a 7-digit or single-digit number while eating. Those who had the more challenging memory task were less aware of the flavour of the food they were eating. At the same time, they ate more of both the sweet and salty food on offer.

Some tasks are easier than others

If multitasking is so hard, perhaps you’re wondering why you can read this blog, eat your lunch and listen to music all at once.

How easy it is to juggle tasks depends on how engaged your prefrontal cortex is during the activity. Natural behaviours like walking, talking and eating don’t take a lot of brain effort so we can do those things at the same time as paying attention to something else.

Having said that, talking on your mobile while driving increases your risk of having an accident four-fold. It’s been shown to be equivalent to driving with a blood alcohol reading of about 0.08.

When 56,000 people were observed approaching an intersection in their cars, those who were talking on their mobiles were ten times less likely to actually stop at the stop sign.

Even walking at the same time as talking on your mobile takes up most of your brain’s attention. Only one quarter of people doing both noticed a unicycling clown, planted there by researchers, ride past them.

And as soon as there are multiple tasks which require thinking, it becomes clear our brains simply aren’t wired for multitasking.

There are supertaskers among us

But all is not lost. Research has uncovered a small proportion of the population — about two percent — who have extraordinary multitasking abilities. Imagine simultaneously doing a driving test, solving complex maths problems and doing memory tests on your phone.

People who can easily do that have been called supertaskers and they have brains that actually become less active the more tasks they are trying to do at once. And that’s not all. Supertaskers do better, not worse, at each individual task the more simultaneous tasks they are doing.

The irony is that as soon as we hear that supertaskers exist, 90% of us decide we belong in that 2%! But research has shown that the people who multitask the most and are the most confident in their multitasking abilities tend to actually be the worst at it.

The cost of multitasking

Assuming, like me, you are not a supertasker, it’s worth considering what our attempts at multitasking cost us.

Research suggests the more we attempt to multitask, the more we are training ourselves not to focus. We are effectively teaching ourselves that something unknown – an unread email or the next notification – is always more worthy of our attention than whatever task we are meant to be working on.

We sacrifice focus in order to ensure we don’t miss any unexpected surprises, but in the process we lose our ability to block out distractions. And that means we’re not very good at getting stuff done.

As Cal Newport says in his excellent book Deep Work,

Efforts to deepen your focus will struggle if you don’t simultaneously wean your mind from a dependence on distraction.





The reality of the Uncanny Valley

Anthropology / Evolution / Myths / Psychology

Technology has made extremely human-like robots a reality. But is this a good thing? Tara Bautista delves into the unease that many people experience with things that are almost – but not quite – human. Welcome to the ‘Uncanny Valley’.

Feeling weirded out? Me too. Photo by Franck V. on Unsplash.

When realistic isn’t pleasantly real enough

Last weekend I started watching a new fantasy show on Netflix called The Dark Crystal: Age of Resistance. You could say it’s a cross between Game of Thrones and Jim Henson’s The Muppets.

It’s a great series but it took me three episodes to get over the weirdness of the puppets. The pairing of human voice acting with the puppets’ mechanical facial expressions made for an unsettling experience. (And spoiler alert: I never want to see puppets kiss again).

Turns out I’m not alone in feeling this way.

In 1970, a Japanese robotics professor called Masahiro Mori first predicted the relationship between how human-like something is and our feelings towards it. He argued that at first, we experience increased positive feelings the more human-like the object is. But as something approaches being almost but not perfectly human-like, he predicted a dip in positive feelings and we would experience feelings of discomfort and revulsion.

Professor Mori called the dip the “Bukimi no Tani” – literally translated as “Valley of Eeriness” but more popularly known as the “Uncanny Valley”. Into the valley would fall things like androids (human-like robots), Japanese bunraku puppets, prosthetic hands, corpses and zombies.

The science behind our heebie-jeebies

Up until 2004, the Uncanny Valley hypothesis remained controversial and little studied. Nowadays it’s gaining acceptance as more scientific evidence backs up its existence.

In 2015, a landmark study reproduced Professor Mori’s predicted rise-dip-rise when participants rated the likeability of 80 real-life robot faces (ranging from clearly robot to android) as well as digitally-composed robot faces. The researchers showed the Uncanny Valley effect even influenced participants’ decisions about robots’ trustworthiness.

(Fun fact: Pixar won its first Oscar for its CGI short film Tin Toy in 1988. However, audiences were so creeped out by the human baby character that it drove the film studio to initially focus on stories with non-human characters like Toy Story and Cars).

But why do we have negative feelings about almost perfect human representations?

Firstly, the existence of a similar phenomenon in monkeys suggests there’s an evolutionary basis for it. Macaques are frightened by, and avoid eye contact with, realistic but not perfect digital images of other monkeys. Like humans, they are more at ease with real images of monkeys and with representations that are clearly not realistic.

An early but largely discredited hypothesis about how the uncanny valley works was that we generally are uncomfortable about things we can’t easily categorise. In the case of androids: are they human, or non-human?

However, the most likely explanation is the perceptual mismatch hypothesis. Because they look so human, we can’t help but want to judge them by human standards. But our brains pick up from their movements and responses that they’re not human.

In a 2011 study, the brain activity of 20 participants was monitored as they watched videos of a human, an android and a robot performing the same everyday actions, such as drinking water from a cup or waving. The largest difference in brain responses between the three types of videos occurred when watching the android.

The researchers interpreted their findings to mean that our brains have no trouble processing a human moving like a human, or a robot moving like a robot. But the brain lights up with an error message when there’s a mismatch, as with the human-like appearance but robotic motion of androids.

The expectation that androids would move like us may be why the internal brain regions that help us move smoothly are activated when watching the slightly jerky movements of an android. One such region is the subthalamic nucleus (STN), which is impaired in voluntary movement disorders such as Parkinson’s disease. In fact, it’s suggested that our brains mistakenly interpret androids as people with Parkinson’s.

Our discomfort with perceptual mismatches also applies to emotions in artificial intelligences.  We’re spooked when something we know to be controlled by a computer appears to show spontaneous empathy like a real person.

In sum, scientists think that because androids fail to meet our expectations of a normal healthy human, we have an averse response.

Towards getting un(real)?

So, how should we account for the uncanny valley in an era where androids are being developed to become our caregivers, companions and even priests?

One solution is simply to design robots that are not too human-like. Think adorable Disney robots like Wall-E, or Baymax in Big Hero 6. (Relatedly, Pixar finally got the formula right for non-threatening, animated human characters with The Incredibles).

Or perhaps it’s just a matter of exposure and time (three episodes?) before the valley becomes a plain.


The who, why and how of Stonehenge

Anthropology / History / Myths

Legend has it Stonehenge was created by the wizard Merlin with the help of giants. Meanwhile, scientists have been studying the monument for centuries. What do we know about who built Stonehenge, and why?

An ancient observatory, a healing place, or a landing pad for alien spaceships? Image credit FreeSally via Pixabay

Giants and aliens

Stonehenge, located on Salisbury Plain, is the best-known prehistoric monument in Europe.  It was built around 5,000 years ago by people who left no written record. As a result, it’s become one of our favourite mysteries: there are records of claims about its origin as early as the 12th century.

The Merlin Hypothesis, no longer as popular as it was in the 14th century, has been joined by a swag of other theories (some more plausible than others) about who built it, how and why. Given the incredible effort involved in creating Stonehenge – it’s been estimated building took more than 20 million hours – it seems certain Stonehenge was sacred.

More than 50,000 cremated bone fragments from at least 58 people have been found at Stonehenge so it certainly served as a cemetery. Because many of the skeletons buried nearby show signs of illness and injury, it’s also been suggested Stonehenge was considered a healing place.

Other theories argue Stonehenge was an ancient hunting and feasting site, or a long-distance communication system. Stonehenge has incredible acoustics, which give the impression of having been architecturally designed.

It’s also been suggested Stonehenge creates a passage through the underworld, given the way the moon is viewed on the horizon through a stone avenue at certain times. In 1964, an astronomer proposed that Stonehenge was an ancient observatory used to predict the movement of the sun and stars, including eclipses.

It won’t surprise you to know there have also been alien theories: to some, Stonehenge looks like an ideal landing pad for alien spaceships.

Build me a monument

One of the fascinating things about Stonehenge is that research suggests the building process took place in multiple stages, possibly spread over as many as 1,500 years. So maybe it meant different things to different people over that time.

A Stonehenge puzzle we believe we’ve solved is the question of where the stones came from. The biggest blocks of sandstone – weighing up to 30 tons – almost certainly came from about 32km north of Salisbury Plain. That’s an impressive distance to travel with such massive objects.

More extraordinary: we know the 42 smaller bluestones forming the inner circle came from the Preseli Hills in western Wales, a whopping 230km away.

Researchers are confident they’ve even managed to identify the exact quarries where the stones came from. They’ve found evidence of quarrying at the sites from the right time period, including platforms at the base of rock outcrops. These platforms may have served as loading bays for the stones when they were lowered onto wooden sledges for transport. It’s also been proposed the stones we now know as Stonehenge were first built into a monument in Wales, before later being dismantled.

Over the years there have been many theories as to how the stones were brought to Salisbury Plain. One suggestion – now discredited – was the stones were brought by glacier movement, rather than by people. Other theories involve transport by boat, either on rivers or all the way around the coast.

But the latest research suggests the stones were probably dragged by people on wooden sledges across ramps made of branches. A reenactment in 2016 using only rope, wood and stone tools showed that ten people could drag stones the size of the largest at Stonehenge at about 1.6km per hour. Exhausting and slow, no doubt. But definitely possible.
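To put that reenactment pace in context, here’s a back-of-the-envelope calculation for the bluestones’ 230km journey from Wales. It assumes (unrealistically) that the reenactment speed could be sustained for the whole trip, so treat it as a lower bound, not an estimate from the research:

```python
# Rough feasibility arithmetic for the bluestone journey.
# Assumes the (unrealistic) sustained pace from the 2016 reenactment.
distance_km = 230          # Preseli Hills to Salisbury Plain
pace_km_per_hour = 1.6     # ten people dragging a stone on a wooden sledge

hours = distance_km / pace_km_per_hour
print(f"{hours:.0f} hours of dragging per stone")   # ~144 hours
print(f"~{hours / 8:.0f} days at 8 hours a day")    # ~18 days
```

Even at this pace, each stone represents weeks of coordinated effort – which makes the estimated 20 million hours for the whole monument feel a little less implausible.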

Who built it?

Who dragged such enormous stones across the countryside? One of the challenges we’ve faced in understanding more about Stonehenge is the fact that the human remains buried beneath the stones were cremated. The cremation process means many of the standard dating techniques researchers use – for example, studying tooth enamel – don’t work.

But exciting research published last year did manage to analyse the chemistry of the bone fragments successfully. The foods these people ate in the decade or so before they died left a ‘chemical signature’ in their bones, which the researchers could read from the fragments. Then they set about matching these signatures with the chemistry of the surrounding areas.

The results tell us that although some of the people buried at Stonehenge were almost certainly local, up to 40% came from a long way away. Where from? The area they most likely came from includes the region of Wales where the bluestones were quarried.

It’s possible these people were even cremated in Wales and their remains were brought with the bluestones themselves. But at this stage we can’t be sure of the relationship between these outsiders and the stones. There’s still plenty we don’t understand about Stonehenge and the mystery is likely to keep us entranced for many years to come.

Let’s hope the current battle over how to deal with traffic on the nearby A303 road is resolved. Neither major traffic jams nor the potential building of a tunnel in a World Heritage site sound ideal.

Links and stuff

Songs on repeat

1 comment
Myths / Psychology

We all do it: find a new favourite song that then gets heavy rotation on our playlists. Mine’s currently Postmodern Jukebox’s ’50s prom-style cover of Closer by The Chainsmokers. Tara Bautista on the science behind why we choose to play songs on repeat and what happens when we do.

Why we smash that replay button. Photo by Bruce Mars on Unsplash.


Play it again, Sam

Sixty percent of participants in a study last year admitted they immediately re-listen to their favourite song, some up to four times in a row.

But what makes a song worthy of replay, and why do we do it? One answer is the emotions it makes us feel.

A study by the University of Michigan revealed three broad categories of favourite songs and why people were drawn to them.

People whose favourite songs were happy and energetic liked the beat, whereas calm songs prompted listeners to reflect and gain perspective. But the most listened-to songs by far were the bittersweet ones. Unsurprisingly, the greater your emotional connection to a song, the more you hit replay.

Certain songs become our favourites because they resonate with us. Listening to our favourite music repeatedly can reinforce our identities, argues Kenneth Aigen, who heads the music therapy program at NYU.

Another answer is that repetition increases how much we like things.

In the 1960s, Robert Zajonc first described a phenomenon called the ‘mere exposure effect’. We tend to like things – be it pictures or melodies – more, the second or third time we come across them.

We rate songs familiar to us as more likeable than unfamiliar ones. Since our brains are faster at processing the sounds during re-listens, we are ‘tricked’ into thinking the ease of recognition is a positive thing. Familiarity also increases our emotional engagement with the music.

What’s more, repetition and familiarity with the music allows us to actively participate and not just process the sounds.

Professor Elizabeth Margulis, author of On Repeat: How Music Plays the Mind, says “Musical repetition gets us mentally imagining or singing through the bit we expect to come next.”

Indeed, brain imaging studies show that other parts of the brain besides those which process sound light up when we listen to familiar music. These include brain regions that anticipate where the song is going and those usually associated with body movement. Our mind effectively ‘sings along’.

Neuroscientists have also described two stages when we listen to music that gives us the chills. The brain’s caudate nucleus is excited during the build up to our favourite part. When the peak hits, another part called the nucleus accumbens releases the feel-good brain chemical dopamine.


Can’t Get You Out of My Head

There are songs you deliberately play again, and then there are earworms.

Earworms, or involuntary musical imagery, are short bits of music that we hear on a seemingly endless loop in our heads. They’re usually songs we’re familiar with and they can be set off by mental associations. For example, parents of toddlers may see the word ‘shark’ and automatically hear ‘doo doo doo doo’.

Turns out you’re more likely to experience earworms if you listen to music and sing on a daily basis, or if you show obsessive-compulsive traits and score high in openness to experience.

Psychologists analysed data from 3000 participants who were asked which songs gave them earworms. Popular and recent songs were among the top ten, with Lady Gaga’s Bad Romance taking out top spot.

The researchers then analysed MIDI files of the reported earworm-inducing songs to see what features they had in common. On average, the songs were upbeat, with a melody that is familiar yet unique in terms of the intervals between the notes. In other words, the melody has to sound like something we’ve heard before but be different enough to remain interesting and memorable.

I Will Always Love You (… or not)

You’ve listened to it more than a dozen times now, but suddenly your favourite new jam isn’t such a hit. What gives?

Most pieces of music that we’ve initially liked, but then are overexposed to, follow the classic Wundt curve. This is when the pleasantness of an experience increases over the first few exposures but then decreases with further exposure. (Pharrell Williams’s Happy, anyone?)

As our brains figure out the complexity of the song, the novelty wears off and we become bored.
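The Wundt curve’s inverted-U shape is easy to sketch with a toy model. The functional form below is purely illustrative (it’s not taken from the research); it just captures the rise-peak-decline pattern described above:

```python
import math

def pleasantness(exposures, peak=5.0):
    """Toy Wundt-style inverted-U: enjoyment rises with early
    exposures, peaks, then declines. Illustrative only."""
    x = exposures / peak
    return x * math.exp(1 - x)   # maximum of 1.0 when exposures == peak

# Enjoyment climbs to listen number 5, then tails off:
for n in (1, 3, 5, 10, 20):
    print(n, round(pleasantness(n), 2))
```

A more complex song effectively pushes the peak of this curve to the right: it takes more listens before our brains have ‘figured it out’ and boredom sets in.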

This might be why more complex pieces of music, like Bohemian Rhapsody by Queen (coincidentally number seven in the list of earworms), have longer-lasting popularity, according to music psychology expert Dr Michael Bonshor.

Although simpler song structures are catchier in the short-term, more complex layers of harmony, rhythm and vocals provide more opportunities to keep our brains interested.

And as for getting rid of earworms?

Some researchers suggest singing another song, like God Save the Queen, or Karma Chameleon.

But then again, karma karma karma karma karma … d’oh!

Links and stuff

How super is your superfood?

2 comments
Health / Myths

Chia seeds, goji berries, kale, blueberries, turmeric, green tea, quinoa, açaí (which I’ve now learned is pronounced ‘ah-sah-ee’). There’s no shortage of so-called superfoods. But what does that term mean, and are superfoods worth your money?

So much healthy goodness… all in one bowl. Photo by Chef Mick (Michaelangelo) Rosacci via Flickr.

All in a name

Need to lose weight? Want to slow down ageing? Want to minimise your risk of cancer and other diseases? These days you could be forgiven for thinking all you need to do is eat the latest miracle superfood and your ongoing good health is guaranteed.

Superfoods are marketed for their high nutritional value and extraordinary health benefits and the marketing appears to work. A UK survey found 61% of people bought particular foods because of their superfood status. The study also found people are willing to pay more for superfoods. Research suggests we see superfoods as somewhere between food and medicine and that the higher your socioeconomic status, the more likely you are to eat superfoods.

It pays to know there is no scientifically agreed meaning of the term ‘superfood’: it’s not a legal or regulatory term like ‘organic’ or ‘fair trade’. Superfood is a buzzword created to make us believe a particular food has crossed some nutritional threshold of goodness. But it hasn’t.

Interestingly, using the term superfood on food packaging or in marketing isn’t regulated in Australia or the United States. But there have been very tight restrictions around its use in the European Union since 2007.

What’s so super?

One reason superfoods are so alluring is because we know modern western eating habits are often less-than-ideal. We’re well aware of the steadily rising rates of obesity, type 2 diabetes, heart disease and other diet-related conditions. Foods and diets that appear to be more natural, primitive, traditional and authentic are inherently appealing.

It’s no coincidence that many superfoods have a history of traditional use by ancient or indigenous communities. For example, maca powder – often advertised as an ‘Incan superfood’ – comes from the root of a Peruvian vegetable and is said to improve energy, stamina, fertility and sex drive.

Australian researcher Jessica Loyer describes the packaging of one popular brand of maca powder: women in traditional Andean clothing picking vegetables by hand and placing them in woven baskets. In reality, men, women and children wearing jeans harvest maca into plastic bags, using tools and, in some cases, modern machinery.

Another incredibly appealing aspect of superfoods is the notion that by adding superfoods to your diet, you can compensate for your other bad habits. No need to change your doughnut-a-day habit if you add goji berries to your muesli. The muffin you had at morning tea is healthy because it was packed full of blueberries.

And although superfoods may be advertised as green, sustainable products, unfortunately the opposite can be true. As demand for these superfoods increases, so do the potential profits: it’s no wonder some farmers are abandoning their traditional production methods. But the resulting monocultures, soil degradation and increased use of fertilisers and pesticides are resulting in serious environmental and ecological problems.

What’s the evidence?

Why do some foods end up being touted as superfoods? Often because research has shown a particular component of that food does have specific health benefits. But the problem is the evidence generally comes from animal trials or lab experiments that don’t well represent the complexities of the human body. Just because a particular ingredient kills cancer cells in a lab experiment doesn’t mean eating lots of food containing that ingredient will stop you getting cancer.

There are experiments on people, but they tend to test the effect of very high amounts of a particular ingredient over a short period. For example, an ingredient in cinnamon, cinnamaldehyde, has been shown to reduce cholesterol in people with diabetes. But as nutrition research scientist Emma Beckett has calculated, you’d need to eat about half a supermarket jar of cinnamon each day to get the same amount of cinnamaldehyde as in the study.

It’s a similar story with blueberries. Blueberries (like red wine) contain resveratrol, a compound claimed to reduce the risk of heart disease, Alzheimer’s, cancer and diabetes. But to get enough resveratrol for the potential benefits, you’d need to eat roughly 10,000 blueberries a day. That would be both time-consuming and expensive!

So what should we eat?

There’s plenty of information out there on what constitutes a healthy diet. And if you can afford it and like the taste, there’s no reason not to include some ‘superfoods’ in your diet. But if kale doesn’t do it for you, you can also get essentially the same nutrients from cabbage, spinach, broccoli or brussels sprouts. Nutritionists tell us variety is the key to a healthy diet.

Personally, I like the idea of the rainbow diet – making sure I eat fruits and vegetables of a variety of colours. And I reckon Michael Pollan’s adage is worthy of its fame: ‘Eat food. Not too much. Mostly plants’.

Links and stuff


The low down on the sunk cost fallacy

2 comments
Evolution / Myths / Psychology

Waiting in line at the grocery, you hesitate to move to another counter when it opens up. Even though you’re not looking forward to seeing it, you head to a movie you’ve already paid for. You stay in a career or relationship that no longer ‘sparks joy’. Tara Bautista asks why we cling on when we should let go.

Why do we choose to stay on board? Photo by Jack Blackeye on Unsplash.

Going down with the ship

Humans are weird creatures. We prize our supposedly unrivalled ability to tackle problems and come to decisions through logic and reason. Yet we often fall prey to logical fallacies.

A common psychological trap you’ve probably fallen into is the sunk cost fallacy. It’s also known as the Concorde fallacy.

The sunk cost fallacy is when we continue with an endeavour even though it makes us unhappy or worse off because of the time, money, and effort – i.e. the ‘sunk costs’ – we’ve already invested in it.

In other words, it’s every situation in which you’ve ever thought: “But I’ve already come so far…”

The more you’ve invested, the more likely you are to stick with it. In a study of US basketball drafts, higher-paid draft picks were given more playing time than lower-paid ones, even when their performances weren’t up to par.

Interestingly, one study showed the sunk costs we honour don’t have to be real, or even our own. In an imaginary situation where people were told they had accidentally been booked on two trips, one to Cancun ($800 in value) and the other to Montreal ($200), people were more likely to choose Cancun even if they preferred Montreal.

In another scenario, people were told to imagine they felt full but hadn’t finished eating cake at a party. More often than not, they would push to finish it if they were told the cake had been bought from an expensive and faraway bakery.

But why?

Plenty of researchers have proposed explanations for this counterproductive behaviour.

One is simply that we don’t like to be wrong. There’s a mental disconnect between the effort we put in and the lower than expected return on our investment. As a result, we exaggerate the benefits to justify the cost.

For example, there’s a greater chance you’ll give five-star ratings for videos of kittens, dancing and bike crashes the longer you’ve spent waiting for them to load.

We also want to convince ourselves and others that we’re not wasteful or ‘quitters’.

We still go ahead with the sinking ship despite feeling bad about it. One study goes so far as to say that the negative feelings we have about our sunk costs actually drive the behaviour. The researchers found the likelihood of people sticking to their guns was predicted by how bad they felt immediately after making a decision about sunk costs.

We may also fear the regret we might feel by withdrawing our commitment.

Some researchers propose there’s an evolutionary advantage to whatever mechanism is driving the phenomenon since mice and rats also exhibit the behaviour. They too experience ‘regret’ about sunk costs and show reluctance to quit lining up for food the longer they’ve waited.

Looking to the future to avoid getting snagged

The thing is, dwelling on sunk costs keeps us caught up in the past when we should be focusing on present actions and the potential gains from them.

Economic theory suggests that “…decisions should only be guided by future gains and losses, as prior costs do not affect the objective outcomes of current decisions.”
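That rule can be sketched as a trivial decision function (a minimal illustration, not drawn from any cited study). The sunk cost is accepted as an argument but deliberately plays no part in the decision:

```python
def should_continue(future_benefit, future_cost, sunk_cost=0):
    """Rational rule: weigh only future gains against future losses.
    sunk_cost is accepted, but deliberately ignored."""
    return future_benefit > future_cost

# Having already sunk $800 doesn't change the rational answer:
should_continue(future_benefit=50, future_cost=100, sunk_cost=800)  # False
should_continue(future_benefit=50, future_cost=100, sunk_cost=0)    # False
```

The fallacy, in these terms, is letting `sunk_cost` leak into the comparison – exactly what the studies above show people (and rats) reliably do.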

We ‘feel’ our way into the trap, but we can think ourselves out of it.

Reflecting when we’re not mentally overwhelmed helps us to avoid the sunk cost fallacy. People who justify their decisions feel less bad about poor investments and are less susceptible to the sunk cost fallacy than people who don’t justify them.

And speaking of thinking (and perhaps overthinking): people high in neuroticism are less prone to the sunk cost fallacy. This is because they tend to avoid stressful situations by engaging in tasks other than the one responsible for the stress. On the other hand, people low in neuroticism experience less stress, resulting in less incentive to change their behaviour and greater chance of continuing with bad investments.

When things go wrong, Western society has a tendency towards action – to do something rather than not. In fact, we have a tendency to commit even more resources the more desperate the situation is.

But researchers found we can prevent this by reframing how we think. We are less likely to commit more resources to a failing venture if we are taught to think of committing more resources as ‘inaction’, and of choosing not to commit as an ‘action’ in itself.

I don’t know about you but changing the way I think about a sinking ship isn’t such a titanic effort when I imagine doing nothing and being buried in the heart of the ocean!


Links and stuff