Is it just me, or is it the Barnum Effect?

Anthropology / Myths / Psychology

Let me tell you about yourself: You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. While you have some personality weaknesses, you are generally able to compensate for them. You have a great deal of unused capacity which you have not turned to your advantage.

Neither they, nor I, can really read your mind. Photo by Scott Rodgerson on Unsplash.

 

Did you read the above statements and think they were eerily accurate?

Here’s the catch – they weren’t written about you.

These generic descriptions, called Barnum statements, could be about anyone regardless of their age, gender or nationality. If you were fooled, you wouldn’t be alone.

The unsuspecting millions

The phenomenon you just witnessed underlies the enduring popularity of astrology, psychics, tarot readers and faith healers – despite what scientists have to say.

Roughly 90% of surveyed people know their star sign. Most identify with the broad personality traits ‘bestowed’ by their Zodiac. Sagittarians, for example, agree they’ve been incurably bitten by the travel bug, while Taureans feel entitled to excuse their stubbornness on account of their star sign. Yet if you give people 12 unlabelled Zodiac personality descriptions, the probability that they correctly pick their own is no better than chance.
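That ‘no better than chance’ result is easy to sanity-check with a quick simulation. A minimal sketch, assuming each person simply guesses blindly among 12 unlabelled descriptions (the function name is mine, not from any study):

```python
import random

def pick_own_description(trials=100_000, signs=12):
    """Simulate people picking one of `signs` unlabelled Zodiac
    descriptions at random, and return how often they happen to
    land on their own (slot 0 by convention)."""
    hits = sum(1 for _ in range(trials) if random.randrange(signs) == 0)
    return hits / trials

# Chance performance is 1/12, i.e. roughly 8.3% -- about what
# people manage once the star-sign labels are removed.
print(round(pick_own_description(), 3))
```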

You’re so vain; you probably think this song is about you

The Barnum effect refers to the tendency to believe vague personality descriptions that are presented as personalised, when in fact they could apply to almost anyone. It is otherwise known as the Forer effect, after the American psychologist who conducted the first scientific study of the phenomenon.

In 1948, Professor Bertram R. Forer gave unsuspecting first-year psychology students a personality test. (Read the original paper here.) A week later, he gave each of them an ‘individual’ personality description, supposedly based on their answers, comprising a series of ambiguous statements. When asked how accurately the description applied to them, the students gave an average rating of over 4 out of 5. In reality, however, they had all received an identical description cobbled together from a news-stand astrology book.

Forer and others also showed that the more accurately you rate an external description of yourself, the more strongly you believe in the validity of the person or system dispensing the assessment. Fifty million people in the US and UK believe that Zodiac horoscopes accurately predict their personal future.

In other words, if you truly believe you’re stubborn because you’re a Taurus, you’re also more likely to believe next month’s horoscope that you’ll hit a lucky streak in love or in your job.

What makes Barnum statements so believable?

The Barnum Effect gets its name from the winning formula used by the famous showman Phineas Taylor Barnum: include a “little something for everyone”. We get sucked into believing Barnum statements because they are vague, and often double-headed enough to cover all bases.

Consider the statement: “At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.” It seems believable because everyone’s either an extrovert or introvert, and it’s unlikely that you’ll always be one or the other all the time!

Barnum statements seduce us because they also include generally favourable statements that flatter our ego. Most people are less likely to believe them if more negative elements of their personality are described.

Other factors that can make you more susceptible are your belief in the authority of the person giving the assessment, and how much detail you provided in the first place.

Guarding against gullibility

Don’t like being duped?

The best defence against swindlers who use Barnum statements is a healthy dose of skepticism. Some might say being educated is enough. But even among people with the same level of education, those who exercise greater logical thinking skills are less prone to having the wool pulled over their eyes.

After all, what are the chances that all Taureans around the world will simultaneously get a raise next month?

Links and stuff

 

Why you (probably) can’t multitask

Myths / Psychology

How many different things are you trying to do right now? Reading this blog post while listening to a podcast, posting on Instagram and checking email? It’s easy to think you’ll be more productive by doing lots of things at once. But it turns out for almost all of us, effective multitasking is an illusion.

A different take on driving and texting. Photo by Michiel Gransjean via Flickr.

You’re not really multitasking

With the possible exception of texting while driving, you probably think multitasking is something to aspire to. But the reality is, trying to do multiple things at once actually slows you down and leads to more mistakes. We are simply no good at trying to engage with more than one decision-making process at a time.

In fact, experts tell us even the term multitasking is wrong: we should call it task switching. What feels like multitasking is actually rapid switching between tasks, and researchers suggest the accumulated switching time can cost us up to 40% of our productivity.

Think it only takes a few seconds to jump between email and the report you urgently need to write? No big deal unless you are switching every time you get a new email notification. And how long does it take you to get to the point where you are writing effectively again? Research shows it takes an average of 23 minutes. In some cases, study volunteers never got back into the flow of writing.
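To see how that switching time adds up, here’s a back-of-envelope sketch. The 23-minute refocus figure comes from the research above; the number of interruptions per day and the function itself are purely my illustrative assumptions:

```python
def refocus_overhead(interruptions_per_day, refocus_minutes=23,
                     workday_minutes=8 * 60):
    """Fraction of the working day lost to regaining focus,
    assuming each interruption costs a full refocus period."""
    lost = interruptions_per_day * refocus_minutes
    return min(lost / workday_minutes, 1.0)

# Even eight interruptions swallow 8 * 23 = 184 minutes --
# over a third of an eight-hour day.
print(f"{refocus_overhead(8):.0%}")
```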

In another study, volunteers were asked to memorise either a 7-digit or single-digit number while eating. Those who had the more challenging memory task were less aware of the flavour of the food they were eating. At the same time, they ate more of both the sweet and salty food on offer.

Some tasks are easier than others

If multitasking is so hard, perhaps you’re wondering why you can read this blog, eat your lunch and listen to music all at once.

How easy it is to juggle tasks depends on how engaged your prefrontal cortex is during the activity. Natural behaviours like walking, talking and eating don’t take a lot of brain effort so we can do those things at the same time as paying attention to something else.

Having said that, talking on your mobile while driving increases your risk of having an accident four-fold. It’s been shown to be equivalent to driving with a blood alcohol reading of about 0.08.

When 56,000 people were observed approaching an intersection in their cars, those who were talking on their mobiles were ten times less likely to actually stop at the stop sign.

Even walking while talking on your mobile takes up most of your brain’s attention. Only a quarter of people doing both noticed when a unicycling clown, planted by researchers, rode past them.

And as soon as there are multiple tasks which require thinking, it becomes clear our brains simply aren’t wired for multitasking.

There are supertaskers among us

But all is not lost. Research has uncovered a small proportion of the population — about two percent — who have extraordinary multitasking abilities. Imagine simultaneously doing a driving test, solving complex maths problems and doing memory tests on your phone.

People who can easily do that have been called supertaskers and they have brains that actually become less active the more tasks they are trying to do at once. And that’s not all. Supertaskers do better, not worse, at each individual task the more simultaneous tasks they are doing.

The irony is that as soon as we hear that supertaskers exist, 90% of us decide we belong in that 2%! But research has shown that the people who multitask the most and are the most confident in their multitasking abilities tend to actually be the worst at it.

The cost of multitasking

Assuming, like me, you are not a supertasker, it’s worth considering what our attempts at multitasking cost us.

Research suggests the more we attempt to multitask, the more we are training ourselves not to focus. We are effectively teaching ourselves that something unknown – an unread email or the next notification – is always more worthy of our attention than whatever task we are meant to be working on.

We sacrifice focus in order to ensure we don’t miss any unexpected surprises, but in the process we lose our ability to block out distractions. And that means we’re not very good at getting stuff done.

As Cal Newport says in his excellent book Deep Work,

“Efforts to deepen your focus will struggle if you don’t simultaneously wean your mind from a dependence on distraction.”

Links and stuff


The reality of the Uncanny Valley

Anthropology / Evolution / Myths / Psychology

Technology has made making extremely human-like robots a reality. But is this a good thing? Tara Bautista delves into the unease that many people experience with things that are almost – but not quite – human. Welcome to the ‘Uncanny Valley’.

Feeling weirded out? Me too. Photo by Franck V. on Unsplash.

When realistic isn’t pleasantly real enough

Last weekend I started watching a new fantasy show on Netflix called The Dark Crystal: Age of Resistance. You could say it’s a cross between Game of Thrones and Jim Henson’s The Muppets.

It’s a great series but it took me three episodes to get over the weirdness of the puppets. The pairing of human voice acting with the puppets’ mechanical facial expressions made for an unsettling experience. (And spoiler alert: I never want to see puppets kiss again).

Turns out I’m not alone in feeling this way.

In 1970, a Japanese robotics professor called Masahiro Mori first predicted the relationship between how human-like something is and our feelings towards it. He argued that at first, we experience increased positive feelings the more human-like the object is. But as something approaches being almost but not perfectly human-like, he predicted a dip in positive feelings and we would experience feelings of discomfort and revulsion.

Professor Mori called the dip the “Bukimi no Tani” – literally translated as “Valley of Eeriness” but more popularly known as the “Uncanny Valley”. Into the valley would fall things like androids (human-like robots), Japanese bunraku puppets, prosthetic hands, corpses and zombies.

The science behind our heebie-jeebies

Up until 2004, the Uncanny Valley hypothesis remained controversial and little studied. Nowadays it’s gaining acceptance as more scientific evidence backs up its existence.

In 2015, a landmark study reproduced Professor Mori’s predicted rise-dip-rise when participants rated the likeability of 80 real-life robot faces (ranging from clearly robot to android) as well as digitally-composed robot faces. The researchers showed the Uncanny Valley effect even influenced participants’ decisions about robots’ trustworthiness.

(Fun fact: Pixar won its first Oscar for its CGI short film Tin Toy in 1988. However, audiences were so creeped out by the human baby character that the studio initially focused on stories with non-human characters, as in Toy Story and Cars).

But why do we have negative feelings about almost-perfect human representations?

Firstly, the existence of a similar phenomenon in monkeys suggests there’s an evolutionary basis for it. Macaques are frightened by, and avoid eye contact with, realistic but imperfect digital images of other monkeys. Like humans, they are more at ease with images of real monkeys and with representations that are obviously unreal.

An early, but largely discredited, hypothesis about the uncanny valley is that we’re generally uncomfortable with things we can’t easily categorise. In the case of androids: are they human, or non-human?

However, the most likely explanation is the perceptual mismatch hypothesis. Because they look so human, we can’t help but want to judge them by human standards. But our brains pick up from their movements and responses that they’re not human.

In a 2011 study, the brain activity of 20 participants was monitored as they watched videos of a human, an android and a robot performing the same everyday actions, such as drinking water from a cup or waving. The largest difference in brain responses between the three types of videos occurred when watching the android.

The researchers interpreted their findings to mean that our brains have no trouble processing a human moving like a human, or a robot moving like a robot. But the brain lights up with an error message when there’s a mismatch, as with the human-like appearance but robotic motion of androids.

The expectation that androids would move like us may be why the internal brain regions that help us move smoothly are activated when watching the slightly jerky movements of an android. One such region is the subthalamic nucleus (STN), which is affected in movement disorders such as Parkinson’s disease. In fact, it’s been suggested that our brains mistakenly interpret androids as people with Parkinson’s.

Our discomfort with perceptual mismatches also applies to emotions in artificial intelligences.  We’re spooked when something we know to be controlled by a computer appears to show spontaneous empathy like a real person.

In sum, scientists think that because androids fail to meet our expectations of a normal healthy human, we have an averse response.

Towards getting un(real)?

So, how should we account for the uncanny valley in an era where androids are being developed to become our caregivers, companions and even priests?

One solution is simply to design robots that are not too human-like. Think adorable Disney and Pixar robots like Wall-E, or Baymax in Big Hero 6. (Relatedly, Pixar finally got the formula right for non-threatening animated human characters with The Incredibles).

Or perhaps it’s just a matter of exposure and time (three episodes?) before the valley becomes a plain.

Links and stuff

The who, why and how of Stonehenge

Anthropology / History / Myths

Legend has it Stonehenge was created by the wizard Merlin with the help of giants. Meanwhile, scientists have been studying the monument for centuries. What do we know about who built Stonehenge, and why?

An ancient observatory, a healing place, or a landing pad for alien spaceships? Image credit FreeSally via Pixabay

Giants and aliens

Stonehenge, located on Salisbury Plain, is the best-known prehistoric monument in Europe.  It was built around 5,000 years ago by people who left no written record. As a result, it’s become one of our favourite mysteries: there are records of claims about its origin as early as the 12th century.

The Merlin Hypothesis, no longer as popular as it was in the 14th century, has been joined by a swag of other theories (some more plausible than others) about who built it, how and why. Given the incredible effort involved in creating Stonehenge – it’s been estimated building took more than 20 million hours – it seems certain Stonehenge was sacred.

More than 50,000 cremated bone fragments from at least 58 people have been found at Stonehenge so it certainly served as a cemetery. Because many of the skeletons buried nearby show signs of illness and injury, it’s also been suggested Stonehenge was considered a healing place.

Other theories argue Stonehenge was an ancient hunting and feasting site, or a long-distance communication system. Stonehenge has incredible acoustics, which give the impression of having been architecturally designed.

It’s also been suggested Stonehenge creates a passage through the underworld, given the way the moon is viewed on the horizon through a stone avenue at certain times. In 1964, an astronomer proposed that Stonehenge was an ancient observatory used to predict the movement of the sun and stars, including eclipses.

It won’t surprise you to know there have also been alien theories: to some, Stonehenge looks like an ideal landing pad for alien spaceships.

Build me a monument

One of the fascinating things about Stonehenge is research suggests the building process took place over multiple stages, possibly over as many as 1500 years. So maybe it meant different things to different people over that time.

A Stonehenge puzzle we believe we’ve solved is the question of where the stones came from. The biggest blocks of sandstone – weighing up to 30 tons – almost certainly came from about 32km north of Salisbury Plain. That’s an impressive distance to move such massive objects.

More extraordinary: we know the 42 smaller bluestones forming the inner circle came from the Preseli Hills in western Wales, a whopping 230km away.

Researchers are confident they’ve even managed to identify the exact quarries where the stones came from. They’ve found evidence of quarrying at the sites from the right time period, including platforms at the base of rock outcrops. These platforms may have served as loading bays for the stones when they were lowered onto wooden sledges for transport. It’s also been proposed the stones we now know as Stonehenge were first built into a monument in Wales, before later being dismantled.

Over the years there have been many theories as to how the stones were brought to Salisbury Plain. One suggestion – now discredited – was the stones were brought by glacier movement, rather than by people. Other theories involve transport by boat, either on rivers or all the way around the coast.

But the latest research suggests the stones were probably dragged by people on wooden sleighs across ramps made of branches. A reenactment in 2016 using only rope, wood and stone tools showed that ten people were able to drag stones the size of the largest stones at Stonehenge about 1.6km per hour. Exhausting and slow, no doubt. But definitely possible.

Who built it?

Who dragged such enormous stones across the countryside? One of the challenges we’ve faced in understanding more about Stonehenge is the fact that the human remains buried beneath the stones were cremated. The cremation process means many of the standard dating techniques researchers use – for example studying the tooth enamel – don’t work.

But exciting research published last year did manage to analyse the chemistry of the bone fragments. The foods these people ate in the decade or so before they died left a ‘chemical signature’ in their bones. The researchers then set about matching these signatures with the chemistry of the surrounding areas.

The results tell us that although some of the people buried at Stonehenge were almost certainly local, up to 40% came from a long way away. Where from? The area they most likely came from includes the region of Wales where the bluestones were quarried.

It’s possible these people were even cremated in Wales and their remains were brought with the bluestones themselves. But at this stage we can’t be sure of the relationship between these outsiders and the stones. There’s still plenty we don’t understand about Stonehenge and the mystery is likely to keep us entranced for many years to come.

Let’s hope the current battle over how to deal with traffic on the nearby A303 road is resolved. Neither major traffic jams nor the potential building of a tunnel in a World Heritage site sounds ideal.

Links and stuff

Songs on repeat

Myths / Psychology

We all do it: find a new favourite song that then gets heavy rotation on our playlists. Mine’s currently Postmodern Jukebox’s ’50s prom-style cover of Closer by The Chainsmokers. Tara Bautista on the science behind why we play songs on repeat and what happens when we do.

Why we smash that replay button. Photo by Bruce Mars on Unsplash.

 

Play it again, Sam

Sixty percent of participants in a study last year admitted they immediately re-listen to their favourite song, some even up to four times in a row.

But what makes a song worthy of replay and why do we do it? One answer is the emotions they make us feel.

A study by the University of Michigan revealed three broad categories of favourite songs and why people were drawn to them.

People whose favourite songs were happy and energetic liked the beat whereas calm songs prompted listeners to reflect and gain perspective. But the most listened-to songs by far were the bittersweet ones. Unsurprisingly, the greater your emotional connection to a song, the more you hit replay.

Certain songs become our favourites because they resonate with us. Listening to our favourite music repeatedly can reinforce our identities, argues Kenneth Aigen, who heads the music therapy program at NYU.

Another answer is that repetition increases how much we like things.

In the 1960s, Robert Zajonc first described a phenomenon called the ‘mere exposure effect’: we tend to like things – be they pictures or melodies – more the second or third time we come across them.

We rate songs familiar to us as more likeable than unfamiliar ones. Since our brains are faster at processing the sounds during re-listens, we are ‘tricked’ into thinking the ease of recognition is a positive thing. Familiarity also increases our emotional engagement with the music.

What’s more, repetition and familiarity with the music allows us to actively participate and not just process the sounds.

Professor Elizabeth Margulis, author of On Repeat: How Music Plays the Mind, says “Musical repetition gets us mentally imagining or singing through the bit we expect to come next.”

Indeed, brain imaging studies show that other parts of the brain besides those which process sound light up when we listen to familiar music. These include brain regions that anticipate where the song is going and those usually associated with body movement. Our mind effectively ‘sings along’.

Neuroscientists have also described two stages when we listen to music that gives us the chills. The brain’s caudate nucleus is excited during the build up to our favourite part. When the peak hits, another part called the nucleus accumbens releases the feel-good brain chemical dopamine.

 

Can’t Get You Out of My Head

There are songs you deliberately play again, and then there are earworms.

Earworms, known in the research literature as involuntary musical imagery, are short bits of music that we hear on a seemingly endless loop in our heads. They’re usually songs we’re familiar with, and they can be set off by mental associations. For example, parents of toddlers may see the word ‘shark’ and automatically hear ‘doo doo doo doo’.

Turns out you’re more likely to experience earworms if you listen to music and sing on a daily basis – and also if you show obsessive-compulsive traits and score highly on openness to experience.

Psychologists analysed data from 3000 participants who were asked which songs gave them earworms. Popular and recent songs were among the top ten, with Lady Gaga’s Bad Romance taking out top spot.

The researchers then analysed MIDI files of the reported earworm-inducing songs to see what features they had in common. On average, the songs were upbeat, with melodies that were familiar yet unusual in the intervals between notes. In other words, the melody has to sound like something we’ve heard before, but be different enough to remain interesting and memorable.

I Will Always Love You (… or not)

You’ve listened to it more than a dozen times now, but suddenly your favourite new jam isn’t such a hit. What gives?

Most pieces of music that we like initially, but are then overexposed to, follow the classic Wundt curve: the pleasantness of an experience increases over the first few exposures but then decreases with further exposure. (Pharrell Williams’ Happy, anyone?)

As our brains figure out the complexity of the song, the novelty wears off and we become bored.
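The Wundt curve can be sketched as a simple inverted-U. The functional form below is a toy assumption of mine (enjoyment peaking around the fifth listen), not something from the research:

```python
import math

def wundt_pleasantness(exposures, peak_at=5.0):
    """Toy inverted-U (Wundt curve): enjoyment rises over the
    first few listens, peaks around `peak_at` exposures, then
    declines as the novelty wears off."""
    x = exposures / peak_at
    return x * math.exp(1 - x)  # maximum of 1.0 at x = 1

# Ratings rise toward listen 5, then tail off as the song
# overstays its welcome.
print([round(wundt_pleasantness(n), 2) for n in range(1, 11)])
```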

This might be why more complex pieces of music like Bohemian Rhapsody by Queen (coincidentally number seven in the list of earworms) have longer-lasting popularity, according to music psychology expert Dr Michael Bonshor.

Although simpler song structures are catchier in the short-term, more complex layers of harmony, rhythm and vocals provide more opportunities to keep our brains interested.

And as for getting rid of earworms?

Some researchers suggest singing another song, like God Save the Queen, or Karma Chameleon.

But then again, karma karma karma karma karma … d’oh!

Links and stuff

How super is your superfood?

Health / Myths

Chia seeds, goji berries, kale, blueberries, turmeric, green tea, quinoa, açaí (which I’ve now learned is pronounced ‘ah-sah-ee’). There’s no shortage of so-called superfoods. But what does that term mean, and are superfoods worth your money?

So much healthy goodness….all in one bowl. Photo by Chef Mick (Michaelangelo) Rosacci via Flickr.

All in a name

Need to lose weight? Want to slow down ageing? Want to minimise your risk of cancer and other diseases? These days you could be forgiven for thinking all you need to do is eat the latest miracle superfood and your ongoing good health is guaranteed.

Superfoods are marketed for their high nutritional value and extraordinary health benefits and the marketing appears to work. A UK survey found 61% of people bought particular foods because of their superfood status. The study also found people are willing to pay more for superfoods. Research suggests we see superfoods as somewhere between food and medicine and that the higher your socioeconomic status, the more likely you are to eat superfoods.

It pays to know there is no scientifically-agreed meaning of the term ‘superfood’: it’s not a legal or regulatory term like ‘organic’ or ‘fair trade’. Superfood is a buzzword created to make us believe a particular food has crossed some nutritional threshold of goodness. But it hasn’t.

Interestingly, using the term superfood on food packaging or in marketing isn’t regulated in Australia or the United States. But there have been very tight restrictions around its use in the European Union since 2007.

What’s so super?

One reason superfoods are so alluring is because we know modern western eating habits are often less-than-ideal. We’re well aware of the steadily rising rates of obesity, type 2 diabetes, heart disease and other diet-related conditions. Foods and diets that appear to be more natural, primitive, traditional and authentic are inherently appealing.

It’s no coincidence that many superfoods have a history of traditional use by ancient or indigenous communities. For example, maca powder – often advertised as an ‘Incan superfood’ – comes from the root of a Peruvian vegetable and is said to improve energy, stamina, fertility and sex drive.

Australian researcher Jessica Loyer describes the packaging of one popular brand of maca powder: women in traditional Andean clothing picking vegetables by hand and placing them in woven baskets. In reality, men, women and children wearing jeans all harvest maca into plastic bags, using tools and in some cases, modern machinery.

Another incredibly appealing aspect of superfoods is the notion that by adding superfoods to your diet, you can compensate for your other bad habits. No need to change your doughnut-a-day habit if you add goji berries to your muesli. The muffin you had at morning tea is healthy because it was packed full of blueberries.

And although superfoods may be advertised as green, sustainable products, unfortunately the opposite can be true. As demand for these superfoods increases, so do the potential profits: it’s no wonder some farmers are abandoning their traditional production methods. But the resulting monocultures, soil degradation and increased use of fertilisers and pesticides are resulting in serious environmental and ecological problems.

What’s the evidence?

Why do some foods end up being touted as superfoods? Often because research has shown a particular component of that food does have specific health benefits. But the problem is the evidence generally comes from animal trials or lab experiments that don’t well represent the complexities of the human body. Just because a particular ingredient kills cancer cells in a lab experiment doesn’t mean eating lots of food containing that ingredient will stop you getting cancer.

There are experiments on people, but they tend to test the effect of very high amounts of a particular ingredient over a short period. For example, an ingredient in cinnamon, cinnamaldehyde, has been shown to reduce cholesterol in people with diabetes. But as nutrition research scientist Emma Beckett has calculated, you’d need to eat about half a supermarket jar of cinnamon each day to get the same amount of cinnamaldehyde as in the study.

It’s a similar story with blueberries. Blueberries (like red wine) contain resveratrol, a compound claimed to reduce the risk of heart disease, Alzheimer’s, cancer and diabetes. But to get enough resveratrol for the potential benefits, you’d need to eat roughly 10,000 blueberries a day. That would be both time-consuming and expensive!

So what should we eat?

There’s plenty of information out there on what constitutes a healthy diet. And if you can afford it and like the taste, there’s no reason not to include some ‘superfoods’ in your diet. But if kale doesn’t do it for you, you can also get essentially the same nutrients from cabbage, spinach, broccoli or brussels sprouts. Nutritionists tell us variety is the key to a healthy diet.

Personally, I like the idea of the rainbow diet – making sure I eat fruits and vegetables of a variety of colours. And I reckon Michael Pollan’s adage is worthy of its fame: ‘Eat food. Not too much. Mostly plants’.

Links and stuff

 

The low down on the sunk cost fallacy

Evolution / Myths / Psychology

Waiting in line at the grocery store, you hesitate to move to another counter when it opens up. You head to a movie you’ve already paid for, even though you’re no longer looking forward to seeing it. You stay in a career or relationship that no longer ‘sparks joy’. Tara Bautista asks why we cling on when we should let go.

Why do we choose to stay on board? Photo by Jack Blackeye on Unsplash.

Going down with the ship

Humans are weird creatures. We prize our supposedly unrivalled ability to tackle problems and reach decisions through logic and reason. Yet we often fall prey to logical fallacies.

A common psychological trap you’ve probably fallen into is the sunk cost fallacy. It’s also known as the Concorde fallacy.

The sunk cost fallacy is when we continue with an endeavour even though it makes us unhappy or worse off because of the time, money, and effort – i.e. the ‘sunk costs’ – we’ve already invested in it.

In other words, it’s every situation in which you’ve ever thought: “But I’ve already come so far…”

The more you’ve invested, the more likely you are to stick with it. In a study of US basketball drafts, higher-paid draft picks were given more playing time than lower-paid ones, even when their performance wasn’t up to par.

Interestingly, one study showed the sunk costs we honour don’t even have to be real, or even our own. In an imaginary situation where people were told they had accidentally booked two trips, one to Cancun ($800 in value) and the other to Montreal ($200), people were more likely to choose Cancun even when they preferred Montreal.

In another scenario, people were told to imagine they felt full but hadn’t finished eating cake at a party. More often than not, they would push themselves to finish it if told the cake had come from an expensive, faraway bakery.

But why?

Plenty of researchers have proposed explanations for this counterproductive behaviour.

One is simply that we don’t like to be wrong. There’s a mental disconnect between the effort we put in and the lower than expected return on our investment. As a result, we exaggerate the benefits to justify the cost.

For example, there’s a greater chance you’ll give five-star ratings to videos of kittens, dancing and bike crashes the longer you’ve spent waiting for them to load.

We also want to convince ourselves and others that we’re not wasteful or ‘quitters’.

We still go down with the sinking ship despite feeling bad about it. One study goes so far as to say that the negative feelings we have about our sunk costs actually drive the behaviour: the researchers found that the likelihood of people sticking to their guns was predicted by how bad they felt immediately after making a decision about sunk costs.

We may also fear the regret we might feel by withdrawing our commitment.

Some researchers propose there’s an evolutionary advantage to whatever mechanism drives the phenomenon, since mice and rats also exhibit the behaviour. They too experience ‘regret’ about sunk costs, and the longer they’ve waited in line for food, the more reluctant they are to quit.

Looking to the future to avoid getting snagged

The thing is, thinking about sunk costs is being caught up with the past when we should be focusing on present actions and potential gains from them.

Economic theory suggests that “…decisions should only be guided by future gains and losses, as prior costs do not affect the objective outcomes of current decisions.”
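That principle can be made concrete with a toy calculation – the numbers below are hypothetical, loosely echoing the Cancun/Montreal scenario above. A rational chooser compares only the future payoffs; the money already spent is gone either way.

```python
# Toy illustration of ignoring sunk costs (all values are made up).
sunk_cancun = 800    # already paid for the Cancun trip -- unrecoverable
sunk_montreal = 200  # already paid for the Montreal trip -- also unrecoverable

# Expected enjoyment of each trip on a 0-10 scale (hypothetical).
enjoyment = {"Cancun": 5, "Montreal": 8}

# The sunk costs never enter the decision: we simply pick the
# option with the higher future payoff.
choice = max(enjoyment, key=enjoyment.get)
print(choice)  # Montreal
```

The sunk cost fallacy is, in effect, letting `sunk_cancun` sneak back into a comparison where it no longer belongs.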

We ‘feel’ our way into the trap, but we can think ourselves out of it.

Reflecting when we’re not mentally overwhelmed helps us to avoid the sunk cost fallacy. People who justify their decisions feel less bad about poor investments and are less susceptible to the sunk cost fallacy than people who don’t justify them.

And speaking of thinking (and perhaps overthinking): people high in neuroticism are less prone to the sunk cost fallacy. This is because they tend to avoid stressful situations by engaging in tasks other than the one responsible for the stress. On the other hand, people low in neuroticism experience less stress, which gives them less incentive to change their behaviour and a greater chance of continuing with bad investments.

When things go wrong, Western society has a tendency towards action – to do something rather than not. In fact, we have a tendency to commit even more resources the more desperate the situation is.

But researchers found we can prevent this by reframing how we think. We are less likely to commit more resources to a failing venture if we are taught to think of committing more resources as ‘inaction’, and of not committing as an ‘action’ in itself.

I don’t know about you but changing the way I think about a sinking ship isn’t such a titanic effort when I imagine doing nothing and being buried in the heart of the ocean!

 

Links and stuff

The truth about early birds

Anthropology / Biology / Evolution / Genetics / Health / Myths / Psychology

We all know the saying ‘The early bird catches the worm’. But are there any real benefits to being a morning person? What determines whether you’re an early bird or a night owl anyway? And can night owls become early risers if they want to?

Embracing the dawn. Photo by Pablo Heimplatz on Unsplash.

Tick tock

Your body has an internal clock, located at the base of your brain in the hypothalamus. You’ve probably heard the term circadian rhythm: this is the natural sleep and wake cycle of animals, which is synced to the Earth’s 24-hour cycle.

Unlike nocturnal animals, people are generally awake during the day. But there are differences in when we prefer to sleep. Your preferred sleep and wake times are your chronotype. Most people start their lives as early birds – many of us have personally experienced how early babies tend to wake up. But wake times usually shift later as we age, with teens notorious for their late nights and long sleep-ins.

According to the sentinel hypothesis, a tribe of humans with staggered sleep schedules had an evolutionary advantage: there was always someone wide awake and ready to stand guard. Today, some of us are early risers, some of us late risers, and many fall somewhere in between.

All in the genes?

We’ve long known genes play a role in determining chronotypes, but a big study published last year made clear how complex the link is. Researchers studied the genomes of nearly 700,000 people in the US and UK and identified 351 areas in the genome that contribute to whether someone is an early riser.

But people who are genetically most likely to be early risers only wake on average 25 minutes earlier than those who aren’t, so there are clearly a lot of other factors at play too.

A study published just last week found that one in 300 people has what’s called Advanced Sleep Phase, waking naturally well before dawn (between 3 and 5am). The researchers found for roughly one in 500 people, this super early rising runs in the family. These numbers are higher than we previously thought and also suggest a genetic role. My Dad and I are classic examples of this condition. People find it hard to believe but I wake up at 3:30am without an alarm.

The benefits of being an early bird

Although we now understand that being a night owl has nothing to do with laziness, there are definitely some costs that come with staying up late.

One of the main problems is now known as social jet lag. Normal jet lag is the result of flying to a different time zone and experiencing a mismatch between the time your body thinks it is and the actual time in your new location.

Social jet lag is the mismatch between many people’s body clocks and the waking hours they are forced to keep because of school or work. If your preferred sleep time is 2am to 10am, having to be up at 7am for work isn’t going to do you any favours.

But there are other problems associated with being a night owl. Night owls consume more caffeine, nicotine and alcohol. A large twin study found night owls were much more likely to be current and lifelong smokers.

Morning people also tend to have better mental health. Night owls are at greater risk of suffering both depression and schizophrenia.

A big international review of studies found evidence that night owls may be at higher risk of heart disease and type 2 diabetes. The study found those who stay up later at night tended to have more erratic eating habits (for example, missing breakfast) and ate more unhealthy food.

Research suggests there are also personality differences between morning and night people. Morning people tend to be more proactive, persistent and cooperative, whereas night owls tend to be more creative and more willing to take risks.

Can you make the switch?

If you’re a night owl who’d like to get up earlier, you probably can if you’re willing to change your habits.

A recent study of young people whose average preferred sleep time was 3am to 10.30am showed it’s possible to substantially shift sleep times in the space of only one month.

These study volunteers were asked to follow a series of rules: get up two to three hours earlier, have breakfast as soon as they woke up, avoid caffeine after 3pm, maximise the amount of time spent outdoors in morning light, and not sleep in on weekends. Amazingly, all the participants stuck to the program – quite possibly because they began to feel much better: less sleepy, stressed, anxious and depressed. Instead of feeling at their best in the evening, they now hit their peak in the mid-afternoon.

Other research has shown that when you take people camping and deny them access to all artificial light other than a campfire, their body clocks quickly change. Within a week, former night owls feel sleepy much earlier in the evening and wake with dawn.

If you’re a night owl who can’t or doesn’t want to make the switch, take heart: if you ever move to Mars, you’ll probably cope far better with that planet’s 24-hour, 39-minute solar day. Extreme morning types like me are likely to find it much harder to adapt to the 40-minute shift.

Links and stuff

Emojis at face value

History / Myths / Psychology

If you’re reading this, chances are you’re familiar with emoji and have used them in an 📧, or text. How did this visual ‘language’ come about and how will it impact communication in the future? Tara Bautista explores the history and 🔬 of emojis.

👍 or 👎 them, emojis are the fastest growing ‘language’ in the world. Photo by Lidya Nada on Unsplash.

Smiley faces

“How r u?”

“I’m OK.”

How many of you have received a reply like this and wondered what OK meant?

It could mean anything from “Life’s all good” to “I’m barely surviving, but there’s no need to call emergency services just yet”.

Now how about:

“I’m OK 😊.”

Isn’t that more reassuring?

The simple inclusion of a smiley face emoji adds a depth of emotion that is often lost in text-based messages. This convenience in communication is the reason why these small pictures were introduced in the first place.

The 📈 and 📈 of emojis

The first widely used set of emojis was created in 1999 for the Japanese phone operator NTT DoCoMo. The word ‘emoji’ is an amalgam of the Japanese words for picture (‘e’) and letter/character (‘moji’).

Creator Shigetaka Kurita noticed that despite the potential of digital communication for bringing people closer, written text didn’t provide cues about feelings and intent that we can pick up from hearing people’s voices or seeing their faces and body language.

Kurita devised 176 simple ideograms made of 12 x 12 pixels, which included faces that conveyed emotion. Other emojis represented things we commonly talk about, like traffic and weather.

Today over 90% of the world’s online population uses emojis. There are currently 3053 emojis approved by Unicode, the governing body of emojis, whose voting members include IBM, Microsoft, Google and Netflix.

(Fun fact: Anyone can propose new emojis but the approval process is lengthy – up to 18 months. A new emoji must satisfy criteria including that many people are likely to use it, and that it conveys something different to existing emoji.)

Such is the power of this visual ‘language’ that the ‘😂’ emoji was chosen as the 2015 Word of the Year by Oxford Dictionaries.
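For the curious: under the hood, each approved emoji is simply one or more Unicode code points, which you can inspect in a couple of lines of Python. A minimal sketch – the code points shown come from the Unicode standard:

```python
# Each emoji is one or more Unicode code points.
laugh = "\N{FACE WITH TEARS OF JOY}"   # 😂, U+1F602
print(hex(ord(laugh)))                 # 0x1f602

# Some emojis are sequences: a base character plus a modifier.
thumbs_up = "\U0001F44D\U0001F3FD"     # 👍 followed by a skin-tone modifier
print([hex(ord(ch)) for ch in thumbs_up])  # ['0x1f44d', '0x1f3fd']
```

This is also why the same emoji can look different across phones: the code point is standardised, but each vendor draws its own picture for it.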

The 👍 case for emojis

Emojis act as an addition or substitute to written language. They fulfil the role that tone of voice, facial expressions and body gestures do in face-to-face and verbal communication.

Emojis may be simple pictures, but they can express a nuanced range of emotions. Consider how you would tell someone you’re happy. There are many options, including 🙂, 😃 and 😄. Even using non-face emojis can help convey your mood.

On the other hand, the same emoji can mean many things in different contexts. For example, 😕 might be taken as confused, neutral, sad or unhappy.

But the popularity of emojis is undoubtedly due to the fact they are easy to use and understand. Their meanings largely transcend different cultures, ages (and operating systems!).

A 2015 survey of 2000 British young people found that 72% agreed that it was easier to express their emotions using emoji.

An interesting example that showcases the universality of emojis is their use in food-related market research. The potential for using emoji to gauge consumers’ preferences was first noted by researchers analysing over 12,000 tweets about meals. In a large number of them, people spontaneously used a range of emojis or emoticons to express their thoughts.

Now, emojis are being used to report the food-related thoughts and feelings of children as young as eight. In one study, children lined up behind the emojis that best reflected their feelings about drinks. In another, they chose emojis that represented their feelings when eating certain foods. A similar ‘select-all-emoji-that-apply’ reporting system has been successfully used with Chinese consumers.

Supporters of emojis suggest they may help us communicate with illiterate or poorly-educated people. In 2018, the mosquito emoji was added at the proposal of malaria researchers who wanted to spread public health messages about malaria prevention.

Not everyone is 😃 about them

For some people, the idea that 😂 is a ‘word’ and emojis are a language is as controversial as the singer Bob Dylan winning the Nobel Prize for Literature.

It is difficult to conduct a deep and meaningful exchange in emoji. This is because emojis are almost all nouns, with few verbs and no adverbs.

One researcher argues that our greatest power as humans is our ability to clearly express complex thoughts. If we rely too much on emoji, he argues, it ‘may push us back towards cave drawings we started with.’

Another opinion is that our increasing dependence on fast, visual-based communication may make us mentally lazy and decrease our ability to think about abstract, long-term and complex issues.

Using emojis can also leave a bad impression when used in formal digital communication.

So, it seems that as with most things, 🤷 … can’t please everyone, right?

Links and stuff

Nature as medicine

Anthropology / Medicine / Myths / Psychology

For years, people have flocked to cities for greater job prospects and convenience. Tara Bautista explores why something in our psyche still yearns to escape the urban jungle and is soothed by the great outdoors.

Forest bathing and other forms of contact with nature improve both mental and physical health. Photo by Luis Del Río Camacho on Unsplash.

A few years ago, when I was struggling with some mental health issues, I realised that the cramped city apartment I was living in was not doing me any favours.

The walls were painted a calming pale green that was meant to bring to mind moss and leaves.  But the tiny Juliet balcony – the only view to the outside – opened up to the not-so-calming interior of a busy car dealership.

Fast forward to months later in my new apartment.

The tall ceilings and increased space are a welcome relief. From my couch, the view through the French-doored balcony is an almost unobstructed vista of the sky. I can easily spend an hour watching the crawl of clouds, bathed in silence.

I believe this change of scene is one factor that helped me in my healing.

In fact, my experience parallels a famous study: patients who underwent gallbladder surgery healed faster and had fewer complications when they had a room overlooking trees as opposed to a wall.

Cottoning on to the benefits of going au naturel

In 1984, renowned biologist EO Wilson coined the term ‘biophilia’ to describe the human urge to connect with nature and other forms of life.

Two years earlier, the Japanese government introduced a new form of therapy as part of their national health program. ‘Shinrin Yoku’ or forest bathing involves people slowly meandering through a forest and taking in the atmosphere – with clothes on, of course. They emerge from forest bathing feeling calmer and more mindful.

Similar nature-as-therapy programs have been introduced in other countries including Scotland, with good scientific reason. Forest bathing decreases anxiety and blood pressure by activating the part of the nervous system that puts your body in ‘rest and digest’ mode.

Even going for a run or walk outside in nature is associated with feeling more revitalised and energetic than if you’d done the same thing on a treadmill.

Reconnection with nature also has tangible effects on physical health according to a 2018 study that looked at combined data from 290 million participants.

The mega study found that living in greener urban areas is associated with a reduction in stress hormones, as well as lower risk of high blood pressure, cardiovascular disease, obesity, diabetes, asthma hospitalisation and early death. Spending time in greener urban areas also lowers the risk of premature birth and rates of short-sightedness in children.

More than just something in the air

A study of 20,000 English people found that you need exposure to nature for at least two hours a week – not necessarily in one block of time – to experience improvements in health and wellbeing.

But there isn’t one simple explanation for nature’s healing power.

One theory is that being in nature exposes us to different air. People living in cities are exposed to a lower variety of germs than people who spend time in natural environments. Being in nature exposes us to a more diverse microbiome that may prime our immune systems to heal ourselves faster, for example by producing more immune cells to stave off infections.

But what about the mental side of things?

The Attention Restoration Theory suggests that we have limited ability to focus attention before we feel mentally fatigued. Taking a short break in natural environments away from our busy urban lives helps restore this ability.

Another hypothesis is that we have evolved to feel less stressed in non-threatening, natural environments, since they provided our ancestors with optimal conditions for survival essentials like food.

Lastly, the visual stimulation and silence of an outdoor setting may help distract us from our worries, since they decrease activity in the prefrontal cortex, the brain region that can get fixated on negative thoughts.

But I live smack bang in the middle of the urban jungle!

What if it’s just not feasible to get away from the bright city lights?

The good news is that you don’t even need to venture into far-flung wilderness to feel the benefits. Access to urban green spaces within a few kilometres of your home, or even exposure via virtual reality, might be enough.

As for me, I’ll be on my couch, curled up with a hot choccy and contemplating Eastern philosophy as the clouds roll by.

“Nature does not hurry, yet everything is accomplished.” – Lao Tzu.

Links and stuff