Damn It, It’s on the Tip of My Tongue



Photo by Juliet Furst on Unsplash

There is a very specific feeling of frustration that comes with trying to recall a word that you know is sitting in your brain somewhere but that fails to drop into your mouth. We refer to it as having a word on the “tip of your tongue”. We might fall into this trap once a week or so in normal life (although it becomes more frequent with age), and around 50% of the time, we manage to get ourselves out of the mess within a minute. We often experience it when a friend asks a question with a familiar answer that we have not heard or used recently, or when we try to find a specific but uncommon word that describes a feeling we are trying to communicate.

Questions with uncommon single-word answers are particularly likely to elicit a tip of the tongue (TOT) phenomenon. Let’s see if any of these questions, which have previously been used to elicit TOT feelings in lab studies, do a good job of getting you into the dreaded TOT mental state (the answers will be at the end of the article):

  • What do you call a word or sentence that reads the same backward and forward such as, “Madam, I’m Adam”?

  • What is the name of the islands off the coast of Ecuador that Darwin visited to study unique species of animal life?

  • What is the order of lower mammals including kangaroos and opossums which carry their young in an abdominal pouch?

  • What is the word meaning favoritism in hiring based on family relationships?

  • What do you call a person who appeals to people’s prejudices, making false claims and promises in order to gain power?

  • What are people who make maps called?

What exactly is our brain up to when we experience a TOT feeling? Brain scans suggest that two key brain areas are particularly active: the anterior cingulate cortex and the right prefrontal cortex. Our anterior cingulate cortex is typically involved in detecting and monitoring mental conflicts. It is a core part of the inner battle between competing options when we encounter a problem. The right prefrontal cortex is involved in working through our memories as we retrieve them, especially when we are not particularly confident that they are correct. It underpins the sense of familiarity but lack of certainty about the solution to a problem when we desperately peruse the contents of our mind in search of the answer.

These brain functions are reminiscent of what happens when a word is on the tip of our tongue. In our mind, we work through multiple conflicting possibilities with similar sounds or meanings as we try to zero in on the target: “is it despotism? Neapolitan… nativism… NEPOTISM!”. There goes the answer to one of the TOT questions I listed above, if you didn’t already think of the word.

We are more likely to find ourselves in a TOT state with emotional words than neutral words. This suggests that emotions are a significant part of our memory recollection process, as we retrieve different clues to what the word may be. Perhaps the emotions themselves are a definitive signal that the word we seek is sitting in our mind somewhere. We know that we know the word, we just can’t quite bring it to the front of our mind. This relates to what scientists call “metacognition”: thinking about thinking.

The metacognitive account of TOT phenomena explains that when we fail to recall a word, we activate our metacognitive processes to estimate whether the word would come to mind if we just thought hard enough. In addition to retrieving any emotional clues, we hunt down related or half-baked bits of information that somehow connect to our target. For example, clues could include syntactic (sentence structure), semantic (meaning), or phonemic (sound) information. As we accumulate some clues, we may cross a threshold that initiates a TOT feeling and encourages us to keep searching rather than give up. Then, if we continue accumulating more clues, we may be lucky enough to cross another threshold that activates the complete target concept and allows us to spit out the word.

To test whether phonemic information is one of the major clues that we use in recalling words, researchers put participants in a TOT state and then tested how well they could eventually recall the correct word. They showed that a list of similar-sounding words helped participants to retrieve the correct target word, rather than interfering with their thought process. TOT states make us extra curious to find out the answer to our problem. Reeling off words that sound like they connect to the camouflaged target may be one good way to pull the word out of the bushes.

When we are trying to carefully count objects, perhaps the number of people in a room, our more annoying friends might whisper distracting, out-of-sequence numbers into our ear to force us into furiously starting again. But when it comes to TOT states, a list of similar words can be helpful. As we play detective and attempt to solve a “what is that word?” mystery, any related information that pops into our head becomes a clue. Our brain activity bounces around these clues in trying to resolve the conflict, and with enough information, it eventually settles upon the solution and gives us a feeling of relief that is difficult to rival.

Answers to the list of TOT questions at the top of the article

  • Palindrome

  • Galapagos

  • Marsupials

  • Nepotism

  • Demagogue

  • Cartographers


The Science of Hypocrisy



This article was a front-page feature on Medium

Photo by Rishabh Butola on Unsplash

For many of us, a huge part of daily conversation revolves around gossip. We love to talk about the blunders and missteps of friends, family, and celebrities. On top of that, news organizations and social networks are like outrage amplifiers because that’s what gets the clicks. We are all used to name-calling in the news, especially when it’s directed at politicians or performers. But there’s one particular name that really gets our attention.

If you want to destroy someone, call them a “hypocrite.”

Hypocrisy typically involves criticizing or condemning the immoral acts of others while engaging in those acts ourselves. This can make us look worse than if we engaged in those immoral acts but didn’t criticize them at all, which might sound odd. But would you rather someone engaged in immoral behavior and criticized it, or engaged in immoral behavior and stayed quiet about it? Diving into the psychology of hypocrisy can help make sense of how we feel about it.

Testing for hypocrisy

An experiment in 2001 aimed to turn people into hypocrites in the lab. Participants were to assign a set of tasks to themselves and an unknown second participant. One type of task was exciting and offered rewards while the other was neutral with no rewards. A coin placed next to the participants had a written instruction explaining that most people believed flipping the coin would be a fair way to distribute the tasks. Indeed, practically all of the participants agreed that flipping the coin to assign tasks would be the most moral thing.

But when it came down to it, only half of them actually flipped the coin, with practically everybody in the non-coin-flipping half giving themselves the exciting tasks. Among the people who did flip the coin — which was labeled “self” on one side and “other” on the other — 85% to 90% still managed to assign the exciting task to themselves. Clearly, either the coin was a magical sycophant or the participants pretended the coin had landed in their favor when it really hadn’t.

People wanted to look fair by using a coin to make their decision, but behind the scenes, they were just as selfish as the people who did not use the coin at all (most of whom had agreed using the coin would be the most fair but didn’t do it). It’s all a perfect example of moral hypocrisy at work.

The drive behind hypocrisy

Self-interest is the most obvious reason for any of us to act like hypocrites. When people are questioned about why they act in conflict with their own stated moral standards, many will say that the personal costs are enough to outweigh the intention to act morally. Essentially, we all want to act fairly until we are put on the spot and facing our own personal consequences. For example, it’s easy to justify many of our unfulfilled wishes to donate to charities and failed inclinations to help a stranger in need by telling ourselves that we just can’t afford to do it right now.

We all want to act fairly until we are put on the spot and facing our own personal consequences.

Our hypocrisy helps us out, that’s for sure. But we also use it in our relationships. Often, when we rate the fairness or morality of other people’s actions, we judge them more harshly than we judge ourselves for the same actions. In a 2007 report on a modification of the exciting task/neutral task paradigm described earlier, participants were afterward asked to judge their own and others’ fairness on a scale from 1 (extremely unfair) to 7 (extremely fair). On average, people scored themselves a 4 but rated others’ fairness at only a 3.

Interestingly, our judgments of other people tend to be far more favorable if those others fall within our in-group (even if it’s a purely arbitrary in-group characterized by a random trait). We often judge an in-group member’s misbehavior to be just as fair as our own. We only have a greater distaste for other people’s bad behaviors when those people fall outside a social circle that we ourselves have drawn.

But why is hypocrisy so distasteful?

We’ve covered what hypocrisy looks like and what motivates it, but we haven’t tackled why we seem to hate it so much. One strong explanation relates to false signaling. In essence, hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t. When we hypocritically condemn someone’s immoral behavior, we disguise our personal misbehavior with a veil of persuasiveness or manipulation. It’s easier to see through an outright lie than a hypocrite’s condemnation. On top of that, a hypocrite has brought another person into the game. Instead of directly denying their immorality, the hypocrite sneakily implies they are good by attempting to shame someone else. This is a recipe for hatred when caught out.

Hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t.

In a set of recent experiments, Yale researchers tested this false signaling theory by giving people stories about different kinds of liars and hypocrites and then studying how people judged the characters within those stories. Four important results came out of these trials:

  1. When a person condemns other people’s behavior and we know nothing else about that person, we typically believe it comes from their moral goodness.

  2. Condemnation of bad behavior is a stronger signal of a person’s moral goodness than claims of personally avoiding bad behavior.

  3. When a person condemns a behavior that they themselves commit (hypocrite), we rate them as significantly worse than a person who says they don’t commit a behavior when they do (liar).

  4. We perceive hypocrites more favorably if they admit to sometimes engaging in the bad behavior than if they make no such admission.

Overall, it backs up the idea that we have a greater tolerance for liars than we have for hypocrites. Hypocrites are like a special type of liar who puts extra effort into disguising their misbehavior and sending us false signals of moral superiority. Those false signals drive our contempt. If a hypocrite is honest about their hypocrisy — if they get rid of false signals by admitting to what they condemn — our view of them can become significantly more favorable.

Perhaps there’s a lesson we can learn here. If we’re going to lie, that’s bad enough; let’s try not to fool and distract other people by pointing the finger. Sometimes, it’s okay to be transparent about our flaws. Nobody is perfect, but honest self-criticism and the ability to admit when we fail to live up to our own standards may be a good foundation for integrity. Hypocrites are terrible people. And occasionally, I’m one of them.


A Hangry Judge Could Ruin Your Life



Photo by Alex Iby on Unsplash

Some decisions are so consequential that the average person is forbidden from making them. To decide the treatment for another person’s disease, or the penalty for a criminal defendant, you need to go through years of careful training and education. As a doctor, you cannot afford to prescribe the wrong drug for a patient’s symptoms. And as a judge, you cannot afford to imprison an innocent person. We all know that these kinds of errors must happen occasionally. But at the same time, we hold certain professions to a higher standard. For that reason, it can often be a shock to discover those role models just being human.

When we are deprived of sleep for a day or two, all of us are capable of making some poor decisions. Our memory and attention capacities take a hit, so our work during the day is likely to be less effective. A lack of sleep increases the chances of a lapse in concentration, which can be disastrous when driving or using dangerous tools. But even outside immediate physical dangers, there may be major ethical consequences to being tired.

Judges often have to try many cases across their day. Their job may not be physically strenuous, but their high mental burden is bound to be exhausting, as I’ve previously described. If judges are as human as the rest of us, you might expect to see their decision-making change in line with their sleep patterns.

If you wanted to investigate the effects of sleep deprivation on real-world legal decision-making, it probably wouldn’t be sensible to ask a group of judges to go without sleep for a day. However, researchers at the University of Washington and University of Virginia thought of a clever way to test the idea without getting in the way of normal judicial proceedings. They made use of a natural change we all go through during the transition to daylight saving time in the spring: on Sunday, we turn our clocks forward and miss one valuable hour of peaceful sleep.

The researchers analyzed court sentences for US citizens between 1992 and 2003, and examined how long defendants were locked away on the Monday after a clock change (“sleepy Monday”) compared with a typical Monday. Alarmingly, they found that sentences were 5% longer overall on sleepy Mondays.

You might find the results above surprising. Could a single hour of sleep really make such a difference? Some researchers have indeed disputed the extent of these sleep deprivation effects among judges. But to be clear, there does seem to be a difference between 6–7 hours of sleep and 7–8 hours of sleep when it comes to general health. During the Sleep Duration Consensus Conference in 2015, 15 experts in the field of sleep science reviewed the available evidence and voted on exactly how much we should be sleeping each night. The panel reached a consensus that 7–8 or 8–9 hours of sleep a night was ideal for optimal health, while 6–7 hours crossed into the suboptimal range.

Judges aren’t the only people who need to worry about sleep. The performance of medical professionals also suffers when they are bleary-eyed. As sleep loss increases, surgeons make more mistakes and are slower to perform particular tasks in a surgery simulation. The demanding working conditions and long shifts in many of the world’s healthcare systems may not be good for patients, or for the people trying to save their lives.

We don’t yet fully understand everything that goes on during sleep, or even the reasons why we sleep, but we know that we struggle without it. The effects of sleep loss are similar to the effects of alcohol intoxication. When researchers measured the hand-eye coordination of volunteers following either a few drinks or a night of no sleep, they found that 17 hours of sleep deprivation mimicked the performance problems of a 0.05% blood alcohol concentration. Staying awake for 24 hours was similar to a 0.1% blood alcohol concentration. Keep in mind that the legal alcohol limit for driving in the US is 0.08%. The more sleep we lose, the more drunkenly we behave.

Photo by Hutomo Abrianto on Unsplash. Adapted by yours truly.

Food deprivation may be analogous to sleep deprivation when it comes to the quality of our decision-making. All of us get a little short-tempered and miserable when we feel hungry. Skipping lunch is not a popular proposition in my household. The feeling of frustrated hunger is so widespread that the world has come up with a dedicated word for it: “hangry”, a portmanteau of hungry and angry.

A hangry judge may be the last thing you want to see if you’re ever in court hoping for parole. Researchers analyzed the decisions of judges in Israel depending on when a parole hearing took place during the day. In these cases, judges had two options for their conclusions: “yes, parole is granted” or “no, go back to your cell”. The hearings took place in one of three daily sessions, each session separated by a break where the judges could grab some food and drink.

The researchers found that decisions to grant parole declined steadily between the start and end of a session. And this was not just a minor effect we can easily ignore. Favorable rulings stood at around 65% at the start of a session, when the judge was feeling happy and refreshed, and declined to almost 0% just before the break. Straight after the refreshment break, the rate abruptly jumped back up to approximately 65%. There may be confounding variables in addition to hunger that explain this pattern, but nobody in their right mind could have predicted such a dramatic effect before seeing this data.

In these judicial cases, the decision to avoid granting parole is essentially a decision to keep things as they already are. Granting parole would mean changing the status quo and releasing the prisoner. It may be that this burden is too much for a hangry judge to think about. When we are tired and struggling to focus because the only thing on our mind is food, we may be naturally drawn to the least dramatic option: the option least likely to get us in trouble if we make a mistake. When we are refreshed, happy, and comfortable, we can better weigh up the pros and cons of a problem in a bid to make the fairest and most rational decisions we can.

Next time you feel a little frustrated with someone, or you see them looking a little skittish, consider whether hunger or fatigue could have something to do with it. We have an overwhelming tendency to assume that when a stranger is unfriendly to us, it’s because they are a terrible person. But this bias may be irrational, because we are all capable of being a little mean on a bad day.

With some careful attention to the real influences underlying our own behavior and judgment, we can make better decisions when it matters. And with greater generosity in how we read other folks’ motivations, we can develop a more compassionate attitude toward the people around us.


When Your Brain Becomes Your Puppetmaster



Photo by Sagar Dani on Unsplash

Some features of our lives just seem inviolable. Most people would never worry about failing to recognize objects they see every day, or beginning to believe that their arm does not belong to them. And yet, these are exactly the types of things that can go wrong. Body and brain functions are not physical laws like those of thermodynamics or relativity. Gravity may be here to stay, but when it comes to our behaviors and perceptions, we may be justified in being a little more nervous. So what would you do if you lost control of your own arm?

If I ask you to lift your arm, and you agree to participate in the exercise, you’ll probably see your arm start to rise. And all the while, you feel that the arm is doing what you, as a conscious agent, want it to do.

Anarchic hand syndrome is a disturbing disorder in which patients lose the normal experience of voluntary movement. An arm can begin to move and act without the patient wanting it to, as if the limb has a will of its own. In fact, a patient will often begin fighting their own limb if it becomes uncooperative, trying to stop it from grabbing at their tissue while they blow their nose or from touching the person sitting next to them. Have a look at this video demonstrating the plight of an elderly patient in her hospital bed after she suffered a severe stroke.

In the most extreme cases, your own anarchic limb can try to kill you. One patient described her hand tearing away at her bedcovers in the night and grabbing her own neck to strangle her. The only sense she could make of her horrifying condition was to assume that her limb was possessed by an evil spirit.

Patients with anarchic hand syndrome are in the bizarre situation of knowing that their limb is their own but losing all sense of agency over it. Without the normal process of intending to perform an action, it’s hard to say that you turned on the light when you flick the switch. Your arm certainly did it. But not you.

It all links back to our sense of who we believe we are. When we use the words “I” or “me”, we normally refer to our conscious minds and experiences. It would be strange to say “I am beating my heart faster” after going for a run, even though the heart is a part of our own body. We simply say “my heart is beating faster”. But when it comes to lifting our arm, we say “I am lifting my arm”, not “my arm is lifting”. The difference all comes down to our sense of consciousness and intention. Our heart rate is automatically controlled behind the scenes of our awareness, so although we are educated enough to know that we own our heart as much as we own our arm, we don’t talk about heart activity as a product of our control.

Photo by rawpixel on Unsplash. Adapted by yours truly.

So in a sense, when we have anarchic hand syndrome, our arm becomes more like our heart. The arm is on us and it is a part of us. But we are not in control. The labels “anarchic hand syndrome” and “alien hand syndrome” are often used interchangeably, even in academic papers. But some researchers distinguish between them, explaining that patients believe anarchic hands belong to their own body even when they cannot control them, while alien hands are experienced as a completely disowned limb.

Anarchic hand syndrome typically follows extensive damage to motor-related areas towards the front of the brain, including the anterior corpus callosum and supplementary motor area. Alien hand syndrome usually features damage further towards the back of the brain, including the posterior corpus callosum and parietal areas. The symptoms can also arise from degeneration in the circuits that connect areas of our cerebral cortex with the basal ganglia, a system that is critical in allowing us to move smoothly and effortlessly.

During movements of an anarchic hand, the primary motor cortex in the brain — one of the final command centers for sending “move” signals to your limbs — is fully activated. But unlike with voluntary movements, that activity is practically isolated, appearing without the normal co-activation of premotor, prefrontal, and parietal areas that are so important for our experiences of intention and movement awareness.

Utilization behavior refers to actions that appear fully functional, but emerge habitually and automatically in the wrong environments. They can occur after lesions to frontal areas of the brain, similar to anarchic hand syndrome, but patients often do not comment on their actions being out of the ordinary (unlike the woman with the anarchic hand in the video linked above, who repeatedly complained about her arm). When a patient sits in a doctor’s office, and sees a pen and paper sitting on the table, they might pick up the pen and begin to write. When they see a pack of cards, they might deal them as though they are about to start a game with the doctors. None of these actions have anything to do with the doctor’s instructions. Even when the doctor says that the objects should not be touched, the patient returns to their action after a small distraction. The patients simply use the objects because they are there.

Some theories of motor behavior explain that whenever we see a manipulable object in our environment, our brains automatically prepare the relevant action for handling that object. When we see a hammer, we initiate a motor program for a palm-grasp action. When we see a grape, we initiate a program for a smaller precision grip with our fingers. Thankfully, under normal conditions, we have the control systems in place to suppress those action plans when they are contextually irrelevant (although, when I spot a delicious bunch of grapes in a bowl, I often struggle with that suppression). Efficient hammering is great when we are putting together furniture, but not when we are in a doctor’s waiting room. When the control systems in our brain are destroyed, particularly following frontal damage, we may find ourselves acting for the sake of acting.

Photo by Maja Petric on Unsplash. Adapted by yours truly.

When you read enough about brain dysfunctions, it begins to seem as though there is nothing in your life that you can depend on. We should remember that disorders like the ones described above are incredibly rare. And on the plus side, they can inspire us to appreciate some of the smaller facts of our existence. Even on your most boring day, you probably achieved several minor miracles of purposeful action and awareness. The notorious 3-pound organ sitting in our skulls can cause us grief during testing times, but it also makes life worth living the rest of the time.


The Upsides of Being an Autumn Baby



Photo by Lydia Winters on Unsplash

Your month of birth may be more life-altering than you think. It’s nothing to do with your star sign. Several variables depend on which month you happened to enter the world. Your mother’s food choices during pregnancy, the season of your earliest days of life, and your exact age at starting school are all good examples. Could any of these variables have a noticeable effect on our successes and failures in later life? It’s a tricky question with many possible answers, but studies have been taking on the challenge. Now, there are some curious stories coming out of the data.

Let’s first look at health. When you compare overall lifespan past the age of 50, Austrian and Danish people (Northern Hemisphere) live approximately half a year longer on average if they were born in autumn rather than spring. The same benefits apply in Sweden too. In fact, this autumn advantage is even true in Australia (Southern Hemisphere), where the seasons are reversed across the year, so that autumn begins in March/April. Fascinatingly, the lifespans of British immigrants in Australia maintain the annual pattern of their European counterparts, rather than adopting the schedule of their new Australian friends. So insight number one: seasons of birth seem to influence longevity.

Photo by Barbara Alçada on Unsplash. Adapted by yours truly.

There is an ongoing debate around exactly why seasonality might affect our health or other life outcomes. The characteristics of mothers who are trying to conceive could be one relevant factor. For example, between 1989 and 2001, teenage mothers were most likely to give birth in January. Given the disadvantages that young mothers may face in bringing up children, you might expect that children born in January are more likely, on average, to experience problems as they grow up. However, this explanation alone is not sufficient.

In a 2013 study, researchers looked at mothers who had multiple children across different months. By looking at differences between siblings, and therefore studying the same mothers, general effects of varying maternal characteristics could be ruled out. And yet, the data still showed effects of seasonality on child health. Babies conceived in May (births due around February) were most likely to arrive prematurely and have low birth weights, possibly due to changes in maternal nutrition between seasons. The months of greatest maternal weight gain overlapped with the months of conception that produced the healthiest birth weights (the summer months of June–August). But the birth pattern also showed a striking correspondence with the prevalence of influenza in health centers. When May-conceived babies were born, often prematurely in late January or early February, seasonal influenza was at its peak. The strong correlation between influenza prevalence and gestation length suggests that the seasonality of certain diseases could partly explain the effects of birth month on health.

So on average, winter and spring babies live slightly shorter lives and suffer worse health at birth. And it may not end there. Some researchers have looked at birth patterns among populations with specific disorders. In one study, researchers analyzed data for over 42,000 multiple sclerosis (MS) patients across Europe and Canada. Compared to a non-MS control group, patients were significantly less likely to have been born in November and more likely to have been born in May. The number of MS patients born in May was 9.1% more than expected, while the number born in November was 8.5% less than expected. Once again, the mechanisms that explain this pattern are a little foggy, but MS is a product of both genetic and environmental factors. Vitamin availability and susceptibility to specific viral infections are amongst the environmental influences that could vary by season, and might explain some of the story behind heightened risks for MS in May births.

Surely spring babies can’t have all the downsides? When it comes to academic performance, they may be outdone in their misfortunes by summer babies. Many school systems around the world, including in the US and UK, have cut-off birthdays between August and September to decide which academic year you fall into. If you are born in August, this means you end up the youngest in your class, while if you’re born a month later in September, you’ll probably be the oldest in your class. Children’s mental abilities change rapidly in their early development, so August babies might have a disadvantage in keeping up with their older classmates. If this holds true, you would expect to find a consistent difference between the academic outcomes for August babies and September babies of the same year, even if they were born only a day apart (e.g. 31st August instead of 1st September when the cut-off puts them into separate academic years).

When you look at the data, the older children in class (September births) do indeed outperform the younger children, and the difference is at least partly driven by the age at which children take academic tests. It’s not that the younger children develop weaker cognitive skills as they grow up, it’s that their skills are tested earlier in their development, therefore creating an uneven playing field. As you’d expect from this account, the differences between the older and younger children decline as they get older and their developmental trajectory evens out.

However, the more troublesome difference between children may be in their beliefs about their academic competence. The younger children have a harsher view of their own competence than the older children do, even when asked for their judgments at around the same age. This pessimistic outlook may be more persistent, and could potentially sabotage later outcomes.

Easy fixes for these issues are hard to come by. More affluent families tend to address the age imbalance by delaying their young child’s entry into school (a practice known as redshirting), while less affluent families are more likely to have young children who are held back a grade before testing. Both practices may have compensatory effects on test scores, but they may also come with substantial costs further down the line, like delayed work experience. Age-based adjustments for test scores could provide another option for balancing academic outcomes early on, but that comes with its own controversies.

Photo by Jacob Postuma on Unsplash. Adapted by yours truly.

The youngest children in a class may also be at greater risk of mental health challenges. One study published in 2000 looked at schools in Northern Ireland and found that children who were referred to psychological services were significantly more likely to be born at the end of the school year than at the start. Similarly, in 2015, survey and register data in Denmark suggested that a one-year delay in entry to kindergarten reduced scores on hyperactivity and inattention scales at age 7. Some of the mental health costs of relative youth could be driven by the negative self-perceptions of competence that I highlighted earlier.

Exposure to high temperatures during early life may also impact economic outcomes. Some evidence suggests that exposure to high temperatures (>32°C or >90°F) during prenatal development or during the first year of life is associated with lower earnings in adulthood. Each extra day of heat exposure correlates with a 0.1% reduction in annual earnings at age 30. The good news is that regular access to air-conditioning entirely cancels out this effect, so a community can easily mitigate the potential downsides of sun and heat exposure if it can afford the relevant resources.

After all of this, there seems to be a clear winner in the lottery of birth timing. Those born in autumn (September-November) have some probabilistic advantages over their spring, summer, and winter peers. The summer babies may have heightened risks from the sun and their pesky school schedules, while the late winter/spring babies may have a greater risk of health problems from viruses and nutritional deficits. It’s important to treat all of the evidence carefully. Most of it is correlational, which means we are still waiting on a definitive insight into the causes of differences between children born in different months. We should also remember that there is enormous variability in life circumstances and outcomes, so even when we find average differences according to month of birth, there will be a multitude of other genetic and environmental influences that make it practically impossible to predict how well someone will do purely from their birthday. A final note of caution is that we will have an incomplete picture of these effects for decades to come.

For now, autumn babies look like the lucky ones. But I would bet that the costs of fall are lurking in the darkness, waiting to be discovered by keen-eyed researchers.


Mindfulness Lessons from Science and Children

Photo by Vanessa Serpas on Unsplash

There are some human characteristics that we describe as childlike. In growing up, we gladly leave behind many of those qualities. Adults shouldn’t throw tantrums in supermarkets and cry about parents’ tyrannical desires to prevent accidental deaths in the playground. However, some childish adjectives are earnestly used as compliments for adults. When we describe an adult as childlike, we usually refer to some innocent or charming quality about them. That’s a nice sentiment, but some features of children’s mindsets may even be profoundly healthy for adults to cultivate. So in what ways do we need to be more like a child again?

I don’t yet have my own children, but in interacting with my many young cousins, there is always one particular trait that stands out. That is their ability to live and experience life in the moment. Children seem to be able to have fun with just about anything. The other day, I saw a child screaming with laughter at the noise they were able to create by hitting a can with a stick. They did not worry about the latest disaster in the news or the state of the economy. They simply made the most of what they found in front of them, and appreciated every second as they experienced it.

Of course, none of this is to say that children are Zen masters. Far from it. If you’ve ever been in an airplane with a child around, you’ll know all too well that children do not hesitate to scream for what they urgently crave. Nevertheless, they seem to be able to engage wholly with an activity in a natural way that adults no longer find so easy.

This psychological quality of children is reminiscent of a mindset widely discussed in the sciences and the media: it’s called mindfulness. Jon Kabat-Zinn was a pioneer in bringing mindfulness into the sciences, and he defines it as “the awareness that emerges through paying attention on purpose, in the present moment, and nonjudgmentally to the unfolding of experience moment by moment”. You can think of it as a mindset in which your attention is entirely locked on what is happening right now. Not what happened moments ago. Not what might happen in the future. What is happening now.

Photo by Robert Collins on Unsplash. Adapted by yours truly.

Why should we care about the science?

It took a while for mindfulness to be taken seriously in the scientific world. This is because the principles within it were originally developed in a religious context, especially in the Buddhist tradition. It wouldn’t take you long to find a scientist who frowns upon religion, because many religious premises and claims cannot be supported by objective evidence. But that frowning often goes too far and becomes a phobic barrier to ideas that can actually be tested scientifically. Mindfulness occasionally still hits this barrier, but now features prominently in neuroscience and psychology studies. If you search for ‘mindfulness’ on Google Scholar, you’ll be reading papers for the rest of your life.

You will still find people who reject the scientific idea of mindfulness because of its religious baggage, and they will often lock horns with people who argue science has no place in discussing mindfulness. I’m in the camp that believes both teams are being too absolutist. Without scientific evidence, mindfulness will never be clearly distinguishable from the snake oils that do more harm than good for humanity, and it certainly won’t ever become a valuable part of our mainstream health services. And without an open and unbiased mind to take mindfulness seriously in the first place, you’ll never fairly weigh up the evidence to understand its true value. If we’re being practical and trying to avoid personal biases, we should enjoy any benefits of mindfulness in our personal lives, while acknowledging the value of emerging evidence through scientific scrutiny.

We don’t necessarily need evidence to believe that mindfulness is good for us personally, but we do need it to truly understand the extent of its benefits across different people, problems, and interventions. We cannot confidently and honestly recommend mindfulness as a useful intervention for others, unless we have independent research to back us. We have to rely on something more than the beliefs, feelings, and words that come from our own mind or the minds of those who agree with us. Clearly defined methods, testable hypotheses, and replicable experiments provide us with the material to convince a sensible doubter. Intercessory prayer has been around for thousands of years, with many religious people attesting to its value in improving the health of their loved ones. But with no scientific evidence to back up that claim, good doctors will never prescribe prayer as an intervention to support your loved one who may be suffering from a medical malady. Mindfulness, on the other hand, has growing scientific support replicating across independent research labs, and may eventually fall into that bucket of widely prescribed interventions. Science and evidence-based medicine have been crucial in our progress as a species. We cannot afford to dump them now.

The mechanics of meditation

Many of our anxieties are driven by a fear of something that may or may not happen in the future. Any pragmatist will tell you that it’s pointless to worry about something if you cannot change or affect it, but mindfulness provides a concrete approach for shifting your attention to something more helpful: your actual experience in the present moment. There are many types of meditation, but all of them are activities that cultivate this mindset in some form.

Photo by Laurenz Kleinheider on Unsplash. Adapted by yours truly.

I will highlight two types of meditation that I find particularly helpful. Rather than diving into their religious or historical context, I will adopt the labels and methods that have been used in the sciences. The first I will call focused meditation, and the second I will call open monitoring.

If you have any basic experience of meditation, you are probably familiar with focused meditation. The instruction is to focus on a specific object in the world or on your body. A convenient anchor is often the breath. You aim to maintain your attention entirely on your inhalations and exhalations in the present moment. This is far more challenging than it sounds, as any early practitioner will tell you. You don’t just consider the concept of breathing while mentally singing Ed Sheeran’s new song. You pay full and exclusive attention to every moment of the breath as it occurs. For example, if you are focusing on your chest, you want to experience it as it lies rested at its lowest point, then stay focused on it as it slowly rises in each moment, then maintain your attention as it reaches its peak, and then do exactly the same with your mind as it sinks back to its resting position. Whenever you realize your attention has deviated from your breath — which might be as frequently as every few seconds — you simply return your mind calmly to the breath, ideally without getting angry at yourself for having been distracted.

As you practice this, you will become better able to focus your attention on your breath for a few more seconds each time. Areas of the brain involved in sustained attention become more active as you start practicing meditation and improve your focus. However, at the highest levels of expertise (around 44,000 hours of practice), this activity is reduced again. In early practice, you become better able to recruit your attentional resources in the brain, but with expertise, focusing becomes effortless and you no longer need to rely on those resources. Distractions have a harder time dragging your attention away from where you direct it.

Open monitoring is a little different from focused meditation. Instead of choosing a specific object to direct your attention towards, the task is to focus your attention fully on any thought, feeling, or experience that arises in the present moment. There is no need to judge, anticipate, control, or react to anything that occurs. So an active mind that frequently jumps between objects is less of a concern during open monitoring than during focused meditation, as long as you are aware of what is happening. It’s still easy to become so distracted that you no longer pay close attention to your thoughts as they happen. There is an important distinction between being aware of thought and being lost in thought. I’m aware when I notice that my mind has floated from thinking about my breath to thinking about the work presentation I’m anxious about. I’m lost when my mind has drifted to worrying about the work presentation, panicking, wondering what I will need to do to prepare, and forgetting to notice each emerging experience.

The nice thing about open monitoring is that you can learn to apply it throughout your everyday life activities. With focused meditation, you usually need to find the time and space to quietly sit and focus on your breath. After a few days of this, most people will inevitably find some excuse to stop the practice. But with open monitoring, you can simply aim to remain aware and present with anything you happen to be doing: in the shower, you can focus your attention on the water as it hits your back; while walking in the park, you can focus on the colors of the trees; when eating, you can focus on the texture of the food as it rolls around your mouth. These are just examples, but the more you manage to bring your mind into the present as you go about your life, and the less you get lost in your head while your body does everything else on autopilot, the better you will appreciate your life as you live it. We too often let the day drift by and ask ourselves at the end of it “where did my Sunday go?”.

Evidence to support the benefits of mindfulness

Research is ongoing, and we still have much to learn. The experiments in this area are a mixed bag of higher and lower quality methods. A challenge in designing a good experiment is to compare mindfulness interventions with an appropriate control condition. If we want to understand how mindfulness impacts health, we need to know what it’s better than. Some experiments use no controls, which is clearly not ideal; if people improve in an aspect of their wellbeing following a mindfulness intervention, how do we know it’s not just because of the social interaction involved in their classes, or the effort of trying something rather than nothing? Other studies compare mindfulness to standard treatments for the targeted symptoms, or to different attention tasks, which provides a more reliable insight into the specific health benefits of mindfulness itself. Review papers and meta-analyses that helpfully combine the results of multiple studies are also growing in number.

There is a way to go, but consistent effects are emerging. Mindfulness may not help everyone but there is now a large volume of evidence to suggest that it can have important, far-reaching benefits for many aspects of mental health. In general, extended mindfulness practice seems to adapt brain structure and function related to emotion, attention, and self-awareness. Experiments so far have highlighted benefits in areas including hypochondria, decision-making, chronic depression, and even chronic physical pain.

Some of the most convincing benefits are in emotions and relationships. Mindfulness techniques can be a great tool for shifting the mind away from ruminating on possible dramas and disasters. Most of my recent fears came from prognosticating outcomes that were either not that bad in the end, or did not happen at all. My time clearly would have been better spent focusing on my actual lived experiences. When we successfully apply mindfulness to our lives, many of us are happier, and we become more pleasant people to be around. We shouldn’t meditate with any specific goal in mind, because it’s counterproductive. A goal orientation is a distraction in itself. But it’s certainly helpful and motivating to know that whenever we do meditate, there are good reasons to believe it is worthwhile.

The child in us is waiting to emerge from a long slumber. There are many benefits to centering our minds on our current experiences and shifting away from the usual obsessions about objectives, plans, and goals. It’s about time we focused on the only thing we know for certain: that we are breathing, thinking, and feeling right now. We don’t need to keep a chart or track our data. Mindfulness is less of a life hack and more of a way to live. It provides an escape from our monotonous and robotic approach to our everyday activities. Next time you eat, shower, or fold the laundry, know that you can appreciate the moment rather than simply get through it. Life is always going to be too short. So you might as well live it.


Our Bizarre Love for Story Spoilers

The people who make the best company tell the best stories. At parties, you never want to end up standing next to the guy who won’t stop talking about the washing machine he recently bought. Good stories teach us, entertain us, and help us to build connections with people. It’s probably obvious that a compelling story will depend on the interests of the person you are talking to. If you’re alone at a bar beside a stranger, last night’s football game might be a decent bet for a conversation starter. But there are also other critical features, independent of individual interests, that make us engaging storytellers.

Our brains dynamically track the content of a story as we hear it. It wouldn’t be accurate to talk about a ‘story perception’ part of the brain. Instead, as a story evolves, so does our brain activity. When we read about characters and their ambitions, the goal-directed areas of our brain are particularly active (e.g. the superior temporal cortex and prefrontal cortex). When we read about characters interacting with physical objects, motor-related areas of our brain are more active (e.g. precentral sulcus). In a sense, we are living a story as we read it. This is what makes the best literature so engaging and enjoyable: it takes us by the hand and leads us everywhere we need to go without looking back.

Self-control can be exhausting. If you’ve ever spent an hour trying to hold a good posture instead of your typical slouch, you’ve probably experienced this yourself. Keeping your back straight is not a particularly energy-intensive activity like running or swimming. Much of the pain comes from keeping your goal in mind and maintaining the levels of effort and motivation you need. I’ve previously described the exhaustion of learning to drive a car. Driving when you’re an expert is a lazy activity but driving when you’re learning can be utterly draining. Some evidence suggests that we have a central self-control energy resource, and when an activity drains that resource, our self-control capacity on other tasks is also diminished (this is a theory known as ego-depletion, although it is currently hotly debated).

The question for storytellers is whether they can exhaust other people with their stories. Listening to someone’s story could reasonably leave us feeling tired, but it seems to depend on how we process their story. Imagine reading a story about a waiter who arrives at work feeling hungry, and spends their whole shift resisting the temptation to eat food from the restaurant because it is against company policy. If we actively take the waiter’s perspective as we read the story, it exhausts our self-control capacity. But if we passively read without putting ourselves in the character’s shoes, it does not seem to have this cost. In fact, it may inspire additional self-control capacity according to some evidence. So engaging with a story can be an exhausting experience, but it depends on how much we empathize with the characters as they go through their hardships. I can relate to this evidence after recently spending a full day watching a controversial senate hearing about the alleged wrongdoing of a US Supreme Court nominee. Putting myself in the shoes of the questioners and the witnesses, as the stories and perspectives were laid out, did not leave me feeling particularly healthy at the end of the day.

This question is going to sound crazy, but is it possible that spoilers for stories are a good thing? Well, according to one study, they may be. Researchers gave participants stories that they had not read before, and asked them to rate how much they enjoyed them. For some of the stories, participants were shown a spoiler paragraph before reading the full story. Unbelievably, for several types of story (mysteries, ironic twists, and evocative literary stories), people consistently reported enjoying the spoiled stories more than the unspoiled stories. You can come up with your own reasons for why this might happen, but some possibilities are that spoilers allow us to organize and anticipate stories better, while offering the pleasurable tension of knowing what may be about to hit the characters. This makes some sense to me, and explains why I often enjoy a movie more after a second viewing. However, for the time being, please do not spoil any of the upcoming titles on my movie list. The evidence might show that I’m likely to enjoy a spoiled story. But it hasn’t said anything about long-term appreciation or memorability. Yet…

Photo by Zhifei Zhou on Unsplash. Adapted by yours truly.

Continuing along this track of surprising facts, here’s one more obvious-sounding question. Would you prefer to hear a story about an experience you’ve had before, or an experience you’ve never had? You’d be forgiven for thinking you would rather hear about something completely new, because that is in line with what most people think. However, we may all be wrong. In an experiment, when a speaker describes their experiences related to a video, they expect that listeners will prefer listening to the story if they have not already seen that video. And the listeners expect the same thing. But the data suggests that listeners enjoy a story about a familiar video more than a story about an unseen video. When we tell people stories that are completely new to them, we tend to assume they have more knowledge about the topic than they do. It’s a cognitive bias called the “curse of knowledge”. It leaves us in the awkward position of occasionally rambling while someone stares blankly into our eyes, too polite to interrupt and ask “what are you talking about?”.

So there’s a great incentive to learn about another person’s interests and background before deciding which of your amazing stories to share. For both story spoilers and conversational topics, some sense of familiarity allows our brains to keep better pace with a changing story. We don’t need to exhaust ourselves with patching holes in our knowledge and playing catch up in real time as the story unfolds. Next time you’re out chatting with friends, tell them stories about what they already know.


How Music Plays Your Brain

Listening to music can be a euphoric experience. It’s unclear exactly why it should feel so good. Is there some evolutionary advantage to enjoying music? Is it a byproduct of some other important function? Is it just one big accident in our evolutionary history? The debate still rages on these questions, but there is one important fact that we can be confident about: music has some deep-rooted appeal for humans.

There is something special about music even for the youngest listeners. Infants in their first year of life already have a meaningful sense of musical timing and pitch. When listening to samples of Western music, Mafa populations in Cameroon recognize the same basic emotions of happiness, sadness, and fear that Westerners do. Both populations also enjoy a similar sense of musical harmony when they listen to each other’s music. When asked to express different emotions by creating musical or physical movement patterns in a computer program, participants in the USA and a tribal village in Cambodia make very similar choices. There is a fundamental core to musical experience and expression that all humans seem to share.

We can look beyond humans to examine how deep our musical roots really stretch. In addition to the cross-cultural appeal of music, there may be a cross-species appeal. There are ongoing discussions about exactly how much our perception of music overlaps with that of non-human primates. Although there are commonalities in our ability to detect rhythms, it is still unclear whether monkeys can synchronize their movements with music in the way that humans can. Some non-human primates, like Kuni the bonobo, may spontaneously synchronize with audible rhythms when they play with a drum. But we need to wait for more evidence to fully understand whether non-human primates enjoy dancing as much as we do.

Non-human primates, just like humans, do prefer consonant music over atonal or dissonant music. However, researchers have often struggled to find any consistent preference for music over silence when non-human primates can choose between them. In 2014, one research group decided to test this question in a little more detail, by trying a range of different musical styles. They divided a room into four zones, which progressively increased in distance from a music speaker playing either West African akan, North Indian raga, or Japanese taiko instrumental music. As the music played, the researchers measured where a group of chimpanzees spent most of their time. If they spent most of their time in zone 1, closest to the speaker, it would indicate that the chimpanzees enjoyed the music. If they preferred to stay in zone 4, where they could barely hear the music, that would suggest they preferred silence.

When the Japanese music played, the chimpanzees showed no preferences between zones. They seemed to be just as happy close to the speaker as they were far away from it. But when the African or Indian music played, they spent the majority of their time in zone 1, as close to the music as possible. In fact, they spent significantly more time in zone 1 than they did when no music was playing. So the tonal melodies and ambiguous pulses in West African akan and North Indian raga music seem to set a nice mood for chimpanzees.

Photo by Rob Schreckhise on Unsplash. Adapted by yours truly.

What exactly is our brain doing when we listen to music? Instead of processing individual sounds, the brain processes patterns of sound as it implicitly develops expectancies about what is coming next. The auditory parts of our brain analyze several core features of music including pitch and duration, and interact with frontal brain areas as we use our working memory to pull together the information into higher level abstract representations. Many of us have experienced the intense chills that come with listening to deeply moving parts of our favorite music. As this happens, the reward centers of our brain, especially subcortical areas including the ventral striatum that sit deep within the brain, adjust their activity: the more chills we feel, the more activity they show. In fact, when we experience those moments of musical euphoria, the striatum releases dopamine, one of the brain’s reward-related neurotransmitters.

Experts who play musical instruments probably experience music differently from non-musicians. A recent study looked for this difference in the brain. The researchers put beatboxers and guitarists in a brain scanner and measured the levels of activity in their brain as they listened to music involving beatboxing or guitars. When looking at parts of the brain involved in translating sensory information into motor actions (sensorimotor areas), they found an interesting pattern. Guitarists showed increased sensorimotor activity when listening to guitar music, and beatboxers showed increased sensorimotor activity when listening to beatboxing music. So their prior musical experiences in physically playing instruments changed how their brains reacted when listening to those instruments.

The researchers drilled down a little further to examine the finer details of the sensorimotor activity for each group. The primary sensorimotor areas of our brains are organized somewhat topographically: different sections of their structure represent different parts of the body (you can illustrate these maps with “cortical homunculi”). This means you can compare “hand” activity to “mouth” activity in those brain areas. If musicians’ sensorimotor activity during listening represents the actions involved in playing the music, then you would expect to see hand areas activated for the guitarists and mouth areas activated for the beatboxers. After all, those are the body parts they use when making their music. This is precisely what the researchers found. Guitarists activated their hand areas when listening to guitar music, while beatboxers activated their mouth areas for beatboxing music. Their brains automatically recruited relevant sensorimotor regions in processing the musical audio, almost as though they were actually playing the music on some level. In other words, the musicians’ passive listening became a little more active when listening to their own type of music.

Photo by Matheus Ferrero on Unsplash. Adapted by yours truly.

Music is one of those real-life miracles that practically all of us can connect to. It brings us together at festivals, bars, and other social events, and can give us a dramatic emotional lift when we most need it. The question of why it has these magical effects continues to elude us, but this mystery makes our musical experiences all the more impressive. Whether you’re a metalhead or an opera enthusiast, don’t forget to fully appreciate and enjoy your next musical fix. If you’re lucky, it might inspire a creative spark or moment of ecstasy. But at the very least, you’ll be tapping into an experience that you share with many of your fellow primates.


The Tech That Reads Your Mind and Sees Your Dreams

“turned on red Psychic Reader neon sign” by Scott Rodgerson on Unsplash

The ability to read someone’s mind has traditionally been the stuff of fiction. Our thoughts and experiences are private to us and we can choose when we share them with others. But with developments in brain scanning technology, mind-reading is becoming a hard science rather than a false promise. You may no longer need to be superhuman to see the darkest thoughts and desires of the person opposite you. You can instead convince them to lie in your brain scanner.

A basic test for a mind-reading machine is to tell you what visual image you are holding in your head. If the machine has to decide which of two images you are thinking of, does it perform significantly better than guessing at random?

In some cases, this is a fairly easy task with a brain scanner. For example, if you are trying to guess whether someone is thinking of playing tennis or walking around their house, you find different areas of the brain that are most active: the supplementary motor area for the motor imagery involved in playing tennis, and the parahippocampal place area for the spatial imagery involved in walking around your house. This distinction has been used to communicate with hospital patients who lie motionless in what appears to be a coma. If patients can use a thought to answer “yes” or “no” to questions (e.g. thinking of tennis for yes and house for no), then the doctor knows that the patient is showing signs of consciousness.

When you have a larger number of visual images to choose from or greater similarity between images, the task of decoding what someone is thinking becomes far more difficult. The overall levels of activity across the brain might be very similar for seeing a leopard versus a duck, so you need to be more sophisticated in how you analyze brain imaging data. One option is to drill down into detailed patterns of activity within a single area.

You can start scientific mind-reading by decomposing a list of images into their different visual features (e.g. object position, orientation, light contrast, etc.). Then, you can take a set of practice images and train a decoder machine to link the features for those images with patterns of activity in the visual areas of a person’s brain as they see those features. Each feature drives brain activity in a different direction, so every unique combination of features corresponds to a unique pattern of activity overall.

After the machine is trained, you show the person brand new images they’ve never seen before and measure the patterns of activity in their visual brain areas. By using the associations that the decoder picked up during training, you can infer the visual features for the new image they see from their brain activity patterns. The machine can then look through a database of images, and estimate which image the person is seeing based on how closely the inferred features from brain activity match the actual decomposed features of an image. The closest match becomes the machine’s best guess.
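The train-then-match pipeline described above can be sketched as a toy simulation. To be clear, everything here is a stand-in of my own: the “brain activity” is a simulated linear response plus noise, and the decoder is an ordinary least-squares fit rather than the specific models used in the actual studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each image is described by a vector of visual
# features, and each feature combination evokes a distinct pattern of
# voxel activity in visual cortex (simulated here as a linear response).
n_features, n_voxels = 8, 50
true_mapping = rng.normal(size=(n_features, n_voxels))

def brain_response(features):
    # Simulated noisy voxel pattern evoked by a given feature vector.
    return features @ true_mapping + rng.normal(scale=0.1, size=n_voxels)

# Training: pair known feature vectors with measured activity, then fit
# a linear decoder that maps voxel activity back to visual features.
train_features = rng.normal(size=(200, n_features))
train_activity = np.array([brain_response(f) for f in train_features])
decoder, *_ = np.linalg.lstsq(train_activity, train_features, rcond=None)

# Decoding a new image: infer its features from brain activity alone,
# then pick the closest match from a database of candidate images.
database = rng.normal(size=(20, n_features))  # decomposed candidate images
seen_index = 7
activity = brain_response(database[seen_index])
inferred = activity @ decoder
best_guess = int(np.argmin(np.linalg.norm(database - inferred, axis=1)))
print(best_guess)  # with low noise, the closest match is the seen image
```

The same matching logic is what makes imagery and dream decoding possible: swap in activity recorded while a person merely imagines (or dreams of) an image, and the decoder’s nearest match becomes a read-out of what they have in mind.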

The amazing thing is that because there is so much overlap in how our brain responds to actual visual images and the visuals we imagine in our head, you can do the same thing for what people are thinking about. Instead of showing them a visual image in the brain scanner, you just ask them to visualize a particular object in their mind. By analyzing brain activity in the same way, the machine can correctly infer which object they are imagining, or even which piece of famous artwork they have in mind.

One of the most exciting applications for this kind of mind-reading may be in decoding the content of our dreams while we sleep. Dreams are not only notoriously difficult to understand but also often so vague and disconnected from reality that we barely remember them when we wake up. However, as with the imagined versus seen images I mentioned above, there is strong overlap between the visual brain patterns corresponding to seen images and those corresponding to dreamt images.

Researchers put participants in a brain scanner, waited until they fell asleep, and then woke them up during the most dream-intensive phase of sleep. By asking them to describe any visual images they saw while asleep, the researchers built a record of the images that people dreamed and their brain activity during those moments. By training a decoder machine on brain activity when people physically saw different images while awake, they could successfully read and predict what people visualized in their sleep from the same patterns of brain activity. As this kind of technology develops and improves, we should end up with more accurate and more comprehensive dream-reading machines.

Memory is another important function that depends on our ability to generate mental images. The type of visualizing I described in the experiments above draws on long-term memory: if I ask you to imagine a leopard or your tennis swing, you are recalling elements from your long-term memory of past experiences with those images or actions. But you also have working memory, which refers to your capacity to hold a number of objects in mind for seconds or minutes while doing a task. You may be trying to hold a phone number in mind while you dial it, or perhaps pictures during a memory game.

Visual areas of the brain generally do not sustain their overall level of activity when we hold visual images in memory. But as I explained before, you may need to drill down to find patterns of brain activity that code for specific images. This is exactly what one group of researchers tested when they asked participants to remember the orientation of a quickly flashed visual object for 11 seconds. After that delay period, participants had to decide whether a new comparison object was the same as the object in their memory. The decoder could analyze activity in visual areas of the brain during that delay period, and guess which of two orientations people were holding in their memory with over 80% accuracy. So even though brain activity in those areas returns to its resting level after seeing a visual object, it continues to exhibit a pattern of activity matching that object as you hold it in memory. Those same patterns also reliably predict the image if you generate it yourself in your mind instead of holding it in working memory over a delay.

The activity in our brain is naturally responsible for our mental experiences. Decoding those experiences with brain scanners is a radical and enlightening new project. We have already seen successes in decoding the contents of our mental imagery, dreams, and working memory. It’s thrilling to consider where we go from here. Although it’s easy to worry about future misuse of this technology (e.g. with invasions of privacy), the scientific journey itself could realistically improve how we understand people’s conscious experiences and potentially how we treat mental health disorders. In essence, it could teach us about the most important facts in our lives: where our thoughts come from, what they do to us, and how we can change them for the better.


One Way Children Are Smarter Than You



Photo by JESHOOTS.COM on Unsplash

You can pretty much consider it a law that your cognitive ability improves as you develop from a child into an adult. The usual exception to this is your ability to learn, especially with languages or a musical instrument, where children excel at picking up new talents. But with knowledge, memory, attention, and pretty much all other measurable mental capacities, children are the developers rather than the masters. Well, until one study came along to cast doubt on one aspect of our attentional ability.

Change detection is a popular concept studied in experimental psychology. It refers to our ability to notice differences between images. If you’ve ever tried a “spot the difference” puzzle, you know how tricky it can be to spot small differences between otherwise identical images that sit next to each other. But when you flash the two images to people one after the other, with a brief interruption between them, and ask them to report whether there is a difference, it becomes difficult to notice even large changes in the images.

Our general weakness in spotting these kinds of differences is referred to as change blindness. The remedy is focused attention. When a change is significant enough, and there is no interruption in the continuity of images, our attention is automatically drawn to the change. But with interruptions in continuity, or when the change is small, we need to work much harder to focus our attention in the right area. This is why filmmakers often get away with continuity errors between shots or frames. The transitions between shots disrupt the image continuity enough that we miss the errors. Here is a fantastic example of a card trick employing change blindness, so you can see for yourself.

So what does this have to do with children? Researchers decided to directly compare change detection abilities between children (4–5 years old) and adults (~20 years old). Both groups saw cartoon characters that could differ in one of two characteristics (body shape or hand shape). During the main part of the experiment, participants were only asked to look for characters who matched a particular body shape, so their attention was focused on a single characteristic. After the main part of the experiment, the adults and the children saw a range of characters, and were asked whether or not they had previously seen those exact characters. Adults performed just as well as kids when rejecting characters who differed in the characteristic that they focused on during the main experiment (i.e. body shape). But shockingly, the kids were significantly better than adults at identifying characters who differed in the characteristic they did not focus on (i.e. hand shape).

Photo by Picsea on Unsplash. Adapted by yours truly.

When it comes to detecting visual changes outside our direct focus, children may outperform their parents. It seems that adults have a highly selective focus while children are more diffuse or scattered in their attention. Attention-related circuits in the brain are developing throughout childhood, and it may be that this immaturity in attentional control results in a serendipitous advantage for processing information that is not immediately relevant. So for us adults, our mature ability to zero in on particular visual traits means we are successful in recognizing relevant changes. But when asked about details that are irrelevant to that previous focus, the kids come out on top.


The Bird Who Cried Snake



Photo by Ben White on Unsplash

If I say “strawberry ice cream”, the odds are that strawberry ice cream will be the first thing that pops into your mind. It’ll probably start with a visual mental image, but you might go on to recall its sweet fruity taste and frosty sensation too. If I instead told you not to think of strawberry ice cream, you’d probably end up in the exact same boat. Mental imagery is often an involuntary phenomenon, and it’s hugely influenced by what people around us say. Sometimes, frustratingly so. The reason we cover our ears when people start talking about repulsive or terrifying scenes is that we know what our mischievous brains will do at that moment or later that night when we are trying to sleep in our dark bedroom.

Language interacts with our minds by stimulating mental imagery and influencing how we handle the world. Verbal communication is primarily a device for sharing important information, so our brains accordingly use language to guide where we should focus our attention. You can see this effect at work when people are looking for hidden objects. If you say the word “square”, people are faster and more accurate in detecting square-like visual images, but slower in detecting circle-like images. And the reverse is true if you say the word “circle”. Even when the words are random and don’t actually predict which object will appear, people can’t help but let the language shape their attention and decision-making. Language and other communicative signals (whether meaningful or misleading) create expectations or “sensory templates” about how an upcoming event should look or sound. Other people’s language has privileged access to our brain.

Speaking of privileged access, you may have noticed throughout your life that certain words are particularly good at grabbing your attention. The first and most obvious example of this is known as the cocktail party effect. When we stand in a noisy bar with friends, we’re usually good at narrowing our attention to focus exclusively on the words that a friend is saying, while filtering out all the nonsense coming out of other people’s mouths. When we avoid paying active attention to the many conversations around us, they essentially sound like one big monotonous buzz. However, if our name pops up within that buzz, many of us immediately and automatically prick up our ears. This means that despite everything sounding like meaningless noise in the background, our brains continue processing something about the unattended information, without us being aware of it. And when that information is suddenly relevant to us (nothing is more relevant than our own name), then our conscious attention shifts from the conversation with our friend, to “is this stranger talking about me?”.

Another type of word with a priority pass into our consciousness is the taboo type. I don’t want to use any of these words directly in this article so I’ll exchange a commonly used example for the rhyming euphemism “cluck”. Even when you are not listening to a conversation, it’s hard to stay tuned out if it suddenly features a cluck. If we are with close friends who regularly swear, it can become a more normal word, but of course then it’s no longer taboo. If we are in more polite company or at a formal event, then we have no choice but to immediately notice and recoil when someone exclaims “cluck this” or talks about their clucking boss at work. It’s another great example of the involuntary effects that language can have on us, not only at a mental imagery level but also an emotional and behavioral level.

Photo by rawpixel on Unsplash. Adapted by yours truly.

Taboo words connect with our emotional brain systems in a way that more neutral words do not, and they elicit automatic stress-related physiological reactions. It may be that we have two distinct language systems in the brain: one closely related to emotional vocalizations which is more likely to handle swearing and cursing, and another for our more advanced information-filled communicative abilities. Neurological disorders like aphasia are characterized by an inability to speak or understand language, and yet patients can often curse and swear with less difficulty. This may be because their brain damage is confined to the informational language system rather than the emotional system.

We have more in common with other animals when it comes to our emotional vocalizations than our more informational communication. We know that language can have direct and automatic impacts on our own behavior and mental imagery, so perhaps bird calls have similar effects on a bird’s mind. One major reason for birds to call is to raise an alarm. For the Japanese tit, one of the biggest concerns is the Japanese rat snake, which can move up a tree to capture its prey. So could there be a bird call that makes other birds in the area specifically watch out for snakes, or are there only general urgency signals?

One researcher decided to test this for himself by hanging a speaker in a tree, broadcasting alarm calls, and examining how birds would react to a stick that moved like a predatory snake. With general alarm calls, a bird would ignore the stick. But with snake-specific calls, it would fly within a meter of the stick to survey exactly what was going on. In fact, it would only be interested in this stick if it moved in a snake-like way up a tree. It showed no interest in the same stick if it was swinging from the tree in a way that didn’t resemble a snake. Upon spotting a real snake, a bird would normally hover over it and try to look big in an attempt to deter it from progressing further up the tree. Of course, they didn’t need to do that with the stick when their closer inspection revealed it was harmless. But their approach behavior showed that snake-specific alarm calls automatically stimulated the mental image of a snake (or at least something analogous since we don’t know the bird’s direct experience), and shifted their visual attention towards objects in the environment that most resembled that particular dangerous template.

Photo by SK Yeong on Unsplash. Adapted by yours truly.

Communicative signals directly and specifically affect our next moves, and the benefits of communication clearly apply across many animal species. Human language serves several practical purposes, including emotional expression, information exchange, negotiation, and warning signals. In normal circumstances, other people’s signals often predict something meaningful about what may be about to happen, and what we need to look out for in our environment. It is therefore a valuable adaptation to process language quickly and automatically. Although this can open the door to frustrating deceptions and false alarms, we have gained a lot more from communicating freely with our fellow humans than we have lost to their occasional bad intentions.


In the Future, We May Not Need to Face Our Fears



This article was a front-page feature on Medium.

Photo by denis harsch on Unsplash

Many people live with fear. Chronic fear: phobias, anxieties, and PTSD. Conditions like these can be debilitating and difficult to treat. Many treatment options cause painful physical and psychological side-effects. New and promising treatments, however, use advances in brain scans and neurofeedback to revolutionize the way science helps us overcome our fears.

The problem with drug-based treatments will always be their side effects and wide targets. There’s no drug to specifically cure a fear of snakes or fear of flying, for example; medications only dampen a generalized level of anxiety (sometimes, they even knock people out completely).


One of the best available treatments for a specific anxiety? Exposure therapy. In a controlled environment, participants are trained to relax and face their fears. If they have a phobia of snakes, for instance, they might be asked to imagine a snake in the first session, then look at a picture of a snake in the second, then watch a video in the third, then see a real snake in the fourth. If they make it through the therapy, they might successfully reduce their level of fear in the real world. The problem, as you might expect, is that dropout rates during this type of therapy are high. Repeated exposure to your deepest fears is a painful process.

Parallel to exposure therapy runs another stream of scientific research looking at a method known as neurofeedback. By feeding measurements of a person’s own brain activity back to them in real time, this technique trains people to shift that activity, and with it their behavior, in specific directions. If you were undergoing neurofeedback, the procedure might go something like this:

You sit and look at a circular disc on a computer screen while researchers measure your brain activity. You see that the size of the circular disc changes, and you know that its size is somehow linked to a target pattern of brain activity in your head. When that pattern is more active, the disc grows. When it’s less active, it shrinks. Over time, you begin to learn how to consistently make the disc bigger. But, strangely enough, you don’t always know exactly how you’re managing to control your brain activity in order to accomplish this. The learning process is implicit and outside your awareness.
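As a toy illustration of how this implicit learning could work, here is a hypothetical Python sketch. The "target pattern" and "mental strategy" vectors are invented, and a simple trial-and-error rule stands in for whatever the brain actually does; the point is that the agent only ever sees the disc size, yet still drifts toward the hidden target.

```python
import random

random.seed(1)

# Hypothetical sketch of the neurofeedback loop. The target pattern is a
# hidden direction in "brain-activity space" that the participant never
# sees; they only see the disc, whose size tracks the target's activity.
target_pattern = [0.8, -0.3, 0.5]       # invented target activity pattern

def disc_size(activity):
    # Feedback signal: how strongly current activity expresses the target.
    return sum(a * t for a, t in zip(activity, target_pattern))

# The participant starts from some arbitrary mental strategy...
strategy = [0.1, 0.1, 0.1]
best_size = disc_size(strategy)

# ...and, trial by trial, implicitly keeps whatever tweak makes the disc
# bigger, without ever knowing what the target pattern actually is.
for trial in range(500):
    tweak = [s + random.gauss(0, 0.05) for s in strategy]
    if disc_size(tweak) > best_size:
        strategy, best_size = tweak, disc_size(tweak)
```

After enough trials, the strategy ends up aligned with the target even though the target was never revealed, which is the essence of the implicit learning described above.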

Neurofeedback shows some potential as a tool for treating neurological or psychiatric disorders. The logic is that if doctors can identify a particular signature of activity in the brain that characterizes a patient’s symptoms, they might be able to use neurofeedback training to reduce that activity. If the activity is shown to have a meaningful role in causing their symptoms, then the hope is those symptoms will also decrease.

Emerging evidence supports these benefits for conditions including ADHD and stroke recovery. Of course, there are still questions around the practicality and efficacy of this treatment. But the evidence, so far, is promising.


Building on the potential of this research, a new study published in March 2018 by labs at UCLA and in Japan brings together the worlds of exposure therapy and neurofeedback. The study’s ambition was to expose participants not to their fears themselves (as in exposure therapy), but to the unconscious activity representing those fears in their brains (neurofeedback). By rewarding participants when their brains showed that unconscious activity, they tried to create a positive rather than negative emotional association with the feared object.

Critically, this method avoids the need to directly present the fear to participants, minimizing the chance they’ll drop out of therapy (a common problem with exposure techniques).

Surrogate volunteers with no phobias were shown the fear-based objects (e.g., spiders and snakes) and their brain activity was scanned. Researchers used these patterns to infer what fearful activity would look like in the brains of people with phobias toward those objects. Then they used neurofeedback training to reward participants whenever their brain activity looked like it represented the unseen feared object. Amazingly, neither researchers nor participants knew the fear that was being targeted: The computer randomly selected neurofeedback for each participant, automatically using an object they didn’t fear as a control.
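The reward rule can be sketched in a few lines of hypothetical Python. The decoder output here is a made-up similarity score rather than a real classifier, but it captures the key idea: payment tracks how strongly the (never shown) target pattern is expressed, with no feared object ever presented.

```python
import random

random.seed(7)

# Hypothetical sketch of the reward rule. A stand-in similarity score plays
# the role of the real-time decoder; the "feared object" pattern is
# invented, and it is never shown to the participant.
fear_pattern = [0.6, 0.8]

def decoder_score(activity):
    # Pretend decoder output: a 0-1 likelihood that current activity
    # resembles the unseen feared object.
    overlap = sum(a * t for a, t in zip(activity, fear_pattern))
    return max(0.0, min(1.0, overlap))

# Participants are simply told to "make the feedback bigger"; their reward
# on each trial tracks the decoder's score, not any visible stimulus.
total_reward = 0.0
for trial in range(100):
    activity = [random.random(), random.random()]   # simulated brain reading
    total_reward += decoder_score(activity)
```

Because the reward is paired with the fear-related activity rather than the fear itself, the association being trained is positive, which is the study's central trick.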


At the end of the experiment, participants’ physiological fear levels (skin conductance responses and brain activity in their amygdala) were reduced when looking at images of the object they feared. Fear responses to the control object, which was not targeted in the neurofeedback training, remained the same as before the experiment.

It’s amazing to consider what this kind of neurofeedback could do for people in the future. Imagine the benefits for those with chronic anxieties, phobias, or conditions such as PTSD. Could their symptoms one day be treated without ever exposing them to the terrors they suffer from?

When phobias are overwhelming enough that they take over our lives, we may be able to defeat them without ever directly facing them.


How Air Pollution Is Destroying Your Brain



There are few things in my everyday life that frustrate me more than cycling behind an old rickety van that is blowing black fumes into my face. I tried those face masks that filter pollutants as you breathe, but on a hot day, the sweat and discomfort are unbearable, and I must say, the mask makes me look rather like an unpopular supervillain.

I usually try not to complain about the world, and look for the good and bad in everything, and I think most of my previous articles have been relatively optimistic about life in general. The frequent headlines about new technologies either killing or curing our brains are misleadingly one-sided. The truth is usually somewhere in the middle. But when it comes to pollution, I don’t think I am exaggerating too much. It’s hard to see how the particles of poison entering my respiratory system could be good for my health in any way. As with cigarettes, sometimes the science is convincing enough that we can wholeheartedly say “smoking is bad for you”. So in the spirit of the anti-smoking lobbies that rightly worked so hard in the past, here is some of the recent evidence on how air pollution may be destroying your brain.

Photo by Katerina Radvanska on Unsplash. Adapted by yours truly.

The first important question to ask is whether it’s even possible for external pollutants to enter the brain. The blood-brain barrier is good at keeping foreign particles in the blood from interfering with the brain, but pathogens can find other entry points. A research project published in 2016 looked for nanoparticles of magnetite (an iron ore) in brain tissue samples from people who died in fatal accidents while they lived in Manchester, UK, and Mexico City. Magnetite particles are produced by combustion and abundantly found in the air we breathe in major cities. Biologically produced magnetite is also naturally present in the brain, but the researchers could distinguish this natural magnetite from airborne magnetite by comparing their structures. Rounded nanoparticles like those found in air pollution outnumbered natural magnetite in the brain samples. The tiny size of the particles (< 200 nanometers) meant that they could enter the brain via the olfactory nerve, the nerve connecting the smell receptors in our nose to the brain. High amounts of brain magnetite have been linked to neurodegenerative diseases like Alzheimer’s, and the researchers did indeed find high concentrations in the brain samples from older people who had a history of symptoms. But scarier than that, some of the highest magnetite concentrations came from younger people living in Mexico City, especially in those who were exposed to the most polluted areas.

The research above shows that external pollutants can contaminate the brain. But the next question is whether there is good evidence of a link between pollution and brain abnormalities or cognitive deficits. I’ll describe the evidence of harm to different groups of people, in order of their age.

Let’s start with the unborn. Fetuses in the womb can be exposed to pollutants through the placenta, so researchers tested whether the air quality a mother experienced in her third trimester would predict her child’s brain development over its first 7–9 years of life. They found that greater prenatal exposure to pollutants was linked to smaller white matter volumes across the left hemisphere of the child’s brain in later life (at an average age of 8 years old). Sadly, this reduced white matter volume correlated with slower mental processing, more behavioral problems, and stronger ADHD symptoms in childhood.

Next up is air pollution at school. Researchers compared the cognitive development of children in schools exposed to high versus low levels of traffic-related pollution. They assessed memory and attention performance every three months for one year, for almost 3000 kids across 39 schools in Barcelona, Spain. As you might expect after the results of the last study, the kids in the more polluted schools showed less improvement in their working memory and attention performance over the course of the year. This deficit in cognitive performance may be linked to impaired connectivity across the brain in school children who are exposed to more environmental pollutants.

Photo by Peter Hershey on Unsplash. Adapted by yours truly.

Tests on older women have also suggested a connection between brain structure and estimated exposure to air pollution based on where they have lived. Consistent with the placental pollutant effects I described earlier, these tests showed that women previously exposed to more airborne particulates had smaller white matter volumes, even after controlling for other demographics or relevant health issues. Given that air pollution is consistently a risk factor for dementia, it may be that these white matter deficiencies are related to the onset of neurodegenerative diseases. But we need to wait for more research to clarify the exact links between pollution, white matter damage, and cognitive declines in older age.

Most of these studies are about the long-term effects of toxic particulate matter, but there may also be immediate threats from sudden changes in pollution levels. A large systematic review of 6.2 million events across 28 countries looked at whether short-term increases in air pollutants resulted in increased hospital admissions for strokes or fatal health problems. Admissions did indeed increase with rises in airborne particulate concentrations and gases including carbon monoxide, sulphur dioxide, and nitrogen dioxide. So next time there’s a severe smog problem outside, it might be best to stay indoors, especially if you have existing health problems.

Photo by Alex Gindin on Unsplash. Adapted by yours truly.

I’ll highlight one final study because it was published so recently and because the methods are both simple and smart. A group of researchers took the data from an existing national survey across China that included tests of verbal and mathematical ability. They then took the exact dates and locations of those surveys and matched them to daily air pollution data across China, while removing the effects of other county-level variables (e.g. GDP per capita and population density) and individual-level variables (e.g. household income and education). They found that verbal and mathematical performance was lower in areas with high pollution, and the effects were strongest when you averaged the pollution levels across longer time frames. Men appeared to be more vulnerable than women to the negative effects of air pollution on verbal test performance, and less educated men were the most vulnerable of all.
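The statistical logic of that final study can be sketched with simulated data: regress test scores on a matched pollution measure while including control variables in the model. All numbers below are invented; only the structure of the analysis mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated stand-in for the study's data: each row is one survey
# respondent, matched by date and location to a pollution measure.
# All effect sizes below are invented for illustration.
n = 1000
pollution = rng.uniform(0, 1, size=n)    # matched air-pollution index
income = rng.normal(0, 1, size=n)        # individual-level control
gdp = rng.normal(0, 1, size=n)           # county-level control

# Simulated test scores: pollution hurts, the controls help.
score = 50 - 3.0 * pollution + 1.0 * income + 0.5 * gdp \
    + rng.normal(0, 1, size=n)

# Ordinary least squares with an intercept: score ~ pollution + income + gdp.
# Including the controls isolates the pollution effect from the covariates.
X = np.column_stack([np.ones(n), pollution, income, gdp])
coefficients, *_ = np.linalg.lstsq(X, score, rcond=None)
pollution_effect = coefficients[1]       # recovers roughly -3
```

The real analysis involved many more controls and fixed effects, but the core move is the same: hold the other variables constant and read off the coefficient on pollution.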

The evidence on how air pollution impacts our brain and cognitive performance is discomforting to say the least. There is still much to learn, so it’s worth keeping up to date with the science as it evolves. But in the meantime, I’ll be avoiding the busiest streets during bicycle rides to minimize the risk of the silent dangers lurking in our air.


Stop Assuming They Don’t Like You



Photo by Noah Buscher on Unsplash

When you think about what makes you anxious in life, social events are likely to feature prominently. Public speaking, meeting new people, and competing with others make many of us wince with an awkward pain. We have anxieties about what can go wrong for good reasons: loneliness is a killer, and weak social networks can prevent us from making progress.

Fear of embarrassment may be one of the primary emotional drivers that make us nervous about joining or speaking to a new group of people. We don’t want to be that person standing alone at the party and we don’t want our reputations destroyed by a hasty comment that came out wrong. Generally speaking, two things need to come together to cause embarrassment. The first is a failure according to our personal standards (e.g. falling over or saying something stupid). The second is a social setting in which we know others may be judging us. When you look at the brain while someone is embarrassed, you find activity in exactly the areas of the brain that are most relevant for these functions: emotional arousal areas like the anterior insula that are linked to the experience of personal failure, and ‘mentalizing’ areas like the medial prefrontal cortex that are involved in understanding what other people may be thinking about us.

When we end up in the unfortunate position of social reject, the emotional pain we experience is not so different from the physical pain from a burn. Both are deeply uncomfortable, highly aversive, and both make me want to jump into a large bucket of ice water to numb the pain. In fact, there is striking similarity between the two types of pain in how the brain treats them. They both activate parts of the brain important for processing physical sensations on the body (like the posterior insula and somatosensory cortex). When you really zoom in to look at those areas in more detail, you may be able to detect differences in the precise patterns of activation within them, depending on the type of pain. After all, the two experiences are not entirely identical and we are still very capable of distinguishing them. But there is no getting around it: when we feel socially rejected, it hurts like hell. Whether a romantic partner has called for a hiatus, or we’ve embarrassed ourselves in front of an audience, the brain knows exactly which systems to recruit in order to make it as excruciating as it needs to be.

Photo by rawpixel on Unsplash. Adapted by yours truly.

If some of my previous accounts of personal and general brain-hating haven’t already made it clear, we are vulnerable to errors in our perceptions and thinking patterns. So maybe there are times when we misread how others feel about us. In a refreshingly simple recent experiment, researchers put two strangers into a room, gave them a few ice-breakers, and asked them to chat. They then pulled the pair apart and surveyed them individually on how they felt about the other person, and how they believed the other person felt about them. People consistently underestimated how much the other person liked them, and the researchers called this ‘the liking gap’. This gap between how we think other people feel about us and how they actually feel can persist for months after we meet someone, and it holds true whether the conversations we had were 2 minutes or 45 minutes long.

It’s almost as though we are utterly determined to believe that other people have a problem with us, even in the absence of any evidence to support that belief. The effect may be driven by an excessively critical review of our own performance after an interaction with a new person. We judge our own conversational quality more negatively than we judge other people’s. We dwell too long on small details that might have been mistakes and might have annoyed or offended our conversational partner, and don’t pay enough attention to how they reacted perfectly happily or normally to everything we said. Perhaps this self-critical attitude drives us to improve and become better company in the long run. Or perhaps it needlessly upsets and embarrasses us, and makes us hesitant to meet new people in the future. That’s for you to decide.

Photo by Kelly Sikkema on Unsplash. Adapted by yours truly.

We are gregarious creatures, so friends provide some of the biggest excitements and joys that life has to offer. It’s a good idea to carefully monitor our behavior and make sure we present our best selves when we meet new people. But much of the time, we have a habit of reading the situation poorly. In typical conversations, the pressure to be liked can overwhelm our rationality and distort our judgments about what other people are thinking. When we next conclude that a conversation was a failure, it might be worth a second thought. And even when we really do suffer a social rejection, there may be silver linings we can cling to, like the opportunity to use our emotional reactions and sense of independence as inspiration to be creative (and there are other ways to maximize your creativity too). Aren’t all the best love songs about breakups?


The Day I Embarrassed Myself


Photo by Louis Hansel on Unsplash

The biggest turning points in our lives come from moments when we need to make a decision. We make decisions ranging from the most trivial to the most important every single day. We pick and choose the friends who are right for us, the directions in which to travel, the careers to develop, and the cities to build. Anatomically speaking, we humans are all basically the same. It is our decisions that set us apart.

Decisions are not always easy, and the modern world often asks a lot from our poor ape brains. Sometimes it seems like we can’t win. We can have both too little and too much choice. Our conscious minds can overthink a problem while our unconscious minds miss too much. And we are expected to make reasonable sense of what is around us now, while also predicting the future consequences of the possible decisions available to us.

Predicting the future is no easy feat for non-clairvoyants (i.e. everyone). Many people and events can depend on what we decide to do, and I’m not just talking about the decisions of war generals. Deciding whether or not to buy a coffee right now can impact what we hear and say in a later work meeting and might affect our reputations and careers. Deciding whether to take this crowded train or the next quiet one to university can determine whether we make it to an exam on time or fail. And the most recent pressing decision on my mind while I lived in the UK: deciding whether or not to attend a wedding can impact how particular people feel about us.

As I hinted at when I referred to our poor ape brains, our reasoning and decision-making are not optimally set up for modern civilized society. There are plenty of processes and mechanisms that made sense in our evolutionary history, but are now misaligned with the ideals and demands of modern life. We call them cognitive biases and our brain is littered with them. I will talk through just a few of these in the context of my decision-making on the day I had to attend a wedding, because it’s easy to see how often I make these mistakes. It might seem like a dire situation for human psychology, but far from it. When I notice a cognitive bias appear in my head and remain aware of it, it is less likely to force me into making poor decisions that turn my molehills into mountains.

Keep in mind this important note as I tell you the story: I hate weddings. I absolutely hate weddings. And I argued with my wife every day for a month about why I had to go to this specific wedding, and why I couldn’t just stay at home (just as I argued about my own wedding). This was how my morning went on that day. I will italicize my cognitive mistakes to make them extra embarrassing. I hope you can relate to at least a couple of them. Here goes…

Photo by rawpixel on Unsplash. Adapted by yours truly.

7am — My alarm rings and I slowly open my eyes. It dawns upon me. It is the day of that wedding, and I need to leave the house within the next hour for a long journey from London to Codsall. Codsall for goodness sake. Codsall! What a daft name for a place.

7.15am — I’m still lying in bed, and putting off the day ahead by reading the news on my phone. I get a message from my wife who is flying in from Washington DC and meeting me at the wedding. Her flight was cancelled during the night while I was failing to sleep but avoiding looking at my phone, and she had to get on a new one. She will now be at the wedding 6 hours after I arrive in Codsall. I will need to spend 6 hours in a dingy little depressing village cafe, waiting for my wife, so that I don’t need to spend any time alone at the wedding. This is an abomination.

  • Cognitive bias 1 — Overgeneralizing learned rules: I have been to many small English villages in my time, and I would estimate something like 40% of them had cafes that I did not enjoy sitting or working in. In cities, this value is close to 0%. So I have detected what I believe is a reasonable pattern in the world in terms of my preferences, but I am over-applying the rule to places I have never visited before. Yes, I have encountered far more beastly cafes in small villages than I have in cities. But it is nowhere near 100% of those villages. So I should be giving completely new places a good chance of impressing me with their cafe selection. Some evidence suggests overgeneralization may be relevant in panic disorder and generalized anxiety disorder, where patients’ perceptions of danger spread too far.

7.31am — I’ve made it into the shower.

8am — After sulkily putting on my clothes and throwing my suit in a bag, I am prepared to leave. I look out of the window and it is pissing it down out there (translation for non-British people: raining heavily). I do not have an umbrella.

8.15am — I’ve walked through the rain and I’m now at my local tube (subway/metro) station in east London. The place is crawling with humans scurrying to get to work in central London. I miss the first train I need because too many people get on ahead of me, so I wait my turn at the front of the queue for the next one. It arrives but I’m pushed out of the way by a small woman with curly hair who was behind me. She takes up the last empty space on the train as the doors slam shut, and she looks at me with contempt as it begins moving. This woman is an arrogant, selfish devil-worshipper. I am now a misanthrope for the foreseeable future.

  • Cognitive bias 2 — Attributional biases: We can always catch ourselves making mistakes in how we attribute characteristics to the events in our lives. One example is the “curse of knowledge”, in which both adults and children incorrectly assume that other people know the same things they do. This makes communication difficult and can lead to bad decisions. We also make errors in attributing responsibility, especially by ascribing permanence to what is temporary. When we are happy or sad, we often feel it is a defining part of us rather than a fleeting emotional experience that will come and go. Patients with depression have a worse problem: they believe that any negative emotions will stay with them forever and are their own fault, while positive emotions are an accident that will disappear before long. We also tend to assume that other people’s bad behavior is attributable to basic character flaws rather than the possibility that they are just having a bad day. Is the pushy woman I met on the train really a devil-worshipper? Or could she have got some bad news about a relative that morning?

8.22am — I am sitting in a chair on the train platform in despair, with my head in my hands. Everyone has their own thing going on, entirely ignoring each other. A small black Labrador trots up to my leg on its owner’s leash. It stares into my eyes. This dog knows. It is confused about our culture and behavior and is questioning why we insist on standing in these crowded sweaty places rather than running around outside in the park chasing sticks.

  • Cognitive bias 3 — Anthropomorphization: We enjoy imbuing non-humans with human-like characteristics because we feel it helps us to understand them better. This is not always completely unrealistic. After all, a dog probably has some emotional subjective experiences going on, even if we cannot exactly pinpoint their quality relative to humans. The problems with anthropomorphization are a little clearer when we start talking about inanimate objects as though they were alive. We see eyes in the headlights of cars, a Mother figure within nature, and we form emotional attachments to rocks and bits of metal (e.g. jewelry). This might also relate to our visions of Gods and conscious intentions within natural phenomena throughout history. When people anthropomorphize slot machines, they even gamble more.

Photo by  Daniel Cheung  on  Unsplash

8.45am — I finally make it onto a train, but on the way I become certain I will miss my connecting train from London Euston station to Codsall. This annoys me and I seriously consider going back home, lying on my wonderful sofa, and ignoring all messages and calls from wedding people. But I have already paid money for the return train tickets. Surely I can’t waste that money by not taking the trip now? If I miss the train, I will just need to pay for another ticket at the station. I have come this far, lost this much money, and now I need to see the trip through to the end no matter what.

  • Cognitive bias 4 — Sunk cost fallacy: This is often expressed as ‘throwing good money after bad’. Once we have spent money on something, we feel an overwhelming commitment to it and fight any urge to drop out early. When we buy a ticket to a play or an opera and take our seat, we are more likely to sit through a full 3–4 hours of torment than to leave, even though leaving is the most rational decision if we predict continued disappointment. Remaining committed to a decision after we start it is perhaps one of the biggest drains on human time and happiness. And it is not only money that binds us in this way; commitments of effort and time do the same. We need to be able to stop when the time is right, ignoring past investments that have no real impact on what we do now. When resources have already been committed to a particular course of action, those sunk costs should not brainwash us into continuing with plans that turn out to be ineffective. Quitters are not always weak losers; they are often the strongest and most resilient among us. The sunk cost fallacy may itself be driven by overgeneralization (see Cognitive bias 1 above) of a “Don’t waste” rule.
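The rational version of the rule is easy to state: only future costs and benefits belong in the decision. A minimal sketch of that idea, with invented numbers and a hypothetical `should_continue` helper, purely for illustration:

```python
def should_continue(expected_future_benefit, expected_future_cost, sunk_cost=0):
    # sunk_cost is accepted as an argument only to make the point:
    # it never appears in the comparison below
    return expected_future_benefit > expected_future_cost

# A dull play: the ticket price is gone whether we stay or leave
print(should_continue(expected_future_benefit=1,    # mild enjoyment of act two
                      expected_future_cost=10,      # three hours of boredom
                      sunk_cost=50))                # ticket already paid -> False
```

However large the ticket price, it has no way of changing the answer, which is exactly the point of ignoring sunk costs.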

9.22am — I ran at speeds that Einstein would be impressed with to catch the departing train at Euston station with about 15 seconds to spare. I drop myself down dramatically in an empty seat, and the air pushed out from under me creates a nice calming breeze. I think about the obstacles I have overcome to get here over the last couple of hours. So many separate bad things have happened on the way to this wedding. Positive and negative events seem to happen fairly randomly, so surely I am due a pleasant surprise when I actually get to the wedding. Nobody has ever had such an unlucky roll.

  • Cognitive bias 5 — The gambler’s fallacy: Have you ever been to a casino? Stand by the roulette table long enough and you’ll see something peculiar but intuitive for most people. When the roulette wheel has landed on red or black repeatedly in a row, customers start betting big on the opposite color for the next spin. They believe that in a random sequence, you are unlikely to see a long series of the same event. People intuitively feel that red, red, red, red, red, is less likely to happen than red, black, red, red, black, even though the probability of getting red or black is the same on every spin (just under 50%, thanks to the green zero). This is referred to as the gambler’s fallacy. This is not just something that fools us standard everyday specimens of humanity. In my academic research, I analyzed some data that suggested elite soccer goalkeepers may show similar biases when deciding which way to dive in penalty shoot-outs.
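The arithmetic behind the fallacy is easy to check. Treating each spin as an independent 50/50 event (ignoring the green zero for simplicity), any specific five-spin sequence has exactly the same probability, however streaky it looks:

```python
from fractions import Fraction

def sequence_probability(sequence, p_red=Fraction(1, 2)):
    # Spins are independent, so the probability of one particular
    # sequence is just the product of the per-spin probabilities
    prob = Fraction(1)
    for spin in sequence:
        prob *= p_red if spin == "R" else 1 - p_red
    return prob

streak = sequence_probability("RRRRR")   # five reds in a row
mixed = sequence_probability("RBRRB")    # a "random-looking" mix
print(streak, mixed)  # both 1/32
```

Five reds in a row feels special, but the wheel has no memory: every exact sequence of five spins is a 1-in-32 event.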

12.00pm — I’m sitting in a cafe in Codsall, and against all the irrational odds I set myself, it’s one of the most peaceful and wonderful cafes I’ve ever sat in. I got more writing done than I normally would, had amazing cheap coffee and cake (relative to London where I lived at the time), and talked to random strangers about their lives. My wife ended up arriving around 6 pm and we made it to the wedding just before the curtains closed. I even enjoyed what was left of the wedding. Codsall is great…

Photo by Gades Photography on Unsplash. Adapted by yours truly.

I walked you through my mental mishaps on that wedding day because they are so representative of our general everyday reasoning (feel free to describe your own examples in the comments section to help me look less stupid). But the same biases could just as easily apply in more serious situations, where the basic impulses in our characters guide us towards disastrous beliefs and actions. All of the biases listed above can change our lives by affecting the decisions we make. And there are certainly many more than the ones I mention. By being more aware of them, we can minimize the chance that they blindside us where it hurts.


How You’re the Easiest Person to Fool


What makes humans so special? Our ability to reason may be what most distinguishes us from other animals. We can consider our motivations, plan ahead, and look back on what we’ve done to understand how it went. But we are far from omniscient: our brains are designed to be efficient rather than perfect (if perfect is even possible). Because of this, we frequently make errors in judgement, and even convince ourselves of having made decisions that others made for us.

We regularly take shortcuts in our reasoning. Intuition plays a dominant role in our decision-making and this is not always a bad thing. It means we make quick decisions when it matters. When dogs see a frisbee flying in their direction, they don’t need to model the exact trajectory and flight speed of the object as it moves through three-dimensional space. Instead, they simply move to keep a steady optical angle between themselves and the object. When the frisbee is approaching in a straight line, they are likely to be in a good position to catch it. And don’t assume that dogs use this system because they’re too stupid for something more advanced. Professional baseball and cricket players use this same shortcut, and perform incredibly well with it.
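One simple version of this “steady optical angle” idea is constant-bearing pursuit: if you cancel the target’s sideways motion across your line of sight, the bearing never changes and a closing chase ends in an interception. Here is a toy simulation of that rule; the positions, speeds, and the `intercept_heading` helper are invented for illustration, not taken from any study of dogs or fielders:

```python
import math

def intercept_heading(px, py, tx, ty, tvx, tvy, speed):
    """Heading that keeps the bearing from pursuer to target constant.

    The pursuer matches the target's velocity component perpendicular
    to the line of sight; the leftover speed closes the distance.
    """
    dx, dy = tx - px, ty - py
    dist = math.hypot(dx, dy)
    ux, uy = dx / dist, dy / dist          # unit line-of-sight vector
    cross = ux * tvy - uy * tvx            # target's sideways speed
    los_angle = math.atan2(dy, dx)
    return los_angle + math.asin(cross / speed)  # assumes speed > |cross|

# Frisbee starts at (10, 10) drifting left; the dog starts at the origin
px, py, speed = 0.0, 0.0, 1.5
tx, ty, tvx, tvy = 10.0, 10.0, -1.0, 0.0
dt = 0.01

for _ in range(2000):
    heading = intercept_heading(px, py, tx, ty, tvx, tvy, speed)
    px += speed * math.cos(heading) * dt
    py += speed * math.sin(heading) * dt
    tx += tvx * dt
    ty += tvy * dt
    if math.hypot(tx - px, ty - py) < 0.05:
        break

print(math.hypot(tx - px, ty - py))  # dog ends up on the frisbee
```

Notice that the rule never predicts where the frisbee will land; it only nudges the heading so the angle to the target stays fixed, which is what makes it such a cheap shortcut.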

Photo by James Ting on Unsplash. Adapted by yours truly.

When you need to make quick decisions, these shortcuts based on the simplest sources of information may be the best option. In fact, the advantages of this intuitive decision-making may not be exclusive to decisions under time pressure. Even some business decisions that don’t necessarily depend on speed may be most accurate when using intuitive assumptions rather than complex statistical models. When assessing whether someone is likely to be a repeat customer, a shortcut based on when the customer last bought a product performs better than a complicated model that accounts for overall customer dropout rates, purchasing distributions, and customer lifetime distributions. Next time you sit for hours thinking about which of five backpacks to buy — analyzing their colors, sizes, styles, brands, etc — maybe you should stop yourself and just click to buy the cheapest so that you can go out and do something more worthwhile.
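The customer shortcut described above is sometimes called a hiatus heuristic: one cue, one threshold, no model fitting. A minimal sketch of the idea, where the nine-month cutoff and the function name are my own illustrative choices rather than anything prescribed by the original research:

```python
def is_active_customer(days_since_last_purchase, hiatus_days=270):
    # One cue, one threshold: a customer who has bought anything
    # within the hiatus window counts as still active; everyone
    # else is treated as lost. No dropout rates, no purchasing
    # distributions, no lifetime models.
    return days_since_last_purchase <= hiatus_days

print(is_active_customer(45))    # bought last month
print(is_active_customer(400))   # silent for over a year
```

The appeal is that the single cue (recency) carries most of the predictive signal, so the elaborate statistical machinery adds complexity without adding accuracy.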

Shortcuts or “heuristics” are fantastic tools, but in the modern world, they often lead us astray. We can easily be caught in the midst of irrational and biased reasoning. For example, when we hear a description of a person and are then asked to judge the probable truth of different statements about them, we may well be inclined to talk nonsense. Imagine the following scenario. Linda is “31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations”. Knowing that about Linda, how would you rank these statements in order of likelihood:

  1. Linda is active in the feminist movement

  2. Linda is a bank teller

  3. Linda is a bank teller and is active in the feminist movement

People generally believe that 3) above is more likely than 2). This is because of the ‘representativeness heuristic’. Activity in the feminist movement feels as though it would be representative of Linda given her description, so we place a lot of weight on that statement. So much weight in fact that we believe it’s more likely Linda is a bank teller and a feminist, than just a bank teller with no additional statement about her. If you can’t quite intuit why this is a fallacy (the conjunction fallacy to be specific), just replace the feminist part of 3) with any other neutral descriptor. It’s easier to see why you’d rather put your money on Linda being a bank teller than both a bank teller and someone who enjoys flying kites. One of the statements demands only one thing to be true while the other demands both that thing and an additional qualifier. By any logical standard, the statement with the additional demand can never be more likely.
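The logic here is just set inclusion: feminist bank tellers are a subset of bank tellers, so the conjunction can never win. A toy head-count makes it concrete (all the numbers below are invented purely for illustration):

```python
# Hypothetical population of 1,000 people
total = 1000
bank_tellers = 20            # everyone who is a bank teller
feminist_bank_tellers = 5    # bank tellers who are ALSO feminists
                             # (necessarily a subset of bank_tellers)

p_teller = bank_tellers / total
p_teller_and_feminist = feminist_bank_tellers / total

# The conjunction can at best tie the single event, never beat it
print(p_teller_and_feminist <= p_teller)  # True, for any such counts
```

Whatever counts you plug in, the subset can never outnumber the set that contains it, which is exactly why ranking statement 3) above statement 2) is a fallacy.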

Now and then, our brain just plain makes things up. If I ask you to choose between A or B, and you choose A, what do you think you’d say if I asked “So why did you choose B?”. In a conversation where you didn’t know this was coming, you may not even bat an eyelid. You’d go on to explain exactly why you chose B and why B is so great.

Photo by Vladislav Babienko on Unsplash. Adapted by yours truly.

In one amusing experiment, men and women were shown two photos of female faces, and asked to choose which they found more attractive. The researcher then gave that photo to the participant for a closer look, and asked them to explain why they chose that photo. Unbeknownst to the participants, on some trials the sneaky researcher switched the photos with a subtle trick that meant the participants were now explaining why they chose the photo that they did not choose. But rather than acting surprised and complaining that they chose the other photo, most participants began to reel off the reasons why this face — the face they did not choose — was the more attractive one. Sometimes, they would cite reasons that could only apply to their originally chosen photo, in full view of a face entirely inconsistent with their reasoning (e.g. “because she was smiling”, when the woman in the new photo looked conspicuously miserable). Other times, they would cite reasons that only applied to the new photo, and could not have applied to their original choice (e.g. “I like earrings!”, when the original photo displayed no jewelry whatsoever). And occasionally they would simply invent excuses about how the woman they were seeing at that moment was “very hot” or had “more personality” about her, in spite of the fact that they originally judged her to be lower in those traits.

Whatever the specifics of the confabulations, people are certainly willing to deceive themselves. This level of self-deception and post-hoc rationalization does not only apply to judging physical attractiveness. You can find the same patterns of choice blindness when it comes to political opinions and even moral attitudes. Perhaps we’re not as inflexible and resistant to change as it commonly appears. All it takes is for someone to fool us into thinking that we’ve already changed our minds.

When we reason ourselves into a void, it’s easy to become distrustful of the squishy organs sitting inside our heads. But that would be expecting too much from your brain. It’s great at its job precisely because it takes so many efficient shortcuts. When a lion is spotted nearby, the person who immediately thinks “run” is likely to make a lot more progress in life than the person who sits analyzing the lion’s distance, speed, size, and probable hunger levels. Our intuitions are certainly capable of embarrassing us, and it’s always worth quadruple-checking our thinking process. The modern demands of societies, economies, and the internet, are unlike anything that humans ever experienced in their evolutionary history. So the occasional snafu or two should be expected when we try to make sense of it all. Let’s just hope we can limit the damage with a little extra self-awareness.


This Is Your Brain on the Internet


This article was a front-page feature on Medium.

The Earth has never witnessed a more seamless tool for knowledge-sharing than the internet. Word-of-mouth is a great way to send knowledge from one brain to another, and the internet allows us to do this with practically any source of information in an instant, from one side of the world to the other. But as Uncle Ben says in Spider-Man, “with great power comes great responsibility.”

Our frictionless information sharing has become a bit of a pain. We’re reluctant to invest time in checking the validity of each piece of news we find, and this leaves a lot of dishonest or misleading material circulating around in our brains, potentially changing our behavior in dysfunctional ways. Learning is a great and productive tool when it teaches us what is true about the world, helping us to better navigate it. It’s not so great when it teaches us a falsehood. If we learn that marshmallows are good for building electronic machines, for example, we won’t get very far. But if we learn that silicon is good for building electronic machines, we get the smartphone.

True and false news spread differently across the world of social networks. Falsehoods tend to be more novel and unique in their messages (they are being invented after all), so they spread faster and wider across networks than truths. The way in which people interact with the messages also varies. False news inspires comments of surprise and disgust, while true news inspires comments of sadness, anticipation, and trust.

We cannot blame bots for this state of affairs. Although they amplify and accelerate the spread of news, they tend to do this equally for both true and false news. It is we humans, with our attraction to the unique drama, surprise, and disgust of false news, who are primarily responsible for the avalanches of misinformation that characterize modern information consumption.

The internet has changed the way that our brains work. Humans have always been good at learning and adapting to new environments. So given the internet’s dramatic impact on life in the developed world, it is no surprise that we have adjusted our thinking and behavior. The biggest impact has perhaps come from companies like Google, which make all knowledge available to us at a few keystrokes.

We can test how much technology has influenced our mental function by examining how and when the brain activates tech-related concepts. When words and concepts are readily accessible at the front of our minds, they often distract us and interfere with how well we perform behavioral tasks. Researchers have used this principle to test whether difficult trivia questions automatically activate internet-related concepts in our brain. If we don’t know the answer to something, our first thought is likely to be “Google.” When study participants took part in a behavioral task immediately following difficult trivia questions, their performance in that task worsened when words like “Google” appeared on a screen, distracting them.

There are other powerful indicators of Google’s impact on our mind. When we type out interesting trivia tidbits on a computer, our memory for the information is significantly better if we are told that the computer will delete rather than save the information. And if we type out tidbits and save them to a specific folder, we are more likely to remember where we stashed that information than details of the actual contents.

In other words, our internet usage has “Googlified” our brains, making us more dependent on knowing where to access facts and less able to remember the facts themselves. This might sound a little depressing, but it makes perfect sense if we are making the most of the tools and resources available to us. Who needs to waste their mental resources on remembering that an “ostrich’s eye is bigger than its brain,” when the internet can tell us at a moment’s notice? Let’s save our brains for more important problems.

Image: sbtlneet/Pixabay/CC0, modified by author.

The internet acts as a great aid, but our faith in and reliance on it can make us overconfident in our own abilities. After using the internet to look up answers to questions, we begin to believe that our own general ability to understand and explain problems is better than it really is. Compared to those without recent internet access, we even insist that our brains are more active, reflecting our illusions of superior competency. We often fail to grasp just how much we are relying on sources beyond our own talents when we succeed in the world.

Photographs also have transformative effects on the way our memories work. When we walk through an art museum, we remember less about the display pieces if we take photos of them, even when it takes longer to photograph them than to simply look at them. Our memory is less impaired if we zoom in on specific details of the pieces while photographing them, but it is certainly no better than enjoying the pieces themselves without our paparazzi behavior.

Photographs can be a great way to physically save a moment into your collection, and cameras may help visual memory if used as a tool to enhance how you engage with an experience. But don’t let them come at the expense of your own enjoyment and natural memory of the real thing in front of you. It’s counterproductive and a little bizarre to take photos of the world’s wonders, but forget to look at them while they’re actually there.

In our modern digital world, we are using increasing numbers of different media at the same time. The effects of this on our general cognitive capacities are not yet clear, but there may well be some costs. A 2009 study showed that people who heavily engage in multiple forms of media at the same time (e.g., talking on the phone, while working on an essay, while listening to music, while watching TV), perform worse in standardized cognitive tests that measure memory, attention, and task-switching. (A 2013 study suggested the opposite effect for task-switching.)

Heavy media multitaskers may be more vulnerable to distraction and interference from irrelevant sources of information. When you constantly bounce between multiple sources of entertainment and work, you may well be training your mind to become more volatile and less able to sustain attention to the one important task you really need to complete.

Photo: Erik Lucatero/Unsplash, modified by author.

To fully understand the costs and benefits of the internet on our brains, we need to patiently watch how the research evolves over the next few decades. The reward-rich world of the internet may come with costs that include distractibility and impaired self-control. Recent studies even suggest that children who use the internet excessively may develop less gray and white matter volume in certain brain areas, and may suffer harm to their verbal intelligence. It is not yet clear if internet usage directly causes these effects or if children who are predisposed to the effects are just more likely to overuse the internet. For now, the evidence provides notes of caution and attention rather than conclusive insights.

The advantages of using the internet correctly are enormous, so we need to be careful about making any concrete recommendations on usage limits. However, as with practically everything in the world, moderation and thoughtful consumption are likely to go a long way.

When we pay careful attention to what the internet is doing to us in our own lives — how happy or sad it is making us, and how much it is helping or hindering our progress — we can make better decisions about optimizing our well-being. The internet is amazing, but the beautiful world outside is also waiting for us to directly experience, learn from, and appreciate it. The whole wide world and the world wide web may well compete for our time and attention. It is up to us to maximize the benefits in our own lives by choosing the right “www” when it matters.


Learning, the Easy Way


Without the ability to learn, we would be a pretty useless species. The most exciting advances in artificial intelligence come from introducing the ability for machines to learn for themselves and apply that knowledge to future problems, because we know that advanced learning was a major turning point in human evolution. Some even worry that if robots learn too well, it could usher in an intelligence explosion that signals the end of humanity. But machines and existential catastrophes aside, in our own little way, humans have been learning successfully for hundreds of thousands of years. We teach each other how to use new tools, from the most primitive hammers and knives, to the more remarkable modern tools at our disposal like smartphones. We also teach each other the rules for living in our societies, producing cultures within which we can develop functional laws and economic systems. Effective learning and knowledge-sharing is the ultimate path to prosperity.

The brain treats tasks during learning very differently to how it treats them after mastery. Take the example of learning to drive a car. In your first few lessons, driving a car is quite difficult. It requires you to learn several different motor skills, including adjusting foot pedals at the right time, shifting gears smoothly, keeping the steering accurate, and keeping your eyes on the mirrors. At the most demanding times, you have to do all of this at the same time. When you are unfamiliar with these tasks, you need to pay a lot of attention as you perform them. You may remember the utter exhaustion you felt after each of your early driving lessons as you walked back through the door of your home. This is not because driving a car is a physically strenuous activity. It is because of the draining mental effort you had to put into learning each of the new motor skills required to drive well, under the pressure of other more impatient and experienced drivers behind you.

The prefrontal cortex in your brain, an important area for learning new rules, was highly active when you were learning to drive. It is involved whenever you try to master a new task, and will sit behind much of your conscious effort and mental exhaustion during learning. When you become an experienced driver on the other hand, things are a little different. If you are a regular driver, you do not sit paying attention to the gear stick each time you shift it, or the timing of how you release your foot on the clutch pedal. At this stage of task mastery, your prefrontal cortex and effortful focus are no longer so involved while you perform the activity. Other areas including the striatum (a cluster of cells deeply buried under your cortex) and the deeper cells within your cerebellum (the “little brain” hanging off the rear underside of your big brain), can take over in running the activity “offline” for you (at least for motor learning). The task becomes more automated, and allows you to spend your spared mental energy thinking about how the cloud in the distance looks rather like an elephant, or how you should have phrased something differently during that argument earlier.

Photo by NeONBRAND on Unsplash

The principle of employing a lot of mental effort when learning something, and gradually reducing that effort as we become experts, is true across many types of learning. But can we make life easier for ourselves during that early stage of high mental effort?

For children learning to read and write, computers can be a hindrance, at least when it comes to recognizing letters. In one study, researchers took a group of 76 children aged 3–5 and trained half of them to copy letters of the alphabet by hand, and the other half to copy the letters by typing on a keyboard. After 3 weeks of learning, only the older children showed progress in recognizing letters in a test, but within that older group, those who learned through handwriting performed significantly better than those who learned through typing. When you write letters by hand, specific motor signals and movement-related sensory signals come into play. Writing the letter X requires a very different manual action to writing the letter S, whereas typing the two letters on a keyboard requires practically identical actions. So when typing, you must rely solely on the visual difference between the letters to distinguish them (and perhaps a little on their different locations on the keyboard, but this is still limited compared to the actions required in handwriting).

The richer and more unique signals associated with written letters may therefore assist in learning them. Adults who are taught to handwrite new unseen characters also have a strong advantage in later recognizing those characters, compared to adults who learn the characters through typing. This advantage holds true even when the two groups see the letters for the same amount of time.

Handwriting rather than typing also happens to be good advice for university students. Students frequently take their laptops to class, with the excuse that they can search the internet for class-relevant material to aid their learning. However, perhaps surprisingly, class-related internet usage does not correlate with class performance, suggesting there is no real benefit to bringing that laptop to class. What’s more, perhaps unsurprisingly, irrelevant and nonacademic internet usage is common when students use their laptops in class. And the more they do this, the worse their class performance gets. So there you have it. Taking your laptop to class likely gives you little to no academic benefit, and the constant temptation to check Facebook, or examine the latest funny cat videos, gets in the way of effective learning.

Photo from Pixabay. Edited by yours truly.

There is one other important principle you may want to keep in mind when taking on a new learning challenge: testing works. Our school systems are built around exams at the end of the year that test your progress and potentially set you up for your later life. Although the high-pressure, now-or-never nature of exam grading has its painful downsides, you do learn more effectively by being tested. Imagine the following two scenarios:

Scenario 1: I give you a list of words to remember, allowing you a couple of seconds to study each word. One minute later, I repeat this exact study session. Finally, 5 minutes later, I run through these same two study sessions again with you.

Scenario 2: This starts the same way as Scenario 1, with a study session. But one minute later, instead of a repeat study session, I test you on the words, asking you to recall as many as you can. Finally, 5 minutes later, I run through the same study and test sessions again with you.

If I gave you an exam 2 days after each of the scenarios, testing how many words you could recall from the study list, the second scenario would likely give you the better performance. This is referred to as the testing effect in experimental psychology. Testing leads to more effortful processing of the study material and stronger elaboration on the meaning of the words. These benefits outweigh the advantage of seeing the material for longer across additional study sessions.

So restudying alone seems like a mistake, but students often add to their woe by also cramming their study material into the last couple of days before a big exam. This style of massed study performs worse than spacing out your study sessions. If you really want to remember something over the long term, you should separate your study sessions of the material by days or even weeks. Give yourself a chance to process the material and consolidate your memories before jumping into another revision session. With cramming in quick succession, your brain is likely to habituate to your notes and become less effective at reinforcing existing connections.
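The spacing advice can be made concrete with a small scheduling sketch. The Python snippet below is purely illustrative: the function name and interval values are my own assumptions, not something drawn from the studies described here. It implements one common expanding-interval heuristic, in which you revisit material after 1 day, then 2, then 4, and so on, rather than cramming sessions back-to-back.

```python
from datetime import date, timedelta

def spaced_schedule(start, sessions=5, first_gap_days=1):
    """Return revision dates with the gap doubling after each session.

    An expanding-interval heuristic: study on day 0, then revisit
    after 1, 2, 4, 8... days instead of cramming back-to-back.
    """
    dates = [start]
    gap = first_gap_days
    for _ in range(sessions - 1):
        dates.append(dates[-1] + timedelta(days=gap))
        gap *= 2  # double the gap before the next revision
    return dates

# A plan starting on 1 January lands on Jan 1, 2, 4, 8, and 16.
for d in spaced_schedule(date(2024, 1, 1)):
    print(d.isoformat())
```

Doubling is just one heuristic; flashcard-style spaced-repetition systems typically adapt the next gap to how well you recalled the item last time, but the principle of widening the spacing is the same.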

Photo by Aaron Thomas on Unsplash. Picture of brain from Pixabay. Adapted by yours truly.

Learning helps us to succeed and lead happier lives. It drives positive academic and career progress, along with a more fulfilling sense of accomplishment in everyday life. Whether we are trying to learn a new language, a new sport, or something new in history class, there are specific things we can do to make it easier for ourselves. Motivation is undoubtedly a major player in our ability to learn something new, but understanding how our brain facilitates learning can also give us an extra nudge towards success. So go ahead and get learning. But save yourself some pain by doing it the easy way.


How to Spend Money if You Want to Be Happy



Photo by Christine Roy on Unsplash

Money can do a lot for us. When we don’t need to worry about the price of our next meal, how to afford a home, and how much we need to save for a well-earned holiday, we are certainly better off. But when it comes to the relationship between money and happiness, things get a little more complicated. There is no straightforward answer to exactly how money impacts life satisfaction. So when you next think about how much happier you’ll be if you get a pay rise, or when you next give someone the advice that “money doesn’t make you happy”, you should know that you are partly correct and partly wrong.

For anyone interested in how money affects happiness, the first evidence you probably want to see is the correlation between how rich people are and how happy they are. You can look at this on several levels. First, the differences between countries. Whether you include developed countries, developing countries, or countries going through major political transitions, life satisfaction scores do not seem to increase with economic growth across countries. At least, not over long-term averages. When you look at year-by-year patterns within a single country, for example during a political transition from communism to capitalism, you can spot similar changes in GDP and life satisfaction during those specific years. When the economy crashes, people get relatively miserable. As it recovers, life satisfaction recovers with it.

We can also look at how happiness relates to household income. In the US, there is a weak but significant correlation between a person’s household income and their level of happiness. If you look under the skin of the correlation, you can spot plateaus in how much your happiness increases as you get richer. In 2004, if you made over $90,000 a year, you were almost twice as likely to say you were “very happy” as someone who made less than $20,000. But you’d be practically no happier than someone who made between $50,000 and $90,000. Of course, household income can grow a lot higher than $90,000, and you find much steadier growth in life satisfaction if you look at household income on a logarithmic scale (i.e. rising by orders of magnitude rather than linearly). But keep in mind, there is a difference between typical long-term life satisfaction measures and daily assessments of your emotional experiences. Although the way you evaluate your life in general rises steadily with log income, the way you specifically felt yesterday stops improving after you make around $75,000 (at least in 2008–2009). Above this level of income, you’re probably not going to be any freer to do the things that really make you happy on a day-to-day basis, like seeing friends, enjoying hobbies, and staying healthy. But your general level of life satisfaction seems more sensitive to how much money you have available.
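To see what a logarithmic relationship means in practice, it helps to compare doublings of income rather than fixed dollar steps. The toy model below is deliberately made up: the function name and coefficients are invented for illustration, and only the shape of the relationship (equal satisfaction gains for each doubling of income) reflects the pattern described above.

```python
import math

def toy_life_satisfaction(income, base=4.0, gain_per_doubling=0.5):
    """Hypothetical model: satisfaction rises linearly with log2(income).

    Every doubling of income adds the same fixed amount, so the gain
    from $20k -> $40k equals the gain from $80k -> $160k, even though
    the second jump is four times larger in dollars.
    """
    return base + gain_per_doubling * math.log2(income / 20_000)

for income in (20_000, 40_000, 80_000, 160_000):
    print(income, round(toy_life_satisfaction(income), 2))
# prints: 20000 4.0 / 40000 4.5 / 80000 5.0 / 160000 5.5
```

The point of the log scale is visible in the output: each step up costs twice as many dollars as the last, yet buys the same bump in the (hypothetical) satisfaction score.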

Photo by rawpixel on Unsplash. Adapted by yours truly.

So the relationship between money and happiness is complicated at best. But what exactly could the downsides of money be? One possibility is that money negatively impacts your ability to appreciate and enjoy the smaller things in life — the fresh air in the park or the taste of chocolate. In one experiment, participants who were reminded about wealth and money spent less time eating chocolate, and found it less enjoyable, than people without such a reminder. In general, money mindsets seem to make people less prosocial and more business-like. In addition, those who are making more money are likely to be working more, rather than doing the leisurely things they daydreamed about before getting rich. You can probably see why these kinds of mindsets might not be the happiest.

To add to our burdens, we are creatures who very quickly adapt to new gains and successes in our lives (one of my previous articles goes into our adaptive brains in more detail). This means that a pay rise might make us happier for a few weeks, but as our spending increases and our ambitions drift towards even bigger and better things, we begin to lose that brief spurt of happiness we enjoyed. Even worse, we love to compare ourselves to other people. In times of greater inequality in the community, those at the bottom are likely to feel cheated, which leads to an overall drop in happiness. And as we get richer, we simply find new richer groups to compare ourselves to, so our happiness can only ever last so long.

Perhaps happiness is less about how much money we have, and more about how we use it. When people receive a cash windfall at work, the more of it they spend on others (e.g. friends, charities), the happier they become. In some cases, this indicator of prosocial spending is an even stronger predictor of eventual happiness than the total amount of the windfall in the first place (of course within reasonable boundaries). This is even true in an experimental context when you give people money in the morning and ask half of them to spend the money on themselves, and the other half to spend it on others, throughout the day. At the end of the day, the people who buy gifts for others or give the money to charity gain greater happiness than the people who pay off bills or buy themselves a gift. And it doesn’t matter whether you’re given 5 bucks or 20 bucks, the strongest predictor of your happiness is still how prosocial you are in your spending, not how much you receive. The emotional rewards of sharing may even be universal and deep-rooted enough to apply to children. When children share one of their snacks with a cuddly puppet friend, they express more happiness than when they receive the treat for themselves. In fact, their happiness is greatest when they share one of the treats from their own stash rather than simply an extra treat they are handed by a researcher.

Photo from Pixabay. Adapted by yours truly.

We all think we’ll be happy with more money, and there is certainly some truth to this. But it’s not the whole story. We quickly get used to new ways of living and new purchases we make, so our day-to-day happiness has its limits when it comes to income. However, with a little extra generosity and a stronger focus on using our time for activities that actually make us feel good, we can fully reap the benefits that money and life have to offer. Nobody likes to end on a cliché, but our happiness really does rest on making our money work for us, rather than the other way around.


Social Signals: From Awkward Encounters to Best Friends



Photo by Duy Pham on Unsplash

Some people just have it. None of us really know what ‘it’ is, but we tend to call it social charisma. As I write this, I am sitting at a desk in the British Library in London, UK, opposite a young male French student who is studying here, with two British female students next to him (I don’t know any of these people). The male was a stranger to the female pair only moments ago. But now they are fully engaged in a dialogue full of smiles and honest enjoyment. The interaction brings to mind the kind of dance that some colorful species of male bird might perform to attract a mate. And this guy is the most colorful in the entire library. It really doesn’t seem to be what he says, but the way that he says it. Anyone who listens is bound to be enthralled, purely through the non-verbal communicative signals he gives off. I am certainly enthralled. And I am certainly more than a little odd, because I have been glancing at and eavesdropping on this situation for far too long by any normal standard.

New social situations are a pretty tense environment for many of us. We want to be liked, and the feeling of being judged is anxiety-inducing. For some people with neuropsychiatric disorders, the experience is even more salient. Tourette syndrome is a disorder characterized by unwanted, rapid, repeated, and sudden movements of the body. These movements are called tics, and they can include eye blinks, facial movements, and vocalizations. During a research project I was running in Germany in 2013, I chatted with a Tourette syndrome patient about his social experiences. He was the only patient I had met who suffered from severe vocal tics, and it was interesting to talk to him about how that affected his life. He would tell me nightmarish stories about his experiences on local buses, for example, where he would involuntarily shout Nazi slogans, and get beaten up in response by people who thought he was some kind of racist demagogue (remember this was in Germany, and in case you didn’t know, German society is a little sensitive when it comes to speech related to Nazism). He went on to say his tics were always at their worst when it was least appropriate for him to tic, which sounds like the kind of torture the most evil person in the universe would inflict upon their victims. The content and frequency of vocal tics depend on a patient’s anxieties in the moment. Indeed, when I introduced myself as “Erman”, this word featured prominently in the rest of his vocal tics during our time in the room together. He may have adopted it because it was a noticeably unusual name, and because repeatedly shouting it might be considered awkward and inappropriate. With time, it settled down, perhaps because I did my best not to respond to it. But as you can probably imagine, it is quite difficult to avoid reacting when someone shouts your name.

The experience with the patient I mentioned above is a revealing one that we can all relate to, at least to some small and less extreme extent. Social situations provide many of our biggest opportunities to embarrass ourselves, ruin our reputations, and be widely hated. Loneliness is one of the most dangerous psychological threats to our mental wellbeing, and it exposes the deep evolutionary roots of social networks. Historically, friends and family have been essential for care and safety. Although we could lead a physically safe life without them in the modern world, we would struggle to shake the yearning feeling for loving people around us.

Photo from Pixabay. Adapted by yours truly.

One big question is, do we know anything about what makes people likable and effective at building positive relationships? The first bandwagon people jump on in explaining this phenomenon is social mimicry. We like people who copy our behavior as we interact with them, and we also mimic the people we like. This mimicry usually involves small physical actions like adjusting our posture or touching our face while talking, and there is evidence that this characterizes communication between people who are enjoying each other’s company. When a scientific stooge mimics a participant in conversation during an experiment, the participant feels they have a stronger rapport with that stooge, and they perceive the interaction to be smoother, even when the participant is not at all aware of any mimicry. Most of us have probably noticed the special quality of mimicry when interacting with infants, where we are more likely to exaggerate our facial expressions so that they are no longer subtle unconscious movements. Infants react more positively to adults who mimic them and the adults seem unable to stop themselves from maniacally copying every expression on the infant’s face.

So mimicry does seem to work, but it’s likely to perform best when it’s under the radar and detected only unconsciously. If we try too hard with it, there is a strong chance the person being copied will pick up on a slightly odd or robotic communication style, which is likely to drive them away rather than make them like us more. And there is most certainly more to the story. Remember the guy I described at the library? I watched with my eagle eye and failed to spot anything coming close to mimicry from him.

Touch is another powerful social signal. We do not use it much in Western societies, especially in places like the UK, where even extended eye contact creates a sense of awkwardness. But it is certainly a core part of our evolutionary history. Primates, including chimpanzees, groom each other as a social bonding technique, not just as a means to keep clean and eat a parasite or two. The total time that individuals within some primate species spend grooming each other correlates with the strength of their relationship. A chimpanzee is more likely to share food with another chimpanzee if the second has groomed the first within the last two hours, and similar patterns emerge when analyzing the likelihood that one baboon will help another under attack. The time and effort that primates devote to social grooming is unlike any grooming found elsewhere in the animal kingdom, with some primate species spending up to 20% of their day touching and brushing each other.

Photo by Brian Mann on Unsplash. Adapted by yours truly.

Humans are exceptionally good at verbal communication, so our relationships are generally dominated by speaking to each other. But we still enjoy touching our closer friends and family. Mothers stroke their children, and if my childhood is anything to go by, then mothers also incessantly groom their child’s hair when they see a note from school describing a head lice epidemic. Romantic partners also enjoy physical contact now and again, although common jokes would suggest their urges generally decline with the duration of their relationship. Teammates in sport develop elaborate handshake protocols, often exclusive to individual pairs of players, which they execute religiously upon meeting. They fist bump when expressing determination or condolence during weak performance, but reserve chest bumps for scoring celebrations. Waitresses who briefly touch their customers on the shoulder or hand also get larger tips than those who do not, regardless of the gender of the customer.

There is little doubt that touch is capable of strengthening the social bond between people, and positively impacting behavior. It is not necessarily a good idea to comb through the hair of the next stranger you meet. But it is helpful to consider the power and benefit of the sense of touch between loved ones. Subtle pats on the back or arm can enhance friendships, even without a person noticing them. Closer friends are more likely to want to touch each other in the first place, and touching is likely to drive them even closer. Touch is an influential and important social behavior, but that can make it both a very good and a very bad thing, depending on its context. It’s not something we should flippantly hand out like a cookie. So please, use it wisely.

Social behaviors vary between cultures and people, so there’s no one-size-fits-all approach to building strong connections with the people we meet. Some people love hugs and contact, while others will stick to a firm handshake. This complexity in human sociality can cause those awkward hug-handshake hybrid moments when we meet new people, but it also underlies the diversity that we love about the world. It’s always useful to understand the commonalities between us in the way we interact: emotion, mimicry, touch, humor. But in the end, we have to treat each new person we meet as an individual, and work out the ideal way to interact with them in order to build a positive relationship. I don’t know about you, but I find that exciting.