The Altruistic Brain

Photo by Elaine Casap on Unsplash

With our divided politics and fierce news media, it’s easy to lose faith in human goodness. Everyone seems to be out to get each other. However, altruism is still a potent force in the world. An altruistic behavior is an action that benefits a recipient but presents some cost or burden to the altruist. We find examples of people who volunteer for charity work and help strangers in need without any obvious expectation of a return favor. Why exactly would we decide to behave altruistically? Is it purely the positive influence of the social institutions we have designed, or are there also altruistic pressures in our biology? Studying the brain can give us some clues.

The brain and the dictator game

Some people are more altruistic than others. Researchers can measure altruistic behaviors in tasks such as “dictator games”, where a participant decides how evenly to split a pool of money between themselves and a second stranger playing the game. Let’s say a participant is given $1200 and offered two options:

  A. Split the money so that you receive $1010 and the other person receives $190

  B. Split the money so that you receive $730 and the other person receives $470

In this scenario, a participant is clearly behaving altruistically if they choose option B. They sacrifice a personal monetary gain in order to give more money to another person.
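The size of the sacrifice can be quantified directly: choosing the generous split costs the participant money and delivers a corresponding benefit to the stranger. A minimal sketch in Python, using the dollar amounts from the example above (the function name is illustrative, not from any study):

```python
def altruism_cost(selfish_split, generous_split):
    """Return (personal_cost, benefit_to_other) of choosing the
    generous split over the selfish one in a dictator game.
    Each split is a (my_payoff, other_payoff) pair in dollars."""
    my_selfish, other_selfish = selfish_split
    my_generous, other_generous = generous_split
    personal_cost = my_selfish - my_generous   # money the altruist gives up
    benefit = other_generous - other_selfish   # extra money the stranger gains
    return personal_cost, benefit

# Selfish split: ($1010, $190); generous split: ($730, $470)
cost, benefit = altruism_cost((1010, 190), (730, 470))
print(cost, benefit)  # prints: 280 280
```

By varying these amounts across trials, researchers can estimate the maximum personal cost each participant is willing to tolerate.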

When participants in one experiment played games like this, researchers found that more altruistic people — those who were willing to give up more — had larger gray matter volumes in a part of their brain called the right temporoparietal junction (TPJ). During altruistic behavior, TPJ activity was highest when the personal costs of altruistic actions were just below the ultimate cost people were willing to pay. If the costs exceeded their willingness, or were much lower than their willingness, their TPJ remained relatively quiet. The anatomy and function of this brain area may therefore relate to how much we are willing to sacrifice. When we realize we need to make a major personal sacrifice, but agree in order to help someone else, our right TPJ is likely to reach its peak level of altruistic activity.

Empathy in altruism

Our motivations to behave altruistically may frequently come from a feeling of empathy. When we see others in pain, we activate similar feelings within ourselves, and this mechanism of empathy seems common across many mammals and birds. It is therefore a strong candidate for explaining part of the story about how altruism evolved. Helping others is rational when we feel as bad as they do, because we also want to get rid of our own negative feelings. Maybe Joey in Friends is right when he argues that there are no selfless good deeds.

Young people today may be familiar with the experience of watching other people play video games. Online video gaming broadcasts attract large followings. The vicarious enjoyment of seeing others play a game may relate to empathy. An experiment in 2007 studied participants’ brains as they either played a simple game or watched somebody else play the game. People who showed greater brain activity during game viewing versus game playing, in an area known as the posterior superior temporal cortex, reported engaging in more altruistic behaviors in their everyday life. These high altruists were better able to link events in the game to the decisions and actions of the game player they were watching. The empathy that allows them to detect intentions and purpose in other people may also bolster their altruistic instincts.

Friendly genes and chemicals

Oxytocin is a chemical produced by the hypothalamus in the brain, and it plays an important role in social behaviors. When oxytocin is administered through the nose, male participants are more likely to trust and cooperate with people within their in-group. With out-groups, they become more likely to be defensively aggressive, for example by engaging in pre-emptive strikes against a group they see as a threat. But they are no more likely to be offensively aggressive in wanting to greedily exploit others. This has been called a “tend and defend” response in an effort to make it as memorable as the “fight or flight” response for adrenaline.

When we are high on oxytocin, we also adapt how we donate money to charities. We become more likely to donate to social charities rather than environmental charities. Experiments have also compared how we spend money on goods when we take oxytocin versus a placebo. Compared to placebo, oxytocin doubles the price we are willing to pay for sustainable goods that are framed in terms of social benefits (e.g. fighting poverty, improving labor conditions). But when sustainability is framed in terms of environmental benefits (e.g. conservation, biodiversity), the price we are willing to pay does not change.

We may find traces of altruism in our genes. Studies on twins allow us to test how much of a particular trait could be explained by genetics versus environment. For example, if identical twins who are reared in separate families happen to be more closely matched on a trait than non-identical twins reared in the same family, that’s a strong sign that genetics plays a major role in defining that trait. According to twin studies, around 50% of the variability in empathy and altruism can be explained by genetics.
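Heritability estimates of this kind are commonly derived from Falconer's formula, which doubles the difference between the trait correlations of identical (MZ) and fraternal (DZ) twins. A minimal sketch, with made-up correlation values chosen purely to illustrate a 50% estimate:

```python
def falconer_heritability(r_mz, r_dz):
    """Estimate broad heritability h^2 from twin-pair trait correlations
    using Falconer's formula: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

# Illustrative (invented) correlations for an empathy measure:
# identical twins r = 0.50, fraternal twins r = 0.25
h2 = falconer_heritability(0.50, 0.25)
print(h2)  # prints: 0.5 -> about 50% of variability attributed to genetics
```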

Researchers are beginning to find gene candidates related to our altruistic tendencies. Evidence has specifically linked prosocial behavior to genes that regulate hormones including vasopressin and oxytocin. Dopamine — a neurotransmitter recruited in functions such as reward and motivation — also seems relevant to altruism. In one experiment, participants were paid to complete some tough cognitive tasks, and were then asked whether they wanted to donate any of their hard-earned cash to a poor child living in Peru. Participants who carried a specific form of a gene involved in catabolizing dopamine (the COMT gene) donated around twice as much money as the people who lacked that particular form. Social behavior is complicated, so unsurprisingly, the story of altruism seems to involve several hormones, genes, and brain networks.

Extraordinary altruists

The amazing levels of self-sacrifice from some people have earned them the label of “extraordinary altruists”. In academic literature, this label commonly refers to people who donate a kidney to a stranger. This is clearly a tremendous act of kindness: the donor goes through major surgery and takes on health risks in order to save the life of someone they do not know. The brains of extraordinary altruists look like the inverse of a psychopath’s brain. When you compare their brains to a typical population, the altruists have larger amygdalae that are more responsive to fearful facial expressions. Psychopaths show precisely the opposite pattern: smaller amygdalae and less emotional reactivity.

Another recent study from late 2018 took a different approach in analyzing the brains of extraordinary altruists. Kidney donors had their brains scanned while a painful pressure was delivered to their thumb. On some trials, instead of receiving the pain themselves, they watched the pain being delivered to another person’s hand. When researchers examined the levels of activity in the anterior insula of the brain, they found overlapping activity for experiencing pain directly versus watching pain in others. This overlap was greater for the altruists than for typical people. The more reactive their insula was for personally experienced pain, the more reactive it was for empathic pain when watching others get hurt. In other words, evidence consistently shows that altruists have strong indicators of compassion and empathy in their brain, and this predicts their abundant desire to help.

Another group of extraordinary altruists are people who have won a Carnegie Hero Medal for heroic acts as a civilian. Researchers examined the testimonies of medal winners from published interviews, and found that they frequently described acting automatically and intuitively during their moments of heroism. Even when there was sufficient time for them to deliberate over what to do and reflect upon their decision before becoming a hero, their behavior was still dominated by intuition.

This also fits the descriptions of another recent hero, James Shaw Jr., who single-handedly wrestled a weapon away from an active gunman during the 2018 Nashville Waffle House shooting. In interviews afterwards, he described how quickly the events unfolded. He does not believe that he should be considered a hero, because his actions were instinctive and he was not thinking about saving the lives of other people at the time. But, of course, his courage and life-saving instinct make him a hero regardless of the speed or automaticity of his actions. Heroism does not require deliberation.

What makes an altruist?

Aside from the biological mechanisms underlying our altruism, what influences could drive someone to be altruistic rather than selfish? During one experiment, 1–2 year old children either played reciprocally with an experimenter in a room, for example rolling a ball to one another, or the child and experimenter played with their own separate toys. When the experimenter later needed help reaching for an object, the children who had engaged in reciprocal play were significantly more likely to give them a helping hand. In fact, after reciprocal play with the experimenter, they were also more likely to help another adult who entered the room, suggesting that reciprocal play promotes altruism toward people in general rather than just the people we play with. Older children around 4 years of age showed similar effects in their behavior. After reciprocal play, they were more generous in distributing stickers between themselves and an experimenter. When interacting cooperatively and reciprocally, children learn to be receptive to the needs of others and act benevolently.

In another study, adults underwent compassion training. They imagined loved ones, strangers, and enemies in their minds, while cultivating feelings of warmth and positivity towards them. A control group instead practiced reinterpreting stressful events from their personal life. After 30 minutes of practice a day for 2 weeks, participants who went through compassion training were more likely to spend their own cash to help a victim who was treated unfairly in a game. This enhanced altruistic impulse was mediated by changes in the brain that included the inferior parietal cortex, the prefrontal cortex, and connections between the prefrontal cortex and deeper brain structures involved in emotional regulation, such as the nucleus accumbens.

But how do we stop the cheaters?

Altruism only really works when we don’t have a huge group of unfriendly narcissists taking advantage of us. So how does humanity maintain a sense of selflessness when up against those kinds of counter-pressures?

One answer is straightforward: punishment. In fact, punishment itself can also be an altruistic act. Imagine you are working in a group where everyone benefits when each person in the team cooperates. When one person chooses not to cooperate, they may get a larger payoff, but it comes at the cost of the rest of the team. So what can a cooperative team member do to stop that person? One option is to accept a personal cost in order to punish the dissenter. Although both the punisher and the punished take a costly hit, this willingness to altruistically punish leads to greater group cooperation over the long term. It deters cheaters and weeds out free-riders.

Altruists may also have a direct mating advantage. In a 2017 study, participants were entered into a prize draw to win $100, and were asked whether they wanted to donate their potential winnings to a charity or keep them. Participants also completed a questionnaire about their sexual history, including questions about their overall number of lifetime sexual partners. The researchers found that men who chose to donate their winnings reported a greater number of partners from their past. This relationship between altruism and sexual partners did not seem to hold true for women. So, at least for men, behaving altruistically may score you points as a prospective romantic partner.

What are we supposed to take from this?

Altruism has a long and interesting story. We find its markers in our brains, genes, and behavior. Many of our altruistic impulses arise from a feeling of empathy, and our hormones help to drive those experiences. Selfless good deeds may be hard to come by, but why should that degrade the value of our altruistic impulses? Any willingness to help another person is worth respecting and appreciating. Whether you are an extraordinary altruist or just a helpful passerby, please keep doing what you are doing. It would be an understatement to say that you flip frowns into smiles. Many of life’s happiest and most inspiring moments come from the time you did what you could to help.


Curiosity Killed the Human

Photo by Chase Clark on Unsplash

What do you think is behind that door? This kind of question is at the root of many popular horror stories because it tickles our curiosity. We pay for our movie ticket expecting to be scared out of our pants, so we know there cannot be anything good in the other room. And yet, we can’t help but want to see. We’ll cover our eyes if we have to. Just open the door.

Curiosity is an important feature in human psychology. It motivates us to learn and turns us into excitable explorers. Naturally, it is also a guiding principle behind science. Without curiosity, we lose interest in the activities that fuel progress in everyday life. And that level of apathy can be a nasty threat and alarm bell: it forms one of the primary diagnostic criteria for clinical depression.

When you think of curiosity as an impulse to explore, you easily see it in other animals. Anyone with a cat or dog will regularly notice their pet sniffing at new objects in the room. Exploration is a fundamental part of life because it drives animals to find healthier environments and better resources. There is always a tradeoff between exploration and exploitation. Exploration is risky, because in searching for better environments, we need to leave the safety of the environment we have already found. On most days, we may be happy exploiting our current comforts: in my case, the sofa, laptop, and television. But on other days, our curiosity will drive us to see what else the world has to offer.

* * *

Many of our important behavioral drives are innate. For example, we are born with impulses that tell us how to eat and breathe. Regardless of how curious we are about the world, we will know how to digest a hamburger and adjust our breathing during exercise. But certain aspects of the world are unstable and differ depending on where you find yourself. There cannot be a genetic program for building a 1000-foot skyscraper or developing a digital computer. In our evolutionary environment, these did not exist, and they offered no immediate advantage for survival. It is our innate capacity to learn, and be curious enough to learn, that allows us to achieve such dramatic feats which are characteristic of the human species. Importantly, our curiosity often has one primary target: information. When we are curious, information is a reward in itself.

We seek information to reduce our uncertainty about the world and resolve gaps in our knowledge. As I’ve previously described when talking about tip-of-the-tongue feelings, the problems that make us most curious are often those that are already within our scope of broader knowledge, rather than those that are totally alien to us. Most of us are not particularly turned on by someone describing the details of cellular mechanics during the transmission of action potentials. But practically all of us are turned on by someone telling us they’ve found the solution to immortality.

We are most curious when we can clearly see the gap in our knowledge, understand why it’s important, and know how a specific piece of information will patch over it. This is one of the big challenges for educators. When a maths teacher ignores a child who asks “why should I care about algebra?”, they miss an opportunity to stimulate a critical and productive sense of curiosity. When we are curious to learn a piece of information, we activate areas of our brain typically involved in anticipating rewards like the caudate nucleus. In that state, everybody wants to learn.

As I hinted at earlier when talking about the risks involved in exploration, we may occasionally be too curious for our own good. In a fun 2016 study, researchers put a bunch of prank pens that delivered electric shocks on a table in front of participants. They told the participants that the pens were from a previous study and were not relevant for their particular experiment, but that they were free to play with the pens while waiting for their own experiment to begin. In reality, the actual purpose of the study was to see what participants did with the pens.

There were a total of 30 pens on the table, each with a colored sticker to communicate how likely the pens were to deliver shocks when clicked. The participants were told that the 10 red pens were certain to shock if clicked, while the 10 green pens had their batteries removed and could deliver no shocks at all. But there were also 10 orange pens. As clearly explained to participants, only 5 of these pens could deliver shocks and the other 5 were safe. So the orange pens introduced uncertainty in a bid to make the participants curious.

The researchers found that people were more likely to click an orange pen than any other pen. If the participants simply wanted to test how painful the shocks were, then you would expect them to choose the red pens. But the orange pens were the most tempting. The additional curiosity of finding out which orange pens were dangerous boosted the intrigue of this self-imposed macabre game. Sometimes, our curiosity makes us do pointless or even stupid things.

* * *

At this moment, the most curiosity-arousing stories in the news are related to politics. All of the major outlets are plastered wall to wall with text about what the Republicans did yesterday and what the Democrats will do today. What could drive the overwhelming curiosity for this type of news? Politics is such a common interest that everyone is likely to have a strong opinion about it. A conversation on politics is therefore likely to be controversial, and could make us angry, happy, or surprised. But who cares about the specifics? It’s the anticipation itself that makes us curious. So we read, we formulate opinions, and we find someone who will listen to them. I’ve been to social occasions with rules like “no talk about politics”. But it rarely takes long for the host to give up on enforcing that rule. People can’t stop themselves engaging with conversational topics as eventful as politics. We are just too damn curious.

Our curiosity drives us to learn. Researchers can have a good time exploiting this drive using prank pens, but for the most part, our natural curiosity has a deeply important function. It pushes us to solve new problems that propel ourselves, and occasionally the rest of humanity, forward. We frequently struggle to find the motivation to finish a task on our to-do list. The same tasks that were exciting a year ago lose their charm and become just another boring activity to get through.

If we can inject a renewed sense of curiosity into dull tasks, by arousing a desire to absorb new information, then we introduce a motivating reward. I read a lot of research papers, and it quickly becomes a tedious activity. So I’ve taken to crafting each paper into a story that I can tell a friend about, wondering what kind of reaction I can elicit from them. It’s similar to the incentive that attracts us to politics: social communication. This simple twist multiplies my curiosity in reading a boring paper and gives some life to an otherwise monotonous task. All I need to ask is “hmm, what will Steve think when I tell him about this?”. Admittedly, this strategy may also lose its allure before long. But then I’ll find another way to make myself curious.

We can all take on the challenge of making our lives more interesting. But it may be prudent to remember that curiosity can be both prey and predator. When it’s the prey, you hunt it down by finding the answers to productive questions that improve your life. But when it’s the predator, as it is with prank pens and risky social conversations, run in the opposite direction as fast as you can.


The Science of Poetry and the Poetry of Science

Good poetry punches us right in the gut. It combines the immediate visceral beauty of musical patterns with the more cerebral pleasures of language, making it one of the few art forms that have endured across human history. Although similar to lyrical song, poetry takes more of its allure from language than from music. It’s not necessarily easy to see why poetry should affect us quite so deeply. It is just a string of words, after all. But language gives us the means to throw thoughts and feelings from our own mind across into someone else’s mind. Organize your words in the right order, and you can make someone feel like they’ve never felt before. As an adult, that is a rare experience.

In one of my recent articles, I described the chills that music can give us. Poetry can do the same thing, but it targets different areas of the brain, especially those further towards the back of your head like the precuneus and supramarginal gyrus. Activity in a cluster of cells deeper within the brain, called the nucleus accumbens, also sets the scene for that moment of ecstasy. As a poem gets particularly powerful, these cells ramp up their activity in the seconds leading up to peak emotional intensity, and then settle back down as cortical areas like the precuneus hit us with the chills.

Poetry chills come with all the consequences you’d expect, notably including the goosebumps that sprout over the surface of your body. These intense physiological responses are a signal that whatever is happening right now is deeply relevant to us and we should take note. Information that directly affects us on a personal level has priority in the brain. The precuneus, which ups its brain activity when we experience chills, has been associated with exactly this kind of self-referential processing.

The most enjoyable goosebumps, as opposed to say shivers from the cold, are driven by feelings of awe and surprise. So when a poem says something unexpected that connects deeply with our own personal experiences, you can expect chills to take the stage. The right poem proves to us that our experiences are in good company, no matter how alone we feel. It helps us to realize that we are part of something bigger. Our existence comes from an unbroken chain of life on Earth and a flourishing universe across billions of years. It’s harder to feel isolated when you have the whole universe behind you, right?

* * *

Poetic techniques help to make messages more memorable and meaningful. For example, when aphorisms rhyme, we are more likely to believe that they are accurate in their meaning, because rhyme allows us to process a sentence more fluently. You may remember the infamous line from the defense attorney in the O.J. Simpson trial, after the courtroom watched the alleged murderer trying on the glove used in the killing: “If it doesn’t fit, you must acquit”. Perhaps the line owes some of its success to its effective use of rhyme.

Alliteration also helps us to remember information better, not only the alliterated information itself, but also the messages surrounding it. The patterns and rhythms within poems are a sensory bonus in perception. They drive us to pay attention and connect with a message on a deeper level. Often, the best poems are not necessarily saying anything extraordinary or special. They are simply expressing it in a unique way that allows us to see a familiar scene from a new perspective. New perspectives are powerful because they stimulate our imagination and add vivid clarity to important messages that haven’t yet matured within our own minds.

In stimulating new perspectives, poems also make use of another great device: metaphor. Metaphors often adopt basic concepts to aid us in comprehending more complicated concepts. This may lie behind much of human progress in intelligence. It can be tricky for us to grasp many of the universe’s abstract mysteries, but our ability to represent those mysteries in the form of concrete intuitions helps us to break them down into manageable insights. Of course, there may be limits to this. For example, to this day, we struggle to wrap our minds around concepts such as consciousness or quantum mechanics, even with heroic attempts at metaphors. As the physicist Richard Feynman put it, “I think I can safely say that nobody understands quantum mechanics”. And as Niels Bohr, another physicist, put it, “We must be clear that when it comes to atoms, language can be used only as in poetry”. Poetry is often our first and final salvation for making sense of the things that make no sense to us.

To put it simply, poetry nudges us to revisit and reinterpret our own memories. Even the least ambiguous experiences from our past can be brought back to life with a new twist. My favorite example of this comes from the British poet Ross Sutherland, in his piece “Standby for Tape Backup”. I won’t spoil the poem by describing it, but it’s a great example of what can happen in your mind when you bring together old memories, new words, and emotional meaning.

* * *

Language is, of course, our most important social tool. When we interact with others, the level of overlap in the styles of our conversational language predicts the quality of our relationship. When researchers analyzed the language used by people on speed dates, they found that greater similarity in language styles predicted a greater desire to see each other again. They saw similar outcomes when looking at instant messages sent between couples: a stronger match in language style predicted better relationship stability 3 months later.

Poetry is no exception to this rule in communication. Elizabeth Barrett and Robert Browning were romantically intertwined poets in the Victorian era. The language styles within their poems fluctuated in agreement with the quality of their relationship. The language in their respective poems was most synchronized during the happiest and healthiest period of their relationship. The same pattern can be found in the writings of Sylvia Plath and Ted Hughes, another pair of poets with an emotionally turbulent relationship and the poetic styles to match.

As humans, we do not particularly enjoy uncertainty, but we also want to avoid boredom. Poems may occupy a Goldilocks zone between our hesitance to take risks and our search for some hint of novelty. They help us reconstruct our memories in the service of finding meaning, and they help us understand the perplexing world we live in. When our language becomes too repetitive, or our mental life gets a little stale, there is no better cure than a dose of poetry.


Facebook Knows You Better than You Know Yourself

Photo by Hugh Han on Unsplash

Our fears around online data privacy are only just beginning. Social networks are a centerpiece in our daily lives. We use them to absorb news, connect with friends, and develop careers. When a service becomes that fundamental, you expect people to share personal details while using it. And I’m not just talking about sharing names and addresses at sign-up. I’m talking about sharing our moods, personalities, likes, dislikes, and beliefs, as we use digital platforms every single day.

The deeper sides of our personalities have always been somewhat hidden, only revealed to those who are closest to us. Our personalities do affect our outward behavior, so they are not completely private to us as we interact with people. But we can be a little more friendly, thoughtful, or outgoing than normal when we choose to. With new people, we tend to show our best sides because we are better able to keep our undesirable traits under control.

Once we have a romantic partner who sees us constantly, controlling our flaws is a much bigger challenge. We all have moments of weakness, and those moments are often the most revealing about us. We are less conscious about projecting our best possible selves to the people who love us unconditionally. So our family and friends are likely to have a fuller picture of our personality than a stranger. Well, that’s until Facebook came along.

The classiest life events make a prominent appearance on social networks. The latest glamorous holiday, or the promotion at work, is more likely to feature in our feed than the latest TV and sofa binge. However, rather like our romantic partners, our social networks are becoming a heavily connected part of our lives; almost an extra limb. This makes us more likely to share a controversial or traditionally private aspect of our personalities online. We become a little less perfect and a little more honest when we post about our latest health hazard, frustration at the supermarket, or political opinion. Our online personas are becoming extensions of our real selves rather than idealized virtual identities. In some ways, it’s exciting to achieve this level of openness in online communication. In other ways, especially in relation to privacy, it’s plain scary.

* * *

Sometimes, it’s funny to hear our loved ones describe a specific pattern in our behavior that we had not noticed. We exclaim “You know me better than I know myself!” with a smile. But what if Facebook can do the same thing? And what if Facebook knows you better than even your closest relative or loved one?

Our history of Facebook likes is a diamond mine for data scientists. Those likes span a wide range of material, including musical tastes, humorous memes, and political anger. So when they’re analyzed, they present some surprisingly accurate insights into our character.

In 2013, researchers tested how well a computer could understand you as an individual by working through your Facebook likes. Their database covered more than 58,000 volunteers, and their statistical models aimed to predict personal characteristics by analyzing the content of each volunteer’s likes. The models could predict whether someone was homosexual or heterosexual with 88% accuracy, whether they were African American or Caucasian American with 95% accuracy, and whether they were Democrat or Republican with 85% accuracy. In 73% of cases, they could even tell whether someone was a smoker or not.
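In spirit, models like these treat each user as a binary vector of likes and learn which patterns of likes separate the groups. The study itself used large-scale regression models on real profiles; the toy sketch below uses a simple nearest-centroid classifier on invented data, purely to illustrate the idea:

```python
# Toy sketch: predicting a binary trait from binary "like" vectors with a
# nearest-centroid classifier. The data are invented for illustration; the
# 2013 study used regression models trained on 58,000+ real profiles.

def centroid(vectors):
    """Average the like-vectors of one group into a single prototype."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def predict(like_vector, centroids):
    """Return the label whose group centroid is closest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(like_vector, centroids[label]))

# Columns: liked page A, page B, page C (1 = liked, 0 = not liked)
training = {
    "smoker":     [[1, 0, 1], [1, 1, 1], [1, 0, 0]],
    "non-smoker": [[0, 1, 0], [0, 0, 1], [0, 1, 1]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}
print(predict([1, 0, 1], centroids))  # prints: smoker
```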

Using a similar statistical model in 2015, researchers from the University of Cambridge and Stanford decided to examine whether Facebook likes could also accurately predict a user’s personality. The gold-standard personality assessment in academic psychology is known as the OCEAN model which targets the “Big Five” personality traits: openness, conscientiousness, extraversion, agreeableness, and neuroticism. Facebook users completed a questionnaire to assess these traits, and the researchers then compared how well a computer could predict personalities next to the predictions of human acquaintances. Shockingly, the computer models did a better job of predicting users’ personalities than close friends and family. In fact, the computer’s performance was only matched by the best human judges of our character: our spouses.

* * *

So Facebook has the capacity to read us as well as our most intimate companions. The thought is as amazing as it is unsettling. The laptop that I type on right now, in some respects, understands me just as well as my wife does. But beyond simply understanding us, could our social networks directly change our moods and behaviors?

It’s no exaggeration to say that one particular study in 2014 unleashed a media firestorm. Researchers were testing how emotions spread across online networks. For over half a million Facebook users, the experimenters manipulated the likelihood of seeing positive or negative emotional information in personal news feeds. In one condition, the chance of a positive emotional message appearing was reduced to only 10%, while a negative emotional message was set to 90%. The numbers were reversed in the alternative condition.

When participants in the experiment saw less positive emotional information in their feed, their own status updates also contained less positive information and more negative information. When they saw less negative emotional information, their posts showed the opposite pattern. In other words, this was a clear case of rapid online emotional contagion. Our emotions are vulnerable not only to the events in our own life, but also to the events in other people’s lives when they share those events with us. In the physical world, the consequences of this are somewhat attenuated; we have less frequent and less transparent access into other people’s emotional experiences. In the world of the social internet, these barriers fall down. We have direct and immediate access into what our networks are telling us in real-time.

The study above was so controversial because it felt too invasive. It was always obvious that Facebook was running experiments on us to test their product — perfectly normal practice for tech companies — but there was something especially unsettling about learning that they could curate our news feeds to manipulate our emotions on a mass scale. When we talk to someone in a normal personal conversation, there is no external editor adjusting which information we hear and which information is blocked. We hear everything we are told and can decide how we feel about it. We can even walk away from the person if we don’t want to hear what they are telling us. But when our conversations take place on somebody else’s platform, we suddenly have less control over what we see and hear. Even though we signed up to the terms and conditions of the service, we don’t like to find out that our communications could be manipulated to make us feel a certain way.

When we learn about the results of controversial studies, it’s not unusual for our first thought to be one of panic or anger. But with all technological developments, it’s important to get past the most immediate dramatic reactions and look objectively at the possible advantages moving forward. Most recently in October 2018, researchers analyzed the language that hospital patients posted on Facebook, and used it to predict a depression diagnosis with over 70% accuracy. And that is only through analyzing language before a formal diagnosis by a doctor. Our interactions on social networks present a large clean window through which we reveal the qualities of our mental health. Once you get past the eerie feeling that may settle in your stomach after learning this, it’s easy to see the potential benefits to our health services. We currently struggle to understand, predict, and treat the large and diverse sphere of mental health problems. Every additional reliable insight can help to strengthen our medical frameworks.

The most practical applications of detailed online data relate to psychological profiling. A business that understands our likes and dislikes can serve us more effectively. Remember the study I described showing how Facebook likes predict our personalities? If a company understands our personality through this data, they can tailor their adverts and messaging in a way that appeals to us. The extraverts among us are likely to prefer colorful, outgoing, party-related images in adverts, while the introverts will lean toward quieter and more personal images. So what happens when we get what we want?

A study at the end of 2017 showed that messages tailored to our personalities and digital footprints were more persuasive than non-tailored messages. When adverts matched our profiles, for example, our level of extraversion, they produced 40% more clicks and 50% more purchases than mismatched adverts. Companies that adopt this approach will bother us less often with irrelevant nonsense that we don’t want to see. That’s a good thing. But their ultra-persuasive messages might also give us an extra push toward making a purchase we can’t afford or toward an activity that is not necessarily the healthiest personal choice. Perhaps not such a good thing.

* * *

The evidence all falls into a mixed bag of costs and benefits. With a reliable insight into our unique characters and perceptions of the world, internet products can provide better personalized services, and doctors can improve the timing and accuracy of their diagnoses. But the sheer extent and quality of the personal data tumbling clumsily across the web also presents opportunities for our data to be used against our own interests.

The number one defensive weapon against data misuse is awareness. That’s why experiments are so important. By explicitly teaching us how our data could be used and how it impacts our behavior, advancing knowledge provides a way to counteract possible negative consequences and speak up against anything we don’t like. I doubt we’re any more likely to read terms and conditions before accepting them, but at least we can be more conscious of how to use a service once we are signed up. If you have data that you desperately want to keep private, do not post it. If you feel the disadvantages of a social network outweigh the advantages in your own life, change how you use it. The stronger our awareness, the better we can make those decisions. At least for now, our choices are still our own.


Damn It, It’s on the Tip of My Tongue

Photo by Juliet Furst on Unsplash

There is a very specific feeling of frustration that comes with trying to recall a word that you know is sitting in your brain somewhere but failing to drop into your mouth. We refer to it as having a word on the “tip of your tongue”. We might fall into this trap once a week or so in normal life (although it becomes more frequent with age), and around 50% of the time, we manage to get ourselves out of the mess within a minute. We often experience it when a friend asks a question with a familiar answer that we have not heard or used recently, or when we try to find a specific but uncommon word that describes a feeling we are trying to communicate.

Questions with uncommon single-word answers are particularly likely to elicit a tip of the tongue (TOT) phenomenon. Let’s see if any of these questions, which have previously been used to elicit TOT feelings in lab studies, do a good job of getting you into the dreaded TOT mental state (the answers will be at the end of the article):

  • What do you call a word or sentence that reads the same backward and forward, such as “Madam, I’m Adam”?

  • What is the name of the islands off the coast of Ecuador that Darwin visited to study unique species of animal life?

  • What is the order of lower mammals including kangaroos and opossums which carry their young in an abdominal pouch?

  • What is the word meaning favoritism in hiring based on family relationships?

  • What do you call a person who appeals to people’s prejudices, making false claims and promises in order to gain power?

  • What are people who make maps called?

What exactly is our brain up to when we experience a TOT feeling? Brain scans suggest that two key brain areas are particularly active: the anterior cingulate cortex and the right prefrontal cortex. Our anterior cingulate cortex is typically involved in detecting and monitoring mental conflicts. It is a core part of the inner battle between competing options when we encounter a problem. The right prefrontal cortex is involved in working through our memories as we retrieve them, especially when we are not particularly confident that they are correct. It underpins the sense of familiarity but lack of certainty about the solution to a problem when we desperately peruse the contents of our mind in search of the answer.

These brain functions are reminiscent of what happens when a word is on the tip of our tongue. In our mind, we work through multiple conflicting possibilities with similar sounds or meanings as we try to zero in on the target: “is it despotism? Neapolitan… nativism… NEPOTISM!”. There goes the answer to one of the TOT questions I listed above, if you didn’t already think of the word.

We are more likely to find ourselves in a TOT state with emotional words than neutral words. This suggests that emotions are a significant part of our memory recollection process, as we retrieve different clues to what the word may be. Perhaps the emotions themselves are a definitive signal that the word we seek is sitting in our mind somewhere. We know that we know the word, we just can’t quite bring it to the front of our mind. This relates to what scientists call “metacognition”: thinking about thinking.

The metacognitive account of TOT phenomena explains that when we fail to recall a word, we activate our metacognitive processes to estimate whether the word would come to mind if we just thought hard enough. In addition to retrieving any emotional clues, we hunt down related or half-baked bits of information that somehow connect to our target. For example, clues could include syntactic (sentence structure), semantic (meaning), or phonemic (sound) information. As we accumulate some clues, we may cross a threshold that initiates a TOT feeling and encourages us to keep searching rather than give up. Then, if we continue accumulating more clues, we may be lucky enough to cross another threshold that activates the complete target concept and allows us to spit out the word.
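The two-threshold account above can be pictured as a toy model. Counting discrete “clues” and the specific threshold values are arbitrary simplifications for illustration, not parameters from the research.

```python
# Toy sketch of the two-threshold account of TOT states: clues accumulate,
# a lower threshold triggers the TOT feeling, and a higher threshold
# triggers full recall. Threshold values are arbitrary illustrations.

TOT_THRESHOLD = 2      # enough clues to feel "it's on the tip of my tongue"
RECALL_THRESHOLD = 4   # enough clues to retrieve the whole word

def recall_state(clues):
    """Map a number of accumulated clues to a recall state."""
    if clues >= RECALL_THRESHOLD:
        return "recalled"
    if clues >= TOT_THRESHOLD:
        return "tip-of-the-tongue"
    return "no memory signal"

for n in range(6):
    print(n, recall_state(n))
```

The point of the model is the middle zone: with a few clues we know that we know the word, even though we cannot yet produce it.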

To test whether phonemic information is one of the major clues we use in recalling words, researchers put participants in a TOT state and then tested how well they could eventually recall the correct word. They showed that a list of similar-sounding words helped participants to retrieve the correct target word, rather than interfering with their thought process. TOT states make us extra curious to find the answer to our problem. Reeling off words that sound like they connect to the camouflaged target may be one good way to pull the word out of the bushes.

When we are trying to carefully count objects, perhaps the number of people in a room, our more annoying friends might whisper distracting asynchronous numbers into our ear to force us into furiously starting again. But when it comes to TOT states, a list of similar words can be helpful. As we play detective and attempt to solve a “what is that word?” mystery, any related information that pops into our head becomes a clue. Our brain activity bounces around these clues in trying to resolve the conflict, and with enough information, it eventually settles upon the solution and gives us a feeling of relief that is difficult to rival.

Answers to the list of TOT questions at the top of the article

  • Palindrome

  • Galapagos

  • Marsupials

  • Nepotism

  • Demagogue

  • Cartographers


The Science of Hypocrisy

This article was a front-page feature on Medium

Photo by Rishabh Butola on Unsplash

For many of us, a huge part of daily conversation revolves around gossip. We love to talk about the blunders and missteps of friends, family, and celebrities. On top of that, news organizations and social networks are like outrage amplifiers because that’s what gets the clicks. We are all used to name-calling in the news, especially when it’s directed at politicians or performers. But there’s one particular name that really gets our attention.

If you want to destroy someone, call them a “hypocrite.”

Hypocrisy typically involves criticizing or condemning the immoral acts of others while engaging in those acts ourselves. This can make us look worse than if we engaged in those immoral acts but didn’t criticize them at all, which might sound odd. But would you rather someone engaged in immoral behavior and criticized it, or engaged in immoral behavior and didn’t criticize it? Diving into the psychology of hypocrisy can help make sense of why we feel the way we do about it.

Testing for hypocrisy

An experiment in 2001 aimed to turn people into hypocrites in the lab. Participants were asked to assign a set of tasks to themselves and an unknown second participant. One type of task was exciting and offered rewards, while the other was neutral with no rewards. A coin placed next to the participants came with a written instruction explaining that most people believed flipping the coin would be a fair way to distribute the tasks. Indeed, practically all of the participants agreed that flipping the coin to assign tasks would be the most moral approach.

But when it came down to it, only half of them actually flipped the coin, with practically everybody in the non-coin-flipping half giving themselves the exciting tasks. Among the people who did flip the coin — which was labeled “self” on one side and “other” on the other — 85% to 90% still managed to assign the exciting task to themselves. Clearly, either the coin was a magical sycophant or the participants pretended the coin had landed in their favor when it really hadn’t.

People wanted to look fair by using a coin to make their decision, but behind the scenes, they were just as selfish as the people who did not use the coin at all (most of whom had agreed that using the coin would be fairest, but didn’t do it). It’s a perfect example of moral hypocrisy at work.

The drive behind hypocrisy

Self-interest is the most obvious reason for any of us to act like hypocrites. When people are questioned about why they act in conflict with their own stated moral standards, many will say that the personal costs are enough to outweigh the intention to act morally. Essentially, we all want to act fairly until we are put on the spot and facing our own personal consequences. For example, it’s easy to justify many of our unfulfilled wishes to donate to charities and failed inclinations to help a stranger in need by telling ourselves that we just can’t afford to do it right now.

We all want to act fairly until we are put on the spot and facing our own personal consequences.

Our hypocrisy helps us out, that’s for sure. But we also use it in our relationships. Often, when we rate the fairness or morality of other people’s actions, we judge them more harshly than we judge ourselves for the same actions. In a 2007 report on a modification of the exciting-task/boring-task paradigm described earlier, participants were afterward asked to rate their own and others’ fairness on a scale from 1 (extremely unfair) to 7 (extremely fair). People scored themselves a 4 on average but rated others’ fairness at only a 3.

Interestingly, our judgments of other people tend to be far more favorable if those others fall within our in-group (even if it’s a purely arbitrary in-group characterized by a random trait). We often judge an in-group member’s misbehavior to be just as fair as our own. We only have a greater distaste for other people’s bad behaviors when those people fall outside a social circle that we ourselves have drawn.

But why is hypocrisy so distasteful?

We’ve covered what hypocrisy looks like and what motivates it, but we haven’t tackled why we seem to hate it so much. One strong explanation relates to false signaling. In essence, hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t. When we hypocritically condemn someone’s immoral behavior, we disguise our personal misbehavior with a veil of persuasiveness or manipulation. It’s easier to see through an outright lie than a hypocrite’s condemnation. On top of that, a hypocrite has brought another person into the game. Instead of directly denying their immorality, the hypocrite sneakily implies they are good by attempting to shame someone else. This is a recipe for hatred when caught out.

Hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t.

In a set of recent experiments, researchers at Yale tested this false-signaling theory by giving people stories about different kinds of liars and hypocrites and then studying how people judged the characters within those stories. Four important results came out of these trials:

  1. When a person condemns other people’s behavior and we know nothing else about that person, we typically believe it comes from their moral goodness.

  2. Condemnation of bad behavior is a stronger signal of a person’s moral goodness than claims of personally avoiding bad behavior.

  3. When a person condemns a behavior that they themselves commit (hypocrite), we rate them as significantly worse than a person who says they don’t commit a behavior when they do (liar).

  4. We view hypocrites more favorably if they admit to sometimes engaging in the bad behavior than if they make no such admission.

Overall, it backs up the idea that we have a greater tolerance for liars than we have for hypocrites. Hypocrites are like a special type of liar who puts extra effort into disguising their misbehavior and sending us false signals of moral superiority. Those false signals drive our contempt. If a hypocrite is honest about their hypocrisy — if they get rid of false signals by admitting to what they condemn — our view of them can become significantly more favorable.

Perhaps there’s a lesson we can learn here. If we’re going to lie, that’s bad enough; let’s try not to fool and distract other people by pointing the finger. Sometimes, it’s okay to be transparent about our flaws. Nobody is perfect, but honest self-criticism and the ability to admit when we fail to live up to our own standards may be a good foundation for integrity. Hypocrites are terrible people. And occasionally, I’m one of them.


A Hangry Judge Could Ruin Your Life

Photo by Alex Iby on Unsplash

Some decisions are so consequential that the average person is forbidden from making them. To decide the treatment for another person’s disease, or the penalty for a criminal defendant, you need to go through years of careful training and education. As a doctor, you cannot afford to prescribe the wrong drug for a patient’s symptoms. And as a judge, you cannot afford to imprison an innocent person. We all know that these kinds of errors must happen occasionally. But at the same time, we hold certain professions to a higher standard. For that reason, it can often be a shock to discover those role models just being human.

When we are deprived of sleep for a day or two, all of us are capable of making some poor decisions. Our memory and attention capacities take a hit, so our work during the day is likely to be less effective. A lack of sleep increases the chances of a lapse in concentration, which can be disastrous when driving or using dangerous tools. But even outside immediate physical dangers, there may be major ethical consequences to being tired.

Judges often have to try many cases across their day. Their job may not be physically strenuous, but their high mental burden is bound to be exhausting, as I’ve previously described. If judges are as human as the rest of us, you might expect to see their decision-making change in line with their sleep patterns.

If you wanted to investigate the effects of sleep deprivation on real-world legal decision-making, it probably wouldn’t be sensible to ask a group of judges to go without sleep for a day. However, researchers at the University of Washington and University of Virginia thought of a clever way to test the idea without getting in the way of normal judicial proceedings. They made use of a natural change we all go through during the transition to daylight saving time in the spring: on Sunday, we turn our clocks forward and miss one valuable hour of peaceful sleep.

The researchers analyzed court sentences for US citizens between 1992 and 2003, and examined how long defendants were locked away on the Monday after a clock change (“sleepy Monday”) compared to a typical Monday. Alarmingly, they found that sentences were 5% longer overall on sleepy Mondays.

You might find the results above surprising. Could a single hour of sleep really make such a difference? Some researchers are indeed disputing the extent of these sleep deprivation effects among judges. But to be clear, there does seem to be a difference between 6–7 hours of sleep and 7–8 hours of sleep when it comes to general health. During the Sleep Duration Consensus Conference in 2015, 15 experts in the field of sleep science reviewed all available evidence and voted on exactly how much we should be sleeping each night. The panel reached a consensus suggesting that 7–8 or 8–9 hours of sleep a night was ideal for optimal health, but 6–7 hours crossed into the suboptimal range.

Judges aren’t the only people who need to worry about sleep. The performance of medical professionals also suffers when they are bleary-eyed. As sleep loss increases, surgeons make more mistakes and are slower to perform particular tasks in a surgery simulation. The demanding working conditions and long shifts in many of the world’s healthcare systems may not be good for patients, or for the people trying to save their lives.

We don’t yet fully understand everything that goes on during sleep, or even the reasons why we sleep, but we know that we struggle without it. The effects of sleep loss are similar to the effects of alcohol intoxication. When researchers measured the hand-eye coordination of volunteers following either a few drinks or a night of no sleep, they found that 17 hours of sleep deprivation mimicked the performance problems of a 0.05% blood alcohol concentration. Staying awake for 24 hours was similar to a 0.1% blood alcohol concentration. Keep in mind that the legal alcohol limit for driving in the US is 0.08%. The more sleep we lose, the more drunkenly we behave.
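As a back-of-envelope illustration, we can linearly interpolate between the two data points reported above (17 hours awake ≈ 0.05% BAC, 24 hours awake ≈ 0.1% BAC). The linearity is purely an assumption for the sketch; real impairment curves are unlikely to be straight lines.

```python
# Back-of-envelope interpolation between the two reported data points:
# 17 hours awake ~ 0.05% BAC, 24 hours awake ~ 0.10% BAC.
# Purely illustrative; the real impairment curve is not linear.

def equivalent_bac(hours_awake):
    """Linearly interpolate an 'equivalent' blood alcohol concentration."""
    h1, bac1 = 17, 0.05
    h2, bac2 = 24, 0.10
    return bac1 + (hours_awake - h1) * (bac2 - bac1) / (h2 - h1)

print(round(equivalent_bac(21.2), 3))  # roughly 0.08, the US legal limit
```

On this crude line, a person would reach the 0.08% US legal driving limit after roughly 21 hours without sleep.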

Photo by Hutomo Abrianto on Unsplash. Adapted by yours truly.

Food deprivation may be analogous to sleep deprivation when it comes to the quality of our decision-making. All of us get a little short-tempered and miserable when we feel hungry. Skipping lunch is not a popular proposition in my household. The feeling of frustrated hunger is so widespread that the world has come up with a dedicated word for it: “hangry”, a portmanteau of hungry and angry.

A hangry judge may be the last thing you want to see if you’re ever in court hoping for parole. Researchers analyzed the decisions of judges in Israel depending on when a parole hearing took place during the day. In these cases, judges had two options for their conclusions: “yes, parole is granted” or “no, go back to your cell”. The hearings took place in one of three daily sessions, each session separated by a break where the judges could grab some food and drink.

The researchers found that decisions to grant parole in favor of prisoners declined steadily between the start and end time of a session. And this was not just a minor effect we can easily ignore. Favorable rulings were at around 65% at the start of a session when the judge was feeling happy and refreshed, and declined to almost 0% just before the break. Straight after the refreshment break, the rate abruptly jumped back up to approximately 65%. There may be accompanying variables in addition to hunger that explain this pattern, but nobody in their right mind could have predicted such a dramatic effect before seeing this data.

In these judicial cases, the decision to avoid granting parole is essentially a decision to keep things as they already are. Granting parole would mean changing the status quo and releasing the prisoner. It may be that this burden is too much for a hangry judge to think about. When we are tired and struggling to focus because the only thing on our mind is food, we may be naturally drawn to the least dramatic option; the option least likely to get us in trouble if we make a mistake. When we are refreshed, happy, and comfortable, we can better weigh up the pros and cons of a problem in a bid to make the fairest and most rational decisions we can.

Next time you feel a little frustrated with someone, or you see them looking a little skittish, consider whether hunger or fatigue could have something to do with it. We have an overwhelming tendency to assume that when a stranger is unfriendly to us, it’s because they are a terrible person. But this bias may be irrational, because we are all capable of being a little mean on a bad day.

With some careful attention to the real influences underlying our own behavior and judgment, we can make better decisions when it matters. And with greater generosity in how we read other folks’ motivations, we can develop a more compassionate attitude toward the people around us.


When Your Brain Becomes Your Puppetmaster

Photo by Sagar Dani on Unsplash

Some features of our lives just seem inviolable. Most people would never worry about failing to recognize objects they see every day, or beginning to believe that their arm does not belong to them. And yet, these are exactly the types of things that can go wrong. Body and brain functions are not physical laws like those of thermodynamics or relativity. Gravity may be here to stay, but when it comes to our behaviors and perceptions, we may be justified in being a little more nervous. So what would you do if you lost control of your own arm?

If I ask you to lift your arm, and you agree to participate in the exercise, you’ll probably see your arm start to rise. But you always feel that the arm is doing what you, as a conscious agent, want it to do.

Anarchic hand syndrome is a disturbing disorder in which patients lose the normal experience of voluntary movement. An arm can begin to move and act without the patient wanting it to, as if the limb has a will of its own. In fact, a patient will often begin fighting their own limb if it becomes uncooperative, trying to stop it from grabbing at their tissue while they blow their nose or from touching the person sitting next to them. Have a look at this video demonstrating the plight of an elderly patient in her hospital bed after she suffered a severe stroke.

In the most extreme cases, your own anarchic limb can try to kill you. One patient described her hand tearing away at her bedcovers in the night and grabbing her own neck to strangle her. The only sense she could make of her horrifying condition was to assume that her limb was possessed by an evil spirit.

Patients with anarchic hand syndrome are in the bizarre situation of knowing that their limb is their own but losing all sense of agency over it. Without the normal process of intending to perform an action, it’s hard to say that you turned on the light when you flick the switch. Your arm certainly did it. But not you.

It all links back to our sense of who we believe we are. When we use the words “I” or “me”, we normally refer to our conscious minds and experiences. It would be strange to say “I am beating my heart faster” after going for a run, even though the heart is a part of our own body. We simply say “my heart is beating faster”. But when it comes to lifting our arm, we say “I am lifting my arm”, not “my arm is lifting”. The difference all comes down to our sense of consciousness and intention. Our heart rate is automatically controlled behind the scenes of our awareness, so although we are educated enough to know that we own our heart as much as we own our arm, we don’t talk about heart activity as a product of our control.

Photo by rawpixel on Unsplash. Adapted by yours truly.

So in a sense, when we have anarchic hand syndrome, our arm becomes more like our heart. The arm is on us and it is a part of us. But we are not in control. The labels “anarchic hand syndrome” and “alien hand syndrome” are often used interchangeably, even in academic papers. But some researchers distinguish between them, explaining that patients believe anarchic hands belong to their own body even when they cannot control them, while alien hands are experienced as a completely disowned limb.

Anarchic hand syndrome typically follows extensive damage to motor-related areas towards the front of the brain, including the anterior corpus callosum and supplementary motor area. Alien hand syndrome usually features damage further towards the back of the brain, including the posterior corpus callosum and parietal areas. The symptoms can also arise from degeneration in the circuits that connect areas of our cerebral cortex with the basal ganglia, a system that is critical in allowing us to move smoothly and effortlessly.

During movements of an anarchic hand, the primary motor cortex in the brain — one of the final command centers for sending “move” signals to your limbs — is fully activated. But unlike with voluntary movements, that activity is practically isolated, appearing without the normal co-activation of premotor, prefrontal, and parietal areas that are so important for our experiences of intention and movement awareness.

Utilization behavior refers to actions that appear fully functional, but emerge habitually and automatically in the wrong environments. They can occur after lesions to frontal areas of the brain, similar to anarchic hand syndrome, but patients often do not comment on their actions being out of the ordinary (unlike the woman with the anarchic hand in the video linked above, who repeatedly complained about her arm). When a patient sits in a doctor’s office, and sees a pen and paper sitting on the table, they might pick up the pen and begin to write. When they see a pack of cards, they might deal them as though they are about to start a game with the doctors. None of these actions have anything to do with the doctor’s instructions. Even when the doctor says that the objects should not be touched, the patient returns to their action after a small distraction. The patients simply use the objects because they are there.

Some theories of motor behavior explain that whenever we see a manipulable object in our environment, our brains automatically prepare the relevant action for handling that object. When we see a hammer, we initiate a motor program for a palm-grasp action. When we see a grape, we initiate a program for a smaller precision grip with our fingers. Thankfully, under normal conditions, we have the control systems in place to suppress those action plans when they are contextually irrelevant (although, when I spot a delicious bunch of grapes in a bowl, I often struggle with that suppression). Efficient hammering is great when we are putting together furniture, but not when we are in a doctor’s waiting room. When the control systems in our brain are destroyed, particularly following frontal damage, we may find ourselves acting for the sake of acting.

Photo by Maja Petric on Unsplash. Adapted by yours truly.

When you read enough about brain dysfunctions, it begins to seem as though there is nothing in your life that you can depend on. We should remember that disorders like the ones described above are incredibly rare. And on the plus side, they can inspire us to appreciate some of the smaller facts of our existence. Even on your most boring day, you probably achieved several minor miracles of purposeful action and awareness. The notorious 3-pound organ sitting in our skulls can cause us grief during testing times, but it also makes life worth living the rest of the time.


The Upsides of Being an Autumn Baby


Photo by Lydia Winters on Unsplash

Your month of birth may be more life-altering than you think. It has nothing to do with your star sign. Several variables depend on which month you happened to enter the world: your mother’s food choices during pregnancy, the season of your earliest days of life, and your exact age when starting school are all good examples. Could any of these variables have a noticeable effect on our successes and failures in later life? It’s a tricky question with many possible answers, but studies have been taking on the challenge, and some curious stories are coming out of the data.

Let’s first look at health. When you compare overall lifespan past the age of 50, Austrian and Danish people (Northern Hemisphere) live approximately half a year longer on average if they were born in autumn rather than spring. The same benefit applies in Sweden. In fact, this autumn advantage even holds in Australia (Southern Hemisphere), where the seasons are reversed, so that autumn begins in March/April. Fascinatingly, the lifespans of British immigrants in Australia follow the annual pattern of their European counterparts, rather than adopting the schedule of their new Australian neighbors. So insight number one: season of birth seems to influence longevity.

Photo by Barbara Alçada on Unsplash. Adapted by yours truly.

There is an ongoing debate around exactly why seasonality might affect our health or other life outcomes. The characteristics of mothers who are trying to conceive could be one relevant factor. For example, between 1989 and 2001, teenage mothers were most likely to give birth in January. Given the disadvantages that young mothers may face in bringing up children, you might expect that children born in January are more likely, on average, to experience problems while growing up. However, this explanation alone is not sufficient.

In a 2013 study, researchers looked at mothers who had multiple children across different months. By looking at differences between siblings, and therefore studying the same mothers, general effects of varying maternal characteristics could be ruled out. And yet, the data still showed effects of seasonality on child health. Babies conceived in May (with births due around February) were most likely to arrive prematurely and have low birth weights, possibly due to seasonal changes in maternal nutrition. The months of greatest maternal weight gain overlapped with the conception months that produced the healthiest birth weights (the summer months of June–August). But the birth pattern also showed a striking correspondence with the prevalence of influenza in health centers. When May-conceived babies were born, often prematurely in late January or early February, seasonal influenza was at its peak. The strong correlation between influenza prevalence and gestation length suggests that the seasonality of certain diseases could partly explain the effects of birth month on health.

So on average, winter and spring babies live slightly shorter lives and suffer worse health at birth. And it may not end there. Some researchers have looked at birth patterns among populations with specific disorders. In one study, researchers analyzed data for over 42,000 multiple sclerosis (MS) patients across Europe and Canada. Compared to a non-MS control group, patients were significantly less likely to have been born in November and more likely to have been born in May. The number of MS patients born in May was 9.1% higher than expected, while the number born in November was 8.5% lower than expected. Once again, the mechanisms behind this pattern are a little foggy, but MS is a product of both genetic and environmental factors. Vitamin availability and susceptibility to specific viral infections are among the environmental influences that could vary by season, and they might explain part of the story behind the heightened risk of MS in May births.
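To make those “more than expected” figures concrete, here is a toy calculation. The birth counts below are invented for illustration; the study reported only the percentage deviations:

```python
# Illustration of what "9.1% more births than expected" means.
# The raw counts here are invented; the study reported only percentages.
def excess_percent(observed, expected):
    """Percentage deviation of the observed birth count from the expected count."""
    return (observed - expected) / expected * 100

# If chance alone predicted 3,500 May births among MS patients,
# observing 3,819 would be roughly 9.1% more than expected:
print(round(excess_percent(3819, 3500), 1))  # 9.1
```

The same function with a shortfall (say, 3,203 observed November births against 3,500 expected) returns a negative deviation of about -8.5%, matching the shape of the November result.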

Surely spring babies can’t have all the downsides? When it comes to academic performance, they may be outdone in their misfortunes by summer babies. Many school systems around the world, including in the US and UK, use a cut-off birthday between August and September to decide which academic year you fall into. If you are born in August, you end up among the youngest in your class, while if you’re born a month later in September, you’ll probably be among the oldest. Children’s mental abilities change rapidly in early development, so August babies might be at a disadvantage in keeping up with their older classmates. If this holds true, you would expect to find a consistent difference between the academic outcomes of August babies and September babies of the same year, even if they were born only a day apart (e.g. on 31st August instead of 1st September, when the cut-off puts them into separate academic years).

When you look at the data, the older children in a class (September births) do indeed outperform the younger children, and the difference is at least partly driven by the age at which children take academic tests. It’s not that the younger children develop weaker cognitive skills as they grow up; it’s that their skills are tested earlier in their development, creating an uneven playing field. As you’d expect from this account, the differences between the older and younger children decline as they get older and their developmental trajectories even out. However, the more troublesome difference between children may be in their beliefs about their academic competence. The younger children have a harsher view of their own competence than the older children do, even when asked for their judgments at around the same age. This pessimistic outlook may be more persistent, and could potentially sabotage later outcomes. Easy fixes for these issues are hard to come by. More affluent families tend to address the age imbalance by delaying their young child’s entry into school (a practice known as redshirting), while less affluent families are more likely to have young children who are held back a grade before testing. Both practices may have compensatory effects on test scores, but they may also come with substantial costs further down the line, such as delayed entry into the workforce. Age-based adjustments to test scores could provide another option for balancing academic outcomes early on, but that comes with its own controversies.

Photo by Jacob Postuma on Unsplash. Adapted by yours truly.

The youngest children in a class may also be at greater risk of mental health challenges. One study published in 2000 looked at schools in Northern Ireland, and found that children referred to psychological services were significantly more likely to be born at the end of the school year than at the start. Similarly, in 2015, survey and register data from Denmark suggested that a one-year delay in entry to kindergarten reduced scores on hyperactivity and inattention scales at age 7. Some of the mental health costs of relative youth could be driven by the negative self-perceptions of competence that I highlighted earlier.

Exposure to high temperatures during early life may also impact economic outcomes. Some evidence suggests that exposure to high temperatures (>32°C or >90°F) during prenatal development or during the first year of life is associated with lower earnings in adulthood. Each extra day of heat exposure correlates with a 0.1% reduction in annual earnings at age 30. The good news is that regular access to air-conditioning appears to cancel out this effect entirely, so a community can mitigate the potential downsides of sun and heat exposure if it can afford the relevant resources.
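As a back-of-the-envelope illustration of that effect size: the 0.1%-per-day figure comes from the evidence above, but the baseline salary and day counts below are invented, and the simple additive model is my own sketch, not the study’s method:

```python
# Rough sketch of the reported correlation: each extra day of early-life
# exposure to >32°C (>90°F) is associated with ~0.1% lower annual
# earnings at age 30. The salary and day counts here are invented.
EFFECT_PER_HOT_DAY = 0.001  # 0.1% reduction per hot day

def projected_earnings(baseline_salary, extra_hot_days):
    """Apply the per-day reduction additively, as a simple approximation."""
    return baseline_salary * (1 - EFFECT_PER_HOT_DAY * extra_hot_days)

# A hypothetical $50,000 baseline with 30 extra hot days in early life
# works out to a ~3% reduction, i.e. about $48,500:
print(round(projected_earnings(50_000, 30)))  # 48500
```

Small per-day effects like this add up over an unusually hot first year, which is why the air-conditioning result matters.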

After all of this, there seems to be a clear winner in the lottery of birth timing. Those born in autumn (September–November) have some probabilistic advantages over their spring, summer, and winter peers. The summer babies may have heightened risks from the sun and their pesky school schedules, while the late winter/spring babies may have a greater risk of health problems from viruses and nutritional deficits. It’s important to treat all of the evidence carefully. Most of it is correlational, which means we are still waiting on a definitive insight into the causes of differences between children born in different months. We should also remember that there is enormous variability in life circumstances and outcomes, so even when we find average differences according to month of birth, there will be a multitude of other genetic and environmental influences that make it practically impossible to predict how well someone will do purely from their birthday. A final note of caution is that we will have an incomplete picture of these effects for decades to come.

For now, autumn babies look like the lucky ones. But I would bet that the costs of fall are lurking in the darkness, waiting to be discovered by keen-eyed researchers.


Mindfulness Lessons from Science and Children


Photo by Vanessa Serpas on Unsplash

There are some human characteristics that we describe as childlike. In growing up, we gladly leave behind many of those qualities. Adults shouldn’t throw tantrums in supermarkets and cry about parents’ tyrannical desires to prevent accidental deaths in the playground. However, some childish adjectives are earnestly used as compliments for adults. When we describe an adult as childlike, we usually refer to some innocent or charming quality about them. That’s a nice sentiment, but some features of children’s mindsets may even be profoundly healthy for adults to cultivate. So in what ways do we need to be more like a child again?

I don’t yet have my own children, but in interacting with my many young cousins, there is always one particular trait that stands out. That is their ability to live and experience life in the moment. Children seem to be able to have fun with just about anything. The other day, I saw a child screaming with laughter at the noise they were able to create by hitting a can with a stick. They did not worry about the latest disaster in the news or the state of the economy. They simply made the most of what they found in front of them, and appreciated every second as they experienced it.

Of course, none of this is to say that children are Zen masters. Far from it. If you’ve ever been in an airplane with a child around, you’ll know all too well that children do not hesitate to scream for what they urgently crave. Nevertheless, they seem to be able to engage wholly with an activity in a natural way that adults no longer find so easy.

This psychological quality of children is reminiscent of a mindset widely discussed in the sciences and the media: it’s called mindfulness. Jon Kabat-Zinn was a pioneer in bringing mindfulness into the sciences, and he defines it as “the awareness that emerges through paying attention on purpose, in the present moment, and nonjudgmentally to the unfolding of experience moment by moment”. You can think of it as a mindset in which your attention is entirely locked on what is happening right now. Not what happened moments ago. Not what might happen in the future. What is happening now.

Photo by Robert Collins on Unsplash. Adapted by yours truly.

Why should we care about the science?

It took a while for mindfulness to be taken seriously in the scientific world, because its principles were originally developed in a religious context, especially within the Buddhist tradition. It wouldn’t take you long to find a scientist who frowns upon religion, largely because many religious premises and claims cannot be supported by objective evidence. But the frowning often goes too far and becomes a phobic barrier to ideas that can actually be tested scientifically. Mindfulness occasionally still hits this barrier, but it now features prominently in neuroscience and psychology studies. If you search for ‘mindfulness’ on Google Scholar, you’ll be reading papers for the rest of your life.

You will still find people who reject the scientific study of mindfulness because of its religious baggage, and they will often lock horns with people who argue that science has no place in discussing mindfulness. I’m in the camp that believes both sides are being too absolutist. Without scientific evidence, mindfulness will never be clearly distinguishable from the snake oils that do more harm than good, and it certainly won’t become a valuable part of our mainstream health services. And without an open and unbiased mind to take mindfulness seriously in the first place, you’ll never fairly weigh the evidence to understand its true value. If we’re being practical and trying to avoid personal biases, we should enjoy any benefits of mindfulness in our personal lives, while acknowledging the value of emerging evidence through scientific scrutiny.

We don’t necessarily need evidence to believe that mindfulness is good for us personally, but we do need it to truly understand the extent of its benefits across different people, problems, and interventions. We cannot confidently and honestly recommend mindfulness as a useful intervention for others, unless we have independent research to back us. We have to rely on something more than the beliefs, feelings, and words that come from our own mind or the minds of those who agree with us. Clearly defined methods, testable hypotheses, and replicable experiments provide us with the material to convince a sensible doubter. Intercessory prayer has been around for thousands of years, with many religious people attesting to its value in improving the health of their loved ones. But with no scientific evidence to back up that claim, good doctors will never prescribe prayer as an intervention to support your loved one who may be suffering from a medical malady. Mindfulness, on the other hand, has growing scientific support replicating across independent research labs, and may eventually fall into that bucket of widely prescribed interventions. Science and evidence-based medicine have been crucial in our progress as a species. We cannot afford to dump them now.

The mechanics of meditation

Many of our anxieties are driven by a fear of something that may or may not happen in the future. Any pragmatist will tell you that it’s pointless to worry about something if you cannot change or affect it, but mindfulness provides a concrete approach for shifting your attention to something more helpful: your actual experience in the present moment. There are many types of meditation, but all of them are activities that cultivate this mindset in some form.

Photo by Laurenz Kleinheider on Unsplash. Adapted by yours truly.

I will highlight two types of meditation that I find particularly helpful. Rather than diving into their religious or historical context, I will adopt the labels and methods that have been used in the sciences. The first I will call focused meditation, and the second I will call open monitoring.

If you have any basic experience of meditation, you are probably familiar with focused meditation. The instruction is to focus on a specific object in the world or on your body. A convenient anchor is often the breath. You aim to maintain your attention entirely on your inhalations and exhalations in the present moment. This is far more challenging than it sounds, as any early practitioner will tell you. You don’t just consider the concept of breathing while mentally singing Ed Sheeran’s new song. You pay full and exclusive attention to every moment of the breath as it occurs. For example, if you are focusing on your chest, you want to experience it as it rests at its lowest point, stay focused on it as it slowly rises, maintain your attention as it reaches its peak, and then do exactly the same as it sinks back to its resting position. Whenever you realize your attention has deviated from your breath — which might be as frequently as every few seconds — you simply return your mind calmly to the breath, ideally without getting angry at yourself for having been distracted.

As you practice this, you will become better able to focus your attention on your breath for a few more seconds each time. Areas of the brain involved in sustained attention become more active as you start practicing meditation and improve your focus. However, at the highest levels of expertise (around 44,000 hours of practice), this activity is reduced again. In early practice, you become better able to recruit your attentional resources in the brain, but with expertise, focusing becomes effortless and you no longer need to rely on those resources. Distractions have a harder time dragging your attention away from where you direct it.

Open monitoring is a little different from focused meditation. Instead of choosing a specific object to direct your attention towards, the task is to focus your attention fully on any thought, feeling, or experience that arises in the present moment. There is no need to judge, anticipate, control, or react to anything that occurs. So an active mind that frequently jumps between objects is less of a concern during open monitoring than in focused meditation, as long as you remain aware of what is happening. It’s still easy to become so distracted that you no longer pay close attention to your thoughts as they happen. There is an important distinction between being aware of thought and being lost in thought. I’m aware when I notice that my mind has floated from thinking about my breath to thinking about the work presentation I’m anxious about. I’m lost when my mind has drifted to worrying about the work presentation, panicking, wondering what I will need to do to prepare, and forgetting to notice each emerging experience.

The nice thing about open monitoring is that you can learn to apply it throughout your everyday life activities. With focused meditation, you usually need to find the time and space to quietly sit and focus on your breath. After a few days of this, most people will inevitably find some excuse to stop the practice. But with open monitoring, you can simply aim to remain aware and present with anything you happen to be doing: in the shower, you can focus your attention on the water as it hits your back; while walking in the park, you can focus on the colors of the trees; when eating, you can focus on the texture of the food as it rolls around your mouth. These are just examples, but the more you manage to bring your mind into the present as you go about your life, and the less you get lost in your head while your body does everything else on autopilot, the better you will appreciate your life as you live it. We too often let the day drift by and ask ourselves at the end of it “where did my Sunday go?”.

Evidence to support the benefits of mindfulness

Research is ongoing, and we still have much to learn. The experiments in this area are a mixed bag of higher and lower quality methods. A challenge in designing a good experiment is to compare mindfulness interventions with an appropriate control condition. If we want to understand how mindfulness impacts health, we need to know what it’s better than. Some experiments use no controls, which is clearly not ideal; if people improve in an aspect of their wellbeing following a mindfulness intervention, how do we know it’s not just because of the social interaction involved in their classes, or the effort of trying something rather than nothing? Other studies compare mindfulness to standard treatments for the targeted symptoms, or to different attention tasks, which provides more reliable insight into the specific health benefits of mindfulness itself. Review papers and meta-analyses that helpfully combine the results of multiple studies are also growing in number.

There is a way to go, but consistent effects are emerging. Mindfulness may not help everyone but there is now a large volume of evidence to suggest that it can have important, far-reaching benefits for many aspects of mental health. In general, extended mindfulness practice seems to adapt brain structure and function related to emotion, attention, and self-awareness. Experiments so far have highlighted benefits in areas including hypochondria, decision-making, chronic depression, and even chronic physical pain.

Some of the most convincing benefits are in emotions and relationships. Mindfulness techniques can be a great tool for shifting the mind away from ruminating on possible dramas and disasters. Most of my recent fears came from prognosticating outcomes that were either not that bad in the end, or did not happen at all. My time clearly would have been better spent focusing on my actual lived experiences. When we successfully apply mindfulness to our lives, many of us are happier, and we become more pleasant people to be around. We shouldn’t meditate with any specific goal in mind, because it’s counterproductive. A goal orientation is a distraction in itself. But it’s certainly helpful and motivating to know that whenever we do meditate, there are good reasons to believe it is worthwhile.

The child in us is waiting to emerge from a long slumber. There are many benefits to centering our minds on our current experiences and shifting away from the usual obsessions about objectives, plans, and goals. It’s about time we focused on the only thing we know for certain: that we are breathing, thinking, and feeling right now. We don’t need to keep a chart or track our data. Mindfulness is less of a life hack and more of a way to live. It provides an escape from our monotonous and robotic approach to our everyday activities. Next time you eat, shower, or fold the laundry, know that you can appreciate the moment rather than simply get through it. Life is always going to be too short. So you might as well live it.


Our Bizarre Love for Story Spoilers


The people who make the best company tell the best stories. At parties, you never want to end up standing next to the guy who won’t stop talking about the washing machine he recently bought. Good stories teach us, entertain us, and help us to build connections with people. It’s probably obvious that a compelling story will depend on the interests of the person you are talking to. If you’re alone at a bar beside a stranger, last night’s football game might be a decent bet for a conversation starter. But there are also other critical features, independent of individual interests, that make us engaging storytellers.

Our brains dynamically track the content of a story as we hear it. It wouldn’t be accurate to talk about a ‘story perception’ part of the brain. Instead, as a story evolves, so does our brain activity. When we read about characters and their ambitions, the goal-directed areas of our brain are particularly active (e.g. the superior temporal cortex and prefrontal cortex). When we read about characters interacting with physical objects, motor-related areas of our brain are more active (e.g. precentral sulcus). In a sense, we are living a story as we read it. This is what makes the best literature so engaging and enjoyable: it takes us by the hand and leads us everywhere we need to go without looking back.

Self-control can be exhausting. If you’ve ever spent an hour trying to hold a good posture instead of your typical slouch, you’ve probably experienced this yourself. Keeping your back straight is not a particularly energy-intensive activity like running or swimming. Much of the pain comes from keeping your goal in mind and maintaining the levels of effort and motivation you need. I’ve previously described the exhaustion of learning to drive a car. Driving when you’re an expert is a lazy activity but driving when you’re learning can be utterly draining. Some evidence suggests that we have a central self-control energy resource, and when an activity drains that resource, our self-control capacity on other tasks is also diminished (this is a theory known as ego-depletion, although it is currently hotly debated).

The question for storytellers is whether they can exhaust other people with their stories. Listening to someone’s story could reasonably leave us feeling tired, but it seems to depend on how we process their story. Imagine reading a story about a waiter who arrives at work feeling hungry, and spends their whole shift resisting the temptation to eat food from the restaurant because it is against company policy. If we actively take the waiter’s perspective as we read the story, it exhausts our self-control capacity. But if we passively read without putting ourselves in the character’s shoes, it does not seem to have this cost. In fact, it may inspire additional self-control capacity according to some evidence. So engaging with a story can be an exhausting experience, but it depends on how much we empathize with the characters as they go through their hardships. I can relate to this evidence after recently spending a full day watching a controversial senate hearing about the alleged wrongdoing of a US Supreme Court nominee. Putting myself in the shoes of the questioners and the witnesses, as the stories and perspectives were laid out, did not leave me feeling particularly healthy at the end of the day.

This question is going to sound crazy, but is it possible that spoilers for stories are a good thing? Well, according to one study, they may be. Researchers gave participants stories that they had not read before, and asked them to rate how much they enjoyed them. For some of the readings, participants were shown a spoiler paragraph before reading the full story. Unbelievably, for several types of story (mysteries, ironic twists, and evocative literary stories), people consistently reported enjoying the spoiled stories more than the unspoiled ones. You can come up with your own reasons for why this might happen, but some possibilities are that spoilers allow us to organize and anticipate stories better, while offering the pleasurable tension of knowing what may be about to hit the characters. This makes some sense to me, and explains why I often enjoy a movie more after a second viewing. However, for the time being, please do not spoil any of the upcoming titles on my movie list. The evidence might show that I’m likely to enjoy a spoiled story, but it hasn’t said anything about long-term appreciation or memorability. Yet…

Photo by Zhifei Zhou on Unsplash. Adapted by yours truly.

Continuing along this track of surprising facts, here’s one more obvious-sounding question. Would you prefer to hear a story about an experience you’ve had before, or an experience you’ve never had? You’d be forgiven for thinking you would rather hear about something completely new, because that is what most people think. However, we may all be wrong. In one experiment, when a speaker described their experiences related to a video, they expected that listeners would prefer the story if they had not already seen that video. The listeners expected the same thing. But the data suggest that listeners enjoy a story about a familiar video more than a story about an unseen one. When we tell people stories that are completely new to them, we tend to assume they have more knowledge about the topic than they do. This is a cognitive bias called the “curse of knowledge”. It leaves us in the awkward position of occasionally rambling while someone stares blankly into our eyes, too polite to interrupt and ask “what are you talking about?”.

So there’s a great incentive to learn about another person’s interests and background before deciding which of your amazing stories to share. For both story spoilers and conversational topics, some sense of familiarity allows our brains to keep better pace with a changing story. We don’t need to exhaust ourselves with patching holes in our knowledge and playing catch up in real time as the story unfolds. Next time you’re out chatting with friends, tell them stories about what they already know.


How Music Plays Your Brain


Listening to music can be a euphoric experience. It’s unclear exactly why it should feel so good. Is there some evolutionary advantage to enjoying music? Is it a byproduct of some other important function? Is it just one big accident in our evolutionary history? The debate still rages on these questions, but there is one important fact that we can be confident about: music has some deep-rooted appeal for humans.

There is something special about music even for the youngest listeners. Infants in their first year of life already have a meaningful sense of musical timing and pitch. When listening to samples of Western music, Mafa populations in Cameroon recognize the same basic emotions of happiness, sadness, and fear that Westerners do. Both populations also enjoy a similar sense of musical harmony when they listen to each other’s music. When asked to express different emotions by creating musical or physical movement patterns in a computer program, participants in the USA and a tribal village in Cambodia make very similar choices. There is a fundamental core to musical experience and expression that all humans seem to share.

We can look beyond humans to examine how deep our musical roots really stretch. In addition to the cross-cultural appeal of music, there may be a cross-species appeal. There are ongoing discussions about exactly how much our perception of music overlaps with that of non-human primates. Although there are commonalities in our ability to detect rhythms, it is still unclear whether monkeys can synchronize their movements with music in the way that humans can. Some non-human primates, like Kuni the bonobo, may spontaneously synchronize with audible rhythms when they play with a drum. But we need to wait for more evidence to fully understand whether non-human primates enjoy dancing as much as we do.

Non-human primates, just like humans, do prefer consonant music over atonal or dissonant music. However, researchers have often struggled to find any consistent preference for music over silence when non-human primates can choose between them. In 2014, one research group decided to test this question in more detail by trying a range of musical styles. They divided a room into four zones, which progressively increased in distance from a speaker playing West African Akan, North Indian raga, or Japanese taiko instrumental music. As the music played, the researchers measured where a group of chimpanzees spent most of their time. If they spent most of their time in zone 1, closest to the speaker, it would indicate that the chimpanzees enjoyed the music. If they preferred to stay in zone 4, where they could barely hear the music, that would suggest they preferred silence.

When the Japanese music played, the chimpanzees showed no preferences between zones. They seemed to be just as happy close to the speaker as they were far away from it. But when the African or Indian music played, they spent the majority of their time in zone 1, as close to the music as possible. In fact, they spent significantly more time in zone 1 than they did when no music was playing. So the tonal melodies and ambiguous pulses in West African akan and North Indian raga music seem to set a nice mood for chimpanzees.

Photo by Rob Schreckhise on Unsplash. Adapted by yours truly.

What exactly is our brain doing when we listen to music? Instead of processing individual sounds, the brain processes patterns of sound as it implicitly develops expectancies about what is coming next. The auditory parts of our brain analyze several core features of music including pitch and duration, and interact with frontal brain areas as we use our working memory to pull together the information into higher level abstract representations. Many of us have experienced the intense chills that come with listening to deeply moving parts of our favorite music. As this happens, the reward centers of our brain, especially subcortical areas including the ventral striatum that sit deep within the brain, adjust their activity: the more chills we feel, the more activity they show. In fact, when we experience those moments of musical euphoria, the striatum releases dopamine, one of the brain’s reward-related neurotransmitters.

Experts who play musical instruments probably experience music differently to non-musicians. A recent study looked for this difference in the brain. The researchers put beatboxers and guitarists in a brain scanner and measured the levels of activity in their brain as they listened to music involving beatboxing or guitars. When looking at parts of the brain involved in translating sensory information into motor actions (sensorimotor areas), they found an interesting pattern. Guitarists showed increased sensorimotor activity when listening to guitar music, and beatboxers showed increased sensorimotor activity when listening to beatboxing music. So their prior musical experiences in physically playing instruments changed how their brains reacted when listening to those instruments.

The researchers drilled down a little further to examine the finer details of the sensorimotor activity for each group. The primary sensorimotor areas of our brains are organized somewhat topographically: different sections of their structure represent different parts of the body (these body maps are illustrated with “cortical homunculi”). This means you can compare “hand” activity to “mouth” activity in those brain areas. If musicians’ sensorimotor activity during listening represents the actions involved in playing the music, then you would expect to see hand areas activated for the guitarists and mouth areas activated for the beatboxers. After all, those are the body parts they use when making their music. This is precisely what the researchers found. Guitarists activated their hand areas when listening to guitar music, while beatboxers activated their mouth areas for beatboxing music. Their brains automatically recruited relevant sensorimotor regions in processing the musical audio, almost as though they were actually playing the music on some level. In other words, the musicians’ passive listening became a little more active when listening to their own type of music.

Photo by Matheus Ferrero on Unsplash. Adapted by yours truly.

Music is one of those real-life miracles that practically all of us can connect to. It brings us together at festivals, bars, and other social events, and can give us a dramatic emotional lift when we most need it. The question of why it has these magical effects continues to elude us, but this mystery makes our musical experiences all the more impressive. Whether you’re a metalhead or an opera enthusiast, don’t forget to fully appreciate and enjoy your next musical fix. If you’re lucky, it might inspire a creative spark or moment of ecstasy. But at the very least, you’ll be tapping into an experience that you share with many of your fellow primates.


The Tech That Reads Your Mind and Sees Your Dreams

“turned on red Psychic Reader neon sign” by Scott Rodgerson on Unsplash

The ability to read someone’s mind has traditionally been the stuff of fiction. Our thoughts and experiences are private to us and we can choose when we share them with others. But with developments in brain scanning technology, mind-reading is becoming a hard science rather than a false promise. You may no longer need to be superhuman to see the darkest thoughts and desires of the person opposite you. You can instead convince them to lie in your brain scanner.

A basic test for a mind-reading machine is to tell you what visual image you are holding in your head. If the machine has to decide which of two images you are thinking of, does it perform significantly better than guessing at random?

In some cases, this is a fairly easy task with a brain scanner. For example, if you are trying to guess whether someone is thinking of playing tennis or walking around their house, you find different areas of the brain that are most active: the supplementary motor area for the motor imagery involved in playing tennis, and the parahippocampal place area for the spatial imagery involved in walking around your house. This distinction has been used to communicate with hospital patients who lie motionless in what appears to be a coma. If patients can use a thought to answer “yes” or “no” to questions (e.g. thinking of tennis for yes and house for no), then the doctor knows that the patient is showing signs of consciousness.

When you have a larger number of visual images to choose from or greater similarity between images, the task of decoding what someone is thinking becomes far more difficult. The overall levels of activity across the brain might be very similar for seeing a leopard versus a duck, so you need to be more sophisticated in how you analyze brain imaging data. One option is to drill down into detailed patterns of activity within a single area.

You can start scientific mind-reading by decomposing a list of images into their different visual features (e.g. object position, orientation, light contrast). Then, you can take a set of practice images and train a decoder machine to link the features for those images with patterns of activity in the visual areas of a person’s brain as they see those features. Each feature drives brain activity in a different direction, so every unique combination of features corresponds to a unique pattern of activity overall.

After the machine is trained, you show the person brand new images they’ve never seen before and measure the patterns of activity in their visual brain areas. By using the associations that the decoder picked up during training, you can infer the visual features for the new image they see from their brain activity patterns. The machine can then look through a database of images, and estimate which image the person is seeing based on how closely the inferred features from brain activity match the actual decomposed features of an image. The closest match becomes the machine’s best guess.
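
To make the logic concrete, here is a deliberately toy sketch of that train-then-match pipeline. Everything in it is invented for illustration (the feature names, the linear “brain”, the two-image database); real decoders work with thousands of voxels and far richer feature sets, with proper regression in place of the simple ratio estimate.

```python
import random

random.seed(0)

# Hidden "gains" linking each visual feature (position, orientation,
# contrast) to activity at one brain location. Unknown to the decoder.
TRUE_WEIGHTS = [1.8, 0.6, 1.2]

def simulate_activity(features, weights, noise=0.05):
    """Noisy linear readout: each feature drives activity by its weight."""
    return [w * f + random.uniform(-noise, noise)
            for w, f in zip(weights, features)]

# 1) Training: show known feature vectors, record activity, and estimate
#    each weight as the average activity/feature ratio across trials.
training_images = [[1.0, 0.2, 0.5], [0.3, 0.9, 0.7], [0.8, 0.5, 0.1]]
estimated = [0.0, 0.0, 0.0]
for feats in training_images:
    activity = simulate_activity(feats, TRUE_WEIGHTS)
    for i in range(3):
        estimated[i] += (activity[i] / feats[i]) / len(training_images)

# 2) Decoding: invert the learned mapping for a brand-new image, then pick
#    the database entry whose features best match the inferred ones.
database = {"leopard": [0.9, 0.1, 0.8], "duck": [0.2, 0.8, 0.3]}
new_activity = simulate_activity(database["leopard"], TRUE_WEIGHTS)
inferred = [a / w for a, w in zip(new_activity, estimated)]

def sq_distance(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

guess = min(database, key=lambda name: sq_distance(inferred, database[name]))
print(guess)  # the decoder recovers "leopard" from activity alone
```

The same two-stage structure scales up: more features, more brain locations, and a much bigger image database, but always training on seen images and matching inferred features against candidates.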

The amazing thing is that because there is so much overlap in how our brain responds to actual visual images and the visuals we imagine in our head, you can do the same thing for what people are thinking about. Instead of showing them a visual image in the brain scanner, you just ask them to visualize a particular object in their mind. By analyzing brain activity in the same way, the machine can correctly infer which object they are imagining, or even which piece of famous artwork they have in mind.

One of the most exciting applications for this kind of mind-reading may be in decoding the content of our dreams while we sleep. Dreams are not only notoriously difficult to understand, they are often so vague and disconnected from reality that we barely remember them when we wake up. However, as with the imagined versus seen images I mentioned above, there is strong overlap in our visual brain patterns corresponding to seen images and dreamt images.

Researchers put participants in a brain scanner, waited until they fell asleep, and then woke them up during the most dream-intensive phase of sleep. By asking them to describe any visual images they saw while asleep, the researchers built a record of the images that people dreamed and their brain activity during those moments. By training a decoder machine on brain activity when people physically saw different images while awake, they could successfully read and predict what people visualized in their sleep from the same patterns of brain activity. As this kind of technology develops and improves, we should end up with more accurate and more comprehensive dream-reading machines.

Memory is another important function that depends on our ability to generate mental images. Recalling from long-term memory is similar to the type of visualizing I described in the experiments above. If I ask you to imagine a leopard or your tennis swing, you are recalling elements from your long-term memory of past experiences with those images or actions. But you also have working memory, which refers to your capacity to hold a number of objects in mind for seconds or minutes while doing a task. You may be trying to hold a phone number in mind while you dial it, or perhaps a set of pictures during a memory game.

Visual areas of the brain generally do not sustain their overall level of activity when we hold visual images in memory. But as I explained before, you may need to drill down to find patterns of brain activity that code for specific images. This is exactly what one group of researchers tested when they asked participants to remember the orientation of a quickly flashed visual object for 11 seconds. After that delay period, participants had to decide whether a new comparison object was the same as the object in their memory. The decoder could analyze activity in visual areas of the brain during that delay period, and guess which of two orientations people were holding in their memory with over 80% accuracy. So even though brain activity in those areas returns to its resting level after seeing a visual object, it continues to exhibit a pattern of activity matching that object as you hold it in memory. Those same patterns also reliably predict the image if you generate it yourself in your mind instead of holding it in working memory over a delay.
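
The delay-period result can be sketched the same way with synthetic data. This toy nearest-centroid classifier (invented voxel tunings, invented noise levels) shows how a multivoxel pattern can stay decodable even when its average activity sits near baseline:

```python
import random

random.seed(1)
N_VOXELS = 20

# Made-up voxel tuning for each remembered orientation; in a real study
# these patterns would be measured, not assumed.
TUNING = {ori: [random.uniform(-1, 1) for _ in range(N_VOXELS)]
          for ori in ("25deg", "115deg")}

def delay_trial(orientation, noise=0.6):
    """One trial's delay-period pattern: tuning plus per-voxel noise.
    The mean across voxels is near zero, mimicking a return to baseline."""
    return [t + random.gauss(0, noise) for t in TUNING[orientation]]

def centroid(trials):
    return [sum(tr[i] for tr in trials) / len(trials)
            for i in range(N_VOXELS)]

# "Train" the decoder: average the patterns from labeled trials.
centroids = {ori: centroid([delay_trial(ori) for _ in range(20)])
             for ori in TUNING}

def classify(pattern):
    def dist(c):
        return sum((p - ci) ** 2 for p, ci in zip(pattern, c))
    return min(centroids, key=lambda ori: dist(centroids[ori]))

# Test on fresh trials and measure decoding accuracy.
tests = [(ori, delay_trial(ori)) for ori in TUNING for _ in range(25)]
accuracy = sum(classify(p) == ori for ori, p in tests) / len(tests)
print(f"decoding accuracy: {accuracy:.0%}")
```

The overall signal level carries nothing here; all the information lives in which voxels are slightly up and which are slightly down, which is exactly why the drill-down into patterns matters.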

The activity in our brain is naturally responsible for our mental experiences. Decoding those experiences with brain scanners is a radical and enlightening new project. We have already hit successes in decoding the contents of our mental imagery, dreams, and working memory. It’s thrilling to consider where we go from here. Although it’s easy to worry about future misuse of this technology (e.g. with invasions of privacy), the scientific journey itself could realistically improve how we understand people’s conscious experiences and potentially how we treat mental health disorders. In essence, it could teach us about the most important facts in our lives: where our thoughts come from, what they do to us, and how we can change them for the better.


One Way Children Are Smarter Than You

Photo by JESHOOTS.COM on Unsplash

You can pretty much consider it a law that your cognitive ability improves as you develop from a child into an adult. The usual exception to this is your ability to learn, especially with languages or a musical instrument, where children excel at picking up new talents. But with knowledge, memory, attention, and pretty much all other measurable mental capacities, children are the developers rather than the masters. Well, until one study came along to cast doubt on one aspect of our attentional ability.

Change detection is a popular concept studied in experimental psychology. It refers to our ability to notice differences between images. If you’ve ever tried a “spot the difference” puzzle, you know how tricky it can be to spot small differences between otherwise identical images that sit next to each other. But when you flash two images independently to people, and ask them to report whether there is a difference, it becomes difficult to notice even large changes in the images.

Our general weakness in spotting these kinds of differences is referred to as change blindness. The remedy is focused attention. When a change is significant enough, and there is no interruption in the continuity of images, our attention is automatically drawn to the change. But with interruptions in continuity, or when the change is small, we need to work much harder to focus our attention in the right area. This is why filmmakers often get away with continuity errors between shots or frames. The transitions between shots disrupt the image continuity enough that we miss the errors. Here is a fantastic example of a card trick employing change blindness, so you can see for yourself.

So what does this have to do with children? Researchers decided to directly compare change detection abilities between children (4–5 years old) and adults (~20 years old). Both groups saw cartoon characters that could differ in one of two characteristics (body shape or hand shape). During the main part of the experiment, participants were only asked to look for characters who matched a particular body shape, so their attention was focused on a single characteristic. After the main part of the experiment, the adults and the children saw a range of characters, and were asked whether or not they had previously seen those exact characters. Adults performed just as well as kids when rejecting characters who differed in the characteristic that they focused on during the main experiment (i.e. body shape). But shockingly, the kids were significantly better than adults at identifying characters who differed in the characteristic they did not focus on (i.e. hand shape).

Photo by Picsea on Unsplash. Adapted by yours truly.

When it comes to detecting visual changes outside our direct focus, children may outperform their parents. It seems that adults have a highly selective focus while children are more diffuse or scattered in their attention. Attention-related circuits in the brain are developing throughout childhood, and it may be that this immaturity in attentional control results in a serendipitous advantage for processing information that is not immediately relevant. So for us adults, our mature ability to zero in on particular visual traits means we are successful in recognizing relevant changes. But when asked about details that are irrelevant to that previous focus, the kids come out on top.


The Bird Who Cried Snake

Photo by Ben White on Unsplash

If I say “strawberry ice cream”, the odds are that strawberry ice cream will be the first thing that pops into your mind. It’ll probably start with a visual mental image, but you might go on to recall its sweet fruity taste and frosty sensation too. If I instead told you not to think of strawberry ice cream, you’d probably end up in the exact same boat. Mental imagery is often an involuntary phenomenon, and it’s hugely influenced by what people around us say. Sometimes, frustratingly so. The reason we cover our ears when people start talking about repulsive or terrifying scenes is because we know what our mischievous brains will do at that moment or later that night when we are trying to sleep in our dark bedroom.

Language interacts with our mind, stimulating mental imagery and influencing how we handle the world. Verbal communication is primarily a device for sharing important information, so our brains accordingly use language to guide where we should focus our attention. You can see this effect at work when people are looking for hidden objects. If you say the word “square”, people are faster and more accurate in detecting square-like visual images, but slower in detecting circle-like images. And the reverse is true if you say the word “circle”. Even though these random words don’t necessarily predict which object will appear, people can’t help but use the language in their attention and decision-making. Language and other communicative signals (whether meaningful or misleading) create expectations or “sensory templates” about how an upcoming event should look or sound. Other people’s language has privileged access to our brain.

Speaking of privileged access, you may have noticed throughout your life that certain words are particularly good at grabbing your attention. The first and most obvious example of this is known as the cocktail party effect. When we stand in a noisy bar with friends, we’re usually good at narrowing our attention to focus exclusively on the words that a friend is saying, while filtering out all the nonsense coming out of other people’s mouths. When we avoid paying active attention to the many conversations around us, they essentially sound like one big monotonous buzz. However, if our name pops up within that buzz, many of us immediately and automatically prick up our ears. This means that despite everything sounding like meaningless noise in the background, our brains continue processing something about the unattended information, without us being aware of it. And when that information is suddenly relevant to us (nothing is more relevant than our own name), then our conscious attention shifts from the conversation with our friend, to “is this stranger talking about me?”.

Another type of word with a priority pass into our consciousness is the taboo type. I don’t want to use any of these words directly in this article so I’ll exchange a commonly used example for the rhyming euphemism “cluck”. Even when you are not listening to a conversation, it’s hard to stay tuned out if it suddenly features a cluck. If we are with close friends who regularly swear, it can become a more normal word, but of course then it’s no longer taboo. If we are in more polite company or at a formal event, then we have no choice but to immediately notice and recoil when someone exclaims “cluck this” or talks about their clucking boss at work. It’s another great example of the involuntary effects that language can have on us, not only at a mental imagery level but also an emotional and behavioral level.

Photo by rawpixel on Unsplash. Adapted by yours truly.

Taboo words connect with our emotional brain systems in a way that more neutral words do not, and they elicit automatic stress-related physiological reactions. It may be that we have two distinct language systems in the brain: one closely related to emotional vocalizations which is more likely to handle swearing and cursing, and another for our more advanced information-filled communicative abilities. Neurological disorders like aphasia are characterized by an inability to speak or understand language, and yet patients can often curse and swear with less difficulty. This may be because their brain damage is confined to the informational language system rather than the emotional system.

We have more in common with other animals when it comes to our emotional vocalizations than our more informational communication. We know that language can have direct and automatic impacts on our own behavior and mental imagery, so perhaps bird calls have similar effects on a bird’s mind. One major reason for birds to call is to raise an alarm. For the Japanese tit, one of the biggest concerns is the Japanese rat snake, which can move up a tree to capture its prey. So could there be a bird call that makes other birds in the area specifically watch out for snakes, or are there only general urgency signals?

One researcher decided to test this for himself by hanging a speaker in a tree, broadcasting alarm calls, and examining how birds would react to a stick that moved like a predatory snake. With general alarm calls, a bird would ignore the stick. But with snake-specific calls, it would fly within a meter of the stick to survey exactly what was going on. In fact, it would only be interested in this stick if it moved in a snake-like way up a tree. It showed no interest in the same stick if it was swinging from the tree in a way that didn’t resemble a snake. Upon spotting a real snake, a bird would normally hover over it and try to look big in an attempt to deter it from progressing further up the tree. Of course, they didn’t need to do that with the stick when their closer inspection revealed it was harmless. But their approach behavior showed that snake-specific alarm calls automatically stimulated the mental image of a snake (or at least something analogous since we don’t know the bird’s direct experience), and shifted their visual attention towards objects in the environment that most resembled that particular dangerous template.

Photo by SK Yeong on Unsplash. Adapted by yours truly.

Communicative signals directly and specifically affect our next moves, and the benefits of communication clearly apply across many animal species. There are several practical uses to human language including emotional expressions, information exchange, negotiations, and warning signals. In normal circumstances, other people’s signals often predict something meaningful about what may be about to happen, and what we need to look out for in our environment. It is therefore a useful adaptation to process language quickly and automatically when it may be relevant. Although this can open the door to frustrating deceptions and false alarms, we have gained a lot more from communicating freely with our fellow humans than we have lost to their occasional bad intentions.


In the Future, We May Not Need to Face Our Fears

This article was a front-page feature on Medium.

Photo by denis harsch on Unsplash

Many people live with fear. Chronic fear: phobias, anxieties, and PTSD. Conditions like these can be debilitating and difficult to treat. Many treatment options cause painful physical and psychological side-effects. New and promising treatments, however, use advances in brain scans and neurofeedback to revolutionize the way science helps us overcome our fears.

The problem with drug-based treatments will always be their side effects and wide targets. There’s no drug to specifically cure a fear of snakes or fear of flying, for example; medications only dampen a generalized level of anxiety (sometimes, they even knock people out completely).

One of the best available treatments for a specific anxiety? Exposure therapy. In a controlled environment, participants are trained to relax and face their fears. If they have a phobia of snakes, for instance, they might be asked to imagine a snake in the first session, then look at a picture of a snake in the second, then watch a video in the third, then see a real snake in the fourth. If they make it through the therapy, they might successfully reduce their level of fear in the real world. The problem, as you might expect, is that dropout rates during this type of therapy are high. Repeated exposure to your deepest fears is a painful process.

Parallel to exposure therapy runs another stream of scientific research looking at a method known as neurofeedback. By measuring brain activity and feeding it back to people in real time, this technique trains them to shift their brain patterns in specific directions. If you were undergoing neurofeedback, the procedure might go something like this:

You sit and look at a circular disc on a computer screen while researchers measure your brain activity. You see that the size of the circular disc changes, and you know that its size is somehow linked to a target pattern of brain activity in your head. When that pattern is more active, the disc grows. When it’s less active, it shrinks. Over time, you begin to learn how to consistently make the disc bigger. But, strangely enough, you don’t always know exactly how you’re managing to control your brain activity in order to accomplish this. The learning process is implicit and outside your awareness.
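
As a loose analogy for that implicit learning loop, here is a toy simulation in which all numbers and strategy names are invented: a simulated participant samples mental strategies at random, the disc feedback rewards whichever strategies happen to raise a hidden target activity, and the winning strategy emerges without ever being identified explicitly.

```python
import random

random.seed(2)

# Hidden link between mental strategy and the target brain pattern's
# activity. Neither the "participant" nor we inspect this directly.
ACTIVITY = {"imagery": 0.9, "counting": 0.4, "relaxing": 0.2}

weights = {s: 1.0 for s in ACTIVITY}  # tendency to try each strategy

def choose_strategy():
    """Sample a strategy in proportion to its current weight."""
    r = random.uniform(0, sum(weights.values()))
    for s, w in weights.items():
        r -= w
        if r <= 0:
            return s
    return s

disc_size = 1.0
for _ in range(300):
    s = choose_strategy()
    activity = ACTIVITY[s] + random.gauss(0, 0.1)    # noisy measurement
    disc_size = min(5.0, max(0.1, disc_size + 0.2 * (activity - 0.5)))
    weights[s] *= 1.0 + 0.1 * (activity - 0.5)       # implicit reinforcement

best = max(weights, key=weights.get)
print(best, round(disc_size, 1))
```

The point mirrors the logic of neurofeedback: the feedback signal alone is enough to steer behavior toward a target brain state, and the learner never needs to know what is being reinforced.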

Neurofeedback shows some potential as a tool for treating neurological or psychiatric disorders. The logic is that if doctors can identify a particular signature of activity in the brain that characterizes a patient’s symptoms, they might be able to use neurofeedback training to reduce that activity. If the activity is shown to have a meaningful role in causing their symptoms, then the hope is those symptoms will also decrease.

Emerging evidence supports these benefits for conditions including ADHD, and for stroke recovery. Of course, there are still questions around the practicality and efficacy of this treatment. But the evidence, so far, is promising.

Building on the potential of this research, a new study published in March 2018 by labs at UCLA and in Japan brings together the worlds of exposure therapy and neurofeedback. The study’s ambition was to expose participants not to their fears themselves (like in exposure therapy), but to the unconscious activity representing those fears in their brains (neurofeedback). By rewarding participants when their brains showed that unconscious activity, they tried to create a positive rather than negative emotional association with the feared object.

Critically, this method avoids the need to directly present the fear to participants, minimizing the chance they’ll drop out of therapy (a common problem with exposure techniques).

Surrogate volunteers with no phobias were shown commonly feared objects (e.g., spiders and snakes) and their brain activity was scanned. Researchers used these patterns to infer what fearful activity would look like in the brains of people with phobias toward those objects. Then they used neurofeedback training to reward participants whenever their brain activity looked like it represented the unseen feared object. Amazingly, neither researchers nor participants knew the fear that was being targeted: The computer randomly selected the target object for each participant’s neurofeedback, automatically using an object they didn’t fear as a control.

At the end of the experiment, participants’ physiological fear levels (skin conductance responses and brain activity in their amygdala) were reduced when looking at images of the object they feared. Fear responses to the control object, which was not targeted in the neurofeedback training, remained the same as before the experiment.

It’s amazing to consider what this kind of neurofeedback could do for people in the future. Imagine the benefits for those with chronic anxieties, phobias, or conditions such as PTSD. Could their symptoms one day be treated without ever exposing them to the terrors they suffer from?

When phobias are overwhelming enough that they take over our lives, we may be able to defeat them without ever directly facing them.


How Air Pollution Is Destroying Your Brain

There are few things in my everyday life that frustrate me more than cycling behind an old rickety van that is blowing black fumes into my face. I tried those face masks that filter pollutants as you breathe, but on a hot day, the sweat and discomfort is unbearable, and I must say, the mask makes me look rather like an unpopular supervillain.

I usually try not to complain about the world, and look for the good and bad in everything, and I think most of my previous articles have been relatively optimistic about life in general. The frequent headlines about new technologies either killing or curing our brains are misleadingly one-sided. The truth is usually somewhere in the middle. But when it comes to pollution, I don’t think I am exaggerating too much. It’s hard to see how the particles of poison entering my respiratory system could be good for my health in any way. As with cigarettes, sometimes the science is convincing enough that we can wholeheartedly say “smoking is bad for you”. So in the spirit of the anti-smoking lobbies that rightly worked so hard in the past, here is some of the recent evidence on how air pollution may be destroying your brain.

Photo by Katerina Radvanska on Unsplash. Adapted by yours truly.

The first important question to ask is whether it’s even possible for external pollutants to enter the brain. The blood-brain barrier is good at keeping foreign particles in the blood from interfering with the brain, but some particles can find other entry points. A research project published in 2016 looked for nanoparticles of magnetite (an iron ore) in brain tissue samples from people who died in fatal accidents while they lived in Manchester, UK, and Mexico City. Magnetite particles are produced by combustion and abundantly found in the air we breathe in major cities. Biologically produced magnetite is also naturally present in the brain, but the researchers could distinguish this natural magnetite from airborne magnetite by comparing their structures. Rounded nanoparticles like those found in air pollution outnumbered natural magnetite in the brain samples. The tiny size of the particles (< 200 nanometers) meant that they could enter the brain via the olfactory nerve, the nerve connecting the smell receptors in our nose to the brain. High amounts of brain magnetite have been linked to neurodegenerative diseases like Alzheimer’s, and the researchers did indeed find high concentrations in the brain samples from older people who had a history of symptoms. But scarier than that, some of the highest magnetite concentrations came from younger people living in Mexico City, especially in those who were exposed to the most polluted areas.

The research above shows that external pollutants can contaminate the brain. But the next question is whether there is good evidence of a link between pollution and brain abnormalities or cognitive deficits. I’ll describe the evidence of harm to different groups of people, in order of their age.

Let’s start with the unborn. Fetuses in the womb can be exposed to pollutants through the placenta, so researchers tested whether the mother’s air quality in the third trimester would predict the child’s brain development over its first 7–9 years of life. They found that greater prenatal exposure to pollutants was linked to smaller white matter volumes across the left hemisphere of the child’s brain in later life (at an average age of 8 years old). Sadly, this reduced white matter volume correlated with slower mental processing, more behavioral problems, and stronger ADHD symptoms in childhood.

Next up is air pollution at school. Researchers compared the cognitive development of children in schools exposed to high versus low levels of traffic-related pollution. They assessed memory and attention performance every three months for one year, for almost 3000 kids across 39 schools in Barcelona, Spain. As you might expect after the results of the last study, the kids in the more polluted schools showed less improvement in their working memory and attention performance over the course of the year. This deficit in cognitive performance may be linked to impaired connectivity across the brain in school children who are exposed to more environmental pollutants.

Photo by Peter Hershey on Unsplash. Adapted by yours truly.

Tests on older women have also suggested a connection between brain structure and estimated exposure to air pollution based on where they have lived. Consistent with the placental pollutant effects I described earlier, these tests showed that women previously exposed to more airborne particulates had smaller white matter volumes, even after controlling for other demographics or relevant health issues. Given that air pollution is consistently a risk factor for dementia, it may be that these white matter deficiencies are related to the onset of neurodegenerative diseases. But we need to wait for more research to clarify the exact links between pollution, white matter damage, and cognitive declines in older age.

Most of these studies are about the long-term effects of toxic particulate matter, but there may also be immediate threats from sudden changes in pollution levels. A large systematic review of 6.2 million events across 28 countries looked at whether short-term increases in air pollutants resulted in increased hospital admissions for strokes or fatal health problems. Admissions did indeed increase with rises in airborne particulate concentrations and gases including carbon monoxide, sulphur dioxide, and nitrogen dioxide. So next time there’s a severe smog problem outside, it might be best to stay indoors, especially if you have existing health problems.

Photo by Alex Gindin on Unsplash. Adapted by yours truly.

I’ll highlight one final study because it was published so recently and because the methods are both simple and smart. A group of researchers took the data from an existing national survey across China that included tests of verbal and mathematical ability. They then took the exact dates and locations of those surveys and matched them to daily air pollution data across China, while removing the effects of other county-level variables (e.g. GDP per capita and population density) and individual-level variables (e.g. household income and education). They found that verbal and mathematical performance was lower in areas with high pollution, and the effects were strongest when you averaged the pollution levels across longer time frames. Men appeared to be more vulnerable than women to the negative effects of air pollution on verbal test performance, and less educated men were the most vulnerable of all.

The evidence on how air pollution impacts our brain and cognitive performance is discomforting to say the least. There is still much to learn, so it’s worth keeping up to date with the science as it evolves. But in the meantime, I’ll be avoiding the busiest streets during bicycle rides to minimize the risk of the silent dangers lurking in our air.


Stop Assuming They Don’t Like You

Photo by Noah Buscher on Unsplash

When you think about what makes you anxious in life, social events are likely to feature prominently. Public speaking, meeting new people, and competing with others make many of us wince with an awkward pain. We have anxieties about what can go wrong for good reasons: loneliness is a killer, and weak social networks can prevent us from making progress.

Fear of embarrassment may be one of the primary emotional drivers that make us nervous about joining or speaking to a new group of people. We don’t want to be that person standing alone at the party and we don’t want our reputations destroyed by a hasty comment that came out wrong. Generally speaking, two things need to come together to cause embarrassment. The first is a failure according to our personal standards (e.g. falling over or saying something stupid). The second is a social setting in which we know others may be judging us. When you look at the brain while someone is embarrassed, you find activity in exactly the areas of the brain that are most relevant for these functions: emotional arousal areas like the anterior insula that are linked to the experience of personal failure, and ‘mentalizing’ areas like the medial prefrontal cortex that are involved in understanding what other people may be thinking about us.

When we end up in the unfortunate position of social reject, the emotional pain we experience is not so different to the physical pain from a burn. Both are deeply uncomfortable, highly aversive, and both make me want to jump into a large bucket of ice water to numb the pain. In fact, there is striking similarity between the two types of pain in how the brain treats them. They both activate parts of the brain important for processing physical sensations on the body (like the posterior insula and somatosensory cortex). When you really zoom in to look at those areas in more detail, you may be able to detect differences in the precise patterns of activation within them, depending on the type of pain. After all, the two experiences are not entirely identical and we are still very capable of distinguishing them. But there is no getting around it: when we feel socially rejected, it hurts like hell. Whether a romantic partner has called for a hiatus, or we’ve embarrassed ourselves in front of an audience, the brain knows exactly which systems to recruit in order to make it as excruciating as it needs to be.

Photo by rawpixel on Unsplash. Adapted by yours truly.

If some of my previous accounts of personal and general brain-hating haven’t already made it clear, we are vulnerable to errors in our perceptions and thinking patterns. So maybe there are times when we misread how others feel about us. In a refreshingly simple recent experiment, researchers put two strangers into a room, gave them a few ice-breakers, and asked them to chat. They then pulled the couple apart and surveyed them individually on how they felt about the other person, and how they believed the other person felt about them. People consistently underestimated how much the other person liked them, and the researchers called this ‘the liking gap’. This gap in how we think other people feel about us, and how they actually feel, can persist for months after we meet someone, and it holds true whether the conversations we had were 2 minutes or 45 minutes long.

It’s almost as though we are utterly determined to believe that other people have a problem with us, even in the absence of any evidence to support that belief. The effect may be driven by an excessively critical review of our own performance after an interaction with a new person. We judge our own conversational quality more negatively than we judge other people’s. We dwell too long on small details that might have been mistakes and might have annoyed or offended our conversational partner, and don’t pay enough attention to how they reacted perfectly happily or normally to everything we said. Perhaps this self-critical attitude drives us to improve and become better company in the long run. Or perhaps it needlessly upsets and embarrasses us, and makes us hesitant to meet new people in the future. That’s for you to decide.

Photo by Kelly Sikkema on Unsplash. Adapted by yours truly.

We are gregarious creatures, so friends provide some of the biggest excitements and joys that life has to offer. It’s a good idea to carefully monitor our behavior and make sure we present our best selves when we meet new people. But much of the time, we have a habit of reading the situation poorly. In typical conversations, the pressure to be liked can overwhelm our rationality and distort our judgments about what other people are thinking. When we next conclude that a conversation was a failure, it might be worth a second thought. And even when we really do suffer a social rejection, there may be silver linings we can cling to, like the opportunity to use our emotional reactions and sense of independence as inspiration to be creative (and there are other ways to maximize your creativity too). Aren’t all the best love songs about breakups?


The Day I Embarrassed Myself

Photo by Louis Hansel on Unsplash

The biggest turning points in our lives come from moments when we need to make a decision. We make decisions ranging from the most trivial to the most important every single day. We pick and choose the friends who are right for us, the directions in which to travel, the careers to develop, and the cities to build. Anatomically speaking, we humans are all basically the same. It is our decisions that set us apart.

Decisions are not always easy, and the modern world often asks a lot from our poor ape brains. Sometimes it seems like we can’t win. We can have both too little and too much choice. Our conscious minds can overthink a problem while our unconscious minds miss too much. And we are expected to make reasonable sense of what is around us now, while also predicting the future consequences of the possible decisions available to us.

Predicting the future is no easy feat for non-clairvoyants (i.e. everyone). Many people and events can depend on what we decide to do, and I’m not just talking about the decisions of war generals. Deciding whether or not to buy a coffee right now can impact what we hear and say in a later work meeting and might affect our reputations and careers. Deciding whether to take this crowded train or the next quiet one to university can determine whether we make it to an exam on time or fail. And the most recent pressing decision on my mind while I lived in the UK: deciding whether or not to attend a wedding can impact how particular people feel about us.

As I hinted at when I referred to our poor ape brains, our reasoning and decision-making are not optimally set up for modern life in civilized society. There are plenty of processes and mechanisms that made sense in our evolutionary history, but are now misaligned with the ideals and demands of the modern world. We call them cognitive biases and our brain is littered with them. I will talk through just a few of these in the context of my decision-making on the day I had to attend a wedding, because it’s easy to see how often I make these mistakes. It might seem like a dire situation for human psychology, but far from it. When I notice a cognitive bias appear in my head and remain aware of it, it is less likely to force me into making poor decisions that turn my molehills into mountains.

Keep in mind this important note as I tell you the story: I hate weddings. I absolutely hate weddings. And I argued with my wife every day for one month about why I had to go to this specific wedding, and why I couldn’t just stay at home (just like I argued for my own wedding). This was how my morning went on that day. I will italicize my cognitive mistakes to make them extra embarrassing. I hope you can relate to at least a couple of them. Here goes…

Photo by rawpixel on Unsplash. Adapted by yours truly.

7am — My alarm rings and I slowly open my eyes. It dawns upon me. It is the day of that wedding, and I need to leave the house within the next hour for a long journey from London to Codsall. Codsall for goodness sake. Codsall! What a daft name for a place.

7.15am — I’m still lying in bed, and putting off the day ahead by reading the news on my phone. I get a message from my wife who is flying in from Washington DC and meeting me at the wedding. Her flight was cancelled during the night while I was failing to sleep but avoiding looking at my phone, and she had to get on a new one. She will now be at the wedding 6 hours after I arrive in Codsall. I will need to spend 6 hours in a dingy little depressing village cafe, waiting for my wife, so that I don’t need to spend any time alone at the wedding. This is an abomination.

  • Cognitive bias 1 — Overgeneralizing learned rules: I have been to many small English villages in my time, and I would estimate something like 40% of them had cafes that I did not enjoy sitting or working in. In cities, this value is close to 0%. So I have detected what I believe is a reasonable pattern in the world in terms of my preferences, but I am over-applying the rule to places I have never visited before. Yes, I have encountered far more beastly cafes in small villages than I have in cities. But it is nowhere near 100% of those villages. So I should be giving completely new places a good chance of impressing me with their cafe selection. Some evidence suggests overgeneralization may be relevant in panic disorder and generalized anxiety disorder, where patients’ perceptions of danger spread too far.

7.31am — I’ve made it into the shower.

8am — After sulkily putting on my clothes and throwing my suit in a bag, I am prepared to leave. I look out of the window and it is pissing it down out there (translation for non-British people: raining heavily). I do not have an umbrella.

8.15am — I’ve walked through the rain and I’m now at my local tube (subway/metro) station in east London. The place is crawling with humans scurrying to get to work in central London. I miss the first train I need because too many people get on ahead of me, so I wait my turn at the front of the queue for the next one. It arrives but I’m pushed out of the way by a small woman with curly hair who was behind me. She takes up the last empty space on the train as the doors slam shut, and she looks at me with contempt as it begins moving. This woman is an arrogant, selfish devil-worshipper. I am now a misanthrope for the foreseeable future.

  • Cognitive bias 2 — Attributional biases: We can always catch ourselves making mistakes in how we attribute characteristics to the events in our lives. One example is the “curse of knowledge”, in which both adults and children incorrectly assume that other people know what they themselves know. This makes communication difficult and can lead to bad decisions. We also make errors in attributing responsibility, especially by ascribing permanence to what is temporary. When we are happy or sad, we often feel it is a defining part of us rather than a fleeting emotional experience that will come and go. Patients with depression have a worse problem: they believe that any negative emotions will stay with them forever and are their own fault, while positive emotions are an accident that will disappear before long. We also tend to assume that other people’s bad behavior is attributable to basic character flaws rather than the possibility that they are just having a bad day. Is the pushy woman I met on the train really a devil-worshipper? Or could she have got some bad news about a relative that morning?

8.22am — I am sitting in a chair on the train platform in despair, with my head in my hands. Everyone has their own thing going on, entirely ignoring each other. A small black Labrador trots up to my leg on its owner’s leash. It stares into my eyes. This dog knows. It is confused about our culture and behavior and is questioning why we insist on standing in these crowded sweaty places rather than running around outside in the park chasing sticks.

  • Cognitive bias 3 — Anthropomorphization: We enjoy imbuing non-humans with human-like characteristics because we feel it helps us to understand them better. This is not always completely unrealistic. After all, a dog probably has some emotional subjective experiences going on, even if we cannot exactly pinpoint their quality relative to humans. The problems with anthropomorphization are a little clearer when we start talking about inanimate objects as though they were alive. We see eyes in the headlights of cars, a Mother figure within nature, and we form emotional attachments to rocks and bits of metal (e.g. jewelry). This might also relate to our visions of Gods and conscious intentions within natural phenomena throughout history. When people anthropomorphize slot machines, they even gamble more.

Photo by Daniel Cheung on Unsplash

8.45am — I finally make it into a train, but on the way I become certain I will miss my train from London Euston station to Codsall. This annoys me and I seriously consider going back home, lying on my wonderful sofa, and ignoring all messages and calls from wedding people. But I have already paid money for the return train tickets. Surely I can’t waste that money by not taking the trip now? If I miss the train, I will just need to pay for another ticket at the station. I have come this far, lost this much money, and now I need to see the trip through to the end no matter what.

  • Cognitive bias 4 — Sunk cost fallacy: This is often expressed as ‘throwing good money after bad’. When we have spent money on something, we experience an overwhelming commitment to it, and fight against any urge to drop out of the commitment early. When we buy a ticket to a play or an opera and take our seat, we are more likely to sit through a full 3–4 hours of torment rather than leave if we dislike what we are seeing. This is true even though the most rational decision is to leave if we predict continued disappointment. Remaining committed to a decision after we start it is perhaps one of the biggest drains on human time and happiness. And it’s not only monetary investments that bind us this way: commitments of effort and time affect us similarly. Once we start, we are hesitant to stop, even if we foresee approaching disaster from continued commitment to our initial decision. We need to be able to stop when the time is right, ignoring past investments that have no real impact on what we do now. When resources have already been committed to a particular course of action, those sunk costs should not brainwash us into continuing with plans that turn out to be ineffective. Quitters are not always weak losers; they are often the strongest and most resilient people in the developed world. The sunk cost fallacy may itself be driven by overgeneralization (see Cognitive bias 1 above) of a “Don’t waste” rule.

9.22am — I ran at speeds that Einstein would be impressed with to catch the departing train at Euston station with about 15 seconds to spare. I drop myself down dramatically in an empty seat, and the air pushed out from under me creates a nice calming breeze. I think about the obstacles I have overcome to get here over the last couple of hours. So many separate bad things have happened on the way to this wedding. Positive and negative events seem to happen fairly randomly, so surely I am due a pleasant surprise when I actually get to the wedding. Nobody has ever had such an unlucky roll.

  • Cognitive bias 5 — The gambler’s fallacy: Have you ever been to a casino? Stand by the roulette table long enough and you’ll see something peculiar but intuitive for most people. When the roulette wheel has landed on red or black repeatedly in a row, customers start betting big on the opposite color for the next spin. They believe that in a random sequence, you are unlikely to see a long series of the same event. People intuitively feel that red, red, red, red, red, is less likely to happen than red, black, red, red, black, even though the probability of getting red or black is always 50%. This is not just something that fools us standard everyday specimens of humanity. In my academic research, I analyzed some data that suggested elite soccer goalkeepers may show similar biases when deciding which way to dive in penalty shoot-outs.
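The fallacy is easy to check with a quick simulation. This is a minimal sketch (the trial count and function name are my own choices, not from any study): estimate how often five fair red/black spins match a streaky pattern versus a mixed-looking one.

```python
import random

random.seed(42)

def pattern_probability(pattern, trials=100_000):
    """Estimate the chance that five fair red/black spins
    exactly match a given pattern."""
    hits = 0
    for _ in range(trials):
        spins = [random.choice("RB") for _ in range(len(pattern))]
        hits += (spins == list(pattern))
    return hits / trials

p_streak = pattern_probability("RRRRR")
p_mixed = pattern_probability("RBRRB")

# Both estimates land near 0.5**5 = 0.03125: the streak is
# exactly as likely as the mixed-looking sequence.
print(p_streak, p_mixed)
```

Each spin is independent, so every specific five-spin sequence has the same probability; only our pattern-hungry intuition says otherwise.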

12.00pm — I’m sitting in a cafe in Codsall, and against all the irrational odds I set myself, it’s one of the most peaceful and wonderful cafes I’ve ever sat in. I got more writing done than I normally would, had amazing cheap coffee and cake (relative to London where I lived at the time), and talked to random strangers about their lives. My wife ended up arriving around 6 pm and we made it to the wedding just before the curtains closed. I even enjoyed what was left of the wedding. Codsall is great…

Photo by Gades Photography on Unsplash. Adapted by yours truly.

I walked you through my mental mishaps on that wedding day because they are so representative of our general everyday reasoning (feel free to describe your own examples in the comments section to help me look less stupid). But the same biases could just as easily apply in more serious situations, where the basic impulses in our characters guide us towards disastrous beliefs and actions. All of the biases listed above can change our lives by affecting the decisions we make. And there are certainly many more than the ones I mention. By being more aware of them, we can minimize the chance that they blindside us where it hurts.


How You’re the Easiest Person to Fool

What makes humans so special? Our ability to reason may be what most distinguishes us from other animals. We can consider our motivations, plan ahead, and look back on what we’ve done to understand how it went. But we are far from omniscient: our brains are designed to be efficient rather than perfect (if perfect is even possible). Because of this, we frequently make errors in judgement, and even convince ourselves of having made decisions that others made for us.

We regularly take shortcuts in our reasoning. Intuition plays a dominant role in our decision-making and this is not always a bad thing. It means we make quick decisions when it matters. When dogs see a frisbee flying in their direction, they don’t need to model the exact trajectory and flight speed of the object as it moves through three-dimensional space. Instead, they simply move to keep a steady optical angle between themselves and the object. When the frisbee is approaching in a straight line, they are likely to be in a good position to catch it. And don’t assume that dogs use this system because they’re too stupid for something more advanced. Professional baseball and cricket players use this same shortcut, and perform incredibly well with it.
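One way to formalize that shortcut is constant-bearing pursuit: the catcher cancels the target’s motion perpendicular to the line of sight and spends the rest of its speed closing in, so the optical angle never changes and no trajectory modeling is needed. This is only a sketch under assumed conditions; the starting positions, speeds, and function names are illustrative, not taken from the research on dogs or fielders.

```python
import math

def constant_bearing_velocity(agent, target, target_v, speed):
    """Pick an agent velocity that keeps the bearing to the target
    constant: match the target's motion perpendicular to the line
    of sight, and use the remaining speed to close the distance."""
    dx, dy = target[0] - agent[0], target[1] - agent[1]
    dist = math.hypot(dx, dy)
    ux, uy = dx / dist, dy / dist   # line-of-sight direction
    px, py = -uy, ux                # perpendicular direction
    v_perp = target_v[0] * px + target_v[1] * py
    v_close = math.sqrt(max(speed**2 - v_perp**2, 0.0))
    return (ux * v_close + px * v_perp, uy * v_close + py * v_perp)

# A thrown object drifting left; the catcher starts off its path.
agent, target = [0.0, 0.0], [10.0, 5.0]
target_v, speed, dt = (-1.0, 0.0), 1.5, 0.1
for _ in range(500):
    vx, vy = constant_bearing_velocity(agent, target, target_v, speed)
    agent[0] += vx * dt
    agent[1] += vy * dt
    target[0] += target_v[0] * dt
    target[1] += target_v[1] * dt
    if math.hypot(target[0] - agent[0], target[1] - agent[1]) < 0.2:
        break  # intercepted without modeling the full trajectory
```

The agent never predicts where the target will land; holding the angle steady is enough to guarantee interception whenever the agent is fast enough.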

Photo by James Ting on Unsplash. Adapted by yours truly.

When you need to make quick decisions, these shortcuts based on the simplest sources of information may be the best option. In fact, the advantages of this intuitive decision-making may not be exclusive to decisions under time pressure. Even some business decisions that don’t necessarily depend on speed may be most accurate when using intuitive assumptions rather than complex statistical models. When assessing whether someone is likely to be a repeat customer, a shortcut based on when the customer last bought a product performs better than a complicated model that accounts for overall customer dropout rates, purchasing distributions, and customer lifetime distributions. Next time you sit for hours thinking about which of five backpacks to buy — analyzing their colors, sizes, styles, brands, etc — maybe you should stop yourself and just click to buy the cheapest so that you can go out and do something more worthwhile.
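That recency shortcut is sometimes called the hiatus heuristic, and it fits in a few lines. The 9-month cutoff, the dates, and the function name below are illustrative assumptions, not values from the customer research:

```python
from datetime import date

def is_active_customer(last_purchase: date, today: date,
                       hiatus_months: int = 9) -> bool:
    """Hiatus heuristic: treat a customer as active if they bought
    anything within the last `hiatus_months` months, and inactive
    otherwise. The cutoff is a round-number assumption."""
    months_since = ((today.year - last_purchase.year) * 12
                    + (today.month - last_purchase.month))
    return months_since <= hiatus_months

today = date(2018, 10, 1)
print(is_active_customer(date(2018, 6, 15), today))  # recent buyer: True
print(is_active_customer(date(2017, 3, 2), today))   # long hiatus: False
```

A single cue (time since last purchase) and a single threshold, yet in the studies described above this kind of rule predicted repeat business as well as or better than far more elaborate statistical models.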

Shortcuts or “heuristics” are fantastic tools, but in the modern world, they often lead us astray. We can easily be caught in the midst of irrational and biased reasoning. For example, when we hear a description of a person and are then asked to judge the probable truth of different statements about them, we may well be inclined to talk nonsense. Imagine the following scenario. Linda is “31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations”. Knowing that about Linda, how would you rank these statements in order of likelihood:

  1. Linda is active in the feminist movement

  2. Linda is a bank teller

  3. Linda is a bank teller and is active in the feminist movement

People generally believe that 3) above is more likely than 2). This is because of the ‘representativeness heuristic’. Activity in the feminist movement feels as though it would be representative of Linda given her description, so we place a lot of weight on that statement. So much weight in fact that we believe it’s more likely Linda is a bank teller and a feminist, than just a bank teller with no additional statement about her. If you can’t quite intuit why this is a fallacy (the conjunction fallacy to be specific), just replace the feminist part of 3) with any other neutral descriptor. It’s easier to see why you’d rather put your money on Linda being a bank teller than both a bank teller and someone who enjoys flying kites. One of the statements is demanding only one thing to be true while the other is demanding both that thing and an additional qualifier to be true. When uncertain, the statement with an additional demand must be less likely by any logical standard.
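You can sanity-check the logic with a simulation over a hypothetical population. The base rates below are made-up assumptions; the point is that whatever they are, the conjunction can never out-count the single event:

```python
import random

random.seed(0)

# Hypothetical population: each trait is assigned independently,
# with made-up base rates (5% bank tellers, 60% feminists).
N = 100_000
is_teller = [random.random() < 0.05 for _ in range(N)]
is_feminist = [random.random() < 0.60 for _ in range(N)]

n_teller = sum(is_teller)
n_both = sum(t and f for t, f in zip(is_teller, is_feminist))

# Everyone who is both a teller and a feminist is, trivially,
# a teller, so n_both can never exceed n_teller.
print(n_both, "<=", n_teller)
```

However you tweak the base rates, the conjunction count is a subset of the single-event count, which is exactly why ranking statement 3 above statement 2 is a logical error.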

Now and then, our brain just plain makes things up. If I ask you to choose between A or B, and you choose A, what do you think you’d say if I asked “So why did you choose B?”. In a conversation where you didn’t know this was coming, you may not even bat an eyelid. You’d go on to explain exactly why you chose B and why B is so great.

Photo by Vladislav Babienko on Unsplash. Adapted by yours truly.

In one amusing experiment, men and women were shown two photos of female faces, and asked to choose which they found more attractive. The researcher then gave that photo to the participant for a closer look, and asked them to explain why they chose that photo. Unbeknownst to the participants, on some trials the sneaky researcher switched the photos with a subtle trick that meant the participants were now explaining why they chose the photo that they did not choose. But rather than acting surprised and complaining that they chose the other photo, most participants began to reel off the reasons why this face — the face they did not choose — was the most attractive. Sometimes, they would cite reasons that could only apply to their originally chosen photo, in full view of a face entirely inconsistent with their reasoning (e.g. “because she was smiling”, when the woman in the new photo looked conspicuously miserable). Other times, they would cite reasons that only applied to the new photo, and could not have applied to their original choice (e.g. “I like earrings!”, when the original photo displayed no jewelry whatsoever). And occasionally they would simply invent excuses about how the woman they were seeing at that moment was “very hot” or had “more personality” about her, in spite of the fact that they originally judged her to be lower in those traits.

Whatever the specifics of the confabulations, people are certainly willing to deceive themselves. This level of self-deception and post-hoc rationalization does not only apply to judging physical attractiveness. You can find the same patterns of choice blindness when it comes to political opinions and even moral attitudes. Perhaps we’re not as inflexible and resistant to change as it commonly appears. All it takes is for someone to fool us into thinking that we’ve already changed our minds.

When we reason ourselves into a void, it’s easy to become distrustful of the squishy organs sitting inside our heads. But that would be expecting too much from your brain. It’s great at its job precisely because it takes so many efficient shortcuts. When a lion is spotted nearby, the person who immediately thinks “run” is likely to make a lot more progress in life than the person who sits analyzing the lion’s distance, speed, size, and probable hunger levels. Our intuitions are certainly capable of embarrassing us, and it’s always worth quadruple-checking our thinking process. The modern demands of societies, economies, and the internet, are unlike anything that humans ever experienced in their evolutionary history. So the occasional snafu or two should be expected when we try to make sense of it all. Let’s just hope we can limit the damage with a little extra self-awareness.