
The Science of Hypocrisy


This article was a front-page feature on Medium.

Photo by Rishabh Butola on Unsplash

For many of us, a huge part of daily conversation revolves around gossip. We love to talk about the blunders and missteps of friends, family, and celebrities. On top of that, news organizations and social networks are like outrage amplifiers because that’s what gets the clicks. We are all used to name-calling in the news, especially when it’s directed at politicians or performers. But there’s one particular name that really gets our attention.

If you want to destroy someone, call them a “hypocrite.”

Hypocrisy typically involves criticizing or condemning the immoral acts of others while engaging in those acts ourselves. Oddly, this can make us look worse than if we had committed the same immoral acts without criticizing them at all. Think about it: would you rather deal with someone who behaves immorally and condemns that behavior, or someone who behaves immorally and stays quiet? Diving into the psychology of hypocrisy can make sense of why we feel the way we do.

Testing for hypocrisy

An experiment in 2001 aimed to turn people into hypocrites in the lab. Participants were asked to assign a set of tasks to themselves and an unknown second participant. One type of task was exciting and offered rewards, while the other was neutral with no rewards. A coin placed next to the participants came with a written instruction explaining that most people believed flipping the coin would be a fair way to distribute the tasks. Indeed, practically all of the participants agreed that flipping the coin to assign tasks would be the most moral thing to do.

But when it came down to it, only half of them actually flipped the coin, with practically everybody in the non-coin-flipping half giving themselves the exciting tasks. Among the people who did flip the coin — which was labeled “self” on one side and “other” on the other — 85% to 90% still managed to assign the exciting task to themselves. Clearly, either the coin was a magical sycophant or the participants pretended the coin had landed in their favor when it really hadn’t.
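Just how implausible is that outcome if the coins had been flipped honestly? A quick binomial calculation tells us. The study’s exact number of coin-flippers isn’t stated above, so the sample size in this sketch is a hypothetical illustration:

```python
# If the coin were fair, how likely is it that 85% or more of flippers
# assign the exciting task to themselves? The sample size here is a
# hypothetical stand-in, since the study's exact count isn't given above.
from math import comb

def tail_probability(n: int, k: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 20                # hypothetical number of coin-flippers
k = round(0.85 * n)   # 85% of flips landing in their own favor
print(f"P(>= {k}/{n} 'self' outcomes with fair flips) = {tail_probability(n, k):.4f}")
# ~0.0013 -- honest coin flips are a wildly implausible explanation.
```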

People wanted to look fair by using a coin to make their decision, but behind the scenes, they were just as selfish as the people who did not use the coin at all (most of whom had also agreed that using the coin would be the fairest approach). It’s a perfect example of moral hypocrisy at work.

The drive behind hypocrisy

Self-interest is the most obvious reason for any of us to act like hypocrites. When people are questioned about why they act in conflict with their own stated moral standards, many will say that the personal costs are enough to outweigh the intention to act morally. Essentially, we all want to act fairly until we are put on the spot and facing our own personal consequences. For example, it’s easy to justify many of our unfulfilled wishes to donate to charities and failed inclinations to help a stranger in need by telling ourselves that we just can’t afford to do it right now.

We all want to act fairly until we are put on the spot and facing our own personal consequences.

Our hypocrisy helps us out, that’s for sure. But we also use it in our relationships. Often, when we rate the fairness or morality of other people’s actions, we judge them more harshly than we judge ourselves for the same actions. In a 2007 report on a modification of the exciting-task/neutral-task paradigm described earlier, participants were afterward asked to judge their own and others’ fairness on a scale from 1 (extremely unfair) to 7 (extremely fair). On average, people scored themselves a 4 but rated others at only a 3.

Interestingly, our judgments of other people tend to be far more favorable if those others fall within our in-group (even if it’s a purely arbitrary in-group characterized by a random trait). We often judge an in-group member’s misbehavior to be just as fair as our own. We only have a greater distaste for other people’s bad behaviors when those people fall outside a social circle that we ourselves have drawn.

But why is hypocrisy so distasteful?

We’ve covered what hypocrisy looks like and what motivates it, but we haven’t tackled why we seem to hate it so much. One strong explanation relates to false signaling. In essence, hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t. When we hypocritically condemn someone’s immoral behavior, we disguise our personal misbehavior with a veil of persuasiveness or manipulation. It’s easier to see through an outright lie than a hypocrite’s condemnation. On top of that, a hypocrite has brought another person into the game. Instead of directly denying their immorality, the hypocrite sneakily implies they are good by attempting to shame someone else. This is a recipe for hatred when caught out.

Hypocrites employ a double layer of deception in their immoral acts — one more layer than the basic liars who simply say they’ve acted morally when they haven’t.

In a set of recent experiments, Yale researchers tested this false-signaling theory by giving people stories about different kinds of liars and hypocrites and then studying how people judged the characters within those stories. Four important results came out of these trials:

  1. When a person condemns other people’s behavior and we know nothing else about that person, we typically believe it comes from their moral goodness.

  2. Condemnation of bad behavior is a stronger signal of a person’s moral goodness than claims of personally avoiding bad behavior.

  3. When a person condemns a behavior that they themselves commit (hypocrite), we rate them as significantly worse than a person who says they don’t commit a behavior when they do (liar).

  4. We perceive hypocrites more favorably if they admit to sometimes engaging in the bad behavior than if they make no such admission.

Overall, it backs up the idea that we have a greater tolerance for liars than we have for hypocrites. Hypocrites are like a special type of liar who puts extra effort into disguising their misbehavior and sending us false signals of moral superiority. Those false signals drive our contempt. If a hypocrite is honest about their hypocrisy — if they get rid of false signals by admitting to what they condemn — our view of them can become significantly more favorable.

Perhaps there’s a lesson we can learn here. If we’re going to lie, that’s bad enough; let’s try not to fool and distract other people by pointing the finger. Sometimes, it’s okay to be transparent about our flaws. Nobody is perfect, but honest self-criticism and the ability to admit when we fail to live up to our own standards may be a good foundation for integrity. Hypocrites are terrible people. And occasionally, I’m one of them.

Our Bizarre Love for Story Spoilers

The people who make the best company tell the best stories. At parties, you never want to end up standing next to the guy who won’t stop talking about the washing machine he recently bought. Good stories teach us, entertain us, and help us to build connections with people. It’s probably obvious that a compelling story will depend on the interests of the person you are talking to. If you’re alone at a bar beside a stranger, last night’s football game might be a decent bet for a conversation starter. But there are also other critical features, independent of individual interests, that make us engaging storytellers.

Our brains dynamically track the content of a story as we hear it. It wouldn’t be accurate to talk about a ‘story perception’ part of the brain. Instead, as a story evolves, so does our brain activity. When we read about characters and their ambitions, the goal-directed areas of our brain are particularly active (e.g. the superior temporal cortex and prefrontal cortex). When we read about characters interacting with physical objects, motor-related areas of our brain are more active (e.g. precentral sulcus). In a sense, we are living a story as we read it. This is what makes the best literature so engaging and enjoyable: it takes us by the hand and leads us everywhere we need to go without looking back.

Self-control can be exhausting. If you’ve ever spent an hour trying to hold a good posture instead of your typical slouch, you’ve probably experienced this yourself. Keeping your back straight is not a particularly energy-intensive activity like running or swimming. Much of the pain comes from keeping your goal in mind and maintaining the levels of effort and motivation you need. I’ve previously described the exhaustion of learning to drive a car. Driving when you’re an expert is a lazy activity but driving when you’re learning can be utterly draining. Some evidence suggests that we have a central self-control energy resource, and when an activity drains that resource, our self-control capacity on other tasks is also diminished (this is a theory known as ego-depletion, although it is currently hotly debated).

The question for storytellers is whether they can exhaust other people with their stories. Listening to someone’s story could reasonably leave us feeling tired, but it seems to depend on how we process their story. Imagine reading a story about a waiter who arrives at work feeling hungry and spends their whole shift resisting the temptation to eat food from the restaurant because it is against company policy. If we actively take the waiter’s perspective as we read the story, it exhausts our self-control capacity. But if we passively read without putting ourselves in the character’s shoes, it does not seem to have this cost. In fact, some evidence suggests it may even boost our subsequent self-control. So engaging with a story can be an exhausting experience, but it depends on how much we empathize with the characters as they go through their hardships. I can relate to this evidence after recently spending a full day watching a controversial Senate hearing about the alleged wrongdoing of a US Supreme Court nominee. Putting myself in the shoes of the questioners and the witnesses, as the stories and perspectives were laid out, did not leave me feeling particularly healthy at the end of the day.

This question is going to sound crazy, but is it possible that spoilers for stories are a good thing? Well, according to one study, they may be. Researchers gave participants stories that they had not read before and asked them to rate how much they enjoyed each one. For some of the stories, participants were shown a spoiler paragraph before reading the full text. Unbelievably, for several types of story (mysteries, ironic twists, and evocative literary stories), people consistently reported enjoying the spoiled stories more than the unspoiled ones. You can come up with your own reasons for why this might happen, but some possibilities are that spoilers allow us to organize and anticipate stories better, while offering the pleasurable tension of knowing what may be about to hit the characters. This makes some sense to me, and explains why I often enjoy a movie more after a second viewing. However, for the time being, please do not spoil any of the upcoming titles on my movie list. The evidence might show that I’m likely to enjoy a spoiled story. But it hasn’t said anything about long-term appreciation or memorability. Yet…

Photo by Zhifei Zhou on Unsplash. Adapted by yours truly.

In keeping along this track of surprising facts, here’s one more obvious-sounding question. Would you prefer to hear a story about an experience you’ve had before, or an experience you’ve never had? You’d be forgiven for thinking you would rather hear about something completely new, because that is in line with what most people think. However, we may all be wrong. In one experiment, speakers who described their experience of watching a video expected listeners to prefer the story if the listeners had not already seen that video. The listeners expected the same thing. But the data suggest that listeners actually enjoy a story about a familiar video more than a story about an unseen one. When we tell people stories that are completely new to them, we tend to assume they have more knowledge about the topic than they really do. It’s a cognitive bias called the “curse of knowledge”. It leaves us in the awkward position of occasionally rambling while someone stares blankly into our eyes, too polite to interrupt and ask “what are you talking about?”.

So there’s a great incentive to learn about another person’s interests and background before deciding which of your amazing stories to share. For both story spoilers and conversational topics, some sense of familiarity allows our brains to keep better pace with a changing story. We don’t need to exhaust ourselves with patching holes in our knowledge and playing catch up in real time as the story unfolds. Next time you’re out chatting with friends, tell them stories about what they already know.

How Music Plays Your Brain

Listening to music can be a euphoric experience. It’s unclear exactly why it should feel so good. Is there some evolutionary advantage to enjoying music? Is it a byproduct of some other important function? Is it just one big accident in our evolutionary history? The debate still rages on these questions, but there is one important fact that we can be confident about: music has some deep-rooted appeal for humans.

There is something special about music even for the youngest listeners. Infants in their first year of life already have a meaningful sense of musical timing and pitch. When listening to samples of Western music, Mafa populations in Cameroon recognize the same basic emotions of happiness, sadness, and fear that Westerners do. Both populations also enjoy a similar sense of musical harmony when they listen to each other’s music. When asked to express different emotions by creating musical or physical movement patterns in a computer program, participants in the USA and a tribal village in Cambodia make very similar choices. There is a fundamental core to musical experience and expression that all humans seem to share.

We can look beyond humans to examine how deep our musical roots really stretch. In addition to the cross-cultural appeal of music, there may be a cross-species appeal. There are ongoing discussions about exactly how much our perception of music overlaps with that of non-human primates. Although there are commonalities in our ability to detect rhythms, it is still unclear whether monkeys can synchronize their movements with music in the way that humans can. Some non-human primates, like Kuni the bonobo, may spontaneously synchronize with audible rhythms when they play with a drum. But we need to wait for more evidence to fully understand whether non-human primates enjoy dancing as much as we do.

Non-human primates, just like humans, do prefer consonant music over atonal or dissonant music. However, researchers have often struggled to find any consistent preference for music over silence when non-human primates can choose between them. In 2014, one research group decided to test this question in a little more detail, by trying a range of different musical styles. They divided a room into four zones, which progressively increased in distance from a music speaker playing either West African Akan, North Indian raga, or Japanese taiko instrumental music. As the music played, the researchers measured where a group of chimpanzees spent most of their time. If they spent most of their time in zone 1, closest to the speaker, it would indicate that the chimpanzees enjoyed the music. If they preferred to stay in zone 4, where they could barely hear the music, that would suggest they preferred silence.

When the Japanese music played, the chimpanzees showed no preferences between zones. They seemed to be just as happy close to the speaker as they were far away from it. But when the African or Indian music played, they spent the majority of their time in zone 1, as close to the music as possible. In fact, they spent significantly more time in zone 1 than they did when no music was playing. So the tonal melodies and ambiguous pulses in West African Akan and North Indian raga music seem to set a nice mood for chimpanzees.

Photo by Rob Schreckhise on Unsplash. Adapted by yours truly.

What exactly is our brain doing when we listen to music? Instead of processing individual sounds, the brain processes patterns of sound as it implicitly develops expectancies about what is coming next. The auditory parts of our brain analyze several core features of music including pitch and duration, and interact with frontal brain areas as we use our working memory to pull together the information into higher level abstract representations. Many of us have experienced the intense chills that come with listening to deeply moving parts of our favorite music. As this happens, the reward centers of our brain, especially subcortical areas including the ventral striatum that sit deep within the brain, adjust their activity: the more chills we feel, the more activity they show. In fact, when we experience those moments of musical euphoria, the striatum releases dopamine, one of the brain’s reward-related neurotransmitters.

Experts who play musical instruments probably experience music differently to non-musicians. A recent study looked for this difference in the brain. The researchers put beatboxers and guitarists in a brain scanner and measured the levels of activity in their brain as they listened to music involving beatboxing or guitars. When looking at parts of the brain involved in translating sensory information into motor actions (sensorimotor areas), they found an interesting pattern. Guitarists showed increased sensorimotor activity when listening to guitar music, and beatboxers showed increased sensorimotor activity when listening to beatboxing music. So their prior musical experiences in physically playing instruments changed how their brains reacted when listening to those instruments.

The researchers drilled down a little further to examine the finer details of the sensorimotor activity for each group. The primary sensorimotor areas of our brains are organized somewhat topographically: different sections of their structure represent different parts of the body (these body maps are often illustrated as “cortical homunculi”). This means you can compare “hand” activity to “mouth” activity in those brain areas. If musicians’ sensorimotor activity during listening represents the actions involved in playing the music, then you would expect to see hand areas activated for the guitarists and mouth areas activated for the beatboxers. After all, those are the body parts they use when making their music. This is precisely what the researchers found. Guitarists activated their hand areas when listening to guitar music, while beatboxers activated their mouth areas for beatboxing music. Their brains automatically recruited the relevant sensorimotor regions when processing the musical audio, almost as though they were playing the music themselves on some level. In other words, the musicians’ passive listening became a little more active when they heard their own type of music.

Photo by Matheus Ferrero on Unsplash. Adapted by yours truly.

Music is one of those real-life miracles that practically all of us can connect to. It brings us together at festivals, bars, and other social events, and can give us a dramatic emotional lift when we most need it. The question of why it has these magical effects continues to elude us, but this mystery makes our musical experiences all the more impressive. Whether you’re a metalhead or an opera enthusiast, don’t forget to fully appreciate and enjoy your next musical fix. If you’re lucky, it might inspire a creative spark or moment of ecstasy. But at the very least, you’ll be tapping into an experience that you share with many of your fellow primates.

The Tech That Reads Your Mind and Sees Your Dreams

“turned on red Psychic Reader neon sign” by Scott Rodgerson on Unsplash

The ability to read someone’s mind has traditionally been the stuff of fiction. Our thoughts and experiences are private to us and we can choose when we share them with others. But with developments in brain scanning technology, mind-reading is becoming a hard science rather than a false promise. You may no longer need to be superhuman to see the darkest thoughts and desires of the person opposite you. You can instead convince them to lie in your brain scanner.

A basic test for a mind-reading machine is to tell you what visual image you are holding in your head. If the machine has to decide which of two images you are thinking of, does it perform significantly better than guessing at random?

In some cases, this is a fairly easy task with a brain scanner. For example, if you are trying to guess whether someone is thinking of playing tennis or walking around their house, you find different areas of the brain that are most active: the supplementary motor area for the motor imagery involved in playing tennis, and the parahippocampal place area for the spatial imagery involved in walking around your house. This distinction has been used to communicate with hospital patients who lie motionless in what appears to be a coma. If patients can use a thought to answer “yes” or “no” to questions (e.g. thinking of tennis for yes and house for no), then the doctor knows that the patient is showing signs of consciousness.

When you have a larger number of visual images to choose from or greater similarity between images, the task of decoding what someone is thinking becomes far more difficult. The overall levels of activity across the brain might be very similar for seeing a leopard versus a duck, so you need to be more sophisticated in how you analyze brain imaging data. One option is to drill down into detailed patterns of activity within a single area.

You can start scientific mind-reading by decomposing a list of images into their different visual features (e.g. object position, orientation, and light contrast). Then, you can take a set of practice images and train a decoder machine to link the features of those images with the patterns of activity in the visual areas of a person’s brain as they see those features. Each feature drives brain activity in a different direction, so every unique combination of features corresponds to a unique overall pattern of activity.

After the machine is trained, you show the person brand new images they’ve never seen before and measure the patterns of activity in their visual brain areas. By using the associations that the decoder picked up during training, you can infer the visual features for the new image they see from their brain activity patterns. The machine can then look through a database of images, and estimate which image the person is seeing based on how closely the inferred features from brain activity match the actual decomposed features of an image. The closest match becomes the machine’s best guess.
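To make the pipeline concrete, here is a minimal sketch in Python using synthetic data. Everything in it (the ridge-regression decoder, the dimensions, the nearest-match rule) is an illustrative assumption rather than the researchers’ actual code:

```python
# Sketch of the decoding pipeline: learn a linear map from brain activity
# (voxels) to visual features, then guess which database image a new
# activity pattern corresponds to. All data here are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_train, n_voxels, n_features = 200, 500, 10

# Training phase: known feature vectors and the voxel patterns they evoke.
train_features = rng.normal(size=(n_train, n_features))
true_map = rng.normal(size=(n_features, n_voxels))  # unknown to the decoder
train_activity = train_features @ true_map + rng.normal(scale=0.5, size=(n_train, n_voxels))
decoder = Ridge(alpha=1.0).fit(train_activity, train_features)  # activity -> features

# Test phase: the person sees (or imagines) a brand-new image.
database = rng.normal(size=(50, n_features))  # decomposed features of candidate images
seen = 7                                      # index of the image actually seen
new_activity = database[seen] @ true_map + rng.normal(scale=0.5, size=n_voxels)

# Infer features from brain activity, then pick the closest database match.
inferred = decoder.predict(new_activity[None, :])[0]
guess = int(np.argmin(np.linalg.norm(database - inferred, axis=1)))
print(f"machine's best guess: image {guess} (truth: image {seen})")
```

The better the learned map from activity to features, the more often the closest match in the database is the image the person actually saw or imagined.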

The amazing thing is that because there is so much overlap in how our brain responds to actual visual images and the visuals we imagine in our head, you can do the same thing for what people are thinking about. Instead of showing them a visual image in the brain scanner, you just ask them to visualize a particular object in their mind. By analyzing brain activity in the same way, the machine can correctly infer which object they are imagining, or even which piece of famous artwork they have in mind.

One of the most exciting applications for this kind of mind-reading may be in decoding the content of our dreams while we sleep. Dreams are not only notoriously difficult to understand, they are often so vague and disconnected from reality that we barely remember them when we wake up. However, as with the imagined versus seen images I mentioned above, there is strong overlap in the visual brain patterns corresponding to seen images and dreamt images.

Researchers put participants in a brain scanner, waited until they fell asleep, and then woke them up during the most dream-intensive phase of sleep. By asking them to describe any visual images they saw while asleep, the researchers built a record of the images that people dreamed and their brain activity during those moments. By training a decoder machine on brain activity when people physically saw different images while awake, they could successfully read and predict what people visualized in their sleep from the same patterns of brain activity. As this kind of technology develops and improves, we should end up with more accurate and more comprehensive dream-reading machines.

Memory is another important function that depends on our ability to generate mental images. Long-term memory is similar to the type of visualizing I described in the experiments above. If I ask you to imagine a leopard or your tennis swing, you are recalling elements from your long-term memory of past experiences with those images or actions. But you also have working memory, which refers to your capacity to hold a number of objects in mind for seconds or minutes while doing a task. You may be trying to hold a phone number in mind while you dial it, or perhaps pictures during a memory game.

Visual areas of the brain generally do not sustain their overall level of activity when we hold visual images in memory. But as I explained before, you may need to drill down to find patterns of brain activity that code for specific images. This is exactly what one group of researchers tested when they asked participants to remember the orientation of a quickly flashed visual object for 11 seconds. After that delay period, participants had to decide whether a new comparison object was the same as the object in their memory. The decoder could analyze activity in visual areas of the brain during that delay period, and guess which of two orientations people were holding in their memory with over 80% accuracy. So even though brain activity in those areas returns to its resting level after seeing a visual object, it continues to exhibit a pattern of activity matching that object as you hold it in memory. Those same patterns also reliably predict the image if you generate it yourself in your mind instead of holding it in working memory over a delay.
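For a feel of how that two-orientation memory decoding might work, here is a sketch that trains a classifier on simulated delay-period voxel patterns and scores it with cross-validation. The dimensions and signal strength are invented for the demo; only the logic mirrors the approach described above:

```python
# Decode which of two orientations is held in working memory from
# simulated delay-period activity. All numbers are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_voxels = 120, 300

labels = rng.integers(0, 2, size=n_trials)   # which orientation was remembered
pattern = rng.normal(size=n_voxels)          # voxel pattern distinguishing the two
# Weak orientation-specific signal buried in noise, mimicking a delay period
# where overall activity has returned to baseline but the pattern persists.
activity = np.outer(np.where(labels == 1, 1.0, -1.0), pattern) * 0.08 \
           + rng.normal(size=(n_trials, n_voxels))

scores = cross_val_score(LogisticRegression(max_iter=1000), activity, labels, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")  # well above 0.5 chance
```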

The activity in our brain is naturally responsible for our mental experiences. Decoding those experiences with brain scanners is a radical and enlightening new project. We have already hit successes in decoding the contents of our mental imagery, dreams, and working memory. It’s thrilling to consider where we go from here. Although it’s easy to worry about future misuse of this technology (e.g. with invasions of privacy), the scientific journey itself could realistically improve how we understand people’s conscious experiences and potentially how we treat mental health disorders. In essence, it could teach us about the most important facts in our lives: where our thoughts come from, what they do to us, and how we can change them for the better.

Stop Assuming They Don’t Like You

Photo by Noah Buscher on Unsplash

When you think about what makes you anxious in life, social events are likely to feature prominently. Public speaking, meeting new people, and competing with others make many of us wince with an awkward pain. We have anxieties about what can go wrong for good reasons: loneliness is a killer, and weak social networks can prevent us from making progress.

Fear of embarrassment may be one of the primary emotional drivers that make us nervous about joining or speaking to a new group of people. We don’t want to be that person standing alone at the party and we don’t want our reputations destroyed by a hasty comment that came out wrong. Generally speaking, two things need to come together to cause embarrassment. The first is a failure according to our personal standards (e.g. falling over or saying something stupid). The second is a social setting in which we know others may be judging us. When you look at the brain while someone is embarrassed, you find activity in exactly the areas of the brain that are most relevant for these functions: emotional arousal areas like the anterior insula that are linked to the experience of personal failure, and ‘mentalizing’ areas like the medial prefrontal cortex that are involved in understanding what other people may be thinking about us.

When we end up in the unfortunate position of social reject, the emotional pain we experience is not so different to the physical pain from a burn. Both are deeply uncomfortable, highly aversive, and both make me want to jump into a large bucket of ice water to numb the pain. In fact, there is striking similarity between the two types of pain in how the brain treats them. They both activate parts of the brain important for processing physical sensations on the body (like the posterior insula and somatosensory cortex). When you really zoom in to look at those areas in more detail, you may be able to detect differences in the precise patterns of activation within them, depending on the type of pain. After all, the two experiences are not entirely identical and we are still very capable of distinguishing them. But there is no getting around it: when we feel socially rejected, it hurts like hell. Whether a romantic partner has called for a hiatus, or we’ve embarrassed ourselves in front of an audience, the brain knows exactly which systems to recruit in order to make it as excruciating as it needs to be.

Photo by rawpixel on Unsplash. Adapted by yours truly.

If some of my previous accounts of personal and general brain-hating haven’t already made it clear, we are vulnerable to errors in our perceptions and thinking patterns. So maybe there are times when we misread how others feel about us. In a refreshingly simple recent experiment, researchers put two strangers into a room, gave them a few ice-breakers, and asked them to chat. They then pulled the pair apart and surveyed them individually on how they felt about the other person, and how they believed the other person felt about them. People consistently underestimated how much the other person liked them, and the researchers called this ‘the liking gap’. The gap between how we think other people feel about us and how they actually feel can persist for months after we meet someone, and it holds true whether the conversations were 2 minutes or 45 minutes long.

It’s almost as though we are utterly determined to believe that other people have a problem with us, even in the absence of any evidence to support that belief. The effect may be driven by an excessively critical review of our own performance after an interaction with a new person. We judge our own conversational quality more negatively than we judge other people’s. We dwell too long on small details that might have been mistakes and might have annoyed or offended our conversational partner, and don’t pay enough attention to how they reacted perfectly happily or normally to everything we said. Perhaps this self-critical attitude drives us to improve and become better company in the long run. Or perhaps it needlessly upsets and embarrasses us, and makes us hesitant to meet new people in the future. That’s for you to decide.

Photo by Kelly Sikkema on Unsplash. Adapted by yours truly.

We are gregarious creatures, so friends provide some of the biggest excitements and joys that life has to offer. It’s a good idea to carefully monitor our behavior and make sure we present our best selves when we meet new people. But much of the time, we have a habit of reading the situation poorly. In typical conversations, the pressure to be liked can overwhelm our rationality and distort our judgments about what other people are thinking. When we next conclude that a conversation was a failure, it might be worth a second thought. And even when we really do suffer a social rejection, there may be silver linings we can cling to, like the opportunity to use our emotional reactions and sense of independence as inspiration to be creative (and there are other ways to maximize your creativity too). Aren’t all the best love songs about breakups?

The Day I Embarrassed Myself

Photo by Louis Hansel on Unsplash

The biggest turning points in our lives come from moments when we need to make a decision. We make decisions ranging from the most trivial to the most important every single day. We pick and choose the friends who are right for us, the directions in which to travel, the careers to develop, and the cities to build. Anatomically speaking, we humans are all basically the same. It is our decisions that set us apart.

Decisions are not always easy, and the modern world often asks a lot from our poor ape brains. Sometimes it seems like we can’t win. We can have both too little and too much choice. Our conscious minds can overthink a problem while our unconscious minds miss too much. And we are expected to make reasonable sense of what is around us now, while also predicting the future consequences of the possible decisions available to us.

Predicting the future is no easy feat for non-clairvoyants (i.e. everyone). Many people and events can depend on what we decide to do, and I’m not just talking about the decisions of war generals. Deciding whether or not to buy a coffee right now can impact what we hear and say in a later work meeting, which might affect our reputations and careers. Deciding whether to take this crowded train or the next quiet one to university can determine whether we make it to an exam on time or fail. And the decision most recently pressing on my mind while I lived in the UK: whether or not to attend a wedding, which can impact how particular people feel about us.

As I hinted at when I referred to our poor ape brains, our reasoning and decision-making is not optimally set up for modern life in civilized society. There are plenty of processes and mechanisms that made sense in our evolutionary history but are now misaligned with the ideals and demands of modern life. We call them cognitive biases, and our brains are littered with them. I will talk through just a few of them in the context of my decision-making on the day I had to attend a wedding, because it’s easy to see how often I make these mistakes. It might seem like a dire situation for human psychology, but far from it. When I notice a cognitive bias appear in my head and remain aware of it, it is less likely to force me into making poor decisions that turn my molehills into mountains.

Keep in mind this important note as I tell you the story: I hate weddings. I absolutely hate weddings. And I argued with my wife every day for one month about why I had to go to this specific wedding, and why I couldn’t just stay at home (just like I argued for my own wedding). This was how my morning went on that day. I will italicize my cognitive mistakes to make them extra embarrassing. I hope you can relate to at least a couple of them. Here goes…

Photo by rawpixel on Unsplash. Adapted by yours truly.

7am — My alarm rings and I slowly open my eyes. It dawns upon me. It is the day of that wedding, and I need to leave the house within the next hour for a long journey from London to Codsall. Codsall for goodness sake. Codsall! What a daft name for a place.

7.15am — I’m still lying in bed, and putting off the day ahead by reading the news on my phone. I get a message from my wife who is flying in from Washington DC and meeting me at the wedding. Her flight was cancelled during the night while I was failing to sleep but avoiding looking at my phone, and she had to get on a new one. She will now be at the wedding 6 hours after I arrive in Codsall. I will need to spend 6 hours in a dingy little depressing village cafe, waiting for my wife, so that I don’t need to spend any time alone at the wedding. This is an abomination.

  • Cognitive bias 1 — Overgeneralizing learned rules: I have been to many small English villages in my time, and I would estimate something like 40% of them had cafes that I did not enjoy sitting or working in. In cities, this value is close to 0%. So I have detected what I believe is a reasonable pattern in the world in terms of my preferences, but I am over-applying the rule to places I have never visited before. Yes, I have encountered far more beastly cafes in small villages than I have in cities. But it is nowhere near 100% of those villages. So I should be giving completely new places a good chance of impressing me with their cafe selection. Some evidence suggests overgeneralization may be relevant in panic disorder and generalized anxiety disorder, where patients’ perceptions of danger spread too far.

7.31am — I’ve made it into the shower.

8am — After sulkily putting on my clothes and throwing my suit in a bag, I am prepared to leave. I look out of the window and it is pissing it down out there (translation for non-British people: raining heavily). I do not have an umbrella.

8.15am — I’ve walked through the rain and I’m now at my local tube (subway/metro) station in east London. The place is crawling with humans scurrying to get to work in central London. I miss the first train I need because too many people get on ahead of me, so I wait my turn at the front of the queue for the next one. It arrives but I’m pushed out of the way by a small woman with curly hair who was behind me. She takes up the last empty space on the train as the doors slam shut, and she looks at me with contempt as it begins moving. This woman is an arrogant, selfish, devil-worshipper. I am now a misanthrope for the foreseeable future.

  • Cognitive bias 2 — Attributional biases: We can always catch ourselves making mistakes in how we attribute characteristics to the events in our lives. One example is the “curse of knowledge”, in which both adults and children incorrectly assume that other people know what they themselves know. This makes communication difficult and can lead to bad decisions. We also make errors in attributing responsibility, especially by ascribing permanence to what is temporary. When we are happy or sad, we often feel it is a defining part of us rather than a fleeting emotional experience that will come and go. Patients with depression have a worse problem: they believe that any negative emotions will stay with them forever and are their own fault, while positive emotions are an accident that will disappear before long. We also tend to assume that other people’s bad behavior is attributable to basic character flaws rather than the possibility that they are just having a bad day. Is the pushy woman I met on the train really a devil-worshipper? Or could she have got some bad news about a relative that morning?

8.22am — I am sitting in a chair on the train platform in despair, with my head in my hands. Everyone has their own thing going on, entirely ignoring each other. A small black Labrador trots up to my leg on its owner’s leash. It stares into my eyes. This dog knows. It is confused about our culture and behavior and is questioning why we insist on standing in these crowded sweaty places rather than running around outside in the park chasing sticks.

  • Cognitive bias 3 — Anthropomorphization: We enjoy imbuing non-humans with human-like characteristics because we feel it helps us to understand them better. This is not always completely unrealistic. After all, a dog probably has some emotional subjective experiences going on, even if we cannot exactly pinpoint their quality relative to humans. The problems with anthropomorphization are a little clearer when we start talking about inanimate objects as though they were alive. We see eyes in the headlights of cars, a Mother figure within nature, and we form emotional attachments to rocks and bits of metal (e.g. jewelry). This might also relate to our visions of Gods and conscious intentions within natural phenomena throughout history. When people anthropomorphize slot machines, they even gamble more.

Photo by Daniel Cheung on Unsplash

8.45am — I finally make it onto a train, but on the way I become certain I will miss my train from London Euston station to Codsall. This annoys me, and I seriously consider going back home, lying on my wonderful sofa, and ignoring all messages and calls from wedding people. But I have already paid money for the return train tickets. Surely I can’t waste that money by not taking the trip now? If I miss the train, I will just need to pay for another ticket at the station. I have come this far, lost this much money, and now I need to see the trip through to the end no matter what.

  • Cognitive bias 4: Sunk cost fallacy — This is often expressed as ‘throwing good money after bad’. When we have spent money on something, we feel an overwhelming commitment to it and fight against any urge to drop out early. When we buy a ticket to a play or an opera and take our seat, we are more likely to sit through the full 3–4 hours of torment than to leave if we dislike what we are seeing, even though the most rational decision is to leave if we predict continued disappointment. Remaining committed to a decision after we start it is perhaps one of the biggest drains on human time and happiness. And it’s not only monetary investments that bind us in this way; commitments of effort and time have similar effects. Once we start, we are hesitant to stop, even if we foresee approaching disaster from continued commitment to our initial decision. We need to be able to stop when the time is right, ignoring past investments that have no real impact on what we do now. When resources have already been committed to a particular course of action, those sunk costs should not brainwash us into continuing with plans that turn out to be ineffective. Quitters are not always weak losers; they are often among the strongest and most resilient people. The sunk cost fallacy may itself be driven by overgeneralization (see Cognitive bias 1 above) of a “Don’t waste” rule.

9.22am — I ran at speeds that Einstein would be impressed with to catch the departing train at Euston station with about 15 seconds to spare. I drop myself down dramatically in an empty seat, and the air pushed out from under me creates a nice calming breeze. I think about the obstacles I have overcome to get here over the last couple of hours. So many separate bad things have happened on the way to this wedding. Positive and negative events seem to happen fairly randomly, so surely I am due a pleasant surprise when I actually get to the wedding. Nobody has ever had such an unlucky roll.

  • Cognitive bias 5: The gambler’s fallacy — Have you ever been to a casino? Stand by the roulette table long enough and you’ll see something peculiar but intuitive for most people. When the roulette wheel has landed on red or black repeatedly in a row, customers start betting big on the opposite color for the next spin. They believe that in a random sequence, you are unlikely to see a long series of the same event. People intuitively feel that red, red, red, red, red is less likely to happen than red, black, red, red, black, even though the probability of getting red or black is the same on every spin (ignoring the green zero). This is the gambler’s fallacy. It doesn’t just fool us standard everyday specimens of humanity. In my academic research, I analyzed some data suggesting that elite soccer goalkeepers may show similar biases when deciding which way to dive in penalty shoot-outs.
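The arithmetic behind the fallacy is easy to check for yourself. A tiny sketch, assuming a simplified wheel with no green zero so that red and black are equally likely:

```python
# Every specific red/black sequence of the same length is equally likely,
# and a run of reds says nothing about the next spin. We assume a
# simplified wheel with no green zero, so P(red) = P(black) = 0.5.
from itertools import product

p = 0.5

def sequence_probability(seq: str) -> float:
    """Probability of one specific sequence such as 'RRRRR'."""
    return p ** len(seq)

print(sequence_probability("RRRRR"))  # 0.03125
print(sequence_probability("RBRRB"))  # 0.03125 -- exactly the same

# Among all 6-spin sequences that start with five reds, exactly half
# end in red: the streak carries no information about the next spin.
runs = ["".join(s) for s in product("RB", repeat=6) if "".join(s).startswith("RRRRR")]
print(sum(s.endswith("R") for s in runs) / len(runs))  # 0.5
```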

12.00pm — I’m sitting in a cafe in Codsall, and against all the irrational odds I set myself, it’s one of the most peaceful and wonderful cafes I’ve ever sat in. I got more writing done than I normally would, had amazing cheap coffee and cake (relative to London where I lived at the time), and talked to random strangers about their lives. My wife ended up arriving around 6 pm and we made it to the wedding just before the curtains closed. I even enjoyed what was left of the wedding. Codsall is great…

Photo by Gades Photography on Unsplash. Adapted by yours truly.

I walked you through my mental mishaps on that wedding day because they are so representative of our general everyday reasoning (feel free to describe your own examples in the comments section to help me look less stupid). But the same biases could just as easily apply in more serious situations, where the basic impulses in our characters guide us towards disastrous beliefs and actions. All of the biases listed above can change our lives by affecting the decisions we make. And there are certainly many more than the ones I mention. By being more aware of them, we can minimize the chance that they blindside us where it hurts.

This Is Your Brain on the Internet

This article was a front-page feature on Medium.

The Earth has never witnessed a more seamless tool for knowledge-sharing than the internet. Word-of-mouth is a great way to send knowledge from one brain to another, and the internet allows us to do this with practically any source of information in an instant, from one side of the world to the other. But as Uncle Ben says in Spider-Man, “with great power comes great responsibility.”

Our frictionless information sharing has become a bit of a pain. We’re reluctant to invest time in checking the validity of each piece of news we find, and this leaves a lot of dishonest or misleading material circulating in our brains, potentially changing our behavior in dysfunctional ways. Learning is a great and productive tool when it teaches us what is true about the world, helping us to better navigate it. It’s not so great when it teaches us a falsehood. If we learn that marshmallows are good for building electronic machines, for example, we won’t get very far. But if we learn that silicon is good for building electronic machines, we get the smartphone.

True and false news spread differently across the world of social networks. Falsehoods tend to be more novel and unique in their messages (they are being invented after all), so they spread faster and wider across networks than truths. The way in which people interact with the messages also varies. False news inspires comments of surprise and disgust, while true news inspires comments of sadness, anticipation, and trust.

We cannot blame bots for this state of affairs. Although they amplify and accelerate the spread of news, they tend to do this equally for both true and false news. It is we humans, with our attraction to the unique drama, surprise, and disgust of false news, who are primarily responsible for the avalanches of misinformation that characterize modern information consumption.

The internet has changed the way that our brains work. Humans have always been good at learning and adapting to new environments. So given the internet’s dramatic impact on life in the developed world, it is no surprise that we have adjusted our thinking and behavior. The biggest impact has perhaps come from companies like Google, which make all knowledge available to us at a few keystrokes.

Our internet usage has “Googlified” our brain, making us more dependent on knowing where to access facts and less able to remember the facts themselves.

We can test how much technology has influenced our mental function by examining how and when the brain activates tech-related concepts. When words and concepts are readily accessible at the front of our minds, they often distract us and interfere with how well we perform behavioral tasks. Researchers have used this principle to test whether difficult trivia questions automatically activate internet-related concepts in our brain. If we don’t know the answer to something, our first thought is likely to be “Google.” When study participants took part in a behavioral task immediately following difficult trivia questions, their performance in that task worsened when words like “Google” appeared on a screen, distracting them.

There are other powerful indicators of Google’s impact on our mind. When we type out interesting trivia tidbits on a computer, our memory for the information is significantly better if we are told that the computer will delete rather than save the information. And if we type out tidbits and save them to a specific folder, we are more likely to remember where we stashed that information than details of the actual contents.

In other words, our internet usage has “Googlified” our brains, making us more dependent on knowing where to access facts and less able to remember the facts themselves. This might sound a little depressing, but it makes perfect sense if we are making the most of the tools and resources available to us. Who needs to waste their mental resources on remembering that an “ostrich’s eye is bigger than its brain,” when the internet can tell us at a moment’s notice? Let’s save our brains for more important problems.

Image: sbtlneet/Pixabay/CC0, modified by author.

The internet acts as a great aid, but our faith in it and reliance on it can make us overconfident in our own abilities. After using the internet to look up answers to questions, we begin to believe that our own general ability to understand and explain problems is better than it really is. Compared to those without recent internet access, we even insist that our brains are more active, reflecting our illusions of superior competency. We often fail to grasp just how much we rely on sources beyond our own talents when we succeed in the world.

Photographs also have transformative effects on the way our memories work. When we walk through an art museum, we remember less about the display pieces if we take photos of them, even when photographing them takes longer than simply looking at them. Our memory is less impaired if we zoom in on specific details of the pieces while photographing them, but it is certainly no better than enjoying the pieces themselves without our paparazzi behavior.

The advantages of using the internet correctly are enormous, so we need to be careful about making any concrete recommendations on usage limits.

Photographs can be a great way to physically save a moment into your collection, and cameras may help visual memory if used as a tool to enhance how you engage with an experience. But don’t let them come at the expense of your own enjoyment and natural memory of the real thing in front of you. It’s counterproductive and a little bizarre to take photos of the world’s wonders but forget to look at them while they’re actually there.

In our modern digital world, we are using increasing numbers of different media at the same time. The effects of this on our general cognitive capacities are not yet clear, but there may well be some costs. A 2009 study showed that people who heavily engage in multiple forms of media at the same time (e.g., talking on the phone, while working on an essay, while listening to music, while watching TV), perform worse in standardized cognitive tests that measure memory, attention, and task-switching. (A 2013 study suggested the opposite effect for task-switching.)

Heavy media multitaskers may be more vulnerable to distraction and interference from irrelevant sources of information. When you constantly bounce between multiple sources of entertainment and work, you may well be training your mind to become more volatile and less able to sustain attention to the one important task you really need to complete.

Photo: Erik Lucatero/Unsplash, modified by author.

To fully understand the costs and benefits of the internet on our brains, we need to patiently watch how the research evolves over the next few decades. The reward-rich world of the internet may come with costs that include distractibility and impaired self-control. Recent studies even suggest that children who use the internet excessively may show reduced gray and white matter volume in certain brain areas, and their verbal intelligence may suffer. It is not yet clear whether internet usage directly causes these effects or whether children who are predisposed to them are simply more likely to overuse the internet. For now, the evidence provides notes of caution rather than conclusive insights.

The advantages of using the internet correctly are enormous, so we need to be careful about making any concrete recommendations on usage limits. However, as with practically everything in the world, moderation and thoughtful consumption are likely to go a long way.

When we pay careful attention to what the internet is doing to us in our own lives — how happy or sad it is making us, and how much it is helping or hindering our progress — we can make better decisions about optimizing our well-being. The internet is amazing, but the beautiful world outside is also waiting for us to directly experience, learn from, and appreciate it. The whole wide world and the world wide web may well compete for our time and attention. It is up to us to maximize the benefits in our own lives by choosing the right “www” when it matters.

Why Feeling Insignificant Can Be Deeply Empowering

“When you consider things like the stars, our affairs don’t seem to matter very much, do they?” - Virginia Woolf (Night and Day)

In the Western world, we are carefully nurtured by those around us until we reach ‘maturity’, and then we are left to face the world alone. We do, of course, improve in our ability to live independently as we get older. As babies and toddlers, our independence is pretty much limited to breathing. Everything else, including feeding, cleaning, and transport, is closely supervised by adults who hopefully know what they are doing. But when we hit our late teens and early twenties, perhaps excluding the occasional error in feeding and hygiene, we are able to do all of these things alone. So we are encouraged to spread our wings and fly.

Independence is important, but if we are honest, there are many moments of fear and uncertainty where we could use guidance, even well into adulthood. We all want help with our minds because we are needy creatures, but asking for it can be a daunting experience because we do not want to seem incompetent or babyish. So we are left with the impression that others are living successful happy lives, while we put on a brave face that masks an insecure internal reality.


Our hesitance to ask for help is part of a collective psychosis, driven by the extreme assumption that adults should be able to take care of their own psychological development, and anything else is a sign of weakness. In fact, we are all wearing the emperor’s new clothes, and it may take someone screaming “But we are all incompetent!” to snap us out of our delusions of suffering alone. It’s about time we all got a little more familiar with each other’s worries and follies, so that we can see just how much overlap there is between us. Sharing our concerns with someone who responds with an acknowledging nod of “I’ve been there my friend” can have a profoundly positive effect on our emotional life.

I remember one particular moment in my own life when this scenario came to pass. It was a chilly Saturday night, the kind that makes your knuckles feel numb within minutes of stepping outside. I was a fairly standard emotionally volatile 17-year-old sitting with friends in the local pub in my small town on the outskirts of London. I noticed an on-off girlfriend walk in and sit at the next table with her friends. It was the kind of relationship you might label with “it’s complicated” on an online social network page. We got chatting later in the night, but naturally, with the early stages of intoxication looming, a slightly heated debate took to the stage (so important was this debate that I have entirely forgotten what it was about).

After making our way out of the pub into the conveniently cold night that would cut any argument short, we exchanged a few more intense words, then I turned and darted away into a dark and lonely park nearby to feel sorry for myself. I sat down at the nearest and gloomiest bench I could find to complement my mood, and contemplated why the universe wouldn’t let me be happy like everyone else. A drunken gentleman, probably a decade or so my senior at the time (my age now as I write this sentence), walked past at that moment and mumbled something I didn’t quite catch. I blurted out what any British person would at such a time: “Sorry?”. He stopped, smiled, and asked a question that I found shocking at the time: “Is it a girl?”. With my current, more mature mind, this question seems far less magical, but at the time I thought this man was a prophet, so I asked him to surrender his enlightened teachings. He continued in a rather haphazard but comprehensible way to explain that we all go through romantic anguish, and that it blinds us to the clear fact that any time we spend in misery is time that we will later regret. So why waste any of our limited life on emotional pains that will eventually disappear and barely be remembered? I can assure you that his insights that night were not particularly deep, wise, or even fully coherent, but simply meeting someone who could highlight the fundamental non-exclusivity of my feelings was remarkable proof of the power of shared pain and empathetic community. It showed me that we all go through these kinds of struggles and come out better at the end. I immediately hopped up and went home to happily finish a book I was reading about earthworms.

Photo from Pixabay. Adapted by yours truly.

Your pains and anxieties seem much less significant when you feel part of something bigger. This bigger thing can span from other members of society to the universe as a whole. The larger parts of that range may be difficult to feel any strong connection to, because their scale is so vast that our ape brains struggle to comprehend them in any realistic way. But they probably also offer a more powerful experience of togetherness or ‘oneness’ if we can reach them, something that many Buddhists teach (I’m not a Buddhist myself). At a more basic level, connecting properly with others around you allows you to take someone else’s perspective on your own pain, which helps to get you out of your own skin and away from the self-centered perceptual and cognitive biases that distort your view of the world.

Photo from Pixabay. Adapted by yours truly.

A neurological condition known as anosognosia presents a particularly extreme example of the benefits of shifting away from your first-person perceptions. Anosognosia refers to an inability to see or understand your own illness, even when it is abundantly clear to everyone else. When patients have anosognosia for hemiplegia (paralysis on one side of the body), they will repeatedly and confidently deny that they have a problem, even though they cannot move their limbs. If they are asked to move their limbs, they tend to make irrational excuses for why their limb cannot move at that moment in time. But they are utterly convinced that they are healthy, and are not just making up stories that they know are false. When a group of researchers decided to film a patient denying their symptoms and then play the footage back to them, they discovered a dramatic change in the patient’s reactions. The patient was effectively cured of their anosognosia, purely through seeing themselves from a third-person perspective. This video replay approach has since been used more widely with similar results. Although this is a rare and extreme case, it does illustrate the power of putting yourself in someone else’s shoes. We all have needs that can benefit from escaping our narrow-minded and often robotic first-person perspectives.

So what exactly are the needs that we share as humans? Abraham Maslow took on this challenging question in the 1940s. His answer, the “hierarchy of needs”, is now famously depicted as a pyramid. According to Maslow, our most basic needs at the bottom of the pyramid are physiological: we need food to prevent starvation and shelter to protect us from the elements. Without these, we make very little progress in the world. A step up the pyramid brings us to our safety needs; we can only thrive if we manage to avoid war, abuse, and any other violent or dangerous situation that threatens to cut our lives short. Only after meeting these most basic needs do we begin to care about the more advanced psychological needs in the upper levels of the pyramid. These are belonging, esteem, and ultimately self-actualization: we need friendship and intimacy, we need self-respect, and then we need to realize our full personal potential.

Maslow’s model makes intuitive sense overall. People in war-torn countries are probably less able to find the time and mental space to care about reaching their full potential when they are worrying about whether the next bomb will drop on their home. If we cannot keep ourselves alive with reliable access to food, then we cannot motivate ourselves to find friendship or self-esteem. However, I cannot help but feel that this conceptualization of human needs adds to the common sentiment that our psychological needs matter less than our physiological needs. It is a sentiment that makes us believe we should feel slightly embarrassed when complaining about being lonely or sad. The truth is, when we fail to meet basic psychological needs, we can be far worse off than a simple step down in Maslow’s hierarchy would suggest. A lack of food can kill us, but so can a lack of psychological and emotional stability. According to the US Centers for Disease Control and Prevention, suicide is the second leading cause of death for people aged 25–34, behind only unintentional injuries. In one 2013 survey of 9th–12th grade students in the US (roughly 14–18 years of age), 17% of students had seriously considered suicide in the 12 months preceding the survey. These are not small numbers.

I believe our human needs in the modern world have a more complicated structure than the hierarchy presented by Maslow (as valuable as that hierarchy has been within the psychological sciences). Food is no less important than it used to be, but we need to take our psychological needs just as seriously, not treat them as secondary phenomena worthy of less concern. Intense sadness can feel just as bad as hunger, and the suicide statistics above should at least make us think twice before assuming it is any less fatal. In fact, suicide presents an additional, bizarre problem compared to deaths from unmet physiological needs: people actively want to die, rather than dying unintentionally from bodily failures. Whatever the subtleties of the problem, in the modern developed world it is relatively uncommon for someone to die of starvation, so we may need to begin counting our psychological needs among our primary needs.

We are all capable of making progress towards meeting our needs for mental wellbeing. Some of us, due to both genetic and environmental luck during our development, will find this much easier than others, and some of us will naturally be more resistant to psychological challenges and pressures. But most importantly, within our own little fortunate or unfortunate worlds, we can all make the most of the cards we are dealt.
