Author Archives: pcrickma

What are the Odds you were Right the First Time?

How many times have you been taking an exam and gone with your gut? How many times has your intuition been right, and how many times have you changed your original answer and gotten it wrong? I have discussed this very issue with classmates before, and from my limited, anecdotal evidence I can say that people resoundingly believe their first “snap decision” was most likely to be correct. Of course, as we have learned, my memory of these conversations is subject to all sorts of biases. Namely, I am probably influenced by confirmation bias, because I believe my first hunch is right most of the time. But let’s assume that I wasn’t heavily influenced by bias, and my memory serves me correctly. Even if we all knew we had a better chance of getting the answer right on our first hunch, I would assume a solid proportion of us would still go back, engage more System 2 processing, and proceed to get the answer wrong after falsely reasoning our way into it. Thankfully, we don’t have to rely on just my experience with a few classmates for our understanding of decision making; for that, we have cognitive psychologists, sociologists, and other scientists who do a much nicer job.

That being said, we do have to rely on our subjective experience of the world very often, and use our knowledge to make decisions, often quickly. It might be ideal to have every bit of information necessary to make a reasoned decision, but gathering it is also incredibly time consuming. Heuristics are shortcuts that are quick and roughly accurate, but they can have grave implications if used in the wrong context. The literature on decision making and our use of heuristics is divided. Some researchers are in the “people are stupid and we should stop them from making stupid decisions” camp, and others are in the “heuristics are useful, and we can be rational decision makers” camp. Upon realizing the nature of this conflict, I too became conflicted. Is it the case that we are poor decision makers, and are thus justified in going back through our exam and answering questions differently? Or do we have a better shot with our initial, System 1-borne gut instinct? In order to reconcile this dilemma, I wanted to get a sense of the literature on both sides, and also to find a couple of non-scholarly articles with opposing ideas.

Tversky and Kahneman famously argued that we are not rational decision makers. They argued against the expected utility model, which suggested that people maximize their utility according to an economic formula, and for prospect theory, which is characterized by an important reference point and a steeper slope on the “loss” side of the value function. Some of their work is relatively uncontroversial, but the controversy comes in when they catalogue the errors we make because of things like base rate neglect, the availability heuristic, and the gambler’s fallacy, to name a few. Their research suggests that we are subject to a number of errors that make us unequivocally bad at making decisions. Gigerenzer, Cosmides, and Tooby all posit that Tversky and Kahneman weren’t quite fair to us, and they did research to illustrate the ways in which our intuitions and heuristics can serve us rather than hurt us. Beginning with the presupposition that Tversky and Kahneman’s research wasn’t externally valid, they set out to find the ways in which our heuristics help us. In “Bounded Rationality,” Gigerenzer outlines research suggesting that we are equipped with heuristics that work well, and that evolution has left us with an “adaptive toolbox.” In essence, the postulation is that humans are not great in lab scenarios that are inherently unlike reality, but in the real world the same processes that break down in the lab will actually serve us.
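That asymmetric value function can be made concrete. Below is a minimal Python sketch of the prospect-theory value function, using the median parameter estimates Tversky and Kahneman later reported (alpha ≈ 0.88 for diminishing sensitivity, lambda ≈ 2.25 for loss aversion); the function name and the reference-point argument are my own illustration, not part of the original theory’s presentation here.

```python
ALPHA = 0.88   # diminishing sensitivity (curvature of the value function)
LAMBDA = 2.25  # loss aversion: losses are weighted ~2.25x more than gains

def prospect_value(x, ref=0.0):
    """Prospect-theory value of outcome x relative to a reference point.

    Gains are valued as (x - ref) ** ALPHA; losses are additionally scaled
    by LAMBDA, which is what makes the curve steeper on the loss side.
    """
    d = x - ref
    if d >= 0:
        return d ** ALPHA
    return -LAMBDA * ((-d) ** ALPHA)

# Loss aversion in action: a $100 loss hurts more than a $100 gain pleases.
gain = prospect_value(100)
loss = prospect_value(-100)
print(gain, loss)  # the loss looms larger in magnitude than the gain
```

Note that shifting `ref` moves the whole curve: the same $50 outcome feels like a gain or a loss depending on what you expected, which is exactly the “reference point” idea in the text above.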


So which theory is most correct? I looked at a couple of articles to try to get a better understanding of which side of the argument appeared to have the upper hand. In a Forbes article entitled “How to Trust Your Intuition and Listen to Your Gut,” the author details some tips on how to more readily trust your first reactions. The underlying heuristics are not mentioned explicitly, but their influence is evident. With tips like “Pay attention to your bodily reactions” and “Stop and attend when your body signals you,” it’s clear that the author is calling on readers to rely more on System 1 processes, and to suppress the tendency to overthink and overcomplicate. I think the oversight in this article is that some situations simply call for a more algorithmic approach to problem solving. Another article, from Psychology Today, entitled “Should You Trust Your Gut?” takes a more comprehensive approach to decision making and intuition. The author offers some helpful instructions on how to slow down your snap decisions and reason your way through your problems. Of course, there is a fundamental oversight here as well: sometimes we just don’t have time, or shouldn’t take the time, to utilize these techniques. The article goes on to explain some mindfulness practices that might help one curb one’s initial instinctual reactions. In actuality, it’s clear that sometimes it is better to let your instinctual reactions guide your decisions.


Unfortunately, it appears there is no hard and fast rule on whether we should trust our first reactions. That being said, I think that understanding the literature in the context of how you experience the world is key to deciphering which system you should bank on in a given situation. Sometimes we can be tricked, bamboozled, or misled by clever framing techniques or by someone trying to take advantage of our heuristics. A lot of the time, though, we can actually trust our gut more than we are inclined to; after all, millions of years of evolution stand behind our most instinctual intuitions. We are nothing if not creatures shaped for survival and the perpetuation of our genes, and we can unconsciously make sense of a totally ambiguous world. This leads me to believe that our first answer might be the right one, so next time you second-guess yourself on an exam, consider just leaving it alone.



False Memories in History

If I told you that there were actually 14 original American colonies, you might believe me. You could think, “That’s plausible enough to be true,” or “I can’t imagine why someone would lie about that,” and you might store that as a true statement in your memory. Even if you didn’t, though, you might still store it in your memory as something that is false. Let’s say you were confronted with the same piece of false information a couple of weeks later. Would you be more inclined to believe it was true than before? Research done by Begg, Anas, and Farinacci (1992) suggests that you would, in fact, have a higher chance of believing it, especially if you didn’t remember that I was the source of the original false claim. This phenomenon is known as the illusory truth effect, and it strikes me as something that is important to understand in the face of an uncertain future and an ever-changing past.


Of course, I don’t mean that our past is actually changing. I just mean that the way we conceptualize it is changing, and thus our collective conception of history is subject to change, just like our own personal memories. For instance, the American populace seems to substantially edit historical events in order to allow for a more optimistic interpretation of our past. Similarly, we make autobiographical memory errors when recalling an event in which we may have behaved unsavorily. On the individual level this appears largely trivial, but when it happens on a large scale, the implications are far graver.

An article, linked below, by Benjamin Mitchell-Yellin entitled “Misremembering American History” explores some of the reasons why our history is being distorted, why we need to be concerned, and what we can do to combat misinformation. The author uses the events at Charlottesville as an example of how our historical misconceptions shape the way we view more recent events. He argues that, although we might not like to think so (which may be part of the problem), the events in Charlottesville were more an iteration of our past than an uncharacteristic or freak occurrence. The hashtag #ThisIsNotUs became popular in the aftermath of the event, which Mitchell-Yellin found problematic because it suggests a willful ignorance of our past. He argues that implicit racial biases might be the culprit in our distortion of history. Racial biases could indeed be the sole corrupter, but I prefer a more charitable interpretation.


As we have learned in class, our memories can’t always be trusted. They are subject to intrusion errors, schema-based inferences, and proactive and retroactive interference. False memories can even be created, for instance through therapy, even when they have no basis in reality. But how, one might ask, is our history similar to an individual’s memory? Isn’t American history well enough established that it’s not subject to change? I would argue that our history is in fact subject to change, so long as there are enough people susceptible to believing nice-sounding lies. In some ways, our collective memory can even entrench more falsehoods than an individual’s might, because, as I alluded to above, lies that are repeated will often be recalled as truths. By that token, the spread of a lie looks like an exponential function: the more it is reiterated, the more likely it is to be recalled as truth. It also happens that people have a tendency to recall positive events before negative ones, as Waldfogel (1948) found. So our muddled ideas of history might simply be a case of false information being spread that we like to remember because it is more pleasant than the truth.
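To see why repetition compounds so quickly, here is a toy Python model (my own illustration, not drawn from the cited research): assume each person who has heard a claim repeats it to a fixed number of brand-new listeners every round, so total exposure grows geometrically.

```python
def heard_after(rounds, k=3, seeds=1):
    """Toy model: people who have heard a claim after `rounds` of retelling.

    Assumes every current hearer passes the claim along to `k` fresh
    listeners each round, so exposure grows as seeds * k ** rounds.
    This ignores overlap (hearing the claim twice), which in reality
    would make the illusory truth effect even stronger, not weaker.
    """
    heard = seeds
    for _ in range(rounds):
        heard *= k  # each hearer recruits k new listeners
    return heard

# Five rounds of retelling take a single seed to hundreds of exposures.
print([heard_after(t) for t in range(6)])  # [1, 3, 9, 27, 81, 243]
```

Even with a modest `k`, the curve explodes, which is the point: a falsehood does not need many believers at the start, just a few retellings.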

To play devil’s advocate, let’s say that the forums for false information disappeared entirely, and we could only access true information. This would fix our problem of misremembering, right? Maybe it would, but the possibility of fudging the facts would still remain. Take, for instance, the study conducted by Bartlett (1932) as an example of how a story can change its complexion over repeated retellings. As I’m sure you remember from class, Bartlett had students read a folk tale called “The War of the Ghosts” and recall it repeatedly; their recollections changed considerably over time. It can be inferred that they changed the story in ways that made it more congruent with their worldview. Today, we have access to an unlimited amount of information, and it stands to reason that we, too, alter information so that it is more aligned with the way we feel about the world. Even in the face of facts, we will often find a way to change them to fit our existing schemas.

In a continually polarizing nation, it might be useful for us to know the mechanisms by which we change the information we consume, so we might try to limit the degree to which this happens. A day might come when we have changed our past beyond recognition, and stopping the dissemination of falsehoods might be impossible. However, understanding how our memories often fail us, and how truly fluid they can be, might be a tangible step in the direction of a brighter, less biased future.


You’ve Probably Forgotten Some of Your Most Vivid Memories

Think back to a time in your life that was particularly memorable, or maybe even catastrophic. How vivid are those memories? How many details can you recall from that hour, day, or week? It is likely that you are able to remember quite a bit of what happened. As we have learned in class, memories are more easily formed when they have emotional significance. This is thought to be the case in part because the amygdala sits close to the hippocampus, and the two seem to work together in encoding and recalling emotionally charged memories. Some might even find it cruel that you can remember the atrocities of your life more readily than the piece of important information you repeated to yourself about a thousand times before taking a test. That is why elaborative rehearsal that draws on emotion works better than the simple maintenance rehearsal we often use. In the New Yorker article “You Have No Idea What Happened,” the author delves into the topic of poignant emotional memories, and, like me, you might be surprised by what they found.

Ulric Neisser was a cognitive psychologist who, after the Challenger exploded in 1986, set out to research the accuracy of people’s memories of the event. It turns out that a few years after it happened, people were very bad at recalling the details of what happened to them that day: who they were with, what they were doing, and so on. However, they were very confident in their false memories. Neisser’s findings piqued the interest of cognitive neuroscientist Elizabeth Phelps, who would go on to devote much of her career to studying emotional memories and why we get them wrong so often.


Phelps posits that we do remember the core details of an emotional experience, but we do not recall the peripheral details as well. As I remember from class, Dr. Rettinger gave the example of a tiger attacking you at the watering hole: if you were asked to recall how many stripes the tiger had, you would likely not know, or, more appropriately, not care. This is what likely happens to us in the case of emotional memories. We are able to recall the thing that was scary or sad, but we aren’t as good at getting every little detail right. That’s why, when someone says they “remember it like it was yesterday,” they almost certainly don’t; they just think they do, because the important part of the memory is almost palpable to them. Research has shown that the small details we like to think we know are inaccurate at best, and that our misplaced confidence in them can be gravely dangerous. What I mean by the latter is that eyewitness testimony is still admissible in court, and is often the primary means of securing a conviction.

As this article and the supporting research suggest, we are confident in our false memories, so it is conceivable that a witness could help convict someone of a crime they have very little memory of, simply because they are confident they know what they saw. Unless they were hallucinating (which is another conversation entirely), it is reasonable to believe that they do in fact know what they saw. However, what they don’t know is the more important thing to focus on: what the person’s face looked like, what they were wearing, how tall they were, and so on. These are all irrelevant details when your emotional memory tunnel is tuning your sensory systems to obtain only the most important things: whether you’re in danger, where the danger is, and how to get away from it. So the peripheral details, which got only a tiny portion of our attention, are what get people convicted who might have been innocent.

If there is one thing I have learned in our class, it is that our brains and visual systems are incredible at creating order in the face of ambiguous situations. When we don’t see an entire object or person, we fill in the rest. When a word is misspelled, we often don’t even notice, because our brain has already corrected the mistake and we skim right by it. By the same logic, we might fill in the remaining details of an event we are recalling, especially if it happened a while ago. We get the central point, like the Challenger exploding or being attacked by a dog, and we forget the details quickly, then make them up later to retain continuity in our memories. Think about your most vivid memory, and then consider the possibility that much of that memory has been forgotten and manufactured over the years. It wasn’t yesterday, and if it was, write it down.

Referenced Article:


Some of us still attribute addiction to a lack of discipline. Here is why it is a chronic brain disease.

This article, written for Psychology Today by Shahram Heshmat, outlines ten reasons for addiction and its persistence even after people recognize that they have a substance abuse issue. The ten reasons discussed are as follows: (1) genetic vulnerability, (2) self-medication, (3) lack of alternative rewards, (4) impaired insight, (5) a love-hate relationship with the drug, (6) deadly attraction, (7) falling off the wagon, (8) overvaluation of the immediate reward, (9) stress, and (10) projection bias. This article does a good job of explaining, in layman’s terms, how addiction can manifest itself in a person and wreak havoc in their life. Some solid outside evidence was brought in to back up claims like, “There is substantial evidence for a genetic predisposition to develop addiction” (Heshmat, 2018). Overall, the article does well, if not superbly, in explaining why addiction is so prevalent when the remedy for addicts seems so simple. It unfortunately is not as easy as employing self-discipline; there are more factors at play than most people are aware of.

While this article does a great job of explaining addiction’s origins and how it might sink its teeth in and take hold of someone’s life, it does little in the way of explaining how one might loosen addiction’s grip, or conquer it altogether. For most people who suffer from this affliction, it is more complicated than simply kicking the habit. A lot of people can stop, but few can stay stopped. In this article, mindfulness was briefly mentioned as an important facet of recovery, but this sort of strategy is worth more than a quick mention; it may be the ticket to normalcy that so many long for. Many addicts fall into a pattern of thinking centered on justifying, rationalizing, and minimizing their drug use. Cognitive dissonance arises when a person’s behavior is incongruent with their beliefs, and so either the behavior has to change or, more often, the cognition has to change to be more accommodating of the behavior. If one wants to escape the pattern of undesired behavior and the irrational thinking that follows, one must first be aware that one’s thinking is flawed.

Mindfulness techniques allow one to become more self-aware, something that a lot of addicts and alcoholics desperately need. Heshmat posits that one of the reasons people continue to use drugs is impaired insight, or a lack of awareness of one’s own cognitions and beliefs. This particular shortcoming is also believed to be associated with dysfunction of the insular cortex (Heshmat, 2018). The brain does not function optimally in those who are addicted to drugs, and that dysfunction may play a role in the perpetuation of their condition. However, if one can learn to become aware that one’s thought processes have been impaired by the substance, one is more likely to make headway in eventually breaking free from it entirely. Of course, there are a multitude of ways to treat addiction, but being aware of how it interacts with our brains will likely play an integral role in its eradication.

Articles like this make me hopeful that addiction will eventually be seen by the vast majority as a treatable malady, rather than solely an issue of self-discipline. The more we understand about how specific areas of the brain are affected by addiction, the better we will be able to identify and treat the accompanying maladaptive thought patterns. As our understanding progresses, so too will the diagnosis, treatment and reintegration of sufferers.

Below are the original sources cited in the article:

al’Absi, M. (2007). Stress and Addiction: Biological and Psychological Mechanisms. Academic Press.
Bickel, W. K., Johnson, M. W., Koffarnus, M. N., MacKillop, J., & Murphy, J. G. (2014). The behavioral economics of substance use disorders: Reinforcement pathologies and their repair. Annual Review of Clinical Psychology, 10, 641-677.
Field, M., Munafò, M. R., & Franken, I. H. A. (2009). A meta-analytic investigation of the relationship between attentional bias and subjective craving in substance abuse. Psychological Bulletin, 135, 589-607.
Hart, C. (2013). High Price: A Neuroscientist’s Journey of Self-Discovery That Challenges Everything You Know About Drugs and Society. Harper.
Heyman, G. M. (2009). Addiction: A Disorder of Choice. Cambridge, MA: Harvard University Press.
Khantzian, E. J. (2012). Reflections on treating addictive disorders: A psychodynamic perspective. The American Journal on Addictions, 21, 274-279.
Kreek, M. J., et al. (2005). Influences on impulsivity, risk taking, stress responsivity and vulnerability to drug abuse and addiction. Nature Neuroscience, 8(11), 1450-1457.
Kringelbach, M. L., & Berridge, K. C. (2009). Towards a functional neuroanatomy of pleasure and happiness. Trends in Cognitive Sciences, 13, 479-487.
Marlatt, G. A., & Witkiewitz, K. (2005). Relapse prevention: Maintenance strategies in the treatment of addictive behaviors. In G. A. Marlatt & D. M. Donovan (Eds.), Relapse Prevention for Alcohol and Drug Problems (2nd ed., pp. 1-44).
Naqvi, N. H., Rudrauf, D., Damasio, H., & Bechara, A. (2007). Damage to the insula disrupts addiction to cigarette smoking. Science, 315, 531-534.
Paulus, M. P., & Stewart, J. L. (2014). Interoception and drug addiction. Neuropharmacology, 76(Pt B), 342-350.
Volkow, N. D., & Baler, R. D. (2014). Addiction science: Uncovering neurobiological complexity. Neuropharmacology, 76, 235-249.

Test Post

Hi All,

My name is Peyton, and it took me way too long to figure out how to post to our blog. Also, I am still not sure I did it correctly. Anyway, hope you all are having a great week.