GUEST FORUM

Information Literacy and Confirmation Bias: You Can Lead a Person to Information, but Can You Make Him Think?
by Mark A. Allan

Abstract: Many librarians teach information literacy skills, including how to identify “fake news,” without knowingly incorporating tools to address confirmation bias. Confirmation bias is discussed, along with its inclusion within various credibility and information literacy models. Techniques that librarians can utilize or teach their patrons in order to overcome confirmation bias are presented.


Background
The recent elections have drawn attention to the amount of ‘fake news’ generated by information producers across the political spectrum. Some of this misleading content is read in its entirety and then passed on via social media; however, much of it is merely glanced at and forwarded to other individuals. Scholarship by Gabielkov, Ramachandran, Chaintreau, and Legout published in 2016 has shown that 59% of links shared on social media are forwarded without the sharer ever having clicked through to read the linked article (as cited in DeMers, 2016, p. 1).

It’s possible that in these cases a rudimentary sort of information literacy assessment has taken place. The forwarder may have done a cursory appraisal of the resource’s title: Was it from an authoritative source? Was the author an expert? If the forwarder has read the content of a social media post, he may also have evaluated it according to a set of mental criteria. Despite any such analysis, there are indications that fake news informs and reinforces viewpoints (Silverman & Singer-Vine, 2016; Firozi, 2017). One explanation for this behavior could be that readers and forwarders of incorrect or fake news are agreeing with the resource’s title and/or content based upon their own existing bias.

Needless to say, librarians who perceive themselves as gatekeepers to information are concerned. Indeed, they have the opportunity to educate students and the public about evaluating information (Lenker, 2016, pp. 511-512). When the Informed Librarian Online asked if there was a topic that interests me and that I would like to write about, I responded that I would be interested in writing about cognitive bias, in particular confirmation bias. Given a personal interest in psychology, but without a degree in this or a related field, I wondered if confirmation bias might have something to do with the consumption and dissemination of ‘fake news.’ I therefore searched the literature with an eye toward promoting a discussion of how to address confirmation bias in information literacy sessions. Any errors or biases in the following piece are my own. I welcome constructive feedback!

Confirmation bias (also known as congeniality bias or myside bias) is a cognitive bias in which individuals tend to seek out and favor information that confirms their existing beliefs (Reber, Allen, & Reber, 2009). When individuals are initially exposed to a particular viewpoint, they are likely to continue to hold that point of view even when exposed to disconfirming information (Reber et al., 2009). This bias can be extremely powerful. In one experiment published by Westen, Blagov, Harenski, Kilts, and Hamann in 2006, brain scans showed activation of an area of the brain associated with reward and pleasure when individuals resolved a quandary in a way that confirmed their initial beliefs (as cited in Shermer, 2006, p. 1).

Confirmation bias and other cognitive biases may be explained in large part by the theory that the brain has two systems for processing information (Kahneman, 2011, p. 20). The first system (System 1) includes the act of jumping to conclusions based upon past experiences or education, while the second system (System 2) encompasses critical thinking (Kahneman, 2011, pp. 20-22). Whereas System 1 is automatic, fast, and always working, System 2 is less involved in everyday tasks and is slow, lazy, and easily distracted (Kahneman, 2011, pp. 20-50). Incorrect beliefs (including biases) that emerge from System 1’s operation must be identified and countered by System 2 (Gilbert, 1991, as cited in Kahneman, 2011, pp. 80-81).

That said, we live in a world inundated with information of varying quality that supports seemingly innumerable agendas. Circumstances, events, and the information available about them change. There are also individuals motivated to trick or take advantage of people who are easily influenced by information that corresponds with their beliefs (Firozi, 2017). In order to make informed decisions, we need to use System 2. Given the power of confirmation bias in validating preexisting conclusions and shaping individuals’ information-seeking behavior, one might assume that this topic is commonly addressed in information literacy sessions. However, an informal posting to the Information Literacy Instruction Discussion List (ILI-L) drew only a few responses from librarians who are teaching about confirmation bias or other cognitive biases (Allan, 2016).

Credibility Tests, a Standard, and a Framework
Recent tests, standards, and frameworks provide varying opportunities for recognizing the importance of confirmation bias and other cognitive biases within their paradigms. It is noteworthy that the very prominent CRAP (Currency, Reliability, Authority, and Purpose/Point of View) and CRAAP (Currency, Relevance, Authority, Accuracy, Purpose) credibility tests only go so far with bias assessment. While both call for examining the content, the reputation of the author, and vested interests, these criteria may be applied subjectively. How is a reader to know if their assessment is being prejudiced by internally held beliefs that have not been examined? These tests, along with CARS and IMVAIN (Gardner, 2016), are often utilized in school and other non-college settings.

With regard to higher education, the Association of College & Research Libraries’ (ACRL) Information Literacy Competency Standards for Higher Education (Standards), approved in 2000, stated that “An information literate individual is able to... Incorporate selected information into one’s knowledge base” (ACRL, 2000, pp. 2-3). While Standard Three goes on to indicate that such a student “determines whether the new knowledge has an impact on the individual’s value system” and will use “consciously selected criteria,” it does not appear to address the importance of unconscious dispositions (ACRL, 2000, pp. 11-12).

In 2016, the ACRL rescinded the Standards and adopted the Framework for Information Literacy for Higher Education (Framework). Although the Framework’s six frames are not to be considered absolute, several of them offer room for contemplation and discussion of confirmation bias. The Authority Is Constructed and Contextual frame seems the most likely to encompass this conversation. One disposition in this frame calls on learners to “develop awareness of the importance of assessing content with a skeptical stance and with a self-awareness of their own biases and worldview” (ACRL, 2016, p. 4).

Reducing Confirmation Bias
So what tools can librarians provide students and the public to help them make better decisions when confronted with inaccurate or deliberately misleading information? Multiple studies indicate that simply knowing that bias exists does not mean that one identifies it in oneself (Pronin, Gilovich, & Ross, 2004, p. 785). Confirmation bias also affects very knowledgeable people, who often demonstrate greater bias consistent with their own outlook (Taber & Lodge, 2006, as cited in Kahne & Bowyer, 2017, p. 7). However, it appears that instructing students in finding and evaluating content does have an impact on their judgment. Research conducted by Kahne and Bowyer (2017, p. 23) indicates that media literacy education does indeed improve the judgment of students when they encounter experimental social media ‘posts.’ Librarians, pat yourselves on the back!

Librarians can provide motivation to use authoritative sources and to avoid biased content. An “accuracy motivation” (Hart, Albarracín, Eagly, Brechan, Lindberg, & Merrill, 2009, p. 558) can arise when an unbiased result is perceived as an important personal outcome (Hart et al., 2009, p. 577). Librarians can potentially communicate the importance of unbiased fact-finding and conclusion-drawing during information literacy sessions. However, accuracy motivations are not always effective. Confirmation bias was found to be larger when disagreeable information was “high or moderate in quality rather than low in quality,” possibly due to personal defensive motivations (Hart et al., 2009, p. 577). In the current social and political environment, where the public’s distrust of the media (and arguably of science and higher education) is substantial, one might be tempted to point to this result as a possible explanation.

Peer pressure may also be useful in mitigating the use of biased content. Jonas states that “impression motivation,” the attempt to achieve “favorable interpersonal consequences,” might reduce confirmation bias (2005, p. 978). Therefore, requiring individuals to present research results to an audience in which neutrality or an unbiased outlook is encouraged could curtail bias (Hart et al., 2009, p. 582). While this technique may not be possible in “one-shot” instruction sessions, it may be workable in an information literacy course or other settings. Even so, it may be of limited value because a neutral social environment would need to be established for every topic a researcher encounters.

A common strategy for spotting confirmation bias that many librarians already employ is to ask researchers to notice when a piece of information causes an emotional response. Such a response can be a red flag that the information source is itself biased and is intentionally attempting to anger the researcher through inflammatory words and/or content. Even if an analysis finds no provocative words or content in the source, the reader should attempt to objectively examine his own reaction. This is known as self-regulation: to “self-consciously monitor one’s cognitive activities, the elements used in those activities, and the results educed... with a view toward questioning, confirming, validating, or correcting either one’s reasoning or one’s results” (Facione, 2013, p. 7). An information resource that triggers an emotional or knee-jerk reaction should cause the researcher to pause, examine the source further, and seek additional information before making use of it or sharing it on social media. Therefore, librarians need to continue to encourage information seekers to monitor their own thought processes.

An additional important strategy librarians can teach is to “consider the opposite” while reading and critically evaluating information resources (Lord, Lepper, & Preston, 1984, p. 1231). In the Lord study, groups of students were given evidence on two opposing sides of a proposition. One group of students was directed to “Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue,” whereas another was directed to be “as objective and impartial as possible” (Lord et al., 1984, p. 1233). Students who considered the opposite after reading evidence for both a supporting and an opposing proposition were less polarized in belief (that is, less subject to their bias) following the assignment, whereas the group simply told to be unbiased remained polarized (Lord et al., 1984, p. 1236). Additionally, multiple studies have shown that bias might be overcome by considering multiple plausible alternatives (Lilienfeld, Ammirati, & Landfield, 2009, p. 393).

The Lord study also indicated that students instructed to “consider the opposite” were more even-handed in choosing questions for further investigation to determine whether two opposing propositions were supported (Lord et al., 1984, p. 1237). This finding suggests that students’ information search strategies would be less biased if they were asked to consider opposing or alternative explanations before researching.

Librarians need to make clear to their students that our default mode of thinking is System 1 and that System 2 needs to be activated for unbiased thought (Kahneman, 2011, pp. 24-25). Just trying to be objective is not enough; only by considering opposing and/or different views do we reduce bias. This may have an effect not only upon evaluating information but also upon searching for it.

Conclusion
Librarians need to teach that recognizing and counteracting one’s own predispositions is a necessary part of analyzing and searching for information, whether using social media or completing an assignment. Doing so not only benefits the initial reader but can also be seen as a way to limit the spread of fake news and other inaccurate information.

While some popular models for evaluating the credibility of information do not currently include the element of confirmation bias, they can easily be tweaked. CRAP or CRAAP could be revised to SCRAP or SCRAAP, with the added S indicating the importance of self-examination or self-awareness. The new tests would signal the importance of recognizing one’s own cognitive biases, including confirmation bias, at the outset of the evaluation process. While an S could instead be added to the end of the original acronym(s), doing so could imply that a person’s own cognition is the least important factor in the test. In models such as the ACRL’s Framework, confirmation bias should be addressed and ways to mitigate it should be taught.

Given the power of confirmation bias to subconsciously guide the evaluation and gathering of information, methods for minimizing it should be emphasized in information literacy sessions. If self-examination is not the first step in the evaluation process, one’s unconscious bias has already come home to roost.


References
Allan, M. (2016, November 8). Design: Teaching Cognitive Bias in Information Literacy [Electronic mailing list message]. Retrieved from http://lists.ala.org/sympa/arc/ili-l/2016-11/msg00055.html
Association of College and Research Libraries. (2000). Information Literacy Competency Standards for Higher Education. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/standards.pdf
Association of College and Research Libraries. (2016). Framework for Information Literacy for Higher Education. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/Framework_ILHE.pdf
Confirmation bias. (2009). In Reber, A., Allen, R., & Reber, E., The Penguin dictionary of psychology. London, UK: Penguin. Retrieved from http://search.credoreference.com/content/entry/penguinpsyc/confirmation_bias/0
DeMers, J. (2016). 59 percent of you will share this article without even reading it. Forbes. Retrieved from https://www.forbes.com/sites/jaysondemers/2016/08/08/59-percent-of-you-will-share-this-article-without-even-reading-it/
Facione, P. (2013) Critical thinking: What it is and why it counts. Retrieved from https://www.nyack.edu/files/CT_What_Why_2013.pdf
Firozi, P. (2017). Fake news site gains more than 1M views in less than 2 weeks. The Hill. Retrieved from http://thehill.com/blogs/blog-briefing-room/news/323256-fake-news-website-gains-more-than-1-million-views-in-less-than
Gabielkov, M., Ramachandran, A., Chaintreau, A., & Legout, A. (2016). Social clicks: What and who gets read on Twitter. ACM SIGMETRICS Performance 2016, 44(1), 179-192. doi:10.1145/2896377.2901462
Gardner, L. (2016). Teaching information literacy now. School Library Journal. Retrieved from http://www.slj.com/2016/11/industry-news/teaching-media-literacy-now/
Gilbert, D. (1991). How mental systems believe. American Psychologist, 46(2), 107-119. doi:10.1037/0003-066X.46.2.107
Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135(4), 555-588. doi:10.1037/a0015701
Jonas, E. (2005). Giving advice or making decisions in someone else's place: The influence of impression, defense, and accuracy motivation on the search for new information. Personality and Social Psychology Bulletin, 31(7), 977-990. doi:10.1177/0146167204274095
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Kahne, J., & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1), 3-34. doi:10.3102/0002831216679817
Lenker, M. (2016). Motivated reasoning, political information, and information literacy education. Portal: Libraries and the Academy, 16(3), 511-528.
Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare?. Perspectives on Psychological Science, 4(4), 390-398. doi:10.1111/j.1745-6924.2009.01144.x
Lord, C. G., Lepper, M. R., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality & Social Psychology, 47(6), 1231-1243.
Pronin, E., Gilovich, T., & Ross, L. (2004). Objectivity in the eye of the beholder: Divergent perceptions of bias in self versus others. Psychological Review, 111(3), 781-799.
Shermer, M. (2006). The political brain. Scientific American 295(1), 36. Retrieved from https://www.scientificamerican.com/article/the-political-brain
Silverman, C., & Singer-Vine, J. (2016). Most Americans who see fake news believe it, new survey says. BuzzFeed News. Retrieved from https://www.buzzfeed.com/craigsilverman/fake-news-survey
Taber, C., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769. doi:10.1111/j.1540-5907.2006.00214.x
Westen, D., Blagov, P., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. presidential election. Journal of Cognitive Neuroscience, 18(11), 1947-1958.

Copyright 2017 by Mark A. Allan.

About the author:
Mark A. Allan has been employed at Angelo State University’s Porter Henderson Library for over fifteen years. As of March, 2017, he is the Assistant Director for Research and Instruction Services. He stays active in the Texas Library Association, and may be contacted at mark.allan@angelo.edu.