GUEST FORUM


Opting Out is Not an Option: Big Tech, Algorithmic Ranking, and the Information Crossroads

by Jenny Fielding


Abstract:


In an information landscape flooded with misinformation, disinformation, and social media spin, current pedagogy around information literacy focuses (rightly) on teaching critical thinking around source evaluation. However, the ascendance of algorithms that constantly filter and personalize our information streams has created an environment in which users despair of determining credibility at all, a trend that has the potential to threaten scholarship, discourse, and democracy itself.


As 2020 progresses and conspiracy theories abound in the public consciousness, it has percolated into librarian discourse that the initial tumult around "fake news" several years ago was an information canary-in-the-coalmine moment. While certainly not missed, it was perhaps not given enough ongoing scrutiny, since the term quickly evolved into a political talking point. Clearly something was amiss in the information ecosystem when credible experts were pilloried as liars and shams while wild conspiracies gained traction, so much so that mainstream news outlets spent copious time and effort debunking them (which, ironically, may have simply served to entrench them even more in the minds of their adherents). Librarians unsurprisingly and immediately began creating tools and guides to help people discern reliable information from mis- and disinformation, and "fake news" filtered into information literacy sessions and syllabi.

These efforts, while necessary, thorough, and well-founded, have nonetheless by and large omitted a crucial piece of this troubling trend: the omnipresent yet often invisible influence of Big Tech on our information environment, which facilitates and enables it. More than that, this influence now extends far past the bounds of our laptops or even our phones since, for all intents and purposes, algorithmic ranking by tech platforms can practically dictate what information we encounter at all. Current statistics on search engine market share consistently put Google between 87% and 92% (Clement, 2020). As the monopoly player in information, what Google ranks on its first page (or autoplays in our YouTube feed) has outsized influence on what information we are exposed to and thus act on.

"When Google’s ranking algorithm puts a result for a popular search term in its top 10, that helps determine the behavior of millions of people…Most users will never look past the first page of search results, and when the overwhelming majority of people all use the same search engine, the ranking algorithm deployed by that search engine will determine myriad outcomes" (Doctorow, 2020).


This gap has begun to be more formally explored. Shoshana Zuboff’s 2019 book, The Age of Surveillance Capitalism, rings this alarm bell clearly; her call to arms focuses mainly on the far-reaching implications of Big Tech for personal autonomy and privacy. Cory Doctorow explores the same territory in his new book How to Destroy Surveillance Capitalism (published in full on Medium), counterpointing Zuboff primarily through a framework of antitrust and activist solutions to the threat of Big Tech monopolization. But a 2020 study by Project Information Literacy puts the point most succinctly for librarians and educators: "Though information literacy has grown to include news literacy in the wake of the ‘fake news’ crisis, there is little consideration of how colossal sites like these influence what we see and learn, what we think, and ultimately, who we are" (Head, Fister & MacMillan, 2020).


"Data collection is happening invisibly and constantly" (Head, Fister, & MacMillan, 2020). Not only are we perpetually tethered to our devices – computers, phones, tablets – but those devices are increasingly interconnected with each other and with other always-on smart technologies – cars, appliances, home monitoring systems, digital assistants like Siri and Alexa – meaning that our online and offline experiences are being shaped and targeted to us in new and unprecedented ways. Data is constantly being collected about us, is bought and sold with impunity, and may influence things like college acceptance, insurance rates, mortgage approvals, or job offers. But Big Tech has amassed so much monopolistic power that it has left people with no practical choice but to engage with it to accomplish fundamental tasks of life. Opting out is no longer a viable option.


In addition to Google’s monopolistic hold on search, we also curate our own information ecosystems in our social media feeds (many of which are owned by the main social media monopoly player, Facebook) and in our shopping (the Amazon monopoly), all of which target ads, information, and suggestions that confirm our biases and create suspicion of information outside our digital bubble. "Decades of media consolidation, deregulation, and economic trends combined with the rise of social media platforms that are designed for persuasion but have no ethical duty of care, have contributed to engineered distrust of established knowledge traditions such as journalism and scholarship" (Head, Fister & MacMillan, 2020).


This constant information torrent leaves people feeling helpless, or with the impulse to retrench into familiar points of view and biases. Conflicting information about the same events leaves more than a third of students saying they distrust the credibility of all news (Head et al., 2018). While this is a byproduct of the current climate of information saturation, it is also a feature of a deliberate strategy by groups who benefit from a muddled information picture – conspiracy theorists, disinformation agents, and malicious actors. "Flooding the zone" with "a dizzying number of narratives and counternarratives" leaves people with an epistemological crisis of confidence in what is even knowable, resulting in disengagement and cynicism (Illing, 2020). As a student in Head, Fister & MacMillan’s 2020 study articulates, "It’s not that we’re lacking credible information. It’s that we’re drowning in like, a sea of all these different points out there, and people are willingly giving themselves up to participate in that sea." Interestingly enough, this skepticism does not seem to translate to the main tool we use for search itself. Google is somehow placed outside the content found on its platform – it is viewed as ‘other,’ an ‘arbiter’ – and public trust in the brand ranks extremely high (Newton, 2020). And so the question of algorithmic ranking and targeted content, which influences almost all of our online interactions, is absent from most information literacy instruction, as well as from college-level critical thinking exercises and assignments.


What We Can Do


It is imperative that, as educators, we confront the fact that our established pedagogies and strategies may not be enough to meet this information tipping point. Head, Fister & MacMillan note, regarding college assignments and courses, that they "do not address the significant social and ethical questions raised about the workings of influential information systems on the public sphere…at a time when falsehoods proliferate and trust in truth-seeking institutions is being undermined." Students themselves recognize this: sessions on domain names are scoffed at as "just kind of outdated for the caliber that the internet is today" (2020). Librarians, faculty, and educators need to realize that much of our teaching still engages with the internet as if it were static, instead of acknowledging the massive implications of a networked world where collection and exploitation of ambient information is the norm. This becomes even more urgent when these systems are examined in light of the inequity they often uphold and reinforce. "Marginalized groups face higher levels of data collection when they access public benefits, walk through highly policed neighborhoods, enter the health-care system, or cross national borders. That data serves to reinforce their marginality when it is used to target them for suspicion and extra scrutiny" (Eubanks, 2017).


This author has advocated for more sophisticated methods of addressing these issues (Fielding, 2019; Fielding, 2020), and Wineburg et al. have provided an excellent interdisciplinary model that speaks to the importance of this topic for civic engagement and democracy (2019). But it is clear that, as a profession, librarians should be leading the charge to de-monopolize Big Tech and advocate for digital privacy rights. These trends, when left unchecked and unexamined, foreshadow dire consequences for scholarship, for our democracy, and for our discourse. Work from organizations like the Electronic Frontier Foundation should be reviewed to supplement discussions around search, information, and privacy. As Doctorow notes, "In a world as complex as this one, we have to defer to authorities…(but) the collapse of the credibility of our systems for divining and upholding truths has left us in a state of epistemological chaos" (2020). Shoring up these systems and holding Big Tech accountable for its monopolistic clout ensures not just the ongoing relevance of libraries, but the accessibility and freedom of information as we move past the dawn of the digital age.


References


Clement, J. (2020). "Global market share of search engines 2010-2020." Statista.com. https://www.statista.com/statistics/216573/worldwide-market-share-of-search-engines/


Doctorow, C. (2020). "How to destroy surveillance capitalism." OneZero: Medium.com. https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59


Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.


Fielding, J. (2019). "Rethinking CRAAP: Getting students thinking like fact-checkers in evaluating web sources." C&RL News. 80 (11): 620–622. https://doi.org/10.5860/crln.80.11.620


Fielding, J. (2020). "Evolving our pedagogy in a post-truth landscape." Informed Librarian Online. https://www.informedlibrarian.com/GuestForum.cfm?FILE=gf2004.html


Head, A., Fister, B. & MacMillan, M. (2020). "Information literacy in the age of algorithms: Student experiences with news and information, and the need for change." Project Information Literacy. https://www.projectinfolit.org/uploads/2/7/5/4/27541717/algoreport.pdf


Head, A., Wihbey, J., Metaxas, P.T., MacMillan, M., & Cohen, D. (2018). "How students engage with news: Five takeaways for educators, journalists, and librarians." Project Information Literacy. https://projectinfolit.org/pil-public-v1/wp-content/uploads/2020/08/newsreport.pdf


Illing, S. (2020). "'Flood the zone with shit': How misinformation overwhelmed our democracy." Vox.com. https://www.vox.com/policy-and-politics/2020/1/16/20991816/impeachment-trial-trump-bannon-misinformation


Newton, C. (2020). "The Verge tech survey 2020." The Verge. https://www.theverge.com/2020/3/2/21144680/verge-tech-survey-2020-trust-privacy-security-facebook-amazon-google-apple


Wineburg, S., Breakstone, J., Smith, M., McGrew, S. & Ortega, T. (2019). "Civic online reasoning: Curriculum evaluation." Stanford History Education Group. https://cor.stanford.edu/


Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: PublicAffairs.


Copyright 2020 by Jennifer A. Fielding


About the author: Jennifer A. Fielding is the Coordinator of Library Services for the Lawrence Campus at Northern Essex Community College in Massachusetts. Her research interests are information equity, algorithmic bias, and information literacy pedagogy.