Analyzing Digital Culture 101: Media and Democracy
Media are now so woven into our everyday lives that we tend not to notice their influence and power anymore. For this reason, the Analyzing Digital Culture series explores key approaches to the analysis of new media and reflects on their social and cultural significance from different theoretical perspectives. The series touches on some of the most prominent debates on digital media, such as their economy, their emancipatory and limiting power, content recommendation systems, the circulation of memes and viral content, and selfie culture.
Analyzing Digital Culture 101 will be divided into the following chapters:
Digital Culture 101: Media and Democracy
Digital Culture 101: Online Business Models
Digital Culture 101: Recommendation Systems
Digital Culture 101: Cultural Meaning of Memes
Digital Culture 101: The Attention Economy
Digital Culture 101: Media and Democracy
“The word democracy comes from the Greek words ‘demos’, meaning people, and ‘kratos’ meaning power; so democracy can be thought of as ‘power of the people’: a way of governing which depends on the will of the people,” states the Council of Europe’s website. These words highlight the pivotal role of the people’s will in democratic government. The Council of Europe also identifies two key principles underpinning the idea of democracy: individual autonomy, meaning people should be able to control their own decisions and lives without imposition by others, and equality, meaning everyone should be able to influence society in the same way. For this reason, citizens’ information diet is an essential matter: it is what sustains or undermines democracy. This is demonstrated by the many past and present dictatorships and authoritarian governments, such as Francisco Franco’s and Xi Jinping’s, that use censorship as their main tool to control the population. Today’s major source of information is the internet, a universe in which the world’s information is hyperlinked together, and social media are the applications that let people access this information easily. Gottfried & Shearer (2016) show that a majority of people rely on social media to stay informed about society and the world. It is therefore essential to understand whether social media are emancipatory tools or whether they undermine democratic processes.
Social media’s undermining effects
Many media scholars have contributed to the debate on the influence of social media on democratic processes. Sunstein and Pariser, two well-known writers in the media studies field, coined two terms to explain how social media jeopardize democracy: echo chambers (Sunstein, 2001) and filter bubbles (Pariser, 2011). The two phenomena are closely connected: the echo chamber metaphor addresses social media’s tendency to connect like-minded people (Bruns, 2019), while filter bubbles [Figure 1] refer to the many affordances of social media that show users extremely personalized information in their feeds (Baer, 2016). Both phenomena minimize the possibility of serendipity, that is, the chance of encountering opposing points of view. Bruns (2019) explains that it is human nature to surround ourselves with people who agree with us or hold similar views. Before social media, this was never a problem, because people received information from many different channels, so there was always a chance for serendipity. Baer (2016) points out that the extreme personalization of information has become problematic because many people consume news purely online, through their social feeds. In this way, people’s biases are constantly confirmed. Most importantly, such users cannot shape their opinions on varied information and lose the opportunity to change their minds. In this regard, Baer (2016) argues that although broadcast newsrooms do not operate perfectly, they are more complete than social media news feeds because they report opposing and contrasting ideas.
Figure 1: Visual representation of filter bubbles (Kelly & Francois, 2018).
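The feedback loop Baer describes, a feed that ranks content by similarity to what a user already engages with, can be sketched as a toy simulation. This is not any platform’s actual algorithm; the interest scores, item counts, and update rule below are invented purely for illustration:

```python
import random

def personalize_feed(user_leaning, items, top_k=5):
    """Return the top_k items whose leaning is closest to the user's.

    user_leaning: a float in [-1, 1] standing in for a political leaning.
    items: candidate items, each a float leaning in [-1, 1].
    """
    return sorted(items, key=lambda item: abs(item - user_leaning))[:top_k]

def simulate(steps=20, seed=0):
    """Toy loop: the feed tracks the user, and the user drifts toward the feed."""
    rng = random.Random(seed)
    leaning = 0.2  # user starts slightly off-centre (invented value)
    spreads = []
    for _ in range(steps):
        items = [rng.uniform(-1, 1) for _ in range(50)]
        feed = personalize_feed(leaning, items)
        # The user's views drift toward the average of what the feed shows.
        leaning = 0.9 * leaning + 0.1 * (sum(feed) / len(feed))
        # Record how wide a slice of the opinion space the user actually saw.
        spreads.append(max(feed) - min(feed))
    return leaning, spreads

final_leaning, spreads = simulate()
```

Even in this crude sketch, the spread of opinions the user is exposed to stays a narrow sliver of the full [-1, 1] range: the feed only ever confirms what the user already leans toward, which is the serendipity loss Sunstein and Pariser describe.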
Bakshy et al. (2015) and Frenkel (2018) demonstrate the power of people’s will and awareness. The two studies show the great influence that social media have on people’s opinions and, consequently, on elections. Bakshy, Messing, and Adamic (2015) are three Facebook researchers who carried out a study in 2015 and published it in the leading multidisciplinary journal Science. The data scientists found that Facebook users who self-reported conservative ideologies received about 5% less ideologically diverse content than moderate and liberal users. This is caused by Facebook’s algorithm, which tailors the news feed according to a user’s friends. Consequently, the more users share their ideology and political views, the more these are confirmed by personalized content, and the less likely they are to adopt a different opinion (Baer, 2016). While discussing the study’s implications, Baer (2016) also notes that a specific affordance of Facebook allows the creation of groups of people sharing the same opinions, such as RightAlerts or Being Liberal.
Bakshy et al.’s (2015) findings support and complement Frenkel’s (2018) article. Frenkel describes Facebook’s great power in today’s society as having “connected more than 2.2 billion people, a global nation unto itself that reshaped political campaigns, the advertising business and daily life around the world. Along the way, Facebook accumulated one of the largest-ever personal data repositories.” Gottfried & Shearer (2016) reported that 67% of US adults use Facebook, while 44% get news from the platform. Moreover, with power come responsibilities. Frenkel (2018) explains how, in 2017, accumulated evidence against Facebook proved its role in “disrupt[ing] elections, broadcast[ing] viral propaganda and inspir[ing] deadly campaigns of hate around the globe.” The accusations concerned Facebook’s leak of the personal information of millions of people [Figure 2]. The data were harvested by Cambridge Analytica, a firm associated with Donald Trump that assisted him in the 2016 presidential campaign. Frenkel claims this happened because Zuckerberg (Facebook’s founder) and Sandberg (Facebook’s chief operating officer) were too focused on rushing the company’s growth and ignored the warning signs they received. Both Benkler (2017) and Baer (2016) argue that Trump’s unexpected triumph in the 2016 elections resulted from this strongly pro-Trump conservative media sphere.
Bruns’ (2019) critique of filter bubbles and echo chambers
Bruns (2019) describes the filter bubble and echo chamber metaphors as “an unfounded moral panic that presents a convenient technological scapegoat for a much more critical problem: growing social and political polarization.” Although social and political polarization is a complex topic worth discussing, the writer chooses to focus on questioning the nature and existence of the two terms. Bruns justifies this by noting that Sunstein and Pariser never provided concrete definitions for the two notions. The terms have been used widely by academics and journalists because of their commonsensical appeal, yet they remain vague concepts. This definitional problem created confusion around them, and they are often used as interchangeable synonyms. The ambiguity leads the scholar to ask where the line falls between differences in the content seen by multiple users that result from filter bubbles and echo chambers, and differences that simply express different interests. A second problem he highlights is that, long before social media and online platforms, people tended to get information from like-minded sources and to gather in groups based on their interests and preferences. Bruns therefore claims that social media do not provoke filter bubbles and echo chambers, because these have always existed. From this he concludes that, since democratic governments were established in the past and still exist today, neither filter bubbles and echo chambers nor social media are what undermine democracy.
Bruns supports his claims by analyzing Pariser’s (2011) example of the different results his friends obtained when they searched for the name of the energy company BP, despite sharing the same background, origins, and political views. Pariser (2011) claims that the results differed because the filter bubble effect encloses each user in a circle of extremely personalized information [Figure 3]. Bruns (2019) dismisses Pariser’s (2011) example as a mere anecdote that provides no evidence for the existence of filter bubbles and echo chambers, because several studies (Haim, 2018; Nechushtai & Lewis, 2019; Krafft et al., 2018) found the opposite effect. Researchers demonstrated that different users searching for the same term obtained extremely similar content; Bruns notes that 5 to 10% even received identical results in the same order. Differences in the results were notable only when users searched from different countries and in different languages. This points to an opposite problem: “if there are echo chambers or filter bubbles in search at all, they appear to encapsulate the entire population of a given country rather than fragment it into separate groups” (Bruns, 2019). This conclusion stretches Sunstein and Pariser’s terms, which concern individuals and groups, to a breaking point, and leads Bruns to claim that a higher degree of personalization would actually be beneficial for producing well-informed citizens.
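The audit logic behind the studies Bruns cites comes down to comparing the result lists that different users receive for the same query. A minimal version of that comparison can be written in a few lines; the domain names below are hypothetical stand-ins, not data from those studies:

```python
def overlap(a, b):
    """Fraction of results shared between two top-N result lists."""
    return len(set(a) & set(b)) / max(len(a), len(b))

# Hypothetical top-5 search results for two users querying "BP"
user_a = ["bp.com", "en.wikipedia.org", "news-site-1", "news-site-2", "blog-3"]
user_b = ["bp.com", "en.wikipedia.org", "news-site-1", "news-site-4", "blog-3"]

print(overlap(user_a, user_b))   # 0.8: the lists are largely the same
print(user_a == user_b)          # False: similar, but not identical in order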
Sandvig’s (2014) critique of algorithmic recommendation systems
Although Bruns (2019) debunks Sunstein’s and Pariser’s theories about filter bubbles and echo chambers, Sandvig (2014) manages to refute Bruns’ conclusion that social media cannot undermine democracy. Sandvig explains that “at the core of [social media] sit algorithms that provide functions like social sorting, market segmentation, personalization, recommendations, and the management of traffic flows from bits to cars.” Here Sandvig addresses the great power that algorithms have over the distribution and flow of content on social media. The scholar presents this as a problem because of the large gap in knowledge about algorithms and about the ways to analyze and control them. This becomes particularly important given the ever-increasing amount of computational infrastructure in our society. In this regard, Sandvig explains that the problem arises when these algorithms are manipulated. Algorithmic manipulation can be done for many reasons, and it is not always illegal or directly harmful to users. Sandvig focuses precisely on this grey zone, where regulations cannot be applied and the public lacks the tools to react but has the right to be made aware.
Sandvig gives two examples to show that algorithmic recommendation systems can suppress alternative information sources in favor of mainstream ones, depress political participation, and ultimately harm democracy. The first example concerns Google’s hard-coded rules that prioritize Google services in its top results, despite the company’s 2010 promise not to do so [Figure 4]. With this example, Sandvig shows that although Google acted unlawfully, it did not harm people directly: Google’s users remained satisfied with the information they found and still considered the results useful. Nevertheless, the company misused its power for additional benefit and violated antitrust regulations. The second example concerns YouTube and the Reply Girls. One of YouTube’s features allowed YouTubers, the platform’s video uploaders, to connect their videos to others discussing similar topics as “video responses”. Viewers would then receive recommendations for related videos, creating a conversational flow and promoting the exchange of different ideas. However, this feature was often abused by users like the Reply Girls, who connected their videos to popular topical videos purely to inflate their view counts, not to respond to them or because they discussed similar topics. The Reply Girls did this by choosing cleavage shots as their video thumbnails, which easily attracted users’ interest and pushed them to play the video. Sandvig explains that this became problematic not only because it interrupted the conversational flow and lowered average content quality, but also because topical content most often involves news and politics. Consequently, YouTube’s algorithmic recommendations ended up producing misogyny and distracting viewers from political engagement.
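The Reply Girls case illustrates a general failure mode: a recommender that optimizes engagement alone can be gamed by clickbait. The sketch below is a deliberately naive ranking by click-through rate, with invented titles and numbers; it is not YouTube’s actual system:

```python
def recommend(videos, top_k=3):
    """Rank candidate response videos purely by click-through rate (CTR).

    Each video is (title, clicks, impressions). Because topical relevance
    is ignored, an eye-catching thumbnail beats a genuine response.
    """
    by_ctr = sorted(videos, key=lambda v: v[1] / v[2], reverse=True)
    return [title for title, _, _ in by_ctr[:top_k]]

videos = [
    ("thoughtful policy response", 120, 4000),  # CTR 0.03
    ("reply-girl clickbait",       900, 6000),  # CTR 0.15
    ("topical news follow-up",     200, 5000),  # CTR 0.04
    ("unrelated clickbait",        500, 4000),  # CTR 0.125
]
print(recommend(videos))
# → ['reply-girl clickbait', 'unrelated clickbait', 'topical news follow-up']
```

A real system weighs many more signals, but the core vulnerability, optimizing a proxy (clicks) rather than the goal (conversation), is exactly the dynamic Sandvig describes.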
For these reasons, Sandvig concludes his article by stating that more research should be done with the aim of understanding and controlling algorithms because “worrisome algorithms may span a great range of inquiry from unlawful monopoly leveraging to public-interest-minded research about the broader media and information environment.”
Many scholars have contributed to the debate around the media’s role in democratic governance. Scholars like Baer (2016), Benkler (2017), Pariser (2011), and Sunstein (2001) have developed theories and carried out studies that demonstrate the media’s possible undermining effects on democracy. Bruns (2019) manages to debunk some of these studies. However, Sandvig’s (2014) research offers an alternative response by charging algorithmic recommendations with threatening democracy. All these researchers agree that social media are emancipatory when used for the purposes for which they were created: to expand knowledge, bring people together, and preserve freedom. However, they also have the potential to cause great harm, and if misemployed, they can damage society at both the individual and global levels.
Baer, D. (2016, November 10). The ‘Filter Bubble’ explains why Trump won and you didn’t see it coming. New York Magazine.
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132. https://doi.org/10.1126/science.aaa1160
Benkler, Y., Faris, R., Roberts, H., & Zuckerman, E. (2017). Study: Breitbart-led right-wing media ecosystem altered broader media agenda. Columbia Journalism Review. https://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php
Bruns, A. (2019). It’s not the technology, stupid: How the ‘echo chamber’ and ‘filter bubble’ metaphors have failed us. Paper presented at the International Association for Media and Communication Research conference, 7–11.
Democracy—Manual for Human Rights Education with Young people—Publi.coe.int. (n.d.). Manual for Human Rights Education with Young People. Retrieved 17 February 2023, from https://www.coe.int/en/web/compass/democracy
Frenkel, S., et al. (2018, November 14). Delay, deny and deflect: How Facebook’s leaders fought through crisis. The New York Times. https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html
Gottfried, J., & Shearer, E. (2016, May 26). News Use Across Social Media Platforms 2016. Pew Research Center’s Journalism Project. https://www.pewresearch.org/journalism/2016/05/26/news-use-across-social-media-platforms-2016/
Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the filter bubble? Effects of personalization on the diversity of Google News. Digital Journalism, 6(3), 330–343.
Krafft, T. D., Gamer, M., & Zweig, K. A. (2018). Wer sieht was? Personalisierung, Regionalisierung und die Frage nach der Filterblase in Googles Suchmaschine [Who sees what? Personalization, regionalization, and the question of the filter bubble in Google’s search engine]. Kaiserslautern: Algorithm Watch. https://www.blm.de/files/pdf2/bericht-datenspende---wer-sieht-was-auf-google.pdf
Lichterman, J. (n.d.). Nearly half of U.S. adults get news on Facebook, Pew says. Nieman Lab. Retrieved 17 February 2023, from https://www.niemanlab.org/2016/05/pew-report-44-percent-of-u-s-adults-get-news-on-facebook/
Nechushtai, E., & Lewis, S. C. (2019). What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior, 90, 298–307. https://doi.org/10.1016/j.chb.2018.07.043
Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. London: Penguin.
Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2014). Auditing algorithms: Research methods for detecting discrimination on internet platforms. Paper presented at “Data and Discrimination: Converting Critical Concerns into Productive Inquiry,” Annual Meeting of the International Communication Association, Seattle, WA, USA, 22 May, 1–8.
Sunstein, C. R. (2001). Echo Chambers: Bush v. Gore, Impeachment, and Beyond. Princeton, NJ: Princeton University Press.
Figure 1: https://www.technologyreview.com/2018/08/22/140661/this-is-what-filter-bubbles-actually-look-like/
Figure 2 & cover figure: https://alliance.columbia.edu/events/facebook-made-me-do-it-social-media-and-democracy-alliance-series
Figure 3: https://www.codeheroku.com/post.html?name=Building%20a%20Movie%20Recommendation%20Engine%20in%20Python%20using%20Scikit-Learn
Figure 4: https://samblogs.com/google-algorithm-updates-history-and-they-still-matter-today/