“But everyone says so”: Inside the Echo Chamber
- Eleni Chatzi
- Sep 8
- 16 min read
Introduction
One of the most influential embodiments of public discourse on social media and polarization derives from Sunstein’s (2001, 2017) metaphor of the echo chamber, alongside Pariser’s (2011) concept of the filter bubble. These conceptual frameworks rest on the notion that social media users are inclined to interact principally with other users with whom they share common beliefs, and to consume content that predominantly aligns with their pre-existing ideological perspectives. A focal point in understanding the political and sociological impacts of digital media, echo chambers and filter bubbles can be detected across social networks, from “traditional” platforms such as Facebook, Twitter, and Instagram to emerging platforms such as TikTok and Bilibili. Their formation raises significant concerns regarding disinformation, polarization, and the erosion of democratic discourse (Gao et al., 2023). The concept of echo chambers is based on the notion of digital environments where users are predominantly exposed to information that aligns with their existing beliefs. Filter bubbles, as introduced by Pariser (2011), refer more narrowly to the effects of algorithmic personalization, where users are shown content aligned with their previous behaviors, often unknowingly (Hartmann et al., 2025).
Even though it is crucial to comprehend their meaning and rapid expansion thoroughly, the modus operandi of echo chambers and filter bubbles remains largely unaddressed, and their terminology is inadequately clarified: there is no definitive consensus among academic researchers (Kitchens et al., 2020; Ross Arguedas et al., 2022). According to Ross Arguedas et al. (2022), terms like echo chambers, filter bubbles, and polarization are widely used in political discourse and public dialogue, though not always grounded in scientific research. Although the terms are often used interchangeably and refer to similar consequences in online political discourse, they originate from different mechanisms. As Nguyen (2020) notes, this distinction is important for understanding how each contributes to the broader problem of information segregation and polarization in digital media spaces. However, in most of the literature cited in this article, echo chamber and filter bubble are used interchangeably. While their conceptual distinctions are addressed below, the focus here remains on their shared impact as metaphors for digital environments in which pre-existing beliefs are repeatedly echoed and reinforced.
Definitions and Concepts
The echo chamber constitutes one of the most prominent and fastest-growing research areas in the digital sphere, spanning academic disciplines such as political science, psychology, and information science (Liu, 2022). Most commonly, echo chambers are framed conceptually as confined digital spaces where continuous exposure to content that is ideologically consistent with the user’s pre-existing viewpoints leads to the reinforcement of those viewpoints, diminished exposure to cross-cutting arguments, and, in many cases, polarization (Cinelli et al., 2021; Ranalli & Malcom, 2023). According to Nguyen (2020), an echo chamber is an epistemic environment in which participants encounter beliefs and opinions that coincide with their own; in the words of Cinelli et al., an echo chamber is “a self-reinforcing mechanism that moves the entire group toward more extreme positions” (2021, p. 1). In their influential book, Jamieson and Cappella define echo chambers as “a bounded, enclosed media space that has the potential to both magnify the messages delivered within it and insulate them from rebuttal” (2008, p. 76). Homophily (the tendency of like-minded individuals to form groups that reinforce their existing beliefs), confirmation bias, and cognitive dissonance are central pillars in the formation of echo chambers (Mahmoudi et al., 2024). Research on the causes of echo chambers has also proposed that algorithmic recommendations play a crucial role in the development of these homogeneous digital spaces (Jiang et al., 2021).

These digital spaces are not limited to a particular topic of interest. They can span a wide variety of topics and subjects, from gender equality and abortion laws to climate change, pandemics, and vaccines (Okruszek et al., 2022; Górska, Kulicka, & Jemielniak, 2023; Neff et al., 2021; Neff & Jemielniak, 2022). Evidence provided by Lang (2019), Vaca-Jiménez et al. (2021), and Sharma and Vasuja (2022) has shown that echo chambers are expansive in impact and have the potential to modify national policies or affect populations globally. Three basic features make social media platforms ideal for echo chambers to flourish: platforms are not restricted by geographical limits, the cost of sharing radical beliefs is minimal compared to an in-person conversation, and fellow believers in fringe viewpoints can always be found in the chaotic vastness of the web (Alatawi et al., 2021).
An example of an echo chamber would be a user who is highly critical of a particular political party: the algorithms pick up on this preference and prioritize comparable information while excluding dissimilar or contradictory viewpoints. The user’s beliefs are reinforced, their exposure to varied perspectives is limited, and consequently a kind of illusory consensus is created (Wang et al., 2020).
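To make this dynamic concrete, the sketch below simulates it as a toy model in Python. Everything in it is an illustrative assumption rather than any platform’s actual ranking code: opinions are reduced to a one-dimensional “leaning” score, the recommender ranks items by similarity to the user’s inferred leaning, and the user always clicks the most agreeable item, which in turn sharpens the inferred profile.

```python
import random

# Toy model of the echo chamber feedback loop described above.
# Assumptions (not any platform's real logic): items and users have a
# one-dimensional political "leaning" in [-1, 1]; the recommender ranks
# items by similarity to the user's inferred leaning; the user clicks
# the most agreeable item, which further sharpens the inferred profile.

random.seed(1)
items = [random.uniform(-1, 1) for _ in range(1000)]  # item leanings
profile = -0.3  # the user starts mildly critical of one side

for step in range(8):
    w = step / 7  # personalization weight ramps up as the system "learns"

    def score(leaning):
        # Blend a neutral baseline with similarity to the user profile.
        return (1 - w) * random.random() - w * abs(leaning - profile)

    feed = sorted(items, key=score, reverse=True)[:20]
    consistent = sum(1 for x in feed if x * profile > 0) / len(feed)
    clicked = min(feed, key=lambda x: abs(x - profile))  # most agreeable item
    profile = 0.9 * profile + 0.1 * clicked  # belief/profile reinforcement
    print(f"step {step}: attitude-consistent share of feed = {consistent:.0%}")
```

Even with a mild starting preference, the attitude-consistent share of the feed climbs from roughly half toward essentially all of it: the illusory consensus described above, produced purely by the loop between inference and engagement.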
According to social scientists, echo chambers entail two dimensions: magnification, where attitude-consistent messages are reinforced, and insulation, where exposure to dissimilar views is minimal or absent. Echo chambers cannot be detected by examining a single media platform; understanding them requires a thorough view of an individual’s “media diet” online and offline (e.g., a media user may obtain information from more traditional sources such as newspapers or television). Even though the terminology applies to a vast array of topics, the majority of existing research on the matter focuses on political echo chambers (Ross Arguedas et al., 2022).

In political and public discourse, the term is used interchangeably with filter bubble, although the two are not identical. The term filter bubble was coined by activist Eli Pariser (2011) to articulate his concern about the personalization of the information available to each user. The filter bubble theory describes how tailored algorithms create an enclosed space around individuals and restrict their opportunity to engage with varied viewpoints, often without their active choice (Pariser, 2011). While both terms refer to information environments with limited diversity, the distinction lies in their mechanisms: echo chambers may result from user behavior, platform structures, or both, whereas filter bubbles are predominantly shaped by platform algorithms. The concern among scholars and commentators is that both phenomena may contribute to rising political polarization, which manifests not only in ideological divisions but also in emotional responses to opposing views (affective polarization) and in the segmentation of media audiences along political lines (Ross Arguedas et al., 2022).
As stated above, previous research on online echo chambers has focused on political fragmentation and polarization in democracies (Hong & Kim, 2016; Barberá et al., 2015), the impact on information dissemination (Törnberg, 2018; Wang & Song, 2020; Zimmer et al., 2019), and echo chamber identification models (Villa et al., 2021; Morini et al., 2021; Barberá et al., 2015). Within the context of an echo chamber, disinformation is understood as the deliberate dissemination of deceptive content by actors aiming to manipulate public opinion. Such content often appears credible and is crafted to resemble legitimate news or documentary material, typically serving specific political objectives (Bennett & Livingston, 2018, p. 124).
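Identification models of the kind cited above frequently rest on a simple operational idea (cf. Cinelli et al., 2021; Villa et al., 2021): estimate each user’s leaning, then compare it with the average leaning of that user’s neighborhood in the interaction graph; a strong positive correlation across users indicates homophilic, echo-chamber-like structure. The snippet below sketches that idea on a hand-made toy graph. The data and the bare correlation test are assumptions for illustration, not a reimplementation of any of the cited models.

```python
from statistics import correlation  # Python 3.10+

# Toy follower graph (user -> neighbors) with two tightly knit clusters.
graph = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"},
    "d": {"e", "f"}, "e": {"d", "f"}, "f": {"d", "e"},
}
# Estimated political leaning per user, in [-1, 1].
leaning = {"a": -0.8, "b": -0.6, "c": -0.7, "d": 0.7, "e": 0.9, "f": 0.6}

users = list(graph)
user_leanings = [leaning[u] for u in users]
neighborhood_means = [
    sum(leaning[n] for n in graph[u]) / len(graph[u]) for u in users
]

# r near +1 means users sit in neighborhoods that echo their own leaning.
r = correlation(user_leanings, neighborhood_means)
print(f"user-vs-neighborhood leaning correlation: r = {r:.2f}")
```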

In their paper, Ranalli and Malcom (2023) initially explored the viewpoint that echo chambers could potentially be beneficial for preserving true beliefs, but ultimately rejected this thesis as flawed. Echo chambers present epistemically harmful features that prevent them from becoming epistemically valuable. Their greatest liability lies in how they restrict access to the positive reasons others may hold for their opposing views, even when those views are ultimately mistaken. This restricted access is inherently problematic: a user cannot see for themselves why the opposing views are flawed and so cannot address them with constructive criticism and counter-arguments. This situation is termed “reasons undermining” and constitutes a focal feature of echo chambers; epistemic agency is inhibited and undermined.
How Algorithms Reinforce Echo Chambers and Filter Bubbles
The phenomenon of selective exposure, where users actively choose content that confirms their pre-existing beliefs while disregarding opposing viewpoints, lies at the core of echo chambers and results in what is called ideological homogeneity. Namazzi (2024) notes that confirmation bias contributes to this dynamic and creates a feedback loop that is difficult to escape. These biases are further amplified by algorithmic prioritization based on user engagement metrics (Kim, 2023).

The inclination of algorithms to promote digital material that reflects bias in the data used to train them is called algorithmic bias (Silva & Kenney, 2018). This bias can manifest in two main ways. The first is data bias, where the data used to train algorithms are inherently skewed: for instance, a map app trained solely on roads used by a specific demographic would present a biased view of the transportation network. The second is algorithmic design bias, where decisions made during the algorithm’s development introduce unintended bias; algorithms might favor sensationalist content over accurate reporting because users prefer and engage with it. Bias can also enter during data collection: algorithms trained on data with limited diversity and disproportionate representation of societal groups will amplify those perspectives in content selection (Williams et al., 2018). If a social media platform collects data mainly from users residing in urban environments, for example, the resulting algorithms may struggle to represent the interests of users in rural areas.

Crucially, the emphasis on engagement metrics unintentionally magnifies the spread of exaggerated or even false information: if an algorithm is designed to optimize user engagement, priority will be given to emotionally charged posts that are more likely to go viral even when they lack accuracy (Tucker et al., 2018). Sensationalist content and posts that validate pre-existing beliefs may receive higher priority (Shin et al., 2022). This has been evident in cases of newsfeeds filled with conspiracy theories, whose radical content generates more hits, shares, and likes than pieces of investigative journalism (Putri et al., 2024). Moreover, flaws in algorithmic design or data collection may suppress content originating from marginalized communities, consequently overlooking minority voices and reinforcing digital spaces where those perspectives are systematically excluded (Karizat et al., 2021).
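To illustrate the engagement-optimization point in miniature, the sketch below ranks a handful of hypothetical posts with a scoring function that rewards predicted engagement. The weights, and the assumption that engagement tracks emotional arousal far more strongly than accuracy, are deliberately simplified stand-ins rather than any platform’s real objective; the structural point is that whatever the objective rewards rises to the top of the feed, regardless of accuracy.

```python
# Hypothetical posts with hand-assigned accuracy and emotional-arousal scores.
posts = [
    {"title": "Measured policy analysis", "accuracy": 0.9, "arousal": 0.2},
    {"title": "Outrage-bait conspiracy",  "accuracy": 0.1, "arousal": 0.9},
    {"title": "Investigative long-read",  "accuracy": 0.8, "arousal": 0.3},
    {"title": "Sensational rumor",        "accuracy": 0.2, "arousal": 0.8},
]

def predicted_engagement(post):
    # Assumed toy model: clicks and shares track arousal far more
    # strongly than accuracy.
    return 0.9 * post["arousal"] + 0.1 * post["accuracy"]

# Ranking by predicted engagement alone pushes sensational content to the top.
ranked = sorted(posts, key=predicted_engagement, reverse=True)
for rank, post in enumerate(ranked, start=1):
    print(rank, post["title"], f"(accuracy={post['accuracy']})")
```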

The aforementioned conditions may result in an environment where algorithms perpetuate societal biases and heavily restrict the range of information provided. Media consumers are secluded from alternative ideas when algorithmic bias gives priority to content that matches their previous preferences (Bandy & Diakopoulos, 2021). This sequence provides fertile ground for the rise of echo chambers, much as a student who relies on a single textbook for an upcoming test draws on a restricted source of information and gains only limited knowledge of the subject (Putri et al., 2024). Likewise, users who engage primarily within echo chambers and filter bubbles fail to access vital perspectives that could broaden their understanding.
Research and Critique
Over the last decade, a growing body of academic research has addressed the emergence of echo chambers. In their systematic review, Alatawi et al. (2021) introduced a categorization of issues and challenges attributed to echo chambers, resulting in a four-element classification: attributes, mechanisms, detection modeling, and prevention-mitigation. In their comprehensive review, Ross Arguedas et al. (2022) examined the existence of echo chambers, filter bubbles, and polarization within online social networks and reached three key conclusions: first, that echo chambers are perceived as far more prevalent than they actually are; second, that there is a lack of substantial evidence supporting the formation of filter bubbles; and third, that news media play a complex role in polarization. They also observed a lack of scientific consensus on the definitions of these terms and noted their recurrent misuse in political and public dialogue.

A study by Gao et al. (2023) investigates the phenomena of echo chambers and filter bubbles in an under-researched but rapidly expanding field of technology: short-video platforms such as Douyin (the Chinese version of TikTok). Their study explores whether this kind of platform fosters echo chambers in the way other social media platforms are known to. Analyzing behavioral data from over 1,300 users, the researchers examined how users consumed political and apolitical content, how their preferences evolved, and how the recommendation algorithms influenced what they saw. The findings suggest that while users do develop content preferences over time, and algorithms reinforce these preferences to some extent, the presence of true echo chambers is limited. Many users were exposed to a range of views, particularly when their initial engagement patterns were diverse.
The terms echo chamber and filter bubble have transformed into buzzwords in discussions about misinformation and political polarization, but their meaning is often vague or contradictory (Coady, 2024). David Coady’s article “Stop Talking about Echo Chambers and Filter Bubbles” argues that the concepts of echo chambers and filter bubbles are not only conceptually unclear but actively misleading. He contends that these terms, though popular, do not refer to real or uniquely modern phenomena and often obscure more than they clarify. By drawing parallels to terms like fake news and conspiracy theory, Coady shows how vague or contradictory definitions have become widely accepted without scrutiny. Instead of clarifying communication problems or improving epistemic understanding, these labels are used as rhetorical weapons and tools of moral panic, reinforcing narrow political binaries and discouraging genuine debate. According to the author, much like how the term fake news has been weaponized in political debates, echo chambers and filter bubbles serve more to assign blame than to foster understanding (Coady, 2024).
Coady further critiques scholars like Nguyen for treating echo chambers as self-evidently bad, noting that epistemic communities that prioritize internal trust, like climate scientists, also fit such definitions. The author dismantles the assumption that we should always strive for exposure to “the other side,” arguing that such sides are often artificially constructed within a narrow political spectrum dominated by elite media. He suggests that the language of filter bubbles and echo chambers supports an illusion of diversity while entrenching conformity. Ultimately, the author proposes we abandon these terms altogether, advocating for clearer, more philosophically grounded ways of discussing epistemic and democratic challenges (Coady, 2024).

Figà Talamanca and Arfini (2022) also challenge the widely accepted belief that digital algorithms are primarily to blame for isolating individuals within so-called filter bubbles or echo chambers. They argue that this perspective gives too much agency to technology and fails to account for the deeper cognitive and emotional dynamics at play. While it is true that algorithms personalize content based on user behavior, the authors emphasize that individuals are not passive recipients of information. Instead, people bring their own beliefs, biases, and discomforts into their digital interactions, and these shape how they interpret and react to what they see online.
Rather than viewing digital isolation as a purely technological issue, the authors present it as a socio-technical problem. They introduce the idea that intellectual isolation arises from the interplay between the design of digital platforms and the natural tendencies of human cognition, such as the desire to avoid uncertainty or discomfort. Through concepts like the “epistemic bubble” and “epistemic discomfort,” they show that people often resist information that challenges their beliefs, not because it is hidden from them, but because confronting it can be emotionally and psychologically difficult.
Conclusion
As previously mentioned, apart from homophily, confirmation bias, and cognitive dissonance, the emergence of personalized algorithms plays a crucial role in the formation of echo chambers by offering an online experience that responds to individual interests (Eg et al., 2023).
As discussed in the literature, echo chambers and filter bubbles are characterized by their complex nature and pervasive impact on public opinion formation, the dissemination of information, and the construction of social dynamics. Studies analyzing these phenomena emphasize the possibility that echo chambers enhance polarization by enabling individuals to cluster in like-minded environments where their opinions, however fringe or false, are reassured, while exposure to diverse viewpoints is severely limited. While this could be attributed solely to users actively seeking digital spaces that match their preferences, several scholars argue that echo chambers are a byproduct not only of user preferences but also of network structures and the algorithmic recommendations presented to users.
Echo chambers emphasize the role of social interaction and group dynamics, while filter bubbles highlight the influence of algorithmic filtering. The literature also points to a pressing danger that accompanies echo chambers and filter bubbles: the spread of disinformation. These digital spaces may function as fertile ground for the dissemination of false claims and rumors that exist only to reinforce users’ existing beliefs. The societal repercussions of biased information reproduced in echo chambers can be substantial, as members stay entrenched in their views and become progressively more partisan.
At the same time, literature reveals a growing skepticism toward the universality and explanatory power of these terms. Scholars such as Coady (2024) and Figà Talamanca and Arfini (2022) challenge the deterministic view that attributes epistemic isolation solely to algorithmic bias. Instead, they highlight the agency of users, their resistance to discomfort, and the deeper socio-political structures that shape media consumption. Ultimately, addressing the challenges posed by echo chambers and filter bubbles requires more than technical solutions; it demands critical awareness of how digital infrastructures interact with human psychology and societal dynamics.
Bibliographical References
Alatawi, F., Cheng, L., Tahir, A., Karami, M., Jiang, B., Black, T., & Liu, H. (2021). A survey on echo chambers on social media: Description, detection and mitigation (arXiv:2112.05084). arXiv. https://doi.org/10.48550/arXiv.2112.05084
Bandy, J., & Diakopoulos, N. (2021). More accounts, fewer links: How algorithmic curation impacts media exposure in Twitter timelines. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1–28.
Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531–1542. https://doi.org/10.1177/0956797615594620
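Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122–139.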
Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118. https://doi.org/10.1073/pnas.2023301118
Coady, D. (2024). Stop talking about echo chambers and filter bubbles. Educational Theory, 74(1), 92–107. https://doi.org/10.1111/edth.12620
Eg, R., Tønnesen, Ö. D., & Tennfjord, M. K. (2023). A scoping review of personalized user experiences on social media: The interplay between algorithms and human factors. Computers in Human Behavior Reports, 9, 100253.
Figà Talamanca, G., & Arfini, S. (2022). Through the Newsfeed Glass: Rethinking Filter Bubbles and Echo Chambers. Philosophy & Technology, 35(20). https://doi.org/10.1007/s13347-021-00494-z
Gao, Y., Liu, F., & Gao, L. (2023). Echo chamber effects on short video platforms. Scientific Reports, 13, 6282. https://doi.org/10.1038/s41598-023-33370-1
Górska, A. M., Kulicka, K., & Jemielniak, D. (2023). Men not going their own way: A thick big data analysis of #MGTOW and #Feminism tweets. Feminist Media Studies, 23(8), 3774–3792. https://doi.org/10.1080/14680777.2022.2137829
Hartmann, D., Wang, S. M., Pohlmann, L., & Berendt, B. (2025). A systematic review of echo chamber research: Comparative analysis of conceptualizations, operationalizations, and varying outcomes. Journal of Computational Social Science, 8, 52–94. https://doi.org/10.1007/s42001-025-00381-z
Hong, S., & Kim, S. H. (2016). Political polarization on Twitter: Implications for the use of social media in digital governments. Government Information Quarterly, 33(4), 777–782. https://doi.org/10.1016/j.giq.2016.04.007
Jamieson, K. H., & Cappella, J. N. (2008). Echo chamber: Rush Limbaugh and the conservative media establishment. Oxford University Press.
Jiang, B., Karami, M., Cheng, L., Black, T., & Liu, H. (2021). Mechanisms and attributes of echo chambers in social media (arXiv:2106.05401). arXiv. https://doi.org/10.48550/arXiv.2106.05401
Karizat, N., Delmonaco, D., Eslami, M., & Andalibi, N. (2021). Algorithmic folk theories and identity: How TikTok users co-produce knowledge of identity and engage in algorithmic resistance. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–44.
Kim, L. M. (2023). The echo chamber-driven polarization on social media. Journal of Student Research, 12(4). https://doi.org/10.47611/jsr.v12i4.2274
Kitchens, B., Johnson, S., & Gray, P. (2020). Understanding echo chambers and filter bubbles: The impact of social media on diversification and partisan shifts in news consumption. MIS Quarterly, 44(4), Article 16371. https://doi.org/10.25300/MISQ/2020/16371
Lang, S. (2019). Consulting publics in European Union gender policies: Organising echo chambers or facilitating critical norm engagement? In L. Engberg-Pedersen, A. Fejerskov, & S. M. Cold-Ravnkilde (Eds.), Rethinking gender equality in global governance. Palgrave Macmillan. https://doi.org/10.1007/978-3-030-15512-4_9
Liu, Z. (2022). The internet echo chamber and the misinformation of judges: The case of judges’ perception of public support for the death penalty in China. International Review of Law and Economics, 69, Article 106028. https://doi.org/10.1016/j.irle.2021.106028
Mahmoudi, A., Jemielniak, D., & Ciechanowski, L. (2024). Echo chambers in online social networks: A systematic literature review. IEEE Access, 12, 9594–9619. https://doi.org/10.1109/ACCESS.2024.3353054
Morini, V., Pollacci, L., & Rossetti, G. (2021). Toward a standard approach for echo chamber detection: Reddit case study. Applied Sciences, 11(12), 5390. https://doi.org/10.3390/app11125390
Namazzi, S. N. S. (2024). The relationship between political ideology and social media echo chambers. International Journal of Humanity and Social Sciences, 2(2), 48-59. https://doi.org/10.47941/ijhss.1781
Neff, T., & Jemielniak, D. (2022). How do transnational public spheres emerge? Comparing news and social media networks during the Madrid climate talks. New Media & Society, 26(4), 2066–2091. https://doi.org/10.1177/14614448221081426
Neff, T., Kaiser, J., Pasquetto, I., Jemielniak, D., Dimitrakopoulou, D., Grayson, S., Gyenes, N., Ricaurte, P., Ruiz-Soler, J., & Zhang, A. (2021). Vaccine hesitancy in online spaces: A scoping review of the research literature, 2000–2020. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-82
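Nguyen, C. T. (2020). Echo chambers and epistemic bubbles. Episteme, 17(2), 141–161.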
Okruszek, L., Piejka, A., Banasik-Jemielniak, N., & Jemielniak, D. (2022). Climate change, vaccines, GMO: The N400 effect as a marker of attitudes toward scientific issues. PLOS ONE, 17(10), e0273346. https://doi.org/10.1371/journal.pone.0273346
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.
Putri, S. D. G., Purnomo, E. P., & Khairunissa, T. (2024). Echo Chambers and Algorithmic Bias: The Homogenization of Online Culture in a Smart Society. In ICEnSE 2024: Proceedings of the 1st International Conference on Environment and Smart Education (Paper 05001, pp. 1–8). SHS Web of Conferences. https://doi.org/10.1051/shsconf/202420205001
Ranalli, C., & Malcom, F. (2023). What’s so bad about echo chambers? Inquiry, 66(5), 563–582. https://doi.org/10.1080/0020174X.2023.2174590
Ross Arguedas, A., Robertson, C. T., Fletcher, R., & Nielsen, R. K. (2022). Echo chambers, filter bubbles, and polarisation: A literature review. Reuters Institute for the Study of Journalism. https://doi.org/10.60625/risj-etxj-7k60
Sharma, B., & Vasuja, K. (2022). Investigating social media induced polarization on National Education Policy 2020. In I. Qureshi, B. Bhatt, S. Gupta, & A. A. Tiwari (Eds.), Causes and symptoms of socio-cultural polarization. Springer. https://doi.org/10.1007/978-981-16-5268-4_8
Shin, D., Hameleers, M., Park, Y. J., Kim, J. N., Trielli, D., Diakopoulos, N., Helberger, N., Lewis, S. C., Westlund, O., & Baumann, S. (2022). Countering algorithmic bias and disinformation and effectively harnessing the power of AI in media. Journalism & Mass Communication Quarterly, 99(4), 887–907.
Silva, S., & Kenney, M. (2018). Algorithms, platforms, and ethnic bias: An integrative essay. Phylon (1960-), 55(1 & 2), 9–37.
Sunstein, C. R. (2001). Republic.com. Princeton University Press.
Sunstein, C. R. (2017). #Republic: divided democracy in the age of social media. Princeton University Press.
Terren, L., & Borge, R. (2021). Echo chambers on social media: A systematic review of the literature. Review of Communication Research, 9, 99–118. https://doi.org/10.12840/ISSN.2255-4165.028
Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLoS ONE, 13(9), e0203958. https://doi.org/10.1371/journal.pone.0203958
Tucker, J. A., Guess, A., Barberá, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. SSRN working paper (March 19, 2018).
Vaca-Jiménez, S., Gerbens-Leenes, P. W., Nonhebel, S., & Hubacek, K. (2021). Unreflective use of old data sources produced echo chambers in the water–electricity nexus. Nature Sustainability, 4(6), 537–546. https://doi.org/10.1038/s41893-021-00686-7
Villa, G., Pasi, G., & Viviani, M. (2021). Echo chamber detection and analysis. Social Network Analysis and Mining, 11, Article 78. https://doi.org/10.1007/s13278-021-00779-3
Wang, X., & Song, C. (2020). Viral misinformation and echo chambers: The diffusion of rumors about genetically modified organisms on social media. Internet Research, 30(5), 1547–1564. https://doi.org/10.1108/INTR-11-2019-0491
Wang, X., Sirianni, A. D., Tang, S., Zheng, Z., & Fu, F. (2020). Public discourse and social network echo chambers driven by socio-cognitive biases. Physical Review X, 10(4), Article 041042.
Williams, B. A., Brooks, C. F., & Shmargad, Y. (2018). How algorithms discriminate based on data they lack: Challenges, solutions, and policy implications. Journal of Information Policy, 8, 78–115.
Zimmer, F., Scheibe, K., Stock, M., & Stock, W. G. (2019). Echo chambers in social media: A literature review. In 8th Annual Arts, Humanities, Social Sciences & Education Conference (pp. 1–22).