We live in a digital age where knowledge has become a contested space, increasingly shaped by hidden systems and algorithmic influence. As computer systems, AI technologies, and media companies gain unprecedented power, the processes by which we encounter and evaluate truth are fundamentally transformed. Search engines, recommendation algorithms, and mass media platforms now function as epistemic gatekeepers, determining what is visible, credible, or worthy of consideration. These systems do more than assist with information retrieval; they actively shape attention, belief, and perception, often steering billions without accountability or transparency. In response, there is a growing need to recenter human discernment through critical thinking, intuition, and deeper reflection. These are qualities that stand in contrast to the purely material or mechanistic assumptions of algorithmic design.
These systems, though artificial, are increasingly treated as sources of guidance, reassurance, and authority. They have become central to how people form trust in information, often shaping worldviews more deeply than longstanding social or intellectual institutions. The issue extends beyond misinformation or bias and reflects a more profound epistemic shift. Knowledge is frequently detached from wisdom and repurposed for influence, engagement, and commercial gain. In this environment, preserving intellectual autonomy depends on the ability to recognize and evaluate the forces that construct our sense of what is real. In examining the philosophical and communicative implications of media and algorithmic influence, including the growing role of AI, we must consider how these systems shape our understanding and ask how individuals can remain critically aware in the face of technologies designed to persuade rather than enlighten.
The scientific method remains a powerful tool for understanding the world through observation and repetition. However, it is vital to recognize its limitations within a broader epistemological framework. A strictly materialistic perspective, focused solely on empirical evidence and sensory input, often overlooks other valuable modes of knowing, such as moral intuition, introspection, and philosophical reasoning. These dimensions are essential for addressing questions of purpose, ethics, and meaning, which lie outside the scope of scientific measurement. Integrating diverse ways of knowing does not require rejecting science. Instead, it calls for intellectual humility and an awareness that truth may extend beyond what is quantifiable. As AI and algorithmic systems increasingly shape our informational environment, this broader perspective becomes even more critical for cultivating discernment in both human and technological contexts.
Media Influence and the Construction of Knowledge
As online networks become powerful intermediaries of belief, the mechanics of presumed media influence serve as a stark reminder of how easily epistemic standards can shift without anyone realizing it. The Influence of Presumed Media Influence (IPMI) model illustrates that media attention and interpersonal communication can subtly motivate individuals to engage with artificial intelligence, not through direct persuasion, but by encouraging them to reflect on social norms and their perceptions of others’ media consumption (Li et al., 2024).
This aligns with agenda-setting theory, which posits that the media do not dictate thoughts but rather frame topics for consideration, thereby constructing a “pseudo-environment” that influences public perception, often diverging from actual reality (Chernov & McCombs, 2019). These pseudo-environments are constrained by reality itself; nonetheless, for many individuals, they become a functional reality. Many people still surrender their freedom, without even realizing it, to algorithmic gods: tools that alter how people perceive the world, where what people think they know can matter more than what they actually know. Agenda-setting theory’s grounding in scientific realism offers a helpful lens here: while beliefs may be constructed, they are tested, and sometimes contradicted, by enduring truths in the real world.
Yanovitzky and Weber (2019) assert that the media function as knowledge intermediaries. They not only provide people with information; they also determine how that knowledge is organized, shared, and utilized in public life. The media, they claim, help determine what information is seen, who can view it, and how it connects to power and action. This reveals a deeper problem: truth is increasingly filtered through systems that prioritize securing agreement over fostering understanding of the facts.
Samantha Lay (2008) deepens this critique by showing how information is not simply passed along by the media but is strategically transformed at each stage before it becomes news. Through what she terms “strategic predetermination,” Lay identifies how facts are reshaped across five stages, from initial reporting to internal review, press engagement, journalistic preparation, and final publication. At each point, the information is reconstructed for different audiences, often reinforcing institutional narratives rather than challenging them. This layered process means that by the time the public encounters a “news event,” it may bear only selective resemblance to the original occurrence. If what we receive is multiple steps removed from the truth, then our trust is often placed not in what is, but in what has been shaped to appear trustworthy.
In the digital age, algorithms, live feeds, and breaking news cycles operate as powerful forces of epistemic authority, often privileging speed over accuracy. These systems are not just tools of dissemination but active participants in shaping what people come to accept as credible. A Swedish ethnographic study on digital journalism found that newsroom practices increasingly prioritize rapid content delivery over factual reliability. This results in what researchers term “epistemic dissonance,” a growing gap between what is asserted and what is actually known (Ekström et al., 2021). Journalists, under institutional pressure, frequently publish incomplete or unconfirmed information, relying on qualifiers and future corrections to maintain the appearance of credibility. In such environments, epistemic shortcuts become normalized.
The influence of artificial intelligence further complicates this landscape. AI-driven recommendation engines and automated news feeds amplify emotionally charged content, reinforcing patterns of immediate engagement while suppressing nuance. As a result, public perception becomes increasingly shaped by algorithms designed to maximize attention rather than understanding. This dynamic not only distorts what individuals know but also affects how they think, encouraging reactive, emotionally driven responses over reflective, critical reasoning. In such a context, it becomes essential to remain conscious of how informational systems guide perception and to cultivate habits of discernment that resist the pull of immediacy in favor of deeper insight.
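The dynamic described above can be made concrete with a deliberately simplified sketch. The scoring function, field names, and example items below are all invented for illustration and do not represent any platform’s actual algorithm; the point is only that when a feed is ranked by a proxy for engagement that rewards emotional arousal and ignores accuracy, emotionally charged content will systematically outrank more accurate but neutral material.

```python
# Illustrative sketch only: a feed ranked purely by predicted engagement.
# "arousal" and "accuracy" are hypothetical item attributes invented here.

def predicted_engagement(item):
    # Hypothetical objective: emotional arousal drives the score;
    # factual accuracy carries zero weight and never enters the ranking.
    return 3.0 * item["arousal"] + 0.0 * item["accuracy"]

def rank_feed(items):
    # Highest predicted engagement first.
    return sorted(items, key=predicted_engagement, reverse=True)

feed = [
    {"title": "Nuanced expert analysis", "arousal": 0.2, "accuracy": 0.95},
    {"title": "Outrage-bait rumor",      "arousal": 0.9, "accuracy": 0.30},
]

ranked = rank_feed(feed)
print([item["title"] for item in ranked])
# The rumor ranks first because accuracy is invisible to the objective.
```

The design choice that matters here is not the particular weights but the absence of accuracy from the objective altogether: no amount of factual reliability can compensate for low emotional pull under such a ranking.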
Post-Truth Journalism
Pablo Capilla (2021) intensifies this critique by characterizing the post-truth scenario as an epistemological transformation inside journalism, where truth is no longer anchored in proof but replaced by emotional resonance. Capilla argues that neoliberal rationality has changed the audience’s epistemic expectations. The media no longer merely report; they help create truth regimes that align with existing power structures. In this new order, what people think is “true” is often based on what they believe or how they feel, not on what is actually true. As Capilla explains, “…change is occurring in the way people perceive reality through the news, and that this shift is affecting the perception of what is true or false in the news” (pp. 319–320).
Recent cognitive research emphasizes the significance of epistemic awareness in navigating contemporary information systems. Feinkohl et al. (2016) found that individuals who hold more advanced epistemological beliefs, such as understanding that knowledge is complex, constructed, and subject to change, demonstrate stronger memory and a greater ability to critically evaluate scientific claims, particularly when presented through journalistic formats. In contrast, individuals with less developed epistemic frameworks are more likely to accept information at face value, especially when it is delivered frequently or with confidence. This tendency is especially concerning in environments shaped by breaking news and algorithmically curated content, where scientific assertions are often presented without adequate context or acknowledgment of uncertainty.
The study also reveals that critical thinking is influenced more by the quality of one’s epistemological outlook than by cognitive ability alone. As AI systems increasingly influence how information is selected and presented, cultivating this deeper form of intellectual engagement becomes critical. Simply acquiring knowledge is no longer sufficient. Individuals must learn to examine the structures behind the information they receive, applying habits of skepticism, reflection, and humility to think clearly in an environment defined by speed, volume, and persuasion.
Personal Epistemology in the Age of Digital Media
Schwarzenegger’s (2020) analysis of individuals’ interactions with contemporary media environments expands upon this emphasis on personal epistemic production. Schwarzenegger identifies three components of what he terms “personal epistemologies of the media”: selective criticality, pragmatic trust, and competence–confidence (p. 369). These categories illustrate the inconsistent application of critical thinking, often influenced by ideological alignment or perceived personal expertise, as well as the tendency to rely on specific sources due to cognitive convenience rather than epistemic rigor. This reflects what Ryan (2021) describes as a deficit in epistemic trust, where individuals no longer believe that media sources offer reliable or comprehensive coverage. As a result, they often reduce their expectations of being accurately informed. Both Ryan and Schwarzenegger observe that media users frequently overestimate their ability to identify truth, relying more on emotional resonance and familiarity than on careful evaluation or verification. This pattern reveals a broader vulnerability in how trust is formed, where it is placed not in reasoned evidence but in perceived authority, social identity, or ideological agreement.
When people defer to either legacy institutions or algorithmically curated content without critical reflection, they may unknowingly replace genuine inquiry with cognitive shortcuts. These habits are shaped by convenience, repetition, and mental ease rather than deliberate reasoning. Schwarzenegger’s model emphasizes that epistemic development is not solely intellectual. It is also shaped by daily patterns, media routines, and the cultural cues that inform how individuals choose to trust or question the information they encounter.
Bartsch et al. (2024) add further insight by examining how individuals judge the credibility of news from unfamiliar or fictional sources. Their research found that people often rely on surface-level cues such as tone, formatting, or alignment with existing beliefs, rather than engaging in thoughtful verification. This reliance on superficial indicators highlights how easily the appearance of credibility can replace actual knowledge. In an information environment shaped by speed and emotional appeal, the ability to distinguish between what feels true and what is verifiably accurate becomes increasingly critical for responsible engagement.
Aubele (2024) argues that the term “bias” is often used too loosely, becoming a shortcut that replaces deeper engagement with the ideological content of information. Rather than relying on such generalized labels, he recommends using structured analytical methods such as agenda-setting analysis, framing analysis, and sentiment analysis. These tools can help differentiate between high- and low-quality sources, contributing to a more accurate understanding of how information is constructed and communicated.
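To illustrate what the simplest form of one such method looks like, the sketch below implements a bare-bones lexicon-based sentiment score. The word lists and headlines are invented for illustration; research-grade sentiment analysis would instead use a validated lexicon (such as VADER) or a trained model, but even this toy version shows how framing differences become measurable rather than a matter of the loose “bias” label.

```python
# Minimal lexicon-based sentiment sketch (illustrative only; the word
# lists below are hypothetical, not a validated sentiment lexicon).

POSITIVE = {"progress", "success", "benefit", "hope"}
NEGATIVE = {"crisis", "disaster", "threat", "failure"}

def sentiment_score(text):
    # Score = (positive word count - negative word count) / total words,
    # so the result falls in [-1, 1]; 0 means neutral or no lexicon hits.
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos - neg) / max(len(words), 1)

headline_a = "Vaccine rollout brings hope and progress"
headline_b = "Vaccine crisis deepens amid threat of failure"

print(sentiment_score(headline_a) > 0)  # positively framed headline
print(sentiment_score(headline_b) < 0)  # negatively framed headline
```

Applied across many headlines from the same outlet on the same topic, even a crude score like this can reveal systematic framing patterns that a blanket accusation of “bias” obscures.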
Adiprasetio et al. (2025) expand on this point by noting that institutional fact-checking is not immune to underlying assumptions about what qualifies as valid knowledge, credible sources, or authoritative interpretation. Even well-intentioned verification efforts may unintentionally reinforce existing epistemological frameworks rather than challenge them. These insights suggest the importance of advancing beyond simplistic acceptance or rejection of information. A more reflective approach, grounded in discernment, critical inquiry, and patience, is needed to navigate a media environment shaped by competing narratives and evolving standards of truth.
Search Engines as Epistemic Authorities
There has been a dramatic shift in the way individuals access and rely on information since Google became the epistemic interface for common questions. According to Bhatt and MacKenzie (2019), people often unquestioningly trust search engines, engaging in digital literacy rituals that obscure the algorithmic factors that determine whether content is displayed or not. This ignorance is not passive; it is structurally perpetuated, favoring ease above critical awareness. Macknight and Medvecky (2020) broaden this critique in the realm of economics education, highlighting how Google selectively mediates disciplinary knowledge, making certain concepts more prominent or authoritative than others. They call this “Google-knowing,” which suggests that what you can find out depends on how the search engine is configured, not on the nature of the question itself. Finally, Narayanan and De Cremer (2022) see Google as offering “bent testimony”: algorithmically ranked outputs are treated as reliable testimony, even though no individual authors them and no one is accountable for them. Reflection on this trend raises urgent concerns: as dependence on search engines replaces trust in concrete, accountable sources of knowledge, truth is no longer recognized within the community or through wisdom traditions but is increasingly sourced from commercial algorithms designed for engagement rather than illumination.
Case Study: Ivermectin, Media Framing, and Public Perception During COVID-19
Epistemological concerns came to the forefront during the COVID-19 pandemic, revealing significant tensions between public trust and mediated information. At the outset of the pandemic in 2020, televised mainstream news began broadcasting mortality statistics in formats resembling stock market tickers. Despite the gravity of the data presented, contextual qualifiers, such as the phrase “while controlling for,” were notably absent. Terms like “comorbidity” were assumed to be understood by general audiences without explanation, despite their implications in cases involving non-COVID-related fatalities such as fatal car accidents, drownings, or shootings. Initially, such editorial decisions were presumed to be pragmatic omissions rather than deliberate framing choices. However, this trust in legacy media highlights a broader phenomenon: the extent to which individuals defer personal reasoning in favor of institutional narratives.
A pivotal moment in the public discourse occurred during a widely circulated podcast episode between Joe Rogan and CNN’s Chief Medical Correspondent, Dr. Sanjay Gupta. In that interview, Gupta acknowledged the framing of Ivermectin as a “horse dewormer,” to which Rogan responded by calling it a deliberate lie, suggesting that such characterization was knowingly propagated by televised mainstream news (Rogan & Gupta, 2021). Ivermectin became the center of a polarizing narrative, portrayed across multiple media platforms as a veterinary drug misused by misinformed individuals. This portrayal extended to popular culture, exemplified by talk show host Jimmy Kimmel’s inflammatory suggestion that those who used Ivermectin should be denied medical care and left to die (Jones, 2021).
However, Ivermectin’s use in human medicine was well-established long before the pandemic. This discrepancy raised critical questions regarding media framing. How could legacy media fail to acknowledge that Ivermectin had been prescribed to humans over a billion times globally and had even received a Nobel Prize for its role in human medicine? Further academic inquiry into this media narrative culminated in a published study in Propaganda in Focus (Vanadisson, 2023). The study analyzed how repetitive televised mainstream news (TMN) broadcasts created an epistemic framework that not only shaped opinions but also constructed an alternate reality, sidelining dissenting perspectives. The strategic framing of Ivermectin functioned as a powerful gatekeeping mechanism, reinforcing specific beliefs while excluding others, thus defining the boundaries of what was considered legitimate knowledge.
This case supports Yanovitzky and Weber’s (2019) theory of media as knowledge brokers, highlighting the media’s capacity to mediate not only access to facts but also reality itself for trusting audiences. The implications of such epistemic gatekeeping are profound. Individuals who previously trusted institutional narratives found themselves grappling with cognitive dissonance and social alienation when they questioned dominant discourses. This aligns with the “spiral of silence” theory (Matthes et al., 2010), which posits that individuals often suppress dissenting views due to fear of social exclusion unless those views are held with high conviction.
Conclusion
In a digital environment shaped by algorithms, commercial incentives, and accelerated information cycles, determining what is true has become an increasingly challenging task. Media institutions, AI-driven platforms, and search engines do more than distribute information; they actively influence what is seen, what is believed, and how knowledge is formed.
This paper has examined the role of these epistemic gatekeepers in shaping public understanding. It suggests that systems designed for engagement can obscure accuracy, and that repeated exposure to information can replace genuine critical evaluation. As individuals increasingly rely on curated content and rapid news delivery, there is a growing risk that convenience will take precedence over credibility.
Addressing this challenge requires more than technological reform. It calls for a renewed cultural commitment to epistemic responsibility. This includes questioning how narratives are framed, seeking context beyond headlines, and remaining aware of the psychological cues that influence belief. Analytical tools such as agenda-setting theory and framing analysis are helpful, but they must be supported by habits of reflection, intellectual humility, and a willingness to engage with complexity.
Looking ahead, individuals, educators, and institutions must prioritize the development of critical media literacy and responsible information practices. This includes integrating epistemic training into education, promoting transparency in digital design, and encouraging greater accountability among media producers. The future of public knowledge will depend not only on system-level changes but also on how individuals choose to engage with the information they consume. In an era where speed and simplicity often prevail, the deliberate pursuit of depth, accuracy, and independent thought remains a vital path forward.
References
Adiprasetio, J., Rahmawan, D., Wibowo, K. A., Yudhapramesti, P., & Hartoyo, N. M. (2025). Epistemological problems in fact-checking practice: Evidence from Indonesia. Media Practice & Education, 26(1), 36–57. https://doi.org/10.1080/25741136.2024.2341338
Aubele, J. (2024). Better than bias: The power of and alternatives to descriptions of news media as biased. Dissertation Abstracts International: Section B: The Sciences and Engineering, 85(8-B).
Bartsch, A., Mares, M.-L., Schindler, J., Kühn, J., & Krack, I. (2024). Trust but verify? A social epistemology framework of knowledge acquisition and verification practices for fictional entertainment. Human Communication Research, 50(2), 194–207. https://doi.org/10.1093/hcr/hqad036
Bhatt, I., & MacKenzie, A. (2019). Just Google it! Digital literacy and the epistemology of ignorance. Teaching in Higher Education, 24(3), 302–317. https://doi.org/10.1080/13562517.2018.1547276
Capilla, P. (2021). Post-truth as a mutation of epistemology in journalism. Media and Communication, 9(1), 313–322. https://doi.org/10.17645/mac.v9i1.3529
Chernov, G., & McCombs, M. (2019). Philosophical orientations and theoretical frameworks in media effects: Agenda setting, priming and their comparison with framing. The Agenda Setting Journal, 3(1), 63–81. https://doi.org/10.1075/asj.18016.che
Ekström, M., Ramsälv, A., & Westlund, O. (2021). The epistemologies of breaking news. Journalism Studies, 22(2), 174–192. https://doi.org/10.1080/1461670X.2020.1831398
Feinkohl, I., Flemming, D., Cress, U., & Kimmerle, J. (2016). The impact of epistemological beliefs and cognitive ability on recall and critical evaluation of scientific information. Cognitive Processing, 17(2), 213–223. https://doi.org/10.1007/s10339-015-0748-z
Jones, Z. C. (2021, September 9). Jimmy Kimmel jokes hospitals shouldn’t treat patients who used ivermectin for COVID-19 treatment. CBS News. https://www.cbsnews.com/news/jimmy-kimmel-ivermectin-covid-19-pandemic-hospitals/
Lay, S. (2008). Information/transformation: The strategic predetermination of information events. Journalism Studies, 9(1), 57–74. https://doi.org/10.1080/14616700701768063
Li, Z., Shi, J., Zhao, Y., Zhang, B., & Zhong, B. (2024). Indirect media effects on the adoption of artificial intelligence: The roles of perceived and actual knowledge in the influence of presumed media influence model. Journal of Broadcasting & Electronic Media, 68(4), 581–600. https://doi.org/10.1080/08838151.2024.2377244
Macknight, V., & Medvecky, F. (2020). (Google-)Knowing economics. Social Epistemology, 34(3), 213–226. https://doi.org/10.1080/02691728.2019.1702735
Matthes, J., Rios Morrison, K., & Schemer, C. (2010). A spiral of silence for some: Attitude certainty and the expression of political minority opinions. Communication Research, 37(6), 774–800. https://doi.org/10.1177/0093650210362685
Narayanan, D., & De Cremer, D. (2022). “Google told me so!” On the bent testimony of search engine algorithms. Philosophy & Technology, 35(2), 1–20. https://doi.org/10.1007/s13347-022-00521-7
Rogan, J., & Gupta, S. (2021, October 15). Joe Rogan & CNN Sanjay Gupta podcast transcript: Ivermectin. Rev. https://www.rev.com/blog/transcripts/joe-rogan-cnn-sanjay-gupta-podcast-transcript-ivermectin
Ryan, S. (2021). Fake news, epistemic coverage and trust. Political Quarterly, 92(4), 606–612. https://doi.org/10.1111/1467-923X.13003
Schwarzenegger, C. (2020). Personal epistemologies of the media: Selective criticality, pragmatic trust, and competence–confidence in navigating media repertoires in the digital age. New Media & Society, 22(2), 361–377. https://doi.org/10.1177/1461444819856919
Vanadisson, R. (2023, August 22). Ivermectin and televised mainstream news: A case study of propaganda. Propaganda in Focus: Analyzing Persuasive Communication and Censorship. https://propagandainfocus.com/ivermectin-and-televised-mainstream-news-a-case-study-of-propaganda/
Yanovitzky, I., & Weber, M. S. (2019). News media as knowledge brokers in public policymaking processes. Communication Theory, 29(2), 191–212. https://doi.org/10.1093/ct/qty023