Sunday, February 22, 2026
Brock's Only Independent Student Newspaper
One of the only worker-managed newspapers in Canada

The race to label a glitchy TikTok as “censorship” signals eroding trust in media institutions


A video discussing the Jeffrey Epstein emails appears to “glitch” the moment its creator says “Syria,” cutting or de-syncing the audio in a way that behaves differently depending on how and where the clip is played. The comments immediately and confidently began labelling the glitch a form of deliberate platform censorship. That rush to diagnosis is a small but telling reflection of how much distrust colours the current political and media environment: anomalies are read as manipulation by default, not as errors.

A TikTok discussing the 20,000 emails sent by Jeffrey Epstein and recently released by the House Oversight Committee was posted by user @dumbbirchtree on Nov. 17.

The video appears to play normally until the creator says “Syria” at 0:28. From there, the audio for everything she says between 0:28 and 0:40 is cut while the picture continues uninterrupted, so the soundtrack skips ahead to her saying “or he’s emailing about…” while the video plays un-synced in the background. Audio and video fall back into sync at the 0:40 mark, when she visibly delivers that line.

Prompted by the comments, I downloaded the video onto my iPhone and MacBook and found two strange behaviours: if you let the video play from the beginning, the audio cuts out completely right after she says “Syria” and remains silent through to the end (from 0:28 to 1:05); but if you skip to any point between 0:29 and 0:40, the audio consistently lands on the same line, “or he’s emailing about…”, without matching the video in the background, no matter where you place the playhead within that window.

Responses to this audio glitch fall into two broad categories. One is a technical, file-level explanation: media files can contain gaps or discontinuities that cause audio to drop while video continues, and players can handle those discontinuities differently. In one downloaded copy of the video, I observed that playing from the start led to silence after “Syria,” while seeking into the 0:29–0:40 window reliably produced the same later line. That pattern is consistent with an audio timeline gap: if there is no decodable audio in a time range, many players will “snap” to the next available audio when you seek, even if the visual track continues.

The other category is platform intervention. Independent of glitches, we know that major video platforms have the technical capacity to mute audio over specific time ranges, remove audio and/or serve different encodes of the same upload depending on context. That capability does not demonstrate that intervention occurred here, but it helps explain why some viewers see intervention as plausible. In both categories, the shared constraint is visibility: the average individual has limited access to information about what versions were generated, how they were processed or whether moderation actions were applied, which makes definitive conclusions difficult from observation alone. 

These comments illustrate how uncertainty about a media artifact becomes part of the artifact itself. In @dumbbirchtree’s comment section, it is evident how fast the description moved from general observation (“it glitched after she said Syria”) to interpretation presented as evidence (“this is censorship”). A separate cluster of comments focuses on response, whether emotional reaction or practical advice on how to preserve the message in a form that would be harder to alter, such as re-recording or presenting text on screen.

In this sequence, the post becomes more than a single clip; it becomes a public thread where viewers test whether an experience is shared, propose explanations with varying confidence and disseminate those explanations to later readers. The result is that multiple narratives can coexist around the same technical observation, often without a mechanism for confirming which narrative best matches the underlying cause. 

The video is useful as a case study because it shows how quickly an ambiguous input can generate a high-confidence yet unverifiable explanation in a social feed. Explanations that are short and definitive are easier to repeat and attach to the clip than explanations that involve uncertainty or technical nuance, and comment sections can create a form of social reinforcement: repeated assertions that “it happened to me too” or that “this is censorship” can function as informal evidence for later viewers even when the causal chain remains unclear.

This pattern of viewers treating an unexplained anomaly as definitive censorship fits a broader environment in which verification is often difficult for the average user and trust in information systems is measurably low. The World Economic Forum’s Global Risks Report 2025 lists misinformation and disinformation among the top short-term risks for the second consecutive year, noting impacts on social cohesion and governance.

In practice, that risk is amplified by the basic conditions of digital distribution: most users cannot directly observe why a clip behaves differently across devices or playback methods, and many lack a technical framework for understanding transcoding, audio/video timelines, buffering or moderation states. So, when something unusual happens, the cause is not readily verifiable from the viewing experience alone.

Synthetic media and A.I. simultaneously increase uncertainty about what online content should be considered trustworthy. The Brennan Center describes a “liar’s dividend,” where the existence of convincing fakes can make it easier to deny authentic evidence by claiming it is fabricated. Reuters similarly reported on a BBC/EBU study that found leading A.I. assistants frequently produced problematic answers to news questions, including significant errors and sourcing issues, which researchers and broadcasters warned could further erode trust.

As well, information systems now include far more publishers and far less shared context. The Reuters Institute’s “Digital News Report 2025” describes declining engagement and low trust in many markets alongside shifts in how news is accessed, with social platforms playing a major role in distribution. In the U.S. specifically, Gallup reported that confidence in mass media is at a new low of 28 per cent.

Finally, economic conditions are one documented pathway into institutional distrust and conspiratorial interpretation, particularly when large gaps in wealth contribute to concentrated power and unaccountable decision-making. The World Inequality Report 2022 estimates that the richest 10 per cent own about 76 per cent of global wealth, while the bottom 50 per cent own about two per cent. As well, a summary based on UBS wealth reporting states that households holding more than one million USD control 47.5 per cent of global wealth.

Research links these macro conditions to political attitudes: a 2024 study in Public Opinion Quarterly reports that economic inequality can reduce political trust when it is recognized and perceived as a failure of the political system. Relatedly, multiple peer-reviewed studies find that high economic inequality is associated with stronger conspiracy beliefs, often explained through mechanisms like anomie: a sense that society lacks clear rules, fairness or predictability.

In an environment where a significant portion of the public experiences institutions as serving concentrated interests, ambiguous events in digital media — like unexplained missing audio — are more likely to be interpreted through frameworks of manipulation or suppression, rather than treated as neutral technical errors.  

When trust in information collapses, politics shifts from arguing over policies to arguing over what counts as reality. Platforms become part of the political battleground because they function as large-scale distributors and de facto editors, yet their recommendation systems and enforcement decisions are often opaque. When content is labeled, limited or removed through platform rules, moderation can end up shaping public debate in ways that feel like proxy governance.  

Mistrust also changes how people evaluate claims. Instead of asking “is this true?”, audiences default to “who is saying it?” Credibility becomes tied to identity and group membership, which makes persuasion across groups harder and tends to intensify polarization. Conspiracy framing can then operate as a political style: it offers quick certainty, a clear villain and built-in resistance to correction. It does not always require full belief to be politically useful.

In a media environment this opaque, it’s unsurprising that a glitchy TikTok is read by most as censorship before it is considered a technical error.
