
TikTok Journalism: The Good, The Bad, The Ugly

Not all good, not all bad–but it gets ugly.

Popping Bubbles?

This “Stitch Incoming” video is an example of how some citizen journalists attempt to correct misinformation being spread on social media. In this video, the creator begins by addressing the audience at large, then shifts to addressing the creator of the original video directly. The video is phrased almost as a one-on-one conversation between two creators–but through addressing the audience and posting this correction publicly, the correcting party is opening up the conversation to a wider public forum–one that, due to the nature of information bubbles on platforms like TikTok, will likely largely consist of their preexisting audience. Creators of these correction videos like Jules Buet in their “#stitch with prof.kinnon” and Reuben @reubenwoodall in his “#stitch with Kelly Cadigan…” tend to respond largely informally, and while they do reference research to support their arguments, this information is not usually cited. Buet approaches correction in a slightly less formal version of the “New Wave Broadcast”–they maintain a neutral affect and they reference studies to make their point, but these studies remain uncited. Reuben approaches correcting misinformation differently, expressing that he finds the original creator’s argument “ridiculous” and speaking instead to his own personal experience–though when it comes to citing sources to support his argument, he likewise makes vague allusions like “have you seen all the anti trans bills that have been going on right now,” rather than citing a verifiable source. Visually, both creators position the camera below eye level, and thus create a visual effect of looking down on the viewer, positioning themselves as an authority. This has the potential to make creators of these correction videos seem more trustworthy to viewers as authoritative speakers on a subject, regardless of how much effort they put into citing their sources.

When it comes to citizen journalists’ own content, issuing corrections is already complicated; content creators are limited by the ever-changing capabilities of the platforms on which they publish their content. TikTok, like many short-form video platforms, does not allow creators to edit videos after they have been posted–unlike in more traditional forms of journalism, particularly print journalism, it is not possible to update the originally published content and append an addendum acknowledging that changes were made. Citizen journalist participants in Chelsea Peterson-Salahuddin’s study “News for (Me and) You: Exploring the Reporting Practices of Citizen Journalists on TikTok” discussed their methods for correcting misinformation they had unintentionally shared: some posted updates in the comments (which viewers are not guaranteed to open), some left their original videos up and made new videos correcting the misinformation (which are not guaranteed to be shown to the same audience), and some deleted their original videos and posted corrected videos in their place (which is not guaranteed to reach everyone who saw the original video or undo the harm already caused–it simply limits further spread). This highlights a complex problem when it comes to sharing journalistic content on an algorithmically controlled platform: video creators have two different audiences to consider at all times–viewers who choose to follow a creator’s account, and those who simply encounter a creator’s content as they scroll through videos that platforms like TikTok have selected “for them.”

Comments from Zach Lancaster's "Am I on the Wrong Side of TikTok?"

When it comes to correcting information that others have shared, it’s unclear who the correction is intended to benefit beyond existing audiences who already know or agree with the new information being shared. In the case of content that discusses politicized topics, it’s unlikely that the video will be promoted to the “other side”–and when it is, creators like Zach Lancaster (who makes conservative news analysis content) and Robyn Holdaway (who makes queer educational content) often react by creating videos like their respective “Am I on the Wrong Side of TikTok?” and “Oh no…” TikToks, stating that they ended up on the “wrong side of TikTok” or that their video “breached containment,” in the hopes that users on the “right side” of TikTok–those who agree with their viewpoints and ideals and exist comfortably within their information bubble–will engage with the video to “bring them back.” As much as social media platforms are treated as avenues of public discourse–and, according to the New York Times, increasingly used as search engines–it seems that on TikTok, a platform which sustains a prolific subgenre of “wrong side of TikTok” videos, a culture exists in which users actively prefer to stay within their curated information bubbles rather than meaningfully engage with those who hold different points of view.

Pointing and Laughing

This “Pointing and Laughing” video is another way TikTok creators correct misinformation on the platform. Rather than making a counterargument or educating viewers on the topic at hand, TikTok creators like Clover-Lynn @hillbillygothic simply drown out the audio of the original video to make their point. In some of her videos, such as her “This guy…” TikTok, Clover-Lynn goes as far as censoring the captions on the original video. This approach is partially the result of social media companies putting the onus of regulating information accuracy on users, partially the result of privileging entertainment as a core value on social media platforms, and partially the result of users simply wanting to stay in their preferred bubbles and attract an audience within them–public discourse devolves into pointing and laughing at the “others,” from those who are spreading misinformation to those who are making disagreeable or hateful arguments to those who are simply deemed cringeworthy.

In “Stirring Up Virtual Punishment: A Case of Citizen Journalism, Authenticity and Shaming,” Agneta Mallén analyzes a particular case of citizen journalism involving a taxi driver, his customer, a passerby, and hundreds of internet commenters to point out two major weaknesses of citizen journalism: bias, and the enablement of stigmatization and punishment of the individuals involved. While Mallén specifically analyzes a work of citizen journalism that serves as a primary source–a video of an event as it happened–and points out that the poster’s framing of the video provides viewers with a bias before they have even witnessed its content, her point applies to secondary-source citizen journalism as well. In the case of videos like these–and in the case of “Um, Actually” correction videos–TikTok’s “stitch” feature only allows a few seconds of the original video to be played before content creators chime in with their response, and its “duet” feature allows creators to drown out the original video completely. Stitches and duets link back to the original video, so viewers are able to view it in its original context–but by then, the creators who stitched or duetted it have already functioned as a framing device, and viewers have already been introduced to a biased perspective on the content.

Mallén argues that while public discourse on social media has advantages like “problem solving, information sharing and provision of mutual support and empathy,” it likewise provides a scene for harassment, bullying, and humiliation–what she calls “virtual punishment.” In addition to algorithmic surveillance, content creators are surveilled by their audiences and peers, who are incentivized to punish perceived deviance for entertainment value to gain more views, more engagement, and–in the case of creators in TikTok’s Creator Fund–money. Clover-Lynn has found success in making memes about Appalachian culture and music, but the success of these videos often pales in comparison to that of her “Pointing and Laughing” videos–by far, her most viewed TikTok is her “yall hear somethin?” TikTok, at 4.2 million views. On TikTok, schadenfreude sells–it’s no wonder that TikTok creators prefer to stay within insular communities on the platform, when being exposed to “others” so easily results in public humiliation.

Edutainment

This genre of video–the “Late Night Parody”–uses humor as a framework to discuss current events, drawing on the stylings of late-night talk shows–or perhaps more accurately, parodies of late-night talk shows, like Saturday Night Live’s recurring Weekend Update segment. Citizen journalists like Ella Yurman on Going Down with Ella Yurman and Peyton Vanest in his “Google is Free” series report on current events using various humorous gimmicks to catch and maintain viewers’ attention, from guest-starring clowns to framing informative videos as episodes of a game show. As can be seen in Vanest’s “Google is Free: Trump’s First Week” TikTok, videos of the “Late Night Parody” genre often conclude on a more sober note, with creators leveling with the viewer, often to make a call to action. The quality of these videos ranges from visibly amateurish to semi-professional–the goal is only partially to appear authentic and trustworthy, and more to win viewers over with the entertainment value of their content.

In “News for (Me and) You: Exploring the Reporting Practices of Citizen Journalists on TikTok,” Chelsea Peterson-Salahuddin argues that “TikTok’s particular social media logic, as a platform for sharing memes and dances, privileged entertaining news,” noting that study participants expressed that “to gain popularity on TikTok, where users are often exposed to an endless scroll of content through the FYP, privileging entertainment as a news value was a strategy to help them attract visibility and gain a following on the platform.” This is a central issue when it comes to social media as a news source–none of these platforms are designed for the purpose of disseminating news content. They are designed to entertain, to advertise, to retain users. Citizen journalists are not just journalists–they are also entertainers. Goad Gatsby, who still posts TikToks as part of his journalistic practice, uses the platform as a way to put a more lighthearted, entertaining spin on the news: “I’m trying to present [the news] in a way that doesn’t bum people out.” This entertainment aspect can be interpreted as a strength of disseminating news content on social media platforms–in contrast, Ben Paviour, a more traditional local journalist in Richmond, Virginia, who writes for VPM and is currently a fellow with the New York Times Local Investigations Fellowship, expressed concern about the reach of an article he was writing, which he estimated to be around 3,000 words: “I just wonder who will see it.” As Peterson-Salahuddin argues, entertaining news draws viewers’ attention–and on TikTok, news content has the potential to be served to viewers who aren’t actively seeking it out. That entertainment aspect can help draw viewers’ eyes and ears to stories that are ignored or underreported by more traditional outlets. However, strongly valuing entertainment has its risks, as misinformation, disinformation, and even bullying and harassment are likewise easily played for entertainment value. On platforms designed to entertain, rather than to inform, regulating the accuracy of information is deprioritized–it’s up to viewers to determine what content is purely for entertainment’s sake, what content is genuine, what content is satirical, and what content is to be taken as authentic news.

As Nicholas Carr argues in his article “How to Fix Social Media,” social media companies have long avoided regulation by arguing that they are tech businesses, not media companies. Broadcast standards–like the public interest standards that Bogdan Belei details in “The Forgotten Public Interest Standard,” which include diversity and localism as core tenets, and which broadcast media companies have been held to for nearly a century–are not applied to social media applications. Carr argues that legislative changes should be made to redefine social media companies as media companies and likewise regulate them as such–but is further regulation the answer? Is the solution to regulate social media companies–and by extension, millions of social media users’ content–to the same standard as broadcast media professionals? Part of the strength of social media as a public forum is that anyone with a phone camera and an internet connection can report firsthand on personal experiences and the world around them, or likewise access information on other people’s personal experiences, perspectives, and observations. In Kalley Huang’s article for the New York Times, “For Gen Z, TikTok Is the New Search Engine,” 25-year-old interviewee Nailah Roberts points this out as a positive of searching for information on the platform, even for something as simple as restaurant reviews–on TikTok, “you see how the person actually felt about where they ate.” Another interviewee, 24-year-old Alexandria Kinsey, claims that TikTok’s search results “don’t seem as biased” as Google’s, and that she often wants a “different opinion” from what ads and websites optimized for Google say. While claims concerning a lack of bias may not be accurate, it seems that social media users–young ones, in particular–actively appreciate the ability to glean unregulated information firsthand from their peers.
News content is only a fraction of what users are looking for on social media–and in the case of marginalized communities, even the deregulation of news content can be a benefit of social media platforms. Do these benefits–entertainment, convenience, representation, personal connection–outweigh the dangers of misinformation?

Touching Grass

This question doesn’t have an easy answer. While social media as a platform for citizen journalism has its benefits, it likewise presents a heap of complications and drawbacks due to the nature of the deregulated, algorithmically controlled, corporatized space. Social media companies don’t want to be burdened with regulatory responsibility, and persistently offload this responsibility onto users, designing various community-based information regulation techniques rather than finding in-house solutions. On April 16, 2025, TikTok announced that it would be testing its new “footnotes” feature, essentially TikTok’s version of the Community Notes found on X (formerly Twitter) and Instagram. Once again, unvetted users are tasked with regulating information shared by their peers: all you need to contribute to TikTok’s footnotes is a valid email address, no policy violations in the past six months, at least six months of TikTok use, and to be at least 18 years old and based in the U.S. Users are open to this self-regulation, from using euphemistic language to discuss mature topics to publicly correcting their peers–but these methods are only partially effective at best, persistently blurring the line between truth, misinformation, and fiction and reinforcing users’ own biases. Introducing legislation to force social media companies to take responsibility for regulating their platforms more effectively may mitigate the proliferation of misinformation, and may even mitigate some level of harassment and “virtual punishment”–but it would at the same time dampen the benefits of social media as a platform for reporting and discussing the news, and would be antithetical to the actual desires of social media users.


Image courtesy of TikTok

In “Trapped in a Chronically Online World: MillenigenZ, and Social Media,” Faye Linda Wachs asserts that Americans are increasingly becoming “permanently” or “chronically” online, in that we are perpetually connected to the internet rather than connecting at specific times for specific purposes, and notes that “a reduction in communal ‘third spaces,’ exacerbated by the global COVID-19 pandemic, has made social media into a proxy for potential in-person experiences.” Wachs argues that MillenigenZ–an age bracket that consists of younger Millennials (born 1981-1995) and the oldest members of Gen Z (born 1996-2010)–bear a unique perspective as those who have grown up alongside the internet, compared to older generations or younger, more technologically saturated generations. She notes that the MillenigenZ participants of her study understand the problems with social media: they know they are being advertised to, they know they are being manipulated and perpetually tempted into spending too much time on social media while being entertained and/or enraged, and they express concern about how it shapes the attitudes and views of those around them–but they don’t know how to stop, or how to improve their experience. Despite their issues with social media–and the information they encounter there–young adults continue to engage with it on a near-obligatory basis, and value it as a means of communication and an avenue for connection with their peers that they wouldn’t otherwise experience.


Perhaps part of the solution is to mitigate this sense of obligation–maybe it’s time to reconsider the amount of focus and energy we direct at changing social media platforms themselves, rather than the way we think about and interact with them. While the discussion regarding legislative regulation of social media platforms continues, we could, at the same time, be building a world outside of social media that provides just as much of an opportunity for people–especially young people–to connect and engage in discourse without the trappings of algorithmic manipulation. Instead of focusing entirely on the complex task of regulating social media platforms, we could reexamine legislation that criminalizes loitering, fund and build more “third spaces” to foster in-person connection, and expand opportunities for young people to connect with nature rather than expanding deforestation of federal lands managed by the U.S. Forest Service. We could focus more on digital literacy as part of public education curricula, teaching younger generations how to interact with the content they will encounter online, from fact-checking and research to visual analysis. People want access to educational content, to be knowledgeable about current events, to connect with other people, and to be able to share what they know with others–that’s why citizen journalism on TikTok exists, and why it has an audience. But we need to understand and remember that social media wasn’t designed for this purpose–it was designed to entertain, to advertise, to maximize engagement. Rather than closing the Department of Education, as the Trump administration is currently aiming to do, we could be teaching people how to think critically about the content they consume, and in turn teach them to be responsible digital citizens.
