
TikTok Journalism: The Good, The Bad, The Ugly

Citizen journalistic content on TikTok isn't all bad – but it's not all good, either.

A New Journalistic Wave

In this genre of short-form journalistic content–the “new wave broadcast”–the journalist draws on the style of traditional broadcast journalism, functioning as a familiar "talking head" figure who reports the news seriously and semi-formally. Some professional journalists who report independently on social media platforms, like Erin Reed, incorporate this style into their content. In fact, some traditional news outlets, like the Baltimore Banner, have begun to create their own short-form video content as part of their reporting practice–in the Banner’s case, through a “News in Brief” section on its website. As seen in Erin Reed’s “Trump's trans passport ban” video on TikTok, the journalist delivers the news confidently and with a relatively neutral affect, regardless of the actual neutrality of the content. These journalists often cite sources both verbally and visually, including screenshots within the video to back up their claims and frequently incorporating additional video clips to lend further credibility. As is typical of this style, citizen journalist Indie P. Jones provides links to her sources–in this case, through a link in her TikTok bio. This combination of an authoritative speaker, higher-than-average production value, and reference to outside sources creates an air of trustworthiness and authenticity–one that may, ironically, discourage users from exiting the app and fact-checking those sources for themselves. While this type of content bears some similarities to traditional broadcast news in its visual presentation and source transparency, its differences go beyond who is creating and fact-checking the content. As Victoria Carmichael et al. argue in “Media Coverage of Mental Illness: A Comparison of Citizen Journalism vs. Professional Journalism Portrayals,” traditional journalistic gatekeeping tends to prioritize sensationalism around topics like mental illness, whereas citizen journalists tend to prioritize education and destigmatization.

Even as citizen journalists prioritize entertaining content in hopes of garnering more engagement, as detailed by Chelsea Peterson-Salahuddin in “News for (Me and) You: Exploring the Reporting Practices of Citizen Journalists on TikTok,” their content remains more positive overall in tone and focus than similar stories aired on traditional news broadcasts. This has the potential to make amateur journalistic content more appealing to viewers–especially viewers who are diagnosed with mental illnesses themselves. This focus on positivity and education can bring tangible benefits to communities that are sensationalized by traditional outlets. Aja Romano argues in their Vox article “Trans People Deserve Better Journalism” that mass media journalists fail to thoughtfully and accurately report on conversations pertinent to trans people on a wide scale, positing that mainstream publications tend to overwhelmingly misconstrue transgender issues as a semantic debate rather than a human rights battle. As Goad Gatsby, citizen journalist and activist turned RVA Mag writer, put it: “There’s an entire market for discussing the rare exception of teenagers who get gender affirming care, without the voices of trans people…having trans voices on the ground reporting is very important.” While similar issues can and do occur within citizen journalism, there is likewise an opportunity for transgender people to speak for and report on themselves–from personal narratives to more classic, researched, talking-head style content like the above–and for this content to reach viewers outside the transgender community, providing the public with a more well-rounded pool of information and a more positive perspective than what is sensationalized by traditional outlets. This is among the greater goods of citizen journalism: the potential for underrepresented communities to foster understanding in the public consciousness and to represent themselves.

[Image: one of the featured reviews of the TikTok app on the Google Play Store as of April 16, 2025]

Sound the Alarm!

In this genre of short-form journalistic content–in which creators seem to “raise the alarm”–the creator is usually not a professional journalist, or even a person claiming to be a citizen journalist, but simply a citizen who hopes to inform other citizens. These videos tend to be informal and generally appear spontaneous, with little thought put into lighting, framing, background, or the appearance of neutrality. Often, citizen journalists like TikTok creator Georgie @soupytime (who has built an audience making a variety of content, only part of which relates to current events) address viewers in a friendly, personable manner and record in a personal space, such as a bedroom or the interior of a personal vehicle. In their study “Alternative Health and Conventional Medicine Discourse About Cancer on TikTok: Computer Vision Analysis of TikTok Videos,” Roxana Mika Muenster et al. argue that the visual language of TikTok “micro-narratives” contributes heavily to a viewer’s perception of the information presented. While the focus of that study was on cancer information and misinformation videos, the same concept applies readily to this genre of content: citizen journalists recording in deeply personal spaces, as Georgie does in her “get angry” video regarding the proposed TikTok ban, can create a friendly and even intimate rapport with viewers, as though the creators were simply venting over a video call to a friend–the viewer. Though it may be unintentional, this intimacy can inspire a bias toward the citizen journalist as a trustworthy figure, and it can be leveraged as a subtly persuasive tactic–especially by creators like Georgie, who have built an audience by posting about their everyday lives and marketing their personalities.

Additionally, the citizen journalist generally does not shy away from showing emotion or making emotional appeals, and they tend to be overtly biased about the information they report. This content is generally better categorized as news analysis or opinion rather than straightforward reporting. Often, a call to action is included, though the direction of that action tends to be somewhat vague–"stay connected" and "speak up" being typical examples. Sources are generally not cited, or are at most offhandedly referenced. Occasionally, citizen journalists will encourage viewers to look up information for themselves, or provide a specific phrase or incident to search for. Between curated intimacy, uncited information, and vague calls to action, this genre raises the question of how citizen journalists should be vetted, if at all. While some social media companies have made efforts toward vetting the authenticity of information–often through the community, as is the case with X (formerly Twitter) and Instagram, both of which have launched or are testing Community Notes features that allow users to append corrections to posts containing misinformation (and, in Instagram’s case, working with third-party fact-checkers to likewise add notes to posts, as seen in this CGI video of ball lightning posted to Instagram Reels)–little effort is put into vetting content creators themselves. Just as social media companies place the responsibility of vetting and correcting information shared on their platforms on users, the burden of vetting the trustworthiness of content creators falls to their audiences–and sometimes, to the creators themselves. In a direct follow-up to her “get angry” video, Georgie posted another video in which she reminded her audience that she holds a creative writing degree–not a journalistic or political one–and urged them to seek out other, better-informed sources, going so far as to provide a collection of her personal favorites. As Leon Cvrtila argues in “Truth Politics and Social Media: Towards a Foucauldian Approach,” social media companies do not care to be arbiters of truth and in fact profit from the proliferation of misinformation on their platforms, which incentivizes them to allow misinformation to continue to spread. If social media companies are unwilling–or unable–to verify the truthfulness and accuracy of the content they host, and the responsibility falls to users instead, then those users need to know how to interrogate the accuracy of the information they encounter, how to interpret every facet of information presented in video content–textual, visual, and tonal–and they need to be willing to put in that effort in the first place.

Blowing Bubbles

This video falls into a very similar (if not the same) “raise the alarm” style as the previous example of citizen journalism: the video is informal, the creator is in a personal space, the content is overtly biased, and the citizen journalist again openly makes emotional appeals. A throughline between these two videos is the “think of the children” rhetoric used to persuade viewers to share the creator’s viewpoint on each issue–though in this case, the citizen journalist is a primary source themselves, having been one of the children they implore you to consider. Unlike the previous video, this one is approached as more of a personal narrative than a researched argument–though a call to action is still present. In Chelsea Peterson-Salahuddin’s study “News for (Me and) You: Exploring the Reporting Practices of Citizen Journalists on TikTok,” several participants reported that they selected which news stories to share based on personal interest, for reasons ranging from passion for a subject garnering more views to feeling more comfortable reporting in their areas of expertise. Peterson-Salahuddin argues that “the focus on personalization enabled by the communicative social media logic of the FYP [For You Page] influenced participants to focus on news that personally interests them in the hopes the FYP would deliver it to similarly interested audiences.” This reflects what some call the information “bubble” seemingly created by social media algorithms, which appear to promote content similar to whatever users have already interacted with. Though nobody but the companies that own them has access to the internal mechanisms of these algorithms, the prevailing theory of the information cycle on platforms like TikTok goes something like this: creators choose stories based on personal interest; the algorithm promotes that content to users who are likely to share that interest; and, if a user engages with the post, the algorithm promotes more content related to that interest.
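
To make the shape of that loop concrete, here is a minimal, purely hypothetical sketch. TikTok’s actual recommendation system is proprietary and unknown–every name and mechanism below (InterestProfile, rank_feed, topic tags) is an assumption for illustration only–but even a naive engagement-based ranker like this would reproduce the cycle creators describe:

```python
# Hypothetical sketch of an engagement-driven feedback loop.
# TikTok's real ranking system is private; nothing here reflects its actual design.
from collections import Counter

class InterestProfile:
    """Tracks how often a user has engaged with each topic."""
    def __init__(self) -> None:
        self.engagement: Counter[str] = Counter()

    def record_engagement(self, topic: str) -> None:
        self.engagement[topic] += 1

    def score(self, topic: str) -> float:
        # More prior engagement with a topic -> higher rank for similar content.
        total = sum(self.engagement.values()) or 1
        return self.engagement[topic] / total

def rank_feed(videos: list[dict], profile: InterestProfile) -> list[dict]:
    # Videos on already-engaged topics float to the top, so each engagement
    # narrows what is shown next: the bubble insulating itself.
    return sorted(videos, key=lambda v: profile.score(v["topic"]), reverse=True)

# One turn of the cycle: the user engages once, and the feed reorders itself.
profile = InterestProfile()
profile.record_engagement("tiktok-ban")
feed = rank_feed(
    [{"topic": "gardening"}, {"topic": "tiktok-ban"}, {"topic": "tiktok-ban"}],
    profile,
)
print([v["topic"] for v in feed])  # ['tiktok-ban', 'tiktok-ban', 'gardening']
```

Under this simplified model, a single like is enough to reorder the entire feed–which is exactly the self-reinforcing dynamic the next section explores.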

This bubble shoehorns users into a selective pool of information–one where they are unlikely to encounter counterarguments, dissenting points of view, or the full spectrum of information on a topic. Despite the similarities in topic and style between these two videos, the fact that they present differing viewpoints on a deeply politicized topic may be enough for them to be pushed to entirely separate audiences based on users’ political leanings. In their study “The illusory certainty: Information repetition and impressions of truth enhance subjective confidence in validity judgments independently of the factual truth,” Annika Stump et al. found that repetition affects not only whether information is perceived as truthful, but also an individual’s confidence in that judgment, regardless of the information’s actual truth. Their findings further suggest that shorter intervals between repetitions had a greater impact on truth judgments than longer ones. If citizen journalists’ theory that the algorithm is largely interest-based–with interest measured by engagement–holds true, then being shown multiple videos within a short timespan that align with one’s personal interests and political views and conform to preexisting beliefs has a real-world impact on what information viewers perceive to be truthful–and what they perceive to be misinformation. It’s a bubble of continual confirmation bias that begins as soon as a user creates an account.

In the process of researching this project, I created a handful of shell accounts on TikTok to find examples of the content I’d be recreating without the bias of my personally curated account and its associated algorithm. To minimize political bias in the content I would find, I intentionally engaged with left-leaning content on some accounts and right-leaning content on others–and found that when I used my real birth year (2004), it was difficult to curate a For You Page that didn’t end up circling back to progressive content regarding transgender people. When I made a new account using my mother’s birth year (1967), the issue immediately disappeared. This evidence is anecdotal, of course–but it suggests that TikTok’s algorithm is most likely attempting to “guess” what content users will be interested in based on account information (like age) from the very beginning–and the longer a user sticks around, the more time there is to insulate the information bubble. As one participant in Peterson-Salahuddin’s study, Indie P. Jones, put it: “Once you get put into a certain TikTok algorithm, it doesn’t like you to break out of it.” These information bubbles are a major drawback of social media as a news-gathering source–escaping one is not as simple as changing the channel from Fox News to PBS, or vice versa. In my (admittedly short-lived) case, I had to misrepresent my age to find content from the “other side” of the political spectrum from what is statistically typical for a college student, as reported by the Pew Research Center.

[Image: one of the featured reviews of the TikTok app on the Google Play Store as of April 16, 2025]

Foucault and Self-Regulation

A lasting staple of citizen journalistic content is inconsistent self-censorship: many citizen journalists, like TikTok user @traumabare in their video “The AFD sent fake plane tickets…” and Indie P. Jones in her video on a GOP congressman who traveled to Uganda to support a “Kill the Gays” bill (posted to her backup account), do not shy away from cursing, but will not use words like "suicide," "die," "kill," "rape," "gun," or even phrases like "white supremacist," in order to avoid algorithmic suppression of their video or account. Substitutes for these words have become relatively standardized, with "grape" often replacing "rape," "pew pew" replacing "gun," and "unalive"/"self-unalive" replacing "die," "kill," and "suicide." In @traumabare’s video, additional examples include “not-see” in place of “Nazi” and “mustache man” as a euphemism for Adolf Hitler. The goal of these videos is to inform others, and as Chelsea Peterson-Salahuddin discusses in her study “News for (Me and) You: Exploring the Reporting Practices of Citizen Journalists on TikTok,” a pervasive theory among citizen journalists on platforms like TikTok is that using these words will trip content filters and result in the "shadowbanning" of their account–a term for an account that is not actually banned, but whose content is suppressed or no longer promoted by the algorithm, and therefore never pushed onto viewers' feeds. This self-censorship in the name of avoiding suppression can cheapen the very content of the videos, as euphemisms–especially those that border on comical–sanitize the reality of the mature topics they refer to. This benefits social media companies, which remain more app-store- and advertiser-friendly by disallowing “mature” content, and disadvantages users who discuss “mature” topics–including ones that routinely appear in the news.
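
If the folk theory is right, the mechanism being evaded is not sophisticated. The sketch below is purely illustrative–the flagged-term list, the euphemism map, and both functions are assumptions, since TikTok’s actual moderation systems are not public–but it shows the kind of naive word filter that these standardized substitutions are designed to slip past:

```python
# Hypothetical sketch of a naive keyword filter and the self-censorship it
# incentivizes. TikTok's real moderation is not public; this is illustration only.

FLAGGED_TERMS = {"suicide", "die", "kill", "rape", "gun"}

# The standardized substitutions described above.
EUPHEMISMS = {
    "rape": "grape",
    "gun": "pew pew",
    "die": "unalive",
    "kill": "unalive",
    "suicide": "self-unalive",
}

def is_suppressed(caption: str) -> bool:
    """Return True if any flagged term appears as a word in the caption."""
    return any(word in FLAGGED_TERMS for word in caption.lower().split())

def self_censor(caption: str) -> str:
    """Swap each flagged word for its standardized euphemism."""
    return " ".join(EUPHEMISMS.get(word, word) for word in caption.lower().split())

print(is_suppressed("new report on gun violence"))  # True: would be buried
print(self_censor("new report on gun violence"))    # new report on pew pew violence
```

Note what a filter like this cannot do: it has no sense of context, so a news report on gun violence and a video glorifying it are treated identically–which is precisely why creators who report on mature topics feel forced into euphemism.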


In Michel Foucault’s Discipline and Punish, Foucault dedicates a chapter to analyzing surveillance and discipline as tools for generating power and control, using Jeremy Bentham’s panopticon as a framing device. The panopticon is a circular prison constructed around a single guard tower in the center, which prisoners cannot see into but guards can see out of–in theory, the tower need only be staffed by one guard, as the constant threat that the guard might be looking is enough incentive for prisoners to perpetually self-regulate. In Foucault’s eyes, panopticism is a perfection of orderly, individualizing social control: the trappings of constant visibility create a perfect method of control in which the actual exercise of power is rendered unnecessary through the self-regulation of the prisoner. Regardless of how social media sites’ algorithms or content filters actually work, users know that they exist and have collectively agreed upon a set of rules under which they believe those systems function, and that belief is enough to compel some degree of self-regulation–in this case, self-censorship. In Georgie’s video on the proposed TikTok ban, she argues that banning TikTok is a violation of free speech–but speech on TikTok is not entirely “free,” as users believe they will be punished for using certain terms. Citizen journalists’ folk theories about the inner workings of social media algorithms cast “the algorithm” as the guard in the panopticon’s tower–except that the algorithm is not a human guard who might look away, but an automated system that never stops running, and so users have no doubt that it is watching.
