What TikTok Can Still Do to Safeguard The Election

Selina Xu, Yvonne Ploder / Oct 29, 2024

Selina Xu is a writer and researcher in technology and geopolitics, and Yvonne Ploder works in strategy and operations in the Office of Eric Schmidt. This post was written in their personal capacity.


On the eve of the US presidential election, it appears that TikTok is still failing to keep political ads off its platform despite an explicit ban, according to a new investigation.

In September, researchers from the nonprofit Global Witness tested the political ad moderation systems of TikTok, Facebook, and YouTube by submitting ads that contained disinformation, including false warnings that citizens must pass an English language test to vote and claims that former President Donald Trump is ineligible to run for president because of his felony convictions. Of the three platforms, TikTok performed the worst, despite being the only one that prohibits political advertising altogether. It approved 50% of the ads containing false information about the election.

That’s concerning, given that a third of American adults under 30 said they regularly get news on TikTok, and nearly two-thirds of Americans between 18 and 29 use the platform—greater than the share of that age group that voted in the last cycle. With Trump and Kamala Harris both battling for the affection of 170 million users on the app, here are some ideas for TikTok to change how it deals with political content in the crucial weeks before, during, and after the election.

While these actions may not sway the federal judges presiding over TikTok’s challenge to the law passed earlier this year, which would force its Chinese owner, ByteDance, to sell the popular social media app or face a ban in the US, the platform can still win in the court of public opinion by embracing concrete actions to build trust. That may buy it time; the President will get to decide whether the January 19, 2025 deadline should be extended.

1. Limit content distribution from new accounts

In the weeks before, during, and after the election, TikTok should limit how often any new account can comment, invite, message, share, or forward content, particularly accounts demonstrating suspicious platform activity or sending a high volume of messages related to voting and elections. Nonprofits like Protect Democracy argue this could have an outsized impact: it reduces the likelihood that bad actors, whether relying on bots or humans, can supercharge content distribution.
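
To make the proposal concrete, here is a minimal sketch of an account-age-aware limit on distribution actions. It is our illustration, not TikTok’s implementation; the two-week “new account” threshold and the hourly cap are assumptions chosen for the example.

```python
# Minimal sketch (illustrative, not TikTok's implementation) of an
# account-age-aware rate limit on distribution actions during an election window.
import time
from collections import defaultdict, deque

NEW_ACCOUNT_AGE_SECONDS = 14 * 24 * 3600   # accounts under two weeks old count as "new" (assumed threshold)
NEW_ACCOUNT_HOURLY_CAP = 20                # max shares/comments/forwards per hour for new accounts (assumed cap)
WINDOW_SECONDS = 3600

_action_log: dict[str, deque] = defaultdict(deque)

def allow_action(account_id: str, account_created_at: float, now: float | None = None) -> bool:
    """Return True if a distribution action (comment, share, forward) should be allowed."""
    now = now or time.time()
    if now - account_created_at > NEW_ACCOUNT_AGE_SECONDS:
        return True  # established accounts are not throttled in this sketch
    history = _action_log[account_id]
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()  # drop actions that fell outside the sliding one-hour window
    if len(history) >= NEW_ACCOUNT_HOURLY_CAP:
        return False  # cap reached: slow distribution from this new account
    history.append(now)
    return True
```

The point of the sliding window is that a brand-new account can still participate normally, but cannot blast out hundreds of shares or comments in its first hours, which is precisely the behavior coordinated campaigns rely on.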

2. Researcher and civil society access

Imagine a viral TikTok video of a voting site catching fire in a key swing state. How many people have seen the video? Where are they based? How many reposts?

These are crucial data points that are currently unavailable to the public. For an election as pivotal as this year’s, journalists and watchdogs need real-time access to social media data. Access to such data typically comes through application programming interfaces (APIs). Today, TikTok’s API is accessible only by application and only to researchers who register as developers.

More needs to be done. Opening up free API access to journalists would be crucial during the election cycle for immediate accountability (e.g., tracking political messaging and voter targeting), since academics work on much longer timelines.
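
To show why this kind of access matters, here is a hedged sketch of how a journalist might pull reach data for videos about a polling place incident through a research-style API. It is loosely modeled on TikTok’s existing Research API, but the endpoint, field names, query format, and token handling are assumptions for illustration, not a verified interface.

```python
# Hypothetical sketch: querying a research-style API for a video's reach.
# The endpoint, fields, and query shape are illustrative assumptions,
# loosely modeled on TikTok's Research API, not a verified interface.
import requests

API_URL = "https://open.tiktokapis.com/v2/research/video/query/"  # assumed endpoint
ACCESS_TOKEN = "YOUR_RESEARCHER_ACCESS_TOKEN"  # issued after an approved application

def query_video_reach(keyword: str, start_date: str, end_date: str) -> list[dict]:
    """Fetch view, share, and region data for videos matching a keyword."""
    response = requests.post(
        API_URL,
        params={"fields": "id,view_count,share_count,region_code,create_time"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "query": {
                "and": [
                    {"operation": "IN", "field_name": "keyword", "field_values": [keyword]},
                ]
            },
            "start_date": start_date,
            "end_date": end_date,
            "max_count": 100,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", {}).get("videos", [])

# Example: how widely has a "polling place fire" video spread, and from where?
for video in query_video_reach("polling place fire", "20241101", "20241105"):
    print(video["id"], video["view_count"], video["share_count"], video["region_code"])
```

With this kind of access, the questions above, how many people saw the video, where they are based, and how many times it was reposted, become answerable in near real time rather than weeks after the fact.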

3. Ad transparency

TikTok has long banned all political ads. But scroll through the “For You” page and it’s obvious that political content abounds. In September, NBC News found 52 politically themed ads running on TikTok in apparent violation of its policies.

Often it’s unclear whether posts about Trump or Harris are sponsored by political campaigns. Creators are currently asked to self-disclose (typically with the hashtag #ad or #sponsored). Yet TikTok doesn’t appear to effectively monitor or enforce its rule that creators must disclose paid partnerships, nor does the platform proactively label sponsored posts as advertisements, especially when it comes to political messaging, according to research by the Mozilla Foundation.

This lack of transparency and lax oversight means we don’t know who is paying for the content or how it is targeted. Is it enough to ask influencers to self-disclose? No. TikTok could build a public ad library for the US, as it has done for the EU, including data about how many ads were rejected under its political ad ban and which ones. Such a library would be a valuable tool for identifying disinformation, tracing the source of viral posts, and exposing paid influence campaigns.

Social media experts we’ve spoken to have argued for TikTok to take a bolder step: allow political ads, but regulate them, instead of letting them run rampant and unmonitored. Political ads are taking novel forms and are getting harder to detect. A policy that maintains records of campaign ads would be a step in the right direction.
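
For illustration, here is a hedged sketch of the kind of record a US ad library could expose for each paid political post. The field names are our assumptions, loosely modeled on the disclosures platforms already make in EU-style ad repositories, not an existing TikTok schema.

```python
# Hypothetical schema for a public US ad-library entry; field names are
# illustrative assumptions, modeled loosely on EU-style commercial content disclosures.
from dataclasses import dataclass, field

@dataclass
class AdLibraryRecord:
    ad_id: str                      # platform identifier for the promoted post
    advertiser_name: str            # who placed the ad
    paying_entity: str              # ultimate funder, if different from the advertiser
    first_shown: str                # ISO date the ad began running
    last_shown: str                 # ISO date the ad stopped running
    total_impressions: int          # how many times the ad was shown
    targeting_criteria: list[str] = field(default_factory=list)  # e.g. age ranges, regions
    rejected_for_political_content: bool = False  # blocked under the political ad ban?
    rejection_reason: str | None = None           # stated reason, if rejected
```

Even a minimal record like this would let journalists and watchdogs answer the basic questions the current self-disclosure regime leaves open: who paid, who was targeted, and how widely the message ran.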

TikTok still needs to prove itself

You may ask: Why would TikTok do any of this? The short answer is that TikTok’s survival is at stake. No other social media platform has as much of an incentive to build trust with civil society and to prove that it is a good-faith actor in the November election. These steps would enable TikTok to distinguish itself from platforms like Meta and X. Meanwhile, the clock is ticking before the January 19 deadline. As its lawsuit plays out in court, these actions could also help persuade civil society that TikTok has embraced reasonably meaningful alternatives that address the government’s national security and data privacy concerns.

TikTok is also unique among social media platforms in its design. New accounts can go viral and spread misinformation within hours, irrespective of follower count. Election misinformation is also harder to track on the app because it spreads primarily through ordinary users rather than prominent political personalities. And while the platform does remove millions of videos and fake accounts, its filters remain easy to evade with creative spellings and newly created accounts.

So long as TikTok bills itself first and foremost as an entertainment app, its platform features will remain particularly vulnerable to misinformation, per a Harvard Kennedy School Shorenstein Center report: details about a post’s publication time and location are not clearly displayed in the mobile app; parody and comedy videos are easily conflated with fact; and content is often stripped of context because the platform encourages reposting and remixing with built-in video editing tools.

It’s about time TikTok acknowledged that it plays an indispensable role in this election. But banning TikTok won’t ensure that the election remains free and fair, or that our personal data won’t be misused. What we need is greater transparency and accountability to protect our democracy; that is the direction we, as a society, should nudge TikTok toward.

Authors

Selina Xu
Selina Xu is a writer and researcher in technology and geopolitics in the Office of Eric Schmidt. She is a former China reporter at Bloomberg News and has previously written for CNN and The Straits Times.
Yvonne Ploder
Yvonne Ploder works in strategy and operations at the intersection of tech and philanthropy in the Office of Eric Schmidt. She has worked at the Boston Consulting Group and Uber and has previously written for USA Today.
