Better Laws, Not Bans: Why a TikTok Ban is a Bad Idea
There are real harms that occur on TikTok, as on any other platform. The answer to those problems lies in broadly applicable regulations: better privacy, competition, platform responsibility, and disinformation rules.
New legislation making its way through the U.S. Congress has placed a TikTok ban back on the public agenda. The app is already prohibited on government devices in Canada, the government has quietly conducted a national security review, and there are new calls to ban it altogether from the Canadian market. While it might be tempting for some politicians to jump on the bandwagon, a ban would be a mistake. There are legitimate concerns with social media companies, but there simply hasn’t been convincing evidence that TikTok currently poses a national security threat or that it presents a greater risk than any other social media service. The furor really seems to be a case of economic nationalism – a desire to deny a popular Chinese service access to the U.S. market – rather than a genuine case that TikTok poses a unique privacy and security threat. Taken at face value, however, the case against TikTok comes down to a simple concern: its owner, ByteDance, is a Chinese company that could theoretically be required to disclose user information to the Chinese government or compelled to act on its behalf. The proposed U.S. law would therefore require that TikTok be sold within six months or face a ban.
While the concerns about TikTok – given its Chinese ownership and popularity with younger demographics – are well known, the privacy and security case against it is very weak. First, the risk of government-mandated disclosures seems entirely theoretical at this stage. To date, the company says there have been no such requests, and it has worked to create a firewall from its Chinese operation through Project Texas, which it says will ensure that U.S. data stays in the U.S. It is true that the Chinese government could require disclosures, but that is true of any government. Indeed, mandated governmental disclosures have been a concern for decades: think of the B.C. government outsourcing health data to the U.S. two decades ago, the Snowden revelations in 2013, or the myriad of U.S. laws that already mandate disclosures. Rather than a ban, the solutions include blocking statutes that prohibit disclosures, retention of data within jurisdictions, and transparency requirements that mandate notification of data disclosures.
Second, the privacy and disinformation concerns are by no means unique to TikTok. All social media sites are known platforms for government-backed disinformation campaigns and raise significant privacy issues. Banning a single app doesn’t solve the issue; it only shifts those campaigns to other platforms. Disinformation is a problem whether on TikTok, Facebook, Twitter, or any other social media service. If we are serious about addressing the issue, we need broadly applicable regulations and compliance measures.
Third, banning a single social media service only strengthens the competitors and consolidates their power. For example, India banned TikTok on a permanent basis in 2021. The result? Instagram, owned by Meta, became the country’s most popular app, providing a reminder of the unintended consequences of an app ban.
Fourth, a democratic government banning TikTok seems likely to create a model that will be emulated by others to restrict speech. TikTok has already been banned in Nepal for “disrupting social harmony”, in Somalia due to explicit content, in Indonesia for blasphemy, and in Afghanistan to “prevent young persons from being misled.” This is not a model that Canada or any other democratic country should be embracing.
Fifth, TikTok is an important platform for expression. While governments have occasionally pursued restrictions on foreign government-backed broadcasters (e.g., Russia Today), TikTok is a user content site hosting expression from millions worldwide. There are real harms that occur on the platform, as on any other. The answer to those problems lies in broadly applicable regulations – better privacy, competition, platform responsibility and accountability, as well as measures to address deliberate misinformation. That notably includes the platform liability portion of the Online Harms Act. But it does not include – nor should it include – a ban based on a flimsy, largely evidence-free case.
Post originally appeared at https://www.michaelgeist.ca/2024/03/better-laws-not-bans-why-a-tiktok-ban-is-a-bad-idea/