TikTok on Wednesday acknowledged it had erred in penalising a 17-year-old who had posted witty but incisive political videos, promising it would restore her ability to access her account on her personal device. The company’s apology – coupled with a new pledge to reevaluate its policies – still failed to satisfy the teen, Feroza Aziz, who again raised concerns that she’d been the victim of censorship by the fast-growing, Chinese-owned social-media app.
“TikTok is trying to cover up this whole mess,” she told The Washington Post. “I won’t let them get away with this.”
The saga started earlier this week, when Aziz tweeted that her profile had been temporarily suspended. She attributed the penalty to the fact that she had recently shared a satirical video urging viewers to research the harrowing conditions facing Muslims in China’s detention camps. Her comment quickly garnered widespread attention because TikTok is owned by a China-based tech conglomerate, ByteDance, though the company has recently sought to stress that its US operations are independent of Beijing’s strict censorship rules.
TikTok, however, said it had penalised her not for her comments about China but rather for a video she’d shared earlier – a short clip, posted to a different account, that included a photo of Osama bin Laden. Aziz’s video violated the company’s policies against terrorist content, TikTok said, so the company took action against her device, making any of her other accounts unavailable on it. TikTok said her videos about China did not violate its rules, had not been removed and had been viewed more than a million times.
But the video in question – a copy of which she shared with The Washington Post – was actually a comedic video about dating that the company had misinterpreted as terrorism, Aziz said.
By Wednesday evening, TikTok had reversed course: The company said it had restored her ability to access her account on her personal device. TikTok also acknowledged that her video about China had been removed for 50 minutes on Wednesday morning, which it attributed to a “human moderation error”.
“We acknowledge that at times, this process will not be perfect. Humans will sometimes make mistakes, such as the one made today in the case of @getmefamouspartthree’s video,” wrote Eric Han, the head of safety at TikTok U.S., referring to Aziz’s account.
“When those mistakes happen, however, our commitment is to quickly address and fix them, undertake trainings or make changes to reduce the risk of the same mistakes being repeated, and fully own the responsibility for our errors,” Han continued.
In doing so, TikTok for the first time offered details about the actions it has taken to police its platform: In November, the company said, it banned 2,406 devices associated with accounts that violated rules about terrorism, child exploitation or spam. It was as part of that sweep that Aziz’s own device had been banned, locking her out of her account there.
Aziz, however, said late Wednesday that she isn’t convinced.
“Do I believe they took it away because of a unrelated satirical video that was deleted on a previous deleted account of mine? Right after I finished posting a 3 part video about the Uyghurs? No,” she tweeted Wednesday.
TikTok’s policies have drawn critical attention in Washington, where investigations have begun into whether the platform presents a national security risk.
© The Washington Post 2019