TikTok removed 350,000 videos containing disinformation about the US Presidential election, the Mirror can reveal.
And the short-form video-sharing platform emerged as an unexpected target for false claims about ballot fraud and QAnon conspiracy theories.
TikTok say they applied an ‘election information banner’ on nearly 7 million clips.
And in a bid to combat Covid-19 disinformation, the site set up an “information hub”, which they say was viewed 2.6 billion times.
Some 51,505 videos were deleted for promoting Covid-19 misinformation.
The network removed 89,132,938 videos in total in the last six months of 2020, and banned 6,144,040 accounts for violating the site’s ‘Community Guidelines’.
Some 347,225 videos were removed during the campaign for “election misinformation, disinformation, or manipulated media.”
And TikTok say they worked with fact checkers to verify content and limit the distribution of unverified claims.
Videos spread through the platform ‘peer-to-peer’, with popular clips surfacing in each user’s algorithm-driven ‘For You’ feed.
A further 441,028 videos which were not deleted, but included unverified information, were made ineligible for the ‘For You’ feed.
Cormac Keenan, Head of Trust and Safety at TikTok said: “We’ll continue to listen to feedback from our community and share our progress as we work to ensure TikTok is a safe and positive space for creative expression.”
The network say they removed almost 9.5 million spam accounts in the second half of 2020.
And 3.5 million videos were removed for breaking the site’s advertising policy.
Users whose videos have been removed can appeal the decision. After review by TikTok staff, some 2.9 million videos were reinstated on appeal.