TikTok, the video-sharing social networking service, announced a new effort to reduce misinformation: it will flag videos whose fact checks come back inconclusive and remove videos identified as spreading false information.
The feature is launching first in the United States and Canada and will roll out globally in the "coming weeks."
TikTok partners with fact-checkers at PolitiFact, Lead Stories, and Science Feedback to help assess the accuracy of content. If a fact check confirms that content is false, the video will be removed from the platform.
The company noted that fact checks are sometimes inconclusive, or content can't be confirmed, especially during unfolding events. In such cases, a video may become ineligible for recommendation into anyone's For You feed, limiting the spread of potentially misleading information.
To discourage sharing, users will be notified when a video they are watching contains unsubstantiated content. The video's creator will also be notified that their video was flagged.
When a video is flagged, TikTok's internal investigation and moderation team first works to verify it using readily available information, such as existing public fact checks. If the team can't reach a conclusion, the video is sent to a fact-checking partner. If the fact check determines the content is false or violates TikTok's misinformation policy, the video is removed.
A viewer who comes across one of these flagged videos will see a banner stating that the content has been reviewed but can't be conclusively validated.