TikTok set to reduce misinformation on its platform

PARIS, FRANCE - MARCH 05: In this photo illustration, the logo of the social media application TikTok is displayed on the screen of an iPhone on March 05, 2019 in Paris, France. The network was fined $5.7 million for violating U.S. children's online privacy rules (COPPA) by collecting data from minors that it should not have asked for. TikTok, known as Douyin in China, is an app for creating and sharing short videos. Owned by ByteDance, TikTok is a leading video platform in Asia, the United States, and other parts of the world; in October 2018 it became the most downloaded app in the U.S. (Photo by Chesnot/Getty Images)
TikTok is taking steps to reduce misinformation by removing videos identified as spreading false information.

TikTok, a video-sharing social networking service, announced that it will remove videos that fact-checkers confirm are spreading false information, and limit the reach of videos whose accuracy cannot be conclusively verified.

This feature is launching in the United States and Canada but will become available globally in the “coming weeks.”

TikTok partners with fact-checkers at PolitiFact, Lead Stories, and SciVerify to help assess the accuracy of content. If a fact check confirms that content is false, it is removed from the platform.

The company noted that fact checks are sometimes inconclusive, or content cannot be confirmed, especially during unfolding events.

In such cases, the video may become ineligible for recommendation into anyone's For You feed, limiting the spread of potentially misleading information.

Users will be notified when a video they attempt to share has been flagged as unsubstantiated, in an effort to reduce sharing. The video's creator will also be notified that their video was flagged.

To verify a video, TikTok's internal investigation and moderation team first works to check it against readily available information, such as existing public fact checks. If it can't reach a determination, the video is sent to a fact-checking partner. If the fact check determines the content is false or violates TikTok's misinformation policy, the video is removed.
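The review flow described above can be sketched as a simple decision function. This is an illustrative model only, not TikTok's actual implementation; the `internal_check` and `fact_check_partner` callables and the `Verdict` labels are hypothetical names chosen for the example:

```python
from enum import Enum

class Verdict(Enum):
    FALSE = "false"              # confirmed false or policy-violating
    INCONCLUSIVE = "inconclusive"  # could not be confirmed either way
    VERIFIED = "verified"        # checks out

def moderate(video, internal_check, fact_check_partner):
    """Hypothetical sketch of the moderation flow described in the article.

    `internal_check` returns a Verdict, or None if the internal team
    can't decide from public fact checks; `fact_check_partner` is the
    escalation step (e.g. an external fact-checking organization).
    """
    verdict = internal_check(video)           # try existing public fact checks first
    if verdict is None:                       # internal team couldn't determine
        verdict = fact_check_partner(video)   # escalate to a fact-checking partner
    if verdict == Verdict.FALSE:
        return "remove"                       # confirmed-false content comes down
    if verdict == Verdict.INCONCLUSIVE:
        # banner shown, ineligible for For You, viewer and creator notified
        return "limit"
    return "allow"
```

A confirmed-false video maps to removal, an inconclusive one to limited distribution with a warning banner, and a verified one stays up unchanged.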

A viewer who comes across one of these flagged videos will see a banner that says the content has been reviewed but can’t be conclusively validated.
