TikTok Releases Global Transparency Report for Second Half of 2019

TikTok has released its global Transparency Report for the second half of 2019, providing insight into the requests it received from law enforcement agencies worldwide and how it handled them. The report also covers TikTok's content moderation practices, specifically the removal of videos that violate its Community Guidelines or Terms of Service.

Background

Last December, TikTok published its first Transparency Report, covering the legal requests it received and how it responded during the first half of 2019. TikTok has committed to publishing these reports regularly and to providing more information in future editions as it invests in its infrastructure, improves its reporting systems, and develops new safety policies, practices, and partnerships.

Approach to Safety

As a global platform, TikTok receives a large number of video uploads every minute, which brings a greater responsibility to ensure user safety. TikTok has dedicated teams across its markets, including a US-based safety team, to strengthen its policies, technologies, and moderation strategies. These teams collaborate closely with regulators, policymakers, and government and law enforcement agencies to promote the highest standard of user safety.

Enforcing Community Guidelines

TikTok utilizes a combination of technology and content moderation to enforce their Community Guidelines. Their systems automatically flag potentially violative content, but context is crucial in determining whether certain content violates their guidelines. Trained moderators review and remove content, including proactively removing evolving or trending violative content. Users can also report inappropriate content or accounts through the in-app reporting feature. If a violation is determined, the content is removed. TikTok educates users about safety options and controls through in-app videos and their Safety Center.

Enforcement of Community Guidelines & Terms of Service

In the second half of 2019, TikTok removed 49,247,689 videos globally, which accounted for less than 1% of all user uploads, for violating their Community Guidelines or Terms of Service. The majority of these videos were proactively caught and removed by their systems before user reports, and a significant portion of them were taken down before receiving any views. The Transparency Report provides detailed information on the markets with the highest volume of removed videos.

TikTok, like other internet platforms, receives legal requests for user information from government agencies worldwide. It carefully reviews each request for legal sufficiency and considers whether the requesting entity is authorized to gather evidence or to investigate an emergency involving imminent harm. In the second half of 2019, TikTok received 500 legal requests for information from 26 countries, and the report provides further details on its responses.

Government Requests for Content Removal

TikTok occasionally receives requests from government agencies to remove content on its platform that allegedly violates local laws. It reviews all such material against its Community Guidelines, Terms of Service, and applicable law, and takes appropriate action. If a request is not legally valid or the content does not violate TikTok's standards, the content may not be removed. In the second half of 2019, TikTok received 45 requests to remove or restrict content from government bodies in 10 countries. The report includes more specific information.

Takedowns for Infringement of Intellectual Property

TikTok prohibits content that infringes third-party intellectual property. In the second half of 2019, it evaluated 1,338 copyright takedown notices, and the report outlines its responses.

Looking Ahead

TikTok is committed to responsibly building their platform and moderating content as they continue to grow. They strive to be transparent about the content they remove and provide users with meaningful ways to control their experience, including the option to appeal if a mistake is made. They will continue to evolve their Transparency Report based on user and stakeholder feedback. Additionally, they are on track to open global Transparency Centers in Los Angeles and Washington, D.C., allowing invited experts and policymakers to witness firsthand how content is moderated on TikTok.

Their ultimate goal is to maintain TikTok as an inspiring and joyful platform for creative expression for everyone.