
Community guidelines violations: TikTok removes 11.7m videos from Pakistan

By Aimen Siddiqui
July 22, 2023

KARACHI: TikTok, a popular video-sharing application, removed 11,707,020 (11.7 million) videos uploaded to its platform from Pakistan for violating its community guidelines in the first quarter of 2023 (January-March), according to its quarterly ‘Community Guidelines Enforcement Report’ released on June 30, 2023.

Even though this makes Pakistan the country with the largest volume of removed videos among the 50 countries listed in the report, the figure marks a 7.3 percent improvement over the previous quarter, when around 12,628,267 videos from Pakistan were removed in the final three months of 2022.

The US ranks second with more than 9.6 million videos removed. Brazil comes third with 3.2 million removed videos, followed by Saudi Arabia with 3.1 million.

According to the report, the quick removal of harmful content enables TikTok to maintain a safe and welcoming environment for its community. The company also employs a proactive removal mechanism, identifying and removing videos before they are reported.

In Pakistan, TikTok had a proactive removal rate of 98.8 percent during the quarter. Around 92.2 percent of the videos were removed within 24 hours of being posted, and 83 percent were removed before receiving any views.

Globally, TikTok removed a total of 91,003,510 videos, representing 0.6 percent of all videos uploaded to the platform. Around 53,494,911 of these were removed through automated systems, and 6,209,835 videos were restored after review.

The report says that 30 percent of the removed videos violated TikTok’s minor safety policy, while around 27.2 percent were related to illegal activities and regulated goods. Over nine percent of videos were removed for violent and graphic content. Under the minor safety policy, 55.2 percent of videos were removed for nudity and sexual activity involving minors; 2.8 percent were flagged for ‘grooming behaviour’, and 2.3 percent for sexual exploitation of minors.

In a move that underscores the video-sharing app’s commitment to the safety of young users, TikTok removed 16,947,484 accounts globally that were suspected to belong to users under the age of 13. The platform also removed 51,298,135 fake accounts to prevent fraudulent activity; the report notes that 21.2 percent of removals under the illegal activities and regulated goods policy reportedly involved frauds and scams.

TikTok says it also proactively targeted spam accounts and associated content, taking preventive measures to curb the creation of automated spam accounts.

In its report, the company says, “We remain vigilant in our efforts to safeguard the platform from adversarial threats, including the presence of inauthentic or fake accounts and engagement. These threats persistently probe and attack our systems, leading to occasional fluctuations in the reported metrics within these areas.”