TikTok Removes Four Million ‘Violative’ Videos In September

TikTok said Wednesday that it removed four million “violative” videos in the EU in September, in its first transparency report since the bloc’s new law against unlawful and harmful content went into effect.

The Chinese-owned video-sharing network, which is popular among younger online users, also stated that it employs 6,125 people to moderate content in the European Union across all of the bloc’s national languages.

The numbers were provided as part of TikTok’s obligation to publish a transparency report every six months under the EU’s new Digital Services Act (DSA) for big online platforms.

TikTok had not previously released monthly removal data for the bloc as a whole, so the significance of the September figure will not be clear until it can be compared with future reports.

The DSA, which took effect in August, threatens very large internet platforms and search engines with fines of up to 6% of global revenue for infractions.

TikTok and 18 other platforms are subject to increased EU scrutiny because they each have at least 45 million monthly users in the EU.

Others include: Meta’s Facebook and Instagram; Alphabet’s YouTube and Google Search; X, formerly known as Twitter; Microsoft’s Bing search engine and LinkedIn; Apple’s App Store; Alibaba’s AliExpress; and Wikipedia.

The European Commission revealed last week that it has initiated investigations into TikTok and Meta, requesting further information on the steps they have taken to combat the dissemination of “illegal content and disinformation” following the Hamas attack on Israel.

TikTok stated that it had 134 million users in the European Union as of September 2023.

More work to do

It said it was “proud” of the efforts it has made so far but recognised that “we still have work to do”.

TikTok stated in its report that it “proactively” searches for content deemed illegal or harmful under its rules, first using automated systems and then supplementing them with human review as appropriate.

According to the company, the quantity removed on its own initiative was “seven times greater than the volume of violative content removed in response to a user report.”

In accordance with its DSA duties, the company has introduced a new in-app mechanism for users to report suspected illicit content.

It further stated that when it gets removal requests from EU authorities, it evaluates the content in light of its rules as well as national and EU legislation.

TikTok reported receiving 17 takedown requests from EU governments in September.

It also received 452 requests for information about users and accounts from EU governments, which it considered “on a case-by-case basis” in order to protect users’ privacy and other rights.

It stated that the typical time taken to act on a signalled video was 13 hours, saying that it takes time to evaluate legal requirements as well as factors such as freedom of expression.