Image caption: TikTok’s European head of public policy Theo Bertram said the firm had to “do better”
TikTok has written to social media firms asking them to join together to remove content that depicts self-harm or suicide more quickly.
It comes after a clip of a man killing himself was widely circulated on its platform and viewed by young children.
Theo Bertram, Europe’s public policy head, said the sharing of the video suggested a co-ordinated attack, possibly from bot accounts.
He declined to discuss ongoing negotiations on the future of TikTok.
Mr Bertram was being grilled by MPs on the Digital, Culture, Media and Sport Committee, who are investigating how social media platforms deal with online harms.
They were also keen to hear more about the future of the company outside China, in the wake of President Donald Trump’s threat to ban the app in the US unless a deal is struck with American firms.
Owner ByteDance is currently in talks with Oracle and Walmart over its future, but reports suggest that China is unlikely to approve what it sees as an unfair deal.
Mr Bertram said he was not able to comment on the details of the ongoing negotiations.
“I think there are broader concerns around China and China’s role in the world. And I think that these concerns are projected on to TikTok and don’t think they are always fairly projected,” he told MPs.
When pressed on how the platform dealt with content sensitive to the Chinese government, such as protests in Hong Kong and the treatment of the Uighur Muslims, he told MPs: “TikTok is a business outside of China and is led by European management that have the same concerns and the same world view that you do and we care about our users.”
Some of those users have recently been traumatised by a clip circulating on the platform showing a US man killing himself, and Mr Bertram acknowledged that the firm had to “do better”.
Mr Bertram explained that the firm had seen a huge spike in the sharing of the clip a week after the broadcast took place on Facebook Live.
“Following an internal review, we found evidence of a co-ordinated effort by bad actors to spread this video across the internet and platforms, including TikTok.
“And we saw people searching for content in a very specific way. Frequently clicking on a profile of people as if they’re kind of anticipating that those people had uploaded a video.”
He said the firm had written to the chief executives of Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit.
“What we are proposing is that, in the same way these companies already work together around child sexual imagery and terrorist-related content, we should now establish a partnership around dealing with this type of content.”
And for TikTok itself, he promised “changes to machine learning and emergency systems”, as well as improvements to how the algorithms that detect such content work with the firm’s content moderators.
He was also asked about reports that TikTok had removed content relating to disabilities or LGBTQ issues.
He explained that “unfortunately” there had been a policy of not promoting content that might encourage bullying, which limited the reach of content from people with disabilities and LGBTQ content.
“That is no longer our policy,” he said.
He was less clear on whether the firm restricted the promotion of LGBTQ hashtags in Russia, saying: “Not as far as I’m aware… The only time we will remove that content is when we have a legal requirement to do so.”