TikTok unveils Sh300M mental health fund amid criticism

As pressure mounts on the video-sharing platform TikTok to check the spread of explicit content, hate speech, and its vulnerability to manipulation, the social media giant is countering with a $2.3 million (about KES297 million) global mental health fund targeting Kenya and other select countries in Africa.
In an announcement last week, TikTok said it will work with partners such as Mental360, a Nairobi-based, youth-focused social enterprise, to assist users in need through counselling, advice, and free psychological support services.
The funding will be directed towards supporting and developing locally relevant, evidence-based content that helps to raise awareness, fight stigma, and encourage open dialogue on mental health—a growing concern that affects one in every four people in Kenya today.
In an industry update in February this year, TikTok said that in Kenya, "99.9 percent of harmful content was removed before any users reported, with 96.4 percent taken down within 24 hours." The figures, which cover July to September last year, reflect what the company describes as its commitment to protecting users, especially younger audiences, from content that could negatively affect mental health.
According to the short-video platform, the expansion of its mental health fund to Africa is part of a raft of changes by the Chinese-owned company to enhance wellness among its growing user base.
The announcement comes at a time when mental health has become a persistent health concern in Kenya, affecting even professional content moderators on various social media platforms.
For example, in Kenya, about 200 former content moderators sued Meta, the parent company of Facebook, along with outsourcing firms Sama and Majorel, over claims of unlawful dismissal, exploitation, and psychological trauma.
The moderators, who were contracted to review disturbing content including graphic violence and child abuse, allege that they were subjected to inhumane working conditions, including inadequate mental health support, low pay, intense performance monitoring, and a culture of silence enforced through restrictive non-disclosure agreements.
Their job required them to look at horrors for hours so that the world would not have to. The 2023 lawsuit, which Meta tried unsuccessfully to block on jurisdictional grounds, was allowed to proceed by a Kenyan court, setting a precedent for holding global tech giants accountable locally.
Beyond the legal filings, the psychological toll on the moderators has been severe. A 2024 report revealed that all 144 workers surveyed showed signs of post-traumatic stress disorder, with over 80 percent reporting extremely severe symptoms.
Former moderators have spoken of working in silence through trauma, experiencing depression, suicidal thoughts, and sleep disorders. Some even reported being targeted by online actors due to the content they reviewed, especially those moderating extremist videos.
The mental anguish eventually pushed many to seek union representation, only to be dismissed or blacklisted from future opportunities when a new subcontractor, Majorel, took over Meta’s content moderation work in Nairobi.
“If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’” one of the complainants told the Associated Press in 2023.
The case has drawn international attention because it challenges the ethics of outsourcing content moderation to the Global South. It is the first known lawsuit outside the U.S. to demand accountability from a social media company over the working conditions of its moderators.
It has also catalyzed the formation of Africa’s first content moderation union and placed renewed pressure on tech firms to take responsibility for the hidden human cost of their platforms.
Under the new mental health measures announced in Johannesburg, South Africa, TikTok has expanded local in-app helplines and introduced what it calls an industry-first meditation feature, available to all TikTok users.
“In the coming weeks, users of some countries in Africa will have access to local in-app helplines that provide expert support when reporting content related to suicide, self-harm, hate, and harassment,” TikTok announced.
These measures come on top of TikTok's #MentalHealthMatters campaign, which promotes positive mental health practices. The company hopes that, together, these efforts will support balanced digital habits and give communities access to reliable information.
“TikTok is committed to user safety and community well-being and provides tools and protections to help our community enjoy their experience on the platform. But to achieve this, we all need to play a very vital role in fostering a secure and respectful environment,” noted Mercy Kimaku, TikTok’s Regional Risk Prevention Lead for Sub-Saharan Africa, on Thursday at the Digital Well-being Summit.
The inaugural Digital Well-being Summit, which brought together policymakers, mental health experts, NGOs, and industry leaders from across Sub-Saharan Africa, attracted delegates from Kenya, South Africa, Nigeria, Ghana, Ethiopia, and Zimbabwe, and sought to strengthen efforts to support and protect community well-being on the platform.
Additionally, TikTok has introduced Sleep Hours, a guided meditation feature it describes as the first of its kind in the industry. Piloted in March 2025 and now available worldwide, the in-app well-being resource is automatically enabled at 22:00 for all users under the age of 18; older users can choose to turn it on.
“People come to TikTok to learn, share their experiences, and connect with communities around the world. That’s why we’re proud to introduce tools that not only support digital well-being but also empower our community, especially young users, with a safe, supportive space to explore and navigate complex emotions,” said Valiant Richey, TikTok’s Global Head of Trust and Safety Outreach and Partnerships.
Data from the Communications Authority of Kenya shows that at least 29.2 percent of Kenyans use TikTok.