Meta is introducing tighter controls for teenagers using Messenger in a move aimed at improving online safety for users under 18. Under the latest update, teen accounts will automatically receive stricter privacy settings along with additional parental control options, according to the company.
This change comes as scrutiny intensifies on how tech firms handle minors’ data. Regulatory bodies across the US and Europe are demanding firmer safeguards, pushing platforms like Meta to respond with stronger policies. “We’re committed to ensuring younger users have safer and more private experiences across our platforms,” Meta stated in a press release.
The focus on youth safety isn’t unique to Meta. TikTok, for instance, has dealt with mounting criticism over harmful content aimed at young users, including incidents in Kenya that triggered national discussions and regulatory hearings. As content moderation lapses draw global attention, tech giants are being forced to act decisively—or face reputational damage and legal backlash.
Local context matters: Africa’s digital safety gap
While Meta’s policy update is a positive step, its impact in Africa remains uncertain. Across many African countries, digital literacy is low, and regulatory frameworks for child protection online are still evolving. Even in nations like Nigeria, where the Child Rights Act exists, enforcement remains weak—especially when it comes to internet use.
A 2019 attempt by Nigerian lawmakers to introduce a bill targeting online child pornography signaled growing awareness of the issue, but gaps persist. Most African children access social media through shared or borrowed devices, making age verification tricky. In these contexts, even the best tech policies fall short without strong local support systems.
“Big tech needs to think beyond tools and features—they must also engage with community structures and raise awareness at ground level,” an internet safety advocate in Lagos said during a recent workshop.