Child Accounts on Instagram Will Be Safer, Here’s What Meta Is Doing
Casal dels Infants – Child accounts on Instagram are now a top priority for Meta as the company steps up efforts to improve safety across its platform. Meta has announced a series of new measures aimed at preventing misuse by suspicious adults and creating a safer digital environment for children.
Adult-run accounts that frequently post content featuring children, such as those managed by parents, child actors’ agents, or young influencers, will now be automatically placed under the strictest messaging settings. This measure is designed to block unwanted messages, including inappropriate requests or sexually suggestive comments.
Additionally, the Hidden Words feature will be automatically enabled to filter out offensive, harassing, or inappropriate comments.
Meta will also prevent adult accounts that teens have previously blocked from discovering accounts featuring children. Instagram will likewise stop recommending those adult accounts in search results and user suggestions.
Although most people create child-focused accounts with good intentions, Meta acknowledges that bad actors sometimes misuse them. Examples of violations include leaving sexual comments or requesting inappropriate photos via direct messages.
Meta will send in-app notifications to affected accounts, informing them about the updated safety settings. These alerts will appear at the top of users’ feeds and encourage them to review their account privacy settings.
This move will significantly impact family vloggers and parents who manage child accounts on Instagram, including those who run accounts on behalf of their children.
Meta is also rolling out additional safety features for teen accounts. Instagram now shows teen users safety tips when they interact with new profiles, including details such as the account’s creation date. If an account seems suspicious, teens will be able to immediately block or report it.
As of June 2025, Instagram reports that teens have blocked over 1 million problematic accounts and reported 1 million more.
Meta removed nearly 135,000 Instagram accounts that shared sexual content involving children. The company also took down another 500,000 linked accounts across both Instagram and Facebook.
The company also reaffirmed that 99% of users, including teens, continue to keep the nudity protection filter enabled. Furthermore, recipients have chosen not to open over 40% of blurred images sent via DM. This suggests that the protection features are working effectively.