Teen restrictions expand on Instagram, Facebook, and Messenger

Meta is broadening Instagram restrictions that prevent minors from being exposed to inappropriate contact and content, bringing those protections to Facebook and Messenger and adding new limits for teenage Instagram users.

Facebook and Messenger Teen Accounts are rolling out in the US, UK, Australia, and Canada starting today, with other regions coming “soon,” according to Meta. The company hasn’t specified which protections will be applied, but if the rollout matches Instagram’s Teen Account launch, the changes will automatically apply to new and existing accounts for users under the age of 18. Older teens can disable these protections, but those under 16 will need to request parental permission via supervisory tools to make any changes.

Current Teen Account protections include restrictions on messaging and interactions with strangers, and tighter controls on the sensitive content teens can see. Teens are also encouraged to spend more time offline with 60-minute time limit reminders and a sleep mode that mutes notifications between 10PM and 7AM. These features are part of Meta’s efforts to address criticisms around child safety on its platforms. Facebook and Instagram are currently under EU investigation over concerns about safeguarding minors, and a separate US lawsuit filed in 2023 accused Meta of creating a “marketplace for predators in search of children.”

Meta is also adding more protections to Teen Accounts on Instagram “in the next couple of months” that will prevent minors from starting a live broadcast or disabling a feature that blurs images in DMs when nudity is detected. Teens under 16 will need parental permission to change or remove these new Instagram restrictions, which are designed to limit contact between children and strangers on the platform.