Meta launches new tools for teens to find and remove intimate and exploitive images

Meta, the company behind Facebook and Instagram, is launching a first-of-its-kind platform called Take It Down to allow minors to find and remove intimate images that were released without their knowledge or consent.

Meta doesn’t allow the sharing or spread of child exploitation material, which can cause unimaginable trauma for victims.

Now across Meta’s apps – Facebook and Instagram – there are ways for this exploitive content to be taken down.

This platform can be used by people under 18, parents and trusted adults acting on behalf of a young person, and adults concerned about images taken of them when they were younger than 18.

Minors, parents and trusted adults can now use the Take It Down platform and receive help from the National Center for Missing & Exploited Children (NCMEC) to remove concerning images and prevent them from being shared.

The new platform was built with privacy and safety at the forefront.

With that in mind, when Australians use Take It Down it will generate a unique case number and hashes so users can report the images without having to download them to their own device and upload them to NCMEC.

Instead, only the hashes assigned to that case number are uploaded, keeping the images themselves private while still making matches findable.

It also ensures that any online match results in those images being blocked.
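The reporting flow described above can be sketched in a few lines. This is a simplified illustration only: it fingerprints a file with a standard SHA-256 hash, whereas the real Take It Down system uses proprietary perceptual image-hashing, and the function and field names here are hypothetical.

```python
import hashlib


def fingerprint_image(path: str, case_id: str) -> dict:
    """Compute a local fingerprint of an image file.

    Only this hash (never the image itself) would be submitted for
    matching -- a stand-in for the perceptual hashing the actual
    service uses, which is not public. `case_id` is the unique case
    number assigned when a report is created (hypothetical name).
    """
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # The record sent for matching contains the hash and case number,
    # so the image never leaves the user's device.
    return {"case_id": case_id, "hash": digest}
```

The key design point is that hashing happens locally, so the sensitive image is never transmitted; platforms can still compare incoming uploads against the submitted hashes to detect and block matches.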

Take It Down is already operating on Facebook and Instagram, so the hashes can be used to scan for and flag the inappropriate material at any time, making it easier to take down.

Meta already prohibits the posting of intimate images and sextortion material.

Facebook and Instagram can also restrict suspicious adults from attempting to access the material and prevent them from connecting with teenagers on the platforms.

The apps already have supervision tools and age-verification technology, along with resources that warn teens about the dangers and potential harms of taking intimate images.