Meta To Blur Adult Instagram Content For Teen Safety

Tech censorship giant Meta has announced that it will begin testing a feature that blurs Instagram images containing nudity.

The move is being made in response to complaints that scammers use such images as bait, as well as concerns that teens are being unwillingly exposed to inappropriate images that can influence them in harmful ways.

Although the Meta-owned app enjoys sky-high popularity, there are growing concerns in the United States and Europe that the platform is addictive and contributes to mental health problems in young people.

Indeed, the dopamine hits fueled by prolonged social media exposure can train the malleable brain to expect a steady chemical diet of instant gratification. Such universal exposure to repeated pleasurable distractions is unprecedented in human history, and many fear its effects are beginning to show in the form of depression and other mental health issues.

Young adults, whose brains are still in critical development stages, may be particularly vulnerable to the onslaught of attention-grabbing and perception-altering media.

Indeed, when Meta founder Mark Zuckerberg attended a Senate hearing in February, he was confronted with testimony that 35% of teenage girls had been exposed to unwanted nude images through the Instagram app. Grilled over the inappropriate images and held accountable for them, he was compelled to turn to the audience behind him and apologize for the platform's failure to rein the images in.

Meta announced that the app will use on-device machine learning to detect whether an image contains nudity. The company also said it will develop technology to help identify scammers attempting to extort users, and will send pop-up warnings to users who may have interacted with such accounts.
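Meta has not published the details of its classifier, but the announced approach resembles a standard on-device image-classification pipeline. The Python sketch below shows one plausible shape for it; the model architecture, the weights file (nudity_classifier.pt), and the threshold are hypothetical stand-ins for illustration, not Meta's actual implementation.

```python
# Minimal sketch of on-device nudity detection and blurring.
# The model file, architecture choice, and threshold are hypothetical;
# Meta has not disclosed its actual pipeline.
import torch
from torchvision import models, transforms
from PIL import Image, ImageFilter

# Hypothetical binary classifier (nudity vs. safe), small enough to run on-device.
model = models.mobilenet_v3_small(num_classes=1)
model.load_state_dict(torch.load("nudity_classifier.pt"))  # hypothetical weights
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def blur_if_nude(path: str, threshold: float = 0.5) -> Image.Image:
    """Return the image, blurred when the classifier flags likely nudity."""
    image = Image.open(path).convert("RGB")
    with torch.no_grad():
        score = torch.sigmoid(model(preprocess(image).unsqueeze(0))).item()
    if score >= threshold:
        # Heavy Gaussian blur so the content is unrecognizable until the
        # user chooses to reveal it.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

Running the check on-device, rather than on Meta's servers, means the image never has to leave the phone unblurred, which matters for end-to-end encrypted messages.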

The newly developed blurring feature will be enabled by default for users who declare themselves to be under the age of 18, and adults will be shown notifications encouraging them to turn it on as well, in order to prevent unwanted exposure to unwilling audiences.
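In policy terms, the announced behavior amounts to a simple age-gated default, sketched below. The setting and field names (declared_age, blur_nudity) are assumptions for illustration, not Meta's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserSettings:
    declared_age: int
    blur_nudity: Optional[bool] = None  # None = user has not made a choice

def apply_blur_policy(user: UserSettings) -> str:
    """Illustrates the announced default: blur on for declared minors,
    an opt-in nudge for adults who have not chosen."""
    if user.declared_age < 18:
        user.blur_nudity = True  # on by default, per Meta's announcement
        return "blur enabled by default"
    if user.blur_nudity is None:
        return "notify adult user, encouraging them to enable blurring"
    return "respect the user's existing choice"
```

Note that the gate keys off the age the user declares at signup, so its protection is only as reliable as that self-reported figure.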

Although the effort to blur images by default may be well-received, Meta's record on censorship and stifling free speech is abysmal, and some are skeptical that technology developed to protect minors may eventually be turned to more nefarious purposes, such as censoring political speech and images. For the time being, however, the move is being welcomed by many as a necessary step to help protect minors from unwanted adult content.