Meta to Hide Content from Teens about Self-Harm, Eating Disorders

The parent company of Instagram and Facebook said this week it will begin hiding content about suicide, self-harm, and eating disorders from teen accounts and placing teens in the “most restrictive content control setting.”

Meta, in a blog post, said it will “restrict teens from seeing certain types of content across Facebook and Instagram even if it’s from friends and people they follow.” Examples, Meta said, include content that discusses struggles with self-harm and eating disorders or that “includes restricted goods or nudity.”

Further, Meta said it is “automatically placing teens into the most restrictive content control setting on Instagram and Facebook.”

The changes will apply automatically, provided teens do not lie about their age when signing up.

“We already apply this [restrictive] setting for new teens when they join Instagram and Facebook and are now expanding it to teens who are already using these apps,” Meta said. “Our content recommendation controls — known as ‘Sensitive Content Control’ on Instagram and ‘Reduce’ on Facebook — make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore.”

Meta also is applying a change for everyone, teens and adults alike, on the subjects of suicide, self-harm, and eating disorders.

“While we allow people to share content discussing their own struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content, and we have been focused on ways to make it harder to find,” Meta said. “Now, when people search for terms related to suicide, self-harm, and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules, and we’re extending this protection to include more terms. This update will roll out for everyone over the coming weeks.”

Some critics, though, said the changes are coming too late.

“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram,” Josh Golin, executive director of the children’s online advocacy group Fairplay, told ABC News. “If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?”

In October, a bipartisan coalition of 33 attorneys general filed a federal lawsuit against Meta, alleging that the company “knowingly designed and deployed harmful features on Instagram, Facebook, and its other social media platforms that purposefully addict children and teens,” a summary of the lawsuit said. 

Photo Courtesy: ©iStock/Getty Images Plus/kitzcorner


Michael Foust has covered the intersection of faith and news for 20 years. His stories have appeared in Baptist Press, Christianity Today, The Christian Post, the Leaf-Chronicle, the Toronto Star, and the Knoxville News-Sentinel.

