Instagram, through its parent company Facebook, has announced that it will introduce a way to nudge teens away from harmful content and encourage them to “take a break” from using the app altogether.
Facebook and Instagram are under intense scrutiny after a bombshell report and detailed testimony from a whistleblower painted a picture of a company that does not seem capable of putting the safety of its users over profits. Whistleblower Frances Haugen’s statements to legislators were particularly impactful, and while the company initially responded to Haugen with attempts to discredit her, it now seems to be shifting gears in an attempt to repair its damaged public image.
Facebook’s Vice President of Global Affairs Nick Clegg said that the company would introduce new measures on its apps to steer teens away from harmful and toxic content and even encourage them to stop using the apps altogether for a time.
“We’re going to introduce something which I think will make a considerable difference, which is where our systems see that the teenager is looking at the same content over and over again and it’s content which may not be conducive to their well-being, we will nudge them to look at other content,” Clegg says.
Clegg did not specify how this “nudge” would work, how the company would determine what content was not conducive to the well-being of its users, or whether it would adjust what content it showed in addition to directing young users to look at other content on its apps. Facebook’s internal studies have shown that, left to its own devices, the algorithm would begin showing a new user harmful content in less than a week.
Additionally, Clegg said that the company would be introducing a feature it calls “take a break,” which would prompt teens to stop using Instagram and close the app. Clegg did not go into detail on how the feature would work, but it is likely to appear as a dismissable pop-up like other notifications that Instagram has added to its app in the past. It is not clear how easy the prompt would be to ignore, or whether a notification like this would have a meaningful impact on the mental health of its users.
That said, Facebook and Instagram are likely leaning on these measures as a form of damage control, hoping they may change some minds about the company and its willingness to protect users. They are unlikely to sway everyone: Senator Amy Klobuchar, who chairs the Judiciary Committee’s antitrust subcommittee, has argued for more regulation of technology companies like Facebook in the past and is likely to do so again in this case.
“I’m just tired of hearing ‘trust us’, and it’s time to protect those moms and dads that have been struggling with their kids getting addicted to the platform and been exposed to all kinds of bad stuff,” Klobuchar said on CNN on Sunday.
Clegg also expressed openness to the idea of letting regulators access the Facebook algorithms that determine what content is displayed to users on both the parent app and Instagram, something the company has been reluctant to allow in the past.
Image credits: Header photo licensed via Depositphotos.
Originally posted on PetaPixel.