Instagram’s algorithm officially listed as the cause of death in a court case in the UK- Technology News, Firstpost

Meta-owned Instagram has often faced the allegation that the platform is detrimental to the mental health of many young adults and teenagers, and that it doesn’t do enough to keep certain kinds of posts out of the feeds of users in vulnerable age brackets.

Now, Instagram has officially been cited by a coroner as a contributing cause of death in a case involving a 14-year-old girl named Molly Russell, who died in 2017.

One of the key areas the inquest focused on is the fact that Molly viewed thousands of posts promoting self-harm on platforms like Instagram and Pinterest before taking her own life.

Andrew Walker, the coroner presiding over the inquest, at one point described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”

In his ruling, Walker concluded that Russell’s death could not be recorded as a suicide. Instead, he described her cause of death as “an act of self-harm whilst suffering from depression and the negative effects of online content.”

Walker based his conclusion on Russell’s prolific use of the platforms: she liked, shared, or saved some 16,300 Instagram posts in the six months before her death, along with around 5,793 Pinterest pins over the same period. Combined with the way the platforms served her ever more of the same content, he found, this contributed to and worsened her depressive state.

“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.

To get users to spend more time in their apps, platforms like Instagram and Pinterest curate each user’s feed so that it mostly shows things the user has displayed even a little interest in. That interest is inferred from engagement signals: time spent on a post, whether the post was liked or saved, whether it drew comments, and so on. The ranking does not take into account the nature of the post, nor the age of the user interacting with it. This, advocates have argued, is one of the biggest areas where content moderation has failed users.
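The failure mode described above can be illustrated with a minimal, hypothetical sketch of engagement-based ranking. The signal names and weights below are illustrative assumptions, not Instagram’s or Pinterest’s actual systems; the point is that nothing in the scoring inspects the subject matter of a post or the age of the user.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

@dataclass
class Engagement:
    seconds_viewed: float  # dwell time on the post
    liked: bool
    saved: bool
    commented: bool

def engagement_score(e: Engagement) -> float:
    """Combine engagement signals into a single interest score.

    Hypothetical weights: note that neither the content of the post
    nor the user's age appears anywhere in this calculation.
    """
    return (0.1 * e.seconds_viewed
            + 1.0 * e.liked
            + 2.0 * e.saved
            + 1.5 * e.commented)

def rank_feed(candidates: list[Post],
              topic_interest: dict[str, float]) -> list[Post]:
    """Order candidate posts by the user's accumulated interest per topic."""
    return sorted(candidates,
                  key=lambda p: topic_interest.get(p.topic, 0.0),
                  reverse=True)

# A user who lingers on one topic accumulates interest in it...
interest: dict[str, float] = {}
history = [(Post("p1", "sad_quotes"), Engagement(45.0, True, True, False)),
           (Post("p2", "cooking"), Engagement(2.0, False, False, False))]
for post, eng in history:
    interest[post.topic] = interest.get(post.topic, 0.0) + engagement_score(eng)

# ...and is then shown more of that topic, regardless of what it is.
feed = rank_feed([Post("p3", "cooking"), Post("p4", "sad_quotes")], interest)
```

In this toy loop, the heavy engagement with the “sad_quotes” topic pushes similar posts to the top of the next feed, which is the self-reinforcing “binge period” dynamic the coroner described.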

Walker’s ruling reignites a question that child safety advocates have been asking for years: how responsible are social media platforms for the content their algorithms feed to minors, and why are minors allowed onto the platforms in the first place?

As per a Bloomberg report, the Russell family’s lawyer has requested that Walker “send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator.” In their statement, the family pushed UK regulators to quickly pass and enforce the UK Online Safety Bill, which could institute “new safeguards for younger users worldwide.”

During the inquest, Pinterest and Meta took different approaches to defending their policies. Pinterest said that it did not have the technology to more effectively moderate the content that Molly was exposed to. Meta’s head of health and well-being, Elizabeth Lagone, on the other hand, told the court that the content Molly viewed was considered “safe” by Meta’s standards. Meta’s official response has irked the Russell family.

“We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly, as ‘SAFE’ and not contravening the platform’s policies. If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive,” the Russell family wrote in their statement.

They also added, “For the first time today, tech platforms have been formally held responsible for the death of a child. In the future, we as a family hope that any other social media companies called upon to assist an inquest to follow the example of Pinterest, who have taken steps to learn lessons and have engaged sincerely and respectfully with the inquest process.”
