Collaboration to Combat Suicide and Self-Harm Content

Thrive Program: A Collaboration to Combat Graphic Content

Meta, Snap, and TikTok have joined forces to launch the Thrive program, aimed at curbing the spread of graphic content that depicts or promotes self-harm and suicide. Through Thrive, participating companies can share “signals” to alert one another to violating material on their platforms.

The Partnership with the Mental Health Coalition

Thrive was developed in collaboration with the Mental Health Coalition, a charitable organization dedicated to dismantling the stigma around mental health conversations. Meta provides the technical framework that enables the secure sharing of signals, built on the same cross-platform technology used in the Lantern program, which targets online child abuse. Participating companies exchange hashes of the violating media to alert one another, as the sketch below illustrates.
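Thrive’s actual signal format and APIs have not been published, but the core idea — exchanging hashes of flagged media rather than the media itself — can be sketched in a few lines. The Python below is a minimal, hypothetical illustration: the shared_signals set, receive_signal, and check_upload are invented names, not part of Thrive or Lantern, and a real deployment would likely pair perceptual hashes (which survive re-encoding) with cryptographic ones.

```python
import hashlib

# Hypothetical in-memory store of hashes shared by partner platforms.
# A real system would use a persistent, access-controlled database.
shared_signals: set[str] = set()

def media_hash(data: bytes) -> str:
    """Compute a SHA-256 digest of raw media bytes.

    Note: a cryptographic hash only matches exact byte-for-byte copies;
    production systems typically add perceptual hashing so re-encoded
    or lightly edited copies are still detected.
    """
    return hashlib.sha256(data).hexdigest()

def receive_signal(digest: str) -> None:
    """Record a hash signal received from a partner platform."""
    shared_signals.add(digest)

def check_upload(data: bytes) -> bool:
    """Return True if an upload matches previously signaled media."""
    return media_hash(data) in shared_signals

# Example: one platform flags a file and shares its hash; an identical
# upload elsewhere is then caught without the file ever being sent.
receive_signal(media_hash(b"flagged media bytes"))
print(check_upload(b"flagged media bytes"))    # True
print(check_upload(b"unrelated media bytes"))  # False
```

Sharing only hashes means the graphic material itself never crosses company lines, which keeps the exchange lightweight and avoids redistributing the very content the program exists to suppress.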

Balancing Safety and Engagement

Meta says it works to make such content less visible on its platforms while preserving space for people to share their own experiences with mental health, suicide, and self-harm, provided posts neither promote those behaviors nor include graphic depictions.

Meta’s Action Against Harmful Content

According to Meta’s data, the company takes action on millions of pieces of suicide and self-harm content each quarter. In the most recent quarter alone, it restored roughly 25,000 removed posts, most of them after a user appeal.

Conclusion

The Thrive program, spearheaded by Meta, Snap, and TikTok, demonstrates a collaborative effort to curb the spread of harmful content around self-harm and suicide. Through cross-platform signal sharing, these companies are working together to create a safer online environment while encouraging open conversations about mental health.

FAQs

Q: How does Thrive help combat graphic content related to self-harm and suicide?

A: Thrive enables participating companies to share signals to alert each other of violating content on their platforms, facilitating prompt removal and intervention.

Q: What role does Meta play in the Thrive program?

A: Meta provides the technical infrastructure behind Thrive, ensuring secure signal sharing among participating companies to combat harmful content effectively.

Q: How does Meta balance safety and user engagement on its platform?

A: Meta strives to reduce the visibility of harmful content while promoting open discussions on mental health topics, ensuring a safe and supportive environment for users.

Q: How does Meta respond to instances of harmful content on its platform?

A: Meta monitors and takes action on millions of pieces of suicide and self-harm content every quarter, and restores removed posts when user appeals succeed.


Credit: www.theverge.com
