Snapchat, Meta, and TikTok are teaming up on a new initiative to prevent content depicting suicide or self-harm from spreading across social media. Meta announced the initiative on its blog.
The program, called “Thrive,” will be overseen by the Mental Health Coalition. All three platforms will share data about violating content so it can be addressed across platforms.
A Meta spokesperson described Thrive as a database that all participating companies will have access to.
Here is how Meta explains it:
“Through Thrive, participating tech companies will be able to share signals about violating suicide or self-harm content so that other companies can investigate and take action if the same or similar content is being shared on their platforms. Meta is providing the technical infrastructure that underpins Thrive – the same technology we provide to the Tech Coalition’s Lantern program – which enables signals to be shared securely.”
All three apps allow users to discuss mental health concerns and share their thoughts on related matters. However, each enforces rules against graphic imagery and material that could encourage suicide or self-harm.
Meta also says the shared data will only identify the content itself and will not include information about who posted it, which should allow the companies to remove such material quickly.
“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals.”
Violating content is assigned a number known as a “hash,” which is shared with the other social media companies so they can check whether the same content appears on their platforms and remove it.
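Meta has not published the exact hashing scheme Thrive uses, but the general idea of hash-based signal sharing can be sketched roughly as follows: one platform computes a digest of the offending media, contributes it to a shared database, and other platforms compare digests of newly uploaded content against that list. The sketch below is a simplified illustration using a plain SHA-256 digest; the function and database names are hypothetical, and production systems typically use perceptual hashes that can also match near-duplicate media.

```python
import hashlib

# Hypothetical shared signal database standing in for Thrive's infrastructure.
shared_hash_db = set()

def content_hash(media_bytes: bytes) -> str:
    """Return a digest that identifies the content itself, with no user information."""
    return hashlib.sha256(media_bytes).hexdigest()

def report_violating_content(media_bytes: bytes) -> None:
    """Platform A flags a piece of content and shares only its hash."""
    shared_hash_db.add(content_hash(media_bytes))

def matches_known_signal(media_bytes: bytes) -> bool:
    """Platform B checks an upload against the shared hashes before it spreads."""
    return content_hash(media_bytes) in shared_hash_db

# Example: one platform reports a file; another detects the same file later.
flagged = b"placeholder bytes of a violating image"
report_violating_content(flagged)
print(matches_known_signal(flagged))  # True -> investigate and take action
```

Because only the hash travels between companies, no account or personal details are exposed in the exchange, consistent with Meta's statement that the signals represent content only.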