When a participating company like Meta discovers harmful content on its app, it shares hashes (anonymized digital fingerprints of pieces of content relating to self-harm or suicide) with other tech companies so they can check their own databases for the same content, since it tends to spread across platforms.
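To make the mechanism concrete, here is a minimal sketch of that kind of hash matching, assuming a plain cryptographic hash; the names `known_harmful_hashes`, `content_hash`, and `is_known_harmful` are hypothetical, and real signal-sharing programs typically use perceptual hashes (e.g., PDQ for images) so that near-duplicates still match:

```python
import hashlib

# Hypothetical shared list of fingerprints received from a
# cross-industry signal-sharing program.
known_harmful_hashes: set[str] = set()

def content_hash(data: bytes) -> str:
    """Fingerprint the content without revealing the content itself."""
    return hashlib.sha256(data).hexdigest()

def is_known_harmful(data: bytes) -> bool:
    """Check an uploaded file's hash against the shared list."""
    return content_hash(data) in known_harmful_hashes
```

The point of sharing the hash rather than the content itself is that no company has to redistribute the harmful material: each platform just recomputes fingerprints of its own uploads and looks them up.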