
The System

1. Read
Every five minutes, our software reads all of the incoming social media posts within a region. This can be a region of any size. We work with cities, counties, and entire states.
2. Analyze
Our machine learning model then processes all of the posts to identify any that are deemed suicidal. This model has been trained with 600,000+ example posts to detect suicidal tones.
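For readers who like a concrete picture, here is a minimal sketch of the read-and-analyze cycle described in the two steps above. Everything named in it is a placeholder: fetch_recent_posts, load_classifier, the model file name, and the example region stand in for our actual data sources and model, which are not published here.

```python
import time

POLL_INTERVAL_SECONDS = 5 * 60  # step 1 runs every five minutes


def fetch_recent_posts(region):
    """Placeholder for the real data source: all public posts made in
    `region` since the previous read."""
    raise NotImplementedError


def load_classifier(path):
    """Placeholder for loading the trained model (trained on 600,000+ labeled posts)."""
    raise NotImplementedError


def read_and_analyze(region, classifier):
    """One cycle: read the new posts (step 1), then keep only those the
    model deems suicidal (step 2)."""
    posts = fetch_recent_posts(region)
    return [p for p in posts if classifier.predict([p["text"]])[0] == 1]


if __name__ == "__main__":
    classifier = load_classifier("suicide_risk_model.pkl")  # hypothetical file name
    while True:
        flagged = read_and_analyze("example-county", classifier)
        # Flagged posts feed into the verification step before a partner is notified.
        time.sleep(POLL_INTERVAL_SECONDS)
```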
-
As a partner, what information would I get about the social media users?
We would report to you the text of the post, the user's name, username, and the time that the post was made online.
-
What's in it for you?
We have no intention of monetizing this service. It started as a project to put what we had learned about machine learning into practice. Now that it has the potential to save lives, we want to put it out into the world to make a difference.
-
Can you give us a precise location of where the post came from?
When we form a partnership, we divide the region of operation into as many components as the partner would like. Beyond determining which component of the region a post came from (e.g., "Northwest"), we cannot pinpoint the exact location from which the user made the post.
-
How many individuals can I expect this system to flag as suicidal?
We cannot say exactly how many posts per week will be deemed suicidal, but we do not expect the number of reported posts to ever exceed five per week. The number of suicidal users reported depends on the number of verification posts (the number of posts the partner wants to see from a user before that user is reported; see the sketch at the end of these questions) and the size of the region of operation.
-
How does the model work?
You can think of machine learning as a black box: when fed some input, it is expected to spit out an output. This black box has been trained with 600,000+ example social media posts. Today, when it is fed new social media posts as input, it outputs whether each post indicates a risk of suicide. The model does not look for specific words or phrases; rather, it detects an underlying suicidal tone.
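If a concrete picture helps, the sketch below trains a generic text classifier on a handful of labeled example posts and then feeds it a new post, mirroring the train-then-predict idea described above. The TF-IDF-plus-logistic-regression pipeline and the tiny made-up dataset are assumptions chosen purely for illustration; they are not the architecture or the 600,000-post training set behind our model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative dataset; in practice the box is trained on 600,000+ labeled posts.
texts = [
    "I can't see any reason to keep going",
    "Had a great lunch with friends today",
    "Nobody would miss me if I were gone",
    "Excited for the game this weekend",
]
labels = [1, 0, 1, 0]  # 1 = post labeled as indicating suicide risk

# The "black box": a generic text classifier fit to the labeled examples.
# (Our production model is different; this only illustrates the workflow.)
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Feeding the box a new input yields an output: at risk or not.
print(model.predict(["I just want all of this to end"]))
```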
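The "verification posts" setting mentioned two questions above can also be sketched. The counting logic below is only an illustration; the field names, data structure, and threshold value are placeholders for whatever a partner configures.

```python
from collections import defaultdict


def users_to_report(flagged_posts, verification_threshold):
    """Report a user only after `verification_threshold` of their posts
    have been flagged. `flagged_posts` is a list of dicts with at least a
    "username" key; this structure is a placeholder, not our report format."""
    counts = defaultdict(int)
    for post in flagged_posts:
        counts[post["username"]] += 1
    return [user for user, n in counts.items() if n >= verification_threshold]


# Example: with a threshold of 2, only the user with two flagged posts is reported.
flagged = [
    {"username": "user_a"},
    {"username": "user_b"},
    {"username": "user_a"},
]
print(users_to_report(flagged, verification_threshold=2))  # ['user_a']
```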