Florida School Shooting: So Many Failures, But Online Moderators Get It Right
The New York Post

In the wake of the deaths of 17 children and adults at Parkland, Florida’s Marjory Stoneman Douglas High School, many are pointing out the multiple failures of the social and legal systems put in place to protect society from this type of attack. The accused shooter, Nikolas Cruz, who had been diagnosed with depression, autism, and ADHD, came into repeated contact with social workers, mental health counselors, school administrators, law enforcement, and the FBI, and each time it was determined that he did not pose enough of a threat to warrant further action.

Cruz also posted rants on social media, particularly on Snapchat and in a private chat room on Instagram, where he shared pictures of self-mutilation after a breakup, pictures of himself holding weapons, threats to kill his ex-girlfriend’s new boyfriend, and racist threats against blacks, gays, and Hispanics. Many believe it was Cruz who posted on YouTube that he was going to be “a professional school shooter.” After the shooting, some of the Instagram photos were used to identify Cruz as the gunman.

On social media sites with moderators trained to look for and report users posing a danger to themselves or others, however, the system works. Whether the content is viewed directly or reported by another user, moderators are trained to rapidly assess the level of threat and, if needed, escalate it to security personnel on the client side or directly to law enforcement (including the FBI).

Though there will be future tragedies as officials work to refine the at-risk identification and assistance process, moderators are saving lives every day. Members of my moderation team are directly responsible for rescuing several people who posted suicide threats, and in one case, their fast action resulted in law enforcement confronting and arresting an armed person who had threatened to shoot up a schoolyard from an apartment across the street.

From a moderation perspective, the operative question that determines what action to take is whether the threat is “credible and imminent”: whether the poster seems serious about their intent, and whether they say it will happen “now,” “very soon,” or at a specific time. The time factor is one reason the FBI did not pay enough attention to the Cruz posts: he did not say he would do something at a certain time.
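The “credible and imminent” test described above can be thought of as a simple two-factor decision rule. The sketch below is purely illustrative; the field names, escalation paths, and the FlaggedPost structure are my own assumptions, not any platform’s actual moderation tooling, and in practice these judgments are made by trained humans, not code.

```python
# Hypothetical sketch of the two-factor triage rule: a threat is escalated
# to law enforcement only when it is judged both credible AND imminent.
# All names here are illustrative assumptions, not a real policy engine.
from dataclasses import dataclass

@dataclass
class FlaggedPost:
    text: str
    seems_serious: bool    # moderator judges the intent to be genuine
    mentions_timing: bool  # says "now", "very soon", or names a specific time

def triage(post: FlaggedPost) -> str:
    """Return the escalation path for a flagged post."""
    if post.seems_serious and post.mentions_timing:
        # Credible and imminent: the highest-urgency path.
        return "notify law enforcement"
    if post.seems_serious:
        # Credible but not time-bound: route to the client's security team.
        return "escalate to client security"
    return "log and monitor"
```

Under this sketch, a post like the Cruz material — serious in content but with no stated time — would route to the middle tier rather than triggering the highest-urgency response, which mirrors the gap the article describes.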

Moderators have no control over what happens after they report an incident, and they seldom learn the results, but their constant vigilance keeps the users of the communities they monitor far safer and more secure: a system that doesn’t fail.
