Social media legislation would help prevent gun violence



I joined other Annunciation parents searching for answers. For many shooters, we learned, the path to gun violence starts online.

By Mike Roaldi / MinnPost

May 13, 2026, 12:38 PM CT



For me it’s the sirens. Some friends who were there have trouble with loud noises, or when sitting near windows. But, when I hear sirens, my body reacts before my mind can catch up. For a moment, I am back on the day of the Annunciation shooting. The sirens told me it was real. There wouldn’t be dozens of sirens if it was a false alarm.

My kids hid under church pews while the bullets flew in through the windows. Thirty were injured. My 8-year-old son’s friend was killed. They used to do joint birthday parties and were planning a fantasy football league this year.

In the hours and days that followed, as we hugged our kids and waited to learn who had been hurt and who had died, so many of us asked the same impossible question: How could anyone do something like this? How does a person become capable of attacking children in a school or a church?

There is no single answer. Anyone who says otherwise is not being honest. School shootings are the product of many failures layered on top of one another: access to weapons, mental health crises, warning signs missed, isolation, grievance, ideology, despair.

But one part of the answer is increasingly clear. For some vulnerable young people, the pathway to gun violence begins online — not with one dramatic moment; not always with an explicit plan. Often it begins in ways that can look ordinary from the outside: a lonely child, a private screen, an algorithm designed to keep him watching, clicking, scrolling and returning. Over time, that child can be pulled deeper into communities that normalize violence, glorify shooters and offer a dark kind of belonging.

That is why the Legislature should pass HF 4138/SF 4696, the Stop Harms from Addictive Social Media Act (SHASM), companion bills that are ready to receive votes in both houses this week. The legislation requires that minors under 16 obtain parental consent to open an account on a social media platform, and it removes the most addictive features from those accounts.

This bill would not solve every cause of school shootings. No bill can. But it focuses on one important layer of prevention: the platform designs that keep children trapped in harmful online environments. These are not neutral systems. They are built to maximize engagement. When that content is harmless, the result may be wasted time. When the content is violent, extremist or self-destructive, the result can be far more dangerous.

SHASM would give parents more insight into their kids’ social media habits and foster healthy dialogue around usage. It gives one more tool to put in guardrails without banning social media outright as Australia has done. And it would force platforms to take some responsibility for the systems they have built and the harm those systems can amplify.

In a recent New York Times opinion piece, researchers from the Violence Prevention Project described what they call the online “True Crime Community.” They were not talking about ordinary interest in crime stories. They were describing a darker ecosystem where mass killers are treated almost like celebrities. Users collect shooters’ images. They make art of them. They talk about them with admiration. They study them, quote them and sometimes emulate them.

The researchers described a recent shooter who had created a Roblox game simulating a mass shooting. That shooter’s TikTok account reportedly reposted videos of another mass shooter. The shooter had also participated in a gore forum where users post uncensored videos of violence.

For a vulnerable child, these communities can send a devastating message: If you feel invisible, violence can make you known. That is how the unthinkable begins to become thinkable.

And the terrifying truth is that there are likely dozens, if not hundreds, of young people in Minnesota encountering this kind of content right now. Most will never commit violence. But some may be isolated, angry, unstable or desperate enough to see in these communities a script for what to do with their pain.

Tech companies and other opponents of social media regulation may argue that parents should handle this on their own. But that ignores the reality of what parents are up against. We are not simply deciding whether our children can watch a television show or go to a friend’s house. We are trying to manage platforms powered by algorithms, behavioral design, endless content and private online spaces that most adults cannot fully see or understand.

Again, SHASM is not about banning the internet or any particular form of content — it explicitly eschews that approach. Nor is the bill about pretending technology is the only cause of violence.

It is about recognizing that when a child is vulnerable, isolated and angry, online systems can either interrupt that spiral or accelerate it — and that parents need help understanding where their kids are spending time online to make responsible choices together with their kids.

We have to act where we can. SHASM is not the whole answer. But it is a meaningful layer of prevention. It is a way to reduce the odds that another vulnerable child is pulled further down a pathway toward violence while adults and platforms look away.

Mike Roaldi, an Annunciation parent who is part of the Annunciation Light Alliance, has testified at the Minnesota Legislature. The alliance supports gun safety measures as well as bills that foster prevention by mitigating the online harms to children that may lead them toward violence.
