By Kristin Robbins, District 37A

One of my priorities at the Legislature has been addressing children’s mental health. That’s why I have proposed legislation that would stop Big Tech from sending kids an endless stream of content designed to keep them on their platforms for as long as possible.

Research shows that teens aged 13-18 are spending more than three hours a day on social media. Social media design features have been linked to harmful effects on minors, including increased anxiety and depression, bullying, eating disorders, self-harm, drug addiction and suicide.

The Stop Online Targeting Against (SOTA) Kids Act, HF 1503, would ban social media companies from using algorithms to target kids in Minnesota. Kids under age 18 would still be able to have social media accounts and view content of their choosing, but they would need parental permission to open an account and their “feeds” would only show content they have “liked” or “followed.” They would not be sent additional content based on their age, sex, race, or other characteristics. They would also not be targeted based on what they like, follow, or even just linger on for a few seconds.

The SOTA Kids Act takes a narrow but concrete step to protect Minnesota kids online by prohibiting social media companies from targeting them with unsolicited content. It has already received strong bipartisan support, passing all of its House and Senate committees on bipartisan votes. Unfortunately, it has still not been allowed to come to the floor for a vote on final passage.

Current federal law already requires parental permission for children under age 13 to have an account. My bill would simply raise that age to 18 and ensure that kids are no longer targeted with unsolicited content. The bill also provides a private right of action for families whose kids have been targeted with unsolicited content. Without financial penalties, Big Tech will continue to push content to kids and sell their data. We need to change the incentive structure to protect, not exploit, kids.

Big Tech opposes this bill. These companies argue that they cannot possibly know how old account holders are or whether they live in Minnesota, and that they cannot separate the algorithms used to target content to kids from the algorithms used in other ways on their platforms.

The recent passage of the Digital Services Act (DSA) in the European Union undercuts their arguments. To comply with the DSA in Europe, Meta and TikTok have already started to offer users the option of a non-personalized feed in reverse chronological order (newest to oldest), and users under 18 will no longer receive advertising content personalized for them by algorithms.

Parents and kids are struggling to limit the addictive pull and adverse mental health effects of social media. Many kids have spoken out to say they need help getting offline. Big Tech platforms have been knowingly harming our kids to drive longer engagement, despite mounting evidence of the negative consequences.

HF 1503 provides a bipartisan, common-sense solution that will ensure that kids cannot be targeted with unsolicited content by Big Tech. It is not the only thing we can do, but it is an important step in helping them break free from these addictions so they can have fully engaged and healthy lives in the real world.

If you are looking for additional resources to help navigate your child’s social media use, I recommend following the work of Dr. Jonathan Haidt, a professor at New York University (www.jonathanhaidt.com), and the groups Live More, Screen Less (www.livemorescreenless.org) and Wait Until 8th (www.waituntil8th.org).