Minnesota Attorney General Report Takes Tech Companies to Task
The tech industry is in the hot seat again. After facing a rigorous grilling before the Senate Judiciary Committee earlier this week, social media and technology execs are now facing the ire of Minnesota’s chief lawyer.
A new report released Thursday by Minnesota Attorney General Keith Ellison’s office showed that rising technologies like social media and artificial intelligence programs are harming the state’s youth.
The report, commissioned by the state legislature last year and led by Ellison, detailed specific design choices made by companies to increase engagement, which has led to negative consequences for consumers, especially children and young adults. In a statement, Ellison said the report’s findings, along with its recommendations, will help state policymakers better regulate technology companies and create safer digital spaces for children.
“Technologies like social media are changing how kids in Minnesota grow up, often in profoundly negative ways,” Ellison said. “As things stand, technology companies with little oversight and a habit of putting profits over people have nearly unfettered access to our kids via their computers and smartphones.”
Harmful or unwanted experiences run rampant on social media, with more than a quarter of Meta users reporting having witnessed bullying and harassment in a seven-day period, the report stated, citing the company’s own internal surveys. This unrestricted digital access created by technology companies has also made dangerous encounters commonplace on social media.
“On more than one occasion, while I was a minor I had received sexually explicit photos from men who added my account,” a complainant wrote to Ellison’s office. “I did not need to add them back to see the image they had sent me. No minor should ever be subjected to this.”
Inappropriate and explicit content has become normalized for children, with the average age of pornography exposure being 12 years old nationwide, and one in eight Instagram users under 16 reporting unwanted sexual advances on the platform.
Although young people are aware of the negative impacts of social media on their mental health, design choices such as comparative likes, lax default privacy settings, and AI algorithms have increased usage while harming the well-being of users, the report said.
Infinite scrolling not only leads to less sleep and poorer academic performance among students — including roughly half of college students — but also creates a culture that idolizes content and creators that pander to biased desires rather than reality.
“When she was 13, she started cutting herself,” another complainant wrote to the Minnesota Attorney General’s Office. “When asked why, she said that girls on Instagram talked about how it was exhilarating to cut yourself, so she did it.”
Governments across the world have lagged behind children in becoming aware of technology’s rising impacts on users. In the United States, regulation has varied across states.
Florida and Texas each passed bills to limit content moderation on social media platforms, but legal challenges killed Florida’s law and will put Texas’ before the U.S. Supreme Court later this month. Additionally, Montana became the first state to ban TikTok last year, while states like Utah, Delaware, and Connecticut have passed legislation to limit children’s access to social media and platforms’ ability to target them.
Nationally, Congress’ attempts at regulation have included legislation to restrict platforms’ access to children’s personal data and require increased transparency from social media platforms on their targeting practices. President Joe Biden issued an executive order in October to guide federal agencies’ development and use of AI.
Previous attempts at action have created lessons to guide future regulation, according to the report. For instance, Ellison’s report noted that overly prescriptive laws can “become obsolete quickly as technology advances.” At the same time, reporting requirements that are too broad have had little impact.
“This unacceptable status quo must change if we want young people in Minnesota to grow up with the dignity, safety, and respect every one of them deserves,” Ellison said.
Here are Ellison’s recommendations to state policymakers:
- Ban “dark patterns” within platform design, such as auto-play and aggressive notifications that amplify platform usage.
- Mandate transparency for product development with potential harmful features.
- Create consumer-friendly device base defaults.
- Track platform-specific technology’s impact on users.
- Mandate interoperability and promote consumer choice.
- Mandate usage limits and provide technology education in schools.
While the recommendations aim to hold technology companies accountable, Ellison acknowledged online bullying, over-sexualization, and predators will still roam the digital landscape, and users will retain their ability to participate in risky behavior. However, users must become more aware of the consequences of their engagement, and corporations must be held to “the same mitigating forces as the offline world,” he said.
Moving forward, as AI becomes more mainstream and more individuals gain the ability to target people online, it will become increasingly important to set firmer rules of engagement to protect vulnerable consumers, Ellison said.
“I will continue to use all the tools at my disposal to prevent ruthless corporations from preying on our children,” Ellison said. “I hope other policymakers will use the content of this report to do the same.”