The Massachusetts Supreme Judicial Court has ruled that Meta Platforms must face a lawsuit brought by the state’s attorney general, which alleges that the company designed its Instagram platform to foster addiction among children. This decision, issued on Friday, represents the first time a state high court has addressed whether a federal law that generally protects internet companies from liability over user-generated content also shields them from claims related to allegedly purposeful addictive design.

The case stems from a legal challenge initiated by Massachusetts Attorney General Andrea Joy Campbell, who argues that Meta’s practices targeted children’s developmental vulnerabilities, contributing to harm. The attorney general’s office contends that the lawsuit does not seek to hold Meta responsible for material posted by Instagram users—a category typically covered by Section 230 of the Communications Decency Act of 1996, which grants broad immunity to internet platforms for third-party content. Instead, the suit focuses on Meta’s own conduct, including how the company allegedly engineered Instagram’s features and user experience to maximize engagement in a way that could lead to addiction, particularly among younger users.

In the court’s unanimous opinion, Justice Dalila Argaez Wendlandt emphasized that the claims revolve around Meta’s design decisions and its communication with the public regarding the platform’s safety. “The claims allege harm stemming from Meta’s own conduct either by designing a social media platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about the safety of the Instagram platform,” the opinion states.

Meta has disputed these allegations, maintaining that its platforms are designed to provide safe and positive experiences for all users, including minors. The company argues that Section 230 protections should prevent legal actions based on user engagement or content-related issues, asserting that the lawsuit improperly seeks to hold it accountable for user behavior rather than its own actions.

The ruling clears the way for Massachusetts’ lawsuit to proceed, allowing further examination of whether Meta’s internal practices surrounding Instagram’s design and safety disclosures contributed to harm among child users. The case may have broader implications for how courts interpret Section 230 protections in relation to allegations involving platform design and user addiction, particularly among minors. Legal experts note this decision could influence similar lawsuits pending in other states, raising questions about the responsibility of social media companies for the wellbeing of younger audiences interacting with their products.