Thailand News and Discussion Forum | ASEANNOW

Meta Ordered to Pay $375m in New Mexico Over Child Safety Claims

A court in the US state of New Mexico has ordered Meta to pay $375m (£279m) after a jury found the company misled users about the safety of its platforms for children.

The ruling follows a seven-week trial examining how the company’s services — including Facebook, Instagram and WhatsApp — exposed minors to harmful content and interactions.

Jury finds violations of consumer law

Jurors concluded that Meta violated New Mexico’s Unfair Practices Act by giving a misleading impression about protections for young users. The penalty reflects thousands of violations, each carrying a potential fine.

New Mexico Attorney General Raúl Torrez described the outcome as “historic”, saying it marked the first successful state-level legal action against the company over child safety concerns.

Prosecutors argued that Meta’s platforms enabled exposure to sexually explicit material and contact from predators, while internal evidence suggested the company was aware of such risks.

Evidence presented during trial

During proceedings, jurors reviewed company documents and heard testimony from former employees. Among them was Arturo Béjar, who said internal experiments showed underage users were being served sexualised content.

He told the court his own daughter had received inappropriate messages from a stranger on Instagram.

State lawyers also cited internal research indicating that at one point 16% of Instagram users reported seeing unwanted nudity or sexual activity within a single week.

The lawsuit, filed in 2023, alleged that Meta’s recommendation algorithms directed young users towards explicit material, including content linked to exploitation and trafficking.

Meta to appeal ruling

Meta rejected the findings and confirmed it plans to challenge the decision. A spokesperson said the company, led by chief executive Mark Zuckerberg, continues to invest in safety measures.

“We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing harmful content,” the spokesperson said.

The company highlighted recent initiatives, including new account settings for teenagers and tools designed to alert parents to potentially harmful activity.

Wider legal challenges continue

The case is one of several legal actions facing Meta and other technology firms in the United States over the impact of social media on young users.

A separate trial in Los Angeles is examining claims that platform design contributed to addiction among children.

Thousands of similar lawsuits are currently progressing through US courts, reflecting growing scrutiny of how major tech companies protect younger audiences online.

Adapted by ASEAN Now. Source 25 March 2026
