Elon Musk's X Under Investigation Over Lax Moderation

  • Nov 14, 2025
  • 2 min read

Elon Musk's social media platform, X, is officially facing a major legal crackdown in Europe. Ireland's media regulator has launched a formal investigation into the platform, citing serious concerns that X is failing to protect users and properly handle reports of harmful content. The probe is a direct and critical response to Musk's leadership, in particular his decision to gut content moderation teams since taking over.


The Core Complaint: X Is Ignoring Users

The investigation is being led by Coimisiún na Meán (the Media Commission) in Ireland, where X maintains its European headquarters. The central focus of the probe is X’s failure to comply with the European Union's landmark digital law, the Digital Services Act (DSA). The DSA is designed to make powerful online platforms accountable for the content they host. 


Crucially, the Irish regulator is looking into three key areas where X is suspected of failing its users. First, users must be able to properly appeal X’s moderation decisions. This includes decisions both to remove content the user posted and decisions not to remove content the user reported as harmful or illegal. Second, X must provide an easily accessible internal system for handling user complaints. Finally, X is required to clearly inform users about the outcome of their reports and appeals. 


X Continues to Spread Hate and Misinformation

This regulatory action comes after Musk repeatedly declared himself a "free speech absolutist." Under his leadership, the platform has seen a dramatic rise in misinformation and hate content, driving an exodus of advertisers and eroding user trust. Antisemitism in particular has exploded, with influencers able to monetize and profit from spreading hate content online. Controversially, Musk permitted white supremacist and Holocaust denier Nick Fuentes to return to the platform. 


Other key changes under Musk have further weakened moderation. He fired thousands of staff, including those responsible for content moderation, severely undermining X's ability to police the platform. X also removed key policies designed to combat COVID-19 misinformation and election-related lies, allowing dangerous fake news to circulate freely. 


Musk claims that X's Community Notes serve as a system for regulating content and misinformation, since users can attach a corrective note to false posts for others to see. However, a study found that notes often failed to appear even after posts were reported, or were attached so late that only 22% of the people who saw a post also saw its community note.

