S&D's additional measures in response to social media issues


We, the Socialists and Democrats, have consistently fought for a stronger, more protective and inclusive digital environment in Europe. Over the past five years, we have made considerable progress in ensuring Europeans can navigate, consume and communicate online more easily and safely throughout the Union. Some key achievements include the Digital Services Act (DSA), which addresses the spread of illegal content online; the Digital Markets Act (DMA), which promotes fair competition and transparency in online advertising; and the Artificial Intelligence Act (AI Act), which sets obligations to develop human-centric AI systems and ensures transparency and accountability for AI-generated content. These rules are crucial milestones that better protect Europeans and allow them to reap the benefits of online services in Europe.

The beginning of the Trump administration, the statement of Meta CEO Zuckerberg in January on fact-checking and the speech of US Vice-President Vance on misinformation at the Munich Security Conference have set the stage for a showdown with the European Union. The direct calls to stop the application of the DSA are the first test of the credibility of our recent European tech rulebook. As we stated during the plenary debate on 21 January, Europe cannot bow down to the tech oligarchs when our democracy and rule of law are at stake. The Commission should not be afraid to use all the power at its disposal to guarantee the protection of Europeans and the respect of our values.

The DSA is the most advanced and ambitious legislation imposing obligations on social media platforms to moderate their online content. Very Large Online Platforms (VLOPs) like X, Facebook and TikTok have an obligation to identify and analyse systemic risks linked to the use of their services and to take corresponding mitigation measures. VLOPs must ensure the transparency of their algorithms by giving the Commission access when requested, in order to ensure adequate and effective enforcement. Failing to comply with the DSA makes them subject to fines of up to 6% of their worldwide turnover, or even a ban if decided at national level. The misuse of VLOPs, be it by spreading disinformation, harmful or illegal content, or via manipulative or abusive activities, can threaten our European democracies. The DSA rules are a shield against these threats.

The Commission must urgently conclude its investigations and take the measures it deems necessary and appropriate. Additionally, an increase in the Commission's staff responsible for enforcement of the DSA and for the investigation of violations is necessary to ensure efficient enforcement of our rules. Children and young people are especially vulnerable to the negative impact of disinformation, harmful content and dark patterns on social media, but also to criminal networks using social media platforms to recruit and exploit minors for criminal activities or even acts of terror. We also call on the Commission to speed up the development of guidelines on the protection of minors under the Digital Services Act, as a first step.

The DMA can complement the DSA to promote fair competition and transparency in digital advertising. In addition to DSA Article 26 on advertising transparency, the DMA must be fully enforced to ensure that dominant platforms do not unfairly disadvantage traditional media outlets.

The AI Act represents a major step forward in regulating AI-generated content and safeguarding Europe's commitment to trustworthy and human-centric AI. As AI-powered tools become increasingly widespread, clear labelling and watermarking of synthetic content are essential to prevent disinformation. The DSA and AI Act should be fully leveraged to ensure strict transparency and disclosure obligations, empowering citizens to distinguish between authentic and AI-generated content in a rapidly evolving societal and geopolitical context. Finally, the soon-to-apply Political Advertising Regulation presents another essential tool to combat manipulative advertising practices, strengthen democratic resilience, and restore balance in the digital ad ecosystem.

The recent geopolitical developments show that the EU is at a crossroads. While we urge the European Commission to use the full legal arsenal at its disposal and, in particular, to swiftly conclude its ongoing investigations into X, Meta and TikTok – applying strong and appropriate sanctions in case of infringements of the DSA – we consider that the EU must conduct a broader reflection to address several concerns.

This includes, among other things, a strategy to counter the negative impact of social media on the mental and physical health of users, in particular minors, as well as on our European democracies in general, the integrity of elections and the confidentiality of communications. Additionally, we need better media literacy to help individuals identify and access trustworthy, high-quality information.

The S&D Group proposes the following additional solutions to further protect Europeans:

• Strengthen protection of individuals online with the future Digital Fairness Act: We urge the Commission to quickly present the expected Digital Fairness Act in order to bring additional protection for Europeans against deceptive and addictive designs, dark patterns, personalised practices targeting vulnerabilities and, more generally, the digital asymmetry that individuals face in the online world. That this proposal can only be expected at the end of 2026 fails to reflect the urgency of the mental health crisis among young people and the other problems created by social media applications.

• Build a democratic European platform for trustworthy news and information: This platform would centralise existing content from public service media and licensed broadcasters across Member States. Powered by AI translation technology, the platform would serve as a common access point to a common database of information across Europe and allow all EU citizens to find news and other content produced by publicly funded and licensed media in their own language. Promoting access to editorial media outlets on social media is important, as young people use social media as their main source of information. We should continue to build on previous pilot projects such as “Building a European Public Space”, which explored collaboration with the media and influencers.

• Foster EU value-led alternatives: As showcased in Brazil for X, or the attempt in the US for TikTok, Member States can also ban social media platforms infringing our laws. If such a decision is taken, we Europeans should provide our citizens with alternatives to allow them to share their ideas on safe, unbiased and respectful online platforms. This can be achieved either by: (1) obliging platforms infringing our laws to sell their European activities to EU actors (via a public-private partnership) before a ban is applied, or (2) fostering the emergence and development of European competitors in the social media market thanks to a federated and secure public digital infrastructure. The EU has the talent and resources to create such competitors.

• Develop a federated and secure public digital infrastructure grounded in European regulatory standards: Based on a comprehensive regulatory framework that aligns with European values of privacy by design, transparency, accountability, expandability and competition, we can develop a federated and secure digital infrastructure. This would allow the EU to reclaim digital sovereignty, foster competition, and provide Europeans with a transparent, interoperable, accountable and democratic digital ecosystem. In order to do this, we would need a clear and evidence-based definition of strategic technologies and an assessment of global dependencies, as well as an analysis of possible obstacles to the development of European capacities. We would call for the launch of a Digital Sovereignty Fund from the EU budget to unlock the necessary investments to build the European digital ecosystem. Besides new investments, we could use incentives in public procurement as a means to strengthen EU investment in European digital infrastructure, such as cloud operators.

• Ensure the future of media pluralism and put an end to exploitative business models: The digital advertising market is dominated by a handful of tech giants (Google, Meta and Amazon), which capture 80% to 90% of global digital ad revenue. This dominance undermines media pluralism and weakens traditional media outlets that provide independent, high-quality journalism. As users increasingly engage with online platforms rather than trusted news sources, press media struggle to monetise their content. At the same time, these platforms exploit users through invasive profiling, leveraging personal and even sensitive data to fuel opaque recommender algorithms. These algorithms amplify disinformation, hate speech, radicalisation and harmful content, while keeping users locked into closed information bubbles. This threatens media literacy and the ability of Europeans to access diverse and reliable news. The EU has the tools to address this imbalance. Strict and well-resourced enforcement of the GDPR and ePrivacy rules must prevent unlawful data exploitation and ensure that platforms violating European law do not gain an unfair advantage over those that respect user rights. Ultimately, targeted advertising must be banned, and sensitive data must never be used for commercial purposes. By enforcing these rules, we can level the playing field, redirect advertising revenue to trustworthy journalism, and safeguard voters from manipulative influence.

• Encourage Member States to invest in media and information literacy initiatives and offer EU-wide coordination: As emphasised in the Niinistö report on preparedness (2024), “to ensure we stand together during crises, we also need to bolster citizens’ ability to recognise authoritative sources of crisis response information and to dismiss disinformation and Foreign Information Manipulation and Interference (FIMI)”. Media literacy is an essential skill for European citizens, as mis- and disinformation are widely shared on various platforms. We should ensure that, across the EU, Europeans are offered high-quality training and information campaigns on media literacy. The Commission should look at Member States that have already successfully implemented such programmes in their education systems, and enable the sharing of best practices.

• Create a cross-party coalition to launch concrete proposals: The European rules-based approach to social media will come under immense geopolitical pressure. This coalition can be the forum for pro-European policymakers, civil society organisations and like-minded experts or even local actors (cities, public services etc.) to further reflect and launch concrete initiatives to advance EU policy on these topics.
