Social Media

Dáil Éireann Debate, Tuesday - 5 December 2023

Questions (80)

Jennifer Murnane O'Connor

Question:

80. Deputy Jennifer Murnane O'Connor asked the Minister for Tourism, Culture, Arts, Gaeltacht, Sport and Media her plans to ensure that social media companies regulate and remove harmful content, including content inciting violence. [53401/23]

Written answers

There is clearly an issue with extremist groups using online services as tools to organise. The tactics of these groups are pernicious and divisive, preying on the most vulnerable in our society, and often amount to incitement to violence and hatred.

Where an offence is alleged to have been committed it is the role of An Garda Síochána to investigate and, if necessary, to engage with online service providers to obtain information relevant to the investigation.

Under new legislation like the Online Safety and Media Regulation Act, which I commenced in March of this year, and the EU's Digital Services Act, it is for online platforms to put in place the necessary systems and processes to reduce the availability of extremist content like hate speech, threats, incitement to violence or intimidation.

Coimisiún na Meán is Ireland’s new online safety and media regulator and will also be Ireland’s regulator for the EU’s Digital Services Act, acting jointly with the European Commission.

Coimisiún na Meán is developing its first binding online safety code, which will set out rules for how designated online services must deal with defined categories of harmful online content, including extremist content such as hate speech, threats and incitement to violence. It expects to adopt this code in 2024 following a public consultation on a draft which will begin in the coming days.

Failure to comply with an online safety code can lead to the imposition of significant financial sanctions of up to €20 million or 10% of turnover, and continued non-compliance can lead to criminal penalties.

The Digital Services Act places legally binding obligations on Very Large Online Platforms, including that they must allow users to easily flag potentially illegal content and must contact law enforcement when there is a suspicion that a criminal offence involving a threat to life or safety has taken place or is taking place. These large platforms must also assess and mitigate a series of risks arising from the use of their services, including in relation to incitement to violence and hatred.

Alongside regulation, these new laws require platforms to deal with complaints, and I would urge people to flag incitement to violence and hatred when they see it. This will allow regulators to identify patterns and decide where further investigation or action is most urgently required.
