The Minister of State is welcome to Seanad Éireann.
Digital Services Bill 2023: Second Stage
Minister of State at the Department of Further and Higher Education, Research, Innovation and Science (Deputy Niall Collins)
I am pleased to bring this Bill before the House today. The purpose of the Bill is to fully implement Regulation (EU) 2022/2065 of the European Parliament and of the Council on a single market for digital services, commonly referred to as the EU Digital Services Act. The EU regulation has direct legal effect in all member states. Consequently, the obligations it places on regulated entities are directly applicable and require no further implementation in national law. The Digital Services Bill 2023 is a technical measure that is necessary to give full effect to the supervision and enforcement provisions of the regulation. The Bill does not add to or alter the obligations placed on regulated entities by the regulation.
The services available online have impacted on practically every facet of the lives of European citizens. These services have brought enormous benefits socially, culturally and economically. They have provided unprecedented access to information and facilitated an entirely new level of connection and communication between geographically remote citizens. They are transforming for the better our work-life balance, the delivery of healthcare and education and access to Government services and have opened up new enterprise opportunities for commerce and trade. However, online services have also created an entirely new source of risk for citizens and society at large. These online safety risks include the dissemination of illegal and harmful online content.
It was for the purpose of addressing these risks that the EU adopted, in November 2022, the EU digital services regulation. The regulation establishes a pioneering regulatory framework to protect EU users of digital services and their fundamental rights online. It aims to rebalance the responsibilities of users, online platforms and public authorities, placing citizens at the centre. The regulation marks a sea-change in the EU's ability to protect society from illegal and harmful online content. The regulation is designed to improve online safety by placing obligations directly on providers of online intermediary services, particularly platforms such as social media, marketplace sites and search engines. These obligations are designed to expedite the identification and removal of illegal and harmful online content. The regulation places obligations on service providers to improve transparency of their services and to give users more control over their online experience.
An important mechanism for expediting the removal of illegal content is the framework for trusted flaggers established by the regulation. Providers of online services are obliged to prioritise assessment of notices submitted by trusted flaggers about the presence on their service of specific items of information that the trusted flagger considers to be illegal content. However, the trusted flagger is not the arbiter of illegal content. It is the responsibility of the providers to decide whether the flagged content is illegal. The providers are obliged to come to a decision on a notice in a diligent and non-arbitrary manner. The protection of freedom of speech is a fundamental principle in the EU regulation. The regulation provides users with a right to complain if their content is removed and with the right to access an out-of-court dispute settlement mechanism if the matter is not resolved to their satisfaction by the appeals process.
The European Commission has designated 22 entities as very large online platforms and search engines under the regulation. The European Commission has primary responsibility for regulating these entities, but will do so in concert with national authorities. As thirteen of these very large entities are established in Ireland, we have a unique, critically important and high-profile role in the overall EU regulatory framework for digital services. For this reason, it is imperative for our national reputation that Ireland enacts the Digital Services Bill before the EU deadline of 17 February, when the EU regulation comes into full effect.
The regulation requires Ireland to designate a lead competent authority, to be known as the digital services co-ordinator. The digital services co-ordinator will be the single point of contact, with lead responsibility for all matters to do with the EU regulation in Ireland, including co-ordination across the EU, handling complaints, codes of conduct, communications, supervision, investigation and enforcement. The Bill designates Coimisiún na Meán as the digital services co-ordinator for Ireland. An coimisiún was established by the Minister for Tourism, Culture, Arts, Gaeltacht, Sport and Media, Deputy Catherine Martin, on 15 March 2023. The Bill adds the functions of the digital services co-ordinator to those an coimisiún already has and adapts an coimisiún's existing powers, such as powers of investigation and the power to impose financial sanctions, for the specific cases where it will be implementing provisions of the EU regulation. The co-location within Coimisiún na Meán of the supervisory and enforcement responsibilities for the digital services regulation, the EU terrorist content online regulation, the Online Safety and Media Regulation Act 2022 and the regulation of broadcasting and video-on-demand services will enable efficient and cohesive implementation of the regulatory framework for the benefit of citizens and providers. The synergies will also provide significant cost savings for the Exchequer.
The Bill designates the Competition and Consumer Protection Commission, CCPC, as a second competent authority with specific responsibility for the elements of the EU regulation relating to online marketplaces. The Bill provides both competent authorities with the necessary powers to carry out investigations and take enforcement actions, including the imposition of significant financial penalties for non-compliance. The Bill is prescriptive regarding how the competent authorities should exercise their powers to ensure the principles of natural justice, fair procedures and proportionality are fully respected.
Having set out the background, context and purpose of the Bill, I will now outline its main provisions. The Bill is structured in four Parts and contains 83 sections. It has been drafted to provide the greatest possible alignment between the provisions for Coimisiún na Meán and those for the CCPC, while simultaneously maintaining consistency with their respective principal Acts, namely the Broadcasting Act 2009 and the Competition and Consumer Protection Act 2014.
Part 1 of the Bill deals with preliminary and general matters common to legislation, namely, commencement, definitions, service of documents and revocations.
Part 2 contains a series of amendments to the Broadcasting Act 2009 to designate and empower Coimisiún na Meán as the digital services co-ordinator and a competent authority for the EU regulation.
Part 3 designates the Competition and Consumer Protection Commission as a competent authority for three specific articles of the regulation relating to online marketplaces. The provisions for the CCPC have been set out in a self-contained part of the Bill. This part provides that the CCPC can use its powers of investigation as provided for under the Competition and Consumer Protection Act 2014 in the context of the EU regulation.
Part 4 contains miscellaneous provisions, including an obligation on authorised officers and staff of the CCPC to maintain professional secrecy.
It is imperative that the State has a comprehensive and robust legal basis for the full and effective implementation of the EU digital services regulation. I am confident that the Bill achieves this objective in a balanced and proportionate manner. I thank the Leas-Chathaoirleach and Senators for their attention and I commend the Bill to the House.
I welcome the youth and government group from the YMCA in Belfast, who are accompanied by their co-ordinator, Peter McNeice. I thank them all for coming. It is interesting that they are here when we are debating further and higher education and digital services, and given the events in Northern Ireland in recent days. I hope they learn a lot about government when they are in Leinster House. We never know, they might be serving here in future.
They might help us if they learn about government.
We can always be learning.
I welcome the Minister of State, Deputy Collins, to the House and I welcome the Bill before us. Full implementation in Ireland of the EU regulation on a Single Market for digital services is an important step and we should proceed in a speedy and diligent manner. As all Members are aware, services available online have transformed the lives of everyone in the country and throughout the world. They have brought enormous benefits. Information has never been so readily available, connections are available between geographically remote citizens and new enterprise and trade opportunities have become available. However, it has not all been positive. In recent years we have seen the spread of disinformation becoming a more serious challenge in Ireland and around the globe, as well as the growth of harmful and illegal content. These dangers are not insignificant and the need for a comprehensive and effective regulatory framework to protect individuals as well as society at large has been made clear. European law has been leading the way in this area for some time. It is essential that it continues to do so, given the challenges facing us.
We cannot allow online service providers to operate without responsibility. Frankly, this has gone on for too long, particularly with regard to social media which operates on a wild west footing where anything goes. These sites have, of course, provided benefits in allowing us to connect with family and friends and to keep in contact regardless of where we are located. However, they have also been weaponised for bullying and abuse towards many. The owners and operators of social media sites largely refuse to take responsibility for such actions. It is vital they are regulated and citizens are protected to the greatest degree possible.
With this in mind, the regulation is designed to improve online safety by placing obligations directly on providers of online intermediary services, with a focus on platforms such as social media and marketplace sites as well as on search engines. These obligations are designed to expedite the identification and removal of illegal and harmful online content. The regulation also places obligations on service providers to improve the transparency of their services and to give users more control over their online experience.
The European Commission has designated 19 entities as very large online platforms and very large search engines under the regulation. The European Commission has primary responsibility for regulating these entities but will do so in concert with national authorities. As 13 of these very large entities are established in Ireland, we have a fundamentally important role in the overall EU regulatory framework for digital services. In other words, under the country of origin principle the country where the company is headquartered in the European Union, in this case Ireland for 13 of the 19 entities, is obliged to regulate not only at home but throughout the Single Market in its entirety. For this reason it is essential for our national reputation that Ireland enacts the Digital Services Bill before the EU deadline of 17 February, when the EU regulation comes into full effect.
One of the key recommendations of the Oireachtas Joint Committee on Enterprise, Trade and Employment in its pre-legislative report was that Coimisiún na Meán be satisfactorily resourced with the level of staffing and legal expertise required to allow optimal operational capacity and enforcement. I am glad to see the Department has acted on this. It is very positive that funding has already been allocated for this purpose, as €2.7 million was allocated in 2023 specifically to support the establishment of the digital services function in Coimisiún na Meán in preparation for this legislation. Budget 2024 allocates a further €6 million to complete preparations and capacity building for its digital services function and to support initial operations.
The Bill provides a robust and comprehensive legal basis on which Ireland will be able to fulfil its responsibilities in the regulation. I am delighted to speak in support of it today.
Cuirim fáilte roimh an Aire Stáit. I thank my colleague, Senator Crowe, for the opportunity to say a few words on the Bill. It is a technical Bill but it does have a number of important functions. It is critical that it is enacted by 17 February, as we have discussed. This has to be seen as part of a suite of European legislation to ensure a safe, open, fair and innovative digital space throughout the European Union. The Digital Services Act, the Digital Markets Act and the forthcoming AI Act are all critical to ensure that we as citizens and consumers can play an active part in this new world.
The philosophy underpinning the European legislation and the Government's approach in this area is critical. We need to regard the current and emerging digital environment as a public space. It should not be behind walled gardens. A small number of tech companies, controlled by a small number of wealthy individuals, should not be able to shape the entire development of society and humanity. It is important that the European Union takes an active role in this. The Government's approach to legislation with this philosophy is critical.
It is my strong belief, and the Government should never apologise for saying this, that citizens and consumers should have the same rights, opportunities and protections online as they do offline. The legislation, even though it is technical legislation, supports these principles. I was glad that when we debated, in quite a lot of detail, the Online Safety and Media Regulation Act in 2022, which led to the establishment of Coimisiún na Meán, we addressed many of the issues. These included, as my colleague Senator Crowe has mentioned, how we deal with illegal content and how we ensure we have a regulatory framework that will protect the safety of individual citizens. There were some who said we should have waited until the Digital Services Act was enacted. I believe we were correct to enact the Online Safety and Media Regulation Act and establish Coimisiún na Meán. The work done in just over a year by Coimisiún na Meán deserves a lot of praise.
This legislation empowers Coimisiún na Meán and the CCPC. In that regard, it is very technical legislation but tied with this - and this is really important - is an information campaign to make people aware of their rights as a result of the enactment of this new legislation. What are also critical, and the Government has been very strong on this up to now, are the enforcement mechanisms. Under the DSA, regulators are in a very strong position, where there are problems with various platforms and ISPs, to be able to tackle them and that is critical going into the future. In that context, we must ensure that regulators are properly resourced, including making sure that the levels of staff required are available. I do not want to see a situation arising similar to the situation with the Data Protection Commission, for instance, where lots of complaints or problems were notified to the commission but it did not have the necessary resources, including staff, to address them. This will require a wide range of specialists, including administrative, competition, and digital lawyers, as well as computer scientists, to support regulators in their work and it is absolutely critical that there is a sufficient level of resourcing for Coimisiún na Meán and the CCPC.
As I have said on numerous occasions in this House, we need to have far more public debates on technology, how it is used and how it shapes our lives. I am on record as calling for the establishment of an Oireachtas committee on AI because the next measure in this suite of legislation, the AI Act, will be the most important legislation that the EU will enact this decade. It needs to be seen as part of the suite of measures which includes this legislation. This is technical legislation, and we need to have it enacted by the deadline of 17 February.
The Digital Services Bill 2023 is weighty legislation, as is often the case with Bills that give effect to EU regulations. It covers quite a lot of ground and contains a mixed bag of provisions ranging from consumer protection and the regulation of advertising algorithms to content moderation and the combating of child sex abuse material. Some of these provisions are obviously deserving of our full support but others have the capacity to give rise to grave concerns. The stated goal of the European Commission's DSA is to ensure greater harmonisation of the conditions affecting the provision of intermediary digital services, in particular online platforms that host content shared by their customers.
Before I get on to the subject of content moderation, I want to briefly mention pornography. I do not think we should be letting children watch pornography. There have been too many studies showing the ill effects of it, both mentally and socially. The past two years have seen a slew of reports on the increase in the number of teenagers who have reported being sexually assaulted when socialising with other teens, and this has been directly linked to the consumption of often violent and extreme pornography. In its annual report, the voluntary organisation Children At Risk Ireland said there had been a significant increase of 36% in the number of children being referred to it for harmful sexual behaviour. We are standing idly by while our young men and women are being hurt and damaged by this. Multiple states in the USA have introduced measures to block pornography for minors, as has our closest neighbour, the UK. Multiple telephone companies have begun blocking pornography sites by default. I strongly believe that the Houses of the Oireachtas should examine taking steps in this direction as well.
Regarding the Bill, I have concerns about some aspects of the content moderation regime, particularly when it comes to curbing certain so-called forms or categories of speech while leaving these categories undefined. Under the DSA, the European Commission can put significant pressure on digital platforms to curb hate speech, disinformation and threats to civic discourse which, on the face of it, should have the full support of all decent and civic-minded folk. However, as recent months in Ireland have proven, these terms constitute what research fellow David Thunder calls "notoriously vague and slippery categories". Utterances which are dismissed as far-right dog-whistles one day often become Government Minister talking points a few months down the line as the establishment catches up with the opinions of the public. Imagine if such speech was illegal. The man in the pub could be prosecuted for saying the same thing as a Cabinet Minister, his only mistake being that he said it before the Government changed its mind. By their nature, anti-free speech laws become a large stick wielded by the State to enforce its own ideas of what is true.
The DSA creates entities called trusted flaggers to report illegal content they identify on large online platforms. Online platforms are required by the Act to respond promptly to reports of illegal content provided by these trusted flaggers nominated by member state-appointed digital service coordinators. The Act requires large online platforms to take the necessary measures to ensure that notices submitted by trusted flaggers acting within their designated area of expertise, through the notice and action mechanisms required by the regulations, are treated with priority. Platforms will face periodic audits of their actions in compliance with the Act by auditors working on behalf of the European Commission. If it deems that a large online platform such as X has not been in compliance with the DSA, the European Commission may fine said platform up to 6% of its gross annual global turnover. The kicker is that the idea of non-compliance is hard to quantify so platforms will not know the threshold until it is established through precedent. As they will not know at the outset what exactly is required in order to meet due diligence obligations of systemic risk management, it seems likely that companies wishing to avoid legal and financial headaches will err on the side of caution and will put on a show of compliance to avoid getting fined. In this sense, it is exactly like the hate speech Bill. By design, no one will know where the line in the sand lies and so all will be forced to self-police and self-censor. The legislation is not intended to be enforced by An Garda Síochána or the courts, but by fear. This legislation will create an atmosphere of legal uncertainty both for online platforms and their users. It heavily incentivises online platforms to police speech in a manner that aligns with the EU Commission, around vague categories like disinformation and hate speech, and this will obviously have repercussions for the free speech of users. The net effect of the legislation is to apply an almost irresistible pressure on social media platforms to play the counter-disinformation game in a way that will earn them a passing grade with the Commission's auditors and thus avoid getting hit with hefty fines.
By 17 February, we need to rock up to the EU and rubber-stamp this legislation, which means it must go through four Stages in this House. This reminds me of the hate speech Bill. It got very little scrutiny in the Dáil and then it came to this House. Thanks to the fact that Senators were diligent and awake, we have stalled that legislation for now. I feel the same will happen with the Bill before us. Rushing this through is not the way forward. We need to spend time on this. We are not due to sit again until next Wednesday, which means we will get one day next week and one the week after to debate this, and that will be it. That is not how we should do our business in this House. We are not giving enough time to this legislation. I do not agree with forcing online spaces, which are the town square of the world, to abide by the EU's particular vision of reality, so I will be voting against the Bill.
I welcome the Bill and thank the Minister of State and his officials for being here.
It is technical legislation but important nonetheless. It is important it is done by 17 February to make sure we are in line with our colleagues in the European Union and able to fight against some of the challenges in the digital world. As Senators Crowe and Byrne mentioned, it is now clear that some of those tech companies and social media companies are not able or certainly are not willing to regulate matters themselves. This needs to be done by states and by the European Union. This Bill does that and Ireland needs to play its role in that. I welcome the Bill. I am sure next week we will have amendments tabled by the Opposition, when it can be teased out a bit further. I look forward to that too.
I thank the Minister of State for his time today. I welcome the Digital Services Bill. It is a pity Senator Keogan has left. Each provision in the Bill is necessary for Ireland to fulfil the mandatory requirements on member states under the EU regulations. I do not know what the big debate about it is. It just has to be done. We are in the EU and these are EU regulations we have to stick to.
The EU digital services regulation establishes clear and proportionate rules to protect consumers and their fundamental rights online while simultaneously fostering innovation, growth and competitiveness. It aims to rebalance the responsibilities of users, platforms and public authorities according to European values, placing citizens at the centre. The benefits for stakeholders include better protection of fundamental rights, more control and choice, stronger protection of children online and less exposure to illegal content. I do not see anything to object to there. Benefits also include greater democratic control and oversight over systemic platforms and mitigation of systemic risks such as manipulation or disinformation. Fines of up to 6% of the worldwide annual turnover of a provider found to be in breach should teach providers to care a little more. We often see complete disregard, or scant regard, from big social media platforms as they let mistruths spread exponentially. Sadly, we see it working in this country. I have always looked at America with pity at how polarised it has become, but if we do not get this right, and that is why this Bill is so important, we are going down that same road, where it will be 50% for and 50% against every single issue, and there will be no debate or nuance. It will just be completely polarised. It is not just the big companies that are at it. It also suits the far right. We had a case in County Clare in a place near Corofin where they decided to have a blockade. It was never on the radar and was never going to be used for international protection applicants, IPAs, yet all of a sudden there was a blockade. One of the lovely old men from the neighbourhood was walking past and was hospitalised for absolutely no reason. We have to take back some control of the appalling misinformation that exists.
Unfortunately, while people may not be gullible, they are believing. Seeds of doubt are being sown. It sounds so barbaric, it puts fear in people and, as a result, fear is driving people, even though we are the country of a thousand welcomes and we have in fact taken in more people per head of population than any other country in the world when Ukraine and the IPAs are taken into consideration. It is just sad to see a misrepresentation of our great people when these kinds of things happen. It demonstrates the risks posed by illegal and harmful online content. Ordinary citizens are being completely misinformed and misled. The sooner we can deal with that issue and hold people accountable, the better, because blatant lies are being told. It is putting the fear of God into people. Those people are being used by a certain minority with its own agenda and the whole thing is becoming really disturbing.
The Digital Services Act is very important. We have to have it done by 17 February. It is great that the Minister, Deputy Martin, got Coimisiún na Meán off the ground because so many people are affected, whether it is children being bullied online, people being hospitalised for absolutely no reason, like the old man in Corofin, or people being ripped off by purchasing things that do not exist. It is to be hoped this will go a long way in helping to protect our people and in helping us to get better at making sure the truth prevails.
The Labour Party welcomes the Digital Services Bill. We all have to be clear that, as Senator Garvey said, this is giving effect to the EU regulation on digital services. It is specific with regard to establishing the mechanism for the supervision and enforcement of the regulation in this country. In that regard we welcome the Bill. This is the latest chapter in the EU initiative to ensure we have a greater degree of safety online for citizens throughout Europe. In some ways this is a fundamental shift in our relationship with services like the Internet and social media, in that we are moving beyond a place where it is the Wild West to a place where it is recognised that the State and the EU have to legislate because of the real potential for harm in our online space. It is my hope that the EU Digital Services Act will have a transformative impact on how we all consume social media and, in particular, on some of the largest digital platforms.
One of the key concerns I want to raise is the designation of Coimisiún na Meán as the lead competent national authority. I am hugely supportive of the establishment of Coimisiún na Meán, but given that it has been up and running for less than a year, there are serious and fundamental questions with regard to its capacity and its expertise to do the enormous job that it is going to be tasked to do, in particular because Ireland is home to so many intermediary service providers. We are the Europe, Middle East and Africa, EMEA, headquarters for some of the biggest ISPs across the European Union. The challenge that Coimisiún na Meán will have is nothing short of enormous.
We saw the difficulties the Data Protection Commission encountered in this regard and, indeed, the reputational issues for this country when the Data Protection Commission was very much under-resourced to do the job it was tasked to do. Thankfully, if belatedly, resources materialised. An enormous effort needs to be put into ensuring Coimisiún na Meán gets off to a good start. It needs significant resourcing to do its job because something like 20 large online platforms are headquartered in this country. When we compare the role Ireland will have in regulating these providers relative to other EU member states, we have an enormous responsibility.
The second concern is that while Coimisiún na Meán has been designated as the national lead competent authority, the Competition and Consumer Protection Commission, CCPC, has also been assigned responsibility, in particular for online marketplaces. We have a concern that, given the scale of the regulatory challenge, this could give rise to logistical and other confusion. The last thing we want to see is anything falling through the net. It is important we put those concerns on the record.
However, in the same vein, we believe in the power and the capacity of the Digital Services Act to bring about a regulatory environment for online providers in a way we have not had previously. We will be supporting this Bill. We hope to have everything put through by 17 February.
The Minister of State is welcome. Sinn Féin is supportive of this Bill and the EU Digital Services Act as it seeks to regulate very large online platforms and provide a more equitable online environment. The Bill aims to address illegal and harmful content. It is hoped it will also rein in the powers of big tech and give Internet users a bit more control over their digital lives. Given the rise in disinformation and harmful content on social media and various online platforms, it is essential this legislation is effective and robust, but it also has to ensure the balance between protection from harmful content and, at the same time, protecting freedom of speech. In recent weeks, since the outbreak of Israel's war on the people of Palestine, there have been concerns about the use of the EU Digital Services Act in respect of the digital rights of Palestinians and other vulnerable communities.
Online information has been taken down amid accusations that it constitutes harmful content or disinformation. The person who posted it must then open a complaint and prove the content is legitimate and valid. This highlights how the EU Digital Services Act could potentially be applied with political bias and how the interpretation of such information can be politically driven. We only have to look at the statement from Commissioner Thierry Breton warning X, Meta and TikTok and reminding those companies of their obligations under the Digital Services Act with regard to the war in Palestine. The problem, of course, lies in the framing by the Commissioner and the EU of the situation, which ignores the Palestinian perspective and the ongoing human rights violations. It is fair to say the European Union has disgraced itself when it comes to Palestine.
It is a relief that Ireland has not been swayed by the narrative and that Irish people, media and politicians have been generally even-handed and fair in their analysis of the current conflict. I do not think the Irish people would let the Government away with bias in relation to the horrors being visited upon the people of Palestine by the apartheid Israeli Government. As I said, there is a balance that needs to be struck, and that is not easy. I welcome that individuals can appeal a decision that has removed their content, and that there are processes to help with this type of undertaking. This Bill is about ensuring a safer online environment for people who use social media and the Internet in general.
We all enjoy great advantages from using online platforms and media and the ability to have so much information right there at our fingertips. However, social media and video platforms also have a very dark side, and this is why we need much more advanced regulation. I am not sure how effective this legislation will really be regarding the specific issue of the damage online content is having on our children's mental health and well-being, but I hope it will have some impact and that we will continue to work harder to protect vulnerable users, including children, from harmful content.
We must also ensure that very large online platforms take responsibility for cracking down on online scams and fraudsters. Another major issue we are seeing is the spread of disinformation and all kinds of lies and far-fetched conspiracy theories. Indeed, Senator Keogan will be well acquainted with some of these. The more the lies are shared as if they were true, of course, the more people start to believe them. Unfortunately, trying to counteract the lies afterwards is a much more difficult and slower process.
There needs, therefore, to be much more accountability on the part of the platforms, including X, formerly Twitter, TikTok and Facebook. Many of these platforms are headquartered in Ireland, as the Minister of State acknowledged, and we do have a role to play here. In fact, 13 of these very large online platforms are established here. This means we have a critical and high-profile role in supporting the EU Commission in regulating them. My colleague Deputy Louise O'Reilly brought forward amendments in the Dáil, and we might have another look at those in this House. As I said, though, we do support this legislation. I apologise because I have to attend another meeting now, so I will not be here to hear the response from the Minister of State. I will, though, certainly follow it up afterwards.
As the Minister of State outlined, the Bill before us does not add to or alter the obligations placed on regulated entities by the EU Digital Services Act. This is because that is a regulation that has direct effect and impact. The purpose of this Bill is to give effect to the supervision and enforcement mechanisms in relation to it.
If I may interrupt the Senator for one second, I welcome all the students from Kildare Town Community School.
I wish the students a happy St. Brigid's weekend, particularly as it is a Kildare school.
This Bill will designate Coimisiún na Meán as Ireland's lead competent authority in respect of this matter. There are four systemic risks in this context. I will comment briefly in this regard. I will also probably be submitting some amendments on Committee and Report Stages to see how we can address these risks.
I start with illegal content, including child sexual abuse materials. I again note my colleague Senator Flynn's long-standing legislation to ensure that the inappropriate phrase "child pornography", which still exists in our legislation, would be replaced by the appropriate language, namely, "child sex abuse materials". As we mention this now, it is a chance to remember that we should be updating the other legislation in this respect as well. I refer as well to the impact of illegal hate speech, the impacts in respect of the Charter of Fundamental Rights around discrimination and privacy and the foreseeable impacts on the democratic process, public security and disinformation campaigns. I will touch briefly on each aspect, including, as I did when we discussed the legislation establishing Coimisiún na Meán, the other principle of the right to equitable and diverse participation within online spaces and communication.
It is welcome that there are wide-ranging restrictions under the legislation on precisely what kind of targeting online platforms can do. There will be a ban on targeted advertising based on people's sexual orientation, religion, ethnicity and political beliefs. It is no exaggeration to say that some of the targeting practices permitted until recently, and that are still taking place, have verged on the dystopian. In The Age of Surveillance Capitalism, Dr. Shoshana Zuboff gives an example of a young woman who did not know she was pregnant. An online algorithm, however, was able to detect the changes in her online shopping and correctly predict that she was pregnant and target information and products at her in that way. That is the somewhat dystopian aspect of this. This example did not involve targeting based on a protected characteristic, but it is a cautionary example. It is particularly welcome, for example, to see that targeting by sexual orientation will no longer be permitted. As we know, this type of targeting can have particularly detrimental impacts on young people. Other legislation is pending regarding a ban on conversion therapy, but we know that disinformation, and very frightening information, can often be targeted at people questioning their sexuality.
Still on targeting, I note that Senator Ruane and I had the general data protection regulation, GDPR, legislation amended when it came through the House. Our amendment specifically looked at limiting advertising to children, which brings me to our next area of concern. I again note that Ireland could have been ahead of the curve because our Data Protection Act 2018 has a specific clause seeking to ban the targeting of advertising to children. Senator Ruane and I submitted an amendment to that effect, which was successfully inserted into the Bill. That section of the legislation, however, was never commenced. It was never put into effect. It is good that the Digital Services Act is now addressing this, but we could, and should, have had this provision in place since 2018. When we think of the damage done because the targeting of advertisements to children has been allowed since then, it is regrettable that we have lost so many years in this regard. Several provisions in the legislation do go some way towards addressing harm caused in this regard. Users will now be able to view their social media timelines chronologically rather than in a sequence controlled by an algorithm. Children will also be able to be free of targeted advertisements while in online spaces. These are all important steps in creating an online space that is less manipulative.
The DSA goes part of the way towards curtailing the worst intrusive practices. Crucially, it is welcome that there is a ban on profiling based on protected characteristics such as sexual orientation and ethnicity. Serious concerns have been raised regarding human rights violations of this type. It was found in the United States, in the state of Washington, for example, that the type of housing information displayed was targeted to or hidden from people of particular ethnicities. Also influenced were the kind of messaging put out and the kind of profiles being built. Those profiles then become self-reinforcing, in that they can have a real social impact. That social impact then becomes something blamed on ethnicity, when it has, in fact, been created. There is a long record of this type of occurrence.
Coming back to the amendments, one of the concerns is that so many of those algorithms are not simply based on data points and profiling; they are also trained to associate certain prejudices with certain characteristics. I acknowledge Abeba Birhane, a really outstanding researcher at UCD, who is one of those who exposed the intense prejudice built into the MIT database. This is the massive database that MIT used, with many algorithms making many decisions targeting lots of material, including targeted advertisements. Ms Birhane was one of two academics who went inside that black box and looked at the signals and information points being fed into the algorithms on which they are trained. She found there was gross racial prejudice, misogyny and so forth within those black boxes. This is an Irish academic we should be very proud of as one of the first to highlight the impacts of this issue.
Very large online platforms, VLOPs, and very large online search engines, VLOSEs, are special categories. These entities have more than 45 million monthly active users, which is roughly 10% of the EU's population. These platforms are subject to more onerous obligations around transparency and the control of harmful and illegal content. This is welcome and appropriate. It cannot be overstated how much power those large platforms can have and the direct consequences of their power when it is misused and when it is unchecked. We saw very recently in Dublin how online platforms have been used to spread disinformation around asylum seekers and migrants, and how the platforms were used to plan and co-ordinate acts of violence against asylum seekers and against communities. I am also concerned that we are seeing some disinformation currently at work in respect of the forthcoming referendums, unfortunately. It is extraordinarily hard to monitor in those spaces.
Ireland is going to be a lead regulator in respect of those very large online platforms. There is a concern that our Data Protection Commission so far has not had a strong enough record in implementing the regulations we have had. Very often we have had an iterative approach of repeated engagement with platforms and very long lead-in times for action. It is appropriate that we are now looking for more restrictions. The decision to force Meta to halt its data transfers from the EU to the US is very positive from the EU, but it is regrettable that Ireland has not stepped fully into its role. A total of 75% of Ireland's Data Protection Commission decisions in EU cases over a five-year period were overruled by the European Data Protection Board. Effectively, Ireland is having to have the regulations applied at European level because we are failing to apply them properly at national level. That is a real shame for Ireland. It points to us as being less responsible in our role as one of the major hubs for all of the major online companies. It will reflect poorly on us and our international reputation if we are not seen to fully apply the EU's Digital Services Act. It points to attempts to wriggle out of obligations, which I will just signal.
I will also signal to the Minister of State a point which he may address when he comes back in. We have seen attempts by large platforms to define themselves outside of the rules. I am looking particularly to Amazon as an example. Will the Minister of State, Deputy Collins, assure us that Ireland will be taking very strong measures to ensure that large platforms do not manage to define themselves out of these rules that come in?
Lastly, I must signal that there are environmental implications around digital services and how they operate. I may be bringing some of those issues to the Minister's attention on Committee Stage.
The Minister of State is welcome to the House. There are some very serious issues with the Digital Services Bill that to date have not received enough attention, like the Criminal Justice (Incitement to Violence or Hatred and Hate Offences) Bill 2022 that preceded this. That Bill managed to sail through the Dáil with very little opposition.
I come at this from a very clear perspective. In my first week as president of the Teachers Union of Ireland I had to cope with a situation in one of our schools where two 11-year-old kids committed suicide. They committed suicide because of online bullying. What goes on online and through the digital media is frightening from time to time. As anybody in this House will tell you, when we make a complaint to the likes of X about something that was said or made up about us, including some of the horrendous things that are said, they come back and tell us it is not in breach of their rules. I would love to know precisely what the rules are.
The EU's Digital Services Act will control, we hope, illegal content but it does not just mean controlling child sexual abuse material or the import and export of contraband goods. It also includes stopping hate speech as defined by EU law. If the Criminal Justice (Incitement to Violence or Hatred and Hate Offences) Bill 2022, which is very broad and open to interpretation, were to be enacted in conjunction with the Digital Services Bill, the consequences could be very serious indeed. The Bill allows additional national competent agencies to be designated with the power to conduct specific tasks and this would include NGOs. At the moment we are facing into a referendum where many State-funded NGOs are putting out information and disinformation saying that the existence of the current articles in the Constitution has placed limits on women's rights to work outside the home. We have had two women Presidents in this country, several women CEOs, and thousands of women who are working. There has never been any limit on women working in this country. Almost all State-funded NGOs are on board with this. This morning I questioned whether this is directly in conflict with the McKenna judgment. Is the State usurping McKenna by using NGOs to sell the State's message around the referendums? The Electoral Commission has publicly stated that it does not have the means to regulate all misinformation and disinformation. Where does that leave us with the referendums? In recent correspondence, the commission did not define what it considers to be misinformation or disinformation. For example, if you or I see something online relating to the upcoming referendums that we feel is disinformation or misinformation - and today I will be writing to the commission on a number of those issues - we do not know that we should be reporting it if we are not aware of what counts as disinformation or misinformation. Definitions must be clear to empower agencies if we want to avoid Kafkaesque scenarios. Empowering agencies to deal with a problem they cannot clearly define is not really a good idea.
The establishment of the new media commission is a positive development, but anyone who has paid close attention to the issues must recognise the incredible and unenviable task this new commission will have. The Bill grants external organisations the ability to exercise regulatory powers, often based on vague definitions. Given the import of child sexual abuse images and so on, we clearly need the regulatory framework. The trend with Government policy regarding NGOs and the hate speech Bill, with the regulation of disinformation, and now with this piece of legislation is to create a regulatory framework by which views that go against the consensus or against the Government can be easily overshadowed or suppressed. We may find ourselves in a situation where we will not be able to speak out. That would be terrible, as I am sure the Minister of State would agree. I know his own views and that he is a fairly decent guy. There are many people in this Chamber who would be regarded as left wing or progressive but who might dismiss my concerns, for example on hate speech. They might think that defending free speech is a cause for the privileged or for the extreme right wing, a label that is often used by Members of this House, but we have to be able to speak out on things. We have to be able to highlight issues of concern. There is a creeping movement to try to suppress public comment on the Government.
We have to try to find a balance. Legislation has been getting rammed through this House with no regard for the important role this reforming House plays. I know the Government has to have the Bill passed by 17 February, but I am asking it to make sure it gives us plenty of time and listens carefully to the amendments that are tabled.
Minister of State at the Department of Further and Higher Education, Research, Innovation and Science (Deputy Niall Collins)
To recap, the purpose of the Bill is to provide for the full implementation of the EU regulation on a single market for digital services. It is an indispensable component of a pioneering regulatory framework to protect EU users of digital services and their fundamental rights online. The framework will rebalance the responsibilities of users, online platforms and public authorities, placing citizens at the centre. It represents a sea change in the EU's ability to protect society from illegal online content and disinformation.
Resourcing both our digital services co-ordinator, Coimisiún na Meán, and the competent authority, the Competition and Consumer Protection Commission, is of the utmost importance to the Government. We allocated €2.7 million last year to setting up the digital services co-ordinator function in the coimisiún. This funding has increased to €6 million this year, when it will be fully operational. The CCPC has also received additional funding in 2024 for its new responsibilities under the Digital Services Act. This will enable the continuing recruitment of people with the necessary technical, legal and regulatory skills for the coimisiún to carry out its functions as the Irish digital services co-ordinator and for the CCPC as the competent authority.
This is a technical Bill that is necessary to give effect to the supervision and enforcement provisions of the EU regulation. It will not add to or alter the obligations on intermediary service providers under the EU regulation. It will ensure the rights and protections provided for in the EU digital services regulation will be rigorously asserted in Ireland for the benefit and protection of the consumers of digital services. It is mandatory under the EU regulation for member states to give effect to these national provisions by 17 February, as we outlined. Our national digital strategy, Harnessing Digital, sets out Ireland's ambition to be a centre of regulatory excellence in Europe for the benefit of both industry and consumers. This ambition is to be realised through a modern, cohesive and well-resourced regulatory system for the digital economy, and the Bill will be a key enabler of this vision. This comprehensive and sophisticated Bill will, once enacted, reinforce Ireland's position as one of the most advanced and progressive EU member states in the digital economy and digital society, a fact acknowledged through our participation as a founding member of the D9+ group of EU leaders in this space.
I thank Senator Crowe for his supportive observations on the Bill and, likewise, Senator Malcolm Byrne. I agree that citizens and consumers should have the same rights online as offline. We do need to make people aware of their rights arising from the digital services regulation.
To respond to Senator Keogan in regard to the curbing of the freedom of speech, the Bill puts protection of freedom of expression at its core. This includes protection from Government interference in people's freedom of expression and information. The horizontal rules against illegal content are carefully calibrated and are accompanied by robust safeguards for freedom of expression and an effective right of redress to avoid both the under-removal and the over-removal of content on the grounds of illegality. The Bill will give users the ability to contest the decisions taken by an online platform to remove their content, including when these decisions are based on platforms' terms and conditions. Users can complain directly to the platform, use an out-of-court dispute settlement body or seek redress before the courts. The Bill provides rules on transparency of content moderation decisions for very large online platforms, VLOPs, and very large search engines, VLOSEs, and will provide users and consumers with a better understanding of the ways these platforms impact on our society. VLOPs and VLOSEs will be obliged to mitigate those risks, including as regards freedom of expression. They will be held accountable through independent auditing reports and public scrutiny. All the obligations in the Bill, including the crisis response mechanism, are carefully calibrated to promote the respect of fundamental rights, such as freedom of expression.
In respect of trusted flaggers, which Senator Keogan also spoke about, the Bill will oblige providers to allow all users to submit notices about illegal content. Trusted flaggers are organisations that have met certain conditions, including expertise in identifying illegal content, and have been awarded this status by the digital services co-ordinator, in our case Coimisiún na Meán. Trusted flaggers' notices will get priority but they are not the arbiters of illegal content. It will be the responsibility of the provider to assess and decide whether the flagged content is illegal and whether it should be moderated.
I thank Senators Ahearn, Garvey and Sherlock for their support for the Bill. I understand Coimisiún na Meán will have a big challenge, but significant resources have been provided since last year to support it to build capacity to ensure it will be ready, and further increases in resources will come this year.
I thank also Senator Gavan for his support for the Bill. I responded earlier to his concerns about the protection of freedom of speech, which is embedded in the regulations.
I thank Senator Higgins for her comments. We agree the prohibition on the targeting of advertising at minors is a very positive step, as is the ban on targeting based on special category data.
There is certainly a need to understand how platforms are using algorithms. The European Centre for Algorithmic Transparency was set up in April of last year to provide scientific and technical expertise to the European Commission, including by carrying out research into the impact of algorithmic systems used by online platforms and search engines.
As we said, Coimisiún na Meán has been resourced early to prepare for 17 February, when the Bill will effectively come into force. Resourcing in 2023 enabled Coimisiún na Meán to recruit staff with the level of expertise that will be required, including the digital services commissioner, who has been in place since July of last year.
I understand Senator Craughwell's concerns about online bullying and harassment. The Harassment, Harmful Communications and Related Offences Act 2020 was passed specifically to address these issues. We need to be able to speak out, and the digital services regulation seeks to strike the balance required to allow that to happen.
When is it proposed to take Committee Stage?
Next Wednesday.
Is that agreed? Agreed.