Online Safety and Media Regulation Bill 2022: Committee Stage (Resumed)
I welcome the Minister, Deputy Catherine Martin, back to the House once more. The amendments tabled for section 11 have been disposed of. Senator Alice-Mary Higgins was speaking to the section yesterday when we reported progress. Does anybody wish to speak to the section before I put the question? No.
I move amendment No. 142:
In page 66, line 2, after "mental" to insert ", emotional".
I move amendment No. 143:
In page 66, line 2, to delete “moral” and substitute “emotional”.
I welcome the prominence and availability of public service programmes and services in this Bill. This was discussed by TG4 and RTÉ at the committee. I just want to make the point. I do not have a question for the Minister. It is important to look at how many steps someone has to take when using the Sky box or the Virgin box to finally get to public service broadcasting or programming and to protect that in the future. It is a welcome part of the Bill. That is all I have to say.
Amendment No. 144 has been ruled out of order, as it incurs a potential charge on the Revenue.
Amendments Nos. 145 and 146 are related. Amendment No. 146 is a physical alternative to amendment No. 145 and they may be discussed together by agreement. Is that agreed? Agreed.
I move amendment No. 145:
In page 76, to delete lines 19 and 20.
I welcome the Minister back to the Chamber. I will speak to amendments Nos. 145 and 146. They both seek to address the inclusion of online content by which a person bullies or humiliates another person in the list of non-offence specific categories in the Bill. Both amendments aim to achieve a similar end, but go about doing so in different ways. Amendment No. 145 seeks to delete the inclusion of "online content by which a person bullies or humiliates another person" in the list of non-offence specific harmful online content. This particular amendment is a straight deletion of the definition of "harmful online content" as it stands, which is defined as "online content by which a person bullies or humiliates another person". In my view, the definition currently provided in the Bill is vague and ill-defined, to such an extent that it is unsuitable for inclusion in legislation.
It is not clear from the existing definition what sort of content would constitute bullying or humiliating content. As a result, this provision creates a huge potential for overreach by social media platforms. For example, will they be so concerned about being stung by this regulation that they will overcompensate by removing all potentially humiliating content from their online platforms, or will they begin to remove posts where citizens ridicule the decisions of policymakers and legislators for fear that the language being used is too humiliating? The vagueness and subjectivity of this term makes it completely inappropriate for a legislative provision. By making this provision, we risk granting social media platforms an inappropriate level of discretion over what sort of speech they allow on their platforms and what sort of speech they suppress. It would be reckless of us as policymakers to include such provisions in important legislation, which will grant powers to social media providers to limit the free speech of users.
Additionally, no clear division is created here between the content which is acceptable criticism and that which is unacceptable humiliation. One fundamental principle of which we should be mindful when creating laws is that the people the laws apply to should be able to understand and correctly interpret them. I do not think this is possible if we maintain the inclusion of a vague definition. This is a serious civil liberties concern. It is unclear what threshold is being created in the Bill given that the wording is so ambiguous. Is the threshold on bullying or humiliating behaviour? Does criticism of a person in authority count as humiliating behaviour? This wording sets worrying foundations whereby we could find public figures who exert enormous power over politics and the economy seeking to make themselves immune from ridicule or criticism. There needs to be a clear and better defined threshold by which we can easily separate out criticism and even ridicule from behaviour which is actual bullying.
We are all interested in preventing bullying online. It is clear that is a common goal we all share, but we cannot do this with such a broad and crude provision in the Bill as it stands, which does not seek to give bullying behaviour a proper definition. I stress again that this wording is in conflict with the principles we should follow as legislators, that is, that people should be able to read the law and know how to follow it. The way this section is currently drafted, it is not clear that someone would be able to read this provision and know whether they are breaking the rules by ridiculing a powerful person online. In my view, it either needs to be removed from the Bill, as amendment No. 145 would achieve, or be further clarified, as amendment No. 146 seeks to achieve. Alternatively, it could be clarified by the Minister and the Department by means of the introduction of a ministerial amendment on Report Stage.
Amendment No. 146 is an alternative to amendment No. 145. It seeks to replace the language in section 139A(3)(a) with:
"online content by which a person sends repeated, abusive communications to another person, to an extent that a reasonable person would conclude constituted intimidating or bullying behaviour."
In this way, we can seek to create a clear threshold that would separate legitimate criticism and disagreement online from actual bullying behaviour. This amendment, rather than simply deleting the vague definition of bullying or humiliating content, seeks to replace the definition with a stronger one that is less open to interpretation. This amendment seeks to create a clearer and less ambiguous threshold, which would allow online providers to actually identify the kinds of bullying content which the original definition seeks to capture. This amendment defines bullying content as "repeated, abusive communications".
This makes sense as, for example, we would not want social media platforms to simply remove messages that legitimately criticise the decisions made by people in positions of authority. We must be conscious of the fact that the way in which people engage with the issues of the time has changed. A significant portion of our communication takes place online or on social media platforms and while the language directed at people in positions of authority may not always be the kindest, it is often an absolutely legitimate expression of political dissatisfaction. To ask online providers to suppress this kind of speech would be a gross overreach. I am sure that everyone here in the House has been subject to messages of criticism online, but just because the language used by an individual in this way might cause offence to us as individuals, that does not mean that we should legislate for people not to be able to express their frustration or dissatisfaction in this way. By inserting the new definition, the legislation would instead identify bullying behaviour as behaviour which consists of repeated, abusive communications, which a reasonable person would believe constitutes bullying.
It is my hope that this clearer definition would fulfil the intended aim of the original provision, which is to put a stop to bullying behaviour online without unintentionally creating a chilling effect, whereby social media providers overcompensate by removing all negative, critical posts from their platforms. Again, we all wish to restrict bullying behaviour online, but we cannot simply discard civil liberties concerns in order to do so. This amendment creates a much clearer threshold in this Bill whereby the average person would clearly understand that bullying behaviour is unacceptable, while legitimate criticism and dispute is fine.
I thank Senator Ruane for tabling amendments Nos. 145 and 146 regarding the definition of cyberbullying in the Bill. The current definition in section 44, inserting section 139A into the Broadcasting Act 2009, was arrived at through extensive consultation and engagement with the Office of the Attorney General.
It refers to online content by which a person bullies or humiliates another person and subjects that definition to a risk test regarding risk to a person's life or risk of significant harm to a person's physical or mental health where the harm is reasonably foreseeable. The proposed amendments would have the effect of replacing the current definition with a narrower definition focused on repeated abusive behaviour and of subjecting that definition to a reasonable person test. In this regard, I believe that the content intended to be covered by the amendment is covered by the category of offence-specific content, which refers in turn to the updated offence of harassment under the Harassment, Harmful Communications and Related Offences Act 2020, Coco's Law, under section 45 of the Bill inserting Schedule 3 into the 2009 Act.
The category of harmful online content regarding cyberbullying is intended to be broader than the matters covered by Coco's Law but subject to the risk test. I do not believe it would be desirable to limit the definition of cyberbullying in this manner and I do not propose to accept the amendments.
I will look again at my definition between now and Report Stage but the most concerning thing is the word "humiliating" and whether we are combining the two or if there are separate definitions in the Bill. Bullying is an objective test. Bullying has a repeated element, whereas humiliation may not. When we put those things together we have to have a clear definition of what humiliation means in this context. Someone can be subjectively humiliated but how do we even begin to measure that? Bullying is a little more straightforward because we have definitions in law about objective testing of bullying versus humiliation.
I thank the Minister and her officials again. By the end of today, the Minister will have been in the Chamber for 16 hours dealing with many aspects of this Bill. We agreed not to subject it to a guillotine because of its importance and that we would go through the Bill. I would hope that during today's discussion brevity and concise contributions would help. However, if we debate many issues in depth at this Stage, I would ask that when we come to Report Stage, we do not re-debate all the issues. The Minister has given a commitment to reflect on some of the views here. I express sincere thanks on behalf of those of us who have dealt directly with the officials and the Minister to try to come up with some suitable wording on this. Given the debate here, I hope that we do not revisit all the issues on Report Stage and that if colleagues have issues, they would consult with the officials in advance of that.
Those of us who sat on the joint committee dealing with this faced the challenge around what was awful but lawful. Where do we draw the line between somebody throwing a couple of nasty comments and a process of harassment? I would say to Senator Ruane that if one reads section 139A(3) in conjunction with section 139A(4), as the Minister said, it makes it fairly clear what threshold is required. I am quite sympathetic because it was one of the things the joint committee struggled with. The other thing is that it is quite subjective. Something might be water off a duck's back to me, if someone was trying to bully or humiliate me, but someone else could find it much tougher in a different situation. It is difficult but I think it is covered by the definitions in the section.
I understand the Senator's concerns but I believe the provision is sufficiently precise to avoid the issues she raised. It was done with extensive consultation with the Office of the Attorney General. It was considered in-depth during drafting. Therefore, I will not accept the amendments.
I move amendment No. 146:
In page 76, to delete lines 19 and 20 and substitute the following:
“(a) online content by which a person sends repeated, abusive communications to another person, to an extent that a reasonable person would conclude constituted intimidating or bullying behaviour;”.
I move amendment No. 147:
In page 76, between lines 26 and 27, to insert the following:
“(da) online content by which a person makes available disinformation or false information which is intended to mislead;”.
The Bill creates new categories of harm around things that have not been illegal per se, such as promoting eating disorders or online bullying, but which are very obviously harmful and must be regulated. An obvious omission here is disinformation or false information. Ms Frances Haugen appeared before the joint committee in February and agreed that disinformation should be included as a type of harmful content. I think we are all aware of the dangers of harmful content. Many people in Ireland, particularly those who were vulnerable and isolated, fell victim to online disinformation during the height of the pandemic. In the United States, we saw the effect of false information around presidential elections and there have been devastating consequences in Myanmar and Ethiopia. There will be a gaping hole in the Bill if disinformation is excluded and, therefore, I hope the Minister will consider the amendment. The joint committee recommended in its pre-legislative scrutiny report that it be included in the Bill. The Department's response referred to the Digital Services Act.
The effect of amendment No. 147 would be to insert another category of harmful online content regarding disinformation. I do not propose to accept the amendment for a number of reasons. Disinformation and false information intended to mislead is being tackled on an EU-wide basis through a number of mechanisms, including the Digital Services Act, political agreement on which was reached on Friday, 22 April. The Government has decided that Coimisiún na Meán will be the digital services co-ordinator, which is the primary regulator under the Digital Services Act. My colleague, the Minister for Enterprise, Trade and Employment, has led Ireland's negotiations on the Digital Services Act.
The EU code of practice on disinformation is a Commission initiative which has involved a range of online platforms, leading social networks, advertisers and advertising industry players to sign up to self-regulatory standards to fight disinformation. It is the Commission's intention that the code will evolve into a co-regulatory instrument under the Digital Services Act.
In addition, the Commission has also established the European Digital Media Observatory, which has a hub in DCU and which has been tasked with monitoring the implementation of the code. While I note the intent of the amendment, the matter will be addressed by the Minister for Enterprise, Trade and Employment and his Department in the context of the legislation necessary to give effect to the implementation of the Digital Services Act in Ireland. I do not believe it would be useful to cut across this work at this stage.
Senators will be aware that in March the Government decided that once established under the Bill, Coimisiún na Meán would act as the primary regulator termed the digital services co-ordinator under the Digital Services Act. It made the decision in light of the clear synergies between the objectives and the approaches of Coimisiún na Meán and the digital services co-ordinator, including taking a systemic approach to dealing with online safety and platform regulation and similar resourcing needs and expertise for implementation and enforcement.
The Digital Services Act is still undergoing negotiation on a number of technical matters. The final text of the regulation is not available. Should the code of practice on disinformation evolve into a co-regulatory instrument under the Digital Services Act, as proposed by the European Commission, I would expect that Coimisiún na Meán would have a role to play in its capacity as the digital services co-ordinator.
There is a very specific problem with disinformation which is designed to influence the results of elections and referendums as these processes are at the heart of our democracy. The Minister for Housing, Local Government and Heritage asked the Attorney General to prepare proposals for inclusion in the Electoral Reform Bill around the protection of our electoral process against disinformation, with a view to bringing amendments to that Bill which, I understand, is due to commence Committee Stage in the Dáil next week.
The Institute for Future Media, Democracy and Society in DCU called attention to the lack of reference to disinformation in the Bill. It noted that the implementation of the codes of practice around disinformation was within the remit of the media commission.
The Minister has acknowledged that Coimisiún na Meán will be the digital services co-ordinator, but the institute felt it was appropriate that provision be made in the general scheme of the Bill at that time. At the very least, I will ask whether we will need to revisit this legislation in the future when the Digital Services Act, DSA, is complete. Could the Minister give me guidance on that?
On the Senator's query, there would be no need to revisit this legislation because the Digital Services Act will itself provide for it.
I move amendment No. 148:
In page 76, between lines 26 and 27, to insert the following:
"(da) online content by which a person promotes financial harm or online gambling content;".
I reserve the right to resubmit the amendment on Report Stage.
Amendments Nos. 149 to 152, inclusive, have been ruled out of order as potential charges on the Revenue.
I move amendment No. 153:
In page 77, line 4, after “(3)(e)” to insert “or 139B”.
I reserve the right to resubmit the amendment on Report Stage.
Amendment No. 154 has been ruled out of order as a potential charge on the Revenue.
I move amendment No. 155:
In page 77, line 9, after “139A(2)(b)” to insert “or 139B”.
I reserve the right to resubmit the amendment on Report Stage.
Amendments Nos. 156 to 158, inclusive, have been ruled out of order as potential charges on the Revenue.
I move amendment No. 159:
In page 77, between lines 38 and 39, to insert the following:
"(6) The Minister or the Joint Oireachtas Committee on Tourism, Arts, Culture, Sport and Media may write to the Commission and request that the Commission consider a potential new category of harmful content and upon receipt of such a request the Commission shall evaluate such requests and may consider making a proposal in respect of such a new category.".
Amendment No. 159 seeks to insert a new subsection, subsection (6), into section 139A of the Bill providing that the Minister or the Oireachtas Joint Committee on Tourism, Culture, Arts, Sport and Media may write to the commission and request that the commission consider a potential new category of harmful content and upon receipt of such a request, the commission would evaluate such requests and may consider making a proposal in respect of providing such a new category.
Section 139A as it stands sets out the interpretation of harmful online content and age-inappropriate content. I think it is hugely important that we create flexibility within the legislation for the Minister and the joint committee to suggest new categories of harmful content as they see fit and for the commission to consider and evaluate the requests to account for future changes in online behaviour but also to account for potential oversights that are not presently provided for or spoken to in the legislation.
I thank Senator Ruane for her amendment regarding a role for the Oireachtas joint committee in requesting that an coimisiún consider making a proposal for a new category of harmful online content. As it stands, the Bill provides at section 44 - inserting sections 139B and 139C into the Broadcasting Act 2009 - that it is within an coimisiún's sole discretion as to whether it considers making a proposal to the Minister for a further category of harmful online content. No specific role is provided for the Minister, the Government or indeed the Oireachtas joint committee at that point of the process. The reason for this was to safeguard the independence of an coimisiún, particularly regarding a matter as sensitive as proposals for further categories of harmful online content, and to ensure that the process for making a proposal under these provisions was separated from the political process. In this regard, the directive sets a high bar regarding the independence of regulatory bodies and requires member states to ensure that they are both legally distinct from the government and functionally independent of their respective governments and of any other public or private body. In light of these considerations, I am not proposing to accept this amendment.
I accept that response and will withdraw the amendment.
Amendment No. 160 has been ruled out of order as a potential charge on the Revenue.
I move amendment No. 161:
In page 78, line 3, after "public" to insert the following:
"which shall include publication on an accessible website, at least 4 weeks prior to the commencement of the consultation period, and for the duration of the consultation, which shall be not less than 4 weeks".
Amendment No. 161 provides that when the commission is bringing draft regulations to the attention of the public, it shall include publication on an accessible website at least four weeks prior to the commencement of the consultation period and for the duration of the consultation, which shall be not less than four weeks.
I thank the Senator for the amendment. The amendment would require an coimisiún, when undertaking a public consultation on a proposal for a new category of harmful online content, to publish the draft proposal four weeks before the consultation begins and to require that such a consultation should be at least four weeks. I do not propose to accept the amendment as it appears to be overly prescriptive. In the first instance, it would be more usual for a public body to give notice that a public consultation was commencing, rather than that a public consultation would commence within a specified time. I do not see the value in an coimisiún publishing a draft proposal for a new category of content without at the same time or shortly thereafter commencing its consultation period on the proposal. I also do not foresee circumstances where an coimisiún would seek to artificially foreclose or artificially shorten a public consultation.
I will withdraw the amendment with the right to resubmit it on Report Stage.
I move amendment No. 162:
In page 78, line 20, after "reconsider" to insert "or review".
Amendment No. 162 seeks to insert a provision into section 139C that would allow the Minister, as well as asking for reconsideration of or accepting a proposed new regulation, to request that the commission would review the proposed regulation.
The amendment would provide that where an coimisiún has submitted a proposal for an additional category of harmful online content to the Minister, the Minister may request that an coimisiún reconsider or review the proposal. The relevant section in the Bill as initiated provides that the Minister may request that an coimisiún reconsider the proposal. I do not accept this amendment as I do not see a substantial difference between the meanings of reconsider and review in this context. It is difficult to see how an coimisiún would reconsider a proposal without also reviewing it.
I will withdraw the amendment.
Amendments Nos. 163 to 165, inclusive, are related and may be discussed together by agreement. Is that agreed? Agreed.
I move amendment No. 163:
In page 78, to delete lines 36 to 40, and in page 79, to delete lines 1 to 3 and substitute the following:
"139D. (1) In this Part, 'age-inappropriate online content' means online content that either—
(a) is likely to be unsuitable for children (either generally or below a particular age), having regard to their capabilities, their development, and their rights and interests, in particular content consisting of—
(i) pornography, or
(ii) realistic representations of, or of the effects of, gross or gratuitous violence or acts of cruelty,
or
(b) consists of online advertisements or commercial communications which are age-inappropriate, including those advertising—
(i) high salt or fat foods,
(ii) alcohol, or
(iii) gambling.".
Amendment No. 163 suggests a different approach to tackle the problem of advertisements that may be harmful to children. This was discussed in one of our previous meetings with the Department and it seemed interested by this approach at the time. I am not sure if that is still the case. Section 139D of the Bill defines a separate category of "age-inappropriate content", which is not the same as harmful content but rather something which adults can be safely exposed to but children cannot. In the Bill, it is defined as including pornography or representations of violence.
This is a very helpful category to define within the Bill. We need to acknowledge that there are some types of content to which adults can be safely exposed, while children cannot. It is strange to me then that this category of content is defined but is barely used at all in any of the later mechanisms set up by the Bill. The only place it seems to be used is when issuing guidance materials by the commission. There it has the option of having regard to the risks of exposure to age-inappropriate content.
Since the Bill already goes to the trouble of defining this special category of material that is suitable for adults but not children, we could better use that definition by expanding it and perhaps adding further mechanisms later in the Bill whereby that type of content should be limited automatically by platforms.
For example, there is a stipulation later in the Bill - I might have to come back on the section; I think it is section 139 but I will double-check - that the commission can demand that content be either limited or removed. If it is outright harmful content, it can be removed, but the power of limiting content remains vague. Why not attach this power specifically to age-inappropriate content? Then the platforms - they already do this in many cases - could use automated means to detect and limit age-inappropriate content, in order that it never appears on the timeline of minors. This age-inappropriate content could include age-inappropriate advertising, as we discussed in previous amendments. This would be a new approach to limiting that kind of advertising.
Our amendment seeks to insert a new paragraph to extend the definition of age-inappropriate online content to include advertisements for products that are potentially harmful to the public health of minors. This would allow for a ban on targeting of this type of advertising at children, while still allowing it to be targeted at adults on online platforms. This is a different and possibly better mechanism to control advertising to children within the Bill. Currently, this seems to be left up to the media service and online safety codes, and it is not clear what type of advertising will actually be banned to children following the development of these codes. By better utilising the definition of age-inappropriate content, and actually defining certain types of advertising as age-inappropriate content, we can hardwire child safety measures into the Bill. We can ensure that we are explicitly identifying that these types of advertising are harmful to children, which we know they are. Also, it is a more efficient use of the mechanisms already created within the Bill, as currently the useful category of age-inappropriate content is poorly used within the remainder of the Bill.
Amendment No. 165 seeks to add a subsection to section 139D, providing that notwithstanding anything in this section, content which seeks to educate children on the effects of violence or acts of cruelty in an age-appropriate manner shall not be considered age-inappropriate online content. This is to ensure that educational material is still available on these subjects and is not unduly restricted.
I will speak to amendment No. 164, in which Sinn Féin calls for the new media commission to produce a directive outlining appropriate minimum age requirements for children to create online accounts. I am thinking, in particular, about the metaverse here. These are obviously environments that present huge opportunities for education and learning but I am conscious that this is the most significant attempt we have made, alongside the Digital Services Act, to regulate the 2D screen, yet we know that when children and people of all ages put on a headset it is a completely different physical and psychological experience and it has different privacy repercussions. The harms of the metaverse in particular are not yet clear. There have been incidents of sexual assault. We would like the media commission to bring forward proposals on a minimum age requirement for children's online accounts, which would apply to all of the social media companies, 2D screen, the metaverse and so on. Senator Malcolm Byrne had a briefing for people on the headsets. This was also a recommendation of the committee in its pre-legislative report. The Children's Rights Alliance was supportive of an age limit for the establishment of an online account.
I have some sympathy for what Senator Warfield has outlined. What is important in this legislation is that, in as far as possible, it is technology-neutral because who knows what kind of technology the new media commission will be regulating in the years ahead. In terms of some of the challenges and opportunities that are presented within the metaverse, we already see children and young people avoiding some of the rules around age identification and verification. I accept it is not primarily a responsibility of the media commission but, given its work, which relates precisely to the points that have been made, it is something of which the commission is going to have to be conscious. The primary focus of the role of the new office of the online safety commissioner will probably end up being around children's safety online and we will need to have regard to this. In terms of taking on board some of the concerns, this was an issue that the joint committee did debate in great detail.
The age of digital consent is a separate question which we may not wish to get into, but it is something the online safety commissioner is going to have to be aware of. As Senator Warfield says, the legislation must be technology-neutral for whatever new technologies come down the line, including devices that, through machine learning and AI, will monitor exactly what children and young people are doing online, so that those protections remain in place.
I thank the Senators. In terms of amendment No. 164 regarding a minimum age requirement for children, it is important to acknowledge that there is a very real issue with young children accessing online services that simply were not designed with them in mind. It is an issue that I am particularly aware of as both a parent of young children and chair of the National Advisory Council for Online Safety, which recently released a comprehensive report on children's online safety. Finding workable solutions in this area raises a number of complex issues, including privacy and data protection matters that would need to be resolved before a solution could be effectively implemented.
In this regard, it is useful to note that an EU-funded pilot called euCONSENT is in development. The pilot aims to deliver a system for online age verification and parental consent, which balances the rights of children and the need to protect them from online harm and age-inappropriate content. The outcome of this pilot will inform any approach taken to this issue at an EU and national level.
While I cannot accept this amendment, I will be asking an coimisiún to look at this issue as a priority in conjunction with the Data Protection Commission and to identify potential options and solutions for dealing with this complex issue. Not everything can be dealt with in a single item of legislation, but what matters is that we put the framework in place.
Regarding amendment No. 163, I note that section 139K of the 2009 Act, as inserted by this Bill, provides coimisiún na meán with the power to create binding safety codes to ensure "that service providers take any measures in relation to commercial communications on their services that are appropriate to protect the interests of users of their services, and in particular the interests of children".
Further to this, section 139K specifically provides that, in respect of providers of video-sharing platform services, an coimisiún shall make online safety codes requiring those providers to comply with the requirements of Article 9(1) of the revised directive in respect of advertisements directly placed by the provider and user-generated advertisements. This includes a long list of requirements, including regarding the protection of children. As such, given the extensive provision already in the Bill for the regulation of online commercial communication, I will not be accepting this amendment.
I understand the intention of amendment No. 165 in seeking to clarify that the definition of age-inappropriate content used in the Bill should not preclude educational material. However, I do not believe that this provision would have this effect due to its framing with regard to the capabilities, development, rights and interests of children. Accordingly, I am not accepting the amendment.
I will be brief, as I am conscious that others have debated the issue. I am happy to engage further with the Minister on it. Even though I understand where amendment No. 164 is coming from, we had extensive debate on very similar issues when the data protection legislation was going through. It is not simply the age of consent in terms of when one begins to have an online account; the question is what one may consent to. Even parental consent is often not a great screen because, while it may be fine in terms of that content, the problem is what the data are then used for. That is where the question of commercial regulation is crucial in this area, and perhaps enhancing those provisions in terms of commercial communications. The section, which was inserted on our behalf in the Data Protection Act, sought not simply to set an age at which a child might sign up, based on the idea of a digital age of consent, but to ensure that if somebody does sign up at 12 or 14, his or her data cannot be gathered and he or she is not construed as giving consent to other uses of those data, such as commercial marketing.
Basically, we went about it by limiting how children were targeted by commercial actors rather than just limiting or regulating the point of access for children. It put a burden on those engaging in online commercial activities that target children. That might sometimes be a more effective way of coming at this issue. I would note, notwithstanding our own amendment, which I will withdraw and reserve the right to come back on with regard to education, that I respect that we want to address the issue of educational information in order that children are empowered and can know what they are dealing with.
I would have one caveat that comes between the two, however. Since we started debating this Bill, it has been brought to my attention that educational online video material for children, even though that material may itself be very educational and positive, often harvests the data of children. There is, therefore, a real need to ensure that if a person accesses positive educational online content, to which many parents will say "Yes" and give consent and everything else, there is not a link between that educational online provider and a number of commercial actors that may then use the data. Sometimes the educational aspect is the upfront bit to which people agree but children's data are actually used outside of that, maybe even as the business model. Again, that is just to mention it. I think that perhaps spans amendments Nos. 164 and 165.
With regard to amendment No. 163, as I said, there is provision in section 139K of the 2009 Act specifically in respect of providers of video-sharing platform services. We know the coimisiún shall make online safety codes requiring those providers to comply with Article 9(1) of the revised directive, which has an extensive list of requirements, including the protections for children.
With regard to amendment No. 164, as I said, it really is a complex issue. I agree with the intent of the Senators' amendment but I do not believe it hits on a workable solution. As I said, there are active moves at EU level to identify practical solutions to these matters. In addition to this, last week, the European Commission proposed a new regulation on combating child sexual abuse. As part of this, the Commission foresees an obligation for certain online services to conduct risk assessments and take risk mitigation measures in respect of age verification. Therefore, work is being done in that regard, but I do not think this amendment offers a workable solution.
I move amendment No. 164:
In page 79, between lines 3 and 4, to insert the following:
“(2) The Media Commission shall produce a directive outlining appropriate minimum age requirements for children to create online accounts within 12 months of its establishment.”.
I appreciate that the Minister and I want to get to the same place. I think we should legislate for it, which is why we have proposed putting this in the legislation. I know the Minister will ask the media commission to look at the issue. I will press the amendment on this occasion.
I move amendment No. 165:
In page 79, between lines 3 and 4, to insert the following:
“(2) Notwithstanding anything in this section, content which seeks to educate children on the effects of violence or acts of cruelty in an age-appropriate manner shall not be considered age-inappropriate online content.”.
Amendments Nos. 166 to 168, inclusive, are related and may be discussed together with the agreement of the House. Is that agreed? Agreed.
I move amendment No. 166:
In page 79, between lines 28 and 29, to insert the following:
“(fa) section 42 of the Irish Human Rights and Equality Commission Act,”.
Amendments Nos. 166 to 168, inclusive, are all around trying to ensure that in deciding whether an online service provider falls into the category of providers to which the online safety codes may apply, the coimisiún might again have due regard to those principles of human rights and equality. They are the same points I have made throughout. Again, perhaps by addressing them comprehensively in the powers and functions section, we will not need to address them in another place. It is, however, part of a general principle of ensuring that those considerations are there in terms of the public duty on equality and human rights, particularly with regard to the online safety codes because there is that kind of harm protection role. We know that persons who fall under, for example, the equality grounds in Ireland may be especially vulnerable. It would be important that the coimisiún would have regard to that public duty on equality and human rights when considering which providers should fall into that category.
Amendment No. 167 would insert a caveat that when deciding whether an online provider falls into the category of providers, it would have due regard to the UN Convention on the Rights of Persons with Disabilities, UNCRPD. That is very much linked to amendment No. 168. These are around trying to embed different considerations. The reason I am bringing them in repeatedly is that these are things to which Ireland has signed up but that are often not embedded or part of something we think about constantly.
I referred to the UNCRPD, which Ireland has ratified, but I also referred to the web accessibility directive, which is now EU law but is, again, somewhat unevenly applied. This directive sets requirements for levels of accessibility in online websites and services. Its application has been phased, starting with websites and moving through to mobile applications, so it is working in a wider space. As I understand it, it should now be fully applied. Again, that means that when the commission is thinking about the safety codes, it might also be thinking about those measures. A small example of something that overlaps between online safety and the web accessibility directive, which is something we will come to later, is the question of flashing images that create effects and may cause seizures for certain persons. People should not face that risk in accessing online sites.
These amendments would require an coimisiún to have regard to a number of legislative instruments when designating a relevant online service to which online safety codes may be applied. Amendment No. 166 would appear to duplicate provisions already in law and, as such, I do not accept the amendment.
As I stated earlier in this debate, section 42 of the Irish Human Rights and Equality Commission Act 2014 provides that a public body shall have regard to the need to eliminate discrimination, promote equality of opportunity and protect human rights. As a public body, an coimisiún will already be subject to provisions of that Act. As I said only yesterday in this Chamber, I would expect that an coimisiún would not only comply with section 42 of that Act but also, in accordance with the spirit of the Act, demonstrate a culture and practice of respect for human rights.
With regard to amendment No. 167, I understand the UN Convention on the Rights of Persons with Disabilities was ratified by the State and entered into force in 2018. Accordingly, similar to other UN conventions that have been ratified and entered into, the State is legally bound to the obligations set out in the treaties. I do not see the rationale for inserting a reference to the UNCRPD in the context of the designation process.
The effect of amendment No. 168 would be to require an coimisiún to have regard to the web accessibility directive when considering whether to designate a relevant online service. However, this directive only applies to public bodies and, as such, its legal relevance to the designation process would be unclear. As such, I cannot accept this amendment.
In light of the fact that the Minister's officials have indicated a willingness to engage on the question of the proper recognition of human rights and equality in section 7 regarding powers and functions, I will not press the amendment in this context. I think that may be where we can best address it.
I move amendment No. 167:
In page 79, between lines 28 and 29, to insert the following:
“(fa) the United Nations Convention on the Rights of Persons with Disabilities,”.
I move amendment No. 168:
In page 79, between lines 28 and 29, to insert the following:
“(fa) the Web Accessibility Directive (Directive (EU) 2016/2102),".
I believe we will have an opportunity to debate the related issue I highlighted later. I will withdraw this amendment for now and reserve the right to reintroduce it.
Amendments Nos. 169 and 170 are related and may be discussed together by agreement. Is that agreed? Agreed.
I move amendment No. 169:
In page 80, between lines 36 and 37, to insert the following:
"(1A) A record of any consultations conducted under subsection (1) shall be kept and in the case of consultations carried out under subsection (1)(d) such a record will include the rationale by which the Commission deemed the person to be appropriate to consult with, and such records shall be deposited with the Minister and laid before both Houses of the Oireachtas.".
This amendment is similar to another provision I put forward. It is very important that there be transparency in respect of records, especially since this is an area involving very large amounts of money. It is a very significant area and absolute transparency is essential. Not only that, but the commission must also be seen to be transparent in how it performs its functions. This is necessary to engender the wide public trust that is essential to its successful operation.
As set out in the Bill, when it comes to designating an online service, the commission may consult a number of relevant persons. I have suggested who those people might be. The Bill also specifies that the commission may consult any other person it considers appropriate. My amendment proposes that where any such consultation under subsection (1) takes place, there should be a record of that consultation having happened and, in the case of consultations carried out under subsection (1)(d), which allows for consultation with any other persons the commission considers appropriate, the record should also include the rationale under which the commission deemed those persons appropriate to consult. I have further proposed that these records be deposited with the Minister and laid before both Houses of the Oireachtas. Perhaps they do not need to be laid before the Houses but, for accountability purposes, there certainly needs to be a record of such consultations.
We had a similar debate around another area in which I was looking for transparency. I ask the Minister to address this issue in order that we can move forward. There is the principle of the records existing and then there is the principle of where they should be deposited, whether with the Minister and-or before the Houses of the Oireachtas. It would be useful if she could address both principles in order that we can narrow our proposals down towards something acceptable on Report Stage. On the general principle of transparency, I note that similar amendments have been accepted by the Minister for the Environment, Climate and Communications, Deputy Eamon Ryan, in respect of climate legislation, and by the Minister for Housing, Local Government and Heritage, Deputy Darragh O'Brien, in respect of housing legislation. It has been a general practice that where there is consultation, we should endeavour to ensure it is known that such a consultation has occurred. That is constructive and positive and it builds public trust.
Will the Minister respond to those two points, that is, the question of there being a record and the question of where such records should be deposited? The principle of transparency is really important. Where the commission is engaging with a major online company, lobbyist or any other body, some of that engagement will be captured under the lobbying legislation, but some of it will not. There should be a record of all such engagements.
Amendment No. 170 is a fairly simple amendment to ensure that, in the maintenance of the register, somebody is deemed to be responsible for online safety at each named service; otherwise, questions must arise. We are requiring a company to set out very clearly its address, the nature of its business and so on for designation purposes. This amendment proposes that the company must also give clear information regarding the individual within that company, organisation or platform who is responsible for online safety. This would require the register to be updated regularly in order that the commission would know at all times who is the responsible individual within a company. The amendment is quite simple in that it does what it says. We hope this provision will be integrated into the Bill.
Amendment No. 169 would have the effect of requiring coimisiún na meán to deposit a record of consultations undertaken prior to the designation of a relevant online service with the Minister and to lay it before the Houses of the Oireachtas. I understand amendment No. 71 sought to achieve a similar effect. As referenced by the Minister of State, Deputy Chambers, in response to that amendment, it is not usual practice to lay internal records of the processes and activities of public bodies before the Oireachtas, nor for the Oireachtas Library to store such materials.
I understand the intention of the amendment is to ensure an coimisiún is transparent in respect of those persons it seeks to consult. However, as a public body, an coimisiún will be subject to appropriate transparency and accountability requirements, including to the Oireachtas joint committee. The principle of transparency is satisfied by the requirement for the commission to be accountable to the Oireachtas. As set out in the Bill, the executive chairperson and the commissioners will be accountable to the Oireachtas joint committee in respect of their functions, while the executive chairperson will additionally be accountable to the Committee of Public Accounts in respect of the finances and value-for-money practices of an coimisiún. The decisions of an coimisiún as regards the making of media service codes, rules and online safety codes are also subject to a clear public consultation process. Furthermore, details of those decisions must be laid before the Houses of the Oireachtas, where they may be subject to a negative resolution procedure. As a regulator, an coimisiún will necessarily be in contact with regulated entities regularly regarding the regulation of those entities. I expect an coimisiún to undertake all such contact appropriately and to be transparent in how it does so. Accordingly, I do not intend to accept the amendment.
Amendment No. 170 would require the contact details of the person or persons designated as responsible for online safety at the named service to be entered into the public register of designated online services. Under section 139J of the Broadcasting Act 2009, as inserted by this Bill, an coimisiún is required to maintain a register of such services, which shall include the address of the provider and any other information the commission considers appropriate about how the provider may be contacted by members of the public. This provision will achieve a similar effect to the Senator's amendment. Therefore, I will not be accepting the amendment.
I appreciate the point about documents being laid before the Oireachtas. However, I am concerned as to why there is a reluctance to have a record of those the commission consults. It is not the same to say the Oireachtas joint committee can bring people before it. Any of us who sit on a committee know there are two hours in which to cover every issue at a meeting. We should not have to ask people to read out a list of everybody with whom they have met. This is basic information that it should be possible to request and access. It should not be a matter of having to put in freedom of information requests to find out who was consulted or having to drag that information out through committee work. That is a waste of everybody's time and energy. The information should be part of the record.
I accept the Minister's point that it is not usual practice to lay an internal document from a public body before the Oireachtas. However, there should be a rationale given as to why the commission decided to meet with somebody. Even if she does not want to include the rationale, because that is something that could be asked about in a committee, the idea of not specifying that there be a record of the consultations is a problem. There should be such a record. As I said, the same principle has been established in regard to other legislation. It is part of a general improvement in how we do things and part of building public confidence. The State's job is not simply to hand everything over to bodies and say it hopes they will do their best. We regulate such bodies and, as part of that, we should be specifying what kinds of records are required.
It is important that there be a record of consultations. I would like to see a requirement for providing a rationale, but I understand the concerns in that regard. However, the simple requirement for a record of consultations made is important. It relates to what Senator Malcolm Byrne said earlier. We have given a lot of time to this and everybody has loads of places to be and lots of things to do. This is part of the Committee Stage process. We have been engaging by not pressing a large number of amendments where there is some scope to work with the Government. The success of Report Stage is in the gift of the Government because it has indicated it is willing to work on lots of aspects of the Bill. We have all taken that in good faith. Nobody has pressed any amendment where there is any indication of potential progress. Everybody has indicated that we would much prefer the Government to bring forward its own amendments to tackle issues of mutual concern. The success of Report Stage will be down to whether there are constructive proposals brought forward. We are all hoping for that.
I do not want to have to table hundreds of amendments on Report Stage. That is why I am hoping the Government will indicate its amendments before we reach Report Stage. That will save all of us unnecessarily duplicating each other’s amendments. We have seen so many areas that overlap.
I want to clarify a question raised by Senator Malcolm Byrne on the online safety designated person. Would that be separate from the data protection officer? I think it is a separate role. That is a constructive proposal. We know from the Data Protection Act that the inclusion of a named person has made a difference in terms of where data protection has been applied successfully. Similarly, having a designated online safety person could prove key to things being effective.
As I said in relation to the principle of transparency, when it comes to the making of the media service codes and rules and the online safety codes, they are subject to a clear public consultation process and must be laid before the Houses of the Oireachtas so I see that transparency there.
The regulator by its very nature must be able to engage on an ongoing basis with the entities it regulates, without an unnecessary or undue administrative burden. There are practical considerations here. We have struck the right balance, so I will not be accepting the amendment.
I move amendment No. 170:
In page 81, between lines 30 and 31, to insert the following:
“(c) contact details of the person or persons designated as responsible for online safety at the named service.”.
I move amendment No. 171:
In page 81, line 36, to delete “may” and substitute “shall”.
A similar principle underpins amendment No. 171 as other amendments. It seeks to insert “shall” instead of “may” specifically in relation to the making of online safety codes. As I said, these are the core duties and functions and they should not be subject to discretion. They should be established as an expectation in the legislation, hence my preference for the word “shall” rather than “may”.
In compelling an coimisiún to make online safety codes, amendment No. 171 would have the effect of limiting the discretion of an coimisiún in when and how it exercises its regulatory duties through the creation of online safety codes. I refer to our debate yesterday, which was on the similar subject of media service codes. I reiterate that the purpose of establishing the independent regulatory body of an coimisiún under this Bill is to delegate the exercise of powers from the Oireachtas to that body in accordance with Article 15.2.2° of the Constitution. That a body delegated such powers has discretion in the exercise of such powers within the strictures of legislation is appropriate. It is in the spirit of the provisions on the independence of regulatory bodies in the revised audiovisual media services directive, AVMSD.
In addition, changing the word “may” to “shall” in those provisions would create a mandatory obligation in every instance which, given the breadth of the provisions, may create a problem. Changing this from a discretionary power, which is what the word “may” confers in this context, to a mandatory obligation would mean that an coimisiún could be challenged for not having made a code about any particular standard or practice of these service providers. Given the breadth of these provisions, this is potentially significantly problematic. If we went down this road, we would likely need to limit significantly the range of matters that may be addressed through online safety codes to a small and exclusive list that would no doubt rapidly become out of date. I will not be accepting the amendment.
I move amendment No. 172:
In page 82, line 9, after “to” to insert “advertisements or”.
I will withdraw the amendment because we have had engagement on this issue. I reserve the right to reintroduce it if necessary.
Amendments Nos. 173 to 177, inclusive, and amendment No. 183 are related. Amendment No. 174 is a physical alternative to amendment No. 173. Amendments Nos. 173 to 177, inclusive, and amendment No. 183 may be discussed together.
I move amendment No. 173:
In page 82, to delete line 12 and substitute “children, having particular regard to the general public health interests of children.”.
The marketing of nutritionally poor food has an effect on children’s consumption preferences. It has an effect on their purchase requests and on what they ask their parents and guardians to buy. It has an effect on what they choose to consume. It has an effect on their health.
We need the regulation of marketing practices that incentivise the consumption of nutritionally poor food. We need such regulation because of the high rates of child obesity and related non-communicable diseases throughout the European Union and at home in Ireland.
The imposition of online restrictions on the marketing of high fat, salt or sugar, HFSS, foods and drinks to protect children from harm is even more warranted, because the marketing strategies that are used to promote such foods are increasingly integrated, increasingly immersive and increasingly personalised. Therefore, they are more likely to cause harm. I hope the Minister will engage constructively and positively with amendment No. 173. I will come back in to speak on amendment No. 175.
Amendment No. 173 takes the language from Part 5 of the Bill about how the public health interests of children must be a key consideration of how the media regulator regulates advertising and transposes it to the section covering digital media. At this point, I would like to thank Kathryn Reilly of the Irish Heart Foundation for her assistance with this amendment.
One of the important features of the Bill is its attempt to bring the regulation of conventional and digital media into the common legislative framework. It is a great step forward but this Bill, as drafted, does not go far enough to create parity between digital and conventional media. This amendment will make sure the public health interests of children are explicitly considered by the regulators overseeing the digital sphere. This is particularly important because young people are spending more and more time in digital spaces and consuming digital media. They are watching less conventional media. It is important that regulations that govern advertising reflect this reality.
The online world is run on advertising. It is totally pervasive. Online advertising can be discreet, explicit and subtle. It can be incorporated into online content through sponsorships and product placement. This obscures the distinctions between entertainment, information and advertising. These blurred lines make online advertising more effective, particularly for younger or more vulnerable consumers who are not equipped with the media literacy and critical capacity of the standard adult. Given the huge amount of time young people spend consuming this digital content, as well as the insidiously effective mechanisms at play with digital advertising, it is vital that online regulators are empowered to take action against harmful advertising with the public health interests of children in mind.
This legislation already contains that brilliant language that gives due weight to the importance of protecting children's health. This amendment is a rather modest one which seeks to replicate the Bill’s already effective language in the section that covers digital media. I hope the Minister will accept this amendment, which will strengthen the Bill and will ensure parity for regulation across different types of content.
Amendment No. 176 is a copy of a provision in Part 5, Chapter 3, dealing with media services codes and media services rules that may prohibit the advertising of high fat, salt and sugar foods and drinks in commercial communications on media services, recognising the “general public health interests of children”. This was part of the transposition of the audiovisual media services directive. The Bill already makes provision for restrictions on junk food and drink advertising on media services in the public health interests of children. This amendment would simply extend that to online services. It is using the text of the Bill to extend protections to children from harmful marketing into the online world. It also includes identical restrictions on the advertisement of gambling and alcohol. The evidence is unequivocal that the marketing of nutritionally poor food affects children’s consumption preferences, purchase requests, consumption choices and, ultimately, their health. Bearing in mind the high rates of child obesity and related non-communicable diseases throughout the European Union, the effective regulation of marketing practices that incentivise the consumption of nutritionally poor food is urgently needed.
The imposition of online restrictions on the marketing of junk food and drinks to protect children from harm is even more warranted as the marketing strategies used to promote such products are increasingly integrated, immersive and personalised, and therefore likely to cause harm and create long-term behavioural patterns. General comment No. 25 of the United Nations Committee on the Rights of the Child in respect of the digital environment calls on states to make the best interests of the child a primary consideration when regulating advertising and other forms of marketing addressed and accessible to children. We are seeing the deployment of a vast array of big data and adtech tools to tap into online cultural spaces, and to infiltrate them with powerful promotions for some of the unhealthiest products on the market. There exists a large, integrated, data-driven digital marketing system, fuelled by big data, artificial intelligence, and machine learning. This has created a powerful, pervasive and immersive digital environment that is harming children’s health, furthering health inequities, and contributing to increasingly higher levels of disease in the population.
The Covid-19 pandemic triggered a dramatic increase in people’s screen time. Children and teens whose schools closed relied on YouTube for educational videos, attended virtual classes on Zoom, Microsoft Teams and Google Classroom, and flocked to TikTok, Snapchat and Instagram for entertainment and social interaction. This constant immersion in digital culture has exposed them to a steady flow of marketing for unhealthy products, much of it under the radar of parents and teachers. Unfortunately, the Bill as currently drafted does not appreciate the key role online marketing plays in eroding children’s health. Today’s youth, from infancy through to adolescence, are at the epicentre of an exploding digital media and marketing landscape. This Bill provides a unique opportunity to address this through amendment of Part 11, Chapter 3, on online safety codes. By not including a ban on alcohol advertising to children in the online safety and media regulation Bill, we would allow an unacceptable gap in protection to continue. There is an illogical reality where in real-world settings we have laws to ensure there are no alcohol ads in proximity to children’s environments such as schools, playgrounds, etc., but in the digital space, where children are spending increasing quantities of time, we are allowing the alcohol industry to put ads in the palms of the hands of our children 24-7.
This amendment is in the spirit of the Bill. It aims to put an end to the days of the Internet operating as a wild west where standards are lower and anything goes. There is a clear gap in how we choose to protect children from harmful advertising. The way to remedy this is by way of this amendment, which specifically prohibits the advertising of alcohol, gambling and high fat, salt and sugar foods in the online spaces that children inhabit. Only by naming these harmful products can we ensure that this will be enforced. We know that advertising these products to children is wrong and this should not be a battle that needs fighting.
I want to speak on amendments Nos. 173 and 174. To add in these lines will only strengthen the Bill. They are simple and short measures, but it is important that they would be added. I compliment the Irish Heart Foundation, which many of us met during the pre-legislative scrutiny phase, on the work it has done and for the interest it has taken in this Bill in the interests of children's health. I would be fully supportive of adding in these small amendments, which strengthen the Bill.
On alcohol, I said yesterday - and Senator Black has just said - that the reality is that the promotion of alcohol is in the palms of the hands of children by allowing it to be advertised on their phones. That is an image we need to be cognisant of when we make decisions. If we do not put this in the Bill, we will allow our children to face that advertising 24-7.
I want to speak to amendment No. 175. I am reminded that the Connemara filmmaker Bob Quinn always fought against advertisements during children's programming when he was on the board of the RTÉ Authority. I join Senator Black in thanking the Irish Heart Foundation for its engagement in respect of this Bill. I thank former Senator Kathryn Reilly for her work and for the Irish Heart Foundation's work on all Stages of this Bill and for their interest in it. It is grounded in a huge amount of work by the World Health Organization, as Senator Black has mentioned, and by UNICEF. In its 2019 report, Children, Food and Nutrition - Growing Well in a Changing World, UNICEF called for a broad regulatory framework for food marketing to children, encompassing TV, video games, film, books and social media for all age groups, as well as businesses and restaurants that give away toys to market unhealthy food. I thank former Senator Kathryn Reilly and the Irish Heart Foundation for their work on these amendments and during Committee Stage. I hope the Minister will give consideration to amendments Nos. 173 and 175 as well.
I join in thanking the Irish Heart Foundation, the Children's Rights Alliance and the many groups that have advocated on this set of amendments as one of the core areas of amendment to the Bill. I also thank the individuals who have contacted us with their concerns about the online targeting of children and the kinds of content to which children are being exposed and with which they are being targeted. The fact that the Minister can see amendments in this grouping from Sinn Féin, the Civil Engagement Group, the Labour Party and many others shows that these concerns are common. I urge the Minister to engage, given that there is value in the different proposals that have been put forward. The clear and general signal is that the media service codes contain some acknowledgement of that public health component in respect of children, and the online codes should similarly contain some form of recognition. Amendments Nos. 172 and 173 are minimalist in nature. They simply seek to recognise the public health interests of children. There is the other suite of issues that have been identified.
I will highlight two things that are not in my amendment in this grouping but that are valuable. If the Minister is coming back with her own proposal, I would be open to all of these issues being dealt with in that.
There is recognition in Senator Black's proposal of commercial communications in gambling that target children. That is an issue that goes from the loot box we talked about in this House to other areas where gambling is targeted at children. Very young people can be exposed very quickly to gambling. That is a very important example. Senators have tabled amendments on trans-fatty acids, sugars and salts and it is mentioned in the original Bill, which is a clear sign the Minister recognises their dangers. The other issue is gambling and the fact that it affects children.
The Labour group has spoken at length, and I will not prolong it, about recognising that milk-based formula and follow-on milks aimed at young children are a public health concern. They include that in the definition of public health concern. Whatever changes we come to on Report Stage, in terms of recognising this issue, I hope it will be reflected in the online codes as well as the media service codes given what I highlighted earlier, that some of the areas of code are about online service providers, such as baby clubs, targeted messages and the expert-mum forum pop-ups people are invited to.
Senator Ruane and I put forward amendment No. 183 in this section. What is important about this amendment, and I urge that it be listened to, is that it does not just speak to online content. It speaks to the fact that online safety codes should provide for restrictions on the use of personal data of children for profiling, micro-targeting or direct marketing. This is about the recommender system and data gathering. It is about data provided for one purpose being used for another. Senator Black outlined the extent to which parents have had to rely on online educational resources during a significant period of online home education, for example. Any such material that targets children should not store information in respect of the children's ages. If people want to access education materials through an online education provider, and they have children of this age and that age in the house, they should not receive advertisements based on the children's ages. That should not be the case and it should not be allowed.
I emphasise this issue because I am very frustrated by it. It was identified as a key issue in 2018. We had the agreement of all parties to put it into the Data Protection Act. It was not commenced because there was a concern about the wording, but it has never been addressed. It has been four years since then, and nothing we have learned in that time about the online targeting and profiling of children and how their information is used has been comforting. The issue has only worsened. We constantly hear of news scandals about the harvesting of information and so forth, and this issue still has not been addressed. That is why we push to have measures put into legislation because if they are not included and strongly committed to, they may not happen.
I ask that the Minister take on board the minimalist point on public health, the points everybody is making about high-fat and salty foods, the points on gambling, on breast milk substitutes and on alcohol. Alcohol is being targeted at children, effectively and indirectly, which was acknowledged by Senator Carrigy. It is not just a question of the content but how it is reaching children. The Minister has some concrete, solid proposals that could, through this Bill, set an agenda for online safety codes. If the commission were given a signal politically, from us, through this legislation, that these are areas we expect it to work on, it would be incredibly positive and it would set an agenda.
The Minister mentioned that the purpose of this Bill is the delegation of powers. The purpose is not the delegation of powers. The purpose of this legislation is media and online regulation. We are entrusting some of that to be delivered by a commission that will be established. It is absolutely within the rights of the Legislature and the Government to send signals to the commission as to how we want that done. We are not giving them full discretion in that regard. We are giving them the responsibilities and duties and we should be sending the signal as to the areas we expect to be addressed. That does not exclude the commission from identifying other areas that we have not thought of. If these areas are heard as common concerns from across society and the House, they should be put on the agenda for online safety.
I recognise the objective behind these amendments, which is to specify that an coimisiún may protect the health of children by regulating, through online safety codes, commercial communications relating to certain products, including foods high in fat, sugar and salt, as well as alcohol and gambling products. In this regard, I note that an coimisiún already has this power under the Bill and, in particular, in relation to the regulation of commercial communications on video sharing platform services, which must abide by the provisions set out in Article 9(1) of the revised audiovisual media services directive. However, in taking any action in this area, an coimisiún would be guided by the policies of the Department of Health, which is responsible for public health policy on these matters.
I see the merit of specifically referencing the public health interests of children in this section. For this reason, although I am rejecting amendments Nos. 173 to 177, inclusive, I would like to return to them on Report Stage once I have engaged with the Minister for Health.
On amendment No. 177, I note the reference to milk-based formula and remind the Senators that I have committed to examining issues as to the advertising of such products in advance of Report Stage. However, on amendments Nos. 175 to 177, inclusive, I note that just referencing programmes would not cover the range of user-generated content available through designated online services nor would it cover those commercial communications placed directly by the service provider.
I cannot accept amendment No. 183 as I believe the wording in subsection (1) of the amendment is not specific enough to achieve its intended objective and that the issues regarding the use of personal data of children in subsection (3), while worthy of attention and concern, are a matter for the Data Protection Commission. If a provision in the Data Protection Act has not been commenced, the issue that arises is the commencement of that provision by the Minister for Justice, not repeating those provisions in another Act, one that deals with content regulation and not data protection. As to subsection (2), I am examining this issue in relation to the other amendments in this grouping.
In conclusion, I do not accept amendment No. 183, but I will examine the other amendments in this grouping before Report Stage.
The provision in the Data Protection Act has not been commenced but, yes, it focuses on the targeting, gathering and use of information. However, it is and would be appropriate to address it in this Bill. We have been assured that part of the regulation of online content is content delivery. Measures aimed at content delivery based on the profiling of children are relevant to this Bill. It is very important that we put a marker down. When this area is talked about in the public realm, what ends up being talked about is children. When people talk about online regulation to anybody on any street corner or in a taxi, or to a friend at dinner, they talk about children. The fact is that we have failed to provide measures against the gathering of children's data and their being targeted and profiled and sent material that is inappropriate to them in an inappropriate way. We did not do it properly in 2018 when we had the chance. As I said, this Government and a previous Government have failed to do it in the meantime. We are also potentially at risk of not doing it in this legislation.
We will be hearing about online safety commissioners and everything else. What I am saying is that the question of how the data is being delivered to children and whether profiles are being made of children needs to be addressed as part of that. It is fundamental. I will not pretend that we have done lots of stuff with an eye on safety if we are not dealing with the issue of children being profiled and targeted.
I still believe that we are dealing with different regulators. The use of personal data of children is dealt with in subsection (3) of the amendment. It is worthy of attention and concern, but it is a matter for the Data Protection Commission. The regulator that we are setting up in this Bill deals with content regulation and not data protection. This is an issue for the Department of Justice and for the Data Protection Commission, so I am not accepting this amendment.
I appreciate the Minister said she would return to this on Report Stage, so I withdraw the amendment and reserve the right to resubmit it.
I move amendment No. 174:
In page 82, to delete line 12, and substitute “children, having particular regard to the general public health of children.”.
I thank the Minister for her comments and will withdraw the amendment at this stage.
I move amendment No. 175:
In page 82, between lines 12 and 13, to insert the following:
“(2A) Provision made for the purpose referred to in subsection (2)(d) may prohibit the inclusion in programmes of commercial communications relating to foods or beverages considered by the Commission to be the subject of public concern in respect of the general public health interests of children, in particular those foods or beverages which contain fat, trans-fatty acids, salts or sugars.”.
I will withdraw the amendment and reserve the right to resubmit it.
I move amendment No. 176:
In page 82, between lines 12 and 13, to insert the following:
“(2A) Provision made for the purpose referred to in subsection (2)(d) may prohibit the inclusion in programmes of commercial communications relating to foods or beverages considered by the Commission to be the subject of public concern in respect of the general public health interests of children, in particular those foods or beverages which contain fat, trans-fatty acids, salts or sugars.
(2B) Provision made for the purpose referred to in subsection (2)(d) may prohibit the inclusion in programmes of commercial communications relating to alcohol products considered by the Commission to be the subject of public concern in respect of the general public health interests of children.
(2C) Provision made for the purpose referred to in subsection (2)(d) may prohibit the inclusion in programmes of commercial communications relating to gambling considered by the Commission to be the subject of public concern in respect of the general public health interests of children.”.
I thank the Minister for reconsidering.
I move amendment No. 177:
In page 82, between lines 12 and 13, to insert the following:
“(2A) Without prejudice to the generality of subsection (2)(d), measures to be taken by service providers under that paragraph may include a prohibition of the inclusion in programmes of commercial communications relating to foods or beverages considered by the Commission to be the subject of public concern in respect of the general public health interests of children, in particular foods or beverages containing fat, trans-fatty acids, salts, sugars or milk-based formulae (infant milks, follow-on milks, growing-up milks and toddler milks) aimed at infants and young children up to 36 months.”.
I thank the Minister for her comments.
I move amendment No. 178:
In page 82, line 28, to delete “may” and substitute “shall”.
We do not need to elaborate because it is the same principle again. The purpose of this Bill is not just to give a lot of power to the commission but also to regulate and ensure there is regulation of the online media space and the provision of online safety. The Oireachtas is being asked to give responsibility for delivery of that policy goal to the commission. It would be appropriate that we would, in fact, include requirements for duties to be performed; otherwise, we simply leave discretion to the commission as to whether certain functions happen. Bear in mind, this does not interfere with the independence of the commission in terms of how it performs, but to delegate whether it performs and to leave that as a discretionary matter is inappropriate and is, in fact, an abrogation of our responsibilities to make sure these things happen. Therefore, it should be “shall” and not “may”.
I cannot accept this amendment. As I said previously, it would have the effect of limiting the discretion of an coimisiún in when and how it exercises its regulatory duties through the creation of online safety codes. I have explained my reason for not accepting amendments of this kind in previous discussions around amendments grouped with amendments Nos. 133 and 171. I am not proposing to accept this amendment either.
Amendments Nos. 179 and 180 are related and may be discussed together by agreement. Is that agreed? Agreed.
I move amendment No. 179:
In page 82, between lines 39 and 40, to insert the following:
“(4A) When preparing such codes referred to in subsection (1), the Commission shall have due regard to section 42 of the Irish Human Rights and Equality Commission Act 2014, the United Nations Convention on the Rights of Persons with Disabilities and the Web Accessibility Directive (Directive (EU) 2016/2102).”.
I will not dwell on amendment No. 179. It is the same point. It is on ensuring that the United Nations Convention on the Rights of Persons with Disabilities, the Irish Human Rights and Equality Commission Act, the public duty on equality and human rights and the web accessibility directive would be reflected in the preparation of codes.
On amendment No. 180, again, I am not against consultation. I think it is good. However, it is specifying that in the preparation of the codes, the commission would consult. I looked for a record of whom it consults. It has full discretion to consult with whomever it sees fit, but there should be a record of it. I suggest it should consult with appropriate persons who have relevant expertise in the areas of human rights and digital public participation, because rights are a key component of this area of online safety. These are areas of expertise that would add to the development of online safety codes and should be part of the preparatory process.
In the first instance, I would suggest that amendments Nos. 179 and 180 would be more appropriate to amending section 139M of the Bill, rather than amending section 139K. Amending section 139M already provides for a range of matters that an coimisiún must have regard to when preparing online safety codes.
On the intention behind amendment No. 179, and in the first instance, the proposed inclusion of section 42 of the Irish Human Rights and Equality Commission Act 2014 as a matter which must be considered by an coimisiún in the preparation of online safety codes would, as I noted before, appear to duplicate provisions already in law. Section 42 of the Irish Human Rights and Equality Commission Act 2014 provides that a public body shall have regard to the need to eliminate discrimination, promote equality of opportunity and protect human rights. As a public body, an coimisiún would already be subject to the provisions of that Act. As with other UN conventions the State has ratified and entered into, the State is legally bound by the obligations set out in those treaties. I do not see the rationale for specifically referencing the United Nations Convention on the Rights of Persons with Disabilities in the context of making online safety codes. Further to this, and regarding the web accessibility directive, as I said, this directive applies only to public bodies and, therefore, its legal relevance to the making of online safety codes is unclear. Therefore, I do not intend to accept amendment No. 179.
On amendment No. 180, in the first instance, I note that under the amending section 139(1)(b) an coimisiún shall consult with any person it thinks appropriate in advance of making an online safety code. While, in practice, this will be a matter for an coimisiún, I would see this as covering the persons referred to in the amendment. In addition, I would see it as inappropriate to single out persons with expertise in a limited field who must be consulted. This may give rise to the impression that the views of those persons should be held in higher regard than those not explicitly referenced. It would also be impractical and quite difficult to provide for an exhaustive list of such persons who must be consulted in advance of making an online safety code. Generally, however, it is the expectation that there will be a strong culture of consultation with relevant persons and stakeholders when it comes to the creation of regulatory codes. Accordingly, I do not propose to accept amendment No. 180.
I move amendment No. 180:
In page 82, between lines 39 and 40, to insert the following:
“(4A) In preparing such codes referred to in subsection (1), the Commission shall consult with appropriate persons with relevant expertise in human rights and digital public participation.”
Amendments Nos. 181 and 182 are out of order as they would impose a potential charge on Revenue.
I move amendment No. 183:
In page 83, between lines 6 and 7, to insert the following:
“Online safety codes: public health of children
139L. (1) Without prejudice to section 139K, in the case of a designated online service which incorporates advertisements or commercial communications, the Commission shall ensure that an online safety code in respect of that service shall provide that risks of harm arising from the advertising of products to children on designated online services are minimised.
(2) Online safety codes as described under subsection (1) shall provide for restrictions, limitations, or outright banning of advertising to children of—
(a) high salt and fat foods,
(b) alcohol, and
(c) gambling.
(3) Online safety codes shall provide for restrictions on the use of personal data of a child for profiling, micro-targeting, or direct marketing, in accordance with the Data Protection Act 2018.”.
Tá
- Black, Frances.
- Boylan, Lynn.
- Gavan, Paul.
- Higgins, Alice-Mary.
- Moynihan, Rebecca.
- Ó Donnghaile, Niall.
- Ruane, Lynn.
- Wall, Mark.
- Warfield, Fintan.
Níl
- Ahearn, Garret.
- Ardagh, Catherine.
- Blaney, Niall.
- Buttimer, Jerry.
- Byrne, Malcolm.
- Byrne, Maria.
- Carrigy, Micheál.
- Clifford-Lee, Lorraine.
- Conway, Martin.
- Currie, Emer.
- Dooley, Timmy.
- Fitzpatrick, Mary.
- Horkan, Gerry.
- Kyne, Seán.
- Martin, Vincent P.
- McGreehan, Erin.
- O'Loughlin, Fiona.
- Seery Kearney, Mary.
- Ward, Barry.
- Wilson, Diarmuid.
Amendments Nos. 183a, 187a and 192a are related and may be discussed together by agreement. Is that agreed? Agreed.
I move amendment No. 183a:
In page 83, between lines 6 and 7, to insert the following:
"Online safety codes: disinformation
139L.(1) Without prejudice to section 139K, the Commission shall ensure that an online safety code in respect of a designated online service shall provide that risks of harm arising from the spread of disinformation are minimised.
(2) Online safety codes as described under subsection (1) may provide for restrictions on, limitations on, or outright banning of disinformation.
(3) In this section, 'disinformation' means verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, which may cause public harm, in particular harm to democratic political and policy-making processes.".
These supplementary amendments were submitted on the basis of an issue that has been highlighted over the course of the debate. They all relate to disinformation. Their goal is to recognise that disinformation is not simply problematic because of its impact on the general public but that it can cause harm. Disinformation can be a cause of harm to the individual, for example disinformation on health. We have seen the very serious consequences of disinformation during the Covid pandemic. Disinformation has a wider implication. It can be used to create dangers for individuals based on their groups. Disinformation can be created about categories of persons, for example with regard to nationality. It can be used to encourage people to make poor decisions. It can have political outcomes. It can also have directly personal outcomes.
The Minister mentioned that the digital services directive will look at the issue of disinformation. There is a weird aspect with the Bill because in some ways it reaches into the area of the digital services directive, for example by the inclusion in its provisions of interpersonal communication or cloud storage, which I believe may be inappropriate or premature. In other ways it does not address issues even though there is a significant overlap with the directive. Disinformation is being addressed, considered and recognised as a significant problem at EU level. It is also a source of online harm to individuals. I suggest that perhaps it should be included as something that will be addressed in the online safety codes. The other amendments are consequential. They reiterate this point on disinformation.
Amendment No. 183a provides that an coimisiún should have an online safety code requiring that designated online services seek to minimise the risk of harm arising from the spread of disinformation. I do not propose to accept the amendment for a number of reasons.
Regarding disinformation, this kind of content is being tackled, as has been acknowledged by the Senator, on an EU-wide basis through a number of mechanisms. These include the Digital Services Act, DSA, on which political agreement was reached on 22 April 2022.
The Government decided that coimisiún na meán will be the digital services co-ordinator, which is the primary regulator under the DSA. My colleague, the Tánaiste and Minister for Enterprise, Trade and Employment, and his Department have led Ireland’s negotiations in this area. There is also the European Commission's code of practice on disinformation. This is a Commission initiative that has involved a range of online platforms, leading social networks, advertisers and advertising industry players signing up to self-regulatory standards to fight disinformation. It is the Commission's intention that the code will evolve into a co-regulatory instrument under the DSA. Additionally, the Commission has also established the European Digital Media Observatory, which has a hub in Dublin City University, DCU, and this body has been tasked with monitoring the implementation of the code.
While I note the intent of the amendment, this matter will be addressed by the Minister for Enterprise, Trade and Employment and his Department in the context of the legislation necessary to give effect to the implementation of the DSA in Ireland. I do not believe it would be useful to cut across this work at this stage.
Senators will be aware that in March the Government decided that once established under the Bill, coimisiún na meán would act as the primary regulator, termed the digital services co-ordinator, under the DSA. It made the decision in light of the clear synergies between the objectives and approaches of coimisiún na meán and the digital services co-ordinator, including taking a systemic approach to dealing with online safety and platform regulation and similar resourcing needs and expertise for implementation and enforcement. The DSA is still undergoing negotiation regarding several technical matters. The final text of the regulation is not available. Should the code of practice on disinformation evolve into a co-regulatory instrument under the DSA, as proposed by the European Commission, I would expect that coimisiún na meán would have a role to play in its capacity as the digital services co-ordinator.
As I indicated, there is a specific problem with disinformation designed to influence the results of elections and referendums as these processes are at the heart of our democracy. The Minister for Housing, Local Government and Heritage, Deputy Darragh O'Brien, has asked the Attorney General to prepare proposals for inclusion in the Electoral Reform Bill 2022.
On amendment No. 187a, similar to amendment No. 183a, I do not propose to accept the amendment because, as previously set out, disinformation is being tackled on an EU-wide level through the mechanisms outlined. I do not want to unintentionally work at cross-purposes with those mechanisms or with the work of the Tánaiste and the Department of Enterprise, Trade and Employment. The same reasons apply to amendment No. 192a in the context of my responses to amendments Nos. 183a and 187a. I am not proposing to accept these amendments.
While I understand that the digital services directive will be coming through, the Tánaiste and officials from the Department of Enterprise, Trade and Employment are looking at this from one perspective, namely, regarding services and digital services. The Minister, however, has the capacity to look at it from another angle, in much the same way as she mentioned that her colleague would be bringing forward measures concerning disinformation in the Electoral Reform Bill 2022. That will, of course, not undermine the Tánaiste's work in bringing forward measures in this context as well as part of the digital services legislation, when that arrives, because there is specific recognition of issues of potential harm in electoral processes.
Similarly, amendment No. 187a seeks to bring nuance to this provision. I accept that amendment No. 183a is prescriptive, whereas amendment No. 187a is quite minimalist. It is not asking the Minister to deal with all aspects of disinformation, but only disinformation insofar as it causes online harm. The electoral issue is one issue but there is also the matter of harm caused to individuals, specifically the serious harm to health caused during the pandemic by misinformation and disinformation.
While I appreciate it is the Minister’s expectation that this kind of provision will come about as part of the future regulatory framework, I ask if it might be possible to send a signal in this regard now. Regarding other issues that are going to come under the digital services directive, specifically cloud storage and interpersonal communications, rather than include them in the Bill, it would be better to send a signal that they may be included, subject to certain things. I am suggesting there may be capacity to put in a building block in this regard. If that is not possible, I ask the Minister to clarify that there is nothing in the Bill that precludes the commission, even now, from identifying and addressing an issue such as disinformation, if it so wished, within an online safety code. I suggest this because the advent of the digital services directive might be five years away, as the Minister said, and disinformation can do much harm in the meantime in areas extending from Covid-19 to climate to politics.
To clarify, the purpose of the Bill is not to implement the DSA regulation in Irish law but, among other things, to create a regulatory framework for online safety and to establish a regulator to enforce that framework. The Government has decided, as I said, that coimisiún na meán will be the digital services co-ordinator under the DSA and that further legislation will be brought forward to provide for that and build upon the regulatory framework for online safety in that respect. My officials are part of a working group with officials from the Department of Enterprise, Trade and Employment, which meets weekly on the DSA. This is happening.
Does Senator Higgins wish to press the amendment?
No. I am happy to hear that the Minister is involved in that process because very different expertise is available in the two Departments.
Is the Senator withdrawing the amendment?
In that context, I will withdraw the amendment.
I move amendment No. 184:
In page 84, line 13, to delete “relates to content that falls within” and substitute “is necessary and proportionate for an investigation or prosecution in relation to”.
This amendment seeks to embed a GDPR-like provision around necessity and proportionality in this legislation to ensure the provision is not framed more widely than necessary. The legislation now refers to an online safety code that would only apply to interpersonal communications "in so far as it relates to content that falls within one of the offence-specific categories...". Again, what we want to avoid here are fishing exercises and hashing, for example. These aspects have been extensively debated in respect of the digital services directive. The Minister mentioned that measures are coming forward in the context of that directive, but she can use her existing powers where it is necessary and proportionate to address issues of online targeting.
I refer to a situation where there is a need to investigate, for example. This is the kind of debate that arises in the context of GDPR all the time. If there is a need to investigate and prosecute, then it is permissible to access appropriate material but it is not permissible to engage in sweeping exercises to find content that may potentially end up being relevant to an offence. There must be a reason and a necessity which meets a test of proportionality. It is not permissible, for example, to examine all, or even large swathes, of content in someone’s interpersonal communications, such as all someone's WhatsApp messages, just because a particular type of content is being sought. It is necessary to have reasonable grounds to do so and it must be necessary and proportionate for an investigation or prosecution.
As it stands, this legislation is framed so that it relates "to content that falls within one of the offence-specific categories". Therefore, it would be possible to have a situation whereby interpersonal communications are subject to large-scale surveillance because of a rationale relating to an offence. For example, hashing involves putting certain words in a search and then scanning a huge quantity of data to see if those specific words appear anywhere. Then there is follow-up, which creates a dynamic that falls outside proportionality under the GDPR. In the digital services directive, it has been recognised that such practices are not appropriate.
I am worried about the way it is framed in this Bill, as it is a very wide sweep. One could say we are looking for keywords that relate to something that might be an offence, and that would be covered by “relates to content that falls within”, because one is effectively going to look at content in the sweep rather than investigating specific content. I have other related amendments, and this is one of those areas where sweeping powers, as opposed to warrants and the like, are inappropriate. Am I correct that amendment No. 185 is not grouped with this amendment, so amendment No. 184 stands alone?
Amendment No. 184 is being taken on its own.
I will come to this issue in amendment No. 185.
We will come to amendment No. 185 next.
It is the same suite of issues.
The same principle, yes.
Amendment No. 185 concerns "encryption backdoors" or "scanning" and so on. I suggest that we put in the appropriate language of necessity and proportionality to ensure that we do not inadvertently create a context whereby this clause on safety codes could be part of inappropriately wide surveillance.
To clarify, an coimisiún will not have a role in the investigation or prosecution of individuals for offences specified in the criminal code. That is the role of An Garda Síochána. The role of an coimisiún is to implement a regulatory framework to minimise the availability of online content by which those offences listed in Schedule 3 to the 2009 Act, as inserted by the Bill, can be committed. The effect of this amendment would be to require that an coimisiún determines whether aspects of the regulatory framework are necessary and proportionate for an investigation or prosecution, which is a matter outside of its competency. Accordingly, I cannot accept the amendment.
I disagree. I am not saying that an coimisiún needs to do the investigations or prosecutions; I am saying that, in providing codes, it should not allow regulation that permits large-scale scanning. That is why the necessity and proportionality wording is appropriate: the coimisiún puts in place the online safety code and, later, very considerable powers are given to authorised officers to implement the code. That is why it is important, from the top down, that the online safety code considers the issues of necessity and proportionality.
I wish to press the amendment.
I move amendment No. 185:
In page 84, between lines 14 and 15, to insert the following: “(5A) An online safety code as defined in subsection (5) in respect of an interpersonal communications service or a private online storage service shall prohibit the use of encryption backdoors or client-side scanning.”.
The amendment concerns a related issue regarding an online safety code. These measures will likely be part of the digital services directive when it comes through. The amendment states: "An online safety code as defined in subsection (5) in respect of an interpersonal communications service or a private online storage service shall prohibit the use of encryption backdoors or client-side scanning", which are practices that have been identified as very problematic in respect of the digital services directive. These have been extensively debated at European level. They are flagged in the balance that has always been sought between having appropriate accountability and measures for safety, and ensuring appropriate data protection and care for private life.
These measures would ensure that the online safety code, in respect of interpersonal communications, prohibits encryption backdoors, which allow, for example, the State to access anything. Let us bear in mind that this is one of the really key provisions. The use of and demand for such backdoors in the United States is one of the reasons, for example, privacy shields and other measures for the exchange of data between the EU and the US ran aground. That was because those measures were not properly clarified and there were no proper safeguards in place.
With regard to "client-side scanning", the idea is that there would be wholesale scanning of the interpersonal communications of persons who are using phone exchange services. I do not suggest there are no situations that need to be appropriately investigated. There are measures, such as warrants, that can be used in those cases rather than the idea of a backdoor or routine scanning process.
These matters are quite serious and are a problem. Another way to address them would be for the interpersonal communications and private online storage services, which do not fall within the remit of the AVMS directive, to be bracketed aside and then added through the legislation the Minister has mentioned is going to be introduced regarding the digital services directive. These are digital services, effectively, rather than online broadcast or online spaces in the classic sense of being public spaces or public services being accessed. It would be more appropriate that they be addressed under digital services directive legislation. If they are not going to be addressed in that way, we need to make sure that the key safety concerns, which were identified during the digital services debate, are reflected in this Bill as well.
I thank the Senator for the amendment.
The purpose of the Bill is not to prohibit the use of technologies by providers of designated online services and, accordingly, I cannot accept the amendment. The technologies referenced by this amendment have legitimate uses. In particular, the reference to “encryption backdoors” would appear to prohibit the use of any encryption which is not end-to-end encryption by a provider of a private communications or online storage service. This would be highly disproportionate. It would have significant unintended consequences, especially for providers of business-to-business technologies such as various videoconferencing solutions which, for very good reasons, may be encrypted but not end-to-end encrypted. I will, therefore, not accept the amendment.
I accept the point made by the Minister. There are situations where it may be allowed, but there are also situations the amendment tries to address, such as State access through encryption backdoors. The Minister gave the example of videoconferencing, and that is an example of how we are straying into the area of the digital services directive. Is videoconferencing covered by this legislation? Is it part of the remit? Is everything in a videoconferencing environment covered? A whole set of questions arise. I accept that the same regulator will end up acting in these areas. That is fine, but the same legislation does not necessarily do the same work in both areas.
I am not going to press the amendment because the Minister has made the valid point that there are circumstances where a provider may itself build in a backdoor, and should be able to do so. That is not what I had intended to capture. We could make all of our lives much easier by simply providing that there may, in future, be measures in respect of "interpersonal communications service or a private online storage service" but that, for now, they are not covered by all of these provisions in the Bill in the same way. That would be a cleaner way to address this matter rather than having to fit a mini-discussion on the digital services directive within a very lengthy and tiring Bill on the AVMS directive. Something that pulls that out might be the best way for us to address it.
I wish to clarify for the Senator that videoconferencing is included in the Bill but it will not be covered by the Digital Services Act.
I wish to withdraw the amendment but hope to table an amendment to address the issue.
I move amendment No. 186:
In page 84, between lines 14 and 15, to insert the following:
“(5A) Services defined under subsection (5) shall not be deemed to be designated online services until such date as set by the Minister following the transposition into Irish law of the European Digital Services Act.”.
The amendment deals with the same issue. It is probably one of the cleanest ways to address this issue.
It provides that services defined under subsection (5), such as interpersonal communications or private online storage services, "shall not be deemed to be designated online services until such date as set by the Minister following the transposition into Irish law of the European Digital Services Act". This would be a clean way of doing it. It would not require the Minister to return to this legislation but, rather, would allow that we would not have the premature application of the provisions of the Bill to areas that will be shaped by the digital services directive. The amendment provides for the Minister to take a simple action in respect of subsection (5) following the transposition of the Digital Services Act. It is effectively a commencement clause. It basically proposes that the Minister will not commence this clause until the relevant EU laws are in place. It is probably one of the cleanest and easiest ways to address that slight overlap of concern that has been identified by me and the Minister in the context of the digital services directive. The amendment refers to the European Digital Services Act but it should probably refer to the digital services directive. That is a technicality. Is the amendment something that could work?
I note the points the Senator has made in respect of the Digital Services Act. As she will be aware, the Digital Services Act, which is a proposed regulation rather than a directive of the EU, is currently under negotiation between the EU institutions. The Department of Enterprise, Trade and Employment is leading on this. The Digital Services Act, as a horizontal legislative instrument, will address a wide range of issues, from illegal and harmful online content to consumer rights issues such as availability of illegal goods on online platforms. It will introduce a new systemic regulatory system for very large online platforms that will be overseen by the European Commission. I understand that provisional political agreement between the European Commission, the European Parliament and the Council of the EU, representing the member states, was recently reached, subject to further technical discussions and amendments. As such, the final text of the Digital Services Act has not been formally agreed and is not yet available.
A key feature of the Digital Services Act will be the appointment of a digital service co-ordinator. As I stated, in March the Government decided that coimisiún na meán will fulfil that role. A programme of work is now under way, led by the Department of Enterprise, Trade and Employment, to identify the appropriate resources necessary for an coimisiún and the legislative measures necessary to implement the Digital Services Act into Irish law. Although the Digital Services Act, as an EU regulation, will have a direct effect on Irish law, it has become increasingly necessary to provide legal scaffolding in domestic law to give effect to provisions of regulations, particularly regulatory provisions. For example, there is a need to provide for the designation of competent authorities or for penalties and enforcement mechanisms for infringements of a regulation.
Although the Digital Services Act is forthcoming, the requirement to transpose the revised audiovisual media services directive, which we are doing through the Bill, remains. In light of the infringement proceedings under way against Ireland for a failure to transpose the directive, it is important that we transpose the directive as soon as possible to avoid fines. Moreover, given that an coimisiún will also fulfil the function of the digital services co-ordinator under the Digital Services Act, it is important that we put that on a formal legislative footing as soon as possible. In establishing an coimisiún and setting out a regulatory framework for online safety and a robust and fair enforcement and investigation mechanism, the Bill is establishing the scaffolding to which I referred, as well as the structures to support future and forthcoming legislation, including the Digital Services Act. In this regard, the Bill will assist in the implementation of the regulation and it will not, therefore, hinder it. I do not accept the amendment, but I have noted the Senator's points and will reflect on them.
I thank the Minister for indicating that she will reflect. This amendment might allow us to proceed in terms of the delivery of the audiovisual media services directive, which everybody wants to do, and would give the scaffolding required by the Minister. It would, perhaps, allow us to avoid an unfortunate and unintended situation whereby we may have an online safety code that ends up being out of step with a provision of the digital services directive. The amendment would allow us to avoid that. There will be plenty for the new commission to do, but the amendment would allow that tackling these areas, among the many to be tackled, would not be done until a suitable time when guidance is there in terms of the digital services directive. This would ensure that, for example, even though the commission will regulate a much wider set of services than the digital services directive, they would be aligned in terms of the approach taken. I thank the Minister for indicating that she will look to that issue. I will withdraw the amendment in that context.