General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

This meeting has been convened in the context of the committee's pre-legislative scrutiny of the online safety and media regulation Bill. In our first session, we will meet representatives from the Irish Human Rights and Equality Commission, IHREC. The witnesses will be joining us remotely. Unfortunately, that is how matters must be at the moment, but it is still good. I welcome Ms Sinéad Gibney, the chief commissioner of IHREC, and her commission colleague, Dr. Lucy Michael.

The format of the meeting is such that I will invite witnesses to make opening statements, which will be followed by questions from members of the committee. As the witnesses are probably aware, the committee may publish the opening statements on its website following the meeting. Before I invite them to deliver their opening statements, I remind them they have three minutes to do so.

I will set out the position on parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of that person or entity. Therefore, if the statements of witnesses are potentially defamatory of any identifiable person or entity, they will be directed to discontinue their remarks. It is imperative they comply with such directions. I have no doubt this situation will not occur. As our witnesses are attending remotely from outside the Leinster House campus, they should note there are limitations to parliamentary privilege and, as such, they may not benefit from the same level of immunity from legal proceedings as a witness who is physically present.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable. I remind members of the constitutional requirement that they must be physically present within the confines of Leinster House or the convention centre to participate in our public meeting. I will not permit members to attend unless they are adhering to this constitutional requirement. I ask that members identify themselves when contributing for the benefit of the debates office staff preparing the Official Report and to mute their microphones when not contributing to reduce background noise and feedback. I ask that members use the raise hand button when they want to contribute. I also ask everyone joining today's meeting to ensure all mobile phones are on silent mode or switched off.

I invite Ms Gibney to make her opening statement on behalf of IHREC.

Ms Sinéad Gibney

On behalf of the Irish Human Rights and Equality Commission, I thank the committee. The Irish Human Rights and Equality Commission is Ireland's independent national human rights institution and equality body. In this role, we hold a specific mandate to keep under review the adequacy and effectiveness of law and practice relating to the protection of human rights and equality, and to examine any legislative proposal and report our views on any implication it has for human rights or equality.

Our written contribution, already provided to members, focuses on a number of specific issues, primarily the role and functions of the media commission, the definition of "harmful online content" and "age appropriate content", and the accessibility of services for people with disabilities. Most fundamentally, we are clear that any proposed legislation must satisfy the requirements of legality, necessity and proportionality.

There can be no doubt this is significant legislation from a human rights and equality perspective and that it can be strengthened through stronger and more consistent reference to human rights and equality standards. Protecting people from online harmful content and conduct is a requirement under international human rights law, but it is necessary that such measures be balanced against competing fundamental rights, including the rights to freedom of expression, privacy and freedom of assembly.

We have covered the issue of disability access, informed by the work of our disability advisory committee and our role as the independent monitor of Ireland's compliance with the UN Convention on the Rights of Persons with Disabilities, CRPD. However, we strongly recommend that, on issues of disability access and in line with the CRPD principles, the committee invite disabled people's organisations to speak to it about their direct experience and understanding.

Regarding the establishment of a media commission, it is important this legislation includes a specific statutory requirement for the commission to have due regard in the performance of its functions to the need to eliminate discrimination, promote equality of opportunity and protect human rights. This requirement is one incumbent upon all public bodies as part of the public sector equality and human rights duty. Given the breadth of responsibility of the new commission, though, an explicit statement of this obligation is an important addition.

Legal definitions of concepts that are central to the effectiveness of this law in practice need to be articulated more explicitly. For example, the definition of "harmful online content" needs to be clear and sufficiently precise. Terms relating to hate speech, such as racism, sexism, and ableism, should also be clearly defined under the proposed legislation. I will invite Dr. Michael to contribute during this part of our discussion.

We would further question why the definition of online harm does not include material that violates other legal regimes, for example, defamation law, data protection, privacy law, consumer protection law or copyright law. The fact that a statement is defamatory or in breach of data protection or copyright law does not mean it cannot also be a form of harmful online content. This clarity is in the interest of delineating freedom of speech as well as providing adequate protection for affected groups.

For these and other reasons, we are clear the Bill is sorely needed, but it must be designed and defined in ways that make it truly effective, and it cannot be a stand-alone measure. Legislatively, it must fit hand in glove with the draft hate crime legislation that is also being prepared by the Houses and combine effectively with the eventual publication of a national action plan against racism.

Beyond legislation, to ensure an overarching, effective and human rights-compliant framework for online safety, media regulation and tackling hate speech and hate crime, there must be a broader societal conversation beyond criminalisation and prohibition, one that includes education, counter-speech and the promotion of pluralism.

Dr. Michael and I are happy to take members' questions.

I thank Ms Gibney for her detailed presentation. Two members have indicated, Deputy Munster and Senator Byrne. I remind members we are not using a rota for speaking slots this session and they must instead indicate. Members should be mindful this is a shorter session. I ask them to be as concise as possible in their questions.

I welcome our guests. Will they expand on why they believe material that violates other laws, for example, laws on defamation, data protection and consumer protection, should be within the scope of this Bill? If the Bill was to be amended to bring them within the media commission's scope, would we be opening ourselves to too much crossover between regulators?

Ms Sinéad Gibney

No. It would be the exact opposite in that it would provide clarity. Greater legal clarity and more precision throughout the Bill will give the new media commission the tools it needs to be effective.

This will also protect users and specified groups once the law is in operation. On the specific part to which the Deputy refers, if it is not done as we have recommended, there is a danger that those potential acts will not be identified as hate crimes because they are dealt with under those other categories instead. That is definitely problematic.

I have one other question. Ms Gibney referred to the Bill being amended to ensure that people with disabilities have meaningful access and can engage. Will she expand on what exactly IHREC would like to see explicitly included?

Ms Sinéad Gibney

In our recommendations we note that denial of access for people with disabilities is in itself discrimination. The CRPD is going to be helpful for this committee in the framing of this legislation. The CRPD principles should be front and centre. There should be inclusion by design throughout.

The other areas of the legislation that reference this do so particularly in respect of children with disabilities and the specific group they comprise. Part of this is a procedural issue for us and it comes through in our recommendations. It is the question of whether there is participation of people with disabilities throughout.

We encourage this committee to invite organisations representing disabled people to come and speak to it directly about issues of accessibility. That ongoing participation should be baked into the commission. Ultimately, we would advise that the commission should be a diverse representation of Irish society. As with our organisation, the commission's membership could include people with lived experience of disability within its ranks. Engagement and participation should be an ongoing function of the commission. If the commission were to transpose or incorporate a public sector duty type of wording within its legislation, it would guarantee that because it would give the commission, on a statutory footing, a duty to eliminate discrimination throughout. Dr. Michael may wish to add to that.

Dr. Lucy Michael

This relates to the strong thrust of our submission, which is around recognising the public sector duty and creating an explicit obligation to be in line with Article 21 of the EU Charter of Fundamental Rights on diversity and anti-discrimination. The media commission should be made reflective of that in all of its policies, practices and processes, as well as its representation. It is key that the body is created like that from the start and does not have to catch up later on. It is very much reflected in all of the submission that equality must be at the heart of this legislation in every respect.

I thank the witnesses. I agree with that recommendation. Perhaps we could arrange, even if it means arranging an additional meeting for that purpose, to have representatives from disabled people’s organisations attend the committee.

I fully agree and do not think that any member would disagree. I thank the Deputy for her line of questioning and the witnesses for responding.

I thank the Chair and the witnesses. I am conscious that Ms Gibney was before the Seanad Special Committee on the Withdrawal of the UK from the EU on Monday. It has been a busy week for her before the Oireachtas committees. I agree with Deputy Munster’s point about some of the disability organisations.

I will ask about two specific issues. The first, which has been raised with us by a number of organisations, is the question of whether to have an individual complaints mechanism. Questions arise with regard to human rights and equality, particularly in respect of broadcasters and tech giants. Some of the actions of these organisations are allegedly in breach of some of the protections in the proposed Bill. How would IHREC see that operating? What mechanism would it like to see in place with the new media commission?

My second question relates to algorithmic decision-making. I take a particularly strong position on this issue. We need to have a code that will ensure that any organisation, be it a State organisation or private business, that uses algorithms to make decisions will be audited and that there are guards against bias. It is particularly crucial against the nine equality headings that we have that in place. This is going to be a big issue. I would like to hear the witnesses’ views on how they would see that operating. It is certainly our intention that there will be a code. I want a guarantee that any organisation that uses an algorithm to make a decision will be audited. I would be grateful to hear the witnesses’ views.

Ms Sinéad Gibney

I thank the Senator. It is good to see him twice in the same week. On the question of individual complaints, we did not go into detail on this issue in our submission. However, I would be more than happy to engage in further scrutiny of this. I will give my response to the question today.

We want to see more detail on the super-complaints mechanism proposal. That would be welcome. I listened back to some of the contributions to the committee from both rights groups and tech companies. There has to be better transparency around the handling of individual complaints. There has to be a move away from placing the obligation on the individual, rather than on the tech or online platforms, to identify harmful content. This is an example of where there has to be an inclusive approach. The tech companies have expertise to help address this issue. A body or commission like this has the potential to bring together the different actors within this State and provide leadership across Europe in finding a resolution to this issue. This is an example of the kind of issue where we can bring those groups together to deal with it. This is not something we can do without co-operation among the different actors involved. The tech companies will play an important role in resolving this and their expertise will be needed to do so.

I know there are issues around speed of take-down and the ownership piece. First and foremost, there has to be transparency within the Bill. To go back to what we are putting forward most strongly around legality, necessity and proportionality, if those concepts inform the design of this law and the commission, the committee will, I believe, come up with appropriate solutions.

I would be more than happy for our team to go into more detail on that element of the Bill. The legislation is so wide-ranging that we were only able to go into certain areas of it in our initial submission.

On algorithmic decision-making, the Senator will see that we have encouraged the incorporation in the legislation not only of content but also of conduct. That would deal with the particular issues to which he alluded. We essentially saw the first example of that in the predicted grades last year. That will definitely be a huge issue as more artificial intelligence, AI, is used across the public and private sectors. It is critical that this is incorporated. Our suggestions around conduct would deal with and address that.

It would be useful for this committee to receive a further submission. Given that we will be debating the super-complaints mechanism, a further submission on the matter would be helpful.

I thank the Chair and witnesses. Does IHREC believe there is scope for discretion on the part of an online safety commissioner without a clearly defined role? Could any such discretion be compromised with regard to the protection of human rights? Ms Gibney referred to the importance of clear definitions. Are there challenges in defining hate speech and discrimination, as these definitions can change as our understanding grows? How can we best manage the existing codes so that they do not become outdated? Private communication services and cloud storage services are also covered in the Bill. Is this problematic as regards human rights?

Ms Sinéad Gibney

I thank the Deputy. I will answer the first couple of questions, after which Dr. Michael will cover hate speech. When the Deputy talks about discretion, is he talking about the independence of the commissioner?

The language used in the Bill is that the commissioners "may" do this or that. There is no redress system or something like that.

Ms Sinéad Gibney

We go into detail on this in our submission. The most important feature of how the commission is set up is that it is independent and it is able to execute its functions within that independence. As the head of a State agency that enjoys that independence, and IHREC is arguably one of the most independent agencies within the State, I can tell the committee that the future media commissioner will thank it for considering this in the drafting of this legislation. The specifics of our recommendation on this are around the budget itself, which should be a separate Vote account, and removing the capacity for the Minister or the Department to dismiss individual commissioners, because that is problematic. What is most important is that, as a regulatory body, the commission has the discretion and independence to operate as it needs to. To elaborate on that, the need for that is down to the fact we are engaging rights which are core, fundamental rights, namely, the rights to freedom of expression, freedom of assembly and privacy. While interference with those rights is required in this instance, and I think everyone can agree with that, that interference should be allowed to happen with full independence and without any undue influence from Government. That is why we have chosen to emphasise this piece within it. We, as a body enjoying similar regulatory functions, have a lot of sympathy for that.

Maybe I will hand over to Dr. Michael to discuss hate speech. Perhaps afterward I can comment on the private communications issue, although I am going to be limited in what I can say on that.

Dr. Lucy Michael

I thank Ms Gibney. I will try to be brief. Hate speech is a term that causes much confusion among policymakers and legislators as well as the public. Certainly in the case of this legislation, I would take great care to ensure it is well defined. This legislation must be carefully aligned with the hate crime and hate speech Bill that is coming forward. We know many people need a little more education and awareness on what hate speech is in this context. Only the most extreme instances of hate speech are criminalised, and the European Court of Human Rights and the charter both recognise that there are limitations on free speech and that those should be for particular purposes. If a state does not criminalise certain hate speech, then it must regulate it in some way, and that is where this Bill comes in. Implementing hate speech legislation without gaps, in such a way that it protects both those who are targeted by hate speech and freedom of expression for them and everybody else, means we need that careful definition and alignment. Therefore, for the media commission and those involved in creating further codes, toolkits and guides going forward, a careful definition now, along with a mechanism to ensure it is kept up to date and in alignment with other legislation, is crucial.

Ms Sinéad Gibney

We have not gone into detail on this, so perhaps this is another one I can add to in any further scrutiny we might provide in future around private communications. Ultimately, this is one of the areas the independent regulator itself will have to examine in more detail in the context of specific platforms and situations, so it will not be able to address it fully until the agency is created. We will be more than happy to do some further digging on it for Deputy Mythen.

On the communications services, there seem to be many organisations either for it or against it and this might be a thing that falls in between. There are many organisations, like the Children's Rights Alliance, which favour it. Should it be an age thing or what do the witnesses think?

Ms Sinéad Gibney

I am sorry. Will the Deputy clarify his question?

On the communications services, there seem to be a lot of organisations in favour of including it and some in favour of excluding it for different reasons. Should it be just age-related, for example for children under a certain age, or should all be included?

Ms Sinéad Gibney

Is the Deputy referring to the communications services?

Yes, the private communications services, private messages and so on.

Ms Sinéad Gibney

I see. I will have to follow up with the Deputy to give an answer and get the team to do a little digging into it, if that is okay.

I thank Ms Gibney and her team for both the submission from the Irish Human Rights and Equality Commission and the statement today. I have a simple question which follows on from previous ones. We were talking about the independence of the commissioner and the need for him or her to operate without undue influence from Government. I am wondering about the proposed funding model for the commission through a levy on regulated services. Does that present issues for the commission's independence of operation? Does industry funding for the online safety commissioner present issues where undue influence is concerned?

Ms Sinéad Gibney

Is the Senator talking about the fines mechanism?

No. The Minister is proposing that the funding model for the commission be through a levy on the regulated services so the commission would be funded by the services themselves.

Ms Sinéad Gibney

Again, I will have to ask the team to do a little more digging on this one but, instinctively, I imagine it could be shaped and framed in such a way that it would not interfere with the commission's independence, but that would depend on it being done effectively within the legislation. I am sorry I cannot answer the Senator in more detail.

That is fine. I thank Ms Gibney.

I thank our guests. I have a question for them myself. The theme of harmful content and hate speech has recurred on numerous occasions with many of our presenters to date. With regard to the objectives, functions and membership of the new media commission, are the witnesses satisfied that sufficient oversight, checks and balances exist within the general scheme, as outlined, to ensure the balancing of rights? I will go to Ms Gibney first and perhaps Dr. Michael would like to come in on that as well.

Ms Sinéad Gibney

I do not think the general scheme provides that at the moment, and that is why our recommendations on this point are quite strong. The specifics include the creation of a Vote account, which is essentially the same as what we enjoy. The proposed provision for the removal of commission members is also not appropriate and is not comparable to our own legislation, for example, or that of the Data Protection Commission. In terms of the set-up, those are the two key pieces we recommend, but the make-up of the commission itself should be considered. Speaking as the chief commissioner of a group of 15 people who do represent the diversity of society, we have a legislative mandate to have gender equality within the commission and that representation only strengthens our organisation. For an organisation such as the media commission, which impacts so strongly on the rights and equality of individuals, to have that same level of representation would be very strong. Added to that should be the need to eliminate discrimination and promote equality of opportunity and rights. Those are some things we feel would be important for that. As I said at the outset, that is within the 2014 IHREC Act, something this body would most likely be subject to anyway, but given the significant impact this would have on rights and equality, we believe being more explicit in the drafting of the legislation, giving it that legislative footing, will strengthen that positive obligation on this body.

Dr. Michael might want to elaborate further.

Dr. Lucy Michael

To elaborate on what Ms Gibney is saying, it is important to say much of this legislation, both this Bill and others coming through this year, is about restoring public trust in the face of online harm and disinformation. That public trust is restored when people see representative institutions and those with a positive obligation to address equality and diversity within them. The explicit adoption of the section 42 duty, which ensures every institution gives due regard to equality, as well as that clearer equality and human rights framework, is key as far as we can see to ensure trust in the new media commission and in this legislation.

I thank Dr. Michael. That concludes our session today. I thank Ms Gibney and Dr. Michael for being with us. Their statements have been most insightful and helpful. I also thank my colleagues for their questions.

Sitting suspended at 1 p.m. and resumed at 1.13 p.m.

We are now into our second session. Colleagues and our guests are very welcome. We are joined by representatives of the Irish Council for Civil Liberties and Digital Rights Ireland to discuss the general scheme of the online safety and media regulation Bill. I welcome the following witnesses who will be joining us virtually via Microsoft Teams: Mr. Liam Herrick, executive director of the Irish Council for Civil Liberties, ICCL; his colleague, Ms Olga Cronin, policy officer on surveillance and human rights; and Dr. T.J. McIntyre, chair of Digital Rights Ireland. The format of the meeting is that I will invite the witnesses to make opening statements, to be followed by questions from members. As the witnesses are probably aware, the committee may publish the opening statements on its website following the meeting. I will first call the representatives of the Irish Council for Civil Liberties to deliver their opening statement, to be followed by Digital Rights Ireland.

Before I invite the witnesses to deliver their opening statements, which are limited to three minutes per organisation, I advise them of the following in terms of parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of that person or entity. Therefore, if the statements of witnesses are potentially defamatory of any identifiable person or entity, they will be directed to discontinue their remarks. It is imperative they comply with such directions. As our witnesses are attending remotely from outside the Leinster House campus, they should note there are limitations to parliamentary privilege and, as such, they may not benefit from the same level of immunity from legal proceedings as a witness who is physically present. I ask those participating in the meeting to identify themselves when contributing for the benefit of the debates office staff preparing the Official Report and I ask them to mute their microphones when not contributing to reduce background noise and feedback. I remind members to keep their mobile phones switched off or on silent mode.

With that housekeeping out of the way, I am delighted to get to the crux of the meeting, which is the presentations by our guests. I invite Mr. Herrick to make his opening statement on behalf of ICCL.

Mr. Liam Herrick

The Irish Council for Civil Liberties, of which I am executive director, is very grateful to the committee for this opportunity to appear. The committee has received our opening statement. The ICCL is an independent NGO which works to promote and protect human rights in Ireland. We do not represent particular groups but, rather, work to ensure the Government fulfils its human rights obligations in all relevant law and policy.

In our submission to the committee, we analysed the general scheme of the online safety and media regulation Bill in light of Ireland’s human rights obligations. We specifically considered the Bill in respect of the right to freedom of expression, the right to send and receive information and the right to privacy and private communications. Having analysed the scheme of the Bill, the ICCL is concerned that aspects of what is proposed are overly vague and poorly defined to the point that it is wholly unclear who can expect to be regulated by the proposed media commission and when and how.

We have the following main concerns. There is a troubling vagueness in the definition of what might constitute harmful online content under head 49A of the Bill. The ICCL is concerned about what this vagueness may mean for legal accessibility, foreseeability and the safeguarding of the right to freedom of expression and communication, as well as the potential chilling effect it may have through self-censorship and in the internal workings of platforms and technology companies. The ICCL accepts the legitimate and important intention on the part of the Department in bringing this forward to reduce the hurt that children and adults may feel because of material online. However, there is a danger that, in passing this legislation in the manner proposed, the threshold for issuing notices for the removal of content, for example content that could be deemed likely to cause someone else to feel humiliated, would be so low that it could seriously damage individuals' constitutional rights to freedom of expression.

Similarly, there is troubling vagueness in terms of what online services may be deemed a designated online service by the commission. We particularly endorse the submission by our colleagues in Digital Rights Ireland to that effect. There is an assumption that by defining content, the intention of the Bill is to engage with commercial operations of large technology companies. However, this vague definition has the potential to capture a much wider range of online services, the operators of which will be subjected to as yet unwritten regulatory codes. What we do know is that these designated services will be chosen at will by the commission from a plethora of services that have one thing in common, namely, facilitating the dissemination of content. The scheme tells us that this will cover online services that might include social media services, public boards and forums, online gaming services, e-commerce services, private communications services, online cloud storage services, press publications and so on. It is a vast and potentially unlimited list.

The Bill fails to provide for the actual role of the online safety commissioner and, as a consequence, it fails to specify the specific functions of the commissioner in the wider media commission. We understand this is essentially a technical drafting issue.

There is a common purpose between the intention of the drafting and what all present wish to see, that is, a well-defined body, but it needs further work, perhaps on Committee Stage. We endorse the submissions of bodies such as the Law Society and the Irish Human Rights and Equality Commission with regard to ensuring gender diversity and wider diversity principles in the membership of the commission.

As it stands, we believe this Bill will provide for the regulation of conversations that are had online. It will see words written by members of the public subjected to codes via service operators, even though members of the public do not make up a licensed body. The enforcement of State codes of conduct against members of the public for non-illegal behaviour creates serious problems with respect to fundamental rights. It is unclear to the ICCL how the commission intends to regulate private conversations which might contain criminal content in a manner which balances the rights to freedom of expression and privacy in communications and respects principles of legality, necessity and proportionality.

The ICCL is supportive of the mechanisms that will provide supplementary routes of redress to victims of crime perpetrated online. However, we are concerned about the approach being taken and the potential human rights impact of a system designed to restrict speech that is, in and of itself, non-illegal. Our colleagues in Digital Rights Ireland have raised some interesting points about how one might seek to define what is illegal content in this manner. We are happy to answer any questions the committee has.

I thank Mr. Herrick for that comprehensive opening statement. I ask Dr. McIntyre to address the committee on behalf of Digital Rights Ireland.

Dr. T.J. McIntyre

I am an associate professor in University College Dublin, UCD, and chair of Digital Rights Ireland. In our submissions, we focused on several points that have been mentioned by the Irish Council for Civil Liberties. We described in some detail why we believe the provisions of the Bill on online safety, which are separate from the requirements of the audiovisual media services directive, are in breach of national and European standards of freedom of expression and privacy. I am happy to elaborate on this later if the committee wishes.

It might be helpful to outline the wider regulatory context. This Bill has two parts. One part is required under European law to transpose the audiovisual media services directive, AVMSD. Ireland is long overdue in doing so; it should have been done by September 2020. The part relating to online safety goes significantly beyond that. Much of Part 4 of the heads of the Bill would create a national regime for controlling content typed or spoken by individuals, not merely audiovisual content, on a wide range of platforms, conceivably including private platforms. It would do so in a way that is essentially unprecedented in Irish or European law.

I am concerned that this new scheme, by being hitched up to the audiovisual media services directive, might result in this legislation being passed in a way that would not give us time for proper scrutiny. The problem is that the heads of the Bill in Part 4 merge the two areas in a way that makes it difficult to disentangle them. The concerns we and, I believe, the Irish Human Rights and Equality Commission and the Irish Council for Civil Liberties have relate largely to these domestic provisions and innovations. It would be appropriate to pull out those provisions now, hold them back and consider how they can, as the human rights commission pointed out in the earlier session today, be better aligned with national law in areas such as hate speech and with the forthcoming Digital Services Act from the European Union. The Digital Services Act is likely to be adopted within the next 18 months to two years. It will have a significant impact on this area. It is not clear that the national scheme being proposed in this Bill would be compatible with the Digital Services Act. It is likely that if we rush to adopt a scheme like this, it will have to be substantially reconfigured, at significant cost to the taxpayer and the businesses affected, within a short period before Ireland is required to implement the Digital Services Act.

As a practical matter, I suggest the appropriate means of handling the concerns expressed by a number of bodies about these aspects of the Bill would be to hold over those aspects, go to further consultation on them and take into account the input of other Departments. To a large extent, these problems relate to the origin of the Bill in a Department which does not have any real experience in regulating the Internet. The Bill overlooks several significant issues of fundamental rights and Internet law which would probably have been considered had it gone out to a fuller, more general consultation.

I thank Dr. McIntyre. He has put another slant on our thinking in his presentation today. I move to my colleagues. They are limited to three minutes each because it is a shorter session. I call Deputy Munster.

Dr. McIntyre's submission went into a great deal of detail about the problems with definitions in head 49, particularly the first two relating to the dissemination of illegal material and the provision on the prohibition of cyberbullying. How would he change those definitions? How can they be improved, in his opinion?

Dr. T.J. McIntyre

I do not think I would attempt to change those definitions in this context. The problem is that we are trying to take a model which tries to do a few different things. We are trying to regulate these services, and the content produced by individuals on them, as though that content were produced by professional organisations in traditional broadcast media. The audiovisual media services directive is an outgrowth of traditional media regulation. It takes a regulatory scheme that was designed with traditional media in mind, transposes it to a class of bodies such as video-on-demand services, YouTube and so on, and imposes a much narrower set of obligations on the narrower range of entities that fall within its definitions. It is not intended to apply across the board to the regulation of individual speech. The problem is that, by bringing in online safety issues more generally, we are taking that narrow, rather intrusive model and expanding it unnecessarily.

I will give a concrete example. The AVMSD is primarily aimed at illegal content. It deals with harmful content which is accessible by children. What is proposed is an expansion of this kind of scheme to deal with content which is harmful generally. We would be moving from regulating what children see to regulating what adults see. That is a significant step.

I have a couple of quick questions for Mr. Herrick or Ms Cronin. In his submission, Mr. Herrick said a voluntary statutory system should be introduced that would allow social media and other online platforms to opt in to an online harms code with specific attention to children and that a regulator would oversee these standards. What happens if providers choose not to opt in?

Under head 56, Mr. Herrick raised serious concerns about regulating private communications. Has he any comments or suggestions on how criminal activity can be addressed or pursued without infringing upon the private communications of individuals? Should those communications, criminal or otherwise, be off limits to the State?

Mr. Liam Herrick

On the first question, it goes to a point raised in the earlier session by Senator Warfield on the interface between regulation within the industry and the platforms, and State intervention. It is an incredibly complex question. We all accept that self-regulation is not satisfactory or ultimately acceptable as a solution to the problem of online harm. On the other hand, the State does not have the capacity to regulate in an invasive and comprehensive way the huge volume of material that is produced and moderated on these companies' platforms.

One of the interesting models being brought forward by international human rights organisations is the idea of a social media council, which would be an interface between self-regulation and moderation within companies and, potentially, State bodies such as an online safety or media commission. Our colleagues in the freedom of expression organisation ARTICLE 19 are exploring the idea of piloting such a model in Ireland. There is innovative thinking on how we might link the moderation and standards within platforms with a State safety perspective.

The Deputy asked another question about the regulation of private communications that are criminal in content. As members might hear from An Garda Síochána and other criminal justice agencies of the State, they are already very much engaged in monitoring criminal activity online and the dissemination of illegal material. There are international co-operation agreements in place in that regard. There is much going on in that space already.

There was a question on defining cyberbullying put to Dr. McIntyre a while ago. It is not for us to come up with a definition, but when we look at the detail of what is proposed here, material that might have the effect of being humiliating to another person sets an incredibly low threshold for imposing a strict regulatory regime on content that might be deemed offensive or humiliating to an adult. As Dr. McIntyre states, this goes far beyond the idea of regulating illegal content, which was the purpose of the directive. That is not to say, of course, that we do not all want to take positive action to deal with bullying online. However, the idea that this is solely to be achieved through regulation of this type does not fully appreciate the nature of bullying and the wide range of social and psychological factors involved in dealing with that very complex problem.

I thank the witnesses. I will push back on some of the assertions made by Mr. Herrick and Dr. McIntyre. We have accepted that the era of self-regulation by the technology companies is over, and I get the point about criminal activity. Part of our responsibility as legislators, however, is to deal with the very real challenges around online bullying and harassment, and the real harm caused by these. We have heard from witnesses about such harm and concerns. We could certainly wait for the Digital Services Act, but it will take some time before it is adopted; Dr. McIntyre outlined the likely timeframe. Nevertheless, we have a responsibility as legislators to deal right now with the problems of online harm. I accept entirely that it is not going to be solved just with the creation of an office of an online safety commissioner. There is a requirement around education, and this is a reflection of broader matters in society.

We have a responsibility nonetheless to be able to address some of the behaviour online, and we know Twitter, Facebook and many other social media giants are not doing this. I am open to the idea of a social media council and perhaps the witnesses could expand on that. Facebook introduced an oversight board, but there is a question mark over it because Facebook appointed its members, and I do not view it as independent. I appreciate all the problems we are seeing, but how, in the immediate term, can we solve all those challenges?

I referred earlier to the human rights commission and I have a question on algorithmic decision-making. We want to see audits carried out in the area but do the witnesses have a view on it?

Mr. Liam Herrick

I will answer the second question, which relates to artificial intelligence, before handing over to my colleague, Ms Cronin, to deal with the Senator's question about what other steps might be taken to deal with online harm.

Artificial intelligence and algorithms constitute an area of growing interest to the ICCL across a wide range of human rights matters. Over the next couple of months, we will launch a programme of work monitoring how artificial intelligence algorithms engage with questions of human rights in Ireland. As the Senator is aware, the European Union is bringing forward a package of measures in this area too, and it is another example of where there are important developments at the Brussels level. This is not a question of standing back and deferring to Europe, but trying to advance measures without being cognisant of what is happening at European level could be counterproductive in the longer term. Dr. McIntyre has already dealt with that point. Ms Cronin will address other measures. We all accept there is a social problem that must be addressed, but this might not be the right solution. We have given some thought to the other areas in which the Government and the Oireachtas might invest their energy.

Ms Olga Cronin

Senator Byrne's question is valid. Nobody likes to see anybody being unpleasant online, and as we indicated in the opening statement, we understand the intention of this Bill to be to reduce the hurt that both children and adults feel because of some online material. We are concerned that in the well-intentioned effort to target the few, this Bill's extremely vague elements, coupled with the substantial powers to be given to the media commission, could have a collateral effect or unintended consequences that will substantially restrict the rights of all Internet users.

There is the definition of online harm under head 49A(b), which is, "material which is likely to have the effect of intimidating, threatening, humiliating or persecuting a person to which it pertains and which a reasonable person would conclude was the intention of its dissemination". That lacks precision, clarity and certainty and it is too vague, arbitrary and unspecific about what material it refers to. It is open to-----

Apologies for interrupting, but is the solution for us then to more clearly define that aspect as part of the proposed Bill? Are the mechanisms we are proposing fine, provided we have clearer definitions?

Ms Olga Cronin

We must be really careful about what we are doing. This type of legislation is being seen across the world and we have seen Bills like this fall in other jurisdictions, such as India and Canada. They are mentioned in our submission. That happened because this type of regulation is so difficult. The schoolchildren who came before the committee a few weeks ago were remarkable and their comments are valid. They spoke about the difference between a 13-year-old and a 17-year-old and how they can react to material online. There is a big difference between a 13-year-old, a 17-year-old, a 36-year-old and a 66-year-old, but none of that nuance, none of those different layers, is in the Bill. There does not appear to be an appreciation of the different capacities people have at different stages of their lives and at different positions or levels of engagement online. Ultimately, this Bill will restrict speech that is not illegal. That cuts across fundamental and core principles that are the bedrock of democracy.

I ask the witness to bring her contribution to a close because we have gone over time. I want to give Dr. McIntyre the opportunity to respond to Senator Byrne.

Dr. T.J. McIntyre

The Senator asked what we can do immediately. I looked back through my notes to prepare for today's hearing and I saw that the first time I spoke to an Oireachtas committee on this matter was in March 2013. At that stage, I said we urgently needed greater funding and training for police in the area of responding to social media abuse, and greater funding for the Data Protection Commission to support individuals with regard to so-called revenge porn, or intimate image, offences.

Nearly a decade on, that is still largely true. We can do this immediately in cases of serious abuse. Unfortunately, we still find the prosecution of somebody for these kinds of abuse a rarity worthy of comment. Unfortunately, we still find many individuals let down by the justice system. I am a practising solicitor and I have encountered in my professional capacity a number of individuals who have been the victims of online abuse. Very often their message is that they took their complaints to the Garda and were told the complaints could not be dealt with, that they were civil matters because the abusive material was hosted on a service in another jurisdiction, and there was nothing the Garda could do about the complaints. We can handle that side of things - immediate, severe harms - already if we have the funding to do so. We had functioning legislation under the old Non-Fatal Offences Against the Person Act. We now have stronger legislation in Coco's Law. On that side of things, the challenge is enforcement, not new legislation.

To respond to Senator Byrne's broader question, which was what Ireland should do in the meantime, we can certainly progress with online harm legislation, but trying to shove it into the model of the audiovisual media services directive will be unproductive, particularly given that we see a different model coming down the road in the form of the Digital Services Act. At this stage, to the extent we want to focus on particular categories of harm, bullying in particular, the committee could consider whether that is adequately dealt with in Coco's Law; if not, why not; and whether we should look at an expansion of that. However, the aim here, to have a very ambitious package dealing with all online harms, is doomed to failure in its current form, partly because of the constitutional difficulties we, the ICCL, and the Irish Human Rights and Equality Commission have identified but also because it tries to shove a square peg into a round hole.

I think this is the second day I have been at a committee meeting with the ICCL. It is doing the Lord's work coming in and out to meet with us. I have just two questions. Perhaps they are not straightforward but they will be pretty quick. Ms Cronin talked about the collateral effect and the Bill going beyond the scope required. Is the Bill in need of specific reforms or are we going about this in entirely the wrong way? That question is probably throwing the cat among the pigeons from a political perspective, but is it the case that we are simply not getting this right?

Second, is what is broadly proposed in the Bill a human rights approach to online regulation? Are we meeting the human rights requirements or are we not? If not, what do we need to do to resolve that?

Dr. T.J. McIntyre

I thank Senator Hoey. I agree with the point I think she is making, which is that we are not going about this the right way in the current format, partly for the reasons I outlined to Senator Byrne. If I may elaborate and give the committee just one concrete example of why this is, the Bill proposes to place obligations on the providers of private communications services by way of a media code to be adopted by the new commission. In doing so, however, it seems to be completely unaware of the provisions of European law, namely the ePrivacy directive and the new European electronic communications code, which would preclude the Bill from doing that. Therefore, what is proposed, as it stands, simply could not be done. I looked through the entire heads of the Bill and the regulatory impact analysis to see whether these pieces of legislation are mentioned, but I could not find them mentioned anywhere. It might be that there is another policy document I am not aware of that does so, but it seems to me the Bill very much reflects the broadcasting mindset of its originator. It reflects the priorities of a Department used to regulating a small number of large entities, with relatively little freedom of expression litigation brought against them. I can think of only one case of any great significance in this context in recent years. This approach tends to fall down when applied to a much wider range of entities which also deal with private speech, which involves a much wider range of individuals, rather than a small number of broadcasters and so on. Again, I think the model we are looking at here is perhaps not the appropriate one.

Mr. Liam Herrick

I agree with what Dr. McIntyre has had to say in response to Senator Hoey. There are ways in which the grounding of the Bill in human rights and equality principles could be more explicit, and the Law Society has made suggestions in this regard, but that will not get away from the core approach being taken here. Dr. McIntyre has put it very well in saying it applies a broadcasting model aimed at large companies to enforcement against potentially small providers and then, indirectly, to private communications and the private speech of individuals. How this will work in practice is that, if companies are under these legal obligations, they will adjust their internal processes, algorithms and so on, which will have the knock-on effect of filtering the communications of ordinary people, in addition to whatever effect self-censorship might have. It is difficult to get away from the fundamental human rights principles that apply to this question of vague definitions as to what types of speech might be proscribed. Once you get beyond speech that is clearly illegal, and Digital Rights Ireland has raised some questions as to how even that might be defined, and into harmful content and more vague concepts such as bullying and disinformation, it becomes very difficult to strike an appropriate balance. The approach, we find, is fundamentally flawed in that respect.

I welcome the witnesses. What they are saying, from what I have picked up from the conversation so far, is really to put the brakes on. As I read through both Mr. Herrick's and Dr. McIntyre's introductions, I picked out the words "overly vague", "confined", "concerned", "freedom of expression" and so on. There are a significant number of queries and issues with the Bill, from what they are saying. They are correct that we clearly need to define what is legal and illegal, especially when it comes to bullying. I think we highlighted that in a previous committee meeting. I acknowledge that defining something across the board is not the correct way. What a 13-year-old or a 14-year-old considers harmful might not be harmful to someone in my age bracket. Perhaps that needs to be taken into account. I highlighted one line in the Digital Rights Ireland statement: "it makes little sense to rush through the domestic provisions at this time". Clearly, what the witnesses are saying is to put the brakes on, look at this again and redefine a lot of the things in the Bill.

However, I have to take up what Senator Byrne said. Online harm and bullying are serious issues that need to be dealt with immediately. I would like the witnesses to expand on the idea of the social media council. Maybe they could touch on the points I have made. I think I am correct in saying that what they are calling for is fuller consultation and for the brakes to be put on this. However, I firmly believe we cannot hold back. This needs to be dealt with immediately.

Mr. Liam Herrick

As Dr. McIntyre set out earlier, the haste and the urgency of the Bill come from the transposition of the directive, and the difficulty is that all the other extraneous matters that have been included in the Bill require further deliberation. One example of other legislative developments going on in parallel that will have an impact on the Bill is the question of hate crime and incitement to hatred. Separately, the Department of Justice has issued a policy paper aiming to review the Prohibition of Incitement to Hatred Act 1989 and to legislate for hate crime for the first time. The approach that has been taken in that case is very strongly grounded in principles of freedom of expression and will introduce statutory definitions of concepts to which the Bill we are discussing will then make cross-references. Therefore, to advance this Bill before that legislation is enacted and resolves those definitions will require further amendment down the line.

On the question of bullying, I wish to clarify that the Irish Council for Civil Liberties is a member of the Children's Rights Alliance. We take the problem of bullying and the harm experienced by children very seriously, but it is important to bear in mind that the problem of bullying did not arrive with the Internet. The supports that children require to be protected from harm include changes to the curriculum in schools and supports in schools. There is also the chronic underinvestment by successive Governments in mental health and other supports for children. The Oireachtas is rightly concerned about dealing with this problem, but the suggestion that this is the only measure to deal with it is a misdirection of energy and priority when, frankly, the Oireachtas has underperformed over a long period of time in providing the necessary financial and other resources to schools and other services that are trying to help children to develop resilience in the face of bullying.

Dr. T.J. McIntyre

We can make two brief points in response to Senator Carrigy's point. One is that, to the extent that there are discrete concrete problems, we can sometimes deal with them in discrete ways which do not necessarily require going into this broader frame of legislation. For example, with regard to material which promotes suicide, we can legislate to criminalise that more explicitly. There is an existing prohibition, but we could legislate to widen that regarding the provision of information about the commission of suicide without any great difficulty. That would provide a secure basis to deal with those cases.

More generally, however, there might be some misapprehension as to how far-reaching the effects of this Bill would be, if adopted. The heads of the Bill propose that we legislate for providers who are established in the State. Quite a few providers are established in Ireland because of our rather generous tax regime and this legislation would have a significant effect in respect of those, but it would have no effect at all on providers who are headquartered elsewhere. I doubt it is the case, therefore, that there would be a huge reduction in certain types of concerning content, for example, material that promotes suicide or eating disorders, which is already relatively actively policed by the types of providers that tend to be headquartered in Ireland. That material, to the extent that it is available from other providers elsewhere, will continue to be available from other providers elsewhere. I do not believe we should overstate the effects of this legislation. If we say we need to act quite quickly in respect of some of these issues, it is not the case that a purely domestic response, even given the number of tech firms headquartered in Ireland, is necessarily going to have that effect. Unfortunately, this is an area where we have to look for international solutions as well. For that reason we should consider the desirability of integrating this with the Digital Services Act.

Ms Olga Cronin

To follow up on Mr. Herrick's point earlier and the question about what can be done straightaway, Ms Sarah Fitzgerald from Kinsale Community School told the committee that we must build resilience because cyberbullying will not just go away if we enact legislation. She said they needed to have tools to cope. It is an excellent point and one that should be supported. She and her schoolmate, Ms Megan Fahy, spoke in detail about measures being taken in their school to combat cyberbullying. They also suggested that sixth class pupils be spoken to before they go into secondary school. They said the problem mostly relates to first to third year pupils. They had some excellent suggestions for the immediate term, including talking about cyberbullying in the social and personal health education class. It is worth reiterating their points because they were very good.

On that, it is something we raised when representatives from TikTok, Facebook and Twitter were before the committee. I put it to them that there was a responsibility on them to fund a campaign in all schools, through the Department of Education, with regard to online bullying. They committed to that. It is something the Oireachtas, and I personally, will keep pushing to ensure they follow up on the commitment they gave. The responsibility is on them. They are making significant profits through these platforms and they should contribute. It would be rolled out by the Department of Education in every school in the country. The witness is 100% correct, and that is something we must ensure is implemented.

I thank the witnesses for the enlightened discourse. It certainly put the cat among the pigeons, at least. I have three brief questions. Do they think there could be a mechanism based on age for the regulation of private messaging systems, for example, with only private messages sent or received by children covered by the Bill? Do they think the commission as it is currently defined may have too much discretion in investigations? Under the Bill, the commissioner can demand information without prior judicial authorisation, including in respect of individuals' private communications. Do they think the Bill could cause problems for people who want to organise politically? If so, can they give examples or more details of how it could?

Mr. Liam Herrick

They are very broad questions and I am not sure I am able to answer them all off the bat. Certainly, the question about the broad powers is a deep concern for us. One of the difficulties is that not only are very broad powers being proposed for the commission, but also powers for it to take on the role of further defining what might be harmful content. That is problematic from an administration of law perspective and from a constitutional perspective. It is also difficult from a freedom of expression perspective.

In terms of a mechanism in respect of age, as Ms Cronin said earlier, there is a fundamental difficulty in treating adults and children alike in the proposed Bill. We note in our submission that other jurisdictions have tended to focus particularly on harmful content with regard to children. I am not sure there is any mechanism for distinguishing in terms of private messaging in this area. I will be happy to refer back to the Deputy on that in writing.

As regards organising politically, it is not something we addressed specifically in our submission. It is not something that is immediately apparent to us as a difficulty, but Ms Cronin may have some thoughts on that point.

Ms Olga Cronin

I wish to make a point about children in terms of age and differentiation. The UN Committee on the Rights of the Child released general comment No. 25 last March. Other people have raised this in previous sessions with the committee. The committee consulted 709 children across 28 countries, and the comment states that meaningful access to digital technologies can support children to realise the full range of their civil, political, cultural, economic and social rights. This is very important. Children have a right to participate online and a right to seek, receive and impart information. However, of course, children also have the right to be protected from harm. One way towards safeguarding that protection would be mandating service providers to carry out children's rights impact assessments, as stated in the comment. That was mentioned by previous witnesses, including Professor Conor O'Mahony and Dr. Karen McAuley. Dr. McAuley also spoke about international frameworks that speak to the responsibilities of businesses in regard to human rights and children's rights. We would support the call from both Professor O'Mahony and Dr. McAuley regarding the need for those assessments.

That same comment called for robust age verification systems that would be consistent with data protection and privacy requirements. These should be used to prevent children from gaining access to products and services that are illegal for them to use based on age. Such systems, if sufficiently robust, should enable businesses to operate granular safety settings which would uphold and respect children's right to participate.

Do the witnesses think the system of redress in the Bill is weak, and what would they like to see included in the Bill that would entrench a more robust appeals system?

Dr. T.J. McIntyre

I will comment briefly on Deputy Mythen's point regarding communications systems for children. Although companies already offer communications services that are limited to children and that have parental controls, explicit filtering and so on, and there is nothing in the law to stop that, one of the problems when trying to legislate for it is that there are a very large number of edge cases. For example, there is sometimes the proposal that there should be child-only services, which sounds great but, of course, these do not deal with the problem of the 17-year-old being in a very different age bracket from a 12-year-old, nor do they deal with the problem of the 17-year-old who turns 18 and would like to keep talking to his or her friends, to take his or her friends' messaging details with him or her, but finds he or she has now been aged out of the service. While companies are working on child-only or child-friendly messaging services, at this stage at least we are not in a position to enshrine that in law.

The second point the Deputy made, and which I would like to respond to briefly, is the impact of this on political organisation. There are two points I would make on that. One is that a concerning aspect of these rules is that they would apply across the board. They would, therefore, restrict what elected representatives say on their Facebook pages, Twitter pages or blogs hosted by Blogger or WordPress. It is an across-the-board form of control; it is not restricted to what goes on through traditional, large broadcast media. The second point, relating to private communications, is that this would also potentially impact private communications between a constituent and an elected representative. It would essentially require technology firms to look inside private communications and the cloud files representatives store in, for example, their Microsoft 365 storage or their Google Drive storage. Again, as currently proposed, it seems this would be done without any requirement for judicial authorisation and based only on some possible form of internal authorisation within the commission. I would regard this as very concerning.

The redress system seems to be very weak in the Bill. What would the witnesses like to see entrenched? Would they like to see a more robust appeals system, for instance, if someone is ordered to take something down?

Mr. Liam Herrick

We do not have anything further than what is in our submission on that. I do not know whether Dr. McIntyre or Ms Cronin has any particular views on the appeal mechanism.

Ms Olga Cronin

We do not have anything prepared to talk about that. However, we would be more than happy to follow up with the Deputy in writing on that.

What are the witnesses’ thoughts or views on the definition of cyberbullying in the Bill?

Dr. T.J. McIntyre

It strikes me the definition is so wide that it would criminalise, or certainly prohibit, many of Phoenix magazine's political cartoons. An aspect that is troubling is that it relies on intention. It refers to the intention of the person who wrote or sent something. This would suggest that if I took a piece of newspaper coverage of a scandal and sent it to someone with a view to humiliating the person who was behind that scandal, that act would be prohibited, based on my subjective intention. That is very worrying for two reasons. First, it focuses on subjective rather than objective intention. To that extent it does not matter that there is an objective public value in my being able to share this news article or political cartoon. Second, it places a significant burden on the tech firms to assess the subjective intention of the person who is sharing this particular news article or cartoon. Certainly, in its current form it is unworkable.

Mr. Liam Herrick

We agree 100% with all of what has just been said. It is a clear illustration of how different in nature and texture this is from the very necessary task of regulating and aggressively going after material such as child sexual abuse material, which is clearly defined in law as being illegal and has an obvious social harm. Already, the technology companies are developing and investing quite a lot in artificial intelligence and machine learning methods of countering that. However, this concept is so vague and open that it encompasses legitimate political expression. It is almost impossible to separate it from legitimate and important political expression. I would put questions such as disinformation and misinformation, which are also notoriously difficult to define, in the same category. Therefore, I think this is the wrong way of trying to deal with a very understandable social problem.

I am afraid I am going to challenge Mr. Herrick's and Dr. McIntyre’s assertions. They make the point that these are vague definitions around what is regarded as online harm. There are people already determining what is online harm. Facebook and Twitter have their community standards, over which we as legislators have no control. Already, we have a situation whereby they determine what can go on their platforms. Mark Zuckerberg and Jack Dorsey can continue to determine what is placed on their platforms. They made a decision, with reference to the political point, on whether they would continue to allow Donald Trump to use their platforms and whether his language was harmful. Whether you agree or disagree with what Donald Trump was saying, it was Zuckerberg, Dorsey and their teams who made the decision, rather than democratic society. It is those tech giants who, by the way, are located in Ireland not just for tax reasons but because of the solid talent that is based here.

The argument we have made is that we accept we need to be clearer on defining what is meant by online harm. Certainly, in terms of hate speech, we need to be clearer. However, the current situation is that those definitions are already determined by the community standards that are drawn up by Facebook and Twitter. What we have at the moment is a regime whereby Silicon Valley determines both what can appear and the rules on social media in Ireland, whereas we as legislators do not have that role. I accept there is a requirement for many states to work together on this. If, however, we look at Germany, where they have the Network Enforcement Act, and I appreciate its legislation around hate speech is stronger, they were able to ensure that, in the case of Holocaust denial, which is very clearly defined in German law, harmful content must be taken down within 24 hours. Yes, we can wait for the Digital Services Act and, yes, we can wait for perfection, but this is a challenge we need to address. For all our flaws, I would argue that in the democratic public space, it is better to have us as legislators and regulators determining what is regarded as harmful content rather than the feudal lords, such as Mark Zuckerberg and Jack Dorsey.

Mr. Liam Herrick

There is a huge amount in what the Senator has just said. As we have said earlier, everybody accepts that having the community standards and self-regulation as the sole model is neither feasible nor sustainable going forward. However, there is a lot to be said for putting our energies into demanding radical transparency on behalf of the platforms about their processes, about the consistency of their application, and about the algorithms they use. There is a need for a conversation between the companies and the State on what they are doing. However, what is being proposed here is going far beyond that.

If the State seeks to regulate content that is illegal, it is on very solid ground.

If the State seeks to define harmful content in a broader sense, it is putting itself in a very difficult position. It raises the question of how much in terms of resources the Oireachtas and the Government would intend to commit if it were intended that the State was to supplant the internal monitoring and moderation of the companies themselves. I refer to the task and the scale of what might be involved. We talked about the example of Germany, where the regulator has gone very far and certainly beyond where the Constitution stands in terms of the balance of freedom of expression and private communications.

Nobody is saying that the State has no role. What is proposed in the Bill is too far, too fast, however, and it will produce great difficulties for the State and for the strong protection of freedom of expression in the Constitution.

Dr. T.J. McIntyre

While the Senator made some good points, the fundamental problem is that either we want a regime whereby some entities can make decisions about what kind of content they want on their platforms or we do not. It is desirable that we allow a degree of freedom, certainly to smaller entities. There are questions relating to competition when we come to larger entities. These can be put aside for the moment. Nevertheless, it is important that such services have discretion as to what they consider appropriate for their community and that, while it may be legal to say something, such services are able to say it may not be tasteful or appropriate for their audiences. Similarly, it is important that there are other forums which might, for example, set out to make themselves child friendly and exclude material that would be suitable for adults but not for children.

Conversely, if we get to a point where the State can prescribe, in some detail, what kind of content a service must not carry, short of it being illegal, and can also prescribe what content it must carry (in a European context, for example, there are proposals to require political leaders' speech to be carried, notwithstanding that it might be offensive), we will rapidly end up with the State supplanting the editorial judgment of the service. That might be considered desirable but it would place a greater burden on the State, in order to comply with the requirements of Article 10 of the European Convention on Human Rights and the standards under Bunreacht na hÉireann, to prescribe clearly what can and cannot be done and to provide a State redress scheme in respect of that and so on.

The risk for Senator Malcolm Byrne is that if he gets what he wants, it will restrict the freedom to regulate because we will end up with the State taking on more responsibilities while having a weaker ability to control than private entities would. We cannot have a State rule that would provide that, for example, material harmful to children or material lacking in taste and so on should be precluded. Private entities are free to do that if they wish. There is a catch-22 here. The greater the degree of State involvement in these policies, the more difficult it becomes to regulate, precisely because a greater degree of precision and State supervision of regulation will be needed. If the Senator would like to encourage tech firms to regulate, there are ways of doing that, such as requiring them to adopt policies or to be transparent about the application of their policies and so on, without necessarily being prescriptive as to what they can and cannot carry within the law.

That brings the meeting to a natural conclusion. I thank Mr. Herrick, Ms Cronin and Dr. McIntyre for their presentations. The meeting has been most insightful and we had a good opportunity to interrogate those ideas and thoughts. It will certainly assist the committee in our pre-legislative scrutiny as we proceed. I also thank members for their engagement.

We will adjourn until 12.30 p.m. tomorrow, Thursday, 27 May, when we will hold a virtual meeting in committee room 3, as we continue our pre-legislative scrutiny of the online safety and media regulation Bill with representatives from Screen Producers Ireland and the Joint Creative Audiovisual Sectoral Group.

The joint committee adjourned at 2.14 p.m. until 12.30 p.m. on Thursday, 27 May 2021.