
Joint Committee on Media, Tourism, Arts, Culture, Sport and the Gaeltacht debate -
Wednesday, 12 May 2021

General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

Apologies have been received from Deputy Peter Fitzpatrick. We have one item of committee business to address before I call on the witnesses to present, that is, the draft minutes of our meetings on 5 and 6 May, both the public and private sessions. Are those minutes agreed? Agreed. There are no matters arising.

We will now proceed to our pre-legislative scrutiny. This meeting has been convened with representatives of the Ombudsman for Children's Office, OCO, the special rapporteur on child protection, Professor Conor O'Mahony, representatives from the National Anti-Bullying Research and Resource Centre and from the DCU institute for future media, democracy and society. This is the fourth of our public hearings to discuss the general scheme of the online safety and media regulation Bill.

Our guests are all very welcome. We have to hold these meetings virtually and, of course, it never feels the same as having our guests present in the committee rooms, but we are delighted to be joined by them. I thank them very much for their presence.

I would like to welcome the witnesses who are joining us remotely. I welcome Dr. Karen McAuley, head of policy at the Ombudsman for Children's Office, OCO; Professor Conor O'Mahony, special rapporteur on child protection; Dr. Eileen Culloty from the institute for future media, democracy and society in DCU; and Dr. Tijana Milosevic from the National Anti-Bullying Research and Resource Centre.

The format of the meeting is such that I will invite witnesses to make their opening statements, which will be followed by questions from members of the committee. As the witnesses are probably aware, the committee may publish the opening statements on its website following the meeting. I will call the representatives of each organisation to deliver their opening statement in the following order: Dr. McAuley, Professor O'Mahony and a representative of the institute for future media, democracy and society. The opening statements are limited to three minutes per organisation.

I would advise witnesses of the following regarding parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of any person or entity. Therefore, if witnesses' statements are potentially defamatory in relation to an identifiable person or entity, they will be directed to discontinue their remarks. It is imperative that they comply with any such direction. As our witnesses are attending remotely from outside the Leinster House campus, they should note that there are some limitations to parliamentary privilege and, as such, they may not benefit from the same level of immunity from legal proceedings as witnesses who are physically present do.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the House, or an official either by name or in such a way as to make him or her identifiable. I remind members of the constitutional requirements whereby members must be physically present within the confines of Leinster House or the Convention Centre Dublin. I will not permit a member to attend where he or she is not adhering to the constitutional requirements. Therefore, any member who attempts to attend from outside the precincts will be asked to leave the meeting.

I ask members to please identify themselves when contributing for the benefit of Debates Office staff preparing the Official Report, to mute their microphones when not contributing to reduce background noise and feedback and to use the raise hand button when they wish to make a contribution. I remind those joining today's meeting to ensure their mobile phones are switched off or on silent.

I invite Dr. McAuley to make her opening statement on behalf of the Ombudsman for Children's Office, OCO.

Dr. Karen McAuley

I thank the Chair for inviting the Ombudsman for Children’s Office, OCO, to appear before the joint committee today. I would also like to convey the Ombudsman for Children’s apologies that he is unable to attend today’s meeting due to a prior engagement. As members of the committee are aware, the OCO is an independent statutory body which was established in 2004 under the Ombudsman for Children Act 2002. Under the 2002 Act, the OCO has two core statutory functions, namely, to promote the rights and welfare of children up to 18 years of age and to examine and investigate complaints made by or for children about the administrative actions of public bodies, schools and voluntary hospitals that have or may have adversely affected a child.

The OCO’s engagement with developments that have given rise to the current general scheme, and with the general scheme itself, has focused on proposals for the establishment of an online safety commissioner and, as such, on Part 4 of the current scheme.

In a new general comment published in March this year, the UN Committee on the Rights of the Child recalls that while the digital environment is playing an increasingly important role in many aspects of children’s lives, it was not originally designed for children. As the UN committee notes, and as we all know, alongside the opportunities the digital environment affords for the realisation of children’s rights are the risks it poses to their violation or abuse.

The proposals set out in the general scheme to establish a regulatory framework for online safety, in order to address the spread and amplification of harmful online content, represent a significant opportunity to strengthen the protection of children from such content. This is not to suggest, however, that the proposals do not require further refinement. In this regard, our submission to the committee identifies elements of Part 4 that we believe require further attention. They include the proposals relating to categories of harmful online content and to a systemic complaints scheme.

The OCO’s work is underpinned by primary legislation. Therefore, we understand how important it is that legislation is enabling. The prospective authority, credibility and effectiveness of the proposed media commission are largely contingent on the statutory framework that is ultimately put in place for it, as well as on the resources allocated to enable the commission to discharge its functions, including in respect of online safety.

The online environment is constantly evolving and expanding. Therefore, it is foreseeable that, once enacted, this legislation will be subject to future amendments. This being the case, we would suggest that work to develop and progress the current legislative proposals may benefit from a focus on getting the fundamentals right, so as to ensure that the commission is placed on a sound statutory footing and can get off to a good start.

In relation to the prospective work of the online safety commissioner specifically, this will involve making sure that the regulatory tools available to the commissioner have the potential to be effective, that the provisions made about the online content and material that fall within the scope of the commissioner's work are rights compliant, understood and workable, and that the complaints scheme put in place upholds the right, including the right of children, to an effective remedy.

I thank the committee again for the invitation to meet it today. I am happy to take questions, if required.

Thank you very much, Dr. McAuley. We will hold off on questions from our members until all our witnesses have had the opportunity to make a statement. I thank Dr. McAuley for being so considerate in terms of the timeframe available to us. I ask Professor O'Mahony to address the committee.

Professor Conor O'Mahony

Go raibh maith agat, a Chathaoirligh. I am grateful for the opportunity to speak with members about the general scheme of the online safety and media regulation Bill. This is an important Bill because the digital environment is largely unregulated in Ireland at present. The Bill presents an important opportunity to address a number of concerns relating to the risk of harm that may occur to children during their online activities. At the outset, I would like to emphasise that the digital environment is not an inherently bad thing for children.

We often focus on the negatives but the reality is that the digital environment is somewhere we all increasingly live our lives. Children spend a significant amount of time there and it offers enormous opportunities to children to express themselves, play and socialise, avail of educational opportunities and participate in society. Children’s rights principles require that we ensure children are allowed to benefit from these opportunities and that measures aimed at protecting children from harm avoid unduly restricting children’s capacity to so engage.

However, we should acknowledge that there are important and significant risks of harm in the digital environment. As children now spend so much time online, they are exposed to increasing risks of cyberbullying, exposure to harmful material, exploitation and abuse. Although children’s rights principles require that children should have the freedom to engage in online activities, they equally require that children should be protected from harm while they so do.

My submission draws in particular on international guidance from the UN Committee on the Rights of the Child and the Council of Europe. I have extrapolated from that guidance three key principles of which we need to be aware. First, the legislative and regulatory environment must be clear and predictable. Second, the law must require businesses to meet their responsibilities to respect children’s rights in the digital environment. Third, the law must provide for accessible non-judicial remedies and grievance mechanisms while ensuring that judicial remedies remain available.

Looking at the general scheme of the Bill in its current form, there are several ways in which it falls somewhat short of these principles. As regards a clear and predictable legal environment, the Bill lacks clear definitions of several concepts, such as pornography or gross or gratuitous violence, and this, in turn, leaves the concept of age-inappropriate material somewhat unclear in the Bill. It omits financial harm as a type of online harm, meaning that it has no implications for the exposure of children to online gambling. The references to the concepts of the best interests of the child and the evolving capacities of the child are quite vague and do not sufficiently explain what those principles require of regulatory bodies or service providers. Most important, the Bill does not provide for a system of individual complaints that can lead to the removal of harmful content. Finally, it does not require service providers to carry out any form of children's rights due diligence, such as risk assessments on possible forms of online harm to children.

The final point I wish to make is that international guidance has repeatedly emphasised the importance of including child participation in any process leading to the enactment of laws regulating the digital environment. No one understands better than children and young people themselves how they experience and navigate the digital environment. We would be doing them a disservice if we did not afford them the opportunity to contribute their views before this legislation is enacted.

I thank the committee members for their attention. My written submission elaborates on these points. I am happy to deal with any questions members may have.

I thank Professor O'Mahony for that detailed presentation. I invite Dr. Culloty to deliver her opening statement.

Dr. Eileen Culloty

I thank the Chairman and members for this opportunity to contribute to the consultation process. The Bill is important legislation because the media environment has changed fundamentally and there are major concerns regarding online harms. These are complex issues. They relate to debates about the regulation of online platforms and the need to balance the mitigation of harms with the protection of rights and freedoms. In that context, it is imperative that the Bill provides sufficient clarity about the roles and functions of the new media commission. Only one role is identified in the Bill, that of the online safety commissioner. We recommend the inclusion of a media pluralism commissioner. This would, in keeping with the EU media freedom Act, address critical concerns and threats with regard to the future of a viable media system in Ireland and a vibrant public sphere. Relatedly, we recommend that action be taken quickly on the content levy because it is imperative for the viability of Irish media; it should also be broadened to include wider support for the sector.

We note that the Bill does not reference disinformation. This is out of step with the EU Digital Services Act and the democracy action plan, both of which recognise disinformation as a harm that threatens society and democracy. Media regulators across Europe, including the Broadcasting Authority of Ireland, BAI, are already monitoring the implementation of the code of practice on disinformation, so this falls within the remit of the media commission.

To date, there has been widespread criticism of online platforms. Much of this is about the lack of transparency and accountability relating to automated decision making, moderation and the responses of the platforms to harms. Much of the platforms' reporting and transparency measures are presented in aggregate. For example, reports will note that three million pieces of bullying and harassment content were actioned in a given quarter but, without sufficient context, it is impossible to analyse these data in a meaningful way or to evaluate whether the actions of the platforms are actually effective. Research is needed, as are clear obligations to report meaningful data, rather than more transparency reports.

In Australia, the Online Safety Act and the Office of the eSafety Commissioner have implemented an individual complaints mechanism. If the implementation of such a mechanism is not possible here, we recommend that platforms catering to children be required to fund a portion of prevention and intervention measures, such as counselling services and helpline services. This could also be considered as a component of providers’ duty of care.

In conclusion, we believe the Bill is a concrete step towards establishing media regulation that is fit for the 21st century. Our comments are intended to enhance the design and operation of a media commission that will serve the needs of the public and the media industry. I am happy to answer members' questions on behalf of the DCU institute for future media, democracy and society. My colleague, Dr. Milosevic, is present representing the National Anti-Bullying Research and Resource Centre.

I thank Dr. Culloty for her presentation. The committee appreciates it. She made some very good and comprehensive suggestions for us to consider.

I thank the witnesses for their comprehensive and helpful statements. The Department is telling us that an individual complaints mechanism would be quickly ground down, that there would be backlogs and, essentially, that it would not be effective. We know that Australia has a system in place whereby an individual complaints mechanism is available. I know from personal experience that when hundreds of tweets are aimed at you on Twitter, and one in particular on Instagram is the one you want to remove and chase, the tweets on Twitter essentially do not matter in that context as far as Instagram is concerned. I believe in an individual complaints mechanism. How important do our guests consider such a mechanism is in the context of the Bill?

Professor Conor O'Mahony

To my mind, the absence of an individual complaints mechanism is the single biggest weakness in the Bill as it stands. If we persist with an approach that does not include provision for individual complaints, it is probable that we will be heavily criticised in the near future by the UN Committee on the Rights of the Child for that failure. As to the risk of such a mechanism being overwhelmed, we do not need to think of an individual complaints mechanism in a State agency as the first port of call. Ideally, the law should be structured in such a way that one imposes strong obligations on the service providers themselves to have local-level complaints mechanisms, such that the complaint would in the first instance go to the service provider, which would have a legal obligation to have a mechanism for dealing with that complaint, responding to it and removing content. It would only be in cases where that fails at local level with the service provider that the complaint would then be escalated to whichever State agency is regulating this area of activity. In that way, one tries to filter out the majority of the complaints before they reach the relevant State agency, and that avoids the risk of it being overwhelmed in the way that has been identified.

As I have said, it is not sustainable to leave this out of the Bill.

Dr. Karen McAuley

To build on Professor O'Mahony's points, from our perspective it is most important to ensure that what is provided for via the proposals complies with, and gives effect to, the right, including children's right, to an effective remedy. From our perspective, what is proposed at present does not appear to be aligned with children's right to an effective remedy, as conceived by the Council of Europe in its 2018 recommendation, which Professor O'Mahony referenced, and in several general comments issued by the UN Committee on the Rights of the Child, including its most recent comment on children's rights in the digital environment.

In respect of children's right to an effective remedy, reference is made to the need for child-sensitive mechanisms. That means mechanisms that are prompt, genuinely available and accessible, and able to receive, investigate and address complaints. Therefore, it is important to give further consideration to the introduction of an individual complaints mechanism. We understand the Department's concerns, and clearly it would not be in service users' interests, including those of children, for the commissioner to be overwhelmed by complaints such as to create significant delays or backlogs in addressing them. However, as Professor O'Mahony indicated, it would not be anticipated that the commissioner would be the first port of call. Really, what needs to happen is that online services are equipped with the competencies to be able to deal with complaints appropriately and effectively, which means swiftly, and that, where a complaint to the service does not resolve the problem, service users, including children and parents acting on their behalf, have the opportunity to approach the commissioner with their concerns.

I thank Dr. Milosevic for joining us; she has the floor.

Dr. Tijana Milosevic

I want to support my colleagues in what they have said. If an individual complaints mechanism is impossible to institute for some reason, we wish to underscore the point that it is important to provide psychological and counselling supports to children who experience cyberbullying. We have found, and I believe it was indicated by the children who appeared as witnesses before the committee last week, that very often they find themselves in a situation where reporting content does not work for some reason. What happens is that content spills over from one platform to another, with the result that the content spreads and the child is not able to get the right support. As we have seen from research, very often it is not enough to have the content taken down. What also matters to children is to have that follow-up, where they get educational and psychological support as victims, particularly of cyberbullying, which is the focus of our research. Therefore, in addition to having content removed, supports for the individual must be in place. It is most important to ensure that there is a good infrastructure to support children individually when they find themselves in these situations.

I thank Dr. Milosevic, and Senator Warfield for his question. We may have an opportunity for a second round of questions but we will move on. I call on Senator Cassells.

(Interruptions).

Following on from the closing remarks of Professor O'Mahony in respect of the request for the voices of children to be heard in this debate, as Dr. Milosevic has alluded to, we had several transition year students here last week from Kinsale and Tallaght. I will read from the submission of two of the girls from Kinsale, which states: "Sometimes it can be infuriating for students to be lectured on the dangers of social media by parents who perhaps don't understand that this is an integral part of growing up." Everyone here this morning has touched on that point and the issue of finding that balancing act. However, I refer back to the remarks made by Dr. McAuley in her opening statement on getting the fundamentals right. I ask all of the witnesses the following. Currently, how far short is the general scheme of the legislation in respect of getting the fundamentals and the definitions right? We have already touched on a number of points in respect of defining pornography and abuse. How far short are we in that respect?

I have a follow-up question, specifically for Professor O'Mahony. I am delighted that he touched on the categories of harmful online content, in particular in the area of gambling. He noted in his submission that about three quarters of teenagers between 12 and 17 years of age gamble annually, a higher proportion than any other age group. I fully support him in the point he made about the failure to deal with this and I ask him to elaborate on that. By way of illustrating the dangers and the very devious way in which gambling companies can use these platforms to target children, I will use the following example. Just last week, a prominent Dublin GAA star, acting as an ambassador for one of these gambling companies, spoke about underage football. In the body of the article, when reference was made to the gambling company for which he was acting as ambassador, the name of the gambling company was highlighted so that a user could click on it and be brought to the live online betting odds for last weekend's fixtures. That is how deceptive and devious they are. They are able to use these platforms to target young, underage children. It galls me to think that there are sports stars who are complicit in this and are letting their good names be used for the exploitation of others. Those are my questions.

Dr. Karen McAuley

I thank the Senator for his questions. He asked about ways in which the proposals fall short. We have already spoken briefly about the provisions made in respect of the availability of redress and how it is proposed that complaints may be dealt with by the commissioner, or not, as the case may be. That is one very important area that, from our perspective, needs further work in order to ensure that it safeguards children's right to an effective remedy in an appropriate and effective way.

As we indicated in our submission and as I noted briefly in my opening statement, there are other areas about which we have residual concerns. In respect of the proposed approach to defining harmful content, our concern has less to do with the four categories that have been proposed so far than with the use of categories as a mechanism. We understand the rationale for it, but we do have concerns about whether the use of categories is sufficiently specific to be rights-compliant. Clearly, it is absolutely essential that the provisions made in this regard and in other areas are rights-compliant so as to mitigate any risk of future challenge. It is really about legal certainty around this.

We also have concerns, in terms of specificity, about whether the use of categories, as currently conceived, is sufficiently clear to support a shared understanding of what falls within the scope of each category and, with that, whether the categories will be workable for the commission. Again, it is about mitigating, at this juncture, the risk of differences of understanding as to what does and does not fall within scope, and of prospective disagreement between the commission and other stakeholders on this point. I will hand over to others to give them an opportunity to speak. We have similar concerns about the current proposals around age-inappropriate material. We need further clarification and information, and the proposals need further refinement.

Professor Conor O'Mahony

Briefly, on the first question, my sense of the Bill is that it is broadly going in the right direction. I do not think it is a case of needing to scrap it and start all over again. It is a case of it needing to go further in a number of respects. To take a few examples that I mentioned, it is the right thing to make reference to principles like the best interests of the child and the child's evolving capacities. It is good to include that. However, we need to go further and clarify what that means and what obligations flow from those principles in respect of the media commission, the online safety commissioner or service providers. It should be clarified what they have to do on foot of those principles to make them a reality. The thrust of my submission is that the general direction is good, but there are a number of places in which the proposals need to go further and be refined and developed more.

On the gambling point, I agree with the Senator. The issue is that we must ensure that we do not leave loopholes, because we know that service providers will try to exploit those loopholes where they exist. If the evidence suggests - which I believe it does - that online gambling poses a risk of harm to children, then we need to be sure that the Bill really calls that out and puts in place specific safeguards that address that rather than leaving loopholes, of which service providers will be only too happy to take advantage.

Dr. Eileen Culloty

I want to make a point on the fundamentals. It is important to understand that this media environment is going to keep changing and there will be new platforms and practices. Dr. McAuley was talking about the danger of focusing on categories of content. In five years' time, the dynamics could be completely different. It will be important to build in some element of future-proofing, which is what we are trying to get at with this idea of a media pluralism commissioner. While that may not be the correct term, I am referring to somebody who takes a very broad perspective of the environment and the trends that are happening. Perhaps we could also move out of this cycle of reacting to all the digital problems that are coming at us and being on the back foot, and actually think forward with some vision of what kind of media environment we want for all citizens, whatever age they are.

I thank the witnesses for their presentations. I have two sets of questions. The first is for Dr. McAuley around the question of advertising and micro-targeting of children online. I have a particular concern in this regard. I am rather frightened by the news that Facebook is now talking about developing Instagram for kids, and I think it is an appalling idea. I would be grateful to hear what Dr. McAuley feels we need to put into the legislation to deal with advertising that specifically targets children, and how we can address that.

My second question, to Dr. Culloty, concerns the whole area of algorithmic decision-making and machine learning. We have been looking at the idea that the new media commission would develop specific codes in these areas to ensure transparency and awareness of when algorithmic decision-making is taking place. Will Dr. Culloty elaborate on what she believes we should look to have in the legislation? I may come back in later but those are my first two sets of questions.

Dr. Karen McAuley

I thank Senator Byrne for his questions. We have not had an opportunity to look in detail at the whole question of advertising in the context of this general scheme. However, as the committee knows, children have a right to protection from material that is potentially harmful to their well-being, and that could include advertising and the risk of commercial exploitation associated with it. From our perspective, and to refer back to what Professor O'Mahony was discussing in his opening statement, there is an international human rights framework that addresses the responsibilities of business in the round in regard to human rights and children's rights. It is an area that the UN Committee on the Rights of the Child and others have looked at. There are two potential codes referenced in the general scheme that, if developed and implemented, may assist in this area. The first is around impact assessment and, as Professor O'Mahony indicated, there is currently no explicit reference to children's rights due diligence. As regards child rights impact assessment, that is a mechanism we would like to see promoted through the legislation. There is also a code in regard to the whole area of commercial engagement and advertising, so there may be scope to develop it in the legislation.

Professor O'Mahony may want to come in on this. At present, the Broadcasting Authority of Ireland has specific codes around advertising and children, dealing with issues such as junk food advertising and other areas. I am looking for clarity on what we should have in the legislation around those areas of advertising, but obviously in regard to specific platforms. I am frightened by the idea of Instagram for kids. What can we do, as legislators, to address that?

Professor Conor O'Mahony

The issue about online advertising is that it gives rise to additional risks that might not arise in respect of advertising on television or radio, or in publications. If we look, for example, at the most recent general comment from the UN Committee on the Rights of the Child, it particularly highlights the risk of profiling and the risk of advertising based on data which have been collected about the characteristics of service users, or implied characteristics. We know that when we use our devices and search for a particular service, the next thing is that advertisements are thrown at us for that service. That sort of process, whereby online services can, to an extent, covertly gather data on children and then seek to bombard them with particular types of advertising based on those data, is particularly flagged by the UN committee as something which needs to be regulated. Although I might have to double-check this, I believe it said that sort of profiling and targeting should be prohibited by law.

It is important that those kinds of things are captured because the existing regulatory environment around other forms of advertising might not have the detail. While I am not an expert on that area of law, we might not find the detail we need in some other laws to deal with that sort of activity.

I asked Dr. Culloty about algorithmic decision-making.

Dr. Eileen Culloty

When it comes to algorithms, we need to emphasise that there is a big difference between platforms providing transparency and there being accountability for what they are doing. The platforms often talk about transparency and provide transparency reports but, when we are talking about something as complicated as algorithms, transparency means very little, certainly for most members of the public. Accountability requires, first, that people with expertise are able to ask the right questions of the platforms and, hopefully, this expertise will exist within the commission itself. It also requires that the platforms share sufficient data to allow independent oversight of what they are doing. That second part has been a huge gap, with the platforms not providing any kind of data that would allow us to understand what actions they are actually taking. Dr. Milosevic might have something to add in regard to content moderation.

Dr. Tijana Milosevic

With respect to children, I want to add a point about the importance of opening up the data, and the mechanisms or systems of moderation that companies have in place, to evaluation by children. What we currently lack, and this has been my point for a long time, is an understanding of the impact and effectiveness of moderation as it affects children, so we need to embed their feedback into the design and development of these moderation systems. Currently, a platform may say that it is able to detect a certain amount of content proactively, for example, bullying content, without it being reported first by children. However, that does not mean much to us without sufficient context as to how much bullying there is overall, what the platform has done with this content and whether children have actually been satisfied with the actions the platform has taken with that content. In order to evaluate that, as Dr. Culloty pointed out, we should have access to data but also a greater understanding of what these numbers actually mean.

I welcome the witnesses. It has been a very informative session and I thank them for taking time out to be with us. My question is general in its approach and is for all the witnesses. It is to get an understanding of their views on the Bill in its current general scheme format and the provisions they would like to see in order to provide a clear legal obligation to remove harmful content on foot of an appropriate complaints system. In his submission, Professor O'Mahony referenced international approaches, including in the UK and Germany. The latter adopted the Network Enforcement Act, which requires the removal of manifestly unlawful content within 24 hours of notification of a complaint and of any other unlawful content within seven days. We have seen that France recently proposed fines of up to 4% of online providers' global revenue for repeat offenders. A number of other international legislative measures have been enacted and Australia has established the eSafety Commissioner. How do the witnesses feel the Bill needs to be strengthened in this area?

With regard to misinformation, would the witnesses foresee platforms using fact checkers and, if so, how do they see this being implemented in future? We all know that different platforms, such as Facebook and Twitter, take different approaches in terms of how they police misinformation.

I would like to hear the witnesses' understanding of it.

Dr. Karen McAuley

We share Deputy Dillon's concerns about what is currently provided for and the extent to which it will facilitate the takedown of material. It is very important for all of us to bear in mind that there will be circumstances where a child could be very distressed by virtue of content they have been exposed to, for example, cyberbullying content. They and their parents will be looking to have something done about it. Speed is of the essence here. While safeguards need to be in place in terms of procedural fairness, rights balancing and so on, one of our concerns is that there is nothing currently set out under the legislative proposals to suggest that speed is possible. I am not suggesting short cuts by any means, but speed is one of the things we need to think about. In that regard, there is currently no obligation on online services to remove harmful content. That area needs to be looked at further in the interests of ensuring that what is put in place is not only effective but does not cause undue delay.

Professor Conor O'Mahony

The key point is that if we are going to define all these various forms of harm in the Bill, that needs to have some actual effect beyond simply codes of practice or broad, soft measures like that. If there is content which falls within the definition of some form of harmful content in one of the various categories in the Bill, there needs to be a consequence. First, there needs to be a very clear legal obligation on a service provider to remove content identified on its platform which falls foul of one of those provisions in the Bill. If it is only a soft measure through some form of code of practice, or a document of that nature, we are likely to see non-compliance.

In my submission, I referenced an anecdote from the CEO of CyberSafeIreland, who spoke about a parent who took more than two years seeking to have harmful material relating to one of their children removed. There is a risk that, if we do not create a very clear obligation to remove material, that scenario will be repeated in future. A clear obligation that harmful material must be removed is currently absent, but there also needs to be a process which leads up to that point because, in order to identify that material, there needs to be a very clear complaints mechanism.

The key elements are, first, the obligation to remove material and, second, a local complaints mechanism at service provider level so issues can be brought to a service provider's attention without having to go near the online safety commissioner. Third, as a safety net, the online safety commissioner complaints mechanism would address cases where service providers are not meeting their obligations. Those are the three components I would like to see.

Dr. Eileen Culloty

Regarding disinformation, the Deputy is absolutely right that each platform takes completely different actions; what happens on each platform differs. One of the big problems is that we do not know whether those actions are effective. We also do not know the extent of the impact of disinformation on these platforms. All of that comes back to the fact that platforms are not obliged to share these data with independent researchers or authorities who could actually verify what is happening. That is why we are arguing that disinformation should be included and explicitly named in the Bill, because it already falls within the remit of the BAI, which currently has a role in monitoring the EU code of practice on disinformation. The EU Digital Services Act, the democracy action plan and the media action plan all clearly reference disinformation.

We should also note that all these issues are interrelated. We tend to talk about them as completely separate issues, as though problems with online harms and child safety, attacks on journalists and disinformation that might undermine democracy were not related. These are all interrelated and the common denominator is the online platforms. It makes sense for a new media regulator to have that broad perspective, which would give it more power in dealing with platforms.

Many of the questions I planned to ask have been asked already. I will put my first question to Dr. Culloty in relation to her call for a media pluralism commissioner, which seems a very positive proposal. The remit suggested in her submission is very broad, but she makes the point that many of these functions are already under the remit of the BAI. Can Dr. Culloty discuss the remit she would envisage for this role and how it might work in practice?

Dr. Eileen Culloty

It goes back to the point I was making about how all these separate problems are fundamentally interconnected. We need to be able to see them with that comprehensive view. That is why we think a role specifically aimed at that is needed. For example, we know worldwide media freedom has been eroded and Ireland is not immune to this. It is already a big issue at European level which is why proposals for the European media freedom Act have been announced.

The traditional issue of diversity of ownership has normally been viewed as a competition issue, but pluralism in the market is not just about competition anymore. For example, in a very small English-speaking country like Ireland, which is dominated by a huge English-language market, there is a real concern about the future viability of Irish cultural production in Irish media. We know there are huge problems in journalism with the viability of its financial model. We are already seeing the closure of many local media outlets, especially regional newspapers, creating news deserts where local issues go uncovered.

Of course, all the issues we discussed today around regulating online content raise freedom of information and freedom of expression concerns. There ought to be a dedicated role that thinks about all these matters in a comprehensive way. Again, the role would be future-looking, assessing changing trends and future threats.

Dr. Culloty also put forward in her submission the idea of requiring online platforms catering for children to fund a portion of prevention and intervention measures. Most people would agree that makes sense. Can Dr. Culloty tell us, very briefly, what that would look like? Is she talking about a levy on a percentage of profits?

Dr. Tijana Milosevic

That proposal came from us. We would, of course, have to leave the administrative decision-making up to the Government because it is not within our area of expertise. At the moment, companies normally donate money voluntarily to individual research groups, charities or organisations, but the idea is to institute that through the Government. What we propose could be part of the statutory duty of care which has been proposed in other legislation. The rationale for this is that we often see cases where infrastructural support for a child would be beneficial as part of an individual complaints scheme. For instance, we have Coco's Law, from the Department of Justice, which relates to a young person, not a child, who was affected and did not receive adequate help. What happened to her was a series of online harassment incidents. She was not getting adequate infrastructural support from the Garda or from the online companies, or adequate counselling, psychological and educational help.

We are proposing that the Government really think about supporting an infrastructure that involves education and counselling, so that when this happens every individual child and parent knows where to go. This should be streamlined, rather than having a series of individual organisations that might not be working in tandem. How the infrastructure is unified should be thought through in a concerted way, to ensure we know who people can go to in order to get the help they need when they are not able to deal with an individual piece of content, which, in cyberbullying cases, usually circulates in tandem with what is going on offline. That is what we are proposing. The details and the administrative part are outside my domain of expertise.

In relation to Dr. Karen McAuley's points, Professor O'Mahony stated that the single biggest weakness in the Bill was the lack of an individual complaints mechanism.

Two weeks ago, representatives from the BAI and the Data Protection Commission appeared before the committee. They flagged the lack of resources. The option at the moment is to hold platforms accountable. It has been done in Australia, so it is possible. The idea is for us to take our time with this Bill to ensure it is fit for purpose and that it covers as many aspects as possible. That is what we want it to do. I am sure the witnesses would agree that fundamental to that is having sufficient resources. Would they agree with Professor O'Mahony that that would be the single biggest weakness if we want to have accountability and oversight of all this? Not having the option of an individual complaints mechanism automatically weakens the Bill from the get-go.

Dr. Karen McAuley

I thank the Deputy for her question. We share Professor O'Mahony's concerns in that regard. Our submission to the committee made reference to the proposed systemic complaints scheme, which obviously cannot be viewed in isolation. It needs to be looked at alongside the other tools that are proposed for the commissioner. We are not suggesting that having a systemic complaints scheme is inherently flawed. However, while the proposed commission is being tasked with it, the general scheme, and indeed the regulatory impact analysis, provides very little detail about it. Therefore, it is not clear whether it could work as an effective alternative.

Regarding individual complaints handling, as we noted in our submission to the committee, we feel it is very important that online service providers are tasked and equipped with the knowledge and competences to deal effectively, appropriately and efficiently with complaints that come to them. Ultimately, if service providers are adept at good complaints-handling practice and are able to resolve the concerns that come to their attention through complaints, that serves service users, including children, well because it supports a swift resolution of the issue. As I said earlier, it is still important to allow for that not being the case at all times. People need a further avenue. That is where the question of prospective individual complaints handling by the online safety commissioner needs to be considered. It is not a question of suggesting that the online safety commissioner should be the first port of call but rather that it should be an avenue that is available. In other words, we should have an independent non-judicial mechanism.

We are very aware of the Department's concerns about the capacity of the commission to deal with the potential quantity of complaints that may come to it if an individual complaints-handling mechanism is put in place. We are not sure about the extent to which the Department may have considered introducing a third strand. It is not a question of having one or the other, a systemic scheme or an individual complaints-handling scheme. Is there a way in which we can introduce an individual complaints-handling component which would complement the roles performed by a systemic complaints scheme and by the auditing measures that are provided for, and which would be set up in such a way as to mitigate the risk of the commission being overwhelmed by the volume of complaints?

The Deputy's time is up, but we will probably be able to have a second round of questions.

I apologise that I missed the presentation and I do not know what has been asked already. How important will the media pluralism commissioner be? The Bill makes reference to an online safety commissioner, but it merely provides that the Minister may appoint such a commissioner. I ask the witnesses to outline their views on the importance of that.

It was stated earlier that it would be doing young people a disservice if they were not afforded the opportunity to contribute their views before the legislation was enacted. After this legislation is enacted, should children have an active role in this?

Dr. Eileen Culloty

The role of the media pluralism commissioner would be to ensure that the policy and regulatory environment in Ireland, the environment in which all Irish-facing media will operate, is designed and has enough oversight to ensure we have a vibrant media sphere, not just for journalism and news media but also for our cultural production, which is also a major part of the media system. There are many threats on multiple fronts, including the harassment of journalists and media workers as well as the decline of media companies at national, regional and local level. Some previous witnesses have spoken about community media. It is about ensuring that we have a vibrant national media system at every level, including the public broadcaster. We believe that one role could tie together the various strands of concern and also have a future-looking perspective.

Dr. Karen McAuley

The Deputy mentioned that explicit provision has not been made in the general scheme for the appointment of an online safety commissioner. As outlined in our submission, we understand the reasons for that. To ensure that the legislation is agile and to future-proof it as far as possible, our preference would be for explicit provision to be made for an online safety commissioner.

Regarding children's participation, our initial submission in 2019 strongly encouraged the Department to include children and young people in its work to develop the current legislative proposals. We are disappointed that has not occurred. Equally, we very much welcome that the committee invited young people to appear before it last week. Given that this legislation still has some distance to travel, we would really like to see children and young people being afforded a child-friendly opportunity to express their views and concerns in respect of the legislation prior to it being enacted. Equally, when the commission is set up, we would hope that children and young people, as very important stakeholders, would be consulted about the work of the commission.

Professor Conor O'Mahony

I echo what Dr. McAuley has said. I also welcome that young people were invited to appear before the committee last week. I had not been aware of that before today and it is good to hear. It is incredibly important that any child participation is meaningful and not just a tokenistic thing. It should not be done just to give the appearance that children have been involved. They need to be given a meaningful opportunity to have their say, contribute to the process, and to have their views properly considered and taken on board as part of that process.

The important point has been made that this is an ongoing process. As we are all aware, the digital environment evolves very quickly and how it looks today may not be how it looks in a few years. Dr. Culloty has repeatedly made the point about future-proofing. Part of that future-proofing process must involve staying in touch with young people to hear their experiences of how the environment has changed and evolved, and for that to feed into reviews of codes of practice or reviews of the legislation itself in years to come.

In his observations, Professor O'Mahony mentioned financial harm as something that should be included in the definition of harmful content. He referred to gambling. For example, young teenagers are playing slot machine simulations, which may cause financial harm in the future.

Professor Conor O'Mahony

At the moment, the Bill captures harm to children through things like cyberbullying, abusive imagery or self-harm imagery. As it stands, the definitions in the Bill would not capture a child suffering harm through exposure to gambling. That could take a number of different forms. I drew the term "financial harm" from the work of the British children's commissioner, who had recommended in their framework that the concept of financial harm was an appropriate vehicle for ensuring that laws regulating online harm could capture the exposure of children to gambling. That was the thinking behind that.

I thank our witnesses for their very informative presentations to the committee today. They have brought very detailed observations on the Bill. I will ask them to elaborate further on some of the points they have made.

My first question is for Dr. Culloty. She spoke about future-proofing, which is the idea behind having a commissioner for media pluralism. The problem for any Bill like this is the fluidity of social media platforms and how they change to reach larger audiences. It will be difficult to keep on top of the issue, which I believe was Dr. Culloty's point. Is research ongoing into online enforcement of community standards by social media platforms? I had a recent experience of this with one of the platforms in question. What issues do researchers encounter when trying to research these platforms? Dr. Culloty mentioned that the platforms were not that willing to share information with independent researchers. Perhaps that is where the issue of disinformation comes into this.

Professor O'Mahony stated that the Bill did not require service providers to carry out any form of due diligence in respect of children's rights, for example, risk assessment of possible forms of online harm that might result from their services. We have touched on this matter already, but will he expand on it further?

Dr. McAuley made many observations. Someone mentioned the consequences and obligations stemming from harmful online content. I did not write down the person's name, although it may have been Professor O'Mahony. This is an important matter. How does Dr. McAuley suggest the Bill could address it better? Dr. Culloty spoke about how the speed of removal of harmful content was not what it should be. Perhaps that point feeds into this matter.

Dr. Eileen Culloty

One of the issues with social media platforms is that they have become integrated into our everyday lives. The children who spoke last week were clear about that. It is how we communicate with our families, friends and colleagues. It is also where small businesses operate. The platforms are private companies, though, and they are free to set the terms and conditions for participation on their platforms as they wish. This is something with which we are struggling to grapple. We treat them like they are a public space, but they are not. They are private commercial spaces.

One of the reasons we struggle to understand the extent of the phenomenon that we are concerned about is that the platforms do not provide access to data that would allow us to research it. In terms of the way they implement community standards, the research suggests it is inconsistent. We saw this in the wake of the riots at the US Capitol, where platforms wanted to take various actions. Some of them were in compliance with their community standards. Most of the time, they did not even reference their community standards or terms of service. They just took the actions they wanted to take. There is no pushback against that.

It is a larger, blue-sky academic question. I hope that one thing that comes out of the Future of Media Commission is a decision on what we want from these online public spaces and how citizen participation can be embedded into them. In Ireland, there is a proposal for the development of social media councils involving various independent experts and citizens who would engage in discussions on what types of content and behaviour are suitable online and what we want from these spaces. Platforms might participate in those discussions.

All of these conversations are in their early days, but the question about the future is a major one and we cannot sleepwalk into more of these mistakes. Significant discussions need to be had.

Very true.

Professor Conor O'Mahony

Due diligence is emphasised by the UN Committee on the Rights of the Child, CRC, as an obligation that should be imposed on service providers. This point was emphasised in an earlier general comment in respect of children's rights in the business sector and has been re-emphasised in terms of the digital environment specifically in the CRC's most recent general comment.

Head 50A of the Bill makes reference to requirements on service providers to undertake actions like risk assessments, but it does not specify that such risk assessments should include an examination of risks posed to children's rights in the delivery of their services. The important point to emphasise in this regard is that children's rights are still a relatively emerging concept in some ways and the concept's buy-in by society at large and the business sector is not as strong as it could be. Children's rights are a more detailed and involved concept than just the child's right to protection from harm. The CRC's general comment No. 25 on children's rights in the digital environment spells out a range of different rights that children have that are implicated in online activities. If we are to ensure that these rights are protected in the digital environment, the CRC says that we need to impose obligations on service providers to assess how the delivery of their services will impact across the range of children's rights and what risks will be presented in that regard. My fear is that, when head 50A refers simply to a risk assessment without making any reference to what kinds of risk need to be covered by that assessment, children's rights could be left behind in the process.

Dr. Karen McAuley

I am sorry, as I had a difficulty with my sound and did not fully catch the Chair's question. Would she mind repeating it?

I will summarise. It was about how to address in the Bill the speed of removal of harmful content and the lack of consequences and obligations in respect of such content.

Dr. Karen McAuley

Does the Chair mean the consequences for children or services?

For services.

Dr. Karen McAuley

The Chair might interrupt me if I am speaking off point, but provisions are made under head 54A for prospective sanctions for non-compliance with commission directions. As an organisation, we are not best placed to comment on the prospective efficacy of those sanctions. We would like to think that they might work as a deterrent.

We welcome an important aspect of the Bill, although it needs further development, namely, the focus on prevention and early intervention, on trying to reduce children's exposure to harmful content, and on putting in place a system that understands what that means and is able to deal with it appropriately in the interests of children.

I wish to tease out some of those issues, as they will be crucial questions around an individual complaints mechanism. We would be concerned if such a mechanism were part of the media commission yet the regulator had to deal with complaints in the first instance. In those circumstances, we will be looking at drawing up codes and setting a statutory duty of care on some of the platforms. What do the witnesses believe we should include in that duty of care? Should there be variations depending on the type of platform? For example, Facebook and Twitter are very outward facing. Should they be subject to different expectations from WhatsApp or a private messaging group? The Chair posed some questions to Dr. McAuley about sanctions. Sanctions are crucial in this regard. There is also the question of remedies. I would be interested in hearing the witnesses' views on same. For anyone, but children in particular, what remedies do they believe could be offered?

My final question is a general one and gets to the crux of much of this debate. Do the witnesses consider a number of these social media platforms to be platforms or publishers?

Professor Conor O'Mahony

There were many questions and I might not get to all of them. On the question of sanctions, I echo Dr. McAuley's point about their use as a deterrent. There is that value to them. It was mentioned that some of the sanctions seen in other jurisdictions were very much at the higher end. If those were put in place in Ireland and providers realised that the consequences could be significant financially if they fell foul of them, we would create a strong deterrent.

Deterrents, by their nature, will never catch everything but they have an important role to play in focusing the mind of certain providers in respect of the nature of their obligations.

On the question of remedies, the importance of a remedy is emphasised in the international guidance from the UN Committee on the Rights of the Child and the Council of Europe. The idea that there should be an individual remedy for the individual child is stressed by both organisations. Financial compensation is obviously one version of that. The idea is that where a service provider has failed in its obligations and children have suffered harm as a result, there would be a possibility of redress on foot of it. This serves to have a deterrent effect in the first instance, because there is potentially a further financial consequence for the service provider, and it provides a degree of redress for the person affected. The idea of supports for affected children has been mentioned several times, particularly by Dr. Milosevic. From a children's rights perspective, where children do suffer harm, particularly the very serious harm that can occur online, providing them with the necessary therapeutic services and supports is a really important part of the picture. The reality in Ireland is that, across the spectrum of ways in which children suffer harm, we do not always provide very good services to children who experience sexual abuse or other such forms of harm. Therefore, a lot needs to be done to improve those services. That applies in the digital environment as well as in the everyday world. Perhaps fines or damages awarded against service providers that fall foul of their obligations could be channelled into supporting those kinds of services or access thereto. Perhaps this could be explored further.

Dr. Eileen Culloty

On the question of platforms and publishers, the debate has been occurring for a long time, but it may miss the nature of what is happening. Early Internet regulation defined what we now call online platforms as something akin to a telephone line provider, which provides a service and is not responsible for what people say over the phone. That is perfectly reasonable. The platforms we have now, however, are clearly a lot more involved than a telephone line. Therefore, they are more than just neutral conduits. They are also not publishers in the traditional sense of media broadcasters or newspapers. From regulatory and policy perspectives, we have not really grappled with the fact that we have a new type of entity with enormous influence.

On the point on media pluralism, when we talk about pluralism in the media market, we mean media owners. We should also think about pluralism on the Internet, because the Internet is a lot bigger than the handful of platforms that dominate the online world. It is lax regulation internationally that has allowed those platforms to buy up competitors and new entrants to the market. They are fundamentally anti-competitive and create the impression that a handful of platforms own the Internet. They do not. There is an important role for regulators internationally, not just in Ireland, to push back against that.

I have just one question. It may not be all that relevant but I will ask it just in case. A number of years ago, there was a considerable conversation on the digital age of consent. The Ombudsman for Children had a strong view on being realistic about children's Internet usage. Should we keep that in sight in respect of this Bill? We have one of the highest digital ages of consent, at 16. Is there anything that would concern the delegation about that in regard to this Bill? I have not fully thought out my question.

Dr. Karen McAuley

I thank the Senator for his question. As he knows, the digital age of consent is provided for under the Data Protection Act. There is an attendant provision within that Act to the effect that the age will be reviewed three years after coming into effect. Therefore, there is an obligation to have a review of the current digital age of consent. Based on the timeline, that should be initiated this month. We have not heard any details on when it is going to start but, under the legislation, it should be initiated this month. We look forward to the discussion and associated developments. It will be interesting to see the extent to which the digital age of consent, set at 16 in Ireland, has been monitored and evaluated in terms of its impact. We believe that is important in order to support and strengthen the debate on the issue.

On the question of the relationship between the digital age of consent and this legislation, I am not sure I have an answer. Suffice it to say that, ultimately, the digital age of consent is in the GDPR and our Data Protection Act as a protective measure. It is one of several measures that acknowledge the additional vulnerability of children and the need for special measures to ensure their protection. In that respect, there may be a connection. In this Bill and in that measure, we are seeking to protect children.

Professor Conor O'Mahony

I will not add very much to that other than to stress that, from children's rights and international human rights law perspectives, there is a well-established principle that any measures that have the effect of restricting children's ability to participate in online activities must be prescribed by law, necessary and proportionate. Therefore, something like the review of the digital age of consent would have an important role to play in determining the extent to which it complies with those principles. We have to recall that while children have a right to be protected from harm, they also have rights to express themselves, participate in society and engage in play and social activities online. It is a tricky one; there is a balance to be struck between the two.

Dr. Tijana Milosevic

There is an important point to be made on the issue of privacy versus safety. From our research, many parents, but also children, are not quite aware of why the digital age of consent, 16, is related to privacy. There seems to be an idea that it is about safety and driven by safety legislation. It is a distinct issue. Media literacy education is required so that parents and children will understand what this is all about and what the digital age of consent means. We know from legislation in the United States and research from ten years ago that parents were helping underage children to get on social media platforms if they believed they were supposed to be on them, even if they were under the allowed age. Many of them did not know why the provision existed in the first place. Many of them believed it was about safety, not about privacy. It is extremely important to understand that it is actually about privacy. It was not intended as a safety measure, although it has been used in that manner. This leads to the issue of balancing the right to protection, on the one hand, and the right to provision and participation, on the other. It is important to have education in that regard.

My question is for Professor O'Mahony. He mentioned that financial harm was not covered in the general scheme of the Bill and discussed the statistics on the number of teenagers who engage in gambling annually. I would like to raise the importance of definitions of concepts such as pornography and gross or gratuitous violence. Does he have any comments or recommendations in that regard?

Professor Conor O'Mahony

The recommendation in that regard is that there needs to be a definition. One of the key guiding principles is that the legal and regulatory environment needs to be predictable. If concepts are left undefined in a way that allows for a huge amount of subjectivity concerning what falls within a definition — the term "pornography" is notorious in legal terms in that respect — people will often not agree on what comes within the meaning of a term.

It is a matter of trying to comply with the requirement of clarity and predictability for people. I do not want to offer a particular view on how that definition should be phrased. However, it would be better to have a definition of key concepts that determine what is or is not captured by the Bill rather than leaving terms that are known to be subjective undefined in a way that invites disputes.

Does anyone else want to comment on Dr. Culloty's observations? It seems members are happy that they have got to ask all the questions they wanted to ask.

I will take this opportunity to thank Dr. Culloty, Dr. McAuley, Professor O'Mahony and Dr. Milosevic for their presentations. They were comprehensive and interesting. The witnesses have given us a new body of work to do in terms of the Bill. There is nothing the witnesses have suggested today that any of the members would be in disagreement with. My thanks to the witnesses for their input. It has been impressive and helpful. We will continue our work with the Online Safety and Media Regulation Bill tomorrow with representatives from the Irish Society for the Prevention of Cruelty to Children, the Children's Rights Alliance and CyberSafeKids.

The joint committee adjourned at 1.50 p.m. until 12.30 p.m. on Thursday, 13 May 2021.