
Joint Committee on Tourism, Culture, Arts, Sport and Media debate -
Wednesday, 7 Jul 2021

General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

This meeting has been convened in the context of the committee's continued pre-legislative scrutiny of the general scheme of the online safety and media regulation Bill.

I welcome the following witnesses, who will be joining the meeting in committee room 3 remotely via Microsoft Teams: Mr. Ronan Lupton, barrister-at-law; Ms Una Fitzpatrick, director of Technology Ireland; and Dr. Pierre François Docquir, head of media freedom at Article 19.

The format of the meeting is such that I will invite witnesses to make opening statements, which will be followed by questions from members of the committee. As the witnesses are aware, the committee may publish the opening statements on its website following the meeting.

Before I invite the witnesses to make their opening statements, which are limited to three minutes for each organisation or witness attending, I wish to advise them of the following in relation to parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. Therefore, if their statements are potentially defamatory in respect of an identifiable person or entity, they will be directed to discontinue their remarks. It is imperative that they comply with any such direction. As the witnesses are attending remotely from outside the Leinster House campus, I ask them to note that there are some limitations to parliamentary privilege and as such, they may not benefit from the same level of immunity from legal proceedings as a witness who is physically present does. Witnesses participating in this committee session from a jurisdiction outside of the State are advised that they should also be mindful of their domestic law and how it may apply to the evidence they give.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him or her identifiable. I remind members of the constitutional requirement that they must be physically present within the confines of the place which Parliament has chosen to sit, namely, Leinster House or the Convention Centre Dublin or both, in order to participate in the public meeting. I will not permit a member to attend if he or she is not adhering to the constitutional requirements. Therefore, any member who attempts to attend from outside the parliamentary precincts will be asked to leave the meeting. I also ask members to identify themselves when contributing for the benefit of the Debates Office staff who are preparing the Official Report. Microphones must be muted when not contributing to reduce background noise and feedback. I ask members and witnesses to use the raise-your-hand function on their screen if they want to contribute. I remind those participating in the meeting to ensure their mobile telephones are on silent mode or switched off.

That concludes the housekeeping. I hope it is clear. I invite the first witness, Mr. Ronan Lupton, to address the committee.

Mr. Ronan Lupton

I thank the committee for the invitation to address it today and to give some evidence in relation to an area in which I have acted and worked for approximately 23 years. My background is set out in full in my written submission of 19 March 2021. The evidence I am giving today is simply as an independent legal practitioner, not on behalf of the Law Library or Alternative Operators in the Communications Market, ALTO, which is a telecoms group that I chair. I was privileged to have been appointed to the internet advisory board, the Internet Safety Advisory Council, ISAC, which now operates as the National Advisory Council for Online Safety, NACOS. Many members of the committee will be aware of the online child safety advisory role that NACOS undertakes under the auspices of the Office for Internet Safety, which resided previously within the Department of Justice but is now part of the Department of the Environment, Climate and Communications.

Members will have seen my written submission of 19 March 2021, which I was invited to make. I took a very legalistic approach in that submission, suggesting that we should get on with the business of the audiovisual media services directive, AVMSD, and perhaps forget or park aspects of the online safety and media regulation, OSMR, national legislation until such time as the European law in respect of the Digital Services Act and the Digital Markets Act is concluded. I have moved away from that position, having followed the business of the committee over the last number of weeks. I have been paying attention to the submissions being made by various parties. I understand the position that the committee is now in and fully appreciate it.

In my opening statement, which is a separate document, I have set out five key points on which I want to give evidence today. Hopefully, I can get through them in the next two minutes. The first is the issue of the forthcoming European law and making sure that the committee is fully mindful, as I know it is, of the flight path and dual track that is happening in respect of amendments to the area of online safety and the general e-commerce amendments that are planned centrally in Europe. That legislation is coming at us like a train. In terms of the OSMR and the pre-legislative scrutiny, we must bear in mind that some unpicking may have to happen if we go ahead and legislate for many of the developments within the heads at this stage. There are some examples but I do not intend to go into them at this particular time. The second point is that I welcome the establishment of a new media commission. It is quite clear that it needs to happen. We do not need to spend more time on that.

The issue of online safety is a massive one for everybody on the committee and every individual living in Ireland, but it is something that has effectively been subject to self-regulation for quite a number of years. That is no longer adequate, given the nature of the environment we operate in, the pervasive nature of the Internet and our duty and responsibility as legislators, lawyers and so on. From that point of view, I have to also welcome - and I do so quite clearly - the initiatives and the basis of the legislation, as drafted. I wish to highlight a point which I know Senator Malcolm Byrne and others on the committee have raised before, namely, the issue of civil versus criminal law regulation and how the two interwork and operate. It is obviously a very serious point. Members will see, in my opening statement, that I make the delineation between a criminal matter, as something that one reports to the Garda, and what we can do in terms of regulation and making sure that complaints are properly handled and dealt with.

Finally, the committee administrators asked me to deal with the issue of levies. I have done that by comparison to the ComReg telecoms levy. It may not be that the ComReg telecoms levy is an appropriate model, but I have set it out so members can at least deliberate on it. I have concluded, in my opening statement, that it is really a matter for the new media commission to consult upon how levies will operate in the future.

That is the sensible approach to take at this point. Those are the five areas I have prepared. The committee will have seen in my opening statement that I have carefully considered the evidence of Dr. T.J. McIntyre of Digital Rights Ireland and Professor Conor O’Mahony to the committee along with their written submissions. I would associate myself with everything they have submitted and filed in written form. That is it for me. I will happily address any questions, queries or comments the committee has afterwards.

I now call on Ms Fitzpatrick to address the committee on behalf of Technology Ireland.

Ms Una Fitzpatrick

I thank members of the committee for inviting Technology Ireland to participate in these hearings as it conducts its pre-legislative scrutiny of the online safety and media regulation Bill. Technology Ireland is an association within IBEC that represents the ICT, digital and software technology sector. Technology Ireland has been a member of the National Advisory Council for Online Safety since its establishment in 2018 and continues to participate in and support the council. Technology Ireland has consistently expressed support for the overall goals of the proposed online safety and media regulation Bill, which should provide for a systemic approach to regulating online platforms and digital services.

I will not cover the full extent of the Technology Ireland submission to this committee in the three minutes available to me today. I will instead focus on four key points. First, this Bill seeks to transpose the EU audiovisual media services directive into Irish law. This important legislation governs EU-wide co-ordination of national legislation on all audiovisual media and further enshrines the country of origin principle, which is of particular importance to Ireland given the number of video-sharing platforms established in the State. Since the deadline for the transposition of the audiovisual media services directive into national law has already passed, Technology Ireland members encourage the swift adoption of this Bill to ensure legal certainty in the regulation of video-sharing platforms across the EU.

The second key point concerns the media commission and the establishment and appointment of the regulator. The rapid establishment of a fully resourced and staffed regulator is of crucial importance. It is essential that the commission is staffed with sufficient in-house expertise to obviate the need for external consultants, reliance on whom could result in uneven and inconsistent decision making. In our submission, we call for the rapid establishment of an online safety commissioner and the prioritisation of existing EU online safety law.

The third key point concerns sanctions for non-compliance and senior management liability. Technology Ireland members believe that administrative financial sanctions should be limited to the most serious, repeated and systemic cases to ensure that systemic failures are penalised rather than isolated individual ones. Technology Ireland believes that the inclusion of senior management liability could have a detrimental effect on prospective investment into Ireland. Such clauses are also unlikely to be practical in terms of proof of liability and as such could undermine the credibility of the Act. Inclusion of senior management liability could also create an international precedent, which could be utilised by more oppressive regimes seeking to pressure management of media outlets without the checks and balances present in Irish law.

My fourth and final point is on the designation of relevant online services. Technology Ireland supports the two-step process by which a targeted list of in-scope services is set out in primary legislation. The regulator would then have a duty to designate individual services or categories of service for statutory regulation based on an objective and evidence-based assessment of risk to Irish consumers and in consultation with the service provider. This will ensure that providers have maximum certainty as to which services are in scope and that the regulator focuses its resources where the public interest and risk are greatest. To conclude, I reiterate that Technology Ireland is strongly supportive of the overall objectives of the online safety and media regulation Bill and I look forward to today's engagement and responding to members' questions.

I now invite Dr. Pierre François Docquir, who I understand is joining us all the way from the UK, to make his opening statement.

Dr. Pierre François Docquir

I am actually joining the meeting from Lisbon, Portugal. I thank members for the opportunity to speak before the joint committee. Article 19 is a global freedom of expression non-governmental organisation. We work for a world where all people everywhere can freely express themselves and actively engage in public life without fear of discrimination.

Article 19 acknowledges that part of the content that circulates on social media platforms can contribute to undermining safety and civility and we also acknowledge that preventing social harms linked to the circulation of online content is a worthy objective. It is a very difficult issue, however, especially with categories of speech that are, as the expression goes, "lawful but harmful". As our colleagues from the Irish Council for Civil Liberties, ICCL, have set out in their submission to this committee, definitions of categories of "harmful content" are highly problematic from the perspective of freedom of expression. Dealing with each category of social harms is a complex matter that requires many different perspectives and different forms of expertise. It cannot be expected that social media companies would take the place of therapists, social workers, researchers, media literacy experts and others.

We believe that bringing together the different stakeholders concerned with content moderation in a transparent and participatory forum could provide an effective approach to the external oversight of social media platforms. Such a mechanism could ensure both the protection of fundamental rights and the effective regulation of problematic content. This is why, as part of Article 19's efforts to promote freedom of expression online, we have developed the idea of a social media council, which is a model for a multi-stakeholder mechanism of voluntary compliance that would oversee content moderation practices on social media platforms on the basis of international standards on freedom of expression and other fundamental rights. The social media council would have the power to review individual content moderation decisions and to elaborate general recommendations for social media platforms.

During our discussions on the social media council model, one of the concerns that emerged as particularly important is that the oversight of content moderation practices should be well informed of the linguistic, political, historical, cultural and social dimensions of the context of a content moderation dispute. This is why Article 19 considers that, where possible, social media councils should be created at the national level. We have submitted the idea to a broad range of stakeholders in Ireland and on the basis of the interest of the majority of the people with whom we spoke, we have initiated a process that could lead to the creation of the first social media council. I want to make it very clear that Article 19 proposes to facilitate a process and does not plan to operate or control the Irish social media council.

We believe that the social media council could work well to complement a future public regulatory authority. In our written submission, we made suggestions for how the social media council could fit within the future legal and regulatory framework for social media platforms in Ireland. I am very happy to respond to any questions members may have.

I thank all the witnesses for their very interesting and insightful contributions.

I thank all our witnesses not just for their presentations but for their submissions, which have been highly informative. I have a question for each of the three witnesses. In response to Mr. Lupton, we will be constantly amending legislation in this area as the Digital Services Act and Digital Markets Act come down the track.

In considering this, a crucial question will concern the thresholds for complaints, particularly if we introduce an individual complaints mechanism, and how we should go about defining online harm.

Ms Fitzpatrick argued that we should not make senior management liable, an issue on which I am not going to take a position. She will be aware that in workplace safety legislation and through the work of the Health and Safety Authority, we make senior management liable in the case of serious breaches of workplace safety legislation. How is that to be differentiated from the circumstances we are discussing, where senior management knowingly fails to take action where serious online harm is done?

My final question is for Dr. Docquir. I like the idea of digital advisory panels and so on but the only example of that globally at present is the Facebook oversight board. What are his views on that and on Facebook maintaining that the board is independent? I would question that but I would be grateful to hear Dr. Docquir's views. If we were to establish a panel here, particularly on a statutory basis, how might it operate?

Mr. Ronan Lupton

I thank the Senator for his question. He gave me the elephant trap question of how to define harm, and I appreciate that greatly. The situation is extraordinarily complex and if we borrow from the UK jurisdiction in terms of the efforts to define precisely what harm is, we will end up spending a serious amount of time, work and effort trying to find that definition.

From my point of view, there are two fundamental aspects we as a society and as legislators need to think about. The first is harm that is purely of a criminal nature, and we must ensure the Statute Book is tooled up and equipped to deal with criminally harmful content. Online-offline dynamics, which under the heads of the Bill comprise the dissemination of information, intimidation, threatening, humiliating, persecuting and those types of behaviour, have been called out in various witness evidence as being somewhat vague.

The difficulty is that those definitions can be brought to the nth degree and it will take years trying to codify them and to set down what they are and need to be. It is a very difficult question to answer with any surety, but on one side - the important side - it is only when the code of conduct on criminal behaviour is dealt with properly on the Statute Book that it will become slightly easier to legislate for the civil harms and to give the job of regulation to a regulator, whether that is a digital safety commissioner under the new commission or otherwise, and we are not there yet.

As I mentioned earlier, while we are lucky to have Coco's Law passed and on the Statute Book, we are deficient in Garda training, for example, and in having the powers necessary on the Statute Book to prosecute the egregious criminal harm that is going on. We have then to look at what the civil side is and how to get there. The efforts in the Bill are a starting point but the committee's deliberations will have to come to a certain point and members will have to try to draw a line under where the Bill can go. The issue of the offline versus online world and the experience on the civil side-----

I apologise but I will have to stop Mr. Lupton there. There may be an opportunity for a second round of questions in which we can develop some of these points further. To be fair to all our guests and members, we will try to get everyone in.

Ms Una Fitzpatrick

On senior management liability, the call from Technology Ireland members is for much greater clarity on when that may arise. If there are systemic, repeated and serious offences that are not being addressed by the company and senior management, there should be a process. Many of Technology Ireland's members intend to engage, and already do, with authorities to address issues. We would like further clarity on what the process would be and on the criteria for any of those sanctions to come into play. It is still a bit unclear as to when they may kick in, and that is the point on which our members would like clarity.

Dr. Pierre François Docquir

The Senator is absolutely correct. The Facebook oversight board is the only mechanism today that exercises external oversight of a social media company. At Article 19, we think it is an interesting experiment but it will take some time before we can make up our mind about its effectiveness. The oversight board is for Facebook only, whereas the social media council, SMC, would be an industry-wide mechanism. The oversight board applies Facebook's community standards, whereas we propose that the social media council should work on the basis of international standards regarding freedom of expression. Finally, the oversight board is a global mechanism of 20 people who are supposed to take care of the entire planet, or almost, given Facebook's size. By contrast, the social media council in Ireland would be for Irish cases only and would be operated by people who speak the language and know the context very well. We think there is something key in ensuring that local voices can have a say in the oversight of content moderation problems that affect their society.

I thank Mr. Lupton, Ms Fitzpatrick and Dr. Docquir. My first questions are for Ms Fitzpatrick. Would it be beneficial to include in the Bill a clause setting out a specific date for the immediate appointment of a regulator? She stated that the inclusion of senior management liability could undermine the credibility of the Bill. Who then should be liable within the Bill?

Dr. Docquir outlined the idea of setting up a national social media council. How would its members be chosen? What main role does he foresee the social media council having in framing individual content decisions and in protecting freedom of expression and other fundamental rights?

While I appreciate that Mr. Lupton indicated he did not want to answer this question, he spoke about two European Acts, namely, the Digital Services Act and the Digital Markets Act. He stated that a significant legislative review will be required, along with the possible unpicking of a section of the Bill. In his expert opinion, will he outline any reservations he has about the European Acts? How could they outweigh or possibly interfere with the Bill before us? He went on to state that the establishment and function of any content levy scheme is likely to be highly complex and may result in unintended consequences. What unintended consequences was he referring to?

Ms Una Fitzpatrick

I thank the Deputy for the questions. Our members would absolutely support the inclusion of a start date for the media commission. When this was initially proposed, Ireland was seen as being in the vanguard and taking a leadership role. The membership and the companies based here were really supportive and grateful for that proposal.

On who should be liable, as in my earlier response, it is about a process. It is about highlighting that in some cases it is the providers of the content, and not the carriers of the content, who should be liable. For ongoing, systemic issues, however, if there is no engagement or resolution of an issue, of course there may be liability issues.

I really think there has to be a process to get to that point. If it is a once-off incident and there have not been any prior issues then perhaps that would be seen as being unduly harsh. Obviously, if this was an issue that had been arising repeatedly over a period and there had been efforts to engage with a company which was not engaging, then in those circumstances, the process would kick in. It is really just to clarify that a process needs to be put in place around that sanction.

I thank Ms Fitzpatrick. The second question goes to Dr. Docquir.

Dr. Pierre François Docquir

I thank the Chairman very much. I thank the Deputy for the questions.

In terms of the appointment mechanism, the social media council is a multi-stakeholder mechanism. We suggest that the different categories of stakeholders, by which we mean social media platforms, media companies, the advertising industry, academia and civil society organisations representing the diversity of society, should appoint one member each, or two members for some of these categories. This is detailed in our written submission. In addition, a number of independent directors and members would be chosen by the first set of directors. From the conversations we have had so far with our interlocutors in Ireland, this seems to be a governance structure that makes sense.

In terms of the role of the social media council for the appeals mechanism and how it would impact content moderation decisions, we suggest there should first be an attempt to deal with the problem internally with the company whose platform is concerned with the content moderation dispute and then the case, if not solved internally, could be brought to the social media council.

The social media council would have the capacity to choose the cases it wants to focus on to avoid being drowned in too many cases. On the other hand, however, every person should also have the possibility of sending in a request. A filtering mechanism would, therefore, have to be fine-tuned in the constitution of the social media council.

The decisions of the council will not be based on companies' terms of service or community standards. They will be based on international standards on human rights, which is where the protection of freedom of expression and other fundamental rights comes into play. The decisions of the social media council would be binding on the companies within the framework of a voluntary compliance mechanism, of course. Nonetheless, companies that take part in the mechanism would commit to executing the decisions of the social media council. That is how we suggest things could work and then reach a result that would combine the protection of fundamental rights with the effective regulation of problematic content.

Mr. Ronan Lupton

In terms of the Digital Services Act and the Digital Markets Act, the main legislation we will need to worry about is the Digital Services Act. What that will do is fundamentally modify the e-Commerce directive, which we transposed into Irish law under S.I. No. 68 of 2003. That directive provides a number of protections for online platforms, which are really the nuts and bolts of what this Bill needs to fix, in other words, how one gets content taken down and reported and so forth.

My main point is this. If we are at pre-legislative scrutiny at this stage and we have a large item of legislation such as the Digital Services Act at least being drafted and put out there for the European processes to take effect, what is happening is a dual track. We are operating on the online safety and media regulation Bill, which will fundamentally put in national criteria for dealing with content online. As that will all shift, we are working on the current legislative basis in terms of how the platforms operate at this stage.

My main point initially was to ask why do we not just get on with the AVMSD, park the rest of it and see what comes out in terms of the Digital Services Act? That is not really practical when we think about what could go on online in the interim.

The key question Deputy Mythen was really getting at was, therefore, what the differences are and whether we can identify them. I would need much more time to do that but I know from the written submissions the committee has received, particularly from the platforms, that much of the information is broken out to show where those issues arise, fundamentally with regard to the defences around hosting. In other words, is the platform a publisher? The answer is invariably "No" but what are their duties in terms of taking down deleterious, criminal and questionable content? If we are going to have a situation where a regulator has certain powers under national legislation which fundamentally conflict with what the Digital Services Act ultimately ends up legislating for and dictating, that is the type of scenario we need to be careful about. Ultimately, it will mean possibly unpicking, which is really where I am going in terms of my very clear one-word submission in the opening statement. I want to make sure that we do not inefficiently use our time and that of the Attorney General's office and the various parliamentary draftspeople.

That is not to the detriment of society, it must be said. I would never say that. Ultimately, we would be doing something but the question is, how is it going to play out? Ultimately, European law will be superior to what we do. We will, therefore, have to retrofit it and that is a key issue. It is difficult in the sense that, yes, the AVMSD is there, we are late on it and need to do it and therefore, let us get it done. There is that aspect of it.

I have, however, been involved in reports in this space, in 2014 and 2017, through the Internet Content Governance Advisory Group and the Law Reform Commission. I agree with the conclusions of both, even though they are different. It is, however, really that point of whether we can look at the future and see what is going to be there and maybe try to plan for it. It is very difficult to do so but hopefully that at least seeks to address some of the questions Deputy Mythen had.

Can Mr. Lupton comment on the levy context?

Mr. Ronan Lupton

The difficulty with that is that the model I have given the Deputy looks at taking one fifth of 1%, that is, 0.2%, of relevant revenue. As we know, many companies that may not be located here would be captured by that, so some thinking may need to be done in respect of how that operates. Is it broadcasts that are made available in this jurisdiction? Ultimately, the model I have given assumes national location, national accounts filing and so forth, which may be something that a regulator considers in part. From that point of view, therefore, it was just a suggestion. There will have to be consultation, however, not necessarily limited to Ireland but maybe pan-European consultation on how that ultimately works.

One observation, however, and I think it is quite clear that the Deputy has heard it before, is that many member states have not gone for this model or at least have reservations as to how it will work. It needs, therefore, to be consulted upon widely by whatever form that commission takes when it is appointed.

We will move on. Senator Carrigy may go ahead.

I thank the Acting Chairman and welcome all our guests. I have a couple of points and questions, some of which have been spoken about. Perhaps the witnesses could use the time to expand on them.

Dr. Docquir spoke about the social media council. To be honest, that is a fantastic idea. It is in his submission that Ms Irene Khan, UN special rapporteur, has recommended that it be created. I believe it should be done Europe-wide and that the criteria and membership should be set out in order that every European country would then have it in place. Does Dr. Docquir wish to expand on what he has already said? He answered some of the questions I would have asked in his reply to Deputy Mythen. I will not repeat them but he may want to expand on that.

Ms Fitzpatrick, in her presentation, made a point that the commission should focus its attention on governance and regulation of the providers of content and not the carriers. What is she proposing then should be in place for anyone who would share or carry content that was illegal? Is she proposing that the commission would have any say in that at all?

I thank Senator Carrigy. The first question was to Dr. Docquir.

Dr. Pierre François Docquir

I thank the Senator very much for the question. We see the social media council as something that should exist at the national level because we want local voices to have a say with regard to the oversight of content moderation cases. The European Union is a union and it is something I personally find fantastic, to be honest. There are many differences from one country to another in terms of languages and culture. Having a detailed understanding of the context of any case of regulating speech is absolutely key to reaching the appropriate decision.

In our view and in our proposal, we see the social media council as existing at the national level. It may be compared, for instance, with press councils, which also operate at the national level in respect of the national press industry. At the EU level, in a future situation whereby social media councils would exist in a number of European countries, it would make sense and be necessary for these councils to co-ordinate with one another. As well as this, they would have some sort of European association, where they could deal with a number of common concerns and make sure that they work together in a consistent manner. The social media council could also find ways to co-operate with Facebook's oversight board to provide the local anchorage that the Facebook oversight board, being a global body, does not have.

Ms Una Fitzpatrick

I thank the Senator for the question. We advise that the media commission focuses attention, with regard to governance and regulation, on the providers of content and not on the carriers of that content. That refers to both traditional and online content. We are saying that when the media commission comes into place, it should establish how it will look into harms, both online harms and other harms. It should set out criteria for dealing with those people who put up the content, rather than the carriers of the content. The carriers will have a role to play if they do not engage in the process of removal or similar steps. In the first instance, though, we would focus on those who create the harmful content, as opposed to the carriers. The carriers obviously also have a role. We would like the media commission to clarify the process there, in consultation with all stakeholders, and to work with all parties involved.

Would Mr. Lupton like to contribute to those comments?

Mr. Ronan Lupton

The only point I wanted to make is a thread that has run through all contributions this morning. In Ireland, we are in the lucky position where we have the constitutional right to freedom of expression. This has been pointed out in many submissions to the committee. This is codified under Article 40.6.1°.i of the Constitution. As an adjunct to that, there has been case law through the courts that deals with online rights. Those rights have been vindicated by the courts. Some people bristle at the mere concept of that. However, picking up from Dr. Docquir's point on this supervening European law, the charter, and the convention aspects, we are in a unique position. We should bear that in mind. We are a fairly liberal society. We have this in what is an old Constitution at this stage. It mentions the organs of public opinion, the press and the media and so forth, but also the right of the individual to communicate. We need to be careful, both as legislators and citizens, not to impinge or trample on those rights. This point on how we do things is a common thread, in line with Dr. Docquir and Ms Fitzpatrick. Liability for carriers and platforms is going to be regulated, as I said in my earlier answer to Deputy Mythen. It will be European law. That is where that unpicking may have to occur, if we go slightly too far in our national legislative instrument in Ireland. We need to be aware of what is out there in advance. That is my contribution. I know that I was not asked a direct question by Senator Carrigy but I wanted to make a contribution.

If any witnesses have not been directly asked a question but wish to contribute, they should just raise their hand. I thank Senator Carrigy and we will move on to Deputy Munster.

I thank the Chair. I have a couple of questions for each of our witnesses. First, Dr. Docquir proposes a voluntary system. Does he believe a voluntary system like the one he proposed would be sufficient? What if companies chose not to sign up to it? Without any statutory footing, what power would the system have? Also, who would fund it? Maybe he has clarified this already. Would providing for an individual complaints mechanism in the Bill negate the need for a social media council?

Second, Mr. Lupton mentioned Professor Conor O'Mahony's submission. Can he comment on the proposed individual complaints mechanism? Does he have an opinion on the need for the individual complaints mechanism that was recommended by so many of the children's rights stakeholders? Professor O'Mahony cited it as the most serious omission from the Bill from his point of view. Could Mr. Lupton please comment on that?

Third, Ms Fitzpatrick mentioned in her opening statement - and others made reference to this - that only the most serious, repeated and systemic cases should be penalised. Does she come from a position of protecting the tech giants from any significant responsibilities or liabilities? She will understand the objective of the Bill and what we are trying to do. Yet, she is also against any senior management liabilities. She said that it may have a detrimental effect on investment in Ireland. Is it her position that tech giants should be immune from oversight or responsibility, in case they move their business elsewhere? That would be tough to swallow. We all know why they are here. It is to pay less tax. I am, therefore, curious about that.

Dr. Pierre François Docquir

I thank the Deputy for the questions. I will start by commenting on the funding. In our view, it is key to make sure the funding of the social media council does not undermine its independence. There are various possibilities for sources of funding. One is contributions from the social media platforms. After all, as they would also benefit from the service that the social media council would provide, they might as well bear part of the cost. Another source of funding, as for press councils, might be philanthropic in origin. Another option that we discussed is that there might be a contribution from the future regulator, if the social media council itself has been contracted to operate an individual appeals mechanism on behalf of the regulator.

The Deputy’s second question points to the coexistence of different regulatory approaches. One is a statutory approach with laws and the possibility of courts imposing sanctions. The other model is voluntary compliance, which does not come with the strength of the law. At this particular stage of history, a combination of the two approaches to regulation of social media platforms might be effective. The danger with a legal regulatory approach is that with the risks of sanctions and fines, companies might at some point lawyer up. This might lead to long legal proceedings, before we can see the effect of regulation coming into practice. A voluntary compliance mechanism can be more flexible. It can offer a space which, in my most optimistic moments, I would call a "space for co-learning", of what it means for social media platforms to be regulated externally. Recognising a social media council in the law might prove to be an interesting incentive for social media platforms to join the social media council and to play fairly within that voluntary compliance mechanism. That is the sense of our proposal.

Mr. Ronan Lupton

As for the rights of the child, the UN Convention on the Rights of the Child, UNCRC, is quite clear. We are obliged to take into consideration the youngest and most vulnerable in society and ensure their voices are heard and their rights vindicated. A good example of consultation taking place was that with the Data Protection Commissioner on the digital age of consent aspect of that legislation. I agree with Professor O'Mahony in terms of his submission on that.

I have a couple of issues with the idea of an individual complaints mechanism, the first of which relates to the experience in the data protection space, whereby people send individual complaints straight to the regulator when some of them should first be sent to the person who is perpetrating the breach. We could end up having a regulator or commission that is under-resourced and, if that is so, it will not do the job we want it to do. In examining how something arises online, the first step that should be taken is to report the deleterious or negative content to the service provider, which should assess that content and take steps, although it often will not or will have reasons it might not.

The second step is to report the content to a regulator, which is what the Bill aims to achieve, but the question is what the regulator will do. I believe the regulator should be an advocate for the individual or child whose complaint has fallen on deaf ears, whether it is a criminal or civil complaint. We should build in the necessary mechanisms to allow it to go directly to the social media company, telephone company or whatever it is, and to make the case that this has not been dealt with, without recourse to the courts. That is fundamentally what the correct approach should be.

The Australian approach is a little more detailed and possibly would not suit the Irish environment. That said, while there is room for that model, opening an individual complaints mechanism to the football pitch, if one likes, and allowing everybody to run to the new digital safety tsar, who will be either a member of the media commission or somebody separately appointed, will mean that within six to eight months, he or she will not be able to deal with the complaints as lodged and there will be a difficulty in that regard. It might suit from a constituent point of view to say we now have a Bill for this and here are the legislation and the regulation but if it is not functional, we will have a major problem with it. It will not do what it says on the tin.

The question is how to set the rules of engagement. Should we bring about codes of conduct and put them on a statutory footing in respect of how the behaviour should be engaged with between the content providers and the regulator, with a mechanism for the regulator, acting as an advocate for the complainant, to go to the Garda and do the necessary when it comes to that? It is a vexed question - there are no two ways about it - but both as a citizen and as somebody giving evidence to the committee, I want to avoid ending up with a regulator who, first, may be under-budgeted and under-resourced and, second, may be unable to do the job he or she was put in place to do, as reflected in the legislation.

The question is how to build the rules of engagement and whether we can make those rules of engagement work. It might be that we should take up Dr. Docquir's suggestion for some form of social media council, or a rebranded NACOS, whereby we bring the necessary management from the operators to that council and have a discussion on what should and should not happen and how they are performing, and bring that into law or regulation of some manner. The question of the best route is one the committee may struggle with, but certainly on its face, having some form of individual complaints mechanism, with the correct parameters, is something we need to examine seriously.

Ms Una Fitzpatrick

On the question of penalisation and whether this is for tech giants, our membership comprises companies of all sizes. We have large companies but also medium and smaller sized companies. They are all, basically, of the same mind. We do not want to take unilateral action at an Irish level. We are waiting to see what will come down from the EU because then it will at least be EU-wide. From an implementation point of view, that will make it slightly more straightforward than having each member state go off and take different approaches with different responsibilities.

When I mentioned reputation, that is related to that. Ireland has a really positive reputation in the context of engagement between all the stakeholders and opportunities for collaboration, input and working together. We would like an EU-wide statutory footing regarding penalisation to be put in place, with the mechanisms within Ireland for such issues, whether something like the social media council or something else such as NACOS, where issues can be brought to the table and worked through in a collaborative way, with all stakeholders' perspectives brought forward.

I thank our guests for their interesting contributions. I am just listening and do not need to contribute. I am double-jobbing here.

Very good. The Deputy is multitasking, as always. Fair play to him.

I thank our guests for their contributions. Members' questions have been interesting and thought-provoking. I have two sets of questions, the first of which is for Ms Fitzpatrick and Mr. Lupton. I wish to tease out the idea of the individual complaints mechanism. As Deputy Munster correctly noted, many witnesses who have appeared before the committee and advocate groups, particularly children's rights advocates, have made repeated calls for the establishment of an individual complaints mechanism. Having spoken to these groups, I have to agree with their argument, at least on the face of it. Nevertheless, we have to discuss further the practicalities of it and what resources would be needed. I understand this has been successfully established in Australia and it seems to be working. If it can be done there, I am sure there is some way we can do it here. How could such a mechanism be resourced? Ms Fitzpatrick might touch on the type of volume that applies to individual complaints. Mr. Lupton mentioned that setting up such a mechanism would be challenging but surely there could be an imaginative or creative way around that, given that children's rights activists are calling for it.

Freedom of expression is something we all advocate for. What are Dr. Docquir's views on the existing mechanisms for the removal of content? Does Article 19 believe that some content removal has been in breach of freedom of expression rights? Do social media companies sometimes go too far or not far enough? I refer in particular to what we have seen over the past 12 months in the context of Covid-19 and vaccination programmes. Some very influential Instagram and Facebook account holders, who may have several thousand followers, have gained support but they may be putting out misinformation on the vaccination programme, using terms such as "experimental" and other harmful terminology, and on the pandemic itself, suggesting it is a hoax. That is difficult for me to entertain, given the significant influence and number of followers these individuals have. Where is the balance there? Where do we go too far and where have we not gone far enough?

Mr. Ronan Lupton

Ultimately, the key response to the individual complaints mechanism relates to the rules of engagement. My preference is that any regulator be an advocate for the complainant, but the key query relates to how many complaints come and how they are filtered. That is the real challenge here. We have experience, as I said, with the Data Protection Commissioner and ComReg, where consumer complaints lines - call centres, almost - had to be set up to deal with the volume coming through. While a mechanism to allow complaints to be made is called for, the first port of call must be the social media company, to complain about the content with which the issue arises and make it take action to take down the content within the scope of the law, as it will then be aware of the deleterious content, whether it is misinformation or defamation, criminal or otherwise.

One would then take the steps to report, but the question is whether society or the citizen knows the steps to take and the answer is probably not. We cannot have a situation where people are told to run off to the digital safety tsar and make their complaint, only for the digital safety tsar to say it does not meet the threshold because it is freedom of expression-type content. An example would be a case before the courts recently involving Salesian College. Vulgar abuse is not defamation; it is simply vulgar abuse. You may not like it but it is, unfortunately, a feature of freedom of expression. The idea in that case was to try to unmask the person making the vulgar statements. That did not go ahead and the case has been withdrawn but it was before the courts and there is a written judgment in terms of what occurred. If we take that model, you would normally complain to Facebook first to say "I have somebody abusing me. We do not know who they are. Do something about it". If it fails to do this and to take steps, you should perhaps get a log from that particular platform provider and send it to the social media tsar or commission and ask it to advocate on your behalf, rather than simply going to the social media commissioner or information society commissioner and saying "I have this complaint. You go deal with it. You find out what my problem is and take the initiative." I think that would be a disaster. However, if there are rules of engagement and the platforms buy into them, be it through a code of conduct or some form of legislative narrative that tells them they must engage in this manner, ultimately, it will be cleaner and we will be able to resource it properly and come to some meeting of minds without recourse to the courts to try to take the content down and deal with it in a manner that is proper.

I take the Acting Chairman's point about the misinformation-disinformation dynamic. As I mentioned earlier, unfortunately or fortunately, depending on which side you are looking at, freedom of expression is codified in the Constitution and, unfortunately, people sometimes treat that as a carte blanche mechanism to spread misinformation about vaccines, efficacy and all sorts of nonsense, and they get away with it. That is a deficiency on the Statute Book that needs to be addressed separately to this Bill. I am not saying it should not be in here. Perhaps in or around heads 49A, 49B and 49C, there may be room for codifying what should not come out. Going back to Senator Malcolm Byrne's question about how one defines the harm that takes place in respect of that, it is very different because, ultimately, a regulator would take care of a doctor or lawyer who was misbehaving or doing something he or she should not do and apply a regulatory sanction. However, when you are dealing with something like this, it is an unfortunate fact of life both in the offline and online world that people say things that fundamentally are factually and scientifically incorrect and we simply have to deal with them. The Australian model may go too far in terms of how it does things. It is not dealing with the same level of European law and the dimension of rights, which, as Ms Fitzpatrick mentioned a second ago, will be a ready-to-wear set of legislative principles under the Digital Services Act, which will affect all 27 member states. Of course, we know that Britain has done its own thing anyway and is barking in some aspects. That is my contribution. I will happily pick up on any further questions that come from that thread. I am conscious of time so I will head over to Ms Fitzpatrick.

Ms Una Fitzpatrick

Mr. Lupton has covered a lot of the key points really well. In respect of individual complaints, Technology Ireland has said that it believes such a mechanism would be ineffective and administratively unworkable from the point at which the media commission is first formed. If, say, we have a media commission in place in four months' time, work can begin on looking at what the systemic issues are and where the issues of most risk lie. We are going to start off with a media commission that, hopefully, will have great resources but will probably have some limit on those resources. We should focus those resources on the areas of most harm and risk by doing a risk-based assessment of complaints and concentrating on those that have the most impact and cause the most harm. We have seen the volume of complaints to the Data Protection Commission. It would often say that these complaints are more like customer service complaints than complaints about data protection issues. We want to avoid that overwhelming volume and get to the heart of what the key harms and risks are for people.

Could Dr. Docquir address the question concerning freedom of expression versus harmful content?

Dr. Pierre François Docquir

That is a very broad question that I did not address in my written submission, so I would be very happy to follow up and share some documents by my colleagues in the law and policy team in Article 19, particularly with regard to developments relating to the Digital Services Act. There are three things relating to content moderation by platforms from a free speech perspective and a human rights standards perspective. The first is transparency. There has been progress. Transparency reports have been developed and we know more about what is taking place, but there is a general consensus that they could be improved, for example, by disaggregating data per country in order that we know exactly how many content moderation problems exist on each platform in Ireland. The second is due process. This involves making sure people are informed of the reasons why their content has been moderated and giving them a chance to have a fair conversation with the platform and a remedy to contest the takedown decision. The final thing is making sure that freedom of expression and other fundamental rights are part of the debate on whether a specific piece of content should be allowed to circulate on the platform.

In terms of the question about disinformation, of course, I acknowledge that some seriously problematic content is circulated online. There is no doubt about that. However, the key thing here is to remember that content moderation is not binary or at least is no longer binary. There are many options and ways of dealing with content. Content can be demonetised, which can be a very strong incentive for people to stop sharing stupid or harmful content. It can be downranked or flagged. There are ways to display a message that points to authoritative sources of information on, for example, health and the pandemic. Something like a social media council would be an appropriate forum for the discussion and fine-tuning of those non-binary approaches to content moderation in order that stakeholders could all share their views on what is needed, platforms could explain what is possible technically and some sort of societal agreement could be reached on the best approach to content moderation for each of the categories of problematic content.

Could Dr. Docquir elaborate on demonetising content?

Dr. Pierre François Docquir

It is the idea that people share content because they profit from selling advertising. Not only the platforms but also the producers of content get a share of the advertising profits. If a platform prevents somebody from getting that income, it removes the incentive for that person to produce problematic content.

Dr. Docquir has opened up a whole new conversation about how this is coming about, where it is coming from and the sources of a lot of this misinformation. We need another session on that.

I again thank the witnesses for a fascinating exchange focusing on many crucial issues. I will delve into some of the other issues we are considering. The first question is directed in particular at Ms Fitzpatrick and the sector she represents. Obviously, we see that the new media commission will have a regulatory role. Does Ms Fitzpatrick also see it having a role in fostering and supporting the development of the content creation sector in Ireland? This links to how the content levy will be used.

How could that happen most effectively?

Any of our guests can reply to my next query on algorithmic decision-making. I have been asking many questions about the role of the commission in deciding the code and enforcing transparency in algorithmic decision-making. It relates to the challenge of balancing freedom of expression with the right to a good name and the takedown powers of the new regulator. In what circumstances can a regulator order a social media company to take down particular material? In Germany, this issue arose around Holocaust denial in particular. German authorities can tell social media companies, particularly those of a certain size, to take material down quickly. I wish to cite an example that Dr. Docquir might not be aware of, but others will be. There was a recent situation involving Presentation College Carlow where a series of statements started to be made online about the school. They were subsequently proven false but they were damaging to the reputation of the school, in particular its male teachers. Very quickly, the story exploded all over social media. The school has won a judgment from the Press Council but that is only one of the steps necessary. In such circumstances, where the reputation of an individual or institution has been badly damaged online, what role should the media commission have?

I believe the Senator's first question was directed towards Ms Fitzpatrick.

Ms Una Fitzpatrick

I thank the Senator for his question. On the content creation side, the media commission could play a strong role in the collaborative space between content creators and the platforms, including in how they work together.

Regarding levies and so forth, the industry would like to see steps taken on an EU-wide basis, as opposed to anything too unilateral on Ireland's part.

The Senator referred to the media commission having a role in bringing commercial operators together, for example, in providing training and support. Many of our member companies already run outreach programmes of that kind, but the media commission could play a role in extending them and making them more freely available. It could help creators to find new ways of monetising their content, something in which the platforms have a great deal of experience. There is a role for the media commission in sharing that expertise with content creators.

Senator Byrne's second question was open. Does any witness wish to respond?

It was on the issue of algorithmic decision-making.

Mr. Ronan Lupton

I can give a view on that, although it is probably one the Senator has heard before. The people who can be damaged most by algorithmic decision-making online are children. As adults, we disclose certain information on our devices, whether knowingly or not, and algorithms then take effect. We know that they are a fact of life on the Internet, but those who will be troubled most are those who do not understand what they are doing on the Internet. The question is whether the legislation can create a situation in which we have some form of regulatory guidance or code of conduct on algorithmic decision-making in respect of minors. The Data Protection Act looks at targeting and advertising, and there are actions happening in that regard that are fundamentally good. Ultimately, however, it is a global question. It is something that irritates many people. If people are not happy with their algorithms, should they run off to the digital safety commissioner and say that the social media companies' algorithms are showing them content they do not want to see? That is the type of complaint that an individual complaints mechanism would fall over and die on. If people understood how the technology works - it is a citizen information exercise - it would make life a little easier. We are all subjected to it. I am not saying that it is not a pain to have algorithmic decision-making happening, but we need to focus on who is most likely to be hurt by it, namely, minors. They can be exposed online to content relating to dietary issues, alcohol sales, gambling and so on. The question arises as to who is responsible for setting the parameters on the devices they are given and who is giving them the devices. Invariably, the answer is their parents, or the children have set up the devices themselves and not told their parents, having extracted vast sums of money from them by saying that they need the latest and greatest in technology, but that is another day's work.

The Senator's question on freedom of expression was a good one, if he does not mind me saying so. Freedom of expression online is codified, but we also have the Defamation Act. Under it, if a defence is reasonably likely to succeed - and that is an open question - the reality is that someone will not get an injunction from the courts. A court will look at the case presented to it and see whether there is a possible defence that is reasonably likely to succeed. If there is - a clever lawyer will have put one forward - then the person will not get an injunction, meaning that he or she will have to take costly and lengthy defamation proceedings to resolve the issue. While I do not want to get into the details of the Carlow example, it was a serious situation. One can find businesses and institutions that do good work being subjected to crank complaints. I am not saying that happened in the Carlow context, but those bodies can be subjected to complaints where people are exercising freedom of expression and doing so wrongfully. For businesses or individuals, trying to vindicate the right to a good name becomes an expensive and lengthy process, and they have to go to court to do it.

Can this legislation fix that? There is an effort to do so in or around heads 49A to 49C in terms of providing recourse to a regulator to take data and information down, but what will the parameters be? We should be careful not to have censorship by the Executive without democratic legitimacy. Reverting to my point on misinformation and disinformation, we should seek to put into legislation clear guidelines for the regulator on when it would be appropriate to take down information. There should be takedowns - there are no two ways about it - but it is a question of when doing so is appropriate. We do not want a regulator going rogue and telling platform providers or others to take something down because it is bold when, without having really assessed it, the regulator deems it prima facie not correct or appropriate. We cannot have that. We would die commercially as Ireland Inc., never mind anything else.

There are protections in the Constitution to deal with this issue. As the committee can probably tell from the way I am able to address this question, it is an area in which I practise, but the issue is vexed and problematic. There is room on the Statute Book for separate legislation in the criminal context to shore it up, for example, by enhancing the incitement to hatred legislation or through new legislation that a concerned Senator might introduce as a Private Members' Bill.

Dr. Pierre François Docquir

I will make two points on algorithms. First, there is the question of non-technical persons, who do not understand computer code or languages, being given the right level of transparency so that they can identify the effects of the algorithmic distribution of content. Second, there is the issue of human rights impact assessments. Research shows that there is considerable discrimination, along with other issues, linked to the way algorithms are built. Building a concern for the protection of human rights into the design of those programmes should be part of the solution.

I am not aware of the case to which the Senator referred.

He mentioned that the Press Council has dealt with it but said that that was not a complete solution because the harm is still linked to the online circulation of that information. I suggest there might be a role for a social media council here in designing the appropriate content moderation response to maybe sharing a rectification or the proper-----

(Interruptions).

Dr. Pierre François Docquir

-----to the people who have seen the false and harmful piece of information. That would contribute to repairing the harm the individual's reputation has suffered. Again, the social media council could be the place to fine-tune solutions that are both societally desirable and technically feasible.

If you would like to come in on those two points as well, Ms Fitzpatrick, please feel free to do so.

Ms Una Fitzpatrick

Regarding Mr. Lupton's point, we definitely agree as to where the issues arise around freedom of expression and defamation. It was interesting to pick up on Dr. Docquir's point about the possible role for the social media council in the case referred to by the Senator. From the platforms' point of view, that would be interesting to explore.

We will go back to Deputy Mythen for the final question.

My first question is for Mr. Lupton. The introduction of the online safety codes is important, but Mr. Lupton states that, since they are voluntary codes, there are no guarantees that companies will sign up to them or stick to their terms. Is this a weakness in the Bill? Does the Bill need to be strengthened?

I have another little question. It is a bugbear of mine. One of the definitions of harmful content refers to material containing or comprising a defamatory statement. There are different cultures, and sometimes what is accepted in one culture is not accepted in another. One famous instance involved cartoons; I will not mention their subject. There are also comedians. Generally, comedians operate on the basis of someone being the butt of the joke. That is the way it works. It might not be funny to the person who is the butt of the joke, but other people accept it. How would the witnesses deal with that?

Mr. Ronan Lupton

I will take Deputy Mythen's second question first. The Defamation Act is stand-alone legislation, and only the courts can determine whether something is defamatory. That is the reality of it. We will therefore not have a situation in which a regulator will be able to determine whether something tends to injure the reputation of an individual or a company in the minds of ordinary and reasonable members of society, which is what the test is. I think that will be an issue, and it might be challenged before the courts as to where it should sit. There is an interplay between Article 40.6.1°.i of the Constitution, dealing with freedom of expression, and the Article 40 rights, which include the right to a good name and the State's duty to vindicate that right. That interplay is very much hardwired into society here, and there needs to be very careful thought about it, not only in the committee's report but also as this Bill goes further in terms of amendments and how it will operate. A court is the only proper place to determine this, unless it is very clear that a really bad defamation has occurred. This is where most of the problems arise. The platform says it has not seen the content and says, "You are only telling us about it now." We then go back and check, and the platform agrees to take the content down, or it does not, but it does not assess its defamatory nature or otherwise. The courts do that. I know Dr. McIntyre has highlighted that in his submission on behalf of Digital Rights Ireland, DRI.

As for online safety, there have been voluntary codes operating through the auspices of the hotline.ie service and so forth. They have been very high-level and light, as far as they go. I was involved in those and I know how they operate. A lot of the content reports do not result in much, but the odd time there is a very serious situation where Interpol or the INHOPE network must get involved in the reporting of online criminal content relating to children. I think the online safety codes will bring the necessary parties to the table. There will always be difficulties. If a new operator comes to town tomorrow morning and says, "We are just going to go Wild West here, we are not going to comply with anything and we are global", that will be extremely difficult. Members have to ask themselves whether these codes are really practical and whether we can do anything to stop facilitating those services in Ireland. The answer is "No". The Internet is open and pervasive. It is not a platform but is all around us in terms of connectivity. From that point of view, careful consideration needs to be given to how the codes will operate and what detail will go into them. It might well be that when the commission looks at them, there will be a consultation process of some description that takes account of the views of those who will be affected by them, namely the consumer, the child and the platforms, so that we reach a kind of middle ground on which the codes can operate.

Dr. Pierre François Docquir

I think there was a question about humour and satire and the intersection between international or European human rights law and local values and local harms. In certain situations, international human rights law might push for some change or progress in local norms; in other situations, it is clear that the limits of humour, or of acceptable speech, can be decided only in light of the national context. Once again, this is where we see the importance of having content moderation oversight organised at the national level.

Ms Fitzpatrick, if you wish, we will leave the final word to you.

Ms Una Fitzpatrick

I will come back to the point raised about online safety codes and the question of whether they will be effective and whether there is willingness. I know from my members that they are very eager to get involved in the establishment and development of online safety codes. As a sector and a membership group, they want to be regulated. We recognise that the factors driving investment in Ireland will change substantially on foot of the OECD-level changes happening on the tax front. We have seen that regulation is really coming to the fore among the top three reasons to invest in a location. We see that even in Ireland. Looking at the biopharma sector, which is where my own background lies, the companies present here are really well regulated. The Irish biopharma regulator is strong but fair and well respected internationally, and as a result it attracts a lot of investment into the country. That is really where we want to pitch this with the media commission. We want a strong and fair regulator that works with all the stakeholders, takes a very pragmatic, risk-based approach and will enhance the reputation of the sector here in Ireland.

We will wrap up. I thank members for some very thought-provoking questions - we certainly got a good discussion going - and the witnesses, Ms Fitzpatrick, Mr. Lupton and Dr. Docquir, for their excellent answers and really insightful contributions. I am sure this will very much help us in our pre-legislative scrutiny of the online safety and media regulation Bill. We have run out of time, so that concludes our business for today.

The joint committee adjourned at 1.59 p.m. until 3.30 p.m. on Wednesday, 14 July 2021.