
Joint Committee on Tourism, Culture, Arts, Sport and Media debate -
Wednesday, 21 Jul 2021

General Scheme of the Online Safety and Media Regulation Bill: Discussion (Resumed)

We have received apologies from Senator Hoey and Deputies Dillon and Cannon. This meeting has been convened to discuss, with representatives of Safety Over Stigma, the general scheme of the online safety and media regulation Bill. I thank our guests very much for joining us virtually for our meeting. This is not the way we would like to be doing things but we are getting closer to being able to do in-person meetings. We very much appreciate our witnesses giving of their time and their generosity of spirit regarding the information they are going to share with us. I welcome Ms Alicia O'Sullivan, founder of Safety Over Stigma, and Professor Louise Crowley from University College Cork. The format of the meeting is such that I will invite our witnesses to make an opening statement and that will be followed by questions from members.

As our witnesses may be aware, the committee may publish the opening statement on the website following today's meeting. Before I invite our witnesses to deliver their opening statement, which will be limited to three minutes, I wish to advise them of the following in respect of parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person by name or in such a way as to make him, her or it identifiable or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. If an opening statement is potentially defamatory in respect of any identifiable person or entity, the witness giving it will be directed to discontinue his or her remarks. As our witnesses are attending remotely outside the Leinster House campus, they should please note that there are some limitations to parliamentary privilege and, as such, they may not benefit from the same level of immunity from legal proceedings as a witness who is physically present does.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the Houses or an official either by name or in such a way as to make him, her or it identifiable. I remind members finally then of the constitutional requirement that they must be physically present within the confines of Leinster House in order to participate in today's public meeting. I cannot permit members to attend if they are not adhering to that constitutional requirement. If a member is not within the precincts of Leinster House, I will have to ask him or her to leave the meeting. I ask members to please identify themselves when contributing for the benefit of assisting the staff of the Debates Office involved in the preparation of the Official Report. Members might also mute their microphones when not contributing in order to reduce the background noise and feedback. I also ask members to use the raise-hand button when wishing to contribute and I also remind all members joining today's meeting to ensure that their mobile phones are in silent mode or, better again, switched off. I thank those in attendance for bearing with me for all of that housekeeping advice.

I ask Ms O'Sullivan to address the committee now, please.

Ms Alicia O'Sullivan

I thank the Chair. To start, it is quite difficult to be before the committee today, even virtually rather than physically. In a way, I am also glad that it is me here now rather than someone else further down the line. I am grateful that this general scheme exists and that Safety Over Stigma can play a role.

Briefly, a fake Instagram account was set up in my name a while back and the perpetrator posted multiple explicit images of another woman’s naked body and gave the impression that it was me. The horror experienced when realising what was unfolding was amplified by the uniform, dismissive and victim-blaming responses I received from An Garda Síochána. In addition, the initial reluctance by Instagram to remove this account, while simultaneously deleting my personal Instagram account, further aggravated the situation. Evidently, both the response and the governing laws were inept. From my experience, the perpetrator was more protected than I was at all times, from the moment they were allowed to sign up under my name to the present day, when there are few repercussions for this. Every day I see another girl that I know facing the exact same situation on Instagram.

That said, we welcome the proposed establishment of the media commission and the acknowledgment of respect for democratic values but think that, in addition to the rightful liberty of expression, explicit reference should also be made to the privacy and good name rights of citizens. We also recommend that in addition to allowing for the establishment of advisory committees, a specific committee on online safety and media regulation be established.

A huge factor in my experience was the lack of public awareness of the relevant law, given that I was told that someone posting illicit photographs purporting to be of me was not illegal when, in fact, it was and is.

A solution to this would be the development of a specific helpline or support network for online abuse and cybercrime, through which the law and the procedures to follow can be explained and support can be offered. This would go a long way towards creating greater public awareness and improving legal and media literacy.

Thankfully, the fake page was taken down in less than 24 hours, but many women privately told me that the same had happened to them and the account was not taken down for about a week. I imagine that these victims did not have the same number of people reporting it as I did and suffered further because of that. From my experience, it should be standard that when a complaint is made about the veracity of an online account, or a claim is made that it contains illegal or harmful content, the account should be subject to an immediate take-down procedure, with the material reviewed promptly and reviewed again if it is put back up.

Someone pretending to be you online is frightening and damaging. An obvious protective step would be the requirement for some form of identification or identity, which can be traced and verified, to be provided when setting up an account. This could be done by requiring unique identifiers for users signing up to accounts.

I started this campaign for one simple reason. I remember feeling such distress, discomfort and dismissal. I want to play a role not only in putting an end to image-based sexual abuse, but in ensuring there are repercussions and accountability for the perpetrators. Most important, victims should be treated with the respect and empathy they deserve at every point. I am simply asking each committee member to do everything in their power to ensure that we, as a country, can do this. I thank the committee.

I thank Ms O'Sullivan for her very honest and open presentation today. It cannot have been easy for her. She has already been through some trauma. We appreciate her sharing her experience with us because it will be very helpful. We aim to protect users, such as Ms O'Sullivan, on social media. I will invite my colleagues to speak. I am conscious that Professor Crowley is on the line. Perhaps she would like to come in following the questions from some of our colleagues.

I thank Ms O'Sullivan and Professor Crowley for coming before the committee. I thank Ms O'Sullivan, in particular, for sharing her story again. A lot of us have had the honour of hearing her story. She has shared it and has been exceptionally brave. As she said, she is just one person. This is a challenge that many individuals are facing.

As a committee, we deal with laws and regulations. Things can get very technical at times. Ultimately, our challenge is to ensure that Ms O'Sullivan's rights, reputation and dignity are respected. This is what we have to achieve from this Bill. I refer to Ms O'Sullivan's experience with social media companies, in this case Instagram. What does she believe companies can do in order to specifically address this problem?

In establishing an online safety commissioner, we are considering the introduction of an individual complaints mechanism. If somebody like Ms O'Sullivan went to Instagram, Facebook or Twitter and the company failed to take action, how would she like to see an individual complaints mechanism work? Ms O'Sullivan has spoken about her experience with the Garda. What additional training and resources does she believe gardaí need to have in these circumstances?

Ms Alicia O'Sullivan

I will ask Professor Crowley to come in. In terms of Instagram, from my experience it was quite difficult to make a complaint. It was a very difficult platform to manoeuvre across. It was like a help centre online. It was not obvious which boxes to tick in terms of what had actually happened to me. One option covered people pretending to be me, but it did not go as far as what I experienced.

Identifiers are a subject of conversation in the UK at the moment. On a platform like TikTok, something is taken down straightaway and reviewed. In our submission we suggested that in cases of bullying, where someone is purposefully trying to get a person's account taken down, there should be a mechanism whereby the complaint can be traced, so that the user about whom a complaint has been made can check its validity.

Professor Crowley can take the second question. In terms of training for An Garda Síochána, we welcome the comments of the Minister, Deputy McEntee, that there would be further training for gardaí. To be honest, what was lacking most, which is quite a sad affair, was empathy for me as a young woman who went into a Garda station to complain about something I was very upset about. There was a lack of sensitivity and help. I left the station visibly upset and nobody offered me any sort of support or provided me with a telephone number to call. I was not aware that the divisional protective services unit could have helped me at that point. I do not think the average citizen would be aware of that, and I definitely should have been told about it at that point.

Professor Louise Crowley

I thank Senator Byrne for his questions, which get to the nub of some of the key issues. In respect of his suggestion on an individual complaints mechanism, we need to think about the role of lawmaking within an individual jurisdiction. That is crucial because without that we are beholden to the policies of individual platform owners. They have their own interests and priorities in the commercial world. If we are creating a framework to respond to the needs of all citizens who might be affected by this type of behaviour and asserting Ireland's position as regards the importance of protecting service users from the abuse experienced by Ms O'Sullivan, a complaints mechanism could be established within the proposed legislation to give power back to Ireland and lawmakers. It could be a very proactive, positive statement by the Government that the opportunity to make a complaint to an Irish body, established with a view to protecting individuals, is a very meaningful and accessible mechanism.

A key issue, as Ms O'Sullivan said, is knowing what box to tick and what category a person comes under. We could be very considered in creating the individual complaints mechanism that we have spoken about. We would present it in a way that is compassionate to the person who is probably going through one of the most traumatic moments of his or her life and is looking for answers. The challenge with that, as is the case for any support guidelines or rules that might exist, is that people are trawling through volumes of information to try to find an answer. A very accessible and clear means of making a complaint to an Irish body charged with ensuring protection for users would be a very important statement and mechanism for those seeking support.

I am coming from a legal perspective. The crucial issue is the threshold at which the witnesses think an individual complaint could be made. They will appreciate that this is the challenge we are facing as legislators.

This engagement has to conclude at 10.25 a.m. I will give our guests an opportunity to answer the questions. I will move on to another member and the witnesses can answer Senator Byrne's question in the next round.

I thank the witnesses for coming before the committee today. They spoke about the individual complaints mechanism. I am delighted to hear they are supportive of that because unless it is included, the legislation will not be as effective as we want it to be. It is vital that is included in the Bill.

Does Ms O'Sullivan think it is important for legislators, the Department and all of those who will ultimately be responsible for the implementation of the legislation to take account of the lived experience of young people? For example, should there be a youth advisory committee to the media commission?

Ms Alicia O'Sullivan

Absolutely. We have said that we think an advisory committee on media regulation and online safety should be established. Within that, there should be youth representation.

My generation certainly experiences the online world very differently to my parents or possibly even people in Government. Therefore, young people must be represented. As Professor Crowley touched on, when this happened to me, and as we are all inclined to do, the first thing I did was to look it up. I am a law student as well, but there was not much out there in this context. Coco's Law is very new and has only come in since the end of February. Therefore, experiences like mine have never really been spoken about before. I refer again to this type of dismissive behaviour that we as a society have displayed to issues concerning the Internet. The Internet is a very dangerous thing in the hands of the wrong people. It is also an amazing thing. We must move on now to consider how we can best protect people in this context and it is always important to have young people's voices heard at the table in these types of endeavours.

I thank Ms O'Sullivan. Do I have enough time for one more question?

Very briefly, Deputy Munster. I would love to have more time, but I want to ensure everyone gets in very briefly.

That is fair enough. There can be a "Yes" or "No" answer to this question to save time. Does Ms O'Sullivan think young people should be involved in awareness-raising activities, such as through youth work or other avenues, in respect of Internet laws and protections? Would that be a crucial element?

Ms Alicia O'Sullivan

Absolutely, young people are doing that anyway through organisations such as Comhairle na nÓg, the YMCA and the Irish Second-Level Students' Union, ISSU, which I was involved in this year with the leaving certificate. Therefore, young people are already out there doing that, so now it is just a case of bringing them to the table and hearing what they have to say.

I call Deputy Mythen, who has two minutes.

I will be brief. I thank Ms O'Sullivan and Professor Crowley. I commend Ms O'Sullivan on her bravery and determination in following up her cause. It is good to see young people taking up this type of initiative and Ms O'Sullivan is an example to all young people in the country. How important is it to have a go-to support service for young people who have gone through an experience similar to Ms O'Sullivan's? How are the existing services failing young people? Regarding the helpline suggested by Ms O'Sullivan, how important would it be for the protection of young people?

Ms Alicia O'Sullivan

Supports exist, and after telling my story online I received such support from the Sexual Violence Centre Cork and from Professor Crowley who works with the bystander intervention programme. As I mentioned already, the staff of the divisional protective services unit have been nothing short of incredible. However, the problem which exists is the lack of awareness of these supports. Again, as I said earlier, in essence, I was not told of any of these services when I first went to report this crime. The average person in society is not aware of these supports. These are not new problems, but the supports available are not widely known. This is the big hurdle we must overcome.

I will move on now and call Senator Cassells.

I thank Ms O'Sullivan and Professor Crowley. I will be brief because I have only a minute remaining. I thank Ms O'Sullivan for her work in this respect, for being an advocate and for appearing before this committee. One of the most important aspects is how she has highlighted the point of contact with An Garda Síochána in respect of reporting crimes like this. In Ms O'Sullivan's case, that was a sexual crime as serious as one that happens on the street. I refer to the work of An Garda Síochána in being able to deal with a case like this and, as Ms O'Sullivan said, showing empathy in doing so.

Turning to Instagram, and Ms O'Sullivan's interaction with that platform, does she believe that a helpline would be more helpful, instead of going through a series of exercises to tick boxes? My experience of these social media giants is that it is very hard to make contact with humans. In her interaction with Senator Byrne, Professor Crowley touched on these tech giants and their interests in the commercial world. This goes to the nub of the issue. When representatives of those companies were before this committee, what seemed to come through from the meeting was those companies seeking to protect what they perceive as their freedoms because it is obviously in the advertising space that those companies are going to make their money. From a legal point of view, how far should the Oireachtas and society in general be pushing to get a greater balance in this sphere? I ask that because I think the situation now is totally unbalanced.

Ms Alicia O'Sullivan

Starting with Instagram, it was very difficult to manoeuvre around. Regarding how the company responds initially to someone reporting something very damaging, such as illegal content, those types of reports must be taken more seriously than other categories of reporting. Regarding a helpline, it is very difficult to have one in the context of a social media company such as Instagram. However, the company's responses to cases such as mine and my experience must be different. Something obviously went wrong when my account was disabled, and there certainly seems to be a lack of human involvement behind the artificial intelligence involved, and that does not help the situation. Regarding whether there should be a helpline, then, I do not know if that is entirely possible. However, I certainly think there should be a certain number of people behind these screens.

Professor Louise Crowley

I will be brief given the time left. A helpline needs to work side by side with a body that has the capacity to do something about what is reported via that helpline. A helpline cannot just be about listening and supporting; it must also have the teeth required to do something about what is reported. Regarding how far the Oireachtas can go, as someone who works in the area of seeking to address all forms of sexual harassment and violence, I believe the Oireachtas should go as far as it needs to go in this regard. All forms of harassment and abuse, especially sexual harassment and abuse, constitute a huge societal challenge. Ultimately, the key role and responsibility of the Oireachtas is to identify the issues and to govern and regulate in a way that supports and respects all our people.

If a platform has the capacity to ignore a situation such as that experienced by Ms O'Sullivan and many similar situations experienced daily by other women and men - and we have knowledge and evidence of these experiences - then the obligation on our lawmakers is to take action to prevent this happening and to make these platforms accountable for what they are doing. It is not enough to allow them to just create their own rules with their commercial interests in mind. Those companies have a social responsibility, but it is the responsibility of lawmakers to create a framework that will require those companies to comply with their responsibilities in this regard. Therefore, I urge the members of this important committee to take all the steps they feasibly can, based on the constitutional right to privacy for all our people, to ensure that rights are respected and that our laws allow that outcome to be achieved.

I thank Professor Crowley.

I thank Ms O'Sullivan and Professor Crowley. I thank Ms O'Sullivan for her bravery, honesty and determination in championing this issue, not only for herself but also for thousands of others right across the country and possibly the world in respect of the work we are doing here. Professor Crowley is right that it has been proved repeatedly that self-regulation is no regulation for these giant technology companies. Once we finish the pre-legislative scrutiny of this Bill, we will then have a major responsibility to ensure that we are in a much better place and that people will be safer on the Internet and on social media. I again thank the witnesses for joining us. I also thank my colleagues for their questions and for being so timely in this regard. I appreciate their co-operation. We will now briefly suspend to allow the secretariat to make arrangements for our next session.

Sitting suspended at 10.28 a.m. and resumed at 10.33 a.m.

This meeting has been convened in the context of the committee's continued pre-legislative scrutiny on the general scheme of the online safety and media regulation Bill. I welcome the Australian eSafety Commissioner, Ms Julie Inman Grant. She is joined by her colleague, Mr. Toby Dagg, executive manager in the investigations branch of the eSafety Commissioner. Both of our guests are joining us remotely in committee room 4 via Microsoft Teams. The format of today's meeting is such that I will invite our witnesses to make their opening statement, which will be followed by questions from members of the committee. As the witnesses are probably aware, the committee may publish the opening statement on its website following our meeting.

Before I invite the witnesses to make their opening statement, which is limited to three minutes, I want to advise them about parliamentary privilege. Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of that person or entity. Therefore, if their statements are potentially defamatory in respect of an identifiable person or entity, they will be directed to discontinue their remarks.

Witnesses participating in this session from a jurisdiction outside of the State are advised that they should be mindful of domestic law and how it may apply to the evidence they give.

We are delighted to have the witnesses with us and I thank them for taking the time to do so. As my colleague, Senator Malcolm Byrne, said earlier, we are having an Australian summer these days and we do not know how to cope with the heat but we have plenty of air conditioning so we are all fine.

I call on Commissioner Inman Grant to deliver her opening statement.

Ms Julie Inman Grant

Dia daoibh and hello from Sydney. It is a pleasure to talk to the committee today and I congratulate members for delving so deeply into these laws and considering how to best protect their citizens online.

When I was preparing for this hearing, I reviewed some of the debates the committee has had on this Bill. I noticed that the committee has heard from industry and civil society, and understandably their views have differed on the recommended approach and timelines for online regulation. One thing is certain: we may never achieve full-scale consensus because these issues are challenging. Balancing a free and open Internet with the protection of individuals and societies, where national laws have bounds but where harms know no borders, is a hard regulatory and policy posture to introduce and, more importantly, to deliver. It requires commitment, education, perseverance and a fair degree of cultural change. As members know, the issue of online harms and keeping people safe online is more important than ever as we rely on the digital environment to learn, work and even socialise.

Here in Australia during the pandemic, where we are still in lockdown, we have seen a surge in reports across all of our regulatory schemes. In the fourth quarter of the last financial year compared with the same time in the previous year, reports about illegal content, primarily child sexual abuse material and some pro-terrorist content, increased by 96%. Reports of image-based abuse or the non-consensual sharing of intimate images and videos increased by 255%. Adult cyber abuse, for which we do not have a scheme, increased by about 53%. The most moderate increase was in child cyberbullying, at 19%, which did not surprise us because most youth-based cyberbullying tends to be peer-to-peer and an extension of conflict happening within the schoolyard. When kids were separated, there was less cyberbullying as an extension of that conflict.

As the world’s first government agency solely dedicated to tackling online abuse, we provide an important safety net for Australians when reports to social media sites, which may miss culture or context, fall through the cracks. Last month, a more robust and modernised Online Safety Act passed both Houses of the Australian Parliament, providing me and my office with more expansive powers to help protect all Australians from the most serious forms of online harm, and we are happy to talk the committee through some of those provisions. The Act will take effect in early 2022 and it will enhance our ability to provide greater services and support for citizens, in line with our core functions.

We have an important leadership and co-operative role to play in the international arena, as other governments, such as the Irish Government, look to establish online content regulators. We have landed on an effective model in Australia that is working by focusing on three major sets of interventions. These are as follows: prevention; protection; and proactive and systemic change. We invest in our own research, education and awareness campaigns to prevent online harms from happening in the first place and, through our regulatory and complaints schemes, we protect our citizens by taking down seriously harmful content as well as employing a range of powers and remedial actions to hold both perpetrators and platforms to account. We also seek to minimise the threat surface for the future by staying ahead of emerging tech trends and by shifting responsibility back onto the platforms themselves through initiatives like #SafetybyDesign. We accept that each country will approach online safety in line with its own perceived needs and regulatory structures but we hope to achieve a significant synergy in a collective approach. In the near future, just as we have a network of data protection authorities, we will likely also see a global network of online safety regulators.

As the committee moves through future debates on its Bill, it is always helpful to remember the citizens for whom the committee is carrying out this reform. It will be providing them with vital protections that they do not currently enjoy. It could be a person that members know: a child; a person at risk or disadvantage; an older person in their lives; or a colleague. Online harms can affect anyone at any time. While digital harms may not leave visible scars, we know the damage can be significant and enduring. We are pleased that the committee understands this and that it is working towards the best possible solution for Ireland. I look forward to members' questions.

I congratulate Ms Inman Grant. The Office of the eSafety Commissioner in Australia is leading the way globally in the protection of people online. We are excited to have their contribution today and hopefully we will be able to do the same in Ireland.

Some Members have indicated that they would like to come in and ask questions. I will begin with Senator Malcolm Byrne.

I thank the commissioner and Mr. Dagg for their presentation, as well as for their ongoing work. As the Chair has said, they have taken the lead in this area. The evidence, as I know from speaking to people in Australia, is that they are having an impact. I have a couple of questions. One of our challenges here is the threshold for reporting of individual complaints. We are going to have to build that into legislation. I might ask the commissioner to talk about the thresholds that are used in her individual complaints mechanism. One issue she might touch on is the question of extra-territoriality. How does she deal with social media companies and platforms that are not based in Australia? Finally, she might talk about changing behaviours, because this is not all about regulation. Can she give instances of how the tech companies' behaviour has changed since her office was set up? Also, has she noticed a change in the behaviour of Australian citizens online as a result of her work?

Ms Julie Inman Grant

The Senator raises three good and important questions. As I mentioned, we have about four regulatory schemes. We are now working through an additional scheme around adult cyber abuse. Definitions can be tricky for policymakers. The key is to get enough precision and an understanding of the general consensus across the major social media sites, and where their policies lie. To give the Senator an example, serious cyberbullying of a child is anything that is intimidating, harassing, humiliating or threatening. This is a high enough threshold, but it gives us some latitude to look at the content and context. One of the values of the citizens' service we provide when we interact either with the child, a guardian or an educator is that we can get the full story from them. This is where we can help serve as a safety net and bridge that inherent power imbalance that exists between the tech companies and the users. If the tech companies say that a report does not meet the threshold, there is not much recourse for that child. Does Mr. Dagg want to give any specific examples of how he applies that latitude? Where does he see the importance of pitching that language and threshold?

Mr. Toby Dagg

The role of the complainant in the context that the commissioner explained is a crucial one. It is backed by some clear legal thresholds in the Act. They set out a high bar for us to meet in relation to child cyberbullying. Importantly, where the definition of child cyberbullying is concerned, we can also take into consideration the characteristics of the child. These are essential. A child who presents with particular vulnerabilities may be affected badly by a degree of harassment or intimidation that may be below the threshold for serious distress in the case of another child. When we explain the full circumstances to the social media platforms, we are able to bring into that conversation facts that might not be apparent to a moderator, or an officer that just assesses the material on its face. We serve as a safety net. The first thing a person needs to show when they come through our doors is that they have made a complaint to the social media platform about the relevant material. This is so the social media platform can consider that complaint against its terms of service. When we have that conversation with the social media platform, we often talk about cultural, social or individual factors that will likely have been missed on a first pass. This is simply because of the volume and velocity of matters that those moderators need to assess.

I will give the committee a good example to illustrate what I mean. We were dealing with a matter a couple of years ago that involved some video footage that had been posted on a social media platform. It depicted girls from a remote community in Central Australia; it was an Aboriginal community. It is common in Aboriginal communities to be careful about naming deceased people. There is a strong cultural prohibition against that. This video had presented images of the deceased person. The video also named that person. The video was trying to cause distress to part of the community and to specific individuals in the community. To a person who is not familiar with the Australian context, the significance of that material would have been missed. It caused immense pain within the community. The person who was depicted had taken her own life. She was only a 15-year-old girl. By explaining this to the social media platform, we were able to have that material taken down. When it comes to the question of "behaviour" that Senator Byrne asked about, part of our job is to educate the social media platforms about the significance of our local context, beyond the sheer application of regulatory principle.

I also had a question about extra-territoriality.

Mr. Toby Dagg

I will ask the commissioner to answer that.

Ms Julie Inman Grant

I was only going to say that we have had an online content scheme in place for more than 20 years. We, as the eSafety Commissioner, have only been established for six of those. As a result, less than 1% of the content that we deal with is actually hosted in Australia. It is almost all hosted overseas. It depends on the regulatory scheme. For instance, with the image-based abuse scheme, we have had what I would regard as a relatively high degree of success in terms of getting intimate images and videos taken down from overseas sites. There are thousands of different sites, and we have about an 85% success rate. It helps to have the government crest behind us, as well as a strong set of remedial actions and powers to hold either perpetrators or the platforms to account. There tend to be image chan boards and rogue porn sites that are hosted in permissive environments that are not likely to take that content down. One of the newer reforms in our new Bill does make it clear that our laws and regulatory schemes will have extra-territorial reach.

I thank the commissioner and Senator Byrne. I will now move on to Senator Shane Cassells.

I want to give a warm Irish welcome to the commissioner and to Mr. Dagg. We are delighted to have them both here. It is interesting to hear the Australian experience and the work they have been doing over the past six years to regulate this whole online world. I will start with a question for Mr. Dagg. We have just had a young law student before the committee. She was here just before Mr. Dagg joined the meeting. She had the experience of having her identity taken and used; explicit images of another woman's naked body were posted on an Instagram account set up in her name. She related to the committee the difficulty of the complaints process. Since the eSafety Commissioner was formed, has its presence brought about a sea change in how the tech giants deal with that complaints process? Mr. Dagg has pointed out that complainants have to go to the tech giant before going to the commissioner. However, has there been a sea change? What pressure does Mr. Dagg put on the tech giants?

My follow-up question is to the commissioner, although these can be answered together. She spoke about the new online safety Act which was passed in Australia in 2021. How specifically has that been tweaked? In what spheres was it tweaked? What deficiencies did she see that needed to be addressed? If she has had a chance to look at the Irish Bill, does she see any deficiencies in this piece of legislation? Many people who have come to the committee have said that they want to see the Bill amended in different spheres. I have one last question for the commissioner. What I have found through this pre-legislative process is the huge breadth of work encompassed in this Bill. It is overwhelming to hear the witness testimony, especially from the young girl who was just in the meeting, or the special rapporteur on child protection. They talk about the abuse directed at these people. I ask the commissioner, because she ended her statement by talking about putting citizens first, has she found that emotionally stressful in her job? She is dealing with the attacks on citizens' rights as well.

Mr. Toby Dagg

I thank Senator Cassells. It is fair to say that our experience with the efficacy of the reporting processes that are provided by the services and platforms has been mixed. We have had some very direct engagement with some of the major platforms in respect of deficiencies that we have identified in reporting, for example, an account carrying child sexual abuse material. In that way, we provide clear illustrations as a consequence of our investigations. In some cases, that has yielded some positive results, which we have been pleased to see, and some marked improvements in the context of removing any ambiguity regarding aspects of the complaints process. However, there remain some deficiencies and we have been pointing to those on a fairly consistent basis. Our experience has been that some of the glaring holes are slow to be remedied and repaired. Why that might be the case is a matter for speculation. There is definitely work that continues to have to be done.

Senator Cassells referred specifically to Instagram. We are pleased to see that Instagram has improved the protective tool set made available to its users, particularly in dealing with unwanted messages and detection of abuse in some of the content that passes over Instagram. That is very welcome, but there is a lot more to do.

Our real concern is with those platforms, services and sites that serve a very specific interest, namely, the malicious distribution of harmful content serving a particular audience who really do not care all that much about the impact on victims, and, in fact, take some pleasure from it. They currently have very poor reporting mechanisms. It is difficult to identify who is responsible for the services. What the legislation that the Commissioner referred to seeks to do is provide multiple entry ways into a regulatory action. If we cannot get any response from the content service or the operator of a website, we can identify the hosting service and push at that door. If that fails or if removal notices are ignored, we can go to additional mechanisms such as seeking to have the material delisted from search engines, for example, through a power that the Commissioner wields. What we found effective and what will be apparent in the new legislation is that it is good to have fallback points so that you can increase your regulatory spread and exercise ever more serious measures in order to achieve a form of compliance and remedy the particular ill which is causing harm to a complainant.

Ms Julie Inman Grant

We try to achieve as much as we can co-operatively and collaboratively. We are often surfacing things that are missed but there is no question that having a big stick we can wield means that they are more compliant and willing to work with us. That has changed with the spectre of having new powers. We have seen some of the platforms do less in the context of goodwill. When this legislation was conceived in 2015, social media was primarily the challenge but we know online harm happens on dating sites and online gaming platforms. There is a really broad ecosystem. The search engines have a role to play and we will have a function around link deletion notices. The app stores have a role to play, particularly if they are hosting rogue apps that are harming citizens and are in violation of their terms of service.

We will now have the power to remove content more rapidly. The quicker we take down content, the more distress is relieved from the person. The deadline was initially 48 hours and is now down to 24. We have seen some cases where a major platform will take down content within 12 minutes. We had a lot of baulking on the part of the platforms but, even since that time, with the content moderation that is happening, they are increasingly using artificial intelligence, AI, and other tools to surface content and help with their moderation efforts. Therefore, that is possible.

We have a new set of, I guess, duty-of-care requirements, called the basic online safety expectations, that will be laid down through ministerial instrument and will allow us to compel transparency reports. We can do periodic reports or ones around specific issues. For instance, if there is a big pile-on, a brigade or a volumetric attack, we can write to a platform and ask what signals it was seeing, how it responded and whether it is consistently and fairly enforcing its terms of service; if it does not comply, there are penalties.

I guess the big pièce de résistance is the adult cyber abuse scheme. Within three months of expanding our remit from Children's eSafety Commissioner to eSafety Commissioner, adult cyber abuse reports started exceeding youth-based cyberbullying reports. Of course, if you are Indigenous or a Torres Strait Islander, or of a different ethnicity, sexuality or gender, you are three times more likely to be targeted than a white male.

I thank the Commissioner, and Mr. Dagg as well. It was an insightful presentation. It is clear, when they talk about cultural context, that their presence alone is having a massive impact. I call Deputy Munster.

I welcome the Commissioner and Mr. Dagg. Some of the kickback that we have been getting from stakeholders regarding the individual complaints mechanism is the argument that it would be instantly bogged down with complaints and reports and that would render it useless. Were there similar concerns in Australia initially and how did it play out?

Ms Julie Inman Grant

We had the same fear-mongering and there were times when we were concerned as well. Because we were a new entity, particularly for youth-based cyberbullying, we had more of a trickling effect at the beginning than we were hoping for. As we raised awareness about our services, that started to gather pace. We have not had an advertising or marketing budget and we have had to use a lot of our own media to get word out that we are a new agency and that we are here to help people. We have also found over time that we need to do more to encourage young people to engage in help-seeking. Only 50% of young people will speak to a trusted adult, such as a teacher or a parent, when things go wrong online. Even fewer will report to a social media site, let alone a Government entity, because they do not think anything will happen. That will all take time to build.

As we went into the pandemic, as I mentioned, we saw a surge of reports. At one point, in April 2020, around the Easter long weekend, we had a 600% increase in image-based abuse reports. There are a couple of reasons for that. Of course, more people were turning to digital intimacy tools when they were separated from their loved ones and things went wrong, but there were also four different variants of sexual extortion scams going around. These scams involved people stating that they had hacked into a person's hard drive, that they had seen his or her intimate images and that the person must send them money, or involved a range of imposter accounts seeking to meet people, asking for a sexy Skype call and using that as a form of sexual extortion for either more explicit images and videos or for money.

We had some surge funding given to us by the Government over the pandemic period in 2020 to accommodate the increase in reports. The way that Mr. Dagg has set up investigations is that we have multiple teams that all work together. We are prepared for surge support. If there is a surge in youth-based cyberbullying reports, the image-based abuse team can step in and help. We have never been overwhelmed. We have 26 million people. Ireland, I believe, has 5 million. It would not be intractable.

I agree 100% with Ms Inman Grant. We are taking account of the different stakeholders' opinions and so on. If we do not have an individual complaints mechanism, the Bill will not be worth the paper it is written on. Certainly, it will not be as effective as we want it to be. Is it Ms Inman Grant's opinion that there is no reason an individual complaints mechanism will not work once it has sufficient funding and resources behind it?

Ms Julie Inman Grant

We can speak from our own experiences. We are mitigating individual harms and helping citizens to remediate them. That cannot really happen on a process and system level. There are benefits to that approach as well. Both really need to work together. The other value of engaging with citizens on a daily basis is that we see terrible things every day, which energises us and reminds us why we are doing what we do. We are helping individual citizens. It also gives us a really good indication of what is going wrong on these platforms, what is being missed and what are the trends and the social engineering scams. Without that, it would be harder for us to ascertain or pinpoint what is going wrong on those platforms or where their processes and technologies are deficient. Does Mr. Dagg agree?

Mr. Toby Dagg

Absolutely. To the commissioner's remarks on this particular question, I will add that what has helped us to limit the volume we might otherwise need to deal with are the legislative brakes on volume. These include the requirement for complainants to have first made a complaint to the social media platform before coming to us. The provisions create sufficiently high thresholds so that we can concentrate our regulatory endeavours and energy on those harms that are manifesting in a way that leaves a mark more so than the normal hurly-burly of social media interaction. I will give a snapshot from the cyberbullying scheme. In the first full year of operation, the 2015-16 financial year, we received 186 complaints about child cyberbullying. In the last full financial year, 2020-21, we received 934. There has been a year-on-year increase of 30% to 35%. Because we have made our processes more efficient and our procedures more responsive, we have been able to absorb this increase with some headroom left for those times when we receive a surge of the sort the commissioner talked about.

I thank the commissioner, Ms Inman Grant, and Mr. Dagg for taking the time to speak to us this morning. I know their schedules are busy so I thank them very much. I will take the opportunity to ask three quick questions. Some questions have already been asked so I will try to ask different ones. During our discussions, some stakeholders have made strong suggestions regarding a self-regulation model for Internet safety. The general consensus was that self-governance is very weak. What are the dangers and potential pitfalls associated with self-regulation? In Australia, the take-down time has been reduced from 48 hours to 24 hours. Do the witnesses think that should be included in our Bill? Will they tell us about their social media tier system and how it works? What incentives are there for companies to sign up? What happens to companies that do not participate or that simply refuse to participate?

Ms Julie Inman Grant

As someone who worked in the technology industry for more than 22 years, my view on self-regulation is that it does not work. One of the things we are trying to achieve through safety by design is to encourage a change away from the cultural ethos of moving fast and breaking things towards mindfully building or updating online services while understanding and mitigating risks, engineering out misuse and building in safety protections at the front end. The best analogy is that, more than 50 years ago, the car manufacturers really pushed back when regulators forced them to embed seatbelts but we now take it for granted that seatbelts are embedded, that airbags will deploy and that brakes will work. These things are guided by international standards. If you build the digital roadways, you should be erecting the guard rails. Right now, the technology companies do not have any of these requirements and we believe they should. We should continue to push them to do better. We can do that through safety by design and by surfacing up best practice because there are good innovations happening. This is the type of change we want to see go viral. We are never going to regulate or arrest our way out of online abuse. We are talking about human malfeasance and behaviour playing out on an online platform. A combination of both is needed. A pragmatic approach should be taken to regulation, which I believe Ireland is seeking to do. I believe we have struck a useful balance. We are not trying to undermine innovation or curtail freedom of speech. In fact, the burden is on us, particularly in respect of the adult cyberabuse scheme, not only to prove that the target is suffering serious harm or distress, but also that the perpetrator intended to cause serious harm, which is a high threshold to reach. I will ask Mr. Dagg to give his input with regard to the tier system. The Deputy asked a third question but I am blanking on it.

Mr. Toby Dagg

I will take a look at the tier question for the Deputy. I believe the three questions were on self-governance, whether a 24-hour or 48-hour threshold should be included in the Bill and the tier system. The tier scheme is actually being repealed in the Act which is to commence at the beginning of next year. I can certainly talk about how it has operated and how it is designed. I may then talk briefly about some of the reasons why we have moved to a non-tiered approach to the regulatory schemes in the new Act.

Under the current arrangements, there is a two-tier scheme for serious cyberbullying matters. A social media service can be declared a tier 1 service on application to the commissioner. If the basic online safety requirements are met, requirements which include nominating a contact person, having policies in place which prohibit the use of the service for cyberbullying and adopting those sorts of principles, the commissioner can declare the social media service a tier 1 service. With regard to the benefits of being a tier 1 social media service, the tier scheme operates like a co-regulatory scheme. Tier 1 members can elect to have what is called the special rule imposed. This allows any decision as to whether material amounts to cyberbullying material to be made by us with reference to the terms of service in operation on the relevant platform. Under this scheme, we serve the provider with a request for removal rather than a compulsory notice.

That is also available under the tier 2 scheme. On the advice of the commissioner, the Minister can declare a large social media service to be a tier 2 service. There are various elements the commissioner needs to take into consideration before making that advice available to the Minister. If a service is declared to be a tier 2 social media service, the regulatory options become somewhat more interventionist and attract civil penalties.

Ms Julie Inman Grant

I was working at Twitter when the legislation was established and it is interesting to note that Twitter and Yahoo signed up to the tier 1 scheme but, at the eleventh hour, Google, YouTube, Instagram and Facebook declined to sign up and were deemed tier 2 service providers. In the course of business, they have been collaborative and co-operative when we have requested removal. I believe they did not sign up to become tier 1 services because doing so involved a co-regulatory model and they did not want to set a precedent showing a tolerance for compliance with a regulatory scheme, which might start a domino effect.

Mr. Toby Dagg

I will transition to discussing the arrangements under the new Act.

We have found over the life of the cyberbullying scheme that harm is happening on social media, on relevant electronic services like instant messaging and gaming services and on websites. The legislation as it is written now under the Enhancing Online Safety Act does not cater for websites within the cyberbullying scheme. They need to be social media or relevant electronic services. The online safety Act creates a degree of uniformity across all our regulatory schemes, so they focus on the harms that occur across platforms. What will be brought into scope through the reforms are websites, other designated Internet services and anything that can be reached by an Internet connection. That is where some of the more serious, cross-platform cyberbullying is happening. As a result, the social media-focused tiered scheme will be abolished in favour of that broader approach.

It has come to my turn. The commissioner is right that in Ireland we hope to take the pragmatic approach that Australia has taken without compromising freedom of speech. Will Ms Inman Grant talk about what it has taken to resource her commission in terms of manpower? That will help us to formulate our own approach. In terms of ensuring we do the right thing and get it right from the start, will she give us three points of advice in the wrap-up?

Ms Julie Inman Grant

In the four and a half years I have been in this role, we have quintupled in size. When I came in, there were 35 people in the office. We have grown with schemes and functions and will be at almost 200 by the end of the year. We are in the middle of a major hiring boom now to accommodate the new functions which will take significant man and woman power. We are about 70% women at the eSafety Commissioner and 30% men. I think sometimes Mr. Dagg feels outnumbered. I used to say when I was in tech, where the proportion of men to women was the opposite, that the odds were good but the goods were odd. That was an inside tech joke. I will not do any more joking.

We started with an Australian $10 million budget. This year, we are at Australian $55 million. It feels like a luxury but we will use every penny to put towards fulfilling our mission of keeping all Australians safer and having more positive experiences online. We are small compared with our brethren in the national security or law enforcement community, the equivalent of An Garda Síochána.

What I would say, first, is "due diligence". You have obviously been thinking about this for a long time. We have engaged with various sectors of the Irish Government for at least two years. You are getting that right and trying to make sure you get the policy settings right. Our bias is that having a citizen-facing service is very valuable, both to the individuals who are harmfully impacted and as a way to surface what is happening in the online environment and assess those trends.

It is important to make sure to choose someone for commissioner who has an understanding of how the technology industry works. What has helped me in this role is knowing what points they will come in and talk about. I know what their limitations are so I can be reasonable but I also know when they can do better. It is about having that understanding of trust and safety, how harms manifest, how online platforms can be weaponised, how to engage constructively with the industry to the extent one can and when to wield the stick. Those are all important considerations when choosing the person to lead these teams.

I have a philosophy of hiring smart, nice people who have a passion for making a change and a difference. We see the most horrible aspects of humanity every day, particularly the pro-terrorist content and child sexual abuse material our investigators look at. We have extensive wellness programmes in place and we try to make the culture positive so we walk in every day with a spring in our step waiting to help more people.

I have a quick question about something that has been noticeable during the pandemic. The commissioner has looked at data around the online harm that is happening. An issue we will look to address related to media literacy is fake news, particularly with regard to Covid. I wonder how the commissioner responded to that. I am thinking about stories, which the commissioner may or may not wish to comment on, of vaccine hesitancy in Australia and whether that has been a contributory factor. I ask about the extent to which the commissioner's office engages in tackling problems around fake news, particularly during the pandemic.

Ms Julie Inman Grant

We have an interesting governance arrangement. We sit within the Australian Communications and Media Authority, the traditional media and communications regulator. It has taken on the role around misinformation and, to a degree, disinformation, as has our home affairs division when it comes to disinformation by state actors. The authority has developed a code of practice with the technology industry on misinformation and disinformation. Those are areas of concern.

On the prevention side, we have a strong media literacy programme. Part of what we do is teach, through parents and educators, what we call the 4Rs for the digital age: respect, responsibility, building digital resilience and critical reasoning skills. Critical reasoning skills are important in helping young people and adults discern fact from fiction, whether an impostor account is attacking them or whether they are being socially engineered for a scam. Teaching these skills, at home to start with and then in schools, so that people question everything, is critical. It is a skill set we all need to develop. We have not specifically addressed vaccine hesitancy in this context.

I offer my apologies for being late. I got delayed. I will not ask specific questions as they may have been asked earlier. We will have a media commissioner, and the Australian model is seen as best practice, which is why we are chatting today. The position of media commissioner will be extremely important. The Chair asked about the best advice from the Australian experience that we should learn from. With regard to the position of media commissioner, are there specific terms of reference regarding who that person should be that we should highlight and prioritise before the position is put in place? It is important we have a strong leader in that role to drive this forward when we bring in the regulations.

Ms Julie Inman Grant

Part of what was written into the job description for the eSafety Commissioner was some knowledge of the technology industry.

As I have said, that has come in handy. Depending on how complex the codes of practice are, Ireland might decide to choose somebody with a much deeper skill set in administration, code development and enforcement, or investigation. An all-rounder may be needed.

As the online safety regulatory network grows globally, we must engage internationally with counterparts. We must engage with non-governmental organisations. We are managing many competing interests and dealing with citizens in great distress. Sometimes we cannot help people. A person is required who is a good communicator with a good team around him or her. A person also needs much resilience because the pace of technology will always outstrip policy. We are moving very quickly and we must be deft and stay a step ahead. We are dealing with some hard issues and we must steel our resolve. This person must also be able to lead a team, keep them focused on the task and ensure they stay healthy in terms of their well-being. What would Mr. Dagg consider the secrets of success?

Mr. Toby Dagg

That might be a career-limiting question. The commissioner has spoken about finding someone who understands the industry components and can inquire as to the significance of statements by industry, pre-empting and anticipating some of the strategies and tactics that might be employed by industry to push back against regulatory reform. That is absolutely essential. Beyond that, it should be somebody capable of adapting to change and forecasting some of the essential strategic changes within the country's landscape that are likely to affect planning and execution of the objectives set out in the legislation. That is key as well.

Will Mr. Dagg give a quick example of the process involved with a complaint in Australia?

Mr. Toby Dagg

Absolutely. All our complaint processes function in reasonably similar ways in terms of throughput. A person visits the eSafety Commissioner website, where we have a reporting page. Every page on the website allows a person to navigate to the report page via a prominent report abuse button in the right-hand corner. The person can then complete a complaint form specific to the complaint type. We are conscious of the need to ensure citizens are not put in the position of having to choose which door to go through. There are some similarities between the schemes and we do not want people trying to diagnose their own form of online harm through a regulatory lens, so we have separate forms set up for each of our complaint areas.

A person will tell us what is happening, such as the location of the material; if it is image-based abuse, that may be a URL on an adult or rogue site. The person will also include demographic information so we can prioritise; we prioritise complaints from young people, for example. Approximately 25% to 30% of image-based abuse complaints are made by young people. We collect as much evidence as we can up front and the report will generate an entry in our complaints management system. An investigator will pick that up and perform triage by reviewing the complaint and assigning a priority. Any significant issues, sensitivities or risks will be raised with a manager and we will discuss the responses that should follow. With the human-centred harms, involving image-based abuse, cyberbullying and cyber abuse, there is invariably a phone call with the complainant to talk through additional details, gather additional evidence and start to plan a response.

Once that response has been settled, we may have recourse to several regulatory options, ranging from informal warnings to a perpetrator on a particular matter to applying to the Federal Court for injunctions. The latter are not options we have had recourse to because, as others have mentioned, much of our work is done very successfully at the informal end of the regulatory spectrum. There is a range of interventions we can have regard to, depending on the severity of the matter presenting.

I thank Mr. Dagg and I thank Deputy Mythen for the question. I thank the commissioner and Mr. Dagg for joining us. Much of what they have shared with us today could certainly be applied to our work in Ireland. The three Ps of prevention, protection and proactive change provide some really good pointers for us to use. I hope we can protect our citizens as Australia has done so well. We have just 5 million people compared with Australia's 26 million, but we have work to do to protect them. We certainly intend to deliver on that. I thank the witnesses for joining us from the other side of the world today. I thank colleagues for their participation in the meeting. I say goodbye to everyone.

Ms Julie Inman Grant

I thank the committee for the opportunity to speak. Good luck.

Sitting suspended at 11.27 a.m. and resumed in private session at 11.50 a.m. The joint committee adjourned at 12.05 p.m. until 10 a.m. on Wednesday, 15 September 2021.