General Scheme of the Online Safety and Media Regulation Bill 2020: Discussion (Resumed)

This meeting has been convened to hear from the Irish Society for the Prevention of Cruelty to Children, ISPCC, the Children's Rights Alliance and CyberSafeKids. It is the fifth of our public hearings to discuss the general scheme of the Online Safety and Media Regulation Bill 2020. I welcome the witnesses who will be joining us remotely via Teams: Mr. John Church, chief executive officer of the ISPCC, and his colleague Ms Fiona Jennings, senior policy and public affairs manager; Ms Tanya Ward, chief executive of the Children's Rights Alliance, and her colleague Ms Julie Ahern, legal and policy manager; and Ms Alex Cooney, chief executive officer of CyberSafeKids.

The format of the meeting is such that I will invite witnesses to make their opening statements, which will be followed by questions from members of the committee. As the witnesses are probably aware, the committee may publish the opening statements on its website following the meeting. Opening statements will be limited to three minutes each, if possible.

Witnesses are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of any person or entity. Therefore, if witnesses' statements are potentially defamatory in relation to an identifiable person or entity, they will be directed to discontinue their remarks. It is imperative that they comply with any such direction. As our witnesses are attending remotely from outside the Leinster House campus, they should note that there are some limitations to parliamentary privilege and, as such, they may not benefit from the same level of immunity from legal proceedings as a witness who is physically present does.

Members are reminded of the long-standing parliamentary practice to the effect that they should not comment on, criticise or make charges against a person outside the House, or an official either by name or in such a way as to make him or her identifiable. I remind members of the constitutional requirements whereby they must be physically present within the confines of Leinster House or the Convention Centre Dublin to participate in public meetings. I will not permit a member to attend where he or she is not adhering to the constitutional requirements. Therefore, any member who attempts to attend from outside the precincts will be asked to leave the meeting.

I ask members to identify themselves when contributing for the benefit of Debates Office staff preparing the Official Report and to mute their microphones when not making a statement to reduce background noise and feedback. I encourage members and guests to use the raise hand button when they wish to contribute. I remind those joining the meeting to ensure their mobile phones are switched off or on silent. Either would be great.

It is wonderful to have our guests here. I invite Mr. John Church to make an address to the committee on behalf of the ISPCC.

Mr. John Church

I thank the committee for inviting us. I am joined by Ms Fiona Jennings, ISPCC's senior policy and public affairs manager. We will both be happy to address any questions members might have on our opening statement or written submission as the session proceeds.

The ISPCC's sole concern with this legislation and the reason we are here is to ensure the protection of children online and to impress on members, as legislators, that this is within their gift. The ISPCC views itself as a child-centred technology advocate. It champions all the benefits technology offers, highlights and educates on the potential risks and harms, and speaks out about the gaps in the current regulatory approach.

Through our suite of Childline services, children and young people tell us at first-hand about the experiences of their daily lives. In the ISPCC's last operational year, the Childline telephone services received more than 265,000 contacts, of which we answered in excess of 206,000. Our Childline text and online service received nearly 35,000 contacts. Online safety and bullying, including cyberbullying and the impact it can have on children's mental health, featured across these services.

Today, I want to share the story of just one child. I want to demonstrate the real human impact of online harm and why access to effective and efficient remedies is crucial and warrants meaningful policy and legislative change, which could, in some cases, be life-saving. The child's story reflects the weekly interactions we have with children and young people and why cyberbullying is an issue that we must all take seriously. Kate is 13 years old. Her interests include gaming, hanging out with her friends, playing the piano and enjoying all the typical things a 13-year-old girl ought to enjoy. She enjoys posting videos of her piano-playing online, competing with her friends through online gaming, and using various apps to keep in touch with what is going on in their busy daily lives. At least she used to enjoy these activities but things changed recently for her. The in-school bullying she was enduring moved online, where lies were being spread about her by her so-called friends. Students were texting and calling her horrible names on the very apps on which they once enjoyed hanging out together. She sits alone in her bedroom, but not in silence. She is kept company by the constant pinging of her phone alerting her to the latest horrible thing being said about her.

When Kate contacted Childline, she told the call facilitator how she wanted to slit her wrists as she felt this was the only way to stop the cyberbullying. Imagine being in the shoes of Kate, a child who felt this was the only option that would make her situation better. She blocked some of the children who were saying horrible things about her and reported some of them to the various platforms and gaming sites she was being bullied on but nothing meaningful happened and the cyberbullying continued.

Cyberbullying can have a long-lasting and devastating impact on its victims. By its nature, it is intentional, targeted, repeated and persistent. Whereas in the past some victims of bullying had a reprieve at weekends and during school holidays, that is no longer the case when bullying moves online. It is likely that each example of cyberbullying that Kate endured, if reported in isolation, would not meet the investigation threshold of any platform or site. However, when considered together, the picture is very clear. The impact on Kate is very clear. It is in respect of cases like hers, a case of egregious cyberbullying, that the ISPCC impresses upon the committee the need for some mechanism to be available to children. A reporting mechanism that children such as Kate could avail of would go some way to rectifying the problem and limiting the harm being caused.

The ISPCC was impressed to hear the representatives of the Broadcasting Authority of Ireland, BAI, the future regulator in this space, state at last week's committee meeting that the protection of minors is central to the body's plans and efforts, while recognising that in a small number of circumstances there may not be the potential to resolve an issue effectively and efficiently within the provisions of the proposed Bill. It was further heartening to hear the witnesses from the BAI and the Data Protection Commission, DPC, calling on the committee to reconsider including a notice and take-down function under the remit of the online safety commissioner, seeing merit in such an individual complaints function.

There is a need for society to shift the narrative in how we speak about bullying behaviour. It is not banter, it is not a rite of passage, and it is not "just having the craic" where someone is deliberately and persistently being targeted over and over. We need to recognise it for what it is, namely, behaviour that can cause long-lasting harm. Our one ask of the committee is that it reconsider the concept of the notice and take-down provision as set out in a draft Bill proposed by the Law Reform Commission, LRC, in 2016.

In the most egregious cases of cyberbullying, children must have access to an individual complaints mechanism. While we appreciate there is now an urgency in getting this legislation passed due to the need to transpose the audiovisual media services directive, we ask the committee not to fall at the final hurdle and to remain committed to the protection of children online at this critical juncture. We look forward to any questions committee members might have.

I thank Mr. Church for the very powerful statement he has made. I believe all members of the committee listening here today would concur with everything he has said. It will certainly be our goal to achieve the principles Mr. Church has set out here today.

Ms Tanya Ward

I am delighted to have an opportunity to meet the committee in respect of this Bill. There are three issues we wanted to raise with the committee with regard to the Bill. The first relates to the naming of the online safety commissioner in the legislation. The legislation talks about appointing commissioners. I understand the Government's position is that the Bill does not say there should be an online safety commissioner because it wants the flexibility to appoint additional commissioners in the future if there is a demand. However, you could also consider the reverse. What if a future government decides it does not want to have an online safety commissioner? What if there were two online safety commissioners and a government decided to get rid of one of them? This is something about which we are concerned. We feel there is no real reason the legislation could not be amended to state that there will be at least one online safety commissioner. That would be a very important amendment and the Oireachtas needs to accept it.

The committee has heard from many organisations and stakeholders with regard to the individual complaints mechanism. I cannot emphasise enough how absolutely important this is. This online safety legislation will have failed if it does not include an independent individual mechanism for children and young people. They are up against the big, wealthy technology giants that control the majority of these platforms and do not invest enough in safety-proofing or in privacy measures. The emphasis should obviously be on making these platforms process cases in a timely and effective fashion. We are concerned that, without some independent mechanism, children will not have a right to a remedy. This is something provided for in law. A typical case about which we would hear from our members might involve a nine- or ten-year-old posting content that is harmful to them and then being bullied about it later in life, or other children and young people posting content that is harmful to them and being bullied on that basis. As we have heard time and again, these platforms are very slow and inconsistent in taking this kind of content down.

The last issue I will raise with the committee relates to the area of education. Members will know that children who are vulnerable offline are very vulnerable online. Most children do not have an adult at their shoulder when they are online. That is just the reality of the situation. What do we do? We have to make the Internet a safer place for them. We have to regulate it. The other thing we need to do is to focus on empowering and educating children to protect themselves online.

One of our concerns, which has come up through our member organisations, relates to the lack of consistency in online education in our schools and youth settings throughout the country. There needs to be a core curriculum, stipulated by the online safety commissioner. The commissioner should have a role in regulating and registering those who deliver this type of education to children and young people. It is only in this way that we can be sure that children are getting the toolkit they need to keep themselves safe online.

I thank Ms Ward. That was another very comprehensive and empowering presentation. I am looking forward to extracting a little bit more and teasing out the points Ms Ward has made, as are all members, I am sure.

Ms Alex Cooney

I thank the committee for giving us the opportunity to appear here this afternoon. CyberSafeKids is an Irish charity set up in 2015 with the aim of equipping children with the skills and knowledge to embrace the many opportunities for learning and enjoyment technology can deliver while avoiding the inherent risks they face in the online world such as cyberbullying, exposure to harmful content and online grooming.

We fully support the introduction of legislation that will change the landscape in Ireland with regard to online safety in general and particularly with regard to children, who are among the most vulnerable online users. We welcome the idea there will be an online safety commissioner with both the powers and the resources to make a difference if those powers and resources are well defined. The commissioner must be explicitly named in the Bill, however, so that the intention to create that post is made both clear and specific.

We welcome the proposal to have a central point of contact for a broad range of online safety matters and the fact this role will have oversight over education and public awareness, both of which are key strategies in equipping online users with the skills and knowledge to make informed decisions in the online space. We want to see the regulator’s powers further extended to include oversight and the powers to accredit online safety education programmes throughout the country.

One of the reasons we are here today is we all know self-regulation does not work. No other industry that wields such power and influence is left to self-regulate in the same way. This legislation needs to be meaningful and it must be effective. Our concern is that the stated desire to bring about systemic change will take years to effect and that dealing only with super-complaints will not make a real difference to ordinary people's lives.

This brings me to the part of this Bill that is lacking, that is, an individual complaints mechanism. It is essential such a service be available to Irish children and their guardians and that the law provide a vital safety net at a critical time. If children are being bullied or harassed online, and if they or their guardians have tried and failed to get this content removed from the online services in question, then there must be scope for them to access support and remedial action in a timely fashion.

Let me provide an example. Last year, a teacher from a secondary school in Ireland reached out to us out of concern about a 14-year-old boy in her class. He had recently moved to the school after being severely bullied in his previous school. The bullying was over some videos this boy had posted on a video sharing platform when he was much younger. When it had first happened, his mother promised to get those videos down by the time her son woke up the next morning. Unfortunately, despite her best efforts, she simply could not do it as he had lost his login details and she found no way around it. She reached out to the platform in question but it said the videos did not violate its community standards so it could not help. The videos stayed up for two years. The boy moved schools but became increasingly distraught about his new classmates finding this content and it all starting again. His teacher asked if we could help. We contacted the online service in question and argued the boy was under the minimum age restriction of 13 when he posted the videos. It took a bit of back and forth but all traces were removed ten days after we contacted the platform.

Imagine if that family had the option open to them at the time to take their case to an independent regulator who had the power to issue a time-bound takedown notice. It might have saved two years of living with the threat of someone seeing those videos and the worry of what he or she might say. It might have saved this boy having to move schools. Just to be clear, an individual complaints mechanism should not deal with every single case that arises. This is about putting in place an effective triage system so that it would be those cases, and only those cases, in respect of which all available channels with the relevant online service provider had been exhausted that would be dealt with. If this service were to cover the whole of Europe, it would need to be resourced accordingly, with European money as well as the proposed levy to be provided by the online services themselves.

I thank the committee for providing the opportunity to comment on this Bill, which has the potential to change the landscape with regard to online safety in Ireland if we get it right. I look forward to the members’ questions.

I thank Ms Cooney. There is an obvious common thread to the ask in all of today's statements. As I said, I do not believe any members of the committee would disagree with it. I thank all of our witnesses very much. I will now call our members.

I thank all of the witnesses for their testimony. I agree this legislation is incredibly important. As legislators, one of our core functions and responsibilities is to protect all of our citizens, particularly those most vulnerable of our citizens, in the public space, of which the online area is now part. We need to take action. I strongly agree with the testimony of the witnesses and with their view that we need specifically to name the online safety commissioner in the Bill and to resource the office properly. The witnesses' testimony has been very powerful for all of us because we have been talking to children who have been directly impacted by these issues.

We also heard from students from Kinsale and Tallaght last week. We have been engaging with young people. It is important that their voices are heard.

When we set up this new regulator, what mechanisms do the witnesses believe should be in place to allow children and young people to have direct input into the codes? Representative and advocacy bodies can represent them but it is important that young people's voices are heard. We heard from a number of young voices last week. How can we put that in place on a statutory basis?

Community standards have been touched on. As the witnesses know, we will meet with a number of the social media companies next week. What messages would the witnesses like to send to those companies with regard to changes to their community standards that the witnesses believe will make children safer? Part of this will be about what the regulator can do but it is also about what actions can be taken by social media companies.

Mr. John Church

That is a good question, which we ask ourselves all the time. We in the ISPCC have a number of children's advisory committees. We get 800 to 1,000 contacts from children every day. While that is many children and a mini omnibus in itself, we still like to have qualitative input from children. Through the ISPCC and other organisations, there would be a good opportunity to set something up formally. We would be happy to facilitate that. That would be my thinking on the first question.

On the Senator's statement about community standards and the platforms, we interact with all the platforms. We have an open, transparent relationship. We were talking to Facebook this week and know it will be at the committee next week. While what it is putting in place is admirable to some degree, with various tools and software to prevent this, one only has to look at the volume of images and harmful content that it takes down regularly to see the scale of the problem. If there was a question that the committee wanted to probe with it, it would be the aspect of shareholder value versus the protection of the child. Our concern in the ISPCC is that one is traded off against the other. We have quarterly meetings with such companies as Facebook, TikTok, etc. While super things are happening and the investment is admirable, it is nowhere near what it should be doing. We in the ISPCC believe that one cannot trade one off against the other.

There is a serious issue with end-to-end encryption. I do not know if the committee is aware of this but companies are all looking at implementing end-to-end encryption to protect privacy. That may unwittingly protect the perpetrators online. "Online trading" is a terrible term to use but online trading of child sexual abuse materials is a big business. I would like to see the platforms be able to do both. We landed a helicopter on Mars. Surely we can protect children at the same time as providing privacy.

I am sorry to cut Mr. Church short. Members are indicating. We are out of time. We will hopefully have an opportunity to come back on that.

I thank the witnesses for their presentations. I will quickly go through the questions. What do the witnesses envision as being needed by way of warnings and signposting for children online and for their parents and teachers? Social media platforms are using intervention tools much more. If somebody mentions something about Covid, an intervention happens. What do the witnesses think that social media platforms need to do in that regard, especially for children? Do they think that social media companies need to adopt child-friendly versions of their platforms with additional protections? I am not clear on the answer myself. We talk about education and teaching children how to navigate platforms but do social media giants need to adopt child-friendly versions of them? What do the witnesses think they would look like?

What resources would the witnesses recommend to legislators such as ourselves to inform us about the safety needs of children online? There are two parts to that question. Do the witnesses have international examples of either a reporting mechanism which has been deployed or what other countries have done, which we are doing our own research on? If there are examples of the reporting mechanisms, we can push for them to go into the Bill. What do we need to delve into further? I know that we have been sent a substantial amount by the witnesses to look at.

Ms Tanya Ward

On the question on signposting and warnings, we believe that children and young people should not be able to access certain types of harmful content online. This includes information about suicide, images of child abuse, violence and harmful pornography. We should see platforms employing techniques, algorithms and artificial intelligence to prevent children from having access to that content. We should make them accountable for allowing children and young people to access that content. If one goes outside the front door, someone there is responsible. Children should have the same experience but some things should be off-limits to them.

On special platforms for children and young people, when those are created, they are simply new products aimed at children and young people. If there is a chance to meet these companies, one should raise the profiling and targeting they are engaging in with children and young people when it comes to advertising. Children and young people are impressionable and it is having a significant effect in many areas, especially with regard to alcohol and junk food. Cosmetic treatments are another issue we hear about from members as well. Many young girls are blasted with advertisements and it distorts their self-image.

I know Ms Ahern did not get to speak yet. I welcome her to the meeting.

Ms Julie Ahern

On the last question the Senator put forward about resources and international examples, one international complaints mechanism to look at is definitely Australia. It has an online safety commissioner and a two-tiered system. Tier one is where companies can opt in and tier two is where they are mandated. One could look at this as an example of roll-out. With regard to resources for children, the Council of Europe has developed some great guidelines on children in the digital environment, which take into account both children's rights online and the need to empower them to be online, and also what to do with regard to remedies, fair procedure rights and their protection rights. The UN Committee on the Rights of the Child recently developed a general comment. That was done in consultation with children and young people. It was published earlier this month. It is well worth a look with regard to language that the Oireachtas could adopt in the Bill as it stands.

Ms Alex Cooney

To add to my colleagues' points with regard to signposting, warnings and so on, this is an opportunity for public awareness campaigns along the lines of road safety campaigns, where we have seen significant investment over the years in public awareness campaigns which make a difference. We should do much more on that.

I hope that would come under the remit of the online safety commissioner's office.

Regarding child-friendly versions of different online services, we must be a little careful there because, inevitably, children want to be where their peers are, and often where their older peers are. One would really struggle to keep them on these supposedly safer platforms. I have heard them described as training grounds. That is right. I would exercise some caution there and I would focus more on issues such as age verification, looking at safe ways of identifying whether it is a child who is accessing a particular platform and what safeguarding measures should be put in place to protect that child.

I call Senator Warfield. I see he got his hair cut since we saw him yesterday.

All those stories are gone. There is a little delay, but I will proceed. Do the witnesses think the Garda would need to be resourced with regard to online safety following the passing of this Bill? It was suggested by one of the social media companies that we split the Bill between media regulation and online safety. Do they have a response to that?

Ms Jennings is offering to respond. I welcome her to the meeting.

Ms Fiona Jennings

I am happy to take the second question, on splitting the Bill. We see the points that are being made in that regard. There is an urgency to transpose the audiovisual media services directive, but splitting the Bill would mean removing the national legislative proposal, the online safety commissioner, which we are discussing today. We have major concerns about that. We have been campaigning for this for some time and we feel there is a great deal of good in the current proposal. However, the glaring absence, and we are making this point today and it has been made previously at committee meetings, is the lack of an individual complaints mechanism. We are informed by officials that they are having conversations with the EU Commission and they appear to be confident that it is aligning with what the Digital Services Act is proposing. The Digital Services Act proposes a much more robust complaints mechanism. If they are to align and there is minimum disruption when the Digital Services Act comes into place, we probably need to pay a little more attention to that. We certainly ask the committee to pay more attention to what is happening in that regard as well.

In respect of the Garda and resourcing for the Bill, briefly, we know that criminal content will not be a significant aspect of the Bill. However, from our relationship with the Garda National Protective Services Bureau, it always needs additional resources, whether in personnel or in the technologies it uses to speed up that process. We certainly would not argue, and I doubt that the Garda would argue either, with the need for additional resources there.

Just to let Ms Jennings know, her screen has frozen. She might just switch off and switch on and, hopefully, we will see her in real time again. Does Ms Cooney wish to comment on this?

Ms Alex Cooney

Yes, briefly. On the first question about resourcing the Garda, I echo Ms Jennings's point. We hear very inconsistent reports from members of the public about reporting, for example, bullying cases, to the Garda and the advice they receive. Training and supports are definitely required, especially for the gardaí who are dealing with the complaints as they come in, to ensure that much more consistent advice and support are given and, hopefully, there is the opportunity to minimise harm to a child.

I also echo Ms Jennings's comments on the second point. We would be very concerned if the online safety aspects of this Bill were moved out in order to wait for the Digital Services Act to come in. That would be an unnecessary delay.

Ms Julie Ahern

To add to my colleagues' points about the Garda and the need for additional resources, the other issue to consider after the Bill is passed is the need for some type of protocol between the online safety commissioner and the Garda so they can communicate effectively. With regard to what Ms Cooney said about children going to the Garda and receiving inconsistent responses, we hear that all the time from our members too. What needs to happen is that the Garda must know how it can interact in a way that ensures it does not breach GDPR in contacting the online safety commissioner, and to have a professional protocol in place whereby they can both notify cases to each other. That would help to filter out cases that might go to the online safety commissioner but would, perhaps, not be in the commissioner's remit and could be referred back to the Garda, and the other way round. That is in addition to what my colleagues have said.

With regard to the intention to split, we would be incredibly concerned about that. As the committee knows, the audiovisual media services directive at European level has taken approximately ten years to get to the current position. If we have to wait another ten years, who knows what harm could happen to children online?

I thank the witnesses. That is very informative.

Senator Cassells has five minutes.

I welcome the witnesses. Everyone can agree that the past two days of hearings for the pre-legislative scrutiny of this Bill have been the most insightful and important because of the witnesses involved, with the Ombudsman for Children and the special rapporteur on child protection yesterday and the representatives here today. We are moving beyond the headline-grabbing issues into the depth of the matters. Mr. Church touched on-----

Senator Cassells, I am just letting you know we cannot see you. Perhaps your screen is switched off.

Is that better, Chairman?

Yes, that is perfect.

Mr. Church referred to the issue, even with the terrible example he gave, that this is the platform on which young people communicate. The problem is when things go sour. The question is the extent of regulation vis-à-vis the protection of children's rights. As we saw last week with the children from Kinsale and Tallaght who gave testimony to the committee, they expressed concern about over-regulation of the area. It comes down to the fine lines. I refer to the point Ms Ward made. Obviously, there are identifiable threats in terms of pornography and other issues, but she also spoke about the targeted advertisements by cosmetics companies, especially at young girls, which in many cases can be equally destructive to their self-image and self-worth, and can equally lead down a harmful path. Where do we draw the line as to what material can be very harmful to young children? This point was made yesterday by the special rapporteur on child protection regarding harmful online content, its dissemination and its identification. It might be easily identifiable with things such as pornography, but it is not as easily identifiable in a swathe of other areas and is equally destructive. What are witnesses' thoughts on how that can be addressed?

Ms Fiona Jennings

I saw the testimony of the young people from Tallaght and Kinsale. It is 2021 and they said things that we have been listening to for the past number of years. I totally agree with them that we should not regulate for everything. It is not necessary. However, we need to regulate for the right things. That is my first comment. In that respect, education has a massive role to play.

When we talk in the ISPCC about the different types of harm, we find the four Cs very useful: content, where the child is a recipient; contact, where the child is a participant; conduct, where the child is an actor; and contract, a newly added fourth category. The special rapporteur touched on the latter yesterday in respect of online gambling. There is a plethora of online harms, so the first question we must ask ourselves is what we want to regulate and what needs to be regulated versus where education plays a role in this as well. We are all online now, we are a lot more proficient online and our lives have moved online.

Adults as well as children and young people need to have a better understanding as to how content is served up, how we can manage that and how we can protect our privacy online. To summarise, we need to understand what we want to regulate, and why, and where education can be brought in.

I call Ms Ward.

Ms Tanya Ward

There are two ways to do this. One is that one makes certain websites not accessible to children of a certain age - this is for the very serious harmful content - and the other is to have legal restrictions in law. We already have, for example, a ban on advertising to children under the Public Health (Alcohol) Act. That ban does not exist in relation to the online world. Legislation could be passed that regulates and imposes bans. One could say, for example, children under the age of 16 should not be exposed to X, Y and Z and one could specify exactly what the technology providers should take on board when they are doing that. There are many examples in other countries where this is being looked at. In the US, there is an organisation that campaigns for a commercial-free childhood, in that children who are online are given a space where they are not being profiled, where their data is not being sold on and where they are not being hammered by inappropriate advertisements.

Could Ms Cooney respond in a few seconds? I know it is not fair that she has to come in at the tail end of the five-minute slot.

Ms Alex Cooney

In terms of the regulation versus protection discussion, we also have to take into account the age of the child. Childhood encompasses such a wide range and regulation needs to address the evolving capacity of children. We mainly focus on children under the age of 13 and we are looking for a high level of protection whereas older youth, as one can imagine, do not want their rights to privacy infringed on. We have to take into account the ages. We need to look at measures such as minimum age restrictions. We need to look at how all of that is managed by the online services and the kind of safeguards that they put in place for different age groups.

Deputy Fitzpatrick has five minutes for questions and answers.

As a parent and a grandparent, there is always a fear factor when one's children or grandchildren go online or on any kind of social media. One is always afraid to ask them any questions, such as "What are you doing?", "Who are you talking to?" and "Can I help you?". The child gets very protective and kind of shouts and roars at one. The child feels one does not trust him or her. The bottom line is that as parents and grandparents we just want to protect our children or grandchildren and to know that they are safe online.

I am very interested in these individual complaints and I would appreciate if the witnesses could elaborate on what the procedure is there. Will making individual complaints protect our children, especially from bullying and harassment? Can the witnesses explain in simple terms how that works? They say the regulatory powers should be extended across the country to include oversight. These are matters that I want them to elaborate on.

Every time I go to a committee, no matter what it is, everybody always mentions that we have to educate, and Ms Cooney mentioned education. What will help with more education? To be honest, I want to be educated because I do not want to look stupid when I am talking to my children and grandchildren. Could Ms Cooney elaborate in answering those few questions?

It is over to Ms Ahern first.

Ms Julie Ahern

I will take the question around individual complaints mechanisms. On what an individual complaints mechanism should look like, the complaint should, first of all, go to the platform. If the child has a problem, he or she should first go to what would be the local complaints channel on whatever platform it is and make the complaint there. If one is not happy with the result, or the platform has not taken the complaint seriously or has not done anything about it, then one would make the complaint to the online safety commissioner. That is how we would see it happening. There would be two tiers and, effectively, the online safety commissioner would be a safety net for when the platforms do not do their job. It would not be that everyone with a problem would go straight to the commissioner. It would be for when they do not get a result from the platform.

In order for it to be child friendly, it would need to be prompt, it would need to be accessible and it would need to provide information that children and families can understand to access the complaints mechanisms. What we should be doing is putting it back on the technology platforms that their complaints mechanisms are effective in the first instance. That is what the online safety commission should also have a role in doing - making sure that they do it right first and then only that which is most egregious comes to the online safety commission. An example that the committee could look at would be something like the Ombudsman for Children, where people have to make a complaint to the body concerned first and then go to the Ombudsman for Children, who sees whether it meets a certain threshold.

I thank Ms Ahern. Is Ms Cooney next?

Ms Jennings is to come in next, if that is okay.

Ms Fiona Jennings

I thank Deputy Fitzpatrick. I suppose, first of all, I would suggest that he take a little pit stop at our new digital hub which was specifically created for parents, grandparents and carers. I will send the Deputy the link directly afterwards. Mr. Church might talk a little more about that.

To complement something that Ms Ahern was saying around the individual complaints mechanism, it is important to remember that the Audiovisual Media Services Directive, AVMSD, contains special provisions for the protection of minors and it is important that that is reflected and provided for in the online safety and media regulation, OSMR, Bill, exactly as Ms Ahern has outlined. I am sure it will come up again throughout this particular session.

For the hub, I will pass it on to Mr. Church for a moment.

I call Mr. Church.

Mr. John Church

I am going to contradict Ms Jennings as I am not going to talk about the hub. There is a very important point, following on from Ms Ahern there. I note the Broadcasting Authority of Ireland, BAI, has been before the committee. We have interacted with the BAI and one of the major concerns here is the scale of being able to handle complaints. There is a fear, which, in my view, is a myth, that they will be inundated with complaints and will not be able to handle them. The population of Europe is 750 million and 450 million people are online. If one starts with numbers like that, it scares the hell out of one. One thinks we cannot handle this. Ms Ahern's point was key to this. When the platforms' complaints mechanisms are in place, by the time the most egregious cases come to the regulator they are actually small in number, but they are significantly damaging to that individual and irreparable. We have had this good conversation already with the BAI and its fears were somewhat allayed by what we said. If one looks at Australia, the numbers reported in its annual report are very small compared to the population of Australia. It is very doable. We should be getting it right in Ireland, which is absolutely what we should be doing; we should be leading by example here and being the leaders in Europe on this one. It is very doable within the resources of, or even maybe with some extra resources within, the BAI. That is a key point to allay fears around scale.

I thank Mr. Church. I will bring in Ms Cooney briefly.

Ms Alex Cooney

I thank Deputy Fitzpatrick for his question. I would completely agree that we need much stronger education measures in place because one of the big challenges is how we can make better decisions online. Many parents are allowing children to access online services well below the minimum age of consent and we need to do much more education there to equip parents to make good decisions in the online space and to recognise the responsibility that comes with a child being online. We welcome the brief mention of responsibility for public awareness campaigns that is in this Bill and we would like to see that expanded on and taken seriously because this is a great opportunity to do education. The Deputy is not alone in feeling he needs information. That is a common feeling at present.

I thank Ms Cooney and Deputy Fitzpatrick for his questions. I am moving on now to our next speaker, Deputy Mythen.

I thank all the witnesses. It has been a useful engagement. Obviously, I am appearing last and many questions have been asked, but I would like to explore the situation with the community standards of the service providers. They should be published, we should be able to get at them where they are published and we should be able to challenge them. We have heard the witnesses. A designated online safety commissioner must be named. Also, the notice and take-down provision must be provided for. I would like to explore the mechanism further.

How do the witnesses think it will work? Would a child need parental consent to make a report? How are anonymous complaints made?

I saw the disturbing figures from the ISPCC. This pandemic highlighted the situation regarding abuse, grooming and bullying and we see that coming out more. I do not think half of that would have come out only for the pandemic. It is the underbelly of the whole situation. Is there a breakdown of those data on bullying, abuse and grooming? Such data would strengthen the case for an online safety commissioner to be named.

Mr. John Church

It is a good question. We have the information and are more than happy to supply it. During the pandemic, we saw a 25% to 30% spike around March when schools closed down. It certainly highlighted domestic violence, which was there already. Over time, it caused much more anxiety, stress and loneliness among children. Things have cooled down, but not to pre-Covid levels. We would be happy to provide the data. There are more than 400 profiles we categorise, so we are easily able to do that. One point is that children do not always ring and say they are here to report cyberbullying. They ring with a major concern and are tense and anxious. It is only through confidential discussion with a volunteer that it comes out that the issue may be an online one.

Ms Julie Ahern

On community standards, one of the big issues is that they are very different across different platforms. One of the key things the online safety commissioner will have the power to do under the current general scheme is set the codes of conduct. We would like to see children and young people consulted on those and their voices heard. They know the Internet better than we do and they know what needs to happen.

On the making of complaints, there is an online safety commissioner in Australia and one has been established in Fiji. Both of those allow for parents, children, schools, principals and head teachers, gardaí and other people to make complaints about different issues going on online. It is quite broad in terms of who is allowed to make a complaint, and then they look at the merits. We can send on information on how those systems operate on a practical level, if that would be of help.

That sounds very useful.

Ms Alex Cooney

I echo Ms Ahern's point about the lack of consistency across different platforms. That is something to consider with the community standards. Responsiveness is the other issue. We hear from children who say they have reported things and there is no response, which is not acceptable. Things like that should be monitored.

The idea that you could make an anonymous report is important. Many parents are reluctant to go to the Garda and things like that with issues because they do not want children being criminalised or to make something seem bigger to a child. We have to consider all the sensitivities here.

I cannot see why we cannot use the Public Health (Alcohol) Act as an example. We have the junk food industry as well. As witnesses said, if we allow service providers to have their own standards, they are there for profit, to make money and to exploit. They will exploit children. I have grandchildren and it is worrying. Mr. Church spoke about a 13-year-old with Kate's story. I have grandchildren aged six, seven and eight who are using this stuff. The first thing they are bought as a toy is the digital stuff. They are learning all the time. We all have to face this, which is relatively new. Young people are growing up with this stuff and see it every day of the week, with little remote toys. You see the system; there is a mechanism whereby that content is put in front of them and influence is exerted through it.

I was in the Chamber and am only getting in on this committee now. Apologies if I ask questions that have already been asked. I have a question for each of the group representatives.

I ask the Children's Rights Alliance regarding the individual complaints mechanism. We have heard over and over the matters raised at committee and it seems to be one of the main weaknesses in the general scheme of the Bill. Many witnesses have said such a mechanism is vital if it is to be effective, which I can understand. I see in Children's Rights Alliance's statement that it would like to see a number of remedies, including compensation. Will representatives of the organisation elaborate on the remedies it would like to see?

Ms Julie Ahern

On remedies, we were looking at the best practice in terms of children's access to a proper remedy under the European Convention on Human Rights and under Irish law. The Council of Europe, in its guidelines on the digital environment, recommended there should be a suite of remedies available to children and young people, the main one being effective take-down procedures, with other options available for different instances. This is so it is not confined to one thing and there are other remedies available, like, as the Deputy mentioned, compensation. In broadcasting media as it is now, if somebody is defamed, they have the option of getting compensation, a retraction or a right of reply. It would be analogous to that, so that if you were defamed or something was said about you online, you would have those opportunities. It is about looking at the suite of things that can happen and the suite of remedies available. There is more information on that in the guidelines coming from the Council of Europe. It looks at children's rights under the European Convention on Human Rights and domestic legislation.

In CyberSafeKids' submission and that of the Children's Rights Alliance, the matter of children under the age of 13 signing up for social media accounts was discussed. Will the representative from CyberSafeKids comment on the minimum age for children signing up for social media accounts? For example, is 13 too young? Second, does she have any thoughts on safeguarding children from being able to sign up to accounts where they are under the minimum age?

Ms Alex Cooney

The minimum age restriction on most of the popular online services is 13. On some of them it is 16. From our data, we see that many children under that age access those services, some as young as eight and nine and in some numbers. Clearly, the restrictions that are in place do not work. We need to look at that and there may be other ways around it. We do not want children to have to hand over more of their data to determine their age. We want to see measures such as age-estimation scanning where no data is held, anything collected is immediately deleted and nothing is passed to the online service beyond the determination of whether the user is a child. There are other ways of determining whether it is a child, such as the language they are using. There is a lot more we can consider to ensure children are protected from harmful online content, not just from social media platforms that are not suitable for them.

On the right age, we are asked this all the time. It is very difficult to give an exact age. Thirteen is the age that is generally used. It is also used in US legislation, but we would say to parents who are considering allowing their children to sign up for these services that it depends on the maturity of the child and the parents' ability to support and engage with the child on this.

We need much more involvement from parents and carers in children's online journey, especially when they are under the age of 13 so that they are well prepared for the online experience.

The ISPCC obtained a legal opinion on whether an individual complaints mechanism is mandated by the revised audiovisual media services directive. I ask for a summary of the main issues raised in that opinion.

Ms Fiona Jennings

Yes, we did. It relates to how the audiovisual media services directive is being interpreted regarding the protection of minors. I would have no problem in providing the Deputy with the full legal opinion. The directive makes provision for the protection of children and young people online. We feel this has not been reflected in the online safety and media regulation Bill and it ought to be there. It specifically comes under article 28(b)3.1. It follows what we have been saying, which is that where those impacted have reported or flagged the content, they are able to access an individual complaints mechanism. I would be happy to provide the Deputy with the full legal opinion.

That would be great. I thank Ms Jennings.

I thank the witnesses for their statements and their further commentary on this matter. Members feel very passionate about what we are trying to do. We want to create something that is robust and has the effect of being a trusted option for children and young people. Mr. Church spoke about regular meetings with the tech giants such as Facebook, TikTok, Twitter, etc. I am sure he has raised many of the issues discussed today such as delays and inconsistency across the platforms about taking posts down and how difficult it can be for parents to access or make contact with these people in the first place. As I am sure Mr. Church has made these points known to them, what has their feedback been?

Mr. John Church

That is an interesting question. I am also a member of the national advisory committee for online safety, which is Government appointed. Our belief certainly is that we are better off having the tech giants, the platforms and industry around the table so that we can have open conversations. As much as we want to talk to them, they want to talk to us - perhaps ticking the box to show they are doing what they can. Ultimately, we believe that certain private shareholder-value-driven organisations do not put child safety first; they put the shareholder first. Ultimately, as it does today, this boils down to the child. Our responsibility is to protect the child. We commend some of the stuff they are doing, but it is just not good enough. It is merely the tip of the iceberg. With the sort of money the tech giants are making, it should be possible to protect children.

It sounds to me as if Mr. Church is just getting lip service; he did not say that, but I did.

Mr. John Church

I did not say that.

Mr. John Church

At the end of the day, they have a job to do and we have a job to do. We will do whatever we need to do to protect the children. They will do whatever they need to do to protect shareholder value. Maybe sometimes there is no middle ground.

It sounds like that.

Ms Tanya Ward

The principal problem is that taking more proactive measures interferes with their business model, which is something they are just not interested in. As Mr. Church said, it means their profits would not be as high because they would need to plough them back into safety proofing and privacy proofing.

One of the other big problems is that they put all the emphasis and all the responsibility on parents, which is just not acceptable. Parents cannot be with their child at every point in the day when they are in the online world. They cannot be outside the front door either. The focus needs to be put back on them to make the Internet a safer place. For a long time, they have got away without any formal regulation. We know the harm that is being done. We know they are making enormous profits. The time is now right to start that journey to having regulation. We also need to get the balance right with the protection of human rights alongside that.

It is no exaggeration to say that lives have been lost because of social media.

Ms Alex Cooney

I echo the points made by the previous speakers. We have raised the issue of encryption with Facebook, for example, which is proposing to use it more widely. There are challenges and risks to child safety regarding encryption. It is very difficult to get clear responses on those specific issues. Our job is to keep raising those issues and keep them in the public eye to ensure that we are all clear on the risks involved there. They could take some simple measures, including, for example, making all our accounts private by default. These are simple measures that could be taken across the board and would be really effective in safeguarding users. So many kids that we talk to simply do not even think about it. It is not a thing they think about when they set up that account. Then they are sharing personal information with potentially a wide audience. These platforms are often designed for us to share personal information. I totally agree with Ms Ward's point about putting more onus back on them. That is what this regulation should do and there is much more beyond that.

We have heard of Facebook's intention to start up a new platform for Instagram kids. I ask Mr. Church to comment.

Mr. John Church

I will probably hand that over to Ms Jennings. Ms Cooney made a good point earlier. Anything that is targeted directly at kids is like the alcopop for platforms. That is my personal view on it. It is just a form of a private platform trying to recruit for future sustenance.

Ms Fiona Jennings

We are now looking at specific platforms for children. A few years ago, we were talking about a specific internet for children. As somebody mentioned earlier, our concern would be the people who may be hanging out on these platforms and how they could be open to abuse. I do not know much more about Facebook's proposals for Instagram for kids. We already have YouTube Kids and perhaps it is following in the same vein as that. For all the reasons that have been discussed, we certainly would be concerned about something that is specifically aimed at children. We would be looking at it from the protection side of it and what the implications would be.

Ms Alex Cooney

Snapchat had a version for kids. None of the children we spoke to, the eight to 13-year-olds, wanted to be on that platform; they want to be on the version they know their older peers are on. The challenge will always be in ensuring kids actually use the child-friendly version, which in effect serves as practice for future social media use.

We have time for a second round. I call Senator Malcolm Byrne.

I ask Ms Ward and Ms Cooney to comment on how we can specifically build in a voice for children and young people when the new online safety commissioner is set up. It is important that they have influence, particularly over the shaping of the codes that will be drawn up.

I want to talk about advertising and micro-targeting of children and young people. At present the BAI has quite a good code on advertising for traditional broadcasters.

There are some flaws in it, but it is pretty comprehensive. I do not see any reason we cannot fold that into a new code. In my view, we should look at trying to prohibit most forms of advertising and microtargeting aimed at children. We should allow children to be children, essentially. I would be grateful for the witnesses' views on that.

My final question comes back to the issue of separating the implementation of the Audiovisual Media Services Directive from the safety commissioner and the media commission. I take the view we should set up the media commission and the online safety commissioner. They will be flawed and there will be problems from the start because technology is changing so quickly it will not be able to cover everything. Quite frequently, we set up regulators and let the legislation catch up. We need to deal with the issues around some of the principles. We will be dealing with the use of biometrics very soon, particularly TikTok and its use of biometric data. We will be dealing with that issue and trying to address it. I am interested in hearing the witnesses' views on what the core principles of the online safety commission should be when we set it up. It will be very difficult for us to try to constantly future-proof against every technological development.

Ms Tanya Ward

It is quite simple to involve young people and children when developing codes of conduct and other programmes and mechanisms the online safety commissioner would be obliged to deliver and produce. An amendment could be included that imposes a legal obligation on the online safety commissioner to consult with children and young people on a corporate basis. That would inform any of his or her actions in this space. There is a plethora of organisations with expertise in this area. The Government has its participation unit and the Ombudsman for Children's Office has its own participation work. There are many ways that could happen. That is the best way to ensure children and young people are consulted in any of those mechanisms.

I completely agree with the Senator on profiling. There was much talk about the digital age of consent a number of years ago, which just fell well short. We should have been talking about why we could not ban the profiling of young children's data and its commercial exploitation. Why can we not do that? That is what the conversation should have been about instead of this mock mechanism that was meant to protect children's data but, so far, has not done a very good job. We certainly support and back any amendment in that area. The Irish Heart Foundation got an amendment carried relating to that data protection legislation but, unfortunately, it was not brought into force due to the European Commission.

The Senator's party in particular has talked about the international space and what is happening at EU level. Some of these matters will have to be dealt with at EU level because one of the principles of the general data protection regulation, GDPR, is that there is meant to be equality in the marketing of products. When Ireland tries to impose a better standard in that space, we will probably come into conflict with EU law. That is something that might need to be dealt with at an international level. Different attempts could be made on that when legislating for the online safety Bill and online safety commissioner.

I am concerned about any delay in the establishment of an online safety commissioner. As Ms Ahern and Ms Jennings said, it can take ten years to develop EU law and much experience will be missed. It is also very important to imagine the impact the establishment of an online safety commissioner in Ireland will have on the rest of the European Union and the shaping it will do at EU level. Ireland could play a massive global role in keeping children safe online and I hope we go in that direction.

I thank Ms Ward. We fully intend to do that.

Ms Alex Cooney

I echo all those points. On the reference to the digital age of consent introduced three years ago, we know that despite all the debates at the time about the appropriate age, with different European countries opting for different ages while we opted for 16, ultimately it did not really matter, because the companies collect the data anyway. They just do not use consent as the basis to collect it. Professor Sonia Livingstone in the UK and others have done some very good research on this. These things need to be meaningful. If we are to spend all this time, effort and investment of resources, the measures need to work. I agree that we should be protecting children's data and not micro-profiling them. TikTok introduced measures this year committing to not micro-profiling 13- to 16-year-olds, so the same should apply on any platform used by a 13- to 16-year-old. We should be seeing these measures much more widely utilised.

I agree with the points about consultation. There should be many opportunities built in for young people to have their say. Again, we must look at the different age ranges. Older young people will have a different view from those under 13, but we need to understand all of it.

I will allow a little latitude because we are into the second round.

Ms Julie Ahern

I will come in very briefly on the splitting of the two parts of this issue and the concerns outlined by the Senator about the pace of legislation and how we will struggle to keep up. Establishing an online safety commissioner now, even if it is not perfect, will effectively be the Government's biggest asset in keeping up with changes. This commissioner will be specifically tasked, under the legislation currently proposed, with keeping track of what is happening and with advising and putting forward solutions. Setting up a commissioner now would be the biggest asset in trying to future-proof legislation, knowing what is coming down the line and how we need to go forward. This is on top of what Ms Ward mentioned, which is that we can be the global leader in regulation.

Ms Jennings brought up a question about all accounts being made private by default. What should be explored, and is not explored much, is the principle of data ownership. That is where we should focus some of our expertise. There are great conversations happening now in America about ownership of all sorts of items, including the ownership of one's own data and not allowing service providers to use that data. We should develop some sort of system focusing on ownership. The ownership of data is the most important issue because data is what service providers build on and run their algorithms on, and so on. We should all have a conversation on the ownership of data to ensure it belongs to the rightful person. Can anyone elaborate on that?

Ms Alex Cooney

That is a very interesting point around data ownership but we also need more education, particularly for children around what their data is and its value. We know many children will value popularity more than privacy and do not really understand what it means to give their data away. We have much work to do to educate, not just children, but online users in general around the value of their data, how they can best protect it and what measures they should be asking for in order to get better protections of their data.

Education is one thing but, if we get into the nuts and bolts of it, the nub of the issue is the passing on of ownership. That is where all the service providers gain. The little button is pressed and that is it, they can sell the data across the board, globally, and make lots of money out of it. I would like to see the issue of direct ownership of data explored.

Ms Tanya Ward

One of the issues is that many of the platforms we access are free. How do they make their money? By exposing us to microtargeting and advertisements, or by selling our data on to a third party. We probably need a bigger conversation in our society about that. If one wants to protect data, it may be necessary to pay a small subscription fee. It could be very small, just enough to ensure that data is not being exploited. The issue probably is that many people do not realise their data is being exploited in the way that it is. As Ms Cooney said, we are not educating people at a young age about what is actually happening to their data and their data profile and how it could be used against them in later life. Certain behaviours online could be used against them by insurance brokers, etc., because those behaviours might predict that they will develop a form of cancer from drinking, eating certain foods or whatever. There are many things out there we need to be very concerned about, but part of the debate should be about whether we pay or do not pay. That is probably what we all need to talk about.

My apologies for missing the opening of the meeting. There was a clash and I was at another committee. I have read the opening statements and there are some insights I would like to get from the witnesses on education. I apologise if other members have touched on this.

The witnesses may have been told about the presentations we were given last week by two secondary schools, when the students very eloquently and intelligently outlined what they would like to see from this Bill and what they would like the Bill to do for them. They were talking not just about the Bill but about overall policy on online platforms, social media, cyberbullying and safe online behaviour. One of their points related to the fact that there is a bit of a disconnect with teachers and parents of an older generation who may not have grown up in the world of Instagram, TikTok and Facebook, so there may be a lack of understanding. They presented the excellent idea of having advocates and activists from within their peer group for safe online behaviour and to talk about issues such as cyberbullying, the posting of harmful content and so on. I apologise again if this has been brought up already but I would like to have a comment from the witnesses on that excellent idea.

Ms Fiona Jennings

I thank the Deputy. I am delighted to come in on this issue. I thought those four students from Tallaght and Kinsale were just fabulous last week. I think I speak for everybody here when I say we all had a little smile on our faces when they were commenting on their experience of online safety education.

In 2017, with the support of the Vodafone Ireland Foundation, we were in a position to hold a consultation with more than 100 young people on this exact point of the experience of online safety education. A couple of things came out of that: first, there was a huge disparity in the quality and frequency of the online safety education received and, second, they felt online safety education should be on the curriculum and viewed as a life skills subject.

It is probably a broader discussion outside the role of the online safety commissioner but I can understand it when they talk about being lectured about this. We can often feel lectured when something comes up randomly and sporadically. If we really want our children to be digitally confident and we want them to have a good understanding of online safety, we need to embed it across the curriculum. I would even suggest messages at a preschool level. I was recently reading a book to my daughter, who is five, and I was surprised to see there were messages in the book on how to keep safe online. Deputy Mythen talked about his grandchildren and the smart toys they have, and the education that is needed around that. On the back of that, for parents, grandparents and carers, again with the support of the Vodafone Ireland Foundation, we were able to develop a digital-ready hub, which has a plethora of resources covering issues such as digital resilience, privacy and security, online harm and cyberbullying. Again, I can share the link afterwards.

I can totally understand where they were coming from. If it was more embedded across the curriculum, we would have many more digitally competent students at the end of it.

Ms Tanya Ward

Ms Jennings made the point about the early years. The Council of Europe and the UN Committee on the Rights of the Child recommended that online education should start in the early years, but they also said that small children should be protected from technology. Therefore, giving a two-year-old or three-year-old child an iPad can actually be harmful to them, and that is the other message we probably need to get out there. We need to develop their digital competencies to get the best out of the online world and we need to start at an early age, but we also need to protect them from overexposure at a very young age.

Ms Alex Cooney

I echo the point about the need to have consistent and obligatory education available to students at primary and secondary level. At the moment, much of the content that is available is dated and is up to 20 years old. Clearly, a lot has changed in this landscape in the past 20 years and we need to make sure the information is current and in line with young people's experiences.

We also need to ensure that teachers feel equipped to teach it. When we ask teachers that question, nearly half of them say they do not feel comfortable and do not feel equipped to teach this kind of content, so that is another thing to consider in ensuring that children and young people are educated.

Of course, ideally, we also offer this education at home, so we are equipping parents and carers with the skills and knowledge to have good conversations at home because the best preparation will be had there. It will be starting from an early age and it will involve the whole range, not just the negative stuff but the positive uses of technology, clear boundaries, trying to normalise the conversation and ensuring that if a child ever comes across something that makes them feel uncomfortable, they have a trusted adult they can approach to talk about it.

I have one final question. Everyone on this call wants technology to be used safely, particularly among our children and young people, and this is very important legislation. I want to ask the witnesses what success would look like and what will make a difference for us. When we have implemented this legislation, when the media commissioner is in place and when we have the online safety commissioner, what will success look like for those young people we met from Kinsale and Tallaght?

Ms Alex Cooney

Success will look like a strategic, centralised approach to this, where we have industry, policy and public awareness coming together to really promote good social norms around safe online use. To make the analogy with road safety again, we would have that industry buy-in, good policies and legislation in place and the public awareness to ensure that the public is coming with us on it. That really does make a difference, and it has made a difference. What I am looking for is a situation in which good social norms are in place.

Ms Fiona Jennings

As to what success would look like, if the committee has children and young people in before it again, members should please ask them that question because they will get a very interesting answer. From the point of view of the ISPCC, it is meaningful and robust regulation, with industry support and public awareness and engagement. If we could complete that circle and have education in there as well, that would be a good day's work.

Mr. John Church

I want to be very specific on this one. The day we go from 40 to 50 calls a week to Childline to zero calls, that is a success. It is when we are getting no more calls on cyberbullying and we no longer have to appear in front of committees to discuss legislating for it because it is a fully legislated world.

Ms Julie Ahern

I fully agree with what my colleagues said.

For us, it really is that robust regulation whereby there is a regulator in place that is fully resourced and equipped to look after setting standards but also taking complaints from children and young people when the platforms fail them.

I thank Ms Ahern. I know Deputy Munster wishes to come in. I understand Senator Carrigy is on the call. Does he wish to come in on this issue?

No. My apologies to the Chairman. I did not think I would be able to make it to the meeting. I just got here by the skin of my teeth. I will look at the recording of the meeting at a later stage. I do not want to go back over issues that have already been covered.

That is no problem. I call Deputy Munster.

It has already been touched on but I want to ask about education and community awareness programmes. The witnesses have relayed their views in that regard. It was interesting to hear that the schools programmes are outdated and that there are problems with teachers not feeling confident enough to teach or portray the dangers. Have the witnesses identified specific powers that are needed in that regard?

Ms Alex Cooney

I would model it on the Australian example. It has many powers with regard to both public awareness and education. We mentioned earlier that it has the power, for example, to accredit and recommend online safety programmes, which makes a significant difference for a school seeking to book such a programme. It runs thematic campaigns as well. I know those issues are touched on in the Bill but they are not elaborated on. A lot of the detail remains to be addressed, but that is what we are seeking.

That is perfect. That is what I wanted to know. I thank Ms Cooney.

Does Ms Jennings wish to come in?

Ms Fiona Jennings

Yes. As Ms Cooney stated, it would be fantastic if there were specific powers for oversight, for the reasons I outlined previously, that is, the significant disparity in how anything to do with online safety or digital competence is delivered, whether inside the school or by external providers. It would be fantastic if that could be included.

I thank our guests and my colleagues who have been participating. That concludes our deliberations for today. I thank the witnesses most sincerely not only for their statements but also for the very meaningful way in which they engaged with members and helped to explore and tease out further ideas and suggestions. We will take on board everything that has been presented here today. It has been very useful ahead of our meeting with the tech giants next week.

The committee now stands adjourned until 11.30 a.m. on Wednesday, 19 May for a private meeting on MS Teams, followed by a public meeting of the joint committee at 12.30 p.m. virtually and in committee room 1 with representatives of Facebook, Twitter and TikTok to continue the committee's scrutiny of the Online Safety and Media Regulation Bill.

The joint committee adjourned at 2.04 p.m. until 12.30 p.m. on Wednesday, 19 May 2021.