I will try to be quick. I am the director of privacy for Apple in Europe, worldwide director of privacy compliance and the appointed data protection officer in Europe under the GDPR. I thank the Chair and the members of the committee for the opportunity to speak to them today on the important issue of voice assistants and the use of data.
Apple has been operating in Cork since 1980 and we are proud of the many contributions we make to the economy and job creation. We employ 6,000 people in Ireland. Over the last four years, we have spent more than €1 billion with local companies, and our investment and innovation supports more than 27,000 jobs up and down the country.
We believe privacy is a fundamental human right. Our approach to privacy is different from that of other tech companies. We design our products from the ground up to minimise the amount of data Apple collects and we work vigilantly to protect our customers' personal data and give them control over their information.
Privacy is an issue larger than one company or one country, which is why we strongly support the GDPR and advocate for other countries to adopt similar approaches that deliver a highly effective and necessary framework. During the committee’s session last week, the discussion emphasised Article 25 of the GDPR, concerning privacy by design. This is a core value and fundamental principle that Apple has embodied from the beginning. Privacy is at the heart of every product and service we create, and we continually develop innovative technologies and techniques designed to minimise how much customer data we, or anyone else, can access while delivering world-class services to our customers.
Privacy by design applies to all Apple’s services. That includes Siri, Apple's intelligent assistant. Like all our services, our goal with Siri is to create the best user experience while vigilantly protecting user privacy.
We introduced Siri in 2011 as an integral part of our products, helping users get things done faster and more easily. This includes tasks like making calls, sending messages, setting alarms, getting directions, finding photos and playing music and television shows, to name a few. Like other Apple services, we minimise the amount of data Siri collects and use that data only to improve Siri. We do not use Siri data to build a marketing profile, and we never sell it to anyone. Our product is the technology we create, not the customer.
We have built privacy protections into Siri according to some core principles that demonstrate our comprehensive approach to protecting user data. First, Siri uses as little data as possible to deliver an accurate result. When a person asks a question about a football match, for example, Siri uses his or her general location to provide suitable results, but if a person asks for the nearest supermarket, more specific location data is used.
Second, we design Siri to process the most sensitive data on a person's device instead of sending everything through Apple's servers. For example, if a person asks Siri to read his or her unread messages, the content of the messages never leaves the device and is not transmitted to Siri's servers, because that is not necessary to fulfil the request. This means that messages are not available to Apple or any third party.
Third, requests made to Siri are not linked to a person's Apple ID, phone number or email address. Instead, they are associated with a random identifier - a long string of letters and numbers associated with a single device to keep track of data while it is being processed. This is a feature we believe is unique among the digital assistants in use today.
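To make the idea of a random, account-free identifier concrete, the mechanism described above can be sketched roughly as follows. This is purely an illustration of the general principle, not Apple's actual implementation; all names here are hypothetical.

```python
import uuid

class VoiceRequestLog:
    """Illustrative sketch of account-free request tracking: each
    request is keyed by a random per-device token rather than by any
    account detail such as an Apple ID, phone number, or email."""

    def __init__(self):
        # A long random string of letters and numbers, generated once
        # for the device, with no link back to a user account.
        self.device_token = uuid.uuid4().hex

    def log_request(self, transcript):
        # Only the random token accompanies the request; nothing in the
        # record identifies the person who made it.
        return {"id": self.device_token, "request": transcript}

log = VoiceRequestLog()
entry = log.log_request("what's the weather today?")
```

Because the token is generated randomly on the device, records keyed by it can be used to process and improve the service without being traceable to a named account.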
As the committee heard last week, to improve voice assistants such as Siri, there is a need for human review of a very small sample of audio interactions. This helps to ensure that Siri understands users' questions and provides the right answer. For example, Siri has to recognise my Irish accent and all the members' accents, and members will understand the complexity if that is multiplied across the different languages, dialects, and styles of speech in the countries where we do business.
However, I want to be clear that human review has always been conducted on only a very small subset of audio samples from Siri requests. The people reviewing the audio samples are not shown an Apple ID, phone number, or email. As I mentioned earlier, all Siri requests are associated with a random identifier.
This August, customer concerns arose regarding human review of Siri audio samples. In response, we immediately suspended human review of Siri audio requests, reviewed our practices and policies, and released the following improvements to Siri's privacy protections. First, by default, we no longer retain audio recordings of Siri interactions. Second, users now have the choice to help Siri improve by learning from audio samples of their requests. For users who choose to share their audio, it is still associated only with the random identifier I mentioned earlier. Third, if customers choose to help improve Siri, only Apple employees - not contractors - will be allowed to listen to audio samples of Siri interactions for the limited review purposes described earlier. Additionally, our team will work to delete any recording which is determined to be the result of an inadvertent trigger of Siri. Finally, there is now a "delete Siri and dictation history" option in settings that makes it easy for users to delete Siri requests that have been retained for six months or less.
Our mission is to make products and services that enrich the lives of our customers. Unlike others, we do not view our customers and their data as the product. That is why we build privacy protections into everything we do. It is why we intentionally limit our own access to customer data. Our products and features include innovative privacy technologies and techniques designed to minimise how much of the customer's data we or anyone else can access. We believe that users should control their data and should understand how their data is used, stored, and protected. That is the approach we bring to Siri and all the services we offer. Customers do not sign in with an Apple ID to use Siri, the customer's device processes as much information as possible without sending it to Apple's servers, and powerful security features help prevent anyone except the customer from accessing his or her information. We are constantly working on new ways to keep the customer's personal information safe. We look forward to continuing our partnership to build on the GDPR's progress and strive for the highest standard of user privacy protection.
I look forward to answering members' questions. I thank the committee.