DCMS committee hears evidence from the Information Commissioner on online harms and the ethics of data
A sub-committee of the Digital, Culture, Media and Sport Select
Committee today took evidence on the subject of online harms and
the ethics of data from Elizabeth Denham CBE, Information
Commissioner, Information Commissioner's Office, supported by Paul
Arnold, Deputy Chief Executive and Chief Operating Officer,
Information Commissioner's Office.
The following is a partially edited transcript based on voice recognition software which, as a result, will include some typographical errors. The official transcript will be sent as soon as it is available. Although Amazon was not mentioned, you may be interested in a particular section about major companies sharing data.
Transcript Kevin Brennan [00:00:53] Thank you, Chair, and good morning to both our witnesses. I wonder, Elizabeth, if you could tell us, since we haven't seen you for quite a long time, what you feel you've accomplished in your office, briefly. [00:01:09] And also, if you've got any loose ends or any unfinished business that you wish you had accomplished whilst you've served your term of office. Elizabeth Denham [00:01:20] The ICO has been on a journey in the past four years, and we've had the challenge of bringing in a new law, the UK Data Protection Act 2018. We've had the challenge of Brexit, the transition period, and we also have the challenge of, like everyone else, the global pandemic. So it has not been a smooth ride for the ICO. But what I'm most proud of is that I and my team have been able to build the capacity and the capability to respond to issues that are larger than life and are certainly important to your constituents, to Parliament and to government. So back in the day, even three or four years ago, if you said you were a Data Protection Commissioner, that might be a conversation stopper at a party. But now I think data protection is mainstream and it sits at the intersection of law and technology and society. So we have the capacity to grapple with issues such as the use of data in political campaigns. We've grappled with children's rights online. We have responsibilities for regulating in the digital economy. And now we are involved in being an expert adviser to government on trade deals. Kevin Brennan [00:03:14] You kindly wrote to us in October concerning the whole investigation this committee's predecessor was doing into Cambridge Analytica. In August last year, the US Senate Intelligence Committee published a report, and in that report, on page 664, it says the committee was unable to obtain the corporate communications of Cambridge Analytica or SCL Group, which had already been seized by the UK authorities. [00:03:50] Did the Senate Intelligence Committee contact you and ask to see the evidence? And how did you respond if they did? Elizabeth Denham [00:03:59] The committee did not contact my office to ask for the evidence that we had seized on the servers of Cambridge Analytica. Kevin Brennan [00:04:07] What would happen if they were to do so now? Elizabeth Denham [00:04:10] If it was a proper request for information, we would certainly share that information. We shared Cambridge Analytica data and analysis with the Federal Trade Commission, for example, in the US. And also we have shared information with state attorneys general that were carrying out their investigation and with the Securities and Exchange Commission. So we are able, under our law, to share information if it's necessary for another law enforcement agency. Kevin Brennan [00:04:46] And on that point, are you able to supply us with a list of the countries and agencies that you have assisted throughout the duration of your inquiry into this? Elizabeth Denham [00:04:57] I can provide you with that information. I would like to write to you after this session to give you a full list of the various data protection authorities around the world that we've shared information with. [00:05:10] OK, now, in April 2019 your office originally committed to produce a final report into the seizure of the servers which your office carried out, et cetera. That never arrived.
But a year later, you wrote the letter, which I referred to earlier on, rather than producing a final report for the committee. What was the reason for the delay and why didn't we get a full report? Why did we get a letter instead? [00:05:41] The letter was extensive, as you will know, but my office produced three reports on the investigation into the misuse of data in political campaigning. So we had a policy report and we had two enforcement reports. We had looked at the entire ecosystem of data sharing and campaigning. So beyond Facebook and Cambridge Analytica, we also investigated and audited political parties. We audited data brokers, credit reporting agencies. And the strands of investigation are reported out sufficiently, in my view, across all of our work. So taken together, the letter, which was our final line on the report, together with the policy and the enforcement actions, prosecutions, fines, stop processing orders - we had done a lot of work in this space. [00:06:44] And I think what's important here is we have really pulled back the curtain on the use of data in democracy, which has been taken up by regulators and parliamentarians around the world. Kevin Brennan [00:07:11] What happened to the material that fell outside of the remit of your investigation but wasn't passed on to other agencies? Elizabeth Denham [00:07:20] For clarity, are you talking about information that didn't relate to algorithms or data protection? Kevin Brennan [00:07:30] Yeah, I understand there was some material that fell outside the remit of your investigation and your legal powers, and some of that got passed on to the National Crime Agency and so on. But there was other material that wasn't passed on to other agencies. What would happen to that material? [00:07:45] So we have the material, our office has material, and we're going through the process of determining what's best done with the terabytes of data that we seized from Cambridge Analytica. Where there were issues that were beyond the mandate of my office, we have passed on information to the National Crime Agency, passed on information to the National Cyber Security Centre, et cetera, because it's my job to regulate data, not to regulate potential criminal activities. [00:08:20] Sharing of information also involves the Insolvency Service. So we passed on information from Cambridge Analytica to the Insolvency Service for them to determine whether or not they were going to take action against any directors of Cambridge Analytica. And they did, and they struck Alexander Nix from the register. Kevin Brennan [00:08:46] But the material that didn't get passed on - you still have that material, is what you're saying. What will happen to that? Elizabeth Denham [00:08:52] We're determining what should be retained and what should be properly disposed of. So we're in consultation with other experts. [00:09:02] OK, can I just ask you a little bit more? Some information has come to light subsequent to the letter you sent the committee in October. In particular, Brittany Kaiser published some further evidence that seemed potentially to undermine the idea that Cambridge Analytica and AIQ's relationship wasn't close - suggesting that, in fact, it was a contractual partnership. And the Canadian privacy commissioner in their report also said categorically that AIQ made use of SCL/Cambridge Analytica data sets and did so illegally.
[00:09:41] Do you accept that that is further evidence that perhaps the two companies had a much closer contractual relationship than you suggested in your letter? [00:09:50] And, I suppose, your letter, if you like, denied that relationship but didn't really supply the underlying analysis we might have got from a report. [00:10:01] Is that something you can provide the committee - further analysis of why you came to that conclusion? Elizabeth Denham [00:10:07] I certainly can offer a private briefing for the committee to unpack that finding. I'm certainly willing to do that. [00:10:16] But I mean, the difficulty of this investigation is there were so many commentators in the public sphere, and we're a regulator. We have to be driven by the evidence that we find. And I believe that we have done that to the extent of our abilities and our powers. And if there are commentators that feel differently, they have the opportunity to come forward and provide us with the information. But I have to follow the evidence. I have to balance and weigh the evidence and come to the conclusion that we did. And although there was some UK data in the hands of AIQ, we did not find that there was a close relationship at the time, in terms of the referendum, between AIQ and Cambridge Analytica. Kevin Brennan [00:11:13] Do you think that further evidence that Brittany Kaiser supplied has any relevance? Elizabeth Denham [00:11:19] It could possibly have relevance, but again, as a regulator, we have to follow the evidence and we have to be able to question witnesses and the statements that they provide. Kevin Brennan [00:11:34] Can I ask you whether or not you're confident, at this point, about the evidence or the data that you're destroying - that it's not premature to do that at this point, and that it might be sensible just to hold on to some of it for a bit longer? Elizabeth Denham [00:11:51] We haven't said that we are destroying it at this time. And again, I can offer a briefing to you or any of the committee to be able to go through the detail of what our considerations would be. I would underscore how significant this investigation was, because it was the first time that a data protection regulator had done this depth of forensic analysis. With the huge amount of data that we seized and the witnesses that came forward, we needed to see the evidence, not just the lines that they're taking in public commentary, because we have to be evidence-led. Kevin Brennan [00:12:39] In 2018, Mark Zuckerberg testified before the US Senate Commerce, Science and Transportation Committee, and he promised that Facebook would conduct a full audit of Cambridge Analytica's servers after the UK government completed its investigation. I think he meant you by that, by the way. In your letter, you also state you're in the process of returning materials to SCL/CA's administrators or destroying material. After your October 2020 letter and the completion of your investigation, did Facebook contact you to complete their audit? Elizabeth Denham [00:13:23] Once again, I think I could answer that question in private. Kevin Brennan [00:13:31] It's not a straightforward thing to ask whether or not they contacted you about completing their audit, since, after all, it was a commitment that Mark Zuckerberg gave in the public domain before a US Senate committee. Elizabeth Denham [00:13:43] It's part of an agreement that we struck in terms of our litigation against Facebook. So there is an agreement that's not in the public domain.
And that's why I would prefer to discuss this in private. Kevin Brennan [00:14:01] Have you kept all documents, emails and data that might be of assistance to other countries and jurisdictions in their investigations, if there are others going on around the world? Elizabeth Denham [00:14:11] We have. Kevin Brennan [00:14:20] Finally, you did mention your reports about political parties in your answers earlier on. In that letter in October you said you would shortly be publishing reports of the findings of your audits of the main political parties, the main credit reference agencies and the major data brokers, as well as the Cambridge University Psychometrics Centre, and that you would write separately to the committee on those issues. When do you plan to provide that information to the committee? Elizabeth Denham [00:14:51] We have published the audit of the political parties. We've also published the results of our investigation into credit reporting, including an enforcement notice against Experian. We are soon to publish our investigation, and in the next few weeks we will be publishing our guidance for political campaigns. Kevin Brennan [00:15:21] Moving on to a slightly different area, briefly, when did you become aware of WhatsApp changing its terms and conditions, which happened recently, to allow sharing of information with Facebook? And are you concerned about the way big corporations do this to consumers today? [00:15:43] Are we really getting a choice when a platform does this sort of thing? Elizabeth Denham [00:15:49] What's really interesting about the WhatsApp announcement of ongoing sharing with Facebook is how many users voted with their virtual feet and left the platform to take up membership with Telegram or Signal, which are end-to-end encrypted. Kevin Brennan [00:16:11] So how many did? Do you have figures? Elizabeth Denham [00:16:15] Yeah, millions. And I can get the specifics, but I read about this probably in the same way you did. I read about the change in the terms of service in the media. However, you will note that the change in the terms of service and the requirement on users to share information with Facebook does not apply to UK users or to users in the EU. And that's because in 2017 my office negotiated with WhatsApp so that they agreed not to share user information and contact information until they could show that they comply with the GDPR. Kevin Brennan [00:16:58] So you're telling us all those people changed service needlessly, because it didn't apply to them? Elizabeth Denham [00:17:03] Well, I mean, that's been reported out extensively, but at the same time, I think it's a bigger issue of trust. Users expect companies to maintain their trust and not to suddenly change the contract that they have with the users. And I think it's an example of users being concerned about the trustworthiness and the sustainability of the promises that are made to users. Kevin Brennan [00:17:34] And do you think that these big, huge, powerful corporations should be able to buy up these smaller operations just to exploit their users, who signed up with very different expectations of what they were getting into and are then almost trapped into using those services? Although, you know, in some instances there's a real opportunity to change.
Why should they have to change if they signed up to a particular, you know, regime they thought they were getting into? And in any case, as we all know, a lot of businesses exploit the natural inertia of people who simply don't have the time, or for whom the opportunity cost is too great, to get involved in changing all their data services. I mean, isn't there a real gap here in the law in allowing this sort of thing to happen? Elizabeth Denham [00:18:26] I think that competition authorities are looking at mergers and acquisitions with new eyes, so mergers and acquisitions that are at their heart about seizing and controlling more personal data are a real issue. And you'll see that the Competition and Markets Authority in the UK is looking at this issue quite closely. Our office, the data protection authority, is working with the CMA on issues where personal data is the target for companies that are merging. And I think what you're asking me is, is this fair? And do we need, as policymakers and regulators and civil society, to see changes in the way that big tech is regulated? And I think the answer to that is absolutely yes. [00:19:23] And data is such an asset in the digital economy that we need to look afresh at how data and how competition work in the digital economy. Content and conduct regulation is, of course, the focus of the government in its online harms legislation. Kevin Brennan [00:19:46] OK, finally, do you personally use WhatsApp and Facebook? Elizabeth Denham [00:19:52] I do not. I'm not on WhatsApp, I do use end-to-end encrypted messaging, and I am not on Facebook. My choice. Which service do you use for end-to-end encrypted messaging? Elizabeth Denham [00:20:14] Signal. Julian Knight [00:20:19] Just to clarify, in terms of the relationship between WhatsApp and Facebook and data, Eve Sweeney appeared before the Home Affairs Select Committee last week, and we had an exchange there. And I'm just going to read you what she said to me. I just want to make sure that this is correct and this is the case, if that's OK. She stated that there were no changes arising from this update, as in the terms and conditions, to our data sharing practices with Facebook anywhere in the world, specifically the UK. Is that correct? Elizabeth Denham [00:20:58] What I've read in tech journals is that there are changes in terms of sharing information with Facebook, particularly contact information and the contact information of users' friends. So that's my understanding from reading detailed tech reports in the public domain. But again, it is an important question to ask of the company in a regulatory call or regulatory review. Julian Knight [00:21:29] OK, are you going to do that? Elizabeth Denham [00:21:32] I will do. Again, because we had received an undertaking from WhatsApp not to share information with Facebook, and that goes back to 2017. That applies to UK users. And my understanding is it also applies to EU users. Julian Knight [00:21:58] That's from 2017. You haven't asked for a specific undertaking at this point in time? Elizabeth Denham [00:22:03] No, because up until January the 1st it was the Irish Data Protection Authority's job to oversee the activities of WhatsApp. So as long as we were in the EEA and the one-stop shop, it was my Irish colleague that oversaw WhatsApp. That's changed now.
Julian Knight [00:22:24] How long will it be before you can produce for the committee the answers from Facebook and WhatsApp on whether or not there is any sharing of data between those two platforms? Elizabeth Denham [00:22:35] I will follow it up and I will respond to the committee chair. Heather Wheeler [00:22:44] Thank you very much, sir, and good morning, both of you, very kind of you to come along. And the beauty of these meetings is you didn't have to go out in the snow. So well done. So I'm interested in a couple of different areas. First, a set of questions about the Freedom of Information Act. So, straightforwardly, do you still consider that the current Freedom of Information Act is fit for purpose, or is it not fit for purpose? Elizabeth Denham [00:23:17] I'm pleased to answer questions about the transparency agenda and freedom of information. When I was at my scrutiny hearing in 2016, I think the DCMS Committee got the impression that I was fundamentally an FOI advocate. And a lot of the work that I had done in Canada before coming to the UK was focused on fairness and transparency and openness in government. When I came to the UK in 2016, Lord Burns had done an independent study about whether or not the FOI Act was fit for purpose. And his conclusion, and his committee's conclusion and report, was that the FOI Act worked relatively well, with some recommendations and improvements. My main parliamentary report on transparency was issued in 2018, and that report was about the need to extend the FOI Act to cover the delivery of public services by the private sector. And I think what's needed to make the Act work in a way that's fair to the public is that the Act needs to reflect how public services are delivered. The pandemic has only accelerated the range of actors involved in the delivery of those services. So I think the same accountability should apply to the private sector organisations that are delivering fundamental public services. Heather Wheeler [00:25:01] So to be clear, you think that the right to information should actually be extended to all organizations that deliver government services? Elizabeth Denham [00:25:11] I do. Heather Wheeler [00:25:12] That is absolutely fascinating. So the obvious next question then is, who should bear the burden of the costs associated with those Freedom of Information requests? Elizabeth Denham [00:25:23] I think fundamentally what's important is that citizens have a right to hold organizations to account, to understand how decisions are made. If you take a housing association that falls outside the Act, then an individual doesn't have the right to access information about the safety of their housing. And again, I don't think that's fair. In terms of who bears the burden of costs, obviously, private sector companies that are delivering services under massive contracts should bear the burden of those costs. And again, as to how this would work in practice, extending the reach of the Act to cover private sector delivery, you could actually come up with a threshold of the value of the contract before that organization is subject to transparency requirements. Heather Wheeler [00:26:21] OK, that's really, really interesting, thank you for that. I'm now going to go off on a slightly different tangent, but it's something we've already talked about this morning. So I'm interested in GDPR stuff - and because I have such a technical understanding of it all, obviously. So what have been the biggest problems for the ICO in moving to GDPR?
And Paul, I don't know if this is something that might be more up your street, but obviously, if Elizabeth wants to carry on talking, that's fine. Whoever wants to take it. [00:26:53] Actually, I'm happy to have Paul speak to that. And I think the reason that it's appropriate for Paul to address this is that the kind of change that the ICO had to go through to bring in the Act and to administer the Act was certainly under his mandate and his function at the ICO. Paul Arnold [00:27:17] Yeah, as the Commissioner says, the change for the ICO in 2018 was something we were preparing for in the build-up, as you would expect. And I'd probably categorize it in three ways. There was a fundamental capacity challenge for us, a capability challenge, and then the all-important culture of the organization. So if I talk about capacity first, there was a pretty immediate and profound increase in demand for all ICO services. From May 2018 onwards, most of our historic and traditional services simply exploded in demand, with something like a 130 percent increase almost overnight. [00:27:55] The new duties that came with the DPA 18, particularly the ones for us to have regard for economic growth, also saw us introduce a number of new services, which in and of themselves brought their own demands to our door. So we had a really basic capacity challenge from 2018 onwards, as I say, to meet what was effectively about a 150 percent increase in demand for our work. So since 2017, we've increased the size of the ICO by about 85 percent, and we now have an FTE workforce of just over seven hundred and fifty. And we've done that in a very deliberate and precise and responsible way, as you would hopefully expect, making sure that we were unlocking efficiency and productivity as we move forward. But it really was an essential and very rapid requirement for us to scale up our services overnight. In terms of the capability of the organization, that's something that all regulators obviously do on a regular basis - assess what skills we need to meet the challenge. [00:28:58] But I think, as is evident from the discussions so far this morning, the territory and the areas in which the ICO's work reaches are almost endless. And so we need to prioritize the capabilities and really focus on the areas of greatest need. To give you some highlights and key examples: our economic analysis and our understanding of economic impact has been a key strand for us since 2018, since we were given that really important duty to have due regard for economic growth in all the work that we do. So that's an area that we have upscaled on, and we continue to do so through all our relevant plans. We've also talked today about some of our biggest investigations, and that is a new capability that we've really needed to develop since 2017/18 - the ability to stand up large-scale investigations with dozens of investigators, and to make sure that the governance and infrastructure around those is really strong and fit for purpose. [00:29:59] So I think there have been a number of challenges on the capacity and capability front. I think we're really proud - we're very proud - of what we've accomplished. It sounds like a long time ago, but 2018 is just over two years back. Obviously, we've had the last 12 months with the pandemic, which has interrupted things for the organization. So I think where we are now is a great place.
I think the culture of the organization is something we've also focused on. It's all too easy to focus on the practicalities of capacity and capability, but we were determined not to lose the kind of DNA of the ICO from pre-2017. We're fundamentally a knowledge-based organization, which means we're a people organization. There are many new members of staff who work for us now compared to 2017, but the thing we have in common is the real pride that all of our colleagues take in their work and their commitment to the mission and the purpose - those are really key factors for a regulator like us. [00:31:02] That really is what drives our recruitment and retention, helping us attract the best talent, even though, as a public sector body, we're clearly not able to compete with the large private sector global entities we've talked about today. Heather Wheeler [00:31:17] It's really interesting. It's great that you're so proud of your workforce. That shines out from what you say. So you gave me one example. [00:31:28] One of the questions that I think people are quite keen to know about is, of the eight rights, which have actually been the most problematic to regulate? Paul Arnold [00:31:50] I'm not sure that we would single any out as having been particularly problematic. I think we very much approached them, with the introduction of the GDPR and the DPA, as just a fundamental upgrade to the UK legislative regime. And it's difficult to single one out - by their very nature they're meant to be complementary and part of a set. So it's really been about raising the awareness of organizations of the real need for change, making sure we can cut through some of the essential myths or initial reactions to GDPR. I think it's not a very complex law. [00:32:30] One of the things we're most proud of is the work we've done with our advice, which is really recognizing the need to simplify the law for those organizations that need to comply with it but whose activities, by definition, aren't the most complex. And so we were very keen to avoid small businesses feeling that the new law was a huge burden for them. And I think we've made some really good strides in packaging that advice and support, and there are more products coming down the line in the next 12 months, with the sole intention of simplifying - helping organizations reduce their compliance risks through very practical and targeted advice and support. Heather Wheeler [00:33:12] I think that is going to be incredibly welcomed, Paul, because, you know, there are many expletives that go in the same sentence as GDPR. I mean, you know, there are very few people in the real world doing real jobs who have any interest in GDPR at all. It is a pain in the neck. And, you know, you either can't get information from people because people quote GDPR, or you get completely wrapped up in red tape because you have to do GDPR. It is not a blessing, in my humble opinion. Anyway, I've got one last question, which I am now interested in. Obviously, the EU has still got its elements of data protection. [00:33:57] Do you have any insights on how the UK government maybe will shadow that post-Brexit? Elizabeth Denham [00:34:04] If I could come in on this one.
With the trade and cooperation agreement with the EU, there's a six-month period in which data can continue to flow from the EU to the UK. [00:34:24] And what's important about that is that it gives the EU enough time to come to a decision about whether or not the UK regime is essentially equivalent to that of the EU. So that's a process. It's a technical process, a political process, a legal process. And that's where we are right now. I think the advantage of the GDPR approach is that other countries around the world are using the GDPR as a model to reform their law. So the direction of travel and the trajectory of where the laws are going is towards people having stronger rights. And I think the GDPR gets a bad rap from people who say it's just about the paperwork of privacy, or it's just about having to record all of your decisions around data. We try to bust that myth, because at its heart, data protection is about respect for customers' and citizens' data, and it's about individuals having the right of agency over their personal data. And it's more important to the reputation of governments and businesses than it's ever been. We talked a minute ago about millions of users abandoning WhatsApp because they are concerned about the service. It's so important that government in its policy takes people with it, and that there's trust and confidence in the digital economy. And I think that's stripping down the GDPR to its main principles: it's about protection for individuals, and it's about certainty for business - what are they supposed to do to account for the data they hold? Damian Hinds [00:36:38] In my experience, when we talk to our constituents, when people say my data, they mean things they have disclosed voluntarily for some purpose, like their bank details or their address or their credit card number. When public policy people talk about data, they generally mean something different, which is data about what people do - it's tracking data. Is it possible in this debate we're just totally talking at cross purposes? Elizabeth Denham [00:37:05] I think when policymakers are talking about using data better to solve policy problems, to be able to make predictions about what people will do, I do agree that that's a different matter. But I also think that people are more concerned about how their data is vacuumed up, especially by the private sector, to analyze and predict what they're going to be interested in, for example. So think about Internet advertising. We've got an investigation ongoing about the use of personal data to profile individuals and predict and nudge what they may be interested in, what they may do. So I think people are getting more aware of some of the implications of being tracked and followed, their data crunched up and used to deliver services to them. And we certainly saw that in our credit reporting agency report - the massive amount of data that is collected about people and how it can be used to deny them credit, for example, or to serve them information, or even not serve them information, in a way that will change their behavior. So I do think people are starting to be more aware and concerned about analytics and algorithms that make decisions about their lives. Damian Hinds [00:38:41] Yeah, I wanted to come on to that, because there's this further level of data, a layer, if you like, which isn't really data at all. It's inference.
So this is using data, perhaps bits of data that you've given and bits of data about what you've done, to jump to conclusions: because you have these friends, because you live in this place, because you like yoga, then you're more likely to be interested in Scottish independence or the environment or whatever it might be. It's not data in the strict sense at all; it's just stuff that has effectively been made up by looking at what other people who have similar characteristics do. And should that be a matter of regulation - what conclusions companies come to and then what they do with it, in terms of, as you rightly say, how they sell to you, how they credit score you, but also, from the point of view of politicians, the sort of news that gets put to you, the sort of views that get put to you? Elizabeth Denham [00:39:38] Yeah, I think there's a debate - a good example of the debate that you're speaking about is on social media companies and whether or not lookalike audiences on a platform like Facebook are personal information. And we have come to the conclusion in our investigation that when you have a core of personal information and you're adding potential inferences on to it, that can be personal information. And I think what's going to be really helpful for campaigners and politicians is to see our guidance for political campaigning. That is going to help campaigners on the ground - practical, real information on how to use data respectfully in a political campaign and still reach the voters and potential voters in an election, because that's of value too. Damian Hinds [00:40:42] But political parties are regulated and they're accountable, and ultimately they have a brand to protect. There are gazillions of people and machines operating on social media who are not. And the platforms generally are serving up content from them to other individuals, rather than from the Labour Party or the Conservative Party. So I guess my question is, is there a gap in public policy about who is caring about what these algorithms come up with, the inferences they make about people, and therefore the echo chamber or the bubble that they end up being assigned into? Elizabeth Denham [00:41:27] The online harms white paper, I think, goes some way to looking at the potential harms that people face online, but also the ICO regulates the use of algorithms when they make a significant impact on someone's life. So if you take the example of Ofqual and the use of the algorithm for A-level exams, there is an example of the use of data and the use of inferences about people that has a real and significant impact on people's lives. We regulate algorithms from that perspective. Where the gap is, I would suggest, is that the ICO can only look at the fairness of algorithms when there's a significant effect on someone's life - whether they get a job, for example. Damian Hinds [00:42:31] It's by definition an impact on someone's life, and the impact on liberal democracy, for example, is indirectly an impact on everybody's life. And the effect I'm asking about is where effectively people get served up a certain type of content, partly because of their own actions - they share it or whatever - but also because of inferences that have been made about them.
They will get served up content which goes deeper and deeper, sometimes down the rabbit hole, divides society more, makes people less aware of things that they have in common and more aware of things on which they differ. That is injurious in the first instance to democracy - I don't think that is arguable at all. Actually, it's an important element in all of this. Is anyone caring about that? Elizabeth Denham [00:43:24] I think the gap you're talking about is looking at the societal impacts of the use of algorithmic decision making. So we can look at the individual impact, but what effect does the use of tracking and profiling have on society as a whole? And some of those questions are ethical questions, and some of them really need a public debate. I think Internet advertising is a good example. Who is going to regulate Internet advertising? Because it can have an effect on people's lives. Certain offers are never made, for example, to certain demographic groups in society. So that goes to fairness. But at the end of the day, transparency, fairness, oversight, reputation - all of these are of the utmost importance, I think, on a societal level. Damian Hinds [00:44:26] OK, can I turn to the question of children, and first of all commend you on the age appropriate design code, which I think is a fantastic part of your legacy. And I remember when it came out thinking, my God, this is amazing. So just, I mean, some of the highlights: for children, profiling should be off by default, and platforms should collect only the minimum amount of personal data they need for the elements of the service the user is actively and knowingly engaged in. For children, settings should be high privacy by default - that's standard seven. And perhaps the most important of all: either establish age with an appropriate level of certainty or apply the standards to all users. [00:45:09] Does it feel to you like that's happening? Elizabeth Denham [00:45:13] I think the age appropriate design code is world leading. It's going in the direction of influencing the design of services that are expected to be used by children. And I think it goes to the problem that you must hear about from your constituents, which is that the Internet wasn't designed for children, and the Internet needs to take account of children's specific needs. So we're proud of the age appropriate design code. There are jurisdictions around the world, including California, Mexico and Australia, that are looking at the UK's code. The code doesn't come into force until September of 2021. Damian Hinds [00:46:08] My question is not, is it a great code? I think it's a great code. My question is, is it happening? Do you feel like it's happening? Elizabeth Denham [00:46:14] I feel like we've had impact on the design of services. Last February, before the pandemic struck, myself and my team went to Silicon Valley, and we went there to socialize the code with the big tech companies. So meetings with the chief executives across all of the big tech companies. And I can tell you that their designers and their engineers met with our engineers. And I think there is an ambition for the UK to see this happen to protect UK children. But there's also an interest by the big players, and it will come. We're spending the next six months continuing to educate and socialize, especially with the large platforms, on how they can put the code into practice. But I think it's world leading.
We consulted extensively with children's groups, with technology companies, with the gaming industry, with civil society to try to get this right. And you're right, I'm very proud of the work of my office on this. And I want to see how it comes to life in practice. So we will be enforcing against the code in September. [00:47:39] Thank you. So, according to the 2019 Ofcom report, a quarter of 10 year olds who go online claim to have a social media account, and 43 percent of 11 year olds. And of course, these are just what they say; we don't know for a fact. But these social media platforms typically have a minimum age of 13. Why should we have any faith in the ability of platforms to distinguish between children and adults in this way, and in this concept of age assurance as opposed to age verification? How much confidence could we have in that technology? Elizabeth Denham [00:48:21] Age assurance is not the same as requiring a technical identity solution from the companies. So I think assurance that the companies are doing their best to be able to identify individuals that should not be on the platform is important. And as identity management and identity management tools progress, we would expect more from companies. But as you know, the state of development right now of identity management solutions is behind the curve. And we've seen that in response to other initiatives, other legislative initiatives. So we have to give companies the benefit of the doubt that when they throw their engineers and their designers at a task, they usually have the ability to make it happen. [00:49:20] A lot of these big platforms already know a hell of a lot about who their users are, and they can use all kinds of algorithms around that, and they certainly do. Yeah. So what we're saying is there could be a gap between ability and will, I think. Well, hopefully the code gives them the will. [00:49:41] So we know that if you throw engineers at a problem, and if it benefits the company, they're going to find a solution. And what we've done with the code is incentivize the companies to do a better job of policing who's on their platform and at what age. Damian Hinds [00:50:00] And if in a year or two years' time a quarter of 10 year olds responding to the Ofcom survey that year, and 43 percent of 11 year olds responding, say they have a social media account, and they name the social media platforms they're using, what do you think is the appropriate sanction on those social media companies? Elizabeth Denham [00:50:20] I think an investigation, and a sanction that's appropriate for the extent of the breach - though, as you know, we have a whole toolkit of various sanctions, and we reserve fines and stop processing orders for the most serious of data breaches. Damian Hinds [00:50:43] And you say that maybe we would give the benefit of the doubt to companies when they throw their expertise and ability at these things. There's another view that says just give the benefit of the doubt to the kids. And I just wonder, given that it seems to be very difficult to distinguish between children and adults, do you think it would be inappropriate to apply the age appropriate design code for all users? [00:51:07] I think if the company doesn't follow the age assurance requirement - I don't think any of the provisions of the code, any of the 15 standards, are out of place for an adult user.
I just think the choice for an adult is really important. [00:51:29] But you're not suggesting taking away choice, are you, even for the children's code? This is all about defaults and what should be the settings when you come into the platform. So, I mean, is there anything in the age-appropriate design code for children that you think shouldn't also apply for the full population? [00:51:49] I suppose just the first standard, which is the duty to have regard to the UN convention. But other than that, no. And I think a lot of adults would appreciate defaults that are privacy by design - location turned off until an adult decides to turn it on. So I think these are respectful defaults. But again, the UK parliament and government decided that we would focus on children, and that's what we've done. That's what we've delivered in the code. Damian Hinds [00:52:28] On vaccination passports, this committee has a strong interest in people being able to return to sporting events, music events, travel, business, hospitality and so on. And in that regard, we've talked about various ways of proving having been vaccinated or having had a positive test. But there are also big privacy issues around these questions. That's also an area this committee is deeply interested in. [00:52:57] As the minister said at the weekend, we are not a papers-carrying country. I just wondered, what's your take on vaccination passports, what their usefulness might be and what your concerns would be? Elizabeth Denham [00:53:12] We would approach a detailed proposal around a vaccination passport or freedom passport in the way that we do any initiative by government, and that is: is it necessary? Does it work - does it do what it says on the tin? Is it proportionate? And is there transparency? So the question about necessity matters, because we're talking about personal health information, which is a special category of data that requires controls. So at the outset, we would ask government the same questions that we asked them about contact tracing - the same principles. I think with immunity passports, some of the issues are beyond data protection. They touch on human rights. They touch on whether or not we are going to create a two-tier society based on whether you have a jab in the arm, and the concerns over whether or not this is identity by the back door. Those are some of the concerns I would have, but my approach would be to ask government: where's the necessity? How is the data going to be used? Is it transparent? And is it proportionate to the problem? You know, for a long time we've carried vaccination certificates for foreign travel to show that you've had your typhoid vaccination, for example, and that's a piece of paper. So if we start talking about immunity passports that are digital or tacked on to the contact tracing application, then I think those are real questions for policymakers. Alex Davies-Jones [00:54:54] I'd like to go back to explore some of the themes around age and children that we were just discussing with my colleague Damian Hinds. As we've heard, most of the social media platforms say that users should be 13 or older. WhatsApp is restricted to those who are 16 years or more. Are we more hands-off with who should be on social media compared to those who go to the cinema to watch a movie?
Elizabeth Denham [00:55:23] I think it's such a great question, because I think the principle behind this is that the laws to protect children online should be the same as the laws to protect children in the analog world. So I think that's the fundamental principle that we're trying to achieve, because why should the Internet be a wild west for children? The Internet wasn't designed for children, so how do we solve that problem? We can't put it in the too-hard pile. We're used to regulations and laws around children buying alcohol or buying cigarettes or getting into certain movies. We think the same should apply online. And that's the principle behind the age appropriate design code. Alex Davies-Jones [00:56:14] Yeah, I completely agree. And on that point, who should set the limits for social media? Should it be the platforms themselves or the ICO? Elizabeth Denham [00:56:24] Well, in law, the definition of a child in terms of data protection - that's in law, and that's children under the age of 13. So that's already established in law. What the age appropriate design code does is require companies to deliver content based on the age of the user. And so a 16 year old is going to access information and be able to understand a privacy notice much better than a 13 year old. And that's why the code is called the age appropriate design code. Alex Davies-Jones [00:57:06] Yeah. You've already stated here today that if a social media platform is shown to have users who are under the age set in the code, then you would open an investigation. [00:57:20] You've currently got an investigation ongoing into TikTok. Are you able to tell us more about that investigation, and specifically why it is taking so long - longer than originally thought? Elizabeth Denham [00:57:33] TikTok is a broad investigation, and I can't speak about a live investigation in public, as I'm sure you can appreciate, but we are nearing the end of our investigation. And I can tell you what we are looking at. We're looking at privacy notices and transparency. We're looking at the sharing of information across borders. We're looking at the governance and the privacy program of TikTok. We're looking at messaging systems that are perhaps open, that allow adult users to send direct messages to children. One of the complications is that TikTok has recently announced some pretty significant changes to the way it operates. So we have to take that into account in our final report and in any sanction we decide to issue against TikTok. So TikTok is changing. There was a potential sale of TikTok. So, again, we had to step back and look at the whole investigation. But we are coming to a conclusion. So you will soon see the end of our investigation and action. Alex Davies-Jones [00:58:55] Good. I think it would actually be quite useful for the committee to have you back once that investigation is completed, if possible. So maybe we could follow up with that. [00:59:03] Some of the issues you just mentioned around the TikTok investigation - the reasons why you're investigating them - are they wider than just a TikTok issue? Elizabeth Denham [00:59:13] Yes, I think those same issues are at play with other large platforms. I mean, it happens that TikTok is the largest platform used by underage individuals in the UK. It has the greatest number of child users, so it's natural that we would look at that.
We've had complaints about TikTok, but some of the same issues have been explored by the Federal Trade Commission in the US around, for example, YouTube. And these are the same issues: just how are children's rights protected? Alex Davies-Jones [00:59:57] OK, and finally, if I may, I'd like to ask what your opinion is on ID cards for 13 year olds, to be used for - well, it could be used for a range of things, but specifically with regards to age verification on social media. Elizabeth Denham [01:00:16] Again, I would look at what's the purpose of those identity cards or identity tokens, let's say, if they're electronic. Do they actually work, are they fit for purpose? [01:00:32] What are the privacy implications? And so I would have to look at a specific contextual example before we can look at transparency, fairness, proportionality, fitness for purpose. Julie Elliott [01:00:54] Following what Alex and Damian have been talking about, do you think this age-13 thing is actually working? Do you think that young children under 13 are generally using platforms? Elizabeth Denham [01:01:25] Yes, I do. Absolutely. Julie Elliott [01:01:27] And is there anything we can do about that? Because I think they are. Do you think there's anything that we can do about that, to try and legislate, probably, or regulate to try and stop people under 13 using these platforms? [01:01:43] Or do you think it's just gone? Elizabeth Denham [01:01:45] No, I don't think it's gone. I think, as I say, it's hard. It's difficult, it's challenging. It probably takes a village, including parents working with their children to know what their children are accessing online. But when it comes to the law and regulation, I think the age appropriate design code, once it's in force in September, will definitely help. And I think the online harms agenda, which sets out the expectations for conduct and content online, will also be a huge step forward. So I think the UK, in its approach to online harms and data protection, is out in front of other jurisdictions and really trying to tackle the issue of children online. Julie Elliott [01:02:36] The government have given Ofcom the regulation of the online harms area. Do you think that's the right place for it to be? Did you have any discussions with government ministers about the possibility of the responsibility for online harms regulation being with your office? Elizabeth Denham [01:02:57] I'm supportive, of course, of the government's decision - and it is for our government and Parliament to decide who is going to be the leader. I think, because of Ofcom's experience with broadcasting and content regulation, it's logical for it to take on that responsibility. I'm supportive of that. I think that the online harms agenda and the debate around content regulation and moderation won't be an easy one. I think it'll be challenging to balance freedom of speech and security in this space. [01:03:38] And I do think that there's a tension, an existing tension, between data protection and privacy and content regulation, because in order to have content regulation, the regulator and the company are going to have to know more about the users and what's shared online. There is a policy dimension there. Julie Elliott [01:04:07] Following on from that, or sort of interrupting - how do you envisage your organization working with Ofcom on this issue? Because there must be some sort of overlap, I would have thought.
Elizabeth Denham [01:04:18] Yeah, if you were going to draw a diagram, you would see the overlap, because what it essentially is, is that personal information, and the kind of profiling that's done about individuals, determines the delivery of content. So personal data is used to determine what content is served to users. So you can see that the overlap is there. And what's happened in the last 18 months is that Ofcom, DBEIS and the Competition and Markets Authority have joined together and have an active work forum for the purpose of regulatory coherence. So you can see that in regulation, the areas of competition, content regulation and data protection are coming together like they never have before. [01:05:18] And the responsibility is also on our shoulders to ensure that there's a coherent approach to this. Julie Elliott [01:05:27] Do you think the framework puts too much focus on regulating companies as opposed to minimizing harm? So do you think the balance in the framework is right? Elizabeth Denham [01:05:38] Well, we haven't seen the detail of the framework in the bill yet. When the bill comes, of course, we will be interested in commenting, as we do on legislative proposals. I cannot say without the details of the framework. Julie Elliott [01:05:55] And do you have any concerns that heavy regulation might prevent the emergence of competitive platforms to those that currently dominate the landscape in this area? Elizabeth Denham [01:06:08] I think that's why competition regulators, content regulators and the privacy regulator can work together and take on some of those models. [01:06:17] And I don't think that we're worried about defending our perimeters so much as we're interested in making sure that whoever leads an investigation has the public interest in mind. Julie Elliott [01:06:31] It's very encouraging hearing that you're not working in silos, because this clearly, you know, reaches over all sorts of areas, and things will get missed if you work in silos. So that's very encouraging. Julian Knight [01:06:50] Thank you, Julie. Just to follow up, you spoke about how algorithms manage content delivery and their importance. And obviously this committee has heard a lot of evidence recently about how algorithms are dictating our personal lives. Should algorithms be FOIable? Elizabeth Denham [01:07:09] Well, algorithms that are used by public authorities should be FOIable to the point where they're explainable, I would say. Obviously with algorithms there may be some intellectual property issues, as they're provided to public authorities. [01:07:32] But what I believe - and this is going back to the transparency agenda - is that when public bodies that are subject to FOI use algorithms that affect the public, the population, they should be explainable. [01:07:48] People should understand how their data is used. And what we've done is work with the Alan Turing Institute to produce transparency tools on how to explain to the public how algorithms work, and we've also worked on an auditing tool - an algorithm auditing tool - which I think is world leading in terms of the work of the ICO. Julian Knight [01:08:19] What about private authorities? What about private institutions and their use of algorithms? Should they in some way be FOIable - maybe not the very technical data but, as you say, the sort of, if you like, mission statement that lies behind them, the objectives of those particular algorithms?
Should I not be able to see what the algorithms are actually going to do in terms of my data and my potential sources of information when I log on to a particular website? Elizabeth Denham [01:08:53] Under GDPR and the data protection requirements in the UK, an individual has the right to challenge an algorithmic decision that has an effect on their life, and there is a transparency requirement attached to that challenge. I think the limits of it are that, again, you have to think about intellectual property. Then you need a regulator to stand between the company and the citizen or the consumer, because the regulator, behind closed doors, can look at the fairness of the data that's used, the training data, can look at the details of the algorithm, and can come to a conclusion on whether or not it was fair to use an algorithm that makes that decision. So that's where I think, in the private sector, it's important to have a regulator in the middle. [01:09:54] An individual having the right to challenge an algorithm and its impact on their daily lives: I mean, I can't really imagine that many people would know what to do, or would get enough of a heads up, in order to do that. [01:10:07] It's quite a narrow field of opportunity, isn't it, in terms of the transparency you have to challenge? [01:10:15] Surely it would be better if you had a right of access, a right of access involving all of the detail of the machine learning that goes into an algorithm? Again, I think that's why the oversight of algorithmic decision-making, in both the public sector and the private sector, should sit with a regulator. Damian Hinds [01:10:39] Forgive me, I'm not suggesting that someone should ask and then basically receive a huge amount of machine learning material; you know, we've all got lives to live, after all. [01:10:49] But what I'm suggesting is that surely it would be helpful, and it would be much more transparent, if, for example, I was able to FOI a private company and find out exactly what the algorithms that they had subjected my information to actually meant. What was the purpose of them? Elizabeth Denham [01:11:12] What was the purpose of them? How do they work? How do they make decisions? Are those decisions fair? An individual has those rights. But I think, as we say, it's a narrow field. It takes time, and it takes opportunity, for an individual to do that. I think civil society plays a role here. Individuals don't always know what to complain about, but both the Commissioner's ability to do own-motion investigations and civil society bringing these issues to light are all important. I think the regulation of algorithms is the most challenging policy area that parliamentarians are going to face in the next three years. [01:12:04] Surely it would be easier, though, to have something like a digital bill of rights, for instance. Surely it would be easier if we were able to see precisely what the algorithm was meant to do, if we had an automatic right of access, if you like, a bill of rights. Elizabeth Denham [01:12:30] I think the GDPR and the Data Protection Act do give the regulator the ability to look behind the curtain, look inside the black box and see what's happening. Julian Knight [01:12:43] Well, OK, I get that you have the ability to do that. But why don't our constituents have the ability at least to see, to gauge, exactly the impact of algorithms on their lives?
Elizabeth Denham [01:12:56] They have the ability to file a subject access request. [01:13:01] That's in law, public and private sector, to find out how their data is used and who it's been shared with. Whether that goes as far as the code, I don't think so, but there is already a right in law to a subject access request, which an individual, your constituents, can serve on a company or a public body. Julian Knight [01:13:25] Yeah, a public body. Exactly. But not Facebook, for instance. Elizabeth Denham [01:13:30] Private sector or public sector, yes. That is covered by a subject access request. Julian Knight [01:13:37] But that is still only, as I said, quite a narrow field. Surely it would be better for individuals, in the same way that they can look to see exactly what their privacy settings are on certain websites or manage cookies, to be able to see precisely what is being applied to them in terms of algorithms. Elizabeth Denham [01:14:00] I agree that individuals should have the right to see more about how algorithms work but, you know, I think it's untested how far a subject access request goes with a private sector company. I'd be interested in writing to you with more detail about the work that we're doing and our various tools, auditing tools and transparency tools. If you're interested in that, Chair, I can take some time and write to you. John Nicolson [01:14:35] A criticism I'm sure you've heard before is that your office, the ICO, is weak when it comes to enforcement. So could I perhaps start off by asking you about the report published on the 11th of November 2020 into data protection compliance by political parties? Now, you know what that showed, but for those who haven't followed it as closely as some of us, it reached rather disturbing conclusions: that the Conservative Party had ethnic and religious detail on 10 million voters, looking at their country of origin, their ethnic origin and their religion based on their names. Do you think that's an acceptable thing for the Conservative Party to have done? [01:15:24] No, and in our audit work, where we looked at the practices of all political parties, our recommendation was for any kind of ethnicity data to be deleted. And I'm told, and we have evidence, that the Conservative Party have deleted that information. I think what's really important in the audits that we've done of the political parties is that there is agreement with our audit recommendations and that they comply with them. Our next report, which is due in June of this year, will outline all the recommendations and how they have or have not been adopted. John Nicolson [01:16:15] Right. I think the key word there, if I may pick it up, is recommendation, and that is why people sometimes say you're weak on enforcement, because it was only a recommendation. You hope the Conservative Party has accepted your recommendation, but you didn't enforce it, if you had the legal powers to enforce it. Did you have the legal powers to require the Conservative Party to eliminate all this data that it assembled in this very inappropriate way? Elizabeth Denham [01:16:45] The Conservative Party's deletion of the data was done as a response to our recommendation. John Nicolson [01:16:53] Would you have ordered them to do so? Yes, we could have ordered them to do so. In this case, you didn't? Well, we didn't have to, because they volunteered that they would. If they hadn't volunteered, would you have ordered them to do so?
Elizabeth Denham [01:17:04] Yes, we would have. We have seen examples of that with political parties where we made a recommendation, it hasn't been followed and we have ordered the deletion of data. So it's engagement first, enforcement second. John Nicolson [01:17:22] OK, so you play nice and then you get rough if they don't behave. I think that's entirely the right way to treat the Conservative Party, probably. There are three legal bases for data collection. Did the Conservative Party's accumulation of this data fall into any of these legal bases? Perhaps you could outline what they are. Elizabeth Denham [01:17:47] I don't have the audit report in front of me, but I suspect we're looking at the legal bases of consent, legitimate interests or democratic engagement, is that right? John Nicolson [01:18:01] Yep. I think that's exactly right. I mean, you're the expert in this, of course, but that's my understanding of it. [01:18:09] You said that the lawful bases on which the parties were processing personal data were not always appropriate. Is there ever a legal basis on which a party could do what the Conservative Party did, and that is assemble people's names based on their apparent religion? Elizabeth Denham [01:18:35] So religion and ethnicity are both, like health information, special category data that requires a higher standard of legal basis to collect. So again, ethnicity is not an acceptable collection of data; there isn't a legal basis that allows for the collection of that data. John Nicolson [01:18:58] Right. So just to confirm, what the Conservative Party did was illegal. Elizabeth Denham [01:19:04] We made the recommendation that they destroy the data because they didn't have the legal basis to collect it. John Nicolson [01:19:11] Well, that's a roundabout way of saying it's illegal. If there's no legal basis for collecting it, it must be illegal. Elizabeth Denham [01:19:16] So just to confirm, it was illegal: it was illegal to collect the ethnicity data. And there has been... John Nicolson [01:19:25] I'm glad you've been able to confirm that, because John Whittingdale, the minister, has repeatedly said in the House of Commons that what they did was legal. So you have put it firmly on the record that what the Conservative Party did was illegal, and therefore what Mr Whittingdale keeps asserting is simply untrue. Elizabeth Denham [01:19:47] I think that Mr Whittingdale suggested that you ask me in my appearance about the specific collection of data. So, again, we're talking about the collection of ethnicity data. [01:20:03] The Conservative Party and all the other parties had a series of recommendations from our office, which we are following up to make sure they comply with them, but I do want to be clear, because, you know, people like clear answers. John Nicolson [01:20:20] So you've confirmed that it was not legal to do what the Conservative Party did; therefore, it was illegal. So what Mr Whittingdale says on the floor of the House of Commons, that what the Conservative Party did was legal, is simply wrong. That's a yes or no answer. He is allowed to be wrong; I mean, goodness knows, politicians are loath to be wrong about things. There is an absolutely logical conclusion to what you said to me, and I think it just requires a simple confirmation: Mr Whittingdale is wrong. Elizabeth Denham [01:20:59] I don't have what Mr Whittingdale said. John Nicolson [01:21:02] I can tell you what he said, because I asked him the question.
He said that what the party had done was not illegal, and you've just confirmed it was illegal. So he was wrong. [01:21:27] Commissioner, you should be a politician, really; there's nothing wrong with saying the word 'Yes'. But let's move on, because I feel I'm flogging this and your answers are pretty obvious. You mentioned other political parties, and again Mr Whittingdale and others have tried to suggest that all parties were equally bad on this. [01:21:51] But it's worth noting, is it not, that according to your report the SNP, Plaid Cymru and the DUP, all represented in Parliament, did not do this. So it's not true that all the parties did the same thing. Elizabeth Denham [01:22:08] The recommendations in our report cover many other aspects of practice, so not just the collection of ethnicity data; there are many things that needed improvement across all of the political parties. You have to look at the report, the extent of the report and all of the issues that we examined, not just whether or not ethnicity data was collected. John Nicolson [01:22:34] Yeah, I accept that you made recommendations across the board. I accept that. But I'm just trying to confirm that the SNP, the DUP and Plaid did not do what the Conservative Party did. Mr Whittingdale implied that they did. That is wrong, and I hope you're able to confirm that they did not accumulate data based on ethnicity and religion. Elizabeth Denham [01:23:09] I can speak to the ethnicity data. I'd have to go back and check for religion, but I agree that the other parties were not collecting ethnicity data. John Nicolson [01:23:31] On the question of track and trace, were you concerned about the potential for data breaches? [01:23:46] The test and trace programme is something that we have been actively advising on along the way. So, as you know, the test and trace programme was set up at pace to be able to deal with the pandemic, the public health emergency. Our approach is the same as it always is: to be agile, get in there, give advice and improve practice on the ground. Test and trace is a system that we will be auditing, and the test and trace programme knows and is aware that we are starting to audit how they use data. John Nicolson [01:24:28] Were you worried about the private company that was involved in this? Because it didn't exactly have the most glorious track record. Elizabeth Denham [01:24:37] Our audit is looking at the security of the data, the governance of the data, the collection and the disposition of the data at the end of the day. So we're doing a wholesale audit, in flight. John Nicolson [01:24:54] Finally, were you concerned about the plans to give the track and trace data to the police? Elizabeth Denham [01:25:04] My understanding is that that data was not given to the police. I think that was a consideration. John Nicolson [01:25:13] Would you have been concerned had it been given to the police? Elizabeth Denham [01:25:16] Yes, because I believe the test and trace programme is set up for dealing with a public health emergency, and it wasn't necessary to share that information with law enforcement. Giles Watling [01:25:39] In our evidence sessions back in September, TikTok said that UK user data was held in Singapore and the US, but that they're moving all of that to Ireland. Did you know that they're still keeping the data on servers in the US and Singapore?
Elizabeth Denham [01:26:01] I'm not aware of where the data is held; I'm sure that my team that's doing the investigation is aware of that. But I think what's important in terms of where the data is held is that the UK legislation has extraterritorial effect. So we have the ability to investigate wherever the data is held. Giles Watling [01:26:26] Yes, so the EU doesn't recognise, or Theo Bertram said the EU doesn't recognise, the US as a safe place for this storage. Would you agree with that? Elizabeth Denham [01:26:39] I think the challenge right now is that the Privacy Shield, which was the bridge between Europe and the US that allowed data to flow with protection, was struck down by the courts. Now there is work going on between the EU and the US to find another mechanism for the safe transfer of data. So it's not that the data is unsafe; it's a question of what kind of oversight and protection there is of European data in the US and Singapore. Giles Watling [01:27:19] Do you have enough access to that information? Elizabeth Denham [01:27:21] We have worked to gain access to data held in the US in various investigations, so we have the ability to work with other regulators to be able to carry out our enforcement activities. And I think what's really strong about the ICO is that I'm the chair of the Global Privacy Assembly and my deputy is the chair of the PSB Privacy Committee. So that brings us into positive and good relationships to collaborate with regulators around the world, which we do, and I think that helps to safeguard UK citizens' interests. Giles Watling [01:28:07] So you could say you are satisfied with the access you have to US data? Elizabeth Denham [01:28:14] I am satisfied. Now, again, carrying out enforcement activities and investigations in other countries requires us to have a relationship, a memorandum of understanding, with regulators that are on the ground there. Giles Watling [01:28:34] Thank you for that. Moving back to TikTok, they say they're moving all their data to Ireland. Do you have any evidence to show that they have done that, or that they are complying? Elizabeth Denham [01:28:45] I can write to you about what our team has found, but I don't have that knowledge at my fingertips right now. Giles Watling [01:28:53] Well, that would be good. Do you think it's sufficient for a company such as TikTok merely to have an ambition to comply with the regulations? Or do you think it should absolutely comply with the regulations? Elizabeth Denham [01:29:04] No, it should absolutely comply with the regulations, and that's why we are doing a deep dive investigation into TikTok. Giles Watling [01:29:11] Right, that's very good. And I was pleased, actually, by the way, when you said earlier that you were slightly reticent about the idea of ID cards; I think civil liberties must be defended. Thank you for that. Moving on: you were talking earlier about the privacy regulation. Do you think we should be ready to enact legislation equivalent to that here? [01:29:47] Well, I would say that what we hear about most in terms of public concern is marketing and the kind of scams and criminal activity that go on with the use of personal information in electronic marketing. So the public cares most about two things. They care about nuisance calls and nuisance texts, and especially about vulnerable populations falling prey to those.
And secondly, they care about the security of their data: data breaches, the amount of data that's for sale on the dark web. Those are the two highest levels of concern for UK citizens. So I do think we need electronic communications regulation, absolutely. But I think that the cookie regulation, the way cookies work on the internet, is not meaningful control and regulation. I do think people are tired of cookie notices; they click accept, accept. And whether or not that's actually meaningful control for individuals is, I think, an open question for policymakers now. Giles Watling [01:31:08] And I'm guilty of this as well, because you want to get on with it when you go to a site and it just says, you know, cookies, and you just kind of accept it. So it's sort of meaningless. [01:31:15] But going back to Cambridge Analytica and all this data collection, the thing that irritates people all the time is the microtargeting of advertising. Are we getting on top of it finally? Because it's been going on for years now. Elizabeth Denham [01:31:29] I think we are. The public doesn't understand how internet advertising actually works; we read about it in the press, we read specialist reports, but I think regulators are getting on top of this. [01:31:44] The digital markets unit, the work that the Competition and Markets Authority is doing investigating internet advertising, the work that we're doing to investigate real-time bidding and how it affects individuals: I do think regulators are getting on top of that. But I think what needs to be clear is who is ultimately responsible for overseeing internet advertising. [01:32:16] And that's an important question. Giles Watling [01:32:19] Who do you think that should be? Elizabeth Denham [01:32:21] Well, I think from a competition aspect it's the CMA, and for the use of personal data, and the use of personal data to microtarget, I think that's our office. [01:32:34] But there are other bodies as well involved in looking at fairness in advertising. Giles Watling [01:32:40] Thank you, that's very clear. So if the UK has a data protection regime with GDPR, but we don't have the privacy regulation that the EU has, would that undermine our ability to ensure robust data protection in the future? Elizabeth Denham [01:33:04] It could do. I do think that the electronic marketing regulation and the GDPR need to work more closely together and to reflect the same standards around consent, for example. I think that's really important. Julian Knight [01:33:22] Just to pick up a couple of points that you mentioned in your answers to Giles there. In relation to the CMA and its work, do you think competition law is a means by which to make large social media companies behave as better citizens? Elizabeth Denham [01:33:37] I think it's going to take a village to change the behaviour of companies of a size we've never seen before, and I think it's a combination of the kind of accountability and governance that we expect through competition, content regulation and data protection. I do think those silos and those walls are coming down, and we have an opportunity, I think, to choreograph our response to big tech using the expertise of all three of those regulators. I think what we need is a coherent and coordinated response to the social media and technology companies.
Julian Knight [01:34:25] Just to clarify, when you say it takes a village, I've actually never heard that before. Maybe it's because I do actually come from a village; maybe that's the reason why. But what is it? Does that mean basically lots of different organisations? Elizabeth Denham [01:34:39] When I say it takes a village, I think it's really important that we get the right law in place, so that's policymakers and parliamentarians. I think it's going to take the public to deeply care and be prepared to change providers, for example, if they're unhappy. It's going to take the regulators to be bold and do the work that they do. That's what I mean: it takes civil society, it takes individuals to vote with their feet and it takes the regulators to have the right approach. Julian Knight [01:35:19] Thank you for clarifying that. You also very briefly touched on ID cards, which Giles did too. I'm very much in agreement with him; I can't think of anything worse. But do you think the management of the pandemic would have been easier with ID cards or some form of ID token? Elizabeth Denham [01:35:38] It could have been, but I think, I mean, none of us saw what we had to deal with coming in 2020. And I do think that what's happened in the pandemic is a huge acceleration in the use of data, a huge acceleration in the take-up of digital services to be able to run our lives in isolation. What's happened is we've fast-tracked where we are now; we've fast-tracked what would have taken us five years. And now what you have sitting in your lap as policymakers is how to go forward and grapple with identity management, ID cards, employees working from home and the kind of rules and regulations around that. So I think ID cards need very careful consideration. And again, I could see that vaccine passports of some sort would be useful, I can see that, but people have to trust the government when they bring in these initiatives: to understand what the purpose is, to narrow it as much as possible, and to make sure at the end of the day that civil liberties, human rights and data protection are respected. Clive Efford [01:37:10] You updated your guidance to employers back in 2018. When do you expect to update it again? And how often do you think you will update it in the future? Elizabeth Denham [01:37:24] Well, an update to the 2018 guidance is imminent because, during the pandemic, we've been giving a lot of advice to employees and employers about the collection of health data in the workplace and about the need, or the desire, to surveil employees to make sure that they're working effectively from home. So, again, I talked about the acceleration of the use of digital tools in the pandemic, and in the employment field [01:37:55] it has accelerated extremely quickly. So we've been giving just-in-time advice to unions, employees and employers. [01:38:06] And we will create a hub of guidance and an update of our guidance, because it's absolutely needed in this environment. Clive Efford [01:38:17] So what sort of practices have you uncovered that are taking place? Some trade unions, I understand, have written to you expressing concern about the lengths some employers are going to in monitoring what staff are doing, and that that may be infringing on their personal privacy. So what sort of things have you found? Is there anything that has alarmed you?
Elizabeth Denham [01:38:42] I think an increase in the kind of digital keystroke monitoring and other types of monitoring: how long somebody is on their computer, how long it takes them to answer a call. That kind of surveillance, going back to the principles in the law, has to be necessary, it has to be transparent and it has to be fair, and an individual has a right to see what their employer is collecting on them. So all of those principles apply. But I think, just like everything else in the pandemic, employers are no different; they're trying to protect their workplace and their assets, and part of that is more intensive surveillance of their employees. But all of those practices have to follow the principles in the law. Clive Efford [01:39:34] And did you discover any infringements in terms of inappropriate use of personal data relating to employees? Elizabeth Denham [01:39:42] I haven't seen that directly. But the question of how much of an employee's health information an employer has a right to, I think that is a significant issue. And, you know, 'no jab, no job' and those kinds of issues are real issues that we have to grapple with, not just in data protection but in employment legislation and human rights as well. Clive Efford [01:40:10] I mean, even before COVID we saw that technology was being used in places like large warehouses where pickers operate, and the technology was being used to monitor what staff are doing, which can be very intimidatory for people who are in a very weak position, often in very low-paid work without trade union representation. Are you concerned that COVID has created an environment where that sort of repressive regime can be imposed more easily? Elizabeth Denham [01:40:39] I think that's why civil society, employee groups and oversight bodies like ours have to pay attention to what's happening in the pandemic and in the economic recovery period as well. So it's not just this current period; we know we'll go through years of economic recovery, where new uses of data will be proposed, and again those same principles need to apply. Clive Efford [01:41:10] I want to move on to BA and the data breach which happened in 2018. Do you think that BA responded appropriately and adequately to that breach? Elizabeth Denham [01:41:25] They did. I mean, the tragedy and the disappointment is that they didn't take good care of customer data, so there were large gaps in the security of the data and therefore the breach. [01:41:46] The fines that we issued, both to Marriott and to BA, were reduced partly because of pandemic pragmatism and the hit that the pandemic has had on the hospitality industry. Clive Efford [01:42:10] So can I come to the issue of the level of the fine? Because you say that you took into consideration the impact of the pandemic on those companies, does that mean that if similar breaches were to occur in future, outside of the pandemic, people can expect bigger fines? Elizabeth Denham [01:42:31] Yes. Again, the reason that the fines were lowered in that case was partly because of the additional information that we received in submissions from British Airways and Marriott, but also because of the economic circumstances that those organisations were facing. A £20 million fine is still a significant fine compared to our previous cap of £500,000 under the previous legislation. Clive Efford [01:43:07] BA was fined £183 million.
What were the worst aspects of it that made you impose that size of fine? Elizabeth Denham [01:43:17] Well, that was a recommended fine; the actual fine was £20 million. It was the number of individuals, of clients, that were affected and the detail of the kind of information that had been breached, including passports, et cetera. So it was a significant issue because of the number of people affected, the detail and the sensitivity of the information that was breached, and also the fact that it was entirely preventable. Clive Efford [01:43:56] Do you think that the fines do make companies sit up and respect customers' data more? I think the level of fines and sanctions that are in the GDPR are important because they get the attention of the boardroom. [01:44:15] And since the GDPR was passed, there's been more investment in data security than at any time in the past, and part of that is probably fear of sanctions or fines. [01:44:31] But let me say that GDPR is not all about fines, and the law is only two and a half years old, so there's still a journey with this law. But I think one of the most significant concerns of the public is data security, and that's why the fines are an important deterrent. They're not the only tool that we use in the toolbox, though. Clive Efford [01:44:57] Do you think the government does enough to protect people's data, and particularly data relating to their identity? Elizabeth Denham [01:45:08] I think the government still has a way to go to protect systems, especially legacy systems. But I do think the general trajectory is better sensitivity, more control and more security to protect data. And that's also the way to ensure public trust in the future. [01:45:32] It is a duty of the government to have reasonable safeguards in place to protect personal data; that's part of the law. Clive Efford [01:45:38] And are there sufficient safeguards in place for people who suffer identity theft to actually prove who they are and get their identity back? Elizabeth Denham [01:45:49] So, I mean, that's a whole area of deep public concern as well. We don't have a role in compensating individuals who are harmed by identity theft; it's the Financial Ombudsman and the FCA that are in that space. Clive Efford [01:46:08] Do you have any suggestions for steps that the government should take in order to protect people from identity theft? Because, after all, it is data that starts the process. Elizabeth Denham [01:46:20] I think the National Cyber Security Centre has a lot of advice for individuals and for organisations, and I think they've done a really good job with their Cyber Essentials programme in helping government and local government, and smaller organisations, comply with reasonable safeguards and standards to protect data. Clive Efford [01:46:44] Do you think that companies that ask for data, say, you know, train companies when they sell tickets, for instance, ask for data that is not necessarily germane to the transaction? That there's too much data being sought at times, unnecessarily? Elizabeth Denham [01:47:01] We have seen many complaints about the over-collection of information, and really, collecting too much information and storing too much information is simply a significant risk for companies.
Because, again, you have more data that could be exposed or misused, and if there is a data breach, retaining unnecessary data is just going to exacerbate the problem that the company has. And we've seen that in many of our data breach cases: data that shouldn't have been collected, or data that shouldn't have been retained, exposing the company to claims and to sanctions by regulators. Damian Green [01:47:52] We are in the middle of holding an inquiry into the music industry and streaming and payments to artists and so on, and the use of data has become an interesting subset of that. Many of the stakeholders we've had in front of us have asked for the standardisation of data when it comes to music rights, so that everyone, excuse me, gets paid properly and nobody gets missed out within the current UK regulatory framework. [01:48:24] Who should this fall to? Is it you, or is it someone else, or isn't there any kind of regulatory framework at all? Elizabeth Denham [01:48:36] Is this really about copyright? Clive Efford [01:48:39] Well, no, not really; it's about the use of the data, which drives a lot of consumption of music on streaming services. Boiling it down, the artists who complain to us and the songwriters are saying this data is kept private between the streaming services and the big record labels, and so they can't tell whether they're being paid fairly or not. So it's that use of data, and the keeping of that data private, that I think might impinge on your sort of realm of powers. [01:49:21] And I just wondered if it does, or is this the first time you've been asked this type of question? Elizabeth Denham [01:49:27] I wonder if I could take it away and write to you in response to that? Because, again, we're in the business of regulating the collection, use and disclosure of personal information. And so if we're talking about, I guess, the number of hits on a certain song, that might be information that determines how the musician is paid, so that's between the platform and the songwriter; but personal information, that's the area that we're in. I don't know, Paul, if you're listening, whether you can help us out here. Paul Arnold [01:50:09] I'm sorry, but I think we'll take that away and look into it. Damian Green [01:50:16] I mean, sort of expanding it a bit, then: it sounds to you at first instance as though this is a sort of commercial arrangement between two bodies and therefore wouldn't fall under any definition of personal data that you deal with. Elizabeth Denham [01:50:31] I think so. And, as Paul said, I haven't come across this, but it sounds like a commercial relationship involving non-personal data. Damian Green [01:50:43] So I suppose that leads on to the question of whether there are any mechanisms for regulating this type of non-personal data. I mean, you will know as well as anyone, obviously, what the boundaries of your powers are. [01:51:00] But does it feel like this is just a gap, that there just isn't any regulation of this type of thing, and that if the committee or anyone else came to the conclusion that perhaps there ought to be some kind of regulation, not least about the transparency of the use of algorithms in this type of activity, we'd have to start from scratch with primary legislation? Elizabeth Denham [01:51:27] I'm not sure; I'd like to take that away and consider whether algorithms are used, but if the data is non-personal, then it's definitely not us.
So let me have a think on that one and we'll write to you. Julian Knight [01:51:46] I think that's a really pertinent question, and it also feeds into the questions that I was asking earlier in the session... if an individual has the right to challenge an algorithm that impacts their lives, whether it's a private or a public company. To a certain extent, a cartel, effectively, with non-transparent algorithms is then managing a market, which means that the producers, the artists if you like, within the music industry are not gaining fair recompense for their work. Is that not a public harm? Elizabeth Denham [01:52:22] It could be, but is that not a contractual issue? Julian Knight [01:52:29] If the means by which that cartel (and I'm using that phrase, I'm not judging) is arranging its business approach is by means of algorithms, and that is not clear in a contract, then the power balance is enormously in their favour. The power of individuals who have a right to challenge an algorithm doesn't have any bearing on that, does it? Elizabeth Denham [01:52:59] Now, I think, I mean, that's again a complicated issue. I'd like to have a further discussion with you on that, because when I speak of an algorithm and individuals being able to challenge it... Julian Knight [01:53:16] We'll pick this up in writing, probably, if we can, because I think it's a really important area and it sort of moves into our inquiry. [01:53:24] Just finally, if you like, going back to where we were at the start, where Kevin talked about Cambridge Analytica. I remember interviewing you at that particular time on quite a few occasions. Looking back on it now, what's your feeling? What reflections do you have on that period, and on the sort of circus that went around the whole Cambridge Analytica affair and the fact that, to a degree, it was being used as a means by which to undermine the referendum result? Elizabeth Denham [01:54:02] We were very careful because we're a non-partisan, apolitical, evidence-based regulator. It was really challenging for us to do our work because it felt like the investigation was on the telly every night and, you know, witnesses were popping up here, there and everywhere. Carrying out a serious investigation in the public domain like that was extraordinary. We had many witnesses who refused to be interviewed by us but were happy to talk on the six o'clock news, and that makes it really difficult. I don't have the ability to compel individuals to be interviewed; I don't have that power in law. So, again, there would be lots of claims, and we had to interview those who were willing to be interviewed, we had to look at the evidence and we had to keep government and Parliament up to date. As you said, I was in four times in one year while we were investigating, and part of our investigation was behind closed doors while the rest of it seemed to be conducted in the public domain. So, very challenging. Julian Knight [01:55:28] Did any of that force your hand or change the way in which you approached things? Because my position was that there was a degree of confirmation bias about much of the reporting around this. We had much-lauded, prize-winning journalists coming out with quite bizarre conspiracy theories and diagrams and that sort of thing.
And we had, frankly, disjointed Netflix films. We had witnesses who claimed to be whistleblowers but who were actually in the market for selling information in much the same way as those they were accusing. Is your feeling that maybe there was just a circus, that it was confirmation bias, and that effectively you were being used as a means by which to bring about a political objective, as in the undermining of a democratic decision of the people of this country? Elizabeth Denham [01:56:23] That's not what we were investigating, so that may be coloured by others. The reason that we investigated the use of data in political campaigns, and we started that investigation before the news about Cambridge Analytica and Facebook, months before those stories ran, was that we could see the train leaving the station in terms of the technology developments, the behavioural advertising models that were being used in democratic elections before there had been any kind of public debate. So the question that was driving me forward was this: behavioural advertising techniques are used to sell us holidays and trainers; is it right that those techniques are transposed into the democratic realm? Those were the questions that we were trying to answer, and we tried our best to use the evidence we had in front of us. I mean, gosh, it makes a good narrative. [01:57:38] You've got, you know, a lot of characters involved in that story. But what's really important is that we were able to look at the data and to interview those who were willing to talk to us. [01:57:51] And we haven't finished that work, because, again, it led to investigations of credit reporting and data brokers. [01:58:02] So I think what we did was look at the whole ecosystem of political campaigning, and we will have got to a good place if we have started a debate that needs to be had. Julian Knight [01:58:15] Who wasn't willing to speak to you? You said that they were happy to appear on the six o'clock news or the front page of The Guardian. Any of the so-called whistleblowers, Wylie et al? Elizabeth Denham [01:58:37] I don't want to give you any inaccurate information, so I would feel better if I could write to you. Julian Knight [01:58:43] Yeah, OK. But what is interesting is that, because you've referenced it in the public session, I would actually release that particular information as it comes along.