Minister gives evidence to Lords committee on regulation of the Internet
Margot James MP, Minister for Digital and the Creative Industries,
Department for Digital, Culture, Media and Sport, gave evidence to
the Lords Select Committee on Communications on 12 November on the
subject of "The internet: to regulate or not to regulate?"
Summary

Replying to Committee chairman Lord Gilbert of Panteg, she explained the digital charter, which she said was not set in stone. It was there to evolve as the technology evolved. A White Paper on the internet safety strategy would be published this winter, followed by legislation as soon as the parliamentary timetable permitted.

The government was “giving a great deal of thought” to the current regulatory structure. Because the online environment presented unique challenges, it was felt there was too much of a gap in the regulatory armamentarium, allowing swathes of online activity to proceed with little or no regulation.

On resourcing of regulators, the minister said the government had recently increased significantly the money, human resources and expertise available to the ICO in anticipation of the greater volume of work likely to flow from the new data protection legislation. But there were other areas where the online world had presented new and far greater challenges than many existing regulators were resourced to cope with, and there could be a gap in resourcing.
On the question of balancing further regulation with innovation,
the government was extremely keen to get that line right. Any
measures introduced to make citizens more secure online would set
clear online-safety expectations to protect users from harmful
behaviour and criminality without deterring innovation and growth
within the tech sector.

Responding to Baroness Chisholm of Owlpen’s questions on social media platforms and reporting of problems, the minister said a revised version of the code of practice was being prepared, in particular to improve the appeals process.

Baroness Kidron asked about algorithms and the approach to regulating them. The minister felt there should be greater accountability for and transparency in how algorithms were developed, established and set, so that people had a better idea of how they and their data might be affected by them. Their impact would be studied by the new Centre for Data Ethics and Innovation being launched this month. The Competition and Markets Authority had also been asked by the business department to look into the effect of algorithms in certain sectors. She agreed with Baroness Kidron that it was not enough to criticise bad performance after the event; expected standards of behaviour should also be set in advance. She also agreed that deliberately creating compulsion through the use of algorithms was harmful and was at the nub of many problems with social media platforms. The government was looking at a duty of care and would come forward with proposals in the White Paper. But ultimately it would not be a one-size-fits-all solution; there would be various ways of approaching the challenge of getting a more respectful online environment and protecting people’s rights.

The Chairman asked the minister to comment on the behaviour of airlines in conducting the “scam” of using algorithms to split up passengers travelling together and then charging them to sit together. Margot James said technology enabled greater opaqueness, but it was not necessarily always responsible for the motivation of the company to try to cheat its customers, and it was important to have greater standards of corporate responsibility.

Responding to a question by Lord Goodlad, she said the government was not seeking a hasty retreat from the e-commerce directive, but more could be done under the directive, in particular in relation to hosting illegal content; indeed, Germany had changed its law on this at the beginning of the year. The Lord Bishop of Chelmsford asked whether, just as a cinema vets a film before showing it, the algorithms that “muddle the seats on the plane and then charge us to sit next to our loved ones” could do a similar job before content goes on the internet. Margot James said one of the consultation roundtables she would host later this year would involve more technology experts, to get an idea of what was possible.

On the benefits and risks to consumers of the concentration of the digital market in the hands of a few large tech companies, Margot James said the competition law review announced in the consumer Green Paper, which would report next spring, would look at whether powers were adequate in the face of large concentrations of technologies in a few companies. The Information Commissioner had reported only the previous week on the use of data analytics in political campaigning, saying the processing “of personal data to target political messages to individuals must be transparent and lawful”. There was great concern about the manipulation of people’s data in the pursuit of electoral gain. She added that public interest tests applied to critical national infrastructure in the case of companies being acquired by companies overseas.

Baroness McIntosh of Hudnall asked what principles should determine what was meant by “ethical by design” and how those principles could then be enforced. The minister hoped the Centre for Data Ethics and Innovation would help in that respect.
Margot James concurred with Baroness Benjamin that online crime was a scourge and needed to be tackled vigorously. The Home Office had made funding available to the police to enable specialist units to assess the threat and identify criminality where they saw it. It should not all be down to the public purse to identify illegal material and get it taken down: "We should expect this of the technology companies. They have proved that they can do it with terrorist content, so we need to hold them to the same standards, certainly for child sexual exploitation and sexual abuse images." Replying to Baroness Benjamin, the minister said the issue of online gambling would be considered in the White Paper.

Lord Gordon of Strathblane turned to the question of Brexit. The minister said “we will always be better enabled if we act as part of a more global effort.” She did not think Brexit would affect the online safety agenda either positively or negatively, with one potential exception: data. At the moment, the data protection regime was aligned with the European Union’s GDPR, and it was anticipated that data would be an important part of whatever deal the UK left the European Union under.
The Chairman said the committee had called on the government to
consider what could be done to ensure that work being done by
voluntary organisations, government and schools, was
co-ordinated and more effective. Margot James replied: “That is really high
on my agenda right now.” There was a need for greater
co-ordination. In part responding to that need, the Government
had established the Digital Skills Partnership, which she
co-chaired with the chief executive of Cisco. Local digital skills
partnerships between schools, universities and industry were
being established.

Select Committee on Communications
Uncorrected oral evidence: The internet: to regulate or not to regulate?
Monday 12 November 2018, 3.30 pm
Members present: Lord Gilbert of Panteg (The Chairman); Baroness Benjamin; Baroness Bonham-Carter of Yarnbury; Baroness Chisholm of Owlpen; The Lord Bishop of Chelmsford; Lord Goodlad; Lord Gordon of Strathblane; Baroness Kidron; Baroness McIntosh of Hudnall.
Witness I: Margot James MP, Minister for Digital and the Creative Industries, Department for Digital, Culture, Media and Sport.
Examination of witness Margot James MP. The Chairman: I welcome everybody to this session of the House of Lords Communications Committee. Our inquiry is into regulation of the internet. It is wide-ranging and we will probably report in the new year. I am delighted that our witness today is Margot James, the Minister for Digital and the Creative Industries. We have a wide set of questions for you, Minister. Thank you very much for coming to see us today. As usual, the session will be broadcast online and a transcript will be taken. Minister, perhaps you could introduce your brief to the Committee and tell us about the digital charter and the internet safety strategy; in particular, whether you think that the various regulators have the resources that are required for them to fulfil their brief. Then we will open it up to some wider questions from members of the Committee. Margot James MP: Thank you very much, Lord Gilbert, and thank you all for this very important inquiry that you are undertaking. That is quite a broad question to start with. I will be as concise as I can be in my response. As you mentioned, my brief covers digital and the creative industries. That includes technology, telecoms and the vast majority of what would be called creative industries. One of my top three priorities, in a very wide brief, is securing an online environment that is more respectful of users and protective of their rights and security. That is definitely one of my top three priorities. You asked about the digital charter, which came into being last year. That sets out at quite a high level the principles we want to see embody the internet for our citizens: that it should be free, open and accessible; that people should understand the rules that are applied to them when they are online; that people’s data should be respected and used appropriately—obviously that has had a big airing as a topic for debate in its own right this year; that there should be protections in place to help keep citizens, particularly children, secure online; and, most importantly, that the same rights that people have offline must be protected online and that the social and economic benefits brought by new technologies should be fairly shared. We are committed to those principles. The charter is not just established and set in stone. It is there to evolve as the technology evolves, and guides a lot of what we are doing. You also asked about the internet safety strategy, which falls out of the digital charter, essentially. The sequence of events on this has been that my department consulted through the publication of an internet safety Green Paper last autumn. The Government responded to that consultation and published their response in May this year. At that time we announced that we would be working on a White Paper on tackling and addressing online harms, for publication this winter, and that we fully expected that to be followed by legislation as soon as the parliamentary timetable permitted. The Chairman: We have heard from witnesses about the role of various regulators in the field. There is a variety of regulators with a range of responsibilities. A number of our witnesses have strongly argued that we need either an overseeing regulator or, at the very least, a body that scans the horizon and that understands new developments in the area of the digital economy, new issues as they arise and new remedies. 
What thought has your department given to our current regulatory structure and the extent to which it needs to be changed or improved as we proceed? Margot James MP: We are giving a great deal of thought to that very subject at the moment as we work on our White Paper. A number of regulatory bodies have an online footprint in their day-to-day work: Ofcom, the Advertising Standards Authority, and the Information Commissioner, of course. There are various regulators that apply their work online, but it is not necessarily either their sole remit or the main purpose of their activity. In the way the regulatory landscape has evolved, in line with the point made in the digital charter—that the same rights that people enjoy offline should be protected online—there has to date been a reliance on existing regulators to apply their work online. However, because the online environment presents several unique challenges, it is felt that there is too much of a gap in the regulatory armamentarium which is allowing swathes of online activity to proceed with very light-touch regulation and, in some instances, no regulation at all. The Chairman: I mentioned the resources available to the regulators. Do you think the regulators are sufficiently resourced to hire the staff they need in competition with the internet giants? Margot James MP: The ICO is probably the best live example. We have recently increased significantly the money, human resources and expertise available to the ICO in anticipation of the greater volume of work that is likely to flow from the new data protection legislation. In that case, I can answer your question with a degree of confidence, but there are other areas where the online world has presented new and far greater challenges than many existing regulators are resourced to cope with and where the answer at the moment is that we probably see a gap in the resourcing. But it is too soon to give you a more detailed and accurate response, simply because, as we are currently working on the White Paper, we have to come to a conclusion concerning what we want to regulate, or what we feel needs to be regulated, before we can decide how best to regulate and therefore what resources such a regulator will require. There are certain questions that cannot be answered just yet, but I reassure the Committee that we are working very hard on all those aspects. The Chairman: One final question from me. As you contemplate further regulation in the swathes of unregulated areas that you have described—indeed, that implies that you are considering further regulation—how do you balance that further regulation with innovation and issues arising from that set of public policy balances? Margot James MP: That is a very important question, and we are extremely keen to get that line right. We want to improve people’s safety and security online, apropos the principle that what is illegal offline should be illegal online and treated similarly, and so forth. Any measures that we introduce to make citizens more secure online will set clear online-safety expectations to protect users from harmful behaviour and criminality without deterring innovation and growth within the tech sector. These are both crucial requirements, and we see no reason why they cannot go hand in hand. 
Our aspiration is to arrive at a point where the online world is similar to the offline world in what citizens can expect and how they can expect to be treated, and where we have a flourishing technology sector, which I am hopeful will be all the more positive an environment commercially while the online world is a more secure place for our citizens. I do not see a contradiction between the two.

Lord Gordon of Strathblane: Minister, it occurs to me that, as you have described, the internet touches every aspect of life nowadays. It is very important to realise that the digital charter is not a one-off White Paper; it is a rolling programme. In many ways, however, the expectation is growing. So many problems have been parked, awaiting the digital charter, that the more you include in it the more people will be conscious of what you have not addressed. Undoubtedly, new problems will arise before the ink is dry.

Margot James MP: You put the point very well indeed, which is why anything that we introduce under the auspices of the digital charter, particularly things that we introduce into law ultimately, needs to be mindful of the fact that, if we regulate, we are proposing essentially to regulate a moving target. Things evolve very rapidly with new technology; there is no doubt about that. So we need a flexible approach. There is another point, which is that the evolution of technology will in part provide the solutions that we are looking for. Technology itself will play a huge part in the improvement of the online environment.

Lord Gordon of Strathblane: On the question of potential legislation, while accepting that it is not a given that there will be any, it is probably likely that at some point in the future there will be a necessity to legislate. You have already alluded to the fact that it is a moving target. Frankly, Parliament is not best equipped to deal with something like the internet. We face the choice of giving Ministers Henry VIII powers, which MPs are not terribly keen on, or having them legislate ponderously, probably over a year, through both Houses, at which point the target has moved on. How do we cope with that?

Margot James MP: Certain things move on. Technology itself enables more rapid communication, and sometimes more anonymous communication. There are all sorts of things that technology enables and that will continue to evolve, but that does not necessarily mean that we should or need to sit back and be content with the current regulatory environment. Simply because a sector of society moves at pace does not mean to say that it should not move at pace within the law and within an acceptable regulatory environment.

Lord Gordon of Strathblane: Witnesses at various sessions have thought that it might be helpful for the department to have a horizon-scanning body, a sort of forum that perhaps elects its own chairman and advises the Government of X, Y or Z problem that is on the horizon so that we can take pre-emptive action and prevent the problem arising in the first place, rather than take remedial action later. Do you see that as a potential advantage?

Margot James MP: A sort of technology observatory sounds like a very good idea. We have officials in my department who are tasked with assessing and staying across emerging technologies. The team is highly qualified, which enables us to get the best-quality advice. Of course, we have a fantastic university sector in this country—four of our top 10 universities are in the global top 10—and there is a huge amount of research.
So I think we have the benefit of horizon scanning, but there is no reason why it should not be more formalised in the manner that you suggest. Baroness McIntosh of Hudnall: Minister, I want to take you back to a point of detail in your first answer on the internet safety strategy. You talked about the consultation—and, by the way, this has nothing to do with the question that has just been asked.
Margot James MP: I will have to write to the Committee with the precise numbers which the consultation produced. I have tried to interrogate this base of respondents myself. I will redouble the set of inquiries that I left with officials a little while ago and write to the Committee. I noticed that Ofcom has also done some research—in the last couple of months, in fact. It published it, I believe, in September. From memory, 55% of respondents to the Ofcom consultation revealed that they had observed or been the victim of an unpleasant or illegal action online and thought that there should be better regulation of online platforms. Baroness McIntosh of Hudnall: My point is that percentages can be misleading— Margot James MP: Indeed. Baroness McIntosh of Hudnall: —because if the sample was small, the numbers will not be particularly telling. It would be helpful for us to know, because in necessarily persuading for example the tech companies that there is a serious public concern about this, a small sample might not help. Margot James MP: I will definitely write to the Committee with the numbers. I do not know whether I am reassuring you, but I looked at that research and thought, “Fifty-five per cent think that social media platforms should be better regulated. That’s very interesting”, because I thought it would be more like 80% of the people I speak to, and more. Hardly anyone I speak to does not think that social media platforms would not benefit from better regulation. The Chairman: Thank you for offering to send us that information. Baroness Chisholm of Owlpen: Leading on from that, I have two questions. One is about the Government’s proposal for annual transparency reports on social media platforms. Another is whether the Government’s proposed social media code of practice in the internet safety strategy will look at the important area of platform takedown and the appeal and complaints procedure. Margot James MP: Yes. I should preface my answers with the point that we are working on these approaches and they are evolving. We published a draft version of the statutory code of practice in May this year. Since then we have been consulting a wide range of stakeholders on the code of practice and we are currently developing a revised version of the code. We anticipate that it will set a clear and common approach to online security, that it will help companies to understand how they should promote safety and security on their platforms, and that it will make it clear to users what they can expect when things go wrong and when they report problems they have experienced to the relevant platform. Social media transparency reporting is an expectation that we will set social media platforms and other relevant internet companies with regard to their publication of the number of complaints they have received, the nature of those complaints and their response to them. We see the transparency agenda as very important, and it is what we anticipate coming out of this. Baroness Chisholm of Owlpen: Surely one of the most important things must be that the appeals process is easy for a person to be able to go through, that they can feel they will get some answers and that the large platforms cannot hide behind it. Margot James MP: Exactly. That is completely the purpose. At the moment, many systems for reporting are far too opaque and the response mechanisms are far too haphazard. 
During some of the consultations that I have led myself, I have seen a tendency for users not to bother to report problems because they anticipate that nothing will be done. We have to achieve a cultural change through this new approach when it comes into being. Baroness Kidron: Perhaps I might say at the outset how delighted I was to hear you say in your opening remarks that you want an online environment that is “respectful of users” and delivers on their rights and security. That is so refreshing, because we hear so much about safety, and actually this aspiration for a beneficial technology is very welcome.
Margot James MP: Algorithms are becoming much more the topic of discussion in relation to the hidden biases and directions they are setting, which are not necessarily transparent to the end users. We feel that there should be greater accountability for and transparency in how these algorithms are developed, established and set, so that people have a better idea of how they and their data might be affected by them. This is an area we are looking at in our development of the White Paper. In addition, we are establishing—this month, actually—the Centre for Data Ethics and Innovation, and we will ask it to assess the impact of algorithms in certain areas to make sure that the public are better informed about them. There is not just the new centre. The Competition and Markets Authority has been asked by the business department to look into the effect of algorithms in certain sectors of the economy, under the auspices of the consumer rights Green Paper published by that department in April. I will give the Committee one example of what I believe the CMA is looking at in the aviation sector, where algorithms are deployed to allocate seats on aircraft. Some airlines have set an algorithm to identify passengers of the same surname travelling together and have had the temerity to split those passengers up around the plane, and then when the family ask to travel together they are charged more. That is an example of a very cynical, exploitative means of deploying algorithms to hoodwink the general public, which these various bodies, such as the CMA and the new Centre for Data Ethics and Innovation, are going to get on top of, I trust. Baroness Kidron: Can you see a direction of travel in which, rather than finding cynical or bad practice and then punishing it, we actually set—in advance, pre-emptively—certain levels of behaviour that are expected and will be regulated in that sense? Margot James MP: The answer is yes, and I very much agree with what is behind your question. It is not enough for us to criticise bad performance, although it is very important that we root it out, hold companies to account and expect better of them. As you suggest, it is very important to the realisation of all the benefits of technological change, which are so manifest, that these benefits can flourish and that citizens can take advantage of them with confidence. That is the happy state which we aspire to. Baroness Kidron: I would like to ask one very specific thing. I know you have spoken publicly on this yourself. Tristan Harris of the Center for Humane Technology speaks a great deal about the attention economy and the interruption of free will by algorithmic methods trying to get our attention, hold our attention and reward us for our attention in what might be considered bad-calorie ways. Is this in scope as an issue, as a harm? Is deliberately creating compulsion a harm? Margot James MP: I think it should be in scope, and I will endeavour to make it so in my work towards the White Paper. You touch on something that is really at the nub of so many of our problems with social media platforms: the fact that the algorithms are exactly as you have described. For instance, if you key in “weight loss” on YouTube—or another platform; I do not want to single one out—you get bombarded with great volumes of the same material. Of course, if someone is vulnerable and has any mental health or addiction problems, or anything like that, that can make the situation very much worse. 
There are a legion of different examples that we could deploy on the same theme, which is that these algorithms need far greater transparency and companies need to be held more to account for their deployment. Baroness Kidron: And perhaps a bit of oversight on the recommends. Margot James MP: Yes, exactly. I have huge respect for the Center for Humane Technology. I think it is doing fantastic work. I hope to be meeting it next spring when I visit America. It is definitely on my list of organisations I want to see. Baroness Kidron: Excellent. Thank you. The Chairman: Can I return to your example of the airline seating scam to pull two things out of this? Margot James MP: Yes. The Chairman: When we go about trying to deal with that issue by regulation in one form or another, will we try to regulate the behaviour of the airline in conducting this scam in the first place or will we try to address it through regulating algorithms? It seems to me that actually that may not be an algorithm and that it could easily be done without an algorithm. The definition will have changed.
Margot James MP: That is a really good question, which focuses the mind. You are right that technology enables greater opaqueness, but it is not necessarily always responsible for the motivation of the company to try to cheat its customers, which in essence is what such an airline is doing. We need to dig back to setting greater standards of corporate responsibility so that companies are not manipulating and hoodwinking customers and feeling that that is an okay version of customer service, which it is not. It is not the technology’s fault; it is the poor standards of corporate governance in whatever organisations are culpable in that area. This strays into areas of corporate governance and consumer protection, which are the preserve of the business department. Having been a Minister in that department, I am quite aware of what it is trying to do and I applaud it for that, but just regulating the technology will not get to the root of the problem. The Chairman: I come back to this, because it has emerged from this inquiry that there are two aspects of regulation that we are looking at: regulation of the digital economy and regulation in the digital era. It is an example of where regulation is not keeping up with the way the world is changing in the digital era. It is not itself a digital issue. Bad behaviour has greater consequences because of the pace and capacity, but it is not itself a technical issue. Margot James MP: No. I agree with where you are coming from on this. Your question should inform the development of our White Paper and our thinking about regulation. I have already said that if we do regulate, we intend to do so very sensitively, with an eye to encouraging innovation as well as to making the online environment a more secure place for citizens. But I take your point that a lot of what we are trying to improve is stuff that might have been easier to detect in the offline world than in the digital world. Because of that, the consumer needs to be empowered but also needs protection, because it is easier in the online world to act without people realising what your agenda is. Lord Goodlad: Minister, my question is about the safe harbour provisions in the e-commerce directive. Do you think that after Brexit this country ought to repeal the safe harbour provisions of the e-commerce directive? Margot James MP: I am not quite sure I follow your connection with the safe harbour provisions, which, as far as I understand it, apply to an arrangement that the European Union has with the United States on data protection. Of course I am familiar with the e-commerce directive, which mimics to a certain extent the provisions in the United States by enabling online platforms to operate without liability for the content on their platforms. Is that the gist of what you would like me to address? Lord Goodlad: Very much so. Margot James MP: Okay, thank you. The answer is that we are not seeking a hasty retreat from the e-commerce directive. I should preface everything I say by saying that, as with everything else, what we do post Brexit is all a matter for the negotiations and the deal. What I can say is that more can be done within and under the e-commerce directive. The e-commerce directive permits companies to be liable for hosted illegal content once the company is aware that that content is on its platform. Then, if it does not remove that content expeditiously it is within the preserve of member states to have legislation to fine such a company. 
Indeed, Germany brought this into law at the beginning of this year. The German Government passed a law that makes platforms liable for any illegal content found on their sites and they are subject to substantial fines if they have not taken the content down within 24 hours. There is an interesting aspect here, which I think is important. The company’s liability takes effect only once it becomes aware that the content is on its platform. That enables the company to have a fairly reactive policy in place, and to outsource the policing and detection of all this illegal content to the third sector, to Governments or to the police. Actually, some respondents to our consultation have told us that that is a derogation of duty and ask: why should the public purse pick up the bill for that sort of investigation?

Baroness Kidron: You have answered half of my question. “Once it becomes aware” is the problem, is it not? We have found, particularly in relation to children, that these companies are expecting pre-schoolers to police the internet, as it were. That is unacceptable. I do not mean to put you on the spot, but there is the idea of a duty of care, which we will discuss in the House later this evening. Do you think that a duty of care would match the liability question? If we introduce duty of care, do companies then have a duty of care irrespective of the liability question of the e-commerce directive? I suppose the other thing is that, if they insist that it is only illegal content that they are responsible for, does that push Governments into creating more illegality instead of having a better cultural and—to quote you back at yourself—“respectful” environment?

Margot James MP: We are certainly looking at duty of care as one potential solution as we develop the White Paper, for some of the reasons you have just set out. I do not want us to get ahead of ourselves here, but I suspect that ultimately it will not be a one-size-fits-all solution. There will be various ways of approaching the challenge of getting a more respectful online environment and protecting people’s rights. You make the point about the distinction between content that is illegal and content that is harmful but not necessarily illegal. There is a grey area between the two, of course, as there is offline. For those reasons, I doubt that one solution in the law will be adequate. We will probably look at a panoply of measures that will ultimately improve the online environment for everybody.

The Lord Bishop of Chelmsford: I resisted coming in earlier on the content moderation issue, but I will just try this one out. When a cinema chain is showing a film, it does not wait for somebody to complain that the content was completely inappropriate; it has filters and processes that are governed and regulated, so it decides what is shown and then gives helpful and widely understood recommendations about who the film might be suitable for. Could not the clever algorithms that muddle the seats on the plane and then charge us to sit next to our loved ones—that might be a mixed blessing, but anyway—

Margot James MP: You might be happier with the original seat.

The Lord Bishop of Chelmsford: It depends how long the flight is, but that is another matter. Could not those clever algorithms do a similar job before the content goes up?

Margot James MP: You raise a very good point. One of the consultation round tables that I will host later this year involves more technology experts so that we can get an idea of what is possible.
I should have mentioned earlier—you may already know—that we are working on the White Paper jointly with the Home Office, which has had quite considerable success in removing huge amounts of terrorist content. At first it was expected that terrorist content would be taken down within one or two hours of it going up, but now it is expected more and more that the algorithms and other technological solutions will identify material as it uploads and immediately take it down. That is an excellent example of where voluntary working with the internet companies has produced results, and we want to see more of those approaches in other areas of illegality and harm. On your cinema analogy, it occurred to me while you were speaking that, although you would not expect the cinema to show stuff and then warn people—I think that was your analogy—neither would you expect the cinema chain to be judge and jury over what it should flag up. That is something else that we want to tackle in our White Paper. In fact, a number of companies are keen for the boundaries to be set. Some companies do not see it as their role to decide what content crosses the line and what does not, and it is unusual that they have been permitted to get to this point. The Lord Bishop of Chelmsford: It is also the case, as I am sure you are aware, that some of the big players—I will come to this in my set question in a moment—now concede that it is not enough to say simply, “I’m a platform upon which others stand”. We have gone beyond that now, so it is timely to be thinking about these things.
Margot James MP: There have been some developments. The competition law review was announced in the consumer Green Paper, which I mentioned earlier, which was published in April. That review, which will report next spring, will look at whether our powers are adequate in the face of large concentrations of technologies in a few companies, which is what I think is behind your question. I await that report with interest. It is being driven more by the business department and the Treasury, but obviously my department is keeping a close eye on it. There are moves in the United States to look at this, because a lot of the companies we are talking about are US-domiciled organisations. I do not have a lot to add at this stage. The Lord Bishop of Chelmsford: Just to press you a little on this, our competition law deals on the whole with things of economic interest, whereas one of the big anxieties that we as a world have about a few large companies dominating the market is the harm it will do. We have received some evidence on this. Jamie Bartlett at Demos talked about the harm it will do to democracy rather than to consumer welfare, and Lorna Woods, of the excellent University of Essex, told the Committee that competition law does not really account for non-economic interest. How do we tackle things that cannot be measured economically but where all the anxiety lies? Margot James MP: Going back to the economic thing for a minute, I mentioned the panel reporting in February next year. My department is directly involved with the business department and is working jointly with it on the digital competition expert panel. Democracy could, as you say, be affected by the concentration of so much market power in so few organisations. There are various ways in which we are trying to address the potential impact on democracy of some of the consequences of unpoliced activity online, which is a huge subject in its own right. The Information Commissioner reported only last week on the use of data analytics in political campaigning. She said, “The invisible use” and processing “of personal data to target political messages to individuals must be transparent and lawful”. She found in her report that at the moment it was not. She found a disturbing disregard for voters’ personal privacy, that companies had been wholly negligent and that there had been illegal activity on the part of organisations campaigning in British elections and referendums. There is great concern, and the Electoral Commission, the ICO and the Cabinet Office are all working in this area to strengthen our defences against the manipulation of people’s data in the pursuit of electoral gain. Lord Gordon of Strathblane: I have a follow-up question. Some of the offline activity involving the media is subjected to a public interest test. Is there not a case for looking at a public interest test in relation to some of the internet mergers that go on? Margot James MP: The public interest tests apply to critical national infrastructure in the case of companies being acquired by companies overseas. Critical national infrastructure is grounds for a large merger or acquisition being referred to the Secretary of State for Business, so there is some protection there. In the environment that we are looking at, there may be a bit of a crossover into the duty of care mentioned by Baroness Kidron. It sounds to me as though there might be potential for looking at both these areas within our deliberations. 
Baroness Kidron: I want to go back to the question of economic value and data. One of the big moves in America, particularly among the people whom you may go and see in the spring, is in recognising that data itself has a value. When the Committee was doing its inquiry into advertising, we really got behind the idea that if only we publicly attached a notion of value to data, first, it would be harder for it to be so rapaciously taken away from us because it might have some value and, secondly, the existing levers, such as the Competition and Markets Authority, would suddenly say, “Hang on, how is this value being distributed, and should we take a look at that?”
Margot James MP: Definitely. It is certainly top of mind at the moment. Going back to the Lord Bishop of Chelmsford’s point, there is the economic value but there is also the democratic value and the privacy aspects. It all revolves around people’s personal data, which is why I was so pleased to get the Data Protection Bill through Parliament earlier this year, introducing the GDPR into British law but also going beyond and really strengthening our data-protection arrangements and strengthening the powers of the Information Commissioner—updating her powers not just to fine but to investigate and interrogate and issue criminal sanctions. That is all in the environment of protecting people’s data and giving people power and rights over their data, as well as an appreciation, through education, of the value of that data. To some people, the value may be economic. Other people might just wish it to be kept private. But people need to be more aware of the data that companies and organisations hold on them. Of course, people are now at liberty to make a subject data access request and find out what data an organisation holds on them. It is interesting to note that some American citizens are using our data protection laws to enforce their data rights against companies that are located in their jurisdiction, because of course in America at the moment there is nothing to compare with the data protection laws that respect the value of citizens’ data here in Europe.

Baroness McIntosh of Hudnall: I will come to the question I wanted to ask you, Minister, but on that particular point, it has been suggested, not just to us but more widely, that this whole business about people’s data and the value and protection of it is severely undermined by the apparent fact that quite a number of people, particularly younger people, are far less concerned about the privacy of their data, how it is used and by whom than one would imagine—certainly a lot less than we are. Do you recognise that possibility and do you have any thoughts about it?

Margot James MP: I hesitate to generalise too much. You are absolutely right to say that people value their own privacy and data, online and offline, to varying degrees and that it might be interesting to do some research to find out what the demographics might look like. With young people, you are talking about people who have grown up with the internet and have never known anything else and have always had to accept a system whereby the major social media platforms through which they live their lives extract data—or have done to date—as the price for a so-called free service. With the awareness that I talked about earlier, people, including young people, are becoming cognisant of the fact that this is not free, in fact; that they are giving their data away and it has a value, as I was saying. I attended a meeting of the British-Irish Council last week and the Scottish Government Minister present invited a young person who had been involved in the 5Rights initiative—I know Baroness Kidron has championed and developed that, for which I salute her—which was facilitated by the Scottish Government. It was very interesting. This young person gave a presentation to the meeting at which she said that she and various other young people involved had been really quite horrified at the extent of the extraction of data from them over which they had no prior knowledge and, until very recently, no control.
I do not think we can necessarily break it down by age, but it is an interesting question and it would be fascinating to see a demographic analysis of attitudes to data and its value. Baroness McIntosh of Hudnall: Indeed. Of course we should not generalise, but it is clear that attitudes to privacy, for example, are changing—for good or ill, I do not say. Margot James MP: Yes, and people’s awareness is changing and improving. Some people might welcome not being charged for the services—they are perfectly at liberty to do so—and they might be quite ready to give away data and so forth. But it is not just the extraction and sale of data. There is another aspect to this, which is even less acceptable in my opinion: the processing of that data and the use of it in various opaque settings. I go back to the earlier discussion about democracy and the impact on people’s voting behaviour. We are now in an environment where vast amounts of analysis can be made about a person based on information readily available to an internet platform about their browsing habits, their shopping history, where they live—all sorts of things. It is when that microdata is then used to target them in a way that does not have their consent and of which they are often unaware that it becomes very worrying. Pricing is a key issue—going back to our airline discussion—such as the pricing of various utility providers. There is no doubt that that microtargeting and the amount of information now available to companies about people can result in those companies operating a differential charging system for their services, whether it is energy or telecoms or whatever. People who the company thinks are not likely to leave get charged more, and all this is going on with no transparency. That is what is so concerning. Baroness McIntosh of Hudnall: Indeed, and that can also be used to exclude people from accessing services, which is another very big issue.
A world of technology that can do everything that we can see it can do could certainly do that if it were minded to, but it is probably not minded to. It has been suggested to us that it is unlikely that design will naturally evolve to become ethical and transparent if the people who are doing the designing are left to themselves. The question is therefore, first, what principles should determine what we mean by “ethical by design”. Secondly, and perhaps more importantly, because I know you have got to the principles, how do you then enforce them when it comes to applying pressure? Where will that enforcement come from? This takes us back to the regulation point. Margot James MP: Ethics are very difficult to enforce, are they not? You want an environment where companies are incentivised and their motives and algorithms are aligned with the public good and a higher ethical standard. That is the ideal. I share your concern that just leaving things to evolve might not lead us to that end state, which is why we have established the Centre for Data Ethics and Innovation. We expect considerable work and progress on these issues from that body. Once it has completed its first year of work, we will be closer to assessing the contribution it is likely to make in the long term. I have high hopes for it as an organisation. It will ultimately be on a statutory footing and independent of government, and it has the potential to influence the development of technology very much for the good in the long term. We mentioned earlier the Center for Humane Technology; it has a similar remit. It is good that these organisations are now putting their heads above the parapet and contributing to debate. There are already differences in behaviour when you look at different technology companies. There will be companies that want, in their DNA, to live up to high ethical standards. When I worked in the pharmaceutical industry before I went into politics, there was a company, Merck, that was voted Fortune 100’s most admired American company year after year. The philosophy of the company’s founder was that if you put the interests of the patients first, the profits will follow. I believe in that as a principle of corporate governance, and I do not wish to imply in my evidence that technology companies are not capable of working all of this out for themselves and applying those high standards. That is what we want to see. However, I do agree that we cannot rely solely on corporate good citizenship, given the scale of the problem that we, and I am sure your Committee, have identified. Baroness McIntosh of Hudnall: May I press you a little on that? It is interesting that you have pointed to the pharmaceutical industry, where there is clearly a huge potential for good and an enormous potential for harm. They may not be absolutely equal, but they are certainly close. In the world of technology, and potential harms from technology, apart from issues where there is clear illegality, at the moment it is still up to the person or people who have experienced the harm to initiate the complaint, or whatever process there is. It still relies on the end user stepping forward and saying, “I have suffered this harm. What redress may I now look for?” It is perfectly true that in some cases there will be redress, but, to follow your pharmaceutical company analogy, you cannot leave it to the patient to be dead before you identify the harm. There have to be intermediate steps to make it less likely that a patient will be dead. 
Margot James MP: Of course. I agree. Baroness McIntosh of Hudnall: So following your own analogy, do you imagine a regime as strict as the one that applies in the pharmaceutical industry eventually applying in the world of digital? Margot James MP: I would have to give that thought. On the question of whether we need a stricter regulatory regime, yes, we do. Whether it needs to be as strict as in the pharmaceutical industry is something I would have to reflect upon before giving you a yes/no answer. Undoubtedly, there needs to be greater regulatory oversight. It should not be left to individuals. Individuals should have recourse, but even ensuring that this happens now under the current system would be progress. I think we probably all feel that more needs to be done. I will quote from the ICO’s report into political campaigning. What the Information Commissioner says could apply equally across many of the harms that we are discussing. She said in her report last week, “Whilst voluntary initiatives by the social media platforms are welcome, a self-regulatory approach will not guarantee consistency, rigour or … public confidence”. I concur exactly. Baroness McIntosh of Hudnall: May I leave you with one thought for when you consider this further, Minister? The pharmaceutical industry depends hugely on innovation. There has to be, does there not, a constant stream of innovation in order for the benefits of the pharmaceutical industry to be realised. One thing that the big tech companies and others frequently say about regulation is, “If you regulate, you will stifle innovation”. I simply leave that for you to think about. Margot James MP: By the way, it might reassure you that I do not accept that argument at all. Baroness Benjamin: First, I would like to thank you for the recent announcement of the £57 million Contestable Fund for children’s programmes. Hopefully, some of those programmes will talk about internet safety. We will keep our fingers crossed. My question is about online crime, which is happening thick and fast. The police now have to deal with a huge amount of online crime, including terrorist activities, child exploitation and gross sexual abuse. A friend of mine recently lost £84,000 in a bank transfer to Holland. It is happening thick and fast. Do you think that the resources, powers and expertise provided to the UK police forces are sufficient for the sheer scale and complexity of online crime? Secondly, what initiatives are the Government carrying out to improve the UK’s co-operation with international partners to combat online fraud? Margot James MP: Thank you for your kind comments about the Contestable Fund. I know you played a huge part in getting that initiative on the books, so thank you. I completely concur that online crime, fraud and some of the terrible examples that you just mentioned—sexual exploitation, and so forth—are a scourge and need to be tackled vigorously. The Home Office has made funding available to the police to enable specialist units to assess the threat and identify criminality where they see it. I might have to write to the Committee on the question of whether they have adequate resource, following consultation with my counterpart in the Home Office. I sit on the Organised Crime Task Force as the Digital Minister, and I feel that, whatever resourcing we have, we are facing a tidal wave and, as I said earlier, it should not all be down to the public purse to identify this material and get it taken down. 
We should expect this of the technology companies. They have proved that they can do it with terrorist content, so we need to hold them to the same standards, certainly for child sexual exploitation and sexual abuse images. That side of things should be treated absolutely as seriously as terrorism. The fact that we are not there yet is a severe indictment of some of the platforms. So I do not think it should all be down to the public purse. If we can get the regulation right and our expectations set accordingly, we may find that we have enough resource within policing. I cannot be precise in my answer. The Home Office has allocated funds. Resources are always finite. Whether it is enough depends partly on what we expect the companies themselves to do. Baroness Benjamin: What about online banking? You are encouraged to go online, and more and more people are losing money when they make huge transfers, especially to banks abroad. Margot James MP: It is an area in which people have to be on their guard. The amount of online fraud is extremely serious. Which? magazine did a survey of online fraud and found figures in the billions. Consumers need to get advice from their banks and make sure that they double-check everything before transferring any significant amount of money. In part, I am hopeful that this is an area in which technology will assist the banks in helping to protect their customers better than they are at the moment. Baroness Benjamin: On the question of co-operation, how is the UK working with our international partners? Margot James MP: There is a great deal of international co-operation with all global organisations, banking and financial regulatory institutions and the European Union. The Government are working with a whole range of international partners. Like most aspects of digital harm, it is global in nature and scope. So the Government are working closely with other regulators around the world to try to reduce all this. Baroness Benjamin: Finally, what financial and other resources will be available to the UK Council for Internet Safety? Margot James MP: I will write to the Committee with an exact response on that. The Council for Internet Safety is being revamped. It has a new board and a new remit, but I will write to the Committee to confirm what the resourcing will be. Baroness Benjamin: Fantastic. Thank you. Baroness Kidron: It struck me, in that last exchange, that so many of the answers to the extreme are also answers to the quotidian. As you probably know, the WePROTECT technical board will report on Friday on what we think will help to curb the spread of images of child sexual exploitation. Without pre-empting the answer, so many things come back to impact assessments, the harmonisation of rules, designed standards and so on. I am interested to know whether you see extreme harm and everyday harm in the same continuum, or whether we are always going to be pushed into dealing with each harm separately. Margot James MP: To a certain extent, different harms require different solutions, but I do see a lot of these things as a continuum. You have the horrors of child abuse images online. You also have child sexual exploitation online. Most of the time it is connected, I suspect. Then there is grooming, which is at the start of the scale. Grooming concerns me greatly, because it is increasing. It is very easy for an older predator to masquerade as a young teenager and get the confidence of a young person, perhaps a vulnerable young person, online. 
Because so many young people live their lives online, they do not find it odd to make friendships online with a view to meeting someone. Speaking for myself perhaps, we might have a natural sense of caution that, if you live your life online and have never known any different, you perhaps do not have. So I think children require greater protection in this area than adults. To my mind, at the moment they are not getting it. Baroness Kidron: May I just add to that continuum? We know that there is one more stage: oversharing, competitive popularity, competition and, of course, addiction. We have seen that you start with that cultural piece and it goes all the way through the line. That is what I meant. Margot James MP: Yes. I answered your question more in relation to child sexual exploitation, but potential addiction is another area. As the Lord Chairman asked earlier, is it the technology? What is behind it? People of all generations have been tempted to show off, as children, as young people, sometimes even as older adults. Baroness Kidron: I cannot think who you mean. Margot James MP: The thing about technology, as always, is that it enables and exacerbates and can be so much of a young person’s day-to-day experience that it becomes a problem, whereas in the offline world it was much easier to contain. Baroness Kidron: A vast proportion of sexualised images of young children are actually posted by themselves, their friends or companions. That has grown in this environment, so I am keen to note that that forms part of the continuum. Margot James MP: Yes, indeed it does. Thank you. Baroness Benjamin: We have not mentioned online gambling, which I feel we really need to discuss when we talk about addiction and young people. Many university students at the moment are addicted to online gambling, and gambling companies are targeting young people to gamble online. I would like to know how the Government see this and how we are dealing with it, especially in the case of young students, many of whom are committing suicide and having mental problems because of online gambling and addiction. Margot James MP: We are looking at online gambling as part of our White Paper development. We recognise that the online environment can exacerbate somebody’s gambling instincts and that gambling is introduced in areas where it perhaps would not have occurred before. All these things conspire to create a bigger problem, which we are definitely addressing in the development of the White Paper. Later this year, I have a round-table consultation on gambling, problems online and what we should be doing about it. Baroness Benjamin: Fantastic. Lord Gordon of Strathblane: We now turn to a specific aspect of international regulation. It is amazing that we did not come to it earlier. Inevitably, it is Brexit. What opportunities does it provide, and what threats does Britain face from it? Margot James MP: Online? Lord Gordon of Strathblane: Absolutely. I was not looking for a general answer. Margot James MP: No. Good. We could have been here quite a long time. In this area, this is a very global phenomenon. It is not an area where one country can easily introduce measures to unilaterally deal with the problem in one territory. There are things that the Government are doing, can do and will do, but we will always be better enabled if we act as part of a more global effort. 
When I say “more global effort” I ought to add the caveat “by like-minded countries”, because there are countries with a very different attitude to the internet that we do not necessarily want to emulate. I met with my French counterpart last week, and we discussed various measures that the UK Government are taking, such as the requirement for pornography sites to have robust age-verification systems in place to prove that someone is 18 or over. My French counterpart is very keen to know more; they are looking to do something similar in France. I have mentioned Germany, which I think is already taking more action legally than any other country in Europe, and the Australians have introduced measures. I do not think that Brexit will affect the online safety agenda either positively or negatively, with one potential exception: data. At the moment, our data protection regime is aligned with the European Union’s GDPR, and we anticipate data being an important part—I hope—of whatever deal the UK leaves the European Union under. Data is an important part of that, but we will need an adequacy decision.

Lord Gordon of Strathblane: Do you mean on taxation?

Margot James MP: Data flows are very important. Of the United Kingdom’s data flows, 75% are within the single market, within the European Union. The figure for our trade in physical goods is, I think, slightly under 50%, but for data it is 75%. It is therefore very important that we get an adequacy decision when we leave the European Union. We are fully confident of getting one, but there may be a time lag between the end of the implementation period and the embedding of whatever future framework the Government are able to negotiate. During that time, companies will have guidance from the ICO and the Government on alternative legal routes to the trade in data.

Lord Gordon of Strathblane: While obviously agreeing that anything that we do would be better if it were universally implemented, there is still quite a lot that we can do, and indeed take a lead in, in the hope that others will follow.

Margot James MP: Yes. I am sorry if my answer did not give that impression. I apologise for that. We are doing hugely important things in this country. We are setting the lead on age verification, with the institute for data ethics and our White Paper development. We are in the lead in a lot of these areas, and these measures will have an effect. Will they have a greater effect if they are deployed across borders? Yes, they will, but that is not to say that they will not have a very positive effect when done unilaterally.

Lord Gordon of Strathblane: You mentioned the distinction between services and goods, which prompts me to ask the obvious question. The service industries, including broadcasters, were very much encouraged by the Prime Minister’s Mansion House speech, and therefore slightly taken aback when the Chequers agreement did not provide for any protection for the service industries. Were you equally disappointed?

Margot James MP: My main concern, which is somewhat beyond my brief, was the trade in manufactured goods. The Chequers proposals were very strong on staying true to our desire for frictionless trade at the borders. It is different for services. I am not saying that the sectors I represent are not disappointed—I am sure they are—but the problems and challenges that they face are of a different order from those faced by manufactured goods.
Lord Gordon of Strathblane: But services are far more important to this country than goods when it comes to straight value for money.

Margot James MP: It is true that the economic contribution of services in this country is absolutely the greater; it is probably 75% of GDP. But I gave the example of data transfers. There are ways around the challenge of leaving the European Union that will not be to the detriment of the technology industry and companies that wish to send data over borders. I am confident that we will get an adequacy decision and that we will therefore be able to trade data seamlessly between Britain and the rest of the European Union, once we have that adequacy decision. I do not think that the problems are the same as they are for the manufacturing sector.

Lord Gordon of Strathblane: Yet some broadcasters have already left for Amsterdam.

Margot James MP: On broadcasting you are quite right. I was answering with regard to data. It is true that broadcasting is another subject. We had a very good arrangement under the audiovisual media services directive. For anyone who is not cognisant of this, it means that if a company satisfies the standards of one regulator, it can broadcast across the whole European Union. Roughly a third of all content broadcast across the European Union originates here in the United Kingdom. That regulation has worked very favourably for the United Kingdom.

Lord Gordon of Strathblane: And the loss of it?

Margot James MP: The loss of it will be regrettable, certainly, but the industry is looking at reciprocity, and there is hope that it will be able to counter the worst effects of the loss of protection under that directive. Although some of the organisations you mention are establishing operations in other countries within the European Union, I do not think that any of them are thinking that they will have to move all their operations. As long as they have a significant presence within a regulated jurisdiction of the European Union, that will be adequate. I am not trying to say that it will be as good—it will not—but I think there are ways of protecting the sector that will ensure that it does not have a calamitous result.

The Chairman: Thank you. We have talked a lot about regulation. The other side of the coin is education. In our inquiry into children and the internet, we found a lot of organisations out there trying to do good things in schools, working with children to get them to understand their role in looking after their own online security. We have talked about some of the banking scams, and we have all seen things that people have done which seem to be remarkably stupid—basically giving away their money. Again, education has an important role. We found a lot of good work but very little co-ordination, and we called on the Government to consider what could be done to ensure that all of this work being done across the piece by voluntary organisations, government and schools was co-ordinated and more effective. Have you had an opportunity to look at that?

Margot James MP: That is really high on my agenda right now. There is a need for greater co-ordination. In part responding to that need, the Government established the Digital Skills Partnership, which I co-chair with the chief executive of Cisco. That is designed to bring the various elements of skills training and confidence boosting under one purview.
The other great thing that we are now doing is establishing local digital skills partnerships between schools, universities and industry in a locality. In the West Country, there is the Heart of the South West Digital Skills Partnership. There is one in the north-west, in Lancashire. I hope to launch the West Midlands Digital Skills Partnership jointly with my Secretary of State and the Mayor of the West Midlands early next month. It is a live issue. You are quite right to point it out. There is a lot going on. We do not want to step in and take things over or stop things, but we do want to co-ordinate and, through that process, I hope, identify any gaps. We do not want duplication and gaps, and I am hopeful that the local digital skills partnerships will be able to address those things. Skills and confidence are absolutely crucial to citizens being able to enjoy the benefits of new technology on a more equal footing. At the moment, there are a lot of people, not just children, who are disadvantaged by not having the confidence to go online. Some 20% of people with a registered disability have never been online. That is appalling. A high proportion of people over 65 lack confidence online. We want the benefits of technology to be shared across society, not for certain groups to benefit while other groups fall behind.

Baroness Benjamin: You mentioned the partners you are working with. One partner you might be interested in working with is called UKBlackTech. I do not know whether you have been involved with it, but it is doing a lot of things for young black people who do not feel connected with the world that we are creating, and trying to get more BAME kids to understand the technical, online world that they are part of but not part of. I suggest you get in touch with it.

Margot James MP: Thank you for mentioning that organisation. I had not come across it. It sounds excellent and definitely a group that we will consult.

The Chairman: Minister, thank you very much for the evidence that you have given us and for being open with us and discussing many of the issues that we are considering as we produce our report. We will be reporting in the new year. I hope on behalf of the Committee that this could be the beginning of a dialogue in this very important area of public policy, and that when we report and you respond to us you may come back and talk to us about the issues that emerge from our report.

Margot James MP: I would be delighted to do that. Thank you all once again for this very important inquiry. I hope that the timing enables us to read it and look at its recommendations as part of the final stages of the development of the White Paper. We seem to be working in concert on this, and what you have done here is very valuable. Thank you very much indeed.

The Chairman: Thank you and thank you for your time.

Margot James MP: Thank you.