Lords Communications and Digital Committee takes evidence on freedom of expression online
The House of Lords Communications and Digital Committee heard evidence on the subject of freedom of expression online.

Witnesses were:
Ayishat Akanbi, cultural commentator
Dr Jeffrey Howard, Associate Professor, UCL Department of Political Science

In introductory remarks, responding to questions by committee chairman Lord Gilbert of Panteg, Ayishat Akanbi said there was a sense of fear surrounding expressing yourself online.
Responding to Lord McInnes of Kilwinning about the differences between online and offline expression, Ayishat Akanbi said people were less inhibited and more combative offline, but because they spent so much time online, they had a fear of expressing themselves because of the consequences. Jeffrey Howard said there were benefits of amplifying your message, but there were equal dangers; the problem was one of moderation. Ayishat Akanbi spoke of harmful ideas, including incitement and self-harm.

Baroness Bull asked whether the Internet was a democratic space, or whether there were sectors of people unable to express themselves for various reasons. Jeffrey Howard worried that the pervasive nature of shaming and attacking people was unhealthy; some people were reluctant to speak because they feared for their safety. Ayishat Akanbi agreed that people, especially in vulnerable groups, could not speak their minds if their views ran counter to the majority. For example, you could say anything about a straight white male but could not say the same about any other group. On "protected characteristics" such as body shape, Jeffrey Howard said definitions were fraught; it was necessary to contextualise.

The Bishop of Worcester referred to polls about freedom of speech: had there been a change in public attitudes? Ayishat Akanbi said people assumed a nefarious intention when there was talk of free speech. For example, they might equate it with the freedom to be prejudiced, rather than the freedom to explore different views. She noticed that it tended to be older people, especially men, who wanted more freedom to speak; some took it as a euphemism for wanting to be racist and homophobic. She said she did not know of any contemporaries who shared her view that it was important to fight for free speech and to distinguish between free speech and speech we hate. Jeffrey Howard said free speech was as vital today as ever.
The limits may have changed over the years, but that had nothing to do with the Internet.

Replying to Lord Vaizey on the subject of digital citizenship, Ayishat Akanbi said the problem was a lack of social media etiquette (a phrase widely liked by members of the committee). Jeffrey Howard urged young people to consider online content critically and not be duped by misinformation, and to learn to be tolerant of people with whom they disagreed. These went to the heart of the principles of free speech.

Lord Colville of Culross asked whether platforms could strike a balance between online harms and freedom of speech. Jeffrey Howard was encouraged by progress made, but said social media platforms had a moral obligation to do more and were best placed to do so. He pointed out that if large companies like Facebook were broken up, the resources at their disposal for content moderation would be reduced. Ayishat Akanbi agreed with the need to be realistic about expectations.

Committee chairman Lord Gilbert talked about definitions of harmful content; for example, questioning government guidance could undermine compliance and be taken as harmful. Jeffrey Howard said it would be a clear mistake to ban disinformation, except in targeted cases, such as when its falsehood was incontrovertible and dangerous. It was sometimes necessary for speech to cause harm before it could be banned.

Baroness McIntosh asked how to teach young people that there was no legitimate distinction, in terms of moral weight, between what was said online and offline. Ayishat Akanbi suggested IT lessons could include modules on social media. It was not clear how to better educate adults; perhaps it was about having more conversation about how to disagree with each other.

Replying to Lord Storey, Jeffrey Howard said it would be morally wrong for social media platforms to do the bidding of totalitarian regimes, and so there could be ethical issues in simply operating in those countries.
Baroness Quin raised the role of the State in this area. Ayishat Akanbi said there could be legislation to prevent online comments remaining "on the record" for all time. Jeffrey Howard felt there had to be some sort of democratic oversight, but the best way was to work together with platforms, providing guidelines. In the long term, proper regulation would be essential, but he did not know what form that would take. He approved of many elements of the Online Harms White Paper.

He replied to Lord Allen of Kensington that algorithmic design was crucial. He thought it could be possible to introduce mechanisms to slow down the rapid pace of discourse on social media in order to make people more thoughtful about their actions. Ayishat Akanbi agreed that getting people to respond more slowly, perhaps by being forced to open tweets before responding to them, would be very helpful.

Baroness Grender asked about Twitter's decision to moderate Donald Trump's tweets. Both witnesses agreed that it made sense to issue a 'health warning' on some of his claims.

Baroness Buscombe commented that the ability to be anonymous enabled great harm and abuse. Jeffrey Howard shared the concern about anonymity, but said there were cases when remaining unnamed was important; in a mixed model, some platforms could protect anonymity while others did not. Ayishat Akanbi agreed that inauthentic accounts were more dangerous than anonymous ones, even though the latter could cause harm.