DIGITAL WILDFIRE: (Mis)information flows, propagation and responsible governance 

Delphi panel round 1 report 

INTRODUCTION

Thank you once again for completing Round 1 of the Digital Wildfire Delphi. This questionnaire sought opinions on the phenomenon of digital wildfires, in which misleading or provocative content on social media spreads rapidly and can cause serious harm. It particularly focused on opportunities for the responsible governance of social media. This report briefly summarises the responses we received and highlights the similarities and differences between four groups of respondents: 1) social media users; 2) respondents from social media platforms; 3) lawyers; and 4) respondents from institutions.

The report also provides information which respondents can draw upon in considering their responses to the questionnaire for Round 2, which seeks to:

· confirm panellists’ views on the usefulness of the concept of ‘digital wildfires’ in debates over the responsible governance of social media communications in the light of responses to Round 1;

· establish whether their views have altered as a consequence of other panellists’ views on responsible governance; and

· provide respondents with an opportunity to discuss the technical and political feasibility of different methods for governing harmful social media communications.

The Round 2 questionnaire is designed to provide panellists with the opportunity to record the strength of their agreement or disagreement with views on the usefulness of ‘digital wildfires’ as a concept and with views on whom, if anyone, ought to be responsible for governing social media communications. It also provides respondents with an opportunity to rate the technical and political feasibility of alternative methods for governing these communications. There are also free-text boxes after each question in which panellists are asked to record the reasoning behind their views and which provide an opportunity for respondents to identify any other issues which they think ought to be addressed.

SUMMARY FEEDBACK ON RESPONSES TO EACH QUESTION IN ROUND 1 

Question 1: Would/do you recognise digital wildfires and if so, what do you think are their main characteristics? Can you give some examples of recent digital wildfires you have noticed?

A large majority of respondents across all groups said that they recognise digital wildfires. They described digital wildfires as involving the fast spread of posts (text or images) on social media which might include negative, false, unverified, antagonistic, prejudiced or misleading content. This spread of content can have (intentional or unintentional) effects such as igniting debate, sparking violence, causing offence or creating confusion. Examples given included: allegations connecting high profile individuals to historic child sexual abuse crimes; Gamergate; rumours about the death of Whitney Houston; Justine Sacco’s ‘racist’ tweet about HIV and Africa; and the online ‘shaming’ of individuals.

However, a number of respondents (in particular lawyers and those from social media platforms) pointed out that the concept of digital wildfires can be ambiguous and that the nature of social media content can be open to interpretation.

For example: “It is impossible to tell a digital wildfire from genuine interest of a news story, gossip, moral panic, and even humour.” [Lawyer]

Question 2: Who, if anyone, should be responsible for monitoring or controlling digital wildfires?

Respondents in all groups suggested that responsibility should be shared; for example by: social media platforms and administrators; police and law enforcement; civil society; and individual users. Respondents from institutions were most likely to place emphasis on the responsibility of social media platforms whilst lawyers and respondents from social media platforms were the most likely to caution against further regulatory interventions.

For example: “I think this should be done through a collective approach that makes it important to tackle social media harm.” [Social media user]

“Social media sites should be responsible for removing such content and people should also realise they have a responsibility for not spreading misinformation.” [Respondent from institution]

“…given that political judgement is often involved in deciding what is and is not a ‘digital wildfire’…I do not think that regulators should get involved beyond their current role/law” [Lawyer]

Question 3: Insofar as digital wildfires ought to be controlled how ought they to be controlled? Is existing legislation and its enforcement adequate? Is self-regulation amongst social media users sufficient? What might be the unintended consequences of (self-) regulation?

Social media users reflected on the strengths and weaknesses of different forms of regulation. Respondents from institutions often described current legislation and/or its enforcement as inadequate; they also expressed reservations about self-regulation. Respondents from social media platforms emphasised the value of self-regulation, but also its limitations, as well as the importance of allowing platforms to be open and innovative. Lawyers referred to a governance role for social media platforms but were wary of (further) legislation and any potential suppression of freedom of speech; they described self-governance as limited but important. All groups noted the value of connecting (existing) legislation to other forms of enforcement.

For example: “I am not sure if we need something else or just better enforcement of social media laws which sometimes is lacking. The police enforcement needs to be better.” [Social media user]

“Self-regulation and reporting from third party users will always benefit control of a situation and be able to recognise Digital Wildfires in their infancy.” [Respondent from institution]

“For user self-regulation, there are many unintended consequences that are socially constructed and difficult to predict.” [Lawyer]

“Self-regulation would typically rely on the ‘wisdom of crowds’, which could easily degrade into the ‘folly of crowds’ – particularly as it is this ‘folly of crowds’ which leads to the digital wildfire in the first place.” [Respondent from social media platform]

Question 4: What, if any, limits would you place on freedom of speech in social media communications? Please illustrate your answer through reference to examples.

All groups displayed respect for freedom of speech while acknowledging the need for some limitations to it. They differed over whether further limits might be necessary, with lawyers and respondents from social media platforms tending to regard current arrangements as sufficient.

For example: “Rather than place limits on freedom of speech, it can sometimes be better to allow these views to be publicly aired, but ensure that they are appropriately and effectively challenged. This can then turn the tide of any digital wildfire, challenging hatred and incitement whilst not placing limits on freedom of speech. It is likely that if limits are placed on freedom of speech, those whose freedom of speech has been limited will only use this as further fuel for the wildfire”. [Respondent from institution]

“The best response to bad speech is more speech.” [Lawyer]

“…it is hard to regulate this type of behaviour through standard command-and-control regulation, nor is it even desirable to impose legal controls on speech that would be proportionate and necessary in a democratic society. Therefore, legal controls cannot work – leaving either technological controls or social/community based controls. Technological controls are too simplistic or through design faults, either over filter or under filter.” [Lawyer]

Question 5: What other ethical considerations do you think there are in the governance and regulation of responsible uses of social media?

Respondents referred to a range of ethical considerations, highlighting that social media regulation is complex and cannot be considered separately from dynamics in wider society.

For example: “One of the best ways to address ‘responsible’ behaviour on social media is through education and social media literacy. At the same time, it should be accompanied by a clear understanding that freedom of expression also includes the right to offend.” [Lawyer]

“Diverse communities have diverse values…One must be sensitive to the value of real diversity to appreciate why an experimental and open field of alternative approaches must be tolerated here as elsewhere.” [Lawyer]

“Something to break the current trend in social media of surfacing reinforcing views by trying to promote ‘things’ the user might like based on previous preferences and activities. This is driven by the advertising domain but is leading to viewpoint bubbles which are effectively dry tinder to digital wildfires.” [Respondent from social media platform]