
The Online Harms White Paper sets out the government’s plans for a package of online safety measures. This package comprises legislative and non-legislative measures and will make companies more responsible for their users’ safety online. The White Paper proposes establishing in law a new duty of care towards users, which will be overseen by an independent regulator. Companies will be held to account for tackling a comprehensive set of online harms, ranging from illegal activity and content to behaviours which are harmful but not necessarily illegal.
The government wishes to consult on these proposals. Here is the response submitted by Fair Play For Women.
About our response to the Online Harms White Paper
We welcome the government’s Online Harms White Paper. We are encouraged that the government is willing to take action to help reduce the serious risks to users that can occur even on mainstream and popular platforms such as Twitter, YouTube and Facebook.
We do not have the relevant expertise to respond to all of the consultation’s questions, so have chosen to respond only in those areas where we can share insight or have concerns.
We have two main points that should be considered as the proposals develop:
Firstly, that online abuse of women and girls is a problem that is not consistently being tackled well by online platforms. The effect of online abuse is that women and girls are not able to fully participate in society.
Secondly, that online platforms can, and do, wrongly censor women and girls for expressing legitimate speech around the existence of biological sex, the realities of male violence against women and girls, and the recognition of these realities under UK legislation. This too prevents women and girls from participating fully in society.
We also have concerns around how a regulator would define and assess legal harms. There could be very negative consequences if the boundaries of what is considered ‘harmful’ are drawn so widely as to capture legitimate speech and expression.
Whilst we welcome the White Paper’s commitments to freedom of speech and freedom of expression, we would like to see further details on how these would work in practice.
Question 1: This government has committed to annual transparency reporting. Beyond the measures set out in this White Paper, should the government do more to build a culture of transparency, trust and accountability across industry and, if so, what?
We support annual transparency reporting as one of the methods that both the government and the public could use to hold online platforms to account. Genuine transparency reporting could be helpful to understand how online platforms’ community standards work in practice, and in the long-term help to improve how they operate.
We would like to see the proposed regulator request data on how online platforms draft and enact their community standards in practice, so that they can be held to account when they get things wrong. At present there is very little easily accessible data or guidance on how online platforms’ community standards are enforced.
This is important because ineffective community standards are harmful to women and girls. Platforms can either fail to protect women and girls from online abuse (e.g. doxxing, threats and harassment) or unfairly censor their legitimate speech. Both have the effect of excluding women and girls from participation online, and therefore in wider society.
Online abuse and democratic participation
Online platforms are an incredible tool for communication, sharing ideas, debate, discussion, humour and campaigning. Their use – in particular of the larger platforms such as Twitter – has become an essential part of inclusion in society.
The White Paper recognises that online platforms can be a tool for abuse, bullying and harassment. This behaviour can have the effect of forcing users out of the online space, excluding them from participation in society. How, and how well, platforms draft and implement their own rules affects the extent to which women and girls are protected.
Online abuse of women and girls has been highlighted in the UK Parliament by the Joint Committee on Human Rights as part of its ongoing inquiry into democracy, free speech and freedom of association.
In April 2019, the committee heard evidence that the threats and intimidation MPs are subjected to through social media constitute a threat to our democracy, and are causing them to restrict their engagement with the public. During a discussion about abuse of women and girls and community standards, Joanna Cherry QC MP said to the representative from Twitter, Katy Minshull:
I am puzzled as to why Twitter does not include sex as a protected characteristic. When we come to look at the very unpleasant videos and tweets that I am about to show, you will perhaps understand my concern that sex is not included as a protected characteristic by Twitter when in law it is a protected characteristic in the UK.
The response from Ms Minshull was that ‘gender’ was included as a protected characteristic. Sex and gender are not the same thing.
Fair Play For Women would suggest that the incorrect substitution of sex with gender might contribute to Twitter’s apparent inability to protect women and girls from misogynist abuse online. We observe that online platforms are drafting their own community standards without regard for the equalities legislation of the countries they operate in. The result is to the detriment of women and girls, who suffer a disproportionate amount of targeted misogynist online abuse.
Sadly, Fair Play For Women and our supporters have been victims of serious online abuse, and have witnessed first-hand how online platforms have either failed to apply, or misapplied, their own community standards.
Suppression of legitimate speech
In the UK there is important and ongoing discussion around women’s rights to single-sex sports teams, hospital wards and domestic violence shelters, in light of potential changes to the Gender Recognition Act (GRA).
In relation to these important discussions, some online platforms have at times enforced their community standards poorly when faced with legitimate expressions of material reality: namely, that biological sex exists, and that male violence towards women and girls exists. These realities are recognised in UK law, which treats sex as a protected characteristic under the Equality Act 2010 and, in this context, defines a man as a male of any age and a woman as a female of any age.
We have seen temporary and permanent bans and other punitive measures imposed on women and girls who express these realities on online platforms, even in instances where their statements could not by any reasonable stretch of the imagination be considered hateful or transphobic, or count as harassment or incitement to violence or hatred. The effect is the suppression and censoring of legitimate speech.
Women and girls thus suffer from both directions: they are often left with no protection from abuse, and censored for legitimate speech.
Question 4: What role should Parliament play in scrutinising the work of the regulator, including the development of codes of practice?
To ensure the development of codes of practice that are fit for purpose, we would value the input of Parliament, especially in making sure that the codes are compatible with the law, norms and values that we expect in a democratic society.
We would expect any group of parliamentarians involved to be representative of all political parties, so that no group is excluded from the democratic process. Fair Play For Women campaigns so that the voices of women and girls are heard in the debates that affect them, and parliamentary input into the development of codes of practice could be one way to help ensure that their voices are represented.
Question 5: Are proposals for the online platforms and services in scope of the regulatory framework a suitable basis for an effective and proportionate approach?
At this stage the proposals lack detail on how they might be applied in practice. We would be very interested to see how these proposals develop, and would value further clarity on how transparency reporting or super-complaints might function in practice to support the rights of women and girls to participate fully in society online.
Protecting freedom of speech
We would value further clarity on how freedom of speech would be protected in practice by a proposed regulator before commenting on whether such an approach is proportionate and effective.
How will ‘harm’ be defined?
We are very concerned that the boundaries of what is considered ‘harm’ or ‘harmful’ could be drawn so widely as to stifle legitimate speech, to the detriment of women and girls.
Fair Play For Women has no intention of purposefully causing offence to anyone. But we must continue to express the material realities that male violence exists, that biological sex exists, and that this is recognised in UK law. We have faced multiple attempts by individuals and groups to shut down our presence on online platforms on the basis that our assertions cause harm (offence) to people who identify as transgender.
It is absolutely fundamental that the regulatory framework does not draw the boundaries of legal harms so widely that it begins to incorporate legitimate speech in its understanding of ‘harm’. In a misguided aim to protect users from feeling offended by statements of material reality lies the risk of censoring the legitimate speech of women and girls, in effect swapping one alleged harm for another.