Written by: Hannah Williams
The EU Commission’s new e-Privacy Regulation (‘ePR’) has been interpreted by Facebook as preventing it from running its automatic scanning of messages to detect child abuse on its platform. We must ask: is this interpretation correct, and if so, why have Facebook’s competitors not reached the same conclusion? Is it possible that other factors have influenced Facebook’s seemingly drastic judgment?
The Regulation
The Regulation develops the e-Privacy Directive of 2002, under which Member States had to transpose the rules into national law themselves. The significance of the change from a Directive to a Regulation is that the ePR will become automatically binding EU law in all Member States from the date it comes into force. The ePR is proposed to complement the GDPR already in place: the GDPR provides a more general framework of privacy rules, whereas the ePR will specifically address key areas of concern in electronic communications.
When in force, the Regulation will govern the privacy of data transferred by electronic means and will apply to all electronic communications data, including any information concerning the content of communications.
Facebook’s Response
In what seems to be a stand against the new Regulation, Facebook has stopped automatically scanning private messages for indications of child abuse, attributing the decision to the new rules. The US National Center for Missing and Exploited Children has released data showing a large fall in referrals of child sexual exploitation material from the EU in the first few weeks since Facebook ‘was forced’ to turn off its scanning.
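For context on what such scanning typically involves, the sketch below shows the basic shape of hash-matching detection: each attachment is hashed and checked against a database of hashes of known illegal images. This is purely illustrative; Facebook has not published its pipeline, and production systems use perceptual hashes such as Microsoft’s PhotoDNA, which tolerate resizing and re-encoding, rather than the exact SHA-256 matching used here. The database entry shown is hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known abuse images, of the kind
# supplied to platforms by child-protection bodies.
KNOWN_IMAGE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_attachment(image_bytes: bytes) -> bool:
    """Return True if the attachment matches a known-image hash.

    Real systems use perceptual hashing (e.g. PhotoDNA) so that cropped
    or re-encoded copies still match; exact SHA-256 matching is used
    here only to keep the sketch self-contained.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

# A matched attachment would then be referred for human review and
# reported to a body such as the US National Center for Missing and
# Exploited Children.
if flag_attachment(b"example attachment bytes"):
    print("match: refer for human review")
```

The key point for the legal argument is that this kind of check runs on the platform’s servers, which is only possible while the platform can read the message content.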
Facebook’s decision is evidently weakening the protection of children against online abuse and may create opportunities for offenders to target children through the services available on Facebook.
Competitors’ Responses
In contrast, some of Facebook’s largest competitors have not followed this interpretation and continue to use their tools to detect child abuse material. Google and Microsoft, among others, have instead concluded that they do not yet need to turn off their mechanisms, since there remains an open question as to what the new Regulation actually requires.
Andy Burrows, head of child safety online policy at the children’s charity the NSPCC, has commented: ‘it’s striking that Facebook has interpreted the failure to reach an agreement before Christmas as requiring them to stop scanning, when what that seems to be is a breaking of ranks from the rest of the industry’.
Why has Facebook reached a different conclusion from the rest of the industry?
In addition to the Regulation, a proposal by Facebook to extend end-to-end encryption across its services would also affect the company’s ability to scan the content of messages. Under end-to-end encryption, no one, including the company itself, can see the content of messages sent by its users. This type of encryption is already used in WhatsApp conversations and is argued to add an important layer of security for people who rely on such privacy. However, extending end-to-end encryption across Facebook’s platforms is estimated to reduce reports of child abuse from the platform by 70%, according to the US National Center for Missing and Exploited Children.
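To illustrate why end-to-end encryption and server-side scanning are mutually exclusive, the following is a minimal sketch using the PyNaCl library. It is purely illustrative and not Facebook’s actual implementation (WhatsApp uses the far more elaborate Signal protocol); the point is that the server relays only ciphertext, so there is no readable content left for it to scan.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"see you at 6")

# The platform relays `ciphertext` but holds neither private key,
# so it cannot decrypt -- and therefore cannot scan -- the message.

# Only Bob, holding his private key, can recover the plaintext.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"see you at 6"
```

Once messages take this form, hash-matching of the kind sketched earlier has nothing to work on, which is why the two policy debates are so tightly linked.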
Yvette Cooper, chair of the House of Commons Home Affairs Committee, expressed her dismay at the decision, asking: “Why on earth – why, seriously, why is Facebook trying to introduce something that will put more children at risk, that will make it harder to rescue vulnerable children? Why are you doing this?”
Analysing these two decisions in light of each other, it may be suggested that Facebook’s drastic interpretation of the new Regulation was influenced by its desire to extend end-to-end encryption beyond WhatsApp. Both decisions produce the same outcome: Facebook increases its privacy protections by no longer automatically scanning messages, thereby reducing the monitoring of child abuse content.
Andy Burrows reaches the same conclusion, stating that ‘it feels highly improbable that Facebook’s alternative reading [of the ePR] isn’t directly related to where they want to go with end-to-end encryption’. Nevertheless, Facebook has declared that it had no choice in the matter and contests the allegation that its decision to stop scanning was driven by factors outside the Regulation.
Conclusion
Considering these factors, it seems that Facebook’s interpretation of the Regulation may have been influenced by its desire to extend end-to-end encryption to its other platforms, which would consequently reduce the level of surveillance of child abuse content.
The considerable backlash Facebook has received over this decision may persuade the company to reconsider its interpretation and follow its competitors’ approach of continuing to scan messages until more explicit instructions are given by the EU. However, whether such clarification will actually have an impact on Facebook is unknown, given its wish to adopt stronger encryption.