On Friday 21 April, the European Parliament’s Civil Liberties, Justice and Home Affairs (LIBE) Committee published the draft report from lead Rapporteur Javier Zarzalejos on the proposed regulation to combat online child sexual abuse material (CSAM).
As noted in our position paper, CISPE fully supports the intention of the proposal and our members are committed to doing what they can to address this disturbing issue. However, the final legislation must be workable and enforceable. Therefore, it is crucial that the final text of the legislation takes into account the technical and practical limitations faced by cloud infrastructure service providers when requested to remove – and especially detect – such material hosted by their customers.
CISPE warmly welcomes the rapporteur’s clarification that removal orders should be targeted at cloud infrastructure providers only as a last resort. As the rapporteur’s amendment rightly points out, it is rarely technically possible for a cloud infrastructure provider to take down or disable access to one specific piece of content, because the infrastructure provider does not have access to customers’ granular data. To comply with a removal order, cloud infrastructure providers must therefore sometimes take down a service in its entirety. Mr. Zarzalejos’ amendment accurately reflects the importance of issuing removal orders to the providers closest to the content.
We also welcome the rapporteur’s additions protecting the sanctity of encryption, including end-to-end encryption. Encryption protects customers’ sensitive data, including that of governments and hospitals. Requiring providers to engineer vulnerabilities into the products and services used by such customers would undermine the security and privacy of customer data and of the wider information technology infrastructure.
However, CISPE believes additional clarity is required regarding the role of cloud infrastructures in detection orders. To avoid undue interference with fundamental rights, we recommend that detection orders be issued only to controllers. Both cloud infrastructure and social media providers qualify as hosting services under the regulation, but only the social media service, or the ‘controller’, controls the user’s content data, has full visibility of that content data, and is therefore able to take specific action if that content data qualifies as online child sexual abuse. Cloud infrastructure providers cannot access content data in a way that would allow them to apply a tailored detection order without going beyond what is strictly necessary to address a child sexual abuse risk. It would therefore be disproportionate to issue detection orders to cloud infrastructure providers. To ensure detection orders are effective, expedient and privacy-preserving, they should be issued only to controllers.
You can find a more detailed explanation of CISPE’s position in the video below. We look forward to further discussions with the European Parliament on these important topics.