Under new online safety legislation, UK authorities will have the power to order tech companies to redesign their platforms, and to impose fines if they fail to police child sexual abuse material.
The regulations will target end-to-end encrypted platforms, where messages can be viewed only by the sender and recipient. Such platforms are under growing political pressure to grant governments and law enforcement access to content, including messages, photos, and videos.
On Wednesday, the Home Office announced an amendment to the online safety bill that would allow communications regulator Ofcom to fine tech companies up to £18 million or 10% of their annual revenue, whichever is higher, if they fail to adhere to yet-to-be-determined child protection standards.
According to the proposals, the regulator could require tech companies to install new software in their encrypted platforms, or to develop their own technologies, to identify abusive content.
The move comes as tech companies attempt to balance protecting vulnerable users with preserving the privacy of users' data, while also working with law enforcement and lawmakers who cannot access content on encrypted platforms.
Apple previously attempted to roll out scanning software to detect images of child sexual abuse, but backtracked last year after a fierce backlash from privacy advocates.
Meta, the company that owns Facebook, Instagram, and WhatsApp, has also pledged to implement end-to-end encryption on Facebook Messenger, despite opposition from the Home Office and charities on child safety grounds.
In a public submission to the bill committee last month, Meta questioned how Ofcom could mandate the scanning of messages for objectionable content. According to Richard Earley, public policy manager at Meta UK, “it is unclear how this would be possible in an encrypted messaging service, and would have significant privacy, security, and safety implications for users.”
Under the legislation, Ofcom will determine whether platforms are taking sufficient steps to prevent, identify, and remove explicit content, and whether it is necessary and reasonable to require platforms to change their products.
Home secretary Priti Patel stated that “privacy and security are not mutually exclusive — we need both, and we can have both, and that is what this amendment delivers.”
The government has awarded more than £550,000 to five UK projects developing technologies that could one day be mandated for use in platforms’ services to prevent the spread of child abuse material.
These include third-party applications that can be integrated into existing encrypted platforms, and age-verification technology that could be applied before users are granted access to encrypted services.
According to data released on Wednesday by the children’s charity NSPCC, online grooming offenses have increased by more than 80% in the UK over the last four years, with an average of 120 offenses per week.
In cases where the method of communication was known, 33% involved platforms owned by Meta and 8% involved Snapchat.