Graeme Biggar, the director general of the UK’s National Crime Agency (NCA), has urged Instagram’s parent company, Meta, to reconsider its ongoing rollout of end-to-end encryption (E2EE), in the latest round of the never-ending (and always perplexing) crypto wars.

European police chiefs’ statement

The request comes after a joint statement made on Sunday by police chiefs from throughout Europe, including the UK, expressing “concern” over the way the tech industry is implementing E2EE and requesting that platforms create security systems that allow them to still detect illicit activity and report messages to law enforcement.

The NCA chief suggested that Meta’s current plan to strengthen the security of Instagram users’ private chats with so-called “zero access” encryption, in which only the sender and recipient of a message can access its content, poses a risk to child safety. In December, the social media giant also began a long-awaited rollout of default E2EE on Facebook Messenger.

“Give us the information”
“Our responsibility as law enforcement… is to protect the public from organized crime, from serious crime, and we need information to be able to do that,” Biggar said in an interview with Nick Robinson on BBC Radio 4’s Today program.

“A lot is being said by tech companies about end-to-end encryption. We don’t have an issue with encryption; in fact, strong encryption is a good thing, since it’s my duty to attempt to safeguard the public from cybercrime. However, what we need is for companies to continue being able to provide us with the data required to maintain public safety.”

According to Biggar, because platforms can currently scan unencrypted messages, they are sending tens of millions of child safety-related reports a year to police forces around the world. “On the back of that information, we typically safeguard 1,200 children a month and arrest 800 people,” he claimed. The implication is that those reports will dry up if Meta keeps extending its use of E2EE to Instagram.

Robinson pointed out that Meta-owned WhatsApp has used this gold-standard encryption by default for years (E2EE was fully rolled out across the messaging network by April 2016), and questioned whether this wasn’t a case of the horse having already bolted, with the crime agency now trying to shut the stable door. He received no clear response, only more perplexing equivocation.

“It’s a trend,” Biggar declared. “Our goal is not to disrupt encryption. As I’ve said, we fully endorse privacy and encryption, and end-to-end encryption can function perfectly well. What we want is for the industry to figure out how to continue giving us the information we require.”

Biggar’s intervention is consistent with the aforementioned joint declaration from European police chiefs, which urges platforms to implement as-yet-unspecified “technical solutions” that can offer users strong security and privacy while preserving platforms’ capacity to identify illicit activity and provide law enforcement authorities with decrypted content.

“Businesses will not be able to respond adequately to a lawful authority,” the statement reads. “We will therefore be unable to keep the public safe. As a result, we urge the technology sector to build security into its designs, to ensure it can identify and report harmful and unlawful behavior, such as child sexual exploitation, and to take exceptional, lawful action when necessary.”

The European Council had already approved a resolution in December 2020 calling for a comparable “lawful access” rule for encrypted messaging.

 

Client-side scanning?

The proclamation is vague about the technology that the platforms are supposed to implement in order to search for objectionable content and forward the decrypted content to law enforcement. They are probably advocating for client-side scanning in some capacity, like the one Apple planned to introduce in 2021 to find child sexual abuse material (CSAM) on customers’ devices.

Meanwhile, the contentious message-scanning CSAM legislative scheme is currently being considered by EU parliamentarians. Legal and privacy experts, including the bloc’s own data protection supervisor, have said that the draft bill could seriously jeopardize cybersecurity as well as democratic liberties. Additionally, many contend that it is an ineffective method of protecting kids and that it will probably do more harm than good by producing a large number of false positives.

Parliamentarians pushed back on the Commission’s plan last October, backing a substantially amended approach that seeks to limit the use of CSAM “detection orders.” The European Council, however, has yet to settle on a position. Numerous civil society organizations and privacy experts warned this month that E2EE remains at risk from the planned “mass surveillance” law. The proposed bill is intended to replace the interim derogation from the EU’s ePrivacy rules, which parliamentarians have agreed to extend and which allows platforms to voluntarily scan for CSAM.

The coordinated statement on Sunday appears to be timed to increase pressure on EU legislators to uphold the CSAM-scanning proposal.

The EU’s proposal does not mandate any particular technologies that platforms must use to scan message content, but critics caution that it is likely to push adoption of client-side scanning, despite that emerging technology being immature, unproven, and simply not ready for widespread use.

Robinson asked Biggar whether the police want Meta to “backdoor” encryption, though he did not ask whether they are advocating client-side scanning. Once more, Biggar’s response was evasive: “We wouldn’t call it a backdoor; exactly how it happens is for the industry to determine. They are the experts in this.”

Robinson pressed the NCA chief for an explanation, pointing out that data is either securely encrypted (and therefore private) or it isn’t. Biggar, however, drifted further from the point, asserting that information security and visibility are not an either/or and that “every platform is on a spectrum.” “Almost nothing is at the absolutely, completely secure end,” he suggested. “Users don’t want that, for usability reasons, such as being able to recover their data if they lose their phone.”

“What we are saying is that being absolute on either side doesn’t work. Of course, we don’t want everything to be fully open. But nor do we want everything to be completely locked down. So we want the companies to find a way to make sure that they can provide security and encryption for the public while still giving us the information we need to keep the public safe.”

 

Absence of safety technology

The UK Home Office has for several years promoted the idea of “safety tech” that would enable E2EE content to be scanned for CSAM without compromising user privacy. However, a “Safety Tech” challenge it ran in 2021, intended to produce proofs of concept for such technology, yielded results so poor that Awais Rashid, a professor of cybersecurity at the University of Bristol who was appointed to evaluate the entries, warned last year that none of the technology developed for the challenge was fit for purpose. “Our analysis demonstrates that the proposed solutions will jeopardize privacy generally and lack any internal safeguards to prevent the repurposing of these technologies for the purpose of monitoring any private communications,” he wrote.

If, as Biggar appears to be arguing, technology exists that lets law enforcement obtain E2EE data without endangering users’ privacy, why can’t police forces specify the solutions they want platforms to adopt? (It is worth noting that reports last year claimed government ministers had privately acknowledged that no such privacy-preserving E2EE-scanning technology currently exists.)

Insightfullnk contacted Meta for a response to both Biggar’s comments and the joint declaration. A spokesperson for the company defended its expansion of E2EE in an emailed statement: “The vast majority of British people already rely on apps that use encryption to protect them from scammers, hackers, and other criminals. We don’t think people want us reading their private conversations, so we have spent the last five years creating strong safety measures to prevent, detect, and combat abuse while upholding online security. These steps, which include restricting adults over 19 from messaging teenagers who don’t follow them and using technology to spot and stop harmful behavior, are outlined in a newly released report. As end-to-end encryption becomes more widespread, we anticipate continuing to provide more reports to law enforcement than our peers as a result of our industry-leading efforts to keep people safe.”

For the better part of a decade, Meta has endured numerous such requests from UK Home Secretaries during the tenure of the Conservative government. The Home Secretary at the time, Suella Braverman, instructed Meta in September of last year that it needed to implement “safety measures” in addition to E2EE. She also threatened to use the government’s authority under the Online Safety Bill (now Act) to penalize the company for not complying.

Asked by Robinson whether the government could take action if Meta did not change course on E2EE, Biggar said that “government can act and government should act,” noting that it has strong powers to do so under both the Online Safety Act and the Investigatory Powers Act (IPA), which permits surveillance.

 


 

Penalties for violations of the Online Safety Act can be severe: Ofcom has the authority to impose fines of up to 10% of global annual turnover.

The UK government is also working to expand the IPA, giving it more authority over messaging platforms, including a requirement that messaging firms obtain Home Office clearance before shipping new security features.

The UK tech industry is concerned that the plan to broaden the IPA’s scope will jeopardize the security and privacy of the country’s residents. Apple warned last summer that it might have to pull FaceTime and iMessage from the UK if the government did not rethink its planned expansion of surveillance powers.

There is some irony in this latest lobbying effort. Even with the rise of E2EE, law enforcement and security agencies have arguably never had access to more signals intelligence than they do now. Suggesting that stronger web security will instantly bring child protection efforts to a halt is, therefore, a crudely binary framing.

That said, anyone familiar with the decades-long crypto wars will be unsurprised to see such appeals deployed in an attempt to weaken internet security. That is how this propaganda war has always been fought.
