Bugs In Your Pocket?

Mass Surveillance or the Answer to Detecting Crime?



When people like Ross Anderson, Ron Rivest, Bruce Schneier and Whitfield Diffie are among the authors of a research paper, you sit up and take notice [1]:

The paper's target is client-side scanning (CSS): the authors argue that it is not effective in preventing crime, and nor does it prevent surveillance. Some agencies would like CSS installed on all mobile phone devices, not just those of suspects. This puts the privacy of law-abiding citizens at risk, and that risk may outweigh the actual threat to society. For the authors, implementing CSS is far more dangerous than the previously proposed methods of breaking end-to-end encryption:

The ability of citizens to freely use digital devices, to create and store content, and to communicate with others depends strongly on our ability to feel safe in doing so. The introduction of scanning on our personal devices — devices that keep information from to-do notes to texts and photos from loved ones — tears at the heart of privacy of individual citizens. Such bulk surveillance can result in a significant chilling effect on freedom of speech and, indeed, on democracy itself.

One of the targets of the paper is the perceptual hashing method used by Apple for their CSAM scanning technique. This scans images on the iPhone to find similarities with known material, and then sends a cryptographically protected alert to Apple when a possible match is detected. CSS thus differs from the normal practice of scanning for matches on the server side (Figure 1).

Figure 1: Client and server-side matching [1]
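To make the matching idea concrete, here is a minimal sketch of a perceptual hash (a simple "average hash") with Hamming-distance matching. This is not Apple's NeuralHash or Microsoft's PhotoDNA (NeuralHash uses a neural network); it is only a toy illustration, with made-up pixel data, of how visually similar images can map to identical or nearby hashes:

```python
# Toy perceptual hash: NOT Apple's NeuralHash, just an illustration of
# the general idea of matching images by comparing compact hashes.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255).
    Returns a 64-bit string: 1 where a pixel exceeds the mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Number of differing bits between two hash strings."""
    return sum(a != b for a, b in zip(h1, h2))

# Two "images": the second is a slightly brightened copy of the first.
img = [[10 * (r + c) for c in range(8)] for r in range(8)]
similar = [[min(255, p + 5) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(similar)
# A match would be declared when the distance falls below some threshold.
print(hamming_distance(h1, h2))  # prints 0: the near-duplicate matches
```

Real systems use far more robust hashes, but the matching step is the same in spirit: compare the device-computed hash against a database of hashes of known content.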

The paper cites the case of Europol, which maintains a list of around 5,000 keywords relating to drugs and guns, spanning different slang terms and languages. The authors note that scanning text against such a list would generate many false positives for hunters, writers and gun collectors.
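The false-positive problem is easy to demonstrate. The sketch below uses an invented watchlist (these are not Europol's actual terms) to show how naive keyword matching flags entirely innocent messages:

```python
# Toy keyword scanner. The watchlist terms are invented examples to show
# how slang collides with ordinary language -- not Europol's real list.
watchlist = {"glock", "ak-47", "snow", "ice"}

def flag(message):
    """Return the watchlist terms found in a message, in sorted order."""
    words = {w.strip('.,!?').lower() for w in message.split()}
    return sorted(words & watchlist)

# A gun collector's listing and a weather report both trip the scanner:
print(flag("Selling my Glock 19 holster"))         # ['glock']
print(flag("Heavy snow and ice expected tonight")) # ['ice', 'snow']
```

With 5,000 terms across many languages scanned against billions of messages, even a tiny per-message false-positive rate produces an enormous volume of innocent hits.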

Another aspect of the paper is the "necessary and proportionate" standard in European law, under which law enforcement agencies are required to implement surveillance that matches the level of the threat and does not exceed those boundaries. Large-scale surveillance of every phone, in every part of the world, would run into a minefield of differing legal requirements for the scanning.

Threats to CSS

As with any backdoor technology, CSS would itself be open to attack from a range of threats, including:

  • Abuse by Authorized Parties. These may be insiders within trusted agencies who could leak information to others.
  • Abuse by Unauthorized Parties. As the scanning method is a backdoor into a device, the CSS agent would be a focus of attack for a range of adversaries. A compromise could lead to a large-scale leakage of personal information.
  • Local Adversaries. This relates to those close to the user, such as an ex-partner, where detecting abuse is complex and CSS would be of little use. Abusive behaviour often involves a long chain of events that differ in their approach, and this type of activity would largely fall outside the scope of CSS.

Overall, though, it is the risk to privacy that the authors see as the main threat: CSS could act as a Trojan horse, opening the door to scanning on other devices (such as those within the home) and other system components. Another major risk is that content could be revealed which exceeds the limits of a legitimate search. For example, someone could be targeted for abusive behaviour, but be found instead to be involved in financial insider trading. While this could be justified as detecting crime, it would amount to mass surveillance of citizens, with everyone treated as a suspect.

We could also risk harming the privacy of victims, where an adversary reverse-engineers the data-gathering agent to reveal sensitive information about them. The example given relates to the PhotoDNA perceptual hash, which can be reversed to a 26x26 grey-scale thumbnail [2] that may match a given person. The authors do credit the improved methods that Apple use with NeuralHash, but note that machine learning models trained to detect contraband content are not free from attack themselves [1]:

that sensitive training inputs can be extracted from machine learning models, and that — even worse — such attacks are quite difficult to prevent.

Eight principles of CSS

On a more constructive note, the authors then define eight key principles that any CSS capability should satisfy:

  • Law Enforcement Utility. “The proposal can meaningfully and predictably address a legitimate and demonstrated law enforcement problem.” [1].
  • Equity. “The proposal offers meaningful safeguards to ensure that it will not exacerbate existing disparities in law enforcement, including on the basis of race, ethnicity, class, religion, or gender.” [1].
  • Authorization. “The use of this capability on a phone is made available [only] subject to duly authorized legal processes (for example, obtaining a warrant).” [1].
  • Specificity. “The capability to access a given phone is only useful for accessing that phone (for example, there is no master secret key to use) and that there is no practical way to repurpose the capability for mass surveillance, even if some aspects of it are compromised.” [1].
  • Focus. “The capability is designed in a way that it does not appreciably decrease cybersecurity for the public at large, only for users subject to legitimate law enforcement access.” [1].
  • Limitation. “The legal standards that law enforcement must satisfy to obtain authorization to use this capability appropriately limit its scope, for example, with respect to the severity of the crime and the particularity of the search.” [1].
  • Auditability. “When a phone is accessed, the action is auditable to enable proper oversight, and is eventually made transparent to the user (even if in a delayed fashion due to the need for law enforcement secrecy).” [1].
  • Transparency, Evaluation, and Oversight. “The use of the capability will be documented and publicly reported with sufficient rigour to facilitate accountability through ongoing evaluation and oversight by policymakers and the public.” [1].

Conclusions

The use of CSS has been proposed as a way to detect some of the worst crimes, but the authors of this paper argue that it would be ineffective and could lead to the mass surveillance of citizens. If you are interested, here is an outline of the CSAM technique that Apple are proposing:

https://medium.com/asecuritysite-when-bob-met-alice/apples-csam-system-walking-a-fine-balance-91618ddf486e?sk=f091680a74d0919845653f2666dc5d18

References

[1] Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Vanessa Teague and Carmela Troncoso, “Bugs in our Pockets: The Risks of Client-Side Scanning”, October 2021.

[2] Neal Krawetz, “PhotoDNA and Limitations”, August 2021.