During this lockdown, as I hover between my real office and the office I actually use (otherwise known as the kitchen), I have been struck by how conflicted society is about the role of technology in handling this pandemic. This has especially been the case with contact tracing apps and platforms, which have captured the imagination of much of the global media.
Column inches have been devoted to mass surveillance, Big Brother and privacy rights as apps have recently launched in Australia, India, Singapore and now the UK. Privacy ethics are rightly high on the agenda, but the focus has, in my opinion, become myopic and ignores several more immediate and serious issues.
First things first. Are these platforms, with their dependency on Bluetooth-based proximity detection, actually going to be effective and useful? Some of the most respected individuals in security, such as Ross Anderson and Bruce Schneier, have written blog posts raising serious doubts, and I feel their analysis hasn't been discussed enough.
It’s my view that many of the weaknesses raised (e.g., false positive exposure reporting rates; systemic abuse) can be mitigated by combining automated exposure reporting with manual follow-up, but really these technical shortcomings don’t seem to have received much attention. The reason for this, I believe, is that the concerns around privacy have prioritised technical discussions focusing on centralised versus decentralised models for contact tracing instead.
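To see why false positive (and false negative) exposure reports arise at all, consider how apps infer distance from Bluetooth signal strength. The sketch below uses a generic log-distance path-loss model; it is not the model any particular app uses, and the parameter values are illustrative assumptions:

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough distance estimate (metres) from a Bluetooth RSSI reading,
    using the generic log-distance path-loss model. tx_power_dbm is the
    expected RSSI at 1 m; both parameters vary by handset and environment,
    which is precisely why proximity estimates are noisy."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same true separation can produce very different readings depending
# on whether a wall, a body or a pocket attenuates the signal.
reading_clear = -69       # phone in hand, line of sight
reading_obstructed = -80  # same separation, phone in a pocket

print(estimate_distance(reading_clear))       # ~3.2 m apparent
print(estimate_distance(reading_obstructed))  # ~11.2 m apparent
```

An 11 dB swing in received signal, easily caused by clothing or a partition wall, moves the apparent distance from "close contact" to "far away" (or vice versa), which is the root of the false reporting rates mentioned above.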
Simplistically: should the proximity matching be done on a central server, as with NHSX's app, or be restricted to the smartphone itself, as in Germany (also referred to as the Apple/Google model)? This is a complex debate in its own right, but it has drawn attention away from the simpler question of whether the underlying technology is even going to be useful.
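The essential difference between the two architectures can be shown with a toy sketch. This is a deliberately simplified illustration, not a faithful rendering of the NHSX or Apple/Google protocols; "tokens" stand in for the rotating ephemeral IDs phones broadcast:

```python
def centralised_match(server_db, uploaded_tokens, infected_user):
    """Centralised model: an infected user's phone uploads every token it
    observed; the server stores this contact graph and decides who to
    notify. The server therefore learns who met whom."""
    server_db[infected_user] = list(uploaded_tokens)
    return set(server_db[infected_user])

def decentralised_match(published_infected_tokens, locally_seen_tokens):
    """Decentralised model: the server only publishes tokens belonging to
    confirmed cases; each phone intersects that list with its own local
    log. The contact graph never leaves the device."""
    return set(locally_seen_tokens) & set(published_infected_tokens)

# A phone that logged tokens "a" and "c" checks the daily published list:
exposures = decentralised_match({"a", "b"}, ["a", "c"])
print(exposures)  # {'a'} -> exposure detected entirely on-device
```

The trade-off in one line: the centralised variant gives health authorities the contact graph for epidemiology at the cost of privacy, while the decentralised variant keeps the graph on the handset at the cost of central visibility.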
How can these apps be useful if very few people use them?
The simple answer is that they won't be. The privacy debate has effectively forced policy-makers to commit to making these apps opt-in for end users, both for uptake and for symptomatic/diagnostic reporting. Poor adoption could happen for a number of reasons, such as mistrust of how data will be used, but also simply because people don't have smartphones or are not comfortable downloading apps.
To put this into context, NHS advisors have stated that around 55% of the UK population needs to adopt their app for it to have a meaningful impact. Across a smaller population, say the pilot taking place on the Isle of Wight, with a huge PR campaign and effectively door-to-door campaigning and support, it may be feasible to achieve this rate across 140,000 people. Across a population of nearly 68 million, it is a far taller order.
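The scale of the gap that the quoted 55% figure implies can be made concrete with simple arithmetic, using the population figures cited above:

```python
# Back-of-envelope arithmetic on the adoption threshold quoted above.
target_rate = 0.55

isle_of_wight = 140_000    # pilot population
uk = 68_000_000            # approximate UK population

print(f"Isle of Wight: {target_rate * isle_of_wight:,.0f} users needed")  # 77,000
print(f"UK-wide:       {target_rate * uk:,.0f} users needed")             # 37,400,000
```

Recruiting 77,000 users in a concentrated pilot with door-to-door support is one thing; recruiting over 37 million nationwide, before even accounting for people without compatible smartphones, is quite another.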
In this scenario, there is a significant risk that secondary outbreaks will grow much larger before they are detected, leading to morbidity, mortality and further lockdowns, especially if we depend only on labour-intensive manual tracing.
Now, let's get to the cybersecurity risks that, as far as I can see, haven't been discussed anywhere. There has been a tremendous rise in cyberattacks since the pandemic began, especially in healthcare, with ransomware attacks amongst others targeting hospitals, government agencies and research facilities. What does this have to do with contact tracing apps? Well, the real value of these apps comes from their interoperability and data sharing capabilities with central and local health IT systems.
Only by receiving this information can statistical analysis, outbreak mapping, capacity management and early clinical intervention for higher risk groups be conducted. This means these platforms are an attractive target for attackers to compromise in order to spread malware throughout a health system, causing damage that actually disrupts clinical care at scale.
This is an immediate patient safety issue. Let's not forget that there has also been an unprecedented rise in the adoption of telehealth solutions across the world. It's no surprise that these systems will likely need to interface with contact tracing platforms if any clinical intervention is to be planned for vulnerable groups identified as having been exposed to the virus.
These companies, many of which are relatively small-scale, are coping with tremendous demand and expedited procurement. Their solutions are similarly attractive attack targets, especially since it is unclear how much security oversight and adherence to best practice they have.
These points highlight the need to have a much more nuanced debate about how contact tracing platforms are developed and deployed. For the most part, I am in favour of these solutions but I am struck by how little multi-disciplinary input there seems to be in the discussion about them.
We need more teams that include a combination of clinicians, epidemiologists, technical security experts and privacy advocates working together on this. More media coverage of these diverse viewpoints is essential for educating and engaging the public, which will in turn enhance adoption. I hope government agencies hear my concerns and set up independent review boards composed to meet these needs.