The release last week of the report into the first 12 months of the federal government’s beleaguered ‘COVIDSafe’ app got me thinking about the importance of Privacy by Design – and in particular, how the ‘design’ part of the equation is not just about the technology.
With the release of the evaluation report – months late, and only after a concerted FOI push had forced out a heavily redacted version – we now know that the COVIDSafe app has been a terribly expensive flop.
Between 26 April 2020 and 15 May 2021, only 779 users who tested positive to Covid-19 – out of around 23,000 positive cases in that period – consented to having data from the app uploaded to the national COVIDSafe data store; that’s a usage rate of around 3%. From those 779 Covid cases, the app identified 81 close contacts, of whom only 17 were contacts not otherwise identified by manual contact tracing.
I don’t even want to calculate the total cost of the COVIDSafe app divided by 17 because I fear the figure would make me cry.
The COVIDSafe app – as Jacqueline Maley described it, a “cute footnote in the story of pandemic bunglings” – has been “utterly outclassed” by QR Code check-in apps implemented by State governments.
How? Privacy, utility and trust.
Compare the public acceptance and uptake of the COVIDSafe app, which was relatively low and which generated a fair amount of public angst and discussion about the pros and cons (even before we knew it didn’t work properly on iPhones), versus the NSW Government’s ‘Covid Safe Check-in’ app, which enjoys incredibly high rates of acceptance and use, by both venues and patrons alike, and with almost no push-back from the public at all.
Two covid apps, both by governments, led by the same political party, covering the same population, for the same broad contact-tracing purpose: one a raging success and the other ultimately an eye-wateringly expensive failure. Why? It comes down to context.
This is a neat illustration of an argument I have made before: public trust, and therefore rates of use or compliance, is not as simple as asking: “Do you trust this organisation (in this case, the government)?”
It’s about asking: “Do you trust this particular way your data is going to be used for this particular purpose, can you see that it will deliver benefits (whether those benefits are personally for you or for others), and are you comfortable that those benefits outweigh the risks for you?”
When you realise that this more complex set of questions is the thinking behind consumer sentiment, it demonstrates how important it is to assess each different data use proposal on a case-by-case basis, because the nature of the proposal, and the context it sits in, will make each value proposition unique. That means the balancing act between benefits and risks from a privacy point of view needs to be done fresh for every different project.
It also shows the importance of Privacy by Design thinking – and how this is not just about the design of the tech, but the design of the entire ecosystem in which the tech is supposed to work, including legal protections, transparency and messaging, which together add up to how well users understand how an app works. As studies have since shown, that understanding makes a difference to users’ level of trust, because they can make more informed decisions for themselves.
Both apps have built-in privacy features, such as allowing the use of pseudonyms, automatically deleting data after a set period, and preventing the data from being accessed unless triggered by a positive covid case.
However, the simplicity of the NSW app’s design, and the fact that it puts the user in complete control of when the app is used – instead of the COVIDSafe ‘always on’ design – put it way in front. (The ‘always on’ design also led to other problems with COVIDSafe, like draining battery life and interference with critical diabetes monitoring systems.) NSW app users can at any time revert to pen-and-paper when checking in to a venue.
The NSW app is also superior in its embrace of data minimisation as a design principle, only collecting data about when the user checks in to a venue. By contrast, COVIDSafe’s ‘always on’ design meant vast reams of data were collected on every ‘handshake’ between two devices, with business rules then written to cull out those lasting less than 15 minutes – an arbitrary time period now known to be meaningless in terms of the likelihood of transmission.
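To make the data minimisation contrast concrete, here is a minimal sketch – hypothetical data structures and function names only, not the real code or data model of either app – of the two collection approaches: a single record created when the user deliberately checks in, versus a continuous stream of handshake records culled after the fact by a duration threshold.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration only; not the actual code or data model of either app.

@dataclass
class CheckIn:
    # Data-minimised model: one record, created only when the user deliberately
    # checks in to a venue.
    venue_id: str
    checked_in_at: datetime

@dataclass
class Handshake:
    # 'Always on' model: one record per Bluetooth encounter between two devices,
    # collected whether or not it ever becomes relevant to contact tracing.
    other_device_id: str
    started_at: datetime
    ended_at: datetime

def cull_short_encounters(handshakes, threshold=timedelta(minutes=15)):
    """Business rule applied *after* collection: discard encounters shorter than
    the (arbitrary) 15-minute threshold. The data has already been gathered."""
    return [h for h in handshakes if h.ended_at - h.started_at >= threshold]
```

The point of the sketch is where the filtering happens: the check-in model never collects the extra data in the first place, while the handshake model collects everything and relies on a business rule to throw most of it away.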
The messaging around the NSW app, and how it works, was clearer too. (It helps that the user experience is intuitive and the user can see whether the app is working; that means less complex messaging is needed in the first place.) By contrast, the communications around the COVIDSafe app were truly awful: we had the PM’s sunscreen analogy, seriously misinformed claims from the Government Services Minister that the app “simply digitises a manual process”, plus the Health Minister’s bargaining and the PM’s ‘maybe I’ll make it mandatory after all’ musings, as well as influencers being paid to make false claims about the app, political spin on whether the app works on iPhones, and a 40% take-up target based on no modelling, which the government then quietly dropped.
Finally, the NSW app design has been superior in its embrace of an iterative design process: starting with trials, conducting testing, and remaining open to user feedback, leading to improvements over time.
Compare that with the almost non-existent channel for reporting bugs to the COVIDSafe design team. One security researcher – who, four hours after the app launched, found a critical design flaw which meant that Android phone model names and user-assigned device names were transmitted over Bluetooth, allowing for device re-identification and tracking – described the process of trying to report the flaw to the Government as like “yelling into an empty room”. It took over a month for that flaw to be rectified, by which time the app had been downloaded 6 million times.
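To see why that flaw mattered, here is a minimal sketch – invented data, not the researcher’s actual findings or the app’s code – of how a static broadcast field such as a user-assigned device name lets a passive observer link together sightings that the app’s rotating temporary IDs were meant to keep unlinkable.

```python
from collections import defaultdict

# Invented sightings a passive Bluetooth observer might log at different places.
# Each tuple: (location, rotating_temp_id, broadcast_device_name)
sightings = [
    ("cafe",   "temp-id-8f2a", "Pixel 4 (Alex)"),
    ("gym",    "temp-id-03bd", "Pixel 4 (Alex)"),
    ("office", "temp-id-77c1", "Pixel 4 (Alex)"),
    ("cafe",   "temp-id-5e90", "Galaxy S10"),
]

def link_by_static_name(sightings):
    """The rotating temporary IDs look unlinkable on their own, but grouping by
    the static device name ties them back to a single device's movements."""
    movements = defaultdict(list)
    for location, _temp_id, device_name in sightings:
        movements[device_name].append(location)
    return dict(movements)

print(link_by_static_name(sightings))
# {'Pixel 4 (Alex)': ['cafe', 'gym', 'office'], 'Galaxy S10': ['cafe']}
```

Rotating the temporary ID only protects users if everything else broadcast alongside it is also ephemeral; one static field undoes the whole scheme.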
This was one of a number of flaws which suggest that the app was not comprehensively tested before its launch. While there was a Privacy Impact Assessment conducted on the COVIDSafe app, its scope was limited to examining the federal Department of Health’s compliance with the federal Privacy Act. It did not review whether the app’s build was as described, whether it worked as planned, or whether other models would be preferable.
I am not saying that the NSW check-in app is perfect. In particular, while there is a Public Health Order directing that contact details collected via the app are only to be used or disclosed for the purposes of contact tracing, it lacks the bespoke legal protections of the COVIDSafe app, which was bolstered by specific amendments to the Privacy Act to prohibit use for secondary purposes such as law enforcement. As debates about other check-in apps in WA, Queensland, the ACT and Victoria have shown, public trust can be damaged by broken promises about the purpose for which personal information will be used.
Those of us who urged caution in April 2020, rather than jumping on the COVIDSafe bandwagon, were criticised as not part of ‘Team Australia’. But caution was the right response. You need to check that the tech works, look for abuse cases and unintended side-effects, strengthen the legal protections, prohibit or prevent secondary uses, be transparent, get the messaging right, and be open to user feedback if you are going to build a successful technology project.
Above all, utility matters. If the tech doesn’t work, if the benefits of the data collection are not realised, then all the talk about trading off privacy for other objectives like public health is meaningless. The privacy risks will remain, along with a great big white elephant.