How safe is COVIDSafe?

A couple of days into the Australian Government’s launch of the COVIDSafe app, Portable’s Head of Design, Joe Sciglitano, sat down with our Head of Development, Chris D'Aloisio, and Head of Strategy, Sarah Kaur, to discuss data privacy, security and trust in government in the time of COVID-19.

Joe Sciglitano (Head of Design): Hi Chris, hi Sarah. So I've been seeing lots of different responses from family, friends and fellow Australians about the COVIDSafe app. I've got family members who have never asked me to download anything before now sharing a download link for the app with me, and I've got close friends telling me not to download it at all. And there's every gradation in between.

And so maybe to start this conversation, Sarah and Chris, I'm curious whether either of you has downloaded the COVIDSafe app, and why or why not?

Sarah Kaur (Head of Strategy): Yep. So I downloaded it yesterday morning with glee! I was really curious. I took the same approach with My Health Record. In my role I work closely with government and I think a lot about data governance and how trust in government works. And I thought, if I don't do it, I'll never know what the experience is like, I won't have a chance to experience how they reassure people within the app and I won't have a chance to look at their privacy statements in context. So curiosity was one thing, and then secondly, I think it's a fairly benevolent, fairly safe experiment for the government to support its citizens.

JS: I’m keen to dig into what you mean by “experiment” there.

SK: Sure. In 2018 the ANU did a study which found very, very low confidence among citizens in our government's ability to respond to a data breach. It also found that less than 30% of us think the government can be trusted to use data responsibly, or to be open and honest about how data is collected, used and shared. The government now needs to take small steps where it can to rebuild that trust, to show Australians that actually, we can trust them. They can ask something of us for the “greater good”, and we can expect something in return. And I think this is a fairly small way to demonstrate that on both sides.

JS: Chris, I’m curious if you’ve downloaded the app.

Chris D'Aloisio (Head of Development): I have not downloaded the app.

JS: Can I ask why not?

CD: I think what Sarah said around trust in government is a good place to come from. Part of the fundamental dissonance I have with it is that it's not necessarily about trusting government. It's about trusting the regulations and standards that are adhered to within the IT industry as a whole. Policies and regulations, although in place, aren't enough for me as an IT professional to say whether or not an application will respect the data and the privacy requirements of its users.

As an industry, we've got a long way to go, not just in government but in the private sector too, to answer the underlying ethical question around how programmers go about storing data, how it's managed long term, and how that bubbles up to a company’s responsibilities and government responsibilities. That is a question still to be answered.

So it's not a question of trusting government. It's a question of trusting us as a society and how we manage this data and ensuring that we've got these codes or systematic ways of making sure that those things are respected and adhered to.

JS: I think the Australian Government's been quite transparent with this, actually. I was quite impressed by how they led with transparency around what they were doing about privacy and how they were communicating it. But what I'm hearing from you is that it's not really about the policies and regulations that are documented and shared; it's the technology, the people and the data governance process itself that you don't have a lot of confidence are built or maintained in a way that is actually secure.

CD: That's correct. In the same way, in the construction industry there are policies and regulations around building standards, but you still need people who are trained in and adhering to those building codes under some formal training, regulation or specification. A lot of our IT industry, by contrast, is built on everything from self-taught developers who are perhaps one to two years into their careers, maybe landing their first job, right up to professionals who have been in the industry for 30+ years. So the spectrum of people who are working on these things is unknown. And yes, the way in which things are built is also unknown in many cases.

SK: One of the things that's been on my mind is why there's so much scrutiny on government. And this is not a naive question deserving of an auto-response like, “well, they’re the government, we should hold them to the highest standard”. It’s maybe more a curious question about us as consumers and our behaviour versus the things we say we care about. Why is there so much scrutiny on this app from the government, when we install other apps without blinking an eye? We give Google and Facebook so much of our data, and we don't have many assurances from them about what they're going to do with it. In a way, it's kind of nice, almost cute, that our government’s trying so hard to win our trust.

CD: I think there's a misalignment with the value that the application is providing. In Facebook's case, what it gives people, and the way in which it gives it to them, through a well-crafted, almost addictive interface to connect with family and friends around the world, makes the value proposition very strong. So users are less likely to worry about how Facebook is actually providing that experience behind the scenes.

I think that’s the thing that's missing from the Government's narrative. They're saying we'll be able to track cases and transmission rates, and that's the main story we've heard. But what does that give people? Does it mean we'll be able to socialise in groups of more than 10 people? Does it mean we can go back to the office? There's a narrative missing that gives people the “sales pitch” of what the app offers them, and without it they can't balance that against the risks associated with the privacy concerns.

JS: That really clear, user-centered value proposition does seem to be missing. And I think because of that, some people are leading with distrust and claiming, rightly or wrongly, that the introduction of this app signals the government's intent to ease up on social distancing laws for economic reasons, and potentially create a perception amongst users that they're more protected from COVID-19 than they actually are so that they’ll return to work and return to their old purchasing habits. It's potentially creating a perception that all I need is the app. I've got the COVIDSafe app, so that means I'm safe, which isn't really the intended purpose, even though it’s in the name of the thing.

SK: I think the Government has been careful not to promise that if we get a certain saturation of this app, then we will ease restrictions. They've been very clear in saying that the app gives us a lot more coverage to trace and contain the virus, and that it's one in a series of measures to help us get back to normal.

It's asking all of us living in this country to be comfortable in the discomfort of not having one hundred per cent assurance, because maybe that's the uncomfortable truth. Experts around the world are weighing up so many complex and dynamic pressures and so much new information that there is no way the government can say with assurance, “if we have 40% coverage, then these things will follow”.

And I think it's almost a good challenge for us to be able to say, “I acknowledge what the government is saying. I know there are no really easy answers, and I know that assisting our ability to do contact tracing may help. It's up to me to decide whether I think it's going to help enough to outweigh the risk to my privacy. I'm doing it for the collective good, and that good can't be quantified or promised.”

I actually see it as a good experiment, because I can see data policy and governance reform taking off in this country, where we'll potentially look back in five years and go, “oh, that was tiny. What the government knows about us now, and what we trust the government to do with that data, is so, so far past that.” It's a good test.

JS: So how could the government build public trust in the security of the data that they'll be collecting?

CD: Because it's a fairly simple premise for a tracking application, and what we're questioning is the way in which it's collecting and sending data, I think making the source code available for scrutiny is the first thing. Open source communities around the world find vulnerabilities in exactly this way, and that's worth leveraging.

Even without the government doing that, the Australian IT community has already jumped on the idea of decompiling the app back into source code so they can review it. There are already people out there thinking, “if the government won't open source it, I will.” They'll satisfy the need for themselves and, if a vulnerability is found, hopefully report it back to the government. That's a great way to approach that particular problem.
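[Editor's aside: to make that “fairly simple premise” concrete, here is a minimal, hypothetical Kotlin sketch of the kind of encounter record a Bluetooth proximity app in the mould of COVIDSafe might keep on the phone and later purge. COVIDSafe is reported to be modelled on Singapore's BlueTrace protocol; the field names and helper below are illustrative assumptions made for this article, not the app's actual code. Verifying details like these is exactly what the community decompilation effort is about.]

// A hypothetical encounter record for a BlueTrace-style proximity app.
// These names are illustrative assumptions, not COVIDSafe's real schema.
data class EncounterRecord(
    val tempId: String,   // rotating, server-issued identifier exchanged over Bluetooth
    val rssi: Int,        // received signal strength, a rough proxy for distance
    val timestamp: Long   // when the handshake happened, in epoch milliseconds
)

// Records older than the retention window are purged from the phone; 21 days is
// the figure mentioned later in this conversation.
const val RETENTION_MILLIS: Long = 21L * 24 * 60 * 60 * 1000

fun purgeExpired(records: List<EncounterRecord>, nowMillis: Long): List<EncounterRecord> =
    records.filter { nowMillis - it.timestamp <= RETENTION_MILLIS }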

SK: Two weeks ago there was chat about the data being stored offshore, which was a potential vulnerability someone had pointed out, and since then it’s been moved to be stored onshore. I think that's a good demonstration of that kind of thing: once the community knows a bit more about it, they can point out the vulnerabilities. I really feel like that kind of dialogue between “hackers” and the government is already proving positive.

CD: “Hackers” is kind of the more colloquial term. I'd call them “passionate security individuals.” It's very much their passion and they are skilled in seeking out security best practices.

SK: Joe, this is one for you: would it help if the government published a piece on how they've designed this application against risk? The UX of the app is such that it takes you a minute to install, and then you're encouraged to essentially never look at it again. That's been a deliberate piece of engineering. If they explained the design rationale and the engineering rationale behind it, would that help?

JS: They have, in fairness, released an independent study on the privacy impact risks of COVIDSafe, and provided responses to it. And whilst I was quite impressed with the clear communication of the privacy policy in the app itself, I think the hypothetical you raise, where the government shares more about its design decisions for the app, would demonstrate greater awareness and transparency: that they understand the risks and want us to understand them too. So not just saying “here is our response to your privacy concerns” in an FAQ two levels deep, but some acknowledgement of the privacy risks in the onboarding experience itself would, for me anyway, give me more confidence that they weren’t being careless.

I mean, I can understand why they wouldn't want to do that, because it could also put off more users. But if it's consent-based and not mandatory, then people need to be aware of the risks. At least then we're more in line with reality, and people aren't left to sit in ignorance of the risks or, on the other hand, to let their imaginations run wild.

At Portable, we wrote a report a number of years ago called Hacking the Bureaucracy, which is all about how government can start thinking differently about the way it implements services and products in the public interest. And open-sourcing seems like one of those best-practice standards within the IT community that the government could potentially be adopting. But to empathise with the government for a moment, what might some of the barriers be to them going in that direction?

CD: There are the inherent risks of being vulnerable to more critique, more influence and more feedback loops. And there's the amount of work required to maintain and manage an open source application; you've opened yourself up to far more scrutiny than you would be under otherwise.

JS: Isn't that the point?

CD: It is the point, but it's a huge investment at a high cost. It costs a lot of time and a lot of energy in managing people. So instead of dealing with, say, six or seven levels of bureaucracy to release an application like this, you're now opening it up to the world, to the minutiae of every community, where they come from, their context and their background, and you have to manage for that. And that is a huge amount of work.

JS: Sarah, I'm curious whether you can paint a more utopian picture of how this could look. And I don't mean utopian in the sense that it's not possible. But how would you answer the questions around privacy and trust from an optimist’s perspective?

SK: There are bad actors in the world and there are good actors in the world. So just as the source code could be made open, and we could benefit from the best minds in the community at large scrutinising that code, let's say a data breach does happen. Let's say 21 days' worth of data from the people who signed up gets spilled out across the web. I reckon, again, you would see people working with the data and, from a “good actor” perspective, providing analysis that is in some shape or form helpful to us.

You know, if data is collected and it just sits there, it's effectively worthless. There is something in me that is wondering, not just for this time but for the future, how much informed consent we can realistically ask people to give. In the quest to apply data science and retrieve insights, we've started to see applications with very well thought out privacy policies and terms and conditions that people may or may not read, but even if they did, those documents would be hard pressed to explain the full gamut of things that data might later turn out to be useful or beneficial for. We can't see into the future.

And it's two-sided. I come from Singapore, and I have immense faith in government and bureaucracy. That biases me to say, if government collects my data today, and in 5-10 years' time it turns out that an analysis of it could have an unexpected benefit for the collective or greater good, I would retrospectively be okay with that, even if I didn’t know about it at the time. I think that question is going to come up more and more. I agree it's a risk. But I think there's an optimistic picture about what we could learn as well.

JS: My wife was listening to ABC Radio this morning, and one of the callers characterised the debate around COVIDSafe, in maybe an oversimplified way, as an “I” thing versus a “we” thing. Those who are opposed are essentially taking a self-centred perspective, and those who are for it are taking a collectivist perspective. It sounds, from what you're saying, Sarah, like there are potential benefits that extend beyond the immediate use of the app's data to control the spread of the virus, and that if the data is useful for improving public wellbeing in the longer term, you'd be okay with it being analysed and used in other ways.

SK: Yeah, I feel like I place a lot of faith in the idea of a collective future well being. But how about you, Chris? I feel like you do too. And you come at that from the other angle almost.

CD: Yeah, I do resonate with that idea. And it is something that I'd like to think we as a society can move towards. I think part of the dissonance for me is that because there aren't measures in place, or a way to do this that secures and protects people's privacy in a way that guarantees it, it's hard to know what the impact of the bad actor, or the good actor for that matter, will look like in the future.

“Irresponsible” is probably the best word that comes to mind for me, because it is a massive power to wield. And with great power comes great responsibility, as the quote goes from a well-known comic book. We've been very good over the generations at coming up with systems and ways to make things safe. We've not yet been able to do that with data, privacy and security in a digital landscape. We'll know when we're there, but we're not there yet. And that's something that I can't categorically come to grips with at this stage, even though I agree in principle with moving in a direction for the collective good as a society.

JS: So as people working at the intersection of design, technology and the public service, broadly speaking, beyond this particular initiative, what do you think the federal government could be doing differently with regards to their response to COVID-19?

SK: Keep pointing Australians to the areas of discussion and debate that are really valid and that help us grow as a nation, to be okay facing the unknown together. I think they've done a really good job of laying foundations and saying, right now we feel like this is a safe step to take, a safe restriction to ease. The boundaries of the unknown have now rippled into these areas, and that's where the debate’s happening. That's where we've already engaged experts. Australians place far more trust in the authority of experts and scientists, a heck of a lot more than in politicians; the top trusted professions in Australia are doctors, then scientists. So with COVID, I think the government is learning that it can use this to its advantage to build trust.

CD: What could the Government be doing differently? From a technological point of view, using the community at large to help quash the concerns around trust. Policies are great, but they're not a real representation of what’s actually happening. So more transparency across those things would be great.

I also think that there's a narrative piece that is missing. What will all this help them achieve? What will it help us achieve as a society in Australia to get through this crisis? What does it mean to people going about their lives? I think they could do more to highlight those pieces so that the public understands more about their intentions.

JS: Thanks both for your time. That was great. There's so much good stuff in there, I'll easily be able to pull a blog post together. I'll send you guys whatever I come up with before it's published so you can make sure it’s okay or veto it.

CD: Thanks.

JS: Full consent.

SK: What are you gonna do with it, Joe? What are you gonna do with our data?

JS: You don’t need to know, don’t worry.

CD: Are you destroying the recording after you've used it?

JS: Just be confident. It's in everyone's best interest that I have this data. Trust me, it might come in handy someday.

CD: Excellent.

Download our report, Hacking the Bureaucracy
