Reader, I installed it.
I’ve spent a bit of time today listening to people who have concerns about the app. All of these boil down to “we don’t trust the government.” Trust has been so eroded by the actions of Cummings et al. that people are justifiably distrustful of an NHS/government app.
That’s fine, I don’t trust the government either, but let me try to explain why in this case it doesn’t matter.
It uses the Apple/Google Exposure Notification API, which means the app must abide by certain rules before it is allowed onto the app stores, and those rules include not being able to track your location. If it breaks them, it doesn’t get listed.
One of the key points to stress is that all the hard work is done on your phone, and not uploaded to NHS servers. The QR codes you scan to ‘check in’ to a venue are stored only on your phone, and mean you don’t have to hand your personal details over to the venue.
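That on-device model can be sketched roughly like this. This is a simplified illustration of the decentralised approach, not the real implementation: in the actual protocol, rolling identifiers are derived cryptographically from daily keys and broadcast over Bluetooth, and the function and variable names below are invented for the sake of the example.

```python
def rolling_ids_for(diagnosis_key: str) -> set[str]:
    # Stand-in for the real derivation: the protocol derives many
    # short-lived rolling identifiers from one daily key. Here we fake
    # it with string suffixes, one per ~10-minute interval in a day.
    return {f"{diagnosis_key}-{i}" for i in range(144)}

def check_exposure(observed_ids: set[str], diagnosis_keys: list[str]) -> bool:
    # The phone records the rolling IDs it has seen nearby, downloads
    # the published keys of people who tested positive, and does the
    # matching locally. What the phone observed never leaves the device.
    for key in diagnosis_keys:
        if observed_ids & rolling_ids_for(key):
            return True
    return False
```

The design choice this illustrates: the server only ever sees the keys of people who chose to report a positive test, never who was near whom.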
The source code is available for all to see (and you can be sure lots of people are looking at it):
There is a method to disclose vulnerabilities:
Concerns have been raised about the requirement for a relatively new smartphone. This is true: it requires iOS 13.5 or newer, or Android 6 or newer. An iPhone 6 will not support it, even though it was being sold up until September 2018, but the iPhone 6s (launched one year later, and discontinued at the same time as the 6) will. My Samsung Galaxy S7, released in 2016 and running Android 8, does support it.
The reason for this is not the NHS; it’s the operating system versions that support the Exposure Notification API. And the privacy strength of the app comes from using that API instead of the original plan for an app developed entirely in-house.
Is it perfect? I doubt it. For a start, you need to have been in proximity to someone who later tests positive for 15 minutes before it counts as a ‘high risk encounter.’ Is it better than writing your contact details in a book? I think so.
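The 15-minute rule above could be sketched like this. Treating the threshold as cumulative across separate windows of proximity is an assumption on my part, and the real app’s risk scoring also weighs things like Bluetooth signal strength, which this ignores.

```python
HIGH_RISK_MINUTES = 15  # threshold described above

def is_high_risk(exposure_windows_minutes: list[float]) -> bool:
    # Assumption: separate windows of proximity to the same person are
    # summed, so several short encounters can cross the threshold.
    return sum(exposure_windows_minutes) >= HIGH_RISK_MINUTES
```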