This controversial tech collects and stores facial recognition data in a central database.
Digital rights groups are pushing for more robust digital privacy regulations as Australia moves into the next phase of the pandemic, warning that the rules around personal data collection are not up to scratch.
The blowback is directed at South Australia’s home quarantine app, which works by contacting people in quarantine at random and requesting proof of their identity and location within 15 minutes.
The app uses facial recognition and smartphone geo-location as verification tools.
Failing a check-in, which happens when the person misses their 15-minute window, is located outside their home or is not recognised by the app’s AI, prompts a visit from SA police.
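Based on that description, a check-in reduces to three tests: a response deadline, a geofence and a face match. The sketch below is a hypothetical illustration of that flow; apart from the 15-minute window, every name, figure and threshold (the 50-metre geofence, the similarity cut-off) is an assumption, not taken from the actual app.

```python
import math
import time
from dataclasses import dataclass

CHECK_IN_WINDOW_SECS = 15 * 60      # the 15-minute response window from the article
GEOFENCE_RADIUS_M = 50              # assumed tolerance around the home address
FACE_MATCH_THRESHOLD = 0.8          # assumed similarity cut-off

@dataclass
class CheckInResponse:
    lat: float
    lon: float
    face_similarity: float          # stand-in for a real face-matching score
    responded_at: float

def within_geofence(lat, lon, home_lat, home_lon) -> bool:
    """Crude equirectangular distance check against the quarantine address."""
    dx = math.radians(lon - home_lon) * math.cos(math.radians(home_lat))
    dy = math.radians(lat - home_lat)
    return 6_371_000 * math.hypot(dx, dy) <= GEOFENCE_RADIUS_M

def check_in_passes(resp, issued_at, home_lat, home_lon) -> bool:
    """A check-in fails on any of the three conditions the article lists."""
    if resp is None or resp.responded_at - issued_at > CHECK_IN_WINDOW_SECS:
        return False                # missed the 15-minute window
    if not within_geofence(resp.lat, resp.lon, home_lat, home_lon):
        return False                # located outside the home
    if resp.face_similarity < FACE_MATCH_THRESHOLD:
        return False                # not recognised by the face matcher
    return True

# Example: a prompt reply from the registered address with a strong face match.
now = time.time()
resp = CheckInResponse(lat=-34.9285, lon=138.6007,
                       face_similarity=0.93, responded_at=now + 120)
print(check_in_passes(resp, issued_at=now,
                      home_lat=-34.9285, home_lon=138.6007))   # True
```

Any False result from a flow like this is what escalates to a police visit, which is why the failure modes of each test matter so much to the groups quoted below.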
NSW, Western Australia, the Northern Territory and Victoria are in different stages of rolling out similar apps for home quarantine. Queensland is a notable exception, in that its app uses only geolocation data.
In response to this increased uptake, the Human Rights Law Centre and Digital Rights Watch co-wrote an open letter to the country’s various health ministers outlining their concerns with the technology.
Digital Rights Watch project lead Samantha Floreani said that while the organisation supported the use of technology in home quarantine, sensitive biometric data was being left vulnerable.
“What we’re concerned about is that there aren’t appropriate protections in place to prevent the data that’s collected via these apps from later being misused or used for other purposes,” she told The Medical Republic.
Facial recognition data, she said, was particularly problematic in that if there were to be a data breach, then, unlike with a leaked password or stolen licence, there would be no way for an individual to re-secure their identity.
“You can’t change your biometrics, at least not easily,” Ms Floreani said.
Complicating the issue further, the data, at least in SA, would be encrypted but stored in a central repository, to be destroyed at the conclusion of the pandemic.
“There are other ways that we can approach this,” Ms Floreani said. “We could, for example, take a decentralised approach where you don’t have all of that biometric information in one central location, a practice which can raise all kinds of privacy and security risks.
“What we would prefer to see is a system where you can meet the needs of checking in via the app, but have that information never leave your personal device.”
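To make that proposal concrete, here is one hypothetical way an on-device scheme could work: the face match runs locally against a template that never leaves the phone, and only a small signed pass/fail attestation is uploaded. The key provisioning, matcher and field names are all illustrative assumptions, not a description of any existing app.

```python
import hashlib
import hmac
import json
import time

DEVICE_KEY = b"per-device-secret-provisioned-at-enrolment"   # assumed key setup

def match_on_device(live_scan: bytes, enrolled_template: bytes) -> bool:
    """Stand-in for a real on-device face matcher; in practice this would
    compare embeddings against a threshold. The template stays on the phone."""
    return hmac.compare_digest(hashlib.sha256(live_scan).digest(),
                               hashlib.sha256(enrolled_template).digest())

def build_attestation(check_in_id: str, passed: bool, at_home: bool) -> dict:
    """Only this small signed verdict is uploaded: no image, no template."""
    payload = json.dumps({
        "check_in_id": check_in_id,
        "passed": passed and at_home,
        "timestamp": int(time.time()),
    }, sort_keys=True)
    signature = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

# Example: the match runs locally, and only the pass/fail result is sent.
live_scan = b"camera-frame-bytes"        # placeholder for a real capture
template = b"camera-frame-bytes"         # enrolled locally during setup
verdict = build_attestation("abc123",
                            passed=match_on_device(live_scan, template),
                            at_home=True)
print(verdict["payload"], verdict["signature"][:16])
```

Under a design like this, a server could verify the signature without ever receiving biometric data, so a breach of the central repository would expose check-in verdicts rather than faces.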
There are also no standalone privacy protections in place, meaning the stored data could potentially be accessed by law enforcement down the track.
Perhaps ironically, the government’s failed COVIDSafe app had strong baseline privacy protections.
“COVIDSafe hasn’t been very useful on a practical level, but because of concerns that were raised at the time, it has robust privacy protections built into the authorising legislation,” Kieran Pender, a senior Human Rights Law Centre lawyer, told TMR.
“The data that will be captured by home quarantine apps are just as sensitive, if not far more sensitive, so it’s particularly important that those same safeguards are there.”
Another concern stemmed from the fact that facial recognition AI often fails to recognise the faces of people with darker skin tones, sparking fears of discrimination.
“It’s not too far-fetched to imagine the app failing to recognise someone and police being sent to check on them, just because of these proven and well-demonstrated shortcomings with facial recognition technology,” Mr Pender said.
“The fact that the government is considering imposing any sort of requirements using this technology, knowing its shortcomings and without adequate safeguards to address those, is very alarming.”
Like Ms Floreani, Mr Pender stressed that his organisation was not opposed to quarantine requirements or a shift to home quarantine, but was concerned about the specific methods being used.