LAWS ARE NEEDED TO PROTECT AGAINST FACIAL RECOGNITION

Article by Gabrielle Polczynski

The use of facial recognition technology has grown exponentially throughout Australia, and the current laws do not adequately protect people’s civil rights.

What is facial recognition?

“A facial recognition system is a technology capable of matching a human face from a digital image or a video frame against a database of faces, typically employed to authenticate users through ID verification services. It works by pinpointing and measuring facial features from a given image.”¹

What are the advantages and disadvantages?

Some of the benefits of facial recognition systems are as follows:

  • It provides quick and efficient verification security;²
  • It has improved accuracy compared to other systems;²
  • It can integrate into most security software easily.²

Some of the disadvantages are as follows:

  • The face can sometimes be hard to view clearly;¹
  • There can be inconsistency in the datasets used by researchers;¹
  • Facial recognition systems have been criticized for upholding and judging based on a binary gender assumption;¹ 
  • It affects your privacy. Civil rights organisations and privacy campaigners, such as the Human Rights Law Centre, have raised concerns about facial recognition.¹ᐟ³

The Current Law

“There is no Commonwealth legislation that regulates the use of surveillance devices. Instead, this is currently governed by state and territory legislation. For example, the relevant legislation in South Australia is the Surveillance Devices Act 2016 (SA) (SDA).”⁴ 

Surveillance Devices Act

The SDA prohibits:

  • “the knowing installation, use or maintenance of an optical surveillance device” by a person on “premises” that visually records or observes a “private activity” without the express or implied consent of all the key parties;⁴ and
  • “the knowing use, communication or publication of information or material derived from the use of an optical surveillance device.” ⁴ 

“The regulation of an optical surveillance device in most jurisdictions, including the SDA, is linked to the concept of ‘private activity’, meaning an activity carried on in circumstances that may reasonably be taken to indicate that one or all of the parties do not want the activity to be observed by others. Accordingly, the SDA might prohibit FRT in circumstances where it is used for covert optical surveillance (unless an exception applies).”⁴

Privacy Act

“The thirteen Australian Privacy Principles (APPs) in Schedule 1 to the Privacy Act 1988 (Cth) (Privacy Act) are intended to be technology neutral so as to preserve their relevance and applicability to changing technologies.”⁴ 

“Australian privacy law treats biometric information as personal information. In particular, “Biometric information” that is to be used for the purpose of “automated biometric verification”, “biometric identification”, or “biometric templates” is a type of “sensitive information” for the purposes of the Privacy Act and APPs.”⁴ 

“‘Biometric information’ is not defined by the Privacy Act or APPs. Still, it is generally regarded as being information that relates to a person’s physiological or biological characteristics that are persistent and unique to the individual (including their facial features, iris or hand geometry) and which can therefore be used to validate their identity.”⁴ 

“The terms “automated biometric verification” or “biometric identification” are not defined by the Privacy Act or the APPs either. However, the Biometrics Institute defines “biometrics” as encompassing a variety of technologies in which unique attributes of people are used for identification and authentication, while the OAIC has indicated (in effect) that a technology will be “automated” if it is based on an algorithm developed through machine learning technology.”⁴ 

“A ‘biometric template’ is a mathematical or digital representation of an individual’s biometric information. Machine learning algorithms then use the biometric template to match it with other biometric information for verification or identification purposes.”⁴ 
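As a rough illustration only (not any particular vendor’s system), a biometric template can be thought of as a numeric feature vector, and matching as a similarity comparison between vectors. The sketch below, in Python with made-up three-dimensional templates and an assumed similarity threshold, shows the difference between the “one-to-one” verification and “one-to-many” identification discussed later in this article; real systems use learned vectors with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Compare two biometric templates (numeric feature vectors).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe, enrolled, threshold=0.9):
    # "One-to-one" verification: does the probe match one enrolled template?
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, database, threshold=0.9):
    # "One-to-many" identification: search a whole database of templates
    # and return the best match above the threshold, if any.
    best_id, best_score = None, threshold
    for person_id, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Hypothetical enrolled templates and a freshly captured probe image's template.
enrolled = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}
probe = [0.88, 0.12, 0.31]
print(verify(probe, enrolled["alice"]))  # True
print(identify(probe, enrolled))         # alice
```

The threshold of 0.9 is an arbitrary assumption for the sketch; in practice that cut-off is tuned, and where it is set drives the accuracy and error-rate concerns noted in the disadvantages above.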

“Given the breadth of the definitions of “biometric information”, “automated biometric verification”, “biometric identification” and “biometric template”, the majority of biometric information captured by FRT is likely to fall within the protections of the Privacy Act and APPs, and the safeguards contained in the Privacy Act and APPs will therefore apply to any biometric information collected by any FRT deployed by an “APP entity”.”⁴

Current gaps

The current gaps in the Surveillance Devices Act are as follows:

The legislated exceptions to the prohibition on the use of optical surveillance devices are very broad and do not currently have any in-built statutory limits. Accordingly, they have the potential to result in incursions into a person’s privacy. However, the decision in Nanoseconds serves to curtail any such invasion of a person’s privacy by ensuring that the “lawful interest” exception cannot be relied on to use FRT to monitor a person in anticipation that they might do something that might impinge upon a person’s lawful interests.⁴

The current gaps in the Privacy Act are as follows:

“The Privacy Act and APPs are federal laws that only apply to organisations and agencies deploying FRT that fall within the definition of an “APP entity”. The definition of an “APP entity” does not include state and territory authorities or agencies, or organisations with an annual turnover of less than $3 million. Whilst some jurisdictions have their own specific privacy legislation that steps in to help safeguard a person’s privacy where FRT is used, there are other jurisdictions where no specific privacy legislation exists at all (including South Australia).”⁴

“In South Australia, the State public sector must comply with South Australian Information Privacy Principles (IPPs). However, the IPPs do not extend to biometric information, and so there is no other legal framework that applies to any surveillance activities carried out by agencies, authorities and organisations that fall outside the scope of the Privacy Act and APPs in SA”.⁴ 

Difficulty in Establishing Consent

“In the past year, the OAIC has issued two rulings in which it determined that the collection of biometric information by two separate companies (Clearview AI and 7-Eleven) contravened the consent requirements of the Privacy Act and APPs.”⁴

“The Privacy Act and APPs strictly require that APP entities collecting biometric information via FRT should obtain express consent. However, the nature of FRT means that it is not often practical to obtain true, express consent from individuals whose biometric information might be captured by FRT. Whilst obtaining express consent is arguably more realistic where “one-to-one” FRT is being utilised for a specific purpose in a controlled environment, it is more difficult for an APP entity to obtain the express consent of every person whose biometric information might be captured in circumstances where “one-to-many” FRT is being deployed. Accordingly, whilst it is not ideal, in order to comply with current privacy laws, an APP entity that deploys FRT will usually need to establish that a person’s consent to the collection of their biometric information by FRT can be implied.”⁴

“Even though implied consent is an option, it is still difficult to establish that implied consent has been obtained in the first instance, given the relevant legal requirements. In particular, it can be practically difficult to provide people with enough information about how FRT collects and uses their biometric information before FRT captures their image. As a result, most people captured by FRT will not have been properly informed about what they were consenting to. Further, an individual will not often have the ability to refuse to provide their consent to the use of FRT and may feel compelled to provide it due to the inconvenience of not doing so or due to their lack of bargaining power. For example, although 7-Eleven displayed a notice at the entrance to its stores to alert customers that they would be subject to FRT when they entered the store and sought to infer that any customer who then chose to enter the store has provided consent, it is arguable that the customer had no choice (particularly if there were no convenient alternatives available to them).”⁴ 

“Notwithstanding the practical difficulties of obtaining consent in the context of FRT, the OAIC’s decision regarding Clearview AI has reinforced the importance of doing so. In that matter, Clearview AI had compiled a database of more than three billion images scraped from public posts on social media platforms and other public websites. Clearview AI then allowed its paying customers to use its software to upload an image and find matching faces and associated details from social media profiles”.⁴ 

“Following a joint investigation with the UK’s Information Commissioner’s Office (ICO), the OAIC found that Clearview AI had breached the Privacy Act and APPs by, among other things, collecting Australians’ sensitive information without consent. In particular, the OAIC found that there was no evidence that express consent had been obtained and was not satisfied that consent could be implied in the circumstances on the basis that, among other things:”⁴

  • “the act of uploading an image to a social media site did not unambiguously indicate a person’s agreement to the collection of that image by an unknown third party for commercial purposes; and”⁴ 
  • “Clearview AI’s publicly available Privacy Policy was insufficient to enable individuals to understand how their biometric information was being collected, the purpose of collection and how it would be handled by the respondent. Accordingly, any consent purported to be provided through the Privacy Policy would not have been adequately informed.”⁴

“The OAIC subsequently ordered Clearview AI to cease collecting facial images and biometric templates from individuals in Australia and to destroy existing images and templates collected from Australia. Similarly, in May 2022, ICO confirmed that it had concluded its own investigation into Clearview AI and that it had found that Clearview AI breached the relevant UK data protection laws. ICO fined Clearview AI £7,552,800 and issued an enforcement notice that ordered the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet and to delete the data of UK residents from its systems.”⁴

Breadth of Exceptions

“Another gap in the protections afforded by the Privacy Act and APPs is that the exemptions to the consent requirements of APP 3, and the single purpose requirement of APP 6, are quite broad and may not sufficiently protect people against invasions of privacy. The exemptions provided for in the Privacy Act, which allow for the collection and use/disclosure of sensitive information (including biometric information) without consent have been made on the basis of balancing individual interests against those of collective security. However, there are arguments that this balancing approach has resulted in individual privacy being “traded off” against the wider community interests of preventing, detecting and prosecuting crime.”⁴

“The issues identified demonstrate the unique challenges posed by biometric technologies. It is clear that while existing privacy and surveillance laws place a number of safeguards on the use of FRT in private enterprise, there are still some gaps in the regulation of the use of FRT”.⁴ 

The issues noted above highlight the need for new laws to protect against facial recognition technology.

Do You Agree That Laws are Necessary for Protection?

References

1. “Facial Recognition Systems”

2. “What is facial recognition?”, Amazon

3. “New laws needed to address facial recognition technology”, Jess Feyder

4. “Facial recognition technology and the Law”, Peter Campbell
