Facial Recognition Technology and the Law

14 June 2022

Over the past few years, the development and use of Facial Recognition Technology (FRT) throughout Australia has grown exponentially. The use of FRT is currently regulated by various state and territory legal frameworks, but it has been suggested that it would be more beneficial to enact discrete legislation that specifically responds to the challenges posed by FRT and regulates its use.

This article sets out how Australia’s existing privacy and surveillance laws deal with FRT, and identifies some gaps to be mindful of.

What is FRT and how is it used?

FRT involves the automated extraction, digitisation and comparison of the spatial and geometric distribution of facial features. Using an algorithm, FRT compares an image of a face with an image stored in a database, in order to identify a match.1

FRT is deployed in two main ways, being:

  1.  “one-to-one” FRT, which is used to verify the identity of an individual by checking one image against a single, respective image to determine if they are the same person.2 It is often utilised in a controlled environment where the lighting is sufficient and the subject is in an optimal position to facilitate a successful comparison,3 and its most common application is unlocking a smartphone; and
  2. “one-to-many” FRT, which is used to identify an unknown individual by comparing a select image against a large database.4

This article focuses on “one-to-many” FRT, which seeks to match a single facial image with a different facial image of the same individual that has been stored in a large database. It therefore relies on a much larger dataset to conduct a comparison, whilst the facial image being compared against the dataset is often taken from “the wild” (eg CCTV surveillance) and is of lower quality.5 As a result, identifying a person using “one-to-many” FRT is more difficult and prone to false matches and misidentification.6
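To make the distinction concrete, the short sketch below simulates both matching modes using simplified numeric "face embeddings" as stand-ins for the facial features a real system would extract. The vectors, noise levels and threshold are illustrative assumptions only, and do not describe any particular FRT product or algorithm.

```python
# A minimal sketch of one-to-one vs one-to-many matching, assuming faces
# are reduced to fixed-length "embedding" vectors and compared by cosine
# similarity. All numbers are illustrative stand-ins, not real biometric
# data or a real vendor's algorithm.
import numpy as np

DIM = 128
noise_rng = np.random.default_rng()

def capture(person_id: int, noise: float = 0.1) -> np.ndarray:
    """Simulate photographing a person: a fixed per-person vector plus
    capture noise (lighting, angle, image quality)."""
    face = np.random.default_rng(person_id).standard_normal(DIM)
    n = noise_rng.standard_normal(DIM)
    v = face / np.linalg.norm(face) + noise * n / np.linalg.norm(n)
    return v / np.linalg.norm(v)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of unit vectors: 1.0 is identical, ~0 is unrelated."""
    return float(a @ b)

THRESHOLD = 0.7  # illustrative; real systems tune this against error rates

# One-to-one ("verification"): one probe against one enrolled template,
# both typically captured in a controlled, well-lit environment.
enrolled = capture(person_id=42, noise=0.05)
probe = capture(person_id=42, noise=0.1)
print("verified:", similarity(probe, enrolled) >= THRESHOLD)

# One-to-many ("identification"): one probe against an entire database.
# The probe is often a low-quality image taken "from the wild" (eg CCTV).
database = np.stack([capture(i, noise=0.05) for i in range(10_000)])
wild_probe = capture(person_id=42, noise=0.5)
scores = database @ wild_probe          # one score per enrolled face
best = int(np.argmax(scores))
# With thousands of candidates and a noisier probe, the chance that some
# unrelated face scores above the threshold grows, hence the higher risk
# of false matches in one-to-many FRT.
print("best match:", best, "accepted:", bool(scores[best] >= THRESHOLD))
```

Note how the one-to-many search must rank the probe against every record: as the database grows and the probe quality falls, the margin between a true match and the best-scoring stranger narrows, which is the statistical root of the misidentification risk described above.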

In Australia, FRT is often used by banks and telecommunications companies for identity verification purposes,7 and is used extensively by immigration authorities to verify the identity of passport holders at international borders and airports, as well as by law enforcement agencies throughout Australia for crime prevention and suspect identification purposes. In South Australia, South Australia Police (SAPOL) fully implemented its own FRT system (the NEC “NeoFace” system) in the Adelaide CBD in 2019, which integrates FRT with CCTV, ATM and some social media footage.8 In November 2021, the Adelaide City Council announced plans to roll out an updated City Safe CCTV Network that will involve the introduction of facial and number plate recognition.9

Existing Surveillance Laws

Application to FRT

There is no Commonwealth legislation that regulates the use of surveillance devices. Instead, this is currently governed by state and territory legislation.10 For example, the relevant legislation in South Australia is the Surveillance Devices Act 2016 (SA) (SDA).

The SDA prohibits:

  1.  the knowing installation, use or maintenance of an “optical surveillance device”11 by a person on a “premises”12 that visually records or observes a “private activity” without the express or implied consent of all the key parties;13 and
  2. the knowing use, communication or publication of information or material derived from the use of an optical surveillance device.14

The regulation of optical surveillance devices in most jurisdictions,15 including under the SDA, is linked to the concept of a “private activity”, being an activity carried on in circumstances that may reasonably be taken to indicate that one or all of the parties do not want the activity to be observed by others.16 Accordingly, the SDA might prohibit FRT in circumstances where it is used for covert optical surveillance (unless an exception applies).

The definition of “private activity” excludes activities carried on in a public place.17 Accordingly, public authorities can use devices with FRT to monitor the activities of the general public in public spaces, or semi-public spaces, without breaching the SDA.

Even if the SDA would otherwise prohibit a person or government authority from using a device with FRT, section 5(4) of the SDA sets out several exceptions to the general rule. These exceptions include where the use of the optical surveillance device is reasonably necessary for the protection of the “lawful interests” of that person, where the use of the device is in connection with the execution of a “surveillance device warrant” or “surveillance device (emergency) authority”, or where the use of the device is in the “public interest”.18 Similar exceptions exist in other jurisdictions.

The term “lawful interest” is not defined by the SDA, but the concept was given judicial consideration in Nanosecond Corporation Pty Ltd v Glen Carron Pty Ltd (2018) 132 SASR 63 (Nanosecond), where Doyle J held that the recording of a private conversation “just in case” it might prove advantageous in future civil litigation is not enough for the purpose of establishing a lawful interest. The Court is more likely to find that a recording has been made in the protection of a person’s lawful interests where the conversation relates to an allegation of a serious crime or resisting such an allegation, or where a dispute has “crystallised into a real and identifiable concern about the imminent potential for significant harm to the commercial or legal interests of a person”.19 Whilst Nanosecond concerned the use of a listening device, the same principles arguably apply to the recording of a private activity via an optical surveillance device with FRT.

Existing Privacy Laws

Application to FRT

The thirteen Australian Privacy Principles (APPs) in Schedule 1 to the Privacy Act 1988 (Cth) (Privacy Act) are intended to be technology neutral so as to preserve their relevance and applicability to changing technologies.20

Australian privacy law treats biometric information as personal information.21 In particular, “biometric information” that is to be used for the purpose of “automated biometric verification” or “biometric identification”, as well as “biometric templates”, are types of “sensitive information” for the purposes of the Privacy Act and APPs.22

“Biometric information” is not defined by the Privacy Act or APPs, but it is generally regarded as being information that relates to a person’s physiological or biological characteristics that are persistent and unique to the individual (including their facial features, iris or hand geometry),23 and which can therefore be used to validate their identity.24

The terms “automated biometric verification” and “biometric identification” are not defined by the Privacy Act or the APPs either. However, the Biometrics Institute defines “biometrics” as encompassing a variety of technologies in which unique attributes of people are used for identification and authentication,25 while the Office of the Australian Information Commissioner (OAIC) has indicated (in effect) that a technology will be “automated” if it is based on an algorithm developed through machine learning.26

A ‘biometric template’ is a mathematical or digital representation of an individual’s biometric information.27 Machine learning algorithms then compare the biometric template against other biometric information for verification or identification purposes.28
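As a rough illustration (and only that), a biometric template can be thought of as a stored array of numbers, detached from the original photograph. The 128-value float format and matching threshold in the sketch below are assumptions made for the illustration, not any vendor’s actual template format.

```python
# A minimal sketch of a "biometric template": a numeric representation of
# a face that can be stored, reloaded and matched later, without retaining
# the original image. The 128-dimensional float32 format is an illustrative
# assumption, not a description of any real system's template.
import numpy as np

def make_template(face_vector: np.ndarray) -> bytes:
    """Serialise a face's feature vector for storage in a database."""
    return face_vector.astype(np.float32).tobytes()

def match(stored: bytes, fresh: np.ndarray, threshold: float = 0.9) -> bool:
    """Compare a stored template against a freshly extracted vector."""
    template = np.frombuffer(stored, dtype=np.float32)
    fresh32 = fresh.astype(np.float32)
    score = float(template @ fresh32) / (
        np.linalg.norm(template) * np.linalg.norm(fresh32)
    )
    return bool(score >= threshold)

# Enrolment: extract features once, keep only the template.
features = np.random.default_rng(7).standard_normal(128)
stored_template = make_template(features)

# Later verification: a fresh capture of (roughly) the same face.
fresh_capture = features + 0.05 * np.random.default_rng().standard_normal(128)
print("match:", match(stored_template, fresh_capture))
```

Because the stored numbers alone are enough to match the person again later, the template itself warrants protection, and not merely the photograph from which it was derived.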

Given the breadth of the definitions of “biometric information”, “automated biometric verification”, “biometric identification” and “biometric template”, the majority of biometric information captured by FRT is likely to fall within the protections of the Privacy Act and APPs, and the safeguards contained in the Privacy Act and APPs will therefore apply to any biometric information collected by any FRT deployed by an “APP entity”.29

Current Safeguards

As a form of “sensitive information”, biometric information is afforded a higher level of privacy protection under the Privacy Act and APPs than other personal information, in recognition that its mishandling can have adverse consequences for an individual.30 An APP entity that collects and uses a person’s biometric information via FRT must therefore adhere to stricter requirements.

Consent

The key requirements are contained in APP 3, which (in effect) provides that an APP entity may only solicit and collect a person’s biometric information if the information is reasonably necessary for one or more of the APP entity’s functions or activities,31 the biometric information has been collected by “lawful and fair means”,32 and the person consents to the collection of their biometric information (unless an exception applies).33

Consent for the purpose of the Privacy Act and APPs can be either “express consent” or “implied consent”.34 As a general rule, an APP entity should seek express consent to the collection of sensitive information (including biometric information) as the potential privacy impact is greater.35 In either case, however, an individual must be adequately informed before giving consent.36

The Privacy Act and APPs contain five exceptions to the requirement for an APP entity to obtain a person’s consent prior to collecting sensitive information (including biometric information).37 The exceptions are broad and include:

  1. where it is unreasonable or impracticable to obtain a person’s consent to the collection, and the APP entity reasonably believes the collection is necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety;38
  2. where the APP entity has reason to suspect that unlawful activity, or misconduct of a serious nature, that relates to the APP entity’s functions or activities has been, is being, or may be engaged in, and reasonably believes that the collection is necessary in order for the entity to take appropriate action in relation to the matter;39 and
  3. where an “enforcement body”40 reasonably believes that collecting the information is reasonably necessary for, or directly related to, one or more of the body’s functions or activities.41

Use & Disclosure of Biometric Information

As a type of sensitive information, special requirements also apply to the use and disclosure of biometric information after it has been collected via FRT. APP 6 provides that an APP entity can only use or disclose biometric information for the original/primary purpose for which it was collected. For example, if a company collects the image of a person’s face for the purpose of unlocking their smartphone, the company would not (without consent) be permitted to use the individual’s face for an unrelated purpose, such as to build a database of people whose information could then be sold to a third party for marketing purposes.42

Biometric information can only be used or disclosed for a secondary purpose if an exception contained in APP 6.1 applies. Those exceptions include where the individual has consented to that secondary use or disclosure,43 or where an individual would “reasonably expect”44 the entity to use or disclose the information for that secondary purpose and the secondary purpose is directly related45 to the primary purpose of collection. There are also specific exceptions which enable an APP entity to share a person’s personal information (including their biometric information) with enforcement bodies.46

Current Gaps

Gaps in existing surveillance laws

The legislated exceptions to the prohibition on the use of optical surveillance devices are very broad and do not currently have any in-built statutory limits. Accordingly, they have the potential to result in significant incursions on a person’s privacy. However, the decision in Nanosecond serves to curtail such invasions by ensuring that the “lawful interest” exception cannot be relied on to use FRT to visually monitor a person merely in anticipation that they might do something that impinges upon the lawful interests of the person conducting the surveillance.

Gaps in existing privacy laws

Scope

The Privacy Act and APPs are federal laws that only apply to organisations and agencies deploying FRT that fall within the definition of an “APP entity”. The definition of an “APP entity” does not include state and territory authorities or agencies, or organisations with an annual turnover of less than $3 million.47 Whilst some jurisdictions have their own specific privacy legislation that steps in to help safeguard a person’s privacy where FRT is used, there are other jurisdictions where no specific privacy legislation exists at all (including South Australia).

In South Australia, the State public sector is required to comply with the South Australian Information Privacy Principles (IPPs).48 However, the IPPs do not extend to biometric information, and so no legal framework applies to surveillance activities carried out by agencies, authorities and organisations in South Australia that fall outside the scope of the Privacy Act and APPs.

Difficulty with establishing true consent

In the past year, the OAIC has issued two rulings in which it determined that the collection of biometric information by two separate companies (Clearview AI49 and 7Eleven50) contravened the consent requirements of the Privacy Act and APPs.

As a general rule, an APP entity collecting biometric information via FRT should obtain express consent. However, the nature of FRT means that it is often not practical to obtain true, express consent from individuals whose biometric information might be captured by FRT. Whilst obtaining express consent is arguably more realistic where “one-to-one” FRT is being utilised for a specific purpose in a controlled environment, it is more difficult for an APP entity to obtain the express consent of every person whose biometric information might be captured in circumstances where “one-to-many” FRT is being deployed. Accordingly, whilst it is not ideal, in order to comply with current privacy laws, an APP entity that deploys FRT will usually need to establish that a person’s consent to the collection of their biometric information by FRT can be implied.

Even though implied consent is an option, it is still difficult to establish that implied consent has been obtained, given the relevant legal requirements. In particular, it can be practically difficult to provide people with enough information about how FRT collects and uses their biometric information before FRT captures their image. As a result, most people captured by FRT will not have been properly informed about what they were consenting to. Further, an individual will often not have the ability to refuse to provide their consent to the use of FRT, and may feel compelled to provide it due to the inconvenience of not doing so, or due to their lack of bargaining power. For example, although 7Eleven displayed a notice at the entrance to its stores to alert customers that they would be subject to FRT when they entered the store,51 and sought to infer that any customer who then chose to enter the store had provided consent, it is arguable that the customer had no choice (particularly if there were no convenient alternatives available to them).

Notwithstanding the practical difficulties of obtaining consent in the context of FRT, the OAIC’s decision regarding Clearview AI has reinforced the importance of doing so. In that matter, Clearview AI had compiled a database of more than three billion images scraped from public posts on social media platforms and other public websites. Clearview AI then allowed its paying customers to use its software to upload an image and find matching faces and associated details from social media profiles.52

Following a joint investigation with the UK’s Information Commissioner’s Office (ICO), the OAIC found that Clearview AI had breached the Privacy Act and APPs by, among other things, collecting Australians’ sensitive information without consent. In particular, the OAIC found that there was no evidence that express consent had been obtained, and was not satisfied that consent could be implied in the circumstances on the basis that, among other things:53

  1. the act of uploading an image to a social media site did not unambiguously indicate a person’s agreement to the collection of that image by an unknown third party for commercial purposes; and
  2. Clearview AI’s publicly available Privacy Policy was insufficient to enable individuals to understand how their biometric information was being collected, the purpose of collection and how it would be handled by the respondent. Accordingly, any consent purported to be provided through the Privacy Policy would not have been adequately informed.

The OAIC subsequently ordered Clearview AI to cease collecting facial images and biometric templates from individuals in Australia, and to destroy existing images and templates collected from Australia. Similarly, in May 2022, the ICO confirmed that it had concluded its own investigation into Clearview AI and found that the company had breached the relevant UK data protection laws. The ICO fined Clearview AI £7,552,800 and issued an enforcement notice ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.54

Breadth of exceptions

Another gap in the protections afforded by the Privacy Act and APPs is that the exceptions to the consent requirements of APP 3, and to the single purpose requirement of APP 6, are quite broad and may not sufficiently protect people against invasions of privacy. The exceptions in the Privacy Act which allow for the collection and use or disclosure of sensitive information (including biometric information) without consent reflect a balancing of individual interests against those of collective security.55 However, there are arguments that this balancing approach has resulted in individual privacy being “traded off” against the wider community interests of preventing, detecting and prosecuting crime.56

Where to from here?

The issues identified demonstrate the unique challenges posed by biometric technologies. It is clear that while existing privacy and surveillance laws place a number of safeguards on the use of FRT in private enterprise, there are still some gaps in the regulation of the use of FRT.

In March 2021, the Australian Human Rights Commission released its Human Rights and Technology Final Report, which made a number of recommendations for the regulation of FRT, including the introduction of tailored legislation that regulates the use of FRT, and the introduction of a statutory cause of action for serious invasions of privacy.57 These recommendations have been made at the same time that the privacy law regime in Australia is undergoing a comprehensive review. Accordingly, that review may result in the incorporation of additional, more tailored provisions relating to the use of FRT and its intersection with personal privacy.

This article was written by Caitlin Surman, Senior Associate and reviewed by Peter Campbell, Partner.


1Monique Mann and Marcus Smith, ‘Automated Facial Recognition Technology: Recent Developments and Approaches to Oversight’ (2017) 40(1) UNSW Law Journal 121, 122.

2This involves a computer checking whether a single facial image matches a different facial image of the same person: Australian Human Rights Commission, Human Rights and Technology (Final Report, March 2021) 113.

3Eifeh Strom, ‘Facing challenges in face recognition: one-to-one vs. one-to-many’, Asmag (Web page, 19 September 2016) <https://www.asmag.com/showpost/21158.aspx>.

4Philip Brey, ‘Ethical Aspects of Facial Recognition Systems in Public Places’ (2004) 2 Journal of Information, Communication and Ethics in Society 97, 98.

5Seth Lazar, Clair Benn and Mario Gunther, ‘Large-scale facial recognition is incompatible with a free society’, The Conversation (Web page, 10 July 2020) <https://theconversation.com/large-scale-facial-recognition-is-incompatible-with-a-free-society-126282>.

6Australian Human Rights Commission, Human Rights and Technology (Final Report, March 2021) 113.

7Liz Campbell, ‘Why regulating facial recognition technology is so problematic – and necessary’, The Conversation (Web page, 26 November 2018) <https://theconversation.com/why-regulating-facial-recognition-technology-is-so-problematic-and-necessary-107284>.

8‘South Australia Police tap NEC for facial recognition edge over criminals’, NEC Organisation (Web page, 1 August 2016) <https://www.nec.com/en/press/201608/global_20160801_03.html>.

9Malcolm Sutton, ‘Facial recognition technology put on hold in Adelaide amidst privacy concerns’, ABC News (Web page, 10 November 2021) <https://www.abc.net.au/news/2021-11-10/facial-recognition-tech-on-hold-amidst-privacy-concern/100608514>.

10Relevant legislation in other States and Territories is as follows: Surveillance Devices Act 1999 (Vic); Surveillance Devices Act 2007 (NSW); Surveillance Devices Act 2007 (NT); Surveillance Devices Act 1998 (WA); Invasion of Privacy Act 1971 (Qld); Listening Devices Act 1992 (ACT).

11An “optical surveillance device” means a device capable of being used to observe or record visually (whether for still or moving pictures) a person, place or activity: SDA, s 3.  This definition is arguably wide enough to capture any devices that integrate FRT for the purpose of capturing facial images (such as CCTV).

12“premises” includes land, a building, a part of a building, and any place (whether built or not): SDA, s 3.

13SDA, s 5(1).

14SDA, s 12(1).

15Excluding the Invasion of Privacy Act 1971 (Qld), which only regulates the use of “listening devices”.

16SDA, s 3.

17SDA, s 3. The definition of “private activity” also excludes activities that can be readily observed from a public place, and/or activities carried on in circumstances where the person ought reasonably to expect that they may be observed by another person.

18SDA, s 6(2).

19Nanosecond, [103] to [105].

20Office of the Australian Information Commissioner, Submission No. D2018/009462 to Australian Human Rights Commission, Human Rights and Technology Issues Paper (19 October 2018). <https://www.oaic.gov.au/engage-with-us/submissions/human-rights-and-technology-issues-paper-submission-to-the-australian-human-rights-commission>.

21APP Guidelines, Chapter B: Key Concepts, [B.27].

22APP Guidelines, Chapter B: Key Concepts [B.138]; Privacy Act, s 6(1).

23Office of the Victorian Information Commissioner, Biometrics and Privacy (Web page) <https://ovic.vic.gov.au/resource/biometrics-and-privacy/>.

24‘Types of Biometrics’, Biometrics Institute (Web page) <https://www.biometricsinstitute.org/what-is-biometrics/types-of-biometrics/>.

25Above n 24.

26Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54, [138] (Clearview).

27International Organization for Standardization, ISO/IEC 2382-37:2017(en), 3.3.22 (Web page, 12 March 2021) <https://www.iso.org/obp/ui/#iso:std:iso-iec:2382:-37:ed-2:v1:en>.

28Clearview, [127].

29APP Guidelines, Chapter B: Key Concepts [B.2] to [B.9]; Privacy Act, s 6(1). APP entities generally include Australian Government agencies and any organisation with an annual turnover of more than $3 million.

30APP Guidelines, Chapter B: Key Concepts, [B.141].

31APP 3.1 and APP 3.2.

32APP 3.5.

33APP 3.3.

34Privacy Act, s 6(1).

35APP Guidelines, Chapter B: Key Concepts, [B.41].

36APP Guidelines, Chapter B: Key Concepts, [B.35].

37The five exceptions are contained at APP 3.4.

38Privacy Act, s 16A(1), Item 1. This is one of the seven “permitted general situations” provided for by s 16A.

39Privacy Act, s 16A(1), Item 2. This is one of the seven “permitted general situations” provided for by s 16A.

40‘Enforcement body’ is defined in s 6(1) of the Privacy Act. It lists a series of specific bodies, including Commonwealth, State and Territory bodies that are responsible for policing, criminal investigations, and administering laws to protect the public revenue or to impose penalties or sanctions.

41APP 3.4(d)(ii).

42Australian Human Rights Commission, Human Rights and Technology (Final Report, March 2021), 112.

43APP 6.1(a).

44The ‘reasonably expects’ test is an objective one that has regard to what a reasonable person, who is properly informed, would expect in the circumstances. This is a question of fact in each individual case, and it is the responsibility of the APP entity to be able to justify its conduct. Examples of where an individual may reasonably expect their personal information to be used or disclosed for a secondary purpose include where the entity has notified the individual of the particular secondary purpose under APP 5.1 (see Chapter 5 (APP 5)), or where the secondary purpose is a normal internal business practice: APP Guidelines, Chapter 6: APP 6, [6.20].

45A directly related secondary purpose is one which is closely associated with the primary purpose, even if it is not strictly necessary to achieve that primary purpose: APP Guidelines, Chapter 6: APP 6, [6.26].

46APP 6.2(c), APP 6.2(e) and APP 6.3.

47APP Guidelines, Chapter B: Key Concepts, [B.8]; Privacy Act, s 6(1).

48Government of South Australia, Department of the Premier and Cabinet Circular, Information Privacy Principles Instruction PC012 (Webpage, 16 September 2013) <https://dpc.sa.gov.au/premier-and-cabinet-circulars>.

49Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (Clearview).

50Commissioner initiated investigation into 7-Eleven Stores Pty Ltd (Privacy) (Corrigendum dated 12 October 2021) [2021] AICmr 50 (7Eleven).

517Eleven, [89].

52Daniel Kiley, ‘New tech brings facial recognition into clear view’ (2020) 17 Privacy Law Bulletin 9.

53Clearview, [125] to [161].

54DAC Beachcroft, ‘Facial recognition: Clearview AI fined more than £7.5m by ICO’ (26 May 2022) <https://www.lexology.com/library/detail.aspx?g=ef4595d7-b7ed-42b7-83a6-018af5bfd032&l=9RJSRCL>.

55Above n 1, 132.

56Ibid.

57In South Australia, the draft Civil Liability (Serious Invasions of Privacy) Bill 2021 (Privacy Bill) has been tabled for consideration in Parliament to establish a new statutory cause of action for serious invasions of privacy in South Australia, which is separate and distinct from the Privacy Act and APPs. The Privacy Bill will enable an individual to bring civil proceedings against a person who has invaded their privacy where there was a reasonable expectation of privacy, the invasion of privacy was serious and the conduct was undertaken intentionally. Consultation in respect of the Privacy Bill is still underway, but that consultation process will hopefully assist in identifying how the proposed statutory tort can best be utilised to address the gaps in the safeguards provided by the current privacy and surveillance laws.
