Biometric identifiers offer an opportunity to recognise and authenticate users quickly and reliably, especially when users’ data protection is embedded in the design of the technology.
Biometric authentication systems
The notion of “biometric authentication systems” might sound rather unfamiliar, especially to those who do not work in the tech, privacy or security field. Yet, biometric systems have a ubiquitous presence in our lives. Just think how many times you have unlocked your mobile with your fingerprint or just by looking at the camera. Or when you passed through an e-gate at the airport where your face pattern was used for automated border control.
During my studies, I worked in a shopping mall that used fingerprints to identify employees and record when we clocked in and out. All these situations have one thing in common: they exploit the power of our unique physical characteristics to authenticate or verify our identity… and this is what biometric systems are all about.
Fingerprints are a secure and immutable proof of identity
According to Art. 4 of the European General Data Protection Regulation (GDPR), biometric data are “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person”. Each of us is characterised by very complex physical, biological and behavioural patterns which are unique and difficult to forge. Examples of biometrics include fingerprints, face and voice recognition, iris or retinal scans or even DNA, among many others.
Biometric identification systems use biometric characteristics to identify or verify users. These systems read people’s unique body patterns, such as fingerprints and facial features, and compare them against a stored digital identity to determine whether a user is who they claim to be. Biometrics play an increasingly significant role in authentication because they are a solid, secure and immutable proof of identity.
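The comparison step described above can be sketched in a few lines. This is a minimal, hypothetical illustration of 1:1 verification (not the MADRAS implementation): a “digital identity” is modelled as a feature vector stored at enrolment, and a fresh capture is accepted only if it is close enough to the stored template. The function names, the toy feature extraction and the distance threshold are all illustrative assumptions.

```python
import math

def extract_template(raw_scan):
    """Stand-in for real feature extraction (minutiae, embeddings, etc.)."""
    return [x / 255.0 for x in raw_scan]

def verify(stored_template, fresh_scan, threshold=0.1):
    """1:1 verification: accept only if the fresh capture is close enough
    to the template stored at enrolment."""
    fresh = extract_template(fresh_scan)
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(stored_template, fresh)))
    return dist <= threshold

enrolled = extract_template([120, 133, 97, 210])  # stored digital identity
print(verify(enrolled, [120, 133, 97, 210]))      # same finger: accepted
print(verify(enrolled, [30, 200, 15, 80]))        # different finger: rejected
```

Note the distinction this sketch embodies: verification answers “is this person who they claim to be?” (one-to-one match against one stored template), whereas identification searches one capture against many stored templates.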
However, with this strong identification capacity also come some unique challenges that revolve around the protection of users’ data in line with the GDPR. Data needs to be protected from potential hacking attempts, and it must be managed ethically to protect users’ privacy.
How will MADRAS-based technologies use biometrics?
MADRAS addresses the need for low-cost and reliable materials for the mass production of flexible Organic and Large Area Electronics (OLAE) products. Pursuing this goal, the MADRAS team is developing OLAE photosensors able to detect users’ fingerprints and heart rate. As part of the project’s technological validation, these photosensors will be installed on scooters belonging to a scooter-sharing network and will be used to verify and authenticate their users. While fingerprints are a classic biometric used for authentication, users’ skin vibration in response to heartbeat is a newer technique that will be employed in MADRAS for anti-spoofing purposes. The fingerprint and heart rate data collected by the photosensors is privacy-sensitive information, which requires special safeguards to be put in place for its protection. Along these lines, MADRAS adopted a privacy by design approach to guarantee that users’ data is managed and stored ethically and with respect for their privacy.
What do we mean by privacy by design?
Privacy by design is one of the most discussed topics related to data protection. In the European Union, the protection of data is regulated by the GDPR, which sets out seven key principles for data collection, processing and management. These are: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability. Specifically, protecting user data by default is integrated into the GDPR (Art. 25(2)), which states:
“The controller shall implement appropriate technical and organizational measures for ensuring that by default only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, their storage period, and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.”
The privacy by design approach dictates that privacy and data protection are embedded in the entire lifecycle of the technology from its design, through its use and until its disposal. This method allows us to anticipate and prevent the risks to which data is exposed and makes the development of the technology more efficient. Not to mention that users’ trust is vital, and they expect that their data is stored safely and used ethically. GDPR leaves open which protective measures are to be taken by technology managers and providers to comply with the data protection rules. Encryption and anonymisation of data are two possible options, but not the only answer. This is when the challenge of deciding how to better protect users’ data comes in!
How is MADRAS taking care of users’ data?
As MADRAS took a privacy by design approach, decisions on how best to protect users’ data were made while the MADRAS photosensors were being developed. Among the technical specifications defined as part of the initial privacy risk assessment, MADRAS intends to protect users’ data by:
- not storing biometric data on any device.
- applying a one-way transformation function (hash) to the original biometric data, so that an attacker would not be able to reverse-engineer these data. The hashed data will then be encrypted in the database.
- implementing a mechanism to check the authenticity, integrity and security of the data in the database. The database will implement mechanisms to guard against viruses and trojans, and will follow an Information Security Management System standard (such as ISO 27001).
- making it possible to remove a user account and securely wipe any associated biometric data. The system shall integrate functionality to remove a user’s biometric data altogether whenever needed or requested, in order to comply with Art. 17 of the GDPR.
- including a mechanism for notifying the company and users about possible data breaches. Moreover, as mandated in the GDPR (Art. 33), the supervisory authority must be notified “not later than 72 hours after having become aware of it”.
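The first, second and fourth safeguards above can be sketched together. This is a minimal, hypothetical illustration, not the MADRAS implementation: the raw biometric sample is reduced to a salted one-way hash and discarded, verification recomputes the hash and compares it in constant time, and erasure deletes the record to honour Art. 17. A real system would additionally encrypt the stored records and use fuzzy template-protection schemes, since two live biometric captures are never bit-identical; all names below are illustrative assumptions.

```python
import hashlib
import hmac
import os

DB = {}  # stand-in for the encrypted database

def enroll(user_id, biometric_bytes):
    """Store only a salted one-way hash; the raw biometric is discarded."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", biometric_bytes, salt, 100_000)
    DB[user_id] = (salt, digest)

def verify(user_id, biometric_bytes):
    """Recompute the hash and compare in constant time to resist timing attacks."""
    salt, stored = DB[user_id]
    candidate = hashlib.pbkdf2_hmac("sha256", biometric_bytes, salt, 100_000)
    return hmac.compare_digest(stored, candidate)

def erase(user_id):
    """GDPR Art. 17 right to erasure: remove the user's biometric record."""
    DB.pop(user_id, None)

enroll("rider42", b"fingerprint-derived-token")
print(verify("rider42", b"fingerprint-derived-token"))  # matching capture
print(verify("rider42", b"someone-else"))               # non-matching capture
erase("rider42")
print("rider42" in DB)                                  # record is gone
```

The key property of the one-way hash is exactly the one the list describes: even if the database is compromised, the stored digests cannot be reversed to recover the original biometric data.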
In brief, employing a privacy by design approach allows MADRAS to embed a culture of privacy and users’ data protection within the project development. It also makes the development of the photosensors more efficient as risks have been identified before they materialise.
About the author
Francesca Trevisan, Researcher and Project Manager at Eticas
- Social scientist completing a PhD in Social Psychology at the University of Surrey.
- Associate Lecturer in Modelling Social Data II at Goldsmiths, University of London.
- Expertise in social justice, security and AI.
Eticas Research and Consulting is a Spain-based SME founded in 2012 as a university spin-off. Since then, Eticas has been working on the legal, ethical and social impact of security policy, innovation and technology development, as well as the interaction between changing societal values, engineering possibilities and fundamental rights.