For Zeki Erkin, privacy is a human right that needs to be respected. That is why he shapes his research around secure data sharing and processing by means of encryption. So, yes, there is a future in which you receive personalised recommendations on Netflix, and a warning when you have been near a COVID-infected person, without a company or the government knowing your every activity.
‘Privacy research is difficult and challenging and anyone working in this field has a deeper reason for doing so,’ says Zeki Erkin, tenured associate professor in the Cyber Security Group at TU Delft. For himself, it is preventing a future as painted in the George Orwell classic “1984”, in which you must even hide something from yourself if you wish to keep it a secret.
Yet we, as consumers, happily and too readily share our demographic information, shopping patterns, medical records and location data to get personalised recommendations, a discount or a sense of security. ‘It is possible to have all these benefits and protect privacy and confidentiality,’ Erkin says. ‘In my group, we design protocols based on advanced encryption schemes that allow data to be processed without revealing its content.’
Zero-knowledge proofs: your birth date remains hidden during age verification
One of the cornerstones of his research, and of that of the small army of four PhD students and twelve master’s students he supervises, is the so-called zero-knowledge proof. Imagine an online retailer asking for your date of birth to determine your eligibility for a discount on a cell phone. ‘They don’t need to know your birth date, even though they would like to collect it,’ Erkin says.
‘With my technology, all they get is proof that you qualify.’ This is just a small example. In a project together with the Central Judicial Collection Agency (CJIB: Centraal Justitieel Incassobureau), he co-designed a system for secure data sharing between municipalities. The idea was to protect people in financial distress from accumulating more debt due to interest and penalties added to unpaid speeding tickets. Without revealing any raw data underlying its calculations, the system answered questions such as “can this person pay his or her debt in three months?” If it weren’t for COVID-19, a prototype system would be up and running today.
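The full age check described above requires a range proof over a committed birth date, which takes considerably more machinery. As a minimal sketch of the zero-knowledge idea itself, here is a classic Schnorr-style interactive proof: the prover convinces a verifier that she knows a secret x with y = gˣ mod p, without revealing x. All parameters and names are illustrative, not taken from Erkin’s actual protocols.

```python
# Toy Schnorr-style interactive proof of knowledge: the prover shows she
# knows x with y = g**x mod p, while the transcript reveals nothing about x.
# (Illustrative parameters only; a real age check would be a range proof
# over a committed birth date -- more machinery, same principle.)
import random

p = 2_147_483_647          # public prime modulus (2**31 - 1)
q = p - 1                  # exponents are reduced mod p - 1 (toy choice)
g = 7                      # public generator (toy choice)

x = 123_456                # prover's secret (stands in for the birth date)
y = pow(g, x, p)           # public value derived from the secret

# Commit: prover picks a fresh random nonce k and sends t = g**k mod p.
k = random.randrange(1, q)
t = pow(g, k, p)

# Challenge: verifier replies with a random c.
c = random.randrange(1, q)

# Respond: prover sends s = k + c*x mod q. Because k is uniformly random,
# s leaks nothing about x on its own.
s = (k + c * x) % q

# Verify: g**s must equal t * y**c mod p; this convinces the verifier
# that the prover knows x, yet x itself never left the prover.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("verifier convinced; secret never revealed")
```

The check works because gˢ = gᵏ · g^(c·x) = t · yᶜ (mod p); only someone who knows x can produce a valid s for a challenge chosen after the commitment.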
There should be no excuse not to use our privacy-ensuring solutions.
The building blocks
If you want to keep something confidential, you usually encrypt it. But some encryption schemes are very special: they preserve some mathematical structure, thereby allowing data to be manipulated while it remains encrypted. Erkin: ‘We take small operations and design a bigger protocol to achieve services like recommender systems, or statistical analysis on confidential data, or prediction algorithms on sensitive data.’ His solutions often involve blockchain technology as a tool for distributing data. ‘There are people working on improving existing blockchain technology, to make it better, faster, more reliable and more secure. Others work on faster and more flexible encryption schemes. We use the tools that are available right now and apply smart techniques to achieve the most efficient design possible, taking into account the security requirements of the specific scenario. A medical setting is completely different from a financial setting.’
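To make the structure-preserving idea concrete, here is a toy additively homomorphic scheme in the style of Paillier, one of the classic building blocks for computing on encrypted data. Multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can add encrypted values without ever seeing them. The parameters below are deliberately tiny; this is a sketch of the principle, not a description of Erkin’s protocols, and real systems use 2048-bit moduli and vetted libraries.

```python
# Toy Paillier-style additively homomorphic encryption.
# Tiny parameters for illustration only -- never use in practice.
import math
import random

def keygen():
    p, q = 293, 433                # small primes with gcd(pq, lcm(p-1,q-1)) = 1
    n = p * q                      # public modulus
    lam = math.lcm(p - 1, q - 1)   # private exponent lambda
    g = n + 1                      # standard generator choice
    mu = pow(lam, -1, n)           # valid because g = n + 1
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)     # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)         # c**lambda = 1 + (m*lambda)*n  (mod n**2)
    return ((x - 1) // n * mu) % n # L(x) = (x-1)/n, then scale by mu

pub, priv = keygen()
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
# The homomorphic step: multiplying ciphertexts adds the plaintexts.
c_sum = (c1 * c2) % (pub[0] ** 2)
print(decrypt(pub, priv, c_sum))   # prints 42
```

In a larger protocol, operations like this are composed so that, for example, a recommender or a statistical analysis runs entirely on ciphertexts, and only the final result is ever decrypted.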
Speed is of the essence, but not always
The protocols they develop are very secure and privacy-preserving. The research challenge is to also make them fast enough, as encryption is a computationally expensive operation. If a normal plain-text service takes a single second, the privacy-preserving version could take minutes or much longer. ‘Over the past decade, we have shown significant progress in terms of speed,’ Erkin says. ‘Depending on the application, it is possible to achieve response times of milliseconds. We will never be as efficient as the plain-text version of the system, but we want to get close enough so that people start using it. There should be no excuse not to use it.’
Privacy is not very profitable for the current business models
With very privacy-sensitive data involved, the medical field is a domain that Erkin is especially interested in. A few years ago, LUMC (Leiden University Medical Center) ran a project named The Box, in which it gave smart devices to people who had just had surgery. The aim was to monitor their activity, sugar level and weight on a daily basis, without having the patients come into the hospital every day. ‘We redesigned the system such that the data would not be stored at the company, which was not located in the Netherlands,’ Erkin says. ‘In the end, the company processed the data under encryption, providing only the final results to the hospital.’
Different assumptions and expectations
Collaborating with the medical doctors at LUMC, he noticed that their terminology was completely different, and so were their assumptions and expectations. ‘We think in terms of passwords, hash functions and biometric scanning. In the medical domain they are really focussed on saving lives, and efficiency is very important. It will take one or two years to really understand each other, and not everyone, certainly not a young researcher needing publications, has this time. From my perspective, instead of pushing researchers to write project proposals that have little chance of being granted, we should establish multidisciplinary joint programmes. Assign one PhD student to people from two different organisations and let them build something, slowly. It is the most successful approach I have seen in academia, and I believe it to be the best way forward for Leiden-Delft-Erasmus security and safety.’
Hopefully sometime soon, privacy will be demanded as much as security
The people in Erkin’s group are all computer scientists. But he is well aware that multidisciplinarity is key to building a privacy-preserving future. ‘We develop prototypes of technical solutions,’ he says. ‘Somebody has to produce it, implement it, manage it and maintain it. If something goes wrong, what shall be done to fix it? These are all aspects of governance.’ He is therefore in close contact with people from Leiden University, in particular Professor Bram Klievink, who is interested in the governance and legal aspects of these technical solutions. ‘So far, I have seen only one really good, impactful implementation of the kind of technology we develop,’ Erkin says. ‘My colleagues from EPFL in Lausanne developed a privacy-preserving tracking protocol that is now being used by multiple countries in their COVID-tracking apps.’
Right now, you may wonder – no, you should wonder – why the privacy-preserving future hasn’t yet arrived. The simple answer: privacy is not very profitable for the current business models; companies want to have your raw data, and Erkin’s technology prevents them from having it. The European law on data protection and privacy (GDPR: General Data Protection Regulation) that became enforceable in 2018 was a step in the right direction. ‘But to be compliant, companies just apply some anonymisation techniques,’ Erkin says. ‘Their biggest effort is in security – protecting their systems and networks from outside attacks. Hopefully sometime soon, privacy will be demanded as much as security.’ Thanks to Erkin (and others), the necessary privacy-preserving tools will be ready and waiting. It is up to us, consumers and users of digital services, to show some outrage – to demand that our privacy be respected and bring that future home.