How the smart city becomes legitimate and hence trustworthy

A smart city is no good if its residents distrust its doings. The Hague's Chief Data Officer Tanaquil Arduin and Leiden University professor of public administration Bram Klievink explain how to avoid that situation – somewhat. ‘Make sure the conversation officials need to have with residents can be a comfortable one.’

This is how The Hague uses data

Tanaquil Arduin
Photo by Quintin van der Blonk

In 2019, Tanaquil Arduin became one of the first Chief Data Officers in the public sector. Which problems can be solved with data and smart technology, according to her? ‘They mainly allow you to gain insight into social issues. This way you can determine where best to deploy your scarce capacity. Scarce enforcers, neighbourhood managers or front desk staff, for example.’

'Making intersections and road sections safer is also done more purposefully based on data and technology. Based on how busy it is, the weather, possible events or construction activities, and whether cyclists and especially schoolchildren have to use the roadway, we can better assess whether the risk of an accident will increase or decrease.’

    ‘Data and technology allow you to determine where best to deploy your scarce enforcers.’

What if residents are concerned about road safety? ‘We cannot deal with everything at one go. Such a system can determine for us which intersection has the highest priority and what we should pay particular attention to at that intersection. We can also explain to residents why this intersection is being dealt with while that one still has to wait.’

Based on demographic data, so without the involvement of personal data, another algorithm predicts per neighbourhood an increase or decrease in the use of social support for the next five years. ‘We know then whether a neighbourhood will need more case managers, mobility scooters or domestic help and can adjust the level of services accordingly. It works so well that the Association of Dutch Municipalities wants to upscale the use of this model, the so-called Social Support Act forecasting model.'


And this is how The Hague shows it

In total, The Hague uses 26 algorithms that play a role in the city. The municipality wants to be genuinely open about that. Arduin: ‘We have made the podcast series This is what The Hague does with data.’ According to YouTube, the short videos have been watched by hundreds to a thousand viewers. ‘In addition, since this year, there is the Dutch government's Algorithm Register.’

In it, The Hague already describes all 26 algorithms: what each of them does and why, whether it is self-learning or rule-based, and what data it uses. The register may not be written in the simple language that would make it comprehensible for as many people as possible, but it does provide an overview nevertheless. Bram Klievink compliments Arduin: ‘The Hague is leading the way in completing the register.’

As a citizen, you need to actively search for the Algorithm Register and the videos. However, if you happen to be at a location in the city where algorithms are used, videos and elaborate descriptions are of little use to you. For these situations, Thijs Turèl of the Amsterdam Institute for Advanced Metropolitan Solutions is developing a guideline for stickers on sensors. They can state what the technology does and does not register, who owns the equipment, and where people can lodge complaints.

    ‘The development of trust in the government is a larger trend, dependent on the political and economic climate.’

A government must work hard for trust (sometimes in vain)

Bram Klievink
Photo by Patricia Nauta

The Algorithm Register also contains information about the Amsterdam Top400 and Top600, aimed at the early detection and prevention of juvenile delinquency. That approach became controversial when it was revealed that the municipality had also started preventively surveilling the siblings of juvenile offenders.

It undermines residents’ trust in the government's use of such systems, not least because of its similarity to the recent childcare allowance affair, in which the tax authorities used an algorithm that relatively often labelled people with dual citizenship as potential fraudsters.

Trust is the foundation of a good relationship between government and residents. Bram Klievink has done a lot of research on it. ‘Being open about the use of data and technology has, at most, a small effect on citizens’ trust.’

‘The development of trust in the government is a larger trend, dependent on the political and economic climate. A municipality cannot do much about it, even though, at the same time, the risk of mistrust is high: mistakes that betray trust are widely publicised.’

Klievink analyses it this way: ‘Most citizens have little to do with the government. What they do – file tax returns, apply for a passport – usually goes well. Yet, a lot also goes wrong, and those affected are often vulnerable people. If the government makes those few transactions of mine run smoother or distributes the Social Support Act budget better, I will not notice anything. Nevertheless, that is also valuable.’

All algorithms checked for human rights

Klievink: ‘For the government, trust is something external over which you have little control. Yet, internally, you have to ask yourself at all times whether you are trustworthy. A government has power, and it has to use it legitimately. This means you always have to assess whether the returns of using data, for instance tracking data for security or cost efficiency, are proportional to the cost of a possible invasion of privacy. The values have to be balanced.’

    ‘A government has power, and it has to make legitimate use of it.’

In The Hague, that public balancing of values and weighing of fundamental rights forms an important part of the data strategy Arduin drafted in 2019 with broad involvement of officials and politicians. Arduin: ‘We judge each algorithm against a yardstick, which Amnesty International recently told me they consider a good tool. It is the IAMA, the Impact Assessment for Human Rights and Algorithms, developed by the Utrecht Data School.’

‘The IAMA asks questions that spark discussion. For example, is it really necessary to monitor this or that in that particular neighbourhood? What does it mean for the equality of citizens? Is unnecessary profiling taking place? Is it proportional? And is there information symmetry?’

    ‘We judge each algorithm against the standard of human rights.’

Information symmetry? ‘That means that citizens and government have access to the same information. Suppose you want to increase the limit on your credit card; the credit card company may request a lot of information. They know a lot about you, but you do not know exactly how they determine whether your limit can go up or not. No information symmetry.’

There is more to it than ethics and legal committees alone

Klievink does see a pitfall in assessing the legitimacy of urban innovations. ‘The committees that have to do this often take a technical, legal and ethical approach. However, somebody else will soon have to work with the result. Take the digital mirror city, a 3D model of the entire city that you can navigate through. Every roof extension can be seen. If someone from the Licensing Department sees an illegal extension, should she act on it or not? And if so, what will that conversation look like? The perspective of the civil servant who has to work with the technology in practice is still regularly missing when an assessment is made before use.’

Digital mirror city, a 3D model with which you can navigate through the city

Arduin agrees: ‘Our municipality has 12,000 employees, while an ethical committee consists of only a few people.’ That is simply how it is, of course. Klievink: ‘My advice would be that the committee also pays attention to the organisational side, the HR side. The organisation deploys technological applications within the context of policy and organisational rules.’

‘Subsequently, these applications influence work processes, also touching on other processes and systems. In the end, officials have to work with them, with their own professional knowledge and routines. Even if the legitimate use of technology in the smart city is clear in theory, that does not provide real direction in practice. Make sure the conversation officials need to have with residents can be a comfortable one.’

Tanaquil Arduin is Chief Data Officer of the municipality of The Hague. She initiated a nationwide network for CDOs in the public sector that, in 2023, connects 60 colleagues. She is also head of the Data and AI Expertise Centre of the municipality of The Hague.

Bram Klievink is professor of public administration at Leiden University, specialised in algorithmic governance, and member of the executive board of Leiden-Delft-Erasmus Centre for BOLD Cities. He conducts research on how digital innovations in big data and algorithms can challenge and reinforce public governance.

Text: Rianne Lindhout


White Paper about Smart Cities

This article is a pre-publication from the white paper 'This is the real smart city, with lively debate on democracy, data and technology'. This (Dutch) publication by the Leiden-Delft-Erasmus Centre for BOLD Cities contains nine double interviews on the rise of the smart city, along with an English summary and a list of recommendations.

The white paper will be presented on 22 June 2023 at Campus The Hague during a live talk show. You can register for this festive event via the link below.

Attachments
White Paper Smart Cities (4.83 MB)
