Security, Privacy

Key learnings from working on privacy in contact tracing

In today's post I'll share key lessons from my journey implementing Anonymous Tokens and integrating them into Norway's contact tracing app "Smittestopp". Privacy and transparency, especially in government IT, are vital for gaining citizens' trust - and they are here to stay. Therefore I'll share some success factors and my takeaways with you.

6 min read


By Henrik Walker Moe


December 15, 2021

This post is a follow-up to "Anonymous Tokens for Private Contact Tracing", which Tjerand Silde, Martin Strand and I wrote for last year's calendar on the 22nd of December. There we told the story of how we improved privacy in Norway's contact tracing app "Smittestopp" by designing a protocol for anonymity.

Through the work on Anonymous Tokens and the integration of our contribution into Smittestopp, I've summarized some success factors and what I want those of us who build digital services to take away from our journey.

Key factors for success:

  • cross competency
  • deep skills and overlap in skills
  • good communication skills
  • open source enabled transparency and trust

Key takeaways:

  • ensure built-in privacy
  • have a data minimisation mindset
  • trust through transparency

Next I'll bring you up to speed on what's happened since our last post, and share how things went after our work was done.

We're live!

"Anonymous Tokens" went live in Smittestopp on April 8th 2021! 🎉


Anonymous Tokens was the culmination of a huge collaboration between the development team at the Norwegian Institute of Public Health (FHI), other contributors, and Martin, Tjerand and myself. It truly was a nation-wide open source "dugnad" (a Norwegian term for communal volunteer work) to get our contribution into Norway's contact tracing app "Smittestopp" on GitHub!
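To give a feel for the unlinkability idea behind anonymous tokens, here is a toy sketch in Python. This is not the actual Smittestopp protocol (which is built on elliptic-curve cryptography), but a classic RSA blind-signature flow with deliberately tiny, insecure parameters: the client blinds a token before the server signs it, so the server can later verify the token without being able to link it back to its issuance.

```python
# Toy RSA blind-signature demo of the anonymous-token idea.
# NOT the real Smittestopp protocol and NOT secure: the key is
# tiny and chosen purely for illustration.

# Server's toy RSA key: n = 61 * 53; e is public, d is private.
n, e, d = 3233, 17, 2753

def blind(token: int, r: int) -> int:
    """Client hides the token with a random blinding factor r."""
    return (token * pow(r, e, n)) % n

def sign_blinded(blinded: int) -> int:
    """Server signs without learning the underlying token."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """Client strips the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(token: int, sig: int) -> bool:
    """Anyone with the public key can check the signature."""
    return pow(sig, e, n) == token

token, r = 42, 19  # token value and random blinding factor
sig = unblind(sign_blinded(blind(token, r)), r)
print(verify(token, sig))  # True: valid, yet unlinkable to issuance
```

The flow - blind, sign, unblind, verify - is the essence: the server sees only the blinded value at signing time, yet the final signature verifies against the original token.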

The tale has been told

The story of the collaboration between contributors to Smittestopp has been told in written form by Den norske dataforening (DND) and by Johannes Brodwall on Kode24.no. Johannes highlights the value of FHI's willingness to move towards transparency and their openness to external contributions. You can also hear a podcast episode from Utviklerpodden with Johannes on this topic.

Our work on Anonymous Tokens has also been mentioned in several Norwegian news media: forskning.no, Grannar.no and Computer World.

We won a privacy-award!

Award for built-in privacy by Datatilsynet
Source: https://www.datatilsynet.no/aktuelt/aktuelle-nyheter-2021/pris-for-innebygd-personvern-til-anonyme-tokens/

We won the "Built-in Privacy" award from the Norwegian Data Protection Authority (Datatilsynet)! This took us by surprise, to say the least. It seems they really wanted to get the message across that this is something they want to encourage and see more of. We agree! 😊

So... what did we learn from all of this?

I'm writing this post almost exactly a year after we started our work on Anonymous Tokens, and I've been reflecting on how we managed to develop it in just 4 weeks. How did we manage that in such a short time? We've also observed privacy challenges in contact tracing apps around the world. What can be done to move forward in this space and ensure that citizens' and users' privacy is respected in (government) IT services?

Key factors for our success


Cross-competency teams

A team with cross-competency skill sets moves faster: it can do more on its own and holds the key competencies needed to reach its goals. It also encourages team members to develop deep skills in their areas of expertise and to collaborate with others, rather than each individual focusing only on their own work.

Deep skills and overlap between skills

When you have a team composed of people who each have expertise in their own subject, but who can look over into other subjects just enough to keep their head above water, the team will understand each other better and move quicker. This was certainly true for us: coding and cryptography were the two subjects where we each had expertise, and we understood enough of each other's field to communicate and collaborate effectively.

Good communication skills

When people communicate clearly and concisely, and bring just enough context to the discussion to enable a valuable dialogue, you are able to progress faster. Poor communication - where people bring in foreign elements such as unfamiliar abbreviations and terms, along with a lot of out-of-context information - clutters the dialogue and makes the discussion cognitively harder to follow. A healthy dose of empathy doesn't go amiss either.

In both open source work and the projects I've been part of, I've seen both ends of the scale. I wish a focus on communication skills and empathy were a larger part of software development.

Open source enables transparency & trust

GitHub provided tooling we could use to build software with built-in measures for security, trust and quality. By using open source we ensured that our contribution to Smittestopp was transparent for public scrutiny and open for external contributions. The same benefits are available to any government IT service built with an open source mindset.

FHI's change of direction towards transparency for Smittestopp 2 enabled a nation-wide collaboration and made all of this possible!

Key takeaways for those who build digital services

Ensure built-in privacy

If you want to know more about how you can implement built-in privacy in software development, have a look at this handbook by Datatilsynet. They've written guidelines and best practices on privacy in software development, focusing on requirements, design, testing, coding, maintenance and more. Share this with your team!

Have a data minimisation mindset

Large datasets with rich user data are honeypots for threat actors. And threat actors aren't just external hackers: your own employees, your company and even you yourself can unknowingly be threat actors that should be modelled and mitigated.

Handling user data containing personally identifiable information (PII) is a risk. How big a risk depends on how well you've secured your users' data. By minimising the amount of data you handle and store, you also minimise the potential damage if data is exposed through hacks or leaks.

Ask yourself: do I really need this data for my service to provide value, and do I have a legitimate reason for storing it? If you answered no to either question, you might be at risk of breaching the GDPR. Moving towards a data minimisation mindset will please privacy-aware users and mitigate data-processing risks for your service.
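As a concrete illustration of the mindset, here is a small Python sketch. The field names and the policy choices are my own hypothetical examples, not taken from Smittestopp or any real service: instead of storing a raw phone number and an exact timestamp, the service keeps only a salted hash of the identifier and a coarsened time bucket - still enough to provide the service, but far less damaging if leaked.

```python
import hashlib
from datetime import datetime, timezone

# Illustrative only: field names and policy are hypothetical,
# not taken from any real service.

SALT = b"rotate-me-regularly"  # in practice: secret, per-deployment

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. phone number) with a salted hash."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:16]

def coarsen(ts: datetime) -> str:
    """Keep only the hour, dropping minute/second precision."""
    return ts.strftime("%Y-%m-%dT%H:00Z")

raw_event = {
    "phone": "+47 99 99 99 99",
    "time": datetime(2021, 4, 8, 13, 37, 21, tzinfo=timezone.utc),
    "device_model": "...",  # ask: do we need this at all? If not, drop it.
}

stored_event = {
    "user": pseudonymise(raw_event["phone"]),
    "hour": coarsen(raw_event["time"]),
    # device_model deliberately not stored
}
print(stored_event["hour"])  # 2021-04-08T13:00Z
```

Note that a salted hash is pseudonymisation, not anonymisation: anyone holding the salt could brute-force phone numbers back out, so under the GDPR such data still counts as personal data - minimised, but not anonymous.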

Ensure trust through transparency

Trust is a frail thing. Without it, you risk users not adopting your service. Building digital services on an open source platform is a great way of being transparent. This is true not just for commercial companies but especially for governments who build digital services for their citizens.

Transparency builds trust. Transparency in government IT can reveal inequality and injustice. Citizens can also validate if automated processes within these digital services are compliant to laws. This is a win-win scenario!

Could transparency in government IT even lead to stronger democracies?

Up next...