A transparency stack for digital public goods
Digitally enabled services are inherently opaque. Digital public goods could set new standards for transparency, but only if there is an investment in tooling, exemplars and civil society capability.
Digitally enabled public services are different from analogue ones. Someone visiting a physical government benefits office should, broadly speaking, see the same service being provided if they visit another benefits office somewhere else in the country the following week. People’s experiences may differ, for example, due to means-testing, but they differ within a common framework. The rules can generally be understood by reading publicly available guidance or law, often without resort to specialist skills. It is unlikely that the process will change radically by citizen, by location, or over short-to-medium-term timeframes.1
Conversely, two of the defining characteristics of digital services are the propensity for a lack of shared reality - what you experience using Facebook may be radically different to what I experience - and an opacity about why things work the way they do - how Uber decides to route a car to a given location is a complex mix of proprietary code, internal market incentives and design decisions.
Why are digital services like this? Firstly, digital services can present in radically different ways to different people because the data they can access about people makes the cost of personalising outcomes low.2
There is also a low cost of changing code, relative to changing standard bureaucratic processes. This means digital services can change rapidly and radically - it is reasonable to expect a modern digital service to be deploying code or design changes on a daily basis, with those changes unseen until their external effects manifest themselves.
Finally, there are the information asymmetries digital services create between the organisation operating a digital service and anybody wanting to understand it (and the need for specialist skills to do so). This, in combination with personalisation and regular changes, creates a high cost of understanding the state of a digital service at a point in time for a specific user or group.
These three things, I would argue, are at the root of many of the concerns about automated decision making and the datafication of our lives. It’s hard to hold something to account if how it works is different for everyone, changes all the time, and is nearly impossible to piece together. These issues exist with digitally enabled services in both the public and private sectors.
One of the biggest potential differentiators of digital public goods - from proprietary systems (be they in the government context or competitors to more societal platforms like Uber), and from standard open-source projects - could be tackling these issues head-on. Digital public goods could do this by making it the norm to publicly document how services work, how they use data at every layer of the technology stack, and how this is changing over time.
Making radical transparency the norm for digital public goods is about more than open-sourcing software code. Although the act of open-sourcing code (assigning a licence and publishing the code in the open) is critical, so is the process of coding in the open. Well-documented code commits and release histories help explain to external audiences not just what changes have been made, but why. Similarly, systematically documenting changes to the user interface and user experience design through the use of design histories can show how changes manifest themselves to the public. In addition to code, development histories and design histories, lists of the different technical systems and components that make up a service should be publicly available. These should include lists of databases and points of data ingress and egress; the types of analytics that are collected; and any experiments or A/B tests that are operating. All this information should, wherever possible, be digitally signed by the organisation(s) responsible for the changes, and should build on existing approaches like software registries, reproducible builds, verifiable data structures and integration testing.
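To make this concrete, here is a minimal sketch of what a machine-readable, verifiable transparency record for one release of a service might look like. The manifest fields, service names and experiment identifiers are all hypothetical illustrations, not an existing standard; the sketch canonicalises the record and computes a digest that could, in practice, be signed by the operating organisation and appended to a public, append-only log.

```python
import hashlib
import json

# Hypothetical transparency manifest for one release of a digital service.
# All field names and values here are illustrative, not a real standard.
manifest = {
    "service": "example-benefits-service",
    "release": "2024-05-01.3",
    "databases": ["claims-db", "audit-log"],
    "data_egress": ["payments-api", "analytics-collector"],
    "analytics": ["page-views", "error-rates"],
    "experiments": [{"id": "ab-007", "description": "new eligibility form"}],
}

# Serialise canonically (sorted keys, fixed separators) so anyone can
# reproduce the exact same bytes and therefore the same digest.
canonical = json.dumps(manifest, sort_keys=True, separators=(",", ":"))
digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# In a real deployment this digest (or the manifest itself) would be
# digitally signed by the responsible organisation and published to a
# verifiable, append-only log, so the public can check what changed
# between releases and when.
print(digest)
```

Because the serialisation is deterministic, a civil society organisation could independently re-derive the digest from the published manifest and compare it against the signed value, without needing access to the service's internals.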
The ‘build once, use many times’ model of digital public goods means that transparency measures need to be implemented at the level of individual digital public goods projects and at the level of individual implementations. To make that easier, there needs to be an investment in three things.
The first is funding the development of tooling and guidance to help digital public goods projects systematically publish transparency information. The second is one or more exemplar projects that can show what good full-stack transparency looks like in the context of an implementation of a digital public goods project. Finally, work is needed to understand what new skills civil society organisations need to act on this information, and what new types of organisation might need to exist. For example, what type of organisation might conduct a code review of recent changes? Or recommend design changes to the user interface?
Over time, I think these things will become the norm for digitally enabled services operating in the public realm. Much as, in the built environment, planning or zoning processes may mandate the publication of notices that explain potential changes to people’s neighbourhoods, and public meetings let people have their say. Or how public registers, like those maintained by land registries and company registries, put important information and events on the public record so those facts can be checked and verified by the public or their proxies.
Achieving this will require an approach that can focus effort and funding not just on the desired policy outcomes, or on the needs of the immediate users, but on the need society has to understand and hold to account the operations of services in the public realm.
Assuming no arbitrary abuse of power by whoever is delivering the service ↩
Personalisation of outcomes can be positive or negative for the end user, intentional or unintentional by the organisation operating the service. The low cost of personalisation in turn means existing inequalities can be replicated and amplified faster. ↩