What legal and security criteria does the Digital Scotland Service Standard use to decide whether a codebase can be open-sourced?
Executive summary
The Digital Scotland Service Standard (DSSS) promotes making “all new source code open and reusable” and requires publication under appropriate licences, or a convincing justification for exceptions [1] [2]. At the same time it insists services be “secure” and protect users’ privacy, creating a built‑in tension in which security, privacy and legal constraints can legitimately block full open‑sourcing [3] [4]. The Standard is enforced through a simplified 14‑criteria framework and a developing risk‑based assurance process that expects teams to evidence their decisions about openness, security and licensing [5] [4].
1. How the Standard frames openness: a default of “open” with justified exceptions
The DSSS explicitly makes open source the default for new digital work: teams must “make all new source code open and reusable, and publish it under appropriate licences” or provide a convincing explanation for why parts cannot be opened, embedding openness as a presumption rather than an afterthought [1] [2].
2. Security and privacy are gating criteria, not afterthoughts
One of the 14 criteria requires creating “a secure service which protects users’ privacy,” meaning that any decision to publish code must be assessed against security and privacy obligations; the Standard therefore treats security/privacy as prime considerations that can restrict publication if open release would harm user safety or data protection [3] [4].
3. Legal criteria: licences, ownership and convincing justification
Beyond the technical, the Standard requires publishing under “appropriate licences,” which implicates copyright ownership, third‑party dependencies and licence compatibility as legal gates before code is released; teams must document licensing choices and, where third‑party or contractual constraints exist, provide the convincing explanation demanded by the Standard [1] [2].
4. Practical controls mentioned in guidance: sanitisation and “coding in the open”
Government and agency policies that follow the Standard recommend operational measures such as “coding in the open” for new projects and sanitising files to remove sensitive configuration, service‑version details or third‑party keys that could create vulnerabilities — concrete steps that allow code publication while mitigating security risks [6].
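The sanitisation step above can be sketched as a simple pre‑publication scan. This is an illustrative example only: the pattern names and regexes are assumptions, not drawn from the Standard or any official tooling, and real teams should use dedicated secret‑scanning tools and review matches by hand before publishing.

```python
import re
from pathlib import Path

# Illustrative patterns only; a real sanitisation pass would use a
# maintained secret-scanning tool and manual review of every match.
SENSITIVE_PATTERNS = {
    "api_key": re.compile(r"(?i)api[_-]?key\s*[=:]\s*['\"][A-Za-z0-9_\-]{16,}['\"]"),
    "password": re.compile(r"(?i)password\s*[=:]\s*['\"][^'\"]+['\"]"),
    "private_key": re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
}

def scan_for_secrets(root: str) -> list[tuple[str, int, str]]:
    """Return (path, line number, pattern name) for each suspect line."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(encoding="utf-8")
        except (UnicodeDecodeError, OSError):
            continue  # skip binaries and unreadable files
        for lineno, line in enumerate(text.splitlines(), start=1):
            for name, pattern in SENSITIVE_PATTERNS.items():
                if pattern.search(line):
                    findings.append((str(path), lineno, name))
    return findings
```

A scan like this would run over the working tree before first publication; note that removing a secret from the current files is not enough if it remains in version‑control history.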
5. Assessment and risk‑based decisioning: who decides and how
The move from the older Digital First Standard to the DSSS introduced a simplified 14‑criteria model and a planned risk‑based assurance approach being developed by the Digital Assurance Office; that approach is intended to formalise how services are assessed for openness versus security/legal exceptions, meaning decisions are expected to be evidential and proportionate to service risk [4] [5].
6. Where the Standard leaves discretion — and where friction appears
While the Standard is prescriptive in its expectation that code be open‑sourced, it explicitly allows convincing explanations for the non‑release of specific subsets of code, leaving room for interpretation and local policy [1] [2]; this necessary discretion also creates friction between the drive for reusability and practical constraints such as proprietary third‑party components, national security, procurement clauses or privacy obligations [6] [7].
7. Competing incentives and implicit agendas
Open‑by‑default advances collaboration, reuse and transparency across Scotland’s public sector, an explicit policy aim reflected across guidance and agency policies [1] [6], while agencies and suppliers may implicitly prioritise risk avoidance, commercial confidentiality or procurement simplicity — pressures the Standard seeks to balance via licensing rules, sanitisation guidance and the “convincing explanation” route [2] [6].
8. Bottom line: criteria to decide publication are multi‑factor and evidence‑based
In practice, the decision to open‑source a codebase under the DSSS turns on at least three documented tests, all to be justified through the Standard’s assurance process [3] [1] [6] [4]:
(a) does publication expose user data or technical secrets that undermine security or privacy?
(b) are there legal or contractual licence or ownership barriers?
(c) can sanitisation, licensing and controlled release satisfy both openness and risk mitigation?
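The three tests can be expressed as a minimal decision sketch. Everything here is hypothetical: the record fields and the `can_publish` helper are illustrative labels for the tests above, not part of the Standard or its assurance process, which requires documented evidence and human judgement rather than a boolean check.

```python
from dataclasses import dataclass

@dataclass
class PublicationAssessment:
    """Hypothetical record of the three tests; field names are
    illustrative, not terms used by the Standard itself."""
    exposes_secrets_or_user_data: bool      # test (a): security/privacy exposure
    has_licence_or_ownership_barrier: bool  # test (b): legal or contractual barriers
    mitigations_in_place: bool              # test (c): sanitisation, licensing, controlled release

def can_publish(a: PublicationAssessment) -> bool:
    # Open by default: block only where an unmitigated security risk
    # or an unresolved legal barrier remains.
    if a.exposes_secrets_or_user_data and not a.mitigations_in_place:
        return False
    if a.has_licence_or_ownership_barrier:
        return False
    return True
```

The point of the sketch is the default: publication is assumed unless one of the tests fails without mitigation, mirroring the Standard’s “open with justified exceptions” posture.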