How have pilot CBDC projects (e.g., China, Bahamas, Sweden) handled user privacy and control?

Checked on December 1, 2025

Executive summary

Pilot CBDCs have taken divergent approaches to privacy. China’s e‑CNY emphasizes “controllable” or “managed” anonymity, with encrypted storage but traceability for larger or suspicious transfers [1] [2]. The Bahamas’ Sand Dollar embeds KYC requirements and a legal framework that constrain anonymity, especially above small amounts [3] [4]. Sweden’s e‑krona research prioritizes privacy‑by‑design experiments and technical proofs of concept while testing trade‑offs with AML rules and system design [5] [6]. International institutions (IMF, BIS, ECB) push privacy‑by‑design and privacy‑enhancing technologies (PETs) but warn that AML/CFT and operational needs shape real outcomes [7] [8] [9].

1. China: “Controllable anonymity” — privacy promised, traceability engineered

China’s pilots and public statements frame the digital yuan as offering limited privacy: officials repeatedly describe a system of “controllable” or “managed” anonymity in which small transactions can remain relatively private while larger or flagged flows are traceable to fight money laundering and other crimes [1] [2]. State messaging says transaction data are encrypted and that arbitrary queries are prohibited without legal authorization [10] [11], but multiple observers note that the architecture is centrally controlled and built to allow targeted oversight when regulators demand it [12] [13]. Reporting and academic reviews conclude that China’s model intentionally tilts privacy toward state‑accessible traceability as a feature rather than a bug [14] [13].
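
To make “managed anonymity” concrete, here is a minimal, hypothetical sketch of how such a tiering rule might be expressed in code. The wallet tiers, thresholds, and function names are illustrative assumptions; the e‑CNY’s actual rules and parameters are not public.

```python
from dataclasses import dataclass

# Hypothetical wallet tiers loosely modeled on public descriptions of the
# e-CNY pilot: lower tiers require less identity data but carry lower caps.
# All thresholds below are illustrative, not official e-CNY parameters.
TIER_LIMITS = {
    "anonymous": 2_000,   # phone-number-only wallet: small payments only
    "verified": 50_000,   # ID-verified wallet: higher per-transaction cap
}

LARGE_TRANSFER_THRESHOLD = 10_000  # amounts above this get regulator review

@dataclass
class Transaction:
    amount: float
    wallet_tier: str
    flagged_by_aml_model: bool = False

def disclosure_required(tx: Transaction) -> bool:
    """Return True if the operator must expose identity data to regulators.

    Encodes the 'small payments stay pseudonymous, large or flagged flows
    are traceable' pattern described in official statements.
    """
    if tx.flagged_by_aml_model:
        return True
    if tx.amount > LARGE_TRANSFER_THRESHOLD:
        return True
    return tx.amount > TIER_LIMITS.get(tx.wallet_tier, 0)

# A small payment from an anonymous-tier wallet stays pseudonymous,
# while a large transfer triggers identity disclosure.
print(disclosure_required(Transaction(500, "anonymous")))    # False
print(disclosure_required(Transaction(25_000, "verified")))  # True
```

The design point the sketch makes is that privacy here is a policy variable: the same ledger can dial disclosure up or down by changing thresholds, which is why observers focus on who sets and audits those thresholds rather than on the cryptography.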

2. The Bahamas: legal limits, KYC and small‑value privacy

The Sand Dollar, one of the first live retail CBDCs, was designed with explicit consumer‑protection and data‑protection rules, but it embeds identity checks and transaction limits that rule out cash‑like anonymity in practice [15] [4]. Academic and policy analyses underline that KYC/AML rules and the need for law‑enforcement tools make true anonymity infeasible for many use cases; designers allow reduced friction or anonymity only for very small amounts or predefined vouchers, not broad anonymous use [3] [4]. The Bahamas’ legal framework and data‑protection regime are cited as safeguards, but adoption has been modest, and day‑to‑day privacy guarantees depend on operational choices by issuers and intermediaries [16] [17].
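
To show how KYC tiers translate into hard limits, the sketch below enforces hypothetical holding and monthly‑volume caps per wallet tier. The figures are assumptions for illustration; the Central Bank of The Bahamas publishes its own tier limits, which should be consulted directly.

```python
from dataclasses import dataclass

# Hypothetical tier limits in the spirit of the Sand Dollar's tiered KYC:
# low-KYC wallets get small caps, fully verified wallets get larger ones.
# Figures are illustrative assumptions, not the official Bahamian limits.
TIERS = {
    "basic":    {"holding_cap": 500,   "monthly_cap": 1_500},   # minimal KYC
    "verified": {"holding_cap": 8_000, "monthly_cap": 10_000},  # full KYC
}

@dataclass
class Wallet:
    tier: str
    balance: float = 0.0
    spent_this_month: float = 0.0

def receive(wallet: Wallet, amount: float) -> None:
    """Credit a wallet, rejecting deposits that would breach its holding cap."""
    cap = TIERS[wallet.tier]["holding_cap"]
    if wallet.balance + amount > cap:
        raise ValueError(f"holding cap {cap} exceeded for tier '{wallet.tier}'")
    wallet.balance += amount

def spend(wallet: Wallet, amount: float) -> None:
    """Debit a wallet, enforcing both balance and monthly-volume limits."""
    cap = TIERS[wallet.tier]["monthly_cap"]
    if wallet.spent_this_month + amount > cap:
        raise ValueError(f"monthly cap {cap} exceeded for tier '{wallet.tier}'")
    if amount > wallet.balance:
        raise ValueError("insufficient balance")
    wallet.balance -= amount
    wallet.spent_this_month += amount

w = Wallet(tier="basic")
receive(w, 400)   # fine: under the hypothetical 500 holding cap
spend(w, 100)     # fine: under the hypothetical 1,500 monthly cap
```

The point is that “privacy for small amounts” is enforced as a ceiling on how useful a low‑KYC wallet can be, not as a cryptographic guarantee.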

3. Sweden: experimentation, privacy‑by‑design and unresolved trade‑offs

Sweden’s e‑krona work is framed as an exploratory, technical exercise: the Riksbank has run proofs of concept and investigated multiple architectures (including DLT and centralized solutions) expressly to test privacy designs alongside usability and financial‑stability risks [5] [18]. Academic and central‑bank literature shows Sweden prioritizing experiments with technical tools such as zero‑knowledge proofs to preserve user control over data while meeting regulatory requirements, but it remains unclear which practical compromises will be adopted if the project moves from pilot to production [18] [6].
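
To give a flavor of the privacy‑enhancing techniques the e‑krona literature discusses, below is a toy Schnorr‑style zero‑knowledge proof in which a prover demonstrates knowledge of a secret without revealing it. This is a textbook illustration of the general technique, not the Riksbank’s design, and the tiny parameters are insecure by construction.

```python
import secrets

# Toy Schnorr zero-knowledge proof of knowledge of a discrete log.
# Tiny parameters keep the arithmetic readable; a real system would use
# a ~256-bit elliptic-curve group. p = 2q + 1 with q prime, and g = 4
# generates the subgroup of order q.
p, q, g = 2039, 1019, 4

# Prover's secret key x and public key y = g^x mod p.
x = secrets.randbelow(q)
y = pow(g, x, p)

# Step 1 (prover): commit to a random nonce r.
r = secrets.randbelow(q)
t = pow(g, r, p)

# Step 2 (verifier): issue a random challenge.
c = secrets.randbelow(q)

# Step 3 (prover): respond; s reveals nothing about x without knowing r.
s = (r + c * x) % q

# Step 4 (verifier): accept iff g^s == t * y^c (mod p).
# The verifier learns that the prover knows x, but not x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted; secret never revealed")
```

In a CBDC setting the same idea generalizes: a wallet could prove statements such as “this payment respects my tier limit” without handing raw identity or balance data to the payment rail.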

4. Common design patterns and the regulatory choke points

Across pilots, analysts and institutions identify the same pattern: technical measures (encryption, PETs, zero‑knowledge proofs, offline tokens) can increase privacy, but AML/CFT rules, data‑sharing obligations, and the choice of a centrally supervised ledger limit anonymity [7] [19] [9]. The IMF and BIS advise “privacy‑by‑design” and PETs while acknowledging that law‑enforcement and supervisory needs will shape who can access what data and when [7] [8]. Scholarly reviews find that stronger privacy features raise users’ willingness to adopt CBDCs, yet international AML norms and practical enforcement mostly tip architectures toward conditional traceability [20] [21].
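
One recurring measure, offline tokens, can be sketched with ordinary digital signatures: the issuer signs a token that a device can present peer‑to‑peer without contacting the ledger. The sketch below uses the Python cryptography package’s Ed25519 primitives; the token format and field names are assumptions, not any pilot’s actual protocol.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer (central bank or intermediary) key pair; the public key would be
# preloaded on every wallet device so tokens verify without connectivity.
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

def mint_token(serial: int, value: int) -> tuple[bytes, bytes]:
    """Issue a bearer token: a signed (serial, value) pair. Hypothetical format."""
    payload = json.dumps({"serial": serial, "value": value}).encode()
    return payload, issuer_key.sign(payload)

def verify_offline(payload: bytes, signature: bytes) -> dict:
    """Payee-side check, usable with no network: only the signature is tested."""
    try:
        issuer_pub.verify(signature, payload)
    except InvalidSignature:
        raise ValueError("forged or altered token")
    return json.loads(payload)

payload, sig = mint_token(serial=1, value=20)
print(verify_offline(payload, sig))  # {'serial': 1, 'value': 20}
```

The trade‑off is visible in the sketch: signature checks work fully offline, but detecting double‑spends requires eventual reconciliation against a supervised ledger, which is exactly where traceability re‑enters the design.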

5. Competing narratives and hidden incentives

Central banks publicly emphasize privacy to build trust (China’s officials speak of privacy protection; the Bahamas stresses legal safeguards; Sweden stresses research into privacy‑preserving technology) while also signaling that they need oversight tools to meet AML/CFT goals [11] [15] [5]. Independent watchdogs and rights organizations warn that in jurisdictions with weak rule of law, traceability can become surveillance; scholars note that commercial incentives to monetize payment data are also relevant but less visible in central‑bank statements [22] [21]. International actors push rules and standards, reflecting an agenda to reconcile privacy with cross‑border supervision, but available sources show divergence on how far privacy will be protected in practice [23] [12].

6. Bottom line — privacy designs exist, but legal and operational limits decide control

Technical designs and PETs can meaningfully protect payment privacy if centrally enforced rules and legal safeguards limit access to transaction data; however, pilot evidence shows that most real systems implement conditional anonymity, with KYC, caps, or audit gates, to satisfy AML and oversight requirements [19] [4] [1]. Policymakers and the public must therefore scrutinize not only technical promises but also the legal access rules, audit procedures, and institutional incentives that determine who controls CBDC data in daily operation [7] [9].

Limitations: sources compiled here cover official statements, central‑bank papers and academic reviews up to mid‑2025; available sources do not mention every pilot detail (for example precise code implementations or secret access logs) and real‑world operational practices may evolve after these reports [7] [23].

Want to dive deeper?
What privacy protections are built into China’s digital yuan pilot and how have they changed over time?
How does the Bahamas’ Sand Dollar balance anonymity and anti-money-laundering controls?
What lessons did Sweden’s e-krona pilots offer about user data control and offline payments?
How do CBDC privacy designs compare to existing bank and payment app privacy practices?
What technical and legal mechanisms can ensure user control and minimize surveillance in CBDC implementations?