Privacy By Design Done Right: Success Stories and Case Studies

Privacy by Design (PbD) is a proactive approach that integrates privacy considerations throughout the development lifecycle of a technology, product, or service. It emphasizes embedding privacy into the core design from the very beginning rather than bolting it on as an afterthought. Implementing PbD well can be challenging without the right guidance. Notably, tech pioneers such as Meta and Alphabet have largely failed to implement it: the Cambridge Analytica scandal, which broke in 2018, exposed glaring deficiencies in Facebook’s approach to privacy and lent new urgency to regulatory regimes such as the GDPR (which took effect that same year) and the CCPA. Newer services such as Signal, DuckDuckGo, and ProtonMail emerged in response to these concerns; for them, privacy was the first consideration in their creation, making them natural exemplars of Privacy by Design. The privacy policies, techniques, and technologies of these companies reflect their respective levels of dedication to implementing PbD.

1. Proactive not Reactive; Preventative not Remedial

This principle holds that a company should take preemptive steps to protect users’ data rather than acting only after breaches or leaks occur. For example, when creating a mobile app, privacy protections such as encryption and access controls should be implemented from the beginning.
In an operating system, proactive rather than reactive privacy by design could mean shipping strong encryption and data-protection mechanisms as core features from the initial development stage. Instead of waiting for privacy breaches to occur and then addressing them through updates, the operating system can be designed with built-in tools that automatically encrypt user data stored on the device.

 

Real-life Implementation

 

  • Proactive privacy protection can be seen in Apple’s iOS, which sandboxes third-party apps and services and ships with data protection as a core feature rather than an add-on. The system employs well-established cryptographic primitives, including AES (Advanced Encryption Standard) for data encryption, SHA (Secure Hash Algorithm) for secure hashing, and HMAC (Hash-based Message Authentication Code) for message integrity. These protections are designed to resist data intrusion even by well-resourced actors, as highlighted by Apple’s public disputes with law enforcement agencies such as the FBI over device decryption.
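The pairing of hashing and message authentication mentioned above can be sketched with Python’s standard library. This is an illustrative use of HMAC-SHA256 for tamper detection, not Apple’s actual implementation (which also uses AES, not available in the Python stdlib):

```python
import hashlib
import hmac
import secrets

def protect(key: bytes, data: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so any tampering is detectable."""
    tag = hmac.new(key, data, hashlib.sha256).digest()
    return data + tag

def verify(key: bytes, blob: bytes) -> bytes:
    """Return the payload only if its tag checks out."""
    data, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed: data was modified")
    return data

key = secrets.token_bytes(32)            # per-device secret key
blob = protect(key, b"user contacts")
assert verify(key, blob) == b"user contacts"
```

Note the constant-time comparison (`hmac.compare_digest`), which avoids leaking information about the tag through timing differences.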

2. Privacy as the Default

Privacy as the default means that systems, processes, and technologies should protect user privacy out of the box. Privacy should be built into products and services from the outset, rather than requiring users to actively opt out of invasive data collection or sharing practices. The goal is to shift the burden away from users having to actively protect their privacy and towards organizations proactively safeguarding it as an integral part of their products and services.

 

Real-life Implementation

 

  • The messaging app Signal is one of the aptest exemplifications of privacy by default. Its privacy technologies are built into the app and were in fact the motive for creating it. From the start, state-of-the-art open-source cryptography has been built into the system with the express aim of protecting message content even from well-resourced adversaries such as government spy agencies and sophisticated hackers. Known as the Signal Protocol, this built-in defense covers both messaging and voice calls, providing a secure communication channel between users. The user is not required to put in any additional effort to mitigate privacy invasion; the protection is already in place from the start.
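The key-agreement idea at the heart of protocols like Signal’s can be sketched with a toy Diffie-Hellman exchange. This is illustration only: Signal actually uses elliptic-curve X25519 plus the Double Ratchet, and the prime below is far too small for real use. The sketch just shows how two parties derive the same secret without ever transmitting it:

```python
import secrets

# Toy finite-field Diffie-Hellman parameters (illustrative only).
P = 2**127 - 1   # a Mersenne prime; real deployments use much larger groups
G = 3

def keypair():
    """Generate a private exponent and the public value G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

# Alice and Bob each publish only their public values...
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# ...yet compute the same shared secret locally, since
# (G^b)^a = (G^a)^b mod P. An eavesdropper sees only a_pub and b_pub.
alice_secret = pow(b_pub, a_priv, P)
bob_secret = pow(a_pub, b_priv, P)
assert alice_secret == bob_secret
```

Because the shared secret never crosses the wire, no extra effort is required from the user: the app can derive fresh encryption keys silently in the background.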

3. Privacy Embedded into Design

Privacy Embedded into Design emphasizes integrating privacy considerations directly into the design and architecture of systems and processes from the outset, rather than treating privacy as an add-on or secondary concern. This ensures that privacy is a fundamental component of the product or service’s structure and functionality.

 

Real-life Implementation

  • The messaging app Telegram illustrates Privacy Embedded into Design through data minimization and built-in privacy controls. The app requires only a mobile number and profile name to use the service. Additional information such as birthday and location can be added at the user’s discretion, and users retain granular control over who can see that data. Telegram states that it does not use the contents of chats for advertising, and its end-to-end encrypted Secret Chats are not stored on its servers at all. This minimalist approach limits the sensitive data available for third parties or insiders to compromise. In addition, the app has built-in privacy features such as the ability of either user to delete a conversation in its entirety for both parties.
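The minimal-signup model described above can be sketched as a data structure: only the required fields are mandatory, and every optional field is hidden by default with per-field audience controls. The class and field names here are hypothetical, not Telegram’s actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserProfile:
    # The only fields required to use the service.
    phone_number: str
    display_name: str
    # Optional fields, absent unless the user volunteers them.
    birthday: Optional[str] = None
    location: Optional[str] = None
    # Per-field audience: "everybody", "contacts", or "nobody" (the default).
    audience: dict = field(default_factory=lambda: {
        "birthday": "nobody", "location": "nobody",
    })

    def visible_to(self, viewer: str) -> dict:
        """Return only what this class of viewer is allowed to see."""
        shown = {"display_name": self.display_name}
        for name in ("birthday", "location"):
            value = getattr(self, name)
            if value is not None and self.audience[name] in ("everybody", viewer):
                shown[name] = value
        return shown

u = UserProfile("+15550100", "Avery", birthday="1990-01-01")
u.audience["birthday"] = "contacts"
assert u.visible_to("contacts") == {"display_name": "Avery", "birthday": "1990-01-01"}
assert u.visible_to("stranger") == {"display_name": "Avery"}
```

The design choice worth noting is that the restrictive setting (`"nobody"`) is the default: the user opts *in* to exposure, never out of it.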

4. Full Functionality

Privacy measures should not come at the expense of functionality; both should be maximized to achieve a positive outcome for users. The principle of full functionality frames PbD as a positive-sum rather than a zero-sum game: an opportunity rather than a cost to the business. Users should be able to enjoy the full range of features and capabilities without sacrificing their privacy.

 

Real-life Implementation

 

  • Brave Browser offers built-in features like Private Windows with Tor and IPFS integration for accessing decentralized content. These tools let users reach the full breadth of the web without compromising their privacy: they can access everything they would find on Chrome or other conventional browsers while ensuring they are not being tracked or giving away sensitive information.
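The tracker blocking that browsers like Brave perform can be sketched as a blocklist lookup against each outgoing request’s domain. The domains and list below are made up; real browsers ship large curated lists maintained by the community:

```python
from urllib.parse import urlparse

# Hypothetical blocklist; real browsers use curated lists of tracker domains.
BLOCKLIST = {"tracker.example", "ads.example"}

def should_block(url: str) -> bool:
    """Block a request if its host or any parent domain is on the list."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check "pixel.tracker.example", then "tracker.example", then "example".
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

assert should_block("https://pixel.tracker.example/beacon.gif")
assert not should_block("https://news.example/article")
```

Checking parent domains as well as the exact host is what keeps trackers from evading the list with throwaway subdomains, without breaking unrelated sites.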

5. End-to-End Security

Privacy and security measures should be implemented throughout the entire lifecycle of data, from collection to storage and disposal, ensuring comprehensive protection. At its core, the end-to-end security principle aims to minimize the risk of unauthorized access, interception, or tampering with sensitive data by implementing robust security measures at every stage of its journey. This includes encryption, authentication, access controls, and other security mechanisms to safeguard data both in transit and at rest.

 

Real-life Implementation

 

  • One of the best examples of end-to-end security is DuckDuckGo’s handling of search queries. All searches travel over encrypted connections, and when DuckDuckGo sources results from a partner search engine it acts as an intermediary, stripping identifying information so that the partner never learns who issued the query. In effect, the encryption works like a personal lock on each search session: no one other than the end user can see the details of their search. Results are also ranked by the search term itself rather than curated to a user profile, as is the case with more conventional engines like Google and Bing.


  • ProtonMail implements the end-to-end security principle of privacy by design by encrypting user emails on the client side before they are transmitted to ProtonMail’s servers. This means that only the sender and intended recipient have access to the unencrypted content. Even ProtonMail itself cannot decrypt and access the contents of the emails, ensuring maximum privacy for users. Additionally, ProtonMail employs open-source cryptography and offers features like two-factor authentication and message expiration to further enhance security and privacy for its users.
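The client-side model described above can be sketched as: encrypt on the device, and let the server store only ciphertext. The toy SHA-256 keystream below is for illustration only; it is not a secure cipher, and ProtonMail actually uses vetted OpenPGP cryptography:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustrative only; real clients use vetted ciphers (e.g., OpenPGP)."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + nonce + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

# Client side: encrypt BEFORE the message ever leaves the device.
key = secrets.token_bytes(32)           # held only by sender and recipient
nonce = secrets.token_bytes(16)
ciphertext = keystream_xor(key, nonce, b"Meet at noon.")

# The server stores only (nonce, ciphertext) and cannot read the mail.
plaintext = keystream_xor(key, nonce, ciphertext)   # XOR is its own inverse
assert plaintext == b"Meet at noon."
```

The structural point is where the key lives: because it never reaches the server, “even the provider cannot decrypt” falls out of the architecture rather than a policy promise.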

6. Visibility and Transparency

Organizations should be transparent about their privacy practices, policies, and the collection and use of personal data, allowing users to understand and control how their information is handled. This principle emphasizes the need to make users aware of the data they hand over when accessing internet services. One of the most common forms of visibility and transparency is the display of the privacy policy on the “Terms of Service” (TOS) page. While these disclosures stem from privacy requirements, it is common for online services to use exceedingly complicated language that is difficult for the average user to comprehend. This is sometimes intentional and sometimes merely negligent, but either way it leads to users accepting TOS agreements without fully understanding the ramifications for their privacy.


Real-life Implementation



  • The prevalence of difficult verbiage in privacy policies inspired the health insurance service Alan to write a deliberately simplified privacy policy. Using plain Terms of Service (TOS) language devoid of convoluted phrasing furthers privacy by design by fostering transparency and user comprehension. Clear, straightforward language empowers users to make informed decisions about their data and promotes trust between users and service providers.

7. Respect for User Privacy

This principle emphasizes that users should maintain control over their privacy settings and over the data they expose to the company. Obtaining an individual’s explicit and voluntary consent is necessary before collecting, using, or disclosing their personal information, except where permitted by law. The level of detail and clarity required for consent increases with the sensitivity of the data, and individuals have the right to withdraw their consent at any time.



Real-life Implementation



  • Apple employs differential privacy across various aspects of its ecosystem, including device usage statistics, app usage patterns, and features like Siri and keyboard suggestions. When users opt in to share usage data with Apple, differential-privacy algorithms add calibrated noise to their data before it is aggregated and analyzed. This process ensures that Apple can gain valuable insights into aggregate user behavior and preferences without compromising any individual user’s privacy.
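The core idea of differential privacy can be sketched with randomized response, the textbook local mechanism (Apple’s production system is considerably more sophisticated): each device randomizes its own answer before sending it, yet the aggregate statistic remains recoverable:

```python
import random

def randomized_response(truth: bool) -> bool:
    """With probability 1/2 report the truth; otherwise report a coin flip.
    Any single report is therefore plausibly deniable."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports):
    """Invert the noise: P(report yes) = 0.5*p + 0.25, so p = 2*mean - 0.5."""
    mean = sum(reports) / len(reports)
    return 2 * mean - 0.5

random.seed(42)
population = [random.random() < 0.3 for _ in range(100_000)]  # true rate ~30%
reports = [randomized_response(t) for t in population]
estimate = estimate_true_rate(reports)
# `estimate` lands close to 0.30 even though no single report is trustworthy
```

This is the trade the principle describes: the collector learns the population-level rate it needs, while each individual’s report reveals almost nothing about them.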

 

  • Tor browser safeguards user privacy online by routing internet traffic through a network of volunteer-operated servers, encrypting data at each step. This process obscures the user’s location and conceals the specifics of their online activities, making it challenging for third parties to track or monitor them. By providing layers of anonymity, the Tor browser enhances user privacy by preventing adversaries from correlating browsing behavior with individual identities.
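The layered routing described above can be sketched as nested encryption: the sender wraps the payload once per relay, and each relay can peel exactly one layer. The XOR “cipher” below is purely illustrative, standing in for the real per-hop encryption Tor negotiates:

```python
import hashlib

def xor_layer(key: bytes, data: bytes) -> bytes:
    """Toy one-layer cipher: XOR with a key-derived pad (illustrative only).
    Applying it twice with the same key removes the layer."""
    pad = hashlib.sha256(key).digest()
    pad = (pad * (len(data) // 32 + 1))[:len(data)]  # repeat pad to cover data
    return bytes(a ^ b for a, b in zip(data, pad))

# Sender wraps the message in one layer per relay, innermost (exit) first.
relay_keys = [b"entry-key", b"middle-key", b"exit-key"]
packet = b"GET /page"
for key in reversed(relay_keys):
    packet = xor_layer(key, packet)

# Each relay peels exactly one layer; only the exit sees the payload,
# and no single relay knows both the origin and the destination.
for key in relay_keys:
    packet = xor_layer(key, packet)
assert packet == b"GET /page"
```

The privacy property comes from the layering itself: the entry relay sees who you are but not what you asked for, and the exit relay sees the request but not who made it.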

 

  • DuckDuckGo prioritizes user privacy by refraining from storing personal information or tracking searches. Instead of creating individual profiles, they serve contextual ads based solely on the search keywords used at that moment, ensuring user anonymity. By dissociating search queries from user identities and preferences, DuckDuckGo maintains user privacy while still providing relevant advertising content.


PbD is a broad concept whose implementation has been a hurdle for many tech companies. Traditionally, tech companies have treated customer data as a free commodity, which has drawn significant backlash from the public and regulators. Their profit models are also highly reliant on data tracking, so they are reluctant to implement PbD. The newer, smaller companies discussed in this post had the advantage of setting up their services for privacy maximization right from the start; tech giants, on the other hand, would have to overhaul colossal existing codebases to meet the threshold for PbD. That is why these services are considered better exemplars of PbD than the tech giants.
