Product privacy done right — by design and by default

Editor’s note: At Lacework, we view privacy as an essential component of security solutions. This article focuses on the critical concept of “privacy by design.” With their extensive and diverse experience in the cybersecurity, privacy, and legal domains, our authors explain the origin of privacy by design, its importance in the context of security, and how Lacework incorporates this concept into our products, services, and solutions. This analysis is tailored for a legal audience but is relevant for anyone with an interest in the nuances of privacy and security.

 

What is privacy?

The International Association of Privacy Professionals (IAPP), a respected community for global privacy practitioners, has a useful glossary of definitions for all things privacy-related. The IAPP refers to an influential 1890 Harvard Law Review article by Samuel Warren and Louis Brandeis, who later became a Supreme Court Justice, that famously defined privacy as “a right to be let alone.”

In the context of information privacy, meaning the use of data relating to individuals (“data subjects” in General Data Protection Regulation [GDPR] parlance1, that is, the “identified or identifiable living individual to whom personal data relates”), privacy generally refers to the right of individuals “to determine for themselves when, how and to what extent information about them is communicated to others.” Businesses often have a related concern in controlling the processing and use of their confidential information, and the two are frequently treated in concert when building privacy-respectful products.

This concept of information privacy is particularly relevant to data subjects in today’s highly connected environment, where many people carry their entire digital lives with them on a mobile device, accessible from anywhere in the world through cloud-based networks.

How does privacy relate to security?

For most practical purposes, privacy and security are two sides of the same coin, and they are often closely related in the minds of data subjects. The IAPP has a useful discussion of the difference between privacy and security: privacy relates to the controls, procedures, and guardrails around the use of data relating to a data subject, while security relates to preventing unauthorized access to this personally identifiable information. This security approach follows the CIA triad2 of confidentiality, integrity, and availability, using techniques such as firewalls and cloud security in the online world and locked filing cabinets in the offline world. Security also involves preventing misuse of personally identifiable information in the event it is accessed (e.g., through encryption, compartmentalization of information, etc.). In layperson’s terms, privacy is generally about “what someone is allowed to do with my data,” while security is generally about “what attackers are able to access and do with my data.”

Data can be quite secure yet still be used in ways that are inconsistent with privacy. For example, certain regimes have quite securely kept secret files or dossiers on individuals that they used without consent to infringe on the human rights of certain groups, or to embarrass and deter political dissidents. Similarly, data can be protected by privacy rules while remaining poorly secured. In the US, for example, a table of healthcare information (protected under the Health Insurance Portability and Accountability Act [HIPAA]) or financial information (protected under the Gramm-Leach-Bliley Act [GLBA]) relating to individuals can be stored in an open S3 bucket in the cloud. Secure? No. But the privacy rules remain in place, though poorly enforced. In many situations, even though there are strong legal obligations in force relating to the privacy of personal information, there is often insufficient security to protect that personal information.
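
To make the S3 example above concrete, the sketch below shows how such an exposure can be prevented proactively. It is a minimal illustration using the AWS boto3 SDK, with a hypothetical bucket name and assuming suitable AWS credentials; it is not a description of any Lacework feature.

# Minimal sketch: check and enforce S3 "block public access" settings.
# The bucket name is hypothetical; AWS credentials are assumed to be configured.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "example-patient-records"  # hypothetical bucket holding regulated data

try:
    current = s3.get_public_access_block(Bucket=bucket)
    print("Current public access block:", current["PublicAccessBlockConfiguration"])
except ClientError as err:
    if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
        print("No public access block configured; this bucket could be left open.")
    else:
        raise

# Apply the most restrictive (privacy-preserving) configuration.
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

The same restriction can equally be expressed in infrastructure-as-code or at the account level; the point is that the protective configuration is applied up front rather than after an exposure is discovered.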

What is privacy by design? 

Privacy by design is closely related to the concept of data protection by design. Specifically, designers of products or services that process personal information of data subjects are “…encouraged to take into account the right to data protection when developing and designing such products.” In practical terms, this means that designers and architects of information-based products and services should build privacy protections into their products and make privacy the default option, so that personal data is protected.

Where did privacy by design originate?

The concept of privacy by design originated in work by Ann Cavoukian, then the Information and Privacy Commissioner of Ontario, Canada, in partnership with others dating back to 1995, and was later expanded upon at the 2009 “Privacy by Design: The Definitive Workshop.”

What regulations require privacy by design?

The GDPR requires privacy by design and by default. Specifically, Article 25 of the GDPR is titled “Data protection by design and by default,” and Recital 78 notes that “the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default.”

Why is this relevant to customers of security providers?

Companies and organizations that are established in the European Union, or that target residents of the European Union, are subject to the GDPR. This is referred to as the extraterritorial3 effect of the GDPR: Article 3 defines it under the establishment criterion (Article 3(1)) and the targeting criterion (Article 3(2)).

This means that a provider of tools or services that are used by European Union residents, or that process the personal information of European Union residents, is subject to the GDPR. Similarly, this also applies to UK residents under the UK GDPR4.

Why does privacy by design matter in the context of security?

Security is intended to prevent persons without permission (i.e., attackers) from accessing information or physical property. However, if an attacker does manage to bypass security controls and access the underlying information, then having the right design approach can greatly reduce the scope of both the security and privacy violations that can occur, sometimes referred to as limiting the “blast radius.” Security is about what it is possible to do with the data; privacy is about what it is right to do with the data. You need to consider both when designing products or services.

In particular, a designer of products or services that process personally identifiable information should endeavor to follow the principles below, which were outlined originally5 by the Office of the Information and Privacy Commissioner (IPC) of Ontario, Canada:

  1. The design should be proactive, not reactive, and consider privacy concerns up front in the design of products and services. Similarly, techniques used should be preventative to avoid a privacy incident, rather than just reactive. Reactive tools are certainly helpful and can provide a backstop in the event of a security or privacy incident, but ideally privacy should be considered as part of the initial design.
  2. Privacy should be the default setting, meaning that the options as initially presented to the user should be the most privacy-preserving options available, which a user can then choose to change if they so desire (a brief sketch of this idea follows the list).
  3. Privacy should be embedded into the design from the get-go, and not “bolted on” after the fact.
  4. Privacy should be additive to the design functionality and security, rather than detract from it. While tradeoffs may be necessary in practical terms, the goal is to keep both security and privacy at the forefront of any decisions.
  5. Privacy should follow the entire lifecycle of data, from initial collection to final disposal.
  6. The privacy techniques used should be open and transparent, and subject to independent verification.
  7. User-centric privacy should be a key element of the design, where the privacy interests of the users are protected by default.
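
As a purely hypothetical illustration of principle 2, the sketch below shows a settings object whose defaults are the most privacy-preserving options, so that doing nothing leaves the user in the most protected state. The names and values are invented for illustration and do not describe any particular product.

# Hypothetical illustration of "privacy as the default setting":
# every field defaults to the most privacy-preserving option, and the
# user must explicitly opt in to anything less private.
from dataclasses import dataclass

@dataclass
class TelemetrySettings:
    share_usage_analytics: bool = False      # opt-in, never opt-out
    include_user_identifiers: bool = False   # no identifiers unless requested
    retention_days: int = 30                 # keep data only as long as needed

# A new user receives the private defaults without taking any action.
settings = TelemetrySettings()

# Sharing anything more is an explicit, user-driven choice.
settings.share_usage_analytics = True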

By taking these steps, the designer of products and services that process personally identifiable information can demonstrate both their intent and commitment to privacy by design and by default.

What is Lacework’s philosophy on privacy by design?

As a provider of products that help secure our customers’ most critical data and information assets, Lacework takes great care to design products that protect our customers’ information and, importantly, do not create additional risk for our customers. Our philosophy is that we should strive to make our customers more secure and more private, both through our products and through the internal processes we use to support them.

How do we apply privacy by design to our products?

At Lacework, our approach, which we call “private by design,” is to build privacy into our products, services, and solutions, and into how we deliver them, so we can help protect both our customers and their customers.

Lacework solutions, including agentless scanning, have been architected from the ground up to incorporate privacy by design and by default as keystone features. This is discussed in detail in this blog, which outlines our “three tenets of trust” philosophy, and in this blog, which describes how the Lacework agentless scanning feature does not export snapshots from the customer environment.

Next-generation Lacework products are also being designed with a strong privacy ethos at their core, with privacy built in from the start. Earning customer trust starts with respecting and protecting the privacy of our customers’ information, and that of their customers, as well as robustly securing it against threats and attackers from both outside and inside.

How do we apply privacy by design to our internal processes supporting Lacework products? 

Lacework considers privacy throughout our product lifecycle and in practice, including how we build our products. Privacy is never a job “done.” We consider privacy both during the development of new features and when determining how to continue adding privacy-specific enhancements to our existing products. One example of how Lacework applies privacy by design is in how we run our business: we undertake a privacy and security assessment of every new vendor as a standard part of our procurement onboarding process.

Why does this matter to Lacework customers and prospects?

Our customers care deeply about both the security and privacy of the data they hold. In many cases, our customers are processors of personally identifiable data, including personal health information, financial information, or other particularly sensitive data, on behalf of their own end-customers, who are often data controllers. Where our customers are processors (or subprocessors), they are obligated to have appropriate technical and organizational measures in place to protect such data. If they engage an additional organization to further process their end-customers’ data (in effect, another subprocessor), they are generally required to inform the data controller and provide a right of objection.

Because the Lacework agentless solution is architected to not take a customer’s underlying data outside of the customer environment, the agentless solution does not need to be listed as a subprocessor by our customers, which is a benefit to our customers and reduces their compliance burden.

Conclusions and recommendations

We encourage customers, partners, and users of security tools and services to consider privacy as an important feature in their security solutions. It is necessary for building the trust of users and demonstrating respect for their personal data, as well as for demonstrating compliance with global privacy regulations.

 

 

About the authors 


Michael Moore: As Vice President of Privacy and IP, Michael is responsible for privacy and cybersecurity, procurement, product counseling, transactional support, patents and intellectual property strategy, open-source software, and other matters. Michael is a seasoned attorney with more than a decade of privacy, cloud, transactional, software, and hardware counseling and patent and IP experience, which follows his technical career in logic design and software engineering. Michael holds the IAPP privacy qualifications CIPP/US, CIPP/E, CIPP/C, CIPM, and CIPT.

 

Lea Kissner: As Chief Information Security Officer (CISO) at Lacework, Lea leads the development and implementation of our overall security strategy and programs. Lea has 20 years of experience leading security, privacy, and anti-abuse efforts at global organizations, including serving as CISO at Twitter, Chief Privacy Officer at Humu, and Global Lead of Privacy Technology at Google.

 

Alan Mulvaney: As Senior Manager, Legal at Lacework, Alan has more than 20 years of experience in the IT industry, specializing in the negotiation of commercial contracts with a particular focus on Europe. Alan has practical day-to-day experience and knowledge of the highly privacy-sensitive markets of Germany and France, and in particular has insight into the key concerns of privacy-sensitive customers, especially in highly regulated industries. Alan holds a Practitioner Certificate in Data Protection, in addition to his Bachelor of Business Studies and his Associate Membership of the Chartered Institute of Personnel and Development.

 

 

1 https://ico.org.uk/for-organisations/data-protection-fee/legal-definitions-fees/

2 https://iapp.org/resources/glossary/#c-i-a-triad-2

3 https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-32018-territorial-scope-gdpr-article-3-version_en

4 https://ico.org.uk/for-organisations/data-protection-and-the-eu/data-protection-and-the-eu-in-detail/the-uk-gdpr/

5 https://www.ipc.on.ca/wp-content/uploads/2018/01/pbd.pdf
