CIPT 2020 Study Guide
Federated identity management
(Single Sign On) Practice of outsourcing authentication not just to a different service within an organization, but to a different organization. In federated identity management, the service that authenticates users is called the identity provider (IdP) and the services that rely on the IdP to authenticate users are called service providers (SPs)
Limitations of access management as a privacy tool
- Access control mechanisms can differ in the granularity at which they protect data as well as the type of access they're able to regulate - The system that authenticates a user learns various pieces of private information about a user and can ask users to supply answers to security questions which are typically personal questions; the accumulation of this personal data raises the chance of harm
Privacy enhancing techniques
- Aggregation - De-identification - Encryption - Identity & access management - Authentication
Taxonomy of privacy problems during information processing
- Aggregation - Identification - Insecurity - Secondary use - Exclusion
Type of information processing violations
- Aggregation - Insecurity - Identification - Secondary use - Exclusion
Impose privacy controls
- Architect - Secure - Supervise - Balance
Taxonomy of privacy problems during information dissemination
- Breach of confidentiality - Disclosure - Distortion - Exposure - Increased accessibility - Blackmail - Appropriation
Type of information dissemination violations
- Breach of confidentiality - Disclosure - Exposure - Increased accessibility - Blackmail - Appropriation - Distortion
Ongoing vigilance
- Code reviews - Code audits
Tactics of control
- Consent: Only process personal data for which explicit, freely given and informed consent is received - Choose: Allow data subjects to choose which personal data will be processed - Retract: Honoring the data subject's right to the complete removal of any personal data in a timely manner - Update: Provide data subjects with the means to keep their personal data accurate and up to date
Tactics of enforce
- Create: Privacy policy - Maintain: Privacy policy - Uphold: Ensure policies are adhered to
PIA triggers
- Creation of a new product or service - New or updated program for processing data - Onboarding of new data
Tactics of separate
- Distribute: process personal data for a task in separate locations - Isolate: process personal data (for different purposes) independently on separate databases or systems
Process oriented strategies
- Enforce - Demonstrate - Inform - Control
Tactics of minimize
- Exclude: Refrain from processing a data subject's personal data; if you don't need it, don't collect it - Select: Decide on a case-by-case basis to only process relevant personal data - Strip: Remove unnecessary attributes, in part or in full - Destroy: Remove personal data completely as soon as it becomes unnecessary
Mobile social computing
- Geotagging - Geosocial patterns
Tactics of abstract
- Group: Aggregate data over groups of individuals instead of processing data of each person separately - Summarize: Summarize detailed information into more abstract attributes - Perturb: Add noise or approximate the real value of a data item
Type of invasion violations
- Intrusion - Decisional interference
Tactics of demonstrate
- Log: Track all processing of data and reviewing this information gathered for any risks - Audit: Audit the processing of personal data regularly - Report: Analyze collected information on tests, audits and logs periodically, and report to the people responsible
Natural language generation
- NLG takes data from a search result and turns it into understandable language, answering almost exactly as a human would (pipeline: NLP → NLU → NLG)
Natural language understanding (NLU)
- NLP provides a set of rules & definitions for natural language as a foundation to break down human language and better understand the true meaning behind a search query - NLU is a smaller part of natural language processing; once the language has been broken down, the program understands, finds meaning and can perform sentiment analysis - Takes the slangy, figurative language we speak every day and works out what we truly mean
Chatbots
- NLU enables language intelligence in bot platforms, making interactions with users more natural - Bot platforms need to access the sentences in clear text, which precludes applying encryption techniques - Entities with access to dialog data could gather a level of intelligence superior to what's available in social networks (e.g., conversational communication elements like intentions, reasoning, etc.)
Geosocial patterns
- Patterns in an individual's location, such as home or workplace, can uniquely identify many people - Location-based profiles can potentially be used to extract private information about users that they may not wish to disclose, posing a threat to their privacy
Privacy Engineering Objectives
- Predictability - Manageability - Dissociability
Organization internal privacy policies
- Privacy policies that serve as a guide for all organizational activities & drive commitments made within the privacy notice - Often overwhelming to a user since they provide a complete & comprehensive overview of an organization's data practices
Bring Your Own Device (BYOD)
- Requires employees' own devices to be used to access potentially sensitive data or internal company networks - Employees' personal devices may contain untrustworthy software or hardware, and could be leveraged by attackers to steal user credentials or other sensitive data
Single/multi-factor authentication
- Single-factor authentication uses one mechanism; multi-factor authentication requires two different mechanisms from two of the categories: something you know, something you are, something you have
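A minimal Python sketch of checking two factors together: "something you know" (a password against a salted hash) and "something you have" (a time-based one-time code per RFC 6238). The user record, password and secret are hypothetical examples, not from this guide.

```python
import base64, hashlib, hmac, os, struct, time

def hash_password(password, salt):
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def totp(secret_b32, step=30):
    # Time-based one-time password: HMAC over the current 30-second counter.
    key = base64.b32decode(secret_b32)
    counter = int(time.time() // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 1_000_000
    return f"{code:06d}"

# Enrollment: store a salted password hash and share a TOTP secret with the user's device.
salt = os.urandom(16)
user = {"pw_hash": hash_password("correct horse", salt), "salt": salt,
        "totp_secret": base64.b32encode(os.urandom(10)).decode()}

def login(password, code):
    knows = hmac.compare_digest(hash_password(password, user["salt"]), user["pw_hash"])
    has = hmac.compare_digest(code, totp(user["totp_secret"]))
    return knows and has  # both factors must verify

print(login("correct horse", totp(user["totp_secret"])))  # True
```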
Tactics of hide
- Restrict: Prevent unauthorized access to personal data - Mix: Process personal data randomly within a large enough group to reduce correlation - Obfuscate: Prevent understandability of personal data (e.g., hashing) - Dissociate: Remove the correlation between data subjects and their personal data
Regulations & standards that affect data governance & support internal, organizational privacy standards
- Sarbanes-Oxley (SOX) which aims to improve corporate accounting - Basel II, which aims to improve credit risk calculations - HIPAA Privacy Rule, which requires accounting for medical information disclosures
Vehicular automation
- Sensors in modern vehicles measure driving speed, braking, tire pressure, proximity to other vehicles and many detailed aspects of how the driver is interacting with the vehicle - Users' apps (e.g., Google Maps) transmit information about their routes & speed to the mapping company's servers, where it is processed, traffic conditions are inferred and real-time info is transmitted to others - Attacks against cars are a threat on a large scale; attackers can break into the Electronic Control Unit (ECU) to disable brakes, control acceleration and perform other nefarious acts
Data oriented strategies
- Separate - Minimize - Abstract - Hide
Symmetric encryption algorithms
- Stream ciphers - Block ciphers
Tactics of inform
- Supply: Inform users which personal data is processed, including policies, processes & potential risks - Notify: Alert data subject whenever their personal data is being used or breached - Explain: Provide information in a concise and understandable form, and explain why processing is necessary
Taxonomy of privacy problems during information collection
- Surveillance - Interrogation
Type of information collection violations
- Surveillance - Interrogation
User-based access controls vs context of authority
- System administrators who configure access control systems are not the same people who decide what policy should be implemented - Challenges on the human side of configuring access control policy frequently lead to access control systems being misconfigured
RFID
- Tiny microchips that contain an antenna used to transmit information, such as serial numbers, to RFID reader - Primarily used in supply chain management to track inventory
What should be included in internal privacy policy?
- Types of data classification - Data collection principles - Protection of data (encryption, access control based, etc.) - Data retention period - Treatment of sensitive data - Sharing of data across groups - Sharing of data with partners and vendors - Creation of departmental privacy policies - Performance of privacy reviews - Participation in a privacy response center - Responding to privacy inquiries
Advantages of federated identity management
- User's identity can be kept hidden from the service provider - Users no longer have to remember multiple sets of authentication credentials - Service providers are relieved of the burden of implementing authentication and protecting users' authentication information
What should be included in security policy?
- encryption - software protection - access controls - physical protection - social engineering prevention - auditing
Calo's Harms Dimensions
- the perception of harm is just as likely to have a significant negative impact on individual privacy as experienced harms - personal information volunteered for use cannot result in a privacy harm - IT professionals need to rely on privacy notice & privacy control to build & retain trust
Information governance principles
1. Accountability 2. Transparency 3. Integrity 4. Protection 5. Compliance 6. Availability 7. Retention 8. Disposition
14 methods of value-sensitive design
1. Clarify project value 2. Identify direct & indirect stakeholders 3. Identify benefits & harms for stakeholders 4. Identify & elicit potential values 5. Develop working definitions of key values 6. Identify potential value tensions 7. Value-oriented design & development
Data lifecycle
1. Collection 2. Use 3. Disclosure 4. Retention 5. Destruction
8 Fair Information Practice Principles (FIPPs)
1. Collection limitation 2. Data quality 3. Purpose specification 4. Use limitation 5. Security safeguards 6. Transparency 7. Individual participation 8. Accountability
four elements of a design pattern
1. Pattern name 2. Problem description 3. Solution 4. Consequences (describe the results from applying the pattern and any trade-offs that occur by using or not using the pattern)
COBIT domains
1. Plan and Organize 2. Acquire and Implement 3. Deliver and Support 4. Monitor and Evaluate
Nissenbaum's Contextual Integrity
1. Privacy is provided by appropriate flows of information 2. Appropriate information flows are those that conform with contextual information norms 3. Contextual informational norms refer to five independent parameters (data subject, sender, recipient, information type, transmission principle) 4. Conceptions of privacy are based on ethical concerns over time
Privacy by Design principles
1. Proactive, not reactive 2. Privacy as the default setting 3. Privacy embedded into design 4. Positive sum, not zero sum (full functionality) 5. End-to-end security (full lifecycle protection) 6. Visibility & transparency 7. Respect for user privacy
Accountability Principle
A fair information practices principle states that individuals controlling the collection or use of personal information should be accountable for taking steps to ensure the implementation of these principles (FIPPs)
Transparency Principle
A fair information practices principle that encourages organizations to be open about personal information they collect
Purpose Specification Principle
A fair information practices principle, it is the principle stating: (1) that the purposes for which personal data are collected should be specified no later than at the time of data collection (2) and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
Collection Limitation Principle
A fair information practices principle, it is the principle stating: (1) there should be limits to the collection of personal data (2) that any such data should be obtained by lawful and (3) fair means and, where appropriate, with the knowledge or consent of the data subject.
Individual Participation Principle
A fair information practices principle, it is the principle that an individual should have the right to access, edit or delete data
Security Safeguards Principle
A fair information practices principle, it is the principle that personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.
Use Limitation Principle
A fair information practices principle, it is the principle that: (1) personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Paragraph 8 of the Fair Information Practice Principles except with the consent of the data subject or by the authority of law.
Adequacy decision
A finding by the EU commission that the legal framework in place in that country provides adequate protection for individuals' rights and freedoms for their personal data; country will be covered for 4 years by adequacy decision
Software defect
A flaw in the requirements, design or implementation that can lead to a fault
COBIT
A framework developed by the Information Systems Audit and Control Association and the IT Governance Institute that defines the goals for the controls that should be used to properly manage IT and ensure IT maps to business needs.
Example of intrusion
A game that invites people to go to individuals' houses without the owner's permission
Example of decisional interference
A government dictating family planning decisions
Example of appropriation
A magazine using your image to increase their sales
Private information retrieval (PIR)
A range of protocols in which data can be retrieved from a database without revealing to the database or another observer the information that is retrieved
Example of insecurity
A web developer allowing web visitors to see other customers' records by changing a customer ID number in the URL
Differential privacy
Add sufficient noise to the aggregates to hide the impact of any one individual; the added noise must be large relative to the difference for any two databases and any individual
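A minimal Python sketch of the Laplace mechanism behind differential privacy, where noise scaled to sensitivity/epsilon hides any one individual's contribution to a count; the dataset and epsilon value are illustrative assumptions.

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) sampled as the difference of two exponentials with mean `scale`
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(values, predicate, epsilon=0.5, sensitivity=1.0):
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(sensitivity / epsilon)

incomes = [32_000, 58_000, 120_000, 47_000, 250_000]
print(dp_count(incomes, lambda x: x > 100_000))  # noisy count of high earners
```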
Example of aggregation
Aggregating a user's searches to better interpret interests (does java mean coffee or the programming language?)
Service-oriented architecture
Aim to decouple services from large-scale servers by allowing designers to replicate services across multiple machines and improve load balancing
Predictability in Privacy Engineering
Aims to enable reliable assumptions about a system, particularly its data and the processing of that data, to all stakeholders
Secure multi-party computation
Algorithms that allow two or more computers to participate in a computation and compute a mathematical result without otherwise revealing private information; faster than homomorphic encryption algorithms
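A minimal Python sketch of additive secret sharing, one building block of secure multi-party computation: three parties learn the sum of their inputs without revealing any individual value. The inputs are hypothetical.

```python
import random

Q = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret, n=3):
    shares = [random.randrange(Q) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % Q)  # shares sum to the secret mod Q
    return shares

inputs = [42, 17, 99]                      # each party's private value
all_shares = [share(x) for x in inputs]    # each party splits its own input
# Party i receives one share of every input and adds them locally.
partial_sums = [sum(column) % Q for column in zip(*all_shares)]
print(sum(partial_sums) % Q)               # 158, with no single input revealed
```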
Homomorphic encryption
Allows data to remain encrypted while it's being processed and manipulated; it enables the ability to apply functions on encrypted data without needing to reveal the values of the data; it helps to protect integrity of data by allowing others to manipulate its encrypted form while no one besides the private key holder can understand or access its decrypted values.
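A minimal Python sketch of homomorphic behavior using textbook RSA with tiny, well-known toy parameters: multiplying two ciphertexts yields a ciphertext of the product, so a party can compute on data it cannot read. Real systems use padded RSA or dedicated homomorphic schemes; this is illustration only.

```python
p, q = 61, 53
n, e, d = p * q, 17, 2753          # 17 * 2753 ≡ 1 (mod lcm(p-1, q-1))

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

c1, c2 = encrypt(7), encrypt(3)
c_product = (c1 * c2) % n          # computed without decrypting anything
print(decrypt(c_product))          # 21
```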
Speech recognition
Allows users to interact with and control technologies by speaking to them (e.g., voice assistants); consumers' conversations may inadvertently, or maliciously, be recorded and transmitted
Role-based Access Control (RBAC)
An access control model that specifies who is allowed access in terms of roles
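A minimal Python sketch of RBAC: permissions attach to roles, users are assigned roles, and access checks resolve through the role. The role and permission names are illustrative assumptions.

```python
ROLE_PERMISSIONS = {
    "analyst": {"read_reports"},
    "dpo":     {"read_reports", "export_personal_data", "delete_personal_data"},
}
USER_ROLES = {"alice": {"analyst"}, "bob": {"dpo"}}

def is_allowed(user, permission):
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_allowed("alice", "export_personal_data"))  # False
print(is_allowed("bob", "export_personal_data"))    # True
```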
Privacy Impact Assessment (PIA)
An assessment that determines the impact on the privacy of the individuals whose data is being stored, and ensures that the organization has sufficient security controls applied to be within compliance of applicable laws or standards.
Example of distortion
An inaccurate notation of a late payment on a credit report
Incident response
An incident response program should have representatives from public relations, legal, privacy and security, and the incident-reporting webform, email or phone number should be easily accessible from the privacy notice. Privacy incidents can occur due to poorly applied security controls. Ex: improper access controls can permit inappropriate access to data
Software fault
An incorrect step, process or data definition in a computer program
Destruction
As soon as data is no longer needed, ensure the data and any derivatives are removed from all systems using appropriate methods to prevent recovery
Example of interrogation
Asking a female job candidate if she's pregnant
Principle of least privilege required
Asserts that any user, computer system or component of a computer system should have only the privileges it needs to do its job, and no more; this ensures that harm is minimized in the case of compromise
Vulnerability management
Assessing & developing plans for the capability & probability that a threat actor's acts will succeed
Balance (as a control)
Asymmetries or imbalances of information and power are a key reason why we need to build privacy into our services; using the strategies of Inform & Control can reduce those imbalances
Something you have
Authentication method that requires an object possessed by the individual (e.g., tokens or keys)
Magnitude data
Average income by age (e.g., someone with a high income will affect the average much more than an individual whose income is close to the average)
Privacy violation has not occurred if the following exist
Awareness Benefit Consent (ABC)
Asymmetric information
Awareness issue; imbalance of information between parties, meaning that one party knows things that the other does not
Bandwidth problem
Awareness issue; lack of time may prevent the individual from educating themselves about what a threat actor has access to or is doing even if the threat actor is transparent
Secure (as a control)
Begin securing the data in domains or data shared across domains using Abstract & Hide
Identifying information needs
Begin with the end in mind to understand & articulate the purpose and use that to drive information collection; look for the minimal information needed to accomplish the goal (data minimization)
Asymmetric power
Benefit issue; May allow the potential violator to skew the benefit in their favor
Hyperbolic discounting
Benefit issue; the tendency for people to increasingly choose a smaller-sooner reward over a larger-later reward as the delay occurs sooner rather than later in time
Machine learning
- Bias & fairness in machine learning - Capable of all three types of interference (obstruction, intrusion, self-representation) - Organizations should define & document their goals for fairness along with their rationale and approach for achieving these goals
Breach of confidentiality
Breaking a promise to keep an individual's information confidential
Patches
Changes to a program that aim to fix, update or improve system
Example of blackmail
Charging customers to delete their accounts
Web tracking
Companies create detailed profiles about what website a particular user visits so they can target advertisements and offers to that user
Providing feedback on contractual & regulatory requirements
Compliance offers a structure for the privacy program; reasonable assurance is fundamental in privacy & compliance
Code audits
Comprehensive analysis of source code to look for common issues, vulnerabilities and violations
Data inventory
Conducting a data inventory reveals where personal data resides, which will identify the data as it moves across various systems and thus how data is shared and organized and its locations. That data is then categorized by subject area, which identifies inconsistent data versions, enabling identification and mitigation of data disparities. The data inventory offers a good starting point for the privacy team to prioritize resources, efforts, risk assessments and current policy in response to incidents.
Endowment effect
Consent issue; Undervaluation of privacy, the finding that items gain in value to the person who owns them
Context-aware computing
Context-dependent privacy interfaces are triggered by certain aspects of the user's context. For instance, being in physical proximity to an IoT sensor may cause the device to announce its presence (e.g., flashing an LED, beeping, sending a description of its data practices to the user's phone). Other context factors might be someone accessing previously uploaded information or analyzing the user's previous privacy settings behavior to warn about potentially unintended privacy settings.
Frequency data
Count of the number of individuals at a given income or age
Example of increased accessibility
Court records being put online in a searchable format
Understanding quality attributes
Crosscutting concerns that cannot be addressed by a single function; ex: privacy, security, usability, accessibility
Quasi-identifiers
Data that can be combined with external knowledge to link an individual
Pseudonymous data
Data that is not directly associated with an individual
Internet monitoring
Data travels across the internet and data can be picked up in transit; opportunities for tracking and surveillance
Automated decision making
Decisional interference; the data subject should have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. Also includes steering an individual towards a decision that may be adverse to his/her best interest (persuasion)
Complexity in crypto-design
Degree of complexity when encrypting data will depend on how it is implemented
Demonstrate (Process oriented strategy)
Demonstrate compliance - Demonstrate you are processing personal data in a privacy-friendly way
Exclusion
Denies an individual knowledge of and/or participation in what is being done with their information
Design pattern
Describe recurring problems through a shared solution that can be repeatedly reused to solve a problem; Solution templates aimed at frequently encountered problem types
Client-server architecture
Describes the relationship between the client (program on local computer) and server (program running on remote computer); advantage is that it allows the service to store computer data on the client side for the purpose of completing transactions
Cloud-based computing
Describes the shift of client-based services or services typically run on a company's intranet to an off-site third party
Retention
Destroy data when it is no longer needed to complete the transaction; any new uses that motivate longer retention periods require additional consent from the data subject and/or the sending of new privacy notices
Key issue when releasing aggregates
Determine where the data is and whether it is frequency data or magnitude data
Location tracking
Devices contain a wide variety of location tracking technologies, each of which rely on slightly different types of underlying systems
Example of breach of confidentiality
Doctors releasing patient information that might change how they are treated or their credibility in a court case (e.g., mental disorder)
Field encryption
Encrypt only the sensitive fields within a record
Encryption performance in crypto-design
Encrypting & decrypting data will add time to all other processing that will occur for the data
Encryption size in crypto-design
Encryption may increase the number of bytes needed for storing data
Algorithms & keys
Encryption requires picking an algorithm, key length & key value
Full lifecycle protection (end-to-end security) (PbD)
End-to-end security; From cradle to grave, security of personal information must be considered at every stage of the information life cycle: collecting, processing, storage, distribution and destruction
CONTROLS mapped to PROBABILITY of action of a threat actor
Enforce & demonstrate
Supervise (as a control)
Enforce policies through processes & Demonstrate that the other actor is compliant with these policies & processes
T-closeness
Ensure that the distribution of values in a group k is sufficiently close to the overall distribution
Peer-to-peer architecture
Extreme alternative to client-server architectures whereby each peer is both a client and a server; peers use a directory service to find other peers
Insecurity
Failure to properly protect an individual's information
CONTROLS mapped to increase DIFFICULTY for a threat actor
Hide & abstract
High-level design
High-level components that group processes and data together into categories of functionality; considers IT architectures and how information will flow between actors
Anthropomorphism
Human characteristics (e.g., speech recognition, natural language processing/understanding)
Weak identifiers
Identifiers that must be used in combination with other information to determine identity
Cross-border transfers
Implement appropriate safeguards when transferring data internationally - binding corporate rules - approved code of conduct - certification scheme
Low-level design & implementation
Implementation of software following from the requirements & design phases; privacy practices include good code practices & code reviews, reusing standard libraries & frameworks (APIs)
Privacy as the default setting (PbD)
Individuals should not have to resort to self-help to protect their privacy; the default should be privacy preserving. Activities that exceed the expected context must require affirmative informed consent of the individual
CONTROLS mapped to POPULATION MAGNITUDE
Inform & control
Inform (Process oriented strategy)
Inform data subjects about the processing of their personal data
Daniel Solove's taxonomy of privacy
Information collection, information processing, information dissemination, invasions
K-anonymity
Information for each person cannot be distinguished from at least k-1 other individuals whose information is also present
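A minimal Python sketch of checking k-anonymity: group records by their quasi-identifiers and confirm every group contains at least k records. The records and the choice of quasi-identifiers are illustrative assumptions.

```python
from collections import Counter

records = [
    {"zip": "021**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "021**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "021**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "053**", "age": "40-49", "diagnosis": "diabetes"},
]
quasi_identifiers = ("zip", "age")

def is_k_anonymous(rows, qi, k):
    groups = Counter(tuple(row[attr] for attr in qi) for row in rows)
    return all(count >= k for count in groups.values())

print(is_k_anonymous(records, quasi_identifiers, k=3))  # False: one group has only 1 record
```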
Strong identifiers
Information that is clearly identifying (e.g., credit card number, passport)
Factors Analysis in Information Risk (FAIR)
International standard quantitative model for security risk; The purpose is to find factors that can be calculated or reasonably estimated, thus building up an estimate of the overall risk
Interrogation
Involves actively questioning an individual or otherwise probing for information; privacy violation occurs when questions breach social boundaries
Aggregation
Involves combining multiple pieces of information about an individual to produce a whole that is greater than the sum of its parts (e.g., a retail company correlates purchases of unscented lotions, large tote bags and prenatal vitamins to infer that a customer is pregnant)
Decisional interference
Involves others inserting themselves into a decision-making process that affects the individual's personal affairs
Increased accessibility
Involves rendering an individual's information more easily obtainable
Disclosure, as a privacy problem
Involves revealing truthful information about an individual that negatively affects how others view them
Distortion
Involves spreading false & inaccurate information about an individual
Surveillance
Involves the observation and/or capturing of an individual's activities
Secondary use
Involves using an individual's information without consent for purposes unrelated to the original reasons for which it was collected
Appropriation
Involves using someone's identity for another person's purposes
Legal Compliance
Legal Compliance is the alignment of identification of threats & vulnerabilities to specific policy requirements and laws. Organizations view themselves as compliant or non-compliant and do not take the lens of privacy by design.
Abstract (Data oriented strategy)
Limit the detail in which personal data is processed as much as possible
Minimize (Data oriented strategy)
Limit the processing of personal data as much as possible
Identification
Linking information to specific individuals (e.g., website uses cookies, recurring IP address or unique device identifier to link an individual's browsing history to their identity)
Security & privacy in systems development lifecycle
Many functional violations of privacy can be avoided by the correct design, implementation and deployment of software
Public key infrastructure
Massive directory of public keys belonging to individuals that other people could use to send secure electronic mail; today's public keys are mostly used to certify the identity of organizations' web servers, not individuals
CONTROLS mapped to OPPORTUNITY that a threat actor will come into contact with an individual's information
Minimize & separate
Run time behavior monitoring
Monitoring & analyzing usage & data collected from a running system
Intrusion reports
Monitoring a system for threats to security of a network
NICE framework
National Initiative for Cybersecurity Education; divides computer security work into: - securely provision - operate & maintain - protect & defend - investigate - analyze - oversee & govern - collect & operate
NIST framework
National Institute of Standards and Technology; explicitly addresses vulnerabilities, adverse events and relative likelihoods of impacts of those events
Example of exposure
Newspaper posting pictures featuring a celebrity doing private matters in their home
Example of disclosure
Newspapers disclosing an individual's private sexual preference
Objective harm defined in Calo's Harms Dimensions
Objective harm is measurable & observable. A person's privacy is violated due to forced or unanticipated use of personal information which can be categorised as economic loss, lost opportunity, lost liberty, or social detriment.
Internal controls
Objectives tied to practical measurements and designed to evaluate components governed by the privacy program
Mix networks
Onion routing networks; combine traffic from multiple computers into a single channel that is sent between multiple computers and then separate the traffic again
Collection
Only collect data for established purposes and always collect consent from data subjects for sensitive data; allow data subjects to opt out of services they deem unnecessary and before collecting the data, when possible
Use
Only use data for the purpose of the original collection; any new uses require additional consent from the data subject, and/or the sending of new privacy notices
Providing feedback on policies
Organizational policies identify key objectives that must be met, and privacy professionals & engineers define the governance program and identify manual & automated aspects
Data Quality Principle
Personal data should be relevant to the purposes for which it is used and should be accurate, complete and up-to-date.
Metadata tracking & surveillance
Photos taken using GPS devices might store location automatically in camera metadata, and when this is uploaded to photo editing applications, that application might also have your location
Enforce (Process oriented strategy)
Policy & process enforcement - commit to processing personal data in a privacy-friendly way, and enforce this
Ubiquitous computing
Present, appearing, found everywhere type of technology
Full functionality (positive sum, not zero sum) (PbD)
Privacy and other design requirements should not be treated as a trade-off. Designers must develop creative win-win solutions
Architect (as a control)
Privacy friendly architecture is about reducing identifiability & decentralizing operations; Separate data by domains & minimize data in each domain
Proactive, not reactive (PbD)
Privacy must be a forethought in any product, service, system or process. Privacy considerations should help drive the design, not the reverse (the design driving privacy violations)
Embedded into design (PbD)
Privacy should be so ingrained into the design that the system or process wouldn't function without the privacy-preserving functionality
Robots
Programmed with the ability to sense, process and record the world around them
Hide (Data oriented strategy)
Protect personal data or make it unobservable to make sure it does not become public or known
Documenting requirements
Provide engineers with an opportunity to capture critical privacy properties prior to embarking on design or other technical commitments; The behaviors the system is supposed to exhibit; functional vs non-functional requirements; privacy is considered non-functional, documented in a software requirements specification (SRS)
Internet of Things (IoT)
Providing privacy notices & enabling choices about privacy become difficult because of the frequent lack of traditional user interfaces
Asymmetric encryption
Public key encryption; one key encrypts & a second key decrypts; much slower than symmetric encryption
RSA encryption
RSA (Rivest-Shamir-Adleman) is the most common internet encryption and authentication system. The system uses an algorithm that involves multiplying two large prime numbers to generate a public key, used to encrypt data and decrypt an authentication, and a private key, used to decrypt the data and encrypt an authentication.
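A minimal Python sketch of RSA in practice, assuming the third-party `cryptography` package is installed: the public key encrypts, only the private key decrypts.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"session key material", oaep)
print(private_key.decrypt(ciphertext, oaep))  # b'session key material'
```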
Salt
Random data added to input into cryptographic function to increase the difficulty in determining the original input; prevent linkability between multiple transactions
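A minimal Python sketch of salting: the same input hashes to different, unlinkable digests when random salt is mixed in. The email value is an illustrative assumption.

```python
import hashlib, os

def salted_hash(value, salt):
    return hashlib.sha256(salt + value.encode()).hexdigest()

email = "jane.doe@example.com"
print(salted_hash(email, os.urandom(16)))  # differs every run, so two systems
print(salted_hash(email, os.urandom(16)))  # cannot link records by hash value alone
```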
Manageability in Privacy Engineering
Refers to the ability to granularly administer personal information, including modification, disclosure and deletion
Differential identifiability
Reformulation of differential privacy that limits the confidence that any particular individual has contributed to the aggregate value
Suppression
Removing identifying values from a record
Tokenization
Replace a piece of sensitive data with a token that is a placeholder value with no significance of its own (Hide - Restrict)
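A minimal Python sketch of tokenization: a sensitive value is swapped for a random token and the mapping is held in a separate vault. The in-memory dict and field names are illustrative; real vaults are hardened, access-controlled services.

```python
import secrets

vault = {}  # token -> original value, stored apart from application data

def tokenize(value):
    token = "tok_" + secrets.token_hex(8)
    vault[token] = value
    return token

def detokenize(token):
    return vault[token]

record = {"name": "Jane Doe", "card": tokenize("4111 1111 1111 1111")}
print(record)                      # downstream systems see only the token
print(detokenize(record["card"]))  # authorized systems can resolve it
```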
Noise addition
Replace actual data values with other values that are selected from the same class of data, which can also reduce the sensitivity of data
Generalization
Replacing a data element with a more general element (e.g., removing day and month from a birthdate), which can also reduce sensitivity of data
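A minimal Python sketch of generalization: replace precise values with broader ones (birthdate to birth year, ZIP to a 3-digit prefix). The field names are illustrative assumptions.

```python
def generalize(record):
    return {
        "birth_year": record["birthdate"][:4],    # drop month and day
        "zip_prefix": record["zip"][:3] + "**",   # coarsen the ZIP code
        "diagnosis": record["diagnosis"],         # keep the non-identifying payload
    }

print(generalize({"birthdate": "1986-07-14", "zip": "02139", "diagnosis": "asthma"}))
```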
Reasonable assurance
Requirements & objectives are not absolute, but rather based upon some criteria that is deemed practical to implement & manage; it affords flexibility & is greatly valuable for managing potential overengineering of solutions
L-diversity
Requires that there be at least l distinct values in each group of k records
Neural networks
Responsible for deep learning/deepfakes; a set of algorithms that are modeled loosely after the human brain, designed to recognize patterns
Exposure
Results from the revelation of information that we normally conceal from others including private physical details about our bodies
Symmetric encryption
Secret key encryption; the same key is used to encrypt & decrypt data
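A minimal Python sketch of symmetric (secret key) encryption, assuming the third-party `cryptography` package is installed: one shared key both encrypts and decrypts.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # must be shared secretly with the other party
f = Fernet(key)
token = f.encrypt(b"subject_id=12345")
print(f.decrypt(token))           # b'subject_id=12345'
```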
Something you know
Secret knowledge held only by the individual corresponding to the identity e.g., usernames, passwords
Organization security policies
Security policy helps maintain an organization's privacy policies & identifies what security measures need to be in place to protect the organization
IT & InfoSec support information governance
Security reasonably assures that two parties exchange personal data securely, while privacy reasonably assures that the authorized parties are using the personal data appropriately
Separate (Data oriented strategy)
Separate the processing of personal data as much as possible to prevent correlation
Disclosure
Sharing of information external to an organization collecting it; Limit disclosures to those purposes for which data was originally collected; any new disclosures require additional consent from the data subject, and/or the sending of new privacy notices
Open-source
Software that is easily viewed, shared or modified
Closed-source
Software that must be fixed & updated by the vendor
Application encryption
Some applications come with built-in encryption as a feature, but with little choice in what encryption can be applied to the data
Subjective harm defined by Calo in Harms Dimensions
Subjective harm is without a measurable or observable harm, but where an expectation of harm exists. The perception of harm is just as likely to have a significant negative impact on privacy as experienced harms; these are called psychological or behavioral harms.
Integrity
System maintains a reliable state, including the quality of data as being free from error. Includes: - Accuracy: data is correct and free from errors - Completeness: concerns whether there is missing data - Currency: concerns whether the data is up to date
Strategies for skillful practice
TBD
Group
Tactic of abstract; Aggregate data over groups of individuals instead of processing data of each person separately
Summarize
Tactic of abstract; Summarize detailed information into more abstract attributes
Perturb
Tactic of abstract; add noise to approximate the real value of a data item
Choose
Tactic of control; Allow data subjects to choose which personal data will be processed
Update
Tactic of control; Provide data subjects with the means to keep their personal data accurate & up to date
Consent
Tactic of control; only process personal data for which explicit, freely given and informed consent is received
Report
Tactic of demonstrate; Analyze collected information on tests, audits & logs periodically and report to the people responsible
Log
Tactic of demonstrate; Track all processing of data & review information gathered for any risks
Audit
Tactic of demonstrate; audit the processing of personal data regularly
Uphold
Tactic of enforce; Ensure policies are adhered to
Create
Tactic of enforce; Privacy policy
Maintain
Tactic of enforce; Privacy policy
Restrict
Tactic of hide; Prevent unauthorized access to personal data
Mix
Tactic of hide; Process personal data randomly within a large enough group to reduce correlation
Obfuscate
Tactic of hide; prevent understandability of personal data (e.g., hashing)
Dissociate
Tactic of hide; remove the correlation between data subjects & their personal data
Notify
Tactic of inform; Alert data subject when personal data is being used or breached
Supply
Tactic of inform; Inform users which personal data is processed, including policies, processes & potential risks
Explain
Tactic of inform; Provide information in a concise and understandable form, and explain why processing is necessary
Select
Tactic of minimize; Decide on a case-by-case basis to only process relevant personal data
Exclude
Tactic of minimize; Refrain from processing a data subject's personal data, if you don't need it, don't collect it
Strip
Tactic of minimize; Remove unnecessary attributes, in part or in full
Destroy
Tactic of minimize; Remove personal data completely as soon as they become unnecessary
Distribute
Tactic of separate; Process personal data for a task in separate locations
Isolate
Tactic of separate; Process personal data for different purposes independently on separate databases or systems
Dark patterns
Techniques to de-emphasize or obscure privacy-preserving response options with the goal to encourage the disclosure of personal information or agreement to unexpected or more expansive uses and/or disclosures of personal information
Main benefit of using a private cloud
The ability to restrict data access to employees and contractors
Software harm
The actual or potential ill effect or danger to an individual's personal privacy (sometimes called a hazard)
Polymorphic encryption
The algorithm, the encryption/decryption pair, changes each time it is used; Advantages: - it changes the algorithm used each time, so it becomes more difficult to recognize - it becomes harder to decrypt because there is no discernible relationship between the algorithm & results
Software error
The difference between a computed, observed or measured value or condition and the true, specified or theoretically correct value or condition
Data classification standard
The goal and objective of a __________ is to provide a consistent definition for how an organization should handle and secure different types of data.
Pseudonymize
The identity of the person is unknown but you can tell when different pieces of data are about the same unidentified person
Difficulty
The impediments that a threat actor in a given situation must overcome to act in a way that is a potential privacy violation
Software failure
The inability of a system or component to perform its required functions within specified performance requirements
Respect for user privacy (PbD)
The individual is the principal beneficiary of privacy and the one affected when that privacy is violated; therefore, their needs and risks should be forefront in the minds of designers
Anonymization
The least restrictive way to utilize data while ensuring that privacy is protected; can be done with suppression, generalization or noise addition
Dissociability in Privacy Engineering
The minimization of connections between data and individuals to the extent compatible with system operational requirements. This can also take the form of architectural data separation, in which identifiable personal information is kept segregated, but still linkable to, transactional data.
Testing & validation
The most crucial phase of software development with regard to managing privacy concerns; consists of two sets of activities: - Verification - Validation
Availability
The need to ensure that information is available to satisfy business needs
Organization privacy notice
The privacy notice is based on an organization's internal privacy policies & should contain: - what data is being collected, processed or shared - why this data practice is necessary and how it benefits the data subject - what controls are available regarding the practice
Privacy role of the IT professional
The privacy professional is a translator sitting at the center balancing the requirements espoused by various stakeholders & organizing those requirements into structured themes & elements that enable the success of each stakeholder as well as the overall success of the enterprise-wide privacy governance program
Probability of action
The probability that a threat actor will act in a way that is a potential privacy violation, if given the opportunity
Vulnerability
The probability that a threat actor's acts will succeed (capability * difficulty = vulnerability)
Violation magnitude
The probable extent to which the potential privacy violation constitutes an actual privacy violation for the affected population and the adverse consequential risks to that population from that privacy violation (population magnitude * adverse consequences risk = violation magnitude)
Adverse consequences risk
The probable frequency & probable magnitude of adverse consequences on the affected population (consequences frequency * consequences magnitude = adverse consequences risk)
Privacy risk
The probable frequency and probable magnitude of future privacy violations
Consequences frequency
The probable frequency of adverse consequence on the affected population
Opportunity
The probable frequency, given a time frame, at which a threat actor will come in contact with an individual or the individual's information & be provided the opportunity to act in a way that could cause a privacy violation
Action frequency
The probable frequency, given a time frame, that a threat actor acts toward an individual in a way that is a potential privacy violation (attempt frequency * vulnerability = action frequency)
Attempt frequency
The probable frequency, given a time frame, that a threat actor attempts an act toward an individual (opportunity * probability of action = attempt frequency)
Consequences magnitude
The probable magnitude of adverse consequence on the affected population
Population magnitude
The probable population for which a potential privacy violation is an actual privacy violation
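A worked numeric sketch of the risk formulas defined in the surrounding entries (attempt frequency, vulnerability, action frequency, adverse consequences risk, violation magnitude); every input value here is an illustrative assumption.

```python
opportunity, probability_of_action = 10.0, 0.3   # contacts per year, chance of acting
capability, difficulty = 0.8, 0.5                # combined per the guide's formula
consequences_frequency, consequences_magnitude = 0.2, 4.0
population_magnitude = 1_000

attempt_frequency = opportunity * probability_of_action                       # 3.0 attempts/year
vulnerability = capability * difficulty                                       # 0.4
action_frequency = attempt_frequency * vulnerability                          # ~1.2 actions/year
adverse_consequences_risk = consequences_frequency * consequences_magnitude   # 0.8
violation_magnitude = population_magnitude * adverse_consequences_risk        # 800.0

print(action_frequency, violation_magnitude)
```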
Information Security
The protection of information from accidental or intentional misuse by persons inside or outside an organization by assessing threats, risks to information & controls to preserve CIA (Confidentiality, integrity, availability)
Artificial Intelligence
The rise of AI technologies has made it possible to create fake accounts targeted to deceive a single person (catfishing), or whole armies of fake profiles designed to change public opinion (sockpuppeting or astroturfing)
Capability
The skills and resources available to a threat actor in a given situation to act in a way that is a potential privacy violation
Visibility & transparency (PbD)
The use of personal information should not be obscured or obfuscated, and disclosure about that use must consider the needs and sophistication of the respective audiences
Blackmail
Threat to disclose an individual's information against their will
Example of surveillance
Tracking mouse movements around a webpage
How design affects users
Transparency & user rights are core concepts of global privacy legislation & guidelines, and customers should be able to make informed privacy & consent decisions
Cross-enterprise authentication & authorization models
Two enterprises may each run their own identity provider, primarily intended to authenticate users within each enterprise. For business reasons, however, the two (or more) enterprises may decide to trust each other's identity providers. This enables company A to allow users from company B to access A's computer systems, but while relying on B's identity provider to authenticate these users.
Drones
Unmanned Aerial Vehicles (UAVs); many deem drones a violation of privacy, but it is a federal crime to shoot one down. Is this a useful application? Consider legitimacy, appropriateness, adequacy
Quantum encryption
Use quantum mechanic properties to perform cryptographic tasks such as quantum key distribution: an approach for distributing an encryption key to two endpoints so that it is physically impossible for a third party to get a copy of the key
Control (Process oriented strategy)
User control - Provide data subjects control over the processing of their personal data
Example of identification
Using a machine shop's supervisor, who knows who operates which machine, to correlate machine-operating data with the operator
Examples of exclusion
Using information about previous calls to constantly kick "problem callers" to the back of the queue without their knowledge or ability to dispute it
Value sensitive design
Value-sensitive design is a design approach that accounts for ethical values, such as privacy, in addition to usability-oriented design goals. Value-sensitive design places people, their needs and values important to them at the center of the design process
Example of secondary use
WhatsApp sharing your name and phone number with Facebook so businesses can advertise to you
Utility in crypto-design
When a piece of data is encrypted, only a limited set of operations can be performed on the original value; applying operations on data before encryption is one way to maintain the data's utility
Something you are
Where you are: location or VPN; what you are: biometric data
Wearable devices
While they collect fitness data, this data is not typically characterized as health data and therefore is not afforded the higher privacy protections of health data
Geotagging
Widespread availability of location information on smartphones enables applications to geotag data - labeling the data with the geographic location where it was created
Goal setting in Privacy by Design
Without an honest understanding of the purpose of the project, conflict and tension will naturally flow between business requirements and privacy friendliness; you can't build in privacy unless you understand the true motivations
Transport Layer Security (TLS)
a cryptographic protocol used to prevent monitoring or tampering of data traveling over the internet; adds security capabilities to standard HTTP (hypertext transfer protocol) by turning it into secure HTTPS
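A minimal Python sketch of relying on TLS from application code: the standard library validates the server certificate and encrypts the HTTP exchange. The URL is an illustrative example.

```python
import urllib.request

with urllib.request.urlopen("https://example.com/") as response:  # HTTPS = HTTP over TLS
    print(response.status)  # 200; the request and response traveled over an encrypted channel
```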
Secure Sockets Layer (SSL)
a standard security technology for establishing an encrypted link between a web server and a browser, ensuring that all data passed between them remain private
Cyberbullying
abusive attacks on individual targets conducted through electronic channels (interference with self-representation / personal identity)
Behavioral advertising
advertising that targets particular customers based on their observed online behavior (obstruction, intrusion, interference with self-representation)
data classification scheme
an information classification scheme used throughout an organization that helps secure the confidentiality and integrity of information; typically used by corporations
Intrusion
any action that affects a person's solitude, including their desire to be alone and their desire to control who has access to their information
Interference with self-representation
any action that alters how an individual is represented regardless of whether the representation is accurate or a misrepresentation
Obstruction
any action to interfere with decisions that affect the person's daily life
Verification in testing
ensure that a resultant system performs according to its requirements
Validation in testing
ensures that those requirements themselves satisfy the original needs of the user base for whom the system was developed
Binding Corporate Rules (BCR)
internal code of conduct operating within a multinational group, which applies to restricted transfers of personal data from the group's EEA entities to its non-EEA entities; may be a corporate group or a group of enterprises engaged in a joint economic activity
Enterprise Architecture (EA)
involves documenting an organization's IT assets and data flows to facilitate understanding, management and planning; involves managing data flows across the organization to minimize privacy risk & support business growth
Code reviews
review critical source code for defects; identify defects in logic or poor practices that cannot be found in a standard testing regime
Confidentiality
the act of holding information in confidence, not to be released to unauthorized individuals
Social engineering
using deception to obtain unauthorized access to information resources (obstruction, interference with self-representation)