
How to Conduct Data Privacy Assessments That Meet 2026 Compliance Standards

Systematic evaluation frameworks that ensure regulatory compliance and build user trust

By Chandler Supple · 13 min read

Your company is launching a new feature that uses machine learning to recommend products based on user behavior. Marketing is excited about personalization. Engineering thinks the algorithm is elegant. Legal gets the email about the launch date and asks: "Did anyone do a Data Protection Impact Assessment?" Silence. "A what?"

Two months later, your regulator sends an inquiry. Turns out your ML model has been making decisions using protected characteristics you didn't realize were being inferred. Your retention periods violate GDPR. Your vendor contracts lack required data processing agreements. And you have no documentation showing you assessed any of this before launch. The investigation costs $500K in legal fees and consulting, plus fines, plus the reputational hit when it becomes public.

Data Protection Impact Assessments (DPIAs) aren't bureaucratic paperwork—they're systematic risk analysis that prevents costly privacy failures. This guide breaks down how to conduct DPIAs that actually identify risks, document compliance, and pass regulatory scrutiny.

What Is a DPIA and When Is It Required?

A Data Protection Impact Assessment (also called Privacy Impact Assessment or PIA) is a structured evaluation of how a processing activity affects individual privacy and what risks exist.

When DPIAs Are Mandatory (GDPR Article 35)

Under GDPR, DPIAs are required when processing is likely to result in high risk to individuals, particularly:

  • Systematic and extensive profiling with significant effects: Using automated processing to make decisions that significantly affect people (credit scoring, employment decisions, targeted advertising)
  • Large-scale processing of special category data: Health records, biometric data, genetic data, or data relating to criminal convictions
  • Systematic monitoring of public areas on large scale: CCTV systems, location tracking

When DPIAs Are Strongly Recommended

Even if not legally required, conduct DPIAs for:

  • New technologies or processing methods
  • Significant changes to existing processing
  • Innovative uses of data
  • Combining datasets in new ways
  • Processing children's data
  • Processing that could lead to discrimination or exclusion

Consequences of Skipping DPIAs

Failing to conduct required DPIAs can result in:

  • Fines up to €10 million or 2% of global annual revenue, whichever is higher (GDPR)
  • Regulatory orders to stop processing
  • Reputational damage
  • Increased liability if data breaches occur
  • Inability to demonstrate accountability

More importantly, you miss the opportunity to identify and fix risks before they cause harm.

The DPIA Framework: Risk Assessment Methodology

DPIAs follow a structured methodology: describe processing, assess necessity, identify risks, and document mitigation.

Step 1: Describe the Processing Activity

Document exactly what you're doing with personal data:

What data: Specific data types (not just "user data" but "names, email addresses, IP addresses, device identifiers, clickstream data, location history")

Whose data: Data subjects (customers, employees, website visitors, children, etc.) and estimated volume

Why: Clear purpose ("Improve user experience through personalized recommendations" not vague "business purposes")

How: Collection methods, processing operations, storage systems, sharing arrangements, retention periods

Example:

"We are implementing a recommendation engine that analyzes user purchase history, browsing behavior, and product reviews to suggest relevant products. This processes:

  • Purchase data (items bought, prices, dates) for 500K active customers
  • Clickstream data (pages viewed, time spent, items favorited)
  • Account information (name, email, shipping address)
  • Inferred preferences (size, style, price range)

Data is stored on AWS servers in US-East, with backups replicated to EU-West. Third-party analytics provider accesses aggregated data only. Retention: 3 years from last purchase."
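Step 1's output can be captured as structured data rather than prose, which makes later reviews easier to compare. A minimal Python sketch of the example above; the `ProcessingRecord` fields are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    """Structured description of one processing activity (Step 1)."""
    name: str
    purpose: str
    data_categories: list  # specific types, never just "user data"
    data_subjects: str
    subject_count: int
    storage: str
    retention: str
    third_parties: list = field(default_factory=list)

record = ProcessingRecord(
    name="Recommendation engine",
    purpose="Suggest relevant products from on-site behavior",
    data_categories=["purchase history", "clickstream",
                     "account info", "inferred preferences"],
    data_subjects="active customers",
    subject_count=500_000,
    storage="AWS US-East, backups replicated to EU-West",
    retention="3 years from last purchase",
    third_parties=["analytics provider (aggregated data only)"],
)
print(record.retention)
```

A record like this can live alongside your processing register, so the DPIA and the Article 30 record stay in sync.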

Step 2: Assess Necessity and Proportionality

This is data minimization in practice. For each piece of data, ask:

Is it necessary? Can you achieve your purpose without it?

Is it proportionate? Do benefits justify privacy intrusion?

Example analysis:

Necessary:

  • Purchase history - YES (core to recommendations)
  • Product views - YES (indicates interest)
  • Account email - YES (deliver recommendations)

Not necessary (removed):

  • Precise GPS location - NO (city-level sufficient)
  • Full browsing history - NO (on-site behavior sufficient)
  • Social media activity - NO (out of scope)

Document what you removed or limited. This demonstrates the data minimization principle.
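The necessity analysis above can be recorded as data, so what you removed stays documented rather than vanishing from the final design. A minimal sketch; field names and reasons are illustrative:

```python
# Candidate fields mapped to (necessary?, reasoning). Entries mirror
# the example analysis above and are illustrative assumptions.
candidates = {
    "purchase_history": (True,  "core to recommendations"),
    "product_views":    (True,  "indicates interest"),
    "account_email":    (True,  "deliver recommendations"),
    "precise_gps":      (False, "city-level is sufficient"),
    "full_browsing":    (False, "on-site behavior is sufficient"),
    "social_media":     (False, "out of scope"),
}

collected = sorted(f for f, (keep, _) in candidates.items() if keep)
removed   = sorted(f for f, (keep, _) in candidates.items() if not keep)
print("Collect:", collected)
print("Removed (documented in the DPIA):", removed)
```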

Step 3: Identify Legal Basis

Under GDPR Article 6, choose your lawful basis:

Consent: Freely given, specific, informed, unambiguous. Good for non-essential processing like marketing. Hard to use when service depends on processing (no genuine choice).

Contract: Processing necessary to perform contract. Example: shipping address needed to deliver product. Can't be stretched to cover nice-to-have features.

Legitimate interests: Your interests or third party's interests that aren't overridden by individual rights. Requires balancing test (see below). Common for analytics, fraud prevention, security.

Legal obligation: Required by law. Example: tax records, employment records.

Choose the most appropriate basis. Don't claim contract necessity for processing that's actually for your legitimate interests.

Step 4: Legitimate Interest Assessment

If using legitimate interests (most common for analytics, personalization, and fraud prevention), perform the three-part test:

1. Purpose Test: Is the interest legitimate?

"We have a legitimate interest in providing personalized product recommendations to improve user experience and increase customer satisfaction."

Legitimate interests can be commercial (fraud prevention, network security, improving services) or broader (journalism, research).

2. Necessity Test: Is processing necessary?

Could you achieve it another way with less privacy impact?

"Personalization requires analyzing past behavior. We've minimized data collection to on-site activity only (not tracking across the web). We use aggregated data where possible and retain individual data for only 3 years."

3. Balancing Test: Do individual rights override your interest?

Consider:

  • Nature of data (sensitive vs. basic)
  • Reasonable expectations (do users expect this?)
  • Impact on individuals (minimal vs. significant)
  • Safeguards in place (security, transparency, controls)

"Users reasonably expect product recommendations on e-commerce sites. Impact is minimal—no adverse decisions made about individuals. We provide clear privacy notice, easy opt-out, and strong security. Balance tips in favor of our legitimate interest."

Document your reasoning. If challenged, you must show you performed this analysis.
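The three-part test is a gate: all parts must pass before legitimate interests can be relied on. A minimal sketch of that decision flow; the inputs are the documented conclusions of each test, not something code can decide for you:

```python
def legitimate_interest_assessment(purpose_ok, necessity_ok, balance_ok):
    """Three-part LIA gate: every test must pass, in order.
    Each argument is the documented conclusion of one test."""
    if not purpose_ok:
        return "Fail: the interest is not legitimate"
    if not necessity_ok:
        return "Fail: a less intrusive approach exists"
    if not balance_ok:
        return "Fail: individual rights override; consider consent"
    return "Pass: legitimate interests may apply (keep the reasoning on file)"

print(legitimate_interest_assessment(True, True, True))
```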


Risk Identification and Assessment

This is the heart of the DPIA: systematically identifying what could go wrong and how bad it would be.

Common Privacy Risks

Unauthorized access or disclosure:

  • Data breach (hacking, ransomware)
  • Insider threat (employee access abuse)
  • Accidental disclosure (misconfigured database, email error)

Impact: Identity theft, financial fraud, embarrassment, harassment

Function creep/scope creep:

  • Data collected for one purpose used for another
  • "We have this data, might as well use it for X"

Impact: Violates purpose limitation, erodes trust

Discriminatory outcomes:

  • Biased AI models
  • Proxies for protected characteristics
  • Feedback loops amplifying bias

Impact: Discrimination, unfair treatment, legal liability

Lack of transparency:

  • Hidden data collection
  • Unclear purpose
  • No way to exercise rights

Impact: Violates transparency principle, regulatory fines

Inability to exercise rights:

  • Can't access data
  • Can't delete data
  • Can't opt out

Impact: Regulatory complaints, fines

Vendor/third-party risks:

  • Processor fails to secure data
  • Processor uses data inappropriately
  • International transfer risks

Impact: You remain liable for processor failures

Risk Scoring

For each identified risk, assess:

Likelihood: How probable is this risk?

  • High: Likely to occur (>50% chance)
  • Medium: Could occur (10-50%)
  • Low: Unlikely (<10%)

Severity: How bad if it occurs?

  • High: Severe harm (financial loss, discrimination, danger)
  • Medium: Moderate harm (inconvenience, embarrassment)
  • Low: Minimal harm

Overall risk:

  • Critical: High likelihood + High severity
  • High: High likelihood + Medium severity, or Medium likelihood + High severity
  • Medium: Medium likelihood + Medium severity
  • Low: Low likelihood or Low severity
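The likelihood/severity matrix above can be encoded as a small helper so that every risk in the register is scored consistently. A minimal sketch:

```python
def overall_risk(likelihood, severity):
    """Map likelihood x severity to an overall rating per the matrix above.
    Inputs are 'low', 'medium', or 'high'."""
    order = {"low": 0, "medium": 1, "high": 2}
    l, s = order[likelihood], order[severity]
    if l == 2 and s == 2:
        return "critical"
    if (l == 2 and s == 1) or (l == 1 and s == 2):
        return "high"
    if l == 1 and s == 1:
        return "medium"
    return "low"  # any combination involving a 'low' rating

print(overall_risk("medium", "high"))  # the database-breach example below
```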

Example risk assessment:

Risk: Database breach exposing customer PII

  • Likelihood: Medium (external attacks common, but security controls in place)
  • Severity: High (financial data + identity info could enable fraud)
  • Overall Risk: HIGH

Current controls:

  • Encryption at rest (AES-256)
  • TLS in transit
  • Role-based access control
  • Annual penetration testing

Residual risk: Medium (controls reduce but don't eliminate risk)

Additional mitigations needed:

  1. Implement database activity monitoring
  2. Reduce retention from 5 years to 2 years
  3. Add anomaly detection for unusual access patterns
  4. Quarterly security audits (not just annual)

When to Consult DPO or Regulator

GDPR Article 35(2) requires seeking the advice of your Data Protection Officer, where one is designated, when carrying out a DPIA. If residual risks remain high after mitigation, you must consult the supervisory authority before proceeding (Article 36).

High residual risk scenarios requiring consultation:

  • Large-scale processing of sensitive data with inadequate safeguards
  • Systematic monitoring you can't sufficiently mitigate
  • Innovative technology with unknown risks
  • Processing likely to cause significant harm if risks materialize

Documenting Individual Rights Mechanisms

Your DPIA must explain how individuals can exercise their rights. Vague statements like "users can contact us" aren't sufficient.

Right of Access (Subject Access Request)

How users request: Online form at privacy-page/data-request

Identity verification: Must authenticate (login required) or provide government ID if no account

Response time: 30 days (extendable by a further 60 days for complex requests; you must notify the individual of the extension within the first 30 days)

Format: Machine-readable JSON export or PDF report

Content: All personal data, purposes, recipients, retention periods, rights, source if not collected from individual
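The response-time rule and export content above can be sketched in a few lines. Deadlines are simplified to 30/90 days for illustration, and the export values are placeholders:

```python
from datetime import date, timedelta
import json

def sar_deadline(received, complex_request=False, notified=False):
    """Subject access request deadline: 30 days by default, extendable
    by a further 60 days for complex requests, but only if the individual
    was notified within the first 30 days."""
    days = 90 if (complex_request and notified) else 30
    return received + timedelta(days=days)

# Minimal machine-readable export covering the required content fields;
# all values are illustrative placeholders.
export = {
    "personal_data": {"name": "A. Customer", "email": "a@example.com"},
    "purposes": ["personalized recommendations"],
    "recipients": ["analytics provider (aggregated only)"],
    "retention": "3 years from last purchase",
    "rights": ["access", "rectification", "erasure", "objection"],
    "source": "collected directly from the individual",
}
print(sar_deadline(date(2026, 1, 5)))       # 30-day default
print(json.dumps(export, indent=2)[:80])    # first lines of the export
```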

Right to Erasure (Right to Be Forgotten)

How users request: Account settings > "Delete Account" or privacy request form

When we can refuse:

  • Legal obligation to retain (tax records, legal claims)
  • Public interest (archival, research)
  • Exercise/defense of legal claims

Deletion timeline: Immediate from production systems, purged from backups within 90 days

Confirmation: Email sent when complete

Third parties: Notify processors and recipients to delete

Right to Object

Processing based on legitimate interests: User can object; you must stop unless you demonstrate compelling legitimate grounds that override their interests

Direct marketing: User can always object; you must stop immediately

How to object: Unsubscribe links, account settings, email to privacy@company.com

Rights Related to Automated Decision-Making

If you make automated decisions with legal/significant effects:

Right to human review: How users request: Email to appeals@company.com with decision reference

Right to explanation: Provide meaningful information about logic involved

Right to challenge: Appeals process with human review

Vendor and Third-Party Risk Assessment

You remain liable for your processors' failures. Assess vendor risks carefully.

Processor Due Diligence

For each vendor that processes personal data:

What data do they access? Be specific—don't give vendors more data than necessary

What do they do with it? Processing purposes

Where is data stored/processed? Countries matter for international transfer rules

What security measures do they have? SOC 2, ISO 27001, encryption, access controls

Do they use sub-processors? Who are they? Do you have approval rights?

What happens at contract termination? Return or deletion of data?

Data Processing Agreements (DPAs)

GDPR requires written contracts with all processors containing specific terms:

  • Process only on documented instructions
  • Ensure confidentiality of persons processing data
  • Implement appropriate security measures
  • Only use sub-processors with prior authorization
  • Assist with individual rights requests
  • Assist with security and data breach obligations
  • Delete or return data upon contract termination
  • Make available information demonstrating compliance
  • Allow audits

Your DPIA should list all processors and confirm DPA status. Missing DPAs are a significant compliance gap.
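A processor register with DPA status can be checked mechanically before each DPIA review. A minimal sketch; the vendor names and statuses are illustrative assumptions:

```python
# Processor register: vendor -> whether a signed DPA is on file.
processors = {
    "cloud-hosting-co":  True,
    "analytics-co":      True,
    "email-delivery-co": False,  # DPA still in negotiation
}

missing = sorted(v for v, signed in processors.items() if not signed)
if missing:
    print("Compliance gap - processors without a signed DPA:", missing)
```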

International Transfer Assessment

If transferring data outside the EEA, document:

Destination countries: US, India, etc.

Transfer mechanism:

  • Adequacy decision (EU approved countries)
  • Standard Contractual Clauses (SCCs)
  • Binding Corporate Rules
  • Certification mechanisms

Additional safeguards: After Schrems II ruling, SCCs alone may not be sufficient. Document additional measures:

  • Encryption (keys held in EU)
  • Pseudonymization
  • Minimizing US entity's access
  • Contractual commitments to challenge government requests


The Legitimate Interest Balancing Test in Practice

Since legitimate interests is the most flexible (and frequently misused) legal basis, let's walk through examples.

Example 1: Fraud Detection (Legitimate)

Interest: Preventing fraudulent transactions protects our business and other customers

Necessary: Fraud detection requires analyzing transaction patterns. We use minimal data (amount, time, location) without collecting unnecessary personal details

Balancing: Customers expect and benefit from fraud protection. Impact is minimal (automated analysis, no human review). Strong security protects data. Our interest in fraud prevention does not override customer rights.

Conclusion: Legitimate interest justified.

Example 2: Marketing to Existing Customers (Depends)

Interest: Promoting our products to existing customers increases revenue

Necessary: Marketing requires customer contact info

Balancing: Customers may not expect marketing when they signed up for a service. Easy opt-out available. Frequency limited to monthly. For existing customers about similar products, this may be reasonable. For unrelated products or frequent emails, balance tips against legitimate interest—use consent instead.

Conclusion: Legitimate interest may apply for soft opt-in (related products to existing customers), but consent better for broader marketing.

Example 3: AI Training on User Data (Questionable)

Interest: Improving our AI models benefits all users

Necessary: AI training requires data, but could use synthetic data, publicly available data, or obtain consent

Balancing: Users likely didn't expect their data used for AI training when signing up. Secondary use beyond original purpose. Alternatives exist (consent, synthetic data). Individual rights likely override.

Conclusion: Legitimate interest NOT justified. Use consent or alternative data sources.

DPIA Documentation and Maintenance

The DPIA itself must be documented and maintained.

Documentation Requirements

Your DPIA document should include:

  • Description of processing
  • Assessment of necessity and proportionality
  • Assessment of risks to rights and freedoms
  • Measures to address risks
  • Safeguards, security measures, mechanisms to ensure data protection
  • Evidence of stakeholder consultation
  • DPO opinion
  • Sign-off by appropriate authority

Review and Updates

DPIAs aren't one-time exercises. Review and update when:

  • Processing operations change significantly
  • New risks emerge
  • Technology changes
  • Regulatory guidance evolves
  • At least annually for high-risk processing

Document review dates and what changed.
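The review triggers above reduce to a simple check that can run in a scheduled job. A minimal sketch; `today` is passed explicitly to keep the check deterministic and testable:

```python
from datetime import date, timedelta

def review_due(last_review, today, high_risk=True, changed_since_review=False):
    """A DPIA review is due when processing has changed significantly,
    or at least annually for high-risk processing."""
    if changed_since_review:
        return True
    return high_risk and today >= last_review + timedelta(days=365)

print(review_due(date(2025, 1, 1), date(2026, 2, 1)))  # annual review overdue -> True
```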

Who Needs Access

  • Internal: DPO, legal, security, relevant business units
  • External: Regulators upon request, individuals (redacted version)

Don't hide DPIAs in filing cabinets. They're living documents that inform decisions.

Common DPIA Mistakes

Box-checking exercise: Generic template filled out superficially without genuine risk analysis

After-the-fact justification: Writing DPIA after launching to justify decisions already made, rather than informing design

Vague descriptions: "We process user data for business purposes" tells you nothing

Ignoring actual risks: Focusing on hypothetical minor risks while missing obvious major ones

No mitigation actions: Identifying risks but not documenting how you'll address them

Missing vendor assessment: Forgetting that processors create risks you're liable for

Wrong legal basis: Claiming contract necessity for processing that's actually for your commercial benefit

No stakeholder consultation: Engineering and legal do DPIA without talking to affected teams or users

Key Takeaways

Data Protection Impact Assessments are mandatory for high-risk processing and recommended for any new or changed data processing. They're systematic risk analyses that identify privacy issues before they cause harm.

DPIAs follow a structured methodology: describe the processing, assess necessity and proportionality, evaluate legal basis (including legitimate interest balancing test if applicable), identify risks, document mitigation measures, and explain how individual rights are enabled.

Risk assessment is the core: identify what could go wrong, assess likelihood and severity, document existing controls, calculate residual risk, and plan additional mitigations for high risks. Don't skip vendor risk assessment—you're liable for processor failures.

DPIAs must be documented, reviewed regularly, and updated when processing changes. They should inform design decisions, not just justify choices already made. Involve DPO, consult stakeholders, and if residual risks remain high, consult your supervisory authority before proceeding.

The DPIAs that pass regulatory scrutiny are thorough, honest about risks, specific about mitigations, and demonstrate genuine attempt to protect individual rights while achieving business objectives.

Frequently Asked Questions

Do we need a separate DPIA for each processing activity?

You can create a single DPIA covering multiple similar processing operations if they present similar risks. For example, one DPIA for all marketing activities, or one for a product that includes multiple features. However, significantly different activities (e.g., employee monitoring vs. customer analytics) need separate DPIAs.

Can we use a DPIA template or do we need to create from scratch?

Templates are fine—many regulators provide them. But don't just fill in blanks mechanically. Templates provide structure, but you must genuinely analyze YOUR specific risks, not generic risks. A template-based DPIA that thoughtfully addresses your situation is better than a from-scratch document that's superficial.

What happens if our DPIA identifies high risks we can't mitigate?

If residual risk remains high after all reasonable mitigations, you must consult your supervisory authority before proceeding (GDPR Article 36). They may approve with additional conditions, require changes, or prohibit the processing. Don't proceed with high-risk processing without this consultation—it's a compliance violation.

Do startups need DPIAs or are they just for large companies?

Size doesn't matter—risk level does. A 10-person startup using AI to make automated decisions about individuals needs a DPIA. A 10,000-person company collecting only basic contact info might not. However, startups should do DPIAs for any new processing to build good practices and avoid costly mistakes later.

How do we handle DPIAs for fast-moving products with frequent changes?

Conduct DPIA before initial launch covering planned features. For subsequent changes, assess if they materially increase risk—if yes, update DPIA. Small changes (bug fixes, UI updates) don't require DPIA updates. Significant changes (new data collection, new purposes, new vendors) do. Build DPIA review into your change management process.

Should DPIAs be public or kept internal?

DPIAs are typically internal documents but must be made available to regulators upon request. Some organizations publish summaries to demonstrate transparency and build trust. If publishing, redact sensitive business information and specific security details that could aid attackers. You might also need to provide redacted versions to individuals exercising access rights.

Chandler Supple

Co-Founder & CTO at River

Chandler spent years building machine learning systems before realizing the tools he wanted as a writer didn't exist. He founded River to close that gap. In his free time, Chandler loves to read American literature, including Steinbeck and Faulkner.

About River

River is an AI-powered document editor built for professionals who need to write better, faster. From business plans to blog posts, River's AI adapts to your voice and helps you create polished content without the blank page anxiety.