// regulated industry plays
by Josh · May 5, 2026 · 5 min read

HIPAA + AI: Building Patient-Facing AI Without Leaking PHI

Healthcare practices want AI for patient communication. HIPAA makes that complicated but not impossible. Here are the patterns that work and the ones that don't.


Patient-facing AI in a HIPAA-covered practice has a narrow path that works and a wide ditch that doesn't.

This is not legal advice. Talk to your privacy counsel.

The rules in brief

HIPAA's Privacy Rule restricts disclosure of Protected Health Information (PHI). The Security Rule requires safeguards for PHI in electronic form.

Any "business associate" handling PHI on your behalf needs a Business Associate Agreement (BAA). Most consumer AI services don't sign BAAs. Some enterprise AI tiers do.

Patterns that work

Pattern 1: AI on de-identified data only. PHI is stripped before AI sees it. The AI's outputs don't reference patient identity. Useful for: pattern analysis, training material generation, scheduling logic.
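A minimal sketch of what "stripped before AI sees it" means in code. This is illustration only: real de-identification must cover all 18 Safe Harbor identifiers (names, geography, dates, record numbers, and more) or go through Expert Determination, and regexes alone won't get you there. The patterns and tokens below are simplified placeholders.

```python
import re

# Illustration only: real de-identification must address all 18 Safe
# Harbor identifiers or use Expert Determination. These patterns are
# simplified placeholders, not a compliant redaction pipeline.
REDACTIONS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # SSN-shaped
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),  # phone-shaped
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email address
    (re.compile(r"\bMRN[:\s]*\d+\b", re.I), "[MRN]"),         # record number
]

def redact(text: str) -> str:
    """Strip obvious identifiers before any text leaves the practice."""
    for pattern, token in REDACTIONS:
        text = pattern.sub(token, text)
    return text

note = "Patient (MRN: 448812, jane@example.com) called from 555-867-5309."
print(redact(note))
```

The point is architectural: the redaction step runs inside your boundary, and only its output ever reaches the model.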

Pattern 2: AI with a signed BAA. Use AI services that will sign BAAs. Microsoft Azure OpenAI (with BAA). Google Cloud Vertex AI (with BAA). AWS Bedrock (with BAA). Anthropic's enterprise tier offers BAAs in some configurations.

Pattern 3: AI behind your firewall, never touching PHI. Self-hosted models for tasks that touch PHI. Cloud models only for tasks that don't.
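The routing decision in Pattern 3 can be sketched in a few lines. The endpoint URLs and task shape below are invented for illustration; substitute your actual self-hosted model and your cloud service.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    touches_phi: bool

# Illustrative endpoints -- substitute your real self-hosted model and
# your cloud service. The cloud endpoint never receives PHI.
LOCAL_MODEL = "http://llm.internal:8080/v1"    # behind the firewall
CLOUD_MODEL = "https://api.example-ai.com/v1"  # non-PHI tasks only

def route(task: Task) -> str:
    """PHI-touching work stays on-prem; everything else may go to cloud."""
    return LOCAL_MODEL if task.touches_phi else CLOUD_MODEL

assert route(Task("summarize visit note", touches_phi=True)) == LOCAL_MODEL
assert route(Task("draft marketing copy", touches_phi=False)) == CLOUD_MODEL
```

The hard part is not the `if` statement; it is classifying `touches_phi` correctly for every task, which deserves review rather than a developer's guess.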

Pattern 4: AI for non-PHI tasks. Marketing content, internal training, scheduling logic, billing operations (which have their own rules, separate from the Privacy Rule).

Patterns that don't work

ChatGPT consumer or Claude.ai consumer for patient information. These services don't sign BAAs in their consumer tiers. Pasting patient info into them is a HIPAA violation.

AI that auto-responds to patient messages with diagnostic or treatment information. Without proper oversight, this creates unauthorized practice and disclosure issues.

AI tools that retain training data on patient information. Even with a BAA, services that train on your data violate the spirit and often the letter of HIPAA.

AI integrations that don't log access. HIPAA requires audit logs. AI tools that don't log access to PHI are a compliance gap.

A specific compliant pattern: appointment reminders

Most clinics want AI-generated appointment reminders. Here's how to do it compliantly.

The reminder content is templated and approved by the practice. AI's only job is to personalize timing and tone based on the patient's prior responses. The personalization data (response history) stays in the EMR. The AI receives a token saying "tier 2 personalization," not the patient's history itself.
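The tier computation happens inside the EMR; the AI request carries only the resulting token. Here is a sketch of that boundary. The tier names, threshold, and request fields are invented for illustration.

```python
# Sketch: the EMR derives a personalization tier locally; only the tier
# token (never the response history) appears in the AI request.
# Tier names and the 0.5 threshold are invented for illustration.

def personalization_tier(response_history: list[bool]) -> str:
    """Computed inside the EMR, from prior reminder responses."""
    if not response_history:
        return "tier-1"                       # no history: default tone
    reply_rate = sum(response_history) / len(response_history)
    return "tier-2" if reply_rate >= 0.5 else "tier-3"

def build_ai_request(tier: str) -> dict:
    # The AI sees only the tier flag -- no name, no history, no PHI.
    return {"template": "appointment-reminder", "personalization": tier}

history = [True, False, True, True]           # stays in the EMR
request = build_ai_request(personalization_tier(history))
print(request)  # no patient data crosses the boundary
```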

The reminder is sent through your existing patient communication channel (the EMR's messaging, a HIPAA-compliant SMS service). The AI doesn't transmit the message.

PHI exposure: minimal. The AI only sees a tier flag, not patient details.

A specific compliant pattern: documentation assistance

Most clinics want AI to help with clinical documentation. Here's the compliant pattern.

Use a BAA-covered AI scribe (Nuance DAX, Abridge, others). These are HIPAA-compliant by design. They record the visit, generate notes, and integrate with your EMR.

Don't use general-purpose AI (ChatGPT, Claude consumer) for clinical notes. The exposure is real.

A specific NOT compliant pattern: triage chatbot

"Let's have AI triage patient messages and respond when it can." This is the wrong frame.

Triage requires medical judgment. AI doesn't have a license. The right pattern is: AI summarizes the message and routes to the right human (nurse, scheduler, doctor). The human triages and responds.
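The summarize-and-route pattern looks roughly like this. The keyword classifier stands in for a model call; the categories, queue names, and keywords are illustrative, not a clinical taxonomy. The key property is that the AI's output is a queue assignment, never a reply to the patient.

```python
# Sketch: AI classifies and routes; a licensed human always responds.
# Categories, queues, and keywords are illustrative placeholders.
ROUTES = {
    "clinical": "nurse queue",       # symptoms, medications -> nurse triage
    "scheduling": "scheduler queue",
    "billing": "billing queue",
}

def classify(message: str) -> str:
    """Stand-in for a model call that returns a category, never a reply."""
    text = message.lower()
    if any(w in text for w in ("pain", "symptom", "medication", "dizzy")):
        return "clinical"
    if any(w in text for w in ("appointment", "reschedule", "cancel")):
        return "scheduling"
    return "billing"

def route_message(message: str) -> str:
    # The AI assigns a queue; the human triages and writes the response.
    return ROUTES[classify(message)]

assert route_message("I've been dizzy since the new medication") == "nurse queue"
assert route_message("Can I reschedule my appointment?") == "scheduler queue"
```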

The chatbot pattern that often gets shipped — AI directly responding to symptom questions — creates unauthorized practice issues that go beyond HIPAA.

The third-party question

Many AI tools your practice might use have third-party data flows. Your dictation app might be cloud-based. Your billing software might use AI for code suggestions. Your patient portal might have AI features.

For each, you need a BAA. If they won't sign one, find a different vendor.

The cost of vendor switching is much lower than the cost of a HIPAA enforcement action.

The audit-trail requirement

HIPAA requires you to track access to PHI. AI access counts. Your system needs to log:

- What AI tool accessed PHI
- When
- What PHI was accessed
- What was done with it
- The associated user (the human who initiated the access)

Most AI tools don't generate this audit trail out of the box. You wire it.
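"Wiring it" usually means wrapping every AI call that can touch PHI in a logging layer you control. A minimal sketch, with the log stored in a list for illustration (in practice this would be your append-only audit store), and all names invented:

```python
import datetime
import json

# Sketch of an audit wrapper around any AI call that may touch PHI.
# The in-memory list stands in for an append-only audit store.
AUDIT_LOG: list[dict] = []

def audited_ai_call(tool: str, user: str, phi_fields: list[str], payload: str) -> str:
    entry = {
        "tool": tool,                    # what AI tool accessed PHI
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),  # when
        "phi_accessed": phi_fields,      # what PHI was accessed
        "action": "generate",            # what was done with it
        "initiated_by": user,            # the human who initiated the call
    }
    AUDIT_LOG.append(entry)
    # ... the actual model call would go here; stubbed for the sketch ...
    return f"[draft output from {tool}]"

audited_ai_call("scribe-service", "dr.smith", ["visit_transcript"], "...")
print(json.dumps(AUDIT_LOG[0], indent=2))
```

The wrapper guarantees no AI call reaches PHI without leaving a log entry behind, which is the property an auditor will ask you to demonstrate.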

The bottom line

HIPAA + AI is workable. The path is narrow. BAAs, de-identification, role separation, audit trails.

The cost of doing this right is one engineering project. The cost of getting it wrong is fines, reputation damage, and potential loss of the right to bill federal payors.

Not legal advice. Talk to your privacy counsel before you ship.

hipaa · healthcare · phi · compliance · ai
// go deeper

Want the full guide? Check out our deep-dive page for more context, FAQs, and resources.

read the full guide

// ready to ship?

Let's build yours.

Reading is the easy part. We do the work. Tell us what's broken and we'll tell you straight up whether we can help.