Why This Checklist Exists
If you run an independent medical practice and you've started using AI tools to handle documentation, streamline clinical workflows, assist with coding, or summarize patient records, you are already operating inside a compliance environment that most AI vendors have not designed their products to serve. That's not a scare tactic. That's the regulatory reality. The HIPAA Security Rule's administrative, physical, and technical safeguards apply in full to AI systems that process protected health information, requiring organizations to assess risks, implement appropriate controls, and ensure the confidentiality, integrity, and availability of PHI regardless of whether that information is processed by humans or algorithms. (HIPAA Journal) The tools that work beautifully for a tech startup's internal operations can detonate a compliance crisis inside a medical practice. And the enforcement data is unambiguous: in 2025, 76% of all OCR enforcement actions included a penalty for a risk analysis failure, with fines ranging from $25,000 to $3 million, and OCR has confirmed its enforcement priorities in 2026 will be largely the same. (HIPAA Journal) The practices getting hit aren't the ones who tried and failed. They're the ones who didn't know they needed to try.
This checklist is for those practices, the ones still deciding which AI tools to deploy and which architecture to build them on. The argument you'll find woven through every item below is the same: local AI deployment, where your data never leaves your controlled environment, is not one option among many. For a practice bound by HIPAA's technical safeguards, it's the only architecture that meets the full standard without a dangerous stack of third-party dependencies, BAA paperwork, and contractual prayers.
The Proposed Rule That Changes Everything
Before we get to the checklist, one seismic shift deserves your attention. On December 27, 2024, HHS issued a Notice of Proposed Rulemaking proposing to remove the distinction between "required" and "addressable" implementation specifications and make all implementation specifications required with specific, limited exceptions, and to require encryption of ePHI at rest and in transit, multi-factor authentication, vulnerability scanning at least every six months, penetration testing at least annually, and network segmentation. (HHS.gov) The final rule is expected sometime in 2026 with a six-month grace period for compliance to follow. (HIPAA Journal)
That proposed rule is the difference between a compliance environment where a small practice could document its way around encryption and MFA, and one where those controls are non-negotiable floor requirements. If you're deploying AI now, you are building into a regulatory future where everything in this checklist becomes mandatory. Local AI infrastructure, built right the first time, puts you ahead of that curve rather than scrambling to catch up to it.
The Checklist: 45 CFR §164.312
1. Unique User Identification (Required)
Assign a unique ID to every workforce member, contractor, and system process that interacts with ePHI. Shared or generic accounts are incompatible with audit trail requirements and undermine accountability. (AccountableHQ) With local AI deployment, your authentication layer is yours to configure and enforce. Every physician, nurse, and administrator who queries the AI system gets their own credentials, logged against their user ID. No shared logins. No practice-wide passwords. Every interaction traceable to a specific human being. Cloud AI platforms with team accounts routinely blur this line, creating audit trail gaps that become very expensive to explain to an OCR investigator.
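As a concrete sketch, per-user attribution can be enforced at the application layer in front of the local model. The function and log names below are illustrative, not tied to any particular product, and the logger records interaction metadata rather than prompt content:

```python
import json
import time

# Illustrative audit log location; in practice this lives on tamper-protected storage.
AUDIT_LOG = "ai_audit.jsonl"

def query_local_model(user_id: str, prompt: str) -> str:
    """Run an on-premises AI query, attributed to one specific user."""
    if not user_id:
        # Shared or anonymous accounts are rejected outright.
        raise PermissionError("A unique user ID is required to query the AI system")
    # Placeholder for the actual on-premises inference call.
    response = f"[model output for: {prompt[:30]}]"
    # Every interaction is logged against the individual user's ID.
    with open(AUDIT_LOG, "a") as log:
        log.write(json.dumps({
            "ts": time.time(),
            "user": user_id,
            "event": "ai_query",
            "prompt_chars": len(prompt),  # log metadata, not PHI content
        }) + "\n")
    return response
```

The design choice worth noting: the log line carries the prompt length, not the prompt itself, so the audit trail does not become a second copy of PHI to protect.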
2. Emergency Access Procedure (Required)
Establish and implement as needed procedures for obtaining necessary electronic protected health information during an emergency. (eCFR) Local AI infrastructure must include a documented break-glass protocol: who can access the system when your primary authentication method is unavailable, under what conditions, and with what logging and post-incident review requirements. A cloud AI tool going offline during a mass casualty event is not your vendor's problem. It's your patient care problem. Local deployment means your emergency access procedures are entirely within your own operational control.
3. Automatic Logoff (Addressable, Trending Required)
Configure timeouts proportionate to risk and location. Set application and workstation inactivity timers; require re-authentication to resume access to ePHI. Shorten timers for public or shared workstations and kiosk-mode devices used in intake or exam areas. (AccountableHQ) With the proposed rule change treating this as effectively required, local AI deployment lets you configure session timeouts at the application level, the operating system level, and the network level, without depending on a third-party vendor's product roadmap to implement a control your compliance program demands.
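A minimal in-application inactivity timer might look like the following. The timeout values and location names are assumptions to be tuned against your own risk analysis, with shorter limits for shared intake workstations:

```python
import time

# Illustrative timeout policy (seconds); tune per location and risk analysis.
TIMEOUTS = {"private_office": 15 * 60, "shared_kiosk": 2 * 60}

class Session:
    """Tracks inactivity and forces re-authentication before ePHI access."""

    def __init__(self, user_id: str, location: str):
        self.user_id = user_id
        self.location = location
        self.last_activity = time.monotonic()

    def touch(self):
        """Record user activity, resetting the inactivity clock."""
        self.last_activity = time.monotonic()

    def is_expired(self) -> bool:
        # Unknown locations fall back to the strictest configured timeout.
        limit = TIMEOUTS.get(self.location, min(TIMEOUTS.values()))
        return time.monotonic() - self.last_activity > limit

    def require_active(self):
        """Gate every ePHI operation behind a live, unexpired session."""
        if self.is_expired():
            raise PermissionError("Session expired; re-authenticate to access ePHI")
```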
4. Encryption and Decryption (Addressable, Trending Required)
The proposed 2025 rule requires encryption of ePHI at rest and in transit, with limited exceptions. (HHS.gov) This is the item that should stop every managing physician cold when evaluating cloud AI tools. When patient data travels to an external AI server for processing, that transmission must be encrypted in transit. When it's stored on the vendor's infrastructure, it must be encrypted at rest. You are dependent on the vendor's key management, the vendor's encryption standards, and the vendor's continued BAA compliance to satisfy this control. With local AI deployment, your data at rest is encrypted on your own drives, under your own key management, auditable by you and your compliance officer. "In transit" means traffic between your workstations and your on-premises AI server, a route you own and control end to end.
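On Linux servers, at-rest encryption under local key control is commonly handled at the block-device layer with the standard cryptsetup/LUKS tooling. The sketch below is a config fragment, not a run-as-is script: device names, mount points, and paths are placeholders for illustration.

```shell
# Initialize LUKS encryption on the drive that will hold ePHI
# (destroys existing data on the device; device name is a placeholder).
cryptsetup luksFormat /dev/sdb1

# Unlock the volume with the locally held passphrase or key file.
cryptsetup open /dev/sdb1 phi_store

# Create a filesystem on the decrypted mapping and mount it.
mkfs.ext4 /dev/mapper/phi_store
mount /dev/mapper/phi_store /srv/phi

# Back up the LUKS header to offline media you control;
# losing the header makes the data unrecoverable.
cryptsetup luksHeaderBackup /dev/sdb1 --header-backup-file /secure/offline/luks-header.img
```

The header backup step is part of the key-management story your compliance officer can audit directly, something no cloud vendor attestation gives you.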
5. Audit Controls (Required)
Log all security-relevant events: authentication success and failure, access to patient records, create, update, delete operations, privilege changes, policy and configuration changes, and data exports. Preserve log integrity with time synchronization, tamper-evident storage, hashing, and restricted administrative access. (AccountableHQ)
This is where cloud AI becomes genuinely treacherous for small practices. When your physician queries a cloud AI with patient context, the log of that interaction lives on the vendor's infrastructure. Producing those logs in response to an OCR audit or a breach investigation requires the vendor's cooperation, the vendor's timeline, and the vendor's definition of what constitutes a complete log. Local deployment means your audit logs live in your system, backed up by you, retrievable by you, formatted for the HIPAA compliance purposes you define. In 2022, 55% of OCR's financial penalties were imposed on small medical practices. (Sprinto) The practices that prevailed in those investigations were the ones that could produce clean, complete audit trails. The ones that couldn't faced corrective action plans on top of their fines.
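One way to make audit logs tamper-evident, as the checklist item calls for, is hash chaining: each entry's hash commits to the previous entry's hash, so any retroactive edit invalidates every entry after it. A minimal stdlib sketch, with illustrative function names:

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel "previous hash" for the first entry

def append_entry(chain: list, event: dict) -> list:
    """Append an audit event whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"event": event, "prev": prev_hash, "hash": entry_hash})
    return chain

def verify_chain(chain: list) -> bool:
    """Recompute every link; any edited or reordered entry breaks the chain."""
    prev_hash = GENESIS
    for entry in chain:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True
```

A production system would anchor the chain in write-once storage and synchronized timestamps, but the core property is the same: you, not a vendor, can prove your logs are complete and unaltered.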
6. Integrity Controls (Addressable, Trending Required)
Use checksums, hashes, or digital signatures to detect unauthorized changes to files and records. Implement file integrity monitoring on servers holding ePHI; alert on unauthorized edits or permission changes. (AccountableHQ) An AI system that modifies clinical documentation, generates summaries of patient records, or assists with coding decisions is touching data whose integrity is a patient safety issue, not just a compliance issue. Local deployment lets you implement write-protected logging for AI-generated outputs, version control for AI-assisted documentation edits, and integrity verification that you run on your own infrastructure without trusting a third party's attestation that your data hasn't been altered.
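A baseline-and-compare sketch of file integrity monitoring using SHA-256 is below; the function names are illustrative, and a real deployment would run the comparison on a schedule and alert on any hit:

```python
import hashlib
from pathlib import Path

def sha256_file(path: Path) -> str:
    """Stream-hash a file so large records don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def baseline(paths) -> dict:
    """Record a known-good hash for every file holding ePHI."""
    return {str(p): sha256_file(Path(p)) for p in paths}

def changed_files(paths, base: dict) -> list:
    """Return every monitored file whose content no longer matches the baseline."""
    return [str(p) for p in paths if sha256_file(Path(p)) != base.get(str(p))]
```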
7. Person or Entity Authentication (Required)
Implement procedures to verify that a person or entity seeking access to electronic protected health information is the one claimed. (eCFR) Under the proposed 2025 rule, regulated entities would be required to apply multi-factor authentication through verification of at least two of three categories of factors: something the user knows, something the user has, or something the user is. (Duo Security) Local AI deployment supports MFA enforcement at the application level, the network level, and the physical workstation level, simultaneously and without carving out exceptions because a cloud platform doesn't support your authentication stack. You are not negotiating MFA implementation with a vendor. You are implementing it directly, on infrastructure you control, and documenting that implementation as direct evidence for your risk analysis.
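The "something you have" factor is most commonly a time-based one-time password (TOTP) from an authenticator app. A stdlib sketch of RFC 6238 verification paired with a password check is below; `verify_mfa` is an illustrative name, and a production deployment would also handle clock-drift windows and rate limiting:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, step=30, digits=6) -> str:
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_mfa(password_ok: bool, submitted_code: str, secret_b32: str) -> bool:
    """Both factors must pass; constant-time compare for the TOTP code."""
    return password_ok and hmac.compare_digest(submitted_code, totp(secret_b32))
```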
8. Transmission Security (Addressable, Trending Required)
Encrypt ePHI whenever it crosses a network, using strong, validated cryptography with sound key management. (AccountableHQ) For local AI deployment, transmission security means encrypting the network segment between your clinical workstations and your on-premises AI server. It means TLS for any internal API calls. It means ensuring that PHI processed by your AI system never traverses an unencrypted channel between any two points inside your facility. This is far simpler to implement, verify, and document when the entire transmission path is within your four walls than when it involves internet routing to a third-party data center, HTTPS configurations you didn't set up, and key management you can't directly inspect.
9. Business Associate Agreement (Required for Any Vendor Touching PHI)
A Business Associate Agreement is required whenever a vendor creates, receives, maintains, or transmits PHI on behalf of a covered entity, including when the vendor uses AI to perform regulated functions. The BAA must describe permitted and required uses of PHI, require the vendor to maintain safeguards, and establish a timeline for notifying the covered entity of unlawful exposure or a data breach. (Texas Medical Association) The most critical provision when contracting with AI providers is a prohibition on the use of patient data for training or retraining models without patient authorization. (Mcneeslaw) Cloud AI tools that lack a signed BAA, or that offer BAAs with carve-outs permitting model training on your patient data, expose your practice to direct HIPAA liability the moment any PHI touches their system. Local AI deployment removes the BAA question from the equation entirely for on-premises inference. Your patient data never reaches an external vendor. There is no third party to execute a BAA with for the AI processing itself, because there is no third party in the loop.
10. Organization-Wide Risk Analysis (Required)
A comprehensive, organization-wide risk analysis is the foundation of Security Rule compliance, and the control most often cited in OCR enforcement actions. OCR launched a risk analysis enforcement initiative in 2025, and its enforcement priorities in 2026 will continue that focus and evolve to cover risk management as well. (Ogletree) Your AI deployment is a new system that processes ePHI, which means it triggers a mandatory risk analysis update. That analysis must document the threats and vulnerabilities specific to the AI system: model poisoning, prompt injection, unauthorized access to query logs, data exfiltration through API endpoints, and inference attacks that could reconstruct patient data from model outputs. Local AI deployment dramatically narrows the threat surface for that analysis. You are not assessing the risks of a multi-tenant cloud platform serving millions of customers with a security posture you cannot directly audit. You are assessing a system on your own hardware, inside your own network perimeter, under your own physical and logical security controls.
Architecture Over Convenience
The hard truth that this checklist is building toward is this: HIPAA's technical safeguards were written for a world where healthcare organizations control their own systems. Every single one of the ten items on this checklist is more completely, more verifiably, and more defensibly satisfied when your AI infrastructure sits inside your own facility, on your own hardware, under your own administrative control. That's not an ideological position. It's a structural observation about how compliance works when auditors start asking questions.
Cloud AI platforms can satisfy some of these requirements, some of the time, under some configurations, if you read the BAA carefully, if you monitor the vendor's compliance certifications continuously, and if you accept the inherent risk that a third party's security posture is outside your direct control. Local AI deployment satisfies all of these requirements, all of the time, under your direct authority, documented by logs you own and controls you configured.
The proposed 2025 HIPAA Security Rule amendments are making the distinction between "addressable" and "required" increasingly moot. The regulatory trajectory is clear: what was optional is becoming mandatory, and the enforcement posture is intensifying. Building your AI infrastructure on a foundation you fully control isn't just the safest path through the current regulatory environment. It's the only architecture that positions your practice to absorb whatever OCR mandates next without scrambling to renegotiate vendor agreements or re-architect your entire AI deployment.
Your Patients' Data Deserves Your Infrastructure
One conversation about what HIPAA-compliant local AI deployment looks like for your practice.
Request a Quote
Or call directly: 1-801-609-1130
