Yes, using ChatGPT with real patient data can violate HIPAA. General-purpose AI tools like ChatGPT are not HIPAA-compliant by default and should not receive protected health information (PHI), because they do not typically offer Business Associate Agreements (BAAs) or healthcare-specific data handling guarantees.
Sharing identifiable patient data with these tools can therefore expose providers to compliance risk. For real clinical documentation, providers should use purpose-built, HIPAA-compliant AI platforms such as Sully.ai.