Introduction
Artificial intelligence (AI) in healthcare is a trendy topic; however, AI integration into medicine, patient care, clinical practice, and documentation is already a familiar and steadily improving part of the field. This article serves as a reference for health law attorneys and compliance professionals to:
- Illustrate the various ways AI is being incorporated into healthcare systems
- Encourage health law attorneys and compliance professionals to familiarize themselves with the available resources to ensure software and applications are authorized for use in healthcare systems
- Spotlight the possibilities and pitfalls of AI applications in use or development for billing and coding optimization
- Assist in determining whether an online resource or information is reliable and accurate
- Highlight a few new academic studies of interest regarding the integration of AI in healthcare
Artificial Intelligence in Practice
Until now, AI integration in electronic health records (EHR) and diagnostics has required significant provider input, such as clicking through override pop-up warnings for physician orders and pharmacy workflows, a final radiology reading by a physician, or a machine interpretation of an EKG that a healthcare professional must review for clinical correlation. The rapid improvement and integration of AI in healthcare systems promise to reduce physician burnout, lower healthcare costs, and improve outcomes. Compliance with regulations as new instruments receive approval is top of mind for clinicians and health law attorneys. However, regulations may need to evolve if patient safety is to remain a priority during implementation.
AI is found almost everywhere in healthcare systems. AI-enabled tools focused on radiology lead the industry charge, with over 100 radiology-related AI companies and over 400 radiology AI algorithms approved by the U.S. Food and Drug Administration (FDA). To date, the FDA has authorized 950 artificial intelligence and machine learning (AI/ML)-enabled medical devices. Hospital systems and solo providers use AI/ML-enabled medical devices and AI-enhanced chatbots to create documentation, orders, coding, and billing. Currently, the main roles of AI in medicine are:
- Diagnostic assistance
- Predictive analytics
- Personalized treatment plans
- Clinical workflow optimization
The approval and adoption of AI/ML tools, however, has not relieved medical providers of liability for malpractice. In a recent Texas case, In re Acclarent, 2024 WL 2873617 (Tex. App. 2024), the appellate court ultimately denied the plaintiff’s request for pre-complaint depositions of the device manufacturer involved in the alleged medical negligence. Still, joint liability among AI/ML vendors and medical providers in some jurisdictions seems inevitable. Absent legislation, future complaints will likely explore the dual nature of medical negligence and product liability and include discovery requests for AI/ML vendor information in negligence cases. The wide-ranging implications of potential hybrid medical malpractice and product liability cases suggest a new “wild west” regarding venues, jurisdictions, and statutes of limitations.
The FDA maintains a public list of AI/ML-enabled medical devices that meet the FDA’s “applicable premarket requirements, including a focused review of the devices’ overall safety and effectiveness, which includes an evaluation of appropriate study diversity based on the device’s intended use and technological characteristics.” The FDA updates the list periodically, though it is not meant to be an exhaustive or comprehensive resource of medical devices that incorporate AI/ML. The list is a solid starting point for a compliance officer or health law attorney reviewing new software or handling a negligence case involving such products.
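For a compliance reviewer who prefers to work from a downloaded copy of that list, the sketch below shows one minimal way to search it locally. The file name and column headers are assumptions for illustration only; the actual export from fda.gov should be checked for its real field names.

```python
import csv

# Hypothetical local export of the FDA's AI/ML-enabled medical device list.
# The file name and column names below are assumptions for illustration;
# verify the real headers in the file downloaded from fda.gov.
LIST_FILE = "fda_ai_ml_enabled_devices.csv"

def find_devices(search_term: str) -> list[dict]:
    """Return rows whose device name or company name contains the search term."""
    matches = []
    with open(LIST_FILE, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            haystack = f"{row.get('Device Name', '')} {row.get('Company', '')}".lower()
            if search_term.lower() in haystack:
                matches.append(row)
    return matches

if __name__ == "__main__":
    # Example: list authorized devices with "radiology" in the name or company.
    for device in find_devices("radiology"):
        print(device.get("Submission Number"), device.get("Device Name"))
```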
AI/ML in Provider Documentation: Billing and Coding Optimization Possibilities and Pitfalls
Billing and coding, especially evaluation and management (E/M) coding for physician visits, remains susceptible to fraud and abuse. The complexity of E/M coding, which is based on documentation elements, can lead to incorrect or even fraudulent coding in some cases. AI/ML applications seek to automate the process. However, nearly all billing and coding experts agree that any implementation of AI in coding should keep a human coder in charge of the result.
Epic, one of the best-known electronic medical record systems, is an industry leader in developing AI technology to improve documentation and optimize billing and coding. Epic’s EHR can use ambient listening technology to generate progress notes from patient/provider conversations in the exam room; in other words, the AI drafts a progress note from the information gleaned during the visit. Patient consent is necessary, and the application works much like dictation, but at a far more sophisticated level. The physician must later review the note and finalize the documentation.
Copying and pasting information from previous visits represents the low-tech version of generative AI in medical documentation; it can save time, but it risks carrying forward old or obsolete information. In cases of systematic fraudulent charting, copying and pasting large amounts of prior documentation can also incorrectly populate sections of a progress note to “check the boxes” on the multiple levels of documentation needed to bill a complex visit. Epic has new documentation features that summarize prior notes and require the provider to choose which elements apply to the current visit and discard the others. Providers essentially “train” the AI to discern which elements belong. According to the developer, the approach holds promise for reducing provider stress and time spent on after-visit charting, provided the implementation includes significant auditing and monitoring data for tracking. Without such surveillance, the risk of systematic fraudulent coding and billing remains.
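To make the auditing and monitoring point concrete, here is a minimal sketch of the kind of audit record a compliance team might keep for AI-suggested note elements. The field names and the metric are illustrative assumptions, not Epic’s (or any vendor’s) actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class NoteElementAudit:
    """One AI-suggested documentation element and the provider's decision on it.
    Field names are illustrative assumptions, not any vendor's actual schema."""
    encounter_id: str
    element_text: str        # the AI-suggested element (e.g., a carried-forward finding)
    source_note_date: str    # the prior note the suggestion was drawn from
    provider_action: str     # "kept", "edited", or "discarded"
    provider_id: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def discard_rate(audits: list[NoteElementAudit]) -> float:
    """Share of suggested elements providers discarded -- a simple monitoring metric.
    A rate near zero across many encounters may warrant a closer compliance review."""
    if not audits:
        return 0.0
    return sum(a.provider_action == "discarded" for a in audits) / len(audits)
```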
AI technology integrated specifically into medical coding is still in development. It seeks to alleviate workflow barriers for human coders by suggesting potential procedure and diagnosis codes drawn from the documentation. The machine learning component could effectively help prevent fraudulent billing, although it can still be misused or exploited for fraud.
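As a rough illustration of the human-coder-in-charge principle discussed above, the sketch below models AI-suggested billing codes that reach a claim only after explicit human approval. The class and function names are hypothetical and do not reflect any vendor’s actual API.

```python
from dataclasses import dataclass

@dataclass
class CodeSuggestion:
    """An AI-suggested billing code awaiting human review. Illustrative only."""
    code: str            # e.g., a CPT or ICD-10 code
    description: str
    confidence: float    # model confidence, 0.0 to 1.0
    approved: bool = False
    reviewer: str | None = None

def finalize_claim(suggestions: list[CodeSuggestion], reviewer_id: str,
                   approved_codes: set[str]) -> list[CodeSuggestion]:
    """Only codes a human coder explicitly approves make it onto the claim."""
    final = []
    for s in suggestions:
        if s.code in approved_codes:
            s.approved = True
            s.reviewer = reviewer_id
            final.append(s)
    return final
```

The design choice worth noting is that the model’s suggestions never flow directly into a claim; the human reviewer’s approval is the gate, which matches the consensus view among billing and coding experts described above.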