Setting Industry Standards for Applications of AI in Healthcare II
Ronald M. Razmi, MD
June 28, 2022
The American Medical Informatics Association (AMIA) has proposed a framework for the regulation of AI decision support. AMIA postulates that the development and implementation of clinical decision support (CDS) that trains itself and adapts its algorithms based on new data—referred to here as Adaptive CDS—present unique challenges and considerations. Although Adaptive CDS represents an expected progression from earlier work, the activities needed to appropriately manage and support the establishment and evolution of Adaptive CDS require new, coordinated initiatives and oversight that do not currently exist.
AMIA has stated that new and flexible oversight structures that evolve with the healthcare ecosystem are needed, and that these oversight mechanisms should be distributed across institutions and organizational actors. Furthermore, new organizational competencies are needed to evaluate and monitor Adaptive CDS in patient care settings. Although the FDA is developing and testing policies for marketed Adaptive CDS, numerous algorithm-driven applications are self-developed without even cursory guidance or oversight.
AMIA calls for two key pillars in the oversight of AI solutions: (1) transparency in how Adaptive CDS is trained, requiring standards for how decision support algorithms are trained, “including the semantics and provenance of training datasets … necessary for validation prior to deployment,” and (2) communication standards to convey specific attributes of how the model was trained, how it is designed, and how it should operate in situ.
AMIA calls for the creation of new groups to govern AI deployments within specific healthcare organizations, as well as a new system of oversight across institutions. Its statements suggest that current FDA-cleared products cannot guarantee data integrity and algorithmic clarity. This goes back to the issue we alluded to earlier: there are two schools of thought on the FDA's role in clearing AI applications in healthcare. Clearly, one school of thought holds that the FDA is not asking for enough before clearing most of these solutions.