Evaluating artificial intelligence solutions from vendors is among the biggest challenges informatics leaders face today. The Health AI Partnership (HAIP), a multi-stakeholder collaborative, has published in NEJM AI an overview of its AI Vendor Disclosure Framework, a tool designed to support responsible AI system procurement.
First, some background on HAIP: It seeks to be a source of guidance for healthcare professionals using AI and related emerging technologies, and a platform for community-generated, expert-curated guidance, resources, and standards for responsible AI adoption in healthcare. The organization has developed a network that creates a safe space for peer advice and collaboration to address issues health system leaders face while adopting AI in healthcare settings. Its Coordinating Center team, located at the Duke Institute for Health Innovation (DIHI), manages and coordinates the partnership’s activities. HAIP received initial funding in 2022 from the Gordon and Betty Moore Foundation to establish this community resource.
According to HAIP, the AI Vendor Disclosure Framework, which is publicly available and free to use, identifies essential information across five core domains that health systems should request, and vendors should disclose, to effectively evaluate vendor-developed AI systems:
- System Capabilities and Intended Use establishes foundational knowledge about the AI system’s functionalities, use, and affected stakeholders.
- System Performance and Compliance establishes the AI system’s operational metrics, potential biases, associated risks, and regulatory status.
- Data Stewardship outlines the approach to data governance, including security measures, quality assurance processes, secondary use, and retention policies.
- Integration Requirements evaluate the total cost of ownership, including technical prerequisites, resource requirements, and implementation timelines.
- Lifecycle Management defines vendor responsibilities for ongoing support, monitoring, and maintenance after implementation.
By standardizing expectations for the information needed in procurement decision-making, the framework aims to enhance transparency and promote safer healthcare AI adoption. It serves both as a best-practice guide and a customizable resource to support healthcare delivery organizations in procuring vendor-developed AI systems.
Vega Health Co-founder and CEO Mark Sendak, M.D., M.P.P., was part of the framework’s development team. On LinkedIn, he explained the significance of the new resource. He wrote that “Model information labels/model cards are great resources for front-line clinicians who need a high-level synthesis of what an AI solution is, how it was built, and how it should be used.” But he added that the model information label/model card is not sufficient information to guide procurement and implementation decisions. “The information needed for these stakeholders is much more comprehensive across domains and much more detailed,” Sendak wrote. “Most folks don’t appreciate the difference. Hence, through Health AI Partnership we pulled together a group of leaders across multiple institutions who were already building out these vendor assessments precisely because the information provided by vendors was insufficient for procurement decisions.”
Before founding Vega, which seeks to curate a marketplace of healthcare AI solutions proven safe and effective in real-world settings, Sendak was a population health and data science lead at the Duke Institute for Health Innovation.
