Hippocratic AI teams up with Nvidia
Hippocratic AI has announced a partnership with technology firm Nvidia to support the development of healthcare artificial intelligence (AI).
Hippocratic will use the Nvidia AI platform to enable its large language model (LLM) healthcare agent to hold “super-low-latency voice interactions” with patients – something the company says is “required … to build an emotional connection”.
Hippocratic cited its own research, which found that every half-second of improvement in inference speed increases patients’ ability to connect emotionally with AI healthcare agents by 5–10% or more.
By inference latency, Hippocratic means the delay between the moment the AI receives an input and the moment it generates an output – in this case, between the patient speaking and the AI producing its spoken response.
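To illustrate the idea, the minimal Python sketch below times the gap between receiving a transcribed patient utterance and returning the agent’s reply. The generate_reply function is a hypothetical stand-in for an LLM call and does not reflect Hippocratic’s or Nvidia’s actual technology.

    import time

    def handle_utterance(transcribed_text, generate_reply):
        # Measure inference latency: the gap between receiving the
        # patient's utterance and producing the agent's reply.
        start = time.perf_counter()
        reply = generate_reply(transcribed_text)  # hypothetical LLM call
        latency_s = time.perf_counter() - start
        return reply, latency_s

    # Example with a stand-in reply function:
    reply, latency = handle_utterance(
        "I'd like to reschedule my appointment.",
        generate_reply=lambda text: "Of course, which day works best for you?",
    )
    print(f"Inference latency: {latency:.3f} s")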
LLMs are a form of AI trained on large amounts of data to recognise and generate text and spoken language.
Automating low-risk, non-diagnostic tasks
Hippocratic AI aims to augment the work of human healthcare workers by automating “low-risk, non-diagnostic, patient-facing tasks over the phone”.
“With generative AI, patient interactions can be seamless, personalised, and conversational – but in order to have the desired impact, the speed of inference has to be incredibly fast,” said Munjal Shah, Co-Founder and CEO of Hippocratic AI.
“We’re working with Nvidia to continue refining our technology and amplify the impact of our work on mitigating staffing shortages, while enhancing access, equity, and patient outcomes,” he added.
AI is increasingly being adopted to improve the efficiency of administrative processes. Earlier this month, insurer ERV Nordic contracted Simplifai to introduce AI into its travel insurance claims handling processes.