Nabla hit the ground running in 2024 with the close of $24M in Series B funding, vaulting the startup’s valuation to $180M less than a year after the US launch of its Nabla Copilot ambient AI assistant.
Nabla Copilot checks all the usual boxes for an automated clinical note solution, quickly transforming patient-provider conversations into note drafts that can be customized to meet different format preferences.
- Since the US rollout in March of last year, Nabla Copilot has grown to over 20k users at small practices and larger systems alike, mostly split between primary care physicians (50%), mental health providers (30%), and a mix of other specialties.
- While Paris-based Nabla maintains a strong position in the European market, it hasn’t wasted any time finding US customers, recently securing marquee partnerships with Permanente Medical Group and NextGen Healthcare.
Nabla’s approach to model development is where it starts to differentiate itself from a pack of equally hungry competitors like Abridge (which just closed its own Series B) and Nuance (which is full-speed-ahead with the deployment of DAX Copilot).
- Although Nabla has historically leveraged GPT-4 to power its backend, it’s now focused on migrating toward a combination of homegrown and open-source AI models like those championed by Meta AI Chief Yann LeCun, also an early investor.
- By constantly testing and fine-tuning different models for specific tasks, Nabla is aiming to be one of the most nimble companies in the medical scribe arena, while also sidestepping the hefty licensing fees charged by commercial models.
Beyond breaking its reliance on OpenAI, Nabla’s next step is to launch a new solution geared toward automatically generating billing codes, which could debut before the end of the quarter. Mandarin, Portuguese, and Russian translation features are also on this year’s roadmap, and would add to Nabla’s existing capabilities for English, French, and Spanish.
Nabla is making its agility the driving force behind its business strategy, turning away from generalist AI models in favor of a collection of narrower algorithms designed to excel at specific use cases. It now has another $24M to fuel the transition, and it hinted that a further $10M could be on the way as early as February.