
UK’s MHRA says it has ‘concerns’ about Babylon Health – and flags legal gap around triage chatbots

Natasha Lomas
Senior Reporter, TechCrunch
March 5, 2021

Concerns Raised Regarding Babylon Health by U.K. Medical Device Regulator

The United Kingdom’s medical devices regulator has admitted it has concerns about Babylon Health, the venture-backed maker of AI-powered health chatbots. The admission came in a letter to a doctor who has been raising concerns about Babylon’s approach to patient safety and corporate governance since 2017.

Details from the MHRA Letter

The Health Service Journal (HSJ) first reported on the letter, sent by the Medicines and Healthcare products Regulatory Agency (MHRA) to Dr. David Watkins. TechCrunch has also reviewed the document, which is dated December 4, 2020, along with further context on discussions at a meeting referenced in the letter and other correspondence between Watkins and the regulator detailing numerous concerns.

Speaking to TechCrunch, Dr. Watkins emphasized that the regulator’s concerns extend “far beyond” the single, albeit significant, issue of chatbot safety.

Corporate Governance and Ethical Considerations

“The core of the issue lies in the company’s corporate governance – specifically, their handling of safety concerns and their response to individuals who raise them,” Watkins explained to TechCrunch. “This is the primary worry, alongside ethical questions surrounding the promotion of medical devices.”

“The central narrative is that a demonstrably flawed product was promoted. Misleading assertions were made regarding the chatbot’s appropriate application – its intended purpose – with [Babylon CEO] Ali Parsa presenting it as a ‘diagnostic’ tool, which was never accurate. The chatbot never received approval for ‘diagnosis.’”

Regulatory Shortcomings and Misleading Claims

“In my assessment, the MHRA should have adopted a more assertive position with Babylon in 2018, clearly communicating to the public that the claims being made were inaccurate and that the technology was not authorized for use as Babylon was portraying,” he continued. “This action should have been taken, but it wasn’t, due to deficiencies in the regulatory framework at the time.”

“Currently, there is no formal ‘approval’ process for these technologies, and existing legislation does not mandate ethical conduct from companies,” Watkins also stated. “We depend on the health technology sector to operate with responsibility.”

Early Warnings and Escalating Concerns

The consultant oncologist began raising concerns with U.K. healthcare regulators (the CQC and MHRA) as early as February 2017, initially focused on the “apparent lack of rigorous clinical testing or validation,” as he put it in his correspondence. But as Babylon continued to deny problems and actively push back against criticism, his concerns deepened.

Validation of Watkins’ Concerns

The regulator’s acknowledgement that Watkins’ concerns are “valid” and “shared” effectively undermines Babylon’s attempts to deflect criticism through public relations efforts.

Babylon’s Regulatory Compliance

“Babylon cannot assert consistent adherence to regulatory requirements – there have been instances of non-compliance throughout the development of their system,” Watkins also noted, adding: “Babylon did not prioritize safety concerns adequately, which has prolonged this issue for over three years.”

Global Expansion and Digitization Deals

Throughout this period, the company has aggressively pursued extensive “digitization” agreements with healthcare providers worldwide, including a ten-year contract with the U.K. city of Wolverhampton to deliver an integrated application reaching 300,000 individuals.

International Markets and Patient Reach

Babylon also maintains a ten-year agreement with the Rwandan government to support the digitization of its healthcare system, incorporating digitally-enabled triage. Additional markets where the company has expanded include the United States, Canada, and Saudi Arabia.

Babylon reports serving over 20 million patients and completing 8 million consultations and “AI interactions” globally. However, a crucial question remains: is the company operating to the standards expected of a medical device manufacturer?

Safety, Ethical and Governance Concerns

A written summary dated October 22 details a video conference held between Watkins and the U.K. medical devices regulator on September 24 last year. The summary outlines the discussion, which specifically addressed “misleading statements, significant flaws, and Babylon’s efforts to deny or conceal safety problems.”

Watkins’ account of the meeting indicates: “There was a widespread consensus that Babylon’s corporate conduct and governance fell short of the standards expected for a medical device or healthcare provider.”

He also noted in the summary: “It was communicated that Babylon Health would not receive favorable treatment, considering their association with [U.K. health secretary] Matt Hancock.” This references Hancock’s public endorsement of Babylon’s “GP at hand” app, which previously led to accusations of violating the ministerial code in 2018.

In a separate document compiled and submitted to the regulator last year, Watkins details 14 areas of concern. These encompass issues like the safety of the Babylon chatbot’s triage process, “deceptive and inconsistent” terms and conditions that contradict promotional claims, and a “variety of ethical and governance issues.”

This includes a public campaign targeting Watkins, as previously reported, and what he describes as “legal threats intended to avoid scrutiny and negative media coverage.”

Watkins recalls that Babylon responded to safety concerns he raised in 2018 – concerns reported by the HSJ – by launching an attack, claiming that individuals with “vested interests” were spreading “false allegations” to undermine the company.

Watkins writes in associated commentary to the regulator: “These allegations were inaccurate, and it’s evident that Babylon deliberately misled HSJ readers, prioritizing their reputation over patient well-being.”

He further points out that in May 2018, the MHRA independently alerted Babylon Health to two incidents concerning chatbot safety – one involving overlooked heart attack symptoms, the other, DVT symptoms. Yet, the company still refuted the HSJ’s subsequent report (titled: “Safety regulators investigating concerns about Babylon’s ‘chatbot’”).

Additional governance and operational concerns raised by Watkins include Babylon’s use of staff NDAs, which he believes fosters a company culture where employees hesitate to voice safety concerns. He also cites “inadequate medical device vigilance,” arguing the Babylon bot doesn’t consistently seek feedback on patient outcomes post-triage, stating: “The lack of a robust feedback mechanism significantly hinders the ability to identify adverse events.”

Regarding staff opinions, Babylon’s Glassdoor rating is currently 2.9 stars. A minority of reviewers recommend the company, and Parsa’s approval rating as CEO is 45%. One Glassdoor reviewer, a current clinical ops associate in Vancouver, Canada, writes: “The technology is outdated and flawed.” A one-star review states: “The well-being of patients is not a priority. A real joke to healthcare. Best to avoid.”

According to Watkins’ report of his online meeting with the MHRA, the regulator acknowledged that NDAs are “problematic” and impede employees’ ability to speak up about safety issues.

He also states that it was recognized that Babylon employees might fear speaking out due to potential legal repercussions. His meeting minutes record: “It was mentioned that the MHRA can investigate concerns raised anonymously.”

Watkins’ summary of his concerns about Babylon also highlights an event in 2018, held in London to promote the chatbot. During this event, he alleges that several “misleading claims” were made, such as the assertion that the AI generates health advice comparable to “top-rated practicing clinicians.”

These bold claims generated significant media attention, helping Babylon attract hype and potentially secure investor funding.

The London-based startup was valued at over $2 billion in 2019 after raising $550 million in a Series C round from investors including Saudi Arabia’s Public Investment Fund, a large (unnamed) U.S. health insurance company and Munich Re’s ERGO Fund, in what it touted as the largest digital health funding round in Europe or the U.S.

Watkins writes to the regulator: “It should be noted that Babylon Health has not retracted or attempted to correct the misleading claims made at the AI Test Event [which continues to be used as promotional material on its website in certain regions]. Therefore, there remains an ongoing risk that the public will place undue trust in Babylon’s unverified medical device.”

His summary includes anonymous correspondence from individuals claiming to work (or have worked) at Babylon, making additional claims. One writes: “There is significant pressure from investors to demonstrate a return. Anything that slows that down is seen [as] avoidable.”

Watkins asserts in his summary to the regulator: “The allegations against Babylon Health are not false and were raised in good faith to protect patient safety. Babylon’s ‘repeated’ attempts to discredit me personally raise serious questions about their corporate culture and trustworthiness as a healthcare provider.”

The MHRA’s letter to Watkins states: “Your concerns are all valid and ones that we share.”

It also thanks him for raising issues publicly “at considerable personal risk.”

Babylon has been contacted for a response to the MHRA’s validation of Watkins’ concerns but had not responded at the time of writing.

The startup told the HSJ that it complies with all regulatory requirements in the countries where it operates, adding: “Babylon is committed to maintaining the highest standards of patient safety.”

In one earlier aggressive incident, Babylon issued a press release labeling Watkins a “troll” and attempting to discredit his work highlighting safety issues with its chatbot’s triage process.

It also claimed its technology had been “NHS validated” as a “safe service” ten times.

Watkins questions this validation process, writing to the MHRA: “As far as I am aware, the Babylon chatbot has not been validated – in which case, their press release is misleading.”

The MHRA’s letter clarifies that the current U.K. regulatory framework for software-based medical devices doesn’t adequately cover “health tech” devices like Babylon’s chatbot.

Watkins notes there is currently no approval process for such devices; they are merely entered in a register maintained by the MHRA, with no legal requirement for the regulator to assess them or to receive development documentation.

“You have raised a complex set of issues, and several aspects fall outside of our existing remit,” the regulator concedes in the letter. “This highlights issues we are exploring further as we develop a new regulatory framework for medical devices in the U.K.”

The update to pan-EU medical devices regulation, which had originally been slated for implementation in the U.K. in May last year, will no longer be transposed into national law following the country’s departure from the bloc.

The U.K. is developing its own regulatory update for medical device rules, meaning a gap remains around software-based “health tech” – not expected to be fully addressed for several years. (Watkins notes some regime adjustments, such as a partial lifting of confidentiality requirements last year.)

In a speech last year, health secretary Hancock stated the government aims to create a regulatory system for medical devices that is “nimble enough” to keep pace with tech advancements like health wearables and AI, while “maintaining and enhancing patient safety.” This includes granting the MHRA “a new power to disclose safety concerns about a device to the public.”

For now, the existing (outdated) regulatory regime appears to restrict the regulator’s ability to discuss safety concerns publicly; it is only because Watkins has made the MHRA’s letter public that this disclosure has been possible.

The MHRA writes in the letter that “confidentiality unfortunately prevents us from commenting further on any specific investigation,” but also assures him: “Please be assured that your concerns are being taken seriously, and we will take action if necessary.”

Watkins believes the regulator has engaged with Babylon regarding concerns he’s raised over the past three years, noting the company has made changes after his specific queries – such as to its T&Cs (initially stating it wasn’t a medical device, later revised to acknowledge it is) or claims of “100% safety” (withdrawn after intervention by the Advertising Standards Authority).

The chatbot itself has been modified to emphasize triage outcomes rather than diagnoses, according to Watkins.

“They’ve taken a piecemeal approach [to addressing safety issues with chatbot triage]. I would flag an issue [publicly via Twitter], and they would only address that specific issue. Patients of that age, undergoing that exact triage assessment – ‘okay, we’ll fix that, we’ll fix that’ – and they would implement a [specific fix]. But unfortunately, they never addressed the broader fundamental issues within the system. Consequently, safety issues repeatedly emerged,” he said, citing multiple cardiac triage issues he also raised with the regulator.

“When I spoke to people at Babylon, they used to have to implement these quick fixes… All they had to do was ‘dumb it down’ a bit. For example, for anyone with chest pain, it would immediately advise going to A&E. They would remove any decision-making process,” he added. (This also risks straining healthcare resources, as he points out to regulators.)

“That’s how they addressed these issues over time. But it highlights the challenges of developing these tools. It’s not easy. And if you try to do it quickly without sufficient attention, you end up with something useless.”

Watkins also suspects the MHRA influenced Babylon’s removal of certain hyperbolic promotional materials related to the 2018 AI event from its website.

During the 2018 event, Babylon’s CEO demonstrated an AI-powered interface featuring real-time transcription of patient speech combined with “emotion-scanning” AI – analyzing facial expressions to assess emotional state. Parsa stated: “That’s what we’ve done. That’s what we’ve built. None of this is for show. All of this will be either in the market or already in the market.”

However, neither feature has been launched by Babylon. Asked about this last month, the startup told TechCrunch: “The emotion detection functionality, seen in older versions of our clinical portal demo, was developed by Babylon’s AI team. Babylon conducts extensive user testing, which is why our technology is continually evolving to meet patient and clinician needs. After pre-market user testing with our clinicians, we prioritized other AI-driven features in our clinical portal over the emotion recognition function, focusing on improving our service’s operational aspects.”

“I found [the MHRA’s letter] very reassuring and strongly suspect that the MHRA has been engaging with Babylon to address concerns raised over the past three years,” Watkins told us today. “The MHRA doesn’t appear to have ignored the issues, but Babylon simply denies any problems and relies on confidentiality clauses.”

Asked about the current regulatory situation for software-based medical devices in the U.K., and about the concerns it shares regarding Babylon, the MHRA declined to be interviewed or to answer specific questions, stating: “The MHRA investigates all concerns but does not comment on individual cases.

“Patient safety is paramount, and we will always investigate safety concerns, including discussing them with those who report them,” it added.

Watkins raised one final important point regarding patient safety for “cutting-edge” tech tools – questioning where the “real-life clinical data” is. He says studies available are often limited assessments, frequently conducted by the chatbot developers themselves.

“One telling aspect of this sector is the lack of real-life data,” he said. “These chatbots have been around for several years now… and there’s been enough time to gather real-life clinical data, yet it hasn’t appeared. You wonder if that’s because, in real-world settings, they aren’t as useful as we believe?”

Update: Babylon Health has since sent us a statement in response to the MHRA’s letter to Watkins.


Natasha Lomas

Natasha was a senior reporter at TechCrunch from September 2012 to April 2025, reporting from Europe. Before joining TechCrunch she reviewed smartphones for CNET UK, and prior to that spent more than five years covering business technology for silicon.com (later folded into TechRepublic), where her beats included mobile and wireless technologies, telecoms and networking infrastructure, and IT skills. She also freelanced for organizations including The Guardian and the BBC. Natasha holds a First Class degree in English from Cambridge University and an MA in journalism from Goldsmiths College, University of London.