With the public health system increasingly strained, more people than ever are taking their health into their own hands through digital health apps. From Doctor on Demand, which puts patients in contact with doctors without the patient ever leaving the house, to apps like ADA that put medical diagnosis in the hands of AI algorithms, there is an app for everything in the Digital Age. But what if an app gives an incorrect diagnosis? A medical doctor who incorrectly diagnoses a patient may be sued for medical malpractice. Can a person sue an app for a misdiagnosis? In this article, we look at what happens when a digital health app gets it wrong.
Medical Diagnosis Apps: How Algorithms Make Decisions
Medical diagnosis apps aim to take information supplied by an app user and, using machine learning, identify any possible health conditions the user may have. Machine learning is a type of artificial intelligence in which a machine is taught to reason in a way similar to a person. To achieve this, machine learning uses supervised learning techniques to explore data and identify patterns. In the case of medical diagnosis apps, by identifying correlations in the data, the app can draw conclusions about what medical condition a person may be suffering from based on the symptoms and experiences of other patients. Some of these apps can even recommend treatment options.
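The pattern-matching idea behind these apps can be illustrated with a deliberately simplified sketch. The symptom data, condition labels, and `suggest_condition` function below are all invented for illustration; real diagnosis apps are trained on large clinical datasets with far more sophisticated models.

```python
from collections import Counter

# Toy "training data": past cases as (reported symptoms, diagnosed condition).
# Entirely illustrative -- not real medical data.
RECORDS = [
    ({"fever", "cough", "fatigue"}, "flu"),
    ({"fever", "cough", "sore throat"}, "flu"),
    ({"sneezing", "runny nose", "sore throat"}, "common cold"),
    ({"sneezing", "runny nose", "cough"}, "common cold"),
    ({"headache", "nausea", "light sensitivity"}, "migraine"),
]

def suggest_condition(symptoms, k=3):
    """Rank past cases by symptom overlap (Jaccard similarity) and
    return the most common condition among the k closest matches."""
    scored = sorted(
        RECORDS,
        key=lambda rec: len(symptoms & rec[0]) / len(symptoms | rec[0]),
        reverse=True,
    )
    top = [condition for _, condition in scored[:k]]
    return Counter(top).most_common(1)[0][0]

print(suggest_condition({"fever", "cough"}))  # prints "flu"
```

The point of the sketch is the decision-making shape, not the medicine: the app compares a new user's symptoms against the recorded experiences of others and outputs the condition that best fits the pattern, which is exactly where a bad match becomes a misdiagnosis.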
Is Misdiagnosis Considered Medical Malpractice?
What happens when a medical diagnosis app gives the wrong diagnosis or treatment options to a patient and that patient is injured as a result? In a typical medical malpractice suit, the injured party can hold the doctor who gave the wrong diagnosis responsible for their injuries, and a lawyer specializing in personal injury, such as Orlando Personal Injury Attorney David Heil, may be able to assist them in obtaining compensation. With over 35 years of experience in personal injury law, David Heil of Heil-law.com would be able to advise whether there is a case and whether it would be worthwhile to pursue. In the case of a misdiagnosis by an app, however, who is considered responsible?
In a typical failure to diagnose case, an attorney will need to prove three things: that a doctor-patient relationship existed, that the doctor did not meet the standard of care required to diagnose the patient’s condition, and that the failure to diagnose resulted in an injury. Unfortunately, because no doctor-patient relationship exists between an app and its user, an app’s failure to diagnose a person would not be considered medical malpractice. However, app developers or producers may still face mass tort liability. For more detailed information, realjustice.com may be the go-to site; its mass tort experts may suggest an alternative route for victims of app failure to receive rightful compensation.
So, Who Can an Injured App User Sue for Damages?
According to legal experts, if a person were injured as a result of an AI misdiagnosis, they may be able to seek damages from the app’s producer under the Liability for Defective Products Act of 1991. Unfortunately, because of some of the caveats of this law, an attorney may have a difficult time establishing whether the developer of the app, the company that bought and sold the app, or even the app store where the app was sold counts as the app’s producer, and thus which party would be liable.
Ultimately, until more cases of this nature are brought before the courts, there really isn’t much precedent to go on, so no definite answer can be given. However, if you or a loved one has been misdiagnosed by a health app, you should consider consulting a legal expert to find out what options you may have.