AI is becoming a powerful tool across industries, and its impact on healthcare is increasingly visible. In the wake of the pandemic, AI has also been used heavily in the fight against COVID-19. Despite the worries and concerns many people have about AI, combining it with biomedical engineering has produced some powerful tools.
Deep learning is a subset of machine learning and AI that loosely mimics the way the human brain processes information. Deep learning algorithms teach computers to learn by example, and they are a key enabler of driverless cars. In healthcare, deep learning has given people with motor neurone disease the ability to speak again, using voice-cloning algorithms to synthesise speech in their own voice, an application related to natural language processing (NLP).
Alongside these applications, recent research is improving how deep learning can be used in medical imaging and diagnosis.
Google’s DeepMind AI can detect breast cancer as well as doctors can
About 1 in every 8 women in the US will be diagnosed with breast cancer in her lifetime, and an estimated 3.5 million US women have a history of the disease. Chances of survival are much better when the cancer is spotted and diagnosed early. Currently, the first stage in screening for breast cancer is a mammogram, an X-ray image of the breast. Mammograms have limitations, including false negative and false positive results: approximately 1 in 5 breast cancers are missed by screening mammograms, and a misdiagnosis can delay treatment. As such, there is a need for a more accurate screening method.
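To make the screening terms concrete, here is a minimal Python sketch of how false negative and false positive rates are computed from screening outcomes. The counts are made up for illustration and are not data from any real screening programme:

```python
# Hypothetical outcomes for 1,000 screened women (illustrative only).
true_positives = 8    # cancers correctly flagged by the mammogram
false_negatives = 2   # cancers the mammogram missed
false_positives = 70  # healthy women incorrectly flagged
true_negatives = 920  # healthy women correctly cleared

# False negative rate: share of actual cancers that screening missed.
fn_rate = false_negatives / (true_positives + false_negatives)

# False positive rate: share of healthy women incorrectly flagged.
fp_rate = false_positives / (false_positives + true_negatives)

print(f"False negative rate: {fn_rate:.0%}")  # 2 of 10 cancers missed -> 20%
print(f"False positive rate: {fp_rate:.1%}")
```

A "1 in 5 missed" figure corresponds to a false negative rate of 20%, as in this toy example; the DeepMind results discussed below are reported as reductions in these two error rates.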
This is where Google’s DeepMind may have a solution. The DeepMind algorithm was trained on mammograms from around 76,000 women in the UK and 15,000 women in the US. Once trained, it was fed a further 25,000 scans from the UK and another 3,000 from the US. The AI then calculated a cancer risk for each scan, and its output was compared with the diagnoses made by radiologists. The results showed a reduction in false positives of 5.7% in the US and 1.2% in the UK, and a reduction in false negatives of 9.4% and 2.7% in the US and UK, respectively.
Computed tomography (CT) imaging
CT scanning is a common procedure in most hospitals. It involves the use of multiple x-ray images combined to form a 2D or 3D cross-sectional image of the body. CT scans are used to detect damage to bones and organs as well as identify problems with blood flow. They are also used to diagnose and monitor the growth of cancers.
In a paper published in Nature Biomedical Engineering, researchers showed that a deep learning algorithm could be trained to predict a 3D CT image of a patient from a single 2D X-ray projection.
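The full network in that paper is well beyond a short snippet, but the forward operation it inverts is easy to sketch. Under a simple parallel-beam assumption, an X-ray projection can be modelled as summing a 3D volume of attenuation values along the beam direction; the deep learning model learns the far harder inverse mapping, from that 2D image back to a 3D volume. A toy NumPy illustration (the volume and geometry here are assumptions for demonstration, not the paper's setup):

```python
import numpy as np

# Toy 3D "body": a 32x32x32 volume of attenuation values,
# with a denser sphere embedded in a uniform background.
size = 32
z, y, x = np.mgrid[:size, :size, :size]
volume = np.where(
    (z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2 < 8 ** 2, 1.0, 0.1
)

# A parallel-beam X-ray projection: integrate attenuation along
# the beam direction (here the z axis), giving a flat 2D image.
projection = volume.sum(axis=0)

print(volume.shape)      # 3D input:  (32, 32, 32)
print(projection.shape)  # 2D output: (32, 32)
```

A conventional CT scan combines many such projections taken from different angles; the cited work instead trains a network to predict the full 3D volume from a single projection like this one.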
Currently, a CT scan typically takes between 15 and 30 minutes. There is often concern about the use of X-ray imaging in hospitals because of the radiation involved. Even though the exposure from a single scan is too small to cause significant immediate harm, there is still a drive to reduce both scan times and radiation dose.
Deep learning could help greatly on both counts. Because AI can construct a 3D representation of the body from very little input, lower-dose CT scans become feasible, subjecting patients to less radiation. Lowering the dose normally degrades the quality of the final image, which is exactly the gap deep learning reconstruction can help close.
The potential of AI in medical imaging is clear. Early recognition of cancer is crucial to a patient's chances of beating it, and researchers have shown that AI could increase both the accuracy and the speed of diagnosis. However, integration of new technology tends to happen more slowly in healthcare than in other industries, largely due to a lack of acceptance and trust.
This is a sponsored post