|As of now, big data and AI are being used in hospitals and other areas of healthcare and diagnostics. / Photo by: Daniil Peshkov via 123rf|
Big data and artificial intelligence go hand in hand: big data supplies the millions of records and measurements, and AI provides the power to analyze them, which is why data scientists regard the pair as a mechanical giant.
AI isn’t a new concept, but big data is. Forbes, a global media company focusing on business, investing, technology, entrepreneurship, leadership, and lifestyle, reported that global big data market revenues for software and services are expected to grow to $103 billion by 2027, up from $42 billion in 2018, a compound annual growth rate (CAGR) of about 10.48 percent. A study conducted by Accenture, a multinational professional services company that provides services in strategy, consulting, digital, technology, and operations, revealed that 79 percent of enterprise executives believe that businesses that do not welcome big data could face extinction.
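The growth rate follows directly from the two market figures. As a quick sanity check, compounding $42 billion forward to $103 billion over the nine years from 2018 to 2027 reproduces the roughly 10.48 percent CAGR cited above:

```python
# Verify the reported CAGR: a $42B market in 2018 reaching $103B in 2027.
start_value = 42.0    # market size in 2018, in billions of dollars
end_value = 103.0     # projected market size in 2027
years = 2027 - 2018   # nine compounding periods

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"CAGR: {cagr:.2%}")  # → CAGR: 10.48%
```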
Also, advanced analytics company McKinsey Analytics’ study “Analytics Comes of Age” reported that almost 50 percent of respondents believe that big data and analytics have fundamentally changed business practices in their sales and marketing functions. At the same time, 44.3 percent of respondents said big data is creating new avenues for innovation and disruption in their enterprises, while 49.2 percent said it is decreasing their expenses.
MIT Sloan Management Review referred to the convergence of big data and AI as “the single most important development that is shaping the future of how firms drive business value from their data and analytics capabilities.” Each empowers the other, to the benefit of businesses and industries alike. Data is the fuel that powers AI: with big data, machine learning applications can use large data sets to learn independently and rapidly.
One of the industries that will benefit the most is healthcare. As of now, big data and AI are being used in hospitals and other areas of healthcare and diagnostics. For instance, cognitive computing systems are used to facilitate repetitive processes. They are also being used to create sample analyses in the context of radiological diagnostics.
Big Data and AI in Curing Diseases
The AI health market is projected to reach $6.6 billion by 2021, a compound annual growth rate of 40 percent. The increased use of big data in the field can drive more personalization and transformation for both medical professionals and patients. The fusion of big data and AI not only improves medical services but can also help cure diseases.
Researchers from the Icahn School of Medicine at Mount Sinai developed an algorithm that used big data to compare gathered patient information against historic groups of patients who had been diagnosed and treated for the same diseases, then determine the course of treatment likely to have the most positive impact. The data came from large patient populations and a variety of sources, including epigenetics, metabolomics, proteomics, and DNA and RNA sequencing.
In another study, researchers at the University of Arizona College of Medicine created a new computer program to personalize drug treatments for patients using genetic information. The researchers integrated data from millions of patients. Using big data, they were able to take genetic data from new patients and align it with historic patient groups, allowing them to predict how diseases will progress in new patients and how those patients will respond to treatments.
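Neither paper’s code is described here, but the core idea of aligning a new patient with historic groups can be sketched as a nearest-cohort lookup: summarize each historic group by an averaged genetic-marker profile and its observed treatment response, then assign a new patient to the closest profile. The cohort names, marker values, and response rates below are invented for illustration only:

```python
import math

# Hypothetical historic cohorts: each is summarized by an averaged
# genetic-marker profile and the response rate observed for a treatment.
historic_cohorts = {
    "cohort_A": {"profile": [0.9, 0.1, 0.4], "response_rate": 0.82},
    "cohort_B": {"profile": [0.2, 0.8, 0.6], "response_rate": 0.35},
    "cohort_C": {"profile": [0.5, 0.5, 0.9], "response_rate": 0.60},
}

def predict_response(patient_profile):
    """Return the nearest historic cohort and its observed response rate."""
    name = min(
        historic_cohorts,
        key=lambda c: math.dist(patient_profile, historic_cohorts[c]["profile"]),
    )
    return name, historic_cohorts[name]["response_rate"]

cohort, rate = predict_response([0.85, 0.15, 0.5])
print(cohort, rate)  # → cohort_A 0.82
```

Real systems replace the toy Euclidean distance with statistical models over thousands of genomic features, but the matching-then-predicting structure is the same.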
These studies showed how accumulating large amounts of patient data can help researchers study different diseases and find better treatments and cures. According to another article by Forbes, AI and big data also give the healthcare industry the ability to determine why some drugs work for one population and not for others. For instance, through the increasing availability of healthcare data and the rapid progress of analytics techniques, a study discovered that the blood thinner clopidogrel (Plavix) doesn’t work in about 75 percent of Pacific Islanders.
|The researchers from the University of Arizona College of Medicine created a new computer program to personalize drug treatments for patients using genetic information. / Photo by: ra2studio via 123rf|
Predicting Which Scientific Advances Are Likely to Translate to the Clinic
Often, scientific discoveries remain on paper for decades. They are not immediately integrated into clinical applications, which delays the help they can provide. This is what researchers at the Office of Portfolio Analysis (OPA), led by George Santangelo at the National Institutes of Health (NIH), are hoping to address.
A recent study published in the open-access journal PLOS Biology aimed to shorten the sometimes decades-long interval between scientific discovery and clinical application. Science Daily, an American website that aggregates and publishes lightly edited press releases about science, reported that the researchers came up with a novel metric called "Approximate Potential to Translate" (APT) to make predictions based on a scientific paper’s content and the articles that cite it.
While experts believe that numbers should never substitute for expert evaluation, researchers can use this machine learning model to focus their attention on areas of science with strong signatures of translational potential, and it can serve as one component of data-driven decision-making to accelerate biomedical progress. The researchers have also created the NIH Open Citation Collection (NIH-OCC) to facilitate reproducibility and increase transparency in the scientific community. The NIH-OCC currently contains over 420 million citation links, which the researchers will update every month.
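The published APT metric comes from a trained model predicting whether a paper will eventually be cited by clinical research, based on the paper’s content and its citing articles. As a rough, purely illustrative sketch of that kind of score, one can imagine a logistic function over a handful of citation and content features; the feature names and weights below are invented, not the NIH model’s:

```python
import math

def apt_score(features, weights, bias=-1.0):
    """Toy logistic score in [0, 1] from simple citation/content features."""
    z = bias + sum(weights[k] * features[k] for k in weights)
    return 1 / (1 + math.exp(-z))

# Hypothetical feature weights (illustration only).
weights = {
    "frac_clinical_citers": 3.0,   # share of citing papers that are clinical
    "human_study_terms": 1.5,      # content signal: mentions human subjects
    "animal_only_terms": -1.0,     # content signal: purely preclinical work
}

paper = {"frac_clinical_citers": 0.4, "human_study_terms": 1.0, "animal_only_terms": 0.0}
print(f"{apt_score(paper, weights):.2f}")  # → 0.85
```

A high score flags a paper whose citation pattern already leans clinical; the real metric is trained on far richer features across millions of publications.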
Indeed, big data has enabled successful applications of AI in healthcare. Aside from the sheer amount of healthcare data now available, big data analytics methods have also developed rapidly. Together, they allow powerful AI techniques to unlock the clinically relevant information hidden in vast quantities of data.