Another major driver for the emergence of AI in healthcare is the availability of more powerful computers with stronger graphics processing units (GPUs), along with cloud computing. The analytical heavy lifting that AI algorithms perform requires substantial computing power. Cloud computing allows an algorithm to be deployed in one central location and receive the data it needs over the network. This makes implementing algorithms across many medical centers feasible, since you don’t have to visit each center and install your algorithm in its systems. It is important to keep in mind that even with cloud computing, some local IT work is needed to deploy a new algorithm, but if each center is only making connections to the cloud, there is considerably less work to do per site. This matters because the economics of developing and deploying models will need to make sense to attract companies.
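To make this architecture concrete, here is a minimal sketch in Python of how a medical center might send data to a centrally deployed model rather than hosting the model locally. The endpoint URL, authentication scheme, and payload fields are all hypothetical, not a real service.

```python
import requests

# Hypothetical cloud-hosted inference endpoint; the URL, key, and
# payload fields below are illustrative placeholders only.
ENDPOINT = "https://api.example-health-ai.com/v1/predict"
API_KEY = "replace-with-your-key"

def get_risk_score(patient_features: dict) -> float:
    """Send de-identified patient features to the central model
    and return its predicted risk score."""
    response = requests.post(
        ENDPOINT,
        json={"features": patient_features},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["risk_score"]

if __name__ == "__main__":
    score = get_risk_score({"age": 67, "systolic_bp": 142, "hba1c": 7.9})
    print(f"Predicted risk score: {score:.2f}")
```

The point is that each center only needs a thin client like this plus a network connection; the model itself, and any updates to it, live in one place.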
One of the challenges in developing and utilizing storage, analytics, and interpretive methods is the sheer volume of biomedical data that must be transformed, data that often resides on multiple systems and in multiple formats. For example, datasets used for tasks such as computational chemistry and molecular simulations, which help de-risk and advance molecules into development, contain millions of data points and require billions of calculations to produce an experimental output. To bring new therapeutics to market faster, scientists need to move targets through development more quickly and find more efficient ways to collaborate both inside and outside their organizations.
Improving access to data, securely and compliantly, while increasing its usability is critical to maximizing the opportunities to leverage analytics and machine learning. For large datasets in the R&D phase, large-scale cloud-based data transfer services can move hundreds of terabytes and millions of files at speeds up to 10 times faster than open-source tools. Storage gateways ensure experimental data is securely stored, archived, and available to other permissioned collaborators. Cloud-based hyperscale computing and machine learning let organizations collaborate across datasets, build global infrastructures that maintain data integrity, and more easily run machine learning analyses that accelerate discoveries and de-risk candidates sooner.
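As a small illustration of the storage side of this workflow, the sketch below uses the AWS SDK for Python (boto3) to archive an experimental result file in encrypted object storage where permissioned collaborators can retrieve it. The bucket name and file paths are hypothetical, and a real pipeline at terabyte scale would rely on a managed transfer service rather than individual uploads.

```python
import boto3

# Illustrative sketch only: bucket name and paths are made up.
s3 = boto3.client("s3")

def archive_experiment(local_path: str, experiment_id: str) -> None:
    """Upload a result file with server-side encryption so that
    permissioned collaborators can access it later."""
    s3.upload_file(
        local_path,
        "example-rnd-data-lake",  # hypothetical shared bucket
        f"experiments/{experiment_id}/results.csv",
        ExtraArgs={"ServerSideEncryption": "AES256"},
    )

archive_experiment("run_042/results.csv", "exp-042")
```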
The availability of open-source algorithmic development frameworks like TensorFlow also lowers the cost and technical barriers for companies. If each company had to build its own development tools from scratch, far fewer companies and institutions would be involved in AI in healthcare.
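To show how much of the barrier such frameworks remove, here is a minimal sketch of a classifier built with TensorFlow. The data is synthetic and the feature count is made up purely for illustration; the point is that defining, training, and evaluating a model takes only a handful of lines.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: 500 samples with 8 features and a
# made-up binary label; not a real clinical dataset.
X = np.random.rand(500, 8).astype("float32")
y = (X.sum(axis=1) > 4.0).astype("float32")

# A small feed-forward classifier defined with the Keras API.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

print(f"Training accuracy: {model.evaluate(X, y, verbose=0)[1]:.2f}")
```

Everything here, from the optimizer to the training loop, comes ready-made with the framework, which is exactly the development work each company would otherwise have to duplicate.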