Hypothesis: an algorithmic system can be developed and tested that distributes training across multiple machines (federated learning), trains models on encrypted data shared across multiple owners (secure multi-party computation), and evaluates obfuscated training instances on one machine to be decoded by the user (homomorphic encryption), all while preserving the privacy and security of hospital data.
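As a concrete illustration of the first approach, the core of federated averaging (FedAvg) can be sketched in a few lines: each site fits a model on its private data and shares only its parameters, which a coordinator averages. This is a minimal sketch with simulated data and hypothetical helper names, not the lab's deployed system.

```python
import numpy as np

# Toy federated averaging (FedAvg) sketch: each "hospital" fits a local
# linear-regression weight vector on its private data, and only the
# weights (never the raw records) are shared with a coordinator.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # illustrative ground-truth weights

def local_update(n_samples):
    """Simulate one site: generate private data, fit via least squares."""
    X = rng.normal(size=(n_samples, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n_samples)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, n_samples

# Each site trains locally; only (weights, sample count) leave the site.
site_results = [local_update(n) for n in (50, 80, 120)]

# Coordinator: sample-weighted average of the local models (the FedAvg step).
total = sum(n for _, n in site_results)
global_w = sum(w * (n / total) for w, n in site_results)
print(global_w)  # close to [2.0, -1.0]
```

In a real deployment the local update would be stochastic gradient steps on a clinical model rather than a closed-form fit, but the privacy property is the same: raw patient records never leave the site.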
Across biomedical domains, disputes over data ownership within and across institutions have become a timely issue. Data security concerns make multi-institutional collaborations challenging to form, requiring mutual trust and goodwill between institutions. These issues are paramount in an information age where it is increasingly trivial to recapitulate details of the original patient population, endangering both patient-level and group-level privacy. Federated learning and cryptographic technologies such as secure multi-party computation and homomorphic encryption offer potential paths toward democratizing data access and advancing science while maintaining data security and privacy. However, real-world implementation of such systems is still hotly debated, as a system of incentives has yet to be developed. Preliminary studies have assessed their capacity to secure biomedical data such as DNA methylation profiles and whole-slide images (WSI). Our lab is working to implement additional methods, develop accompanying software frameworks, validate promising clinical models, and perform real-world testing across institutions.
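The cryptographic side can be illustrated with additive secret sharing, a basic building block of secure multi-party computation: a value is split into random shares so that no single server learns anything, yet servers can compute on shares locally and reveal only the aggregate. This is a toy sketch under simplifying assumptions (semi-honest servers, illustrative values), not a hardened protocol.

```python
import secrets

# Toy additive secret sharing mod a prime P. No individual share leaks
# information about the underlying value; only the sum of all shares does.
P = 2**61 - 1  # a Mersenne prime, large enough for the toy values below

def share(value, n_parties):
    """Split `value` into n additive shares mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % P
    return shares + [last]

def reconstruct(shares):
    return sum(shares) % P

# Two hospitals secret-share their (hypothetical) patient counts among
# three non-colluding servers.
a_shares = share(1234, 3)
b_shares = share(5678, 3)

# Each server adds the shares it holds locally; pooling only the output
# shares reveals the total without revealing either input.
sum_shares = [(a + b) % P for a, b in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 6912
```

Real MPC frameworks build multiplication, comparison, and full model training on top of this same share-and-compute pattern.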
Manuscripts: arXiv preprints coming soon!