A Privacy-Preserving Approach For Building Learning Models in Smart Healthcare using Blockchain and Federated Learning

Nowadays, the amount of data generated by Internet of Things (IoT) devices is growing rapidly, paving the way for the development of artificial intelligence (AI) applications. However, the traditional AI approach requires users to share their raw data, raising serious privacy concerns; violations of users' private data have already been reported. In the medical field, devices that automatically diagnose a user's disease are becoming a trend: they let users monitor their own health, thereby reducing the pressure on medical facilities that are often overloaded. However, individuals' healthcare data are highly sensitive and rarely shared by users. In addition, the profits generated by machine learning (ML) models mostly belong to the owners of those models, which further discourages users from sharing their data. In this article, we propose PriFL-Chain, a privacy-preserving framework that leverages the data resources of data owners to train ML models while protecting the owners' privacy. Specifically, we apply differential privacy (DP) to federated learning (FL): users share only the ML model trained on their data instead of the raw data itself. Furthermore, users' contribution activities are recorded on the Blockchain to ensure transparency. We also leverage Mobile Edge Computing (MEC) and the InterPlanetary File System (IPFS) to reduce the load on the central server and the cost of data communication, making the system more flexible. Experimental results demonstrate that the combined strategy of FL, Blockchain, IPFS, and MEC reduces the cost of training ML models, effectively protects privacy, and harnesses diverse data sources from the community.
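The core idea of combining DP with FL — clients share a perturbed model update rather than raw data — can be illustrated with a minimal sketch. The function name, the L2-clipping step, and the Gaussian noise scheme below are illustrative assumptions for exposition, not the exact PriFL-Chain protocol:

```python
import numpy as np

def dp_local_update(global_weights, local_weights,
                    clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Produce a differentially private model update for one FL client.

    Illustrative sketch (assumed scheme, not the paper's exact protocol):
    clip the client's model delta to a fixed L2 norm, then add Gaussian
    noise calibrated to the clipping bound before the update leaves the
    device.
    """
    rng = rng or np.random.default_rng()
    # The client's raw contribution: difference between its locally
    # trained weights and the current global model.
    delta = local_weights - global_weights
    # L2-clip the delta so one client's influence is bounded.
    norm = np.linalg.norm(delta)
    delta = delta * min(1.0, clip_norm / max(norm, 1e-12))
    # Gaussian mechanism: noise scale proportional to the sensitivity
    # (the clipping bound) times a privacy-controlled multiplier.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=delta.shape)
    # Only this noised update is shared; raw data never leaves the device.
    return delta + noise
```

In a full system, the server would aggregate many such updates (e.g. by averaging) while the privacy budget is accounted for across rounds; here only the client-side perturbation step is shown.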