Privacy-Preserving Weight Reconstruction in Federated Learning: A Legal Governance Framework Based on Algorithmic Justice
DOI: https://doi.org/10.64229/h58nn265

Keywords: Federated Learning, Privacy Weighting, Algorithmic Justice, Legal Governance Framework, Dynamic Responsibility Allocation

Abstract
Taking as its theoretical basis the "right to be forgotten" established by the European Union's General Data Protection Regulation (GDPR) and the compliance requirements of China's Personal Information Protection Law (PIPL), this study combines normative legal analysis with technical deconstruction to examine the legal dilemmas inherent in the architecture of federated learning. It finds that the operation of federated learning systems produces a deep legal conflict between "algorithmic shadowing" and data subjects' control rights, and that traditional privacy-rights theories offer diminishing explanatory power for this new mode of data processing. On this basis, the paper proposes a "dynamic responsibility allocation" paradigm of legal governance: by delineating the rights and obligations of the participants in the federated learning ecosystem, including data providers, model trainers, and platform operators, it constructs a governance framework that satisfies the requirements of algorithmic justice. The framework seeks a dynamic balance between technological innovation and privacy protection, offering a theoretical foundation and workable institutional guidance for the compliant and sound development of federated learning.
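To make the technical premise of this conflict concrete, the sketch below shows a toy federated-averaging (FedAvg-style) aggregation round in Python. The client updates, dataset sizes, and the fed_avg helper are illustrative assumptions for exposition, not the aggregation scheme analyzed in the paper; the point is only that once clients' weight updates are blended into a global model, deleting a client's raw data does not remove its statistical trace from the already-aggregated weights.

```python
import numpy as np

# Illustrative only: a toy FedAvg-style aggregation round. The client
# updates, dataset sizes, and weighting scheme are assumptions for
# demonstration, not the scheme examined in the paper.

def fed_avg(client_updates, client_sizes):
    """Average client weight updates, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_updates, client_sizes))

rng = np.random.default_rng(0)

# Three hypothetical participants each send locally trained weights
# (random stand-ins here) to the coordinating server.
updates = [rng.normal(size=4) for _ in range(3)]
sizes = [1000, 500, 250]

global_weights = fed_avg(updates, sizes)
print("Global weights after aggregation:", global_weights)

# If the third client later withdraws and deletes its raw data, the
# already-published global weights still embed its contribution:
# re-aggregating without it yields a measurably different model,
# illustrating the residual influence the abstract terms
# "algorithmic shadowing".
without_third_client = fed_avg(updates[:2], sizes[:2])
print("Difference had the third client never participated:",
      np.linalg.norm(global_weights - without_third_client))
```

This residual, data-derived influence in shared weights is the gap between technical deletion and legal erasure that the proposed "dynamic responsibility allocation" framework is meant to govern.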
References
[1] European Union. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, pp. 1–88).
[2] Gu Yuhao, Bai Yuebin. Research Progress on Security and Privacy in Federated Learning Models [J]. Journal of Software, 2023, 34(6): 2833-2864.
[3] Bai Jinlong, Cao Lifeng, Wan Jiling, et al. Research Progress on Blockchain Privacy Protection Technologies [J]. Computer Engineering and Applications, 2025, 61(2): 19-36.
[4] Wu Hong, Zhao Chang. From Empowerment to Governance: China's Response to Digital Privacy Protection [J]. Seeking Truth, 2025(2): 68-80.
Chen Lei, Liu Wenmao. Frontiers and Applications of Data Security Technology from a Compliance Perspective [J]. Frontiers of Data and Computing Development, 2021, 3(3): 19-31.
[5] National People's Congress of the People's Republic of China. (2021). Personal Information Protection Law of the People's Republic of China (PIPL).
[6] He Wen, Bai Hanru, Li Chao. Exploring Enterprise Data Sharing Based on Federated Learning [J]. Information and Computers, 2020(8): 173-176.
[7] Qin Peng, Shanguan Lili. Federated Learning for New Applications in Privacy Computing [J]. China Telecommunications Industry, 2024(2): 77-80.
[8] Liu Yixuan, Chen Hong, Liu Yuhan, et al. Privacy Protection Techniques in Federated Learning [J]. Journal of Software, 2022, 33(3): 1057-1092.
[9] Zheng Zhifeng. Privacy Protection in the Era of Artificial Intelligence [J]. Legal Science (Journal of Northwest University of Political Science and Law), 2019(2): 51-60.
[10] Deng Jianpeng, Zhao Zhison. The Breakthrough and Transformation of DeepSeek: On the Regulatory Direction of Generative Artificial Intelligence [J]. Journal of Xinjiang Normal University (Philosophy and Social Sciences Edition), 2025, 46(4): 99-108. DOI: 10.14100/j.cnki.65-1039/g4.20250214.001.
License
Copyright (c) 2025 Zikang Zhang (Author)

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.