Secure Federated Evolutionary Optimization—A Survey

Qiqi Liu, Yuping Yan, Yaochu Jin, Xilu Wang, Peter Ligeti, Guo Yu, Xueming Yan

Engineering, 2024, Vol. 34, Issue 3: 23-42. DOI: 10.1016/j.eng.2023.10.006

Review

Abstract

With the development of edge devices and cloud computing, the question of how to accomplish machine learning and optimization tasks in a privacy-preserving and secure way has attracted increased attention over the past decade. As a privacy-preserving distributed machine learning method, federated learning (FL) has become popular in the last few years. However, data privacy issues also arise when solving optimization problems, a topic that has so far received little attention. This survey is concerned with privacy-preserving optimization, with a focus on privacy-preserving data-driven evolutionary optimization. It aims to provide a roadmap from secure privacy-preserving learning to secure privacy-preserving optimization by summarizing the security mechanisms and privacy-preserving approaches that can be employed in machine learning and optimization. We provide a formal definition of security and privacy in learning, followed by a comprehensive review of FL schemes and cryptographic privacy-preserving techniques. We then present ideas on the emerging area of privacy-preserving optimization, ranging from privacy-preserving distributed optimization to privacy-preserving evolutionary optimization and privacy-preserving Bayesian optimization (BO). We further provide a thorough security analysis of BO and evolutionary optimization methods from the perspective of inference attacks and active attacks. Building on the above, we discuss in depth which FL and distributed optimization strategies can be used in the design of federated optimization and what additional requirements must be met to realize them. Finally, we conclude the survey by outlining open questions and remaining challenges in federated data-driven optimization. We hope this survey provides insight into the relationship between FL and federated optimization and promotes research interest in secure federated optimization.
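The abstract describes a route from federated learning to federated data-driven optimization. As a rough, self-contained illustration of the ingredients it names (and not an algorithm from any of the surveyed papers), the Python sketch below has clients fit surrogate models on their private data, uses a toy pairwise additive-masking scheme as a stand-in for secure aggregation so the server only observes the averaged surrogate, and runs a simple (1+1) evolutionary search on the aggregated surrogate to propose the next candidate solution. All names (f_true, secure_sum, local_fit, and so on) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_true(x):
    """Black-box objective known only through each client's private evaluations."""
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def features(x):
    """Quadratic feature map so a linear model can act as a cheap surrogate of f_true."""
    return np.concatenate([x ** 2, x, [1.0]])

def local_fit(X, y):
    """Fit a surrogate on a client's private data; only the weights leave the client."""
    Phi = np.array([features(x) for x in X])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def secure_sum(vectors, rng):
    """Toy pairwise additive masking: the masks cancel in the sum, so the server learns
    only the aggregate of the clients' surrogate weights, never an individual update."""
    n = len(vectors)
    masks = {(i, j): rng.normal(size=vectors[0].shape)
             for i in range(n) for j in range(i + 1, n)}
    masked = []
    for i, v in enumerate(vectors):
        m = v.copy()
        for j in range(i + 1, n):
            m += masks[(i, j)]
        for j in range(i):
            m -= masks[(j, i)]
        masked.append(m)            # this masked vector is all the server sees from client i
    return np.sum(masked, axis=0)   # equals the true sum because the masks cancel

# Each client privately evaluates the expensive objective and fits a local surrogate.
clients = []
for _ in range(4):
    X = rng.normal(size=(60, 2))
    y = np.array([f_true(x) for x in X])
    clients.append((X, y))
local_models = [local_fit(X, y) for X, y in clients]

# FedAvg-style aggregation of the local surrogates under the masking scheme above.
global_w = secure_sum(local_models, rng) / len(local_models)

def surrogate(x):
    return float(global_w @ features(x))

# A (1+1) evolutionary search on the aggregated surrogate proposes the next candidate
# solution without any raw data leaving the clients.
x = rng.normal(size=2)
for _ in range(500):
    child = x + 0.1 * rng.normal(size=2)
    if surrogate(child) < surrogate(x):
        x = child
print("candidate suggested by the federated surrogate:", np.round(x, 3))  # ~ [1.0, -0.5]
```

In a realistic setting the least-squares surrogate would be replaced by, for example, a Gaussian process or a neural network, and the toy masking by a proper secure aggregation or secret-sharing protocol; the sketch only shows how the pieces fit together.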

Keywords

Federated learning / Privacy-preservation / Security / Evolutionary optimization / Data-driven optimization / Bayesian optimization

Cite this article

Qiqi Liu, Yuping Yan, Yaochu Jin, et al. Secure Federated Evolutionary Optimization—A Survey. Engineering. 2024, 34(3): 23-42. https://doi.org/10.1016/j.eng.2023.10.006

