
Frontiers of Information Technology & Electronic Engineering, 2021, Volume 22, Issue 11. doi: 10.1631/FITEE.2000615

A distributed stochastic optimization algorithm with gradient-tracking and distributed heavy-ball acceleration

Affiliation(s): Chongqing Key Laboratory of Nonlinear Circuits and Intelligent Information Processing, College of Electronic and Information Engineering, Southwest University, Chongqing 400715, China; College of Data Science and Information Engineering, Guizhou Minzu University, Guiyang 550025, China

Received: 2020-11-08 Accepted: 2021-11-15 Available online: 2021-11-15


Abstract

Distributed stochastic optimization has been well developed in recent years owing to its wide applications in machine learning and signal processing. In this paper, we focus on minimizing a global objective that is the sum of smooth and strongly convex local cost functions distributed over an undirected network of nodes. In contrast to existing works, we apply a distributed heavy-ball term to improve the convergence performance of the proposed algorithm: to accelerate existing distributed stochastic first-order gradient methods, a momentum term is combined with a gradient-tracking technique. It is shown that the proposed algorithm has better acceleration ability than GT-SAGA without increasing the complexity. Extensive experiments on real-world datasets verify the effectiveness and correctness of the proposed algorithm.
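The abstract does not spell out the update rule, but the combination it describes can be illustrated with a generic gradient-tracking scheme augmented by a heavy-ball momentum term. The sketch below is a minimal, assumed implementation: the mixing matrix `W`, the local gradient oracles `grad_fns`, and the step-size `alpha` and momentum `beta` parameters are all hypothetical placeholders, and plain stochastic gradients stand in for the paper's GT-SAGA-style variance-reduced gradients.

```python
import numpy as np

def heavy_ball_gradient_tracking(W, grad_fns, x0, alpha=0.01, beta=0.5, iters=500):
    """Minimal sketch of gradient tracking with a heavy-ball momentum term.

    W        : (n, n) doubly stochastic mixing matrix of the undirected network
               (hypothetical; any consensus weights summing to 1 per row/column)
    grad_fns : list of n callables; grad_fns[i](x) returns a (stochastic)
               gradient of the local cost f_i at x
    x0       : (d,) common initial point
    """
    n = W.shape[0]
    x = np.tile(x0, (n, 1))            # local iterates, one row per node
    x_prev = x.copy()                  # previous iterates for the momentum term
    g = np.array([grad_fns[i](x[i]) for i in range(n)])
    y = g.copy()                       # gradient trackers, initialized to local gradients
    for _ in range(iters):
        # consensus step + descent along the tracked gradient + heavy-ball momentum
        x_new = W @ x - alpha * y + beta * (x - x_prev)
        g_new = np.array([grad_fns[i](x_new[i]) for i in range(n)])
        # gradient tracking: mix neighbors' trackers, correct with local gradient change
        y = W @ y + g_new - g
        x_prev, x, g = x, x_new, g_new
    return x.mean(axis=0)              # average of the local solutions
```

In this sketch, the tracker `y` mixes neighbors' estimates and adds each node's local gradient change, so the descent direction tracks the network-wide average gradient, while the `beta * (x - x_prev)` term supplies the heavy-ball acceleration the paper attributes its improved convergence to.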
