Efficient learning strategies over distributed networks for big data
We study the problem of online learning over a distributed network, where agents collaboratively estimate an underlying parameter of interest from noisy observations. For such systems to be practical, they must remain communication- and computation-efficient while delivering comparable performance. To this end, we propose distributed online learning methods that are highly efficient in both computation and communication and outperform state-of-the-art approaches. In the first part of the thesis, we study distributed centralized estimation schemes, which require high communication bandwidth and impose a heavy computational load. We introduce a novel approach based on set-membership filtering to reduce these burdens. In the second part, we study distributed decentralized estimation schemes, where the nodes of the network individually and collaboratively estimate a dynamically evolving parameter from noisy observations. We present an optimal decentralized learning algorithm based on the disclosure of local estimates and prove that optimal estimation in such systems is possible only over certain network topologies. We then derive an iterative algorithm that recursively constructs the optimal combination weights and the corresponding estimate. Through a series of simulations over synthetic and real-life benchmark data, we demonstrate the superior performance of the proposed methods compared to state-of-the-art distributed learning algorithms. We show that the introduced algorithms achieve faster learning rates and lower steady-state error levels while placing far less communication and computation load on the system.
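To illustrate how set-membership filtering can reduce communication and computation, the sketch below implements a standard set-membership NLMS-style update, not the thesis's exact algorithm: each agent updates (and would therefore transmit) only when its prediction error exceeds an error bound, so most observations trigger no work at all. All parameters here (dimension, number of agents, the bound `gamma`, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K agents observe a common parameter w_true through
# noisy linear measurements d = u^T w_true + v.
M = 5              # parameter dimension (illustrative choice)
K = 10             # number of agents (illustrative choice)
T = 2000           # number of time steps
gamma = 0.5        # error bound defining the membership set (assumed)
w_true = rng.standard_normal(M)

w = np.zeros(M)    # fused estimate
updates = 0        # counts updates, i.e. how often a transmission would occur

for t in range(T):
    for k in range(K):
        u = rng.standard_normal(M)                 # agent k's regressor
        d = u @ w_true + 0.1 * rng.standard_normal()  # noisy observation
        e = d - u @ w                              # prediction error
        # Set-membership test: update only if the error exceeds the bound,
        # i.e. the current estimate lies outside the feasibility set.
        if abs(e) > gamma:
            mu = 1.0 - gamma / abs(e)              # data-dependent step size
            w = w + mu * e * u / (u @ u)           # projection-style update
            updates += 1

print(f"fraction of observations that triggered an update: {updates / (T * K):.3f}")
print(f"final estimation error: {np.linalg.norm(w - w_true):.3f}")
```

In this toy run, only a small fraction of the observations cause an update, while the estimate still converges to within the noise-induced error floor; this sparse-update behavior is the mechanism by which set-membership schemes save bandwidth and computation.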