
Communication/Computation Tradeoffs in Consensus-Based Distributed Optimization

19 November 2013

15:30 to 17:00

ENS Rennes, Salle du conseil

Talk by Michael Rabbat (McGill University).
Seminar of the Computer Science and Telecommunications department.

This talk considers the problem of distributed optimization. Each node in a network of processors has access to a convex function, and the nodes must cooperate to find a minimizer of the sum of these functions. As a motivating application, we consider machine learning problems where a large dataset is partitioned among the nodes and their goal is to identify a model that fits the entire dataset well. We study consensus-based optimization methods in which nodes iteratively perform local gradient descent steps followed by a consensus step. In the consensus step, each node passes messages to its neighbors containing information about its new local gradient, and these messages are fused to drive the state at each node toward agreement on a minimizer of the overall objective function. We focus on understanding the tradeoff between communication and computation in terms of the time needed to reach a desired level of accuracy. Surprisingly, a speedup can be obtained by communicating less and less frequently as the computation progresses.
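As an illustration of the consensus-based scheme described above, the following is a minimal sketch (not taken from the talk) of distributed gradient descent on a toy problem: five nodes on a ring each hold a private quadratic f_i(x) = ½(x − b_i)², whose sum is minimized at the mean of the b_i. Each iteration combines a local gradient step with a consensus averaging step over a doubly stochastic mixing matrix. The network topology, step size, and iteration counts are all assumptions chosen for the example.

```python
import numpy as np

# Sketch of consensus-based distributed gradient descent on a 5-node ring.
# Node i privately holds f_i(x) = 0.5 * (x - b[i])**2; the minimizer of
# the sum is the mean of b. (Toy problem chosen for illustration.)
rng = np.random.default_rng(0)
n = 5
b = rng.normal(size=n)   # each node's private data
x = np.zeros(n)          # local estimates, one per node

# Doubly stochastic mixing matrix for the ring: each node averages its own
# value with those of its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

step = 0.1
for _ in range(200):
    grad = x - b               # local gradient at each node
    x = W @ (x - step * grad)  # descent step, then consensus averaging

# With a constant step size the nodes only agree up to an O(step) bias,
# so finish with pure consensus rounds to average the local estimates.
for _ in range(300):
    x = W @ x

print(np.allclose(x, b.mean(), atol=1e-6))  # nodes agree on the minimizer
```

The final consensus-only rounds reflect the general point of the talk: how often the nodes communicate, relative to how much local computation they do, determines how fast the disagreement between nodes shrinks compared to the optimization error.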
François Schwarzentruber

Updated on 9 September 2019