Preprints, Working Papers, ... Year: 2018

Approximate message-passing for convex optimization with non-separable penalties

Abstract

We introduce an iterative optimization scheme for convex objectives consisting of a linear loss and a non-separable penalty, based on the expectation-consistent approximation and the vector approximate message-passing (VAMP) algorithm. Specifically, the penalties we approach are convex on a linear transformation of the variable to be determined, a notable example being total variation (TV). We describe the connection between message-passing algorithms, typically used for approximate inference, and proximal methods for optimization, and show that our scheme is, like VAMP, similar in nature to the Peaceman-Rachford splitting, with the important difference that step sizes are set adaptively. Finally, we benchmark the performance of our VAMP-like iteration on problems where TV penalties are useful, namely classification in task fMRI and reconstruction in tomography, and show faster convergence than state-of-the-art approaches such as FISTA and ADMM in most settings.
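To make the link between splitting methods and proximal steps concrete, the following is a minimal sketch, in Python with NumPy, of a classical Peaceman-Rachford splitting with a fixed step size for a squared loss plus a 1-D total-variation penalty. It is not the authors' VAMP-like scheme (whose key difference, per the abstract, is that step sizes are set adaptively); the finite-difference operator, the inner dual solver for the TV proximal step, and all parameter values are illustrative assumptions.

```python
# Sketch only: plain Peaceman-Rachford splitting with a FIXED step size gamma for
#     min_x  0.5 * ||y - A x||^2 + lam * ||D x||_1,
# where D is the 1-D finite-difference operator (total variation).
# The paper's VAMP-like iteration instead sets the step sizes adaptively.
import numpy as np

def make_diff_operator(n):
    """First-order finite-difference matrix D of shape (n-1, n)."""
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    return D

def prox_quadratic(v, gamma, AtA, Aty):
    """Prox of gamma * 0.5*||y - A x||^2: solve (I + gamma A^T A) x = v + gamma A^T y."""
    n = AtA.shape[0]
    return np.linalg.solve(np.eye(n) + gamma * AtA, v + gamma * Aty)

def prox_tv(v, tau, D, n_inner=200):
    """Prox of tau * ||D x||_1 via projected gradient ascent on the dual
    (Chambolle-style): x = v - D^T p, with entries of p clipped to [-tau, tau]."""
    p = np.zeros(D.shape[0])
    step = 0.25  # 1 / ||D||^2 for the 1-D difference operator
    for _ in range(n_inner):
        p = np.clip(p + step * D @ (v - D.T @ p), -tau, tau)
    return v - D.T @ p

def peaceman_rachford_tv(A, y, lam, gamma=1.0, n_iter=200):
    """Unrelaxed Peaceman-Rachford iteration; convergence guarantees typically
    require one term to be strongly convex (here, A with full column rank)."""
    n = A.shape[1]
    D = make_diff_operator(n)
    AtA, Aty = A.T @ A, A.T @ y
    z = np.zeros(n)
    x = z
    for _ in range(n_iter):
        u = prox_quadratic(z, gamma, AtA, Aty)   # data-fit proximal step
        z_half = 2.0 * u - z                     # first reflection
        x = prox_tv(z_half, gamma * lam, D)      # TV proximal step
        z = 2.0 * x - z_half                     # second reflection
    return x

# Usage: recover a piecewise-constant signal from noisy linear measurements.
rng = np.random.default_rng(0)
n, m = 100, 150
x_true = np.concatenate([np.zeros(40), np.ones(30), -0.5 * np.ones(30)])
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.05 * rng.standard_normal(m)
x_hat = peaceman_rachford_tv(A, y, lam=0.1)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The fixed step size gamma is the tuning burden that the abstract contrasts with the adaptive scheme; in this sketch it is simply set to 1.0.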
Main file: approximate_message_passing.pdf (968.5 KB). Origin: files produced by the author(s).

Dates and versions

cea-01932983, version 1 (23-11-2018)

Identifiers

HAL Id: cea-01932983

Cite

Andre Manoel, Florent Krzakala, Bertrand Thirion, Gaël Varoquaux, Lenka Zdeborová. Approximate message-passing for convex optimization with non-separable penalties. 2018. ⟨cea-01932983⟩