(Nearly-)Convex relaxation of nonconvex problems

April 4, 2018 — March 17, 2023

feature construction
functional analysis
Hilbert space
machine learning
optimization
statmech

A particular, analytically tractable way of overparameterising a nonconvex optimisation problem so that it becomes “nice”, in the sense of being easy to analyse, or even to solve, via the tools of convex optimization. Popular in kernel methods, compressive sensing, matrix factorization, phase retrieval, sparse coding and probably other things besides.
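The canonical example is compressive sensing, where the combinatorial ℓ₀ sparsity objective is relaxed to the convex ℓ₁ norm, i.e. basis pursuit (cf. Tropp 2006). A minimal sketch, assuming cvxpy is available; the problem sizes and random data are illustrative only:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5  # measurements, ambient dimension, sparsity

# Ground-truth k-sparse signal, observed through random Gaussian measurements.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# The l0 problem  min ||x||_0  s.t.  Ax = y  is NP-hard in general;
# basis pursuit relaxes the l0 "norm" to the convex l1 norm.
x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
prob.solve()

print("recovery error:", np.linalg.norm(x.value - x_true))
```

With enough random measurements relative to the sparsity level, the relaxation is typically exact: the ℓ₁ minimiser coincides with the sparsest solution.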

1 Incoming

Francis Bach is interested in a particular specialization, the sum-of-squares relaxation. See his post Sums-of-squares for dummies: a view from the Fourier domain:

In these last two years, I have been studying intensively sum-of-squares relaxations for optimization, learning a lot from many great research papers [1, 2], review papers [3], books [4, 5, 6, 7, 8], and even websites.

Much of the literature focuses on polynomials as the de facto starting point. While this leads to deep connections between many fields within mathematics, and many applications in various areas (optimal control, data science, etc.), the need for arguably non-natural hierarchies (at least for beginners) sometimes makes the exposition hard to follow at first, and notations a tiny bit cumbersome.
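To make that concrete: in the simplest, univariate degree-4 case, a sum-of-squares relaxation certifies a lower bound \(t\) on a polynomial \(p\) by writing \(p(x) - t = v(x)^\top Q\, v(x)\) with \(v(x) = (1, x, x^2)\) and \(Q \succeq 0\), which is a semidefinite program. A hedged sketch, again assuming cvxpy; the example polynomial is arbitrary:

```python
import cvxpy as cp
import numpy as np

# Coefficients of p(x) = 1 + x + 2x^2 - 3x^3 + x^4, lowest degree first.
c = np.array([1.0, 1.0, 2.0, -3.0, 1.0])

# Certify a lower bound t via p(x) - t = v(x)^T Q v(x), v(x) = (1, x, x^2),
# with Q positive semidefinite. Matching coefficients of each power of x
# turns this into linear constraints on Q, hence an SDP.
Q = cp.Variable((3, 3), symmetric=True)
t = cp.Variable()
constraints = [
    Q >> 0,
    Q[0, 0] == c[0] - t,            # constant term
    2 * Q[0, 1] == c[1],            # x
    2 * Q[0, 2] + Q[1, 1] == c[2],  # x^2
    2 * Q[1, 2] == c[3],            # x^3
    Q[2, 2] == c[4],                # x^4
]
cp.Problem(cp.Maximize(t), constraints).solve()

print("certified lower bound:", t.value)
```

For univariate polynomials this bound is tight, since every nonnegative univariate polynomial is a sum of squares. The hierarchies Bach mentions arise in the multivariate case, where SOS is only a sufficient condition for nonnegativity and one must keep enlarging the monomial basis \(v(x)\).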

2 References

Ahmed, Recht, and Romberg. 2012. “Blind Deconvolution Using Convex Programming.” arXiv:1211.5608 [Cs, Math].
Awasthi, Bandeira, Charikar, et al. 2015. “Relax, No Need to Round: Integrality of Clustering Formulations.” In Proceedings of the 2015 Conference on Innovations in Theoretical Computer Science. ITCS ’15.
Bach. 2013. “Convex Relaxations of Structured Matrix Factorizations.” arXiv:1309.3117 [Cs, Math].
Bahmani, and Romberg. 2014. “Lifting for Blind Deconvolution in Random Mask Imaging: Identifiability and Convex Relaxation.” arXiv:1501.00046 [Cs, Math, Stat].
———. 2016. “Phase Retrieval Meets Statistical Learning Theory: A Flexible Convex Relaxation.” arXiv:1610.04210 [Cs, Math, Stat].
Bubeck. 2015. Convex Optimization: Algorithms and Complexity. Foundations and Trends in Machine Learning.
Diamond, Takapoui, and Boyd. 2016. “A General System for Heuristic Solution of Convex Problems over Nonconvex Sets.” arXiv Preprint arXiv:1601.07277.
Goldstein, and Studer. 2016. “PhaseMax: Convex Phase Retrieval via Basis Pursuit.” arXiv:1610.07531 [Cs, Math].
Haeffele, and Vidal. 2015. “Global Optimality in Tensor Factorization, Deep Learning, and Beyond.” arXiv:1506.07540 [Cs, Stat].
Iguchi, Mixon, Peterson, et al. 2015. “Probably Certifiably Correct k-Means Clustering.” arXiv:1509.07983 [Cs, Math, Stat].
Papyan, Sulam, and Elad. 2016. “Working Locally Thinking Globally-Part II: Stability and Algorithms for Convolutional Sparse Coding.” arXiv Preprint arXiv:1607.02009.
———. 2017. “Working Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding.” IEEE Transactions on Signal Processing.
Sebek. 2015. “Spectral Factorization.” In Encyclopedia of Systems and Control.
Tropp, J. A. 2006. “Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise.” IEEE Transactions on Information Theory.
Tropp, Joel A. 2006. “Algorithms for Simultaneous Sparse Approximation. Part II: Convex Relaxation.” Signal Processing, Sparse Approximations in Signal and Image Processing.
Tropp, Joel A., Gilbert, and Strauss. 2006. “Algorithms for Simultaneous Sparse Approximation. Part I: Greedy Pursuit.” Signal Processing, Sparse Approximations in Signal and Image Processing.