publications

publications policy — I do my best to maintain updated versions, with possible typo corrections and clarifications, on arXiv (both are generally marked in bold and red for easy reference). Therefore, please favor the arXiv versions over the officially published ones.

thesis — my thesis (under the supervision of François Glineur and Julien Hendrickx) had the honor of receiving the 2018 ICTEAM thesis award and the 2018 IBM-FNRS innovation award, and of being a finalist for the 2018 AW Tucker prize. In addition, we received the 2017 best paper award in Optimization Letters for a joint work with Etienne de Klerk and François Glineur (for this paper).

codes — see my GitHub profile for all my code. The current version of the Performance EStimation TOolbox (PESTO) is available from here (user manual, conference proceeding). Numerical worst-case analyses via PEP can now be performed simply by writing the algorithms just as you would implement them in Matlab. The new PEPit (performance estimation in Python) is available from here (thanks to the fabulous work of Baptiste Goujaud and Céline Moucer). It is easy to experiment with it using this notebook (see Colab).

1 - preprints

  1. preprint
    Nonlinear conjugate gradient methods: worst-case convergence rates via computer-assisted analyses
    arXiv:2301.01530, 2023
  2. preprint
    PROXQP: an Efficient and Versatile Quadratic Programming Solver for Real-Time Robotics Applications and Beyond
    2023
  3. preprint
    Provable non-accelerations of the heavy-ball method
    arXiv:2307.11291, 2023
  4. preprint
    PEPit: computer-assisted worst-case analyses of first-order optimization methods in Python
    arXiv:2201.04040, 2022
  5. preprint
    Optimal first-order methods for convex functions with a quadratic upper bound
    arXiv:2205.15033, 2022
  6. preprint
    Quadratic minimization: from conjugate gradient to an adaptive Heavy-ball method with Polyak step-sizes
    arXiv:2210.06367, 2022

2 - books

  1. book
    Acceleration Methods
    Foundations and Trends in Optimization, 2021

3 - journals

  1. journal
    Automated tight Lyapunov analysis for first-order methods
    Mathematical Programming (to appear), 2024
  2. journal
    Counter-examples in first-order optimization: a constructive approach
    IEEE Control Systems Letters, 2023
  3. journal
    A systematic approach to Lyapunov analyses of continuous-time models in convex optimization
    SIAM Journal on Optimization, 2023
  4. journal
    An optimal gradient method for smooth strongly convex minimization
    Mathematical Programming, 2023
  5. journal
    Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators
    Mathematical Programming, 2023
  6. journal
    On the oracle complexity of smooth strongly convex minimization
    Journal of Complexity, 2022
  7. journal
    A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
    Open Journal of Mathematical Optimization, 2022
  8. journal
    Convergence of a Constrained Vector Extrapolation Scheme
    SIAM Journal on Mathematics of Data Science, 2022
  9. journal
    Optimal complexity and certification of Bregman first-order methods
    Mathematical Programming, 2022
  10. journal
    Efficient first-order methods for convex minimization: a constructive approach
    Mathematical Programming, 2020
  11. journal
    Operator splitting performance estimation: Tight contraction factors and optimal parameter selection
    SIAM Journal on Optimization, 2020
  12. journal
    Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
    SIAM Journal on Optimization, 2020
  13. journal
    Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
    Journal of Optimization Theory and Applications, 2018
  14. journal
    Exact worst-case performance of first-order methods for composite convex optimization
    SIAM Journal on Optimization, 2017
  15. journal
    On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions [Best paper award]
    Optimization Letters, 2017
  16. journal
    Smooth strongly convex interpolation and exact worst-case performance of first-order methods
    Mathematical Programming, 2017

4 - conferences

  1. conference
    QPLayer: efficient differentiation of convex quadratic optimization
    In International Conference on Learning Representations (ICLR, to appear), 2024
  2. conference
    On Fundamental Proof Structures in First-Order Optimization
    In Proceedings of the 62nd Conference on Decision and Control (CDC), 2023
  3. conference
    Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: the Case of Negative Comonotonicity
    In Proceedings of the 40th International Conference on Machine Learning (ICML), 2023
  4. conference
    Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities
    In Advances in Neural Information Processing Systems (NeurIPS), 2022
  5. conference
    Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization
    In Advances in Neural Information Processing Systems (NeurIPS), 2022
  6. conference
    PROX-QP: Yet another Quadratic Programming Solver for Robotics and beyond
    In Robotics: Science and Systems (RSS 2022), 2022
  7. conference
    Super-Acceleration with Cyclical Step-sizes
    In Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS), 2022
  8. conference
    A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip [Outstanding paper award]
    In Advances in Neural Information Processing Systems (NeurIPS), 2021
  9. conference
    Complexity guarantees for Polyak steps with momentum
    In Proceedings of the 33rd Conference on Learning Theory (COLT), 2020
  10. conference
    Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions
    In Proceedings of the 32nd Conference on Learning Theory (COLT), 2019
  11. conference
    Lyapunov functions for first-order methods: Tight automated convergence guarantees
    In Proceedings of the 35th International Conference on Machine Learning (ICML), 2018
  12. conference
    Performance estimation toolbox (PESTO): automated worst-case analysis of first-order optimization methods
    In Proceedings of the 56th Conference on Decision and Control (CDC), 2017

5 - PhD theses

  1. PhD thesis
    Convex Interpolation and Performance Estimation of First-order Methods for Convex Optimization [ICTEAM thesis award; IBM-FNRS innovation award; AW Tucker prize finalist]
    Université catholique de Louvain, 2017