We just received notification that our paper (with D. Kovalev and P. Richtárik) was accepted to the NeurIPS workshop “Beyond First-Order Optimization Methods in Machine Learning” for a spotlight (8-minute talk) and a poster presentation. Together with the free registration that I received as one of the top reviewers, this gives more than enough reason to …

My new paper (my first single-authored work!) is now available online; see the publications section. It turns out the famous Sinkhorn algorithm is nothing but an instance of stochastic mirror descent. It is very exciting to see the notion of relative smoothness appear as the only ingredient needed to explain convergence from the mirror descent perspective.
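For context, a minimal NumPy sketch of the classical Sinkhorn iteration (the alternating row/column scaling of the Gibbs kernel for entropy-regularized optimal transport). This is just the textbook algorithm, not code from the paper; the cost matrix, marginals, and parameter values below are illustrative assumptions.

```python
import numpy as np

def sinkhorn(a, b, C, eps=1.0, n_iters=500):
    """Entropy-regularized optimal transport via Sinkhorn's
    alternating scaling. a, b are the source/target marginals,
    C is the cost matrix, eps the regularization strength."""
    K = np.exp(-C / eps)        # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)       # scale columns to match b
        u = a / (K @ v)         # scale rows to match a
    # transport plan P = diag(u) K diag(v)
    return u[:, None] * K * v[None, :]

# toy example: two 3-point distributions on a line
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.4, 0.4, 0.2])
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
P = sinkhorn(a, b, C)
```

After convergence the plan's row sums match `a` and its column sums match `b`; each scaling step can be read as one (stochastic) mirror descent step with the entropy mirror map, which is the correspondence the paper makes precise.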

As I’ve done some research in min-max optimization and deep learning, I was invited to be a reviewer for this year’s edition of the Smooth Games Optimization and Machine Learning workshop series.

We just uploaded two papers on federated learning to arXiv. The links are in the “Recent publications” section above.

I received free NeurIPS registration for providing high-quality reviews. It is awarded to the top 400 reviewers, and some people call it the “Best Reviewer Award”.

This year I am also serving as a reviewer for the AAAI conference, which will take place in February in New York. See the official website for more details.

I was at the ICCOPT conference in Berlin from 5 to 8 August, chairing 3 sessions: 2 on variational inequalities/minimax/GANs and 1 on non-smooth optimization.

From 15 to 18 July I’m attending the Frontiers of Deep Learning workshop at the Simons Institute.

From 17 to 28 June I visited Matthias Ehrhardt.

Our work on time series was accepted as a poster at the Time Series Workshop at ICML, and I presented it together with Federico Vaggi.