Posts

Today we received the final decisions for our papers submitted to the AISTATS conference (https://www.aistats.org/). Although one work was rejected, this still amounts to a good acceptance rate. Unfortunately, we also had to withdraw one of our submissions. Below is the list of papers that we will present:

  1. Revisiting Stochastic Extragradient (K. Mishchenko, D. …

One of the first papers that I wrote has been accepted to the SIAM Journal on Optimization (SIOPT). The review process was quite long and included several revisions, but I’m happy it was accepted before my graduation. This work is the result of my collaboration with J. Malick and F. Iutzeler, from whom I learned a lot about optimization. The …

In addition to my free NeurIPS registration, which I received as one of the top reviewers, I will also receive $1400 from the NeurIPS Foundation to sponsor my travel to the conference.

I was invited by Boris Polyak to present my work on the Sinkhorn algorithm at his seminar. The talk took place on Tuesday, 22 October, at the Institute of Control Sciences. It was a great pleasure to hear that Boris liked my work for its simplicity. The slides of my talk are now attached to the corresponding publication ( …

We submitted 5 papers to 4 different workshops at NeurIPS, and all of them were accepted, including one for an oral presentation. The list of papers:

  1. Stochastic Newton and Cubic Newton Methods with Simple Local Linear-Quadratic Rates (oral, with D. Kovalev and P. Richtárik)

  2. Sinkhorn Algorithm as a Special Case of Stochastic Mirror Descent …

We just got a notification that our paper (with D. Kovalev and P. Richtárik) was accepted to the NeurIPS workshop “Beyond First-Order Optimization Methods in Machine Learning” for a spotlight (8-minute talk) and poster presentation. Together with the free registration that I got as one of the top reviewers, this gives more than enough reason to …

My new paper (the first single-authored work I have written!) is now available online; see the publications section. It turns out that the famous Sinkhorn algorithm is nothing but an instance of stochastic mirror descent. It is very exciting to see the notion of relative smoothness appear as the sole explanation of convergence from the mirror descent perspective. A sketch of the algorithm is given below.
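For readers unfamiliar with it, here is a minimal NumPy sketch of the classical Sinkhorn iterations that the paper reinterprets; the function name and parameters are mine for illustration and are not taken from the paper:

    import numpy as np

    def sinkhorn(C, r, c, eps=0.1, n_iters=100):
        # Entropy-regularized optimal transport via Sinkhorn's alternating
        # scaling. C is the cost matrix; r and c are the target row and
        # column marginals (each summing to 1); eps is the regularization.
        K = np.exp(-C / eps)                  # Gibbs kernel
        u, v = np.ones_like(r), np.ones_like(c)
        for _ in range(n_iters):
            u = r / (K @ v)                   # enforce row marginals
            v = c / (K.T @ u)                 # enforce column marginals
        return u[:, None] * K * v[None, :]    # resulting transport plan

For example, with r = c = np.ones(n) / n this converges to the entropic transport plan between two uniform distributions. Each half-update exactly matches one set of marginals, and it is this alternating structure that the paper analyzes through the stochastic mirror descent lens.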

Having done some research in the field of min-max optimization and deep learning, I was invited to be a reviewer for this year’s edition of the Smooth Games Optimization and Machine Learning workshop series.

We just uploaded two papers on federated learning to arXiv. The links are in the “Recent publications” section above on my website.

I received a free NeurIPS registration for providing high-quality reviews. It is awarded to the top 400 reviewers, and some people call it the “Best Reviewer Award”.