Posts

Our work with Yura Malitsky, “Adaptive Gradient Descent without Descent”, was accepted to ICML with scores ‘Accept’, ‘Accept’, and ‘Weak Accept’. You can find the arXiv version of the paper here: https://arxiv.org/abs/1910.09529. Stay tuned for our follow-up work, which we hope to release in the next few months!

This year we (Filip, Peter, and I) submitted one paper to the Conference on Uncertainty in Artificial Intelligence (UAI). The paper received overwhelmingly positive feedback and will be presented virtually this summer. The current title of the paper is “99% of Distributed Optimization is a Waste of Time: The Issue and How to Fix it”, but following the …

At a time when there is nothing to do, no place to travel to, and no party to have fun at, reviewing papers can be a good way to pass the time. I just finished my reviews for ICML, and in two weeks there will be another deadline: reviews for UAI. I think I have never spent so much time reviewing as I did for this ICML, mostly because usually there are a lot of things …

I was selected as one of 12 Outstanding Program Committee members of AAAI, out of more than 6000 reviewers in total. There were 7737 submissions, about 10% more than NeurIPS received. The award will be officially presented to me and the other outstanding reviewers on 11 February at 8am. I’m looking forward to going to New York for the conference!

Today we received the final decisions for our papers submitted to the AISTATS conference (https://www.aistats.org/). Although one work was rejected, this constitutes a good acceptance rate. Unfortunately, we also had to withdraw one of our submissions. Below is the list of papers that we will present:

  1. Revisiting Stochastic Extragradient (K. Mishchenko, D. …

One of the first papers I wrote was accepted to the SIAM Journal on Optimization (SIOPT). The review process was quite long and included several revisions, but I’m happy it was accepted before my graduation. This work is the result of my collaboration with J. Malick and F. Iutzeler, from whose experience I learned a lot about optimization. The …

In addition to my free NeurIPS registration, which I received as one of the top reviewers, I will also receive $1400 from the NeurIPS Foundation to sponsor my travel to the conference.

I was invited by Boris Polyak to present my work on the Sinkhorn algorithm at his seminar. The talk took place on Tuesday, 22 October, at the Institute of Control Sciences. It was a great pleasure to hear that Boris liked my work for its simplicity. The slides of my talk are now attached to the corresponding publication ( …

We have submitted 5 papers to 4 different workshops hosted by NeurIPS and all of them were accepted, including one work for oral presentation. The list of papers:

  1. Stochastic Newton and Cubic Newton Methods with Simple Local Linear-Quadratic Rates (oral, with D. Kovalev and P. Richtárik)

  2. Sinkhorn Algorithm as a Special Case of Stochastic Mirror Descent …

We just received notification that our paper (by D. Kovalev, P. Richtárik, and me) was accepted to the NeurIPS workshop “Beyond First-Order Optimization Methods in Machine Learning” for a spotlight (8-minute talk) and poster presentation. Together with the free registration that I got as one of the top reviewers, this gives more than enough reason to …