Posts

Reflecting on 2020, I realized I spent a lot of time reviewing. I reviewed 34 conference papers, 3 journal papers and 4 workshop papers. To my surprise, Lihua Lei reviewed 48 papers, 20 of which were for journals and probably took extra time due to revisions.

Reviewing is an important part of doing a PhD and it actually helps when writing papers. However, …

Today I’m presenting at the INFORMS Annual Meeting in the session “Painless Large-Scale Optimization with (Almost) Hessian-Free Acceleration”, which can be accessed here if you are registered for the conference. I am covering our recent work on Random Reshuffling, which has been accepted at NeurIPS 2020 as a conference paper. The session is …

Our recent paper on SGD with data shuffling (a.k.a. Random Reshuffling) got accepted and will be presented at NeurIPS in December. As the conference is fully virtual this year, I will be preparing a video presentation. Stay tuned! Meanwhile, you can read it on arXiv: https://arxiv.org/abs/2006.05988

I was selected as a top-33% reviewer for ICML 2020. This came as a surprise two months after the conference took place. I am also quite happy about the reviews that I got for my papers this year. This seems to be a matter of luck, as many people have expressed disappointment with the reviewing quality. Nevertheless, in the papers that I was reviewing …

Over the past two years, I have reviewed for NeurIPS, ICML, AAAI, UAI, and several journals. Due to the crazy number of submissions each of these conferences receives, all people with expertise should consider devoting part of their time to reviewing. This year, I’m also reviewing for ICLR, even though I haven’t submitted a single paper there. …

Peter Richtárik, Filip Hanzely and I are organizing a session on Optimization for Deep Learning at the SIAM Conference on Mathematics of Data Science (MDS20). See the link below for more info: https://www.siam.org/conferences/cm/conference/mds20

Our presenters are:

Simon Shaolei Du from the Institute for Advanced Study in Princeton ( …

Our work with Yura Malitsky, “Adaptive Gradient Descent without Descent”, got accepted at ICML with scores ‘Accept’, ‘Accept’ and ‘Weak Accept’. You can see the arXiv version of the paper here: https://arxiv.org/abs/1910.09529. Also, stay tuned for our follow-up work, which we hope to release in the next few months!

This year we (Filip, Peter and I) submitted one paper to the Conference on Uncertainty in Artificial Intelligence (UAI). The paper got overwhelmingly positive feedback and will be presented virtually this summer. The current title of the paper is “99% of Distributed Optimization is a Waste of Time: The Issue and How to Fix it”, but following the …

At a time when there is nothing to do, no place to travel to and no party to have fun at, reviewing papers can be a good way to kill time. I just finished my reviews for ICML, and in two weeks there will be another deadline: reviews for UAI. I don’t think I have ever spent as much time reviewing as for this ICML, mostly because usually there are a lot of things …

I was selected as one of 12 Outstanding Program Committee members of AAAI, out of more than 6000 reviewers in total. There were about 7737 submissions, roughly 10% more than NeurIPS received. The award will be officially given to me and the other outstanding reviewers on 11 February at 8 am. I’m looking forward to going to New York for the conference!