This summer I collaborated with an intern in our group, Ahmed Khaled (these are just 2 of his 4 first names; like many other Egyptians, he doesn't have a last name). The topic of these works is federated learning, which is the de facto standard way of training large models on data from mobile users. Despite its empirical success in certain applications, there are significant difficulties in applying it to settings with heterogeneous data, and in most cases the data are not homogeneous. To understand why this happens, we tried to tighten the existing theory and found that our theoretical predictions match the numerical experiments closely. We are still working on this, so more papers are to come!
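To see the heterogeneity issue concretely, here is a minimal toy sketch (my own illustration, not the setup from the papers): local gradient descent with periodic averaging, FedAvg-style, on simple quadratic client objectives. The function name `fedavg_quadratics` and all parameters are made up for this example. When clients have different curvatures and minimizers, running many local steps between averaging rounds makes the method converge to a point that is biased away from the true global minimizer:

```python
import numpy as np

def fedavg_quadratics(curv, targ, rounds=500, local_steps=1, lr=0.1):
    # Local GD with averaging on toy client objectives
    # f_i(x) = 0.5 * a_i * (x - b_i)^2.
    # The true minimizer of sum_i f_i is sum(a_i * b_i) / sum(a_i).
    x = 0.0
    for _ in range(rounds):
        local = []
        for a, b in zip(curv, targ):
            xi = x
            for _ in range(local_steps):
                xi -= lr * a * (xi - b)  # exact local gradient step
            local.append(xi)
        x = float(np.mean(local))  # server averages the client iterates
    return x

curv, targ = [1.0, 10.0], [0.0, 10.0]                     # heterogeneous clients
opt = sum(a * b for a, b in zip(curv, targ)) / sum(curv)  # true minimizer, = 100/11
one_step = fedavg_quadratics(curv, targ, local_steps=1)   # matches opt
many_steps = fedavg_quadratics(curv, targ, local_steps=20)  # drifts away from opt
```

With a single local step the method is plain gradient descent on the average objective and finds the true minimizer; with 20 local steps each client overfits its own objective between rounds, and the averaged fixed point drifts toward the unweighted mean of the client minimizers.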