Hi there! I am a Faculty Fellow at New York University (CDS) and a guest researcher at the Flatiron Institute (CCM). My research centers on developing mathematical theories for modern machine learning. Notably, I introduced a novel method for studying the loss landscape complexity of neural networks. These days I am analyzing end-to-end learning dynamics in exciting models and interpreting both the learning process and the resulting weights. I am also co-teaching a Machine Learning course.
Before my Ph.D., I completed a double major in Electrical and Electronics Engineering and Mathematics at Koç University. I also earned two bronze medals at the International Mathematical Olympiad (IMO). I am an active Argentine Tango dancer, and I enjoy yoga.
See my Google Scholar page for an up-to-date list of publications.
We discuss how the effective ridge reveals the implicit regularization induced by finite sampling in random features. The derivative of the effective ridge tracks the variance of the optimal predictor, which explains the variance explosion at the interpolation threshold for arbitrary datasets.
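The variance explosion at the interpolation threshold is easy to see numerically. The sketch below is purely illustrative and not taken from the paper: it fits min-norm least squares over random ReLU features of varying width P on synthetic Gaussian data (all of these choices, including the sizes n, d and the widths compared, are assumptions for the demo) and shows that test error peaks when the number of features matches the number of samples, P = n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): n training samples in d dimensions,
# a linear teacher, and random ReLU features of width P.
n, d, n_test = 100, 20, 200
X = rng.standard_normal((n, d)) / np.sqrt(d)
X_test = rng.standard_normal((n_test, d)) / np.sqrt(d)
w_star = rng.standard_normal(d)
y = X @ w_star
y_test = X_test @ w_star

def rf_test_error(P):
    """Test MSE of the min-norm least-squares fit over P random ReLU features."""
    W = rng.standard_normal((d, P)) / np.sqrt(d)   # fixed random first layer
    Phi = np.maximum(X @ W, 0.0)                   # random ReLU features
    Phi_test = np.maximum(X_test @ W, 0.0)
    # Min-norm least-squares readout (vanishing explicit ridge)
    a = np.linalg.lstsq(Phi, y, rcond=None)[0]
    return np.mean((Phi_test @ a - y_test) ** 2)

# Average over a few random feature draws at three widths:
# under-parameterized, at the interpolation threshold, over-parameterized.
errs = {P: np.mean([rf_test_error(P) for _ in range(5)])
        for P in (20, 100, 400)}
```

With a vanishing explicit ridge, the smallest singular value of the feature matrix collapses near P = n, so the predictor's variance (and hence the averaged test error `errs[100]`) blows up relative to both the under- and over-parameterized regimes.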