I am a senior student majoring in applied mathematics at Peking University (PKU), with a double major in computer science. I am doing undergraduate research on machine learning theory, especially deep learning theory, advised by Professor Liwei Wang. I am mostly interested in theory that can inspire us to design better algorithms. In the summer of 2019, I spent a wonderful time at MIT as a research intern supervised by Professor Sasha Rakhlin. I am also fortunate to work with Professor Jason D. Lee.
I will be applying to Ph.D. programs this year!
(NeurIPS 2019 Spotlight, 2.4% acceptance rate) Convergence of Adversarial Training in Overparametrized Networks
Ruiqi Gao*, Tianle Cai*, Haochuan Li, Liwei Wang, Cho-Jui Hsieh, Jason D. Lee
Highlight: For overparameterized neural networks, we prove that adversarial training converges to a global minimum (with loss 0).
(NeurIPS 2019 Beyond First Order Method in ML Workshop) Gram-Gauss-Newton Method: Learning Overparameterized Neural Networks for Regression Problems
Tianle Cai*, Ruiqi Gao*, Jikai Hou*, Siyu Chen, Dong Wang, Di He, Zhihua Zhang, Liwei Wang
Highlight: A provable second-order optimization method for overparameterized networks on regression problems! Each iteration is as cheap as SGD, yet it converges much faster than SGD in real-world applications.
Runtian Zhai*, Tianle Cai*, Di He, Chen Dan, Kun He, John Hopcroft, Liwei Wang
Highlight: Though robust generalization needs more data, we show, both theoretically and experimentally, that additional unlabeled data alone is enough!
- Gram-Gauss-Newton Method: Learning Overparameterized Deep Neural Networks for Regression Problems, at the PKU Machine Learning Workshop [slides]
- Visiting Research Student at Simons Institute, UC Berkeley
- Program: Foundations of Deep Learning
- June 2019 - July 2019
- Visiting Research Intern at MIT
- Advisor: Professor Sasha Rakhlin
- June 2019 - Sept. 2019
- Visiting Research Student at Princeton
- Host: Professor Jason D. Lee
- Sept. 2019 - Oct. 2019