[SIST Seminar] Understanding deep learning through over-parameterized neural networks

On: 2022-10-25    Tag: ShanghaiTech University    Category: Lecture

Speaker:    HUANG Wei, RIKEN AIP
Time:         10:30-11:30, Oct. 28
Location:   Tencent Meeting
     Meeting ID: 234-209-496

Host:          SHI Ye


Abstract:
The learning dynamics of neural networks trained by gradient descent are captured by the so-called neural tangent kernel (NTK) in the infinite-width limit. The NTK has been a powerful tool for understanding the optimization and generalization of over-parameterized networks. This talk will introduce the foundations of the NTK as well as its applications to deep graph networks and active learning.
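For context, the (empirical) NTK of a network f(·; θ) is commonly defined as the inner product of parameter gradients, written in standard notation as:

    \Theta(x, x') = \big\langle \nabla_\theta f(x; \theta),\; \nabla_\theta f(x'; \theta) \big\rangle

In the infinite-width limit, this kernel becomes independent of the random initialization and stays fixed during gradient-descent training, so the network's training dynamics reduce to kernel regression with Θ.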
Bio:
HUANG Wei received a PhD degree in Computer Science from the University of Technology Sydney in 2021. He is currently a postdoctoral researcher at RIKEN AIP, Japan, working with Prof. Taiji Suzuki. His research has been published in top conferences including NeurIPS, ICLR, and IJCAI. His research interests include deep learning theory, graph neural networks, and contrastive learning.