SEMINARS
Exploring Redundancy in Deep Neural Networks
Datetime
2019-12-30 14:00 — 15:00 
Venue
5#306
Speaker
Chenglong Bao
Affiliation
Tsinghua University
Host
INS
Abstract

Deep neural networks have been widely used in many applications, and classification accuracy generally increases as the network grows larger. However, the huge computation and storage costs have prevented their deployment on resource-limited devices. In this talk, we will first show that redundancy exists in current CNNs under the PAC framework. Second, we will propose a self-distillation technique that compresses deep neural networks and supports dynamic inference.
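
For readers unfamiliar with the idea named in the abstract, the following is a minimal PyTorch sketch of one common form of self-distillation with dynamic inference: auxiliary early-exit classifiers are attached to intermediate layers, trained against the network's own final output, and used at inference time with a confidence threshold so that easy inputs stop early. This is an illustrative sketch of the general technique, not the speaker's specific method; the architecture, layer sizes, temperature T, weight alpha, and threshold are all assumed values.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfDistilledCNN(nn.Module):
    """Small CNN with auxiliary early-exit classifiers (illustrative sizes)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.stage3 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.exit1 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))
        self.exit2 = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes))
        self.exit3 = nn.Sequential(nn.Flatten(), nn.Linear(128, num_classes))

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        # Return logits from every exit, shallow to deep.
        return self.exit1(f1), self.exit2(f2), self.exit3(f3)

def self_distillation_loss(logits_list, labels, T=3.0, alpha=0.5):
    """Cross-entropy on every exit, plus KL from each early exit to the (detached) final exit."""
    loss = sum(F.cross_entropy(z, labels) for z in logits_list)
    teacher = F.softmax(logits_list[-1].detach() / T, dim=1)
    for z in logits_list[:-1]:
        loss = loss + alpha * T * T * F.kl_div(
            F.log_softmax(z / T, dim=1), teacher, reduction="batchmean")
    return loss

@torch.no_grad()
def dynamic_predict(model, x, threshold=0.9):
    """Early-exit inference for a single sample: stop as soon as an exit is confident enough."""
    model.eval()
    f1 = model.stage1(x)
    p1 = F.softmax(model.exit1(f1), dim=1)
    if p1.max().item() >= threshold:
        return p1.argmax(dim=1)
    f2 = model.stage2(f1)
    p2 = F.softmax(model.exit2(f2), dim=1)
    if p2.max().item() >= threshold:
        return p2.argmax(dim=1)
    return model.exit3(model.stage3(f2)).argmax(dim=1)

# Example usage with random data (4 training images, one test image, 32x32 RGB).
model = SelfDistilledCNN()
x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
self_distillation_loss(model(x), y).backward()
pred = dynamic_predict(model, torch.randn(1, 3, 32, 32))
```

In this style of scheme, the shallow exits reduce average computation at inference time, while the distillation term keeps their predictions consistent with the full network, which is one way the "compression with dynamic inference" trade-off described in the abstract can be realized.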