    National Tianyuan Mathematics Central Center, High Performance Computing Lecture Series | Prof. Shaogao Lv (Nanjing Audit University)
    Posted: 2021-10-14 16:51:14

    Title: Nonparametric Optimality for Large Compressible Deep Neural Networks under Quadratic Loss Functions

    Time: 2021-10-15, 10:00–11:30

    Speaker: Prof. Shaogao Lv, Nanjing Audit University

    Tencent Meeting ID: 165 369 418

    Join the meeting via this link, or add it to your meeting list: https://meeting.tencent.com/dm/EmwdDHeescFP

    Abstract: Establishing theoretical analyses that explain the empirical success of deep learning has attracted increasing attention in the modern learning literature. In this direction, we evaluate the excess risk of a deep learning estimator based on fully connected neural networks with the ReLU activation function. We establish optimal excess-risk bounds under the quadratic loss and composite structures of the true function. The obtained bounds are built upon so-called compressibility conditions on over-parameterized neural networks, which include the widely used sparse networks and networks with low-rank weight matrices as special cases. The core proof is based on advanced empirical-process techniques and new approximation results for deep neural networks.
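    To make the "compressible network" setting of the abstract concrete, the following is a minimal illustrative sketch (not the speaker's construction; all dimensions and names are hypothetical) of a fully connected ReLU network in which a full hidden weight matrix is replaced by a low-rank factorization, one of the special cases of compressibility mentioned above.

    ```python
    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    rng = np.random.default_rng(0)

    # Hypothetical depth-3 fully connected ReLU network whose hidden
    # weight matrix is "compressed" via a low-rank factorization
    # W ≈ U @ V with rank r much smaller than the layer width.
    d, width, r = 5, 64, 4            # input dim, layer width, low rank
    W1 = rng.normal(size=(width, d))  # first (full) layer
    U = rng.normal(size=(width, r))   # low-rank factors standing in for a
    V = rng.normal(size=(r, width))   # full width-by-width hidden matrix
    w_out = rng.normal(size=width)    # linear output layer

    def f(x):
        h1 = relu(W1 @ x)
        h2 = relu(U @ (V @ h1))       # compressed hidden layer
        return w_out @ h2

    x = rng.normal(size=d)
    print(f(x))                        # scalar prediction

    # Parameter count: the factorized layer uses far fewer weights.
    full_params = width * width        # 64 * 64 = 4096
    low_rank_params = 2 * width * r    # 2 * 64 * 4 = 512
    print(low_rank_params, "<", full_params)
    ```

    The point of such factorizations in this line of work is that the excess-risk bounds scale with the compressed parameter count (here 512) rather than the full over-parameterized count (4096).
    
    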


    Copyright © 2019 天元數學中部中心 National Tianyuan Mathematics Central Center
