Research PPT Template (Template1)

ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression

Contents
1. Background
2. Motivation
3. Proposed Method
   • Filter Selection
   • Greedy Method
   • Minimize Reconstruction Error
4. Experimental Results
5. Conclusion

Background
Deep neural networks (DNNs) are hard to deploy on hardware with limited computation resources, storage, and battery power.
(Figure: performance and model size of different models on ImageNet.)

Model Compression
Existing compression methods:
• Quantization: convert full-precision weights to a low-precision version (e.g., INQ, BWN, TWN).
• Pruning: remove less important weights or filters from the model (e.g., Deep Compression, DNS, ThiNet).
• Design new structures: SqueezeNet, Distilling, ShuffleNet.

Pruning Methods
• Non-structured pruning: remove less important weights.
• Structured pruning: remove less important filters from the model.

Motivation
Problems of non-structured pruning:
• Needs specialized hardware and software for inference.
• Ignores cache and memory issues, which leads to limited practical acceleration.
Benefits of structured pruning:
• The network structure is unchanged, so the pruned model is supported by existing deep learning libraries.
• Reduces memory use and accelerates inference.

Proposed Method
ThiNet ("Thin Net"): a filter-level pruning framework for model compression.
(Figure: illustration of ThiNet: filter selection, pruning, fine-tuning.)
(Figure: framework of ThiNet.)

Filter Selection
The convolution operation for one output position can be computed as

  y = Σ_{c=1}^{C} Σ_{k1=1}^{K} Σ_{k2=1}^{K} W_{c,k1,k2} · x_{c,k1,k2} + b.    (1)

Define the contribution of input channel c as

  x̂_c = Σ_{k1=1}^{K} Σ_{k2=1}^{K} W_{c,k1,k2} · x_{c,k1,k2};    (2)

then, with ŷ = y − b,

  ŷ = Σ_{c=1}^{C} x̂_c.    (3)
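As a sanity check on this decomposition, a minimal NumPy sketch (illustrative shapes and variable names, not the paper's code) confirms that the output of Eq. (1), minus its bias, equals the sum of the per-channel contributions of Eq. (2):

```python
import numpy as np

# Minimal sketch of Eqs. (1)-(3): at one output position of a convolution,
# the bias-free output y_hat decomposes into per-input-channel contributions.
rng = np.random.default_rng(0)
C, K = 4, 3                            # input channels, kernel size (assumed)
W = rng.standard_normal((C, K, K))     # one filter of the next layer
x = rng.standard_normal((C, K, K))     # the input patch it sees
b = 0.5                                # bias

y = (W * x).sum() + b                  # Eq. (1): convolution output
x_hat = (W * x).sum(axis=(1, 2))       # Eq. (2): contribution of channel c
assert np.isclose(y - b, x_hat.sum())  # Eq. (3): y_hat = sum_c x_hat_c
```

In ThiNet these x̂ values, collected over many sampled locations and images, form the training set on which channel importance is evaluated.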
Greedy Method
Channel selection is formulated as the optimization problem

  min_S Σ_{i=1}^{m} ( ŷ_i − Σ_{j∈S} x̂_{i,j} )²   s.t. |S| = C·r, S ⊂ {1, …, C},    (4)

where S is the set of kept channels, m is the number of training instances, and r is the compression rate. Equivalently, selecting the set T of removed channels:

  min_T Σ_{i=1}^{m} ( Σ_{j∈T} x̂_{i,j} )²   s.t. |T| = C·(1 − r), T ⊂ {1, …, C}.    (5)

• Use a greedy method to solve the optimization problem.
(Algorithm: a greedy algorithm for minimizing Eq. (5): starting from T = ∅, repeatedly move into T the remaining channel that keeps the objective smallest, until |T| reaches its target size.)

Minimize the Reconstruction Error
• Minimize the reconstruction error by weighting the kept channels:

  ŵ = argmin_w Σ_{i=1}^{m} ( ŷ_i − wᵀ x̂*_i )²,    (6)

where x̂*_i collects the contributions of the kept channels for instance i. Eq. (6) can be solved by the ordinary least squares approach:

  ŵ = (XᵀX)^{−1} Xᵀ ŷ,    (7)

where the i-th row of X is x̂*_iᵀ.

Pruning Strategy
• VGG-16: prune the first 10 convolutional layers and replace the FC layers with a global average pooling (GAP) layer.
• ResNet-50: prune only the first two convolutional layers of each residual block, so the 256-d block output and the shortcut structure stay unchanged.
(Figure: a pruned residual block; the 256-d input and output dimensions are preserved.)

Comparison of Existing Methods
(Figure: comparison of different channel selection methods using VGG-16-GAP on CUB-200.)

VGG-16 on ImageNet

Model                    Top-1    Top-5    #Param.   #FLOPs   f./b. (ms)
Original                 68.34%   88.44%   138.34M   30.94B   189.92/407.56
ThiNet-Conv              69.80%   89.53%   131.44M    9.58B    76.71/152.05
Train from scratch       67.00%   87.45%   131.44M    9.58B    76.71/152.05
ThiNet-GAP               67.34%   87.92%     8.32M    9.34B    71.73/145.51
ThiNet-Tiny              59.34%   81.97%     1.32M    2.01B    29.51/55.83
SqueezeNet (Han et al.)  57.67%   80.39%     1.24M    1.72B    37.30/68.62

Table: pruning results of VGG-16 on ImageNet.
• ThiNet-Conv: prune 50% of the first 10 convolutional layers.
• ThiNet-GAP: based on ThiNet-Conv, additionally replace the FC layers with a global average pooling (GAP) layer.

Method                       Top-1    Top-5
APoZ-1 (Hu et al.)           2.16%    0.84%
APoZ-2 (Hu et al.)           1.81%    1.25%
Taylor-1 (Molchanov et al.)  1.44%    –
Taylor-2 (Molchanov et al.)  3.94%    –
ThiNet-WS (Li et al.)        1.01%    0.69%
ThiNet-Conv                  1.46%    1.09%
ThiNet-GAP                   1.00%    0.52%

Table: comparison of state-of-the-art methods on VGG-16.
• ThiNet-WS: use the weight sum (WS) method for pruning.

ResNet-50 on ImageNet

Model       Top-1    Top-5    #Param.   #FLOPs   f./b. (ms)
Original    72.88%   91.14%   25.56M    7.72B    188.27/269.32
ThiNet-70   72.04%   90.67%   16.94M    4.88B    169.38/243.37
ThiNet-50   71.01%   90.02%   12.38M    3.41B    153.60/212.29
ThiNet-30   68.42%   88.30%    8.66M    2.20B    144.45/200.67

Table: performance of pruning ResNet-50 on ImageNet.
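The greedy selection of Eq. (5) and the least-squares reweighting of Eqs. (6)-(7) can be sketched as below; the matrix `X_hat` of per-channel contributions, its shapes, and the helper name `greedy_prune` are illustrative assumptions, not the paper's released code.

```python
import numpy as np

def greedy_prune(X_hat, keep):
    """Greedy minimization of Eq. (5) (sketch). X_hat[i, c] = x_hat_{i,c}.
    Repeatedly move into the pruned set T the channel whose removal keeps
    sum_i (sum_{j in T} x_hat_{i,j})^2 smallest, until `keep` remain."""
    m, C = X_hat.shape
    T, partial = set(), np.zeros(m)
    while C - len(T) > keep:
        best, best_val = None, np.inf
        for c in range(C):
            if c in T:
                continue
            val = ((partial + X_hat[:, c]) ** 2).sum()
            if val < best_val:
                best, best_val = c, val
        T.add(best)
        partial += X_hat[:, best]
    S = [c for c in range(C) if c not in T]  # kept channels
    return S, sorted(T)

rng = np.random.default_rng(0)
m, C = 100, 8
X_hat = rng.standard_normal((m, C))   # simulated channel contributions
y_hat = X_hat.sum(axis=1)             # Eq. (3): y_hat = sum of all channels
S, T = greedy_prune(X_hat, keep=4)
# Eqs. (6)-(7): rescale the kept channels by ordinary least squares
w, *_ = np.linalg.lstsq(X_hat[:, S], y_hat, rcond=None)
```

Because the all-ones weight vector is itself feasible in Eq. (6), the least-squares fit can only lower the reconstruction error relative to simply summing the kept channels.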
Domain Adaptation Ability

Dataset     Strategy             #Param.   #FLOPs   Top-1
CUB-200     VGG-16               135.07M   30.93B   72.30%
            FT & prune             7.91M    9.34B   66.90%
            Train from scratch     7.91M    9.34B   44.27%
            ThiNet-Conv          128.16M    9.58B   70.90%
            ThiNet-GAP             7.91M    9.34B   69.43%
            ThiNet-Tiny            1.12M    2.01B   65.45%
            AlexNet               57.68M    1.44B   57.28%
Indoor-67   VGG-16               134.52M   30.93B   72.46%
            FT & prune             7.84M    9.34B   64.70%
            Train from scratch     7.84M    9.34B   38.81%
            ThiNet-Conv          127.62M    9.58B   72.31%
            ThiNet-GAP             7.84M    9.34B   70.22%
            ThiNet-Tiny            1.08M    2.01B   62.84%
            AlexNet               57.68M    1.44B   59.55%

Table: comparison of different methods on CUB-200 and Indoor-67. "FT" denotes fine-tune.

Conclusion
Contributions:
• Proposed ThiNet, a filter-level pruning framework to accelerate and compress CNN models.
• Formally established filter pruning as an optimization problem.
• The pruned VGG-16 model can be compressed to 5.05 MB and shows promising generalization ability on transfer learning tasks.
Future work:
• Prune the projection …
