ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression

Contents

Background
Deep Neural Networks (DNNs) are hard to deploy on hardware because of the limitations of computation resources, storage, and battery power.
Figure: Performance and model size of different models on ImageNet.

Model Compression
Existing compression methods:
- Quantization: convert full-precision weights to a low-precision version, e.g. INQ, BWN, TWN.
- Pruning: remove less important weights/filters from the model, e.g. Deep Compression, DNS, ThiNet.
- Designing new structures: SqueezeNet, Distilling, ShuffleNet.

Pruning Methods
- Non-structured pruning: remove less important weights.
- Structured pruning: remove less important filters from the model.

Motivation

Proposed Method
ThiNet ("Thin Net"): a filter-level pruning framework for model compression.
Figure: Illustration of ThiNet (filter selection, pruning, fine-tuning), showing the input of layer i, the filters of layer i, the input of layer i+1, the filters of layer i+1, and the input of layer i+2.

Filter Selection
The convolution operation of layer i+1 can be computed as

    ŷ = Σ_{c=1}^{C} Σ_{k1=1}^{K} Σ_{k2=1}^{K} Ŵ_{c,k1,k2} · x_{c,k1,k2},    (1)

where ŷ is an element sampled from the input of layer i+2 (with the bias folded in), Ŵ is the corresponding C×K×K filter, and x is the corresponding C×K×K sliding window.

Filter Selection (cont.)
Define

    x̂_c = Σ_{k1=1}^{K} Σ_{k2=1}^{K} Ŵ_{c,k1,k2} · x_{c,k1,k2},    (2)

then ŷ = Σ_{c=1}^{C} x̂_c. If we can find a subset S ⊂ {1, 2, …, C} such that

    ŷ = Σ_{c∈S} x̂_c,    (3)

then the channels outside S can be removed without changing the result.

Greedy Method
Given a set of m training examples {(x̂_i, ŷ_i)}, the channel selection problem can be formulated as the optimization problem

    argmin_S Σ_{i=1}^{m} (ŷ_i − Σ_{c∈S} x̂_{i,c})²,  s.t. |S| = C × r, S ⊂ {1, 2, …, C}.    (4)

Let T be the subset of removed channels (S ∪ T = {1, 2, …, C}, S ∩ T = ∅); the problem is then equivalent to

    argmin_T Σ_{i=1}^{m} (Σ_{c∈T} x̂_{i,c})²,  s.t. |T| = C × (1 − r).    (5)

Greedy Method (cont.)
Use a greedy method to solve the optimization problem: starting from T = ∅, repeatedly add to T the channel whose removal keeps the objective of Eq. (5) smallest.

Minimize the Reconstruction Error
Further minimize the reconstruction error by reweighting the kept channels:

    ŵ = argmin_w Σ_{i=1}^{m} (ŷ_i − wᵀ x̂*_i)²,    (6)

where x̂*_i indicates the training samples after channel selection. Eq. (6) can be solved by the ordinary least squares approach:

    ŵ = (XᵀX)⁻¹ Xᵀ y.    (7)

Pruning Strategy
- VGG-16: prune the first 10 convolutional layers and replace the FC layers with a global average pooling (GAP) layer.
- ResNet-50: prune the first two convolutional layers of each residual block, so that the block's output dimension still matches the shortcut.

Comparison of Existing Methods
Figure: Comparison of different channel selection methods, using VGG-16-GAP on CUB-200.

VGG-16 on ImageNet
Table: Pruning results of VGG-16 on ImageNet. ThiNet-Conv: prune 50% of the first 10 convolutional layers. ThiNet-GAP: replace the FC layers with a global average pooling (GAP) layer, based on ThiNet-Conv.

VGG-16 on ImageNet (cont.)
Table: Comparison of state-of-the-art methods on VGG-16. ThiNet-WS: use the weight sum (WS) method for pruning.

ResNet-50 on ImageNet
Table: Performance of pruning ResNet-50 on ImageNet.

Domain Adaptation Ability
Table: Comparison of different methods on CUB-200 and Indoor-67. FT denotes "Fine-Tune".

Conclusion

Thank You
Q & A
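As a numerical check of the Filter Selection slides, the decomposition of a convolution output into per-channel partial sums x̂_c can be verified directly. This is a minimal NumPy sketch under assumed shapes (a single C×K×K sliding window and filter, bias folded into the output); the variable names are illustrative, not from the ThiNet code.

```python
import numpy as np

# One C x K x K sliding window of the input of layer i+1 and the
# matching filter, as on the Filter Selection slide (bias folded in).
rng = np.random.default_rng(42)
C, K = 4, 3
x = rng.standard_normal((C, K, K))   # sliding window
W = rng.standard_normal((C, K, K))   # corresponding filter

# Sum over channels and window positions gives the output element y_hat.
y_hat = np.sum(W * x)

# Per-channel partial sums x_hat_c: sum only over the K x K window.
x_hat = np.sum(W * x, axis=(1, 2))   # shape (C,)

# Summing x_hat over all channels reproduces y_hat exactly; keeping only
# a subset S of channels approximates it, which channel selection exploits.
assert np.isclose(x_hat.sum(), y_hat)
```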
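The greedy channel selection and least-squares reweighting described on the Greedy Method and Minimize the Reconstruction Error slides can be sketched as follows. This is a minimal NumPy sketch, not the authors' implementation: it assumes the sampled per-channel sums x̂ have already been collected into a matrix X of shape (m, C), with y holding the m sampled outputs ŷ; the function names `greedy_channel_selection` and `reweight_kept_channels` are illustrative.

```python
import numpy as np

def greedy_channel_selection(X, compression_rate):
    """Greedily build the removed set T: at each step, add the channel
    whose inclusion keeps sum_i (sum_{c in T} x_hat_{i,c})^2 smallest.
    X has shape (m, C); keeps C * compression_rate channels."""
    m, C = X.shape
    num_to_remove = C - int(C * compression_rate)
    T, remaining = [], list(range(C))
    partial = np.zeros(m)            # running sum over channels in T
    while len(T) < num_to_remove:
        best_c, best_err = None, np.inf
        for c in remaining:
            err = np.sum((partial + X[:, c]) ** 2)
            if err < best_err:
                best_c, best_err = c, err
        T.append(best_c)
        remaining.remove(best_c)
        partial += X[:, best_c]
    return sorted(remaining), T      # kept set S, removed set T

def reweight_kept_channels(X, y, S):
    """Ordinary least squares on the kept channels: find w so that
    w^T x_hat_S best reconstructs y; w_c then scales filter c."""
    w, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
    return w
```

In ThiNet this procedure is applied layer by layer, with the learned scales folded into the kept filters, followed by fine-tuning to recover accuracy.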