Greedy selection: ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression (ICCV2017)
Lasso regression: Channel Pruning for Accelerating Very Deep Neural Networks (ICCV2017)
Minimizes the reconstruction error of the network's penultimate layer, taking the accumulation of back-propagated error into account: NISP: Pruning Networks using Neuron Importance Score Propagation (CVPR2018)
Adds an extra discrimination-aware loss at intermediate layers (to strengthen their discriminative power) while also keeping a feature-reconstruction loss; the gradient information of both losses with respect to the parameters jointly decides which channels to prune: Discrimination-aware Channel Pruning for Deep Neural Networks (NeurIPS2018)
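The reconstruction-error methods above can be illustrated with a minimal greedy selection sketch in the spirit of ThiNet (an illustrative toy, not the paper's implementation; `greedy_channel_select` and its tie-breaking are my own assumptions): greedily remove the channel whose removal adds the least to the reconstruction error of the next layer's input.

```python
import numpy as np

def greedy_channel_select(X, keep):
    """Greedy ThiNet-style filter selection (sketch, not the official code).

    X has shape (samples, channels); X[:, c] is channel c's additive
    contribution to the next layer's pre-activation at sampled locations.
    We repeatedly drop the channel whose removal keeps the squared norm
    of the total removed contribution smallest, until `keep` remain.
    """
    n, C = X.shape
    kept = set(range(C))
    while len(kept) > keep:
        removed = [c for c in range(C) if c not in kept]
        base = X[:, removed].sum(axis=1)  # contribution already removed
        # Marginal reconstruction cost of additionally removing channel c
        costs = {c: np.sum((base + X[:, c]) ** 2) for c in kept}
        kept.remove(min(costs, key=costs.get))
    return sorted(kept)
```

For example, if one channel carries most of the layer's output energy, the greedy loop keeps it and discards the near-zero channels first.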
Sparsity-inducing training?
Structured sparsity via group Lasso:
Learning Structured Sparsity in Deep Neural Networks (NIPS2016)
Sparse Convolutional Neural Networks (CVPR2015)
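The core of group-Lasso structured sparsity is that the penalty acts on the L2 norm of a whole group (e.g. one filter), so its proximal operator zeroes entire filters at once. A minimal numpy sketch (function name and shapes are my own illustrative choices):

```python
import numpy as np

def group_lasso_prox(W, lam):
    """Block soft-thresholding: the proximal step of a group-Lasso penalty.

    W has shape (filters, fan_in); each row is one group. A row whose
    L2 norm is below lam is zeroed entirely (the filter is pruned);
    otherwise the row is shrunk toward zero by a factor 1 - lam/||row||.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return W * scale
```

In training one would interleave this step with SGD updates; here it only demonstrates why group Lasso yields structured (filter-level) rather than scattered zeros.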
Introduces learnable masks and sparsifies them with the APG (accelerated proximal gradient) algorithm: Data-Driven Sparse Structure Selection for Deep Neural Networks (ECCV2018)
Solved with ADMM, a classic algorithm from constrained optimization: A Systematic DNN Weight Pruning Framework using Alternating Direction Method of Multipliers (ECCV2018)
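The ADMM formulation treats pruning as minimizing the loss subject to a cardinality constraint, splitting it into a differentiable W-subproblem and a projection Z-subproblem coupled by a dual variable. A toy numpy sketch under my own assumptions (quadratic loss, one gradient step per W-update, names invented for illustration):

```python
import numpy as np

def admm_prune(w0, loss_grad, keep, rho=1.0, lr=0.1, iters=200):
    """ADMM-style weight pruning sketch: min loss(W) s.t. ||W||_0 <= keep.

    W handles the loss via gradient steps on the augmented Lagrangian,
    Z is the Euclidean projection onto the sparsity constraint (keep the
    `keep` largest-magnitude entries), U is the scaled dual variable.
    """
    W, Z, U = w0.copy(), w0.copy(), np.zeros_like(w0)

    def project(v):
        out = np.zeros_like(v)
        top = np.argsort(-np.abs(v))[:keep]
        out[top] = v[top]
        return out

    for _ in range(iters):
        W = W - lr * (loss_grad(W) + rho * (W - Z + U))  # W-subproblem
        Z = project(W + U)                               # Z-subproblem
        U = U + W - Z                                    # dual ascent
    return project(W)  # final hard projection to satisfy the constraint
```

With a quadratic loss pulling W toward a target vector, the large-magnitude coordinates survive and the small ones are driven exactly to zero.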
L1 regularization: Learning Efficient Convolutional Networks through Network Slimming (ICCV2017)
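Network Slimming trains BN scale factors under an L1 penalty, then prunes the channels whose scale factors are smallest across the network. A selection sketch (the function, the global-threshold detail, and the edge handling are my assumptions, not the paper's code):

```python
import numpy as np

def slimming_prune_mask(gamma, prune_ratio):
    """Network-Slimming-style channel selection (sketch).

    gamma: BN scale factors, trained with an L1 penalty so that
    unimportant channels shrink toward zero. Returns a boolean mask
    keeping channels outside the lowest `prune_ratio` fraction of |gamma|.
    """
    k = int(len(gamma) * prune_ratio)
    thresh = np.sort(np.abs(gamma))[k] if k < len(gamma) else np.inf
    return np.abs(gamma) >= thresh
```

In the paper the threshold is chosen globally over all layers, which lets the method discover per-layer widths instead of fixing a ratio per layer.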
ISTA: Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers (ICLR2018)
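The advantage of ISTA over a plain L1 subgradient is that its proximal (soft-threshold) step sets scaling factors exactly to zero rather than merely close to it. One update, sketched in numpy (argument names are illustrative):

```python
import numpy as np

def ista_step(gamma, grad, lr, lam):
    """One ISTA update on scaling factors gamma.

    First a plain gradient step on the smooth loss, then the
    soft-threshold proximal operator of the L1 penalty, which
    zeroes any entry within lr * lam of the origin.
    """
    z = gamma - lr * grad                              # gradient step
    return np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)  # prox
```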
L1 regularization + width-multiplier + shrink-expand: MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks (CVPR2018)