[1] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6): 84-90.
[2] REDMON J, DIVVALA S, GIRSHICK R, et al. You only look once: unified, real-time object detection[C]// 2016 IEEE Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2016: 779-788.
[3] LONG J, SHELHAMER E, DARRELL T. Fully convolutional networks for semantic segmentation[C]// 2015 IEEE Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2015: 3431-3440.
[4] ZHANG M, ZHANG F H, ZONG J P, et al. Face detection and embedded implementation of lightweight network[J]. Journal of Graphics, 2022, 43(2): 239-246 (in Chinese).
[5] PI J, LIU Y H, LI J H. Research on lightweight forest fire detection algorithm based on YOLOv5s[J]. Journal of Graphics, 2023, 44(1): 26-32 (in Chinese).
[6] COURBARIAUX M, BENGIO Y, DAVID J P. BinaryConnect: training deep neural networks with binary weights during propagations[C]// The 29th International Conference on Neural Information Processing Systems. Cambridge: MIT Press, 2015: 3123-3131.
[7] RASTEGARI M, ORDONEZ V, REDMON J, et al. XNOR-Net: ImageNet classification using binary convolutional neural networks[C]// The 14th European Conference on Computer Vision. Cham: Springer, 2016: 525-542.
[8] CHOI J, WANG Z, VENKATARAMANI S, et al. PACT: parameterized clipping activation for quantized neural networks[EB/OL]. [2024-07-05]. https://arxiv.org/abs/1805.06085.
[9] ZHANG D Q, YANG J L, YE D Q Z, et al. LQ-Nets: learned quantization for highly accurate and compact deep neural networks[C]// The 15th European Conference on Computer Vision. Cham: Springer, 2018: 373-390.
[10] ESSER S K, MCKINSTRY J L, BABLANI D, et al. Learned step size quantization[EB/OL]. [2024-06-05]. https://arxiv.org/abs/1902.08153.
[11] HU Q H, WANG P S, CHENG J. From hashing to CNNs: training binary weight networks via hashing[EB/OL]. [2024-07-05]. https://ojs.aaai.org/index.php/AAAI/article/view/11660.
[12] JACOB B, KLIGYS S, CHEN B, et al. Quantization and training of neural networks for efficient integer-arithmetic-only inference[C]// 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2018: 2704-2713.
[13] MIGACZ S. 8-bit inference with TensorRT[EB/OL]. [2024-06-05]. https://www.cse.iitd.ernet.in/~rijurekha/course/tensorrt.pdf.
[14] BANNER R, NAHSHAN Y, SOUDRY D. Post training 4-bit quantization of convolutional networks for rapid-deployment[C]// The 33rd International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2019: 7950-7958.
[15] WU D, TANG Q, ZHAO Y L, et al. EasyQuant: post-training quantization via scale optimization[EB/OL]. [2024-06-05]. https://arxiv.org/abs/2006.16669.
[16] WANG P S, CHEN Q, HE X Y, et al. Towards accurate post-training network quantization via bit-split and stitching[EB/OL]. [2024-06-05]. https://dl.acm.org/doi/10.5555/3524938.3525851.
[17] NAGEL M, AMJAD R, VAN BAALEN M, et al. Up or down? Adaptive rounding for post-training quantization[EB/OL]. [2024-06-05]. https://dl.acm.org/doi/10.5555/3524938.3525605.
[18] LI Y H, GONG R H, TAN X, et al. BRECQ: pushing the limit of post-training quantization by block reconstruction[EB/OL]. [2024-07-05]. https://arxiv.org/abs/2102.05426.
[19] WEI X Y, GONG R H, LI Y H, et al. QDrop: randomly dropping quantization for extremely low-bit post-training quantization[EB/OL]. [2024-07-05]. https://arxiv.org/abs/2203.05740.
[20] MA Y X, LI H X, ZHENG X W, et al. Solving oscillation problem in post-training quantization through a theoretical perspective[C]// 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2023: 7950-7959.
[21] LEE J H, KIM J, KWON S J, et al. FlexRound: learnable rounding based on element-wise division for post-training quantization[EB/OL]. [2024-06-05]. https://dl.acm.org/doi/10.5555/3618408.3619189.
[22] HOWARD A G, ZHU M L, CHEN B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications[EB/OL]. [2024-07-05]. https://arxiv.org/abs/1704.04861.
[23] SANDLER M, HOWARD A, ZHU M L, et al. MobileNetV2: inverted residuals and linear bottlenecks[C]// 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2018: 4510-4520.
[24] ZHANG X Y, ZHOU X Y, LIN M X, et al. ShuffleNet: an extremely efficient convolutional neural network for mobile devices[C]// 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2018: 6848-6856.
[25] MA N N, ZHANG X Y, ZHENG H T, et al. ShuffleNet V2: practical guidelines for efficient CNN architecture design[C]// The 15th European Conference on Computer Vision. Cham: Springer, 2018: 122-138.
[26] RADOSAVOVIC I, KOSARAJU R P, GIRSHICK R, et al. Designing network design spaces[C]// 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2020: 10425-10433.
[27] DENG J, DONG W, SOCHER R, et al. ImageNet: a large-scale hierarchical image database[C]// 2009 IEEE Conference on Computer Vision and Pattern Recognition. New York: IEEE Press, 2009: 248-255.