Journal of Textile Research ›› 2024, Vol. 45 ›› Issue (05): 60-69. DOI: 10.13475/j.fzxb.20220902201

• Textile Engineering •

Structure classification of weft-knitted fabric based on lightweight convolutional neural network

HU Xudong1, TANG Wei1, ZENG Zhifa2, RU Xin1, PENG Laihu1, LI Jianqiang3, WANG Boping2

  1. Key Laboratory of Modern Textile Machinery & Technology of Zhejiang Province, Zhejiang Sci-Tech University, Hangzhou, Zhejiang 310018, China
    2. Zhejiang Hengqiang Technology Co., Ltd., Hangzhou, Zhejiang 311121, China
    3. College of Biomedical Engineering & Instrument Science, Zhejiang University, Hangzhou, Zhejiang 310027, China
  • Received: 2023-03-13   Revised: 2023-12-05   Online: 2024-05-15   Published: 2024-05-31

Abstract:

Objective Fabric structure is one of the key parameters guiding fabric production, and automatic identification of fabric structure through machine vision helps to improve design and production efficiency. Weft-knitted fabrics involve complex knitting methods and diverse structures, and it is difficult to determine the structure accurately from an image of only one side of the fabric. It is therefore necessary to design an efficient and accurate classification method tailored to the structures of weft-knitted fabrics.

Method A fabric image acquisition platform was built to capture images of both sides of fabric samples at multiple scales and under various lighting conditions, and a dataset containing weft-knitted fabric images in nine categories was constructed. The proposed method was developed from GhostNet, a lightweight convolutional neural network. To improve the network's ability to learn different features, two strategies were adopted to introduce an attention mechanism into the feature extraction stage. The network was then adjusted to a dual-branch architecture, so that the features of the two sides of the knitted fabric were extracted simultaneously by weight-sharing sub-networks and the extracted high-dimensional feature maps were serially fused.
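To make the dual-branch, weight-sharing design concrete, the following is a minimal PyTorch sketch rather than the authors' code: a small convolutional stack stands in for the GhostNet-with-CBAM backbone described above, both fabric sides share the same backbone weights, the high-dimensional features are fused by channel-wise concatenation, and a dropout layer follows the fully connected layer. All layer sizes are illustrative assumptions.

# Minimal sketch (not the authors' released code) of a dual-branch,
# weight-sharing classifier with serial (concatenation) feature fusion.
import torch
import torch.nn as nn

class DualBranchClassifier(nn.Module):
    def __init__(self, num_classes: int = 9, feat_dim: int = 128, dropout: float = 0.4):
        super().__init__()
        # Shared backbone: both branches reuse these weights (weight sharing).
        # A plain conv stack is used here as a stand-in for GhostNet + CBAM.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.BatchNorm2d(feat_dim), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        # Classifier over the serially fused (concatenated) features of both sides,
        # with dropout after the fully connected layer.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2 * feat_dim, 256), nn.ReLU(inplace=True),
            nn.Dropout(dropout),
            nn.Linear(256, num_classes),
        )

    def forward(self, front: torch.Tensor, back: torch.Tensor) -> torch.Tensor:
        f_front = self.backbone(front)                # features of the face side
        f_back = self.backbone(back)                  # features of the reverse side
        fused = torch.cat([f_front, f_back], dim=1)   # serial (channel-wise) fusion
        return self.classifier(fused)

if __name__ == "__main__":
    model = DualBranchClassifier()
    front = torch.randn(2, 3, 224, 224)   # batch of face-side images
    back = torch.randn(2, 3, 224, 224)    # batch of reverse-side images
    print(model(front, back).shape)       # -> torch.Size([2, 9])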

Results The experiments analyze the effectiveness and performance of the proposed method. Multiple online augmentation methods increase the diversity of the fabric sample data and improve the robustness of the model; compared with training on the original dataset, the model achieves higher accuracy on the validation set. Adding a dropout layer after the fully connected layer improves the generalization performance of the model, with the best performance obtained at a dropout rate of 0.4. For the fabric categories that are difficult to distinguish from single-sided images, the proposed method achieves a classification accuracy of more than 99%. To observe the feature extraction effect more intuitively, feature maps from different levels of the network are visualized, showing that the model attends to important features such as the shape and texture of the fabric. Accuracy, macro precision, macro recall, and macro F1 are adopted to evaluate model performance, and the number of parameters and the computational complexity are calculated to measure resource consumption. The ablation experiments show that incorporating the CBAM module effectively improves model performance. Different models and methods are compared under the same hyperparameter settings: common CNN models are tested on the dataset constructed in this paper, and the proposed method achieves the highest classification accuracy of 99.51%, with a macro precision of 0.994 1, a macro recall of 0.994 6, and a macro F1 score of 0.994 2, while keeping the FLOPs (0.31 G) and parameter count (4.62 M) lower than those of most competing models.
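For reference, the macro-averaged metrics reported above can be computed as in the following minimal scikit-learn sketch; the label arrays are placeholders for illustration, not the actual predictions from this work.

# Sketch of the evaluation metrics (accuracy, macro precision, macro recall, macro F1).
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [0, 1, 2, 2, 7, 8, 3, 3]   # ground-truth class indices (9 fabric classes), placeholder data
y_pred = [0, 1, 2, 3, 7, 8, 3, 3]   # model predictions, placeholder data

print(f"accuracy        : {accuracy_score(y_true, y_pred):.4f}")
print(f"macro precision : {precision_score(y_true, y_pred, average='macro', zero_division=0):.4f}")
print(f"macro recall    : {recall_score(y_true, y_pred, average='macro', zero_division=0):.4f}")
print(f"macro F1        : {f1_score(y_true, y_pred, average='macro', zero_division=0):.4f}")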

Conclusion For the classification of weft-knitted fabrics, a double-sided image dataset was constructed, and an end-to-end classification method for knitted fabric structure was proposed based on GhostNet, a lightweight convolutional neural network. The experimental results show that the CBAM module enhances the feature discrimination between different fabrics and thus improves network performance, and that the double-sided features of weft-knitted fabric are extracted efficiently by the dual-branch network architecture. Compared with other CNN-based classification methods, the proposed method achieves higher classification accuracy while consuming fewer resources, which facilitates deployment on mobile or embedded devices. In future work, the case in which a single fabric image contains multiple fabric structures will become the focus of research, which is of great significance for further improving the recognition efficiency of the algorithm in the actual design and production of fabrics.

Key words: weft-knitted fabric, image recognition, lightweight convolutional neural network, two-branch network, attention mechanism

CLC Number: 

  • TS181

Fig.1

Examples of weft-knitted fabric samples. (a) Pique; (b) 1×1 rib; (c) 2×2 rib; (d) 3×3 rib; (e) 4×2 rib; (f) 4×3 rib; (g) Terry; (h) Plain; (i) Lacoste

Tab.1

Overview of weft-knitted fabric dataset

Sample No.  Category   Areal density/(g·m⁻²)      Number of samples/sets
0           Pique      180, 200, 210, 220         1 228
1           1×1 rib    170, 200, 220              636
2           2×2 rib    220, 230, 240              908
3           3×3 rib    200, 210, 220, 230, 240    868
4           4×2 rib    200, 210, 230              780
5           4×3 rib    180, 200, 210, 220         684
6           Terry      250, 280, 300              880
7           Plain      170, 190, 200, 220, 230    1 256
8           Lacoste    190, 210, 220              992

Fig.2

Ghost module

Fig.3

Convolutional block attention module

Fig.4

Structure of channel attention module (a) and spatial attention module (b)

Fig.5

Improved bottleneck. (a) Stride=1; (b) Stride=2

Fig.6

Architecture and parameters of dual-branch network

Fig.7

Effect of data augmentation. (a) Iterative curves of loss values on training set; (b) Iterative curves of accuracy on validation set

Fig.8

Influence of different dropout rates

Fig.9

Confusion matrix

Tab.2

Number of misclassified samples in test set

Sample No.  Category   Test samples/sets   Misclassified samples/sets
0           Pique      246                 0
1           1×1 rib    127                 1
2           2×2 rib    182                 0
3           3×3 rib    174                 3
4           4×2 rib    156                 1
5           4×3 rib    137                 2
6           Terry      176                 0
7           Plain      251                 1
8           Lacoste    198                 0
—           Total      1 647               8

Fig.10

Partial feature images of weft-knitted fabric. (a) Original image; (b) Multi-layer feature images

Tab.3

Results of ablation experiments

Model                            FLOPs/G   Params/M   Macro precision   Macro recall   Macro F1   Accuracy/%
GhostNet (no attention module)   0.31      3.64       0.962 4           0.949 8        0.954 3    95.81
GhostNet (SE module)             0.31      5.37       0.982 9           0.983 6        0.983 1    98.48
Proposed method                  0.31      4.62       0.994 1           0.994 6        0.994 2    99.51

Tab.4

Experimental results of different models

Model             FLOPs/G   Params/M   Macro precision   Macro recall   Macro F1   Accuracy/%
AlexNet           0.61      24         0.911 5           0.910 9        0.910 9    91.98
ResNet18          3.65      11.19      0.979 4           0.977 1        0.977 9    97.99
MobileNet V2      0.65      2.25       0.984 0           0.980 9        0.981 8    98.42
MobileNet V3      0.12      2.12       0.954 2           0.948 4        0.950 8    95.50
DenseNet          5.79      6.97       0.983 4           0.980 8        0.982 0    98.29
EfficientNet      0.82      4.03       0.980 2           0.978 3        0.979 0    98.11
ShuffleNet        0.30      1.27       0.943 4           0.947 4        0.944 6    95.02
Proposed method   0.31      4.62       0.994 1           0.994 6        0.994 2    99.51
[1] XIN B, ZHANG J, ZHANG R, et al. Color texture classification of yarn-dyed woven fabric based on dual-side scanning and co-occurrence matrix[J]. Textile Research Journal, 2017, 87(15): 1883-1895.
[2] ROOMI M M, SARANYA S. Bayesian classification of fabrics using binary co-occurrence matrix[J/OL]. International Journal of Information Sciences and Techniques, 2012, 2(2): 1-9. https://ssrn.com/abstract=3827595.
[3] JING J, XU M, LI P, et al. Automatic classification of woven fabric structure based on texture feature and PNN[J]. Fibers and Polymers, 2014, 5(15): 1092-1098.
[4] TALAB A R R, SHAKOOR M H. Fabric classification using new mapping of local binary pattern[C]// 2018 International Conference on Intelligent Systems and Computer Vision. Fez: IEEE, 2018: 1-4.
[5] MALACA P, ROCHA L F, GOMES D, et al. Online inspection system based on machine learning techniques: real case study of fabric textures classification for the automotive industry[J]. Journal of Intelligent Manufacturing, 2019, 30(1): 351-361.
[6] GONG X, YUAN L, YANG Y, et al. Classification of colored spun fabric structure based on wavelet decomposition and hierarchical hybrid classifier[J]. Journal of The Textile Institute, 2021, 113(9): 1832-1837.
[7] XIAO Z, LIU X, WU J, et al. Knitted fabric structure recognition based on deep learning[J]. Journal of The Textile Institute, 2018, 109(9): 1217-1223.
[8] HUSSAIN I, KHAN M A B, WANG Z, et al. Woven fabric pattern recognition and classification based on deep convolutional neural networks[J]. Electronics, 2020, 9(6): 1048.
[9] ZHANG X, ZHOU X, LIN M, et al. ShuffleNet: an extremely efficient convolutional neural network for mobile devices[C]// 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 6848-6856.
[10] SANDLER M, HOWARD A, ZHU M, et al. MobileNetV2: inverted residuals and linear bottlenecks[C]// 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 4510-4520.
[11] TAN M, LE Q. EfficientNet: rethinking model scaling for convolutional neural networks[C]// International Conference on Machine Learning. New York: PMLR, 2019: 6105-6114.
[12] HAN K, WANG Y, TIAN Q, et al. GhostNet: more features from cheap operations[C]// 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Seattle: IEEE, 2020: 1580-1589.
[13] WOO S, PARK J, LEE J Y, et al. CBAM: convolutional block attention module[C]// European Conference on Computer Vision. Switzerland: Springer, 2018: 3-19.
[14] HU J, SHEN L, ALBANIE S, et al. Squeeze-and-excitation networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, 42(8): 2011-2023. DOI: 10.1109/TPAMI.2019.2913372.