Article

Extraction of the Microstructure of Wool Fabrics Based on Structure Tensor

1 School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
2 Shanghai Aerospace Control Technology Institute, Shanghai 201109, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(15), 6813; https://doi.org/10.3390/s23156813
Submission received: 29 May 2023 / Revised: 9 July 2023 / Accepted: 26 July 2023 / Published: 31 July 2023
(This article belongs to the Section Sensing and Imaging)

Abstract

The trends of “fashionalization”, “personalization” and “customization” of wool fabrics have prompted the textile industry to change the original processing design based on the experience of engineers and trial production. In order to adapt to the promotion of intelligent production, the microstructure of wool fabrics is introduced into the finishing process. This article presents an automated method to extract the microstructure from the micro-CT data of woven wool fabrics. Firstly, image processing was performed on the 3D micro-CT images of the fabric. The raw grayscale data were converted into eigenvectors of the structure tensor to segment the individual yarns. These data were then used to calculate the three parameters of diameter, spacing and the path of the center points of the yarn for the microstructure. The experimental results showed that the proposed method was quite accurate and robust on woven single-ply tweed fabrics.

1. Introduction

Wool has good inherent properties and appearance characteristics. It has always been a popular and high-grade fabric. With the improvement of living standards, the demand for wool fabric is also more diversified and has begun to pursue “fashionalization”, “personalization” and “customization” [1,2].
The feeling that a piece of fabric brings to the eye, touch and other senses is often described as the style. The textile industry defines hundreds of styles for different types of materials for various purposes, which are intricately and intrinsically linked to each other and together determine the fabric’s quality.
Styles are usually evaluated on the basis of personal experience, supplemented by a few physical quantities measured by instruments. The results are generally highly subjective and vary individually. In addition, the expression of the customer’s needs is often colloquial and comparative in nature. The producer’s understanding of the customer’s requirements is prone to bias. Agreement between the parties may require several exchanges, which will undoubtedly reduce productivity and the customer’s experience.
Customization means processing products with different requirements in small lots and multiple batches, with high efficiency. In the past, the design process for mass production typically followed a cycle of producing a small number of trial samples, comparing the differences between the samples and the requirements, adjusting the process, and producing and comparing samples again. If this process were used in customization, it would consume excessive time and material costs and deter potential customers [3].
In the face of the problems above, textile companies are seeking to quantify the properties of wool fabrics in order to establish a mathematical link between the processing process and the finished style, and to further realize the automated design of the processing process. Based on the theory that the microstructure of a fabric determines its macroscopic properties [4,5,6,7,8,9], it is feasible to introduce the microstructure instead of traditional quality control methods in the selection of raw materials, the design process and the production processes of fabric processing [10]. There are three main geometrically described parameters (shown in Figure 1) that characterize the microstructure: the yarn’s properties (e.g., yarn diameter R), the yarn’s spacing D and the topology, expressed in terms of the path of the center points [11].
Manual measurement is a straightforward method used to obtain the microstructure of wool fabrics, which is time-consuming and laborious. Traditional measurement tools (e.g., calipers) are applied to individual yarns disassembled from the fabric, which breaks the mutual constraint between the yarns and causes errors in the results of measurement.
With the widening application of computer vision in the textile industry [12,13,14], it is more convenient to process the high-definition top view images of fabric for feature extraction. Luan et al. [15] used the polarization properties of yarns for accurate segmentation of the warps and wefts, overcoming the requirement of other methods regarding the number of colors contained in the textile and allowing the processing of fabrics consisting of only two colors. Fang et al. [16] obtained the position of the yarn from the projected image of the fabric and the texture from the reflected image, which could effectively handle fabrics where the warps and wefts are in different colors. Xiang et al. [17] improved the luminance projection method to locate yarns and knots, which is effective for extracting the features of yarn-dyed fabric. A calibration method for cell phone photos [18] was proposed to reduce the requirement for instruments and to achieve real-time measurements.
More studies have been based on neural networks to label the region or trajectory of yarns in an image. The texture-structure extraction network proposed by Yuan et al. [19] extracted periodic texture information from fabric images to accomplish yarn segmentation. Dai et al. [20] designed a network consisting of a dilated feature network and a feature alignment module to detect each segment of the yarn based on rotating object detection. Meng et al. [21] improved the learning of yarn features by detecting yarns and floats in multitask learning.
However, the methods mentioned above only analyze the fabric’s structure at the two-dimensional level and cannot obtain three-dimensional information such as the interlacing relationship between the yarns.
Micro-CT images can capture the internal three-dimensional structure of a material with high precision in a non-destructive manner [22] and can effectively capture the full information of the fabric’s microstructure [23]. On this basis, Shinohara et al. [24] designed a correlation function between a cylindrical model of an ideal yarn and its real voxels. The real voxels were matched to the model by maximizing the value of the function to obtain the position of each yarn. Although the method is applicable to fabrics of various weaves, its computation requires iteration and a priori knowledge.
More methods tend to slice the 3D image pixel by pixel along a certain coordinate axis rather than processing the stereo information directly. The slices are processed individually and then stitched back to the original structure. By slicing in the direction parallel to the fabric’s plane, excellent networks in the field of image segmentation such as U-Net [25,26] or DCNN [27] were used to label the warp and weft segments on the slices.
It is more common to slice the image in a direction perpendicular to the plane of fabric. Pidou-Brion et al. [28] proposed a yarn segmentation method with deformable meshes to fit the fault volume of the yarn to achieve the modeling of the internal structure of composites at the mesoscale.
The most common approach is still the semantic segmentation of slices using deep learning. The segmentation network proposed by Guo et al. [29] combined coarse-to-fine segmentation and region-wise segmentation to achieve accurate segmentation of the yarn’s cross-sections. Song et al. [30] used Leaky ReLu as the activation function of U-Net to improve the robustness and efficiency of the network. The identification of binder yarns in composites was enhanced by DCNN by Ali et al. [31].
Training neural networks usually requires a large number of learning samples, and public datasets are often lacking in the textile field. To reduce expensive manual annotation, training samples can be obtained by generating pseudo-images [32] or pseudo-labels [33], given that different slices share similar cross-sectional structures. Zheng et al. [34] proposed a new data augmentation algorithm to generate realistic artificial datasets. Applying a transfer learning strategy [35], leveraging networks trained on other samples, can also effectively reduce the need for data when learning new samples.
The objects of the methods above are mainly composite materials made of carbon fibers. The cross-section has a regular shape with clear edges and good repeatability on different slices. In contrast, wool yarn is a soft, easily deformable substance. After the fabric is made, the warps and wefts are usually tightly adhered. The cross-section (shown in Figure 2) is mostly blurred at the edges and has no fixed shape. This severely increases the difficulty of creating annotations for datasets, making deep learning unfeasible.
Another practice is to identify the cross-sections of yarns perpendicular to the plane of the slices by template matching and then to reconstruct the stereoscopic shape of the yarn [36,37,38]. The template is usually a circle or an ellipse that approximates the shape of the yarn’s cross-section [39], or a manually calibrated initial cross-section of the yarn [36].
The template methods are usually used to split yarns or fibers that are aligned in the same direction. The slice of fabric usually contains both yarns parallel to the plane of the slice and yarns perpendicular to it. The former are usually larger in area and have higher and more uniform gray values, making it easier to pass the template’s matching conditions than the latter. This ultimately leads to incorrect yarn profiles and an illogical path of the center points.
In this article, a yarn segmentation method based on the structure tensor is proposed for the unique characteristics of wool fabrics. Yarn segmentation is performed by replacing the gray values with eigenvectors of the structure tensor as the feature analyzed by the template tracking method to extract the three parameters of the yarn, namely the diameter, spacing and the center points’ path, which characterize the microstructure.
The remainder of this article is structured as follows. The experimental samples, the data acquisition method and the proposed algorithm are described in detail in Section 2. Section 3 contains the experimental results, and Section 4 presents the conclusions.

2. Materials and Methods

The algorithm proposed for extracting wool fabric’s microstructural features from micro-CT images is divided into two steps: yarn segmentation from micro-CT images and extraction of the feature parameters based on the segmentation results. The acquisition and preprocessing of the raw data are explained in Section 2.1. The next two sections interpret, in detail, the two parts of the yarn segmentation algorithm: feature transformation based on the structure tensor (Section 2.2) and the template selection algorithm based on the new features (Section 2.3). Section 2.4 illustrates the post-processing of yarn segmentation. The extraction of the feature parameters is accounted for in Section 2.5.

2.1. Materials and Preprocessing

The samples used were woven single-ply tweed fabrics. Figure 3 shows a physical image of one of the samples, with the basic information shown in Table 1. The basic information comes from the design sheet of the sample, except for the density of yarns. The density of yarns, meaning the number of yarns contained in a unit of length, was obtained by manually counting the number of yarns within 2 cm and then scaling up. The warp of the sample was made of twisted double-stranded yarn and the weft was single-stranded yarn, which are the two most common yarn styles. The ROI of this study, i.e., the minimum repeat unit of the sample’s texture, was four warps by four wefts.
Images of the samples were acquired by a micro-CT scanner, the details of which are recorded in Appendix A. The sample was fixed vertically on the sample table by a rectangular carbon fiber plate with an inner size of 25 × 20 mm and was rotated to acquire the two-dimensional projection information of the sample’s interior. The operating parameters (recorded in Table 2) were set according to the technical manual of the instrument and previous experiments in our laboratory.
The 2D projection images of the samples were reconstructed into 3D volume images and corrected for annular artifacts using the reconstruction software NRecon. The reconstruction process introduced noise into the background corresponding to air. The OTSU method [40] was used to calculate a global threshold to split the foreground and background, and the background pixels were set to 0. The sample had a natural inhomogeneity, and the reconstruction process also introduced noise into the foreground, so a Gaussian filter was applied to smooth it while preserving the gray distribution of the image as much as possible. The image was then normalized to (0–255). The result is shown in Figure 4. For the coherence of the text, the detailed results of each preprocessing step are recorded in Appendix B. In the following steps, the X, Y and Z axes of the 3D volume image were aligned with the directions of the weft, thickness and warp, respectively.
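As an illustration, the thresholding-and-smoothing chain described above (global OTSU threshold, Gaussian filtering, normalization to 0–255) can be sketched in Python with NumPy and SciPy. This is a simplified reconstruction, not the NRecon pipeline itself, and the smoothing width `sigma` is an assumed parameter.

```python
import numpy as np
from scipy import ndimage

def preprocess_volume(vol, sigma=1.0):
    """Suppress the air background via Otsu's global threshold, smooth the
    foreground with a Gaussian filter, then normalize to 0-255."""
    # Otsu: pick the threshold maximizing the between-class variance.
    hist, bin_edges = np.histogram(vol, bins=256)
    bin_centers = (bin_edges[:-1] + bin_edges[1:]) / 2
    w0 = np.cumsum(hist).astype(float)          # background class weight
    w1 = w0[-1] - w0                            # foreground class weight
    m0 = np.cumsum(hist * bin_centers)
    mu0 = m0 / np.maximum(w0, 1e-12)            # background class mean
    mu1 = (m0[-1] - m0) / np.maximum(w1, 1e-12) # foreground class mean
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    t = bin_centers[np.argmax(var_between)]

    out = vol.astype(float)
    out[out <= t] = 0.0                         # background (air) -> 0
    out = ndimage.gaussian_filter(out, sigma)   # smooth foreground noise
    if out.max() > 0:
        out = out / out.max() * 255.0           # normalize to (0-255)
    return out
```

The same routine applies unchanged to 2D slices or the full 3D volume, since both the histogram and the Gaussian filter are dimension-agnostic.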
The fine hairiness on the yarn’s surface and lint with low grayscale values introduced errors into the estimation of the yarn’s diameter and the fitting of the center points’ path, so they needed to be removed before segmenting the yarns; only the tightly structured central part of the yarn should be retained. Mathematical morphology is an image analysis method that uses structuring elements to identify and extract shapes in an image that resemble them [41], and consists of four basic operations: erosion, dilation, opening and closing. Opening first erodes and then dilates the image, which effectively removes small objects other than the main object that do not contain the structuring element. Opening was performed on the 3D volume image with a spherical structuring element, and the connected component with the highest number of non-zero pixels was retained.
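A minimal sketch of this hairiness-removal step, assuming SciPy’s morphological operators; the structuring-element radius and the default connectivity used when labeling the largest connected component are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy import ndimage

def spherical_element(radius):
    """Binary ball structuring element of the given radius."""
    r = int(radius)
    grid = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
    return (grid ** 2).sum(axis=0) <= r ** 2

def strip_hairiness(vol, radius=2):
    """Morphological opening (erosion then dilation) removes fine surface
    hairs; keeping only the largest connected component discards loose lint."""
    fg = vol > 0
    opened = ndimage.binary_opening(fg, structure=spherical_element(radius))
    labels, n = ndimage.label(opened)
    if n == 0:
        return np.zeros_like(vol)
    sizes = ndimage.sum(opened, labels, index=np.arange(1, n + 1))
    keep = labels == (np.argmax(sizes) + 1)   # largest component only
    return np.where(keep, vol, 0)
```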

2.2. Structure Tensor

The gray value of the volume image indicates the capacity to absorb X-rays: the higher the gray value, the higher the absorption capacity. For wool fabrics, the X-ray absorption capacity is positively correlated with density. The density inside the yarn decreases from the center to the periphery in the radial direction and remains similar in the axial direction, showing a clear anisotropy.
Anisotropy can be characterized by a structure tensor. The eigenvector of the structure tensor corresponding to its smallest eigenvalue is parallel to the direction of the smallest change in the grayscale value [42]; that is, for pixels within the yarn, this eigenvector is theoretically parallel to the axial direction of the yarn. The structure tensor is calculated by the following equation [43]
$$ S(p) = \begin{bmatrix} \langle I_x I_x \rangle_N & \langle I_x I_y \rangle_N & \langle I_x I_z \rangle_N \\ \langle I_y I_x \rangle_N & \langle I_y I_y \rangle_N & \langle I_y I_z \rangle_N \\ \langle I_z I_x \rangle_N & \langle I_z I_y \rangle_N & \langle I_z I_z \rangle_N \end{bmatrix} $$
where $I$ denotes the grayscale image and $\langle \cdot \rangle_N$ denotes the mean over all pixels within the cubic region $N$ of side length $2\omega + 1$ centered on the pixel $p = (x_p, y_p, z_p)$ being evaluated:
$$ N = \left\{ (x, y, z) \,:\, |x - x_p| \le \omega,\ |y - y_p| \le \omega,\ |z - z_p| \le \omega \right\} $$
The derivatives are calculated using the five-point differential formula, which is shown below for the x-axis direction [43], and the same formula is used for the y- and z-axis directions.
$$ I_x = \frac{I(x-2, y, z) - 8I(x-1, y, z) + 8I(x+1, y, z) - I(x+2, y, z)}{12} $$
The eigenvector $E = (e_1, e_2, e_3)$ corresponding to the smallest eigenvalue is projected onto a plane perpendicular to the yarn’s axial direction, and the absolute value of the orientation $\phi$ is then taken. The warps’ eigenvector is projected onto the Y–X plane, and the orientation is determined as follows:
$$ \phi = \left| \operatorname{atan2}(e_2, e_1) \right| $$
The corresponding equation for the wefts is obtained by projecting their eigenvectors onto the Y–Z plane. Figure 5 shows a diagram of the orientation of the warps and wefts. The orientation of yarns perpendicular to the slice plane is near 0°, and that of yarns parallel to it is near 90°. In other words, the region where the sum of the orientations is smallest is the cross-section of a yarn.
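The pipeline of this section (five-point derivatives, windowed means, eigendecomposition, orientation) can be sketched as follows. This is a reconstruction under assumptions: the cubic mean $\langle \cdot \rangle_N$ is implemented with a uniform filter, and the ordering of the eigenvector components $(e_1, e_2, e_3)$ is taken to match the gradient axes, which the paper does not state explicitly.

```python
import numpy as np
from scipy import ndimage

def orientation_field(I, omega=5):
    """Per-voxel structure tensor; returns the eigenvector of the smallest
    eigenvalue (the local yarn-axis direction) and the orientation phi in
    degrees, phi = |atan2(e2, e1)|."""
    # Five-point derivative kernel: (I[-2] - 8 I[-1] + 8 I[+1] - I[+2]) / 12
    w = np.array([1.0, -8.0, 0.0, 8.0, -1.0]) / 12.0
    Ix = ndimage.correlate1d(I, w, axis=0)
    Iy = ndimage.correlate1d(I, w, axis=1)
    Iz = ndimage.correlate1d(I, w, axis=2)
    # <.>_N : mean over the cubic window of side 2*omega + 1
    mean = lambda a: ndimage.uniform_filter(a, size=2 * omega + 1)
    S = np.stack([
        np.stack([mean(Ix * Ix), mean(Ix * Iy), mean(Ix * Iz)], axis=-1),
        np.stack([mean(Ix * Iy), mean(Iy * Iy), mean(Iy * Iz)], axis=-1),
        np.stack([mean(Ix * Iz), mean(Iy * Iz), mean(Iz * Iz)], axis=-1),
    ], axis=-2)                            # shape (..., 3, 3), symmetric
    evals, evecs = np.linalg.eigh(S)       # eigenvalues in ascending order
    e = evecs[..., :, 0]                   # eigenvector of smallest eigenvalue
    phi = np.degrees(np.abs(np.arctan2(e[..., 1], e[..., 0])))
    return e, phi
```

The window size `omega=5` follows the value selected experimentally in Section 3.1.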

2.3. Segmentation by Template

Before comparing the value of the orientation as a new feature of the volume image, a preliminary prediction of the yarn’s dimensions is needed to determine the range to be included in the sum, i.e., an elliptical template needs to be created for each yarn.
The total cross-sectional area varies little between slices perpendicular to the yarns’ radial direction and much more between slices parallel to it. The slice corresponding to a local minimum of the total number of non-zero pixels lies between two yarns parallel to the slice plane and can be used to coarsely divide the yarns: it contains the smallest area of yarns running in other directions and thus interferes least with the accuracy of the cross-section annotation. The cross-sections of the yarns are discriminated on this slice.
In most cases, the cross-sections of the yarns selected by the process above are separately connected components, which are occasionally connected to a yarn in another direction. The cross-section is labeled using the region growth algorithm [44] with the seed set to the pixel with the highest grayscale value. The center of the yarn is denser than the edges and absorbs more X-rays, and the pixel with the highest gray value is usually in the center of the cross-section. The growth condition is
$$ \left| \phi_i - \phi_{mean} \right| < 1 $$
where $\phi_i$ denotes the orientation of the pixel being judged and $\phi_{mean}$ denotes the average orientation angle of the already selected pixels. Figure 6a,b shows the results of labeling a set of warps and wefts.
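The region-growing step can be sketched as below, assuming 4-connectivity and a running mean of the orientations of the pixels accepted so far; `tol` corresponds to the threshold of 1 in the growth condition, with orientations in degrees.

```python
import numpy as np
from collections import deque

def region_grow(phi, seed, tol=1.0):
    """Label a yarn cross-section by region growing on the orientation map:
    a neighbor joins the region while |phi_i - phi_mean| < tol, where
    phi_mean is the running mean over the pixels accepted so far."""
    h, w = phi.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(phi[seed]), 1
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4-connectivity
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(phi[nr, nc] - total / count) < tol:
                    mask[nr, nc] = True
                    total += float(phi[nr, nc])
                    count += 1
                    queue.append((nr, nc))
    return mask
```

In the paper the seed is the highest-grayscale pixel of the cross-section; here it is simply passed in as `(row, col)`.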
The parameters of the elliptical template generated from the cross-section are calculated from the geometric and central moments [45]. Because the shape of the cross-section is the main object of interest, the image is binarized to eliminate the effect of grayscale; that is, all non-zero pixels in the cross-section are assigned a value of 1, representing the foreground. The center point $(\bar{x}, \bar{y})$, the radii of the major axis $a$ and minor axis $b$, and the inclination angle $\theta$ of the major axis are given by [46]
$$ \bar{x} = \frac{m_{10}}{m_{00}}, \qquad \bar{y} = \frac{m_{01}}{m_{00}} $$
$$ a = \sqrt{ \frac{2 \left( \mu_{20} + \mu_{02} + \sqrt{ (\mu_{20} - \mu_{02})^2 + 4\mu_{11}^2 } \right)}{\mu_{00}} } $$
$$ b = \sqrt{ \frac{2 \left( \mu_{20} + \mu_{02} - \sqrt{ (\mu_{20} - \mu_{02})^2 + 4\mu_{11}^2 } \right)}{\mu_{00}} } $$
$$ \theta = \frac{1}{2} \arctan \left( \frac{2\mu_{11}}{\mu_{20} - \mu_{02}} \right) $$
where $m_{pq}$ and $\mu_{pq}$ denote the $(p+q)$-order geometric moment and central moment, respectively; the mass center used in the calculation of the central moments is the center point of the ellipse. Figure 6c and Figure 7d show the elliptical templates built on the basis of the labeling results.
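The ellipse-from-moments equations above translate directly into code. The sketch below computes the template parameters from a binary cross-section mask; using `arctan2` for the inclination angle is a minor implementation choice for numerical robustness when $\mu_{20} \approx \mu_{02}$, not something specified in the paper.

```python
import numpy as np

def ellipse_from_mask(mask):
    """Fit an elliptical template to a binary cross-section via geometric
    and central moments: center (xbar, ybar), semi-axes a >= b, tilt theta."""
    ys, xs = np.nonzero(mask)              # foreground pixel coordinates
    m00 = float(len(xs))                   # zeroth moment = area
    xbar, ybar = xs.mean(), ys.mean()      # m10/m00, m01/m00
    mu20 = ((xs - xbar) ** 2).sum()        # second-order central moments
    mu02 = ((ys - ybar) ** 2).sum()
    mu11 = ((xs - xbar) * (ys - ybar)).sum()
    common = np.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    a = np.sqrt(2 * (mu20 + mu02 + common) / m00)
    b = np.sqrt(2 * (mu20 + mu02 - common) / m00)
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return xbar, ybar, a, b, theta
```

For a filled disk of radius $R$, both semi-axes evaluate to approximately $R$, which is a quick sanity check on the formulas.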
The dimensions of the cross-sections (the radii of the major axis $a$ and minor axis $b$) in adjacent slices are almost constant. The variations lie mainly in the center point coordinates $(\bar{x}, \bar{y})$ for both warps and wefts, and in the inclination angle $\theta$ for the double-stranded yarn. Within the small ROI, the vertical coordinate of the center point varies much more than the horizontal coordinate, and the horizontal coordinates across all slices can be approximated by a straight line.
The position of the cross-section is traced toward both sides, starting from the initial slice where the template was created. The vertical coordinate $\bar{y}_i$ of the center point of the cross-section in the current slice $i$ is shifted in steps of 1 pixel within $[\bar{y}_{i-1} - 5,\ \bar{y}_{i-1} + 5]$, and the horizontal coordinate $\bar{x}_i$ is shifted in steps of 0.2 pixels within $[\bar{x}_1 - 2,\ \bar{x}_1 + 2]$, where $\bar{y}_{i-1}$ denotes the vertical coordinate of the center point in the previous slice $i-1$ and $\bar{x}_1$ denotes the initial value of the horizontal coordinate. The inclination angle $\theta$ is varied in steps of 5° within [0°, 180°].
If the horizontal coordinate is also shifted from the value on the previous slice, it may accumulate errors to the extent that the result of labeling shifts to the adjacent yarn, so the range of the horizontal coordinate is limited by using the initial value as the base point.
Before calculating the sum of the orientation angles of all pixels within the template, RLOESS smoothing [47] is first applied to the orientation to remove outliers. The orientation of zero-valued pixels is set to 90°, so that yarns parallel to the slice plane become indistinguishable from the background.
The smaller the sum of all pixel orientation angles, the more pixels within the template that belong to yarns perpendicular to the slice’s plane. The area where the minimum value of the sum changes the least compared with the previous slice is chosen as the cross-section of the yarn in that slice. By connecting all slices in series, the basic shape of the yarn can be obtained.
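A brute-force version of this per-slice template search (vertical coordinate in ±5 px steps of 1, horizontal coordinate in ±2 px steps of 0.2 around the initial value, inclination in 5° steps) might look like the sketch below; the ellipse rasterization and the first-minimum tie-breaking are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def ellipse_mask(shape, cx, cy, a, b, theta):
    """Boolean mask of an ellipse with center (cx, cy), semi-axes a, b,
    and tilt theta (radians)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    xr = (xx - cx) * np.cos(theta) + (yy - cy) * np.sin(theta)
    yr = -(xx - cx) * np.sin(theta) + (yy - cy) * np.cos(theta)
    return (xr / a) ** 2 + (yr / b) ** 2 <= 1.0

def best_template(phi, x_init, y_prev, a, b):
    """Search the candidate grid and return the first (x, y, theta) pose
    whose elliptical template encloses the smallest orientation sum."""
    best, best_pose = np.inf, None
    for dy in range(-5, 6):                              # y: steps of 1 px
        for dx in np.arange(-2.0, 2.0 + 1e-9, 0.2):      # x: steps of 0.2 px
            for th in np.deg2rad(np.arange(0, 180, 5)):  # theta: 5 deg steps
                m = ellipse_mask(phi.shape, x_init + dx, y_prev + dy, a, b, th)
                s = phi[m].sum()          # small sum = mostly on-axis pixels
                if s < best:
                    best, best_pose = s, (x_init + dx, y_prev + dy, th)
    return best_pose
```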

2.4. Post-Processing

The yarns extracted after the method above still have the following problems: (1) overlaps between adjacent warps or adjacent wefts, (2) overlaps in the region of the intersection of warps and wefts, and (3) residual regions. The following treatments are applied to optimize them, respectively.
  • The overlap between adjacent warps or adjacent wefts is processed in slicing order. The line connecting the highest point and the lowest point of the overlapping region on the slice is used as the new boundary of these two yarns (shown in Figure 7a).
  • The overlap in the intersecting regions of warps and wefts, viewed in slices along the X-axis or the Z-axis, is divided equally column by column (shown in Figure 7b).
  • Pixels without an attribution are given the label of the yarn with the smallest Euclidean distance from it (and a distance less than the threshold).

2.5. Feature Extraction

The characteristics of the microstructure of the wool fabric were obtained by analyzing the cross-sections of the split yarns after slicing along the coordinate axis parallel to their axes.
  • Path of the center points: Usually, the path of the center points is fitted with B-splines [48]. However, after the center point of the cross-section on each slice was calculated by Equation (6), no further interpolation was required. The discrete center-point data were noisy, so the approximate path was smoothed by polynomial fitting. The horizontal coordinate of the center point was approximately a straight line within the ROI, so the relationship between $\bar{x}$ and the slice coordinate was fitted with a first-order polynomial. The path of the vertical coordinate was more complex and approximated a wavy line, so the relationship between $\bar{y}$ and the slice coordinate was fitted with a sixth-order polynomial.
  • Diameter: The cross-section of a yarn is usually not a perfect circle, so the diameter can be approximated by the length of the major axis of the cross-section. The cross-section of the straightened yarn should be perpendicular to the path of the center points, which is not identical to the cross-section on a slice along a coordinate axis. The normal plane perpendicular to the approximated path of the center points was therefore used to obtain the cross-section of the yarn, and the radius of the major axis of this cross-section was calculated using Equation (7). The process was repeated every 10 pixels along the axis, and the average value was taken as the radius of the yarn.
  • Spacing: Adjacent yarns in the same direction are not strictly parallel, and spacing is strictly undefined between two non-parallel lines. Because the angle between the yarns is small, however, they can be approximated as two parallel lines within the ROI. The spacing between two adjacent yarns $j$ and $j+1$ was calculated as the difference between the means of the horizontal coordinates of their approximate paths:
    $$ d = \frac{1}{n} \sum_{i=0}^{n} \bar{x}_i^{\,j+1} - \frac{1}{n} \sum_{i=0}^{n} \bar{x}_i^{\,j} $$
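The fitting and spacing steps above can be sketched as follows; the helper names are illustrative, and NumPy’s `polyfit` stands in for whatever fitting routine the authors used.

```python
import numpy as np

def fit_center_path(slice_idx, x_centers, y_centers):
    """Smooth the noisy per-slice center points: x is fitted with a
    first-order polynomial (near-straight within the ROI), y with a
    sixth-order one (wavy line)."""
    px = np.polyfit(slice_idx, x_centers, 1)
    py = np.polyfit(slice_idx, y_centers, 6)
    return np.polyval(px, slice_idx), np.polyval(py, slice_idx)

def yarn_spacing(path_x_j, path_x_j1):
    """Spacing between adjacent, approximately parallel yarns: difference
    of the mean horizontal coordinates of their fitted paths."""
    return np.mean(path_x_j1) - np.mean(path_x_j)
```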

3. Results and Discussion

3.1. Integration Interval

The ability of the orientation to characterize the features of the 3D volume image is closely related to the integration interval of the structure tensor. The natural inhomogeneity of the density of the sample is eliminated through the mean value over a large integration interval. However, the integration interval of pixels at the edge of the yarn contains pixels of other yarns and backgrounds. Integration intervals that are too large can instead induce new errors.
The size of the integration interval was obtained by comparing the accuracy of the orientation-based segmentation. The slice in Figure 8a was segmented by calculating the thresholds of the orientations obtained under different integration intervals by OTSU [40]. In the absence of ground truth, the segmentation effect was evaluated by the ratio of the area of the wefts (perpendicular to the slicing plane) and the warp (parallel to the slicing plane).
Figure 9 demonstrates the effect of the integration interval on the segmentation. The ratio of the area of wefts to the warp was minimized when ω = 5. From the actual segmentation results shown in Figure 8b,c, it can be seen that the wefts were basically correctly classified. In contrast, the pixels in the red circle in Figure 8b that actually belonged to the warp were incorrectly labeled as wefts. The lower the ratio, the fewer pixels in the warp that were incorrectly segmented into the wefts, and the better the effect of characterization.
Meanwhile, the time consumed to calculate the structure tensor grew rapidly with the size of the integration interval. Setting ω = 5 ensured both an acceptable computation time and an excellent characterization effect.

3.2. Yarn Segmentation

The ground truth of segmentation on the slices of the CT-scanned images of wool fabrics were manually labeled on the grayscale images to validate the proposed method. All the cross-sections of yarns perpendicular to the slice plane were taken as positive samples, and those of yarns parallel to the slice plane were taken as negative samples, regardless of the background (i.e., zero-valued pixels) and the specific yarn they belonged to.
It is hard to manually label the CT-scanned images of wool fabric to establish ground truth for every slice within the ROI. Ground truths were therefore established only for the four slices with a local maximum of the area of the wefts when the warps were used as positive samples, and vice versa. These eight slices were at the center of the weft or warp, parallel to their planes (Figure 10).
Five ROIs were selected for validation in the sample with the center, bottom side, top side, left side, and right side. Their positions are shown in Figure 11. Figure 12 illustrates the confusion matrix when the warps and the wefts were taken as positive samples. In semantic segmentation, the confusion matrix was pixel-based.
PA and IOU are commonly used performance evaluation metrics. PA characterizes the proportion of the total number of pixels for which the prediction is correct: the elements on the diagonal of the confusion matrix represent the numbers of correctly predicted pixels in the positive and negative samples, and PA is the ratio of their sum to the sum of all elements. IOU is the ratio of the intersection to the union of the predictions and the ground truth; in this task, it is calculated as the ratio of true positives to the sum of all samples except the true negatives. Table 3 shows the PA and IOU when the positive samples were the warps and the wefts, respectively. The IOU exceeded 70% and the PA exceeded 85% for both.
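Given the binary confusion matrix entries (TP, FP, FN, TN), the metrics reported in Table 3 are computed as in this small sketch.

```python
def segmentation_metrics(tp, fp, fn, tn):
    """Pixel-level metrics from a binary confusion matrix: PA (pixel
    accuracy), IOU, precision, recall and F1."""
    pa = (tp + tn) / (tp + fp + fn + tn)  # correct pixels / all pixels
    iou = tp / (tp + fp + fn)             # all samples except true negatives
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"PA": pa, "IOU": iou, "precision": precision,
            "recall": recall, "F1": f1}
```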
Figure 13 shows the comparison between the predictions of the proposed method and the ground truth. In the visualization of the results, the periphery of the cross-section was well labeled, and the main errors appeared where the warps and wefts intersect. In the process of template selection, the orientation of zero-valued pixels was set to π/2, the theoretical mean orientation of a yarn parallel to the slicing plane. In practice, the radial directions of the warps and wefts and the plane of the fabric are not perfectly aligned with the three coordinate axes; the resulting angle makes the mean orientation of a yarn parallel to the slicing plane differ from the theoretical value. The template selection method prefers the intersecting part when the value assigned to zero-valued pixels is larger, so this value can be used as a penalty term to adjust the segmentation results.
Table 3 also records the precision, recall and F1 score of the proposed method. These three metrics are usually used for classification problems rather than semantic segmentation, but the evaluation phase only considered whether pixels belonged to the warps or the wefts (ignoring the few pixels that were discarded), so the task can be treated as a pixel-level binary classification problem. In Table 3, the warp had a higher precision while the weft had a higher recall. This is related to the volumes of the warps and wefts: the warps, as double-stranded yarns, had a larger cross-sectional area in the ground truth, while the wefts, as single-stranded yarns, had a smaller one. The warps therefore accumulated fewer false positive pixels, and the wefts fewer false negative pixels.

3.3. Feature Extraction

Figure 14 shows the 3D grayscale image after preprocessing and the results of yarn segmentation in ROI No. 1. Feature extraction was performed by the method defined in Section 2.5, and the results are recorded in Table 4 and Table 5.
The densities of the warps and wefts are traditional characteristics in the textile industry, referring to the number of yarns contained in a unit of length (10 cm was used in this study). Among the microstructural characteristics, density is replaced by the yarns’ spacing. The densities in Table 4 and Table 5 were obtained by dividing the unit of length by the yarns’ spacing and are recorded to facilitate a comparison with the basic information of the sample. The relative error between the warp density of this ROI and that of the sample was 1.0%, and 5.0% for the weft density. The finished wool fabric after treatment is an inhomogeneous object.
A difference between the local and overall density of between 3% and 5% is acceptable in the textile industry. The warps of the sample were thicker, denser and more closely aligned with each other. They were less likely to be displaced than the wefts. The experimental results were consistent with this pattern.
Because of the inhomogeneity of the wool fabric, the diameter of the yarn could not be measured again by other methods as a control group. The standard deviation of the warps’ diameters was 0.011 mm and that of the wefts’ diameters was 0.023 mm. The trend was consistent with the fact that single-stranded yarns are more prone to deformation than double-stranded yarns.

4. Conclusions

The purpose of this study was to automatically extract the microstructure of wool fabrics from the images scanned by micro-CT using image processing. The microstructure allows for more objective and precise control of the wool fabric’s properties during processing, replacing the traditional process design methods in the textile industry based on the experience of a team of engineers and repeated trial production.
The proposed two-step approach was developed for the unique characteristics of wool fabrics: the yarns are first segmented, and the microstructure is then extracted from them. First, the eigenvectors of the structure tensor are applied instead of grayscale values as features during segmentation, separating single yarns from the fabric in a template-selection manner. Second, the parameters of the microstructure are obtained from the segmented yarns by means of central moments and polynomial fitting, so that its geometric model can be reconstructed.
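The first step can be illustrated with a minimal sketch of a 3D structure tensor computation (not the authors' exact pipeline; the smoothing scales `sigma_g` and `sigma_t` are illustrative). For each voxel, the tensor averages outer products of smoothed intensity gradients; the eigenvector belonging to the smallest eigenvalue points along the locally homogeneous direction, i.e., the yarn axis:

```python
import numpy as np
from scipy import ndimage

def structure_tensor_orientation(vol, sigma_g=1.0, sigma_t=2.0):
    """Per-voxel dominant orientation from the 3D structure tensor.

    vol: 3D grayscale volume. Returns an array of shape vol.shape + (3,)
    holding, per voxel, the eigenvector of the smallest eigenvalue,
    which points along the locally constant (yarn) direction.
    """
    # Gaussian-derivative gradients along each of the three axes.
    g = [ndimage.gaussian_filter(vol, sigma_g,
                                 order=tuple(int(i == a) for i in range(3)))
         for a in range(3)]
    # Tensor entries J_ab = <g_a * g_b>, averaged over a neighborhood.
    J = np.empty(vol.shape + (3, 3))
    for a in range(3):
        for b in range(3):
            J[..., a, b] = ndimage.gaussian_filter(g[a] * g[b], sigma_t)
    w, v = np.linalg.eigh(J)        # eigenvalues in ascending order
    return v[..., :, 0]             # eigenvector of the smallest eigenvalue
```

For a volume that is constant along one axis (as a yarn is along its own axis), the returned vector aligns with that axis up to sign.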
The method has considerable credibility and can both provide a reference for the wool fabric processing industry to control product characteristics and form a database for the subsequent development of deep learning. In the future, the relationship between the microstructure and the macroscopic properties, and the effects of processing on the microstructure, will continue to be investigated, allowing the proposed method to further contribute to the development of intelligent production.

Author Contributions

Conceptualization, J.Z. and G.D.; methodology, J.Z.; software, J.Z.; validation, J.Z., Y.M. and G.D.; formal analysis, J.Z.; investigation, Y.M.; resources, G.D.; data curation, Y.M.; writing—original draft preparation, J.Z.; writing—review and editing, G.D., M.L. and X.C.; visualization, J.Z.; supervision, X.C.; project administration, G.D. and M.L.; funding acquisition, M.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China, Grant No. 62171283.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are not publicly available due to the company’s requirements.

Acknowledgments

We thank Jiaxing Licheng Wool Spinning Co., Ltd. for providing the material and technical support.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The instrument used in the experiment was a SkyScan-1275 micro-CT scanner from Bruker (Billerica, MA, USA). The appearance of the device is shown in Figure A1.
Figure A1. SkyScan-1275 Micro-CT scanner.

Appendix B

Figure A2 shows the effect of the two main preprocessing steps described in Section 2.1 on the 3D volume images of the samples: background removal based on Otsu's method and fiber removal based on morphological processing. The sample shown in Section 2.1 had no fibers on its surface, so no morphological treatment was applied to it. The second row of Figure A2 demonstrates the effect of fiber removal on another sample possessing abundant fibers.
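The two preprocessing steps can be sketched as follows. This is a minimal illustration, not the authors' implementation: the histogram bin count, structuring element and opening iteration count are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(vol: np.ndarray, nbins: int = 256) -> float:
    """Gray level maximizing the between-class variance (Otsu's method)."""
    hist, edges = np.histogram(vol, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(hist)                       # voxels at or below each bin
    w1 = w0[-1] - w0                           # voxels above each bin
    m0 = np.cumsum(hist * centers)
    mean0 = m0 / np.maximum(w0, 1)
    mean1 = (m0[-1] - m0) / np.maximum(w1, 1)
    between = w0 * w1 * (mean0 - mean1) ** 2
    return centers[np.argmax(between)]

def preprocess(vol: np.ndarray, open_iters: int = 2) -> np.ndarray:
    """Background removal by Otsu thresholding, then removal of thin
    surface fibers by morphological opening of the foreground mask."""
    mask = vol > otsu_threshold(vol)
    mask = ndimage.binary_opening(mask, iterations=open_iters)
    return np.where(mask, vol, 0.0)
```

Opening erodes away structures thinner than the effective structuring element (stray surface fibers) while the much thicker yarns survive and are restored by the subsequent dilation.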
Figure A2. The effect of the two main steps of preprocessing. (a) Sample A is the one shown in Section 2.1. (b) Sample B is another example used to show the effect of fiber removal.

References

1. He, Z.; Xu, J.; Tran, K.P.; Thomassey, S.; Zeng, X.; Yi, C. Modeling of Textile Manufacturing Processes Using Intelligent Techniques: A Review. Int. J. Adv. Manuf. Technol. 2021, 116, 39–67.
2. Tien, J.M. Toward the Fourth Industrial Revolution on Real-Time Customization. J. Syst. Sci. Syst. Eng. 2020, 29, 127–142.
3. Ma, K.; Thomassey, S.; Zeng, X. Development of a Central Order Processing System for Optimizing Demand-Driven Textile Supply Chains: A Real Case Based Simulation Study. Ann. Oper. Res. 2020, 291, 627–656.
4. Atwah, A.A.; Khan, M.A. Influence of Microscopic Features on the Self-Cleaning Ability of Textile Fabrics. Text. Res. J. 2023, 93, 450–467.
5. Nie, S.; Yu, L.; Sun, Z.; Wu, Z. Analytical Model for the Air Permeability of Parachute Fabric and Structure Parameters Sensitivity Analysis. J. Text. Inst. 2022, 113, 761–768.
6. Xiong, Y.; Yuan, L.; Gu, Q.; Wang, D.; Huo, D.; Liu, J. Correlation Analysis between Fabric Structure and Color Rendering of Polyester Colored Spun Woven Fabric Based on the Improved Relative Discrimination Criterion. Text. Res. J. 2022, 92, 2433–2445.
7. Ali, M.A.; Umer, R.; Khan, K.A.; Cantwell, W.J. In-Plane Virtual Permeability Characterization of 3D Woven Fabrics Using a Hybrid Experimental and Numerical Approach. Compos. Sci. Technol. 2019, 173, 99–109.
8. Li, N.-W.; Yick, K.-L.; Yu, A.; Ning, S. Mechanical and Thermal Behaviours of Weft-Knitted Spacer Fabric Structure with Inlays for Insole Applications. Polymers 2022, 14, 619.
9. Wang, Y.; Li, L.; Hofmann, D.; Andrade, J.E.; Daraio, C. Structured Fabrics with Tunable Mechanical Properties. Nature 2021, 596, 238–243.
10. Wang, M.; Ding, G. Method for Controlling Finished Fabric Characteristic by Microstructure. China Patent CN109557094B, 29 April 2022.
11. Lomov, S.V.; Huysmans, G.; Luo, Y.; Parnas, R.S.; Prodromou, A.; Verpoest, I.; Phelan, F.R. Textile Composites: Modelling Strategies. Compos. Part Appl. Sci. Manuf. 2001, 32, 1379–1394.
12. Wang, R.; Zhang, Z.-F.; Yang, B.; Xi, H.-Q.; Zhai, Y.-S.; Zhang, R.-L.; Geng, L.-J.; Chen, Z.-Y.; Yang, K. Detection and Classification of Cotton Foreign Fibers Based on Polarization Imaging and Improved YOLOv5. Sensors 2023, 23, 4415.
13. Wang, J.; Shi, Z.; Shi, W.; Wang, H. The Detection of Yarn Roll’s Margin in Complex Background. Sensors 2023, 23, 1993.
14. Wang, L.; Lu, Y.; Pan, R.; Gao, W. Evaluation of Yarn Appearance on a Blackboard Based on Image Processing. Text. Res. J. 2021, 91, 2263–2271.
15. Luan, H.; Toyoura, M.; Gu, R.; Terada, T.; Wu, H.; Funatomi, T.; Xu, G. Textile Image Recoloring by Polarization Observation. Vis. Comput. 2022.
16. Fan, M.; Deng, N.; Xin, B.; Zhu, R. Recognition and Analysis of Fabric Texture by Double-Sided Fusion of Transmission and Reflection Images under Compound Light Source. J. Text. Inst. 2022, 1–13.
17. Xiang, J.; Pan, R. Automatic Recognition of Density and Weave Pattern of Yarn-Dyed Fabric. Autex Res. J. 2022.
18. Xiang, Z.; Zhang, J.; Hu, X. Vision-Based Portable Yarn Density Measure Method and System for Basic Single Color Woven Fabrics. J. Text. Inst. 2018, 109, 1543–1553.
19. Yuan, X.; Xin, B.; Luo, J.; Xu, Y. An Investigation of Woven Fabric Density Measurement Using Image Analysis Based on RTV-SFT. J. Text. Inst. 2022.
20. Dai, L.; Li, D.; Tu, Q.; Wang, J. Yarn Density Measurement for 3-D Braided Composite Preforms Based on Rotation Object Detection. IEEE Trans. Instrum. Meas. 2022, 71, 5016711.
21. Meng, S.; Pan, R.; Gao, W.; Zhou, J.; Wang, J.; He, W. A Multi-Task and Multi-Scale Convolutional Neural Network for Automatic Recognition of Woven Fabric Pattern. J. Intell. Manuf. 2021, 32, 1147–1161.
22. Stock, S.R. X-ray Microtomography of Materials. Int. Mater. Rev. 1999, 44, 141–164.
23. Huang, W.; Causse, P.; Brailovski, V.; Hu, H.; Trochu, F. Reconstruction of Mesostructural Material Twin Models of Engineering Textiles Based on Micro-CT Aided Geometric Modeling. Compos. Part Appl. Sci. Manuf. 2019, 124, 105481.
24. Shinohara, T.; Takayama, J.; Ohyama, S.; Kobayashi, A. Extraction of Yarn Positional Information from a Three-Dimensional CT Image of Textile Fabric Using Yarn Tracing with a Filament Model for Structure Analysis. Text. Res. J. 2010, 80, 623–630.
25. Sinchuk, Y.; Shishkina, O.; Gueguen, M.; Signor, L.; Nadot-Martin, C.; Trumel, H.; Van Paepegem, W. X-ray CT Based Multi-Layer Unit Cell Modeling of Carbon Fiber-Reinforced Textile Composites: Segmentation, Meshing and Elastic Property Homogenization. Compos. Struct. 2022, 298, 116003.
26. Zhong, Q.; Zhang, J.; Xu, Y.; Li, M.; Shen, B.; Tao, W.; Li, Q. Filamentous Target Segmentation of Weft Micro-CT Image Based on U-Net. Micron 2021, 146, 102923.
27. Ali, M.A.; Guan, Q.; Umer, R.; Cantwell, W.J.; Zhang, T. Efficient Processing of μCT Images Using Deep Learning Tools for Generating Digital Material Twins of Woven Fabrics. Compos. Sci. Technol. 2022, 217, 109091.
28. Pidou-Brion, V.; Le Guilloux, Y. Active Yarn Meshes for Segmentation on X-ray Computed Tomography of Textile Composite Materials at the Mesoscopic Scale. Compos. Struct. 2022, 281, 115084.
29. Guo, C.; Zhang, H.; Wang, Y.; Jia, Y.; Qi, L.; Zhu, Y.; Cui, H. Parametric Modeling of 2.5D Woven Composites Based on Computer Vision Feature Extraction. Compos. Struct. 2023, 321, 117234.
30. Song, Y.; Qu, Z.; Liao, H.; Ai, S. Material Twins Generation of Woven Polymer Composites Based on ResL-U-Net Convolutional Neural Networks. Compos. Struct. 2023, 307, 116672.
31. Ali, M.A.; Khan, T.; Irfan, M.S.; Umer, R. Semantic Segmentation of μCT Images of 3D Woven Fabric Using Deep Learning. In Proceedings of the 20th European Conference on Composite Materials, ECCM20, Lausanne, Switzerland, 26–30 June 2022; Volume 4, pp. 831–837.
32. Mendoza, A.; Trullo, R.; Wielhorski, Y. Descriptive Modeling of Textiles Using FE Simulations and Deep Learning. Compos. Sci. Technol. 2021, 213, 108897.
33. Blusseau, S.; Wielhorski, Y.; Haddad, Z.; Velasco-Forero, S. Instance Segmentation of 3D Woven Fabric from Tomography Images by Deep Learning and Morphological Pseudo-Labeling. Compos. Part B Eng. 2022, 247, 110333.
34. Zheng, K.; Chen, H.; Wu, C.; Zhang, X.; Ying, Z.; Wang, Z.; Wu, Z.; Pan, Z.; Qiu, B.J. An Improved Dataset Augmentation Approach for Deep Learning-Based XCT Images Segmentation in Layered Composite Fabric. Compos. Struct. 2023, 317, 117052.
35. Pannier, Y.; Coupé, P.; Garrigues, T.; Gueguen, M.; Carré, P. Automatic Segmentation and Fibre Orientation Estimation from Low Resolution X-ray Computed Tomography Images of 3D Woven Composites. Compos. Struct. 2023, 318, 117087.
36. Fang, G.; Chen, C.; Yuan, S.; Meng, S.; Liang, J. Micro-Tomography Based Geometry Modeling of Three-Dimensional Braided Composites. Appl. Compos. Mater. 2018, 25, 469–483.
37. Czabaj, M.W.; Riccio, M.L.; Whitacre, W.W. Numerical Reconstruction of Graphite/Epoxy Composite Microstructure Based on Sub-Micron Resolution X-ray Computed Tomography. Compos. Sci. Technol. 2014, 105, 174–182.
38. Whitacre, W.; Czabaj, M. Automated 3D Digital Reconstruction of Fiber Reinforced Polymer Composites. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Kissimmee, FL, USA, 5–9 January 2015; AIAA SciTech Forum. American Institute of Aeronautics and Astronautics: New York, NY, USA, 2015.
39. Wu, W.; Li, W. Parametric Modeling Based on the Real Geometry of Glass Fiber Unidirectional Non-Crimp Fabric. Text. Res. J. 2019, 89, 3949–3959.
40. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
41. Yan, Z.; Hui, W. Segmentation Algorithm of Downhole Perforation Image Based on Morphology and Region Growth. In Proceedings of the 2022 4th International Conference on Intelligent Control, Measurement and Signal Processing (ICMSP), Hangzhou, China, 8 July 2022; pp. 709–712.
42. Khan, A.R.; Cornea, A.; Leigland, L.A.; Kohama, S.G.; Jespersen, S.N.; Kroenke, C.D. 3D Structure Tensor Analysis of Light Microscopy Data for Validating Diffusion MRI. NeuroImage 2015, 111, 192–203.
43. Straumit, I.; Lomov, S.V.; Wevers, M. Quantification of the Internal Structure and Automatic Generation of Voxel Models of Textile Composites from X-ray Computed Tomography Data. Compos. Part Appl. Sci. Manuf. 2015, 69, 150–158.
44. Hamza, H.; Omar, B.; Abdelkaher, A.A.; Abdelmajid, E.M. Adaptive Region Growing Based on Detecting the Seed Point in the Central Trachea to the Pre-Segment Respiratory System. In Proceedings of the 2022 11th International Symposium on Signal, Image, Video and Communications (ISIVC), El Jadida, Morocco, 18 May 2022; pp. 1–5.
45. Hjouji, A.; El-Mekkaoui, J.; Qjidaa, H. New Set of Non-Separable 2D and 3D Invariant Moments for Image Representation and Recognition. Multimed. Tools Appl. 2021, 80, 12309–12333.
46. Huang, J.; Song, L.; Yu, M.; Zhang, C.; Li, S.; Li, Z.; Geng, J.; Zhang, C. Quantitative Spatial Analysis of Thermal Infrared Radiation Temperature Fields by the Standard Deviational Ellipse Method for the Uniaxial Loading of Sandstone. Infrared Phys. Technol. 2022, 123, 104150.
47. Cleveland, W.S. Robust Locally Weighted Regression and Smoothing Scatterplots. J. Am. Stat. Assoc. 1979, 74, 829–836.
48. Wielhorski, Y.; Mendoza, A.; Rubino, M.; Roux, S. Numerical Modeling of 3D Woven Composite Reinforcements: A Review. Compos. Part Appl. Sci. Manuf. 2022, 154, 106729.
Figure 1. Microstructure of woven fabrics: (a) top view; (b) side view. The path of the center points is indicated by the yellow dashed line.
Figure 2. Cross-sections of wool fabric: (a) the case where the warps are perpendicular to the slicing plane; (b) the case where the wefts are perpendicular to the slicing plane.
Figure 3. One of the samples used in the experiment.
Figure 4. A 3D volume image of the sample.
Figure 5. Orientation maps of the yarns. (a) Original grayscale image of a slice in the YZ plane. (b) Orientation map projected in the YZ plane. (c) Original grayscale image of a slice in the YX plane. (d) Orientation map projected in the YX plane. For display purposes, the orientations corresponding to the zero-valued pixels in the original image have been modified to −1.
Figure 6. The results of labeling the yarn’s cross-section and the fitted elliptical template. (a) Results of labeling a warp’s cross-section; (b) results of labeling a weft’s cross-section; (c) elliptical template of the warp’s cross-section; (d) elliptical template of the weft’s cross-section. The labelled areas are shown in blue and the corresponding elliptical templates in red.
Figure 7. Schematic diagram of the post-processing methods: (a) processing of the overlap between adjacent warps or wefts; (b) processing of the overlap in the area where the warp and weft threads intersect. The solid black line indicates the outline of the yarn’s cross-section and the red dotted line indicates the division line of the overlap.
Figure 8. Results of segmentation when ω = 5: (a) The original grayscale image of the slice, where the wefts are perpendicular to the slicing plane and the warp is parallel to the slicing plane; (b) results of segmentation of the wefts; (c) results of segmentation of the warp. Wrongly segmented areas are indicated by red circles.
Figure 9. The ratios of the area of the wefts and warp (in blue) and the computation time (in orange) versus the integration interval.
Figure 10. Location of the slices for evaluation in ROI No. 1 from the top view. The blue dashed lines indicate the four slices when the warps were taken as positive samples, and the red dashed lines indicate the four slices when the wefts were taken as positive samples.
Figure 11. Location of the five ROIs in the sample.
Figure 12. Confusion matrix: (a) when the warps were taken as positive samples, (b) when the wefts were taken as positive samples.
Figure 13. Comparison of the ground truth and predictions for yarn segmentation. The intersections are marked in yellow. Regions included only in the ground truth are marked in dark blue, and regions included only in the predictions are marked in light blue. (a–e) show five different slices.
Figure 14. Top view of ROI No. 1: (a) preprocessed grayscale image; (b) results of yarn segmentation represented by different colors.
Table 1. Basic information of the sample.

Weight (g/m): 260
Width (cm): 145
Warp density (yarns/10 cm): 300
Weft density (yarns/10 cm): 260
Warp count: 54S/2
Weft count: 54S/1
Table 2. Operating parameters of the micro-CT scanner.

Source voltage (kV): 50
Source current (μA): 200
Rotation step (deg): 0.2
Frame averaging: 4
Image pixel size (μm × μm): 15 × 15
Table 3. Experimental results of the proposed method of yarn segmentation.

       IOU (%)  PA (%)  Precision (%)  Recall (%)  F1 Score (%)
Warp   75.46    86.13   95.25          78.15       85.86
Weft   70.34    88.27   74.28          92.34       82.33
Table 4. Microstructural features of the warps in the five ROIs.

           Diameter (mm)  Spacing (mm)  Local Density (yarns/10 cm)  Global Density (yarns/10 cm)
ROI No. 1  0.38           0.33          303                          300
ROI No. 2  0.40           0.33          303
ROI No. 3  0.38           0.33          303
ROI No. 4  0.37           0.33          303
ROI No. 5  0.38           0.33          303
Table 5. Microstructural features of the wefts in the five ROIs.

           Diameter (mm)  Spacing (mm)  Local Density (yarns/10 cm)  Global Density (yarns/10 cm)
ROI No. 1  0.30           0.41          247                          260
ROI No. 2  0.31           0.39          256
ROI No. 3  0.29           0.39          256
ROI No. 4  0.35           0.39          256
ROI No. 5  0.34           0.38          267