Analyzing carpet wear

A novel 3D scanner based on structured light can extract depth information of floor coverings.
06 July 2010
Sergio A. Orjuela Vargas, Filip Rooms, Wilfried Philips, Didier Van Daele and Robain De Keiser

Carpet manufacturers certify their products with labels corresponding to their durability. For this, the degree of change in carpet appearance one year after installation is simulated by subjecting carpet samples to accelerated, intensive mechanical wear in a test device. For years, textile floor coverings in Europe have been compared by human experts, who assign the labels (from 5, no noticeable wear, to 1, heavy wear) by subjectively evaluating the wear level based on color appearance and 3D structure. Industry is very interested in converting to automated, objective standards. With this aim, extensive research has been conducted, specifically focused on image-processing techniques that extract texture parameters from digital color images using only intensity information.1,2

Recently, in search of adequately capturing the 3D structure of carpets evaluated by experts, some researchers have been exploring the use of depth information instead of photographs.3,4 They used a 3D object scanner (Metris LC50) to capture 3D carpet-surface information. Depth data was captured on unstructured grids, with the number of acquired points highly dependent on the colors of the object, and then structured into 2D images at additional computational cost. A major drawback of this process is that the interpolation methods involved can distort the carpet-surface shape.
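To illustrate that interpolation step, the following is a minimal Python sketch, assuming the scanner output is a text file of scattered (x, y, z) points; the file name, grid resolution, and choice of SciPy's griddata are our own illustrative assumptions, not the actual Metris pipeline.

    # Minimal sketch: resample unstructured (x, y, z) scan points onto a
    # regular 2D grid. The interpolation is the step that can distort fine
    # surface detail. File name and grid size are illustrative assumptions.
    import numpy as np
    from scipy.interpolate import griddata

    points = np.loadtxt('scan_points.txt')  # columns: x, y, z (unstructured)
    xi = np.linspace(points[:, 0].min(), points[:, 0].max(), 512)
    yi = np.linspace(points[:, 1].min(), points[:, 1].max(), 512)
    grid_x, grid_y = np.meshgrid(xi, yi)

    # Linear interpolation fills the grid from scattered samples; sparsely
    # sampled regions (e.g., dark yarns that reflect little light) are
    # smoothed over, which is where shape distortion can creep in.
    depth_image = griddata(points[:, :2], points[:, 2], (grid_x, grid_y),
                           method='linear')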

We have developed an alternative 3D scanner based on structured light (see Figure 1),5 which is less dependent on illumination conditions and object colors. In addition, the data points are captured on a structured grid and, as a result, the scanned images are more appropriate for image analysis. Compared to previous scanners, the new system is more than five times cheaper and seven times faster. It is specifically designed for scanning carpets rather than general 3D objects.


Figure 1. Measurement of depth in the surface of carpets based on triangulation.

The sample to be scanned is held with elastic bands on a stainless-steel drum, which can be rotated at different speeds. A line-laser generator fixed above the drum projects a fine, crisp, bright laser line of uniform intensity onto the sample's surface. A camera, whose lens is located at a fixed distance from the drum, detects the reflection of the line so that the depth information can be extracted and saved into an array. Each column of the frame is searched for the position of the highest-intensity reflection; that position represents the depth, and the array is constructed using one depth value per column. As the drum rotates, a matrix representation of the depth information is built up by stacking these arrays, and the result is converted to grayscale values from 1 to 255 to produce a gray image.
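The per-frame extraction step can be sketched in a few lines of Python, assuming each camera frame arrives as an 8-bit grayscale NumPy array; the function names and scaling details are our own illustration, not the authors' implementation.

    import numpy as np

    def extract_depth_line(frame):
        # For each column, take the row of the brightest pixel: the point
        # where the laser line intersects the carpet surface in that column.
        return np.argmax(frame, axis=0)

    def build_depth_image(frames):
        # Stack one depth array per camera frame while the drum rotates,
        # then rescale the row indices to grayscale values 1-255.
        depth = np.vstack([extract_depth_line(f) for f in frames]).astype(float)
        depth -= depth.min()
        if depth.max() > 0:
            depth /= depth.max()
        return (1 + depth * 254).astype(np.uint8)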

Figure 2 compares depth images from the Metris scanner and our proposed scanner. We also compared the performance of both scanners by the wear labels they produced. We scanned samples of three types of loop-pile, light-colored carpets using both scanners. Then, we extracted texture features from the depth data by comparing the distribution of local binary patterns (LBPs) between images of pristine and worn carpets. To correctly describe the wear labels using the texture features, the latter must change monotonically with the wear labels. (We assumed that the features must be at least linearly ranked with the wear labels to describe a monotonic change.) Therefore, we checked the relationship (and its linearity) by computing the rank and linear correlations between wear labels and texture features. Figure 3 shows that both correlation measures differ significantly between the scanners, with our proposed scanner performing better.
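As an illustration of the feature-extraction step, here is a minimal Python sketch using the uniform LBP from scikit-image and a chi-square histogram distance; the article does not state which LBP variant or distance measure was actually used, so both are assumptions.

    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(depth_image, P=8, R=1):
        # Histogram of uniform LBP codes computed on the depth image; this
        # distribution characterizes the local surface texture.
        codes = local_binary_pattern(depth_image, P, R, method='uniform')
        n_bins = P + 2  # number of distinct uniform codes for a given P
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins),
                               density=True)
        return hist

    def texture_feature(pristine, worn):
        # Chi-square distance between the LBP distributions of a pristine
        # and a worn sample (one possible choice of distance measure).
        h1, h2 = lbp_histogram(pristine), lbp_histogram(worn)
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + 1e-12))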


Figure 2. Comparison of depth images from both scanners.

Figure 3. Comparison of linear and rank correlations between both scanners.

Subsequently, we evaluated texture features from the surfaces of eight reference samples for the European carpet-appearance standard EN1471, using our novel scanner to acquire the surface images. Figure 4 shows depth images of worn samples with labels from 1 to 5 in steps of 0.5. We obtained high linear and rank correlations for the different types of textile floor coverings.
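The correlation check itself is straightforward; here is a sketch using SciPy, where the label and feature arrays would be the measured inputs (we supply no values, as the study's data are not reproduced here).

    from scipy.stats import pearsonr, spearmanr

    def correlate_with_labels(labels, features):
        # Linear (Pearson) correlation checks linearity of the relation;
        # rank (Spearman) correlation checks that the texture feature
        # changes monotonically with the wear labels.
        linear_r, _ = pearsonr(labels, features)
        rank_r, _ = spearmanr(labels, features)
        return linear_r, rank_r

    # Example call with placeholder inputs (not the study's measurements):
    # correlate_with_labels([1.0, 1.5, 2.0, 2.5], feature_values)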


Figure 4. Comparison of depth images related to wear labels.

Our results show that texture features from depth information obtained with our proposed scanner can be used to distinguish among wear labels. However, further study of combinations of these texture features with features obtained from intensity images is required before we can develop an automated and universal carpet-labeling system. To develop the latter, we first need to improve the algorithm that reconstructs the surface from the scanner data. Next, we need to optimally combine depth and intensity information to discriminate among the different wear labels. We also need to generate linear models from the extracted features, whether from video or depth images. Further, we would like to extend the application area of our scanner to digitizing surfaces of general textiles and other materials, and to analyzing the roughness of bendable materials.

Sergio Orjuela Vargas is grateful for support from LASPAU Academic and Professional Programs for the Americas in agreement with the COLCIENCIAS Science and Technology Program and the Antonio Nariño University (Colombia).


Sergio A. Orjuela Vargas, Filip Rooms, Wilfried Philips
Department of Telecommunications and Information Processing (TELIN-IPI-IBBT)
Ghent University
Ghent, Belgium

Sergio Orjuela Vargas graduated as an electronic engineer from the National University of Colombia in 2001. He graduated in control automation from the University of Ibagué (Colombia) in 2006. He is currently a PhD student.

Didier Van Daele
Department of Textiles
Ghent University
Ghent, Belgium
Robain De Keiser
Department of Electrical Energy, Systems, and Automation
Ghent University
Ghent, Belgium
