1. Introduction

Texture features play an important role in several image-processing applications, ranging from computer vision and medical image processing to remote sensing and content-based image retrieval. Almost all texture-processing applications require rotation invariance in the texture features, which we achieve here in a simple and cost-effective manner. Reference 1 divides the wide range of texture features proposed to date into two broad categories and compares them: features that use a large bank of filters or wavelets, and features that use immediate pixel-neighborhood properties. It shows that the latter outperform the former; hence, we set out to improve a feature set from the latter category. In Ref. 2, texture features are extracted using a 1-D discrete Fourier transform (DFT) of the circular neighborhood around a pixel: a 1-D DFT is computed for the sequence around each image pixel, and the magnitudes of the DFT coefficients are used to extract texture features. More recent work3 extracts similar texture features from the square neighborhood and calls the result a local Fourier histogram (LFH)-based feature set. The LFH-based feature set was shown to perform better than texture features extracted from a large bank of Gabor filters,4 which are computationally more expensive than the LFH-based features. In this work, we augment the LFH-based feature set by also using the phases of the DFT coefficients as texture features. The improvement suggested here applies equally to the texture features extracted from the circular neighborhood.2 Since the phases are sensitive to image rotation, we also present a method to make them rotation invariant. This incurs no additional computational cost, but does improve performance.
The following sections explain how the LFH-based features are extracted, how the local image-gradient angle is determined from the features themselves, and how that angle is used to compensate the features against rotation. Results are presented before concluding the paper.

2. Method of Extracting DFT-Based Texture Features

The texture features proposed in Ref. 3 are extracted in the spatial domain by taking a 1-D DFT of the sequence f(0) through f(7), hereafter called f, around a central pixel, as shown in Fig. 1. We use the local image gradient at the central pixel to compensate the extracted features for the effects of image rotation. When moving a window across a texture image, the 1-D DFT of f is computed as

F(u) = Σ_{k=0}^{7} f(k) e^{−j2πuk/8},  u = 0, 1, …, 7,  (1)

where F(u) represents the u'th Fourier coefficient and f(k) represents the k'th value in f. From the computed DFT, histograms of the absolute values of the first five DFT coefficients, i.e., |F(0)| through |F(4)|, were used for texture description in Ref. 3. The phases of the DFT coefficients F(1) through F(3) were also proposed as features in Ref. 3, but only for applications that do not deal with image rotation. The phase features were otherwise excluded because, unlike the magnitudes, the phases of the DFT coefficients are sensitive to image rotation. Reference 2 likewise proposes only the magnitudes of the DFT coefficients as texture features. We propose using the histograms of the phases φ2 and φ3 of F(2) and F(3), after appropriately compensating them with the local image gradient.

2.1. Local Image Gradient

Traditionally, as a good compromise between cost and accuracy, edge-detection operators such as the Sobel operator (SO) and the Prewitt operator (PO) are often used to estimate the local image gradient at a given pixel. Below are the general edge-detection operators, in which the value of c varies from 1, as in the PO, to 2, as in the SO:

S_x = [−1 0 1; −c 0 c; −1 0 1],  S_y = [1 c 1; 0 0 0; −1 −c −1],  (2)

where S_x and S_y are convolved with a texture image to obtain two gradient images, G_x and G_y, respectively. The local image-gradient angle is calculated as

θ = tan⁻¹(G_y / G_x).  (3)

Convolving the edge-detection operators of Eq. (2) with the neighborhood of Fig. 1 gives G_x and G_y, which are substituted into Eq. (3), giving

θ = tan⁻¹{[c(f(2) − f(6)) + f(1) + f(3) − f(5) − f(7)] / [c(f(0) − f(4)) + f(1) − f(3) − f(5) + f(7)]}.  (4)

However, the local image-gradient angle can also be obtained from the phase φ1 of the first coefficient of the DFT of f. Substituting u = 1 in Eq. (1) gives

φ1 = tan⁻¹{[√2(f(2) − f(6)) + f(1) + f(3) − f(5) − f(7)] / [√2(f(0) − f(4)) + f(1) − f(3) − f(5) + f(7)]}.  (5)

Equations (4) and (5) are exactly the same if c = √2, and they are very similar otherwise, because the value √2 falls between the usual values of 1 and 2. For instance, the histograms of the local image-gradient angle obtained from φ1 and from the SO (c = 2) for image D87 of the Brodatz album (BA) have a cross-correlation coefficient (XCC) of 0.97. In addition, if we consider the φ1-based gradient image as a noisy version of the SO-driven image, the measured signal-to-noise ratio (SNR) verifies that the former is a very close approximation of the latter. All other images of the album were tested, and similar values of the correlation coefficient and SNR were found between the two estimates of the image gradient. Hence, instead of computing the local image-gradient angle with any 2-D edge-detection operator, we use φ1 to compensate the phases of the two other DFT coefficients, i.e., F(2) and F(3), against the effects of image rotation. It can now be said that θ ≈ φ1.

2.2. Effects of Image Rotation on Fourier Coefficients

Consider an image rotated by an arbitrary angle, with the center of rotation exactly in the middle of the image. The angle of rotation at any other point on the image differs from what it is at the center of rotation. Let the angle of rotation be s × 45 deg at point P (see Fig. 1), corresponding to a shift in the string f by s places. This shift in f causes nothing but changes in the phases of the resulting DFT coefficients. Equation (6) states the shift property of the DFT:

F_s(u) = F(u) e^{−j2πus/8},  (6)

where F(u) represents the u'th coefficient of the DFT of f, and F_s(u) represents the u'th coefficient of the DFT of the same string shifted by s places.
Equation (6) shows that any displacement in the time or space domain causes a phase shift given by

Δφ_u = −2πus/8 = −usπ/4  (7)

in the Fourier domain; hence, φ_u becomes φ_u − usπ/4, where s represents the shift in f. The phase shift in φ1 is given by

Δφ1 = −sπ/4.  (8)

Intuitively, the change in the local image-gradient angle φ1 is equal to the angle of rotation at point P that causes it. Comparing Eqs. (7) and (8) gives the phase shift in φ_u as

Δφ_u = u Δφ1.  (9)

Therefore, the phases φ2 and φ3 are adjusted against the rotation by subtracting the local image-gradient angle as in Eq. (10). For u = 2, 3,

φ'_u = φ_u − u φ1,  (10)

where φ'_u represents the rotation-compensated phase and replaces φ_u in the feature set.

3. Experimental Results

3.1. Rotation Invariance of the Phase Features

All the images from the BA were rotated to 30, 45, 60, and 90 deg, and histograms of φ2 and φ3 were computed at each orientation. Table 1 shows the XCC, as a similarity measure, between the histograms corresponding to 0 deg and those corresponding to 30, 45, 60, and 90 deg, averaged over all the images from the BA. As an example, Figs. 2 and 3 show the histograms of φ2 and φ3, respectively, for image D87 from the BA. All the histograms appear the same and do not exhibit any left or right shift, indicating that the two phases are highly rotation invariant. We also experimented with the features extracted from the circular neighborhood suggested in Ref. 2 and found that they perform worse than those extracted from the square neighborhood.

Table 1: XCC between the histograms of φ2 and φ3, respectively, corresponding to images oriented at 0 deg and to those at 30, 45, 60, and 90 deg, averaged over all the images from the Brodatz album.
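The rotation invariance reported above can be checked numerically. The sketch below computes the magnitude features |F(0)| through |F(4)| of Eq. (1) and the compensated phases of Eq. (10), then verifies that circularly shifting f by s places (the effect of a s × 45-deg rotation at the neighborhood, per Eq. (6)) leaves both unchanged. This is a minimal illustration under the assumption that NumPy's FFT sign convention matches Eq. (1); function names and sample values are ours, not the paper's.

```python
import numpy as np

def lfh_features(f):
    """Features from the 8-pixel neighborhood sequence f = [f(0), ..., f(7)]:
    magnitudes |F(0)|..|F(4)| (Eq. 1) and the rotation-compensated phases
    phi'_2 and phi'_3 (Eq. 10)."""
    F = np.fft.fft(np.asarray(f, dtype=float))  # F(u) = sum_k f(k) e^{-j2pi uk/8}
    mags = np.abs(F[:5])                        # |F(0)| ... |F(4)|
    phi = np.angle(F)                           # phi[1] estimates the gradient angle
    # Eq. (10): subtract u * phi1 and wrap the result back to [-pi, pi)
    comp = [(phi[u] - u * phi[1] + np.pi) % (2 * np.pi) - np.pi for u in (2, 3)]
    return mags, comp

# Shifting f by s = 1 place (a 45-deg rotation at this neighborhood) leaves
# the magnitudes and the compensated phases unchanged.
f = [3.0, 7.0, 1.0, 4.0, 9.0, 2.0, 6.0, 5.0]
mags0, comp0 = lfh_features(f)
mags1, comp1 = lfh_features(np.roll(f, 1))
```

Because the compensation of Eq. (10) cancels the u-proportional phase shift of Eq. (7) exactly, the invariance holds to machine precision for any integer shift s.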
3.2. Texture Recognition

Each of the 107 texture images from the BA was oriented at 0, 30, 45, 60, and 90 deg. Then, 16 subimages were cropped from each of the images, giving a total of 8560 images.4 Recognition was performed on this set using the LFH-based feature set without phase features, with phase features, and with texture features based on 30 Gabor filters.4,5 Reference 6 is a more recent work that proposes exactly the same filters but with a new distance metric that cannot be used for rotation-invariant recognition or retrieval. Table 2 presents the overall and orientation-wise texture-recognition results, showing that the LFH-based features with phases perform best in terms of both accuracy and rotation variance (RV).4

Table 2: Recognition rates relative to orientation with 8560 Brodatz images.
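The XCC used throughout as the histogram-similarity measure is the ordinary correlation coefficient between two histograms. A minimal sketch follows; the bin count and the sample data are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def xcc(h1, h2):
    """Cross-correlation coefficient (Pearson r) between two histograms."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    a, b = h1 - h1.mean(), h2 - h2.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Phase histograms over [-pi, pi); 16 bins is an assumed choice.
rng = np.random.default_rng(0)
phases = rng.normal(0.0, 1.0, 5000)          # stand-in for per-pixel phi'_u values
h_a, _ = np.histogram(phases, bins=16, range=(-np.pi, np.pi))
h_b, _ = np.histogram(phases + 0.2, bins=16, range=(-np.pi, np.pi))
similarity = xcc(h_a, h_b)                   # 1.0 means identical histogram shapes
```

An XCC of 1.0 indicates identical histogram shapes; values near 1, such as the 0.97 reported in Sec. 2.1, indicate close agreement.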
4. Effect of Noise

Reference 4 found that the LFH-based texture features exhibit less noise immunity than the features based on Gabor filters. However, our latest results show that the LFH-based features perform even better when extracted from images quantized to only 32 gray levels. Considering this, we expect the proposed features to be more noise resistant than they were without image quantization as in Ref. 4.

5. Conclusion

The earlier feature set based on the LFH does not use phases of the DFT coefficients as texture features, because the phases are sensitive to image orientation. To introduce rotation invariance into the features, we showed that the process of extracting phase features can be guided by the local image gradient. This was achieved by simply subtracting the local image-gradient angle, itself obtained from the 1-D DFT, so that the features become self-compensating. This computationally simple and cost-effective method proved useful in making the LFH-based texture features robust against image rotation. The new feature set including the phase features exhibits more rotation invariance and yields higher recognition rates than the one without phase features.

References

1. M. Varma and A. Zisserman,
“Texture classification: are filter banks necessary?,” 691–698 (2003).
2. H. Arof and F. Deravi, “Circular neighborhood and 1-D DFT features for texture classification and segmentation,” IEE Proc. Vision Image Signal Process. 145, 167–172 (1998). https://doi.org/10.1049/ip-vis:19981915
3. F. Zhou, J.-F. Feng, and Q.-Y. Shi, “Texture feature based on local Fourier transform,” 610–613 (2001).
4. A. A. Ursani, K. Kpalma, and J. Ronsin, “Texture features based on Fourier transform and Gabor filters: an empirical comparison,” 67–72 (2007).
5. B. S. Manjunath and W. Y. Ma, “Texture features for browsing and retrieval of image data,” IEEE Trans. Pattern Anal. Mach. Intell. 18(8), 837–842 (1996). https://doi.org/10.1109/34.531803
6. P. Wu, B. S. Manjunath, S. Newsam, and H. D. Shin, “A texture descriptor for browsing and similarity retrieval,” Signal Process. Image Commun. 16, 33–43 (2000). https://doi.org/10.1016/S0923-5965(00)00016-3