
Perceptual-based textures for scene labeling: a bottom-up and a top-down approach

Gaëtan Martens (UGent) , Chris Poppe (UGent) , Peter Lambert (UGent) and Rik Van de Walle (UGent)
Abstract
Due to the semantic gap, the automatic interpretation of digital images is a very challenging task. Both segmentation and classification are intricate because of the high variation in the data; the choice of appropriate features is therefore of utmost importance. This paper presents biologically inspired texture features for material classification and for interpreting outdoor scenery images. Experiments show that the presented texture features obtain the best classification results for material recognition compared to other well-known texture features, with an average classification rate of 93.0%. For scene analysis, both a bottom-up and a top-down strategy are employed to bridge the semantic gap. First, images are segmented into regions based on their perceptual texture; next, a semantic label is computed for each of these regions. Since this initial interpretation is still error-prone, domain knowledge is then incorporated to achieve a more accurate description of the depicted scene. By applying both strategies, 91.9% of the pixels in outdoor scenery images obtained a correct label.
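As an illustration of the bottom-up stage described in the abstract, the following Python sketch segments an image by clustering per-pixel texture features. It uses Gabor-filter energies as a generic stand-in for the paper's perceptual texture features and k-means as a generic region-forming step; all function names, parameters, and library choices are illustrative assumptions and do not reproduce the authors' implementation.

# Minimal, hypothetical sketch of a texture-driven bottom-up segmentation.
# Gabor energies stand in for the paper's perceptual features (assumption).
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gabor_kernel
from sklearn.cluster import KMeans

def texture_features(gray, frequencies=(0.1, 0.2, 0.4), n_orientations=4):
    """Per-pixel Gabor energy features (one channel per frequency/orientation)."""
    feats = []
    for f in frequencies:
        for i in range(n_orientations):
            theta = i * np.pi / n_orientations
            kernel = np.real(gabor_kernel(f, theta=theta))
            response = ndi.convolve(gray, kernel, mode="reflect")
            # local energy: smoothed magnitude of the filter response
            feats.append(ndi.gaussian_filter(np.abs(response), sigma=3))
    return np.stack(feats, axis=-1)  # shape (H, W, n_features)

def segment_by_texture(gray, n_regions=5):
    """Bottom-up segmentation: cluster pixels on their texture features."""
    feats = texture_features(gray)
    h, w, d = feats.shape
    labels = KMeans(n_clusters=n_regions, n_init=10, random_state=0)
    labels = labels.fit_predict(feats.reshape(-1, d))
    return labels.reshape(h, w)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic image: a smooth region next to a high-frequency textured region
    img = np.zeros((128, 128))
    img[:, 64:] = rng.normal(0.0, 1.0, (128, 64))
    print(segment_by_texture(img, n_regions=2).shape)  # (128, 128)

The paper's top-down step, in which domain knowledge corrects implausible region labels (for instance, re-labeling regions that violate expected spatial relations in outdoor scenes), is not shown in this sketch.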
Keywords
bottom-up strategy, scene labeling, top-down strategy, digital images, image segmentation, image classification, biologically inspired texture features, material classification, outdoor scenery images, material recognition, perceptual-based textures, image texture

Downloads

  • Martens 2010 FIT PerceptualBasedTextures.pdf (full text | open access | PDF | 2.33 MB)

Citation

Please use this URL to cite or link to this publication:

MLA
Martens, Gaëtan et al. “Perceptual-based Textures for Scene Labeling: a Bottom-up and a Top-down Approach.” Proceedings 2010 5th International Conference on Future Information Technology (FutureTech). Piscataway, NJ, USA: IEEE, 2010. Print.
APA
Martens, Gaëtan, Poppe, C., Lambert, P., & Van de Walle, R. (2010). Perceptual-based textures for scene labeling: a bottom-up and a top-down approach. Proceedings 2010 5th International Conference on Future Information Technology (FutureTech). Presented at the 2010 5th International Conference on Future Information Technology (FutureTech), Piscataway, NJ, USA: IEEE.
Chicago author-date
Martens, Gaëtan, Chris Poppe, Peter Lambert, and Rik Van de Walle. 2010. “Perceptual-based Textures for Scene Labeling: a Bottom-up and a Top-down Approach.” In Proceedings 2010 5th International Conference on Future Information Technology (FutureTech). Piscataway, NJ, USA: IEEE.
Chicago author-date (all authors)
Martens, Gaëtan, Chris Poppe, Peter Lambert, and Rik Van de Walle. 2010. “Perceptual-based Textures for Scene Labeling: a Bottom-up and a Top-down Approach.” In Proceedings 2010 5th International Conference on Future Information Technology (FutureTech). Piscataway, NJ, USA: IEEE.
Vancouver
1. Martens G, Poppe C, Lambert P, Van de Walle R. Perceptual-based textures for scene labeling: a bottom-up and a top-down approach. Proceedings 2010 5th International Conference on Future Information Technology (FutureTech). Piscataway, NJ, USA: IEEE; 2010.
IEEE
[1] G. Martens, C. Poppe, P. Lambert, and R. Van de Walle, “Perceptual-based textures for scene labeling: a bottom-up and a top-down approach,” in Proceedings 2010 5th International Conference on Future Information Technology (FutureTech), Busan, South Korea, 2010.
BibTeX
@inproceedings{978906,
  abstract     = {Due to the semantic gap, the automatic interpretation of digital images is a very challenging task. Both the segmentation and classification are intricate because of the high variation of the data. Therefore, the application of appropriate features is of utter importance. This paper presents biologically inspired texture features for material classification and interpreting outdoor scenery images. Experiments show that the presented texture features obtain the best classification results for material recognition compared to other well-known texture features, with an average classification rate of 93.0%. For scene analysis, both a bottom-up and top-down strategy are employed to bridge the semantic gap. At first, images are segmented into regions based on the perceptual texture and next, a semantic label is calculated for these regions. Since this emerging interpretation is still error prone, domain knowledge is ingested to achieve a more accurate description of the depicted scene. By applying both strategies, 91.9% of the pixels from outdoor scenery images obtained a correct label.},
  author       = {Martens, Gaëtan and Poppe, Chris and Lambert, Peter and Van de Walle, Rik},
  booktitle    = {Proceedings 2010 5th International Conference on Future Information Technology (FutureTech)},
  isbn         = {9781424469499},
  keywords     = {bottom-up strategy,scene labeling,top-down strategy,digital images,image segmentation,image classification,biologically inspired texture features,material classification,outdoor scenery images,material recognition,perceptual-based textures,image texture},
  language     = {eng},
  location     = {Busan, South Korea},
  pages        = {6},
  publisher    = {IEEE},
  title        = {Perceptual-based textures for scene labeling: a bottom-up and a top-down approach},
  url          = {http://dx.doi.org/10.1109/FUTURETECH.2010.5482650},
  year         = {2010},
}
