Big Texture Dataset Synthesized Based on Gradient and Convolution Kernels Using Pre-Trained Deep Neural Networks

  • Farhan A. Alenizi
  • Faten Khalid Karim
  • Alaa R. Al-Shamasneh
  • Mohammad Hossein Shakoor

Research output: Contribution to journal › Article › peer-review

Abstract

Deep neural networks provide accurate results for most applications, but they require a big dataset to train properly, and providing such a dataset is a significant challenge in many applications. Image augmentation refers to techniques that increase the amount of image data; common operations include changes in illumination, rotation, contrast, size, viewing angle, and others. Recently, Generative Adversarial Networks (GANs) have been employed for image generation. However, like image augmentation methods, GAN approaches can only generate images similar to the original images, so they also cannot generate new classes of data. Texture images present more challenges than general images, and generating textures is more complex than creating other types of images. This study proposes a gradient-based deep neural network method that generates new classes of textures. Using different kernels from pre-trained deep networks, new texture classes can be generated rapidly. After new textures are generated for each class, their number is increased through image augmentation, and several techniques are proposed to automatically remove the incomplete and overly similar textures created during this process. The proposed method is around 4 to 10 times faster than some well-known generative networks, and the quality of the generated textures surpasses that of these networks; in certain image quality metrics it also surpasses some GANs and parametric models. The method can therefore provide a big texture dataset for training deep networks. A new big texture dataset, called BigTex, was created artificially with the proposed method. It is approximately 2 GB in size, comprises 30,000 textures of 150 × 150 pixels organized into 600 classes, and has been uploaded to Kaggle and Google Drive. Compared to other texture datasets, the proposed dataset is the largest and can serve as a comprehensive texture dataset for training more powerful deep neural networks and mitigating overfitting.
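To make the core idea concrete, the sketch below illustrates one plausible reading of gradient-based texture generation from pre-trained convolution kernels: start from random noise and update the image by gradient ascent so that a chosen kernel in a pre-trained network responds strongly to it, with different (layer, kernel) pairs yielding different texture classes. This is a minimal illustration, not the authors' implementation; the choice of VGG16, the layer and kernel indices, the optimizer, the step count, and the learning rate are all assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's code) of kernel-driven texture
# synthesis by gradient ascent on a pre-trained network's activations.
import torch
import torchvision.models as models

def synthesize_texture(layer_idx=10, kernel_idx=3, steps=200, lr=0.05, size=150):
    # Pre-trained VGG16 feature extractor (illustrative choice of backbone).
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features.eval()

    # Start from random noise; 150 x 150 matches the dataset's texture size.
    img = torch.rand(1, 3, size, size, requires_grad=True)
    optimizer = torch.optim.Adam([img], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        x = img
        # Forward pass up to the chosen convolution layer.
        for i, layer in enumerate(model):
            x = layer(x)
            if i == layer_idx:
                break
        # Maximize the mean activation of one kernel's feature map;
        # a different (layer_idx, kernel_idx) pair gives a different texture class.
        loss = -x[0, kernel_idx].mean()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            img.clamp_(0, 1)  # keep pixel values in a valid range

    return img.detach().squeeze(0)

texture = synthesize_texture()  # 3 x 150 x 150 tensor, one synthetic texture
```

In this reading, the dataset would be built by sweeping over many kernels of the pre-trained network to define classes, augmenting each generated texture, and then filtering out incomplete or near-duplicate results, as the abstract describes.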

Original language: English
Pages (from-to): 1793-1829
Number of pages: 37
Journal: CMES - Computer Modeling in Engineering and Sciences
Volume: 144
Issue number: 2
DOIs
State: Published - 2025

Keywords

  • Big texture dataset
  • data generation
  • pre-trained deep neural network
