Optimizing Image Classification Performance with MnasNet Model on Blurred Images

Rani Puspita, Zahra Nabila Izdihar


The fashion industry has grown substantially over the last 30 years, and clothing now comes in many types and variants. Blurred images, however, make it difficult to tell whether a garment is a shirt, a t-shirt, or something else. We therefore propose an image classification approach to help the fashion industry separate fashion items into their categories and types. The approach uses MnasNet, a deep learning architecture. The dataset consists of 70,000 images, split into 60,000 training images and 10,000 testing images. The MnasNet model achieves an accuracy of 89% with a loss of 0.4426, indicating that MnasNet is a suitable method for this image classification task and that the problem described in the background is successfully addressed.
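As a hedged illustration of the reported evaluation metric (not the authors' code), test-set accuracy is simply the fraction of correctly classified images; the snippet below uses small hypothetical label lists in place of the 10,000-image test split.

```python
# Hypothetical true and predicted class labels for a 10-class fashion task;
# the actual evaluation would run over the full 10,000-image test set.
true_labels = [0, 1, 2, 2, 3, 4, 4, 5, 6, 7]
predicted   = [0, 1, 2, 1, 3, 4, 4, 5, 6, 9]

# Accuracy = number of correct predictions / total number of predictions.
correct = sum(t == p for t, p in zip(true_labels, predicted))
accuracy = correct / len(true_labels)
print(f"accuracy = {accuracy:.0%}")  # 8 of 10 correct -> accuracy = 80%
```

The paper's reported 89% corresponds to this ratio computed over the held-out Fashion-style test images, alongside the cross-entropy loss of 0.4426.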


Deep Learning Approach; Evaluation; Fashion Industry; Image Classification; MnasNet



DOI: http://dx.doi.org/10.24014/ijaidm.v7i2.29571

