Optimizing Malware Detection Using Back Propagation Neural Network and Hyperparameter Tuning

Annisa Arrumaisha Siregar, Sopian Soim, Mohammad Fadhli

Abstract


The rapid growth of the internet has brought a corresponding rise in cyber threats, particularly malware, which poses significant risks to computer systems and networks. This research addresses the challenge of building effective malware detection systems by optimizing a Back Propagation Neural Network (BPNN) through hyperparameter tuning. The focus is on tuning key hyperparameters, including the dropout rate, the number of neurons per hidden layer, and the number of hidden layers, to improve detection accuracy. A BPNN with dropout regularization is trained on a large dataset of PE header features, and hyperparameter optimization is carried out with GridSearchCV, with additional experiments varying the learning rate and the number of epochs. The best configuration achieves 98% accuracy, precision, recall, and F1-score. The proposed approach offers an efficient and reliable means of strengthening cybersecurity systems against malware threats.
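As a rough illustration of the pipeline the abstract describes, the sketch below shows how a back-propagation network with dropout could be tuned over the number of hidden layers, neurons per layer, dropout rate, and learning rate. It is a minimal sketch, not the authors' implementation: the feature matrix, label vector, grid values, epoch count, and batch size are placeholder assumptions, and a manual grid loop stands in for the GridSearchCV procedure reported in the paper so the example stays self-contained.

    import itertools
    import numpy as np
    from sklearn.model_selection import train_test_split
    from tensorflow import keras
    from tensorflow.keras import layers

    def build_bpnn(n_features, n_hidden_layers, n_neurons, dropout_rate, learning_rate):
        # Feed-forward (back-propagation-trained) network; dropout follows each hidden layer.
        model = keras.Sequential()
        model.add(keras.Input(shape=(n_features,)))
        for _ in range(n_hidden_layers):
            model.add(layers.Dense(n_neurons, activation="relu"))
            model.add(layers.Dropout(dropout_rate))
        model.add(layers.Dense(1, activation="sigmoid"))  # malware (1) vs. benign (0)
        model.compile(optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        return model

    # Placeholder data standing in for PE-header features extracted from executables.
    X = np.random.rand(1000, 55).astype("float32")
    y = np.random.randint(0, 2, size=1000)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

    # Assumed grid values; the actual search space is not listed in the abstract.
    grid = {
        "n_hidden_layers": [1, 2, 3],
        "n_neurons": [32, 64, 128],
        "dropout_rate": [0.2, 0.3, 0.5],
        "learning_rate": [1e-2, 1e-3],
    }

    best_params, best_acc = None, 0.0
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        model = build_bpnn(X.shape[1], **params)
        model.fit(X_tr, y_tr, epochs=20, batch_size=64, verbose=0)
        _, acc = model.evaluate(X_val, y_val, verbose=0)
        if acc > best_acc:
            best_params, best_acc = params, acc

    print("best hyperparameters:", best_params, "validation accuracy:", round(best_acc, 4))

In practice the evaluation would also report precision, recall, and F1-score on a held-out test set, as the paper does.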

Keywords


Malware Detection; Deep Learning; Back Propagation Neural Network; Hyperparameter Tuning; PE Header Information

DOI: http://dx.doi.org/10.24014/ijaidm.v6i2.24731


