A Mountain Gazelle Optimization (MGO) for Enhancing the Deep Learning Performance in Various Operating Systems
DOI: https://doi.org/10.55145/ajest.2025.04.01.011

Abstract
This study introduces a novel optimization framework that assesses and enhances the performance of deep learning algorithms across autonomous car operating systems. The framework generates synthetic performance measures, including inference time, memory use, CPU/GPU utilization, and accuracy, to evaluate algorithm performance. Using the Mountain Gazelle Optimizer (MGO), the study identifies the deep learning algorithm and operating system setup that best balance accuracy and resource efficiency. The proposed methodology normalizes the performance indicators, defines a fitness function to guide the optimization, and then iterates through numerous configurations to identify the optimal one. Extensive trials and scenario comparisons validate the effectiveness of the approach, showing significant gains in computational efficiency and accuracy and revealing the best-performing system combinations for autonomous cars. Beyond advancing deep learning optimization, the study offers practical guidance for building robust autonomous car systems in varied operating contexts, providing actionable insights for developers and researchers in the field.
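To make the described pipeline concrete, the following is a minimal sketch (not the authors' implementation) of how such a framework could be wired together: synthetic performance measures are min-max normalized, a weighted fitness function trades accuracy against inference time and resource use, and a simplified population-based search in the spirit of MGO iterates toward the best (algorithm, operating system) pair. All algorithm names, operating system labels, metric ranges, and fitness weights below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate setups: (deep learning algorithm, operating system).
ALGORITHMS = ["CNN", "ResNet50", "YOLOv5", "LSTM"]
OPERATING_SYSTEMS = ["Ubuntu-ROS2", "QNX", "Linux-RT"]
CONFIGS = [(a, o) for a in ALGORITHMS for o in OPERATING_SYSTEMS]
N = len(CONFIGS)

# Synthetic performance measures per configuration (stand-ins for real data):
# columns = inference time (ms), memory (MB), CPU %, GPU %, accuracy.
metrics = rng.uniform(low=[10, 200, 20, 20, 0.80],
                      high=[120, 2000, 95, 95, 0.99],
                      size=(N, 5))

def normalize(m):
    """Min-max normalize each metric column to [0, 1]."""
    lo, hi = m.min(axis=0), m.max(axis=0)
    return (m - lo) / (hi - lo + 1e-12)

norm = normalize(metrics)

# Illustrative fitness weights: reward accuracy, penalize time and resource use.
WEIGHTS = np.array([-0.20, -0.20, -0.15, -0.15, 0.30])

def fitness(idx):
    """Score one configuration index; higher is better."""
    return float(norm[idx] @ WEIGHTS)

def mgo_like_search(pop_size=8, iters=50):
    """Simplified population search in the spirit of MGO: candidates drift
    toward the best configuration seen so far, with random exploration."""
    pop = rng.uniform(0, N, size=pop_size)          # continuous encodings of indices
    scores = np.array([fitness(int(p)) for p in pop])
    best, best_score = pop[scores.argmax()], scores.max()
    for _ in range(iters):
        # Move each candidate toward the current leader plus Gaussian noise
        # (a loose analogue of gazelles following the herd leader).
        pop = pop + rng.uniform(0, 1, pop_size) * (best - pop) \
                  + rng.normal(0.0, 1.5, pop_size)
        pop = np.clip(pop, 0, N - 1e-9)
        scores = np.array([fitness(int(p)) for p in pop])
        if scores.max() > best_score:
            best, best_score = pop[scores.argmax()], scores.max()
    return CONFIGS[int(best)], best_score

best_cfg, best_fit = mgo_like_search()
print("Best configuration:", best_cfg, "fitness:", round(best_fit, 4))
```

In this sketch the search space is a small discrete set, so an exhaustive scan would also work; the population loop is only meant to illustrate how a metaheuristic such as MGO can drive the fitness function over candidate algorithm/operating-system configurations.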
License
Copyright (c) 2024 Jamal Nasir Hasoon, Yasmin Makki Mohialden, Firas Ali Hashim
This work is licensed under a Creative Commons Attribution 4.0 International License.