Unleashing the power of hyperparameter optimization for machine and deep learning models

Keywords

deep learning; random search; hybrid random and grid search; hybrid random and manual search

Degree Level

masters

Degree Name

M. Eng.

Publisher

Memorial University of Newfoundland

Abstract

Hyperparameter optimization is crucial for improving the performance of learning models by tuning their hyperparameters. This thesis introduces three effective techniques for tuning hyperparameters in various models: PointNet CNN, regularized and non-regularized standard feedforward neural networks, SVM, and PCA. The techniques employed are random search-based tuning, hybrid random and grid search-based tuning, and hybrid random and manual search-based tuning. The random search proceeds in coarse and fine-tuning stages, while the hybrid approaches combine the benefits of random search with grid or manual search. These techniques achieve competitive results while reducing computational costs in both time and storage, making them valuable for state-of-the-art work. Random search and hybrid random and grid search enhance PointNet to achieve high classification accuracy that surpasses related research, with an average F1-score of 93.6%, while the hybrid random and manual search reduces computational time but results in a lower accuracy of 90.97%. The random search-based tuning and hybrid random and grid search achieve an accuracy of 96.67% on the make_moons dataset. Similarly, the hyperparameter tuning techniques lead to a binary classification accuracy of 97% for the SVM model, and they identify the minimum number of PCA components needed to retain 99.0042% of the original dataset's variance for a set of human face images.
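The two-stage idea described above (a coarse random search followed by a finer search around the best candidate) can be sketched with scikit-learn's `RandomizedSearchCV` and `GridSearchCV` on an SVM over the make_moons dataset. This is a minimal illustration of the hybrid random-and-grid-search pattern; the parameter ranges, sample sizes, and search budgets here are assumptions for demonstration, not the thesis's actual configuration.

```python
# Hybrid random-then-grid hyperparameter search for an RBF SVM on make_moons.
# All ranges and budgets below are illustrative assumptions, not the thesis's settings.
import numpy as np
from scipy.stats import loguniform
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1: coarse random search over wide log-uniform ranges.
coarse = RandomizedSearchCV(
    SVC(kernel="rbf"),
    {"C": loguniform(1e-2, 1e3), "gamma": loguniform(1e-3, 1e1)},
    n_iter=20, cv=5, random_state=0,
).fit(X_train, y_train)

# Stage 2: fine grid search in a narrow window around the coarse optimum.
best_C = coarse.best_params_["C"]
best_g = coarse.best_params_["gamma"]
fine = GridSearchCV(
    SVC(kernel="rbf"),
    {"C": np.linspace(0.5 * best_C, 2.0 * best_C, 5),
     "gamma": np.linspace(0.5 * best_g, 2.0 * best_g, 5)},
    cv=5,
).fit(X_train, y_train)

print("refined params:", fine.best_params_)
print("test accuracy:", fine.score(X_test, y_test))
```

The design choice is that the cheap random stage rules out large regions of the hyperparameter space, so the expensive exhaustive grid only runs over a small neighborhood of the best random candidate.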
