Hello,
I would like to fine-tune three hyperparameters: learning rate, learning rate decay, and dropout value.
I have a set of ranges for each hyperparameter.
I would like to run an algorithm and get the accuracy for each combination of hyperparameters.
I have a strange problem. When I run my model once with decay=0.95, I get 75% accuracy after 3 epochs. Now I would like to fine-tune the decay hyperparameter, so I run my model in a for loop with different decay values. The first strange thing is that with decay=0.95 my model's performance drops to 56%, and for the rest of the decay values it returns 0% for both train and test. However, when I launch it with the different values manually (one by one) rather than in a for loop, the algorithm works well.
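Here is a minimal sketch of the loop pattern I mean (the helper names `build_model` and `train_and_eval` are placeholders for my real code, not an actual library API). One thing I already suspect: if the model object is created once outside the loop and reused, each iteration would continue training from the weights left by the previous run instead of starting fresh, so in the sketch a new model is built inside the loop for every combination:

```python
import itertools

def build_model(lr, decay, dropout):
    # Placeholder: in the real code this would construct a fresh,
    # freshly-initialized model with the given hyperparameters.
    return {"lr": lr, "decay": decay, "dropout": dropout}

def train_and_eval(model, epochs=3):
    # Placeholder for the real training/evaluation; returns a dummy accuracy.
    return 0.75

learning_rates = [1e-2, 1e-3]
decays = [0.90, 0.95, 0.99]
dropouts = [0.3, 0.5]

results = {}
for lr, decay, dropout in itertools.product(learning_rates, decays, dropouts):
    # Build a FRESH model every iteration so runs do not share weights
    # or optimizer state from the previous hyperparameter combination.
    model = build_model(lr, decay, dropout)
    acc = train_and_eval(model)
    results[(lr, decay, dropout)] = acc

best = max(results, key=results.get)
print(best, results[best])
```

Is this the right structure, or is there some other state (optimizer, random seed, data iterator) that also needs to be reset between iterations?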
Is it due to my for loop over the hyperparameters?
Thank you