Cross-validation: train a CNN over a range of learning rates and learning rate decays

Hello,

I would like to fine-tune some hyperparameters: learning rate, learning rate decay, and dropout value.

I have a set of candidate ranges for each hyperparameter.

I would like to run an algorithm and get the accuracy for each combination of hyperparameters.
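One common way to do this (a minimal sketch, assuming discrete candidate values for each hyperparameter; the ranges below are placeholders, not the ones from the post) is a grid search over the Cartesian product of the ranges:

```python
from itertools import product

# Hypothetical candidate ranges; substitute your own values.
learning_rates = [0.1, 0.01, 0.001]
decays = [0.90, 0.95, 0.99]
dropouts = [0.3, 0.5]

results = {}
for lr, decay, dropout in product(learning_rates, decays, dropouts):
    # Train a fresh model with this combination and record its accuracy.
    # Replace the placeholder below with your training/evaluation call.
    results[(lr, decay, dropout)] = None
```

Each iteration should train a brand-new model so the combinations do not contaminate each other.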

I have a strange problem. I ran a model with decay=0.95 and got 75% accuracy after 3 epochs. Now I would like to fine-tune the decay hyperparameter, so I run my model in a for loop with different decay values. The first strange thing is that with decay=0.95 the model's performance drops to 56%, and for the rest of the decay values it returns 0% on both train and test. However, when I launch it with the different values manually (one by one) rather than in a for loop, the algorithm works well.

Is it due to my for loop over the hyperparameters?

Thank you

It looks like you never reset the network parameters. Does that mean that the training for the next hyperparameter starts where the previous one stopped?

Hi @albanD,
Yes, I need to reset the network. How can it be done?

Does that mean that the training for the next hyperparameter starts where the previous one stopped?

It's not supposed to do that! That's not what I want.

You did not include the code that creates the net. But the simplest way would be to create a new one at every iteration. That means that you need a new optimizer as well.
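A minimal sketch of that pattern, assuming hypothetical names (`make_net`, and a tiny stand-in network instead of your actual CNN) since the original model code was not posted. The key point is that the net, optimizer, and scheduler are all constructed inside the loop, so every decay value starts from freshly initialized weights:

```python
import torch.nn as nn
import torch.optim as optim

def make_net():
    # Placeholder: replace with the constructor of your actual CNN.
    return nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

results = {}
for decay in [0.85, 0.90, 0.95]:
    net = make_net()  # fresh, randomly initialized weights each iteration
    optimizer = optim.SGD(net.parameters(), lr=0.1, weight_decay=1e-4)
    scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=decay)
    # ... run your training loop here, calling scheduler.step() once per epoch ...
    results[decay] = net  # or the accuracy returned by your evaluation
```

If the net were created once outside the loop, each decay value would continue training from wherever the previous run stopped, which matches the symptom described above.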