Hello.
When I used the Adam optimizer, I got a 60% train accuracy.
But when I used the SGD optimizer, I got about 90% train accuracy.
Why did this happen?
I think Adam is better than SGD in some ways, while SGD is only efficient in certain circumstances. Maybe you could try another dataset, or tune the Adam setup further (its learning rate, for example).
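A minimal PyTorch sketch of what comparing the two setups and tuning Adam could look like; the model, learning rates, and other hyperparameters below are assumptions for illustration, not taken from your training script:

```python
import torch

# Hypothetical stand-in model, just to show the two optimizer setups.
model = torch.nn.Linear(128, 10)

# SGD configuration (momentum value assumed here).
sgd = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Adam's default lr of 1e-3 can be too large for some models/datasets;
# trying a smaller lr (and optionally weight decay) is a common first experiment.
adam = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
```

If a smaller learning rate (or a short learning-rate sweep) closes the gap, the difference was likely a tuning issue rather than anything inherently wrong with Adam.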