Bayesian optimization libraries for hyperparameter tuning


I am looking into Bayesian optimization for hyperparameter tuning
of my multi-layer neural networks.
The hyperparameters I would like to optimize are the learning rate, batch size, number of layers, and number of neurons per layer.

Can you point me to an example or tutorial for Bayesian optimization?
Also, can you recommend a library for this?
I’ve been looking at the hyperopt and bayesian-optimization libraries and wondering whether they work with PyTorch.

Thanks in advance.


Check out Pyro (it provides an API for probabilistic programming).


Hi, thank you for your suggestion.
Do you see any pros or cons among hyperopt, skopt, bayesian-optimization, and BoTorch?

From a quick browse of all the frameworks mentioned above:

  1. BoTorch (recommended) - Built on top of PyTorch, with autograd support. All the basic Bayesian optimization tools are included. This should be preferred if you are using PyTorch.
    Pros - Modular, simple, and scalable.
    Cons - Not as extensive as some alternatives.
  2. scikit-optimize - Integrates with scikit-learn. Has an extensive API and good examples.
    Pros - More extensive than BoTorch.
    Cons - Not sure how easily it runs in conjunction with PyTorch.
  3. Hyperopt - Provides parallelization via MongoDB and Apache Spark.
    Pros - Might be useful if the search workload is huge and parallelization is required.
    Cons - Might not be the right tool for prototyping.
  4. bayesian-optimization - Runs on top of SciPy and scikit-learn.
    Pros - Good documentation and clear examples; should be fine for basic prototyping.
    Cons - Not sure how well it would work alongside PyTorch.
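Whichever library you pick, they all implement the same core loop: fit a surrogate model to the evaluations so far, maximize an acquisition function to choose the next hyperparameter setting, evaluate it, and repeat. Here is a minimal library-free sketch of that loop with a Gaussian-process surrogate and expected improvement, tuning a single "learning rate" on a toy objective. Everything here (the kernel, the toy objective, all names) is illustrative, not taken from any of the libraries above:

```python
import math
import numpy as np

def rbf_kernel(a, b, length_scale=0.3):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # GP posterior mean and std at x_query, given observations so far.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    Kss = rbf_kernel(x_query, x_query)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_train
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best_y):
    # EI for minimization: expected amount by which we beat best_y.
    z = (best_y - mu) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best_y - mu) * cdf + sigma * pdf

def objective(lr):
    # Toy stand-in for "validation loss as a function of learning rate";
    # in practice this would train the network and return validation loss.
    return (lr - 0.35) ** 2

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, 3)                 # random initial evaluations
y_train = np.array([objective(x) for x in x_train])
grid = np.linspace(0.0, 1.0, 200)                  # candidate learning rates

for _ in range(10):
    mu, sigma = gp_posterior(x_train, y_train, grid)
    nxt = grid[np.argmax(expected_improvement(mu, sigma, y_train.min()))]
    x_train = np.append(x_train, nxt)
    y_train = np.append(y_train, objective(nxt))

best_lr = x_train[np.argmin(y_train)]
print(best_lr)
```

In a real setting the `objective` call is where each library differs the most: BoTorch expects tensors and supports batched candidates, while hyperopt and bayesian-optimization just call a plain Python function, which is why they bolt onto a PyTorch training loop with no special integration.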

I really appreciate your help!

Thanks. The pleasure is mine.