PyTorch convolutional neural network: tune class weights

I have a neural network for binary prediction, shown below (last few layers only). My classes are heavily imbalanced: class 1 occurs only 2% of the time.

    self.batch_norm2 = nn.BatchNorm1d(num_filters)
    self.fc2 = nn.Linear(np.sum(num_filters), fc2_neurons)
    self.batch_norm3 = nn.BatchNorm1d(fc2_neurons)
    self.fc3 = nn.Linear(fc2_neurons, 1)

My loss is as below:

    BCE_With_LogitsLoss = nn.BCEWithLogitsLoss(pos_weight=class_wts[1] / class_wts[0])

Now, if I would like to try various pos_weight values, is there a way to do it, something like hyperparameter tuning for pos_weight? Ideally I would like to do it using Bayesian optimization (from bayes_opt import BayesianOptimization), but if there is any other way, let me know.
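
For context, this is roughly the kind of wrapper I am imagining (a rough sketch only; train_and_evaluate is a placeholder for my existing training/validation loop, and the pos_weight search range is just a guess):

    from bayes_opt import BayesianOptimization
    import torch
    import torch.nn as nn

    def objective(pos_weight):
        # Candidate loss for this trial; pos_weight is passed as a tensor
        criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([pos_weight]))
        # train_and_evaluate is a placeholder for my training/validation loop.
        # It should train the model with `criterion` and return a validation
        # metric to maximize (e.g. F1 or AUC, since plain accuracy is
        # misleading with only 2% positives).
        return train_and_evaluate(criterion)

    optimizer = BayesianOptimization(
        f=objective,
        pbounds={"pos_weight": (1.0, 100.0)},  # search range, just a guess
        random_state=42,
    )
    optimizer.maximize(init_points=5, n_iter=20)
    print(optimizer.max)  # best pos_weight found and the metric it achieved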

As of now, the only thought I have is to perform an exhaustive search over candidate values, but that won't be very efficient.
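
By exhaustive search I mean something like the loop below (same placeholder train_and_evaluate as above):

    # Naive exhaustive search over hand-picked pos_weight candidates
    results = {}
    for pw in [1.0, 5.0, 10.0, 25.0, 50.0, 100.0]:
        criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([pw]))
        results[pw] = train_and_evaluate(criterion)  # validation metric per candidate
    best_pw = max(results, key=results.get)  # candidate with the best metric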