How to know if an input has an influence on the result?

Hello. I’m working on a conditional model that generates music. I added a label to the input as a one-hot vector that can be an artist, a genre, or an instrument. My worry is that the label is a single one-hot vector (so max value 1), while the input is very long (say len = 100) with values ranging from 1 to 256. So I’m afraid that priming with the label has no impact at all.
Is there a way to know whether the label has an impact? And if it doesn’t, how can I “artificially” increase the influence of the label on training / generation?

Thank you !


First of all, it is usually recommended to normalize the input, so you could try rescaling the input range from (1, 256) to (-1, 1). I think this might help.
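A minimal sketch of that rescaling, assuming the input values are integers in [1, 256]:

```python
import numpy as np

def normalize_tokens(tokens, low=1, high=256):
    """Linearly rescale values from [low, high] to [-1, 1]."""
    tokens = np.asarray(tokens, dtype=np.float32)
    return 2.0 * (tokens - low) / (high - low) - 1.0

# The endpoints map to -1 and 1, the midpoint lands near 0.
x = normalize_tokens([1, 128, 256])
print(x)
```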
If you want to know how much an input affects the output, you could try monitoring the magnitude of the gradient with respect to the input label (not with respect to the weights). If that gradient is very small, it may mean the label makes little difference to the output.
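Here is a rough sketch of that check in PyTorch. The `ToyConditionalModel` below is just a hypothetical stand-in for your model; the point is enabling `requires_grad_` on the label tensor and inspecting `label.grad` after a backward pass:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the conditional model: it takes a
# normalized sequence plus a one-hot label and produces an output.
class ToyConditionalModel(nn.Module):
    def __init__(self, seq_len=100, n_labels=10, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(seq_len + n_labels, hidden),
            nn.Tanh(),
            nn.Linear(hidden, seq_len),
        )

    def forward(self, seq, label):
        return self.net(torch.cat([seq, label], dim=-1))

model = ToyConditionalModel()
seq = torch.rand(1, 100) * 2 - 1     # normalized input in (-1, 1)
label = torch.zeros(1, 10)
label[0, 3] = 1.0                    # one-hot label
label.requires_grad_(True)           # track gradients w.r.t. the label

out = model(seq, label)
out.sum().backward()

grad_norm = label.grad.norm().item()
print(f"gradient norm w.r.t. label: {grad_norm:.4f}")
# If this stays near zero across many batches, the label
# probably has little influence on the output.
```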

Ideally, the model should learn the importance of the label on its own, but if you want to “artificially” force it, you could try repeating the label several times in the input.
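As a rough illustration (assuming the label is a one-hot vector and the sequence is a 1-D array), the repetition could look like:

```python
import numpy as np

def condition_with_repeated_label(seq, label_one_hot, repeats=10):
    """Prepend `repeats` copies of the one-hot label to the sequence,
    so the label occupies a larger share of the input."""
    repeated = np.tile(label_one_hot, repeats)
    return np.concatenate([repeated, seq])

seq = np.random.uniform(-1, 1, size=100)  # normalized input, len 100
label = np.zeros(10)
label[3] = 1.0                            # one-hot label
x = condition_with_repeated_label(seq, label, repeats=10)
# The input grows from 100 to 200 values; the label fills half of it.
```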

Hope this helps

Thanks for the help, Ben.
Never mind, I just checked and the input is indeed normalized. Still, I feel the label should have more importance for some reason. Intuitively it must have a big influence, yet it’s hard to check that influence on generation (the results are still a bit messy anyway).
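One check I’m considering: generate twice with the same primer but different labels and measure how different the outputs are. A sketch of that comparison, where `generate` is a hypothetical deterministic stand-in for my trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def generate(seq, label, weights):
    """Hypothetical stand-in for the generator: deterministic given
    (seq, label), so any output difference comes from the label."""
    return np.tanh(weights @ np.concatenate([seq, label]))

seq = rng.uniform(-1, 1, 100)        # same primer for both runs
weights = rng.normal(size=(100, 110))
label_a = np.zeros(10); label_a[0] = 1.0
label_b = np.zeros(10); label_b[1] = 1.0

out_a = generate(seq, label_a, weights)
out_b = generate(seq, label_b, weights)

# If the model truly ignores the label, this distance stays near zero.
distance = np.linalg.norm(out_a - out_b)
print(f"output distance between labels: {distance:.4f}")
```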