Shared parameters with different learning rates

I have two models, denoted m1 and m2, which share a group of parameters denoted p. Besides p, each model also has its own parameters.
Now if I build the optimizer with `optim.SGD([{'params': m1.parameters(), 'lr': 0.1}, {'params': m2.parameters(), 'lr': 0.01}])`, which lr is used for the shared parameters p?
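For reference, here is a minimal sketch of the setup. The models `m1` and `m2` are hypothetical stand-ins (two `nn.Sequential` models sharing one `nn.Linear` layer, whose parameters play the role of p); note that recent PyTorch versions may reject a parameter that appears in more than one parameter group, so the construction is wrapped in a try/except:

```python
import torch.nn as nn
import torch.optim as optim

# Hypothetical stand-ins for m1 and m2: two models sharing
# the same linear layer (its parameters play the role of p)
shared = nn.Linear(4, 4)
m1 = nn.Sequential(shared, nn.Linear(4, 2))
m2 = nn.Sequential(shared, nn.Linear(4, 3))

# Two parameter groups with different learning rates; the
# shared layer's parameters end up in both groups
try:
    optimizer = optim.SGD([
        {"params": m1.parameters(), "lr": 0.1},
        {"params": m2.parameters(), "lr": 0.01},
    ])
    print("optimizer built")
except ValueError as e:
    # Recent PyTorch versions raise here, because a parameter
    # may not appear in more than one parameter group
    print("rejected:", e)
```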