Sharing a trainable variable between 2 classes

I need to construct an embedding variable in one master class, and I want it to be updated from a slave class, too.

Now I plan to pass the embedding matrix from my master class as a parameter to the slave class’s train function. But how can I make sure it still gets updated?

Thanks!

Your question is very confusing. I’m not sure I understand what you want, but you can use the same Variable in both of these classes (reusing the same Variable is fine).

Sorry for the confusion. I mean the embedding parameter is created in one class, and of course it can be updated in that class.

Then I want it to be a formal parameter of another class, that is, I want to pass this parameter to another class and have it be updated there as well.

Hope I’ve expressed myself clearly this time (BTW, I’m a newcomer from TensorFlow :joy:).

Can you write some example code? I am still not sure what you mean by formal parameter and class.

Like this: first I create a class in a.py, and I import b.py.
I want to pass the embedding to b.py through a function call (call(self.embedding)).

import torch.nn as nn
from b import call

class A(nn.Module):
	def __init__(self):
		super().__init__()
		self.embedding = nn.Embedding(3, 4)

	def forward(self):
		call(self.embedding)

In b.py, like this. I hope that in class B the embedding parameter can also be updated.

class B(nn.Module):
	def __init__(self):
		super().__init__()

	def forward(self, embedding):
		# use the embedding passed in from class A here
		pass

modelB = B()

def call(embedding):
	modelB(embedding)
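For what it’s worth, here is a minimal self-contained sketch of that pattern (the names ModelA/ModelB are just placeholders). Since an nn.Embedding is passed by reference, the second class computes its loss on the very same weight tensor, so a single optimizer step over ModelA’s parameters updates the embedding that both classes see:

```python
import torch
import torch.nn as nn

class ModelA(nn.Module):
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(3, 4)  # the shared trainable parameter

class ModelB(nn.Module):
    def forward(self, embedding, ids):
        # `embedding` is the SAME module object owned by ModelA;
        # gradients computed here flow into its weight
        return embedding(ids).sum()

a = ModelA()
b = ModelB()
opt = torch.optim.SGD(a.parameters(), lr=0.1)

before = a.embedding.weight.clone()
loss = b(a.embedding, torch.tensor([0, 2]))  # use rows 0 and 2 only
opt.zero_grad()
loss.backward()
opt.step()

# rows 0 and 2 changed, row 1 untouched — proof the parameter is shared
changed = (a.embedding.weight != before).any(dim=1)
print(changed.tolist())  # [True, False, True]
```

The key point is that the optimizer only needs to know about the class that registered the parameter; the other class just has to use the module (not a detached copy of its weight) in its forward pass.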