Hello,
I want to apply this function to each element of a tensor. Can I do it without a for loop and without apply_? I want to run it on the GPU.
import torch
length = 8
def WToSN(In):
    LenSN = 2 ** length
    SN = torch.zeros(LenSN, device=torch.device('cuda'))
    SN[0:In] = 1
    return SN
weightfold2 = torch.tensor([[1, 2, 3], [4, 5, 6]])
weight = weightfold2.flatten()
WeightSN = torch.stack([WToSN(elem) for elem in weight])
WeightSN = WeightSN.reshape(weightfold2.size(0), weightfold2.size(1), -1)
thank you
What’s wrong with using apply_? This works without a for loop
import torch
def custom_function(x):
    return x ** 2
tensor = torch.tensor([1, 2, 3, 4, 5])
result = tensor.apply_(custom_function)
print(result) # tensor([ 1, 4, 9, 16, 25])
thank you for your answer
The apply_ function can’t run on the GPU, and I want to run my code on the GPU. Isn’t there any solution?
Ah, my bad, this should work:
import torch
def custom_function(x):
    return x ** 2
tensor = torch.tensor([1, 2, 3, 4, 5], device="cuda")
tensor = custom_function(tensor)
print(tensor) # tensor([ 1, 4, 9, 16, 25])
thank you for your answer
It works correctly for that function, but with my function:
import torch
length = 8
def WToSN(In):
    LenSN = 2 ** length
    SN = torch.zeros(LenSN, device=torch.device('cuda'))
    SN[0:In] = 1
    return SN
weight = torch.tensor([[1, 2, 3], [4, 5, 6]], device=torch.device('cuda'))
print(weight)
WeightSN = WToSN(weight)
print(WeightSN)
I get this error: TypeError: only integer tensors of a single element can be converted to an index
for this line: SN[0:In] = 1
Can you correct this? Thank you.
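The slice SN[0:In] only works when In is a single Python integer, not a whole tensor. One way to vectorize this (an assumption about your intent: that the first In entries of each row should be 1) is to replace the slice with a broadcasted comparison against an index range. A minimal sketch, device-agnostic so it runs on the GPU whenever the input tensor lives on cuda:

```python
import torch

length = 8

def WToSN(In):
    # Index row [0, 1, ..., 2**length - 1] on the same device as the input,
    # so the whole computation stays on the GPU when In is a cuda tensor.
    LenSN = 2 ** length
    idx = torch.arange(LenSN, device=In.device)
    # Broadcasting: In gains a trailing dim of size 1, so the comparison
    # yields one length-LenSN row per element of In, with the first
    # In[i, j] entries equal to 1 -- the same effect as SN[0:In] = 1,
    # but for every element at once, with no Python loop.
    return (idx < In.unsqueeze(-1)).to(torch.float32)

weight = torch.tensor([[1, 2, 3], [4, 5, 6]])  # add device="cuda" to run on GPU
WeightSN = WToSN(weight)
print(WeightSN.shape)  # torch.Size([2, 3, 256])
```

The output already has shape (weight.size(0), weight.size(1), 2**length), so no reshape is needed afterwards.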