
Suppose that tensor A is defined as:

 1  2  3  4
 5  6  7  8
 9 10 11 12
13 14 15 16

I'm trying to extract a flat array out of this matrix by using another tensor as indices. For example, if the second tensor is defined as:

0 1 2 3 

I want the result of the indexing to be a 1-D tensor with the contents:

1 6 11 16 

It doesn't seem to behave like NumPy: I've tried A[:, B], but it just throws an error about not being able to allocate an insane amount of memory, and I have no idea why!
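For concreteness, a minimal sketch of the setup (torch.arange is used here purely to build the example tensors):

import torch

# Example tensors matching the ones described above
A = torch.arange(1, 17).view(4, 4)   # rows [1..4], [5..8], [9..12], [13..16]
B = torch.tensor([0, 1, 2, 3])       # one column index per row

# Desired output: tensor([ 1,  6, 11, 16]), i.e. A[i, B[i]] for every row i.
# A[:, B] instead selects columns 0..3 for *every* row, producing an N x N
# result, which is presumably why the allocation blows up on large inputs.
print(A[:, B].shape)                 # torch.Size([4, 4])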

3 Answers


1st Approach: using torch.gather

torch.gather(A, 1, B.unsqueeze(dim=1))

If you want a one-dimensional vector, you can add a squeeze at the end:

torch.gather(A, 1, B.unsqueeze(dim=1)).squeeze(dim=1)
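A rough end-to-end sketch of this approach, assuming the A and B from the question (B must be an integer index tensor):

import torch

# gather along dim=1 picks A[i, B[i]] for every row i; the index tensor
# needs the same number of dimensions as A, hence the unsqueeze
A = torch.arange(1, 17).view(4, 4)
B = torch.tensor([0, 1, 2, 3])

out = torch.gather(A, 1, B.unsqueeze(dim=1)).squeeze(dim=1)
print(out)   # tensor([ 1,  6, 11, 16])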

2nd Approach: using list comprehensions

You can use a list comprehension to select the items at specific indices and then concatenate them with torch.stack. An important point here is that you should not use torch.tensor to create a new tensor from the list; if you do, you will break the chain (you cannot calculate the gradient through that node):

torch.stack([A[i, B[i]] for i in range(A.size()[0])]) 
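A short sketch showing that the gradient chain stays intact with this approach (requires_grad is turned on here only for illustration):

import torch

# List-comprehension pick with autograd left intact
A = torch.arange(1., 17.).view(4, 4).requires_grad_()
B = torch.tensor([0, 1, 2, 3])

out = torch.stack([A[i, B[i]] for i in range(A.size(0))])
print(out)            # tensor([ 1.,  6., 11., 16.], grad_fn=<StackBackward...>)

out.sum().backward()  # gradients flow back to A
print(A.grad)         # ones at the selected positions, zeros elsewhere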



You can convert your tensor to a NumPy array. If you are using CUDA, don't forget to move it to the CPU first; otherwise that step isn't needed. Example code is below:

val.data.cpu().numpy()[:, B]
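A sketch of this route with the A and B from the question (val stands in for your tensor). Note that the per-row pick in NumPy needs explicit row indices via np.arange (a plain [:, B] selects whole columns), and that going through NumPy detaches the values from the autograd graph.

import numpy as np
import torch

val = torch.arange(1, 17).view(4, 4)
B = torch.tensor([0, 1, 2, 3])

arr = val.data.cpu().numpy()                       # move to CPU, then convert
picked = arr[np.arange(arr.shape[0]), B.numpy()]   # per-row pick
print(picked)                                      # [ 1  6 11 16]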

Let me know if it resolves your issue.

2 Comments

Will PyTorch still be able to calculate the gradient? I'm using this piece of code in the loss function, and I was worried that if I use NumPy it won't be able to calculate the gradient properly.
After you gather the array, you can create a new tensor from it and you will be able to calculate the gradient. It would be quite complicated, but I believe it will resolve your issue.

PyTorch implements torch.take, which is equivalent to numpy.take.
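A short sketch of how that would look with the A and B from the question: since torch.take indexes into the flattened input, each row's offset (row index times the number of columns) has to be folded into the indices first.

import torch

A = torch.arange(1, 17).view(4, 4)
B = torch.tensor([0, 1, 2, 3])

# take addresses the flattened tensor, so add the per-row offsets to B
flat_idx = torch.arange(A.size(0)) * A.size(1) + B   # tensor([ 0,  5, 10, 15])
print(torch.take(A, flat_idx))                       # tensor([ 1,  6, 11, 16])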

1 Comment

Thank you for the answer, but torch.take treats the input as if it were 1-dimensional, so I would have to shift the indices to compensate. The arrays I'm dealing with are really large, so that isn't an option.
