
I have a list of tensors of the same shape, and I would like to sum the entire list of tensors along an axis. Does torch.cumsum perform this op along a dim? If so, does it require the list to be converted to a single tensor first and then summed over?

  • What are your exact requirements, and what output do you want? Commented Mar 14, 2019 at 16:59

1 Answer


You don't need cumsum; sum is your friend. And yes, you should first convert the list into a single tensor with stack or cat, depending on your needs. Something like this:

```python
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]
result = torch.stack(my_list, dim=0).sum(dim=0).sum(dim=0)
print(result.shape)  # torch.Size([5])
```
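The difference between the two options matters here: stack inserts a new dimension, while cat joins along an existing one. A quick sketch (tensor names are illustrative):

```python
import torch

a, b = torch.randn(3, 5), torch.randn(3, 5)

stacked = torch.stack([a, b], dim=0)  # new leading dim: shape (2, 3, 5)
catted = torch.cat([a, b], dim=0)     # joined along dim 0: shape (6, 5)

# Summing the stacked tensor over its new dim gives the
# elementwise sum of the original list.
print(stacked.sum(dim=0).shape)  # torch.Size([3, 5])
```

So for "sum a list of same-shaped tensors elementwise", stack followed by a single sum over dim 0 is the natural choice; cat would mix the tensors into one larger axis instead.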

1 Comment

I would have liked to see a built-in function/operator for this.
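In fact, Python's built-in sum already works on a list of tensors, because tensor addition is overloaded and 0 + tensor is valid; this gives the elementwise sum without stacking first (a small sketch, not from the original answer):

```python
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]

# Built-in sum starts from 0 and repeatedly adds each tensor,
# producing the elementwise sum of the list.
elementwise = sum(my_list)
print(elementwise.shape)  # torch.Size([3, 5])
```

Note that this builds intermediate tensors for each addition, so for long lists torch.stack(my_list).sum(dim=0) is usually preferable.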
