I have a list of tensors of the same shape. I would like to sum the entire list of tensors along an axis. Does torch.cumsum perform this operation along a dim? If so, does it require the list to be converted to a single tensor first and then summed?
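For reference, torch.cumsum computes a running (prefix) sum along a dim and keeps the tensor's shape, so it is not a reduction; torch.sum is the reducing op. A quick sketch contrasting the two (the example values are made up for illustration):

```python
import torch

t = torch.tensor([[1, 2, 3],
                  [4, 5, 6]])

# cumsum: running sum down dim 0, output shape unchanged
print(torch.cumsum(t, dim=0))  # tensor([[1, 2, 3], [5, 7, 9]])

# sum: reduces dim 0 away
print(t.sum(dim=0))            # tensor([5, 7, 9])
```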
- What are your exact requirements, and what output do you want? – cloudyyyyy, Mar 14, 2019 at 16:59
1 Answer
You don't need cumsum; sum is your friend. And yes, you should first convert the list into a single tensor with stack or cat, depending on your needs. Something like this:
```python
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]
result = torch.stack(my_list, dim=0).sum(dim=0).sum(dim=0)
print(result.shape)  # torch.Size([5])
```
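Note the answer sums twice, reducing both the list dimension and dim 0 of each tensor. If the goal is just the elementwise sum of the tensors in the list, keeping their (3, 5) shape, a single reduction over the stacking dimension is enough; a minimal sketch (the example list of ones is an assumption for easy checking):

```python
import torch

tensors = [torch.ones(3, 5), torch.ones(3, 5)]  # hypothetical example list
elementwise = torch.stack(tensors, dim=0).sum(dim=0)  # reduces only the list dim
print(elementwise.shape)  # torch.Size([3, 5])
```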
- Would have liked to see a built-in function/operator for this. – Vasantha Ganesh Kanniappan