PyTorch
tensor.detach() creates a tensor that shares storage with tensor but does not require grad. tensor.clone() creates a copy of tensor that imitates the original tensor's requires_grad field. Use detach() when you want to remove a tensor from a computation graph, and clone() to copy a tensor while keeping the copy as part of the computation graph it came from.
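A short sketch of the difference:

```python
import torch

x = torch.ones(3, requires_grad=True)

# detach(): shares storage with x, cut off from the graph
d = x.detach()
print(d.requires_grad)               # False
print(d.data_ptr() == x.data_ptr())  # True: same storage

# clone(): new storage, still part of x's graph
c = x.clone()
print(c.requires_grad)               # True: imitates x
c.sum().backward()                   # gradients flow back to x through the clone
print(x.grad)                        # tensor([1., 1., 1.])
```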
tensor.data returns a new tensor that shares storage with tensor. However, it always has requires_grad=False (even if the original tensor had requires_grad=True). You should avoid calling tensor.data in 0.4.0: in-place changes made through .data are not tracked by autograd, so detach() is the safer way to get a non-grad view.
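For example:

```python
import torch

x = torch.ones(2, requires_grad=True)
y = x.data
print(y.requires_grad)               # False, even though x requires grad
print(y.data_ptr() == x.data_ptr())  # True: same underlying storage
```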
tensor.clone() makes a copy of tensor. variable.clone() and variable.detach() in 0.3.1 act the same as tensor.clone() and tensor.detach() in 0.4.0.
| Item | detach | clone | data |
| --- | --- | --- | --- |
| requires_grad | False | Same as original tensor | False |
| note | | | Do not use in 0.4.0 |
torch.gather(input, dim, index, out=None) -> Tensor
Collects values along the axis specified by dim from the input tensor. For a 2-D tensor, out[i][j] = input[index[i][j]][j] when dim=0, and out[i][j] = input[i][index[i][j]] when dim=1.
Example: dim=1, the index selects the column to read within each row

```python
a = torch.tensor([[1, 2], [3, 4]])
torch.gather(a, 1, torch.LongTensor([[1, 1], [1, 0]]))
# tensor([[2, 2],
#         [4, 3]])
```
Example: dim=1, every index is column 0

```python
a = torch.tensor([[1, 2], [3, 4]])
torch.gather(a, 1, torch.LongTensor([[0, 0], [0, 0]]))
# tensor([[1, 1],
#         [3, 3]])
```
Example: dim=0, the index selects the row to read within each column

```python
a = torch.tensor([[1, 2], [3, 4]])
torch.gather(a, 0, torch.LongTensor([[0, 0], [0, 0]]))
# tensor([[1, 2],
#         [1, 2]])
```
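A common practical use of gather, shown as a sketch (the variable names here are illustrative): picking one value per row, e.g. the score of each sample's chosen class.

```python
import torch

scores = torch.tensor([[0.1, 0.9], [0.8, 0.2]])
chosen = torch.tensor([1, 0])                   # one column index per row
picked = scores.gather(1, chosen.unsqueeze(1))  # shape (2, 1)
print(picked.squeeze(1))                        # tensor([0.9000, 0.8000])
```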