PyTorch
Detach and clone
https://discuss.pytorch.org/t/clone-and-detach-in-v0-4-0/16861
`tensor.detach()` creates a tensor that shares storage with `tensor` but does not require grad. `tensor.clone()` creates a copy of `tensor` that imitates the original `tensor`'s `requires_grad` field. Use `detach()` to remove a tensor from the computation graph, and `clone()` to copy a tensor while keeping the copy as part of the graph it came from.

`tensor.data` returns a new tensor that shares storage with `tensor`, but it always has `requires_grad=False` (even if the original `tensor` had `requires_grad=True`). Avoid calling `tensor.data` in 0.4.0.

In 0.3.1, `variable.clone()` and `variable.detach()` act the same as `tensor.clone()` and `tensor.detach()` in 0.4.0.
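A minimal sketch of the three options (tensor names are illustrative; behavior follows the semantics quoted above):

```python
import torch

x = torch.ones(3, requires_grad=True)

d = x.detach()  # shares storage with x, requires_grad=False
c = x.clone()   # new storage, copies requires_grad=True from x
v = x.data      # shares storage with x, requires_grad=False

print(d.requires_grad, c.requires_grad, v.requires_grad)  # False True False

# detach() and .data share storage with x: writes through d show up in x
d[0] = 5.0
print(x[0].item())  # 5.0

# clone() owns its storage: writes to c leave x untouched
c[1] = 7.0
print(x[1].item())  # 1.0
```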
| Item | `detach()` | `clone()` | `.data` |
| --- | --- | --- | --- |
| `requires_grad` | `False` | Same as the original tensor | `False` |
| Note | Removes the result from the graph | Retains the graph | Do not use in 0.4.0 |
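To make the note row concrete, a small sketch (names are illustrative) of how gradients flow back through `clone()` but not through `detach()`:

```python
import torch

x = torch.ones(2, requires_grad=True)

# clone() retains the graph: backward() through the clone reaches x
y = x.clone().sum()
y.backward()
print(x.grad)  # tensor([1., 1.])

# detach() cuts the graph: the result has no grad_fn, so calling
# backward() on z would raise an error
z = x.detach().sum()
print(z.requires_grad)  # False
```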
torch.gather
torch.gather(input, dim, index, out=None) -> Tensor
Collects values along the axis specified by `dim` from the `input` tensor. For a 2-D tensor, `out[i][j] = input[index[i][j]][j]` if `dim == 0`, and `out[i][j] = input[i][index[i][j]]` if `dim == 1`.
Example: `dim=1`; each index entry selects a column within its row.

    a = torch.tensor([[1, 2], [3, 4]])
    torch.gather(a, 1, torch.LongTensor([[1, 1], [1, 0]]))
    # tensor([[2, 2],
    #         [4, 3]])
Example: `dim=1` with an all-zero index; every entry selects column 0 of its row.

    a = torch.tensor([[1, 2], [3, 4]])
    torch.gather(a, 1, torch.LongTensor([[0, 0], [0, 0]]))
    # tensor([[1, 1],
    #         [3, 3]])
Example: `dim=0`; each index entry selects a row within its column.

    a = torch.tensor([[1, 2], [3, 4]])
    torch.gather(a, 0, torch.LongTensor([[0, 0], [0, 0]]))
    # tensor([[1, 2],
    #         [1, 2]])
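As an illustrative use case (not from the source): `gather` along `dim=1` is a common way to pick one value per row, e.g. each sample's score for its target class. The `scores` and `labels` below are made-up data:

```python
import torch

scores = torch.tensor([[0.1, 0.7, 0.2],
                       [0.8, 0.1, 0.1]])
labels = torch.tensor([1, 0])  # target class per row (hypothetical)

# index must have the same number of dimensions as the input,
# hence unsqueeze(1) to get shape (2, 1)
picked = scores.gather(1, labels.unsqueeze(1))
print(picked)  # tensor([[0.7000], [0.8000]])
```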