The tensor is PyTorch's basic data object. When working on real problems, you frequently need to change the dimensions of your data to make later computation and processing easier. This article enumerates common dimension-transformation methods, with examples, for convenient reference.

<> Viewing dimensions: torch.Tensor.size()

Returns the dimensions of the current tensor.

For example:
>>> import torch
>>> a = torch.Tensor([[[1, 2], [3, 4], [5, 6]]])
>>> a.size()
torch.Size([1, 3, 2])
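As a side note (standard PyTorch API, though not shown in the snippet above), size() also accepts a dimension index, and the shape attribute is an alias for size():

>>> a.size(1)   # size of a single dimension
3
>>> a.shape     # .shape is an alias for .size()
torch.Size([1, 3, 2])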
<> Reshaping a tensor: torch.Tensor.view(*args) → Tensor

Returns a new tensor with the same data but a different shape. The returned tensor must have the same data and the same number of elements as the original tensor, but its shape may differ. A tensor must be contiguous (contiguous()) to be viewed.

For example:
>>> x = torch.randn(2, 9)
>>> x.size()
torch.Size([2, 9])
>>> x
tensor([[-1.6833, -0.4100, -1.5534, -0.6229, -1.0310, -0.8038,  0.5166,  0.9774,  0.3455],
        [-0.2306,  0.4217,  1.2874, -0.3618,  1.7872, -0.9012,  0.8073, -1.1238, -0.3405]])
>>> y = x.view(3, 6)
>>> y.size()
torch.Size([3, 6])
>>> y
tensor([[-1.6833, -0.4100, -1.5534, -0.6229, -1.0310, -0.8038],
        [ 0.5166,  0.9774,  0.3455, -0.2306,  0.4217,  1.2874],
        [-0.3618,  1.7872, -0.9012,  0.8073, -1.1238, -0.3405]])
>>> z = x.view(2, 3, 3)
>>> z.size()
torch.Size([2, 3, 3])
>>> z
tensor([[[-1.6833, -0.4100, -1.5534],
         [-0.6229, -1.0310, -0.8038],
         [ 0.5166,  0.9774,  0.3455]],

        [[-0.2306,  0.4217,  1.2874],
         [-0.3618,  1.7872, -0.9012],
         [ 0.8073, -1.1238, -0.3405]]])
You can see that x, y, and z contain the same data and the same number of elements; only the shape or the number of dimensions has changed.
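To illustrate the contiguity requirement, here is a minimal sketch, continuing with the same x: view() fails on a non-contiguous tensor, such as the result of t(), and calling contiguous() first works around this. The exact error message varies between PyTorch versions:

>>> x.t().is_contiguous()
False
>>> x.t().view(2, 9)   # fails: the transposed tensor is not contiguous
Traceback (most recent call last):
  ...
RuntimeError: view size is not compatible with input tensor's size and stride ...
>>> x.t().contiguous().view(2, 9).size()   # copy into contiguous memory first
torch.Size([2, 9])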

<> Squeezing/unsqueezing a tensor: torch.squeeze(), torch.unsqueeze()

* torch.squeeze(input, dim=None, out=None)
Removes all dimensions of size 1 from the input tensor and returns the result. If the input has shape (A×1×B×1×C×1×D), the output has shape (A×B×C×D).

When dim is given, the squeeze operation applies only to that dimension. For example, if the input has shape (A×1×B), then squeeze(input, 0) leaves the tensor unchanged; only squeeze(input, 1) changes the shape to (A×B).

The returned tensor shares memory with the input tensor, so changing the contents of one changes the other.

For example:
>>> x = torch.randn(3, 1, 2)
>>> x
tensor([[[-0.1986,  0.4352]],

        [[ 0.0971,  0.2296]],

        [[ 0.8339, -0.5433]]])
>>> x.squeeze().size()          # no argument: every dimension of size 1 is removed
torch.Size([3, 2])
>>> x.squeeze()
tensor([[-0.1986,  0.4352],
        [ 0.0971,  0.2296],
        [ 0.8339, -0.5433]])
>>> torch.squeeze(x, 0).size()  # dim 0 has size 3, not 1, so nothing happens
torch.Size([3, 1, 2])
>>> torch.squeeze(x, 1).size()  # dim 1 has size 1, so it is removed
torch.Size([3, 2])
You can see that when a dim argument is given, a dimension disappears only if its size is 1.
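A minimal sketch of the memory sharing described above (the values depend on the random x, so treat the numbers as illustrative):

>>> y = x.squeeze()
>>> y[0, 0] = 100.0     # write through the squeezed view
>>> x[0, 0, 0]          # the change is visible in the original tensor
tensor(100.)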

* torch.unsqueeze(input, dim, out=None)
Returns a new tensor with a dimension of size 1 inserted at the specified position.

The returned tensor shares memory with the input tensor, so changing the contents of one changes the other.

If dim is negative, it is converted to dim + input.dim() + 1.

Using the data above as an example:
>>> x.unsqueeze(0).size()
torch.Size([1, 3, 1, 2])
>>> x.unsqueeze(0)
tensor([[[[-0.1986,  0.4352]],

         [[ 0.0971,  0.2296]],

         [[ 0.8339, -0.5433]]]])
>>> x.unsqueeze(-1).size()
torch.Size([3, 1, 2, 1])
>>> x.unsqueeze(-1)
tensor([[[[-0.1986],
          [ 0.4352]]],

        [[[ 0.0971],
          [ 0.2296]]],

        [[[ 0.8339],
          [-0.5433]]]])
You can see that a dimension has been added at the specified position.
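A quick check of the negative-dim rule stated above: for the 3-dimensional x, dim=-1 maps to -1 + 3 + 1 = 3:

>>> x.unsqueeze(-1).size() == x.unsqueeze(3).size()
True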

<> Expanding a tensor: torch.Tensor.expand(*sizes) → Tensor

Returns a new view of the tensor in which singleton dimensions are expanded to a larger size. The tensor can also be expanded to a higher number of dimensions, with the new dimensions prepended at the front. Expanding a tensor does not allocate new memory: it only creates a new view in which the stride of each expanded dimension is set to 0, so any dimension of size 1 can be expanded to an arbitrary size without copying data.

For example:
>>> x = torch.Tensor([[1], [2], [3]])
>>> x.size()
torch.Size([3, 1])
>>> x.expand(3, 4)
tensor([[1., 1., 1., 1.],
        [2., 2., 2., 2.],
        [3., 3., 3., 3.]])
>>> x.expand(3, -1)
tensor([[1.],
        [2.],
        [3.]])
The original data has 3 rows and 1 column; after expansion it has 3 rows and 4 columns. Filling in -1 is the same as filling in the original size. Only dimensions of size 1 can be expanded; a dimension whose size is not 1 cannot be changed and must be filled in with its original size.
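The stride-0 mechanism described above can be verified directly; this sketch shows that the expanded view allocates no new storage:

>>> x.stride()
(1, 1)
>>> x.expand(3, 4).stride()    # the expanded dimension gets stride 0
(1, 0)
>>> x.expand(3, 4).storage().data_ptr() == x.storage().data_ptr()
True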

<> Repeating a tensor: torch.Tensor.repeat(*sizes)

Repeats the tensor along the specified dimensions. Unlike expand(), this function copies the tensor's data.

For example:
>>> x = torch.Tensor([1, 2, 3])
>>> x.size()
torch.Size([3])
>>> x.repeat(4, 2)
tensor([[1., 2., 3., 1., 2., 3.],
        [1., 2., 3., 1., 2., 3.],
        [1., 2., 3., 1., 2., 3.],
        [1., 2., 3., 1., 2., 3.]])
>>> x.repeat(4, 2).size()
torch.Size([4, 6])
The original data is 1 row by 3 columns. It is repeated 4 times along the row direction and 2 times along the column direction, producing 4 rows and 6 columns.

You can think of it as tiling: the given sizes define a 4×2 grid of identical cells, and a copy of the original data is placed in each cell.
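Since repeat() copies the data, the result owns its own storage, unlike the view returned by expand(); a minimal sketch:

>>> y = x.repeat(4, 2)
>>> y[0, 0] = 100.0    # write to the copy
>>> x                  # the original tensor is unaffected
tensor([1., 2., 3.])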

<> Matrix transposition: torch.t(input, out=None) → Tensor

Takes a matrix (a 2-dimensional tensor) as input and transposes dimensions 0 and 1. It can be treated as a shorthand for transpose(input, 0, 1).

For example:
>>> x = torch.randn(3, 5)
>>> x
tensor([[-1.0752, -0.9706, -0.8770, -0.4224,  0.9776],
        [ 0.2489, -0.2986, -0.7816, -0.0823,  1.1811],
        [-1.1124,  0.2160, -0.8446,  0.1762, -0.5164]])
>>> x.t()
tensor([[-1.0752,  0.2489, -1.1124],
        [-0.9706, -0.2986,  0.2160],
        [-0.8770, -0.7816, -0.8446],
        [-0.4224, -0.0823,  0.1762],
        [ 0.9776,  1.1811, -0.5164]])
>>> torch.t(x)   # the functional form
tensor([[-1.0752,  0.2489, -1.1124],
        [-0.9706, -0.2986,  0.2160],
        [-0.8770, -0.7816, -0.8446],
        [-0.4224, -0.0823,  0.1762],
        [ 0.9776,  1.1811, -0.5164]])
The input must be a 2-dimensional tensor, i.e. a matrix.
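A quick check of the equivalence with transpose(input, 0, 1) stated above:

>>> torch.equal(x.t(), torch.transpose(x, 0, 1))
True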

<> Swapping dimensions: torch.transpose(), torch.Tensor.permute()

* torch.transpose(input, dim0, dim1, out=None) → Tensor
Returns a transposed version of the input tensor, with dimensions dim0 and dim1 swapped. The output tensor shares memory with the input tensor, so modifying one also modifies the other.

For example:
>>> x = torch.randn(2, 4, 3)
>>> x
tensor([[[-1.2502, -0.7363,  0.5534],
         [-0.2050,  3.1847, -1.6729],
         [-0.2591, -0.0860,  0.4660],
         [-1.2189, -1.1206,  0.0637]],

        [[ 1.4791, -0.7569,  2.5017],
         [ 0.0098, -1.0217,  0.8142],
         [-0.2414, -0.1790,  2.3506],
         [-0.6860, -0.2363,  1.0481]]])
>>> torch.transpose(x, 1, 2).size()
torch.Size([2, 3, 4])
>>> torch.transpose(x, 1, 2)
tensor([[[-1.2502, -0.2050, -0.2591, -1.2189],
         [-0.7363,  3.1847, -0.0860, -1.1206],
         [ 0.5534, -1.6729,  0.4660,  0.0637]],

        [[ 1.4791,  0.0098, -0.2414, -0.6860],
         [-0.7569, -1.0217, -0.1790, -0.2363],
         [ 2.5017,  0.8142,  2.3506,  1.0481]]])
>>> torch.transpose(x, 0, 1).size()
torch.Size([4, 2, 3])
>>> torch.transpose(x, 0, 1)
tensor([[[-1.2502, -0.7363,  0.5534],
         [ 1.4791, -0.7569,  2.5017]],

        [[-0.2050,  3.1847, -1.6729],
         [ 0.0098, -1.0217,  0.8142]],

        [[-0.2591, -0.0860,  0.4660],
         [-0.2414, -0.1790,  2.3506]],

        [[-1.2189, -1.1206,  0.0637],
         [-0.6860, -0.2363,  1.0481]]])
Unlike torch.t(), this lets us transpose multi-dimensional tensors.
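A minimal sketch of the memory sharing noted above, reusing this section's x:

>>> y = torch.transpose(x, 1, 2)
>>> y[0, 0, 0] = 100.0
>>> x[0, 0, 0]          # the write is visible through the original tensor
tensor(100.)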

* torch.Tensor.permute(*dims)
Permutes the dimensions of the tensor.

Using the data above as an example:
>>> x.size()
torch.Size([2, 4, 3])
>>> x.permute(2, 0, 1).size()
torch.Size([3, 2, 4])
>>> x.permute(2, 0, 1)
tensor([[[-1.2502, -0.2050, -0.2591, -1.2189],
         [ 1.4791,  0.0098, -0.2414, -0.6860]],

        [[-0.7363,  3.1847, -0.0860, -1.1206],
         [-0.7569, -1.0217, -0.1790, -0.2363]],

        [[ 0.5534, -1.6729,  0.4660,  0.0637],
         [ 2.5017,  0.8142,  2.3506,  1.0481]]])
You pass the desired order of dimension indices directly, and the tensor rearranges its dimensions accordingly; unlike transpose(), it is not limited to swapping a single pair.
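A common usage pattern (a hypothetical example, not from the original text): permute() produces a non-contiguous tensor, so it is often followed by contiguous() before view():

>>> imgs = torch.randn(8, 3, 32, 32)       # a hypothetical NCHW image batch
>>> nhwc = imgs.permute(0, 2, 3, 1)        # reorder to NHWC
>>> nhwc.is_contiguous()
False
>>> nhwc.contiguous().view(8, -1).size()   # make contiguous before view()
torch.Size([8, 3072])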
