The basic Tensor functionality seems to be essentially the same as TensorFlow's.
In fact, with so many libraries branching out from Keras-style APIs, writing AI code seems to have gotten a lot easier...
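Before the full walkthrough below, a minimal side-by-side sketch of that point: the same matrix product written in PyTorch and in TensorFlow. This is only an illustration under the assumption that tensorflow is installed; TensorFlow is not used anywhere else in this post.

import torch
import tensorflow as tf

a = [[1.0, 2.0], [3.0, 4.0]]

# The same 2x2 matrix product in both frameworks
torch_result = torch.matmul(torch.tensor(a), torch.ones(2, 2))  # PyTorch
tf_result = tf.matmul(tf.constant(a), tf.ones((2, 2)))          # TensorFlow

print(torch_result)  # tensor([[3., 3.], [7., 7.]])
print(tf_result)     # tf.Tensor with the same values, shape=(2, 2)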
import torch
import numpy as np
def fun():
    torch.multiprocessing.freeze_support()
    # device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

    # Directly from data
    data = [
        [1, 2],
        [3, 4]
    ]
    x_data = torch.tensor(data)
    print(f"x_data = {x_data}")

    # From a NumPy array
    np_array = np.array(data)
    x_np = torch.from_numpy(np_array)
    print(f"x_np = {x_np}")

    # From another tensor:
    x_ones = torch.ones_like(x_data)
    x_rand = torch.rand_like(x_data, dtype=torch.float)
    print(f"x_ones = {x_ones}")
    print(f"x_rand = {x_rand}")

    # With random or constant values:
    shape = (2, 3,)
    rand_tensor = torch.rand(shape)
    ones_tensor = torch.ones(shape)
    zeros_tensor = torch.zeros(shape)
    print(f"rand_tensor = {rand_tensor}")
    print(f"ones_tensor = {ones_tensor}")
    print(f"zeros_tensor = {zeros_tensor}")

    # Attributes of a Tensor
    tensor = torch.rand(3, 4)
    print(f"tensor.shape = {tensor.shape}")
    print(f"tensor.dtype = {tensor.dtype}")
    print(f"tensor.device = {tensor.device}")

    # Operations on Tensors
    # Move the tensor to the GPU if one is available
    if torch.cuda.is_available():
        tensor = tensor.to('cuda')

    # Standard NumPy-like indexing and slicing
    tensor = torch.ones(4, 4)
    print(f"tensor[0] = {tensor[0]}")
    print(f"tensor[:,0] = {tensor[:,0]}")
    print(f"tensor[...,-1] = {tensor[...,-1]}")
    print(f"tensor = {tensor}")
    tensor[1, :] = 0
    print(f"tensor = {tensor}")

    # Joining tensors
    tensor_cat = torch.cat([tensor, tensor, tensor], dim=0)
    print(f"tensor_cat(↧) = {tensor_cat}")
    tensor_cat = torch.cat([tensor, tensor, tensor], dim=1)
    print(f"tensor_cat(↦) = {tensor_cat}")

    # Arithmetic operations: matrix multiplication (three equivalent forms)
    y1 = tensor @ tensor.T
    print(f"tensor @ tensor.T = {y1}")
    y2 = tensor.matmul(tensor.T)
    print(f"tensor.matmul(tensor.T) = {y2}")
    y3 = torch.rand_like(tensor)
    torch.matmul(tensor, tensor.T, out=y3)
    print(f"torch.matmul(tensor, tensor.T, out=y3) = {y3}")

    # Element-wise product (three equivalent forms)
    z1 = tensor * tensor
    print(f"tensor * tensor = {z1}")
    z2 = tensor.mul(tensor)
    print(f"tensor.mul(tensor) = {z2}")
    z3 = torch.rand_like(tensor)
    torch.mul(tensor, tensor, out=z3)
    print(f"torch.mul(tensor, tensor, out=z3) = {z3}")

    # Single-element tensors: item() converts to a Python number
    agg = tensor.sum()
    print(f"tensor.sum() = {agg}")
    agg_item = agg.item()
    print(f"tensor.sum().item() = {agg_item}, {type(agg_item)}")

    # In-place operations: methods with a trailing underscore modify the tensor itself
    tensor_out = tensor.add(5)
    print(f"tensor.add(5) = {tensor_out}")
    tensor.add_(5)
    print(f"tensor.add_(5) = {tensor}")

    # Tensor to NumPy array: the array shares memory with the tensor,
    # so an in-place change to one shows up in the other
    tensor = torch.ones(5)
    print(f"torch.ones(5) = {tensor}")
    numpy = tensor.numpy()
    print(f"tensor.numpy() = {numpy}")
    tensor.add_(1)
    print(f"tensor.add_(1) tensor = {tensor}")
    print(f"tensor.add_(1) numpy = {numpy}")

    # NumPy array to Tensor: memory is shared in this direction as well
    numpy = np.ones(5)
    print(f"np.ones(5) = {numpy}")
    tensor = torch.from_numpy(numpy)
    print(f"torch.from_numpy(numpy) = {tensor}")
    np.add(numpy, 3, out=numpy)
    print(f"np.add(numpy, 3, out=numpy) numpy = {numpy}")
    print(f"np.add(numpy, 3, out=numpy) tensor = {tensor}")
if __name__ == '__main__':
    fun()