Content from the related DataCamp course: https://app.datacamp.com/learn/courses/introduction-to-deep-learning-with-pytorch

This post covers the basics of PyTorch: tensors, how to define a simple neural network, and activation functions.

Tensor

Tensors are the building blocks of networks in PyTorch.

  • Load from list
    import torch
    
    array = [[1, 2, 3], [4, 5, 6]]
    tensor = torch.tensor(array)
    
  • Load from NumPy array
    import numpy as np
    
    np_array = np.array(array)
    np_tensor = torch.from_numpy(np_array)
    

Like NumPy arrays, tensors are multidimensional representations of their elements.

Tensor Attributes

  • Tensor shape
    tensor.shape
    
  • Tensor data type
    tensor.dtype
    
  • Tensor device
    tensor.device
    

    Deep learning often requires a GPU, which, compared to a CPU, can offer:

    • parallel computing capabilities
    • faster training times
    • better performance
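
    A minimal sketch (assuming a CUDA-capable GPU is present) of checking a tensor's device and moving it to the GPU with .to():

    import torch

    tensor = torch.tensor([[1, 2, 3], [4, 5, 6]])
    print(tensor.device)  # cpu by default

    # move the tensor to the GPU only if one is available
    if torch.cuda.is_available():
        tensor = tensor.to("cuda")
        print(tensor.device)  # e.g. cuda:0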

Tensor Operations

  • addition / subtraction
  • element-wise multiplication
  • transposition
  • matrix multiplication
  • concatenation

Most NumPy array operations can be performed on PyTorch tensors.
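
A minimal sketch of the operations listed above, using two small illustrative tensors:

import torch

a = torch.tensor([[1., 2.], [3., 4.]])
b = torch.tensor([[5., 6.], [7., 8.]])

print(a + b)                      # addition
print(a - b)                      # subtraction
print(a * b)                      # element-wise multiplication
print(a.T)                        # transposition
print(a @ b)                      # matrix multiplication (same as torch.matmul(a, b))
print(torch.cat([a, b], dim=0))   # concatenation along rows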

Create Neural Network


  1. Import the module
    import torch.nn as nn
    
  2. Create an input tensor
    input_tensor = torch.tensor([[0.3471, 0.4547, -0.2356]])
    
  3. Define the first linear layer
    linear_layer = nn.Linear(in_features=3, out_features=2)
    
  4. weight and bias
    linear_layer.weight
    linear_layer.bias
    

A linear layer takes an input, applies a linear function, and returns output.
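
Putting the steps together, here is a sketch of a forward pass through the linear layer (the exact output values depend on the layer's randomly initialized weights and bias):

import torch
import torch.nn as nn

input_tensor = torch.tensor([[0.3471, 0.4547, -0.2356]])
linear_layer = nn.Linear(in_features=3, out_features=2)

# output = input_tensor @ weight.T + bias
output = linear_layer(input_tensor)
print(output.shape)  # torch.Size([1, 2])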

Activation Functions

  • Activation functions add non-linearity to the network
  • A model can learn more complex relationships with non-linearity

Sigmoid Function

The sigmoid function is used for binary classification problems:

  • we take the preactivation
  • pass it to sigmoid
  • obtain a value between 0 and 1
  • use the threshold of 0.5
import torch
import torch.nn as nn

input_tensor = torch.tensor([[6.0]])
sigmoid = nn.Sigmoid()
output = sigmoid(input_tensor)
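
Continuing the sketch above, the 0.5 threshold can then be applied to turn the probability into a binary class label:

predicted_class = (output >= 0.5).int()  # 1 if the probability is at least 0.5, else 0
print(output, predicted_class)           # approx. tensor([[0.9975]]) tensor([[1]], dtype=torch.int32)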

Activation function as the last layer

model = nn.Sequential(
    nn.Linear(6, 4),
    nn.Linear(4, 1),
    nn.Sigmoid()
)

A sigmoid as the last step in a network of linear layers is equivalent to traditional logistic regression.
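
A minimal usage sketch for the model above, assuming a single sample with 6 input features (the random input is only for illustration):

import torch

input_tensor = torch.randn(1, 6)  # one sample, 6 features
output = model(input_tensor)      # a probability between 0 and 1
print(output.shape)               # torch.Size([1, 1])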

Softmax Function


  • used for multi-class classification problems
  • takes an N-element vector as input and outputs a vector of the same size
import torch
import torch.nn as nn

input_tensor = torch.tensor([[4.3, 6.1, 2.3]])

probabilities = nn.Softmax(dim=-1)  # apply softmax along the last dimension
output_tensor = probabilities(input_tensor)
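
As a quick check, the softmax output has the same shape as the input and its entries sum to 1:

print(output_tensor)        # approx. tensor([[0.1392, 0.8420, 0.0188]])
print(output_tensor.sum())  # tensor(1.), up to floating-point precision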