Multilayer Perceptrons

A brief introduction to some common activation functions.

In [1]:
%matplotlib inline
import torch
from d2l import torch as d2l

The ReLU (rectified linear unit) function provides a very simple nonlinear transformation, $\operatorname{ReLU}(x) = \max(x, 0)$: it keeps positive elements and clamps negative elements to zero.

In [2]:
x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
y = torch.relu(x)
d2l.plot(x.detach(), y.detach(), 'x', 'relu(x)', figsize=(5, 2.5))
In [3]:
y.backward(torch.ones_like(x), retain_graph=True)
d2l.plot(x.detach(), x.grad, 'x', 'grad of relu', figsize=(5, 2.5))
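The gradient plot is a step function: ReLU's derivative is 0 for negative inputs and 1 for positive inputs (PyTorch conventionally uses 0 at $x = 0$). A minimal standalone sketch, not from the book, checking this against autograd:

```python
import torch

# ReLU's derivative is the indicator of x > 0; sample points avoid x = 0.
x = torch.tensor([-2.0, -0.5, 0.5, 3.0], requires_grad=True)
torch.relu(x).sum().backward()
expected = (x > 0).float()           # tensor([0., 0., 1., 1.])
print(torch.equal(x.grad, expected)) # True
```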

For any input in the domain $\mathbb{R}$, the sigmoid function, $\operatorname{sigmoid}(x) = \frac{1}{1 + \exp(-x)}$, transforms it into an output in the interval (0, 1).

In [4]:
y = torch.sigmoid(x)
d2l.plot(x.detach(), y.detach(), 'x', 'sigmoid(x)', figsize=(5, 2.5))
In [5]:
x.grad.data.zero_()
y.backward(torch.ones_like(x), retain_graph=True)
d2l.plot(x.detach(), x.grad, 'x', 'grad of sigmoid', figsize=(5, 2.5))
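The gradient plot matches sigmoid's closed-form derivative, $\frac{d}{dx}\operatorname{sigmoid}(x) = \operatorname{sigmoid}(x)\bigl(1 - \operatorname{sigmoid}(x)\bigr)$, which peaks at 0.25 when $x = 0$. A small sketch (standalone, not from the book) comparing autograd against this identity:

```python
import torch

x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
s = torch.sigmoid(x)
s.sum().backward()
# Derivative identity: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
analytic = s.detach() * (1 - s.detach())
print(torch.allclose(x.grad, analytic))  # True
```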

The tanh (hyperbolic tangent) function likewise squashes its inputs, transforming them into the interval (-1, 1).

In [6]:
y = torch.tanh(x)
d2l.plot(x.detach(), y.detach(), 'x', 'tanh(x)', figsize=(5, 2.5))
In [7]:
x.grad.data.zero_()
y.backward(torch.ones_like(x), retain_graph=True)
d2l.plot(x.detach(), x.grad, 'x', 'grad of tanh', figsize=(5, 2.5))