SiLU Function

For inputs over the domain \(\mathbb{R}\), the SiLU function is defined as:

\[ \operatorname{SiLU}(x) = \frac{x}{1 + \exp(-x)} = x \cdot \operatorname{sigmoid}(x). \]
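As a quick sanity check of the identity above (a minimal sketch using only `torch`), the built-in `torch.nn.functional.silu` agrees with the explicit `x * sigmoid(x)` form:

```python
import torch

x = torch.linspace(-8.0, 8.0, steps=161)
# Built-in SiLU versus the explicit x * sigmoid(x) definition.
builtin = torch.nn.functional.silu(x)
manual = x * torch.sigmoid(x)
# The two agree to floating-point precision.
print(torch.allclose(builtin, manual))
```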
import torch
from torch_book.plotx.utils import plot

x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
y = torch.nn.functional.silu(x)
plot(x.detach(), y.detach(), 'x', 'silu(x)', figsize=(5, 2.5))

[figure: silu(x) plotted over \([-8, 8]\)]
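Before plotting the derivative, it helps to have its closed form. Applying the product rule to \(x \cdot \operatorname{sigmoid}(x)\), with \(\operatorname{sigmoid}'(x) = \operatorname{sigmoid}(x)(1 - \operatorname{sigmoid}(x))\):

\[ \frac{\mathrm{d}}{\mathrm{d}x}\operatorname{SiLU}(x) = \operatorname{sigmoid}(x) + x \cdot \operatorname{sigmoid}(x)\bigl(1 - \operatorname{sigmoid}(x)\bigr) = \operatorname{sigmoid}(x)\bigl(1 + x\,(1 - \operatorname{sigmoid}(x))\bigr). \]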

Visualizing the derivative:

y.backward(torch.ones_like(x), retain_graph=True)
plot(x.detach(), x.grad, 'x', r'$\operatorname{D}\,\operatorname{silu}(x)$', figsize=(5, 2.5))

[figure: derivative of silu(x) plotted over \([-8, 8]\)]
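The autograd gradient plotted above can be cross-checked against the standard analytic derivative, \(\operatorname{sigmoid}(x)\,(1 + x\,(1 - \operatorname{sigmoid}(x)))\). A minimal sketch, assuming only `torch`:

```python
import torch

x = torch.arange(-8.0, 8.0, 0.1, requires_grad=True)
y = torch.nn.functional.silu(x)
# Sum so that backward() produces the elementwise gradient in x.grad.
y.sum().backward()

# Closed-form derivative: sigmoid(x) * (1 + x * (1 - sigmoid(x))).
s = torch.sigmoid(x.detach())
analytic = s * (1 + x.detach() * (1 - s))
# Autograd and the closed form agree to floating-point precision.
print(torch.allclose(x.grad, analytic, atol=1e-6))
```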