# DropPath Explained


DropPath (Stochastic Depth) randomly "drops" entire branches of the multi-branch structure in a deep learning model, on a per-sample basis. Reference implementation:

https://github.com/yueatsprograms/Stochastic_Depth

# 1. Implementing DropPath

```python
import torch
import torch.nn as nn


def drop_path(x, drop_prob: float = 0., training: bool = False):
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1 - drop_prob
    # one random value per sample; broadcast over the remaining dims,
    # so it works with tensors of any rank, not just 2D ConvNet features
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    random_tensor = keep_prob + torch.rand(shape, dtype=x.dtype, device=x.device)
    random_tensor.floor_()  # binarize: 1 with prob keep_prob, else 0
    output = x.div(keep_prob) * random_tensor  # rescale kept paths to preserve expectation
    return output


class DropPath(nn.Module):
    """Drop paths (Stochastic Depth) per sample (when applied in the main path of residual blocks)."""

    def __init__(self, drop_prob=None):
        super(DropPath, self).__init__()
        self.drop_prob = drop_prob

    def forward(self, x):
        return drop_path(x, self.drop_prob, self.training)
```
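To see the per-sample behavior, here is a small runnable sketch. The function is restated from the block above so the snippet is self-contained; the batch size and drop probability are arbitrary choices for illustration:

```python
import torch


def drop_path(x, drop_prob: float = 0., training: bool = False):
    # same logic as the implementation above
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1 - drop_prob
    shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    random_tensor = keep_prob + torch.rand(shape, dtype=x.dtype, device=x.device)
    random_tensor.floor_()  # 0 or 1 per sample
    return x.div(keep_prob) * random_tensor


torch.manual_seed(0)
x = torch.ones(8, 4)  # batch of 8 samples, all activations equal to 1
out = drop_path(x, drop_prob=0.5, training=True)

# Each row (sample) is either all zeros (path dropped) or
# all 1 / keep_prob = 2.0 (path kept and rescaled).
for row in out:
    assert torch.all(row == 0.0) or torch.all(row == 2.0)
```

Note that the mask is drawn once per sample and broadcast across all other dimensions: a sample's entire residual branch is either kept or skipped, which is what distinguishes DropPath from element-wise Dropout.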


# 2. Using DropPath in a Network

```python
x = x + self.drop_path(self.mlp(self.norm2(x)))
```

Why divide by `keep_prob`? Suppose a neuron's output activation is a. Without dropout, its expected output is simply a. With dropout, the neuron is either kept or dropped; modeling this as a 0-1 (Bernoulli) random variable with keep probability p, the expected activation becomes p*a + (1-p)*0 = p*a. To keep the expectation consistent with the no-dropout case, the kept activation is divided by p.
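The expectation argument can be checked numerically with a quick Monte Carlo simulation (the values a = 3.0 and p = 0.8 are arbitrary choices for illustration):

```python
import random

random.seed(0)
a = 3.0        # neuron activation (example value)
p = 0.8        # keep probability (example value)
n = 200_000    # number of simulated forward passes

# Without rescaling: expected output drifts down to p * a
plain = sum(a if random.random() < p else 0.0 for _ in range(n)) / n

# Inverted-dropout rescaling: divide kept activations by p, restoring E[output] = a
scaled = sum(a / p if random.random() < p else 0.0 for _ in range(n)) / n

assert abs(plain - p * a) < 0.05   # ~= 2.4
assert abs(scaled - a) < 0.05      # ~= 3.0
```

This is exactly why `drop_path` calls `x.div(keep_prob)`: scaling at training time means no correction is needed at inference time.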