You can implement it in two different ways:

1. Clamp method (hard limit):

a = torch.nn.Parameter(torch.tensor(100.0), requires_grad=True)
# call this in your training loop after optimizer.step() to clip the
# parameter in-place, keeping it in the desired range after every update
a.data.clamp_(min=2, max=10)

2. Sigmoid method (soft clamp):

# learn an unconstrained raw parameter and map it through a sigmoid;
# recompute `a` wherever it is used, so its value always stays in (2, 10)
raw = torch.nn.Parameter(torch.tensor(100.0), requires_grad=True)
a = 2 + torch.sigmoid(raw) * (10 - 2)

References:
https://discuss.pytorch.org/t/set-constraints-on-parameters-or-layers/23620/7
How to use a learnable parameter in pytorch, constrained between 0 and 1?
How can I limit the range of parameters in pytorch?
https://pytorch.org/docs/stable/generated/torch.Tensor.clamp_.html
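The two approaches above can be sketched in a minimal training loop. The toy objective, target value 5.0, learning rate, and iteration count below are arbitrary assumptions for illustration:

```python
import torch

# --- 1. Hard clamp: clip the parameter in-place after each optimizer step ---
a = torch.nn.Parameter(torch.tensor(100.0))
opt = torch.optim.SGD([a], lr=0.1)
for _ in range(100):
    loss = (a - 5.0) ** 2          # toy objective pulling `a` toward 5
    opt.zero_grad()
    loss.backward()
    opt.step()
    a.data.clamp_(min=2, max=10)   # enforce 2 <= a <= 10 after every update

# --- 2. Soft clamp: learn an unconstrained raw value, map it through sigmoid ---
raw = torch.nn.Parameter(torch.tensor(0.0))
opt = torch.optim.SGD([raw], lr=0.1)
for _ in range(100):
    b = 2 + torch.sigmoid(raw) * (10 - 2)  # b is always strictly inside (2, 10)
    loss = (b - 5.0) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()
b = 2 + torch.sigmoid(raw) * (10 - 2)
```

The hard clamp lets the optimizer momentarily step outside the range and then projects back, while the sigmoid reparameterization keeps every intermediate value inside the open interval and keeps gradients flowing through the constraint.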