momentumnet.transform_to_momentumnet

momentumnet.transform_to_momentumnet(model, sub_layers=['layer1', 'layer2', 'layer3', 'layer4'], keep_first_layer=True, gamma=0.9, use_backprop=False, is_residual=False)

Return the MomentumNet counterpart of the model.

Parameters
model : a torch model

The ResNet one desires to turn into a Momentum ResNet.

sub_layers : a list of strings (default: ["layer1", "layer2", "layer3", "layer4"])

The name of the submodules of the model one desires to make invertible.

keep_first_layer : bool (default: True)

Whether to leave the first layer of each residual layer unchanged (useful if this first layer changes the dimension of the input).

gamma : float (default: 0.9)

The momentum term for the Momentum ResNet (see the sketch below the parameter list).

use_backprop : bool (default: False)

If False then the Momentum ResNet has a smaller memory footprint: activations are not stored during the forward pass but recomputed by inverting the forward dynamics during the backward pass.

is_residual : bool (default: False)

If True then the forward rule is x + f(x).
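
The role of gamma can be pictured with the Momentum ResNet iteration v <- gamma * v + (1 - gamma) * f(x), followed by x <- x + v. Below is a minimal, illustrative sketch of one such step; the names momentum_step, f, x and v are placeholders, not part of this function's API, and the library's internal implementation may differ:

>>> import torch
>>> def momentum_step(f, x, v, gamma=0.9):
...     # Velocity update: mix the previous velocity with the residual branch output.
...     v = gamma * v + (1 - gamma) * f(x)
...     # State update: a residual step driven by the velocity.
...     return x + v, v
>>> f = torch.nn.Linear(8, 8)
>>> x, v = torch.randn(2, 8), torch.zeros(2, 8)
>>> x, v = momentum_step(f, x, v, gamma=0.9)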

Returns
mresnet : the Momentum ResNet counterpart of the model

Examples

>>> import torch
>>> from momentumnet import transform_to_momentumnet
>>> from torchvision.models import resnet18
>>> resnet = resnet18(pretrained=True)
>>> layers = ["layer1", "layer2", "layer3", "layer4"]
>>> mresnet = transform_to_momentumnet(resnet,
...                                    sub_layers=layers,
...                                    gamma=0.9, use_backprop=False)
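
The transformed model is meant to be used like the original one. A quick, illustrative sanity check (the 224x224 input and the 1000-class output are properties of the torchvision ResNet-18, not of this function):

>>> _ = mresnet.eval()
>>> x = torch.randn(1, 3, 224, 224)
>>> with torch.no_grad():
...     out = mresnet(x)
>>> out.shape
torch.Size([1, 1000])
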
>>> import torch
>>> from momentumnet import transform_to_momentumnet
>>> transformer = torch.nn.Transformer(num_encoder_layers=6,
...                                    num_decoder_layers=6)
>>> layers = ["encoder.layers", "decoder.layers"]
>>> mtransformer = transform_to_momentumnet(transformer,
...                                         sub_layers=layers,
...                                         gamma=0.9,
...                                         use_backprop=False,
...                                         keep_first_layer=False)
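
Assuming the transformed transformer keeps the standard torch.nn.Transformer calling convention (an assumption, not something verified here), a forward pass in the memory-efficient mode would look like:

>>> src = torch.rand(10, 32, 512)  # (sequence, batch, d_model) for the default Transformer
>>> tgt = torch.rand(20, 32, 512)
>>> out = mtransformer(src, tgt)   # assumes the nn.Transformer interface is preserved
>>> out.shape
torch.Size([20, 32, 512])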