Please see: https://pytorch.org/tutorials/prototype/skip_param_init.html
It is now possible to skip parameter initialization during module construction, avoiding wasted computation. This is easily accomplished using the torch.nn.utils.skip_init() function.
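A minimal sketch of how this can be used (the choice of `nn.Linear` and the follow-up initializers are illustrative, not prescribed by the tutorial):

```python
import torch
from torch import nn

# Construct a Linear module without running its default parameter
# initialization (the usual reset_parameters() work is skipped).
m = torch.nn.utils.skip_init(nn.Linear, 10, 5)

# The parameters exist with the correct shapes but hold uninitialized
# memory, so they must be initialized explicitly before use.
nn.init.orthogonal_(m.weight)
nn.init.zeros_(m.bias)

print(m(torch.randn(2, 10)).shape)  # torch.Size([2, 5])
```

Per the tutorial, this only works for modules that accept a `device` kwarg and pass it through to their parameters and buffers, and that perform no computation on parameters in their constructor apart from initialization.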