What is the difference between register_parameter and register_buffer in PyTorch?

Pytorch doc for register_buffer() method reads

This is typically used to register a buffer that should not be considered a model parameter. For example, BatchNorm’s running_mean is not a parameter, but is part of the persistent state.

As you already observed, model parameters are learned and updated by the optimizer (e.g. SGD) during training.
However, sometimes there are other quantities that are part of a model’s “state” and should be
– saved as part of state_dict.
– moved to cuda() or cpu() with the rest of the model’s parameters.
– cast to float/half/double with the rest of the model’s parameters.
Registering these tensors as buffers lets PyTorch track them and save them alongside regular parameters, but prevents the optimizer from updating them.
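
As a rough illustration (the module and tensor names here are made up for the example), a buffer appears in state_dict() and follows .to()/.cuda() calls together with the parameters, but model.parameters() – and therefore the optimizer – never sees it:

import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # learned by the optimizer
        self.weight = nn.Parameter(torch.randn(3))
        # tracked state: saved/moved/cast with the model, but never optimized
        self.register_buffer("running_sum", torch.zeros(3))

m = MyModule()
print([name for name, _ in m.named_parameters()])  # ['weight']
print(list(m.state_dict().keys()))                 # ['weight', 'running_sum']

# only parameters are handed to the optimizer
opt = torch.optim.SGD(m.parameters(), lr=0.1)

# buffers follow dtype/device moves along with parameters
m = m.to(torch.float64)
print(m.running_sum.dtype)  # torch.float64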

An example of buffers can be found in the _BatchNorm module, where running_mean, running_var and num_batches_tracked are registered as buffers and updated by accumulating statistics of the data forwarded through the layer. This is in contrast to the weight and bias parameters, which learn an affine transformation of the data through regular SGD optimization.
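
A minimal sketch of that pattern (not the actual _BatchNorm implementation, just an illustrative toy layer): the buffer is updated inside forward() under torch.no_grad(), while weight and bias are left to the optimizer:

import torch
import torch.nn as nn

class RunningMeanNorm(nn.Module):
    """Toy layer: learns an affine transform, tracks a running mean as a buffer."""
    def __init__(self, num_features, momentum=0.1):
        super().__init__()
        self.momentum = momentum
        self.weight = nn.Parameter(torch.ones(num_features))  # learned via SGD
        self.bias = nn.Parameter(torch.zeros(num_features))   # learned via SGD
        self.register_buffer("running_mean", torch.zeros(num_features))

    def forward(self, x):
        if self.training:
            # statistics are accumulated, not back-propagated
            with torch.no_grad():
                batch_mean = x.mean(dim=0)
                self.running_mean.mul_(1 - self.momentum).add_(batch_mean, alpha=self.momentum)
        return (x - self.running_mean) * self.weight + self.bias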
