
I'm trying to implement the following project in TensorFlow/Keras: https://github.com/jacobgil/pytorch-pruning

I'm having a hard time understanding what register_hook does. It can be found in finetune.py, row 66: x.register_hook(self.compute_rank)

I've searched for a clear explanation of this function and tried to find a Keras equivalent, without any luck. Can anyone answer these questions?

pogibas
vicwess

1 Answer


First things first, here's the documentation:

http://pytorch.org/docs/master/autograd.html#torch.autograd.Variable.register_hook

This lets you register a function on a Variable that is called whenever the Variable's .grad is updated, i.e. during the backward pass, and that receives the gradient as input. The function can return a Variable to replace the original .grad, or None if you just want to read the gradient and do something else with it. If you update the gradient this way, nodes further back in the compute graph see the new, updated gradient during the backward pass and will have their respective gradients calculated with the updated value.
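A minimal sketch of the read-only case (note that in current PyTorch, Variable has been merged into Tensor, so register_hook is called directly on a tensor with requires_grad=True; the hook returns None, so .grad is left unchanged):

```python
import torch

grads = []

def save_grad(grad):
    # Called during the backward pass with the gradient of x.
    # Returning None leaves x.grad unchanged (read-only hook);
    # returning a tensor would replace the gradient instead.
    grads.append(grad.clone())

x = torch.tensor([2.0, 3.0], requires_grad=True)
h = x.register_hook(save_grad)  # register before calling backward()

y = (x * x).sum()
y.backward()

print(x.grad)  # gradient of y w.r.t. x, i.e. 2*x -> tensor([4., 6.])
h.remove()     # detach the hook once it is no longer needed
```

This mirrors what the linked project does with x.register_hook(self.compute_rank): compute_rank only reads the incoming gradients to estimate filter importance.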

I'm not a TensorFlow expert, but the RegisterGradient decorator (documentation) seems to be able to do the same; for an example, see this answer.
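As a hedged sketch on the TensorFlow side: RegisterGradient belongs to the old graph-mode API, but in TF 2.x the tf.custom_gradient decorator gives a comparable effect, letting you intercept and rewrite the gradient flowing through an op, much like a PyTorch hook that returns a modified grad:

```python
import tensorflow as tf

@tf.custom_gradient
def scale_grad(x):
    # Forward pass is the identity; the inner function replaces
    # the incoming gradient (here, halving it), analogous to a
    # PyTorch hook that returns a new gradient.
    def grad(dy):
        return dy * 0.5
    return x, grad

x = tf.constant([2.0, 3.0])
with tf.GradientTape() as tape:
    tape.watch(x)  # x is a constant, so watch it explicitly
    y = tf.reduce_sum(scale_grad(x) ** 2)

g = tape.gradient(y, x)
print(g)  # 0.5 * 2*x -> [2., 3.]
```

The function names scale_grad and the halving factor are illustrative assumptions, not part of the linked project.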

Jens Petersen