netket.callbacks.ConvergenceStopping
- class netket.callbacks.ConvergenceStopping[source]
  Bases: Pytree
  A simple callback to stop the optimisation when the monitored quantity stays below a certain threshold for at least patience steps.
- __init__(target, monitor='mean', *, smoothing_window=10, patience=10)[source]
  Construct a callback that stops the optimisation when the monitored quantity drops below a certain threshold for at least patience steps.
- Parameters:
  - target (float) – the threshold value for the monitored quantity. Training will stop once the monitored quantity drops below this value.
  - monitor (str) – the name of the quantity to be monitored. This is applied to the standard loss optimised by a driver, such as the Energy for the VMC driver. Should be one of ‘mean’, ‘variance’, ‘error_of_mean’ (default: ‘mean’).
  - smoothing_window (int) – the number of steps over which the monitored value is averaged before being compared to target (default: 10).
  - patience (int) – the number of steps the monitored quantity must remain below target before the optimisation is stopped (default: 10).
- Attributes
  - target: float
    Target value for the monitored quantity. Training will stop if the driver drops below this value.
  - monitor: str
    Loss statistic to monitor. Should be one of ‘mean’, ‘variance’, ‘error_of_mean’.
- Methods
- __call__(step, log_data, driver)[source]
A boolean function that determines whether or not to stop training.
- Parameters:
step – An integer corresponding to the step (iteration or epoch) in training.
log_data – A dictionary containing log data for training.
driver – A NetKet variational driver.
- Returns:
A boolean: if True, training continues; if False, it stops.
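Any callable with the same (step, log_data, driver) signature can be passed to a driver’s run method in place of this class. A hypothetical sketch, following the return convention documented above (True continues, False stops):

```python
# Hypothetical custom callback with the same signature as __call__.
# Returning True lets the driver keep running; returning False stops it.
def stop_after_100_steps(step, log_data, driver):
    # `log_data` holds the quantities logged at this step and `driver`
    # is the running variational driver (unused in this simple example).
    return step < 100

# Assumed usage, mirroring ConvergenceStopping:
# driver.run(n_iter=500, callback=stop_after_100_steps)
```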