deepfold.config.TrainingConfig
- class deepfold.config.TrainingConfig(gradient_clipping: 'bool' = True, clip_grad_max_nrom: 'float' = 0.1, swa_enabled: 'bool' = True, swa_decay_rate: 'float' = 0.9)
- __init__(gradient_clipping: bool = True, clip_grad_max_nrom: float = 0.1, swa_enabled: bool = True, swa_decay_rate: float = 0.9) → None
Methods
- __init__([gradient_clipping, ...])
- from_dict(cfg)
- from_preset(**additional_options)
- to_dict()

Attributes

- clip_grad_max_nrom
- gradient_clipping
- optimizer_adam_amsgrad
- optimizer_adam_beta_1
- optimizer_adam_beta_2
- optimizer_adam_eps
- optimizer_adam_weight_decay
- swa_decay_rate
- swa_enabled
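The fields above suggest two standard training techniques: clipping gradients to a maximum global norm (`clip_grad_max_nrom`, default 0.1) and stochastic weight averaging with an exponential decay (`swa_decay_rate`, default 0.9). The sketch below illustrates the usual semantics of those two operations on plain Python lists; it is an assumption-laden illustration, not code from the deepfold source, and the helper names `clip_grad_by_norm` and `swa_update` are hypothetical.

```python
import math

def clip_grad_by_norm(grads, max_norm=0.1):
    """Rescale gradients so their global L2 norm is at most max_norm
    (the usual meaning of a clip_grad_max_nrom-style setting)."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads

def swa_update(avg_params, params, decay=0.9):
    """Exponential moving average of parameters, one common reading of
    an swa_decay_rate-style setting: avg <- decay*avg + (1-decay)*param."""
    return [decay * a + (1.0 - decay) * p for a, p in zip(avg_params, params)]

# Gradients [3, 4] have global norm 5.0; clipping scales them down to norm 0.1.
clipped = clip_grad_by_norm([3.0, 4.0], max_norm=0.1)

# One averaging step pulls the running average 10% of the way to the new params.
avg = swa_update([1.0, 1.0], [2.0, 0.0], decay=0.9)
```

With `swa_enabled=False` or `gradient_clipping=False`, a training loop would simply skip the corresponding step.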