deepfold.losses.procrustes.Procrustes

- class deepfold.losses.procrustes.Procrustes(*args, **kwargs)
- __init__(*args, **kwargs)
Methods
- __init__(*args, **kwargs)
- apply(*args, **kwargs)
- backward(ctx, grad_r, grad_ds): Define a formula for differentiating the operation with backward mode automatic differentiation.
- forward(ctx, m, force_rotation, ...): Define the forward of the custom autograd Function.
- jvp(ctx, *grad_inputs): Define a formula for differentiating the operation with forward mode automatic differentiation.
- mark_dirty(*args): Mark given tensors as modified in an in-place operation.
- mark_non_differentiable(*args): Mark outputs as non-differentiable.
- mark_shared_storage(*pairs)
- maybe_clear_saved_tensors
- name
- register_hook
- register_prehook
- save_for_backward(*tensors): Save given tensors for a future call to backward().
- save_for_forward(*tensors): Save given tensors for a future call to jvp().
- set_materialize_grads(value): Set whether to materialize grad tensors.
- setup_context(ctx, inputs, output): There are two ways to define the forward pass of an autograd.Function.
- vjp(ctx, *grad_outputs): Define a formula for differentiating the operation with backward mode automatic differentiation.
- vmap(info, in_dims, *args): Define the behavior for this autograd.Function underneath torch.vmap().

Attributes

- dirty_tensors
- generate_vmap_rule
- materialize_grads
- metadata
- needs_input_grad
- next_functions
- non_differentiable
- requires_grad
- saved_for_forward
- saved_tensors
- saved_variables
- to_save
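The `forward(ctx, m, force_rotation, ...)` signature suggests this Function projects a matrix `m` onto the nearest orthogonal (optionally proper-rotation) matrix, i.e. the orthogonal Procrustes problem, with a custom `backward()` supplying the derivative of that projection. The sketch below illustrates the underlying solve in plain NumPy; the SVD-based construction, the function name `procrustes_rotation`, and the exact role of `force_rotation` are assumptions, not the class's actual implementation (which is a differentiable `torch.autograd.Function`).

```python
import numpy as np

def procrustes_rotation(m, force_rotation=True):
    """Closest orthogonal matrix to `m` in Frobenius norm (assumed behavior).

    With force_rotation=True, the result is constrained to a proper
    rotation (det = +1) by flipping the sign of the smallest singular
    direction; this mirrors the `force_rotation` argument in the source.
    """
    u, s, vt = np.linalg.svd(m)
    if force_rotation:
        # Flip the last column's sign when needed so that det(r) = +1.
        d = np.sign(np.linalg.det(u @ vt))
        s_corr = np.ones(m.shape[-1])
        s_corr[-1] = d
        r = (u * s_corr) @ vt
    else:
        r = u @ vt
    return r

m = np.array([[2.0, 0.1],
              [0.0, 1.5]])
r = procrustes_rotation(m)
```

The returned `r` satisfies `r @ r.T == I` and, with `force_rotation=True`, `det(r) == +1`. In the actual class this projection would be invoked via `Procrustes.apply(...)`, with `backward()`/`jvp()` providing gradients through the SVD.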