diffalign.models package

Submodules

diffalign.models.common module

class diffalign.models.common.MultiLayerPerceptron(input_dim, hidden_dims, activation='relu', dropout=0)

Bases: PatchedModule

Multi-layer perceptron. Note there is no activation or dropout after the last layer.

Parameters:
- input_dim (int) – input dimension
- hidden_dims (list of int) – hidden dimensions
- activation (str or function, optional) – activation function
- dropout (float, optional) – dropout rate

forward(input)
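The class behavior described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: it assumes PatchedModule behaves like torch.nn.Module (so plain nn.Module is used here), and that a string activation names a function in torch.nn.functional. Hidden layers receive activation and dropout; the final layer receives neither.

```python
import torch
from torch import nn


class MultiLayerPerceptron(nn.Module):
    """Sketch of the MLP described above (assumptions noted in the lead-in)."""

    def __init__(self, input_dim, hidden_dims, activation="relu", dropout=0):
        super().__init__()
        dims = [input_dim] + list(hidden_dims)
        # Resolve a string name to a function in torch.nn.functional (assumption)
        self.activation = (
            getattr(torch.nn.functional, activation)
            if isinstance(activation, str)
            else activation
        )
        self.dropout = nn.Dropout(dropout)
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)
        )

    def forward(self, input):
        x = input
        for i, layer in enumerate(self.layers):
            x = layer(x)
            # No activation or dropout after the last layer
            if i < len(self.layers) - 1:
                x = self.dropout(self.activation(x))
        return x


mlp = MultiLayerPerceptron(4, [8, 2])
out = mlp(torch.randn(3, 4))
print(tuple(out.shape))  # → (3, 2)
```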
diffalign.models.common.extend_graph_order_radius(num_nodes, pos, edge_index, edge_type, batch, order=3, cutoff=10.0, extend_order=True, extend_radius=True)
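Judging from the signature, this function augments a molecular graph's edges both by bond order (multi-hop neighbors up to `order`) and by spatial proximity (node pairs within `cutoff`). The radius part alone can be sketched as below; `radius_edges` is a hypothetical helper name, the bond-order extension is omitted, and per-graph masking via `batch` is assumed to match how PyTorch Geometric batches graphs.

```python
import torch


def radius_edges(pos, batch, cutoff):
    """Hypothetical sketch of the radius extension only: connect every pair
    of nodes in the same graph whose Euclidean distance is below `cutoff`."""
    dist = torch.cdist(pos, pos)                       # (N, N) pairwise distances
    same_graph = batch.unsqueeze(0) == batch.unsqueeze(1)
    mask = (dist < cutoff) & same_graph
    mask.fill_diagonal_(False)                         # exclude self-loops
    return mask.nonzero(as_tuple=False).t()            # edge_index, shape (2, E)


pos = torch.tensor([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [20.0, 0.0, 0.0]])
batch = torch.zeros(3, dtype=torch.long)               # all nodes in one graph
print(radius_edges(pos, batch, cutoff=10.0))
# only nodes 0 and 1 are within the cutoff, so edges (0, 1) and (1, 0)
```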
diffalign.models.common.extend_to_cross_attention(pos, cutoff, batch, graph_idx)

Module contents