deepfold.modules.attention¶

Classes

CrossAttentionNoGate(c_q, c_kv, c_hidden, ...)

Multi-head cross-attention module without output gating.

SelfAttentionWithGate(c_qkv, c_hidden, ...)

Multi-head self-attention module with output gating.
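
The two classes differ in where queries and keys/values come from (a single `c_qkv` input for self-attention, separate `c_q`/`c_kv` inputs for cross-attention) and in whether a sigmoid gate, computed from the query input, scales the attention output before the final projection. The following is a minimal NumPy sketch of that pattern, not DeepFold's actual implementation: the function name `mha`, the `params` dictionary, and the exact gating formula are assumptions for illustration; only the parameter names `c_q`, `c_kv`, `c_qkv`, and `c_hidden` come from the summaries above.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mha(x_q, x_kv, params, num_heads, gated):
    """Multi-head attention with an optional sigmoid gate on the
    concatenated head outputs (illustrative sketch, not DeepFold's code)."""
    n_q = x_q.shape[0]
    q = x_q @ params["w_q"]    # (n_q,  num_heads * c_hidden)
    k = x_kv @ params["w_k"]   # (n_kv, num_heads * c_hidden)
    v = x_kv @ params["w_v"]
    d = q.shape[-1] // num_heads
    # Split into heads: (num_heads, n, c_hidden).
    split = lambda t: t.reshape(t.shape[0], num_heads, d).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d))  # (h, n_q, n_kv)
    out = attn @ v                                          # (h, n_q, d)
    out = out.transpose(1, 0, 2).reshape(n_q, num_heads * d)
    if gated:
        # Sigmoid gate computed from the query input (assumed formulation).
        gate = 1.0 / (1.0 + np.exp(-(x_q @ params["w_g"])))
        out = gate * out
    return out @ params["w_o"]  # project back to the query channel dim

rng = np.random.default_rng(0)
c_q, c_kv, c_hidden, h = 8, 6, 4, 2

# Cross-attention without gating (CrossAttentionNoGate-style shapes).
params_cross = {
    "w_q": rng.standard_normal((c_q, h * c_hidden)),
    "w_k": rng.standard_normal((c_kv, h * c_hidden)),
    "w_v": rng.standard_normal((c_kv, h * c_hidden)),
    "w_o": rng.standard_normal((h * c_hidden, c_q)),
}
out_cross = mha(rng.standard_normal((5, c_q)),
                rng.standard_normal((7, c_kv)),
                params_cross, num_heads=h, gated=False)

# Self-attention with gating (SelfAttentionWithGate-style shapes):
# queries, keys, and values all come from one input of width c_qkv.
c_qkv = 8
params_self = {
    "w_q": rng.standard_normal((c_qkv, h * c_hidden)),
    "w_k": rng.standard_normal((c_qkv, h * c_hidden)),
    "w_v": rng.standard_normal((c_qkv, h * c_hidden)),
    "w_g": rng.standard_normal((c_qkv, h * c_hidden)),
    "w_o": rng.standard_normal((h * c_hidden, c_qkv)),
}
x = rng.standard_normal((5, c_qkv))
out_self = mha(x, x, params_self, num_heads=h, gated=True)
```

Both calls return an array with the same leading length as the query input, projected back to the query channel width; the gate only rescales head outputs elementwise, so it never changes shapes.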