LiTEN

class dipm.models.liten.models.LiTEN(*args: Any, **kwargs: Any)

The LiTEN model Flax module, derived from the ForceModel class.

References

  • Qun Su, Kai Zhu, Qiaolin Gou, Jintu Zhang, Renling Hu, Yurong Li, Yongze Wang, Hui Zhang, Ziyi You, Linlong Jiang, Yu Kang, Jike Wang, Chang-Yu Hsieh, and Tingjun Hou. A Scalable and Quantum-Accurate Foundation Model for Biomolecular Force Field via Linearly Tensorized Quadrangle Attention. arXiv, Jul 2025. URL: https://arxiv.org/abs/2507.00884.

config

Hyperparameters / configuration for the LiTEN model; see LiTENConfig.

Type:

dipm.models.liten.config.LiTENConfig

dataset_info

Hyperparameters dictated by the dataset (e.g., cutoff radius or average number of neighbors).

__call__(edge_vectors: Array, node_species: Array, senders: Array, receivers: Array, _n_node: Array, _rngs: Rngs | None = None) → Array

Compute node-wise energy summands. This implements the abstract method that every subclass of ForceModel must override.
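
For orientation, the following minimal sketch prepares inputs for a toy three-atom graph and invokes __call__. How the model object is constructed is an assumption here (config plus a dataset_info object, per the attributes above); dataset_info stands for a hypothetical placeholder whose construction is not covered on this page.

    import jax.numpy as jnp

    from dipm.models.liten.config import LiTENConfig
    from dipm.models.liten.models import LiTEN

    # Assumption: LiTEN accepts its config and a dataset_info object as
    # keyword arguments; dataset_info is a hypothetical placeholder here.
    config = LiTENConfig()
    model = LiTEN(config=config, dataset_info=dataset_info)

    # Toy graph: 3 atoms, 2 directed edges (0 -> 1 and 1 -> 2).
    edge_vectors = jnp.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # (n_edges, 3)
    node_species = jnp.array([1, 6, 8])  # per-node species (assumed: atomic numbers)
    senders = jnp.array([0, 1])          # edge source node indices
    receivers = jnp.array([1, 2])        # edge destination node indices
    n_node = jnp.array([3])              # nodes per graph in the batch

    # One energy summand per node; summing them per graph yields the
    # total potential energy.
    node_energies = model(edge_vectors, node_species, senders, receivers, n_node)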

class dipm.models.liten.config.LiTENConfig(*, force_head: bool = False, param_dtype: DtypeEnum = DtypeEnum.F32, task_list: list[str] | None = None, num_layers: Annotated[int, Gt(gt=0)] = 2, num_channels: Annotated[int, Gt(gt=0)] = 256, num_heads: Annotated[int, Gt(gt=0)] = 8, num_rbf: Annotated[int, Gt(gt=0)] = 32, trainable_rbf: bool = False, activation: Activation = Activation.SILU, atomic_energies: str | dict[int, float] | None = None)

The configuration / hyperparameters of the LiTEN model.
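
As a quick illustration, the following sketch constructs a config using only fields and defaults from the signature above (the values shown are simply the documented defaults made explicit):

    from dipm.layers.activations import Activation
    from dipm.models.liten.config import LiTENConfig

    config = LiTENConfig(
        num_layers=2,                # number of LiTEN layers
        num_channels=256,            # number of channels
        num_heads=8,                 # heads in the attention block
        num_rbf=32,                  # basis functions in the embedding block
        trainable_rbf=False,         # keep the radial basis weights fixed
        activation=Activation.SILU,  # output-block activation
        atomic_energies=None,        # use the dataset-info averages
    )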

num_layers

Number of LiTEN layers. Default is 2.

Type:

int

num_channels

The number of channels. Default is 256.

Type:

int

num_heads

Number of heads in the attention block. Default is 8.

Type:

int

num_rbf

Number of basis functions used in the embedding block. Default is 32.

Type:

int

trainable_rbf

Whether to add learnable weights to each of the radial embedding basis functions. Default is False.

Type:

bool

activation

Activation function for the output block. Options are “silu” (default), “ssp” (shifted softplus), “tanh”, and “sigmoid”.

Type:

dipm.layers.activations.Activation

vecnorm_type

The type of the vector norm. The options are “none” (default), “max_min”, and “rms”.

Type:

str

atomic_energies

How to treat the atomic energies. If set to None (default) or the string "average", the average atomic energies stored in the dataset info are used. If set to the string "zero", no atomic energies are used in the model. Lastly, one can pass an atomic energies dictionary via this parameter, which is then used instead of the one in the dataset info.

Type:

str | dict[int, float] | None
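
To make the three modes concrete, here is a short sketch; the dictionary values below are illustrative placeholders, not real reference energies:

    from dipm.models.liten.config import LiTENConfig

    # None (the default) and "average" are equivalent: both use the
    # average atomic energies stored in the dataset info.
    cfg_default = LiTENConfig(atomic_energies=None)
    cfg_average = LiTENConfig(atomic_energies="average")

    # "zero" disables atomic energies in the model entirely.
    cfg_zero = LiTENConfig(atomic_energies="zero")

    # An explicit dictionary (atomic number -> energy) overrides the one
    # stored in the dataset info. Placeholder values only.
    cfg_custom = LiTENConfig(atomic_energies={1: -13.6, 8: -2041.0})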