Description
Molecular dynamics (MD) plays a crucial role in atomistic simulation for calculating thermodynamic and kinetic quantities of molecules and materials.
Traditionally, descriptions of atomic interactions rely on either classical or quantum mechanical models, which are limited in accuracy and simulation speed, respectively.
Machine learning potentials (MLPs) have emerged as a useful class of surrogate models that bridge this gap, retaining most of the quantum mechanical accuracy at drastically reduced cost.
However, obtaining informative training data for these models is a challenging task, as the systems of interest can have thousands of degrees of freedom with vastly different characteristic time scales.
Biasing the dynamics of the system along the slowest degrees of freedom can significantly decrease the time needed to obtain sufficiently informative data for the training of MLPs.
While traditional approaches to biasing MD mostly rely on hand-selected degrees of freedom along which to enhance sampling, we introduce a data-driven way to identify the degrees of freedom most informative for the MLP, while limiting the bias to exploring physically relevant parts of configuration space.
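The abstract does not specify the data-driven identification method, but the idea can be illustrated with a minimal sketch: extract the dominant directions of variation from a set of configuration descriptors (here via PCA, a hypothetical stand-in for the talk's actual approach) and apply a simple harmonic umbrella-style bias along the leading mode. All function names and the toy data are assumptions for illustration only.

```python
import numpy as np

def leading_modes(descriptors, n_modes=2):
    """PCA over descriptor snapshots: the top principal components serve as
    data-driven candidate degrees of freedom for biasing."""
    X = descriptors - descriptors.mean(axis=0)
    # SVD of the centered data; rows of vt are principal directions
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return vt[:n_modes], explained[:n_modes]

def harmonic_bias_force(x, mode, center, k=1.0):
    """Force from a harmonic bias 0.5*k*(cv - center)^2 on the collective
    variable cv = x . mode; real enhanced-sampling schemes (e.g. metadynamics)
    are more elaborate."""
    cv = np.dot(x, mode)
    return -k * (cv - center) * mode

rng = np.random.default_rng(0)
# Toy "trajectory": 200 snapshots of a 6-dim descriptor whose first
# coordinate varies slowly with large amplitude (the slow mode).
slow = rng.normal(scale=5.0, size=(200, 1))
fast = rng.normal(scale=0.5, size=(200, 5))
modes, var = leading_modes(np.hstack([slow, fast]), n_modes=1)
```

In this toy setting the leading principal component recovers the large-amplitude slow coordinate, and the bias force can then push sampling along it.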
Further, efficient execution of atomistic machine learning workflows relies on the utilization of heterogeneous compute resources.
While model training and MD simulations are most efficient on GPUs, reference quantum mechanical computations require large numbers of CPU cores.
We introduce tools to flexibly split the tasks in a workflow across the available hardware.
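The tools themselves are not named in the abstract, but the scheduling idea can be sketched with the standard library: tag each workflow task with the resource it needs and dispatch it to a matching worker pool, so GPU-bound training/MD steps and CPU-heavy reference calculations run side by side. Task names and pool sizes below are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical workflow tasks, each tagged with its required resource.
TASKS = [
    ("train_mlp", "gpu"),
    ("run_md", "gpu"),
    ("dft_reference_1", "cpu"),
    ("dft_reference_2", "cpu"),
]

def run(name, resource):
    # Stand-in for real work: a training step, an MD run, or a DFT call.
    return f"{name} done on {resource}"

# One executor per resource pool: the CPU pool gets many workers for
# reference calculations, while GPU tasks queue on the single accelerator.
pools = {
    "gpu": ThreadPoolExecutor(max_workers=1),
    "cpu": ThreadPoolExecutor(max_workers=4),
}
futures = [pools[res].submit(run, name, res) for name, res in TASKS]
results = [f.result() for f in futures]
```

With this pattern, CPU-bound reference computations proceed concurrently instead of blocking behind GPU work, which is the essence of splitting a workflow across heterogeneous hardware.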