core.units.mlip_unit.api.inference#
Copyright (c) Meta Platforms, Inc. and affiliates.
This source code is licensed under the MIT license found in the LICENSE file in the root directory of this source tree.
Attributes#
- CHARGE_RANGE
- SPIN_RANGE
- DEFAULT_CHARGE
- DEFAULT_SPIN_OMOL
- DEFAULT_SPIN
- NAME_TO_INFERENCE_SETTING
Classes#
- UMATask
- MLIPInferenceCheckpoint
- InferenceSettings
Functions#
- inference_settings_default
- inference_settings_turbo
- inference_settings_traineval
- guess_inference_settings
Module Contents#
- class core.units.mlip_unit.api.inference.UMATask#
Bases: str, enum.Enum

str(object='') -> str
str(bytes_or_buffer[, encoding[, errors]]) -> str

Create a new string object from the given object. If encoding or errors is specified, then the object must expose a data buffer that will be decoded using the given encoding and error handler. Otherwise, returns the result of object.__str__() (if defined) or repr(object). encoding defaults to sys.getdefaultencoding(). errors defaults to 'strict'.
- OMOL = 'omol'#
- OMAT = 'omat'#
- ODAC = 'odac'#
- OC20 = 'oc20'#
- OMC = 'omc'#
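Because UMATask subclasses both str and enum.Enum, its members behave as plain strings while remaining enum members. A minimal sketch of typical lookups; the member names and values come from the listing above, everything else is generic str-Enum behavior:

```python
from core.units.mlip_unit.api.inference import UMATask

# Members compare equal to their string values because the enum subclasses str.
assert UMATask.OMOL == "omol"
assert UMATask.OC20.value == "oc20"

# A raw string (e.g. from a CLI flag or config file) can be normalized to a member.
task = UMATask("omat")
assert task is UMATask.OMAT

# Iterating yields every known task name in definition order.
print([t.value for t in UMATask])  # ['omol', 'omat', 'odac', 'oc20', 'omc']
```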
- core.units.mlip_unit.api.inference.CHARGE_RANGE#
- core.units.mlip_unit.api.inference.SPIN_RANGE = [0, 100]#
- core.units.mlip_unit.api.inference.DEFAULT_CHARGE = 0#
- core.units.mlip_unit.api.inference.DEFAULT_SPIN_OMOL = 1#
- core.units.mlip_unit.api.inference.DEFAULT_SPIN = 0#
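These module-level constants provide bounds and defaults for charge and spin inputs. The sketch below shows one way they could be applied; validate_spin is a hypothetical helper, not part of this module, and CHARGE_RANGE's value is not shown above, so only SPIN_RANGE is used here:

```python
from core.units.mlip_unit.api.inference import (
    SPIN_RANGE,
    DEFAULT_SPIN,
    DEFAULT_SPIN_OMOL,
    UMATask,
)

def validate_spin(spin: int | None, task: UMATask) -> int:
    """Hypothetical helper: fall back to the documented defaults and range-check."""
    if spin is None:
        # DEFAULT_SPIN_OMOL = 1 for the OMOL task; DEFAULT_SPIN = 0 otherwise.
        spin = DEFAULT_SPIN_OMOL if task is UMATask.OMOL else DEFAULT_SPIN
    lo, hi = SPIN_RANGE  # [0, 100] per the listing above
    if not lo <= spin <= hi:
        raise ValueError(f"spin={spin} outside allowed range {SPIN_RANGE}")
    return spin

print(validate_spin(None, UMATask.OMOL))  # 1
```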
- class core.units.mlip_unit.api.inference.MLIPInferenceCheckpoint#
- model_config: dict#
- model_state_dict: dict#
- ema_state_dict: dict#
- tasks_config: dict#
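A checkpoint instance is a simple container of four dictionaries. Assuming you already have one in hand (this page does not document how such an object is created or loaded), its fields can be inspected like any other attributes:

```python
# Sketch only: `checkpoint` is assumed to be an existing MLIPInferenceCheckpoint instance.
def summarize(checkpoint) -> None:
    print("model config keys:", sorted(checkpoint.model_config))
    print("task config keys: ", sorted(checkpoint.tasks_config))
    n_params = len(checkpoint.model_state_dict)
    n_ema = len(checkpoint.ema_state_dict)
    print(f"{n_params} entries in model_state_dict, {n_ema} in ema_state_dict")
```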
- class core.units.mlip_unit.api.inference.InferenceSettings#
- tf32: bool = False#
- activation_checkpointing: bool | None = None#
- merge_mole: bool = False#
- compile: bool = False#
- wigner_cuda: bool | None = None#
- external_graph_gen: bool | None = None#
- internal_graph_gen_version: int | None = None#
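Every field carries a default, so, assuming InferenceSettings is a standard dataclass-style container as the defaulted fields above suggest, it can be constructed with only the flags you want to override. The particular flag combination below is illustrative, not a recommendation from this page:

```python
from core.units.mlip_unit.api.inference import InferenceSettings

# Defaults per the listing: tf32=False, merge_mole=False, compile=False, and the
# optional fields (activation_checkpointing, wigner_cuda, external_graph_gen,
# internal_graph_gen_version) left unset (None).
defaults = InferenceSettings()

# Override a few fields; which combination is appropriate depends on your
# hardware and workload, which this page does not specify.
fast = InferenceSettings(tf32=True, merge_mole=True, compile=True)
```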
- core.units.mlip_unit.api.inference.inference_settings_default()#
- core.units.mlip_unit.api.inference.inference_settings_turbo()#
- core.units.mlip_unit.api.inference.inference_settings_traineval()#
- core.units.mlip_unit.api.inference.NAME_TO_INFERENCE_SETTING#
- core.units.mlip_unit.api.inference.guess_inference_settings(settings: str | InferenceSettings)#
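Per its signature, guess_inference_settings accepts either a preset name or an already-built InferenceSettings, and NAME_TO_INFERENCE_SETTING presumably maps preset names to settings produced by the factory functions above. A sketch; the preset name "turbo" is assumed from the factory-function names rather than documented here:

```python
from core.units.mlip_unit.api.inference import (
    InferenceSettings,
    guess_inference_settings,
    inference_settings_turbo,
)

# Passing an InferenceSettings object through should amount to a no-op normalization.
settings = guess_inference_settings(InferenceSettings(tf32=True))

# Factory helpers return ready-made presets.
turbo = inference_settings_turbo()

# Assumption: the string keys match the factory suffixes (e.g. "turbo");
# the actual keys live in NAME_TO_INFERENCE_SETTING and are not shown above.
maybe_turbo = guess_inference_settings("turbo")
```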