core.calculate.pretrained_mlip#
Copyright (c) Meta Platforms, Inc. and affiliates.
This source code is licensed under the MIT license found in the LICENSE file in the root directory of this source tree.
Attributes#
- _MODEL_CKPTS
- available_models
Classes#
- HuggingFaceCheckpoint
- PretrainedModels
Functions#
- get_predict_unit – Retrieves a prediction unit for a specified model.
Module Contents#
- class core.calculate.pretrained_mlip.HuggingFaceCheckpoint#
- filename: str#
- repo_id: Literal['facebook/UMA']#
- subfolder: str | None = None#
- revision: str | None = None#
- class core.calculate.pretrained_mlip.PretrainedModels#
- checkpoints: dict[str, HuggingFaceCheckpoint]#
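For orientation, a minimal sketch of how these two containers fit together, assuming both classes behave like plain dataclasses importable from fairchem.core.calculate.pretrained_mlip; the filename and subfolder values below are placeholders, not names of real files in the facebook/UMA repository:

    from fairchem.core.calculate.pretrained_mlip import (
        HuggingFaceCheckpoint,
        PretrainedModels,
    )

    # Placeholder checkpoint entry: filename and subfolder are illustrative only.
    ckpt = HuggingFaceCheckpoint(
        filename="example-checkpoint.pt",
        repo_id="facebook/UMA",
        subfolder="checkpoints",
    )

    # PretrainedModels maps human-readable model names to checkpoint entries.
    models = PretrainedModels(checkpoints={"example-model": ckpt})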
- core.calculate.pretrained_mlip._MODEL_CKPTS#
- core.calculate.pretrained_mlip.available_models#
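available_models lists the model names accepted by get_predict_unit(). A quick way to inspect it (the fairchem.core.calculate import path is an assumption based on this module's location):

    from fairchem.core.calculate import pretrained_mlip

    # Prints the names that can be passed as model_name to get_predict_unit().
    print(pretrained_mlip.available_models)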
- core.calculate.pretrained_mlip.get_predict_unit(model_name: str, inference_settings: fairchem.core.units.mlip_unit.InferenceSettings | str = 'default', overrides: dict | None = None, device: str = 'cuda') → fairchem.core.units.mlip_unit.MLIPPredictUnit#
Retrieves a prediction unit for a specified model.
- Parameters:
model_name – Name of the model to load from available pretrained models.
inference_settings – Settings for inference. Can be “default” (general purpose) or “turbo” (optimized for speed but requires fixed atomic composition). Advanced use cases can use a custom InferenceSettings object.
overrides – Optional dictionary of settings to override default inference settings.
device – Torch device to load the model onto (e.g. 'cuda' or 'cpu'). Defaults to 'cuda'.
- Returns:
An initialized MLIPPredictUnit ready for making predictions.
- Raises:
KeyError – If the specified model_name is not found in available models.
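A minimal end-to-end sketch of typical usage with ASE. The model name "uma-s-1" is illustrative (check available_models for the names actually available), and the FAIRChemCalculator wrapper and its task_name argument are assumptions about the surrounding fairchem API rather than part of this module:

    from ase.build import molecule
    from fairchem.core import FAIRChemCalculator, pretrained_mlip

    # Load a prediction unit; "uma-s-1" is an example name, and device is set
    # to "cpu" here so the sketch runs without a GPU.
    predictor = pretrained_mlip.get_predict_unit("uma-s-1", device="cpu")

    # Wrap the prediction unit in an ASE calculator (assumed wrapper API).
    calc = FAIRChemCalculator(predictor, task_name="omol")

    atoms = molecule("H2O")
    atoms.calc = calc
    print(atoms.get_potential_energy())

For repeated evaluations of a system with fixed atomic composition, passing inference_settings="turbo" trades that flexibility for speed, as noted in the parameter description above.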