core.models.gemnet.layers.base_layers#

Copyright (c) Meta, Inc. and its affiliates.

This source code is licensed under the MIT license found in the LICENSE file in the root directory of this source tree.

Classes#

Dense

Combines a dense layer with output scaling for the swish (SiLU) activation.

ScaledSiLU

SiLU (swish) activation with its output rescaled by a constant factor of 1/0.6.

SiQU

SiQU activation: multiplies the input by its SiLU activation.

ResidualLayer

Residual block with output scaled by 1/sqrt(2).

Module Contents#

class core.models.gemnet.layers.base_layers.Dense(in_features, out_features, bias: bool = False, activation=None)#

Bases: torch.nn.Module

Combines a dense layer with output scaling for the swish (SiLU) activation.

Parameters:
  • in_features (int) – Input embedding size.

  • out_features (int) – Output embedding size.

  • bias (bool) – Whether the linear layer uses a bias term.

  • activation (str) – Name of the activation function to use.

linear#
reset_parameters(initializer=he_orthogonal_init) → None#
forward(x)#
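
A minimal usage sketch (assuming the package is importable under the documented path; the sizes and the "silu" activation name are illustrative):

import torch

from core.models.gemnet.layers.base_layers import Dense

# Bias-free linear layer (He-orthogonal init) followed by the scaled-SiLU activation.
dense = Dense(in_features=16, out_features=64, bias=False, activation="silu")

x = torch.randn(8, 16)  # batch of 8 input embeddings
y = dense(x)            # shape: (8, 64)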
class core.models.gemnet.layers.base_layers.ScaledSiLU#

Bases: torch.nn.Module

SiLU (swish) activation with its output rescaled by a constant factor of 1/0.6 (the scale_factor attribute below), which approximately restores the variance of the activations.

scale_factor#
_activation#
forward(x)#
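
Given the scale_factor and _activation attributes above, the forward pass reduces to a constant rescaling of SiLU; a plain-PyTorch sketch of the equivalent computation (the 1/0.6 factor follows the GemNet reference implementation):

import torch
import torch.nn.functional as F

def scaled_silu(x: torch.Tensor) -> torch.Tensor:
    # SiLU output rescaled so the activation variance stays roughly constant.
    return F.silu(x) * (1 / 0.6)

print(scaled_silu(torch.randn(4)))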
class core.models.gemnet.layers.base_layers.SiQU#

Bases: torch.nn.Module

SiQU activation: multiplies the input by its SiLU activation, i.e. x * SiLU(x).

_activation#
forward(x)#
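
Assuming _activation is SiLU, as in the GemNet reference implementation, the forward pass computes x * SiLU(x); a self-contained sketch:

import torch
import torch.nn.functional as F

def siqu(x: torch.Tensor) -> torch.Tensor:
    # SiQU(x) = x * SiLU(x) = x^2 * sigmoid(x)
    return x * F.silu(x)

print(siqu(torch.randn(4)))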
class core.models.gemnet.layers.base_layers.ResidualLayer(units: int, nLayers: int = 2, layer=Dense, **layer_kwargs)#

Bases: torch.nn.Module

Residual block with output scaled by 1/sqrt(2).

Parameters:
  • units (int) – Input and output embedding size of the dense layers.

  • nLayers (int) – Number of dense layers.

  • layer – Layer class to stack inside the residual block (default: Dense).

  • layer_kwargs – Keyword arguments passed to each layer's constructor.

dense_mlp#
inv_sqrt_2#
forward(input)#
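
A simplified sketch of the computation implied by the attributes above (dense_mlp stacks the dense layers, inv_sqrt_2 rescales the residual sum); the bias-free Linear + SiLU stand-in for Dense is an assumption for illustration:

import math

import torch

class ResidualSketch(torch.nn.Module):
    """Simplified stand-in for ResidualLayer: y = (x + MLP(x)) / sqrt(2)."""

    def __init__(self, units: int, n_layers: int = 2):
        super().__init__()
        layers = []
        for _ in range(n_layers):
            # Stand-in for GemNet's Dense: bias-free Linear followed by SiLU.
            layers += [torch.nn.Linear(units, units, bias=False), torch.nn.SiLU()]
        self.dense_mlp = torch.nn.Sequential(*layers)
        self.inv_sqrt_2 = 1 / math.sqrt(2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual sum rescaled by 1/sqrt(2) to keep the activation variance stable.
        return (x + self.dense_mlp(x)) * self.inv_sqrt_2

block = ResidualSketch(units=32)
print(block(torch.randn(8, 32)).shape)  # torch.Size([8, 32])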