bead.src.models package

Submodules

bead.src.models.flows module

Collection of flow strategies for variational inference and density estimation.

This module implements various normalizing flow architectures that can be used to transform simple probability distributions into more complex ones. These flows are particularly useful for improving the expressiveness of variational autoencoders by allowing more flexible posterior distributions.

Classes:

Planar: Planar flow transformation.
Sylvester: Standard Sylvester normalizing flow.
TriangularSylvester: Sylvester flow with triangular structure.
IAF: Inverse Autoregressive Flow.
CNN_Flow: Convolutional neural network based normalizing flow.
NSF_AR: Neural Spline Flow with autoregressive structure.
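
Each flow is applied to a latent sample and, together with the transformed sample, yields the log-determinant of the Jacobian that must be folded into the variational objective. The sketch below illustrates that general pattern only; the flow_step callables and the (z, log_det) return convention are illustrative assumptions, not the exact APIs of the classes above.

    import torch

    def apply_flows(z0, flow_steps):
        """Push a posterior sample z0 through a sequence of invertible transforms.

        Each step is assumed to return (z_new, log_det_jacobian); the summed
        log-determinants enter the ELBO as a change-of-variables correction.
        """
        z, sum_log_det = z0, torch.zeros(z0.shape[0])
        for step in flow_steps:
            z, log_det = step(z)
            sum_log_det = sum_log_det + log_det
        return z, sum_log_det

    # Illustrative stand-in: an element-wise affine "flow" with a known log-det.
    scale = torch.tensor([0.5, 2.0, 1.0, 1.5])
    affine = lambda z: (z * scale, torch.log(scale.abs()).sum().expand(z.shape[0]))

    z0 = torch.randn(64, 4)                      # sample from q(z|x)
    zK, ldj = apply_flows(z0, [affine, affine])  # refined sample + total log-det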

class bead.src.models.flows.CNN_Flow(dim, cnn_layers, kernel_size, test_mode=0, use_revert=True)[source]

Bases: Module

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.flows.IAF(z_size, num_flows=2, num_hidden=0, h_size=50, forget_bias=1.0, conv2d=False)[source]

Bases: Module

forward(z, h_context)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.flows.NSF_AR(dim=15, K=64, B=3, hidden_dim=8, base_network=<class 'bead.src.models.layers.FCNN'>)[source]

Bases: Module

Neural spline flow, auto-regressive. [Durkan et al. 2019]

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reset_parameters()[source]
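
A usage sketch for NSF_AR. The constructor arguments follow the signature above (K spline bins on the interval [-B, B], with values outside that interval passed through unchanged); the (z, log_det) return convention of forward is an assumption based on common autoregressive NSF implementations, not something documented here.

    import torch
    from bead.src.models.flows import NSF_AR

    flow = NSF_AR(dim=15, K=64, B=3, hidden_dim=8)

    x = torch.randn(32, 15)   # batch of 15-dimensional latent vectors
    z, log_det = flow(x)      # assumed: transformed sample and log|det J| per event
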
class bead.src.models.flows.Planar[source]

Bases: Module

der_h(x)[source]

Derivative of tanh: h'(x) = 1 - tanh^2(x).

forward(zk, u, w, b)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
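
For reference, the planar transform of Rezende and Mohamed (2015), which the signature above (with amortized parameters u, w, b supplied per call) corresponds to, is

    f(z) = z + u \, h(w^{\top} z + b), \qquad h = \tanh,

    \log\left|\det \frac{\partial f}{\partial z}\right|
        = \log\left|1 + u^{\top} h'(w^{\top} z + b)\, w\right|,

with der_h supplying h'(x) = 1 - \tanh^2(x).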

class bead.src.models.flows.Sylvester(num_ortho_vecs)[source]

Bases: Module

Sylvester normalizing flow.

der_h(x)[source]
der_tanh(x)[source]
forward(zk, r1, r2, q_ortho, b, sum_ldj=True)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
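
For reference, the Sylvester transform of van den Berg et al. (2018), whose ingredients match the forward signature above (triangular r1 and r2, orthonormal columns q_ortho, bias b), is

    z' = z + Q R_1 \, h\!\left(R_2 Q^{\top} z + b\right),
    \qquad
    \log\left|\det \frac{\partial z'}{\partial z}\right|
        = \sum_{i} \log\left|1 + r_{1,ii}\, r_{2,ii}\, h'\!\left(R_2 Q^{\top} z + b\right)_i\right|,

where R_1 and R_2 are upper triangular, Q has orthonormal columns, and h = tanh (der_h and der_tanh supply h').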

class bead.src.models.flows.TriangularSylvester(z_size)[source]

Bases: Module

Sylvester normalizing flow with Q=P or Q=I.

der_h(x)[source]
der_tanh(x)[source]
forward(zk, r1, r2, q_ortho, b, sum_ldj=True)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

bead.src.models.layers module

Custom layer implementations for neural network architectures.

This module provides specialized neural network layers used in various models across the BEAD framework. These include masked layers for autoregressive models, CNN flow layers, graph convolutional layers, and utility functions for spline-based flows.

Classes:

Identity: Simple identity layer that returns its input unchanged.
MaskedLinear: Linear layer with masking for autoregressive architectures.
MaskedConv2d: 2D convolutional layer with masking capabilities.
CNN_Flow_Layer: Base layer for CNN-based normalizing flows.
Dilation_Block: Block of dilated convolutions for CNN flows.
GraphConvolution: Graph convolutional network layer.
FCNN: Simple fully connected neural network.
Log1pScaler: Scaler that applies log(1+x) transformation.
L2Normalizer: Scaler that applies L2 normalization.
SinCosTransformer: Transforms angles to sin/cos features.
ChainedScaler: Chains multiple scalers together.

Functions:

searchsorted: Utility for finding indices where elements should be inserted.
unconstrained_RQS: Rational quadratic spline transformation with unconstrained inputs.
RQS: Rational quadratic spline transformation.

class bead.src.models.layers.CNN_Flow_Layer(dim, kernel_size, dilation, test_mode=0, rescale=True, skip=True)[source]

Bases: Module

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.layers.Dilation_Block(dim, kernel_size, test_mode=0)[source]

Bases: Module

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.layers.FCNN(in_dim, out_dim, hidden_dim)[source]

Bases: Module

Simple fully connected neural network.

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.layers.GraphConvolution(in_features, out_features, bias=True)[source]

Bases: Module

Simple GCN layer, similar to https://arxiv.org/abs/1609.02907

forward(input, adj)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reset_parameters()[source]
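
A usage sketch for GraphConvolution. Following Kipf and Welling (2017), the layer propagates node features roughly as adj @ input @ W (plus an optional bias); the adjacency matrix below is an illustrative placeholder and would normally be normalized.

    import torch
    from bead.src.models.layers import GraphConvolution

    gc = GraphConvolution(in_features=8, out_features=16)

    x = torch.randn(4, 8)       # features for a 4-node graph
    adj = torch.eye(4)          # placeholder adjacency (self-loops only)
    h = torch.relu(gc(x, adj))  # propagated node embeddings, shape (4, 16)
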
class bead.src.models.layers.Identity[source]

Bases: Module

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.layers.MaskedConv2d(in_features, out_features, size_kernel=(3, 3), diagonal_zeros=False, bias=True)[source]

Bases: Module

build_mask()[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reset_parameters()[source]
class bead.src.models.layers.MaskedLinear(in_features, out_features, diagonal_zeros=False, bias=True)[source]

Bases: Module

build_mask()[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reset_parameters()[source]
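
The masked layers implement the autoregressive constraint used by MADE/IAF-style models: an output unit may only see inputs whose index is lower than (or, depending on diagonal_zeros, equal to) its own. The standalone sketch below illustrates the idea only and is not the exact build_mask implementation.

    import torch
    import torch.nn.functional as F

    def autoregressive_mask(in_features, out_features, diagonal_zeros=False):
        """Binary mask with mask[i, j] = 1 iff output j is allowed to see input i."""
        i = torch.arange(in_features).unsqueeze(1)   # input index
        j = torch.arange(out_features).unsqueeze(0)  # output index
        return (i < j if diagonal_zeros else i <= j).float()

    weight = torch.randn(8, 8)                       # (out_features, in_features)
    mask = autoregressive_mask(8, 8, diagonal_zeros=True)
    x = torch.randn(5, 8)
    y = F.linear(x, weight * mask.t())               # masked weights keep the layer autoregressive
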
bead.src.models.layers.RQS(inputs, unnormalized_widths, unnormalized_heights, unnormalized_derivatives, inverse=False, left=0.0, right=1.0, bottom=0.0, top=1.0, min_bin_width=0.001, min_bin_height=0.001, min_derivative=0.001)[source]
bead.src.models.layers.searchsorted(bin_locations, inputs, eps=1e-06)[source]
bead.src.models.layers.unconstrained_RQS(inputs, unnormalized_widths, unnormalized_heights, unnormalized_derivatives, inverse=False, tail_bound=1.0, min_bin_width=0.001, min_bin_height=0.001, min_derivative=0.001)[source]
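
unconstrained_RQS applies a monotonic rational-quadratic spline inside [-tail_bound, tail_bound] and the identity outside it (Durkan et al. 2019). The sketch below assumes the parameterization and (outputs, log_abs_det) return convention of the reference implementation, i.e. K unnormalized widths, K heights and K - 1 interior derivatives per element; treat those shapes as assumptions rather than documented behaviour.

    import torch
    from bead.src.models.layers import unconstrained_RQS

    K, B = 8, 3.0
    x = torch.randn(32)              # one scalar dimension per event
    widths = torch.randn(32, K)      # unnormalized bin widths
    heights = torch.randn(32, K)     # unnormalized bin heights
    derivs = torch.randn(32, K - 1)  # unnormalized interior derivatives

    y, log_det = unconstrained_RQS(x, widths, heights, derivs,
                                   inverse=False, tail_bound=B)
    x_back, _ = unconstrained_RQS(y, widths, heights, derivs,
                                  inverse=True, tail_bound=B)  # inverts the forward pass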

bead.src.models.models module

Neural network model architectures for anomaly detection.

This module provides various autoencoder and variational autoencoder architectures with different latent space configurations, flow transformations, and architectural choices. These models can be used for anomaly detection in particle physics data.

Classes:

AE: Basic autoencoder architecture.
AE_Dropout_BN: Autoencoder with dropout and batch normalization.
ConvAE: Convolutional autoencoder.
ConvVAE: Convolutional variational autoencoder.
Dirichlet_ConvVAE: Convolutional Dirichlet variational autoencoder.
Planar_ConvVAE: ConvVAE with planar normalizing flows.
OrthogonalSylvester_ConvVAE: ConvVAE with orthogonal Sylvester flows.
HouseholderSylvester_ConvVAE: ConvVAE with Householder Sylvester flows.
TriangularSylvester_ConvVAE: ConvVAE with triangular Sylvester flows.
IAF_ConvVAE: ConvVAE with inverse autoregressive flows.
ConvFlow_ConvVAE: ConvVAE with convolutional normalizing flows.
NSFAR_ConvVAE: ConvVAE with neural spline flows.
TransformerAE: Autoencoder with transformer components.
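
With the exception of TransformerAE, these models share the (in_shape, z_dim) constructor signature, so switching architectures amounts to swapping the class; the flow-augmented variants subclass ConvVAE. A construction sketch (the concrete in_shape format and the tensors returned by forward are not documented here, so the shapes below are placeholders):

    from bead.src.models import models

    candidates = [models.ConvAE, models.ConvVAE, models.Planar_ConvVAE]

    for cls in candidates:
        net = cls(in_shape=(1, 19, 19), z_dim=15)   # placeholder input shape
        n_params = sum(p.numel() for p in net.parameters())
        print(cls.__name__, n_params)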

class bead.src.models.models.AE(in_shape, z_dim, *args, **kwargs)[source]

Bases: Module

decode(z)[source]
detach_hooks(hooks: list) → None[source]
encode(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

get_activations() → dict[source]
get_hook(layer_name)[source]
get_layers() → list[source]
store_hooks() → list[source]
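
The hook helpers on AE follow the usual PyTorch forward-hook pattern: register hooks, run a forward pass, read back the captured activations, then detach the hooks. A usage sketch; the constructor arguments and input shape are illustrative placeholders, and the keys of the activation dict depend on get_layers/get_hook, which are not documented here.

    import torch
    from bead.src.models.models import AE

    model = AE(in_shape=(1, 28, 28), z_dim=15)   # placeholder shapes
    hooks = model.store_hooks()                  # register forward hooks on the layers

    x = torch.randn(16, 1, 28, 28)               # placeholder input batch
    _ = model(x)                                 # forward pass populates the activations

    acts = model.get_activations()               # dict: layer name -> captured tensor
    model.detach_hooks(hooks)                    # remove the hooks when done
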
class bead.src.models.models.AE_Dropout_BN(in_shape, z_dim, *args, **kwargs)[source]

Bases: AE

dec_bn(z)[source]
enc_bn(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.ConvAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: Module

decode(z)[source]
encode(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.ConvFlow_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with convolutional flows in the decoder.

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvAE

decode(z)[source]
encode(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reparameterize(mean, logvar)[source]
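
reparameterize is the standard Gaussian reparameterization trick, which keeps sampling differentiable with respect to the encoder outputs. A sketch of the well-known formula (not necessarily this method's exact code):

    import torch

    def reparameterize(mean, logvar):
        """Sample z ~ N(mean, exp(logvar)) as z = mean + std * eps with eps ~ N(0, I)."""
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mean + eps * std
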
class bead.src.models.models.Dirichlet_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvAE

decode(z)[source]
encode(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

reparameterize(mean, logvar)[source]
class bead.src.models.models.HouseholderSylvester_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with Householder Sylvester flows in the decoder.

batch_construct_orthogonal(q)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
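
In the Householder variant of the Sylvester flow, the orthogonal matrices consumed by the transform are products of Householder reflections, Q = prod_k (I - 2 v_k v_k^T / ||v_k||^2), which is presumably what batch_construct_orthogonal builds from q. A standalone sketch of a single reflection (the batching and the number of reflections per flow are implementation details not shown here):

    import torch

    def householder(v):
        """Orthogonal reflection I - 2 v v^T / ||v||^2 built from a single vector v."""
        v = v / (v.norm() + 1e-8)
        return torch.eye(v.numel()) - 2.0 * torch.outer(v, v)

    Q = householder(torch.randn(5))
    print(torch.allclose(Q @ Q.t(), torch.eye(5), atol=1e-6))   # True: Q is orthogonal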

class bead.src.models.models.IAF_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with inverse autoregressive flows in the decoder.

encode(x)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.NSFAR_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with auto-regressive neural spline flows in the decoder.

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.OrthogonalSylvester_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with orthogonal Sylvester flows in the decoder.

batch_construct_orthogonal(q)[source]
forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.Planar_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with planar flows in the decoder.

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class bead.src.models.models.TransformerAE(in_dim, h_dim=256, n_heads=1, latent_size=50, activation=<built-in function gelu>)[source]

Bases: Module

Autoencoder built around a Transformer encoder layer.

Parameters:

in_dim (int) – Dimensionality of the input features.
h_dim (int) – Hidden layer dimension. Default: 256.
n_heads (int) – Number of attention heads. Default: 1.
latent_size (int) – Size of the latent representation. Default: 50.
activation (callable) – Activation function. Default: gelu.

decoder(z: Tensor)[source]

Decode a latent representation back to the input space.

Parameters:

z (Tensor) – Latent tensor to decode.

Returns:

Reconstruction in the input space.

Return type:

Tensor

encoder(x: Tensor)[source]

Encode an input into its latent representation.

Parameters:

x (Tensor) – Input tensor to encode.

Returns:

Latent representation of the input.

Return type:

Tensor

forward(x: Tensor)[source]

Run the full encode-decode pass on an input batch.

Parameters:

x (Tensor) – Input tensor.

Returns:

Reconstruction of the input.

Return type:

Tensor
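
A usage sketch for TransformerAE, assuming the conventional autoencoder contract (forward returns the reconstruction, and encoder/decoder can also be called separately); the input dimensionality is an illustrative placeholder.

    import torch
    from bead.src.models.models import TransformerAE

    model = TransformerAE(in_dim=60, h_dim=256, n_heads=1, latent_size=50)

    x = torch.randn(32, 60)     # batch of flattened event features (placeholder)
    z = model.encoder(x)        # latent representation, assumed size latent_size
    recon = model.decoder(z)    # reconstruction in the input space
    out = model(x)              # assumed equivalent to decoder(encoder(x))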

class bead.src.models.models.TriangularSylvester_ConvVAE(in_shape, z_dim, *args, **kwargs)[source]

Bases: ConvVAE

Variational auto-encoder with triangular Sylvester flows in the decoder.

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

Module contents