The library is currently undergoing a major rewrite and the API may be unstable. Check back soon!
Bensemble is a library for Bayesian deep learning that integrates established methods for neural network ensembling and uncertainty quantification. It provides building blocks that slot directly into your existing PyTorch workflows.
| Resource | Description |
|---|---|
| 📘 Documentation | Full API reference and user guides. |
| 📝 Tech Report | In-depth technical details and theoretical background. |
| ✍️ Blog Post | Summary of the project and motivation. |
| 📊 Benchmarks | Comparison of methods on standard datasets. |
- PyTorch-Native: All layers and methods are compatible with standard PyTorch.
- Modularity: BayesianLinear, BayesianConv2d with built-in Local Reparameterization Trick (LRT).
- Core Bayesian Methods: Implements canonical algorithms, from variational inference to scalable Laplace approximations.
- Modern Stack: Built with uv, fully typed, and tested.
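To make the uncertainty-quantification idea concrete, here is a minimal sketch in plain PyTorch (independent of bensemble's actual API): a toy stochastic layer with Gaussian weights, whose predictions are summarized by the mean and standard deviation over repeated forward passes. The `NoisyLinear` class below is a hypothetical stand-in, not a bensemble layer.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class NoisyLinear(nn.Module):
    """Toy stochastic layer: weights are drawn from N(mu, sigma^2)
    on every forward pass (a stand-in for a real Bayesian layer)."""
    def __init__(self, d_in, d_out, sigma=0.1):
        super().__init__()
        self.mu = nn.Parameter(torch.randn(d_out, d_in) * 0.1)
        self.sigma = sigma

    def forward(self, x):
        w = self.mu + self.sigma * torch.randn_like(self.mu)
        return x @ w.t()

model = nn.Sequential(NoisyLinear(10, 50), nn.ReLU(), NoisyLinear(50, 1))
x = torch.randn(5, 10)

# Monte Carlo prediction: repeat the stochastic forward pass and
# summarize the spread of the sampled outputs.
samples = torch.stack([model(x) for _ in range(100)])  # shape (100, 5, 1)
mean, std = samples.mean(dim=0), samples.std(dim=0)
print(mean.shape, std.shape)  # torch.Size([5, 1]) torch.Size([5, 1])
```

The spread `std` is driven entirely by the weight noise, which is what a Bayesian layer's posterior variance provides in a principled way.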
You can install bensemble using pip:

```bash
pip install bensemble
```

Or, if you prefer using uv for lightning-fast installation:

```bash
uv pip install bensemble
```

Build a Bayesian Neural Network using our layers and write a standard PyTorch training loop.
```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Import building blocks
from bensemble.layers import BayesianLinear
from bensemble.losses import VariationalLoss, GaussianLikelihood
from bensemble.utils import get_total_kl, predict_with_uncertainty

# 0. Prepare Dummy Data
X_train = torch.randn(100, 10)
y_train = torch.randn(100, 1)
X_test = torch.randn(5, 10)

dataset = TensorDataset(X_train, y_train)
train_loader = DataLoader(dataset, batch_size=10, shuffle=True)

# 1. Define Model using Bayesian Layers
model = nn.Sequential(
    BayesianLinear(10, 50, prior_sigma=1.0),
    nn.ReLU(),
    BayesianLinear(50, 1, prior_sigma=1.0),
)

# 2. Define Objectives (Likelihood + Divergence)
likelihood = GaussianLikelihood()
criterion = VariationalLoss(likelihood, alpha=1.0)
optimizer = torch.optim.Adam(
    list(model.parameters()) + list(likelihood.parameters()), lr=0.01
)

# 3. Train Model
model.train()
for epoch in range(100):
    for x, y in train_loader:
        optimizer.zero_grad()
        preds = model(x)
        kl = get_total_kl(model)
        loss = criterion(preds, y, kl)
        loss.backward()
        optimizer.step()

# 4. Predict
mean, std = predict_with_uncertainty(model, X_test, num_samples=100)
print(f"Prediction: {mean[0].item():.2f}")
print(f"Uncertainty: ±{std[0].item():.2f}")
```

If you want to contribute to bensemble or run tests, we recommend using uv to manage the environment.
```bash
# 1. Clone the repository
git clone https://github.com/intsystems/bensemble.git
cd bensemble

# 2. Create and activate virtual environment via uv
uv venv
source .venv/bin/activate  # on Windows: .venv\Scripts\activate

# 3. Install in editable mode with dev dependencies
uv pip install -e ".[dev]"
```

We have implemented four distinct approaches. Check out the interactive demos for each:
| Method | Description | Demo |
|---|---|---|
| Variational Inference | Approximates the posterior with Gaussian distributions via the Local Reparameterization Trick (LRT). | Open Notebook |
| Laplace Approximation | Fits a Gaussian around the MAP estimate using Kronecker-Factored Curvature (K-FAC). | Open Notebook |
| Variational Rényi | Generalization of VI minimizing the Rényi α-divergence. | Open Notebook |
| Probabilistic Backprop | Propagates moments through the network using Assumed Density Filtering (ADF). | Open Notebook |
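The Local Reparameterization Trick mentioned in the table can be sketched in a few lines of plain PyTorch. This is an illustration of the general technique (Kingma-style LRT), not bensemble's implementation: instead of sampling a full weight matrix, the pre-activations are sampled directly from their implied Gaussian, which needs one noise draw per activation rather than per weight and reduces gradient variance.

```python
import torch

torch.manual_seed(0)

# Variational posterior over weights: W_ij ~ N(mu_ij, sigma_ij^2)
d_in, d_out, batch = 10, 50, 32
mu = torch.randn(d_out, d_in) * 0.1
log_sigma = torch.full((d_out, d_in), -2.0)
x = torch.randn(batch, d_in)

# Naive reparameterization: sample W explicitly, then compute x @ W^T.
w = mu + log_sigma.exp() * torch.randn_like(mu)
out_naive = x @ w.t()

# Local Reparameterization Trick: the pre-activations y = x @ W^T are
# themselves Gaussian with mean x @ mu^T and variance x^2 @ sigma^2^T,
# so sample them directly in activation space.
act_mu = x @ mu.t()
act_var = (x ** 2) @ (log_sigma.exp() ** 2).t()
out_lrt = act_mu + act_var.sqrt() * torch.randn_like(act_mu)

print(out_naive.shape, out_lrt.shape)  # torch.Size([32, 50]) twice
```

Both outputs follow the same distribution; the LRT version is what makes variational layers practical at mini-batch scale.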
The library is covered by a comprehensive test suite to ensure reliability.

```bash
pytest tests/
```

We use ruff to keep code clean:

```bash
ruff check .
ruff format .
```

Developed by:
This project is licensed under the MIT License - see the LICENSE file for details.