ResDAG: A modern, GPU-accelerated reservoir computing library for PyTorch

I’ve been working on reservoir computing for my PhD research and found myself repeatedly building ESN infrastructure from scratch. So I packaged it into a library: ResDAG.

It’s a pure PyTorch implementation of Echo State Networks with:

  • Native nn.Module components (GPU, TorchScript compatible)

  • 15+ graph topologies for reservoir initialization (Erdős-Rényi, Watts-Strogatz, Barabási-Albert, etc.)

  • Algebraic training via ridge regression (no SGD)

  • Utilities to plot and inspect a model’s computation DAG

  • Modular composition using pytorch_symbolic

  • Built-in Optuna integration for HPO

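For readers unfamiliar with the approach: "algebraic training" means the readout weights are obtained in closed form by ridge regression on collected reservoir states, with no gradient descent involved. A library-agnostic sketch of that solve (the shapes and variable names here are illustrative, not ResDAG's API):

```python
import torch

# Illustrative shapes: T timesteps, N reservoir units, d output dims
T, N, d = 1000, 500, 3
states = torch.randn(T, N)    # stand-in for collected reservoir states
targets = torch.randn(T, d)   # stand-in for training targets
alpha = 1e-6                  # ridge regularization strength

# Closed-form ridge regression: solve (S^T S + alpha * I) W = S^T Y
W_out = torch.linalg.solve(
    states.T @ states + alpha * torch.eye(N),
    states.T @ targets,
)
predictions = states @ W_out  # (T, d)
```

Because this is a single linear solve, training cost is dominated by one pass that collects reservoir states, which is why no SGD loop is needed.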
Quick example - Lorenz attractor forecasting:

import torch
import pytorch_symbolic as ps
from resdag import ESNModel, ReservoirLayer, CGReadoutLayer, ESNTrainer

# Define architecture
inp = ps.Input((100, 3))
reservoir = ReservoirLayer(
    reservoir_size=500,
    feedback_size=3,
    spectral_radius=0.9,
    topology="erdos_renyi"
)(inp)
readout = CGReadoutLayer(500, 3, alpha=1e-6, name="output")(reservoir)
model = ESNModel(inp, readout)

# Train (algebraic, not gradient-based).
# warmup_data, train_data, and train_targets are user-supplied
# Lorenz trajectory tensors (not shown here).
trainer = ESNTrainer(model)
trainer.fit(
    warmup_inputs=(warmup_data,),
    train_inputs=(train_data,),
    targets={"output": train_targets}
)

# Forecast 1000 steps autonomously, re-synchronizing the reservoir
# on a warmup segment first
predictions = model.forecast(forecast_warmup, horizon=1000)
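
The tensors fed to the trainer above are Lorenz trajectories. If you don't have data handy, a standard way to generate one is forward-Euler integration of the Lorenz system (the step size, initial condition, and split below are arbitrary choices of mine; how you batch and reshape the result depends on ResDAG's expected input shapes):

```python
import torch

def lorenz_trajectory(n_steps, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward Euler; returns (n_steps, 3)."""
    state = torch.tensor([1.0, 1.0, 1.0])
    out = torch.empty(n_steps, 3)
    for t in range(n_steps):
        x, y, z = state
        deriv = torch.stack((
            sigma * (y - x),        # dx/dt
            x * (rho - z) - y,      # dy/dt
            x * y - beta * z,       # dz/dt
        ))
        state = state + dt * deriv
        out[t] = state
    return out

traj = lorenz_trajectory(3000)
# e.g. split into warmup/train segments
warmup_data, train_data = traj[:100], traj[100:]
```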

Install: pip install resdag

GitHub: El3ssar/ResDAG. PyPI: resdag.

Feedback and contributions welcome.