Randomness & Reproducibility

XIV. Randomness & Reproducibility

1. torch.manual_seed()

Sets the global random seed so that results are consistent across runs (experiment reproducibility).

```python
import random

import numpy as np
import torch

def set_seed(seed: int = 42) -> None:
    torch.manual_seed(seed)           # CPU RNG
    torch.cuda.manual_seed_all(seed)  # RNGs on all CUDA devices
    np.random.seed(seed)
    random.seed(seed)

set_seed(42)
```

Note: Also set `torch.backends.cudnn.deterministic = True` for fully deterministic behavior on GPU.
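Seeding alone does not guarantee bitwise-identical runs on GPU, because cuDNN may autotune its way to different kernels. A minimal sketch of the deterministic-mode switches (these are standard PyTorch flags; `warn_only` assumes a recent PyTorch version):

```python
import torch

# Force deterministic cuDNN kernels and disable autotuning, which can
# otherwise pick different algorithms from run to run.
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False

# Optionally flag any op that has no deterministic implementation;
# warn_only=True downgrades the hard error to a warning.
torch.use_deterministic_algorithms(True, warn_only=True)
```

Expect some throughput loss: the deterministic kernels are often slower than the autotuned ones.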

2. torch.Generator()

Creates an independent random number generator object whose state is isolated from the global seed.

```python
import torch
from torch.utils.data import DataLoader

g = torch.Generator()
g.manual_seed(42)

x = torch.rand(3, generator=g)  # draws from g's state, not the global RNG
loader = DataLoader(ds, shuffle=True, generator=g)  # ds: your Dataset; reproducible shuffling
```

Note: Safer than global seeds in multi-threaded / multi-process scenarios.
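The isolation is easy to verify: draws through a private generator are reproducible no matter how much the global RNG is consumed in between. A small sketch:

```python
import torch

torch.manual_seed(0)                    # global RNG
g = torch.Generator().manual_seed(123)  # private RNG

a = torch.rand(2, generator=g)  # consumes g's state only
b = torch.rand(2)               # consumes the global RNG

# Reseeding g reproduces the same draw, independently of the global state.
g.manual_seed(123)
a2 = torch.rand(2, generator=g)
assert torch.equal(a, a2)
```

This is why passing a dedicated generator to a `DataLoader` keeps its shuffle order stable even when other code in the process draws random numbers.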

3. torch.distributions.*

Probability distribution library supporting sampling and log-probability computation; the foundation for VAEs and reinforcement learning.

```python
import torch
from torch.distributions import Normal, Categorical

# VAE reparameterization trick: mu and logvar come from the encoder
dist = Normal(loc=mu, scale=torch.exp(0.5 * logvar))  # std = exp(log(sigma^2) / 2)
z = dist.rsample()        # differentiable sampling
log_p = dist.log_prob(z)
```

Note: `rsample()` is differentiable (reparameterization); `sample()` is not, so pair it with `log_prob()` for policy gradients.
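The `rsample()`/`sample()` distinction can be checked directly with autograd. A minimal sketch (the scalar `reward` is a hypothetical return, just to form a REINFORCE-style surrogate loss):

```python
import torch
from torch.distributions import Normal, Categorical

# rsample(): gradients flow back to the distribution parameters.
mu = torch.zeros(3, requires_grad=True)
z = Normal(mu, torch.ones(3)).rsample()
z.sum().backward()
assert mu.grad is not None  # reparameterized path is differentiable

# sample() gives no gradient path, so policy gradients instead
# differentiate log_prob of the sampled action (REINFORCE surrogate).
logits = torch.zeros(4, requires_grad=True)
dist = Categorical(logits=logits)
action = dist.sample()                  # non-differentiable draw
reward = 1.0                            # hypothetical scalar return
loss = -dist.log_prob(action) * reward
loss.backward()
assert logits.grad is not None
```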
💡 One-line Takeaway
Always call `set_seed()` at the start of every experiment and use `rsample()` for differentiable stochastic layers.

Randomness & Reproducibility
https://lxy-alexander.github.io/blog/posts/pytorch/api/14randomness--reproducibility/
Author: Alexander Lee
Published: 2026-03-12
License: CC BY-NC-SA 4.0