Established 2026 · Fort Anne, New York

Information theory
across substrates.

The Windstorm Institute studies the mathematical constraints governing information processing in biological, neural, and artificial systems. Our work bridges rate-distortion theory, molecular biology, and machine learning to uncover universal principles of serial decoding.

1,826
Models Evaluated
4.39
Bits — Ribosome Floor
3
Papers Published
p=0.576
Vocab Independence

Where biology meets information theory meets AI.

We investigate why serial decoding systems — from ribosomes to transformers — converge on similar throughput constraints despite operating on radically different substrates.

📐

Rate-Distortion Theory

Deriving mechanistic bounds on serial decoding throughput using Shannon's M-ary rate-distortion framework. Zero-free-parameter predictions for biological receivers.

🧬

Molecular Information Processing

The ribosome as an information channel. Thermodynamic anchoring of throughput to kT via Hopfield kinetic proofreading. Why 21 amino acids — not 10, not 100.

🤖

AI Throughput Constraints

Large-scale empirical studies of tokenizer vocabulary independence. 1,826-model sweeps demonstrating that vocabulary size is a redundancy parameter, not an information parameter.
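One way to see why vocabulary size can act as a redundancy parameter rather than an information parameter: re-chunking a symbol stream into a larger alphabet trades fewer tokens for more bits per token, and the product is unchanged. A toy sketch under idealized assumptions (fixed-width byte grouping, not the Institute's actual tokenizer protocol):

```python
import math

# Toy illustration: a stream of N byte-level symbols re-tokenized by
# grouping k bytes into one token drawn from a vocabulary of 256**k.
# Fewer tokens, more bits per token -- bits per character stays fixed.
N = 1 << 20  # characters in a hypothetical corpus

for k in (1, 2, 4):          # token width in bytes
    vocab_size = 256 ** k    # 256, 65_536, ~4.3e9
    n_tokens = N // k
    bits_per_token = math.log2(vocab_size)   # = 8 * k
    total_bits = n_tokens * bits_per_token
    print(k, vocab_size, total_bits / N)     # bits per character: 8.0 every time
```

Growing the vocabulary here only repackages the same information into bigger symbols; whether learned BPE vocabularies behave the same way is exactly the empirical question the 1,826-model sweep addresses.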

Published research.

All papers include reproducible Python code, full experiment protocols, and honest limitations. We lead with falsified predictions because that's how science works.

Compute infrastructure.

Windstorm Labs is the experimental arm of the Institute — GPU clusters, autonomous AI research agents, and large-scale empirical science.

Primary Compute

RTX 5090

32GB VRAM. Runs 1,826-model evaluation sweeps, evolutionary simulations, and model training.

Agent Fleet

8 Nodes

Autonomous AI research agents coordinated across distributed infrastructure. Parallel experiment execution.

Models Evaluated

1,826

Largest known tokenizer-information survey. Vocabulary sizes spanning 256 to 256K tokens, evaluated on a shared corpus.

Open Science

100%

All code, data, and experiment protocols published. Every result reproducible on commodity hardware.

Who we are.

Grant Lavell Whitmer III

Founder & Principal Investigator

U.S. Naval Academy graduate. Cross-disciplinary researcher working at the intersection of information theory, molecular biology, and artificial intelligence. Creator of the Throughput Constraint framework and the Forma Animae thesis.

Windstorm Labs

Experimental Division

A fleet of autonomous AI research agents executing large-scale empirical experiments, adversarial review, and computational simulations. Headquartered on an NVIDIA RTX 5090 in Mount Pleasant, South Carolina.

Advisory & Collaboration

Open Positions

We are seeking advisory board members with expertise in information theory, computational biology, and rate-distortion theory. If our work interests you, we want to hear from you.

Why does the ribosome process 4.39 bits per codon — and why does a transformer process roughly the same?

Two systems separated by 3.8 billion years of evolution, built on entirely different substrates, solving the same mathematical problem: decode one symbol per time step from a noisy serial stream while minimizing discrimination cost. The rate-distortion surface doesn't care whether the receiver is RNA, neurons, or silicon. We're mapping that surface.
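The 4.39-bit figure matches log2(21), the rate required to select one of 21 amino-acid outcomes per codon with zero distortion. A minimal sketch using the textbook rate-distortion function for a uniform M-ary source under Hamming distortion (our reading of such a framework, not necessarily the Institute's exact derivation):

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) in bits, with the 0*log(0) = 0 convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def rate_distortion_mary(d: float, m: int) -> float:
    """R(D) for a uniform M-ary source under Hamming (0/1) distortion:
    R(D) = log2(M) - H_b(D) - D*log2(M-1) for 0 <= D <= (M-1)/M, else 0."""
    if d >= (m - 1) / m:
        return 0.0
    return math.log2(m) - binary_entropy(d) - d * math.log2(m - 1)

# Zero-distortion decoding over a 21-symbol amino-acid alphabet:
print(round(rate_distortion_mary(0.0, 21), 2))  # 4.39 bits per codon
```

The same function applies unchanged whether M counts amino acids or vocabulary entries; only the substrate that pays the discrimination cost differs.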