
Principles

These are not aspirations. They are constraints. Every decision we make — from system design to research methodology — is evaluated against these principles. When a decision conflicts with a principle, the principle wins.

01

Evidence over intuition.

Conviction without evidence is speculation. Every model we build must be backed by statistical evidence that has survived out-of-sample testing, stress scenarios, and adversarial review. When the evidence is ambiguous, we do not act. The opportunity cost of inaction is always less than the realized cost of a false conviction.

02

Risk is the foundation, not the afterthought.

We do not generate return targets and then check risk. We define risk budgets first and then determine what returns are achievable within those constraints. This inversion of the conventional process is deliberate. It ensures that our worst-case scenarios are defined before our best-case projections, and that survival always takes precedence over performance.
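The inversion can be made concrete. A minimal sketch, with entirely hypothetical numbers and a function name of our own invention: instead of asking "what size do we need for the target return?", fix a risk budget and derive the largest position that stays inside it.

```python
# Illustrative sketch only — not an actual sizing process. The risk budget,
# volatility, and z-score are hypothetical inputs.

def notional_for_risk_budget(risk_budget: float, daily_vol: float, z: float = 2.33) -> float:
    """Largest notional whose 1-day 99% VaR (normal approximation, z ≈ 2.33)
    stays within the given dollar risk budget."""
    return risk_budget / (z * daily_vol)

# With a $10,000 daily VaR budget and 2% daily volatility, the budget —
# not a return target — determines the maximum size:
size = notional_for_risk_budget(10_000, 0.02)
```

Whatever return that size produces is the answer; the budget was fixed first and is never exceeded to chase a projection.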

03

Models are hypotheses. Treat them accordingly.

No model is permanent. Every model we develop represents a testable hypothesis about some aspect of market structure. Hypotheses can be supported by evidence, but they cannot be proven. The moment we treat a model as truth rather than conjecture is the moment we become vulnerable to the conditions under which it fails. We schedule model reviews not because we doubt our models, but because we refuse to trust them unconditionally.

04

Reproducibility is non-negotiable.

Given the same inputs, our system produces the same outputs. Every computation is deterministic, logged, and auditable. This principle extends beyond the code to the research process itself: every backtest is versioned, every parameter choice is documented, every result is reproducible by any team member without additional context. If a result cannot be reproduced, it does not exist.
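A minimal sketch of what "reproducible by construction" can look like: one explicit seed for all randomness, and a manifest that hashes the inputs alongside the result. The function name and manifest fields here are illustrative, not actual tooling.

```python
# Hypothetical sketch: a deterministic run that emits everything needed
# to reproduce itself. Field names and logic are illustrative.
import hashlib
import json
import random

def run_backtest(params: dict, data: list[float], seed: int = 0) -> dict:
    rng = random.Random(seed)  # all randomness flows from one explicit seed
    noise = [rng.gauss(0, 1e-9) for _ in data]
    pnl = sum(d + n for d, n in zip(data, noise))
    return {
        # the manifest records inputs, not just outputs:
        "params": params,
        "data_hash": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
        "seed": seed,
        "result": pnl,
    }

a = run_backtest({"window": 20}, [0.1, -0.05, 0.2])
b = run_backtest({"window": 20}, [0.1, -0.05, 0.2])
assert a == b  # same inputs, same outputs — bit-for-bit
```

The assertion at the end is the principle in one line: if two runs of the same inputs can disagree, the result does not exist.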

05

Simplicity is a feature, not a limitation.

Complexity has a cost that is rarely accounted for: it increases the surface area for failure, the difficulty of diagnosis, and the time to recovery. We prefer the simplest model that captures the relevant dynamics. When a linear model explains 90% of the variance, we do not deploy a neural network to chase the remaining 10%. The marginal improvement rarely justifies the marginal opacity.
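The "90% of the variance" judgment rests on a quantity anyone can compute. A short illustration, with made-up data, of the coefficient of determination (R²) behind that comparison:

```python
# Illustrative only: measuring how much variance a model's predictions
# explain. The data below are fabricated for the example.

def r_squared(y: list[float], y_hat: list[float]) -> float:
    """R² = 1 - (residual sum of squares / total sum of squares)."""
    mean = sum(y) / len(y)
    ss_tot = sum((v - mean) ** 2 for v in y)
    ss_res = sum((v - p) ** 2 for v, p in zip(y, y_hat))
    return 1 - ss_res / ss_tot

y = [1.0, 2.0, 3.0, 4.0]          # observed
y_hat = [1.1, 1.9, 3.2, 3.8]      # a simple model's predictions
r2 = r_squared(y, y_hat)
```

If the simple model already scores this high, a more complex one is competing for a sliver of residual variance while adding a large amount of opacity.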

06

Fail safely. Fail visibly. Fail forward.

Systems will fail. The question is not whether, but how. Every component is designed to degrade gracefully rather than cascade destructively. When a model produces an anomalous signal, the system reduces exposure rather than amplifying it. When a data feed drops, the system falls back to stale data with explicit uncertainty adjustments rather than operating blind. Every failure is logged, reviewed, and converted into a system improvement.
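The stale-data fallback above can be sketched in a few lines. Everything here — the type, the widening factor, the field names — is a hypothetical illustration of the pattern, not an actual component:

```python
# Hedged sketch of graceful degradation: when a feed drops, hold the last
# known value and explicitly inflate its uncertainty with staleness.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Quote:
    price: float
    std: float   # uncertainty attached to the price
    age: int     # ticks since the last fresh update

def degrade(last: Quote, fresh: Optional[float], widen: float = 1.5) -> Quote:
    if fresh is not None:
        return Quote(fresh, last.std, 0)  # normal path: fresh data resets staleness
    # fallback path: reuse the stale price, but never pretend it is fresh —
    # the widened std is the "explicit uncertainty adjustment"
    return Quote(last.price, last.std * widen, last.age + 1)

q = Quote(100.0, 0.5, 0)
q = degrade(q, None)  # feed dropped: price held at 100.0, std widened to 0.75
```

The essential property is that the fallback is visible: downstream consumers see a wider std and a nonzero age, so exposure shrinks instead of the system operating blind.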

07

Patience is a competitive advantage.

The pressure to deploy capital quickly is a liability dressed as urgency. We will deploy strategies only after they have survived a validation process that is deliberately slow. We would rather miss an opportunity than deploy a strategy prematurely. Markets are persistent. The opportunities that exist today existed yesterday and will likely exist tomorrow. The same cannot be said of capital lost to an untested idea.

Principles are not what we believe. They are what we do when it is difficult to do them.