Circuit Breaker
A circuit breaker is a resilience pattern that monitors requests to a backend and temporarily stops sending traffic when failures exceed a threshold. This prevents a single slow or failing service from cascading into a full storefront outage.
The Alokai middleware includes a built-in circuit breaker for each integration — no setup required.
Quick Start
1. Nothing to configure
A circuit breaker is created automatically for each integration when that integration receives its first request.
2. (Optional) Choose a preset
In middleware.config.ts:
```ts
integrations: {
  commerce: {
    location: "./integrations/commerce/index.server.ts",
    configuration: { /* ... */ },
    circuitBreaker: { preset: "BALANCED" },
  },
}
```
Available presets:
BALANCED (default), AGGRESSIVE, FAST_FAILURE, HARD_FAIL, RELAXED_DEBUG, TOLERANT, EXTREME_DEBUG
3. (Optional) Override preset values
All preset fields can be overridden per integration if needed.
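For example, you can start from a preset and tweak individual values. The option names below (errorThresholdPercentage, resetTimeout) are illustrative assumptions based on the preset fields listed later on this page — check the Alokai reference for the exact keys:

```typescript
// middleware.config.ts — a sketch of overriding individual preset values.
// Field names here are assumptions, not confirmed Alokai option names.
export const config = {
  integrations: {
    commerce: {
      location: "./integrations/commerce/index.server.ts",
      configuration: { /* ... */ },
      circuitBreaker: {
        preset: "BALANCED",           // baseline: 50% errors, 20s timeout, 15s reset, 20 calls
        errorThresholdPercentage: 40, // trip slightly sooner than BALANCED
        resetTimeout: 10_000,         // retry the backend after 10s instead of 15s
      },
    },
  },
};
```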
Presets Overview
| Preset | Description |
|---|---|
| BALANCED (default) | Safe general-purpose settings. |
| AGGRESSIVE | Trips quickly when backend slows. |
| FAST_FAILURE | Ultra-fast detection for latency-sensitive flows. |
| HARD_FAIL | Very strict; trips after few failures. |
| RELAXED_DEBUG | Very tolerant; useful for debugging flaky APIs. |
| TOLERANT | Slow to trip; good for unstable but low-risk services. |
| EXTREME_DEBUG | Maximum tolerance (non-prod only). |
Default values per preset
| Preset | Error threshold | Call timeout | Reset period | Minimum calls |
|---|---|---|---|---|
| BALANCED | 50% | 20s | 15s | 20 |
| AGGRESSIVE | 30% | 5s | 10s | 10 |
| FAST_FAILURE | 40% | 3s | 5s | 5 |
| HARD_FAIL | 10% | 2s | 3s | 3 |
| RELAXED_DEBUG | 80% | 30s | 30s | 30 |
| TOLERANT | 70% | 60s | 60s | 30 |
| EXTREME_DEBUG | 95% | 120s | 60s | 50 |
What Happens During Failures?
What counts as a failure?
- Infrastructure errors: 5xx, timeouts, DNS/network issues → trip the breaker
- Business errors (4xx) → ignored, do not affect the breaker
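The distinction above can be sketched as a simple predicate. This is illustrative only — the error class and function names are assumptions for the sketch, not Alokai internals:

```typescript
// Hypothetical error type carrying an HTTP status code.
class HttpError extends Error {
  constructor(public status: number) {
    super(`HTTP ${status}`);
  }
}

// Returns true when an error should count against the breaker:
// 5xx responses, timeouts, and DNS/network failures.
// 4xx business errors are deliberately ignored.
function countsAsFailure(err: unknown): boolean {
  if (err instanceof HttpError) {
    return err.status >= 500;
  }
  // Timeouts and network-level problems surface as non-HTTP errors.
  return true;
}
```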
Breaker states
- Closed — normal operation
- Open — integration temporarily blocked
- Half-open — testing if backend has recovered
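The three states form a small state machine: the breaker opens once the error rate crosses the threshold over a minimum number of calls, rejects requests while open, and after the reset period lets one trial request through to decide between closing and re-opening. A minimal sketch of that logic (illustrative, not Alokai's actual implementation; parameter names mirror the preset fields above):

```typescript
type State = "closed" | "open" | "halfOpen";

// Minimal circuit breaker state machine. Thresholds correspond to the
// preset values documented above (e.g. BALANCED: 50% errors, 15s reset, 20 calls).
class CircuitBreaker {
  private state: State = "closed";
  private failures = 0;
  private calls = 0;
  private openedAt = 0;

  constructor(
    private errorThresholdPercentage: number,
    private resetTimeoutMs: number,
    private volumeThreshold: number,
  ) {}

  // Should this request be allowed through right now?
  canRequest(now: number): boolean {
    if (this.state === "open") {
      if (now - this.openedAt >= this.resetTimeoutMs) {
        this.state = "halfOpen"; // reset period elapsed: allow one trial call
        return true;
      }
      return false; // still open: reject
    }
    return true;
  }

  // Record the outcome of a completed call.
  record(success: boolean, now: number): void {
    if (this.state === "halfOpen") {
      // The single trial call decides: recover or re-open.
      this.state = success ? "closed" : "open";
      if (!success) this.openedAt = now;
      this.failures = 0;
      this.calls = 0;
      return;
    }
    this.calls++;
    if (!success) this.failures++;
    const errorRate = (this.failures / this.calls) * 100;
    if (this.calls >= this.volumeThreshold && errorRate >= this.errorThresholdPercentage) {
      this.state = "open"; // too many failures: trip the breaker
      this.openedAt = now;
    }
  }

  get current(): State {
    return this.state;
  }
}
```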
Logging & Metrics
You will see logs for:
- open — breaker tripped
- halfOpen — testing recovery
- close — back to normal
- failure, reject — throttled to reduce noise
Troubleshooting
“Breaker is open”
The backend had too many failures or timed out. Wait for the reset period — it recovers automatically.
Too many rejects
Try:
- raising the error threshold
- increasing the timeout
- switching to a more tolerant preset (TOLERANT, RELAXED_DEBUG)
Not tripping when expected
Only infrastructure failures count. 4xx business errors are intentionally ignored.
Summary
- Automatic protection per integration
- Easy preset-based configuration
- Clear logging & metrics
- Zero boilerplate
If your backend slows down or fails, the Circuit Breaker prevents everything else from breaking along with it.