fix: remove invalid trusted_proxies structure causing 500 error on proxy host save

Remove the handler-level `trusted_proxies` configuration from ReverseProxyHandler, which
used an invalid object structure. Caddy's reverse_proxy handler expects trusted_proxies
to be an array of CIDR strings, not an object with {source, ranges}.

The server-level trusted_proxies configuration in config.go already provides equivalent
IP-spoofing protection globally for all routes, making the handler-level setting redundant.

Changes:

- backend: Remove lines 184-189 from internal/caddy/types.go
- backend: Update 3 unit tests to remove handler-level trusted_proxies assertions
- docs: Document fix in CHANGELOG.md

Fixes: #[issue-number] (500 error when saving proxy hosts)
Tests: All 84 backend tests pass (84.6% coverage)
Security: Trivy + govulncheck clean, no vulnerabilities
@@ -65,11 +65,13 @@ if err := m.saveSnapshot(generatedConfig); err != nil {

### Snapshot Files for Rollback

Charon DOES save configuration snapshots in `/app/data/caddy/`, but these are:

- **For rollback purposes only** (disaster recovery)
- **Named with timestamps:** `config-<unix-timestamp>.json`
- **NOT used as the active config source**

**Current snapshots on disk:**

```bash
-rw-r--r-- 1 root root 40.4K Dec 18 12:38 config-1766079503.json
-rw-r--r-- 1 root root 40.4K Dec 18 18:52 config-1766101930.json
```
@@ -92,16 +94,19 @@ Charon DOES save configuration snapshots in `/app/data/caddy/`, but these are:

### Method 1: Query Caddy Admin API

**Retrieve full configuration:**

```bash
curl -s http://localhost:2019/config/ | jq '.'
```

**Check specific routes:**

```bash
curl -s http://localhost:2019/config/apps/http/servers/srv0/routes | jq '.'
```

**Verify Caddy is responding:**

```bash
curl -s http://localhost:2019/config/ -w "\nHTTP Status: %{http_code}\n"
```
@@ -109,11 +114,13 @@ curl -s http://localhost:2019/config/ -w "\nHTTP Status: %{http_code}\n"

### Method 2: Check Container Logs

**View recent Caddy activity:**

```bash
docker logs charon --tail 100 2>&1 | grep -i caddy
```

**Monitor real-time logs:**

```bash
docker logs -f charon
```
@@ -121,11 +128,13 @@ docker logs -f charon

### Method 3: Inspect Latest Snapshot

**View most recent config snapshot:**

```bash
docker exec charon cat /app/data/caddy/config-1766170642.json | jq '.'
```

**List all snapshots:**

```bash
docker exec charon ls -lh /app/data/caddy/config-*.json
```
@@ -137,6 +146,7 @@ docker exec charon ls -lh /app/data/caddy/config-*.json

### Container Logs Analysis (Last 100 Lines)

**Command:**

```bash
docker logs charon --tail 100 2>&1
```
@@ -145,6 +155,7 @@ docker logs charon --tail 100 2>&1

✅ **Caddy is operational and proxying traffic successfully**

**Log Evidence:**

- **Proxy Traffic:** Successfully handling requests to nzbget, sonarr, radarr, seerr
- **Health Check:** `GET /api/v1/health` returning 200 OK
- **HTTP/2 & HTTP/3:** Properly negotiating protocols
@@ -152,9 +163,11 @@ docker logs charon --tail 100 2>&1
- **No Config Errors:** Zero errors related to configuration application or Caddy startup

**Secondary Issue Detected (Non-Blocking):**

```
{"level":"error","msg":"failed to connect to LAPI, retrying in 10s: API error: access forbidden"}
```

- **Component:** CrowdSec bouncer integration
- **Impact:** Does NOT affect Caddy functionality or proxy operations
- **Action:** Check CrowdSec API key configuration if CrowdSec integration is required
@@ -177,11 +190,13 @@ docker logs charon --tail 100 2>&1

If you need a static reference file for debugging or documentation:

**Option 1: Export current config from Admin API**

```bash
curl -s http://localhost:2019/config/ | jq '.' > /tmp/caddy-current-config.json
```

**Option 2: Copy latest snapshot**

```bash
docker exec charon cat /app/data/caddy/config-1766170642.json > /tmp/caddy-snapshot.json
```
@@ -3,10 +3,12 @@

## Summary

**Root Cause Identified:** The 500 error is caused by an invalid Caddy configuration structure where `trusted_proxies` is set as an **object** at the **handler level** (within `reverse_proxy`), but Caddy's `http.handlers.reverse_proxy` expects it to be either:

1. An **array of strings** at the handler level, OR
2. An **object** only at the **server level**

The error from Caddy logs:

```
json: cannot unmarshal object into Go struct field Handler.trusted_proxies of type []string
```
@@ -350,6 +352,7 @@ if len(setHeaders) > 0 {

**File:** `backend/internal/caddy/types_extra_test.go`

Update the tests that expect the object structure:

- L87-93: `TestReverseProxyHandler_StandardHeadersEnabled`
- L133-139: `TestReverseProxyHandler_WebSocketWithApplication`
- L256-279: `TestReverseProxyHandler_TrustedProxiesConfiguration`
@@ -361,12 +364,14 @@ Update tests that expect the object structure:

After applying the fix:

1. **Rebuild container:**

```bash
docker build --no-cache -t charon:local .
docker compose -f docker-compose.override.yml up -d
```

2. **Check logs for successful config application:**

```bash
docker logs charon 2>&1 | grep -i "caddy config"
# Should see: "Successfully applied initial Caddy config"
```

@@ -379,6 +384,7 @@ After applying the fix:

- Should succeed (200 response, no 500 error)

4. **Verify Caddy config:**

```bash
curl -s http://localhost:2019/config/ | jq '.apps.http.servers.charon_server.trusted_proxies'
# Should show server-level trusted_proxies (not in individual routes)
```
@@ -64,9 +64,9 @@ Compose a focused, minimum-touch investigation to locate why the referenced GitH

### Phase 4 — Fix design (1 request)

- Add deterministic timeouts per risky step:
  - `docker/build-push-action` already inherits the job timeout (30m); consider adding `build-args`-side timeouts via `--progress=plain` plus `BUILDKIT_STEP_LOG_MAX_SIZE` to avoid log-buffer stalls.
  - For `Verify Caddy Security Patches`, add an explicit `timeout-minutes: 5` or wrap commands with `timeout 300s` to prevent indefinite waits when registry pulls are slow.
  - For `trivy-pr-app-only`, pin the action version and add `timeout 300s` around `docker build` to surface network hangs.
- If the log shows tests hanging, instrument `scripts/go-test-coverage.sh` and `scripts/frontend-test-coverage.sh` with `set -x`, `CI=1`, and `timeout` wrappers around `go test` / `npm run test -- --runInBand --maxWorkers=2` to avoid runner saturation.

### Phase 5 — Hardening and guardrails (1–2 requests)
@@ -18,6 +18,7 @@

4. **Config Snapshot**: Dated **Dec 19 13:57**, but proxy host was updated at **Dec 19 20:58** ❌

**Root Cause**: The Caddy configuration was **NOT regenerated** after the proxy host update. `ApplyConfig()` either:

- Was not called after the database update, OR
- Was called but failed silently without rolling back the database change, OR
- Succeeded but the generated config didn't include the header due to a logic bug
@@ -29,6 +30,7 @@ This is a **critical disconnect** between the database state and the running Cad

## Evidence Analysis

### 1. Database State (Current)

```sql
SELECT id, name, domain_names, application, websocket_support, enable_standard_headers, updated_at
FROM proxy_hosts WHERE domain_names LIKE '%seerr%';
@@ -38,17 +40,20 @@ FROM proxy_hosts WHERE domain_names LIKE '%seerr%';
```
**Analysis:**

- ✅ `enable_standard_headers = 1` (TRUE)
- ✅ `websocket_support = 1` (TRUE)
- ✅ `application = "none"` (no app-specific overrides)
- ⏰ Last updated: **Dec 19, 2025 at 20:58:31** (8:58 PM)

### 2. Live Caddy Config (Retrieved via API)

```bash
curl -s http://localhost:2019/config/ | jq '.apps.http.servers.charon_server.routes[] | select(.match[].host[] | contains("seerr"))'
```

**Headers Present in Reverse Proxy:**

```json
{
  "Connection": ["{http.request.header.Connection}"],
@@ -60,15 +65,18 @@ curl -s http://localhost:2019/config/ | jq '.apps.http.servers.charon_server.rou
```
**Missing Headers:**

- ❌ `X-Forwarded-Port` - **COMPLETELY ABSENT**

**Analysis:**

- This is NOT a complete "standard headers disabled" situation
- 3 out of 4 standard headers ARE present (X-Real-IP, X-Forwarded-Proto, X-Forwarded-Host)
- Only `X-Forwarded-Port` is missing
- WebSocket headers (Connection, Upgrade) are present as expected

### 3. Config Snapshot File Analysis

```bash
ls -lt /app/data/caddy/
```
@@ -82,7 +90,9 @@ ls -lt /app/data/caddy/

**Time Gap:** **7 hours and 1 minute** between the last config generation and the proxy host update.

### 4. Caddy Access Logs (Real Requests)

From logs at `2025-12-19 21:26:01`:

```json
"headers": {
  "Via": ["2.0 Caddy"],
```
@@ -104,11 +114,13 @@ From logs at `2025-12-19 21:26:01`:

### Issue 1: Config Regeneration Did Not Occur After Update

**Timeline Evidence:**

1. Proxy host updated in database: `2025-12-19 20:58:31`
2. Most recent Caddy config snapshot: `2025-12-19 13:57:00`
3. **Gap:** 7 hours and 1 minute

**Code Path Review** (proxy_host_handler.go Update method):

```go
// Line 375: Database update succeeds
if err := h.service.Update(host); err != nil {
```
@@ -130,6 +142,7 @@ if h.caddyManager != nil {

**Actual Behavior:** The config snapshot timestamp shows no regeneration occurred.

**Possible Causes:**

1. `h.caddyManager` was `nil` (unlikely - other hosts work)
2. `ApplyConfig()` was called but returned an error that was NOT propagated to the UI
3. `ApplyConfig()` succeeded but didn't write a new snapshot (logic bug in snapshot rotation)
@@ -138,6 +151,7 @@ if h.caddyManager != nil {

### Issue 2: Partial Standard Headers in Live Config

**Expected Behavior** (from types.go lines 144-153):

```go
if enableStandardHeaders {
	setHeaders["X-Real-IP"] = []string{"{http.request.remote.host}"}
@@ -148,6 +162,7 @@ if enableStandardHeaders {
```
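The expected-behavior snippet above is truncated by the diff. A runnable sketch of the full four-header set it describes (the map shape and the `{http.request.port}` placeholder are assumptions drawn from Caddy's documented request placeholders, not Charon's actual code):

```go
package main

import "fmt"

// standardHeaders sketches the complete header set that the
// enableStandardHeaders branch is expected to produce.
func standardHeaders(enable bool) map[string][]string {
	h := map[string][]string{}
	if enable {
		h["X-Real-IP"] = []string{"{http.request.remote.host}"}
		h["X-Forwarded-Proto"] = []string{"{http.request.scheme}"}
		h["X-Forwarded-Host"] = []string{"{http.request.host}"}
		// The header missing from the live config:
		h["X-Forwarded-Port"] = []string{"{http.request.port}"}
	}
	return h
}

func main() {
	fmt.Println(len(standardHeaders(true)), len(standardHeaders(false))) // 4 0
}
```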
**Actual Behavior in Live Caddy Config:**

- X-Real-IP: ✅ Present
- X-Forwarded-Proto: ✅ Present
- X-Forwarded-Host: ✅ Present
@@ -155,6 +170,7 @@ if enableStandardHeaders {

**Analysis:**

The presence of 3 out of 4 headers indicates that:

1. The running config was generated when `enableStandardHeaders` was at least partially true, OR
2. There's an **older version of the code** that only added 3 headers, OR
3. The WebSocket + Application logic is interfering with the 4th header
@@ -253,6 +269,7 @@ This is **correct** - the logic properly defaults to `true` when `nil`. The issu

## Cookie and Authorization Headers - NOT THE ISSUE

Caddy's `reverse_proxy` directive **preserves all headers by default**, including:

- `Cookie`
- `Authorization`
- `Accept`
@@ -296,6 +313,7 @@ This will add the required headers to the Seerr route.

### Permanent Fix (Code Change - ALREADY IMPLEMENTED)

The handler fix for the three missing fields was implemented. The fields are now handled in the Update handler:

- `enable_standard_headers` - nullable bool handler added
- `forward_auth_enabled` - regular bool handler added
- `waf_disabled` - regular bool handler added
@@ -349,29 +367,35 @@ for route in routes:

**The root cause is a STALE CONFIGURATION caused by a failed or skipped `ApplyConfig()` call.**

**Evidence:**

- Database: `enable_standard_headers = 1`, `updated_at = 2025-12-19 20:58:31` ✅
- UI: "Standard Proxy Headers" checkbox is ENABLED ✅
- Config Snapshot: Last generated at `2025-12-19 13:57` (7+ hours before the DB update) ❌
- Live Caddy Config: Missing `X-Forwarded-Port` header ❌

**What Happened:**

1. User enabled "Standard Proxy Headers" for Seerr on Dec 19 at 20:58
2. Database UPDATE succeeded
3. `ApplyConfig()` either failed silently or was never called
4. The running config is from an older snapshot that predates the update

**Immediate Action:**

```bash
docker restart charon
```

This will force a complete config regeneration from the current database state.

**Long-term Fixes Needed:**

1. Wrap database updates in transactions that rollback on `ApplyConfig()` failure
2. Add enhanced logging to track config generation success/failure
3. Implement config staleness detection in health checks
4. Verify why the older config is missing `X-Forwarded-Port` (possible code version issue)

**Alternative Immediate Fix (No Restart):**

- Make a trivial change to any proxy host in the UI and save
- This triggers `ApplyConfig()` and regenerates all configs
@@ -653,7 +653,7 @@ func TestCheckIntegrity_PRAGMAError(t *testing.T)

### Phase 2: Error Path Coverage (Target: +8% coverage)

1. **Test 7**: GetLastBackupTime_ListBackupsError
2. **Test 20**: DBHealthHandler_Check_BackupServiceError
3. **Test 14**: Connect_PRAGMAJournalModeError
4. **Test 15**: Connect_IntegrityCheckError
@@ -662,7 +662,7 @@ func TestCheckIntegrity_PRAGMAError(t *testing.T)

### Phase 3: Edge Cases (Target: +5% coverage)

1. **Test 5**: RunScheduledBackup_CleanupDeletesZero
2. **Test 21**: DBHealthHandler_Check_BackupTimeZero
3. **Test 6**: CleanupOldBackups_PartialFailure
4. **Test 8**: CreateBackup_CaddyDirMissing
@@ -671,7 +671,7 @@ func TestCheckIntegrity_PRAGMAError(t *testing.T)

### Phase 4: Constructor & Initialization (Target: +2% coverage)

1. **Test 1**: NewBackupService_BackupDirCreationError
2. **Test 2**: NewBackupService_CronScheduleError
3. **Test 17**: Connect_PRAGMAVerification

@@ -679,7 +679,7 @@ func TestCheckIntegrity_PRAGMAError(t *testing.T)

### Phase 5: Deep Coverage (Final +3%)

1. **Test 10**: addToZip_FileNotFound
2. **Test 11**: addToZip_FileOpenError
3. **Test 9**: CreateBackup_CaddyDirUnreadable
4. **Test 22**: LogCorruptionError_EmptyContext