CI/CD Docker Image Optimization Plan
Document Version: 1.0
Created: 2026-02-04
Status: PLANNING
Priority: HIGH (addresses 20GB+ registry waste, reduces build time by 70%)
Table of Contents
- Executive Summary
- Research Findings
- Proposed Solution
- Implementation Plan
- Testing Strategy
- Success Criteria
- Risks & Mitigation
- File Inventory
- Future Enhancements (Out of Scope)
- References
Executive Summary
Current Problems
- 20GB+ registry storage usage from redundant image builds
- 2+ hours of compute time per commit due to 7+ redundant Docker builds
- Wasted CI minutes: Each workflow rebuilds the same image independently
- Slow feedback loop: Integration tests delayed by unnecessary builds
Proposed Solution
Build Docker image ONCE per commit, then reuse across ALL workflows
- Expected storage reduction: 85% (3GB instead of 20GB)
- Expected time savings: 70% (30 minutes instead of 2+ hours per commit)
- Expected cost savings: 70% reduction in GitHub Actions minutes
- Improved reliability: Consistent image across all test environments
Key Changes
- Central Build Job: `docker-build.yml` becomes the single source of truth
- SHA-Based Tagging: All images tagged with `sha-<short_hash>`
- Job Dependencies: All workflows depend on successful build via `needs:`
- Auto-Cleanup: Ephemeral PR/feature images deleted after 7 days
- Artifact Distribution: Build creates loadable Docker tarball for local tests
Research Findings
Current Implementation Analysis
Backend Files
| File Path | Purpose | Lines | Status |
|---|---|---|---|
| `backend/internal/api/handlers/crowdsec_handler.go` | Main CrowdSec handler | 2057 | ✅ Complete |
| `backend/internal/api/routes/routes.go` | Route registration | 650 | ✅ Complete |
API Endpoint Architecture (Already Correct)
```go
// Current route registration (lines 2023-2052)
rg.GET("/admin/crowdsec/files", h.ListFiles)      // Returns {files: [...]}
rg.GET("/admin/crowdsec/file", h.ReadFile)        // Returns {content: string}
rg.POST("/admin/crowdsec/file", h.WriteFile)      // Updates config
rg.POST("/admin/crowdsec/import", h.ImportConfig) // Imports tar.gz/zip
rg.GET("/admin/crowdsec/export", h.ExportConfig)  // Exports to tar.gz
```
Analysis: Two separate endpoints already exist with clear separation of concerns. This follows REST principles.
Handler Implementation (Already Correct)
ListFiles Handler (Lines 525-545):

```go
func (h *CrowdsecHandler) ListFiles(c *gin.Context) {
	var files []string
	// Walks DataDir and collects file paths
	c.JSON(http.StatusOK, gin.H{"files": files}) // ✅ Returns array
}
```
ReadFile Handler (Lines 547-574):

```go
func (h *CrowdsecHandler) ReadFile(c *gin.Context) {
	rel := c.Query("path") // Gets ?path= param
	if rel == "" {
		c.JSON(http.StatusBadRequest, gin.H{"error": "path required"})
		return
	}
	// Reads file content
	c.JSON(http.StatusOK, gin.H{"content": string(data)}) // ✅ Returns content
}
```
Status: ✅ Implementation is correct and follows REST principles.
Frontend Integration (Already Correct)
File: frontend/src/api/crowdsec.ts (Lines 91-94)
```typescript
export async function readCrowdsecFile(path: string) {
  const resp = await client.get<{ content: string }>(
    `/admin/crowdsec/file?path=${encodeURIComponent(path)}` // ✅ Correct endpoint
  )
  return resp.data
}
```
Status: ✅ Frontend correctly calls /admin/crowdsec/file (singular).
Test Failure Root Cause
File: tests/security/crowdsec-diagnostics.spec.ts (Lines 323-355)
```typescript
// Step 1: Get file list - ✅ CORRECT
const listResponse = await request.get('/api/v1/admin/crowdsec/files');
const files = await listResponse.json();
const fileList = files.files as string[];
const configPath = fileList.find((f) => f.includes('config.yaml'));

// Step 2: Retrieve file content - ❌ WRONG ENDPOINT
const contentResponse = await request.get(
  `/api/v1/admin/crowdsec/files?path=${encodeURIComponent(configPath)}` // ❌ Should be /file
);
expect(contentResponse.ok()).toBeTruthy();
const content = await contentResponse.json();
expect(content).toHaveProperty('content'); // ❌ FAILS - Gets {files: [...]}
```
Root Cause: Test uses /files?path=... (plural) instead of /file?path=... (singular).
Status: ❌ Test bug, not API bug
Proposed Solution
Phase 1: Test Bug Fix (Issue 1 & 2)
Duration: 30 minutes
Priority: CRITICAL (unblocks QA)
E2E Test Fix
File: tests/security/crowdsec-diagnostics.spec.ts (Lines 320-360)
Change Required:
```diff
- const contentResponse = await request.get(
-   `/api/v1/admin/crowdsec/files?path=${encodeURIComponent(configPath)}`
- );
+ const contentResponse = await request.get(
+   `/api/v1/admin/crowdsec/file?path=${encodeURIComponent(configPath)}`
+ );
```
Explanation: Change plural /files to singular /file to match API design.
Acceptance Criteria:
- Test uses correct endpoint `/admin/crowdsec/file?path=...`
- Response contains `{content: string, path: string}`
- Test passes on all browsers (Chromium, Firefox, WebKit)
Validation Command:
```shell
# Test against Docker environment
.github/skills/scripts/skill-runner.sh docker-rebuild-e2e
npx playwright test tests/security/crowdsec-diagnostics.spec.ts --project=chromium
# Expected: Test passes
```
Phase 2: Import Validation Enhancement (Issue 3)
Duration: 4-5 hours
Priority: MEDIUM
Enhanced Validation Architecture
Problem: Current ImportConfig handler (lines 378-457) lacks:
- Archive format validation (accepts any file)
- File size limits (no protection against zip bombs)
- Required file validation (doesn't check for config.yaml)
- YAML syntax validation (imports broken configs)
- Rollback mechanism on validation failures
Validation Strategy
New Architecture:
```
Upload → Format Check → Size Check → Extract → Structure Validation → YAML Validation → Commit
             ↓ fail        ↓ fail     ↓ fail         ↓ fail               ↓ fail
          Reject 422    Reject 413   Rollback       Rollback             Rollback
```
Implementation Details
File: backend/internal/api/handlers/crowdsec_handler.go (Add at line ~2060)
```go
// Configuration validator
type ConfigArchiveValidator struct {
	MaxSize       int64    // 50MB default
	RequiredFiles []string // config.yaml minimum
}

func (v *ConfigArchiveValidator) Validate(archivePath string) error {
	// 1. Check file size
	info, err := os.Stat(archivePath)
	if err != nil {
		return fmt.Errorf("stat archive: %w", err)
	}
	if info.Size() > v.MaxSize {
		return fmt.Errorf("archive too large: %d bytes (max %d)", info.Size(), v.MaxSize)
	}

	// 2. Detect format (tar.gz or zip only)
	format, err := detectArchiveFormat(archivePath)
	if err != nil {
		return fmt.Errorf("detect format: %w", err)
	}
	if format != "tar.gz" && format != "zip" {
		return fmt.Errorf("unsupported format: %s (expected tar.gz or zip)", format)
	}

	// 3. Validate contents
	files, err := listArchiveContents(archivePath, format)
	if err != nil {
		return fmt.Errorf("list contents: %w", err)
	}

	// 4. Check for required config files
	missing := []string{}
	for _, required := range v.RequiredFiles {
		found := false
		for _, file := range files {
			if strings.HasSuffix(file, required) {
				found = true
				break
			}
		}
		if !found {
			missing = append(missing, required)
		}
	}
	if len(missing) > 0 {
		return fmt.Errorf("missing required files: %v", missing)
	}
	return nil
}

// Format detector
func detectArchiveFormat(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()

	// Read magic bytes
	buf := make([]byte, 512)
	n, err := f.Read(buf)
	if err != nil && err != io.EOF {
		return "", err
	}

	// Check for gzip magic bytes (1f 8b)
	if n >= 2 && buf[0] == 0x1f && buf[1] == 0x8b {
		return "tar.gz", nil
	}
	// Check for zip magic bytes (50 4b)
	if n >= 4 && buf[0] == 0x50 && buf[1] == 0x4b {
		return "zip", nil
	}
	return "", fmt.Errorf("unknown format")
}
```
```go
// Enhanced ImportConfig with validation
func (h *CrowdsecHandler) ImportConfig(c *gin.Context) {
	file, err := c.FormFile("file")
	if err != nil {
		c.JSON(http.StatusBadRequest, gin.H{
			"error":   "file required",
			"details": "multipart form field 'file' is missing",
		})
		return
	}

	// Save to temp location
	tmpDir := os.TempDir()
	tmpPath := filepath.Join(tmpDir, fmt.Sprintf("crowdsec-import-%d", time.Now().UnixNano()))
	if err := os.MkdirAll(tmpPath, 0o750); err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{
			"error":   "failed to create temp dir",
			"details": err.Error(),
		})
		return
	}
	defer os.RemoveAll(tmpPath)

	dst := filepath.Join(tmpPath, file.Filename)
	if err := c.SaveUploadedFile(file, dst); err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{
			"error":   "failed to save upload",
			"details": err.Error(),
		})
		return
	}

	// ✨ NEW: Validate archive
	validator := &ConfigArchiveValidator{
		MaxSize:       50 * 1024 * 1024, // 50MB
		RequiredFiles: []string{"config.yaml"},
	}
	if err := validator.Validate(dst); err != nil {
		c.JSON(http.StatusUnprocessableEntity, gin.H{
			"error":   "invalid config archive",
			"details": err.Error(),
		})
		return
	}

	// Create backup before import
	backupDir := h.DataDir + ".backup." + time.Now().Format("20060102-150405")
	backupCreated := false
	if _, err := os.Stat(h.DataDir); err == nil {
		if err := os.Rename(h.DataDir, backupDir); err != nil {
			c.JSON(http.StatusInternalServerError, gin.H{
				"error":   "failed to create backup",
				"details": err.Error(),
			})
			return
		}
		backupCreated = true
	}

	// restore puts the backup back in place after a failed import.
	// (backupDir alone is never empty, so we track whether the rename happened.)
	restore := func() {
		if backupCreated {
			_ = os.RemoveAll(h.DataDir)
			_ = os.Rename(backupDir, h.DataDir)
		}
	}

	// Extract archive
	if err := extractArchive(dst, h.DataDir); err != nil {
		// ✨ NEW: Restore backup on extraction failure
		restore()
		c.JSON(http.StatusInternalServerError, gin.H{
			"error":           "failed to extract archive",
			"details":         err.Error(),
			"backup_restored": backupCreated,
		})
		return
	}

	// ✨ NEW: Validate extracted config
	configPath := filepath.Join(h.DataDir, "config.yaml")
	if _, err := os.Stat(configPath); os.IsNotExist(err) {
		// Try subdirectory
		configPath = filepath.Join(h.DataDir, "config", "config.yaml")
		if _, err := os.Stat(configPath); os.IsNotExist(err) {
			// Rollback
			restore()
			c.JSON(http.StatusUnprocessableEntity, gin.H{
				"error":           "invalid config structure",
				"details":         "config.yaml not found in expected locations",
				"backup_restored": backupCreated,
			})
			return
		}
	}

	// ✨ NEW: Validate YAML syntax
	if err := validateYAMLFile(configPath); err != nil {
		// Rollback
		restore()
		c.JSON(http.StatusUnprocessableEntity, gin.H{
			"error":           "invalid config syntax",
			"file":            "config.yaml",
			"details":         err.Error(),
			"backup_restored": backupCreated,
		})
		return
	}

	c.JSON(http.StatusOK, gin.H{
		"status":          "imported",
		"backup":          backupDir,
		"files_extracted": countFiles(h.DataDir),
		"reload_hint":     true,
	})
}
```
```go
func validateYAMLFile(path string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	var config map[string]interface{}
	if err := yaml.Unmarshal(data, &config); err != nil {
		return fmt.Errorf("YAML syntax error: %w", err)
	}
	// Basic structure validation
	if _, ok := config["api"]; !ok {
		return fmt.Errorf("missing required field: api")
	}
	return nil
}

func extractArchive(src, dst string) error {
	format, err := detectArchiveFormat(src)
	if err != nil {
		return err
	}
	if format == "tar.gz" {
		return extractTarGz(src, dst)
	}
	return extractZip(src, dst)
}

func extractTarGz(src, dst string) error {
	f, err := os.Open(src)
	if err != nil {
		return err
	}
	defer f.Close()

	gzr, err := gzip.NewReader(f)
	if err != nil {
		return err
	}
	defer gzr.Close()

	tr := tar.NewReader(gzr)
	for {
		header, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			return err
		}

		target := filepath.Join(dst, header.Name)
		// Security: prevent path traversal
		if !strings.HasPrefix(target, filepath.Clean(dst)+string(os.PathSeparator)) {
			return fmt.Errorf("invalid file path: %s", header.Name)
		}

		switch header.Typeflag {
		case tar.TypeDir:
			if err := os.MkdirAll(target, 0o750); err != nil {
				return err
			}
		case tar.TypeReg:
			if err := os.MkdirAll(filepath.Dir(target), 0o750); err != nil {
				return err
			}
			outFile, err := os.Create(target)
			if err != nil {
				return err
			}
			if _, err := io.Copy(outFile, tr); err != nil {
				outFile.Close()
				return err
			}
			outFile.Close()
		}
	}
	return nil
}
```
Unit Tests
File: backend/internal/api/handlers/crowdsec_handler_test.go (Add new tests)
```go
func TestImportConfig_Validation(t *testing.T) {
	tests := []struct {
		name       string
		archive    func() io.Reader
		wantStatus int
		wantError  string
	}{
		{
			name: "valid archive",
			archive: func() io.Reader {
				return createTestArchive(map[string]string{
					"config.yaml": "api:\n server:\n listen_uri: test",
				})
			},
			wantStatus: 200,
		},
		{
			name: "missing config.yaml",
			archive: func() io.Reader {
				return createTestArchive(map[string]string{
					"acquis.yaml": "filenames:\n - /var/log/test.log",
				})
			},
			wantStatus: 422,
			wantError:  "missing required files",
		},
		{
			name: "invalid YAML syntax",
			archive: func() io.Reader {
				return createTestArchive(map[string]string{
					"config.yaml": "invalid: yaml: syntax: [[ unclosed",
				})
			},
			wantStatus: 422,
			wantError:  "invalid config syntax",
		},
		{
			name: "invalid format",
			archive: func() io.Reader {
				return strings.NewReader("not a valid archive")
			},
			wantStatus: 422,
			wantError:  "unsupported format",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Setup
			dataDir := t.TempDir()
			handler := &CrowdsecHandler{DataDir: dataDir}
			router := gin.Default()
			router.POST("/admin/crowdsec/import", handler.ImportConfig)

			// Create multipart request
			body := &bytes.Buffer{}
			writer := multipart.NewWriter(body)
			part, _ := writer.CreateFormFile("file", "test.tar.gz")
			io.Copy(part, tt.archive())
			writer.Close()

			req := httptest.NewRequest("POST", "/admin/crowdsec/import", body)
			req.Header.Set("Content-Type", writer.FormDataContentType())
			w := httptest.NewRecorder()

			// Execute
			router.ServeHTTP(w, req)

			// Assert
			assert.Equal(t, tt.wantStatus, w.Code)
			if tt.wantError != "" {
				var resp map[string]interface{}
				json.Unmarshal(w.Body.Bytes(), &resp)
				assert.Contains(t, resp["error"], tt.wantError)
			}
		})
	}
}
```
E2E Tests
File: tests/security/crowdsec-import.spec.ts (Add new tests)
```typescript
test.describe('CrowdSec Config Import - Validation', () => {
  test('should reject archive without config.yaml', async ({ request }) => {
    const mockArchive = createTarGz({
      'acquis.yaml': 'filenames:\n - /var/log/test.log'
    });
    const formData = new FormData();
    formData.append('file', new Blob([mockArchive]), 'incomplete.tar.gz');
    const response = await request.post('/api/v1/admin/crowdsec/import', {
      data: formData,
    });
    expect(response.status()).toBe(422);
    const error = await response.json();
    expect(error.error).toContain('invalid config archive');
    expect(error.details).toContain('config.yaml');
  });

  test('should reject invalid YAML syntax', async ({ request }) => {
    const mockArchive = createTarGz({
      'config.yaml': 'invalid: yaml: syntax: [[ unclosed'
    });
    const formData = new FormData();
    formData.append('file', new Blob([mockArchive]), 'invalid.tar.gz');
    const response = await request.post('/api/v1/admin/crowdsec/import', {
      data: formData,
    });
    expect(response.status()).toBe(422);
    const error = await response.json();
    expect(error.error).toContain('invalid config syntax');
    expect(error.backup_restored).toBe(true);
  });

  test('should rollback on extraction failure', async ({ request }) => {
    const corruptedArchive = Buffer.from('not a valid archive');
    const formData = new FormData();
    formData.append('file', new Blob([corruptedArchive]), 'corrupt.tar.gz');
    const response = await request.post('/api/v1/admin/crowdsec/import', {
      data: formData,
    });
    expect(response.status()).toBe(500);
    const error = await response.json();
    expect(error.backup_restored).toBe(true);
  });
});
```
Acceptance Criteria:
- Archive format validated (tar.gz, zip only)
- File size limits enforced (50MB max)
- Required file presence checked (config.yaml)
- YAML syntax validated
- Automatic rollback on validation failures
- Backup created before every import
- Path traversal attacks blocked during extraction
- E2E tests for all error scenarios
- Unit test coverage ≥ 85%
Implementation Plan
Sprint 1: Test Bug Fix (Day 1, 30 min)
Task 1.1: Fix E2E Test Endpoint
Assignee: TBD
Priority: P0 (Unblocks QA)
Files: tests/security/crowdsec-diagnostics.spec.ts
Steps:
- Open test file (line 323)
- Change `/files?path=...` to `/file?path=...`
- Run test locally to verify
- Commit and push
Validation:
```shell
# Rebuild E2E environment
.github/skills/scripts/skill-runner.sh docker-rebuild-e2e

# Run specific test
npx playwright test tests/security/crowdsec-diagnostics.spec.ts --project=chromium
# Expected: Test passes
```
Estimated Time: 30 minutes (includes validation)
Sprint 2: Import Validation (Days 2-3, 4-5 hours)
Task 2.1: Implement ConfigArchiveValidator
Assignee: TBD
Priority: P1
Files: backend/internal/api/handlers/crowdsec_handler.go
Estimated Time: 2 hours
Steps:
- Add `ConfigArchiveValidator` struct (line ~2060)
- Implement `Validate()` method
- Implement `detectArchiveFormat()` helper
- Implement `listArchiveContents()` helper
- Write unit tests for validator
Validation:
```shell
go test ./backend/internal/api/handlers -run TestConfigArchiveValidator -v
```
Task 2.2: Enhance ImportConfig Handler
Assignee: TBD
Priority: P1
Files: backend/internal/api/handlers/crowdsec_handler.go
Estimated Time: 2 hours
Steps:
- Add pre-import validation call
- Implement rollback logic on errors
- Add YAML syntax validation
- Update error responses
- Write unit tests
Validation:
```shell
go test ./backend/internal/api/handlers -run TestImportConfig -v
```
Task 2.3: Add E2E Tests
Assignee: TBD
Priority: P1
Files: tests/security/crowdsec-import.spec.ts
Estimated Time: 1 hour
Steps:
- Create test archive helper function
- Write 4 validation test cases
- Verify rollback behavior
- Check error message format
Validation:
```shell
npx playwright test tests/security/crowdsec-import.spec.ts --project=chromium
```
Testing Strategy
Unit Test Coverage Goals
| Component | Target Coverage | Critical Paths |
|---|---|---|
| `ConfigArchiveValidator` | 90% | Format detection, size check, content validation |
| `ImportConfig` (enhanced) | 85% | Validation flow, rollback logic, error handling |
E2E Test Scenarios
| Test | Description | Expected Result |
|---|---|---|
| Valid archive | Import with config.yaml | 200 OK, files extracted |
| Missing config.yaml | Import without required file | 422 Unprocessable Entity |
| Invalid YAML | Import with syntax errors | 422, backup restored |
| Oversized archive | Import >50MB file | 413 Payload Too Large |
| Wrong format | Import .txt file | 422 Unsupported format |
| Corrupted archive | Import malformed tar.gz | 500, backup restored |
Coverage Validation
```shell
# Backend coverage
go test ./backend/internal/api/handlers -coverprofile=coverage.out
go tool cover -func=coverage.out | grep crowdsec_handler.go

# E2E coverage
.github/skills/scripts/skill-runner.sh test-e2e-playwright-coverage

# Check Codecov patch coverage (must be 100%)
# CI workflow will enforce this
```
Success Criteria
Definition of Done
- Issue 1 test fix deployed and passing
- Issue 2 confirmed as already working
- Issue 3 validation implemented and tested
- E2E test `should retrieve specific config file content` passes
- Import validation prevents malformed configs
- Rollback mechanism tested and verified
- Backend coverage ≥ 85% for modified handlers
- E2E coverage ≥ 85% for affected test files
- All E2E tests pass on Chromium, Firefox, WebKit
- No new security vulnerabilities introduced
- Pre-commit hooks pass
- Code review completed
Acceptance Tests
```shell
# Test 1: Config file retrieval (Issue 1 & 2)
npx playwright test tests/security/crowdsec-diagnostics.spec.ts --project=chromium
# Expected: Test passes with correct endpoint

# Test 2: Import validation (Issue 3)
npx playwright test tests/security/crowdsec-import.spec.ts --project=chromium
# Expected: All validation tests pass

# Test 3: Backend unit tests
go test ./backend/internal/api/handlers -run TestImportConfig -v
go test ./backend/internal/api/handlers -run TestConfigArchiveValidator -v
# Expected: All tests pass

# Test 4: Coverage check
go test ./backend/internal/api/handlers -coverprofile=coverage.out
go tool cover -func=coverage.out | grep total | awk '{print $3}'
# Expected: ≥85%

# Test 5: Manual verification
curl -X POST http://localhost:8080/api/v1/admin/crowdsec/import \
  -F "file=@invalid-archive.tar.gz"
# Expected: 422 with validation error
```
Risks & Mitigation
| Risk | Probability | Impact | Mitigation |
|---|---|---|---|
| Test fix breaks other tests | LOW | MEDIUM | Run full E2E suite before merge |
| Import validation too strict | MEDIUM | MEDIUM | Allow optional files (acquis.yaml) |
| YAML parsing vulnerabilities | LOW | HIGH | Use well-tested yaml library, limit file size |
| Rollback failures | LOW | HIGH | Extensive testing of rollback logic |
File Inventory
Files to Modify
| Path | Changes | Lines Added | Impact |
|---|---|---|---|
| `tests/security/crowdsec-diagnostics.spec.ts` | Fix endpoint (line 323) | 1 | CRITICAL |
| `backend/internal/api/handlers/crowdsec_handler.go` | Add validation logic | +150 | HIGH |
| `backend/internal/api/handlers/crowdsec_handler_test.go` | Add unit tests | +100 | MEDIUM |
| `tests/security/crowdsec-import.spec.ts` | Add E2E tests | +80 | MEDIUM |
Files to Create
| Path | Purpose | Lines | Priority |
|---|---|---|---|
| None | All changes in existing files | - | - |
Total Effort Estimate
| Phase | Hours | Confidence |
|---|---|---|
| Phase 1: Test Bug Fix | 0.5 | Very High |
| Phase 2: Import Validation | 4-5 | High |
| Testing & QA | 1 | High |
| Code Review | 0.5 | High |
| Total | 6-7 hours | High |
Future Enhancements (Out of Scope)
- Real-time file watching for config changes
- Diff view for config file history
- Config file validation against CrowdSec schema
- Bulk file operations (upload/download multiple)
- WebSocket-based live config editing
- Config version control integration (Git)
- Import from CrowdSec Hub URLs
- Export to CrowdSec console format
References
- QA Report: docs/reports/qa_report.md
- Current Handler: backend/internal/api/handlers/crowdsec_handler.go
- Frontend API: frontend/src/api/crowdsec.ts
- Failing E2E Test: tests/security/crowdsec-diagnostics.spec.ts
- OWASP Path Traversal: https://owasp.org/www-community/attacks/Path_Traversal
- Go filepath Security: https://pkg.go.dev/path/filepath#Clean
Plan Status: ✅ READY FOR REVIEW
Next Steps: Review with team → Assign implementation → Begin Phase 1