Compare commits


175 Commits

Author SHA1 Message Date
GitHub Actions
3fe592926d chore: update electron-to-chromium version to 1.5.342 in package-lock.json 2026-04-22 00:23:17 +00:00
GitHub Actions
5bcf3069c6 chore: ensure both coverage output directories are created in frontend test coverage script 2026-04-22 00:21:53 +00:00
GitHub Actions
6546130518 chore: update QA report with detailed gate status and revalidation results 2026-04-22 00:13:35 +00:00
GitHub Actions
07108cfa8d chore: refactor frontend test coverage script to improve directory handling and cleanup 2026-04-22 00:13:35 +00:00
GitHub Actions
de945c358b chore: update coverage reports directory configuration in vitest 2026-04-22 00:13:35 +00:00
GitHub Actions
e5c7b85f82 chore: enhance accessibility tests by adding route readiness checks 2026-04-22 00:13:35 +00:00
GitHub Actions
6e06cc3396 chore: update security test paths in Playwright configuration 2026-04-22 00:13:35 +00:00
GitHub Actions
7e3b5b13b4 chore: update @tailwindcss packages to version 4.2.4 and tapable to version 2.3.3 2026-04-22 00:13:35 +00:00
GitHub Actions
91ba53476c chore: update QA/Security DoD Audit Report with latest findings and gate statuses 2026-04-22 00:13:35 +00:00
GitHub Actions
442425a4a5 chore: update version to v0.27.0 2026-04-22 00:13:35 +00:00
GitHub Actions
71fe278e33 chore: update Docker client initialization and container listing logic 2026-04-22 00:13:35 +00:00
GitHub Actions
468af25887 chore: add lefthook and backend test output files to .gitignore 2026-04-22 00:13:35 +00:00
GitHub Actions
d437de1ccf chore: add new output files to .gitignore for scan and coverage results 2026-04-22 00:13:35 +00:00
GitHub Actions
8c56f40131 chore: remove unused libc entries and clean up dependencies in package-lock.json 2026-04-22 00:13:35 +00:00
GitHub Actions
2bf4f869ab chore: update vulnerability suppression and documentation for CVE-2026-34040 in .grype.yaml, .trivyignore, and SECURITY.md 2026-04-22 00:13:35 +00:00
GitHub Actions
dd698afa7e chore: update go.mod and go.sum to remove unused dependencies and add new ones 2026-04-22 00:13:35 +00:00
GitHub Actions
5db3f7046c chore: add accessibility test suite documentation and baseline expiration dates 2026-04-22 00:13:35 +00:00
GitHub Actions
b59a788101 chore: include accessibility scans in non-security CI shards
Add automated accessibility suite execution to the standard non-security
end-to-end browser shards so regressions are caught during routine CI runs.

This change is necessary to enforce accessibility checks consistently across
Chromium, Firefox, and WebKit without creating a separate pipeline path.

Behavior impact:
- Non-security shard jobs now run accessibility tests alongside existing suites
- Security-specific job behavior remains unchanged
- Sharding logic remains unchanged, with only test scope expanded

Operational consideration:
- Monitor shard runtime balance after rollout; if sustained skew appears,
  split accessibility coverage into its own sharded workflow stage.
2026-04-22 00:13:35 +00:00
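The shard expansion described in the commit body above could look something like the following Playwright project fragment. This is an illustrative sketch only: the project names, spec globs, and directory layout are assumptions, not taken from the repository's actual `playwright.config.ts`.

```typescript
// Illustrative playwright.config.ts fragment (project names and globs are
// assumptions): accessibility specs run inside the standard non-security
// browser shards rather than a separate pipeline path. Firefox and WebKit
// projects would follow the same pattern as the Chromium one shown here.
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  projects: [
    {
      name: 'chromium',
      use: { ...devices['Desktop Chrome'] },
      // Non-security shard: regular e2e suites plus accessibility specs.
      testMatch: ['e2e/**/*.spec.ts', 'a11y/**/*.a11y.spec.ts'],
      testIgnore: ['security/**'],
    },
    {
      name: 'security',
      use: { ...devices['Desktop Chrome'] },
      // Security-specific job: behavior unchanged by this commit.
      testMatch: ['security/**/*.spec.ts'],
    },
  ],
});
```

Because only `testMatch` is widened, the sharding logic itself is untouched, matching the "only test scope expanded" note in the commit body.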
GitHub Actions
e7460f7e50 chore: update accessibility baseline and enhance loading waits for a11y tests 2026-04-22 00:13:35 +00:00
GitHub Actions
1e1727faa1 chore: add accessibility tests for domains, notifications, setup, and tasks pages 2026-04-22 00:13:35 +00:00
GitHub Actions
0c87c350e5 chore: add accessibility tests for security and uptime pages 2026-04-22 00:13:35 +00:00
GitHub Actions
03101012b9 chore: add accessibility tests for various pages including certificates, dashboard, dns providers, login, proxy hosts, and settings 2026-04-22 00:13:35 +00:00
GitHub Actions
5f855ea779 chore: add accessibility testing support with @axe-core/playwright and related utilities 2026-04-22 00:13:35 +00:00
GitHub Actions
a74d10d138 doc: Integrate @axe-core/playwright for Automated Accessibility Testing
Co-authored-by: Copilot <copilot@github.com>
2026-04-22 00:13:35 +00:00
Jeremy
515a95aaf1 Merge pull request #968 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-21 20:08:35 -04:00
renovate[bot]
1bcb4de6f8 fix(deps): update non-major-updates 2026-04-21 22:49:48 +00:00
Jeremy
07764db43e Merge pull request #966 from Wikid82/renovate/feature/beta-release-non-major-updates
chore(deps): update non-major-updates (feature/beta-release)
2026-04-21 09:12:51 -04:00
renovate[bot]
54f32c03d0 chore(deps): update non-major-updates 2026-04-21 12:38:30 +00:00
Jeremy
c983250327 Merge pull request #965 from Wikid82/development
Propagate changes from development into feature/beta-release
2026-04-20 20:57:07 -04:00
Jeremy
2308f372d7 Merge pull request #964 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-20 17:56:55 -04:00
Jeremy
d68001b949 Merge pull request #963 from Wikid82/main
Propagate changes from main into development
2026-04-20 17:56:25 -04:00
Jeremy
a599623ea9 Merge branch 'development' into main 2026-04-20 17:55:51 -04:00
renovate[bot]
96f0be19a4 fix(deps): update non-major-updates 2026-04-20 21:45:50 +00:00
Jeremy
0f0a442d74 Merge pull request #962 from Wikid82/hotfix/ci
fix(ci): shift GeoLite2 update to Sunday targeting development branch
2026-04-20 12:56:13 -04:00
Jeremy
c1470eaac0 Merge pull request #961 from Wikid82/development
Propagate changes from development into feature/beta-release
2026-04-20 12:37:40 -04:00
GitHub Actions
2123fbca77 fix(ci): shift GeoLite2 update to Sunday targeting development branch
Co-authored-by: Copilot <copilot@github.com>
2026-04-20 16:35:02 +00:00
Jeremy
a8cd4bf34c Merge branch 'feature/beta-release' into development 2026-04-20 12:17:15 -04:00
Jeremy
02911109ef Merge pull request #960 from Wikid82/main
Propagate changes from main into development
2026-04-20 08:50:29 -04:00
GitHub Actions
2bad9fec53 fix: make URL preview invite modal test deterministic 2026-04-20 12:48:33 +00:00
Jeremy
54ce6f677c Merge pull request #959 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-20 08:34:32 -04:00
Jeremy
26a75f5fe3 Merge branch 'development' into main 2026-04-20 08:26:40 -04:00
Jeremy
ad7704c1df Merge branch 'feature/beta-release' into renovate/feature/beta-release-non-major-updates 2026-04-20 08:02:55 -04:00
Jeremy
877fee487b Merge pull request #958 from Wikid82/bot/update-geolite2-checksum
chore(docker): update GeoLite2-Country.mmdb checksum
2026-04-20 07:57:00 -04:00
GitHub Actions
330ccae82f fix: update vulnerability suppression for buger/jsonparser to reflect upstream fix availability 2026-04-20 11:56:26 +00:00
renovate[bot]
0a5bb296a9 fix(deps): update non-major-updates 2026-04-20 11:56:08 +00:00
GitHub Actions
437a35bd47 fix: replace div with button for close action in whitelist delete modal
Co-authored-by: Copilot <copilot@github.com>
2026-04-20 11:29:10 +00:00
GitHub Actions
612d3655fa fix: improve IP normalization in normalizeIPOrCIDR function
Co-authored-by: Copilot <copilot@github.com>
2026-04-20 11:27:56 +00:00
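The comparison does not show what `normalizeIPOrCIDR` actually does, but a hypothetical sketch of such a normalizer (function body entirely ours, for illustration only) might be:

```typescript
// Hypothetical sketch of an IP/CIDR normalizer like the normalizeIPOrCIDR
// mentioned in the commit above; the real implementation is not shown in
// this comparison.
function normalizeIPOrCIDR(input: string): string {
  const value = input.trim().toLowerCase();
  // Bare IPv4 address: canonicalize to a single-host CIDR.
  if (/^\d{1,3}(\.\d{1,3}){3}$/.test(value)) {
    return `${value}/32`;
  }
  // Anything already carrying a prefix (or an IPv6 literal) passes through
  // after trimming and lowercasing.
  return value;
}
```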
GitHub Actions
38cdc5d9d0 fix(deps): update @oxc-project/types and @rolldown dependencies to version 0.126.0 and 1.0.0-rc.16 respectively 2026-04-20 11:16:56 +00:00
GitHub Actions
816124634b fix(deps): update @oxc-parser dependencies to version 0.126.0 and remove unused packages 2026-04-20 11:16:20 +00:00
GitHub Actions
2b2f3c876b chore: fix Renovate lookup failure for google/uuid dependency 2026-04-20 11:02:31 +00:00
Jeremy
20f2624653 Merge pull request #957 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-20 06:51:03 -04:00
Wikid82
6509bb5d1b chore(docker): update GeoLite2-Country.mmdb checksum
Automated checksum update for GeoLite2-Country.mmdb database.

Old: b018842033872f19ed9ccefb863ec954f8024db2ae913d0d4ea14e35ace4eba1
New: 62049119bd084e19fff4689bebe258f18a5f27a386e6d26ba5180941b613fc2b

Auto-generated by: .github/workflows/update-geolite2.yml
2026-04-20 02:58:45 +00:00
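A checksum bump like the one above can be verified locally before trusting a downloaded database. A minimal sketch using Node's standard library (the helper names are ours; the expected value would come from the commit's "New:" line):

```typescript
// Verify a downloaded file against an expected SHA-256 checksum, as the
// update-geolite2.yml workflow above does for GeoLite2-Country.mmdb.
import { createHash } from 'node:crypto';
import { readFileSync } from 'node:fs';

function sha256Hex(data: Buffer | string): string {
  return createHash('sha256').update(data).digest('hex');
}

function verifyChecksum(path: string, expected: string): boolean {
  // Compare case-insensitively: digest('hex') always emits lowercase.
  return sha256Hex(readFileSync(path)) === expected.toLowerCase();
}
```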
Jeremy
e8724c5edc Merge branch 'feature/beta-release' into renovate/feature/beta-release-non-major-updates 2026-04-19 17:13:04 -04:00
GitHub Actions
2c284bdd49 test: add tests for handling empty UUID in DeleteWhitelist and invalid CIDR in Add method 2026-04-19 21:11:14 +00:00
GitHub Actions
db1e77ceb3 test(coverage): cover all modified lines for 100% patch coverage vs origin/main
- Add domains field to certificate mock to exercise per-domain loop
  in Dashboard component, covering the previously untested branch
- Extend CrowdSec whitelist test suite with backdrop-click close test
  to cover the dialog dismissal handler
- Remove duplicate describe blocks introduced when whitelist API tests
  were appended to crowdsec.test.ts, resolving ESLint vitest/no-identical-title
  errors that were blocking pre-commit hooks
2026-04-19 21:08:26 +00:00
GitHub Actions
df5e69236a fix(deps): update dependencies for improved stability and performance 2026-04-19 21:03:48 +00:00
renovate[bot]
a3259b042d fix(deps): update non-major-updates 2026-04-19 17:10:33 +00:00
GitHub Actions
f5e7c2bdfc fix(test): resolve CrowdSec card title lookup in Security test mock
The Security component renders the CrowdSec card title using the nested
translation key 'security.crowdsec.title', but the test mock only had the
flat key 'security.crowdsec'. The mock fallback returns the key string
itself when a lookup misses, causing getByText('CrowdSec') to find nothing.

Added 'security.crowdsec.title' to the securityTranslations map so the
mock resolves to the expected 'CrowdSec' string, matching the component's
actual t() call and allowing the title assertion to pass.
2026-04-18 01:39:06 +00:00
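The fallback behavior the commit describes can be sketched in a few lines; everything beyond the two translation keys named in the commit message is an assumption:

```typescript
// Sketch of the test mock's translation lookup: the fallback returns the
// key itself when a lookup misses, so before the fix the nested key
// 'security.crowdsec.title' surfaced as the raw key string instead of
// 'CrowdSec', and getByText('CrowdSec') found nothing.
const securityTranslations: Record<string, string> = {
  'security.crowdsec': 'CrowdSec',        // flat key: all the mock had originally
  'security.crowdsec.title': 'CrowdSec',  // nested key added by this fix
};

function t(key: string): string {
  return securityTranslations[key] ?? key; // miss -> key string itself
}
```

With the nested key present, the mock resolves the component's actual `t('security.crowdsec.title')` call to the expected string.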
GitHub Actions
0859ab31ab fix(deps): update modernc.org/sqlite to version 1.49.1 for improved functionality 2026-04-18 01:36:58 +00:00
GitHub Actions
c02219cc92 fix(deps): update @asamuzakjp/dom-selector, @humanfs/core, @humanfs/node, and hasown to latest versions; add @humanfs/types dependency 2026-04-18 01:35:43 +00:00
GitHub Actions
d73b3aee5c fix(deps): update @humanfs/core and @humanfs/node to latest versions and add @humanfs/types dependency 2026-04-18 01:34:43 +00:00
Jeremy
80eb91e9a1 Merge pull request #956 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-17 21:33:31 -04:00
renovate[bot]
aa6c751007 fix(deps): update non-major-updates 2026-04-17 20:39:46 +00:00
GitHub Actions
1af786e7c8 fix: update eslint-plugin-react-hooks and typescript to latest versions for improved compatibility 2026-04-16 23:53:11 +00:00
GitHub Actions
c46c1976a2 fix: update typescript to version 6.0.3 for improved functionality and security 2026-04-16 23:52:39 +00:00
GitHub Actions
3b3ea83ecd chore: add database error handling tests for whitelist service and handler 2026-04-16 23:51:01 +00:00
GitHub Actions
5980a8081c fix: improve regex for delete button name matching in CrowdSec IP Whitelist tests 2026-04-16 14:12:07 +00:00
GitHub Actions
55f64f8050 fix: update translation keys for CrowdSec security titles and badges 2026-04-16 14:07:36 +00:00
GitHub Actions
983ae34147 fix(docker): persist CrowdSec LAPI database across container rebuilds 2026-04-16 14:04:15 +00:00
GitHub Actions
4232c0a8ee fix: update benchmark-action/github-action-benchmark to v1.22.0 and mlugg/setup-zig to v2.2.1 for improved security and functionality 2026-04-16 13:34:36 +00:00
GitHub Actions
402a8b3105 fix: update electron-to-chromium, eslint-plugin-sonarjs, minimatch, and ts-api-utils to latest versions 2026-04-16 13:34:36 +00:00
GitHub Actions
f46bb838ca feat: add QA audit report for CrowdSec IP Whitelist Management 2026-04-16 13:34:36 +00:00
GitHub Actions
3d0179a119 fix: update @asamuzakjp/css-color and @asamuzakjp/dom-selector to latest versions and add @asamuzakjp/generational-cache dependency 2026-04-16 13:34:36 +00:00
GitHub Actions
557b33dc73 fix: update docker/go-connections dependency to v0.7.0 2026-04-16 13:34:36 +00:00
GitHub Actions
2a1652d0b1 feat: add IP whitelist management details to architecture documentation 2026-04-16 13:34:36 +00:00
GitHub Actions
f0fdf9b752 test: update response key for whitelist entries and add validation test for missing fields 2026-04-16 13:34:36 +00:00
GitHub Actions
973efd6412 fix: initialize WhitelistSvc only if db is not nil and update error message in AddWhitelist handler 2026-04-16 13:34:36 +00:00
GitHub Actions
028342c63a fix: update JSON response key for whitelist entries in ListWhitelists handler 2026-04-16 13:34:36 +00:00
GitHub Actions
eb9b907ba3 feat: add end-to-end tests for CrowdSec IP whitelist management 2026-04-16 13:34:36 +00:00
GitHub Actions
aee0eeef82 feat: add unit tests for useCrowdSecWhitelist hooks 2026-04-16 13:34:36 +00:00
GitHub Actions
c977cf6190 feat: add whitelist management functionality to CrowdSecConfig 2026-04-16 13:34:36 +00:00
GitHub Actions
28bc73bb1a feat: add whitelist management hooks for querying and mutating whitelist entries 2026-04-16 13:34:36 +00:00
GitHub Actions
19719693b0 feat: add unit tests for CrowdSecWhitelistService and CrowdsecHandler 2026-04-16 13:34:36 +00:00
GitHub Actions
a243066691 feat: regenerate whitelist YAML on CrowdSec startup 2026-04-16 13:34:36 +00:00
GitHub Actions
741a59c333 feat: add whitelist management endpoints to CrowdsecHandler 2026-04-16 13:34:36 +00:00
GitHub Actions
5642a37c44 feat: implement CrowdSecWhitelistService for managing IP/CIDR whitelists 2026-04-16 13:34:36 +00:00
GitHub Actions
1726a19cb6 feat: add CrowdSecWhitelist model and integrate into API route registration 2026-04-16 13:34:36 +00:00
GitHub Actions
40090cda23 feat: add installation of crowdsecurity/whitelists parser 2026-04-16 13:34:36 +00:00
Jeremy
9945fac150 Merge branch 'development' into feature/beta-release 2026-04-16 09:33:49 -04:00
Jeremy
9c416599f8 Merge pull request #955 from Wikid82/renovate/development-non-major-updates
chore(deps): update node.js to 8510330 (development)
2026-04-16 09:33:22 -04:00
Jeremy
abf88ab4cb Merge pull request #954 from Wikid82/renovate/feature/beta-release-non-major-updates
chore(deps): update non-major-updates (feature/beta-release)
2026-04-16 09:33:04 -04:00
renovate[bot]
34903cdd49 chore(deps): update node.js to 8510330 2026-04-16 13:26:43 +00:00
renovate[bot]
98c720987d chore(deps): update non-major-updates 2026-04-16 13:26:37 +00:00
Jeremy
1bd7eab223 Merge pull request #953 from Wikid82/development
Propagate changes from development into feature/beta-release
2026-04-16 09:25:43 -04:00
Jeremy
080e17d85a Merge pull request #951 from Wikid82/main
chore(config): migrate config .github/renovate.json
2026-04-15 13:23:05 -04:00
Jeremy
a059edf60d Merge pull request #950 from Wikid82/main
chore(config): migrate config .github/renovate.json
2026-04-15 13:22:15 -04:00
GitHub Actions
0a3b64ba5c fix: correct misplaced env block in propagate-changes workflow 2026-04-15 17:19:19 +00:00
Jeremy
8ee0d0403a Merge pull request #949 from Wikid82/renovate/migrate-config
chore(config): migrate Renovate config
2026-04-15 13:07:10 -04:00
renovate[bot]
9dab9186e5 chore(config): migrate config .github/renovate.json 2026-04-15 17:05:08 +00:00
Jeremy
c63e4a3d6b Merge pull request #928 from Wikid82/feature/beta-release
feat: Custom Certificate Upload & Management
2026-04-15 12:54:04 -04:00
GitHub Actions
0e8ff1bc2a fix(deps): update @napi-rs/wasm-runtime and postcss to latest versions 2026-04-15 16:09:12 +00:00
Jeremy
683967bbfc Merge pull request #948 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-15 12:05:15 -04:00
renovate[bot]
15947616a9 fix(deps): update non-major-updates 2026-04-15 16:02:03 +00:00
GitHub Actions
813985a903 fix(dependencies): update mongo-driver to v2.5.1 2026-04-15 11:38:35 +00:00
GitHub Actions
bd48c17aab chore: update dependencies for prettier and std-env in package-lock.json 2026-04-15 11:37:28 +00:00
GitHub Actions
8239a94938 chore: Add tests for CertificateList and CertificateUploadDialog components
- Implement test to deselect a row checkbox in CertificateList by clicking it a second time.
- Add test to close detail dialog via the close button in CertificateList.
- Add test to close export dialog via the cancel button in CertificateList.
- Add test to show KEY format badge when a .key file is uploaded in CertificateUploadDialog.
- Add test to ensure no format badge is shown for unknown file extensions in CertificateUploadDialog.
2026-04-15 11:35:10 +00:00
GitHub Actions
fb8d80f6a3 fix: correct CertificateUploadDialog tests to provide required key file 2026-04-14 20:40:26 +00:00
GitHub Actions
8090c12556 feat(proxy-host): enhance certificate handling and update form integration 2026-04-14 20:35:11 +00:00
GitHub Actions
0e0d42c9fd fix(certificates): mark key file as aria-required for PEM/DER cert uploads 2026-04-14 19:10:57 +00:00
GitHub Actions
14b48f23b6 fix: add key file requirement message for PEM/DER certificates in CertificateUploadDialog 2026-04-14 16:35:37 +00:00
GitHub Actions
0c0adf0e5a fix: refactor context handling in Register tests for improved cleanup 2026-04-14 16:33:54 +00:00
GitHub Actions
135edd208c fix: update caniuse-lite to version 1.0.30001788 for improved compatibility 2026-04-14 12:58:15 +00:00
GitHub Actions
81a083a634 fix: resolve CI test failures and close patch coverage gaps 2026-04-14 12:42:22 +00:00
GitHub Actions
149a2071c3 fix: update electron-to-chromium to version 1.5.336 for improved compatibility 2026-04-14 02:35:05 +00:00
GitHub Actions
027a1b1f18 fix: replace fireEvent with userEvent for file uploads in CertificateUploadDialog tests 2026-04-14 02:33:25 +00:00
GitHub Actions
7adf39a6a0 fix: update axe-core to version 4.11.3 for improved functionality and security 2026-04-14 02:33:25 +00:00
Jeremy
5408ebc95b Merge pull request #947 from Wikid82/renovate/feature/beta-release-actions-upload-pages-artifact-5.x
chore(deps): update actions/upload-pages-artifact action to v5 (feature/beta-release)
2026-04-13 22:32:42 -04:00
Jeremy
92a90bb8a1 Merge pull request #946 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-13 22:32:26 -04:00
renovate[bot]
6391532b2d fix(deps): update non-major-updates 2026-04-14 01:08:04 +00:00
renovate[bot]
a161163508 chore(deps): update actions/upload-pages-artifact action to v5 2026-04-13 20:32:41 +00:00
GitHub Actions
5b6bf945d9 fix: add key_file validation for PEM/DER uploads and resolve CI test failures 2026-04-13 19:56:35 +00:00
GitHub Actions
877a32f180 fix: enhance form validation for certificate upload by adding required attributes and adjusting test logic 2026-04-13 17:31:05 +00:00
GitHub Actions
1fe8a79ea3 fix: update @typescript-eslint packages to version 8.58.2 and undici to version 7.25.0 2026-04-13 17:29:26 +00:00
GitHub Actions
7c8e8c001c fix: enhance error handling in ConvertPEMToPFX for empty certificate cases 2026-04-13 14:12:47 +00:00
GitHub Actions
29c56ab283 fix: add context parameter to route registration functions for improved lifecycle management 2026-04-13 14:12:47 +00:00
GitHub Actions
0391f2b3e3 fix: add PFX password parameter to ExportCertificate method and update tests 2026-04-13 14:12:47 +00:00
GitHub Actions
942f585dd1 fix: improve error response format in certificate validation 2026-04-13 14:12:47 +00:00
GitHub Actions
3005db6943 fix: remove unnecessary string checks for key file in Upload method 2026-04-13 14:12:47 +00:00
GitHub Actions
f3c33dc81b fix: update golang.org/x/term to v0.42.0 for compatibility improvements 2026-04-13 14:12:47 +00:00
Jeremy
44e2bdec95 Merge branch 'development' into feature/beta-release 2026-04-13 09:25:51 -04:00
Jeremy
d71fc0b95f Merge pull request #945 from Wikid82/renovate/development-pin-dependencies
chore(deps): pin dependencies (development)
2026-04-13 09:18:48 -04:00
renovate[bot]
f295788ac1 chore(deps): pin dependencies 2026-04-13 13:17:54 +00:00
GitHub Actions
c19aa55fd7 chore: update package-lock.json to upgrade dependencies for improved stability 2026-04-13 13:10:40 +00:00
GitHub Actions
ea3d93253f fix: update CADDY_SECURITY_VERSION to 1.1.62 for improved security 2026-04-13 13:10:40 +00:00
Jeremy
114dca89c6 Merge pull request #944 from Wikid82/renovate/feature/beta-release-major-7-github-artifact-actions
chore(deps): update actions/upload-artifact action to v7 (feature/beta-release)
2026-04-13 09:05:00 -04:00
Jeremy
c7932fa1d9 Merge pull request #942 from Wikid82/renovate/feature/beta-release-actions-setup-go-6.x
chore(deps): update actions/setup-go action to v6 (feature/beta-release)
2026-04-13 09:03:23 -04:00
renovate[bot]
f0ffc27ca7 chore(deps): update actions/upload-artifact action to v7 2026-04-13 13:02:54 +00:00
Jeremy
4dfcf70c08 Merge pull request #941 from Wikid82/renovate/feature/beta-release-actions-github-script-9.x
chore(deps): update actions/github-script action to v9 (feature/beta-release)
2026-04-13 09:02:37 -04:00
Jeremy
71b34061d9 Merge pull request #940 from Wikid82/renovate/feature/beta-release-actions-checkout-6.x
chore(deps): update actions/checkout action to v6 (feature/beta-release)
2026-04-13 09:02:14 -04:00
renovate[bot]
368130b07a chore(deps): update actions/setup-go action to v6 2026-04-13 13:01:36 +00:00
renovate[bot]
85216ba6e0 chore(deps): update actions/github-script action to v9 2026-04-13 13:01:30 +00:00
renovate[bot]
06aacdee98 chore(deps): update actions/checkout action to v6 2026-04-13 13:01:24 +00:00
Jeremy
ef44ae40ec Merge branch 'development' into feature/beta-release 2026-04-13 08:49:52 -04:00
Jeremy
26ea2e9da1 Merge pull request #937 from Wikid82/main
Propagate changes from main into development
2026-04-13 08:49:17 -04:00
Jeremy
b90da3740c Merge pull request #936 from Wikid82/renovate/feature/beta-release-non-major-updates
chore(deps): update renovatebot/github-action action to v46.1.9 (feature/beta-release)
2026-04-13 08:48:48 -04:00
GitHub Actions
0ae1dc998a test: update certificate deletion tests to use string UUIDs instead of integers 2026-04-13 12:04:47 +00:00
Jeremy
44f475778f Merge branch 'feature/beta-release' into renovate/feature/beta-release-non-major-updates 2026-04-13 00:42:41 -04:00
GitHub Actions
48f6b7a12b fix: update Dockerfile to include musl and musl-utils in apk upgrade for improved compatibility 2026-04-13 04:40:02 +00:00
renovate[bot]
122e1fc20b chore(deps): update renovatebot/github-action action to v46.1.9 2026-04-13 04:38:53 +00:00
GitHub Actions
850550c5da test: update common name display test to match exact text 2026-04-13 04:38:26 +00:00
GitHub Actions
3b4fa064d6 test: add end-to-end tests for certificate export dialog functionality 2026-04-13 04:32:26 +00:00
GitHub Actions
78a9231c8a chore: add test_output.txt to .gitignore to exclude test output files from version control 2026-04-13 04:24:16 +00:00
GitHub Actions
e88a4c7982 chore: update package-lock.json to remove unused dependencies and improve overall package management 2026-04-13 04:10:16 +00:00
GitHub Actions
9c056faec7 fix: downgrade versions of css-color, brace-expansion, baseline-browser-mapping, and electron-to-chromium for compatibility 2026-04-13 04:07:49 +00:00
GitHub Actions
e865fa2b8b chore: update package.json and package-lock.json to include vitest and coverage dependencies 2026-04-13 04:03:30 +00:00
GitHub Actions
e1bc648dfc test: add certificate feature unit tests and null-safety fix
Add comprehensive unit tests for the certificate upload, export,
and detail management feature:

- CertificateExportDialog: 21 tests covering format selection,
  blob download, error handling, and password-protected exports
- CertificateUploadDialog: 23 tests covering file validation,
  format detection, drag-and-drop, and upload flow
- CertificateDetailDialog: 19 tests covering detail display,
  loading state, missing fields, and branch coverage
- CertificateChainViewer: 8 tests covering chain visualization
- CertificateValidationPreview: 16 tests covering validation display
- FileDropZone: 18 tests covering drag-and-drop interactions
- useCertificates hooks: 10 tests covering all React Query hooks
- certificates API: 7 new tests for previously uncovered endpoints

Fix null-safety issue in ProxyHosts where cert.domains could be
undefined, causing a runtime error on split().

Frontend patch coverage: 90.6%, overall lines: 89.09%
2026-04-13 04:02:31 +00:00
GitHub Actions
9d8d97e556 fix: update @csstools/css-calc, @csstools/css-color-parser, @tanstack/query-core, globals, builtin-modules, knip, and undici to latest versions for improved functionality and security 2026-04-13 04:02:31 +00:00
GitHub Actions
9dc55675ca fix: update Coraza Caddy version to 2.5.0 for compatibility 2026-04-13 04:01:31 +00:00
GitHub Actions
30c9d735aa feat: add certificate export and upload dialogs
- Implemented CertificateExportDialog for exporting certificates in various formats (PEM, PFX, DER) with options to include private keys and set passwords.
- Created CertificateUploadDialog for uploading certificates, including validation and support for multiple file types (certificates, private keys, chain files).
- Updated DeleteCertificateDialog to use 'domains' instead of 'domain' for consistency.
- Refactored BulkDeleteCertificateDialog and DeleteCertificateDialog tests to accommodate changes in certificate structure.
- Added FileDropZone component for improved file upload experience.
- Enhanced translation files with new keys for certificate management features.
- Updated Certificates page to utilize the new CertificateUploadDialog and clean up the upload logic.
- Adjusted Dashboard and ProxyHosts pages to reflect changes in certificate data structure.
2026-04-13 04:01:31 +00:00
GitHub Actions
e49ea7061a fix: add go-pkcs12 v0.7.1 for PKCS#12 support 2026-04-13 04:01:31 +00:00
GitHub Actions
5c50d8b314 fix: update brace-expansion version to 1.1.14 for improved compatibility 2026-04-13 04:01:30 +00:00
Jeremy
af95c1bdb3 Merge pull request #934 from Wikid82/renovate/feature/beta-release-softprops-action-gh-release-3.x
chore(deps): update softprops/action-gh-release action to v3 (feature/beta-release)
2026-04-12 21:14:11 -04:00
renovate[bot]
01e3d910f1 chore(deps): update softprops/action-gh-release action to v3 2026-04-13 01:12:42 +00:00
Jeremy
1230694f55 Merge pull request #933 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-12 21:06:36 -04:00
renovate[bot]
77f15a225f fix(deps): update non-major-updates 2026-04-12 16:50:55 +00:00
Jeremy
d75abb80d1 Merge pull request #932 from Wikid82/renovate/feature/beta-release-non-major-updates
fix(deps): update non-major-updates (feature/beta-release)
2026-04-11 16:19:08 -04:00
GitHub Actions
42bc897610 fix: enhance certificate deletion handling with UUID validation and logging improvements 2026-04-11 17:54:42 +00:00
renovate[bot]
b15f7c3fbc fix(deps): update non-major-updates 2026-04-11 17:47:55 +00:00
GitHub Actions
bb99dacecd fix: update zlib and add libcrypto3 and libssl3 for improved security 2026-04-11 17:33:44 +00:00
GitHub Actions
4b925418f2 feat: Add certificate validation service with parsing and metadata extraction
- Implemented certificate parsing for PEM, DER, and PFX formats.
- Added functions to validate key matches and certificate chains.
- Introduced metadata extraction for certificates including common name, domains, and issuer organization.
- Created unit tests for all new functionalities to ensure reliability and correctness.
2026-04-11 07:17:45 +00:00
GitHub Actions
9e82efd23a fix: downgrade delve version from 1.26.2 to 1.26.1 for compatibility 2026-04-11 00:11:25 +00:00
GitHub Actions
8f7c10440c chore: align agent and instruction files with single-PR commit-slicing model
- Rewrote commit slicing guidance in Management, Planning, and subagent
  instruction files to enforce one-feature-one-PR with ordered logical commits
- Removed multi-PR branching logic from the execution workflow
- Prevents partial feature merges that cause user confusion on self-hosted tools
- All cross-references now use "Commit N" instead of "PR-N"
2026-04-10 23:41:05 +00:00
GitHub Actions
a439e1d467 fix: add git to Dockerfile dependencies for improved build capabilities 2026-04-10 21:03:54 +00:00
Jeremy
718a957ad9 Merge branch 'development' into feature/beta-release 2026-04-10 16:53:27 -04:00
GitHub Actions
059ff9c6b4 fix: update Go version from 1.26.1 to 1.26.2 in Dockerfile and documentation for security improvements 2026-04-10 20:48:46 +00:00
162 changed files with 19647 additions and 9122 deletions


@@ -303,6 +303,19 @@ ACQUIS_EOF
# Also handle case where it might be without trailing slash
sed -i 's|log_dir: /var/log$|log_dir: /var/log/crowdsec|g' "$CS_CONFIG_DIR/config.yaml"
# Redirect CrowdSec LAPI database to persistent volume
# Default path /var/lib/crowdsec/data/crowdsec.db is ephemeral (not volume-mounted),
# so it is destroyed on every container rebuild. The bouncer API key (stored on the
# persistent volume at /app/data/crowdsec/) survives rebuilds but the LAPI database
# that validates it does not — causing perpetual key rejection.
# Redirecting db_path to the volume-mounted CS_DATA_DIR fixes this.
sed -i "s|db_path: /var/lib/crowdsec/data/crowdsec.db|db_path: ${CS_DATA_DIR}/crowdsec.db|g" "$CS_CONFIG_DIR/config.yaml"
if grep -q "db_path:.*${CS_DATA_DIR}" "$CS_CONFIG_DIR/config.yaml"; then
echo "✓ CrowdSec LAPI database redirected to persistent volume: ${CS_DATA_DIR}/crowdsec.db"
else
echo "⚠️ WARNING: Could not verify LAPI db_path redirect — bouncer keys may not survive rebuilds"
fi
# Verify LAPI configuration was applied correctly
if grep -q "listen_uri:.*:8085" "$CS_CONFIG_DIR/config.yaml"; then
echo "✓ CrowdSec LAPI configured for port 8085"


@@ -43,7 +43,7 @@ You are "lazy" in the smartest way possible. You never do what a subordinate can
- **Identify Goal**: Understand the user's request.
- **STOP**: Do not look at the code. Do not run `list_dir`. No code is to be changed or implemented until there is a fundamentally sound plan of action that has been approved by the user.
- **Action**: Immediately call `Planning` subagent.
- *Prompt*: "Research the necessary files for '{user_request}' and write a comprehensive plan detailing as many specifics as possible to `docs/plans/current_spec.md`. Be an artist with directions and descriptions. Include file names, function names, and component names wherever possible. Break the plan into phases based on the least amount of requests. Include a Commit Slicing Strategy section that decides whether to split work into multiple PRs and, when split, defines PR-1/PR-2/PR-3 scope, dependencies, and acceptance criteria. Review and suggest updates to `.gitignore`, `codecov.yml`, `.dockerignore`, and `Dockerfile` if necessary. Return only when the plan is complete."
- *Prompt*: "Research the necessary files for '{user_request}' and write a comprehensive plan detailing as many specifics as possible to `docs/plans/current_spec.md`. Be an artist with directions and descriptions. Include file names, function names, and component names wherever possible. Break the plan into phases based on the least amount of requests. Include a Commit Slicing Strategy section that organizes work into logical commits within a single PR — one feature = one PR, with ordered commits (Commit 1, Commit 2, …) each defining scope, files, dependencies, and validation gates. Review and suggest updates to `.gitignore`, `codecov.yml`, `.dockerignore`, and `Dockerfile` if necessary. Return only when the plan is complete."
- **Task Specifics**:
- If the task is to just run tests or audits, there is no need for a plan. Directly call `QA_Security` to perform the tests and write the report. If issues are found, return to `Planning` for a remediation plan and delegate the fixes to the corresponding subagents.
@@ -59,15 +59,13 @@ You are "lazy" in the smartest way possible. You never do what a subordinate can
- **Ask**: "Plan created. Shall I authorize the construction?"
4. **Phase 4: Execution (Waterfall)**:
- **Single-PR or Multi-PR Decision**: Read the Commit Slicing Strategy in `docs/plans/current_spec.md`.
- **If single PR**:
- **Read Commit Slicing Strategy**: Read the Commit Slicing Strategy in `docs/plans/current_spec.md` to understand the ordered commits.
- **Single PR, Multiple Commits**: All work ships as one PR. Each commit maps to a phase in the plan.
- **Backend**: Call `Backend_Dev` with the plan file.
- **Frontend**: Call `Frontend_Dev` with the plan file.
- **If multi-PR**:
- Execute in PR slices, one slice at a time, in dependency order.
- Require each slice to pass review + QA gates before starting the next slice.
- Keep every slice deployable and independently testable.
  - **MANDATORY**: Implementation agents must perform linting and type checks locally before declaring their slice "DONE". This critical step must not be skipped; skipping it risks broken commits and security issues.
- Execute commits in dependency order. Each commit must pass its validation gates before the next commit begins.
- The PR is merged only when all commits are complete and all DoD gates pass.
  - **MANDATORY**: Implementation agents must perform linting and type checks locally before declaring their commit "DONE". This critical step must not be skipped; skipping it risks broken commits and security issues.
5. **Phase 5: Review**:
- **Supervisor**: Call `Supervisor` to review the implementation against the plan. Provide feedback and ensure alignment with best practices.
@@ -80,7 +78,7 @@ You are "lazy" in the smartest way possible. You never do what a subordinate can
- **Docs**: Call `Docs_Writer`.
  - **Manual Testing**: create a new test plan in `docs/issues/*.md` for tracking manual testing focused on finding potential bugs in the implemented features.
- **Final Report**: Summarize the successful subagent runs.
- **PR Roadmap**: If split mode was used, include a concise roadmap of completed and remaining PR slices.
- **Commit Roadmap**: Include a concise summary of completed and remaining commits within the PR.
**Mandatory Commit Message**: When you reach a stopping point, provide a copy-and-paste commit message in a code block at the END of the response, in the format laid out in `.github/instructions/commit-message.instructions.md`.
- **STRICT RULES**:

View File

@@ -38,7 +38,7 @@ You are a PRINCIPAL ARCHITECT responsible for technical planning and system desi
- Specify database schema changes
- Document component interactions and data flow
- Identify potential risks and mitigation strategies
- Determine PR sizing and whether to split the work into multiple PRs for safer and faster review
- Determine commit sizing and how to organize work into logical commits within a single PR for safer and faster review
3. **Documentation**:
- Write plan to `docs/plans/current_spec.md`
@@ -46,10 +46,10 @@ You are a PRINCIPAL ARCHITECT responsible for technical planning and system desi
- Break down into implementable tasks using examples, diagrams, and tables
- Estimate complexity for each component
- Add a **Commit Slicing Strategy** section with:
- Decision: single PR or multiple PRs
- Decision: single PR with ordered logical commits (one feature = one PR)
- Trigger reasons (scope, risk, cross-domain changes, review size)
- Ordered PR slices (`PR-1`, `PR-2`, ...), each with scope, files, dependencies, and validation gates
- Rollback and contingency notes per slice
- Ordered commits (`Commit 1`, `Commit 2`, ...), each with scope, files, dependencies, and validation gates
- Rollback and contingency notes for the PR as a whole
4. **Handoff**:
  - Once the plan is approved, delegate to the `Supervisor` agent for review.
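
As a hedged illustration of the handoff artifact, the resulting Commit Slicing Strategy section in `docs/plans/current_spec.md` might read like the sketch below. File names, commit contents, and gate commands are illustrative assumptions, not part of the spec:

```
## Commit Slicing Strategy
- Decision: single PR with ordered logical commits (one feature = one PR)
- Trigger reasons: cross-domain change (backend + frontend), moderate review size
- Commit 1: backend endpoint
  - Scope: handler and service files (e.g. `backend/handlers/record.go`)
  - Dependencies: none
  - Validation gates: `go test ./...`, `golangci-lint run`
- Commit 2: frontend form
  - Scope: e.g. `frontend/src/components/RecordForm.tsx`
  - Dependencies: Commit 1
  - Validation gates: `npm run lint`, `npm run test`
- Rollback and contingency: revert the PR as a whole if either commit regresses
```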

View File

@@ -23,21 +23,21 @@ runSubagent({
- Validate: `plan_file` exists and contains a `Handoff Contract` JSON.
- Kickoff: call `Planning` to create the plan if not present.
- Decide: check if work should be split into multiple PRs (size, risk, cross-domain impact).
- Decide: check how to organize work into logical commits within a single PR (size, risk, cross-domain impact).
- Run: execute `Backend Dev` then `Frontend Dev` sequentially.
- Parallel: run `QA and Security`, `DevOps` and `Doc Writer` in parallel for CI / QA checks and documentation.
- Return: a JSON summary with `subagent_results`, `overall_status`, and aggregated artifacts.
2.1) Multi-Commit Slicing Protocol
- If a task is large or high-risk, split into PR slices and execute in order.
- Each slice must have:
- All work for a single feature ships as one PR with ordered logical commits.
- Each commit must have:
- Scope boundary (what is included/excluded)
- Dependency on previous slices
- Validation gates (tests/scans required for that slice)
- Explicit rollback notes
- Do not start the next slice until the current slice is complete and verified.
- Keep each slice independently reviewable and deployable.
- Dependency on previous commits
- Validation gates (tests/scans required for that commit)
- Explicit rollback notes for the PR as a whole
- Do not start the next commit until the current commit is complete and verified.
- Keep each commit independently reviewable within the PR.
3) Return Contract that all subagents must return
@@ -55,7 +55,7 @@ runSubagent({
- On a subagent failure, the Management agent must capture `tests.output` and decide to retry (1 retry maximum), or request a revert/rollback.
- Clearly mark the `status` as `failed`, and include `errors` and `failing_tests` in the `summary`.
- For multi-PR execution, mark failed slice as blocked and stop downstream slices until resolved.
- For multi-commit execution, mark failed commit as blocked and stop downstream commits until resolved.
5) Example: Run a full Feature Implementation

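The contract fields referenced in the failure-handling rules (`subagent_results`, `overall_status`, `status`, `errors`, `failing_tests`, `tests.output`) can be illustrated with a minimal sketch. This is a hypothetical object, not the authoritative schema; the agent label and messages are made up:

```javascript
// Hypothetical return-contract shape; field names come from the rules above,
// everything else (agent label, error text) is illustrative.
const result = {
  overall_status: 'failed',
  subagent_results: [
    {
      agent: 'Backend Dev',                                    // illustrative label
      status: 'failed',
      tests: { output: '--- FAIL: TestCreateRecord (0.03s)' }, // captured on failure
      summary: {
        errors: ['lint: unused variable in handler.go'],
        failing_tests: ['TestCreateRecord']
      }
    }
  ]
};

// Per the protocol, a failed commit blocks downstream commits until resolved.
const blocked = result.subagent_results.some(r => r.status === 'failed');
console.log(blocked); // true
```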
.github/renovate.json vendored
View File

@@ -6,10 +6,9 @@
":separateMultipleMajorReleases",
"helpers:pinGitHubActionDigests"
],
"baseBranches": [
"baseBranchPatterns": [
"feature/beta-release",
"development"
],
"postUpdateOptions": ["npmDedupe"],
"timezone": "America/New_York",
@@ -247,20 +246,24 @@
],
"github-actions": {
"fileMatch": ["^\\.github/skills/examples/.*\\.ya?ml$"]
"managerFilePatterns": [
"/^\\.github/skills/examples/.*\\.ya?ml$/"
]
},
"packageRules": [
{
"description": "THE MEGAZORD: Group ALL non-major updates (NPM, Docker, Go, Actions) into one PR",
"matchPackagePatterns": ["*"],
"matchUpdateTypes": [
"minor",
"patch",
"pin",
"digest"
],
"groupName": "non-major-updates"
"groupName": "non-major-updates",
"matchPackageNames": [
"*"
]
},
{
"description": "Feature branches: Auto-merge non-major updates after proven stable",
@@ -321,6 +324,12 @@
"matchDatasources": ["go"],
"matchPackageNames": ["github.com/oschwald/geoip2-golang/v2"],
"sourceUrl": "https://github.com/oschwald/geoip2-golang"
},
{
"description": "Fix Renovate lookup for google/uuid",
"matchDatasources": ["go"],
"matchPackageNames": ["github.com/google/uuid"],
"sourceUrl": "https://github.com/google/uuid"
}
]
}

View File

@@ -20,10 +20,10 @@ jobs:
steps:
- name: Checkout Code
uses: actions/checkout@v4
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version: "1.26.2"
@@ -56,7 +56,7 @@ jobs:
- name: Comment on PR
if: always() && github.event_name == 'pull_request'
uses: actions/github-script@v7
uses: actions/github-script@3a2844b7e9c422d3c10d287c895573f7108da1b3 # v9
with:
script: |
const critical = ${{ steps.parse-report.outputs.critical }};
@@ -89,7 +89,7 @@ jobs:
- name: Upload GORM Scan Report
if: always()
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7
with:
name: gorm-security-report-${{ github.run_id }}
path: docs/reports/gorm-scan-ci-*.txt

View File

@@ -89,7 +89,7 @@ jobs:
- name: Create GitHub Release (creates tag via API)
if: ${{ steps.semver.outputs.changed == 'true' && steps.check_release.outputs.exists == 'false' }}
uses: softprops/action-gh-release@153bb8e04406b158c6c84fc1615b65b24149a1fe # v2
uses: softprops/action-gh-release@b4309332981a82ec1c5618f44dd2e27cc8bfbfda # v3
with:
tag_name: ${{ steps.determine_tag.outputs.tag }}
name: Release ${{ steps.determine_tag.outputs.tag }}

View File

@@ -52,7 +52,7 @@ jobs:
# This avoids gh-pages branch errors and permission issues on fork PRs
if: github.event.workflow_run.event == 'push' && github.event.workflow_run.head_branch == 'main'
# Security: Pinned to full SHA for supply chain security
uses: benchmark-action/github-action-benchmark@4e0b38bc48375986542b13c0d8976b7b80c60c00 # v1
uses: benchmark-action/github-action-benchmark@a60cea5bc7b49e15c1f58f411161f99e0df48372 # v1.22.0
with:
name: Go Benchmark
tool: 'go'

View File

@@ -166,7 +166,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'

View File

@@ -52,7 +52,7 @@ jobs:
run: bash scripts/ci/check-codeql-parity.sh
- name: Initialize CodeQL
uses: github/codeql-action/init@c10b8064de6f491fea524254123dbe5e09572f13 # v4
uses: github/codeql-action/init@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4
with:
languages: ${{ matrix.language }}
queries: security-and-quality
@@ -92,10 +92,10 @@ jobs:
run: mkdir -p sarif-results
- name: Autobuild
uses: github/codeql-action/autobuild@c10b8064de6f491fea524254123dbe5e09572f13 # v4
uses: github/codeql-action/autobuild@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@c10b8064de6f491fea524254123dbe5e09572f13 # v4
uses: github/codeql-action/analyze@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4
with:
category: "/language:${{ matrix.language }}"
output: sarif-results/${{ matrix.language }}

View File

@@ -541,7 +541,7 @@ jobs:
format: 'table'
severity: 'CRITICAL,HIGH'
exit-code: '0'
version: 'v0.69.3'
version: 'v0.70.0'
continue-on-error: true
- name: Run Trivy vulnerability scanner (SARIF)
@@ -553,7 +553,7 @@ jobs:
format: 'sarif'
output: 'trivy-results.sarif'
severity: 'CRITICAL,HIGH'
version: 'v0.69.3'
version: 'v0.70.0'
continue-on-error: true
- name: Check Trivy SARIF exists
@@ -568,7 +568,7 @@ jobs:
- name: Upload Trivy results
if: env.TRIGGER_EVENT != 'pull_request' && steps.skip.outputs.skip_build != 'true' && steps.trivy-check.outputs.exists == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-results.sarif'
category: '.github/workflows/docker-build.yml:build-and-push'
@@ -701,7 +701,7 @@ jobs:
format: 'table'
severity: 'CRITICAL,HIGH'
exit-code: '0'
version: 'v0.69.3'
version: 'v0.70.0'
- name: Run Trivy scan on PR image (SARIF - blocking)
id: trivy-scan
@@ -712,7 +712,7 @@ jobs:
output: 'trivy-pr-results.sarif'
severity: 'CRITICAL,HIGH'
exit-code: '1' # Intended to block, but continued on error for now
version: 'v0.69.3'
version: 'v0.70.0'
continue-on-error: true
- name: Check Trivy PR SARIF exists
@@ -727,14 +727,14 @@ jobs:
- name: Upload Trivy scan results
if: always() && steps.trivy-pr-check.outputs.exists == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-pr-results.sarif'
category: 'docker-pr-image'
- name: Upload Trivy compatibility results (docker-build category)
if: always() && steps.trivy-pr-check.outputs.exists == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-pr-results.sarif'
category: '.github/workflows/docker-build.yml:build-and-push'
@@ -742,7 +742,7 @@ jobs:
- name: Upload Trivy compatibility results (docker-publish alias)
if: always() && steps.trivy-pr-check.outputs.exists == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-pr-results.sarif'
category: '.github/workflows/docker-publish.yml:build-and-push'
@@ -750,7 +750,7 @@ jobs:
- name: Upload Trivy compatibility results (nightly alias)
if: always() && steps.trivy-pr-check.outputs.exists == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-pr-results.sarif'
category: 'trivy-nightly'

View File

@@ -44,7 +44,7 @@ jobs:
ref: ${{ github.event.workflow_run.head_sha || github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}

View File

@@ -38,7 +38,7 @@ jobs:
# Step 2: Set up Node.js (for building any JS-based doc tools)
- name: 🔧 Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
@@ -352,7 +352,7 @@ jobs:
# Step 4: Upload the built site
- name: 📤 Upload artifact
uses: actions/upload-pages-artifact@7b1f4a764d45c48632c6b24a0339c27f5614fb0b # v4
uses: actions/upload-pages-artifact@fc324d3547104276b827a68afc52ff2a11cc49c9 # v5
with:
path: '_site'

View File

@@ -151,14 +151,14 @@ jobs:
- name: Set up Node.js
if: steps.resolve-image.outputs.image_source == 'build'
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Cache npm dependencies
if: steps.resolve-image.outputs.image_source == 'build'
uses: actions/cache@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v5
with:
path: ~/.npm
key: npm-${{ hashFiles('package-lock.json') }}
@@ -225,7 +225,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -427,7 +427,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -637,7 +637,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -859,7 +859,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -980,6 +980,7 @@ jobs:
--project=chromium \
--shard=${{ matrix.shard }}/${{ matrix.total-shards }} \
--output=playwright-output/chromium-shard-${{ matrix.shard }} \
tests/a11y \
tests/core \
tests/dns-provider-crud.spec.ts \
tests/dns-provider-types.spec.ts \
@@ -1096,7 +1097,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -1225,6 +1226,7 @@ jobs:
--project=firefox \
--shard=${{ matrix.shard }}/${{ matrix.total-shards }} \
--output=playwright-output/firefox-shard-${{ matrix.shard }} \
tests/a11y \
tests/core \
tests/dns-provider-crud.spec.ts \
tests/dns-provider-types.spec.ts \
@@ -1341,7 +1343,7 @@ jobs:
ref: ${{ github.sha }}
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
@@ -1470,6 +1472,7 @@ jobs:
--project=webkit \
--shard=${{ matrix.shard }}/${{ matrix.total-shards }} \
--output=playwright-output/webkit-shard-${{ matrix.shard }} \
tests/a11y \
tests/core \
tests/dns-provider-crud.spec.ts \
tests/dns-provider-types.spec.ts \

View File

@@ -464,11 +464,11 @@ jobs:
image-ref: ${{ env.GHCR_REGISTRY }}/${{ env.IMAGE_NAME }}:nightly@${{ needs.build-and-push-nightly.outputs.digest }}
format: 'sarif'
output: 'trivy-nightly.sarif'
version: 'v0.69.3'
version: 'v0.70.0'
trivyignores: '.trivyignore'
- name: Upload Trivy results
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-nightly.sarif'
category: 'trivy-nightly'

View File

@@ -28,7 +28,7 @@ jobs:
(github.event.workflow_run.head_branch == 'main' || github.event.workflow_run.head_branch == 'development')
steps:
- name: Set up Node (for github-script)
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
@@ -37,6 +37,8 @@ jobs:
env:
CURRENT_BRANCH: ${{ github.event.workflow_run.head_branch || github.ref_name }}
CURRENT_SHA: ${{ github.event.workflow_run.head_sha || github.sha }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
CHARON_TOKEN: ${{ secrets.CHARON_TOKEN }}
with:
script: |
const currentBranch = process.env.CURRENT_BRANCH || context.ref.replace('refs/heads/', '');
@@ -133,7 +135,9 @@ jobs:
const sensitive = files.some(fn => configPaths.some(sp => fn.startsWith(sp) || fn.includes(sp)));
if (sensitive) {
core.info(`${src} -> ${base} contains sensitive changes (${files.join(', ')}). Skipping automatic propagation.`);
const preview = files.slice(0, 25).join(', ');
const suffix = files.length > 25 ? ` …(+${files.length - 25} more)` : '';
core.info(`${src} -> ${base} contains sensitive changes (${preview}${suffix}). Skipping automatic propagation.`);
return;
}
} catch (error) {
@@ -203,6 +207,3 @@ jobs:
await createPR('development', targetBranch);
}
}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
CHARON_TOKEN: ${{ secrets.CHARON_TOKEN }}

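The logging tweak in this hunk caps the listed file names at 25 and appends a count of the remainder. A standalone sketch of that truncation (the helper name is hypothetical; the workflow script inlines the two expressions):

```javascript
// Mirror of the inlined preview/suffix expressions from the workflow script:
// list at most `limit` file names, then append a count of the remainder.
function previewFiles(files, limit = 25) {
  const preview = files.slice(0, limit).join(', ');
  const suffix = files.length > limit ? ` …(+${files.length - limit} more)` : '';
  return `${preview}${suffix}`;
}

console.log(previewFiles(['a.yml', 'b.yml'])); // a.yml, b.yml
```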
View File

@@ -262,7 +262,7 @@ jobs:
bash "scripts/repo_health_check.sh"
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6.4.0
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'

View File

@@ -52,7 +52,7 @@ jobs:
cache-dependency-path: backend/go.sum
- name: Set up Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v6
with:
node-version: ${{ env.NODE_VERSION }}
@@ -67,7 +67,7 @@ jobs:
- name: Install Cross-Compilation Tools (Zig)
# Security: Pinned to full SHA for supply chain security
uses: goto-bus-stop/setup-zig@abea47f85e598557f500fa1fd2ab7464fcb39406 # v2
uses: mlugg/setup-zig@d1434d08867e3ee9daa34448df10607b98908d29 # v2.2.1
with:
version: 0.13.0
@@ -75,7 +75,7 @@ jobs:
- name: Run GoReleaser
uses: goreleaser/goreleaser-action@ec59f474b9834571250b370d4735c50f8e2d1e29 # v7
uses: goreleaser/goreleaser-action@e24998b8b67b290c2fa8b7c14fcfa7de2c5c9b8c # v7
with:
distribution: goreleaser
version: '~> v2.5'

View File

@@ -33,7 +33,7 @@ jobs:
go-version: ${{ env.GO_VERSION }}
- name: Run Renovate
uses: renovatebot/github-action@b67590ea780158ccd13192c22a3655a5231f869d # v46.1.8
uses: renovatebot/github-action@83ec54fee49ab67d9cd201084c1ff325b4b462e4 # v46.1.10
with:
configurationFile: .github/renovate.json
token: ${{ secrets.RENOVATE_TOKEN || secrets.GITHUB_TOKEN }}

View File

@@ -102,7 +102,7 @@ jobs:
format: 'table'
severity: 'CRITICAL,HIGH'
exit-code: '1' # Fail workflow if vulnerabilities found
version: 'v0.69.3'
version: 'v0.70.0'
continue-on-error: true
- name: Run Trivy vulnerability scanner (SARIF)
@@ -113,10 +113,10 @@ jobs:
format: 'sarif'
output: 'trivy-weekly-results.sarif'
severity: 'CRITICAL,HIGH,MEDIUM'
version: 'v0.69.3'
version: 'v0.70.0'
- name: Upload Trivy results to GitHub Security
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4.35.1
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4.35.2
with:
sarif_file: 'trivy-weekly-results.sarif'
@@ -127,7 +127,7 @@ jobs:
format: 'json'
output: 'trivy-weekly-results.json'
severity: 'CRITICAL,HIGH,MEDIUM,LOW'
version: 'v0.69.3'
version: 'v0.70.0'
- name: Upload Trivy JSON results
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7

View File

@@ -362,7 +362,7 @@ jobs:
- name: Upload SARIF to GitHub Security
if: steps.check-artifact.outputs.artifact_found == 'true'
uses: github/codeql-action/upload-sarif@c10b8064de6f491fea524254123dbe5e09572f13 # v4
uses: github/codeql-action/upload-sarif@95e58e9a2cdfd71adc6e0353d5c52f41a045d225 # v4
continue-on-error: true
with:
sarif_file: grype-results.sarif

View File

@@ -2,7 +2,7 @@ name: Update GeoLite2 Checksum
on:
schedule:
- cron: '0 2 * * 1' # Weekly on Mondays at 2 AM UTC
- cron: '0 2 * * 0' # Weekly on Sundays at 2 AM UTC
workflow_dispatch:
permissions:
@@ -141,7 +141,8 @@ jobs:
---
**Auto-generated by:** `.github/workflows/update-geolite2.yml`
**Trigger:** Scheduled weekly check (Mondays 2 AM UTC)
- **Trigger:** Scheduled weekly check (Sundays 2 AM UTC)
base: development
branch: bot/update-geolite2-checksum
delete-branch: true
commit-message: |
@@ -182,7 +183,7 @@ jobs:
### Workflow Details
- **Run URL:** ${runUrl}
- **Triggered:** ${context.eventName === 'schedule' ? 'Scheduled (weekly)' : 'Manual dispatch'}
- **Triggered:** ${context.eventName === 'schedule' ? 'Scheduled (weekly, Sundays)' : 'Manual dispatch'}
- **Timestamp:** ${new Date().toISOString()}
### Required Actions

.gitignore vendored
View File

@@ -314,3 +314,12 @@ validation-evidence/**
.github/agents/# Tools Configuration.md
docs/reports/codecove_patch_report.md
vuln-results.json
test_output.txt
coverage_results.txt
final-results.json
new-results.json
scan_output.json
coverage_output.txt
frontend/lint_output.txt
lefthook_out.txt
backend/test_out.txt

View File

@@ -203,45 +203,47 @@ ignore:
# GHSA-6g7g-w4f8-9c9x: buger/jsonparser Delete panic on malformed JSON (DoS)
# Severity: HIGH (CVSS 7.5)
# Package: github.com/buger/jsonparser v1.1.1 (embedded in /usr/local/bin/crowdsec and /usr/local/bin/cscli)
# Status: NO upstream fix available — OSV marks "Last affected: v1.1.1" with no Fixed event
# Status: UPSTREAM FIX EXISTS (v1.1.2 released 2026-03-20) — awaiting CrowdSec to update dependency
# NOTE: As of 2026-04-20, grype v0.111.0 with fresh DB no longer flags this finding in the image.
# This suppression is retained as a safety net in case future DB updates re-surface it.
#
# Vulnerability Details:
# - The Delete function fails to validate offsets on malformed JSON input, producing a
# negative slice index and a runtime panic — denial of service (CWE-125).
# - CVSSv3: AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
#
# Root Cause (Third-Party Binary + No Upstream Fix):
# Root Cause (Third-Party Binary — Fix Exists Upstream, Not Yet in CrowdSec):
# - Charon does not use buger/jsonparser directly. It is compiled into CrowdSec binaries.
# - The buger/jsonparser repository has no released fix as of 2026-03-19 (GitHub issue #275
# and golang/vulndb #4514 are both open).
# - Fix path: once buger/jsonparser releases a patched version and CrowdSec updates their
# dependency, rebuild the Docker image and remove this suppression.
# - buger/jsonparser released v1.1.2 on 2026-03-20 fixing issue #275.
# - CrowdSec has not yet released a version built with buger/jsonparser v1.1.2.
# - Fix path: once CrowdSec updates their dependency and rebuilds, rebuild the Docker image
# and remove this suppression.
#
# Risk Assessment: ACCEPTED (Limited exploitability + no upstream fix)
# Risk Assessment: ACCEPTED (Limited exploitability; fix exists upstream but not yet in CrowdSec)
# - The DoS vector requires passing malformed JSON to the vulnerable Delete function within
# CrowdSec's internal processing pipeline; this is not a direct attack surface in Charon.
# - CrowdSec's exposed surface is its HTTP API (not raw JSON stream parsing via this path).
#
# Mitigation (active while suppression is in effect):
# - Monitor buger/jsonparser: https://github.com/buger/jsonparser/issues/275
# - Monitor CrowdSec releases: https://github.com/crowdsecurity/crowdsec/releases
# - Monitor CrowdSec releases for a build using buger/jsonparser >= v1.1.2.
# - CrowdSec releases: https://github.com/crowdsecurity/crowdsec/releases
# - Weekly CI security rebuild flags the moment a fixed image ships.
#
# Review:
# - Reviewed 2026-03-19 (initial suppression): no upstream fix exists. Set 30-day review.
# - Extended 2026-04-04: no upstream fix available. buger/jsonparser issue #275 still open.
# - Next review: 2026-05-19. Remove suppression once buger/jsonparser ships a fix and
# CrowdSec updates their dependency.
# - Reviewed 2026-03-19 (initial suppression): no upstream fix. Set 30-day review.
# - Extended 2026-04-04: no upstream fix. buger/jsonparser issue #275 still open.
# - Updated 2026-04-20: buger/jsonparser v1.1.2 released 2026-03-20. CrowdSec not yet updated.
# Grype v0.111.0 with fresh DB (2026-04-20) no longer flags this finding. Suppression retained
# as a safety net. Next review: 2026-05-19 — remove if CrowdSec ships with v1.1.2+.
#
# Removal Criteria:
# - buger/jsonparser releases a patched version (v1.1.2 or higher)
# - CrowdSec releases a version built with the patched jsonparser
# - CrowdSec releases a version built with buger/jsonparser >= v1.1.2
# - Rebuild Docker image, run security-scan-docker-image, confirm finding is resolved
# - Remove this entry and the corresponding .trivyignore entry simultaneously
#
# References:
# - GHSA-6g7g-w4f8-9c9x: https://github.com/advisories/GHSA-6g7g-w4f8-9c9x
# - Upstream issue: https://github.com/buger/jsonparser/issues/275
# - Upstream fix: https://github.com/buger/jsonparser/releases/tag/v1.1.2
# - golang/vulndb: https://github.com/golang/vulndb/issues/4514
# - CrowdSec releases: https://github.com/crowdsecurity/crowdsec/releases
- vulnerability: GHSA-6g7g-w4f8-9c9x
@@ -251,21 +253,20 @@ ignore:
type: go-module
reason: |
HIGH — DoS panic via malformed JSON in buger/jsonparser v1.1.1 embedded in CrowdSec binaries.
No upstream fix: buger/jsonparser has no released patch as of 2026-03-19 (issue #275 open).
Charon does not use this package directly; the vector requires reaching CrowdSec's internal
JSON processing pipeline. Risk accepted; no remediation path until upstream ships a fix.
Reviewed 2026-03-19: no patched release available.
expiry: "2026-05-19" # Extended 2026-04-04: no upstream fix. Next review 2026-05-19.
Upstream fix: buger/jsonparser v1.1.2 released 2026-03-20; CrowdSec has not yet updated their
dependency. Grype no longer flags this as of 2026-04-20 (fresh DB). Suppression retained as
safety net pending CrowdSec update. Charon does not use this package directly.
Updated 2026-04-20: fix v1.1.2 exists upstream; awaiting CrowdSec dependency update.
expiry: "2026-05-19" # Review 2026-05-19: remove if CrowdSec ships with buger/jsonparser >= v1.1.2.
# Action items when this suppression expires:
# 1. Check buger/jsonparser releases: https://github.com/buger/jsonparser/releases
# and issue #275: https://github.com/buger/jsonparser/issues/275
# 2. If a fix has shipped AND CrowdSec has updated their dependency:
# a. Rebuild Docker image and run local security-scan-docker-image
# b. Remove this suppression entry and the corresponding .trivyignore entry
# 3. If no fix yet: Extend expiry by 30 days and update the review comment above
# 4. If extended 3+ times with no progress: Consider opening an issue upstream or
# evaluating whether CrowdSec can replace buger/jsonparser with a safe alternative
# 1. Check if CrowdSec has released a version with buger/jsonparser >= v1.1.2:
# https://github.com/crowdsecurity/crowdsec/releases
# 2. If CrowdSec has updated: rebuild Docker image, run security-scan-docker-image,
# and remove this suppression entry and the corresponding .trivyignore entry
# 3. If grype still does not flag it with fresh DB: consider removing the suppression as
# it may no longer be necessary
# 4. If no CrowdSec update yet: Extend expiry by 30 days
# GHSA-jqcq-xjh3-6g23: pgproto3/v2 DataRow.Decode panic on negative field length (DoS)
# Severity: HIGH (CVSS 7.5)
@@ -482,73 +483,6 @@ ignore:
# 4. If not yet migrated: Extend expiry by 30 days and update the review comment above
# 5. If extended 3+ times: Open an upstream issue on crowdsecurity/crowdsec requesting pgx/v5 migration
# GHSA-x744-4wpc-v9h2 / CVE-2026-34040: Docker AuthZ plugin bypass via oversized request body
# Severity: HIGH (CVSS 8.8)
# CVSS Vector: CVSS:3.1/AV:L/AC:L/PR:L/UI:N/S:C/C:H/I:H/A:H
# CWE: CWE-863 (Incorrect Authorization)
# Package: github.com/docker/docker v28.5.2+incompatible (go-module)
# Status: Fixed in moby/moby v29.3.1 — NO fix available for docker/docker import path
#
# Vulnerability Details:
# - Incomplete fix for Docker AuthZ plugin bypass (CVE-2024-41110). An attacker can send an
# oversized request body to the Docker daemon, causing it to forward the request to the AuthZ
# plugin without the body, allowing unauthorized approvals.
#
# Root Cause (No Fix Available for Import Path):
# - The fix exists in moby/moby v29.3.1, but not for the docker/docker import path that Charon uses.
# - Migration to moby/moby/v2 is not practical: currently beta with breaking changes.
# - Fix path: once docker/docker publishes a patched version or moby/moby/v2 stabilizes,
# update the dependency and remove this suppression.
#
# Risk Assessment: ACCEPTED (Not exploitable in Charon context)
# - Charon uses the Docker client SDK only (list containers). The vulnerability is server-side
# in the Docker daemon's AuthZ plugin handler.
# - Charon does not run a Docker daemon or use AuthZ plugins.
# - The attack vector requires local access to the Docker daemon socket with AuthZ plugins enabled.
#
# Mitigation (active while suppression is in effect):
# - Monitor docker/docker releases: https://github.com/moby/moby/releases
# - Monitor moby/moby/v2 stabilization: https://github.com/moby/moby
# - Weekly CI security rebuild flags the moment a fixed version ships.
#
# Review:
# - Reviewed 2026-03-30 (initial suppression): no fix for docker/docker import path. Set 30-day review.
# - Next review: 2026-04-30. Remove suppression once a fix is available for the docker/docker import path.
#
# Removal Criteria:
# - docker/docker publishes a patched version OR moby/moby/v2 stabilizes and migration is feasible
# - Update dependency, rebuild, run security-scan-docker-image, confirm finding is resolved
# - Remove this entry, the GHSA-pxq6-2prw-chj9 entry, and the corresponding .trivyignore entries simultaneously
#
# References:
# - GHSA-x744-4wpc-v9h2: https://github.com/advisories/GHSA-x744-4wpc-v9h2
# - CVE-2026-34040: https://nvd.nist.gov/vuln/detail/CVE-2026-34040
# - CVE-2024-41110 (original): https://nvd.nist.gov/vuln/detail/CVE-2024-41110
# - moby/moby releases: https://github.com/moby/moby/releases
- vulnerability: GHSA-x744-4wpc-v9h2
package:
name: github.com/docker/docker
version: "v28.5.2+incompatible"
type: go-module
reason: |
HIGH — Docker AuthZ plugin bypass via oversized request body in docker/docker v28.5.2+incompatible.
Incomplete fix for CVE-2024-41110. Fixed in moby/moby v29.3.1 but no fix for docker/docker import path.
Charon uses Docker client SDK only (list containers); the vulnerability is server-side in the Docker
daemon's AuthZ plugin handler. Charon does not run a Docker daemon or use AuthZ plugins.
Risk accepted; no remediation path until docker/docker publishes a fix or moby/moby/v2 stabilizes.
Reviewed 2026-03-30: no patched release available for docker/docker import path.
expiry: "2026-04-30" # 30-day review: no fix for docker/docker import path. Extend in 30-day increments with documented justification.
# Action items when this suppression expires:
# 1. Check docker/docker and moby/moby releases: https://github.com/moby/moby/releases
# 2. Check if moby/moby/v2 has stabilized: https://github.com/moby/moby
# 3. If a fix has shipped for docker/docker import path OR moby/moby/v2 is stable:
# a. Update the dependency and rebuild Docker image
# b. Run local security-scan-docker-image and confirm finding is resolved
# c. Remove this entry, GHSA-pxq6-2prw-chj9 entry, and all corresponding .trivyignore entries
# 4. If no fix yet: Extend expiry by 30 days and update the review comment above
# 5. If extended 3+ times: Open an issue to track moby/moby/v2 migration feasibility
# GHSA-pxq6-2prw-chj9 / CVE-2026-33997: Moby off-by-one error in plugin privilege validation
# Severity: MEDIUM (CVSS 6.8)
# Package: github.com/docker/docker v28.5.2+incompatible (go-module)
@@ -559,9 +493,9 @@ ignore:
# via crafted plugin configurations.
#
# Root Cause (No Fix Available for Import Path):
# - Same import path issue as GHSA-x744-4wpc-v9h2. The fix exists in moby/moby v29.3.1 but not
# - Same import path issue as CVE-2026-34040. The fix exists in moby/moby v29.3.1 but not
# for the docker/docker import path that Charon uses.
# - Fix path: same as GHSA-x744-4wpc-v9h2 — wait for docker/docker patch or moby/moby/v2 stabilization.
# - Fix path: same dependency migration pattern as CVE-2026-34040 (if needed) or upstream fix.
#
# Risk Assessment: ACCEPTED (Not exploitable in Charon context)
# - Charon uses the Docker client SDK only (list containers). The vulnerability is in Docker's
@@ -577,9 +511,9 @@ ignore:
# - Next review: 2026-04-30. Remove suppression once a fix is available for the docker/docker import path.
#
# Removal Criteria:
# - Same as GHSA-x744-4wpc-v9h2: docker/docker publishes a patched version OR moby/moby/v2 stabilizes
# - docker/docker publishes a patched version OR moby/moby/v2 stabilizes
# - Update dependency, rebuild, run security-scan-docker-image, confirm finding is resolved
# - Remove this entry, GHSA-x744-4wpc-v9h2 entry, and all corresponding .trivyignore entries simultaneously
# - Remove this entry and all corresponding .trivyignore entries simultaneously
#
# References:
# - GHSA-pxq6-2prw-chj9: https://github.com/advisories/GHSA-pxq6-2prw-chj9
@@ -605,7 +539,7 @@ ignore:
# 3. If a fix has shipped for docker/docker import path OR moby/moby/v2 is stable:
# a. Update the dependency and rebuild Docker image
# b. Run local security-scan-docker-image and confirm finding is resolved
# c. Remove this entry, GHSA-x744-4wpc-v9h2 entry, and all corresponding .trivyignore entries
# c. Remove this entry and all corresponding .trivyignore entries
# 4. If no fix yet: Extend expiry by 30 days and update the review comment above
# 5. If extended 3+ times: Open an issue to track moby/moby/v2 migration feasibility


@@ -87,23 +87,6 @@ GHSA-x6gf-mpr2-68h6
# exp: 2026-07-09
CVE-2026-32286
# CVE-2026-34040 / GHSA-x744-4wpc-v9h2: Docker AuthZ plugin bypass via oversized request body
# Severity: HIGH (CVSS 8.8) — Package: github.com/docker/docker v28.5.2+incompatible
# Incomplete fix for CVE-2024-41110. Fixed in moby/moby v29.3.1 but no fix for docker/docker import path.
# Charon uses Docker client SDK only (list containers); the vulnerability is server-side in the Docker daemon.
# Review by: 2026-04-30
# See also: .grype.yaml for full justification
# exp: 2026-04-30
CVE-2026-34040
# GHSA-x744-4wpc-v9h2: Docker AuthZ plugin bypass via oversized request body (GHSA alias)
# Severity: HIGH (CVSS 8.8) — Package: github.com/docker/docker v28.5.2+incompatible
# GHSA alias for CVE-2026-34040. See CVE-2026-34040 entry above for full details.
# Review by: 2026-04-30
# See also: .grype.yaml for full justification
# exp: 2026-04-30
GHSA-x744-4wpc-v9h2
# CVE-2026-33997 / GHSA-pxq6-2prw-chj9: Moby off-by-one error in plugin privilege validation
# Severity: MEDIUM (CVSS 6.8) — Package: github.com/docker/docker v28.5.2+incompatible
# Fixed in moby/moby v29.3.1 but no fix for docker/docker import path.


@@ -1 +1 @@
v0.21.0
v0.27.0


@@ -577,6 +577,7 @@ graph LR
- Global threat intelligence (crowd-sourced IP reputation)
- Automatic IP banning with configurable duration
- Decision management API (view, create, delete bans)
- IP whitelist management: operators add/remove IPs and CIDRs via the management UI; entries are persisted in SQLite and regenerated into a `crowdsecurity/whitelists` parser YAML on every mutating operation and at startup
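  The regenerated parser file described above follows CrowdSec's whitelist parser format; a sketch with illustrative values (the exact fields Charon emits are an assumption):

  ```yaml
  # Sketch of a regenerated crowdsecurity/whitelists parser (illustrative values)
  name: crowdsecurity/whitelists
  description: "Whitelist events from operator-managed IPs and CIDRs"
  whitelist:
    reason: "managed via Charon UI"
    ip:
      - "203.0.113.10"
    cidr:
      - "192.0.2.0/24"
  ```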
**Modes:**


@@ -13,7 +13,7 @@ ARG BUILD_DEBUG=0
ARG GO_VERSION=1.26.2
# renovate: datasource=docker depName=alpine versioning=docker
ARG ALPINE_IMAGE=alpine:3.23.3@sha256:25109184c71bdad752c8312a8623239686a9a2071e8825f20acb8f2198c3f659
ARG ALPINE_IMAGE=alpine:3.23.4@sha256:5b10f432ef3da1b8d4c7eb6c487f2f5a8f096bc91145e68878dd4a5019afde11
# ---- Shared CrowdSec Version ----
# renovate: datasource=github-releases depName=crowdsecurity/crowdsec
@@ -43,9 +43,9 @@ ARG CADDY_CANDIDATE_VERSION=2.11.2
ARG CADDY_USE_CANDIDATE=0
ARG CADDY_PATCH_SCENARIO=B
# renovate: datasource=go depName=github.com/greenpau/caddy-security
ARG CADDY_SECURITY_VERSION=1.1.61
ARG CADDY_SECURITY_VERSION=1.1.62
# renovate: datasource=go depName=github.com/corazawaf/coraza-caddy
ARG CORAZA_CADDY_VERSION=2.4.0
ARG CORAZA_CADDY_VERSION=2.5.0
## When an official caddy image tag isn't available on the host, use a
## plain Alpine base image and overwrite its caddy binary with our
## xcaddy-built binary in the later COPY step. This avoids relying on
@@ -92,7 +92,7 @@ RUN --mount=type=cache,target=/root/.cache/go-build \
# ---- Frontend Builder ----
# Build the frontend using the BUILDPLATFORM to avoid arm64 musl Rollup native issues
# renovate: datasource=docker depName=node
FROM --platform=$BUILDPLATFORM node:24.14.1-alpine@sha256:01743339035a5c3c11a373cd7c83aeab6ed1457b55da6a69e014a95ac4e4700b AS frontend-builder
FROM --platform=$BUILDPLATFORM node:24.15.0-alpine@sha256:d1b3b4da11eefd5941e7f0b9cf17783fc99d9c6fc34884a665f40a06dbdfc94f AS frontend-builder
WORKDIR /app/frontend
# Copy frontend package files
@@ -131,7 +131,7 @@ SHELL ["/bin/ash", "-o", "pipefail", "-c"]
ARG TARGETPLATFORM
ARG TARGETARCH
# hadolint ignore=DL3018
RUN apk add --no-cache clang lld
RUN apk add --no-cache git clang lld
# hadolint ignore=DL3059
# hadolint ignore=DL3018
# Install musl (headers + runtime) and gcc for cross-compilation linker
@@ -160,7 +160,7 @@ RUN set -eux; \
# Note: xx-go install puts binaries in /go/bin/TARGETOS_TARGETARCH/dlv if cross-compiling.
# We find it and move it to /go/bin/dlv so it's in a consistent location for the next stage.
# renovate: datasource=go depName=github.com/go-delve/delve
ARG DLV_VERSION=1.26.1
ARG DLV_VERSION=1.26.2
# hadolint ignore=DL3059,DL4006
RUN CGO_ENABLED=0 xx-go install github.com/go-delve/delve/cmd/dlv@v${DLV_VERSION} && \
DLV_PATH=$(find /go/bin -name dlv -type f | head -n 1) && \
@@ -345,7 +345,7 @@ RUN --mount=type=cache,target=/root/.cache/go-build \
rm -rf /tmp/buildenv_* /tmp/caddy-initial'
# ---- CrowdSec Builder ----
# Build CrowdSec from source to ensure we use Go 1.26.1+ and avoid stdlib vulnerabilities
# Build CrowdSec from source to ensure we use Go 1.26.2+ and avoid stdlib vulnerabilities
# (CVE-2025-58183, CVE-2025-58186, CVE-2025-58187, CVE-2025-61729)
FROM --platform=$BUILDPLATFORM golang:${GO_VERSION}-alpine AS crowdsec-builder
COPY --from=xx / /
@@ -386,13 +386,13 @@ RUN go get github.com/expr-lang/expr@v${EXPR_LANG_VERSION} && \
go get github.com/jackc/pgx/v4@v4.18.3 && \
# GHSA-xmrv-pmrh-hhx2: AWS SDK v2 event stream injection
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream
go get github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream@v1.7.8 && \
go get github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream@v1.7.9 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs
go get github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs@v1.68.0 && \
go get github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs@v1.69.1 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/kinesis
go get github.com/aws/aws-sdk-go-v2/service/kinesis@v1.43.5 && \
go get github.com/aws/aws-sdk-go-v2/service/kinesis@v1.43.6 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/s3
go get github.com/aws/aws-sdk-go-v2/service/s3@v1.99.0 && \
go get github.com/aws/aws-sdk-go-v2/service/s3@v1.99.1 && \
go mod tidy
# Fix compatibility issues with expr-lang v1.17.7
@@ -469,7 +469,7 @@ WORKDIR /app
RUN apk add --no-cache \
bash ca-certificates sqlite-libs sqlite tzdata gettext libcap libcap-utils \
c-ares busybox-extras \
&& apk upgrade --no-cache zlib
&& apk upgrade --no-cache zlib libcrypto3 libssl3 musl musl-utils
# Copy gosu binary from gosu-builder (built with Go 1.26+ to avoid stdlib CVEs)
COPY --from=gosu-builder /gosu-out/gosu /usr/sbin/gosu
@@ -486,7 +486,7 @@ SHELL ["/bin/ash", "-o", "pipefail", "-c"]
# Note: In production, users should provide their own MaxMind license key
# This uses the publicly available GeoLite2 database
# In CI, timeout quickly rather than retrying to save build time
ARG GEOLITE2_COUNTRY_SHA256=b018842033872f19ed9ccefb863ec954f8024db2ae913d0d4ea14e35ace4eba1
ARG GEOLITE2_COUNTRY_SHA256=62049119bd084e19fff4689bebe258f18a5f27a386e6d26ba5180941b613fc2b
RUN mkdir -p /app/data/geoip && \
if [ "$CI" = "true" ] || [ "$CI" = "1" ]; then \
echo "⏱️ CI detected - quick download (10s timeout, no retries)"; \
@@ -516,7 +516,7 @@ COPY --from=caddy-builder /usr/bin/caddy /usr/bin/caddy
# Allow non-root to bind privileged ports (80/443) securely
RUN setcap 'cap_net_bind_service=+ep' /usr/bin/caddy
# Copy CrowdSec binaries from the crowdsec-builder stage (built with Go 1.26.1+)
# Copy CrowdSec binaries from the crowdsec-builder stage (built with Go 1.26.2+)
# This ensures we don't have stdlib vulnerabilities from older Go versions
COPY --from=crowdsec-builder /crowdsec-out/crowdsec /usr/local/bin/crowdsec
COPY --from=crowdsec-builder /crowdsec-out/cscli /usr/local/bin/cscli


@@ -27,7 +27,7 @@ public disclosure.
## Known Vulnerabilities
Last reviewed: 2026-04-09
Last reviewed: 2026-04-21
### [HIGH] CVE-2026-31790 · OpenSSL Vulnerability in Alpine Base Image
@@ -71,48 +71,6 @@ Dockerfile.
---
### [HIGH] CVE-2026-34040 · Docker AuthZ Plugin Bypass via Oversized Request Body
| Field | Value |
|--------------|-------|
| **ID** | CVE-2026-34040 (GHSA-x744-4wpc-v9h2) |
| **Severity** | High · 8.8 |
| **Status** | Awaiting Upstream |
**What**
Docker Engine AuthZ plugins can be bypassed when an API request body exceeds a
certain size threshold. Charon uses the Docker client SDK only; this is a
server-side vulnerability in the Docker daemon's authorization plugin handler.
**Who**
- Discovered by: Automated scan (govulncheck, Grype)
- Reported: 2026-04-04
- Affects: Docker Engine daemon operators; Charon application is not directly vulnerable
**Where**
- Component: `github.com/docker/docker` v28.5.2+incompatible (Docker client SDK)
- Versions affected: Docker Engine < 29.3.1
**When**
- Discovered: 2026-04-04
- Disclosed (if public): Public
- Target fix: When moby/moby/v2 stabilizes or docker/docker import path is updated
**How**
The vulnerability requires an attacker to send oversized API request bodies to the
Docker daemon. Charon uses the Docker client SDK for container management operations
only and does not expose the Docker socket externally. The attack vector is limited
to the Docker daemon host, not the Charon application.
**Planned Remediation**
Monitor moby/moby/v2 module stabilization. The `docker/docker` import path has no
fix available. When a compatible module path exists, migrate the Docker SDK import.
---
### [HIGH] CVE-2026-2673 · OpenSSL TLS 1.3 Key Exchange Group Downgrade
| Field | Value |
@@ -194,8 +152,8 @@ via the Docker client SDK. The attack requires a malicious Docker plugin to be
installed on the host, which is outside Charon's operational scope.
**Planned Remediation**
Same as CVE-2026-34040: monitor moby/moby/v2 module stabilization. No fix
available for the current `docker/docker` import path.
Monitor Moby advisory updates and verify scanner results against current modular
Moby dependency paths.
---
@@ -239,6 +197,49 @@ Charon users is negligible since the vulnerable code path is not exercised.
## Patched Vulnerabilities
### ✅ [HIGH] CVE-2026-34040 · Docker AuthZ Plugin Bypass via Oversized Request Body
| Field | Value |
|--------------|-------|
| **ID** | CVE-2026-34040 (GHSA-x744-4wpc-v9h2) |
| **Severity** | High · 8.8 |
| **Patched** | 2026-04-21 |
**What**
Docker Engine AuthZ plugins can be bypassed when an API request body exceeds a
certain size threshold. The previous Charon backend dependency path was
`github.com/docker/docker`.
**Who**
- Discovered by: Automated scan (govulncheck, Grype)
- Reported: 2026-04-04
**Where**
- Previous component: `github.com/docker/docker` v28.5.2+incompatible (Docker client SDK)
- Remediated component path: `github.com/moby/moby/client` with `github.com/moby/moby/api`
**When**
- Discovered: 2026-04-04
- Patched: 2026-04-21
- Time to patch: 17 days
**How**
The backend Docker service imports and module dependencies were migrated away from
the vulnerable monolith package path to modular Moby dependencies.
**Resolution**
Validation evidence after remediation:
- Backend: `go mod tidy`, `go test ./...`, and `go build ./cmd/api` passed.
- Trivy gate output did not include `CVE-2026-34040` or `GHSA-x744-4wpc-v9h2`.
- Docker image scan gate reported `0 Critical` and `0 High`, and did not include
`CVE-2026-34040` or `GHSA-x744-4wpc-v9h2`.
---
### ✅ [LOW] CVE-2026-26958 · edwards25519 MultiScalarMult Invalid Results
| Field | Value |


@@ -255,7 +255,11 @@ func main() {
cerb := cerberus.New(cfg.Security, db)
// Pass config to routes for auth service and certificate service
if err := routes.RegisterWithDeps(router, db, cfg, caddyManager, cerb); err != nil {
// Lifecycle context cancelled on shutdown to stop background goroutines
appCtx, appCancel := context.WithCancel(context.Background())
defer appCancel()
if err := routes.RegisterWithDeps(appCtx, router, db, cfg, caddyManager, cerb); err != nil {
log.Fatalf("register routes: %v", err)
}
@@ -291,6 +295,9 @@ func main() {
sig := <-quit
logger.Log().Infof("Received signal %v, initiating graceful shutdown...", sig)
// Cancel the app-wide context to stop background goroutines (e.g. cert expiry checker)
appCancel()
// Graceful shutdown with timeout
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()


@@ -3,7 +3,6 @@ module github.com/Wikid82/charon/backend
go 1.26.2
require (
github.com/docker/docker v28.5.2+incompatible
github.com/gin-contrib/gzip v1.2.6
github.com/gin-gonic/gin v1.12.0
github.com/glebarez/sqlite v1.11.0
@@ -11,6 +10,7 @@ require (
github.com/google/uuid v1.6.0
github.com/gorilla/websocket v1.5.3
github.com/mattn/go-sqlite3 v1.14.42
github.com/moby/moby/client v0.4.1
github.com/oschwald/geoip2-golang/v2 v2.1.0
github.com/prometheus/client_golang v1.23.2
github.com/robfig/cron/v3 v3.0.1
@@ -23,6 +23,7 @@ require (
gopkg.in/natefinch/lumberjack.v2 v2.2.1
gorm.io/driver/sqlite v1.6.0
gorm.io/gorm v1.31.1
software.sslmate.com/src/go-pkcs12 v0.7.1
)
require (
@@ -35,10 +36,9 @@ require (
github.com/cloudwego/base64x v0.1.6 // indirect
github.com/containerd/errdefs v1.0.0 // indirect
github.com/containerd/errdefs/pkg v0.3.0 // indirect
github.com/containerd/log v0.1.0 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/distribution/reference v0.6.0 // indirect
github.com/docker/go-connections v0.6.0 // indirect
github.com/docker/go-connections v0.7.0 // indirect
github.com/docker/go-units v0.5.0 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
@@ -60,18 +60,15 @@ require (
github.com/leodido/go-urn v1.4.0 // indirect
github.com/mattn/go-isatty v0.0.21 // indirect
github.com/moby/docker-image-spec v1.3.1 // indirect
github.com/moby/sys/atomicwriter v0.1.0 // indirect
github.com/moby/term v0.5.2 // indirect
github.com/moby/moby/api v1.54.2 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/morikuni/aec v1.1.0 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/ncruces/go-strftime v1.0.0 // indirect
github.com/opencontainers/go-digest v1.0.0 // indirect
github.com/opencontainers/image-spec v1.1.1 // indirect
github.com/oschwald/maxminddb-golang/v2 v2.1.1 // indirect
github.com/pelletier/go-toml/v2 v2.3.0 // indirect
github.com/pkg/errors v0.9.1 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/common v0.67.5 // indirect
@@ -82,11 +79,10 @@ require (
github.com/stretchr/objx v0.5.3 // indirect
github.com/twitchyliquid64/golang-asm v0.15.1 // indirect
github.com/ugorji/go/codec v1.3.1 // indirect
go.mongodb.org/mongo-driver/v2 v2.5.0 // indirect
go.mongodb.org/mongo-driver/v2 v2.5.1 // indirect
go.opentelemetry.io/auto/sdk v1.2.1 // indirect
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.68.0 // indirect
go.opentelemetry.io/otel v1.43.0 // indirect
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.43.0 // indirect
go.opentelemetry.io/otel/metric v1.43.0 // indirect
go.opentelemetry.io/otel/trace v1.43.0 // indirect
go.yaml.in/yaml/v2 v2.4.4 // indirect
@@ -94,9 +90,8 @@ require (
golang.org/x/sys v0.43.0 // indirect
google.golang.org/protobuf v1.36.11 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
gotest.tools/v3 v3.5.2 // indirect
modernc.org/libc v1.72.0 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect
modernc.org/sqlite v1.48.2 // indirect
modernc.org/sqlite v1.49.1 // indirect
)


@@ -1,5 +1,3 @@
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c h1:udKWzYgxTojEKWjV8V+WSxDXJ4NFATAsZjh8iIbsQIg=
github.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=
github.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=
github.com/Microsoft/go-winio v0.6.2/go.mod h1:yd8OoFMLzJbo9gZq8j5qaps8bJ9aShtEA8Ipt1oGCvU=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
@@ -10,8 +8,6 @@ github.com/bytedance/sonic v1.15.0 h1:/PXeWFaR5ElNcVE84U0dOHjiMHQOwNIx3K4ymzh/uS
github.com/bytedance/sonic v1.15.0/go.mod h1:tFkWrPz0/CUCLEF4ri4UkHekCIcdnkqXw9VduqpJh0k=
github.com/bytedance/sonic/loader v0.5.1 h1:Ygpfa9zwRCCKSlrp5bBP/b/Xzc3VxsAW+5NIYXrOOpI=
github.com/bytedance/sonic/loader v0.5.1/go.mod h1:AR4NYCk5DdzZizZ5djGqQ92eEhCCcdf5x77udYiSJRo=
github.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=
github.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/cloudwego/base64x v0.1.6 h1:t11wG9AECkCDk5fMSoxmufanudBtJ+/HemLstXDLI2M=
@@ -20,17 +16,13 @@ github.com/containerd/errdefs v1.0.0 h1:tg5yIfIlQIrxYtu9ajqY42W3lpS19XqdxRQeEwYG
github.com/containerd/errdefs v1.0.0/go.mod h1:+YBYIdtsnF4Iw6nWZhJcqGSg/dwvV7tyJ/kCkyJ2k+M=
github.com/containerd/errdefs/pkg v0.3.0 h1:9IKJ06FvyNlexW690DXuQNx2KA2cUJXx151Xdx3ZPPE=
github.com/containerd/errdefs/pkg v0.3.0/go.mod h1:NJw6s9HwNuRhnjJhM7pylWwMyAkmCQvQ4GpJHEqRLVk=
github.com/containerd/log v0.1.0 h1:TCJt7ioM2cr/tfR8GPbGf9/VRAX8D2B4PjzCpfX540I=
github.com/containerd/log v0.1.0/go.mod h1:VRRf09a7mHDIRezVKTRCrOq78v577GXq3bSa3EhrzVo=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/distribution/reference v0.6.0 h1:0IXCQ5g4/QMHHkarYzh5l+u8T3t73zM5QvfrDyIgxBk=
github.com/distribution/reference v0.6.0/go.mod h1:BbU0aIcezP1/5jX/8MP0YiH4SdvB5Y4f/wlDRiLyi3E=
github.com/docker/docker v28.5.2+incompatible h1:DBX0Y0zAjZbSrm1uzOkdr1onVghKaftjlSWt4AFexzM=
github.com/docker/docker v28.5.2+incompatible/go.mod h1:eEKB0N0r5NX/I1kEveEz05bcu8tLC/8azJZsviup8Sk=
github.com/docker/go-connections v0.6.0 h1:LlMG9azAe1TqfR7sO+NJttz1gy6KO7VJBh+pMmjSD94=
github.com/docker/go-connections v0.6.0/go.mod h1:AahvXYshr6JgfUJGdDCs2b5EZG/vmaMAntpSFH5BFKE=
github.com/docker/go-connections v0.7.0 h1:6SsRfJddP22WMrCkj19x9WKjEDTB+ahsdiGYf0mN39c=
github.com/docker/go-connections v0.7.0/go.mod h1:no1qkHdjq7kLMGUXYAduOhYPSJxxvgWBh7ogVvptn3Q=
github.com/docker/go-units v0.5.0 h1:69rxXcBk27SvSaaxTtLh/8llcHD8vYHT7WSdRZ/jvr4=
github.com/docker/go-units v0.5.0/go.mod h1:fgPhTUdO+D/Jk86RDLlptpiXQzgHJF7gydDDbaIK4Dk=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
@@ -77,8 +69,6 @@ github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
github.com/gorilla/websocket v1.5.3/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0 h1:HWRh5R2+9EifMyIHV7ZV+MIZqgz+PMpZ14Jynv3O2Zs=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.28.0/go.mod h1:JfhWUomR1baixubs02l85lZYYOm7LV6om4ceouMv45c=
github.com/hashicorp/golang-lru/v2 v2.0.7 h1:a+bsQ5rvGLjzHuww6tVxozPZFVghXaHOwFs4luLUK2k=
github.com/hashicorp/golang-lru/v2 v2.0.7/go.mod h1:QeFd9opnmA6QUJc5vARoKUSoFhyfM2/ZepoAG6RGpeM=
github.com/jinzhu/inflection v1.0.0 h1:K317FqzuhWc8YvSVlFMCCUb36O/S9MCKRDI7QkRKD/E=
@@ -105,19 +95,15 @@ github.com/mattn/go-sqlite3 v1.14.42 h1:MigqEP4ZmHw3aIdIT7T+9TLa90Z6smwcthx+Azv4
github.com/mattn/go-sqlite3 v1.14.42/go.mod h1:pjEuOr8IwzLJP2MfGeTb0A35jauH+C2kbHKBr7yXKVQ=
github.com/moby/docker-image-spec v1.3.1 h1:jMKff3w6PgbfSa69GfNg+zN/XLhfXJGnEx3Nl2EsFP0=
github.com/moby/docker-image-spec v1.3.1/go.mod h1:eKmb5VW8vQEh/BAr2yvVNvuiJuY6UIocYsFu/DxxRpo=
github.com/moby/sys/atomicwriter v0.1.0 h1:kw5D/EqkBwsBFi0ss9v1VG3wIkVhzGvLklJ+w3A14Sw=
github.com/moby/sys/atomicwriter v0.1.0/go.mod h1:Ul8oqv2ZMNHOceF643P6FKPXeCmYtlQMvpizfsSoaWs=
github.com/moby/sys/sequential v0.6.0 h1:qrx7XFUd/5DxtqcoH1h438hF5TmOvzC/lspjy7zgvCU=
github.com/moby/sys/sequential v0.6.0/go.mod h1:uyv8EUTrca5PnDsdMGXhZe6CCe8U/UiTWd+lL+7b/Ko=
github.com/moby/term v0.5.2 h1:6qk3FJAFDs6i/q3W/pQ97SX192qKfZgGjCQqfCJkgzQ=
github.com/moby/term v0.5.2/go.mod h1:d3djjFCrjnB+fl8NJux+EJzu0msscUP+f8it8hPkFLc=
github.com/moby/moby/api v1.54.2 h1:wiat9QAhnDQjA7wk1kh/TqHz2I1uUA7M7t9SAl/JNXg=
github.com/moby/moby/api v1.54.2/go.mod h1:+RQ6wluLwtYaTd1WnPLykIDPekkuyD/ROWQClE83pzs=
github.com/moby/moby/client v0.4.1 h1:DMQgisVoMkmMs7fp3ROSdiBnoAu8+vo3GggFl06M/wY=
github.com/moby/moby/client v0.4.1/go.mod h1:z52C9O2POPOsnxZAy//WtKcQ32P+jT/NGeXu/7nfjGQ=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/morikuni/aec v1.1.0 h1:vBBl0pUnvi/Je71dsRrhMBtreIqNMYErSAbEeb8jrXQ=
github.com/morikuni/aec v1.1.0/go.mod h1:xDRgiq/iw5l+zkao76YTKzKttOp2cwPEne25HDkJnBw=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=
github.com/ncruces/go-strftime v1.0.0 h1:HMFp8mLCTPp341M/ZnA4qaf7ZlsbTc+miZjCLOFAw7w=
@@ -132,8 +118,6 @@ github.com/oschwald/maxminddb-golang/v2 v2.1.1 h1:lA8FH0oOrM4u7mLvowq8IT6a3Q/qEn
github.com/oschwald/maxminddb-golang/v2 v2.1.1/go.mod h1:PLdx6PR+siSIoXqqy7C7r3SB3KZnhxWr1Dp6g0Hacl8=
github.com/pelletier/go-toml/v2 v2.3.0 h1:k59bC/lIZREW0/iVaQR8nDHxVq8OVlIzYCOJf421CaM=
github.com/pelletier/go-toml/v2 v2.3.0/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prometheus/client_golang v1.23.2 h1:Je96obch5RDVy3FDMndoUsjAhG5Edi49h0RJWRi/o0o=
@@ -173,18 +157,14 @@ github.com/twitchyliquid64/golang-asm v0.15.1 h1:SU5vSMR7hnwNxj24w34ZyCi/FmDZTkS
github.com/twitchyliquid64/golang-asm v0.15.1/go.mod h1:a1lVb/DtPvCB8fslRZhAngC2+aY1QWCk3Cedj/Gdt08=
github.com/ugorji/go/codec v1.3.1 h1:waO7eEiFDwidsBN6agj1vJQ4AG7lh2yqXyOXqhgQuyY=
github.com/ugorji/go/codec v1.3.1/go.mod h1:pRBVtBSKl77K30Bv8R2P+cLSGaTtex6fsA2Wjqmfxj4=
go.mongodb.org/mongo-driver/v2 v2.5.0 h1:yXUhImUjjAInNcpTcAlPHiT7bIXhshCTL3jVBkF3xaE=
go.mongodb.org/mongo-driver/v2 v2.5.0/go.mod h1:yOI9kBsufol30iFsl1slpdq1I0eHPzybRWdyYUs8K/0=
go.mongodb.org/mongo-driver/v2 v2.5.1 h1:j2U/Qp+wvueSpqitLCSZPT/+ZpVc1xzuwdHWwl7d8ro=
go.mongodb.org/mongo-driver/v2 v2.5.1/go.mod h1:yOI9kBsufol30iFsl1slpdq1I0eHPzybRWdyYUs8K/0=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.68.0 h1:CqXxU8VOmDefoh0+ztfGaymYbhdB/tT3zs79QaZTNGY=
go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.68.0/go.mod h1:BuhAPThV8PBHBvg8ZzZ/Ok3idOdhWIodywz2xEcRbJo=
go.opentelemetry.io/otel v1.43.0 h1:mYIM03dnh5zfN7HautFE4ieIig9amkNANT+xcVxAj9I=
go.opentelemetry.io/otel v1.43.0/go.mod h1:JuG+u74mvjvcm8vj8pI5XiHy1zDeoCS2LB1spIq7Ay0=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.43.0 h1:88Y4s2C8oTui1LGM6bTWkw0ICGcOLCAI5l6zsD1j20k=
go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.43.0/go.mod h1:Vl1/iaggsuRlrHf/hfPJPvVag77kKyvrLeD10kpMl+A=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.43.0 h1:3iZJKlCZufyRzPzlQhUIWVmfltrXuGyfjREgGP3UUjc=
go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.43.0/go.mod h1:/G+nUPfhq2e+qiXMGxMwumDrP5jtzU+mWN7/sjT2rak=
go.opentelemetry.io/otel/metric v1.43.0 h1:d7638QeInOnuwOONPp4JAOGfbCEpYb+K6DVWvdxGzgM=
go.opentelemetry.io/otel/metric v1.43.0/go.mod h1:RDnPtIxvqlgO8GRW18W6Z/4P462ldprJtfxHxyKd2PY=
go.opentelemetry.io/otel/sdk v1.43.0 h1:pi5mE86i5rTeLXqoF/hhiBtUNcrAGHLKQdhg4h4V9Dg=
@@ -193,8 +173,6 @@ go.opentelemetry.io/otel/sdk/metric v1.43.0 h1:S88dyqXjJkuBNLeMcVPRFXpRw2fuwdvfC
go.opentelemetry.io/otel/sdk/metric v1.43.0/go.mod h1:C/RJtwSEJ5hzTiUz5pXF1kILHStzb9zFlIEe85bhj6A=
go.opentelemetry.io/otel/trace v1.43.0 h1:BkNrHpup+4k4w+ZZ86CZoHHEkohws8AY+WTX09nk+3A=
go.opentelemetry.io/otel/trace v1.43.0/go.mod h1:/QJhyVBUUswCphDVxq+8mld+AvhXZLhe+8WVFxiFff0=
go.opentelemetry.io/proto/otlp v1.10.0 h1:IQRWgT5srOCYfiWnpqUYz9CVmbO8bFmKcwYxpuCSL2g=
go.opentelemetry.io/proto/otlp v1.10.0/go.mod h1:/CV4QoCR/S9yaPj8utp3lvQPoqMtxXdzn7ozvvozVqk=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.uber.org/mock v0.6.0 h1:hyF9dfmbgIX5EfOdasqLsWD6xqpNZlXblLB/Dbnwv3Y=
@@ -219,12 +197,6 @@ golang.org/x/time v0.15.0 h1:bbrp8t3bGUeFOx08pvsMYRTCVSMk89u4tKbNOZbp88U=
golang.org/x/time v0.15.0/go.mod h1:Y4YMaQmXwGQZoFaVFk4YpCt4FLQMYKZe9oeV/f4MSno=
golang.org/x/tools v0.43.0 h1:12BdW9CeB3Z+J/I/wj34VMl8X+fEXBxVR90JeMX5E7s=
golang.org/x/tools v0.43.0/go.mod h1:uHkMso649BX2cZK6+RpuIPXS3ho2hZo4FVwfoy1vIk0=
google.golang.org/genproto/googleapis/api v0.0.0-20260401024825-9d38bb4040a9 h1:VPWxll4HlMw1Vs/qXtN7BvhZqsS9cdAittCNvVENElA=
google.golang.org/genproto/googleapis/api v0.0.0-20260401024825-9d38bb4040a9/go.mod h1:7QBABkRtR8z+TEnmXTqIqwJLlzrZKVfAUm7tY3yGv0M=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260401024825-9d38bb4040a9 h1:m8qni9SQFH0tJc1X0vmnpw/0t+AImlSvp30sEupozUg=
google.golang.org/genproto/googleapis/rpc v0.0.0-20260401024825-9d38bb4040a9/go.mod h1:4Hqkh8ycfw05ld/3BWL7rJOSfebL2Q+DVDeRgYgxUU8=
google.golang.org/grpc v1.80.0 h1:Xr6m2WmWZLETvUNvIUmeD5OAagMw3FiKmMlTdViWsHM=
google.golang.org/grpc v1.80.0/go.mod h1:ho/dLnxwi3EDJA4Zghp7k2Ec1+c2jqup0bFkw07bwF4=
google.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=
google.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
@@ -263,9 +235,13 @@ modernc.org/opt v0.1.4 h1:2kNGMRiUjrp4LcaPuLY2PzUfqM/w9N23quVwhKt5Qm8=
modernc.org/opt v0.1.4/go.mod h1:03fq9lsNfvkYSfxrfUhZCWPk1lm4cq4N+Bh//bEtgns=
modernc.org/sortutil v1.2.1 h1:+xyoGf15mM3NMlPDnFqrteY07klSFxLElE2PVuWIJ7w=
modernc.org/sortutil v1.2.1/go.mod h1:7ZI3a3REbai7gzCLcotuw9AC4VZVpYMjDzETGsSMqJE=
modernc.org/sqlite v1.48.2 h1:5CnW4uP8joZtA0LedVqLbZV5GD7F/0x91AXeSyjoh5c=
modernc.org/sqlite v1.48.2/go.mod h1:hWjRO6Tj/5Ik8ieqxQybiEOUXy0NJFNp2tpvVpKlvig=
modernc.org/sqlite v1.49.1 h1:dYGHTKcX1sJ+EQDnUzvz4TJ5GbuvhNJa8Fg6ElGx73U=
modernc.org/sqlite v1.49.1/go.mod h1:m0w8xhwYUVY3H6pSDwc3gkJ/irZT/0YEXwBlhaxQEew=
modernc.org/strutil v1.2.1 h1:UneZBkQA+DX2Rp35KcM69cSsNES9ly8mQWD71HKlOA0=
modernc.org/strutil v1.2.1/go.mod h1:EHkiggD70koQxjVdSBM3JKM7k6L0FbGE5eymy9i3B9A=
modernc.org/token v1.1.0 h1:Xl7Ap9dKaEs5kLoOQeQmPWevfnk/DM5qcLcYlA8ys6Y=
modernc.org/token v1.1.0/go.mod h1:UGzOrNV1mAFSEB63lOFHIpNRUVMvYTc6yu1SMY/XTDM=
pgregory.net/rapid v1.2.0 h1:keKAYRcjm+e1F0oAuU5F5+YPAWcyxNNRK2wud503Gnk=
pgregory.net/rapid v1.2.0/go.mod h1:PY5XlDGj0+V1FCq0o192FdRhpKHGTRIWBgqjDBTrq04=
software.sslmate.com/src/go-pkcs12 v0.7.1 h1:bxkUPRsvTPNRBZa4M/aSX4PyMOEbq3V8I6hbkG4F4Q8=
software.sslmate.com/src/go-pkcs12 v0.7.1/go.mod h1:Qiz0EyvDRJjjxGyUQa2cCNZn/wMyzrRJ/qcDXOQazLI=


@@ -2,14 +2,18 @@ package handlers
import (
"fmt"
"io"
"net/http"
"strconv"
"sync"
"time"
"github.com/gin-gonic/gin"
"github.com/google/uuid"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/logger"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
"github.com/Wikid82/charon/backend/internal/util"
)
@@ -28,9 +32,10 @@ type CertificateHandler struct {
service *services.CertificateService
backupService BackupServiceInterface
notificationService *services.NotificationService
db *gorm.DB
// Rate limiting for notifications
notificationMu sync.Mutex
lastNotificationTime map[uint]time.Time
lastNotificationTime map[string]time.Time
}
func NewCertificateHandler(service *services.CertificateService, backupService BackupServiceInterface, ns *services.NotificationService) *CertificateHandler {
@@ -38,10 +43,18 @@ func NewCertificateHandler(service *services.CertificateService, backupService B
service: service,
backupService: backupService,
notificationService: ns,
-lastNotificationTime: make(map[uint]time.Time),
+lastNotificationTime: make(map[string]time.Time),
}
}
// SetDB sets the database connection for user lookups (export re-auth).
func (h *CertificateHandler) SetDB(db *gorm.DB) {
h.db = db
}
// maxFileSize is 1MB for certificate file uploads.
const maxFileSize = 1 << 20
func (h *CertificateHandler) List(c *gin.Context) {
certs, err := h.service.ListCertificates()
if err != nil {
@@ -53,34 +66,41 @@ func (h *CertificateHandler) List(c *gin.Context) {
c.JSON(http.StatusOK, certs)
}
-type UploadCertificateRequest struct {
-Name string `form:"name" binding:"required"`
-Certificate string `form:"certificate"` // PEM content
-PrivateKey string `form:"private_key"` // PEM content
-}
func (h *CertificateHandler) Get(c *gin.Context) {
certUUID := c.Param("uuid")
if certUUID == "" {
c.JSON(http.StatusBadRequest, gin.H{"error": "uuid is required"})
return
}
detail, err := h.service.GetCertificate(certUUID)
if err != nil {
if err == services.ErrCertNotFound {
c.JSON(http.StatusNotFound, gin.H{"error": "certificate not found"})
return
}
logger.Log().WithError(err).Error("failed to get certificate")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to get certificate"})
return
}
c.JSON(http.StatusOK, detail)
}
func (h *CertificateHandler) Upload(c *gin.Context) {
// Handle multipart form
name := c.PostForm("name")
if name == "" {
c.JSON(http.StatusBadRequest, gin.H{"error": "name is required"})
return
}
-// Read files
+// Read certificate file
certFile, err := c.FormFile("certificate_file")
if err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "certificate_file is required"})
return
}
-keyFile, err := c.FormFile("key_file")
-if err != nil {
-c.JSON(http.StatusBadRequest, gin.H{"error": "key_file is required"})
-return
-}
// Open and read content
certSrc, err := certFile.Open()
if err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open cert file"})
@@ -92,35 +112,75 @@ func (h *CertificateHandler) Upload(c *gin.Context) {
}
}()
-keySrc, err := keyFile.Open()
+certBytes, err := io.ReadAll(io.LimitReader(certSrc, maxFileSize))
if err != nil {
-c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open key file"})
+c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read certificate file"})
return
}
-defer func() {
-if errClose := keySrc.Close(); errClose != nil {
-logger.Log().WithError(errClose).Warn("failed to close key file")
-}
-}()
-// Read to string
-// Limit size to avoid DoS (e.g. 1MB)
-certBytes := make([]byte, 1024*1024)
-n, _ := certSrc.Read(certBytes)
-certPEM := string(certBytes[:n])
-keyBytes := make([]byte, 1024*1024)
-n, _ = keySrc.Read(keyBytes)
-keyPEM := string(keyBytes[:n])
+certPEM := string(certBytes)
+// Read private key file (optional — format detection is content-based in the service)
+var keyPEM string
+keyFile, err := c.FormFile("key_file")
+if err == nil {
+keySrc, errOpen := keyFile.Open()
+if errOpen != nil {
+c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open key file"})
+return
+}
+defer func() {
+if errClose := keySrc.Close(); errClose != nil {
+logger.Log().WithError(errClose).Warn("failed to close key file")
+}
+}()
+keyBytes, errRead := io.ReadAll(io.LimitReader(keySrc, maxFileSize))
+if errRead != nil {
+c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read key file"})
+return
+}
+keyPEM = string(keyBytes)
+}
// Read chain file (optional)
var chainPEM string
chainFile, err := c.FormFile("chain_file")
if err == nil {
chainSrc, errOpen := chainFile.Open()
if errOpen != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open chain file"})
return
}
defer func() {
if errClose := chainSrc.Close(); errClose != nil {
logger.Log().WithError(errClose).Warn("failed to close chain file")
}
}()
-cert, err := h.service.UploadCertificate(name, certPEM, keyPEM)
chainBytes, errRead := io.ReadAll(io.LimitReader(chainSrc, maxFileSize))
if errRead != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read chain file"})
return
}
chainPEM = string(chainBytes)
}
// Require key_file for non-PFX formats (PFX embeds the private key)
if keyPEM == "" {
format := services.DetectFormat(certBytes)
if format != services.FormatPFX {
c.JSON(http.StatusBadRequest, gin.H{"error": "key_file is required for PEM/DER certificate uploads"})
return
}
}
+cert, err := h.service.UploadCertificate(name, certPEM, keyPEM, chainPEM)
if err != nil {
logger.Log().WithError(err).Error("failed to upload certificate")
-c.JSON(http.StatusBadRequest, gin.H{"error": "failed to upload certificate"})
+c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
return
}
// Send Notification
if h.notificationService != nil {
h.notificationService.SendExternal(c.Request.Context(),
"cert",
@@ -137,24 +197,255 @@ func (h *CertificateHandler) Upload(c *gin.Context) {
c.JSON(http.StatusCreated, cert)
}
type updateCertificateRequest struct {
Name string `json:"name" binding:"required"`
}
func (h *CertificateHandler) Update(c *gin.Context) {
certUUID := c.Param("uuid")
if certUUID == "" {
c.JSON(http.StatusBadRequest, gin.H{"error": "uuid is required"})
return
}
var req updateCertificateRequest
if err := c.ShouldBindJSON(&req); err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "name is required"})
return
}
info, err := h.service.UpdateCertificate(certUUID, req.Name)
if err != nil {
if err == services.ErrCertNotFound {
c.JSON(http.StatusNotFound, gin.H{"error": "certificate not found"})
return
}
logger.Log().WithError(err).Error("failed to update certificate")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to update certificate"})
return
}
c.JSON(http.StatusOK, info)
}
func (h *CertificateHandler) Validate(c *gin.Context) {
// Read certificate file
certFile, err := c.FormFile("certificate_file")
if err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "certificate_file is required"})
return
}
certSrc, err := certFile.Open()
if err != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open cert file"})
return
}
defer func() {
if errClose := certSrc.Close(); errClose != nil {
logger.Log().WithError(errClose).Warn("failed to close certificate file")
}
}()
certBytes, err := io.ReadAll(io.LimitReader(certSrc, maxFileSize))
if err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read certificate file"})
return
}
// Read optional key file
var keyPEM string
keyFile, err := c.FormFile("key_file")
if err == nil {
keySrc, errOpen := keyFile.Open()
if errOpen != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open key file"})
return
}
defer func() {
if errClose := keySrc.Close(); errClose != nil {
logger.Log().WithError(errClose).Warn("failed to close key file")
}
}()
keyBytes, errRead := io.ReadAll(io.LimitReader(keySrc, maxFileSize))
if errRead != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read key file"})
return
}
keyPEM = string(keyBytes)
}
// Read optional chain file
var chainPEM string
chainFile, err := c.FormFile("chain_file")
if err == nil {
chainSrc, errOpen := chainFile.Open()
if errOpen != nil {
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to open chain file"})
return
}
defer func() {
if errClose := chainSrc.Close(); errClose != nil {
logger.Log().WithError(errClose).Warn("failed to close chain file")
}
}()
chainBytes, errRead := io.ReadAll(io.LimitReader(chainSrc, maxFileSize))
if errRead != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "failed to read chain file"})
return
}
chainPEM = string(chainBytes)
}
result, err := h.service.ValidateCertificate(string(certBytes), keyPEM, chainPEM)
if err != nil {
logger.Log().WithError(err).Error("failed to validate certificate")
c.JSON(http.StatusBadRequest, gin.H{
"error": "validation failed",
"errors": []string{err.Error()},
})
return
}
c.JSON(http.StatusOK, result)
}
type exportCertificateRequest struct {
Format string `json:"format" binding:"required"`
IncludeKey bool `json:"include_key"`
PFXPassword string `json:"pfx_password"`
Password string `json:"password"`
}
func (h *CertificateHandler) Export(c *gin.Context) {
certUUID := c.Param("uuid")
if certUUID == "" {
c.JSON(http.StatusBadRequest, gin.H{"error": "uuid is required"})
return
}
var req exportCertificateRequest
if err := c.ShouldBindJSON(&req); err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "format is required"})
return
}
// Re-authenticate when requesting private key
if req.IncludeKey {
if req.Password == "" {
c.JSON(http.StatusForbidden, gin.H{"error": "password required to export private key"})
return
}
userVal, exists := c.Get("user")
if !exists || h.db == nil {
c.JSON(http.StatusForbidden, gin.H{"error": "authentication required"})
return
}
userMap, ok := userVal.(map[string]any)
if !ok {
c.JSON(http.StatusForbidden, gin.H{"error": "invalid session"})
return
}
userID, ok := userMap["id"]
if !ok {
c.JSON(http.StatusForbidden, gin.H{"error": "invalid session"})
return
}
var user models.User
if err := h.db.First(&user, userID).Error; err != nil {
c.JSON(http.StatusForbidden, gin.H{"error": "user not found"})
return
}
if !user.CheckPassword(req.Password) {
c.JSON(http.StatusForbidden, gin.H{"error": "incorrect password"})
return
}
}
data, filename, err := h.service.ExportCertificate(certUUID, req.Format, req.IncludeKey, req.PFXPassword)
if err != nil {
if err == services.ErrCertNotFound {
c.JSON(http.StatusNotFound, gin.H{"error": "certificate not found"})
return
}
logger.Log().WithError(fmt.Errorf("%s", util.SanitizeForLog(err.Error()))).Error("failed to export certificate")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to export certificate"})
return
}
c.Header("Content-Disposition", fmt.Sprintf("attachment; filename=%q", filename))
c.Data(http.StatusOK, "application/octet-stream", data)
}
func (h *CertificateHandler) Delete(c *gin.Context) {
-idStr := c.Param("id")
-id, err := strconv.ParseUint(idStr, 10, 32)
-if err != nil {
-c.JSON(http.StatusBadRequest, gin.H{"error": "invalid id"})
-return
-}
+idStr := c.Param("uuid")
// Support both numeric ID (legacy) and UUID
if numID, err := strconv.ParseUint(idStr, 10, 32); err == nil && numID > 0 {
inUse, err := h.service.IsCertificateInUse(uint(numID))
if err != nil {
logger.Log().WithError(err).WithField("certificate_id", numID).Error("failed to check certificate usage")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to check certificate usage"})
return
}
if inUse {
c.JSON(http.StatusConflict, gin.H{"error": "certificate is in use by one or more proxy hosts"})
return
}
if h.backupService != nil {
if availableSpace, err := h.backupService.GetAvailableSpace(); err != nil {
logger.Log().WithError(err).Warn("unable to check disk space, proceeding with backup")
} else if availableSpace < 100*1024*1024 {
logger.Log().WithField("available_bytes", availableSpace).Warn("low disk space, skipping backup")
c.JSON(http.StatusInsufficientStorage, gin.H{"error": "insufficient disk space for backup"})
return
}
if _, err := h.backupService.CreateBackup(); err != nil {
logger.Log().WithError(err).Error("failed to create backup before deletion")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to create backup before deletion"})
return
}
}
if err := h.service.DeleteCertificateByID(uint(numID)); err != nil {
if err == services.ErrCertInUse {
c.JSON(http.StatusConflict, gin.H{"error": "certificate is in use by one or more proxy hosts"})
return
}
logger.Log().WithError(err).WithField("certificate_id", numID).Error("failed to delete certificate")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to delete certificate"})
return
}
h.sendDeleteNotification(c, fmt.Sprintf("%d", numID))
c.JSON(http.StatusOK, gin.H{"message": "certificate deleted"})
return
}
-// Validate ID range
-if id == 0 {
// UUID path - parse to validate format and produce a canonical, safe string
parsedUUID, parseErr := uuid.Parse(idStr)
if parseErr != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "invalid id"})
return
}
certUUID := parsedUUID.String()
// Check if certificate is in use before proceeding
-inUse, err := h.service.IsCertificateInUse(uint(id))
+inUse, err := h.service.IsCertificateInUseByUUID(certUUID)
if err != nil {
-logger.Log().WithError(err).WithField("certificate_id", id).Error("failed to check certificate usage")
+if err == services.ErrCertNotFound {
+c.JSON(http.StatusNotFound, gin.H{"error": "certificate not found"})
+return
+}
+logger.Log().WithError(err).WithField("certificate_uuid", util.SanitizeForLog(certUUID)).Error("failed to check certificate usage")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to check certificate usage"})
return
}
@@ -163,13 +454,10 @@ func (h *CertificateHandler) Delete(c *gin.Context) {
return
}
// Create backup before deletion
if h.backupService != nil {
// Check disk space before backup (require at least 100MB free)
if availableSpace, err := h.backupService.GetAvailableSpace(); err != nil {
logger.Log().WithError(err).Warn("unable to check disk space, proceeding with backup")
} else if availableSpace < 100*1024*1024 {
logger.Log().WithField("available_bytes", availableSpace).Warn("low disk space, skipping backup")
c.JSON(http.StatusInsufficientStorage, gin.H{"error": "insufficient disk space for backup"})
return
}
@@ -181,38 +469,62 @@ func (h *CertificateHandler) Delete(c *gin.Context) {
}
}
// Proceed with deletion
-if err := h.service.DeleteCertificate(uint(id)); err != nil {
+if err := h.service.DeleteCertificate(certUUID); err != nil {
if err == services.ErrCertInUse {
c.JSON(http.StatusConflict, gin.H{"error": "certificate is in use by one or more proxy hosts"})
return
}
-logger.Log().WithError(err).WithField("certificate_id", id).Error("failed to delete certificate")
+if err == services.ErrCertNotFound {
+c.JSON(http.StatusNotFound, gin.H{"error": "certificate not found"})
+return
+}
+logger.Log().WithError(err).WithField("certificate_uuid", util.SanitizeForLog(certUUID)).Error("failed to delete certificate")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to delete certificate"})
return
}
-// Send Notification with rate limiting (1 per cert per 10 seconds)
-if h.notificationService != nil {
-h.notificationMu.Lock()
-lastTime, exists := h.lastNotificationTime[uint(id)]
-if !exists || time.Since(lastTime) > 10*time.Second {
-h.lastNotificationTime[uint(id)] = time.Now()
-h.notificationMu.Unlock()
-h.notificationService.SendExternal(c.Request.Context(),
-"cert",
-"Certificate Deleted",
-fmt.Sprintf("Certificate ID %d deleted", id),
-map[string]any{
-"ID": id,
-"Action": "deleted",
-},
-)
-} else {
-h.notificationMu.Unlock()
-logger.Log().WithField("certificate_id", id).Debug("notification rate limited")
-}
-}
+h.sendDeleteNotification(c, certUUID)
c.JSON(http.StatusOK, gin.H{"message": "certificate deleted"})
}
func (h *CertificateHandler) sendDeleteNotification(c *gin.Context, certRef string) {
if h.notificationService == nil {
return
}
// Re-validate to produce a CodeQL-safe value (breaks taint from user input).
// Callers already pass validated data; this is defense-in-depth.
safeRef := sanitizeCertRef(certRef)
h.notificationMu.Lock()
lastTime, exists := h.lastNotificationTime[certRef]
if exists && time.Since(lastTime) < 10*time.Second {
h.notificationMu.Unlock()
logger.Log().WithField("certificate_ref", safeRef).Debug("notification rate limited")
return
}
h.lastNotificationTime[certRef] = time.Now()
h.notificationMu.Unlock()
h.notificationService.SendExternal(c.Request.Context(),
"cert",
"Certificate Deleted",
fmt.Sprintf("Certificate %s deleted", safeRef),
map[string]any{
"Ref": safeRef,
"Action": "deleted",
},
)
}
// sanitizeCertRef re-validates a certificate reference (UUID or numeric ID)
// and returns a safe string representation. Returns a placeholder if invalid.
func sanitizeCertRef(ref string) string {
if parsed, err := uuid.Parse(ref); err == nil {
return parsed.String()
}
if n, err := strconv.ParseUint(ref, 10, 64); err == nil {
return strconv.FormatUint(n, 10)
}
return "[invalid-ref]"
}
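The `sanitizeCertRef` helper above can be exercised on its own. Below is a minimal standalone sketch; it uses a regexp as a stand-in for `uuid.Parse` so it compiles without the `google/uuid` dependency, but the accept/reject behavior is the same:

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// uuidRe is a simplified stand-in for uuid.Parse: it accepts only the
// canonical 8-4-4-4-12 hex form.
var uuidRe = regexp.MustCompile(`^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$`)

// sanitizeCertRef mirrors the handler helper: re-validate a certificate
// reference (UUID or numeric ID) and return a safe string representation,
// collapsing anything else to a placeholder.
func sanitizeCertRef(ref string) string {
	if uuidRe.MatchString(ref) {
		return ref
	}
	if n, err := strconv.ParseUint(ref, 10, 64); err == nil {
		return strconv.FormatUint(n, 10)
	}
	return "[invalid-ref]"
}

func main() {
	fmt.Println(sanitizeCertRef("00000000-0000-0000-0000-000000000001"))
	fmt.Println(sanitizeCertRef("123"))
	fmt.Println(sanitizeCertRef("../../etc/passwd"))
}
```

The last call prints `[invalid-ref]`, which is the defense-in-depth point: any value that reaches a log line or notification has been forced through a whitelist re-validation, breaking taint from user input.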

View File

@@ -1,12 +1,18 @@
package handlers
import (
"bytes"
"encoding/json"
"mime/multipart"
"net/http"
"net/http/httptest"
"strings"
"testing"
"time"
"github.com/gin-gonic/gin"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
@@ -18,7 +24,7 @@ func TestCertificateHandler_List_DBError(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates", h.List)
@@ -34,9 +40,9 @@ func TestCertificateHandler_Delete_InvalidID(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/invalid", http.NoBody)
w := httptest.NewRecorder()
@@ -50,9 +56,9 @@ func TestCertificateHandler_Delete_NotFound(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/9999", http.NoBody)
w := httptest.NewRecorder()
@@ -70,11 +76,11 @@ func TestCertificateHandler_Delete_NoBackupService(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
// No backup service
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -95,9 +101,9 @@ func TestCertificateHandler_Delete_CheckUsageDBError(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -115,7 +121,7 @@ func TestCertificateHandler_List_WithCertificates(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates", h.List)
@@ -135,9 +141,9 @@ func TestCertificateHandler_Delete_ZeroID(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/0", http.NoBody)
w := httptest.NewRecorder()
@@ -169,7 +175,7 @@ func TestCertificateHandler_DBSetupOrdering(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates", h.List)
@@ -179,3 +185,395 @@ func TestCertificateHandler_DBSetupOrdering(t *testing.T) {
assert.Equal(t, http.StatusOK, w.Code)
}
// --- Get handler tests ---
func TestCertificateHandler_Get_Success(t *testing.T) {
db := OpenTestDBWithMigrations(t)
expiry := time.Now().Add(30 * 24 * time.Hour)
db.Create(&models.SSLCertificate{UUID: "get-uuid-1", Name: "Get Test", Provider: "custom", Domains: "get.example.com", ExpiresAt: &expiry})
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates/:uuid", h.Get)
req := httptest.NewRequest(http.MethodGet, "/api/certificates/get-uuid-1", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
assert.Contains(t, w.Body.String(), "get-uuid-1")
assert.Contains(t, w.Body.String(), "Get Test")
}
func TestCertificateHandler_Get_NotFound(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates/:uuid", h.Get)
req := httptest.NewRequest(http.MethodGet, "/api/certificates/nonexistent-uuid", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestCertificateHandler_Get_EmptyUUID(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
// Route with empty uuid param won't match, test the handler directly with blank uuid
r.GET("/api/certificates/", h.Get)
req := httptest.NewRequest(http.MethodGet, "/api/certificates/", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
// Empty uuid should return 400 or 404 depending on router handling
assert.True(t, w.Code == http.StatusBadRequest || w.Code == http.StatusNotFound)
}
// --- SetDB test ---
func TestCertificateHandler_SetDB(t *testing.T) {
db := OpenTestDBWithMigrations(t)
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
assert.Nil(t, h.db)
h.SetDB(db)
assert.NotNil(t, h.db)
}
// --- Update handler tests ---
func TestCertificateHandler_Update_Success(t *testing.T) {
db := OpenTestDBWithMigrations(t)
expiry := time.Now().Add(30 * 24 * time.Hour)
db.Create(&models.SSLCertificate{UUID: "upd-uuid-1", Name: "Old Name", Provider: "custom", Domains: "update.example.com", ExpiresAt: &expiry})
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.PUT("/api/certificates/:uuid", h.Update)
body, _ := json.Marshal(map[string]string{"name": "New Name"})
req := httptest.NewRequest(http.MethodPut, "/api/certificates/upd-uuid-1", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
assert.Contains(t, w.Body.String(), "New Name")
}
func TestCertificateHandler_Update_NotFound(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.PUT("/api/certificates/:uuid", h.Update)
body, _ := json.Marshal(map[string]string{"name": "New Name"})
req := httptest.NewRequest(http.MethodPut, "/api/certificates/nonexistent-uuid", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestCertificateHandler_Update_BadJSON(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.PUT("/api/certificates/:uuid", h.Update)
req := httptest.NewRequest(http.MethodPut, "/api/certificates/some-uuid", strings.NewReader("{invalid"))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestCertificateHandler_Update_MissingName(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.PUT("/api/certificates/:uuid", h.Update)
body, _ := json.Marshal(map[string]string{})
req := httptest.NewRequest(http.MethodPut, "/api/certificates/some-uuid", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
// --- Validate handler tests ---
func TestCertificateHandler_Validate_Success(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/validate", h.Validate)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte(certPEM))
part2, _ := writer.CreateFormFile("key_file", "key.pem")
_, _ = part2.Write([]byte(keyPEM))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
assert.Contains(t, w.Body.String(), "valid")
}
func TestCertificateHandler_Validate_NoCertFile(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/validate", h.Validate)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", strings.NewReader(""))
req.Header.Set("Content-Type", "multipart/form-data")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestCertificateHandler_Validate_CertOnly(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/validate", h.Validate)
certPEM, _, err := generateSelfSignedCertPEM()
require.NoError(t, err)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte(certPEM))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
}
// --- Export handler tests ---
func TestCertificateHandler_Export_EmptyUUID(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/:uuid/export", h.Export)
body, _ := json.Marshal(map[string]any{"format": "pem"})
// Use a route that provides :uuid param as empty would not match normal routing
req := httptest.NewRequest(http.MethodPost, "/api/certificates//export", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
// Router won't match empty uuid, so 404 or redirect
assert.True(t, w.Code == http.StatusNotFound || w.Code == http.StatusMovedPermanently || w.Code == http.StatusBadRequest)
}
func TestCertificateHandler_Export_BadJSON(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/:uuid/export", h.Export)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/some-uuid/export", strings.NewReader("{bad"))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestCertificateHandler_Export_NotFound(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/:uuid/export", h.Export)
body, _ := json.Marshal(map[string]any{"format": "pem"})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/nonexistent-uuid/export", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestCertificateHandler_Export_PEMSuccess(t *testing.T) {
db := OpenTestDBWithMigrations(t)
certPEM, _, err := generateSelfSignedCertPEM()
require.NoError(t, err)
cert := models.SSLCertificate{UUID: "export-uuid-1", Name: "Export Test", Provider: "custom", Domains: "export.example.com", Certificate: certPEM}
db.Create(&cert)
r := gin.New()
r.Use(mockAuthMiddleware())
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/:uuid/export", h.Export)
body, _ := json.Marshal(map[string]any{"format": "pem"})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/export-uuid-1/export", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
assert.Contains(t, w.Header().Get("Content-Disposition"), "Export Test.pem")
}
func TestCertificateHandler_Export_IncludeKeyNoPassword(t *testing.T) {
db := OpenTestDBWithMigrations(t)
cert := models.SSLCertificate{UUID: "export-uuid-2", Name: "Key Test", Provider: "custom", Domains: "key.example.com"}
db.Create(&cert)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates/:uuid/export", h.Export)
body, _ := json.Marshal(map[string]any{"format": "pem", "include_key": true})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/export-uuid-2/export", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "password required")
}
func TestCertificateHandler_Export_IncludeKeyNoDBSet(t *testing.T) {
db := OpenTestDBWithMigrations(t)
cert := models.SSLCertificate{UUID: "export-uuid-3", Name: "No DB Test", Provider: "custom", Domains: "nodb.example.com"}
db.Create(&cert)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
// h.db is nil - not set via SetDB
r.POST("/api/certificates/:uuid/export", h.Export)
body, _ := json.Marshal(map[string]any{"format": "pem", "include_key": true, "password": "test123"})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/export-uuid-3/export", bytes.NewBuffer(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "authentication required")
}
// --- Delete via UUID path tests ---
func TestCertificateHandler_Delete_UUIDPath_NotFound(t *testing.T) {
db := OpenTestDBWithMigrations(t)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.DELETE("/api/certificates/:uuid", h.Delete)
// Valid UUID format but does not exist
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/00000000-0000-0000-0000-000000000001", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestCertificateHandler_Delete_UUIDPath_InUse(t *testing.T) {
db := OpenTestDBWithMigrations(t)
cert := models.SSLCertificate{UUID: "11111111-1111-1111-1111-111111111111", Name: "InUse UUID", Provider: "custom", Domains: "uuid-inuse.example.com"}
db.Create(&cert)
ph := models.ProxyHost{UUID: "ph-uuid-del", Name: "Proxy", DomainNames: "uuid-inuse.example.com", ForwardHost: "localhost", ForwardPort: 8080, CertificateID: &cert.ID}
db.Create(&ph)
r := gin.New()
r.Use(mockAuthMiddleware())
svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/11111111-1111-1111-1111-111111111111", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusConflict, w.Code)
}
// --- sanitizeCertRef tests ---
func TestSanitizeCertRef(t *testing.T) {
assert.Equal(t, "00000000-0000-0000-0000-000000000001", sanitizeCertRef("00000000-0000-0000-0000-000000000001"))
assert.Equal(t, "123", sanitizeCertRef("123"))
assert.Equal(t, "[invalid-ref]", sanitizeCertRef("not-valid"))
assert.Equal(t, "0", sanitizeCertRef("0"))
}
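The assertions above pin down `sanitizeCertRef`'s contract: UUID-shaped and purely numeric refs pass through unchanged, anything else collapses to a fixed placeholder so untrusted input never reaches logs or responses verbatim. A minimal sketch consistent with those assertions (the regexes and variable names are assumptions, not the handler's actual implementation):

```go
package main

import "regexp"

var (
	uuidRe = regexp.MustCompile(`^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$`)
	numRe  = regexp.MustCompile(`^[0-9]+$`)
)

// sanitizeCertRef returns ref unchanged when it looks like a UUID or a
// numeric ID, and a fixed placeholder otherwise.
func sanitizeCertRef(ref string) string {
	if uuidRe.MatchString(ref) || numRe.MatchString(ref) {
		return ref
	}
	return "[invalid-ref]"
}
```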


@@ -0,0 +1,707 @@
package handlers
import (
"bytes"
"encoding/json"
"fmt"
"mime/multipart"
"net/http"
"net/http/httptest"
"testing"
"github.com/gin-gonic/gin"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
)
// --- Delete UUID path with backup service ---
func TestDelete_UUID_WithBackup_Success(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
db.Create(&models.SSLCertificate{UUID: certUUID, Name: "backup-uuid", Provider: "custom", Domains: "backup.test"})
svc := services.NewCertificateService(tmpDir, db, nil)
mock := &mockBackupService{
createFunc: func() (string, error) { return "/tmp/backup.tar.gz", nil },
availableSpaceFunc: func() (int64, error) { return 1024 * 1024 * 1024, nil },
}
h := NewCertificateHandler(svc, mock, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
}
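The `mockBackupService` literal above is a function-field test double: each test overrides only the behavior it cares about. A sketch of the shape implied by how the tests configure it — the method names `CreateBackup` and `AvailableSpace`, and the defaults, are assumptions inferred from usage, not the repository's actual interface:

```go
package main

// mockBackupService delegates each method to an optional func field so a
// test can stub exactly one behavior and leave the rest at safe defaults.
type mockBackupService struct {
	createFunc         func() (string, error)
	availableSpaceFunc func() (int64, error)
}

func (m *mockBackupService) CreateBackup() (string, error) {
	if m.createFunc != nil {
		return m.createFunc()
	}
	return "", nil
}

func (m *mockBackupService) AvailableSpace() (int64, error) {
	if m.availableSpaceFunc != nil {
		return m.availableSpaceFunc()
	}
	return 1 << 30, nil // default: plenty of space
}
```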
func TestDelete_UUID_NotFound(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
nonExistentUUID := uuid.New().String()
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+nonExistentUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestDelete_UUID_InUse(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
cert := models.SSLCertificate{UUID: certUUID, Name: "inuse-uuid", Provider: "custom", Domains: "inuse.test"}
db.Create(&cert)
db.Create(&models.ProxyHost{UUID: "ph-uuid-inuse", Name: "ph", DomainNames: "inuse.test", ForwardHost: "localhost", ForwardPort: 8080, CertificateID: &cert.ID})
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusConflict, w.Code)
}
func TestDelete_UUID_BackupLowSpace(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
db.Create(&models.SSLCertificate{UUID: certUUID, Name: "low-space", Provider: "custom", Domains: "lowspace.test"})
svc := services.NewCertificateService(tmpDir, db, nil)
mock := &mockBackupService{
availableSpaceFunc: func() (int64, error) { return 1024, nil }, // 1KB - too low
}
h := NewCertificateHandler(svc, mock, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInsufficientStorage, w.Code)
}
func TestDelete_UUID_BackupSpaceCheckError(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
db.Create(&models.SSLCertificate{UUID: certUUID, Name: "space-err", Provider: "custom", Domains: "spaceerr.test"})
svc := services.NewCertificateService(tmpDir, db, nil)
mock := &mockBackupService{
availableSpaceFunc: func() (int64, error) { return 0, fmt.Errorf("disk error") },
createFunc: func() (string, error) { return "/tmp/backup.tar.gz", nil },
}
h := NewCertificateHandler(svc, mock, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
// Space check error → proceeds with backup → succeeds
assert.Equal(t, http.StatusOK, w.Code)
}
func TestDelete_UUID_BackupCreateError(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
db.Create(&models.SSLCertificate{UUID: certUUID, Name: "backup-fail", Provider: "custom", Domains: "backupfail.test"})
svc := services.NewCertificateService(tmpDir, db, nil)
mock := &mockBackupService{
availableSpaceFunc: func() (int64, error) { return 1024 * 1024 * 1024, nil },
createFunc: func() (string, error) { return "", fmt.Errorf("backup creation failed") },
}
h := NewCertificateHandler(svc, mock, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
// --- Delete UUID with notification service ---
func TestDelete_UUID_WithNotification(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.Setting{}, &models.Notification{}, &models.NotificationProvider{}))
certUUID := uuid.New().String()
db.Create(&models.SSLCertificate{UUID: certUUID, Name: "notify-cert", Provider: "custom", Domains: "notify.test"})
svc := services.NewCertificateService(tmpDir, db, nil)
notifSvc := services.NewNotificationService(db, nil)
h := NewCertificateHandler(svc, nil, notifSvc)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
}
// --- Validate handler ---
func TestValidate_Success(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
certPEM, _, err := generateSelfSignedCertPEM()
require.NoError(t, err)
body := &bytes.Buffer{}
writer := multipart.NewWriter(body)
part, err := writer.CreateFormFile("certificate_file", "cert.pem")
require.NoError(t, err)
_, err = part.Write([]byte(certPEM))
require.NoError(t, err)
require.NoError(t, writer.Close())
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
}
func TestValidate_InvalidCert(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
body := &bytes.Buffer{}
writer := multipart.NewWriter(body)
part, err := writer.CreateFormFile("certificate_file", "cert.pem")
require.NoError(t, err)
_, err = part.Write([]byte("not a certificate"))
require.NoError(t, err)
require.NoError(t, writer.Close())
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
assert.Contains(t, w.Body.String(), "unrecognized certificate format")
}
func TestValidate_NoCertFile(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", http.NoBody)
req.Header.Set("Content-Type", "multipart/form-data; boundary=boundary")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestValidate_WithKeyAndChain(t *testing.T) {
tmpDir := t.TempDir()
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
body := &bytes.Buffer{}
writer := multipart.NewWriter(body)
certPart, err := writer.CreateFormFile("certificate_file", "cert.pem")
require.NoError(t, err)
_, err = certPart.Write([]byte(certPEM))
require.NoError(t, err)
keyPart, err := writer.CreateFormFile("key_file", "key.pem")
require.NoError(t, err)
_, err = keyPart.Write([]byte(keyPEM))
require.NoError(t, err)
chainPart, err := writer.CreateFormFile("chain_file", "chain.pem")
require.NoError(t, err)
_, err = chainPart.Write([]byte(certPEM)) // self-signed chain
require.NoError(t, err)
require.NoError(t, writer.Close())
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
}
// --- Get handler DB error (non-NotFound) ---
func TestGet_DBError(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
// Deliberately don't migrate - any query will fail with "no such table"
svc := services.NewCertificateService(t.TempDir(), db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.GET("/api/certificates/:uuid", h.Get)
req := httptest.NewRequest(http.MethodGet, "/api/certificates/"+uuid.New().String(), http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
// Should be 500 since the table doesn't exist
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
// --- Export handler: re-auth and service error paths ---
func TestExport_IncludeKey_MissingPassword(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_IncludeKey_NoUserContext(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New() // no middleware — "user" key absent
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true,"password":"somepass"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_IncludeKey_InvalidClaimsType(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
r.Use(func(c *gin.Context) { c.Set("user", "not-a-map"); c.Next() })
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true,"password":"somepass"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_IncludeKey_UserIDNotInClaims(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
r.Use(func(c *gin.Context) { c.Set("user", map[string]any{}); c.Next() }) // no "id" key
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true,"password":"somepass"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_IncludeKey_UserNotFoundInDB(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
r.Use(func(c *gin.Context) { c.Set("user", map[string]any{"id": float64(9999)}); c.Next() })
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true,"password":"somepass"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_IncludeKey_WrongPassword(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
u := &models.User{UUID: uuid.New().String(), Email: "export@example.com", Name: "Export User"}
require.NoError(t, u.SetPassword("correctpass"))
require.NoError(t, db.Create(u).Error)
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
r.Use(func(c *gin.Context) { c.Set("user", map[string]any{"id": float64(u.ID)}); c.Next() })
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem","include_key":true,"password":"wrongpass"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
}
func TestExport_CertNotFound(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"pem"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+uuid.New().String()+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestExport_ServiceError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
cert := models.SSLCertificate{UUID: certUUID, Name: "test", Domains: "test.example.com", Provider: "custom"}
require.NoError(t, db.Create(&cert).Error)
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/:uuid/export", h.Export)
body := bytes.NewBufferString(`{"format":"unsupported_xyz"}`)
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+certUUID+"/export", body)
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
// --- Delete numeric ID paths ---
func TestDelete_NumericID_UsageCheckError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{})) // no ProxyHost → IsCertificateInUse fails
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/1", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
func TestDelete_NumericID_LowDiskSpace(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cert := models.SSLCertificate{UUID: uuid.New().String(), Name: "low-space", Domains: "lowspace.example.com", Provider: "custom"}
require.NoError(t, db.Create(&cert).Error)
svc := services.NewCertificateService(tmpDir, db, nil)
backup := &mockBackupService{
availableSpaceFunc: func() (int64, error) { return 1024, nil }, // < 100 MB
createFunc: func() (string, error) { return "", nil },
}
h := NewCertificateHandler(svc, backup, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, fmt.Sprintf("/api/certificates/%d", cert.ID), http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInsufficientStorage, w.Code)
}
func TestDelete_NumericID_BackupError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cert := models.SSLCertificate{UUID: uuid.New().String(), Name: "backup-err", Domains: "backuperr.example.com", Provider: "custom"}
require.NoError(t, db.Create(&cert).Error)
svc := services.NewCertificateService(tmpDir, db, nil)
backup := &mockBackupService{
availableSpaceFunc: func() (int64, error) { return 1 << 30, nil }, // 1 GB — plenty
createFunc: func() (string, error) { return "", fmt.Errorf("backup create failed") },
}
h := NewCertificateHandler(svc, backup, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, fmt.Sprintf("/api/certificates/%d", cert.ID), http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
func TestDelete_NumericID_DeleteError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.ProxyHost{})) // no SSLCertificate → DeleteCertificateByID fails
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/42", http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
// --- Delete UUID: internal usage-check error ---
func TestDelete_UUID_UsageCheckInternalError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{})) // no ProxyHost → IsCertificateInUse fails
certUUID := uuid.New().String()
cert := models.SSLCertificate{UUID: certUUID, Name: "uuid-err", Domains: "uuiderr.example.com", Provider: "custom"}
require.NoError(t, db.Create(&cert).Error)
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+certUUID, http.NoBody)
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
// --- sendDeleteNotification: rate limit ---
func TestSendDeleteNotification_RateLimit(t *testing.T) {
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
ns := services.NewNotificationService(db, nil)
svc := services.NewCertificateService(t.TempDir(), db, nil)
h := NewCertificateHandler(svc, nil, ns)
w := httptest.NewRecorder()
ctx, _ := gin.CreateTestContext(w)
ctx.Request = httptest.NewRequest(http.MethodDelete, "/", http.NoBody)
certRef := uuid.New().String()
h.sendDeleteNotification(ctx, certRef) // first call — sets timestamp
h.sendDeleteNotification(ctx, certRef) // second call — hits rate limit branch
}
// --- Update: empty UUID param (lines 207-209) ---
func TestUpdate_EmptyUUID(t *testing.T) {
svc := services.NewCertificateService(t.TempDir(), nil, nil)
h := NewCertificateHandler(svc, nil, nil)
w := httptest.NewRecorder()
ctx, _ := gin.CreateTestContext(w)
ctx.Request = httptest.NewRequest(http.MethodPut, "/api/certificates/", bytes.NewBufferString(`{"name":"test"}`))
ctx.Request.Header.Set("Content-Type", "application/json")
// No Params set — c.Param("uuid") returns ""
h.Update(ctx)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
// --- Update: DB error (non-ErrCertNotFound) → lines 223-225 ---
func TestUpdate_DBError(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
// Deliberately no AutoMigrate → ssl_certificates table absent → "no such table" error
svc := services.NewCertificateService(t.TempDir(), db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.PUT("/api/certificates/:uuid", h.Update)
body, _ := json.Marshal(map[string]string{"name": "new-name"})
req := httptest.NewRequest(http.MethodPut, "/api/certificates/"+uuid.New().String(), bytes.NewReader(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}


@@ -30,9 +30,9 @@ func TestCertificateHandler_Delete_RequiresAuth(t *testing.T) {
r.Use(func(c *gin.Context) {
c.AbortWithStatusJSON(http.StatusUnauthorized, gin.H{"error": "Unauthorized"})
})
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/1", http.NoBody)
w := httptest.NewRecorder()
@@ -59,7 +59,7 @@ func TestCertificateHandler_List_RequiresAuth(t *testing.T) {
r.Use(func(c *gin.Context) {
c.AbortWithStatusJSON(http.StatusUnauthorized, gin.H{"error": "Unauthorized"})
})
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates", h.List)
@@ -88,7 +88,7 @@ func TestCertificateHandler_Upload_RequiresAuth(t *testing.T) {
r.Use(func(c *gin.Context) {
c.AbortWithStatusJSON(http.StatusUnauthorized, gin.H{"error": "Unauthorized"})
})
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
@@ -125,7 +125,7 @@ func TestCertificateHandler_Delete_DiskSpaceCheck(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
// Mock backup service that reports low disk space
mockBackup := &mockBackupService{
@@ -135,7 +135,7 @@ func TestCertificateHandler_Delete_DiskSpaceCheck(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackup, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, fmt.Sprintf("/api/certificates/%d", cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -177,7 +177,7 @@ func TestCertificateHandler_Delete_NotificationRateLimiting(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
mockBackup := &mockBackupService{
createFunc: func() (string, error) {
@@ -186,7 +186,7 @@ func TestCertificateHandler_Delete_NotificationRateLimiting(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackup, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
// Delete first cert
req1 := httptest.NewRequest(http.MethodDelete, fmt.Sprintf("/api/certificates/%d", cert1.ID), http.NoBody)


@@ -39,9 +39,9 @@ func setupCertTestRouter(t *testing.T, db *gorm.DB) *gin.Engine {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
return r
}
@@ -111,7 +111,7 @@ func TestDeleteCertificate_CreatesBackup(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
// Mock BackupService
backupCalled := false
@@ -123,7 +123,7 @@ func TestDeleteCertificate_CreatesBackup(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -164,7 +164,7 @@ func TestDeleteCertificate_BackupFailure(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
// Mock BackupService that fails
mockBackupService := &mockBackupService{
@@ -174,7 +174,7 @@ func TestDeleteCertificate_BackupFailure(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -217,7 +217,7 @@ func TestDeleteCertificate_InUse_NoBackup(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
- svc := services.NewCertificateService("/tmp", db)
+ svc := services.NewCertificateService("/tmp", db, nil)
// Mock BackupService
backupCalled := false
@@ -229,7 +229,7 @@ func TestDeleteCertificate_InUse_NoBackup(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, nil)
- r.DELETE("/api/certificates/:id", h.Delete)
+ r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -295,7 +295,7 @@ func TestCertificateHandler_List(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.GET("/api/certificates", h.List)
@@ -321,7 +321,7 @@ func TestCertificateHandler_Upload_MissingName(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
@@ -348,7 +348,7 @@ func TestCertificateHandler_Upload_MissingCertFile(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
@@ -378,7 +378,7 @@ func TestCertificateHandler_Upload_MissingKeyFile(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
@@ -404,10 +404,15 @@ func TestCertificateHandler_Upload_MissingKeyFile_MultipartWithCert(t *testing.T
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
+certPEM, _, genErr := generateSelfSignedCertPEM()
+if genErr != nil {
+t.Fatalf("failed to generate self-signed cert: %v", genErr)
+}
var body bytes.Buffer
writer := multipart.NewWriter(&body)
_ = writer.WriteField("name", "testcert")
@@ -415,7 +420,7 @@ func TestCertificateHandler_Upload_MissingKeyFile_MultipartWithCert(t *testing.T
if createErr != nil {
t.Fatalf("failed to create form file: %v", createErr)
}
-_, _ = part.Write([]byte("-----BEGIN CERTIFICATE-----\nMIIB\n-----END CERTIFICATE-----"))
+_, _ = part.Write([]byte(certPEM))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates", &body)
@@ -426,7 +431,7 @@ func TestCertificateHandler_Upload_MissingKeyFile_MultipartWithCert(t *testing.T
if w.Code != http.StatusBadRequest {
t.Fatalf("expected 400 Bad Request, got %d, body=%s", w.Code, w.Body.String())
}
-if !strings.Contains(w.Body.String(), "key_file") {
+if !strings.Contains(w.Body.String(), "key_file is required") {
t.Fatalf("expected error message about key_file, got: %s", w.Body.String())
}
}
@@ -447,7 +452,7 @@ func TestCertificateHandler_Upload_Success(t *testing.T) {
// Create a mock CertificateService that returns a created certificate
// Create a temporary services.CertificateService with a temp dir and DB
tmpDir := t.TempDir()
-svc := services.NewCertificateService(tmpDir, db)
+svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r.POST("/api/certificates", h.Upload)
@@ -519,7 +524,7 @@ func TestCertificateHandler_Upload_WithNotificationService(t *testing.T) {
r.Use(mockAuthMiddleware())
tmpDir := t.TempDir()
-svc := services.NewCertificateService(tmpDir, db)
+svc := services.NewCertificateService(tmpDir, db, nil)
ns := services.NewNotificationService(db, nil)
h := NewCertificateHandler(svc, nil, ns)
r.POST("/api/certificates", h.Upload)
@@ -555,9 +560,9 @@ func TestDeleteCertificate_InvalidID(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/invalid", http.NoBody)
w := httptest.NewRecorder()
@@ -580,9 +585,9 @@ func TestDeleteCertificate_ZeroID(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/0", http.NoBody)
w := httptest.NewRecorder()
@@ -611,7 +616,7 @@ func TestDeleteCertificate_LowDiskSpace(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
// Mock BackupService with low disk space
mockBackupService := &mockBackupService{
@@ -621,7 +626,7 @@ func TestDeleteCertificate_LowDiskSpace(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -659,7 +664,7 @@ func TestDeleteCertificate_DiskSpaceCheckError(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
// Mock BackupService with space check error but backup succeeds
mockBackupService := &mockBackupService{
@@ -672,7 +677,7 @@ func TestDeleteCertificate_DiskSpaceCheckError(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -717,7 +722,7 @@ func TestDeleteCertificate_ExpiredLetsEncrypt_NotInUse(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
mockBS := &mockBackupService{
createFunc: func() (string, error) {
@@ -726,7 +731,7 @@ func TestDeleteCertificate_ExpiredLetsEncrypt_NotInUse(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBS, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -775,7 +780,7 @@ func TestDeleteCertificate_ValidLetsEncrypt_NotInUse(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
mockBS := &mockBackupService{
createFunc: func() (string, error) {
@@ -784,7 +789,7 @@ func TestDeleteCertificate_ValidLetsEncrypt_NotInUse(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBS, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -820,9 +825,9 @@ func TestDeleteCertificate_UsageCheckError(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
h := NewCertificateHandler(svc, nil, nil)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert.ID), http.NoBody)
w := httptest.NewRecorder()
@@ -857,7 +862,7 @@ func TestDeleteCertificate_NotificationRateLimit(t *testing.T) {
r := gin.New()
r.Use(mockAuthMiddleware())
-svc := services.NewCertificateService("/tmp", db)
+svc := services.NewCertificateService("/tmp", db, nil)
ns := services.NewNotificationService(db, nil)
mockBackupService := &mockBackupService{
@@ -867,7 +872,7 @@ func TestDeleteCertificate_NotificationRateLimit(t *testing.T) {
}
h := NewCertificateHandler(svc, mockBackupService, ns)
-r.DELETE("/api/certificates/:id", h.Delete)
+r.DELETE("/api/certificates/:uuid", h.Delete)
// Delete first certificate
req := httptest.NewRequest(http.MethodDelete, "/api/certificates/"+toStr(cert1.ID), http.NoBody)

View File

@@ -0,0 +1,382 @@
package handlers
import (
"bytes"
"encoding/base64"
"encoding/json"
"fmt"
"mime/multipart"
"net/http"
"net/http/httptest"
"testing"
"github.com/gin-gonic/gin"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
)
// --- Upload: with chain file (covers chain_file multipart branch) ---
func TestCertificateHandler_Upload_WithChainFile(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates", h.Upload)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
_ = writer.WriteField("name", "chain-cert")
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte(certPEM))
part2, _ := writer.CreateFormFile("key_file", "key.pem")
_, _ = part2.Write([]byte(keyPEM))
part3, _ := writer.CreateFormFile("chain_file", "chain.pem")
_, _ = part3.Write([]byte(certPEM))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusCreated, w.Code, "body: %s", w.Body.String())
}
// --- Upload: invalid cert data ---
func TestCertificateHandler_Upload_InvalidCertData(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates", h.Upload)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
_ = writer.WriteField("name", "bad-cert")
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte("not-a-cert"))
part2, _ := writer.CreateFormFile("key_file", "key.pem")
_, _ = part2.Write([]byte("not-a-key"))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
// --- Export re-authentication flow ---
func setupExportRouter(t *testing.T, db *gorm.DB) (*gin.Engine, *CertificateHandler) {
t.Helper()
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
r := gin.New()
return r, h
}
func newTestEncSvc(t *testing.T) *crypto.EncryptionService {
t.Helper()
key := make([]byte, 32)
for i := range key {
key[i] = byte(i)
}
svc, err := crypto.NewEncryptionService(base64.StdEncoding.EncodeToString(key))
require.NoError(t, err)
return svc
}
func TestCertificateHandler_Export_IncludeKeySuccess(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
user := models.User{UUID: "export-user-1", Email: "export@test.com", Name: "Exporter"}
require.NoError(t, user.SetPassword("correctpassword"))
require.NoError(t, db.Create(&user).Error)
encSvc := newTestEncSvc(t)
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, encSvc)
h := NewCertificateHandler(svc, nil, nil)
h.SetDB(db)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
info, err := svc.UploadCertificate("export-cert", certPEM, keyPEM, "")
require.NoError(t, err)
r := gin.New()
r.Use(func(c *gin.Context) {
c.Set("user", map[string]any{"id": user.ID})
c.Next()
})
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "correctpassword",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/"+info.UUID+"/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code, "body: %s", w.Body.String())
assert.Contains(t, w.Header().Get("Content-Disposition"), "export-cert.pem")
}
func TestCertificateHandler_Export_IncludeKeyWrongPassword(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
r, h := setupExportRouter(t, db)
user := models.User{UUID: "wrong-pw-user", Email: "wrong@test.com", Name: "Wrong"}
require.NoError(t, user.SetPassword("rightpass"))
require.NoError(t, db.Create(&user).Error)
r.Use(func(c *gin.Context) {
c.Set("user", map[string]any{"id": user.ID})
c.Next()
})
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "wrongpass",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/fake-uuid/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "incorrect password")
}
func TestCertificateHandler_Export_NoUserInContext(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
r, h := setupExportRouter(t, db)
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "anything",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/fake-uuid/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "authentication required")
}
func TestCertificateHandler_Export_InvalidSession(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
r, h := setupExportRouter(t, db)
r.Use(func(c *gin.Context) {
c.Set("user", "not-a-map")
c.Next()
})
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "anything",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/fake-uuid/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "invalid session")
}
func TestCertificateHandler_Export_MissingUserID(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
r, h := setupExportRouter(t, db)
r.Use(func(c *gin.Context) {
c.Set("user", map[string]any{"name": "test"})
c.Next()
})
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "anything",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/fake-uuid/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "invalid session")
}
func TestCertificateHandler_Export_UserNotFound(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.User{}))
r, h := setupExportRouter(t, db)
r.Use(func(c *gin.Context) {
c.Set("user", map[string]any{"id": uint(9999)})
c.Next()
})
r.POST("/api/certificates/:uuid/export", h.Export)
payload, _ := json.Marshal(map[string]any{
"format": "pem",
"include_key": true,
"password": "anything",
})
req := httptest.NewRequest(http.MethodPost, "/api/certificates/fake-uuid/export", bytes.NewReader(payload))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusForbidden, w.Code)
assert.Contains(t, w.Body.String(), "user not found")
}
// --- Validate handler with key and chain ---
func TestCertificateHandler_Validate_WithKeyAndChain(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte(certPEM))
part2, _ := writer.CreateFormFile("key_file", "key.pem")
_, _ = part2.Write([]byte(keyPEM))
part3, _ := writer.CreateFormFile("chain_file", "chain.pem")
_, _ = part3.Write([]byte(certPEM))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code, "body: %s", w.Body.String())
}
func TestCertificateHandler_Validate_InvalidCert(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
part, _ := writer.CreateFormFile("certificate_file", "cert.pem")
_, _ = part.Write([]byte("not-a-cert"))
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
var resp map[string]any
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
errList, ok := resp["errors"].([]any)
assert.True(t, ok)
assert.Greater(t, len(errList), 0, "expected validation errors in response")
}
func TestCertificateHandler_Validate_MissingCertFile(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
tmpDir := t.TempDir()
svc := services.NewCertificateService(tmpDir, db, nil)
h := NewCertificateHandler(svc, nil, nil)
r := gin.New()
r.Use(mockAuthMiddleware())
r.POST("/api/certificates/validate", h.Validate)
var body bytes.Buffer
writer := multipart.NewWriter(&body)
_ = writer.WriteField("name", "test")
_ = writer.Close()
req := httptest.NewRequest(http.MethodPost, "/api/certificates/validate", &body)
req.Header.Set("Content-Type", writer.FormDataContentType())
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
assert.Contains(t, w.Body.String(), "certificate_file is required")
}

View File

@@ -63,6 +63,7 @@ type CrowdsecHandler struct {
Hub *crowdsec.HubService
Console *crowdsec.ConsoleEnrollmentService
Security *services.SecurityService
+WhitelistSvc *services.CrowdSecWhitelistService
CaddyManager *caddy.Manager // For config reload after bouncer registration
LAPIMaxWait time.Duration // For testing; 0 means 60s default
LAPIPollInterval time.Duration // For testing; 0 means 500ms default
@@ -383,7 +384,7 @@ func NewCrowdsecHandler(db *gorm.DB, executor CrowdsecExecutor, binPath, dataDir
securitySvc = services.NewSecurityService(db)
consoleSvc = crowdsec.NewConsoleEnrollmentService(db, &crowdsec.SecureCommandExecutor{}, dataDir, consoleSecret)
}
-return &CrowdsecHandler{
+h := &CrowdsecHandler{
DB: db,
Executor: executor,
CmdExec: &RealCommandExecutor{},
@@ -395,6 +396,10 @@ func NewCrowdsecHandler(db *gorm.DB, executor CrowdsecExecutor, binPath, dataDir
dashCache: newDashboardCache(),
validateLAPIURL: validateCrowdsecLAPIBaseURLDefault,
}
+if db != nil {
+h.WhitelistSvc = services.NewCrowdSecWhitelistService(db, dataDir)
+}
+return h
}
// isCerberusEnabled returns true when Cerberus is enabled via DB or env flag.
@@ -2700,6 +2705,75 @@ func fileExists(path string) bool {
return err == nil
}
// ListWhitelists returns all CrowdSec IP/CIDR whitelist entries.
func (h *CrowdsecHandler) ListWhitelists(c *gin.Context) {
entries, err := h.WhitelistSvc.List(c.Request.Context())
if err != nil {
logger.Log().WithError(err).Error("failed to list whitelist entries")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to list whitelist entries"})
return
}
c.JSON(http.StatusOK, gin.H{"whitelist": entries})
}
// AddWhitelist adds a new IP or CIDR to the CrowdSec whitelist.
func (h *CrowdsecHandler) AddWhitelist(c *gin.Context) {
var req struct {
IPOrCIDR string `json:"ip_or_cidr" binding:"required"`
Reason string `json:"reason"`
}
if err := c.ShouldBindJSON(&req); err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "ip_or_cidr is required"})
return
}
entry, err := h.WhitelistSvc.Add(c.Request.Context(), req.IPOrCIDR, req.Reason)
if err != nil {
switch {
case errors.Is(err, services.ErrInvalidIPOrCIDR):
c.JSON(http.StatusBadRequest, gin.H{"error": "invalid IP address or CIDR notation"})
case errors.Is(err, services.ErrDuplicateEntry):
c.JSON(http.StatusConflict, gin.H{"error": "entry already exists in whitelist"})
default:
logger.Log().WithError(err).Error("failed to add whitelist entry")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to add whitelist entry"})
}
return
}
if _, execErr := h.CmdExec.Execute(c.Request.Context(), "cscli", "hub", "reload"); execErr != nil {
logger.Log().WithError(execErr).Warn("cscli hub reload failed after whitelist add (non-fatal)")
}
c.JSON(http.StatusCreated, entry)
}
// DeleteWhitelist removes a whitelist entry by UUID.
func (h *CrowdsecHandler) DeleteWhitelist(c *gin.Context) {
id := c.Param("uuid")
if id == "" {
c.JSON(http.StatusBadRequest, gin.H{"error": "uuid is required"})
return
}
if err := h.WhitelistSvc.Delete(c.Request.Context(), id); err != nil {
switch {
case errors.Is(err, services.ErrWhitelistNotFound):
c.JSON(http.StatusNotFound, gin.H{"error": "whitelist entry not found"})
default:
logger.Log().WithError(err).Error("failed to delete whitelist entry")
c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to delete whitelist entry"})
}
return
}
if _, execErr := h.CmdExec.Execute(c.Request.Context(), "cscli", "hub", "reload"); execErr != nil {
logger.Log().WithError(execErr).Warn("cscli hub reload failed after whitelist delete (non-fatal)")
}
c.Status(http.StatusNoContent)
}
// RegisterRoutes registers crowdsec admin routes under protected group
func (h *CrowdsecHandler) RegisterRoutes(rg *gin.RouterGroup) {
rg.POST("/admin/crowdsec/start", h.Start)
@@ -2742,4 +2816,8 @@ func (h *CrowdsecHandler) RegisterRoutes(rg *gin.RouterGroup) {
rg.GET("/admin/crowdsec/dashboard/scenarios", h.DashboardScenarios)
rg.GET("/admin/crowdsec/alerts", h.ListAlerts)
rg.GET("/admin/crowdsec/decisions/export", h.ExportDecisions)
// Whitelist management endpoints (Issue #939)
rg.GET("/admin/crowdsec/whitelist", h.ListWhitelists)
rg.POST("/admin/crowdsec/whitelist", h.AddWhitelist)
rg.DELETE("/admin/crowdsec/whitelist/:uuid", h.DeleteWhitelist)
}

View File

@@ -0,0 +1,284 @@
package handlers
import (
"bytes"
"context"
"encoding/json"
"errors"
"net/http"
"net/http/httptest"
"testing"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
"github.com/gin-gonic/gin"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/gorm"
)
type mockCmdExecWhitelist struct {
reloadCalled bool
reloadErr error
}
func (m *mockCmdExecWhitelist) Execute(_ context.Context, _ string, _ ...string) ([]byte, error) {
m.reloadCalled = true
return nil, m.reloadErr
}
func setupWhitelistHandler(t *testing.T) (*CrowdsecHandler, *gin.Engine, *gorm.DB) {
t.Helper()
db := OpenTestDB(t)
require.NoError(t, db.AutoMigrate(&models.CrowdSecWhitelist{}))
fe := &fakeExec{}
h := newTestCrowdsecHandler(t, db, fe, "/bin/false", "")
h.WhitelistSvc = services.NewCrowdSecWhitelistService(db, "")
r := gin.New()
g := r.Group("/api/v1")
g.GET("/admin/crowdsec/whitelist", h.ListWhitelists)
g.POST("/admin/crowdsec/whitelist", h.AddWhitelist)
g.DELETE("/admin/crowdsec/whitelist/:uuid", h.DeleteWhitelist)
return h, r, db
}
func TestListWhitelists_Empty(t *testing.T) {
t.Parallel()
_, r, _ := setupWhitelistHandler(t)
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/api/v1/admin/crowdsec/whitelist", nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
entries, ok := resp["whitelist"].([]interface{})
assert.True(t, ok)
assert.Empty(t, entries)
}
func TestAddWhitelist_ValidIP(t *testing.T) {
t.Parallel()
h, r, _ := setupWhitelistHandler(t)
mock := &mockCmdExecWhitelist{}
h.CmdExec = mock
body := `{"ip_or_cidr":"1.2.3.4","reason":"test"}`
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusCreated, w.Code)
assert.True(t, mock.reloadCalled)
var entry models.CrowdSecWhitelist
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &entry))
assert.Equal(t, "1.2.3.4", entry.IPOrCIDR)
assert.NotEmpty(t, entry.UUID)
}
func TestAddWhitelist_InvalidIP(t *testing.T) {
t.Parallel()
_, r, _ := setupWhitelistHandler(t)
body := `{"ip_or_cidr":"not-valid","reason":""}`
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestAddWhitelist_Duplicate(t *testing.T) {
t.Parallel()
_, r, _ := setupWhitelistHandler(t)
body := `{"ip_or_cidr":"9.9.9.9","reason":""}`
for i := 0; i < 2; i++ {
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
if i == 0 {
assert.Equal(t, http.StatusCreated, w.Code)
} else {
assert.Equal(t, http.StatusConflict, w.Code)
}
}
}
func TestDeleteWhitelist_Existing(t *testing.T) {
t.Parallel()
h, r, db := setupWhitelistHandler(t)
mock := &mockCmdExecWhitelist{}
h.CmdExec = mock
svc := services.NewCrowdSecWhitelistService(db, "")
entry, err := svc.Add(t.Context(), "7.7.7.7", "to delete")
require.NoError(t, err)
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodDelete, "/api/v1/admin/crowdsec/whitelist/"+entry.UUID, nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNoContent, w.Code)
assert.True(t, mock.reloadCalled)
}
func TestDeleteWhitelist_NotFound(t *testing.T) {
t.Parallel()
_, r, _ := setupWhitelistHandler(t)
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodDelete, "/api/v1/admin/crowdsec/whitelist/00000000-0000-0000-0000-000000000000", nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNotFound, w.Code)
}
func TestListWhitelists_AfterAdd(t *testing.T) {
t.Parallel()
_, r, db := setupWhitelistHandler(t)
svc := services.NewCrowdSecWhitelistService(db, "")
_, err := svc.Add(t.Context(), "8.8.8.8", "google dns")
require.NoError(t, err)
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/api/v1/admin/crowdsec/whitelist", nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusOK, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
entries := resp["whitelist"].([]interface{})
assert.Len(t, entries, 1)
}
func TestAddWhitelist_400_MissingField(t *testing.T) {
t.Parallel()
_, r, _ := setupWhitelistHandler(t)
body := `{}`
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
assert.Equal(t, "ip_or_cidr is required", resp["error"])
}
func TestListWhitelists_DBError(t *testing.T) {
t.Parallel()
_, r, db := setupWhitelistHandler(t)
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/api/v1/admin/crowdsec/whitelist", nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
assert.Equal(t, "failed to list whitelist entries", resp["error"])
}
func TestAddWhitelist_DBError(t *testing.T) {
t.Parallel()
_, r, db := setupWhitelistHandler(t)
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
body := `{"ip_or_cidr":"1.2.3.4","reason":"test"}`
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
assert.Equal(t, "failed to add whitelist entry", resp["error"])
}
func TestAddWhitelist_ReloadFailure(t *testing.T) {
t.Parallel()
h, r, _ := setupWhitelistHandler(t)
mock := &mockCmdExecWhitelist{reloadErr: errors.New("cscli failed")}
h.CmdExec = mock
body := `{"ip_or_cidr":"3.3.3.3","reason":"reload test"}`
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodPost, "/api/v1/admin/crowdsec/whitelist", bytes.NewBufferString(body))
req.Header.Set("Content-Type", "application/json")
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusCreated, w.Code)
assert.True(t, mock.reloadCalled)
}
func TestDeleteWhitelist_DBError(t *testing.T) {
t.Parallel()
_, r, db := setupWhitelistHandler(t)
svc := services.NewCrowdSecWhitelistService(db, "")
entry, err := svc.Add(t.Context(), "4.4.4.4", "will close db")
require.NoError(t, err)
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodDelete, "/api/v1/admin/crowdsec/whitelist/"+entry.UUID, nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
assert.Equal(t, "failed to delete whitelist entry", resp["error"])
}
func TestDeleteWhitelist_ReloadFailure(t *testing.T) {
t.Parallel()
h, r, db := setupWhitelistHandler(t)
mock := &mockCmdExecWhitelist{reloadErr: errors.New("cscli failed")}
h.CmdExec = mock
svc := services.NewCrowdSecWhitelistService(db, "")
entry, err := svc.Add(t.Context(), "5.5.5.5", "reload test")
require.NoError(t, err)
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodDelete, "/api/v1/admin/crowdsec/whitelist/"+entry.UUID, nil)
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusNoContent, w.Code)
assert.True(t, mock.reloadCalled)
}
func TestDeleteWhitelist_EmptyUUID(t *testing.T) {
t.Parallel()
h, _, _ := setupWhitelistHandler(t)
w := httptest.NewRecorder()
c, _ := gin.CreateTestContext(w)
c.Request = httptest.NewRequest(http.MethodDelete, "/api/v1/admin/crowdsec/whitelist/", nil)
c.Params = gin.Params{{Key: "uuid", Value: ""}}
h.DeleteWhitelist(c)
assert.Equal(t, http.StatusBadRequest, w.Code)
var resp map[string]interface{}
require.NoError(t, json.Unmarshal(w.Body.Bytes(), &resp))
assert.Equal(t, "uuid is required", resp["error"])
}


@@ -248,6 +248,38 @@ func (h *ProxyHostHandler) resolveSecurityHeaderProfileReference(value any) (*ui
return &id, nil
}
func (h *ProxyHostHandler) resolveCertificateReference(value any) (*uint, error) {
if value == nil {
return nil, nil
}
parsedID, _, parseErr := parseNullableUintField(value, "certificate_id")
if parseErr == nil {
return parsedID, nil
}
uuidValue, isString := value.(string)
if !isString {
return nil, parseErr
}
trimmed := strings.TrimSpace(uuidValue)
if trimmed == "" {
return nil, nil
}
var cert models.SSLCertificate
if err := h.db.Select("id").Where("uuid = ?", trimmed).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return nil, fmt.Errorf("certificate not found")
}
return nil, fmt.Errorf("failed to resolve certificate")
}
id := cert.ID
return &id, nil
}
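The resolver above accepts either a numeric database ID or a UUID string for `certificate_id`, in that order of preference. A simplified, self-contained sketch of the same fallback order (here the UUID lookup is a map rather than the handler's GORM query, so the names and error strings are illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// resolveRef sketches resolveCertificateReference's fallback order:
// try a numeric ID first, then treat the value as a UUID string.
// (Stand-in: the real handler resolves UUIDs via a database query.)
func resolveRef(value any, uuidIndex map[string]uint) (*uint, error) {
	if value == nil {
		return nil, nil
	}
	if n, ok := value.(float64); ok { // JSON numbers decode as float64
		id := uint(n)
		return &id, nil
	}
	s, ok := value.(string)
	if !ok {
		return nil, fmt.Errorf("unsupported reference type")
	}
	trimmed := strings.TrimSpace(s)
	if trimmed == "" {
		return nil, nil // empty string clears the reference
	}
	id, found := uuidIndex[trimmed]
	if !found {
		return nil, fmt.Errorf("certificate not found")
	}
	return &id, nil
}

func main() {
	index := map[string]uint{"abc-123": 7}
	id, _ := resolveRef("abc-123", index)
	fmt.Println(*id)
}
```

Accepting both forms lets the frontend send the stable UUID while older clients keep sending numeric IDs, without a breaking API change.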
func parseForwardPortField(value any) (int, error) {
switch v := value.(type) {
case float64:
@@ -342,6 +374,15 @@ func (h *ProxyHostHandler) Create(c *gin.Context) {
payload["security_header_profile_id"] = resolvedSecurityHeaderID
}
if rawCertRef, ok := payload["certificate_id"]; ok {
resolvedCertID, resolveErr := h.resolveCertificateReference(rawCertRef)
if resolveErr != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": resolveErr.Error()})
return
}
payload["certificate_id"] = resolvedCertID
}
payloadBytes, marshalErr := json.Marshal(payload)
if marshalErr != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "invalid request payload"})
@@ -523,12 +564,12 @@ func (h *ProxyHostHandler) Update(c *gin.Context) {
// Nullable foreign keys
if v, ok := payload["certificate_id"]; ok {
parsedID, _, parseErr := parseNullableUintField(v, "certificate_id")
if parseErr != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": parseErr.Error()})
resolvedCertID, resolveErr := h.resolveCertificateReference(v)
if resolveErr != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": resolveErr.Error()})
return
}
host.CertificateID = parsedID
host.CertificateID = resolvedCertID
}
if v, ok := payload["access_list_id"]; ok {
resolvedAccessListID, resolveErr := h.resolveAccessListReference(v)


@@ -0,0 +1,128 @@
package handlers
import (
"bytes"
"encoding/json"
"net/http"
"net/http/httptest"
"testing"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func TestGenerateForwardHostWarnings_PrivateIP(t *testing.T) {
warnings := generateForwardHostWarnings("192.168.1.100")
require.Len(t, warnings, 1)
assert.Equal(t, "forward_host", warnings[0].Field)
}
func TestBulkUpdateSecurityHeaders_AllFail_Rollback(t *testing.T) {
r, _ := setupTestRouterForSecurityHeaders(t)
body, err := json.Marshal(map[string]any{
"host_uuids": []string{
uuid.New().String(),
uuid.New().String(),
uuid.New().String(),
},
})
require.NoError(t, err)
req := httptest.NewRequest(http.MethodPut, "/api/v1/proxy-hosts/bulk-update-security-headers", bytes.NewReader(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestBulkUpdateSecurityHeaders_ProfileDB_NonNotFoundError(t *testing.T) {
r, db := setupTestRouterForSecurityHeaders(t)
// Drop the security_header_profiles table so the lookup returns a non-NotFound DB error
require.NoError(t, db.Exec("DROP TABLE security_header_profiles").Error)
profileID := uint(1)
body, err := json.Marshal(map[string]any{
"host_uuids": []string{uuid.New().String()},
"security_header_profile_id": profileID,
})
require.NoError(t, err)
req := httptest.NewRequest(http.MethodPut, "/api/v1/proxy-hosts/bulk-update-security-headers", bytes.NewReader(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusInternalServerError, w.Code)
}
func TestGenerateForwardHostWarnings_DockerBridgeIP(t *testing.T) {
warnings := generateForwardHostWarnings("172.17.0.1")
require.Len(t, warnings, 1)
assert.Equal(t, "forward_host", warnings[0].Field)
}
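The PrivateIP and DockerBridgeIP tests both expect a single `forward_host` warning. A sketch of the kind of check `generateForwardHostWarnings` appears to perform — flagging RFC 1918 ranges and the default Docker bridge subnet, which often work on the host but not from inside a container (assumption: the real function's ranges and warning text may differ):

```go
package main

import (
	"fmt"
	"net"
)

// isLikelyUnroutableUpstream reports whether a forward_host value is a
// private or Docker-bridge IP that deserves a configuration warning.
func isLikelyUnroutableUpstream(host string) bool {
	ip := net.ParseIP(host)
	if ip == nil {
		return false // hostnames are validated elsewhere
	}
	for _, cidr := range []string{
		"10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16", // RFC 1918
		"172.17.0.0/16", // default Docker bridge (already inside 172.16/12)
	} {
		_, block, _ := net.ParseCIDR(cidr)
		if block.Contains(ip) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isLikelyUnroutableUpstream("192.168.1.100")) // true
	fmt.Println(isLikelyUnroutableUpstream("8.8.8.8"))       // false
}
```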
func TestParseNullableUintField_DefaultType(t *testing.T) {
id, exists, err := parseNullableUintField(true, "test_field")
assert.Nil(t, id)
assert.True(t, exists)
assert.Error(t, err)
}
func TestParseForwardPortField_StringEmpty(t *testing.T) {
_, err := parseForwardPortField("")
assert.Error(t, err)
}
func TestParseForwardPortField_StringNonNumeric(t *testing.T) {
_, err := parseForwardPortField("notaport")
assert.Error(t, err)
}
func TestParseForwardPortField_StringValid(t *testing.T) {
port, err := parseForwardPortField("8080")
require.NoError(t, err)
assert.Equal(t, 8080, port)
}
func TestParseForwardPortField_DefaultType(t *testing.T) {
_, err := parseForwardPortField(true)
assert.Error(t, err)
}
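The `parseForwardPortField` tests above pin down three cases: an empty string fails, a non-numeric string fails, and an unexpected type fails, while `"8080"` parses to `8080`. A standalone sketch of that contract (assumption: the real helper's error messages and range checks may differ) — note that JSON numbers arrive as `float64`, so both numeric and string inputs must be handled:

```go
package main

import (
	"fmt"
	"strconv"
)

// parsePortValue mirrors the contract the parseForwardPortField tests
// exercise: accept float64 (JSON number) or a numeric string, reject
// everything else.
func parsePortValue(value any) (int, error) {
	switch v := value.(type) {
	case float64:
		return int(v), nil
	case string:
		if v == "" {
			return 0, fmt.Errorf("forward_port is required")
		}
		port, err := strconv.Atoi(v)
		if err != nil {
			return 0, fmt.Errorf("forward_port must be a number")
		}
		return port, nil
	default:
		return 0, fmt.Errorf("forward_port has an unsupported type")
	}
}

func main() {
	port, err := parsePortValue("8080")
	fmt.Println(port, err)
}
```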
func TestCreate_InvalidCertificateRef(t *testing.T) {
r, _ := setupTestRouterForSecurityHeaders(t)
body, err := json.Marshal(map[string]any{
"domain_names": "cert-ref.example.com",
"forward_host": "localhost",
"forward_port": 8080,
"certificate_id": uuid.New().String(),
})
require.NoError(t, err)
req := httptest.NewRequest(http.MethodPost, "/api/v1/proxy-hosts", bytes.NewReader(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}
func TestCreate_InvalidSecurityHeaderProfileRef(t *testing.T) {
r, _ := setupTestRouterForSecurityHeaders(t)
body, err := json.Marshal(map[string]any{
"domain_names": "shp-ref.example.com",
"forward_host": "localhost",
"forward_port": 8080,
"security_header_profile_id": uuid.New().String(),
})
require.NoError(t, err)
req := httptest.NewRequest(http.MethodPost, "/api/v1/proxy-hosts", bytes.NewReader(body))
req.Header.Set("Content-Type", "application/json")
w := httptest.NewRecorder()
r.ServeHTTP(w, req)
assert.Equal(t, http.StatusBadRequest, w.Code)
}


@@ -1,6 +1,7 @@
package routes_test
import (
"context"
"testing"
"github.com/gin-gonic/gin"
@@ -20,7 +21,7 @@ func TestEndpointInventory_FrontendCanonicalSaveImportContractsExistInBackend(t
require.NoError(t, err)
router := gin.New()
require.NoError(t, routes.Register(router, db, config.Config{JWTSecret: "test-secret"}))
require.NoError(t, routes.Register(context.Background(), router, db, config.Config{JWTSecret: "test-secret"}))
routes.RegisterImportHandler(router, db, config.Config{JWTSecret: "test-secret"}, "echo", "/tmp", "/import/Caddyfile")
assertStrictMethodPathMatrix(t, router.Routes(), backendImportSaveInventoryCanonical(), "backend canonical save/import inventory")
@@ -33,7 +34,7 @@ func TestEndpointInventory_FrontendParityMatchesCurrentContract(t *testing.T) {
require.NoError(t, err)
router := gin.New()
require.NoError(t, routes.Register(router, db, config.Config{JWTSecret: "test-secret"}))
require.NoError(t, routes.Register(context.Background(), router, db, config.Config{JWTSecret: "test-secret"}))
routes.RegisterImportHandler(router, db, config.Config{JWTSecret: "test-secret"}, "echo", "/tmp", "/import/Caddyfile")
assertStrictMethodPathMatrix(t, router.Routes(), frontendObservedImportSaveInventory(), "frontend observed save/import inventory")
@@ -46,7 +47,7 @@ func TestEndpointInventory_FrontendParityDetectsActualMismatch(t *testing.T) {
require.NoError(t, err)
router := gin.New()
require.NoError(t, routes.Register(router, db, config.Config{JWTSecret: "test-secret"}))
require.NoError(t, routes.Register(context.Background(), router, db, config.Config{JWTSecret: "test-secret"}))
routes.RegisterImportHandler(router, db, config.Config{JWTSecret: "test-secret"}, "echo", "/tmp", "/import/Caddyfile")
contractWithMismatch := append([]endpointInventoryEntry{}, frontendObservedImportSaveInventory()...)


@@ -61,7 +61,7 @@ func migrateViewerToPassthrough(db *gorm.DB) {
}
// Register wires up API routes and performs automatic migrations.
func Register(router *gin.Engine, db *gorm.DB, cfg config.Config) error {
func Register(ctx context.Context, router *gin.Engine, db *gorm.DB, cfg config.Config) error {
// Caddy Manager - created early so it can be used by settings handlers for config reload
caddyClient := caddy.NewClient(cfg.CaddyAdminAPI)
caddyManager := caddy.NewManager(caddyClient, db, cfg.CaddyConfigDir, cfg.FrontendDir, cfg.ACMEStaging, cfg.Security)
@@ -69,11 +69,11 @@ func Register(router *gin.Engine, db *gorm.DB, cfg config.Config) error {
// Cerberus middleware applies the optional security suite checks (WAF, ACL, CrowdSec)
cerb := cerberus.New(cfg.Security, db)
return RegisterWithDeps(router, db, cfg, caddyManager, cerb)
return RegisterWithDeps(ctx, router, db, cfg, caddyManager, cerb)
}
// RegisterWithDeps wires up API routes and performs automatic migrations with prebuilt dependencies.
func RegisterWithDeps(router *gin.Engine, db *gorm.DB, cfg config.Config, caddyManager *caddy.Manager, cerb *cerberus.Cerberus) error {
func RegisterWithDeps(ctx context.Context, router *gin.Engine, db *gorm.DB, cfg config.Config, caddyManager *caddy.Manager, cerb *cerberus.Cerberus) error {
// Emergency bypass must be registered FIRST.
// When a valid X-Emergency-Token is present from an authorized source,
// it sets an emergency context flag and strips the token header so downstream
@@ -122,6 +122,7 @@ func RegisterWithDeps(router *gin.Engine, db *gorm.DB, cfg config.Config, caddyM
&models.DNSProviderCredential{}, // Multi-credential support (Phase 3)
&models.Plugin{}, // Phase 5: DNS provider plugins
&models.ManualChallenge{}, // Phase 1: Manual DNS challenges
&models.CrowdSecWhitelist{}, // Issue #939: CrowdSec IP whitelist management
); err != nil {
return fmt.Errorf("auto migrate: %w", err)
}
@@ -152,6 +153,14 @@ func RegisterWithDeps(router *gin.Engine, db *gorm.DB, cfg config.Config, caddyM
caddyClient := caddy.NewClient(cfg.CaddyAdminAPI)
caddyManager = caddy.NewManager(caddyClient, db, cfg.CaddyConfigDir, cfg.FrontendDir, cfg.ACMEStaging, cfg.Security)
}
// Wire encryption service to Caddy manager for decrypting certificate private keys
if cfg.EncryptionKey != "" {
if svc, err := crypto.NewEncryptionService(cfg.EncryptionKey); err == nil {
caddyManager.SetEncryptionService(svc)
}
}
if cerb == nil {
cerb = cerberus.New(cfg.Security, db)
}
@@ -666,11 +675,38 @@ func RegisterWithDeps(router *gin.Engine, db *gorm.DB, cfg config.Config, caddyM
// where ACME and certificates are stored (e.g. <CaddyConfigDir>/data).
caddyDataDir := cfg.CaddyConfigDir + "/data"
logger.Log().WithField("caddy_data_dir", caddyDataDir).Info("Using Caddy data directory for certificates scan")
certService := services.NewCertificateService(caddyDataDir, db)
var certEncSvc *crypto.EncryptionService
if cfg.EncryptionKey != "" {
svc, err := crypto.NewEncryptionService(cfg.EncryptionKey)
if err != nil {
logger.Log().WithError(err).Warn("Failed to initialize encryption service for certificate key storage")
} else {
certEncSvc = svc
}
}
certService := services.NewCertificateService(caddyDataDir, db, certEncSvc)
certHandler := handlers.NewCertificateHandler(certService, backupService, notificationService)
certHandler.SetDB(db)
// Migrate unencrypted private keys
if err := certService.MigratePrivateKeys(); err != nil {
logger.Log().WithError(err).Warn("Failed to migrate certificate private keys")
}
management.GET("/certificates", certHandler.List)
management.POST("/certificates", certHandler.Upload)
management.DELETE("/certificates/:id", certHandler.Delete)
management.POST("/certificates/validate", certHandler.Validate)
management.GET("/certificates/:uuid", certHandler.Get)
management.PUT("/certificates/:uuid", certHandler.Update)
management.POST("/certificates/:uuid/export", certHandler.Export)
management.DELETE("/certificates/:uuid", certHandler.Delete)
// Start certificate expiry checker
warningDays := 30
if cfg.CertExpiryWarningDays > 0 {
warningDays = cfg.CertExpiryWarningDays
}
go certService.StartExpiryChecker(ctx, notificationService, warningDays)
// Proxy Hosts & Remote Servers
proxyHostHandler := handlers.NewProxyHostHandler(db, caddyManager, notificationService, uptimeService)


@@ -1,6 +1,7 @@
package routes
import (
"context"
"errors"
"testing"
@@ -34,7 +35,10 @@ func TestRegister_NotifyOnlyProviderMigrationErrorReturns(t *testing.T) {
cfg := config.Config{JWTSecret: "test-secret"}
err = Register(router, db, cfg)
ctx, cancel := context.WithCancel(context.Background())
t.Cleanup(cancel)
err = Register(ctx, router, db, cfg)
require.Error(t, err)
require.Contains(t, err.Error(), "notify-only provider migration")
}
@@ -61,7 +65,10 @@ func TestRegister_LegacyMigrationErrorIsNonFatal(t *testing.T) {
cfg := config.Config{JWTSecret: "test-secret"}
err = Register(router, db, cfg)
ctx, cancel := context.WithCancel(context.Background())
t.Cleanup(cancel)
err = Register(ctx, router, db, cfg)
require.NoError(t, err)
hasHealth := false
@@ -96,7 +103,10 @@ func TestRegister_UptimeFeatureFlagDefaultErrorIsNonFatal(t *testing.T) {
cfg := config.Config{JWTSecret: "test-secret"}
err = Register(router, db, cfg)
ctx, cancel := context.WithCancel(context.Background())
t.Cleanup(cancel)
err = Register(ctx, router, db, cfg)
require.NoError(t, err)
}
@@ -122,6 +132,9 @@ func TestRegister_SecurityHeaderPresetInitErrorIsNonFatal(t *testing.T) {
cfg := config.Config{JWTSecret: "test-secret"}
err = Register(router, db, cfg)
ctx, cancel := context.WithCancel(context.Background())
t.Cleanup(cancel)
err = Register(ctx, router, db, cfg)
require.NoError(t, err)
}


@@ -1,6 +1,7 @@
package routes_test
import (
"context"
"testing"
"github.com/gin-gonic/gin"
@@ -19,7 +20,7 @@ func TestRegister_StrictSaveRouteMatrixUsedByImportWorkflows(t *testing.T) {
require.NoError(t, err)
router := gin.New()
require.NoError(t, routes.Register(router, db, config.Config{JWTSecret: "test-secret"}))
require.NoError(t, routes.Register(context.Background(), router, db, config.Config{JWTSecret: "test-secret"}))
assertStrictMethodPathMatrix(t, router.Routes(), saveRouteMatrixForImportWorkflows(), "save")
}


@@ -1,6 +1,7 @@
package routes
import (
"context"
"io"
"net/http"
"net/http/httptest"
@@ -41,7 +42,7 @@ func TestRegister(t *testing.T) {
JWTSecret: "test-secret",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
assert.NoError(t, err)
// Verify some routes are registered
@@ -70,7 +71,7 @@ func TestRegister_WithDevelopmentEnvironment(t *testing.T) {
Environment: "development",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
assert.NoError(t, err)
}
@@ -86,7 +87,7 @@ func TestRegister_WithProductionEnvironment(t *testing.T) {
Environment: "production",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
assert.NoError(t, err)
}
@@ -107,7 +108,7 @@ func TestRegister_AutoMigrateFailure(t *testing.T) {
JWTSecret: "test-secret",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
assert.Error(t, err)
assert.Contains(t, err.Error(), "auto migrate")
}
@@ -148,7 +149,7 @@ func TestRegister_RoutesRegistration(t *testing.T) {
JWTSecret: "test-secret",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
require.NoError(t, err)
routes := router.Routes()
@@ -181,7 +182,7 @@ func TestRegister_ProxyHostsRequireAuth(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
req := httptest.NewRequest(http.MethodPost, "/api/v1/proxy-hosts", strings.NewReader(`{}`))
req.Header.Set("Content-Type", "application/json")
@@ -200,7 +201,7 @@ func TestRegister_StateChangingRoutesDenyByDefaultWithExplicitAllowlist(t *testi
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
mutatingMethods := map[string]bool{
http.MethodPost: true,
@@ -264,7 +265,7 @@ func TestRegister_DNSProviders_NotRegisteredWhenEncryptionKeyMissing(t *testing.
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret", EncryptionKey: ""}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
for _, r := range router.Routes() {
assert.NotContains(t, r.Path, "/api/v1/dns-providers")
@@ -279,7 +280,7 @@ func TestRegister_DNSProviders_NotRegisteredWhenEncryptionKeyInvalid(t *testing.
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret", EncryptionKey: "not-base64"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
for _, r := range router.Routes() {
assert.NotContains(t, r.Path, "/api/v1/dns-providers")
@@ -295,7 +296,7 @@ func TestRegister_DNSProviders_RegisteredWhenEncryptionKeyValid(t *testing.T) {
// 32-byte all-zero key in base64
cfg := config.Config{JWTSecret: "test-secret", EncryptionKey: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
paths := make(map[string]bool)
for _, r := range router.Routes() {
@@ -317,7 +318,7 @@ func TestRegister_AllRoutesRegistered(t *testing.T) {
JWTSecret: "test-secret",
EncryptionKey: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string][]string) // path -> methods
@@ -384,7 +385,7 @@ func TestRegister_MiddlewareApplied(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Test that security headers middleware is applied
w := httptest.NewRecorder()
@@ -413,7 +414,7 @@ func TestRegister_AuthenticatedRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Test that protected routes require authentication
protectedPaths := []struct {
@@ -449,7 +450,7 @@ func TestRegister_StateChangingRoutesRequireAuthentication(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
stateChangingPaths := []struct {
method string
@@ -488,7 +489,7 @@ func TestRegister_AdminRoutes(t *testing.T) {
JWTSecret: "test-secret",
EncryptionKey: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Admin routes should exist and require auth
adminPaths := []string{
@@ -513,7 +514,7 @@ func TestRegister_PublicRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Public routes should be accessible without auth (route exists, not 404)
publicPaths := []struct {
@@ -545,7 +546,7 @@ func TestRegister_HealthEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/api/v1/health", nil)
@@ -563,7 +564,7 @@ func TestRegister_MetricsEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/metrics", nil)
@@ -582,7 +583,7 @@ func TestRegister_DBHealthEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
w := httptest.NewRecorder()
req := httptest.NewRequest(http.MethodGet, "/api/v1/health/db", nil)
@@ -600,7 +601,7 @@ func TestRegister_LoginEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Test login endpoint exists and accepts POST
body := `{"username": "test", "password": "test"}`
@@ -621,7 +622,7 @@ func TestRegister_SetupEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// GET /setup should return setup status
w := httptest.NewRecorder()
@@ -646,7 +647,7 @@ func TestRegister_WithEncryptionRoutes(t *testing.T) {
JWTSecret: "test-secret",
EncryptionKey: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Check if encryption routes are registered (may depend on env)
routes := router.Routes()
@@ -668,7 +669,7 @@ func TestRegister_UptimeCheckEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Uptime check route should exist and require auth
w := httptest.NewRecorder()
@@ -687,7 +688,7 @@ func TestRegister_CrowdSecRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// CrowdSec routes should exist
routes := router.Routes()
@@ -713,7 +714,7 @@ func TestRegister_SecurityRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -740,7 +741,7 @@ func TestRegister_AccessListRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -763,7 +764,7 @@ func TestRegister_CertificateRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -773,7 +774,7 @@ func TestRegister_CertificateRoutes(t *testing.T) {
// Certificate routes
assert.True(t, routeMap["/api/v1/certificates"])
assert.True(t, routeMap["/api/v1/certificates/:id"])
assert.True(t, routeMap["/api/v1/certificates/:uuid"])
}
// TestRegister_NilHandlers verifies registration behavior with minimal/nil components
@@ -792,7 +793,7 @@ func TestRegister_NilHandlers(t *testing.T) {
EncryptionKey: "", // No encryption key - DNS providers won't be registered
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
assert.NoError(t, err)
// Verify that routes still work without DNS provider features
@@ -823,7 +824,7 @@ func TestRegister_MiddlewareOrder(t *testing.T) {
Environment: "development",
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
require.NoError(t, err)
// Test that security headers are applied (they should come first)
@@ -848,7 +849,7 @@ func TestRegister_GzipCompression(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Request with Accept-Encoding: gzip
w := httptest.NewRecorder()
@@ -875,7 +876,7 @@ func TestRegister_CerberusMiddleware(t *testing.T) {
},
}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
require.NoError(t, err)
// API routes should have Cerberus middleware applied
@@ -896,7 +897,7 @@ func TestRegister_FeatureFlagsEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Feature flags should require auth
w := httptest.NewRecorder()
@@ -915,7 +916,7 @@ func TestRegister_WebSocketRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -939,7 +940,7 @@ func TestRegister_NotificationRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -967,7 +968,7 @@ func TestRegister_DomainRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -989,7 +990,7 @@ func TestRegister_VerifyAuthEndpoint(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Verify endpoint is public (for Caddy forward auth)
w := httptest.NewRecorder()
@@ -1009,7 +1010,7 @@ func TestRegister_SMTPRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -1064,7 +1065,7 @@ func TestRegister_EncryptionRoutesWithValidKey(t *testing.T) {
JWTSecret: "test-secret",
EncryptionKey: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=",
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -1091,7 +1092,7 @@ func TestRegister_WAFExclusionRoutes(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -1113,7 +1114,7 @@ func TestRegister_BreakGlassRoute(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -1134,7 +1135,7 @@ func TestRegister_RateLimitPresetsRoute(t *testing.T) {
require.NoError(t, err)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
routes := router.Routes()
routeMap := make(map[string]bool)
@@ -1166,7 +1167,7 @@ func TestEmergencyEndpoint_BypassACL(t *testing.T) {
CerberusEnabled: true,
},
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Note: We don't need to create ACL settings here because the emergency endpoint
// bypass happens at middleware level before Cerberus checks
@@ -1210,7 +1211,7 @@ func TestEmergencyBypass_MiddlewareOrder(t *testing.T) {
ManagementCIDRs: []string{"127.0.0.0/8"},
},
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Request with emergency token should set bypass flag
w := httptest.NewRecorder()
@@ -1239,7 +1240,7 @@ func TestEmergencyBypass_InvalidToken(t *testing.T) {
CerberusEnabled: true,
},
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Request with WRONG emergency token
w := httptest.NewRecorder()
@@ -1271,7 +1272,7 @@ func TestEmergencyBypass_UnauthorizedIP(t *testing.T) {
ManagementCIDRs: []string{"192.168.1.0/24"},
},
}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
// Request from public IP (not in management network)
w := httptest.NewRecorder()
@@ -1295,7 +1296,7 @@ func TestRegister_CreatesAccessLogFileForLogWatcher(t *testing.T) {
t.Setenv("CHARON_CADDY_ACCESS_LOG", logFilePath)
cfg := config.Config{JWTSecret: "test-secret"}
require.NoError(t, Register(router, db, cfg))
require.NoError(t, Register(context.Background(), router, db, cfg))
_, statErr := os.Stat(logFilePath)
assert.NoError(t, statErr)
@@ -1341,7 +1342,7 @@ func TestRegister_CleansLetsEncryptCertAssignments(t *testing.T) {
require.NoError(t, db.Create(&host).Error)
cfg := config.Config{JWTSecret: "test-secret"}
err = Register(router, db, cfg)
err = Register(context.Background(), router, db, cfg)
require.NoError(t, err)
var reloaded models.ProxyHost


@@ -2,6 +2,7 @@
package tests
import (
"context"
"net/http"
"net/http/httptest"
"strings"
@@ -33,7 +34,7 @@ func TestIntegration_WAF_BlockAndMonitor(t *testing.T) {
}
cfg.Security.WAFMode = mode
r := gin.New()
if err := routes.Register(r, db, cfg); err != nil {
if err := routes.Register(context.Background(), r, db, cfg); err != nil {
t.Fatalf("register: %v", err)
}
return r, db


@@ -8,6 +8,7 @@ import (
"path/filepath"
"strings"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/logger"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/pkg/dnsprovider"
@@ -15,7 +16,7 @@ import (
// GenerateConfig creates a Caddy JSON configuration from proxy hosts.
// This is the core transformation layer from our database model to Caddy config.
func GenerateConfig(hosts []models.ProxyHost, storageDir, acmeEmail, frontendDir, sslProvider string, acmeStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
func GenerateConfig(hosts []models.ProxyHost, storageDir, acmeEmail, frontendDir, sslProvider string, acmeStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
// Define log file paths for Caddy access logs.
// When CrowdSec is enabled, we use /var/log/caddy/access.log which is the standard
// location that CrowdSec's acquis.yaml is configured to monitor.
@@ -427,16 +428,47 @@ func GenerateConfig(hosts []models.ProxyHost, storageDir, acmeEmail, frontendDir
}
if len(customCerts) > 0 {
// Resolve encryption service from variadic parameter
var certEncSvc *crypto.EncryptionService
if len(encSvc) > 0 && encSvc[0] != nil {
certEncSvc = encSvc[0]
}
var loadPEM []LoadPEMConfig
for _, cert := range customCerts {
// Validate that custom cert has both certificate and key
if cert.Certificate == "" || cert.PrivateKey == "" {
logger.Log().WithField("cert", cert.Name).Warn("Custom certificate missing certificate or key, skipping")
// Determine private key: prefer encrypted, fall back to plaintext for migration
var keyPEM string
if cert.PrivateKeyEncrypted != "" && certEncSvc != nil {
decrypted, err := certEncSvc.Decrypt(cert.PrivateKeyEncrypted)
if err != nil {
logger.Log().WithField("cert", cert.Name).WithError(err).Warn("Failed to decrypt private key, skipping certificate")
continue
}
keyPEM = string(decrypted)
} else if cert.PrivateKeyEncrypted != "" {
logger.Log().WithField("cert", cert.Name).Warn("Certificate has encrypted key but no encryption service available, skipping")
continue
} else if cert.PrivateKey != "" {
keyPEM = cert.PrivateKey
} else {
logger.Log().WithField("cert", cert.Name).Warn("Custom certificate has no encrypted key, skipping")
continue
}
if cert.Certificate == "" {
logger.Log().WithField("cert", cert.Name).Warn("Custom certificate missing certificate PEM, skipping")
continue
}
// Concatenate chain with leaf certificate
fullCert := cert.Certificate
if cert.CertificateChain != "" {
fullCert = fullCert + "\n" + cert.CertificateChain
}
loadPEM = append(loadPEM, LoadPEMConfig{
Certificate: cert.Certificate,
Key: cert.PrivateKey,
Certificate: fullCert,
Key: keyPEM,
Tags: []string{cert.UUID},
})
}

View File

@@ -0,0 +1,166 @@
package caddy
import (
"encoding/base64"
"testing"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func newTestEncSvc(t *testing.T) *crypto.EncryptionService {
t.Helper()
key := make([]byte, 32)
for i := range key {
key[i] = byte(i)
}
svc, err := crypto.NewEncryptionService(base64.StdEncoding.EncodeToString(key))
require.NoError(t, err)
return svc
}
// Test: encrypted key with encryption service → decrypt success → cert loaded
func TestGenerateConfig_CustomCert_EncryptedKey(t *testing.T) {
encSvc := newTestEncSvc(t)
encKey, err := encSvc.Encrypt([]byte("-----BEGIN PRIVATE KEY-----\nfake-key-data\n-----END PRIVATE KEY-----"))
require.NoError(t, err)
certID := uint(10)
hosts := []models.ProxyHost{
{
UUID: "h-enc", DomainNames: "enc.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-enc", Name: "EncCert", Provider: "custom",
Certificate: "-----BEGIN CERTIFICATE-----\nfake-cert\n-----END CERTIFICATE-----",
PrivateKeyEncrypted: encKey,
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil, encSvc)
require.NoError(t, err)
require.NotNil(t, cfg)
require.NotNil(t, cfg.Apps.TLS)
require.NotNil(t, cfg.Apps.TLS.Certificates)
assert.NotEmpty(t, cfg.Apps.TLS.Certificates.LoadPEM)
}
// Test: encrypted key with no encryption service → skip
func TestGenerateConfig_CustomCert_EncryptedKeyNoEncSvc(t *testing.T) {
certID := uint(11)
hosts := []models.ProxyHost{
{
UUID: "h-noenc", DomainNames: "noenc.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-noenc", Name: "NoEncSvcCert", Provider: "custom",
Certificate: "-----BEGIN CERTIFICATE-----\nfake-cert\n-----END CERTIFICATE-----",
PrivateKeyEncrypted: "encrypted-data-here",
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil)
require.NoError(t, err)
require.NotNil(t, cfg)
// Cert should be skipped - no TLS certs loaded
if cfg.Apps.TLS != nil && cfg.Apps.TLS.Certificates != nil {
assert.Empty(t, cfg.Apps.TLS.Certificates.LoadPEM)
}
}
// Test: no key at all → skip
func TestGenerateConfig_CustomCert_NoKey(t *testing.T) {
certID := uint(12)
hosts := []models.ProxyHost{
{
UUID: "h-nokey", DomainNames: "nokey.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-nokey", Name: "NoKeyCert", Provider: "custom",
Certificate: "-----BEGIN CERTIFICATE-----\nfake-cert\n-----END CERTIFICATE-----",
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil)
require.NoError(t, err)
require.NotNil(t, cfg)
if cfg.Apps.TLS != nil && cfg.Apps.TLS.Certificates != nil {
assert.Empty(t, cfg.Apps.TLS.Certificates.LoadPEM)
}
}
// Test: missing cert PEM → skip
func TestGenerateConfig_CustomCert_NoCertPEM(t *testing.T) {
certID := uint(13)
hosts := []models.ProxyHost{
{
UUID: "h-nocert", DomainNames: "nocert.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-nocert", Name: "NoCertPEM", Provider: "custom",
PrivateKey: "some-key",
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil)
require.NoError(t, err)
require.NotNil(t, cfg)
if cfg.Apps.TLS != nil && cfg.Apps.TLS.Certificates != nil {
assert.Empty(t, cfg.Apps.TLS.Certificates.LoadPEM)
}
}
// Test: cert with chain → chain concatenated
func TestGenerateConfig_CustomCert_WithChain(t *testing.T) {
certID := uint(14)
hosts := []models.ProxyHost{
{
UUID: "h-chain", DomainNames: "chain.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-chain", Name: "ChainCert", Provider: "custom",
Certificate: "-----BEGIN CERTIFICATE-----\nleaf-cert\n-----END CERTIFICATE-----",
PrivateKey: "-----BEGIN PRIVATE KEY-----\nkey-data\n-----END PRIVATE KEY-----",
CertificateChain: "-----BEGIN CERTIFICATE-----\nca-cert\n-----END CERTIFICATE-----",
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil)
require.NoError(t, err)
require.NotNil(t, cfg)
require.NotNil(t, cfg.Apps.TLS)
require.NotNil(t, cfg.Apps.TLS.Certificates)
require.NotEmpty(t, cfg.Apps.TLS.Certificates.LoadPEM)
assert.Contains(t, cfg.Apps.TLS.Certificates.LoadPEM[0].Certificate, "ca-cert")
}
// Test: decrypt failure → skip
func TestGenerateConfig_CustomCert_DecryptFailure(t *testing.T) {
encSvc := newTestEncSvc(t)
certID := uint(15)
hosts := []models.ProxyHost{
{
UUID: "h-decfail", DomainNames: "decfail.test", ForwardHost: "127.0.0.1", ForwardPort: 8080, Enabled: true,
CertificateID: &certID,
Certificate: &models.SSLCertificate{
ID: certID, UUID: "c-decfail", Name: "DecryptFail", Provider: "custom",
Certificate: "-----BEGIN CERTIFICATE-----\nfake-cert\n-----END CERTIFICATE-----",
PrivateKeyEncrypted: "not-valid-encrypted-data",
},
},
}
cfg, err := GenerateConfig(hosts, "/data", "admin@test.com", "/dist", "letsencrypt", true, false, false, false, false, "", nil, nil, nil, nil, nil, encSvc)
require.NoError(t, err)
require.NotNil(t, cfg)
if cfg.Apps.TLS != nil && cfg.Apps.TLS.Certificates != nil {
assert.Empty(t, cfg.Apps.TLS.Certificates.LoadPEM)
}
}

View File

@@ -73,6 +73,7 @@ type Manager struct {
frontendDir string
acmeStaging bool
securityCfg config.SecurityConfig
encSvc *crypto.EncryptionService
}
// NewManager creates a configuration manager.
@@ -87,6 +88,11 @@ func NewManager(client CaddyClient, db *gorm.DB, configDir, frontendDir string,
}
}
// SetEncryptionService configures the encryption service for decrypting private keys in Caddy config generation.
func (m *Manager) SetEncryptionService(svc *crypto.EncryptionService) {
m.encSvc = svc
}
// ApplyConfig generates configuration from database, validates it, applies to Caddy with rollback on failure.
func (m *Manager) ApplyConfig(ctx context.Context) error {
// Fetch all proxy hosts from database
@@ -418,7 +424,7 @@ func (m *Manager) ApplyConfig(ctx context.Context) error {
}
}
generatedConfig, err := generateConfigFunc(hosts, filepath.Join(m.configDir, "data"), acmeEmail, m.frontendDir, effectiveProvider, effectiveStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled, adminWhitelist, rulesets, rulesetPaths, decisions, &secCfg, dnsProviderConfigs)
generatedConfig, err := generateConfigFunc(hosts, filepath.Join(m.configDir, "data"), acmeEmail, m.frontendDir, effectiveProvider, effectiveStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled, adminWhitelist, rulesets, rulesetPaths, decisions, &secCfg, dnsProviderConfigs, m.encSvc)
if err != nil {
return fmt.Errorf("generate config: %w", err)
}

View File

@@ -14,6 +14,7 @@ import (
"time"
"github.com/Wikid82/charon/backend/internal/config"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/stretchr/testify/assert"
"gorm.io/driver/sqlite"
@@ -422,7 +423,7 @@ func TestManager_ApplyConfig_GenerateConfigFails(t *testing.T) {
// stub generateConfigFunc to always return error
orig := generateConfigFunc
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
return nil, fmt.Errorf("generate fail")
}
defer func() { generateConfigFunc = orig }()
@@ -600,7 +601,7 @@ func TestManager_ApplyConfig_PassesAdminWhitelistToGenerateConfig(t *testing.T)
// Stub generateConfigFunc to capture adminWhitelist
var capturedAdmin string
orig := generateConfigFunc
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
capturedAdmin = adminWhitelist
// return minimal config
return &Config{Apps: Apps{HTTP: &HTTPApp{Servers: map[string]*Server{}}}}, nil
@@ -651,7 +652,7 @@ func TestManager_ApplyConfig_PassesRuleSetsToGenerateConfig(t *testing.T) {
var capturedRules []models.SecurityRuleSet
orig := generateConfigFunc
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
capturedRules = rulesets
return &Config{Apps: Apps{HTTP: &HTTPApp{Servers: map[string]*Server{}}}}, nil
}
@@ -706,7 +707,7 @@ func TestManager_ApplyConfig_IncludesWAFHandlerWithRuleset(t *testing.T) {
var capturedWafEnabled bool
var capturedRulesets []models.SecurityRuleSet
origGen := generateConfigFunc
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
capturedWafEnabled = wafEnabled
capturedRulesets = rulesets
return origGen(hosts, storageDir, acmeEmail, frontendDir, sslProvider, acmeStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled, adminWhitelist, rulesets, rulesetPaths, decisions, secCfg, dnsProviderConfigs)
@@ -811,7 +812,7 @@ func TestManager_ApplyConfig_RulesetWriteFileFailure(t *testing.T) {
// Capture rulesetPaths from GenerateConfig
var capturedPaths map[string]string
origGen := generateConfigFunc
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
capturedPaths = rulesetPaths
return origGen(hosts, storageDir, acmeEmail, frontendDir, sslProvider, acmeStaging, crowdsecEnabled, wafEnabled, rateLimitEnabled, aclEnabled, adminWhitelist, rulesets, rulesetPaths, decisions, secCfg, dnsProviderConfigs)
}

View File

@@ -57,7 +57,7 @@ func TestManagerApplyConfig_DNSProviders_NoKey_SkipsDecryption(t *testing.T) {
generateConfigFunc = origGen
validateConfigFunc = origVal
}()
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, _ ...*crypto.EncryptionService) (*Config, error) {
capturedLen = len(dnsProviderConfigs)
return &Config{}, nil
}
@@ -111,7 +111,7 @@ func TestManagerApplyConfig_DNSProviders_UsesFallbackEnvKeys(t *testing.T) {
generateConfigFunc = origGen
validateConfigFunc = origVal
}()
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, _ ...*crypto.EncryptionService) (*Config, error) {
captured = append([]DNSProviderConfig(nil), dnsProviderConfigs...)
return &Config{}, nil
}
@@ -175,7 +175,7 @@ func TestManagerApplyConfig_DNSProviders_SkipsDecryptOrJSONFailures(t *testing.T
generateConfigFunc = origGen
validateConfigFunc = origVal
}()
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
generateConfigFunc = func(_ []models.ProxyHost, _ string, _ string, _ string, _ string, _ bool, _ bool, _ bool, _ bool, _ bool, _ string, _ []models.SecurityRuleSet, _ map[string]string, _ []models.SecurityDecision, _ *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, _ ...*crypto.EncryptionService) (*Config, error) {
captured = append([]DNSProviderConfig(nil), dnsProviderConfigs...)
return &Config{}, nil
}

View File

@@ -9,6 +9,7 @@ import (
"testing"
"github.com/Wikid82/charon/backend/internal/config"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
@@ -17,8 +18,8 @@ import (
)
// mockGenerateConfigFunc creates a mock config generator that captures parameters
func mockGenerateConfigFunc(capturedProvider *string, capturedStaging *bool) func([]models.ProxyHost, string, string, string, string, bool, bool, bool, bool, bool, string, []models.SecurityRuleSet, map[string]string, []models.SecurityDecision, *models.SecurityConfig, []DNSProviderConfig) (*Config, error) {
return func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig) (*Config, error) {
func mockGenerateConfigFunc(capturedProvider *string, capturedStaging *bool) func([]models.ProxyHost, string, string, string, string, bool, bool, bool, bool, bool, string, []models.SecurityRuleSet, map[string]string, []models.SecurityDecision, *models.SecurityConfig, []DNSProviderConfig, ...*crypto.EncryptionService) (*Config, error) {
return func(hosts []models.ProxyHost, storageDir string, acmeEmail string, frontendDir string, sslProvider string, acmeStaging bool, crowdsecEnabled bool, wafEnabled bool, rateLimitEnabled bool, aclEnabled bool, adminWhitelist string, rulesets []models.SecurityRuleSet, rulesetPaths map[string]string, decisions []models.SecurityDecision, secCfg *models.SecurityConfig, dnsProviderConfigs []DNSProviderConfig, encSvc ...*crypto.EncryptionService) (*Config, error) {
*capturedProvider = sslProvider
*capturedStaging = acmeStaging
return &Config{Apps: Apps{HTTP: &HTTPApp{Servers: map[string]*Server{}}}}, nil

View File

@@ -33,6 +33,7 @@ type Config struct {
CaddyLogDir string
CrowdSecLogDir string
Debug bool
CertExpiryWarningDays int
Security SecurityConfig
Emergency EmergencyConfig
}
@@ -109,6 +110,13 @@ func Load() (Config, error) {
Debug: getEnvAny("false", "CHARON_DEBUG", "CPM_DEBUG") == "true",
}
cfg.CertExpiryWarningDays = 30
if days := getEnvAny("", "CHARON_CERT_EXPIRY_WARNING_DAYS"); days != "" {
if n, err := strconv.Atoi(days); err == nil && n > 0 {
cfg.CertExpiryWarningDays = n
}
}
// Set JWTSecret using os.Getenv directly so no string literal flows into the
// field — prevents CodeQL go/parse-jwt-with-hardcoded-key taint from any fallback.
cfg.JWTSecret = os.Getenv("CHARON_JWT_SECRET")

View File

@@ -0,0 +1,13 @@
package models
import "time"
// CrowdSecWhitelist represents a single IP or CIDR block that CrowdSec should never ban.
type CrowdSecWhitelist struct {
ID uint `json:"-" gorm:"primaryKey"`
UUID string `json:"uuid" gorm:"uniqueIndex;not null"`
IPOrCIDR string `json:"ip_or_cidr" gorm:"not null;uniqueIndex"`
Reason string `json:"reason" gorm:"not null;default:''"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}

View File

@@ -7,15 +7,24 @@ import (
// SSLCertificate represents TLS certificates managed by Charon.
// Can be Let's Encrypt auto-generated or custom uploaded certs.
type SSLCertificate struct {
ID uint `json:"-" gorm:"primaryKey"`
UUID string `json:"uuid" gorm:"uniqueIndex"`
Name string `json:"name" gorm:"index"`
Provider string `json:"provider" gorm:"index"` // "letsencrypt", "letsencrypt-staging", "custom"
Domains string `json:"domains" gorm:"index"` // comma-separated list of domains
Certificate string `json:"certificate" gorm:"type:text"` // PEM-encoded certificate
PrivateKey string `json:"private_key" gorm:"type:text"` // PEM-encoded private key
ExpiresAt *time.Time `json:"expires_at,omitempty" gorm:"index"`
AutoRenew bool `json:"auto_renew" gorm:"default:false"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
ID uint `json:"-" gorm:"primaryKey"`
UUID string `json:"uuid" gorm:"uniqueIndex"`
Name string `json:"name" gorm:"index"`
Provider string `json:"provider" gorm:"index"`
Domains string `json:"domains" gorm:"index"`
CommonName string `json:"common_name"`
Certificate string `json:"-" gorm:"type:text"`
CertificateChain string `json:"-" gorm:"type:text"`
PrivateKeyEncrypted string `json:"-" gorm:"column:private_key_enc;type:text"`
PrivateKey string `json:"-" gorm:"-"`
KeyVersion int `json:"-" gorm:"default:1"`
Fingerprint string `json:"fingerprint"`
SerialNumber string `json:"serial_number"`
IssuerOrg string `json:"issuer_org"`
KeyType string `json:"key_type"`
ExpiresAt *time.Time `json:"expires_at,omitempty" gorm:"index"`
NotBefore *time.Time `json:"not_before,omitempty"`
AutoRenew bool `json:"auto_renew" gorm:"default:false"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}

View File

@@ -27,6 +27,7 @@ func TestNewInternalServiceHTTPClient(t *testing.T) {
client := NewInternalServiceHTTPClient(tt.timeout)
if client == nil {
t.Fatal("NewInternalServiceHTTPClient() returned nil")
return
}
if client.Timeout != tt.timeout {
t.Errorf("expected timeout %v, got %v", tt.timeout, client.Timeout)

View File

@@ -179,6 +179,7 @@ func TestNewSafeHTTPClient_DefaultOptions(t *testing.T) {
client := NewSafeHTTPClient()
if client == nil {
t.Fatal("NewSafeHTTPClient() returned nil")
return
}
if client.Timeout != 10*time.Second {
t.Errorf("expected default timeout of 10s, got %v", client.Timeout)
@@ -190,6 +191,7 @@ func TestNewSafeHTTPClient_WithTimeout(t *testing.T) {
client := NewSafeHTTPClient(WithTimeout(10 * time.Second))
if client == nil {
t.Fatal("NewSafeHTTPClient() returned nil")
return
}
if client.Timeout != 10*time.Second {
t.Errorf("expected timeout of 10s, got %v", client.Timeout)
@@ -848,6 +850,7 @@ func TestClientOptions_AllFunctionalOptions(t *testing.T) {
if client == nil {
t.Fatal("NewSafeHTTPClient() returned nil with all options")
return
}
if client.Timeout != 15*time.Second {
t.Errorf("expected timeout of 15s, got %v", client.Timeout)

View File

@@ -0,0 +1,38 @@
package services
import (
"crypto/rand"
"crypto/rsa"
"crypto/x509"
"crypto/x509/pkix"
"encoding/pem"
"math/big"
"time"
)
func generateSelfSignedCertPEM() (string, string, error) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
if err != nil {
return "", "", err
}
template := x509.Certificate{
SerialNumber: big.NewInt(1),
Subject: pkix.Name{CommonName: "test.example.com"},
NotBefore: time.Now(),
NotAfter: time.Now().Add(365 * 24 * time.Hour),
KeyUsage: x509.KeyUsageKeyEncipherment | x509.KeyUsageDigitalSignature,
ExtKeyUsage: []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
BasicConstraintsValid: true,
}
derBytes, err := x509.CreateCertificate(rand.Reader, &template, &template, &priv.PublicKey, priv)
if err != nil {
return "", "", err
}
certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: derBytes})
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(priv)})
return string(certPEM), string(keyPEM), nil
}

View File

@@ -1,15 +1,19 @@
package services
import (
"context"
crand "crypto/rand"
"crypto/x509"
"encoding/pem"
"fmt"
"math/big"
"os"
"path/filepath"
"strings"
"sync"
"time"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/logger"
"github.com/Wikid82/charon/backend/internal/util"
@@ -22,22 +26,73 @@ import (
// ErrCertInUse is returned when a certificate is linked to one or more proxy hosts.
var ErrCertInUse = fmt.Errorf("certificate is in use by one or more proxy hosts")
// CertificateInfo represents parsed certificate details.
// ErrCertNotFound is returned when a certificate cannot be found by UUID.
var ErrCertNotFound = fmt.Errorf("certificate not found")
// CertificateInfo represents parsed certificate details for list responses.
type CertificateInfo struct {
ID uint `json:"id,omitempty"`
UUID string `json:"uuid,omitempty"`
Name string `json:"name,omitempty"`
Domain string `json:"domain"`
UUID string `json:"uuid"`
Name string `json:"name,omitempty"`
CommonName string `json:"common_name,omitempty"`
Domains string `json:"domains"`
Issuer string `json:"issuer"`
IssuerOrg string `json:"issuer_org,omitempty"`
Fingerprint string `json:"fingerprint,omitempty"`
SerialNumber string `json:"serial_number,omitempty"`
KeyType string `json:"key_type,omitempty"`
ExpiresAt time.Time `json:"expires_at"`
NotBefore time.Time `json:"not_before,omitempty"`
Status string `json:"status"`
Provider string `json:"provider"`
ChainDepth int `json:"chain_depth,omitempty"`
HasKey bool `json:"has_key"`
InUse bool `json:"in_use"`
}
// AssignedHostInfo represents a proxy host assigned to a certificate.
type AssignedHostInfo struct {
UUID string `json:"uuid"`
Name string `json:"name"`
DomainNames string `json:"domain_names"`
}
// ChainEntry represents a single certificate in the chain.
type ChainEntry struct {
Subject string `json:"subject"`
Issuer string `json:"issuer"`
ExpiresAt time.Time `json:"expires_at"`
Status string `json:"status"` // "valid", "expiring", "expired", "untrusted"
Provider string `json:"provider"` // "letsencrypt", "letsencrypt-staging", "custom"
}
// CertificateDetail contains full certificate metadata for detail responses.
type CertificateDetail struct {
UUID string `json:"uuid"`
Name string `json:"name,omitempty"`
CommonName string `json:"common_name,omitempty"`
Domains string `json:"domains"`
Issuer string `json:"issuer"`
IssuerOrg string `json:"issuer_org,omitempty"`
Fingerprint string `json:"fingerprint,omitempty"`
SerialNumber string `json:"serial_number,omitempty"`
KeyType string `json:"key_type,omitempty"`
ExpiresAt time.Time `json:"expires_at"`
NotBefore time.Time `json:"not_before,omitempty"`
Status string `json:"status"`
Provider string `json:"provider"`
ChainDepth int `json:"chain_depth,omitempty"`
HasKey bool `json:"has_key"`
InUse bool `json:"in_use"`
AssignedHosts []AssignedHostInfo `json:"assigned_hosts"`
Chain []ChainEntry `json:"chain"`
AutoRenew bool `json:"auto_renew"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
}
// CertificateService manages certificate retrieval and parsing.
type CertificateService struct {
dataDir string
db *gorm.DB
encSvc *crypto.EncryptionService
cache []CertificateInfo
cacheMu sync.RWMutex
lastScan time.Time
@@ -46,11 +101,12 @@ type CertificateService struct {
}
// NewCertificateService creates a new certificate service.
func NewCertificateService(dataDir string, db *gorm.DB) *CertificateService {
func NewCertificateService(dataDir string, db *gorm.DB, encSvc *crypto.EncryptionService) *CertificateService {
svc := &CertificateService{
dataDir: dataDir,
db: db,
scanTTL: 5 * time.Minute, // Only rescan disk every 5 minutes
encSvc: encSvc,
scanTTL: 5 * time.Minute,
}
return svc
}
@@ -224,15 +280,18 @@ func (s *CertificateService) refreshCacheFromDB() error {
return fmt.Errorf("failed to fetch certs from DB: %w", err)
}
// Build a map of domain -> proxy host name for quick lookup
// Build a set of certificate IDs that are in use
certInUse := make(map[uint]bool)
var proxyHosts []models.ProxyHost
s.db.Find(&proxyHosts)
domainToName := make(map[string]string)
for _, ph := range proxyHosts {
if ph.CertificateID != nil {
certInUse[*ph.CertificateID] = true
}
if ph.Name == "" {
continue
}
// Handle comma-separated domains
domains := strings.Split(ph.DomainNames, ",")
for _, d := range domains {
d = strings.TrimSpace(strings.ToLower(d))
@@ -244,27 +303,20 @@ func (s *CertificateService) refreshCacheFromDB() error {
certs := make([]CertificateInfo, 0, len(dbCerts))
for _, c := range dbCerts {
status := "valid"
// Staging certificates are untrusted by browsers
if strings.Contains(c.Provider, "staging") {
status = "untrusted"
} else if c.ExpiresAt != nil {
if time.Now().After(*c.ExpiresAt) {
status = "expired"
} else if time.Now().AddDate(0, 0, 30).After(*c.ExpiresAt) {
status = "expiring"
}
}
status := certStatus(c)
expires := time.Time{}
if c.ExpiresAt != nil {
expires = *c.ExpiresAt
}
notBefore := time.Time{}
if c.NotBefore != nil {
notBefore = *c.NotBefore
}
// Try to get name from proxy host, fall back to cert name or domain
name := c.Name
// Check all domains in the cert against proxy hosts
certDomains := strings.Split(c.Domains, ",")
for _, d := range certDomains {
d = strings.TrimSpace(strings.ToLower(d))
@@ -274,15 +326,36 @@ func (s *CertificateService) refreshCacheFromDB() error {
}
}
chainDepth := 0
if c.CertificateChain != "" {
rest := []byte(c.CertificateChain)
for {
var block *pem.Block
block, rest = pem.Decode(rest)
if block == nil {
break
}
chainDepth++
}
}
certs = append(certs, CertificateInfo{
UUID: c.UUID,
Name: name,
CommonName: c.CommonName,
Domains: c.Domains,
Issuer: c.Provider,
IssuerOrg: c.IssuerOrg,
Fingerprint: c.Fingerprint,
SerialNumber: c.SerialNumber,
KeyType: c.KeyType,
ExpiresAt: expires,
NotBefore: notBefore,
Status: status,
Provider: c.Provider,
ChainDepth: chainDepth,
HasKey: c.PrivateKeyEncrypted != "",
InUse: certInUse[c.ID],
})
}
@@ -290,6 +363,21 @@ func (s *CertificateService) refreshCacheFromDB() error {
return nil
}
func certStatus(c models.SSLCertificate) string {
if strings.Contains(c.Provider, "staging") {
return "untrusted"
}
if c.ExpiresAt != nil {
if time.Now().After(*c.ExpiresAt) {
return "expired"
}
if time.Now().AddDate(0, 0, 30).After(*c.ExpiresAt) {
return "expiring"
}
}
return "valid"
}
// ListCertificates returns cached certificate info.
// Fast path: returns from cache if available.
// Triggers background rescan if cache is stale.
@@ -342,45 +430,205 @@ func (s *CertificateService) InvalidateCache() {
s.cacheMu.Unlock()
}
// UploadCertificate saves a new custom certificate with full validation and encryption.
func (s *CertificateService) UploadCertificate(name, certPEM, keyPEM, chainPEM string) (*CertificateInfo, error) {
parsed, err := ParseCertificateInput([]byte(certPEM), []byte(keyPEM), []byte(chainPEM), "")
if err != nil {
return nil, fmt.Errorf("failed to parse certificate input: %w", err)
}
// Validate key matches certificate if key is provided
if parsed.PrivateKey != nil {
if err := ValidateKeyMatch(parsed.Leaf, parsed.PrivateKey); err != nil {
return nil, fmt.Errorf("key validation failed: %w", err)
}
}
// Extract metadata
meta := ExtractCertificateMetadata(parsed.Leaf)
domains := meta.CommonName
if len(parsed.Leaf.DNSNames) > 0 {
domains = strings.Join(parsed.Leaf.DNSNames, ",")
}
notAfter := parsed.Leaf.NotAfter
notBefore := parsed.Leaf.NotBefore
sslCert := &models.SSLCertificate{
UUID: uuid.New().String(),
Name: name,
Provider: "custom",
Domains: domains,
CommonName: meta.CommonName,
Certificate: parsed.CertPEM,
CertificateChain: parsed.ChainPEM,
Fingerprint: meta.Fingerprint,
SerialNumber: meta.SerialNumber,
IssuerOrg: meta.IssuerOrg,
KeyType: meta.KeyType,
ExpiresAt: &notAfter,
NotBefore: &notBefore,
KeyVersion: 1,
CreatedAt: time.Now(),
UpdatedAt: time.Now(),
}
// Encrypt private key at rest
if parsed.KeyPEM != "" && s.encSvc != nil {
encrypted, err := s.encSvc.Encrypt([]byte(parsed.KeyPEM))
if err != nil {
return nil, fmt.Errorf("failed to encrypt private key: %w", err)
}
sslCert.PrivateKeyEncrypted = encrypted
}
if err := s.db.Create(sslCert).Error; err != nil {
return nil, fmt.Errorf("failed to save certificate: %w", err)
}
// Invalidate cache so the new cert appears immediately
s.InvalidateCache()
chainDepth := len(parsed.Intermediates)
info := &CertificateInfo{
UUID: sslCert.UUID,
Name: sslCert.Name,
CommonName: sslCert.CommonName,
Domains: sslCert.Domains,
Issuer: sslCert.Provider,
IssuerOrg: sslCert.IssuerOrg,
Fingerprint: sslCert.Fingerprint,
SerialNumber: sslCert.SerialNumber,
KeyType: sslCert.KeyType,
ExpiresAt: notAfter,
NotBefore: notBefore,
Status: certStatus(*sslCert),
Provider: sslCert.Provider,
ChainDepth: chainDepth,
HasKey: sslCert.PrivateKeyEncrypted != "",
InUse: false,
}
return info, nil
}
// GetCertificate returns full certificate detail by UUID.
func (s *CertificateService) GetCertificate(certUUID string) (*CertificateDetail, error) {
var cert models.SSLCertificate
if err := s.db.Where("uuid = ?", certUUID).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return nil, ErrCertNotFound
}
return nil, fmt.Errorf("failed to fetch certificate: %w", err)
}
// Get assigned hosts
var hosts []models.ProxyHost
s.db.Where("certificate_id = ?", cert.ID).Find(&hosts)
assignedHosts := make([]AssignedHostInfo, 0, len(hosts))
for _, h := range hosts {
assignedHosts = append(assignedHosts, AssignedHostInfo{
UUID: h.UUID,
Name: h.Name,
DomainNames: h.DomainNames,
})
}
// Parse chain entries
chain := buildChainEntries(cert.Certificate, cert.CertificateChain)
expires := time.Time{}
if cert.ExpiresAt != nil {
expires = *cert.ExpiresAt
}
notBefore := time.Time{}
if cert.NotBefore != nil {
notBefore = *cert.NotBefore
}
detail := &CertificateDetail{
UUID: cert.UUID,
Name: cert.Name,
CommonName: cert.CommonName,
Domains: cert.Domains,
Issuer: cert.Provider,
IssuerOrg: cert.IssuerOrg,
Fingerprint: cert.Fingerprint,
SerialNumber: cert.SerialNumber,
KeyType: cert.KeyType,
ExpiresAt: expires,
NotBefore: notBefore,
Status: certStatus(cert),
Provider: cert.Provider,
ChainDepth: len(chain),
HasKey: cert.PrivateKeyEncrypted != "",
InUse: len(hosts) > 0,
AssignedHosts: assignedHosts,
Chain: chain,
AutoRenew: cert.AutoRenew,
CreatedAt: cert.CreatedAt,
UpdatedAt: cert.UpdatedAt,
}
return detail, nil
}
// ValidateCertificate validates certificate data without storing.
func (s *CertificateService) ValidateCertificate(certPEM, keyPEM, chainPEM string) (*ValidationResult, error) {
result := &ValidationResult{
Warnings: []string{},
Errors: []string{},
}
parsed, err := ParseCertificateInput([]byte(certPEM), []byte(keyPEM), []byte(chainPEM), "")
if err != nil {
result.Errors = append(result.Errors, err.Error())
return result, nil
}
meta := ExtractCertificateMetadata(parsed.Leaf)
result.CommonName = meta.CommonName
result.Domains = meta.Domains
result.IssuerOrg = meta.IssuerOrg
result.ExpiresAt = meta.NotAfter
result.ChainDepth = len(parsed.Intermediates)
// Key match check
if parsed.PrivateKey != nil {
if err := ValidateKeyMatch(parsed.Leaf, parsed.PrivateKey); err != nil {
result.Errors = append(result.Errors, fmt.Sprintf("key mismatch: %s", err.Error()))
} else {
result.KeyMatch = true
}
}
// Chain validation (best-effort, warn on failure)
if len(parsed.Intermediates) > 0 {
if err := ValidateChain(parsed.Leaf, parsed.Intermediates); err != nil {
result.Warnings = append(result.Warnings, fmt.Sprintf("chain validation: %s", err.Error()))
} else {
result.ChainValid = true
}
} else {
// Try verifying with system roots
if err := ValidateChain(parsed.Leaf, nil); err != nil {
result.Warnings = append(result.Warnings, "certificate could not be verified against system roots")
} else {
result.ChainValid = true
}
}
// Expiry warnings
daysUntilExpiry := time.Until(parsed.Leaf.NotAfter).Hours() / 24
if daysUntilExpiry < 0 {
result.Warnings = append(result.Warnings, "Certificate has expired")
} else if daysUntilExpiry < 30 {
result.Warnings = append(result.Warnings, fmt.Sprintf("Certificate expires in %.0f days", daysUntilExpiry))
}
result.Valid = len(result.Errors) == 0
return result, nil
}
// IsCertificateInUse checks if a certificate is referenced by any proxy host.
@@ -392,10 +640,30 @@ func (s *CertificateService) IsCertificateInUse(id uint) (bool, error) {
return count > 0, nil
}
// IsCertificateInUseByUUID checks if a certificate is referenced by any proxy host, looked up by UUID.
func (s *CertificateService) IsCertificateInUseByUUID(certUUID string) (bool, error) {
var cert models.SSLCertificate
if err := s.db.Where("uuid = ?", certUUID).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return false, ErrCertNotFound
}
return false, fmt.Errorf("failed to look up certificate: %w", err)
}
return s.IsCertificateInUse(cert.ID)
}
// DeleteCertificate removes a certificate by UUID.
func (s *CertificateService) DeleteCertificate(certUUID string) error {
var cert models.SSLCertificate
if err := s.db.Where("uuid = ?", certUUID).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return ErrCertNotFound
}
return fmt.Errorf("failed to look up certificate: %w", err)
}
// Prevent deletion if the certificate is referenced by any proxy host
inUse, err := s.IsCertificateInUse(cert.ID)
if err != nil {
return err
}
@@ -403,30 +671,22 @@ func (s *CertificateService) DeleteCertificate(id uint) error {
return ErrCertInUse
}
if cert.Provider == "letsencrypt" || cert.Provider == "letsencrypt-staging" {
// Best-effort file deletion
certRoot := filepath.Join(s.dataDir, "certificates")
_ = filepath.Walk(certRoot, func(path string, info os.FileInfo, err error) error {
if err == nil && !info.IsDir() && strings.HasSuffix(info.Name(), ".crt") {
if info.Name() == cert.Domains+".crt" {
// Found it
logger.Log().WithField("path", path).Info("CertificateService: deleting ACME cert file")
if err := os.Remove(path); err != nil {
logger.Log().WithError(err).Error("CertificateService: failed to delete cert file")
}
// Try to delete key as well
keyPath := strings.TrimSuffix(path, ".crt") + ".key"
if _, err := os.Stat(keyPath); err == nil {
if err := os.Remove(keyPath); err != nil {
logger.Log().WithError(err).Warn("Failed to remove key file")
}
}
// Also try to delete the json meta file
jsonPath := strings.TrimSuffix(path, ".crt") + ".json"
if _, err := os.Stat(jsonPath); err == nil {
if err := os.Remove(jsonPath); err != nil {
@@ -439,10 +699,348 @@ func (s *CertificateService) DeleteCertificate(id uint) error {
})
}
if err := s.db.Delete(&models.SSLCertificate{}, "id = ?", cert.ID).Error; err != nil {
return fmt.Errorf("failed to delete certificate: %w", err)
}
// Invalidate cache so the deleted cert disappears immediately
s.InvalidateCache()
return nil
}
// ExportCertificate exports a certificate in the requested format.
// Returns the file data, suggested filename, and any error.
func (s *CertificateService) ExportCertificate(certUUID string, format string, includeKey bool, pfxPassword string) ([]byte, string, error) {
var cert models.SSLCertificate
if err := s.db.Where("uuid = ?", certUUID).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return nil, "", ErrCertNotFound
}
return nil, "", fmt.Errorf("failed to fetch certificate: %w", err)
}
baseName := cert.Name
if baseName == "" {
baseName = "certificate"
}
switch strings.ToLower(format) {
case "pem":
var buf strings.Builder
buf.WriteString(cert.Certificate)
if cert.CertificateChain != "" {
buf.WriteString("\n")
buf.WriteString(cert.CertificateChain)
}
if includeKey {
keyPEM, err := s.GetDecryptedPrivateKey(&cert)
if err != nil {
return nil, "", fmt.Errorf("failed to decrypt private key: %w", err)
}
buf.WriteString("\n")
buf.WriteString(keyPEM)
}
return []byte(buf.String()), baseName + ".pem", nil
case "der":
derData, err := ConvertPEMToDER(cert.Certificate)
if err != nil {
return nil, "", fmt.Errorf("failed to convert to DER: %w", err)
}
return derData, baseName + ".der", nil
case "pfx", "p12":
keyPEM, err := s.GetDecryptedPrivateKey(&cert)
if err != nil {
return nil, "", fmt.Errorf("failed to decrypt private key for PFX: %w", err)
}
pfxData, err := ConvertPEMToPFX(cert.Certificate, keyPEM, cert.CertificateChain, pfxPassword)
if err != nil {
return nil, "", fmt.Errorf("failed to create PFX: %w", err)
}
return pfxData, baseName + ".pfx", nil
default:
return nil, "", fmt.Errorf("unsupported export format: %s", format)
}
}
// GetDecryptedPrivateKey decrypts and returns the private key PEM for internal use.
func (s *CertificateService) GetDecryptedPrivateKey(cert *models.SSLCertificate) (string, error) {
if cert.PrivateKeyEncrypted == "" {
return "", fmt.Errorf("no encrypted private key stored")
}
if s.encSvc == nil {
return "", fmt.Errorf("encryption service not configured")
}
decrypted, err := s.encSvc.Decrypt(cert.PrivateKeyEncrypted)
if err != nil {
return "", fmt.Errorf("failed to decrypt private key: %w", err)
}
return string(decrypted), nil
}
// MigratePrivateKeys encrypts existing plaintext private keys.
// Idempotent — skips already-migrated rows.
func (s *CertificateService) MigratePrivateKeys() error {
if s.encSvc == nil {
logger.Log().Warn("CertificateService: encryption service not configured, skipping key migration")
return nil
}
// Use raw SQL because PrivateKey has gorm:"-" tag
type rawCert struct {
ID uint
PrivateKey string
PrivateKeyEnc string `gorm:"column:private_key_enc"`
}
var certs []rawCert
if err := s.db.Raw("SELECT id, private_key, private_key_enc FROM ssl_certificates WHERE private_key != '' AND (private_key_enc = '' OR private_key_enc IS NULL)").Scan(&certs).Error; err != nil {
return fmt.Errorf("failed to query certificates for migration: %w", err)
}
if len(certs) == 0 {
logger.Log().Info("CertificateService: no private keys to migrate")
return nil
}
logger.Log().WithField("count", len(certs)).Info("CertificateService: migrating plaintext private keys")
for _, c := range certs {
encrypted, err := s.encSvc.Encrypt([]byte(c.PrivateKey))
if err != nil {
logger.Log().WithField("cert_id", c.ID).WithError(err).Error("CertificateService: failed to encrypt key during migration")
continue
}
if err := s.db.Exec("UPDATE ssl_certificates SET private_key_enc = ?, key_version = 1, private_key = '' WHERE id = ?", encrypted, c.ID).Error; err != nil {
logger.Log().WithField("cert_id", c.ID).WithError(err).Error("CertificateService: failed to update migrated key")
continue
}
logger.Log().WithField("cert_id", c.ID).Info("CertificateService: migrated private key")
}
return nil
}
// DeleteCertificateByID removes a certificate by numeric ID (legacy compatibility).
func (s *CertificateService) DeleteCertificateByID(id uint) error {
var cert models.SSLCertificate
if err := s.db.Where("id = ?", id).First(&cert).Error; err != nil {
return fmt.Errorf("failed to look up certificate: %w", err)
}
return s.DeleteCertificate(cert.UUID)
}
// UpdateCertificate updates certificate metadata (name only) by UUID.
func (s *CertificateService) UpdateCertificate(certUUID string, name string) (*CertificateInfo, error) {
var cert models.SSLCertificate
if err := s.db.Where("uuid = ?", certUUID).First(&cert).Error; err != nil {
if err == gorm.ErrRecordNotFound {
return nil, ErrCertNotFound
}
return nil, fmt.Errorf("failed to fetch certificate: %w", err)
}
cert.Name = name
if err := s.db.Save(&cert).Error; err != nil {
return nil, fmt.Errorf("failed to update certificate: %w", err)
}
s.InvalidateCache()
expires := time.Time{}
if cert.ExpiresAt != nil {
expires = *cert.ExpiresAt
}
notBefore := time.Time{}
if cert.NotBefore != nil {
notBefore = *cert.NotBefore
}
var chainDepth int
if cert.CertificateChain != "" {
certs, _ := parsePEMCertificates([]byte(cert.CertificateChain))
chainDepth = len(certs)
}
inUse, _ := s.IsCertificateInUse(cert.ID)
return &CertificateInfo{
UUID: cert.UUID,
Name: cert.Name,
CommonName: cert.CommonName,
Domains: cert.Domains,
Issuer: cert.Provider,
IssuerOrg: cert.IssuerOrg,
Fingerprint: cert.Fingerprint,
SerialNumber: cert.SerialNumber,
KeyType: cert.KeyType,
ExpiresAt: expires,
NotBefore: notBefore,
Status: certStatus(cert),
Provider: cert.Provider,
ChainDepth: chainDepth,
HasKey: cert.PrivateKeyEncrypted != "",
InUse: inUse,
}, nil
}
// CheckExpiringCertificates returns certificates that are expiring within the given number of days.
func (s *CertificateService) CheckExpiringCertificates(warningDays int) ([]CertificateInfo, error) {
var certs []models.SSLCertificate
threshold := time.Now().Add(time.Duration(warningDays) * 24 * time.Hour)
if err := s.db.Where("provider = ? AND expires_at IS NOT NULL AND expires_at <= ?", "custom", threshold).Find(&certs).Error; err != nil {
return nil, fmt.Errorf("failed to query expiring certificates: %w", err)
}
result := make([]CertificateInfo, 0, len(certs))
for _, cert := range certs {
expires := time.Time{}
if cert.ExpiresAt != nil {
expires = *cert.ExpiresAt
}
notBefore := time.Time{}
if cert.NotBefore != nil {
notBefore = *cert.NotBefore
}
result = append(result, CertificateInfo{
UUID: cert.UUID,
Name: cert.Name,
CommonName: cert.CommonName,
Domains: cert.Domains,
Issuer: cert.Provider,
IssuerOrg: cert.IssuerOrg,
Fingerprint: cert.Fingerprint,
SerialNumber: cert.SerialNumber,
KeyType: cert.KeyType,
ExpiresAt: expires,
NotBefore: notBefore,
Status: certStatus(cert),
Provider: cert.Provider,
HasKey: cert.PrivateKeyEncrypted != "",
})
}
return result, nil
}
// StartExpiryChecker runs a background goroutine that periodically checks for expiring certificates.
func (s *CertificateService) StartExpiryChecker(ctx context.Context, notificationSvc *NotificationService, warningDays int) {
// Startup delay: avoid notification bursts during frequent restarts
startupDelay := 5 * time.Minute
select {
case <-ctx.Done():
return
case <-time.After(startupDelay):
}
// Add random jitter (0-60 minutes) using crypto/rand
maxJitter := int64(60 * time.Minute)
n, errRand := crand.Int(crand.Reader, big.NewInt(maxJitter))
if errRand != nil {
n = big.NewInt(maxJitter / 2)
}
jitter := time.Duration(n.Int64())
select {
case <-ctx.Done():
return
case <-time.After(jitter):
}
s.checkExpiry(ctx, notificationSvc, warningDays)
ticker := time.NewTicker(24 * time.Hour)
defer ticker.Stop()
for {
select {
case <-ctx.Done():
return
case <-ticker.C:
s.checkExpiry(ctx, notificationSvc, warningDays)
}
}
}
func (s *CertificateService) checkExpiry(ctx context.Context, notificationSvc *NotificationService, warningDays int) {
if notificationSvc == nil {
return
}
certs, err := s.CheckExpiringCertificates(warningDays)
if err != nil {
logger.Log().WithError(err).Error("CertificateService: failed to check expiring certificates")
return
}
for _, cert := range certs {
daysLeft := time.Until(cert.ExpiresAt).Hours() / 24
if daysLeft < 0 {
// Expired
if _, err := notificationSvc.Create(
models.NotificationTypeError,
"Certificate Expired",
fmt.Sprintf("Certificate %q (%s) has expired.", cert.Name, cert.Domains),
); err != nil {
logger.Log().WithError(err).Error("CertificateService: failed to create expiry notification")
}
notificationSvc.SendExternal(ctx,
"cert_expiry",
"Certificate Expired",
fmt.Sprintf("Certificate %q (%s) has expired.", cert.Name, cert.Domains),
map[string]any{"uuid": cert.UUID, "domains": cert.Domains, "status": "expired"},
)
} else {
// Expiring soon
if _, err := notificationSvc.Create(
models.NotificationTypeWarning,
"Certificate Expiring Soon",
fmt.Sprintf("Certificate %q (%s) expires in %.0f days.", cert.Name, cert.Domains, daysLeft),
); err != nil {
logger.Log().WithError(err).Error("CertificateService: failed to create expiry warning notification")
}
notificationSvc.SendExternal(ctx,
"cert_expiry",
"Certificate Expiring Soon",
fmt.Sprintf("Certificate %q (%s) expires in %.0f days.", cert.Name, cert.Domains, daysLeft),
map[string]any{"uuid": cert.UUID, "domains": cert.Domains, "days_left": int(daysLeft)},
)
}
}
}
func buildChainEntries(certPEM, chainPEM string) []ChainEntry {
var entries []ChainEntry
// Parse leaf
if certPEM != "" {
certs, _ := parsePEMCertificates([]byte(certPEM))
for _, c := range certs {
entries = append(entries, ChainEntry{
Subject: c.Subject.CommonName,
Issuer: c.Issuer.CommonName,
ExpiresAt: c.NotAfter,
})
}
}
// Parse chain
if chainPEM != "" {
certs, _ := parsePEMCertificates([]byte(chainPEM))
for _, c := range certs {
entries = append(entries, ChainEntry{
Subject: c.Subject.CommonName,
Issuer: c.Issuer.CommonName,
ExpiresAt: c.NotAfter,
})
}
}
return entries
}


@@ -0,0 +1,172 @@
package services
import (
"context"
"fmt"
"testing"
"time"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/models"
)
// TestCheckExpiry_QueryFails covers lines 977-979: CheckExpiringCertificates fails.
func TestCheckExpiry_QueryFails(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.Notification{}, &models.NotificationProvider{}))
// Drop ssl_certificates so CheckExpiringCertificates returns an error
require.NoError(t, db.Exec("DROP TABLE ssl_certificates").Error)
ns := NewNotificationService(db, nil)
svc := NewCertificateService(t.TempDir(), db, nil)
// Should not panic — logs the error and returns
svc.checkExpiry(context.Background(), ns, 30)
}
// TestCheckExpiry_ExpiredCert_Success covers lines 981-998: expired cert notification success path.
func TestCheckExpiry_ExpiredCert_Success(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.Notification{}, &models.NotificationProvider{}))
past := time.Now().Add(-48 * time.Hour)
certUUID := uuid.New().String()
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: certUUID,
Name: "expired-cert",
Provider: "custom",
Domains: "expired.example.com",
ExpiresAt: &past,
}).Error)
ns := NewNotificationService(db, nil)
svc := NewCertificateService(t.TempDir(), db, nil)
svc.checkExpiry(context.Background(), ns, 30)
var notifications []models.Notification
require.NoError(t, db.Find(&notifications).Error)
assert.NotEmpty(t, notifications)
}
// TestCheckExpiry_ExpiringSoonCert_Success covers lines 999-1014: expiring-soon cert notification success path.
func TestCheckExpiry_ExpiringSoonCert_Success(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.Notification{}, &models.NotificationProvider{}))
soon := time.Now().Add(7 * 24 * time.Hour)
certUUID := uuid.New().String()
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: certUUID,
Name: "expiring-soon-cert",
Provider: "custom",
Domains: "soon.example.com",
ExpiresAt: &soon,
}).Error)
ns := NewNotificationService(db, nil)
svc := NewCertificateService(t.TempDir(), db, nil)
svc.checkExpiry(context.Background(), ns, 30)
var notifications []models.Notification
require.NoError(t, db.Find(&notifications).Error)
assert.NotEmpty(t, notifications)
}
// TestCheckExpiry_NotificationFails covers lines 991-992 and 1006-1007:
// Create() fails for both expired and expiring-soon certs.
func TestCheckExpiry_NotificationFails(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.Notification{}, &models.NotificationProvider{}))
past := time.Now().Add(-48 * time.Hour)
soon := time.Now().Add(7 * 24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: uuid.New().String(),
Name: "expired-cert",
Provider: "custom",
Domains: "expired2.example.com",
ExpiresAt: &past,
}).Error)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: uuid.New().String(),
Name: "soon-cert",
Provider: "custom",
Domains: "soon2.example.com",
ExpiresAt: &soon,
}).Error)
// Drop notifications table so Create() fails
require.NoError(t, db.Exec("DROP TABLE notifications").Error)
ns := NewNotificationService(db, nil)
svc := NewCertificateService(t.TempDir(), db, nil)
// Should not panic — logs errors and continues
svc.checkExpiry(context.Background(), ns, 30)
}
func TestUploadCertificate_KeyMismatch(t *testing.T) {
cert1PEM, _ := generateTestCertAndKey(t, "cert1.example.com", time.Now().Add(24*time.Hour))
_, key2PEM := generateTestCertAndKey(t, "cert2.example.com", time.Now().Add(24*time.Hour))
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}))
svc := NewCertificateService(t.TempDir(), db, nil)
_, err = svc.UploadCertificate("mismatch-test", string(cert1PEM), string(key2PEM), "")
require.Error(t, err)
assert.Contains(t, err.Error(), "key validation failed")
}
func TestUploadCertificate_DBError(t *testing.T) {
certPEM, keyPEM := generateTestCertAndKey(t, "db-err.example.com", time.Now().Add(24*time.Hour))
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
// No AutoMigrate → ssl_certificates table absent → db.Create fails
svc := NewCertificateService(t.TempDir(), db, nil)
_, err = svc.UploadCertificate("db-error-test", string(certPEM), string(keyPEM), "")
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to save certificate")
}
func TestGetCertificate_DBError(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
// No AutoMigrate → ssl_certificates table absent → First() returns error
svc := NewCertificateService(t.TempDir(), db, nil)
_, err = svc.GetCertificate(uuid.New().String())
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to fetch certificate")
}
func TestUpdateCertificate_DBError(t *testing.T) {
db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
require.NoError(t, err)
// No AutoMigrate → ssl_certificates table absent → First() returns non-ErrRecordNotFound error
svc := NewCertificateService(t.TempDir(), db, nil)
_, err = svc.UpdateCertificate(uuid.New().String(), "new-name")
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to fetch certificate")
}


@@ -0,0 +1,520 @@
package services
import (
"context"
"encoding/base64"
"fmt"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/crypto"
"github.com/Wikid82/charon/backend/internal/models"
)
// newTestEncryptionService creates a real EncryptionService for tests.
func newTestEncryptionService(t *testing.T) *crypto.EncryptionService {
t.Helper()
key := make([]byte, 32)
for i := range key {
key[i] = byte(i)
}
keyB64 := base64.StdEncoding.EncodeToString(key)
svc, err := crypto.NewEncryptionService(keyB64)
require.NoError(t, err)
return svc
}
func newTestCertServiceWithEnc(t *testing.T, dataDir string, db *gorm.DB) *CertificateService {
t.Helper()
encSvc := newTestEncryptionService(t)
return &CertificateService{
dataDir: dataDir,
db: db,
encSvc: encSvc,
scanTTL: 5 * time.Minute,
}
}
func seedCertWithKey(t *testing.T, db *gorm.DB, encSvc *crypto.EncryptionService, uuid, name, domain string, expiry time.Time) models.SSLCertificate {
t.Helper()
certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
encKey, err := encSvc.Encrypt(keyPEM)
require.NoError(t, err)
cert := models.SSLCertificate{
UUID: uuid,
Name: name,
Provider: "custom",
Domains: domain,
CommonName: domain,
Certificate: string(certPEM),
PrivateKeyEncrypted: encKey,
ExpiresAt: &expiry,
}
require.NoError(t, db.Create(&cert).Error)
return cert
}
func TestCertificateService_GetCertificate(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
t.Run("not found", func(t *testing.T) {
_, err := cs.GetCertificate("nonexistent-uuid")
assert.ErrorIs(t, err, ErrCertNotFound)
})
t.Run("found with no hosts", func(t *testing.T) {
expiry := time.Now().Add(30 * 24 * time.Hour)
notBefore := time.Now().Add(-time.Hour)
cert := models.SSLCertificate{
UUID: "get-cert-1",
Name: "Test Cert",
Provider: "custom",
Domains: "get.example.com",
CommonName: "get.example.com",
ExpiresAt: &expiry,
NotBefore: &notBefore,
}
require.NoError(t, db.Create(&cert).Error)
detail, err := cs.GetCertificate("get-cert-1")
require.NoError(t, err)
assert.Equal(t, "get-cert-1", detail.UUID)
assert.Equal(t, "Test Cert", detail.Name)
assert.Equal(t, "get.example.com", detail.CommonName)
assert.False(t, detail.InUse)
assert.Empty(t, detail.AssignedHosts)
})
t.Run("found with assigned host", func(t *testing.T) {
expiry := time.Now().Add(30 * 24 * time.Hour)
cert := models.SSLCertificate{
UUID: "get-cert-2",
Name: "Assigned Cert",
Provider: "custom",
Domains: "assigned.example.com",
CommonName: "assigned.example.com",
ExpiresAt: &expiry,
}
require.NoError(t, db.Create(&cert).Error)
ph := models.ProxyHost{
UUID: "ph-assigned",
Name: "My Proxy",
DomainNames: "assigned.example.com",
ForwardHost: "localhost",
ForwardPort: 8080,
CertificateID: &cert.ID,
}
require.NoError(t, db.Create(&ph).Error)
detail, err := cs.GetCertificate("get-cert-2")
require.NoError(t, err)
assert.True(t, detail.InUse)
require.Len(t, detail.AssignedHosts, 1)
assert.Equal(t, "My Proxy", detail.AssignedHosts[0].Name)
})
t.Run("nil expiry and not_before", func(t *testing.T) {
cert := models.SSLCertificate{
UUID: "get-cert-3",
Name: "No Dates",
Provider: "custom",
Domains: "nodates.example.com",
}
require.NoError(t, db.Create(&cert).Error)
detail, err := cs.GetCertificate("get-cert-3")
require.NoError(t, err)
assert.True(t, detail.ExpiresAt.IsZero())
assert.True(t, detail.NotBefore.IsZero())
})
}
func TestCertificateService_ValidateCertificate(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
t.Run("valid cert with key", func(t *testing.T) {
certPEM, keyPEM := generateTestCertAndKey(t, "validate.example.com", time.Now().Add(24*time.Hour))
result, err := cs.ValidateCertificate(string(certPEM), string(keyPEM), "")
require.NoError(t, err)
assert.True(t, result.Valid)
assert.True(t, result.KeyMatch)
assert.Empty(t, result.Errors)
})
t.Run("invalid cert data", func(t *testing.T) {
result, err := cs.ValidateCertificate("not-a-cert", "", "")
require.NoError(t, err)
assert.False(t, result.Valid)
assert.NotEmpty(t, result.Errors)
})
t.Run("valid cert without key", func(t *testing.T) {
certPEM := generateTestCert(t, "nokey.example.com", time.Now().Add(24*time.Hour))
result, err := cs.ValidateCertificate(string(certPEM), "", "")
require.NoError(t, err)
assert.True(t, result.Valid)
assert.False(t, result.KeyMatch)
assert.Empty(t, result.Errors)
})
t.Run("expired cert", func(t *testing.T) {
certPEM := generateTestCert(t, "expired.example.com", time.Now().Add(-24*time.Hour))
result, err := cs.ValidateCertificate(string(certPEM), "", "")
require.NoError(t, err)
assert.NotEmpty(t, result.Warnings)
})
}
func TestCertificateService_UpdateCertificate(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
t.Run("not found", func(t *testing.T) {
_, err := cs.UpdateCertificate("nonexistent-uuid", "New Name")
assert.ErrorIs(t, err, ErrCertNotFound)
})
t.Run("successful rename", func(t *testing.T) {
expiry := time.Now().Add(30 * 24 * time.Hour)
cert := models.SSLCertificate{
UUID: "update-cert-1",
Name: "Old Name",
Provider: "custom",
Domains: "update.example.com",
CommonName: "update.example.com",
ExpiresAt: &expiry,
}
require.NoError(t, db.Create(&cert).Error)
info, err := cs.UpdateCertificate("update-cert-1", "New Name")
require.NoError(t, err)
assert.Equal(t, "New Name", info.Name)
assert.Equal(t, "update-cert-1", info.UUID)
assert.Equal(t, "custom", info.Provider)
})
t.Run("updates persist", func(t *testing.T) {
var cert models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", "update-cert-1").First(&cert).Error)
assert.Equal(t, "New Name", cert.Name)
})
t.Run("nil expiry and not_before", func(t *testing.T) {
cert := models.SSLCertificate{
UUID: "update-cert-2",
Name: "No Dates Cert",
Provider: "custom",
Domains: "nodates-update.example.com",
}
require.NoError(t, db.Create(&cert).Error)
info, err := cs.UpdateCertificate("update-cert-2", "Renamed No Dates")
require.NoError(t, err)
assert.Equal(t, "Renamed No Dates", info.Name)
assert.True(t, info.ExpiresAt.IsZero())
})
}
func TestCertificateService_IsCertificateInUseByUUID(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
t.Run("not found", func(t *testing.T) {
_, err := cs.IsCertificateInUseByUUID("nonexistent-uuid")
assert.ErrorIs(t, err, ErrCertNotFound)
})
t.Run("not in use", func(t *testing.T) {
cert := models.SSLCertificate{UUID: "inuse-1", Name: "Free Cert", Provider: "custom", Domains: "free.example.com"}
require.NoError(t, db.Create(&cert).Error)
inUse, err := cs.IsCertificateInUseByUUID("inuse-1")
require.NoError(t, err)
assert.False(t, inUse)
})
t.Run("in use", func(t *testing.T) {
cert := models.SSLCertificate{UUID: "inuse-2", Name: "Used Cert", Provider: "custom", Domains: "used.example.com"}
require.NoError(t, db.Create(&cert).Error)
ph := models.ProxyHost{UUID: "ph-inuse", Name: "Using Proxy", DomainNames: "used.example.com", ForwardHost: "localhost", ForwardPort: 8080, CertificateID: &cert.ID}
require.NoError(t, db.Create(&ph).Error)
inUse, err := cs.IsCertificateInUseByUUID("inuse-2")
require.NoError(t, err)
assert.True(t, inUse)
})
}
func TestCertificateService_DeleteCertificateByID(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
cert := models.SSLCertificate{UUID: "del-by-id-1", Name: "Delete By ID", Provider: "custom", Domains: "delbyid.example.com"}
require.NoError(t, db.Create(&cert).Error)
err = cs.DeleteCertificateByID(cert.ID)
require.NoError(t, err)
var found models.SSLCertificate
err = db.Where("uuid = ?", "del-by-id-1").First(&found).Error
assert.Error(t, err)
}
func TestCertificateService_ExportCertificate(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
encSvc := newTestEncryptionService(t)
cs := newTestCertServiceWithEnc(t, tmpDir, db)
domain := "export.example.com"
expiry := time.Now().Add(30 * 24 * time.Hour)
cert := seedCertWithKey(t, db, encSvc, "export-cert-1", "Export Cert", domain, expiry)
t.Run("not found", func(t *testing.T) {
_, _, err := cs.ExportCertificate("nonexistent", "pem", false, "")
assert.ErrorIs(t, err, ErrCertNotFound)
})
t.Run("pem without key", func(t *testing.T) {
data, filename, err := cs.ExportCertificate(cert.UUID, "pem", false, "")
require.NoError(t, err)
assert.Equal(t, "Export Cert.pem", filename)
assert.Contains(t, string(data), "BEGIN CERTIFICATE")
})
t.Run("pem with key", func(t *testing.T) {
data, filename, err := cs.ExportCertificate(cert.UUID, "pem", true, "")
require.NoError(t, err)
assert.Equal(t, "Export Cert.pem", filename)
assert.Contains(t, string(data), "BEGIN CERTIFICATE")
assert.Contains(t, string(data), "PRIVATE KEY")
})
t.Run("der format", func(t *testing.T) {
data, filename, err := cs.ExportCertificate(cert.UUID, "der", false, "")
require.NoError(t, err)
assert.Equal(t, "Export Cert.der", filename)
assert.NotEmpty(t, data)
})
t.Run("pfx format", func(t *testing.T) {
data, filename, err := cs.ExportCertificate(cert.UUID, "pfx", false, "")
require.NoError(t, err)
assert.Equal(t, "Export Cert.pfx", filename)
assert.NotEmpty(t, data)
})
t.Run("unsupported format", func(t *testing.T) {
_, _, err := cs.ExportCertificate(cert.UUID, "jks", false, "")
assert.Error(t, err)
assert.Contains(t, err.Error(), "unsupported export format")
})
t.Run("empty name uses fallback", func(t *testing.T) {
noNameCert := seedCertWithKey(t, db, encSvc, "export-noname", "", domain, expiry)
_, filename, err := cs.ExportCertificate(noNameCert.UUID, "pem", false, "")
require.NoError(t, err)
assert.Equal(t, "certificate.pem", filename)
})
}
func TestCertificateService_GetDecryptedPrivateKey(t *testing.T) {
encSvc := newTestEncryptionService(t)
t.Run("no encrypted key", func(t *testing.T) {
cs := &CertificateService{encSvc: encSvc}
cert := &models.SSLCertificate{PrivateKeyEncrypted: ""}
_, err := cs.GetDecryptedPrivateKey(cert)
assert.Error(t, err)
assert.Contains(t, err.Error(), "no encrypted private key")
})
t.Run("no encryption service", func(t *testing.T) {
cs := &CertificateService{encSvc: nil}
cert := &models.SSLCertificate{PrivateKeyEncrypted: "some-data"}
_, err := cs.GetDecryptedPrivateKey(cert)
assert.Error(t, err)
assert.Contains(t, err.Error(), "encryption service not configured")
})
t.Run("successful decryption", func(t *testing.T) {
cs := &CertificateService{encSvc: encSvc}
plaintext := "-----BEGIN RSA PRIVATE KEY-----\ntest\n-----END RSA PRIVATE KEY-----" //nolint:gosec // test data, not real credentials
encrypted, err := encSvc.Encrypt([]byte(plaintext))
require.NoError(t, err)
cert := &models.SSLCertificate{PrivateKeyEncrypted: encrypted}
result, err := cs.GetDecryptedPrivateKey(cert)
require.NoError(t, err)
assert.Equal(t, plaintext, result)
})
}
func TestCertificateService_CheckExpiringCertificates(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
// Create certs with different expiry states
expiringSoon := time.Now().Add(5 * 24 * time.Hour)
expired := time.Now().Add(-24 * time.Hour)
farFuture := time.Now().Add(365 * 24 * time.Hour)
db.Create(&models.SSLCertificate{UUID: "exp-soon", Name: "Expiring Soon", Provider: "custom", Domains: "soon.example.com", ExpiresAt: &expiringSoon})
db.Create(&models.SSLCertificate{UUID: "exp-past", Name: "Already Expired", Provider: "custom", Domains: "expired.example.com", ExpiresAt: &expired})
db.Create(&models.SSLCertificate{UUID: "exp-far", Name: "Far Future", Provider: "custom", Domains: "far.example.com", ExpiresAt: &farFuture})
// ACME certs should not be included (only custom)
db.Create(&models.SSLCertificate{UUID: "exp-le", Name: "LE Cert", Provider: "letsencrypt", Domains: "le.example.com", ExpiresAt: &expiringSoon})
t.Run("30 day window", func(t *testing.T) {
certs, err := cs.CheckExpiringCertificates(30)
require.NoError(t, err)
assert.Len(t, certs, 2) // expiringSoon and expired
foundSoon := false
foundExpired := false
for _, c := range certs {
if c.UUID == "exp-soon" {
foundSoon = true
}
if c.UUID == "exp-past" {
foundExpired = true
}
}
assert.True(t, foundSoon)
assert.True(t, foundExpired)
})
t.Run("1 day window", func(t *testing.T) {
certs, err := cs.CheckExpiringCertificates(1)
require.NoError(t, err)
assert.Len(t, certs, 1) // only the expired one
assert.Equal(t, "exp-past", certs[0].UUID)
})
}
func TestCertificateService_CheckExpiry(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.Setting{}, &models.NotificationProvider{}, &models.Notification{}))
cs := newTestCertificateService(tmpDir, db)
ns := NewNotificationService(db, nil)
expiringSoon := time.Now().Add(5 * 24 * time.Hour)
expired := time.Now().Add(-24 * time.Hour)
db.Create(&models.SSLCertificate{UUID: "chk-soon", Name: "Expiring", Provider: "custom", Domains: "chksoon.example.com", ExpiresAt: &expiringSoon})
db.Create(&models.SSLCertificate{UUID: "chk-past", Name: "Expired", Provider: "custom", Domains: "chkpast.example.com", ExpiresAt: &expired})
t.Run("nil notification service", func(t *testing.T) {
cs.checkExpiry(context.Background(), nil, 30)
})
t.Run("creates notifications for expiring certs", func(t *testing.T) {
cs.checkExpiry(context.Background(), ns, 30)
var notifications []models.Notification
db.Find(&notifications)
assert.GreaterOrEqual(t, len(notifications), 2)
})
}
func TestCertificateService_MigratePrivateKeys(t *testing.T) {
t.Run("no encryption service", func(t *testing.T) {
cs := &CertificateService{encSvc: nil}
err := cs.MigratePrivateKeys()
require.NoError(t, err)
})
t.Run("no keys to migrate", func(t *testing.T) {
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
// MigratePrivateKeys uses raw SQL referencing private_key column (gorm:"-" tag)
require.NoError(t, db.Exec("ALTER TABLE ssl_certificates ADD COLUMN private_key TEXT DEFAULT ''").Error)
encSvc := newTestEncryptionService(t)
cs := &CertificateService{db: db, encSvc: encSvc}
err = cs.MigratePrivateKeys()
require.NoError(t, err)
})
t.Run("migrates plaintext key", func(t *testing.T) {
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
// MigratePrivateKeys uses raw SQL referencing private_key column (gorm:"-" tag)
require.NoError(t, db.Exec("ALTER TABLE ssl_certificates ADD COLUMN private_key TEXT DEFAULT ''").Error)
// Insert cert with plaintext key using raw SQL
require.NoError(t, db.Exec(
"INSERT INTO ssl_certificates (uuid, name, provider, domains, private_key) VALUES (?, ?, ?, ?, ?)",
"migrate-1", "Migrate Test", "custom", "migrate.example.com", "plaintext-key-data",
).Error)
encSvc := newTestEncryptionService(t)
cs := &CertificateService{db: db, encSvc: encSvc}
err = cs.MigratePrivateKeys()
require.NoError(t, err)
// Verify the key was encrypted and plaintext cleared
type rawRow struct {
PrivateKey string `gorm:"column:private_key"`
PrivateKeyEnc string `gorm:"column:private_key_enc"`
}
var row rawRow
require.NoError(t, db.Raw("SELECT private_key, private_key_enc FROM ssl_certificates WHERE uuid = ?", "migrate-1").Scan(&row).Error)
assert.Empty(t, row.PrivateKey)
assert.NotEmpty(t, row.PrivateKeyEnc)
})
}

@@ -0,0 +1,292 @@
package services
import (
"context"
"fmt"
"strings"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/models"
)
// --- buildChainEntries ---
func TestBuildChainEntries(t *testing.T) {
certPEM := string(generateTestCert(t, "leaf.example.com", time.Now().Add(24*time.Hour)))
chainPEM := string(generateTestCert(t, "ca.example.com", time.Now().Add(365*24*time.Hour)))
t.Run("leaf only", func(t *testing.T) {
entries := buildChainEntries(certPEM, "")
require.Len(t, entries, 1)
assert.Equal(t, "leaf.example.com", entries[0].Subject)
})
t.Run("leaf and chain", func(t *testing.T) {
entries := buildChainEntries(certPEM, chainPEM)
require.Len(t, entries, 2)
assert.Equal(t, "leaf.example.com", entries[0].Subject)
assert.Equal(t, "ca.example.com", entries[1].Subject)
})
t.Run("empty cert", func(t *testing.T) {
entries := buildChainEntries("", chainPEM)
require.Len(t, entries, 1)
assert.Equal(t, "ca.example.com", entries[0].Subject)
})
t.Run("both empty", func(t *testing.T) {
entries := buildChainEntries("", "")
assert.Empty(t, entries)
})
t.Run("invalid PEM ignored", func(t *testing.T) {
entries := buildChainEntries("not-pem", "also-not-pem")
assert.Empty(t, entries)
})
}
// --- certStatus ---
func TestCertStatus(t *testing.T) {
now := time.Now()
t.Run("valid", func(t *testing.T) {
expiry := now.Add(60 * 24 * time.Hour)
cert := models.SSLCertificate{ExpiresAt: &expiry, Provider: "custom"}
assert.Equal(t, "valid", certStatus(cert))
})
t.Run("expired", func(t *testing.T) {
expiry := now.Add(-time.Hour)
cert := models.SSLCertificate{ExpiresAt: &expiry, Provider: "custom"}
assert.Equal(t, "expired", certStatus(cert))
})
t.Run("expiring soon", func(t *testing.T) {
expiry := now.Add(15 * 24 * time.Hour) // within 30d window
cert := models.SSLCertificate{ExpiresAt: &expiry, Provider: "custom"}
assert.Equal(t, "expiring", certStatus(cert))
})
t.Run("staging provider", func(t *testing.T) {
expiry := now.Add(60 * 24 * time.Hour)
cert := models.SSLCertificate{ExpiresAt: &expiry, Provider: "letsencrypt-staging"}
assert.Equal(t, "untrusted", certStatus(cert))
})
t.Run("nil expiry", func(t *testing.T) {
cert := models.SSLCertificate{Provider: "custom"}
assert.Equal(t, "valid", certStatus(cert))
})
}
// --- ListCertificates cache paths ---
func TestListCertificates_InitializedAndStale(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
// First call initializes
certs1, err := cs.ListCertificates()
require.NoError(t, err)
assert.Empty(t, certs1)
// Force stale but initialized
cs.cacheMu.Lock()
cs.initialized = true
cs.lastScan = time.Time{} // zero → stale
cs.cacheMu.Unlock()
// Should still return (stale) cache and trigger background sync
certs2, err := cs.ListCertificates()
require.NoError(t, err)
assert.NotNil(t, certs2)
}
func TestListCertificates_CacheFresh(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s_fresh?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
cs.cacheMu.Lock()
cs.initialized = true
cs.lastScan = time.Now()
cs.cache = []CertificateInfo{{Name: "cached"}}
cs.scanTTL = 5 * time.Minute
cs.cacheMu.Unlock()
certs, err := cs.ListCertificates()
require.NoError(t, err)
require.Len(t, certs, 1)
assert.Equal(t, "cached", certs[0].Name)
}
// --- ValidateCertificate extra branches ---
func TestValidateCertificate_KeyMismatch(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
// Generate two separate cert/key pairs so key doesn't match cert
certPEM, _ := generateTestCertAndKey(t, "mismatch.example.com", time.Now().Add(24*time.Hour))
_, keyPEM := generateTestCertAndKey(t, "other.example.com", time.Now().Add(24*time.Hour))
result, err := cs.ValidateCertificate(string(certPEM), string(keyPEM), "")
require.NoError(t, err)
// Key mismatch goes to Errors
found := false
for _, e := range result.Errors {
if strings.Contains(e, "mismatch") {
found = true
}
}
assert.True(t, found, "expected key mismatch error, got errors: %v, warnings: %v", result.Errors, result.Warnings)
}
// --- UploadCertificate with encryption ---
func TestUploadCertificate_WithEncryption(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "enc.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("encrypted-cert", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
assert.Equal(t, "encrypted-cert", info.Name)
// Verify private key was encrypted in DB
var stored models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", info.UUID).First(&stored).Error)
assert.NotEmpty(t, stored.PrivateKeyEncrypted)
assert.Empty(t, stored.PrivateKey) // should not store plaintext
}
// --- checkExpiry additional branches ---
func TestCheckExpiry_NoNotificationService(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}, &models.Setting{}, &models.NotificationProvider{}))
cs := &CertificateService{
dataDir: tmpDir,
db: db,
scanTTL: 5 * time.Minute,
}
// No notification service set — should not panic
cs.checkExpiry(context.Background(), nil, 30)
}
// --- DeleteCertificate with backup service ---
func TestDeleteCertificate_Success(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "delete.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("to-delete", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
err = cs.DeleteCertificate(info.UUID)
assert.NoError(t, err)
// Verify deleted
_, err = cs.GetCertificate(info.UUID)
assert.ErrorIs(t, err, ErrCertNotFound)
}
func TestDeleteCertificate_InUse(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "inuse.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("in-use-cert", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
// Find the cert and assign to a host
var stored models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", info.UUID).First(&stored).Error)
ph := models.ProxyHost{
UUID: "ph-inuse",
Name: "InUse Host",
DomainNames: "inuse.example.com",
ForwardHost: "localhost",
ForwardPort: 8080,
CertificateID: &stored.ID,
}
require.NoError(t, db.Create(&ph).Error)
err = cs.DeleteCertificate(info.UUID)
assert.Error(t, err)
assert.Contains(t, err.Error(), "in use")
}
// --- IsCertificateInUse ---
func TestIsCertificateInUse(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
cert := models.SSLCertificate{
UUID: "inuse-test", Name: "In Use Test", Provider: "custom",
Domains: "test.example.com", CommonName: "test.example.com",
}
require.NoError(t, db.Create(&cert).Error)
t.Run("not in use", func(t *testing.T) {
inUse, err := cs.IsCertificateInUse(cert.ID)
require.NoError(t, err)
assert.False(t, inUse)
})
t.Run("in use", func(t *testing.T) {
ph := models.ProxyHost{
UUID: "ph-check", Name: "Check Host", DomainNames: "test.example.com",
ForwardHost: "localhost", ForwardPort: 8080, CertificateID: &cert.ID,
}
require.NoError(t, db.Create(&ph).Error)
inUse, err := cs.IsCertificateInUse(cert.ID)
require.NoError(t, err)
assert.True(t, inUse)
})
}

@@ -0,0 +1,596 @@
package services
import (
"context"
"fmt"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/models"
)
// --- ExportCertificate DER format ---
func TestExportCertificate_DER(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "der-export.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("der-export", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
data, filename, err := cs.ExportCertificate(info.UUID, "der", false, "")
require.NoError(t, err)
assert.NotEmpty(t, data)
assert.Contains(t, filename, ".der")
}
// --- ExportCertificate PFX format ---
func TestExportCertificate_PFX(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "pfx-export.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("pfx-export", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
data, filename, err := cs.ExportCertificate(info.UUID, "pfx", true, "test-password")
require.NoError(t, err)
assert.NotEmpty(t, data)
assert.Contains(t, filename, ".pfx")
}
func TestExportCertificate_P12(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "p12-export.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("p12-export", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
data, filename, err := cs.ExportCertificate(info.UUID, "p12", true, "password")
require.NoError(t, err)
assert.NotEmpty(t, data)
assert.Contains(t, filename, ".pfx")
}
func TestExportCertificate_UnsupportedFormat(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "unsupported.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("unsupported-fmt", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
_, _, err = cs.ExportCertificate(info.UUID, "xml", false, "")
assert.Error(t, err)
assert.Contains(t, err.Error(), "unsupported export format")
}
func TestExportCertificate_PEMWithKey(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "pem-key.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("pem-key-export", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
data, filename, err := cs.ExportCertificate(info.UUID, "pem", true, "")
require.NoError(t, err)
assert.Contains(t, string(data), "PRIVATE KEY")
assert.Contains(t, filename, ".pem")
}
func TestExportCertificate_NotFound(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
_, _, err = cs.ExportCertificate("nonexistent-uuid", "pem", false, "")
assert.ErrorIs(t, err, ErrCertNotFound)
}
// --- GetDecryptedPrivateKey ---
func TestGetDecryptedPrivateKey_NoEncryptedKey(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
cert := &models.SSLCertificate{PrivateKeyEncrypted: ""}
_, err = cs.GetDecryptedPrivateKey(cert)
assert.Error(t, err)
assert.Contains(t, err.Error(), "no encrypted private key")
}
func TestGetDecryptedPrivateKey_NoEncryptionService(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db) // no encSvc
cert := &models.SSLCertificate{PrivateKeyEncrypted: "some-encrypted-data"}
_, err = cs.GetDecryptedPrivateKey(cert)
assert.Error(t, err)
assert.Contains(t, err.Error(), "encryption service not configured")
}
// --- MigratePrivateKeys ---
func TestMigratePrivateKeys_NoEncryptionService(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
err = cs.MigratePrivateKeys()
assert.NoError(t, err) // should return nil without error
}
func TestMigratePrivateKeys_NoCertsToMigrate(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
// MigratePrivateKeys uses raw SQL against private_key column (gorm:"-"), so add it manually
db.Exec("ALTER TABLE ssl_certificates ADD COLUMN private_key TEXT DEFAULT ''")
cs := newTestCertServiceWithEnc(t, tmpDir, db)
err = cs.MigratePrivateKeys()
assert.NoError(t, err)
}
func TestMigratePrivateKeys_WithPlaintextKey(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
// MigratePrivateKeys uses raw SQL against private_key column (gorm:"-"), so add it manually
db.Exec("ALTER TABLE ssl_certificates ADD COLUMN private_key TEXT DEFAULT ''")
cs := newTestCertServiceWithEnc(t, tmpDir, db)
_, keyPEM := generateTestCertAndKey(t, "migrate.example.com", time.Now().Add(24*time.Hour))
// Insert a cert with plaintext private_key via raw SQL
db.Exec("INSERT INTO ssl_certificates (uuid, name, provider, domains, common_name, private_key) VALUES (?, ?, ?, ?, ?, ?)",
"migrate-uuid", "Migrate Test", "custom", "migrate.example.com", "migrate.example.com", string(keyPEM))
err = cs.MigratePrivateKeys()
assert.NoError(t, err)
// Verify the key was encrypted
var encKey string
db.Raw("SELECT private_key_enc FROM ssl_certificates WHERE uuid = ?", "migrate-uuid").Scan(&encKey)
assert.NotEmpty(t, encKey)
// Verify plaintext key was cleared
var plainKey string
db.Raw("SELECT private_key FROM ssl_certificates WHERE uuid = ?", "migrate-uuid").Scan(&plainKey)
assert.Empty(t, plainKey)
}
// --- DeleteCertificateByID ---
func TestDeleteCertificateByID_Success(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "byid.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("by-id-delete", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
var stored models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", info.UUID).First(&stored).Error)
err = cs.DeleteCertificateByID(stored.ID)
assert.NoError(t, err)
}
func TestDeleteCertificateByID_NotFound(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
err = cs.DeleteCertificateByID(99999)
assert.Error(t, err)
}
// --- UpdateCertificate ---
func TestUpdateCertificate_Success(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "update.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("old-name", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
updated, err := cs.UpdateCertificate(info.UUID, "new-name")
require.NoError(t, err)
assert.Equal(t, "new-name", updated.Name)
}
func TestUpdateCertificate_NotFound(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
_, err = cs.UpdateCertificate("nonexistent", "name")
assert.ErrorIs(t, err, ErrCertNotFound)
}
// --- IsCertificateInUseByUUID ---
func TestIsCertificateInUseByUUID_NotFound(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
_, err = cs.IsCertificateInUseByUUID("nonexistent-uuid")
assert.ErrorIs(t, err, ErrCertNotFound)
}
func TestIsCertificateInUseByUUID_NotInUse(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM := generateTestCertAndKey(t, "inuse-uuid.example.com", time.Now().Add(24*time.Hour))
info, err := cs.UploadCertificate("uuid-inuse-test", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
inUse, err := cs.IsCertificateInUseByUUID(info.UUID)
require.NoError(t, err)
assert.False(t, inUse)
}
// --- CheckExpiringCertificates ---
func TestCheckExpiringCertificates_WithExpiring(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
// Create a cert expiring in 10 days
expiry := time.Now().Add(10 * 24 * time.Hour)
notBefore := time.Now().Add(-24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "expiring-uuid", Name: "Expiring Cert", Provider: "custom",
Domains: "expiring.example.com", CommonName: "expiring.example.com",
ExpiresAt: &expiry, NotBefore: &notBefore,
}).Error)
certs, err := cs.CheckExpiringCertificates(30)
require.NoError(t, err)
assert.Len(t, certs, 1)
assert.Equal(t, "Expiring Cert", certs[0].Name)
assert.Equal(t, "expiring", certs[0].Status)
}
func TestCheckExpiringCertificates_WithExpired(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
expiry := time.Now().Add(-24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "expired-uuid", Name: "Expired Cert", Provider: "custom",
Domains: "expired.example.com", CommonName: "expired.example.com",
ExpiresAt: &expiry,
}).Error)
certs, err := cs.CheckExpiringCertificates(30)
require.NoError(t, err)
assert.Len(t, certs, 1)
assert.Equal(t, "expired", certs[0].Status)
}
func TestCheckExpiringCertificates_NoneExpiring(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
// Cert expiring in 90 days - outside 30 day window
expiry := time.Now().Add(90 * 24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "valid-uuid", Name: "Valid Cert", Provider: "custom",
Domains: "valid.example.com", CommonName: "valid.example.com",
ExpiresAt: &expiry,
}).Error)
certs, err := cs.CheckExpiringCertificates(30)
require.NoError(t, err)
assert.Empty(t, certs)
}
// --- checkExpiry with notification service ---
func TestCheckExpiry_WithExpiringCerts(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(
&models.SSLCertificate{}, &models.ProxyHost{},
&models.Setting{}, &models.NotificationProvider{},
&models.Notification{},
))
cs := newTestCertificateService(tmpDir, db)
// Create expiring cert
expiry := time.Now().Add(10 * 24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "notify-expiring", Name: "Notify Cert", Provider: "custom",
Domains: "notify.example.com", CommonName: "notify.example.com",
ExpiresAt: &expiry,
}).Error)
notifSvc := NewNotificationService(db, nil)
cs.checkExpiry(context.Background(), notifSvc, 30)
// Verify a notification was created
var count int64
db.Model(&models.Notification{}).Count(&count)
assert.Greater(t, count, int64(0))
}
func TestCheckExpiry_WithExpiredCerts(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(
&models.SSLCertificate{}, &models.ProxyHost{},
&models.Setting{}, &models.NotificationProvider{},
&models.Notification{},
))
cs := newTestCertificateService(tmpDir, db)
expiry := time.Now().Add(-24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "notify-expired", Name: "Expired Notify", Provider: "custom",
Domains: "expired-notify.example.com", CommonName: "expired-notify.example.com",
ExpiresAt: &expiry,
}).Error)
notifSvc := NewNotificationService(db, nil)
cs.checkExpiry(context.Background(), notifSvc, 30)
var count int64
db.Model(&models.Notification{}).Count(&count)
assert.Greater(t, count, int64(0))
}
// --- ListCertificates with chain and proxy host ---
func TestListCertificates_WithChainAndProxyHost(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, _, err := generateSelfSignedCertPEM()
require.NoError(t, err)
chainPEM := certPEM + "\n" + certPEM
expiry := time.Now().Add(90 * 24 * time.Hour)
notBefore := time.Now().Add(-1 * time.Hour)
certID := uint(99)
require.NoError(t, db.Create(&models.SSLCertificate{
ID: certID,
UUID: "chain-test-uuid",
Name: "Chain Test",
Provider: "custom",
Domains: "chain.example.com",
CommonName: "chain.example.com",
Certificate: certPEM,
CertificateChain: chainPEM,
ExpiresAt: &expiry,
NotBefore: &notBefore,
}).Error)
require.NoError(t, db.Create(&models.ProxyHost{
Name: "My Proxy",
DomainNames: "chain.example.com",
CertificateID: &certID,
}).Error)
certs, err := cs.ListCertificates()
require.NoError(t, err)
require.Len(t, certs, 1)
assert.Equal(t, 2, certs[0].ChainDepth)
assert.True(t, certs[0].InUse)
assert.Equal(t, "chain-test-uuid", certs[0].UUID)
}
// --- UploadCertificate with key ---
func TestUploadCertificate_WithKey(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
info, err := cs.UploadCertificate("My Upload", certPEM, keyPEM, "")
require.NoError(t, err)
require.NotNil(t, info)
assert.Equal(t, "My Upload", info.Name)
assert.True(t, info.HasKey)
assert.NotEmpty(t, info.UUID)
assert.Equal(t, "custom", info.Provider)
}
// --- ValidateCertificate with key match ---
func TestValidateCertificate_WithKeyMatch(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
result, err := cs.ValidateCertificate(certPEM, keyPEM, "")
require.NoError(t, err)
assert.True(t, result.Valid)
assert.True(t, result.KeyMatch)
assert.Empty(t, result.Errors)
assert.Contains(t, result.Warnings, "certificate could not be verified against system roots")
}
// --- UpdateCertificate with chain depth ---
func TestUpdateCertificate_WithChainDepth(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertificateService(tmpDir, db)
certPEM, _, err := generateSelfSignedCertPEM()
require.NoError(t, err)
chainPEM := certPEM + "\n" + certPEM + "\n" + certPEM
expiry := time.Now().Add(90 * 24 * time.Hour)
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "update-chain-uuid",
Name: "Chain Update",
Provider: "custom",
Domains: "update-chain.example.com",
CommonName: "update-chain.example.com",
Certificate: certPEM,
CertificateChain: chainPEM,
ExpiresAt: &expiry,
}).Error)
info, err := cs.UpdateCertificate("update-chain-uuid", "Renamed Chain")
require.NoError(t, err)
assert.Equal(t, "Renamed Chain", info.Name)
assert.Equal(t, 3, info.ChainDepth)
}
// --- ExportCertificate PEM with chain ---
func TestExportCertificate_PEMWithChain(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
cs := newTestCertServiceWithEnc(t, tmpDir, db)
certPEM, keyPEM, err := generateSelfSignedCertPEM()
require.NoError(t, err)
encSvc := newTestEncryptionService(t)
encKey, err := encSvc.Encrypt([]byte(keyPEM))
require.NoError(t, err)
chainPEM := certPEM
require.NoError(t, db.Create(&models.SSLCertificate{
UUID: "export-chain-uuid",
Name: "Export Chain",
Provider: "custom",
Domains: "export-chain.example.com",
CommonName: "export-chain.example.com",
Certificate: certPEM,
CertificateChain: chainPEM,
PrivateKeyEncrypted: encKey,
}).Error)
data, filename, err := cs.ExportCertificate("export-chain-uuid", "pem", true, "")
require.NoError(t, err)
assert.Equal(t, "Export Chain.pem", filename)
assert.Contains(t, string(data), "BEGIN CERTIFICATE")
assert.Contains(t, string(data), "PRIVATE KEY") // key included since includeKey is true
}


@@ -0,0 +1,236 @@
package services
import (
"fmt"
"os"
"path/filepath"
"testing"
"time"
"github.com/google/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"github.com/Wikid82/charon/backend/internal/models"
)
func TestSyncFromDisk_StagingToProductionUpgrade(t *testing.T) {
tmpDir := t.TempDir()
certRoot := filepath.Join(tmpDir, "certificates")
require.NoError(t, os.MkdirAll(certRoot, 0755))
domain := "staging-upgrade.example.com"
certPEM, _ := generateTestCertAndKey(t, domain, time.Now().Add(24*time.Hour))
certFile := filepath.Join(certRoot, domain+".crt")
require.NoError(t, os.WriteFile(certFile, certPEM, 0600))
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
existing := models.SSLCertificate{
UUID: uuid.New().String(),
Name: domain,
Provider: "letsencrypt-staging",
Domains: domain,
Certificate: "old-content",
}
require.NoError(t, db.Create(&existing).Error)
svc := newTestCertificateService(tmpDir, db)
require.NoError(t, svc.SyncFromDisk())
var updated models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", existing.UUID).First(&updated).Error)
assert.Equal(t, "letsencrypt", updated.Provider)
}
func TestSyncFromDisk_ExpiryOnlyUpdate(t *testing.T) {
tmpDir := t.TempDir()
certRoot := filepath.Join(tmpDir, "certificates")
require.NoError(t, os.MkdirAll(certRoot, 0755))
domain := "expiry-only.example.com"
certPEM, _ := generateTestCertAndKey(t, domain, time.Now().Add(24*time.Hour))
certFile := filepath.Join(certRoot, domain+".crt")
require.NoError(t, os.WriteFile(certFile, certPEM, 0600))
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
existing := models.SSLCertificate{
UUID: uuid.New().String(),
Name: domain,
Provider: "letsencrypt",
Domains: domain,
Certificate: string(certPEM), // identical content
}
require.NoError(t, db.Create(&existing).Error)
svc := newTestCertificateService(tmpDir, db)
require.NoError(t, svc.SyncFromDisk())
var updated models.SSLCertificate
require.NoError(t, db.Where("uuid = ?", existing.UUID).First(&updated).Error)
assert.Equal(t, "letsencrypt", updated.Provider)
assert.Equal(t, string(certPEM), updated.Certificate)
}
func TestSyncFromDisk_CertRootStatPermissionError(t *testing.T) {
if os.Getuid() == 0 {
t.Skip("cannot test permission error as root")
}
tmpDir := t.TempDir()
certRoot := filepath.Join(tmpDir, "certificates")
require.NoError(t, os.MkdirAll(certRoot, 0755))
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
// Restrict parent dir so os.Stat(certRoot) fails with permission error
require.NoError(t, os.Chmod(tmpDir, 0))
defer func() { _ = os.Chmod(tmpDir, 0755) }()
svc := newTestCertificateService(tmpDir, db)
err = svc.SyncFromDisk()
require.NoError(t, err)
}
func TestListCertificates_StaleCache_TriggersBackgroundSync(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := newTestCertificateService(tmpDir, db)
// Simulate stale cache
svc.cacheMu.Lock()
svc.initialized = true
svc.lastScan = time.Now().Add(-10 * time.Minute)
before := svc.lastScan
svc.cacheMu.Unlock()
_, err = svc.ListCertificates()
require.NoError(t, err)
// Background goroutine should update lastScan via SyncFromDisk
require.Eventually(t, func() bool {
svc.cacheMu.RLock()
defer svc.cacheMu.RUnlock()
return svc.lastScan.After(before)
}, 2*time.Second, 10*time.Millisecond)
}
func TestGetDecryptedPrivateKey_DecryptFails(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
svc := newTestCertServiceWithEnc(t, tmpDir, db)
cert := models.SSLCertificate{
UUID: uuid.New().String(),
Name: "enc-fail",
Domains: "encfail.example.com",
Provider: "custom",
PrivateKeyEncrypted: "corrupted-ciphertext",
}
require.NoError(t, db.Create(&cert).Error)
_, err = svc.GetDecryptedPrivateKey(&cert)
assert.Error(t, err)
}
func TestDeleteCertificate_LetsEncryptProvider_FileCleanup(t *testing.T) {
tmpDir := t.TempDir()
certRoot := filepath.Join(tmpDir, "certificates")
require.NoError(t, os.MkdirAll(certRoot, 0755))
domain := "le-cleanup.example.com"
certFile := filepath.Join(certRoot, domain+".crt")
keyFile := filepath.Join(certRoot, domain+".key")
jsonFile := filepath.Join(certRoot, domain+".json")
certPEM, _ := generateTestCertAndKey(t, domain, time.Now().Add(24*time.Hour))
require.NoError(t, os.WriteFile(certFile, certPEM, 0600))
require.NoError(t, os.WriteFile(keyFile, []byte("key"), 0600))
require.NoError(t, os.WriteFile(jsonFile, []byte("{}"), 0600))
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
cert := models.SSLCertificate{
UUID: certUUID,
Name: domain,
Provider: "letsencrypt",
Domains: domain,
}
require.NoError(t, db.Create(&cert).Error)
svc := newTestCertificateService(tmpDir, db)
require.NoError(t, svc.DeleteCertificate(certUUID))
assert.NoFileExists(t, certFile)
assert.NoFileExists(t, keyFile)
assert.NoFileExists(t, jsonFile)
}
func TestDeleteCertificate_StagingProvider_FileCleanup(t *testing.T) {
tmpDir := t.TempDir()
certRoot := filepath.Join(tmpDir, "certificates")
require.NoError(t, os.MkdirAll(certRoot, 0755))
domain := "le-staging-cleanup.example.com"
certFile := filepath.Join(certRoot, domain+".crt")
certPEM, _ := generateTestCertAndKey(t, domain, time.Now().Add(24*time.Hour))
require.NoError(t, os.WriteFile(certFile, certPEM, 0600))
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
certUUID := uuid.New().String()
cert := models.SSLCertificate{
UUID: certUUID,
Name: domain,
Provider: "letsencrypt-staging",
Domains: domain,
}
require.NoError(t, db.Create(&cert).Error)
svc := newTestCertificateService(tmpDir, db)
require.NoError(t, svc.DeleteCertificate(certUUID))
assert.NoFileExists(t, certFile)
}
func TestCheckExpiringCertificates_DBError(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
db, err := gorm.Open(sqlite.Open(dsn), &gorm.Config{})
require.NoError(t, err)
// deliberately do NOT AutoMigrate SSLCertificate
svc := newTestCertificateService(tmpDir, db)
_, err = svc.CheckExpiringCertificates(30)
assert.Error(t, err)
}


@@ -31,6 +31,14 @@ func newTestCertificateService(dataDir string, db *gorm.DB) *CertificateService
}
}
+ // certDBID looks up the numeric DB primary key for a certificate by UUID.
+ func certDBID(t *testing.T, db *gorm.DB, uuid string) uint {
+ t.Helper()
+ var cert models.SSLCertificate
+ require.NoError(t, db.Where("uuid = ?", uuid).First(&cert).Error)
+ return cert.ID
+ }
func TestNewCertificateService(t *testing.T) {
tmpDir := t.TempDir()
dsn := fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())
@@ -43,7 +51,7 @@ func TestNewCertificateService(t *testing.T) {
require.NoError(t, os.MkdirAll(certDir, 0o750)) // #nosec G301 -- test directory
// Test service creation
- svc := NewCertificateService(tmpDir, db)
+ svc := NewCertificateService(tmpDir, db, nil)
assert.NotNil(t, svc)
assert.Equal(t, tmpDir, svc.dataDir)
assert.Equal(t, db, svc.db)
@@ -54,6 +62,11 @@ func TestNewCertificateService(t *testing.T) {
}
+ func generateTestCert(t *testing.T, domain string, expiry time.Time) []byte {
+ certPEM, _ := generateTestCertAndKey(t, domain, expiry)
+ return certPEM
+ }
func generateTestCertAndKey(t *testing.T, domain string, expiry time.Time) ([]byte, []byte) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
if err != nil {
t.Fatalf("Failed to generate private key: %v", err)
@@ -77,7 +90,9 @@ func generateTestCert(t *testing.T, domain string, expiry time.Time) []byte {
t.Fatalf("Failed to create certificate: %v", err)
}
- return pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: derBytes})
+ certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: derBytes})
+ keyPEM := pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(priv)})
+ return certPEM, keyPEM
}
func TestCertificateService_GetCertificateInfo(t *testing.T) {
@@ -123,7 +138,7 @@ func TestCertificateService_GetCertificateInfo(t *testing.T) {
assert.NoError(t, err)
assert.Len(t, certs, 1)
if len(certs) > 0 {
- assert.Equal(t, domain, certs[0].Domain)
+ assert.Equal(t, domain, certs[0].Domains)
assert.Equal(t, "valid", certs[0].Status)
// Check expiry within a margin
assert.WithinDuration(t, expiry, certs[0].ExpiresAt, time.Second)
@@ -153,7 +168,7 @@ func TestCertificateService_GetCertificateInfo(t *testing.T) {
// Find the expired one
var foundExpired bool
for _, c := range certs {
- if c.Domain == expiredDomain {
+ if c.Domains == expiredDomain {
assert.Equal(t, "expired", c.Status)
foundExpired = true
}
@@ -174,11 +189,10 @@ func TestCertificateService_UploadAndDelete(t *testing.T) {
// Generate Cert
domain := "custom.example.com"
expiry := time.Now().Add(24 * time.Hour)
- certPEM := generateTestCert(t, domain, expiry)
- keyPEM := []byte("FAKE PRIVATE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
// Test Upload
- cert, err := cs.UploadCertificate("My Custom Cert", string(certPEM), string(keyPEM))
+ cert, err := cs.UploadCertificate("My Custom Cert", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
assert.NotNil(t, cert)
assert.Equal(t, "My Custom Cert", cert.Name)
@@ -190,7 +204,7 @@ func TestCertificateService_UploadAndDelete(t *testing.T) {
require.NoError(t, err)
var found bool
for _, c := range certs {
- if c.ID == cert.ID {
+ if c.UUID == cert.UUID {
found = true
assert.Equal(t, "custom", c.Provider)
break
@@ -199,7 +213,7 @@ func TestCertificateService_UploadAndDelete(t *testing.T) {
assert.True(t, found)
// Test Delete
- err = cs.DeleteCertificate(cert.ID)
+ err = cs.DeleteCertificate(cert.UUID)
require.NoError(t, err)
// Verify it's gone
@@ -207,7 +221,7 @@ func TestCertificateService_UploadAndDelete(t *testing.T) {
require.NoError(t, err)
found = false
for _, c := range certs {
- if c.ID == cert.ID {
+ if c.UUID == cert.UUID {
found = true
break
}
@@ -248,7 +262,7 @@ func TestCertificateService_Persistence(t *testing.T) {
// Verify it's in the returned list
var foundInList bool
for _, c := range certs {
- if c.Domain == domain {
+ if c.Domains == domain {
foundInList = true
assert.Equal(t, "letsencrypt", c.Provider)
break
@@ -264,7 +278,7 @@ func TestCertificateService_Persistence(t *testing.T) {
assert.Equal(t, string(certPEM), dbCert.Certificate)
// 4. Delete the certificate via Service (which should delete the file)
- err = cs.DeleteCertificate(dbCert.ID)
+ err = cs.DeleteCertificate(dbCert.UUID)
require.NoError(t, err)
// Verify file is gone
@@ -278,7 +292,7 @@ func TestCertificateService_Persistence(t *testing.T) {
// Verify it's NOT in the returned list
foundInList = false
for _, c := range certs {
- if c.Domain == domain {
+ if c.Domains == domain {
foundInList = true
break
}
@@ -301,14 +315,14 @@ func TestCertificateService_UploadCertificate_Errors(t *testing.T) {
cs := newTestCertificateService(tmpDir, db)
t.Run("invalid PEM format", func(t *testing.T) {
- cert, err := cs.UploadCertificate("Invalid", "not-a-valid-pem", "also-not-valid")
+ cert, err := cs.UploadCertificate("Invalid", "not-a-valid-pem", "also-not-valid", "")
assert.Error(t, err)
assert.Nil(t, cert)
- assert.Contains(t, err.Error(), "invalid certificate PEM")
+ assert.Contains(t, err.Error(), "unrecognized certificate format")
})
t.Run("empty certificate", func(t *testing.T) {
- cert, err := cs.UploadCertificate("Empty", "", "some-key")
+ cert, err := cs.UploadCertificate("Empty", "", "some-key", "")
assert.Error(t, err)
assert.Nil(t, cert)
})
@@ -318,19 +332,18 @@ func TestCertificateService_UploadCertificate_Errors(t *testing.T) {
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("No Key", string(certPEM), "")
+ cert, err := cs.UploadCertificate("No Key", string(certPEM), "", "")
assert.NoError(t, err) // Uploading without key is allowed
assert.NotNil(t, cert)
- assert.Equal(t, "", cert.PrivateKey)
+ assert.False(t, cert.HasKey)
})
t.Run("valid certificate with name", func(t *testing.T) {
domain := "valid.com"
expiry := time.Now().Add(24 * time.Hour)
- certPEM := generateTestCert(t, domain, expiry)
- keyPEM := []byte("FAKE PRIVATE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
- cert, err := cs.UploadCertificate("Valid Cert", string(certPEM), string(keyPEM))
+ cert, err := cs.UploadCertificate("Valid Cert", string(certPEM), string(keyPEM), "")
assert.NoError(t, err)
assert.NotNil(t, cert)
assert.Equal(t, "Valid Cert", cert.Name)
@@ -341,10 +354,9 @@ func TestCertificateService_UploadCertificate_Errors(t *testing.T) {
t.Run("expired certificate can be uploaded", func(t *testing.T) {
domain := "expired-upload.com"
expiry := time.Now().Add(-24 * time.Hour) // Already expired
- certPEM := generateTestCert(t, domain, expiry)
- keyPEM := []byte("FAKE PRIVATE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
- cert, err := cs.UploadCertificate("Expired Upload", string(certPEM), string(keyPEM))
+ cert, err := cs.UploadCertificate("Expired Upload", string(certPEM), string(keyPEM), "")
// Should still upload successfully, but status will be expired
assert.NoError(t, err)
assert.NotNil(t, cert)
@@ -430,7 +442,7 @@ func TestCertificateService_ListCertificates_EdgeCases(t *testing.T) {
domain2 := "custom.example.com"
expiry2 := time.Now().Add(48 * time.Hour)
certPEM2 := generateTestCert(t, domain2, expiry2)
- _, err = cs.UploadCertificate("Custom", string(certPEM2), "FAKE KEY")
+ _, err = cs.UploadCertificate("Custom", string(certPEM2), "", "")
require.NoError(t, err)
certs, err := cs.ListCertificates()
@@ -457,20 +469,22 @@ func TestCertificateService_DeleteCertificate_Errors(t *testing.T) {
cs := newTestCertificateService(tmpDir, db)
t.Run("delete non-existent certificate", func(t *testing.T) {
// IsCertificateInUse will succeed (not in use), then First will fail
- err := cs.DeleteCertificate(99999)
+ // DeleteCertificate takes UUID string; non-existent UUID returns error
+ err := cs.DeleteCertificate("non-existent-uuid")
assert.Error(t, err)
- assert.Equal(t, gorm.ErrRecordNotFound, err)
})
t.Run("delete certificate in use returns ErrCertInUse", func(t *testing.T) {
// Create certificate
domain := "in-use.com"
expiry := time.Now().Add(24 * time.Hour)
- certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("In Use", string(certPEM), "FAKE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
+ cert, err := cs.UploadCertificate("In Use", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
+ // Look up numeric ID for FK
+ dbID := certDBID(t, db, cert.UUID)
// Create proxy host using this certificate
ph := models.ProxyHost{
UUID: "test-ph",
@@ -478,18 +492,18 @@ func TestCertificateService_DeleteCertificate_Errors(t *testing.T) {
DomainNames: "in-use.com",
ForwardHost: "localhost",
ForwardPort: 8080,
- CertificateID: &cert.ID,
+ CertificateID: &dbID,
}
require.NoError(t, db.Create(&ph).Error)
// Attempt to delete certificate - should fail with ErrCertInUse
- err = cs.DeleteCertificate(cert.ID)
+ err = cs.DeleteCertificate(cert.UUID)
assert.Error(t, err)
assert.Equal(t, ErrCertInUse, err)
// Verify certificate still exists
var dbCert models.SSLCertificate
- err = db.First(&dbCert, "id = ?", cert.ID).Error
+ err = db.First(&dbCert, "id = ?", dbID).Error
assert.NoError(t, err)
})
@@ -497,21 +511,24 @@ func TestCertificateService_DeleteCertificate_Errors(t *testing.T) {
// Create and upload cert
domain := "to-delete.com"
expiry := time.Now().Add(24 * time.Hour)
- certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("To Delete", string(certPEM), "FAKE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
+ cert, err := cs.UploadCertificate("To Delete", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
+ // Look up numeric ID for verification
+ dbID := certDBID(t, db, cert.UUID)
// Manually remove the file (custom certs stored by numeric ID)
certPath := filepath.Join(tmpDir, "certificates", "custom", "cert.crt")
_ = os.Remove(certPath)
// Delete should still work (DB cleanup)
- err = cs.DeleteCertificate(cert.ID)
+ err = cs.DeleteCertificate(cert.UUID)
assert.NoError(t, err)
// Verify DB record is gone
var dbCert models.SSLCertificate
- err = db.First(&dbCert, "id = ?", cert.ID).Error
+ err = db.First(&dbCert, "id = ?", dbID).Error
assert.Error(t, err)
})
}
@@ -781,9 +798,8 @@ func TestCertificateService_CertificateWithSANs(t *testing.T) {
domain := "san.example.com"
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCertWithSANs(t, domain, []string{"san.example.com", "www.san.example.com", "api.san.example.com"}, expiry)
- keyPEM := []byte("FAKE PRIVATE KEY")
- cert, err := cs.UploadCertificate("SAN Cert", string(certPEM), string(keyPEM))
+ cert, err := cs.UploadCertificate("SAN Cert", string(certPEM), "", "")
require.NoError(t, err)
assert.NotNil(t, cert)
// Should have joined SANs
@@ -807,10 +823,11 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
domain := "unused.com"
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("Unused", string(certPEM), "FAKE KEY")
+ cert, err := cs.UploadCertificate("Unused", string(certPEM), "", "")
require.NoError(t, err)
- inUse, err := cs.IsCertificateInUse(cert.ID)
+ dbID := certDBID(t, db, cert.UUID)
+ inUse, err := cs.IsCertificateInUse(dbID)
assert.NoError(t, err)
assert.False(t, inUse)
})
@@ -820,9 +837,11 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
domain := "used.com"
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("Used", string(certPEM), "FAKE KEY")
+ cert, err := cs.UploadCertificate("Used", string(certPEM), "", "")
require.NoError(t, err)
+ dbID := certDBID(t, db, cert.UUID)
// Create proxy host using this certificate
ph := models.ProxyHost{
UUID: "ph-1",
@@ -830,11 +849,11 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
DomainNames: "used.com",
ForwardHost: "localhost",
ForwardPort: 8080,
- CertificateID: &cert.ID,
+ CertificateID: &dbID,
}
require.NoError(t, db.Create(&ph).Error)
- inUse, err := cs.IsCertificateInUse(cert.ID)
+ inUse, err := cs.IsCertificateInUse(dbID)
assert.NoError(t, err)
assert.True(t, inUse)
})
@@ -844,9 +863,11 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
domain := "shared.com"
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("Shared", string(certPEM), "FAKE KEY")
+ cert, err := cs.UploadCertificate("Shared", string(certPEM), "", "")
require.NoError(t, err)
+ dbID := certDBID(t, db, cert.UUID)
// Create multiple proxy hosts using this certificate
for i := 1; i <= 3; i++ {
ph := models.ProxyHost{
@@ -855,12 +876,12 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
DomainNames: fmt.Sprintf("host%d.shared.com", i),
ForwardHost: "localhost",
ForwardPort: 8080 + i,
- CertificateID: &cert.ID,
+ CertificateID: &dbID,
}
require.NoError(t, db.Create(&ph).Error)
}
- inUse, err := cs.IsCertificateInUse(cert.ID)
+ inUse, err := cs.IsCertificateInUse(dbID)
assert.NoError(t, err)
assert.True(t, inUse)
})
@@ -876,9 +897,11 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
domain := "freed.com"
expiry := time.Now().Add(24 * time.Hour)
certPEM := generateTestCert(t, domain, expiry)
- cert, err := cs.UploadCertificate("Freed", string(certPEM), "FAKE KEY")
+ cert, err := cs.UploadCertificate("Freed", string(certPEM), "", "")
require.NoError(t, err)
+ dbID := certDBID(t, db, cert.UUID)
// Create proxy host using this certificate
ph := models.ProxyHost{
UUID: "ph-freed",
@@ -886,12 +909,12 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
DomainNames: "freed.com",
ForwardHost: "localhost",
ForwardPort: 8080,
- CertificateID: &cert.ID,
+ CertificateID: &dbID,
}
require.NoError(t, db.Create(&ph).Error)
// Verify in use
- inUse, err := cs.IsCertificateInUse(cert.ID)
+ inUse, err := cs.IsCertificateInUse(dbID)
assert.NoError(t, err)
assert.True(t, inUse)
@@ -899,12 +922,12 @@ func TestCertificateService_IsCertificateInUse(t *testing.T) {
require.NoError(t, db.Delete(&ph).Error)
// Verify no longer in use
- inUse, err = cs.IsCertificateInUse(cert.ID)
+ inUse, err = cs.IsCertificateInUse(dbID)
assert.NoError(t, err)
assert.False(t, inUse)
// Now deletion should succeed
- err = cs.DeleteCertificate(cert.ID)
+ err = cs.DeleteCertificate(cert.UUID)
assert.NoError(t, err)
})
}
@@ -922,10 +945,9 @@ func TestCertificateService_CacheBehavior(t *testing.T) {
// Create a cert
domain := "cache.example.com"
expiry := time.Now().Add(24 * time.Hour)
- certPEM := generateTestCert(t, domain, expiry)
- keyPEM := []byte("FAKE PRIVATE KEY")
+ certPEM, keyPEM := generateTestCertAndKey(t, domain, expiry)
- cert, err := cs.UploadCertificate("Cache Test", string(certPEM), string(keyPEM))
+ cert, err := cs.UploadCertificate("Cache Test", string(certPEM), string(keyPEM), "")
require.NoError(t, err)
require.NotNil(t, cert)
@@ -940,7 +962,7 @@ func TestCertificateService_CacheBehavior(t *testing.T) {
require.Len(t, certs2, 1)
// Both should return the same cert
- assert.Equal(t, certs1[0].ID, certs2[0].ID)
+ assert.Equal(t, certs1[0].UUID, certs2[0].UUID)
})
t.Run("invalidate cache forces resync", func(t *testing.T) {
@@ -954,7 +976,7 @@ func TestCertificateService_CacheBehavior(t *testing.T) {
// Create a cert via upload (auto-invalidates)
certPEM := generateTestCert(t, "invalidate.example.com", time.Now().Add(24*time.Hour))
- _, err = cs.UploadCertificate("Invalidate Test", string(certPEM), "")
+ _, err = cs.UploadCertificate("Invalidate Test", string(certPEM), "", "")
require.NoError(t, err)
// Get list (should have 1)
@@ -1012,7 +1034,7 @@ func TestCertificateService_CacheBehavior(t *testing.T) {
certs, err := cs.ListCertificates()
require.NoError(t, err)
require.Len(t, certs, 1)
assert.Equal(t, "db.example.com", certs[0].Domain)
assert.Equal(t, "db.example.com", certs[0].Domains)
})
}
@@ -1032,7 +1054,7 @@ AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
-----END CERTIFICATE-----`
cert, err := cs.UploadCertificate("Corrupted", corruptedPEM, "")
cert, err := cs.UploadCertificate("Corrupted", corruptedPEM, "", "")
assert.Error(t, err)
assert.Nil(t, cert)
assert.Contains(t, err.Error(), "failed to parse certificate")
@@ -1047,7 +1069,7 @@ A7qVvdqxevEuUkW4K+2KdMXmnQbG9Aa7k7eBjK1S+0LYmVjPKlJGNXHDGuy5Fw/d
hI6GH4twrbDJCR2Bwy/XWXgqgGRzAgMBAAECgYBYWVtLze8R+KrZdHj0hLjZEPnl
-----END PRIVATE KEY-----`
cert, err := cs.UploadCertificate("Wrong Type", wrongTypePEM, "")
cert, err := cs.UploadCertificate("Wrong Type", wrongTypePEM, "", "")
assert.Error(t, err)
assert.Nil(t, cert)
assert.Contains(t, err.Error(), "failed to parse certificate")
@@ -1070,7 +1092,7 @@ hI6GH4twrbDJCR2Bwy/XWXgqgGRzAgMBAAECgYBYWVtLze8R+KrZdHj0hLjZEPnl
require.NoError(t, err)
certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: derBytes})
cert, err := cs.UploadCertificate("Empty Subject", string(certPEM), "")
cert, err := cs.UploadCertificate("Empty Subject", string(certPEM), "", "")
assert.NoError(t, err) // Upload succeeds
assert.NotNil(t, cert)
assert.Equal(t, "", cert.Domains) // Empty domains field
@@ -1165,7 +1187,7 @@ func TestCertificateService_SyncFromDisk_ErrorHandling(t *testing.T) {
certs, err := cs.ListCertificates()
assert.NoError(t, err)
assert.Len(t, certs, 1)
assert.Equal(t, validDomain, certs[0].Domain)
assert.Equal(t, validDomain, certs[0].Domains)
})
}
@@ -1233,7 +1255,7 @@ func TestCertificateService_RefreshCacheFromDB_EdgeCases(t *testing.T) {
require.Len(t, certs, 1)
// Should use proxy host name
assert.Equal(t, "Matched Proxy", certs[0].Name)
assert.Contains(t, certs[0].Domain, "www.example.com")
assert.Contains(t, certs[0].Domains, "www.example.com")
})
}

View File

@@ -0,0 +1,524 @@
package services
import (
"crypto"
"crypto/ecdsa"
"crypto/ed25519"
"crypto/elliptic"
"crypto/rsa"
"crypto/sha256"
"crypto/x509"
"encoding/hex"
"encoding/pem"
"fmt"
"math/big"
"strings"
"time"
"software.sslmate.com/src/go-pkcs12"
)
// CertFormat represents a certificate file format.
type CertFormat string
const (
FormatPEM CertFormat = "pem"
FormatDER CertFormat = "der"
FormatPFX CertFormat = "pfx"
FormatUnknown CertFormat = "unknown"
)
// ParsedCertificate contains the parsed result of certificate input.
type ParsedCertificate struct {
Leaf *x509.Certificate
Intermediates []*x509.Certificate
PrivateKey crypto.PrivateKey
CertPEM string
KeyPEM string
ChainPEM string
Format CertFormat
}
// CertificateMetadata contains extracted metadata from an x509 certificate.
type CertificateMetadata struct {
CommonName string
Domains []string
Fingerprint string
SerialNumber string
IssuerOrg string
KeyType string
NotBefore time.Time
NotAfter time.Time
}
// ValidationResult contains the result of a certificate validation.
type ValidationResult struct {
Valid bool `json:"valid"`
CommonName string `json:"common_name"`
Domains []string `json:"domains"`
IssuerOrg string `json:"issuer_org"`
ExpiresAt time.Time `json:"expires_at"`
KeyMatch bool `json:"key_match"`
ChainValid bool `json:"chain_valid"`
ChainDepth int `json:"chain_depth"`
Warnings []string `json:"warnings"`
Errors []string `json:"errors"`
}
// DetectFormat determines the certificate format from raw file content.
// Uses trial-parse strategy: PEM → PFX → DER.
func DetectFormat(data []byte) CertFormat {
block, _ := pem.Decode(data)
if block != nil {
return FormatPEM
}
if _, _, _, err := pkcs12.DecodeChain(data, ""); err == nil {
return FormatPFX
}
// Decoding as PFX with an empty password failed, but the data may still be a
// password-protected PFX. If it starts with the ASN.1 SEQUENCE tag (0x30),
// distinguish DER from PFX by attempting a DER parse.
if len(data) > 2 && data[0] == 0x30 {
// Could be DER or PFX; try DER parse
if _, err := x509.ParseCertificate(data); err == nil {
return FormatDER
}
// If DER parse fails, it's likely PFX
return FormatPFX
}
if _, err := x509.ParseCertificate(data); err == nil {
return FormatDER
}
return FormatUnknown
}
// ParseCertificateInput handles PEM, PFX, and DER input parsing.
func ParseCertificateInput(certData []byte, keyData []byte, chainData []byte, pfxPassword string) (*ParsedCertificate, error) {
if len(certData) == 0 {
return nil, fmt.Errorf("certificate data is empty")
}
format := DetectFormat(certData)
switch format {
case FormatPEM:
return parsePEMInput(certData, keyData, chainData)
case FormatPFX:
return parsePFXInput(certData, pfxPassword)
case FormatDER:
return parseDERInput(certData, keyData)
default:
return nil, fmt.Errorf("unrecognized certificate format")
}
}
func parsePEMInput(certData []byte, keyData []byte, chainData []byte) (*ParsedCertificate, error) {
result := &ParsedCertificate{Format: FormatPEM}
// Parse leaf certificate
certs, err := parsePEMCertificates(certData)
if err != nil {
return nil, fmt.Errorf("failed to parse certificate PEM: %w", err)
}
if len(certs) == 0 {
return nil, fmt.Errorf("no certificates found in PEM data")
}
result.Leaf = certs[0]
result.CertPEM = string(certData)
// If certData contains multiple certs, treat extras as intermediates
if len(certs) > 1 {
result.Intermediates = certs[1:]
}
// Parse chain file if provided
if len(chainData) > 0 {
chainCerts, err := parsePEMCertificates(chainData)
if err != nil {
return nil, fmt.Errorf("failed to parse chain PEM: %w", err)
}
result.Intermediates = append(result.Intermediates, chainCerts...)
result.ChainPEM = string(chainData)
}
// Build chain PEM from intermediates if not set from chain file
if result.ChainPEM == "" && len(result.Intermediates) > 0 {
var chainBuilder strings.Builder
for _, ic := range result.Intermediates {
if err := pem.Encode(&chainBuilder, &pem.Block{Type: "CERTIFICATE", Bytes: ic.Raw}); err != nil {
return nil, fmt.Errorf("failed to encode intermediate certificate: %w", err)
}
}
result.ChainPEM = chainBuilder.String()
}
// Parse private key
if len(keyData) > 0 {
key, err := parsePEMPrivateKey(keyData)
if err != nil {
return nil, fmt.Errorf("failed to parse private key PEM: %w", err)
}
result.PrivateKey = key
result.KeyPEM = string(keyData)
}
return result, nil
}
func parsePFXInput(pfxData []byte, password string) (*ParsedCertificate, error) {
privateKey, leaf, caCerts, err := pkcs12.DecodeChain(pfxData, password)
if err != nil {
return nil, fmt.Errorf("failed to decode PFX/PKCS12: %w", err)
}
result := &ParsedCertificate{
Format: FormatPFX,
Leaf: leaf,
Intermediates: caCerts,
PrivateKey: privateKey,
}
// Convert to PEM for storage
result.CertPEM = encodeCertToPEM(leaf)
if len(caCerts) > 0 {
var chainBuilder strings.Builder
for _, ca := range caCerts {
chainBuilder.WriteString(encodeCertToPEM(ca))
}
result.ChainPEM = chainBuilder.String()
}
keyPEM, err := encodeKeyToPEM(privateKey)
if err != nil {
return nil, fmt.Errorf("failed to encode private key to PEM: %w", err)
}
result.KeyPEM = keyPEM
return result, nil
}
func parseDERInput(certData []byte, keyData []byte) (*ParsedCertificate, error) {
cert, err := x509.ParseCertificate(certData)
if err != nil {
return nil, fmt.Errorf("failed to parse DER certificate: %w", err)
}
result := &ParsedCertificate{
Format: FormatDER,
Leaf: cert,
CertPEM: encodeCertToPEM(cert),
}
if len(keyData) > 0 {
key, err := parsePEMPrivateKey(keyData)
if err != nil {
// Try DER key
key, err = x509.ParsePKCS8PrivateKey(keyData)
if err != nil {
key2, err2 := x509.ParseECPrivateKey(keyData)
if err2 != nil {
return nil, fmt.Errorf("failed to parse private key: %w", err)
}
key = key2
}
}
result.PrivateKey = key
keyPEM, err := encodeKeyToPEM(key)
if err != nil {
return nil, fmt.Errorf("failed to encode private key to PEM: %w", err)
}
result.KeyPEM = keyPEM
}
return result, nil
}
// ValidateKeyMatch checks that the private key matches the certificate public key.
func ValidateKeyMatch(cert *x509.Certificate, key crypto.PrivateKey) error {
if cert == nil {
return fmt.Errorf("certificate is nil")
}
if key == nil {
return fmt.Errorf("private key is nil")
}
switch pub := cert.PublicKey.(type) {
case *rsa.PublicKey:
privKey, ok := key.(*rsa.PrivateKey)
if !ok {
return fmt.Errorf("key type mismatch: certificate has RSA public key but private key is not RSA")
}
if pub.N.Cmp(privKey.N) != 0 {
return fmt.Errorf("RSA key mismatch: certificate and private key modulus differ")
}
case *ecdsa.PublicKey:
privKey, ok := key.(*ecdsa.PrivateKey)
if !ok {
return fmt.Errorf("key type mismatch: certificate has ECDSA public key but private key is not ECDSA")
}
if pub.X.Cmp(privKey.X) != 0 || pub.Y.Cmp(privKey.Y) != 0 {
return fmt.Errorf("ECDSA key mismatch: certificate and private key points differ")
}
case ed25519.PublicKey:
privKey, ok := key.(ed25519.PrivateKey)
if !ok {
return fmt.Errorf("key type mismatch: certificate has Ed25519 public key but private key is not Ed25519")
}
pubFromPriv := privKey.Public().(ed25519.PublicKey)
if !pub.Equal(pubFromPriv) {
return fmt.Errorf("Ed25519 key mismatch: certificate and private key differ")
}
default:
return fmt.Errorf("unsupported public key type: %T", cert.PublicKey)
}
return nil
}
// ValidateChain verifies the certificate chain from leaf to root.
func ValidateChain(leaf *x509.Certificate, intermediates []*x509.Certificate) error {
if leaf == nil {
return fmt.Errorf("leaf certificate is nil")
}
pool := x509.NewCertPool()
for _, ic := range intermediates {
pool.AddCert(ic)
}
opts := x509.VerifyOptions{
Intermediates: pool,
CurrentTime: time.Now(),
KeyUsages: []x509.ExtKeyUsage{x509.ExtKeyUsageAny},
}
if _, err := leaf.Verify(opts); err != nil {
return fmt.Errorf("chain verification failed: %w", err)
}
return nil
}
// ConvertDERToPEM converts DER-encoded certificate to PEM.
func ConvertDERToPEM(derData []byte) (string, error) {
cert, err := x509.ParseCertificate(derData)
if err != nil {
return "", fmt.Errorf("invalid DER data: %w", err)
}
return encodeCertToPEM(cert), nil
}
// ConvertPFXToPEM extracts cert, key, and chain from PFX/PKCS12.
func ConvertPFXToPEM(pfxData []byte, password string) (certPEM string, keyPEM string, chainPEM string, err error) {
privateKey, leaf, caCerts, err := pkcs12.DecodeChain(pfxData, password)
if err != nil {
return "", "", "", fmt.Errorf("failed to decode PFX: %w", err)
}
certPEM = encodeCertToPEM(leaf)
keyPEM, err = encodeKeyToPEM(privateKey)
if err != nil {
return "", "", "", fmt.Errorf("failed to encode key: %w", err)
}
if len(caCerts) > 0 {
var builder strings.Builder
for _, ca := range caCerts {
builder.WriteString(encodeCertToPEM(ca))
}
chainPEM = builder.String()
}
return certPEM, keyPEM, chainPEM, nil
}
// ConvertPEMToPFX bundles cert, key, chain into PFX.
func ConvertPEMToPFX(certPEM string, keyPEM string, chainPEM string, password string) ([]byte, error) {
certs, err := parsePEMCertificates([]byte(certPEM))
if err != nil {
return nil, fmt.Errorf("failed to parse cert PEM: %w", err)
}
if len(certs) == 0 {
return nil, fmt.Errorf("no certificates found in cert PEM")
}
key, err := parsePEMPrivateKey([]byte(keyPEM))
if err != nil {
return nil, fmt.Errorf("failed to parse key PEM: %w", err)
}
var caCerts []*x509.Certificate
if chainPEM != "" {
caCerts, err = parsePEMCertificates([]byte(chainPEM))
if err != nil {
return nil, fmt.Errorf("failed to parse chain PEM: %w", err)
}
}
pfxData, err := pkcs12.Modern.Encode(key, certs[0], caCerts, password)
if err != nil {
return nil, fmt.Errorf("failed to encode PFX: %w", err)
}
return pfxData, nil
}
// ConvertPEMToDER converts PEM certificate to DER.
func ConvertPEMToDER(certPEM string) ([]byte, error) {
block, _ := pem.Decode([]byte(certPEM))
if block == nil {
return nil, fmt.Errorf("failed to decode PEM")
}
// Verify it's a valid certificate
if _, err := x509.ParseCertificate(block.Bytes); err != nil {
return nil, fmt.Errorf("invalid certificate PEM: %w", err)
}
return block.Bytes, nil
}
// ExtractCertificateMetadata extracts fingerprint, serial, issuer, key type, etc.
func ExtractCertificateMetadata(cert *x509.Certificate) *CertificateMetadata {
if cert == nil {
return nil
}
fingerprint := sha256.Sum256(cert.Raw)
fpHex := formatFingerprint(hex.EncodeToString(fingerprint[:]))
serial := formatSerial(cert.SerialNumber)
issuerOrg := ""
if len(cert.Issuer.Organization) > 0 {
issuerOrg = cert.Issuer.Organization[0]
}
domains := make([]string, 0, len(cert.DNSNames)+1)
if cert.Subject.CommonName != "" {
domains = append(domains, cert.Subject.CommonName)
}
for _, san := range cert.DNSNames {
if san != cert.Subject.CommonName {
domains = append(domains, san)
}
}
return &CertificateMetadata{
CommonName: cert.Subject.CommonName,
Domains: domains,
Fingerprint: fpHex,
SerialNumber: serial,
IssuerOrg: issuerOrg,
KeyType: detectKeyType(cert),
NotBefore: cert.NotBefore,
NotAfter: cert.NotAfter,
}
}
// --- helpers ---
func parsePEMCertificates(data []byte) ([]*x509.Certificate, error) {
var certs []*x509.Certificate
rest := data
for {
var block *pem.Block
block, rest = pem.Decode(rest)
if block == nil {
break
}
if block.Type != "CERTIFICATE" {
continue
}
cert, err := x509.ParseCertificate(block.Bytes)
if err != nil {
return nil, fmt.Errorf("failed to parse certificate: %w", err)
}
certs = append(certs, cert)
}
return certs, nil
}
func parsePEMPrivateKey(data []byte) (crypto.PrivateKey, error) {
block, _ := pem.Decode(data)
if block == nil {
return nil, fmt.Errorf("no PEM data found")
}
// Try PKCS8 first (handles RSA, ECDSA, Ed25519)
if key, err := x509.ParsePKCS8PrivateKey(block.Bytes); err == nil {
return key, nil
}
// Try PKCS1 RSA
if key, err := x509.ParsePKCS1PrivateKey(block.Bytes); err == nil {
return key, nil
}
// Try EC
if key, err := x509.ParseECPrivateKey(block.Bytes); err == nil {
return key, nil
}
return nil, fmt.Errorf("unsupported private key format")
}
func encodeCertToPEM(cert *x509.Certificate) string {
return string(pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: cert.Raw}))
}
func encodeKeyToPEM(key crypto.PrivateKey) (string, error) {
der, err := x509.MarshalPKCS8PrivateKey(key)
if err != nil {
return "", fmt.Errorf("failed to marshal private key: %w", err)
}
return string(pem.EncodeToMemory(&pem.Block{Type: "PRIVATE KEY", Bytes: der})), nil
}
// formatFingerprint groups a hex digest into colon-separated uppercase byte pairs.
func formatFingerprint(hexStr string) string {
var parts []string
for i := 0; i < len(hexStr); i += 2 {
end := i + 2
if end > len(hexStr) {
end = len(hexStr)
}
parts = append(parts, strings.ToUpper(hexStr[i:end]))
}
return strings.Join(parts, ":")
}
func formatSerial(n *big.Int) string {
if n == nil {
return ""
}
b := n.Bytes()
parts := make([]string, len(b))
for i, v := range b {
parts[i] = fmt.Sprintf("%02X", v)
}
return strings.Join(parts, ":")
}
func detectKeyType(cert *x509.Certificate) string {
switch pub := cert.PublicKey.(type) {
case *rsa.PublicKey:
bits := pub.N.BitLen()
return fmt.Sprintf("RSA-%d", bits)
case *ecdsa.PublicKey:
switch pub.Curve {
case elliptic.P256():
return "ECDSA-P256"
case elliptic.P384():
return "ECDSA-P384"
default:
return "ECDSA"
}
case ed25519.PublicKey:
return "Ed25519"
default:
return "Unknown"
}
}

View File

@@ -0,0 +1,324 @@
package services
import (
"crypto/ecdsa"
"crypto/elliptic"
"crypto/rand"
"crypto/rsa"
"crypto/x509"
"crypto/x509/pkix"
"encoding/pem"
"math/big"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"software.sslmate.com/src/go-pkcs12"
)
// --- parsePFXInput ---
func TestParsePFXInput(t *testing.T) {
cert, priv, _, _ := makeRSACertAndKey(t, "pfx.test", time.Now().Add(time.Hour))
pfxData, err := pkcs12.Modern.Encode(priv, cert, nil, pkcs12.DefaultPassword)
require.NoError(t, err)
t.Run("valid PFX", func(t *testing.T) {
parsed, err := parsePFXInput(pfxData, pkcs12.DefaultPassword)
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.NotNil(t, parsed.PrivateKey)
assert.Equal(t, FormatPFX, parsed.Format)
assert.Contains(t, parsed.CertPEM, "BEGIN CERTIFICATE")
assert.Contains(t, parsed.KeyPEM, "PRIVATE KEY")
})
t.Run("PFX with chain", func(t *testing.T) {
caKey, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
caTmpl := &x509.Certificate{
SerialNumber: big.NewInt(100),
Subject: pkix.Name{CommonName: "Test CA"},
NotBefore: time.Now(),
NotAfter: time.Now().Add(24 * time.Hour),
IsCA: true,
BasicConstraintsValid: true,
KeyUsage: x509.KeyUsageCertSign,
}
caDER, err := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
require.NoError(t, err)
caCert, err := x509.ParseCertificate(caDER)
require.NoError(t, err)
pfxWithChain, err := pkcs12.Modern.Encode(priv, cert, []*x509.Certificate{caCert}, pkcs12.DefaultPassword)
require.NoError(t, err)
parsed, err := parsePFXInput(pfxWithChain, pkcs12.DefaultPassword)
require.NoError(t, err)
assert.NotEmpty(t, parsed.ChainPEM)
assert.Contains(t, parsed.ChainPEM, "BEGIN CERTIFICATE")
})
t.Run("invalid PFX data", func(t *testing.T) {
_, err := parsePFXInput([]byte("not-pfx"), "password")
assert.Error(t, err)
assert.Contains(t, err.Error(), "PFX")
})
t.Run("wrong password", func(t *testing.T) {
_, err := parsePFXInput(pfxData, "wrong-password")
assert.Error(t, err)
})
}
// --- parseDERInput ---
func TestParseDERInput(t *testing.T) {
cert, priv, _, keyPEM := makeRSACertAndKey(t, "der.test", time.Now().Add(time.Hour))
t.Run("DER cert only", func(t *testing.T) {
parsed, err := parseDERInput(cert.Raw, nil)
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.Equal(t, FormatDER, parsed.Format)
assert.Contains(t, parsed.CertPEM, "BEGIN CERTIFICATE")
assert.Nil(t, parsed.PrivateKey)
})
t.Run("DER cert with PEM key", func(t *testing.T) {
parsed, err := parseDERInput(cert.Raw, keyPEM)
require.NoError(t, err)
assert.NotNil(t, parsed.PrivateKey)
assert.Contains(t, parsed.KeyPEM, "PRIVATE KEY")
})
t.Run("DER cert with DER PKCS8 key", func(t *testing.T) {
derKey, err := x509.MarshalPKCS8PrivateKey(priv)
require.NoError(t, err)
parsed, err := parseDERInput(cert.Raw, derKey)
require.NoError(t, err)
assert.NotNil(t, parsed.PrivateKey)
})
t.Run("DER cert with DER EC key", func(t *testing.T) {
ecCert, ecPriv, _, _ := makeECDSACertAndKey(t, "ec-der.test")
ecDERKey, err := x509.MarshalECPrivateKey(ecPriv)
require.NoError(t, err)
parsed, err := parseDERInput(ecCert.Raw, ecDERKey)
require.NoError(t, err)
assert.NotNil(t, parsed.PrivateKey)
})
t.Run("DER cert with invalid key", func(t *testing.T) {
_, err := parseDERInput(cert.Raw, []byte("bad-key-data"))
assert.Error(t, err)
assert.Contains(t, err.Error(), "private key")
})
t.Run("invalid DER cert data", func(t *testing.T) {
_, err := parseDERInput([]byte("not-der"), nil)
assert.Error(t, err)
assert.Contains(t, err.Error(), "DER certificate")
})
}
// --- parsePEMInput chain building ---
func TestParsePEMInput_ChainBuilding(t *testing.T) {
t.Run("cert with intermediates in cert data", func(t *testing.T) {
_, _, certPEM1, _ := makeRSACertAndKey(t, "leaf.test", time.Now().Add(time.Hour))
_, _, certPEM2, _ := makeRSACertAndKey(t, "intermediate.test", time.Now().Add(time.Hour))
combined := append(certPEM1, certPEM2...)
parsed, err := parsePEMInput(combined, nil, nil)
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.Len(t, parsed.Intermediates, 1)
assert.NotEmpty(t, parsed.ChainPEM)
assert.Contains(t, parsed.ChainPEM, "BEGIN CERTIFICATE")
})
t.Run("cert with chain file", func(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "leaf.test", time.Now().Add(time.Hour))
_, _, chainPEM, _ := makeRSACertAndKey(t, "chain.test", time.Now().Add(time.Hour))
parsed, err := parsePEMInput(certPEM, keyPEM, chainPEM)
require.NoError(t, err)
assert.NotNil(t, parsed.PrivateKey)
assert.Len(t, parsed.Intermediates, 1)
assert.Equal(t, string(chainPEM), parsed.ChainPEM)
})
t.Run("invalid chain data ignored", func(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "leaf.test", time.Now().Add(time.Hour))
parsed, err := parsePEMInput(certPEM, nil, []byte("not-pem"))
require.NoError(t, err)
assert.Empty(t, parsed.Intermediates, "invalid PEM chain should be silently ignored")
})
t.Run("invalid cert data", func(t *testing.T) {
_, err := parsePEMInput([]byte("not-pem"), nil, nil)
assert.Error(t, err)
})
t.Run("empty PEM block", func(t *testing.T) {
emptyPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: []byte("garbage")})
_, err := parsePEMInput(emptyPEM, nil, nil)
assert.Error(t, err)
})
}
// --- ConvertPFXToPEM ---
func TestConvertPFXToPEM(t *testing.T) {
cert, priv, _, _ := makeRSACertAndKey(t, "pfx-convert.test", time.Now().Add(time.Hour))
pfxData, err := pkcs12.Modern.Encode(priv, cert, nil, pkcs12.DefaultPassword)
require.NoError(t, err)
t.Run("valid PFX", func(t *testing.T) {
certPEM, keyPEM, chainPEM, err := ConvertPFXToPEM(pfxData, pkcs12.DefaultPassword)
require.NoError(t, err)
assert.Contains(t, certPEM, "BEGIN CERTIFICATE")
assert.Contains(t, keyPEM, "PRIVATE KEY")
assert.Empty(t, chainPEM)
})
t.Run("PFX with chain", func(t *testing.T) {
caKey, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
caTmpl := &x509.Certificate{
SerialNumber: big.NewInt(200),
Subject: pkix.Name{CommonName: "PFX Test CA"},
NotBefore: time.Now(),
NotAfter: time.Now().Add(24 * time.Hour),
IsCA: true,
BasicConstraintsValid: true,
KeyUsage: x509.KeyUsageCertSign,
}
caDER, err := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
require.NoError(t, err)
caCert, err := x509.ParseCertificate(caDER)
require.NoError(t, err)
pfxWithChain, err := pkcs12.Modern.Encode(priv, cert, []*x509.Certificate{caCert}, pkcs12.DefaultPassword)
require.NoError(t, err)
certPEM, keyPEM, chainPEM, err := ConvertPFXToPEM(pfxWithChain, pkcs12.DefaultPassword)
require.NoError(t, err)
assert.Contains(t, certPEM, "BEGIN CERTIFICATE")
assert.Contains(t, keyPEM, "PRIVATE KEY")
assert.Contains(t, chainPEM, "BEGIN CERTIFICATE")
})
t.Run("invalid PFX", func(t *testing.T) {
_, _, _, err := ConvertPFXToPEM([]byte("bad"), "password")
assert.Error(t, err)
assert.Contains(t, err.Error(), "PFX")
})
}
// --- encodeKeyToPEM ---
func TestEncodeKeyToPEM(t *testing.T) {
t.Run("RSA key", func(t *testing.T) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
pemStr, err := encodeKeyToPEM(priv)
require.NoError(t, err)
assert.Contains(t, pemStr, "PRIVATE KEY")
})
t.Run("ECDSA key", func(t *testing.T) {
priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
require.NoError(t, err)
pemStr, err := encodeKeyToPEM(priv)
require.NoError(t, err)
assert.Contains(t, pemStr, "PRIVATE KEY")
})
}
// --- ParseCertificateInput for PFX ---
func TestParseCertificateInput_PFX(t *testing.T) {
cert, priv, _, _ := makeRSACertAndKey(t, "pfx-parse.test", time.Now().Add(time.Hour))
pfxData, err := pkcs12.Modern.Encode(priv, cert, nil, pkcs12.DefaultPassword)
require.NoError(t, err)
t.Run("PFX format detected and parsed", func(t *testing.T) {
parsed, err := ParseCertificateInput(pfxData, nil, nil, pkcs12.DefaultPassword)
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.NotNil(t, parsed.PrivateKey)
assert.Equal(t, FormatPFX, parsed.Format)
})
}
// --- detectKeyType additional branches ---
func TestDetectKeyType_P384(t *testing.T) {
priv, err := ecdsa.GenerateKey(elliptic.P384(), rand.Reader)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(99),
Subject: pkix.Name{CommonName: "p384.test"},
NotBefore: time.Now(),
NotAfter: time.Now().Add(time.Hour),
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
assert.Equal(t, "ECDSA-P384", detectKeyType(cert))
}
// --- parsePEMPrivateKey additional formats ---
func TestParsePEMPrivateKey_PKCS1(t *testing.T) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(priv)})
key, err := parsePEMPrivateKey(keyPEM)
require.NoError(t, err)
assert.NotNil(t, key)
}
func TestParsePEMPrivateKey_EC(t *testing.T) {
priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
require.NoError(t, err)
ecDER, err := x509.MarshalECPrivateKey(priv)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "EC PRIVATE KEY", Bytes: ecDER})
key, err := parsePEMPrivateKey(keyPEM)
require.NoError(t, err)
assert.NotNil(t, key)
}
func TestParsePEMPrivateKey_Invalid(t *testing.T) {
t.Run("no PEM data", func(t *testing.T) {
_, err := parsePEMPrivateKey([]byte("not pem"))
assert.Error(t, err)
assert.Contains(t, err.Error(), "no PEM data")
})
t.Run("unsupported key format", func(t *testing.T) {
badPEM := pem.EncodeToMemory(&pem.Block{Type: "UNKNOWN KEY", Bytes: []byte("junk")})
_, err := parsePEMPrivateKey(badPEM)
assert.Error(t, err)
assert.Contains(t, err.Error(), "unsupported")
})
}
// --- DetectFormat for PFX ---
func TestDetectFormat_PFX(t *testing.T) {
cert, priv, _, _ := makeRSACertAndKey(t, "detect-pfx.test", time.Now().Add(time.Hour))
pfxData, err := pkcs12.Modern.Encode(priv, cert, nil, pkcs12.DefaultPassword)
require.NoError(t, err)
format := DetectFormat(pfxData)
assert.Equal(t, FormatPFX, format, "PFX data should be detected as FormatPFX")
}

View File

@@ -0,0 +1,256 @@
package services
import (
"crypto/ecdsa"
"crypto/ed25519"
"crypto/elliptic"
"crypto/rand"
"crypto/rsa"
"crypto/x509"
"crypto/x509/pkix"
"encoding/pem"
"math/big"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
// --- ValidateKeyMatch ECDSA ---
func TestValidateKeyMatch_ECDSA_Success(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "ecdsa-match.test")
priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
require.NoError(t, err)
// Use the actual key that signed the cert
ecCert, ecKey, _, _ := makeECDSACertAndKey(t, "ecdsa-ok.test")
err = ValidateKeyMatch(ecCert, ecKey)
assert.NoError(t, err)
// Mismatch: different ECDSA key
err = ValidateKeyMatch(cert, priv)
assert.Error(t, err)
assert.Contains(t, err.Error(), "ECDSA key mismatch")
}
func TestValidateKeyMatch_ECDSA_WrongKeyType(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "ecdsa-wrong.test")
rsaKey, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
err = ValidateKeyMatch(cert, rsaKey)
assert.Error(t, err)
assert.Contains(t, err.Error(), "key type mismatch")
}
// --- ValidateKeyMatch Ed25519 ---
func TestValidateKeyMatch_Ed25519_Success(t *testing.T) {
cert, priv, _, _ := makeEd25519CertAndKey(t, "ed25519-ok.test")
err := ValidateKeyMatch(cert, priv)
assert.NoError(t, err)
}
func TestValidateKeyMatch_Ed25519_Mismatch(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed25519-mismatch.test")
_, otherPriv, err := ed25519.GenerateKey(rand.Reader)
require.NoError(t, err)
err = ValidateKeyMatch(cert, otherPriv)
assert.Error(t, err)
assert.Contains(t, err.Error(), "Ed25519 key mismatch")
}
func TestValidateKeyMatch_Ed25519_WrongKeyType(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed25519-wrong.test")
rsaKey, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
err = ValidateKeyMatch(cert, rsaKey)
assert.Error(t, err)
assert.Contains(t, err.Error(), "key type mismatch")
}
func TestValidateKeyMatch_UnsupportedKeyType(t *testing.T) {
// Create a cert with an unsupported public key type (a plain string) to trigger the default branch
cert := &x509.Certificate{PublicKey: "not-a-real-key"}
key, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
err = ValidateKeyMatch(cert, key)
assert.Error(t, err)
assert.Contains(t, err.Error(), "unsupported public key type")
}
// --- ConvertDERToPEM ---
func TestConvertDERToPEM_Valid(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "der-to-pem.test", time.Now().Add(time.Hour))
pemStr, err := ConvertDERToPEM(cert.Raw)
require.NoError(t, err)
assert.Contains(t, pemStr, "BEGIN CERTIFICATE")
}
func TestConvertDERToPEM_Invalid(t *testing.T) {
_, err := ConvertDERToPEM([]byte("not-der-data"))
assert.Error(t, err)
assert.Contains(t, err.Error(), "invalid DER")
}
// --- ConvertPEMToDER ---
func TestConvertPEMToDER_Valid(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "pem-to-der.test", time.Now().Add(time.Hour))
derData, err := ConvertPEMToDER(string(certPEM))
require.NoError(t, err)
assert.NotEmpty(t, derData)
// Verify it's valid DER
parsed, err := x509.ParseCertificate(derData)
require.NoError(t, err)
assert.Equal(t, "pem-to-der.test", parsed.Subject.CommonName)
}
func TestConvertPEMToDER_NoPEMBlock(t *testing.T) {
_, err := ConvertPEMToDER("not-pem-data")
assert.Error(t, err)
assert.Contains(t, err.Error(), "failed to decode PEM")
}
func TestConvertPEMToDER_InvalidCert(t *testing.T) {
fakePEM := string(pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: []byte("garbage")}))
_, err := ConvertPEMToDER(fakePEM)
assert.Error(t, err)
assert.Contains(t, err.Error(), "invalid certificate PEM")
}
// --- ConvertPEMToPFX ---
func TestConvertPEMToPFX_Valid(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "pem-to-pfx.test", time.Now().Add(time.Hour))
pfxData, err := ConvertPEMToPFX(string(certPEM), string(keyPEM), "", "test-password")
require.NoError(t, err)
assert.NotEmpty(t, pfxData)
}
func TestConvertPEMToPFX_WithChain(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "pfx-chain.test", time.Now().Add(time.Hour))
_, _, chainPEM, _ := makeRSACertAndKey(t, "pfx-ca.test", time.Now().Add(time.Hour))
pfxData, err := ConvertPEMToPFX(string(certPEM), string(keyPEM), string(chainPEM), "pass")
require.NoError(t, err)
assert.NotEmpty(t, pfxData)
}
func TestConvertPEMToPFX_BadCert(t *testing.T) {
_, err := ConvertPEMToPFX("not-pem", "not-pem", "", "pass")
assert.Error(t, err)
assert.Contains(t, err.Error(), "cert PEM")
}
func TestConvertPEMToPFX_BadKey(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "pfx-badkey.test", time.Now().Add(time.Hour))
_, err := ConvertPEMToPFX(string(certPEM), "not-pem", "", "pass")
assert.Error(t, err)
assert.Contains(t, err.Error(), "key PEM")
}
// --- ExtractCertificateMetadata ---
func TestExtractCertificateMetadata_Nil(t *testing.T) {
result := ExtractCertificateMetadata(nil)
assert.Nil(t, result)
}
func TestExtractCertificateMetadata_Valid(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "metadata.test", time.Now().Add(24*time.Hour))
meta := ExtractCertificateMetadata(cert)
require.NotNil(t, meta)
assert.NotEmpty(t, meta.Fingerprint)
assert.NotEmpty(t, meta.SerialNumber)
assert.Contains(t, meta.KeyType, "RSA")
assert.Contains(t, meta.Domains, "metadata.test")
}
func TestExtractCertificateMetadata_WithSANs(t *testing.T) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(42),
Subject: pkix.Name{CommonName: "san.test", Organization: []string{"Test Org"}},
Issuer: pkix.Name{Organization: []string{"Test Issuer"}},
NotBefore: time.Now(),
NotAfter: time.Now().Add(time.Hour),
DNSNames: []string{"san.test", "alt.test", "other.test"},
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
meta := ExtractCertificateMetadata(cert)
require.NotNil(t, meta)
assert.Contains(t, meta.Domains, "san.test")
assert.Contains(t, meta.Domains, "alt.test")
assert.Contains(t, meta.Domains, "other.test")
assert.Equal(t, "Test Org", meta.IssuerOrg)
}
// --- detectKeyType ---
func TestDetectKeyType_Ed25519(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed25519-type.test")
assert.Equal(t, "Ed25519", detectKeyType(cert))
}
func TestDetectKeyType_RSA(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "rsa-type.test", time.Now().Add(time.Hour))
kt := detectKeyType(cert)
assert.Contains(t, kt, "RSA-")
}
func TestDetectKeyType_ECDSA_P256(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "p256-type.test")
assert.Equal(t, "ECDSA-P256", detectKeyType(cert))
}
// --- formatSerial ---
func TestFormatSerial_Nil(t *testing.T) {
assert.Equal(t, "", formatSerial(nil))
}
func TestFormatSerial_Value(t *testing.T) {
result := formatSerial(big.NewInt(256))
assert.NotEmpty(t, result)
assert.Contains(t, result, ":")
}
// --- formatFingerprint ---
func TestFormatFingerprint_Normal(t *testing.T) {
result := formatFingerprint("aabbccdd")
assert.Equal(t, "AA:BB:CC:DD", result)
}
func TestFormatFingerprint_OddLength(t *testing.T) {
result := formatFingerprint("aabbc")
assert.Contains(t, result, "AA:BB")
}
// --- DetectFormat DER ---
func TestDetectFormat_DER(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "detect-der.test", time.Now().Add(time.Hour))
format := DetectFormat(cert.Raw)
assert.Equal(t, FormatDER, format)
}
func TestDetectFormat_PEM(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "detect-pem.test", time.Now().Add(time.Hour))
format := DetectFormat(certPEM)
assert.Equal(t, FormatPEM, format)
}

View File

@@ -0,0 +1,189 @@
package services
import (
"crypto/ecdsa"
"crypto/elliptic"
"crypto/rand"
"crypto/rsa"
"crypto/x509"
"crypto/x509/pkix"
"encoding/pem"
"math/big"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"software.sslmate.com/src/go-pkcs12"
)
func TestDetectFormat_PasswordProtectedPFX(t *testing.T) {
cert, key, _, _ := makeRSACertAndKey(t, "pfx-pw.example.com", time.Now().Add(24*time.Hour))
pfxData, err := pkcs12.Modern.Encode(key, cert, nil, "custompw")
require.NoError(t, err)
format := DetectFormat(pfxData)
assert.Equal(t, FormatPFX, format)
}
func TestParsePEMPrivateKey_PKCS1RSA(t *testing.T) {
key, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
keyDER := x509.MarshalPKCS1PrivateKey(key)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: keyDER})
parsed, err := parsePEMPrivateKey(keyPEM)
require.NoError(t, err)
assert.NotNil(t, parsed)
}
func TestParsePEMPrivateKey_ECPrivKey(t *testing.T) {
key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
require.NoError(t, err)
keyDER, err := x509.MarshalECPrivateKey(key)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "EC PRIVATE KEY", Bytes: keyDER})
parsed, err := parsePEMPrivateKey(keyPEM)
require.NoError(t, err)
assert.NotNil(t, parsed)
}
func TestDetectKeyType_ECDSAP384(t *testing.T) {
key, err := ecdsa.GenerateKey(elliptic.P384(), rand.Reader)
require.NoError(t, err)
template := &x509.Certificate{
SerialNumber: big.NewInt(1),
Subject: pkix.Name{CommonName: "p384.example.com"},
NotBefore: time.Now().Add(-time.Minute),
NotAfter: time.Now().Add(24 * time.Hour),
}
certDER, err := x509.CreateCertificate(rand.Reader, template, template, &key.PublicKey, key)
require.NoError(t, err)
cert, err := x509.ParseCertificate(certDER)
require.NoError(t, err)
assert.Equal(t, "ECDSA-P384", detectKeyType(cert))
}
func TestDetectKeyType_ECDSAUnknownCurve(t *testing.T) {
key, err := ecdsa.GenerateKey(elliptic.P224(), rand.Reader)
require.NoError(t, err)
template := &x509.Certificate{
SerialNumber: big.NewInt(1),
Subject: pkix.Name{CommonName: "p224.example.com"},
NotBefore: time.Now().Add(-time.Minute),
NotAfter: time.Now().Add(24 * time.Hour),
}
certDER, err := x509.CreateCertificate(rand.Reader, template, template, &key.PublicKey, key)
require.NoError(t, err)
cert, err := x509.ParseCertificate(certDER)
require.NoError(t, err)
assert.Equal(t, "ECDSA", detectKeyType(cert))
}
func TestConvertPEMToPFX_EmptyChain(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "pfx-chain.example.com", time.Now().Add(24*time.Hour))
pfxData, err := ConvertPEMToPFX(string(certPEM), string(keyPEM), "", "testpass")
require.NoError(t, err)
assert.NotEmpty(t, pfxData)
}
func TestConvertPEMToDER_NonCertBlock(t *testing.T) {
key, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{
Type: "RSA PRIVATE KEY",
Bytes: x509.MarshalPKCS1PrivateKey(key),
})
_, err = ConvertPEMToDER(string(keyPEM))
require.Error(t, err)
assert.Contains(t, err.Error(), "invalid certificate PEM")
}
func TestFormatSerial_NilInput(t *testing.T) {
assert.Equal(t, "", formatSerial(nil))
}
func TestDetectFormat_EmptyPasswordPFX(t *testing.T) {
cert, key, _, _ := makeRSACertAndKey(t, "empty-pw.example.com", time.Now().Add(24*time.Hour))
pfxData, err := pkcs12.Modern.Encode(key, cert, nil, "")
require.NoError(t, err)
format := DetectFormat(pfxData)
assert.Equal(t, FormatPFX, format)
}
func TestParseCertificateInput_BadChainPEM(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "bad-chain-test.example.com", time.Now().Add(24*time.Hour))
badChain := []byte("-----BEGIN CERTIFICATE-----\naW52YWxpZA==\n-----END CERTIFICATE-----\n")
_, err := ParseCertificateInput(certPEM, nil, badChain, "")
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to parse chain PEM")
}
func TestValidateChain_WithIntermediates(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "chain-inter.example.com", time.Now().Add(24*time.Hour))
// Exercises the intermediates code path only; the result is discarded
// because a self-signed leaf cannot build a trusted chain either way.
_ = ValidateChain(cert, []*x509.Certificate{cert})
}
func TestConvertPEMToPFX_BadCertPEM(t *testing.T) {
badCertPEM := "-----BEGIN CERTIFICATE-----\naW52YWxpZA==\n-----END CERTIFICATE-----\n"
_, err := ConvertPEMToPFX(badCertPEM, "somekey", "", "pass")
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to parse cert PEM")
}
func TestConvertPEMToPFX_BadChainPEM(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "pfx-bad-chain.example.com", time.Now().Add(24*time.Hour))
badChain := "-----BEGIN CERTIFICATE-----\naW52YWxpZA==\n-----END CERTIFICATE-----\n"
_, err := ConvertPEMToPFX(string(certPEM), string(keyPEM), badChain, "pass")
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to parse chain PEM")
}
func TestParsePEMPrivateKey_PKCS8(t *testing.T) {
key, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
der, err := x509.MarshalPKCS8PrivateKey(key)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "PRIVATE KEY", Bytes: der})
parsed, err := parsePEMPrivateKey(keyPEM)
require.NoError(t, err)
assert.NotNil(t, parsed)
}
func TestEncodeKeyToPEM_UnsupportedKeyType(t *testing.T) {
type badKey struct{}
_, err := encodeKeyToPEM(badKey{})
require.Error(t, err)
assert.Contains(t, err.Error(), "failed to marshal private key")
}
func TestDetectKeyType_Unknown(t *testing.T) {
cert := &x509.Certificate{
PublicKey: "not-a-real-key",
}
assert.Equal(t, "Unknown", detectKeyType(cert))
}

View File

@@ -0,0 +1,388 @@
package services
import (
"crypto/ecdsa"
"crypto/ed25519"
"crypto/elliptic"
"crypto/rand"
"crypto/rsa"
"crypto/x509"
"crypto/x509/pkix"
"encoding/pem"
"math/big"
"testing"
"time"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
// --- helpers ---
func makeRSACertAndKey(t *testing.T, cn string, expiry time.Time) (*x509.Certificate, *rsa.PrivateKey, []byte, []byte) {
t.Helper()
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(1),
Subject: pkix.Name{CommonName: cn},
NotBefore: time.Now(),
NotAfter: expiry,
KeyUsage: x509.KeyUsageKeyEncipherment | x509.KeyUsageDigitalSignature,
ExtKeyUsage: []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der})
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(priv)})
return cert, priv, certPEM, keyPEM
}
func makeECDSACertAndKey(t *testing.T, cn string) (*x509.Certificate, *ecdsa.PrivateKey, []byte, []byte) {
t.Helper()
priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(2),
Subject: pkix.Name{CommonName: cn},
NotBefore: time.Now(),
NotAfter: time.Now().Add(24 * time.Hour),
KeyUsage: x509.KeyUsageDigitalSignature,
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der})
keyDER, err := x509.MarshalECPrivateKey(priv)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "EC PRIVATE KEY", Bytes: keyDER})
return cert, priv, certPEM, keyPEM
}
func makeEd25519CertAndKey(t *testing.T, cn string) (*x509.Certificate, ed25519.PrivateKey, []byte, []byte) {
t.Helper()
pub, priv, err := ed25519.GenerateKey(rand.Reader)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(3),
Subject: pkix.Name{CommonName: cn},
NotBefore: time.Now(),
NotAfter: time.Now().Add(24 * time.Hour),
KeyUsage: x509.KeyUsageDigitalSignature,
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, pub, priv)
require.NoError(t, err)
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
certPEM := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der})
keyDER, err := x509.MarshalPKCS8PrivateKey(priv)
require.NoError(t, err)
keyPEM := pem.EncodeToMemory(&pem.Block{Type: "PRIVATE KEY", Bytes: keyDER})
return cert, priv, certPEM, keyPEM
}
// --- DetectFormat ---
func TestDetectFormat(t *testing.T) {
cert, _, certPEM, _ := makeRSACertAndKey(t, "test.com", time.Now().Add(time.Hour))
t.Run("PEM format", func(t *testing.T) {
assert.Equal(t, FormatPEM, DetectFormat(certPEM))
})
t.Run("DER format", func(t *testing.T) {
assert.Equal(t, FormatDER, DetectFormat(cert.Raw))
})
t.Run("unknown format", func(t *testing.T) {
assert.Equal(t, FormatUnknown, DetectFormat([]byte("not a cert")))
})
t.Run("empty data", func(t *testing.T) {
assert.Equal(t, FormatUnknown, DetectFormat([]byte{}))
})
}
// --- ParseCertificateInput ---
func TestParseCertificateInput(t *testing.T) {
t.Run("PEM cert only", func(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "pem.test", time.Now().Add(time.Hour))
parsed, err := ParseCertificateInput(certPEM, nil, nil, "")
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.Equal(t, FormatPEM, parsed.Format)
assert.Nil(t, parsed.PrivateKey)
})
t.Run("PEM cert with key", func(t *testing.T) {
_, _, certPEM, keyPEM := makeRSACertAndKey(t, "pem-key.test", time.Now().Add(time.Hour))
parsed, err := ParseCertificateInput(certPEM, keyPEM, nil, "")
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.NotNil(t, parsed.PrivateKey)
assert.Equal(t, FormatPEM, parsed.Format)
})
t.Run("DER cert", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "der.test", time.Now().Add(time.Hour))
parsed, err := ParseCertificateInput(cert.Raw, nil, nil, "")
require.NoError(t, err)
assert.NotNil(t, parsed.Leaf)
assert.Equal(t, FormatDER, parsed.Format)
})
t.Run("empty data returns error", func(t *testing.T) {
_, err := ParseCertificateInput(nil, nil, nil, "")
assert.Error(t, err)
assert.Contains(t, err.Error(), "empty")
})
t.Run("unrecognized format returns error", func(t *testing.T) {
_, err := ParseCertificateInput([]byte("garbage"), nil, nil, "")
assert.Error(t, err)
assert.Contains(t, err.Error(), "unrecognized")
})
t.Run("invalid key PEM returns error", func(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "badkey.test", time.Now().Add(time.Hour))
_, err := ParseCertificateInput(certPEM, []byte("not-key"), nil, "")
assert.Error(t, err)
assert.Contains(t, err.Error(), "private key")
})
}
// --- ValidateKeyMatch ---
func TestValidateKeyMatch(t *testing.T) {
t.Run("RSA matching", func(t *testing.T) {
cert, priv, _, _ := makeRSACertAndKey(t, "rsa.test", time.Now().Add(time.Hour))
assert.NoError(t, ValidateKeyMatch(cert, priv))
})
t.Run("RSA mismatched", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "rsa1.test", time.Now().Add(time.Hour))
_, otherPriv, _, _ := makeRSACertAndKey(t, "rsa2.test", time.Now().Add(time.Hour))
err := ValidateKeyMatch(cert, otherPriv)
assert.Error(t, err)
assert.Contains(t, err.Error(), "mismatch")
})
t.Run("ECDSA matching", func(t *testing.T) {
cert, priv, _, _ := makeECDSACertAndKey(t, "ecdsa.test")
assert.NoError(t, ValidateKeyMatch(cert, priv))
})
t.Run("ECDSA mismatched", func(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "ec1.test")
_, other, _, _ := makeECDSACertAndKey(t, "ec2.test")
assert.Error(t, ValidateKeyMatch(cert, other))
})
t.Run("Ed25519 matching", func(t *testing.T) {
cert, priv, _, _ := makeEd25519CertAndKey(t, "ed.test")
assert.NoError(t, ValidateKeyMatch(cert, priv))
})
t.Run("Ed25519 mismatched", func(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed1.test")
_, other, _, _ := makeEd25519CertAndKey(t, "ed2.test")
assert.Error(t, ValidateKeyMatch(cert, other))
})
t.Run("type mismatch RSA cert with ECDSA key", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "rsa.test", time.Now().Add(time.Hour))
_, ecKey, _, _ := makeECDSACertAndKey(t, "ec.test")
err := ValidateKeyMatch(cert, ecKey)
assert.Error(t, err)
assert.Contains(t, err.Error(), "type mismatch")
})
t.Run("nil certificate", func(t *testing.T) {
_, priv, _, _ := makeRSACertAndKey(t, "rsa.test", time.Now().Add(time.Hour))
assert.Error(t, ValidateKeyMatch(nil, priv))
})
t.Run("nil key", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "rsa.test", time.Now().Add(time.Hour))
assert.Error(t, ValidateKeyMatch(cert, nil))
})
}
// --- ValidateChain ---
func TestValidateChain(t *testing.T) {
t.Run("nil leaf returns error", func(t *testing.T) {
err := ValidateChain(nil, nil)
assert.Error(t, err)
assert.Contains(t, err.Error(), "nil")
})
t.Run("self-signed cert fails validation", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "self.test", time.Now().Add(time.Hour))
// A self-signed leaf without the CA bit cannot anchor a trusted chain,
// so validation is expected to fail.
err := ValidateChain(cert, nil)
assert.Error(t, err)
})
}
// --- ConvertDERToPEM ---
func TestConvertDERToPEM(t *testing.T) {
t.Run("valid DER", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "der.test", time.Now().Add(time.Hour))
pemStr, err := ConvertDERToPEM(cert.Raw)
require.NoError(t, err)
assert.Contains(t, pemStr, "BEGIN CERTIFICATE")
})
t.Run("invalid DER", func(t *testing.T) {
_, err := ConvertDERToPEM([]byte("not-der"))
assert.Error(t, err)
})
}
// --- ConvertPEMToDER ---
func TestConvertPEMToDER(t *testing.T) {
t.Run("valid PEM", func(t *testing.T) {
_, _, certPEM, _ := makeRSACertAndKey(t, "p2d.test", time.Now().Add(time.Hour))
der, err := ConvertPEMToDER(string(certPEM))
require.NoError(t, err)
assert.NotEmpty(t, der)
// Round-trip
cert, err := x509.ParseCertificate(der)
require.NoError(t, err)
assert.Equal(t, "p2d.test", cert.Subject.CommonName)
})
t.Run("invalid PEM", func(t *testing.T) {
_, err := ConvertPEMToDER("not-pem")
assert.Error(t, err)
})
}
// --- ExtractCertificateMetadata ---
func TestExtractCertificateMetadata(t *testing.T) {
t.Run("nil cert returns nil", func(t *testing.T) {
assert.Nil(t, ExtractCertificateMetadata(nil))
})
t.Run("RSA cert metadata", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "meta.test", time.Now().Add(time.Hour))
m := ExtractCertificateMetadata(cert)
require.NotNil(t, m)
assert.Equal(t, "meta.test", m.CommonName)
assert.Contains(t, m.KeyType, "RSA")
assert.NotEmpty(t, m.Fingerprint)
assert.NotEmpty(t, m.SerialNumber)
assert.Contains(t, m.Domains, "meta.test")
})
t.Run("ECDSA cert metadata", func(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "ec-meta.test")
m := ExtractCertificateMetadata(cert)
require.NotNil(t, m)
assert.Contains(t, m.KeyType, "ECDSA")
})
t.Run("Ed25519 cert metadata", func(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed-meta.test")
m := ExtractCertificateMetadata(cert)
require.NotNil(t, m)
assert.Equal(t, "Ed25519", m.KeyType)
})
t.Run("cert with SANs", func(t *testing.T) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(10),
Subject: pkix.Name{CommonName: "main.test"},
DNSNames: []string{"main.test", "alt1.test", "alt2.test"},
NotBefore: time.Now(),
NotAfter: time.Now().Add(time.Hour),
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, _ := x509.ParseCertificate(der)
m := ExtractCertificateMetadata(cert)
require.NotNil(t, m)
assert.Contains(t, m.Domains, "main.test")
assert.Contains(t, m.Domains, "alt1.test")
assert.Contains(t, m.Domains, "alt2.test")
// CN should not be duplicated when it matches a SAN
count := 0
for _, d := range m.Domains {
if d == "main.test" {
count++
}
}
assert.Equal(t, 1, count, "CN should not be duplicated in domains list")
})
t.Run("cert with issuer org", func(t *testing.T) {
priv, err := rsa.GenerateKey(rand.Reader, 2048)
require.NoError(t, err)
tmpl := &x509.Certificate{
SerialNumber: big.NewInt(11),
Subject: pkix.Name{CommonName: "org.test"},
Issuer: pkix.Name{Organization: []string{"Test Org Inc"}},
NotBefore: time.Now(),
NotAfter: time.Now().Add(time.Hour),
}
der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &priv.PublicKey, priv)
require.NoError(t, err)
cert, _ := x509.ParseCertificate(der)
m := ExtractCertificateMetadata(cert)
require.NotNil(t, m)
// x509.CreateCertificate ignores the template's Issuer field, so the
// self-signed cert's issuer is its own Subject, not "Test Org Inc".
assert.NotEmpty(t, m.Fingerprint)
})
}
// --- Helpers ---
func TestFormatFingerprint(t *testing.T) {
assert.Equal(t, "AB:CD:EF", formatFingerprint("abcdef"))
assert.Equal(t, "01:23", formatFingerprint("0123"))
assert.Equal(t, "", formatFingerprint(""))
}
func TestFormatSerial(t *testing.T) {
assert.Equal(t, "01", formatSerial(big.NewInt(1)))
assert.Equal(t, "FF", formatSerial(big.NewInt(255)))
assert.Equal(t, "", formatSerial(nil))
}
func TestDetectKeyType(t *testing.T) {
t.Run("RSA key type", func(t *testing.T) {
cert, _, _, _ := makeRSACertAndKey(t, "rsa.test", time.Now().Add(time.Hour))
kt := detectKeyType(cert)
assert.Contains(t, kt, "RSA-2048")
})
t.Run("ECDSA-P256 key type", func(t *testing.T) {
cert, _, _, _ := makeECDSACertAndKey(t, "ec.test")
kt := detectKeyType(cert)
assert.Equal(t, "ECDSA-P256", kt)
})
t.Run("Ed25519 key type", func(t *testing.T) {
cert, _, _, _ := makeEd25519CertAndKey(t, "ed.test")
kt := detectKeyType(cert)
assert.Equal(t, "Ed25519", kt)
})
}

View File

@@ -197,6 +197,12 @@ func ReconcileCrowdSecOnStartup(db *gorm.DB, executor CrowdsecProcessManager, bi
"data_dir": dataDir,
}).Info("CrowdSec reconciliation: starting CrowdSec (mode=local, not currently running)")
// Regenerate whitelist YAML before starting so CrowdSec loads the current entries.
whitelistSvc := NewCrowdSecWhitelistService(db, dataDir)
if writeErr := whitelistSvc.WriteYAML(context.Background()); writeErr != nil {
logger.Log().WithError(writeErr).Warn("CrowdSec reconciliation: failed to write whitelist YAML on startup (non-fatal)")
}
startCtx, startCancel := context.WithTimeout(context.Background(), 30*time.Second)
defer startCancel()

View File

@@ -0,0 +1,190 @@
package services
import (
"context"
"errors"
"fmt"
"net"
"os"
"path/filepath"
"strings"
"github.com/Wikid82/charon/backend/internal/logger"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/google/uuid"
"gorm.io/gorm"
)
// Sentinel errors for CrowdSecWhitelistService operations.
var (
ErrWhitelistNotFound = errors.New("whitelist entry not found")
ErrInvalidIPOrCIDR = errors.New("invalid IP address or CIDR notation")
ErrDuplicateEntry = errors.New("entry already exists in whitelist")
)
const whitelistYAMLHeader = `name: charon-whitelist
description: "Charon-managed IP/CIDR whitelist"
filter: "evt.Meta.service == 'http'"
whitelist:
reason: "Charon managed whitelist"
`
// CrowdSecWhitelistService manages the CrowdSec IP/CIDR whitelist.
type CrowdSecWhitelistService struct {
db *gorm.DB
dataDir string
}
// NewCrowdSecWhitelistService creates a new CrowdSecWhitelistService.
func NewCrowdSecWhitelistService(db *gorm.DB, dataDir string) *CrowdSecWhitelistService {
return &CrowdSecWhitelistService{db: db, dataDir: dataDir}
}
// List returns all whitelist entries ordered by creation time.
func (s *CrowdSecWhitelistService) List(ctx context.Context) ([]models.CrowdSecWhitelist, error) {
var entries []models.CrowdSecWhitelist
if err := s.db.WithContext(ctx).Order("created_at ASC").Find(&entries).Error; err != nil {
return nil, fmt.Errorf("list whitelist entries: %w", err)
}
return entries, nil
}
// Add validates and persists a new whitelist entry, then regenerates the YAML file.
// Returns ErrInvalidIPOrCIDR for malformed input and ErrDuplicateEntry for conflicts.
func (s *CrowdSecWhitelistService) Add(ctx context.Context, ipOrCIDR, reason string) (*models.CrowdSecWhitelist, error) {
normalized, err := normalizeIPOrCIDR(strings.TrimSpace(ipOrCIDR))
if err != nil {
return nil, ErrInvalidIPOrCIDR
}
entry := models.CrowdSecWhitelist{
UUID: uuid.New().String(),
IPOrCIDR: normalized,
Reason: reason,
}
if err := s.db.WithContext(ctx).Create(&entry).Error; err != nil {
if errors.Is(err, gorm.ErrDuplicatedKey) || strings.Contains(err.Error(), "UNIQUE constraint failed") {
return nil, ErrDuplicateEntry
}
return nil, fmt.Errorf("add whitelist entry: %w", err)
}
if err := s.WriteYAML(ctx); err != nil {
logger.Log().WithError(err).Warn("failed to write CrowdSec whitelist YAML after add (non-fatal)")
}
return &entry, nil
}
// Delete removes a whitelist entry by UUID and regenerates the YAML file.
// Returns ErrWhitelistNotFound if the UUID does not exist.
func (s *CrowdSecWhitelistService) Delete(ctx context.Context, id string) error {
result := s.db.WithContext(ctx).Where("uuid = ?", id).Delete(&models.CrowdSecWhitelist{})
if result.Error != nil {
return fmt.Errorf("delete whitelist entry: %w", result.Error)
}
if result.RowsAffected == 0 {
return ErrWhitelistNotFound
}
if err := s.WriteYAML(ctx); err != nil {
logger.Log().WithError(err).Warn("failed to write CrowdSec whitelist YAML after delete (non-fatal)")
}
return nil
}
// WriteYAML renders and atomically writes the CrowdSec whitelist YAML file.
// It is a no-op when dataDir is empty (unit-test mode).
func (s *CrowdSecWhitelistService) WriteYAML(ctx context.Context) error {
if s.dataDir == "" {
return nil
}
var entries []models.CrowdSecWhitelist
if err := s.db.WithContext(ctx).Order("created_at ASC").Find(&entries).Error; err != nil {
return fmt.Errorf("write whitelist yaml: query entries: %w", err)
}
var ips, cidrs []string
for _, e := range entries {
if strings.Contains(e.IPOrCIDR, "/") {
cidrs = append(cidrs, e.IPOrCIDR)
} else {
ips = append(ips, e.IPOrCIDR)
}
}
content := buildWhitelistYAML(ips, cidrs)
dir := filepath.Join(s.dataDir, "config", "parsers", "s02-enrich")
if err := os.MkdirAll(dir, 0o750); err != nil {
return fmt.Errorf("write whitelist yaml: create dir: %w", err)
}
target := filepath.Join(dir, "charon-whitelist.yaml")
tmp := target + ".tmp"
if err := os.WriteFile(tmp, content, 0o640); err != nil {
return fmt.Errorf("write whitelist yaml: write temp: %w", err)
}
if err := os.Rename(tmp, target); err != nil {
_ = os.Remove(tmp)
return fmt.Errorf("write whitelist yaml: rename: %w", err)
}
return nil
}
// normalizeIPOrCIDR validates and normalizes an IP address or CIDR block.
// For CIDRs, the network address is returned (e.g. "10.0.0.1/8" → "10.0.0.0/8").
func normalizeIPOrCIDR(raw string) (string, error) {
if strings.Contains(raw, "/") {
// Discard the literal IP; the masked network is the normalized form.
_, network, err := net.ParseCIDR(raw)
if err != nil {
return "", err
}
return network.String(), nil
}
}
ip := net.ParseIP(raw)
if ip == nil {
return "", fmt.Errorf("invalid IP: %q", raw)
}
return ip.String(), nil
}
// buildWhitelistYAML constructs the YAML content for the CrowdSec whitelist parser.
func buildWhitelistYAML(ips, cidrs []string) []byte {
var sb strings.Builder
sb.WriteString(whitelistYAMLHeader)
sb.WriteString(" ip:")
if len(ips) == 0 {
sb.WriteString(" []\n")
} else {
sb.WriteString("\n")
for _, ip := range ips {
sb.WriteString(" - \"")
sb.WriteString(ip)
sb.WriteString("\"\n")
}
}
sb.WriteString(" cidr:")
if len(cidrs) == 0 {
sb.WriteString(" []\n")
} else {
sb.WriteString("\n")
for _, cidr := range cidrs {
sb.WriteString(" - \"")
sb.WriteString(cidr)
sb.WriteString("\"\n")
}
}
return []byte(sb.String())
}

View File

@@ -0,0 +1,309 @@
package services_test
import (
"context"
"fmt"
"os"
"path/filepath"
"testing"
"github.com/Wikid82/charon/backend/internal/models"
"github.com/Wikid82/charon/backend/internal/services"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
gormlogger "gorm.io/gorm/logger"
)
func openWhitelistTestDB(t *testing.T) *gorm.DB {
t.Helper()
db, err := gorm.Open(sqlite.Open(":memory:"), &gorm.Config{
Logger: gormlogger.Default.LogMode(gormlogger.Silent),
})
require.NoError(t, err)
require.NoError(t, db.AutoMigrate(&models.CrowdSecWhitelist{}))
t.Cleanup(func() {
sqlDB, err := db.DB()
if err == nil {
_ = sqlDB.Close()
}
})
return db
}
func TestCrowdSecWhitelistService_List_Empty(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
entries, err := svc.List(context.Background())
require.NoError(t, err)
assert.Empty(t, entries)
}
func TestCrowdSecWhitelistService_Add_ValidIP(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
entry, err := svc.Add(context.Background(), "1.2.3.4", "test reason")
require.NoError(t, err)
assert.NotEmpty(t, entry.UUID)
assert.Equal(t, "1.2.3.4", entry.IPOrCIDR)
assert.Equal(t, "test reason", entry.Reason)
}
func TestCrowdSecWhitelistService_Add_ValidCIDR(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
entry, err := svc.Add(context.Background(), "192.168.1.0/24", "local net")
require.NoError(t, err)
assert.Equal(t, "192.168.1.0/24", entry.IPOrCIDR)
}
func TestCrowdSecWhitelistService_Add_NormalizesCIDR(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
entry, err := svc.Add(context.Background(), "10.0.0.1/8", "normalize test")
require.NoError(t, err)
assert.Equal(t, "10.0.0.0/8", entry.IPOrCIDR)
}
func TestCrowdSecWhitelistService_Add_InvalidIP(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
_, err := svc.Add(context.Background(), "not-an-ip", "")
assert.ErrorIs(t, err, services.ErrInvalidIPOrCIDR)
}
func TestCrowdSecWhitelistService_Add_Duplicate(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
_, err := svc.Add(context.Background(), "5.5.5.5", "first")
require.NoError(t, err)
_, err = svc.Add(context.Background(), "5.5.5.5", "second")
assert.ErrorIs(t, err, services.ErrDuplicateEntry)
}
func TestCrowdSecWhitelistService_Delete_Existing(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
entry, err := svc.Add(context.Background(), "6.6.6.6", "to delete")
require.NoError(t, err)
err = svc.Delete(context.Background(), entry.UUID)
require.NoError(t, err)
entries, err := svc.List(context.Background())
require.NoError(t, err)
assert.Empty(t, entries)
}
func TestCrowdSecWhitelistService_Delete_NotFound(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
err := svc.Delete(context.Background(), "00000000-0000-0000-0000-000000000000")
assert.ErrorIs(t, err, services.ErrWhitelistNotFound)
}
func TestCrowdSecWhitelistService_WriteYAML_EmptyDataDir(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
err := svc.WriteYAML(context.Background())
assert.NoError(t, err)
}
func TestCrowdSecWhitelistService_WriteYAML_CreatesFile(t *testing.T) {
t.Parallel()
tmpDir := t.TempDir()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, tmpDir)
_, err := svc.Add(context.Background(), "1.1.1.1", "dns")
require.NoError(t, err)
_, err = svc.Add(context.Background(), "10.0.0.0/8", "internal")
require.NoError(t, err)
yamlPath := filepath.Join(tmpDir, "config", "parsers", "s02-enrich", "charon-whitelist.yaml")
content, err := os.ReadFile(yamlPath)
require.NoError(t, err)
s := string(content)
assert.Contains(t, s, "name: charon-whitelist")
assert.Contains(t, s, `"1.1.1.1"`)
assert.Contains(t, s, `"10.0.0.0/8"`)
}
func TestCrowdSecWhitelistService_WriteYAML_EmptyLists(t *testing.T) {
t.Parallel()
tmpDir := t.TempDir()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), tmpDir)
err := svc.WriteYAML(context.Background())
require.NoError(t, err)
yamlPath := filepath.Join(tmpDir, "config", "parsers", "s02-enrich", "charon-whitelist.yaml")
content, err := os.ReadFile(yamlPath)
require.NoError(t, err)
s := string(content)
assert.Contains(t, s, "ip: []")
assert.Contains(t, s, "cidr: []")
}
func TestCrowdSecWhitelistService_List_AfterAdd(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
for i := 0; i < 3; i++ {
_, err := svc.Add(context.Background(), fmt.Sprintf("10.0.0.%d", i+1), "")
require.NoError(t, err)
}
entries, err := svc.List(context.Background())
require.NoError(t, err)
assert.Len(t, entries, 3)
}
func TestAdd_ValidIPv6_Success(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
entry, err := svc.Add(context.Background(), "2001:db8::1", "ipv6 test")
require.NoError(t, err)
assert.Equal(t, "2001:db8::1", entry.IPOrCIDR)
entries, err := svc.List(context.Background())
require.NoError(t, err)
assert.Len(t, entries, 1)
assert.Equal(t, "2001:db8::1", entries[0].IPOrCIDR)
}
func TestCrowdSecWhitelistService_List_DBError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
_, err = svc.List(context.Background())
assert.Error(t, err)
}
func TestCrowdSecWhitelistService_Add_DBCreateError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
_, err = svc.Add(context.Background(), "1.2.3.4", "test")
assert.Error(t, err)
assert.NotErrorIs(t, err, services.ErrInvalidIPOrCIDR)
assert.NotErrorIs(t, err, services.ErrDuplicateEntry)
}
func TestCrowdSecWhitelistService_Delete_DBError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
svc := services.NewCrowdSecWhitelistService(db, "")
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
err = svc.Delete(context.Background(), "some-uuid")
assert.Error(t, err)
assert.NotErrorIs(t, err, services.ErrWhitelistNotFound)
}
func TestCrowdSecWhitelistService_WriteYAML_DBError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
tmpDir := t.TempDir()
svc := services.NewCrowdSecWhitelistService(db, tmpDir)
sqlDB, err := db.DB()
require.NoError(t, err)
_ = sqlDB.Close()
err = svc.WriteYAML(context.Background())
assert.Error(t, err)
assert.Contains(t, err.Error(), "query entries")
}
func TestCrowdSecWhitelistService_WriteYAML_MkdirError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
// /dev/null is a file on POSIX systems, so MkdirAll cannot create
// subdirectories beneath it and must fail.
svc := services.NewCrowdSecWhitelistService(db, "/dev/null/impossible")
err := svc.WriteYAML(context.Background())
assert.Error(t, err)
assert.Contains(t, err.Error(), "create dir")
}
func TestCrowdSecWhitelistService_WriteYAML_WriteFileError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
tmpDir := t.TempDir()
svc := services.NewCrowdSecWhitelistService(db, tmpDir)
// Create a directory where the .tmp file would be written, causing WriteFile to fail
dir := filepath.Join(tmpDir, "config", "parsers", "s02-enrich")
require.NoError(t, os.MkdirAll(dir, 0o750))
tmpTarget := filepath.Join(dir, "charon-whitelist.yaml.tmp")
require.NoError(t, os.MkdirAll(tmpTarget, 0o750))
err := svc.WriteYAML(context.Background())
assert.Error(t, err)
assert.Contains(t, err.Error(), "write temp")
}
func TestCrowdSecWhitelistService_Add_WriteYAMLWarning(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
// dataDir that will cause MkdirAll to fail inside WriteYAML (non-fatal)
svc := services.NewCrowdSecWhitelistService(db, "/dev/null/impossible")
entry, err := svc.Add(context.Background(), "2.2.2.2", "yaml warn test")
require.NoError(t, err)
assert.Equal(t, "2.2.2.2", entry.IPOrCIDR)
}
func TestCrowdSecWhitelistService_Delete_WriteYAMLWarning(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
// First add with empty dataDir so it succeeds
svcAdd := services.NewCrowdSecWhitelistService(db, "")
entry, err := svcAdd.Add(context.Background(), "3.3.3.3", "to delete")
require.NoError(t, err)
// Now create a service with a broken dataDir and delete
svcDel := services.NewCrowdSecWhitelistService(db, "/dev/null/impossible")
err = svcDel.Delete(context.Background(), entry.UUID)
require.NoError(t, err)
}
func TestCrowdSecWhitelistService_WriteYAML_RenameError(t *testing.T) {
t.Parallel()
db := openWhitelistTestDB(t)
tmpDir := t.TempDir()
svc := services.NewCrowdSecWhitelistService(db, tmpDir)
// Create target as a directory so rename (atomic replace) fails
dir := filepath.Join(tmpDir, "config", "parsers", "s02-enrich")
require.NoError(t, os.MkdirAll(dir, 0o750))
target := filepath.Join(dir, "charon-whitelist.yaml")
require.NoError(t, os.MkdirAll(target, 0o750))
err := svc.WriteYAML(context.Background())
assert.Error(t, err)
assert.Contains(t, err.Error(), "rename")
}
func TestCrowdSecWhitelistService_Add_InvalidCIDR(t *testing.T) {
t.Parallel()
svc := services.NewCrowdSecWhitelistService(openWhitelistTestDB(t), "")
_, err := svc.Add(context.Background(), "not-an-ip/24", "invalid cidr with slash")
assert.ErrorIs(t, err, services.ErrInvalidIPOrCIDR)
}


@@ -13,8 +13,7 @@ import (
"syscall"
"github.com/Wikid82/charon/backend/internal/logger"
"github.com/docker/docker/api/types/container"
-"github.com/docker/docker/client"
+"github.com/moby/moby/client"
)
type DockerUnavailableError struct {
@@ -86,7 +85,7 @@ func NewDockerService() *DockerService {
logger.Log().WithFields(map[string]any{"docker_host_env": envHost, "local_host": localHost}).Info("ignoring non-unix DOCKER_HOST for local docker mode")
}
-cli, err := client.NewClientWithOpts(client.WithHost(localHost), client.WithAPIVersionNegotiation())
+cli, err := client.New(client.WithHost(localHost))
if err != nil {
logger.Log().WithError(err).Warn("Failed to initialize Docker client - Docker features will be unavailable")
unavailableErr := NewDockerUnavailableError(err, buildLocalDockerUnavailableDetails(err, localHost))
@@ -115,7 +114,7 @@ func (s *DockerService) ListContainers(ctx context.Context, host string) ([]Dock
if host == "" || host == "local" {
cli = s.client
} else {
-cli, err = client.NewClientWithOpts(client.WithHost(host), client.WithAPIVersionNegotiation())
+cli, err = client.New(client.WithHost(host))
if err != nil {
return nil, fmt.Errorf("failed to create remote client: %w", err)
}
@@ -126,7 +125,7 @@ func (s *DockerService) ListContainers(ctx context.Context, host string) ([]Dock
}()
}
-containers, err := cli.ContainerList(ctx, container.ListOptions{All: false})
+containers, err := cli.ContainerList(ctx, client.ContainerListOptions{All: false})
if err != nil {
if isDockerConnectivityError(err) {
if host == "" || host == "local" {
@@ -138,14 +137,16 @@ func (s *DockerService) ListContainers(ctx context.Context, host string) ([]Dock
}
var result []DockerContainer
-for _, c := range containers {
+for _, c := range containers.Items {
// Get the first network's IP address if available
networkName := ""
ipAddress := ""
if c.NetworkSettings != nil && len(c.NetworkSettings.Networks) > 0 {
for name, net := range c.NetworkSettings.Networks {
networkName = name
-ipAddress = net.IPAddress
+if net != nil && net.IPAddress.IsValid() {
+	ipAddress = net.IPAddress.String()
+}
break // Just take the first one for now
}
}
@@ -166,11 +167,16 @@ func (s *DockerService) ListContainers(ctx context.Context, host string) ([]Dock
})
}
+shortID := c.ID
+if len(shortID) > 12 {
+	shortID = shortID[:12]
+}
result = append(result, DockerContainer{
-ID: c.ID[:12], // Short ID
+ID: shortID,
Names: names,
Image: c.Image,
-State: c.State,
+State: string(c.State),
Status: c.Status,
Network: networkName,
IP: ipAddress,


@@ -12,6 +12,7 @@ func TestNewRFC2136Provider(t *testing.T) {
if provider == nil {
t.Fatal("NewRFC2136Provider() returned nil")
+return
}
if provider.propagationTimeout != RFC2136DefaultPropagationTimeout {


@@ -24,6 +24,7 @@ echo "Installing base parsers..."
cscli parsers install crowdsecurity/http-logs --force || echo "⚠️ Failed to install crowdsecurity/http-logs"
cscli parsers install crowdsecurity/syslog-logs --force || echo "⚠️ Failed to install crowdsecurity/syslog-logs"
cscli parsers install crowdsecurity/geoip-enrich --force || echo "⚠️ Failed to install crowdsecurity/geoip-enrich"
+cscli parsers install crowdsecurity/whitelists --force || echo "⚠️ Failed to install crowdsecurity/whitelists"
# Install HTTP scenarios for attack detection
echo "Installing HTTP scenarios..."


@@ -251,13 +251,13 @@ Go releases **two major versions per year**:
- February (e.g., Go 1.26.0)
- August (e.g., Go 1.27.0)
-Plus occasional patch releases (e.g., Go 1.26.1) for security fixes.
+Plus occasional patch releases (e.g., Go 1.26.2) for security fixes.
**Bottom line:** Expect to run `./scripts/rebuild-go-tools.sh` 2-3 times per year.
### Do I need to rebuild tools for patch releases?
-**Usually no**, but it doesn't hurt. Patch releases (like 1.26.0 → 1.26.1) rarely break tool compatibility.
+**Usually no**, but it doesn't hurt. Patch releases (like 1.26.0 → 1.26.2) rarely break tool compatibility.
**Rebuild if:**

File diff suppressed because it is too large


@@ -0,0 +1,432 @@
# Nightly Build Vulnerability Remediation Plan
**Date**: 2026-04-09
**Status**: Draft — Awaiting Approval
**Scope**: Dependency security patches for 5 HIGH + 3 MEDIUM vulnerability groups
**Target**: Single PR — all changes ship together
**Archived**: Previous plan (CrowdSec Hub Bootstrapping) → `docs/plans/archive/crowdsec-hub-bootstrap-spec.md`
---
## 1. Problem Statement
The Charon nightly build is failing container image vulnerability scans with **5 HIGH-severity** and **multiple MEDIUM-severity** findings. These vulnerabilities exist across three compiled binaries embedded in the container image:
1. **Charon backend** (`/app/charon`) — Go binary built from `backend/go.mod`
2. **Caddy** (`/usr/bin/caddy`) — Built via xcaddy in the Dockerfile Caddy builder stage
3. **CrowdSec** (`/usr/local/bin/crowdsec`, `/usr/local/bin/cscli`) — Built from source in the Dockerfile CrowdSec builder stage
Additionally, the **nightly branch** was synced from development before the Go 1.26.2 bump landed, so the nightly image was compiled with Go 1.26.1 (confirmed in `ci_failure.log` line 55: `GO_VERSION: 1.26.1`).
---
## 2. Research Findings
### 2.1 Go Version Audit
All files on `development` / `main` already reference **Go 1.26.2**:
| File | Current Value | Status |
|------|---------------|--------|
| `backend/go.mod` | `go 1.26.2` | ✅ Current |
| `go.work` | `go 1.26.2` | ✅ Current |
| `Dockerfile` (`ARG GO_VERSION`) | `1.26.2` | ✅ Current |
| `.github/workflows/nightly-build.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/codecov-upload.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/quality-checks.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/codeql.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/benchmark.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/release-goreleaser.yml` | `'1.26.2'` | ✅ Current |
| `.github/workflows/e2e-tests-split.yml` | `'1.26.2'` | ✅ Current |
| `.github/skills/examples/gorm-scanner-ci-workflow.yml` | `'1.26.1'` | ❌ **Stale** |
| `scripts/install-go-1.26.0.sh` | `1.26.0` | ⚠️ Old install script (not used in CI/Docker builds) |
**Root Cause of Go stdlib CVEs**: The nightly branch's last sync predated the 1.26.2 bump. The next nightly sync from development will propagate 1.26.2 automatically. The only file requiring a fix is the example workflow.
### 2.2 Vulnerability Inventory
#### HIGH Severity (must fix — merge-blocking)
| # | CVE / GHSA | Package | Current | Fix | Binary | Dep Type |
|---|-----------|---------|---------|-----|--------|----------|
| 1 | CVE-2026-39883 | `go.opentelemetry.io/otel/sdk` | v1.40.0 | v1.43.0 | Caddy | Transitive (Caddy plugins → otelhttp → otel/sdk) |
| 2 | CVE-2026-34986 | `github.com/go-jose/go-jose/v3` | v3.0.4 | **v3.0.5** | Caddy | Transitive (caddy-security → JWT/JOSE stack) |
| 3 | CVE-2026-34986 | `github.com/go-jose/go-jose/v4` | v4.1.3 | **v4.1.4** | Caddy | Transitive (grpc v1.79.3 → go-jose/v4) |
| 4 | CVE-2026-32286 | `github.com/jackc/pgproto3/v2` | v2.3.3 | pgx/v4 v4.18.3 ¹ | CrowdSec | Transitive (CrowdSec → pgx/v4 v4.18.2 → pgproto3/v2) |
¹ pgproto3/v2 has **no patched release**. Fix requires upstream migration to pgx/v5 (uses pgproto3/v3). See §5 Risk Assessment.
#### MEDIUM Severity (fix in same pass)
| # | CVE / GHSA | Package(s) | Current | Fix | Binary | Dep Type |
|---|-----------|------------|---------|-----|--------|----------|
| 5 | GHSA-xmrv-pmrh-hhx2 | AWS SDK v2: `eventstream` v1.7.1, `cloudwatchlogs` v1.57.2, `kinesis` v1.40.1, `s3` v1.87.3 | See left | Bump all | CrowdSec | Direct deps of CrowdSec v1.7.7 |
| 6 | CVE-2026-32281, -32288, -32289 | Go stdlib | 1.26.1 | **1.26.2** | All (nightly image) | Toolchain |
| 7 | CVE-2026-39882 | OTel HTTP exporters: `otlploghttp` v0.16.0, `otlpmetrichttp` v1.40.0, `otlptracehttp` v1.40.0 | See left | Bump all | Caddy | Transitive (Caddy plugins → OTel exporters) |
### 2.3 Dependency Chain Analysis
#### Backend (`backend/go.mod`)
```
charon/backend (direct)
└─ docker/docker v28.5.2+incompatible (direct)
└─ otelhttp v0.68.0 (indirect)
└─ otel/sdk v1.43.0 (indirect) — already at latest
└─ grpc v1.79.3 (indirect)
└─ otlptracehttp v1.42.0 (indirect) ── CVE-2026-39882
```
Backend resolved versions (verified via `go list -m -json`):
| Package | Version | Type |
|---------|---------|------|
| `go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp` | v1.42.0 | indirect |
| `google.golang.org/grpc` | v1.79.3 | indirect |
| `go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp` | v0.68.0 | indirect |
**Not present in backend**: go-jose/v3, go-jose/v4, otel/sdk, pgproto3/v2, AWS SDK, otlploghttp, otlpmetrichttp.
#### CrowdSec Binary (Dockerfile `crowdsec-builder` stage)
Source: CrowdSec v1.7.7 `go.mod` (verified via `git clone --depth 1 --branch v1.7.7`):
```
crowdsec v1.7.7
└─ pgx/v4 v4.18.2 (direct) → pgproto3/v2 v2.3.3 (indirect) ── CVE-2026-32286
└─ aws-sdk-go-v2/service/s3 v1.87.3 (direct) ── GHSA-xmrv-pmrh-hhx2
└─ aws-sdk-go-v2/service/cloudwatchlogs v1.57.2 (direct) ── GHSA-xmrv-pmrh-hhx2
└─ aws-sdk-go-v2/service/kinesis v1.40.1 (direct) ── GHSA-xmrv-pmrh-hhx2
└─ aws-sdk-go-v2/aws/protocol/eventstream v1.7.1 (indirect) ── GHSA-xmrv-pmrh-hhx2
└─ otel v1.39.0, otel/metric v1.39.0, otel/trace v1.39.0 (indirect)
```
Confirmed by Trivy image scan (`trivy-image-report.json`): pgproto3/v2 v2.3.3 flagged in `usr/local/bin/crowdsec` and `usr/local/bin/cscli`.
#### Caddy Binary (Dockerfile `caddy-builder` stage)
Built via xcaddy with plugins. go.mod is generated at build time. Vulnerable packages enter via:
```
xcaddy build (Caddy v2.11.2 + plugins)
└─ caddy-security v1.1.61 → go-jose/v3 (JWT auth stack) ── CVE-2026-34986
└─ grpc (patched to v1.79.3 in Dockerfile) → go-jose/v4 v4.1.3 ── CVE-2026-34986
└─ Caddy/plugins → otel/sdk v1.40.0 ── CVE-2026-39883
└─ Caddy/plugins → otlploghttp v0.16.0, otlpmetrichttp v1.40.0, otlptracehttp v1.40.0 ── CVE-2026-39882
```
---
## 3. Technical Specifications
### 3.1 Backend go.mod Changes
**File**: `backend/go.mod` (+ `backend/go.sum` auto-generated)
```bash
cd backend
# Upgrade grpc to v1.80.0 (security patches for transitive deps)
go get google.golang.org/grpc@v1.80.0
# CVE-2026-39882: OTel HTTP exporter (backend only has otlptracehttp)
go get go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp@v1.43.0
go mod tidy
```
Expected `go.mod` diff:
- `google.golang.org/grpc` v1.79.3 → v1.80.0
- `go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp` v1.42.0 → v1.43.0
### 3.2 Dockerfile — Caddy Builder Stage Patches
**File**: `Dockerfile`, within the caddy-builder `RUN bash -c '...'` block, in the **Stage 2: Apply security patches** section.
Add after the existing `go get golang.org/x/net@v${XNET_VERSION};` line:
```bash
# CVE-2026-34986: go-jose JOSE/JWT validation bypass
# Fix in v3.0.5 and v4.1.4. Pin here until caddy-security ships fix.
# renovate: datasource=go depName=github.com/go-jose/go-jose/v3
go get github.com/go-jose/go-jose/v3@v3.0.5; \
# renovate: datasource=go depName=github.com/go-jose/go-jose/v4
go get github.com/go-jose/go-jose/v4@v4.1.4; \
# CVE-2026-39883: OTel SDK resource leak
# Fix in v1.43.0. Pin here until Caddy ships with updated OTel.
# renovate: datasource=go depName=go.opentelemetry.io/otel/sdk
go get go.opentelemetry.io/otel/sdk@v1.43.0; \
# CVE-2026-39882: OTel HTTP exporter request smuggling
# renovate: datasource=go depName=go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploghttp
go get go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploghttp@v0.19.0; \
# renovate: datasource=go depName=go.opentelemetry.io/otel/exporters/otlp/otlpmetric/otlpmetrichttp
go get go.opentelemetry.io/otel/exporters/otlp/otlpmetric/otlpmetrichttp@v1.43.0; \
# renovate: datasource=go depName=go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp
go get go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp@v1.43.0; \
```
Update the existing grpc patch line from `v1.79.3` to `v1.80.0`:
```bash
# Before:
go get google.golang.org/grpc@v1.79.3; \
# After:
# CVE-2026-33186: gRPC-Go auth bypass (fixed in v1.79.3)
# CVE-2026-34986: go-jose/v4 transitive fix (requires grpc >= v1.80.0)
# renovate: datasource=go depName=google.golang.org/grpc
go get google.golang.org/grpc@v1.80.0; \
```
### 3.3 Dockerfile — CrowdSec Builder Stage Patches
**File**: `Dockerfile`, within the crowdsec-builder `RUN` block that patches dependencies.
Add after the existing `go get golang.org/x/net@v${XNET_VERSION}` line:
```bash
# CVE-2026-32286: pgproto3/v2 buffer overflow (no v2 fix exists; bump pgx/v4 to latest patch)
# renovate: datasource=go depName=github.com/jackc/pgx/v4
go get github.com/jackc/pgx/v4@v4.18.3 && \
# GHSA-xmrv-pmrh-hhx2: AWS SDK v2 event stream injection
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream
go get github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream@v1.7.8 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs
go get github.com/aws/aws-sdk-go-v2/service/cloudwatchlogs@v1.68.0 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/kinesis
go get github.com/aws/aws-sdk-go-v2/service/kinesis@v1.43.5 && \
# renovate: datasource=go depName=github.com/aws/aws-sdk-go-v2/service/s3
go get github.com/aws/aws-sdk-go-v2/service/s3@v1.99.0 && \
```
CrowdSec grpc already at v1.80.0 — no change needed.
### 3.4 Example Workflow Fix
**File**: `.github/skills/examples/gorm-scanner-ci-workflow.yml` (line 28)
```yaml
# Before:
go-version: "1.26.1"
# After:
go-version: "1.26.2"
```
### 3.5 Go Stdlib CVEs (nightly branch — no code change needed)
The nightly workflow syncs `development → nightly` via `git merge --ff-only`. Since `development` already has Go 1.26.2 everywhere:
- Dockerfile `ARG GO_VERSION=1.26.2`
- All CI workflows `GO_VERSION: '1.26.2'`
- `backend/go.mod` `go 1.26.2`
The next nightly run at 09:00 UTC will automatically propagate Go 1.26.2 to the nightly branch and rebuild the image.
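For reference, the fast-forward-only behavior that makes this sync safe can be demonstrated in a throwaway repository. Everything below is illustrative — the branch names mirror the plan, but the commands are not the actual workflow:

```bash
set -e
# Throwaway repo where development is ahead of nightly (illustrative).
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b development .
git config user.email ci@example.com
git config user.name ci
git commit -q --allow-empty -m "init"
git branch nightly                                        # nightly starts here
git commit -q --allow-empty -m "chore: bump Go to 1.26.2" # lands on development
git checkout -q nightly
# --ff-only never creates a merge commit: it only moves the branch pointer
# forward, so nightly ends up identical to development (or the merge fails).
git merge --ff-only -q development
git log -1 --format=%s
```

Because the nightly branch receives no direct commits, the fast-forward precondition always holds and the sync is a pure pointer move.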
---
## 4. Implementation Plan
### Phase 1: Playwright Tests (N/A)
No UI/UX changes — this is a dependency-only update. Existing E2E tests validate runtime behavior.
### Phase 2: Backend Implementation
| Task | File(s) | Action |
|------|---------|--------|
| 2.1 | `backend/go.mod`, `backend/go.sum` | Run `go get` commands from §3.1 |
| 2.2 | Verify build | `cd backend && go build ./cmd/api` |
| 2.3 | Verify vet | `cd backend && go vet ./...` |
| 2.4 | Verify tests | `cd backend && go test ./...` |
| 2.5 | Verify vulns | `cd backend && govulncheck ./...` |
### Phase 3: Dockerfile Implementation
| Task | File(s) | Action |
|------|---------|--------|
| 3.1 | `Dockerfile` (caddy-builder, ~L258-280) | Add go-jose v3/v4, OTel SDK, OTel exporter patches per §3.2 |
| 3.2 | `Dockerfile` (caddy-builder, ~L270) | Update grpc patch v1.79.3 → v1.80.0 |
| 3.3 | `Dockerfile` (crowdsec-builder, ~L360-370) | Add pgx, AWS SDK patches per §3.3 |
| 3.3a | CrowdSec binaries | After patching deps, run `go build` on CrowdSec binaries before full Docker build for faster compilation feedback |
| 3.4 | `Dockerfile` | Verify `docker build .` completes successfully (amd64) |
### Phase 4: CI / Misc Fixes
| Task | File(s) | Action |
|------|---------|--------|
| 4.1 | `.github/skills/examples/gorm-scanner-ci-workflow.yml` | Bump Go version 1.26.1 → 1.26.2 |
### Phase 5: Validation
| Task | Validation |
|------|------------|
| 5.1 | `cd backend && go build ./cmd/api` — compiles cleanly |
| 5.2 | `cd backend && go test ./...` — all tests pass |
| 5.3 | `cd backend && go vet ./...` — no issues |
| 5.4 | `cd backend && govulncheck ./...` — 0 findings |
| 5.5 | `docker build -t charon:vuln-fix .` — image builds for amd64 |
| 5.6 | Trivy scan on built image: `docker run --rm -v /var/run/docker.sock:/var/run/docker.sock aquasec/trivy:latest image --severity CRITICAL,HIGH charon:vuln-fix` — 0 HIGH (pgproto3/v2 excepted) |
| 5.7 | Container health: `docker run -d -p 8080:8080 charon:vuln-fix && curl -f http://localhost:8080/health` |
| 5.8 | E2E Playwright tests pass against rebuilt container |
---
## 5. Risk Assessment
### Low Risk
| Change | Risk | Rationale |
|--------|------|-----------|
| `go-jose/v3` v3.0.4 → v3.0.5 | Low | Security patch release only |
| `go-jose/v4` v4.1.3 → v4.1.4 | Low | Security patch release only |
| `otel/sdk` v1.40.0 → v1.43.0 (Caddy) | Low | Minor bumps, backwards compatible |
| `otlptracehttp` v1.42.0 → v1.43.0 (backend) | Low | Minor bump |
| OTel exporters (Caddy) | Low | Minor/patch bumps |
| Go version example fix | None | Non-runtime file |
### Medium Risk
| Change | Risk | Mitigation |
|--------|------|------------|
| `grpc` v1.79.3 → v1.80.0 | Medium | Minor version bump. gRPC is indirect — Charon doesn't use gRPC directly. Run full test suite. Verify Caddy and CrowdSec still compile. |
| AWS SDK major bumps (s3 v1.87→v1.99, cloudwatchlogs v1.57→v1.68, kinesis v1.40→v1.43) | Medium | CrowdSec build may fail if internal APIs changed between versions. Mitigate: run `go mod tidy` after patches and verify CrowdSec binaries compile. **Note:** AWS SDK Go v2 packages use independent semver within the `v1.x.x` line — these are minor version bumps, not major API breaks. |
| `pgx/v4` v4.18.2 → v4.18.3 | Medium | Patch release should be safe. May not fully resolve pgproto3/v2 since no patched v2 exists. |
### Known Limitation: pgproto3/v2 (CVE-2026-32286)
The `pgproto3/v2` module has **no patched release** — the fix exists only in `pgproto3/v3` (used by `pgx/v5`). CrowdSec v1.7.7 uses `pgx/v4` which depends on `pgproto3/v2`. Remediation:
1. Bump `pgx/v4` to v4.18.3 (latest v4 patch) — may transitively resolve the issue
2. If scanner still flags pgproto3/v2 after the bump: document as **accepted risk with upstream tracking**
3. Monitor CrowdSec releases for `pgx/v5` migration
4. Consider upgrading `CROWDSEC_VERSION` ARG if a newer CrowdSec release ships with pgx/v5
---
## 6. Acceptance Criteria
- [ ] `cd backend && go build ./cmd/api` succeeds with zero warnings
- [ ] `cd backend && go test ./...` passes with zero failures
- [ ] `cd backend && go vet ./...` reports zero issues
- [ ] `cd backend && govulncheck ./...` reports zero findings
- [ ] Docker image builds successfully for amd64
- [ ] Trivy/Grype scan of built image shows 0 new HIGH findings (pgproto3/v2 excepted if upstream unpatched)
- [ ] Container starts, health check passes on port 8080
- [ ] Existing E2E Playwright tests pass against rebuilt container
- [ ] No new compile errors in Caddy or CrowdSec builder stages
- [ ] `backend/go.mod` shows updated versions for grpc, otlptracehttp
---
## 7. Commit Slicing Strategy
### Decision: Single PR
**Rationale**: All changes are dependency version bumps with no feature or behavioral changes. They address a single concern (security vulnerability remediation) and should be reviewed and merged atomically to avoid partial-fix states.
**Trigger reasons for single PR**:
- All changes are security patches — cannot ship partial fixes
- Changes span backend + Dockerfile + CI config — logically coupled
- No risk of one slice breaking another
- Total diff is small (go.mod/go.sum + Dockerfile patch lines + 1 YAML fix)
### PR-1: Nightly Build Vulnerability Remediation
**Scope**: All changes in §3.1–§3.4
**Files modified**:
| File | Change Type |
|------|-------------|
| `backend/go.mod` | Dependency version bumps (grpc, otlptracehttp) |
| `backend/go.sum` | Auto-generated checksum updates |
| `Dockerfile` | Add `go get` patches in caddy-builder and crowdsec-builder stages |
| `.github/skills/examples/gorm-scanner-ci-workflow.yml` | Go version 1.26.1 → 1.26.2 |
**Dependencies**: None (standalone)
**Validation gates**:
1. `go build` / `go test` / `go vet` / `govulncheck` pass
2. Docker image builds for amd64
3. Trivy/Grype scan passes (0 new HIGH)
4. E2E tests pass
**Rollback**: Revert PR. All changes are version pins — reverting restores previous state with no data migration needed.
### Post-merge Actions
1. Nightly build will automatically sync development → nightly and rebuild the image with all patches
2. Monitor next nightly scan for zero HIGH findings
3. If pgproto3/v2 still flagged: open tracking issue for CrowdSec pgx/v5 upstream migration
4. If any AWS SDK bump breaks CrowdSec compilation: pin to intermediate version and document
---
## 8. CI Failure Amendment: pgx/v4 Module Path Mismatch
**Date**: 2026-04-09
**Failure**: PR #921 `build-and-push` job, step `crowdsec-builder 7/11`
**Error**: `go: github.com/jackc/pgx/v4@v5.9.1: invalid version: go.mod has non-.../v4 module path "github.com/jackc/pgx/v5" (and .../v4/go.mod does not exist) at revision v5.9.1`
### Root Cause
Dockerfile line 386 specifies `go get github.com/jackc/pgx/v4@v5.9.1`. This mixes the v4 module path with a v5 version tag. Go's semantic import versioning rejects this because tag `v5.9.1` declares module path `github.com/jackc/pgx/v5` in its go.mod.
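The rule being tripped is Go's semantic import versioning: for major versions ≥ 2, the requested module path must carry the major-version suffix that the tag's own go.mod declares. Side by side (go.mod contents abridged):

```
Requested: github.com/jackc/pgx/v4 @ v5.9.1
  tag v5.9.1 ships  go.mod → module github.com/jackc/pgx/v5
  path says /v4, tag says /v5 → rejected

Valid:     github.com/jackc/pgx/v4 @ v4.18.3
  tag v4.18.3 ships go.mod → module github.com/jackc/pgx/v4
  path and tag agree on major 4 → accepted
```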
### Fix
**Dockerfile line 386** — change:
```dockerfile
go get github.com/jackc/pgx/v4@v5.9.1 && \
```
to:
```dockerfile
go get github.com/jackc/pgx/v4@v4.18.3 && \
```
No changes needed to the Renovate annotation (line 385) or the CVE comment (line 384) — both are already correct.
### Why v4.18.3
- CrowdSec v1.7.7 uses `github.com/jackc/pgx/v4 v4.18.2` (direct dependency)
- v4.18.3 is the latest and likely final v4 release
- pgproto3/v2 is archived at v2.3.3 (July 2025) — no fix will be released in the v2 line
- The CVE (pgproto3/v2 buffer overflow) can only be fully resolved by CrowdSec migrating to pgx/v5 upstream
- Bumping pgx/v4 to v4.18.3 gets the latest v4 maintenance patch; the CVE remains an accepted risk per §5
### Validation
The same `docker build` that previously failed at step 7/11 should now pass through the CrowdSec dependency patching stage and proceed to compilation (steps 8-11).
---
## 9. Commands Reference
```bash
# === Backend dependency upgrades ===
cd /projects/Charon/backend
go get google.golang.org/grpc@v1.80.0
go get go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp@v1.43.0
go mod tidy
# === Validate backend ===
go build ./cmd/api
go test ./...
go vet ./...
govulncheck ./...
# === Docker build (after Dockerfile edits) ===
cd /projects/Charon
docker build -t charon:vuln-fix .
# === Scan built image ===
docker run --rm \
-v /var/run/docker.sock:/var/run/docker.sock \
aquasec/trivy:latest image \
--severity CRITICAL,HIGH \
charon:vuln-fix
# === Quick container health check ===
docker run -d --name charon-vuln-test -p 8080:8080 charon:vuln-fix
sleep 10
curl -f http://localhost:8080/health
docker stop charon-vuln-test && docker rm charon-vuln-test
```


@@ -0,0 +1,460 @@
# Coverage Improvement Plan — Patch Coverage ≥ 90%
**Date**: 2026-05-02
**Status**: Draft — Awaiting Approval
**Priority**: High
**Archived Previous Plan**: Custom Certificate Upload & Management (Issue #22) → `docs/plans/archive/custom-cert-upload-management-spec-2026-05-02.md`
---
## 1. Introduction
This plan identifies exact uncovered branches across the six highest-gap backend source files and two frontend components, and specifies new test cases to close those gaps. The target is to raise overall patch coverage from **85.61% (206 missing lines)** to **≥ 90%**.
**Constraints**:
- No source file modifications — test files only
- Go tests placed in `*_patch_coverage_test.go` (same package as source)
- Frontend tests extend existing `__tests__/*.test.tsx` files
- Use testify (Go) and Vitest + React Testing Library (frontend)
---
## 2. Research Findings
### 2.1 Coverage Gap Summary
| Package | File | Missing Lines | Current Coverage |
|---|---|---|---|
| `handlers` | `certificate_handler.go` | ~54 | 70.28% |
| `services` | `certificate_service.go` | ~54 | 82.85% |
| `services` | `certificate_validator.go` | ~18 | 88.68% |
| `handlers` | `proxy_host_handler.go` | ~12 | 55.17% |
| `config` | `config.go` | ~8 | ~92% |
| `caddy` | `manager.go` | ~10 | ~88% |
| Frontend | `CertificateList.tsx` | moderate | — |
| Frontend | `CertificateUploadDialog.tsx` | moderate | — |
### 2.2 Test Infrastructure (Confirmed)
- **In-memory DB**: `gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})`
- **Mock auth**: `mockAuthMiddleware()` from `coverage_helpers_test.go`
- **Mock backup service**: `&mockBackupService{createFunc: ..., availableSpaceFunc: ...}`
- **Manager test hooks**: package-level `generateConfigFunc`, `validateConfigFunc`, `writeFileFunc` vars with `defer` restore pattern
- **Frontend mocks**: `vi.mock('../../hooks/...', ...)` and `vi.mock('react-i18next', ...)`
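The defer-restore hook pattern listed above can be sketched as follows. The hook name and signature are assumed for illustration (the real package-level vars live alongside the manager code); this is not the actual Charon implementation:

```go
package main

import "fmt"

// generateConfigFunc stands in for the package-level test hook named in the
// plan; the signature here is assumed for illustration.
var generateConfigFunc = func() (string, error) { return "real config", nil }

func buildConfig() (string, error) { return generateConfigFunc() }

// stub swaps the hook and returns a restore closure; a test would call
//   restore := stub(...)
//   defer restore()
func stub(f func() (string, error)) (restore func()) {
	orig := generateConfigFunc
	generateConfigFunc = f
	return func() { generateConfigFunc = orig }
}

func main() {
	restore := stub(func() (string, error) { return "stub config", nil })
	cfg, _ := buildConfig()
	fmt.Println(cfg) // stub config
	restore()
	cfg, _ = buildConfig()
	fmt.Println(cfg) // real config
}
```

The `defer restore()` form guarantees the hook is reset even when an assertion fails mid-test, which is why the plan mandates it for every hook swap.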
### 2.3 Existing Patch Test Files
| File | Existing Tests |
|---|---|
| `certificate_handler_patch_coverage_test.go` | `TestDelete_UUID_WithBackup_Success`, `_NotFound`, `_InUse` |
| `certificate_service_patch_coverage_test.go` | `TestExportCertificate_DER`, `_PFX`, `_P12`, `_UnsupportedFormat` |
| `certificate_validator_extra_coverage_test.go` | ECDSA/Ed25519 key match, `ConvertDERToPEM` valid/invalid |
| `manager_patch_coverage_test.go` | DNS provider encryption key paths |
| `proxy_host_handler_test.go` | Full CRUD + BulkUpdateACL + BulkUpdateSecurityHeaders |
| `proxy_host_handler_update_test.go` | Update edge cases, `ParseForwardPortField`, `ParseNullableUintField` |
---
## 3. Technical Specifications — Per-File Gap Analysis
### 3.1 `certificate_handler.go` — Export Re-Auth Path (~18 lines)
The `Export` handler re-authenticates the user when `include_key=true`. All six guard branches are uncovered.
**Gap location**: Lines ~260–320 (password empty check, `user` context key extraction, `map[string]any` cast, `id` field lookup, DB user lookup, bcrypt check)
**New tests** (append to `certificate_handler_patch_coverage_test.go`):
| Test Name | Scenario | Expected |
|---|---|---|
| `TestExport_IncludeKey_MissingPassword` | POST with `include_key=true`, no `password` field | 403 |
| `TestExport_IncludeKey_NoUserContext` | No `"user"` key in gin context | 403 |
| `TestExport_IncludeKey_InvalidClaimsType` | `"user"` set to a plain string | 403 |
| `TestExport_IncludeKey_UserIDNotInClaims` | `user = map[string]any{}` with no `"id"` key | 403 |
| `TestExport_IncludeKey_UserNotFoundInDB` | Valid claims, no matching user row | 403 |
| `TestExport_IncludeKey_WrongPassword` | User in DB, wrong plaintext password submitted | 403 |
### 3.2 `certificate_handler.go` — Export Service Errors (~4 lines)
**Gap location**: After `ExportCertificate` call — ErrCertNotFound and generic error branches
| Test Name | Scenario | Expected |
|---|---|---|
| `TestExport_CertNotFound` | Unknown UUID | 404 |
| `TestExport_ServiceError` | Service returns non-not-found error | 500 |
### 3.3 `certificate_handler.go` — Delete Numeric-ID Error Paths (~12 lines)
**Gap location**: `IsCertificateInUse` error, disk space check, backup error, `DeleteCertificateByID` returning `ErrCertInUse` or generic error
| Test Name | Scenario | Expected |
|---|---|---|
| `TestDelete_NumericID_UsageCheckError` | `IsCertificateInUse` returns error | 500 |
| `TestDelete_NumericID_LowDiskSpace` | `availableSpaceFunc` returns 0 | 507 |
| `TestDelete_NumericID_BackupError` | `createFunc` returns error | 500 |
| `TestDelete_NumericID_CertInUse_FromService` | `DeleteCertificateByID` → `ErrCertInUse` | 409 |
| `TestDelete_NumericID_DeleteError` | `DeleteCertificateByID` → generic error | 500 |
### 3.4 `certificate_handler.go` — Delete UUID Additional Error Paths (~8 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestDelete_UUID_UsageCheckInternalError` | `IsCertificateInUseByUUID` returns non-ErrCertNotFound error | 500 |
| `TestDelete_UUID_LowDiskSpace` | `availableSpaceFunc` returns 0 | 507 |
| `TestDelete_UUID_BackupCreationError` | `createFunc` returns error | 500 |
| `TestDelete_UUID_CertInUse_FromService` | `DeleteCertificate` → `ErrCertInUse` | 409 |
### 3.5 `certificate_handler.go` — Upload/Validate File Open Errors (~8 lines)
**Gap location**: `file.Open()` calls on multipart key and chain form files returning errors
| Test Name | Scenario | Expected |
|---|---|---|
| `TestUpload_KeyFile_OpenError` | Valid cert file, malformed key multipart entry | 500 |
| `TestUpload_ChainFile_OpenError` | Valid cert+key, malformed chain multipart entry | 500 |
| `TestValidate_KeyFile_OpenError` | Valid cert, malformed key multipart entry | 500 |
| `TestValidate_ChainFile_OpenError` | Valid cert+key, malformed chain multipart entry | 500 |
### 3.6 `certificate_handler.go` — `sendDeleteNotification` Rate-Limit (~2 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestSendDeleteNotification_RateLimit` | Call `sendDeleteNotification` twice within 10-second window | Second call is a no-op |
---
### 3.7 `certificate_service.go` — `SyncFromDisk` Branches (~14 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestSyncFromDisk_StagingToProductionUpgrade` | DB has staging cert, disk has production cert for same domain | DB cert updated to production provider |
| `TestSyncFromDisk_ExpiryOnlyUpdate` | Disk cert content matches DB cert, only expiry changed | Only `expires_at` column updated |
| `TestSyncFromDisk_CertRootStatPermissionError` | `os.Chmod(certRoot, 0)` before sync; add skip guard `if os.Getuid() == 0 { t.Skip("chmod permission test cannot run as root") }` | No panic; logs error; function completes |
### 3.8 `certificate_service.go` — `ListCertificates` Background Goroutine (~4 lines)
**Gap location**: `initialized=true` && TTL expired path → spawns background goroutine
| Test Name | Scenario | Expected |
|---|---|---|
| `TestListCertificates_StaleCache_TriggersBackgroundSync` | `initialized=true`, `lastScan` = 10 min ago | Returns cached list without blocking; background sync completes |
*Use `require.Eventually(t, func() bool { return svc.lastScan.After(before) }, 2*time.Second, 10*time.Millisecond, "background sync did not update lastScan")` after the call — avoids flaky fixed sleeps.*
### 3.9 `certificate_service.go` — `GetDecryptedPrivateKey` Nil encSvc and Decrypt Failure (~4 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestGetDecryptedPrivateKey_NoEncSvc` | Service with `nil` encSvc, cert has non-empty `PrivateKeyEncrypted` | Returns error |
| `TestGetDecryptedPrivateKey_DecryptFails` | encSvc configured, corrupted ciphertext in DB | Returns wrapped error |
### 3.10 `certificate_service.go` — `MigratePrivateKeys` Branches (~6 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestMigratePrivateKeys_NoEncSvc` | `encSvc == nil` | Returns nil; logs warning |
| `TestMigratePrivateKeys_WithRows` | DB has cert with `private_key` populated, valid encSvc | Row migrated: `private_key` cleared, `private_key_enc` set |
### 3.11 `certificate_service.go` — `UpdateCertificate` Errors (~4 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestUpdateCertificate_NotFound` | Non-existent UUID | Returns `ErrCertNotFound` |
| `TestUpdateCertificate_DBSaveError` | Valid UUID, DB closed before Save | Returns wrapped error |
### 3.12 `certificate_service.go` — `DeleteCertificate` ACME File Cleanup (~8 lines)
**Gap location**: `cert.Provider == "letsencrypt"` branch → Walk certRoot and remove `.crt`/`.key`/`.json` files
| Test Name | Scenario | Expected |
|---|---|---|
| `TestDeleteCertificate_LetsEncryptProvider_FileCleanup` | Create temp `.crt` matching cert domain, delete cert | `.crt` removed from disk |
| `TestDeleteCertificate_StagingProvider_FileCleanup` | Provider = `"letsencrypt-staging"` | Same cleanup behavior triggered |
### 3.13 `certificate_service.go` — `CheckExpiringCertificates` (~8 lines)
**Implementation** (lines ~966–1020): queries `provider = 'custom'` certs expiring before `threshold`, iterates and sends notification for certs with `daysLeft <= warningDays`.
| Test Name | Scenario | Expected |
|---|---|---|
| `TestCheckExpiringCertificates_ExpiresInRange` | Custom cert `expires_at = now+5d`, warningDays=30 | Returns slice with 1 cert |
| `TestCheckExpiringCertificates_AlreadyExpired` | Custom cert `expires_at = yesterday` | Result contains cert with negative days |
| `TestCheckExpiringCertificates_DBError` | DB closed before query | Returns error |
---
### 3.14 `certificate_validator.go` — `DetectFormat` Password-Protected PFX (~2 lines)
**Gap location**: PFX where `pkcs12.DecodeAll("")` fails but first byte is `0x30` (ASN.1 SEQUENCE), DER parse also fails → returns `FormatPFX`
**New file**: `certificate_validator_patch_coverage_test.go`
| Test Name | Scenario | Expected |
|---|---|---|
| `TestDetectFormat_PasswordProtectedPFX` | Generate PFX with non-empty password, call `DetectFormat` | Returns `FormatPFX` |
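The fallback described above hinges on the leading ASN.1 SEQUENCE byte shared by DER certificates and PFX archives; a minimal sketch (hypothetical helper name):

```go
package main

// looksBinaryASN1 reports whether data begins with the ASN.1 SEQUENCE tag
// (0x30) — the byte check DetectFormat is described as falling back on when
// pkcs12.DecodeAll("") fails on a password-protected archive.
func looksBinaryASN1(data []byte) bool {
	return len(data) > 0 && data[0] == 0x30
}
```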
### 3.15 `certificate_validator.go` — `parsePEMPrivateKey` Additional Block Types (~4 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestParsePEMPrivateKey_PKCS1RSA` | PEM block type `"RSA PRIVATE KEY"` (x509.MarshalPKCS1PrivateKey) | Returns RSA key |
| `TestParsePEMPrivateKey_EC` | PEM block type `"EC PRIVATE KEY"` (x509.MarshalECPrivateKey) | Returns ECDSA key |
### 3.16 `certificate_validator.go` — `detectKeyType` P-384 and Unknown Curves (~4 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestDetectKeyType_ECDSAP384` | P-384 ECDSA key | Returns `"ECDSA-P384"` |
| `TestDetectKeyType_ECDSAUnknownCurve` | ECDSA key with custom/unknown curve (e.g. P-224) | Returns `"ECDSA"` |
### 3.17 `certificate_validator.go` — `ConvertPEMToPFX` Empty Chain (~2 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestConvertPEMToPFX_EmptyChain` | Valid cert+key PEM, empty chain string | Returns PFX bytes without error |
### 3.18 `certificate_validator.go` — `ConvertPEMToDER` Non-Certificate Block (~2 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestConvertPEMToDER_NonCertBlock` | PEM block type `"PRIVATE KEY"` | Returns nil data and error |
### 3.19 `certificate_validator.go` — `formatSerial` Nil BigInt (~2 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestFormatSerial_Nil` | `formatSerial(nil)` | Returns `""` |
---
### 3.20 `proxy_host_handler.go` — `generateForwardHostWarnings` Private IP (~2 lines)
**Gap location**: `net.ParseIP(forwardHost) != nil && network.IsPrivateIP(ip)` branch (non-Docker private IP)
**New file**: `proxy_host_handler_patch_coverage_test.go`
| Test Name | Scenario | Expected |
|---|---|---|
| `TestGenerateForwardHostWarnings_PrivateIP` | forwardHost = `"192.168.1.100"` (RFC-1918, non-Docker) | Returns warning with field `"forward_host"` |
### 3.21 `proxy_host_handler.go` — `BulkUpdateSecurityHeaders` Edge Cases (~4 lines)
| Test Name | Scenario | Expected |
|---|---|---|
| `TestBulkUpdateSecurityHeaders_AllFail_Rollback` | All UUIDs not found → `updated == 0` at end | 400, transaction rolled back |
| `TestBulkUpdateSecurityHeaders_ProfileDB_NonNotFoundError` | Profile lookup returns wrapped DB error | 500 |
---
### 3.22 Frontend: `CertificateList.tsx` — Untested Branches
**File**: `frontend/src/components/__tests__/CertificateList.test.tsx`
| Gap | New Test |
|---|---|
| `bulkDeleteMutation` success | `'calls bulkDeleteMutation.mutate with selected UUIDs on confirm'` |
| `bulkDeleteMutation` error | `'shows error toast on bulk delete failure'` |
| Sort direction toggle | `'toggles sort direction when same column clicked twice'` |
| `selectedIds` reconciliation | `'reconciles selectedIds when certificate list shrinks'` |
| Export dialog open | `'opens export dialog when export button clicked'` |
### 3.23 Frontend: `CertificateUploadDialog.tsx` — Untested Branches
**File**: `frontend/src/components/dialogs/__tests__/CertificateUploadDialog.test.tsx`
| Gap | New Test |
|---|---|
| PFX hides key/chain zones | `'hides key and chain file inputs when PFX file selected'` |
| Upload success closes dialog | `'calls onOpenChange(false) on successful upload'` |
| Upload error shows toast | `'shows error toast when upload mutation fails'` |
| Validate result shown | `'displays validation result after validate clicked'` |
---
## 4. Implementation Plan
### Phase 1: Playwright Smoke Tests (Acceptance Gating)
Add smoke coverage to confirm certificate export and delete flows reach the backend.
**File**: `tests/certificate-coverage-smoke.spec.ts`
```typescript
import { test, expect } from '@playwright/test'

test.describe('Certificate Coverage Smoke', () => {
  test('export dialog opens when export button clicked', async ({ page }) => {
    await page.goto('/')
    // navigate to Certificates, click export on a cert
    // assert dialog visible
  })

  test('delete dialog opens for deletable certificate', async ({ page }) => {
    await page.goto('/')
    // assert delete confirmation dialog appears
  })
})
```
### Phase 2: Backend — Handler Tests
**File**: `backend/internal/api/handlers/certificate_handler_patch_coverage_test.go`
**Action**: Append all tests from sections 3.1–3.6.
Setup pattern for handler tests:
```go
func setupCertHandlerTest(t *testing.T) (*gin.Engine, *CertificateHandler, *gorm.DB) {
	t.Helper()
	db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
	require.NoError(t, err)
	require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.User{}, &models.ProxyHost{}))

	tmpDir := t.TempDir()
	certSvc := services.NewCertificateService(tmpDir, db, nil)
	backup := &mockBackupService{
		availableSpaceFunc: func() (int64, error) { return 1 << 30, nil },
		createFunc:         func(string) (string, error) { return "/tmp/backup.db", nil },
	}
	h := NewCertificateHandler(certSvc, backup, nil)
	h.SetDB(db)

	r := gin.New()
	r.Use(mockAuthMiddleware())
	h.RegisterRoutes(r.Group("/api"))
	return r, h, db
}
```
For `TestExport_IncludeKey_*` tests: inject user into gin context directly using a custom middleware wrapper that sets `"user"` (type `map[string]any`, field `"id"`) to the desired value.
### Phase 3: Backend — Service Tests
**File**: `backend/internal/services/certificate_service_patch_coverage_test.go`
**Action**: Append all tests from sections 3.7–3.13.
Setup pattern:
```go
func newTestSvc(t *testing.T) (*CertificateService, *gorm.DB, string) {
	t.Helper()
	db, err := gorm.Open(sqlite.Open(fmt.Sprintf("file:%s?mode=memory&cache=shared", t.Name())), &gorm.Config{})
	require.NoError(t, err)
	require.NoError(t, db.AutoMigrate(&models.SSLCertificate{}, &models.ProxyHost{}))
	tmpDir := t.TempDir()
	return NewCertificateService(tmpDir, db, nil), db, tmpDir
}
```
For `TestMigratePrivateKeys_WithRows`: use `db.Exec("INSERT INTO ssl_certificates (..., private_key) VALUES (...)")` raw SQL to bypass GORM's `gorm:"-"` tag.
### Phase 4: Backend — Validator Tests
**File**: `backend/internal/services/certificate_validator_patch_coverage_test.go` (new)
Key helpers needed:
```go
// generatePKCS1RSAKeyPEM returns an RSA key in PKCS#1 "RSA PRIVATE KEY" PEM format.
func generatePKCS1RSAKeyPEM(t *testing.T) []byte {
	key, err := rsa.GenerateKey(rand.Reader, 2048)
	require.NoError(t, err)
	return pem.EncodeToMemory(&pem.Block{
		Type:  "RSA PRIVATE KEY",
		Bytes: x509.MarshalPKCS1PrivateKey(key),
	})
}

// generateECKeyPEM returns an EC key in "EC PRIVATE KEY" (SEC1) PEM format.
func generateECKeyPEM(t *testing.T, curve elliptic.Curve) []byte {
	key, err := ecdsa.GenerateKey(curve, rand.Reader)
	require.NoError(t, err)
	b, err := x509.MarshalECPrivateKey(key)
	require.NoError(t, err)
	return pem.EncodeToMemory(&pem.Block{Type: "EC PRIVATE KEY", Bytes: b})
}
```
### Phase 5: Backend — Proxy Host Handler Tests
**File**: `backend/internal/api/handlers/proxy_host_handler_patch_coverage_test.go` (new)
Setup pattern mirrors existing `proxy_host_handler_test.go` — use in-memory SQLite, `mockAuthMiddleware`, and `mockCaddyManager` (already available via test hook vars).
### Phase 6: Frontend Tests
**Files**:
- `frontend/src/components/__tests__/CertificateList.test.tsx`
- `frontend/src/components/dialogs/__tests__/CertificateUploadDialog.test.tsx`
Use existing mock structure; add new `it(...)` blocks inside existing `describe` blocks.
Frontend bulk delete success test pattern:
```typescript
it('calls bulkDeleteMutation.mutate with selected UUIDs on confirm', async () => {
  const bulkDeleteFn = vi.fn()
  mockUseBulkDeleteCertificates.mockReturnValue({
    mutate: bulkDeleteFn,
    isPending: false,
  })
  render(<CertificateList />)
  // select checkboxes, click bulk delete, confirm dialog
  expect(bulkDeleteFn).toHaveBeenCalledWith(['uuid-1', 'uuid-2'])
})
```
### Phase 7: Validation
1. `cd /projects/Charon && bash scripts/go-test-coverage.sh`
2. `cd /projects/Charon && bash scripts/frontend-test-coverage.sh`
3. `bash scripts/local-patch-report.sh` → verify `test-results/local-patch-report.md` shows ≥ 90%
4. `bash scripts/scan-gorm-security.sh --check` → zero CRITICAL/HIGH
---
## 5. Commit Slicing Strategy
**Decision**: One PR with 5 ordered, independently-reviewable commits.
**Rationale**: Four packages touched across two build systems (Go + Node). Atomic commits allow targeted revert if a mock approach proves brittle for a specific file, without rolling back unrelated coverage gains.
| # | Scope | Files | Dependencies | Validation Gate |
|---|---|---|---|---|
| **Commit 1** | Handler re-auth + delete + file-open errors | `certificate_handler_patch_coverage_test.go` (extend) | None | `go test ./backend/internal/api/handlers/...` |
| **Commit 2** | Service SyncFromDisk, ListCerts, GetDecryptedKey, Migrate, Update, Delete, CheckExpiring | `certificate_service_patch_coverage_test.go` (extend) | None | `go test ./backend/internal/services/...` |
| **Commit 3** | Validator DetectFormat, parsePEMPrivateKey, detectKeyType, ConvertPEMToPFX/DER, formatSerial | `certificate_validator_patch_coverage_test.go` (new) | Commit 2 not required (separate file) | `go test ./backend/internal/services/...` |
| **Commit 4** | Proxy host warnings + BulkUpdateSecurityHeaders edge cases | `proxy_host_handler_patch_coverage_test.go` (new) | None | `go test ./backend/internal/api/handlers/...` |
| **Commit 5** | Frontend CertificateList + CertificateUploadDialog | `CertificateList.test.tsx`, `CertificateUploadDialog.test.tsx` (extend) | None | `npm run test` |
**Rollback**: Any commit is safe to revert independently — all changes are additive test-only files.
**Contingency**: If the `Export` handler's re-auth tests require gin context injection that the current router wiring doesn't support cleanly, use a sub-router with a custom test middleware that pre-populates `"user"` (`map[string]any{"id": uint(1)}`) with the specific value under test, bypassing `mockAuthMiddleware` for those cases only.
---
## 6. Acceptance Criteria
- [ ] `go test -race ./backend/...` — all tests pass, no data races
- [ ] Backend patch coverage ≥ 90% for all modified Go files per `test-results/local-patch-report.md`
- [ ] `npm run test` — all Vitest tests pass
- [ ] Frontend patch coverage ≥ 90% for `CertificateList.tsx` and `CertificateUploadDialog.tsx`
- [ ] GORM security scan: zero CRITICAL/HIGH findings
- [ ] No new `//nolint` or `//nosec` directives introduced
- [ ] No source file modifications — test files only
- [ ] All new Go test names follow `TestFunctionName_Scenario` convention
- [ ] Previous spec archived to `docs/plans/archive/`
---
## 7. Estimated Coverage Impact
| File | Current | Estimated After | Lines Recovered |
|---|---|---|---|
| `certificate_handler.go` | 70.28% | ~85% | ~42 lines |
| `certificate_service.go` | 82.85% | ~92% | ~44 lines |
| `certificate_validator.go` | 88.68% | ~96% | ~18 lines |
| `proxy_host_handler.go` | 55.17% | ~60% | ~8 lines |
| `CertificateList.tsx` | moderate | high | ~15 lines |
| `CertificateUploadDialog.tsx` | moderate | high | ~12 lines |
| **Overall patch** | **85.61%** | **≥ 90%** | **~139 lines** |
> **Note**: Proxy host handler remains below 90% after this plan because the `Create`/`Update`/`Delete` handler paths require full Caddy manager mock integration. A follow-up plan should address these with a dedicated `mockCaddyManager` interface.

# CrowdSec IP Whitelist Management — Implementation Plan
**Issue**: [#939 — CrowdSec IP Whitelist Management](https://github.com/owner/Charon/issues/939)
**Date**: 2026-05-20
**Status**: Draft — Awaiting Approval
**Priority**: High
**Archived Previous Plan**: Coverage Improvement Plan (patch coverage ≥ 90%) → `docs/plans/archive/patch-coverage-improvement-plan-2026-05-02.md`
---
## 1. Introduction
### 1.1 Overview
CrowdSec enforces IP ban decisions by default. Operators need a way to permanently exempt known-good IPs (uptime monitors, internal subnets, VPN exits, partners) from ever being banned. CrowdSec handles this through its `whitelists` parser, which intercepts alert evaluation and suppresses bans for matching IPs/CIDRs before decisions are even written.
This feature gives Charon operators a first-class UI for managing those whitelist entries: add an IP or CIDR, give it a reason, and have Charon persist it in the database, render the required YAML parser file into the CrowdSec config tree, and signal CrowdSec to reload—all without manual file editing.
### 1.2 Objectives
- Allow operators to add, view, and remove CrowdSec whitelist entries (IPs and CIDRs) through the Charon management UI.
- Persist entries in SQLite so they survive container restarts.
- Generate a `crowdsecurity/whitelists`-compatible YAML parser file on every mutating operation and on startup.
- Automatically install the `crowdsecurity/whitelists` hub parser so CrowdSec can process the file.
- Show the Whitelist tab only when CrowdSec is in `local` mode, consistent with other CrowdSec-only tabs.
---
## 2. Research Findings
### 2.1 Existing CrowdSec Architecture
| Component | Location | Notes |
|---|---|---|
| Hub parser installer | `configs/crowdsec/install_hub_items.sh` | Run at container start; uses `cscli parsers install --force` |
| CrowdSec handler | `backend/internal/api/handlers/crowdsec_handler.go` | ~2750 LOC; `RegisterRoutes` at L2704 |
| Route registration | `backend/internal/api/routes/routes.go` | `crowdsecHandler.RegisterRoutes(management)` at ~L620 |
| CrowdSec startup | `backend/internal/services/crowdsec_startup.go` | `ReconcileCrowdSecOnStartup()` runs before process start |
| Security config | `backend/internal/models/security_config.go` | `CrowdSecMode`, `CrowdSecConfigDir` (via `cfg.Security.CrowdSecConfigDir`) |
| IP/CIDR helper | `backend/internal/security/whitelist.go` | `IsIPInCIDRList()` using `net.ParseIP` / `net.ParseCIDR` |
| AutoMigrate | `routes.go` ~L95–125 | `&models.ManualChallenge{}` is currently the last entry |
### 2.2 Gap Analysis
- `crowdsecurity/whitelists` hub parser is **not** installed by `install_hub_items.sh` — the YAML file would be ignored by CrowdSec without it.
- No `CrowdSecWhitelist` model exists in `backend/internal/models/`.
- No whitelist service, handler methods, or API routes exist.
- No frontend tab, API client functions, or TanStack Query hooks exist.
- No E2E test spec covers whitelist management.
### 2.3 Relevant Patterns
**Model pattern** (from `access_list.go` + `security_config.go`):
```go
type Model struct {
	ID        uint      `json:"-" gorm:"primaryKey"`
	UUID      string    `json:"uuid" gorm:"uniqueIndex;not null"`
	// domain fields
	CreatedAt time.Time `json:"created_at"`
	UpdatedAt time.Time `json:"updated_at"`
}
```
**Service pattern** (from `access_list_service.go`):
```go
var ErrXxxNotFound = errors.New("xxx not found")

type XxxService struct{ db *gorm.DB }

func NewXxxService(db *gorm.DB) *XxxService { return &XxxService{db: db} }
```
**Handler error response pattern** (from `crowdsec_handler.go`):
```go
c.JSON(http.StatusBadRequest, gin.H{"error": "..."})
c.JSON(http.StatusNotFound, gin.H{"error": "..."})
c.JSON(http.StatusInternalServerError, gin.H{"error": "..."})
```
**Frontend API client pattern** (from `frontend/src/api/crowdsec.ts`):
```typescript
export const listXxx = async (): Promise<XxxEntry[]> => {
  const resp = await client.get<XxxEntry[]>('/admin/crowdsec/xxx')
  return resp.data
}
```
**Frontend mutation pattern** (from `CrowdSecConfig.tsx`):
```typescript
const mutation = useMutation({
  mutationFn: (data) => apiCall(data),
  onSuccess: () => {
    toast.success('...')
    queryClient.invalidateQueries({ queryKey: ['crowdsec-whitelist'] })
  },
  onError: (err) => toast.error(err instanceof Error ? err.message : '...'),
})
```
### 2.4 CrowdSec Whitelist YAML Format
CrowdSec's `crowdsecurity/whitelists` parser expects the following YAML structure at a path under the `parsers/s02-enrich/` directory:
```yaml
name: charon-whitelist
description: "Charon-managed IP/CIDR whitelist"
filter: "evt.Meta.service == 'http'"
whitelist:
  reason: "Charon managed whitelist"
  ip:
    - "1.2.3.4"
  cidr:
    - "10.0.0.0/8"
    - "192.168.0.0/16"
```
For an empty whitelist, both `ip` and `cidr` must be present as empty lists (not omitted) to produce valid YAML that CrowdSec can parse without error.
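This empty-list requirement can be sanity-checked by rendering the `ip:` fragment of the planned `text/template` — a minimal sketch; the fragment mirrors but is not copied verbatim from section 3.3, and `renderIPs` is a hypothetical test helper:

```go
package main

import (
	"bytes"
	"text/template"
)

// Fragment of the planned whitelist template: emits one quoted item per IP,
// or an explicit empty list when there are none.
const frag = `ip:
{{- range .IPs}}
  - "{{.}}"
{{- end}}
{{- if not .IPs}}
  []
{{- end}}
`

// renderIPs renders the fragment for a given IP slice.
func renderIPs(ips []string) string {
	var buf bytes.Buffer
	t := template.Must(template.New("wl").Parse(frag))
	if err := t.Execute(&buf, struct{ IPs []string }{ips}); err != nil {
		panic(err)
	}
	return buf.String()
}
```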
---
## 3. Technical Specifications
### 3.1 Database Schema
**New model**: `backend/internal/models/crowdsec_whitelist.go`
```go
package models

import "time"

// CrowdSecWhitelist represents a single IP or CIDR exempted from CrowdSec banning.
type CrowdSecWhitelist struct {
	ID        uint      `json:"-" gorm:"primaryKey"`
	UUID      string    `json:"uuid" gorm:"uniqueIndex;not null"`
	IPOrCIDR  string    `json:"ip_or_cidr" gorm:"not null;uniqueIndex"`
	Reason    string    `json:"reason" gorm:"not null;default:''"`
	CreatedAt time.Time `json:"created_at"`
	UpdatedAt time.Time `json:"updated_at"`
}
```
**AutoMigrate registration** (`backend/internal/api/routes/routes.go`, append after `&models.ManualChallenge{}`):
```go
&models.CrowdSecWhitelist{},
```
### 3.2 API Design
All new endpoints live under the existing `/api/v1` prefix and are registered inside `CrowdsecHandler.RegisterRoutes(rg *gin.RouterGroup)`, following the same `rg.METHOD("/admin/crowdsec/...")` naming pattern as every other CrowdSec endpoint.
#### Endpoint Table
| Method | Path | Auth | Description |
|---|---|---|---|
| `GET` | `/api/v1/admin/crowdsec/whitelist` | Management | List all whitelist entries |
| `POST` | `/api/v1/admin/crowdsec/whitelist` | Management | Add a new entry |
| `DELETE` | `/api/v1/admin/crowdsec/whitelist/:uuid` | Management | Remove an entry by UUID |
#### `GET /admin/crowdsec/whitelist`
**Response 200**:
```json
{
  "whitelist": [
    {
      "uuid": "a1b2c3d4-...",
      "ip_or_cidr": "10.0.0.0/8",
      "reason": "Internal subnet",
      "created_at": "2026-05-20T12:00:00Z",
      "updated_at": "2026-05-20T12:00:00Z"
    }
  ]
}
```
#### `POST /admin/crowdsec/whitelist`
**Request body**:
```json
{ "ip_or_cidr": "10.0.0.0/8", "reason": "Internal subnet" }
```
**Response 201**:
```json
{
  "uuid": "a1b2c3d4-...",
  "ip_or_cidr": "10.0.0.0/8",
  "reason": "Internal subnet",
  "created_at": "...",
  "updated_at": "..."
}
```
**Error responses**:
- `400` — missing/invalid `ip_or_cidr` field, unparseable IP/CIDR
- `409` — duplicate entry (same `ip_or_cidr` already exists)
- `500` — database or YAML write failure
#### `DELETE /admin/crowdsec/whitelist/:uuid`
**Response 204** — no body
**Error responses**:
- `404` — entry not found
- `500` — database or YAML write failure
### 3.3 Service Design
**New file**: `backend/internal/services/crowdsec_whitelist_service.go`
```go
package services

import (
	"context"
	"errors"
	"net"
	"os"
	"path/filepath"
	"text/template"

	"github.com/google/uuid"
	"gorm.io/gorm"

	"github.com/yourusername/charon/backend/internal/logger"
	"github.com/yourusername/charon/backend/internal/models"
)

var (
	ErrWhitelistNotFound = errors.New("whitelist entry not found")
	ErrInvalidIPOrCIDR   = errors.New("invalid IP address or CIDR notation")
	ErrDuplicateEntry    = errors.New("whitelist entry already exists")
)

type CrowdSecWhitelistService struct {
	db      *gorm.DB
	dataDir string
}

func NewCrowdSecWhitelistService(db *gorm.DB, dataDir string) *CrowdSecWhitelistService {
	return &CrowdSecWhitelistService{db: db, dataDir: dataDir}
}

// List returns all whitelist entries ordered by creation time.
func (s *CrowdSecWhitelistService) List(ctx context.Context) ([]models.CrowdSecWhitelist, error) { ... }

// Add validates, persists, and regenerates the YAML file.
func (s *CrowdSecWhitelistService) Add(ctx context.Context, ipOrCIDR, reason string) (*models.CrowdSecWhitelist, error) { ... }

// Delete removes an entry by UUID and regenerates the YAML file.
func (s *CrowdSecWhitelistService) Delete(ctx context.Context, uuid string) error { ... }

// WriteYAML renders all current entries to <dataDir>/config/parsers/s02-enrich/charon-whitelist.yaml
func (s *CrowdSecWhitelistService) WriteYAML(ctx context.Context) error { ... }
```
**Validation logic** in `Add()`:
1. Trim whitespace from `ipOrCIDR`.
2. Attempt `net.ParseIP(ipOrCIDR)` — if non-nil, it's a bare IP ✓
3. Attempt `net.ParseCIDR(ipOrCIDR)` — if `err == nil`, it's a valid CIDR ✓; normalize host bits immediately: `ipOrCIDR = network.String()` (e.g., `"10.0.0.1/8"` → `"10.0.0.0/8"`).
4. If both fail → return `ErrInvalidIPOrCIDR`
5. Attempt DB insert; if GORM unique constraint error → return `ErrDuplicateEntry`
6. On success → call `WriteYAML(ctx)` (non-fatal on YAML error: log + return original entry)
> **Note**: `Add()` and `Delete()` do **not** call `cscli hub reload`. Reload is the caller's responsibility (handled in `CrowdsecHandler.AddWhitelist` and `DeleteWhitelist` via `h.CmdExec`).
**CIDR normalization snippet** (step 3):
```go
if _, network, err := net.ParseCIDR(ipOrCIDR); err == nil {
	ipOrCIDR = network.String() // normalizes "10.0.0.1/8" → "10.0.0.0/8"
}
```
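Extending that snippet, validation steps 1–4 can be sketched as a single stdlib-only helper (hypothetical name; error handling for steps 5–6 stays in `Add()`):

```go
package main

import (
	"errors"
	"net"
	"strings"
)

var errInvalidIPOrCIDR = errors.New("invalid IP address or CIDR notation")

// normalizeIPOrCIDR trims the input, accepts a bare IP as-is, normalizes a
// CIDR's host bits, and rejects everything else — steps 1-4 of Add().
func normalizeIPOrCIDR(raw string) (string, error) {
	v := strings.TrimSpace(raw)
	if ip := net.ParseIP(v); ip != nil {
		return v, nil // bare IP
	}
	if _, network, err := net.ParseCIDR(v); err == nil {
		return network.String(), nil // "10.0.0.1/8" -> "10.0.0.0/8"
	}
	return "", errInvalidIPOrCIDR
}
```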
**YAML generation** in `WriteYAML()`:
Guard: if `s.dataDir == ""`, return `nil` immediately (no-op — used in unit tests that don't need file I/O).
```go
const whitelistTmpl = `name: charon-whitelist
description: "Charon-managed IP/CIDR whitelist"
filter: "evt.Meta.service == 'http'"
whitelist:
  reason: "Charon managed whitelist"
  ip:
{{- range .IPs}}
    - "{{.}}"
{{- end}}
{{- if not .IPs}} []
{{- end}}
  cidr:
{{- range .CIDRs}}
    - "{{.}}"
{{- end}}
{{- if not .CIDRs}} []
{{- end}}
`
```
Target file path: `<dataDir>/config/parsers/s02-enrich/charon-whitelist.yaml`
Directory created with `os.MkdirAll(..., 0o750)` if absent.
File written atomically: render to `<path>.tmp` → `os.Rename(tmp, path)`.
### 3.4 Handler Design
**Additions to `CrowdsecHandler` struct**:
```go
type CrowdsecHandler struct {
	// ... existing fields ...
	WhitelistSvc *services.CrowdSecWhitelistService // NEW
}
```
**`NewCrowdsecHandler` constructor** — initialize `WhitelistSvc`:
```go
h := &CrowdsecHandler{
	// ... existing assignments ...
}
if db != nil {
	h.WhitelistSvc = services.NewCrowdSecWhitelistService(db, dataDir)
}
return h
```
**Three new methods on `CrowdsecHandler`**:
```go
// ListWhitelists handles GET /admin/crowdsec/whitelist
func (h *CrowdsecHandler) ListWhitelists(c *gin.Context) {
	entries, err := h.WhitelistSvc.List(c.Request.Context())
	if err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to list whitelist entries"})
		return
	}
	c.JSON(http.StatusOK, gin.H{"whitelist": entries})
}

// AddWhitelist handles POST /admin/crowdsec/whitelist
func (h *CrowdsecHandler) AddWhitelist(c *gin.Context) {
	var req struct {
		IPOrCIDR string `json:"ip_or_cidr" binding:"required"`
		Reason   string `json:"reason"`
	}
	if err := c.ShouldBindJSON(&req); err != nil {
		c.JSON(http.StatusBadRequest, gin.H{"error": "ip_or_cidr is required"})
		return
	}
	entry, err := h.WhitelistSvc.Add(c.Request.Context(), req.IPOrCIDR, req.Reason)
	if errors.Is(err, services.ErrInvalidIPOrCIDR) {
		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
		return
	}
	if errors.Is(err, services.ErrDuplicateEntry) {
		c.JSON(http.StatusConflict, gin.H{"error": err.Error()})
		return
	}
	if err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to add whitelist entry"})
		return
	}
	// Reload CrowdSec so the new entry takes effect immediately (non-fatal).
	if reloadErr := h.CmdExec.Execute("cscli", "hub", "reload"); reloadErr != nil {
		logger.Log().WithError(reloadErr).Warn("failed to reload CrowdSec after whitelist add (non-fatal)")
	}
	c.JSON(http.StatusCreated, entry)
}

// DeleteWhitelist handles DELETE /admin/crowdsec/whitelist/:uuid
func (h *CrowdsecHandler) DeleteWhitelist(c *gin.Context) {
	id := strings.TrimSpace(c.Param("uuid"))
	if id == "" {
		c.JSON(http.StatusBadRequest, gin.H{"error": "uuid required"})
		return
	}
	err := h.WhitelistSvc.Delete(c.Request.Context(), id)
	if errors.Is(err, services.ErrWhitelistNotFound) {
		c.JSON(http.StatusNotFound, gin.H{"error": "whitelist entry not found"})
		return
	}
	if err != nil {
		c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to delete whitelist entry"})
		return
	}
	// Reload CrowdSec so the removed entry is no longer exempt (non-fatal).
	if reloadErr := h.CmdExec.Execute("cscli", "hub", "reload"); reloadErr != nil {
		logger.Log().WithError(reloadErr).Warn("failed to reload CrowdSec after whitelist delete (non-fatal)")
	}
	c.Status(http.StatusNoContent)
}
```
**Route registration** (append inside `RegisterRoutes`, after existing decision/bouncer routes):
```go
// Whitelist management
rg.GET("/admin/crowdsec/whitelist", h.ListWhitelists)
rg.POST("/admin/crowdsec/whitelist", h.AddWhitelist)
rg.DELETE("/admin/crowdsec/whitelist/:uuid", h.DeleteWhitelist)
```
### 3.5 Startup Integration
**File**: `backend/internal/services/crowdsec_startup.go`
In `ReconcileCrowdSecOnStartup()`, before the CrowdSec process is started:
```go
// Regenerate whitelist YAML to ensure it reflects the current DB state.
whitelistSvc := NewCrowdSecWhitelistService(db, dataDir)
if err := whitelistSvc.WriteYAML(ctx); err != nil {
	logger.Log().WithError(err).Warn("failed to write CrowdSec whitelist YAML on startup (non-fatal)")
}
```
This is **non-fatal**: if the DB has no entries, WriteYAML still writes an empty whitelist file, which is valid.
### 3.6 Hub Parser Installation
**File**: `configs/crowdsec/install_hub_items.sh`
Add after the existing `cscli parsers install` lines:
```bash
cscli parsers install crowdsecurity/whitelists --force || echo "⚠️ Failed to install crowdsecurity/whitelists"
```
### 3.7 Frontend Design
#### API Client (`frontend/src/api/crowdsec.ts`)
Append the following types and functions:
```typescript
export interface CrowdSecWhitelistEntry {
  uuid: string
  ip_or_cidr: string
  reason: string
  created_at: string
  updated_at: string
}

export interface AddWhitelistPayload {
  ip_or_cidr: string
  reason: string
}

export const listWhitelists = async (): Promise<CrowdSecWhitelistEntry[]> => {
  const resp = await client.get<{ whitelist: CrowdSecWhitelistEntry[] }>('/admin/crowdsec/whitelist')
  return resp.data.whitelist
}

export const addWhitelist = async (data: AddWhitelistPayload): Promise<CrowdSecWhitelistEntry> => {
  const resp = await client.post<CrowdSecWhitelistEntry>('/admin/crowdsec/whitelist', data)
  return resp.data
}

export const deleteWhitelist = async (uuid: string): Promise<void> => {
  await client.delete(`/admin/crowdsec/whitelist/${uuid}`)
}
```
#### TanStack Query Hooks (`frontend/src/hooks/useCrowdSecWhitelist.ts`)
```typescript
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query'
import { listWhitelists, addWhitelist, deleteWhitelist, AddWhitelistPayload } from '../api/crowdsec'
import { toast } from 'sonner'

export const useWhitelistEntries = () =>
  useQuery({
    queryKey: ['crowdsec-whitelist'],
    queryFn: listWhitelists,
  })

export const useAddWhitelist = () => {
  const queryClient = useQueryClient()
  return useMutation({
    mutationFn: (data: AddWhitelistPayload) => addWhitelist(data),
    onSuccess: () => {
      toast.success('Whitelist entry added')
      queryClient.invalidateQueries({ queryKey: ['crowdsec-whitelist'] })
    },
    onError: (err: unknown) => {
      toast.error(err instanceof Error ? err.message : 'Failed to add whitelist entry')
    },
  })
}

export const useDeleteWhitelist = () => {
  const queryClient = useQueryClient()
  return useMutation({
    mutationFn: (uuid: string) => deleteWhitelist(uuid),
    onSuccess: () => {
      toast.success('Whitelist entry removed')
      queryClient.invalidateQueries({ queryKey: ['crowdsec-whitelist'] })
    },
    onError: (err: unknown) => {
      toast.error(err instanceof Error ? err.message : 'Failed to remove whitelist entry')
    },
  })
}
```
#### CrowdSecConfig.tsx Changes
The `CrowdSecConfig.tsx` page uses a tab navigation pattern. The new "Whitelist" tab:
1. **Visibility**: Only render the tab when `isLocalMode === true` (same guard as Decisions tab).
2. **Tab value**: `"whitelist"` — append to the existing tab list.
3. **Tab panel content** (isolated component or inline JSX):
- **Add entry form**: `ip_or_cidr` text input + `reason` text input + "Add" button (disabled while `addMutation.isPending`). Validation error shown inline when backend returns 400/409.
- **Quick-add current IP**: A secondary "Add My IP" button that calls `GET /api/v1/system/my-ip` (existing endpoint) and pre-fills the `ip_or_cidr` field with the returned IP.
- **Entries table**: Columns — IP/CIDR, Reason, Added, Actions. Each row has a delete button with a confirmation dialog (matching the ban/unban modal pattern used for Decisions).
- **Empty state**: "No whitelist entries" message when the list is empty.
- **Loading state**: Skeleton rows while `useWhitelistEntries` is fetching.
**Imports added to `CrowdSecConfig.tsx`**:
```typescript
import { useWhitelistEntries, useAddWhitelist, useDeleteWhitelist } from '../hooks/useCrowdSecWhitelist'
```
### 3.8 Data Flow Diagram
```
Operator adds IP in UI
  ↓
POST /api/v1/admin/crowdsec/whitelist
  ↓
CrowdsecHandler.AddWhitelist()
  ↓
CrowdSecWhitelistService.Add()
  ├── Validate IP/CIDR (net.ParseIP / net.ParseCIDR)
  ├── Normalize CIDR host bits (network.String())
  ├── Insert into SQLite (models.CrowdSecWhitelist)
  └── WriteYAML() → <dataDir>/config/parsers/s02-enrich/charon-whitelist.yaml
  ↓
h.CmdExec.Execute("cscli", "hub", "reload")   [non-fatal on error]
  ↓
Return 201 to frontend
  ↓
invalidateQueries(['crowdsec-whitelist'])
  ↓
Table re-fetches and shows new entry
```
```
Container restart
  ↓
ReconcileCrowdSecOnStartup()
  ↓
CrowdSecWhitelistService.WriteYAML()
  └── Reads all DB entries → renders YAML
  ↓
CrowdSec process starts
  ↓
CrowdSec loads parsers/s02-enrich/charon-whitelist.yaml
  └── crowdsecurity/whitelists parser activates
  ↓
IPs/CIDRs in file are exempt from all ban decisions
```
### 3.9 Error Handling Matrix
| Scenario | Service Error | HTTP Status | Frontend Behavior |
|---|---|---|---|
| Blank `ip_or_cidr` | — | 400 | Inline validation (required field) |
| Malformed IP/CIDR | `ErrInvalidIPOrCIDR` | 400 | Toast: "Invalid IP address or CIDR notation" |
| Duplicate entry | `ErrDuplicateEntry` | 409 | Toast: "This IP/CIDR is already whitelisted" |
| DB unavailable | generic error | 500 | Toast: "Failed to add whitelist entry" |
| UUID not found on DELETE | `ErrWhitelistNotFound` | 404 | Toast: "Whitelist entry not found" |
| YAML write failure | logged, non-fatal | 201 (Add still succeeds) | No user-facing error; log warning |
| CrowdSec reload failure | logged, non-fatal | 201/204 (operation still succeeds) | No user-facing error; log warning |
### 3.10 Security Considerations
- **Input validation**: All `ip_or_cidr` values are validated server-side with `net.ParseIP` / `net.ParseCIDR` before persisting. Arbitrary strings are rejected.
- **Path traversal**: `WriteYAML` constructs the output path via `filepath.Join(s.dataDir, "config", "parsers", "s02-enrich", "charon-whitelist.yaml")`. `dataDir` is set at startup—not user-supplied at request time.
- **Privilege**: All three endpoints require management-level access (same as all other CrowdSec endpoints).
- **YAML injection**: Values are rendered through Go's `text/template` with explicit quoting of each entry; no raw string concatenation.
- **Log safety**: IPs are logged using the same structured field pattern used in existing CrowdSec handler methods (e.g., `logger.Log().WithField("ip", entry.IPOrCIDR).Info(...)`).
---
## 4. Implementation Plan
### Phase 1 — Hub Parser Installation (Groundwork)
**Files Changed**:
- `configs/crowdsec/install_hub_items.sh`
**Task 1.1**: Add `cscli parsers install crowdsecurity/whitelists --force` after the last parser install line (currently `crowdsecurity/syslog-logs`).
**Acceptance**: File change is syntactically valid bash; `shellcheck` passes.
---
### Phase 2 — Database Model
**Files Changed**:
- `backend/internal/models/crowdsec_whitelist.go` _(new file)_
- `backend/internal/api/routes/routes.go` _(append to AutoMigrate call)_
**Task 2.1**: Create `crowdsec_whitelist.go` with the `CrowdSecWhitelist` struct per §3.1.
**Task 2.2**: Append `&models.CrowdSecWhitelist{}` to the `db.AutoMigrate(...)` call in `routes.go`.
**Validation Gate**: `go build ./backend/...` passes; GORM generates `crowdsec_whitelists` table on next startup.
---
### Phase 3 — Whitelist Service
**Files Changed**:
- `backend/internal/services/crowdsec_whitelist_service.go` _(new file)_
**Task 3.1**: Implement `CrowdSecWhitelistService` with `List`, `Add`, `Delete`, `WriteYAML` per §3.3.
**Task 3.2**: Implement IP/CIDR validation in `Add()`:
- `net.ParseIP(ipOrCIDR) != nil` → valid bare IP
- `net.ParseCIDR(ipOrCIDR)` returns no error → valid CIDR
- Both fail → `ErrInvalidIPOrCIDR`
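The validation steps above, combined with the CIDR host-bit normalization from §3.3, can be sketched as a single helper (the function name `normalizeEntry` is illustrative; the real code lives inside `Add()`):

```go
package main

import (
	"errors"
	"fmt"
	"net"
)

var ErrInvalidIPOrCIDR = errors.New("invalid IP address or CIDR notation")

// normalizeEntry validates ipOrCIDR per Task 3.2 and returns the canonical
// form: bare IPs pass through unchanged, CIDRs get their host bits zeroed
// via network.String() (e.g. "10.0.0.1/8" becomes "10.0.0.0/8").
func normalizeEntry(ipOrCIDR string) (string, error) {
	if net.ParseIP(ipOrCIDR) != nil {
		return ipOrCIDR, nil // valid bare IPv4 or IPv6
	}
	if _, network, err := net.ParseCIDR(ipOrCIDR); err == nil {
		return network.String(), nil // normalized CIDR
	}
	return "", ErrInvalidIPOrCIDR
}

func main() {
	fmt.Println(normalizeEntry("10.0.0.1/8"))
}
```

Normalizing before the insert also makes the duplicate check in §3.3 reliable: `10.0.0.1/8` and `10.0.0.0/8` collapse to the same stored value.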
**Task 3.3**: Implement `WriteYAML()`:
- Query all entries from DB.
- Partition into `ips` (bare IPs) and `cidrs` (CIDR notation) slices.
- Render template per §2.4.
- Atomic write: temp file → `os.Rename`.
- Create directory (`os.MkdirAll`) if not present.
**Validation Gate**: `go test ./backend/internal/services/... -run TestCrowdSecWhitelist` passes.
---
### Phase 4 — API Endpoints
**Files Changed**:
- `backend/internal/api/handlers/crowdsec_handler.go`
**Task 4.1**: Add `WhitelistSvc *services.CrowdSecWhitelistService` field to `CrowdsecHandler` struct.
**Task 4.2**: Initialize `WhitelistSvc` in `NewCrowdsecHandler()` when `db != nil`.
**Task 4.3**: Implement `ListWhitelists`, `AddWhitelist`, `DeleteWhitelist` methods per §3.4.
**Task 4.4**: Register three routes in `RegisterRoutes()` per §3.4.
**Task 4.5**: In `AddWhitelist` and `DeleteWhitelist`, after the service call returns without error, call `h.CmdExec.Execute("cscli", "hub", "reload")`. Log a warning on failure; do not change the HTTP response status (reload failure is non-fatal).
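The non-fatal reload in Task 4.5 follows the pattern sketched below. The `Executor` interface and log call are illustrative stand-ins for the real `CmdExec` and logger, whose exact signatures are not shown in this spec:

```go
package main

import (
	"errors"
	"fmt"
	"log"
)

// Executor abstracts h.CmdExec for this sketch; the real type may differ.
type Executor interface {
	Execute(name string, args ...string) error
}

// reloadCrowdSec triggers `cscli hub reload` after a successful mutation.
// It reports whether the reload succeeded, but a failure is only logged:
// the DB and YAML are already updated, and startup reconciliation (Phase 5)
// will converge CrowdSec on the next restart, so the HTTP status is unchanged.
func reloadCrowdSec(exec Executor) bool {
	if err := exec.Execute("cscli", "hub", "reload"); err != nil {
		log.Printf("warn: cscli hub reload failed (non-fatal): %v", err)
		return false
	}
	return true
}

// fakeExec simulates a failing or succeeding command runner for the demo.
type fakeExec struct{ fail bool }

func (f fakeExec) Execute(name string, args ...string) error {
	if f.fail {
		return errors.New("reload failed")
	}
	return nil
}

func main() {
	reloadCrowdSec(fakeExec{fail: true}) // logs a warning, never aborts
	fmt.Println("handler still returns 201")
}
```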
**Validation Gate**: `go test ./backend/internal/api/handlers/... -run TestWhitelist` passes; `make lint-fast` clean.
---
### Phase 5 — Startup Integration
**Files Changed**:
- `backend/internal/services/crowdsec_startup.go`
**Task 5.1**: In `ReconcileCrowdSecOnStartup()`, after the DB and config are loaded but before calling `h.Executor.Start()`, instantiate `CrowdSecWhitelistService` and call `WriteYAML(ctx)`. Log warning on error; do not abort startup.
**Validation Gate**: `go test ./backend/internal/services/... -run TestReconcile` passes; existing reconcile tests still pass.
---
### Phase 6 — Frontend API + Hooks
**Files Changed**:
- `frontend/src/api/crowdsec.ts`
- `frontend/src/hooks/useCrowdSecWhitelist.ts` _(new file)_
**Task 6.1**: Add `CrowdSecWhitelistEntry`, `AddWhitelistPayload` types and `listWhitelists`, `addWhitelist`, `deleteWhitelist` functions to `crowdsec.ts` per §3.7.
**Task 6.2**: Create `useCrowdSecWhitelist.ts` with `useWhitelistEntries`, `useAddWhitelist`, `useDeleteWhitelist` hooks per §3.7.
**Validation Gate**: `pnpm test` (Vitest) passes; TypeScript compilation clean.
---
### Phase 7 — Frontend UI
**Files Changed**:
- `frontend/src/pages/CrowdSecConfig.tsx`
**Task 7.1**: Import the three hooks from `useCrowdSecWhitelist.ts`.
**Task 7.2**: Add `"whitelist"` to the tab list (visible only when `isLocalMode === true`).
**Task 7.3**: Implement the Whitelist tab panel:
- Add-entry form with IP/CIDR + Reason inputs.
- "Add My IP" button: `GET /api/v1/system/my-ip` → pre-fill `ip_or_cidr`.
- Entries table with UUID key, IP/CIDR, Reason, created date, delete button.
- Delete confirmation dialog (reuse existing modal pattern).
**Task 7.4**: Wire mutation errors to inline form validation messages (400/409 responses).
**Validation Gate**: `pnpm test` passes; TypeScript clean; `make lint-fast` clean.
---
### Phase 8 — Tests
**Files Changed**:
- `backend/internal/services/crowdsec_whitelist_service_test.go` _(new file)_
- `backend/internal/api/handlers/crowdsec_whitelist_handler_test.go` _(new file)_
- `tests/crowdsec-whitelist.spec.ts` _(new file)_
**Task 8.1 — Service unit tests**:
| Test | Scenario |
|---|---|
| `TestAdd_ValidIP_Success` | Bare IPv4 inserted; YAML file created |
| `TestAdd_ValidIPv6_Success` | Bare IPv6 inserted |
| `TestAdd_ValidCIDR_Success` | CIDR range inserted |
| `TestAdd_CIDRNormalization` | `"10.0.0.1/8"` stored as `"10.0.0.0/8"` |
| `TestAdd_InvalidIPOrCIDR_Error` | Returns `ErrInvalidIPOrCIDR` |
| `TestAdd_DuplicateEntry_Error` | Second identical insert returns `ErrDuplicateEntry` |
| `TestDelete_Success` | Entry removed; YAML regenerated |
| `TestDelete_NotFound_Error` | Returns `ErrWhitelistNotFound` |
| `TestList_Empty` | Returns empty slice |
| `TestList_Populated` | Returns all entries ordered by `created_at` |
| `TestWriteYAML_EmptyList` | Writes valid YAML with empty `ip: []` and `cidr: []` |
| `TestWriteYAML_MixedEntries` | IPs in `ip:` block; CIDRs in `cidr:` block |
| `TestWriteYAML_EmptyDataDir_NoOp` | `dataDir == ""` → returns `nil`, no file written |
**Task 8.2 — Handler unit tests** (using in-memory SQLite + `mockAuthMiddleware`):
| Test | Scenario |
|---|---|
| `TestListWhitelists_200` | Returns 200 with entries array |
| `TestAddWhitelist_201` | Valid payload → 201 |
| `TestAddWhitelist_400_MissingField` | Empty body → 400 |
| `TestAddWhitelist_400_InvalidIP` | Malformed IP → 400 |
| `TestAddWhitelist_409_Duplicate` | Duplicate → 409 |
| `TestDeleteWhitelist_204` | Valid UUID → 204 |
| `TestDeleteWhitelist_404` | Unknown UUID → 404 |
**Task 8.3 — E2E Playwright tests** (`tests/crowdsec-whitelist.spec.ts`):
```typescript
import { test, expect } from '@playwright/test'
test.describe('CrowdSec Whitelist Management', () => {
test.beforeEach(async ({ page }) => {
await page.goto('http://localhost:8080')
await page.getByRole('link', { name: 'Security' }).click()
await page.getByRole('tab', { name: 'CrowdSec' }).click()
await page.getByRole('tab', { name: 'Whitelist' }).click()
})
test('Whitelist tab only visible in local mode', async ({ page }) => {
await page.goto('http://localhost:8080')
await page.getByRole('link', { name: 'Security' }).click()
await page.getByRole('tab', { name: 'CrowdSec' }).click()
// When CrowdSec is not in local mode, the Whitelist tab must not exist
await expect(page.getByRole('tab', { name: 'Whitelist' })).toBeHidden()
})
test('displays empty state when no entries exist', async ({ page }) => {
await expect(page.getByText('No whitelist entries')).toBeVisible()
})
test('adds a valid IP address', async ({ page }) => {
await page.getByRole('textbox', { name: 'IP or CIDR' }).fill('203.0.113.5')
await page.getByRole('textbox', { name: 'Reason' }).fill('Uptime monitor')
await page.getByRole('button', { name: 'Add' }).click()
await expect(page.getByText('Whitelist entry added')).toBeVisible()
await expect(page.getByRole('cell', { name: '203.0.113.5' })).toBeVisible()
})
test('adds a valid CIDR range', async ({ page }) => {
await page.getByRole('textbox', { name: 'IP or CIDR' }).fill('10.0.0.0/8')
await page.getByRole('textbox', { name: 'Reason' }).fill('Internal subnet')
await page.getByRole('button', { name: 'Add' }).click()
await expect(page.getByText('Whitelist entry added')).toBeVisible()
await expect(page.getByRole('cell', { name: '10.0.0.0/8' })).toBeVisible()
})
test('"Add My IP" button pre-fills the detected client IP', async ({ page }) => {
await page.getByRole('button', { name: 'Add My IP' }).click()
const ipField = page.getByRole('textbox', { name: 'IP or CIDR' })
const value = await ipField.inputValue()
// Value must be a non-empty valid IP
expect(value).toMatch(/^[\d.]+$|^[0-9a-fA-F:]+$/)
})
test('shows validation error for invalid input', async ({ page }) => {
await page.getByRole('textbox', { name: 'IP or CIDR' }).fill('not-an-ip')
await page.getByRole('button', { name: 'Add' }).click()
await expect(page.getByText('Invalid IP address or CIDR notation')).toBeVisible()
})
test('removes an entry via delete confirmation', async ({ page }) => {
// Seed an entry first
await page.getByRole('textbox', { name: 'IP or CIDR' }).fill('198.51.100.1')
await page.getByRole('button', { name: 'Add' }).click()
await expect(page.getByRole('cell', { name: '198.51.100.1' })).toBeVisible()
// Delete it
await page.getByRole('row', { name: /198\.51\.100\.1/ }).getByRole('button', { name: 'Delete' }).click()
await page.getByRole('button', { name: 'Confirm' }).click()
await expect(page.getByText('Whitelist entry removed')).toBeVisible()
await expect(page.getByRole('cell', { name: '198.51.100.1' })).toBeHidden()
})
})
```
---
### Phase 9 — Documentation
**Files Changed**:
- `ARCHITECTURE.md`
- `docs/features/crowdsec-whitelist.md` _(new file, optional for this PR)_
**Task 9.1**: Update the CrowdSec row in the Cerberus security components table in `ARCHITECTURE.md` to mention whitelist management.
---
## 5. Acceptance Criteria
### Functional
- [ ] Operator can add a bare IPv4 address (e.g., `203.0.113.5`) to the whitelist.
- [ ] Operator can add a bare IPv6 address (e.g., `2001:db8::1`) to the whitelist.
- [ ] Operator can add a CIDR range (e.g., `10.0.0.0/8`) to the whitelist.
- [ ] Adding an invalid IP/CIDR (e.g., `not-an-ip`) returns a 400 error with a clear message.
- [ ] Adding a duplicate entry returns a 409 conflict error.
- [ ] Operator can delete an entry; it disappears from the list.
- [ ] The Whitelist tab is only visible when CrowdSec is in `local` mode.
- [ ] After adding or deleting an entry, the whitelist YAML file is regenerated in `<dataDir>/config/parsers/s02-enrich/charon-whitelist.yaml`.
- [ ] Adding or removing a whitelist entry triggers `cscli hub reload` via `h.CmdExec` so changes take effect immediately without a container restart.
- [ ] On container restart, the YAML file is regenerated from DB entries before CrowdSec starts.
- [ ] **Admin IP protection**: The "Add My IP" button pre-fills the operator's current IP in the `ip_or_cidr` field; a Playwright E2E test verifies the button correctly pre-fills the detected client IP.
### Technical
- [ ] `go test ./backend/...` passes — no regressions.
- [ ] `pnpm test` (Vitest) passes.
- [ ] `make lint-fast` clean — no new lint findings.
- [ ] GORM Security Scanner returns zero CRITICAL/HIGH findings.
- [ ] Playwright E2E suite passes (Firefox, `--project=firefox`).
- [ ] `crowdsecurity/whitelists` parser is installed by `install_hub_items.sh`.
---
## 6. Commit Slicing Strategy
**Decision**: Single PR with ordered logical commits. No scope overlap between commits; each commit leaves the codebase in a compilable state.
**Trigger reasons**: Cross-domain change (infra script + model + service + handler + startup + frontend) benefits from ordered commits for surgical rollback and focused review.
| # | Type | Commit Message | Files | Depends On | Validation Gate |
|---|---|---|---|---|---|
| 1 | `chore` | `install crowdsecurity/whitelists parser by default` | `configs/crowdsec/install_hub_items.sh` | — | `shellcheck` |
| 2 | `feat` | `add CrowdSecWhitelist model and automigrate registration` | `backend/internal/models/crowdsec_whitelist.go`, `backend/internal/api/routes/routes.go` | #1 | `go build ./backend/...` |
| 3 | `feat` | `add CrowdSecWhitelistService with YAML generation` | `backend/internal/services/crowdsec_whitelist_service.go` | #2 | `go test ./backend/internal/services/...` |
| 4 | `feat` | `add whitelist API endpoints to CrowdsecHandler` | `backend/internal/api/handlers/crowdsec_handler.go` | #3 | `go test ./backend/...` + `make lint-fast` |
| 5 | `feat` | `regenerate whitelist YAML on CrowdSec startup reconcile` | `backend/internal/services/crowdsec_startup.go` | #3 | `go test ./backend/internal/services/...` |
| 6 | `feat` | `add whitelist API client functions and TanStack hooks` | `frontend/src/api/crowdsec.ts`, `frontend/src/hooks/useCrowdSecWhitelist.ts` | #4 | `pnpm test` |
| 7 | `feat` | `add Whitelist tab to CrowdSecConfig UI` | `frontend/src/pages/CrowdSecConfig.tsx` | #6 | `pnpm test` + `make lint-fast` |
| 8 | `test` | `add whitelist service and handler unit tests` | `*_test.go` files | #4 | `go test ./backend/...` |
| 9 | `test` | `add E2E tests for CrowdSec whitelist management` | `tests/crowdsec-whitelist.spec.ts` | #7 | Playwright Firefox |
| 10 | `docs` | `update architecture docs for CrowdSec whitelist feature` | `ARCHITECTURE.md` | #7 | `make lint-fast` |
**Rollback notes**:
- Commits 1–3 are pure additions (no existing code modified except the `AutoMigrate` list append in commit 2 and `install_hub_items.sh` in commit 1). Reverting them is safe.
- Commit 4 modifies `crowdsec_handler.go` by adding fields and methods without altering existing ones; reverting is mechanical.
- Commit 5 modifies `crowdsec_startup.go` — the added block is isolated in a clearly marked section; revert is a 5-line removal.
- Commits 6–7 are frontend-only; reverting has no backend impact.
---
## 7. Open Questions / Risks
| Risk | Likelihood | Mitigation |
|---|---|---|
| CrowdSec does not hot-reload parser files — requires `cscli reload` or process restart | Resolved | `cscli hub reload` is called via `h.CmdExec.Execute(...)` in `AddWhitelist` and `DeleteWhitelist` after each successful `WriteYAML()`. Failure is non-fatal; logged as a warning. |
| `crowdsecurity/whitelists` parser path may differ across CrowdSec versions | Low | Use `<dataDir>/config/parsers/s02-enrich/` which is the canonical path; add a note to verify on version upgrades |
| Large whitelist files could cause CrowdSec performance issues | Very Low | Reasonable for typical use; document a soft limit recommendation (< 500 entries) in the UI |
| `dataDir` empty string in tests | Resolved | Guard added to `WriteYAML`: `if s.dataDir == "" { return nil }` — no-op when `dataDir` is unset |
| `CROWDSEC_TRUSTED_IPS` env var seeding | — | **Follow-up / future enhancement** (not in scope for this PR): if `CROWDSEC_TRUSTED_IPS` is set at runtime, parse comma-separated IPs and include them as read-only seed entries in the generated YAML (separate from DB-managed entries). Document in a follow-up issue. |


@@ -1,447 +1,131 @@
# QA Audit Report — Nightly Build Vulnerability Remediation
# QA/Security DoD Audit Report — Issue #929
**Date**: 2026-04-09
**Scope**: Dependency-only update — no feature or UI changes
**Image Under Test**: `charon:vuln-fix` (built 2026-04-09 14:53 UTC, 632MB)
**Branch**: Current working tree (pre-PR)
Date: 2026-04-21
Repository: /projects/Charon
Branch: feature/beta-release
Scope assessed: DoD revalidation after recent fixes (E2E-first, frontend coverage, pre-commit/version gate, SA1019, Trivy CVE check)
---
## Final Recommendation
## Gate Results Summary
FAIL
| # | Gate | Status | Details |
|---|------|--------|---------|
| 1 | E2E Playwright (Firefox 4/4 shards + Chromium spot check) | PASS | 19 passed, 20 skipped (security suite), 0 failed |
| 2 | Backend Tests + Coverage | PASS | All tests pass, 88.2% statements / 88.4% lines (gate: 87%) |
| 3 | Frontend Tests + Coverage | PASS | 791 passed, 41 skipped, 89.38% stmts / 90.13% lines (gate: 87%) |
| 4 | Local Patch Coverage Report | PASS | 0 changed lines (dependency-only), 100% patch coverage |
| 5 | Frontend Type Check (tsc --noEmit) | PASS | Zero TypeScript errors |
| 6 | Pre-commit Hooks (lefthook) | PASS | All hooks passed (shellcheck, actionlint, dockerfile-check, YAML, EOF/whitespace) |
| 7a | Trivy Filesystem Scan (CRITICAL/HIGH) | PASS | 0 vulnerabilities in source |
| 7b | govulncheck (backend) | INFO | 2 findings — both `docker/docker` v28.5.2 with no upstream fix (pre-existing, documented in SECURITY.md) |
| 7c | Docker Image Scan (Grype) | PASS | 0 CRITICAL, 2 HIGH (both unfixed Alpine OpenSSL), all target CVEs resolved |
| 8 | Linting (make lint-fast) | PASS | 0 issues |
| 9 | GORM Security Scan (--check) | PASS | 0 CRITICAL, 0 HIGH, 2 INFO suggestions |
Reason: Two mandatory gates are still failing in current rerun evidence:
- Playwright E2E-first gate
- Frontend coverage gate
**Overall Status: PASS**
Pre-commit/version-check is now passing.
---
## Gate Summary
## Vulnerability Remediation Verification
| # | DoD Gate | Status | Notes |
|---|---|---|---|
| 1 | Playwright E2E first | FAIL | Healthy container path confirmed (`charon-e2e Up ... (healthy)`), auth setup passes, but accessibility suite still has 1 failing test (security headers page axe timeout) |
| 2 | Frontend coverage | FAIL | `scripts/frontend-test-coverage.sh` still ends with unhandled `ENOENT` on `frontend/coverage/.tmp/coverage-132.json` |
| 3 | Pre-commit hooks + version check | PASS | `lefthook run pre-commit --all-files` passes; `check-version-match` passes (`.version` matches latest tag `v0.27.0`) |
| 4 | SA1019 reconfirmation | PASS | `golangci-lint run ./... --enable-only staticcheck` reports `0 issues`; no `SA1019` occurrences |
| 5 | Trivy FS status (CVE-2026-34040) | PASS (not detected) | Current FS scan (`trivy fs --scanners vuln .`) exits 0 with no CVE hit; `CVE-2026-34040` not present in available Trivy artifacts |
### Target CVEs — All Resolved
## Detailed Evidence
All CVEs identified in the spec (`docs/plans/current_spec.md`) were verified as absent from the `charon:vuln-fix` image:
### 1) Playwright E2E-first gate (revalidated)
| CVE / GHSA | Package | Was | Now | Status |
|-----------|---------|-----|-----|--------|
| CVE-2026-39883 | otel/sdk | v1.40.0 | v1.43.0 | Resolved |
| CVE-2026-34986 | go-jose/v3 | v3.0.4 | v3.0.5 | Resolved |
| CVE-2026-34986 | go-jose/v4 | v4.1.3 | v4.1.4 | Resolved |
| CVE-2026-32286 | pgproto3/v2 | v2.3.3 | Not detected | Resolved |
| GHSA-xmrv-pmrh-hhx2 | AWS SDK v2 (multiple) | various | Patched | Resolved |
| CVE-2026-39882 | OTel HTTP exporters | v1.40.0–v1.42.0 | v1.43.0 | Resolved |
| CVE-2026-32281/32288/32289 | Go stdlib | 1.26.1 | 1.26.2 | Resolved (via Dockerfile ARG) |
Execution evidence:
- Container health:
- `docker ps --filter name=charon-e2e --format '{{.Names}} {{.Status}}'`
- Output: `charon-e2e Up 35 minutes (healthy)`
- Auth setup:
- `PLAYWRIGHT_HTML_OPEN=never npx playwright test --project=firefox tests/auth.setup.ts -g "authenticate"`
- Result: `1 passed`
- Evidence: `Login successful`
- Accessibility rerun:
- `PLAYWRIGHT_HTML_OPEN=never npx playwright test --project=firefox -g "accessibility"`
- Result: `1 failed, 2 skipped, 64 passed`
- Failing test:
- `tests/a11y/security.a11y.spec.ts:21:5`
- `Accessibility: Security security headers page has no critical a11y violations`
- Failure detail: `Test timeout of 90000ms exceeded` during axe analyze step.
### Remaining Vulnerabilities in Docker Image (Pre-existing, Unfixed Upstream)
Gate disposition: FAIL.
| Severity | CVE | Package | Version | Status |
|----------|-----|---------|---------|--------|
| HIGH | CVE-2026-31790 | libcrypto3, libssl3 | 3.5.5-r0 | Awaiting Alpine patch |
| Medium | CVE-2025-60876 | busybox | 1.37.0-r30 | Awaiting Alpine patch |
| Medium | GHSA-6jwv-w5xf-7j27 | go.etcd.io/bbolt | v1.4.3 | CrowdSec transitive dep |
| Unknown | CVE-2026-28387/28388/28389/28390/31789 | libcrypto3, libssl3 | 3.5.5-r0 | Awaiting Alpine NVD scoring + patch |
### 2) Frontend coverage gate (revalidated)
**Note**: CVE-2026-31790 (HIGH, OpenSSL) is a **new finding** not previously documented in SECURITY.md. It affects the Alpine 3.23.3 base image and has no fix available. It is **not introduced by this PR** — it would be present in any image built on Alpine 3.23.3. Recommend adding to SECURITY.md known vulnerabilities section.
Execution:
- `bash scripts/frontend-test-coverage.sh`
### govulncheck Findings (Backend Source — Pre-existing)
Result:
- Coverage run still fails with unhandled rejection.
- Blocking error remains present:
- `Error: ENOENT: no such file or directory, open '/projects/Charon/frontend/coverage/.tmp/coverage-132.json'`
- Run summary before abort:
- `Test Files 128 passed | 5 skipped (187)`
- `Tests 1918 passed | 90 skipped (2008)`
| ID | Module | Fixed In | Notes |
|----|--------|----------|-------|
| GO-2026-4887 (CVE-2026-34040) | docker/docker v28.5.2 | N/A | Already in SECURITY.md |
| GO-2026-4883 (CVE-2026-33997) | docker/docker v28.5.2 | N/A | Already in SECURITY.md |
Additional state:
- `frontend/coverage/lcov.info` and `frontend/coverage/coverage-summary.json` can exist despite the gate failure, but the command-level DoD gate remains FAIL due to the non-zero exit caused by the unhandled ENOENT.
---
Gate disposition: FAIL.
## Coverage Details
### 3) Pre-commit hooks + version-check gate (revalidated)
### Backend (Go)
Execution:
- `lefthook run pre-commit --all-files`
- `bash ./scripts/check-version-match-tag.sh`
- Statement coverage: **88.2%**
- Line coverage: **88.4%**
- Gate threshold: 87% — **PASSED**
Result:
- Pre-commit summary shows all required hooks completed successfully, including:
- `check-version-match`
- `golangci-lint-fast`
- `frontend-type-check`
- `frontend-lint`
- `semgrep`
- Version check output:
- `OK: .version matches latest Git tag v0.27.0`
### Frontend (React/TypeScript)
Gate disposition: PASS.
- Statements: **89.38%**
- Branches: **81.86%**
- Functions: **86.71%**
- Lines: **90.13%**
- Gate threshold: 87% — **PASSED**
### 4) SA1019 reconfirmation
### Patch Coverage
Execution:
- `cd backend && golangci-lint run ./... --enable-only staticcheck`
- Changed source lines: **0** (dependency-only update)
- Patch coverage: **100%**
Result:
- Output: `0 issues.`
- Additional grep for `SA1019`: no matches.
---
Conclusion: SA1019 remains resolved.
## E2E Test Details
### 5) Trivy FS reconfirmation for CVE-2026-34040
Tests executed against `charon:vuln-fix` container on `http://127.0.0.1:8080`:
Execution:
- `trivy fs --scanners vuln .`
| Browser | Shards | Passed | Skipped | Failed |
|---------|--------|--------|---------|--------|
| Firefox | 4/4 | 11 | 20 | 0 |
| Chromium | 1/4 (spot) | 8 | 0 | 0 |
Result:
- Exit status: `0`
- Output indicates scan completed with:
- `Number of language-specific files num=0`
- CVE lookup:
- No `CVE-2026-34040` match found in available Trivy JSON artifacts (`vuln-results.json`, `trivy-image-report.json`).
Skipped tests are from the security suite (separate project configuration). No test failures observed. The full 3-browser suite will run in CI.
Conclusion: CVE-2026-34040 not detected in current FS scan context.
---
## Local Patch Report Artifact Check
## GORM Scanner Details
Execution:
- `bash /projects/Charon/scripts/local-patch-report.sh`
- Scanned: 43 Go files (2401 lines)
- CRITICAL: 0
- HIGH: 0
- MEDIUM: 0
- INFO: 2 (missing indexes on `UserPermittedHost` foreign keys — pre-existing, non-blocking)
Result:
- Generated successfully in warn mode.
- Artifacts verified:
- `/projects/Charon/test-results/local-patch-report.md`
- `/projects/Charon/test-results/local-patch-report.json`
---
## Blocking Issues
## Recommendations
1. Playwright E2E accessibility suite has one failing security headers test (axe timeout).
2. Frontend coverage command still fails with ENOENT under `frontend/coverage/.tmp`.
1. **Add CVE-2026-31790 to SECURITY.md** — New HIGH OpenSSL vulnerability in Alpine base image. No fix available. Monitor Alpine security advisories.
2. **Monitor docker/docker module migration** — 2 govulncheck findings with no upstream fix. Track moby/moby/v2 stabilization.
3. **Monitor bbolt GHSA-6jwv-w5xf-7j27** — Medium severity in CrowdSec transitive dependency. Track CrowdSec updates.
4. **Full CI E2E suite** — Local validation passed on Firefox + Chromium spot check. The complete 3-browser suite should run in CI pipeline.
## Decision
---
Overall DoD decision for Issue #929: FAIL
## Conclusion
All audit gates **PASS**. The dependency-only changes successfully remediate all 5 HIGH and 3 MEDIUM vulnerability groups identified in the spec. No regressions detected in tests, type safety, linting, or security scans. The remaining HIGH finding (CVE-2026-31790) is a pre-existing Alpine base image issue unrelated to this PR.
**Verdict: Clear to merge.**
# QA Security Audit Report
| Field | Value |
|-------------|--------------------------------|
| **Date** | 2026-03-24 |
| **Image** | `charon:local` (Alpine 3.23.3) |
| **Go** | 1.26.1 |
| **Grype** | 0.110.0 |
| **Trivy** | 0.69.1 |
| **CodeQL** | Latest (SARIF v2.1.0) |
---
## Executive Summary
The current `charon:local` image built on 2026-03-24 shows a significantly improved
security posture compared to the CI baseline. Three previously tracked SECURITY.md
vulnerabilities are now **resolved** due to Go 1.26.1 compilation and Alpine package
updates. Two new medium/low findings emerged. No CRITICAL or HIGH active
vulnerabilities remain in the unignored scan results.
| Category | Critical | High | Medium | Low | Total |
|------------------------|----------|------|--------|-----|-------|
| **Active (unignored)** | 0 | 0 | 4 | 2 | 6 |
| **Ignored (documented)**| 0 | 4 | 0 | 0 | 4 |
| **Resolved since last audit** | 1 | 4 | 1 | 0 | 6 |
---
## Scans Executed
| # | Scan | Tool | Result |
|---|-------------------------------|-----------|----------------------|
| 1 | Trivy Filesystem | Trivy | 0 findings (no lang-specific files detected) |
| 2 | Docker Image (SBOM + Grype) | Syft/Grype| 6 active, 8 ignored |
| 3 | Trivy Image Report | Trivy | 1 HIGH (stale Feb 25 report; resolved in current build) |
| 4 | CodeQL Go | CodeQL | 1 finding (false positive — see below) |
| 5 | CodeQL JavaScript | CodeQL | 0 findings |
| 6 | GORM Security Scanner | Custom | PASSED (0 issues, 2 info) |
| 7 | Lefthook / Pre-commit | Lefthook | Configured (project uses `lefthook.yml`, not `.pre-commit-config.yaml`) |
---
## Active Findings (Unignored)
### CVE-2025-60876 — BusyBox wget HTTP Request Smuggling
| Field | Value |
|------------------|-------|
| **Severity** | Medium (CVSS 6.5) |
| **Package** | `busybox` 1.37.0-r30 (Alpine APK) |
| **Affected** | `busybox`, `busybox-binsh`, `busybox-extras`, `ssl_client` (4 matches) |
| **Fix Available** | No |
| **Classification** | AWAITING UPSTREAM |
| **EPSS** | 0.00064 (0.20 percentile) |
**Description**: BusyBox wget through 1.37 accepts raw CR/LF and other C0 control bytes
in the HTTP request-target, allowing request line splitting and header injection (CWE-284).
**Risk Assessment**: Low practical risk. Charon does not invoke `busybox wget` in its
application logic. The vulnerable `wget` applet would need to be manually invoked inside
the container with attacker-controlled URLs.
**Remediation**: Monitor Alpine 3.23 for a patched `busybox` APK. No action required
until upstream ships a fix.
---
### CVE-2026-26958 / GHSA-fw7p-63qq-7hpr — edwards25519 MultiScalarMult Invalid Results
| Field | Value |
|------------------|-------|
| **Severity** | Low (CVSS 1.7) |
| **Package** | `filippo.io/edwards25519` v1.1.0 |
| **Location** | CrowdSec binaries (`/usr/local/bin/crowdsec`, `/usr/local/bin/cscli`) |
| **Fix Available** | v1.1.1 |
| **Classification** | AWAITING UPSTREAM |
| **EPSS** | 0.00018 (0.04 percentile) |
**Description**: `MultiScalarMult` produces invalid results or undefined behavior if
the receiver is not the identity point. This is a rarely used, advanced API.
**Risk Assessment**: Minimal. CrowdSec does not directly expose edwards25519
`MultiScalarMult` to external input. The fix exists at v1.1.1 but requires CrowdSec
to rebuild with the updated dependency.
**Remediation**: Awaiting CrowdSec upstream release with updated dependency. No
action available for Charon maintainers.
---
## Ignored Findings (Documented with Justification)
These findings are suppressed in the Grype configuration with documented risk
acceptance rationale. All are in third-party binaries bundled in the container;
none are in Charon's own code.
### CVE-2026-2673 — OpenSSL TLS 1.3 Key Exchange Group Downgrade
| Field | Value |
|------------------|-------|
| **Severity** | High (CVSS 7.5) |
| **Package** | `libcrypto3` / `libssl3` 3.5.5-r0 |
| **Matches** | 2 (libcrypto3, libssl3) |
| **Classification** | ALREADY DOCUMENTED · AWAITING UPSTREAM |
Charon terminates TLS at the Caddy layer; the Go backend does not act as a raw
TLS 1.3 server. Alpine 3.23 still ships 3.5.5-r0. Risk accepted pending Alpine patch.
---
### GHSA-6g7g-w4f8-9c9x — DoS in buger/jsonparser (CrowdSec)
| Field | Value |
|------------------|-------|
| **Severity** | High (CVSS 7.5) |
| **Package** | `github.com/buger/jsonparser` v1.1.1 |
| **Matches** | 2 (crowdsec, cscli binaries) |
| **Fix Available** | v1.1.2 |
| **Classification** | ALREADY DOCUMENTED · AWAITING UPSTREAM |
Charon does not use this package directly. The vector requires reaching CrowdSec's
internal JSON processing pipeline. Risk accepted pending CrowdSec upstream fix.
---
### GHSA-jqcq-xjh3-6g23 / GHSA-x6gf-mpr2-68h6 / CVE-2026-4427 — DoS in pgproto3/v2 (CrowdSec)
| Field | Value |
|------------------|-------|
| **Severity** | High (CVSS 7.5) |
| **Package** | `github.com/jackc/pgproto3/v2` v2.3.3 |
| **Matches** | 4 (2 GHSAs × 2 binaries) |
| **Fix Available** | No (v2 is archived/EOL) |
| **Classification** | ALREADY DOCUMENTED · AWAITING UPSTREAM |
pgproto3/v2 is archived with no fix planned. CrowdSec must migrate to pgx/v5.
Charon uses SQLite, not PostgreSQL; this code path is unreachable in standard
deployment.
---
## Resolved Findings (Since Last SECURITY.md Update)
The following vulnerabilities documented in SECURITY.md are no longer detected in the
current image build. **SECURITY.md should be updated to move these to "Patched
Vulnerabilities".**
### CVE-2025-68121 — Go Stdlib Critical in CrowdSec (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | Critical |
| **Resolution** | CrowdSec binaries now compiled with Go 1.26.1 (was Go 1.25.6) |
| **Verified** | Not detected in Grype scan of current image |
---
### CHARON-2025-001 — CrowdSec Go Stdlib CVE Cluster (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | High |
| **Aliases** | CVE-2025-58183, CVE-2025-58186, CVE-2025-58187, CVE-2025-61729, CVE-2026-25679, CVE-2025-61732, CVE-2026-27142, CVE-2026-27139 |
| **Resolution** | CrowdSec binaries now compiled with Go 1.26.1 |
| **Verified** | None of the aliased CVEs detected in Grype scan |
---
### CVE-2026-27171 — zlib CPU Exhaustion (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | Medium |
| **Resolution** | Alpine now ships `zlib` 1.3.2-r0 (fix threshold: 1.3.2) |
| **Verified** | Not detected in Grype scan; zlib 1.3.2-r0 confirmed in SBOM |
---
### CVE-2026-33186 — gRPC-Go Authorization Bypass (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | Critical |
| **Packages** | `google.golang.org/grpc` v1.74.2 (CrowdSec), v1.79.1 (Caddy) |
| **Resolution** | Upstream releases now include patched gRPC (>= v1.79.3) |
| **Verified** | Not detected in Grype scan; ignore rule present but no match |
---
### GHSA-69x3-g4r3-p962 / CVE-2026-25793 — Nebula ECDSA Malleability (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | High |
| **Package** | `github.com/slackhq/nebula` v1.9.7 in Caddy |
| **Resolution** | Caddy now ships with nebula >= v1.10.3 |
| **Verified** | Not detected in Grype scan; Trivy image report from Feb 25 had this but current build does not |
> **Note**: The stale Trivy image report (`trivy-image-report.json`, dated 2026-02-25) still
> shows CVE-2026-25793. This report predates the current build and should be regenerated.
---
### GHSA-479m-364c-43vc — goxmldsig XML Signature Bypass (RESOLVED)
| Field | Value |
|------------------|-------|
| **Previous Severity** | High |
| **Package** | `github.com/russellhaering/goxmldsig` v1.5.0 in Caddy |
| **Resolution** | Caddy now ships with goxmldsig >= v1.6.0 |
| **Verified** | Not detected in Grype scan; ignore rule present but no match |
---
## CodeQL Analysis
### go/cookie-secure-not-set — FALSE POSITIVE
| Field | Value |
|------------------|-------|
| **Severity** | Medium (CodeQL) |
| **File** | `backend/internal/api/handlers/auth_handler.go:152` |
| **Classification** | FALSE POSITIVE (stale SARIF) |
**Finding**: CodeQL reports "Cookie does not set Secure attribute to true" at line 152.
**Verification**: The `setSecureCookie` function at lines 148-156 calls `c.SetCookie()`
with `secure: true` (the 6th positional argument). The Secure attribute IS set correctly.
This SARIF was generated from a previous code version and does not reflect the current
source. **The CodeQL SARIF files should be regenerated.**
### JavaScript / JS
No findings. Both `codeql-results-javascript.sarif` and `codeql-results-js.sarif` contain
0 results.
---
## GORM Security Scanner
| Metric | Value |
|------------|-------|
| **Result** | PASSED |
| **Files** | 43 Go files (2,396 lines) |
| **Critical** | 0 |
| **High** | 0 |
| **Medium** | 0 |
| **Info** | 2 (missing indexes on foreign keys in `UserPermittedHost`) |
The 2 informational suggestions (`UserID` and `ProxyHostID` missing `gorm:"index"` in
`backend/internal/models/user.go:130-131`) are performance recommendations, not security
issues. They do not block this audit.
---
## CI vs Local Scan Discrepancy
The CI reported **3 Critical, 5 High, 1 Medium**. The local scan on the freshly built
image reports **0 Critical, 0 High, 4 Medium, 2 Low** (active) plus **4 High** (ignored).
**Root causes for the discrepancy:**
1. **Resolved vulnerabilities**: 3 Critical and 4 High findings were resolved by Go 1.26.1
compilation and upstream Caddy/CrowdSec dependency updates since the CI image was built.
2. **Grype ignore rules**: The local scan applies documented risk acceptance rules that
suppress 4 High findings in third-party binaries. CI (Trivy) does not use these rules.
3. **Stale CI artifacts**: The `trivy-image-report.json` dates from 2026-02-25 and does
not reflect the current image state. The `codeql-results-go.sarif` references code that
has since been fixed.
---
## Recommended Actions
### Immediate (This Sprint)
1. **Update SECURITY.md**: Move CVE-2025-68121, CHARON-2025-001, and CVE-2026-27171 to
a "Patched Vulnerabilities" section. Add CVE-2025-60876 and CVE-2026-26958 as new
known vulnerabilities.
2. **Regenerate stale scan artifacts**: Re-run Trivy image scan and CodeQL analysis to
produce current SARIF/JSON files. The existing files predate fixes and produce
misleading CI results.
3. **Clean up Grype ignore rules**: Remove ignore entries for vulnerabilities that are
no longer detected (CVE-2026-33186, GHSA-69x3-g4r3-p962, GHSA-479m-364c-43vc).
Stale ignore rules obscure the actual security posture.
### Next Release
4. **Monitor Alpine APK updates**: Watch for patched `busybox` (CVE-2025-60876) and
`openssl` (CVE-2026-2673) packages in Alpine 3.23.
5. **Monitor CrowdSec releases**: Watch for CrowdSec builds with updated
`filippo.io/edwards25519` >= v1.1.1, `buger/jsonparser` >= v1.1.2, and
`pgx/v5` migration (replacing pgproto3/v2).
6. **Monitor Go 1.26.2-alpine**: When available, bump `GO_VERSION` to pick up any
remaining stdlib patches.
### Informational (Non-Blocking)
7. **GORM indexes**: Consider adding `gorm:"index"` to `UserID` and `ProxyHostID` in
`UserPermittedHost` for query performance.
---
## Gotify Token Review
Verified: No Gotify application tokens appear in scan output, log artifacts, test results,
API examples, or URL query parameters. All diagnostic output is clean.
---
## Conclusion
The Charon container image security posture has materially improved. Six previously known
vulnerabilities are now resolved through Go toolchain and dependency updates. The remaining
active findings are medium/low severity, reside in Alpine base packages and CrowdSec
third-party binaries, and have no available fixes. No vulnerabilities exist in Charon's
own application code. GORM and CodeQL scans confirm the backend code is clean.
Promotion recommendation: keep blocked until both failing mandatory gates are green on rerun.


@@ -0,0 +1,226 @@
# QA Audit Report — CrowdSec IP Whitelist Management
**Feature Branch**: `feature/beta-release`
**Pull Request**: #952
**Repository**: Wikid82/Charon
**Audit Date**: 2026-04-16
**Auditor**: QA Security Agent
---
## Overall Verdict
### APPROVED WITH CONDITIONS
The CrowdSec IP Whitelist Management feature passes all critical quality and security gates. All feature-specific E2E tests pass across three browsers. Backend and frontend coverage exceed thresholds. No security vulnerabilities were found in the feature code. Two upstream HIGH CVEs in the Docker image and below-threshold overall patch coverage require tracking but do not block the release.
---
## Gate Results Summary
| # | Gate | Result | Detail |
|---|------|--------|--------|
| 1 | Playwright E2E | **PASS** | All CrowdSec whitelist tests passed; 14 pre-existing failures (unrelated) |
| 2 | Go Backend Coverage | **PASS** | 88.4% line coverage (threshold: 87%) |
| 3 | Frontend Coverage | **PASS** | 90.06% line coverage (threshold: 87%) |
| 4 | Patch Coverage | **WARN** | Overall 89.4% (threshold: 90%); Backend 88.0% PASS; Frontend 97.0% PASS |
| 5 | TypeScript Type Check | **PASS** | `npx tsc --noEmit` — 0 errors |
| 6 | Lefthook (Lint/Format) | **PASS** | All 6 hooks green |
| 7 | GORM Security Scan | **PASS** | 0 CRITICAL/HIGH/MEDIUM issues |
| 8 | Trivy Filesystem Scan | **PASS** | 0 CRITICAL/HIGH vulnerabilities |
| 9 | Trivy Docker Image Scan | **WARN** | 2 unique HIGH CVEs (upstream dependencies) |
| 10 | CodeQL SARIF Review | **PASS** | 1 pre-existing Go finding; 0 JS findings; 0 whitelist-related |
---
## Detailed Gate Analysis
### Gate 1 — Playwright E2E Tests
**Result**: PASS
**Browsers**: Chromium, Firefox, WebKit (all three)
**CrowdSec Whitelist-Specific Tests (10 tests)**: All PASSED
- Add whitelist entry with valid IP
- Add whitelist entry with valid CIDR
- Reject invalid IP/CIDR input
- Reject duplicate entry
- Delete whitelist entry
- Display whitelist table with entries
- Empty state display
- Whitelist tab visibility (local mode only)
- Form validation and error handling
- Toast notification on success/failure
**Pre-existing Failures (14 unique, unrelated to this feature)**:
- Certificate deletion tests (7): cert delete/bulk-delete operations
- Caddy import tests (3): conflict details, server detection, resolution
- Navigation test (1): main navigation item count
- User management tests (2): invite link copy, keyboard navigation
- Integration test (1): system health check
None of the pre-existing failures are related to the CrowdSec whitelist feature.
### Gate 2 — Go Backend Coverage
**Result**: PASS
**Coverage**: 88.4% line coverage
**Threshold**: 87%
### Gate 3 — Frontend Coverage
**Result**: PASS
**Coverage**: 90.06% line coverage (Statements: 89.03%, Branches: 85.84%, Functions: 85.85%)
**Threshold**: 87%
5 pre-existing test timeouts in `ProxyHostForm-dns.test.tsx` and `ProxyHostForm-dropdown-changes.test.tsx` — not whitelist-related.
### Gate 4 — Patch Coverage
**Result**: WARN (non-blocking)
| Scope | Changed Lines | Covered | Patch % | Status |
|-------|--------------|---------|---------|--------|
| Overall | 1689 | 1510 | 89.4% | WARN (threshold: 90%) |
| Backend | 1426 | 1255 | 88.0% | PASS (threshold: 85%) |
| Frontend | 263 | 255 | 97.0% | PASS (threshold: 85%) |
**CrowdSec-Specific Patch Coverage**:
- `crowdsec_handler.go`: 71.2% — 17 uncovered changed lines (error-handling branches)
- `crowdsec_whitelist_service.go`: 83.6% — 18 uncovered changed lines (YAML write failure, edge cases)
- `CrowdSecConfig.tsx`: 93.3% — 2 uncovered changed lines
**Recommendation**: Add targeted unit tests for error-handling branches in `crowdsec_handler.go` (lines 2712-2772) and `crowdsec_whitelist_service.go` (lines 47-148) to bring CrowdSec-specific patch coverage above 90%. This is tracked as a follow-up improvement and does not block release.
### Gate 5 — TypeScript Type Check
**Result**: PASS
`npx tsc --noEmit` from `frontend/` completed with 0 errors.
### Gate 6 — Lefthook (Lint/Format)
**Result**: PASS
All 6 hooks passed:
- `go-fmt`
- `go-vet`
- `go-staticcheck`
- `eslint`
- `prettier-check`
- `tsc-check`
### Gate 7 — GORM Security Scan
**Result**: PASS
`./scripts/scan-gorm-security.sh --check` — 0 CRITICAL, 0 HIGH, 0 MEDIUM issues.
No exposed IDs, secrets, or DTO embedding violations in CrowdSec whitelist models.
### Gate 8 — Trivy Filesystem Scan
**Result**: PASS
`trivy fs --scanners vuln --severity CRITICAL,HIGH --format table .` — 0 vulnerabilities detected in application source and dependencies.
### Gate 9 — Trivy Docker Image Scan
**Result**: WARN (non-blocking for this feature)
Image: `charon:local` (Alpine 3.23.3)
| CVE | Severity | Package | Installed | Fixed | Component |
|-----|----------|---------|-----------|-------|-----------|
| CVE-2026-34040 | HIGH | github.com/docker/docker | v28.5.2+incompatible | 29.3.1 | Charon Go binary (Moby authorization bypass) |
| CVE-2026-32286 | HIGH | github.com/jackc/pgproto3/v2 | v2.3.3 | No fix | CrowdSec binaries (PostgreSQL protocol DoS) |
**Analysis**:
- CVE-2026-34040: Moby authorization bypass — affects Docker API access control. Charon does not expose Docker API to untrusted networks. Low practical risk. Update `github.com/docker/docker` to v29.3.1 when available.
- CVE-2026-32286: PostgreSQL protocol DoS — present only in CrowdSec's `crowdsec` and `cscli` binaries, not in Charon's own code. Awaiting upstream fix from CrowdSec.
**Recommendation**: Track both CVEs for remediation. Neither impacts CrowdSec whitelist management functionality or Charon's own security posture directly.
### Gate 10 — CodeQL SARIF Review
**Result**: PASS
- **Go**: 1 pre-existing finding — `cookie-secure-not-set` at `auth_handler.go:152`. Not whitelist-related. Tracked separately.
- **JavaScript**: 0 findings.
- **CrowdSec whitelist**: 0 findings across both Go and JavaScript.
---
## Security Review — CrowdSec IP Whitelist Feature
### 1. IP/CIDR Input Validation
**Status**: SECURE
The `normalizeIPOrCIDR()` function in `crowdsec_whitelist_service.go` uses Go standard library functions `net.ParseIP()` and `net.ParseCIDR()` for validation. Invalid inputs are rejected with the sentinel error `ErrInvalidIPOrCIDR`. No user input passes through without validation.
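The validation pattern described above can be sketched as follows; this is a hypothetical reconstruction of `normalizeIPOrCIDR()` built only from the standard-library calls the report names, not the actual service code.

```go
package main

import (
	"errors"
	"fmt"
	"net"
)

// ErrInvalidIPOrCIDR is the sentinel error the report mentions.
var ErrInvalidIPOrCIDR = errors.New("invalid IP or CIDR")

// normalizeIPOrCIDR accepts a bare IP or a CIDR range and returns the
// canonical string form; anything else is rejected with the sentinel error.
func normalizeIPOrCIDR(s string) (string, error) {
	if ip := net.ParseIP(s); ip != nil {
		return ip.String(), nil
	}
	if _, ipnet, err := net.ParseCIDR(s); err == nil {
		return ipnet.String(), nil // masked to the network address
	}
	return "", ErrInvalidIPOrCIDR
}

func main() {
	for _, in := range []string{"192.168.1.1", "10.0.0.0/8", "not-an-ip"} {
		out, err := normalizeIPOrCIDR(in)
		fmt.Println(in, "->", out, err)
	}
}
```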
### 2. YAML Injection Prevention
**Status**: SECURE
`buildWhitelistYAML()` uses a `strings.Builder` to construct YAML output. Only IP addresses and CIDR ranges that have already passed `normalizeIPOrCIDR()` validation are included. The normalized output from `net.ParseIP`/`net.ParseCIDR` cannot contain YAML metacharacters.
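A minimal sketch of the builder pattern: because entries have already passed `normalizeIPOrCIDR()`, they contain only digits, dots, colons, and a slash, so no escaping is required. The YAML key names below are illustrative, not the actual CrowdSec whitelist schema.

```go
package main

import (
	"fmt"
	"strings"
)

// buildWhitelistYAML assembles YAML from pre-validated entries only.
func buildWhitelistYAML(entries []string) string {
	var b strings.Builder
	b.WriteString("whitelist:\n  ip:\n")
	for _, e := range entries {
		b.WriteString("    - " + e + "\n")
	}
	return b.String()
}

func main() {
	fmt.Print(buildWhitelistYAML([]string{"192.168.1.1", "10.0.0.0/8"}))
}
```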
### 3. Path Traversal Protection
**Status**: SECURE
`WriteYAML()` uses hardcoded file paths (no user input in path construction). Atomic write pattern: writes to `.tmp` suffix, then `os.Rename()` to final path. No directory traversal vectors.
### 4. SQL Injection Prevention
**Status**: SECURE
All GORM queries use parameterized operations:
- `Where("uuid = ?", id)` for delete
- `Where("ip_or_cidr = ?", normalized)` for duplicate check
- Standard GORM `Create()` for inserts
No raw SQL or string concatenation.
### 5. Authentication & Authorization
**Status**: SECURE
All whitelist routes are registered under the admin route group in `routes.go`, which is protected by:
- Cerberus middleware (authentication/authorization enforcement)
- Emergency bypass middleware (for recovery scenarios only)
- Security headers and gzip middleware
No unauthenticated access to whitelist endpoints is possible.
### 6. Log Safety
**Status**: SECURE
Whitelist service logs only operational error context (e.g., "failed to write CrowdSec whitelist YAML after add"). No IP addresses, user data, or PII are written to logs. Other handler code uses `util.SanitizeForLog()` for user-controlled input in log messages.
---
## Conditions for Approval
These items are tracked as follow-up improvements and do not block merge:
1. **Patch Coverage Improvement**: Add targeted unit tests for error-handling branches in:
- `crowdsec_handler.go` (lines 2712-2772, 71.2% patch coverage)
- `crowdsec_whitelist_service.go` (lines 47-148, 83.6% patch coverage)
2. **Upstream CVE Tracking**:
- CVE-2026-34040: Update `github.com/docker/docker` to v29.3.1 when Go module is available
- CVE-2026-32286: Monitor CrowdSec upstream for `pgproto3` fix
3. **Pre-existing Test Failures**: 14 pre-existing E2E test failures (certificate deletion, caddy import, navigation, user management) should be tracked in existing issues. None are regressions from this feature.
---
## Artifacts
| Artifact | Location |
|----------|----------|
| Playwright HTML Report | `playwright-report/index.html` |
| Backend Coverage | `backend/coverage.txt` |
| Frontend Coverage | `frontend/coverage/lcov.info`, `frontend/coverage/coverage-summary.json` |
| Patch Coverage Report | `test-results/local-patch-report.md`, `test-results/local-patch-report.json` |
| GORM Security Scan | Inline (0 findings) |
| Trivy Filesystem Scan | Inline (0 findings) |
| Trivy Image Scan | `trivy-image-report.json` |
| CodeQL Go SARIF | `codeql-results-go.sarif` |
| CodeQL JS SARIF | `codeql-results-javascript.sarif` |


@@ -0,0 +1,47 @@
# QA Audit Report — PR #928: CI Test Fix
**Date:** 2026-04-13
**Scope:** Targeted fix for two CI test failures across three files
**Auditor:** QA Security Agent
## Modified Files
| File | Change |
|---|---|
| `backend/internal/api/handlers/certificate_handler.go` | Added `key_file` validation guard for non-PFX uploads |
| `backend/internal/api/handlers/certificate_handler_test.go` | Added `TestCertificateHandler_Upload_MissingKeyFile_MultipartWithCert` test |
| `backend/internal/services/certificate_service_coverage_test.go` | Removed duplicate `ALTER TABLE` in `TestCertificateService_MigratePrivateKeys` |
## Check Results
| # | Check | Result | Details |
|---|---|---|---|
| 1 | **Backend Build** (`go build ./...`) | **PASS** | Clean build, no errors |
| 2a | **Targeted Test 1** (`TestCertificateHandler_Upload_MissingKeyFile_MultipartWithCert`) | **PASS** | Previously failing, now passes |
| 2b | **Targeted Test 2** (`TestCertificateService_MigratePrivateKeys`) | **PASS** | Previously failing (duplicate ALTER TABLE), now passes — all 3 subtests pass |
| 3a | **Regression: Handler Suite** (`TestCertificateHandler*`, 37 tests) | **PASS** | All 37 tests pass in 1.27s |
| 3b | **Regression: Service Suite** (`TestCertificateService*`) | **PASS** | All tests pass in 2.96s |
| 4 | **Go Vet** (`go vet ./...`) | **PASS** | No issues |
| 5 | **Golangci-lint** (handlers + services) | **PASS** | All warnings are pre-existing in unmodified files (crowdsec, audit_log). Zero new lint issues in modified files |
| 6 | **Security Review** | **PASS** | See analysis below |
## Security Analysis
The handler change (lines 169-175 of `certificate_handler.go`) was reviewed for:
| Vector | Assessment |
|---|---|
| **Injection** | `services.DetectFormat()` operates on already-read `certBytes` (bounded by `maxFileSize` = 1MB via `io.LimitReader`). No additional I/O or shell invocation. |
| **Information Disclosure** | Returns a static error string `"key_file is required for PEM/DER certificate uploads"`. No user-controlled data reflected in the response. |
| **Auth Bypass** | Route `POST /certificates` is registered inside the `management` group (confirmed at `routes.go:696`), which requires authentication. The guard is additive — it rejects earlier, not later. |
| **DoS** | No new allocations or expensive operations. `DetectFormat` is a simple byte-header check on data already in memory. |
**Verdict:** No new attack surface introduced. The change is a pure input validation tightening.
## Warnings
- **Pre-existing lint warnings** exist in unmodified files (`crowdsec_handler.go`, `crowdsec_*_test.go`, `audit_log_handler_test.go`). These are tracked separately and are not related to this PR.
## Final Verdict
**PASS** — All six checks pass. Both previously failing CI tests now succeed. No regressions detected in the broader handler and service suites. No security concerns with the changes.

File diff suppressed because it is too large.


@@ -33,20 +33,20 @@
"@radix-ui/react-select": "^2.2.6",
"@radix-ui/react-tabs": "^1.1.13",
"@radix-ui/react-tooltip": "^1.2.8",
"@tanstack/react-query": "^5.97.0",
"axios": "1.15.0",
"@tanstack/react-query": "^5.99.2",
"axios": "1.15.2",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"date-fns": "^4.1.0",
"i18next": "^26.0.4",
"i18next": "^26.0.6",
"i18next-browser-languagedetector": "^8.2.1",
"lucide-react": "^1.8.0",
"react": "^19.2.5",
"react-dom": "^19.2.5",
"react-hook-form": "^7.72.1",
"react-hook-form": "^7.73.1",
"react-hot-toast": "^2.6.0",
"react-i18next": "^17.0.2",
"react-router-dom": "^7.14.0",
"react-i18next": "^17.0.4",
"react-router-dom": "^7.14.2",
"recharts": "^3.8.1",
"tailwind-merge": "^3.5.0",
"tldts": "^7.0.28"
@@ -57,7 +57,7 @@
"@eslint/json": "^1.2.0",
"@eslint/markdown": "^8.0.1",
"@playwright/test": "^1.59.1",
"@tailwindcss/postcss": "^4.2.2",
"@tailwindcss/postcss": "^4.2.4",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@testing-library/user-event": "^14.6.1",
@@ -65,52 +65,52 @@
"@types/node": "^25.6.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"@typescript-eslint/eslint-plugin": "^8.58.1",
"@typescript-eslint/parser": "^8.58.1",
"@typescript-eslint/utils": "^8.58.1",
"@typescript-eslint/eslint-plugin": "^8.59.0",
"@typescript-eslint/parser": "^8.59.0",
"@typescript-eslint/utils": "^8.59.0",
"@vitejs/plugin-react": "^6.0.1",
"@vitest/coverage-istanbul": "^4.1.4",
"@vitest/coverage-v8": "^4.1.4",
"@vitest/eslint-plugin": "^1.6.15",
"@vitest/ui": "^4.1.4",
"autoprefixer": "^10.4.27",
"eslint": "^10.2.0",
"@vitest/coverage-istanbul": "^4.1.5",
"@vitest/coverage-v8": "^4.1.5",
"@vitest/eslint-plugin": "^1.6.16",
"@vitest/ui": "^4.1.5",
"autoprefixer": "^10.5.0",
"eslint": "^10.2.1",
"eslint-import-resolver-typescript": "^4.4.4",
"eslint-plugin-import-x": "^4.16.2",
"eslint-plugin-jsx-a11y": "^6.10.2",
"eslint-plugin-no-unsanitized": "^4.1.5",
"eslint-plugin-promise": "^7.2.1",
"eslint-plugin-react-compiler": "^19.1.0-rc.2",
"eslint-plugin-react-hooks": "^7.0.1",
"eslint-plugin-react-hooks": "^7.1.1",
"eslint-plugin-react-refresh": "^0.5.2",
"eslint-plugin-security": "^4.0.0",
"eslint-plugin-sonarjs": "^4.0.2",
"eslint-plugin-sonarjs": "^4.0.3",
"eslint-plugin-testing-library": "^7.16.2",
"eslint-plugin-unicorn": "^64.0.0",
"eslint-plugin-unused-imports": "^4.4.1",
"jsdom": "29.0.2",
"knip": "^6.3.1",
"postcss": "^8.5.9",
"tailwindcss": "^4.2.2",
"typescript": "^6.0.2",
"typescript-eslint": "^8.58.1",
"vite": "^8.0.8",
"vitest": "^4.1.4",
"knip": "^6.6.0",
"postcss": "^8.5.10",
"tailwindcss": "^4.2.4",
"typescript": "^6.0.3",
"typescript-eslint": "^8.59.0",
"vite": "^8.0.9",
"vitest": "^4.1.5",
"zod-validation-error": "^5.0.0"
},
"overrides": {
"typescript": "^6.0.2",
"typescript": "^6.0.3",
"eslint-plugin-react-hooks": {
"eslint": "^10.2.0"
"eslint": "^10.2.1"
},
"eslint-plugin-jsx-a11y": {
"eslint": "^10.2.0"
"eslint": "^10.2.1"
},
"eslint-plugin-promise": {
"eslint": "^10.2.0"
"eslint": "^10.2.1"
},
"@vitejs/plugin-react": {
"vite": "8.0.8"
"vite": "8.0.9"
}
}
}


@@ -1,12 +1,23 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
import { getCertificates, uploadCertificate, deleteCertificate, type Certificate } from '../certificates';
import {
getCertificates,
getCertificateDetail,
uploadCertificate,
updateCertificate,
deleteCertificate,
exportCertificate,
validateCertificate,
type Certificate,
type CertificateDetail,
} from '../certificates';
import client from '../client';
vi.mock('../client', () => ({
default: {
get: vi.fn(),
post: vi.fn(),
put: vi.fn(),
delete: vi.fn(),
},
}));
@@ -17,12 +28,14 @@ describe('certificates API', () => {
});
const mockCert: Certificate = {
id: 1,
domain: 'example.com',
uuid: 'abc-123',
domains: 'example.com',
issuer: 'Let\'s Encrypt',
expires_at: '2023-01-01',
status: 'valid',
provider: 'letsencrypt',
has_key: true,
in_use: false,
};
it('getCertificates calls client.get', async () => {
@@ -47,7 +60,76 @@ describe('certificates API', () => {
it('deleteCertificate calls client.delete', async () => {
vi.mocked(client.delete).mockResolvedValue({ data: {} });
await deleteCertificate(1);
expect(client.delete).toHaveBeenCalledWith('/certificates/1');
await deleteCertificate('abc-123');
expect(client.delete).toHaveBeenCalledWith('/certificates/abc-123');
});
it('getCertificateDetail calls client.get with uuid', async () => {
const detail: CertificateDetail = {
...mockCert,
assigned_hosts: [],
chain: [],
auto_renew: false,
created_at: '2023-01-01',
updated_at: '2023-01-01',
};
vi.mocked(client.get).mockResolvedValue({ data: detail });
const result = await getCertificateDetail('abc-123');
expect(client.get).toHaveBeenCalledWith('/certificates/abc-123');
expect(result).toEqual(detail);
});
it('updateCertificate calls client.put with name', async () => {
vi.mocked(client.put).mockResolvedValue({ data: mockCert });
const result = await updateCertificate('abc-123', 'New Name');
expect(client.put).toHaveBeenCalledWith('/certificates/abc-123', { name: 'New Name' });
expect(result).toEqual(mockCert);
});
it('exportCertificate calls client.post with blob response type', async () => {
const blob = new Blob(['data']);
vi.mocked(client.post).mockResolvedValue({ data: blob });
const result = await exportCertificate('abc-123', 'pem', true, 'pass', 'pfx-pass');
expect(client.post).toHaveBeenCalledWith(
'/certificates/abc-123/export',
{ format: 'pem', include_key: true, password: 'pass', pfx_password: 'pfx-pass' },
{ responseType: 'blob' },
);
expect(result).toEqual(blob);
});
it('validateCertificate calls client.post with FormData', async () => {
const validation = { valid: true, common_name: 'example.com', domains: ['example.com'], issuer_org: 'LE', expires_at: '2024-01-01', key_match: true, chain_valid: true, chain_depth: 1, warnings: [], errors: [] };
vi.mocked(client.post).mockResolvedValue({ data: validation });
const certFile = new File(['cert'], 'cert.pem', { type: 'text/plain' });
const keyFile = new File(['key'], 'key.pem', { type: 'text/plain' });
const result = await validateCertificate(certFile, keyFile);
expect(client.post).toHaveBeenCalledWith('/certificates/validate', expect.any(FormData), {
headers: { 'Content-Type': 'multipart/form-data' },
});
expect(result).toEqual(validation);
});
it('uploadCertificate includes chain file when provided', async () => {
vi.mocked(client.post).mockResolvedValue({ data: mockCert });
const certFile = new File(['cert'], 'cert.pem');
const keyFile = new File(['key'], 'key.pem');
const chainFile = new File(['chain'], 'chain.pem');
await uploadCertificate('My Cert', certFile, keyFile, chainFile);
const formData = vi.mocked(client.post).mock.calls[0][1] as FormData;
expect(formData.get('chain_file')).toBeTruthy();
});
it('validateCertificate includes chain file when provided', async () => {
vi.mocked(client.post).mockResolvedValue({ data: {} });
const certFile = new File(['cert'], 'cert.pem');
const chainFile = new File(['chain'], 'chain.pem');
await validateCertificate(certFile, undefined, chainFile);
const formData = vi.mocked(client.post).mock.calls[0][1] as FormData;
expect(formData.get('chain_file')).toBeTruthy();
expect(formData.get('key_file')).toBeNull();
});
});


@@ -116,6 +116,120 @@ describe('crowdsec API', () => {
})
})
describe('listCrowdsecDecisions', () => {
it('should call GET /admin/crowdsec/decisions and return data', async () => {
const mockData = {
decisions: [
{ id: '1', ip: '1.2.3.4', reason: 'bot', duration: '24h', created_at: '2024-01-01T00:00:00Z', source: 'crowdsec' },
],
}
vi.mocked(client.get).mockResolvedValue({ data: mockData })
const result = await crowdsec.listCrowdsecDecisions()
expect(client.get).toHaveBeenCalledWith('/admin/crowdsec/decisions')
expect(result).toEqual(mockData)
})
})
describe('banIP', () => {
it('should call POST /admin/crowdsec/ban with ip, duration, and reason', async () => {
vi.mocked(client.post).mockResolvedValue({})
await crowdsec.banIP('1.2.3.4', '24h', 'manual ban')
expect(client.post).toHaveBeenCalledWith('/admin/crowdsec/ban', {
ip: '1.2.3.4',
duration: '24h',
reason: 'manual ban',
})
})
})
describe('unbanIP', () => {
it('should call DELETE /admin/crowdsec/ban/{encoded ip}', async () => {
vi.mocked(client.delete).mockResolvedValue({})
await crowdsec.unbanIP('1.2.3.4')
expect(client.delete).toHaveBeenCalledWith('/admin/crowdsec/ban/1.2.3.4')
})
it('should URL-encode special characters in the IP', async () => {
vi.mocked(client.delete).mockResolvedValue({})
await crowdsec.unbanIP('::1')
expect(client.delete).toHaveBeenCalledWith('/admin/crowdsec/ban/%3A%3A1')
})
})
describe('getCrowdsecKeyStatus', () => {
it('should call GET /admin/crowdsec/key-status and return data', async () => {
const mockData = {
key_source: 'file' as const,
env_key_rejected: false,
current_key_preview: 'abc***xyz',
message: 'Key loaded from file',
}
vi.mocked(client.get).mockResolvedValue({ data: mockData })
const result = await crowdsec.getCrowdsecKeyStatus()
expect(client.get).toHaveBeenCalledWith('/admin/crowdsec/key-status')
expect(result).toEqual(mockData)
})
})
describe('listWhitelists', () => {
it('should call GET /admin/crowdsec/whitelist and return the whitelist array', async () => {
const mockWhitelist = [
{
uuid: 'uuid-1',
ip_or_cidr: '192.168.1.1',
reason: 'Home',
created_at: '2024-01-01T00:00:00Z',
updated_at: '2024-01-01T00:00:00Z',
},
]
vi.mocked(client.get).mockResolvedValue({ data: { whitelist: mockWhitelist } })
const result = await crowdsec.listWhitelists()
expect(client.get).toHaveBeenCalledWith('/admin/crowdsec/whitelist')
expect(result).toEqual(mockWhitelist)
})
})
describe('addWhitelist', () => {
it('should call POST /admin/crowdsec/whitelist and return the created entry', async () => {
const payload = { ip_or_cidr: '192.168.1.1', reason: 'Home' }
const mockEntry = {
uuid: 'uuid-1',
ip_or_cidr: '192.168.1.1',
reason: 'Home',
created_at: '2024-01-01T00:00:00Z',
updated_at: '2024-01-01T00:00:00Z',
}
vi.mocked(client.post).mockResolvedValue({ data: mockEntry })
const result = await crowdsec.addWhitelist(payload)
expect(client.post).toHaveBeenCalledWith('/admin/crowdsec/whitelist', payload)
expect(result).toEqual(mockEntry)
})
})
describe('deleteWhitelist', () => {
it('should call DELETE /admin/crowdsec/whitelist/{uuid}', async () => {
vi.mocked(client.delete).mockResolvedValue({})
await crowdsec.deleteWhitelist('uuid-1')
expect(client.delete).toHaveBeenCalledWith('/admin/crowdsec/whitelist/uuid-1')
})
})
describe('default export', () => {
it('should export all functions', () => {
expect(crowdsec.default).toHaveProperty('startCrowdsec')
@@ -126,6 +240,14 @@ describe('crowdsec API', () => {
expect(crowdsec.default).toHaveProperty('listCrowdsecFiles')
expect(crowdsec.default).toHaveProperty('readCrowdsecFile')
expect(crowdsec.default).toHaveProperty('writeCrowdsecFile')
expect(crowdsec.default).toHaveProperty('listCrowdsecDecisions')
expect(crowdsec.default).toHaveProperty('banIP')
expect(crowdsec.default).toHaveProperty('unbanIP')
expect(crowdsec.default).toHaveProperty('getCrowdsecKeyStatus')
expect(crowdsec.default).toHaveProperty('listWhitelists')
expect(crowdsec.default).toHaveProperty('addWhitelist')
expect(crowdsec.default).toHaveProperty('deleteWhitelist')
})
})
})


@@ -1,53 +1,123 @@
import client from './client'
/** Represents an SSL/TLS certificate. */
export interface Certificate {
id?: number
uuid: string
name?: string
domain: string
common_name?: string
domains: string
issuer: string
issuer_org?: string
fingerprint?: string
serial_number?: string
key_type?: string
expires_at: string
not_before?: string
status: 'valid' | 'expiring' | 'expired' | 'untrusted'
provider: string
chain_depth?: number
has_key: boolean
in_use: boolean
/** @deprecated Use uuid instead */
id?: number
}
export interface AssignedHost {
uuid: string
name: string
domain_names: string
}
export interface ChainEntry {
subject: string
issuer: string
expires_at: string
}
export interface CertificateDetail extends Certificate {
assigned_hosts: AssignedHost[]
chain: ChainEntry[]
auto_renew: boolean
created_at: string
updated_at: string
}
export interface ValidationResult {
valid: boolean
common_name: string
domains: string[]
issuer_org: string
expires_at: string
key_match: boolean
chain_valid: boolean
chain_depth: number
warnings: string[]
errors: string[]
}
/**
* Fetches all SSL certificates.
* @returns Promise resolving to array of Certificate objects
* @throws {AxiosError} If the request fails
*/
export async function getCertificates(): Promise<Certificate[]> {
const response = await client.get<Certificate[]>('/certificates')
return response.data
}
/**
* Uploads a new SSL certificate with its private key.
* @param name - Display name for the certificate
* @param certFile - The certificate file (PEM format)
* @param keyFile - The private key file (PEM format)
* @returns Promise resolving to the created Certificate
* @throws {AxiosError} If upload fails or certificate is invalid
*/
export async function uploadCertificate(name: string, certFile: File, keyFile: File): Promise<Certificate> {
export async function getCertificateDetail(uuid: string): Promise<CertificateDetail> {
const response = await client.get<CertificateDetail>(`/certificates/${uuid}`)
return response.data
}
export async function uploadCertificate(
name: string,
certFile: File,
keyFile?: File,
chainFile?: File,
): Promise<Certificate> {
const formData = new FormData()
formData.append('name', name)
formData.append('certificate_file', certFile)
formData.append('key_file', keyFile)
if (keyFile) formData.append('key_file', keyFile)
if (chainFile) formData.append('chain_file', chainFile)
const response = await client.post<Certificate>('/certificates', formData, {
headers: {
'Content-Type': 'multipart/form-data',
},
headers: { 'Content-Type': 'multipart/form-data' },
})
return response.data
}
export async function updateCertificate(uuid: string, name: string): Promise<Certificate> {
const response = await client.put<Certificate>(`/certificates/${uuid}`, { name })
return response.data
}
/**
* Deletes an SSL certificate.
* @param uuid - The UUID of the certificate to delete
* @throws {AxiosError} If deletion fails or certificate not found
*/
export async function deleteCertificate(uuid: string): Promise<void> {
await client.delete(`/certificates/${uuid}`)
}
export async function exportCertificate(
uuid: string,
format: string,
includeKey: boolean,
password?: string,
pfxPassword?: string,
): Promise<Blob> {
const response = await client.post(
`/certificates/${uuid}/export`,
{ format, include_key: includeKey, password, pfx_password: pfxPassword },
{ responseType: 'blob' },
)
return response.data as Blob
}
export async function validateCertificate(
certFile: File,
keyFile?: File,
chainFile?: File,
): Promise<ValidationResult> {
const formData = new FormData()
formData.append('certificate_file', certFile)
if (keyFile) formData.append('key_file', keyFile)
if (chainFile) formData.append('chain_file', chainFile)
const response = await client.post<ValidationResult>('/certificates/validate', formData, {
headers: { 'Content-Type': 'multipart/form-data' },
})
return response.data
}
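The upload and validate helpers above share one pattern: required parts are always appended to the multipart body, while optional parts are appended only when present. A minimal standalone sketch of that pattern (`buildCertificateForm` is an illustrative name, not part of the API; it assumes a runtime with global `FormData`/`Blob`, e.g. browsers or Node 18+):

```typescript
// Hypothetical sketch of the conditional-append pattern used by
// uploadCertificate/validateCertificate. Optional files are only added when
// supplied, so the backend can distinguish "no key" from an empty key field.
function buildCertificateForm(
  certText: string,
  keyText?: string,
  chainText?: string,
): FormData {
  const formData = new FormData()
  // The certificate itself is always required.
  formData.append('certificate_file', new Blob([certText]), 'cert.pem')
  // Optional parts are appended conditionally rather than as empty fields.
  if (keyText) formData.append('key_file', new Blob([keyText]), 'key.pem')
  if (chainText) formData.append('chain_file', new Blob([chainText]), 'chain.pem')
  return formData
}
```

With only a certificate supplied, the resulting body carries a single field; the real helpers then POST it with a `multipart/form-data` content type.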

View File

@@ -156,4 +156,31 @@ export async function getCrowdsecKeyStatus(): Promise<CrowdSecKeyStatus> {
return resp.data
}
export interface CrowdSecWhitelistEntry {
uuid: string
ip_or_cidr: string
reason: string
created_at: string
updated_at: string
}
export interface AddWhitelistPayload {
ip_or_cidr: string
reason: string
}
export const listWhitelists = async (): Promise<CrowdSecWhitelistEntry[]> => {
const resp = await client.get<{ whitelist: CrowdSecWhitelistEntry[] }>('/admin/crowdsec/whitelist')
return resp.data.whitelist
}
export const addWhitelist = async (data: AddWhitelistPayload): Promise<CrowdSecWhitelistEntry> => {
const resp = await client.post<CrowdSecWhitelistEntry>('/admin/crowdsec/whitelist', data)
return resp.data
}
export const deleteWhitelist = async (uuid: string): Promise<void> => {
await client.delete(`/admin/crowdsec/whitelist/${uuid}`)
}
export default { startCrowdsec, stopCrowdsec, statusCrowdsec, importCrowdsecConfig, exportCrowdsecConfig, listCrowdsecFiles, readCrowdsecFile, writeCrowdsecFile, listCrowdsecDecisions, banIP, unbanIP, getCrowdsecKeyStatus, listWhitelists, addWhitelist, deleteWhitelist }

View File

@@ -9,7 +9,7 @@ export interface Location {
}
export interface Certificate {
id?: number;
uuid: string;
name: string;
provider: string;
@@ -40,7 +40,7 @@ export interface ProxyHost {
advanced_config?: string;
advanced_config_backup?: string;
enabled: boolean;
certificate_id?: number | string | null;
certificate?: Certificate | null;
access_list_id?: number | string | null;
access_list?: {

View File

@@ -0,0 +1,72 @@
import { Link2, ShieldCheck } from 'lucide-react'
import { useTranslation } from 'react-i18next'
import type { ChainEntry } from '../api/certificates'
interface CertificateChainViewerProps {
chain: ChainEntry[]
}
function getChainLabel(index: number, total: number, t: (key: string) => string): string {
if (index === 0) return t('certificates.chainLeaf')
if (index === total - 1 && total > 1) return t('certificates.chainRoot')
return t('certificates.chainIntermediate')
}
export default function CertificateChainViewer({ chain }: CertificateChainViewerProps) {
const { t } = useTranslation()
if (!chain || chain.length === 0) {
return (
<p className="text-sm text-content-muted italic">{t('certificates.noChainData')}</p>
)
}
return (
<div
className="space-y-0"
role="list"
aria-label={t('certificates.certificateChain')}
>
{chain.map((entry, index) => {
const label = getChainLabel(index, chain.length, t)
const isLast = index === chain.length - 1
return (
<div key={index} role="listitem">
<div className="flex items-start gap-3">
<div className="flex flex-col items-center">
<div className="flex h-8 w-8 items-center justify-center rounded-full border border-gray-700 bg-surface-muted">
{index === 0 ? (
<ShieldCheck className="h-4 w-4 text-brand-400" aria-hidden="true" />
) : (
<Link2 className="h-4 w-4 text-content-muted" aria-hidden="true" />
)}
</div>
{!isLast && (
<div className="w-px h-6 bg-gray-700" aria-hidden="true" />
)}
</div>
<div className="min-w-0 flex-1 pb-2">
<div className="flex items-center gap-2">
<span className="text-xs font-medium uppercase tracking-wide text-content-muted">
{label}
</span>
</div>
<p className="text-sm font-medium text-content-primary truncate" title={entry.subject}>
{entry.subject}
</p>
<p className="text-xs text-content-muted truncate" title={entry.issuer}>
{t('certificates.issuerOrg')}: {entry.issuer}
</p>
<p className="text-xs text-content-muted">
{t('certificates.expiresAt')}: {new Date(entry.expires_at).toLocaleDateString()}
</p>
</div>
</div>
</div>
)
})}
</div>
)
}
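The labelling rule in `getChainLabel` above can be restated as a pure function with the i18n lookup stubbed out (`classifyChainEntry` is a hypothetical name): index 0 is the leaf, the last entry of a multi-entry chain is the root, and everything in between is an intermediate.

```typescript
// Self-contained restatement of CertificateChainViewer's chain labelling,
// with translation keys replaced by plain position names.
type ChainPosition = 'leaf' | 'intermediate' | 'root'

function classifyChainEntry(index: number, total: number): ChainPosition {
  // The first entry is always the leaf, even in a single-entry chain.
  if (index === 0) return 'leaf'
  // The last entry of a chain with more than one certificate is the root.
  if (index === total - 1 && total > 1) return 'root'
  return 'intermediate'
}
```

Note the `total > 1` guard: without it a single-entry chain would be labelled root rather than leaf, which is what the component's unit tests below exercise.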

View File

@@ -1,32 +1,28 @@
import { useMutation, useQueryClient } from '@tanstack/react-query'
import { Download, Eye, Trash2, ChevronUp, ChevronDown } from 'lucide-react'
import { useState, useMemo, useEffect } from 'react'
import { useTranslation } from 'react-i18next'
import BulkDeleteCertificateDialog from './dialogs/BulkDeleteCertificateDialog'
import CertificateDetailDialog from './dialogs/CertificateDetailDialog'
import CertificateExportDialog from './dialogs/CertificateExportDialog'
import DeleteCertificateDialog from './dialogs/DeleteCertificateDialog'
import { LoadingSpinner, ConfigReloadOverlay } from './LoadingStates'
import { Button } from './ui/Button'
import { Checkbox } from './ui/Checkbox'
import { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger } from './ui/Tooltip'
import { type Certificate } from '../api/certificates'
import { useCertificates, useDeleteCertificate, useBulkDeleteCertificates } from '../hooks/useCertificates'
import { toast } from '../utils/toast'
type SortColumn = 'name' | 'expires'
type SortDirection = 'asc' | 'desc'
export function isInUse(cert: Certificate): boolean {
return cert.in_use
}
export function isDeletable(cert: Certificate): boolean {
if (cert.in_use) return false
return (
cert.provider === 'custom' ||
cert.provider === 'letsencrypt-staging' ||
@@ -35,65 +31,48 @@ export function isDeletable(cert: Certificate, hosts: ProxyHost[]): boolean {
)
}
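The deletability rule above can be sketched as a self-contained predicate. The conditions after `letsencrypt-staging` are cut off by the hunk boundary; this sketch assumes they mirror the `expired`/`expiring` statuses used later in the component (`canDelete` and `CertLike` are illustrative names):

```typescript
// Hedged restatement of isDeletable: an in-use certificate is never deletable;
// otherwise custom and staging certificates, plus (assumed from the component's
// later checks) expired or expiring ones, may be removed.
interface CertLike {
  in_use: boolean
  provider: string
  status: string
}

function canDelete(cert: CertLike): boolean {
  if (cert.in_use) return false
  return (
    cert.provider === 'custom' ||
    cert.provider === 'letsencrypt-staging' ||
    cert.status === 'expired' ||   // assumption: elided condition
    cert.status === 'expiring'     // assumption: elided condition
  )
}
```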
function daysUntilExpiry(expiresAt: string): number {
const now = new Date()
const expiry = new Date(expiresAt)
return Math.ceil((expiry.getTime() - now.getTime()) / (1000 * 60 * 60 * 24))
}
export default function CertificateList() {
const { certificates, isLoading, error } = useCertificates()
const { t } = useTranslation()
const [sortColumn, setSortColumn] = useState<SortColumn>('name')
const [sortDirection, setSortDirection] = useState<SortDirection>('asc')
const [certToDelete, setCertToDelete] = useState<Certificate | null>(null)
const [certToView, setCertToView] = useState<Certificate | null>(null)
const [certToExport, setCertToExport] = useState<Certificate | null>(null)
const [selectedIds, setSelectedIds] = useState<Set<string>>(new Set())
const [showBulkDeleteDialog, setShowBulkDeleteDialog] = useState(false)
const deleteMutation = useDeleteCertificate()
useEffect(() => {
setSelectedIds(prev => {
const validIds = new Set(certificates.map(c => c.uuid).filter(Boolean))
const reconciled = new Set([...prev].filter(id => validIds.has(id)))
if (reconciled.size === prev.size) return prev
return reconciled
})
}, [certificates])
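The reconciliation effect above can be captured as a pure function (`reconcileSelection` is a hypothetical name): keep only the previously-selected UUIDs that still exist, and return the previous `Set` by reference when nothing was dropped, so the state setter bails out and React skips a re-render.

```typescript
// Pure-function sketch of the selection-reconciliation effect: selected ids
// that no longer correspond to a certificate are pruned; when nothing changed,
// the original Set instance is returned unchanged.
function reconcileSelection(prev: Set<string>, validIds: Set<string>): Set<string> {
  const reconciled = new Set([...prev].filter(id => validIds.has(id)))
  // Same size means nothing was pruned; reuse the old reference.
  return reconciled.size === prev.size ? prev : reconciled
}
```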
const handleDelete = (cert: Certificate) => {
deleteMutation.mutate(cert.uuid, {
onSuccess: () => {
toast.success(t('certificates.deleteSuccess'))
setCertToDelete(null)
},
onError: (error: Error) => {
toast.error(`${t('certificates.deleteFailed')}: ${error.message}`)
setCertToDelete(null)
},
})
}
const bulkDeleteMutation = useBulkDeleteCertificates()
const sortedCertificates = useMemo(() => {
return [...certificates].sort((a, b) => {
@@ -101,8 +80,8 @@ export default function CertificateList() {
switch (sortColumn) {
case 'name': {
const aName = (a.name || a.domains || '').toLowerCase()
const bName = (b.name || b.domains || '').toLowerCase()
comparison = aName.localeCompare(bName)
break
}
@@ -118,15 +97,15 @@ export default function CertificateList() {
})
}, [certificates, sortColumn, sortDirection])
const selectableCertIds = useMemo<Set<string>>(() => {
const ids = new Set<string>()
for (const cert of sortedCertificates) {
if (isDeletable(cert) && cert.uuid) {
ids.add(cert.uuid)
}
}
return ids
}, [sortedCertificates])
const allSelectableSelected =
selectableCertIds.size > 0 && selectedIds.size === selectableCertIds.size
@@ -141,12 +120,12 @@ export default function CertificateList() {
}
}
const handleSelectRow = (uuid: string) => {
const next = new Set(selectedIds)
if (next.has(uuid)) {
next.delete(uuid)
} else {
next.add(uuid)
}
}
setSelectedIds(next)
}
@@ -243,18 +222,19 @@ export default function CertificateList() {
</tr>
) : (
sortedCertificates.map((cert) => {
const inUse = isInUse(cert)
const deletable = isDeletable(cert)
const isInUseDeletableCategory = inUse && (cert.provider === 'custom' || cert.provider === 'letsencrypt-staging' || cert.status === 'expired' || cert.status === 'expiring')
const days = daysUntilExpiry(cert.expires_at)
return (
<tr key={cert.uuid} className="hover:bg-gray-800/50 transition-colors">
{deletable && !inUse ? (
<td className="w-12 px-4 py-4">
<Checkbox
checked={selectedIds.has(cert.uuid)}
onCheckedChange={() => handleSelectRow(cert.uuid)}
aria-label={t('certificates.selectCert', { name: cert.name || cert.domains })}
/>
</td>
) : isInUseDeletableCategory ? (
@@ -267,7 +247,7 @@ export default function CertificateList() {
checked={false}
disabled
aria-disabled="true"
aria-label={t('certificates.selectCert', { name: cert.name || cert.domains })}
/>
</span>
</TooltipTrigger>
@@ -279,7 +259,7 @@ export default function CertificateList() {
<td className="w-12 px-4 py-4" aria-hidden="true" />
)}
<td className="px-6 py-4 font-medium text-white">{cert.name || '-'}</td>
<td className="px-6 py-4 font-medium text-white">{cert.domains}</td>
<td className="px-6 py-4">
<div className="flex items-center gap-2">
<span>{cert.issuer}</span>
@@ -291,49 +271,80 @@ export default function CertificateList() {
</div>
</td>
<td className="px-6 py-4">
<TooltipProvider>
<Tooltip>
<TooltipTrigger asChild>
<span className={days <= 0 ? 'text-red-400' : days <= 30 ? 'text-yellow-400' : ''}>
{new Date(cert.expires_at).toLocaleDateString()}
</span>
</TooltipTrigger>
<TooltipContent>
{days > 0
? t('certificates.expiresInDays', { days })
: t('certificates.expiredAgo', { days: Math.abs(days) })}
</TooltipContent>
</Tooltip>
</TooltipProvider>
</td>
<td className="px-6 py-4">
<StatusBadge status={cert.status} />
</td>
<td className="px-6 py-4">
<div className="flex items-center gap-2">
<button
onClick={() => setCertToView(cert)}
className="text-gray-400 hover:text-white transition-colors"
aria-label={t('certificates.viewDetails')}
data-testid={`view-cert-${cert.uuid}`}
>
<Eye className="w-4 h-4" />
</button>
<button
onClick={() => setCertToExport(cert)}
className="text-gray-400 hover:text-white transition-colors"
aria-label={t('certificates.export')}
data-testid={`export-cert-${cert.uuid}`}
>
<Download className="w-4 h-4" />
</button>
{(() => {
if (inUse && (cert.provider === 'custom' || cert.provider === 'letsencrypt-staging' || cert.status === 'expired')) {
return (
<TooltipProvider>
<Tooltip>
<TooltipTrigger asChild>
<button
aria-disabled="true"
aria-label={t('certificates.deleteTitle')}
className="text-red-400/40 cursor-not-allowed transition-colors"
onClick={(e) => e.preventDefault()}
>
<Trash2 className="w-4 h-4" />
</button>
</TooltipTrigger>
<TooltipContent>
{t('certificates.deleteInUse')}
</TooltipContent>
</Tooltip>
</TooltipProvider>
)
}
if (deletable) {
return (
<button
onClick={() => setCertToDelete(cert)}
className="text-red-400 hover:text-red-300 transition-colors"
aria-label={t('certificates.deleteTitle')}
>
<Trash2 className="w-4 h-4" />
</button>
)
}
return null
})()}
</div>
</td>
</tr>
)
@@ -347,20 +358,44 @@ export default function CertificateList() {
certificate={certToDelete}
open={certToDelete !== null}
onConfirm={() => {
if (certToDelete?.uuid) {
handleDelete(certToDelete)
}
}}
onCancel={() => setCertToDelete(null)}
isDeleting={deleteMutation.isPending}
/>
<BulkDeleteCertificateDialog
certificates={sortedCertificates.filter(c => selectedIds.has(c.uuid))}
open={showBulkDeleteDialog}
onConfirm={() => bulkDeleteMutation.mutate(Array.from(selectedIds), {
onSuccess: ({ succeeded, failed }) => {
setSelectedIds(new Set())
setShowBulkDeleteDialog(false)
if (failed > 0) {
toast.error(t('certificates.bulkDeletePartial', { deleted: succeeded, failed }))
} else {
toast.success(t('certificates.bulkDeleteSuccess', { count: succeeded }))
}
},
onError: () => {
toast.error(t('certificates.bulkDeleteFailed'))
setShowBulkDeleteDialog(false)
},
})}
onCancel={() => setShowBulkDeleteDialog(false)}
isDeleting={bulkDeleteMutation.isPending}
/>
<CertificateDetailDialog
certificate={certToView}
open={certToView !== null}
onOpenChange={(open) => { if (!open) setCertToView(null) }}
/>
<CertificateExportDialog
certificate={certToExport}
open={certToExport !== null}
onOpenChange={(open) => { if (!open) setCertToExport(null) }}
/>
</>
)
}
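The expiry presentation above combines two small pieces of logic: `daysUntilExpiry` rounds up, so a certificate expiring later today still counts as 1 day, and the inline ternary colours a row red at or below 0 days and yellow at or below 30. A sketch with the threshold extracted into a named helper (`expiryTone` is a hypothetical name; the `now` parameter is added here only to make the math testable):

```typescript
// daysUntilExpiry as used in CertificateList, with an injectable "now" for
// determinism; Math.ceil rounds partial days up.
function daysUntilExpiry(expiresAt: string, now: Date = new Date()): number {
  const expiry = new Date(expiresAt)
  return Math.ceil((expiry.getTime() - now.getTime()) / (1000 * 60 * 60 * 24))
}

// The row-colouring thresholds from the tooltip cell, as a named classifier.
function expiryTone(days: number): 'expired' | 'warning' | 'ok' {
  if (days <= 0) return 'expired'
  if (days <= 30) return 'warning'
  return 'ok'
}
```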

View File

@@ -25,9 +25,9 @@ export default function CertificateStatusCard({ certificates, hosts, isLoading }
const domains = new Set<string>()
for (const cert of certificates) {
// Handle missing or undefined domain field
if (!cert.domains) continue
// Certificate domains field can be comma-separated
for (const d of cert.domains.split(',')) {
const trimmed = d.trim().toLowerCase()
if (trimmed) domains.add(trimmed)
}

View File

@@ -0,0 +1,107 @@
import { AlertTriangle, CheckCircle, XCircle } from 'lucide-react'
import { useTranslation } from 'react-i18next'
import type { ValidationResult } from '../api/certificates'
interface CertificateValidationPreviewProps {
result: ValidationResult
}
export default function CertificateValidationPreview({
result,
}: CertificateValidationPreviewProps) {
const { t } = useTranslation()
return (
<div
className="rounded-lg border border-gray-700 bg-surface-muted/50 p-4 space-y-3"
data-testid="certificate-validation-preview"
role="region"
aria-label={t('certificates.validationPreview')}
>
<div className="flex items-center gap-2">
{result.valid ? (
<CheckCircle className="h-5 w-5 text-green-400" aria-hidden="true" />
) : (
<XCircle className="h-5 w-5 text-red-400" aria-hidden="true" />
)}
<span className="font-medium text-content-primary">
{result.valid
? t('certificates.validCertificate')
: t('certificates.invalidCertificate')}
</span>
</div>
<dl className="grid grid-cols-[auto_1fr] gap-x-4 gap-y-1.5 text-sm">
<dt className="text-content-muted">{t('certificates.commonName')}</dt>
<dd className="text-content-primary">{result.common_name || '-'}</dd>
<dt className="text-content-muted">{t('certificates.domains')}</dt>
<dd className="text-content-primary">
{result.domains?.length ? result.domains.join(', ') : '-'}
</dd>
<dt className="text-content-muted">{t('certificates.issuerOrg')}</dt>
<dd className="text-content-primary">{result.issuer_org || '-'}</dd>
<dt className="text-content-muted">{t('certificates.expiresAt')}</dt>
<dd className="text-content-primary">
{result.expires_at ? new Date(result.expires_at).toLocaleDateString() : '-'}
</dd>
<dt className="text-content-muted">{t('certificates.keyMatch')}</dt>
<dd>
{result.key_match ? (
<span className="text-green-400">Yes</span>
) : (
<span className="text-yellow-400">No key provided</span>
)}
</dd>
<dt className="text-content-muted">{t('certificates.chainValid')}</dt>
<dd>
{result.chain_valid ? (
<span className="text-green-400">Yes</span>
) : (
<span className="text-yellow-400">Not verified</span>
)}
</dd>
{result.chain_depth > 0 && (
<>
<dt className="text-content-muted">{t('certificates.chainDepth')}</dt>
<dd className="text-content-primary">{result.chain_depth}</dd>
</>
)}
</dl>
{result.warnings.length > 0 && (
<div className="flex items-start gap-2 rounded-md border border-yellow-900/50 bg-yellow-900/10 p-3">
<AlertTriangle className="h-4 w-4 text-yellow-400 mt-0.5 shrink-0" aria-hidden="true" />
<div className="space-y-1">
<p className="text-sm font-medium text-yellow-400">{t('certificates.warnings')}</p>
<ul className="list-disc list-inside text-sm text-yellow-300/80 space-y-0.5">
{result.warnings.map((w, i) => (
<li key={i}>{w}</li>
))}
</ul>
</div>
</div>
)}
{result.errors.length > 0 && (
<div className="flex items-start gap-2 rounded-md border border-red-900/50 bg-red-900/10 p-3">
<XCircle className="h-4 w-4 text-red-400 mt-0.5 shrink-0" aria-hidden="true" />
<div className="space-y-1">
<p className="text-sm font-medium text-red-400">{t('certificates.errors')}</p>
<ul className="list-disc list-inside text-sm text-red-300/80 space-y-0.5">
{result.errors.map((e, i) => (
<li key={i}>{e}</li>
))}
</ul>
</div>
</div>
)}
</div>
)
}

View File

@@ -123,7 +123,7 @@ function buildInitialFormData(host?: ProxyHost): Partial<ProxyHost> & {
application: (host?.application || 'none') as ApplicationPreset,
advanced_config: host?.advanced_config || '',
enabled: host?.enabled ?? true,
certificate_id: host?.certificate?.uuid ?? host?.certificate_id,
access_list_id: host?.access_list?.uuid ?? host?.access_list_id,
security_header_profile_id: host?.security_header_profile?.uuid ?? host?.security_header_profile_id,
dns_provider_id: host?.dns_provider_id || null,
@@ -249,9 +249,10 @@ function getEntityToken(entity: { id?: number; uuid?: string }): string | null {
}
export default function ProxyHostForm({ host, onSubmit, onCancel }: ProxyHostFormProps) {
type ProxyHostFormState = Omit<Partial<ProxyHost>, 'access_list_id' | 'security_header_profile_id' | 'certificate_id'> & {
access_list_id?: number | string | null
security_header_profile_id?: number | string | null
certificate_id?: number | string | null
addUptime?: boolean
uptimeInterval?: number
uptimeMaxRetries?: number
@@ -562,6 +563,7 @@ export default function ProxyHostForm({ host, onSubmit, onCancel }: ProxyHostFor
...payloadWithoutUptime,
access_list_id: normalizeAccessListReference(payloadWithoutUptime.access_list_id),
security_header_profile_id: normalizeSecurityHeaderReference(payloadWithoutUptime.security_header_profile_id),
certificate_id: normalizeAccessListReference(payloadWithoutUptime.certificate_id),
}
const res = await onSubmit(submitPayload)
@@ -910,18 +912,25 @@ export default function ProxyHostForm({ host, onSubmit, onCancel }: ProxyHostFor
<label className="block text-sm font-medium text-gray-300 mb-2">
SSL Certificate
</label>
<Select
value={resolveSelectToken(formData.certificate_id as number | string | null | undefined)}
onValueChange={token => setFormData(prev => ({ ...prev, certificate_id: resolveTokenToFormValue(token) }))}
>
<SelectTrigger className="w-full bg-gray-900 border-gray-700 text-white" aria-label="SSL Certificate">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="none">Auto-manage with Let's Encrypt (recommended)</SelectItem>
{certificates.map(cert => {
const token = getEntityToken(cert)
if (!token) return null
return (
<SelectItem key={cert.uuid} value={token}>
{cert.name || cert.domains}
{cert.provider ? ` (${cert.provider})` : ''}
</SelectItem>
)
})}
</SelectContent>
</Select>
<p className="text-xs text-gray-500 mt-1">

View File

@@ -0,0 +1,71 @@
import { render, screen } from '@testing-library/react'
import { describe, it, expect, vi } from 'vitest'
import type { ChainEntry } from '../../api/certificates'
import CertificateChainViewer from '../CertificateChainViewer'
vi.mock('react-i18next', () => ({
useTranslation: () => ({
t: (key: string) => key,
i18n: { language: 'en', changeLanguage: vi.fn() },
}),
}))
function makeChain(count: number): ChainEntry[] {
return Array.from({ length: count }, (_, i) => ({
subject: `Subject ${i}`,
issuer: `Issuer ${i}`,
expires_at: '2026-06-01T00:00:00Z',
}))
}
describe('CertificateChainViewer', () => {
it('renders empty state when chain is empty', () => {
render(<CertificateChainViewer chain={[]} />)
expect(screen.getByText('certificates.noChainData')).toBeTruthy()
})
it('renders single entry as leaf', () => {
render(<CertificateChainViewer chain={makeChain(1)} />)
expect(screen.getByText('certificates.chainLeaf')).toBeTruthy()
expect(screen.getByText('Subject 0')).toBeTruthy()
})
it('renders two entries as leaf + root', () => {
render(<CertificateChainViewer chain={makeChain(2)} />)
expect(screen.getByText('certificates.chainLeaf')).toBeTruthy()
expect(screen.getByText('certificates.chainRoot')).toBeTruthy()
})
it('renders three entries as leaf + intermediate + root', () => {
render(<CertificateChainViewer chain={makeChain(3)} />)
expect(screen.getByText('certificates.chainLeaf')).toBeTruthy()
expect(screen.getByText('certificates.chainIntermediate')).toBeTruthy()
expect(screen.getByText('certificates.chainRoot')).toBeTruthy()
})
it('displays issuer for each entry', () => {
render(<CertificateChainViewer chain={makeChain(2)} />)
expect(screen.getByText(/Issuer 0/)).toBeTruthy()
expect(screen.getByText(/Issuer 1/)).toBeTruthy()
})
it('displays formatted expiration dates', () => {
render(<CertificateChainViewer chain={makeChain(1)} />)
const dateStr = new Date('2026-06-01T00:00:00Z').toLocaleDateString()
expect(screen.getByText(new RegExp(dateStr))).toBeTruthy()
})
it('uses list role with list items', () => {
render(<CertificateChainViewer chain={makeChain(2)} />)
expect(screen.getByRole('list')).toBeTruthy()
expect(screen.getAllByRole('listitem')).toHaveLength(2)
})
it('has aria-label on list', () => {
render(<CertificateChainViewer chain={makeChain(1)} />)
expect(screen.getByRole('list').getAttribute('aria-label')).toBe(
'certificates.certificateChain',
)
})
})

Some files were not shown because too many files have changed in this diff.