# Validation & Testing Roadmap
CloudTaser is in preview. The path to production readiness runs through a structured validation program. This page states what we commit to test, by when, and what evidence we will publish. It is the public commitment page tracking cloudtaser-pipeline#190.
If you're evaluating CloudTaser for procurement, this roadmap is the concrete schedule of evidence we will produce. Your auditor or DPO can use the deliverables below to plan the compliance-evidence portion of your deployment timeline. Dates below are public commitments; when a date slips we will update this page and explain the slip in git history.
For organisational trust milestones (SOC 2 readiness, reference customers, named pentest vendor), see the companion page Preview Status & Roadmap. This page focuses on validation evidence — coverage data, recovery-latency distributions, reproducible-build hashes, throughput CDFs — the things an engineer or auditor can download and inspect.
## Phase structure
Four phases run across Q2 2026 and Q3 2026. Each phase has a testing category, public coverage goals (what gets tested and how), and named public deliverables (what external readers will see). Deliverables are dated; CI-driven deliverables are re-dated on every successful run.
## Phase 1 — Functional coverage
Quarter: Q2 2026
Category: Functional
### Public coverage goals
- DB proxy matrix: three encryption modes (deterministic, randomised, order-preserving) crossed with five PostgreSQL majors (12, 13, 14, 15, 16) and MySQL 8. Each cell is a green integration job in CI, producing a coverage row for every driver/encryption pair we claim to support.
- S3 proxy edge cases: multipart upload (5 MB through 5 GB object sizes), range reads against objects encrypted under deterministic and randomised modes, presigned URL handoff with client-side decrypt, concurrent key-conflict resolution, and ETag / Content-MD5 correctness when the underlying object is client-side encrypted.
- Wrapper fork/exec chain coverage: explicit test matrix for shell-wrapped entrypoints (`sh -c`, `bash -c`), Python entrypoints (`python -m`, direct `/usr/bin/python3 script.py`), and POSIX-spawn chains. Closes the ambiguity class where the wrapper's PID-1 contract needs to hold across interpreter re-exec steps.
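The DB proxy matrix above can be enumerated mechanically, which is how a CI pipeline would typically fan out the integration jobs. A minimal sketch — the mode and version names come from this page, but the function name and cell format are hypothetical, not CloudTaser's actual CI code:

```python
from itertools import product

ENCRYPTION_MODES = ["deterministic", "randomised", "order-preserving"]
DATABASES = [f"postgres-{v}" for v in (12, 13, 14, 15, 16)] + ["mysql-8"]

def matrix_cells():
    """Return every (database, encryption mode) cell that must be green in CI."""
    return [(db, mode) for db, mode in product(DATABASES, ENCRYPTION_MODES)]

cells = matrix_cells()
print(len(cells))  # 18 cells: 6 databases x 3 encryption modes
```

Each cell maps to one integration job and one row on the Phase 1 coverage page.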
### Public deliverables
- Coverage page at `docs/validation/phase-1-coverage.md` (published when Phase 1 CI jobs land), listing each matrix cell and the CI run that last proved it green.
- Coverage badges on each component README linking back to the deliverable page.
## Phase 2 — Resilience
Quarter: Q2 2026
Category: Resilience
### Public coverage goals
- Beacon HA outage recovery: 1-of-3 and 2-of-3 node loss scenarios, measuring time-to-reconnect for wrappers and operators holding in-flight secret requests.
- OpenBao sealed-to-unsealed transition: behaviour during the transient window while an unseal is in progress — does the wrapper surface a retryable error, and is its fail-open / fail-closed behaviour correct?
- Network partition + reconnect: cluster-to-beacon link break, DNS partition, asymmetric partition (cluster sees beacon but not vice versa). We want to characterise CAP behaviour publicly so customers can plan.
- Pod eviction mid-fetch: wrapper evicted while a secret fetch is in flight — no secret exposure on the evicted node, no orphaned vault leases.
- Upgrade-path matrix: every release proves three consecutive version pairs green (N-2 → N-1, N-1 → N, N → N+1 rolling) across operator, wrapper, helm chart, CRDs.
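The three consecutive version pairs in the upgrade-path goal can be derived from a release number. A sketch under the simplifying assumption that releases are plain integers (real releases would use semver; the function name is hypothetical):

```python
def upgrade_pairs(n: int):
    """Return the three consecutive upgrade pairs release N must prove green:
    N-2 -> N-1, N-1 -> N, and the rolling N -> N+1."""
    return [(n - 2, n - 1), (n - 1, n), (n, n + 1)]

print(upgrade_pairs(10))  # [(8, 9), (9, 10), (10, 11)]
```

Each pair would be exercised across all four components (operator, wrapper, helm chart, CRDs), giving twelve upgrade jobs per release.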
### Public deliverables
- Recovery-latency CDF at `docs/validation/resilience.md` — one chart per scenario, dated from the latest CI run. Re-published weekly so the data never goes stale.
- Upgrade-matrix table updated automatically from CI release dispatches.
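A recovery-latency CDF of the kind promised above is just the empirical distribution of per-scenario samples. A minimal sketch — the sample values are invented for illustration, not measured CloudTaser data:

```python
def empirical_cdf(samples):
    """Return (value, cumulative fraction) pairs: the empirical CDF of the samples."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

# Invented reconnect times (seconds) for a 1-of-3 beacon node loss scenario.
latencies = [0.8, 1.2, 0.9, 3.4, 1.1]
for value, frac in empirical_cdf(latencies):
    print(f"{value:.1f}s -> {frac:.0%}")
```

Reading the chart for a scenario then answers questions like "what fraction of wrappers reconnect within 2 seconds?" directly from the curve.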
## Phase 3 — Security and audit
Quarter: Q2-Q3 2026
Category: Security
### Public coverage goals
- Third-party security audit: engagement scheduled Q2 2026, report target Q3 2026. On completion, a sanitised findings summary is published here. Full report is available under NDA to design partners and serious procurement reviewers. (Vendor shortlist and selection rationale are maintained internally; see Preview Status & Roadmap for the named candidates.)
- Admission-policy enforcement test: tampered-image rejection is verified on every operator release. The admission webhook is expected to reject pod specs whose image digest has been altered between sign-time and admission. CI fails the release if rejection does not occur.
- Reproducible wrapper build: a public sha256 is published per release on `releases.cloudtaser.io`, computed from a hermetic build of the wrapper binary. Target: Q3 2026 for the wrapper; operator and eBPF reproducibility follow.
- External DPIA template review: a GDPR consultant walkthrough of the published DPIA templates, with findings fed back into `docs/compliance/`. Closes the "did a lawyer actually look at this" question customers' DPOs raise.
### Public deliverables
- Audit completion date + sanitised findings summary on this page under a dated section.
- Reproducible-build sha256s published alongside each release artefact on `releases.cloudtaser.io`.
- DPIA template review annotations merged into the compliance pages with a dated changelog entry.
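An auditor can check the reproducible-build deliverable by hashing a locally built wrapper binary and comparing it against the published digest. A sketch — the file path and the comparison at the end are placeholders, not a CloudTaser-supplied tool:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large binaries are never fully in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published for the release on releases.cloudtaser.io:
#   assert sha256_of("./cloudtaser-wrapper") == published_digest
```

A match proves the published artefact corresponds byte-for-byte to what a hermetic rebuild produces, which is the whole point of the reproducibility goal.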
## Phase 4 — Performance and public benchmarks
Quarter: Q3 2026
Category: Performance
### Public coverage goals
- Wrapper microbenchmarks: `getenv` latency, memfd read throughput, and fork/exec overhead versus an unwrapped baseline. Published as distributions, not single numbers.
- S3 proxy throughput CDFs per object-size class (1 KB, 64 KB, 1 MB, 64 MB, 1 GB) at three concurrency points: single-stream, 16 concurrent streams, and 64 concurrent streams. Customers sizing their proxies can read the distribution for their workload shape.
- DB proxy query throughput: single-row read/write, bulk-insert (10k-row batches), large-SELECT with decrypt on the result set. Measured against an unencrypted baseline so the overhead is visible.
- End-to-end demo latency: 50th, 95th, and 99th percentile time from `kubectl apply` to the wrapped pod serving traffic. This is the number that dominates a reviewer's first-demo impression, so we publish it.
### Public deliverables
- Live CDF charts at `docs/validation/performance.md`, updated weekly from CI benchmark runs on dedicated hardware.
- Published numbers use the worst of the last three runs as the headline figure — no cherry-picking the best run.
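The headline-figure rule above can be stated precisely: compute the percentile for each run, then take the worst across the last three runs. A sketch — the run data is invented and a nearest-rank percentile is assumed, since the page does not specify an interpolation method:

```python
def percentile(samples, pct):
    """Nearest-rank percentile for pct in (0, 100]."""
    xs = sorted(samples)
    rank = max(1, round(pct / 100 * len(xs)))
    return xs[rank - 1]

def headline(runs, pct=95):
    """Headline figure: the worst (highest) percentile among the last three runs."""
    return max(percentile(run, pct) for run in runs[-3:])

runs = [[10, 12, 11], [10, 30, 12], [11, 13, 12], [10, 14, 11]]  # latencies in ms, invented
print(headline(runs))  # -> 30
```

Taking the maximum means one bad run in the window drags the headline number up, which is exactly the anti-cherry-picking behaviour the rule commits to.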
## How to cite this
Dated deliverables above are public commitments. Internal priorities and adversary-testing scenarios are not documented publicly, for sound security-operations reasons.
Cite this page in your procurement documentation by the permalink and date, e.g. https://docs.cloudtaser.io/validation/testing-roadmap/ as of YYYY-MM-DD. The git history of this file is the audit trail of commitments made and commitments met.
## Related pages
- Preview Status & Roadmap — organisational trust milestones (SOC 2, audits, reference customers)
- Sovereign Deployment Decision Guide — technical deployment decisions
- Operational Readiness — blast radius, SLA, backout
- Protection Score — self-measurement methodology
- Compliance Mapping — regulatory framework coverage