Closed
38 commits
0fc3af8
feat(infra): add GCP Terraform envs (dev/prod)
betterclever Feb 9, 2026
8ebad70
fix(infra): enable compute API for dev env
betterclever Feb 9, 2026
4bfe724
feat(infra): allow token auth for Terraform
betterclever Feb 9, 2026
4cd8a6b
docs(infra): document token auth fallback
betterclever Feb 9, 2026
7f9a1ae
fix(infra): support terraform 1.5.x
betterclever Feb 9, 2026
6ef25fc
feat(infra): add dev-local env without remote state
betterclever Feb 9, 2026
b1005f1
chore(infra): commit terraform provider locks
betterclever Feb 9, 2026
d4ac903
fix(infra): adopt existing shipsec-dev cluster into terraform
betterclever Feb 11, 2026
95775c4
feat(deploy): add Helm charts and deploy scripts from production-arch…
betterclever Feb 11, 2026
ed092c3
feat(infra): add managed services (Cloud SQL, Memorystore, GCS) and w…
betterclever Feb 11, 2026
3fd8429
feat(worker): add K8s Job execution engine replacing DIND
betterclever Feb 11, 2026
62fbcdf
feat(deploy): add RBAC and Helm config for K8s Job execution
betterclever Feb 11, 2026
948f368
refactor(components): update components for K8s-compatible volume/run…
betterclever Feb 11, 2026
5351f77
feat(frontend): allow studio-next.shipsec.ai domain
betterclever Feb 11, 2026
c4e6943
feat(deploy): add nginx Ingress for studio-next.shipsec.ai
betterclever Feb 11, 2026
c625623
feat(deploy): add TLS with Let's Encrypt via cert-manager
betterclever Feb 11, 2026
34412b1
refactor(deploy): switch to path-based routing on single domain
betterclever Feb 11, 2026
3992372
fix(frontend): use relative API URL for path-based routing
betterclever Feb 11, 2026
e7a0dec
fix(frontend): remove all hardcoded localhost:3211 defaults
betterclever Feb 11, 2026
cafd49e
fix(infra): enable Kafka event ingestion and fix cross-namespace conn…
betterclever Feb 12, 2026
ed54472
fix(frontend): handle relative URL in terminal chunks fetch
betterclever Feb 12, 2026
db48759
fix(worker): wait for container running before streaming K8s logs
betterclever Feb 12, 2026
6bbf037
fix(worker): read terminated pod logs for terminal streaming
betterclever Feb 12, 2026
2e5563c
fix(worker): emit terminal chunks as PTY with base64 encoding
betterclever Feb 12, 2026
4e01573
chore(deploy): update worker image tag for PTY terminal fix
betterclever Feb 13, 2026
5ed06d4
feat(worker): enable TTY on K8s Job containers for ANSI terminal output
betterclever Feb 13, 2026
bbce297
chore(deploy): update worker image tag with TTY support
betterclever Feb 13, 2026
b213a66
feat(worker): track deltaMs timing in K8s terminal stream chunks
betterclever Feb 13, 2026
d74f94a
chore(deploy): update worker image tag with deltaMs timing
betterclever Feb 13, 2026
5471311
feat(worker): use K8s Attach API for live PTY streaming
betterclever Feb 13, 2026
5d4d72e
feat(deploy): add pods/attach RBAC for live PTY streaming via K8s Att…
betterclever Feb 13, 2026
643acfa
ci: add daily upstream sync workflow
betterclever Feb 17, 2026
8759da6
Merge pull request #11 from ShipSecAI/chore/upstream-sync-workflow
betterclever Feb 17, 2026
df2fc2b
fix(worker): improve K8s runner output parsing for non-JSON delimited…
betterclever Feb 17, 2026
b7c19eb
Merge remote-tracking branch 'private/main' into codex/gcp-terraform
betterclever Feb 17, 2026
959c3b1
feat(worker): add GCS FUSE volume support for K8s runner
betterclever Feb 17, 2026
49d5de9
chore(infra): migrate dev terraform state to remote GCS backend
betterclever Feb 17, 2026
7785259
fix(worker): fix GCS FUSE k8s runner shell syntax, sidecar detection,…
betterclever Feb 17, 2026
44 changes: 26 additions & 18 deletions .ai/analytics-output-port-design.md
@@ -1,6 +1,7 @@
# Analytics Output Port Design

## Status: Approved

## Date: 2025-01-21

## Problem Statement
@@ -12,6 +13,7 @@ When connecting a component's `rawOutput` (which contains complex nested JSON) t
3. **Varying schemas**: Different scanner outputs accumulate unique field paths over time

Example error:

```
illegal_argument_exception: Limit of total fields [1000] has been exceeded
```
@@ -51,6 +53,7 @@ illegal_argument_exception: Limit of total fields [1000] has been exceeded
### Document Structure

**Before (PRD design):**

```json
{
"workflow_id": "...",
@@ -69,6 +72,7 @@ illegal_argument_exception: Limit of total fields [1000] has been exceeded
```

**After (new design):**

```json
{
"check_id": "DB_RLS_DISABLED",
@@ -97,13 +101,14 @@ illegal_argument_exception: Limit of total fields [1000] has been exceeded

Components should use their existing structured list outputs:

| Component | Port | Type | Notes |
|-----------|------|------|-------|
| Nuclei | `results` | `z.array(z.record(z.string(), z.unknown()))` | Scanner + asset_key added |
| TruffleHog | `results` | `z.array(z.record(z.string(), z.unknown()))` | Scanner + asset_key added |
| Component | Port | Type | Notes |
| ---------------- | --------- | -------------------------------------------- | ------------------------- |
| Nuclei | `results` | `z.array(z.record(z.string(), z.unknown()))` | Scanner + asset_key added |
| TruffleHog | `results` | `z.array(z.record(z.string(), z.unknown()))` | Scanner + asset_key added |
| Supabase Scanner | `results` | `z.array(z.record(z.string(), z.unknown()))` | Scanner + asset_key added |

All `results` ports include:

- `scanner`: Scanner identifier (e.g., `'nuclei'`, `'trufflehog'`, `'supabase-scanner'`)
- `asset_key`: Primary asset identifier from the finding
- `finding_hash`: Stable hash for deduplication (16-char hex from SHA-256)
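
As a hedged illustration of these enrichment fields, an enriched scanner row might look like the literal below. Every value is invented for illustration, not taken from a real scan; only the three field names come from the list above.

```typescript
// Hypothetical enriched Nuclei result row; values are illustrative only.
const enriched = {
  'template-id': 'exposed-panel',   // raw scanner field (example)
  host: 'https://app.example.com',  // raw scanner field (example)
  scanner: 'nuclei',                // scanner identifier
  asset_key: 'app.example.com',     // primary asset identifier
  finding_hash: '9f2c4a1b0e7d3a58', // 16-char hex from SHA-256
};

console.log(Object.keys(enriched).length); // 5
```
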
@@ -113,6 +118,7 @@ All `results` ports include:
The `finding_hash` enables tracking findings across workflow runs:

**Generation:**

```typescript
import { createHash } from 'crypto';

@@ -130,6 +136,7 @@ function generateFindingHash(...fields: (string | undefined | null)[]): string {
| Supabase Scanner | `check_id + projectRef + resource` |

**Use cases:**

- **New vs recurring**: Is this finding appearing for the first time?
- **First-seen / last-seen**: When did we first detect this? Is it still present?
- **Resolution tracking**: Findings that stop appearing may be resolved
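
The truncated snippet above can be reconstructed as a minimal sketch: join the identity fields, hash with SHA-256, keep the first 16 hex characters. The `'|'` separator and the empty-string handling of missing fields are assumptions, not confirmed by the diff.

```typescript
import { createHash } from 'crypto';

// Sketch of generateFindingHash: join the identity fields (treating
// missing values as empty strings), SHA-256 the result, and keep the
// first 16 hex characters. The '|' separator is an assumption.
function generateFindingHash(...fields: (string | undefined | null)[]): string {
  const material = fields.map((f) => f ?? '').join('|');
  return createHash('sha256').update(material).digest('hex').slice(0, 16);
}

// Supabase Scanner identity fields: check_id + projectRef + resource
const hash = generateFindingHash('DB_RLS_DISABLED', 'proj_abc', 'public.users');
console.log(hash.length); // 16
```

Because the hash depends only on the identity fields, re-running the same scan yields the same value, which is what makes the first-seen / last-seen and resolution tracking above possible.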
@@ -139,19 +146,20 @@ function generateFindingHash(...fields: (string | undefined | null)[]): string {

The indexer automatically adds these fields under `shipsec`:

| Field | Description |
|-------|-------------|
| `organization_id` | Organization that owns the workflow |
| `run_id` | Unique identifier for this workflow execution |
| `workflow_id` | ID of the workflow definition |
| `workflow_name` | Human-readable workflow name |
| `component_id` | Component type (e.g., `core.analytics.sink`) |
| `node_ref` | Node reference in the workflow graph |
| `asset_key` | Auto-detected or specified asset identifier |
| Field | Description |
| ----------------- | --------------------------------------------- |
| `organization_id` | Organization that owns the workflow |
| `run_id` | Unique identifier for this workflow execution |
| `workflow_id` | ID of the workflow definition |
| `workflow_name` | Human-readable workflow name |
| `component_id` | Component type (e.g., `core.analytics.sink`) |
| `node_ref` | Node reference in the workflow graph |
| `asset_key` | Auto-detected or specified asset identifier |

### Querying in OpenSearch

With this structure, users can:

- Filter by organization: `shipsec.organization_id: "org_123"`
- Filter by workflow: `shipsec.workflow_id: "xxx"`
- Filter by run: `shipsec.run_id: "xxx"`
@@ -164,11 +172,11 @@

### Trade-offs

| Decision | Pro | Con |
|----------|-----|-----|
| Serialize nested objects | Prevents field explosion | Can't query inside serialized fields |
| `shipsec` namespace | No field collision | Slightly more verbose queries |
| No generic schema | Better fit per component | Less consistency across components |
| Decision | Pro | Con |
| ------------------------ | ------------------------- | ------------------------------------------ |
| Serialize nested objects | Prevents field explosion | Can't query inside serialized fields |
| `shipsec` namespace | No field collision | Slightly more verbose queries |
| No generic schema | Better fit per component | Less consistency across components |
| Same timestamp per batch | Accurate (same scan time) | Can't distinguish individual finding times |
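
The first trade-off row can be sketched as a small pre-index step. The helper below and its name are hypothetical, not part of the diff: top-level scalars stay queryable, while nested objects become single opaque strings, capping the mapped field count.

```typescript
// Hypothetical flattenForIndex helper: serialize nested objects and
// arrays to JSON strings so each contributes one mapped field instead
// of one field per nested path.
function flattenForIndex(doc: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(doc)) {
    out[key] =
      value !== null && typeof value === 'object'
        ? JSON.stringify(value) // opaque: cannot be queried inside
        : value;
  }
  return out;
}

const doc = { check_id: 'DB_RLS_DISABLED', info: { severity: 'high', tags: ['db'] } };
console.log(flattenForIndex(doc));
// { check_id: 'DB_RLS_DISABLED', info: '{"severity":"high","tags":["db"]}' }
```

This is exactly the con noted in the table: `info.severity` is no longer addressable in a query once it is serialized.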

### Implementation Files
83 changes: 83 additions & 0 deletions .github/workflows/upstream-sync.yml
@@ -0,0 +1,83 @@
name: Sync upstream main

on:
  schedule:
    - cron: "0 9 * * *" # Once daily at 9am UTC
  workflow_dispatch: # Manual trigger

permissions:
  contents: write
  pull-requests: write

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Add upstream remote
        run: |
          git remote add upstream https://github.com/ShipSecAI/studio.git || true
          git fetch upstream main

      - name: Check for divergence
        id: check
        run: |
          UPSTREAM_SHA=$(git rev-parse upstream/main)
          # Check if upstream-sync branch exists on origin
          if git ls-remote --exit-code origin upstream-sync &>/dev/null; then
            CURRENT_SHA=$(git rev-parse origin/upstream-sync)
          else
            CURRENT_SHA=""
          fi

          if [ "$UPSTREAM_SHA" = "$CURRENT_SHA" ]; then
            echo "skip=true" >> "$GITHUB_OUTPUT"
            echo "No new upstream commits"
          else
            echo "skip=false" >> "$GITHUB_OUTPUT"
            AHEAD=$(git rev-list --count origin/main..upstream/main)
            echo "ahead=$AHEAD" >> "$GITHUB_OUTPUT"
            echo "Upstream is $AHEAD commits ahead"
          fi

      - name: Push upstream-sync branch
        if: steps.check.outputs.skip == 'false'
        run: |
          git checkout -B upstream-sync upstream/main
          git push origin upstream-sync --force

      - name: Create or update PR
        if: steps.check.outputs.skip == 'false'
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          EXISTING_PR=$(gh pr list --head upstream-sync --base main --state open --json number --jq '.[0].number' 2>/dev/null || echo "")

          if [ -n "$EXISTING_PR" ]; then
            echo "PR #$EXISTING_PR already exists, updated sync branch"
            gh pr comment "$EXISTING_PR" --body "Sync branch updated. Upstream is now ${{ steps.check.outputs.ahead }} commits ahead of main."
          else
            gh pr create \
              --head upstream-sync \
              --base main \
              --title "sync: merge upstream main" \
              --body "$(cat <<'EOF'
          Automated sync from [ShipSecAI/studio](https://github.com/ShipSecAI/studio) main.

          **${{ steps.check.outputs.ahead }} new upstream commits.**

          Review the changes and merge when ready. If there are conflicts, resolve them locally:
          ```bash
          git fetch origin upstream-sync main
          git checkout main
          git merge origin/upstream-sync
          # resolve conflicts
          git push origin main
          ```
          EOF
          )"
          fi
14 changes: 14 additions & 0 deletions .gitignore
@@ -75,3 +75,17 @@ vite.config.ts.timestamp-*
.playground/
.omc/
MCP_FLOW_TRACE.md

# Terraform / OpenTofu
.terraform/
*.tfstate
*.tfstate.*
*.tfvars
*.tfvars.json
crash.log
crash.*.log
override.tf
override.tf.json
*_override.tf
*_override.tf.json
.terraform.lock.hcl.bak
6 changes: 6 additions & 0 deletions .prettierignore
@@ -11,3 +11,9 @@ node_modules/

# Generated files
*.generated.ts

# Helm templates (Go template syntax is not valid YAML)
deploy/helm/*/templates/

# GitHub Actions (uses ${{ }} template syntax)
.github/workflows/
8 changes: 4 additions & 4 deletions Dockerfile
@@ -83,8 +83,8 @@ FROM base AS frontend
# Frontend build-time configuration
ARG VITE_AUTH_PROVIDER=local
ARG VITE_CLERK_PUBLISHABLE_KEY=""
ARG VITE_API_URL=http://localhost:3211
ARG VITE_BACKEND_URL=http://localhost:3211
ARG VITE_API_URL=""
ARG VITE_BACKEND_URL=""
ARG VITE_DEFAULT_ORG_ID=local-dev
ARG VITE_GIT_SHA=unknown
ARG VITE_PUBLIC_POSTHOG_KEY=""
@@ -125,8 +125,8 @@ FROM base AS frontend-debug
# Frontend build-time configuration
ARG VITE_AUTH_PROVIDER=local
ARG VITE_CLERK_PUBLISHABLE_KEY=""
ARG VITE_API_URL=http://localhost:3211
ARG VITE_BACKEND_URL=http://localhost:3211
ARG VITE_API_URL=""
ARG VITE_BACKEND_URL=""
ARG VITE_DEFAULT_ORG_ID=local-dev
ARG VITE_GIT_SHA=unknown
ARG VITE_PUBLIC_POSTHOG_KEY=""