Interesting agent trajectories from Harbor computer use evaluations.
This repo is deployed to Vercel using a Blob-backed jobs archive so we can keep large Harbor job data without hitting Vercel's serverless function bundle size limit.
- The app runs as a Python function at `api/index.py`.
- Viewer UI assets are served from vendored Harbor static files in `vendor/`.
- Job data is loaded from one of these sources, in order:
  1. `JOBS_TAR_URL` environment variable (preferred)
  2. `config/jobs_tar_url.txt` fallback
  3. local `jobs/` folder (mainly a local dev fallback)
- On first request (cold start), the function downloads `jobs.tgz` from Blob, extracts it to `/tmp/harbor-jobs`, and serves data from there (see the sketch below).
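For orientation, here is a minimal sketch of that load order and cold-start extract. The helper names are assumptions made up for illustration; the real implementation lives in `api/index.py`.

```python
# Sketch of the load order and cold-start extract, using hypothetical
# helper names; the real implementation is api/index.py.
import os
import tarfile
import urllib.request

JOBS_DIR = "/tmp/harbor-jobs"

def resolve_jobs_tar_url() -> str | None:
    # 1. JOBS_TAR_URL environment variable wins.
    url = os.environ.get("JOBS_TAR_URL")
    if url:
        return url
    # 2. Fall back to the checked-in config file.
    try:
        with open("config/jobs_tar_url.txt") as f:
            return f.read().strip() or None
    except FileNotFoundError:
        return None

def ensure_jobs_dir() -> str:
    # Serve from /tmp if a previous request already extracted the tarball.
    if os.path.isdir(JOBS_DIR) and os.listdir(JOBS_DIR):
        return JOBS_DIR
    url = resolve_jobs_tar_url()
    if url is None:
        return "jobs"  # 3. Local jobs/ folder (local dev fallback).
    # Cold start: download jobs.tgz from Blob and extract it once.
    os.makedirs(JOBS_DIR, exist_ok=True)
    tar_path = "/tmp/jobs.tgz"
    urllib.request.urlretrieve(url, tar_path)
    with tarfile.open(tar_path, "r:gz") as tar:
        tar.extractall(JOBS_DIR)
    return JOBS_DIR
```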
Vercel serverless functions have an unzipped size limit. Bundling `jobs/` (and the full Harbor dependency tree) can exceed it. This setup avoids that by:
- Excluding heavy local folders from deploy uploads via `.vercelignore` (see the sketch below).
- Loading data from Vercel Blob at runtime.
- Vendoring only the Harbor viewer code/assets needed at runtime in `vendor/`.
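As an example, a `.vercelignore` along these lines would do it (illustrative only; the repo's actual file may list more local-only entries):

```
/jobs
/harbor
.env.vercel.local
```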
- `api/index.py`: runtime entrypoint and Blob download/extract logic.
- `vercel.json`: rewrites all routes to the Python function and includes `vendor/**` (sketched below).
- `.vercelignore`: excludes `jobs/`, local `harbor/`, and local-only files.
- `scripts/upload-jobs-to-blob.sh`: uploads a new jobs tarball and updates env vars.
- `config/jobs_tar_url.txt`: Blob URL fallback if the env var is missing.
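For illustration, the relevant parts of `vercel.json` likely look something like this (a sketch consistent with the description above, not a copy of the repo's file):

```json
{
  "rewrites": [{ "source": "/(.*)", "destination": "/api/index" }],
  "functions": {
    "api/index.py": { "includeFiles": "vendor/**" }
  }
}
```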
- Node + npm/bun (for local tooling if needed)
- Python 3.12+
- Vercel CLI (`npm i -g vercel`)
- Vercel account with project access
- Log in to Vercel: `vercel login`
- Link this folder to your Vercel project: `vercel link`
- Create and link a Blob store to the project (only once): `vercel blob store add harbor-trajectories-jobs`. When prompted, choose to link it to the project and select environments.
- Pull env vars locally (the upload script also does this when needed): `vercel env pull .env.vercel.local`
- Upload jobs data and set the Blob URL env vars: `bash scripts/upload-jobs-to-blob.sh`

This script does all of the following (see the sketch after this list):
- Tarballs `jobs/` to a temporary `.tgz`
- Uploads it to Vercel Blob
- Updates `config/jobs_tar_url.txt`
- Sets `JOBS_TAR_URL` for production, preview, and development
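A rough shell sketch of those steps, assuming `vercel blob put` prints the blob URL and `vercel env add --force` overwrites existing values; the actual script is the source of truth:

```sh
# Sketch only: the real logic lives in scripts/upload-jobs-to-blob.sh.
tar -czf /tmp/jobs.tgz jobs/                    # 1. tarball jobs/
url="$(vercel blob put /tmp/jobs.tgz)"          # 2. upload to Vercel Blob
printf '%s\n' "$url" > config/jobs_tar_url.txt  # 3. record the fallback URL
for env in production preview development; do   # 4. set JOBS_TAR_URL everywhere
  printf '%s' "$url" | vercel env add JOBS_TAR_URL "$env" --force
done
```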
- Deploy: `vercel deploy --prod`

Any time `jobs/` changes, run:

```sh
bash scripts/upload-jobs-to-blob.sh
vercel deploy --prod
```

Check these endpoints:
```sh
curl -s https://<your-domain>/api/health
curl -s https://<your-domain>/api/config
curl -s 'https://<your-domain>/api/jobs?page=1&page_size=1'
```

Expected:

- `/api/health` returns `{"status":"ok"}`
- `/api/config` shows the jobs dir under `/tmp/harbor-jobs/...`
- `/api/jobs` returns real jobs, not an empty list
If the viewer shows no jobs (`/api/jobs` returns an empty list):

Cause: the deployment is not using the Blob URL (wrong project/domain, missing env var, or old commit).

Fix:

- Confirm you are opening the correct deployment/project URL.
- Run `bash scripts/upload-jobs-to-blob.sh`.
- Redeploy: `vercel deploy --prod`.
- Re-check `/api/config`.
If the deploy fails on the function size limit:

Cause: large files got included in the deploy artifact.

Fix:

- Ensure `.vercelignore` is present and includes `/jobs` and `/harbor`.
- Keep `jobs/` in Blob only (do not include it in the function bundle).
- Redeploy.
If the upload script fails (for example, missing Blob credentials):

Fix:

- Ensure the Blob store is linked to the Vercel project.
- Run `vercel env pull .env.vercel.local`.
- Re-run the upload script.
`.env.vercel.local` can contain secrets and is ignored via `.gitignore`. Do not commit it.