
[core] V2: unify wait+step queue dispatch in suspension processing#1925

Draft
VaguelySerious wants to merge 5 commits into main from peter/fix-step-vs-wait-race-eager-wait

Conversation


@VaguelySerious VaguelySerious commented May 5, 2026

Summary

Replaces the prior eager-wait-queue patch (and supersedes #1924's carve-out — see history below) with a single unified queue dispatch in V2 suspension processing.

V2 had two different mechanisms for "do something later":

  • Steps were dispatched as outbound queue messages with stepId+stepName and an idempotency key.
  • Waits were encoded as the function's return value ({ timeoutSeconds }); the queue redelivered the same message after that delay.

The asymmetry was load-bearing in three branches of suspension processing and made Promise.race(step, sleep) semantically incorrect under inline step execution.
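A toy model may help here; this is not the SDK's replay engine, just an illustration of how a late wait_completed write makes replay resolve the race with the step:

```typescript
// Toy model of the replay behavior described above. Not SDK code:
// on replay, whichever completion event was written to the log first
// decides Promise.race(step, sleep). If wait_completed is only written
// after the inline step blocked the handler, the step always "wins".
function replayRaceWinner(eventLog: string[]): "step" | "sleep" {
  for (const event of eventLog) {
    if (event === "step_completed") return "step";
    if (event === "wait_completed") return "sleep";
  }
  throw new Error("race not yet decided");
}
```

With the pre-fix event log (step_completed at t≈11.8s, wait_completed written only afterwards), this model returns "step" even though the 1s sleep elapsed first.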

This PR drops the asymmetry: every pending operation we are not running inline becomes an outbound queue message, dispatched in one Promise.all batch.

ownedPendingSteps = pendingSteps.filter(owned by this handler)
inlineStep        = ownedPendingSteps[0]   // optional

dispatches = [
  ...for each non-inline pendingStep: queue stepId message (idempotency=correlationId),
  ...if timeoutSeconds defined:        queue delayed continuation (delaySeconds=timeoutSeconds),
]
await Promise.all(dispatches)

if (!inlineStep) return
await executeStep(inlineStep)
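The pseudocode above can be sketched as runnable TypeScript. The Suspension and Dispatch types and the enqueue/executeStep signatures are illustrative stand-ins, not the SDK's real API:

```typescript
// Illustrative sketch of the unified dispatch, assuming hypothetical types.
type Dispatch =
  | { kind: "step"; stepId: string } // real messages also carry an idempotency key (correlationId)
  | { kind: "wait"; delaySeconds: number };

interface Suspension {
  pendingSteps: { stepId: string; owned: boolean }[];
  timeoutSeconds?: number;
}

async function processSuspension(
  s: Suspension,
  enqueue: (d: Dispatch) => Promise<void>,
  executeStep: (stepId: string) => Promise<void>,
): Promise<void> {
  const owned = s.pendingSteps.filter((p) => p.owned);
  const inlineStep = owned[0]; // at most one owned step runs inline (optional)

  // Every pending operation we are NOT running inline becomes a queue message.
  const dispatches: Promise<void>[] = s.pendingSteps
    .filter((p) => p !== inlineStep)
    .map((p) => enqueue({ kind: "step", stepId: p.stepId }));
  if (s.timeoutSeconds !== undefined) {
    // The wait timer is a fresh delayed continuation, not a { timeoutSeconds } return.
    dispatches.push(enqueue({ kind: "wait", delaySeconds: s.timeoutSeconds }));
  }
  await Promise.all(dispatches);

  if (!inlineStep) return;
  await executeStep(inlineStep.stepId);
}
```

Note that when there is no owned step, the inline branch is simply skipped and everything (including the wait) still goes out in the same Promise.all batch, which is what collapses the three old branches into one path.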

Inline step execution is restored even when there's a pending wait — the wait timer fires in a separate function invocation, in parallel with the inline step. If the sleep wins, that parallel invocation observes wait_completed via the "complete elapsed waits" pass and finishes the run; if the step wins, the wait continuation fires later and no-ops on the terminal run.

Commits

  1. [world-local] Honor delaySeconds before message delivery — world-local's queue ignored delaySeconds, so wait continuations would fire instantly in dev. Now matches the broker behavior of world-vercel and world-postgres. Three new unit tests for the delay branch.
  2. [core] V2: unify wait+step queue dispatch in suspension processing — the actual unification. runtime.ts is 50 lines shorter and the suspension processing has one path instead of three.
  3. [docs] V2 unified suspension dispatch + changeset — eager-processing.mdx "Mixed Suspensions" section rewritten to describe the unified model. Changeset.

Out of scope (kept as { timeoutSeconds })

The retry/throttle and hook-conflict paths still return { timeoutSeconds } because their semantics are "redeliver THIS message after a delay" rather than "schedule a fresh wait timer." Unifying those would reset the message attempt counter and require a different runaway-loop guard. Tracked as a follow-up.

Reproducer (failing test from #1916)

sleepWinsRaceWorkflow: 1s sleep raced against 10s step. Expected 'sleep' to win.

Event log on the failing run, before any fix:

t (s)    Event                         Note
1.738    step_created + wait_created   resumeAt = t+1s
1.742    step_started                  inline execution begins
11.825   step_completed                step body finishes after ~10s
11.861   wait_completed                created right after step_completed, not on time
11.959   run_completed                 replay picked 'step'

Event log after this PR:

t (s)    Event                         Note
0.501    step_created + wait_created   resumeAt = t+1s
0.508    step_started                  inline execution begins (single dispatch — no duplicates)
1.546    wait_completed                exactly 1s after wait_created — wait continuation fired in parallel invocation
1.633    run_completed                 replay picked 'sleep'
10.582   step_completed                the in-progress inline step finishes after the run completed (orphan write, harmless)

The 10s inline step continues to run in the background — world-local's step_completed write on a terminal run is allowed (writes are not gated on run state for step lifecycle events; the orphan event has no observable effect). On world-vercel / world-postgres, the same thing happens — the step body runs to completion in its inline invocation but step_completed either succeeds against the terminal run or is no-oped depending on the world implementation.

Test plan

  • New e2e tests sleepWinsRaceWorkflow and stepWinsRaceWorkflow (the failing tests from #1916, plus the symmetric stepWins case) pass against the nextjs-turbopack workbench locally.
  • Event log inspection confirms wait_completed fires at t≈1s after wait_created (not after the inline step), and step_started appears exactly once per run (the eager-queue prototype produced duplicates in dev).
  • All 842 @workflow/core unit tests pass.
  • All 346 @workflow/world-local unit tests pass (3 new for delaySeconds).

History

🤖 Generated with Claude Code

Fix `Promise.race(step, sleep)` semantics in V2 mixed suspensions
without losing inline step execution.

Inline `await executeStep(...)` blocks the V2 handler for the full
step duration, but `wait_completed` events are only created on the
*next* loop iteration's "complete elapsed waits" pass. So if the
sleep is shorter than the step, replay always picked the step
because the wait_completed event hadn't been written yet —
`sleepWinsRaceWorkflow` returned `'step'` instead of `'sleep'`.

Fix: when a suspension contains both an owned inline step and at
least one pending wait, queue a delayed self-message with
`delaySeconds = suspensionResult.timeoutSeconds` *before* starting
inline execution. The queued continuation fires in a separate
function invocation while the step is still running. That parallel
invocation's "complete elapsed waits" pass writes wait_completed,
replay observes the elapsed wait, and `Promise.race` resolves with
the sleep correctly. The original (still-running) inline invocation
finishes its step, sees `run_completed` on the next loop iteration,
and exits.

This preserves inline-step execution speed for the step-wins case:
the step finishes inline and the workflow returns directly. The
eagerly-queued wait continuation fires after the step has won and
just no-ops on the terminal run.

Test plan:
- New e2e tests `sleepWinsRaceWorkflow` and `stepWinsRaceWorkflow`
  exercising `Promise.race` between a step function and `sleep()`,
  in both directions.
- Verified locally against `nextjs-turbopack` workbench: both pass.
  Event log confirms `wait_completed` is created at t≈1s after
  `wait_created` (1s sleep) instead of at t≈11s after the inline
  step finishes.

Eager-processing changelog updated with a "Mixed Suspensions"
section describing the pre-scheduled wait approach and its
rationale.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

changeset-bot Bot commented May 5, 2026

🦋 Changeset detected

Latest commit: 3f67059

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 20 packages
Name Type
@workflow/core Patch
@workflow/world-local Patch
@workflow/builders Patch
@workflow/cli Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/vitest Patch
@workflow/web-shared Patch
@workflow/web Patch
workflow Patch
@workflow/world-testing Patch
tarballs Patch
@workflow/world-postgres Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/sveltekit Patch
@workflow/vite Patch
@workflow/nuxt Patch
@workflow/ai Patch



vercel Bot commented May 5, 2026

The latest updates on your projects.

Project Deployment Actions Updated (UTC)
example-nextjs-workflow-turbopack Ready Ready Preview, Comment May 5, 2026 1:22am
example-nextjs-workflow-webpack Ready Ready Preview, Comment May 5, 2026 1:22am
example-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-astro-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-express-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-fastify-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-hono-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-nitro-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-nuxt-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-sveltekit-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-tanstack-start-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workbench-vite-workflow Ready Ready Preview, Comment May 5, 2026 1:22am
workflow-docs Ready Ready Preview, Comment, Open in v0 May 5, 2026 1:22am
workflow-swc-playground Ready Ready Preview, Comment May 5, 2026 1:22am
workflow-tarballs Ready Ready Preview, Comment May 5, 2026 1:22am
workflow-web Ready Ready Preview, Comment May 5, 2026 1:22am

@VaguelySerious VaguelySerious changed the title from "[core] V2: pre-schedule the wait timer before inline-executing a step (alt to #1924)" to "[core] Enqueue all steps/waits before inline step execution, remove timeoutSeconds return" on May 5, 2026
Conflicts:
- docs/content/docs/changelog/eager-processing.mdx — took main's
  "Mixed Suspensions" section as the base. Will be rewritten in the
  unification commits that follow.
- packages/core/src/runtime.ts — auto-merged. Option A's
  `inlineStep === undefined when timeoutSeconds !== undefined` check
  is now in place from main, which makes Option B's eager-wait-queue
  branch unreachable. Will collapse the two into a unified queue
  dispatch in the next commits.
The local queue's `queue()` enqueue path ignored the `delaySeconds`
option entirely — every message was delivered immediately, regardless
of the requested delay. VQS-side queues (used by world-vercel and
world-postgres) honor delaySeconds at the broker, so this brings
world-local in line with production semantics.

The runtime needs this to land before the wait-as-continuation
unification in the next commit: that change starts queueing wait
timers as fresh delayed continuations instead of returning
`{ timeoutSeconds }`. Without delaySeconds support, those wait
continuations would fire instantly in dev and trigger spurious
replays.

Sleep happens outside the queue's worker semaphore so a delayed
message doesn't tie up a worker slot during its delay window — other
immediate messages are free to dispatch in parallel.

New tests in queue.test.ts cover:
- delaySeconds > 0 → setTimeout called with the right ms value
- delaySeconds === 0 → no setTimeout (immediate dispatch)
- delaySeconds omitted → no setTimeout (immediate dispatch)
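Under those assumptions, the delay-before-delivery behavior can be sketched as follows; enqueue and deliver are illustrative names, not world-local's real internals:

```typescript
// Sketch of the world-local delay fix (hypothetical names): the delay is
// awaited BEFORE the message enters the worker pool, so a delayed message
// never occupies a worker slot during its delay window.
async function enqueue(
  message: unknown,
  deliver: (m: unknown) => Promise<void>,
  opts: { delaySeconds?: number } = {},
): Promise<void> {
  const delayMs = (opts.delaySeconds ?? 0) * 1000;
  if (delayMs > 0) {
    // Sleep outside any concurrency limit; only schedule delivery afterwards.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  // In the real queue this is where a worker slot would be acquired.
  await deliver(message);
}
```

The delaySeconds === 0 and omitted cases fall through to immediate dispatch, matching the two no-setTimeout branches the new tests cover.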
Replace the asymmetric "steps go to the queue, waits become a
{ timeoutSeconds } return value" pattern with a single Promise.all
batch that queues every pending operation we are not running inline.

Before this change, suspension processing had three branches that
all needed to keep the wait/step asymmetry consistent:

- pendingSteps.length === 0 returned { timeoutSeconds }
- inlineStep + waits eagerly queued a delayed self-message AND set
  inlineStep to undefined (Option A) AND returned { timeoutSeconds }
- inlineStep retry path returned { timeoutSeconds } if there were waits

After this change, every suspension goes through one path:

  for non-inline pendingSteps: queue stepId message
  if timeoutSeconds defined:    queue delayed continuation
  await Promise.all(dispatches)
  if !inlineStep: return
  await executeStep(inlineStep)

Behaviorally, this restores inline step execution even when the
suspension also has a wait (Option A's carve-out is no longer
necessary): the wait timer fires in a separate function invocation
on the queue, in parallel with the inline step. If the sleep wins
the race, that parallel invocation observes wait_completed via the
"complete elapsed waits" pass and finishes the run; if the step
wins, the wait continuation fires later and no-ops on the terminal
run via the existing terminal-event check.

Other cleanups:
- The inline-step retry path no longer needs to forward
  suspensionResult.timeoutSeconds — the wait timer was already
  enqueued as part of the unified dispatch above.
- A dead post-step `if (timeoutSeconds && pendingSteps.length === 1)`
  block (just a comment, no body) is removed; the loop's
  "complete elapsed waits" pass handles the same case correctly.
- Step queueing now uses a shared `traceCarrier` rather than
  re-serializing per step.

Retry/throttle and hook-conflict paths still return { timeoutSeconds }
since their semantics are "redeliver THIS message after a delay"
rather than "schedule a fresh wait timer." Those can be unified in
a follow-up.

Test plan:
- New e2e tests `sleepWinsRaceWorkflow` and `stepWinsRaceWorkflow`
  pass against the `nextjs-turbopack` workbench.
- Event log inspection confirms wait_completed fires at t≈1s (after
  wait_created at t≈0s) for the sleep-wins case, and that the inline
  step runs only once (no duplicate step_started events that the
  earlier eager-queue approach produced in dev).
- All 842 @workflow/core unit tests pass.
- All 346 @workflow/world-local unit tests pass (with the
  delaySeconds support added in the previous commit).

Requires the world-local delaySeconds fix in the prior commit;
without it, wait continuations would fire instantly in dev and the
parallel replay would re-enter handleSuspension before the wait
elapsed (recoverable via existing redelivery, but inefficient).
Update the "Mixed Suspensions" section in eager-processing.mdx to
describe the unified parallel-dispatch model:

- All non-inline pendingSteps are queued with stepId
- The wait timer (if any) is queued as a delayed continuation
- All dispatched in one Promise.all batch
- One owned step is then inline-executed (if any)

The doc previously described Option A (the carve-out where waits
forced all steps to be queued); the unified model removes that
carve-out and explains why the wait continuation works in parallel
with the inline step.

Also notes the dependency on world-local's new delaySeconds support
(landed earlier in the same PR series).

Changeset bumps both @workflow/core and @workflow/world-local since
both packages have user-observable behavior changes.

github-actions Bot commented May 5, 2026

🧪 E2E Test Results

Some tests failed

Summary

Passed Failed Skipped Total
✅ ▲ Vercel Production 925 0 219 1144
✅ 💻 Local Development 1237 0 219 1456
✅ 📦 Local Production 1237 0 219 1456
❌ 🐘 Local Postgres 1226 11 219 1456
✅ 🪟 Windows 104 0 0 104
❌ 📋 Other 551 1 176 728
Total 5280 12 1052 6344

❌ Failed Tests

🐘 Local Postgres (11 failed)

astro-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

express-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

fastify-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

hono-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

nextjs-turbopack-stable-lazy-discovery-disabled (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

nextjs-webpack-canary (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

nextjs-webpack-stable-lazy-discovery-disabled (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

nextjs-webpack-stable-lazy-discovery-enabled (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

nuxt-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

sveltekit-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

vite-stable (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM
📋 Other (1 failed)

e2e-local-postgres-tanstack-start- (1 failed):

  • sleepWinsRaceWorkflow | wrun_01KQTVMS5A2CXBVF608G4JKVDM

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 78 0 26
✅ example 78 0 26
✅ express 78 0 26
✅ fastify 78 0 26
✅ hono 78 0 26
✅ nextjs-turbopack 102 0 2
✅ nextjs-webpack 102 0 2
✅ nitro 78 0 26
✅ nuxt 78 0 26
✅ sveltekit 97 0 7
✅ vite 78 0 26
✅ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 79 0 25
✅ express-stable 79 0 25
✅ fastify-stable 79 0 25
✅ hono-stable 79 0 25
✅ nextjs-turbopack-canary 85 0 19
✅ nextjs-turbopack-stable-lazy-discovery-disabled 104 0 0
✅ nextjs-turbopack-stable-lazy-discovery-enabled 104 0 0
✅ nextjs-webpack-canary 85 0 19
✅ nextjs-webpack-stable-lazy-discovery-disabled 104 0 0
✅ nextjs-webpack-stable-lazy-discovery-enabled 104 0 0
✅ nitro-stable 79 0 25
✅ nuxt-stable 79 0 25
✅ sveltekit-stable 98 0 6
✅ vite-stable 79 0 25
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 79 0 25
✅ express-stable 79 0 25
✅ fastify-stable 79 0 25
✅ hono-stable 79 0 25
✅ nextjs-turbopack-canary 85 0 19
✅ nextjs-turbopack-stable-lazy-discovery-disabled 104 0 0
✅ nextjs-turbopack-stable-lazy-discovery-enabled 104 0 0
✅ nextjs-webpack-canary 85 0 19
✅ nextjs-webpack-stable-lazy-discovery-disabled 104 0 0
✅ nextjs-webpack-stable-lazy-discovery-enabled 104 0 0
✅ nitro-stable 79 0 25
✅ nuxt-stable 79 0 25
✅ sveltekit-stable 98 0 6
✅ vite-stable 79 0 25
❌ 🐘 Local Postgres
App Passed Failed Skipped
❌ astro-stable 78 1 25
❌ express-stable 78 1 25
❌ fastify-stable 78 1 25
❌ hono-stable 78 1 25
✅ nextjs-turbopack-canary 85 0 19
❌ nextjs-turbopack-stable-lazy-discovery-disabled 103 1 0
✅ nextjs-turbopack-stable-lazy-discovery-enabled 104 0 0
❌ nextjs-webpack-canary 84 1 19
❌ nextjs-webpack-stable-lazy-discovery-disabled 103 1 0
❌ nextjs-webpack-stable-lazy-discovery-enabled 103 1 0
✅ nitro-stable 79 0 25
❌ nuxt-stable 78 1 25
❌ sveltekit-stable 97 1 6
❌ vite-stable 78 1 25
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 104 0 0
❌ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 79 0 25
✅ e2e-local-dev-tanstack-start- 79 0 25
✅ e2e-local-postgres-nest-stable 79 0 25
❌ e2e-local-postgres-tanstack-start- 78 1 25
✅ e2e-local-prod-nest-stable 79 0 25
✅ e2e-local-prod-tanstack-start- 79 0 25
✅ e2e-vercel-prod-tanstack-start 78 0 26

📋 View full workflow run


Some E2E test jobs failed:

  • Vercel Prod: success
  • Local Dev: success
  • Local Prod: success
  • Local Postgres: failure
  • Windows: success

Check the workflow run for details.


github-actions Bot commented May 5, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 0.033s (-26.6% 🟢) 1.006s (~) 0.973s 10 1.00x
💻 Local Nitro 0.034s (-22.3% 🟢) 1.005s (~) 0.972s 10 1.03x
🐘 Postgres Express 0.046s (-20.3% 🟢) 1.013s (~) 0.967s 10 1.42x
🐘 Postgres Nitro 0.047s (-50.3% 🟢) 1.013s (-2.9%) 0.965s 10 1.46x
💻 Local Next.js (Turbopack) 0.053s 1.006s 0.953s 10 1.62x
🌐 Redis Next.js (Turbopack) 0.054s 1.005s 0.951s 10 1.66x
🐘 Postgres Next.js (Turbopack) 0.057s 1.012s 0.955s 10 1.76x
🌐 MongoDB Next.js (Turbopack) 0.120s 1.007s 0.887s 10 3.70x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.310s (-24.2% 🟢) 2.090s (-16.7% 🟢) 1.779s 10 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 1.069s (-5.0% 🟢) 2.006s (~) 0.938s 10 1.00x
💻 Local Nitro 1.070s (-5.4% 🟢) 2.006s (~) 0.937s 10 1.00x
🐘 Postgres Express 1.075s (-6.3% 🟢) 2.009s (~) 0.934s 10 1.01x
🐘 Postgres Nitro 1.084s (-4.9%) 2.009s (~) 0.924s 10 1.01x
💻 Local Next.js (Turbopack) 1.111s 2.006s 0.894s 10 1.04x
🌐 Redis Next.js (Turbopack) 1.116s 2.006s 0.890s 10 1.04x
🐘 Postgres Next.js (Turbopack) 1.118s 2.010s 0.892s 10 1.05x
🌐 MongoDB Next.js (Turbopack) 1.160s 2.008s 0.847s 10 1.09x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.620s (-58.4% 🟢) 3.787s (-35.9% 🟢) 2.167s 10 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 10.389s (-5.2% 🟢) 11.015s (~) 0.626s 3 1.00x
💻 Local Nitro 10.395s (-5.0% 🟢) 11.021s (~) 0.626s 3 1.00x
💻 Local Express 10.411s (-4.7%) 11.022s (~) 0.611s 3 1.00x
🐘 Postgres Nitro 10.434s (-4.0%) 11.017s (~) 0.583s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.659s 11.022s 0.364s 3 1.03x
💻 Local Next.js (Turbopack) 10.689s 11.022s 0.333s 3 1.03x
🐘 Postgres Next.js (Turbopack) 10.704s 11.017s 0.313s 3 1.03x
🌐 MongoDB Next.js (Turbopack) 10.742s 11.017s 0.275s 3 1.03x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 14.713s (-38.0% 🟢) 16.568s (-34.0% 🟢) 1.855s 2 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 13.429s (-7.9% 🟢) 14.016s (-6.7% 🟢) 0.587s 5 1.00x
💻 Local Nitro 13.455s (-10.7% 🟢) 14.027s (-12.5% 🟢) 0.571s 5 1.00x
💻 Local Express 13.479s (-10.0% 🟢) 14.028s (-6.7% 🟢) 0.549s 5 1.00x
🐘 Postgres Nitro 13.483s (-7.6% 🟢) 14.019s (-6.7% 🟢) 0.535s 5 1.00x
🌐 Redis Next.js (Turbopack) 14.005s 14.027s 0.022s 5 1.04x
🐘 Postgres Next.js (Turbopack) 14.103s 15.017s 0.914s 4 1.05x
💻 Local Next.js (Turbopack) 14.116s 15.030s 0.914s 4 1.05x
🌐 MongoDB Next.js (Turbopack) 14.206s 15.022s 0.816s 4 1.06x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 21.904s (-66.0% 🟢) 24.166s (-63.7% 🟢) 2.262s 3 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 11.839s (-15.5% 🟢) 12.018s (-17.6% 🟢) 0.179s 8 1.00x
💻 Local Nitro 11.868s (-29.3% 🟢) 12.021s (-29.4% 🟢) 0.153s 8 1.00x
🐘 Postgres Nitro 11.909s (-14.7% 🟢) 12.016s (-16.0% 🟢) 0.108s 8 1.01x
💻 Local Express 11.962s (-27.9% 🟢) 12.397s (-27.2% 🟢) 0.435s 8 1.01x
🌐 Redis Next.js (Turbopack) 12.882s 13.024s 0.142s 7 1.09x
💻 Local Next.js (Turbopack) 13.054s 13.598s 0.544s 7 1.10x
🌐 MongoDB Next.js (Turbopack) 13.238s 14.018s 0.779s 7 1.12x
🐘 Postgres Next.js (Turbopack) 13.258s 14.018s 0.760s 7 1.12x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 318.648s (-24.7% 🟢) 321.481s (-24.3% 🟢) 2.833s 1 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.171s (-7.1% 🟢) 2.007s (~) 0.837s 15 1.00x
🐘 Postgres Nitro 1.179s (-7.5% 🟢) 2.007s (~) 0.829s 15 1.01x
💻 Local Nitro 1.180s (-27.6% 🟢) 2.006s (-3.3%) 0.826s 15 1.01x
💻 Local Express 1.184s (-20.5% 🟢) 2.005s (~) 0.822s 15 1.01x
🐘 Postgres Next.js (Turbopack) 1.254s 2.007s 0.753s 15 1.07x
💻 Local Next.js (Turbopack) 1.308s 2.006s 0.698s 15 1.12x
🌐 Redis Next.js (Turbopack) 1.412s 2.006s 0.594s 15 1.21x
🌐 MongoDB Next.js (Turbopack) 2.024s 2.826s 0.801s 11 1.73x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.509s (-11.0% 🟢) 4.483s (+3.7%) 1.974s 7 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.300s (-44.9% 🟢) 2.008s (-33.3% 🟢) 0.708s 15 1.00x
🐘 Postgres Nitro 1.332s (-43.3% 🟢) 2.150s (-28.5% 🟢) 0.818s 14 1.02x
🐘 Postgres Next.js (Turbopack) 1.555s 2.151s 0.596s 14 1.20x
💻 Local Express 1.755s (-40.6% 🟢) 2.008s (-41.9% 🟢) 0.253s 15 1.35x
💻 Local Nitro 1.764s (-43.9% 🟢) 2.006s (-48.4% 🟢) 0.242s 15 1.36x
💻 Local Next.js (Turbopack) 1.868s 2.073s 0.205s 15 1.44x
🌐 Redis Next.js (Turbopack) 2.360s 3.008s 0.648s 10 1.82x
🌐 MongoDB Next.js (Turbopack) 3.561s 4.008s 0.446s 8 2.74x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.977s (-26.5% 🟢) 5.028s (-15.1% 🟢) 2.051s 6 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.589s (-54.4% 🟢) 3.455s (-13.9% 🟢) 1.865s 9 1.00x
🐘 Postgres Nitro 1.679s (-51.8% 🟢) 3.345s (-16.6% 🟢) 1.666s 9 1.06x
🐘 Postgres Next.js (Turbopack) 2.457s 3.458s 1.001s 9 1.55x
🌐 Redis Next.js (Turbopack) 3.668s 4.009s 0.341s 8 2.31x
💻 Local Nitro 5.218s (-37.5% 🟢) 5.681s (-37.0% 🟢) 0.463s 6 3.28x
💻 Local Express 5.364s (-35.7% 🟢) 5.846s (-35.2% 🟢) 0.482s 6 3.37x
💻 Local Next.js (Turbopack) 5.967s 6.615s 0.648s 5 3.75x
🌐 MongoDB Next.js (Turbopack) 6.278s 7.011s 0.733s 5 3.95x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.369s (-4.4%) 5.644s (+2.0%) 2.275s 6 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.151s (-8.4% 🟢) 2.007s (~) 0.856s 15 1.00x
🐘 Postgres Nitro 1.167s (-7.2% 🟢) 2.007s (~) 0.840s 15 1.01x
🐘 Postgres Next.js (Turbopack) 1.254s 2.007s 0.753s 15 1.09x
💻 Local Next.js (Turbopack) 1.330s 2.006s 0.676s 15 1.16x
🌐 Redis Next.js (Turbopack) 1.370s 2.006s 0.636s 15 1.19x
💻 Local Nitro 1.394s (-25.3% 🟢) 2.006s (-14.3% 🟢) 0.612s 15 1.21x
💻 Local Express 1.411s (-25.5% 🟢) 2.006s (-15.1% 🟢) 0.595s 15 1.23x
🌐 MongoDB Next.js (Turbopack) 2.032s 2.735s 0.703s 11 1.77x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.304s (-6.3% 🟢) 4.392s (+5.3% 🔺) 2.088s 7 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.273s (-45.6% 🟢) 2.008s (-33.3% 🟢) 0.735s 15 1.00x
🐘 Postgres Nitro 1.288s (-44.9% 🟢) 2.008s (-33.3% 🟢) 0.719s 15 1.01x
🐘 Postgres Next.js (Turbopack) 1.506s 2.006s 0.500s 15 1.18x
💻 Local Nitro 2.026s (-33.9% 🟢) 2.509s (-35.5% 🟢) 0.482s 12 1.59x
💻 Local Express 2.065s (-34.1% 🟢) 2.508s (-33.3% 🟢) 0.444s 12 1.62x
💻 Local Next.js (Turbopack) 2.133s 3.008s 0.875s 10 1.68x
🌐 Redis Next.js (Turbopack) 2.360s 3.008s 0.648s 10 1.85x
🌐 MongoDB Next.js (Turbopack) 3.560s 4.009s 0.449s 8 2.80x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.616s (-19.1% 🟢) 4.624s (-8.9% 🟢) 2.008s 7 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.499s (-57.2% 🟢) 3.112s (-22.4% 🟢) 1.613s 10 1.00x
🐘 Postgres Nitro 1.611s (-53.7% 🟢) 3.210s (-19.9% 🟢) 1.599s 10 1.07x
🐘 Postgres Next.js (Turbopack) 2.789s 3.681s 0.892s 9 1.86x
🌐 Redis Next.js (Turbopack) 3.640s 4.009s 0.369s 8 2.43x
💻 Local Next.js (Turbopack) 5.638s 6.213s 0.575s 5 3.76x
💻 Local Nitro 5.801s (-36.6% 🟢) 6.416s (-36.0% 🟢) 0.615s 5 3.87x
💻 Local Express 5.810s (-34.0% 🟢) 6.215s (-33.0% 🟢) 0.405s 5 3.88x
🌐 MongoDB Next.js (Turbopack) 6.283s 7.013s 0.731s 5 4.19x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.971s (-22.0% 🟢) 5.666s (-16.9% 🟢) 1.695s 6 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 10 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 0.460s (-44.0% 🟢) 1.006s (~) 0.546s 60 1.00x
💻 Local Nitro 0.464s (-52.7% 🟢) 1.004s (-8.2% 🟢) 0.540s 60 1.01x
🐘 Postgres Express 0.467s (-44.4% 🟢) 1.023s (~) 0.556s 59 1.01x
💻 Local Express 0.481s (-51.1% 🟢) 1.004s (-6.7% 🟢) 0.524s 60 1.05x
🌐 Redis Next.js (Turbopack) 0.629s 1.004s 0.375s 60 1.37x
🐘 Postgres Next.js (Turbopack) 0.666s 1.006s 0.339s 60 1.45x
🌐 MongoDB Next.js (Turbopack) 0.714s 1.005s 0.291s 60 1.55x
💻 Local Next.js (Turbopack) 0.719s 1.021s 0.302s 59 1.56x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 4.956s (-77.5% 🟢) 6.833s (-71.6% 🟢) 1.877s 9 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 25 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.003s (-49.3% 🟢) 1.306s (-42.2% 🟢) 0.303s 70 1.00x
🐘 Postgres Nitro 1.066s (-44.7% 🟢) 1.738s (-17.3% 🟢) 0.672s 52 1.06x
💻 Local Nitro 1.157s (-61.9% 🟢) 2.006s (-46.6% 🟢) 0.850s 45 1.15x
💻 Local Express 1.196s (-60.3% 🟢) 2.006s (-44.1% 🟢) 0.810s 45 1.19x
🌐 Redis Next.js (Turbopack) 1.499s 2.006s 0.507s 45 1.49x
🐘 Postgres Next.js (Turbopack) 1.599s 2.007s 0.408s 45 1.59x
💻 Local Next.js (Turbopack) 1.761s 2.029s 0.268s 45 1.76x
🌐 MongoDB Next.js (Turbopack) 1.780s 2.006s 0.226s 45 1.78x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 14.236s (-63.9% 🟢) 16.374s (-60.3% 🟢) 2.139s 6 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 50 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 2.201s (-46.4% 🟢) 2.865s (-37.8% 🟢) 0.664s 42 1.00x
🐘 Postgres Express 2.203s (-44.8% 🟢) 2.718s (-37.8% 🟢) 0.515s 45 1.00x
💻 Local Nitro 2.686s (-71.1% 🟢) 3.032s (-69.7% 🟢) 0.346s 40 1.22x
💻 Local Express 2.688s (-70.8% 🟢) 3.032s (-69.7% 🟢) 0.344s 40 1.22x
🌐 Redis Next.js (Turbopack) 2.965s 3.191s 0.226s 38 1.35x
🐘 Postgres Next.js (Turbopack) 3.180s 4.009s 0.829s 30 1.44x
💻 Local Next.js (Turbopack) 3.750s 4.075s 0.325s 30 1.70x
🌐 MongoDB Next.js (Turbopack) 4.047s 4.971s 0.923s 25 1.84x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 40.157s (-58.6% 🟢) 42.769s (-56.5% 🟢) 2.612s 3 1.00x
▲ Vercel Express ⚠️ missing - - - -
▲ Vercel Next.js (Turbopack) ⚠️ missing - - - -

🔍 Observability: Nitro

workflow with 10 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.200s (-29.1% 🟢) 1.006s (~) 0.806s 60 1.00x
🐘 Postgres Nitro 0.210s (-25.9% 🟢) 1.006s (~) 0.796s 60 1.05x
🐘 Postgres Next.js (Turbopack) 0.264s 1.006s 0.742s 60 1.32x
🌐 Redis Next.js (Turbopack) 0.270s 1.004s 0.734s 60 1.35x
💻 Local Express 0.446s (-20.5% 🟢) 1.004s (~) 0.558s 60 2.22x
💻 Local Nitro 0.464s (-23.3% 🟢) 1.004s (-1.7%) 0.541s 60 2.31x
💻 Local Next.js (Turbopack) 0.548s 1.022s 0.473s 59 2.74x
🌐 MongoDB Next.js (Turbopack) 1.054s 2.006s 0.952s 30 5.26x

▲ Production (Vercel)

| World | Framework | Workflow Time | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 1.933s (+16.4% 🔺) | 4.035s (+20.4% 🔺) | 2.102s | 15 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - |

🔍 Observability: Nitro

workflow with 25 concurrent data payload steps (10KB)

💻 Local Development

| World | Framework | Workflow Time | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 0.350s (-31.4% 🟢) | 1.006s (~) | 0.656s | 90 | 1.00x |
| 🐘 Postgres | Nitro | 0.376s (-24.2% 🟢) | 1.006s (~) | 0.630s | 90 | 1.08x |
| 🌐 Redis | Next.js (Turbopack) | 0.428s | 1.004s | 0.576s | 90 | 1.22x |
| 🐘 Postgres | Next.js (Turbopack) | 0.558s | 1.040s | 0.482s | 88 | 1.60x |
| 💻 Local | Nitro | 2.189s (-13.7% 🟢) | 2.943s (-2.2%) | 0.754s | 31 | 6.26x |
| 💻 Local | Next.js (Turbopack) | 2.228s | 2.944s | 0.716s | 31 | 6.37x |
| 💻 Local | Express | 2.248s (-10.5% 🟢) | 2.943s (-2.2%) | 0.695s | 31 | 6.43x |
| 🌐 MongoDB | Next.js (Turbopack) | 2.601s | 3.006s | 0.405s | 30 | 7.44x |

▲ Production (Vercel)

| World | Framework | Workflow Time | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 3.385s (+4.9%) | 5.641s (+17.0% 🔺) | 2.257s | 16 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - |

🔍 Observability: Nitro

workflow with 50 concurrent data payload steps (10KB)

💻 Local Development

| World | Framework | Workflow Time | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 0.698s (-14.8% 🟢) | 1.183s (+16.3% 🔺) | 0.485s | 102 | 1.00x |
| 🐘 Postgres | Nitro | 0.733s (-7.2% 🟢) | 1.171s (+16.3% 🔺) | 0.438s | 103 | 1.05x |
| 🌐 Redis | Next.js (Turbopack) | 0.783s | 1.004s | 0.221s | 120 | 1.12x |
| 🐘 Postgres | Next.js (Turbopack) | 2.374s | 3.166s | 0.793s | 39 | 3.40x |
| 🌐 MongoDB | Next.js (Turbopack) | 5.404s | 6.012s | 0.608s | 20 | 7.75x |
| 💻 Local | Nitro | 10.217s (-8.7% 🟢) | 10.778s (-7.6% 🟢) | 0.561s | 12 | 14.65x |
| 💻 Local | Express | 10.475s (-6.4% 🟢) | 10.940s (-8.4% 🟢) | 0.465s | 11 | 15.02x |
| 💻 Local | Next.js (Turbopack) | 11.322s | 12.027s | 0.705s | 10 | 16.23x |

▲ Production (Vercel)

| World | Framework | Workflow Time | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 8.380s (+8.5% 🔺) | 10.586s (+12.6% 🔺) | 2.205s | 12 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - |

🔍 Observability: Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 1.124s (+448.1% 🔺) | 2.003s (+100.6% 🔺) | 0.001s (-31.3% 🟢) | 2.010s (+98.7% 🔺) | 0.886s | 10 | 1.00x |
| 💻 Local | Express | 1.132s (+468.4% 🔺) | 2.004s (+99.5% 🔺) | 0.012s (~) | 2.019s (+98.3% 🔺) | 0.887s | 10 | 1.01x |
| 🐘 Postgres | Nitro | 1.141s (+456.6% 🔺) | 1.994s (+99.5% 🔺) | 0.001s (-20.0% 🟢) | 2.010s (+98.8% 🔺) | 0.869s | 10 | 1.02x |
| 💻 Local | Nitro | 1.143s (+434.8% 🔺) | 2.005s (+99.6% 🔺) | 0.012s (-1.6%) | 2.019s (+98.2% 🔺) | 0.877s | 10 | 1.02x |
| 💻 Local | Next.js (Turbopack) | 1.173s | 2.004s | 0.012s | 2.020s | 0.846s | 10 | 1.04x |
| 🐘 Postgres | Next.js (Turbopack) | 1.194s | 2.002s | 0.001s | 2.012s | 0.818s | 10 | 1.06x |
| 🌐 MongoDB | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |
| 🌐 Redis | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 2.454s (-36.0% 🟢) | 3.876s (-26.5% 🟢) | 1.440s (+94.1% 🔺) | 5.824s (-10.2% 🟢) | 3.370s | 10 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

🔍 Observability: Nitro

stream pipeline with 5 transform steps (1MB)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 1.501s (+138.3% 🔺) | 2.009s (+99.6% 🔺) | 0.004s (+2.7%) | 2.026s (+98.1% 🔺) | 0.525s | 30 | 1.00x |
| 🐘 Postgres | Nitro | 1.523s (+144.0% 🔺) | 2.005s (+99.2% 🔺) | 0.004s (-8.1% 🟢) | 2.026s (+98.2% 🔺) | 0.503s | 30 | 1.01x |
| 🐘 Postgres | Next.js (Turbopack) | 1.682s | 2.010s | 0.004s | 2.024s | 0.342s | 30 | 1.12x |
| 💻 Local | Express | 1.705s (+125.2% 🔺) | 2.011s (+95.4% 🔺) | 0.009s (-3.6%) | 2.200s (+111.6% 🔺) | 0.495s | 28 | 1.14x |
| 💻 Local | Next.js (Turbopack) | 1.706s | 2.010s | 0.010s | 2.024s | 0.317s | 30 | 1.14x |
| 💻 Local | Nitro | 1.723s (+105.5% 🔺) | 2.013s (+98.9% 🔺) | 0.009s (-5.1% 🟢) | 2.202s (+97.4% 🔺) | 0.479s | 28 | 1.15x |
| 🌐 MongoDB | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |
| 🌐 Redis | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 5.996s (-79.6% 🟢) | 7.667s (-75.1% 🟢) | 0.456s (+307.1% 🔺) | 8.571s (-73.0% 🟢) | 2.575s | 8 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

🔍 Observability: Nitro

10 parallel streams (1MB each)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 0.699s (-27.3% 🟢) | 1.029s (-19.5% 🟢) | 0.000s (-20.7% 🟢) | 1.047s (-19.8% 🟢) | 0.348s | 58 | 1.00x |
| 🐘 Postgres | Nitro | 0.706s (-27.1% 🟢) | 1.026s (-17.7% 🟢) | 0.000s (-58.6% 🟢) | 1.049s (-16.6% 🟢) | 0.343s | 58 | 1.01x |
| 🐘 Postgres | Next.js (Turbopack) | 0.967s | 1.363s | 0.000s | 1.371s | 0.404s | 44 | 1.38x |
| 💻 Local | Express | 1.338s (+9.2% 🔺) | 2.016s (~) | 0.001s (+60.0% 🔺) | 2.018s (~) | 0.681s | 30 | 1.91x |
| 💻 Local | Nitro | 1.353s (+10.6% 🔺) | 2.015s (~) | 0.000s (+266.7% 🔺) | 2.017s (~) | 0.664s | 30 | 1.94x |
| 💻 Local | Next.js (Turbopack) | 1.471s | 2.015s | 0.000s | 2.018s | 0.546s | 30 | 2.11x |
| 🌐 MongoDB | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |
| 🌐 Redis | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 3.099s (+1.6%) | 5.057s (+15.1% 🔺) | 0.000s (-100.0% 🟢) | 5.668s (+17.9% 🔺) | 2.569s | 11 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

🔍 Observability: Nitro

fan-out fan-in 10 streams (1MB each)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🐘 Postgres | 🥇 Express | 1.432s (-19.2% 🟢) | 2.099s (-3.6%) | 0.000s (+Infinity% 🔺) | 2.116s (-3.8%) | 0.684s | 29 | 1.00x |
| 🐘 Postgres | Nitro | 1.442s (-19.5% 🟢) | 2.058s (-3.9%) | 0.000s (-100.0% 🟢) | 2.116s (-2.7%) | 0.674s | 29 | 1.01x |
| 🐘 Postgres | Next.js (Turbopack) | 2.122s | 2.731s | 0.000s | 2.738s | 0.616s | 22 | 1.48x |
| 💻 Local | Next.js (Turbopack) | 2.851s | 3.291s | 0.001s | 3.296s | 0.445s | 19 | 1.99x |
| 💻 Local | Nitro | 3.023s (-10.7% 🟢) | 3.779s (-6.3% 🟢) | 0.000s (-29.7% 🟢) | 3.781s (-6.3% 🟢) | 0.758s | 16 | 2.11x |
| 💻 Local | Express | 3.151s (-9.1% 🟢) | 3.905s (-3.2%) | 0.000s (-92.2% 🟢) | 3.907s (-3.2%) | 0.756s | 16 | 2.20x |
| 🌐 MongoDB | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |
| 🌐 Redis | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 4.760s (+16.3% 🔺) | 6.196s (+15.3% 🔺) | 0.000s (-100.0% 🟢) | 6.961s (+20.1% 🔺) | 2.201s | 9 | 1.00x |
| ▲ Vercel | Express | ⚠️ missing | - | - | - | - | - | - |
| ▲ Vercel | Next.js (Turbopack) | ⚠️ missing | - | - | - | - | - | - |

🔍 Observability: Nitro

Summary

Fastest Framework by World

Winner determined by most benchmark wins

| World | 🥇 Fastest Framework | Wins |
| --- | --- | --- |
| 💻 Local | Nitro | 11/21 |
| 🐘 Postgres | Express | 19/21 |
| ▲ Vercel | Nitro | 21/21 |

Fastest World by Framework

Winner determined by most benchmark wins

| Framework | 🥇 Fastest World | Wins |
| --- | --- | --- |
| Express | 🐘 Postgres | 19/21 |
| Next.js (Turbopack) | 🐘 Postgres | 10/21 |
| Nitro | 🐘 Postgres | 16/21 |

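The "most benchmark wins" rule used by both summary tables can be sketched as follows. This is a hypothetical reconstruction, not the testbench's actual code; the `Result` shape and function names are assumptions for illustration:

```typescript
// Hypothetical sketch: tally which framework wins the most benchmarks per world.
// A "win" means having the lowest Workflow Time for a given (world, benchmark) pair.
type Result = { benchmark: string; world: string; framework: string; workflowTime: number };

function fastestFrameworkByWorld(results: Result[]): Map<string, string> {
  // Group results by (world, benchmark).
  const groups = new Map<string, Result[]>();
  for (const r of results) {
    const key = `${r.world}::${r.benchmark}`;
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key)!.push(r);
  }
  // Find the fastest framework per group and count wins per world.
  const wins = new Map<string, Map<string, number>>();
  for (const [key, rs] of groups) {
    const world = key.split("::")[0];
    const winner = rs.reduce((a, b) => (a.workflowTime <= b.workflowTime ? a : b));
    const perWorld = wins.get(world) ?? new Map<string, number>();
    perWorld.set(winner.framework, (perWorld.get(winner.framework) ?? 0) + 1);
    wins.set(world, perWorld);
  }
  // The framework with the most wins in each world is that world's 🥇.
  const out = new Map<string, string>();
  for (const [world, perWorld] of wins) {
    out.set(world, [...perWorld.entries()].sort((a, b) => b[1] - a[1])[0][0]);
  }
  return out;
}
```

The "Fastest World by Framework" table is the same tally with the grouping axes swapped.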
Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark
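The arithmetic behind these columns can be sketched as below. This is a minimal illustration of the definitions above, assuming hypothetical field names (`createdAt`, `completedAt`, `wallStart`, `wallEnd`); it is not the testbench's actual code:

```typescript
// Hypothetical sample shape: timestamps in milliseconds.
interface Sample {
  createdAt: number;   // workflow created
  completedAt: number; // workflow completed
  wallStart: number;   // testbench triggered the workflow
  wallEnd: number;     // testbench observed the result via polling
}

function metrics(s: Sample, fastestWorkflowTime: number) {
  const workflowTime = s.completedAt - s.createdAt; // primary metric
  const wallTime = s.wallEnd - s.wallStart;         // trigger + poll
  const overhead = wallTime - workflowTime;         // testbench overhead
  const vsFastest = workflowTime / fastestWorkflowTime; // 1.00x for the winner
  return { workflowTime, wallTime, overhead, vsFastest };
}
```

TTFB and Slurp follow the same pattern for stream benchmarks: first-byte time minus workflow start, and stream-end time minus first-byte time, respectively.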

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run


Some benchmark jobs failed:

  • Local: success
  • Postgres: success
  • Vercel: failure

Check the workflow run for details.

@VaguelySerious VaguelySerious changed the title [core] Enqueue all steps/waits before inline step execution, remove timeoutSeconds return [core] V2: unify wait+step queue dispatch in suspension processing May 5, 2026