# Performance Benchmark: Go rewrite vs Node.js
Benchmarks comparing the Go rewrite (`a920db4`) against the previous Node.js implementation (`f6d0868`) of reverse-shell. Both versions are checked out from git, built from source, and validated for correctness before benchmarking; responses are verified to be identical between implementations.
## Test Environment
## Results
### 1. Cold Start — exec to first HTTP response
Measures the full lifecycle: process `exec`, runtime init, `listen`, `accept`, process request, write response, client reads full response. 100 runs, zero failures. Go serves its first response in ~3 ms; Node.js needs ~54 ms to boot the V8 engine, JIT-compile the JS, and respond.
#### Methodology
A compiled Go harness `exec`s the server process and immediately enters a tight `net.Dial` loop (each failed attempt takes ~microseconds on localhost — an immediate `ECONNREFUSED`). The instant the TCP connection succeeds (the kernel `listen()` backlog is active), a raw `HTTP/1.0` GET request is written onto the connection. Those bytes land in the kernel TCP receive buffer before the server has even called `accept()`. The harness then blocks on `read()` until the full response arrives and validates that it contains all expected payload strings. The clock runs from `exec` to the final response byte.

### 2. Hot Response — serial latency (c=1)
Single-connection latency after warmup. 10,000 requests.
Both are sub-millisecond. Node.js is marginally faster — V8's JIT-compiled string templates outperform Go's `fmt.Sprintf` for this trivial workload once hot.

### 3. Hot Response — throughput (c=50)
50 concurrent connections, 50,000 requests after warmup.
Effectively identical. Both push ~37k req/s. The workload is pure string templating — neither runtime is the bottleneck.
### 4. Memory Usage (RSS)
Go's memory is 5–6x smaller and stays nearly flat under load. Node.js carries V8 heap, JIT compiler, and GC overhead regardless of workload complexity.
### 5. CPU Time
Cumulative server process CPU time across all hot benchmarks (~110k total requests):
Node.js used ~2x less CPU for the same work — V8's event loop is efficient for this I/O-bound, string-heavy workload. Go's goroutine-per-connection model has more scheduling overhead for trivial handlers.
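Cumulative CPU time for a child process is available from the rusage the kernel hands back on wait; in Go that surfaces as `ProcessState`. A sketch of the accounting (a short-lived command stands in for the servers, which would be stopped before reading the same counters):

```go
package main

import (
	"os/exec"
	"time"
)

// cpuTime runs a command to completion and returns the user+system CPU
// time the kernel accounted to it.
func cpuTime(name string, args ...string) (time.Duration, error) {
	cmd := exec.Command(name, args...)
	if err := cmd.Run(); err != nil {
		return 0, err
	}
	st := cmd.ProcessState // rusage is collected by Wait, which Run calls
	return st.UserTime() + st.SystemTime(), nil
}
```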
### 6. Deployable Size
Everything needed to run each version:
Node.js needs the runtime (`node`) plus the app (`api/index.js`); Go compiles to a single self-contained binary — 13.7x smaller than the Node.js runtime alone. Copy it anywhere and it runs.
## Summary
### Where Go wins

Cold start. For a deployment behind a long-lived edge cache (`s-maxage=2592000`) that spins up instances on-demand, this is the metric that matters most.

### Where it doesn't matter
Hot path performance is identical for this workload. The handler is trivial string templating — neither runtime is the bottleneck. In production, responses are served from the edge cache anyway.
## Reproducing
The benchmark checks out both versions from git, builds them, validates responses are correct and identical, then runs all tests. Two files — drop them anywhere in the repo:
- `bench/run.sh` — orchestration
- `bench/harness.go` — cold start measurement