YaiTimeout is the first canonical example target for prr.
It is a small vanilla JavaScript utility that replaces setTimeout loops with a batched scheduling approach, intended to avoid saturating the event queue during large bursts of timed work.
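The batching idea can be sketched roughly as follows. This is an illustrative model only, not YaiTimeout's actual implementation: the names (`BatchedTimer`, `schedule`, `cancel`, `drain`) are invented for the example, and a real scheduler would also pull the shared timer forward when an earlier deadline arrives.

```javascript
// Sketch: instead of one setTimeout per task, tasks are queued and a single
// shared timer drains everything that is due. Hypothetical API, not YaiTimeout's.
class BatchedTimer {
  constructor() {
    this.tasks = new Map(); // id -> { due, fn }
    this.nextId = 1;
    this.timer = null;
  }
  schedule(fn, delayMs) {
    const id = this.nextId++;
    this.tasks.set(id, { due: Date.now() + delayMs, fn });
    this.arm();
    return id;
  }
  cancel(id) {
    this.tasks.delete(id); // release the reference so it can be collected
    if (this.tasks.size === 0 && this.timer !== null) {
      clearTimeout(this.timer); // no pending work: stop the shared timer
      this.timer = null;
    }
  }
  arm() {
    if (this.timer !== null) return; // simplification: one timer, never pulled forward
    const soonest = Math.min(...[...this.tasks.values()].map(t => t.due));
    this.timer = setTimeout(() => {
      this.timer = null;
      this.drain();
      if (this.tasks.size > 0) this.arm(); // re-arm for remaining tasks
    }, Math.max(0, soonest - Date.now()));
  }
  drain(now = Date.now()) {
    for (const [id, task] of this.tasks) {
      if (task.due <= now) {
        this.tasks.delete(id); // remove before calling so a throwing task never retries
        try { task.fn(); } catch (e) { /* isolate callback errors from the batch */ }
      }
    }
  }
}
```

Even this toy version surfaces the review themes below: what happens to the shared timer on cancel, what the drain loop does when a callback throws, and which references the task map keeps alive.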
This example is useful for evaluating prr because it is:
- small enough for fast experiments
- stateful enough to trigger real review questions
- performance-sensitive enough to expose weak or noisy review behavior
- representative of the kind of focused utility this project should review well with free/default models
`examples/yai-timeout/yai-timeout.js`: the implementation under review
Inspect the planned review first:

```
prr review examples/yai-timeout/yai-timeout.js --dry-run
```

Run the actual review:

```
prr review examples/yai-timeout/yai-timeout.js
```

If you have extra context that is not obvious from the code, attach a short markdown note:
```
prr review examples/yai-timeout/yai-timeout.js \
  --context-file ./review-context.md
```

Refresh that note whenever the code or review goal changes materially.
Or compare reviewer strategies:

```
prr experiment run examples/yai-timeout/yai-timeout.js --preset smoke
```

Good reviews for this example should focus on:
- cancellation and cleanup behavior
- timer accuracy and drift assumptions
- memory growth and retained references
- callback error handling
- edge cases around empty input, repeated execution, and abort timing
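The abort-timing point is the kind of defect a focused review should catch. A hedged sketch (the names `makeBatchRunner`, `add`, `abort`, `run` are invented for illustration, not taken from the code): without an abort check inside the batch loop, callbacks queued before a cancellation can still fire after it.

```javascript
// Illustrative only: a batch runner whose drain loop re-checks the abort flag
// on every iteration, because a callback may itself call abort().
function makeBatchRunner() {
  let aborted = false;
  const queue = [];
  return {
    add(fn) { queue.push(fn); },
    abort() {
      aborted = true;
      queue.length = 0; // also drop retained references to queued callbacks
    },
    run() {
      for (const fn of queue.splice(0)) {
        if (aborted) break; // guard inside the loop, not just before it
        try { fn(); } catch (e) { /* report, but don't kill the rest of the batch */ }
      }
    },
  };
}
```

A review that only checks the flag once, before the loop, would miss the mid-batch abort case.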
Weak reviews usually drift into:
- generic style preferences
- broad architecture advice without concrete defects
- speculative claims about performance without evidence from the code
The first successful OpenAI-backed benchmark on this file validated the workflow:
- `--context-file` produced a better review than the plain run
- the briefing caught an O(n^2) metrics update path that the plain run missed
- the briefing also reduced wasted exploration enough to use fewer tokens overall
- later rounds worked best when the context note was refreshed after each material code change
The highest-confidence issues from that run were:
- cancellation that can leave `execute()` unsettled
- missing abort/cancel guards inside element callbacks
- unbounded timing log retention
- redundant or misleading timing metrics
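The unbounded-retention finding has a conventional fix: cap the timing log so long-running sessions cannot grow memory without bound. A minimal sketch, assuming nothing about the real data shape; the name `TimingLog` and the cap of 256 entries are illustrative choices, not taken from the code.

```javascript
// Illustrative bounded log: keeps at most `cap` entries, dropping the oldest.
class TimingLog {
  constructor(cap = 256) {
    this.cap = cap;
    this.entries = [];
  }
  record(label, ms) {
    this.entries.push({ label, ms });
    if (this.entries.length > this.cap) this.entries.shift(); // evict oldest entry
  }
}
```

A ring buffer with a fixed-size array would avoid the O(n) `shift()`, but for a small cap the simple version is usually adequate.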
This example is considered done for the current prr proof-of-concept cycle.
Closure criteria met:
- no unresolved high-severity findings from recent rounds
- repeated review rounds converged on duplicate or low-impact suggestions
- behavior is stable enough as a practical `setTimeout` replacement for focused utility workloads
Known limits remain intentional:
- this is not a hard real-time scheduler
- very large bursts can still trade precision for throughput
- callback-heavy usage still depends on callback cost and browser/runtime constraints