diff --git a/.agents/skills/wrdn-effect-atom-reactivity-keys/SKILL.md b/.agents/skills/wrdn-effect-atom-reactivity-keys/SKILL.md new file mode 100644 index 000000000..d480f3652 --- /dev/null +++ b/.agents/skills/wrdn-effect-atom-reactivity-keys/SKILL.md @@ -0,0 +1,25 @@ +--- +name: wrdn-effect-atom-reactivity-keys +description: Add reactivityKeys to effect-atom write mutation calls. Use when lint flags a useAtomSet mutation call that mutates data without invalidation keys. +allowed-tools: Read Grep Glob Bash +--- + +Effect-atom write mutations must say which reads they invalidate. + +## Fix Shape + +- Find the `useAtomSet(...)` write mutation call. +- Add `reactivityKeys` to the mutation payload at the call site. +- Use the narrowest keys that cover the rows/lists affected by the write. +- Keep read-only probe/preview OAuth flows out of this pattern. +- If the mutation should update UI immediately, check whether `wrdn-effect-atom-optimistic` also applies. + +## Good + +```ts +await updateSource({ + params: { scopeId, sourceId }, + payload, + reactivityKeys: [["sources", scopeId]], +}); +``` diff --git a/.agents/skills/wrdn-effect-promise-exit/SKILL.md b/.agents/skills/wrdn-effect-promise-exit/SKILL.md new file mode 100644 index 000000000..a6d7b864a --- /dev/null +++ b/.agents/skills/wrdn-effect-promise-exit/SKILL.md @@ -0,0 +1,115 @@ +--- +name: wrdn-effect-promise-exit +description: Replace React/effect-atom mutation handlers that use promise-mode plus try/catch with promiseExit and explicit Exit handling. Use when lint or review flags try/catch around useAtomSet mutation calls, especially UI handlers that set error/busy state after a failed mutation. +allowed-tools: Read Grep Glob Bash +--- + +You fix one pattern: a React handler awaits an effect-atom mutation in `mode: "promise"` and catches failures with `try/catch`. + +The preferred UI boundary is `mode: "promiseExit"` plus `Exit.isFailure`. 
This keeps mutation failures as values, matches Effect's error model, and prevents optimistic mutation cleanup from depending on thrown exceptions. + +## Trace before changing + +1. **Find the mutation setter.** Look for `const doX = useAtomSet(<atom>, { mode: "promise" })`. +2. **Confirm it is an effect-atom mutation boundary.** The setter should come from `@effect/atom-react` and a mutation atom from `./atoms`, `../api/atoms`, or plugin React atoms. +3. **Find thrown-control handling.** The same handler has `try { await doX(...) } catch (e) { ... }`, usually setting error text, resetting `adding`/`saving`, or showing a toast. +4. **Check for non-mutation async work in the same block.** If the block also awaits follow-up mutations, convert those to `promiseExit` too or keep a narrow boundary only around truly non-effect APIs. +5. **Do not rewrite unrelated local async code.** Probe requests, OAuth popup helpers, `fetch`, and browser APIs may need a different skill unless the lint finding specifically points at the mutation call. + +## Fix shape + +- Change the setter to `{ mode: "promiseExit" }`. +- Import `* as Exit from "effect/Exit"` if missing. +- Import `* as Option from "effect/Option"` only when extracting an optional error. +- Replace `try/catch` around the mutation with: + - `const exit = await doX(args);` + - `if (Exit.isFailure(exit)) { ...; return; }` + - success work after the failure branch. +- Use `Exit.findErrorOption(exit)` when preserving an existing error message or typed error branch. +- Keep existing typed error handling when present, e.g. `SecretInUseError`, `ConnectionInUseError`. + +## Bad + +```tsx +const doAdd = useAtomSet(addGraphqlSource, { mode: "promise" }); + +const handleAdd = async () => { + setAdding(true); + setAddError(null); + try { + await doAdd({ + params: { scopeId }, + payload, + reactivityKeys: sourceWriteKeys, + }); + props.onComplete(); + } catch (e) { + setAddError(e instanceof Error ? 
e.message : "Failed to add source"); + setAdding(false); + } +}; +``` + +## Good + +```tsx +import * as Exit from "effect/Exit"; +import * as Option from "effect/Option"; + +const doAdd = useAtomSet(addGraphqlSource, { mode: "promiseExit" }); + +const handleAdd = async () => { + setAdding(true); + setAddError(null); + const exit = await doAdd({ + params: { scopeId }, + payload, + reactivityKeys: sourceWriteKeys, + }); + if (Exit.isFailure(exit)) { + const error = Exit.findErrorOption(exit); + setAddError( + Option.isSome(error) && error.value instanceof Error + ? error.value.message + : "Failed to add source", + ); + setAdding(false); + return; + } + props.onComplete(); +}; +``` + +## Follow-up mutation chains + +If success work depends on the mutation result, read it after the failure branch: + +```tsx +const exit = await doAdd(args); +if (Exit.isFailure(exit)) { + setAdding(false); + return; +} + +const sourceId = exit.value.namespace; +``` + +If a follow-up effect-atom mutation can fail and the UI treats that as add failure, make that setter `promiseExit` too and branch the same way. Do not put the follow-up mutation in `try/catch` just because the first mutation now returns `Exit`. + +## What not to report + +- `try/catch` around non-effect APIs such as `new URL`, `JSON.parse`, raw `fetch`, or browser popup code. Those may be real lint findings, but they need a different remediation skill. +- `useAtomSet(..., { mode: "promise" })` with no local failure handling and no lint finding. Some call sites intentionally let callers decide the boundary. +- Tests or SDK/server Effect code. This skill is for React/effect-atom UI mutation handlers. +- Manual optimistic placeholder cleanup. Use `wrdn-effect-atom-optimistic` for that; if both patterns appear together, fix optimistic plumbing first, then use `promiseExit` for the remaining mutation boundary. 
+ +## Output requirements + +When reviewing, report: + +- **File and line** of the `useAtomSet(..., { mode: "promise" })` or `try/catch`. +- **Mutation** being called. +- **Why** it should return `Exit` at this UI boundary. +- **Fix**: the exact setter mode and the failure branch to add. + +When editing, keep changes local to the handler and imports unless a follow-up mutation in the same success path must also become `promiseExit`. diff --git a/.agents/skills/wrdn-effect-schema-boundaries/SKILL.md b/.agents/skills/wrdn-effect-schema-boundaries/SKILL.md new file mode 100644 index 000000000..1e192deec --- /dev/null +++ b/.agents/skills/wrdn-effect-schema-boundaries/SKILL.md @@ -0,0 +1,30 @@ +--- +name: wrdn-effect-schema-boundaries +description: Normalize unknown or loosely typed data at boundaries with Effect Schema, named guards, or typed adapters. Use when lint flags double casts, inline object assertions, unknown shape probing, or ad hoc property checks on unknown values. +allowed-tools: Read Grep Glob Bash +--- + +You fix one pattern: domain code is asserting or probing an unknown shape instead of parsing it once at the boundary. + +## Fix Shape + +- Prefer `Schema.decodeUnknownEffect(MySchema)(value)` for untrusted input. +- Keep domain code typed after the decode; do not keep `unknown` and probe it repeatedly. +- Replace `as unknown as X`, `as Record<string, unknown>`, inline object assertions, `"field" in value`, and `Reflect.get` with a schema, typed adapter, or named guard. +- A named guard is acceptable only when parsing is not the right abstraction and the guard has a precise return type. 
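When a named guard really is the right tool, give it a precise predicate return type so callers branch once and stay typed afterward. A minimal sketch, assuming a hypothetical `EndpointConfig` shape and guard name that are not from this repo:

```typescript
// Hypothetical boundary shape; real code would reuse the domain's own type.
interface EndpointConfig {
  readonly endpoint: string;
}

// Named guard with a precise `value is EndpointConfig` return type, replacing
// repeated ad hoc probes of the unknown value at each use site.
const isEndpointConfig = (value: unknown): value is EndpointConfig =>
  typeof value === "object" &&
  value !== null &&
  typeof (value as { endpoint?: unknown }).endpoint === "string";
```

After one guard check at the boundary, domain code works with `EndpointConfig` and never touches `unknown` again.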
+ +## Good + +```ts +const ParsedConfig = Schema.Struct({ + endpoint: Schema.String, +}); + +const config = yield* Schema.decodeUnknownEffect(ParsedConfig)(raw); +``` + +## Bad + +```ts +const config = raw as unknown as { endpoint: string }; +``` diff --git a/.agents/skills/wrdn-effect-schema-inferred-types/SKILL.md b/.agents/skills/wrdn-effect-schema-inferred-types/SKILL.md new file mode 100644 index 000000000..5dbbbaf73 --- /dev/null +++ b/.agents/skills/wrdn-effect-schema-inferred-types/SKILL.md @@ -0,0 +1,107 @@ +--- +name: wrdn-effect-schema-inferred-types +description: Replace duplicated TypeScript shape declarations next to Effect Schema definitions with schema-derived types. Use when lint or review flags an interface/type alias that repeats fields already described by a nearby Schema.Struct, Schema.Union, Schema.TaggedStruct, or other Effect Schema model. +allowed-tools: Read Grep Glob Bash +--- + +You fix one pattern: a runtime `Schema` and a manual TypeScript type describe the same shape. + +The preferred boundary is schema-first. Define the schema once, export `type X = typeof XSchema.Type` or `type X = Schema.Schema.Type<typeof XSchema>`, and make domain code consume the inferred type. This prevents drift between parsing and static types. + +## Trace before changing + +1. **Find the runtime schema.** Look for `Schema.Struct`, `Schema.Union`, `Schema.TaggedStruct`, `Schema.Record`, `Schema.Array`, or `Schema.decodeTo`. +2. **Find the duplicate static shape.** A nearby `interface X` or `type X = { ... }` repeats the same fields, nullability, optionality, or literals. +3. **Check export consumers.** If callers import the type, keep the exported type name stable and change only its definition. +4. **Confirm the schema is the source of truth.** If the manual type is wider/narrower than runtime parsing, decide whether the schema or consumers are wrong before replacing it. +5. 
**Handle recursion narrowly.** Recursive schemas may need one private recursive helper type to annotate `Schema.suspend`; keep exported domain types inferred from the schema. + +## Fix shape + +- Move the schema before the exported type alias when needed. +- Replace duplicated exported interfaces with aliases derived from the schema: + +```ts +export const SourceSchema = Schema.Struct({ + id: SourceId, + name: Schema.String, + enabled: Schema.Boolean, +}); + +export type Source = typeof SourceSchema.Type; +``` + +- Use `Schema.Schema.Type` when it reads better for non-exported or generic schemas: + +```ts +type IntrospectionResult = Schema.Schema.Type<typeof IntrospectionResultSchema>; +``` + +- If using `Schema.decodeTo`, infer the domain type from the decoded/domain schema, not from the raw transport schema. +- Do not keep a manual interface solely for documentation. Add schema annotations or comments only when they clarify behavior the schema cannot express. + +## Bad + +```ts +export interface StoredSource { + readonly id: string; + readonly url: string; + readonly headers: readonly Header[]; +} + +export const StoredSourceSchema = Schema.Struct({ + id: Schema.String, + url: Schema.String, + headers: Schema.Array(HeaderSchema), +}); +``` + +## Good + +```ts +export const StoredSourceSchema = Schema.Struct({ + id: Schema.String, + url: Schema.String, + headers: Schema.Array(HeaderSchema), +}); + +export type StoredSource = typeof StoredSourceSchema.Type; +``` + +## Recursive schemas + +Use a private helper only where TypeScript needs an annotation for self-reference: + +```ts +interface TypeRefRecursive { + readonly kind: string; + readonly ofType: TypeRefRecursive | null; +} + +const TypeRefSchema: Schema.Codec<TypeRefRecursive> = Schema.Struct({ + kind: Schema.String, + ofType: Schema.NullOr(Schema.suspend(() => TypeRefSchema)), +}); + +export type TypeRef = typeof TypeRefSchema.Type; +``` + +The exported domain type is still schema-derived. 
The private helper exists only to satisfy the recursive schema definition. + +## What not to report + +- Domain types that intentionally do not have a runtime schema. +- Input builder types where the schema parses a different transport representation. +- Branded IDs or opaque aliases that are used by schemas but are not themselves duplicate object shapes. +- Private recursive helper types used only to type `Schema.suspend`, as long as exported consumer-facing types are inferred. + +## Output requirements + +When reviewing, report: + +- **File and line** of the duplicated manual type. +- **Schema** that already owns the shape. +- **Why** the manual type can drift. +- **Fix**: the exact inferred alias to use. + +When editing, keep exported type names stable unless every caller is updated in the same change. diff --git a/.agents/skills/wrdn-effect-typed-errors/SKILL.md b/.agents/skills/wrdn-effect-typed-errors/SKILL.md new file mode 100644 index 000000000..15290171f --- /dev/null +++ b/.agents/skills/wrdn-effect-typed-errors/SKILL.md @@ -0,0 +1,329 @@ +--- +name: wrdn-effect-typed-errors +description: Fix lint findings that use untyped JavaScript error handling instead of Effect typed failures. Use when lint flags new Error, throw, try/catch, Promise.catch, Promise.reject, instanceof Error, unknown error message/stringification, or redundant helpers that only construct tagged errors. +allowed-tools: Read Grep Glob Bash +--- + +You fix one family of patterns: untyped JavaScript error handling in Effect code. + +The preferred boundary is typed `Schema.TaggedError` / `Data.TaggedError` values in the Effect error channel. Construct the tagged error directly at the failure site unless a helper performs real classification or normalization. + +## Trace before changing + +1. **Identify the boundary.** Is this Effect domain code, React UI code, a third-party callback, or plain test/tooling code? +2. 
**Find the existing domain errors.** Check nearby `errors.ts`, `Schema.TaggedError`, `Data.TaggedError`, and API `.addError(...)` declarations before adding a new class. +3. **Decide whether a new error is needed.** Add a new tagged error only if callers have a distinct recovery path, HTTP status, UI affordance, retry policy, or telemetry classification. +4. **Preserve failure semantics.** If the old code failed, the new code should fail in the Effect error channel. Do not replace thrown failures with fallback values like `false`, `null`, `undefined`, `[]`, or `"unknown"` unless the existing contract already treats that condition as non-fatal. +5. **Preserve the typed channel.** Do not convert typed failures into `Error`, thrown exceptions, `String(error)`, or `.message` reads from unknown values. +6. **Recognize real boundaries.** Runtime workers, Vite/CLI tooling, callback APIs, and third-party interfaces may have to throw, catch, or reject at the boundary. Do not contort those files into fake Effect shapes. Keep the boundary idiom when it is contained and immediately wrapped into an Effect error channel, stable IPC envelope, or test/tooling result. +7. **Do not hide construction behind trivial helpers.** Inline `new DomainError(...)` unless the helper branches on input or maps an external error format into a domain error. + +## Preserve behavior first + +The lint rule is about **where the failure lives**, not whether the operation should still fail. + +Bad fix: this removes the lint finding by silently changing invalid input into a non-match. + +```ts +case "in": + if (!Array.isArray(value)) return false; + return value.some((v) => cmp(lhs, v)); +``` + +Good fix: keep the invalid input as a failure, but make it typed. 
+ +```ts +case "in": + if (!Array.isArray(value)) { + return Effect.fail( + new StorageError({ message: "Value must be an array", cause: clause }), + ); + } + return Effect.succeed(value.some((v) => cmp(lhs, v))); +``` + +When the containing helper was synchronous, make the helper return `Effect.Effect<boolean, StorageError>` and thread that through callers. Do not collapse the error into a success value to avoid changing call sites. + +## Boundary exceptions + +The lint rule is not a mandate to make every file Effect-shaped. It is acceptable to keep `try/catch`, `throw`, `new Error`, `.catch`, or `String(error)` at a true adapter boundary when all of these are true: + +- the surrounding API is inherently throwing, callback-based, Promise-based, process/IPC-based, or plain JS tooling +- the untyped behavior is contained to the boundary function or module +- control is immediately translated into a typed Effect failure, stable IPC payload, stable test assertion, or deliberately best-effort cleanup +- the suppression is narrow and explains the boundary + +Good boundary suppression: + +```ts +// oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: JSON.parse feeds stable IPC failure envelope +try { + const message = JSON.parse(line); + handleHostMessage(message); +} catch (error) { + writeIpcMessage({ type: "failed", error: formatBoundaryError(error) }); +} +``` + +Bad boundary fix: do not replace natural boundary code with fake thenables, fake error objects, promise chains that emulate `try/catch`, or broad helper machinery solely to make lint pass. + +```ts +return makeRejectedThenable(makeErrorLike("Tool path missing")); +``` + +For Effect domain code, fix the code. For boundary code, either wrap once with `Effect.try` / `Effect.tryPromise` at the entry point or use a narrow suppression with a reason. 
+ +## Fix shapes + +### Throw / new Error + +Bad: + +```ts +throw new Error("Missing source"); +``` + +Good in `Effect.gen`: + +```ts +return yield* new SourceNotFoundError({ sourceId }); +``` + +Good in combinators: + +```ts +Effect.fail(new SourceNotFoundError({ sourceId })); +``` + +If a third-party interface requires throwing, keep the throw at the adapter edge only and convert back into a typed failure as soon as control returns to Effect. Prefer a narrow `oxlint-disable-next-line` with a `boundary:` reason over code contortions. + +### Effect.fail inside generators + +Prefer yielding the error directly in generator code: + +```ts +return yield* new SourceNotFoundError({ sourceId }); +``` + +Do not write: + +```ts +return yield* Effect.fail(new SourceNotFoundError({ sourceId })); +``` + +Use `Effect.fail(...)` in non-generator combinator code: + +```ts +Effect.flatMap( + source, + Option.match({ + onNone: () => Effect.fail(new SourceNotFoundError({ sourceId })), + onSome: Effect.succeed, + }), +); +``` + +### Promise.catch / Promise.reject + +Bad: + +```ts +await client.close().catch(() => {}); +return Promise.reject(new Error("failed")); +``` + +Good: + +```ts +Effect.tryPromise({ + try: () => client.close(), + catch: (cause) => new ClientCloseError({ cause }), +}); +``` + +If the failure is intentionally ignored: + +```ts +Effect.ignore( + Effect.tryPromise({ + try: () => client.close(), + catch: (cause) => new ClientCloseError({ cause }), + }), +); +``` + +### try/catch + +Bad: + +```ts +try { + return JSON.parse(text); +} catch (cause) { + return new ParseError({ message: String(cause) }); +} +``` + +Good for schema-backed input: + +```ts +Schema.decodeUnknownEffect(Schema.fromJsonString(InputSchema))(text).pipe( + Effect.mapError(() => new ParseError({ message: "Failed to parse input" })), +); +``` + +Good for non-schema throwing APIs: + +```ts +Effect.try({ + try: () => new URL(value), + catch: (cause) => new UrlParseError({ value, cause }), +}); +``` + 
+### Unknown error message / instanceof Error + +Bad: + +```ts +err instanceof Error ? err.message : String(err); +``` + +Also bad: destructuring `message` only hides the same unknown-state problem from a shallow property-access lint. + +```ts +const { message } = err; +return message; +``` + +Prefer one of: + +```ts +Effect.mapError((err) => new DomainError({ cause: err })); +``` + +```ts +Effect.catchTag("KnownError", (err) => Effect.fail(new DomainError({ message: err.message }))); +``` + +Only read `.message` from a typed error union when that field is explicitly part of the user-facing contract. Most boundary errors should instead use a stable product message and keep the original value in a separate `cause`, trace, log, or telemetry channel. Do not inspect unknown thrown values for domain behavior or customer copy. + +If the lint rule overfires inside a branch that has already narrowed to a specific typed error, keep the direct typed read and use a narrow suppression with a reason. Do not rewrite to destructuring just to avoid the lint selector. + +Bad: leaks internal provider/native details to users. + +```ts +Effect.tryPromise({ + try: () => client.call(), + catch: (cause) => + new SourceError({ + message: cause instanceof Error ? cause.message : String(cause), + }), +}); +``` + +Good: user-facing message is stable; internal detail goes into `cause` only if the error type has an internal channel. + +```ts +Effect.tryPromise({ + try: () => client.call(), + catch: (cause) => + new SourceError({ + message: "Failed to connect to source", + cause, + }), +}); +``` + +If the error schema is serialized to customers and only has `message`, do not put internal details there. Prefer adding a non-serialized/internal `cause` field or logging/telemetry over suppressing the lint rule. + +### Manual tags and broad error laundering + +Bad: manually probing `_tag` to recover from typed Effect failures. 
+ +```ts +Effect.mapError((err) => + "_tag" in err && err._tag === "SecretOwnedByConnectionError" + ? new SourceError({ message: "Failed to resolve secret" }) + : err, +); +``` + +Good: catch the one typed case you intentionally translate. + +```ts +effect.pipe( + Effect.catchTag("SecretOwnedByConnectionError", () => + Effect.fail(new SourceError({ message: "Failed to resolve secret" })), + ), +); +``` + +Do not wrap a typed error union into one local error only to satisfy a narrower helper signature. Widen the helper/cache/invocation error channel when callers can still use the original typed failure. Wrap only when the new error adds product meaning, such as turning a connection-owned secret into a source configuration problem. + +For Effect data types, use public helpers instead of `_tag` checks: + +```ts +if (Option.isNone(parsed)) return null; +if (Exit.isFailure(exit)) return ... +``` + +### Redundant error helpers + +Bad: + +```ts +const connectionError = (message: string) => + new McpConnectionError({ transport: "remote", message }); + +return yield* connectionError("Endpoint URL is required"); +``` + +Good: + +```ts +return yield* new McpConnectionError({ + transport: "remote", + message: "Endpoint URL is required", +}); +``` + +Helpers are allowed only when they do real work, such as: + +- choosing between different tagged errors +- decoding/parsing an external error shape +- preserving protocol-specific fields +- normalizing third-party SDK failures into one domain error + +## New error or existing error? + +Reuse an existing tagged error when only the message changes. + +Create a new tagged error when a caller can reasonably branch differently: + +- different HTTP status +- retry vs no retry +- auth/sign-in affordance +- not-found vs conflict vs validation +- user-actionable vs internal failure +- different telemetry grouping that should not depend on message text + +Do not create one tagged error per sentence of prose. 
+ +## What not to report + +- Test assertions that intentionally construct errors as fixture values. +- Runtime adapter edges that must satisfy a third-party throwing API, IPC contract, process worker contract, or tooling contract, as long as the untyped behavior is contained and converted to typed Effect failure or a stable boundary envelope. +- Real normalization helpers like `toOAuth2Error(cause)` that inspect protocol fields and preserve structured semantics. +- React/effect-atom mutation handlers using `try/catch`; use `wrdn-effect-promise-exit` for that UI-specific boundary. + +## Output requirements + +When reviewing, report: + +- **File and line** of the untyped error pattern. +- **Rule** being violated. +- **Existing domain error** to use, or the new tagged error that should exist. +- **Fix** in the relevant shape: direct `yield* new ErrorType(...)`, `Effect.tryPromise`, schema decode, or direct constructor inline. + +When editing, keep the error type precise and avoid broad message parsing. diff --git a/.agents/skills/wrdn-effect-value-inferred-types/SKILL.md b/.agents/skills/wrdn-effect-value-inferred-types/SKILL.md new file mode 100644 index 000000000..11a7ab443 --- /dev/null +++ b/.agents/skills/wrdn-effect-value-inferred-types/SKILL.md @@ -0,0 +1,95 @@ +--- +name: wrdn-effect-value-inferred-types +description: Replace duplicated object API types with types inferred from the runtime value or factory that owns the shape. Use when lint or review flags an interface/type alias that mirrors a returned object such as a plugin extension, client surface, route map, or handler table. +allowed-tools: Read Grep Glob Bash +--- + +You fix one pattern: a TypeScript object type manually mirrors a runtime object that already owns the shape. + +Prefer value-first APIs. Build the object in a named factory, then export `type X = ReturnType<typeof makeX>`. Consumers keep importing the stable type name, but the type cannot drift from the implementation. + +## Trace before changing + +1. 
**Find the source value.** Look for an object returned from a named factory, `extension: (...) => ({ ... })`, a client object, route map, or handler table. +2. **Find the duplicate type.** A nearby `interface X` or `type X = { ... }` repeats the object methods/properties. +3. **Check whether the value is the source of truth.** If the interface is a contract with multiple implementations, keep the interface. +4. **Preserve the exported type name.** Replace its definition with `ReturnType<typeof makeX>` and update callers only if needed. +5. **Use `satisfies` only at boundaries.** Do not make the implementation satisfy a duplicate shape that could drift. + +## Fix shape + +```ts +const makePluginExtension = (ctx: PluginCtx) => { + const addSource = ... + const removeSource = ... + + return { + addSource, + removeSource, + }; +}; + +export type PluginExtension = ReturnType<typeof makePluginExtension>; +``` + +For factories that need options: + +```ts +const makePluginExtension = + (options: PluginOptions) => + (ctx: PluginCtx) => ({ + addSource: ..., + }); + +export type PluginExtension = ReturnType<ReturnType<typeof makePluginExtension>>; +``` + +## Bad + +```ts +export interface McpPluginExtension { + readonly addSource: (config: McpSourceConfig) => Effect.Effect<void>; + readonly removeSource: (namespace: string, scope: string) => Effect.Effect<void>; +} + +extension: (ctx) => { + return { + addSource, + removeSource, + } satisfies McpPluginExtension; +}; +``` + +## Good + +```ts +const makeMcpPluginExtension = (ctx: PluginCtx) => { + return { + addSource, + removeSource, + }; +}; + +export type McpPluginExtension = ReturnType<typeof makeMcpPluginExtension>; + +extension: makeMcpPluginExtension; +``` + +## What not to report + +- Service/dependency interfaces with multiple implementations. +- Public config input types that are intentionally a stable authored API. +- Branded IDs, discriminated unions, or small aliases that do not mirror one object value. +- Test fakes typed against an existing exported contract. 
+- Schema-owned data shapes; use `wrdn-effect-schema-inferred-types` for those. + +## Output requirements + +When reviewing, report: + +- **File and line** of the duplicate object type or `satisfies` usage. +- **Value/factory** that owns the shape. +- **Why** the manual type can drift. +- **Fix**: the exact `ReturnType` alias to introduce. + +When editing, name the factory after the exported type, e.g. `makeMcpPluginExtension` for `McpPluginExtension`. diff --git a/.agents/skills/wrdn-effect-vitest-tests/SKILL.md b/.agents/skills/wrdn-effect-vitest-tests/SKILL.md new file mode 100644 index 000000000..bceb412af --- /dev/null +++ b/.agents/skills/wrdn-effect-vitest-tests/SKILL.md @@ -0,0 +1,29 @@ +--- +name: wrdn-effect-vitest-tests +description: Keep tests deterministic and Effect-aware. Use when lint flags direct vitest imports or conditional assertions inside tests. +allowed-tools: Read Grep Glob Bash +--- + +Use `@effect/vitest` for tests in this repo. + +## Fix Shape + +- Import `describe`, `it`, `expect`, and helpers from `@effect/vitest`. +- Import utility helpers from `@effect/vitest/utils` when needed. +- Do not import from raw `vitest` except in config or tooling files. +- Do not put `expect(...)` behind `if`, ternary, logical, or switch branches. +- Split conditional behavior into separate tests, or assert the branch condition and expected value explicitly. + +## Bad + +```ts +if (result.ok) { + expect(result.value).toBe("x"); +} +``` + +## Good + +```ts +expect(result).toEqual({ ok: true, value: "x" }); +``` diff --git a/.agents/skills/wrdn-package-boundaries/SKILL.md b/.agents/skills/wrdn-package-boundaries/SKILL.md new file mode 100644 index 000000000..34dfdc92f --- /dev/null +++ b/.agents/skills/wrdn-package-boundaries/SKILL.md @@ -0,0 +1,26 @@ +--- +name: wrdn-package-boundaries +description: Preserve workspace package boundaries. Use when lint flags relative imports that cross package roots. 
+allowed-tools: Read Grep Glob Bash +--- + +Workspace packages should import each other through package exports, not relative paths. + +## Fix Shape + +- Replace cross-package relative imports with the target package name. +- If the needed module is not exported, add the smallest package export that matches the package's public surface. +- Keep relative imports only within the same package root. +- Do not reach into another package's private source tree from an app or package. + +## Good + +```ts +import { createExecutor } from "@executor-js/sdk"; +``` + +## Bad + +```ts +import { createExecutor } from "../../../core/sdk/src"; +``` diff --git a/.agents/skills/wrdn-typescript-type-safety/SKILL.md b/.agents/skills/wrdn-typescript-type-safety/SKILL.md new file mode 100644 index 000000000..452ceccc7 --- /dev/null +++ b/.agents/skills/wrdn-typescript-type-safety/SKILL.md @@ -0,0 +1,14 @@ +--- +name: wrdn-typescript-type-safety +description: Remove TypeScript escape hatches. Use when lint flags @ts-nocheck or similar broad type bypasses. +allowed-tools: Read Grep Glob Bash +--- + +Fix the type boundary instead of disabling TypeScript. + +## Fix Shape + +- Remove `@ts-nocheck`. +- Narrow the failing expression, add a schema/guard at an unknown boundary, or improve the local type. +- If a cast is unavoidable, keep it narrow and document the invariant at the cast site. +- Do not silence an entire file for a localized mismatch. 
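As a sketch of the narrow-cast bullet, assuming a hypothetical `LegacyClient` boundary (none of these names come from this repo):

```typescript
interface LegacyClient {
  readonly query: (sql: string) => Promise<unknown>;
}

// Untyped third-party factory standing in for a module that would otherwise
// tempt a file-wide @ts-nocheck.
const makeUntypedClient = (): unknown => ({
  query: async (sql: string) => sql,
});

// Invariant: the factory always returns an object with a `query` method.
// The cast is confined to this one call site and documented where it happens.
const client = makeUntypedClient() as LegacyClient;
```

The rest of the file stays fully checked; only this one expression carries the documented assumption.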
diff --git a/.oxlintrc.jsonc b/.oxlintrc.jsonc
index b6a2517f4..846cb9fb6 100644
--- a/.oxlintrc.jsonc
+++ b/.oxlintrc.jsonc
@@ -9,7 +9,19 @@
     "executor/no-cross-package-relative-imports": "error",
     "executor/require-reactivity-keys": "error",
     "executor/no-effect-internal-tags": "error",
+    "executor/no-error-constructor": "error",
+    "executor/no-instanceof-error": "error",
+    "executor/no-instanceof-tagged-error": "error",
+    "executor/no-manual-tag-check": "error",
+    "executor/no-promise-catch": "error",
+    "executor/no-promise-reject": "error",
+    "executor/no-redundant-error-factory": "error",
     "executor/no-ts-nocheck": "error",
+    "executor/no-try-catch-or-throw": "error",
+    "executor/no-unknown-error-message": "error",
+    "executor/prefer-schema-inferred-types": "error",
+    "executor/prefer-value-inferred-extension-types": "error",
+    "executor/prefer-yield-tagged-error": "error",
     "react/forbid-elements": [
       "error",
       {
diff --git a/apps/cloud/src/mcp-session.e2e.node.test.ts b/apps/cloud/src/mcp-session.e2e.node.test.ts
index b0fb06992..23d446962 100644
--- a/apps/cloud/src/mcp-session.e2e.node.test.ts
+++ b/apps/cloud/src/mcp-session.e2e.node.test.ts
@@ -15,7 +15,7 @@
 // before prod does.
 import { describe, expect, it } from "@effect/vitest";
-import { Effect } from "effect";
+import { Data, Effect } from "effect";
 import { Client } from "@modelcontextprotocol/sdk/client/index.js";
 import { InMemoryTransport } from "@modelcontextprotocol/sdk/inMemory.js";
 import { ElicitRequestSchema } from "@modelcontextprotocol/sdk/types.js";
@@ -41,6 +41,10 @@
 import { makeTestWorkOSVaultClient } from "@executor-js/plugin-workos-vault/test
 import executorConfig from "../executor.config";
 import { DbService } from "./services/db";
 
+class TransportCloseError extends Data.TaggedError("TransportCloseError")<{
+  readonly cause: unknown;
+}> {}
+
 // ---------------------------------------------------------------------------
 // Test-only plugin: exposes one in-memory tool that elicits once. Lets the
 // eliciting test drive the real engine + sandbox rather than a stub engine.
@@ -146,10 +150,23 @@ const openSession = (
       return { client, clientTransport, serverTransport };
     }),
     ({ clientTransport, serverTransport }) =>
-      Effect.promise(async () => {
-        await clientTransport.close().catch(() => undefined);
-        await serverTransport.close().catch(() => undefined);
-      }),
+      Effect.all(
+        [
+          Effect.ignore(
+            Effect.tryPromise({
+              try: () => clientTransport.close(),
+              catch: (cause) => new TransportCloseError({ cause }),
+            }),
+          ),
+          Effect.ignore(
+            Effect.tryPromise({
+              try: () => serverTransport.close(),
+              catch: (cause) => new TransportCloseError({ cause }),
+            }),
+          ),
+        ],
+        { discard: true, concurrency: 1 },
+      ),
   ).pipe(Effect.map(({ client }) => ({ client })));
 
 const nextOrgId = (() => {
diff --git a/apps/cloud/src/org/handlers.ts b/apps/cloud/src/org/handlers.ts
index 6093fec02..8ec4d6014 100644
--- a/apps/cloud/src/org/handlers.ts
+++ b/apps/cloud/src/org/handlers.ts
@@ -1,5 +1,5 @@
 import { HttpApiBuilder } from "effect/unstable/httpapi";
-import { Effect } from "effect";
+import { Cause, Effect } from "effect";
 
 import { UserStoreService } from "../auth/context";
 import { AuthContext } from "../auth/middleware";
@@ -103,7 +103,7 @@ const reserveMemberSlot = Effect.gen(function* () {
     Effect.catchCause((cause) =>
       Effect.gen(function* () {
         yield* Effect.logError("members.seats lookup failed; failing closed").pipe(
-          Effect.annotateLogs({ "org.id": auth.organizationId, cause: String(cause) }),
+          Effect.annotateLogs({ "org.id": auth.organizationId, cause: Cause.pretty(cause) }),
         );
         return yield* new Forbidden();
       }),
diff --git a/apps/cloud/src/services/db.schema.test.ts b/apps/cloud/src/services/db.schema.test.ts
index e5fb2378a..d6cdd9de9 100644
--- a/apps/cloud/src/services/db.schema.test.ts
+++ b/apps/cloud/src/services/db.schema.test.ts
@@ -15,6 +15,7 @@
 import { describe, expect, it } from "@effect/vitest";
 import { drizzle } from "drizzle-orm/postgres-js";
+import { Effect } from "effect";
 import postgres from "postgres";
 
 import * as cloudSchema from "./schema";
@@ -44,20 +45,26 @@ describe("combinedSchema", () => {
   // getters could theoretically drop tables if evaluated before their
   // declarations. Construct a drizzle instance and walk its fullSchema
   // to catch that class of bug too.
-  it("drizzle(combinedSchema) exposes every table under _.fullSchema", () => {
+  it.effect("drizzle(combinedSchema) exposes every table under _.fullSchema", () =>
     // postgres() lazily connects — safe to build with a dummy url, we
     // never .query() so no socket is opened.
-    const sql = postgres("postgres://u:p@127.0.0.1:1/x", { max: 1 });
-    try {
-      const db = drizzle(sql, { schema: combinedSchema });
-      const drizzleInternals = (value: unknown): { _: { fullSchema: Record<string, unknown> } } =>
-        value as { _: { fullSchema: Record<string, unknown> } };
-      const fullSchema = drizzleInternals(db)._.fullSchema;
-      for (const key of Object.keys(executorSchema)) {
-        expect(fullSchema, `fullSchema missing "${key}"`).toHaveProperty(key);
-      }
-    } finally {
-      sql.end({ timeout: 0 }).catch(() => undefined);
-    }
-  });
+    Effect.acquireRelease(
+      Effect.sync(() => postgres("postgres://u:p@127.0.0.1:1/x", { max: 1 })),
+      (sql) => Effect.promise(() => sql.end({ timeout: 0 })),
+    ).pipe(
+      Effect.flatMap((sql) =>
+        Effect.sync(() => {
+          const db = drizzle(sql, { schema: combinedSchema });
+          const drizzleInternals = (
+            value: unknown,
+          ): { _: { fullSchema: Record<string, unknown> } } =>
+            value as { _: { fullSchema: Record<string, unknown> } };
+          const fullSchema = drizzleInternals(db)._.fullSchema;
+          for (const key of Object.keys(executorSchema)) {
+            expect(fullSchema, `fullSchema missing "${key}"`).toHaveProperty(key);
+          }
+        }),
+      ),
+    ),
+  );
 });
diff --git a/apps/cloud/src/services/db.ts b/apps/cloud/src/services/db.ts
index 1e31c4d37..3d502eaa1 100644
--- a/apps/cloud/src/services/db.ts
+++ b/apps/cloud/src/services/db.ts
@@ -73,6 +73,7 @@ export class DbService extends Context.Service<
         // Fire-and-forget: the Terminate round-trip sometimes hangs, and
         // we don't need to block scope close waiting for it.
         Effect.sync(() => {
+          // oxlint-disable-next-line executor/no-promise-catch -- boundary: deliberately best-effort fire-and-forget cleanup
           sql.end({ timeout: 0 }).catch(() => undefined);
         }),
       ),
diff --git a/apps/cloud/src/services/mcp-oauth.node.test.ts b/apps/cloud/src/services/mcp-oauth.node.test.ts
index 4136983cf..91b8740d9 100644
--- a/apps/cloud/src/services/mcp-oauth.node.test.ts
+++ b/apps/cloud/src/services/mcp-oauth.node.test.ts
@@ -82,6 +82,7 @@ const startFakeServer = async (): Promise => {
     res.end(payload);
   };
 
+  // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: Node http request handler converts server failures into a stable HTTP 500 test response
   try {
     if (url.pathname === "/.well-known/oauth-protected-resource") {
       const origin = `http://${req.headers.host}`;
@@ -215,6 +216,7 @@
       send(404, { error: "not_found", params: url.pathname });
     }
   } catch (e) {
+    // oxlint-disable-next-line executor/no-unknown-error-message -- boundary: fake OAuth HTTP server exposes thrown callback failures as a stable response body
     send(500, { error: "server_error", message: String(e) });
   }
 });
@@ -244,10 +246,12 @@ const followAuthorize = async (
   const response = await fetch(authorizationUrl, { redirect: "manual" });
   expect(response.status).toBe(302);
   const location = response.headers.get("location");
+  // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-raw-error-throw, executor/no-error-constructor -- boundary: test helper fails the test when the fake OAuth redirect is malformed
   if (!location) throw new Error("no location header on authorize redirect");
   const dest = new URL(location);
   const code = dest.searchParams.get("code");
   const state = dest.searchParams.get("state");
+  // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-raw-error-throw, executor/no-error-constructor -- boundary: test helper fails the test when the fake OAuth redirect is malformed
   if (!code || !state) throw new Error(`redirect missing code/state: ${location}`);
   return { code, state };
 };
diff --git a/apps/cloud/src/services/slack.ts b/apps/cloud/src/services/slack.ts
index ecd76b573..3ed834356 100644
--- a/apps/cloud/src/services/slack.ts
+++ b/apps/cloud/src/services/slack.ts
@@ -10,6 +10,7 @@ import { Context, Data, Effect, Layer } from "effect";
 export class SlackError extends Data.TaggedError("SlackError")<{
   method: string;
   error: string;
+  cause?: unknown;
 }> {}
 
 export type ISlackService = Readonly<{
@@ -67,15 +68,27 @@ const make = Effect.sync(() => {
         body: JSON.stringify(body),
       });
       const json = (await res.json()) as A;
-      if (!json.ok) throw new Error(json.error ?? "unknown_slack_error");
       return json;
     },
     catch: (cause) =>
       new SlackError({
         method,
-        error: cause instanceof Error ? cause.message : String(cause),
+        error: "slack_request_failed",
+        cause,
       }),
-  }).pipe(Effect.withSpan(`slack.${method}`));
+  }).pipe(
+    Effect.flatMap((json) =>
+      json.ok
+        ? Effect.succeed(json)
+        : Effect.fail(
+            new SlackError({
+              method,
+              error: json.error ?? "unknown_slack_error",
+            }),
+          ),
+    ),
+    Effect.withSpan(`slack.${method}`),
+  );
 
 const createConnectInvite: ISlackService["createConnectInvite"] = ({
   email,
diff --git a/packages/core/api/src/handlers/oauth.ts b/packages/core/api/src/handlers/oauth.ts
index 443a3d812..b5a08057d 100644
--- a/packages/core/api/src/handlers/oauth.ts
+++ b/packages/core/api/src/handlers/oauth.ts
@@ -46,9 +46,18 @@ const resolveOAuthSecretBackedMap = {
-  if (error instanceof OAuthStartError) return error.message;
-  if (error instanceof OAuthCompleteError) return error.message;
-  if (error instanceof OAuthProbeError) return error.message;
+  if (error instanceof OAuthStartError) {
+    // oxlint-disable-next-line executor/no-unknown-error-message -- typed OAuthStartError branch preserves user-facing popup text
+    return error.message;
+  }
+  if (error instanceof OAuthCompleteError) {
+    // oxlint-disable-next-line executor/no-unknown-error-message -- typed OAuthCompleteError branch preserves user-facing popup text
+    return error.message;
+  }
+  if (error instanceof OAuthProbeError) {
+    // oxlint-disable-next-line executor/no-unknown-error-message -- typed OAuthProbeError branch preserves user-facing popup text
+    return error.message;
+  }
   if (error instanceof OAuthSessionNotFoundError) {
     return `OAuth session not found: ${error.sessionId}`;
   }
@@ -147,7 +156,9 @@ export const OAuthHandlers = HttpApiBuilder.group(ExecutorApi, "oauth", (handler
           Effect.tapError((cause) =>
             Effect.logError("OAuth callback completion failed", cause),
           ),
-          Effect.catchCause(() => Effect.fail(new Error("Authentication failed"))),
+          Effect.catchCause(() =>
+            Effect.fail(new OAuthCompleteError({ message: "Authentication failed" })),
+          ),
         ),
       urlParams,
       toErrorMessage: toPopupErrorMessage,
diff --git a/packages/core/cli/src/generators/index.ts b/packages/core/cli/src/generators/index.ts
index 600cd7149..fd707dbc9 100644
--- a/packages/core/cli/src/generators/index.ts
+++ b/packages/core/cli/src/generators/index.ts
@@ -13,6 +13,7 @@ export const generateSchema = (
 ) => {
   const generator = generators[adapter];
   if (!generator) {
+    // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-error-constructor -- boundary: CLI generator API is Promise-based, preserve its rejection shape
     throw new Error(
       `Generator "${adapter}" is not supported. Available: ${Object.keys(generators).join(", ")}`,
     );
diff --git a/packages/core/sdk/src/oauth-discovery.test.ts b/packages/core/sdk/src/oauth-discovery.test.ts
index 2d0cd9462..ec27d64a3 100644
--- a/packages/core/sdk/src/oauth-discovery.test.ts
+++ b/packages/core/sdk/src/oauth-discovery.test.ts
@@ -251,11 +251,13 @@ describe("registerDynamicClient", () => {
     expect(Exit.isFailure(exit)).toBe(true);
     if (!Exit.isFailure(exit)) return;
     const reason = exit.cause.reasons.find(Cause.isFailReason);
-    if (!(reason?.error instanceof OAuthDiscoveryError)) {
-      throw new Error("expected OAuthDiscoveryError");
-    }
-    expect(reason.error.status).toBe(400);
-    expect(reason.error.message).toMatch(/invalid_client_metadata/);
+    const error = reason?.error;
+    expect(error).toBeInstanceOf(OAuthDiscoveryError);
+    if (!(error instanceof OAuthDiscoveryError)) return;
+    expect(error).toMatchObject({
+      status: 400,
+      message: expect.stringMatching(/invalid_client_metadata/),
+    });
   });
 });
diff --git a/packages/core/storage-core/src/testing/memory.ts b/packages/core/storage-core/src/testing/memory.ts
index 9d81d6278..757fe4644 100644
--- a/packages/core/storage-core/src/testing/memory.ts
+++ b/packages/core/storage-core/src/testing/memory.ts
@@ -25,17 +25,14 @@ import type {
   StorageFailure,
 } from "../adapter";
 import type { DBSchema } from "../schema";
+import { StorageError } from "../errors";
 import { createAdapter } from "../factory";
 
 type Row = Record<string, unknown>;
 type Store = Record<string, Row[]>;
 type Comparable = string | number | boolean | Date;
 
-const compare = (
-  a: unknown,
-  b: unknown,
-  op: "gt" | "gte" | "lt" | "lte",
-): boolean => {
+const compare = (a: unknown, b: unknown, op: "gt" | "gte" | "lt" | "lte"): boolean => {
   if (
     !(
       typeof a === "string" ||
@@ -43,12 +40,7 @@ const compare = (
       typeof a === "boolean" ||
       a instanceof Date
     ) ||
-    !(
-      typeof b === "string" ||
-      typeof b === "number" ||
-      typeof b === "boolean" ||
-      b instanceof Date
-    )
+    !(typeof b === "string" || typeof b === "number" || typeof b === "boolean" || b instanceof Date)
   ) {
     return false;
   }
@@ -69,58 +61,60 @@ const compare = (
 const rowAs = <T>(row: Row): T => row as T;
 const rowsAs = <T>(rows: readonly Row[]): T[] => rows.map(rowAs);
 
-const evalClause = (record: Row, clause: CleanedWhere): boolean => {
+const evalClause = (record: Row, clause: CleanedWhere): Effect.Effect<boolean, StorageError> => {
   const { field, value, operator, mode } = clause;
   const isInsensitive =
     mode === "insensitive" &&
     (typeof value === "string" ||
-      (Array.isArray(value) &&
-        (value as unknown[]).every((v) => typeof v === "string")));
+      (Array.isArray(value) && (value as unknown[]).every((v) => typeof v === "string")));
   const lhs = record[field];
-  const lowerStr = (v: unknown) =>
-    typeof v === "string" ? v.toLowerCase() : v;
+  const lowerStr = (v: unknown) => (typeof v === "string" ? v.toLowerCase() : v);
   const cmp = (a: unknown, b: unknown): boolean =>
     isInsensitive ? lowerStr(a) === lowerStr(b) : a === b;
 
   switch (operator) {
     case "in":
-      if (!Array.isArray(value)) throw new Error("Value must be an array");
-      return (value as unknown[]).some((v) => cmp(lhs, v));
+      if (!Array.isArray(value)) {
+        return Effect.fail(new StorageError({ message: "Value must be an array", cause: clause }));
+      }
+      return Effect.succeed((value as unknown[]).some((v) => cmp(lhs, v)));
     case "not_in":
-      if (!Array.isArray(value)) throw new Error("Value must be an array");
-      return !(value as unknown[]).some((v) => cmp(lhs, v));
+      if (!Array.isArray(value)) {
+        return Effect.fail(new StorageError({ message: "Value must be an array", cause: clause }));
+      }
+      return Effect.succeed(!(value as unknown[]).some((v) => cmp(lhs, v)));
     case "contains": {
-      if (typeof lhs !== "string" || typeof value !== "string") return false;
-      return isInsensitive
-        ? lhs.toLowerCase().includes(value.toLowerCase())
-        : lhs.includes(value);
+      if (typeof lhs !== "string" || typeof value !== "string") return Effect.succeed(false);
+      return Effect.succeed(
+        isInsensitive ? lhs.toLowerCase().includes(value.toLowerCase()) : lhs.includes(value),
+      );
     }
     case "starts_with": {
-      if (typeof lhs !== "string" || typeof value !== "string") return false;
-      return isInsensitive
-        ? lhs.toLowerCase().startsWith(value.toLowerCase())
-        : lhs.startsWith(value);
+      if (typeof lhs !== "string" || typeof value !== "string") return Effect.succeed(false);
+      return Effect.succeed(
+        isInsensitive ? lhs.toLowerCase().startsWith(value.toLowerCase()) : lhs.startsWith(value),
+      );
    }
     case "ends_with": {
-      if (typeof lhs !== "string" || typeof value !== "string") return false;
-      return isInsensitive
-        ? lhs.toLowerCase().endsWith(value.toLowerCase())
-        : lhs.endsWith(value);
+      if (typeof lhs !== "string" || typeof value !== "string") return Effect.succeed(false);
+      return Effect.succeed(
+        isInsensitive ? lhs.toLowerCase().endsWith(value.toLowerCase()) : lhs.endsWith(value),
+      );
     }
     case "ne":
-      return !cmp(lhs, value);
+      return Effect.succeed(!cmp(lhs, value));
    case "gt":
-      return value != null && compare(lhs, value, "gt");
+      return Effect.succeed(value != null && compare(lhs, value, "gt"));
    case "gte":
-      return value != null && compare(lhs, value, "gte");
+      return Effect.succeed(value != null && compare(lhs, value, "gte"));
    case "lt":
-      return value != null && compare(lhs, value, "lt");
+      return Effect.succeed(value != null && compare(lhs, value, "lt"));
    case "lte":
-      return value != null && compare(lhs, value, "lte");
+      return Effect.succeed(value != null && compare(lhs, value, "lte"));
    case "eq":
    default:
-      return cmp(lhs, value);
+      return Effect.succeed(cmp(lhs, value));
   }
 };
@@ -132,22 +126,43 @@ const evalClause = (record: Row, clause: CleanedWhere): boolean => {
 // conformance suite. This diverges from upstream's *memory* adapter, which
 // still uses a left-to-right fold; we prefer drizzle parity so that a
 // plugin that works against memory always works against SQL.
-const matchAll = (record: Row, where: readonly CleanedWhere[]): boolean => {
-  if (where.length === 0) return true;
-  if (where.length === 1) return evalClause(record, where[0]!);
-  const andGroup = where.filter(
-    (w) => w.connector === "AND" || !w.connector,
-  );
-  const orGroup = where.filter((w) => w.connector === "OR");
-  const andResult =
-    andGroup.length === 0 ? true : andGroup.every((w) => evalClause(record, w));
-  const orResult =
-    orGroup.length === 0 ? true : orGroup.some((w) => evalClause(record, w));
-  return andResult && orResult;
-};
+const matchAll = (
+  record: Row,
+  where: readonly CleanedWhere[],
+): Effect.Effect<boolean, StorageError> =>
+  Effect.gen(function* () {
+    if (where.length === 0) return true;
+    if (where.length === 1) return yield* evalClause(record, where[0]!);
+    const andGroup = where.filter((w) => w.connector === "AND" || !w.connector);
+    const orGroup = where.filter((w) => w.connector === "OR");
+    let andResult = true;
+    for (const clause of andGroup) {
+      if (!(yield* evalClause(record, clause))) {
+        andResult = false;
+        break;
+      }
+    }
+    let orResult = orGroup.length === 0;
+    for (const clause of orGroup) {
+      if (yield* evalClause(record, clause)) {
+        orResult = true;
+        break;
+      }
+    }
+    return andResult && orResult;
+  });
 
-const filterWhere = (rows: Row[], where: readonly CleanedWhere[]): Row[] =>
-  rows.filter((r) => matchAll(r, where));
+const filterWhere = (
+  rows: Row[],
+  where: readonly CleanedWhere[],
+): Effect.Effect<Row[], StorageError> =>
+  Effect.gen(function* () {
+    const out: Row[] = [];
+    for (const row of rows) {
+      if (yield* matchAll(row, where)) out.push(row);
+    }
+    return out;
+  });
 
 const cloneStore = (s: Store): Store => {
   const out: Store = {};
@@ -167,9 +182,7 @@ export interface MakeMemoryAdapterOptions {
   readonly generateId?: () => string;
 }
 
-export const makeMemoryAdapter = (
-  options: MakeMemoryAdapterOptions,
-): DBAdapter => {
+export const makeMemoryAdapter = (options: MakeMemoryAdapterOptions): DBAdapter => {
   let store: Store = {};
 
   const tableFor = (model: string): Row[] => {
@@ -186,9 +199,7 @@ export const makeMemoryAdapter = (
     const out: Row = { ...base };
     for (const [target, cfg] of Object.entries(join)) {
       const targetRows = tableFor(target);
-      const matches = targetRows.filter(
-        (r) => r[cfg.on.to] === base[cfg.on.from],
-      );
+      const matches = targetRows.filter((r) => r[cfg.on.to] === base[cfg.on.from]);
       if (cfg.relation === "one-to-one") {
         out[target] = matches[0] ?? null;
       } else {
@@ -209,8 +220,8 @@
       select?: string[] | undefined;
       join?: JoinConfig | undefined;
     }) =>
-      Effect.sync(() => {
-        const rows = filterWhere(tableFor(model), where);
+      Effect.gen(function* () {
+        const rows = yield* filterWhere(tableFor(model), where);
         const first = rows[0];
         if (!first) return null;
         return rowAs(join ? attachJoins(first, join) : first);
@@ -232,8 +243,8 @@
       offset?: number | undefined;
       join?: JoinConfig | undefined;
     }) =>
-      Effect.sync(() => {
-        let rows = filterWhere(tableFor(model), where ?? []);
+      Effect.gen(function* () {
+        let rows = yield* filterWhere(tableFor(model), where ?? []);
         if (sortBy) {
           const { field, direction } = sortBy;
           const sign = direction === "asc" ? 1 : -1;
@@ -261,8 +272,8 @@
       where: CleanedWhere[];
       update: T;
     }) =>
-      Effect.sync(() => {
-        const rows = filterWhere(tableFor(model), where);
+      Effect.gen(function* () {
+        const rows = yield* filterWhere(tableFor(model), where);
         const first = rows[0];
         if (!first) return null;
         Object.assign(first, update as Row);
@@ -289,21 +300,21 @@
     findMany,
 
     count: ({ model, where }) =>
-      Effect.sync(() => filterWhere(tableFor(model), where ?? []).length),
+      Effect.map(filterWhere(tableFor(model), where ?? []), (rows) => rows.length),
 
     update: updateOne,
 
     updateMany: ({ model, where, update }) =>
-      Effect.sync(() => {
-        const rows = filterWhere(tableFor(model), where);
+      Effect.gen(function* () {
+        const rows = yield* filterWhere(tableFor(model), where);
         for (const r of rows) Object.assign(r, update);
         return rows.length;
       }),
 
     delete: ({ model, where }) =>
-      Effect.sync(() => {
+      Effect.gen(function* () {
         const table = tableFor(model);
-        const matches = filterWhere(table, where);
+        const matches = yield* filterWhere(table, where);
         const first = matches[0];
         if (!first) return;
         const idx = table.indexOf(first);
@@ -311,9 +322,9 @@
       }),
 
     deleteMany: ({ model, where }) =>
-      Effect.sync(() => {
+      Effect.gen(function* () {
         const table = tableFor(model);
-        const matches = new Set(filterWhere(table, where));
+        const matches = new Set(yield* filterWhere(table, where));
         let count = 0;
         store[model] = table.filter((r) => {
           if (matches.has(r)) {
@@ -328,11 +339,9 @@
   // Snapshot-based transaction: clone on entry, restore on failure.
   const txFn: DBAdapterFactoryConfig["transaction"] = (
-    cb: (trx: Parameters[0] extends (
-      t: infer T,
-    ) => unknown
-      ? T
-      : never) => Effect.Effect,
+    cb: (
+      trx: Parameters[0] extends (t: infer T) => unknown ? T : never,
+    ) => Effect.Effect,
   ) =>
     Effect.gen(function* () {
       const snapshot = cloneStore(store);
@@ -353,9 +362,7 @@
       supportsDates: true,
       supportsBooleans: true,
       supportsArrays: true,
-      customIdGenerator: options.generateId
-        ? () => options.generateId!()
-        : undefined,
+      customIdGenerator: options.generateId ? () => options.generateId!() : undefined,
       transaction: txFn,
     },
     adapter: custom,
diff --git a/packages/core/vite-plugin/src/index.ts b/packages/core/vite-plugin/src/index.ts
index 9f58c24f5..098c64929 100644
--- a/packages/core/vite-plugin/src/index.ts
+++ b/packages/core/vite-plugin/src/index.ts
@@ -44,6 +44,7 @@ const DEFAULT_CONFIG_CANDIDATES = [
 const DEFAULT_JSONC_CANDIDATES = ["executor.jsonc", "executor.json"];
 
 const readJsoncPlugins = (path: string): readonly string[] => {
+  // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: Vite plugin probes optional project config files
   try {
     const raw = readFileSync(path, "utf8");
     const parsed = jsonc.parse(raw) as
@@ -61,6 +62,7 @@ const tryResolveClient = (
   fromDir: string,
 ): string | null => {
   const require = createRequire(resolvePath(fromDir, "_anchor.js"));
+  // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: Vite resolver probes optional client entrypoints
   try {
     return require.resolve(`${packageName}/client`);
   } catch {
diff --git a/packages/kernel/runtime-deno-subprocess/src/deno-subprocess-worker.mjs b/packages/kernel/runtime-deno-subprocess/src/deno-subprocess-worker.mjs
index 417cdcd36..d60c302a4 100644
--- a/packages/kernel/runtime-deno-subprocess/src/deno-subprocess-worker.mjs
+++ b/packages/kernel/runtime-deno-subprocess/src/deno-subprocess-worker.mjs
@@ -17,10 +17,13 @@ const writeIpcMessage = (message) => {
 };
 
 const toErrorMessage = (error) => {
+  // oxlint-disable-next-line executor/no-instanceof-error -- boundary: Deno worker serializes arbitrary host/user failures to IPC
   if (error instanceof Error) {
+    // oxlint-disable-next-line executor/no-unknown-error-message -- boundary: preserve native stack/message text for subprocess IPC
     return error.stack ?? error.message;
   }
+  // oxlint-disable-next-line executor/no-unknown-error-message -- boundary: last-resort serialization for non-Error IPC failure values
   return String(error);
 };
 
@@ -49,6 +52,7 @@ const createToolsProxy = (path = []) => {
     apply(_target, _thisArg, args) {
       const toolPath = path.join(".");
       if (!toolPath) {
+        // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-error-constructor -- boundary: proxy apply trap must reject invalid dynamic tool invocations
         throw new Error("Tool path missing in invocation");
       }
@@ -62,6 +66,7 @@ const formatLogArg = (value) => {
     return value;
   }
 
+  // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: console serialization fallback for arbitrary user values
   try {
     return JSON.stringify(value);
   } catch {
@@ -113,6 +118,7 @@ const handleStart = (message) => {
   started = true;
 
+  // oxlint-disable-next-line executor/no-promise-catch -- boundary: top-level subprocess bridge turns user-code rejection into IPC
   runUserCode(message.code)
     .then((result) => {
       writeIpcMessage({
@@ -143,6 +149,7 @@ const handleToolResult = (message) => {
     return;
   }
 
+  // oxlint-disable-next-line executor/no-error-constructor -- boundary: remote tool failure must reject the caller's native Promise
   pending.reject(new Error(message.error));
 };
 
@@ -187,6 +194,7 @@ const decodeLines = async () => {
       continue;
     }
 
+    // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: host IPC line parser reports malformed input over IPC
    try {
      const message = JSON.parse(line);
      handleHostMessage(message);
diff --git a/packages/plugins/google-discovery/src/react/AddGoogleDiscoverySource.tsx b/packages/plugins/google-discovery/src/react/AddGoogleDiscoverySource.tsx
index bca1c86aa..e6b94ac32 100644
--- a/packages/plugins/google-discovery/src/react/AddGoogleDiscoverySource.tsx
+++ b/packages/plugins/google-discovery/src/react/AddGoogleDiscoverySource.tsx
@@ -1,5 +1,7 @@
 import { useCallback, useEffect, useMemo, useRef, useState } from "react";
 import { useAtomSet } from "@effect/atom-react";
+import * as Exit from "effect/Exit";
+import * as Option from "effect/Option";
 
 import { usePendingSources } from "@executor-js/react/api/optimistic";
 import { sourceWriteKeys } from "@executor-js/react/api/reactivity-keys";
@@ -203,7 +205,7 @@ export default function AddGoogleDiscoverySource(props: {
   const scopeId = useScope();
 
   const doProbe = useAtomSet(probeGoogleDiscovery, { mode: "promise" });
-  const doAdd = useAtomSet(addGoogleDiscoverySource, { mode: "promise" });
+  const doAdd = useAtomSet(addGoogleDiscoverySource, { mode: "promiseExit" });
   const { beginAdd } = usePendingSources();
   const secretList = useSecretPickerSecrets();
   const oauth = useOAuthPopupFlow({
@@ -331,33 +333,33 @@ export default function AddGoogleDiscoverySource(props: {
       name: displayName,
       kind: "google-discovery",
     });
-    try {
-      await doAdd({
-        params: { scopeId },
-        payload: {
-          name: displayName,
-          discoveryUrl: discoveryUrl.trim(),
-          namespace,
-          auth:
-            authKind === "oauth2" && oauthAuth
-              ? {
-                  kind: "oauth2" as const,
-                  connectionId: oauthAuth.connectionId,
-                  clientIdSecretId: oauthAuth.clientIdSecretId,
-                  clientSecretSecretId: oauthAuth.clientSecretSecretId,
-                  scopes: oauthAuth.scopes,
-                }
-              : { kind: "none" as const },
-        },
-        reactivityKeys: [...sourceWriteKeys],
-      });
-      props.onComplete();
-    } catch (e) {
-      setError(e instanceof Error ? e.message : "Failed to add source");
+    const exit = await doAdd({
+      params: { scopeId },
+      payload: {
+        name: displayName,
+        discoveryUrl: discoveryUrl.trim(),
+        namespace,
+        auth:
+          authKind === "oauth2" && oauthAuth
+            ? {
+                kind: "oauth2" as const,
+                connectionId: oauthAuth.connectionId,
+                clientIdSecretId: oauthAuth.clientIdSecretId,
+                clientSecretSecretId: oauthAuth.clientSecretSecretId,
+                scopes: oauthAuth.scopes,
+              }
+            : { kind: "none" as const },
+      },
+      reactivityKeys: [...sourceWriteKeys],
+    });
+    placeholder.done();
+    if (Exit.isFailure(exit)) {
+      const error = Exit.findErrorOption(exit);
+      setError(Option.isSome(error) ? error.value.message : "Failed to add source");
       setAdding(false);
-    } finally {
-      placeholder.done();
+      return;
     }
+    props.onComplete();
   }, [
     probe,
     doAdd,
diff --git a/packages/plugins/google-discovery/src/sdk/plugin.ts b/packages/plugins/google-discovery/src/sdk/plugin.ts
index 268757289..bbcc80d8b 100644
--- a/packages/plugins/google-discovery/src/sdk/plugin.ts
+++ b/packages/plugins/google-discovery/src/sdk/plugin.ts
@@ -21,7 +21,6 @@ import {
   googleDiscoverySchema,
   makeGoogleDiscoveryStore,
   type GoogleDiscoveryStore,
-  type GoogleDiscoveryStoredSource,
 } from "./binding-store";
 import { extractGoogleDiscoveryManifest } from "./document";
 import { annotationsForOperation, invokeGoogleDiscoveryTool } from "./invoke";
@@ -86,30 +85,9 @@ export type GoogleDiscoveryExtensionFailure =
   | GoogleDiscoverySourceError
   | StorageFailure;
 
-export interface GoogleDiscoveryPluginExtension {
-  readonly probeDiscovery: (
-    input: string | GoogleDiscoveryProbeInput,
-  ) => Effect.Effect<
-    GoogleDiscoveryProbeResult,
-    GoogleDiscoveryParseError | GoogleDiscoverySourceError
-  >;
-  readonly addSource: (
-    input: GoogleDiscoveryAddSourceInput,
-  ) => Effect.Effect<
-    { readonly toolCount: number; readonly namespace: string },
-    GoogleDiscoveryParseError | GoogleDiscoverySourceError | StorageFailure
-  >;
-  readonly removeSource: (namespace: string, scope: string) => Effect.Effect;
-  readonly getSource: (
-    namespace: string,
-    scope: string,
-  ) => Effect.Effect;
-  readonly updateSource: (
-    namespace: string,
-    scope: string,
-    input: GoogleDiscoveryUpdateSourceInput,
-  ) => Effect.Effect;
-}
+export type GoogleDiscoveryPluginExtension = ReturnType<
+  typeof makeGoogleDiscoveryPluginExtension
+>;
 
 // ---------------------------------------------------------------------------
 // URL normalization + slug helpers (unchanged)
@@ -120,12 +98,8 @@ const DISCOVERY_SERVICE_HOST = "https://www.googleapis.com/discovery/v1/apis";
 
 const normalizeDiscoveryUrl = (discoveryUrl: string): string => {
   const trimmed = discoveryUrl.trim();
   if (trimmed.length === 0) return trimmed;
-  let parsed: URL;
-  try {
-    parsed = new URL(trimmed);
-  } catch {
-    return trimmed;
-  }
+  if (!URL.canParse(trimmed)) return trimmed;
+  const parsed = new URL(trimmed);
   if (parsed.pathname !== "/$discovery/rest") return trimmed;
   const version = parsed.searchParams.get("version")?.trim();
   if (!version) return trimmed;
@@ -147,7 +121,7 @@ const resolveGoogleDiscoveryCredentials = (
   ctx: PluginCtx,
 ): Effect.Effect<
   { headers?: Record<string, string>; queryParams?: Record<string, string> } | undefined,
-  GoogleDiscoverySourceError
+  GoogleDiscoverySourceError | StorageFailure
 > =>
   Effect.gen(function* () {
     if (!credentials) return undefined;
@@ -158,16 +132,11 @@ const resolveGoogleDiscoveryCredentials = (
       new GoogleDiscoverySourceError({
         message: `Secret not found for header "${name}"`,
       }),
-    onError: (_error, name) =>
-      new GoogleDiscoverySourceError({
-        message: `Secret not found for header "${name}"`,
-      }),
   }).pipe(
-    Effect.mapError((err) =>
-      err instanceof GoogleDiscoverySourceError
-        ? err
-        : new GoogleDiscoverySourceError({ message: "Secret resolution failed" }),
-    ),
+    Effect.catchTags({
+      SecretOwnedByConnectionError: () =>
+        Effect.fail(new GoogleDiscoverySourceError({ message: "Secret resolution failed" })),
+    }),
   );
   const queryParams = yield* resolveSecretBackedMap({
     values: credentials.queryParams,
@@ -176,16 +145,11 @@
       new GoogleDiscoverySourceError({
         message: `Secret not found for query parameter "${name}"`,
       }),
-    onError: (_error, name) =>
-      new GoogleDiscoverySourceError({
-        message: `Secret not found for query parameter "${name}"`,
-      }),
  }).pipe(
-    Effect.mapError((err) =>
-      err instanceof GoogleDiscoverySourceError
-        ? err
-        : new GoogleDiscoverySourceError({ message: "Secret resolution failed" }),
-    ),
+    Effect.catchTags({
+      SecretOwnedByConnectionError: () =>
+        Effect.fail(new GoogleDiscoverySourceError({ message: "Secret resolution failed" })),
+    }),
  );
   return {
     ...(headers ? { headers } : {}),
@@ -200,29 +164,41 @@ const fetchDiscoveryDocument = (
     readonly queryParams?: Record<string, string>;
   },
 ) =>
-  Effect.tryPromise({
-    try: async () => {
-      const url = new URL(normalizeDiscoveryUrl(discoveryUrl));
-      for (const [key, value] of Object.entries(credentials?.queryParams ?? {})) {
-        url.searchParams.set(key, value);
-      }
-      const response = await fetch(url.toString(), {
-        headers: credentials?.headers,
-        signal: AbortSignal.timeout(20_000),
-      });
-      if (!response.ok) {
-        throw new GoogleDiscoverySourceError({
-          message: `Google Discovery fetch failed with status ${response.status}`,
+  Effect.gen(function* () {
+    const url = yield* Effect.try({
+      try: () => new URL(normalizeDiscoveryUrl(discoveryUrl)),
+      catch: () =>
+        new GoogleDiscoverySourceError({
+          message: "Invalid Google Discovery URL",
+        }),
+    });
+    const response = yield* Effect.tryPromise({
+      try: async () => {
+        for (const [key, value] of Object.entries(credentials?.queryParams ?? {})) {
+          url.searchParams.set(key, value);
+        }
+        return fetch(url.toString(), {
+          headers: credentials?.headers,
+          signal: AbortSignal.timeout(20_000),
         });
-      }
-      return response.text();
-    },
-    catch: (cause) =>
-      cause instanceof GoogleDiscoverySourceError
-        ? cause
-        : new GoogleDiscoverySourceError({
-            message: cause instanceof Error ? cause.message : String(cause),
-          }),
+      },
+      catch: () =>
+        new GoogleDiscoverySourceError({
+          message: "Google Discovery fetch failed",
+        }),
+    });
+    if (!response.ok) {
+      return yield* new GoogleDiscoverySourceError({
+        message: `Google Discovery fetch failed with status ${response.status}`,
+      });
+    }
+    return yield* Effect.tryPromise({
+      try: () => response.text(),
+      catch: () =>
+        new GoogleDiscoverySourceError({
+          message: "Failed to read Google Discovery response",
+        }),
+    });
   });
 
 const normalizeSlug = (value: string): string =>
@@ -306,101 +282,102 @@ const registerManifest = (
 // ---------------------------------------------------------------------------
 // Plugin
 // ---------------------------------------------------------------------------
 
+const makeGoogleDiscoveryPluginExtension = (ctx: PluginCtx) => ({
+  probeDiscovery: (input: string | GoogleDiscoveryProbeInput) =>
+    Effect.gen(function* () {
+      const discoveryUrl = typeof input === "string" ? input : input.discoveryUrl;
+      const credentials =
+        typeof input === "string"
+          ? undefined
+          : yield* resolveGoogleDiscoveryCredentials(input.credentials, ctx);
+      const text = yield* fetchDiscoveryDocument(discoveryUrl, credentials);
+      const manifest = yield* extractGoogleDiscoveryManifest(text);
+      const scopes = Object.keys(
+        Option.isSome(manifest.oauthScopes) ? manifest.oauthScopes.value : {},
+      ).sort();
+      const operations = manifest.methods.map((method) => ({
+        toolPath: method.toolPath,
+        method: method.binding.method,
+        pathTemplate: method.binding.pathTemplate,
+        description: Option.isSome(method.description) ? method.description.value : null,
+      }));
+      return {
+        name: Option.isSome(manifest.title)
+          ? manifest.title.value
+          : `${manifest.service} ${manifest.version}`,
+        title: Option.isSome(manifest.title) ? manifest.title.value : null,
+        service: manifest.service,
+        version: manifest.version,
+        toolCount: manifest.methods.length,
+        scopes,
+        operations,
+      };
+    }),
+
+  addSource: (input: GoogleDiscoveryAddSourceInput) =>
+    ctx.transaction(
+      Effect.gen(function* () {
+        const credentials = yield* resolveGoogleDiscoveryCredentials(input.credentials, ctx);
+        const text = yield* fetchDiscoveryDocument(input.discoveryUrl, credentials);
+        const manifest = yield* extractGoogleDiscoveryManifest(text);
+        const namespace =
+          input.namespace ??
+          deriveNamespace({
+            name: input.name,
+            service: manifest.service,
+            version: manifest.version,
+          });
+        const sourceData = new GoogleDiscoveryStoredSourceDataSchema({
+          name: input.name,
+          discoveryUrl: normalizeDiscoveryUrl(input.discoveryUrl),
+          credentials: input.credentials,
+          service: manifest.service,
+          version: manifest.version,
+          rootUrl: manifest.rootUrl,
+          servicePath: manifest.servicePath,
+          auth: input.auth,
+        });
+        const toolCount = yield* registerManifest(
+          ctx,
+          namespace,
+          input.scope,
+          manifest,
+          sourceData,
+        );
+        return { toolCount, namespace };
+      }),
+    ),
+
+  removeSource: (namespace: string, scope: string) =>
+    ctx.transaction(
+      Effect.gen(function* () {
+        yield* ctx.storage.removeBindingsBySource(namespace, scope);
+        yield* ctx.storage.removeSource(namespace, scope);
+        yield* ctx.core.sources.unregister(namespace).pipe(Effect.ignore);
+      }),
+    ),
+
+  // OAuth start/complete live on `ctx.oauth` now — the UI calls
+  // the shared `/scopes/:scopeId/oauth/*` endpoints directly with a
+  // Google-specific `authorization-code` strategy and writes the
+  // resulting connection back via `updateSource`.
+
+  getSource: (namespace: string, scope: string) => ctx.storage.getSource(namespace, scope),
+
+  updateSource: (namespace: string, scope: string, input: GoogleDiscoveryUpdateSourceInput) =>
+    ctx.storage.updateSourceMeta(namespace, scope, {
+      name: input.name?.trim() || undefined,
+      auth: input.auth,
+    }),
+});
+
 export const googleDiscoveryPlugin = definePlugin(() => ({
   id: "googleDiscovery" as const,
   packageName: "@executor-js/plugin-google-discovery",
   schema: googleDiscoverySchema,
   storage: (deps) => makeGoogleDiscoveryStore(deps),
-  extension: (ctx) =>
-    ({
-      probeDiscovery: (input) =>
-        Effect.gen(function* () {
-          const discoveryUrl = typeof input === "string" ? input : input.discoveryUrl;
-          const credentials =
-            typeof input === "string"
-              ? undefined
-              : yield* resolveGoogleDiscoveryCredentials(input.credentials, ctx);
-          const text = yield* fetchDiscoveryDocument(discoveryUrl, credentials);
-          const manifest = yield* extractGoogleDiscoveryManifest(text);
-          const scopes = Object.keys(
-            Option.isSome(manifest.oauthScopes) ? manifest.oauthScopes.value : {},
-          ).sort();
-          const operations = manifest.methods.map((method) => ({
-            toolPath: method.toolPath,
-            method: method.binding.method,
-            pathTemplate: method.binding.pathTemplate,
-            description: Option.isSome(method.description) ? method.description.value : null,
-          }));
-          return {
-            name: Option.isSome(manifest.title)
-              ? manifest.title.value
-              : `${manifest.service} ${manifest.version}`,
-            title: Option.isSome(manifest.title) ? manifest.title.value : null,
-            service: manifest.service,
-            version: manifest.version,
-            toolCount: manifest.methods.length,
-            scopes,
-            operations,
-          };
-        }),
-
-      addSource: (input) =>
-        ctx.transaction(
-          Effect.gen(function* () {
-            const credentials = yield* resolveGoogleDiscoveryCredentials(input.credentials, ctx);
-            const text = yield* fetchDiscoveryDocument(input.discoveryUrl, credentials);
-            const manifest = yield* extractGoogleDiscoveryManifest(text);
-            const namespace =
-              input.namespace ??
-              deriveNamespace({
-                name: input.name,
-                service: manifest.service,
-                version: manifest.version,
-              });
-            const sourceData = new GoogleDiscoveryStoredSourceDataSchema({
-              name: input.name,
-              discoveryUrl: normalizeDiscoveryUrl(input.discoveryUrl),
-              credentials: input.credentials,
-              service: manifest.service,
-              version: manifest.version,
-              rootUrl: manifest.rootUrl,
-              servicePath: manifest.servicePath,
-              auth: input.auth,
-            });
-            const toolCount = yield* registerManifest(
-              ctx,
-              namespace,
-              input.scope,
-              manifest,
-              sourceData,
-            );
-            return { toolCount, namespace };
-          }),
-        ),
-
-      removeSource: (namespace, scope) =>
-        ctx.transaction(
-          Effect.gen(function* () {
-            yield* ctx.storage.removeBindingsBySource(namespace, scope);
-            yield* ctx.storage.removeSource(namespace, scope);
-            yield* ctx.core.sources.unregister(namespace).pipe(Effect.ignore);
-          }),
-        ),
-
-      // OAuth start/complete live on `ctx.oauth` now — the UI calls
-      // the shared `/scopes/:scopeId/oauth/*` endpoints directly with a
-      // Google-specific `authorization-code` strategy and writes the
-      // resulting connection back via `updateSource`.
- - getSource: (namespace, scope) => ctx.storage.getSource(namespace, scope), - - updateSource: (namespace, scope, input) => - ctx.storage.updateSourceMeta(namespace, scope, { - name: input.name?.trim() || undefined, - auth: input.auth, - }), - }) satisfies GoogleDiscoveryPluginExtension, + extension: makeGoogleDiscoveryPluginExtension, invokeTool: ({ ctx, toolRow, args }) => invokeGoogleDiscoveryTool({ @@ -563,7 +540,7 @@ export const googleDiscoveryPlugin = definePlugin(() => ({ servicePath: manifest.servicePath, }); yield* registerManifest(typedCtx, sourceId, scope, manifest, next); - }).pipe(Effect.mapError((err) => (err instanceof Error ? err : new Error(String(err))))), + }), // Connection refresh is owned by the canonical `"oauth2"` // ConnectionProvider registered by core — no plugin-specific handler diff --git a/packages/plugins/graphql/src/sdk/introspect.ts b/packages/plugins/graphql/src/sdk/introspect.ts index 589ceafd7..b3cbdd456 100644 --- a/packages/plugins/graphql/src/sdk/introspect.ts +++ b/packages/plugins/graphql/src/sdk/introspect.ts @@ -1,4 +1,4 @@ -import { Effect } from "effect"; +import { Effect, Schema, SchemaGetter } from "effect"; import { HttpClient, HttpClientRequest } from "effect/unstable/http"; import { GraphqlIntrospectionError } from "./errors"; @@ -78,52 +78,78 @@ const INTROSPECTION_QUERY = ` `; // --------------------------------------------------------------------------- -// Introspection result types +// Introspection result schema // --------------------------------------------------------------------------- -export interface IntrospectionTypeRef { +interface IntrospectionTypeRefRecursive { readonly kind: string; readonly name: string | null; - readonly ofType: IntrospectionTypeRef | null; + readonly ofType: IntrospectionTypeRefRecursive | null; } -export interface IntrospectionInputValue { - readonly name: string; - readonly description: string | null; - readonly type: IntrospectionTypeRef; - readonly defaultValue: string | 
null;
-}
 
+const IntrospectionTypeRefModel: Schema.Codec<IntrospectionTypeRefRecursive> = Schema.Struct({
+  kind: Schema.String,
+  name: Schema.NullOr(Schema.String),
+  ofType: Schema.NullOr(
+    Schema.suspend((): Schema.Codec<IntrospectionTypeRefRecursive> => IntrospectionTypeRefModel),
+  ),
+});
 
-export interface IntrospectionField {
-  readonly name: string;
-  readonly description: string | null;
-  readonly args: readonly IntrospectionInputValue[];
-  readonly type: IntrospectionTypeRef;
-}
+const IntrospectionInputValueModel = Schema.Struct({
+  name: Schema.String,
+  description: Schema.NullOr(Schema.String),
+  type: IntrospectionTypeRefModel,
+  defaultValue: Schema.NullOr(Schema.String),
+});
 
-export interface IntrospectionEnumValue {
-  readonly name: string;
-  readonly description: string | null;
-}
+const IntrospectionFieldModel = Schema.Struct({
+  name: Schema.String,
+  description: Schema.NullOr(Schema.String),
+  args: Schema.Array(IntrospectionInputValueModel),
+  type: IntrospectionTypeRefModel,
+});
 
-export interface IntrospectionType {
-  readonly kind: string;
-  readonly name: string;
-  readonly description: string | null;
-  readonly fields: readonly IntrospectionField[] | null;
-  readonly inputFields: readonly IntrospectionInputValue[] | null;
-  readonly enumValues: readonly IntrospectionEnumValue[] | null;
-}
+const IntrospectionEnumValueModel = Schema.Struct({
+  name: Schema.String,
+  description: Schema.NullOr(Schema.String),
+});
 
-export interface IntrospectionSchema {
-  readonly queryType: { readonly name: string } | null;
-  readonly mutationType: { readonly name: string } | null;
-  readonly types: readonly IntrospectionType[];
-}
+const IntrospectionTypeModel = Schema.Struct({
+  kind: Schema.String,
+  name: Schema.String,
+  description: Schema.NullOr(Schema.String),
+  fields: Schema.NullOr(Schema.Array(IntrospectionFieldModel)),
+  inputFields: Schema.NullOr(Schema.Array(IntrospectionInputValueModel)),
+  enumValues: Schema.NullOr(Schema.Array(IntrospectionEnumValueModel)),
+});
 
-export interface IntrospectionResult {
-  readonly __schema: IntrospectionSchema;
-}
+const IntrospectionSchemaModel = Schema.Struct({
+  queryType: Schema.NullOr(Schema.Struct({ name: Schema.String })),
+  mutationType: Schema.NullOr(Schema.Struct({ name: Schema.String })),
+  types: Schema.Array(IntrospectionTypeModel),
+});
+
+const IntrospectionResultModel = Schema.Struct({
+  __schema: IntrospectionSchemaModel,
+});
+
+export type IntrospectionTypeRef = Schema.Schema.Type<typeof IntrospectionTypeRefModel>;
+export type IntrospectionInputValue = Schema.Schema.Type<typeof IntrospectionInputValueModel>;
+export type IntrospectionField = Schema.Schema.Type<typeof IntrospectionFieldModel>;
+export type IntrospectionEnumValue = Schema.Schema.Type<typeof IntrospectionEnumValueModel>;
+export type IntrospectionType = Schema.Schema.Type<typeof IntrospectionTypeModel>;
+export type IntrospectionSchema = Schema.Schema.Type<typeof IntrospectionSchemaModel>;
+export type IntrospectionResult = Schema.Schema.Type<typeof IntrospectionResultModel>;
+
+const IntrospectionJsonModel = Schema.Union([
+  IntrospectionResultModel,
+  Schema.Struct({ data: IntrospectionResultModel }),
+]).pipe(
+  Schema.decodeTo(IntrospectionResultModel, {
+    decode: SchemaGetter.transform((value) => ("data" in value ? value.data : value)),
+    encode: SchemaGetter.transform((value) => value),
+  }),
+);
 
 // ---------------------------------------------------------------------------
 // Introspect a GraphQL endpoint
 // ---------------------------------------------------------------------------
@@ -176,9 +202,7 @@ export const introspect = Effect.fn("GraphQL.introspect")(function* (
   }
 
   const raw = yield* response.json.pipe(
-    Effect.tapCause((cause) =>
-      Effect.logError("graphql introspection JSON parse failed", cause),
-    ),
+    Effect.tapCause((cause) => Effect.logError("graphql introspection JSON parse failed", cause)),
     Effect.mapError(
       () =>
         new GraphqlIntrospectionError({
@@ -211,18 +235,11 @@
 export const parseIntrospectionJson = (
   text: string,
 ): Effect.Effect<IntrospectionResult, GraphqlIntrospectionError> =>
-  Effect.try({
-    try: () => {
-      const parsed = JSON.parse(text);
-      // Accept both { data: { __schema } } and { __schema } formats
-      const result = parsed.data ??
parsed;
-      if (!result.__schema) {
-        throw new Error("Missing __schema in introspection JSON");
-      }
-      return result as IntrospectionResult;
-    },
-    catch: (err) =>
-      new GraphqlIntrospectionError({
-        message: `Failed to parse introspection JSON: ${err instanceof Error ? err.message : String(err)}`,
-      }),
-  });
+  Schema.decodeUnknownEffect(Schema.fromJsonString(IntrospectionJsonModel))(text).pipe(
+    Effect.mapError(
+      () =>
+        new GraphqlIntrospectionError({
+          message: "Failed to parse introspection JSON",
+        }),
+    ),
+  );
diff --git a/packages/plugins/mcp/src/api/handlers.ts b/packages/plugins/mcp/src/api/handlers.ts
index 6fd0ecdca..4769514c1 100644
--- a/packages/plugins/mcp/src/api/handlers.ts
+++ b/packages/plugins/mcp/src/api/handlers.ts
@@ -3,12 +3,15 @@
 import { Context, Effect } from "effect";
 
 import { addGroup, capture } from "@executor-js/api";
 import type {
+  McpConnectionAccessFailure,
+  McpExtensionFailure,
   McpPluginExtension,
   McpProbeEndpointInput,
   McpSourceConfig,
   McpUpdateSourceInput,
 } from "../sdk/plugin";
 import type { SecretBackedValue } from "../sdk/types";
+import { McpConnectionError } from "../sdk/errors";
 import { McpStoredSourceSchema } from "../sdk/stored-source";
 import { McpGroup } from "./group";
 
@@ -29,6 +32,53 @@ export class McpExtensionService extends Context.Service(
+const connectionAccessToMcpError = <A, E, R>(
+  effect: Effect.Effect<A, E, R>,
+): Effect.Effect<
+  A,
+  Exclude<E, McpConnectionAccessFailure> | McpConnectionError,
+  R
+> =>
+  effect.pipe(
+    Effect.catchTags({
+      ConnectionNotFoundError: (err) =>
+        Effect.fail(
+          new McpConnectionError({
+            transport: "remote",
+            message: `OAuth connection "${err.connectionId}" was not found`,
+          }),
+        ),
+      ConnectionProviderNotRegisteredError: (err) =>
+        Effect.fail(
+          new McpConnectionError({
+            transport: "remote",
+            message: `OAuth provider "${err.provider}" is not registered`,
+          }),
+        ),
+      ConnectionRefreshNotSupportedError: (err) =>
+        Effect.fail(
+          new McpConnectionError({
+            transport: "remote",
+            message: `OAuth provider "${err.provider}" does not support refresh`,
+ }), + ), + ConnectionReauthRequiredError: (err) => + Effect.fail( + new McpConnectionError({ + transport: "remote", + message: `OAuth connection "${err.connectionId}" requires reauthentication`, + }), + ), + ConnectionRefreshError: (err) => + Effect.fail( + new McpConnectionError({ + transport: "remote", + message: `Failed to refresh OAuth connection "${err.connectionId}"`, + }), + ), + }), + ); + // --------------------------------------------------------------------------- // Convert API payload → McpSourceConfig // --------------------------------------------------------------------------- @@ -100,67 +150,79 @@ export const McpHandlers = HttpApiBuilder.group(ExecutorApiWithMcp, "mcp", (hand handlers .handle("probeEndpoint", ({ payload }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - return yield* ext.probeEndpoint(payload as McpProbeEndpointInput); - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + return yield* ext.probeEndpoint(payload as McpProbeEndpointInput); + }), + ), ), ) .handle("addSource", ({ params: path, payload }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - return yield* ext.addSource( - toSourceConfig(payload as Parameters[0], path.scopeId), - ); - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + return yield* ext.addSource( + toSourceConfig(payload as Parameters[0], path.scopeId), + ); + }), + ), ), ) .handle("removeSource", ({ params: path, payload }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - yield* ext.removeSource(payload.namespace, path.scopeId); - return { removed: true }; - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + yield* ext.removeSource(payload.namespace, path.scopeId); + return { removed: true }; + }), + ), ), ) .handle("refreshSource", ({ 
params: path, payload }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - return yield* ext.refreshSource(payload.namespace, path.scopeId); - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + return yield* ext.refreshSource(payload.namespace, path.scopeId); + }), + ), ), ) .handle("getSource", ({ params: path }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - const source = yield* ext.getSource(path.namespace, path.scopeId); - return source - ? new McpStoredSourceSchema({ - namespace: source.namespace, - name: source.name, - config: source.config, - }) - : null; - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + const source = yield* ext.getSource(path.namespace, path.scopeId); + return source + ? new McpStoredSourceSchema({ + namespace: source.namespace, + name: source.name, + config: source.config, + }) + : null; + }), + ), ), ) .handle("updateSource", ({ params: path, payload }) => capture( - Effect.gen(function* () { - const ext = yield* McpExtensionService; - yield* ext.updateSource(path.namespace, path.scopeId, { - name: payload.name, - endpoint: payload.endpoint, - headers: payload.headers, - queryParams: payload.queryParams, - auth: payload.auth as McpUpdateSourceInput["auth"], - }); - return { updated: true }; - }), + connectionAccessToMcpError( + Effect.gen(function* () { + const ext = yield* McpExtensionService; + yield* ext.updateSource(path.namespace, path.scopeId, { + name: payload.name, + endpoint: payload.endpoint, + headers: payload.headers, + queryParams: payload.queryParams, + auth: payload.auth as McpUpdateSourceInput["auth"], + }); + return { updated: true }; + }), + ), ), ), ); diff --git a/packages/plugins/mcp/src/react/EditMcpSource.tsx b/packages/plugins/mcp/src/react/EditMcpSource.tsx index 4ab5e6d07..5a9de87f0 100644 --- 
a/packages/plugins/mcp/src/react/EditMcpSource.tsx +++ b/packages/plugins/mcp/src/react/EditMcpSource.tsx @@ -1,5 +1,6 @@ import { useState } from "react"; import { useAtomValue, useAtomSet } from "@effect/atom-react"; +import { Exit } from "effect"; import * as AsyncResult from "effect/unstable/reactivity/AsyncResult"; import { mcpSourceAtom, updateMcpSource } from "./atoms"; import { useScope } from "@executor-js/react/api/scope-context"; @@ -35,7 +36,7 @@ function RemoteEditForm(props: { onSave: () => void; }) { const scopeId = useScope(); - const doUpdate = useAtomSet(updateMcpSource, { mode: "promise" }); + const doUpdate = useAtomSet(updateMcpSource, { mode: "promiseExit" }); const secretList = useSecretPickerSecrets(); const identity = useSourceIdentity({ @@ -64,24 +65,24 @@ function RemoteEditForm(props: { setSaving(true); setError(null); const { headers, queryParams } = serializeHttpCredentials(credentials); - try { - await doUpdate({ - params: { scopeId, namespace: props.sourceId }, - payload: { - name: identity.name.trim() || undefined, - endpoint: endpoint.trim() || undefined, - headers, - queryParams, - }, - reactivityKeys: sourceWriteKeys, - }); - setDirty(false); - props.onSave(); - } catch (e) { - setError(e instanceof Error ? 
e.message : "Failed to update source"); - } finally { + const exit = await doUpdate({ + params: { scopeId, namespace: props.sourceId }, + payload: { + name: identity.name.trim() || undefined, + endpoint: endpoint.trim() || undefined, + headers, + queryParams, + }, + reactivityKeys: sourceWriteKeys, + }); + if (Exit.isFailure(exit)) { + setError("Failed to update source"); setSaving(false); + return; } + setDirty(false); + props.onSave(); + setSaving(false); }; return ( diff --git a/packages/plugins/mcp/src/sdk/discover.ts b/packages/plugins/mcp/src/sdk/discover.ts index 58f6d7278..ed6c711f3 100644 --- a/packages/plugins/mcp/src/sdk/discover.ts +++ b/packages/plugins/mcp/src/sdk/discover.ts @@ -27,10 +27,10 @@ export const discoverTools = ( // Acquire connection const connection = yield* connector.pipe( Effect.mapError( - (err) => + () => new McpToolDiscoveryError({ stage: "connect", - message: `Failed connecting to MCP server: ${err.message}`, + message: "Failed connecting to MCP server", }), ), ); @@ -38,23 +38,19 @@ export const discoverTools = ( // List tools const listResult = yield* Effect.tryPromise({ try: () => connection.client.listTools(), - catch: (cause) => + catch: () => new McpToolDiscoveryError({ stage: "list_tools", - message: `Failed listing MCP tools: ${ - cause instanceof Error ? 
cause.message : String(cause) - }`, + message: "Failed listing MCP tools", }), }); if (!isListToolsResult(listResult)) { - yield* Effect.promise(() => connection.close().catch(() => {})); - return yield* Effect.fail( - new McpToolDiscoveryError({ - stage: "list_tools", - message: "MCP listTools response did not match the expected schema", - }), - ); + yield* Effect.ignore(Effect.tryPromise(() => connection.close())); + return yield* new McpToolDiscoveryError({ + stage: "list_tools", + message: "MCP listTools response did not match the expected schema", + }); } const manifest = extractManifestFromListToolsResult(listResult, { @@ -62,7 +58,7 @@ export const discoverTools = ( }); // Close the connection after discovery - yield* Effect.promise(() => connection.close().catch(() => {})); + yield* Effect.ignore(Effect.tryPromise(() => connection.close())); return manifest; }); diff --git a/packages/plugins/mcp/src/sdk/invoke.ts b/packages/plugins/mcp/src/sdk/invoke.ts index 6cd7e9dbd..4cce2a7c9 100644 --- a/packages/plugins/mcp/src/sdk/invoke.ts +++ b/packages/plugins/mcp/src/sdk/invoke.ts @@ -10,13 +10,14 @@ // 4. Retrying once on connection failure (invalidate + reconnect). // --------------------------------------------------------------------------- -import { Cause, Effect, Exit, Schema, ScopedCache } from "effect"; +import { Cause, Effect, Exit, Predicate, Schema, ScopedCache } from "effect"; import { ElicitRequestSchema } from "@modelcontextprotocol/sdk/types.js"; import { FormElicitation, UrlElicitation, + type ElicitationDeclinedError, type Elicit, type ElicitationRequest, } from "@executor-js/sdk/core"; @@ -108,14 +109,13 @@ const installElicitationHandler = ( } const failure = exit.cause.reasons.find(Cause.isFailReason); if (failure) { - const err = failure.error as { - readonly _tag?: string; - readonly action?: "decline" | "cancel"; - }; - if (err._tag === "ElicitationDeclinedError") { - return { action: err.action ?? 
"decline" };
+      const err = failure.error;
+      if (Predicate.isTagged(err, "ElicitationDeclinedError")) {
+        const action = (err as ElicitationDeclinedError).action;
+        return { action };
       }
     }
+    // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: MCP SDK request handler must throw JSON-RPC failures
     throw Cause.squash(exit.cause);
   },
 );
@@ -135,12 +135,10 @@
     installElicitationHandler(connection.client, elicit);
     return yield* Effect.tryPromise({
       try: () => connection.client.callTool({ name: toolName, arguments: args }),
-      catch: (cause) =>
+      catch: () =>
         new McpInvocationError({
           toolName,
-          message: `MCP tool call failed for ${toolName}: ${
-            cause instanceof Error ? cause.message : String(cause)
-          }`,
+          message: `MCP tool call failed for ${toolName}`,
         }),
     }).pipe(
       Effect.withSpan("plugin.mcp.client.call_tool", {
@@ -153,7 +151,7 @@
 // Public API
 // ---------------------------------------------------------------------------
 
-export interface InvokeMcpToolInput {
+export interface InvokeMcpToolInput<E> {
   readonly toolId: string;
   readonly toolName: string;
   readonly args: unknown;
@@ -162,22 +160,22 @@
    * connection cache key so per-user OAuth/secret resolution doesn't
    * collapse multiple users onto one shared connection. */
   readonly invokerScope: string;
-  readonly resolveConnector: () => Effect.Effect<McpConnection, McpConnectionError>;
+  readonly resolveConnector: () => Effect.Effect<McpConnection, E>;
   readonly connectionCache: ScopedCache.ScopedCache<
     string,
     McpConnection,
-    McpConnectionError
+    E
   >;
   readonly pendingConnectors: Map<
     string,
-    Effect.Effect<McpConnection, McpConnectionError>
+    Effect.Effect<McpConnection, E>
   >;
   readonly elicit: Elicit;
 }
 
-export const invokeMcpTool = (
-  input: InvokeMcpToolInput,
-): Effect.Effect<unknown, McpInvocationError | McpConnectionError> => {
+export const invokeMcpTool = <E>(
+  input: InvokeMcpToolInput<E>,
+): Effect.Effect<unknown, McpInvocationError | E> => {
   const transport: string =
     input.sourceData.transport === "stdio" ?
"stdio" diff --git a/packages/plugins/mcp/src/sdk/manifest.ts b/packages/plugins/mcp/src/sdk/manifest.ts index 6b92f1b8a..aee559f2c 100644 --- a/packages/plugins/mcp/src/sdk/manifest.ts +++ b/packages/plugins/mcp/src/sdk/manifest.ts @@ -1,4 +1,4 @@ -import { Schema } from "effect"; +import { Option, Schema } from "effect"; import { McpToolAnnotations } from "./types"; @@ -51,7 +51,7 @@ const decodeListToolsResult = Schema.decodeUnknownOption(ListToolsResult); const decodeServerInfo = Schema.decodeUnknownOption(ServerInfo); export const isListToolsResult = (value: unknown): boolean => - decodeListToolsResult(value)._tag === "Some"; + Option.isSome(decodeListToolsResult(value)); // --------------------------------------------------------------------------- // Tool ID sanitization @@ -86,14 +86,21 @@ export const extractManifestFromListToolsResult = ( ): McpToolManifest => { const seen = new Map(); - const listed = decodeListToolsResult(listToolsResult).pipe((opt) => - opt._tag === "Some" ? opt.value.tools : [], + const listed = decodeListToolsResult(listToolsResult).pipe( + Option.match({ + onNone: () => [], + onSome: (result) => result.tools, + }), ); - const server = decodeServerInfo(metadata?.serverInfo).pipe((opt): McpServerMetadata | null => - opt._tag === "Some" - ? { name: opt.value.name ?? null, version: opt.value.version ?? null } - : null, + const server = decodeServerInfo(metadata?.serverInfo).pipe( + Option.match({ + onNone: (): McpServerMetadata | null => null, + onSome: (info): McpServerMetadata => ({ + name: info.name ?? null, + version: info.version ?? 
null, + }), + }), ); const tools = listed.flatMap((tool): McpToolManifestEntry[] => { @@ -125,13 +132,8 @@ const slugify = (value: string): string => .replace(/[^a-z0-9]+/g, "_") .replace(/^_+|_+$/g, ""); -const hostnameOf = (url: string): string | null => { - try { - return new URL(url).hostname; - } catch { - return null; - } -}; +const hostnameOf = (url: string): string | null => + URL.canParse(url) ? new URL(url).hostname : null; const basenameOf = (path: string): string => path.trim().split(/[\\/]/).pop() ?? path.trim(); diff --git a/packages/plugins/mcp/src/sdk/per-user-auth-isolation.test.ts b/packages/plugins/mcp/src/sdk/per-user-auth-isolation.test.ts index cf655a4d1..ea6410167 100644 --- a/packages/plugins/mcp/src/sdk/per-user-auth-isolation.test.ts +++ b/packages/plugins/mcp/src/sdk/per-user-auth-isolation.test.ts @@ -19,7 +19,7 @@ import * as http from "node:http"; import { describe, expect, it } from "@effect/vitest"; -import { Cause, Effect, Exit } from "effect"; +import { Cause, Effect, Exit, Predicate } from "effect"; import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js"; import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js"; import { z } from "zod"; @@ -325,8 +325,8 @@ describe("per-user MCP auth isolation", () => { cause?: { _tag?: string }; } | undefined; - expect(outer?._tag).toBe("ToolInvocationError"); - expect(outer?.cause?._tag).toBe("McpConnectionError"); + expect(Predicate.isTagged(outer, "ToolInvocationError")).toBe(true); + expect(Predicate.isTagged(outer?.cause, "ConnectionNotFoundError")).toBe(true); // CRITICAL: no outbound MCP request was made on user B's behalf // carrying user A's bearer token. 
Auth resolution must have @@ -431,8 +431,8 @@ describe("per-user MCP auth isolation", () => { cause?: { _tag?: string }; } | undefined; - expect(outer?._tag).toBe("ToolInvocationError"); - expect(outer?.cause?._tag).toBe("McpConnectionError"); + expect(Predicate.isTagged(outer, "ToolInvocationError")).toBe(true); + expect(Predicate.isTagged(outer?.cause, "McpConnectionError")).toBe(true); const afterUserB = server.recorded().slice(recordedBeforeUserB); for (const req of afterUserB) { diff --git a/packages/plugins/mcp/src/sdk/plugin.ts b/packages/plugins/mcp/src/sdk/plugin.ts index 6b227dd19..39b1421f9 100644 --- a/packages/plugins/mcp/src/sdk/plugin.ts +++ b/packages/plugins/mcp/src/sdk/plugin.ts @@ -1,4 +1,4 @@ -import { Duration, Effect, Exit, Result, Scope, ScopedCache } from "effect"; +import { Duration, Effect, Exit, Option, Result, Scope, ScopedCache } from "effect"; import type { OAuthClientProvider } from "@modelcontextprotocol/sdk/client/auth.js"; @@ -10,6 +10,11 @@ import { SourceDetectionResult, Usage, definePlugin, + type ConnectionNotFoundError, + type ConnectionProviderNotRegisteredError, + type ConnectionReauthRequiredError, + type ConnectionRefreshError, + type ConnectionRefreshNotSupportedError, resolveSecretBackedMap as resolveSharedSecretBackedMap, type PluginCtx, type StorageFailure, @@ -182,42 +187,43 @@ const makeOAuthProvider = (accessToken: string): OAuthClientProvider => ({ tokens: () => ({ access_token: accessToken, token_type: "Bearer" }), saveTokens: () => undefined, redirectToAuthorization: async () => { + // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-error-constructor -- boundary: MCP SDK OAuth provider callback must reject unsupported reauth throw new Error("MCP OAuth re-authorization required"); }, saveCodeVerifier: () => undefined, codeVerifier: () => { + // oxlint-disable-next-line executor/no-try-catch-or-throw, executor/no-error-constructor -- boundary: MCP SDK OAuth provider callback must throw when no 
PKCE verifier exists
     throw new Error("No active PKCE verifier");
   },
   saveDiscoveryState: () => undefined,
   discoveryState: () => undefined,
 });
 
-const remoteConnectionError = (message: string) =>
-  new McpConnectionError({ transport: "remote", message });
-
-const mcpDiscoveryError = (message: string) =>
-  new McpToolDiscoveryError({ stage: "list_tools", message });
-
 const resolveSecretBackedMap = (
   values: Record<string, SecretBackedValue> | undefined,
   ctx: PluginCtx,
 ): Effect.Effect<Record<string, string> | undefined, McpConnectionError | StorageFailure> =>
   resolveSharedSecretBackedMap({
     values,
-    getSecret: ctx.secrets.get,
+    getSecret: (secretId) =>
+      ctx.secrets.get(secretId).pipe(
+        Effect.catchTag(
+          "SecretOwnedByConnectionError",
+          () =>
+            Effect.fail(
+              new McpConnectionError({
+                transport: "remote",
+                message: `Failed to resolve secret "${secretId}"`,
+              }),
+            ),
+        ),
+      ),
     onMissing: (_name, value) =>
-      remoteConnectionError(`Failed to resolve secret "${value.secretId}"`),
-    onError: (err, _name, value) =>
-      "_tag" in err && err._tag === "SecretOwnedByConnectionError"
-        ? remoteConnectionError(`Failed to resolve secret "${value.secretId}"`)
-        : err,
-  }).pipe(
-    Effect.mapError((err) =>
-      "_tag" in err && err._tag === "SecretOwnedByConnectionError"
-        ?
remoteConnectionError("Failed to resolve secret")
-        : err,
-    ),
-  );
+      new McpConnectionError({
+        transport: "remote",
+        message: `Failed to resolve secret "${value.secretId}"`,
+      }),
+  });
 
 const plainStringMap = (
   values: Record<string, string> | undefined,
@@ -233,11 +239,23 @@
 // Shared connector resolution — reads secrets, builds stdio/remote input
 // ---------------------------------------------------------------------------
 
+export type McpConnectionAccessFailure =
+  | ConnectionNotFoundError
+  | ConnectionProviderNotRegisteredError
+  | ConnectionRefreshNotSupportedError
+  | ConnectionReauthRequiredError
+  | ConnectionRefreshError;
+
+type McpConnectorResolutionFailure =
+  | McpConnectionError
+  | McpConnectionAccessFailure
+  | StorageFailure;
+
 const resolveConnectorInput = (
   sd: McpStoredSourceData,
   ctx: PluginCtx,
   allowStdio: boolean,
-): Effect.Effect<McpConnectorInput, McpConnectionError | StorageFailure> => {
+): Effect.Effect<McpConnectorInput, McpConnectorResolutionFailure> => {
   if (sd.transport === "stdio") {
     if (!allowStdio) {
       return Effect.fail(
@@ -268,16 +286,22 @@
       const val = yield* ctx.secrets
         .get(auth.secretId)
         .pipe(
-          Effect.mapError((err) =>
-            "_tag" in err && err._tag === "SecretOwnedByConnectionError"
-              ? remoteConnectionError(`Failed to resolve secret "${auth.secretId}"`)
-              : err,
+          Effect.catchTag(
+            "SecretOwnedByConnectionError",
+            () =>
+              Effect.fail(
+                new McpConnectionError({
+                  transport: "remote",
+                  message: `Failed to resolve secret "${auth.secretId}"`,
+                }),
+              ),
           ),
         );
       if (val === null) {
-        return yield* Effect.fail(
-          remoteConnectionError(`Failed to resolve secret "${auth.secretId}"`),
-        );
+        return yield* new McpConnectionError({
+          transport: "remote",
+          message: `Failed to resolve secret "${auth.secretId}"`,
+        });
       }
       headers[auth.headerName] = auth.prefix ?
`${auth.prefix}${val}` : val;
     } else if (auth.kind === "oauth2") {
@@ -286,17 +310,7 @@
       // The canonical `"oauth2"` ConnectionProvider registered by
       // core owns the refresh lifecycle; we just wrap the current
       // token for the SDK's transport.
-      const accessToken = yield* ctx.connections
-        .accessToken(auth.connectionId)
-        .pipe(
-          Effect.mapError((err) =>
-            remoteConnectionError(
-              `Failed to resolve OAuth connection "${auth.connectionId}": ${
-                "message" in err ? (err as { message: string }).message : String(err)
-              }`,
-            ),
-          ),
-        );
+      const accessToken = yield* ctx.connections.accessToken(auth.connectionId);
       authProvider = makeOAuthProvider(accessToken);
     }
@@ -318,15 +332,25 @@
 // ---------------------------------------------------------------------------
 
 interface McpRuntime {
-  readonly connectionCache: ScopedCache.ScopedCache<string, McpConnection, McpConnectionError>;
-  readonly pendingConnectors: Map<string, Effect.Effect<McpConnection, McpConnectionError>>;
+  readonly connectionCache: ScopedCache.ScopedCache<
+    string,
+    McpConnection,
+    McpConnectorResolutionFailure
+  >;
+  readonly pendingConnectors: Map<
+    string,
+    Effect.Effect<McpConnection, McpConnectorResolutionFailure>
+  >;
   readonly cacheScope: Scope.Closeable;
 }
 
 const makeRuntime = (): Effect.Effect<McpRuntime> =>
   Effect.gen(function* () {
     const cacheScope = yield* Scope.make();
-    const pendingConnectors = new Map<string, Effect.Effect<McpConnection, McpConnectionError>>();
+    const pendingConnectors = new Map<
+      string,
+      Effect.Effect<McpConnection, McpConnectorResolutionFailure>
+    >();
     const connectionCache = yield* ScopedCache.make({
       lookup: (key: string) =>
         Effect.acquireRelease(
@@ -342,7 +366,17 @@
             }
             return connector;
           }),
-          (connection) => Effect.promise(() => connection.close().catch(() => {})),
+          (connection) =>
+            Effect.ignore(
+              Effect.tryPromise({
+                try: () => connection.close(),
+                catch: () =>
+                  new McpConnectionError({
+                    transport: "auto",
+                    message: "Failed to close MCP connection",
+                  }),
+              }),
+            ),
         ),
       capacity: 64,
       timeToLive: Duration.minutes(5),
@@ -455,7 +489,10 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions)
=> { const endpoint = typeof input === "string" ? input : input.endpoint; const trimmed = endpoint.trim(); if (!trimmed) { - return yield* Effect.fail(remoteConnectionError("Endpoint URL is required")); + return yield* new McpConnectionError({ + transport: "remote", + message: "Endpoint URL is required", + }); } const name = yield* Effect.try({ @@ -510,13 +547,13 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { queryParams: probeQueryParams, }); if (shape.kind !== "mcp") { - return yield* Effect.fail( - remoteConnectionError( + return yield* new McpConnectionError({ + transport: "remote", + message: shape.kind === "not-mcp" ? `Endpoint does not look like an MCP server: ${shape.reason}` : `Could not reach endpoint: ${shape.reason}`, - ), - ); + }); } const probeResult = yield* ctx.oauth @@ -542,9 +579,10 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { } satisfies McpProbeResult; } - return yield* Effect.fail( - remoteConnectionError("MCP server requires authentication but OAuth discovery failed"), - ); + return yield* new McpConnectionError({ + transport: "remote", + message: "MCP server requires authentication but OAuth discovery failed", + }); }).pipe( Effect.withSpan("mcp.plugin.probe_endpoint", { attributes: { "mcp.endpoint": typeof input === "string" ? input : input.endpoint }, @@ -585,12 +623,15 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { // the caller at the end. const discovery: Result.Result< McpToolManifest, - McpToolDiscoveryError | McpConnectionError | StorageFailure + McpToolDiscoveryError | McpConnectorResolutionFailure > = Result.isSuccess(resolved) ? 
yield* discoverTools(createMcpConnector(resolved.success)).pipe( - Effect.mapError((err) => - mcpDiscoveryError(`MCP discovery failed: ${err.message}`), + Effect.mapError(() => + new McpToolDiscoveryError({ + stage: "list_tools", + message: "MCP discovery failed", + }), ), Effect.result, Effect.withSpan("mcp.plugin.discover_tools", { @@ -706,9 +747,10 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { }), ); if (!sd) { - return yield* Effect.fail( - remoteConnectionError(`No stored config for MCP source "${namespace}"`), - ); + return yield* new McpConnectionError({ + transport: "remote", + message: `No stored config for MCP source "${namespace}"`, + }); } const ci = yield* resolveConnectorInput(sd, ctx, allowStdio).pipe( @@ -720,7 +762,13 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { }), ); const manifest = yield* discoverTools(createMcpConnector(ci)).pipe( - Effect.mapError((err) => mcpDiscoveryError(`MCP refresh failed: ${err.message}`)), + Effect.mapError( + () => + new McpToolDiscoveryError({ + stage: "list_tools", + message: "MCP refresh failed", + }), + ), Effect.withSpan("mcp.plugin.discover_tools", { attributes: { "mcp.source.namespace": namespace }, }), @@ -836,7 +884,10 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { }), ); if (!entry) { - return yield* Effect.fail(new Error(`No MCP binding found for tool "${toolRow.id}"`)); + return yield* new McpConnectionError({ + transport: "auto", + message: `No MCP binding found for tool "${toolRow.id}"`, + }); } const sd = yield* ctx.storage.getSourceConfig(entry.namespace, toolScope).pipe( @@ -845,9 +896,10 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { }), ); if (!sd) { - return yield* Effect.fail( - new Error(`No MCP source config for namespace "${entry.namespace}"`), - ); + return yield* new McpConnectionError({ + transport: "auto", + message: `No MCP source config for namespace "${entry.namespace}"`, + 
}); } return yield* invokeMcpTool({ @@ -859,14 +911,6 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { resolveConnector: () => resolveConnectorInput(sd, ctx, allowStdio).pipe( Effect.flatMap((ci) => createMcpConnector(ci)), - Effect.mapError((err) => - err instanceof McpConnectionError - ? err - : new McpConnectionError({ - transport: "auto", - message: err instanceof Error ? err.message : String(err), - }), - ), Effect.withSpan("mcp.plugin.resolve_connector", { attributes: { "mcp.source.namespace": entry.namespace, @@ -896,7 +940,7 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { try: () => new URL(trimmed), catch: (cause) => cause, }).pipe(Effect.option); - if (parsed._tag === "None") return null; + if (Option.isNone(parsed)) return null; const name = parsed.value.hostname || "mcp"; const namespace = deriveMcpNamespace({ endpoint: trimmed }); @@ -1102,7 +1146,11 @@ export const mcpPlugin = definePlugin((options?: McpPluginOptions) => { * composition. `UniqueViolationError` passes through — plugins can * `Effect.catchTag` it if they want a friendlier user-facing error. 
*/ -export type McpExtensionFailure = McpConnectionError | McpToolDiscoveryError | StorageFailure; +export type McpExtensionFailure = + | McpConnectionError + | McpToolDiscoveryError + | McpConnectionAccessFailure + | StorageFailure; export interface McpPluginExtension { readonly probeEndpoint: ( diff --git a/packages/plugins/openapi/src/react/EditOpenApiSource.tsx b/packages/plugins/openapi/src/react/EditOpenApiSource.tsx index b5cb3f5ff..6e2b2e077 100644 --- a/packages/plugins/openapi/src/react/EditOpenApiSource.tsx +++ b/packages/plugins/openapi/src/react/EditOpenApiSource.tsx @@ -1,6 +1,8 @@ import { useEffect, useMemo, useRef, useState } from "react"; import { useAtomSet, useAtomValue } from "@effect/atom-react"; import * as AsyncResult from "effect/unstable/reactivity/AsyncResult"; +import * as Exit from "effect/Exit"; +import * as Option from "effect/Option"; import { connectionsAtom, sourceAtom, startOAuth } from "@executor-js/react/api/atoms"; import { useScope, useScopeStack, useUserScope } from "@executor-js/react/api/scope-context"; @@ -163,6 +165,7 @@ export default function EditOpenApiSource(props: { const doUpdate = useAtomSet(updateOpenApiSource, { mode: "promise" }); const doSetBinding = useAtomSet(setOpenApiSourceBinding, { mode: "promise" }); + const doSetBindingExit = useAtomSet(setOpenApiSourceBinding, { mode: "promiseExit" }); const doRemoveBinding = useAtomSet(removeOpenApiSourceBinding, { mode: "promise" }); const doStartOAuth = useAtomSet(startOAuth, { mode: "promise" }); const oauth = useOAuthPopupFlow({ @@ -321,23 +324,22 @@ export default function EditOpenApiSource(props: { if (!trimmed) return; setBusyKey(inputKey); setError(null); - try { - await doSetBinding({ - params: { scopeId: displayScope }, - payload: { - sourceId: props.sourceId, - sourceScope, - scope: targetScope, - slot, - value: { kind: "secret", secretId: SecretId.make(trimmed) }, - }, - reactivityKeys: sourceWriteKeys, - }); - } catch (e) { - setError(e instanceof Error ? 
e.message : "Failed to save credential binding"); - } finally { - setBusyKey(null); + const exit = await doSetBindingExit({ + params: { scopeId: displayScope }, + payload: { + sourceId: props.sourceId, + sourceScope, + scope: targetScope, + slot, + value: { kind: "secret", secretId: SecretId.make(trimmed) }, + }, + reactivityKeys: sourceWriteKeys, + }); + if (Exit.isFailure(exit)) { + const error = Exit.findErrorOption(exit); + setError(Option.isSome(error) ? error.value.message : "Failed to save credential binding"); } + setBusyKey(null); }; const clearBinding = async (targetScope: ScopeId, slot: string) => { diff --git a/packages/plugins/workos-vault/src/sdk/plugin.ts b/packages/plugins/workos-vault/src/sdk/plugin.ts index 077aad260..74e35eff4 100644 --- a/packages/plugins/workos-vault/src/sdk/plugin.ts +++ b/packages/plugins/workos-vault/src/sdk/plugin.ts @@ -1,10 +1,11 @@ -import { Effect } from "effect"; +import { Data, Effect } from "effect"; import { definePlugin } from "@executor-js/sdk/core"; import { makeConfiguredWorkOSVaultClient, type WorkOSVaultClient, + type WorkOSVaultClientInstantiationError, type WorkOSVaultCredentials, } from "./client"; import { @@ -36,9 +37,17 @@ export interface WorkOSVaultPluginOptions { readonly contextForScope?: WorkOSVaultContextForScope; } -export interface WorkOSVaultExtension { - readonly providerKey: typeof WORKOS_VAULT_PROVIDER_KEY; -} +class WorkOSVaultPluginConfigurationError extends Data.TaggedError( + "WorkOSVaultPluginConfigurationError", +)<{ + readonly message: string; +}> {} + +const makeWorkOSVaultExtension = (_ctx: unknown) => ({ + providerKey: WORKOS_VAULT_PROVIDER_KEY, +}); + +export type WorkOSVaultExtension = ReturnType<typeof makeWorkOSVaultExtension>; // The plugin's typed store is just its metadata-store wrapper.
The // secret provider closes over this store plus the resolved WorkOS @@ -48,15 +57,19 @@ type WorkosVaultPluginStore = WorkosVaultStore; const buildClient = ( options: WorkOSVaultPluginOptions | undefined, -): Effect.Effect => { +): Effect.Effect< + WorkOSVaultClient, + WorkOSVaultClientInstantiationError | WorkOSVaultPluginConfigurationError, + never +> => { if (options?.client) return Effect.succeed(options.client); if (options?.credentials) { return makeConfiguredWorkOSVaultClient(options.credentials); } return Effect.fail( - new Error( - "workosVaultPlugin requires either `client` or `credentials` to be provided", - ), + new WorkOSVaultPluginConfigurationError({ + message: "workosVaultPlugin requires either `client` or `credentials` to be provided", + }), ); }; @@ -67,9 +80,7 @@ export const workosVaultPlugin = definePlugin( schema: workosVaultSchema, storage: (deps): WorkosVaultPluginStore => makeWorkosVaultStore(deps), - extension: (_ctx): WorkOSVaultExtension => ({ - providerKey: WORKOS_VAULT_PROVIDER_KEY, - }), + extension: makeWorkOSVaultExtension, secretProviders: (ctx) => { // Build (or accept) the WorkOS client once at startup. 
If diff --git a/scripts/oxlint-plugin-executor.js b/scripts/oxlint-plugin-executor.js index 94ea9e2ae..755b122ec 100644 --- a/scripts/oxlint-plugin-executor.js +++ b/scripts/oxlint-plugin-executor.js @@ -2,15 +2,24 @@ import noConditionalTests from "./oxlint-plugin-executor/rules/no-conditional-te import noCrossPackageRelativeImports from "./oxlint-plugin-executor/rules/no-cross-package-relative-imports.js"; import noDoubleCast from "./oxlint-plugin-executor/rules/no-double-cast.js"; import noEffectInternalTags from "./oxlint-plugin-executor/rules/no-effect-internal-tags.js"; +import noErrorConstructor from "./oxlint-plugin-executor/rules/no-error-constructor.js"; import noInlineObjectTypeAssertion from "./oxlint-plugin-executor/rules/no-inline-object-type-assertion.js"; +import noInstanceofError from "./oxlint-plugin-executor/rules/no-instanceof-error.js"; import noInstanceofTaggedError from "./oxlint-plugin-executor/rules/no-instanceof-tagged-error.js"; import noManualTagCheck from "./oxlint-plugin-executor/rules/no-manual-tag-check.js"; +import noPromiseCatch from "./oxlint-plugin-executor/rules/no-promise-catch.js"; import noPromiseClientSurface from "./oxlint-plugin-executor/rules/no-promise-client-surface.js"; +import noPromiseReject from "./oxlint-plugin-executor/rules/no-promise-reject.js"; import noRawErrorThrow from "./oxlint-plugin-executor/rules/no-raw-error-throw.js"; import noRedundantErrorFactory from "./oxlint-plugin-executor/rules/no-redundant-error-factory.js"; import noTsNocheck from "./oxlint-plugin-executor/rules/no-ts-nocheck.js"; +import noTryCatchOrThrow from "./oxlint-plugin-executor/rules/no-try-catch-or-throw.js"; +import noUnknownErrorMessage from "./oxlint-plugin-executor/rules/no-unknown-error-message.js"; import noUnknownShapeProbing from "./oxlint-plugin-executor/rules/no-unknown-shape-probing.js"; import noVitestImport from "./oxlint-plugin-executor/rules/no-vitest-import.js"; +import preferSchemaInferredTypes from 
"./oxlint-plugin-executor/rules/prefer-schema-inferred-types.js"; +import preferYieldTaggedError from "./oxlint-plugin-executor/rules/prefer-yield-tagged-error.js"; +import preferValueInferredExtensionTypes from "./oxlint-plugin-executor/rules/prefer-value-inferred-extension-types.js"; import requireReactivityKeys from "./oxlint-plugin-executor/rules/require-reactivity-keys.js"; export default { @@ -24,13 +33,22 @@ export default { "no-cross-package-relative-imports": noCrossPackageRelativeImports, "require-reactivity-keys": requireReactivityKeys, "no-effect-internal-tags": noEffectInternalTags, + "no-error-constructor": noErrorConstructor, "no-ts-nocheck": noTsNocheck, "no-inline-object-type-assertion": noInlineObjectTypeAssertion, + "no-instanceof-error": noInstanceofError, "no-instanceof-tagged-error": noInstanceofTaggedError, "no-manual-tag-check": noManualTagCheck, + "no-promise-catch": noPromiseCatch, "no-promise-client-surface": noPromiseClientSurface, + "no-promise-reject": noPromiseReject, "no-raw-error-throw": noRawErrorThrow, "no-redundant-error-factory": noRedundantErrorFactory, + "no-try-catch-or-throw": noTryCatchOrThrow, + "no-unknown-error-message": noUnknownErrorMessage, "no-unknown-shape-probing": noUnknownShapeProbing, + "prefer-schema-inferred-types": preferSchemaInferredTypes, + "prefer-value-inferred-extension-types": preferValueInferredExtensionTypes, + "prefer-yield-tagged-error": preferYieldTaggedError, }, }; diff --git a/scripts/oxlint-plugin-executor/rules/no-conditional-tests.js b/scripts/oxlint-plugin-executor/rules/no-conditional-tests.js index 52eba8e09..fd724764b 100644 --- a/scripts/oxlint-plugin-executor/rules/no-conditional-tests.js +++ b/scripts/oxlint-plugin-executor/rules/no-conditional-tests.js @@ -57,7 +57,8 @@ export default { if (name !== "expect") return; context.report({ node, - message: "Avoid conditional expect calls; split the test or assert both branches explicitly.", + message: + "Avoid conditional expect calls; 
split the test or assert both branches explicitly. Skill: wrdn-effect-vitest-tests.", }); }, FunctionDeclaration: enterFunction, diff --git a/scripts/oxlint-plugin-executor/rules/no-cross-package-relative-imports.js b/scripts/oxlint-plugin-executor/rules/no-cross-package-relative-imports.js index 13e54d7aa..0c34fe6be 100644 --- a/scripts/oxlint-plugin-executor/rules/no-cross-package-relative-imports.js +++ b/scripts/oxlint-plugin-executor/rules/no-cross-package-relative-imports.js @@ -23,7 +23,7 @@ export default { context.report({ node: node.source, - message: `Import ${target.name} via its package export instead of a relative path.`, + message: `Import ${target.name} via its package export instead of a relative path. Skill: wrdn-package-boundaries.`, }); }, }; diff --git a/scripts/oxlint-plugin-executor/rules/no-double-cast.js b/scripts/oxlint-plugin-executor/rules/no-double-cast.js index 012462cce..03fd91df5 100644 --- a/scripts/oxlint-plugin-executor/rules/no-double-cast.js +++ b/scripts/oxlint-plugin-executor/rules/no-double-cast.js @@ -19,7 +19,7 @@ export default { context.report({ node, message: - "Avoid double casts through unknown/any; use a typed boundary, schema decode, or a narrow allow comment with a reason.", + "Avoid double casts through unknown/any; use a typed boundary, schema decode, or a narrow allow comment with a reason. Skill: wrdn-effect-schema-boundaries.", }); }, }; diff --git a/scripts/oxlint-plugin-executor/rules/no-effect-internal-tags.js b/scripts/oxlint-plugin-executor/rules/no-effect-internal-tags.js index 9b65aa293..18dff2773 100644 --- a/scripts/oxlint-plugin-executor/rules/no-effect-internal-tags.js +++ b/scripts/oxlint-plugin-executor/rules/no-effect-internal-tags.js @@ -71,7 +71,7 @@ function reportIfEffectTagComparison( context.report({ node: access, - message: `Use Effect's public helpers instead of checking internal _tag "${tag}".`, + message: `Use Effect's public helpers instead of checking internal _tag "${tag}". 
Skill: wrdn-effect-typed-errors.`, }); } diff --git a/scripts/oxlint-plugin-executor/rules/no-error-constructor.js b/scripts/oxlint-plugin-executor/rules/no-error-constructor.js new file mode 100644 index 000000000..6a587443a --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-error-constructor.js @@ -0,0 +1,40 @@ +import { nodeName } from "../utils.js"; + +const errorConstructors = new Set([ + "AggregateError", + "Error", + "EvalError", + "RangeError", + "ReferenceError", + "SyntaxError", + "TypeError", + "URIError", +]); + +const message = + "Do not construct built-in Error objects in Effect domain code. Use typed domain errors and Effect.fail instead; at true adapter boundaries use a narrow suppression with a boundary reason. Skill: wrdn-effect-typed-errors."; + +const isErrorConstructor = (node) => errorConstructors.has(nodeName(node)); + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow built-in Error constructors.", + }, + }, + create(context) { + return { + NewExpression(node) { + if (isErrorConstructor(node.callee)) { + context.report({ node, message }); + } + }, + CallExpression(node) { + if (isErrorConstructor(node.callee)) { + context.report({ node, message }); + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-inline-object-type-assertion.js b/scripts/oxlint-plugin-executor/rules/no-inline-object-type-assertion.js index a741f199d..9f03db925 100644 --- a/scripts/oxlint-plugin-executor/rules/no-inline-object-type-assertion.js +++ b/scripts/oxlint-plugin-executor/rules/no-inline-object-type-assertion.js @@ -1,7 +1,7 @@ import { isIdentifier } from "../utils.js"; const message = - "Do not assert against inline object-shaped types. Use a named type, Schema, or a proper type guard."; + "Do not assert against inline object-shaped types. Use a named type, Schema, or a proper type guard. 
Skill: wrdn-effect-schema-boundaries."; const isUnknownKeyword = (node) => node?.type === "TSUnknownKeyword"; diff --git a/scripts/oxlint-plugin-executor/rules/no-instanceof-error.js b/scripts/oxlint-plugin-executor/rules/no-instanceof-error.js new file mode 100644 index 000000000..661bd32b2 --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-instanceof-error.js @@ -0,0 +1,22 @@ +import { nodeName } from "../utils.js"; + +const message = + "Do not use instanceof Error. Preserve typed failures with Effect tagged-error handling. Skill: wrdn-effect-typed-errors."; + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow instanceof Error checks.", + }, + }, + create(context) { + return { + BinaryExpression(node) { + if (node.operator === "instanceof" && nodeName(node.right) === "Error") { + context.report({ node, message }); + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-instanceof-tagged-error.js b/scripts/oxlint-plugin-executor/rules/no-instanceof-tagged-error.js index d8a566b14..a0ac94866 100644 --- a/scripts/oxlint-plugin-executor/rules/no-instanceof-tagged-error.js +++ b/scripts/oxlint-plugin-executor/rules/no-instanceof-tagged-error.js @@ -1,7 +1,7 @@ import { isIdentifier, nodeName } from "../utils.js"; const message = - "Do not use instanceof for tagged errors. Use Effect.catchTag, Effect.catchTags, or a _tag-based guard."; + "Do not use instanceof for tagged errors. Use Effect.catchTag, Effect.catchTags, or a _tag-based guard. 
Skill: wrdn-effect-typed-errors."; const looksLikeTaggedErrorName = (name) => typeof name === "string" && name !== "Error" && name.endsWith("Error"); diff --git a/scripts/oxlint-plugin-executor/rules/no-manual-tag-check.js b/scripts/oxlint-plugin-executor/rules/no-manual-tag-check.js index f6ebe68da..45f39bbbe 100644 --- a/scripts/oxlint-plugin-executor/rules/no-manual-tag-check.js +++ b/scripts/oxlint-plugin-executor/rules/no-manual-tag-check.js @@ -1,7 +1,7 @@ import { isIdentifier, isStringLiteral } from "../utils.js"; const message = - "Do not inspect _tag manually. Use Effect.catchTag, Effect.catchTags, Predicate.isTagged, or another Effect tagged-error API."; + "Do not inspect _tag manually. Use Effect.catchTag/catchTags for error handling, Predicate.isTagged for guards, or public Effect helpers for Effect data. Skill: wrdn-effect-typed-errors."; const isTagProperty = (node) => isIdentifier(node, "_tag") || (isStringLiteral(node) && node.value === "_tag"); @@ -15,6 +15,16 @@ export default { }, create(context) { return { + BinaryExpression(node) { + if (node.operator === "in" && isTagProperty(node.left)) { + context.report({ node, message }); + return; + } + if (!["===", "!==", "==", "!="].includes(node.operator)) return; + if (isTagAccess(node.left) || isTagAccess(node.right)) { + context.report({ node, message }); + } + }, MemberExpression(node) { if (isTagProperty(node.property)) { context.report({ node, message }); @@ -23,3 +33,5 @@ export default { }; }, }; + +const isTagAccess = (node) => node?.type === "MemberExpression" && isTagProperty(node.property); diff --git a/scripts/oxlint-plugin-executor/rules/no-promise-catch.js b/scripts/oxlint-plugin-executor/rules/no-promise-catch.js new file mode 100644 index 000000000..daa07271e --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-promise-catch.js @@ -0,0 +1,30 @@ +import { getPropertyName, isIdentifier, unwrapExpression } from "../utils.js"; + +const message = + "Do not use Promise .catch(). 
Model async failures with Effect.tryPromise and typed Effect error handling. Skill: wrdn-effect-typed-errors."; + +const isCatchMember = (node) => { + const expression = unwrapExpression(node); + if (isIdentifier(unwrapExpression(expression?.object), "Effect")) return false; + return ( + expression?.type === "MemberExpression" && getPropertyName(expression.property) === "catch" + ); +}; + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow Promise-style .catch() error handling.", + }, + }, + create(context) { + return { + CallExpression(node) { + if (isCatchMember(node.callee)) { + context.report({ node, message }); + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-promise-client-surface.js b/scripts/oxlint-plugin-executor/rules/no-promise-client-surface.js index 3ae5916f6..047ec94f4 100644 --- a/scripts/oxlint-plugin-executor/rules/no-promise-client-surface.js +++ b/scripts/oxlint-plugin-executor/rules/no-promise-client-surface.js @@ -1,7 +1,7 @@ import { containsPromiseType, nodeName } from "../utils.js"; const message = - "Do not expose Promise-shaped client surfaces. Wrap third-party SDK promises at the adapter boundary and expose Effect methods."; + "Do not expose Promise-shaped client surfaces. Wrap third-party SDK promises at the adapter boundary and expose Effect methods. Skill: effect-client-wrapper."; const isExported = (node) => node?.parent?.type === "ExportNamedDeclaration"; diff --git a/scripts/oxlint-plugin-executor/rules/no-promise-reject.js b/scripts/oxlint-plugin-executor/rules/no-promise-reject.js new file mode 100644 index 000000000..2e996cc7c --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-promise-reject.js @@ -0,0 +1,74 @@ +import { getPropertyName, isIdentifier, unwrapExpression } from "../utils.js"; + +const promiseRejectMessage = + "Do not use Promise.reject(). Model async failures with Effect.fail or Effect.tryPromise. 
Skill: wrdn-effect-typed-errors."; +const rejectCallbackMessage = + "Do not call Promise executor reject(). Model async failures with Effect.fail or Effect.tryPromise. Skill: wrdn-effect-typed-errors."; + +const isPromiseReject = (node) => { + const expression = unwrapExpression(node); + return ( + expression?.type === "MemberExpression" && + isIdentifier(unwrapExpression(expression.object), "Promise") && + getPropertyName(expression.property) === "reject" + ); +}; + +const isPromiseConstructor = (node) => + node?.type === "NewExpression" && isIdentifier(unwrapExpression(node.callee), "Promise"); + +const isFunction = (node) => + node?.type === "ArrowFunctionExpression" || + node?.type === "FunctionExpression" || + node?.type === "FunctionDeclaration"; + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow Promise rejection APIs.", + }, + }, + create(context) { + const promiseExecutors = new WeakSet(); + const rejectNames = []; + + const enterFunction = (node) => { + if (!promiseExecutors.has(node)) return; + const rejectParam = node.params?.[1]; + if (isIdentifier(rejectParam)) { + rejectNames.push(rejectParam.name); + } else { + rejectNames.push(undefined); + } + }; + + const exitFunction = (node) => { + if (promiseExecutors.has(node)) rejectNames.pop(); + }; + + return { + NewExpression(node) { + if (!isPromiseConstructor(node)) return; + const executor = node.arguments?.[0]; + if (isFunction(executor)) promiseExecutors.add(executor); + }, + CallExpression(node) { + if (isPromiseReject(node.callee)) { + context.report({ node, message: promiseRejectMessage }); + return; + } + + if (isIdentifier(node.callee) && rejectNames.includes(node.callee.name)) { + context.report({ node, message: rejectCallbackMessage }); + } + }, + FunctionDeclaration: enterFunction, + "FunctionDeclaration:exit": exitFunction, + FunctionExpression: enterFunction, + "FunctionExpression:exit": exitFunction, + ArrowFunctionExpression: enterFunction, + 
"ArrowFunctionExpression:exit": exitFunction, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-raw-error-throw.js b/scripts/oxlint-plugin-executor/rules/no-raw-error-throw.js index 92d9bc8d4..773b105fe 100644 --- a/scripts/oxlint-plugin-executor/rules/no-raw-error-throw.js +++ b/scripts/oxlint-plugin-executor/rules/no-raw-error-throw.js @@ -1,7 +1,7 @@ import { isIdentifier } from "../utils.js"; const message = - "Do not throw raw Error objects in Effect code. Return Effect.fail with a tagged error or assert directly in tests."; + "Do not throw raw Error objects in Effect code. Return Effect.fail with a tagged error or assert directly in tests. Skill: wrdn-effect-typed-errors."; const isNewError = (node) => node?.type === "NewExpression" && isIdentifier(node.callee, "Error"); diff --git a/scripts/oxlint-plugin-executor/rules/no-redundant-error-factory.js b/scripts/oxlint-plugin-executor/rules/no-redundant-error-factory.js index 1b1426712..2e3fccca3 100644 --- a/scripts/oxlint-plugin-executor/rules/no-redundant-error-factory.js +++ b/scripts/oxlint-plugin-executor/rules/no-redundant-error-factory.js @@ -1,27 +1,65 @@ import { isIdentifier } from "../utils.js"; const message = - "Do not add redundant make*Error wrappers that only construct a tagged error. Construct the tagged error directly."; + "Do not add redundant helpers that only construct a tagged error. Construct the tagged error directly. Skill: wrdn-effect-typed-errors."; const isErrorFactoryName = (name) => /^make[A-Z].*Error$/.test(name); +const isErrorHelperName = (name) => + isErrorFactoryName(name) || String(name ?? 
"").endsWith("Error"); + +const parameterName = (param) => { + if (isIdentifier(param)) return param.name; + if (param?.type === "AssignmentPattern" && isIdentifier(param.left)) return param.left.name; + if (param?.type === "RestElement" && isIdentifier(param.argument)) return param.argument.name; + return undefined; +}; + const isNewErrorExpression = (node) => node?.type === "NewExpression" && isIdentifier(node.callee) && node.callee.name.endsWith("Error"); +const isForwardedValue = (node, parameterNames) => { + if (node?.type === "Literal" || node?.type === "StringLiteral") return true; + if (node?.type === "Identifier") return parameterNames.has(node.name); + return ( + node?.type === "MemberExpression" && + isIdentifier(node.object) && + parameterNames.has(node.object.name) + ); +}; + +const isObjectWithOnlyForwardedFields = (node, parameterNames) => { + if (node?.type !== "ObjectExpression") return true; + return (node.properties ?? []).every((property) => { + if (property.type === "SpreadElement") return false; + return isForwardedValue(property.value, parameterNames); + }); +}; + +const isRedundantNewErrorExpression = (node, parameterNames) => { + if (!isNewErrorExpression(node)) return false; + if ((node.arguments ?? []).length === 0) return true; + if (node.arguments.length > 1) return false; + const argument = node.arguments[0]; + if (argument?.type === "Identifier") return parameterNames.has(argument.name); + return isObjectWithOnlyForwardedFields(argument, parameterNames); +}; + const returnsOnlyNewError = (node) => { - if (isNewErrorExpression(node)) return true; - if (node?.type !== "BlockStatement") return false; - const statements = node.body ?? []; + const parameterNames = new Set((node?.params ?? []).map(parameterName).filter(Boolean)); + // `node` is now the function node, so shape checks must run against its body. + const body = node?.body ?? node; + if (isRedundantNewErrorExpression(body, parameterNames)) return true; + if (body?.type !== "BlockStatement") return false; + const statements = body.body ?? []; return ( statements.length === 1 && statements[0]?.type === "ReturnStatement" && - isNewErrorExpression(statements[0].argument) + isRedundantNewErrorExpression(statements[0].argument, parameterNames) ); }; -const reportIfRedundantFactory = (context, name, body, node) => { - if (isErrorFactoryName(name) && returnsOnlyNewError(body)) { - context.report({ node, message }); +const reportIfRedundantFactory = (context, name, fnNode, reportNode) => { + if (isErrorHelperName(name) && returnsOnlyNewError(fnNode)) { + context.report({ node: reportNode, message }); } }; @@ -35,7 +73,7 @@ create(context) { return { FunctionDeclaration(node) { - reportIfRedundantFactory(context, node.id?.name, node.body, node); + reportIfRedundantFactory(context, node.id?.name, node, node); }, VariableDeclarator(node) { if (!isIdentifier(node.id)) return; @@ -45,7 +83,7 @@ ) { return; } - reportIfRedundantFactory(context, node.id.name, node.init.body, node); + reportIfRedundantFactory(context, node.id.name, node.init, node); }, }; }, diff --git a/scripts/oxlint-plugin-executor/rules/no-try-catch-or-throw.js b/scripts/oxlint-plugin-executor/rules/no-try-catch-or-throw.js new file mode 100644 index 000000000..4b876efd9 --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-try-catch-or-throw.js @@ -0,0 +1,23 @@ +const tryCatchMessage = + "Do not use try/catch blocks in Effect domain code. Model failures with Effect instead; at true adapter boundaries use a narrow suppression with a boundary reason. Skill: wrdn-effect-typed-errors; React useAtomSet mutation handlers use wrdn-effect-promise-exit."; +const throwMessage = + "Do not throw errors in Effect domain code. Model failures with Effect.fail or typed error values instead; at true adapter boundaries use a narrow suppression with a boundary reason.
Skill: wrdn-effect-typed-errors."; + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow try/catch blocks and throw statements.", + }, + }, + create(context) { + return { + TryStatement(node) { + context.report({ node, message: tryCatchMessage }); + }, + ThrowStatement(node) { + context.report({ node, message: throwMessage }); + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-ts-nocheck.js b/scripts/oxlint-plugin-executor/rules/no-ts-nocheck.js index fc0b10aef..1ce03adf3 100644 --- a/scripts/oxlint-plugin-executor/rules/no-ts-nocheck.js +++ b/scripts/oxlint-plugin-executor/rules/no-ts-nocheck.js @@ -18,7 +18,7 @@ export default { context.report({ node, - message: `Do not use ${directiveName}; fix the types or narrow the file scope.`, + message: `Do not use ${directiveName}; fix the types or narrow the file scope. Skill: wrdn-typescript-type-safety.`, }); }, }; diff --git a/scripts/oxlint-plugin-executor/rules/no-unknown-error-message.js b/scripts/oxlint-plugin-executor/rules/no-unknown-error-message.js new file mode 100644 index 000000000..87987bf1b --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/no-unknown-error-message.js @@ -0,0 +1,49 @@ +import { getPropertyName, isIdentifier, nodeName, unwrapExpression } from "../utils.js"; + +const stringMessage = + "Do not stringify unknown errors. Keep typed failures in Effect or normalize at a typed boundary. Skill: wrdn-effect-typed-errors."; +const messagePropertyMessage = + "Do not read .message from unknown errors. Preserve typed failures with Effect tagged-error handling. Skill: wrdn-effect-typed-errors."; +const destructuredMessage = + "Do not destructure .message from unknown errors. Preserve typed failures with Effect tagged-error handling. 
Skill: wrdn-effect-typed-errors."; + +const errorLikeNames = new Set(["cause", "e", "err", "error", "reason", "unknownError"]); + +const isErrorLikeIdentifier = (node) => { + const name = nodeName(unwrapExpression(node)); + return errorLikeNames.has(name); +}; + +export default { + meta: { + type: "problem", + docs: { + description: "Disallow common unknown-error string and message normalization patterns.", + }, + }, + create(context) { + return { + CallExpression(node) { + if (!isIdentifier(unwrapExpression(node.callee), "String")) return; + if (node.arguments.some(isErrorLikeIdentifier)) { + context.report({ node, message: stringMessage }); + } + }, + MemberExpression(node) { + if (getPropertyName(node.property) !== "message") return; + if (isErrorLikeIdentifier(node.object)) { + context.report({ node, message: messagePropertyMessage }); + } + }, + VariableDeclarator(node) { + if (node.id?.type !== "ObjectPattern" || !isErrorLikeIdentifier(node.init)) return; + for (const property of node.id.properties ?? []) { + if (property.type !== "Property") continue; + if (getPropertyName(property.key) === "message") { + context.report({ node: property, message: destructuredMessage }); + } + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/no-unknown-shape-probing.js b/scripts/oxlint-plugin-executor/rules/no-unknown-shape-probing.js index f409aef2a..4a6f6b51d 100644 --- a/scripts/oxlint-plugin-executor/rules/no-unknown-shape-probing.js +++ b/scripts/oxlint-plugin-executor/rules/no-unknown-shape-probing.js @@ -1,7 +1,7 @@ import { isIdentifier, isStringLiteral } from "../utils.js"; const message = - "Do not probe unknown object shapes in domain code. Normalize at a boundary with Schema, a typed adapter, or a named guard."; + "Do not probe unknown object shapes in domain code. Normalize at a boundary with Schema, a typed adapter, or a named guard. 
Skill: wrdn-effect-schema-boundaries."; const isReflectGet = (node) => node?.type === "MemberExpression" && diff --git a/scripts/oxlint-plugin-executor/rules/no-vitest-import.js b/scripts/oxlint-plugin-executor/rules/no-vitest-import.js index f420b8f2a..fe4b15ee9 100644 --- a/scripts/oxlint-plugin-executor/rules/no-vitest-import.js +++ b/scripts/oxlint-plugin-executor/rules/no-vitest-import.js @@ -15,7 +15,7 @@ export default { context.report({ node: node.source, message: - "Import test helpers from @effect/vitest or @effect/vitest/utils instead of vitest.", + "Import test helpers from @effect/vitest or @effect/vitest/utils instead of vitest. Skill: wrdn-effect-vitest-tests.", }); }, }; diff --git a/scripts/oxlint-plugin-executor/rules/prefer-schema-inferred-types.js b/scripts/oxlint-plugin-executor/rules/prefer-schema-inferred-types.js new file mode 100644 index 000000000..cab0d6bfe --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/prefer-schema-inferred-types.js @@ -0,0 +1,66 @@ +import { getCallName, isIdentifier, typeReferenceName } from "../utils.js"; + +const message = + "This object type duplicates a nearby Effect Schema. Export an inferred type from the schema instead. Skill: wrdn-effect-schema-inferred-types."; + +const schemaSuffixPattern = /(Schema|Model|Struct)$/; + +const schemaBaseName = (name) => { + const base = name.replace(schemaSuffixPattern, ""); + return base.length > 0 && base !== name ? 
base : undefined; +}; + +const isSchemaMemberCall = (node) => + node?.type === "CallExpression" && + node.callee?.type === "MemberExpression" && + isIdentifier(node.callee.object, "Schema"); + +const isSchemaModelExpression = (node) => { + if (isSchemaMemberCall(node)) return true; + if (node?.type === "CallExpression" && getCallName(node.callee) === "pipe") { + return isSchemaModelExpression(node.callee.object); + } + return false; +}; + +const isObjectTypeAlias = (node) => node.typeAnnotation?.type === "TSTypeLiteral"; + +const isInferredSchemaType = (node) => { + if (node.typeAnnotation?.type !== "TSTypeReference") return false; + const name = typeReferenceName(node.typeAnnotation); + return name === "Schema.Schema.Type"; +}; + +export default { + meta: { + type: "problem", + docs: { + description: message, + }, + }, + create(context) { + const schemaBases = new Set(); + const candidates = []; + + return { + VariableDeclarator(node) { + if (!isIdentifier(node.id) || !isSchemaModelExpression(node.init)) return; + const base = schemaBaseName(node.id.name); + if (base) schemaBases.add(base); + }, + TSInterfaceDeclaration(node) { + candidates.push({ node, name: node.id?.name }); + }, + TSTypeAliasDeclaration(node) { + if (!isObjectTypeAlias(node) || isInferredSchemaType(node)) return; + candidates.push({ node, name: node.id?.name }); + }, + "Program:exit"() { + for (const candidate of candidates) { + if (!candidate.name || !schemaBases.has(candidate.name)) continue; + context.report({ node: candidate.node, message }); + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/prefer-value-inferred-extension-types.js b/scripts/oxlint-plugin-executor/rules/prefer-value-inferred-extension-types.js new file mode 100644 index 000000000..59b9076ec --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/prefer-value-inferred-extension-types.js @@ -0,0 +1,81 @@ +import { isIdentifier } from "../utils.js"; + +const message = + "Do not duplicate plugin 
extension object shapes. Derive the extension type from the extension factory return value. Skill: wrdn-effect-value-inferred-types."; + +const extensionNamePattern = /(?:Plugin)?Extension$/; + +const isExtensionTypeName = (name) => typeof name === "string" && extensionNamePattern.test(name); + +const isExtensionProperty = (node) => + node?.type === "Property" && + !node.computed && + ((node.key?.type === "Identifier" && node.key.name === "extension") || + ((node.key?.type === "Literal" || node.key?.type === "StringLiteral") && + node.key.value === "extension")); + +const isSatisfiesExtension = (node, extensionTypeNames) => + node?.type === "TSSatisfiesExpression" && + node.typeAnnotation?.type === "TSTypeReference" && + isIdentifier(node.typeAnnotation.typeName) && + extensionTypeNames.has(node.typeAnnotation.typeName.name); + +const returnsSatisfiesExtension = (node, extensionTypeNames) => { + if (!node) return false; + if (isSatisfiesExtension(node, extensionTypeNames)) return true; + if (node.type === "BlockStatement") { + return (node.body ?? 
[]).some( + (statement) => + statement.type === "ReturnStatement" && + isSatisfiesExtension(statement.argument, extensionTypeNames), + ); + } + return false; +}; + +const isAnnotatedExtensionFunction = (node, extensionTypeNames) => + (node?.type === "ArrowFunctionExpression" || node?.type === "FunctionExpression") && + node.returnType?.typeAnnotation?.type === "TSTypeReference" && + isIdentifier(node.returnType.typeAnnotation.typeName) && + extensionTypeNames.has(node.returnType.typeAnnotation.typeName.name); + +export default { + meta: { + type: "problem", + docs: { + description: message, + }, + }, + create(context) { + const extensionTypeNames = new Set(); + const extensionProperties = []; + + return { + TSInterfaceDeclaration(node) { + if (isExtensionTypeName(node.id?.name)) { + extensionTypeNames.add(node.id.name); + } + }, + TSTypeAliasDeclaration(node) { + if (isExtensionTypeName(node.id?.name) && node.typeAnnotation?.type === "TSTypeLiteral") { + extensionTypeNames.add(node.id.name); + } + }, + Property(node) { + if (!isExtensionProperty(node)) return; + extensionProperties.push(node); + }, + "Program:exit"() { + for (const node of extensionProperties) { + const value = node.value; + if ( + isAnnotatedExtensionFunction(value, extensionTypeNames) || + returnsSatisfiesExtension(value?.body, extensionTypeNames) + ) { + context.report({ node, message }); + } + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/prefer-yield-tagged-error.js b/scripts/oxlint-plugin-executor/rules/prefer-yield-tagged-error.js new file mode 100644 index 000000000..182983982 --- /dev/null +++ b/scripts/oxlint-plugin-executor/rules/prefer-yield-tagged-error.js @@ -0,0 +1,40 @@ +import { getPropertyName, isIdentifier } from "../utils.js"; + +const message = + "Yield tagged errors directly in Effect.gen instead of yielding Effect.fail(new ErrorType(...)). 
Skill: wrdn-effect-typed-errors."; + +const isEffectFail = (node) => + node?.type === "MemberExpression" && + isIdentifier(node.object, "Effect") && + getPropertyName(node.property) === "fail"; + +const isTaggedErrorConstruction = (node) => + node?.type === "NewExpression" && + isIdentifier(node.callee) && + node.callee.name !== "Error" && + node.callee.name.endsWith("Error"); + +const isYieldedEffectFailOfTaggedError = (node) => + node?.type === "YieldExpression" && + node.delegate === true && + node.argument?.type === "CallExpression" && + isEffectFail(node.argument.callee) && + isTaggedErrorConstruction(node.argument.arguments?.[0]); + +export default { + meta: { + type: "problem", + docs: { + description: message, + }, + }, + create(context) { + return { + YieldExpression(node) { + if (isYieldedEffectFailOfTaggedError(node)) { + context.report({ node, message }); + } + }, + }; + }, +}; diff --git a/scripts/oxlint-plugin-executor/rules/require-reactivity-keys.js b/scripts/oxlint-plugin-executor/rules/require-reactivity-keys.js index f31557923..a3245dd2d 100644 --- a/scripts/oxlint-plugin-executor/rules/require-reactivity-keys.js +++ b/scripts/oxlint-plugin-executor/rules/require-reactivity-keys.js @@ -56,7 +56,7 @@ export default { context.report({ node: call, - message: `Mutation ${mutation.mutationName} must pass reactivityKeys at the call site.`, + message: `Mutation ${mutation.mutationName} must pass reactivityKeys at the call site. 
Skill: wrdn-effect-atom-reactivity-keys.`,
         });
       },
     };
diff --git a/tests/presets-reachable.test.ts b/tests/presets-reachable.test.ts
index 120e7603c..e83818651 100644
--- a/tests/presets-reachable.test.ts
+++ b/tests/presets-reachable.test.ts
@@ -62,10 +62,10 @@ describe("graphql presets are reachable endpoints", () => {
       const result = yield* introspect(preset.url).pipe(
         Effect.provide(FetchHttpClient.layer),
         Effect.map((r) => ({ ok: true as const, schema: r })),
-        Effect.catch((err) =>
+        Effect.catchTag("GraphqlIntrospectionError", ({ message }) =>
           Effect.succeed({
             ok: false as const,
-            message: String(err),
+            message,
           }),
         ),
       );
diff --git a/tests/release-bootstrap-smoke.test.ts b/tests/release-bootstrap-smoke.test.ts
index ef56d9bc4..8aa3401c3 100644
--- a/tests/release-bootstrap-smoke.test.ts
+++ b/tests/release-bootstrap-smoke.test.ts
@@ -61,6 +61,7 @@ const listen = async (server: ReturnType<typeof createServer>): Promise<number> =>
     server.listen(0, "127.0.0.1", () => {
       const address = server.address();
       if (!address || typeof address === "string") {
+        // oxlint-disable-next-line executor/no-promise-reject, executor/no-error-constructor -- boundary: Node http server smoke-test helper reports listen failures through Promise rejection
         reject(new Error("Failed to resolve server address"));
         return;
       }
@@ -72,6 +73,7 @@ const closeServer = async (server: ReturnType<typeof createServer>): Promise<void> =>
   new Promise<void>((resolve, reject) => {
     server.close((error) => {
       if (error) {
+        // oxlint-disable-next-line executor/no-promise-reject -- boundary: Node http server smoke-test helper preserves close callback failure semantics
         reject(error);
         return;
       }
@@ -121,6 +123,7 @@ describe("release bootstrap smoke", () => {
     await mkdir(join(installedWrapperDir, "node_modules"), { recursive: true });
     await cp(platformDir, installedPlatformDir, { recursive: true });
 
+    // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: smoke test temp install cleanup must run after process and filesystem assertions
     try {
       const firstRun = await runCommand(
         process.execPath,
@@ -178,11 +181,13 @@ describe("release bootstrap smoke", () => { webStderr += chunk; }); + // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: spawned CLI smoke test must terminate the child process after assertions try { const deadline = Date.now() + 30_000; let rootResponse: Response | null = null; while (Date.now() < deadline) { await new Promise((resolveDelay) => setTimeout(resolveDelay, 250)); + // oxlint-disable-next-line executor/no-try-catch-or-throw -- boundary: polling fetch intentionally ignores transient connection failures until the server is ready try { rootResponse = await fetch(`http://127.0.0.1:${webPort}/`); if (rootResponse.ok) { diff --git a/tests/tools-cli.test.ts b/tests/tools-cli.test.ts index d3931beb5..8d2294e2a 100644 --- a/tests/tools-cli.test.ts +++ b/tests/tools-cli.test.ts @@ -33,15 +33,19 @@ describe("CLI tooling helpers", () => { it.effect("rejects non-object JSON input", () => Effect.gen(function* () { - const error = yield* parseJsonObjectInput('[1,2,3]').pipe(Effect.flip); - expect(error.message).toContain("must decode to a JSON object"); + const error = yield* parseJsonObjectInput("[1,2,3]").pipe(Effect.flip); + expect(error).toEqual( + expect.objectContaining({ + message: expect.stringContaining("must decode to a JSON object"), + }), + ); }), ); it("builds bracket-safe invocation code for dynamic tool paths", () => { const code = buildInvokeToolCode("google-drive.files.list", { pageSize: 10 }); expect(code).toContain('const __target = tools["google-drive"]["files"]["list"]'); - expect(code).toContain('const __args = {'); + expect(code).toContain("const __args = {"); }); it("builds tool paths from dot or segmented forms", () => { @@ -64,13 +68,17 @@ describe("CLI tooling helpers", () => { expect(searchCode).toBe( 'return await tools.search({"query":"google calendar events","limit":5,"namespace":"google"});', ); - expect(sourcesCode).toBe('return await 
tools.executor.sources.list({"limit":20,"query":"google"});'); + expect(sourcesCode).toBe( + 'return await tools.executor.sources.list({"limit":20,"query":"google"});', + ); }); it("extracts completed result payload and pause execution id", () => { - expect(extractExecutionResult({ status: "completed", result: { ok: true }, logs: [] })).toEqual({ - ok: true, - }); + expect(extractExecutionResult({ status: "completed", result: { ok: true }, logs: [] })).toEqual( + { + ok: true, + }, + ); expect(extractExecutionResult({ status: "completed" })).toBeNull(); expect(extractExecutionId({ executionId: "exec_123" })).toBe("exec_123");