diff --git a/aiprompts/usechat-backend-design.md b/aiprompts/usechat-backend-design.md
new file mode 100644
index 0000000000..f5793718c1
--- /dev/null
+++ b/aiprompts/usechat-backend-design.md
@@ -0,0 +1,463 @@
+# useChat Compatible Backend Design for Wave Terminal
+
+## Overview
+
+This document outlines how to create a `useChat()` compatible backend API using Go and Server-Sent Events (SSE) to replace the current complex RPC-based AI chat system. The goal is to leverage Vercel AI SDK's `useChat()` hook while maintaining all existing AI provider functionality.
+
+## Current vs Target Architecture
+
+### Current Architecture
+```
+Frontend (React) → Custom RPC → Go Backend → AI Providers
+- 10+ Jotai atoms for state management
+- Custom WaveAIStreamRequest/WaveAIPacketType
+- Complex configuration merging in frontend
+- Custom streaming protocol over WebSocket
+```
+
+### Target Architecture
+```
+Frontend (useChat) → HTTP/SSE → Go Backend → AI Providers
+- Single useChat() hook manages all state
+- Standard HTTP POST + SSE streaming
+- Backend-driven configuration resolution
+- Standard AI SDK streaming format
+```
+
+## API Design
+
+### 1. Endpoint Structure
+
+**Chat Streaming Endpoint:**
+```
+POST /api/ai/chat/{blockId}?preset={presetKey}
+```
+
+**Conversation Persistence Endpoints:**
+```
+POST /api/ai/conversations/{blockId} # Save conversation
+GET /api/ai/conversations/{blockId} # Load conversation
+```
+
+**Why this approach:**
+- `blockId`: Identifies the conversation context (existing Wave concept)
+- `preset`: URL parameter for AI configuration preset
+- **Separate persistence**: Clean separation of streaming vs storage
+- **Fast localhost calls**: Frontend can call both endpoints quickly
+- **Simple backend**: Each endpoint has single responsibility
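The request/response shape above can be exercised end to end with a tiny Go client. This is a sketch, not Wave code: `streamChat`, `newStubServer`, and the stub's fixed two-event stream are hypothetical stand-ins for the real backend, and only the endpoint path and body shape come from this design.

```go
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"net/http/httptest"
	"strings"
)

// streamChat POSTs a useChat-style request body and collects every SSE
// "data:" payload from the response body.
func streamChat(url, body string) ([]string, error) {
	resp, err := http.Post(url, "application/json", strings.NewReader(body))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	var payloads []string
	sc := bufio.NewScanner(resp.Body)
	for sc.Scan() {
		if line := sc.Text(); strings.HasPrefix(line, "data: ") {
			payloads = append(payloads, strings.TrimPrefix(line, "data: "))
		}
	}
	return payloads, sc.Err()
}

// newStubServer stands in for the real Go backend and emits a fixed
// two-event stream in the format described later in this document.
func newStubServer() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/event-stream")
		fmt.Fprint(w, "data: {\"type\":\"text\",\"text\":\"Hello\"}\n\n")
		fmt.Fprint(w, "data: [DONE]\n\n")
	}))
}

func main() {
	srv := newStubServer()
	defer srv.Close()
	payloads, err := streamChat(srv.URL+"/api/ai/chat/block-123?preset=default",
		`{"messages":[{"role":"user","content":"hi"}]}`)
	if err != nil {
		panic(err)
	}
	for _, p := range payloads {
		fmt.Println(p)
	}
}
```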
+
+### 2. Request Format & Message Flow
+
+**Simplified Approach:**
+- Frontend manages **entire conversation state** (like all modern chat apps)
+- Frontend sends **complete message history** with each request
+- Backend just processes the messages and streams response
+- Frontend handles persistence via existing Wave file system
+
+**Standard useChat() Request:**
+```json
+{
+ "messages": [
+ {
+ "id": "msg-1",
+ "role": "user",
+ "content": "Hello world"
+ },
+ {
+ "id": "msg-2",
+ "role": "assistant",
+ "content": "Hi there!"
+ },
+ {
+ "id": "msg-3",
+ "role": "user",
+ "content": "How are you?" // <- NEW message user just typed
+ }
+ ]
+}
+```
+
+**Backend Processing:**
+1. **Receive complete conversation** from frontend
+2. **Resolve AI configuration** (preset, model, etc.)
+3. **Send messages directly** to AI provider
+4. **Stream response** back to frontend
+5. **Frontend calls separate persistence endpoint** when needed
+
+**Optional Extensions:**
+```json
+{
+ "messages": [...],
+ "options": {
+ "temperature": 0.7,
+ "maxTokens": 1000,
+ "model": "gpt-4" // Override preset model
+ }
+}
+```
+
+### 3. Configuration Resolution
+
+**Priority Order (backend resolves):**
+1. **Request options** (highest priority)
+2. **URL preset parameter**
+3. **Block metadata** (`block.meta["ai:preset"]`)
+4. **Global settings** (`settings["ai:preset"]`)
+5. **Default preset** (lowest priority)
+
+**Backend Logic:**
+```go
+func resolveAIConfig(blockId, presetKey string, requestOptions map[string]any) (*WaveAIOptsType, error) {
+ // 1. Load block metadata
+ block := getBlock(blockId)
+ blockPreset := block.Meta["ai:preset"]
+
+ // 2. Load global settings
+ settings := getGlobalSettings()
+ globalPreset := settings["ai:preset"]
+
+ // 3. Resolve preset hierarchy
+ finalPreset := presetKey
+ if finalPreset == "" {
+ finalPreset = blockPreset
+ }
+ if finalPreset == "" {
+ finalPreset = globalPreset
+ }
+ if finalPreset == "" {
+ finalPreset = "default"
+ }
+
+ // 4. Load and merge preset config
+ presetConfig := loadPreset(finalPreset)
+
+ // 5. Apply request overrides
+ return mergeAIConfig(presetConfig, requestOptions), nil
+}
+```
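The final step above calls `mergeAIConfig`, which is referenced but never defined. A minimal sketch of that override step, with a trimmed stand-in `WaveAIOptsType` (the real struct has more fields):

```go
package main

import "fmt"

// Trimmed stand-in for the real WaveAIOptsType.
type WaveAIOptsType struct {
	Model       string
	MaxTokens   int
	Temperature float64
}

// mergeAIConfig applies request-body overrides on top of the resolved preset.
func mergeAIConfig(preset *WaveAIOptsType, overrides map[string]any) *WaveAIOptsType {
	merged := *preset // copy so the stored preset is never mutated
	if m, ok := overrides["model"].(string); ok && m != "" {
		merged.Model = m
	}
	if mt, ok := overrides["maxTokens"].(float64); ok { // JSON numbers decode to float64
		merged.MaxTokens = int(mt)
	}
	if t, ok := overrides["temperature"].(float64); ok {
		merged.Temperature = t
	}
	return &merged
}

func main() {
	preset := &WaveAIOptsType{Model: "gpt-4o-mini", MaxTokens: 1000, Temperature: 1}
	merged := mergeAIConfig(preset, map[string]any{"model": "gpt-4", "temperature": 0.7})
	fmt.Printf("%s %d %.1f\n", merged.Model, merged.MaxTokens, merged.Temperature)
}
```

Note the copy-then-override pattern: presets are shared configuration, so the merge must never write back into them.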
+
+### 4. Response Format (SSE)
+
+**Key Insight: Minimal Conversion**
+Most AI providers (OpenAI, Anthropic) already return SSE streams. Instead of converting to our custom format and back, we can **proxy/transform** their streams directly to useChat format.
+
+**Headers:**
+```
+Content-Type: text/event-stream
+Cache-Control: no-cache
+Connection: keep-alive
+Access-Control-Allow-Origin: *
+```
+
+**useChat Expected Format:**
+```
+data: {"type":"text","text":"Hello"}
+
+data: {"type":"text","text":" world"}
+
+data: {"type":"text","text":"!"}
+
+data: {"type":"finish","finish_reason":"stop","usage":{"prompt_tokens":10,"completion_tokens":3,"total_tokens":13}}
+
+data: [DONE]
+```
+
+**Provider Stream Transformation:**
+- **OpenAI**: Already SSE → direct proxy (no conversion needed)
+- **Anthropic**: Already SSE → direct proxy (minimal field mapping)
+- **Google**: Already streaming → direct proxy
+- **Perplexity**: OpenAI-compatible → direct proxy
+- **Wave Cloud**: WebSocket → **requires conversion** (only one needing transformation)
+
+**Error Format:**
+```
+data: {"type":"error","error":"API key invalid"}
+
+data: [DONE]
+```
+
+## Implementation Plan
+
+### Phase 1: HTTP Handler
+
+```go
+// Simplified approach: Direct provider streaming with minimal transformation
+func (s *WshServer) HandleAIChat(w http.ResponseWriter, r *http.Request) {
+ // 1. Parse URL parameters
+ blockId := mux.Vars(r)["blockId"]
+ presetKey := r.URL.Query().Get("preset")
+
+ // 2. Parse request body
+ var req struct {
+ Messages []struct {
+ Role string `json:"role"`
+ Content string `json:"content"`
+ } `json:"messages"`
+ Options map[string]any `json:"options,omitempty"`
+ }
+    if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
+        http.Error(w, "invalid request body", http.StatusBadRequest)
+        return
+    }
+
+ // 3. Resolve configuration
+ aiOpts, err := resolveAIConfig(blockId, presetKey, req.Options)
+ if err != nil {
+ http.Error(w, err.Error(), 400)
+ return
+ }
+
+ // 4. Set SSE headers
+ w.Header().Set("Content-Type", "text/event-stream")
+ w.Header().Set("Cache-Control", "no-cache")
+ w.Header().Set("Connection", "keep-alive")
+
+ // 5. Route to provider and stream directly
+ switch aiOpts.APIType {
+ case "openai", "perplexity":
+ // Direct proxy - these are already SSE compatible
+ streamDirectSSE(w, r.Context(), aiOpts, req.Messages)
+ case "anthropic":
+ // Direct proxy with minimal field mapping
+ streamAnthropicSSE(w, r.Context(), aiOpts, req.Messages)
+ case "google":
+ // Direct proxy
+ streamGoogleSSE(w, r.Context(), aiOpts, req.Messages)
+ default:
+ // Wave Cloud - only one requiring conversion (WebSocket → SSE)
+ if isCloudAIRequest(aiOpts) {
+ streamWaveCloudToUseChat(w, r.Context(), aiOpts, req.Messages)
+ } else {
+ http.Error(w, "Unsupported provider", 400)
+ }
+ }
+}
+
+// Example: Direct OpenAI streaming (minimal conversion)
+func streamOpenAIToUseChat(w http.ResponseWriter, ctx context.Context, opts *WaveAIOptsType, messages []Message) {
+ client := openai.NewClient(opts.APIToken)
+
+ stream, err := client.CreateChatCompletionStream(ctx, openai.ChatCompletionRequest{
+ Model: opts.Model,
+ Messages: convertToOpenAIMessages(messages),
+ Stream: true,
+ })
+ if err != nil {
+ fmt.Fprintf(w, "data: {\"type\":\"error\",\"error\":%q}\n\n", err.Error())
+ fmt.Fprintf(w, "data: [DONE]\n\n")
+ return
+ }
+ defer stream.Close()
+
+ for {
+ response, err := stream.Recv()
+ if errors.Is(err, io.EOF) {
+ fmt.Fprintf(w, "data: [DONE]\n\n")
+ return
+ }
+ if err != nil {
+ fmt.Fprintf(w, "data: {\"type\":\"error\",\"error\":%q}\n\n", err.Error())
+ fmt.Fprintf(w, "data: [DONE]\n\n")
+ return
+ }
+
+ // Direct transformation: OpenAI format → useChat format
+ for _, choice := range response.Choices {
+ if choice.Delta.Content != "" {
+ fmt.Fprintf(w, "data: {\"type\":\"text\",\"text\":%q}\n\n", choice.Delta.Content)
+ }
+ if choice.FinishReason != "" {
+ fmt.Fprintf(w, "data: {\"type\":\"finish\",\"finish_reason\":%q}\n\n", choice.FinishReason)
+ }
+ }
+
+        // Guard the assertion: not every ResponseWriter implements Flusher.
+        if f, ok := w.(http.Flusher); ok {
+            f.Flush()
+        }
+ }
+}
+
+// Wave Cloud conversion (only provider needing transformation)
+func streamWaveCloudToUseChat(w http.ResponseWriter, ctx context.Context, opts *WaveAIOptsType, messages []Message) {
+ // Use existing Wave Cloud WebSocket logic
+ waveReq := wshrpc.WaveAIStreamRequest{
+ Opts: opts,
+ Prompt: convertMessagesToPrompt(messages),
+ }
+
+ stream := waveai.RunAICommand(ctx, waveReq) // Returns WebSocket stream
+
+ // Convert Wave Cloud packets to useChat SSE format
+ for packet := range stream {
+ if packet.Error != nil {
+ fmt.Fprintf(w, "data: {\"type\":\"error\",\"error\":%q}\n\n", packet.Error.Error())
+ break
+ }
+
+ resp := packet.Response
+ if resp.Text != "" {
+ fmt.Fprintf(w, "data: {\"type\":\"text\",\"text\":%q}\n\n", resp.Text)
+ }
+ if resp.FinishReason != "" {
+ usage := ""
+ if resp.Usage != nil {
+ usage = fmt.Sprintf(",\"usage\":{\"prompt_tokens\":%d,\"completion_tokens\":%d,\"total_tokens\":%d}",
+ resp.Usage.PromptTokens, resp.Usage.CompletionTokens, resp.Usage.TotalTokens)
+ }
+ fmt.Fprintf(w, "data: {\"type\":\"finish\",\"finish_reason\":%q%s}\n\n", resp.FinishReason, usage)
+ }
+
+        // Guard the assertion: not every ResponseWriter implements Flusher.
+        if f, ok := w.(http.Flusher); ok {
+            f.Flush()
+        }
+ }
+
+ fmt.Fprintf(w, "data: [DONE]\n\n")
+}
+```
+
+### Phase 2: Frontend Integration
+
+```typescript
+import { useChat } from '@ai-sdk/react';
+
+function WaveAI({ blockId }: { blockId: string }) {
+ // Get current preset from block metadata or settings
+ const preset = useAtomValue(currentPresetAtom);
+
+ const { messages, input, handleInputChange, handleSubmit, isLoading, error } = useChat({
+ api: `/api/ai/chat/${blockId}?preset=${preset}`,
+ initialMessages: [], // Load from existing aidata file
+ onFinish: (message) => {
+ // Save conversation to aidata file
+ saveConversation(blockId, messages);
+ }
+ });
+
+    return (
+        <div className="wave-ai">
+            <div className="chat-window">
+                {messages.map(message => (
+                    <div key={message.id} className={`chat-message ${message.role}`}>
+                        {message.content}
+                    </div>
+                ))}
+                {isLoading && <div className="typing-indicator" />}
+                {error && <div className="chat-error">{error.message}</div>}
+            </div>
+            <form onSubmit={handleSubmit}>
+                <input value={input} onChange={handleInputChange} placeholder="Ask Wave AI..." />
+            </form>
+        </div>
+    );
+}
+```
+
+### Phase 3: Advanced Features
+
+#### Multi-modal Support
+```typescript
+// useChat supports multi-modal out of the box
+const { messages, append } = useChat({
+ api: `/api/ai/chat/${blockId}`,
+});
+
+// Send image + text
+await append({
+ role: 'user',
+ content: [
+ { type: 'text', text: 'What do you see in this image?' },
+ { type: 'image', image: imageFile }
+ ]
+});
+```
+
+#### Thinking Models
+```go
+// Backend detects thinking models and formats appropriately
+if isThinkingModel(aiOpts.Model) {
+ // Send thinking content separately
+ fmt.Fprintf(w, "data: {\"type\":\"thinking\",\"text\":%q}\n\n", thinkingText)
+ fmt.Fprintf(w, "data: {\"type\":\"text\",\"text\":%q}\n\n", responseText)
+}
+```
+
+#### Context Injection
+```typescript
+// Add system messages or context via useChat options
+const { messages, append } = useChat({
+ api: `/api/ai/chat/${blockId}`,
+ initialMessages: [
+ {
+ role: 'system',
+ content: 'You are a helpful terminal assistant...'
+ }
+ ]
+});
+```
+
+## Migration Strategy
+
+### 1. Parallel Implementation
+- Keep existing RPC system running
+- Add new HTTP/SSE endpoint alongside
+- Feature flag to switch between systems
+
+### 2. Gradual Migration
+- Start with new blocks using useChat
+- Migrate existing conversations on first interaction
+- Remove RPC system once stable
+
+### 3. Backward Compatibility
+- Existing aidata files work unchanged
+- Same provider backends (OpenAI, Anthropic, etc.)
+- Same configuration system
+
+## Benefits
+
+### Complexity Reduction
+- **Frontend**: ~900 lines → ~100 lines (90% reduction)
+- **State Management**: 10+ atoms → 1 useChat hook
+- **Configuration**: Frontend merging → Backend resolution
+- **Streaming**: Custom protocol → Standard SSE
+
+### Modern Features
+- **Multi-modal**: Images, files, audio support
+- **Thinking Models**: Built-in reasoning trace support
+- **Conversation Management**: Edit, retry, branch conversations
+- **Error Handling**: Automatic retry and error boundaries
+- **Performance**: Optimized streaming and batching
+
+### Developer Experience
+- **Type Safety**: Full TypeScript support
+- **Testing**: Standard HTTP endpoints easier to test
+- **Debugging**: Standard browser dev tools work
+- **Documentation**: Leverage AI SDK docs and community
+
+## Configuration Examples
+
+### URL-based Configuration
+```
+POST /api/ai/chat/block-123?preset=claude-coding
+POST /api/ai/chat/block-456?preset=gpt4-creative
+```
+
+### Header-based Overrides
+```
+POST /api/ai/chat/block-123
+X-AI-Model: gpt-4-turbo
+X-AI-Temperature: 0.8
+```
+
+### Request Body Options
+```json
+{
+ "messages": [...],
+ "options": {
+ "model": "claude-3-sonnet",
+ "temperature": 0.7,
+ "maxTokens": 2000
+ }
+}
+```
+
+This design maintains all existing functionality while dramatically simplifying the implementation and adding modern AI chat capabilities.
\ No newline at end of file
diff --git a/aiprompts/usechat-streamingproto.md b/aiprompts/usechat-streamingproto.md
new file mode 100644
index 0000000000..57ab550ba1
--- /dev/null
+++ b/aiprompts/usechat-streamingproto.md
@@ -0,0 +1,185 @@
+# Data Stream Protocol
+
+A data stream follows a special protocol that the AI SDK provides to send information to the frontend.
+
+The data stream protocol uses Server-Sent Events (SSE) format for improved standardization, keep-alive through ping, reconnect capabilities, and better cache handling.
+
+When you provide data streams from a custom backend, you need to set the `x-vercel-ai-ui-message-stream` header to `v1`.
+
+The following stream parts are currently supported:
+
+## Message Start Part
+
+Indicates the beginning of a new message with metadata.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"start","messageId":"..."}
+
+## Text Parts
+
+Text content is streamed using a start/delta/end pattern with unique IDs for each text block.
+
+### Text Start Part
+
+Indicates the beginning of a text block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"text-start","id":"msg_68679a454370819ca74c8eb3d04379630dd1afb72306ca5d"}
+
+### Text Delta Part
+
+Contains incremental text content for the text block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"text-delta","id":"msg_68679a454370819ca74c8eb3d04379630dd1afb72306ca5d","delta":"Hello"}
+
+### Text End Part
+
+Indicates the completion of a text block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"text-end","id":"msg_68679a454370819ca74c8eb3d04379630dd1afb72306ca5d"}
+
+## Reasoning Parts
+
+Reasoning content is streamed using a start/delta/end pattern with unique IDs for each reasoning block.
+
+### Reasoning Start Part
+
+Indicates the beginning of a reasoning block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"reasoning-start","id":"reasoning_123"}
+
+### Reasoning Delta Part
+
+Contains incremental reasoning content for the reasoning block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"reasoning-delta","id":"reasoning_123","delta":"This is some reasoning"}
+
+### Reasoning End Part
+
+Indicates the completion of a reasoning block.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"reasoning-end","id":"reasoning_123"}
+
+## Source Parts
+
+Source parts provide references to external content sources.
+
+### Source URL Part
+
+References to external URLs.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"source-url","sourceId":"https://example.com","url":"https://example.com"}
+
+### Source Document Part
+
+References to documents or files.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"source-document","sourceId":"https://example.com","mediaType":"file","title":"Title"}
+
+## File Part
+
+The file parts contain references to files with their media type.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"file","url":"https://example.com/file.png","mediaType":"image/png"}
+
+## Data Parts
+
+Custom data parts allow streaming of arbitrary structured data with type-specific handling.
+
+Format: Server-Sent Event with JSON object where the type includes a custom suffix
+
+Example:
+
+data: {"type":"data-weather","data":{"location":"SF","temperature":100}}
+
+The `data-*` type pattern allows you to define custom data types that your frontend can handle specifically.
+
+## Error Part
+
+The error parts are appended to the message as they are received.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"error","errorText":"error message"}
+
+## Tool Input Start Part
+
+Indicates the beginning of tool input streaming.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"tool-input-start","toolCallId":"call_fJdQDqnXeGxTmr4E3YPSR7Ar","toolName":"getWeatherInformation"}
+
+## Tool Input Delta Part
+
+Incremental chunks of tool input as it's being generated.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"tool-input-delta","toolCallId":"call_fJdQDqnXeGxTmr4E3YPSR7Ar","inputTextDelta":"San Francisco"}
+
+## Tool Input Available Part
+
+Indicates that tool input is complete and ready for execution.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"tool-input-available","toolCallId":"call_fJdQDqnXeGxTmr4E3YPSR7Ar","toolName":"getWeatherInformation","input":{"city":"San Francisco"}}
+
+## Tool Output Available Part
+
+Contains the result of tool execution.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"tool-output-available","toolCallId":"call_fJdQDqnXeGxTmr4E3YPSR7Ar","output":{"city":"San Francisco","weather":"sunny"}}
+
+## Start Step Part
+
+A part indicating the start of a step.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"start-step"}
+
+## Finish Step Part
+
+A part indicating that a step (i.e., one LLM API call in the backend) has been completed.
+
+This part is necessary to correctly process multiple stitched assistant calls, e.g. when calling tools in the backend and using steps in useChat at the same time.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"finish-step"}
+
+## Finish Message Part
+
+A part indicating the completion of a message.
+
+Format: Server-Sent Event with JSON object
+
+Example:
+
+data: {"type":"finish"}
+
+## Stream Termination
+
+The stream ends with a special [DONE] marker.
+
+Format: Server-Sent Event with the literal `[DONE]`
+
+Example:
+
+data: [DONE]
+
+The data stream protocol is supported by useChat and useCompletion on the frontend and is used by default. useCompletion only supports the text and data stream parts.
+
+On the backend, you can use `toUIMessageStreamResponse()` from the `streamText` result object to return a streaming HTTP response.
diff --git a/cmd/testai/main-testai.go b/cmd/testai/main-testai.go
new file mode 100644
index 0000000000..87b34d4bb0
--- /dev/null
+++ b/cmd/testai/main-testai.go
@@ -0,0 +1,111 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+package main
+
+import (
+ "context"
+ "fmt"
+ "net/http"
+ "os"
+ "time"
+
+ "github.com/wavetermdev/waveterm/pkg/waveai"
+ "github.com/wavetermdev/waveterm/pkg/wshrpc"
+)
+
+// TestResponseWriter implements http.ResponseWriter and additional interfaces for testing
+type TestResponseWriter struct {
+ header http.Header
+}
+
+func (w *TestResponseWriter) Header() http.Header {
+ if w.header == nil {
+ w.header = make(http.Header)
+ }
+ return w.header
+}
+
+func (w *TestResponseWriter) Write(data []byte) (int, error) {
+ fmt.Printf("SSE: %s", string(data))
+ return len(data), nil
+}
+
+func (w *TestResponseWriter) WriteHeader(statusCode int) {
+ fmt.Printf("Status: %d\n", statusCode)
+}
+
+// Implement http.Flusher interface
+func (w *TestResponseWriter) Flush() {
+ // No-op for testing
+}
+
+// Implement interfaces needed by http.ResponseController
+func (w *TestResponseWriter) SetWriteDeadline(deadline time.Time) error {
+ // No-op for testing
+ return nil
+}
+
+func (w *TestResponseWriter) SetReadDeadline(deadline time.Time) error {
+ // No-op for testing
+ return nil
+}
+
+func main() {
+ if len(os.Args) < 2 {
+ fmt.Println("Usage: go run main-testai.go <model> [message]")
+ fmt.Println("Example: go run main-testai.go o4-mini 'What is 2+2?'")
+ fmt.Println("Set OPENAI_API_KEY environment variable")
+ os.Exit(1)
+ }
+
+ apiKey := os.Getenv("OPENAI_API_KEY")
+ if apiKey == "" {
+ fmt.Println("Error: OPENAI_API_KEY environment variable not set")
+ os.Exit(1)
+ }
+
+ model := os.Args[1]
+ message := "What is 2+2?"
+ if len(os.Args) > 2 {
+ message = os.Args[2]
+ }
+
+ // Create AI options
+ opts := &wshrpc.WaveAIOptsType{
+ APIToken: apiKey,
+ Model: model,
+ MaxTokens: 1000,
+ }
+
+ // Create messages
+ messages := []waveai.UseChatMessage{
+ {
+ Role: "user",
+ Content: message,
+ },
+ }
+
+ fmt.Printf("Testing AI streaming with model: %s\n", model)
+ fmt.Printf("Message: %s\n", message)
+ fmt.Println("---")
+
+ // Create a test response writer and SSE handler
+ ctx := context.Background()
+ testWriter := &TestResponseWriter{}
+ sseHandler := waveai.MakeSSEHandlerCh(testWriter, ctx)
+
+ // Setup the SSE handler
+ err := sseHandler.SetupSSE()
+ if err != nil {
+ fmt.Printf("Error setting up SSE: %v\n", err)
+ return
+ }
+ defer sseHandler.Close()
+
+ // Call the streaming function
+ waveai.StreamOpenAIToUseChat(sseHandler, ctx, opts, messages)
+
+ fmt.Println("---")
+ fmt.Println("Test completed")
+}
diff --git a/frontend/app/view/waveai/reasoning.tsx b/frontend/app/view/waveai/reasoning.tsx
new file mode 100644
index 0000000000..d334a4f80b
--- /dev/null
+++ b/frontend/app/view/waveai/reasoning.tsx
@@ -0,0 +1,143 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+import React, { createContext, memo, useCallback, useContext, useEffect, useState } from "react";
+import { Streamdown } from "streamdown";
+
+type ReasoningContextValue = {
+ isStreaming: boolean;
+ isOpen: boolean;
+ setIsOpen: (open: boolean) => void;
+ duration: number;
+};
+
+const ReasoningContext = createContext<ReasoningContextValue | null>(null);
+
+const useReasoning = () => {
+ const context = useContext(ReasoningContext);
+ if (!context) {
+ throw new Error("Reasoning components must be used within Reasoning");
+ }
+ return context;
+};
+
+const AUTO_CLOSE_DELAY = 1000;
+
+export const Reasoning = memo(
+ ({
+ className,
+ isStreaming = false,
+ open,
+ defaultOpen = false,
+ onOpenChange,
+ duration: durationProp = 3,
+ children,
+ }: {
+ className?: string;
+ isStreaming?: boolean;
+ open?: boolean;
+ defaultOpen?: boolean;
+ onOpenChange?: (open: boolean) => void;
+ duration?: number;
+ children: React.ReactNode;
+ }) => {
+ const [isOpen, setIsOpenState] = useState(defaultOpen);
+ const [duration, setDuration] = useState(0);
+ const [hasAutoClosedRef, setHasAutoClosedRef] = useState(false);
+ const [startTime, setStartTime] = useState<number | null>(null);
+
+ const setIsOpen = useCallback(
+ (newOpen: boolean) => {
+ setIsOpenState(newOpen);
+ onOpenChange?.(newOpen);
+ },
+ [onOpenChange]
+ );
+
+ // Track duration when streaming starts and ends
+ useEffect(() => {
+ if (isStreaming) {
+ if (startTime === null) {
+ setStartTime(Date.now());
+ }
+ } else if (startTime !== null) {
+ setDuration(Math.round((Date.now() - startTime) / 1000));
+ setStartTime(null);
+ }
+ }, [isStreaming, startTime]);
+
+ // Don't auto-open or auto-close - let user control the state manually
+
+ // Handle controlled open state
+ useEffect(() => {
+ if (open !== undefined) {
+ setIsOpenState(open);
+ }
+ }, [open]);
+
+        return (
+            <ReasoningContext.Provider value={{ isStreaming, isOpen, setIsOpen, duration }}>
+                <div className={className}>{children}</div>
+            </ReasoningContext.Provider>
+        );
+ }
+);
+
+export const ReasoningTrigger = memo(
+ ({
+ className,
+ title = "Reasoning",
+ children,
+ onClick,
+ }: {
+ className?: string;
+ title?: string;
+ children?: React.ReactNode;
+ onClick?: () => void;
+ }) => {
+ const { isStreaming, isOpen, setIsOpen, duration } = useReasoning();
+
+ const handleClick = useCallback(() => {
+ setIsOpen(!isOpen);
+ onClick?.();
+ }, [isOpen, setIsOpen, onClick]);
+
+        return (
+            <button type="button" className={className} onClick={handleClick}>
+                {children ?? (isStreaming ? "Thinking..." : `${title}${duration > 0 ? ` (thought for ${duration}s)` : ""}`)}
+            </button>
+        );
+ }
+);
+
+export const ReasoningContent = memo(({ className, children }: { className?: string; children: string }) => {
+ const { isOpen } = useReasoning();
+
+ if (!isOpen) return null;
+
+    return (
+        <div className={className}>
+            <Streamdown>{children}</Streamdown>
+        </div>
+    );
+});
+
+Reasoning.displayName = "Reasoning";
+ReasoningTrigger.displayName = "ReasoningTrigger";
+ReasoningContent.displayName = "ReasoningContent";
diff --git a/frontend/app/view/waveai/waveai.tsx b/frontend/app/view/waveai/waveai.tsx
index 048a76b487..fc1a961a48 100644
--- a/frontend/app/view/waveai/waveai.tsx
+++ b/frontend/app/view/waveai/waveai.tsx
@@ -19,6 +19,7 @@ import { OverlayScrollbarsComponent, OverlayScrollbarsComponentRef } from "overl
import { forwardRef, memo, useCallback, useEffect, useImperativeHandle, useMemo, useRef, useState } from "react";
import { debounce, throttle } from "throttle-debounce";
import "./waveai.scss";
+import { WaveAiUseChat, WaveAiUseChatModel } from "./waveaiusechat";
interface ChatMessageType {
id: string;
@@ -296,7 +297,15 @@ export class WaveAiModel implements ViewModel {
}
get viewComponent(): ViewComponent {
- return WaveAi;
+ // Check if we should use the new useChat implementation
+ const useNewImplementation = this.shouldUseNewImplementation();
+ return useNewImplementation ? WaveAiUseChat : WaveAi;
+ }
+
+ private shouldUseNewImplementation(): boolean {
+ // For now, check for a meta flag to enable the new implementation
+ const blockMeta = globalStore.get(this.blockAtom)?.meta ?? {};
+ return blockMeta["ai:usechat"] === "true" || blockMeta["ai:usechat"] === true;
}
dispose() {
@@ -685,7 +694,7 @@ const ChatInput = forwardRef(
}
);
-const WaveAi = ({ model }: { model: WaveAiModel; blockId: string }) => {
+const WaveAiOld = ({ model }: { model: WaveAiModel; blockId: string }) => {
const { sendMessage } = model.useWaveAi();
const waveaiRef = useRef(null);
const chatWindowRef = useRef(null);
@@ -879,4 +888,15 @@ const WaveAi = ({ model }: { model: WaveAiModel; blockId: string }) => {
);
};
+const WaveAi = ({ model, blockId }: { model: WaveAiModel; blockId: string }) => {
+    const useNewImplementation = true;
+    // Hooks must run unconditionally, so create the model before branching.
+    const useChatModel = useMemo(() => new WaveAiUseChatModel(blockId), [blockId]);
+
+    if (useNewImplementation) {
+        return <WaveAiUseChat model={useChatModel} blockId={blockId} />;
+    }
+
+    return <WaveAiOld model={model} blockId={blockId} />;
+};
+
export { WaveAi };
diff --git a/frontend/app/view/waveai/waveaiusechat.tsx b/frontend/app/view/waveai/waveaiusechat.tsx
new file mode 100644
index 0000000000..1b334e7a03
--- /dev/null
+++ b/frontend/app/view/waveai/waveaiusechat.tsx
@@ -0,0 +1,657 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+import { Button } from "@/app/element/button";
+import { TypingIndicator } from "@/app/element/typingindicator";
+import { atoms, fetchWaveFile, WOS } from "@/store/global";
+import { BlockService, ObjectService } from "@/store/services";
+import { getWebServerEndpoint } from "@/util/endpoints";
+import { checkKeyPressed } from "@/util/keyutil";
+import { fireAndForget, isBlank, mergeMeta } from "@/util/util";
+import { useChat } from "@ai-sdk/react";
+import { DefaultChatTransport } from "ai";
+import { atom, Atom, useAtomValue } from "jotai";
+import { OverlayScrollbarsComponent, OverlayScrollbarsComponentRef } from "overlayscrollbars-react";
+import React, { forwardRef, memo, useCallback, useEffect, useImperativeHandle, useMemo, useRef, useState } from "react";
+import { Streamdown } from "streamdown";
+import { debounce, throttle } from "throttle-debounce";
+import { Reasoning, ReasoningContent, ReasoningTrigger } from "./reasoning";
+
+interface WaveAiUseChatProps {
+ blockId: string;
+ model: WaveAiUseChatModelImpl;
+}
+
+interface ChatMessage {
+ id: string;
+ role: "user" | "assistant" | "system";
+ content: string;
+ reasoning?: string;
+}
+
+const slidingWindowSize = 30;
+
+class WaveAiUseChatModelImpl implements ViewModel {
+ viewType: string;
+ blockId: string;
+    blockAtom: Atom<Block>;
+    presetKey: Atom<string>;
+    presetMap: Atom<{ [k: string]: MetaType }>;
+    mergedPresets: Atom<MetaType>;
+    aiOpts: Atom<WaveAIOptsType>;
+    viewIcon?: Atom<string>;
+    viewName?: Atom<string>;
+    viewText?: Atom<HeaderElem[]>;
+    endIconButtons?: Atom<IconButtonDecl[]>;
+    textAreaRef: React.RefObject<HTMLTextAreaElement>;
+
+ constructor(blockId: string) {
+ this.viewType = "waveai";
+ this.blockId = blockId;
+ this.blockAtom = WOS.getWaveObjectAtom(`block:${blockId}`);
+ this.viewIcon = atom("sparkles");
+ this.viewName = atom("Wave AI");
+        this.textAreaRef = React.createRef<HTMLTextAreaElement>();
+
+ this.presetKey = atom((get) => {
+ const metaPresetKey = get(this.blockAtom).meta["ai:preset"];
+ const globalPresetKey = get(atoms.settingsAtom)["ai:preset"];
+ return metaPresetKey ?? globalPresetKey;
+ });
+
+ this.presetMap = atom((get) => {
+ const fullConfig = get(atoms.fullConfigAtom);
+ const presets = fullConfig.presets;
+ const settings = fullConfig.settings;
+ return Object.fromEntries(
+ Object.entries(presets)
+ .filter(([k]) => k.startsWith("ai@"))
+ .map(([k, v]) => {
+ const aiPresetKeys = Object.keys(v).filter((k) => k.startsWith("ai:"));
+ const newV = { ...v };
+ newV["display:name"] =
+ aiPresetKeys.length == 1 && aiPresetKeys.includes("ai:*")
+ ? `${newV["display:name"] ?? "Default"} (${settings["ai:model"]})`
+ : newV["display:name"];
+ return [k, newV];
+ })
+ );
+ });
+
+ this.mergedPresets = atom((get) => {
+ const meta = get(this.blockAtom).meta;
+ let settings = get(atoms.settingsAtom);
+ let presetKey = get(this.presetKey);
+ let presets = get(atoms.fullConfigAtom).presets;
+ let selectedPresets = presets?.[presetKey] ?? {};
+
+ let mergedPresets: MetaType = {};
+ mergedPresets = mergeMeta(settings, selectedPresets, "ai");
+ mergedPresets = mergeMeta(mergedPresets, meta, "ai");
+
+ return mergedPresets;
+ });
+
+ this.aiOpts = atom((get) => {
+ const mergedPresets = get(this.mergedPresets);
+
+ const opts: WaveAIOptsType = {
+ model: mergedPresets["ai:model"] ?? null,
+ apitype: mergedPresets["ai:apitype"] ?? null,
+ orgid: mergedPresets["ai:orgid"] ?? null,
+ apitoken: mergedPresets["ai:apitoken"] ?? null,
+ apiversion: mergedPresets["ai:apiversion"] ?? null,
+ maxtokens: mergedPresets["ai:maxtokens"] ?? null,
+ timeoutms: mergedPresets["ai:timeoutms"] ?? 60000,
+ baseurl: mergedPresets["ai:baseurl"] ?? null,
+ proxyurl: mergedPresets["ai:proxyurl"] ?? null,
+ };
+ return opts;
+ });
+
+ this.viewText = atom((get) => {
+ const viewTextChildren: HeaderElem[] = [];
+ const aiOpts = get(this.aiOpts);
+ const presets = get(this.presetMap);
+ const presetKey = get(this.presetKey);
+ const presetName = presets[presetKey]?.["display:name"] ?? "";
+ const isCloud = isBlank(aiOpts.apitoken) && isBlank(aiOpts.baseurl);
+
+ // Handle known API providers
+ switch (aiOpts?.apitype) {
+ case "anthropic":
+ viewTextChildren.push({
+ elemtype: "iconbutton",
+ icon: "globe",
+ title: `Using Remote Anthropic API (${aiOpts.model})`,
+ noAction: true,
+ });
+ break;
+ case "perplexity":
+ viewTextChildren.push({
+ elemtype: "iconbutton",
+ icon: "globe",
+ title: `Using Remote Perplexity API (${aiOpts.model})`,
+ noAction: true,
+ });
+ break;
+ default:
+ if (isCloud) {
+ viewTextChildren.push({
+ elemtype: "iconbutton",
+ icon: "cloud",
+ title: "Using Wave's AI Proxy (gpt-4o-mini)",
+ noAction: true,
+ });
+ } else {
+ const baseUrl = aiOpts.baseurl ?? "OpenAI Default Endpoint";
+ const modelName = aiOpts.model;
+ if (baseUrl.startsWith("http://localhost") || baseUrl.startsWith("http://127.0.0.1")) {
+ viewTextChildren.push({
+ elemtype: "iconbutton",
+ icon: "location-dot",
+ title: `Using Local Model @ ${baseUrl} (${modelName})`,
+ noAction: true,
+ });
+ } else {
+ viewTextChildren.push({
+ elemtype: "iconbutton",
+ icon: "globe",
+ title: `Using Remote Model @ ${baseUrl} (${modelName})`,
+ noAction: true,
+ });
+ }
+ }
+ }
+
+ const dropdownItems = Object.entries(presets)
+ .sort((a, b) => ((a[1]["display:order"] ?? 0) > (b[1]["display:order"] ?? 0) ? 1 : -1))
+ .map(
+ (preset) =>
+ ({
+ label: preset[1]["display:name"],
+ onClick: () =>
+ fireAndForget(() =>
+ ObjectService.UpdateObjectMeta(WOS.makeORef("block", this.blockId), {
+ "ai:preset": preset[0],
+ })
+ ),
+ }) as MenuItem
+ );
+
+ viewTextChildren.push({
+ elemtype: "menubutton",
+ text: presetName,
+ title: "Select AI Configuration",
+ items: dropdownItems,
+ });
+ return viewTextChildren;
+ });
+
+ this.endIconButtons = atom((_) => {
+ let clearButton: IconButtonDecl = {
+ elemtype: "iconbutton",
+ icon: "delete-left",
+ title: "Clear Chat History",
+ click: this.clearMessages.bind(this),
+ };
+ return [clearButton];
+ });
+ }
+
+ get viewComponent(): ViewComponent {
+ return WaveAiUseChat;
+ }
+
+ dispose() {
+ // No cleanup needed for useChat version
+ }
+
+ async populateMessages(): Promise<ChatMessage[]> {
+ const history = await this.fetchAiData();
+ return history.map((msg) => ({
+ id: crypto.randomUUID(),
+ role: msg.role as "user" | "assistant" | "system",
+ content: msg.content,
+ }));
+ }
+
+ async fetchAiData(): Promise<Array<WaveAIPromptMessageType>> {
+ const { data } = await fetchWaveFile(this.blockId, "aidata");
+ if (!data) {
+ return [];
+ }
+ const history: Array<WaveAIPromptMessageType> = JSON.parse(new TextDecoder().decode(data));
+ return history.slice(Math.max(history.length - slidingWindowSize, 0));
+ }
+
+ async saveMessages(messages: ChatMessage[]): Promise<void> {
+ const history: WaveAIPromptMessageType[] = messages.map((msg) => ({
+ role: msg.role,
+ content: msg.content,
+ }));
+ await BlockService.SaveWaveAiData(this.blockId, history);
+ }
+
+ giveFocus(): boolean {
+ if (this.textAreaRef?.current) {
+ this.textAreaRef.current.focus();
+ return true;
+ }
+ return false;
+ }
+
+ async clearMessages() {
+ await BlockService.SaveWaveAiData(this.blockId, []);
+ }
+
+ keyDownHandler(waveEvent: WaveKeyboardEvent): boolean {
+ if (checkKeyPressed(waveEvent, "Cmd:l")) {
+ fireAndForget(this.clearMessages.bind(this));
+ return true;
+ }
+ return false;
+ }
+}
+
+const ChatWindow = memo(
+ forwardRef<
+ OverlayScrollbarsComponentRef,
+ { messages: ChatMessage[]; isLoading: boolean; error: Error | null; fontSize?: string; fixedFontSize?: string }
+ >(({ messages, isLoading, error, fontSize, fixedFontSize }, ref) => {
+ const osRef = useRef<OverlayScrollbarsComponentRef>(null);
+ const [userHasScrolled, setUserHasScrolled] = useState(false);
+ const [shouldAutoScroll, setShouldAutoScroll] = useState(true);
+
+ useImperativeHandle(ref, () => osRef.current!, []);
+
+ const scrollToBottom = useCallback(() => {
+ if (osRef.current && shouldAutoScroll) {
+ const viewport = osRef.current.osInstance()?.elements().viewport;
+ if (viewport) {
+ viewport.scrollTop = viewport.scrollHeight;
+ }
+ }
+ }, [shouldAutoScroll]);
+
+ const handleScroll = useMemo(
+ () =>
+ throttle(100, () => {
+ if (osRef.current) {
+ const viewport = osRef.current.osInstance()?.elements().viewport;
+ if (viewport) {
+ const { scrollTop, scrollHeight, clientHeight } = viewport;
+ const isNearBottom = scrollHeight - scrollTop - clientHeight < 100;
+ setShouldAutoScroll(isNearBottom);
+ if (!isNearBottom && !userHasScrolled) {
+ setUserHasScrolled(true);
+ }
+ }
+ }
+ }),
+ [userHasScrolled]
+ );
+
+ const resetUserScroll = useMemo(
+ () =>
+ debounce(300, () => {
+ setUserHasScrolled(false);
+ }),
+ []
+ );
+
+ useEffect(() => {
+ scrollToBottom();
+ }, [messages, isLoading, scrollToBottom]);
+
+ useEffect(() => {
+ if (shouldAutoScroll && userHasScrolled) {
+ resetUserScroll();
+ }
+ }, [shouldAutoScroll, userHasScrolled, resetUserScroll]);
+
+ // NOTE: JSX below reconstructed from context; class names and
+ // OverlayScrollbars options are illustrative placeholders
+ return (
+     <OverlayScrollbarsComponent
+         ref={osRef}
+         className="chat-window-container"
+         options={{ scrollbars: { autoHide: "leave" } }}
+         events={{ scroll: handleScroll }}
+     >
+         <div className="chat-window" style={{ fontSize }}>
+             {messages.map((message, index) => {
+                 // Only the last assistant message should be streaming when isLoading is true
+                 const isLastAssistantMessage =
+                     message.role === "assistant" && index === messages.length - 1;
+                 const isCurrentlyStreaming = isLoading && isLastAssistantMessage;
+                 return (
+                     <ChatItem
+                         key={message.id}
+                         message={message}
+                         fontSize={fontSize}
+                         fixedFontSize={fixedFontSize}
+                         isStreaming={isCurrentlyStreaming}
+                     />
+                 );
+             })}
+             {error && (
+                 <div className="chat-msg-container">
+                     <div className="chat-msg chat-msg-error">Error: {error.message}</div>
+                 </div>
+             )}
+         </div>
+     </OverlayScrollbarsComponent>
+ );
+ })
+);
+ChatWindow.displayName = "ChatWindow";
+
+const ChatItem = memo(
+ ({
+ message,
+ fontSize,
+ fixedFontSize,
+ isStreaming = false,
+ }: {
+ message: ChatMessage;
+ fontSize?: string;
+ fixedFontSize?: string;
+ isStreaming?: boolean;
+ }) => {
+ const { role, content, reasoning } = message;
+
+ if (role === "user") {
+ return (
+
+
+
+ {content}
+
+
+
+
+
+
+ );
+ }
+
+ if (role === "assistant") {
+ return (
+
+
+
+
+
+ {reasoning && (
+
+
+
+ {reasoning || ""}
+
+
+ )}
+ {content ? (
+
+
+ {content}
+
+
+ ) : (
+
+
+
+ )}
+
+
+ );
+ }
+
+ return null;
+ }
+);
+ChatItem.displayName = "ChatItem";
+
+const ChatInput = memo(
+ ({
+ input,
+ handleInputChange,
+ handleSubmit,
+ isLoading,
+ textAreaRef,
+ }: {
+ input: string;
+ handleInputChange: (e: React.ChangeEvent<HTMLTextAreaElement>) => void;
+ handleSubmit: (e: React.FormEvent<HTMLFormElement>) => void;
+ isLoading: boolean;
+ textAreaRef: React.RefObject<HTMLTextAreaElement>;
+ }) => {
+ const [textAreaHeight, setTextAreaHeight] = useState(25);
+ const maxLines = 5;
+ const lineHeight = 17;
+ const minHeight = 25;
+ const maxHeight = minHeight + (maxLines - 1) * lineHeight;
+
+ const adjustTextAreaHeight = useCallback(() => {
+ if (textAreaRef.current) {
+ const textArea = textAreaRef.current;
+ textArea.style.height = `${minHeight}px`;
+ const scrollHeight = textArea.scrollHeight;
+ const newHeight = Math.min(Math.max(scrollHeight, minHeight), maxHeight);
+ setTextAreaHeight(newHeight);
+ textArea.style.height = `${newHeight}px`;
+ }
+ }, [textAreaRef, minHeight, maxHeight]);
+
+ useEffect(() => {
+ adjustTextAreaHeight();
+ }, [input, adjustTextAreaHeight]);
+
+ const handleKeyDown = useCallback(
+ (event: React.KeyboardEvent<HTMLTextAreaElement>) => {
+ if (event.key === "Enter" && !event.shiftKey) {
+ event.preventDefault();
+ handleSubmit(event as any);
+ return;
+ }
+ },
+ [handleSubmit]
+ );
+
+ // NOTE: JSX below reconstructed from context; markup is illustrative
+ return (
+     <textarea
+         ref={textAreaRef}
+         value={input}
+         onChange={handleInputChange}
+         onKeyDown={handleKeyDown}
+         className="chat-input"
+         style={{ height: `${textAreaHeight}px` }}
+         placeholder="Ask anything..."
+         disabled={isLoading}
+         rows={1}
+     />
+
+ );
+ }
+);
+ChatInput.displayName = "ChatInput";
+
+const WaveAiUseChat = ({ blockId, model }: WaveAiUseChatProps) => {
+ const presetKey = useAtomValue(model.presetKey);
+ const fontSize = useAtomValue(model.mergedPresets)?.["ai:fontsize"];
+ const fixedFontSize = useAtomValue(model.mergedPresets)?.["ai:fixedfontsize"];
+ const [initialMessages, setInitialMessages] = useState<ChatMessage[]>([]);
+ const [isInitialized, setIsInitialized] = useState(false);
+
+ // Load initial messages
+ useEffect(() => {
+ const loadMessages = async () => {
+ try {
+ const messages = await model.populateMessages();
+ setInitialMessages(messages);
+ setIsInitialized(true);
+ } catch (error) {
+ console.error("Failed to load initial messages:", error);
+ setIsInitialized(true);
+ }
+ };
+ loadMessages();
+ }, [model]);
+
+ const [input, setInput] = useState("");
+ const { messages, sendMessage, status, error, setMessages, stop } = useChat({
+ id: `chat-${blockId}`,
+ messages: initialMessages.map((m) => ({
+ id: m.id,
+ role: m.role,
+ parts: [{ type: "text", text: m.content }],
+ })),
+ transport: new DefaultChatTransport({
+ api: `${getWebServerEndpoint()}/api/aichat?blockid=${blockId}&preset=${encodeURIComponent(presetKey)}`,
+ body: () => ({
+ blockId,
+ preset: presetKey,
+ }),
+ headers: async () => ({
+ "X-Block-ID": blockId,
+ }),
+ prepareSendMessagesRequest: ({ id, messages, trigger, messageId }) => ({
+ headers: { "X-Session-ID": id },
+ body: {
+ messages: messages.slice(-30), // Keep last 30 messages
+ trigger,
+ messageId,
+ },
+ }),
+ credentials: "include",
+ }),
+ onFinish: async ({ message }) => {
+ // Save conversation after each completion. Note: `messages` here is the
+ // value captured at render time, so it can lag the just-finished exchange
+ // if onFinish fires before React re-renders.
+ try {
+ const allMessages = [...messages, message];
+ const chatMessages = allMessages.map((m) => {
+ const text =
+ m.parts
+ ?.filter((p: any) => p.type === "text")
+ .map((p: any) => p.text)
+ .join("") ?? "";
+ const reasoning =
+ m.parts
+ ?.filter((p: any) => p.type === "reasoning")
+ .map((p: any) => p.text)
+ .join("") ?? "";
+ return {
+ id: m.id,
+ role: m.role as "user" | "assistant" | "system",
+ content: text,
+ reasoning,
+ };
+ });
+ await model.saveMessages(chatMessages);
+ } catch (error) {
+ console.error("Failed to save messages:", error);
+ }
+ },
+ onError: (error) => {
+ console.error("Chat error:", error);
+ },
+ });
+
+ const isLoading = status === "submitted" || status === "streaming";
+
+ const handleInputChange = useCallback((e: React.ChangeEvent<HTMLTextAreaElement>) => {
+ setInput(e.target.value);
+ }, []);
+
+ const handleSubmit = useCallback(
+ (e: React.FormEvent) => {
+ e.preventDefault();
+ if (!input.trim() || isLoading) return;
+
+ sendMessage({ text: input });
+ setInput("");
+ },
+ [input, isLoading, sendMessage]
+ );
+
+ // Clear messages handler
+ const handleClearMessages = useCallback(async () => {
+ try {
+ await model.clearMessages();
+ setMessages([]);
+ } catch (error) {
+ console.error("Failed to clear messages:", error);
+ }
+ }, [model, setMessages]);
+
+ // Update model's clear method to use our handler
+ useEffect(() => {
+ model.clearMessages = handleClearMessages;
+ }, [model, handleClearMessages]);
+
+ if (!isInitialized) {
+     // NOTE: loading markup reconstructed; illustrative only
+     return <div className="chat-loading">Loading chat history...</div>;
+ }
+
+ // NOTE: container markup reconstructed; class names are illustrative
+ return (
+     <div className="chat-view">
+         <ChatWindow
+             messages={messages.map((m) => {
+                 const text = m.parts
+                     .filter((p: any) => p.type === "text")
+                     .map((p: any) => p.text)
+                     .join("");
+
+                 const reasoning = m.parts
+                     .filter((p: any) => p.type === "reasoning")
+                     .map((p: any) => p.text)
+                     .join("");
+
+                 return {
+                     id: m.id,
+                     role: m.role as "user" | "assistant" | "system",
+                     content: text,
+                     reasoning,
+                 };
+             })}
+             isLoading={isLoading}
+             error={error}
+             fontSize={fontSize}
+             fixedFontSize={fixedFontSize}
+         />
+         <ChatInput
+             input={input}
+             handleInputChange={handleInputChange}
+             handleSubmit={handleSubmit}
+             isLoading={isLoading}
+             textAreaRef={model.textAreaRef}
+         />
+     </div>
+ );
+};
+
+export { WaveAiUseChat };
+export const WaveAiUseChatModel = WaveAiUseChatModelImpl;
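
The `prepareSendMessagesRequest` hook above trims history with `messages.slice(-30)` before each POST. A minimal Go sketch of the same sliding-window trim, for reference against the backend that receives it (function and variable names here are illustrative, not part of the diff):

```go
package main

import "fmt"

// trimHistory mirrors the messages.slice(-30) done in
// prepareSendMessagesRequest: keep only the newest `limit` entries.
func trimHistory(ids []string, limit int) []string {
	if len(ids) <= limit {
		return ids
	}
	return ids[len(ids)-limit:]
}

func main() {
	// 45 messages in, only the newest 30 go over the wire.
	var ids []string
	for i := 0; i < 45; i++ {
		ids = append(ids, fmt.Sprintf("m%d", i))
	}
	trimmed := trimHistory(ids, 30)
	fmt.Println(len(trimmed), trimmed[0]) // 30 m15
}
```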
diff --git a/go.mod b/go.mod
index 1a739ee34a..cbec2b0aa7 100644
--- a/go.mod
+++ b/go.mod
@@ -15,7 +15,6 @@ require (
github.com/golang-migrate/migrate/v4 v4.19.0
github.com/google/generative-ai-go v0.20.1
github.com/google/uuid v1.6.0
- github.com/gorilla/handlers v1.5.2
github.com/gorilla/mux v1.8.1
github.com/gorilla/websocket v1.5.3
github.com/invopop/jsonschema v0.13.0
@@ -24,6 +23,7 @@ require (
github.com/kevinburke/ssh_config v1.2.0
github.com/mattn/go-sqlite3 v1.14.32
github.com/mitchellh/mapstructure v1.5.0
+ github.com/openai/openai-go/v2 v2.1.1
github.com/sashabaranov/go-openai v1.41.1
github.com/sawka/txwrap v0.2.0
github.com/shirou/gopsutil/v4 v4.25.7
@@ -82,6 +82,10 @@ require (
github.com/rivo/uniseg v0.4.7 // indirect
github.com/sirupsen/logrus v1.9.3 // indirect
github.com/spf13/pflag v1.0.9 // indirect
+ github.com/tidwall/gjson v1.14.4 // indirect
+ github.com/tidwall/match v1.1.1 // indirect
+ github.com/tidwall/pretty v1.2.1 // indirect
+ github.com/tidwall/sjson v1.2.5 // indirect
github.com/tklauser/go-sysconf v0.3.15 // indirect
github.com/tklauser/numcpus v0.10.0 // indirect
github.com/ubuntu/decorate v0.0.0-20230125165522-2d5b0a9bb117 // indirect
diff --git a/go.sum b/go.sum
index cf10eed4a3..005ac274d6 100644
--- a/go.sum
+++ b/go.sum
@@ -96,8 +96,6 @@ github.com/googleapis/enterprise-certificate-proxy v0.3.6 h1:GW/XbdyBFQ8Qe+YAmFU
github.com/googleapis/enterprise-certificate-proxy v0.3.6/go.mod h1:MkHOF77EYAE7qfSuSS9PU6g4Nt4e11cnsDUowfwewLA=
github.com/googleapis/gax-go/v2 v2.15.0 h1:SyjDc1mGgZU5LncH8gimWo9lW1DtIfPibOG81vgd/bo=
github.com/googleapis/gax-go/v2 v2.15.0/go.mod h1:zVVkkxAQHa1RQpg9z2AUCMnKhi0Qld9rcmyfL1OZhoc=
-github.com/gorilla/handlers v1.5.2 h1:cLTUSsNkgcwhgRqvCNmdbRWG0A3N4F+M2nWKdScwyEE=
-github.com/gorilla/handlers v1.5.2/go.mod h1:dX+xVpaxdSw+q0Qek8SSsl3dfMk3jNddUkMzo0GtH0w=
github.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=
github.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=
github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
@@ -133,6 +131,8 @@ github.com/mattn/go-sqlite3 v1.14.32 h1:JD12Ag3oLy1zQA+BNn74xRgaBbdhbNIDYvQUEuuE
github.com/mattn/go-sqlite3 v1.14.32/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
github.com/mitchellh/mapstructure v1.5.0/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
+github.com/openai/openai-go/v2 v2.1.1 h1:/RMA/V3D+yF/Cc4jHXFt6lkqSOWRf5roRi+DvZaDYQI=
+github.com/openai/openai-go/v2 v2.1.1/go.mod h1:sIUkR+Cu/PMUVkSKhkk742PRURkQOCFhiwJ7eRSBqmk=
github.com/photostorm/pty v1.1.19-0.20230903182454-31354506054b h1:cLGKfKb1uk0hxI0Q8L83UAJPpeJ+gSpn3cCU/tjd3eg=
github.com/photostorm/pty v1.1.19-0.20230903182454-31354506054b/go.mod h1:KO+FcPtyLAiRC0hJwreJVvfwc7vnNz77UxBTIGHdPVk=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
@@ -165,6 +165,16 @@ github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81P
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
+github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
+github.com/tidwall/gjson v1.14.4 h1:uo0p8EbA09J7RQaflQ1aBRffTR7xedD2bcIVSYxLnkM=
+github.com/tidwall/gjson v1.14.4/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
+github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
+github.com/tidwall/match v1.1.1/go.mod h1:eRSPERbgtNPcGhD8UCthc6PmLEQXEWd3PRB5JTxsfmM=
+github.com/tidwall/pretty v1.2.0/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
+github.com/tidwall/pretty v1.2.1 h1:qjsOFOWWQl+N3RsoF5/ssm1pHmJJwhjlSbZ51I6wMl4=
+github.com/tidwall/pretty v1.2.1/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
+github.com/tidwall/sjson v1.2.5 h1:kLy8mja+1c9jlljvWTlSazM7cKDRfJuR/bOJhcY5NcY=
+github.com/tidwall/sjson v1.2.5/go.mod h1:Fvgq9kS/6ociJEDnK0Fk1cpYF4FIW6ZF7LAe+6jwd28=
github.com/tklauser/go-sysconf v0.3.15 h1:VE89k0criAymJ/Os65CSn1IXaol+1wrsFHEB8Ol49K4=
github.com/tklauser/go-sysconf v0.3.15/go.mod h1:Dmjwr6tYFIseJw7a3dRLJfsHAMXZ3nEnL/aZY+0IuI4=
github.com/tklauser/numcpus v0.10.0 h1:18njr6LDBk1zuna922MgdjQuJFjrdppsZG60sHGfjso=
diff --git a/package.json b/package.json
index 8499c340f0..dd44b9892c 100644
--- a/package.json
+++ b/package.json
@@ -91,6 +91,7 @@
"vitest": "^3.0.9"
},
"dependencies": {
+ "@ai-sdk/react": "^2.0.18",
"@floating-ui/react": "^0.27.16",
"@monaco-editor/loader": "^1.5.0",
"@monaco-editor/react": "^4.7.0",
@@ -107,6 +108,7 @@
"@xterm/addon-web-links": "^0.11.0",
"@xterm/addon-webgl": "^0.18.0",
"@xterm/xterm": "^5.5.0",
+ "ai": "^5.0.18",
"base64-js": "^1.5.1",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
@@ -150,13 +152,15 @@
"rxjs": "^7.8.2",
"shell-quote": "^1.8.3",
"sprintf-js": "^1.1.3",
+ "streamdown": "^1.0.11",
"tailwind-merge": "^3.3.1",
"throttle-debounce": "^5.0.2",
"tinycolor2": "^1.6.0",
"use-device-pixel-ratio": "^1.1.2",
"winston": "^3.17.0",
"ws": "^8.18.3",
- "yaml": "^2.7.1"
+ "yaml": "^2.7.1",
+ "zod": "^4.0.17"
},
"resolutions": {
"send@npm:0.18.0": "0.19.0",
@@ -167,6 +171,7 @@
"esbuild@npm:^0.21.5": "^0.25",
"esbuild@npm:~0.23.0": "^0.25",
"node-abi": "^4.6.0",
+ "zod": "^4.0.17",
"react-gauge-chart@npm:^0.5.1/react": "19.1.1",
"react-gauge-chart@npm:^0.5.1/react-dom": "19.1.1"
},
diff --git a/pkg/waveai/ssehandlerch.go b/pkg/waveai/ssehandlerch.go
new file mode 100644
index 0000000000..5302e2ffb9
--- /dev/null
+++ b/pkg/waveai/ssehandlerch.go
@@ -0,0 +1,390 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+package waveai
+
+import (
+ "context"
+ "encoding/json"
+ "fmt"
+ "net/http"
+ "sync"
+ "time"
+)
+
+// see /aiprompts/usechat-streamingproto.md for protocol
+
+const (
+ SSEContentType = "text/event-stream"
+ SSECacheControl = "no-cache"
+ SSEConnection = "keep-alive"
+ SSEKeepaliveMsg = ": keepalive\n\n"
+ SSEStreamStartMsg = ": stream-start\n\n"
+ SSEKeepaliveInterval = 1 * time.Second
+)
+
+// SSEMessageType represents the type of message to write
+type SSEMessageType string
+
+const (
+ SSEMsgData SSEMessageType = "data"
+ SSEMsgEvent SSEMessageType = "event"
+ SSEMsgComment SSEMessageType = "comment"
+ SSEMsgError SSEMessageType = "error"
+)
+
+// AI message type constants
+const (
+ AiMsgStart = "start"
+ AiMsgTextStart = "text-start"
+ AiMsgTextDelta = "text-delta"
+ AiMsgTextEnd = "text-end"
+ AiMsgReasoningStart = "reasoning-start"
+ AiMsgReasoningDelta = "reasoning-delta"
+ AiMsgReasoningEnd = "reasoning-end"
+ AiMsgFinish = "finish"
+ AiMsgError = "error"
+)
+
+// SSEMessage represents a message to be written to the SSE stream
+type SSEMessage struct {
+ Type SSEMessageType
+ Data string
+ EventType string // Only used for SSEMsgEvent
+}
+
+// SSEHandlerCh provides channel-based Server-Sent Events functionality
+type SSEHandlerCh struct {
+ w http.ResponseWriter
+ rc *http.ResponseController
+ ctx context.Context
+ writeCh chan SSEMessage
+ errCh chan error
+
+ mu sync.RWMutex
+ closed bool
+ err error
+
+ wg sync.WaitGroup
+}
+
+// MakeSSEHandlerCh creates a new channel-based SSE handler
+func MakeSSEHandlerCh(w http.ResponseWriter, ctx context.Context) *SSEHandlerCh {
+ return &SSEHandlerCh{
+ w: w,
+ rc: http.NewResponseController(w),
+ ctx: ctx,
+ writeCh: make(chan SSEMessage, 10), // Buffered to prevent blocking
+ errCh: make(chan error, 1), // Buffered for single error
+ }
+}
+
+// SetupSSE configures the response headers and starts the writer goroutine
+func (h *SSEHandlerCh) SetupSSE() error {
+ h.mu.Lock()
+ defer h.mu.Unlock()
+
+ if h.closed {
+ return fmt.Errorf("SSE handler is closed")
+ }
+
+ // Reset write deadline for streaming
+ if err := h.rc.SetWriteDeadline(time.Time{}); err != nil {
+ return fmt.Errorf("failed to reset write deadline: %v", err)
+ }
+
+ // Set SSE headers
+ h.w.Header().Set("Content-Type", SSEContentType)
+ h.w.Header().Set("Cache-Control", "no-cache, no-store, must-revalidate")
+ h.w.Header().Set("Connection", SSEConnection)
+ h.w.Header().Set("x-vercel-ai-ui-message-stream", "v1")
+ h.w.Header().Set("X-Accel-Buffering", "no")
+ h.w.Header().Set("Cache-Control", "no-transform")
+
+ // Send headers and establish streaming
+ h.w.WriteHeader(http.StatusOK)
+ fmt.Fprint(h.w, SSEStreamStartMsg)
+ if err := h.flush(); err != nil {
+ return err
+ }
+
+ // Start the writer goroutine
+ h.wg.Add(1)
+ go h.writerLoop()
+
+ return nil
+}
+
+// writerLoop handles all writes and keepalives in a single goroutine
+func (h *SSEHandlerCh) writerLoop() {
+ defer h.wg.Done()
+
+ keepaliveTicker := time.NewTicker(SSEKeepaliveInterval)
+ defer keepaliveTicker.Stop()
+
+ for {
+ select {
+ case msg, ok := <-h.writeCh:
+ if !ok {
+ // Channel closed, send [DONE] and exit
+ h.writeDirectly("[DONE]", SSEMsgData)
+ return
+ }
+
+ if err := h.writeMessage(msg); err != nil {
+ h.setError(err)
+ return
+ }
+
+ case <-keepaliveTicker.C:
+ if err := h.writeDirectly("keepalive", SSEMsgComment); err != nil {
+ h.setError(err)
+ return
+ }
+
+ case <-h.ctx.Done():
+ return
+ }
+ }
+}
+
+// writeMessage writes a message to the SSE stream
+func (h *SSEHandlerCh) writeMessage(msg SSEMessage) error {
+ switch msg.Type {
+ case SSEMsgData:
+ return h.writeDirectly(msg.Data, SSEMsgData)
+ case SSEMsgEvent:
+ return h.writeEvent(msg.EventType, msg.Data)
+ case SSEMsgComment:
+ return h.writeDirectly(msg.Data, SSEMsgComment)
+ case SSEMsgError:
+ return h.writeDirectly(msg.Data, SSEMsgData)
+ default:
+ return fmt.Errorf("unknown message type: %s", msg.Type)
+ }
+}
+
+// writeDirectly writes data directly to the response writer
+func (h *SSEHandlerCh) writeDirectly(data string, msgType SSEMessageType) error {
+ switch msgType {
+ case SSEMsgData:
+ _, err := fmt.Fprintf(h.w, "data: %s\n\n", data)
+ if err != nil {
+ return err
+ }
+ case SSEMsgComment:
+ _, err := fmt.Fprintf(h.w, ": %s\n\n", data)
+ if err != nil {
+ return err
+ }
+ default:
+ return fmt.Errorf("unsupported direct write type: %s", msgType)
+ }
+ return h.flush()
+}
+
+// writeEvent writes an SSE event with optional event type
+func (h *SSEHandlerCh) writeEvent(eventType, data string) error {
+ if eventType != "" {
+ if _, err := fmt.Fprintf(h.w, "event: %s\n", eventType); err != nil {
+ return err
+ }
+ }
+ if _, err := fmt.Fprintf(h.w, "data: %s\n\n", data); err != nil {
+ return err
+ }
+ return h.flush()
+}
+
+// flush attempts to flush the response writer
+func (h *SSEHandlerCh) flush() error {
+ return h.rc.Flush()
+}
+
+// setError sets the error state thread-safely
+func (h *SSEHandlerCh) setError(err error) {
+ h.mu.Lock()
+ defer h.mu.Unlock()
+
+ if h.err == nil {
+ h.err = err
+ // Send error to error channel if there's space
+ select {
+ case h.errCh <- err:
+ default:
+ }
+ }
+}
+
+// WriteData queues data to be written in SSE format
+func (h *SSEHandlerCh) WriteData(data string) error {
+ h.mu.RLock()
+ closed := h.closed
+ h.mu.RUnlock()
+
+ if closed {
+ return fmt.Errorf("SSE handler is closed")
+ }
+
+ select {
+ case h.writeCh <- SSEMessage{Type: SSEMsgData, Data: data}:
+ return nil
+ case <-h.ctx.Done():
+ return h.ctx.Err()
+ default:
+ return fmt.Errorf("write channel is full")
+ }
+}
+
+// WriteJsonData marshals data to JSON and queues it for writing
+func (h *SSEHandlerCh) WriteJsonData(data interface{}) error {
+ jsonData, err := json.Marshal(data)
+ if err != nil {
+ return fmt.Errorf("failed to marshal JSON: %v", err)
+ }
+ return h.WriteData(string(jsonData))
+}
+
+// WriteError queues an error message and closes the handler
+func (h *SSEHandlerCh) WriteError(errorMsg string) error {
+ errorResp := map[string]interface{}{
+ "type": AiMsgError,
+ "errorText": errorMsg,
+ }
+ if err := h.WriteJsonData(errorResp); err != nil {
+ return err
+ }
+ h.Close()
+ return nil
+}
+
+// WriteEvent queues an SSE event with optional event type
+func (h *SSEHandlerCh) WriteEvent(eventType, data string) error {
+ h.mu.RLock()
+ closed := h.closed
+ h.mu.RUnlock()
+
+ if closed {
+ return fmt.Errorf("SSE handler is closed")
+ }
+
+ select {
+ case h.writeCh <- SSEMessage{Type: SSEMsgEvent, Data: data, EventType: eventType}:
+ return nil
+ case <-h.ctx.Done():
+ return h.ctx.Err()
+ default:
+ return fmt.Errorf("write channel is full")
+ }
+}
+
+// WriteComment queues an SSE comment
+func (h *SSEHandlerCh) WriteComment(comment string) error {
+ h.mu.RLock()
+ closed := h.closed
+ h.mu.RUnlock()
+
+ if closed {
+ return fmt.Errorf("SSE handler is closed")
+ }
+
+ select {
+ case h.writeCh <- SSEMessage{Type: SSEMsgComment, Data: comment}:
+ return nil
+ case <-h.ctx.Done():
+ return h.ctx.Err()
+ default:
+ return fmt.Errorf("write channel is full")
+ }
+}
+
+// Err returns any error that occurred during writing
+func (h *SSEHandlerCh) Err() error {
+ h.mu.RLock()
+ defer h.mu.RUnlock()
+ return h.err
+}
+
+// Close closes the write channel, sends [DONE], and cleans up resources
+func (h *SSEHandlerCh) Close() {
+ h.mu.Lock()
+ if h.closed {
+ h.mu.Unlock()
+ return
+ }
+ h.closed = true
+
+ // Close the write channel, which will trigger [DONE] in writerLoop
+ close(h.writeCh)
+ h.mu.Unlock()
+
+ // Wait for writer goroutine to finish (without holding the lock)
+ h.wg.Wait()
+}
+
+// AI message writing methods
+
+func (h *SSEHandlerCh) AiMsgStart(messageId string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgStart,
+ "messageId": messageId,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgTextStart(textId string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgTextStart,
+ "id": textId,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgTextDelta(textId string, text string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgTextDelta,
+ "id": textId,
+ "delta": text,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgTextEnd(textId string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgTextEnd,
+ "id": textId,
+ }
+ return h.WriteJsonData(resp)
+}
+
+// AiMsgFinish emits the terminal "finish" event. Only the type field is sent;
+// finishReason and usage are accepted for call-site symmetry but currently unused.
+func (h *SSEHandlerCh) AiMsgFinish(finishReason string, usage interface{}) error {
+ resp := map[string]interface{}{
+ "type": AiMsgFinish,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgReasoningStart(reasoningId string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgReasoningStart,
+ "id": reasoningId,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgReasoningDelta(reasoningId string, reasoning string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgReasoningDelta,
+ "id": reasoningId,
+ "delta": reasoning,
+ }
+ return h.WriteJsonData(resp)
+}
+
+func (h *SSEHandlerCh) AiMsgReasoningEnd(reasoningId string) error {
+ resp := map[string]interface{}{
+ "type": AiMsgReasoningEnd,
+ "id": reasoningId,
+ }
+ return h.WriteJsonData(resp)
+}
diff --git a/pkg/waveai/usechat-openai-completions.go b/pkg/waveai/usechat-openai-completions.go
new file mode 100644
index 0000000000..31944d27db
--- /dev/null
+++ b/pkg/waveai/usechat-openai-completions.go
@@ -0,0 +1,162 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+package waveai
+
+import (
+ "context"
+ "fmt"
+ "strings"
+
+ "github.com/openai/openai-go/v2"
+ "github.com/openai/openai-go/v2/option"
+ "github.com/wavetermdev/waveterm/pkg/wshrpc"
+)
+
+// OpenAI Chat Completion streaming response format
+type OpenAIStreamChoice struct {
+ Index int `json:"index"`
+ Delta struct {
+ Content string `json:"content,omitempty"`
+ Reasoning string `json:"reasoning,omitempty"`
+ } `json:"delta"`
+ FinishReason *string `json:"finish_reason"`
+}
+
+type OpenAIStreamResponse struct {
+ ID string `json:"id"`
+ Object string `json:"object"`
+ Created int64 `json:"created"`
+ Model string `json:"model"`
+ Choices []OpenAIStreamChoice `json:"choices"`
+ Usage *OpenAIUsageResponse `json:"usage,omitempty"`
+}
+
+type OpenAIUsageResponse struct {
+ PromptTokens int `json:"prompt_tokens"`
+ CompletionTokens int `json:"completion_tokens"`
+ TotalTokens int `json:"total_tokens"`
+}
+
+func StreamOpenAIChatCompletions(sseHandler *SSEHandlerCh, ctx context.Context, opts *wshrpc.WaveAIOptsType, messages []UseChatMessage) {
+ // Set up OpenAI client options
+ clientOpts := []option.RequestOption{
+ option.WithAPIKey(opts.APIToken),
+ }
+
+ if opts.BaseURL != "" {
+ clientOpts = append(clientOpts, option.WithBaseURL(opts.BaseURL))
+ }
+ if opts.OrgID != "" {
+ clientOpts = append(clientOpts, option.WithOrganization(opts.OrgID))
+ }
+
+ client := openai.NewClient(clientOpts...)
+
+ // Convert messages to ChatCompletionMessageParam, filtering out empty content
+ var chatMessages []openai.ChatCompletionMessageParamUnion
+ for _, msg := range messages {
+ content := msg.GetContent()
+ // Skip messages with empty content as OpenAI requires non-empty content
+ if strings.TrimSpace(content) == "" {
+ continue
+ }
+
+ // Create appropriate message based on role
+ switch msg.Role {
+ case "user":
+ chatMessages = append(chatMessages, openai.UserMessage(content))
+ case "assistant":
+ chatMessages = append(chatMessages, openai.AssistantMessage(content))
+ case "system":
+ chatMessages = append(chatMessages, openai.SystemMessage(content))
+ default:
+ chatMessages = append(chatMessages, openai.UserMessage(content))
+ }
+ }
+
+ // Create request using Chat Completions API
+ req := openai.ChatCompletionNewParams{
+ Model: opts.Model,
+ Messages: chatMessages,
+ }
+
+ if opts.MaxTokens > 0 {
+ if isReasoningModel(opts.Model) {
+ req.MaxCompletionTokens = openai.Int(int64(opts.MaxTokens))
+ } else {
+ req.MaxTokens = openai.Int(int64(opts.MaxTokens))
+ }
+ }
+
+ // Create stream using Chat Completions API
+ stream := client.Chat.Completions.NewStreaming(ctx, req)
+ defer stream.Close()
+
+ // Generate IDs for the streaming protocol
+ messageId := generateID()
+ textId := generateID()
+
+ // Send message start
+ sseHandler.AiMsgStart(messageId)
+
+ // Track whether we've started text streaming and finished
+ textStarted := false
+ textEnded := false
+ finished := false
+
+ // Stream responses using event-based API
+ for stream.Next() {
+ chunk := stream.Current()
+
+ if len(chunk.Choices) > 0 {
+ choice := chunk.Choices[0]
+
+ // Handle content delta
+ if choice.Delta.Content != "" {
+ // Send text start only when we have actual content
+ if !textStarted {
+ sseHandler.AiMsgTextStart(textId)
+ textStarted = true
+ }
+ sseHandler.AiMsgTextDelta(textId, choice.Delta.Content)
+ }
+
+ // Handle finish reason
+ if choice.FinishReason != "" && !finished {
+ usage := &OpenAIUsageResponse{}
+ if chunk.Usage.PromptTokens > 0 || chunk.Usage.CompletionTokens > 0 {
+ usage.PromptTokens = int(chunk.Usage.PromptTokens)
+ usage.CompletionTokens = int(chunk.Usage.CompletionTokens)
+ usage.TotalTokens = int(chunk.Usage.TotalTokens)
+ }
+
+ // End text if it was started but not ended
+ if textStarted && !textEnded {
+ sseHandler.AiMsgTextEnd(textId)
+ textEnded = true
+ }
+
+ sseHandler.AiMsgFinish(choice.FinishReason, usage)
+ finished = true
+ return
+ }
+ }
+ }
+
+ // Handle stream errors
+ if err := stream.Err(); err != nil {
+ sseHandler.WriteError(fmt.Sprintf("OpenAI API error: %v", err))
+ return
+ }
+
+ // Cleanup if stream ended without completion event
+ if !finished {
+ // End text if it was started but not ended
+ if textStarted && !textEnded {
+ sseHandler.AiMsgTextEnd(textId)
+ textEnded = true
+ }
+ sseHandler.AiMsgFinish("stop", nil)
+ }
+}
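
The `textStarted`/`textEnded` flags above implement a small state machine: `text-start` is emitted lazily on the first non-empty delta, and `text-end` is emitted exactly once before `finish`, whether the stream ends via a finish reason or simply runs out. A standalone sketch of that bookkeeping (the `textEmitter` type is illustrative, not part of the diff):

```go
package main

import "fmt"

// textEmitter mirrors the textStarted/textEnded bookkeeping in
// StreamOpenAIChatCompletions.
type textEmitter struct {
	started, ended bool
	events         []string
}

func (e *textEmitter) delta(text string) {
	if text == "" || e.ended {
		return // never open a text block for empty content
	}
	if !e.started {
		e.events = append(e.events, "text-start")
		e.started = true
	}
	e.events = append(e.events, "text-delta:"+text)
}

func (e *textEmitter) finish() {
	if e.started && !e.ended {
		e.events = append(e.events, "text-end")
		e.ended = true
	}
	e.events = append(e.events, "finish")
}

func main() {
	var e textEmitter
	e.delta("") // ignored: no empty start
	e.delta("Hel")
	e.delta("lo")
	e.finish()
	fmt.Println(e.events)
}
```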
diff --git a/pkg/waveai/usechat-openai-responses.go b/pkg/waveai/usechat-openai-responses.go
new file mode 100644
index 0000000000..678de3feae
--- /dev/null
+++ b/pkg/waveai/usechat-openai-responses.go
@@ -0,0 +1,225 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+package waveai
+
+import (
+ "context"
+ "fmt"
+ "strings"
+
+ "github.com/openai/openai-go/v2"
+ "github.com/openai/openai-go/v2/option"
+ "github.com/openai/openai-go/v2/responses"
+ "github.com/openai/openai-go/v2/shared"
+ "github.com/wavetermdev/waveterm/pkg/wavebase"
+ "github.com/wavetermdev/waveterm/pkg/wshrpc"
+)
+
+func createOpenAIRequest(opts *wshrpc.WaveAIOptsType, messages []UseChatMessage) (openai.Client, responses.ResponseNewParams) {
+ // Set up OpenAI client options
+ clientOpts := []option.RequestOption{
+ option.WithAPIKey(opts.APIToken),
+ }
+
+ if opts.BaseURL != "" {
+ clientOpts = append(clientOpts, option.WithBaseURL(opts.BaseURL))
+ }
+ if opts.OrgID != "" {
+ clientOpts = append(clientOpts, option.WithOrganization(opts.OrgID))
+ }
+
+ client := openai.NewClient(clientOpts...)
+
+ // Convert messages to input items, filtering out empty content
+ var inputItems []responses.ResponseInputItemUnionParam
+ for _, msg := range messages {
+ content := msg.GetContent()
+ // Skip messages with empty content as OpenAI requires non-empty content
+ if strings.TrimSpace(content) == "" {
+ continue
+ }
+
+ // Convert role to EasyInputMessageRole
+ var role responses.EasyInputMessageRole
+ switch msg.Role {
+ case "user":
+ role = responses.EasyInputMessageRoleUser
+ case "assistant":
+ role = responses.EasyInputMessageRoleAssistant
+ case "system":
+ role = responses.EasyInputMessageRoleSystem
+ default:
+ role = responses.EasyInputMessageRoleUser
+ }
+
+ inputItems = append(inputItems, responses.ResponseInputItemParamOfMessage(content, role))
+ }
+
+ // Create request using Responses API for reasoning support
+ req := responses.ResponseNewParams{
+ Model: opts.Model,
+ Input: responses.ResponseNewParamsInputUnion{
+ OfInputItemList: responses.ResponseInputParam(inputItems),
+ },
+ }
+
+ // Only set reasoning parameter for reasoning models
+ if isReasoningModel(opts.Model) {
+ req.Reasoning = shared.ReasoningParam{
+ Effort: openai.ReasoningEffortMedium,
+ Summary: openai.ReasoningSummaryAuto,
+ }
+ }
+
+ if opts.MaxTokens > 0 {
+ req.MaxOutputTokens = openai.Int(int64(opts.MaxTokens))
+ }
+
+ return client, req
+}
+
+func StreamOpenAIResponsesAPI(sseHandler *SSEHandlerCh, ctx context.Context, opts *wshrpc.WaveAIOptsType, messages []UseChatMessage) {
+ client, req := createOpenAIRequest(opts, messages)
+
+ // Create stream using Responses API
+ stream := client.Responses.NewStreaming(ctx, req)
+ defer stream.Close()
+
+ // Generate IDs for the streaming protocol
+ messageId := generateID()
+ textId := generateID()
+ reasoningId := generateID()
+
+ // Send message start
+ sseHandler.AiMsgStart(messageId)
+
+ // Track whether we've started text/reasoning streaming and finished
+ textStarted := false
+ textEnded := false
+ reasoningStarted := false
+ reasoningEnded := false
+ finished := false
+
+ // Stream responses using event-based API
+ for stream.Next() {
+ event := stream.Current()
+
+ fmt.Printf("DEBUG: Received event type: %s\n", event.Type)
+
+ switch event.Type {
+ case "response.output_item.added":
+ outputItem := event.AsResponseOutputItemAdded()
+ if outputItem.Item.Type == "reasoning" && !reasoningStarted {
+ sseHandler.AiMsgReasoningStart(reasoningId)
+ reasoningStarted = true
+ }
+
+ case "response.reasoning_summary_part.added":
+ // Optional; first empty part; no-op
+
+ case "response.reasoning_summary_text.delta":
+ reasoningDelta := event.AsResponseReasoningSummaryTextDelta()
+ if reasoningDelta.Delta != "" && !reasoningEnded {
+ sseHandler.AiMsgReasoningDelta(reasoningId, reasoningDelta.Delta)
+ }
+
+ case "response.reasoning_summary_text.done":
+ // Don't end reasoning here - there may be multiple reasoning parts
+ // Wait for response.output_item.done to end reasoning
+
+ case "response.reasoning_summary_part.done":
+ // Reasoning summary part done - no action needed
+
+ case "response.content_part.added":
+ // First output_text part for the message; no-op
+
+ case "response.content_part.done":
+ // Content part done - no action needed
+
+ case "response.output_text.delta":
+ textDelta := event.AsResponseOutputTextDelta()
+ if textDelta.Delta != "" && !textEnded {
+ if !textStarted {
+ sseHandler.AiMsgTextStart(textId)
+ textStarted = true
+ }
+ sseHandler.AiMsgTextDelta(textId, textDelta.Delta)
+ }
+
+ case "response.output_text.done":
+ if textStarted && !textEnded {
+ sseHandler.AiMsgTextEnd(textId)
+ textEnded = true
+ }
+
+ case "response.output_item.done":
+ // Item-level close (reasoning or message)
+ // If we had started reasoning but haven't ended it, end it now
+ if reasoningStarted && !reasoningEnded {
+ sseHandler.AiMsgReasoningEnd(reasoningId)
+ reasoningEnded = true
+ }
+
+ case "response.completed":
+ responseDone := event.AsResponseCompleted()
+ if !finished {
+ usage := &OpenAIUsageResponse{}
+ responseUsage := responseDone.Response.Usage
+ usage.PromptTokens = int(responseUsage.InputTokens)
+ usage.CompletionTokens = int(responseUsage.OutputTokens)
+ usage.TotalTokens = usage.PromptTokens + usage.CompletionTokens
+
+ // End reasoning if it was started but not ended
+ if reasoningStarted && !reasoningEnded {
+ sseHandler.AiMsgReasoningEnd(reasoningId)
+ reasoningEnded = true
+ }
+ // End text if it was started but not ended
+ if textStarted && !textEnded {
+ sseHandler.AiMsgTextEnd(textId)
+ textEnded = true
+ }
+
+ // Map non-completed statuses (e.g. "incomplete") to a generic finish reason
+ finishReason := "stop"
+ if responseDone.Response.Status != "completed" {
+ finishReason = "other"
+ }
+
+ sseHandler.AiMsgFinish(finishReason, usage)
+ finished = true
+ }
+ return
+
+ default:
+ // Log unhandled event types in dev mode
+ if wavebase.IsDevMode() {
+ fmt.Printf("DEBUG: Unhandled event type: %s\n", event.Type)
+ }
+ }
+ }
+
+ // Handle stream errors
+ if err := stream.Err(); err != nil {
+ sseHandler.WriteError(fmt.Sprintf("OpenAI API error: %v", err))
+ return
+ }
+
+ // Cleanup if stream ended without completion event
+ if !finished {
+ // End reasoning if it was started but not ended
+ if reasoningStarted && !reasoningEnded {
+ sseHandler.AiMsgReasoningEnd(reasoningId)
+ reasoningEnded = true
+ }
+ // End text if it was started but not ended
+ if textStarted && !textEnded {
+ sseHandler.AiMsgTextEnd(textId)
+ textEnded = true
+ }
+ sseHandler.AiMsgFinish("stop", nil)
+ }
+}
diff --git a/pkg/waveai/usechat.go b/pkg/waveai/usechat.go
new file mode 100644
index 0000000000..8cd4eec844
--- /dev/null
+++ b/pkg/waveai/usechat.go
@@ -0,0 +1,231 @@
+// Copyright 2025, Command Line Inc.
+// SPDX-License-Identifier: Apache-2.0
+
+package waveai
+
+import (
+ "context"
+ "crypto/rand"
+ "encoding/hex"
+ "encoding/json"
+ "fmt"
+ "log"
+ "net/http"
+ "strings"
+
+ "github.com/wavetermdev/waveterm/pkg/waveobj"
+ "github.com/wavetermdev/waveterm/pkg/wconfig"
+ "github.com/wavetermdev/waveterm/pkg/wshrpc"
+ "github.com/wavetermdev/waveterm/pkg/wstore"
+)
+
+type UseChatMessagePart struct {
+ Type string `json:"type"`
+ Text string `json:"text"`
+}
+
+type UseChatMessage struct {
+ Role string `json:"role"`
+ Content string `json:"content,omitempty"`
+ Parts []UseChatMessagePart `json:"parts,omitempty"`
+}
+
+// GetContent extracts the text content from either content field or parts array
+func (m *UseChatMessage) GetContent() string {
+ if m.Content != "" {
+ return m.Content
+ }
+ if len(m.Parts) > 0 {
+ var content strings.Builder
+ for _, part := range m.Parts {
+ if part.Type == "text" {
+ content.WriteString(part.Text)
+ }
+ }
+ return content.String()
+ }
+ return ""
+}
+
+type UseChatRequest struct {
+ Messages []UseChatMessage `json:"messages"`
+ Options *wconfig.AiSettingsType `json:"options,omitempty"`
+}
+
+func resolveAIConfig(ctx context.Context, blockId, presetKey string, requestOptions *wconfig.AiSettingsType) (*wshrpc.WaveAIOptsType, error) {
+ // Get block metadata
+ block, err := wstore.DBMustGet[*waveobj.Block](ctx, blockId)
+ if err != nil {
+ return nil, fmt.Errorf("failed to get block: %v", err)
+ }
+
+ // Get global settings
+ fullConfig := wconfig.GetWatcher().GetFullConfig()
+ globalAiSettings := fullConfig.Settings.GetAiSettings()
+
+ // Resolve preset hierarchy
+ finalPreset := presetKey
+ if finalPreset == "" && block != nil && block.Meta != nil {
+ if blockPreset, ok := block.Meta["ai:preset"].(string); ok {
+ finalPreset = blockPreset
+ }
+ }
+ if finalPreset == "" {
+ finalPreset = globalAiSettings.AiPreset
+ }
+ if finalPreset == "" {
+ finalPreset = "default"
+ }
+
+ // Load preset configuration
+ var presetAiSettings *wconfig.AiSettingsType
+ if finalPreset != "default" {
+ // Normalize to the "ai@" prefix without shadowing the presetKey parameter
+ presetLookupKey := finalPreset
+ if !strings.HasPrefix(finalPreset, "ai@") {
+ presetLookupKey = fmt.Sprintf("ai@%s", finalPreset)
+ }
+ if preset, ok := fullConfig.Presets[presetLookupKey]; ok {
+ presetAiSettings = &wconfig.AiSettingsType{}
+ if err := json.Unmarshal(mustMarshal(preset), presetAiSettings); err != nil {
+ presetAiSettings = nil
+ }
+ }
+ }
+
+ // Extract block AI settings from metadata
+ var blockAiSettings *wconfig.AiSettingsType
+ if block != nil && block.Meta != nil {
+ blockAiSettings = &wconfig.AiSettingsType{}
+ if err := json.Unmarshal(mustMarshal(block.Meta), blockAiSettings); err != nil {
+ blockAiSettings = nil
+ }
+ }
+
+ // Merge settings with hierarchy: global < preset < block < request
+ finalSettings := wconfig.MergeAiSettings(globalAiSettings, presetAiSettings, blockAiSettings, requestOptions)
+
+ // Convert to WaveAIOptsType
+ aiOpts := &wshrpc.WaveAIOptsType{
+ Model: finalSettings.AiModel,
+ APIType: finalSettings.AiApiType,
+ APIToken: finalSettings.AiApiToken,
+ BaseURL: finalSettings.AiBaseURL,
+ OrgID: finalSettings.AiOrgID,
+ APIVersion: finalSettings.AIApiVersion,
+ ProxyURL: finalSettings.AiProxyUrl,
+ MaxTokens: int(finalSettings.AiMaxTokens),
+ TimeoutMs: int(finalSettings.AiTimeoutMs),
+ }
+
+ // Set defaults
+ if aiOpts.Model == "" {
+ aiOpts.Model = "gpt-4.1"
+ }
+ if aiOpts.APIType == "" {
+ aiOpts.APIType = APIType_OpenAI
+ }
+ if aiOpts.MaxTokens == 0 {
+ aiOpts.MaxTokens = 4000
+ }
+
+ return aiOpts, nil
+}
+
+func mustMarshal(v any) []byte {
+ data, err := json.Marshal(v)
+ if err != nil {
+ return []byte("{}")
+ }
+ return data
+}
+
+func shouldUseChatCompletionsAPI(model string) bool {
+ m := strings.ToLower(model)
+ // Chat Completions API is required for older models: gpt-3.5-*, gpt-4, gpt-4-turbo, o1-*
+ return strings.HasPrefix(m, "gpt-3.5") ||
+ strings.HasPrefix(m, "gpt-4-") ||
+ m == "gpt-4" ||
+ strings.HasPrefix(m, "o1-")
+}
+
+func StreamOpenAIToUseChat(sseHandler *SSEHandlerCh, ctx context.Context, opts *wshrpc.WaveAIOptsType, messages []UseChatMessage) {
+ // Route to appropriate API based on model
+ if shouldUseChatCompletionsAPI(opts.Model) {
+ // Older models (gpt-3.5, gpt-4, gpt-4-turbo, o1-*) use Chat Completions API
+ StreamOpenAIChatCompletions(sseHandler, ctx, opts, messages)
+ } else {
+ // Newer models (gpt-4.1, gpt-4o, gpt-5, o3, o4, etc.) use Responses API for reasoning support
+ StreamOpenAIResponsesAPI(sseHandler, ctx, opts, messages)
+ }
+}
+
+func generateID() string {
+ bytes := make([]byte, 16)
+ _, _ = rand.Read(bytes) // crypto/rand.Read always fills the buffer
+ return hex.EncodeToString(bytes)
+}
+
+func HandleAIChat(w http.ResponseWriter, r *http.Request) {
+ // Handle CORS preflight requests
+ if r.Method == http.MethodOptions {
+ w.WriteHeader(http.StatusOK)
+ return
+ }
+
+ // Parse query parameters first
+ blockId := r.URL.Query().Get("blockid")
+ presetKey := r.URL.Query().Get("preset")
+
+ if blockId == "" {
+ http.Error(w, "blockid query parameter is required", http.StatusBadRequest)
+ return
+ }
+
+ // Parse request body completely before sending any response
+ var req UseChatRequest
+ if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
+ http.Error(w, fmt.Sprintf("Invalid request body: %v", err), http.StatusBadRequest)
+ return
+ }
+
+ // Resolve AI configuration
+ aiOpts, err := resolveAIConfig(r.Context(), blockId, presetKey, req.Options)
+ if err != nil {
+ http.Error(w, fmt.Sprintf("Configuration error: %v", err), http.StatusInternalServerError)
+ return
+ }
+
+ // Validate configuration
+ if aiOpts.Model == "" {
+ http.Error(w, "No AI model specified", http.StatusBadRequest)
+ return
+ }
+ log.Printf("using AI model: %s (%s)", aiOpts.Model, aiOpts.BaseURL)
+
+ // For now, only support OpenAI
+ if aiOpts.APIType != APIType_OpenAI && aiOpts.APIType != "" {
+ http.Error(w, fmt.Sprintf("Unsupported API type: %s (only OpenAI supported in POC)", aiOpts.APIType), http.StatusBadRequest)
+ return
+ }
+
+ if aiOpts.APIToken == "" {
+ http.Error(w, "No API token provided", http.StatusBadRequest)
+ return
+ }
+
+ // Create SSE handler and set up streaming
+ sseHandler := MakeSSEHandlerCh(w, r.Context())
+ defer sseHandler.Close()
+
+ if err := sseHandler.SetupSSE(); err != nil {
+ http.Error(w, fmt.Sprintf("Failed to setup SSE: %v", err), http.StatusInternalServerError)
+ return
+ }
+
+ // Stream OpenAI response
+ StreamOpenAIToUseChat(sseHandler, r.Context(), aiOpts, req.Messages)
+}
diff --git a/pkg/web/web.go b/pkg/web/web.go
index 8a3e6470b4..07d0625a6e 100644
--- a/pkg/web/web.go
+++ b/pkg/web/web.go
@@ -19,7 +19,6 @@ import (
"time"
"github.com/google/uuid"
- "github.com/gorilla/handlers"
"github.com/gorilla/mux"
"github.com/wavetermdev/waveterm/pkg/authkey"
"github.com/wavetermdev/waveterm/pkg/docsite"
@@ -29,6 +28,7 @@ import (
"github.com/wavetermdev/waveterm/pkg/schema"
"github.com/wavetermdev/waveterm/pkg/service"
"github.com/wavetermdev/waveterm/pkg/util/utilfn"
+ "github.com/wavetermdev/waveterm/pkg/waveai"
"github.com/wavetermdev/waveterm/pkg/wavebase"
"github.com/wavetermdev/waveterm/pkg/wshrpc"
"github.com/wavetermdev/waveterm/pkg/wshrpc/wshclient"
@@ -404,6 +404,13 @@ func WebFnWrap(opts WebFnOpts, fn WebFnType) WebFnType {
w.Header().Set(CacheControlHeaderKey, CacheControlHeaderNoCache)
}
w.Header().Set("Access-Control-Expose-Headers", "X-ZoneFileInfo")
+
+ // Handle CORS preflight OPTIONS requests without auth validation
+ if r.Method == http.MethodOptions {
+ w.WriteHeader(http.StatusOK)
+ return
+ }
+
err := authkey.ValidateIncomingRequest(r)
if err != nil {
w.WriteHeader(http.StatusUnauthorized)
@@ -442,17 +449,49 @@ const schemaPrefix = "/schema/"
// blocking
func RunWebServer(listener net.Listener) {
gr := mux.NewRouter()
- gr.HandleFunc("/wave/stream-local-file", WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamLocalFile))
- gr.HandleFunc("/wave/stream-file", WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamFile))
- gr.PathPrefix("/wave/stream-file/").HandlerFunc(WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamFile))
- gr.HandleFunc("/wave/file", WebFnWrap(WebFnOpts{AllowCaching: false}, handleWaveFile))
- gr.HandleFunc("/wave/service", WebFnWrap(WebFnOpts{JsonErrors: true}, handleService))
- gr.HandleFunc("/vdom/{uuid}/{path:.*}", WebFnWrap(WebFnOpts{AllowCaching: true}, handleVDom))
+
+ // Create separate routers for different timeout requirements
+ waveRouter := mux.NewRouter()
+ waveRouter.HandleFunc("/wave/stream-local-file", WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamLocalFile))
+ waveRouter.HandleFunc("/wave/stream-file", WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamFile))
+ waveRouter.PathPrefix("/wave/stream-file/").HandlerFunc(WebFnWrap(WebFnOpts{AllowCaching: true}, handleStreamFile))
+ waveRouter.HandleFunc("/wave/file", WebFnWrap(WebFnOpts{AllowCaching: false}, handleWaveFile))
+ waveRouter.HandleFunc("/wave/service", WebFnWrap(WebFnOpts{JsonErrors: true}, handleService))
+
+ vdomRouter := mux.NewRouter()
+ vdomRouter.HandleFunc("/vdom/{uuid}/{path:.*}", WebFnWrap(WebFnOpts{AllowCaching: true}, handleVDom))
+
+ // Routes that need timeout handling
+ gr.PathPrefix("/wave/").Handler(http.TimeoutHandler(waveRouter, HttpTimeoutDuration, "Timeout"))
+ gr.PathPrefix("/vdom/").Handler(http.TimeoutHandler(vdomRouter, HttpTimeoutDuration, "Timeout"))
+
+ // Routes that should NOT have timeout handling (for streaming)
+ gr.HandleFunc("/api/aichat", WebFnWrap(WebFnOpts{AllowCaching: false}, waveai.HandleAIChat))
+
+ // Other routes without timeout
gr.PathPrefix(docsitePrefix).Handler(http.StripPrefix(docsitePrefix, docsite.GetDocsiteHandler()))
gr.PathPrefix(schemaPrefix).Handler(http.StripPrefix(schemaPrefix, schema.GetSchemaHandler()))
- handler := http.TimeoutHandler(gr, HttpTimeoutDuration, "Timeout")
+
+ handler := http.Handler(gr)
if wavebase.IsDevMode() {
- handler = handlers.CORS(handlers.AllowedOrigins([]string{"*"}))(handler)
+ originalHandler := handler
+ handler = http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+ origin := r.Header.Get("Origin")
+ if origin != "" {
+ w.Header().Set("Access-Control-Allow-Origin", origin)
+ }
+ w.Header().Set("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS")
+ w.Header().Set("Access-Control-Allow-Headers", "Content-Type, X-Session-Id, X-AuthKey, Authorization, X-Requested-With, Accept, x-vercel-ai-ui-message-stream")
+ w.Header().Set("Access-Control-Expose-Headers", "X-ZoneFileInfo, Content-Length, Content-Type, x-vercel-ai-ui-message-stream")
+ w.Header().Set("Access-Control-Allow-Credentials", "true")
+
+ if r.Method == http.MethodOptions {
+ w.WriteHeader(http.StatusNoContent)
+ return
+ }
+
+ originalHandler.ServeHTTP(w, r)
+ })
}
server := &http.Server{
ReadTimeout: HttpReadTimeout,
diff --git a/yarn.lock b/yarn.lock
index 10f8946a58..f076d12dd2 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -19,6 +19,58 @@ __metadata:
languageName: node
linkType: hard
+"@ai-sdk/gateway@npm:1.0.21":
+ version: 1.0.21
+ resolution: "@ai-sdk/gateway@npm:1.0.21"
+ dependencies:
+ "@ai-sdk/provider": "npm:2.0.0"
+ "@ai-sdk/provider-utils": "npm:3.0.8"
+ peerDependencies:
+ zod: ^3.25.76 || ^4
+ checksum: 10c0/f99f195dae40d116885746f4cfe0d60b73b76c487c57479a6215f18382a686a14e43da6d1724390d033ee18aebb91a28c535e7246e677bbe2cba9fc11444ef2f
+ languageName: node
+ linkType: hard
+
+"@ai-sdk/provider-utils@npm:3.0.8":
+ version: 3.0.8
+ resolution: "@ai-sdk/provider-utils@npm:3.0.8"
+ dependencies:
+ "@ai-sdk/provider": "npm:2.0.0"
+ "@standard-schema/spec": "npm:^1.0.0"
+ eventsource-parser: "npm:^3.0.5"
+ peerDependencies:
+ zod: ^3.25.76 || ^4
+ checksum: 10c0/f466657c886cbb9f7ecbcd2dd1abc51a88af9d3f1cff030f7e97e70a4790a99f3338ad886e9c0dccf04dacdcc84522c7d57119b9a4e8e1d84f2dae9c893c397e
+ languageName: node
+ linkType: hard
+
+"@ai-sdk/provider@npm:2.0.0":
+ version: 2.0.0
+ resolution: "@ai-sdk/provider@npm:2.0.0"
+ dependencies:
+ json-schema: "npm:^0.4.0"
+ checksum: 10c0/e50e520016c9fc0a8b5009cadd47dae2f1c81ec05c1792b9e312d7d15479f024ca8039525813a33425c884e3449019fed21043b1bfabd6a2626152ca9a388199
+ languageName: node
+ linkType: hard
+
+"@ai-sdk/react@npm:^2.0.18":
+ version: 2.0.40
+ resolution: "@ai-sdk/react@npm:2.0.40"
+ dependencies:
+ "@ai-sdk/provider-utils": "npm:3.0.8"
+ ai: "npm:5.0.40"
+ swr: "npm:^2.2.5"
+ throttleit: "npm:2.1.0"
+ peerDependencies:
+ react: ^18 || ^19 || ^19.0.0-rc
+ zod: ^3.25.76 || ^4
+ peerDependenciesMeta:
+ zod:
+ optional: true
+ checksum: 10c0/c666451a0d1989a925c0526f64c113374677c0075196b73a4b7914d255b7fbe2d3342119985045436ef5dad654330c5b25500684b14638fb664ef660ee2310a7
+ languageName: node
+ linkType: hard
+
"@algolia/autocomplete-core@npm:1.17.9":
version: 1.17.9
resolution: "@algolia/autocomplete-core@npm:1.17.9"
@@ -221,6 +273,23 @@ __metadata:
languageName: node
linkType: hard
+"@antfu/install-pkg@npm:^1.1.0":
+ version: 1.1.0
+ resolution: "@antfu/install-pkg@npm:1.1.0"
+ dependencies:
+ package-manager-detector: "npm:^1.3.0"
+ tinyexec: "npm:^1.0.1"
+ checksum: 10c0/140d5994c76fd3d0e824c88f1ce91b3370e8066a8bc2f5729ae133bf768caa239f7915e29c78f239b7ead253113ace51293e95127fafe2b786b88eb615b3be47
+ languageName: node
+ linkType: hard
+
+"@antfu/utils@npm:^9.2.0":
+ version: 9.2.0
+ resolution: "@antfu/utils@npm:9.2.0"
+ checksum: 10c0/c622bd64985abee4324ddc80e11312b6e8154d648f7e80b2e0010276ca527b723ab25df4013cba344b2cd606960e42e05186013c6af79ef724ccb59acba9b893
+ languageName: node
+ linkType: hard
+
"@babel/code-frame@npm:^7.0.0, @babel/code-frame@npm:^7.10.4, @babel/code-frame@npm:^7.16.0, @babel/code-frame@npm:^7.21.4, @babel/code-frame@npm:^7.25.9, @babel/code-frame@npm:^7.26.0, @babel/code-frame@npm:^7.8.3":
version: 7.26.2
resolution: "@babel/code-frame@npm:7.26.2"
@@ -1686,6 +1755,55 @@ __metadata:
languageName: node
linkType: hard
+"@braintree/sanitize-url@npm:^7.0.4":
+ version: 7.1.1
+ resolution: "@braintree/sanitize-url@npm:7.1.1"
+ checksum: 10c0/fdfc1759c4244e287693ce1e9d42d649423e7c203fdccf27a571f8951ddfe34baa5273b7e6a8dd3007d7676859c7a0a9819be0ab42a3505f8505ad0eefecf7c1
+ languageName: node
+ linkType: hard
+
+"@chevrotain/cst-dts-gen@npm:11.0.3":
+ version: 11.0.3
+ resolution: "@chevrotain/cst-dts-gen@npm:11.0.3"
+ dependencies:
+ "@chevrotain/gast": "npm:11.0.3"
+ "@chevrotain/types": "npm:11.0.3"
+ lodash-es: "npm:4.17.21"
+ checksum: 10c0/9e945a0611386e4e08af34c2d0b3af36c1af08f726b58145f11310f2aeafcb2d65264c06ec65a32df6b6a65771e6a55be70580c853afe3ceb51487e506967104
+ languageName: node
+ linkType: hard
+
+"@chevrotain/gast@npm:11.0.3":
+ version: 11.0.3
+ resolution: "@chevrotain/gast@npm:11.0.3"
+ dependencies:
+ "@chevrotain/types": "npm:11.0.3"
+ lodash-es: "npm:4.17.21"
+ checksum: 10c0/54fc44d7b4a7b0323f49d957dd88ad44504922d30cb226d93b430b0e09925efe44e0726068581d777f423fabfb878a2238ed2c87b690c0c0014ebd12b6968354
+ languageName: node
+ linkType: hard
+
+"@chevrotain/regexp-to-ast@npm:11.0.3":
+ version: 11.0.3
+ resolution: "@chevrotain/regexp-to-ast@npm:11.0.3"
+ checksum: 10c0/6939c5c94fbfb8c559a4a37a283af5ded8e6147b184a7d7bcf5ad1404d9d663c78d81602bd8ea8458ec497358a9e1671541099c511835d0be2cad46f00c62b3f
+ languageName: node
+ linkType: hard
+
+"@chevrotain/types@npm:11.0.3":
+ version: 11.0.3
+ resolution: "@chevrotain/types@npm:11.0.3"
+ checksum: 10c0/72fe8f0010ebef848e47faea14a88c6fdc3cdbafaef6b13df4a18c7d33249b1b675e37b05cb90a421700c7016dae7cd4187ab6b549e176a81cea434f69cd2503
+ languageName: node
+ linkType: hard
+
+"@chevrotain/utils@npm:11.0.3":
+ version: 11.0.3
+ resolution: "@chevrotain/utils@npm:11.0.3"
+ checksum: 10c0/b31972d1b2d444eef1499cf9b7576fc1793e8544910de33a3c18e07c270cfad88067f175d0ee63e7bc604713ebed647f8190db45cc8311852cd2d4fe2ef14068
+ languageName: node
+ linkType: hard
+
"@chromatic-com/storybook@npm:^3.2.7":
version: 3.2.7
resolution: "@chromatic-com/storybook@npm:3.2.7"
@@ -3681,6 +3799,29 @@ __metadata:
languageName: node
linkType: hard
+"@iconify/types@npm:^2.0.0":
+ version: 2.0.0
+ resolution: "@iconify/types@npm:2.0.0"
+ checksum: 10c0/65a3be43500c7ccacf360e136d00e1717f050b7b91da644e94370256ac66f582d59212bdb30d00788aab4fc078262e91c95b805d1808d654b72f6d2072a7e4b2
+ languageName: node
+ linkType: hard
+
+"@iconify/utils@npm:^3.0.1":
+ version: 3.0.1
+ resolution: "@iconify/utils@npm:3.0.1"
+ dependencies:
+ "@antfu/install-pkg": "npm:^1.1.0"
+ "@antfu/utils": "npm:^9.2.0"
+ "@iconify/types": "npm:^2.0.0"
+ debug: "npm:^4.4.1"
+ globals: "npm:^15.15.0"
+ kolorist: "npm:^1.8.0"
+ local-pkg: "npm:^1.1.1"
+ mlly: "npm:^1.7.4"
+ checksum: 10c0/50a6fa3d242d99dc43e83cf7c731875e13e87ba358cca0b6c71b5e865aa1dbeb3e0a3212bdafa75048b1bf2ec464c7a2d3ca8abe203adf1c1f574472d9173f53
+ languageName: node
+ linkType: hard
+
"@isaacs/balanced-match@npm:^4.0.1":
version: 4.0.1
resolution: "@isaacs/balanced-match@npm:4.0.1"
@@ -3966,6 +4107,15 @@ __metadata:
languageName: node
linkType: hard
+"@mermaid-js/parser@npm:^0.6.2":
+ version: 0.6.2
+ resolution: "@mermaid-js/parser@npm:0.6.2"
+ dependencies:
+ langium: "npm:3.3.1"
+ checksum: 10c0/6059341a5dc3fdf56dd75c858843154e18c582e5cc41c3e73e9a076e218116c6bdbdba729d27154cef61430c900d87342423bbb81e37d8a9968c6c2fdd99e87a
+ languageName: node
+ linkType: hard
+
"@monaco-editor/loader@npm:^1.5.0":
version: 1.5.0
resolution: "@monaco-editor/loader@npm:1.5.0"
@@ -4155,6 +4305,13 @@ __metadata:
languageName: node
linkType: hard
+"@opentelemetry/api@npm:1.9.0":
+ version: 1.9.0
+ resolution: "@opentelemetry/api@npm:1.9.0"
+ checksum: 10c0/9aae2fe6e8a3a3eeb6c1fdef78e1939cf05a0f37f8a4fae4d6bf2e09eb1e06f966ece85805626e01ba5fab48072b94f19b835449e58b6d26720ee19a58298add
+ languageName: node
+ linkType: hard
+
"@parcel/watcher-android-arm64@npm:2.5.0":
version: 2.5.0
resolution: "@parcel/watcher-android-arm64@npm:2.5.0"
@@ -4956,6 +5113,74 @@ __metadata:
languageName: node
linkType: hard
+"@shikijs/core@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/core@npm:3.12.2"
+ dependencies:
+ "@shikijs/types": "npm:3.12.2"
+ "@shikijs/vscode-textmate": "npm:^10.0.2"
+ "@types/hast": "npm:^3.0.4"
+ hast-util-to-html: "npm:^9.0.5"
+ checksum: 10c0/3a05bc0a316a8a0170996ffe5dfc76021d20a459ed2c1aa5e659468a1a65af1cf0f69415535ecbd54387f4c61caf5d4b4e88b8fde21caf6af649d4178da52092
+ languageName: node
+ linkType: hard
+
+"@shikijs/engine-javascript@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/engine-javascript@npm:3.12.2"
+ dependencies:
+ "@shikijs/types": "npm:3.12.2"
+ "@shikijs/vscode-textmate": "npm:^10.0.2"
+ oniguruma-to-es: "npm:^4.3.3"
+ checksum: 10c0/421a3c20ab9841ffcefd776ac8c4ad7394458fa7919113b4ee5a0d97c2093dcc2299c886ea37216356972108024b5ac6b787dae26c25b42c54ae358f9feec3f0
+ languageName: node
+ linkType: hard
+
+"@shikijs/engine-oniguruma@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/engine-oniguruma@npm:3.12.2"
+ dependencies:
+ "@shikijs/types": "npm:3.12.2"
+ "@shikijs/vscode-textmate": "npm:^10.0.2"
+ checksum: 10c0/89887dda52949f82537388000b13f9060ae1fcdd87f4b305282b97bdbb1afde0bc4abb8f4f046896164fd8b9c251923f2a4d780fac933fa214ba90fb957a9873
+ languageName: node
+ linkType: hard
+
+"@shikijs/langs@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/langs@npm:3.12.2"
+ dependencies:
+ "@shikijs/types": "npm:3.12.2"
+ checksum: 10c0/1e72b8efedb5d3959ac4d4fea5a2d5eb7855eea5cddc35e21b5090bf903d40c058214be8c1d63355ad2cffc022be8ec0297bdc6695eddc91c02324dcc417814d
+ languageName: node
+ linkType: hard
+
+"@shikijs/themes@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/themes@npm:3.12.2"
+ dependencies:
+ "@shikijs/types": "npm:3.12.2"
+ checksum: 10c0/728b89554a166dca87aa3a4b53d0aa2c0b2c560252036275b3e8d3b4c790cf8fc980b5e06574e4fbad223627149eff150af12db5acd599fffd71e6ff6391af18
+ languageName: node
+ linkType: hard
+
+"@shikijs/types@npm:3.12.2":
+ version: 3.12.2
+ resolution: "@shikijs/types@npm:3.12.2"
+ dependencies:
+ "@shikijs/vscode-textmate": "npm:^10.0.2"
+ "@types/hast": "npm:^3.0.4"
+ checksum: 10c0/74622ac69a84f0d7b66f6f9253bdaa0fee69b7bc97d5f85e12b2a70a9d77d2b04fdbccf65fcd9460449340a214594cb945fee8b3d2c091175e58e8cdf2cb2920
+ languageName: node
+ linkType: hard
+
+"@shikijs/vscode-textmate@npm:^10.0.2":
+ version: 10.0.2
+ resolution: "@shikijs/vscode-textmate@npm:10.0.2"
+ checksum: 10c0/36b682d691088ec244de292dc8f91b808f95c89466af421cf84cbab92230f03c8348649c14b3251991b10ce632b0c715e416e992dd5f28ff3221dc2693fd9462
+ languageName: node
+ linkType: hard
+
"@shuding/opentype.js@npm:1.4.0-beta.0":
version: 1.4.0-beta.0
resolution: "@shuding/opentype.js@npm:1.4.0-beta.0"
@@ -6231,13 +6456,38 @@ __metadata:
languageName: node
linkType: hard
-"@types/d3-array@npm:^3.0.3":
+"@types/d3-array@npm:*, @types/d3-array@npm:^3.0.3":
version: 3.2.1
resolution: "@types/d3-array@npm:3.2.1"
checksum: 10c0/38bf2c778451f4b79ec81a2288cb4312fe3d6449ecdf562970cc339b60f280f31c93a024c7ff512607795e79d3beb0cbda123bb07010167bce32927f71364bca
languageName: node
linkType: hard
+"@types/d3-axis@npm:*":
+ version: 3.0.6
+ resolution: "@types/d3-axis@npm:3.0.6"
+ dependencies:
+ "@types/d3-selection": "npm:*"
+ checksum: 10c0/d756d42360261f44d8eefd0950c5bb0a4f67a46dd92069da3f723ac36a1e8cb2b9ce6347d836ef19d5b8aef725dbcf8fdbbd6cfbff676ca4b0642df2f78b599a
+ languageName: node
+ linkType: hard
+
+"@types/d3-brush@npm:*":
+ version: 3.0.6
+ resolution: "@types/d3-brush@npm:3.0.6"
+ dependencies:
+ "@types/d3-selection": "npm:*"
+ checksum: 10c0/fd6e2ac7657a354f269f6b9c58451ffae9d01b89ccb1eb6367fd36d635d2f1990967215ab498e0c0679ff269429c57fad6a2958b68f4d45bc9f81d81672edc01
+ languageName: node
+ linkType: hard
+
+"@types/d3-chord@npm:*":
+ version: 3.0.6
+ resolution: "@types/d3-chord@npm:3.0.6"
+ checksum: 10c0/c5a25eb5389db01e63faec0c5c2ec7cc41c494e9b3201630b494c4e862a60f1aa83fabbc33a829e7e1403941e3c30d206c741559b14406ac2a4239cfdf4b4c17
+ languageName: node
+ linkType: hard
+
"@types/d3-color@npm:*":
version: 3.1.3
resolution: "@types/d3-color@npm:3.1.3"
@@ -6245,14 +6495,93 @@ __metadata:
languageName: node
linkType: hard
-"@types/d3-ease@npm:^3.0.0":
+"@types/d3-contour@npm:*":
+ version: 3.0.6
+ resolution: "@types/d3-contour@npm:3.0.6"
+ dependencies:
+ "@types/d3-array": "npm:*"
+ "@types/geojson": "npm:*"
+ checksum: 10c0/e7d83e94719af4576ceb5ac7f277c5806f83ba6c3631744ae391cffc3641f09dfa279470b83053cd0b2acd6784e8749c71141d05bdffa63ca58ffb5b31a0f27c
+ languageName: node
+ linkType: hard
+
+"@types/d3-delaunay@npm:*":
+ version: 6.0.4
+ resolution: "@types/d3-delaunay@npm:6.0.4"
+ checksum: 10c0/d154a8864f08c4ea23ecb9bdabcef1c406a25baa8895f0cb08a0ed2799de0d360e597552532ce7086ff0cdffa8f3563f9109d18f0191459d32bb620a36939123
+ languageName: node
+ linkType: hard
+
+"@types/d3-dispatch@npm:*":
+ version: 3.0.7
+ resolution: "@types/d3-dispatch@npm:3.0.7"
+ checksum: 10c0/38c6605ebf0bf0099dfb70eafe0dd4ae8213368b40b8f930b72a909ff2e7259d2bd8a54d100bb5a44eb4b36f4f2a62dcb37f8be59613ca6b507c7a2f910b3145
+ languageName: node
+ linkType: hard
+
+"@types/d3-drag@npm:*":
+ version: 3.0.7
+ resolution: "@types/d3-drag@npm:3.0.7"
+ dependencies:
+ "@types/d3-selection": "npm:*"
+ checksum: 10c0/65e29fa32a87c72d26c44b5e2df3bf15af21cd128386bcc05bcacca255927c0397d0cd7e6062aed5f0abd623490544a9d061c195f5ed9f018fe0b698d99c079d
+ languageName: node
+ linkType: hard
+
+"@types/d3-dsv@npm:*":
+ version: 3.0.7
+ resolution: "@types/d3-dsv@npm:3.0.7"
+ checksum: 10c0/c0f01da862465594c8a28278b51c850af3b4239cc22b14fd1a19d7a98f93d94efa477bf59d8071beb285dca45bf614630811451e18e7c52add3a0abfee0a1871
+ languageName: node
+ linkType: hard
+
+"@types/d3-ease@npm:*, @types/d3-ease@npm:^3.0.0":
version: 3.0.2
resolution: "@types/d3-ease@npm:3.0.2"
checksum: 10c0/aff5a1e572a937ee9bff6465225d7ba27d5e0c976bd9eacdac2e6f10700a7cb0c9ea2597aff6b43a6ed850a3210030870238894a77ec73e309b4a9d0333f099c
languageName: node
linkType: hard
-"@types/d3-interpolate@npm:^3.0.1":
+"@types/d3-fetch@npm:*":
+ version: 3.0.7
+ resolution: "@types/d3-fetch@npm:3.0.7"
+ dependencies:
+ "@types/d3-dsv": "npm:*"
+ checksum: 10c0/3d147efa52a26da1a5d40d4d73e6cebaaa964463c378068062999b93ea3731b27cc429104c21ecbba98c6090e58ef13429db6399238c5e3500162fb3015697a0
+ languageName: node
+ linkType: hard
+
+"@types/d3-force@npm:*":
+ version: 3.0.10
+ resolution: "@types/d3-force@npm:3.0.10"
+ checksum: 10c0/c82b459079a106b50e346c9b79b141f599f2fc4f598985a5211e72c7a2e20d35bd5dc6e91f306b323c8bfa325c02c629b1645f5243f1c6a55bd51bc85cccfa92
+ languageName: node
+ linkType: hard
+
+"@types/d3-format@npm:*":
+ version: 3.0.4
+ resolution: "@types/d3-format@npm:3.0.4"
+ checksum: 10c0/3ac1600bf9061a59a228998f7cd3f29e85cbf522997671ba18d4d84d10a2a1aff4f95aceb143fa9960501c3ec351e113fc75884e6a504ace44dc1744083035ee
+ languageName: node
+ linkType: hard
+
+"@types/d3-geo@npm:*":
+ version: 3.1.0
+ resolution: "@types/d3-geo@npm:3.1.0"
+ dependencies:
+ "@types/geojson": "npm:*"
+ checksum: 10c0/3745a93439038bb5b0b38facf435f7079812921d46406f5d38deaee59e90084ff742443c7ea0a8446df81a0d81eaf622fe7068cf4117a544bd4aa3b2dc182f88
+ languageName: node
+ linkType: hard
+
+"@types/d3-hierarchy@npm:*":
+ version: 3.1.7
+ resolution: "@types/d3-hierarchy@npm:3.1.7"
+ checksum: 10c0/873711737d6b8e7b6f1dda0bcd21294a48f75024909ae510c5d2c21fad2e72032e0958def4d9f68319d3aaac298ad09c49807f8bfc87a145a82693b5208613c7
+ languageName: node
+ linkType: hard
+
+"@types/d3-interpolate@npm:*, @types/d3-interpolate@npm:^3.0.1":
version: 3.0.4
resolution: "@types/d3-interpolate@npm:3.0.4"
dependencies:
@@ -6268,6 +6597,43 @@ __metadata:
languageName: node
linkType: hard
+"@types/d3-polygon@npm:*":
+ version: 3.0.2
+ resolution: "@types/d3-polygon@npm:3.0.2"
+ checksum: 10c0/f46307bb32b6c2aef8c7624500e0f9b518de8f227ccc10170b869dc43e4c542560f6c8d62e9f087fac45e198d6e4b623e579c0422e34c85baf56717456d3f439
+ languageName: node
+ linkType: hard
+
+"@types/d3-quadtree@npm:*":
+ version: 3.0.6
+ resolution: "@types/d3-quadtree@npm:3.0.6"
+ checksum: 10c0/7eaa0a4d404adc856971c9285e1c4ab17e9135ea669d847d6db7e0066126a28ac751864e7ce99c65d526e130f56754a2e437a1617877098b3bdcc3ef23a23616
+ languageName: node
+ linkType: hard
+
+"@types/d3-random@npm:*":
+ version: 3.0.3
+ resolution: "@types/d3-random@npm:3.0.3"
+ checksum: 10c0/5f4fea40080cd6d4adfee05183d00374e73a10c530276a6455348983dda341003a251def28565a27c25d9cf5296a33e870e397c9d91ff83fb7495a21c96b6882
+ languageName: node
+ linkType: hard
+
+"@types/d3-scale-chromatic@npm:*":
+ version: 3.1.0
+ resolution: "@types/d3-scale-chromatic@npm:3.1.0"
+ checksum: 10c0/93c564e02d2e97a048e18fe8054e4a935335da6ab75a56c3df197beaa87e69122eef0dfbeb7794d4a444a00e52e3123514ee27cec084bd21f6425b7037828cc2
+ languageName: node
+ linkType: hard
+
+"@types/d3-scale@npm:*":
+ version: 4.0.9
+ resolution: "@types/d3-scale@npm:4.0.9"
+ dependencies:
+ "@types/d3-time": "npm:*"
+ checksum: 10c0/4ac44233c05cd50b65b33ecb35d99fdf07566bcdbc55bc1306b2f27d1c5134d8c560d356f2c8e76b096e9125ffb8d26d95f78d56e210d1c542cb255bdf31d6c8
+ languageName: node
+ linkType: hard
+
"@types/d3-scale@npm:^4.0.2":
version: 4.0.8
resolution: "@types/d3-scale@npm:4.0.8"
@@ -6277,7 +6643,14 @@ __metadata:
languageName: node
linkType: hard
-"@types/d3-shape@npm:^3.1.0":
+"@types/d3-selection@npm:*":
+ version: 3.0.11
+ resolution: "@types/d3-selection@npm:3.0.11"
+ checksum: 10c0/0c512956c7503ff5def4bb32e0c568cc757b9a2cc400a104fc0f4cfe5e56d83ebde2a97821b6f2cb26a7148079d3b86a2f28e11d68324ed311cf35c2ed980d1d
+ languageName: node
+ linkType: hard
+
+"@types/d3-shape@npm:*, @types/d3-shape@npm:^3.1.0":
version: 3.1.7
resolution: "@types/d3-shape@npm:3.1.7"
dependencies:
@@ -6286,6 +6659,13 @@ __metadata:
languageName: node
linkType: hard
+"@types/d3-time-format@npm:*":
+ version: 4.0.3
+ resolution: "@types/d3-time-format@npm:4.0.3"
+ checksum: 10c0/9ef5e8e2b96b94799b821eed5d61a3d432c7903247966d8ad951b8ce5797fe46554b425cb7888fa5bf604b4663c369d7628c0328ffe80892156671c58d1a7f90
+ languageName: node
+ linkType: hard
+
"@types/d3-time@npm:*, @types/d3-time@npm:^3.0.0":
version: 3.0.4
resolution: "@types/d3-time@npm:3.0.4"
@@ -6293,13 +6673,70 @@ __metadata:
languageName: node
linkType: hard
-"@types/d3-timer@npm:^3.0.0":
+"@types/d3-timer@npm:*, @types/d3-timer@npm:^3.0.0":
version: 3.0.2
resolution: "@types/d3-timer@npm:3.0.2"
checksum: 10c0/c644dd9571fcc62b1aa12c03bcad40571553020feeb5811f1d8a937ac1e65b8a04b759b4873aef610e28b8714ac71c9885a4d6c127a048d95118f7e5b506d9e1
languageName: node
linkType: hard
+"@types/d3-transition@npm:*":
+ version: 3.0.9
+ resolution: "@types/d3-transition@npm:3.0.9"
+ dependencies:
+ "@types/d3-selection": "npm:*"
+ checksum: 10c0/4f68b9df7ac745b3491216c54203cbbfa0f117ae4c60e2609cdef2db963582152035407fdff995b10ee383bae2f05b7743493f48e1b8e46df54faa836a8fb7b5
+ languageName: node
+ linkType: hard
+
+"@types/d3-zoom@npm:*":
+ version: 3.0.8
+ resolution: "@types/d3-zoom@npm:3.0.8"
+ dependencies:
+ "@types/d3-interpolate": "npm:*"
+ "@types/d3-selection": "npm:*"
+ checksum: 10c0/1dbdbcafddcae12efb5beb6948546963f29599e18bc7f2a91fb69cc617c2299a65354f2d47e282dfb86fec0968406cd4fb7f76ba2d2fb67baa8e8d146eb4a547
+ languageName: node
+ linkType: hard
+
+"@types/d3@npm:^7.4.3":
+ version: 7.4.3
+ resolution: "@types/d3@npm:7.4.3"
+ dependencies:
+ "@types/d3-array": "npm:*"
+ "@types/d3-axis": "npm:*"
+ "@types/d3-brush": "npm:*"
+ "@types/d3-chord": "npm:*"
+ "@types/d3-color": "npm:*"
+ "@types/d3-contour": "npm:*"
+ "@types/d3-delaunay": "npm:*"
+ "@types/d3-dispatch": "npm:*"
+ "@types/d3-drag": "npm:*"
+ "@types/d3-dsv": "npm:*"
+ "@types/d3-ease": "npm:*"
+ "@types/d3-fetch": "npm:*"
+ "@types/d3-force": "npm:*"
+ "@types/d3-format": "npm:*"
+ "@types/d3-geo": "npm:*"
+ "@types/d3-hierarchy": "npm:*"
+ "@types/d3-interpolate": "npm:*"
+ "@types/d3-path": "npm:*"
+ "@types/d3-polygon": "npm:*"
+ "@types/d3-quadtree": "npm:*"
+ "@types/d3-random": "npm:*"
+ "@types/d3-scale": "npm:*"
+ "@types/d3-scale-chromatic": "npm:*"
+ "@types/d3-selection": "npm:*"
+ "@types/d3-shape": "npm:*"
+ "@types/d3-time": "npm:*"
+ "@types/d3-time-format": "npm:*"
+ "@types/d3-timer": "npm:*"
+ "@types/d3-transition": "npm:*"
+ "@types/d3-zoom": "npm:*"
+ checksum: 10c0/a9c6d65b13ef3b42c87f2a89ea63a6d5640221869f97d0657b0cb2f1dac96a0f164bf5605643c0794e0de3aa2bf05df198519aaf15d24ca135eb0e8bd8a9d879
+ languageName: node
+ linkType: hard
+
"@types/debug@npm:^4, @types/debug@npm:^4.0.0, @types/debug@npm:^4.1.6":
version: 4.1.12
resolution: "@types/debug@npm:4.1.12"
@@ -6430,7 +6867,14 @@ __metadata:
languageName: node
linkType: hard
-"@types/hast@npm:^3.0.0":
+"@types/geojson@npm:*":
+ version: 7946.0.16
+ resolution: "@types/geojson@npm:7946.0.16"
+ checksum: 10c0/1ff24a288bd5860b766b073ead337d31d73bdc715e5b50a2cee5cb0af57a1ed02cc04ef295f5fa68dc40fe3e4f104dd31282b2b818a5ba3231bc1001ba084e3c
+ languageName: node
+ linkType: hard
+
+"@types/hast@npm:^3.0.0, @types/hast@npm:^3.0.4":
version: 3.0.4
resolution: "@types/hast@npm:3.0.4"
dependencies:
@@ -6515,6 +6959,13 @@ __metadata:
languageName: node
linkType: hard
+"@types/katex@npm:^0.16.0":
+ version: 0.16.7
+ resolution: "@types/katex@npm:0.16.7"
+ checksum: 10c0/68dcb9f68a90513ec78ca0196a142e15c2a2c270b1520d752bafd47a99207115085a64087b50140359017d7e9c870b3c68e7e4d36668c9e348a9ef0c48919b5a
+ languageName: node
+ linkType: hard
+
"@types/keyv@npm:^3.1.4":
version: 3.1.4
resolution: "@types/keyv@npm:3.1.4"
@@ -6891,6 +7342,13 @@ __metadata:
languageName: node
linkType: hard
+"@types/trusted-types@npm:^2.0.7":
+ version: 2.0.7
+ resolution: "@types/trusted-types@npm:2.0.7"
+ checksum: 10c0/4c4855f10de7c6c135e0d32ce462419d8abbbc33713b31d294596c0cc34ae1fa6112a2f9da729c8f7a20707782b0d69da3b1f8df6645b0366d08825ca1522e0c
+ languageName: node
+ linkType: hard
+
"@types/unist@npm:*, @types/unist@npm:^3.0.0":
version: 3.0.3
resolution: "@types/unist@npm:3.0.3"
@@ -7684,6 +8142,20 @@ __metadata:
languageName: node
linkType: hard
+"ai@npm:5.0.40, ai@npm:^5.0.18":
+ version: 5.0.40
+ resolution: "ai@npm:5.0.40"
+ dependencies:
+ "@ai-sdk/gateway": "npm:1.0.21"
+ "@ai-sdk/provider": "npm:2.0.0"
+ "@ai-sdk/provider-utils": "npm:3.0.8"
+ "@opentelemetry/api": "npm:1.9.0"
+ peerDependencies:
+ zod: ^3.25.76 || ^4
+ checksum: 10c0/9c29f8f47ff568a2a4af14fa79d90bea4dea8edee57ba2ed9c618ce1d7c3c8b104329ac869becfc41448eb9dabb0605d27b6dddffb1c6498badd444028d43371
+ languageName: node
+ linkType: hard
+
"ajv-formats@npm:^2.1.1":
version: 2.1.1
resolution: "ajv-formats@npm:2.1.1"
@@ -8760,6 +9232,31 @@ __metadata:
languageName: node
linkType: hard
+"chevrotain-allstar@npm:~0.3.0":
+ version: 0.3.1
+ resolution: "chevrotain-allstar@npm:0.3.1"
+ dependencies:
+ lodash-es: "npm:^4.17.21"
+ peerDependencies:
+ chevrotain: ^11.0.0
+ checksum: 10c0/5cadedffd3114eb06b15fd3939bb1aa6c75412dbd737fe302b52c5c24334f9cb01cee8edc1d1067d98ba80dddf971f1d0e94b387de51423fc6cf3c5d8b7ef27a
+ languageName: node
+ linkType: hard
+
+"chevrotain@npm:~11.0.3":
+ version: 11.0.3
+ resolution: "chevrotain@npm:11.0.3"
+ dependencies:
+ "@chevrotain/cst-dts-gen": "npm:11.0.3"
+ "@chevrotain/gast": "npm:11.0.3"
+ "@chevrotain/regexp-to-ast": "npm:11.0.3"
+ "@chevrotain/types": "npm:11.0.3"
+ "@chevrotain/utils": "npm:11.0.3"
+ lodash-es: "npm:4.17.21"
+ checksum: 10c0/ffd425fa321e3f17e9833d7f44cd39f2743f066e92ca74b226176080ca5d455f853fe9091cdfd86354bd899d85c08b3bdc3f55b267e7d07124b048a88349765f
+ languageName: node
+ linkType: hard
+
"chokidar@npm:^3.0.0, chokidar@npm:^3.4.2, chokidar@npm:^3.5.3, chokidar@npm:^3.6.0":
version: 3.6.0
resolution: "chokidar@npm:3.6.0"
@@ -9203,6 +9700,20 @@ __metadata:
languageName: node
linkType: hard
+"confbox@npm:^0.1.8":
+ version: 0.1.8
+ resolution: "confbox@npm:0.1.8"
+ checksum: 10c0/fc2c68d97cb54d885b10b63e45bd8da83a8a71459d3ecf1825143dd4c7f9f1b696b3283e07d9d12a144c1301c2ebc7842380bdf0014e55acc4ae1c9550102418
+ languageName: node
+ linkType: hard
+
+"confbox@npm:^0.2.2":
+ version: 0.2.2
+ resolution: "confbox@npm:0.2.2"
+ checksum: 10c0/7c246588d533d31e8cdf66cb4701dff6de60f9be77ab54c0d0338e7988750ac56863cc0aca1b3f2046f45ff223a765d3e5d4977a7674485afcd37b6edf3fd129
+ languageName: node
+ linkType: hard
+
"config-chain@npm:^1.1.11":
version: 1.1.13
resolution: "config-chain@npm:1.1.13"
@@ -9351,6 +9862,24 @@ __metadata:
languageName: node
linkType: hard
+"cose-base@npm:^1.0.0":
+ version: 1.0.3
+ resolution: "cose-base@npm:1.0.3"
+ dependencies:
+ layout-base: "npm:^1.0.0"
+ checksum: 10c0/a6e400b1d101393d6af0967c1353355777c1106c40417c5acaef6ca8bdda41e2fc9398f466d6c85be30290943ad631f2590569f67b3fd5368a0d8318946bd24f
+ languageName: node
+ linkType: hard
+
+"cose-base@npm:^2.2.0":
+ version: 2.2.0
+ resolution: "cose-base@npm:2.2.0"
+ dependencies:
+ layout-base: "npm:^2.0.0"
+ checksum: 10c0/14b9f8100ac322a00777ffb1daeb3321af368bbc9cabe3103943361273baee2003202ffe38e4ab770960b600214224e9c196195a78d589521540aa694df7cdec
+ languageName: node
+ linkType: hard
+
"cosmiconfig@npm:^6.0.0":
version: 6.0.0
resolution: "cosmiconfig@npm:6.0.0"
@@ -9735,6 +10264,44 @@ __metadata:
languageName: node
linkType: hard
+"cytoscape-cose-bilkent@npm:^4.1.0":
+ version: 4.1.0
+ resolution: "cytoscape-cose-bilkent@npm:4.1.0"
+ dependencies:
+ cose-base: "npm:^1.0.0"
+ peerDependencies:
+ cytoscape: ^3.2.0
+ checksum: 10c0/5e2480ddba9da1a68e700ed2c674cbfd51e9efdbd55788f1971a68de4eb30708e3b3a5e808bf5628f7a258680406bbe6586d87a9133e02a9bdc1ab1a92f512f2
+ languageName: node
+ linkType: hard
+
+"cytoscape-fcose@npm:^2.2.0":
+ version: 2.2.0
+ resolution: "cytoscape-fcose@npm:2.2.0"
+ dependencies:
+ cose-base: "npm:^2.2.0"
+ peerDependencies:
+ cytoscape: ^3.2.0
+ checksum: 10c0/ce472c9f85b9057e75c5685396f8e1f2468895e71b184913e05ad56dcf3092618fe59a1054f29cb0995051ba8ebe566ad0dd49a58d62845145624bd60cd44917
+ languageName: node
+ linkType: hard
+
+"cytoscape@npm:^3.29.3":
+ version: 3.33.1
+ resolution: "cytoscape@npm:3.33.1"
+ checksum: 10c0/dffcf5f74df4d91517c4faf394df880d8283ce76edef19edba0c762941cf4f18daf7c4c955ec50c794f476ace39ad4394f8c98483222bd2682e1fd206e976411
+ languageName: node
+ linkType: hard
+
+"d3-array@npm:1 - 2":
+ version: 2.12.1
+ resolution: "d3-array@npm:2.12.1"
+ dependencies:
+ internmap: "npm:^1.0.0"
+ checksum: 10c0/7eca10427a9f113a4ca6a0f7301127cab26043fd5e362631ef5a0edd1c4b2dd70c56ed317566700c31e4a6d88b55f3951aaba192291817f243b730cb2352882e
+ languageName: node
+ linkType: hard
+
"d3-array@npm:2 - 3, d3-array@npm:2.10.0 - 3, d3-array@npm:2.5.0 - 3, d3-array@npm:3, d3-array@npm:^3.1.6, d3-array@npm:^3.2.0":
version: 3.2.4
resolution: "d3-array@npm:3.2.4"
@@ -9895,6 +10462,13 @@ __metadata:
languageName: node
linkType: hard
+"d3-path@npm:1":
+ version: 1.0.9
+ resolution: "d3-path@npm:1.0.9"
+ checksum: 10c0/e35e84df5abc18091f585725b8235e1fa97efc287571585427d3a3597301e6c506dea56b11dfb3c06ca5858b3eb7f02c1bf4f6a716aa9eade01c41b92d497eb5
+ languageName: node
+ linkType: hard
+
"d3-path@npm:1 - 3, d3-path@npm:3, d3-path@npm:^3.1.0":
version: 3.1.0
resolution: "d3-path@npm:3.1.0"
@@ -9923,6 +10497,16 @@ __metadata:
languageName: node
linkType: hard
+"d3-sankey@npm:^0.12.3":
+ version: 0.12.3
+ resolution: "d3-sankey@npm:0.12.3"
+ dependencies:
+ d3-array: "npm:1 - 2"
+ d3-shape: "npm:^1.2.0"
+ checksum: 10c0/261debb01a13269f6fc53b9ebaef174a015d5ad646242c23995bf514498829ab8b8f920a7873724a7494288b46bea3ce7ebc5a920b745bc8ae4caa5885cf5204
+ languageName: node
+ linkType: hard
+
"d3-scale-chromatic@npm:3":
version: 3.1.0
resolution: "d3-scale-chromatic@npm:3.1.0"
@@ -9962,6 +10546,15 @@ __metadata:
languageName: node
linkType: hard
+"d3-shape@npm:^1.2.0":
+ version: 1.3.7
+ resolution: "d3-shape@npm:1.3.7"
+ dependencies:
+ d3-path: "npm:1"
+ checksum: 10c0/548057ce59959815decb449f15632b08e2a1bdce208f9a37b5f98ec7629dda986c2356bc7582308405ce68aedae7d47b324df41507404df42afaf352907577ae
+ languageName: node
+ linkType: hard
+
"d3-time-format@npm:2 - 4, d3-time-format@npm:4":
version: 4.1.0
resolution: "d3-time-format@npm:4.1.0"
@@ -10053,6 +10646,23 @@ __metadata:
languageName: node
linkType: hard
+"dagre-d3-es@npm:7.0.11":
+ version: 7.0.11
+ resolution: "dagre-d3-es@npm:7.0.11"
+ dependencies:
+ d3: "npm:^7.9.0"
+ lodash-es: "npm:^4.17.21"
+ checksum: 10c0/52f88bdfeca0d8554bee0c1419377585355b4ef179e5fedd3bac75f772745ecb789f6d7ea377a17566506bc8f151bc0dfe02a5175207a547975f335cd88c726c
+ languageName: node
+ linkType: hard
+
+"dayjs@npm:^1.11.13":
+ version: 1.11.18
+ resolution: "dayjs@npm:1.11.18"
+ checksum: 10c0/83b67f5d977e2634edf4f5abdd91d9041a696943143638063016915d2cd8c7e57e0751e40379a07ebca8be7a48dd380bef8752d22a63670f2d15970e34f96d7a
+ languageName: node
+ linkType: hard
+
"dayjs@npm:^1.11.15":
version: 1.11.15
resolution: "dayjs@npm:1.11.15"
@@ -10521,6 +11131,18 @@ __metadata:
languageName: node
linkType: hard
+"dompurify@npm:^3.2.5":
+ version: 3.2.6
+ resolution: "dompurify@npm:3.2.6"
+ dependencies:
+ "@types/trusted-types": "npm:^2.0.7"
+ dependenciesMeta:
+ "@types/trusted-types":
+ optional: true
+ checksum: 10c0/c8f8e5b0879a0d93c84a2e5e78649a47d0c057ed0f7850ca3d573d2cca64b84fb1ff85bd4b20980ade69c4e5b80ae73011340f1c2ff375c7ef98bb8268e1d13a
+ languageName: node
+ linkType: hard
+
"domutils@npm:^2.5.2, domutils@npm:^2.8.0":
version: 2.8.0
resolution: "domutils@npm:2.8.0"
@@ -11623,6 +12245,13 @@ __metadata:
languageName: node
linkType: hard
+"eventsource-parser@npm:^3.0.5":
+ version: 3.0.6
+ resolution: "eventsource-parser@npm:3.0.6"
+ checksum: 10c0/70b8ccec7dac767ef2eca43f355e0979e70415701691382a042a2df8d6a68da6c2fca35363669821f3da876d29c02abe9b232964637c1b6635c940df05ada78a
+ languageName: node
+ linkType: hard
+
"execa@npm:^5.0.0":
version: 5.1.1
resolution: "execa@npm:5.1.1"
@@ -11700,6 +12329,13 @@ __metadata:
languageName: node
linkType: hard
+"exsolve@npm:^1.0.7":
+ version: 1.0.7
+ resolution: "exsolve@npm:1.0.7"
+ checksum: 10c0/4479369d0bd84bb7e0b4f5d9bc18d26a89b6dbbbccd73f9d383d14892ef78ddbe159e01781055342f83dc00ebe90044036daf17ddf55cc21e2cac6609aa15631
+ languageName: node
+ linkType: hard
+
"extend-shallow@npm:^2.0.1":
version: 2.0.1
resolution: "extend-shallow@npm:2.0.1"
@@ -12512,6 +13148,13 @@ __metadata:
languageName: node
linkType: hard
+"globals@npm:^15.15.0":
+ version: 15.15.0
+ resolution: "globals@npm:15.15.0"
+ checksum: 10c0/f9ae80996392ca71316495a39bec88ac43ae3525a438b5626cd9d5ce9d5500d0a98a266409605f8cd7241c7acf57c354a48111ea02a767ba4f374b806d6861fe
+ languageName: node
+ linkType: hard
+
"globalthis@npm:^1.0.1":
version: 1.0.4
resolution: "globalthis@npm:1.0.4"
@@ -12652,6 +13295,13 @@ __metadata:
languageName: node
linkType: hard
+"hachure-fill@npm:^0.5.2":
+ version: 0.5.2
+ resolution: "hachure-fill@npm:0.5.2"
+ checksum: 10c0/307e3b6f9f2d3c11a82099c3f71eecbb9c440c00c1f896ac1732c23e6dbff16a92bb893d222b8b721b89cf11e58649ca60b4c24e5663f705f877cefd40153429
+ languageName: node
+ linkType: hard
+
"handle-thing@npm:^2.0.0":
version: 2.0.1
resolution: "handle-thing@npm:2.0.1"
@@ -12659,6 +13309,16 @@ __metadata:
languageName: node
linkType: hard
+"harden-react-markdown@npm:^1.0.5":
+ version: 1.0.5
+ resolution: "harden-react-markdown@npm:1.0.5"
+ peerDependencies:
+ react: ">=16.8.0"
+ react-markdown: ">=9.0.0"
+ checksum: 10c0/c47035dcb4f80950cc7b7b2e8d4f161ae871a2d2c5b2a7f469179620d8b806476370dc68ddb2cfaa5612663c07d05b2577fc692493e5571d2a2d6835fa2c1198
+ languageName: node
+ linkType: hard
+
"has-flag@npm:^4.0.0":
version: 4.0.0
resolution: "has-flag@npm:4.0.0"
@@ -12716,8 +13376,45 @@ __metadata:
version: 2.0.2
resolution: "hasown@npm:2.0.2"
dependencies:
- function-bind: "npm:^1.1.2"
- checksum: 10c0/3769d434703b8ac66b209a4cca0737519925bbdb61dd887f93a16372b14694c63ff4e797686d87c90f08168e81082248b9b028bad60d4da9e0d1148766f56eb9
+ function-bind: "npm:^1.1.2"
+ checksum: 10c0/3769d434703b8ac66b209a4cca0737519925bbdb61dd887f93a16372b14694c63ff4e797686d87c90f08168e81082248b9b028bad60d4da9e0d1148766f56eb9
+ languageName: node
+ linkType: hard
+
+"hast-util-from-dom@npm:^5.0.0":
+ version: 5.0.1
+ resolution: "hast-util-from-dom@npm:5.0.1"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ hastscript: "npm:^9.0.0"
+ web-namespaces: "npm:^2.0.0"
+ checksum: 10c0/9a90381e048107a093a3da758bb17b67aaf5322e222f02497f841c4990abf94aa177d38d5b9bf61ad07b3601d0409f34f5b556d89578cc189230c6b994d2af77
+ languageName: node
+ linkType: hard
+
+"hast-util-from-html-isomorphic@npm:^2.0.0":
+ version: 2.0.0
+ resolution: "hast-util-from-html-isomorphic@npm:2.0.0"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ hast-util-from-dom: "npm:^5.0.0"
+ hast-util-from-html: "npm:^2.0.0"
+ unist-util-remove-position: "npm:^5.0.0"
+ checksum: 10c0/fc68d9245e794483a802d5c85a9f6c25959e00db78cc796411efc965134f3206f9cc9fa38134572ea781ad74663e801f1f83202007b208e27a770855566a62b6
+ languageName: node
+ linkType: hard
+
+"hast-util-from-html@npm:^2.0.0":
+ version: 2.0.3
+ resolution: "hast-util-from-html@npm:2.0.3"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ devlop: "npm:^1.1.0"
+ hast-util-from-parse5: "npm:^8.0.0"
+ parse5: "npm:^7.0.0"
+ vfile: "npm:^6.0.0"
+ vfile-message: "npm:^4.0.0"
+ checksum: 10c0/993ef707c1a12474c8d4094fc9706a72826c660a7e308ea54c50ad893353d32e139b7cbc67510c2e82feac572b320e3b05aeb13d0f9c6302d61261f337b46764
languageName: node
linkType: hard
@@ -12820,6 +13517,25 @@ __metadata:
languageName: node
linkType: hard
+"hast-util-to-html@npm:^9.0.5":
+ version: 9.0.5
+ resolution: "hast-util-to-html@npm:9.0.5"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ "@types/unist": "npm:^3.0.0"
+ ccount: "npm:^2.0.0"
+ comma-separated-tokens: "npm:^2.0.0"
+ hast-util-whitespace: "npm:^3.0.0"
+ html-void-elements: "npm:^3.0.0"
+ mdast-util-to-hast: "npm:^13.0.0"
+ property-information: "npm:^7.0.0"
+ space-separated-tokens: "npm:^2.0.0"
+ stringify-entities: "npm:^4.0.0"
+ zwitch: "npm:^2.0.4"
+ checksum: 10c0/b7a08c30bab4371fc9b4a620965c40b270e5ae7a8e94cf885f43b21705179e28c8e43b39c72885d1647965fb3738654e6962eb8b58b0c2a84271655b4d748836
+ languageName: node
+ linkType: hard
+
"hast-util-to-jsx-runtime@npm:^2.0.0":
version: 2.3.2
resolution: "hast-util-to-jsx-runtime@npm:2.3.2"
@@ -12901,6 +13617,19 @@ __metadata:
languageName: node
linkType: hard
+"hastscript@npm:^9.0.0":
+ version: 9.0.1
+ resolution: "hastscript@npm:9.0.1"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ comma-separated-tokens: "npm:^2.0.0"
+ hast-util-parse-selector: "npm:^4.0.0"
+ property-information: "npm:^7.0.0"
+ space-separated-tokens: "npm:^2.0.0"
+ checksum: 10c0/18dc8064e5c3a7a2ae862978e626b97a254e1c8a67ee9d0c9f06d373bba155ed805fc5b5ce21b990fb7bc174624889e5e1ce1cade264f1b1d58b48f994bc85ce
+ languageName: node
+ linkType: hard
+
"he@npm:1.2.0, he@npm:^1.2.0":
version: 1.2.0
resolution: "he@npm:1.2.0"
@@ -13472,6 +14201,13 @@ __metadata:
languageName: node
linkType: hard
+"internmap@npm:^1.0.0":
+ version: 1.0.1
+ resolution: "internmap@npm:1.0.1"
+ checksum: 10c0/60942be815ca19da643b6d4f23bd0bf4e8c97abbd080fb963fe67583b60bdfb3530448ad4486bae40810e92317bded9995cc31411218acc750d72cd4e8646eee
+ languageName: node
+ linkType: hard
+
"interpret@npm:^1.0.0":
version: 1.4.0
resolution: "interpret@npm:1.4.0"
@@ -14157,6 +14893,13 @@ __metadata:
languageName: node
linkType: hard
+"json-schema@npm:^0.4.0":
+ version: 0.4.0
+ resolution: "json-schema@npm:0.4.0"
+ checksum: 10c0/d4a637ec1d83544857c1c163232f3da46912e971d5bf054ba44fdb88f07d8d359a462b4aec46f2745efbc57053365608d88bc1d7b1729f7b4fc3369765639ed3
+ languageName: node
+ linkType: hard
+
"json-stable-stringify-without-jsonify@npm:^1.0.1":
version: 1.0.1
resolution: "json-stable-stringify-without-jsonify@npm:1.0.1"
@@ -14212,6 +14955,17 @@ __metadata:
languageName: node
linkType: hard
+"katex@npm:^0.16.0, katex@npm:^0.16.22":
+ version: 0.16.22
+ resolution: "katex@npm:0.16.22"
+ dependencies:
+ commander: "npm:^8.3.0"
+ bin:
+ katex: cli.js
+ checksum: 10c0/07b8b1f07ae53171b5f1ea0cf6f18841d2055825c8b11cd81cfe039afcd3af2cfc84ad033531ee3875088329105195b039c267e0dd4b0c237807e3c3b2009913
+ languageName: node
+ linkType: hard
+
"keyv@npm:^4.0.0, keyv@npm:^4.5.3":
version: 4.5.4
resolution: "keyv@npm:4.5.4"
@@ -14221,6 +14975,13 @@ __metadata:
languageName: node
linkType: hard
+"khroma@npm:^2.1.0":
+ version: 2.1.0
+ resolution: "khroma@npm:2.1.0"
+ checksum: 10c0/634d98753ff5d2540491cafeb708fc98de0d43f4e6795256d5c8f6e3ad77de93049ea41433928fda3697adf7bbe6fe27351858f6d23b78f8b5775ef314c59891
+ languageName: node
+ linkType: hard
+
"kind-of@npm:^6.0.0, kind-of@npm:^6.0.2":
version: 6.0.3
resolution: "kind-of@npm:6.0.3"
@@ -14242,6 +15003,13 @@ __metadata:
languageName: node
linkType: hard
+"kolorist@npm:^1.8.0":
+ version: 1.8.0
+ resolution: "kolorist@npm:1.8.0"
+ checksum: 10c0/73075db44a692bf6c34a649f3b4b3aea4993b84f6b754cbf7a8577e7c7db44c0bad87752bd23b0ce533f49de2244ce2ce03b7b1b667a85ae170a94782cc50f9b
+ languageName: node
+ linkType: hard
+
"kuler@npm:^2.0.0":
version: 2.0.0
resolution: "kuler@npm:2.0.0"
@@ -14249,6 +15017,19 @@ __metadata:
languageName: node
linkType: hard
+"langium@npm:3.3.1":
+ version: 3.3.1
+ resolution: "langium@npm:3.3.1"
+ dependencies:
+ chevrotain: "npm:~11.0.3"
+ chevrotain-allstar: "npm:~0.3.0"
+ vscode-languageserver: "npm:~9.0.1"
+ vscode-languageserver-textdocument: "npm:~1.0.11"
+ vscode-uri: "npm:~3.0.8"
+ checksum: 10c0/0c54803068addb0f7c16a57fdb2db2e5d4d9a21259d477c3c7d0587c2c2f65a313f9eeef3c95ac1c2e41cd11d4f2eaf620d2c03fe839a3350ffee59d2b4c7647
+ languageName: node
+ linkType: hard
+
"latest-version@npm:^7.0.0":
version: 7.0.0
resolution: "latest-version@npm:7.0.0"
@@ -14268,6 +15049,20 @@ __metadata:
languageName: node
linkType: hard
+"layout-base@npm:^1.0.0":
+ version: 1.0.2
+ resolution: "layout-base@npm:1.0.2"
+ checksum: 10c0/2a55d0460fd9f6ed53d7e301b9eb3dea19bda03815d616a40665ce6dc75c1f4d62e1ca19a897da1cfaf6de1b91de59cd6f2f79ba1258f3d7fccc7d46ca7f3337
+ languageName: node
+ linkType: hard
+
+"layout-base@npm:^2.0.0":
+ version: 2.0.1
+ resolution: "layout-base@npm:2.0.1"
+ checksum: 10c0/a44df9ef3cbff9916a10f616635e22b5787c89fa62b2fec6f99e8e6ee512c7cebd22668ce32dab5a83c934ba0a309c51a678aa0b40d70853de6c357893c0a88b
+ languageName: node
+ linkType: hard
+
"lazy-val@npm:^1.0.5":
version: 1.0.5
resolution: "lazy-val@npm:1.0.5"
@@ -14458,6 +15253,17 @@ __metadata:
languageName: node
linkType: hard
+"local-pkg@npm:^1.1.1":
+ version: 1.1.2
+ resolution: "local-pkg@npm:1.1.2"
+ dependencies:
+ mlly: "npm:^1.7.4"
+ pkg-types: "npm:^2.3.0"
+ quansync: "npm:^0.2.11"
+ checksum: 10c0/1bcfcc5528dea95cba3caa478126a348d3985aad9f69ecf7802c13efef90897e1c5ff7851974332c5e6d4a4698efe610fef758a068c8bc3feb5322aeb35d5993
+ languageName: node
+ linkType: hard
+
"locate-path@npm:^3.0.0":
version: 3.0.0
resolution: "locate-path@npm:3.0.0"
@@ -14486,6 +15292,13 @@ __metadata:
languageName: node
linkType: hard
+"lodash-es@npm:4.17.21, lodash-es@npm:^4.17.21":
+ version: 4.17.21
+ resolution: "lodash-es@npm:4.17.21"
+ checksum: 10c0/fb407355f7e6cd523a9383e76e6b455321f0f153a6c9625e21a8827d10c54c2a2341bd2ae8d034358b60e07325e1330c14c224ff582d04612a46a4f0479ff2f2
+ languageName: node
+ linkType: hard
+
"lodash.debounce@npm:^4.0.8":
version: 4.0.8
resolution: "lodash.debounce@npm:4.0.8"
@@ -14657,6 +15470,15 @@ __metadata:
languageName: node
linkType: hard
+"lucide-react@npm:^0.542.0":
+ version: 0.542.0
+ resolution: "lucide-react@npm:0.542.0"
+ peerDependencies:
+ react: ^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0
+ checksum: 10c0/3ccdb898a480f0194f93abef0b6f0347bc2647db24051e65a17c1dbd22ca0d10a7636d9af68c92665b42b6c9e761567a0d65944fe8568fb0e30a7ea57281ccec
+ languageName: node
+ linkType: hard
+
"lz-string@npm:^1.5.0":
version: 1.5.0
resolution: "lz-string@npm:1.5.0"
@@ -14794,6 +15616,24 @@ __metadata:
languageName: node
linkType: hard
+"marked@npm:^15.0.7":
+ version: 15.0.12
+ resolution: "marked@npm:15.0.12"
+ bin:
+ marked: bin/marked.js
+ checksum: 10c0/e09da211544b787ecfb25fed07af206060bf7cd6d9de6cb123f15c496a57f83b7aabea93340aaa94dae9c94e097ae129377cad6310abc16009590972e85f4212
+ languageName: node
+ linkType: hard
+
+"marked@npm:^16.2.1":
+ version: 16.2.1
+ resolution: "marked@npm:16.2.1"
+ bin:
+ marked: bin/marked.js
+ checksum: 10c0/0f4a259c8a7884a14830f06b4b44c0e5fab27924102f2713606e0627c8473d130ab9070900a40c18b0e8d39ba2f31194ce6931999ba4c7bd3b6240f6cfe9cbb3
+ languageName: node
+ linkType: hard
+
"matcher@npm:^3.0.0":
version: 3.0.0
resolution: "matcher@npm:3.0.0"
@@ -14968,6 +15808,21 @@ __metadata:
languageName: node
linkType: hard
+"mdast-util-math@npm:^3.0.0":
+ version: 3.0.0
+ resolution: "mdast-util-math@npm:3.0.0"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ "@types/mdast": "npm:^4.0.0"
+ devlop: "npm:^1.0.0"
+ longest-streak: "npm:^3.0.0"
+ mdast-util-from-markdown: "npm:^2.0.0"
+ mdast-util-to-markdown: "npm:^2.1.0"
+ unist-util-remove-position: "npm:^5.0.0"
+ checksum: 10c0/d4e839e38719f26872ed78aac18339805a892f1b56585a9cb8668f34e221b4f0660b9dfe49ec96dbbe79fd1b63b648608a64046d8286bcd2f9d576e80b48a0a1
+ languageName: node
+ linkType: hard
+
"mdast-util-mdx-expression@npm:^2.0.0":
version: 2.0.1
resolution: "mdast-util-mdx-expression@npm:2.0.1"
@@ -15056,7 +15911,7 @@ __metadata:
languageName: node
linkType: hard
-"mdast-util-to-markdown@npm:^2.0.0":
+"mdast-util-to-markdown@npm:^2.0.0, mdast-util-to-markdown@npm:^2.1.0":
version: 2.1.2
resolution: "mdast-util-to-markdown@npm:2.1.2"
dependencies:
@@ -15156,6 +16011,34 @@ __metadata:
languageName: node
linkType: hard
+"mermaid@npm:^11.11.0":
+ version: 11.11.0
+ resolution: "mermaid@npm:11.11.0"
+ dependencies:
+ "@braintree/sanitize-url": "npm:^7.0.4"
+ "@iconify/utils": "npm:^3.0.1"
+ "@mermaid-js/parser": "npm:^0.6.2"
+ "@types/d3": "npm:^7.4.3"
+ cytoscape: "npm:^3.29.3"
+ cytoscape-cose-bilkent: "npm:^4.1.0"
+ cytoscape-fcose: "npm:^2.2.0"
+ d3: "npm:^7.9.0"
+ d3-sankey: "npm:^0.12.3"
+ dagre-d3-es: "npm:7.0.11"
+ dayjs: "npm:^1.11.13"
+ dompurify: "npm:^3.2.5"
+ katex: "npm:^0.16.22"
+ khroma: "npm:^2.1.0"
+ lodash-es: "npm:^4.17.21"
+ marked: "npm:^15.0.7"
+ roughjs: "npm:^4.6.6"
+ stylis: "npm:^4.3.6"
+ ts-dedent: "npm:^2.2.0"
+ uuid: "npm:^11.1.0"
+ checksum: 10c0/92f54ea525ab8ea1086dd0f32eff37e1e8c97d88533b295bda957c09dbe6d5a3ad94176fa52af891435c902839900a0b968e17a424b4ccb3567d7a072061e5ab
+ languageName: node
+ linkType: hard
+
"methods@npm:~1.1.2":
version: 1.1.2
resolution: "methods@npm:1.1.2"
@@ -15307,6 +16190,21 @@ __metadata:
languageName: node
linkType: hard
+"micromark-extension-math@npm:^3.0.0":
+ version: 3.1.0
+ resolution: "micromark-extension-math@npm:3.1.0"
+ dependencies:
+ "@types/katex": "npm:^0.16.0"
+ devlop: "npm:^1.0.0"
+ katex: "npm:^0.16.0"
+ micromark-factory-space: "npm:^2.0.0"
+ micromark-util-character: "npm:^2.0.0"
+ micromark-util-symbol: "npm:^2.0.0"
+ micromark-util-types: "npm:^2.0.0"
+ checksum: 10c0/56e6f2185a4613f9d47e7e98cf8605851c990957d9229c942b005e286c8087b61dc9149448d38b2f8be6d42cc6a64aad7e1f2778ddd86fbbb1a2f48a3ca1872f
+ languageName: node
+ linkType: hard
+
"micromark-extension-mdx-expression@npm:^3.0.0":
version: 3.0.0
resolution: "micromark-extension-mdx-expression@npm:3.0.0"
@@ -15969,6 +16867,18 @@ __metadata:
languageName: node
linkType: hard
+"mlly@npm:^1.7.4":
+ version: 1.8.0
+ resolution: "mlly@npm:1.8.0"
+ dependencies:
+ acorn: "npm:^8.15.0"
+ pathe: "npm:^2.0.3"
+ pkg-types: "npm:^1.3.1"
+ ufo: "npm:^1.6.1"
+ checksum: 10c0/f174b844ae066c71e9b128046677868e2e28694f0bbeeffbe760b2a9d8ff24de0748d0fde6fabe706700c1d2e11d3c0d7a53071b5ea99671592fac03364604ab
+ languageName: node
+ linkType: hard
+
"monaco-editor@npm:^0.52.2":
version: 0.52.2
resolution: "monaco-editor@npm:0.52.2"
@@ -16465,6 +17375,24 @@ __metadata:
languageName: node
linkType: hard
+"oniguruma-parser@npm:^0.12.1":
+ version: 0.12.1
+ resolution: "oniguruma-parser@npm:0.12.1"
+ checksum: 10c0/b843ea54cda833efb19f856314afcbd43e903ece3de489ab78c527ddec84859208052557daa9fad4bdba89ebdd15b0cc250de86b3daf8c7cbe37bac5a6a185d3
+ languageName: node
+ linkType: hard
+
+"oniguruma-to-es@npm:^4.3.3":
+ version: 4.3.3
+ resolution: "oniguruma-to-es@npm:4.3.3"
+ dependencies:
+ oniguruma-parser: "npm:^0.12.1"
+ regex: "npm:^6.0.1"
+ regex-recursion: "npm:^6.0.2"
+ checksum: 10c0/bc034e84dfee4dbc061cf6364023e66e1667fb8dc3afcad3b7d6a2c77e2d4a4809396ee2fb8c1fd3d6f00f76f7ca14b773586bf862c5f0c0074c059e2a219252
+ languageName: node
+ linkType: hard
+
"open@npm:^8.0.4, open@npm:^8.0.9, open@npm:^8.4.0":
version: 8.4.2
resolution: "open@npm:8.4.2"
@@ -16653,6 +17581,13 @@ __metadata:
languageName: node
linkType: hard
+"package-manager-detector@npm:^1.3.0":
+ version: 1.3.0
+ resolution: "package-manager-detector@npm:1.3.0"
+ checksum: 10c0/b4b54a81a3230edd66564a59ff6a2233086961e36ba91a28a0f6d6932a8dec36618ace50e8efec9c4d8c6aa9828e98814557a39fb6b106c161434ccb44a80e1c
+ languageName: node
+ linkType: hard
+
"papaparse@npm:^5.5.3":
version: 5.5.3
resolution: "papaparse@npm:5.5.3"
@@ -16777,6 +17712,13 @@ __metadata:
languageName: node
linkType: hard
+"path-data-parser@npm:0.1.0, path-data-parser@npm:^0.1.0":
+ version: 0.1.0
+ resolution: "path-data-parser@npm:0.1.0"
+ checksum: 10c0/ba22d54669a8bc4a3df27431fe667900685585d1196085b803d0aa4066b83e709bbf2be7c1d2b56e706b49cc698231d55947c22abbfc4843ca424bbf8c985745
+ languageName: node
+ linkType: hard
+
"path-exists@npm:^3.0.0":
version: 3.0.0
resolution: "path-exists@npm:3.0.0"
@@ -16866,7 +17808,7 @@ __metadata:
languageName: node
linkType: hard
-"pathe@npm:^2.0.3":
+"pathe@npm:^2.0.1, pathe@npm:^2.0.3":
version: 2.0.3
resolution: "pathe@npm:2.0.3"
checksum: 10c0/c118dc5a8b5c4166011b2b70608762e260085180bb9e33e80a50dcdb1e78c010b1624f4280c492c92b05fc276715a4c357d1f9edc570f8f1b3d90b6839ebaca1
@@ -16931,6 +17873,28 @@ __metadata:
languageName: node
linkType: hard
+"pkg-types@npm:^1.3.1":
+ version: 1.3.1
+ resolution: "pkg-types@npm:1.3.1"
+ dependencies:
+ confbox: "npm:^0.1.8"
+ mlly: "npm:^1.7.4"
+ pathe: "npm:^2.0.1"
+ checksum: 10c0/19e6cb8b66dcc66c89f2344aecfa47f2431c988cfa3366bdfdcfb1dd6695f87dcce37fbd90fe9d1605e2f4440b77f391e83c23255347c35cf84e7fd774d7fcea
+ languageName: node
+ linkType: hard
+
+"pkg-types@npm:^2.3.0":
+ version: 2.3.0
+ resolution: "pkg-types@npm:2.3.0"
+ dependencies:
+ confbox: "npm:^0.2.2"
+ exsolve: "npm:^1.0.7"
+ pathe: "npm:^2.0.3"
+ checksum: 10c0/d2bbddc5b81bd4741e1529c08ef4c5f1542bbdcf63498b73b8e1d84cff71806d1b8b1577800549bb569cb7aa20056257677b979bff48c97967cba7e64f72ae12
+ languageName: node
+ linkType: hard
+
"pkg-up@npm:^3.1.0":
version: 3.1.0
resolution: "pkg-up@npm:3.1.0"
@@ -16965,6 +17929,23 @@ __metadata:
languageName: node
linkType: hard
+"points-on-curve@npm:0.2.0, points-on-curve@npm:^0.2.0":
+ version: 0.2.0
+ resolution: "points-on-curve@npm:0.2.0"
+ checksum: 10c0/f0d92343fcc2ad1f48334633e580574c1e0e28038a756133e171e537f270d6d64203feada5ee556e36f448a1b46e0306dee07b30f589f4e3ad720f6ee38ef48c
+ languageName: node
+ linkType: hard
+
+"points-on-path@npm:^0.2.1":
+ version: 0.2.1
+ resolution: "points-on-path@npm:0.2.1"
+ dependencies:
+ path-data-parser: "npm:0.1.0"
+ points-on-curve: "npm:0.2.0"
+ checksum: 10c0/a7010340f9f196976f61838e767bb7b0b7f6273ab4fb9eb37c61001fe26fbfc3fcd63c96d5e85b9a4ab579213ab366f2ddaaf60e2a9253e2b91a62db33f395ba
+ languageName: node
+ linkType: hard
+
"polished@npm:^4.2.2":
version: 4.3.1
resolution: "polished@npm:4.3.1"
@@ -18026,6 +19007,13 @@ __metadata:
languageName: node
linkType: hard
+"property-information@npm:^7.0.0":
+ version: 7.1.0
+ resolution: "property-information@npm:7.1.0"
+ checksum: 10c0/e0fe22cff26103260ad0e82959229106563fa115a54c4d6c183f49d88054e489cc9f23452d3ad584179dc13a8b7b37411a5df873746b5e4086c865874bfa968e
+ languageName: node
+ linkType: hard
+
"proto-list@npm:~1.2.1":
version: 1.2.4
resolution: "proto-list@npm:1.2.4"
@@ -18078,6 +19066,13 @@ __metadata:
languageName: node
linkType: hard
+"quansync@npm:^0.2.11":
+ version: 0.2.11
+ resolution: "quansync@npm:0.2.11"
+ checksum: 10c0/cb9a1f8ebce074069f2f6a78578873ffedd9de9f6aa212039b44c0870955c04a71c3b1311b5d97f8ac2f2ec476de202d0a5c01160cb12bc0a11b7ef36d22ef56
+ languageName: node
+ linkType: hard
+
"queue-microtask@npm:^1.2.2":
version: 1.2.3
resolution: "queue-microtask@npm:1.2.3"
@@ -18863,6 +19858,31 @@ __metadata:
languageName: node
linkType: hard
+"regex-recursion@npm:^6.0.2":
+ version: 6.0.2
+ resolution: "regex-recursion@npm:6.0.2"
+ dependencies:
+ regex-utilities: "npm:^2.3.0"
+ checksum: 10c0/68e8b6889680e904b75d7f26edaf70a1a4dc1087406bff53face4c2929d918fd77c72223843fe816ac8ed9964f96b4160650e8d5909e26a998c6e9de324dadb1
+ languageName: node
+ linkType: hard
+
+"regex-utilities@npm:^2.3.0":
+ version: 2.3.0
+ resolution: "regex-utilities@npm:2.3.0"
+ checksum: 10c0/78c550a80a0af75223244fff006743922591bd8f61d91fef7c86b9b56cf9bbf8ee5d7adb6d8991b5e304c57c90103fc4818cf1e357b11c6c669b782839bd7893
+ languageName: node
+ linkType: hard
+
+"regex@npm:^6.0.1":
+ version: 6.0.1
+ resolution: "regex@npm:6.0.1"
+ dependencies:
+ regex-utilities: "npm:^2.3.0"
+ checksum: 10c0/687b3e063d4ca19b0de7c55c24353f868a0fb9ba21512692470d2fb412e3a410894dd5924c91ea49d8cb8fa865e36ec956e52436ae0a256bdc095ff136c30aba
+ languageName: node
+ linkType: hard
+
"regexpu-core@npm:^6.1.1":
version: 6.1.1
resolution: "regexpu-core@npm:6.1.1"
@@ -18926,6 +19946,21 @@ __metadata:
languageName: node
linkType: hard
+"rehype-katex@npm:^7.0.1":
+ version: 7.0.1
+ resolution: "rehype-katex@npm:7.0.1"
+ dependencies:
+ "@types/hast": "npm:^3.0.0"
+ "@types/katex": "npm:^0.16.0"
+ hast-util-from-html-isomorphic: "npm:^2.0.0"
+ hast-util-to-text: "npm:^4.0.0"
+ katex: "npm:^0.16.0"
+ unist-util-visit-parents: "npm:^6.0.0"
+ vfile: "npm:^6.0.0"
+ checksum: 10c0/73c770319536128b75055d904d06951789d00a0552c11724c0dac2e244dcb21041630552d118a11cc42233fdcd1bfee525e78a0020fde635bd916cceb281dfb1
+ languageName: node
+ linkType: hard
+
"rehype-raw@npm:^7.0.0":
version: 7.0.0
resolution: "rehype-raw@npm:7.0.0"
@@ -19437,6 +20472,18 @@ __metadata:
languageName: node
linkType: hard
+"remark-math@npm:^6.0.0":
+ version: 6.0.0
+ resolution: "remark-math@npm:6.0.0"
+ dependencies:
+ "@types/mdast": "npm:^4.0.0"
+ mdast-util-math: "npm:^3.0.0"
+ micromark-extension-math: "npm:^3.0.0"
+ unified: "npm:^11.0.0"
+ checksum: 10c0/859613c4db194bb6b3c9c063661dc52b8ceda9c5cf3256b42f73d93eb8f38a6d634eb5f976fe094425f6f1035aaf329eb49ada314feb3b2b1073326b6d3aaa02
+ languageName: node
+ linkType: hard
+
"remark-mdx@npm:^3.0.0, remark-mdx@npm:^3.1.0":
version: 3.1.0
resolution: "remark-mdx@npm:3.1.0"
@@ -19858,6 +20905,18 @@ __metadata:
languageName: node
linkType: hard
+"roughjs@npm:^4.6.6":
+ version: 4.6.6
+ resolution: "roughjs@npm:4.6.6"
+ dependencies:
+ hachure-fill: "npm:^0.5.2"
+ path-data-parser: "npm:^0.1.0"
+ points-on-curve: "npm:^0.2.0"
+ points-on-path: "npm:^0.2.1"
+ checksum: 10c0/68c11bf4516aa014cef2fe52426a9bab237c2f500d13e1a4f13b523cb5723667bf2d92b9619325efdc5bc2a193588ff5af8d51683df17cfb8720e96fe2b92b0c
+ languageName: node
+ linkType: hard
+
"rtlcss@npm:^4.1.0":
version: 4.3.0
resolution: "rtlcss@npm:4.3.0"
@@ -20346,6 +21405,22 @@ __metadata:
languageName: node
linkType: hard
+"shiki@npm:^3.12.2":
+ version: 3.12.2
+ resolution: "shiki@npm:3.12.2"
+ dependencies:
+ "@shikijs/core": "npm:3.12.2"
+ "@shikijs/engine-javascript": "npm:3.12.2"
+ "@shikijs/engine-oniguruma": "npm:3.12.2"
+ "@shikijs/langs": "npm:3.12.2"
+ "@shikijs/themes": "npm:3.12.2"
+ "@shikijs/types": "npm:3.12.2"
+ "@shikijs/vscode-textmate": "npm:^10.0.2"
+ "@types/hast": "npm:^3.0.4"
+ checksum: 10c0/86e6c6a83807069f8e6e0d5e69a301fc74fff6abd92d490a9dc59066aac2e7a8582bf0742f118ab89f4ec88965a6e8946c5deb594e4ecb8edf07d95071967ace
+ languageName: node
+ linkType: hard
+
"side-channel@npm:^1.0.6":
version: 1.0.6
resolution: "side-channel@npm:1.0.6"
@@ -20776,6 +21851,28 @@ __metadata:
languageName: node
linkType: hard
+"streamdown@npm:^1.0.11":
+ version: 1.2.0
+ resolution: "streamdown@npm:1.2.0"
+ dependencies:
+ clsx: "npm:^2.1.1"
+ harden-react-markdown: "npm:^1.0.5"
+ katex: "npm:^0.16.22"
+ lucide-react: "npm:^0.542.0"
+ marked: "npm:^16.2.1"
+ mermaid: "npm:^11.11.0"
+ react-markdown: "npm:^10.1.0"
+ rehype-katex: "npm:^7.0.1"
+ remark-gfm: "npm:^4.0.1"
+ remark-math: "npm:^6.0.0"
+ shiki: "npm:^3.12.2"
+ tailwind-merge: "npm:^3.3.1"
+ peerDependencies:
+ react: ^18.0.0 || ^19.0.0
+ checksum: 10c0/7679766c02104cc919d8ae0d346059e47aeafd3991242c3d1d87e8472d545d12df0f8d428128ff07ffce2d87c018a5d0f93700521db75d571810bf7426b7fa75
+ languageName: node
+ linkType: hard
+
"streamx@npm:^2.15.0, streamx@npm:^2.20.0":
version: 2.20.1
resolution: "streamx@npm:2.20.1"
@@ -20980,6 +22077,13 @@ __metadata:
languageName: node
linkType: hard
+"stylis@npm:^4.3.6":
+ version: 4.3.6
+ resolution: "stylis@npm:4.3.6"
+ checksum: 10c0/e736d484983a34f7c65d362c67dc79b7bce388054b261c2b7b23d02eaaf280617033f65d44b1ea341854f4331a5074b885668ac8741f98c13a6cfd6443ae85d0
+ languageName: node
+ linkType: hard
+
"sumchecker@npm:^3.0.1":
version: 3.0.1
resolution: "sumchecker@npm:3.0.1"
@@ -21045,6 +22149,18 @@ __metadata:
languageName: node
linkType: hard
+"swr@npm:^2.2.5":
+ version: 2.3.6
+ resolution: "swr@npm:2.3.6"
+ dependencies:
+ dequal: "npm:^2.0.3"
+ use-sync-external-store: "npm:^1.4.0"
+ peerDependencies:
+ react: ^16.11.0 || ^17.0.0 || ^18.0.0 || ^19.0.0
+ checksum: 10c0/9534f350982e36a3ae0a13da8c0f7da7011fc979e77f306e60c4e5db0f9b84f17172c44f973441ba56bb684b69b0d9838ab40011a6b6b3e32d0cd7f3d5405f99
+ languageName: node
+ linkType: hard
+
"synckit@npm:^0.11.8":
version: 0.11.11
resolution: "synckit@npm:0.11.11"
@@ -21264,6 +22380,13 @@ __metadata:
languageName: node
linkType: hard
+"throttleit@npm:2.1.0":
+ version: 2.1.0
+ resolution: "throttleit@npm:2.1.0"
+ checksum: 10c0/1696ae849522cea6ba4f4f3beac1f6655d335e51b42d99215e196a718adced0069e48deaaf77f7e89f526ab31de5b5c91016027da182438e6f9280be2f3d5265
+ languageName: node
+ linkType: hard
+
"thunky@npm:^1.0.2":
version: 1.1.0
resolution: "thunky@npm:1.1.0"
@@ -21322,6 +22445,13 @@ __metadata:
languageName: node
linkType: hard
+"tinyexec@npm:^1.0.1":
+ version: 1.0.1
+ resolution: "tinyexec@npm:1.0.1"
+ checksum: 10c0/e1ec3c8194a0427ce001ba69fd933d0c957e2b8994808189ed8020d3e0c01299aea8ecf0083cc514ecbf90754695895f2b5c0eac07eb2d0c406f7d4fbb8feade
+ languageName: node
+ linkType: hard
+
"tinyglobby@npm:^0.2.13, tinyglobby@npm:^0.2.14":
version: 0.2.14
resolution: "tinyglobby@npm:0.2.14"
@@ -21709,6 +22839,13 @@ __metadata:
languageName: node
linkType: hard
+"ufo@npm:^1.6.1":
+ version: 1.6.1
+ resolution: "ufo@npm:1.6.1"
+ checksum: 10c0/5a9f041e5945fba7c189d5410508cbcbefef80b253ed29aa2e1f8a2b86f4bd51af44ee18d4485e6d3468c92be9bf4a42e3a2b72dcaf27ce39ce947ec994f1e6b
+ languageName: node
+ linkType: hard
+
"undici-types@npm:~6.19.8":
version: 6.19.8
resolution: "undici-types@npm:6.19.8"
@@ -21951,6 +23088,16 @@ __metadata:
languageName: node
linkType: hard
+"unist-util-remove-position@npm:^5.0.0":
+ version: 5.0.0
+ resolution: "unist-util-remove-position@npm:5.0.0"
+ dependencies:
+ "@types/unist": "npm:^3.0.0"
+ unist-util-visit: "npm:^5.0.0"
+ checksum: 10c0/e8c76da4399446b3da2d1c84a97c607b37d03d1d92561e14838cbe4fdcb485bfc06c06cfadbb808ccb72105a80643976d0660d1fe222ca372203075be9d71105
+ languageName: node
+ linkType: hard
+
"unist-util-stringify-position@npm:^4.0.0":
version: 4.0.0
resolution: "unist-util-stringify-position@npm:4.0.0"
@@ -22145,6 +23292,15 @@ __metadata:
languageName: node
linkType: hard
+"uuid@npm:^11.1.0":
+ version: 11.1.0
+ resolution: "uuid@npm:11.1.0"
+ bin:
+ uuid: dist/esm/bin/uuid
+ checksum: 10c0/34aa51b9874ae398c2b799c88a127701408cd581ee89ec3baa53509dd8728cbb25826f2a038f9465f8b7be446f0fbf11558862965b18d21c993684297628d4d3
+ languageName: node
+ linkType: hard
+
"uuid@npm:^8.3.2":
version: 8.3.2
resolution: "uuid@npm:8.3.2"
@@ -22594,7 +23750,7 @@ __metadata:
languageName: node
linkType: hard
-"vscode-languageserver-protocol@npm:^3.0.0, vscode-languageserver-protocol@npm:^3.17.5":
+"vscode-languageserver-protocol@npm:3.17.5, vscode-languageserver-protocol@npm:^3.0.0, vscode-languageserver-protocol@npm:^3.17.5":
version: 3.17.5
resolution: "vscode-languageserver-protocol@npm:3.17.5"
dependencies:
@@ -22604,7 +23760,7 @@ __metadata:
languageName: node
linkType: hard
-"vscode-languageserver-textdocument@npm:^1.0.0, vscode-languageserver-textdocument@npm:^1.0.11":
+"vscode-languageserver-textdocument@npm:^1.0.0, vscode-languageserver-textdocument@npm:^1.0.11, vscode-languageserver-textdocument@npm:~1.0.11":
version: 1.0.12
resolution: "vscode-languageserver-textdocument@npm:1.0.12"
checksum: 10c0/534349894b059602c4d97615a1147b6c4c031141c2093e59657f54e38570f5989c21b376836f13b9375419869242e9efb4066643208b21ab1e1dee111a0f00fb
@@ -22618,7 +23774,18 @@ __metadata:
languageName: node
linkType: hard
-"vscode-uri@npm:^3.0.0, vscode-uri@npm:^3.0.8":
+"vscode-languageserver@npm:~9.0.1":
+ version: 9.0.1
+ resolution: "vscode-languageserver@npm:9.0.1"
+ dependencies:
+ vscode-languageserver-protocol: "npm:3.17.5"
+ bin:
+ installServerIntoExtension: bin/installServerIntoExtension
+ checksum: 10c0/8a0838d77c98a211c76e54bd3a6249fc877e4e1a73322673fb0e921168d8e91de4f170f1d4ff7e8b6289d0698207afc6aba6662d4c1cd8e4bd7cae96afd6b0c2
+ languageName: node
+ linkType: hard
+
+"vscode-uri@npm:^3.0.0, vscode-uri@npm:^3.0.8, vscode-uri@npm:~3.0.8":
version: 3.0.8
resolution: "vscode-uri@npm:3.0.8"
checksum: 10c0/f7f217f526bf109589969fe6e66b71e70b937de1385a1d7bb577ca3ee7c5e820d3856a86e9ff2fa9b7a0bc56a3dd8c3a9a557d3fedd7df414bc618d5e6b567f9
@@ -22694,6 +23861,7 @@ __metadata:
version: 0.0.0-use.local
resolution: "waveterm@workspace:."
dependencies:
+ "@ai-sdk/react": "npm:^2.0.18"
"@chromatic-com/storybook": "npm:^3.2.7"
"@eslint/js": "npm:^8.57.0"
"@floating-ui/react": "npm:^0.27.16"
@@ -22741,6 +23909,7 @@ __metadata:
"@xterm/addon-web-links": "npm:^0.11.0"
"@xterm/addon-webgl": "npm:^0.18.0"
"@xterm/xterm": "npm:^5.5.0"
+ ai: "npm:^5.0.18"
base64-js: "npm:^1.5.1"
class-variance-authority: "npm:^0.7.1"
clsx: "npm:^2.1.1"
@@ -22799,6 +23968,7 @@ __metadata:
sprintf-js: "npm:^1.1.3"
storybook: "npm:^8.6.14"
storybook-dark-mode: "npm:^4.0.2"
+ streamdown: "npm:^1.0.11"
tailwind-merge: "npm:^3.3.1"
tailwindcss: "npm:^4.1.12"
tailwindcss-animate: "npm:^1.0.7"
@@ -22819,6 +23989,7 @@ __metadata:
winston: "npm:^3.17.0"
ws: "npm:^8.18.3"
yaml: "npm:^2.7.1"
+ zod: "npm:^4.0.17"
languageName: unknown
linkType: soft
@@ -23385,7 +24556,14 @@ __metadata:
languageName: node
linkType: hard
-"zwitch@npm:^2.0.0":
+"zod@npm:^4.0.17":
+ version: 4.1.8
+ resolution: "zod@npm:4.1.8"
+ checksum: 10c0/5eae39da09d7bd0564a30dfd2348811e4e2e7dd15955d8f3444f8e196f35e5422b1482eda234b722fafb0738f4a8b718adb042b860936bfdd2cc19cdbdac8a9a
+ languageName: node
+ linkType: hard
+
+"zwitch@npm:^2.0.0, zwitch@npm:^2.0.4":
version: 2.0.4
resolution: "zwitch@npm:2.0.4"
checksum: 10c0/3c7830cdd3378667e058ffdb4cf2bb78ac5711214e2725900873accb23f3dfe5f9e7e5a06dcdc5f29605da976fc45c26d9a13ca334d6eea2245a15e77b8fc06e