
Feature: Multi-LLM Provider Support #27

Merged
ShirleyRex merged 10 commits into master from feat/multi-llm-support
Mar 15, 2026

Conversation

@bossfelfo
Collaborator

Description

Refactors the application to support multiple LLM providers (OpenAI, Anthropic, Gemini) and removes the .env requirement by migrating to client-side API key management.

Changes

  • Abstracts LLM Providers: Implemented a generic LLMProvider interface (lib/llm/types.ts) with distinct strategy adapters for OpenAI, Anthropic, and Gemini.
  • Client-Side Key Management: Removed .env.example and hardcoded key references. API keys are now collected in the UI, stored in localStorage, and passed dynamically in the request payload.
  • Error Normalization: Added normalizeProviderError to intercept and safely extract human-readable messages from deeply nested JSON payloads across all three vendor SDKs before displaying them in UI Toasts.
  • Gemini Stabilisation:
    • Bound the provider to the explicitly defined v1beta endpoint to support the 2.x models.
    • Stripped out the systemInstruction config field (which is unstable across API versions); the adapter instead prepends system instructions to the user messages to guarantee cross-model compatibility.
  • Model Registry Updates: Bumped default models to current standards (gpt-4o-mini, claude-3-5-sonnet-latest, gemini-2.5-flash).
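The provider abstraction and the Gemini system-prompt workaround described above might look roughly like the following sketch. Only the `LLMProvider` name and the `lib/llm/types.ts` location come from this PR; the message shape, method signature, and `foldSystemIntoUser` helper are assumptions for illustration:

```typescript
// Hypothetical sketch of the LLMProvider abstraction (lib/llm/types.ts).
export type ProviderId = "openai" | "anthropic" | "gemini";

export interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

export interface LLMProvider {
  readonly id: ProviderId;
  // The caller supplies the API key per request (client-side key management).
  complete(messages: ChatMessage[], apiKey: string, model: string): Promise<string>;
}

// Gemini's systemInstruction field is unstable across versions, so a Gemini
// adapter could fold system prompts into the first user message instead.
export function foldSystemIntoUser(messages: ChatMessage[]): ChatMessage[] {
  const system = messages
    .filter((m) => m.role === "system")
    .map((m) => m.content)
    .join("\n");
  const rest = messages.filter((m) => m.role !== "system");
  if (!system) return rest;
  return rest.map((m, i) =>
    i === 0 && m.role === "user" ? { ...m, content: `${system}\n\n${m.content}` } : m
  );
}
```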

Developer Evidence

[screenshot]

Testing Notes

  • Verified end-to-end classification and generation flows for all three providers.
  • Confirmed the settings UI correctly stores and clears per-provider keys in localStorage.
  • Validated that raw SDK errors (e.g., Anthropic "Insufficient Credits") map to readable frontend toasts without leaking raw HTTP payloads.
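The error-normalization behaviour verified above could be implemented along these lines. Only the `normalizeProviderError` name comes from the PR; the traversal heuristics and key preferences below are assumptions:

```typescript
// Hedged sketch of normalizeProviderError (lib/llm/errors.ts): walk common
// vendor error shapes ({ error: { message } }, { message }, nested arrays)
// and return the first readable string, falling back to a safe default.
export function normalizeProviderError(
  payload: unknown,
  fallback = "Request failed"
): string {
  const seen = new Set<object>();
  const walk = (node: unknown): string | undefined => {
    if (typeof node === "string" && node.trim()) return node;
    if (node === null || typeof node !== "object" || seen.has(node)) return undefined;
    seen.add(node);
    const obj = node as Record<string, unknown>;
    // Prefer explicit message-ish keys before recursing into arrays.
    for (const key of ["message", "error", "detail", "errors"]) {
      if (key in obj) {
        const found = walk(obj[key]);
        if (found) return found;
      }
    }
    if (Array.isArray(node)) {
      for (const item of node) {
        const found = walk(item);
        if (found) return found;
      }
    }
    return undefined;
  };
  return walk(payload) ?? fallback;
}
```

The `seen` set guards against cyclic payloads, and the fallback message ensures the UI never renders a raw HTTP body.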

@vercel

vercel bot commented Mar 12, 2026

The latest updates on your projects.

Project        Deployment   Actions            Updated (UTC)
assertify-io   Ready        Preview, Comment   Mar 14, 2026 0:10am
assertify.io   Ready        Preview, Comment   Mar 14, 2026 0:10am


Copilot AI left a comment

Pull request overview

Refactors the app’s LLM integration to support multiple providers (OpenAI/Anthropic/Gemini), shifts API key entry to the UI (persisted client-side), and normalizes provider errors surfaced to the frontend.

Changes:

  • Introduces an LLMProvider abstraction with provider adapters (OpenAI, Anthropic, Gemini) and shared error normalization.
  • Adds provider/model selection to settings + request payloads; updates API routes to instantiate the selected provider.
  • Removes .env-based API key setup and updates docs accordingly.
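The provider factory mentioned above (exposed from lib/llm/index.ts) could be sketched as follows. The `createProvider` name and default-model table are assumptions; the defaults mirror the model registry bump described in the PR:

```typescript
// Hypothetical provider factory: maps a provider id (and optional model
// override) to a provider descriptor, applying the PR's default models.
type ProviderId = "openai" | "anthropic" | "gemini";

interface ProviderConfig {
  id: ProviderId;
  model: string;
}

// Defaults taken from the "Model Registry Updates" change in this PR.
const DEFAULT_MODELS: Record<ProviderId, string> = {
  openai: "gpt-4o-mini",
  anthropic: "claude-3-5-sonnet-latest",
  gemini: "gemini-2.5-flash",
};

export function createProvider(id: ProviderId, model?: string): ProviderConfig {
  return { id, model: model ?? DEFAULT_MODELS[id] };
}
```

In the API routes, the factory would be called with the provider/model fields from the request payload, then the resulting adapter would be invoked with the caller-supplied key.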

Reviewed changes

Copilot reviewed 16 out of 17 changed files in this pull request and generated 9 comments.

Summary per file:

  • package.json: Adds Anthropic/Gemini SDK dependencies; updates Next and ESLint config versions.
  • package-lock.json: Lockfile updates for the new dependencies and version resolution changes.
  • lib/settings.ts: Extends persisted settings with provider + model, with sanitization defaults.
  • lib/llm/types.ts: Adds provider types/metadata registry and common interfaces.
  • lib/llm/index.ts: Exposes the provider factory and llm exports.
  • lib/llm/openai.ts: OpenAI provider adapter.
  • lib/llm/anthropic.ts: Anthropic provider adapter.
  • lib/llm/gemini.ts: Gemini provider adapter with v1beta binding and system-prompt handling.
  • lib/llm/errors.ts: Adds cross-provider error normalization for UI-safe messages.
  • app/page.tsx: Adds provider/model selector, per-provider API key storage, and request payload updates.
  • app/settings/page.tsx: Adds provider and model selection UI.
  • app/questions/page.tsx: Uses per-provider API key storage; includes provider/model in API requests.
  • app/api/classify/route.ts: Uses the provider factory and normalized errors.
  • app/api/generate/route.ts: Uses the provider factory and normalized errors.
  • app/api/generate-questions/route.ts: Uses the provider factory and normalized errors.
  • README.md: Updates setup instructions for UI-based key entry and multi-LLM support.
  • .env.example: Removed (env-based API key workflow dropped).
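The per-provider API key storage used by app/page.tsx and app/questions/page.tsx could be sketched as below. The storage key prefix and helper names are assumptions; a Storage-like object is injected so the helpers also run outside the browser (in the app, `window.localStorage` would be passed in):

```typescript
// Hypothetical per-provider key storage helpers over a Storage-like store.
type ProviderId = "openai" | "anthropic" | "gemini";

interface KeyStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Hypothetical namespace; each provider gets its own slot.
const PREFIX = "assertify:apiKey:";

export const saveKey = (store: KeyStore, p: ProviderId, key: string): void =>
  store.setItem(PREFIX + p, key);

export const loadKey = (store: KeyStore, p: ProviderId): string | null =>
  store.getItem(PREFIX + p);

export const clearKey = (store: KeyStore, p: ProviderId): void =>
  store.removeItem(PREFIX + p);
```

Note that localStorage stores values in plaintext; keys kept this way are only as safe as the browser profile they live in.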


@ShirleyRex (Owner) left a comment

Looks good

@ShirleyRex ShirleyRex merged commit 6d31e60 into master Mar 15, 2026
4 checks passed