Releases: ApartsinProjects/ModelMesh
v0.2.0 — Developer Experience Release
What's New in v0.2.0
This release focuses on developer experience — making ModelMesh easier to adopt, test, debug, and extend. All features ship with full Python + TypeScript parity.
7 New Features
1. Structured Exception Hierarchy
Catch specific error types instead of bare RuntimeError:
from modelmesh.exceptions import RateLimitError, BudgetExceededError
try:
    response = client.chat.completions.create(...)
except RateLimitError as e:
    print(f"Retry in {e.retry_after}s")
except BudgetExceededError as e:
    print(f"{e.limit_type} limit: ${e.limit_value}")

2. Request/Response Middleware
Intercept every request for logging, transforms, or custom headers:
from modelmesh import Middleware
class LoggingMiddleware(Middleware):
    async def before_request(self, request, context):
        print(f"-> {context.model_id}")
        return request

    async def after_response(self, response, context):
        print(f"<- {response.usage.total_tokens} tokens")
        return response

3. Context Manager Support
Clean resource cleanup with with/async with (Python) or close() (TypeScript):
async with modelmesh.create("chat") as client:
    response = await client.chat.completions.create(...)
# Automatic cleanup

4. Usage Tracking API
Monitor costs and tokens across models and providers:
print(client.usage.total_cost)
print(client.usage.by_model)

5. Mock Testing Client
Test without live APIs — pre-configured responses with call recording:
from modelmesh.testing import mock_client, MockResponse
client = mock_client(responses=[MockResponse(content="Hello!", tokens=10)])

6. Capability Discovery
Explore available capabilities without memorizing paths:
modelmesh.capabilities.resolve("chat-completion")
# -> 'generation.text-generation.chat-completion'
modelmesh.capabilities.search("text")
# -> ['text-embeddings', 'text-generation', ...]

7. Routing Explanation
Debug why a model was selected without making API calls:
explanation = client.explain(model="chat")
print(explanation.selected_model, explanation.reason)

Audit & Quality
- 50+ inconsistencies between code, docs, and tests fixed
- Cross-language parity audit: matching method signatures, parameters, return types
- All new features backward-compatible — existing code works unchanged
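The before_request/after_response hooks in feature 2 follow a standard wrap-the-call pattern. As a standalone illustration, here is a minimal sketch in plain Python with no ModelMesh imports; the dispatch function, the dict-based context, and the AddHeader example are hypothetical, not library internals:

```python
import asyncio

# Minimal middleware chain: each middleware may rewrite the request before
# the call and the response after it. All names here are hypothetical.
class Middleware:
    async def before_request(self, request, context):
        return request

    async def after_response(self, response, context):
        return response

class AddHeader(Middleware):
    # Example middleware: attach a tracing header derived from the context.
    async def before_request(self, request, context):
        return dict(request, headers={"x-trace": context["model_id"]})

async def dispatch(request, context, middlewares, handler):
    for mw in middlewares:                # hooks run in registration order
        request = await mw.before_request(request, context)
    response = await handler(request)
    for mw in reversed(middlewares):      # and unwind in reverse order
        response = await mw.after_response(response, context)
    return response

async def main():
    # Stand-in for the real provider call: just echo the request back.
    async def handler(req):
        return {"echo": req}

    out = await dispatch({"prompt": "hi"}, {"model_id": "chat"},
                         [AddHeader()], handler)
    print(out["echo"]["headers"]["x-trace"])  # -> chat

asyncio.run(main())
```

Running hooks in order on the way in and in reverse order on the way out keeps nesting symmetric, which is the usual convention for middleware chains.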
Testing
- 1,879 tests passing (1,166 Python + 713 TypeScript)
- 63 new Python DX tests + 69 new TypeScript DX tests
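The mock client from feature 5 amounts to two ingredients: a queue of canned responses and a record of every call. A self-contained sketch of that testing pattern, with hypothetical class internals (only MockResponse's content/tokens fields come from the snippet above):

```python
from dataclasses import dataclass, field

@dataclass
class MockResponse:
    # Field names mirror the v0.2.0 snippet; everything else is a sketch.
    content: str
    tokens: int = 0

@dataclass
class MockClient:
    responses: list                            # canned responses, served in order
    calls: list = field(default_factory=list)  # every call, recorded

    def create(self, **kwargs):
        self.calls.append(kwargs)
        return self.responses[len(self.calls) - 1]

client = MockClient(responses=[MockResponse(content="Hello!", tokens=10)])
reply = client.create(model="chat", messages=[{"role": "user", "content": "Hi"}])
print(reply.content)      # -> Hello!
print(len(client.calls))  # -> 1
```

Recording the raw kwargs lets a test assert not just how many calls happened but exactly what was sent.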
Samples & Documentation
- 12 new quickstart samples (06-11 for Python and TypeScript)
- 5 new developer guides: QuickStart, Error Handling, Middleware, Testing, Capabilities
- Updated docs index with new features and navigation
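The structured exceptions from feature 1 mirror a common Python pattern: a shared base class plus typed subclasses carrying machine-readable fields. A standalone sketch; the retry_after, limit_type, and limit_value attributes come from the snippet above, while the ModelMeshError base name and constructors are assumptions:

```python
# Sketch of a structured exception hierarchy, not the library's actual code.
class ModelMeshError(Exception):
    """Common base so callers can catch every ModelMesh error at once."""

class RateLimitError(ModelMeshError):
    def __init__(self, retry_after: float):
        super().__init__(f"rate limited; retry in {retry_after}s")
        self.retry_after = retry_after

class BudgetExceededError(ModelMeshError):
    def __init__(self, limit_type: str, limit_value: float):
        super().__init__(f"{limit_type} limit of ${limit_value} exceeded")
        self.limit_type = limit_type
        self.limit_value = limit_value

try:
    raise RateLimitError(retry_after=2.5)
except ModelMeshError as e:  # the base class catches both subtypes
    print(e)  # -> rate limited; retry in 2.5s
```

Attaching typed fields to the exception, rather than encoding them only in the message string, is what makes programmatic retry and budget handling possible.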
Install
# Python
pip install modelmesh-lite==0.2.0
# TypeScript
npm install @nistrapa/modelmesh-core@0.2.0

TypeScript Sub-path Imports
New deep imports for tree-shaking:
import { mockClient } from '@nistrapa/modelmesh-core/testing';
import { resolve } from '@nistrapa/modelmesh-core/capabilities';
import { Middleware } from '@nistrapa/modelmesh-core/middleware';

Full Changelog: v0.1.1...v0.2.0
v0.1.1 — Package Publish & Docs Update
What's New
Published Packages
- PyPI: pip install modelmesh-lite
- npm: npm install @nistrapa/modelmesh-core
- Docker: ghcr.io/apartsinprojects/modelmesh:v0.1.1
Changes since v0.1.0
Package Rename
- npm package renamed from @modelmesh/core to @nistrapa/modelmesh-core
- Updated all 50 files: docs, samples, scripts, and package.json
New Documentation
- AI Agent Integration Guide (docs/ForAIAgent.md) — decision tree, all 4 integration paths, YAML config reference, troubleshooting
- Claude Skills (ClaudeSkill/) — 5 ready-to-use skills for install, configure, integrate, deploy, and test
Fixes
- Fixed publish workflow secret name (PYPI_TOKEN → PYPI_API_TOKEN)
- Fixed TypeScript build errors (RUNTIME static property type widening on CDK base classes)
- Added py.typed marker (PEP 561) for mypy/pyright support
- Added exports field to the TypeScript package.json for subpath imports
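The exports field is standard package.json machinery; a map shaped roughly like the following would enable subpath imports such as @nistrapa/modelmesh-core/testing (the dist/ file paths are illustrative, not the package's actual layout):

```json
{
  "name": "@nistrapa/modelmesh-core",
  "exports": {
    ".": "./dist/index.js",
    "./testing": "./dist/testing/index.js",
    "./capabilities": "./dist/capabilities/index.js",
    "./middleware": "./dist/middleware/index.js"
  }
}
```

Declaring explicit subpaths also hides everything else in the package from deep imports, which keeps the public surface intentional.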
Infrastructure
- Docker proxy deployment with OpenAI-compatible REST API
- CI/CD publish workflow for PyPI, npm, and GHCR
- 1,366 tests passing (855 Python + 511 TypeScript)
Install
# Python
pip install modelmesh-lite
# TypeScript
npm install @nistrapa/modelmesh-core
# Docker
docker pull ghcr.io/apartsinprojects/modelmesh:v0.1.1

Quick Start
import modelmesh
client = modelmesh.create("chat-completion")
response = client.chat.completions.create(
    model="chat-completion",
    messages=[{"role": "user", "content": "Hello!"}],
)

v0.1.0 - Initial Release
ModelMesh Lite v0.1.0
First release of ModelMesh Lite: capability-driven AI model routing.
Highlights
- Core: Router, capability tree, pools, state management, event emitter
- CDK: Base classes for all 6 connector types with mixins (cache, metrics, rate limiter, HTTP client)
- Pre-shipped connectors: OpenAI and Anthropic providers
- MeshClient: OpenAI SDK-compatible client (chat.completions.create)
- Config: YAML loader, auto-detect for 9 providers
- Observability: Routing events, request logging, aggregate statistics
- 356 tests across 11 test modules
- Zero external dependencies for core
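The capability tree pairs naturally with the resolve/search helpers added in v0.2.0. A minimal plain-Python sketch of dotted-path resolution over a toy tree (the tree contents and helper implementations are illustrative, not the library's internals):

```python
# Toy capability tree; a dotted path is the chain of keys down to a node.
TREE = {
    "generation": {
        "text-generation": {
            "chat-completion": {},
            "text-embeddings": {},
        }
    }
}

def resolve(tree, name, prefix=""):
    """Depth-first search for `name`; return its full dotted path, or None."""
    for key, sub in tree.items():
        path = f"{prefix}.{key}" if prefix else key
        if key == name:
            return path
        found = resolve(sub, name, path)
        if found:
            return found
    return None

def search(tree, needle):
    """Collect every node name containing `needle`."""
    hits = []
    for key, sub in tree.items():
        if needle in key:
            hits.append(key)
        hits.extend(search(sub, needle))
    return hits

print(resolve(TREE, "chat-completion"))
# -> generation.text-generation.chat-completion
print(sorted(search(TREE, "text")))
# -> ['text-embeddings', 'text-generation']
```

Keeping capability names unique in the tree lets the short name stand in for the full dotted path, which is why the client can accept either form.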
Install
pip install modelmesh-lite

Or install the wheel directly:

pip install modelmesh_lite-0.1.0-py3-none-any.whl

Quick Start
import modelmesh
client = modelmesh.create("chat-completion")
response = client.chat.completions.create(
    model="chat-completion",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)