8 changes: 2 additions & 6 deletions README.md
@@ -31,17 +31,13 @@ if not api_key:
 
 llm = LLM(model="openrouter:openrouter/free", api_key=api_key)
 result = llm.chat("Describe Republic in one sentence.", max_tokens=48)
-
-if result.error:
-    print(result.error.kind, result.error.message)
-else:
-    print(result.value)
+print(result)
 ```

 ## Why It Feels Natural
 
 - **Plain Python**: The main flow is regular functions and branches, no extra DSL.
-- **Structured Result**: Core interfaces return `StructuredOutput`, with stable `ErrorKind` values.
+- **Structured error handling**: Errors are explicit and typed, so retry and fallback logic stays deterministic.
 - **Tools without magic**: Supports both automatic and manual tool execution with clear debugging and auditing.
 - **Tape-first memory**: Use anchor/handoff to bound context windows and replay full evidence.
 - **Event streaming**: Subscribe to text deltas, tool calls, tool results, usage, and final state.
16 changes: 11 additions & 5 deletions docs/guides/chat.md
@@ -9,7 +9,7 @@ from republic import LLM
 
 llm = LLM(model="openrouter:openrouter/free", api_key="<API_KEY>")
 out = llm.chat("Output exactly one word: ready", max_tokens=8)
-print(out.value, out.error)
+print(out)
 ```
 
 ## Messages Mode
@@ -25,12 +25,18 @@ out = llm.chat(messages=messages, max_tokens=48)
 ## Structured Error Handling
 
 ```python
-result = llm.chat("Write one sentence.")
-if result.error:
-    if result.error.kind == "temporary":
+from republic import ErrorPayload, LLM
+
+llm = LLM(model="openrouter:openrouter/free", api_key="<API_KEY>")
+
+try:
+    out = llm.chat("Write one sentence.", max_tokens=32)
+    print(out)
+except ErrorPayload as error:
+    if error.kind == "temporary":
         print("retry later")
     else:
-        print("fail fast:", result.error.message)
+        print("fail fast:", error.message)
 ```
 
 ## Retries and Fallback
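A note on the `temporary` branch in the chat.md hunk above: retrying only transient failures is easy to express as a small wrapper around the new exception-based API. A minimal sketch — `ErrorPayload` is stubbed here and `flaky_chat` is a hypothetical stand-in for `llm.chat`, so the snippet runs without the library installed:

```python
import time


class ErrorPayload(Exception):
    """Stand-in for republic's ErrorPayload: carries a kind and a message."""

    def __init__(self, kind, message):
        super().__init__(message)
        self.kind = kind
        self.message = message


def chat_with_retry(chat, prompt, attempts=3, delay=0.0):
    """Retry only when error.kind == "temporary"; re-raise everything else."""
    for attempt in range(attempts):
        try:
            return chat(prompt)
        except ErrorPayload as error:
            if error.kind != "temporary" or attempt == attempts - 1:
                raise
            time.sleep(delay)


# Stubbed chat: fails twice with a temporary error, then succeeds.
calls = {"n": 0}


def flaky_chat(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ErrorPayload("temporary", "rate limited")
    return "ready"


print(chat_with_retry(flaky_chat, "Output exactly one word: ready"))  # ready
```

Non-temporary kinds (e.g. `config`) propagate immediately, which keeps retry behavior deterministic as the README bullet promises.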
7 changes: 2 additions & 5 deletions docs/guides/tools.md
@@ -23,7 +23,7 @@ from republic import LLM
 llm = LLM(model="openrouter:openai/gpt-4o-mini", api_key="<API_KEY>")
 out = llm.run_tools("What is weather in Tokyo?", tools=[get_weather])
 
-print(out.kind) # text | tools | error
+print(out.kind)  # text | tools | error
 print(out.tool_results)
 print(out.error)
 ```
@@ -32,10 +32,7 @@ print(out.error)
 
 ```python
 calls = llm.tool_calls("Use get_weather for Berlin.", tools=[get_weather])
-if calls.error:
-    raise RuntimeError(calls.error.message)
-
-execution = llm.tools.execute(calls.value, tools=[get_weather])
+execution = llm.tools.execute(calls, tools=[get_weather])
 print(execution.tool_results)
 print(execution.error)
 ```
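The manual path in the tools.md hunks above (`tool_calls` followed by `tools.execute`) comes down to name-based dispatch. A rough, self-contained sketch of that dispatch step — `execute_tool_calls` and the `(name, arguments)` call shape are illustrative assumptions here, not republic's actual types:

```python
def get_weather(city: str) -> str:
    """Toy tool matching the one used in the guide."""
    return f"Sunny in {city}"


def execute_tool_calls(calls, tools):
    """Dispatch each (name, arguments) pair to the matching Python function.

    Unknown tool names produce an explicit error entry instead of raising,
    mirroring the guide's preference for inspectable results.
    """
    registry = {fn.__name__: fn for fn in tools}
    results = []
    for name, arguments in calls:
        fn = registry.get(name)
        if fn is None:
            results.append({"name": name, "error": "not_found"})
        else:
            results.append({"name": name, "result": fn(**arguments)})
    return results


calls = [("get_weather", {"city": "Berlin"})]
print(execute_tool_calls(calls, [get_weather]))
```

Keeping the registry explicit is what makes the "tools without magic" debugging story work: every call, match, and miss is visible in the returned list.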
19 changes: 9 additions & 10 deletions docs/quickstart.md
@@ -20,11 +20,7 @@ llm = LLM(model="openrouter:openrouter/free", api_key="<API_KEY>")
 
 ```python
 out = llm.chat("Write one short release note.", max_tokens=48)
-
-if out.error:
-    print("error:", out.error.kind, out.error.message)
-else:
-    print("text:", out.value)
+print("text:", out)
 ```
 
 ## Step 3: Add an auditable trace to the session
@@ -36,21 +36,24 @@ tape = llm.tape("release-notes")
 tape.handoff("draft_v1", state={"owner": "assistant"})
 
 reply = tape.chat("Summarize the version changes in three bullets.", system_prompt="Keep it concise.")
-print(reply.value)
+print(reply)
 ```
 
 ## Step 4: Handle failures and fallback
 
 ```python
+from republic import ErrorPayload, LLM
+
 llm = LLM(
     model="openai:gpt-4o-mini",
     fallback_models=["openrouter:openrouter/free"],
     max_retries=2,
     api_key={"openai": "<OPENAI_KEY>", "openrouter": "<OPENROUTER_KEY>"},
 )
 
-result = llm.chat("say hello", max_tokens=8)
-if result.error:
-    # error.kind is one of invalid_input/config/provider/tool/temporary/not_found/unknown
-    print(result.error.kind, result.error.message)
+try:
+    result = llm.chat("say hello", max_tokens=8)
+    print(result)
+except ErrorPayload as error:
+    print(error.kind, error.message)
 ```
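The `fallback_models`/`max_retries` config in the quickstart hunk above handles provider failover inside the library; when you need custom ordering, the same logic can be written by hand against the exception-based API. A sketch with stub clients — `ErrorPayload` is stubbed again, and `chat_with_fallback`, `primary`, and `fallback` are hypothetical helpers, not part of republic:

```python
class ErrorPayload(Exception):
    """Stand-in for republic's ErrorPayload (kind + message)."""

    def __init__(self, kind, message):
        super().__init__(message)
        self.kind = kind
        self.message = message


def chat_with_fallback(clients, prompt):
    """Try each client in order; raise the last error if every one fails."""
    last_error = None
    for client in clients:
        try:
            return client(prompt)
        except ErrorPayload as error:
            last_error = error
    raise last_error


def primary(prompt):
    # Simulates the primary model being unavailable.
    raise ErrorPayload("provider", "quota exceeded")


def fallback(prompt):
    return "hello"


print(chat_with_fallback([primary, fallback], "say hello"))  # hello
```

Because every failure surfaces as a typed `ErrorPayload`, the caller can decide per `kind` whether a model is worth skipping or the whole chain should fail fast.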