2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
 {
-  ".": "0.4.0-alpha.15"
+  ".": "0.4.0-alpha.16"
 }
15 changes: 15 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,20 @@
 # Changelog
 
+## 0.4.0-alpha.16 (2026-01-23)
+
+Full Changelog: [v0.4.0-alpha.15...v0.4.0-alpha.16](https://github.com/llamastack/llama-stack-client-python/compare/v0.4.0-alpha.15...v0.4.0-alpha.16)
+
+### Features
+
+* **client:** add support for binary request streaming ([d17dede](https://github.com/llamastack/llama-stack-client-python/commit/d17dede18fa45e3433bea4923d4b280331257975))
+
+
+### Chores
+
+* **internal:** codegen related update ([a176b2e](https://github.com/llamastack/llama-stack-client-python/commit/a176b2e9501b6855ba31f420ea23f1e94170e7aa))
+* **internal:** codegen related update ([4cf153d](https://github.com/llamastack/llama-stack-client-python/commit/4cf153ddfbe68ce5966ec1d199e3c6fb69c1abe0))
+* **internal:** version bump ([580d0ff](https://github.com/llamastack/llama-stack-client-python/commit/580d0ffc4b0540294cf42bd28d6dd3254586133f))
+
 ## 0.4.0-alpha.15 (2026-01-06)
 
 Full Changelog: [v0.4.0-alpha.14...v0.4.0-alpha.15](https://github.com/llamastack/llama-stack-client-python/compare/v0.4.0-alpha.14...v0.4.0-alpha.15)
5 changes: 3 additions & 2 deletions README.md
@@ -21,7 +21,8 @@ You can find more example apps with client SDKs to talk with the Llama Stack server
 ## Installation
 
 ```sh
-pip install llama-stack-client
+# install from PyPI
+pip install '--pre llama_stack_client'
 ```
 
 ## Usage
@@ -106,7 +107,7 @@ You can enable this by installing `aiohttp`:
 
 ```sh
 # install from PyPI
-pip install --pre llama_stack_client[aiohttp]
+pip install '--pre llama_stack_client[aiohttp]'
 ```
 
 Then you can enable it by instantiating the client with `http_client=DefaultAioHttpClient()`:
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.4.0-alpha.15"
+version = "0.4.0-alpha.16"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"
26 changes: 12 additions & 14 deletions requirements-dev.lock
@@ -7,7 +7,7 @@ anyio==4.12.0
     # via
     #   httpx
     #   llama-stack-client
-black==25.12.0
+black==26.1.0
 certifi==2025.11.12
     # via
     #   httpcore
@@ -33,7 +33,7 @@ distro==1.9.0
     # via llama-stack-client
 execnet==2.1.2
     # via pytest-xdist
-filelock==3.20.1
+filelock==3.20.3
     # via virtualenv
 fire==0.7.1
     # via llama-stack-client
@@ -45,7 +45,7 @@ httpx==0.28.1
     # via
     #   llama-stack-client
     #   respx
-identify==2.6.15
+identify==2.6.16
     # via pre-commit
 idna==3.11
     # via
@@ -68,15 +68,15 @@ nodeenv==1.9.1
     # via
     #   pre-commit
     #   pyright
-numpy==2.3.5
+numpy==2.4.1
     # via pandas
 packaging==25.0
     # via
     #   black
     #   pytest
-pandas==2.3.3
+pandas==3.0.0
     # via llama-stack-client
-pathspec==0.12.1
+pathspec==1.0.3
     # via
     #   black
     #   mypy
@@ -108,10 +108,8 @@ pytest-asyncio==1.3.0
 pytest-xdist==3.8.0
 python-dateutil==2.9.0.post0
     # via pandas
-pytokens==0.3.0
+pytokens==0.4.0
     # via black
-pytz==2025.2
-    # via pandas
 pyyaml==6.0.3
     # via
     #   pre-commit
@@ -126,7 +124,7 @@ six==1.17.0
     # via python-dateutil
 sniffio==1.3.1
     # via llama-stack-client
-termcolor==3.2.0
+termcolor==3.3.0
     # via
     #   fire
     #   llama-stack-client
@@ -145,13 +143,13 @@ typing-extensions==4.15.0
     #   typing-inspection
 typing-inspection==0.4.2
     # via pydantic
-tzdata==2025.3
+tzdata==2025.3 ; sys_platform == 'emscripten' or sys_platform == 'win32'
     # via pandas
-urllib3==2.6.2
+urllib3==2.6.3
     # via requests
-virtualenv==20.35.4
+virtualenv==20.36.1
     # via pre-commit
-wcwidth==0.2.14
+wcwidth==0.3.1
     # via prompt-toolkit
 zipp==3.23.0
     # via importlib-metadata