CUSTOM_CAP env var not honoured #155

@dehidehidehi

Description

Branch

other (specify below)

Branch (if other)

b3nw/LLM-API-Key-Proxy

Version / Tag / Commit

build-20260409-1-b6ba75f

Provider(s) Affected

gemini_cli

Deployment Method

Docker

Operating System

Ubuntu 22

Python Version (if running from source)

No response

Bug Description

Hello,

It seems this configuration is not applied by the LLM proxy; I can see the remaining request count getting very close to zero despite having this in my .env:

CUSTOM_CAP_COOLDOWN_GEMINI_CLI_T2_PRO="offset:3600"
CUSTOM_CAP_GEMINI_CLI_T2_PRO=225
CUSTOM_CAP_GEMINI_CLI_T2_25_FLASH=1425
CUSTOM_CAP_GEMINI_CLI_T2_3_FLASH=1425

The same issue occurs if I try using percentages:

CUSTOM_CAP_GEMINI_CLI_T2_PRO="90%"
CUSTOM_CAP_GEMINI_CLI_T2_25_FLASH="90%"
CUSTOM_CAP_GEMINI_CLI_T2_3_FLASH="90%"
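For clarity, here is how I would expect these values to be interpreted, based on the counter showing 0/1500. This is only a sketch of my assumption, not the proxy's actual implementation: an absolute number caps requests directly, while a percentage caps at that fraction of the model's quota.

```python
def effective_cap(value: str, quota: int) -> int:
    """Sketch of the cap I'd expect a CUSTOM_CAP_* value to imply.

    This is an assumption for illustration, not the proxy's real code.
    """
    value = value.strip().strip('"')
    if value.endswith("%"):
        # "90%" -> 90% of the model's quota
        return quota * int(value[:-1]) // 100
    # "1425" -> absolute cap
    return int(value)

quota = 1500  # flash quota, per the 0/1500 counter

print(effective_cap("1425", quota))          # absolute cap
print(effective_cap("90%", quota))           # percentage cap
print(quota - effective_cap("1425", quota))  # expected floor of remaining requests
```

Under this reading, a cap of 1425 out of a 1500 quota should leave a floor of about 75 remaining requests, which matches what I expected below; instead the counter reaches zero.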

The top of my .env file is:

LOG_LEVEL=info
ENABLE_REQUEST_LOGGING=false
OAUTH_REFRESH_INTERVAL=600
GLOBAL_TIMEOUT=300
SKIP_OAUTH_INIT_CHECK=true
ROTATION_TOLERANCE=2.5

Steps to Reproduce

  1. Use gemini_cli provider
  2. Use most of my requests
  3. See the requests go down to 0/1500

Expected Behavior

I'd expect the remaining request count to go no lower than around 75.

Actual Behavior

The requests go all the way down to zero.

Error Logs / Messages


Pre-submission Checklist

  • I have searched existing issues to ensure this is not a duplicate

Metadata


    Labels

    bug (Something isn't working)
