A command-line toolkit for your daily work as a software developer. Built to integrate into your existing workflows, it provides a secure, powerful, and flexible pair-programming experience with LLMs.
Visit [jp.computer] to learn more.
> **Note**
> This project is in active development; expect breaking changes. What is documented here is subject to change and may not be up to date. Please consult the installation instructions to get started, and reach out to us if you need any assistance or have feedback.
JP is built to be:

- **Provider-agnostic**: your workflow shouldn't be coupled to any single LLM backend.
- **Private and secure by default**: no implicit network access or silent tool execution.
- **A proper Unix citizen**: a single static binary that composes with pipes, respects your shell, and stays out of your way.
- **Extensible**: sandboxed plugins, and configurable where it matters.
- **Open-source and independent**: funded without VC money and with no allegiance to any LLM provider; just software that serves its users.
JP is in active development. Install from source:

```sh
cargo install --locked --git https://github.com/dcdpr/jp.git
```

Initiate a new workspace in an existing directory:

```
jp init .

> Confirm before running tools?
  Yes (safest option)

> Which LLM model do you want to use?
  ollama/qwen3

Initialized workspace at current directory
```

Run your first query:
```
jp query "Is this thing on?"

Hello there! I am Jean-Pierre, how can I help you today?
```

Configure your JP workspace:

```sh
open .jp/config.toml
```

See what else you can do:
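The choices made during `jp init` (tool confirmation, LLM model) are recorded in this file. As a purely hypothetical sketch of what such a file could contain — the table and key names below are assumptions for illustration, not JP's documented schema — it might look something like:

```toml
# Hypothetical illustration only: key names are assumptions, not JP's
# actual configuration schema. Consult the generated .jp/config.toml
# in your workspace for the real options.

[assistant]
# The model selected during `jp init`.
model = "ollama/qwen3"

[tools]
# "Yes (safest option)": ask before executing any tool.
confirm_before_run = true
```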
```sh
jp help
```