# Using MAX pipelines to run LLMs offline

Examples of using the MAX stack to query LLMs directly from Python, without spinning up a web server.

A single Pixi command runs the example: `pixi run basic`