A lightweight, modular AI agent built with LangGraph for workflow orchestration and Groq-hosted LLaMA models for low-latency inference. It supports multi-step reasoning, tool execution, memory, and structured outputs, and is packaged as a Python backend service.
- Agentic workflow using LangGraph (stateful nodes + conditional edges)
- Low-latency LLaMA inference via the Groq API
- Custom tools for retrieval, reasoning, and structured tasks
- Deterministic multi-step reasoning with LangGraph execution
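To make "stateful nodes + conditional edges" concrete, here is a dependency-free sketch of the pattern LangGraph formalizes. The names (`reason`, `should_continue`, `run`) are hypothetical and simplified; they are not LangGraph's actual API:

```python
# A "node" is a function that takes the shared state and returns an updated copy.
def reason(state):
    return {**state, "steps": state["steps"] + 1}

# A "conditional edge" inspects the state and routes execution to the next node.
def should_continue(state):
    return "reason" if state["steps"] < 3 else "end"

# The graph runner loops: execute the current node, then follow the edge.
def run(state):
    node = "reason"
    while node != "end":
        state = reason(state)
        node = should_continue(state)
    return state

final = run({"steps": 0})
print(final["steps"])  # 3
```

In LangGraph itself, the same roles are played by `StateGraph.add_node` and `StateGraph.add_conditional_edges`, with a typed state object shared across nodes.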
- Python
- LangGraph
- Groq API (LLaMA models)
- Pydantic
- python-dotenv
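Groq exposes an OpenAI-compatible chat-completions endpoint, so a request can be built with the standard library alone. This is a hedged sketch: the model name is an example, and this project's actual client code may differ:

```python
import json
import os
import urllib.request

# OpenAI-compatible Groq endpoint.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt: str, model: str = "llama-3.1-8b-instant"):
    """Build the JSON payload and auth headers for a single-turn completion."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return payload, headers

def send(payload: dict, headers: dict) -> str:
    """POST the request; requires a valid GROQ_API_KEY and network access."""
    req = urllib.request.Request(
        GROQ_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

In practice the official `groq` Python client wraps this same endpoint and is the more idiomatic choice.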
- Clone the repository: `git clone https://github.com/your-username/your-repo.git`
- Enter the project directory: `cd your-repo`
- Install dependencies: `pip install -r requirements.txt`
- Add your environment variables: create a `.env` file and add your API key
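Assuming the Groq API key is the only secret the service needs (the variable name below is a common convention, not necessarily what this repo expects), the `.env` file could look like:

```
GROQ_API_KEY=your-groq-api-key
```

With python-dotenv, calling `load_dotenv()` at startup makes the value available through `os.environ`.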
Feel free to star the repo or open an issue/PR!