MediQuery.ai is a professional-grade medical assistant built on a Retrieval-Augmented Generation (RAG) architecture. Instead of relying on the LLM's raw parametric memory (and risking hallucinations), it grounds every answer in your indexed medical documents, improving accuracy and trustworthiness.
Real-time status of the MediQuery.ai ecosystem:
Live Production URL: http://13.60.62.104:8080
Modern, responsive UI with Dark Mode toggle for optimal readability.
| Dark Mode (Default) | Light Mode |
|---|---|
| ![]() | ![]() |
- Context-aware answers: Powered by LangChain and Groq (Llama-3.3-70B) for fast, accurate medical reasoning grounded in your documents.
- Semantic search: Uses Pinecone for real-time similarity search over embedded medical text.
- Modern UI: Clean, responsive Flask frontend with Dark Mode, glassmorphism, and real-time thinking indicators.
- Automated deployment: Fully integrated CI/CD pipeline (GitHub Actions → AWS ECR → EC2/Docker).
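The retrieval flow behind the features above (embed the question, find similar passages, ground the prompt in them) can be sketched with a toy in-memory index. The real app uses Pinecone and `all-MiniLM-L6-v2` embeddings; the bag-of-words scoring and the sample documents below are purely illustrative.

```python
import re
from collections import Counter
from math import sqrt

# Toy corpus standing in for the indexed medical documents.
DOCS = [
    "Aspirin is used to reduce fever and relieve mild pain.",
    "Metformin is a first-line medication for type 2 diabetes.",
    "Amoxicillin is an antibiotic for bacterial infections.",
]

def embed(text: str) -> Counter:
    """Bag-of-words stand-in for the real all-MiniLM-L6-v2 embedding."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Top-k most similar documents; Pinecone does this in production."""
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Ground the LLM prompt in retrieved context rather than parametric memory."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Which medication treats type 2 diabetes?"))
```

In the real pipeline the grounded prompt is then handed to the Groq-hosted Llama model via LangChain.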
| Category | Technology |
|---|---|
| LLM Framework | LangChain |
| Large Language Model | Groq (llama3-70b-8192) |
| Vector Database | Pinecone |
| Embeddings | HuggingFace (all-MiniLM-L6-v2) |
| Backend | Flask (Python 3.10) |
| Frontend | HTML5, CSS3, JavaScript (AJAX) |
| Cloud / DevOps | AWS (EC2, ECR), Docker, GitHub Actions |
```
.
├── app.py              # Flask app (routes & RAG logic)
├── store_index.py      # Script to populate Pinecone index
├── src/
│   ├── helper.py       # Helper functions (embeddings, utils)
│   └── prompt.py       # System / RAG prompts
├── assets/             # Screenshots (linked above)
├── static/             # CSS / JS, images (Dark Mode styles)
├── templates/          # HTML templates (chat.html)
├── requirements.txt    # Python dependencies
├── Dockerfile          # Docker build config
└── .github/workflows   # GitHub Actions CI/CD workflows
```
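For orientation, the chat endpoint in `app.py` likely looks roughly like the sketch below. `retrieve_context` and `ask_llm` are hypothetical stand-ins for the Pinecone retriever and the Groq-backed LangChain chain; the actual route names and logic may differ.

```python
from flask import Flask, render_template, request

app = Flask(__name__)

def retrieve_context(query: str) -> list[str]:
    # Stand-in for the Pinecone similarity search (see store_index.py).
    return ["(retrieved medical passages would appear here)"]

def ask_llm(query: str, context: list[str]) -> str:
    # Stand-in for the LangChain + Groq (Llama) call in the real app.
    return f"Answer grounded in {len(context)} retrieved passage(s)."

@app.route("/")
def index():
    # Serves the chat UI from templates/chat.html.
    return render_template("chat.html")

@app.route("/get", methods=["POST"])
def chat():
    msg = request.form["msg"]
    return ask_llm(msg, retrieve_context(msg))

# In production the server listens on the port the Dockerfile exposes:
# app.run(host="0.0.0.0", port=8080)
```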
1. **Clone the repository**

   ```shell
   git clone https://github.com/salonyranjan/MediQuery.ai.git
   cd MediQuery.ai
   ```

2. **Create a conda environment**

   ```shell
   conda create -n medibot python=3.10 -y
   conda activate medibot
   ```

3. **Install dependencies**

   ```shell
   pip install -r requirements.txt
   ```

4. **Create a `.env` file**

   Create `.env` in the root directory and add:

   ```shell
   PINECONE_API_KEY=xxxxxxxxxxxxxxxx
   OPENAI_API_KEY=xxxxxxxxxxxxxxxx
   GROQ_API_KEY=xxxxxxxxxxxxxxxx
   ```

   Replace the values with your actual API keys.

5. **Build the vector index**

   ```shell
   python store_index.py
   ```

6. **Run the app**

   ```shell
   python app.py
   ```

   Open your browser: http://localhost:8080
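What `store_index.py` does is split the documents into chunks, embed each chunk, and upsert the vectors into Pinecone. The chunking step can be sketched standalone; the function name, sizes, and overlap below are illustrative, and the real script delegates embedding to `all-MiniLM-L6-v2`.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows, as RAG indexers typically do.

    Overlap keeps sentences that straddle a boundary retrievable from both chunks.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Each chunk is then embedded and upserted into the Pinecone index, e.g.:
# index.upsert(vectors=[(chunk_id, embedding, {"text": chunk}), ...])
```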
If you prefer running via Docker:

```shell
# 1. Build the image
docker build -t mediquery .

# 2. Run the container
docker run -d -p 8080:8080 \
  -e PINECONE_API_KEY="xxxxxxxxxxxxxxxx" \
  -e GROQ_API_KEY="xxxxxxxxxxxxxxxx" \
  --name mediquery_app \
  mediquery
```

Then open: http://localhost:8080
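Whether the keys arrive from the `.env` file or from `docker run -e` flags, the app ultimately reads them from the process environment. A defensive loader (a hypothetical helper, not necessarily what `app.py` does) fails fast at startup instead of mid-request; locally, `python-dotenv`'s `load_dotenv()` can populate `os.environ` from `.env` before this check runs.

```python
import os

REQUIRED_KEYS = ("PINECONE_API_KEY", "GROQ_API_KEY")

def load_required_env(keys=REQUIRED_KEYS) -> dict:
    """Return the required secrets, raising early if any are missing or empty."""
    missing = [k for k in keys if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {k: os.environ[k] for k in keys}
```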
1. **Login to the AWS Console**

   Go to https://aws.amazon.com and sign in.

2. **Create an IAM user for deployment**

   - Attach policies: `AmazonEC2ContainerRegistryFullAccess`, `AmazonEC2FullAccess`
   - Save the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.

3. **Create an ECR repository**

   Example (replace with your account and region):
   `315865595366.dkr.ecr.us-east-1.amazonaws.com/medicalbot`

4. **Launch an EC2 instance (Ubuntu)**

   - Choose an Ubuntu AMI (e.g., `ubuntu/focal`).
   - Attach a key pair and a security group allowing port `8080`.
5. **Install Docker on EC2**

   ```shell
   sudo apt-get update -y
   sudo apt-get upgrade -y
   curl -fsSL https://get.docker.com -o get-docker.sh
   sudo sh get-docker.sh
   sudo usermod -aG docker ubuntu
   newgrp docker
   ```

6. **Configure EC2 as a self-hosted GitHub Actions runner**

   - In GitHub: Settings → Actions → Runners → New self-hosted runner.
   - Choose Linux and run the provided commands on your EC2 machine.
7. **Add GitHub Secrets**

   In your repo: Settings → Secrets and variables → Actions, then add:

   - `AWS_ACCESS_KEY_ID`
   - `AWS_SECRET_ACCESS_KEY`
   - `AWS_DEFAULT_REGION` (e.g., `eu-north-1`)
   - `ECR_REPO` (e.g., `577435557871.dkr.ecr.eu-north-1.amazonaws.com/medical_chatbot`)
   - `PINECONE_API_KEY`
   - `OPENAI_API_KEY`
   - `GROQ_API_KEY`
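The workflow in `.github/workflows` presumably wires these secrets together roughly as in this sketch; the job names, action versions, and exact steps are illustrative, not a copy of the repo's file.

```yaml
name: CI/CD
on:
  push:
    branches: [main]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_DEFAULT_REGION }}
      - uses: aws-actions/amazon-ecr-login@v2
      - run: |
          docker build -t ${{ secrets.ECR_REPO }}:latest .
          docker push ${{ secrets.ECR_REPO }}:latest

  deploy:
    needs: build-and-push
    runs-on: self-hosted   # the EC2 runner configured above
    steps:
      - run: |
          docker pull ${{ secrets.ECR_REPO }}:latest
          docker run -d -p 8080:8080 \
            -e PINECONE_API_KEY=${{ secrets.PINECONE_API_KEY }} \
            -e GROQ_API_KEY=${{ secrets.GROQ_API_KEY }} \
            --name mediquery_app ${{ secrets.ECR_REPO }}:latest
```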
Once GitHub Actions runs, it will:

- Build and push the Docker image to ECR.
- Deploy it to EC2 via `docker pull` and `docker run`.
- Expose the app on port `8080`.

