16 changes: 16 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,16 @@
# Contributing to DevR.AI 🚀

Thank you for your interest in contributing to DevR.AI!
We welcome contributions of all kinds — bug fixes, improvements, documentation, and new features.

---

## 🛠 Getting Started

1. Fork the repository
2. Clone your fork locally
3. Create and activate a virtual environment
4. Install dependencies

```bash
pip install -r requirements.txt
Comment on lines +15 to +16
Contributor

⚠️ Potential issue | 🟡 Minor

Code block is not closed and content appears truncated.

The fenced code block starting at Line 15 is missing its closing ```. Additionally, the contribution guide ends abruptly after the install command—consider adding sections for:

  • Creating a feature branch
  • Running tests
  • Submitting pull requests
  • Code style guidelines
📝 Suggested completion
 ```bash
 pip install -r requirements.txt
+```
+
+---
+
+## 📋 Submitting Changes
+
+1. Create a new branch for your changes
+2. Make your changes and commit with clear messages
+3. Push your branch and open a pull request
+4. Ensure all checks pass before requesting review
+
+---
+
+## 🧪 Running Tests
+
+```bash
+pytest
+```
+
+---
+
+Thank you for contributing! 🎉

11 changes: 9 additions & 2 deletions backend/app/database/falkor/code-graph-backend/api/graph.py
@@ -8,6 +8,7 @@
 import logging
 logging.basicConfig(level=logging.DEBUG,
                     format='%(filename)s - %(asctime)s - %(levelname)s - %(message)s')
+logger = logging.getLogger(__name__)
 
 def graph_exists(name: str):
     db = FalkorDB(host=os.getenv('FALKORDB_HOST', 'localhost'),
@@ -53,13 +54,19 @@ def __init__(self, name: str) -> None:
         try:
             self.g.create_node_range_index("File", "name", "ext")
         except Exception:
-            pass
+            logger.exception(
+                "Failed to create node range index for File(name, ext)"
+            )
+
 
         # index Function using full-text search
         try:
             self.g.create_node_fulltext_index("Searchable", "name")
         except Exception:
-            pass
+            logger.exception(
+                "Failed to create fulltext index for Searchable(name)"
+            )
+
 
     def clone(self, clone: str) -> "Graph":
         """
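The change above swaps silent `pass` handlers for `logger.exception`, which records the message at ERROR level together with the active exception's full traceback, while still keeping index creation non-fatal. A minimal sketch of the pattern — the `safe_create_index` helper and `failing_create` stub are illustrative, not part of the codebase:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)


def safe_create_index(create_fn):
    # Attempt index creation; log any failure instead of swallowing it.
    try:
        create_fn()
    except Exception:
        # Unlike a bare `pass`, logger.exception emits the message at ERROR
        # level plus the traceback of the exception currently being handled.
        logger.exception("Failed to create index")


def failing_create():
    # Simulates the database raising when the index already exists
    raise RuntimeError("index already exists")


safe_create_index(failing_create)  # error is logged, not raised
```

The exception is still caught, so startup behavior is unchanged; the difference is that duplicate-index or connectivity failures now show up in the logs with enough context to debug.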
13 changes: 11 additions & 2 deletions backend/app/services/embedding_service/service.py
@@ -1,3 +1,4 @@
+import asyncio
 import logging
 import config
 from typing import List, Dict, Any, Optional
@@ -66,6 +67,10 @@ def llm(self) -> ChatGoogleGenerativeAI:
             raise
         return self._llm
 
+
+    def _encode_sync(self, *args, **kwargs):
+        return self.model.encode(*args, **kwargs)
+
     async def get_embedding(self, text: str) -> List[float]:
         """Generate embedding for a single text input"""
         try:
@@ -74,12 +79,14 @@ async def get_embedding(self, text: str) -> List[float]:
             text = [text]
 
             # Generate embeddings
-            embeddings = self.model.encode(
+            embeddings = await asyncio.to_thread(
+                self._encode_sync,
                 text,
                 convert_to_tensor=True,
                 show_progress_bar=False
             )
+
 
             # Convert to standard Python list and return
             embedding_list = embeddings[0].cpu().tolist()
             logger.debug(f"Generated embedding with dimension: {len(embedding_list)}")
@@ -92,13 +99,15 @@ async def get_embeddings(self, texts: List[str]) -> List[List[float]]:
         """Generate embeddings for multiple text inputs in batches"""
         try:
             # Generate embeddings
-            embeddings = self.model.encode(
+            embeddings = await asyncio.to_thread(
+                self._encode_sync,
                 texts,
                 convert_to_tensor=True,
                 batch_size=MAX_BATCH_SIZE,
                 show_progress_bar=len(texts) > 10
             )
+
 
             # Convert to standard Python list
             embedding_list = embeddings.cpu().tolist()
             logger.info(f"Generated {len(embedding_list)} embeddings")
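The diff above moves the blocking `model.encode` call off the event loop with `asyncio.to_thread`, so other coroutines keep running while the model works. A minimal sketch of that pattern under simplified assumptions — `encode_sync` is a stand-in for the real SentenceTransformer call, and all names here are hypothetical rather than the service's actual API:

```python
import asyncio
import time


def encode_sync(texts):
    # Stand-in for a blocking, CPU/GPU-bound encode call
    time.sleep(0.1)
    return [[float(len(t))] for t in texts]


async def get_embeddings(texts):
    # asyncio.to_thread runs the blocking function in a worker thread,
    # so awaiting it does not stall the event loop
    return await asyncio.to_thread(encode_sync, texts)


async def main():
    # A concurrent coroutine makes progress while encode_sync blocks its thread
    embeddings, _ = await asyncio.gather(
        get_embeddings(["hello", "world"]),
        asyncio.sleep(0.05),
    )
    return embeddings


result = asyncio.run(main())  # [[5.0], [5.0]]
```

Calling the synchronous encoder directly inside an `async def` would freeze every other task for the duration of the encode; wrapping it in `to_thread` keeps the service responsive at the cost of one worker thread per in-flight call.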