An intelligent AI agent that researches and analyzes developer tools, technologies, and platforms to provide comprehensive recommendations. Built with LangGraph, DeepSeek AI, and Firecrawl for web scraping and analysis.
AI Agent discovering and analyzing developer tools
Comprehensive analysis of tool features, pricing, and tech stacks
- **Intelligent Tool Discovery**: Automatically finds and extracts relevant developer tools from articles and web content
- **Comprehensive Analysis**: Analyzes pricing models, tech stacks, API availability, and integration capabilities
- **DeepSeek AI Integration**: Uses DeepSeek AI through OpenRouter for intelligent content analysis
- **Web Scraping**: Leverages Firecrawl for efficient web content extraction
- **Structured Recommendations**: Provides detailed, actionable recommendations for developer tools
```mermaid
graph LR
    A[User Query] --> B[Tool Extraction]
    B --> C[Company Research]
    C --> D[Recommendation Generation]
    D --> E[Final Results]
    style A fill:#e1f5fe
    style B fill:#f3e5f5
    style C fill:#e8f5e8
    style D fill:#fff3e0
    style E fill:#fce4ec
```
| Stage | Tool Extraction | Company Research | Recommendation Generation |
|---|---|---|---|
| Input | User query | Extracted tool names | Analyzed company data |
| Process | Web scraping + AI extraction | Website analysis + AI parsing | Data synthesis + AI recommendations |
| Output | List of relevant tools | Detailed company profiles | Actionable recommendations |
The project uses a sophisticated LangGraph workflow with three main stages:
- **Tool Extraction**: Searches for articles and extracts relevant tool names using AI
- **Company Research**: Analyzes individual tools and their features in detail
- **Recommendation Generation**: Provides comprehensive, actionable recommendations
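Stripped of LangGraph specifics, the three stages above amount to a linear pipeline over a shared state object. The following is a stdlib-only illustration with stubbed stage bodies, not the project's actual implementation:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ResearchState:
    query: str
    extracted_tools: List[str] = field(default_factory=list)
    companies: List[dict] = field(default_factory=list)
    analysis: Optional[str] = None

def extract_tools(state: ResearchState) -> ResearchState:
    # Stage 1: search articles and pull out tool names (stubbed here)
    state.extracted_tools = ["ToolA", "ToolB"]
    return state

def research_companies(state: ResearchState) -> ResearchState:
    # Stage 2: analyze each tool in detail (stubbed here)
    state.companies = [{"name": t, "pricing_model": "Unknown"} for t in state.extracted_tools]
    return state

def generate_recommendations(state: ResearchState) -> ResearchState:
    # Stage 3: synthesize everything into a recommendation
    state.analysis = f"Compared {len(state.companies)} tools for '{state.query}'"
    return state

def run(query: str) -> ResearchState:
    # Each stage receives the state produced by the previous one
    state = ResearchState(query=query)
    for stage in (extract_tools, research_companies, generate_recommendations):
        state = stage(state)
    return state
```

In the real project, these functions are registered as LangGraph nodes and connected with edges instead of a plain loop.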
- **Python 3.12+** - Modern Python with latest features
- **uv** - Fast Python package manager
- **Firecrawl API Key** - For web scraping capabilities
- **OpenRouter API Key** - For DeepSeek AI access (optional)
```bash
git clone <your-repository-url>
cd advance-agent
```

The project uses uv for fast dependency management:

```bash
uv sync
```

Create a `.env` file in the project root:
```env
# Firecrawl API Key (required for web scraping)
FIRECRAWL_API_KEY=your_firecrawl_api_key_here

# Optional: OpenRouter API Key (if you want to use your own)
OPENROUTER_API_KEY=your_openrouter_api_key_here
```

To get a Firecrawl API key:

- Visit Firecrawl
- Sign up for an account
- Navigate to your dashboard
- Copy your API key
To get an OpenRouter API key:

- Visit OpenRouter
- Sign up and add credits to your account
- Navigate to your dashboard
- Copy your API key
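Once the keys are in `.env`, a fail-fast check at startup keeps a missing key from surfacing as a confusing error later. A small sketch (the helper name is hypothetical; the project raises a similar `ValueError` for a missing `FIRECRAWL_API_KEY`):

```python
import os

def require_api_key(name: str) -> str:
    # Fail fast with a clear message if a required key is missing
    value = os.environ.get(name)
    if not value:
        raise ValueError(f"{name} is not set in the environment variables")
    return value

# FIRECRAWL_API_KEY is required; OPENROUTER_API_KEY is optional
# firecrawl_key = require_api_key("FIRECRAWL_API_KEY")
openrouter_key = os.environ.get("OPENROUTER_API_KEY")  # may be None
```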
```python
from src.workflow import Workflow

# Initialize the workflow
workflow = Workflow()

# Run research on a specific query
result = workflow.run("database management tools")

# Access the results
print(f"Query: {result.query}")
print(f"Extracted Tools: {result.extracted_tools}")
print(f"Companies Analyzed: {len(result.companies)}")
print(f"Analysis: {result.analysis}")
```

```python
# Research database tools
result = workflow.run("PostgreSQL alternatives")

# Research deployment platforms
result = workflow.run("cloud deployment platforms")

# Research monitoring tools
result = workflow.run("application monitoring tools")

# Research authentication services
result = workflow.run("authentication providers")
```

```
advance-agent/
├── src/
│   ├── __init__.py
│   ├── workflow.py        # Main workflow implementation
│   ├── models.py          # Pydantic data models
│   ├── firecrawl.py       # Firecrawl service wrapper
│   └── prompts.py         # AI prompts for analysis
├── pyproject.toml         # Project configuration
├── uv.lock                # Dependency lock file
├── 1st.jpg                # Demo screenshot 1
├── 2nd.jpg                # Demo screenshot 2
├── 3rd.jpg                # Demo screenshot 3
└── README.md              # This file
```
| File | Purpose | Status |
|---|---|---|
| `src/workflow.py` | Core workflow implementation | ✅ |
| `src/models.py` | Pydantic data structures | ✅ |
| `src/firecrawl.py` | Firecrawl service wrapper | ✅ |
| `src/prompts.py` | AI prompt templates | ✅ |
The project is configured to use DeepSeek AI through OpenRouter. The configuration is in `src/workflow.py`:

```python
self.client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<your-openrouter-api-key>",  # never commit a real key
)
```

Model Used: `deepseek/deepseek-chat-v3-0324:free`

To use a different model or your own OpenRouter API key:

- Update the API key in `src/workflow.py`
- Change the model name in the `_create_deepseek_llm` method
```python
def _extract_tools(self, state: ResearchState):
    # Search for articles about the query
    articles_query = f"{state.query} tools comparison best alternatives"
    search_results = self.firecrawl.search_companies(articles_query, num_results=5)
    # Scrape content from found articles
    # Extract tool names using DeepSeek AI
    # Return list of extracted tools
    ...
```

What it does:
- Searches for articles related to the query
- Scrapes content from found URLs
- Uses AI to extract specific tool names from the content
- Returns a list of relevant tools
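The AI's reply comes back as free text, so it has to be parsed into a clean list. A hypothetical sketch of that parsing step (the real prompt and parsing live in `src/workflow.py` and `src/prompts.py`):

```python
def parse_tool_names(ai_response: str, max_tools: int = 10) -> list[str]:
    # Accept one tool per line, tolerating bullets and numbering
    tools = []
    for line in ai_response.splitlines():
        name = line.strip().lstrip("-*0123456789. ").strip()
        if name and name not in tools:  # skip blanks and duplicates
            tools.append(name)
    return tools[:max_tools]

reply = "1. PostgreSQL\n2. MySQL\n- SQLite"
print(parse_tool_names(reply))  # ['PostgreSQL', 'MySQL', 'SQLite']
```

Note the `lstrip` trick would also strip a leading digit from a tool name like `4D`; a sketch, not production parsing.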
```python
def _get_company_info(self, state: ResearchState):
    # For each extracted tool
    for tool_name in tool_names:
        # Search for the tool's official website
        # Scrape the website content
        # Analyze the content using AI
        # Extract pricing, tech stack, features, etc.
        ...
```

What it does:
- Takes the extracted tool names
- Searches for each tool's official website
- Scrapes detailed information from the websites
- Uses AI to analyze pricing, tech stack, API availability, etc.
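The per-tool loop can be sketched with the search, scrape, and analyze steps injected as functions, which makes the flow testable without network access (all helper names here are hypothetical stand-ins for the Firecrawl and LLM calls):

```python
def research_tool(tool_name: str, search, scrape, analyze) -> dict:
    # search/scrape/analyze are injected so the flow runs offline in tests
    url = search(f"{tool_name} official site")   # find the official website
    content = scrape(url)                        # pull page content
    info = analyze(content)                      # AI analysis -> structured fields
    return {"name": tool_name, "website": url, **info}

# Offline stand-ins for the Firecrawl and DeepSeek calls
profile = research_tool(
    "ExampleDB",
    search=lambda q: "https://exampledb.dev",
    scrape=lambda url: "ExampleDB is an open source database...",
    analyze=lambda text: {"pricing_model": "Free", "is_open_source": True},
)
```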
```python
def _analyze_step(self, state: ResearchState):
    # Combine all analyzed company data
    # Generate comprehensive recommendations
    # Provide actionable insights
    ...
```

What it does:
- Combines all the analyzed company information
- Uses AI to generate comprehensive recommendations
- Provides actionable insights and comparisons
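Before the final LLM call, the per-company data has to be condensed into a single prompt. A minimal sketch of that synthesis step (field names mirror the `CompanyInfo` model; the actual prompt wording lives in `src/prompts.py`):

```python
def build_recommendation_prompt(query: str, companies: list[dict]) -> str:
    # Condense each company profile into one line of context for the LLM
    lines = [
        f"- {c['name']}: pricing={c.get('pricing_model', 'Unknown')}, "
        f"open_source={c.get('is_open_source')}"
        for c in companies
    ]
    return (
        f"User query: {query}\n"
        "Analyzed tools:\n" + "\n".join(lines) + "\n"
        "Recommend the best options and explain the trade-offs."
    )
```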
```python
class CompanyAnalysis(BaseModel):
    pricing_model: str = "Unknown"  # Free, Freemium, Paid, Enterprise
    is_open_source: Optional[bool] = None
    tech_stack: List[str] = []
    description: str = ""
    api_available: Optional[bool] = None
    language_support: List[str] = []
    integration_capabilities: List[str] = []
```

```python
class CompanyInfo(BaseModel):
    name: str
    description: str
    website: str
    pricing_model: Optional[str] = None
    is_open_source: Optional[bool] = None
    tech_stack: List[str] = []
    competitors: List[str] = []
    api_available: Optional[bool] = None
    language_support: List[str] = []
    integration_capabilities: List[str] = []
    developer_experience_rating: Optional[str] = None
```

```python
class ResearchState(BaseModel):
    query: str
    extracted_tools: List[str] = []
    companies: List[CompanyInfo] = []
    search_results: List[Dict[str, Any]] = []
    analysis: Optional[str] = None
```

The project uses carefully crafted prompts for different analysis tasks:
Extracts specific tool names from articles and content.
Analyzes company websites to extract:
- Pricing model
- Tech stack
- API availability
- Language support
- Integration capabilities
Generates actionable recommendations based on all analyzed data.
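For illustration, a company-analysis prompt in this style might look like the following (a hypothetical template, not the project's actual wording in `src/prompts.py`):

```python
# Hypothetical template in the spirit of src/prompts.py
COMPANY_ANALYSIS_PROMPT = """You are analyzing a developer tool's website.

Website content:
{content}

Extract, as concisely as possible:
- Pricing model (Free, Freemium, Paid, Enterprise)
- Tech stack
- Whether a public API is available
- Supported languages
- Integration capabilities
"""

# Fill in the scraped page content before sending to the LLM
prompt = COMPANY_ANALYSIS_PROMPT.format(content="ExampleDB offers a free tier...")
```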
```
ValueError: FIRECRAWL_API_KEY is not set in the environment variables
```

Solution: Make sure you have set `FIRECRAWL_API_KEY` in your `.env` file.

```
openai.AuthenticationError: Invalid API key
```

Solution: The project uses a pre-configured API key. If you want to use your own, update it in `src/workflow.py`.

```
ModuleNotFoundError: No module named 'openai'
```

Solution: Run `uv sync` to install all dependencies.

```
AttributeError: 'tuple' object has no attribute 'get'
```

Solution: The code handles multiple response formats from Firecrawl. This error should be resolved in the current version.
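Search-result shapes have varied between Firecrawl SDK versions, which is where that `AttributeError` comes from. A defensive normalizer in this spirit (a sketch, not the project's actual code) accepts dicts, tuples, and objects alike:

```python
def normalize_result(item) -> dict:
    # Results have appeared as dicts, (url, metadata) tuples,
    # or objects with attributes, depending on SDK version
    if isinstance(item, dict):
        return item
    if isinstance(item, tuple) and len(item) == 2:
        return {"url": item[0], "metadata": item[1]}
    return {"url": getattr(item, "url", None), "metadata": getattr(item, "metadata", {})}

print(normalize_result(("https://example.com", {"title": "Example"})))
# {'url': 'https://example.com', 'metadata': {'title': 'Example'}}
```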
To enable debug logging, you can add print statements in the workflow methods or modify the logging level.
```bash
# Test that all imports work
uv run python -c "from src.workflow import Workflow; print('✅ All imports successful')"
```

```python
from src.workflow import Workflow

# Test with a simple query
workflow = Workflow()
result = workflow.run("database tools")
print(f"Found {len(result.extracted_tools)} tools")
print(f"Analyzed {len(result.companies)} companies")
```

| Stage | Time | Status |
|---|---|---|
| Tool Extraction | ~30-60 seconds | ✅ |
| Company Analysis | ~2-5 minutes per tool | ✅ |
| Recommendation Generation | ~10-30 seconds | ✅ |
Total time: 3-10 minutes, depending on the number of tools found and analyzed.
- **API Security**: API keys are stored in environment variables
- **Data Protection**: No sensitive data is logged or stored
- **Web Ethics**: All web scraping follows robots.txt guidelines
- **Rate Limiting**: API calls are rate limited to avoid overloading external services
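Rate limiting can be as simple as enforcing a minimum interval between outbound calls. A minimal stdlib sketch (the project's actual mechanism may differ):

```python
import time

class RateLimiter:
    """Enforce a minimum interval between successive calls."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last = float("-inf")  # first call never waits

    def wait(self):
        # Sleep just long enough to honor the minimum interval
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()

limiter = RateLimiter(min_interval=0.05)
start = time.monotonic()
for _ in range(3):
    limiter.wait()  # call the API here
total = time.monotonic() - start
```

Three calls at a 0.05 s minimum interval take at least ~0.1 s: the first is free, the next two each wait out the interval.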
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- LangGraph for workflow orchestration
- DeepSeek AI for intelligent analysis
- Firecrawl for web scraping capabilities
- OpenRouter for AI model access
If you encounter any issues or have questions:
- Check the Troubleshooting section
- Search existing Issues
- Create a new issue with detailed information
Stay updated with the latest changes:
```bash
git pull origin main
uv sync
```