
GML-2015 Add UI page for configurations, role-based access #28

Open
prinskumar-tigergraph wants to merge 3 commits into main from ui_page_server_config_clean

Conversation

@prinskumar-tigergraph (Collaborator) commented Mar 11, 2026

User description

Updated code for:
- Pages for configuration
- Role-based access to set up configurations
- Graph-level access to change configuration


PR Type

Enhancement, Bug fix


Description

  • Add setup UI for configs and prompts

  • Enforce role-based access across endpoints

  • Support graph-specific LLM/prompt configurations

  • Fix VertexAI imports and model naming


Diagram Walkthrough

flowchart LR
  uiSetup["Setup UI pages (KG Admin, Server Config, Prompts)"]
  apiRoutes["New UI API routes (/config, /prompts, /roles)"]
  roleGuard["Role checks (superuser/globaldesigner/graph admin)"]
  cfgReload["Config reload (LLM/DB/GraphRAG)"]
  perGraph["Per-graph completion config + prompts"]
  eccJobs["ECC jobs use fresh config"]
  vertexFix["VertexAI import/model fixes"]

  uiSetup -- "calls" --> apiRoutes
  apiRoutes -- "protected by" --> roleGuard
  apiRoutes -- "persist + sanitize" --> perGraph
  apiRoutes -- "trigger" --> cfgReload
  cfgReload -- "applies to" --> eccJobs
  perGraph -- "used by" --> eccJobs
  vertexFix -- "stabilizes" --> eccJobs

File Walkthrough

Relevant files

Enhancement (16 files)
- ui.py: Add config/prompts APIs with role-based access (+904/-8)
- config.py: Graph-specific config resolution and reload utilities (+277/-25)
- agent.py: Use per-graph completion config for agents (+22/-23)
- main.py: Reload configs at job start and consistency routes (+47/-0)
- ecc_util.py: LLM provider selection via per-graph config (+23/-20)
- community_summarizer.py: Load community summary prompt from configured path (+32/-13)
- workers.py: Pass graph to LLM provider for summarization (+1/-1)
- KGAdmin.tsx: KG admin page for init, ingest, refresh (+664/-0)
- GraphRAGConfig.tsx: UI to edit GraphRAG processing settings (+408/-0)
- ModeToggle.tsx: Show Setup link based on resolved roles (+73/-9)
- main.tsx: Router for setup sections and redirects (+43/-3)
- IngestGraph.tsx: Ingestion UI for local and cloud sources (+1557/-0)
- LLMConfig.tsx: UI to edit and test LLM services (+1344/-0)
- GraphDBConfig.tsx: UI to edit and test DB connection (+437/-0)
- CustomizePrompts.tsx: UI to view and save prompt files (+298/-0)
- SetupLayout.tsx: Shared layout for setup navigation (+260/-0)

Bug fix (2 files)
- embedding_services.py: Fix VertexAI embeddings import and parameters (+3/-2)
- google_vertexai_service.py: Switch to official VertexAI client and args (+3/-2)

Configuration changes (1 file)
- nginx.conf: Add reverse proxy rules for /setup routes (+10/-0)

Dependencies (1 file)
- requirements.txt: Add langchain-google-vertexai dependency (+1/-0)

Additional files (6 files)
- community_summarization.txt (+11/-0)
- community_summarization.txt (+11/-0)
- community_summarization.txt (+11/-0)
- community_summarization.txt (+11/-0)
- community_summarization.txt (+11/-0)
- Bot.tsx (+1/-0)

tg-pr-agent bot commented Mar 11, 2026

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 5 🔵🔵🔵🔵🔵
🧪 No relevant tests
🔒 Security concerns

Path traversal:
In graphrag/app/routers/ui.py, user-provided graphname is used directly to construct filesystem paths (e.g., common/prompts/{graphname}, configs/{graphname}/server_config.json) without sanitization. A crafted graphname like ../../some/dir could lead to writing outside intended directories. Sanitize/validate graphname against a strict allowlist and normalize paths before file operations.

Sensitive information handling: reload_llm_config may erase authentication_configuration when saving UI-provided configs (which are intentionally stripped of secrets by get_config). This can break services and may cause admins to re-enter secrets unnecessarily. Merge configs to preserve secrets when not provided.

Additionally, several admin-only endpoints return raw exception messages in responses. While gated by role checks, consider standardizing error messages to avoid leaking internal details.
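One way to standardize error messages is a small helper that logs the full exception server-side and returns only a generic payload to the client. This is a hedged sketch, not code from the PR; the name `safe_error_response` is illustrative:

```python
import logging

logger = logging.getLogger(__name__)

def safe_error_response(exc: Exception,
                        public_message: str = "Internal server error") -> dict:
    """Log full exception detail for operators; return only a generic
    message to the client so internal paths/stack traces never leak."""
    logger.exception("Admin endpoint failed: %s", exc)
    return {"status": "error", "detail": public_message}
```

Admin endpoints would call this in their `except` blocks instead of interpolating `str(e)` into the response body.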

⚡ Recommended focus areas for review

Path Traversal Risk

Unsanitized graphname is interpolated into filesystem paths for prompts/configs, enabling potential directory traversal and arbitrary file overwrite/creation. Sanitize graphname (e.g., allowlist of [A-Za-z0-9_-]) before using it in paths.

if graphname:
    # Create graph-specific prompt dir, seed from default (first time only)
    graph_prompt_dir = f"common/prompts/{graphname}"
    os.makedirs(graph_prompt_dir, exist_ok=True)
    if os.path.exists(default_prompt_path):
        for fname in os.listdir(default_prompt_path):
            src = os.path.join(default_prompt_path, fname)
            dst = os.path.join(graph_prompt_dir, fname)
            if os.path.isfile(src) and not os.path.exists(dst):
                shutil.copy2(src, dst)

    # Create or update configs/{graphname}/server_config.json
    graph_config_dir = f"configs/{graphname}"
    os.makedirs(graph_config_dir, exist_ok=True)
    graph_config_path = os.path.join(graph_config_dir, "server_config.json")
    if not os.path.exists(graph_config_path):
        with open(SERVER_CONFIG, "r") as f:
            graph_server_config = json.load(f)
    else:
        with open(graph_config_path, "r") as f:
            graph_server_config = json.load(f)
    graph_server_config["llm_config"]["completion_service"]["prompt_path"] = f"./{graph_prompt_dir}/"
    with open(graph_config_path, "w") as f:
        json.dump(graph_server_config, f, indent=2)

    prompt_path = graph_prompt_dir
else:
    prompt_path = default_prompt_path
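A minimal sketch of the suggested mitigation (the helper name `safe_graph_prompt_dir` and the allowlist constant are illustrative, not from the PR): validate `graphname` against a strict `[A-Za-z0-9_-]` allowlist, then confirm the resolved path stays under the base directory before any file operation:

```python
import re
from pathlib import Path

GRAPHNAME_RE = re.compile(r"^[A-Za-z0-9_-]+$")

def safe_graph_prompt_dir(graphname: str, base: str = "common/prompts") -> Path:
    """Reject graph names outside the allowlist, then confine the
    resulting path to the intended base directory."""
    if not GRAPHNAME_RE.fullmatch(graphname):
        raise ValueError(f"Invalid graph name: {graphname!r}")
    base_path = Path(base).resolve()
    target = (base_path / graphname).resolve()
    # Belt and braces: even an allowlisted name must resolve under base
    if base_path not in target.parents:
        raise ValueError(f"Path escapes base directory: {target}")
    return target
```

The same check would apply to the `configs/{graphname}` path before `os.makedirs` and the `json.dump` write.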
Secret Wipe on Save

reload_llm_config overwrites the on-disk llm_config with the incoming payload without preserving authentication_configuration. Since get_config strips secrets before sending to UI, saving back may erase provider credentials. Consider merging to retain existing secrets when not explicitly provided.

# Preserve existing API keys if not provided in new config
existing_llm_config = server_config.get("llm_config", {})

# Directly save the new LLM config without preserving old API keys
server_config["llm_config"] = new_llm_config

with open(SERVER_CONFIG, "w") as f:
    json.dump(server_config, f, indent=2)
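A hedged sketch of the suggested merge (the function name `merge_llm_config` is hypothetical): overlay the incoming payload onto the stored config, but keep the stored `authentication_configuration` whenever the UI payload omits or blanks it, since `get_config` strips secrets before sending to the UI:

```python
def merge_llm_config(existing: dict, incoming: dict,
                     secret_key: str = "authentication_configuration") -> dict:
    """Recursively overlay incoming onto existing, preserving stored
    secrets when the incoming payload omits or blanks them."""
    merged = dict(existing)
    for key, value in incoming.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_llm_config(merged[key], value, secret_key)
        elif key == secret_key and not value:
            continue  # blank/stripped secret from the UI: keep stored one
        else:
            merged[key] = value
    return merged
```

`reload_llm_config` would then write `merge_llm_config(existing_llm_config, new_llm_config)` instead of the raw payload.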
Per-Graph Prompt Not Respected

The community summarization prompt is loaded once at import using the default completion_config, ignoring per-graph overrides and runtime config reloads. This likely causes mismatched prompts when graph-specific prompt paths are configured.

# Load prompt from file
def load_community_prompt():
    prompt_path = completion_config.get("prompt_path", "./common/prompts/openai_gpt4/")
    if prompt_path.startswith("./"):
        prompt_path = prompt_path[2:]
    prompt_path = prompt_path.rstrip("/")

    prompt_file = os.path.join(prompt_path, "community_summarization.txt")
    if not os.path.exists(prompt_file):
        error_msg = f"Community summarization prompt file not found: {prompt_file}. Please ensure the file exists in the configured prompt path."
        logger.error(error_msg)
        raise FileNotFoundError(error_msg)

    try:
        with open(prompt_file, "r", encoding="utf-8") as f:
            content = f.read()
            logger.info(f"Successfully loaded community summarization prompt from: {prompt_file}")
            return content
    except Exception as e:
        error_msg = f"Failed to read community summarization prompt from {prompt_file}: {str(e)}"
        logger.error(error_msg)
        raise Exception(error_msg)


# src: https://github.com/microsoft/graphrag/blob/main/graphrag/index/graph/extractors/summarize/prompts.py
SUMMARIZE_PROMPT = PromptTemplate.from_template(load_community_prompt())
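One way to respect per-graph overrides is to read the prompt file at job start instead of module import, and build the `PromptTemplate` per job rather than as a module-level constant. A sketch under that assumption (the helper name `get_summarize_prompt` is illustrative; callers would pass the prompt path resolved from the per-graph config at runtime):

```python
import os

def get_summarize_prompt(prompt_dir: str) -> str:
    """Read community_summarization.txt each time a summarization job
    starts, so per-graph prompt paths and config reloads take effect."""
    if prompt_dir.startswith("./"):
        prompt_dir = prompt_dir[2:]
    prompt_file = os.path.join(prompt_dir.rstrip("/"),
                               "community_summarization.txt")
    with open(prompt_file, "r", encoding="utf-8") as f:
        return f.read()
```

The worker would then call `PromptTemplate.from_template(get_summarize_prompt(resolved_prompt_path))` inside the job, where `resolved_prompt_path` comes from the per-graph completion config.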

