Welcome to the developer guide for TigerGraph GraphRAG. This guide provides information on how to add a new LangChain tool, embedding service, or LLM generation service to GraphRAG.
- Adding a New LangChain Tool
- Adding a New Embedding Service
- Adding a New LLM Generation Service
- Adding New Tests
If you want your agent to connect to other data sources or be able to perform custom logic, you can add a new LangChain tool to TigerGraph GraphRAG. To add a new LangChain tool, follow these steps:
- In the `app/tools` directory, create a new file for your tool. The file should be named `toolname.py`, where `toolname` is the name of your tool.
- Define your tool. The tool should be a valid Python class that inherits from the LangChain `BaseTool` class. For more information, refer to the LangChain documentation.
- Add your tool to the `app/tools/__init__.py` file. This file should contain an import statement for your tool. For example:

  ```python
  from .generate_function import GenerateFunction
  ```

- Enable your tool to be used by the agent. To do this, import and instantiate your tool in the `app/agent.py` file. For example:

  ```python
  from tools import GenerateFunction

  generate_function = GenerateFunction()
  ```

  Then add the tool to the `tools` list in the `Agent` class. For example:

  ```python
  tools = [mq2s, gen_func, new_tool]
  ```

- Test your tool. Run the service and verify that the tool works as expected.
- (Optional) Think your tool could be useful for others? Consider contributing it! To contribute your tool, submit a pull request to the TigerGraph GraphRAG repository and check out our contributing guidelines.
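To make the shape of a tool concrete, here is a minimal, self-contained sketch. Because LangChain may not be installed in every environment, a stand-in base class mimicking the relevant slice of `BaseTool` (`name`, `description`, `_run`) is defined inline; in a real tool you would inherit from LangChain's `BaseTool` instead. The `WordCountTool` name and logic are purely hypothetical:

```python
# Illustrative sketch only: a stand-in base class that mimics the small part
# of the LangChain BaseTool interface used here (name, description, _run).
# In app/tools/toolname.py you would subclass LangChain's BaseTool instead.
class BaseTool:
    name: str = ""
    description: str = ""

    def run(self, tool_input: str) -> str:
        return self._run(tool_input)

    def _run(self, tool_input: str) -> str:
        raise NotImplementedError


class WordCountTool(BaseTool):
    """Hypothetical example tool: counts the words in its input."""

    name: str = "word_count"
    description: str = "Counts the number of words in the given text."

    def _run(self, tool_input: str) -> str:
        return str(len(tool_input.split()))


if __name__ == "__main__":
    tool = WordCountTool()
    print(tool.run("TigerGraph GraphRAG developer guide"))  # prints "4"
```

The `name` and `description` fields matter in practice: the agent uses them to decide when to invoke the tool, so keep the description specific to what the tool actually does.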
You might want to add a new embedding service to TigerGraph GraphRAG to better fit your deployment environment. To do this, follow these steps:
- In `app/embeddings/embedding_service.py`, create a new class that inherits from the `BaseEmbeddingService` class. For example:

  ```python
  class MyEmbeddingService(BaseEmbeddingService):
      def __init__(self, config):
          super().__init__(config)
          # Add your custom initialization code here
  ```

- Implement the needed methods for your service. If you utilize a LangChain-supported embedding service, you can use the `BaseEmbeddingService` class as a reference. If you are using a custom endpoint, you will need to implement the `embed_documents` and `embed_query` methods accordingly.
- Import your service and add it to the `app/main.py` file where the `EmbeddingService` class is instantiated. For example:

  ```python
  from common.embeddings.embedding_service import MyEmbeddingService

  if llm_config["embedding_service"]["embedding_model_service"].lower() == "myembeddingservice":
      embedding_service = MyEmbeddingService(llm_config["embedding_service"])
  ```

  Note that the comparison string must be lowercase, since the configured value is lowercased before the check.

- Test your service. Run the service and verify that it works as expected.
- (Optional) Think your service could be useful for others? Consider contributing it! To contribute your service, submit a pull request to the TigerGraph GraphRAG repository and check out our contributing guidelines.
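As a hedged, self-contained sketch of the steps above, a custom embedding service might look like the following. `BaseEmbeddingService` is stubbed inline so the example runs standalone (in the repo you would inherit from the real class in `common/embeddings/embedding_service.py`), the `dimension` config key is an assumption, and the hash-based "embedding" is a placeholder for a real model call:

```python
# Illustrative sketch of a custom embedding service.
import hashlib


class BaseEmbeddingService:  # stand-in for the real base class
    def __init__(self, config):
        self.config = config


class MyEmbeddingService(BaseEmbeddingService):
    def __init__(self, config):
        super().__init__(config)
        # Hypothetical config key; a real service might read an API key
        # or endpoint URL here instead.
        self.dim = config.get("dimension", 8)

    def embed_query(self, text: str) -> list[float]:
        # Placeholder "embedding": deterministic floats derived from a hash.
        # A real implementation would call the embedding model's endpoint.
        digest = hashlib.sha256(text.encode()).digest()
        return [b / 255.0 for b in digest[: self.dim]]

    def embed_documents(self, texts: list[str]) -> list[list[float]]:
        return [self.embed_query(t) for t in texts]


if __name__ == "__main__":
    svc = MyEmbeddingService({"dimension": 4})
    print(len(svc.embed_query("hello")))         # prints 4
    print(len(svc.embed_documents(["a", "b"])))  # prints 2
```

The key contract is that `embed_query` returns one vector and `embed_documents` returns one vector per input document, with a consistent dimension across calls.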
To add a new LLM generation service to TigerGraph GraphRAG, follow these steps:
- Create a new file in the `app/llm_services` directory. The file should be named `service_name.py`, where `service_name` is the name of your service.
- Define your service. The service should be a valid Python class that inherits from the `LLM_Model` class defined in the `app/llm_services/base_llm.py` file.
- Add your service to the `app/llm_services/__init__.py` file. This file should contain an import statement for your service. For example:

  ```python
  from .service_name import ServiceName
  ```

- Import and instantiate your service in the `app/main.py` file. For example:

  ```python
  from common.llm_services import ServiceName

  # Within the elif block where the Agent class is instantiated
  elif llm_config["completion_service"]["llm_service"].lower() == "my_service":
      logger.debug(f"/{graphname}/query request_id={req_id_cv.get()} llm_service=my_service agent created")
      agent = TigerGraphAgent(ServiceName(llm_config["completion_service"]), conn, embedding_service, embedding_store)
  ```

- Test your service. Run the service and verify that it works as expected.
- (Optional) Think your service could be useful for others? Consider contributing it! To contribute your service, submit a pull request to the TigerGraph GraphRAG repository and check out our contributing guidelines.
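As a minimal sketch of what such a service class might look like, the following is self-contained: `LLM_Model` is stubbed inline (in the repo it lives in `app/llm_services/base_llm.py`), the `generate` method name and `model_name` config key are assumptions, and the echo behavior stands in for a real completion API call:

```python
# Illustrative sketch of a new LLM generation service.
class LLM_Model:  # stand-in for the real base class in base_llm.py
    def __init__(self, config):
        self.config = config


class ServiceName(LLM_Model):
    def __init__(self, config):
        super().__init__(config)
        # Hypothetical config key; a real service would also read
        # credentials and an endpoint URL from the config here.
        self.model_name = config.get("model_name", "my-model")

    def generate(self, prompt: str) -> str:
        # A real implementation would call the provider's completion
        # endpoint; this placeholder just echoes the prompt.
        return f"[{self.model_name}] {prompt}"


if __name__ == "__main__":
    svc = ServiceName({"model_name": "demo"})
    print(svc.generate("hello"))  # prints "[demo] hello"
```

Keeping all provider-specific details (credentials, endpoint, retry logic) inside the service class is what lets `app/main.py` select a service with a single string comparison on the config value.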
To add a new InquiryAI test suite to TigerGraph GraphRAG, follow these steps:
- Download the InquiryAI test template from here in `.tsv` format.
- Create a new directory in the `tests/test_questions` directory. The directory should be named `suite_name`, where `suite_name` is the name of your test suite.
- Add the `.tsv` file to the new directory, populated with your example questions and expected answers.
- (Optional) Add the necessary GSQL and setup scripts to the `tests/test_questions/suite_name` directory to support your test suite. The setup scripts are not run with the test suite, but they help to set up the graph for it; the tests assume that the graph is already set up.
- Add the necessary query descriptors to the `tests/test_questions/suite_name` directory. Within a directory named after the query, add a `.json` file with the query descriptor, and optionally a `.gsql` file with the query itself.
- Add the test suite to the `tests/test_questions/parse_test_config.py` file by adding an available schema to the `schema` argument list.
- Test your test suite. Run the tests with the following command (adding any desired options described here) and ensure that they work as expected:

  ```sh
  ./run_tests.sh
  ```

- (Optional) Think your test suite could be useful for others? Consider contributing it! To contribute your test suite, submit a pull request to the TigerGraph GraphRAG repository and check out our contributing guidelines.
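Since the suite's `.tsv` file is just example questions paired with expected answers, it can be read with Python's standard library. In this sketch the column headers `question` and `expected_answer` are assumptions; use the headers from the downloaded template in practice:

```python
# Sketch: reading a test-suite .tsv of questions and expected answers.
# Column names here are hypothetical; match them to the template's headers.
import csv
import io

# Inline sample standing in for tests/test_questions/suite_name/<file>.tsv
sample_tsv = "question\texpected_answer\nHow many nodes are there?\t42\n"

with io.StringIO(sample_tsv) as f:
    # delimiter="\t" is what makes this parse as TSV rather than CSV
    rows = list(csv.DictReader(f, delimiter="\t"))

for row in rows:
    print(row["question"], "->", row["expected_answer"])
```

Tab-separated values are a good fit here because natural-language questions frequently contain commas, which would complicate a plain CSV.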