A full-stack AI agent that answers business questions by decomposing them into sub-queries using the ThoughtSpot REST API, fetching data for each, and streaming a Gemini-powered analysis with an auto-generated ThoughtSpot Liveboard.
```typescript
import { ThoughtSpotRestApi, createBearerAuthenticationConfig } from "@thoughtspot/rest-api-sdk";
import { GoogleGenerativeAI } from "@google/generative-ai";

const thoughtSpotClient = new ThoughtSpotRestApi(
  createBearerAuthenticationConfig(THOUGHTSPOT_HOST, async () => BEARER_TOKEN)
);

// Step 1: Decompose a high-level question into answerable sub-queries
const result = await thoughtSpotClient.queryGetDecomposedQuery({
  nlsRequest: { query: "How can I increase sales?" },
  content: [],
  worksheetIds: [DATASOURCE_ID],
});
const subQuestions = result.decomposedQueryResponse?.decomposedQueries?.map(q => q.query!) ?? [];
// e.g. ["What is revenue by region?", "What are top products by sales?", ...]

// Step 2: Get an answer + CSV data for each sub-question
for (const subQuestion of subQuestions) {
  const answer = await thoughtSpotClient.singleAnswer({
    query: subQuestion,
    metadata_identifier: DATASOURCE_ID,
  });
  const csvData = await thoughtSpotClient.exportAnswerReport({
    session_identifier: answer.session_identifier!,
    generation_number: answer.generation_number!,
    file_format: "CSV",
  });
  // csvData is collected and later passed to Gemini as context
}

// Step 3: Create a Liveboard with all answers via TML import
await thoughtSpotClient.importMetadataTML({
  metadata_tmls: [JSON.stringify(liveboardTml)],
  import_policy: "ALL_OR_NONE",
});

// Step 4: Send the answers to Gemini and stream the analysis back to the user
const genAI = new GoogleGenerativeAI(GEMINI_API_KEY);
const model = genAI.getGenerativeModel({
  model: "gemini-2.5-flash",
  tools: [{ functionDeclarations: [getRelevantDataFunctionDefinition] }],
});
const chat = model.startChat();
const stream = await chat.sendMessageStream(userQuery);
for await (const chunk of stream.stream) {
  res.write(chunk.text()); // streamed to the client as it arrives
}
```

- Interactive streaming chat interface built with React + Ant Design
- Uses `queryGetDecomposedQuery` to break complex business questions into answerable sub-queries
- Fetches CSV data per sub-query using `singleAnswer` + `exportAnswerReport`
- Auto-creates a ThoughtSpot Liveboard from all answers and embeds it inline via `importMetadataTML`
- Gemini function-calling agent orchestrates the full flow and streams a summary with citations
- Clone the repository
- Install dependencies: `npm install`
- Set up environment variables: update the `.env` file in the root directory with the following variables:

  ```
  GEMINI_API_KEY=your_gemini_api_key
  VITE_THOUGHTSPOT_HOST=your_thoughtspot_host
  VITE_TOKEN_SERVER=your_token_server
  VITE_TS_BEARER_TOKEN=your_bearer_token (if not providing a token server)
  VITE_TS_DATASOURCE_ID=your_datasource_id
  VITE_USERNAME=your_username
  ```

- Start both the frontend and backend servers: `npm run dev:all`

  Or start them separately:

  ```shell
  # Start the backend server
  npm run api

  # In a new terminal, start the frontend development server
  npm run dev
  ```

- Open your browser and navigate to `http://localhost:5173`
- Go to ThoughtSpot => Develop => REST Playground v2.0
- Authentication => Get Full Access Token
- Scroll down and expand the "body" section
- Add your "username" and "password"
- Set "validity_time" to however long you want the token to remain valid
- Click "Try it out" at the bottom right
- The token in the response is your bearer token
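Alternatively, the same token can be fetched programmatically from the v2 full access token endpoint (`POST /api/rest/2.0/auth/token/full`). The sketch below is an assumption-labeled illustration, not code from this repo: the host and credentials are placeholders, and only the request construction is shown; the actual `fetch` call is left commented out.

```typescript
// Builds a fetch-ready request for ThoughtSpot's full access token endpoint.
// `host`, `username`, and `password` are placeholders you must supply.
function buildTokenRequest(
  host: string,
  username: string,
  password: string,
  validitySecs: number
) {
  return {
    url: `${host}/api/rest/2.0/auth/token/full`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json", Accept: "application/json" },
      body: JSON.stringify({
        username,
        password,
        validity_time_in_sec: validitySecs, // token lifetime in seconds
      }),
    },
  };
}

// Usage (not executed here):
// const { url, options } = buildTokenRequest(HOST, USER, PASS, 3600);
// const { token } = await (await fetch(url, options)).json();
```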
- `src/` - Frontend React application
  - `components/` - React components
    - `ChatButton.tsx` - Floating chat button component
    - `ChatSidebar.tsx` - Chat sidebar with streaming responses
  - `App.tsx` - Main application component
  - `main.tsx` - Application entry point
- `api/` - Backend Express server
  - `agent.ts` - AI agent implementation with Gemini integration
  - `relevant-data.ts` - Data retrieval functions
  - `thoughtspot.ts` - ThoughtSpot API integration
- When the application loads, it initializes a chat session by calling `/api/start`
- The user can click the chat button in the bottom right corner to open the chat sidebar
- The user can type a message and send it to the AI agent
- The agent processes the message, retrieves relevant data from ThoughtSpot, and streams the response back to the user
- The response is displayed in the chat interface in real-time
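The real-time relay in the last two steps can be sketched as a small helper. This is an illustrative sketch, not code from `api/agent.ts`: the `Writable` interface is just the subset of Express's response object the helper touches, and the names are assumptions.

```typescript
// Minimal subset of an Express response used by the relay (illustrative).
interface Writable {
  write(data: string): void;
  end(): void;
}

// Pipe an async stream of text chunks to the client as they arrive.
// In the real app, `chunks` would be the text of each Gemini stream chunk.
async function relayStream(res: Writable, chunks: AsyncIterable<string>): Promise<void> {
  for await (const text of chunks) {
    res.write(text); // flush each chunk immediately for real-time display
  }
  res.end();
}
```

In the actual route this would be called with the response object and the Gemini stream, e.g. each element being `chunk.text()` from `chat.sendMessageStream(...)`.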
- React
- TypeScript
- Ant Design
- Vite
- Express
- Google Generative AI
- ThoughtSpot REST API SDK
- ThoughtSpot REST API SDK
- `queryGetDecomposedQuery` (NLS Query Decomposition)
- `singleAnswer` API
- `importMetadataTML`
- Google Gemini Function Calling
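For reference, a Gemini tool declaration like the `getRelevantDataFunctionDefinition` passed to `getGenerativeModel` above might look like the following. This is an assumed shape for illustration, not copied from the repo; the lowercase string type values match the `SchemaType` enum values in `@google/generative-ai`.

```typescript
// Illustrative function declaration the agent could expose to Gemini.
// Name, description, and parameters are assumptions, not the repo's actual code.
const getRelevantDataFunctionDefinition = {
  name: "getRelevantData",
  description:
    "Decompose a business question into sub-queries and fetch ThoughtSpot answer data for each.",
  parameters: {
    type: "object", // SchemaType.OBJECT
    properties: {
      query: {
        type: "string", // SchemaType.STRING
        description: "The user's high-level business question",
      },
    },
    required: ["query"],
  },
};
```

When Gemini decides the tool is needed, it returns a function call with a `query` argument, the backend runs the ThoughtSpot flow, and the result is sent back to the model to ground the streamed summary.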
```shell
$ git clone https://github.com/thoughtspot/developer-examples
$ cd visual-embed/spotter/query-decomposition
$ npm i
$ npm run dev:all
```
- React
- TypeScript
- Web