This fullstack AI + data integration project performs "Text-to-SQL" queries using LLMs (ChatGPT, Claude, and local models via Ollama) over a small dataset stored locally. More data can be added.
- Fullstack Application: Combines a React frontend with a LangGraph backend, representing each state of execution as a node.
- Automatic Routing: Identifies which action to perform (RAG-based or analytics), addresses knowledge gaps, and refines searches.
- Data Files: The data files are located in the `backend/data` folder (including `products.csv` and `rules.md`), from which the `store` folder is created, containing SQLite databases for both the tabular data and the embeddings.
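The ingestion of `products.csv` into the analytics SQLite store can be sketched as below. This is an illustrative stdlib-only sketch, not the project's actual `ingest.py`; the table name `products` and the all-TEXT schema are assumptions:

```python
import csv
import sqlite3

def ingest_products(csv_path: str, db_path: str) -> int:
    """Load rows from a products CSV into a SQLite table for ad-hoc analytics."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]

    conn = sqlite3.connect(db_path)
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    conn.execute(f"CREATE TABLE IF NOT EXISTS products ({cols})")
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f"INSERT INTO products VALUES ({placeholders})", data)
    conn.commit()
    conn.close()
    return len(data)
```

The embeddings side (`rag.sqlite`) follows the same pattern but stores chunk vectors instead of raw rows.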
The project is organized into two main directories:

- `frontend/`: Contains the React application built with Vite.
- `backend/`: Contains the LangGraph/FastAPI application, including the agent logic.
- Data Ingestion & Indexing: Raw data is ingested and indexed into a vector store.
- Context Retrieval: Retrieves relevant context for free-text questions.
- Ad-hoc Analytics: Performs SQL-style queries for analytics.
- Natural Language Generation: Generates natural answers via OpenAI, with a persona flag to adjust tone.
- HTTP API Exposure: Exposes a simple HTTP API.
- Docker Containerization: Containerizes the application for deployment.
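The pipeline above hinges on the routing decision between RAG retrieval and SQL analytics. A minimal sketch of that decision is shown below; the keyword heuristic is purely illustrative, standing in for the LLM-based routing presumably implemented in `backend/src/agent/router.py`:

```python
def route_question(question: str) -> str:
    """Decide whether a question needs SQL analytics or RAG retrieval.

    Keyword matching is a stand-in for the agent's real router logic.
    """
    analytics_markers = ("average", "maximum", "minimum", "sum",
                         "count", "turnover", "segment")
    q = question.lower()
    if any(marker in q for marker in analytics_markers):
        return "analytics"   # run a SQL query against the analytics store
    return "rag"             # retrieve context chunks from the embeddings store
```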
- React (with Vite) - For the frontend user interface.
- Tailwind CSS - For styling.
- Shadcn UI - For UI components.
- LangGraph - For building the backend research agent.
- OpenAI - ChatGPT as the LLM for query generation and answer synthesis.
- Ollama - Local LLMs such as llama3.2 or llama3.3 for query generation, so the data stays on your local machine.
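Switching between the hosted and local providers can be sketched as a simple configuration lookup. The environment variable name `LLM_PROVIDER` and the model names are illustrative assumptions; the project's real wiring lives in `backend/src/agent/configuration.py`:

```python
import os

def pick_model(provider: str = "") -> str:
    """Return a model name for the chosen provider.

    The defaults here are assumptions for illustration, not the
    project's actual configuration.
    """
    provider = provider or os.environ.get("LLM_PROVIDER", "openai")
    if provider == "ollama":
        return "llama3.2"     # served locally, so data never leaves the machine
    return "gpt-4o-mini"      # assumed OpenAI default
```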
- Node.js and npm (or yarn/pnpm)
- Python 3.11+
- API Keys: The backend agent requires an OpenAI API key.
- Navigate to the main directory.
- Navigate to the `backend\` folder.
- Add the following to the `.env` file:
  - OpenAI API key: `OPENAI_API_KEY=sk-proj-..........`
  - LangSmith API key: `LANGSMITH_API_KEY=lsv2_...........`
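A quick sanity check that the required key is set can save a confusing startup error later. The sketch below uses only the stdlib; treating `LANGSMITH_API_KEY` as optional (tracing only) is an assumption:

```python
REQUIRED = ("OPENAI_API_KEY",)   # LANGSMITH_API_KEY assumed optional (tracing)

def missing_keys(env: dict) -> list:
    """Return the required key names that are absent or empty."""
    return [k for k in REQUIRED if not env.get(k)]
```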
- Go to `frontend\` and use `npm` to download the node modules:

  ```shell
  cd frontend
  npm install
  ```
- To fix vulnerabilities, if any occur:
  - To address issues that do not require attention, run:

    ```shell
    npm audit fix
    ```

  - To address all issues (including breaking changes), run:

    ```shell
    npm audit fix --force
    ```
- Run the command below to open the frontend web app at http://localhost:5173/app:

  ```shell
  npm run dev
  ```
- Navigate to `backend\` and use `uv` as the package manager to set up the Python environment as stated in `pyproject.toml`:

  ```shell
  cd backend
  uv sync
  ```
- Run the LangGraph development server with the `--allow-blocking` flag, which enables blocking operations in the agent workflow during development:

  ```shell
  uv run langgraph dev --allow-blocking
  ```
- The backend API will be available at http://127.0.0.1:2024/docs. A browser window will also open to the FastAPI docs, where requests can be tried out. The core of the backend is a LangGraph agent defined in `backend/src/agent/graph.py`, which follows the pipeline steps listed above.

Click on http://localhost:5173/app/ to see the web application.
Example questions to try:

- What is the average, maximum, and minimum turnover of each country?
- What are the countries with the maximum turnover?
- List all the countries segment-wise.
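For reference, the first sample question corresponds to a straightforward `GROUP BY` aggregation. The sketch below runs it on toy data; the column names (`country`, `segment`, `turnover`) are assumptions inferred from the sample questions, not the actual `products.csv` schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (country TEXT, segment TEXT, turnover REAL)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?, ?)",
    [("Germany", "Retail", 100.0), ("Germany", "Online", 300.0),
     ("France", "Retail", 200.0)],
)

# "What is the average, maximum, and minimum turnover of each country?"
rows = conn.execute(
    """SELECT country, AVG(turnover), MAX(turnover), MIN(turnover)
       FROM products GROUP BY country ORDER BY country"""
).fetchall()
```

The agent's Text-to-SQL step generates a query of this shape from the natural-language question and executes it against the analytics store.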
1. Build the Docker image:

   Notice: you might encounter some issues when building the image; I am currently fixing the error.

   Run the following command from the project root directory:

   ```shell
   docker build -t hybrid-rag-service -f Dockerfile .
   ```

2. Run the production server:

   ```shell
   docker-compose up
   ```

The project is organized into two main directories:
backend/
├── .langgraph_api/
├── src/
│ ├── agent/
│ │ ├── __init__.py
│ │ ├── app.py
│ │ ├── configuration.py
│ │ ├── graph.py
│ │ ├── router.py
│ │ └── state.py
│ ├── data/
│ │ ├── products.csv
│ │ └── rules.md
│ ├── store/
│ │ ├── analytic.sqlite
│ │ └── rag.sqlite
│ ├── wrangler/
│ │ ├── embedding/
│ │ │ ├── __init__.py
│ │ │ ├── base.py
│ │ │ └── openai.py
│ │ ├── model/
│ │ │ ├── __init__.py
│ │ │ ├── chunk.py
│ │ │ ├── document.py
│ │ │ └── product.py
│ │ ├── repository/
│ │ │ ├── __init__.py
│ │ │ ├── analytic.py
│ │ │ ├── base.py
│ │ │ ├── chunk.py
│ │ │ ├── document.py
│ │ │ └── store.py
│ │ ├── __init__.py
│ │ ├── ingest.py
│ │ ├── qa_agent.py
│ │ ├── queryTranslation.py
│ │ ├── ragUtil.py
│ │ └── repository.py
│ └── __init__.py
├── .venv/
├── test-agent.ipynb
├── uv.lock
├── .gitignore
├── langgraph.json
├── LICENSE
├── Makefile
└── pyproject.toml
frontend/
├── node_modules/
├── public/
│ └── vite.svg
├── src/
│ ├── components/
│ │ ├── ui/
│ │ │ ├── badge.tsx
│ │ │ ├── button.tsx
│ │ │ ├── card.tsx
│ │ │ ├── input.tsx
│ │ │ ├── scroll-area.tsx
│ │ │ ├── select.tsx
│ │ │ ├── tabs.tsx
│ │ │ └── textarea.tsx
│ │ ├── ActivityTimeline.tsx
│ │ ├── ChatMessagesView.tsx
│ │ ├── InputForm.tsx
│ │ ├── TableView.tsx
│ │ └── WelcomeScreen.tsx
│ ├── lib/
│ │ └── utils.ts
│ ├── App.tsx
│ ├── global.css
│ ├── main.tsx
│ └── vite-env.d.ts
├── .gitignore
├── components.json
├── eslint.config.js
├── index.html
├── package-lock.json
├── package.json
├── tsconfig.json
├── tsconfig.node.json
└── vite.config.ts
