A demonstration project showcasing microservices architecture using .NET, RabbitMQ message broker, and Python AI worker for generating dictionary words.
This project demonstrates a distributed system with the following components:
## Architecture

```mermaid
sequenceDiagram
    participant API as ".NET API"
    participant RabbitMQ as "RabbitMQ Message Broker"
    participant Consumer as ".NET Consumer"
    participant Database as "SQLite Database"
    participant AI as "Python AI Worker"
    API->>RabbitMQ: Send word generation request
    RabbitMQ->>AI: Forward request to AI worker
    AI->>RabbitMQ: Send generated words back
    RabbitMQ->>Consumer: Deliver words to consumer
    Consumer->>Database: Store words in database
    API->>Database: Query stored words
```
## Project Structure

```
├── ai/                       # Python AI service for word generation
│   ├── main.py               # Main worker script
│   ├── lib/generator.py      # OpenAI integration for word definitions
│   ├── models/word.py        # Data models
│   └── requirements.txt      # Python dependencies
├── backend/                  # .NET backend services
│   ├── dictionary.api/       # REST API service
│   ├── dictionary.consumer/  # RabbitMQ consumer service
│   └── dictionary.data/      # Data layer with Entity Framework
├── frontend/                 # React frontend application
│   ├── src/                  # Source code
│   ├── public/               # Static assets
│   ├── package.json          # Node.js dependencies
│   └── vite.config.js        # Vite configuration
├── docs/                     # Documentation
│   └── flows.mermaid         # Architecture diagrams
├── .env.example              # Environment variables template
└── docker-compose.yml        # Container orchestration
```
## Features

- REST API: .NET 9 Web API for managing dictionary words
- Message Queue: RabbitMQ for asynchronous communication
- AI Word Generation: Python service using OpenAI GPT-4o-mini to generate dictionary word definitions
- Data Persistence: SQLite database with Entity Framework Core
- Microservices: Containerized services with Docker
- Real-time Processing: Background consumer for processing messages
- Frontend: React SPA with Material-UI for user interaction
## Tech Stack

### Backend

- .NET 9: Latest .NET runtime
- ASP.NET Core: Web API framework
- Entity Framework Core: ORM with SQLite
- RabbitMQ.Client: Message queue integration
### AI Service

- Python 3.11: Runtime environment
- Pika: RabbitMQ client library
- OpenAI GPT-4o-mini: AI model for generating word definitions
- Custom AI Generator: Word generation logic with OpenAI integration
### Frontend

- React 19: Latest React version
- Material-UI (MUI): Modern React components
- Vite: Fast build tool and dev server
- Axios: HTTP client for API calls
- ESLint & Prettier: Code quality and formatting
### Infrastructure

- RabbitMQ: Message broker with management UI
- SQLite: Lightweight database
- Docker: Containerization
- Docker Compose: Multi-container orchestration
## Prerequisites

- Docker and Docker Compose
- .NET 9 SDK (for local development)
- Python 3.11+ (for local development)
- OpenAI API key (for AI word generation)
## Quick Start

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd dotnet-rabbitmq
   ```

2. Set up environment variables:

   ```bash
   # Copy the example environment file
   cp .env.example .env
   # Edit the .env file and add your OpenAI API key:
   # OPENAI_API_KEY=your_actual_api_key_here
   ```

3. Start all services:

   ```bash
   docker-compose up -d
   ```

4. Access the services:

   - Frontend: http://localhost:3000 (React SPA with Material-UI)
   - API: http://localhost:5001
   - RabbitMQ Management UI: http://localhost:15672
     - Username: `user`
     - Password: `password`
## API Endpoints

Generate words:

```http
POST /api/word/generate
Content-Type: application/json

{
  "word": "example",
  "count": 5
}
```

List all words:

```http
GET /api/word
```

Get a word by ID:

```http
GET /api/word/{id}
```
## Frontend Development

1. Navigate to the frontend directory:

   ```bash
   cd frontend
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Start the development server:

   ```bash
   npm run dev
   ```

4. Build for production:

   ```bash
   npm run build
   ```

5. Lint and format the code:

   ```bash
   npm run lint
   npm run format
   ```
## Backend Development

1. Navigate to the backend directory:

   ```bash
   cd backend
   ```

2. Restore dependencies:

   ```bash
   dotnet restore
   ```

3. Run the API:

   ```bash
   cd dictionary.api
   dotnet run
   ```

4. Run the Consumer (in a separate terminal):

   ```bash
   cd dictionary.consumer
   dotnet run
   ```
## AI Worker Development

1. Navigate to the ai directory:

   ```bash
   cd ai
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the AI worker:

   ```bash
   # Set your OpenAI API key first
   export OPENAI_API_KEY=your_api_key_here
   python main.py
   ```
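The worker's actual implementation lives in `ai/main.py`; the following is only a hedged sketch of what a Pika consumer for the `words` queue might look like, assuming the JSON message format described later (a `word` and a `count`). The function names mirror the project's conventions but are not copied from its source.

```python
import json

QUEUE = "words"  # word-generation request queue

def parse_request(body: bytes) -> tuple[str, int]:
    """Decode a request message: JSON with `word` and `count` properties."""
    msg = json.loads(body)
    return msg["word"], int(msg["count"])

def on_message(channel, method, properties, body):
    word, count = parse_request(body)
    # ... call the OpenAI-backed generator here and publish the results ...
    channel.basic_ack(delivery_tag=method.delivery_tag)  # ack only after success

def main():
    import pika  # imported here so the helpers above work without RabbitMQ installed

    conn = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = conn.channel()
    channel.queue_declare(queue=QUEUE, durable=True)
    channel.basic_consume(queue=QUEUE, on_message_callback=on_message)
    channel.start_consuming()

if __name__ == "__main__":
    main()
```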
## Database

The project uses a SQLite database, which is created automatically when the services start. The database file is stored in a Docker volume and persists between container restarts.
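The actual schema comes from the Entity Framework Core model in `dictionary.data`; as a rough illustration of what the consumer persists, here is a sketch using an invented `Words` table (the column names are assumptions, not the project's real schema):

```python
import sqlite3

# Illustrative only: the real tables are created by EF Core, and the real
# database lives in a file inside a Docker volume rather than in memory.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE IF NOT EXISTS Words (
           Id INTEGER PRIMARY KEY AUTOINCREMENT,
           Word TEXT NOT NULL,
           Definition TEXT
       )"""
)
conn.execute(
    "INSERT INTO Words (Word, Definition) VALUES (?, ?)",
    ("example", "a representative instance"),
)
conn.commit()
rows = conn.execute("SELECT Word, Definition FROM Words").fetchall()
```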
## Configuration

The project includes a `.env.example` file with all required environment variables. Copy it to `.env` and update the values as needed:

```bash
cp .env.example .env
```

The services can be configured with the following environment variables:

- `RabbitMQ__Host`: RabbitMQ server hostname (default: `rabbitmq`)
- `RabbitMQ__Port`: RabbitMQ server port (default: `5672`)
- `RabbitMQ__Username`: RabbitMQ username (default: `user`)
- `RabbitMQ__Password`: RabbitMQ password (default: `password`)
- `ConnectionStrings__Database`: SQLite connection string
- `OPENAI_API_KEY`: Your OpenAI API key (required for AI word generation)

The AI service uses the GPT-4o-mini model to generate word definitions, synonyms, antonyms, and example sentences.
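On the Python side, the same settings can be read from the environment with matching defaults. A minimal sketch (the helper name is illustrative, not project code):

```python
import os

def rabbitmq_settings() -> dict:
    """Read RabbitMQ connection settings, falling back to the defaults listed above."""
    return {
        "host": os.environ.get("RabbitMQ__Host", "rabbitmq"),
        "port": int(os.environ.get("RabbitMQ__Port", "5672")),
        "username": os.environ.get("RabbitMQ__Username", "user"),
        "password": os.environ.get("RabbitMQ__Password", "password"),
    }
```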
## Message Flow

- API → RabbitMQ: When a word generation request is made, the API publishes a message to the `words` queue
- RabbitMQ → AI Worker: The Python AI service consumes messages from the queue and generates words
- AI Worker → RabbitMQ: Generated words are published back to RabbitMQ
- RabbitMQ → Consumer: The .NET consumer processes the generated words
- Consumer → Database: Words are stored in the SQLite database
- API → Database: The API queries the database to return stored words
## Queue Configuration

The system uses RabbitMQ queues for communication:

- `words` queue: For word generation requests
- Message format: JSON with `word` and `count` properties
- Acknowledgment: Messages are acknowledged after successful processing
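A consumer can guard against malformed messages by checking that format before processing; a small defensive sketch (the helper is illustrative, not project code):

```python
import json

def validate_request(body: bytes) -> dict:
    """Validate a queue message against the format above: `word` and `count`."""
    msg = json.loads(body)
    if not isinstance(msg.get("word"), str) or not msg["word"]:
        raise ValueError("message must contain a non-empty 'word' string")
    if not isinstance(msg.get("count"), int) or msg["count"] < 1:
        raise ValueError("'count' must be a positive integer")
    return msg
```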
## Monitoring

- RabbitMQ Management UI: Monitor queue status, message rates, and connections
- Logs: Each service outputs structured logs for debugging
- Health Checks: API endpoints for service health monitoring
## Troubleshooting

Common issues:

- Services not starting: Ensure Docker is running and the required ports are available
- Database connection errors: Check if SQLite file permissions are correct
- RabbitMQ connection failures: Verify RabbitMQ is running and credentials are correct
- AI worker not processing: Check Python dependencies and RabbitMQ connectivity
View logs for specific services:

```bash
docker-compose logs dictionary-api
docker-compose logs dictionary-consumer
docker-compose logs ai-generator
docker-compose logs rabbitmq
```

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## License

This project is a demonstration/learning project. Please refer to the license file for usage terms.
Note: This is a demo project designed for learning microservices architecture with .NET and RabbitMQ. It's not intended for production use without additional security, monitoring, and reliability measures.