This repository contains a Python-based agent for querying and processing code documentation and files using a set of tools and language models. The agent combines LlamaParse, Ollama, and supporting modules to provide an interactive experience for querying documentation and generating code.
- Document Parsing: Uses LlamaParse to parse documents and extract relevant information.
- Query Engine: Uses a VectorStoreIndex wrapped in a QueryEngineTool to handle queries (a minimal setup sketch follows this list).
- Multiple Models: Integrates multiple language models including "mistral" and "codellama".
- Code Generation: Generates code snippets based on prompts and saves them to files.
- Retry Mechanism: Implements a retry mechanism to handle query failures gracefully.
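
The features above map onto the LlamaParse / llama_index stack roughly as shown below. This is a minimal sketch, not the repository's exact code: the input path, model name, request timeout, and tool metadata are assumptions, and an embedding model may also need to be configured for the index.

```python
# Minimal sketch: parse a document with LlamaParse, index it, and expose it as a
# QueryEngineTool. Paths, model names, and metadata are illustrative assumptions.
from llama_parse import LlamaParse
from llama_index.core import VectorStoreIndex
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.llms.ollama import Ollama

parser = LlamaParse(result_type="markdown")               # requires LLAMA_CLOUD_API_KEY
documents = parser.load_data("./data/example_docs.pdf")   # hypothetical input file

llm = Ollama(model="mistral", request_timeout=120.0)      # local Ollama model
index = VectorStoreIndex.from_documents(documents)        # embedding model: library default
query_engine = index.as_query_engine(llm=llm)

docs_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="code_docs",
        description="Answers questions about the parsed code documentation.",
    ),
)
```
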
- Mistral AI Model: Used for conversation and interactive prompts.
- Local Ollama: Used for various local processing tasks.
- Codellama (Ollama): Used specifically for code generation tasks (a sketch of the model setup follows this list).
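
As a rough sketch of the model setup (assuming the llama_index Ollama wrapper, models already pulled locally with `ollama pull`, and illustrative timeouts):

```python
# Sketch: two locally served Ollama models, one for conversation and one for
# code generation. Model names and timeouts are assumptions, not pinned config.
from llama_index.llms.ollama import Ollama

chat_llm = Ollama(model="mistral", request_timeout=120.0)    # conversation / interactive prompts
code_llm = Ollama(model="codellama", request_timeout=300.0)  # code generation

response = chat_llm.complete("Explain what a retry mechanism is in one sentence.")
print(response.text)
```
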
- Clone the Repository:
  - `git clone https://github.com/smahmudrahat/Multi-Agent-For-Code-generation-in-local-machine-.git`
- Create and Activate Virtual Environment:
  - `python3 -m venv env`
  - `source env/bin/activate`
- Install Requirements:
  - `pip install -r requirements.txt`
- Set Up Environment Variables:
  - Create a `.env` file in the root directory and add your environment variables (a sketch of loading this key follows these steps):
  - `LLAMA_CLOUD_API_KEY=your_api_key_here`
- Run the Script:
  - `python main.py`
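
If main.py reads the key with python-dotenv (an assumption; the script may load environment variables differently), the loading step would look roughly like this:

```python
# Sketch: load LLAMA_CLOUD_API_KEY from the .env file before using LlamaParse.
# Assumes the python-dotenv package is available via requirements.txt.
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env in the current directory
api_key = os.environ["LLAMA_CLOUD_API_KEY"]
```
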
- Interactive Prompt:
  - Enter prompts to query the agent. Type "q" to quit.
  - The agent will process the query and generate code snippets based on the input (a rough sketch of this loop follows the list).
- Generated Code:
  - The generated code will be displayed and saved to a file in the `output` directory.
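
Conceptually, the interactive prompt, the retry mechanism, and the saving of generated code could fit together as in the sketch below; `generate_code`, the retry count, and the file naming are hypothetical placeholders rather than the repository's actual implementation.

```python
# Sketch of the interactive loop: read a prompt, retry the query on failure,
# print the generated code, and save it under the output directory.
# generate_code() and MAX_RETRIES are illustrative placeholders.
import os

MAX_RETRIES = 3

def run_prompt_loop(generate_code):
    os.makedirs("output", exist_ok=True)
    while (prompt := input("Enter a prompt (q to quit): ")) != "q":
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                code, filename = generate_code(prompt)  # hypothetical agent call
                break
            except Exception as err:
                print(f"Attempt {attempt} failed: {err}")
        else:
            print("All retries failed; skipping this prompt.")
            continue
        print(code)
        with open(os.path.join("output", filename), "w") as f:
            f.write(code)
```
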
Contributions are welcome! Please submit a pull request or open an issue to discuss your changes.
This project was inspired by and uses code from the AI Agent Code Generator video.
This project is licensed under the MIT License.