
Conversation

tijszwinkels

For example:

python api.py --model "Qwen/Qwen2-72B-Instruct" --reference-models "Qwen/Qwen2-72B-Instruct" "Qwen/Qwen1.5-72B-Chat" "mistralai/Mixtral-8x22B-Instruct-v0.1" "databricks/dbrx-instruct" --temperature 0.7 --max-tokens 512 --rounds 1 --port 5001

Starts an API server on port 5001 (http://localhost:5001/v1) implementing an OpenAI-compatible API.
This way, MoA can be used by any tooling and clients that support configuring a new OpenAI endpoint (such as LibreChat, Continue.dev, big-agi.com, etc.).
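Any OpenAI client library can then talk to the endpoint directly. A minimal sketch with the openai Python package, assuming the server started by the command above is running on port 5001 and does not validate the API key; the model name passed here is just the aggregator model from that command, and how api.py interprets it is an assumption:

```python
from openai import OpenAI

# Point the standard OpenAI client at the local MoA endpoint started by api.py.
client = OpenAI(
    base_url="http://localhost:5001/v1",
    api_key="not-needed",  # assumption: the local endpoint does not check the key
)

response = client.chat.completions.create(
    model="Qwen/Qwen2-72B-Instruct",  # aggregator model from the example command (assumed pass-through)
    messages=[{"role": "user", "content": "Explain Mixture-of-Agents in one sentence."}],
)
print(response.choices[0].message.content)
```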

@jkennedyvz

jkennedyvz commented Jul 25, 2024

Suggest re-visiting #32 and keeping api.py as a standalone contribution for this PR.

@tijszwinkels
Author

> Suggest re-visiting #32 and adding api.py as a standalone contribution.

Just to confirm: would you like me to add these changes to #32?

@jkennedyvz

> > Suggest re-visiting #32 and adding api.py as a standalone contribution.
>
> Just to confirm: would you like me to add these changes to #32?

I'm suggesting this PR contain only the addition of api.py and the README changes. As-is, it includes some changes to bot.py that should be handled in #32.

I am not a maintainer for this, but I think splitting up the changes and reducing non-functional changes will make these two PRs easier to approve.
