
examples: Langchain ChatOpenAI integration doesn't work #1545

@benm5678

Description


LocalAI version:

63e1f8f

Environment, CPU architecture, OS, and Version:

AWS g5.xlarge (single A10 GPU)

Describe the bug

It seems langchain's ChatOpenAI expects <base_url>/engines/<model_name>/chat/completions to respond, but LocalAI doesn't appear to expose that route. I tried your example code and it hits the same issue: https://github.com/mudler/LocalAI/blob/master/examples/langchain/langchainpy-localai-example/full_demo.py

To Reproduce

Try the langchain example code
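
Condensed, the failing call looks roughly like this (a sketch, not the full demo; the hostname, the API key placeholder, and the pre-1.0 langchain/openai package versions are assumptions on my part):

```python
# Sketch of the failing setup; host and key are placeholders.
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(
    openai_api_base="http://local-ai-host:8080/v1",  # LocalAI's OpenAI-compatible API
    openai_api_key="not-needed",                     # LocalAI doesn't check the key
    model_name="thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf",
    temperature=0.9,
)

# Fails with: openai.error.InvalidRequestError:
# Cannot POST /v1/engines/<model_name>/chat/completions
print(llm([HumanMessage(content="hi")]))
```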

Expected behavior

Langchain's ChatOpenAI should work just by setting the base URL to LocalAI (e.g. http://local-ai-host:8080/v1).

Logs

Error from langchain: openai.error.InvalidRequestError: Cannot POST /v1/engines/thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf/chat/completions

Error in LocalAI debug: [127.0.0.1]:34992 404 - POST /v1/engines/thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf/chat/completions

Additional context

This is the output of our /models endpoint in LocalAI: {"object":"list","data":[{"id":"thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf","object":"model"}]}

Calling it with this curl command works: curl http://localhost:8000/v1/chat/completions -H "Content-Type: application/json" -d '{ "model": "thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf", "messages": [{"role": "user", "content": "hi"}], "temperature": 0.9 }'
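
The same request also works from Python with the plain openai client when `model` is passed; in the pre-1.0 client, passing `engine` instead is what builds the /v1/engines/<model>/chat/completions path, which matches the 404 above. A sketch (openai<1.0 and the port from the curl command assumed):

```python
# Sketch with the legacy openai client (openai<1.0 assumed).
import openai

openai.api_base = "http://localhost:8000/v1"  # same endpoint as the curl above
openai.api_key = "not-needed"                 # LocalAI doesn't check the key

resp = openai.ChatCompletion.create(
    model="thebloke__llama2-chat-ayt-13b-gguf__llama2-chat-ayt-13b.q5_k_s.gguf",
    messages=[{"role": "user", "content": "hi"}],
    temperature=0.9,
)
print(resp["choices"][0]["message"]["content"])
```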

Labels: bug