Description
Unable to create an agent: the agent is created, but it cannot be started or edited.
```
level=ERROR msg="Failed to create agent avatar" error="failed to generate image prompt: error, status code: 500, status: 500 Internal Server Error, message: could not load model - all backends returned error:
[llama-cpp]: failed to load model with internal loader: could not load model: rpc error: code = Canceled desc =
[llama-cpp-fallback]: failed to load model with internal loader: could not load model: rpc error: code = Canceled desc =
[stablediffusion-ggml]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = could not load model
[whisper]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = stat /build/models/arcee-agent: no such file or directory
[bark-cpp]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = inference failed
[piper]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = unsupported model type /build/models/arcee-agent (should end with .onnx)
[silero-vad]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = create silero detector: failed to create session: Load model from /build/models/arcee-agent failed:Load model /build/models/arcee-agent failed. File doesn't exist
[huggingface]: failed to load model with internal loader: could not load model: rpc error: code = Unknown desc = no huggingface token provided
[/build/backend/python/faster-whisper/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/faster-whisper/run.sh
[/build/backend/python/vllm/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/vllm/run.sh
[/build/backend/python/exllama2/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/exllama2/run.sh
[/build/backend/python/bark/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/bark/run.sh
[/build/backend/python/transformers/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/transformers/run.sh
[/build/backend/python/rerankers/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/rerankers/run.sh
[/build/backend/python/autogptq/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/autogptq/run.sh
[/build/backend/python/kokoro/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/kokoro/run.sh
[/build/backend/python/diffusers/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/diffusers/run.sh
[/build/backend/python/coqui/run.sh]: failed to load model with internal loader: backend not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/coqui/run.sh" source.file=/work/core/state/pool.go source.L=16
level=ERROR msg="Agent not found" name=Infobot source.file=/work/webui/routes.go source.L=204
```
The relevant part of the compose service definition:

```yaml
image: quay.io/mudler/localagi:master
environment:
  - LOCALAGI_MODEL=${MODEL_NAME:-arcee-agent}
  - LOCALAGI_MULTIMODAL_MODEL=${MULTIMODAL_MODEL:-minicpm-v-2_6}
  - LOCALAGI_IMAGE_MODEL=${IMAGE_MODEL:-sd-1.5-ggml}
```
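Note that the log shows every backend (including the image and audio loaders) trying to load `arcee-agent`, the text model, so it may be worth checking what each `${VAR:-default}` above actually resolves to. Docker Compose substitutes these from the shell environment or a `.env` file next to the compose file, falling back to the default after `:-`. A minimal sketch of such a `.env` (variable names are the ones referenced above; the values are just the same defaults, shown for illustration):

```
# .env — illustrative only; variable names come from the compose
# fragment above, values are its defaults
MODEL_NAME=arcee-agent
MULTIMODAL_MODEL=minicpm-v-2_6
IMAGE_MODEL=sd-1.5-ggml
```

If one of these is set to an unexpected value in the environment, the image-generation request could end up pointed at the wrong model.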
There is no Python in the default image. Since this is the image used by docker compose, what does that mean for the compose setup (all of the `python/*/run.sh` backends in the log above report "backend not found")?