Conversation


@mudler mudler commented Mar 7, 2025

Description

This PR fixes embedding calculation for embedding-only models in the llama.cpp backend. Embeddings are now processed in batches.
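The PR's actual change lives in the llama.cpp backend, but the batching idea it describes can be illustrated with a minimal, self-contained Go sketch. Everything below (`chunkTokens`, the batch size, the token values) is hypothetical and for illustration only; the real backend drives llama.cpp's batch API rather than this helper.

```go
package main

import "fmt"

// chunkTokens splits a token sequence into batches of at most batchSize
// tokens. This mirrors the general approach of processing embeddings in
// batches instead of feeding the whole input at once; it is not the PR's code.
func chunkTokens(tokens []int, batchSize int) [][]int {
	var batches [][]int
	for start := 0; start < len(tokens); start += batchSize {
		end := start + batchSize
		if end > len(tokens) {
			end = len(tokens)
		}
		batches = append(batches, tokens[start:end])
	}
	return batches
}

func main() {
	// 10 tokens with a batch size of 4 yield batches of 4, 4, and 2 tokens.
	tokens := []int{0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
	for _, b := range chunkTokens(tokens, 4) {
		fmt.Println(len(b))
	}
}
```

Each batch would then be submitted to the model in turn, with the per-sequence embeddings collected after every batch completes.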

Notes for Reviewers

Signed commits

  • Yes, I signed my commits.

@mudler mudler added the bug Something isn't working label Mar 7, 2025

netlify bot commented Mar 7, 2025

Deploy Preview for localai ready!

🔨 Latest commit: 20d3272
🔍 Latest deploy log: https://app.netlify.com/sites/localai/deploys/67cb2c6c7abba4000883885f
😎 Deploy Preview: https://deploy-preview-4957--localai.netlify.app

@mudler mudler merged commit e4fa894 into master Mar 7, 2025
25 checks passed
@mudler mudler deleted the fix/llamacpp-embeddings branch March 7, 2025 18:29
