Rename huggingface-embeddings -> sentencetransformers in embeddings.md for consistency with the backend structure
The Dockerfile still recognizes the `huggingface-embeddings` backend (I assume for compatibility reasons) but uses the `sentencetransformers` backend under the hood anyway.

I figured it would be good to update the docs to use the new naming to make it less confusing going forward. Since the Docker container recognizes both the `huggingface-embeddings` and the `sentencetransformers` backends, this should not break anything.
Signed-off-by: Marcus Köhler <[email protected]>
````diff
-To use `sentence-transformers` and models in `huggingface` you can use the `huggingface` embedding backend.
+To use `sentence-transformers` and models in `huggingface` you can use the `sentencetransformers` embedding backend.

 ```yaml
 name: text-embedding-ada-002
-backend: huggingface-embeddings
+backend: sentencetransformers
 embeddings: true
 parameters:
   model: all-MiniLM-L6-v2
 ```

-The `huggingface` backend uses Python [sentence-transformers](https://github.com/UKPLab/sentence-transformers). For a list of all pre-trained models available see here: https://github.com/UKPLab/sentence-transformers#pre-trained-models
+The `sentencetransformers` backend uses Python [sentence-transformers](https://github.com/UKPLab/sentence-transformers). For a list of all pre-trained models available see here: https://github.com/UKPLab/sentence-transformers#pre-trained-models

 {{% notice note %}}

-- The `huggingface` backend is an optional backend of LocalAI and uses Python. If you are running `LocalAI` from the containers you are good to go and should be already configured for use. If you are running `LocalAI` manually you must install the python dependencies (`pip install -r /path/to/LocalAI/extra/requirements`) and specify the extra backend in the `EXTERNAL_GRPC_BACKENDS` environment variable ( `EXTERNAL_GRPC_BACKENDS="huggingface-embeddings:/path/to/LocalAI/backend/python/sentencetransformers/sentencetransformers.py"` ) .
-- The `huggingface` backend does support only embeddings of text, and not of tokens. If you need to embed tokens you can use the `bert` backend or `llama.cpp`.
-- No models are required to be downloaded before using the `huggingface` backend. The models will be downloaded automatically the first time the API is used.
+- The `sentencetransformers` backend is an optional backend of LocalAI and uses Python. If you are running `LocalAI` from the containers you are good to go and should be already configured for use. If you are running `LocalAI` manually you must install the python dependencies (`pip install -r /path/to/LocalAI/extra/requirements`) and specify the extra backend in the `EXTERNAL_GRPC_BACKENDS` environment variable ( `EXTERNAL_GRPC_BACKENDS="sentencetransformers:/path/to/LocalAI/backend/python/sentencetransformers/sentencetransformers.py"` ) .
+- The `sentencetransformers` backend does support only embeddings of text, and not of tokens. If you need to embed tokens you can use the `bert` backend or `llama.cpp`.
+- No models are required to be downloaded before using the `sentencetransformers` backend. The models will be downloaded automatically the first time the API is used.
````
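Once a model with the `sentencetransformers` backend is configured as in the YAML above, embeddings can be requested through LocalAI's OpenAI-compatible HTTP API. A minimal client sketch, assuming LocalAI is listening on `http://localhost:8080` and exposes the OpenAI-style `/embeddings` endpoint (the host, port, and the `build_embedding_request`/`embed` helper names are illustrative, not part of LocalAI):

```python
import json
import urllib.request


def build_embedding_request(text, model="text-embedding-ada-002",
                            base_url="http://localhost:8080"):
    """Build the URL and JSON payload for an OpenAI-style embeddings call."""
    return f"{base_url}/embeddings", {"model": model, "input": text}


def embed(text, model="text-embedding-ada-002",
          base_url="http://localhost:8080"):
    """POST the request to LocalAI and return the embedding vector."""
    url, payload = build_embedding_request(text, model, base_url)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style response shape: {"data": [{"embedding": [...]}], ...}
    return body["data"][0]["embedding"]
```

Note that the first call may take noticeably longer, since (as mentioned in the note above) the backend downloads the model on first use.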