Commit fd1b7b3

docs: Add docker instructions, add community projects section in README (#1359)
docs: Add docker instructions
1 parent 687730a commit fd1b7b3

File tree

2 files changed (+40, -6 lines)


README.md

Lines changed: 10 additions & 2 deletions
@@ -109,9 +109,17 @@ Hot topics:
 
 Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.
 
-### 💡 Example: Use Luna-AI Llama model
+### Community
 
-See the [documentation](https://localai.io/basics/getting_started)
+WebUI
+- https://github.com/Jirubizu/localai-admin
+- https://github.com/go-skynet/LocalAI-frontend
+
+Model galleries
+- https://github.com/go-skynet/model-gallery
+
+Other:
+- Helm chart https://github.com/go-skynet/helm-charts
 
 ### 🔗 Resources

docs/content/getting_started/_index.en.md

Lines changed: 30 additions & 4 deletions
@@ -6,13 +6,36 @@ weight = 1
 url = '/basics/getting_started/'
 +++
 
-`LocalAI` is available as a container image and binary. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+`LocalAI` is available as a container image and binary. It can be used with Docker, Podman, Kubernetes, and any other container engine. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
+
+See also our [How to]({{%relref "howtos" %}}) section for end-to-end guided examples curated by the community.
 
 ### How to get started
-For a always up to date step by step how to of setting up LocalAI, Please see our [How to]({{%relref "howtos" %}}) page.
 
-### Fast Setup
-The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})). The following example uses `docker compose`:
+The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})).
+
+{{< tabs >}}
+{{% tab name="Docker" %}}
+
+```bash
+# Prepare a `models` directory
+mkdir models
+# copy your models to it
+cp your-model.bin models/
+# run the LocalAI container
+docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --context-size 700 --threads 4
+# Try the endpoint with curl
+curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
+     "model": "your-model.bin",
+     "prompt": "A long time ago in a galaxy far, far away",
+     "temperature": 0.7
+   }'
+```
+
+{{% /tab %}}
+{{% tab name="Docker compose" %}}
+
+
 
 ```bash
 
@@ -44,6 +67,9 @@ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d
     "temperature": 0.7
 }'
 ```
+{{% /tab %}}
+
+{{< /tabs >}}
 
 ### Example: Use luna-ai-llama2 model with `docker compose`
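The body of the "Docker compose" tab is elided from the hunk above, so the exact compose file this commit documents is not shown here. As a rough, hypothetical sketch only — the `api` service name and overall file layout are assumptions, mirroring the image and flags from the `docker run` example in the "Docker" tab — an equivalent compose file might resemble:

```yaml
# Hypothetical minimal docker-compose.yaml (not taken from this commit);
# mirrors the `docker run` flags shown in the Docker tab above.
version: "3.6"

services:
  api:                                      # service name is an assumption
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - "8080:8080"                         # same port mapping as `-p 8080:8080`
    volumes:
      - ./models:/models                    # same mount as `-v $PWD/models:/models`
    command:
      ["--models-path", "/models", "--context-size", "700", "--threads", "4"]
```

With a file like this in place, `docker compose up -d` would start the API on port 8080, and the same `curl` request from the Docker tab should apply unchanged.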