update some yml and some extensions' standalone deployment (#556)

* update some yml and some extensions' standalone deployment

* retrieve INNER_SEAHUB_SERVICE_URL

* update description of INNER_SEAHUB_SERVICE_URL

* update description of INNER_SEAHUB_SERVICE_URL
Huang Junxiang 2025-07-03 10:16:03 +08:00 committed by GitHub
parent bb3f4abb1e
commit 84909ef655
7 changed files with 104 additions and 35 deletions

@@ -9,10 +9,12 @@ From Seafile 13, users can enable ***Seafile AI*** to support the following feat
## Deploy Seafile AI basic service
### Deploy Seafile AI on the host with Seafile
The Seafile AI basic service calls an external large language model (LLM) service through its API to implement file labeling, file and image summaries, text translation, and sdoc writing assistance.
!!! warning "Seafile AI requires Redis cache"
In order to deploy Seafile AI correctly, you need to use Redis cache. Please set `CACHE_PROVIDER=redis` in .env and set Redis related configuration information correctly.
To deploy Seafile AI correctly, you have to use ***Redis*** as the cache. Set `CACHE_PROVIDER=redis` in `.env` and configure the Redis-related settings correctly.
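A minimal sketch of the Redis-related part of `.env` might look like the following (host and password are placeholders):

```env
CACHE_PROVIDER=redis
REDIS_HOST=redis        # your redis host
REDIS_PORT=6379         # your redis port
REDIS_PASSWORD=         # your redis password, if one is set
```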
1. Download `seafile-ai.yml`
@@ -20,20 +22,6 @@ The Seafile AI basic service will use API calls to external large language model
wget https://manual.seafile.com/13.0/repo/docker/seafile-ai.yml
```
!!! note "Deploy in a cluster or standalone deployment"
If you deploy Seafile in a cluster and would like to deploy Seafile AI, please expose port `8888` in `seafile-ai.yml`:
```yml
services:
seafile-ai:
...
ports:
- 8888:8888
```
At the same time, Seafile AI should be deployed on one of the cluster nodes.
2. Modify `.env`, insert or modify the following fields:
```
@@ -42,17 +30,6 @@ The Seafile AI basic service will use API calls to external large language model
ENABLE_SEAFILE_AI=true
SEAFILE_AI_LLM_KEY=<your LLM access key>
```
!!! note "Deploy in a cluster or standalone deployment"
Please also specify the following items in `.env`:
- `.env` on the host where the Seafile server is deployed:
- `SEAFILE_AI_SERVER_URL`: the service url of Seafile AI (e.g., `http://seafile-ai.example.com:8888`)
- `.env` on the host where Seafile AI is deployed:
- `SEAFILE_SERVER_URL`: your Seafile server's url (e.g., `https://seafile.example.com`)
- `REDIS_HOST`: your redis host
- `REDIS_PORT`: your redis port
- `REDIS_PASSWORD`: your redis password
!!! tip "About LLM configs"
By default, Seafile uses the ***GPT-4o-mini*** model from *OpenAI*. You only need to provide your ***OpenAI API Key***. If you need to use another LLM (including a self-deployed LLM service), you also need to specify the following in `.env`:
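Using the variable names that appear elsewhere in this change, such a configuration might be sketched as (values are placeholders):

```env
SEAFILE_AI_LLM_TYPE=openai   # LLM API type
SEAFILE_AI_LLM_URL=          # leave blank for OpenAI's official endpoint
SEAFILE_AI_LLM_KEY=...       # your llm key
```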
@@ -70,6 +47,48 @@ The Seafile AI basic service will use API calls to external large language model
docker compose up -d
```
### Deploy Seafile AI on a different host from Seafile
1. Download `seafile-ai.yml` and `.env`:
```sh
wget https://manual.seafile.com/13.0/repo/docker/seafile-ai/seafile-ai.yml
wget -O .env https://manual.seafile.com/13.0/repo/docker/seafile-ai/env
```
2. Modify `.env` on the host that will deploy Seafile AI according to the following table:
| variable | description |
|------------------------|---------------------------------------------------------------------------------------------------------------|
| `SEAFILE_VOLUME` | The volume directory of Seafile data |
| `JWT_PRIVATE_KEY` | JWT key, the same as the config in Seafile `.env` file |
| `INNER_SEAHUB_SERVICE_URL`| Intranet URL for accessing Seahub component, like `http://<your Seafile server intranet IP>`. |
| `REDIS_HOST` | Redis server host |
| `REDIS_PORT` | Redis server port |
| `REDIS_PASSWORD` | Redis server password |
| `SEAFILE_AI_LLM_TYPE` | Large Language Model (LLM) API Type (e.g., `openai`) |
| `SEAFILE_AI_LLM_URL` | LLM API URL (leave blank to use OpenAI's official API endpoint) |
| `SEAFILE_AI_LLM_KEY` | LLM API key |
| `FACE_EMBEDDING_SERVICE_URL` | Face embedding service URL |
then start your Seafile AI server:
```sh
docker compose up -d
```
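Putting the table above together, a filled-in `.env` on the Seafile AI host might look like the following sketch (all values are illustrative placeholders):

```env
SEAFILE_VOLUME=/opt/seafile-data
JWT_PRIVATE_KEY=...                          # same as in the Seafile server's .env
INNER_SEAHUB_SERVICE_URL=http://192.168.0.2  # intranet URL of the Seahub component
REDIS_HOST=192.168.0.2
REDIS_PORT=6379
REDIS_PASSWORD=
SEAFILE_AI_LLM_TYPE=openai
SEAFILE_AI_LLM_URL=                          # blank for OpenAI's official endpoint
SEAFILE_AI_LLM_KEY=...                       # your llm key
FACE_EMBEDDING_SERVICE_URL=                  # optional
```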
3. Modify `.env` on the host where Seafile is deployed:
```env
SEAFILE_AI_SERVER_URL=http://<your seafile ai host>:8888
```
then restart your Seafile server
```sh
docker compose down && docker compose up -d
```
## Deploy face embedding service (Optional)
The face embedding service is used to detect and encode faces in images and is an extension component of Seafile AI. Generally, we **recommend** that you deploy the service on a machine with a **GPU** and a graphics card driver that supports [OnnxRuntime](https://onnxruntime.ai/docs/) (so it can also be deployed on a different machine from the Seafile AI base service). Currently, the Seafile AI face embedding service only supports the following modes:
@@ -134,7 +153,7 @@ Since the face embedding service may need to be deployed on some hosts with GPU(
- 8886:8886
```
2. Modify the `.env` of the Seafile AI basic service:
2. Modify the `.env` on the host where Seafile AI is deployed:
```
FACE_EMBEDDING_SERVICE_URL=http://<your face embedding service host>:8886
```

@@ -55,7 +55,7 @@ Then modify the `.env` file according to your environment. The following fields
| `SEAFILE_MYSQL_DB_PASSWORD`| Seafile MySQL password |
| `TIME_ZONE` | Time zone |
| `JWT_PRIVATE_KEY` | JWT key, the same as the config in Seafile `.env` file |
| `INNER_SEAHUB_SERVICE_URL`| Inner Seafile url |
| `INNER_SEAHUB_SERVICE_URL`| Intranet URL for accessing Seahub component, like `http://<your Seafile server intranet IP>`. |
| `SEAF_SERVER_STORAGE_TYPE` | What kind of the Seafile data for storage. Available options are `disk` (i.e., local disk), `s3` and `multiple` (see the details of [multiple storage backends](../setup/setup_with_multiple_storage_backends.md)) |
| `S3_COMMIT_BUCKET` | S3 storage backend commit objects bucket |
| `S3_FS_BUCKET` | S3 storage backend fs objects bucket |

@@ -12,7 +12,7 @@ services:
- SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-}
- FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-http://face-embedding:8886}
- FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}}
- SEAFILE_SERVER_URL=${SEAFILE_SERVER_URL:-http://seafile}
- SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:-http://seafile}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info}
- CACHE_PROVIDER=${CACHE_PROVIDER:-redis}

@@ -0,0 +1,24 @@
COMPOSE_FILE='seafile-ai.yml'
COMPOSE_PATH_SEPARATOR=','
SEAFILE_AI_IMAGE=seafileltd/seafile-ai:latest
SEAFILE_VOLUME=/opt/seafile-data
SEAFILE_SERVER_PROTOCOL=http
SEAFILE_SERVER_HOSTNAME=
CACHE_PROVIDER=redis
REDIS_HOST=... # your redis host
REDIS_PORT=6379
REDIS_PASSWORD=
JWT_PRIVATE_KEY=
FACE_EMBEDDING_SERVICE_URL=
SEAFILE_AI_LLM_TYPE=openai
SEAFILE_AI_LLM_URL=
SEAFILE_AI_LLM_KEY=... # your llm key
INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com

@@ -0,0 +1,27 @@
services:
seafile-ai:
image: ${SEAFILE_AI_IMAGE:-seafileltd/seafile-ai:latest}
container_name: seafile-ai
volumes:
- ${SEAFILE_VOLUME:-/opt/seafile-data}:/shared
ports:
- 8888:8888
environment:
- SEAFILE_AI_LLM_TYPE=${SEAFILE_AI_LLM_TYPE:-openai}
- SEAFILE_AI_LLM_URL=${SEAFILE_AI_LLM_URL:-}
- SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-}
- FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-}
- FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}}
- SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info}
- CACHE_PROVIDER=${CACHE_PROVIDER:-redis}
- REDIS_HOST=${REDIS_HOST:-redis}
- REDIS_PORT=${REDIS_PORT:-6379}
- REDIS_PASSWORD=${REDIS_PASSWORD:-}
networks:
- seafile-net
networks:
seafile-net:
name: seafile-net

@@ -6,12 +6,11 @@ THUMBNAIL_SERVER_IMAGE=seafileltd/thumbnail-server:13.0.0-testing
SEAFILE_VOLUME=/opt/seafile-data
SEAFILE_MYSQL_DB_HOST=192.168.0.2
SEAFILE_MYSQL_DB_HOST=... # your mysql host
SEAFILE_MYSQL_DB_USER=seafile
SEAFILE_MYSQL_DB_PASSWORD=PASSWORD
SEAFILE_MYSQL_DB_PASSWORD=... # your mysql password
TIME_ZONE=Etc/UTC
JWT_PRIVATE_KEY=
JWT_PRIVATE_KEY=... # your jwt private key
INNER_SEAHUB_SERVICE_URL=192.168.0.2
INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com

@@ -18,7 +18,7 @@ services:
- SEAFILE_MYSQL_DB_SEAFILE_DB_NAME=${SEAFILE_MYSQL_DB_SEAFILE_DB_NAME:-seafile_db}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SITE_ROOT=${SITE_ROOT:-/}
- INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:Variable is not set or empty}
- INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty}
- SEAF_SERVER_STORAGE_TYPE=${SEAF_SERVER_STORAGE_TYPE:-}
- S3_COMMIT_BUCKET=${S3_COMMIT_BUCKET:-}
- S3_FS_BUCKET=${S3_FS_BUCKET:-}