diff --git a/manual/extension/seafile-ai.md b/manual/extension/seafile-ai.md
index 72b3a9d9..75d3ebc6 100644
--- a/manual/extension/seafile-ai.md
+++ b/manual/extension/seafile-ai.md
@@ -9,10 +9,12 @@ From Seafile 13, users can enable ***Seafile AI*** to support the following feat
 
 ## Deploy Seafile AI basic service
 
+### Deploy Seafile AI on the same host as Seafile
+
 The Seafile AI basic service will use API calls to external large language model service to implement file labeling, file and image summaries, text translation, and sdoc writing assistance.
 
 !!! warning "Seafile AI requires Redis cache"
-    In order to deploy Seafile AI correctly, you need to use Redis cache. Please set `CACHE_PROVIDER=redis` in .env and set Redis related configuration information correctly.
+    In order to deploy Seafile AI correctly, you must use ***Redis*** as the cache. Please set `CACHE_PROVIDER=redis` in `.env` and fill in the Redis-related configuration correctly.
 
 1. Download `seafile-ai.yml`
 
@@ -20,20 +22,6 @@ The Seafile AI basic service will use API calls to external large language model
     wget https://manual.seafile.com/13.0/repo/docker/seafile-ai.yml
     ```
 
-    !!! note "Deploy in a cluster or standalone deployment"
-
-        If you deploy Seafile in a cluster and would like to deploy Seafile AI, please expose port `8888` in `seafile-ai.yml`:
-
-        ```yml
-        services:
-          seafile-ai:
-            ...
-            ports:
-              - 8888:8888
-        ```
-
-        At the same time, Seafile AI should be deployed on one of the cluster nodes.
-
 2. Modify `.env`, insert or modify the following fields:
 
     ```
@@ -42,17 +30,6 @@ The Seafile AI basic service will use API calls to external large language model
     ENABLE_SEAFILE_AI=true
     SEAFILE_AI_LLM_KEY=
     ```
-
-    !!! 
note "Deploy in a cluster or standalone deployment"
-        Please also specify the following items in `.env`:
-
-        - `.env` on the host where deploys Seafile server:
-            - `SEAFILE_AI_SERVER_URL`: the service url of Seafile AI (e.g., `http://seafile-ai.example.com:8888`)
-        - `.env` on the host where deploys Seafile AI:
-            - `SEAFILE_SERVER_URL`: your Seafile server's url (e.g., `https://seafile.example.com`)
-            - `REDIS_HOST`: your redis host
-            - `REDIS_PORT`: your redis port
-            - `REDIS_PASSWORD`: your redis password
 
     !!! tip "About LLM configs"
         By default, Seafile uses the ***GPT-4o-mini*** model from *OpenAI*. You only need to provide your ***OpenAI API Key***. If you need to use other LLM (including self-deployed LLM service), you also need to specify the following in `.env`:
@@ -70,6 +47,48 @@ The Seafile AI basic service will use API calls to external large language model
     docker compose up -d
     ```
 
+### Deploy Seafile AI on a different host from Seafile
+
+1. Download `seafile-ai.yml` and `.env`:
+
+    ```sh
+    wget https://manual.seafile.com/13.0/repo/docker/seafile-ai/seafile-ai.yml
+    wget -O .env https://manual.seafile.com/13.0/repo/docker/seafile-ai/env
+    ```
+
+2. Modify `.env` on the host that will run Seafile AI according to the following table:
+
+    | variable | description |
+    |------------------------|---------------------------------------------------------------------------------------------------------------|
+    | `SEAFILE_VOLUME` | The volume directory of Seafile AI data |
+    | `JWT_PRIVATE_KEY` | JWT key, the same as the config in Seafile `.env` file |
+    | `INNER_SEAHUB_SERVICE_URL`| Intranet URL for accessing the Seahub component, like `http://<ip-of-seafile-server>`. |
| `REDIS_HOST` | Redis server host |
+    | `REDIS_PORT` | Redis server port |
+    | `REDIS_PASSWORD` | Redis server password |
+    | `SEAFILE_AI_LLM_TYPE` | Large Language Model (LLM) API type (e.g., `openai`) |
+    | `SEAFILE_AI_LLM_URL` | LLM API URL (leave blank to use the official OpenAI API endpoint) |
+    | `SEAFILE_AI_LLM_KEY` | LLM API key |
+    | `FACE_EMBEDDING_SERVICE_URL` | Face embedding service URL |
+
+    then start your Seafile AI server:
+
+    ```sh
+    docker compose up -d
+    ```
+
+3. Modify `.env` on the host where Seafile is deployed:
+
+    ```env
+    SEAFILE_AI_SERVER_URL=http://<ip-of-seafile-ai-host>:8888
+    ```
+
+    then restart your Seafile server:
+
+    ```sh
+    docker compose down && docker compose up -d
+    ```
+
 ## Deploy face embedding service (Optional)
 
 The face embedding service is used to detect and encode faces in images and is an extension component of Seafile AI. Generally, we **recommend** that you deploy the service on a machine with a **GPU** and a graphics card driver that supports [OnnxRuntime](https://onnxruntime.ai/docs/) (so it can also be deployed on a different machine from the Seafile AI base service). Currently, the Seafile AI face embedding service only supports the following modes:
@@ -134,7 +153,7 @@ Since the face embedding service may need to be deployed on some hosts with GPU(
         - 8886:8886
     ```
 
-2. Modify the `.env` of the Seafile AI basic service:
+2. Modify the `.env` on the host where Seafile AI is deployed:
 
     ```
     FACE_EMBEDDING_SERVICE_URL=http://<ip-of-face-embedding-host>:8886
diff --git a/manual/extension/thumbnail-server.md b/manual/extension/thumbnail-server.md
index 5cd7631e..05b7b58d 100644
--- a/manual/extension/thumbnail-server.md
+++ b/manual/extension/thumbnail-server.md
@@ -55,7 +55,7 @@ Then modify the `.env` file according to your environment. 
The following fields
 | `SEAFILE_MYSQL_DB_PASSWORD`| Seafile MySQL password |
 | `TIME_ZONE` | Time zone |
 | `JWT_PRIVATE_KEY` | JWT key, the same as the config in Seafile `.env` file |
-| `INNER_SEAHUB_SERVICE_URL`| Inner Seafile url |
+| `INNER_SEAHUB_SERVICE_URL`| Intranet URL for accessing the Seahub component, like `http://<ip-of-seafile-server>`. |
 | `SEAF_SERVER_STORAGE_TYPE` | What kind of the Seafile data for storage. Available options are `disk` (i.e., local disk), `s3` and `multiple` (see the details of [multiple storage backends](../setup/setup_with_multiple_storage_backends.md)) |
 | `S3_COMMIT_BUCKET` | S3 storage backend commit objects bucket |
 | `S3_FS_BUCKET` | S3 storage backend fs objects bucket |
diff --git a/manual/repo/docker/seafile-ai.yml b/manual/repo/docker/seafile-ai.yml
index 4653266f..761e9718 100644
--- a/manual/repo/docker/seafile-ai.yml
+++ b/manual/repo/docker/seafile-ai.yml
@@ -12,7 +12,7 @@ services:
       - SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-}
       - FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-http://face-embedding:8886}
       - FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}}
-      - SEAFILE_SERVER_URL=${SEAFILE_SERVER_URL:-http://seafile}
+      - SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:-http://seafile}
       - JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
       - SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info}
       - CACHE_PROVIDER=${CACHE_PROVIDER:-redis}
diff --git a/manual/repo/docker/seafile-ai/env b/manual/repo/docker/seafile-ai/env
new file mode 100644
index 00000000..6f47f68a
--- /dev/null
+++ b/manual/repo/docker/seafile-ai/env
@@ -0,0 +1,24 @@
+COMPOSE_FILE='seafile-ai.yml'
+COMPOSE_PATH_SEPARATOR=','
+
+SEAFILE_AI_IMAGE=seafileltd/seafile-ai:latest
+
+SEAFILE_VOLUME=/opt/seafile-data
+
+SEAFILE_SERVER_PROTOCOL=http
+SEAFILE_SERVER_HOSTNAME=
+
+CACHE_PROVIDER=redis
+REDIS_HOST=... 
# your redis host +REDIS_PORT=6379 +REDIS_PASSWORD= + +JWT_PRIVATE_KEY= + +FACE_EMBEDDING_SERVICE_URL= + +SEAFILE_AI_LLM_TYPE=openai +SEAFILE_AI_LLM_URL= +SEAFILE_AI_LLM_KEY=... # your llm key + +INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com diff --git a/manual/repo/docker/seafile-ai/seafile-ai.yml b/manual/repo/docker/seafile-ai/seafile-ai.yml new file mode 100644 index 00000000..bc3173fd --- /dev/null +++ b/manual/repo/docker/seafile-ai/seafile-ai.yml @@ -0,0 +1,27 @@ +services: + seafile-ai: + image: ${SEAFILE_AI_IMAGE:-seafileltd/seafile-ai:latest} + container_name: seafile-ai + volumes: + - ${SEAFILE_VOLUME:-/opt/seafile-data}:/shared + ports: + - 8888:8888 + environment: + - SEAFILE_AI_LLM_TYPE=${SEAFILE_AI_LLM_TYPE:-openai} + - SEAFILE_AI_LLM_URL=${SEAFILE_AI_LLM_URL:-} + - SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-} + - FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-} + - FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}} + - SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty} + - JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty} + - SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info} + - CACHE_PROVIDER=${CACHE_PROVIDER:-redis} + - REDIS_HOST=${REDIS_HOST:-redis} + - REDIS_PORT=${REDIS_PORT:-6379} + - REDIS_PASSWORD=${REDIS_PASSWORD:-} + networks: + - seafile-net + +networks: + seafile-net: + name: seafile-net diff --git a/manual/repo/docker/thumbnail-server/env b/manual/repo/docker/thumbnail-server/env index 0161561a..489f0aa2 100644 --- a/manual/repo/docker/thumbnail-server/env +++ b/manual/repo/docker/thumbnail-server/env @@ -6,12 +6,11 @@ THUMBNAIL_SERVER_IMAGE=seafileltd/thumbnail-server:13.0.0-testing SEAFILE_VOLUME=/opt/seafile-data -SEAFILE_MYSQL_DB_HOST=192.168.0.2 +SEAFILE_MYSQL_DB_HOST=... # your mysql host SEAFILE_MYSQL_DB_USER=seafile -SEAFILE_MYSQL_DB_PASSWORD=PASSWORD - +SEAFILE_MYSQL_DB_PASSWORD=... 
# your mysql password TIME_ZONE=Etc/UTC -JWT_PRIVATE_KEY= +JWT_PRIVATE_KEY=... # your jwt private key -INNER_SEAHUB_SERVICE_URL=192.168.0.2 +INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com diff --git a/manual/repo/docker/thumbnail-server/thumbnail-server.yml b/manual/repo/docker/thumbnail-server/thumbnail-server.yml index 6143dbc0..4cf38169 100644 --- a/manual/repo/docker/thumbnail-server/thumbnail-server.yml +++ b/manual/repo/docker/thumbnail-server/thumbnail-server.yml @@ -18,7 +18,7 @@ services: - SEAFILE_MYSQL_DB_SEAFILE_DB_NAME=${SEAFILE_MYSQL_DB_SEAFILE_DB_NAME:-seafile_db} - JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty} - SITE_ROOT=${SITE_ROOT:-/} - - INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:Variable is not set or empty} + - INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty} - SEAF_SERVER_STORAGE_TYPE=${SEAF_SERVER_STORAGE_TYPE:-} - S3_COMMIT_BUCKET=${S3_COMMIT_BUCKET:-} - S3_FS_BUCKET=${S3_FS_BUCKET:-}
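Both standalone deployments in this patch hinge on a handful of `.env` variables (`JWT_PRIVATE_KEY`, `REDIS_HOST`, `SEAFILE_AI_LLM_KEY`, ...) being non-empty before `docker compose up -d`. A minimal pre-flight check can be sketched in POSIX sh; the `check_env` helper, variable list, and demo file path below are illustrative assumptions, not part of the Seafile manual or images:

```shell
#!/bin/sh
# Sketch: fail early if required variables are missing or empty in an env file,
# before running `docker compose up -d`. check_env is a hypothetical helper.
check_env() {
    file="$1"; shift
    missing=0
    for var in "$@"; do
        # match "VAR=" followed by at least one character (i.e., a non-empty value)
        if ! grep -q "^${var}=..*" "$file"; then
            echo "missing or empty: $var"
            missing=1
        fi
    done
    return $missing
}

# Demo: an env file where SEAFILE_AI_LLM_KEY is left empty
printf 'JWT_PRIVATE_KEY=abc\nREDIS_HOST=redis\nSEAFILE_AI_LLM_KEY=\n' > /tmp/env.demo
check_env /tmp/env.demo JWT_PRIVATE_KEY REDIS_HOST SEAFILE_AI_LLM_KEY \
    && echo ok \
    || echo "fix .env first"   # prints: missing or empty: SEAFILE_AI_LLM_KEY / fix .env first
```

Running such a check against the real `.env` on each host (Seafile, Seafile AI, thumbnail server) catches the empty-`JWT_PRIVATE_KEY` class of error that the `${VAR:?Variable is not set or empty}` guards in the compose files would otherwise only report at container start.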