Deployed 84909ef6 to 13.0 with MkDocs 1.6.1 and mike 2.1.3

This commit is contained in:
ci-bot 2025-07-03 02:16:31 +00:00
parent 4c2930c6c8
commit 5b7b99b086
8 changed files with 183 additions and 39 deletions

View File

@ -2153,6 +2153,30 @@
</span>
</a>
<nav class="md-nav" aria-label="Deploy Seafile AI basic service">
<ul class="md-nav__list">
<li class="md-nav__item">
<a href="#deploy-seafile-ai-on-the-host-with-seafile" class="md-nav__link">
<span class="md-ellipsis">
Deploy Seafile AI on the host with Seafile
</span>
</a>
</li>
<li class="md-nav__item">
<a href="#deploy-seafile-ai-on-another-host-to-seafile" class="md-nav__link">
<span class="md-ellipsis">
Deploy Seafile AI on a separate host from Seafile
</span>
</a>
</li>
</ul>
</nav>
</li>
<li class="md-nav__item">
@ -4514,6 +4538,30 @@
</span>
</a>
<nav class="md-nav" aria-label="Deploy Seafile AI basic service">
<ul class="md-nav__list">
<li class="md-nav__item">
<a href="#deploy-seafile-ai-on-the-host-with-seafile" class="md-nav__link">
<span class="md-ellipsis">
Deploy Seafile AI on the host with Seafile
</span>
</a>
</li>
<li class="md-nav__item">
<a href="#deploy-seafile-ai-on-another-host-to-seafile" class="md-nav__link">
<span class="md-ellipsis">
Deploy Seafile AI on a separate host from Seafile
</span>
</a>
</li>
</ul>
</nav>
</li>
<li class="md-nav__item">
@ -4586,27 +4634,17 @@
<li>Detect text in images (OCR)</li>
</ul>
<h2 id="deploy-seafile-ai-basic-service">Deploy Seafile AI basic service<a class="headerlink" href="#deploy-seafile-ai-basic-service" title="Permanent link">&para;</a></h2>
<h3 id="deploy-seafile-ai-on-the-host-with-seafile">Deploy Seafile AI on the host with Seafile<a class="headerlink" href="#deploy-seafile-ai-on-the-host-with-seafile" title="Permanent link">&para;</a></h3>
<p>The Seafile AI basic service uses API calls to an external large language model service to implement file labeling, file and image summaries, text translation, and sdoc writing assistance.</p>
<div class="admonition warning">
<p class="admonition-title">Seafile AI requires Redis cache</p>
<p>In order to deploy Seafile AI correctly, you need to use Redis cache. Please set <code>CACHE_PROVIDER=redis</code> in .env and set Redis related configuration information correctly.</p>
<p>In order to deploy Seafile AI correctly, you have to use <strong><em>Redis</em></strong> as the cache. Please set <code>CACHE_PROVIDER=redis</code> in <code>.env</code> and configure the Redis-related settings correctly (see the example below).</p>
</div>
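<p>For example, the Redis-related entries in <code>.env</code> might look like the following sketch; the host, port, and password are placeholders and must match your own Redis deployment:</p>
<div class="highlight"><pre><code>CACHE_PROVIDER=redis
REDIS_HOST=&lt;your redis host&gt;
REDIS_PORT=6379
REDIS_PASSWORD=&lt;your redis password&gt;
</code></pre></div>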
<ol>
<li>
<p>Download <code>seafile-ai.yml</code></p>
<div class="highlight"><pre><span></span><code>wget<span class="w"> </span>https://manual.seafile.com/13.0/repo/docker/seafile-ai.yml
</code></pre></div>
<div class="admonition note">
<p class="admonition-title">Deploy in a cluster or standalone deployment</p>
<p>If you deploy Seafile in a cluster and would like to deploy Seafile AI, please expose port <code>8888</code> in <code>seafile-ai.yml</code>:</p>
<div class="highlight"><pre><span></span><code>services:
seafile-ai:
...
ports:
- 8888:8888
</code></pre></div>
<p>At the same time, Seafile AI should be deployed on one of the cluster nodes.</p>
</div>
</li>
<li>
<p>Modify <code>.env</code>, insert or modify the following fields:</p>
@ -4615,23 +4653,6 @@
ENABLE_SEAFILE_AI=true
SEAFILE_AI_LLM_KEY=&lt;your LLM access key&gt;
</code></pre></div>
<div class="admonition note">
<p class="admonition-title">Deploy in a cluster or standalone deployment</p>
<p>Please also specify the following items in <code>.env</code>:</p>
<ul>
<li><code>.env</code> on the host where the Seafile server is deployed:<ul>
<li><code>SEAFILE_AI_SERVER_URL</code>: the service url of Seafile AI (e.g., <code>http://seafile-ai.example.com:8888</code>)</li>
</ul>
</li>
<li><code>.env</code> on the host where Seafile AI is deployed:<ul>
<li><code>SEAFILE_SERVER_URL</code>: your Seafile server's url (e.g., <code>https://seafile.example.com</code>)</li>
<li><code>REDIS_HOST</code>: your redis host</li>
<li><code>REDIS_PORT</code>: your redis port</li>
<li><code>REDIS_PASSWORD</code>: your redis password</li>
</ul>
</li>
</ul>
</div>
<div class="admonition tip">
<p class="admonition-title">About LLM configs</p>
<p>By default, Seafile uses the <strong><em>GPT-4o-mini</em></strong> model from <em>OpenAI</em>. You only need to provide your <strong><em>OpenAI API Key</em></strong>. If you need to use another LLM (including a self-deployed LLM service), you also need to specify the following in <code>.env</code>:</p>
@ -4648,6 +4669,79 @@ docker<span class="w"> </span>compose<span class="w"> </span>up<span class="w">
</code></pre></div>
</li>
</ol>
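<p>After the containers are up, you can optionally verify that the Seafile AI basic service started correctly. Assuming the service is named <code>seafile-ai</code> as in <code>seafile-ai.yml</code>, a quick check could be:</p>
<div class="highlight"><pre><code>docker compose ps seafile-ai
docker compose logs -f seafile-ai
</code></pre></div>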
<h3 id="deploy-seafile-ai-on-another-host-to-seafile">Deploy Seafile AI on another host to Seafile<a class="headerlink" href="#deploy-seafile-ai-on-another-host-to-seafile" title="Permanent link">&para;</a></h3>
<ol>
<li>
<p>Download <code>seafile-ai.yml</code> and <code>.env</code>:</p>
<div class="highlight"><pre><span></span><code>wget<span class="w"> </span>https://manual.seafile.com/13.0/repo/docker/seafile-ai/seafile-ai.yml
wget<span class="w"> </span>-O<span class="w"> </span>.env<span class="w"> </span>https://manual.seafile.com/13.0/repo/docker/seafile-ai/env
</code></pre></div>
</li>
<li>
<p>Modify <code>.env</code> on the host that will deploy Seafile AI according to the following table (a filled-in example is shown after these steps):</p>
<table>
<thead>
<tr>
<th>variable</th>
<th>description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>SEAFILE_VOLUME</code></td>
<td>The volume directory of Seafile AI data (default <code>/opt/seafile-data</code>)</td>
</tr>
<tr>
<td><code>JWT_PRIVATE_KEY</code></td>
<td>JWT key; it must be the same as the one configured in the Seafile server's <code>.env</code> file</td>
</tr>
<tr>
<td><code>INNER_SEAHUB_SERVICE_URL</code></td>
<td>Intranet URL for accessing the Seahub component, e.g., <code>http://&lt;your Seafile server intranet IP&gt;</code>.</td>
</tr>
<tr>
<td><code>REDIS_HOST</code></td>
<td>Redis server host</td>
</tr>
<tr>
<td><code>REDIS_PORT</code></td>
<td>Redis server port</td>
</tr>
<tr>
<td><code>REDIS_PASSWORD</code></td>
<td>Redis server password</td>
</tr>
<tr>
<td><code>SEAFILE_AI_LLM_TYPE</code></td>
<td>Large Language Model (LLM) API Type (e.g., <code>openai</code>)</td>
</tr>
<tr>
<td><code>SEAFILE_AI_LLM_URL</code></td>
<td>LLM API URL (leave blank to use OpenAI's official API endpoint)</td>
</tr>
<tr>
<td><code>SEAFILE_AI_LLM_KEY</code></td>
<td>LLM API key</td>
</tr>
<tr>
<td><code>FACE_EMBEDDING_SERVICE_URL</code></td>
<td>Face embedding service URL</td>
</tr>
</tbody>
</table>
<p>then start your Seafile AI server:</p>
<div class="highlight"><pre><span></span><code>docker<span class="w"> </span>compose<span class="w"> </span>up<span class="w"> </span>-d
</code></pre></div>
</li>
<li>
<p>Modify <code>.env</code> on the host where Seafile is deployed:</p>
<div class="highlight"><pre><span></span><code>SEAFILE_AI_SERVER_URL=http://&lt;your seafile ai host&gt;:8888
</code></pre></div>
<p>then restart your Seafile server:</p>
<div class="highlight"><pre><span></span><code>docker<span class="w"> </span>compose<span class="w"> </span>down<span class="w"> </span><span class="o">&amp;&amp;</span><span class="w"> </span>docker<span class="w"> </span>compose<span class="w"> </span>up<span class="w"> </span>-d
</code></pre></div>
</li>
</ol>
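<p>For reference, a filled-in <code>.env</code> on the Seafile AI host (covering the table in step 2 above) might look like the following sketch; every value here is a placeholder and must be adapted to your environment:</p>
<div class="highlight"><pre><code>SEAFILE_VOLUME=/opt/seafile-data
JWT_PRIVATE_KEY=&lt;the same JWT key as in the Seafile server's .env&gt;
INNER_SEAHUB_SERVICE_URL=http://&lt;your Seafile server intranet IP&gt;
CACHE_PROVIDER=redis
REDIS_HOST=&lt;your redis host&gt;
REDIS_PORT=6379
REDIS_PASSWORD=&lt;your redis password&gt;
SEAFILE_AI_LLM_TYPE=openai
SEAFILE_AI_LLM_URL=
SEAFILE_AI_LLM_KEY=&lt;your LLM access key&gt;
FACE_EMBEDDING_SERVICE_URL=http://&lt;your face embedding service host&gt;:8886
</code></pre></div>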
<h2 id="deploy-face-embedding-service-optional">Deploy face embedding service (Optional)<a class="headerlink" href="#deploy-face-embedding-service-optional" title="Permanent link">&para;</a></h2>
<p>The face embedding service is used to detect and encode faces in images and is an extension component of Seafile AI. Generally, we <strong>recommend</strong> that you deploy the service on a machine with a <strong>GPU</strong> and a graphics card driver that supports <a href="https://onnxruntime.ai/docs/">OnnxRuntime</a>, which means it can also be deployed on a different machine from the Seafile AI basic service. Currently, the Seafile AI face embedding service only supports the following modes:</p>
<ul>
@ -4714,7 +4808,7 @@ docker<span class="w"> </span>compose<span class="w"> </span>up<span class="w">
</code></pre></div>
</li>
<li>
<p>Modify the <code>.env</code> of the Seafile AI basic service:</p>
<p>Modify the <code>.env</code> on the host where Seafile AI is deployed:</p>
<div class="highlight"><pre><span></span><code>FACE_EMBEDDING_SERVICE_URL=http://&lt;your face embedding service host&gt;:8886
</code></pre></div>
</li>

View File

@ -4589,7 +4589,7 @@ wget<span class="w"> </span>-O<span class="w"> </span>.env<span class="w"> </spa
</tr>
<tr>
<td><code>INNER_SEAHUB_SERVICE_URL</code></td>
<td>Inner Seafile url</td>
<td>Intranet URL for accessing the Seahub component, e.g., <code>http://&lt;your Seafile server intranet IP&gt;</code>.</td>
</tr>
<tr>
<td><code>SEAF_SERVER_STORAGE_TYPE</code></td>

View File

@ -12,7 +12,7 @@ services:
- SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-}
- FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-http://face-embedding:8886}
- FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}}
- SEAFILE_SERVER_URL=${SEAFILE_SERVER_URL:-http://seafile}
- SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:-http://seafile}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info}
- CACHE_PROVIDER=${CACHE_PROVIDER:-redis}

View File

@ -0,0 +1,24 @@
COMPOSE_FILE='seafile-ai.yml'
COMPOSE_PATH_SEPARATOR=','
SEAFILE_AI_IMAGE=seafileltd/seafile-ai:latest
SEAFILE_VOLUME=/opt/seafile-data
SEAFILE_SERVER_PROTOCOL=http
SEAFILE_SERVER_HOSTNAME=
CACHE_PROVIDER=redis
REDIS_HOST=... # your redis host
REDIS_PORT=6379
REDIS_PASSWORD=
JWT_PRIVATE_KEY=
FACE_EMBEDDING_SERVICE_URL=
SEAFILE_AI_LLM_TYPE=openai
SEAFILE_AI_LLM_URL=
SEAFILE_AI_LLM_KEY=... # your llm key
INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com

View File

@ -0,0 +1,27 @@
services:
seafile-ai:
image: ${SEAFILE_AI_IMAGE:-seafileltd/seafile-ai:latest}
container_name: seafile-ai
volumes:
- ${SEAFILE_VOLUME:-/opt/seafile-data}:/shared
ports:
- 8888:8888
environment:
- SEAFILE_AI_LLM_TYPE=${SEAFILE_AI_LLM_TYPE:-openai}
- SEAFILE_AI_LLM_URL=${SEAFILE_AI_LLM_URL:-}
- SEAFILE_AI_LLM_KEY=${SEAFILE_AI_LLM_KEY:-}
- FACE_EMBEDDING_SERVICE_URL=${FACE_EMBEDDING_SERVICE_URL:-}
- FACE_EMBEDDING_SERVICE_KEY=${FACE_EMBEDDING_SERVICE_KEY:-${JWT_PRIVATE_KEY:?Variable is not set or empty}}
- SEAFILE_SERVER_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SEAFILE_AI_LOG_LEVEL=${SEAFILE_AI_LOG_LEVEL:-info}
- CACHE_PROVIDER=${CACHE_PROVIDER:-redis}
- REDIS_HOST=${REDIS_HOST:-redis}
- REDIS_PORT=${REDIS_PORT:-6379}
- REDIS_PASSWORD=${REDIS_PASSWORD:-}
networks:
- seafile-net
networks:
seafile-net:
name: seafile-net

View File

@ -6,12 +6,11 @@ THUMBNAIL_SERVER_IMAGE=seafileltd/thumbnail-server:13.0.0-testing
SEAFILE_VOLUME=/opt/seafile-data
SEAFILE_MYSQL_DB_HOST=192.168.0.2
SEAFILE_MYSQL_DB_HOST=... # your mysql host
SEAFILE_MYSQL_DB_USER=seafile
SEAFILE_MYSQL_DB_PASSWORD=PASSWORD
SEAFILE_MYSQL_DB_PASSWORD=... # your mysql password
TIME_ZONE=Etc/UTC
JWT_PRIVATE_KEY=
JWT_PRIVATE_KEY=... # your jwt private key
INNER_SEAHUB_SERVICE_URL=192.168.0.2
INNER_SEAHUB_SERVICE_URL= # https://seafile.example.com

View File

@ -18,7 +18,7 @@ services:
- SEAFILE_MYSQL_DB_SEAFILE_DB_NAME=${SEAFILE_MYSQL_DB_SEAFILE_DB_NAME:-seafile_db}
- JWT_PRIVATE_KEY=${JWT_PRIVATE_KEY:?Variable is not set or empty}
- SITE_ROOT=${SITE_ROOT:-/}
- INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:Variable is not set or empty}
- INNER_SEAHUB_SERVICE_URL=${INNER_SEAHUB_SERVICE_URL:?Variable is not set or empty}
- SEAF_SERVER_STORAGE_TYPE=${SEAF_SERVER_STORAGE_TYPE:-}
- S3_COMMIT_BUCKET=${S3_COMMIT_BUCKET:-}
- S3_FS_BUCKET=${S3_FS_BUCKET:-}

File diff suppressed because one or more lines are too long