* Milvus (#1644)

* feat: support regex

* 4.8.3 test and fix (#1648)

* perf: version tip

* feat: sandbox support log

* fix: debug component render

* fix: share page header

* fix: input guide auth

* fix: iso viewport

* remove file

* fix: route url

* feat: add debug timeout

* perf: reference select support trigger

* perf: session code

* perf: theme

* perf: load milvus
Archer 2024-06-01 09:26:11 +08:00 committed by GitHub
parent 9fc6a8c74a
commit a259d034b8
81 changed files with 1775 additions and 594 deletions


@ -4,8 +4,6 @@ on:
paths:
- 'projects/app/**'
- 'packages/**'
branches:
- 'main'
workflow_dispatch:
jobs:

.npmrc

@ -1 +1,2 @@
public-hoist-pattern[]=*tiktoken*
public-hoist-pattern[]=*@zilliz/milvus2-sdk-node*


@ -52,10 +52,10 @@ https://github.com/labring/FastGPT/assets/15308462/7d3a38df-eb0e-4388-9250-2409b
`1` Application orchestration capabilities
- [x] Simple mode available, no orchestration needed
- [x] Workflow orchestration
- [x] Source file citation tracking
- [x] Tool calls
- [x] Plugins - workflow encapsulation
- [ ] Code sandbox
- [x] Code sandbox
- [ ] Loop calls
`2` Knowledge base capabilities
- [x] Reuse and mix multiple datasets
@ -65,7 +65,7 @@ https://github.com/labring/FastGPT/assets/15308462/7d3a38df-eb0e-4388-9250-2409b
- [x] Supports txt, md, html, pdf, docx, pptx, csv, xlsx (PR a file loader if you need more formats)
- [x] Supports URL import and CSV batch import
- [x] Hybrid retrieval & re-ranking
- [ ] File reader support
- [ ] Tag filtering
`3` Application debugging capabilities
- [x] Single-point dataset search testing

(Binary image file added, 55 KiB, not shown.)


@ -7,34 +7,60 @@ toc: true
weight: 707
---
## Deployment architecture
![](/imgs/sealos-fastgpt.webp)
{{% alert icon="🤖" context="success" %}}
- MongoDB: stores all data other than vectors
- PostgreSQL/Milvus: stores vector data
- OneAPI: aggregates various AI APIs and supports multi-model calls (for any model issue, verify it through OneAPI first)
{{% /alert %}}
## Recommended configuration
### PgVector version
First choice for trials and testing.
{{< table "table-hover table-striped-columns" >}}
| Environment | Minimum (single node) | Recommended |
| ---- | ---- | ---- |
| Testing | 2c2g | 2c4g |
| 1M vector groups | 4c8g 50GB | 4c16g 50GB |
| 5M vector groups | 8c32g | 16c64g 200GB |
| 5M vector groups | 8c32g 200GB | 16c64g 200GB |
{{< /table >}}
## Deployment architecture
### Milvus version
![](/imgs/sealos-fastgpt.webp)
Better performance for tens of millions of vectors and above.
[See the official Milvus recommended configuration](https://milvus.io/docs/prerequisite-docker.md)
### 1. Prepare a proxy environment (overseas servers can skip this)
{{< table "table-hover table-striped-columns" >}}
| Environment | Minimum (single node) | Recommended |
| ---- | ---- | ---- |
| Testing | 2c8g | 4c16g |
| 1M vector groups | Not tested | |
| 5M vector groups | | |
{{< /table >}}
Make sure you can reach OpenAI; see the [proxy options](/docs/development/proxy/) for details. Or [deploy OneAPI](/docs/development/one-api) directly on Sealos, which both solves the proxy problem and adds multi-key rotation and access to other large models.
### Zilliz Cloud version
### 2. Multi-model support
First choice for 100M+ vectors.
FastGPT uses the one-api project to manage its model pool; it is compatible with OpenAI, Azure, mainstream Chinese models, local models, and more.
Since the vector store runs in the cloud, it takes no local resources and needs little attention.
You can [quickly deploy OneAPI on Sealos](/docs/development/one-api); see the project's [README](https://github.com/songquanpeng/one-api) for other deployment methods, or deploy it with one click via the button below:
## Prerequisites
<a href="https://template.cloud.sealos.io/deploy?templateName=one-api" rel="external" target="_blank"><img src="https://cdn.jsdelivr.net/gh/labring-actions/templates@main/Deploy-on-Sealos.svg" alt="Deploy on Sealos"/></a>
### 1. Check network access
## 1. Install Docker and docker-compose
If you use overseas model APIs such as `OpenAI`, make sure they are reachable, otherwise you will see errors such as `Connection error`. See the [proxy options](/docs/development/proxy/).
### 2. Prepare the Docker environment
{{< tabs tabTotal="3" >}}
{{< tab tabName="Linux" >}}
@ -79,22 +105,75 @@ brew install orbstack
{{< /tab >}}
{{< /tabs >}}
## 2. Create a directory and download docker-compose.yml
Run the following commands in order to create the fastgpt directory and pull `docker-compose.yml` and `config.json`; when done, the directory will contain the 2 files.
## Start deployment
On non-Linux systems, or when the machine has no internet access, manually create a directory and download the 2 files linked here: [docker-compose.yml](https://github.com/labring/FastGPT/blob/main/files/deploy/fastgpt/docker-compose.yml), [config.json](https://github.com/labring/FastGPT/blob/main/projects/app/data/config.json)
### 1. Download docker-compose.yml
**Note: in the `docker-compose.yml` file, Mongo is 5.x, which some servers do not support; manually change its image version to 4.4.24** (you have to pull it from Docker Hub yourself; the Aliyun registry keeps no backup).
On non-Linux systems, or when the machine has no internet access, manually create a directory and download the config file plus the `docker-compose.yml` for your vector store:
- [config.json](https://github.com/labring/FastGPT/blob/main/projects/app/data/config.json)
- [docker-compose.yml](https://github.com/labring/FastGPT/blob/main/files/docker) (note: the file differs per vector store version)
{{% alert icon="🤖" context="success" %}}
In every `docker-compose.yml` file, `MongoDB` is 5.x, which requires the AVX instruction set; some CPUs do not support it, so manually change the image version to 4.4.24 (you have to pull it from Docker Hub yourself; the Aliyun registry keeps no backup).
{{% /alert %}}
**Linux quick script**
```bash
mkdir fastgpt
cd fastgpt
curl -O https://raw.githubusercontent.com/labring/FastGPT/main/files/deploy/fastgpt/docker-compose.yml
curl -O https://raw.githubusercontent.com/labring/FastGPT/main/projects/app/data/config.json
# pgvector version (recommended for testing: simple and quick)
curl -o docker-compose.yml https://raw.githubusercontent.com/labring/FastGPT/main/files/docker/docker-compose-pgvector.yml
# milvus version
# curl -o docker-compose.yml https://raw.githubusercontent.com/labring/FastGPT/main/files/docker/docker-compose-milvus.yml
# zilliz version
# curl -o docker-compose.yml https://raw.githubusercontent.com/labring/FastGPT/main/files/docker/docker-compose-zilliz.yml
```
## 3. Start the containers
### 2. Edit environment variables in docker-compose.yml
{{< tabs tabTotal="3" >}}
{{< tab tabName="PgVector version" >}}
{{< markdownify >}}
```
No changes needed
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="Milvus version" >}}
{{< markdownify >}}
```
No changes needed
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="Zilliz version" >}}
{{< markdownify >}}
![zilliz_key](/imgs/zilliz_key.png)
{{% alert icon="🤖" context="success" %}}
Set the `MILVUS_ADDRESS` and `MILVUS_TOKEN` connection parameters to the `Public Endpoint` and `API key` of your `zilliz` cluster, and remember to add your own IP to the allowlist.
{{% /alert %}}
{{< /markdownify >}}
{{< /tab >}}
{{< /tabs >}}
### 3. Start the containers
Run the commands below in the same directory as docker-compose.yml. Make sure your `docker-compose` version is at least 2.17, otherwise the automated commands may not work.
@ -107,13 +186,13 @@ sleep 10
docker restart oneapi
```
## 4. Open OneAPI and add models
### 4. Open OneAPI and add models
OneAPI can be accessed at `ip:3001`; the default account is `root` and the password is `123456`.
Add the appropriate AI model channels in OneAPI. [See the tutorial](/docs/development/one-api/)
## 5. Access FastGPT
### 5. Access FastGPT
It can currently be accessed directly at `ip:3000` (mind your firewall). The login username is `root`, and the password is the `DEFAULT_ROOT_PSW` set in the `docker-compose.yml` environment variables.
@ -125,7 +204,9 @@ docker restart oneapi
### Mongo replica set auto-initialization fails
The latest docker-compose example optimizes Mongo replica set initialization and makes it fully automatic. It has been tested on Ubuntu 20/22, CentOS 7, WSL2, macOS, and Windows. If your environment is special, you can initialize the replica set manually:
The latest docker-compose example optimizes Mongo replica set initialization and makes it fully automatic. It has been tested on Ubuntu 20/22, CentOS 7, WSL2, macOS, and Windows. If it still fails to start, it is usually because the CPU does not support the AVX instruction set; switch to a Mongo 4.x image.
If the replica set cannot be initialized automatically, you can initialize it manually:
1. Run the following command in a terminal to create the mongo key:
@ -266,13 +347,14 @@ PG 数据库没有连接上/初始化失败可以查看日志。FastGPT 会
### Operation `auth_codes.findOne()` buffering timed out after 10000ms
Mongo connection failed: check mongo's running status and the corresponding logs.
Mongo connection failed: check mongo's running status and **the corresponding logs**.
Possible causes:
1. The mongo service did not start (some CPUs do not support AVX and cannot run mongo 5; switch to mongo 4.x: find the latest 4.x image on Docker Hub, change the image version, and rerun).
2. The database connection environment variables are wrong (username/password, host and port; connections from outside the container network need the public IP plus directConnection=true).
3. The replica set failed to start, causing the container to restart repeatedly.
4. `Illegal instruction.... Waiting for MongoDB to start`: the CPU does not support AVX and cannot run mongo 5; switch to mongo 4.x.
### On first deployment, the root user is reported as unregistered


@ -20,10 +20,10 @@ SANDBOX_URL=内网地址
## Docker deployment
You can pull the latest [docker-compose.yml](https://github.com/labring/FastGPT/blob/main/files/deploy/fastgpt/docker-compose.yml) file as a reference:
You can pull the latest [docker-compose.yml](https://github.com/labring/FastGPT/blob/main/files/docker/docker-compose.yml) file as a reference:
1. Add a new `sandbox` container.
2. Add the `SANDBOX_URL` environment variable to the fastgpt container.
2. Add the `SANDBOX_URL` environment variable to the fastgpt and fastgpt-pro (commercial edition) containers.
3. It is recommended not to expose the sandbox to the public internet, as it performs no credential checks.
## V4.8.2 release notes


@ -0,0 +1,22 @@
---
title: 'V4.8.3 (in progress)'
description: 'FastGPT V4.8.3 release notes'
icon: 'upgrade'
draft: false
toc: true
weight: 821
---
## Upgrade guide
- Change the fastgpt image tag to v4.8.3
- Change the fastgpt-sandbox image tag to v4.8.3
- Change the commercial edition image tag to v4.8.3
## V4.8.3 release notes
1. New - Support for the Milvus database; see the latest [docker-compose-milvus.yml](https://github.com/labring/FastGPT/blob/main/files/docker/docker-compose-milvus.yml).
2. New - Log empty answers from the chat API to make model issues easier to diagnose.
3. New - The if-else node's string conditions support regular expressions.
4. New - Code support.
5. Fix - Variable updates errored in Debug mode.


@ -0,0 +1,202 @@
# 数据库的默认账号和密码仅首次运行时设置有效
# 如果修改了账号密码,记得改数据库和项目连接参数,别只改一处~
# 该配置文件只是给快速启动,测试使用。正式使用,记得务必修改账号密码,以及调整合适的知识库参数,共享内存等。
# 如何无法访问 dockerhub 和 git可以用阿里云阿里云没有arm包
version: '3.3'
services:
minio:
container_name: minio
image: minio/minio:RELEASE.2023-03-20T20-16-18Z
environment:
MINIO_ACCESS_KEY: minioadmin
MINIO_SECRET_KEY: minioadmin
ports:
- '9001:9001'
- '9000:9000'
networks:
- fastgpt
volumes:
- ./minio:/minio_data
command: minio server /minio_data --console-address ":9001"
healthcheck:
test: ['CMD', 'curl', '-f', 'http://localhost:9000/minio/health/live']
interval: 30s
timeout: 20s
retries: 3
# milvus
milvusEtcd:
container_name: milvusEtcd
image: quay.io/coreos/etcd:v3.5.5
environment:
- ETCD_AUTO_COMPACTION_MODE=revision
- ETCD_AUTO_COMPACTION_RETENTION=1000
- ETCD_QUOTA_BACKEND_BYTES=4294967296
- ETCD_SNAPSHOT_COUNT=50000
networks:
- fastgpt
volumes:
- ./milvus/etcd:/etcd
command: etcd -advertise-client-urls=http://127.0.0.1:2379 -listen-client-urls http://0.0.0.0:2379 --data-dir /etcd
healthcheck:
test: ['CMD', 'etcdctl', 'endpoint', 'health']
interval: 30s
timeout: 20s
retries: 3
milvusStandalone:
container_name: milvusStandalone
image: milvusdb/milvus:v2.4.3
command: ['milvus', 'run', 'standalone']
security_opt:
- seccomp:unconfined
environment:
ETCD_ENDPOINTS: milvusEtcd:2379
MINIO_ADDRESS: minio:9000
networks:
- fastgpt
volumes:
- ./milvus/data:/var/lib/milvus
healthcheck:
test: ['CMD', 'curl', '-f', 'http://localhost:9091/healthz']
interval: 30s
start_period: 90s
timeout: 20s
retries: 3
depends_on:
- 'milvusEtcd'
- 'minio'
mongo:
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # 阿里云
container_name: mongo
restart: always
ports:
- 27017:27017
networks:
- fastgpt
command: mongod --keyFile /data/mongodb.key --replSet rs0
environment:
- MONGO_INITDB_ROOT_USERNAME=myusername
- MONGO_INITDB_ROOT_PASSWORD=mypassword
volumes:
- ./mongo/data:/data/db
entrypoint:
- bash
- -c
- |
openssl rand -base64 128 > /data/mongodb.key
chmod 400 /data/mongodb.key
chown 999:999 /data/mongodb.key
echo 'const isInited = rs.status().ok === 1
if(!isInited){
rs.initiate({
_id: "rs0",
members: [
{ _id: 0, host: "mongo:27017" }
]
})
}' > /data/initReplicaSet.js
# 启动MongoDB服务
exec docker-entrypoint.sh "$$@" &
# 等待MongoDB服务启动
until mongo -u myusername -p mypassword --authenticationDatabase admin --eval "print('waited for connection')" > /dev/null 2>&1; do
echo "Waiting for MongoDB to start..."
sleep 2
done
# 执行初始化副本集的脚本
mongo -u myusername -p mypassword --authenticationDatabase admin /data/initReplicaSet.js
# 等待docker-entrypoint.sh脚本执行的MongoDB服务进程
wait $$!
# fastgpt
sandbox:
container_name: sandbox
image: ghcr.io/labring/fastgpt-sandbox:v4.8.2 # git
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-sandbox:v4.8.2 # 阿里云
networks:
- fastgpt
restart: always
fastgpt:
container_name: fastgpt
image: ghcr.io/labring/fastgpt:v4.8.3-alpha # git
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:v4.8.3-alpha # 阿里云
ports:
- 3000:3000
networks:
- fastgpt
depends_on:
- mongo
- milvusStandalone
- sandbox
restart: always
environment:
# root 密码,用户名为: root。如果需要修改 root 密码,直接修改这个环境变量,并重启即可。
- DEFAULT_ROOT_PSW=1234
# AI模型的API地址哦。务必加 /v1。这里默认填写了OneApi的访问地址。
- OPENAI_BASE_URL=http://oneapi:3000/v1
# AI模型的API Key。这里默认填写了OneAPI的快速默认key测试通后务必及时修改
- CHAT_API_KEY=sk-fastgpt
# 数据库最大连接数
- DB_MAX_LINK=30
# 登录凭证密钥
- TOKEN_KEY=any
# root的密钥常用于升级时候的初始化请求
- ROOT_KEY=root_key
# 文件阅读加密
- FILE_TOKEN_KEY=filetoken
# MongoDB 连接参数. 用户名myusername,密码mypassword。
- MONGODB_URI=mongodb://myusername:mypassword@mongo:27017/fastgpt?authSource=admin
# zilliz 连接参数
- MILVUS_ADDRESS=http://milvusStandalone:19530
- MILVUS_TOKEN=none
# sandbox 地址
- SANDBOX_URL=http://sandbox:3000
volumes:
- ./config.json:/app/data/config.json
# oneapi
mysql:
image: mysql:8.0.36
container_name: mysql
restart: always
ports:
- 3306:3306
networks:
- fastgpt
command: --default-authentication-plugin=mysql_native_password
environment:
# 默认root密码仅首次运行有效
MYSQL_ROOT_PASSWORD: oneapimmysql
MYSQL_DATABASE: oneapi
volumes:
- ./mysql:/var/lib/mysql
oneapi:
container_name: oneapi
image: ghcr.io/songquanpeng/one-api:latest
ports:
- 3001:3000
depends_on:
- mysql
networks:
- fastgpt
restart: always
environment:
# mysql 连接参数
- SQL_DSN=root:oneapimmysql@tcp(mysql:3306)/oneapi
# 登录凭证加密密钥
- SESSION_SECRET=oneapikey
# 内存缓存
- MEMORY_CACHE_ENABLED=true
# 启动聚合更新,减少数据交互频率
- BATCH_UPDATE_ENABLED=true
# 聚合更新时长
- BATCH_UPDATE_INTERVAL=10
# 初始化的 root 密钥(建议部署完后更改,否则容易泄露)
- INITIAL_ROOT_TOKEN=fastgpt
volumes:
- ./oneapi:/data
networks:
fastgpt:


@ -5,6 +5,7 @@
version: '3.3'
services:
# db
pg:
image: pgvector/pgvector:0.7.0-pg15 # docker hub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/pgvector:v0.7.0 # 阿里云
@ -67,6 +68,8 @@ services:
# 等待docker-entrypoint.sh脚本执行的MongoDB服务进程
wait $$!
# fastgpt
sandbox:
container_name: sandbox
image: ghcr.io/labring/fastgpt-sandbox:v4.8.2 # git
@ -110,7 +113,8 @@ services:
- SANDBOX_URL=http://sandbox:3000
volumes:
- ./config.json:/app/data/config.json
- ./fastgpt/tmp:/app/tmp
# oneapi
mysql:
image: mysql:8.0.36
container_name: mysql


@ -0,0 +1,140 @@
# 数据库的默认账号和密码仅首次运行时设置有效
# 如果修改了账号密码,记得改数据库和项目连接参数,别只改一处~
# 该配置文件只是给快速启动,测试使用。正式使用,记得务必修改账号密码,以及调整合适的知识库参数,共享内存等。
# 如何无法访问 dockerhub 和 git可以用阿里云阿里云没有arm包
version: '3.3'
services:
mongo:
image: mongo:5.0.18 # dockerhub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # 阿里云
# image: mongo:4.4.29 # cpu不支持AVX时候使用
container_name: mongo
restart: always
ports:
- 27017:27017
networks:
- fastgpt
command: mongod --keyFile /data/mongodb.key --replSet rs0
environment:
- MONGO_INITDB_ROOT_USERNAME=myusername
- MONGO_INITDB_ROOT_PASSWORD=mypassword
volumes:
- ./mongo/data:/data/db
entrypoint:
- bash
- -c
- |
openssl rand -base64 128 > /data/mongodb.key
chmod 400 /data/mongodb.key
chown 999:999 /data/mongodb.key
echo 'const isInited = rs.status().ok === 1
if(!isInited){
rs.initiate({
_id: "rs0",
members: [
{ _id: 0, host: "mongo:27017" }
]
})
}' > /data/initReplicaSet.js
# 启动MongoDB服务
exec docker-entrypoint.sh "$$@" &
# 等待MongoDB服务启动
until mongo -u myusername -p mypassword --authenticationDatabase admin --eval "print('waited for connection')" > /dev/null 2>&1; do
echo "Waiting for MongoDB to start..."
sleep 2
done
# 执行初始化副本集的脚本
mongo -u myusername -p mypassword --authenticationDatabase admin /data/initReplicaSet.js
# 等待docker-entrypoint.sh脚本执行的MongoDB服务进程
wait $$!
sandbox:
container_name: sandbox
image: ghcr.io/labring/fastgpt-sandbox:v4.8.2 # git
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-sandbox:v4.8.2 # 阿里云
networks:
- fastgpt
restart: always
fastgpt:
container_name: fastgpt
image: ghcr.io/labring/fastgpt:v4.8.2 # git
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:v4.8.2 # 阿里云
ports:
- 3000:3000
networks:
- fastgpt
depends_on:
- mongo
- sandbox
restart: always
environment:
# root 密码,用户名为: root。如果需要修改 root 密码,直接修改这个环境变量,并重启即可。
- DEFAULT_ROOT_PSW=1234
# AI模型的API地址哦。务必加 /v1。这里默认填写了OneApi的访问地址。
- OPENAI_BASE_URL=http://oneapi:3000/v1
# AI模型的API Key。这里默认填写了OneAPI的快速默认key测试通后务必及时修改
- CHAT_API_KEY=sk-fastgpt
# 数据库最大连接数
- DB_MAX_LINK=30
# 登录凭证密钥
- TOKEN_KEY=any
# root的密钥常用于升级时候的初始化请求
- ROOT_KEY=root_key
# 文件阅读加密
- FILE_TOKEN_KEY=filetoken
# MongoDB 连接参数. 用户名myusername,密码mypassword。
- MONGODB_URI=mongodb://myusername:mypassword@mongo:27017/fastgpt?authSource=admin
# zilliz 连接参数
- MILVUS_ADDRESS=zilliz_cloud_address
- MILVUS_TOKEN=zilliz_cloud_token
# sandbox 地址
- SANDBOX_URL=http://sandbox:3000
volumes:
- ./config.json:/app/data/config.json
# oneapi
mysql:
image: mysql:8.0.36
container_name: mysql
restart: always
ports:
- 3306:3306
networks:
- fastgpt
command: --default-authentication-plugin=mysql_native_password
environment:
# 默认root密码仅首次运行有效
MYSQL_ROOT_PASSWORD: oneapimmysql
MYSQL_DATABASE: oneapi
volumes:
- ./mysql:/var/lib/mysql
oneapi:
container_name: oneapi
image: ghcr.io/songquanpeng/one-api:latest
ports:
- 3001:3000
depends_on:
- mysql
networks:
- fastgpt
restart: always
environment:
# mysql 连接参数
- SQL_DSN=root:oneapimmysql@tcp(mysql:3306)/oneapi
# 登录凭证加密密钥
- SESSION_SECRET=oneapikey
# 内存缓存
- MEMORY_CACHE_ENABLED=true
# 启动聚合更新,减少数据交互频率
- BATCH_UPDATE_ENABLED=true
# 聚合更新时长
- BATCH_UPDATE_INTERVAL=10
# 初始化的 root 密钥(建议部署完后更改,否则容易泄露)
- INITIAL_ROOT_TOKEN=fastgpt
volumes:
- ./oneapi:/data
networks:
fastgpt:


@ -21,9 +21,15 @@
"react-i18next": "13.5.0",
"zhlint": "^0.7.1"
},
"resolutions": {
"react": "18.3.1",
"react-dom": "18.3.1",
"@types/react": "18.3.0",
"@types/react-dom": "18.3.0"
},
"lint-staged": {
"./**/**/*.{ts,tsx,scss}": "npm run format-code",
"./**/**/*.md": "npm run format-doc"
"./docSite/**/**/*.md": "npm run format-doc"
},
"engines": {
"node": ">=18.0.0",


@ -1,6 +1,6 @@
import { ErrType } from '../errorCode';
/* dataset: 507000 */
/* dataset: 508000 */
export enum PluginErrEnum {
unExist = 'pluginUnExist',
unAuth = 'pluginUnAuth'
@ -19,7 +19,7 @@ export default errList.reduce((acc, cur, index) => {
return {
...acc,
[cur.statusText]: {
code: 507000 + index,
code: 508000 + index,
statusText: cur.statusText,
message: cur.message,
data: null


@ -0,0 +1,23 @@
import { ErrType } from '../errorCode';
/* dataset: 509000 */
export enum SystemErrEnum {
communityVersionNumLimit = 'communityVersionNumLimit'
}
const systemErr = [
{
statusText: SystemErrEnum.communityVersionNumLimit,
message: '超出开源版数量限制,请升级商业版: https://fastgpt.in'
}
];
export default systemErr.reduce((acc, cur, index) => {
return {
...acc,
[cur.statusText]: {
code: 509000 + index,
statusText: cur.statusText,
message: cur.message,
data: null
}
};
}, {} as ErrType<`${SystemErrEnum}`>);


@ -7,6 +7,7 @@ import outLinkErr from './code/outLink';
import teamErr from './code/team';
import userErr from './code/user';
import commonErr from './code/common';
import SystemErrEnum from './code/system';
export const ERROR_CODE: { [key: number]: string } = {
400: '请求失败',
@ -98,5 +99,6 @@ export const ERROR_RESPONSE: Record<
...teamErr,
...userErr,
...pluginErr,
...commonErr
...commonErr,
...SystemErrEnum
};
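
A quick sketch of how the new system error surfaces through `ERROR_RESPONSE`; the import path of `ERROR_RESPONSE` is an assumption based on the package layout, the rest follows from the code above.

```ts
import { SystemErrEnum } from '@fastgpt/global/common/error/code/system';
// Assumed path for the aggregated error map shown above.
import { ERROR_RESPONSE } from '@fastgpt/global/common/error/errorCode';

// The reducer in system.ts assigns codes starting at 509000, so the single entry is 509000.
const payload = ERROR_RESPONSE[SystemErrEnum.communityVersionNumLimit];
console.log(payload?.code);    // 509000
console.log(payload?.message); // '超出开源版数量限制,请升级商业版: https://fastgpt.in'
```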


@ -1,7 +1,10 @@
import { replaceSensitiveText } from '../string/tools';
export const getErrText = (err: any, def = '') => {
const msg: string = typeof err === 'string' ? err : err?.message ?? def;
const msg: string =
typeof err === 'string'
? err
: err?.response?.data?.message || err?.response?.message || err?.message || def;
msg && console.log('error =>', msg);
return replaceSensitiveText(msg);
};
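
A brief usage sketch of the updated `getErrText`: nested axios-style response messages now take priority over the top-level `message` (the error objects below are illustrative).

```ts
import { getErrText } from '@fastgpt/global/common/error/utils';

const axiosLikeError = {
  message: 'Request failed with status code 400',
  response: { data: { message: 'Invalid MILVUS_TOKEN' } }
};

getErrText(axiosLikeError);             // -> 'Invalid MILVUS_TOKEN' (nested message wins)
getErrText('plain string error');       // -> 'plain string error'
getErrText(undefined, 'fallback text'); // -> 'fallback text' (default is used)
```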


@ -1 +0,0 @@
export const PgDatasetTableName = 'modeldata';


@ -84,6 +84,9 @@ export type DispatchNodeResponseType = {
toolCallTokens?: number;
toolDetail?: ChatHistoryItemResType[];
toolStop?: boolean;
// code
codeLog?: string;
};
export type DispatchNodeResultType<T> = {


@ -8,6 +8,8 @@ export enum VariableConditionEnum {
startWith = 'startWith',
endWith = 'endWith',
reg = 'reg',
greaterThan = 'greaterThan',
greaterThanOrEqualTo = 'greaterThanOrEqualTo',
lessThan = 'lessThan',
@ -31,6 +33,7 @@ export const stringConditionList = [
{ label: '不为空', value: VariableConditionEnum.isNotEmpty },
{ label: '等于', value: VariableConditionEnum.equalTo },
{ label: '不等于', value: VariableConditionEnum.notEqual },
{ label: '正则', value: VariableConditionEnum.reg },
{ label: '包含', value: VariableConditionEnum.include },
{ label: '不包含', value: VariableConditionEnum.notInclude },
{ label: '开始为', value: VariableConditionEnum.startWith },


@ -67,6 +67,20 @@ export const CodeNode: FlowNodeTemplateType = {
description: '代码运行错误信息,成功时返回空',
valueType: WorkflowIOValueTypeEnum.object,
type: FlowNodeOutputTypeEnum.static
},
{
id: 'qLUQfhG0ILRX',
type: FlowNodeOutputTypeEnum.dynamic,
key: 'result',
valueType: WorkflowIOValueTypeEnum.string,
label: 'result'
},
{
id: 'gR0mkQpJ4Og8',
type: FlowNodeOutputTypeEnum.dynamic,
key: 'data2',
valueType: WorkflowIOValueTypeEnum.string,
label: 'data2'
}
]
};
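
For context, a sketch of a code-node script whose return keys would feed these outputs; the `main` entry-point convention and the captured `console.log` are assumptions based on the sandbox log feature in this release, not a confirmed contract.

```ts
// Illustrative code-node body (runs inside fastgpt-sandbox).
function main({ name }: { name: string }) {
  console.log('this line should show up as codeLog'); // assumed: sandbox captures console output as `log`
  return {
    result: `hello ${name}`,          // maps to the dynamic `result` output above
    data2: new Date().toISOString()   // maps to the dynamic `data2` output above
  };
}
```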


@ -43,7 +43,7 @@ export type FlowNodeInputItemType = {
export type FlowNodeOutputItemType = {
id: string; // output unique id(Does not follow the key change)
type: `${FlowNodeOutputTypeEnum}`;
type: FlowNodeOutputTypeEnum;
key: `${NodeOutputKeyEnum}` | string;
valueType?: WorkflowIOValueTypeEnum;
value?: any;


@ -2,18 +2,18 @@ import { connectionMongo, ClientSession } from './index';
export const mongoSessionRun = async <T = unknown>(fn: (session: ClientSession) => Promise<T>) => {
const session = await connectionMongo.startSession();
session.startTransaction();
try {
session.startTransaction();
const result = await fn(session);
await session.commitTransaction();
await session.endSession();
return result as T;
} catch (error) {
await session.abortTransaction();
await session.endSession();
return Promise.reject(error);
} finally {
await session.endSession();
}
};
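
A minimal usage sketch of the reworked `mongoSessionRun`; the import paths and model fields are illustrative assumptions. Either both writes commit or the transaction aborts, and `endSession` now always runs via the `finally` block.

```ts
import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; // assumed path
import { MongoDataset } from '@fastgpt/service/core/dataset/schema';        // assumed path

const datasetId = await mongoSessionRun(async (session) => {
  // Pass the session to every operation that should be part of the transaction.
  const [dataset] = await MongoDataset.create([{ name: 'demo', teamId: 'team1' }], { session });
  await MongoDataset.updateOne({ _id: dataset._id }, { intro: 'created in a txn' }, { session });
  return String(dataset._id);
});
```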


@ -1,13 +1,35 @@
import dayjs from 'dayjs';
import chalk from 'chalk';
enum LogLevelEnum {
debug = 'debug',
info = 'info',
warn = 'warn',
error = 'error'
}
const logMap = {
[LogLevelEnum.debug]: {
levelLog: chalk.green('[Debug]')
},
[LogLevelEnum.info]: {
levelLog: chalk.blue('[Info]')
},
[LogLevelEnum.warn]: {
levelLog: chalk.yellow('[Warn]')
},
[LogLevelEnum.error]: {
levelLog: chalk.red('[Error]')
}
};
/* add logger */
export const addLog = {
log(level: 'info' | 'warn' | 'error', msg: string, obj: Record<string, any> = {}) {
log(level: LogLevelEnum, msg: string, obj: Record<string, any> = {}) {
const stringifyObj = JSON.stringify(obj);
const isEmpty = Object.keys(obj).length === 0;
console.log(
`[${level.toLocaleUpperCase()}] ${dayjs().format('YYYY-MM-DD HH:mm:ss')} ${msg} ${
`${logMap[level].levelLog} ${dayjs().format('YYYY-MM-DD HH:mm:ss')} ${msg} ${
level !== 'error' && !isEmpty ? stringifyObj : ''
}`
);
@ -44,14 +66,17 @@ export const addLog = {
});
} catch (error) {}
},
debug(msg: string, obj?: Record<string, any>) {
this.log(LogLevelEnum.debug, msg, obj);
},
info(msg: string, obj?: Record<string, any>) {
this.log('info', msg, obj);
this.log(LogLevelEnum.info, msg, obj);
},
warn(msg: string, obj?: Record<string, any>) {
this.log('warn', msg, obj);
this.log(LogLevelEnum.warn, msg, obj);
},
error(msg: string, error?: any) {
this.log('error', msg, {
this.log(LogLevelEnum.error, msg, {
message: error?.message || error,
stack: error?.stack,
...(error?.config && {
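
A short usage sketch of the leveled logger after this change: chalk-colored prefixes are applied per level, and the `debug` level is new.

```ts
import { addLog } from '../../common/system/log'; // path as used elsewhere in the service package

addLog.debug('loaded workflow node', { nodeId: 'code-123' }); // green  [Debug]
addLog.info('Milvus connected');                              // blue   [Info]
addLog.warn('retrying vector insert', { retryLeft: 2 });      // yellow [Warn]
addLog.error('vector insert failed', new Error('timeout'));   // red    [Error], logs message + stack
```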


@ -0,0 +1,6 @@
export const DatasetVectorDbName = 'fastgpt';
export const DatasetVectorTableName = 'modeldata';
export const PG_ADDRESS = process.env.PG_URL;
export const MILVUS_ADDRESS = process.env.MILVUS_ADDRESS;
export const MILVUS_TOKEN = process.env.MILVUS_TOKEN;


@ -1,3 +1,5 @@
import type { EmbeddingRecallItemType } from './type';
export type DeleteDatasetVectorProps = (
| { id: string }
| { datasetIds: string[]; collectionIds?: string[] }
@ -5,12 +7,19 @@ export type DeleteDatasetVectorProps = (
) & {
teamId: string;
};
export type DelDatasetVectorCtrlProps = DeleteDatasetVectorProps & {
retry?: number;
};
export type InsertVectorProps = {
teamId: string;
datasetId: string;
collectionId: string;
};
export type InsertVectorControllerProps = InsertVectorProps & {
vector: number[];
retry?: number;
};
export type EmbeddingRecallProps = {
teamId: string;
@ -18,3 +27,11 @@ export type EmbeddingRecallProps = {
// similarity?: number;
// efSearch?: number;
};
export type EmbeddingRecallCtrlProps = EmbeddingRecallProps & {
vector: number[];
limit: number;
retry?: number;
};
export type EmbeddingRecallResponse = {
results: EmbeddingRecallItemType[];
};


@ -1,18 +1,25 @@
/* vector crud */
import { PgVector } from './pg/class';
import { PgVectorCtrl } from './pg/class';
import { getVectorsByText } from '../../core/ai/embedding';
import { InsertVectorProps } from './controller.d';
import { VectorModelItemType } from '@fastgpt/global/core/ai/model.d';
import { MILVUS_ADDRESS, PG_ADDRESS } from './constants';
import { MilvusCtrl } from './milvus/class';
const getVectorObj = () => {
return new PgVector();
if (PG_ADDRESS) return new PgVectorCtrl();
if (MILVUS_ADDRESS) return new MilvusCtrl();
return new PgVectorCtrl();
};
export const initVectorStore = getVectorObj().init;
export const deleteDatasetDataVector = getVectorObj().delete;
export const recallFromVectorStore = getVectorObj().recall;
export const getVectorDataByTime = getVectorObj().getVectorDataByTime;
export const getVectorCountByTeamId = getVectorObj().getVectorCountByTeamId;
const Vector = getVectorObj();
export const initVectorStore = Vector.init;
export const deleteDatasetDataVector = Vector.delete;
export const recallFromVectorStore = Vector.embRecall;
export const getVectorDataByTime = Vector.getVectorDataByTime;
export const getVectorCountByTeamId = Vector.getVectorCountByTeamId;
export const insertDatasetDataVector = async ({
model,
@ -27,9 +34,9 @@ export const insertDatasetDataVector = async ({
input: query,
type: 'db'
});
const { insertId } = await getVectorObj().insert({
const { insertId } = await Vector.insert({
...props,
vectors
vector: vectors[0]
});
return {
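
A hedged usage sketch of the facade after this refactor: `insertDatasetDataVector` still embeds the text itself, while `recallFromVectorStore` now takes a single `vector` rather than a `vectors` array. IDs and the relative import path are illustrative.

```ts
import { insertDatasetDataVector, recallFromVectorStore } from './controller'; // assumed relative path
import type { VectorModelItemType } from '@fastgpt/global/core/ai/model.d';

declare const vectorModel: VectorModelItemType; // the embedding model config from system settings
declare const queryVector: number[];            // a single embedding, e.g. vectors[0] from getVectorsByText

// Insert: the controller calls getVectorsByText internally and passes vectors[0] to the backend.
const { insertId } = await insertDatasetDataVector({
  model: vectorModel,
  query: 'hello fastgpt',
  teamId: 'team1',
  datasetId: 'ds1',
  collectionId: 'col1'
});

// Recall: callers now pass one embedding instead of the whole array.
const { results } = await recallFromVectorStore({
  teamId: 'team1',
  datasetIds: ['ds1'],
  vector: queryVector,
  limit: 10
});
```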


@ -0,0 +1,287 @@
import { DataType, LoadState, MilvusClient } from '@zilliz/milvus2-sdk-node';
import {
DatasetVectorDbName,
DatasetVectorTableName,
MILVUS_ADDRESS,
MILVUS_TOKEN
} from '../constants';
import type {
DelDatasetVectorCtrlProps,
EmbeddingRecallCtrlProps,
EmbeddingRecallResponse,
InsertVectorControllerProps
} from '../controller.d';
import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '../../../common/system/log';
export class MilvusCtrl {
constructor() {}
getClient = async () => {
if (!MILVUS_ADDRESS) {
return Promise.reject('MILVUS_ADDRESS is not set');
}
if (global.milvusClient) return global.milvusClient;
global.milvusClient = new MilvusClient({
address: MILVUS_ADDRESS,
token: MILVUS_TOKEN
});
addLog.info(`Milvus connected`);
return global.milvusClient;
};
init = async () => {
const client = await this.getClient();
// init db(zilliz cloud will error)
try {
const { db_names } = await client.listDatabases();
if (!db_names.includes(DatasetVectorDbName)) {
await client.createDatabase({
db_name: DatasetVectorDbName
});
}
await client.useDatabase({
db_name: DatasetVectorDbName
});
} catch (error) {}
// init collection and index
const { value: hasCollection } = await client.hasCollection({
collection_name: DatasetVectorTableName
});
if (!hasCollection) {
const result = await client.createCollection({
collection_name: DatasetVectorTableName,
description: 'Store dataset vector',
enableDynamicField: true,
fields: [
{
name: 'id',
data_type: DataType.Int64,
is_primary_key: true,
autoID: true
},
{
name: 'vector',
data_type: DataType.FloatVector,
dim: 1536
},
{ name: 'teamId', data_type: DataType.VarChar, max_length: 64 },
{ name: 'datasetId', data_type: DataType.VarChar, max_length: 64 },
{ name: 'collectionId', data_type: DataType.VarChar, max_length: 64 },
{
name: 'createTime',
data_type: DataType.Int64
}
],
index_params: [
{
field_name: 'vector',
index_name: 'vector_HNSW',
index_type: 'HNSW',
metric_type: 'IP',
params: { efConstruction: 32, M: 64 }
},
{
field_name: 'teamId',
index_type: 'Trie'
},
{
field_name: 'datasetId',
index_type: 'Trie'
},
{
field_name: 'collectionId',
index_type: 'Trie'
},
{
field_name: 'createTime',
index_type: 'STL_SORT'
}
]
});
addLog.info(`Create milvus collection: `, result);
}
const { state: colLoadState } = await client.getLoadState({
collection_name: DatasetVectorTableName
});
if (
colLoadState === LoadState.LoadStateNotExist ||
colLoadState === LoadState.LoadStateNotLoad
) {
await client.loadCollectionSync({
collection_name: DatasetVectorTableName
});
addLog.info(`Milvus collection load success`);
}
};
insert = async (props: InsertVectorControllerProps): Promise<{ insertId: string }> => {
const client = await this.getClient();
const { teamId, datasetId, collectionId, vector, retry = 3 } = props;
try {
const result = await client.insert({
collection_name: DatasetVectorTableName,
data: [
{
vector,
teamId: String(teamId),
datasetId: String(datasetId),
collectionId: String(collectionId),
createTime: Date.now()
}
]
});
const insertId = (() => {
if ('int_id' in result.IDs) {
return `${result.IDs.int_id.data?.[0]}`;
}
return `${result.IDs.str_id.data?.[0]}`;
})();
return {
insertId: insertId
};
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return this.insert({
...props,
retry: retry - 1
});
}
};
delete = async (props: DelDatasetVectorCtrlProps): Promise<any> => {
const { teamId, retry = 2 } = props;
const client = await this.getClient();
const teamIdWhere = `(teamId=="${String(teamId)}")`;
const where = await (() => {
if ('id' in props && props.id) return `(id==${props.id})`;
if ('datasetIds' in props && props.datasetIds) {
const datasetIdWhere = `(datasetId in [${props.datasetIds
.map((id) => `"${String(id)}"`)
.join(',')}])`;
if ('collectionIds' in props && props.collectionIds) {
return `${datasetIdWhere} and (collectionId in [${props.collectionIds
.map((id) => `"${String(id)}"`)
.join(',')}])`;
}
return `${datasetIdWhere}`;
}
if ('idList' in props && Array.isArray(props.idList)) {
if (props.idList.length === 0) return;
return `(id in [${props.idList.map((id) => String(id)).join(',')}])`;
}
return Promise.reject('deleteDatasetData: no where');
})();
if (!where) return;
const concatWhere = `${teamIdWhere} and ${where}`;
try {
await client.delete({
collection_name: DatasetVectorTableName,
filter: concatWhere
});
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return this.delete({
...props,
retry: retry - 1
});
}
};
embRecall = async (props: EmbeddingRecallCtrlProps): Promise<EmbeddingRecallResponse> => {
const client = await this.getClient();
const { teamId, datasetIds, vector, limit, retry = 2 } = props;
try {
const { results } = await client.search({
collection_name: DatasetVectorTableName,
data: vector,
limit,
filter: `(teamId == "${teamId}") and (datasetId in [${datasetIds.map((id) => `"${String(id)}"`).join(',')}])`,
output_fields: ['collectionId']
});
const rows = results as {
score: number;
id: string;
collectionId: string;
}[];
return {
results: rows.map((item) => ({
id: String(item.id),
collectionId: item.collectionId,
score: item.score
}))
};
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
return this.embRecall({
...props,
retry: retry - 1
});
}
};
getVectorCountByTeamId = async (teamId: string) => {
const client = await this.getClient();
const result = await client.query({
collection_name: DatasetVectorTableName,
output_fields: ['count(*)'],
filter: `teamId == "${String(teamId)}"`
});
const total = result.data?.[0]?.['count(*)'] as number;
return total;
};
getVectorDataByTime = async (start: Date, end: Date) => {
const client = await this.getClient();
const startTimestamp = new Date(start).getTime();
const endTimestamp = new Date(end).getTime();
const result = await client.query({
collection_name: DatasetVectorTableName,
output_fields: ['id', 'teamId', 'datasetId'],
filter: `(createTime >= ${startTimestamp}) and (createTime <= ${endTimestamp})`
});
const rows = result.data as {
id: string;
teamId: string;
datasetId: string;
}[];
return rows.map((item) => ({
id: String(item.id),
teamId: item.teamId,
datasetId: item.datasetId
}));
};
}


@ -1,18 +1,180 @@
/* pg vector crud */
import { DatasetVectorTableName } from '../constants';
import { delay } from '@fastgpt/global/common/system/utils';
import { PgClient, connectPg } from './index';
import { PgSearchRawType } from '@fastgpt/global/core/dataset/api';
import {
initPg,
insertDatasetDataVector,
deleteDatasetDataVector,
embeddingRecall,
getVectorDataByTime,
getVectorCountByTeamId
} from './controller';
DelDatasetVectorCtrlProps,
EmbeddingRecallCtrlProps,
EmbeddingRecallResponse,
InsertVectorControllerProps
} from '../controller.d';
import dayjs from 'dayjs';
export class PgVector {
export class PgVectorCtrl {
constructor() {}
init = initPg;
insert = insertDatasetDataVector;
delete = deleteDatasetDataVector;
recall = embeddingRecall;
getVectorCountByTeamId = getVectorCountByTeamId;
getVectorDataByTime = getVectorDataByTime;
init = async () => {
try {
await connectPg();
await PgClient.query(`
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS ${DatasetVectorTableName} (
id BIGSERIAL PRIMARY KEY,
vector VECTOR(1536) NOT NULL,
team_id VARCHAR(50) NOT NULL,
dataset_id VARCHAR(50) NOT NULL,
collection_id VARCHAR(50) NOT NULL,
createtime TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
`);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS vector_index ON ${DatasetVectorTableName} USING hnsw (vector vector_ip_ops) WITH (m = 32, ef_construction = 128);`
);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS team_dataset_collection_index ON ${DatasetVectorTableName} USING btree(team_id, dataset_id, collection_id);`
);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS create_time_index ON ${DatasetVectorTableName} USING btree(createtime);`
);
console.log('init pg successful');
} catch (error) {
console.log('init pg error', error);
}
};
insert = async (props: InsertVectorControllerProps): Promise<{ insertId: string }> => {
const { teamId, datasetId, collectionId, vector, retry = 3 } = props;
try {
const { rows } = await PgClient.insert(DatasetVectorTableName, {
values: [
[
{ key: 'vector', value: `[${vector}]` },
{ key: 'team_id', value: String(teamId) },
{ key: 'dataset_id', value: String(datasetId) },
{ key: 'collection_id', value: String(collectionId) }
]
]
});
return {
insertId: rows[0].id
};
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return this.insert({
...props,
retry: retry - 1
});
}
};
delete = async (props: DelDatasetVectorCtrlProps): Promise<any> => {
const { teamId, retry = 2 } = props;
const teamIdWhere = `team_id='${String(teamId)}' AND`;
const where = await (() => {
if ('id' in props && props.id) return `${teamIdWhere} id=${props.id}`;
if ('datasetIds' in props && props.datasetIds) {
const datasetIdWhere = `dataset_id IN (${props.datasetIds
.map((id) => `'${String(id)}'`)
.join(',')})`;
if ('collectionIds' in props && props.collectionIds) {
return `${teamIdWhere} ${datasetIdWhere} AND collection_id IN (${props.collectionIds
.map((id) => `'${String(id)}'`)
.join(',')})`;
}
return `${teamIdWhere} ${datasetIdWhere}`;
}
if ('idList' in props && Array.isArray(props.idList)) {
if (props.idList.length === 0) return;
return `${teamIdWhere} id IN (${props.idList.map((id) => String(id)).join(',')})`;
}
return Promise.reject('deleteDatasetData: no where');
})();
if (!where) return;
try {
await PgClient.delete(DatasetVectorTableName, {
where: [where]
});
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return this.delete({
...props,
retry: retry - 1
});
}
};
embRecall = async (props: EmbeddingRecallCtrlProps): Promise<EmbeddingRecallResponse> => {
const { teamId, datasetIds, vector, limit, retry = 2 } = props;
try {
const results: any = await PgClient.query(
`
BEGIN;
SET LOCAL hnsw.ef_search = ${global.systemEnv?.pgHNSWEfSearch || 100};
select id, collection_id, vector <#> '[${vector}]' AS score
from ${DatasetVectorTableName}
where team_id='${teamId}'
AND dataset_id IN (${datasetIds.map((id) => `'${String(id)}'`).join(',')})
order by score limit ${limit};
COMMIT;`
);
const rows = results?.[2]?.rows as PgSearchRawType[];
return {
results: rows.map((item) => ({
id: String(item.id),
collectionId: item.collection_id,
score: item.score * -1
}))
};
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
return this.embRecall({
...props,
retry: retry - 1
});
}
};
getVectorCountByTeamId = async (teamId: string) => {
const total = await PgClient.count(DatasetVectorTableName, {
where: [['team_id', String(teamId)]]
});
return total;
};
getVectorDataByTime = async (start: Date, end: Date) => {
const { rows } = await PgClient.query<{
id: string;
team_id: string;
dataset_id: string;
}>(`SELECT id, team_id, dataset_id
FROM ${DatasetVectorTableName}
WHERE createtime BETWEEN '${dayjs(start).format('YYYY-MM-DD HH:mm:ss')}' AND '${dayjs(
end
).format('YYYY-MM-DD HH:mm:ss')}';
`);
return rows.map((item) => ({
id: String(item.id),
teamId: item.team_id,
datasetId: item.dataset_id
}));
};
}


@ -1,195 +0,0 @@
/* pg vector crud */
import { PgDatasetTableName } from '@fastgpt/global/common/vectorStore/constants';
import { delay } from '@fastgpt/global/common/system/utils';
import { PgClient, connectPg } from './index';
import { PgSearchRawType } from '@fastgpt/global/core/dataset/api';
import { EmbeddingRecallItemType } from '../type';
import { DeleteDatasetVectorProps, EmbeddingRecallProps, InsertVectorProps } from '../controller.d';
import dayjs from 'dayjs';
export async function initPg() {
try {
await connectPg();
await PgClient.query(`
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS ${PgDatasetTableName} (
id BIGSERIAL PRIMARY KEY,
vector VECTOR(1536) NOT NULL,
team_id VARCHAR(50) NOT NULL,
dataset_id VARCHAR(50) NOT NULL,
collection_id VARCHAR(50) NOT NULL,
createtime TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
`);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS vector_index ON ${PgDatasetTableName} USING hnsw (vector vector_ip_ops) WITH (m = 32, ef_construction = 128);`
);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS team_dataset_collection_index ON ${PgDatasetTableName} USING btree(team_id, dataset_id, collection_id);`
);
await PgClient.query(
`CREATE INDEX CONCURRENTLY IF NOT EXISTS create_time_index ON ${PgDatasetTableName} USING btree(createtime);`
);
console.log('init pg successful');
} catch (error) {
console.log('init pg error', error);
}
}
export const insertDatasetDataVector = async (
props: InsertVectorProps & {
vectors: number[][];
retry?: number;
}
): Promise<{ insertId: string }> => {
const { teamId, datasetId, collectionId, vectors, retry = 3 } = props;
try {
const { rows } = await PgClient.insert(PgDatasetTableName, {
values: [
[
{ key: 'vector', value: `[${vectors[0]}]` },
{ key: 'team_id', value: String(teamId) },
{ key: 'dataset_id', value: String(datasetId) },
{ key: 'collection_id', value: String(collectionId) }
]
]
});
return {
insertId: rows[0].id
};
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return insertDatasetDataVector({
...props,
retry: retry - 1
});
}
};
export const deleteDatasetDataVector = async (
props: DeleteDatasetVectorProps & {
retry?: number;
}
): Promise<any> => {
const { teamId, retry = 2 } = props;
const teamIdWhere = `team_id='${String(teamId)}' AND`;
const where = await (() => {
if ('id' in props && props.id) return `${teamIdWhere} id=${props.id}`;
if ('datasetIds' in props && props.datasetIds) {
const datasetIdWhere = `dataset_id IN (${props.datasetIds
.map((id) => `'${String(id)}'`)
.join(',')})`;
if ('collectionIds' in props && props.collectionIds) {
return `${teamIdWhere} ${datasetIdWhere} AND collection_id IN (${props.collectionIds
.map((id) => `'${String(id)}'`)
.join(',')})`;
}
return `${teamIdWhere} ${datasetIdWhere}`;
}
if ('idList' in props && Array.isArray(props.idList)) {
if (props.idList.length === 0) return;
return `${teamIdWhere} id IN (${props.idList.map((id) => `'${String(id)}'`).join(',')})`;
}
return Promise.reject('deleteDatasetData: no where');
})();
if (!where) return;
try {
await PgClient.delete(PgDatasetTableName, {
where: [where]
});
} catch (error) {
if (retry <= 0) {
return Promise.reject(error);
}
await delay(500);
return deleteDatasetDataVector({
...props,
retry: retry - 1
});
}
};
export const embeddingRecall = async (
props: EmbeddingRecallProps & {
vectors: number[][];
limit: number;
retry?: number;
}
): Promise<{
results: EmbeddingRecallItemType[];
}> => {
const { teamId, datasetIds, vectors, limit, retry = 2 } = props;
try {
const results: any = await PgClient.query(
`
BEGIN;
SET LOCAL hnsw.ef_search = ${global.systemEnv?.pgHNSWEfSearch || 100};
select id, collection_id, vector <#> '[${vectors[0]}]' AS score
from ${PgDatasetTableName}
where team_id='${teamId}'
AND dataset_id IN (${datasetIds.map((id) => `'${String(id)}'`).join(',')})
order by score limit ${limit};
COMMIT;`
);
const rows = results?.[2]?.rows as PgSearchRawType[];
return {
results: rows.map((item) => ({
id: item.id,
collectionId: item.collection_id,
score: item.score * -1
}))
};
} catch (error) {
console.log(error);
if (retry <= 0) {
return Promise.reject(error);
}
return embeddingRecall({
...props,
retry: retry - 1
});
}
};
export const getVectorCountByTeamId = async (teamId: string) => {
const total = await PgClient.count(PgDatasetTableName, {
where: [['team_id', String(teamId)]]
});
return total;
};
export const getVectorDataByTime = async (start: Date, end: Date) => {
const { rows } = await PgClient.query<{
id: string;
team_id: string;
dataset_id: string;
}>(`SELECT id, team_id, dataset_id
FROM ${PgDatasetTableName}
WHERE createtime BETWEEN '${dayjs(start).format('YYYY-MM-DD HH:mm:ss')}' AND '${dayjs(end).format(
'YYYY-MM-DD HH:mm:ss'
)}';
`);
return rows.map((item) => ({
id: String(item.id),
teamId: item.team_id,
datasetId: item.dataset_id
}));
};


@ -2,6 +2,7 @@ import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '../../system/log';
import { Pool } from 'pg';
import type { QueryResultRow } from 'pg';
import { PG_ADDRESS } from '../constants';
export const connectPg = async (): Promise<Pool> => {
if (global.pgClient) {
@ -9,7 +10,7 @@ export const connectPg = async (): Promise<Pool> => {
}
global.pgClient = new Pool({
connectionString: process.env.PG_URL,
connectionString: PG_ADDRESS,
max: Number(process.env.DB_MAX_LINK || 20),
min: 10,
keepAlive: true,


@ -1,7 +1,9 @@
import type { Pool } from 'pg';
import { MilvusClient } from '@zilliz/milvus2-sdk-node';
declare global {
var pgClient: Pool | null;
var milvusClient: MilvusClient | null;
}
export type EmbeddingRecallItemType = {


@ -85,7 +85,7 @@ export async function searchDatasetData(props: SearchDatasetDataProps) {
const { results } = await recallFromVectorStore({
teamId,
datasetIds,
vectors,
vector: vectors[0],
limit
});
@ -94,7 +94,7 @@ export async function searchDatasetData(props: SearchDatasetDataProps) {
{
teamId,
datasetId: { $in: datasetIds },
collectionId: { $in: results.map((item) => item.collectionId) },
collectionId: { $in: Array.from(new Set(results.map((item) => item.collectionId))) },
'indexes.dataId': { $in: results.map((item) => item.id?.trim()) }
},
'datasetId collectionId q a chunkIndex indexes'
@ -118,26 +118,24 @@ export async function searchDatasetData(props: SearchDatasetDataProps) {
concatResults.sort((a, b) => b.score - a.score);
const formatResult = concatResults
.map((data, index) => {
if (!data.collectionId) {
console.log('Collection is not found', data);
}
const formatResult = concatResults.map((data, index) => {
if (!data.collectionId) {
console.log('Collection is not found', data);
}
const result: SearchDataResponseItemType = {
id: String(data._id),
q: data.q,
a: data.a,
chunkIndex: data.chunkIndex,
datasetId: String(data.datasetId),
collectionId: String(data.collectionId?._id),
...getCollectionSourceData(data.collectionId),
score: [{ type: SearchScoreTypeEnum.embedding, value: data.score, index }]
};
const result: SearchDataResponseItemType = {
id: String(data._id),
q: data.q,
a: data.a,
chunkIndex: data.chunkIndex,
datasetId: String(data.datasetId),
collectionId: String(data.collectionId?._id),
...getCollectionSourceData(data.collectionId),
score: [{ type: SearchScoreTypeEnum.embedding, value: data.score, index }]
};
return result;
})
.filter((item) => item !== null) as SearchDataResponseItemType[];
return result;
});
return {
embeddingRecallResults: formatResult,


@ -45,6 +45,7 @@ import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runti
import { getHistories } from '../utils';
import { filterSearchResultsByMaxChars } from '../../utils';
import { getHistoryPreview } from '@fastgpt/global/core/chat/utils';
import { addLog } from '../../../../common/system/log';
export type ChatProps = ModuleDispatchProps<
AIChatNodeProps & {
@ -167,21 +168,19 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
})
);
const response = await ai.chat.completions.create(
{
...modelConstantsData?.defaultConfig,
model: modelConstantsData.model,
temperature,
max_tokens,
stream,
messages: loadMessages
},
{
headers: {
Accept: 'application/json, text/plain, */*'
}
const requestBody = {
...modelConstantsData?.defaultConfig,
model: modelConstantsData.model,
temperature,
max_tokens,
stream,
messages: loadMessages
};
const response = await ai.chat.completions.create(requestBody, {
headers: {
Accept: 'application/json, text/plain, */*'
}
);
});
const { answerText } = await (async () => {
if (res && stream) {
@ -189,7 +188,8 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
const { answer } = await streamResponse({
res,
detail,
stream: response
stream: response,
requestBody
});
return {
@ -349,11 +349,13 @@ async function getMaxTokens({
async function streamResponse({
res,
detail,
stream
stream,
requestBody
}: {
res: NextApiResponse;
detail: boolean;
stream: StreamChatType;
requestBody: Record<string, any>;
}) {
const write = responseWriteController({
res,
@ -378,6 +380,7 @@ async function streamResponse({
}
if (!answer) {
addLog.info(`LLM model response empty`, requestBody);
return Promise.reject('core.chat.Chat API is error or undefined');
}


@ -25,7 +25,10 @@ export const dispatchRunCode = async (props: RunCodeType): Promise<RunCodeRespon
try {
const { data: runResult } = await axios.post<{
success: boolean;
data: Record<string, any>;
data: {
codeReturn: Record<string, any>;
log: string;
};
}>(sandBoxRequestUrl, {
code,
variables: customVariables
@ -33,10 +36,11 @@ export const dispatchRunCode = async (props: RunCodeType): Promise<RunCodeRespon
if (runResult.success) {
return {
[NodeOutputKeyEnum.rawResponse]: runResult.data,
[NodeOutputKeyEnum.rawResponse]: runResult.data.codeReturn,
[DispatchNodeResponseKeyEnum.nodeResponse]: {
customInputs: customVariables,
customOutputs: runResult.data
customOutputs: runResult.data.codeReturn,
codeLog: runResult.data.log
},
...runResult.data
};
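
For clarity, the response shape the dispatcher now expects from the sandbox (values illustrative): `codeReturn` feeds the node outputs and `rawResponse`, while `log` is surfaced as `codeLog` in the node response.

```ts
// What the sandbox endpoint (sandBoxRequestUrl above) is expected to return:
const sandboxResponse = {
  success: true,
  data: {
    codeReturn: { result: 'hello', data2: 'world' },            // becomes rawResponse + custom outputs
    log: 'console output captured while running the user code'  // becomes codeLog
  }
};
```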


@ -13,6 +13,7 @@ import {
import { ModuleDispatchProps } from '@fastgpt/global/core/workflow/type';
import { getElseIFLabel, getHandleId } from '@fastgpt/global/core/workflow/utils';
import { getReferenceVariableValue } from '@fastgpt/global/core/workflow/runtime/utils';
import { replaceRegChars } from '@fastgpt/global/common/string/tools';
type Props = ModuleDispatchProps<{
[NodeInputKeyEnum.condition]: IfElseConditionType;
@ -48,39 +49,52 @@ function isInclude(value: any, target: any) {
}
}
function checkCondition(condition: VariableConditionEnum, variableValue: any, value: string) {
const operations = {
[VariableConditionEnum.isEmpty]: () => isEmpty(variableValue),
[VariableConditionEnum.isNotEmpty]: () => !isEmpty(variableValue),
function checkCondition(condition: VariableConditionEnum, inputValue: any, value: string) {
const operations: Record<VariableConditionEnum, () => boolean> = {
[VariableConditionEnum.isEmpty]: () => isEmpty(inputValue),
[VariableConditionEnum.isNotEmpty]: () => !isEmpty(inputValue),
[VariableConditionEnum.equalTo]: () => String(variableValue) === value,
[VariableConditionEnum.notEqual]: () => String(variableValue) !== value,
[VariableConditionEnum.equalTo]: () => String(inputValue) === value,
[VariableConditionEnum.notEqual]: () => String(inputValue) !== value,
// number
[VariableConditionEnum.greaterThan]: () => Number(variableValue) > Number(value),
[VariableConditionEnum.lessThan]: () => Number(variableValue) < Number(value),
[VariableConditionEnum.greaterThanOrEqualTo]: () => Number(variableValue) >= Number(value),
[VariableConditionEnum.lessThanOrEqualTo]: () => Number(variableValue) <= Number(value),
[VariableConditionEnum.greaterThan]: () => Number(inputValue) > Number(value),
[VariableConditionEnum.lessThan]: () => Number(inputValue) < Number(value),
[VariableConditionEnum.greaterThanOrEqualTo]: () => Number(inputValue) >= Number(value),
[VariableConditionEnum.lessThanOrEqualTo]: () => Number(inputValue) <= Number(value),
// array or string
[VariableConditionEnum.include]: () => isInclude(variableValue, value),
[VariableConditionEnum.notInclude]: () => !isInclude(variableValue, value),
[VariableConditionEnum.include]: () => isInclude(inputValue, value),
[VariableConditionEnum.notInclude]: () => !isInclude(inputValue, value),
// string
[VariableConditionEnum.startWith]: () => variableValue?.startsWith(value),
[VariableConditionEnum.endWith]: () => variableValue?.endsWith(value),
[VariableConditionEnum.startWith]: () => inputValue?.startsWith(value),
[VariableConditionEnum.endWith]: () => inputValue?.endsWith(value),
[VariableConditionEnum.reg]: () => {
if (typeof inputValue !== 'string' || !value) return false;
if (value.startsWith('/')) {
value = value.slice(1);
}
if (value.endsWith('/')) {
value = value.slice(0, -1);
}
const reg = new RegExp(value, 'g');
const result = reg.test(inputValue);
return result;
},
// array
[VariableConditionEnum.lengthEqualTo]: () => variableValue?.length === Number(value),
[VariableConditionEnum.lengthNotEqualTo]: () => variableValue?.length !== Number(value),
[VariableConditionEnum.lengthGreaterThan]: () => variableValue?.length > Number(value),
[VariableConditionEnum.lengthGreaterThanOrEqualTo]: () =>
variableValue?.length >= Number(value),
[VariableConditionEnum.lengthLessThan]: () => variableValue?.length < Number(value),
[VariableConditionEnum.lengthLessThanOrEqualTo]: () => variableValue?.length <= Number(value)
[VariableConditionEnum.lengthEqualTo]: () => inputValue?.length === Number(value),
[VariableConditionEnum.lengthNotEqualTo]: () => inputValue?.length !== Number(value),
[VariableConditionEnum.lengthGreaterThan]: () => inputValue?.length > Number(value),
[VariableConditionEnum.lengthGreaterThanOrEqualTo]: () => inputValue?.length >= Number(value),
[VariableConditionEnum.lengthLessThan]: () => inputValue?.length < Number(value),
[VariableConditionEnum.lengthLessThanOrEqualTo]: () => inputValue?.length <= Number(value)
};
return (operations[condition] || (() => false))();
return operations[condition]?.() ?? false;
}
function getResult(
@ -92,13 +106,13 @@ function getResult(
const listResult = list.map((item) => {
const { variable, condition: variableCondition, value } = item;
const variableValue = getReferenceVariableValue({
const inputValue = getReferenceVariableValue({
value: variable,
variables,
nodes: runtimeNodes
});
return checkCondition(variableCondition as VariableConditionEnum, variableValue, value || '');
return checkCondition(variableCondition as VariableConditionEnum, inputValue, value || '');
});
return condition === 'AND' ? listResult.every(Boolean) : listResult.some(Boolean);
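
To make the new `reg` condition concrete, here is a standalone mirror of that branch (not the exported API; `checkCondition` above is module-private): surrounding slashes are stripped before the `RegExp` is built, and non-string inputs short-circuit to `false`.

```ts
// Equivalent standalone check mirroring the `reg` branch above.
const checkReg = (inputValue: unknown, value: string) => {
  if (typeof inputValue !== 'string' || !value) return false;
  if (value.startsWith('/')) value = value.slice(1);
  if (value.endsWith('/')) value = value.slice(0, -1);
  return new RegExp(value, 'g').test(inputValue);
};

checkReg('20240601', '/^\\d+$/'); // true  (the "/.../" wrapper is stripped)
checkReg('abc', '^\\d+$');        // false (no match)
checkReg(42, '\\d+');             // false (non-string input)
```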


@ -16,7 +16,7 @@ type Props = ModuleDispatchProps<{
type Response = DispatchNodeResultType<{}>;
export const dispatchUpdateVariable = async (props: Props): Promise<Response> => {
const { res, detail, params, variables, runtimeNodes } = props;
const { res, detail, stream, params, variables, runtimeNodes } = props;
const { updateList } = params;
updateList.forEach((item) => {
@ -54,7 +54,7 @@ export const dispatchUpdateVariable = async (props: Props): Promise<Response> =>
}
});
if (detail) {
if (detail && stream) {
responseWrite({
res,
event: SseResponseEventEnum.updateVariables,


@ -1,3 +1,5 @@
import { getErrText } from '@fastgpt/global/common/error/utils';
import { replaceSensitiveText } from '@fastgpt/global/common/string/tools';
import type { ChatItemType } from '@fastgpt/global/core/chat/type.d';
import {
WorkflowIOValueTypeEnum,
@ -89,11 +91,10 @@ export const removeSystemVariable = (variables: Record<string, any>) => {
export const formatHttpError = (error: any) => {
return {
message: error?.message,
message: getErrText(error),
data: error?.response?.data,
name: error?.name,
method: error?.config?.method,
baseURL: error?.config?.baseURL,
url: error?.config?.url,
code: error?.code,
status: error?.status
};
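
A quick sketch of `formatHttpError` now that it delegates the message to `getErrText`; the error object and the relative import path are illustrative.

```ts
import { formatHttpError } from './utils'; // assumed relative path within the dispatch module

const formatted = formatHttpError({
  message: 'Request failed with status code 500',
  response: { data: { message: 'upstream timeout' } },
  code: 'ERR_BAD_RESPONSE',
  status: 500
});
console.log(formatted.message); // 'upstream timeout': the nested response message via getErrText
```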


@ -5,7 +5,9 @@
"@fastgpt/global": "workspace:*",
"@node-rs/jieba": "1.10.0",
"@xmldom/xmldom": "^0.8.10",
"@zilliz/milvus2-sdk-node": "2.4.2",
"axios": "^1.5.1",
"chalk": "^5.3.0",
"cheerio": "1.0.0-rc.12",
"cookie": "^0.5.0",
"date-fns": "2.30.0",
@ -22,7 +24,7 @@
"mongoose": "^7.0.2",
"multer": "1.4.5-lts.1",
"next": "14.2.3",
"nextjs-cors": "^2.1.2",
"nextjs-cors": "^2.2.0",
"node-cron": "^3.0.3",
"node-xlsx": "^0.23.0",
"papaparse": "5.4.1",


@ -5,6 +5,7 @@ import { MongoPlugin } from '../../core/plugin/schema';
import { MongoDataset } from '../../core/dataset/schema';
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
import { TeamErrEnum } from '@fastgpt/global/common/error/code/team';
import { SystemErrEnum } from '@fastgpt/global/common/error/code/system';
export const checkDatasetLimit = async ({
teamId,
@ -13,14 +14,14 @@ export const checkDatasetLimit = async ({
teamId: string;
insertLen?: number;
}) => {
const [{ standardConstants, totalPoints, usedPoints, datasetMaxSize }, usedSize] =
await Promise.all([getTeamPlanStatus({ teamId }), getVectorCountByTeamId(teamId)]);
const { standardConstants, totalPoints, usedPoints, datasetMaxSize, usedDatasetSize } =
await getTeamPlanStatus({ teamId });
if (!standardConstants) return;
if (usedSize + insertLen >= datasetMaxSize) {
if (usedDatasetSize + insertLen >= datasetMaxSize) {
return Promise.reject(
`您的知识库容量为: ${datasetMaxSize}组,已使用: ${usedSize}组,导入当前文件需要: ${insertLen}组,请增加知识库容量后导入。`
`您的知识库容量为: ${datasetMaxSize}组,已使用: ${usedDatasetSize}组,导入当前文件需要: ${insertLen}组,请增加知识库容量后导入。`
);
}
@ -59,6 +60,9 @@ export const checkTeamDatasetLimit = async (teamId: string) => {
if (standardConstants && datasetCount >= standardConstants.maxDatasetAmount) {
return Promise.reject(TeamErrEnum.datasetAmountNotEnough);
}
if (!global.feConfigs.isPlus && datasetCount >= 30) {
return Promise.reject(SystemErrEnum.communityVersionNumLimit);
}
};
export const checkTeamAppLimit = async (teamId: string) => {
const [{ standardConstants }, appCount] = await Promise.all([
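
A hedged sketch of how the new open-source cap shows up for callers of `checkTeamDatasetLimit`; import paths are assumptions, the rejection value follows from the code above.

```ts
import { checkTeamDatasetLimit } from '@fastgpt/service/support/permission/teamLimit'; // assumed path
import { SystemErrEnum } from '@fastgpt/global/common/error/code/system';

declare const teamId: string;

try {
  await checkTeamDatasetLimit(teamId);
  // ...proceed to create the dataset
} catch (err) {
  // On non-Plus (community) builds, the 31st dataset is rejected with this enum value,
  // which ERROR_RESPONSE maps to code 509000.
  if (err === SystemErrEnum.communityVersionNumLimit) {
    console.log('Community edition limit: at most 30 datasets per team');
  }
}
```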


@ -56,11 +56,18 @@ const MultipleRowSelect = ({
}}
onClick={() => {
const newValue = [...cloneValue];
newValue[index] = item.value;
setCloneValue(newValue);
if (!hasChildren) {
if (item.value === selectedValue) {
newValue[index] = undefined;
setCloneValue(newValue);
onSelect(newValue);
onClose();
} else {
newValue[index] = item.value;
setCloneValue(newValue);
if (!hasChildren) {
onSelect(newValue);
onClose();
}
}
}}
{...(item.value === selectedValue


@ -20,6 +20,8 @@ const { definePartsStyle: numInputPart, defineMultiStyleConfig: numInputMultiSty
const { definePartsStyle: checkBoxPart, defineMultiStyleConfig: checkBoxMultiStyle } =
createMultiStyleConfigHelpers(checkboxAnatomy.keys);
const shadowLight = '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15)';
// 按键
const Button = defineStyleConfig({
baseStyle: {
@ -289,7 +291,7 @@ const Input: ComponentStyleConfig = {
borderColor: 'borderColor.low',
_focus: {
borderColor: 'primary.500',
boxShadow: '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15)',
boxShadow: shadowLight,
bg: 'white'
},
_disabled: {
@ -328,7 +330,7 @@ const NumberInput = numInputMultiStyle({
borderColor: 'myGray.200',
_focus: {
borderColor: 'primary.500 !important',
boxShadow: '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15) !important',
boxShadow: `${shadowLight} !important`,
bg: 'transparent'
},
_disabled: {
@ -362,7 +364,7 @@ const Textarea: ComponentStyleConfig = {
},
_focus: {
borderColor: 'primary.500',
boxShadow: '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15)',
boxShadow: shadowLight,
bg: 'white'
}
}
@ -396,7 +398,7 @@ const Select = selectMultiStyle({
field: {
borderColor: 'myGray.200',
_focusWithin: {
boxShadow: '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15)',
boxShadow: shadowLight,
borderColor: 'primary.500'
}
}
@ -408,6 +410,21 @@ const Checkbox = checkBoxMultiStyle({
baseStyle: checkBoxPart({
label: {
fontFamily: 'mono' // change the font family of the label
},
control: {
bg: 'none',
_checked: {
bg: 'primary.50',
borderColor: 'primary.600',
color: 'primary.600',
boxShadow: `${shadowLight} !important`,
_hover: {
bg: 'primary.50'
}
},
_hover: {
borderColor: 'primary.400'
}
}
})
});
@ -437,6 +454,11 @@ export const theme = extendTheme({
},
a: {
color: 'primary.600'
},
'*': {
_focusVisible: {
boxShadow: 'none'
}
}
}
},
@ -491,6 +513,25 @@ export const theme = extendTheme({
800: '#2450B5',
900: '#1D4091'
},
blue: {
1: 'rgba(51, 112, 255, 0.1)',
'015': 'rgba(51, 112, 255, 0.15)',
3: 'rgba(51, 112, 255, 0.3)',
5: 'rgba(51, 112, 255, 0.5)',
7: 'rgba(51, 112, 255, 0.7)',
9: 'rgba(51, 112, 255, 0.9)',
50: '#F0F4FF',
100: '#E1EAFF',
200: '#C5D7FF',
300: '#94B5FF',
400: '#5E8FFF',
500: '#487FFF',
600: '#3370FF',
700: '#2B5FD9',
800: '#2450B5',
900: '#1D4091'
},
red: {
1: 'rgba(217,45,32,0.1)',
3: 'rgba(217,45,32,0.3)',
@ -579,7 +620,8 @@ export const theme = extendTheme({
5: '0px 0px 1px 0px rgba(19, 51, 107, 0.15), 0px 20px 24px -8px rgba(19, 51, 107, 0.15)',
6: '0px 0px 1px 0px rgba(19, 51, 107, 0.20), 0px 24px 48px -12px rgba(19, 51, 107, 0.20)',
7: '0px 0px 1px 0px rgba(19, 51, 107, 0.20), 0px 32px 64px -12px rgba(19, 51, 107, 0.20)',
focus: '0px 0px 0px 2.4px rgba(51, 112, 255, 0.15)'
focus: shadowLight,
outline: 'none'
},
breakpoints: {
sm: '900px',

View File

@ -4,6 +4,12 @@ settings:
autoInstallPeers: true
excludeLinksFromLockfile: false
overrides:
react: 18.3.1
react-dom: 18.3.1
'@types/react': 18.3.0
'@types/react-dom': 18.3.0
importers:
.:
@ -102,9 +108,15 @@ importers:
'@xmldom/xmldom':
specifier: ^0.8.10
version: 0.8.10
'@zilliz/milvus2-sdk-node':
specifier: 2.4.2
version: 2.4.2
axios:
specifier: ^1.5.1
version: 1.5.1
chalk:
specifier: ^5.3.0
version: 5.3.0
cheerio:
specifier: 1.0.0-rc.12
version: 1.0.0-rc.12
@ -154,8 +166,8 @@ importers:
specifier: 14.2.3
version: 14.2.3(@babel/core@7.24.4)(react-dom@18.3.1)(react@18.3.1)(sass@1.58.3)
nextjs-cors:
specifier: ^2.1.2
version: 2.1.2(next@14.2.3)
specifier: ^2.2.0
version: 2.2.0(next@14.2.3)
node-cron:
specifier: ^3.0.3
version: 3.0.3
@ -3162,6 +3174,11 @@ packages:
dev: true
optional: true
/@colors/colors@1.6.0:
resolution: {integrity: sha512-Ir+AOibqzrIsL6ajt3Rz3LskB7OiMVHqltZmspbW/TJuTVuyOMirVqAkjfY6JISiLHgyNqicAC8AyHHGzNd/dA==}
engines: {node: '>=0.1.90'}
dev: false
/@cspotcode/source-map-support@0.8.1:
resolution: {integrity: sha512-IchNf6dN4tHoMFIn/7OE8LWZ19Y6q/67Bmf6vnGREv8RSbBVb9LPJxEcnwrcwX6ixSvaiGoomAUvu4YSxXrVgw==}
engines: {node: '>=12'}
@ -3169,6 +3186,14 @@ packages:
'@jridgewell/trace-mapping': 0.3.9
dev: true
/@dabh/diagnostics@2.0.3:
resolution: {integrity: sha512-hrlQOIi7hAfzsMqlGSFyVucrx38O+j6wiGOf//H2ecvIEqYN4ADBSS2iLMh5UFyDunCNniUIPk/q3riFv45xRA==}
dependencies:
colorspace: 1.1.4
enabled: 2.0.0
kuler: 2.0.0
dev: false
/@emnapi/core@1.1.1:
resolution: {integrity: sha512-eu4KjHfXg3I+UUR7vSuwZXpRo4c8h4Rtb5Lu2F7Z4JqJFl/eidquONEBiRs6viXKpWBC3BaJBy68xGJ2j56idw==}
requiresBuild: true
@ -3669,6 +3694,25 @@ packages:
engines: {node: '>=16.15'}
dev: false
/@grpc/grpc-js@1.10.8:
resolution: {integrity: sha512-vYVqYzHicDqyKB+NQhAc54I1QWCBLCrYG6unqOIcBTHx+7x8C9lcoLj3KVJXs2VB4lUbpWY+Kk9NipcbXYWmvg==}
engines: {node: '>=12.10.0'}
dependencies:
'@grpc/proto-loader': 0.7.13
'@js-sdsl/ordered-map': 4.4.2
dev: false
/@grpc/proto-loader@0.7.13:
resolution: {integrity: sha512-AiXO/bfe9bmxBjxxtYxFAXGZvMaN5s8kO+jBHAJCON8rJoB5YS/D6X7ZNc6XQkuHNmyl4CYaMI1fJ/Gn27RGGw==}
engines: {node: '>=6'}
hasBin: true
dependencies:
lodash.camelcase: 4.3.0
long: 5.2.3
protobufjs: 7.3.0
yargs: 17.7.2
dev: false
/@humanwhocodes/config-array@0.11.14:
resolution: {integrity: sha512-3T8LkOmg45BV5FICb15QQMsyUSWrQ8AygVfC7ZG32zOalnqrilm018ZVCw0eapXux8FtA33q8PSRSstjee3jSg==}
engines: {node: '>=10.10.0'}
@ -3968,6 +4012,10 @@ packages:
'@jridgewell/sourcemap-codec': 1.4.15
dev: true
/@js-sdsl/ordered-map@4.4.2:
resolution: {integrity: sha512-iUKgm52T8HOE/makSxjqoWhe95ZJA1/G1sYsGev2JDKUSS14KAgg1LHb+Ba+IPow0xflbnSkOsZcO08C7w1gYw==}
dev: false
/@jsdevtools/ono@7.1.3:
resolution: {integrity: sha512-4JQNk+3mVzK3xh2rqd6RB4J46qUR19azEHBneZyTZM+c456qOrbbM/5xcR8huNCCcbVt7+UmizG6GuUvPvKUYg==}
dev: false
@ -4797,6 +4845,10 @@ packages:
transitivePeerDependencies:
- encoding
/@petamoriken/float16@3.8.7:
resolution: {integrity: sha512-/Ri4xDDpe12NT6Ex/DRgHzLlobiQXEW/hmG08w1wj/YU7hLemk97c+zHQFp0iZQ9r7YqgLEXZR2sls4HxBf9NA==}
dev: false
/@pkgjs/parseargs@0.11.0:
resolution: {integrity: sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==}
engines: {node: '>=14'}
@ -4807,6 +4859,49 @@ packages:
resolution: {integrity: sha512-P1st0aksCrn9sGZhp8GMYwBnQsbvAWsZAX44oXNNvLHGqAOcoVxmjZiohstwQ7SqKnbR47akdNi+uleWD8+g6A==}
dev: false
/@protobufjs/aspromise@1.1.2:
resolution: {integrity: sha512-j+gKExEuLmKwvz3OgROXtrJ2UG2x8Ch2YZUxahh+s1F2HZ+wAceUNLkvy6zKCPVRkU++ZWQrdxsUeQXmcg4uoQ==}
dev: false
/@protobufjs/base64@1.1.2:
resolution: {integrity: sha512-AZkcAA5vnN/v4PDqKyMR5lx7hZttPDgClv83E//FMNhR2TMcLUhfRUBHCmSl0oi9zMgDDqRUJkSxO3wm85+XLg==}
dev: false
/@protobufjs/codegen@2.0.4:
resolution: {integrity: sha512-YyFaikqM5sH0ziFZCN3xDC7zeGaB/d0IUb9CATugHWbd1FRFwWwt4ld4OYMPWu5a3Xe01mGAULCdqhMlPl29Jg==}
dev: false
/@protobufjs/eventemitter@1.1.0:
resolution: {integrity: sha512-j9ednRT81vYJ9OfVuXG6ERSTdEL1xVsNgqpkxMsbIabzSo3goCjDIveeGv5d03om39ML71RdmrGNjG5SReBP/Q==}
dev: false
/@protobufjs/fetch@1.1.0:
resolution: {integrity: sha512-lljVXpqXebpsijW71PZaCYeIcE5on1w5DlQy5WH6GLbFryLUrBD4932W/E2BSpfRJWseIL4v/KPgBFxDOIdKpQ==}
dependencies:
'@protobufjs/aspromise': 1.1.2
'@protobufjs/inquire': 1.1.0
dev: false
/@protobufjs/float@1.0.2:
resolution: {integrity: sha512-Ddb+kVXlXst9d+R9PfTIxh1EdNkgoRe5tOX6t01f1lYWOvJnSPDBlG241QLzcyPdoNTsblLUdujGSE4RzrTZGQ==}
dev: false
/@protobufjs/inquire@1.1.0:
resolution: {integrity: sha512-kdSefcPdruJiFMVSbn801t4vFK7KB/5gd2fYvrxhuJYg8ILrmn9SKSX2tZdV6V+ksulWqS7aXjBcRXl3wHoD9Q==}
dev: false
/@protobufjs/path@1.1.2:
resolution: {integrity: sha512-6JOcJ5Tm08dOHAbdR3GrvP+yUUfkjG5ePsHYczMFLq3ZmMkAD98cDgcT2iA1lJ9NVwFd4tH/iSSoe44YWkltEA==}
dev: false
/@protobufjs/pool@1.1.0:
resolution: {integrity: sha512-0kELaGSIDBKvcgS4zkjz1PeddatrjYcmMWOlAuAPwAeccUrPHdUqo/J6LiymHHEiJT5NrF1UVwxY14f+fy4WQw==}
dev: false
/@protobufjs/utf8@1.1.0:
resolution: {integrity: sha512-Vvn3zZrhQZkkBE8LSuW3em98c0FwgO4nxzv6OdSxPKJIEKY2bGbHn+mhGIPerzI4twdxaP8/0+06HBpwf345Lw==}
dev: false
/@reactflow/background@11.2.4(immer@9.0.19)(react-dom@18.3.1)(react@18.3.1):
resolution: {integrity: sha512-SYQbCRCU0GuxT/40Tm7ZK+l5wByGnNJSLtZhbL9C/Hl7JhsJXV3UGXr0vrlhVZUBEtkWA7XhZM/5S9XEA5XSFA==}
peerDependencies:
@ -5686,6 +5781,10 @@ packages:
'@types/superagent': 8.1.7
dev: true
/@types/triple-beam@1.3.5:
resolution: {integrity: sha512-6WaYesThRMCl19iryMYP7/x2OVgCtbIVflDGFpWnb9irXI3UjYE4AzmYuiUKY1AJstGijoY+MgUszMgRxIYTYw==}
dev: false
/@types/tunnel@0.0.4:
resolution: {integrity: sha512-bQgDBL5XiqrrPUaZd9bZ2esOXcU4GTmgg0n6LHDqoMJezO3VFRZsW8qN6Gp64/LAmjtzNU3iAHBfV3Z2ht5DSg==}
dependencies:
@ -6034,6 +6133,19 @@ packages:
'@zag-js/dom-query': 0.16.0
dev: false
/@zilliz/milvus2-sdk-node@2.4.2:
resolution: {integrity: sha512-fkPu7XXzfUvHoCnSPVOjqQpWuSnnn9x2NMmmCcIOyRzMeXIsrz4Mf/+M7LUzmT8J9F0Khx65B0rJgCu27YzWQw==}
dependencies:
'@grpc/grpc-js': 1.10.8
'@grpc/proto-loader': 0.7.13
'@petamoriken/float16': 3.8.7
dayjs: 1.11.7
generic-pool: 3.9.0
lru-cache: 9.1.2
protobufjs: 7.3.0
winston: 3.13.0
dev: false
/abbrev@1.1.1:
resolution: {integrity: sha512-nne9/IiQ/hzIhY6pdDnbBtz7DjPTKrY00P/zvPSm5pOFkl6xuGrGnXn/VtTNNfNtAfZ9/1RtehkszU9qcTii0Q==}
requiresBuild: true
@ -6433,6 +6545,10 @@ packages:
engines: {node: '>=8'}
dev: true
/async@3.2.5:
resolution: {integrity: sha512-baNZyqaaLhyLVKm/DlvdW051MSgO6b8eVfIezl9E5PqWxFgzLm/wQntEW4zOytVburDEr0JlALEpdOFwvErLsg==}
dev: false
/asynckit@0.4.0:
resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
@ -6951,6 +7067,11 @@ packages:
engines: {node: ^12.17.0 || ^14.13 || >=16.0.0}
dev: true
/chalk@5.3.0:
resolution: {integrity: sha512-dLitG79d+GV1Nb/VYcCDFivJeK1hiukt9QjRNVOsUtTy1rR1YJsmpGGTZ3qJos+uw7WmWF4wUwBd9jxjocFC2w==}
engines: {node: ^12.17.0 || ^14.13 || >=16.0.0}
dev: false
/char-regex@1.0.2:
resolution: {integrity: sha512-kWWXztvZ5SBQV+eRgKFeh8q5sLuZY2+8WUIzlxWVTg+oGwY14qylx1KbKzHd8P6ZYkAg0xyIDU9JMHhyJMZ1jw==}
engines: {node: '>=10'}
@ -7162,7 +7283,6 @@ packages:
string-width: 4.2.3
strip-ansi: 6.0.1
wrap-ansi: 7.0.0
dev: true
/clone@1.0.4:
resolution: {integrity: sha512-JQHZ2QMW6l3aH/j6xCqQThY/9OH4D/9ls34cgkUBiEeocRTU04tHfKPBsUK1PqZCUQM7GiA0IIXJSuXHI64Kbg==}
@ -7199,6 +7319,13 @@ packages:
/color-name@1.1.4:
resolution: {integrity: sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==}
/color-string@1.9.1:
resolution: {integrity: sha512-shrVawQFojnZv6xM40anx4CkoDP+fZsw/ZerEMsW/pyzsRbElpsL/DBVW7q3ExxwusdNXI3lXpuhEZkzs8p5Eg==}
dependencies:
color-name: 1.1.4
simple-swizzle: 0.2.2
dev: false
/color-support@1.1.3:
resolution: {integrity: sha512-qiBjkpbMLO/HL68y+lh4q0/O1MZFj2RX6X/KmMa3+gJD3z+WwI1ZzDHysvqHGS3mP6mznPckpXmw1nI9cJjyRg==}
hasBin: true
@ -7210,10 +7337,24 @@ packages:
resolution: {integrity: sha512-zW190nQTIoXcGCaU08DvVNFTmQhUpnJfVuAKfWqUQkflXKpaDdpaYoM0iluLS9lgJNHyBF58KKA2FBEwkD7wog==}
dev: false
/color@3.2.1:
resolution: {integrity: sha512-aBl7dZI9ENN6fUGC7mWpMTPNHmWUSNan9tuWN6ahh5ZLNk9baLJOnSMlrQkHcrfFgz2/RigjUVAjdx36VcemKA==}
dependencies:
color-convert: 1.9.3
color-string: 1.9.1
dev: false
/colorette@2.0.20:
resolution: {integrity: sha512-IfEDxwoWIjkeXL1eXcDiow4UbKjhLdq6/EuSVR9GMN7KVH3r9gQ83e73hsz1Nd1T3ijd5xv1wcWRYO+D6kCI2w==}
dev: true
/colorspace@1.1.4:
resolution: {integrity: sha512-BgvKJiuVu1igBUF2kEjRCZXol6wiiGbY5ipL/oVPwm0BL9sIpMIzM8IK7vwuxIIzOXMV3Ey5w+vxhm0rR/TN8w==}
dependencies:
color: 3.2.1
text-hex: 1.0.0
dev: false
/combined-stream@1.0.8:
resolution: {integrity: sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==}
engines: {node: '>= 0.8'}
@ -8258,6 +8399,10 @@ packages:
engines: {node: '>= 4'}
dev: false
/enabled@2.0.0:
resolution: {integrity: sha512-AKrN98kuwOzMIdAizXGI86UFBoo26CL21UM763y1h/GMSJ4/OHU9k2YlsmBpyScFo/wbLzWQJBMCW4+IO3/+OQ==}
dev: false
/encodeurl@1.0.2:
resolution: {integrity: sha512-TPJXq8JqFaVYm2CWmPvnP2Iyo4ZSM7/QKcSmuMLDObfpH5fi7RUGmd/rTDf+rut/saiDiQEeVTNgAmJEdAOx0w==}
engines: {node: '>= 0.8'}
@ -9308,6 +9453,10 @@ packages:
pend: 1.2.0
dev: false
/fecha@4.2.3:
resolution: {integrity: sha512-OP2IUU6HeYKJi3i0z4A19kHMQoLVs4Hc+DPqqxI2h/DPZHTm/vjsfC6P0b4jCMy14XizLBqvndQ+UilD7707Jw==}
dev: false
/figures@3.2.0:
resolution: {integrity: sha512-yaduQFRKLXYOGgEn6AZau90j3ggSOyiqXU0F9JZfeXYhNa+Jk4X+s45A2zg5jns87GAFa34BBm2kXw4XpNcbdg==}
engines: {node: '>=8'}
@ -9409,6 +9558,10 @@ packages:
resolution: {integrity: sha512-X8cqMLLie7KsNUDSdzeN8FYK9rEt4Dt67OsG/DNGnYTSDBG4uFAJFBnUeiV+zCVAvwFy56IjM9sH51jVaEhNxw==}
dev: true
/fn.name@1.1.0:
resolution: {integrity: sha512-GRnmB5gPyJpAhTQdSZTSp9uaPSvl09KoYcMQtsB9rQoOmzs9dH6ffeccH+Z+cv6P68Hu5bC6JjRh4Ah/mHSNRw==}
dev: false
/focus-lock@1.3.4:
resolution: {integrity: sha512-Gv0N3mvej3pD+HWkNryrF8sExzEHqhQ6OSFxD4DPxm9n5HGCaHme98ZMBZroNEAJcsdtHxk+skvThGKyUeoEGA==}
engines: {node: '>=10'}
@ -9611,6 +9764,11 @@ packages:
dev: false
optional: true
/generic-pool@3.9.0:
resolution: {integrity: sha512-hymDOu5B53XvN4QT9dBmZxPX4CWhBPPLguTZ9MMFeFa/Kg0xWVfylOVNlJji/E7yTZWFd/q9GO5TxDLq156D7g==}
engines: {node: '>= 4'}
dev: false
/gensync@1.0.0-beta.2:
resolution: {integrity: sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==}
engines: {node: '>=6.9.0'}
@ -9618,7 +9776,6 @@ packages:
/get-caller-file@2.0.5:
resolution: {integrity: sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==}
engines: {node: 6.* || 8.* || >= 10.*}
dev: true
/get-func-name@2.0.2:
resolution: {integrity: sha512-8vXOvuE167CtIc3OyItco7N/dpRtBbYOsPsXCz7X/PMnlGjYjSGuZJgM1Y7mmew7BKf9BqvLX2tnOVy1BBUsxQ==}
@ -10235,6 +10392,10 @@ packages:
/is-arrayish@0.2.1:
resolution: {integrity: sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==}
/is-arrayish@0.3.2:
resolution: {integrity: sha512-eVRqCvVlZbuw3GrM63ovNSNAeA1K16kaR/LRY/92w0zxQ5/1YzwblUX652i4Xs9RwAGjW9d9y6X88t8OaAJfWQ==}
dev: false
/is-async-function@2.0.0:
resolution: {integrity: sha512-Y1JXKrfykRJGdlDwdKlLpLyMIiWqWvuSd17TvZk68PLAOGOoF4Xyav1z0Xhoi+gCYjZVeC5SI+hYFOfvXmGRCA==}
engines: {node: '>= 0.4'}
@ -10422,7 +10583,6 @@ packages:
/is-stream@2.0.1:
resolution: {integrity: sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==}
engines: {node: '>=8'}
dev: true
/is-stream@3.0.0:
resolution: {integrity: sha512-LnQR4bZ9IADDRSkvpqMGvt/tEJWclzklNgSw48V5EAaAeDd6qGvN8ei6k5p0tvxSR171VmGyHuTiAOfxAbr8kA==}
@ -11197,6 +11357,10 @@ packages:
engines: {node: '>=6'}
dev: false
/kuler@2.0.0:
resolution: {integrity: sha512-Xq9nH7KlWZmXAtodXDDRE7vs6DU1gTU8zYDHDiWLSip45Egwq3plLHzPn27NgvzL2r1LMPC1vdqh98sQxtqj4A==}
dev: false
/language-subtag-registry@0.3.22:
resolution: {integrity: sha512-tN0MCzyWnoz/4nHS6uxdlFWoUZT7ABptwKPQ52Ea7URk6vll88bWBVhodtnlfEuCcKWNGoc+uGbw1cwa9IKh/w==}
dev: true
@ -11345,6 +11509,10 @@ packages:
resolution: {integrity: sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw==}
dev: false
/lodash.camelcase@4.3.0:
resolution: {integrity: sha512-TwuEnCnxbc3rAvhf/LbG7tJUDzhqXyFnv3dtzLOPgCG/hODL7WFnsbwktkD7yUV0RrreP/l1PALq/YSg6VvjlA==}
dev: false
/lodash.debounce@4.0.8:
resolution: {integrity: sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow==}
dev: true
@ -11417,6 +11585,22 @@ packages:
wrap-ansi: 6.2.0
dev: true
/logform@2.6.0:
resolution: {integrity: sha512-1ulHeNPp6k/LD8H91o7VYFBng5i1BDE7HoKxVbZiGFidS1Rj65qcywLxX+pVfAPoQJEjRdvKcusKwOupHCVOVQ==}
engines: {node: '>= 12.0.0'}
dependencies:
'@colors/colors': 1.6.0
'@types/triple-beam': 1.3.5
fecha: 4.2.3
ms: 2.1.3
safe-stable-stringify: 2.4.3
triple-beam: 1.4.1
dev: false
/long@5.2.3:
resolution: {integrity: sha512-lcHwpNoggQTObv5apGNCTdJrO69eHOZMi4BNC+rTLER8iHAqGrUVeLh/irVIM7zTw2bOXA8T6uNPeujwOLg/2Q==}
dev: false
/longest-streak@3.1.0:
resolution: {integrity: sha512-9Ri+o0JYgehTaVBBDoMqIl8GXtbWg711O3srftcHhZ0dqnETqLaoIK0x17fUw9rFSlK/0NlsKe0Ahhyl5pXE2g==}
dev: false
@ -11464,6 +11648,11 @@ packages:
yallist: 4.0.0
dev: false
/lru-cache@9.1.2:
resolution: {integrity: sha512-ERJq3FOzJTxBbFjZ7iDs+NiK4VI9Wz+RdrrAB8dio1oV+YvdPzUEE4QNiT2VD51DkIbCYRUUzCRkssXCHqSnKQ==}
engines: {node: 14 || >=16.14}
dev: false
/luxon@3.4.4:
resolution: {integrity: sha512-zobTr7akeGHnv7eBOXcRgMeCP6+uyYsczwmeRCauvpvaAltgNyTbLH/+VaEAPUeWBT+1GuNmz4wC/6jtQzbbVA==}
engines: {node: '>=12'}
@ -12444,10 +12633,10 @@ packages:
- '@babel/core'
- babel-plugin-macros
/nextjs-cors@2.1.2(next@14.2.3):
resolution: {integrity: sha512-2yOVivaaf2ILe4f/qY32hnj3oC77VCOsUQJQfhVMGsXE/YMEWUY2zy78sH9FKUCM7eG42/l3pDofIzMD781XGA==}
/nextjs-cors@2.2.0(next@14.2.3):
resolution: {integrity: sha512-FZu/A+L59J4POJNqwXYyCPDvsLDeu5HjSBvytzS6lsrJeDz5cmnH45zV+VoNic0hjaeER9xGaiIjZIWzEHnxQg==}
peerDependencies:
next: ^8.1.1-canary.54 || ^9.0.0 || ^10.0.0-0 || ^11.0.0 || ^12.0.0 || ^13.0.0
next: ^8.1.1-canary.54 || ^9.0.0 || ^10.0.0-0 || ^11.0.0 || ^12.0.0 || ^13.0.0 || ^14.0.0
dependencies:
cors: 2.8.5
next: 14.2.3(@babel/core@7.24.4)(react-dom@18.3.1)(react@18.3.1)(sass@1.58.3)
@ -12728,6 +12917,12 @@ packages:
dependencies:
wrappy: 1.0.2
/one-time@1.0.0:
resolution: {integrity: sha512-5DXOiRKwuSEcQ/l0kGCF6Q3jcADFv5tSmRaJck/OqkVFcOzutB134KRSfF0xDrL39MNnqxbHBbUUcjZIhTgb2g==}
dependencies:
fn.name: 1.1.0
dev: false
/onetime@5.1.2:
resolution: {integrity: sha512-kbpaSSGJTWdAY5KPVeMOKXSrPtr8C8C7wodJbcsd51jRnmD+GZu8Y0VoU6Dm5Z4vWr0Ig/1NKuWRKf7j5aaYSg==}
engines: {node: '>=6'}
@ -13331,6 +13526,25 @@ packages:
resolution: {integrity: sha512-OHYtXfu5aI2sS2LWFSN5rgJjrQ4pCy8i1jubJLe2QvMF8JJ++HXTUIVWFLfXJoaOfvYYjk2SN8J2wFUWIGXT4w==}
dev: false
/protobufjs@7.3.0:
resolution: {integrity: sha512-YWD03n3shzV9ImZRX3ccbjqLxj7NokGN0V/ESiBV5xWqrommYHYiihuIyavq03pWSGqlyvYUFmfoMKd+1rPA/g==}
engines: {node: '>=12.0.0'}
requiresBuild: true
dependencies:
'@protobufjs/aspromise': 1.1.2
'@protobufjs/base64': 1.1.2
'@protobufjs/codegen': 2.0.4
'@protobufjs/eventemitter': 1.1.0
'@protobufjs/fetch': 1.1.0
'@protobufjs/float': 1.0.2
'@protobufjs/inquire': 1.1.0
'@protobufjs/path': 1.1.2
'@protobufjs/pool': 1.1.0
'@protobufjs/utf8': 1.1.0
'@types/node': 20.8.5
long: 5.2.3
dev: false
/proxy-addr@2.0.7:
resolution: {integrity: sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==}
engines: {node: '>= 0.10'}
@ -13943,7 +14157,6 @@ packages:
/require-directory@2.1.1:
resolution: {integrity: sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==}
engines: {node: '>=0.10.0'}
dev: true
/require-from-string@2.0.2:
resolution: {integrity: sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw==}
@ -14331,6 +14544,12 @@ packages:
simple-concat: 1.0.1
dev: false
/simple-swizzle@0.2.2:
resolution: {integrity: sha512-JA//kQgZtbuY83m+xT+tXJkmJncGMTFT+C+g2h2R9uxkYIrE2yy9sgmcLhCnw57/WSD+Eh3J97FPEDFnbXnDUg==}
dependencies:
is-arrayish: 0.3.2
dev: false
/sisteransi@1.0.5:
resolution: {integrity: sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg==}
dev: true
@ -14473,6 +14692,10 @@ packages:
deprecated: 'Modern JS already guarantees Array#sort() is a stable sort, so this library is deprecated. See the compatibility table on MDN: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort#browser_compatibility'
dev: true
/stack-trace@0.0.10:
resolution: {integrity: sha512-KGzahc7puUKkzyMt+IqAep+TVNbKP+k2Lmwhub39m1AsTSkaDutx56aDCo+HLDzf/D26BIHTJWNiTG1KAJiQCg==}
dev: false
/stack-utils@2.0.6:
resolution: {integrity: sha512-XlkWvfIm6RmsWtNJx+uqtKLS8eqFbxUg0ZzLXqY0caEy9l7hruX8IpiDnjsLavoBgqCCR71TqWO8MaXYheJ3RQ==}
engines: {node: '>=10'}
@ -14878,6 +15101,10 @@ packages:
minimatch: 3.1.2
dev: true
/text-hex@1.0.0:
resolution: {integrity: sha512-uuVGNWzgJ4yhRaNSiubPY7OjISw4sw4E5Uv0wbjp+OzcbmVU/rsT8ujgcXJhn9ypzsgr5vlzpPqP+MBBKcGvbg==}
dev: false
/text-table@0.2.0:
resolution: {integrity: sha512-N+8UisAXDGk8PFXP4HAzVR9nbfmVJ3zYLAWiTIoqC5v5isinhr+r5uaO8+7r3BMfuNIufIsA7RdpVgacC2cSpw==}
dev: true
@ -15001,6 +15228,11 @@ packages:
deprecated: Use String.prototype.trim() instead
dev: true
/triple-beam@1.4.1:
resolution: {integrity: sha512-aZbgViZrg1QNcG+LULa7nhZpJTZSLm/mXnHXnbAbjmN5aSa0y7V+wvv6+4WaBtpISJzThKy+PIPxc1Nq1EJ9mg==}
engines: {node: '>= 14.0.0'}
dev: false
/trough@1.0.5:
resolution: {integrity: sha512-rvuRbTarPXmMb79SmzEp8aqXNKcK+y0XaB298IXueQ8I2PsrATcPBCSPyK/dDNa2iWOhKlfNnOjdAOTBU/nkFA==}
dev: true
@ -15951,6 +16183,32 @@ packages:
execa: 4.1.0
dev: true
/winston-transport@4.7.0:
resolution: {integrity: sha512-ajBj65K5I7denzer2IYW6+2bNIVqLGDHqDw3Ow8Ohh+vdW+rv4MZ6eiDvHoKhfJFZ2auyN8byXieDDJ96ViONg==}
engines: {node: '>= 12.0.0'}
dependencies:
logform: 2.6.0
readable-stream: 3.6.2
triple-beam: 1.4.1
dev: false
/winston@3.13.0:
resolution: {integrity: sha512-rwidmA1w3SE4j0E5MuIufFhyJPBDG7Nu71RkZor1p2+qHvJSZ9GYDA81AyleQcZbh/+V6HjeBdfnTZJm9rSeQQ==}
engines: {node: '>= 12.0.0'}
dependencies:
'@colors/colors': 1.6.0
'@dabh/diagnostics': 2.0.3
async: 3.2.5
is-stream: 2.0.1
logform: 2.6.0
one-time: 1.0.0
readable-stream: 3.6.2
safe-stable-stringify: 2.4.3
stack-trace: 0.0.10
triple-beam: 1.4.1
winston-transport: 4.7.0
dev: false
/wrap-ansi@6.2.0:
resolution: {integrity: sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==}
engines: {node: '>=8'}
@ -15999,7 +16257,6 @@ packages:
/y18n@5.0.8:
resolution: {integrity: sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==}
engines: {node: '>=10'}
dev: true
/yallist@3.1.1:
resolution: {integrity: sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==}
@ -16021,7 +16278,6 @@ packages:
/yargs-parser@21.1.1:
resolution: {integrity: sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==}
engines: {node: '>=12'}
dev: true
/yargs@17.7.2:
resolution: {integrity: sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==}
@ -16034,7 +16290,6 @@ packages:
string-width: 4.2.3
y18n: 5.0.8
yargs-parser: 21.1.1
dev: true
/yauzl@2.10.0:
resolution: {integrity: sha512-p4a9I6X6nu6IhoGmBqAcbJy1mlC4j27vEPZX9F4L4/vZT3Lyq1VkFHw/V/PUcB9Buo+DG3iHkT0x3Qya58zc3g==}

View File

@ -18,8 +18,14 @@ OPENAI_BASE_URL=https://api.openai.com/v1
CHAT_API_KEY=sk-xxxx
# mongo 数据库连接参数,本地开发时,mongo 可能需要增加 directConnection=true 参数,才能连接上。
MONGODB_URI=mongodb://username:password@0.0.0.0:27017/fastgpt?authSource=admin
# PG 数据库连接参数
# 向量库优先级: pg > milvus
# PG 向量库连接参数
PG_URL=postgresql://username:password@host:port/postgres
# milvus 向量库连接参数
MILVUS_ADDRESS=https://in03-78bd7f60e6e2a7c.api.gcp-us-west1.zillizcloud.com
MILVUS_TOKEN=133964348b00b4b4e4b51bef680a61350950385c8c64a3ec16b1ab92d3c67dcc4e0370fb9dd15791bcd6dadaf765e98a98735d0d
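
# 补充说明:如上方优先级注释所示,配置了 PG 时优先使用 PG,否则回退到 Milvus。
# 下面是一个示意性的选择逻辑草图(仅作说明,函数名与报错文案并非 FastGPT 源码):
# ```typescript
# // Illustrative sketch: pick the vector store from the env vars above (pg > milvus).
# type VectorStore = 'pg' | 'milvus';
#
# export function resolveVectorStore(): VectorStore {
#   if (process.env.PG_URL) return 'pg';
#   if (process.env.MILVUS_ADDRESS) return 'milvus';
#   throw new Error('No vector store configured: set PG_URL or MILVUS_ADDRESS/MILVUS_TOKEN');
# }
# ```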
# code sandbox url
SANDBOX_URL=http://localhost:3001
# 商业版地址

View File

@ -62,9 +62,11 @@ COPY --from=builder --chown=nextjs:nodejs /app/projects/app/.next/server/chunks
# copy worker
COPY --from=builder --chown=nextjs:nodejs /app/projects/app/.next/server/worker /app/projects/app/.next/server/worker
# copy tiktoken but not copy ./node_modules/tiktoken/encoders
# copy standalone packages
COPY --from=mainDeps /app/node_modules/tiktoken ./node_modules/tiktoken
RUN rm -rf ./node_modules/tiktoken/encoders
COPY --from=mainDeps /app/node_modules/@zilliz/milvus2-sdk-node ./node_modules/@zilliz/milvus2-sdk-node
# copy package.json to version file
COPY --from=builder /app/projects/app/package.json ./package.json

View File

@ -5,7 +5,12 @@
"Reset template": "Reset template",
"Reset template confirm": "Are you sure to restore the code template? Be careful to save the current code."
},
"ifelse": {
"Input value": "Input",
"Select value": "Select"
},
"response": {
"Code log": "Log",
"Custom inputs": "Custom inputs",
"Custom outputs": "Custom outputs",
"Error": "Error"

View File

@ -184,7 +184,7 @@
"not support": "您的浏览器不支持语音输入"
},
"system": {
"Commercial version function": "商业版特有功能",
"Commercial version function": "请升级商业版后使用该功能: https://fastgpt.in",
"Help Chatbot": "机器人助手",
"Use Helper": "使用帮助"
},

View File

@ -5,7 +5,12 @@
"Reset template": "还原模板",
"Reset template confirm": "确认还原代码模板?请注意保存当前代码。"
},
"ifelse": {
"Input value": "输入值",
"Select value": "选择值"
},
"response": {
"Code log": "Log日志",
"Custom inputs": "自定义输入",
"Custom outputs": "自定义输出",
"Error": "错误信息"

View File

@ -2,10 +2,12 @@
const { i18n } = require('./next-i18next.config');
const path = require('path');
const isDev = process.env.NODE_ENV === 'development';
const nextConfig = {
i18n,
output: 'standalone',
reactStrictMode: process.env.NODE_ENV === 'development' ? false : true,
reactStrictMode: isDev ? false : true,
compress: true,
webpack(config, { isServer, nextRuntime }) {
Object.assign(config.resolve.alias, {
@ -41,11 +43,9 @@ const nextConfig = {
}
if (isServer) {
config.externals.push('worker_threads');
// config.externals.push('@zilliz/milvus2-sdk-node');
if (nextRuntime === 'nodejs') {
// config.output.globalObject = 'self';
const oldEntry = config.entry;
config = {
...config,
@ -89,7 +89,12 @@ const nextConfig = {
transpilePackages: ['@fastgpt/*', 'ahooks'],
experimental: {
// 优化 Server Components 的构建和运行,避免不必要的客户端打包。
serverComponentsExternalPackages: ['mongoose', 'pg', '@node-rs/jieba'],
serverComponentsExternalPackages: [
'mongoose',
'pg',
'@node-rs/jieba',
'@zilliz/milvus2-sdk-node'
],
outputFileTracingRoot: path.join(__dirname, '../../')
}
};

View File

@ -1,6 +1,6 @@
{
"name": "app",
"version": "4.8.1",
"version": "4.8.3",
"private": false,
"scripts": {
"dev": "next dev",

View File

@ -51,16 +51,8 @@ const ChatInput = ({
name: 'files'
});
const {
shareId,
outLinkUid,
teamId,
teamToken,
isChatting,
whisperConfig,
autoTTSResponse,
chatInputGuide
} = useContextSelector(ChatBoxContext, (v) => v);
const { isChatting, whisperConfig, autoTTSResponse, chatInputGuide, outLinkAuthData } =
useContextSelector(ChatBoxContext, (v) => v);
const { isPc, whisperModel } = useSystemStore();
const canvasRef = useRef<HTMLCanvasElement>(null);
const { t } = useTranslation();
@ -87,10 +79,7 @@ const ChatInput = ({
maxSize: 1024 * 1024 * 16,
// 7 day expired.
expiredTime: addDays(new Date(), 7),
shareId,
outLinkUid,
teamId,
teamToken
...outLinkAuthData
});
updateFile(fileIndex, {
...file,
@ -175,7 +164,7 @@ const ChatInput = ({
speakingTimeString,
renderAudioGraph,
stream
} = useSpeech({ appId, shareId, outLinkUid, teamId, teamToken });
} = useSpeech({ appId, ...outLinkAuthData });
useEffect(() => {
if (!stream) {
return;

View File

@ -24,6 +24,7 @@ export default function InputGuideBox({
const { t } = useTranslation();
const { chatT } = useI18n();
const chatInputGuide = useContextSelector(ChatBoxContext, (v) => v.chatInputGuide);
const outLinkAuthData = useContextSelector(ChatBoxContext, (v) => v.outLinkAuthData);
const { data = [] } = useRequest2(
async () => {
@ -31,7 +32,8 @@ export default function InputGuideBox({
return await queryChatInputGuideList(
{
appId,
searchKey: text.slice(0, 50)
searchKey: text.slice(0, 50),
...outLinkAuthData
},
chatInputGuide.customUrl ? chatInputGuide.customUrl : undefined
);

View File

@ -45,6 +45,7 @@ type useChatStoreType = OutLinkChatAuthProps & {
setChatHistories: React.Dispatch<React.SetStateAction<ChatSiteItemType[]>>;
isChatting: boolean;
chatInputGuide: ChatInputGuideConfigType;
outLinkAuthData: OutLinkChatAuthProps;
};
export const ChatBoxContext = createContext<useChatStoreType>({
welcomeText: '',
@ -98,7 +99,8 @@ export const ChatBoxContext = createContext<useChatStoreType>({
chatInputGuide: {
open: false,
customUrl: ''
}
},
outLinkAuthData: {}
});
export type ChatProviderProps = OutLinkChatAuthProps & {
@ -128,6 +130,16 @@ const Provider = ({
chatInputGuide = defaultChatInputGuideConfig
} = useMemo(() => chatConfig, [chatConfig]);
const outLinkAuthData = useMemo(
() => ({
shareId,
outLinkUid,
teamId,
teamToken
}),
[shareId, outLinkUid, teamId, teamToken]
);
// segment audio
const [audioPlayingChatId, setAudioPlayingChatId] = useState<string>();
const {
@ -141,10 +153,7 @@ const Provider = ({
splitText2Audio
} = useAudioPlay({
ttsConfig,
shareId,
outLinkUid,
teamId,
teamToken
...outLinkAuthData
});
const autoTTSResponse =
@ -181,7 +190,8 @@ const Provider = ({
chatHistories,
setChatHistories,
isChatting,
chatInputGuide
chatInputGuide,
outLinkAuthData
};
return <ChatBoxContext.Provider value={value}>{children}</ChatBoxContext.Provider>;
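
With `outLinkAuthData` exposed on the context, descendants no longer pick the four share-link auth fields one by one. A hedged consumer sketch (the import path is illustrative; the selector itself mirrors the InputGuideBox change in this commit):

```tsx
import { useContextSelector } from 'use-context-selector';
import { ChatBoxContext } from './Provider'; // illustrative path

const ExampleConsumer = ({ appId }: { appId: string }) => {
  // One selector yields { shareId, outLinkUid, teamId, teamToken }, ready to spread into API calls.
  const outLinkAuthData = useContextSelector(ChatBoxContext, (v) => v.outLinkAuthData);
  // e.g. somePostApi({ appId, ...outLinkAuthData })  // somePostApi is a placeholder
  return null;
};
```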

View File

@ -321,6 +321,7 @@ export const ResponseBox = React.memo(function ResponseBox({
{/* code */}
<Row label={workflowT('response.Custom inputs')} value={activeModule?.customInputs} />
<Row label={workflowT('response.Custom outputs')} value={activeModule?.customOutputs} />
<Row label={workflowT('response.Code log')} value={activeModule?.codeLog} />
</Box>
</>
);

View File

@ -13,19 +13,10 @@ import Auth from './auth';
const Navbar = dynamic(() => import('./navbar'));
const NavbarPhone = dynamic(() => import('./navbarPhone'));
const UpdateInviteModal = dynamic(
() => import('@/components/support/user/team/UpdateInviteModal'),
{ ssr: false }
);
const NotSufficientModal = dynamic(() => import('@/components/support/wallet/NotSufficientModal'), {
ssr: false
});
const SystemMsgModal = dynamic(() => import('@/components/support/user/inform/SystemMsgModal'), {
ssr: false
});
const ImportantInform = dynamic(() => import('@/components/support/user/inform/ImportantInform'), {
ssr: false
});
const UpdateInviteModal = dynamic(() => import('@/components/support/user/team/UpdateInviteModal'));
const NotSufficientModal = dynamic(() => import('@/components/support/wallet/NotSufficientModal'));
const SystemMsgModal = dynamic(() => import('@/components/support/user/inform/SystemMsgModal'));
const ImportantInform = dynamic(() => import('@/components/support/user/inform/ImportantInform'));
const pcUnShowLayoutRoute: Record<string, boolean> = {
'/': true,
@ -126,7 +117,7 @@ const Layout = ({ children }: { children: JSX.Element }) => {
{feConfigs?.isPlus && (
<>
{!!userInfo && <UpdateInviteModal />}
{isNotSufficientModal && !isHideNavbar && <NotSufficientModal />}
{isNotSufficientModal && <NotSufficientModal />}
{!!userInfo && <SystemMsgModal />}
{!!userInfo && importantInforms.length > 0 && (
<ImportantInform informs={importantInforms} refetch={refetchUnRead} />

View File

@ -5,6 +5,10 @@ const NextHead = ({ title, icon, desc }: { title?: string; icon?: string; desc?:
return (
<Head>
<title>{title}</title>
<meta
name="viewport"
content="width=device-width,initial-scale=1.0,maximum-scale=1.0,minimum-scale=1.0,user-scalable=no, viewport-fit=cover"
/>
{desc && <meta name="description" content={desc} />}
{icon && <link rel="icon" href={icon} />}
</Head>

View File

@ -145,10 +145,27 @@ export const useDebug = () => {
node.nodeId === runtimeNode.nodeId
? {
...runtimeNode,
inputs: runtimeNode.inputs.map((input) => ({
...input,
value: data[input.key] ?? input.value
}))
inputs: runtimeNode.inputs.map((input) => {
let parseValue = (() => {
try {
if (
input.valueType === WorkflowIOValueTypeEnum.string ||
input.valueType === WorkflowIOValueTypeEnum.number ||
input.valueType === WorkflowIOValueTypeEnum.boolean
)
return data[input.key];
return JSON.parse(data[input.key]);
} catch (e) {
return data[input.key];
}
})();
return {
...input,
value: parseValue ?? input.value
};
})
}
: node
),
@ -168,7 +185,7 @@ export const useDebug = () => {
<Box flex={'1 0 0'} overflow={'auto'} px={6}>
{renderInputs.map((input) => {
const required = input.required || false;
console.log(input.valueType);
const RenderInput = (() => {
if (input.valueType === WorkflowIOValueTypeEnum.string) {
return (
@ -206,19 +223,23 @@ export const useDebug = () => {
</Box>
);
}
if (typeof input.value === 'string') {
return (
<JsonEditor
bg={'myGray.50'}
placeholder={t(input.placeholder || '')}
resize
value={getValues(input.key)}
onChange={(e) => {
setValue(input.key, e);
}}
/>
);
let value = getValues(input.key) || '';
if (typeof value !== 'string') {
value = JSON.stringify(value, null, 2);
}
return (
<JsonEditor
bg={'myGray.50'}
placeholder={t(input.placeholder || '')}
resize
value={value}
onChange={(e) => {
setValue(input.key, e);
}}
/>
);
})();
return !!RenderInput ? (

View File

@ -31,6 +31,7 @@ import { Position, useReactFlow } from 'reactflow';
import { getRefData } from '@/web/core/workflow/utils';
import DragIcon from '@fastgpt/web/components/common/DndDrag/DragIcon';
import { AppContext } from '@/web/core/app/context/appContext';
import { useI18n } from '@/web/context/I18n';
const ListItem = ({
provided,
@ -415,6 +416,7 @@ const ConditionValueInput = ({
condition?: VariableConditionEnum;
onChange: (e: string) => void;
}) => {
const { workflowT } = useI18n();
const nodeList = useContextSelector(WorkflowContext, (v) => v.nodeList);
// get value type
@ -439,7 +441,7 @@ const ConditionValueInput = ({
]}
onchange={onChange}
value={value}
placeholder={'选择值'}
placeholder={workflowT('ifelse.Select value')}
isDisabled={
condition === VariableConditionEnum.isEmpty ||
condition === VariableConditionEnum.isNotEmpty
@ -450,7 +452,11 @@ const ConditionValueInput = ({
return (
<MyInput
value={value}
placeholder={'输入值'}
placeholder={
condition === VariableConditionEnum.reg
? '/^((\\+|00)86)?1[3-9]\\d{9}$/'
: workflowT('ifelse.Input value')
}
w={'100%'}
bg={'white'}
isDisabled={
@ -461,7 +467,7 @@ const ConditionValueInput = ({
/>
);
}
}, [condition, onChange, value, valueType]);
}, [condition, onChange, value, valueType, workflowT]);
return Render;
};
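
The regex branch above only collects a `/pattern/` string as the condition value; the evaluation side is outside this diff. A hedged sketch of how such a string could be parsed and tested at runtime (an assumption for illustration, not FastGPT's actual condition handler):

```typescript
// Illustrative only: evaluate a user-supplied "/pattern/flags" string against an input value.
export function testRegCondition(input: string, pattern: string): boolean {
  const match = pattern.match(/^\/(.+)\/([a-z]*)$/);
  try {
    const reg = match ? new RegExp(match[1], match[2]) : new RegExp(pattern);
    return reg.test(input);
  } catch {
    return false; // invalid user pattern: treat as no match
  }
}
```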

View File

@ -42,7 +42,7 @@ const Reference = ({ item, nodeId }: RenderInputProps) => {
const nodeList = useContextSelector(WorkflowContext, (v) => v.nodeList);
const onSelect = useCallback(
(e: any) => {
(e: ReferenceValueProps) => {
const workflowStartNode = nodeList.find(
(node) => node.flowNodeType === FlowNodeTypeEnum.workflowStart
);

View File

@ -16,7 +16,7 @@ import { WorkflowContext } from '@/components/core/workflow/context';
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
const RenderList: {
types: `${FlowNodeOutputTypeEnum}`[];
types: FlowNodeOutputTypeEnum[];
Component: React.ComponentType<RenderOutputProps>;
}[] = [];

View File

@ -1,12 +1,8 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@fastgpt/service/common/response';
import { connectToDatabase } from '@/service/mongo';
import { authCert } from '@fastgpt/service/support/permission/auth/common';
import { PgClient } from '@fastgpt/service/common/vectorStore/pg';
import { NextAPI } from '@/service/middleware/entry';
import { PgDatasetTableName } from '@fastgpt/global/common/vectorStore/constants';
import { connectionMongo } from '@fastgpt/service/common/mongo';
import { addLog } from '@fastgpt/service/common/system/log';
/* pg 中的数据搬到 mongo dataset.datas 中,并做映射 */
async function handler(req: NextApiRequest, res: NextApiResponse) {

View File

@ -1,41 +0,0 @@
import { authUserNotVisitor } from '@fastgpt/service/support/permission/auth/user';
import { NextApiRequest, NextApiResponse } from 'next';
import { MongoChatInputGuide } from '@fastgpt/service/core/chat/inputGuide/schema';
import axios from 'axios';
import { NextAPI } from '@/service/middleware/entry';
async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
const { textList = [], appId, customUrl } = req.body;
if (!customUrl) {
const { teamId } = await authUserNotVisitor({ req, authToken: true });
const currentQGuide = await MongoChatInputGuide.find({ appId, teamId });
const currentTexts = currentQGuide.map((item) => item.text);
const textsToDelete = currentTexts.filter((text) => !textList.includes(text));
await MongoChatInputGuide.deleteMany({ text: { $in: textsToDelete }, appId, teamId });
const newTexts = textList.filter((text: string) => !currentTexts.includes(text));
const newDocuments = newTexts.map((text: string) => ({
text: text,
appId: appId,
teamId: teamId
}));
await MongoChatInputGuide.insertMany(newDocuments);
} else {
try {
const response = await axios.post(customUrl, {
textList,
appId
});
res.status(200).json(response.data);
} catch (error) {
res.status(500).json({ error });
}
}
}
export default NextAPI(handler);

View File

@ -2,21 +2,29 @@ import type { NextApiResponse } from 'next';
import { MongoChatInputGuide } from '@fastgpt/service/core/chat/inputGuide/schema';
import { NextAPI } from '@/service/middleware/entry';
import { ApiRequestProps } from '@fastgpt/service/type/next';
import { authApp } from '@fastgpt/service/support/permission/auth/app';
import { OutLinkChatAuthProps } from '@fastgpt/global/support/permission/chat';
import { authChatCert } from '@/service/support/permission/auth/chat';
import { MongoApp } from '@fastgpt/service/core/app/schema';
import { AppErrEnum } from '@fastgpt/global/common/error/code/app';
export type QueryChatInputGuideProps = {
export type QueryChatInputGuideBody = OutLinkChatAuthProps & {
appId: string;
searchKey: string;
};
export type QueryChatInputGuideResponse = string[];
async function handler(
req: ApiRequestProps<{}, QueryChatInputGuideProps>,
req: ApiRequestProps<QueryChatInputGuideBody>,
res: NextApiResponse<any>
): Promise<QueryChatInputGuideResponse> {
const { appId, searchKey } = req.query;
const { appId, searchKey } = req.body;
await authApp({ req, appId, authToken: true, authApiKey: true, per: 'r' });
// tmp auth
const { teamId } = await authChatCert({ req, authToken: true });
const app = await MongoApp.findOne({ _id: appId, teamId });
if (!app) {
return Promise.reject(AppErrEnum.unAuthApp);
}
const params = {
appId,

View File

@ -31,26 +31,17 @@ import { useI18n } from '@/web/context/I18n';
import { useContextSelector } from 'use-context-selector';
import { AppContext } from '@/web/core/app/context/appContext';
const DatasetSelectModal = dynamic(() => import('@/components/core/app/DatasetSelectModal'), {
ssr: false
});
const DatasetParamsModal = dynamic(() => import('@/components/core/app/DatasetParamsModal'), {
ssr: false
});
const ToolSelectModal = dynamic(() => import('./ToolSelectModal'), { ssr: false });
const TTSSelect = dynamic(() => import('@/components/core/app/TTSSelect'), { ssr: false });
const QGSwitch = dynamic(() => import('@/components/core/app/QGSwitch'), { ssr: false });
const WhisperConfig = dynamic(() => import('@/components/core/app/WhisperConfig'), { ssr: false });
const InputGuideConfig = dynamic(() => import('@/components/core/app/InputGuideConfig'), {
ssr: false
});
const DatasetSelectModal = dynamic(() => import('@/components/core/app/DatasetSelectModal'));
const DatasetParamsModal = dynamic(() => import('@/components/core/app/DatasetParamsModal'));
const ToolSelectModal = dynamic(() => import('./ToolSelectModal'));
const TTSSelect = dynamic(() => import('@/components/core/app/TTSSelect'));
const QGSwitch = dynamic(() => import('@/components/core/app/QGSwitch'));
const WhisperConfig = dynamic(() => import('@/components/core/app/WhisperConfig'));
const InputGuideConfig = dynamic(() => import('@/components/core/app/InputGuideConfig'));
const ScheduledTriggerConfig = dynamic(
() => import('@/components/core/app/ScheduledTriggerConfig'),
{ ssr: false }
() => import('@/components/core/app/ScheduledTriggerConfig')
);
const WelcomeTextConfig = dynamic(() => import('@/components/core/app/WelcomeTextConfig'), {
ssr: false
});
const WelcomeTextConfig = dynamic(() => import('@/components/core/app/WelcomeTextConfig'));
const BoxStyles: BoxProps = {
px: 5,

View File

@ -15,16 +15,16 @@ const ChatHeader = ({
appName,
appAvatar,
chatModels,
appId,
showHistory,
onRoute2AppDetail,
onOpenSlider
}: {
history: ChatItemType[];
appName: string;
appAvatar: string;
chatModels?: string[];
appId?: string;
showHistory?: boolean;
onRoute2AppDetail?: () => void;
onOpenSlider: () => void;
}) => {
const router = useRouter();
@ -80,13 +80,7 @@ const ChatHeader = ({
<Flex px={3} alignItems={'center'} flex={'1 0 0'} w={0} justifyContent={'center'}>
<Avatar src={appAvatar} w={'16px'} />
<Box
ml={1}
className="textEllipsis"
onClick={() => {
appId && router.push(`/app/detail?appId=${appId}`);
}}
>
<Box ml={1} className="textEllipsis" onClick={onRoute2AppDetail}>
{appName}
</Box>
</Flex>

View File

@ -335,10 +335,10 @@ const Chat = ({ appId, chatId }: { appId: string; chatId: string }) => {
<ChatHeader
appAvatar={chatData.app.avatar}
appName={chatData.app.name}
appId={appId}
history={chatData.history}
chatModels={chatData.app.chatModels}
onOpenSlider={onOpenSlider}
onRoute2AppDetail={() => router.push(`/app/detail?appId=${appId}`)}
showHistory
/>

View File

@ -378,7 +378,6 @@ const OutLink = ({
history={chatData.history}
showHistory={showHistory === '1'}
onOpenSlider={onOpenSlider}
appId={chatData.appId}
/>
{/* chat box */}
<Box flex={1}>

View File

@ -1,4 +1,4 @@
import React, { useMemo, useRef, useState } from 'react';
import React, { useCallback, useMemo, useRef, useState } from 'react';
import {
Box,
Flex,
@ -17,7 +17,7 @@ import {
import MyIcon from '@fastgpt/web/components/common/Icon';
import { useTranslation } from 'next-i18next';
import LeftRadio from '@fastgpt/web/components/common/Radio/LeftRadio';
import { TrainingTypeMap } from '@fastgpt/global/core/dataset/constants';
import { TrainingModeEnum, TrainingTypeMap } from '@fastgpt/global/core/dataset/constants';
import { ImportProcessWayEnum } from '@/web/core/dataset/constants';
import MyTooltip from '@/components/MyTooltip';
import { useSystemStore } from '@/web/common/system/useSystemStore';
@ -27,6 +27,7 @@ import Preview from '../components/Preview';
import Tag from '@fastgpt/web/components/common/Tag/index';
import { useContextSelector } from 'use-context-selector';
import { DatasetImportContext } from '../Context';
import { useToast } from '@fastgpt/web/hooks/useToast';
function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean }) {
const { t } = useTranslation();
@ -42,8 +43,10 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
maxChunkSize,
priceTip
} = useContextSelector(DatasetImportContext, (v) => v);
const { getValues, setValue, register } = processParamsForm;
const [refresh, setRefresh] = useState(false);
const { getValues, setValue, register, watch } = processParamsForm;
const { toast } = useToast();
const mode = watch('mode');
const way = watch('way');
const {
isOpen: isOpenCustomPrompt,
@ -53,12 +56,21 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
const trainingModeList = useMemo(() => {
const list = Object.entries(TrainingTypeMap);
return list;
}, []);
return list.filter(([key, value]) => {
if (feConfigs?.isPlus) return true;
return value.openSource;
});
}, [feConfigs?.isPlus]);
const onSelectTrainWay = useCallback(
(e: TrainingModeEnum) => {
if (!feConfigs?.isPlus && !TrainingTypeMap[e]?.openSource) {
return toast({
status: 'warning',
title: t('common.system.Commercial version function')
});
}
setValue('mode', e);
},
[feConfigs?.isPlus, setValue, t, toast]
);
return (
<Box h={'100%'} display={['block', 'flex']} gap={5}>
@ -80,11 +92,8 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
}))}
px={3}
py={2}
value={getValues('mode')}
onChange={(e) => {
setValue('mode', e);
setRefresh(!refresh);
}}
value={mode}
onChange={onSelectTrainWay}
gridTemplateColumns={'repeat(3,1fr)'}
defaultBg="white"
activeBg="white"
@ -105,7 +114,7 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
title: t('core.dataset.import.Custom process'),
desc: t('core.dataset.import.Custom process desc'),
value: ImportProcessWayEnum.custom,
children: getValues('way') === ImportProcessWayEnum.custom && (
children: way === ImportProcessWayEnum.custom && (
<Box mt={5}>
{showChunkInput && chunkSizeField && (
<Box>
@ -250,11 +259,10 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
py={3}
defaultBg="white"
activeBg="white"
value={getValues('way')}
value={way}
w={'100%'}
onChange={(e) => {
setValue('way', e);
setRefresh(!refresh);
}}
></LeftRadio>
</Flex>
@ -286,7 +294,6 @@ function DataProcess({ showPreviewChunks = true }: { showPreviewChunks: boolean
defaultValue={getValues('qaPrompt')}
onChange={(e) => {
setValue('qaPrompt', e);
setRefresh(!refresh);
}}
onClose={onCloseCustomPrompt}
/>

View File

@ -18,21 +18,19 @@ import MyRadio from '@/components/common/MyRadio';
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
import { MongoImageTypeEnum } from '@fastgpt/global/common/file/image/constants';
import { QuestionOutlineIcon } from '@chakra-ui/icons';
import MySelect from '@fastgpt/web/components/common/MySelect';
import AIModelSelector from '@/components/Select/AIModelSelector';
import { useI18n } from '@/web/context/I18n';
const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: string }) => {
const { t } = useTranslation();
const { datasetT } = useI18n();
const [refresh, setRefresh] = useState(false);
const { toast } = useToast();
const router = useRouter();
const { isPc, feConfigs, vectorModelList, datasetModelList } = useSystemStore();
const filterNotHiddenVectorModelList = vectorModelList.filter((item) => !item.hidden);
const { register, setValue, getValues, handleSubmit } = useForm<CreateDatasetParams>({
const { register, setValue, handleSubmit, watch } = useForm<CreateDatasetParams>({
defaultValues: {
parentId,
type: DatasetTypeEnum.dataset,
@ -43,6 +41,10 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
agentModel: datasetModelList[0].model
}
});
const avatar = watch('avatar');
const datasetType = watch('type');
const vectorModel = watch('vectorModel');
const agentModel = watch('agentModel');
const { File, onOpen: onOpenSelectFile } = useSelectFile({
fileType: '.jpg,.png',
@ -61,7 +63,6 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
maxH: 300
});
setValue('avatar', src);
setRefresh((state) => !state);
} catch (err: any) {
toast({
title: getErrText(err, t('common.avatar.Select Failed')),
@ -85,6 +86,22 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
}
});
const onSelectDatasetType = useCallback(
(e: DatasetTypeEnum) => {
if (
!feConfigs?.isPlus &&
(e === DatasetTypeEnum.websiteDataset || e === DatasetTypeEnum.externalFile)
) {
return toast({
status: 'warning',
title: t('common.system.Commercial version function')
});
}
setValue('type', e);
},
[feConfigs?.isPlus, setValue, t, toast]
);
return (
<MyModal
iconSrc="/imgs/workflow/db.png"
@ -109,28 +126,21 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
icon: 'core/dataset/commonDataset',
desc: datasetT('Common Dataset Desc')
},
...(feConfigs.isPlus
? [
{
title: datasetT('Website Dataset'),
value: DatasetTypeEnum.websiteDataset,
icon: 'core/dataset/websiteDataset',
desc: datasetT('Website Dataset Desc')
},
{
title: datasetT('External File'),
value: DatasetTypeEnum.externalFile,
icon: 'core/dataset/externalDataset',
desc: datasetT('External file Dataset Desc')
}
]
: [])
{
title: datasetT('Website Dataset'),
value: DatasetTypeEnum.websiteDataset,
icon: 'core/dataset/websiteDataset',
desc: datasetT('Website Dataset Desc')
},
{
title: datasetT('External File'),
value: DatasetTypeEnum.externalFile,
icon: 'core/dataset/externalDataset',
desc: datasetT('External file Dataset Desc')
}
]}
value={getValues('type')}
onChange={(e) => {
setValue('type', e as DatasetTypeEnum);
setRefresh(!refresh);
}}
value={datasetType}
onChange={onSelectDatasetType}
/>
</>
<Box mt={5}>
@ -141,7 +151,7 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
<MyTooltip label={t('common.avatar.Select Avatar')}>
<Avatar
flexShrink={0}
src={getValues('avatar')}
src={avatar}
w={['28px', '32px']}
h={['28px', '32px']}
cursor={'pointer'}
@ -173,14 +183,13 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
<Box flex={1}>
<AIModelSelector
w={'100%'}
value={getValues('vectorModel')}
value={vectorModel}
list={filterNotHiddenVectorModelList.map((item) => ({
label: item.name,
value: item.model
}))}
onchange={(e) => {
setValue('vectorModel', e);
setRefresh((state) => !state);
}}
/>
</Box>
@ -192,14 +201,13 @@ const CreateModal = ({ onClose, parentId }: { onClose: () => void; parentId?: st
<Box flex={1}>
<AIModelSelector
w={'100%'}
value={getValues('agentModel')}
value={agentModel}
list={datasetModelList.map((item) => ({
label: item.name,
value: item.model
}))}
onchange={(e) => {
setValue('agentModel', e);
setRefresh((state) => !state);
}}
/>
</Box>

View File

@ -1,6 +1,5 @@
import React, { useCallback, useEffect } from 'react';
import { useRouter } from 'next/router';
import { useSystemStore } from '@/web/common/system/useSystemStore';
import type { ResLogin } from '@/global/support/api/userRes.d';
import { useChatStore } from '@/web/core/chat/storeChat';
import { useUserStore } from '@/web/support/user/useUserStore';
@ -9,7 +8,6 @@ import { postFastLogin } from '@/web/support/user/api';
import { useToast } from '@fastgpt/web/hooks/useToast';
import Loading from '@fastgpt/web/components/common/MyLoading';
import { serviceSideProps } from '@/web/common/utils/i18n';
import { useQuery } from '@tanstack/react-query';
import { getErrText } from '@fastgpt/global/common/error/utils';
const FastLogin = ({

View File

@ -3,6 +3,7 @@ import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { WorkflowIOValueTypeEnum } from '@fastgpt/global/core/workflow/constants';
import {
FlowNodeInputTypeEnum,
FlowNodeOutputTypeEnum,
FlowNodeTypeEnum
} from '@fastgpt/global/core/workflow/node/constant';
@ -109,7 +110,7 @@ export const appTemplates: (AppItemType & {
key: 'userChatInput',
label: 'core.module.input.label.user question',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -220,7 +221,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -228,7 +229,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
}
@ -356,7 +357,7 @@ export const appTemplates: (AppItemType & {
key: 'userChatInput',
label: 'core.module.input.label.user question',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -467,7 +468,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -475,7 +476,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
}
@ -586,7 +587,7 @@ export const appTemplates: (AppItemType & {
key: 'userChatInput',
label: 'core.module.input.label.user question',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -698,7 +699,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -706,7 +707,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -795,7 +796,7 @@ export const appTemplates: (AppItemType & {
id: 'quoteQA',
key: 'quoteQA',
label: 'core.module.Dataset quote.label',
type: 'static',
type: FlowNodeOutputTypeEnum.static,
valueType: WorkflowIOValueTypeEnum.datasetQuote,
description: '特殊数组格式,搜索结果为空时,返回空数组。'
}
@ -914,7 +915,7 @@ export const appTemplates: (AppItemType & {
key: 'userChatInput',
label: 'core.module.input.label.user question',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -1026,7 +1027,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -1034,7 +1035,7 @@ export const appTemplates: (AppItemType & {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -1116,7 +1117,7 @@ export const appTemplates: (AppItemType & {
key: 'cqResult',
label: '分类结果',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -1232,7 +1233,7 @@ export const appTemplates: (AppItemType & {
key: 'quoteQA',
label: 'core.module.Dataset quote.label',
description: '特殊数组格式,搜索结果为空时,返回空数组。',
type: 'static',
type: FlowNodeOutputTypeEnum.static,
valueType: WorkflowIOValueTypeEnum.datasetQuote
}
]

View File

@ -7,6 +7,7 @@ import {
import { StoreNodeItemType } from '@fastgpt/global/core/workflow/type/index.d';
import {
FlowNodeInputTypeEnum,
FlowNodeOutputTypeEnum,
FlowNodeTypeEnum
} from '@fastgpt/global/core/workflow/node/constant';
import { NodeInputKeyEnum, WorkflowIOValueTypeEnum } from '@fastgpt/global/core/workflow/constants';
@ -65,7 +66,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
key: 'userChatInput',
label: 'core.module.input.label.user question',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
};
@ -181,7 +182,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -189,7 +190,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
}
@ -315,7 +316,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
label: 'core.module.output.label.New context',
description: 'core.module.output.description.New context',
valueType: WorkflowIOValueTypeEnum.chatHistory,
type: 'static'
type: FlowNodeOutputTypeEnum.static
},
{
id: 'answerText',
@ -323,7 +324,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
label: 'core.module.output.label.Ai response content',
description: 'core.module.output.description.Ai response content',
valueType: WorkflowIOValueTypeEnum.string,
type: 'static'
type: FlowNodeOutputTypeEnum.static
}
]
},
@ -416,7 +417,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
id: 'quoteQA',
key: 'quoteQA',
label: 'core.module.Dataset quote.label',
type: 'static',
type: FlowNodeOutputTypeEnum.static,
valueType: WorkflowIOValueTypeEnum.datasetQuote
}
]
@ -537,7 +538,7 @@ export function form2AppWorkflow(data: AppSimpleEditFormType): WorkflowType {
id: 'quoteQA',
key: 'quoteQA',
label: 'core.module.Dataset quote.label',
type: 'static',
type: FlowNodeOutputTypeEnum.static,
valueType: WorkflowIOValueTypeEnum.datasetQuote
}
]

View File

@ -14,7 +14,7 @@ import type {
import type { updateInputGuideBody } from '@/pages/api/core/chat/inputGuide/update';
import type { deleteChatInputGuideQuery } from '@/pages/api/core/chat/inputGuide/delete';
import type {
QueryChatInputGuideProps,
QueryChatInputGuideBody,
QueryChatInputGuideResponse
} from '@/pages/api/core/chat/inputGuide/query';
@ -26,10 +26,13 @@ export const getCountChatInputGuideTotal = (data: countChatInputGuideTotalQuery)
export const getChatInputGuideList = (data: ChatInputGuideProps) =>
GET<ChatInputGuideResponse>(`/core/chat/inputGuide/list`, data);
export const queryChatInputGuideList = (data: QueryChatInputGuideProps, url?: string) => {
return GET<QueryChatInputGuideResponse>(url ?? `/core/chat/inputGuide/query`, data, {
withCredentials: !url
});
export const queryChatInputGuideList = (data: QueryChatInputGuideBody, url?: string) => {
if (url) {
return GET<QueryChatInputGuideResponse>(url, data, {
withCredentials: !url
});
}
return POST<QueryChatInputGuideResponse>(`/core/chat/inputGuide/query`, data);
};
export const postChatInputGuides = (data: createInputGuideBody) =>

View File

@ -167,7 +167,7 @@ type V1WorkflowType = {
};
defaultEditField?: {
inputType?: InputTypeEnum; // input type
outputType?: `${FlowNodeOutputTypeEnum}`;
outputType?: FlowNodeOutputTypeEnum;
required?: boolean;
key?: string;
label?: string;
@ -219,7 +219,7 @@ type V1WorkflowType = {
};
defaultEditField?: {
inputType?: `${FlowNodeInputTypeEnum}`; // input type
outputType?: `${FlowNodeOutputTypeEnum}`;
outputType?: FlowNodeOutputTypeEnum;
required?: boolean;
key?: string;
label?: string;

View File

@ -2,7 +2,13 @@ import { GET, POST, PUT, DELETE } from '@/web/common/api/request';
import { PostWorkflowDebugProps, PostWorkflowDebugResponse } from '@/global/core/workflow/api';
export const postWorkflowDebug = (data: PostWorkflowDebugProps) =>
POST<PostWorkflowDebugResponse>('/core/workflow/debug', {
...data,
mode: 'debug'
});
POST<PostWorkflowDebugResponse>(
'/core/workflow/debug',
{
...data,
mode: 'debug'
},
{
timeout: 300000
}
);

View File

@ -1,6 +1,6 @@
{
"name": "sandbox",
"version": "0.0.1",
"version": "4.8.3",
"description": "",
"author": "",
"private": true,

View File

@ -12,7 +12,7 @@ export class HttpExceptionFilter implements ExceptionFilter {
response.status(500).send({
success: false,
time: new Date(),
msg: getErrText(error)
message: getErrText(error)
});
}
}

View File

@ -2,3 +2,8 @@ export class RunCodeDto {
code: string;
variables: object;
}
export class RunCodeResponse {
codeReturn: Record<string, any>;
log: string;
}

View File

@ -1,6 +1,6 @@
import { Controller, Post, Body, HttpCode } from '@nestjs/common';
import { SandboxService } from './sandbox.service';
import { RunCodeDto } from './dto/create-sandbox.dto';
import { RunCodeDto, RunCodeResponse } from './dto/create-sandbox.dto';
import { WorkerNameEnum, runWorker } from 'src/worker/utils';
@Controller('sandbox')
@ -10,6 +10,6 @@ export class SandboxController {
@Post('/js')
@HttpCode(200)
runJs(@Body() codeProps: RunCodeDto) {
return runWorker(WorkerNameEnum.runJs, codeProps);
return runWorker<RunCodeResponse>(WorkerNameEnum.runJs, codeProps);
}
}
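
With the typed worker result, `POST /sandbox/js` returns both the user code's return object and the captured log. A minimal client sketch (base URL from the SANDBOX_URL example above, payload shape from RunCodeDto; the exact response envelope may differ if a global Nest interceptor wraps it):

```typescript
// Assumes SANDBOX_URL points at the sandbox service, e.g. http://localhost:3001.
async function runJsInSandbox(code: string, variables: object) {
  const res = await fetch(`${process.env.SANDBOX_URL}/sandbox/js`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ code, variables })
  });
  return (await res.json()) as { codeReturn: Record<string, any>; log: string };
}
```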

View File

@ -1,4 +1,4 @@
import { RunCodeDto } from 'src/sandbox/dto/create-sandbox.dto';
import { RunCodeDto, RunCodeResponse } from 'src/sandbox/dto/create-sandbox.dto';
import { parentPort } from 'worker_threads';
import { workerResponse } from './utils';
@ -6,19 +6,33 @@ import { workerResponse } from './utils';
const ivm = require('isolated-vm');
parentPort?.on('message', ({ code, variables = {} }: RunCodeDto) => {
const resolve = (data: any) => workerResponse({ parentPort, type: 'success', data });
const resolve = (data: RunCodeResponse) => workerResponse({ parentPort, type: 'success', data });
const reject = (error: any) => workerResponse({ parentPort, type: 'error', data: error });
const isolate = new ivm.Isolate({ memoryLimit: 32 });
const context = isolate.createContextSync();
const jail = context.global;
// custom log function
// custom function
const logData = [];
const CustomLogStr = 'CUSTOM_LOG';
code = code.replace(/console\.log/g, `${CustomLogStr}`);
jail.setSync(CustomLogStr, function (...args) {
logData.push(
args
.map((item) => (typeof item === 'object' ? JSON.stringify(item, null, 2) : item))
.join(', ')
);
});
jail.setSync('responseData', function (args: any): any {
if (typeof args === 'object') {
resolve(args);
resolve({
codeReturn: args,
log: logData.join('\n')
});
} else {
reject('Not an invalid response');
reject('Invalid response: must return an object');
}
});
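
The log capture works by rewriting every `console.log` in the submitted code to the injected `CUSTOM_LOG` function, which stringifies object arguments and buffers the lines until `responseData` resolves. A small illustrative example of submitted code and the rough shape of the result it would produce under this scheme:

```typescript
// Illustrative user code submitted to the sandbox:
const code = `
  console.log('start', { step: 1 });
  responseData({ ok: true });
`;
// After the console.log -> CUSTOM_LOG rewrite, the worker would resolve roughly:
// { codeReturn: { ok: true }, log: 'start, {\n  "step": 1\n}' }
```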