V4.9.12 feature (#5022)

* New chatinput (#4995)

* feat: Change border style

* refactor: Improve layout and styling of ChatInput component

* style: Update ChatInput component styling and mobile layout

* fix: update key detection for Enter key in ChatInput component

* feat: Add WelcomePage component with support for variable input and welcome message display

* style: Updated the PC voice input interface of the VoiceInput component and optimized the layout and style

* style: Optimize the layout and style of the WelcomePage component, and adjust the responsive design

* feat: Dynamically load the WelcomePage component and optimize the welcome information display logic

* refactor: Adjust the style and behavior of the ChatInput component and delete the WelcomePage component

* style: Modify the minimum height setting of the ChatInput component to simplify responsive design

* style: Optimize the layout and style of PC voice input components, and enhance the processing and drawing logic of waveform data

* style: Adjust ChatInput component's margin and textarea height logic for improved layout and responsiveness; refine PCVoiceInput component's positioning and display elements

* style: Enhance PCVoiceInput component's time display styling with custom font properties

* style: Add new 'xxl' size to theme spacing for improved layout options

* style: Update close icon fill color to use currentColor for better theming support

* style: Enhance voice input functionality and UI responsiveness; improve waveform sensitivity and amplitude

* style: Conditionally render file preview based on voice input state

* style: Optimize mobile audio waveform rendering for better clarity and sensitivity

* style: Update comments to English to enhance code readability and consistency

* style: Adjust the mobile audio waveform update frequency and optimize rendering performance

* style: Optimize the file preview rendering logic in voice input mode to enhance user experience

* style: Optimize the file preview rendering logic in voice input mode to enhance user experience

* style: Adjust the chat input box placeholder color and border color to enhance visual effects

* fix: pg test

* Test secret (#5011)

* add http header auth config (#4982)

* add http header auth config

* optimize code

* add mcp tools header auth

* fix build

* fix ui

* fix

* teamid

* secret value encrypt (#5002)

* perf: secret code

* header auth ui (#5012)

* header auth ui

* fix i18n

* doc

* perf: type

* header secret ui

* reset ui

* perf: check secret invalid

---------

Co-authored-by: heheer <heheer@sealos.io>

* feat: cq and extract AI memory (#5013)

* fix: login xss

* feat: Users can download the invoice by themselves (#5015)

* Users can download the invoice by themselves

* Direct file stream implementation for transmission presentation

* i18n

* Chatbox-fix (#5018)

* refactor: Refactored the ChatInput component, optimized the layout of the text area and button group, and improved the user experience

* refactor: Updated ChatInput component, optimized layout and style, and enhanced user experience

* feat: update docs

---------

Co-authored-by: archer <545436317@qq.com>
Co-authored-by: heheer <heheer@sealos.io>

* input ui

* fix: chat input ux

* Return in JSON format to handle checkres (#5019)

* Users can download the invoice by themselves

* Direct file stream implementation for transmission presentation

* Return in JSON format to handle checkres

* fix: invoice

* fix: ui

* doc

* update package

* fix: ts

* fix: login checker

* fix: team plan

* perf: aiproxy ux

---------

Co-authored-by: Theresa <63280168+sd0ric4@users.noreply.github.com>
Co-authored-by: heheer <heheer@sealos.io>
Co-authored-by: Zhuangzai fa <143257420+ctrlz526@users.noreply.github.com>
Archer 2025-06-13 00:42:09 +08:00 committed by GitHub
parent 30ca3e3d5c
commit 095b75ee27
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
111 changed files with 2516 additions and 1685 deletions


@ -23,7 +23,7 @@ services:
volumes:
- ./pg/data:/var/lib/postgresql/data
healthcheck:
test: ['CMD', 'pg_isready', '-U', 'postgres', '-d', 'postgres']
test: ['CMD', 'pg_isready', '-U', 'username', '-d', 'postgres']
interval: 5s
timeout: 5s
retries: 10


@ -26,8 +26,6 @@ data:
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {}
},
@ -47,8 +45,6 @@ data:
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {}
},
@ -68,8 +64,6 @@ data:
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {}
},
@ -89,8 +83,6 @@ data:
"usedInToolCall": false,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {}
}


@ -146,8 +146,6 @@ curl --location --request POST 'https://<oneapi_url>/v1/chat/completions' \
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {} // 请求API时挟带一些默认配置比如 GLM4 的 top_p
}
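The comments above describe a capability fallback: `toolChoice` is preferred, then `functionCall`, and finally prompt mode when neither is supported. A minimal TypeScript sketch of that selection logic (illustrative only; the type and function names are assumptions, not FastGPT's actual code):

```typescript
// Capabilities a model declares in its config entry.
type ModelCaps = { toolChoice: boolean; functionCall: boolean };

type ExtractMode = 'toolChoice' | 'functionCall' | 'prompt';

// Prefer tool choice, fall back to function call, else use the prompt-based mode.
function selectExtractMode(caps: ModelCaps): ExtractMode {
  if (caps.toolChoice) return 'toolChoice';
  if (caps.functionCall) return 'functionCall';
  return 'prompt';
}

console.log(selectExtractMode({ toolChoice: true, functionCall: false })); // "toolChoice"
console.log(selectExtractMode({ toolChoice: false, functionCall: false })); // "prompt"
```

This mirrors why setting both flags to `false` forces the built-in prompt mode mentioned in the FAQ below.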


@ -59,11 +59,10 @@ images: []
可以。需要准备好向量模型和LLM模型。
### 其他模型没法进行问题分类/内容提取
### 其他模型没法进行内容提取
看日志。如果提示 JSON invalidnot support tool 之类的,说明该模型不支持工具调用或函数调用,需要设置`toolChoice=false`和`functionCall=false`就会默认走提示词模式。目前内置提示词仅针对了商业模型API进行测试。问题分类基本可用内容提取不太行。
1. 看日志。如果提示 JSON invalidnot support tool 之类的,说明该模型不支持工具调用或函数调用,需要设置`toolChoice=false`和`functionCall=false`就会默认走提示词模式。目前内置提示词仅针对了商业模型API进行测试。问题分类基本可用内容提取不太行。
2. 如果已经配置正常,并且没有错误日志,则说明可能提示词不太适合该模型,可以通过修改`customCQPrompt`来自定义提示词。
### 页面崩溃
1. 关闭翻译


@ -111,8 +111,6 @@ weight: 744
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {}, // 请求API时挟带一些默认配置比如 GLM4 的 top_p
"fieldMap": {} // 字段映射o1 模型需要把 max_tokens 映射为 max_completion_tokens
@ -322,8 +320,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {}, // 请求API时挟带一些默认配置比如 GLM4 的 top_p
"fieldMap": {} // 字段映射o1 模型需要把 max_tokens 映射为 max_completion_tokens
@ -345,8 +341,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {}
@ -368,8 +362,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,
@ -394,8 +386,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,


@ -33,7 +33,6 @@ weight: 813
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
@ -57,7 +56,6 @@ weight: 813
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {


@ -38,7 +38,6 @@ weight: 808
"usedInQueryExtension": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {},


@ -10,7 +10,9 @@ weight: 788
## 🚀 新增内容
1. AI proxy 监控完善,支持以图表/表格形式查看模型调用和性能情况。
2. 商业版支持知识库分块时LLM 进行自动分段识别。
2. HTTP 节点和 MCP 支持单独“鉴权配置”,鉴权配置明文不会二次返回客户端,以保障数据安全。
3. 问题分类和内容提取,提示词中自动加入上一轮结果进行额外引导。
4. 商业版支持知识库分块时LLM 进行自动分段识别。
## ⚙️ 优化
@ -18,8 +20,12 @@ weight: 788
2. 后端全量计算知识库 chunk 参数,避免自动模式下部分参数未正确使用默认值。
3. 将文本分块移至 worker 线程,避免阻塞。
4. 展示更多套餐用量信息。
5. 优化输入框样式,桌面和移动端的语音输入样式更新。
## 🐛 修复
1. 自定义问答提取提示词被覆盖。
2. 模板导入时,存在空 indexes 时,导致数据插入失败。
2. 模板导入时,存在空 indexes 时,导致数据插入失败。
3. 登录页可能存在的 XSS 攻击。
4. 输入框语音输入时候会丢失文件列表的问题。
5. 知识库文档中图片 TTL 字段未清除,导致图片过期。


@ -372,113 +372,113 @@ services:
# 接入 ChatGLM2-m3e 模型
## 将 FastGPT 接入私有化模型 ChatGLM2和m3e-large
## 前言
FastGPT 默认使用了 OpenAI 的 LLM 模型和向量模型,如果想要私有化部署的话,可以使用 ChatGLM2 和 m3e-large 模型。以下是由用户@不做了睡大觉 提供的接入方法。该镜像直接集成了 M3E-Large 和 ChatGLM2-6B 模型,可以直接使用。
## 部署镜像
+ 镜像名: `stawky/chatglm2-m3e:latest`
+ 国内镜像名: `registry.cn-hangzhou.aliyuncs.com/fastgpt_docker/chatglm2-m3e:latest`
+ 端口号: 6006
```
# 设置安全凭证即oneapi中的渠道密钥
默认值sk-aaabbbcccdddeeefffggghhhiiijjjkkk
也可以通过环境变量引入sk-key。有关docker环境变量引入的方法请自寻教程此处不再赘述。
```
## 接入 [One API](/docs/development/modelconfig/one-api/)
为 chatglm2 和 m3e-large 各添加一个渠道,参数如下:
![](/imgs/model-m3e1.png)
这里我填入 m3e 作为向量模型chatglm2 作为语言模型
## 测试
curl 例子:
```bash
curl --location --request POST 'https://domain/v1/embeddings' \
--header 'Authorization: Bearer sk-aaabbbcccdddeeefffggghhhiiijjjkkk' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "m3e",
"input": ["laf是什么"]
}'
```
```bash
curl --location --request POST 'https://domain/v1/chat/completions' \
--header 'Authorization: Bearer sk-aaabbbcccdddeeefffggghhhiiijjjkkk' \
--header 'Content-Type: application/json' \
--data-raw '{
"model": "chatglm2",
"messages": [{"role": "user", "content": "Hello!"}]
}'
```
Authorization 为 sk-aaabbbcccdddeeefffggghhhiiijjjkkk。model 为刚刚在 One API 填写的自定义模型。
## 接入 FastGPT
修改 config.json 配置文件,在 llmModels 中加入 chatglm2, 在 vectorModels 中加入 M3E 模型:
```json
"llmModels": [
//其他对话模型
{
"model": "chatglm2",
"name": "chatglm2",
"maxToken": 8000,
"price": 0,
"quoteMaxToken": 4000,
"maxTemperature": 1.2,
"defaultSystemChatPrompt": ""
}
],
"vectorModels": [
{
"model": "text-embedding-ada-002",
"name": "Embedding-2",
"price": 0.2,
"defaultToken": 500,
"maxToken": 3000
},
{
"model": "m3e",
"name": "M3E测试使用",
"price": 0.1,
"defaultToken": 500,
"maxToken": 1800
}
],
```
## 测试使用
M3E 模型的使用方法如下:
1. 创建知识库时候选择 M3E 模型。
注意,一旦选择后,知识库将无法修改向量模型。
![](/imgs/model-m3e2.png)
2. 导入数据
3. 搜索测试
![](/imgs/model-m3e3.png)
4. 应用绑定知识库
注意,应用只能绑定同一个向量模型的知识库,不能跨模型绑定。并且,需要注意调整相似度,不同向量模型的相似度(距离)会有所区别,需要自行测试实验。
![](/imgs/model-m3e4.png)
chatglm2 模型的使用方法如下:
模型选择 chatglm2 即可
# 接入 ChatGLM2-6B
@ -785,180 +785,180 @@ CUSTOM_READ_FILE_EXTENSION=pdf
# 使用 Ollama 接入本地模型
## 采用 Ollama 部署自己的模型
[Ollama](https://ollama.com/)是一个开源的AI大模型部署工具专注于简化大语言模型的部署和使用支持一键下载和运行各种大模型。
## 安装 Ollama
Ollama 本身支持多种安装方式,但是推荐使用 Docker 拉取镜像部署。如果是个人设备上安装了 Ollama 后续需要解决如何让 Docker 中 FastGPT 容器访问宿主机 Ollama的问题较为麻烦。
### Docker 安装(推荐)
你可以使用 Ollama 官方的 Docker 镜像来一键安装和启动 Ollama 服务(确保你的机器上已经安装了 Docker命令如下
```bash
docker pull ollama/ollama
docker run --rm -d --name ollama -p 11434:11434 ollama/ollama
```
如果你的 FastGPT 是在 Docker 中进行部署的,建议在拉取 Ollama 镜像时保证和 FastGPT 镜像处于同一网络,否则可能出现 FastGPT 无法访问的问题,命令如下:
```bash
docker run --rm -d --name ollama --network (你的 Fastgpt 容器所在网络) -p 11434:11434 ollama/ollama
```
### 主机安装
如果你不想使用 Docker ,也可以采用主机安装,以下是主机安装的一些方式。
#### MacOS
如果你使用的是 macOS且系统中已经安装了 Homebrew 包管理器,可通过以下命令来安装 Ollama
```bash
brew install ollama
ollama serve #安装完成后,使用该命令启动服务
```
#### Linux
在 Linux 系统上,你可以借助包管理器来安装 Ollama。以 Ubuntu 为例,在终端执行以下命令:
```bash
curl https://ollama.com/install.sh | sh #此命令会从官方网站下载并执行安装脚本。
ollama serve #安装完成后,同样启动服务
```
#### Windows
在 Windows 系统中,你可以从 Ollama 官方网站 下载 Windows 版本的安装程序。下载完成后,运行安装程序,按照安装向导的提示完成安装。安装完成后,在命令提示符或 PowerShell 中启动服务:
```bash
ollama serve #安装完成并启动服务后,你可以在浏览器中访问 http://localhost:11434 来验证 Ollama 是否安装成功。
```
#### 补充说明
如果你是采用的主机应用 Ollama 而不是镜像,需要确保你的 Ollama 可以监听0.0.0.0。
##### 1. Linux 系统
如果 Ollama 作为 systemd 服务运行,打开终端,编辑 Ollama 的 systemd 服务文件,使用命令 sudo systemctl edit ollama.service,在 [Service] 部分添加 Environment="OLLAMA_HOST=0.0.0.0"。保存并退出编辑器,然后执行 sudo systemctl daemon-reload 和 sudo systemctl restart ollama 使配置生效。
##### 2. MacOS 系统
打开终端使用launchctl setenv ollama_host "0.0.0.0"命令设置环境变量,然后重启 Ollama 应用程序以使更改生效。
##### 3. Windows 系统
通过 “开始” 菜单或搜索栏打开 “编辑系统环境变量”,在 “系统属性” 窗口中点击 “环境变量”,在 “系统变量” 部分点击 “新建”创建一个名为OLLAMA_HOST的变量变量值设置为0.0.0.0,点击 “确定” 保存更改,最后从 “开始” 菜单重启 Ollama 应用程序。
### Ollama 拉取模型镜像
在安装 Ollama 后,本地是没有模型镜像的,需要自己去拉取 Ollama 中的模型镜像。命令如下:
```bash
# Docker 部署需要先进容器,命令为: docker exec -it < Ollama 容器名 > /bin/sh
ollama pull <模型名>
```
![](/imgs/Ollama-pull.png)
### 测试通信
在安装完成后,需要进行检测测试,首先进入 FastGPT 所在的容器,尝试访问自己的 Ollama ,命令如下:
```bash
docker exec -it < FastGPT 所在的容器名 > /bin/sh
curl http://XXX.XXX.XXX.XXX:11434 #容器部署地址为“http://<容器名>:<端口>”,主机安装地址为"http://<主机IP>:<端口>"主机IP不可为localhost
```
看到访问显示自己的 Ollama 服务以及启动,说明可以正常通信。
## 将 Ollama 接入 FastGPT
### 1. 查看 Ollama 所拥有的模型
首先采用下述命令查看 Ollama 中所拥有的模型,
```bash
# Docker 部署 Ollama需要此命令 docker exec -it < Ollama 容器名 > /bin/sh
ollama ls
```
![](/imgs/Ollama-models1.png)
### 2. AI Proxy 接入
如果你采用的是 FastGPT 中的默认配置文件部署[这里](/docs/development/docker.md),即默认采用 AI Proxy 进行启动。
![](/imgs/Ollama-aiproxy1.png)
请确保你的 FastGPT 可以直接访问 Ollama 容器;如果无法访问,参考上文[点此跳转](#安装-ollama)的安装过程,检查是否主机未监听 0.0.0.0,或者两个容器不在同一个网络。
![](/imgs/Ollama-aiproxy2.png)
在 FastGPT 中点击账号->模型提供商->模型配置->新增模型添加自己的模型即可添加模型时需要保证模型ID和 OneAPI 中的模型名称一致。详细参考[这里](/docs/development/modelConfig/intro.md)
![](/imgs/Ollama-models2.png)
![](/imgs/Ollama-models3.png)
运行 FastGPT ,在页面中选择账号->模型提供商->模型渠道->新增渠道。之后,在渠道选择中选择 Ollama ,然后加入自己拉取的模型,填入代理地址,如果是容器中安装 Ollama 代理地址为http://地址:端口补充容器部署地址为“http://<容器名>:<端口>”,主机安装地址为"http://<主机IP>:<端口>"主机IP不可为localhost
![](/imgs/Ollama-aiproxy3.png)
在工作台中创建一个应用,选择自己之前添加的模型,此处模型名称为自己当时设置的别名。注:同一个模型无法多次添加,系统会采取最新添加时设置的别名。
![](/imgs/Ollama-models4.png)
### 3. OneAPI 接入
如果你想使用 OneAPI ,首先需要拉取 OneAPI 镜像,然后将其在 FastGPT 容器的网络中运行。具体命令如下:
```bash
# 拉取 One API 镜像(注意:是 one-api 模型网关镜像,而非 Intel oneAPI 工具包)
docker pull justsong/one-api
# 运行容器并指定自定义网络和容器名
docker run -d --network < FastGPT 网络 > --name one-api -p 3000:3000 justsong/one-api
```
进入 OneAPI 页面,添加新的渠道,类型选择 Ollama ,在模型中填入自己 Ollama 中的模型,需要保证添加的模型名称和 Ollama 中一致,再在下方填入自己的 Ollama 代理地址默认http://地址:端口,不需要填写/v1。添加成功后在 OneAPI 进行渠道测试,测试成功则说明添加成功。此处演示采用的是 Docker 部署 Ollama 的效果,主机 Ollama需要修改代理地址为http://<主机IP>:<端口>
![](/imgs/Ollama-oneapi1.png)
渠道添加成功后,点击令牌,点击添加令牌,填写名称,修改配置。
![](/imgs/Ollama-oneapi2.png)
修改部署 FastGPT 的 docker-compose.yml 文件,在其中将 AI Proxy 的使用注释,在 OPENAI_BASE_URL 中加入自己的 OneAPI 开放地址默认是http://地址:端口/v1v1必须填写。KEY 中填写自己在 OneAPI 的令牌。
![](/imgs/Ollama-oneapi3.png)
[直接跳转5](#5-模型添加和使用)添加模型,并使用。
### 4. 直接接入
如果你既不想使用 AI Proxy也不想使用 OneAPI也可以选择直接接入修改部署 FastGPT 的 docker-compose.yml 文件,在其中将 AI Proxy 的使用注释,采用和 OneAPI 的类似配置。注释掉 AIProxy 相关代码在OPENAI_BASE_URL中加入自己的 Ollama 开放地址默认是http://地址:端口/v1强调:v1必须填写。在KEY中随便填入因为 Ollama 默认没有鉴权,如果开启鉴权,请自行填写。其他操作和在 OneAPI 中加入 Ollama 一致,只需在 FastGPT 中加入自己的模型即可使用。此处演示采用的是 Docker 部署 Ollama 的效果,主机 Ollama需要修改代理地址为http://<主机IP>:<端口>
![](/imgs/Ollama-direct1.png)
完成后[点击这里](#5-模型添加和使用)进行模型添加并使用。
### 5. 模型添加和使用
在 FastGPT 中点击账号->模型提供商->模型配置->新增模型添加自己的模型即可添加模型时需要保证模型ID和 OneAPI 中的模型名称一致。
![](/imgs/Ollama-models2.png)
![](/imgs/Ollama-models3.png)
在工作台中创建一个应用,选择自己之前添加的模型,此处模型名称为自己当时设置的别名。注:同一个模型无法多次添加,系统会采取最新添加时设置的别名。
![](/imgs/Ollama-models4.png)
### 6. 补充
上述接入 Ollama 的代理地址中,主机安装 Ollama 的地址为“http://<主机IP>:<端口>”,容器部署 Ollama 地址为“http://<容器名>:<端口>”
# 使用 Xinference 接入本地模型
@ -1103,7 +1103,6 @@ curl --location --request POST 'https://<oneapi_url>/v1/chat/completions' \
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {} // 请求API时挟带一些默认配置比如 GLM4 的 top_p
@ -2815,8 +2814,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {}
@ -2838,8 +2835,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,
@ -2864,8 +2859,6 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,

env.d.ts vendored

@ -5,6 +5,7 @@ declare global {
DEFAULT_ROOT_PSW: string;
DB_MAX_LINK: string;
FILE_TOKEN_KEY: string;
AES256_SECRET_KEY: string;
ROOT_KEY: string;
OPENAI_BASE_URL: string;
CHAT_API_KEY: string;


@ -0,0 +1,8 @@
import { i18nT } from '../../../web/i18n/utils';
export enum HeaderSecretTypeEnum {
None = 'None',
Bearer = 'Bearer',
Basic = 'Basic',
Custom = 'Custom'
}


@ -0,0 +1,6 @@
export type SecretValueType = {
value: string;
secret: string;
};
export type StoreSecretValueType = Record<string, SecretValueType>;


@ -51,9 +51,6 @@ export type LLMModelItemType = PriceType &
functionCall: boolean;
toolChoice: boolean;
customCQPrompt: string;
customExtractPrompt: string;
defaultSystemChatPrompt?: string;
defaultConfig?: Record<string, any>;
fieldMap?: Record<string, string>;


@ -26,8 +26,6 @@ export const defaultQAModels: LLMModelItemType[] = [
datasetProcess: true,
toolChoice: true,
functionCall: false,
customCQPrompt: '',
customExtractPrompt: '',
defaultSystemChatPrompt: '',
defaultConfig: {}
}


@ -1,5 +1,3 @@
import { getPromptByVersion } from './utils';
export const Prompt_AgentQA = {
description: `<Context></Context> 标记中是一段文本,学习和分析它,并整理学习成果:
-
@ -27,73 +25,140 @@ A2:
`
};
export const getExtractJsonPrompt = (version?: string) => {
const promptMap: Record<string, string> = {
['4.9.2']: `你可以从 <对话记录></对话记录> 中提取指定 Json 信息,你仅需返回 Json 字符串,无需回答问题。
<提取要求>
{{description}}
</提取要求>
export const getExtractJsonPrompt = ({
schema,
systemPrompt,
memory
}: {
schema?: string;
systemPrompt?: string;
memory?: string;
}) => {
const list = [
'【历史记录】',
'【用户输入】',
systemPrompt ? '【背景知识】' : '',
memory ? '【历史提取结果】' : ''
].filter(Boolean);
const prompt = `## 背景
${list.join('、')}
<提取规则>
- json JsonSchema
- type ; key ; description ; enum value
-
</提取规则>
##
<JsonSchema>
{{json}}
</JsonSchema>
<对话记录>
{{text}}
</对话记录>
json :`
};
return getPromptByVersion(version, promptMap);
};
export const getExtractJsonToolPrompt = (version?: string) => {
const promptMap: Record<string, string> = {
['4.9.2']: `我正在执行一个函数,需要你提供一些参数,请以 JSON 字符串格式返回这些参数,要求:
"""
- {{description}}
- JSON Schema
-
-
"""
: """{{content}}"""
`
};
${
systemPrompt
? `## 特定要求
${systemPrompt}`
: ''
}
return getPromptByVersion(version, promptMap);
${
memory
? `## 历史提取结果
${memory}`
: ''
}
## JSON Schema
${schema}
##
- json
- `.replace(/\n{3,}/g, '\n\n');
return prompt;
};
export const getExtractJsonToolPrompt = ({
systemPrompt,
memory
}: {
systemPrompt?: string;
memory?: string;
}) => {
const list = [
'【历史记录】',
'【用户输入】',
systemPrompt ? '【背景知识】' : '',
memory ? '【历史提取结果】' : ''
].filter(Boolean);
const prompt = `## 背景
"request_function" ${list.join('、')}
##
-
-
- 使 JSON
${
systemPrompt
? `## 特定要求
${systemPrompt}`
: ''
}
${
memory
? `## 历史提取结果
${memory}`
: ''
}`.replace(/\n{3,}/g, '\n\n');
return prompt;
};
export const getCQPrompt = (version?: string) => {
const promptMap: Record<string, string> = {
['4.9.2']: `请帮我执行一个"问题分类"任务,将问题分类为以下几种类型之一:
export const getCQSystemPrompt = ({
systemPrompt,
memory,
typeList
}: {
systemPrompt?: string;
memory?: string;
typeList: string;
}) => {
const list = [
systemPrompt ? '【背景知识】' : '',
'【历史记录】',
memory ? '【上一轮分类结果】' : ''
].filter(Boolean);
const CLASSIFY_QUESTION_SYSTEM_PROMPT = `## 角色
"分类助手"${list.join('、')}
"""
{{typeList}}
"""
${
systemPrompt
? `## 背景知识
${systemPrompt}`
: ''
}
##
{{systemPrompt}}
${
memory
? `## 上一轮分类结果
${memory}`
: ''
}
##
{{history}}
##
##
${typeList}
"问题"ID
##
"{{question}}"
ID=
`
};
1.
2.
3.
return getPromptByVersion(version, promptMap);
##
id `.replace(/\n{3,}/g, '\n\n');
return CLASSIFY_QUESTION_SYSTEM_PROMPT;
};
export const QuestionGuidePrompt = `You are an AI assistant tasked with predicting the user's next question based on the conversation history. Your goal is to generate 3 potential questions that will guide the user to continue the conversation. When generating these questions, adhere to the following rules:


@ -8,15 +8,18 @@ import { nanoid } from 'nanoid';
import { type McpToolConfigType } from '../type';
import { i18nT } from '../../../../web/i18n/utils';
import { type RuntimeNodeItemType } from '../../workflow/runtime/type';
import { type StoreSecretValueType } from '../../../common/secret/type';
export const getMCPToolSetRuntimeNode = ({
url,
toolList,
headerSecret,
name,
avatar
}: {
url: string;
toolList: McpToolConfigType[];
headerSecret?: StoreSecretValueType;
name?: string;
avatar?: string;
}): RuntimeNodeItemType => {
@ -31,7 +34,11 @@ export const getMCPToolSetRuntimeNode = ({
label: 'Tool Set Data',
valueType: WorkflowIOValueTypeEnum.object,
renderTypeList: [FlowNodeInputTypeEnum.hidden],
value: { url, toolList }
value: {
url,
toolList,
headerSecret
}
}
],
outputs: [],
@ -43,10 +50,12 @@ export const getMCPToolSetRuntimeNode = ({
export const getMCPToolRuntimeNode = ({
tool,
url,
headerSecret,
avatar = 'core/app/type/mcpToolsFill'
}: {
tool: McpToolConfigType;
url: string;
headerSecret?: StoreSecretValueType;
avatar?: string;
}): RuntimeNodeItemType => {
return {
@ -60,7 +69,11 @@ export const getMCPToolRuntimeNode = ({
label: 'Tool Data',
valueType: WorkflowIOValueTypeEnum.object,
renderTypeList: [FlowNodeInputTypeEnum.hidden],
value: { ...tool, url }
value: {
...tool,
url,
headerSecret
}
},
...Object.entries(tool.inputSchema?.properties || {}).map(([key, value]) => ({
key,


@ -7,9 +7,6 @@ import { type StoreNodeItemType } from '../workflow/type/node';
import { DatasetSearchModeEnum } from '../dataset/constants';
import { type WorkflowTemplateBasicType } from '../workflow/type';
import { AppTypeEnum } from './constants';
import { AppErrEnum } from '../../common/error/code/app';
import { PluginErrEnum } from '../../common/error/code/plugin';
import { i18nT } from '../../../web/i18n/utils';
import appErrList from '../../common/error/code/app';
import pluginErrList from '../../common/error/code/plugin';


@ -93,6 +93,7 @@ export type AIChatItemValueItemType = {
export type AIChatItemType = {
obj: ChatRoleEnum.AI;
value: AIChatItemValueItemType[];
memories?: Record<string, any>;
userGoodFeedback?: string;
userBadFeedback?: string;
customFeedbacks?: string[];


@ -129,6 +129,7 @@ export enum NodeInputKeyEnum {
textareaInput = 'system_textareaInput',
addInputParam = 'system_addInputParam',
forbidStream = 'system_forbid_stream',
headerSecret = 'system_header_secret',
// history
historyMaxAmount = 'maxContext',


@ -25,10 +25,10 @@ export enum DispatchNodeResponseKeyEnum {
toolResponses = 'toolResponses', // The result is passed back to the tool node for use
assistantResponses = 'assistantResponses', // assistant response
rewriteHistories = 'rewriteHistories', // If have the response, workflow histories will be rewrite
interactive = 'INTERACTIVE', // is interactive
runTimes = 'runTimes', // run times
newVariables = 'newVariables' // new variables
newVariables = 'newVariables', // new variables
memories = 'system_memories' // memories
}
export const needReplaceReferenceInputTypeList = [


@ -246,6 +246,7 @@ export type DispatchNodeResultType<T = {}> = {
[DispatchNodeResponseKeyEnum.rewriteHistories]?: ChatItemType[];
[DispatchNodeResponseKeyEnum.runTimes]?: number;
[DispatchNodeResponseKeyEnum.newVariables]?: Record<string, any>;
[DispatchNodeResponseKeyEnum.memories]?: Record<string, any>;
} & T;
/* Single node props */


@ -39,9 +39,9 @@ export const ClassifyQuestionModule: FlowNodeTemplateType = {
},
{
...Input_Template_System_Prompt,
label: 'core.module.input.label.Background',
description: 'core.module.input.description.Background',
placeholder: 'core.module.input.placeholder.Classify background'
label: i18nT('common:core.module.input.label.Background'),
description: i18nT('common:core.module.input.description.Background'),
placeholder: i18nT('common:core.module.input.placeholder.Classify background')
},
Input_Template_History,
Input_Template_UserChatInput,


@ -65,6 +65,13 @@ export const HttpNode468: FlowNodeTemplateType = {
placeholder: 'https://api.ai.com/getInventory',
required: false
},
{
key: NodeInputKeyEnum.headerSecret,
renderTypeList: [FlowNodeInputTypeEnum.hidden],
valueType: WorkflowIOValueTypeEnum.object,
label: '',
required: false
},
{
key: NodeInputKeyEnum.httpHeaders,
renderTypeList: [FlowNodeInputTypeEnum.custom],


@ -0,0 +1,6 @@
export type InvoiceFileInfo = {
data: string; // base64 encoded file data
mimeType: string;
filename: string;
size: number;
};
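For illustration, a value matching this shape might look like the following; the file contents and name are made up:

```typescript
type InvoiceFileInfo = {
  data: string; // base64 encoded file data
  mimeType: string;
  filename: string;
  size: number; // original size in bytes
};

// Hypothetical example value; real data would come from an uploaded invoice.
const raw = '%PDF-1.4 ...';
const file: InvoiceFileInfo = {
  data: Buffer.from(raw).toString('base64'),
  mimeType: 'application/pdf',
  filename: 'invoice-2025-001.pdf',
  size: Buffer.byteLength(raw)
};

console.log(file.size); // 12
```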


@ -0,0 +1,28 @@
import crypto from 'crypto';
import { AES256_SECRET_KEY } from './constants';
export const encryptSecret = (text: string) => {
const iv = crypto.randomBytes(16);
const key = crypto.scryptSync(AES256_SECRET_KEY, 'salt', 32);
const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
const encrypted = Buffer.concat([cipher.update(text, 'utf8'), cipher.final()]);
const authTag = cipher.getAuthTag();
return `${iv.toString('hex')}:${encrypted.toString('hex')}:${authTag.toString('hex')}`;
};
export const decryptSecret = (encryptedText: string) => {
const [ivHex, encryptedHex, authTagHex] = encryptedText.split(':');
if (!ivHex || !encryptedHex || !authTagHex) {
return '';
}
const iv = Buffer.from(ivHex, 'hex');
const encrypted = Buffer.from(encryptedHex, 'hex');
const authTag = Buffer.from(authTagHex, 'hex');
const key = crypto.scryptSync(AES256_SECRET_KEY, 'salt', 32);
const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
decipher.setAuthTag(authTag);
const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
return decrypted.toString('utf8');
};
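The new AES-256-GCM helpers can be exercised with a round trip. The sketch below inlines both functions so it runs standalone; the fallback key mirrors the shipped default, and a real deployment would set AES256_SECRET_KEY in the environment:

```typescript
import * as crypto from 'crypto';

// Fallback key for illustration; FastGPT reads AES256_SECRET_KEY from the environment.
const AES256_SECRET_KEY = process.env.AES256_SECRET_KEY || 'fastgptkey';

const encryptSecret = (text: string): string => {
  const iv = crypto.randomBytes(16);
  const key = crypto.scryptSync(AES256_SECRET_KEY, 'salt', 32);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(text, 'utf8'), cipher.final()]);
  // The iv and auth tag travel with the ciphertext so decryption is self-contained.
  return `${iv.toString('hex')}:${encrypted.toString('hex')}:${cipher.getAuthTag().toString('hex')}`;
};

const decryptSecret = (encryptedText: string): string => {
  const [ivHex, encryptedHex, authTagHex] = encryptedText.split(':');
  if (!ivHex || !encryptedHex || !authTagHex) return '';
  const key = crypto.scryptSync(AES256_SECRET_KEY, 'salt', 32);
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, Buffer.from(ivHex, 'hex'));
  decipher.setAuthTag(Buffer.from(authTagHex, 'hex'));
  return Buffer.concat([decipher.update(Buffer.from(encryptedHex, 'hex')), decipher.final()]).toString('utf8');
};

console.log(decryptSecret(encryptSecret('sk-demo'))); // prints sk-demo
```

Note that a tampered ciphertext fails the GCM auth-tag check and `decipher.final()` throws, while a string without the `iv:ciphertext:tag` shape returns an empty string.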


@ -0,0 +1 @@
export const AES256_SECRET_KEY = process.env.AES256_SECRET_KEY || 'fastgptkey';


@ -0,0 +1,43 @@
import { decryptSecret, encryptSecret } from './aes256gcm';
import type { SecretValueType } from '@fastgpt/global/common/secret/type';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import { HeaderSecretTypeEnum } from '@fastgpt/global/common/secret/constants';
export const storeSecretValue = (
storeSecret: StoreSecretValueType
): Record<string, SecretValueType> => {
return Object.fromEntries(
Object.entries(storeSecret).map(([key, value]) => [
key,
{
secret: encryptSecret(value.value),
value: ''
}
])
);
};
export const getSecretValue = ({
storeSecret
}: {
storeSecret?: StoreSecretValueType;
}): Record<string, string> => {
if (!storeSecret) return {};
return Object.entries(storeSecret).reduce(
(acc: Record<string, string>, [key, { secret, value }]) => {
const actualValue = value || decryptSecret(secret);
if (key === HeaderSecretTypeEnum.Bearer) {
acc['Authorization'] = `Bearer ${actualValue}`;
} else if (key === HeaderSecretTypeEnum.Basic) {
acc['Authorization'] = `Basic ${actualValue}`;
} else {
acc[key] = actualValue;
}
return acc;
},
{}
);
};
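The Bearer/Basic mapping above can be seen in a standalone sketch. The types and enum are simplified copies of the shipped ones, and `decryptSecret` is replaced by a passthrough since the plain `value` is present in this example:

```typescript
type SecretValueType = { value: string; secret: string };
type StoreSecretValueType = Record<string, SecretValueType>;

enum HeaderSecretTypeEnum {
  Bearer = 'Bearer',
  Basic = 'Basic'
}

// Mirrors getSecretValue: well-known auth types become an Authorization
// header, anything else passes through as a custom header.
const getSecretValue = ({
  storeSecret
}: {
  storeSecret?: StoreSecretValueType;
}): Record<string, string> => {
  if (!storeSecret) return {};
  return Object.entries(storeSecret).reduce(
    (acc: Record<string, string>, [key, { secret, value }]) => {
      const actualValue = value || secret; // real code calls decryptSecret(secret)
      if (key === HeaderSecretTypeEnum.Bearer) {
        acc['Authorization'] = `Bearer ${actualValue}`;
      } else if (key === HeaderSecretTypeEnum.Basic) {
        acc['Authorization'] = `Basic ${actualValue}`;
      } else {
        acc[key] = actualValue;
      }
      return acc;
    },
    {}
  );
};

console.log(getSecretValue({ storeSecret: { Bearer: { value: 'tok-123', secret: '' } } }));
// { Authorization: 'Bearer tok-123' }
```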


@ -9,10 +9,7 @@
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": [
"text",
"json_object"
],
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
@ -20,10 +17,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -37,10 +32,7 @@
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": [
"text",
"json_object"
],
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
@ -48,10 +40,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -65,10 +55,7 @@
"quoteMaxToken": 900000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": [
"text",
"json_object"
],
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": false,
@ -76,10 +63,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -93,10 +78,7 @@
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": [
"text",
"json_object"
],
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
@ -104,10 +86,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -128,10 +108,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -152,10 +130,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -172,4 +148,4 @@
"type": "embedding"
}
]
}
}


@ -16,10 +16,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -40,10 +38,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -64,10 +60,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -88,10 +82,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -112,10 +104,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -136,10 +126,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -160,10 +148,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -17,10 +17,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"type": "llm"
},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -16,10 +16,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -40,10 +38,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -64,10 +60,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -88,10 +82,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -112,10 +104,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -136,10 +126,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -158,10 +146,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -182,10 +168,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -206,10 +190,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -230,10 +212,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -254,10 +234,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -278,10 +256,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -62,10 +58,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -86,10 +80,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -112,4 +104,4 @@
"type": "embedding"
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -62,10 +58,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -86,10 +80,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -110,10 +102,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -134,10 +124,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -158,10 +146,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -182,10 +168,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -206,10 +190,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -230,10 +212,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -16,10 +16,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -40,10 +38,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -64,10 +60,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -88,10 +82,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"type": "llm",
@ -37,10 +35,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"type": "llm",
@ -48,4 +44,4 @@
"showStopSign": true
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -62,10 +58,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -86,10 +80,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -110,10 +102,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -134,10 +124,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -158,10 +146,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -177,4 +163,4 @@
"type": "embedding"
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -50,4 +46,4 @@
"showStopSign": true
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -241,4 +237,4 @@
"type": "tts"
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -62,10 +58,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -86,10 +80,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -98,4 +90,4 @@
"showStopSign": true
}
]
}
}


@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -39,10 +37,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -64,10 +60,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -89,10 +83,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -114,10 +106,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -139,10 +129,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},


@ -17,9 +17,7 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -41,9 +39,7 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -65,9 +61,7 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -89,9 +83,7 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -113,10 +105,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -135,10 +125,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {
@ -161,10 +149,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {
@ -187,10 +173,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {
@ -213,10 +197,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {
@ -239,10 +221,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {
@ -265,10 +245,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": false
@ -295,10 +273,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"type": "llm"
},
@ -317,10 +293,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"type": "llm"
},

View File

@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -39,10 +37,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -63,10 +59,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -88,10 +82,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"type": "llm",
"showTopP": true,
@ -110,10 +102,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -136,10 +126,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -164,10 +152,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -192,10 +178,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -220,10 +204,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -248,10 +230,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -276,10 +256,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -304,10 +282,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -332,10 +308,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -360,10 +334,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"customCQPrompt": "",
"usedInExtractFields": false,
"usedInQueryExtension": false,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -387,10 +359,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"customCQPrompt": "",
"usedInExtractFields": false,
"usedInQueryExtension": false,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {
"stream": true
@ -413,10 +383,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -437,10 +405,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -462,10 +428,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -487,10 +451,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -512,10 +474,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -537,10 +497,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"customCQPrompt": "",
"usedInExtractFields": false,
"usedInQueryExtension": false,
"customExtractPrompt": "",
"usedInToolCall": false,
"defaultConfig": {},
"fieldMap": {},

View File

@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -40,8 +38,6 @@
"usedInToolCall": false,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"type": "llm",
@ -61,10 +57,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -207,4 +201,4 @@
"type": "rerank"
}
]
}
}

View File

@ -16,8 +16,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -38,8 +36,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -60,8 +56,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -82,8 +76,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -102,10 +94,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -126,10 +116,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -138,4 +126,4 @@
"showStopSign": true
}
]
}
}

View File

@ -16,8 +16,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -38,8 +36,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -60,8 +56,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -82,8 +76,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -104,8 +96,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -126,8 +116,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -148,8 +136,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -170,8 +156,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -192,8 +176,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -214,8 +196,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -236,8 +216,6 @@
"usedInQueryExtension": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
@ -327,4 +305,4 @@
"type": "tts"
}
]
}
}

View File

@ -14,10 +14,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -38,10 +36,8 @@
"defaultSystemChatPrompt": "",
"datasetProcess": true,
"usedInClassify": true,
"customCQPrompt": "",
"usedInExtractFields": true,
"usedInQueryExtension": true,
"customExtractPrompt": "",
"usedInToolCall": true,
"defaultConfig": {},
"fieldMap": {},
@ -50,4 +46,4 @@
"showStopSign": true
}
]
}
}

View File

@ -1,19 +1,49 @@
import { type AppSchema } from '@fastgpt/global/core/app/type';
import { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { getLLMModel } from '../ai/model';
import { MongoApp } from './schema';
import type { StoreNodeItemType } from '@fastgpt/global/core/workflow/type/node';
import { storeSecretValue } from '../../common/secret/utils';
export const beforeUpdateAppFormat = <T extends AppSchema['modules'] | undefined>({
nodes,
isPlugin
}: {
nodes: T;
isPlugin: boolean;
}) => {
return {
nodes
};
export const beforeUpdateAppFormat = ({ nodes }: { nodes?: StoreNodeItemType[] }) => {
if (!nodes) return;
nodes.forEach((node) => {
// Format header secret
node.inputs.forEach((input) => {
if (input.key === NodeInputKeyEnum.headerSecret && typeof input.value === 'object') {
input.value = storeSecretValue(input.value);
}
});
// Format dataset search
if (node.flowNodeType === FlowNodeTypeEnum.datasetSearchNode) {
node.inputs.forEach((input) => {
if (input.key === NodeInputKeyEnum.datasetSelectList) {
const val = input.value as undefined | { datasetId: string }[] | { datasetId: string };
if (!val) {
input.value = [];
} else if (Array.isArray(val)) {
// Do not rewrite variable-reference values
if (val.length === 2 && val.every((item) => typeof item === 'string')) {
return;
}
input.value = val
.map((dataset: { datasetId: string }) => ({
datasetId: dataset.datasetId
}))
.filter((item) => !!item.datasetId);
} else if (typeof val === 'object' && val !== null) {
input.value = [
{
datasetId: val.datasetId
}
];
}
}
});
}
});
};
/* Get apps */
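The new beforeUpdateAppFormat above normalizes the stored value of a dataset-select input. Its branching can be exercised in isolation — a sketch, with names and shapes taken from the diff:

```typescript
type DatasetRef = { datasetId: string };

// Normalize a dataset-select input value (sketch of the logic in
// beforeUpdateAppFormat): undefined becomes [], a single object is wrapped,
// arrays are trimmed to { datasetId } entries, and a two-string pair is
// treated as a variable reference and left untouched.
function normalizeDatasetValue(val: unknown): unknown {
  if (!val) return [];
  if (Array.isArray(val)) {
    // A two-string pair ([nodeId, outputKey]) is a reference; keep it as-is
    if (val.length === 2 && val.every((item) => typeof item === 'string')) {
      return val;
    }
    return (val as DatasetRef[])
      .map((dataset) => ({ datasetId: dataset.datasetId }))
      .filter((item) => !!item.datasetId);
  }
  if (typeof val === 'object') {
    return [{ datasetId: (val as DatasetRef).datasetId }];
  }
  return val;
}
```

For example, an entry missing its datasetId is dropped, while a `[nodeId, outputKey]` reference tuple passes through unchanged.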

View File

@ -8,9 +8,11 @@ import { retryFn } from '@fastgpt/global/common/system/utils';
export class MCPClient {
private client: Client;
private url: string;
private headers: Record<string, any> = {};
constructor(config: { url: string }) {
constructor(config: { url: string; headers: Record<string, any> }) {
this.url = config.url;
this.headers = config.headers;
this.client = new Client({
name: 'FastGPT-MCP-client',
version: '1.0.0'
@ -19,11 +21,34 @@ export class MCPClient {
private async getConnection(): Promise<Client> {
try {
const transport = new StreamableHTTPClientTransport(new URL(this.url));
const transport = new StreamableHTTPClientTransport(new URL(this.url), {
requestInit: {
headers: this.headers
}
});
await this.client.connect(transport);
return this.client;
} catch (error) {
await this.client.connect(new SSEClientTransport(new URL(this.url)));
await this.client.connect(
new SSEClientTransport(new URL(this.url), {
requestInit: {
headers: this.headers
},
eventSourceInit: {
fetch: (url, init) => {
const headers = new Headers({
...init?.headers,
...this.headers
});
return fetch(url, {
...init,
headers
});
}
}
})
);
return this.client;
}
}
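The MCPClient change above threads the constructor's headers into both transports; in the SSE fallback it merges them into every reconnect fetch. The merge itself, isolated — client-level headers winning over duplicates from the transport's own init, an assumption drawn from the spread order in the diff:

```typescript
// Merge transport-supplied headers with client-level headers; the
// client-level entries override duplicates (matches the spread order
// used in the SSE fallback's custom fetch).
function mergeHeaders(
  initHeaders: Record<string, string> | undefined,
  clientHeaders: Record<string, string>
): Record<string, string> {
  return { ...initHeaders, ...clientHeaders };
}
```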

View File

@ -83,8 +83,6 @@ export async function rewriteAppWorkflowToDetail({
})
);
/* Add node (App Type) version label and latest sign ==== */
// Get all dataset ids from nodes
nodes.forEach((node) => {
if (node.flowNodeType !== FlowNodeTypeEnum.datasetSearchNode) return;
@ -170,34 +168,3 @@ export async function rewriteAppWorkflowToDetail({
return nodes;
}
export async function rewriteAppWorkflowToSimple(formatNodes: StoreNodeItemType[]) {
formatNodes.forEach((node) => {
if (node.flowNodeType !== FlowNodeTypeEnum.datasetSearchNode) return;
node.inputs.forEach((input) => {
if (input.key === NodeInputKeyEnum.datasetSelectList) {
const val = input.value as undefined | { datasetId: string }[] | { datasetId: string };
if (!val) {
input.value = [];
} else if (Array.isArray(val)) {
// Do not rewrite variable-reference values
if (val.length === 2 && val.every((item) => typeof item === 'string')) {
return;
}
input.value = val
.map((dataset: { datasetId: string }) => ({
datasetId: dataset.datasetId
}))
.filter((item) => !!item.datasetId);
} else if (typeof val === 'object' && val !== null) {
input.value = [
{
datasetId: val.datasetId
}
];
}
}
});
});
}

View File

@ -1,5 +1,5 @@
import { connectionMongo, getMongoModel, type Model } from '../../common/mongo';
const { Schema, model, models } = connectionMongo;
import { connectionMongo, getMongoModel } from '../../common/mongo';
const { Schema } = connectionMongo;
import { type ChatItemSchema as ChatItemType } from '@fastgpt/global/core/chat/type';
import { ChatRoleMap } from '@fastgpt/global/core/chat/constants';
import { getNanoid } from '@fastgpt/global/common/string/tools';
@ -61,16 +61,13 @@ const ChatItemSchema = new Schema({
type: Array,
default: []
},
memories: Object,
errorMsg: String,
userGoodFeedback: {
type: String
},
userGoodFeedback: String,
userBadFeedback: {
type: String
},
customFeedbacks: {
type: [String]
},
customFeedbacks: [String],
adminFeedback: {
type: {
datasetId: String,

View File

@ -1,7 +1,6 @@
import type { ChatItemType, ChatItemValueItemType } from '@fastgpt/global/core/chat/type';
import type { ChatItemType } from '@fastgpt/global/core/chat/type';
import { MongoChatItem } from './chatItemSchema';
import { addLog } from '../../common/system/log';
import { ChatItemValueTypeEnum } from '@fastgpt/global/core/chat/constants';
import { delFileByFileIdList, getGFSCollection } from '../../common/file/gridfs/controller';
import { BucketNameEnum } from '@fastgpt/global/common/file/constants';
import { MongoChat } from './chatSchema';
@ -29,29 +28,9 @@ export async function getChatItems({
]);
histories.reverse();
histories.forEach((item) => {
// @ts-ignore
item.value = adaptStringValue(item.value);
});
return { histories, total };
}
/* Temporary adaptation for old conversation records */
export const adaptStringValue = (value: any): ChatItemValueItemType[] => {
if (typeof value === 'string') {
return [
{
type: ChatItemValueTypeEnum.text,
text: {
content: value
}
}
];
}
return value;
};
export const addCustomFeedbacks = async ({
appId,
chatId,

View File

@ -59,25 +59,27 @@ export const dispatchAppRequest = async (props: Props): Promise<Response> => {
const chatHistories = getHistories(history, histories);
const { files } = chatValue2RuntimePrompt(query);
const { flowResponses, flowUsages, assistantResponses } = await dispatchWorkFlow({
...props,
runningAppInfo: {
id: String(appData._id),
teamId: String(appData.teamId),
tmbId: String(appData.tmbId)
},
runtimeNodes: storeNodes2RuntimeNodes(
appData.modules,
getWorkflowEntryNodeIds(appData.modules)
),
runtimeEdges: storeEdges2RuntimeEdges(appData.edges),
histories: chatHistories,
query: runtimePrompt2ChatsValue({
files,
text: userChatInput
}),
variables: props.variables
});
const { flowResponses, flowUsages, assistantResponses, system_memories } = await dispatchWorkFlow(
{
...props,
runningAppInfo: {
id: String(appData._id),
teamId: String(appData.teamId),
tmbId: String(appData.tmbId)
},
runtimeNodes: storeNodes2RuntimeNodes(
appData.modules,
getWorkflowEntryNodeIds(appData.modules)
),
runtimeEdges: storeEdges2RuntimeEdges(appData.edges),
histories: chatHistories,
query: runtimePrompt2ChatsValue({
files,
text: userChatInput
}),
variables: props.variables
}
);
const completeMessages = chatHistories.concat([
{
@ -94,6 +96,7 @@ export const dispatchAppRequest = async (props: Props): Promise<Response> => {
return {
assistantResponses,
system_memories,
[DispatchNodeResponseKeyEnum.nodeResponse]: {
moduleLogo: appData.avatar,
query: userChatInput,

View File

@ -11,19 +11,17 @@ import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import type { ModuleDispatchProps } from '@fastgpt/global/core/workflow/runtime/type';
import { getCQPrompt } from '@fastgpt/global/core/ai/prompt/agent';
import { getCQSystemPrompt } from '@fastgpt/global/core/ai/prompt/agent';
import { type LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { getLLMModel } from '../../../ai/model';
import { getHistories } from '../utils';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type';
import { chatValue2RuntimePrompt } from '@fastgpt/global/core/chat/adapt';
import { getHandleId } from '@fastgpt/global/core/workflow/utils';
import { loadRequestMessages } from '../../../chat/utils';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../../../ai/utils';
import { addLog } from '../../../../common/system/log';
import { ModelTypeEnum } from '../../../../../global/core/ai/model';
import { replaceVariable } from '@fastgpt/global/common/string/tools';
type Props = ModuleDispatchProps<{
[NodeInputKeyEnum.aiModel]: string;
@ -35,12 +33,16 @@ type Props = ModuleDispatchProps<{
type CQResponse = DispatchNodeResultType<{
[NodeOutputKeyEnum.cqResult]: string;
}>;
type ActionProps = Props & { cqModel: LLMModelItemType };
type ActionProps = Props & {
cqModel: LLMModelItemType;
lastMemory?: ClassifyQuestionAgentItemType;
};
/* request openai chat */
export const dispatchClassifyQuestion = async (props: Props): Promise<CQResponse> => {
const {
externalProvider,
runningAppInfo,
node: { nodeId, name },
histories,
params: { model, history = 6, agents, userChatInput }
@ -52,10 +54,16 @@ export const dispatchClassifyQuestion = async (props: Props): Promise<CQResponse
const cqModel = getLLMModel(model);
const memoryKey = `${runningAppInfo.id}-${nodeId}`;
const chatHistories = getHistories(history, histories);
// @ts-ignore
const lastMemory = chatHistories[chatHistories.length - 1]?.memories?.[
memoryKey
] as ClassifyQuestionAgentItemType;
const { arg, inputTokens, outputTokens } = await completions({
...props,
lastMemory,
histories: chatHistories,
cqModel
});
@ -74,6 +82,9 @@ export const dispatchClassifyQuestion = async (props: Props): Promise<CQResponse
[DispatchNodeResponseKeyEnum.skipHandleId]: agents
.filter((item) => item.key !== result.key)
.map((item) => getHandleId(nodeId, 'source', item.key)),
[DispatchNodeResponseKeyEnum.memories]: {
[memoryKey]: result
},
[DispatchNodeResponseKeyEnum.nodeResponse]: {
totalPoints: externalProvider.openaiAccount?.key ? 0 : totalPoints,
model: modelName,
@ -100,26 +111,35 @@ const completions = async ({
cqModel,
externalProvider,
histories,
params: { agents, systemPrompt = '', userChatInput },
node: { version }
lastMemory,
params: { agents, systemPrompt = '', userChatInput }
}: ActionProps) => {
const messages: ChatItemType[] = [
{
obj: ChatRoleEnum.System,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: getCQSystemPrompt({
systemPrompt,
memory: lastMemory ? JSON.stringify(lastMemory) : '',
typeList: JSON.stringify(
agents.map((item) => ({ id: item.key, description: item.value }))
)
})
}
}
]
},
...histories,
{
obj: ChatRoleEnum.Human,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: replaceVariable(cqModel.customCQPrompt || getCQPrompt(version), {
systemPrompt: systemPrompt || 'null',
typeList: agents
.map((item) => `{"类型ID":"${item.key}", "问题类型":"${item.value}"}`)
.join('\n------\n'),
history: histories
.map((item) => `${item.obj}:${chatValue2RuntimePrompt(item.value).text}`)
.join('\n------\n'),
question: userChatInput
})
content: userChatInput
}
}
]
@ -145,7 +165,6 @@ const completions = async ({
const { text: answer, usage } = await formatLLMResponse(response);
// console.log(JSON.stringify(chats2GPTMessages({ messages, reserveId: false }), null, 2));
// console.log(answer, '----');
const id =
agents.find((item) => answer.includes(item.key))?.key ||
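The classification node above introduces a per-node memory round trip: the node writes its last result under a `${runningAppInfo.id}-${nodeId}` key, and on the next turn reads it back from the newest history item's `memories` field. A standalone sketch of that round trip (shapes are assumptions based on the diff):

```typescript
type HistoryItem = { memories?: Record<string, unknown> };

// Read a node's stored memory from the most recent history item, if any.
function readLastMemory<T>(histories: HistoryItem[], memoryKey: string): T | undefined {
  return histories[histories.length - 1]?.memories?.[memoryKey] as T | undefined;
}

// Package a node's result under its memory key, as returned via
// DispatchNodeResponseKeyEnum.memories in the diff.
function writeMemory(memoryKey: string, result: unknown): Record<string, unknown> {
  return { [memoryKey]: result };
}
```

On an empty history the read simply yields undefined, so first-turn runs behave as before.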

View File

@ -13,7 +13,7 @@ import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { NodeOutputKeyEnum, toolValueTypeList } from '@fastgpt/global/core/workflow/constants';
import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import type { ModuleDispatchProps } from '@fastgpt/global/core/workflow/runtime/type';
import { replaceVariable, sliceJsonStr } from '@fastgpt/global/common/string/tools';
import { sliceJsonStr } from '@fastgpt/global/common/string/tools';
import { type LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { getHistories } from '../utils';
import { getLLMModel } from '../../../ai/model';
@ -21,12 +21,10 @@ import { formatModelChars2Points } from '../../../../support/wallet/usage/utils'
import json5 from 'json5';
import {
type ChatCompletionMessageParam,
type ChatCompletionTool,
type UnStreamChatType
type ChatCompletionTool
} from '@fastgpt/global/core/ai/type';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type';
import { chatValue2RuntimePrompt } from '@fastgpt/global/core/chat/adapt';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../../../ai/utils';
import { ModelTypeEnum } from '../../../../../global/core/ai/model';
import {
@ -46,14 +44,15 @@ type Response = DispatchNodeResultType<{
[NodeOutputKeyEnum.contextExtractFields]: string;
}>;
type ActionProps = Props & { extractModel: LLMModelItemType };
type ActionProps = Props & { extractModel: LLMModelItemType; lastMemory?: Record<string, any> };
const agentFunName = 'request_function';
export async function dispatchContentExtract(props: Props): Promise<Response> {
const {
externalProvider,
node: { name },
runningAppInfo,
node: { nodeId, name },
histories,
params: { content, history = 6, model, description, extractKeys }
} = props;
@ -65,18 +64,27 @@ export async function dispatchContentExtract(props: Props): Promise<Response> {
const extractModel = getLLMModel(model);
const chatHistories = getHistories(history, histories);
const memoryKey = `${runningAppInfo.id}-${nodeId}`;
// @ts-ignore
const lastMemory = chatHistories[chatHistories.length - 1]?.memories?.[memoryKey] as Record<
string,
any
>;
const { arg, inputTokens, outputTokens } = await (async () => {
if (extractModel.toolChoice) {
return toolChoice({
...props,
histories: chatHistories,
extractModel
extractModel,
lastMemory
});
}
return completions({
...props,
histories: chatHistories,
extractModel
extractModel,
lastMemory
});
})();
@ -121,6 +129,9 @@ export async function dispatchContentExtract(props: Props): Promise<Response> {
return {
[NodeOutputKeyEnum.success]: success,
[NodeOutputKeyEnum.contextExtractFields]: JSON.stringify(arg),
[DispatchNodeResponseKeyEnum.memories]: {
[memoryKey]: arg
},
...arg,
[DispatchNodeResponseKeyEnum.nodeResponse]: {
totalPoints: externalProvider.openaiAccount?.key ? 0 : totalPoints,
@ -144,39 +155,7 @@ export async function dispatchContentExtract(props: Props): Promise<Response> {
};
}
const getFunctionCallSchema = async ({
extractModel,
histories,
params: { content, extractKeys, description },
node: { version }
}: ActionProps) => {
const messages: ChatItemType[] = [
...histories,
{
obj: ChatRoleEnum.Human,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: replaceVariable(getExtractJsonToolPrompt(version), {
description,
content
})
}
}
]
}
];
const adaptMessages = chats2GPTMessages({ messages, reserveId: false });
const filterMessages = await filterGPTMessageByMaxContext({
messages: adaptMessages,
maxContext: extractModel.maxContext
});
const requestMessages = await loadRequestMessages({
messages: filterMessages,
useVision: false
});
const getJsonSchema = ({ params: { extractKeys } }: ActionProps) => {
const properties: Record<
string,
{
@ -194,32 +173,71 @@ const getFunctionCallSchema = async ({
...(item.enum ? { enum: item.enum.split('\n').filter(Boolean) } : {})
};
});
// function body
const agentFunction = {
name: agentFunName,
description: '需要执行的函数',
parameters: {
type: 'object',
properties,
required: []
}
};
return {
filterMessages: requestMessages,
agentFunction
};
return properties;
};
const toolChoice = async (props: ActionProps) => {
const { externalProvider, extractModel } = props;
const {
externalProvider,
extractModel,
histories,
params: { content, description },
lastMemory
} = props;
const { filterMessages, agentFunction } = await getFunctionCallSchema(props);
const messages: ChatItemType[] = [
{
obj: ChatRoleEnum.System,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: getExtractJsonToolPrompt({
systemPrompt: description,
memory: lastMemory ? JSON.stringify(lastMemory) : undefined
})
}
}
]
},
...histories,
{
obj: ChatRoleEnum.Human,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content
}
}
]
}
];
const adaptMessages = chats2GPTMessages({ messages, reserveId: false });
const filterMessages = await filterGPTMessageByMaxContext({
messages: adaptMessages,
maxContext: extractModel.maxContext
});
const requestMessages = await loadRequestMessages({
messages: filterMessages,
useVision: false
});
const schema = getJsonSchema(props);
const tools: ChatCompletionTool[] = [
{
type: 'function',
function: agentFunction
function: {
name: agentFunName,
description: '需要执行的函数',
parameters: {
type: 'object',
properties: schema,
required: []
}
}
}
];
@ -228,12 +246,13 @@ const toolChoice = async (props: ActionProps) => {
stream: true,
model: extractModel.model,
temperature: 0.01,
messages: filterMessages,
messages: requestMessages,
tools,
tool_choice: { type: 'function', function: { name: agentFunName } }
},
extractModel
);
const { response } = await createChatCompletion({
body,
userKey: externalProvider.openaiAccount
@ -242,7 +261,7 @@ const toolChoice = async (props: ActionProps) => {
const arg: Record<string, any> = (() => {
try {
return json5.parse(toolCalls?.[0]?.function?.arguments || '');
return json5.parse(toolCalls?.[0]?.function?.arguments || text || '');
} catch (error) {
console.log('body', body);
console.log('AI response', text, toolCalls?.[0]?.function);
@ -267,40 +286,39 @@ const toolChoice = async (props: ActionProps) => {
};
};
const completions = async ({
extractModel,
externalProvider,
histories,
params: { content, extractKeys, description = 'No special requirements' },
node: { version }
}: ActionProps) => {
const completions = async (props: ActionProps) => {
const {
extractModel,
externalProvider,
histories,
lastMemory,
params: { content, description }
} = props;
const messages: ChatItemType[] = [
{
obj: ChatRoleEnum.System,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: getExtractJsonPrompt({
systemPrompt: description,
memory: lastMemory ? JSON.stringify(lastMemory) : undefined,
schema: JSON.stringify(getJsonSchema(props))
})
}
}
]
},
...histories,
{
obj: ChatRoleEnum.Human,
value: [
{
type: ChatItemValueTypeEnum.text,
text: {
content: replaceVariable(
extractModel.customExtractPrompt || getExtractJsonPrompt(version),
{
description,
json: extractKeys
.map((item) => {
const valueType = item.valueType || 'string';
if (valueType !== 'string' && valueType !== 'number') {
item.enum = undefined;
}
return `{"type":${item.valueType || 'string'}, "key":"${item.key}", "description":"${item.desc}" ${
item.enum ? `, "enum":"[${item.enum.split('\n')}]"` : ''
}}`;
})
.join('\n'),
text: `${histories.map((item) => `${item.obj}:${chatValue2RuntimePrompt(item.value).text}`).join('\n')}
Human: ${content}`
}
)
content
}
}
]
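The refactor above replaces the old prompt-string schema with getJsonSchema, which turns extract keys into JSON-schema properties and splits newline-separated enum strings into arrays. A standalone version of that helper — the ExtractKey shape is an assumption from the diff context:

```typescript
type ExtractKey = { key: string; desc: string; valueType?: string; enum?: string };

// Build the JSON-schema `properties` object used by both the tool-choice
// and plain-completion paths (sketched from getJsonSchema in the diff).
function buildProperties(extractKeys: ExtractKey[]) {
  const properties: Record<string, { type: string; description: string; enum?: string[] }> = {};
  extractKeys.forEach((item) => {
    properties[item.key] = {
      type: item.valueType || 'string',
      description: item.desc,
      // Enum options are authored one-per-line; empty lines are dropped
      ...(item.enum ? { enum: item.enum.split('\n').filter(Boolean) } : {})
    };
  });
  return properties;
}
```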

View File

@ -223,6 +223,7 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
interactiveResponse: InteractiveNodeResponseType;
}
| undefined;
let system_memories: Record<string, any> = {}; // Workflow node memories
/* Store special response field */
function pushStore(
@ -235,7 +236,8 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
toolResponses,
assistantResponses,
rewriteHistories,
runTimes = 1
runTimes = 1,
system_memories: newMemories
}: Omit<
DispatchNodeResultType<{
[NodeOutputKeyEnum.answerText]?: string;
@ -249,6 +251,13 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
workflowRunTimes += runTimes;
props.maxRunTimes -= runTimes;
if (newMemories) {
system_memories = {
...system_memories,
...newMemories
};
}
if (responseData) {
chatResponses.push(responseData);
}
@ -771,7 +780,12 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
[DispatchNodeResponseKeyEnum.assistantResponses]:
mergeAssistantResponseAnswerText(chatAssistantResponse),
[DispatchNodeResponseKeyEnum.toolResponses]: toolRunResponse,
newVariables: removeSystemVariable(variables, externalProvider.externalWorkflowVariables),
[DispatchNodeResponseKeyEnum.newVariables]: removeSystemVariable(
variables,
externalProvider.externalWorkflowVariables
),
[DispatchNodeResponseKeyEnum.memories]:
Object.keys(system_memories).length > 0 ? system_memories : undefined,
durationSeconds
};
} catch (error) {
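The memory plumbing added above can be sketched in isolation (simplified types assumed; `mergeMemories` is an illustrative name, not the real helper): each node's returned memories are spread over the running map, so later nodes overwrite earlier values for the same key.

```typescript
// Minimal sketch of how pushStore accumulates per-node memories.
type Memories = Record<string, any>;

function mergeMemories(current: Memories, incoming?: Memories): Memories {
  if (!incoming) return current; // node returned no memories
  return { ...current, ...incoming }; // later node wins on key conflict
}

let systemMemories: Memories = {};
systemMemories = mergeMemories(systemMemories, { lastExtract: { name: 'a' } });
systemMemories = mergeMemories(systemMemories, { lastExtract: { name: 'b' }, step: 2 });
console.log(systemMemories.lastExtract.name); // 'b' (later node wins)
```

This mirrors why the final response only includes `memories` when `Object.keys(system_memories).length > 0`.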


@ -80,32 +80,33 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
appId: String(plugin.id),
...(externalProvider ? externalProvider.externalWorkflowVariables : {})
};
const { flowResponses, flowUsages, assistantResponses, runTimes } = await dispatchWorkFlow({
...props,
// Rewrite stream mode
...(system_forbid_stream
? {
stream: false,
workflowStreamResponse: undefined
}
: {}),
runningAppInfo: {
id: String(plugin.id),
// If the system plugin has a teamId and tmbId, use them instead (an admin has designated this plugin as a system plugin)
teamId: plugin.teamId || runningAppInfo.teamId,
tmbId: plugin.tmbId || runningAppInfo.tmbId,
isChildApp: true
},
variables: runtimeVariables,
query: getPluginRunUserQuery({
pluginInputs: getPluginInputsFromStoreNodes(plugin.nodes),
const { flowResponses, flowUsages, assistantResponses, runTimes, system_memories } =
await dispatchWorkFlow({
...props,
// Rewrite stream mode
...(system_forbid_stream
? {
stream: false,
workflowStreamResponse: undefined
}
: {}),
runningAppInfo: {
id: String(plugin.id),
// If the system plugin has a teamId and tmbId, use them instead (an admin has designated this plugin as a system plugin)
teamId: plugin.teamId || runningAppInfo.teamId,
tmbId: plugin.tmbId || runningAppInfo.tmbId,
isChildApp: true
},
variables: runtimeVariables,
files
}).value,
chatConfig: {},
runtimeNodes,
runtimeEdges: storeEdges2RuntimeEdges(plugin.edges)
});
query: getPluginRunUserQuery({
pluginInputs: getPluginInputsFromStoreNodes(plugin.nodes),
variables: runtimeVariables,
files
}).value,
chatConfig: {},
runtimeNodes,
runtimeEdges: storeEdges2RuntimeEdges(plugin.edges)
});
const output = flowResponses.find((item) => item.moduleType === FlowNodeTypeEnum.pluginOutput);
if (output) {
output.moduleLogo = plugin.avatar;
@ -119,6 +120,7 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
return {
// When running nested, if the child app has stream=false, nothing is actually output to the user, so there is no need to store it
assistantResponses: system_forbid_stream ? [] : assistantResponses,
system_memories,
// responseData, // debug
[DispatchNodeResponseKeyEnum.runTimes]: runTimes,
[DispatchNodeResponseKeyEnum.nodeResponse]: {


@ -124,30 +124,36 @@ export const dispatchRunAppNode = async (props: Props): Promise<Response> => {
? query
: runtimePrompt2ChatsValue({ files: userInputFiles, text: userChatInput });
const { flowResponses, flowUsages, assistantResponses, runTimes, workflowInteractiveResponse } =
await dispatchWorkFlow({
...props,
lastInteractive: childrenInteractive,
// Rewrite stream mode
...(system_forbid_stream
? {
stream: false,
workflowStreamResponse: undefined
}
: {}),
runningAppInfo: {
id: String(appData._id),
teamId: String(appData.teamId),
tmbId: String(appData.tmbId),
isChildApp: true
},
runtimeNodes,
runtimeEdges,
histories: chatHistories,
variables: childrenRunVariables,
query: theQuery,
chatConfig
});
const {
flowResponses,
flowUsages,
assistantResponses,
runTimes,
workflowInteractiveResponse,
system_memories
} = await dispatchWorkFlow({
...props,
lastInteractive: childrenInteractive,
// Rewrite stream mode
...(system_forbid_stream
? {
stream: false,
workflowStreamResponse: undefined
}
: {}),
runningAppInfo: {
id: String(appData._id),
teamId: String(appData.teamId),
tmbId: String(appData.tmbId),
isChildApp: true
},
runtimeNodes,
runtimeEdges,
histories: chatHistories,
variables: childrenRunVariables,
query: theQuery,
chatConfig
});
const completeMessages = chatHistories.concat([
{
@ -165,6 +171,7 @@ export const dispatchRunAppNode = async (props: Props): Promise<Response> => {
const usagePoints = flowUsages.reduce((sum, item) => sum + (item.totalPoints || 0), 0);
return {
system_memories,
[DispatchNodeResponseKeyEnum.interactive]: workflowInteractiveResponse
? {
type: 'childrenInteractive',


@ -6,11 +6,14 @@ import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runti
import { NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { MCPClient } from '../../../app/mcp';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import { getSecretValue } from '../../../../common/secret/utils';
type RunToolProps = ModuleDispatchProps<{
toolData: {
name: string;
url: string;
headerSecret: StoreSecretValueType;
};
}>;
@ -27,7 +30,12 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
const { toolData, ...restParams } = params;
const { name: toolName, url } = toolData;
const mcpClient = new MCPClient({ url });
const mcpClient = new MCPClient({
url,
headers: getSecretValue({
storeSecret: toolData.headerSecret
})
});
try {
const result = await mcpClient.toolCall(toolName, restParams);


@ -29,6 +29,8 @@ import { createFileToken } from '../../../../support/permission/controller';
import { JSONPath } from 'jsonpath-plus';
import type { SystemPluginSpecialResponse } from '../../../../../plugins/type';
import json5 from 'json5';
import { getSecretValue } from '../../../../common/secret/utils';
import type { StoreSecretValueType } from '@fastgpt/global/common/secret/type';
type PropsArrType = {
key: string;
@ -37,6 +39,7 @@ type PropsArrType = {
};
type HttpRequestProps = ModuleDispatchProps<{
[NodeInputKeyEnum.abandon_httpUrl]: string;
[NodeInputKeyEnum.headerSecret]?: StoreSecretValueType;
[NodeInputKeyEnum.httpMethod]: string;
[NodeInputKeyEnum.httpReqUrl]: string;
[NodeInputKeyEnum.httpHeaders]?: PropsArrType[];
@ -83,6 +86,7 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
system_httpFormBody: httpFormBody = [],
system_httpContentType: httpContentType = ContentTypes.json,
system_httpTimeout: httpTimeout = 60,
system_header_secret: headerSecret,
[NodeInputKeyEnum.addInputParam]: dynamicInput,
...body
}
@ -156,110 +160,6 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
return String(val);
};
// Test cases for variable replacement in JSON body
// const bodyTest = () => {
// const testData = [
// // Basic string replacement
// {
// body: `{"name":"{{name}}","age":"18"}`,
// variables: [{ key: '{{name}}', value: '测试' }],
// result: `{"name":"测试","age":"18"}`
// },
// // Special character handling
// {
// body: `{"text":"{{text}}"}`,
// variables: [{ key: '{{text}}', value: '包含"引号"和\\反斜杠' }],
// result: `{"text":"包含\\"引号\\"和\\反斜杠"}`
// },
// // Number type handling
// {
// body: `{"count":{{count}},"price":{{price}}}`,
// variables: [
// { key: '{{count}}', value: '42' },
// { key: '{{price}}', value: '99.99' }
// ],
// result: `{"count":42,"price":99.99}`
// },
// // Boolean handling
// {
// body: `{"isActive":{{isActive}},"hasData":{{hasData}}}`,
// variables: [
// { key: '{{isActive}}', value: 'true' },
// { key: '{{hasData}}', value: 'false' }
// ],
// result: `{"isActive":true,"hasData":false}`
// },
// // Object type handling
// {
// body: `{"user":{{user}},"user2":"{{user2}}"}`,
// variables: [
// { key: '{{user}}', value: `{"id":1,"name":"张三"}` },
// { key: '{{user2}}', value: `{"id":1,"name":"张三"}` }
// ],
// result: `{"user":{"id":1,"name":"张三"},"user2":"{\\"id\\":1,\\"name\\":\\"张三\\"}"}`
// },
// // Array type handling
// {
// body: `{"items":{{items}}}`,
// variables: [{ key: '{{items}}', value: '[1, 2, 3]' }],
// result: `{"items":[1,2,3]}`
// },
// // null and undefined handling
// {
// body: `{"nullValue":{{nullValue}},"undefinedValue":{{undefinedValue}}}`,
// variables: [
// { key: '{{nullValue}}', value: 'null' },
// { key: '{{undefinedValue}}', value: 'undefined' }
// ],
// result: `{"nullValue":null,"undefinedValue":null}`
// },
// // Nested JSON structures
// {
// body: `{"data":{"nested":{"value":"{{nestedValue}}"}}}`,
// variables: [{ key: '{{nestedValue}}', value: '嵌套值' }],
// result: `{"data":{"nested":{"value":"嵌套值"}}}`
// },
// // Multiple variable replacement
// {
// body: `{"first":"{{first}}","second":"{{second}}","third":{{third}}}`,
// variables: [
// { key: '{{first}}', value: '第一' },
// { key: '{{second}}', value: '第二' },
// { key: '{{third}}', value: '3' }
// ],
// result: `{"first":"第一","second":"第二","third":3}`
// },
// // JSON string as a variable value
// {
// body: `{"config":{{config}}}`,
// variables: [{ key: '{{config}}', value: '{"setting":"enabled","mode":"advanced"}' }],
// result: `{"config":{"setting":"enabled","mode":"advanced"}}`
// }
// ];
// for (let i = 0; i < testData.length; i++) {
// const item = testData[i];
// let bodyStr = item.body;
// for (const variable of item.variables) {
// const isQuote = isVariableInQuotes(bodyStr, variable.key);
// bodyStr = bodyStr.replace(variable.key, valToStr(variable.value, isQuote));
// }
// bodyStr = bodyStr.replace(/(".*?")\s*:\s*undefined\b/g, '$1:null');
// console.log(bodyStr === item.result, i);
// if (bodyStr !== item.result) {
// console.log(bodyStr);
// console.log(item.result);
// } else {
// try {
// JSON.parse(item.result);
// } catch (error) {
// console.log('deserialization error', i, item.result);
// }
// }
// }
// };
// bodyTest();
// 1. Replace {{key.key}} variables
const regex1 = /\{\{\$([^.]+)\.([^$]+)\$\}\}/g;
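For reference, a quick standalone check of what `regex1` matches — the `{{$nodeId.key$}}` form; the variables map here is an assumption for illustration only.

```typescript
// Standalone check of the {{$nodeId.key$}} pattern (regex1 above).
const regex1 = /\{\{\$([^.]+)\.([^$]+)\$\}\}/g;
const nodeOutputs: Record<string, Record<string, string>> = {
  node1: { text: 'hello' } // assumed node output map
};

const replaced = 'Result: {{$node1.text$}}'.replace(
  regex1,
  (_match, nodeId, key) => nodeOutputs[nodeId]?.[key] ?? ''
);
console.log(replaced); // 'Result: hello'
```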
@ -313,16 +213,13 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
httpReqUrl = replaceStringVariables(httpReqUrl);
// parse header
const headers = await (() => {
const publicHeaders = await (async () => {
try {
const contentType = contentTypeMap[httpContentType];
if (contentType) {
httpHeader = [{ key: 'Content-Type', value: contentType, type: 'string' }, ...httpHeader];
}
if (!httpHeader || httpHeader.length === 0) return {};
// array
return httpHeader.reduce((acc: Record<string, string>, item) => {
const key = replaceStringVariables(item.key);
const value = replaceStringVariables(item.value);
@ -333,6 +230,9 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
return Promise.reject('Header 为非法 JSON 格式');
}
})();
const sensitiveHeaders = getSecretValue({
storeSecret: headerSecret
});
const params = httpParams.reduce((acc: Record<string, string>, item) => {
const key = replaceStringVariables(item.key);
@ -418,7 +318,7 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
return fetchData({
method: httpMethod,
url: httpReqUrl,
headers,
headers: { ...sensitiveHeaders, ...publicHeaders },
body: requestBody,
params,
timeout: httpTimeout
@ -471,7 +371,7 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
totalPoints: 0,
params: Object.keys(params).length > 0 ? params : undefined,
body: Object.keys(formattedRequestBody).length > 0 ? formattedRequestBody : undefined,
headers: Object.keys(headers).length > 0 ? headers : undefined,
headers: Object.keys(publicHeaders).length > 0 ? publicHeaders : undefined,
httpResult: rawResponse
},
[DispatchNodeResponseKeyEnum.toolResponses]:
@ -486,7 +386,7 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
[DispatchNodeResponseKeyEnum.nodeResponse]: {
params: Object.keys(params).length > 0 ? params : undefined,
body: Object.keys(formattedRequestBody).length > 0 ? formattedRequestBody : undefined,
headers: Object.keys(headers).length > 0 ? headers : undefined,
headers: Object.keys(publicHeaders).length > 0 ? publicHeaders : undefined,
httpResult: { error: formatHttpError(error) }
},
[NodeOutputKeyEnum.httpRawResponse]: getErrText(error)
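The `{ ...sensitiveHeaders, ...publicHeaders }` merge above has a simple consequence worth noting: public headers are spread second, so an explicitly configured node header wins on a key conflict, while only `publicHeaders` are echoed back in `nodeResponse` (keeping stored secrets out of logs). A minimal sketch:

```typescript
// Sketch of the request-header merge: spread order decides precedence.
const sensitiveHeaders: Record<string, string> = {
  Authorization: 'Bearer stored-secret' // decrypted from headerSecret
};
const publicHeaders: Record<string, string> = {
  'Content-Type': 'application/json',
  Authorization: 'Bearer from-node-config' // same key, configured on the node
};

const requestHeaders = { ...sensitiveHeaders, ...publicHeaders };
console.log(requestHeaders.Authorization); // 'Bearer from-node-config'
```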


@ -25,7 +25,8 @@ export type DispatchFlowResponse = {
[DispatchNodeResponseKeyEnum.toolResponses]: ToolRunResponseItemType;
[DispatchNodeResponseKeyEnum.assistantResponses]: AIChatItemValueItemType[];
[DispatchNodeResponseKeyEnum.runTimes]: number;
newVariables: Record<string, string>;
[DispatchNodeResponseKeyEnum.memories]?: Record<string, any>;
[DispatchNodeResponseKeyEnum.newVariables]: Record<string, string>;
durationSeconds: number;
};


@ -164,11 +164,18 @@ export const rewriteRuntimeWorkFlow = (
const toolList =
toolSetNode.inputs.find((input) => input.key === 'toolSetData')?.value?.toolList || [];
const url = toolSetNode.inputs.find((input) => input.key === 'toolSetData')?.value?.url;
const headerSecret = toolSetNode.inputs.find((input) => input.key === 'toolSetData')?.value
?.headerSecret;
const incomingEdges = edges.filter((edge) => edge.target === toolSetNode.nodeId);
for (const tool of toolList) {
const newToolNode = getMCPToolRuntimeNode({ avatar: toolSetNode.avatar, tool, url });
const newToolNode = getMCPToolRuntimeNode({
avatar: toolSetNode.avatar,
tool,
url,
headerSecret
});
nodes.push({ ...newToolNode, name: `${toolSetNode.name} / ${tool.name}` });
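The expansion above can be sketched with simplified shapes (the function and field names here are assumptions for illustration, not the real `getMCPToolRuntimeNode` signature): every tool in the set becomes its own runtime node, inheriting the shared `url` and `headerSecret`.

```typescript
// Simplified sketch of expanding a tool-set node into per-tool runtime nodes.
type Tool = { name: string };
type ToolSetData = { url: string; headerSecret?: Record<string, unknown>; toolList: Tool[] };

function expandToolSet(setName: string, data: ToolSetData) {
  return data.toolList.map((tool) => ({
    name: `${setName} / ${tool.name}`, // same naming scheme as the diff above
    url: data.url,
    headerSecret: data.headerSecret,
    tool
  }));
}

const expanded = expandToolSet('mcp', {
  url: 'http://localhost/mcp',
  toolList: [{ name: 'search' }, { name: 'fetch' }]
});
console.log(expanded.map((n) => n.name).join(', ')); // 'mcp / search, mcp / fetch'
```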


@ -1,3 +1,5 @@
<svg viewBox="0 0 13 12" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M9.65613 2.84383C9.85139 3.0391 9.85139 3.35568 9.65613 3.55094L7.20708 5.99999L9.65617 8.44908C9.85143 8.64434 9.85143 8.96093 9.65617 9.15619C9.46091 9.35145 9.14433 9.35145 8.94906 9.15619L6.49997 6.7071L4.05088 9.15619C3.85562 9.35145 3.53904 9.35145 3.34377 9.15619C3.14851 8.96093 3.14851 8.64434 3.34377 8.44908L5.79286 5.99999L3.34382 3.55094C3.14855 3.35568 3.14855 3.0391 3.34382 2.84383C3.53908 2.64857 3.85566 2.64857 4.05092 2.84383L6.49997 5.29288L8.94902 2.84383C9.14428 2.64857 9.46087 2.64857 9.65613 2.84383Z" fill="#92A5C9"/>
</svg>
<svg viewBox="0 0 13 12" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd"
d="M9.65613 2.84383C9.85139 3.0391 9.85139 3.35568 9.65613 3.55094L7.20708 5.99999L9.65617 8.44908C9.85143 8.64434 9.85143 8.96093 9.65617 9.15619C9.46091 9.35145 9.14433 9.35145 8.94906 9.15619L6.49997 6.7071L4.05088 9.15619C3.85562 9.35145 3.53904 9.35145 3.34377 9.15619C3.14851 8.96093 3.14851 8.64434 3.34377 8.44908L5.79286 5.99999L3.34382 3.55094C3.14855 3.35568 3.14855 3.0391 3.34382 2.84383C3.53908 2.64857 3.85566 2.64857 4.05092 2.84383L6.49997 5.29288L8.94902 2.84383C9.14428 2.64857 9.46087 2.64857 9.65613 2.84383Z"
fill="currentColor" />
</svg>


@ -1,5 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" fill="none" class="icon-sm m-1 md:m-0">
<path
d="M.5 1.163A1 1 0 0 1 1.97.28l12.868 6.837a1 1 0 0 1 0 1.766L1.969 15.72A1 1 0 0 1 .5 14.836V10.33a1 1 0 0 1 .816-.983L8.5 8 1.316 6.653A1 1 0 0 1 .5 5.67V1.163Z"
fill="currentColor"></path>
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="none">
<path d="M18.2341 9.53355C18.0616 9.19432 17.5241 8.94365 16.4489 8.44231L4.77875 3.00039C3.54356 2.4244 2.92596 2.13641 2.51138 2.23006C2.1515 2.31134 1.85435 2.54718 1.71066 2.86554C1.54513 3.23231 1.75085 3.84719 2.16228 5.07695L3.40682 8.79688L7.33305 9.25598C8.61345 9.4057 9.25367 9.48056 9.38362 9.5746C9.69329 9.7987 9.69329 10.2351 9.38362 10.4592C9.25367 10.5532 8.61347 10.6281 7.33307 10.7778L3.39507 11.2383L2.16228 14.9231C1.75085 16.1528 1.54513 16.7677 1.71066 17.1345C1.85435 17.4528 2.1515 17.6887 2.51138 17.77C2.92596 17.8636 3.54356 17.5756 4.77875 16.9996L16.4489 11.5577C17.5241 11.0564 18.0616 10.8057 18.2341 10.4665C18.3842 10.1713 18.3842 9.82876 18.2341 9.53355Z" />
</svg>


@ -1 +1 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1682424901088" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="3662" xmlns:xlink="http://www.w3.org/1999/xlink" width="32" height="32"><path d="M885.333333 85.333333H138.666667a53.393333 53.393333 0 0 0-53.333334 53.333334v746.666666a53.393333 53.393333 0 0 0 53.333334 53.333334h746.666666a53.393333 53.393333 0 0 0 53.333334-53.333334V138.666667a53.393333 53.393333 0 0 0-53.333334-53.333334z m-160 602.666667a37.373333 37.373333 0 0 1-37.333333 37.333333H336a37.373333 37.373333 0 0 1-37.333333-37.333333V336a37.373333 37.373333 0 0 1 37.333333-37.333333h352a37.373333 37.373333 0 0 1 37.333333 37.333333z" p-id="3663"></path></svg>
<svg t="1749735945875" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1418" width="64" height="64"><path d="M512 1014.7488005c-276.51184005 0-502.7488005-226.23696045-502.7488005-502.7488005s226.23696045-502.7488005 502.7488005-502.7488005 502.7488005 226.23696045 502.7488005 502.7488005-226.23696045 502.7488005-502.7488005 502.7488005z m0-125.68719975c207.38388041 0 377.06160075-169.67772034 377.06160075-377.06160075s-169.67772034-377.06160075-377.06160075-377.06160075-377.06160075 169.67772034-377.06160075 377.06160075 169.67772034 377.06160075 377.06160075 377.06160075z m-125.68719975-502.7488005h251.3743995v251.3743995h-251.3743995v-251.3743995z" p-id="1419"></path></svg>


@ -1,10 +1,12 @@
{
"Invoice_document": "Invoice documents",
"all": "all",
"back": "return",
"bank_account": "Account opening account",
"bank_name": "Bank of deposit",
"bill_detail": "Bill details",
"bill_record": "billing records",
"click_to_download": "Click to download",
"company_address": "Company address",
"company_phone": "Company phone number",
"completed": "Completed",


@ -96,6 +96,10 @@
"all_quotes": "All quotes",
"all_result": "Full Results",
"app_not_version": "This application has not been published, please publish it first",
"auth_config": "Authentication",
"auth_type": "Authentication type",
"auth_type.Custom": "Customize",
"auth_type.None": "None",
"back": "Back",
"base_config": "Basic Configuration",
"bill_already_processed": "Order has been processed",
@ -260,7 +264,7 @@
"core.app.have_saved": "Saved",
"core.app.logs.Source And Time": "Source & Time",
"core.app.more": "View More",
"core.app.name": "name",
"name": "name",
"core.app.no_app": "No Apps Yet, Create One Now!",
"core.app.not_saved": "Not Saved",
"core.app.outLink.Can Drag": "Icon Can Be Dragged",
@ -728,6 +732,7 @@
"core.workflow.inputType.switch": "Switch",
"core.workflow.inputType.textInput": "Text Input box",
"core.workflow.inputType.textarea": "Multi-line Input Box",
"key": "key",
"core.workflow.publish.OnRevert version": "Click to Revert to This Version",
"core.workflow.publish.OnRevert version confirm": "Confirm to Revert to This Version? The configuration of the editing version will be saved, and a new release version will be created for the reverted version.",
"core.workflow.publish.histories": "Release Records",
@ -736,7 +741,7 @@
"core.workflow.template.Search": "Search",
"core.workflow.tool.Handle": "Tool Connector",
"core.workflow.tool.Select Tool": "Select Tool",
"core.workflow.value": "Value",
"value": "Value",
"core.workflow.variable": "Variable",
"create": "Create",
"create_failed": "Create failed",
@ -797,6 +802,8 @@
"delete_success": "Deleted Successfully",
"delete_warning": "Deletion Warning",
"embedding_model_not_config": "No index model is detected",
"enable_auth": "Enable authentication",
"error.Create failed": "Create failed",
"error.code_error": "Verification code error",
"error.fileNotFound": "File not found~",
"error.inheritPermissionError": "Inherit permission Error",
@ -827,6 +834,7 @@
"get_QR_failed": "Failed to Get QR Code",
"get_app_failed": "Failed to Retrieve App",
"get_laf_failed": "Failed to Retrieve Laf Function List",
"had_auth_value": "Filled in",
"has_verification": "Verified, Click to Unbind",
"have_done": "Completed",
"import_failed": "Import Failed",
@ -1001,6 +1009,7 @@
"save_failed": "save_failed",
"save_success": "Saved Successfully",
"scan_code": "Scan the QR code to pay",
"secret_tips": "The value will not return plaintext again after saving",
"select_file_failed": "File Selection Failed",
"select_reference_variable": "Select Reference Variable",
"select_template": "Select Template",


@ -1,10 +1,12 @@
{
"Invoice_document": "发票文件",
"all": "全部",
"back": "返回",
"bank_account": "开户账号",
"bank_name": "开户银行",
"bill_detail": "账单详情",
"bill_record": "账单记录",
"click_to_download": "点击下载",
"company_address": "公司地址",
"company_phone": "公司电话",
"completed": "已完成",


@ -96,6 +96,10 @@
"all_quotes": "全部引用",
"all_result": "完整结果",
"app_not_version": " 该应用未发布过,请先发布应用",
"auth_config": "鉴权配置",
"auth_type": "鉴权类型",
"auth_type.Custom": "自定义",
"auth_type.None": "无",
"back": "返回",
"base_config": "基础配置",
"bill_already_processed": "订单已处理",
@ -215,7 +219,6 @@
"core.app.Interval timer run": "定时执行",
"core.app.Interval timer tip": "可定时执行应用",
"core.app.Make a brief introduction of your app": "给你的 AI 应用一个介绍",
"core.app.name": "名称",
"core.app.Name and avatar": "头像 & 名称",
"core.app.Publish": "发布",
"core.app.Publish Confirm": "确认发布应用?会立即更新所有发布渠道的应用状态。",
@ -261,6 +264,7 @@
"core.app.have_saved": "已保存",
"core.app.logs.Source And Time": "来源 & 时间",
"core.app.more": "查看更多",
"name": "名称",
"core.app.no_app": "还没有应用,快去创建一个吧!",
"core.app.not_saved": "未保存",
"core.app.outLink.Can Drag": "图标可拖拽",
@ -728,6 +732,7 @@
"core.workflow.inputType.switch": "开关",
"core.workflow.inputType.textInput": "文本输入框",
"core.workflow.inputType.textarea": "多行输入框",
"key": "键",
"core.workflow.publish.OnRevert version": "点击回退到该版本",
"core.workflow.publish.OnRevert version confirm": "确认回退至该版本?会为您保存编辑中版本的配置,并为回退版本创建一个新的发布版本。",
"core.workflow.publish.histories": "发布记录",
@ -736,7 +741,7 @@
"core.workflow.template.Search": "搜索",
"core.workflow.tool.Handle": "工具连接器",
"core.workflow.tool.Select Tool": "选择工具",
"core.workflow.value": "值",
"value": "值",
"core.workflow.variable": "变量",
"create": "去创建",
"create_failed": "创建失败",
@ -797,6 +802,8 @@
"delete_success": "删除成功",
"delete_warning": "删除警告",
"embedding_model_not_config": "检测到没有可用的索引模型",
"enable_auth": "启用鉴权",
"error.Create failed": "创建失败",
"error.code_error": "验证码错误",
"error.fileNotFound": "文件找不到了~",
"error.inheritPermissionError": "权限继承错误",
@ -827,6 +834,7 @@
"get_QR_failed": "获取二维码失败",
"get_app_failed": "获取应用失败",
"get_laf_failed": "获取Laf函数列表失败",
"had_auth_value": "已填写",
"has_verification": "已验证,点击取消绑定",
"have_done": "已完成",
"import_failed": "导入失败",
@ -1001,6 +1009,7 @@
"save_failed": "保存异常",
"save_success": "保存成功",
"scan_code": "扫码支付",
"secret_tips": "值保存后不会再次明文返回",
"select_file_failed": "选择文件异常",
"select_reference_variable": "选择引用变量",
"select_template": "选择模板",


@ -1,10 +1,12 @@
{
"Invoice_document": "發票文件",
"all": "全部",
"back": "返回",
"bank_account": "開戶帳號",
"bank_name": "開戶銀行",
"bill_detail": "帳單詳細資訊",
"bill_record": "帳單記錄",
"click_to_download": "點擊下載",
"company_address": "公司地址",
"company_phone": "公司電話",
"completed": "已完成",


@ -96,6 +96,10 @@
"all_quotes": "全部引用",
"all_result": "完整結果",
"app_not_version": "該應用未發布過,請先發布應用",
"auth_config": "鑑權配置",
"auth_type": "鑑權類型",
"auth_type.Custom": "自定義",
"auth_type.None": "無",
"back": "返回",
"base_config": "基本設定",
"bill_already_processed": "訂單已處理",
@ -260,7 +264,7 @@
"core.app.have_saved": "已儲存",
"core.app.logs.Source And Time": "來源與時間",
"core.app.more": "檢視更多",
"core.app.name": "名稱",
"name": "名稱",
"core.app.no_app": "還沒有應用程式,快來建立一個吧!",
"core.app.not_saved": "未儲存",
"core.app.outLink.Can Drag": "圖示可拖曳",
@ -728,6 +732,7 @@
"core.workflow.inputType.switch": "開關",
"core.workflow.inputType.textInput": "文字輸入框",
"core.workflow.inputType.textarea": "多行輸入框",
"key": "鍵",
"core.workflow.publish.OnRevert version": "點選回復至此版本",
"core.workflow.publish.OnRevert version confirm": "確認回復至此版本?將為您儲存編輯中版本的設定,並為回復版本建立一個新的發布版本。",
"core.workflow.publish.histories": "發布記錄",
@ -736,7 +741,7 @@
"core.workflow.template.Search": "搜尋",
"core.workflow.tool.Handle": "工具聯結器",
"core.workflow.tool.Select Tool": "選擇工具",
"core.workflow.value": "值",
"value": "值",
"core.workflow.variable": "變數",
"create": "建立",
"create_failed": "建立失敗",
@ -797,6 +802,8 @@
"delete_success": "刪除成功",
"delete_warning": "刪除警告",
"embedding_model_not_config": "偵測到沒有可用的索引模型",
"enable_auth": "啟用鑑權",
"error.Create failed": "建立失敗",
"error.code_error": "驗證碼錯誤",
"error.fileNotFound": "找不到檔案",
"error.inheritPermissionError": "繼承權限錯誤",
@ -827,6 +834,7 @@
"get_QR_failed": "取得 QR Code 失敗",
"get_app_failed": "取得應用程式失敗",
"get_laf_failed": "取得 LAF 函式清單失敗",
"had_auth_value": "已填寫",
"has_verification": "已驗證,點選解除綁定",
"have_done": "已完成",
"import_failed": "匯入失敗",
@ -1001,6 +1009,7 @@
"save_failed": "儲存失敗",
"save_success": "儲存成功",
"scan_code": "掃碼支付",
"secret_tips": "值保存後不會再次明文返回",
"select_file_failed": "選擇檔案失敗",
"select_reference_variable": "選擇引用變數",
"select_template": "選擇範本",


@ -255,7 +255,6 @@ const Button = defineStyleConfig({
grayGhost: {
color: 'myGray.500',
fontWeight: '500',
p: 0,
bg: 'transparent',
transition: 'background 0.1s',
_hover: {
@ -816,7 +815,8 @@ export const theme = extendTheme({
md: '0.5rem',
semilg: '0.625rem',
lg: '0.75rem',
xl: '1rem'
xl: '1rem',
xxl: '1.25rem'
},
shadows: {
1: '0px 1px 2px 0px rgba(19, 51, 107, 0.05), 0px 0px 1px 0px rgba(19, 51, 107, 0.08)',


@ -5,6 +5,8 @@ DEFAULT_ROOT_PSW=123456
DB_MAX_LINK=5
# Key used to sign file-reading tokens
FILE_TOKEN_KEY=filetokenkey
# Key used to encrypt stored secrets
AES256_SECRET_KEY=fastgptsecret
# root key, highest privilege
ROOT_KEY=fdafasd
# OpenAI base URL; can also be used as a relay.


@ -26,9 +26,7 @@
"usedInExtractFields": true, // true
"usedInToolCall": true, // true
"toolChoice": true, //
"functionCall": false, // 使 toolChoicefalse使 functionCall false使
"customCQPrompt": "", //
"customExtractPrompt": "", //
"functionCall": false, // 使 toolChoicefalse使 functionCall false使 // //
"defaultSystemChatPrompt": "", //
"defaultConfig": {}, // API GLM4 top_p
"fieldMap": {} // o1 max_tokens max_completion_tokens
@ -50,8 +48,6 @@
"usedInToolCall": true,
"toolChoice": true,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {}
@ -73,8 +69,6 @@
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,
@ -99,8 +93,6 @@
"usedInToolCall": true,
"toolChoice": false,
"functionCall": false,
"customCQPrompt": "",
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,


@ -1,6 +1,6 @@
{
"name": "app",
"version": "4.9.11",
"version": "4.9.12",
"private": false,
"scripts": {
"dev": "next dev",


@ -0,0 +1,381 @@
import type { ButtonProps } from '@chakra-ui/react';
import {
Box,
Button,
Flex,
FormControl,
IconButton,
Input,
ModalBody,
ModalFooter,
Slider,
useDisclosure
} from '@chakra-ui/react';
import { HeaderSecretTypeEnum } from '@fastgpt/global/common/secret/constants';
import type { SecretValueType, StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import React, { useEffect, useMemo, useState } from 'react';
import { useFieldArray, useForm, type UseFormRegister } from 'react-hook-form';
import { useTranslation } from 'next-i18next';
import MyIcon from '@fastgpt/web/components/common/Icon';
import MyModal from '@fastgpt/web/components/common/MyModal';
import MySelect from '@fastgpt/web/components/common/MySelect';
type HeaderSecretConfigType = {
Bearer?: SecretValueType;
Basic?: SecretValueType;
customs?: {
key: string;
value: SecretValueType;
}[];
};
const getShowInput = ({
secretValue,
editingIndex,
index
}: {
secretValue?: SecretValueType;
editingIndex?: number;
index: number;
}) => {
const hasSecret = !!secretValue?.secret;
const hasValue = !!secretValue?.value;
const isEditing = editingIndex === index;
return !hasSecret || hasValue || isEditing;
};
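The three cases of `getShowInput` can be exercised standalone (same logic as above, restated with a flat signature for brevity): the plaintext input is shown unless a secret is already stored, has no new plaintext value, and the row is not the one being edited.

```typescript
// Standalone restatement of the getShowInput decision above.
type SecretValue = { secret?: string; value?: string };

const showInput = (
  secretValue: SecretValue | undefined,
  editingIndex: number | undefined,
  index: number
) => {
  const hasSecret = !!secretValue?.secret;
  const hasValue = !!secretValue?.value;
  return !hasSecret || hasValue || editingIndex === index;
};

console.log(showInput(undefined, undefined, 0)); // true: nothing stored yet
console.log(showInput({ secret: 'ciphertext' }, undefined, 0)); // false: stored secret stays masked
console.log(showInput({ secret: 'ciphertext' }, 0, 0)); // true: this row is being edited
```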
const AuthValueDisplay = ({
showInput,
fieldName,
index = 0,
onEdit,
register
}: {
showInput: boolean;
fieldName: string;
index?: number;
onEdit: (index?: number) => void;
register: UseFormRegister<HeaderSecretConfigType>;
}) => {
const { t } = useTranslation();
return (
<Flex>
{showInput ? (
<FormControl flex={1}>
<Input
placeholder={'Value'}
bg={'myGray.50'}
h={8}
maxLength={200}
{...register(fieldName as any, {
required: true
})}
onFocus={() => onEdit(index)}
onBlur={() => onEdit(undefined)}
/>
</FormControl>
) : (
<Flex
flex={1}
borderRadius={'6px'}
border={'0.5px solid'}
borderColor={'primary.200'}
bg={'primary.50'}
h={8}
px={3}
alignItems={'center'}
gap={1}
>
<MyIcon name="checkCircle" w={'16px'} color={'primary.600'} />
<Box fontSize={'sm'} fontWeight={'medium'} color={'primary.600'}>
{t('common:had_auth_value')}
</Box>
</Flex>
)}
{!showInput && (
<IconButton
aria-label="Edit header"
icon={<MyIcon name="edit" w={'16px'} />}
size="sm"
variant="ghost"
color={'myGray.500'}
_hover={{ color: 'primary.600' }}
onClick={() => onEdit(index)}
/>
)}
</Flex>
);
};
const getSecretType = (config: HeaderSecretConfigType): HeaderSecretTypeEnum => {
if (config.Bearer) {
return HeaderSecretTypeEnum.Bearer;
} else if (config.Basic) {
return HeaderSecretTypeEnum.Basic;
} else if (config.customs && config.customs.length > 0) {
return HeaderSecretTypeEnum.Custom;
}
return HeaderSecretTypeEnum.None;
};
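The priority order in `getSecretType` above (Bearer, then Basic, then Custom for a non-empty `customs` array, else None) can be restated self-contained; the enum values are assumed to be their own names, matching how they are stored.

```typescript
// Standalone restatement of getSecretType's priority order.
enum HeaderSecretTypeEnum {
  None = 'None',
  Bearer = 'Bearer',
  Basic = 'Basic',
  Custom = 'Custom'
}
type SecretValue = { secret: string; value: string };
type HeaderSecretConfig = {
  Bearer?: SecretValue;
  Basic?: SecretValue;
  customs?: { key: string; value: SecretValue }[];
};

const secretType = (config: HeaderSecretConfig): HeaderSecretTypeEnum => {
  if (config.Bearer) return HeaderSecretTypeEnum.Bearer;
  if (config.Basic) return HeaderSecretTypeEnum.Basic;
  if (config.customs && config.customs.length > 0) return HeaderSecretTypeEnum.Custom;
  return HeaderSecretTypeEnum.None;
};

console.log(secretType({})); // 'None'
console.log(secretType({ customs: [{ key: 'X-Api-Key', value: { secret: 's', value: '' } }] })); // 'Custom'
```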
const HeaderAuthConfig = ({
storeHeaderSecretConfig,
onUpdate,
buttonProps
}: {
storeHeaderSecretConfig?: StoreSecretValueType;
onUpdate: (data: StoreSecretValueType) => void;
buttonProps?: ButtonProps;
}) => {
const { t } = useTranslation();
const headerSecretList = [
{
label: t('common:auth_type.None'),
value: HeaderSecretTypeEnum.None
},
{
label: 'Bearer',
value: HeaderSecretTypeEnum.Bearer
},
{
label: 'Basic',
value: HeaderSecretTypeEnum.Basic
},
{
label: t('common:auth_type.Custom'),
value: HeaderSecretTypeEnum.Custom
}
];
const { isOpen, onOpen, onClose } = useDisclosure();
const headerSecretValue: HeaderSecretConfigType = useMemo(() => {
if (!storeHeaderSecretConfig || Object.keys(storeHeaderSecretConfig).length === 0) {
return {};
}
const entries = Object.entries(storeHeaderSecretConfig);
const [key, value] = entries[0];
if (
entries.length === 1 &&
(key === HeaderSecretTypeEnum.Bearer || key === HeaderSecretTypeEnum.Basic)
) {
return {
[key]: {
secret: value.secret,
value: value.value
}
};
}
return {
customs: entries.map(([key, value]) => ({
key,
value: {
secret: value.secret,
value: value.value
}
}))
};
}, [storeHeaderSecretConfig]);
const [currentAuthType, setCurrentAuthType] = useState<HeaderSecretTypeEnum>(
getSecretType(headerSecretValue)
);
const [editingIndex, setEditingIndex] = useState<number>();
const { control, register, watch, handleSubmit, reset } = useForm<HeaderSecretConfigType>({
defaultValues: {
Basic: headerSecretValue?.Basic || { secret: '', value: '' },
Bearer: headerSecretValue?.Bearer || { secret: '', value: '' },
customs: headerSecretValue?.customs || []
}
});
const {
fields: customHeaders,
append: appendHeader,
remove: removeHeader
} = useFieldArray({
control,
name: 'customs'
});
const BearerValue = watch('Bearer');
const BasicValue = watch('Basic');
// Add default custom
useEffect(() => {
if (currentAuthType === HeaderSecretTypeEnum.Custom && customHeaders.length === 0) {
appendHeader({ key: '', value: { secret: '', value: '' } });
}
}, [currentAuthType, customHeaders.length, appendHeader]);
const onSubmit = async (data: HeaderSecretConfigType) => {
if (!headerSecretValue) return;
const storeData: StoreSecretValueType = {};
if (currentAuthType === HeaderSecretTypeEnum.Bearer) {
storeData.Bearer = {
value: data.Bearer?.value || '',
secret: data.Bearer?.secret || ''
};
} else if (currentAuthType === HeaderSecretTypeEnum.Basic) {
storeData.Basic = {
value: data.Basic?.value || '',
secret: data.Basic?.secret || ''
};
} else if (currentAuthType === HeaderSecretTypeEnum.Custom) {
data.customs?.forEach((item) => {
storeData[item.key] = item.value;
});
}
onUpdate(storeData);
onClose();
};
return (
<>
<Button
variant={'grayGhost'}
borderRadius={'md'}
{...buttonProps}
leftIcon={<MyIcon name={'common/setting'} w={4} />}
onClick={onOpen}
>
{t('common:auth_config')}
</Button>
{isOpen && (
<MyModal
isOpen={isOpen}
onClose={onClose}
iconSrc={'common/setting'}
iconColor={'primary.600'}
title={t('common:auth_config')}
w={480}
>
<ModalBody px={9}>
<FormControl mb={2}>
<Box fontSize={'14px'} fontWeight={'medium'} color={'myGray.900'} mb={2}>
{t('common:auth_type')}
</Box>
<MySelect
bg={'myGray.50'}
value={currentAuthType}
onChange={setCurrentAuthType}
list={headerSecretList}
/>
</FormControl>
{currentAuthType !== HeaderSecretTypeEnum.None && (
<Flex mb={2} gap={2} color={'myGray.900'} fontWeight={'medium'} fontSize={'14px'}>
{currentAuthType === HeaderSecretTypeEnum.Custom && (
<Box w={1 / 3}>{t('common:key')}</Box>
)}
<Box w={2 / 3}>{t('common:value')}</Box>
</Flex>
)}
{currentAuthType !== HeaderSecretTypeEnum.None && (
<>
{currentAuthType === HeaderSecretTypeEnum.Bearer ||
currentAuthType === HeaderSecretTypeEnum.Basic ? (
<AuthValueDisplay
key={currentAuthType}
showInput={getShowInput({
secretValue:
currentAuthType === HeaderSecretTypeEnum.Bearer ? BearerValue : BasicValue,
editingIndex,
index: 0
})}
fieldName={`${currentAuthType}.value` as any}
onEdit={setEditingIndex}
register={register}
/>
) : (
<Box>
{customHeaders.map((item, index) => {
const headerValue = watch(`customs.${index}.value`);
return (
<Flex key={item.id} mb={2} align="center">
<Input
w={1 / 3}
h={8}
bg="myGray.50"
placeholder="key"
maxLength={20}
{...register(`customs.${index}.key`, {
required: true
})}
/>
<Box w={2 / 3} ml={2}>
<AuthValueDisplay
showInput={getShowInput({
secretValue: headerValue,
editingIndex,
index
})}
fieldName={`customs.${index}.value.value`}
index={index}
onEdit={setEditingIndex}
register={register}
/>
</Box>
{customHeaders.length > 1 && (
<IconButton
aria-label="Remove header"
icon={<MyIcon name="delete" w="16px" />}
size="sm"
variant="ghost"
color={'myGray.500'}
_hover={{ color: 'red.500' }}
isDisabled={customHeaders.length <= 1}
onClick={() => removeHeader(index)}
/>
)}
</Flex>
);
})}
<Button
leftIcon={<MyIcon name="common/addLight" w="16px" />}
variant="whiteBase"
minH={8}
h={8}
onClick={() => appendHeader({ key: '', value: { secret: '', value: '' } })}
>
{t('common:add_new')}
</Button>
</Box>
)}
</>
)}
</ModalBody>
<ModalFooter px={9} display={'flex'} flexDirection={'column'}>
<Flex justifyContent={'end'} w={'full'}>
<Button onClick={handleSubmit(onSubmit)}>{t('common:Save')}</Button>
</Flex>
</ModalFooter>
<Box
borderTop={'sm'}
color={'myGray.500'}
bg={'myGray.50'}
fontSize={'xs'}
textAlign={'center'}
py={2}
borderBottomRadius={'md'}
>
{t('common:secret_tips')}
</Box>
</MyModal>
)}
</>
);
};
export default React.memo(HeaderAuthConfig);
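The `onSubmit` above collapses the form state into a flat key→value map keyed by the selected auth type: Bearer and Basic live under a fixed key, while Custom spreads each user-defined header into the map. A minimal standalone sketch of that mapping (the `buildStoreData` name and the abbreviated types are illustrative, not from the PR; the real types live in `@fastgpt/global`):

```typescript
type SecretValue = { secret: string; value: string };
type StoreSecret = Record<string, SecretValue>;

// Bearer/Basic are stored under a fixed key; Custom spreads each
// user-defined header key into the map, mirroring onSubmit above.
function buildStoreData(
  authType: 'Bearer' | 'Basic' | 'Custom',
  form: {
    Bearer?: SecretValue;
    Basic?: SecretValue;
    customs?: { key: string; value: SecretValue }[];
  }
): StoreSecret {
  const storeData: StoreSecret = {};
  if (authType === 'Bearer') {
    storeData.Bearer = form.Bearer ?? { secret: '', value: '' };
  } else if (authType === 'Basic') {
    storeData.Basic = form.Basic ?? { secret: '', value: '' };
  } else {
    form.customs?.forEach((item) => {
      storeData[item.key] = item.value;
    });
  }
  return storeData;
}
```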

View File

@@ -26,7 +26,7 @@ const WelcomeTextConfig = (props: TextareaProps) => {
rows={6}
fontSize={'sm'}
bg={'myGray.50'}
minW={'384px'}
minW={['auto', '384px']}
placeholder={t('common:core.app.tip.welcomeTextTip')}
autoHeight
minH={100}

View File

@@ -1,6 +1,5 @@
import { useSystemStore } from '@/web/common/system/useSystemStore';
import { Box, Flex, Spinner, Textarea } from '@chakra-ui/react';
import React, { useRef, useEffect, useCallback, useMemo, useState } from 'react';
import { Box, Flex, Textarea } from '@chakra-ui/react';
import React, { useRef, useCallback, useMemo, useState } from 'react';
import { useTranslation } from 'next-i18next';
import MyTooltip from '@fastgpt/web/components/common/MyTooltip';
import MyIcon from '@fastgpt/web/components/common/Icon';
@@ -49,6 +48,9 @@ const ChatInput = ({
const { setValue, watch, control } = chatForm;
const inputValue = watch('input');
// Check voice input state
const [mobilePreSpeak, setMobilePreSpeak] = useState(false);
const outLinkAuthData = useContextSelector(ChatBoxContext, (v) => v.outLinkAuthData);
const appId = useContextSelector(ChatBoxContext, (v) => v.appId);
const chatId = useContextSelector(ChatBoxContext, (v) => v.chatId);
@@ -108,146 +110,203 @@ const ChatInput = ({
const RenderTextarea = useMemo(
() => (
<Flex alignItems={'flex-end'} mt={fileList.length > 0 ? 1 : 0} pl={[2, 4]}>
{/* file selector */}
{(showSelectFile || showSelectImg) && (
<Flex
h={'22px'}
alignItems={'center'}
justifyContent={'center'}
cursor={'pointer'}
transform={'translateY(1px)'}
onClick={() => {
onOpenSelectFile();
<Flex direction={'column'} mt={fileList.length > 0 ? 1 : 0}>
{/* Textarea */}
<Flex w={'100%'}>
{/* Prompt Container */}
<Textarea
ref={TextareaDom}
py={0}
mx={[2, 4]}
px={2}
border={'none'}
_focusVisible={{
border: 'none'
}}
>
<MyTooltip label={selectFileLabel}>
<MyIcon name={selectFileIcon as any} w={'18px'} color={'myGray.600'} />
</MyTooltip>
<File onSelect={(files) => onSelectFile({ files })} />
</Flex>
)}
{/* input area */}
<Textarea
ref={TextareaDom}
py={0}
pl={2}
pr={['30px', '48px']}
border={'none'}
_focusVisible={{
border: 'none'
}}
placeholder={
isPc ? t('common:core.chat.Type a message') : t('chat:input_placeholder_phone')
}
resize={'none'}
rows={1}
height={'22px'}
lineHeight={'22px'}
maxHeight={'50vh'}
maxLength={-1}
overflowY={'auto'}
whiteSpace={'pre-wrap'}
wordBreak={'break-all'}
boxShadow={'none !important'}
color={'myGray.900'}
fontSize={['md', 'sm']}
value={inputValue}
onChange={(e) => {
const textarea = e.target;
textarea.style.height = textareaMinH;
textarea.style.height = `${textarea.scrollHeight}px`;
setValue('input', textarea.value);
}}
onKeyDown={(e) => {
// Enter sends (on PC or inside an iframe, when Enter is pressed without Shift)
const isEnter = e.keyCode === 13;
if (isEnter && TextareaDom.current && (e.ctrlKey || e.altKey)) {
// Add a new line
const index = TextareaDom.current.selectionStart;
const val = TextareaDom.current.value;
TextareaDom.current.value = `${val.slice(0, index)}\n${val.slice(index)}`;
TextareaDom.current.selectionStart = index + 1;
TextareaDom.current.selectionEnd = index + 1;
TextareaDom.current.style.height = textareaMinH;
TextareaDom.current.style.height = `${TextareaDom.current.scrollHeight}px`;
return;
placeholder={
isPc ? t('common:core.chat.Type a message') : t('chat:input_placeholder_phone')
}
resize={'none'}
rows={1}
height={[5, 6]}
lineHeight={[5, 6]}
maxHeight={[24, 32]}
mb={0}
maxLength={-1}
overflowY={'hidden'}
overflowX={'hidden'}
whiteSpace={'pre-wrap'}
wordBreak={'break-word'}
boxShadow={'none !important'}
color={'myGray.900'}
fontWeight={400}
fontSize={'1rem'}
letterSpacing={'0.5px'}
w={'100%'}
_placeholder={{
color: '#707070',
fontSize: 'sm'
}}
value={inputValue}
onChange={(e) => {
const textarea = e.target;
textarea.style.height = textareaMinH;
const maxHeight = 128;
const newHeight = Math.min(textarea.scrollHeight, maxHeight);
textarea.style.height = `${newHeight}px`;
// Select all content
// @ts-ignore
e.key === 'a' && e.ctrlKey && e.target?.select();
if ((isPc || window !== parent) && e.keyCode === 13 && !e.shiftKey) {
handleSend();
e.preventDefault();
}
}}
onPaste={(e) => {
const clipboardData = e.clipboardData;
if (clipboardData && (showSelectFile || showSelectImg)) {
const items = clipboardData.items;
const files = Array.from(items)
.map((item) => (item.kind === 'file' ? item.getAsFile() : undefined))
.filter((file) => {
return file && fileTypeFilter(file);
}) as File[];
onSelectFile({ files });
if (files.length > 0) {
e.preventDefault();
e.stopPropagation();
// Only show scrollbar when content exceeds max height
if (textarea.scrollHeight > maxHeight) {
textarea.style.overflowY = 'auto';
} else {
textarea.style.overflowY = 'hidden';
}
}
}}
/>
<Flex
alignItems={'center'}
position={'absolute'}
right={[2, 4]}
bottom={['10px', '12px']}
zIndex={3}
>
{/* Voice input icon */}
{whisperConfig?.open && !inputValue && (
<MyTooltip label={t('common:core.chat.Record')}>
<Flex
alignItems={'center'}
justifyContent={'center'}
flexShrink={0}
h={['28px', '32px']}
w={['28px', '32px']}
mr={2}
borderRadius={'md'}
cursor={'pointer'}
_hover={{ bg: '#F5F5F8' }}
onClick={() => {
VoiceInputRef.current?.onSpeak?.();
}}
>
<MyIcon
name={'core/chat/recordFill'}
width={['22px', '25px']}
height={['22px', '25px']}
color={'myGray.600'}
/>
</Flex>
</MyTooltip>
setValue('input', textarea.value);
}}
onKeyDown={(e) => {
// Enter sends (on PC or inside an iframe, when Enter is pressed without Shift)
const isEnter = e.key === 'Enter';
if (isEnter && TextareaDom.current && (e.ctrlKey || e.altKey)) {
// Add a new line
const index = TextareaDom.current.selectionStart;
const val = TextareaDom.current.value;
TextareaDom.current.value = `${val.slice(0, index)}\n${val.slice(index)}`;
TextareaDom.current.selectionStart = index + 1;
TextareaDom.current.selectionEnd = index + 1;
TextareaDom.current.style.height = textareaMinH;
TextareaDom.current.style.height = `${TextareaDom.current.scrollHeight}px`;
return;
}
// Select all content
// @ts-ignore
e.key === 'a' && e.ctrlKey && e.target?.select();
if ((isPc || window !== parent) && isEnter && !e.shiftKey) {
handleSend();
e.preventDefault();
}
}}
onPaste={(e) => {
const clipboardData = e.clipboardData;
if (clipboardData && (showSelectFile || showSelectImg)) {
const items = clipboardData.items;
const files = Array.from(items)
.map((item) => (item.kind === 'file' ? item.getAsFile() : undefined))
.filter((file) => {
return file && fileTypeFilter(file);
}) as File[];
onSelectFile({ files });
if (files.length > 0) {
e.preventDefault();
e.stopPropagation();
}
}
}}
/>
</Flex>
</Flex>
),
[
TextareaDom,
fileList.length,
handleSend,
inputValue,
isPc,
onSelectFile,
setValue,
showSelectFile,
showSelectImg,
t
]
);
const RenderButtonGroup = useMemo(() => {
const iconSize = {
w: isPc ? '20px' : '16px',
h: isPc ? '20px' : '16px'
};
return (
<Flex
alignItems={'center'}
justifyContent={'flex-end'}
w={'100%'}
mt={0}
pr={[3, 4]}
h={[8, 9]}
gap={[0, 1]}
>
{/* Attachment and Voice Group */}
<Flex alignItems={'center'} h={[8, 9]}>
{/* file selector button */}
{(showSelectFile || showSelectImg) && (
<Flex
alignItems={'center'}
justifyContent={'center'}
w={[8, 9]}
h={[8, 9]}
p={[1, 2]}
borderRadius={'sm'}
cursor={'pointer'}
_hover={{ bg: 'rgba(0, 0, 0, 0.04)' }}
onClick={() => {
onOpenSelectFile();
}}
>
<MyTooltip label={selectFileLabel}>
<MyIcon name={selectFileIcon as any} {...iconSize} color={'#707070'} />
</MyTooltip>
<File onSelect={(files) => onSelectFile({ files })} />
</Flex>
)}
{/* send and stop icon */}
{/* Voice input button */}
{whisperConfig?.open && !inputValue && (
<Flex
alignItems={'center'}
justifyContent={'center'}
w={[8, 9]}
h={[8, 9]}
p={[1, 2]}
borderRadius={'sm'}
cursor={'pointer'}
_hover={{ bg: 'rgba(0, 0, 0, 0.04)' }}
onClick={() => {
VoiceInputRef.current?.onSpeak?.();
}}
>
<MyTooltip label={t('common:core.chat.Record')}>
<MyIcon name={'core/chat/recordFill'} {...iconSize} color={'#707070'} />
</MyTooltip>
</Flex>
)}
</Flex>
{/* Divider Container */}
{((whisperConfig?.open && !inputValue) || showSelectFile || showSelectImg) && (
<Flex alignItems={'center'} justifyContent={'center'} w={2} h={4} mr={2}>
<Box w={'2px'} h={5} bg={'myGray.200'} />
</Flex>
)}
{/* Send Button Container */}
<Flex alignItems={'center'} w={[8, 9]} h={[8, 9]} borderRadius={'lg'}>
<Flex
alignItems={'center'}
justifyContent={'center'}
flexShrink={0}
h={['28px', '32px']}
w={['28px', '32px']}
borderRadius={'md'}
bg={isChatting ? '' : !havInput || hasFileUploading ? '#E5E5E5' : 'primary.500'}
cursor={havInput ? 'pointer' : 'not-allowed'}
lineHeight={1}
w={[7, 9]}
h={[7, 9]}
p={[1, 2]}
bg={
isChatting ? 'primary.50' : canSendMessage ? 'primary.500' : 'rgba(17, 24, 36, 0.1)'
}
borderRadius={['md', 'lg']}
cursor={canSendMessage ? 'pointer' : 'not-allowed'}
onClick={() => {
if (isChatting) {
return onStop();
@@ -256,56 +315,40 @@ const ChatInput = ({
}}
>
{isChatting ? (
<MyIcon
animation={'zoomStopIcon 0.4s infinite alternate'}
width={['22px', '25px']}
height={['22px', '25px']}
cursor={'pointer'}
name={'stop'}
color={'gray.500'}
/>
<MyIcon {...iconSize} name={'stop'} color={'primary.600'} />
) : (
<MyTooltip label={t('common:core.chat.Send Message')}>
<MyIcon
name={'core/chat/sendFill'}
width={['18px', '20px']}
height={['18px', '20px']}
color={'white'}
/>
<MyIcon name={'core/chat/sendFill'} {...iconSize} color={'white'} />
</MyTooltip>
)}
</Flex>
</Flex>
</Flex>
),
[
File,
TextareaDom,
fileList,
handleSend,
hasFileUploading,
havInput,
inputValue,
isChatting,
isPc,
onOpenSelectFile,
onSelectFile,
onStop,
selectFileIcon,
selectFileLabel,
setValue,
showSelectFile,
showSelectImg,
t
]
);
);
}, [
isPc,
showSelectFile,
showSelectImg,
selectFileLabel,
selectFileIcon,
File,
whisperConfig?.open,
inputValue,
t,
isChatting,
canSendMessage,
onOpenSelectFile,
onSelectFile,
handleSend,
onStop
]);
return (
<Box
m={['0 auto', '10px auto']}
m={['0 auto 10px', '10px auto']}
w={'100%'}
maxW={['auto', 'min(800px, 100%)']}
px={[0, 5]}
maxW={['auto', 'min(820px, 100%)']}
px={[3, 5]}
onDragOver={(e) => e.preventDefault()}
onDrop={(e) => {
e.preventDefault();
@@ -331,53 +374,71 @@ const ChatInput = ({
}
}}
>
<Box
pt={fileList.length > 0 ? '0' : ['14px', '18px']}
pb={['14px', '18px']}
{/* Real Chat Input */}
<Flex
direction={'column'}
minH={mobilePreSpeak ? '48px' : ['96px', '120px']}
pt={fileList.length > 0 ? '0' : mobilePreSpeak ? [0, 4] : [3, 4]}
pb={[2, 4]}
position={'relative'}
boxShadow={`0 0 10px rgba(0,0,0,0.2)`}
borderRadius={['none', 'md']}
boxShadow={`0px 5px 16px -4px rgba(19, 51, 107, 0.08)`}
borderRadius={['xl', 'xxl']}
bg={'white'}
overflow={'visible'}
{...(isPc
? {
border: '1px solid',
borderColor: 'rgba(0,0,0,0.12)'
}
: {
borderTop: '1px solid',
borderTopColor: 'rgba(0,0,0,0.15)'
})}
border={'0.5px solid rgba(0, 0, 0, 0.15)'}
borderColor={'rgba(0,0,0,0.12)'}
>
{/* Chat input guide box */}
{chatInputGuide.open && (
<InputGuideBox
appId={appId}
text={inputValue}
onSelect={(e) => {
setValue('input', e);
}}
onSend={(e) => {
handleSend(e);
}}
/>
)}
{/* file preview */}
<Box px={[1, 3]}>
<FilePreview fileList={fileList} removeFiles={removeFiles} />
<Box flex={1}>
{/* Chat input guide box */}
{chatInputGuide.open && (
<InputGuideBox
appId={appId}
text={inputValue}
onSelect={(e) => {
setValue('input', e);
}}
onSend={(e) => {
handleSend(e);
}}
/>
)}
{/* file preview */}
{(!mobilePreSpeak || isPc || inputValue) && (
<Box px={[2, 3]}>
<FilePreview fileList={fileList} removeFiles={removeFiles} />
</Box>
)}
{/* loading spinner */}
{/* voice input and loading container */}
{!inputValue && (
<VoiceInput
ref={VoiceInputRef}
handleSend={(text) => {
onSendMessage({
text: text.trim(),
files: fileList
});
replaceFiles([]);
}}
resetInputVal={(val) => {
setMobilePreSpeak(false);
resetInputVal({
text: val,
files: fileList
});
}}
mobilePreSpeak={mobilePreSpeak}
setMobilePreSpeak={setMobilePreSpeak}
/>
)}
{RenderTextarea}
</Box>
{/* voice input and loading container */}
{!inputValue && (
<VoiceInput
ref={VoiceInputRef}
onSendMessage={onSendMessage}
resetInputVal={resetInputVal}
/>
)}
{RenderTextarea}
</Box>
{!mobilePreSpeak && <Box>{RenderButtonGroup}</Box>}
</Flex>
<ComplianceTip type={'chat'} />
</Box>
);
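The `onKeyDown` handlers above route Enter three ways: Ctrl/Alt+Enter inserts a newline manually, a plain Enter sends on PC or inside an iframe, and Shift+Enter falls through to the textarea's default newline. A pure-function sketch of that routing (the `decideEnterAction` helper is hypothetical, not part of the component):

```typescript
type EnterKeyState = {
  key: string;
  ctrlKey: boolean;
  altKey: boolean;
  shiftKey: boolean;
};

// Mirrors the branch order in ChatInput's onKeyDown:
// Ctrl/Alt+Enter -> manual newline, plain Enter on PC/iframe -> send,
// anything else -> let the textarea handle it.
function decideEnterAction(
  e: EnterKeyState,
  isPcOrIframe: boolean
): 'newline' | 'send' | 'default' {
  if (e.key !== 'Enter') return 'default';
  if (e.ctrlKey || e.altKey) return 'newline';
  if (isPcOrIframe && !e.shiftKey) return 'send';
  return 'default';
}
```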

View File

@@ -6,8 +6,7 @@ import React, {
useCallback,
useState,
forwardRef,
useImperativeHandle,
useMemo
useImperativeHandle
} from 'react';
import { useTranslation } from 'next-i18next';
import MyTooltip from '@fastgpt/web/components/common/MyTooltip';
@@ -20,11 +19,14 @@ import { isMobile } from '@fastgpt/web/common/system/utils';
export interface VoiceInputComponentRef {
onSpeak: () => void;
getVoiceInputState: () => { isSpeaking: boolean; isTransCription: boolean };
}
type VoiceInputProps = {
onSendMessage: (params: { text: string; files?: any[]; autoTTSResponse?: boolean }) => void;
resetInputVal: (val: { text: string }) => void;
handleSend: (val: string) => void;
resetInputVal: (val: string) => void;
mobilePreSpeak: boolean;
setMobilePreSpeak: React.Dispatch<React.SetStateAction<boolean>>;
};
// PC voice input
@@ -40,38 +42,104 @@ const PCVoiceInput = ({
const { t } = useTranslation();
return (
<HStack h={'100%'} px={4}>
<Box fontSize="sm" color="myGray.500" flex={'1 0 0'}>
{t('common:core.chat.Speaking')}
</Box>
<canvas
ref={canvasRef}
style={{
height: '10px',
width: '100px',
background: 'white'
}}
/>
<Box fontSize="sm" color="myGray.500" whiteSpace={'nowrap'}>
{speakingTimeString}
</Box>
<MyTooltip label={t('common:core.chat.Cancel Speak')}>
<MyIconButton
name={'core/chat/cancelSpeak'}
h={'22px'}
w={'22px'}
onClick={() => stopSpeak(true)}
<Box
position="absolute"
top={0}
left={0}
right={0}
bottom={0}
bg="rgba(255, 255, 255, 0.3)"
backdropFilter="blur(8px)"
borderRadius="xxl"
zIndex={10}
display="flex"
alignItems="center"
justifyContent="center"
flexDirection="column"
>
{/* Center Waveform Area */}
<Flex
position="absolute"
top="50%"
left="0"
right="0"
transform="translateY(-80%)"
alignItems="center"
justifyContent="center"
direction="column"
gap={1}
w="100%"
>
<Box fontSize="sm" color="myGray.600" fontWeight="500">
{t('common:core.chat.Speaking')}
</Box>
<canvas
ref={canvasRef}
style={{
height: '32px',
width: '90%',
background: 'transparent'
}}
/>
</MyTooltip>
<MyTooltip label={t('common:core.chat.Finish Speak')}>
<MyIconButton
name={'core/chat/finishSpeak'}
h={'22px'}
w={'22px'}
onClick={() => stopSpeak(false)}
/>
</MyTooltip>
</HStack>
</Flex>
{/* Action Buttons - Bottom */}
<Flex position="absolute" right={4} bottom={3.5} alignItems="center" gap={2} h={9}>
{/* Time Display */}
<Box
fontSize="sm"
color="myGray.600"
mr={2}
bg="rgba(255, 255, 255, 0.9)"
px={2}
py={1}
borderRadius="md"
fontWeight="500"
>
{speakingTimeString}
</Box>
{/* Cancel Button */}
<MyTooltip label={t('common:core.chat.Cancel Speak')}>
<Flex
w={9}
h={9}
alignItems="center"
justifyContent="center"
border="sm"
borderRadius="lg"
cursor="pointer"
bg="rgba(255, 255, 255, 0.95)"
boxShadow="0 2px 8px rgba(0, 0, 0, 0.1)"
_hover={{ bg: 'white', transform: 'scale(1.05)' }}
transition="all 0.2s"
onClick={() => stopSpeak(true)}
>
<MyIcon name={'close'} w={5} h={5} color={'myGray.500'} />
</Flex>
</MyTooltip>
{/* Confirm Button */}
<MyTooltip label={t('common:core.chat.Finish Speak')}>
<Flex
w={9}
h={9}
alignItems="center"
justifyContent="center"
border="sm"
borderRadius="lg"
cursor="pointer"
bg="rgba(255, 255, 255, 0.95)"
boxShadow="0 2px 8px rgba(0, 0, 0, 0.1)"
_hover={{ bg: 'white', transform: 'scale(1.05)' }}
transition="all 0.2s"
onClick={() => stopSpeak(false)}
>
<MyIcon name={'check'} w={5} h={5} color={'myGray.500'} />
</Flex>
</MyTooltip>
</Flex>
</Box>
);
};
@@ -120,9 +188,9 @@ const MobileVoiceInput = ({
const currentY = touch.pageY;
const deltaY = startYRef.current - currentY;
if (deltaY > 90) {
if (deltaY > 60) {
setIsCancel(true);
} else if (deltaY <= 90) {
} else if (deltaY <= 60) {
setIsCancel(false);
}
},
@@ -157,8 +225,8 @@ const MobileVoiceInput = ({
transform={'translateY(-50%)'}
zIndex={5}
name={'core/chat/backText'}
h={'22px'}
w={'22px'}
h={6}
w={6}
onClick={onCloseSpeak}
/>
</MyTooltip>
@@ -166,9 +234,10 @@ const MobileVoiceInput = ({
<Flex
alignItems={'center'}
justifyContent={'center'}
h="100%"
flex="1 0 0"
bg={isSpeaking ? (isCancel ? 'red.500' : 'primary.500') : 'white'}
bg={isSpeaking ? (isCancel ? 'red.500' : 'primary.500') : 'rgba(255, 255, 255, 0.95)'}
backdropFilter={!isSpeaking ? 'blur(4px)' : 'none'}
borderRadius="xxl"
onTouchMove={handleTouchMove}
onTouchEnd={handleTouchEnd}
onTouchStart={handleTouchStart}
@@ -199,10 +268,10 @@ const MobileVoiceInput = ({
left={0}
right={0}
bottom={maskBottom}
h={'200px'}
h={'48px'}
bg="linear-gradient(to top, white, rgba(255, 255, 255, 0.7), rgba(255, 255, 255, 0))"
>
<Box fontSize="sm" color="myGray.500" position="absolute" bottom={'10px'}>
<Box fontSize="sm" color="myGray.500" position="absolute" bottom={2.5}>
{isCancel ? t('chat:release_cancel') : t('chat:release_send')}
</Box>
</Flex>
@@ -212,7 +281,7 @@ const MobileVoiceInput = ({
};
const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
({ onSendMessage, resetInputVal }, ref) => {
({ handleSend, resetInputVal, mobilePreSpeak, setMobilePreSpeak }, ref) => {
const { t } = useTranslation();
const isMobileDevice = isMobile();
const { isPc } = useSystem();
@@ -220,7 +289,6 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
const outLinkAuthData = useContextSelector(ChatBoxContext, (v) => v.outLinkAuthData);
const appId = useContextSelector(ChatBoxContext, (v) => v.appId);
const whisperConfig = useContextSelector(ChatBoxContext, (v) => v.whisperConfig);
const autoTTSResponse = useContextSelector(ChatBoxContext, (v) => v.autoTTSResponse);
const canvasRef = useRef<HTMLCanvasElement>(null);
const {
@@ -234,8 +302,6 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
stream
} = useSpeech({ appId, ...outLinkAuthData });
const [mobilePreSpeak, setMobilePreSpeak] = useState(false);
// Canvas render
useEffect(() => {
if (!stream) {
@@ -290,16 +356,13 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
const finishWhisperTranscription = (text: string) => {
if (!text) return;
if (whisperConfig?.autoSend) {
onSendMessage({
text,
autoTTSResponse
});
handleSend(text);
} else {
resetInputVal({ text });
resetInputVal(text);
}
};
startSpeak(finishWhisperTranscription);
}, [autoTTSResponse, onSendMessage, resetInputVal, startSpeak, whisperConfig?.autoSend]);
}, [handleSend, resetInputVal, startSpeak, whisperConfig?.autoSend]);
const onSpeach = useCallback(() => {
if (isMobileDevice) {
@@ -307,9 +370,10 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
} else {
onStartSpeak();
}
}, [isMobileDevice, onStartSpeak]);
}, [isMobileDevice, onStartSpeak, setMobilePreSpeak]);
useImperativeHandle(ref, () => ({
onSpeak: onSpeach
onSpeak: onSpeach,
getVoiceInputState: () => ({ isSpeaking: isSpeaking || mobilePreSpeak, isTransCription })
}));
if (!whisperConfig?.open) return null;
@@ -324,7 +388,7 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
left={0}
right={0}
bottom={0}
bg="white"
bg="transparent"
zIndex={5}
borderRadius={isPc ? 'md' : ''}
onContextMenu={(e) => e.preventDefault()}
@@ -348,15 +412,17 @@ const VoiceInput = forwardRef<VoiceInputComponentRef, VoiceInputProps>(
{isTransCription && (
<Flex
position={'absolute'}
borderRadius="xxl"
top={0}
bottom={0}
left={0}
right={0}
pl={5}
alignItems={'center'}
bg={'white'}
bg={'rgba(255, 255, 255, 0.95)'}
backdropFilter="blur(4px)"
color={'primary.500'}
zIndex={6}
zIndex={15}
>
<Spinner size={'sm'} mr={4} />
{t('common:core.chat.Converting to text')}

View File

@@ -959,6 +959,7 @@ const ChatBox = ({
pb={3}
>
<Box id="chat-container" maxW={['100%', '92%']} h={'100%'} mx={'auto'}>
{/* chat header */}
{showEmpty && <Empty />}
{!!welcomeText && <WelcomeBox welcomeText={welcomeText} />}
{/* variable input */}

View File

@@ -22,7 +22,7 @@ const RenderFilePreview = ({
<Flex
overflow={'visible'}
wrap={'wrap'}
pt={3}
pt={[2, 3]}
userSelect={'none'}
mb={fileList.length > 0 ? 2 : 0}
gap={'6px'}

View File

@@ -1,4 +1,4 @@
import { getInvoiceRecords } from '@/web/support/wallet/bill/invoice/api';
import { getInvoiceRecords, readInvoiceFile } from '@/web/support/wallet/bill/invoice/api';
import MyBox from '@fastgpt/web/components/common/MyBox';
import { useTranslation } from 'next-i18next';
import { useState } from 'react';
@@ -22,6 +22,7 @@ import MyIcon from '@fastgpt/web/components/common/Icon';
import dayjs from 'dayjs';
import { formatStorePrice2Read } from '@fastgpt/global/support/wallet/usage/tools';
import MyModal from '@fastgpt/web/components/common/MyModal';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
const InvoiceTable = () => {
const { t } = useTranslation();
@@ -135,6 +136,29 @@ function InvoiceDetailModal({
onClose: () => void;
}) {
const { t } = useTranslation();
const { runAsync: handleDownloadInvoice } = useRequest2(async (id: string) => {
const fileInfo = await readInvoiceFile(id);
// Decode the base64 payload into a Blob
const byteCharacters = atob(fileInfo.data);
const byteNumbers = new Array(byteCharacters.length);
for (let i = 0; i < byteCharacters.length; i++) {
byteNumbers[i] = byteCharacters.charCodeAt(i);
}
const byteArray = new Uint8Array(byteNumbers);
const blob = new Blob([byteArray], { type: fileInfo.mimeType });
const fileUrl = URL.createObjectURL(blob);
// Open the file in a new tab for preview
window.open(fileUrl, '_blank');
// Release the object URL once the new tab has loaded it
setTimeout(() => {
URL.revokeObjectURL(fileUrl);
}, 1000);
});
return (
<MyModal
maxW={['90vw', '700px']}
@@ -165,6 +189,14 @@ function InvoiceDetailModal({
/>
<LabelItem label={t('account_bill:contact_phone')} value={invoice.contactPhone} />
<LabelItem label={t('account_bill:email_address')} value={invoice.emailAddress} />
{invoice.status === 2 && (
<Flex alignItems={'center'} justify={'space-between'}>
<FormLabel flex={'0 0 120px'}>{t('account_bill:Invoice_document')}</FormLabel>
<Box cursor={'pointer'} onClick={() => handleDownloadInvoice(invoice._id)}>
{t('account_bill:click_to_download')}
</Box>
</Flex>
)}
</Flex>
</ModalBody>
</MyModal>
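`handleDownloadInvoice` above turns the base64 `fileInfo.data` into a `Blob` by decoding with `atob` and copying char codes into a `Uint8Array`. The decoding step in isolation (the `base64ToBytes` name is illustrative; `atob` is assumed available as a global, as it is in browsers and Node 16+):

```typescript
// Decode a base64 string into raw bytes, as the invoice preview
// does before wrapping them in a Blob.
function base64ToBytes(b64: string): Uint8Array {
  const byteCharacters = atob(b64);
  const bytes = new Uint8Array(byteCharacters.length);
  for (let i = 0; i < byteCharacters.length; i++) {
    bytes[i] = byteCharacters.charCodeAt(i);
  }
  return bytes;
}

// In the browser the bytes then become a Blob and an object URL:
// const blob = new Blob([base64ToBytes(fileInfo.data)], { type: fileInfo.mimeType });
// const fileUrl = URL.createObjectURL(blob);
```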

View File

@@ -38,7 +38,6 @@ import { useSystemStore } from '@/web/common/system/useSystemStore';
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
import MyModal from '@fastgpt/web/components/common/MyModal';
import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel';
import { getCQPrompt, getExtractJsonPrompt } from '@fastgpt/global/core/ai/prompt/agent';
export const AddModelButton = ({
onCreate,
@@ -173,6 +172,36 @@ export const ModelEditModal = ({
}
);
const CustomApi = useMemo(
() => (
<>
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.request_url')}</Box>
<QuestionTip label={t('account:model.request_url_tip')} />
</HStack>
</Td>
<Td textAlign={'right'}>
<Input {...register('requestUrl')} {...InputStyles} />
</Td>
</Tr>
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.request_auth')}</Box>
<QuestionTip label={t('account:model.request_auth_tip')} />
</HStack>
</Td>
<Td textAlign={'right'}>
<Input {...register('requestAuth')} {...InputStyles} />
</Td>
</Tr>
</>
),
[]
);
return (
<MyModal
iconSrc={'modal/edit'}
@@ -534,28 +563,7 @@ export const ModelEditModal = ({
</Tr>
</>
)}
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.request_url')}</Box>
<QuestionTip label={t('account:model.request_url_tip')} />
</HStack>
</Td>
<Td textAlign={'right'}>
<Input {...register('requestUrl')} {...InputStyles} />
</Td>
</Tr>
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.request_auth')}</Box>
<QuestionTip label={t('account:model.request_auth_tip')} />
</HStack>
</Td>
<Td textAlign={'right'}>
<Input {...register('requestAuth')} {...InputStyles} />
</Td>
</Tr>
{!isLLMModel && CustomApi}
</Tbody>
</Table>
</TableContainer>
@@ -666,36 +674,6 @@ export const ModelEditModal = ({
<MyTextarea {...register('defaultSystemChatPrompt')} {...InputStyles} />
</Td>
</Tr>
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.custom_cq_prompt')}</Box>
<QuestionTip
label={t('account:model.custom_cq_prompt_tip', {
prompt: getCQPrompt()
})}
/>
</HStack>
</Td>
<Td textAlign={'right'}>
<MyTextarea {...register('customCQPrompt')} {...InputStyles} />
</Td>
</Tr>
<Tr>
<Td>
<HStack spacing={1}>
<Box>{t('account:model.custom_extract_prompt')}</Box>
<QuestionTip
label={t('account:model.custom_extract_prompt_tip', {
prompt: getExtractJsonPrompt()
})}
/>
</HStack>
</Td>
<Td textAlign={'right'}>
<MyTextarea {...register('customExtractPrompt')} {...InputStyles} />
</Td>
</Tr>
<Tr>
<Td>
<HStack spacing={1}>
@@ -723,6 +701,7 @@ export const ModelEditModal = ({
/>
</Td>
</Tr>
{CustomApi}
</Tbody>
</Table>
</TableContainer>

View File

@@ -4,7 +4,7 @@ import { Box, Grid, HStack, useTheme } from '@chakra-ui/react';
import MyBox from '@fastgpt/web/components/common/MyBox';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { useTranslation } from 'next-i18next';
import { addDays } from 'date-fns';
import { addDays, addHours } from 'date-fns';
import dayjs from 'dayjs';
import DateRangePicker, {
type DateRangeType
@@ -47,24 +47,15 @@ const ChartsBoxStyles: BoxProps = {
// Default date range: Past 7 days
const getDefaultDateRange = (): DateRangeType => {
const from = addDays(new Date(), -7);
from.setHours(0, 0, 0, 0);
const from = addHours(new Date(), -24);
from.setMinutes(0, 0, 0); // Zero out minutes, seconds, and milliseconds
const to = new Date();
to.setHours(23, 59, 59, 999);
const to = addHours(new Date(), 1);
to.setMinutes(0, 0, 0); // Zero out minutes, seconds, and milliseconds
return { from, to };
};
const calculateTimeDiffs = (from: Date, to: Date) => {
const startDate = dayjs(from);
const endDate = dayjs(to);
return {
daysDiff: endDate.diff(startDate, 'day'),
hoursDiff: endDate.diff(startDate, 'hour')
};
};
const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
const { t } = useTranslation();
const theme = useTheme();
@@ -89,7 +80,7 @@ const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
}>({
channelId: undefined,
model: undefined,
timespan: 'day',
timespan: 'hour',
dateRange: getDefaultDateRange()
});
@@ -159,22 +150,22 @@ const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
return map;
}, [systemModelList]);
const computeTimespan = (daysDiff: number, hoursDiff: number) => {
const computeTimespan = (hoursDiff: number) => {
const options: { label: string; value: 'minute' | 'hour' | 'day' }[] = [];
if (daysDiff <= 1) {
if (hoursDiff <= 1 * 24) {
options.push({ label: t('account_model:timespan_minute'), value: 'minute' });
}
if (daysDiff < 7) {
if (hoursDiff < 7 * 24) {
options.push({ label: t('account_model:timespan_hour'), value: 'hour' });
}
if (daysDiff >= 1) {
if (hoursDiff >= 1 * 24) {
options.push({ label: t('account_model:timespan_day'), value: 'day' });
}
const defaultTimespan: 'minute' | 'hour' | 'day' = (() => {
if (hoursDiff < 1) {
return 'minute';
} else if (daysDiff < 2) {
} else if (hoursDiff < 2 * 24) {
return 'hour';
} else {
return 'day';
@@ -183,7 +174,7 @@ const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
return { options, defaultTimespan };
};
const [timespanOptions, setTimespanOptions] = useState(computeTimespan(30, 60).options);
const [timespanOptions, setTimespanOptions] = useState(computeTimespan(48).options);
// Handle date range change with automatic timespan adjustment
const handleDateRangeChange = (dateRange: DateRangeType) => {
@@ -191,11 +182,9 @@ const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
// Computed timespan
if (dateRange.from && dateRange.to) {
const { daysDiff, hoursDiff } = calculateTimeDiffs(dateRange.from, dateRange.to);
const { options: newTimespanOptions, defaultTimespan: newDefaultTimespan } = computeTimespan(
daysDiff,
hoursDiff
);
const hoursDiff = dayjs(dateRange.to).diff(dayjs(dateRange.from), 'hour');
const { options: newTimespanOptions, defaultTimespan: newDefaultTimespan } =
computeTimespan(hoursDiff);
setTimespanOptions(newTimespanOptions);
newFilterProps.timespan = newDefaultTimespan;
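After this refactor, `computeTimespan` keys everything off a single hour delta: minute granularity is offered up to 24 h, hour granularity below 7 days, day granularity from 24 h upward, and the default steps from minute to hour to day at the 1 h and 48 h marks. A standalone sketch of that mapping (i18n labels dropped; `computeTimespanOptions` is an illustrative name):

```typescript
type Timespan = 'minute' | 'hour' | 'day';

// Same thresholds as the refactored computeTimespan: both the
// option list and the default depend only on the range in hours.
function computeTimespanOptions(hoursDiff: number): {
  options: Timespan[];
  defaultTimespan: Timespan;
} {
  const options: Timespan[] = [];
  if (hoursDiff <= 24) options.push('minute');
  if (hoursDiff < 7 * 24) options.push('hour');
  if (hoursDiff >= 24) options.push('day');

  const defaultTimespan: Timespan =
    hoursDiff < 1 ? 'minute' : hoursDiff < 2 * 24 ? 'hour' : 'day';

  return { options, defaultTimespan };
}
```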

View File

@@ -52,7 +52,7 @@ const WorkflowVariableModal = ({
<Box h={'1px'} bg={'myGray.150'} my={4}></Box>
<Flex alignItems={'center'}>
<Box fontSize={'14px'} color={'myGray.900'} fontWeight={'medium'}>
{t('common:core.workflow.value')}
{t('common:value')}
</Box>
<Input
ml={8}

View File

@@ -16,10 +16,19 @@ import dynamic from 'next/dynamic';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import Markdown from '@/components/Markdown';
import { postRunMCPTool } from '@/web/core/app/api/plugin';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const JsonEditor = dynamic(() => import('@fastgpt/web/components/common/Textarea/JsonEditor'));
const ChatTest = ({ currentTool, url }: { currentTool: McpToolConfigType | null; url: string }) => {
const ChatTest = ({
currentTool,
url,
headerSecret
}: {
currentTool?: McpToolConfigType;
url: string;
headerSecret: StoreSecretValueType;
}) => {
const { t } = useTranslation();
const [output, setOutput] = useState<string>('');
@@ -42,6 +51,7 @@ const ChatTest = ({ currentTool, url }: { currentTool: McpToolConfigType | null;
return await postRunMCPTool({
params: data,
url,
headerSecret,
toolName: currentTool.name
});
},
@@ -135,7 +145,15 @@ const ChatTest = ({ currentTool, url }: { currentTool: McpToolConfigType | null;
);
};
const Render = ({ currentTool, url }: { currentTool: McpToolConfigType | null; url: string }) => {
const Render = ({
currentTool,
url,
headerSecret
}: {
currentTool?: McpToolConfigType;
url: string;
headerSecret: StoreSecretValueType;
}) => {
const { chatId } = useChatStore();
const { appDetail } = useContextSelector(AppContext, (v) => v);
@@ -157,7 +175,7 @@ const Render = ({ currentTool, url }: { currentTool: McpToolConfigType | null; u
showNodeStatus
>
<ChatRecordContextProvider params={chatRecordProviderParams}>
<ChatTest currentTool={currentTool} url={url} />
<ChatTest currentTool={currentTool} url={url} headerSecret={headerSecret} />
</ChatRecordContextProvider>
</ChatItemContextProvider>
);
@@ -178,7 +196,7 @@ const RenderToolInput = ({
type: string;
description?: string;
};
toolData: McpToolConfigType | null;
toolData?: McpToolConfigType;
value: any;
onChange: (value: any) => void;
isInvalid: boolean;

View File

@ -8,6 +8,7 @@ import ChatTest from './ChatTest';
import MyBox from '@fastgpt/web/components/common/MyBox';
import EditForm from './EditForm';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const Edit = ({
url,
@@ -15,14 +16,18 @@ const Edit = ({
toolList,
setToolList,
currentTool,
setCurrentTool
setCurrentTool,
headerSecret,
setHeaderSecret
}: {
url: string;
setUrl: (url: string) => void;
toolList: McpToolConfigType[];
setToolList: (toolList: McpToolConfigType[]) => void;
currentTool: McpToolConfigType | null;
currentTool?: McpToolConfigType;
setCurrentTool: (tool: McpToolConfigType) => void;
headerSecret: StoreSecretValueType;
setHeaderSecret: (headerSecret: StoreSecretValueType) => void;
}) => {
const { isPc } = useSystem();
@@ -56,12 +61,14 @@ const Edit = ({
setCurrentTool={setCurrentTool}
url={url}
setUrl={setUrl}
headerSecret={headerSecret}
setHeaderSecret={setHeaderSecret}
/>
</Box>
</Flex>
{isPc && (
<Box flex={'2 0 0'} w={0} mb={3}>
<ChatTest currentTool={currentTool} url={url} />
<ChatTest currentTool={currentTool} url={url} headerSecret={headerSecret} />
</Box>
)}
</MyBox>

View File

@ -13,6 +13,8 @@ import Avatar from '@fastgpt/web/components/common/Avatar';
import MyBox from '@fastgpt/web/components/common/MyBox';
import type { getMCPToolsBody } from '@/pages/api/support/mcp/client/getTools';
import { getMCPTools } from '@/web/core/app/api/plugin';
import HeaderAuthConfig from '@/components/common/secret/HeaderAuthConfig';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const EditForm = ({
url,
@@ -20,14 +22,18 @@ const EditForm = ({
toolList,
setToolList,
currentTool,
setCurrentTool
setCurrentTool,
headerSecret,
setHeaderSecret
}: {
url: string;
setUrl: (url: string) => void;
toolList: McpToolConfigType[];
setToolList: (toolList: McpToolConfigType[]) => void;
currentTool: McpToolConfigType | null;
currentTool?: McpToolConfigType;
setCurrentTool: (tool: McpToolConfigType) => void;
headerSecret: StoreSecretValueType;
setHeaderSecret: (headerSecret: StoreSecretValueType) => void;
}) => {
const { t } = useTranslation();
@@ -52,6 +58,14 @@ const EditForm = ({
<FormLabel ml={2} flex={1}>
{t('app:MCP_tools_url')}
</FormLabel>
<HeaderAuthConfig
storeHeaderSecretConfig={headerSecret}
onUpdate={setHeaderSecret}
buttonProps={{
size: 'sm',
variant: 'grayGhost'
}}
/>
</Flex>
<Flex alignItems={'center'} gap={2} mt={3}>
<Input
@@ -66,7 +80,7 @@ const EditForm = ({
h={8}
isLoading={isGettingTools}
onClick={() => {
runGetMCPTools({ url });
runGetMCPTools({ url, headerSecret });
}}
>
{t('common:Parse')}

View File

@@ -10,8 +10,17 @@ import { useRouter } from 'next/router';
import { useSystemStore } from '@/web/common/system/useSystemStore';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { postUpdateMCPTools } from '@/web/core/app/api/plugin';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const Header = ({ url, toolList }: { url: string; toolList: McpToolConfigType[] }) => {
const Header = ({
url,
toolList,
headerSecret
}: {
url: string;
toolList: McpToolConfigType[];
headerSecret: StoreSecretValueType;
}) => {
const { t } = useTranslation();
const appId = useContextSelector(AppContext, (v) => v.appId);
const router = useRouter();
@@ -41,7 +50,7 @@ const Header = ({ url, toolList }: { url: string; toolList: McpToolConfigType[]
const { runAsync: saveMCPTools, loading: isSavingMCPTools } = useRequest2(
async () => {
return await postUpdateMCPTools({ appId, url, toolList });
return await postUpdateMCPTools({ appId, url, toolList, headerSecret });
},
{
successToast: t('common:update_success')

View File

@@ -7,6 +7,7 @@ import { AppContext } from '../context';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { type MCPToolSetData } from '@/pageComponents/dashboard/apps/MCPToolsEditModal';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const MCPTools = () => {
const appDetail = useContextSelector(AppContext, (v) => v.appDetail);
@@ -19,13 +20,12 @@ const MCPTools = () => {
const [url, setUrl] = useState(toolSetData?.url || '');
const [toolList, setToolList] = useState<McpToolConfigType[]>(toolSetData?.toolList || []);
const [currentTool, setCurrentTool] = useState<McpToolConfigType | null>(
toolSetData?.toolList[0] || null
);
const [headerSecret, setHeaderSecret] = useState<StoreSecretValueType>(toolSetData?.headerSecret);
const [currentTool, setCurrentTool] = useState<McpToolConfigType>(toolSetData?.toolList[0]);
return (
<Flex h={'100%'} flexDirection={'column'} px={[3, 0]} pr={[3, 3]}>
<Header url={url} toolList={toolList} />
<Header url={url} toolList={toolList} headerSecret={headerSecret} />
<Edit
url={url}
setUrl={setUrl}
@@ -33,6 +33,8 @@ const MCPTools = () => {
setToolList={setToolList}
currentTool={currentTool}
setCurrentTool={setCurrentTool}
headerSecret={headerSecret}
setHeaderSecret={setHeaderSecret}
/>
</Flex>
);

View File

@@ -8,7 +8,6 @@ import RenderOutput from '../render/RenderOutput';
import {
Box,
Flex,
Input,
Table,
Thead,
Tbody,
@@ -50,7 +49,9 @@ import { getEditorVariables } from '../../../utils';
import PromptEditor from '@fastgpt/web/components/common/Textarea/PromptEditor';
import { WorkflowNodeEdgeContext } from '../../../context/workflowInitContext';
import { useSystemStore } from '@/web/common/system/useSystemStore';
const CurlImportModal = dynamic(() => import('./CurlImportModal'));
const HeaderAuthConfig = dynamic(() => import('@/components/common/secret/HeaderAuthConfig'));
const defaultFormBody = {
key: NodeInputKeyEnum.httpFormBody,
@@ -271,6 +272,7 @@ export function RenderHttpProps({
const edges = useContextSelector(WorkflowNodeEdgeContext, (v) => v.edges);
const nodeList = useContextSelector(WorkflowContext, (v) => v.nodeList);
const onChangeNode = useContextSelector(WorkflowContext, (v) => v.onChangeNode);
const { appDetail } = useContextSelector(AppContext, (v) => v);
const { feConfigs } = useSystemStore();
@@ -281,6 +283,7 @@ export function RenderHttpProps({
const jsonBody = inputs.find((item) => item.key === NodeInputKeyEnum.httpJsonBody);
const formBody =
inputs.find((item) => item.key === NodeInputKeyEnum.httpFormBody) || defaultFormBody;
const headerSecret = inputs.find((item) => item.key === NodeInputKeyEnum.headerSecret)!;
const contentType = inputs.find((item) => item.key === NodeInputKeyEnum.httpContentType);
const paramsLength = params?.value?.length || 0;
@@ -335,6 +338,21 @@ export function RenderHttpProps({
ml={1}
label={t('common:core.module.http.Props tip', { variable: variableText })}
/>
<Flex flex={1} />
<HeaderAuthConfig
storeHeaderSecretConfig={headerSecret?.value}
onUpdate={(data) => {
onChangeNode({
nodeId,
type: 'updateInput',
key: NodeInputKeyEnum.headerSecret,
value: {
...headerSecret,
value: data
}
});
}}
/>
</Flex>
<LightRowTabs<TabEnum>
width={'100%'}
@@ -403,7 +421,9 @@ export function RenderHttpProps({
contentType,
formBody,
headersLength,
headerSecret,
nodeId,
onChangeNode,
paramsLength,
requestMethods,
selectedTab,
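
The `HeaderAuthConfig` wiring above updates one node input through `onChangeNode({ type: 'updateInput', … })`. A hypothetical sketch of that operation's semantics — type names and shapes here are assumptions for illustration, not the workflow store's real types:

```typescript
// Illustrative sketch: replace the value of one input (matched by key) on one
// node, without mutating the existing node list.
type NodeInput = { key: string; value?: unknown };
type FlowNode = { nodeId: string; inputs: NodeInput[] };

function updateInput(nodes: FlowNode[], nodeId: string, key: string, value: unknown): FlowNode[] {
  return nodes.map((node) =>
    node.nodeId === nodeId
      ? { ...node, inputs: node.inputs.map((i) => (i.key === key ? { ...i, value } : i)) }
      : node
  );
}
```

Keeping the update immutable is what lets React (and the `useMemo` whose dependency list includes `headerSecret` above) detect the change.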

View File

@@ -182,7 +182,7 @@ const NodeVariableUpdate = ({ data, selected }: NodeProps<FlowNodeItemType>) =>
</Flex>
<Flex mt={2} w={'full'} alignItems={'center'} className="nodrag">
<Flex w={'80px'}>
<Box>{t('common:core.workflow.value')}</Box>
<Box>{t('common:value')}</Box>
<MyTooltip
label={
menuList.current.find((item) => item.renderType === updateItem.renderType)?.label

View File

@@ -26,10 +26,13 @@ import { AppListContext } from './context';
import { useContextSelector } from 'use-context-selector';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import type { getMCPToolsBody } from '@/pages/api/support/mcp/client/getTools';
import HeaderAuthConfig from '@/components/common/secret/HeaderAuthConfig';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
export type MCPToolSetData = {
url: string;
toolList: McpToolConfigType[];
headerSecret: StoreSecretValueType;
};
export type EditMCPToolsProps = {
@@ -49,6 +52,7 @@ const MCPToolsEditModal = ({ onClose }: { onClose: () => void }) => {
name: '',
mcpData: {
url: '',
headerSecret: {},
toolList: []
}
}
@@ -63,6 +67,7 @@ const MCPToolsEditModal = ({ onClose }: { onClose: () => void }) => {
avatar: data.avatar,
toolList: data.mcpData.toolList,
url: data.mcpData.url,
headerSecret: data.mcpData.headerSecret,
parentId
});
},
@@ -131,9 +136,26 @@ const MCPToolsEditModal = ({ onClose }: { onClose: () => void }) => {
/>
</Flex>
<Box color={'myGray.900'} fontSize={'14px'} fontWeight={'medium'} mt={6}>
{t('app:MCP_tools_url')}
</Box>
<Flex
color={'myGray.900'}
fontSize={'14px'}
fontWeight={'medium'}
mt={6}
alignItems={'center'}
>
<Box>{t('app:MCP_tools_url')}</Box>
<Box flex={1} />
<HeaderAuthConfig
storeHeaderSecretConfig={mcpData.headerSecret}
onUpdate={(data) => {
setValue('mcpData.headerSecret', data);
}}
buttonProps={{
size: 'sm',
variant: 'grayGhost'
}}
/>
</Flex>
<Flex alignItems={'center'} gap={2} mt={2}>
<Input
h={8}
@@ -148,7 +170,10 @@ const MCPToolsEditModal = ({ onClose }: { onClose: () => void }) => {
h={8}
isLoading={isGettingTools}
onClick={() => {
runGetMCPTools({ url: mcpData.url });
runGetMCPTools({
url: mcpData.url,
headerSecret: mcpData.headerSecret
});
}}
>
{t('common:Parse')}

View File

@@ -13,8 +13,6 @@ async function handler(
const activeModelList = global.systemActiveModelList.map((model) => ({
...model,
customCQPrompt: undefined,
customExtractPrompt: undefined,
defaultSystemChatPrompt: undefined,
fieldMap: undefined,
defaultConfig: undefined,

View File

@@ -21,8 +21,8 @@ import { MongoOpenApi } from '@fastgpt/service/support/openapi/schema';
import { removeImageByPath } from '@fastgpt/service/common/file/image/controller';
import { addOperationLog } from '@fastgpt/service/support/operationLog/addOperationLog';
import { OperationLogEventEnum } from '@fastgpt/global/support/operationLog/constants';
import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { getI18nAppType } from '@fastgpt/service/support/operationLog/util';
async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
const { appId } = req.query as { appId: string };

View File

@@ -14,6 +14,9 @@ import {
import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils';
import { checkTeamAppLimit } from '@fastgpt/service/support/permission/teamLimit';
import { WritePermissionVal } from '@fastgpt/global/support/permission/constant';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import { encryptSecret } from '@fastgpt/service/common/secret/aes256gcm';
import { storeSecretValue } from '@fastgpt/service/common/secret/utils';
export type createMCPToolsQuery = {};
@@ -22,6 +25,7 @@ export type createMCPToolsBody = Omit<
'type' | 'modules' | 'edges' | 'chatConfig'
> & {
url: string;
headerSecret?: StoreSecretValueType;
toolList: McpToolConfigType[];
};
@@ -31,7 +35,7 @@ async function handler(
req: ApiRequestProps<createMCPToolsBody, createMCPToolsQuery>,
res: ApiResponseType<createMCPToolsResponse>
): Promise<createMCPToolsResponse> {
const { name, avatar, toolList, url, parentId } = req.body;
const { name, avatar, toolList, url, headerSecret = {}, parentId } = req.body;
const { teamId, tmbId, userId } = parentId
? await authApp({ req, appId: parentId, per: WritePermissionVal, authToken: true })
@@ -39,6 +43,8 @@ async function handler(
await checkTeamAppLimit(teamId);
const formatedHeaderAuth = storeSecretValue(headerSecret);
const mcpToolsId = await mongoSessionRun(async (session) => {
const mcpToolsId = await onCreateApp({
name,
@@ -47,7 +53,15 @@ async function handler(
teamId,
tmbId,
type: AppTypeEnum.toolSet,
modules: [getMCPToolSetRuntimeNode({ url, toolList, name, avatar })],
modules: [
getMCPToolSetRuntimeNode({
url,
toolList,
name,
avatar,
headerSecret: formatedHeaderAuth
})
],
session
});
@@ -60,7 +74,13 @@ async function handler(
tmbId,
type: AppTypeEnum.tool,
intro: tool.description,
modules: [getMCPToolRuntimeNode({ tool, url })],
modules: [
getMCPToolRuntimeNode({
tool,
url,
headerSecret: formatedHeaderAuth
})
],
session
});
}
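
The handler above funnels `headerSecret` through `storeSecretValue` (backed by the `aes256gcm` helper) before embedding it in the runtime nodes. As a hedged sketch of what an AES-256-GCM round trip looks like with Node's `crypto` module — the key handling and `iv|tag|ciphertext` wire format here are assumptions for illustration, not FastGPT's actual storage scheme:

```typescript
import crypto from 'node:crypto';

// Assumed: in a real service the key would come from configuration, not be
// generated per-process.
const key = crypto.randomBytes(32);

function encryptSecret(plain: string): string {
  const iv = crypto.randomBytes(12); // GCM standard 96-bit nonce
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(plain, 'utf8'), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Pack nonce + auth tag + ciphertext into one storable string.
  return Buffer.concat([iv, tag, encrypted]).toString('base64');
}

function decryptSecret(stored: string): string {
  const buf = Buffer.from(stored, 'base64');
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const data = buf.subarray(28);
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag); // decryption fails if the ciphertext was tampered with
  return Buffer.concat([decipher.update(data), decipher.final()]).toString('utf8');
}
```

Encrypting once in the handler and reusing `formatedHeaderAuth` for both the tool-set node and every child tool node keeps all runtime nodes consistent.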

View File

@@ -18,12 +18,15 @@ import {
} from '@fastgpt/global/core/app/mcpTools/utils';
import { type MCPToolSetData } from '@/pageComponents/dashboard/apps/MCPToolsEditModal';
import { MongoAppVersion } from '@fastgpt/service/core/app/version/schema';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import { storeSecretValue } from '@fastgpt/service/common/secret/utils';
export type updateMCPToolsQuery = {};
export type updateMCPToolsBody = {
appId: string;
url: string;
headerSecret: StoreSecretValueType;
toolList: McpToolConfigType[];
};
@@ -33,12 +36,14 @@ async function handler(
req: ApiRequestProps<updateMCPToolsBody, updateMCPToolsQuery>,
res: ApiResponseType<updateMCPToolsResponse>
): Promise<updateMCPToolsResponse> {
const { appId, url, toolList } = req.body;
const { appId, url, toolList, headerSecret } = req.body;
const { app } = await authApp({ req, authToken: true, appId, per: ManagePermissionVal });
const toolSetNode = app.modules.find((item) => item.flowNodeType === FlowNodeTypeEnum.toolSet);
const toolSetData = toolSetNode?.inputs[0].value as MCPToolSetData;
const formatedHeaderAuth = storeSecretValue(headerSecret);
await mongoSessionRun(async (session) => {
if (
!isEqual(toolSetData, {
@@ -50,32 +55,43 @@ async function handler(
parentApp: app,
toolSetData: {
url,
toolList
toolList,
headerSecret: formatedHeaderAuth
},
session
});
}
await MongoApp.updateOne(
{ _id: appId },
{
modules: [getMCPToolSetRuntimeNode({ url, toolList, name: app.name, avatar: app.avatar })],
updateTime: new Date()
},
{ session }
);
// create tool set node
const toolSetRuntimeNode = getMCPToolSetRuntimeNode({
url,
toolList,
headerSecret: formatedHeaderAuth,
name: app.name,
avatar: app.avatar
});
await MongoAppVersion.updateOne(
{
appId
},
{
$set: {
nodes: [getMCPToolSetRuntimeNode({ url, toolList, name: app.name, avatar: app.avatar })]
}
},
{ session }
);
// update app and app version
await Promise.all([
MongoApp.updateOne(
{ _id: appId },
{
modules: [toolSetRuntimeNode],
updateTime: new Date()
},
{ session }
),
MongoAppVersion.updateOne(
{ appId },
{
$set: {
nodes: [toolSetRuntimeNode]
}
},
{ session }
)
]);
});
return {};
@@ -89,7 +105,11 @@ const updateMCPChildrenTool = async ({
session
}: {
parentApp: AppDetailType;
toolSetData: MCPToolSetData;
toolSetData: {
url: string;
toolList: McpToolConfigType[];
headerSecret: StoreSecretValueType;
};
session: ClientSession;
}) => {
const { teamId, tmbId } = parentApp;
@@ -120,7 +140,13 @@ const updateMCPChildrenTool = async ({
tmbId,
type: AppTypeEnum.tool,
intro: tool.description,
modules: [getMCPToolRuntimeNode({ tool, url: toolSetData.url })],
modules: [
getMCPToolRuntimeNode({
tool,
url: toolSetData.url,
headerSecret: toolSetData.headerSecret
})
],
session
});
}
@@ -133,7 +159,26 @@ const updateMCPChildrenTool = async ({
await MongoApp.updateOne(
{ _id: dbTool._id },
{
modules: [getMCPToolRuntimeNode({ tool, url: toolSetData.url })]
modules: [
getMCPToolRuntimeNode({
tool,
url: toolSetData.url,
headerSecret: toolSetData.headerSecret
})
]
},
{ session }
);
await MongoAppVersion.updateOne(
{ appId: dbTool._id },
{
nodes: [
getMCPToolRuntimeNode({
tool,
url: toolSetData.url,
headerSecret: toolSetData.headerSecret
})
]
},
{ session }
);
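
The update path above only re-syncs child tools when the stored toolset data actually changed, guarded by `isEqual(toolSetData, …)`. A minimal structural-equality sketch of that check (the imports suggest lodash's `isEqual`; this simplified version ignores cycles, Dates, and other special cases):

```typescript
// Illustrative deep structural equality, enough for plain JSON-like data such
// as { url, toolList, headerSecret }.
function isEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (typeof a !== 'object' || typeof b !== 'object' || a === null || b === null) return false;
  const ka = Object.keys(a as object);
  const kb = Object.keys(b as object);
  if (ka.length !== kb.length) return false;
  return ka.every((k) => isEqual((a as any)[k], (b as any)[k]));
}
```

Skipping the child re-sync when nothing changed avoids rewriting every child tool app (and its version record) on each save.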

View File

@@ -108,9 +108,8 @@ async function handler(req: ApiRequestProps<AppUpdateBody, AppUpdateQuery>) {
const onUpdate = async (session?: ClientSession) => {
// format nodes data
// 1. dataset search limit, less than model quoteMaxToken
const { nodes: formatNodes } = beforeUpdateAppFormat({
nodes,
isPlugin: app.type === AppTypeEnum.plugin
beforeUpdateAppFormat({
nodes
});
await refreshSourceAvatar(avatar, app.avatar, session);
@@ -134,8 +133,8 @@ async function handler(req: ApiRequestProps<AppUpdateBody, AppUpdateQuery>) {
...(avatar && { avatar }),
...(intro !== undefined && { intro }),
...(teamTags && { teamTags }),
...(formatNodes && {
modules: formatNodes
...(nodes && {
modules: nodes
}),
...(edges && {
edges
@@ -235,7 +234,7 @@ const logAppUpdate = ({
const values: string[] = [];
if (name !== undefined) {
names.push(i18nT('common:core.app.name'));
names.push(i18nT('common:name'));
values.push(name);
}

View File

@@ -9,12 +9,11 @@ import { getNextTimeByCronStringAndTimezone } from '@fastgpt/global/common/strin
import { type PostPublishAppProps } from '@/global/core/app/api';
import { WritePermissionVal } from '@fastgpt/global/support/permission/constant';
import { type ApiRequestProps } from '@fastgpt/service/type/next';
import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { rewriteAppWorkflowToSimple } from '@fastgpt/service/core/app/utils';
import { addOperationLog } from '@fastgpt/service/support/operationLog/addOperationLog';
import { OperationLogEventEnum } from '@fastgpt/global/support/operationLog/constants';
import { getI18nAppType } from '@fastgpt/service/support/operationLog/util';
import { i18nT } from '@fastgpt/web/i18n/utils';
async function handler(req: ApiRequestProps<PostPublishAppProps>, res: NextApiResponse<any>) {
const { appId } = req.query as { appId: string };
const { nodes = [], edges = [], chatConfig, isPublish, versionName, autoSave } = req.body;
@@ -26,16 +25,13 @@ async function handler(req: ApiRequestProps<PostPublishAppProps>, res: NextApiRe
authToken: true
});
const { nodes: formatNodes } = beforeUpdateAppFormat({
nodes,
isPlugin: app.type === AppTypeEnum.plugin
beforeUpdateAppFormat({
nodes
});
await rewriteAppWorkflowToSimple(formatNodes);
if (autoSave) {
await MongoApp.findByIdAndUpdate(appId, {
modules: formatNodes,
modules: nodes,
edges,
chatConfig,
updateTime: new Date()
@@ -62,7 +58,7 @@ async function handler(req: ApiRequestProps<PostPublishAppProps>, res: NextApiRe
[
{
appId,
nodes: formatNodes,
nodes: nodes,
edges,
chatConfig,
isPublish,
@@ -77,7 +73,7 @@ async function handler(req: ApiRequestProps<PostPublishAppProps>, res: NextApiRe
await MongoApp.findByIdAndUpdate(
appId,
{
modules: formatNodes,
modules: nodes,
edges,
chatConfig,
updateTime: new Date(),

View File

@@ -129,7 +129,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse) {
chatId,
offset: 0,
limit,
field: `dataId obj value nodeOutputs`
field: `dataId obj value memories`
}),
MongoChat.findOne({ appId: app._id, chatId }, 'source variableList variables'),
// auth balance
@@ -162,40 +162,46 @@ async function handler(req: NextApiRequest, res: NextApiResponse) {
});
/* start process */
const { flowResponses, assistantResponses, newVariables, flowUsages, durationSeconds } =
await dispatchWorkFlow({
res,
requestOrigin: req.headers.origin,
mode: 'test',
timezone,
externalProvider,
uid: tmbId,
const {
flowResponses,
assistantResponses,
system_memories,
newVariables,
flowUsages,
durationSeconds
} = await dispatchWorkFlow({
res,
requestOrigin: req.headers.origin,
mode: 'test',
timezone,
externalProvider,
uid: tmbId,
runningAppInfo: {
id: appId,
teamId: app.teamId,
tmbId: app.tmbId
},
runningUserInfo: {
teamId,
tmbId
},
runningAppInfo: {
id: appId,
teamId: app.teamId,
tmbId: app.tmbId
},
runningUserInfo: {
teamId,
tmbId
},
chatId,
responseChatItemId,
runtimeNodes,
runtimeEdges: storeEdges2RuntimeEdges(edges, interactive),
variables,
query: removeEmptyUserInput(userQuestion.value),
lastInteractive: interactive,
chatConfig,
histories: newHistories,
stream: true,
maxRunTimes: WORKFLOW_MAX_RUN_TIMES,
workflowStreamResponse: workflowResponseWrite,
version: 'v2',
responseDetail: true
});
chatId,
responseChatItemId,
runtimeNodes,
runtimeEdges: storeEdges2RuntimeEdges(edges, interactive),
variables,
query: removeEmptyUserInput(userQuestion.value),
lastInteractive: interactive,
chatConfig,
histories: newHistories,
stream: true,
maxRunTimes: WORKFLOW_MAX_RUN_TIMES,
workflowStreamResponse: workflowResponseWrite,
version: 'v2',
responseDetail: true
});
workflowResponseWrite({
event: SseResponseEventEnum.answer,
@@ -222,6 +228,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse) {
dataId: responseChatItemId,
obj: ChatRoleEnum.AI,
value: assistantResponses,
memories: system_memories,
[DispatchNodeResponseKeyEnum.nodeResponse]: flowResponses
};

View File

@@ -49,9 +49,7 @@ async function handler(
relatedImgId: fileId
},
customPdfParse
},
relatedId: fileId
}
});
// remove buffer

Some files were not shown because too many files have changed in this diff.