feature: V4.11.1 (#5350)

* perf: system toolset & mcp (#5200)

* feat: support system toolset

* fix: type

* fix: system tool config

* chore: mcptool config migrate

* refactor: mcp toolset

* fix: fe type error

* fix: type error

* fix: show version

* chore: support extract tool's secretInputConfig out of inputs

* chore: compatible with old version mcp

* chore: adjust

* deps: update dependency @fastgpt-skd/plugin

* fix: version

* fix: some bug (#5316)

* chore: compatible with old version mcp

* fix: version

* fix: compatible bug

* fix: mcp object params

* fix: type error

* chore: update test cases

* chore: remove log

* fix: toolset node name

* optimize app logs sort (#5310)

* log keys config modal

* multiple select

* api

* fontsize

* code

* chatid

* fix build

* fix

* fix component

* change name

* log keys config

* fix

* delete unused

* fix

* perf: log code

* perf: send auth code modal enter press

* fix log (#5328)

* perf: mcp toolset comment

* perf: log ui

* remove log (#5347)

* doc

* fix: action

* remove log

* fix: Table Optimization (#5319)

* feat: table test: 1

* feat: table test: 2

* feat: table test: 3

* feat: table test: 4

* feat: table test : 5 change maxSize back to chunkSize

* feat: table test : 6 remove everything, only look at maxSize

* feat: table test : 7 restore to the initial state; next, remove the header-tag feature

* feat: table test : 8 remove the header-tag feature

* feat: table test : 9 header-tag feature removed successfully

* feat: table test : 10 keep debugging, modify trainingStates

* feat: table test : 11 modify step one

* feat: table test : 12 modify step two

* feat: table test : 13 modified HtmlTable2Md

* feat: table test : 14 change the header chunking rules

* feat: table test : 15 tables were split too finely before

* feat: table test : 16 after more changes the header stopped being added again

* feat: table test : 17 CUSTOM_SPLIT_SIGN doesn't work, rework it

* feat: table test : 18 headers still get duplicated, but chunking is finally reasonable

* feat: table test : 19 still need to fix the header issue; saving the debugging state first

* feat: table test : 20 debugging done; check replace for issues, then open the PR if clean

* feat: table test : 21 remove the comments first

* feat: table test : 21 comments and replace all updated; next, check against the main branch

* feat: table test : 22 update the old files

* feat: table test : 23 update the test files

* feat: table test : 24 xlsx table handling

* feat: table test : 25 forgot to save, committing first

* feat: table test : 26 fix

* feat: table test : 27 commit a debug version first

* feat: table test : 28 try putting it in format2csv

* feat: table test : 29 xlsx solved

* feat: table test : 30 tablesplit solved

* feat: table test : 31

* feat: table test : 32

* perf: table split

* perf: mcp old version compatibility (#5342)

* fix: system-tool secret inputs

* fix: rewrite runtime node i18n for system tool

* perf: mcp old version compatibility

* fix: splitPluginId

* fix: old mcp toolId

* fix: filter secret key

* feat: support system toolset activation

* chore: remove log

* perf: mcp update

* perf: rewrite toolset

* fix: delete variable id (#5335)

* perf: variable update

* fix: multiple select ui

* perf: model config move to plugin

* fix: var conflict

* perf: variable checker

* Avoid empty number

* update doc time

* fix: test

* fix: mcp object

* update count app

* update count app

---------

Co-authored-by: Finley Ge <32237950+FinleyGe@users.noreply.github.com>
Co-authored-by: heheer <heheer@sealos.io>
Co-authored-by: heheer <zhiyu44@qq.com>
Co-authored-by: colnii <1286949794@qq.com>
Co-authored-by: dreamer6680 <1468683855@qq.com>
Archer 2025-08-01 16:08:20 +08:00 committed by GitHub
parent e0c21a949c
commit e25d7efb5b
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
143 changed files with 2596 additions and 4177 deletions


@ -1,28 +0,0 @@
---
description:
globs:
alwaysApply: false
---
This is the design document for workflow-node error catching.
# Background
Currently, if any node in a workflow errors, the whole workflow aborts and cannot continue; the only feedback is a red English toast, which is not user-friendly. Users who need tight control over their workflows are willing to build error fallbacks through orchestration, but the existing orchestration for this is cumbersome.
This is similar to the try/catch mechanism in code: the user receives the caught error instead of the workflow throwing and terminating.
# Expected behavior
1. Some nodes can have an error-catching option, i.e. nodes whose `catchError` is not undefined; catchError=true means error catching is enabled, catchError=false means it is disabled.
2. For nodes that support error catching, an "error catch" switch appears on the right side of the output panel.
3. Among a node's output properties there is an `errorField` flag, marking outputs that exist only when error catching is enabled.
4. When a node with error catching enabled fails at runtime, it does not block downstream nodes: it outputs the error message and execution continues.
5. A node with error catching enabled gains an extra "error output" branch connection; on error, execution follows that branch.
# Implementation plan
1. Add an optional boolean `catchError` to FlowNodeCommonType. Nodes that need error catching set it to true/false, which both marks the node as supporting error catching and sets whether it is enabled by default.
2. Add an `error` value to FlowNodeOutputTypeEnum, marking an output that is shown only on error.
3. The IOTitle component accepts a `catchError` field; when true, it renders the "error catch" switch on the right.
4. All existing RenderOutput components must be updated: the flowOutputList passed in excludes the hidden and error types.
5. Create a new `CatchError` component under `FastGPT/projects/app/src/pageComponents/app/detail/WorkflowComponents/Flow/nodes/render/RenderOutput`, dedicated to rendering error-type outputs, with its own SourceHandler.
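The plan above can be sketched in TypeScript. This is a hypothetical illustration; all names mirror the design doc, not the actual FastGPT source.

```typescript
// Hypothetical sketch of the error-catching additions described above.
enum FlowNodeOutputTypeEnum {
  static = 'static',
  hidden = 'hidden',
  error = 'error' // output that only exists while error catching is enabled
}

type NodeOutput = { key: string; type: FlowNodeOutputTypeEnum };

type FlowNodeCommonType = {
  catchError?: boolean; // undefined = node does not support error catching
  outputs: NodeOutput[];
};

// RenderOutput components receive a flowOutputList that excludes the
// hidden and error types; error outputs render in the CatchError component.
function visibleOutputs(node: FlowNodeCommonType): NodeOutput[] {
  return node.outputs.filter(
    (o) =>
      o.type !== FlowNodeOutputTypeEnum.hidden &&
      o.type !== FlowNodeOutputTypeEnum.error
  );
}
```

The point of the split is that the error branch gets its own render path and SourceHandler instead of being mixed into the normal output list.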

CLAUDE.md Normal file

@ -0,0 +1,116 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
FastGPT is an AI Agent construction platform providing out-of-the-box data processing, model invocation capabilities, and visual workflow orchestration through Flow. This is a full-stack TypeScript application built on NextJS with MongoDB/PostgreSQL backends.
**Tech Stack**: NextJS + TypeScript + ChakraUI + MongoDB + PostgreSQL (PG Vector)/Milvus
## Architecture
This is a monorepo using pnpm workspaces with the following key structure:
### Packages (Library Code)
- `packages/global/` - Shared types, constants, utilities used across all projects
- `packages/service/` - Backend services, database schemas, API controllers, workflow engine
- `packages/web/` - Shared frontend components, hooks, styles, i18n
- `packages/templates/` - Application templates for the template market
### Projects (Applications)
- `projects/app/` - Main NextJS web application (frontend + API routes)
- `projects/sandbox/` - NestJS code execution sandbox service
- `projects/mcp_server/` - Model Context Protocol server implementation
### Key Directories
- `document/` - Documentation site (NextJS app with content)
- `plugins/` - External plugins (models, crawlers, etc.)
- `deploy/` - Docker and Helm deployment configurations
- `test/` - Centralized test files and utilities
## Development Commands
### Main Commands (run from project root)
- `pnpm dev` - Start development for all projects (uses package.json workspace scripts)
- `pnpm build` - Build all projects
- `pnpm test` - Run tests using Vitest
- `pnpm test:workflow` - Run workflow-specific tests
- `pnpm lint` - Run ESLint across all TypeScript files with auto-fix
- `pnpm format-code` - Format code using Prettier
### Project-Specific Commands
**Main App (projects/app/)**:
- `cd projects/app && pnpm dev` - Start NextJS dev server
- `cd projects/app && pnpm build` - Build NextJS app
- `cd projects/app && pnpm start` - Start production server
**Sandbox (projects/sandbox/)**:
- `cd projects/sandbox && pnpm dev` - Start NestJS dev server with watch mode
- `cd projects/sandbox && pnpm build` - Build NestJS app
- `cd projects/sandbox && pnpm test` - Run Jest tests
**MCP Server (projects/mcp_server/)**:
- `cd projects/mcp_server && bun dev` - Start with Bun in watch mode
- `cd projects/mcp_server && bun build` - Build MCP server
- `cd projects/mcp_server && bun start` - Start MCP server
### Utility Commands
- `pnpm create:i18n` - Generate i18n translation files
- `pnpm api:gen` - Generate OpenAPI documentation
- `pnpm initIcon` - Initialize icon assets
- `pnpm gen:theme-typings` - Generate Chakra UI theme typings
## Testing
The project uses Vitest for testing with coverage reporting. Key test commands:
- `pnpm test` - Run all tests
- `pnpm test:workflow` - Run workflow tests specifically
- Test files are located in `test/` directory and `projects/app/test/`
- Coverage reports are generated in `coverage/` directory
## Code Organization Patterns
### Monorepo Structure
- Shared code lives in `packages/` and is imported using workspace references
- Each project in `projects/` is a standalone application
- Use `@fastgpt/global`, `@fastgpt/service`, `@fastgpt/web` imports for shared packages
### API Structure
- NextJS API routes in `projects/app/src/pages/api/`
- Core business logic in `packages/service/core/`
- Database schemas in `packages/service/` with MongoDB/Mongoose
### Frontend Architecture
- React components in `projects/app/src/components/` and `packages/web/components/`
- Chakra UI for styling with custom theme in `packages/web/styles/theme.ts`
- i18n support with files in `packages/web/i18n/`
- State management using React Context and Zustand
### Workflow System
- Visual workflow editor using ReactFlow
- Workflow engine in `packages/service/core/workflow/`
- Node definitions in `packages/global/core/workflow/template/`
- Dispatch system for executing workflow nodes
## Development Notes
- **Package Manager**: Uses pnpm with workspace configuration
- **Node Version**: Requires Node.js >=18.16.0, pnpm >=9.0.0
- **Database**: Supports MongoDB, PostgreSQL with pgvector, or Milvus for vector storage
- **AI Integration**: Supports multiple AI providers through unified interface
- **Internationalization**: Full i18n support for Chinese, English, and Japanese
## Key File Patterns
- `.ts` and `.tsx` files use TypeScript throughout
- Database schemas use Mongoose with TypeScript
- API routes follow NextJS conventions
- Component files use React functional components with hooks
- Shared types defined in `packages/global/` with `.d.ts` files
## Environment Configuration
- Configuration files in `projects/app/data/config.json`
- Environment-specific configs supported
- Model configurations in `packages/service/core/ai/config/`


@ -26,8 +26,9 @@ import { Alert } from '@/components/docs/Alert';
Besides each model vendor's official service, some third-party providers offer model access, and you can also deploy local models with tools like Ollama; everything ultimately connects through OneAPI. Some third-party providers:
<Alert icon=" " context="info">
- [SiliconCloud(硅基流动)](https://cloud.siliconflow.cn/i/TR9Ym0c4): a platform for calling open-source models.
- [Sealos AIProxy](https://hzh.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy): proxies the major domestic model providers, no need to apply for each API individually.
- [SiliconCloud(硅基流动)](https://cloud.siliconflow.cn/i/TR9Ym0c4): a platform for calling open-source models. -
[Sealos AIProxy](https://hzh.sealos.run/?uid=fnWRt09fZP&openapp=system-aiproxy):
proxies the major domestic model providers, no need to apply for each API individually.
</Alert>
After models are configured in OneAPI, you can open the FastGPT page and enable them.
@ -35,9 +36,8 @@ import { Alert } from '@/components/docs/Alert';
### 2. Configuration overview
<Alert icon="🤖" context="success">
Note:
1. Currently only one speech-recognition model and one rerank model take effect, so only one of each needs to be configured.
2. The system needs at least one language model and one index model to function.
Note: 1. Currently only one speech-recognition model and one rerank model take effect, so only one of each needs to be configured. 2.
The system needs at least one language model and one index model to function.
</Alert>
#### Core configuration
@ -57,64 +57,64 @@ import { Alert } from '@/components/docs/Alert';
The system ships with models from the current mainstream vendors. If you are not familiar with configuration, just click `Enable`; note that the `model ID` must match the `model` of the channel in OneAPI.
| | |
| --- | --- |
| | |
| ------------------------------- | ------------------------------- |
| ![alt text](/imgs/image-91.png) | ![alt text](/imgs/image-92.png) |
#### Modifying model configuration
Click the gear icon to the right of a model to configure it; the configuration differs by model type.
| | |
| --- | --- |
| | |
| ------------------------------- | ------------------------------- |
| ![alt text](/imgs/image-93.png) | ![alt text](/imgs/image-94.png) |
## Adding custom models
If the built-in models do not meet your needs, you can add custom models. If a custom model's `model ID` matches a built-in model's ID, it is treated as modifying that system model.
| | |
| --- | --- |
| | |
| ------------------------------- | ------------------------------- |
| ![alt text](/imgs/image-96.png) | ![alt text](/imgs/image-97.png) |
#### Configuration via file
If configuring models through the page feels cumbersome, you can configure them via a configuration file instead. This is also handy for quickly copying one system's configuration to another.
| | |
| --- | --- |
| | |
| ------------------------------- | ------------------------------- |
| ![alt text](/imgs/image-98.png) | ![alt text](/imgs/image-99.png) |
**Language model fields:**
```json
{
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "OpenAI", // 模型提供商主要用于分类展示目前已经内置提供商包括https://github.com/labring/FastGPT/blob/main/packages/global/core/ai/provider.ts, 可 pr 提供新的提供商,或直接填写 Other
"model": "gpt-4o-mini", // 模型ID(对应OneAPI中渠道的模型名)
"name": "gpt-4o-mini", // 模型别名
"maxContext": 125000, // 最大上下文
"maxResponse": 16000, // 最大回复
"quoteMaxToken": 120000, // 最大引用内容
"maxTemperature": 1.2, // 最大温度
"charsPointsPrice": 0, // n积分/1k token商业版
"censor": false, // 是否开启敏感校验(商业版)
"vision": true, // 是否支持图片输入
"datasetProcess": true, // 是否设置为文本理解模型QA务必保证至少有一个为true否则知识库会报错
"usedInClassify": true, // 是否用于问题分类务必保证至少有一个为true
"usedInExtractFields": true, // 是否用于内容提取务必保证至少有一个为true
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {}, // 请求API时挟带一些默认配置比如 GLM4 的 top_p
"fieldMap": {} // 字段映射o1 模型需要把 max_tokens 映射为 max_completion_tokens
}
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "OpenAI", // 模型提供商主要用于分类展示目前已经内置提供商包括https://github.com/labring/FastGPT/blob/main/packages/global/core/ai/provider.ts, 可 pr 提供新的提供商,或直接填写 Other
"model": "gpt-4o-mini", // 模型ID(对应OneAPI中渠道的模型名)
"name": "gpt-4o-mini", // 模型别名
"maxContext": 125000, // 最大上下文
"maxResponse": 16000, // 最大回复
"quoteMaxToken": 120000, // 最大引用内容
"maxTemperature": 1.2, // 最大温度
"charsPointsPrice": 0, // n积分/1k token商业版
"censor": false, // 是否开启敏感校验(商业版)
"vision": true, // 是否支持图片输入
"datasetProcess": true, // 是否设置为文本理解模型QA务必保证至少有一个为true否则知识库会报错
"usedInClassify": true, // 是否用于问题分类务必保证至少有一个为true
"usedInExtractFields": true, // 是否用于内容提取务必保证至少有一个为true
"usedInToolCall": true, // 是否用于工具调用务必保证至少有一个为true
"toolChoice": true, // 是否支持工具选择(分类,内容提取,工具调用会用到。)
"functionCall": false, // 是否支持函数调用(分类,内容提取,工具调用会用到。会优先使用 toolChoice如果为false则使用 functionCall如果仍为 false则使用提示词模式
"customCQPrompt": "", // 自定义文本分类提示词(不支持工具和函数调用的模型
"customExtractPrompt": "", // 自定义内容提取提示词
"defaultSystemChatPrompt": "", // 对话默认携带的系统提示词
"defaultConfig": {}, // 请求API时挟带一些默认配置比如 GLM4 的 top_p
"fieldMap": {} // 字段映射o1 模型需要把 max_tokens 映射为 max_completion_tokens
}
}
```
@ -122,17 +122,17 @@ import { Alert } from '@/components/docs/Alert';
```json
{
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "OpenAI", // 模型提供商
"model": "text-embedding-3-small", // 模型ID
"name": "text-embedding-3-small", // 模型别名
"charsPointsPrice": 0, // n积分/1k token
"defaultToken": 512, // 默认文本分割时候的 token
"maxToken": 3000 // 最大 token
}
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "OpenAI", // 模型提供商
"model": "text-embedding-3-small", // 模型ID
"name": "text-embedding-3-small", // 模型别名
"charsPointsPrice": 0, // n积分/1k token
"defaultToken": 512, // 默认文本分割时候的 token
"maxToken": 3000 // 最大 token
}
}
```
@ -140,17 +140,17 @@ import { Alert } from '@/components/docs/Alert';
```json
{
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "BAAI", // 模型提供商
"model": "bge-reranker-v2-m3", // 模型ID
"name": "ReRanker-Base", // 模型别名
"requestUrl": "", // 自定义请求地址
"requestAuth": "", // 自定义请求认证
"type": "rerank" // 模型类型
}
"model": "模型 ID",
"metadata": {
"isCustom": true, // 是否为自定义模型
"isActive": true, // 是否启用
"provider": "BAAI", // 模型提供商
"model": "bge-reranker-v2-m3", // 模型ID
"name": "ReRanker-Base", // 模型别名
"requestUrl": "", // 自定义请求地址
"requestAuth": "", // 自定义请求认证
"type": "rerank" // 模型类型
}
}
```
@ -158,26 +158,27 @@ import { Alert } from '@/components/docs/Alert';
```json
{
"model": "模型 ID",
"metadata": {
"isActive": true, // 是否启用
"isCustom": true, // 是否为自定义模型
"type": "tts", // 模型类型
"provider": "FishAudio", // 模型提供商
"model": "fishaudio/fish-speech-1.5", // 模型ID
"name": "fish-speech-1.5", // 模型别名
"voices": [ // 音色
{
"label": "fish-alex", // 音色名称
"value": "fishaudio/fish-speech-1.5:alex", // 音色ID
},
{
"label": "fish-anna", // 音色名称
"value": "fishaudio/fish-speech-1.5:anna", // 音色ID
}
],
"charsPointsPrice": 0 // n积分/1k token
}
"model": "模型 ID",
"metadata": {
"isActive": true, // 是否启用
"isCustom": true, // 是否为自定义模型
"type": "tts", // 模型类型
"provider": "FishAudio", // 模型提供商
"model": "fishaudio/fish-speech-1.5", // 模型ID
"name": "fish-speech-1.5", // 模型别名
"voices": [
// 音色
{
"label": "fish-alex", // 音色名称
"value": "fishaudio/fish-speech-1.5:alex" // 音色ID
},
{
"label": "fish-anna", // 音色名称
"value": "fishaudio/fish-speech-1.5:anna" // 音色ID
}
],
"charsPointsPrice": 0 // n积分/1k token
}
}
```
@ -185,16 +186,16 @@ import { Alert } from '@/components/docs/Alert';
```json
{
"model": "whisper-1",
"metadata": {
"isActive": true, // 是否启用
"isCustom": true, // 是否为自定义模型
"provider": "OpenAI", // 模型提供商
"model": "whisper-1", // 模型ID
"name": "whisper-1", // 模型别名
"charsPointsPrice": 0, // n积分/1k token
"type": "stt" // 模型类型
}
"model": "whisper-1",
"metadata": {
"isActive": true, // 是否启用
"isCustom": true, // 是否为自定义模型
"provider": "OpenAI", // 模型提供商
"model": "whisper-1", // 模型ID
"name": "whisper-1", // 模型别名
"charsPointsPrice": 0, // n积分/1k token
"type": "stt" // 模型类型
}
}
```
@ -210,7 +211,6 @@ The FastGPT page provides a simple test for each model type, for a quick check that the model
Since OneAPI does not support rerank models, they must be configured separately. In FastGPT, model configuration supports a custom request URL that bypasses OneAPI and sends requests directly to the provider; this can be used to integrate rerank models.
#### Using SiliconCloud (硅基流动) online models
A free `bge-reranker-v2-m3` model is available.
@ -255,7 +255,6 @@ OneAPI's speech-recognition endpoint cannot correctly identify other models (it will always recog
Since OpenAI does not provide a rerank model, the Cohere format is followed. [See a sample API request](/docs/development/faq/#如何检查模型问题)
### Model pricing configuration
Commercial-edition users can configure model prices for account billing. There are two billing modes: billing by total tokens, or billing input and output tokens separately.
@ -267,7 +266,6 @@ OneAPI's speech-recognition endpoint cannot correctly identify other models (it will always recog
Models update very frequently and the official presets may lag behind. If the built-in model you expect is missing, you can [open an issue](https://github.com/labring/FastGPT/issues) with the model's name and official site, or directly [submit a PR](https://github.com/labring/FastGPT/pulls) with the model configuration.
### Adding a model provider
To add a model provider, the following code needs to be modified:
@ -278,7 +276,7 @@ OneAPI's speech-recognition endpoint cannot correctly identify other models (it will always recog
### Adding a model
You can find the configuration file for the corresponding model provider under `FastGPT/packages/service/core/ai/config/provider` and append the model configuration. Please check the whole file yourself: the `model` field must be unique across all models. For field details, see [model configuration fields](/docs/development/modelconfig/intro/#通过配置文件配置)
You can find the configuration file for the corresponding model provider under `modules/model/provider` in the `FastGPT-plugin` project and append the model configuration. Please check the whole file yourself: the `model` field must be unique across all models. For field details, see [model configuration fields](/docs/development/modelconfig/intro/#通过配置文件配置)
## Legacy model configuration
@ -396,7 +394,7 @@ OneAPI's speech-recognition endpoint cannot correctly identify other models (it will always recog
"customExtractPrompt": "",
"defaultSystemChatPrompt": "",
"defaultConfig": {
"temperature": 1,
"temperature": 1,
"max_tokens": null,
"stream": false
}
@ -405,7 +403,7 @@ OneAPI 的语言识别接口,无法正确的识别其他模型(会始终识
"vectorModels": [
{
"provider": "OpenAI",
"model": "text-embedding-3-small",
"model": "text-embedding-3-small",
"name": "text-embedding-3-small",
"charsPointsPrice": 0,
"defaultToken": 512,


@ -5,14 +5,28 @@ description: 'FastGPT V4.11.1 release notes'
## 🚀 New
1. Account deletion.
2. New documentation framework.
1. System tools: toolsets can now be used directly by tool calls.
2. MCP structure rewrite: updating an MCP now automatically updates all MCP components in use, with no need to delete and re-add them.
3. Conversation log dashboard with customizable field display.
4. Account deletion.
5. New documentation framework.
## ⚙️ Improvements
1. Redemption codes support a designated corporate-payment mode.
2. Improved the payment plan mode.
3. Renaming a global variable no longer loses its references in nodes.
4. Moved model preset configuration into the FastGPT-plugin project.
## 🐛 Fixes
1. MCP object-type data passed incorrectly.
2. Login page UI offset.
3. Chunking errors when an Excel sheet contains line-break characters.
## 🔨 Tool updates
1. Added: libulibu image-generation tool.
2. Added: Metaso (秘塔) search tool.
3. Added: Signoz system-monitoring integration.
4. Fixed: data type error in the math expression tool.


@ -1,7 +1,7 @@
{
"document/content/docs/api/api1.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/api/api2.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/api/index.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/api/index.mdx": "2025-07-30T15:38:30+08:00",
"document/content/docs/api/test/api3.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/FAQ/app.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/FAQ/chat.mdx": "2025-07-23T21:35:03+08:00",
@ -29,7 +29,7 @@
"document/content/docs/introduction/development/migration/docker_db.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/migration/docker_mongo.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/modelConfig/ai-proxy.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/modelConfig/intro.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/modelConfig/intro.mdx": "2025-07-31T14:42:34+08:00",
"document/content/docs/introduction/development/modelConfig/one-api.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/modelConfig/ppio.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/modelConfig/siliconCloud.mdx": "2025-07-23T21:35:03+08:00",
@ -46,7 +46,7 @@
"document/content/docs/introduction/development/upgrading/4100.mdx": "2025-07-27T12:42:13+08:00",
"document/content/docs/introduction/development/upgrading/4101.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/upgrading/4110.mdx": "2025-07-23T23:27:37+08:00",
"document/content/docs/introduction/development/upgrading/4111.mdx": "2025-07-23T23:27:37+08:00",
"document/content/docs/introduction/development/upgrading/4111.mdx": "2025-08-01T10:50:10+08:00",
"document/content/docs/introduction/development/upgrading/42.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/upgrading/421.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/development/upgrading/43.mdx": "2025-07-23T21:35:03+08:00",
@ -155,18 +155,17 @@
"document/content/docs/introduction/guide/knowledge_base/websync.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/knowledge_base/yuque_dataset.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/plugins/bing_search_plugin.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/plugins/dev_system_tool.mdx": "2025-07-30T15:02:01+08:00",
"document/content/docs/introduction/guide/plugins/dev_system_tool.mdx": "2025-07-30T22:30:03+08:00",
"document/content/docs/introduction/guide/plugins/doc2x_plugin_guide.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/plugins/google_search_plugin_guide.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/plugins/how_to_submit_system_plugin.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/plugins/searxng_plugin_guide.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/team_permissions/invitation_link.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/guide/team_permissions/team_roles_permissions.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/index.en.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/index.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/shopping_cart/intro.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/introduction/shopping_cart/intro.mdx": "2025-07-30T22:30:03+08:00",
"document/content/docs/introduction/shopping_cart/saas.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/protocol/index.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/protocol/index.mdx": "2025-07-30T15:38:30+08:00",
"document/content/docs/protocol/open-source.en.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/protocol/open-source.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/protocol/privacy.en.mdx": "2025-07-23T21:35:03+08:00",
@ -187,4 +186,4 @@
"document/content/docs/use-cases/external-integration/official_account.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/use-cases/external-integration/openapi.mdx": "2025-07-23T21:35:03+08:00",
"document/content/docs/use-cases/index.mdx": "2025-07-24T14:23:04+08:00"
}
}


@ -6,7 +6,6 @@
"build": "next build",
"update-index-action": "node ./update-index.mjs",
"update-index": "node --env-file=.env.local ./update-index.mjs",
"get-doc-times": "node ./github.js",
"dev": "next dev --turbo",
"start": "next start",
"postinstall": "fumadocs-mdx"


@ -9,12 +9,6 @@ async function main() {
/** @type {import('fumadocs-core/search/algolia').DocumentRecord[]} **/
const records = JSON.parse(content.toString());
console.log({
NEXT_PUBLIC_SEARCH_APPID: process.env.NEXT_PUBLIC_SEARCH_APPID,
SEARCH_APPWRITEKEY: process.env.SEARCH_APPWRITEKEY,
SEARCH_APPWRITEKEY: process.env.SEARCH_APPWRITEKEY
})
if (!process.env.NEXT_PUBLIC_SEARCH_APPID || !process.env.SEARCH_APPWRITEKEY || !process.env.SEARCH_APPWRITEKEY) {
console.log('NEXT_PUBLIC_SEARCH_APPID or SEARCH_APPWRITEKEY is not set');
return;


@ -55,7 +55,7 @@ export const htmlTable2Md = (content: string): string => {
tableData[rowIndex] = [];
}
let colIndex = 0;
const cells = row.match(/<td.*?>(.*?)<\/td>/g) || [];
const cells = row.match(/<td[^>]*\/>|<td[^>]*>.*?<\/td>/g) || [];
cells.forEach((cell) => {
while (tableData[rowIndex][colIndex]) {
@ -63,8 +63,12 @@ export const htmlTable2Md = (content: string): string => {
}
const colspan = parseInt(cell.match(/colspan="(\d+)"/)?.[1] || '1');
const rowspan = parseInt(cell.match(/rowspan="(\d+)"/)?.[1] || '1');
const content = cell.replace(/<td.*?>|<\/td>/g, '').trim();
let content = '';
if (cell.endsWith('/>')) {
content = '';
} else {
content = cell.replace(/<td[^>]*>|<\/td>/g, '').trim();
}
for (let i = 0; i < rowspan; i++) {
for (let j = 0; j < colspan; j++) {
if (!tableData[rowIndex + i]) {
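The effect of the widened cell regex can be seen in a small standalone sketch (the sample row is an assumed input, not from the repo's tests):

```typescript
// Standalone sketch of the fix above: the new pattern also matches
// self-closing cells (<td/>), which the old /<td.*?>(.*?)<\/td>/g skipped.
const cellReg = /<td[^>]*\/>|<td[^>]*>.*?<\/td>/g;

// Self-closing cells contribute an empty string; normal cells are stripped of tags.
const cellText = (cell: string): string =>
  cell.endsWith('/>') ? '' : cell.replace(/<td[^>]*>|<\/td>/g, '').trim();

const row = '<tr><td colspan="2">a</td><td/><td>b</td></tr>';
const cells = row.match(cellReg) ?? [];
const texts = cells.map(cellText);
```

Without the self-closing branch, the middle cell would be dropped entirely and every later cell would shift left by one column.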


@ -64,7 +64,15 @@ const strIsMdTable = (str: string) => {
};
const markdownTableSplit = (props: SplitProps): SplitResponse => {
let { text = '', chunkSize } = props;
const splitText2Lines = text.split('\n');
// split by rows
const splitText2Lines = text.split('\n').filter((line) => line.trim());
// If there are not enough rows to form a table, return directly
if (splitText2Lines.length < 2) {
return { chunks: [text], chars: text.length };
}
const header = splitText2Lines[0];
const headerSize = header.split('|').length - 2;
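The new guard can be sketched in isolation. This is a simplified stand-in for `markdownTableSplit`, keeping only the guard and the header-size computation from the diff above:

```typescript
// Minimal sketch: a markdown table needs at least a header row plus a
// separator row, so shorter input is returned as a single chunk.
const markdownTableSplitSketch = (text: string) => {
  // split by rows, dropping blank lines
  const splitText2Lines = text.split('\n').filter((line) => line.trim());
  if (splitText2Lines.length < 2) {
    return { chunks: [text], chars: text.length, headerSize: 0 };
  }
  const header = splitText2Lines[0];
  // "| a | b |".split('|') yields ['', ' a ', ' b ', ''], hence the -2
  const headerSize = header.split('|').length - 2;
  return { chunks: splitText2Lines, chars: text.length, headerSize };
};
```

The real function goes on to re-emit the header in front of each chunk; the sketch stops at the guard, which is what the diff adds.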
@ -130,21 +138,6 @@ const commonSplit = (props: SplitProps): SplitResponse => {
text = text.replace(/(```[\s\S]*?```|~~~[\s\S]*?~~~)/g, function (match) {
return match.replace(/\n/g, codeBlockMarker);
});
// 2. Markdown table handling: extract tables separately and merge their headers
const tableReg =
/(\n\|(?:(?:[^\n|]+\|){1,})\n\|(?:[:\-\s]+\|){1,}\n(?:\|(?:[^\n|]+\|)*\n?)*)(?:\n|$)/g;
const tableDataList = text.match(tableReg);
if (tableDataList) {
tableDataList.forEach((tableData) => {
const { chunks } = markdownTableSplit({
text: tableData.trim(),
chunkSize
});
const splitText = chunks.join('\n');
text = text.replace(tableData, `\n${splitText}\n`);
});
}
// replace invalid \n
text = text.replace(/(\r?\n|\r){3,}/g, '\n\n\n');
@ -173,7 +166,7 @@ const commonSplit = (props: SplitProps): SplitResponse => {
const stepReges: { reg: RegExp | string; maxLen: number }[] = [
...customReg.map((text) => ({
reg: text.replaceAll('\\n', '\n'),
reg: text.replace(/\\n/g, '\n'),
maxLen: chunkSize
})),
...markdownHeaderRules,
@ -181,7 +174,7 @@ const commonSplit = (props: SplitProps): SplitResponse => {
{ reg: /([\n](```[\s\S]*?```|~~~[\s\S]*?~~~))/g, maxLen: maxSize }, // code block
// keep HTML table tags as intact as possible
{
reg: /(\n\|(?:(?:[^\n|]+\|){1,})\n\|(?:[:\-\s]+\|){1,}\n(?:\|(?:[^\n|]+\|)*\n)*)/g,
reg: /(\n\|(?:[^\n|]*\|)+\n\|(?:[:\-\s]*\|)+\n(?:\|(?:[^\n|]*\|)*\n)*)/g,
maxLen: chunkSize
}, // keep markdown tables as intact as possible
{ reg: /(\n{2,})/g, maxLen: chunkSize },
@ -332,6 +325,21 @@ const commonSplit = (props: SplitProps): SplitResponse => {
const newText = lastText + currentText;
const newTextLen = getTextValidLength(newText);
// split the current table if it will exceed after adding
if (strIsMdTable(currentText) && newTextLen > maxLen) {
if (lastTextLen > 0) {
chunks.push(lastText);
lastText = '';
}
const { chunks: tableChunks } = markdownTableSplit({
text: currentText,
chunkSize: chunkSize * 1.2
});
chunks.push(...tableChunks);
continue;
}
// In markdown mode, chunks are force-split down to the smallest size, and at the deepest heading level every small chunk is prefixed with all of its headings (including parent headings)
if (isMarkdownStep) {
// split new Text, split chunks must will greater 1 (small lastText)
@ -468,10 +476,10 @@ export const splitText2Chunks = (props: SplitProps): SplitResponse => {
const splitResult = splitWithCustomSign.map((item) => {
if (strIsMdTable(item)) {
return markdownTableSplit(props);
return markdownTableSplit({ ...props, text: item });
}
return commonSplit(props);
return commonSplit({ ...props, text: item });
});
return {


@ -96,7 +96,7 @@ export const getNextTimeByCronStringAndTimezone = ({
tz: timezone
};
const interval = cronParser.parseExpression(cronString, options);
const date = interval.next().toString();
const date = String(interval.next());
return new Date(date);
} catch (error) {


@ -56,12 +56,12 @@ const getNodeInputRenderTypeFromSchemaInputType = ({
}
if (type === 'string') {
return {
renderTypeList: [FlowNodeInputTypeEnum.input]
renderTypeList: [FlowNodeInputTypeEnum.input, FlowNodeInputTypeEnum.reference]
};
}
if (type === 'number') {
return {
renderTypeList: [FlowNodeInputTypeEnum.numberInput],
renderTypeList: [FlowNodeInputTypeEnum.numberInput, FlowNodeInputTypeEnum.reference],
max: maximum,
min: minimum
};
@ -71,7 +71,7 @@ const getNodeInputRenderTypeFromSchemaInputType = ({
renderTypeList: [FlowNodeInputTypeEnum.switch]
};
}
return { renderTypeList: [FlowNodeInputTypeEnum.JSONEditor] };
return { renderTypeList: [FlowNodeInputTypeEnum.JSONEditor, FlowNodeInputTypeEnum.reference] };
};
export const jsonSchema2NodeInput = (jsonSchema: JSONSchemaInputType): FlowNodeInputItemType[] => {
return Object.entries(jsonSchema?.properties || {}).map(([key, value]) => ({


@ -0,0 +1,49 @@
import { i18nT } from '../../../../web/i18n/utils';
export enum AppLogKeysEnum {
SOURCE = 'source',
USER = 'user',
TITLE = 'title',
SESSION_ID = 'sessionId',
CREATED_TIME = 'createdTime',
LAST_CONVERSATION_TIME = 'lastConversationTime',
MESSAGE_COUNT = 'messageCount',
FEEDBACK = 'feedback',
CUSTOM_FEEDBACK = 'customFeedback',
ANNOTATED_COUNT = 'annotatedCount',
POINTS = 'points',
RESPONSE_TIME = 'responseTime',
ERROR_COUNT = 'errorCount'
}
export const AppLogKeysEnumMap = {
[AppLogKeysEnum.SOURCE]: i18nT('app:logs_keys_source'),
[AppLogKeysEnum.USER]: i18nT('app:logs_keys_user'),
[AppLogKeysEnum.TITLE]: i18nT('app:logs_keys_title'),
[AppLogKeysEnum.SESSION_ID]: i18nT('app:logs_keys_sessionId'),
[AppLogKeysEnum.CREATED_TIME]: i18nT('app:logs_keys_createdTime'),
[AppLogKeysEnum.LAST_CONVERSATION_TIME]: i18nT('app:logs_keys_lastConversationTime'),
[AppLogKeysEnum.MESSAGE_COUNT]: i18nT('app:logs_keys_messageCount'),
[AppLogKeysEnum.FEEDBACK]: i18nT('app:logs_keys_feedback'),
[AppLogKeysEnum.CUSTOM_FEEDBACK]: i18nT('app:logs_keys_customFeedback'),
[AppLogKeysEnum.ANNOTATED_COUNT]: i18nT('app:logs_keys_annotatedCount'),
[AppLogKeysEnum.POINTS]: i18nT('app:logs_keys_points'),
[AppLogKeysEnum.RESPONSE_TIME]: i18nT('app:logs_keys_responseTime'),
[AppLogKeysEnum.ERROR_COUNT]: i18nT('app:logs_keys_errorCount')
};
export const DefaultAppLogKeys = [
{ key: AppLogKeysEnum.SOURCE, enable: true },
{ key: AppLogKeysEnum.USER, enable: true },
{ key: AppLogKeysEnum.TITLE, enable: true },
{ key: AppLogKeysEnum.SESSION_ID, enable: false },
{ key: AppLogKeysEnum.CREATED_TIME, enable: false },
{ key: AppLogKeysEnum.LAST_CONVERSATION_TIME, enable: true },
{ key: AppLogKeysEnum.MESSAGE_COUNT, enable: true },
{ key: AppLogKeysEnum.FEEDBACK, enable: true },
{ key: AppLogKeysEnum.CUSTOM_FEEDBACK, enable: false },
{ key: AppLogKeysEnum.ANNOTATED_COUNT, enable: false },
{ key: AppLogKeysEnum.POINTS, enable: false },
{ key: AppLogKeysEnum.RESPONSE_TIME, enable: false },
{ key: AppLogKeysEnum.ERROR_COUNT, enable: false }
];
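A hypothetical usage sketch of the log-keys config introduced above: resolve the columns to display by keeping enabled keys in their configured order. The enum and default list are abridged here; `resolveLogKeys` is an illustrative helper, not part of the file above.

```typescript
// Abridged copies of the enum and defaults from the new constants file.
enum AppLogKeysEnum {
  SOURCE = 'source',
  USER = 'user',
  SESSION_ID = 'sessionId'
}

type AppLogKeysType = { key: AppLogKeysEnum; enable: boolean };

const DefaultAppLogKeys: AppLogKeysType[] = [
  { key: AppLogKeysEnum.SOURCE, enable: true },
  { key: AppLogKeysEnum.USER, enable: true },
  { key: AppLogKeysEnum.SESSION_ID, enable: false }
];

// Fall back to the defaults when a team has no saved log-keys config.
const resolveLogKeys = (saved?: AppLogKeysType[]): AppLogKeysEnum[] =>
  (saved ?? DefaultAppLogKeys).filter((k) => k.enable).map((k) => k.key);
```

Keeping order and enablement together in one array is what lets the logs table render columns exactly as the user arranged them.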

packages/global/core/app/logs/type.d.ts vendored Normal file

@ -0,0 +1,12 @@
import type { AppLogKeysEnum } from './constants';
export type AppLogKeysType = {
key: AppLogKeysEnum;
enable: boolean;
};
export type AppLogKeysSchemaType = {
teamId: string;
appId: string;
logKeys: AppLogKeysType[];
};


@ -14,38 +14,37 @@ import { type RuntimeNodeItemType } from '../../workflow/runtime/type';
import { type StoreSecretValueType } from '../../../common/secret/type';
import { jsonSchema2NodeInput } from '../jsonschema';
import { getNanoid } from '../../../common/string/tools';
import { PluginSourceEnum } from '../plugin/constants';
export const getMCPToolSetRuntimeNode = ({
url,
toolList,
headerSecret,
name,
avatar
avatar,
toolId
}: {
url: string;
toolList: McpToolConfigType[];
headerSecret?: StoreSecretValueType;
name?: string;
avatar?: string;
toolId: string;
}): RuntimeNodeItemType => {
return {
nodeId: getNanoid(16),
flowNodeType: FlowNodeTypeEnum.toolSet,
avatar,
intro: '',
inputs: [
{
key: NodeInputKeyEnum.toolSetData,
label: '',
valueType: WorkflowIOValueTypeEnum.object,
renderTypeList: [FlowNodeInputTypeEnum.hidden],
value: {
url,
headerSecret,
toolList
}
intro: 'MCP Tools',
toolConfig: {
mcpToolSet: {
toolList,
headerSecret,
url,
toolId
}
],
},
inputs: [],
outputs: [],
name: name || '',
version: ''
@@ -54,34 +53,24 @@ export const getMCPToolSetRuntimeNode = ({
export const getMCPToolRuntimeNode = ({
tool,
url,
headerSecret,
avatar = 'core/app/type/mcpToolsFill'
avatar = 'core/app/type/mcpToolsFill',
parentId
}: {
tool: McpToolConfigType;
url: string;
headerSecret?: StoreSecretValueType;
avatar?: string;
parentId: string;
}): RuntimeNodeItemType => {
return {
nodeId: getNanoid(16),
flowNodeType: FlowNodeTypeEnum.tool,
avatar,
intro: tool.description,
inputs: [
{
key: NodeInputKeyEnum.toolData,
label: 'Tool Data',
valueType: WorkflowIOValueTypeEnum.object,
renderTypeList: [FlowNodeInputTypeEnum.hidden],
value: {
...tool,
url,
headerSecret
}
},
...jsonSchema2NodeInput(tool.inputSchema)
],
toolConfig: {
mcpTool: {
toolId: `${PluginSourceEnum.mcp}-${parentId}/${tool.name}`
}
},
inputs: jsonSchema2NodeInput(tool.inputSchema),
outputs: [
{
id: NodeOutputKeyEnum.rawResponse,
@@ -97,3 +86,12 @@ export const getMCPToolRuntimeNode = ({
version: ''
};
};
/**
* Get the parent id of the mcp toolset
* mcp-123123/toolName ==> 123123
* 123123/toolName ==> 123123
* @param id mcp-parentId/name or parentId/name
* @returns parentId
*/
export const getMCPParentId = (id: string) => id.split('-').pop()?.split('/')[0];
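Both id forms named in the comment resolve to the same parent. A standalone check (the function body is reproduced from above so the snippet runs on its own; note it assumes the parent id itself contains no `-`, which holds for hex ObjectIds):

```typescript
const getMCPParentId = (id: string) => id.split('-').pop()?.split('/')[0];

// Prefixed form: mcp-<parentId>/<toolName>
const fromPrefixed = getMCPParentId('mcp-123123/searchTool'); // '123123'
// Legacy form: <parentId>/<toolName>
const fromLegacy = getMCPParentId('123123/searchTool'); // '123123'
```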


@@ -2,6 +2,7 @@ export enum PluginSourceEnum {
personal = 'personal', // this is an app.
systemTool = 'systemTool', // FastGPT-plugin tools, pure code.
commercial = 'commercial', // configured in Pro, with associatedPluginId. In particular, commercial-dalle3 is a systemTool
mcp = 'mcp', // mcp
// @deprecated
community = 'community' // this is deprecated, will be replaced by systemTool
}


@@ -1,6 +1,7 @@
import { type StoreNodeItemType } from '../../workflow/type/node';
import { type FlowNodeInputItemType } from '../../workflow/type/io';
import { FlowNodeTypeEnum } from '../../workflow/node/constant';
import { PluginSourceEnum } from './constants';
export const getPluginInputsFromStoreNodes = (nodes: StoreNodeItemType[]) => {
return nodes.find((node) => node.flowNodeType === FlowNodeTypeEnum.pluginInput)?.inputs || [];
@@ -22,3 +23,42 @@
});
return JSON.stringify(pluginInputsWithValue);
};
/**
plugin id rule:
- personal: ObjectId
- commercial: commercial-ObjectId
- systemtool: systemTool-id
- mcp tool: mcp-parentId/toolName
(deprecated) community: community-id
*/
export function splitCombinePluginId(id: string) {
const splitRes = id.split('-');
if (splitRes.length === 1) {
// app id
return {
source: PluginSourceEnum.personal,
pluginId: id
};
}
const [source, ...rest] = id.split('-') as [PluginSourceEnum, string | undefined];
const pluginId = rest.join('-');
if (!source || !pluginId) throw new Error('pluginId not found');
// Compatible with plugins from before 4.10.0
if (source === 'community' || id === 'commercial-dalle3') {
return {
source: PluginSourceEnum.systemTool,
pluginId: `${PluginSourceEnum.systemTool}-${pluginId}`
};
}
if (source === 'mcp') {
return {
source: PluginSourceEnum.mcp,
pluginId
};
}
return { source, pluginId: id };
}
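The id rules above can be condensed into a standalone sketch for illustration (the `parsePluginId` name and the string literals standing in for `PluginSourceEnum` are assumptions):

```typescript
type ParsedPluginId = { source: string; pluginId: string };

function parsePluginId(id: string): ParsedPluginId {
  // Plain ObjectId: a personal app.
  if (!id.includes('-')) return { source: 'personal', pluginId: id };

  const [source, ...rest] = id.split('-');
  const pluginId = rest.join('-');
  if (!pluginId) throw new Error('pluginId not found');

  // Pre-4.10.0 plugins are remapped onto the systemTool source.
  if (source === 'community' || id === 'commercial-dalle3') {
    return { source: 'systemTool', pluginId: `systemTool-${pluginId}` };
  }
  // MCP tool ids are stripped down to parentId/toolName.
  if (source === 'mcp') return { source: 'mcp', pluginId };
  // systemTool / commercial sources keep the full combined id.
  return { source, pluginId: id };
}
```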


@@ -146,7 +146,7 @@ export type SettingAIDataType = {
// variable
export type VariableItemType = {
id: string;
// id: string;
key: string;
label: string;
type: VariableInputEnum;
@@ -161,6 +161,8 @@ export type VariableItemType = {
max?: number;
min?: number;
// select
list?: { label: string; value: string }[];
// @deprecated
enums?: { value: string; label: string }[];
};
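Since `enums` is kept only for backwards compatibility, consumers typically normalize to `list` on read. A hedged sketch (`getVariableOptions` is an assumed helper, not part of the source):

```typescript
type VariableOption = { label: string; value: string };
type VariableLike = { list?: VariableOption[]; enums?: VariableOption[] };

// Prefer the new `list` field; fall back to the deprecated `enums`.
function getVariableOptions(v: VariableLike): VariableOption[] {
  return v.list ?? v.enums ?? [];
}
```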
// tts


@@ -105,6 +105,13 @@ export const toolValueTypeList: {
type: 'boolean'
}
}
},
{
label: 'object',
value: WorkflowIOValueTypeEnum.object,
jsonSchema: {
type: 'object'
}
}
];
export const valueTypeJsonSchemaMap: Record<string, JsonSchemaPropertiesItemType> =


@@ -87,6 +87,7 @@ export const valueTypeFormat = (value: any, type?: WorkflowIOValueTypeEnum) => {
return typeof value === 'object' ? JSON.stringify(value) : String(value);
}
if (type === WorkflowIOValueTypeEnum.number) {
if (value === '') return undefined;
return Number(value);
}
if (type === WorkflowIOValueTypeEnum.boolean) {


@@ -23,14 +23,29 @@ import type { AppDetailType, AppSchema, McpToolConfigType } from '../../app/type
import type { ParentIdType } from 'common/parentFolder/type';
import { AppTypeEnum } from '../../app/constants';
import type { WorkflowInteractiveResponseType } from '../template/system/interactive/type';
import type { StoreSecretValueType } from '../../../common/secret/type';
export type NodeToolConfigType = {
mcpTool?: McpToolConfigType & {
mcpToolSet?: {
toolId: string; // ObjectId of the MCP App
url: string;
headerSecret?: StoreSecretValueType;
toolList: McpToolConfigType[];
};
mcpTool?: {
toolId: string;
};
systemTool?: {
toolId: string;
};
systemToolSet?: {
toolId: string;
toolList: {
toolId: string;
name: string;
description: string;
}[];
};
};
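`NodeToolConfigType` acts as a tagged record: exactly one of the four optional members is expected to be set, and each carries a `toolId`. Resolving it can be sketched as (the `getToolIdFromConfig` helper is an assumption; member shapes are simplified to the shared `toolId` field):

```typescript
type NodeToolConfigLike = {
  mcpToolSet?: { toolId: string };
  mcpTool?: { toolId: string };
  systemTool?: { toolId: string };
  systemToolSet?: { toolId: string };
};

// Return the toolId of whichever variant is present, if any.
function getToolIdFromConfig(config?: NodeToolConfigLike): string | undefined {
  return (
    config?.mcpToolSet?.toolId ??
    config?.mcpTool?.toolId ??
    config?.systemTool?.toolId ??
    config?.systemToolSet?.toolId
  );
}
```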
export type FlowNodeCommonType = {


@@ -18,6 +18,7 @@ import {
type ReferenceArrayValueType,
type ReferenceItemValueType
} from './type/io.d';
import type { NodeToolConfigType } from './type/node';
import { type StoreNodeItemType } from './type/node';
import type {
VariableItemType,
@@ -310,35 +311,25 @@ export const appData2FlowNodeIO = ({
};
};
export const toolData2FlowNodeIO = ({
nodes
}: {
nodes: StoreNodeItemType[];
}): {
inputs: FlowNodeInputItemType[];
outputs: FlowNodeOutputItemType[];
} => {
export const toolData2FlowNodeIO = ({ nodes }: { nodes: StoreNodeItemType[] }) => {
const toolNode = nodes.find((node) => node.flowNodeType === FlowNodeTypeEnum.tool);
return {
inputs: toolNode?.inputs || [],
outputs: toolNode?.outputs || []
outputs: toolNode?.outputs || [],
toolConfig: toolNode?.toolConfig
};
};
export const toolSetData2FlowNodeIO = ({
nodes
}: {
nodes: StoreNodeItemType[];
}): {
inputs: FlowNodeInputItemType[];
outputs: FlowNodeOutputItemType[];
} => {
export const toolSetData2FlowNodeIO = ({ nodes }: { nodes: StoreNodeItemType[] }) => {
const toolSetNode = nodes.find((node) => node.flowNodeType === FlowNodeTypeEnum.toolSet);
return {
inputs: toolSetNode?.inputs || [],
outputs: toolSetNode?.outputs || []
outputs: toolSetNode?.outputs || [],
toolConfig: toolSetNode?.toolConfig,
showSourceHandle: false,
showTargetHandle: false
};
};
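Both selectors share the same shape: find the single tool/toolSet node and project its IO plus `toolConfig`. A self-contained sketch of the toolset variant (the minimal node type and the string literal in place of `FlowNodeTypeEnum.toolSet` are assumptions):

```typescript
type NodeLike = {
  flowNodeType: string;
  inputs?: unknown[];
  outputs?: unknown[];
  toolConfig?: Record<string, unknown>;
};

function toolSetDataToFlowNodeIO(nodes: NodeLike[]) {
  const toolSetNode = nodes.find((node) => node.flowNodeType === 'toolSet');
  return {
    inputs: toolSetNode?.inputs ?? [],
    outputs: toolSetNode?.outputs ?? [],
    toolConfig: toolSetNode?.toolConfig,
    // Toolset cards take no workflow edges in or out.
    showSourceHandle: false,
    showTargetHandle: false
  };
}
```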


@@ -1,10 +0,0 @@
{
"provider": "AliCloud",
"list": [
{
"model": "SenseVoiceSmall",
"name": "SenseVoiceSmall",
"type": "stt"
}
]
}


@@ -1,17 +0,0 @@
{
"provider": "BAAI",
"list": [
{
"model": "bge-m3",
"name": "bge-m3",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
},
{
"model": "bge-reranker-v2-m3",
"name": "bge-reranker-v2-m3",
"type": "rerank"
}
]
}


@@ -1,4 +0,0 @@
{
"provider": "Baichuan",
"list": []
}


@@ -1,121 +0,0 @@
{
"provider": "ChatGLM",
"list": [
{
"model": "glm-4-air",
"name": "glm-4-air",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "glm-4-flash",
"name": "glm-4-flash",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "glm-4-long",
"name": "glm-4-long",
"maxContext": 1000000,
"maxResponse": 4000,
"quoteMaxToken": 900000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "glm-4-plus",
"name": "GLM-4-plus",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": 0.99,
"showTopP": true,
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "glm-4v-flash",
"name": "glm-4v-flash",
"maxContext": 8000,
"maxResponse": 1000,
"quoteMaxToken": 6000,
"maxTemperature": 0.99,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "glm-4v-plus",
"name": "GLM-4v-plus",
"maxContext": 8000,
"maxResponse": 1000,
"quoteMaxToken": 6000,
"maxTemperature": 0.99,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "embedding-3",
"name": "embedding-3",
"defaultToken": 512,
"maxToken": 8000,
"defaultConfig": {
"dimensions": 1024
},
"type": "embedding"
}
]
}


@@ -1,124 +0,0 @@
{
"provider": "Claude",
"list": [
{
"model": "claude-sonnet-4-20250514",
"name": "claude-sonnet-4-20250514",
"maxContext": 200000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-opus-4-20250514",
"name": "claude-opus-4-20250514",
"maxContext": 200000,
"maxResponse": 4096,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-3-7-sonnet-20250219",
"name": "claude-3-7-sonnet-20250219",
"maxContext": 200000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-3-5-haiku-20241022",
"name": "claude-3-5-haiku-20241022",
"maxContext": 200000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-3-5-sonnet-20240620",
"name": "Claude-3-5-sonnet-20240620",
"maxContext": 200000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-3-5-sonnet-20241022",
"name": "Claude-3-5-sonnet-20241022",
"maxContext": 200000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "claude-3-opus-20240229",
"name": "claude-3-opus-20240229",
"maxContext": 200000,
"maxResponse": 4096,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
}
]
}


@@ -1,39 +0,0 @@
{
"provider": "DeepSeek",
"list": [
{
"model": "deepseek-chat",
"name": "Deepseek-chat",
"maxContext": 64000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"showTopP": true,
"responseFormatList": ["text", "json_object"],
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm"
},
{
"model": "deepseek-reasoner",
"name": "Deepseek-reasoner",
"maxContext": 64000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": null,
"vision": false,
"reasoning": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": false,
"showStopSign": false
}
]
}


@@ -1,276 +0,0 @@
{
"provider": "Doubao",
"list": [
{
"model": "doubao-seed-1-6-250615",
"name": "doubao-seed-1-6-250615",
"maxContext": 220000,
"maxResponse": 16000,
"quoteMaxToken": 220000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "doubao-seed-1-6-flash-250615",
"name": "doubao-seed-1-6-flash-250615",
"maxContext": 220000,
"maxResponse": 16000,
"quoteMaxToken": 220000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "doubao-seed-1-6-thinking-250615",
"name": "doubao-seed-1-6-thinking-250615",
"maxContext": 220000,
"maxResponse": 16000,
"quoteMaxToken": 220000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-1.5-lite-32k",
"name": "Doubao-1.5-lite-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-1.5-pro-32k",
"name": "Doubao-1.5-pro-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-1.5-pro-256k",
"name": "Doubao-1.5-pro-256k",
"maxContext": 256000,
"maxResponse": 12000,
"quoteMaxToken": 256000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-1.5-vision-pro-32k",
"name": "Doubao-1.5-vision-pro-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-lite-4k",
"name": "Doubao-lite-4k",
"maxContext": 4000,
"maxResponse": 4000,
"quoteMaxToken": 4000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-lite-32k",
"name": "Doubao-lite-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "Doubao-lite-128k",
"name": "Doubao-lite-128k",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-vision-lite-32k",
"name": "Doubao-vision-lite-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-pro-4k",
"name": "Doubao-pro-4k",
"maxContext": 4000,
"maxResponse": 4000,
"quoteMaxToken": 4000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-pro-32k",
"name": "Doubao-pro-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-pro-128k",
"name": "Doubao-pro-128k",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-vision-pro-32k",
"name": "Doubao-vision-pro-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Doubao-embedding-large",
"name": "Doubao-embedding-large",
"defaultToken": 512,
"maxToken": 4096,
"type": "embedding",
"normalization": true
},
{
"model": "Doubao-embedding",
"name": "Doubao-embedding",
"defaultToken": 512,
"maxToken": 4096,
"type": "embedding",
"normalization": true
}
]
}


@@ -1,87 +0,0 @@
{
"provider": "Ernie",
"list": [
{
"model": "ERNIE-4.0-8K",
"name": "ERNIE-4.0-8K",
"maxContext": 8000,
"maxResponse": 2048,
"quoteMaxToken": 5000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "ERNIE-4.0-Turbo-8K",
"name": "ERNIE-4.0-Turbo-8K",
"maxContext": 8000,
"maxResponse": 2048,
"quoteMaxToken": 5000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "ERNIE-Lite-8K",
"name": "ERNIE-lite-8k",
"maxContext": 8000,
"maxResponse": 2048,
"quoteMaxToken": 6000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "ERNIE-Speed-128K",
"name": "ERNIE-Speed-128K",
"maxContext": 128000,
"maxResponse": 4096,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Embedding-V1",
"name": "Embedding-V1",
"defaultToken": 512,
"maxToken": 1000,
"type": "embedding"
},
{
"model": "tao-8k",
"name": "tao-8k",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
}
]
}


@@ -1,4 +0,0 @@
{
"provider": "FishAudio",
"list": []
}


@@ -1,214 +0,0 @@
{
"provider": "Gemini",
"list": [
{
"model": "gemini-2.5-pro",
"name": "gemini-2.5-pro",
"maxContext": 1000000,
"maxResponse": 63000,
"quoteMaxToken": 1000000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.5-flash",
"name": "gemini-2.5-flash",
"maxContext": 1000000,
"maxResponse": 63000,
"quoteMaxToken": 1000000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.5-pro-exp-03-25",
"name": "gemini-2.5-pro-exp-03-25",
"maxContext": 1000000,
"maxResponse": 63000,
"quoteMaxToken": 1000000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.5-flash-preview-04-17",
"name": "gemini-2.5-flash-preview-04-17",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.0-flash",
"name": "gemini-2.0-flash",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.0-pro-exp",
"name": "gemini-2.0-pro-exp",
"maxContext": 2000000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-1.5-flash",
"name": "gemini-1.5-flash",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-1.5-pro",
"name": "gemini-1.5-pro",
"maxContext": 2000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.0-flash-exp",
"name": "gemini-2.0-flash-exp",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.0-flash-thinking-exp-1219",
"name": "gemini-2.0-flash-thinking-exp-1219",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-2.0-flash-thinking-exp-01-21",
"name": "gemini-2.0-flash-thinking-exp-01-21",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gemini-exp-1206",
"name": "gemini-exp-1206",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "text-embedding-004",
"name": "text-embedding-004",
"defaultToken": 512,
"maxToken": 2000,
"type": "embedding"
}
]
}


@@ -1,105 +0,0 @@
{
"provider": "Grok",
"list": [
{
"model": "grok-4",
"name": "grok-4",
"maxContext": 256000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "grok-4-0709",
"name": "grok-4-0709",
"maxContext": 256000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "grok-3-mini",
"name": "grok-3-mini",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "grok-3-mini-fast",
"name": "grok-3-mini-fast",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "grok-3",
"name": "grok-3",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "grok-3-fast",
"name": "grok-3-fast",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
}
]
}


@@ -1,37 +0,0 @@
{
"provider": "Groq",
"list": [
{
"model": "llama-3.1-8b-instant",
"name": "Groq-llama-3.1-8b-instant",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "llama-3.3-70b-versatile",
"name": "Groq-llama-3.3-70b-versatile",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
}
]
}


@@ -1,131 +0,0 @@
{
"provider": "Hunyuan",
"list": [
{
"model": "hunyuan-large",
"name": "hunyuan-large",
"maxContext": 28000,
"maxResponse": 4000,
"quoteMaxToken": 20000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-lite",
"name": "hunyuan-lite",
"maxContext": 250000,
"maxResponse": 6000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-pro",
"name": "hunyuan-pro",
"maxContext": 28000,
"maxResponse": 4000,
"quoteMaxToken": 28000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-standard",
"name": "hunyuan-standard",
"maxContext": 32000,
"maxResponse": 2000,
"quoteMaxToken": 20000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-turbo-vision",
"name": "hunyuan-turbo-vision",
"maxContext": 6000,
"maxResponse": 2000,
"quoteMaxToken": 6000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-turbo",
"name": "hunyuan-turbo",
"maxContext": 28000,
"maxResponse": 4000,
"quoteMaxToken": 20000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-vision",
"name": "hunyuan-vision",
"maxContext": 6000,
"maxResponse": 2000,
"quoteMaxToken": 4000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "hunyuan-embedding",
"name": "hunyuan-embedding",
"defaultToken": 512,
"maxToken": 1024,
"type": "embedding"
}
]
}


@@ -1,39 +0,0 @@
{
"provider": "Intern",
"list": [
{
"model": "internlm2-pro-chat",
"name": "internlm2-pro-chat",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "internlm3-8b-instruct",
"name": "internlm3-8b-instruct",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
}
]
}


@@ -1,4 +0,0 @@
{
"provider": "Meta",
"list": []
}


@@ -1,230 +0,0 @@
{
"provider": "MiniMax",
"list": [
{
"model": "MiniMax-Text-01",
"name": "MiniMax-Text-01",
"maxContext": 1000000,
"maxResponse": 1000000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "abab6.5s-chat",
"name": "MiniMax-abab6.5s",
"maxContext": 245000,
"maxResponse": 10000,
"quoteMaxToken": 240000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "speech-01-turbo",
"name": "speech-01-turbo",
"voices": [
{
"label": "male-qn-qingse",
"value": "male-qn-qingse"
},
{
"label": "male-qn-jingying",
"value": "male-qn-jingying"
},
{
"label": "male-qn-badao",
"value": "male-qn-badao"
},
{
"label": "male-qn-daxuesheng",
"value": "male-qn-daxuesheng"
},
{
"label": "female-shaonv",
"value": "female-shaonv"
},
{
"label": "female-yujie",
"value": "female-yujie"
},
{
"label": "female-chengshu",
"value": "female-chengshu"
},
{
"label": "female-tianmei",
"value": "female-tianmei"
},
{
"label": "presenter_male",
"value": "presenter_male"
},
{
"label": "presenter_female",
"value": "presenter_female"
},
{
"label": "audiobook_male_1",
"value": "audiobook_male_1"
},
{
"label": "audiobook_male_2",
"value": "audiobook_male_2"
},
{
"label": "audiobook_female_1",
"value": "audiobook_female_1"
},
{
"label": "audiobook_female_2",
"value": "audiobook_female_2"
},
{
"label": "male-qn-qingse-jingpin",
"value": "male-qn-qingse-jingpin"
},
{
"label": "male-qn-jingying-jingpin",
"value": "male-qn-jingying-jingpin"
},
{
"label": "male-qn-badao-jingpin",
"value": "male-qn-badao-jingpin"
},
{
"label": "male-qn-daxuesheng-jingpin",
"value": "male-qn-daxuesheng-jingpin"
},
{
"label": "female-shaonv-jingpin",
"value": "female-shaonv-jingpin"
},
{
"label": "female-yujie-jingpin",
"value": "female-yujie-jingpin"
},
{
"label": "female-chengshu-jingpin",
"value": "female-chengshu-jingpin"
},
{
"label": "female-tianmei-jingpin",
"value": "female-tianmei-jingpin"
},
{
"label": "clever_boy",
"value": "clever_boy"
},
{
"label": "cute_boy",
"value": "cute_boy"
},
{
"label": "lovely_girl",
"value": "lovely_girl"
},
{
"label": "cartoon_pig",
"value": "cartoon_pig"
},
{
"label": "bingjiao_didi",
"value": "bingjiao_didi"
},
{
"label": "junlang_nanyou",
"value": "junlang_nanyou"
},
{
"label": "chunzhen_xuedi",
"value": "chunzhen_xuedi"
},
{
"label": "lengdan_xiongzhang",
"value": "lengdan_xiongzhang"
},
{
"label": "badao_shaoye",
"value": "badao_shaoye"
},
{
"label": "tianxin_xiaoling",
"value": "tianxin_xiaoling"
},
{
"label": "qiaopi_mengmei",
"value": "qiaopi_mengmei"
},
{
"label": "wumei_yujie",
"value": "wumei_yujie"
},
{
"label": "diadia_xuemei",
"value": "diadia_xuemei"
},
{
"label": "danya_xuejie",
"value": "danya_xuejie"
},
{
"label": "Santa_Claus",
"value": "Santa_Claus"
},
{
"label": "Grinch",
"value": "Grinch"
},
{
"label": "Rudolph",
"value": "Rudolph"
},
{
"label": "Arnold",
"value": "Arnold"
},
{
"label": "Charming_Santa",
"value": "Charming_Santa"
},
{
"label": "Charming_Lady",
"value": "Charming_Lady"
},
{
"label": "Sweet_Girl",
"value": "Sweet_Girl"
},
{
"label": "Cute_Elf",
"value": "Cute_Elf"
},
{
"label": "Attractive_Girl",
"value": "Attractive_Girl"
},
{
"label": "Serene_Woman",
"value": "Serene_Woman"
}
],
"type": "tts"
}
]
}


@@ -1,73 +0,0 @@
{
"provider": "MistralAI",
"list": [
{
"model": "ministral-3b-latest",
"name": "Ministral-3b-latest",
"maxContext": 130000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "ministral-8b-latest",
"name": "Ministral-8b-latest",
"maxContext": 130000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "mistral-large-latest",
"name": "Mistral-large-latest",
"maxContext": 130000,
"maxResponse": 8000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "mistral-small-latest",
"name": "Mistral-small-latest",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1.2,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
}
]
}


@@ -1,4 +0,0 @@
{
"provider": "Moka",
"list": []
}


@@ -1,181 +0,0 @@
{
"provider": "Moonshot",
"list": [
{
"model": "kimi-k2-0711-preview",
"name": "kimi-k2-0711-preview",
"maxContext": 128000,
"maxResponse": 32000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "kimi-latest-8k",
"name": "kimi-latest-8k",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 6000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "kimi-latest-32k",
"name": "kimi-latest-32k",
"maxContext": 32000,
"maxResponse": 16000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "kimi-latest-128k",
"name": "kimi-latest-128k",
"maxContext": 128000,
"maxResponse": 32000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-8k",
"name": "moonshot-v1-8k",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 6000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-32k",
"name": "moonshot-v1-32k",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-128k",
"name": "moonshot-v1-128k",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-8k-vision-preview",
"name": "moonshot-v1-8k-vision-preview",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 6000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-32k-vision-preview",
"name": "moonshot-v1-32k-vision-preview",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "moonshot-v1-128k-vision-preview",
"name": "moonshot-v1-128k-vision-preview",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 60000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
}
]
}


@@ -1,4 +0,0 @@
{
"provider": "Ollama",
"list": []
}


@@ -1,301 +0,0 @@
{
"provider": "OpenAI",
"list": [
{
"model": "gpt-4.1",
"name": "gpt-4.1",
"maxContext": 1000000,
"maxResponse": 32000,
"quoteMaxToken": 1000000,
"maxTemperature": 1.2,
"showTopP": true,
"responseFormatList": ["text", "json_object", "json_schema"],
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "gpt-4.1-mini",
"name": "gpt-4.1-mini",
"maxContext": 1000000,
"maxResponse": 32000,
"quoteMaxToken": 1000000,
"maxTemperature": 1.2,
"showTopP": true,
"responseFormatList": ["text", "json_object", "json_schema"],
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "gpt-4.1-nano",
"name": "gpt-4.1-nano",
"maxContext": 1000000,
"maxResponse": 32000,
"quoteMaxToken": 1000000,
"maxTemperature": 1.2,
"showTopP": true,
"responseFormatList": ["text", "json_object", "json_schema"],
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "gpt-4o-mini",
"name": "GPT-4o-mini",
"maxContext": 128000,
"maxResponse": 16000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"showTopP": true,
"responseFormatList": ["text", "json_object", "json_schema"],
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "gpt-4o",
"name": "GPT-4o",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"showTopP": true,
"responseFormatList": ["text", "json_object", "json_schema"],
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm"
},
{
"model": "o4-mini",
"name": "o4-mini",
"maxContext": 200000,
"maxResponse": 100000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": false
},
{
"model": "o3",
"name": "o3",
"maxContext": 200000,
"maxResponse": 100000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": false
},
{
"model": "o3-mini",
"name": "o3-mini",
"maxContext": 200000,
"maxResponse": 100000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "o1",
"name": "o1",
"maxContext": 195000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "o1-mini",
"name": "o1-mini",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "o1-preview",
"name": "o1-preview",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 120000,
"maxTemperature": null,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": false
},
"fieldMap": {
"max_tokens": "max_completion_tokens"
},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "gpt-3.5-turbo",
"name": "gpt-3.5-turbo",
"maxContext": 16000,
"maxResponse": 4000,
"quoteMaxToken": 13000,
"maxTemperature": 1.2,
"showTopP": true,
"showStopSign": true,
"vision": false,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"type": "llm"
},
{
"model": "gpt-4-turbo",
"name": "gpt-4-turbo",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 60000,
"maxTemperature": 1.2,
"showTopP": true,
"showStopSign": true,
"vision": true,
"toolChoice": true,
"functionCall": true,
"defaultSystemChatPrompt": "",
"type": "llm"
},
{
"model": "text-embedding-3-large",
"name": "text-embedding-3-large",
"defaultToken": 512,
"maxToken": 8000,
"defaultConfig": {
"dimensions": 1024
},
"type": "embedding"
},
{
"model": "text-embedding-3-small",
"name": "text-embedding-3-small",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
},
{
"model": "text-embedding-ada-002",
"name": "text-embedding-ada-002",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
},
{
"model": "tts-1",
"name": "TTS1",
"voices": [
{
"label": "Alloy",
"value": "alloy"
},
{
"label": "Echo",
"value": "echo"
},
{
"label": "Fable",
"value": "fable"
},
{
"label": "Onyx",
"value": "onyx"
},
{
"label": "Nova",
"value": "nova"
},
{
"label": "Shimmer",
"value": "shimmer"
}
],
"type": "tts"
},
{
"model": "whisper-1",
"name": "whisper-1",
"type": "stt"
}
]
}
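Several OpenAI entries above (the o-series models) carry a `fieldMap` such as `{"max_tokens": "max_completion_tokens"}`, which suggests request-body keys are renamed before the call reaches the provider. A minimal sketch of how such a remap could be applied — the helper name and call site are illustrative assumptions, not FastGPT's actual implementation:

```typescript
// Hypothetical helper: rename request-body keys according to a model's fieldMap.
// The fieldMap shape matches the configs above: { sourceKey: targetKey }.
function applyFieldMap(
  body: Record<string, unknown>,
  fieldMap: Record<string, string> = {}
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(body)) {
    // Use the mapped key when one exists, otherwise keep the original key.
    out[fieldMap[key] ?? key] = value;
  }
  return out;
}

// Example: o-series models take `max_completion_tokens` instead of `max_tokens`.
const mapped = applyFieldMap(
  { model: 'o3', max_tokens: 4096 },
  { max_tokens: 'max_completion_tokens' }
);
// mapped → { model: 'o3', max_completion_tokens: 4096 }
```

An empty `fieldMap` (as on most entries) leaves the request untouched, so the same code path serves every model.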


@@ -1,4 +0,0 @@
{
"provider": "Other",
"list": []
}


@@ -1,4 +0,0 @@
{
"provider": "PPIO",
"list": []
}


@@ -1,440 +0,0 @@
{
"provider": "Qwen",
"list": [
{
"model": "qwen-max",
"name": "Qwen-max",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen-vl-max",
"name": "qwen-vl-max",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "qwen-plus",
"name": "Qwen-plus",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen-vl-plus",
"name": "qwen-vl-plus",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 120000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "qwen-turbo",
"name": "Qwen-turbo",
"maxContext": 1000000,
"maxResponse": 8000,
"quoteMaxToken": 1000000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-235b-a22b",
"name": "qwen3-235b-a22b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-32b",
"name": "qwen3-32b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-30b-a3b",
"name": "qwen3-30b-a3b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-14b",
"name": "qwen3-14b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-8b",
"name": "qwen3-8b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-4b",
"name": "qwen3-4b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-1.7b",
"name": "qwen3-1.7b",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 30000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen3-0.6b",
"name": "qwen3-0.6b",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 30000,
"maxTemperature": 1,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwq-plus",
"name": "qwq-plus",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": null,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"usedInExtractFields": false,
"usedInQueryExtension": false,
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": false,
"showStopSign": false
},
{
"model": "qwq-32b",
"name": "qwq-32b",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 100000,
"maxTemperature": null,
"vision": false,
"reasoning": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"usedInExtractFields": false,
"usedInQueryExtension": false,
"defaultConfig": {
"stream": true
},
"fieldMap": {},
"type": "llm",
"showTopP": false,
"showStopSign": false
},
{
"model": "qwen-coder-turbo",
"name": "qwen-coder-turbo",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "qwen2.5-7b-instruct",
"name": "qwen2.5-7b-instruct",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen2.5-14b-instruct",
"name": "qwen2.5-14b-instruct",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen2.5-32b-instruct",
"name": "qwen2.5-32b-instruct",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen2.5-72b-instruct",
"name": "Qwen2.5-72B-instruct",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true,
"responseFormatList": ["text", "json_object"]
},
{
"model": "qwen-long",
"name": "qwen-long",
"maxContext": 10000000,
"maxResponse": 6000,
"quoteMaxToken": 10000000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"datasetProcess": false,
"usedInClassify": false,
"usedInExtractFields": false,
"usedInQueryExtension": false,
"usedInToolCall": false,
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": false,
"showStopSign": false
},
{
"model": "text-embedding-v4",
"name": "text-embedding-v4",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding",
"defaultConfig": {
"dimensions": 1536
}
},
{
"model": "text-embedding-v3",
"name": "text-embedding-v3",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
},
{
"model": "gte-rerank-v2",
"name": "gte-rerank-v2",
"type": "rerank"
}
]
}
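A few Qwen entries (qwq-plus, qwq-32b, qwen-long) explicitly set capability flags like `usedInClassify: false` to `false`, while most entries omit them, implying omission means enabled. A hedged sketch of how a node could filter candidates on such a flag — the type and function names are assumptions for illustration:

```typescript
// Illustrative model shape covering only the fields used below.
interface ModelEntry {
  model: string;
  type: string;
  usedInClassify?: boolean; // absent means enabled, matching configs that omit it
}

// Return LLMs eligible for a classification node: flag absent or true.
function classifyCandidates(list: ModelEntry[]): string[] {
  return list
    .filter((m) => m.type === 'llm' && m.usedInClassify !== false)
    .map((m) => m.model);
}

const candidates = classifyCandidates([
  { model: 'qwen-max', type: 'llm' },
  { model: 'qwq-32b', type: 'llm', usedInClassify: false },
  { model: 'gte-rerank-v2', type: 'rerank' }
]);
// candidates → ['qwen-max']
```

Checking `!== false` rather than truthiness is what lets the many entries without the flag stay eligible.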


@@ -1,194 +0,0 @@
{
"provider": "Siliconflow",
"list": [
{
"model": "Qwen/Qwen2.5-72B-Instruct",
"name": "Qwen/Qwen2.5-72B-Instruct",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 50000,
"maxTemperature": 1,
"vision": false,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "Qwen/Qwen2-VL-72B-Instruct",
"name": "Qwen/Qwen2-VL-72B-Instruct",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"censor": false,
"vision": true,
"datasetProcess": false,
"usedInClassify": false,
"usedInExtractFields": false,
"usedInToolCall": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "deepseek-ai/DeepSeek-V2.5",
"name": "deepseek-ai/DeepSeek-V2.5",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": true,
"toolChoice": true,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "BAAI/bge-m3",
"name": "BAAI/bge-m3",
"defaultToken": 512,
"maxToken": 8000,
"type": "embedding"
},
{
"model": "FunAudioLLM/CosyVoice2-0.5B",
"name": "FunAudioLLM/CosyVoice2-0.5B",
"voices": [
{
"label": "alex",
"value": "FunAudioLLM/CosyVoice2-0.5B:alex"
},
{
"label": "anna",
"value": "FunAudioLLM/CosyVoice2-0.5B:anna"
},
{
"label": "bella",
"value": "FunAudioLLM/CosyVoice2-0.5B:bella"
},
{
"label": "benjamin",
"value": "FunAudioLLM/CosyVoice2-0.5B:benjamin"
},
{
"label": "charles",
"value": "FunAudioLLM/CosyVoice2-0.5B:charles"
},
{
"label": "claire",
"value": "FunAudioLLM/CosyVoice2-0.5B:claire"
},
{
"label": "david",
"value": "FunAudioLLM/CosyVoice2-0.5B:david"
},
{
"label": "diana",
"value": "FunAudioLLM/CosyVoice2-0.5B:diana"
}
],
"type": "tts"
},
{
"model": "RVC-Boss/GPT-SoVITS",
"name": "RVC-Boss/GPT-SoVITS",
"voices": [
{
"label": "alex",
"value": "RVC-Boss/GPT-SoVITS:alex"
},
{
"label": "anna",
"value": "RVC-Boss/GPT-SoVITS:anna"
},
{
"label": "bella",
"value": "RVC-Boss/GPT-SoVITS:bella"
},
{
"label": "benjamin",
"value": "RVC-Boss/GPT-SoVITS:benjamin"
},
{
"label": "charles",
"value": "RVC-Boss/GPT-SoVITS:charles"
},
{
"label": "claire",
"value": "RVC-Boss/GPT-SoVITS:claire"
},
{
"label": "david",
"value": "RVC-Boss/GPT-SoVITS:david"
},
{
"label": "diana",
"value": "RVC-Boss/GPT-SoVITS:diana"
}
],
"type": "tts"
},
{
"model": "fishaudio/fish-speech-1.5",
"name": "fish-speech-1.5",
"voices": [
{
"label": "alex",
"value": "fishaudio/fish-speech-1.5:alex"
},
{
"label": "anna",
"value": "fishaudio/fish-speech-1.5:anna"
},
{
"label": "bella",
"value": "fishaudio/fish-speech-1.5:bella"
},
{
"label": "benjamin",
"value": "fishaudio/fish-speech-1.5:benjamin"
},
{
"label": "charles",
"value": "fishaudio/fish-speech-1.5:charles"
},
{
"label": "claire",
"value": "fishaudio/fish-speech-1.5:claire"
},
{
"label": "david",
"value": "fishaudio/fish-speech-1.5:david"
},
{
"label": "diana",
"value": "fishaudio/fish-speech-1.5:diana"
}
],
"type": "tts"
},
{
"model": "FunAudioLLM/SenseVoiceSmall",
"name": "FunAudioLLM/SenseVoiceSmall",
"type": "stt"
},
{
"model": "BAAI/bge-reranker-v2-m3",
"name": "BAAI/bge-reranker-v2-m3",
"type": "rerank"
}
]
}


@@ -1,99 +0,0 @@
{
"provider": "SparkDesk",
"list": [
{
"model": "lite",
"name": "SparkDesk-lite",
"maxContext": 32000,
"maxResponse": 4000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "generalv3",
"name": "SparkDesk-Pro",
"maxContext": 8000,
"maxResponse": 8000,
"quoteMaxToken": 8000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "pro-128k",
"name": "SparkDesk-Pro-128k",
"maxContext": 128000,
"maxResponse": 4000,
"quoteMaxToken": 128000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "generalv3.5",
"name": "SparkDesk-max",
"maxContext": 8000,
"maxResponse": 8000,
"quoteMaxToken": 8000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "max-32k",
"name": "SparkDesk-max-32k",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 32000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "4.0Ultra",
"name": "SparkDesk-v4.0 Ultra",
"maxContext": 8000,
"maxResponse": 8000,
"quoteMaxToken": 8000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
}
]
}


@@ -1,253 +0,0 @@
{
"provider": "StepFun",
"list": [
{
"model": "step-1-flash",
"name": "step-1-flash",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 6000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1-8k",
"name": "step-1-8k",
"maxContext": 8000,
"maxResponse": 8000,
"quoteMaxToken": 8000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1-32k",
"name": "step-1-32k",
"maxContext": 32000,
"maxResponse": 8000,
"quoteMaxToken": 32000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1-128k",
"name": "step-1-128k",
"maxContext": 128000,
"maxResponse": 8000,
"quoteMaxToken": 128000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1-256k",
"name": "step-1-256k",
"maxContext": 256000,
"maxResponse": 8000,
"quoteMaxToken": 256000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1o-vision-32k",
"name": "step-1o-vision-32k",
"maxContext": 32000,
"quoteMaxToken": 32000,
"maxResponse": 8000,
"maxTemperature": 2,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1v-8k",
"name": "step-1v-8k",
"maxContext": 8000,
"maxResponse": 8000,
"quoteMaxToken": 8000,
"maxTemperature": 2,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-1v-32k",
"name": "step-1v-32k",
"maxContext": 32000,
"quoteMaxToken": 32000,
"maxResponse": 8000,
"maxTemperature": 2,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-2-mini",
"name": "step-2-mini",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 6000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-2-16k",
"name": "step-2-16k",
"maxContext": 16000,
"maxResponse": 4000,
"quoteMaxToken": 4000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-2-16k-exp",
"name": "step-2-16k-exp",
"maxContext": 16000,
"maxResponse": 4000,
"quoteMaxToken": 4000,
"maxTemperature": 2,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "step-tts-mini",
"name": "step-tts-mini",
"voices": [
{
"label": "cixingnansheng",
"value": "cixingnansheng"
},
{
"label": "zhengpaiqingnian",
"value": "zhengpaiqingnian"
},
{
"label": "yuanqinansheng",
"value": "yuanqinansheng"
},
{
"label": "qingniandaxuesheng",
"value": "qingniandaxuesheng"
},
{
"label": "boyinnansheng",
"value": "boyinnansheng"
},
{
"label": "ruyananshi",
"value": "ruyananshi"
},
{
"label": "shenchennanyin",
"value": "shenchennanyin"
},
{
"label": "qinqienvsheng",
"value": "qinqienvsheng"
},
{
"label": "wenrounvsheng",
"value": "wenrounvsheng"
},
{
"label": "jilingshaonv",
"value": "jilingshaonv"
},
{
"label": "yuanqishaonv",
"value": "yuanqishaonv"
},
{
"label": "ruanmengnvsheng",
"value": "ruanmengnvsheng"
},
{
"label": "youyanvsheng",
"value": "youyanvsheng"
},
{
"label": "lengyanyujie",
"value": "lengyanyujie"
},
{
"label": "shuangkuaijiejie",
"value": "shuangkuaijiejie"
},
{
"label": "wenjingxuejie",
"value": "wenjingxuejie"
},
{
"label": "linjiajiejie",
"value": "linjiajiejie"
},
{
"label": "linjiameimei",
"value": "linjiameimei"
},
{
"label": "zhixingjiejie",
"value": "zhixingjiejie"
}
],
"type": "tts"
}
]
}


@@ -1,39 +0,0 @@
{
"provider": "Yi",
"list": [
{
"model": "yi-lightning",
"name": "yi-lightning",
"maxContext": 16000,
"maxResponse": 4000,
"quoteMaxToken": 12000,
"maxTemperature": 1,
"vision": false,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
},
{
"model": "yi-vision-v2",
"name": "yi-vision-v2",
"maxContext": 16000,
"maxResponse": 4000,
"quoteMaxToken": 12000,
"maxTemperature": 1,
"vision": true,
"toolChoice": false,
"functionCall": false,
"defaultSystemChatPrompt": "",
"defaultConfig": {},
"fieldMap": {},
"type": "llm",
"showTopP": true,
"showStopSign": true
}
]
}


@@ -1,22 +0,0 @@
{
"provider": "Jina",
"list": [
{
"type": "embedding",
"model": "jina-embeddings-v3",
"name": "jina-embeddings-v3",
"defaultToken": 512,
"maxToken": 8000
},
{
"model": "jina-reranker-v2-base-multilingual",
"name": "jina-reranker-v2-base-multilingual",
"type": "rerank"
},
{
"model": "jina-reranker-m0",
"name": "jina-reranker-m0",
"type": "rerank"
}
]
}


@@ -1,5 +1,3 @@
import path from 'path';
import * as fs from 'fs';
import { type SystemModelItemType } from '../type';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { MongoSystemModel } from './schema';
@ -11,34 +9,16 @@ import {
type RerankModelItemType
} from '@fastgpt/global/core/ai/model.d';
import { debounce } from 'lodash';
import {
getModelProvider,
type ModelProviderIdType,
type ModelProviderType
} from '@fastgpt/global/core/ai/provider';
import { getModelProvider } from '@fastgpt/global/core/ai/provider';
import { findModelFromAlldata } from '../model';
import {
reloadFastGPTConfigBuffer,
updateFastGPTConfigBuffer
} from '../../../common/system/config/controller';
import { delay } from '@fastgpt/global/common/system/utils';
import { pluginClient } from '../../../thirdProvider/fastgptPlugin';
import { setCron } from '../../../common/system/cron';
const getModelConfigBaseUrl = () => {
const currentFileUrl = new URL(import.meta.url);
const filePath = decodeURIComponent(
process.platform === 'win32'
? currentFileUrl.pathname.substring(1) // Remove leading slash on Windows
: currentFileUrl.pathname
);
const modelsPath = path.join(path.dirname(filePath), 'provider');
return modelsPath;
};
/*
TODO: Read configs by priority
1.
2.
*/
export const loadSystemModels = async (init = false) => {
const pushModel = (model: SystemModelItemType) => {
global.systemModelList.push(model);
@ -108,17 +88,19 @@ export const loadSystemModels = async (init = false) => {
global.systemDefaultModel = {};
try {
const dbModels = await MongoSystemModel.find({}).lean();
// Get model from db and plugin
const [dbModels, systemModels] = await Promise.all([
MongoSystemModel.find({}).lean(),
pluginClient.model.list().then((res) => {
if (res.status === 200) return res.body;
console.error('Failed to get model list from FastGPT plugin');
return [];
})
]);
// Load system model from local
const modelsPath = getModelConfigBaseUrl();
const providerList = await fs.promises.readdir(modelsPath);
await Promise.all(
providerList.map(async (name) => {
const fileContent = (await import(`./provider/${name}`))?.default as {
provider: ModelProviderIdType;
list: SystemModelItemType[];
};
systemModels.map(async (model) => {
const mergeObject = (obj1: any, obj2: any) => {
if (!obj1 && !obj2) return undefined;
const formatObj1 = typeof obj1 === 'object' ? obj1 : {};
@ -126,27 +108,24 @@ export const loadSystemModels = async (init = false) => {
return { ...formatObj1, ...formatObj2 };
};
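The merge helper above is small enough to sketch standalone: it performs a shallow merge in which the db metadata (second argument) overrides the plugin's file defaults key by key, and it yields `undefined` when neither side is an object so the field is omitted entirely.

```typescript
// Standalone sketch of the mergeObject helper above: db metadata (obj2)
// overrides plugin defaults (obj1) key-by-key; undefined when both are missing.
const mergeObject = (obj1: any, obj2: any) => {
  if (!obj1 && !obj2) return undefined;
  const formatObj1 = typeof obj1 === 'object' ? obj1 : {};
  const formatObj2 = typeof obj2 === 'object' ? obj2 : {};
  return { ...formatObj1, ...formatObj2 };
};

// e.g. merging a plugin's defaultConfig with a db override:
const merged = mergeObject({ temperature: 0.7, top_p: 1 }, { temperature: 0.2 });
// merged: { temperature: 0.2, top_p: 1 }
```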
fileContent.list.forEach((fileModel) => {
const dbModel = dbModels.find((item) => item.model === fileModel.model);
const dbModel = dbModels.find((item) => item.model === model.model);
const modelData: any = {
...fileModel,
...dbModel?.metadata,
// @ts-ignore
defaultConfig: mergeObject(fileModel.defaultConfig, dbModel?.metadata?.defaultConfig),
// @ts-ignore
fieldMap: mergeObject(fileModel.fieldMap, dbModel?.metadata?.fieldMap),
provider: getModelProvider(dbModel?.metadata?.provider || fileContent.provider).id,
type: dbModel?.metadata?.type || fileModel.type,
isCustom: false
};
pushModel(modelData);
});
const modelData: any = {
...model,
...dbModel?.metadata,
// @ts-ignore
defaultConfig: mergeObject(model.defaultConfig, dbModel?.metadata?.defaultConfig),
// @ts-ignore
fieldMap: mergeObject(model.fieldMap, dbModel?.metadata?.fieldMap),
provider: getModelProvider(dbModel?.metadata?.provider || (model.provider as any)).id,
type: dbModel?.metadata?.type || model.type,
isCustom: false
};
pushModel(modelData);
})
);
// Custom model
// Custom model(Not in system config)
dbModels.forEach((dbModel) => {
if (global.systemModelList.find((item) => item.model === dbModel.model)) return;
@ -190,7 +169,18 @@ export const loadSystemModels = async (init = false) => {
return providerA.order - providerB.order;
});
console.log('Load models success', JSON.stringify(global.systemActiveModelList, null, 2));
console.log(
`Load models success, total: ${global.systemModelList.length}, active: ${global.systemActiveModelList.length}`,
JSON.stringify(
global.systemActiveModelList.map((item) => ({
provider: item.provider,
model: item.model,
name: item.name
})),
null,
2
)
);
} catch (error) {
console.error('Load models error', error);
// @ts-ignore
@ -205,17 +195,16 @@ export const getSystemModelConfig = async (model: string): Promise<SystemModelIt
if (modelData.isCustom) return Promise.reject('Custom model has no config data');
// Read file
const fileContent = (await import(`./provider/${modelData.provider}`))?.default as {
provider: ModelProviderType;
list: SystemModelItemType[];
};
const modelDefaultConfig = await pluginClient.model.list().then((res) => {
if (res.status === 200) {
return res.body.find((item) => item.model === model) as SystemModelItemType;
}
const config = fileContent.list.find((item) => item.model === model);
if (!config) return Promise.reject('Model config is not found');
return Promise.reject('Cannot get model config from plugin');
});
return {
...config,
...modelDefaultConfig,
provider: modelData.provider,
isCustom: false
};
@ -246,3 +235,11 @@ export const updatedReloadSystemModel = async () => {
// 3. Delay 1s to wait for other nodes to refresh
await delay(1000);
};
export const cronRefreshModels = async () => {
setCron('*/5 * * * *', async () => {
// 1. Reload models (triggered on all nodes)
await loadSystemModels(true);
// 2. Update the config buffer (main node only)
await updateFastGPTConfigBuffer();
});
};

View File

@ -1,6 +1,7 @@
import { type AppSchema } from '@fastgpt/global/core/app/type';
import { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { MongoApp } from './schema';
import type { StoreNodeItemType } from '@fastgpt/global/core/workflow/type/node';
import { encryptSecretValue, storeSecretValue } from '../../common/secret/utils';
@ -19,6 +20,7 @@ import { MongoResourcePermission } from '../../support/permission/schema';
import { PerResourceTypeEnum } from '@fastgpt/global/support/permission/constant';
import { removeImageByPath } from '../../common/file/image/controller';
import { mongoSessionRun } from '../../common/mongo/sessionRun';
import { MongoAppLogKeys } from './logs/logkeysSchema';
export const beforeUpdateAppFormat = ({ nodes }: { nodes?: StoreNodeItemType[] }) => {
if (!nodes) return;
@ -140,6 +142,10 @@ export const onDelOneApp = async ({
fields: '_id avatar'
});
const deletedAppIds = apps
.filter((app) => app.type !== AppTypeEnum.folder)
.map((app) => String(app._id));
// Remove eval job
const evalJobs = await MongoEvaluation.find(
{
@ -191,6 +197,10 @@ export const onDelOneApp = async ({
resourceId: appId
}).session(session);
await MongoAppLogKeys.deleteMany({
appId
}).session(session);
// delete app
await MongoApp.deleteOne(
{
@ -204,8 +214,10 @@ export const onDelOneApp = async ({
};
if (session) {
return del(session);
await del(session);
return deletedAppIds;
}
return mongoSessionRun(del);
await mongoSessionRun(del);
return deletedAppIds;
};

View File

@ -0,0 +1,32 @@
import type { AppLogKeysSchemaType } from '@fastgpt/global/core/app/logs/type';
import { connectionMongo, getMongoModel } from '../../../common/mongo';
import { AppCollectionName } from '../schema';
import { TeamCollectionName } from '@fastgpt/global/support/user/team/constant';
const { Schema } = connectionMongo;
export const AppLogKeysCollectionEnum = 'app_log_keys';
const AppLogKeysSchema = new Schema({
teamId: {
type: Schema.Types.ObjectId,
ref: TeamCollectionName,
required: true
},
appId: {
type: Schema.Types.ObjectId,
ref: AppCollectionName,
required: true
},
logKeys: {
type: Array,
required: true
}
});
AppLogKeysSchema.index({ teamId: 1, appId: 1 });
export const MongoAppLogKeys = getMongoModel<AppLogKeysSchemaType>(
AppLogKeysCollectionEnum,
AppLogKeysSchema
);

View File

@ -1,9 +1,13 @@
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import type { AppSchema } from '@fastgpt/global/core/app/type';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { addLog } from '../../common/system/log';
import { retryFn } from '@fastgpt/global/common/system/utils';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { MongoApp } from './schema';
import type { McpToolDataType } from '@fastgpt/global/core/app/mcpTools/type';
export class MCPClient {
private client: Client;
@ -128,3 +132,35 @@ export class MCPClient {
}
}
}
export const getMCPChildren = async (app: AppSchema) => {
const isNewMcp = !!app.modules[0].toolConfig?.mcpToolSet;
const id = String(app._id);
if (isNewMcp) {
return (
app.modules[0].toolConfig?.mcpToolSet?.toolList.map((item) => ({
...item,
id: `${PluginSourceEnum.mcp}-${id}/${item.name}`,
avatar: app.avatar
})) ?? []
);
} else {
// Old mcp toolset
const children = await MongoApp.find({
teamId: app.teamId,
parentId: id
}).lean();
return children.map((item) => {
const node = item.modules[0];
const toolData: McpToolDataType = node.inputs[0].value;
return {
avatar: app.avatar,
id: `${PluginSourceEnum.mcp}-${id}/${item.name}`,
...toolData
};
});
}
};
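The child-tool ids built above follow a `mcp-<parentAppId>/<toolName>` convention, which the preview path later splits back apart with a single `split('/')`. A minimal sketch of that round trip (the helper names are illustrative, not actual FastGPT exports):

```typescript
// Illustrative helpers (not real FastGPT exports) showing the id
// convention used above: "mcp-<parentAppId>/<toolName>".
const buildMcpToolId = (parentAppId: string, toolName: string): string =>
  `mcp-${parentAppId}/${toolName}`;

const parseMcpToolId = (id: string) => {
  // The first "-" separates the source prefix; the rest is "<parentId>/<toolName>"
  const sepIndex = id.indexOf('-');
  const source = id.slice(0, sepIndex);
  const [parentId, toolName] = id.slice(sepIndex + 1).split('/');
  return { source, parentId, toolName };
};
```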

View File

@ -1,6 +1,10 @@
import { type FlowNodeTemplateType } from '@fastgpt/global/core/workflow/type/node.d';
import type {
NodeToolConfigType,
FlowNodeTemplateType
} from '@fastgpt/global/core/workflow/type/node.d';
import {
FlowNodeOutputTypeEnum,
FlowNodeInputTypeEnum,
FlowNodeTypeEnum
} from '@fastgpt/global/core/workflow/node/constant';
import {
@ -28,7 +32,7 @@ import {
NodeInputKeyEnum
} from '@fastgpt/global/core/workflow/constants';
import { getNanoid } from '@fastgpt/global/common/string/tools';
import { getSystemToolList } from '../tool/api';
import { APIGetSystemToolList } from '../tool/api';
import { Types } from '../../../common/mongo';
import type { SystemPluginConfigSchemaType } from './type';
import type {
@ -37,37 +41,11 @@ import type {
} from '@fastgpt/global/core/workflow/type/io';
import { isProduction } from '@fastgpt/global/common/system/constants';
import { Output_Template_Error_Message } from '@fastgpt/global/core/workflow/template/output';
/**
plugin id rule:
- personal: ObjectId
- commercial: commercial-ObjectId
- systemtool: systemTool-id
(deprecated) community: community-id
*/
export function splitCombinePluginId(id: string) {
const splitRes = id.split('-');
if (splitRes.length === 1) {
// app id
return {
source: PluginSourceEnum.personal,
pluginId: id
};
}
const [source, pluginId] = id.split('-') as [PluginSourceEnum, string | undefined];
if (!source || !pluginId) throw new Error('pluginId not found');
// Compatible with plugins released before 4.10.0
if (source === 'community' || id === 'commercial-dalle3') {
return {
source: PluginSourceEnum.systemTool,
pluginId: `${PluginSourceEnum.systemTool}-${pluginId}`
};
}
return { source, pluginId: id };
}
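The function removed here (now imported from `@fastgpt/global/core/app/plugin/utils`) can be sketched self-contained; string literals stand in for the `PluginSourceEnum` members so the id rule from the comment above is runnable on its own:

```typescript
// Self-contained sketch of the splitCombinePluginId rule above; string
// literals stand in for PluginSourceEnum members.
//   personal app:  plain ObjectId        -> source "personal"
//   system tool:   "systemTool-<id>"     -> source "systemTool"
//   legacy:        "community-<id>" is remapped to "systemTool-<id>"
function splitCombinePluginIdSketch(id: string) {
  const parts = id.split('-');
  if (parts.length === 1) {
    // Plain ObjectId -> personal team app
    return { source: 'personal', pluginId: id };
  }
  const [source, pluginId] = parts as [string, string | undefined];
  if (!source || !pluginId) throw new Error('pluginId not found');
  // Pre-4.10.0 compatibility: community ids become system tools
  if (source === 'community' || id === 'commercial-dalle3') {
    return { source: 'systemTool', pluginId: `systemTool-${pluginId}` };
  }
  return { source, pluginId: id };
}
```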
import type { RuntimeNodeItemType } from '@fastgpt/global/core/workflow/runtime/type';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
import { getMCPToolRuntimeNode } from '@fastgpt/global/core/app/mcpTools/utils';
import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { getMCPChildren } from '../mcp';
type ChildAppType = SystemPluginTemplateItemType & {
teamId?: string;
@ -81,77 +59,102 @@ export const getSystemPluginByIdAndVersionId = async (
pluginId: string,
versionId?: string
): Promise<ChildAppType> => {
const plugin = await (async (): Promise<ChildAppType> => {
const plugin = await getSystemPluginById(pluginId);
const plugin = await getSystemToolById(pluginId);
// Admin selected system tool
if (plugin.associatedPluginId) {
// Verify this plugin is registered as a system plugin
const systemPlugin = await MongoSystemPlugin.findOne(
{ pluginId: plugin.id, 'customConfig.associatedPluginId': plugin.associatedPluginId },
'associatedPluginId'
).lean();
if (!systemPlugin) return Promise.reject(PluginErrEnum.unExist);
// Admin selected system tool
if (plugin.associatedPluginId) {
// Verify this plugin is registered as a system plugin
const systemPlugin = await MongoSystemPlugin.findOne(
{ pluginId: plugin.id, 'customConfig.associatedPluginId': plugin.associatedPluginId },
'associatedPluginId'
).lean();
if (!systemPlugin) return Promise.reject(PluginErrEnum.unExist);
const app = await MongoApp.findById(plugin.associatedPluginId).lean();
if (!app) return Promise.reject(PluginErrEnum.unExist);
const version = versionId
? await getAppVersionById({
appId: plugin.associatedPluginId,
versionId,
app
})
: await getAppLatestVersion(plugin.associatedPluginId, app);
if (!version.versionId) return Promise.reject('App version not found');
const isLatest = version.versionId
? await checkIsLatestVersion({
appId: plugin.associatedPluginId,
versionId: version.versionId
})
: true;
return {
...plugin,
workflow: {
nodes: version.nodes,
edges: version.edges,
chatConfig: version.chatConfig
},
version: versionId ? version?.versionId : '',
versionLabel: version?.versionName,
isLatestVersion: isLatest,
teamId: String(app.teamId),
tmbId: String(app.tmbId)
};
}
// System tool
const versionList = (plugin.versionList as SystemPluginTemplateItemType['versionList']) || [];
if (versionList.length === 0) {
return Promise.reject('Can not find plugin version list');
}
const app = await MongoApp.findById(plugin.associatedPluginId).lean();
if (!app) return Promise.reject(PluginErrEnum.unExist);
const version = versionId
? versionList.find((item) => item.value === versionId) ?? versionList[0]
: versionList[0];
const lastVersion = versionList[0];
? await getAppVersionById({
appId: plugin.associatedPluginId,
versionId,
app
})
: await getAppLatestVersion(plugin.associatedPluginId, app);
if (!version.versionId) return Promise.reject('App version not found');
const isLatest = version.versionId
? await checkIsLatestVersion({
appId: plugin.associatedPluginId,
versionId: version.versionId
})
: true;
return {
...plugin,
inputs: version.inputs,
outputs: version.outputs,
version: versionId ? version?.value : '',
versionLabel: versionId ? version?.value : '',
isLatestVersion: !version || !lastVersion || version.value === lastVersion?.value
workflow: {
nodes: version.nodes,
edges: version.edges,
chatConfig: version.chatConfig
},
version: versionId ? version?.versionId : '',
versionLabel: version?.versionName,
isLatestVersion: isLatest,
teamId: String(app.teamId),
tmbId: String(app.tmbId)
};
})();
}
return plugin;
// System toolset
if (plugin.isFolder) {
return {
...plugin,
inputs: [],
outputs: [],
inputList: plugin.inputList,
version: '',
isLatestVersion: true
};
}
// System tool
const versionList = (plugin.versionList as SystemPluginTemplateItemType['versionList']) || [];
if (versionList.length === 0) {
return Promise.reject('Can not find plugin version list');
}
const version = versionId
? versionList.find((item) => item.value === versionId) ?? versionList[0]
: versionList[0];
const lastVersion = versionList[0];
// Concat the parent toolset's input config (if it exists)
const parent = plugin.parentId ? await getSystemToolById(plugin.parentId) : undefined;
if (parent && parent.inputList) {
plugin?.inputs?.push({
key: 'system_input_config',
label: '',
renderTypeList: [FlowNodeInputTypeEnum.hidden],
inputList: parent.inputList
});
}
return {
...plugin,
inputs: version.inputs,
outputs: version.outputs,
version: versionId ? version?.value : '',
versionLabel: versionId ? version?.value : '',
isLatestVersion: !version || !lastVersion || version.value === lastVersion?.value
};
};
/* Format plugin to workflow preview node data */
/*
Format plugin to workflow preview node data
Personal workflow/plugin: objectId
Personal mcp toolset: objectId
Personal mcp tool: mcp-parentId/name
System tool/toolset: system-toolId
*/
export async function getChildAppPreviewNode({
appId,
versionId,
@ -164,6 +167,8 @@ export async function getChildAppPreviewNode({
const { source, pluginId } = splitCombinePluginId(appId);
const app: ChildAppType = await (async () => {
// 1. App
// 2. MCP ToolSets
if (source === PluginSourceEnum.personal) {
const item = await MongoApp.findById(pluginId).lean();
if (!item) return Promise.reject(PluginErrEnum.unExist);
@ -178,6 +183,17 @@ export async function getChildAppPreviewNode({
})
: true;
if (item.type === AppTypeEnum.toolSet) {
const children = await getMCPChildren(item);
version.nodes[0].toolConfig = {
mcpToolSet: {
toolId: pluginId,
toolList: children,
url: ''
}
};
}
return {
id: String(item._id),
teamId: String(item.teamId),
@ -201,29 +217,105 @@ export async function getChildAppPreviewNode({
hasTokenFee: false,
pluginOrder: 0
};
} else {
}
// mcp tool
else if (source === PluginSourceEnum.mcp) {
const [parentId, toolName] = pluginId.split('/');
// 1. get parentApp
const item = await MongoApp.findById(parentId).lean();
if (!item) return Promise.reject(PluginErrEnum.unExist);
const version = await getAppVersionById({ appId: parentId, versionId, app: item });
const toolConfig = version.nodes[0].toolConfig?.mcpToolSet;
const tool = toolConfig?.toolList.find((item) => item.name === toolName);
if (!tool || !toolConfig) return Promise.reject(PluginErrEnum.unExist);
return {
avatar: item.avatar,
id: appId,
name: tool.name,
templateType: FlowNodeTemplateTypeEnum.tools,
workflow: {
nodes: [
getMCPToolRuntimeNode({
tool: {
description: tool.description,
inputSchema: tool.inputSchema,
name: tool.name
},
avatar: item.avatar,
parentId: item._id
})
],
edges: []
},
version: '',
isLatestVersion: true
};
}
// 1. System Tools
// 2. System Plugins configured in Pro (has associatedPluginId)
else {
return getSystemPluginByIdAndVersionId(pluginId, versionId);
}
})();
const { flowNodeType, nodeIOConfig } = await (async () => {
const { flowNodeType, nodeIOConfig } = await (async (): Promise<{
flowNodeType: FlowNodeTypeEnum;
nodeIOConfig: {
inputs: FlowNodeInputItemType[];
outputs: FlowNodeOutputItemType[];
toolConfig?: NodeToolConfigType;
showSourceHandle?: boolean;
showTargetHandle?: boolean;
};
}> => {
if (source === PluginSourceEnum.systemTool) {
// System tool or toolset
const children = app.isFolder
? (await getSystemTools()).filter((item) => item.parentId === pluginId)
: [];
return {
flowNodeType: FlowNodeTypeEnum.tool,
flowNodeType: app.isFolder ? FlowNodeTypeEnum.toolSet : FlowNodeTypeEnum.tool,
nodeIOConfig: {
inputs: app.inputs || [],
outputs: app.outputs || [],
inputs: [
...(app.inputList
? [
{
key: NodeInputKeyEnum.systemInputConfig,
label: '',
renderTypeList: [FlowNodeInputTypeEnum.hidden],
inputList: app.inputList
}
]
: []),
...(app.inputs ?? [])
],
outputs: app.outputs ?? [],
toolConfig: {
systemTool: {
toolId: app.id
}
}
...(app.isFolder
? {
systemToolSet: {
toolId: app.id,
toolList: children.map((item) => ({
toolId: item.id,
name: parseI18nString(item.name, lang),
description: parseI18nString(item.intro, lang)
}))
}
}
: { systemTool: { toolId: app.id } })
},
showSourceHandle: app.isFolder ? false : true,
showTargetHandle: app.isFolder ? false : true
}
};
}
// Plugin workflow
if (!!app.workflow.nodes.find((node) => node.flowNodeType === FlowNodeTypeEnum.pluginInput)) {
// plugin app
return {
flowNodeType: FlowNodeTypeEnum.pluginModule,
nodeIOConfig: pluginData2FlowNodeIO({ nodes: app.workflow.nodes })
@ -235,6 +327,7 @@ export async function getChildAppPreviewNode({
!!app.workflow.nodes.find((node) => node.flowNodeType === FlowNodeTypeEnum.toolSet) &&
app.workflow.nodes.length === 1
) {
// mcp tools
return {
flowNodeType: FlowNodeTypeEnum.toolSet,
nodeIOConfig: toolSetData2FlowNodeIO({ nodes: app.workflow.nodes })
@ -294,11 +387,15 @@ export async function getChildAppPreviewNode({
System plugin: plugin id
Personal plugin: Version id
*/
export async function getChildAppRuntimeById(
id: string,
versionId?: string,
lang: localeType = 'en'
): Promise<PluginRuntimeType> {
export async function getChildAppRuntimeById({
id,
versionId,
lang = 'en'
}: {
id: string;
versionId?: string;
lang?: localeType;
}): Promise<PluginRuntimeType> {
const app = await (async () => {
const { source, pluginId } = splitCombinePluginId(id);
@ -351,6 +448,36 @@ export async function getChildAppRuntimeById(
};
}
export async function getSystemPluginRuntimeNodeById({
pluginId,
name,
intro
}: {
pluginId: string;
name: string;
intro: string;
}): Promise<RuntimeNodeItemType> {
const { source } = splitCombinePluginId(pluginId);
if (source === PluginSourceEnum.systemTool) {
const tool = await getSystemPluginByIdAndVersionId(pluginId);
return {
...tool,
name,
intro,
inputs: tool.inputs ?? [],
outputs: tool.outputs ?? [],
flowNodeType: FlowNodeTypeEnum.tool,
nodeId: getNanoid(),
toolConfig: {
systemTool: {
toolId: pluginId
}
}
};
}
return Promise.reject(PluginErrEnum.unExist);
}
const dbPluginFormat = (item: SystemPluginConfigSchemaType): SystemPluginTemplateItemType => {
const { name, avatar, intro, version, weight, templateType, associatedPluginId, userGuide } =
item.customConfig!;
@ -405,11 +532,11 @@ export const refetchSystemPlugins = () => {
});
};
export const getSystemPlugins = async (): Promise<SystemPluginTemplateItemType[]> => {
export const getSystemTools = async (): Promise<SystemPluginTemplateItemType[]> => {
if (getCachedSystemPlugins().expires > Date.now() && isProduction) {
return getCachedSystemPlugins().data;
} else {
const tools = await getSystemToolList();
const tools = await APIGetSystemToolList();
// Load plugin configs from the database and apply overrides
const systemPluginsArray = await MongoSystemPlugin.find({}).lean();
@ -436,34 +563,21 @@ export const getSystemPlugins = async (): Promise<SystemPluginTemplateItemType[]
const dbPluginConfig = systemPlugins.get(item.id);
const versionList = (item.versionList as SystemPluginTemplateItemType['versionList']) || [];
const inputs = versionList[0]?.inputs;
const inputs = versionList[0]?.inputs ?? [];
const outputs = versionList[0]?.outputs ?? [];
return {
isActive: item.isActive,
id: item.id,
parentId: item.parentId,
...item,
isFolder: tools.some((tool) => tool.parentId === item.id),
name: item.name,
avatar: item.avatar,
intro: item.intro,
author: item.author,
courseUrl: item.courseUrl,
showStatus: true,
weight: item.weight,
templateType: item.templateType,
originCost: item.originCost,
currentCost: item.currentCost,
hasTokenFee: item.hasTokenFee,
pluginOrder: item.pluginOrder,
workflow: {
nodes: [],
edges: []
},
versionList,
inputList: inputs?.find((input) => input.key === NodeInputKeyEnum.systemInputConfig)
?.inputList as any,
inputs,
outputs,
inputList: item?.secretInputConfig,
hasSystemSecret: !!dbPluginConfig?.inputListVal
};
});
@ -484,10 +598,10 @@ export const getSystemPlugins = async (): Promise<SystemPluginTemplateItemType[]
}
};
export const getSystemPluginById = async (id: string): Promise<SystemPluginTemplateItemType> => {
export const getSystemToolById = async (id: string): Promise<SystemPluginTemplateItemType> => {
const { source, pluginId } = splitCombinePluginId(id);
if (source === PluginSourceEnum.systemTool) {
const tools = await getSystemPlugins();
const tools = await getSystemTools();
const tool = tools.find((item) => item.id === pluginId);
if (tool) {
return tool;

View File

@ -1,9 +1,9 @@
import { type ChatNodeUsageType } from '@fastgpt/global/support/wallet/bill/type';
import { type PluginRuntimeType } from '@fastgpt/global/core/app/plugin/type';
import { splitCombinePluginId } from './controller';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
/*
/*
Plugin points calculation:
1. /
- 0

View File

@ -1,16 +1,9 @@
import createClient, { RunToolWithStream } from '@fastgpt-sdk/plugin';
import { RunToolWithStream } from '@fastgpt-sdk/plugin';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { pluginClient, BASE_URL, TOKEN } from '../../../thirdProvider/fastgptPlugin';
const BASE_URL = process.env.PLUGIN_BASE_URL || '';
const TOKEN = process.env.PLUGIN_TOKEN || '';
const client = createClient({
baseUrl: BASE_URL,
token: TOKEN
});
export async function getSystemToolList() {
const res = await client.tool.list();
export async function APIGetSystemToolList() {
const res = await pluginClient.tool.list();
if (res.status === 200) {
return res.body.map((item) => {
@ -33,4 +26,4 @@ const runToolInstance = new RunToolWithStream({
baseUrl: BASE_URL,
token: TOKEN
});
export const runSystemTool = runToolInstance.run.bind(runToolInstance);
export const APIRunSystemTool = runToolInstance.run.bind(runToolInstance);

View File

@ -3,11 +3,13 @@ import { getEmbeddingModel } from '../ai/model';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import type { StoreNodeItemType } from '@fastgpt/global/core/workflow/type/node';
import { getChildAppPreviewNode, splitCombinePluginId } from './plugin/controller';
import { getChildAppPreviewNode } from './plugin/controller';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { authAppByTmbId } from '../../support/permission/app/auth';
import { ReadPermissionVal } from '@fastgpt/global/support/permission/constant';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
import type { localeType } from '@fastgpt/global/common/i18n/type';
export async function listAppDatasetDataByTeamIdAndDatasetIds({
teamId,
@ -33,12 +35,14 @@ export async function rewriteAppWorkflowToDetail({
nodes,
teamId,
isRoot,
ownerTmbId
ownerTmbId,
lang
}: {
nodes: StoreNodeItemType[];
teamId: string;
isRoot: boolean;
ownerTmbId: string;
lang?: localeType;
}) {
const datasetIdSet = new Set<string>();
@ -51,8 +55,9 @@ export async function rewriteAppWorkflowToDetail({
try {
const [preview] = await Promise.all([
getChildAppPreviewNode({
appId: pluginId,
versionId: node.version
appId: node.pluginId,
versionId: node.version,
lang
}),
...(source === PluginSourceEnum.personal
? [
@ -80,6 +85,8 @@ export async function rewriteAppWorkflowToDetail({
node.hasTokenFee = preview.hasTokenFee;
node.hasSystemSecret = preview.hasSystemSecret;
node.toolConfig = preview.toolConfig;
// Latest version
if (!node.version) {
const inputsMap = new Map(node.inputs.map((item) => [item.key, item]));

View File

@ -361,7 +361,7 @@ const getMultiInput = async ({
};
};
/*
/*
Tool call: add the file prompt to the question to
guide the LLM to call the tool.
*/

View File

@ -10,14 +10,16 @@ import { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { MCPClient } from '../../../app/mcp';
import { getSecretValue } from '../../../../common/secret/utils';
import type { McpToolDataType } from '@fastgpt/global/core/app/mcpTools/type';
import { runSystemTool } from '../../../app/tool/api';
import { APIRunSystemTool } from '../../../app/tool/api';
import { MongoSystemPlugin } from '../../../app/plugin/systemPluginSchema';
import { SystemToolInputTypeEnum } from '@fastgpt/global/core/app/systemTool/constants';
import type { StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import { getSystemPluginById } from '../../../app/plugin/controller';
import { getSystemToolById } from '../../../app/plugin/controller';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import { pushTrack } from '../../../../common/middle/tracks/utils';
import { getNodeErrResponse } from '../utils';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
import { getAppVersionById } from '../../../../core/app/version/controller';
type SystemInputConfigType = {
type: SystemToolInputTypeEnum;
@ -52,8 +54,8 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
try {
// run system tool
if (systemToolId) {
const tool = await getSystemPluginById(systemToolId);
if (toolConfig?.systemTool?.toolId) {
const tool = await getSystemToolById(toolConfig.systemTool!.toolId);
const inputConfigParams = await (async () => {
switch (params.system_input_config?.type) {
@ -82,7 +84,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
const formatToolId = tool.id.split('-')[1];
const res = await runSystemTool({
const res = await APIRunSystemTool({
toolId: formatToolId,
inputs,
systemVar: {
@ -112,6 +114,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
}
}
});
let result = res.output || {};
if (res.error) {
@ -175,8 +178,33 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
}
]
};
} else if (toolConfig?.mcpTool?.toolId) {
const { pluginId } = splitCombinePluginId(toolConfig.mcpTool.toolId);
const [parentId, toolName] = pluginId.split('/');
const tool = await getAppVersionById({
appId: parentId,
versionId: version
});
const { headerSecret, url } =
tool.nodes[0].toolConfig?.mcpToolSet ?? tool.nodes[0].inputs[0].value;
const mcpClient = new MCPClient({
url,
headers: getSecretValue({
storeSecret: headerSecret
})
});
const result = await mcpClient.toolCall(toolName, params);
return {
[DispatchNodeResponseKeyEnum.nodeResponse]: {
toolRes: result,
moduleLogo: avatar
},
[DispatchNodeResponseKeyEnum.toolResponses]: result
};
} else {
// mcp tool
// mcp tool (old version compatible)
const { toolData, system_toolData, ...restParams } = params;
const { name: toolName, url, headerSecret } = toolData || system_toolData;

View File

@ -152,7 +152,7 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
} = data;
const startTime = Date.now();
rewriteRuntimeWorkFlow(runtimeNodes, runtimeEdges);
await rewriteRuntimeWorkFlow({ nodes: runtimeNodes, edges: runtimeEdges });
// Initialize dispatch depth and auto-increment it to avoid infinite nesting
if (!props.workflowDispatchDeep) {
@ -212,11 +212,10 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
sendStreamTimerSign();
}
// Add system variables
// Get default variables
variables = {
...getSystemVariable(data),
...externalProvider.externalWorkflowVariables,
...variables
...getSystemVariables(data)
};
}
@ -846,23 +845,35 @@ export async function dispatchWorkFlow(data: Props): Promise<DispatchFlowRespons
}
/* get system variable */
const getSystemVariable = ({
const getSystemVariables = ({
timezone,
runningAppInfo,
chatId,
responseChatItemId,
histories = [],
uid,
chatConfig
chatConfig,
variables
}: Props): SystemVariablesType => {
const variables = chatConfig?.variables || [];
const variablesMap = variables.reduce<Record<string, any>>((acc, item) => {
acc[item.key] = valueTypeFormat(item.defaultValue, item.valueType);
// Resolve global variables (API label -> key; web key -> key; else default)
const globalVariables = chatConfig?.variables || [];
const variablesMap = globalVariables.reduce<Record<string, any>>((acc, item) => {
// API
if (variables[item.label] !== undefined) {
acc[item.key] = valueTypeFormat(variables[item.label], item.valueType);
}
// Web
else if (variables[item.key] !== undefined) {
acc[item.key] = valueTypeFormat(variables[item.key], item.valueType);
} else {
acc[item.key] = valueTypeFormat(item.defaultValue, item.valueType);
}
return acc;
}, {});
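The precedence implemented by the reduce above can be sketched standalone: an incoming value keyed by the variable's label (how API callers pass it) wins, then a value keyed by its key (web clients), then the configured default. `valueTypeFormat` from the real code is omitted here, so values pass through unformatted.

```typescript
// Standalone sketch of the global-variable resolution above;
// valueTypeFormat from the real code is intentionally omitted.
type GlobalVarDef = { key: string; label: string; defaultValue?: any };

const resolveGlobalVariables = (
  defs: GlobalVarDef[],
  incoming: Record<string, any>
): Record<string, any> =>
  defs.reduce<Record<string, any>>((acc, item) => {
    if (incoming[item.label] !== undefined) {
      acc[item.key] = incoming[item.label]; // API callers pass labels
    } else if (incoming[item.key] !== undefined) {
      acc[item.key] = incoming[item.key]; // Web clients pass keys
    } else {
      acc[item.key] = item.defaultValue; // fall back to the configured default
    }
    return acc;
  }, {});
```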
return {
...variablesMap,
// System var:
userId: uid,
appId: String(runningAppInfo.id),
chatId,

View File

@ -1,4 +1,7 @@
import { getPluginInputsFromStoreNodes } from '@fastgpt/global/core/app/plugin/utils';
import {
getPluginInputsFromStoreNodes,
splitCombinePluginId
} from '@fastgpt/global/core/app/plugin/utils';
import { chatValue2RuntimePrompt } from '@fastgpt/global/core/chat/adapt';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
@ -15,9 +18,8 @@ import { ReadPermissionVal } from '@fastgpt/global/support/permission/constant';
import { computedPluginUsage } from '../../../app/plugin/utils';
import { filterSystemVariables, getNodeErrResponse } from '../utils';
import { getPluginRunUserQuery } from '@fastgpt/global/core/workflow/utils';
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import type { NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { getChildAppRuntimeById, splitCombinePluginId } from '../../../app/plugin/controller';
import type { NodeInputKeyEnum, NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { getChildAppRuntimeById } from '../../../app/plugin/controller';
import { dispatchWorkFlow } from '../index';
import { getUserChatInfoAndAuthTeamPoints } from '../../../../support/permission/auth/team';
import { dispatchRunTool } from '../child/runTool';
@ -66,7 +68,7 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
});
}
/*
1. Team app
2. Admin selected system tool
*/
@ -79,7 +81,7 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
per: ReadPermissionVal
});
plugin = await getChildAppRuntimeById({ id: pluginId, versionId: version });
const outputFilterMap =
plugin.nodes

View File

@ -1,7 +1,7 @@
import { getErrText } from '@fastgpt/global/common/error/utils';
import { ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import type { ChatItemType } from '@fastgpt/global/core/chat/type.d';
import { NodeInputKeyEnum, NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import {
type RuntimeEdgeItemType,
type RuntimeNodeItemType,
@ -17,7 +17,12 @@ import { getNanoid } from '@fastgpt/global/common/string/tools';
import { type SearchDataResponseItemType } from '@fastgpt/global/core/dataset/type';
import { getMCPToolRuntimeNode } from '@fastgpt/global/core/app/mcpTools/utils';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import type { McpToolSetDataType } from '@fastgpt/global/core/app/mcpTools/type';
import {
getSystemPluginRuntimeNodeById,
getSystemTools
} from '../../../core/app/plugin/controller';
import { MongoApp } from '../../../core/app/schema';
import { getMCPChildren } from '../../../core/app/mcp';
export const getWorkflowResponseWrite = ({
res,
@ -151,10 +156,19 @@ export const formatHttpError = (error: any) => {
};
};
/**
* ToolSet node will be replaced by Children Tool Nodes.
* @param nodes
* @param edges
* @returns
*/
export const rewriteRuntimeWorkFlow = async ({
nodes,
edges
}: {
nodes: RuntimeNodeItemType[];
edges: RuntimeEdgeItemType[];
}) => {
const toolSetNodes = nodes.filter((node) => node.flowNodeType === FlowNodeTypeEnum.toolSet);
if (toolSetNodes.length === 0) {
@ -165,35 +179,63 @@ export const rewriteRuntimeWorkFlow = (
for (const toolSetNode of toolSetNodes) {
nodeIdsToRemove.add(toolSetNode.nodeId);
const systemToolId = toolSetNode.toolConfig?.systemToolSet?.toolId;
const mcpToolsetVal = toolSetNode.toolConfig?.mcpToolSet ?? toolSetNode.inputs[0].value;
const incomingEdges = edges.filter((edge) => edge.target === toolSetNode.nodeId);
const pushEdges = (nodeId: string) => {
for (const inEdge of incomingEdges) {
edges.push({
source: inEdge.source,
target: nodeId,
sourceHandle: inEdge.sourceHandle,
targetHandle: 'selectedTools',
status: inEdge.status
});
}
};
// systemTool
if (systemToolId) {
const toolsetInputConfig = toolSetNode.inputs.find(
(item) => item.key === NodeInputKeyEnum.systemInputConfig
);
const tools = await getSystemTools();
const children = tools.filter((item) => item.parentId === systemToolId);
for (const child of children) {
const toolListItem = toolSetNode.toolConfig?.systemToolSet?.toolList.find(
(item) => item.toolId === child.id
)!;
const newNode = await getSystemPluginRuntimeNodeById({
pluginId: child.id,
name: toolListItem?.name,
intro: toolListItem?.description
});
const newNodeInputConfig = newNode.inputs.find(
(item) => item.key === NodeInputKeyEnum.systemInputConfig
);
if (newNodeInputConfig) {
newNodeInputConfig.value = toolsetInputConfig?.value;
}
nodes.push(newNode);
pushEdges(newNode.nodeId);
}
} else if (mcpToolsetVal) {
const app = await MongoApp.findOne({ _id: toolSetNode.pluginId }).lean();
if (!app) continue;
const toolList = await getMCPChildren(app);
for (const tool of toolList) {
const newToolNode = getMCPToolRuntimeNode({
avatar: toolSetNode.avatar,
tool,
// New ?? Old
parentId: mcpToolsetVal.toolId ?? toolSetNode.pluginId
});
nodes.push({ ...newToolNode, name: `${toolSetNode.name}/${tool.name}` });
pushEdges(newToolNode.nodeId);
}
}
}
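The loop above boils down to: drop the toolset node, and for every edge that pointed at it, add one copy per child tool node targeting the `selectedTools` handle. A minimal sketch of that rewiring (the `expandToolSet` helper and the trimmed `Edge` shape are hypothetical):

```typescript
type Edge = { source: string; target: string; targetHandle: string };

// Hypothetical sketch: remove edges into the toolset node and re-point
// one clone of each incoming edge at every child tool node.
function expandToolSet(
  toolSetNodeId: string,
  childIds: string[],
  edges: Edge[]
): Edge[] {
  const incoming = edges.filter((e) => e.target === toolSetNodeId);
  const others = edges.filter((e) => e.target !== toolSetNodeId);
  const cloned = childIds.flatMap((childId) =>
    incoming.map((e) => ({
      source: e.source,
      target: childId,
      targetHandle: 'selectedTools'
    }))
  );
  return [...others, ...cloned];
}
```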

View File

@ -3,7 +3,7 @@
"version": "1.0.0",
"type": "module",
"dependencies": {
"@fastgpt-sdk/plugin": "^0.1.4",
"@fastgpt/global": "workspace:*",
"@modelcontextprotocol/sdk": "^1.12.1",
"@node-rs/jieba": "2.0.1",

View File

@ -10,10 +10,10 @@ import { AppPermission } from '@fastgpt/global/support/permission/app/controller
import { type PermissionValueType } from '@fastgpt/global/support/permission/type';
import { AppFolderTypeList } from '@fastgpt/global/core/app/constants';
import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { type AuthModeType, type AuthResponseType } from '../type';
import { AppDefaultPermissionVal } from '@fastgpt/global/support/permission/app/constant';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
export const authPluginByTmbId = async ({
tmbId,

View File

@ -46,7 +46,7 @@ export const checkTeamAppLimit = async (teamId: string, amount = 1) => {
MongoApp.countDocuments({
teamId,
type: {
$in: [AppTypeEnum.simple, AppTypeEnum.workflow, AppTypeEnum.plugin, AppTypeEnum.toolSet]
}
})
]);
@ -59,7 +59,7 @@ export const checkTeamAppLimit = async (teamId: string, amount = 1) => {
if (global?.licenseData?.maxApps && typeof global?.licenseData?.maxApps === 'number') {
const totalApps = await MongoApp.countDocuments({
type: {
$in: [AppTypeEnum.simple, AppTypeEnum.workflow, AppTypeEnum.plugin, AppTypeEnum.toolSet]
}
});
if (totalApps >= global.licenseData.maxApps) {

View File

@ -139,17 +139,27 @@ export const useDoc2xServer = ({ apiKey }: { apiKey: string }) => {
// Finished
if (result_data.status === 'success') {
const cleanedText = result_data.result.pages
.map((page) => page.md)
.join('')
.replace(/\\[\(\)]/g, '$')
.replace(/\\[\[\]]/g, '$$')
.replace(/<img\s+src="([^"]+)"(?:\s*\?[^>]*)?(?:\s*\/>|>)/g, '![img]($1)')
.replace(/<!-- Media -->/g, '')
.replace(/<!-- Footnote -->/g, '')
.replace(/<!-- Meanless:[\s\S]*?-->/g, '')
.replace(/<!-- figureText:[\s\S]*?-->/g, '')
.replace(/\$(.+?)\s+\\tag\{(.+?)\}\$/g, '$$$1 \\qquad \\qquad ($2)$$')
.replace(/\\text\{([^}]*?)(\b\w+)_(\w+\b)([^}]*?)\}/g, '\\text{$1$2\\_$3$4}');
const remainingTags = cleanedText.match(/<!--[\s\S]*?-->/g);
if (remainingTags) {
addLog.warn(`[Doc2x] Remaining dirty tags after cleaning:`, {
count: remainingTags.length,
tags: remainingTags.slice(0, 3)
});
}
return {
text: cleanedText,
pages: result_data.result.pages.length
};
}
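The cleanup chain above is sequential regex replacement over the joined page markdown. A reduced sketch reusing the same image and comment expressions (the `cleanDoc2xMarkdown` name is hypothetical; the math and `\text` rewrites are omitted):

```typescript
// Minimal sketch of the Doc2x markdown cleanup: rewrite <img> tags to
// markdown images, then strip the known HTML comment markers.
function cleanDoc2xMarkdown(md: string): string {
  return md
    .replace(/<img\s+src="([^"]+)"(?:\s*\?[^>]*)?(?:\s*\/>|>)/g, '![img]($1)')
    .replace(/<!-- Media -->/g, '')
    .replace(/<!-- Footnote -->/g, '')
    .replace(/<!-- Meanless:[\s\S]*?-->/g, '');
}
```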

View File

@ -0,0 +1,9 @@
import createClient from '@fastgpt-sdk/plugin';
export const BASE_URL = process.env.PLUGIN_BASE_URL || '';
export const TOKEN = process.env.PLUGIN_TOKEN || '';
export const pluginClient = createClient({
baseUrl: BASE_URL,
token: TOKEN
});

View File

@ -20,18 +20,16 @@ export const readXlsxRawText = async ({
const rawText = format2Csv.map((item) => item.csvText).join('\n');
const formatText = result
.map(({ data }) => {
const header = data[0];
if (!header) return;
const formatText = `| ${header.join(' | ')} |
| ${header.map(() => '---').join(' | ')} |
${data
.slice(1)
.map((row) => `| ${row.map((cell) => String(cell).replace(/\n/g, '\\n')).join(' | ')} |`)
.join('\n')}`;
return formatText;

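The conversion above follows the usual markdown-table recipe: first row as header, one `---` separator per column, and literal newlines in cells escaped as `\n` so they cannot break the table row. A self-contained sketch (the `rows2MdTable` name is hypothetical):

```typescript
// Hypothetical sketch of the sheet-to-markdown conversion above.
function rows2MdTable(data: string[][]): string | undefined {
  const header = data[0];
  if (!header) return; // skip empty sheets
  return `| ${header.join(' | ')} |
| ${header.map(() => '---').join(' | ')} |
${data
  .slice(1)
  .map((row) => `| ${row.map((cell) => String(cell).replace(/\n/g, '\\n')).join(' | ')} |`)
  .join('\n')}`;
}
```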
View File

@ -2,6 +2,7 @@ import { parentPort } from 'worker_threads';
import type { SplitProps } from '@fastgpt/global/common/string/textSplitter';
import { splitText2Chunks } from '@fastgpt/global/common/string/textSplitter';
import { workerResponse } from '../controller';
import { delay } from '@fastgpt/global/common/system/utils';
parentPort?.on('message', async (props: SplitProps) => {
const result = splitText2Chunks(props);

View File

@ -1,5 +1,6 @@
import React, { useState, useMemo, useRef, useEffect } from 'react';
import type { BoxProps } from '@chakra-ui/react';
import { Box, Card, Flex, useOutsideClick, Button } from '@chakra-ui/react';
import { addDays, format } from 'date-fns';
import { DayPicker } from 'react-day-picker';
import 'react-day-picker/dist/style.css';
@ -15,21 +16,23 @@ export type DateRangeType = {
const DateRangePicker = ({
onChange,
onSuccess,
popPosition = 'bottom',
defaultDate = {
from: addDays(new Date(), -30),
to: new Date()
},
dateRange,
formLabel,
...props
}: {
onChange?: (date: DateRangeType) => void;
onSuccess?: (date: DateRangeType) => void;
popPosition?: 'bottom' | 'top';
defaultDate?: DateRangeType;
dateRange?: DateRangeType;
formLabel?: string;
} & BoxProps) => {
const { t } = useTranslation();
const OutRangeRef = useRef(null);
const [range, setRange] = useState<DateRangeType>(defaultDate);
const [showSelected, setShowSelected] = useState(false);
@ -42,9 +45,9 @@ const DateRangePicker = ({
const formatSelected = useMemo(() => {
if (range?.from && range.to) {
return `${format(range.from, 'y/MM/dd')} - ${format(range.to, 'y/MM/dd')}`;
}
return `${format(new Date(), 'y/MM/dd')} - ${format(new Date(), 'y/MM/dd')}`;
}, [range]);
useOutsideClick({
@ -57,19 +60,30 @@ const DateRangePicker = ({
return (
<Box position={'relative'} ref={OutRangeRef}>
<Flex
border={'base'}
pr={3}
py={1}
borderRadius={'sm'}
cursor={'pointer'}
bg={'myGray.50'}
fontSize={'sm'}
onClick={() => setShowSelected(true)}
alignItems={'center'}
{...props}
>
{formLabel && (
<>
<Box fontSize={'sm'} color={'myGray.600'}>
{formLabel}
</Box>
<Box w={'1px'} h={'12px'} bg={'myGray.200'} mx={2} />
</>
)}
<Box color={'myGray.600'} fontWeight={'400'} flex={1}>
{formatSelected}
</Box>
{!formLabel && <MyIcon ml={2} name={'date'} w={'16px'} color={'myGray.600'} />}
</Flex>
{showSelected && (
<Card
@ -77,9 +91,9 @@ const DateRangePicker = ({
zIndex={1}
css={{
'--rdp-background-color': '#d6e8ff',
'--rdp-accent-color': '#0000ff'
}}
{...(popPosition === 'top'
? {
bottom: '40px'
}

View File

@ -487,5 +487,6 @@ export const iconPaths = {
union: () => import('./icons/union.svg'),
user: () => import('./icons/user.svg'),
visible: () => import('./icons/visible.svg'),
invisible: () => import('./icons/invisible.svg'),
wx: () => import('./icons/wx.svg')
};

View File

@ -0,0 +1,7 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 17" >
<path d="M3.09477 5.68345C2.49088 6.22956 1.97818 6.87496 1.58219 7.59384C1.46142 7.81308 1.40104 7.9227 1.35575 8.1445C1.32591 8.29065 1.32591 8.59337 1.35575 8.73952C1.40104 8.96131 1.46142 9.07094 1.58219 9.29018C2.82944 11.5544 5.23455 13.0897 8.00004 13.0897C8.7567 13.0897 9.48638 12.9748 10.1727 12.7614L9.15237 11.7411C8.77428 11.814 8.38849 11.8511 8.00004 11.8511C6.86129 11.8511 5.74538 11.5316 4.77917 10.929C4.1443 10.533 3.5908 10.0252 3.14375 9.43336C2.97848 9.21457 2.89585 9.10518 2.82105 8.82259C2.7747 8.64749 2.7747 8.23652 2.82105 8.06143C2.89585 7.77884 2.97848 7.66945 3.14375 7.45066C3.38856 7.12656 3.6653 6.82766 3.96984 6.55852L3.09477 5.68345Z" />
<path d="M6.16773 8.75641C6.23231 9.13279 6.41184 9.48294 6.68547 9.75658C6.95911 10.0302 7.30926 10.2097 7.68564 10.2743L6.16773 8.75641Z" />
<path d="M9.77417 8.99756L7.44449 6.66788C7.62269 6.61207 7.80986 6.58293 8.00004 6.58293C8.4931 6.58293 8.96596 6.7788 9.31461 7.12744C9.66325 7.47609 9.85912 7.94895 9.85912 8.44201C9.85912 8.63219 9.82998 8.81936 9.77417 8.99756Z" />
<path d="M11.5121 10.7355C12.026 10.3724 12.4791 9.93283 12.8563 9.43336C13.0216 9.21457 13.1042 9.10518 13.179 8.82259C13.2254 8.64749 13.2254 8.23652 13.179 8.06143C13.1042 7.77884 13.0216 7.66945 12.8563 7.45066C12.4093 6.85882 11.8558 6.35101 11.2209 5.95502C10.2547 5.35237 9.13879 5.03288 8.00004 5.03288C7.35512 5.03288 6.71753 5.13535 6.11031 5.3337L5.1481 4.37149C6.02445 3.99987 6.98812 3.79431 8.00004 3.79431C10.7655 3.79431 13.1706 5.32959 14.4179 7.59383C14.5387 7.81308 14.599 7.9227 14.6443 8.1445C14.6742 8.29065 14.6742 8.59337 14.6443 8.73952C14.599 8.96131 14.5387 9.07094 14.4179 9.29018C13.9151 10.203 13.224 10.9974 12.3971 11.6204L11.5121 10.7355Z" />
<path d="M2.3253 3.86234C2.61571 3.57193 3.08655 3.57193 3.37695 3.86234L12.6912 13.1766C12.9816 13.467 12.9816 13.9378 12.6912 14.2282C12.4008 14.5186 11.9299 14.5186 11.6395 14.2282L2.3253 4.91399C2.03489 4.62358 2.03489 4.15274 2.3253 3.86234Z" />
</svg>


View File

@ -1,3 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" >
<path d="M8.00003 6.12221C8.49803 6.12221 8.97563 6.32004 9.32776 6.67217C9.6799 7.02431 9.87773 7.50191 9.87773 7.9999C9.87773 8.4979 9.6799 8.9755 9.32776 9.32764C8.97563 9.67977 8.49803 9.8776 8.00003 9.8776C7.50203 9.8776 7.02443 9.67977 6.6723 9.32764C6.32016 8.9755 6.12233 8.4979 6.12233 7.9999C6.12233 7.50191 6.32016 7.02431 6.6723 6.67217C7.02443 6.32004 7.50203 6.12221 8.00003 6.12221ZM8.00003 3.30566C10.7932 3.30566 13.2224 4.85632 14.4821 7.14324C14.6041 7.36468 14.6651 7.4754 14.7109 7.69942C14.741 7.84703 14.741 8.15278 14.7109 8.30039C14.6651 8.52441 14.6041 8.63513 14.4821 8.85657C13.2224 11.1435 10.7932 12.6941 8.00003 12.6941C5.20685 12.6941 2.77765 11.1435 1.51791 8.85657C1.39593 8.63513 1.33494 8.52441 1.2892 8.30039C1.25907 8.15278 1.25907 7.84703 1.2892 7.69942C1.33494 7.4754 1.39593 7.36468 1.51791 7.14324C2.77765 4.85632 5.20685 3.30566 8.00003 3.30566ZM3.0951 6.99862C2.92819 7.2196 2.84473 7.33009 2.76917 7.61551C2.72236 7.79236 2.72236 8.20745 2.76917 8.38429C2.84473 8.66971 2.92819 8.7802 3.09511 9.00118C3.54663 9.59895 4.10568 10.1118 4.74691 10.5118C5.72279 11.1205 6.84988 11.4432 8.00003 11.4432C9.15018 11.4432 10.2773 11.1205 11.2532 10.5118C11.8944 10.1118 12.4534 9.59895 12.905 9.00118C13.0719 8.7802 13.1553 8.66971 13.2309 8.38429C13.2777 8.20745 13.2777 7.79236 13.2309 7.61551C13.1553 7.33009 13.0719 7.2196 12.905 6.99862C12.4534 6.40086 11.8944 5.88797 11.2532 5.48801C10.2773 4.87932 9.15018 4.55664 8.00003 4.55664C6.84988 4.55664 5.72279 4.87932 4.74691 5.48801C4.10568 5.88797 3.54663 6.40086 3.0951 6.99862Z" />
</svg>


View File

@ -1,9 +1,11 @@
import type { FlexProps } from '@chakra-ui/react';
import {
Box,
Button,
type ButtonProps,
Checkbox,
Flex,
Input,
Menu,
MenuButton,
MenuItem,
@ -11,13 +13,27 @@ import {
MenuList,
useDisclosure
} from '@chakra-ui/react';
import React, { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import MyTag from '../Tag/index';
import MyIcon from '../Icon';
import MyAvatar from '../Avatar';
import { useTranslation } from 'next-i18next';
import type { useScrollPagination } from '../../../hooks/useScrollPagination';
import MyDivider from '../MyDivider';
import { shadowLight } from '../../../styles/theme';
const menuItemStyles: MenuItemProps = {
borderRadius: 'sm',
py: 2,
display: 'flex',
alignItems: 'center',
_hover: {
backgroundColor: 'myGray.100'
},
_notLast: {
mb: 2
}
};
export type SelectProps<T = any> = {
list: {
@ -30,19 +46,25 @@ export type SelectProps<T = any> = {
setIsSelectAll?: React.Dispatch<React.SetStateAction<boolean>>;
placeholder?: string;
maxH?: number;
itemWrap?: boolean;
onSelect: (val: T[]) => void;
closeable?: boolean;
isDisabled?: boolean;
ScrollData?: ReturnType<typeof useScrollPagination>['ScrollData'];
formLabel?: string;
formLabelFontSize?: string;
inputValue?: string;
setInputValue?: (val: string) => void;
tagStyle?: FlexProps;
} & Omit<ButtonProps, 'onSelect'>;
const MultipleSelect = <T = any,>({
value = [],
placeholder,
list = [],
maxH = 400,
onSelect,
closeable = false,
itemWrap = true,
@ -50,27 +72,92 @@ const MultipleSelect = <T = any,>({
isSelectAll,
setIsSelectAll,
isDisabled = false,
formLabel,
formLabelFontSize = 'sm',
inputValue,
setInputValue,
tagStyle,
...props
}: SelectProps<T>) => {
const ref = useRef<HTMLButtonElement>(null);
const SearchInputRef = useRef<HTMLInputElement>(null);
const tagsContainerRef = useRef<HTMLDivElement>(null);
const { t } = useTranslation();
const { isOpen, onOpen, onClose } = useDisclosure();
const canInput = setInputValue !== undefined;
type SelectedItemType = {
icon?: string;
label: string | React.ReactNode;
value: T;
};
const [visibleItems, setVisibleItems] = useState<SelectedItemType[]>([]);
const [overflowItems, setOverflowItems] = useState<SelectedItemType[]>([]);
const selectedItems = useMemo(() => {
return value.map((val) => {
const listItem = list.find((item) => item.value === val);
return listItem || { value: val, label: String(val) };
});
}, [value, list]);
const handleKeyDown = useCallback(
(e: React.KeyboardEvent<HTMLInputElement>) => {
if (e.key === 'Backspace' && (!inputValue || inputValue === '')) {
const newValue = [...value];
newValue.pop();
onSelect(newValue);
}
},
[inputValue, value, isSelectAll, onSelect]
);
useEffect(() => {
if (!isOpen) {
setInputValue?.('');
}
}, [isOpen]);
useEffect(() => {
const getWidth = (w: any) =>
typeof w === 'number' ? w : typeof w === 'string' ? parseInt(w) : 0;
const totalWidth = getWidth(props.w) || 200;
const tagWidth = getWidth(tagStyle?.w) || 60;
const formLabelWidth = formLabel ? formLabel.length * 8 + 20 : 0;
const availableWidth = totalWidth - formLabelWidth - 40;
const overflowWidth = 30;
if (availableWidth <= 0) {
setVisibleItems(selectedItems.length > 0 ? [selectedItems[0]] : []);
setOverflowItems(selectedItems.slice(1));
return;
}
const { count } = selectedItems.reduce(
(acc, item, i) => {
const remain = selectedItems.length - i - 1;
const needOverflow = remain > 0 ? overflowWidth : 0;
if (acc.used + tagWidth + needOverflow <= availableWidth) {
return {
used: acc.used + tagWidth,
count: i + 1
};
}
return acc;
},
{ used: 0, count: 0 }
);
setVisibleItems(selectedItems.slice(0, count));
setOverflowItems(selectedItems.slice(count));
}, [selectedItems, isOpen, props.w, tagStyle, formLabel]);
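The effect above implements a width budget: tags are admitted left to right, and room for a `+N` overflow badge is reserved whenever more items remain. The counting step can be isolated as a pure function (names are hypothetical; like the original reduce, it never breaks early):

```typescript
// Hypothetical sketch of the visible/overflow split computed in the effect.
function splitByWidth(
  itemCount: number,
  availableWidth: number,
  tagWidth: number,
  overflowWidth: number
): { visible: number; overflow: number } {
  let used = 0;
  let visible = 0;
  for (let i = 0; i < itemCount; i++) {
    const remain = itemCount - i - 1;
    // Reserve room for the "+N" badge unless this is the last item.
    const reserve = remain > 0 ? overflowWidth : 0;
    if (used + tagWidth + reserve <= availableWidth) {
      used += tagWidth;
      visible = i + 1;
    }
  }
  return { visible, overflow: itemCount - visible };
}
```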
const onclickItem = useCallback(
(val: T) => {
// When select-all is active, value is actually empty.
if (isSelectAll) {
onSelect(list.map((item) => item.value).filter((i) => i !== val));
setIsSelectAll?.(false);
@ -141,12 +228,11 @@ const MultipleSelect = <T = any,>({
>
<MenuButton
as={Flex}
h={'100%'}
alignItems={'center'}
ref={ref}
px={3}
borderRadius={'md'}
border={'sm'}
userSelect={'none'}
cursor={isDisabled ? 'not-allowed' : 'pointer'}
_active={{
@ -159,68 +245,108 @@ const MultipleSelect = <T = any,>({
{...props}
{...(isOpen && !isDisabled
? {
boxShadow: shadowLight,
borderColor: 'primary.600 !important',
bg: 'white'
}
: {})}
>
<Flex alignItems={'center'} w={'100%'} h={'100%'} py={1.5}>
{formLabel && (
<Flex alignItems={'center'}>
<Box color={'myGray.600'} fontSize={formLabelFontSize} whiteSpace={'nowrap'}>
{formLabel}
</Box>
<Box w={'1px'} h={'12px'} bg={'myGray.200'} mx={2} />
</Flex>
)}
{value.length === 0 && placeholder ? (
<Box color={'myGray.500'} fontSize={formLabelFontSize} flex={1}>
{placeholder}
</Box>
) : (
<Flex
ref={tagsContainerRef}
flex={'1 0 0'}
gap={1}
flexWrap={'nowrap'}
overflow={'hidden'}
alignItems={'center'}
>
{(!isOpen || !canInput) &&
(isSelectAll ? (
<Box fontSize={formLabelFontSize} color={'myGray.900'}>
{t('common:All')}
</Box>
) : (
<>
{visibleItems.map((item, i) => (
<MyTag
className="tag-icon"
key={i}
bg={'primary.100'}
color={'primary.700'}
type={'fill'}
borderRadius={'lg'}
px={2}
py={0.5}
flexShrink={0}
{...tagStyle}
>
{item.label}
{closeable && (
<MyIcon
name={'common/closeLight'}
ml={1}
w="0.8rem"
cursor={'pointer'}
_hover={{
color: 'red.500'
}}
onClick={(e) => {
e.stopPropagation();
e.preventDefault();
onclickItem(item.value);
}}
/>
)}
</MyTag>
))}
{overflowItems.length > 0 && (
<Box
fontSize={formLabelFontSize}
px={2}
py={0.5}
flexShrink={0}
borderRadius={'lg'}
bg={'myGray.100'}
>
+{overflowItems.length}
</Box>
)}
</>
))}
{canInput && isOpen && (
<Input
value={inputValue}
onChange={(e) => setInputValue?.(e.target.value)}
onKeyDown={handleKeyDown}
ref={SearchInputRef}
autoFocus
onBlur={() => {
setTimeout(() => {
SearchInputRef?.current?.focus();
}, 0);
}}
h={6}
variant={'unstyled'}
border={'none'}
/>
)}
</Flex>
)}
<MyIcon name={'core/chat/chevronDown'} color={'myGray.600'} w={4} h={4} />
</Flex>
</MenuButton>
<MenuList
@ -254,7 +380,7 @@ const MultipleSelect = <T = any,>({
<MyDivider my={1} />
{ScrollData ? <ScrollData minH={20}>{ListRender}</ScrollData> : ListRender}
</MenuList>
</Menu>
</Box>

View File

@ -1,7 +1,7 @@
import { useTranslation as useNextTranslation } from 'next-i18next';
import { I18N_NAMESPACES_MAP } from '../i18n/constants';
export function useSafeTranslation() {
const { t: originalT, ...rest } = useNextTranslation();
const t = (key: string | undefined, ...args: any[]): string => {

View File

@ -90,10 +90,29 @@
"llm_use_vision": "Vision",
"llm_use_vision_tip": "After clicking on the model selection, you can see whether the model supports image recognition and the ability to control whether to start image recognition. \nAfter starting image recognition, the model will read the image content in the file link, and if the user question is less than 500 words, it will automatically parse the image in the user question.",
"logs_chat_user": "user",
"logs_date": "date",
"logs_empty": "No logs yet~",
"logs_error_count": "Error Count",
"logs_export_confirm_tip": "There are currently {{total}} conversation records, and each conversation can export up to 100 latest messages. \nConfirm export?",
"logs_export_title": "Time, source, user, contact, title, total number of messages, user good feedback, user bad feedback, custom feedback, labeled answers, conversation details",
"logs_key_config": "Field Configuration",
"logs_keys_annotatedCount": "Annotated Answer Count",
"logs_keys_createdTime": "Created Time",
"logs_keys_customFeedback": "Custom Feedback",
"logs_keys_errorCount": "Error Count",
"logs_keys_feedback": "User Feedback",
"logs_keys_lastConversationTime": "Last Conversation Time",
"logs_keys_messageCount": "Message Count",
"logs_keys_points": "Points Consumed",
"logs_keys_responseTime": "Average Response Time",
"logs_keys_sessionId": "Session ID",
"logs_keys_source": "Source",
"logs_keys_title": "Title",
"logs_keys_user": "User",
"logs_message_total": "Total Messages",
"logs_points": "Points Consumed",
"logs_response_time": "Average Response Time",
"logs_search_chat": "Search for session title or session ID",
"logs_source": "source",
"logs_title": "Title",
"look_ai_point_price": "View all model billing standards",
@ -134,6 +153,7 @@
"question_guide_tip": "After the conversation, 3 guiding questions will be generated for you.",
"reasoning_response": "Output thinking",
"response_format": "Response format",
"save_team_app_log_keys": "Save as team configuration",
"saved_success": "Saved successfully! \nTo use this version externally, click Save and Publish",
"search_app": "Search apps",
"search_tool": "Search Tools",
@ -147,7 +167,10 @@
"stop_sign_placeholder": "Multiple serial numbers are separated by |, for example: aaa|stop",
"stream_response": "Stream",
"stream_response_tip": "Turning this switch off forces the model to use non-streaming mode and will not output content directly. \nIn the output of the AI reply, the content output by this model can be obtained for secondary processing.",
"sync_log_keys_popover_text": "The current field configuration is only valid for individuals. Do you need to save it to the team configuration?",
"sync_team_app_log_keys": "Restore to team configuration",
"system_secret": "System secret",
"systemval_conflict_globalval": "The variable name conflicts with the system variable, please use other variable names",
"team_tags_set": "Team tags",
"temperature": "Temperature",
"temperature_tip": "Range 0~10. \nThe larger the value, the more divergent the models answer is; the smaller the value, the more rigorous the answer.",
@ -229,6 +252,8 @@
"upload_file_max_amount_tip": "Maximum number of files uploaded in a single round of conversation",
"variable.select type_desc": "You can define a global variable that does not need to be filled in by the user.\n\nThe value of this variable can come from the API interface, the Query of the shared link, or assigned through the [Variable Update] module.",
"variable.textarea_type_desc": "Allows users to input up to 4000 characters in the dialogue box.",
"variable_name_required": "Required variable name",
"variable_repeat": "This variable name has been occupied and cannot be used",
"version.Revert success": "Revert Successful",
"version_back": "Revert to Original State",
"version_copy": "Duplicate",

View File

@ -114,6 +114,8 @@
"can_copy_content_tip": "It is not possible to copy automatically using the browser, please manually copy the following content",
"chart_mode_cumulative": "Cumulative",
"chart_mode_incremental": "Incremental",
"chat": "Session",
"chat_chatId": "Session Id: {{chatId}}",
"choosable": "Choosable",
"chose_condition": "Choose Condition",
"chosen": "Chosen",
@ -679,7 +681,6 @@
"core.module.variable.add option": "Add Option",
"core.module.variable.input type": "Text",
"core.module.variable.key": "Variable Key",
"core.module.variable.key is required": "Variable Key is Required",
"core.module.variable.select type": "Dropdown Single Select",
"core.module.variable.text max length": "Max Length",
@ -878,6 +879,7 @@
"max_quote_tokens": "Quote cap",
"max_quote_tokens_tips": "The maximum number of tokens in a single search, about 1 character in Chinese = 1.7 tokens, and about 1 character in English = 1 token",
"mcp_server": "MCP Services",
"member": "member",
"min_similarity": "lowest correlation",
"min_similarity_tip": "The relevance of different index models is different. Please select the appropriate value through search testing. \nWhen using Result Rearrange , use the rearranged results for filtering.",
"model.billing": "Billing",

View File

@ -90,10 +90,29 @@
"llm_use_vision": "图片识别",
"llm_use_vision_tip": "点击模型选择后,可以看到模型是否支持图片识别以及控制是否启动图片识别的能力。启动图片识别后,模型会读取文件链接里图片内容,并且如果用户问题少于 500 字,会自动解析用户问题中的图片。",
"logs_chat_user": "使用者",
"logs_date": "日期",
"logs_empty": "还没有日志噢~",
"logs_error_count": "报错数量",
"logs_export_confirm_tip": "当前共有 {{total}} 条对话记录,每条对话最多可导出最新 100 条消息。确认导出?",
"logs_export_title": "时间,来源,使用者,联系方式,标题,消息总数,用户赞同反馈,用户反对反馈,自定义反馈,标注答案,对话详情",
"logs_key_config": "字段配置",
"logs_keys_annotatedCount": "标注答案数量",
"logs_keys_createdTime": "创建时间",
"logs_keys_customFeedback": "自定义反馈",
"logs_keys_errorCount": "报错数量",
"logs_keys_feedback": "用户反馈",
"logs_keys_lastConversationTime": "上次对话时间",
"logs_keys_messageCount": "消息总数",
"logs_keys_points": "积分消耗",
"logs_keys_responseTime": "平均响应时长",
"logs_keys_sessionId": "会话 ID",
"logs_keys_source": "来源",
"logs_keys_title": "标题",
"logs_keys_user": "使用者",
"logs_message_total": "消息总数",
"logs_points": "积分消耗",
"logs_response_time": "平均响应时长",
"logs_search_chat": "搜索会话标题或会话 ID",
"logs_source": "来源",
"logs_title": "标题",
"look_ai_point_price": "查看所有模型计费标准",
@ -134,6 +153,7 @@
"question_guide_tip": "对话结束后,会为你生成 3 个引导性问题。",
"reasoning_response": "输出思考",
"response_format": "回复格式",
"save_team_app_log_keys": "保存为团队配置",
"saved_success": "保存成功!如需在外部使用该版本,请点击“保存并发布”",
"search_app": "搜索应用",
"search_tool": "搜索工具",
@@ -147,7 +167,10 @@
"stop_sign_placeholder": "多个序列号通过 | 隔开例如aaa|stop",
"stream_response": "流输出",
"stream_response_tip": "关闭该开关,可以强制模型使用非流模式,并且不会直接进行内容输出。可以在 AI 回复的输出中,获取本次模型输出的内容进行二次处理。",
"sync_log_keys_popover_text": "当前字段配置仅对个人生效,是否需要保存至团队配置?",
"sync_team_app_log_keys": "还原成团队配置",
"system_secret": "系统密钥",
"systemval_conflict_globalval": "变量名与系统变量有冲突,请使用其他变量名",
"team_tags_set": "团队标签",
"temperature": "温度",
"temperature_tip": "范围 010。值越大代表模型回答越发散值越小代表回答越严谨。",
@@ -229,6 +252,8 @@
"upload_file_max_amount_tip": "单轮对话中最大上传文件数量",
"variable.select type_desc": "可以为工作流定义全局变量,常用临时缓存。赋值的方式包括:\n1. 从对话页面的 query 参数获取。\n2. 通过 API 的 variables 对象传递。\n3. 通过【变量更新】节点进行赋值。",
"variable.textarea_type_desc": "允许用户最多输入4000字的对话框。",
"variable_name_required": "变量名必填",
"variable_repeat": "该变量名已被占用,无法使用",
"version.Revert success": "回滚成功",
"version_back": "回到初始状态",
"version_copy": "副本",

View File

@@ -114,6 +114,8 @@
"can_copy_content_tip": "无法使用浏览器自动复制,请手动复制下面内容",
"chart_mode_cumulative": "累积",
"chart_mode_incremental": "分时",
"chat": "会话",
"chat_chatId": "会话Id: {{chatId}}",
"choosable": "可选",
"chose_condition": "选择条件",
"chosen": "已选",
@@ -679,7 +681,6 @@
"core.module.variable.add option": "添加选项",
"core.module.variable.input type": "文本",
"core.module.variable.key": "变量 key",
"core.module.variable.key already exists": "Key 已经存在",
"core.module.variable.key is required": "变量 key 是必须的",
"core.module.variable.select type": "下拉单选",
"core.module.variable.text max length": "最大长度",
@@ -878,6 +879,7 @@
"max_quote_tokens": "引用上限",
"max_quote_tokens_tips": "单次搜索最大的 token 数量,中文约 1 字=1.7 tokens英文约 1 字=1 token",
"mcp_server": "MCP 服务",
"member": "成员",
"min_similarity": "最低相关度",
"min_similarity_tip": "不同索引模型的相关度有区别,请通过搜索测试来选择合适的数值。使用 结果重排 时,使用重排结果进行过滤。",
"model.billing": "模型计费",

View File

@@ -90,10 +90,29 @@
"llm_use_vision": "圖片辨識",
"llm_use_vision_tip": "點選模型選擇後,可以看到模型是否支援圖片辨識以及控制是否啟用圖片辨識的功能。啟用圖片辨識後,模型會讀取檔案連結中的圖片內容,並且如果使用者問題少於 500 字,會自動解析使用者問題中的圖片。",
"logs_chat_user": "使用者",
"logs_date": "日期",
"logs_empty": "還沒有紀錄喔~",
"logs_error_count": "錯誤數量",
"logs_export_confirm_tip": "當前共有 {{total}} 條對話記錄,每條對話最多可導出最新 100 條消息。\n確認導出",
"logs_export_title": "時間,來源,使用者,聯絡方式,標題,訊息總數,使用者贊同回饋,使用者反對回饋,自定義回饋,標註答案,對話詳細資訊",
"logs_key_config": "字段配置",
"logs_keys_annotatedCount": "標記答案數量",
"logs_keys_createdTime": "建立時間",
"logs_keys_customFeedback": "自訂回饋",
"logs_keys_errorCount": "錯誤數量",
"logs_keys_feedback": "使用者回饋",
"logs_keys_lastConversationTime": "上次對話時間",
"logs_keys_messageCount": "訊息總數",
"logs_keys_points": "積分消耗",
"logs_keys_responseTime": "平均回應時長",
"logs_keys_sessionId": "會話 ID",
"logs_keys_source": "來源",
"logs_keys_title": "標題",
"logs_keys_user": "使用者",
"logs_message_total": "訊息總數",
"logs_points": "積分消耗",
"logs_response_time": "平均回應時長",
"logs_search_chat": "搜索會話標題或會話 ID",
"logs_source": "來源",
"logs_title": "標題",
"look_ai_point_price": "檢視所有模型計費標準",
@@ -134,6 +153,7 @@
"question_guide_tip": "對話結束後,會為你產生 3 個引導性問題。",
"reasoning_response": "輸出思考",
"response_format": "回覆格式",
"save_team_app_log_keys": "保存為團隊配置",
"saved_success": "儲存成功!\n如需在外部使用該版本請點選“儲存並發布”",
"search_app": "搜尋應用程式",
"search_tool": "搜索工具",
@@ -147,7 +167,10 @@
"stop_sign_placeholder": "多個序列號透過 | 隔開例如aaa|stop",
"stream_response": "流輸出",
"stream_response_tip": "關閉該開關​​,可以強制模型使用非流模式,並且不會直接進行內容輸出。\n可在 AI 回覆的輸出中,取得本次模型輸出的內容進行二次處理。",
"sync_log_keys_popover_text": "當前字段配置僅對個人生效,是否需要保存至團隊配置?",
"sync_team_app_log_keys": "還原成團隊配置",
"system_secret": "系統密鑰",
"systemval_conflict_globalval": "變量名與系統變量有衝突,請使用其他變量名",
"team_tags_set": "團隊標籤",
"temperature": "溫度",
"temperature_tip": "範圍 010。\n值越大代表模型回答越發散值越小代表回答越嚴謹。",
@@ -229,6 +252,8 @@
"upload_file_max_amount_tip": "單輪對話中最大上傳檔案數量",
"variable.select type_desc": "可以為工作流程定義全域變數,常用於暫存。賦值的方式包括:\n1. 從對話頁面的 query 參數取得。\n2. 透過 API 的 variables 物件傳遞。\n3. 透過【變數更新】節點進行賦值。",
"variable.textarea_type_desc": "允許使用者最多輸入 4000 字的對話框。",
"variable_name_required": "變量名必填",
"variable_repeat": "該變量名已被佔用,無法使用",
"version.Revert success": "復原成功",
"version_back": "回到初始狀態",
"version_copy": "副本",

View File

@@ -114,6 +114,8 @@
"can_copy_content_tip": "無法使用瀏覽器自動複製,請手動複製下面內容",
"chart_mode_cumulative": "累積",
"chart_mode_incremental": "分時",
"chat": "會話",
"chat_chatId": "會話Id: {{chatId}}",
"choosable": "可選擇",
"chose_condition": "選擇條件",
"chosen": "已選擇",
@@ -679,7 +681,6 @@
"core.module.variable.add option": "新增選項",
"core.module.variable.input type": "文字",
"core.module.variable.key": "變數鍵值",
"core.module.variable.key already exists": "鍵值已存在",
"core.module.variable.key is required": "變數鍵值為必填",
"core.module.variable.select type": "下拉單選",
"core.module.variable.text max length": "最大長度",
@@ -878,6 +879,7 @@
"max_quote_tokens": "引用上限",
"max_quote_tokens_tips": "單次搜尋最大的 token 數量,中文約 1 字=1.7 tokens英文約 1 字=1 token",
"mcp_server": "MCP 服務",
"member": "成員",
"min_similarity": "最低相關度",
"min_similarity_tip": "不同索引模型的相關度有區別,請透過搜尋測試來選擇合適的數值。\n使用 結果重排 時,使用重排結果過濾。",
"model.billing": "模型計費",

View File

@@ -121,8 +121,8 @@ importers:
packages/service:
dependencies:
'@fastgpt-sdk/plugin':
specifier: ^0.1.2
version: 0.1.2(@types/node@20.17.24)
specifier: ^0.1.4
version: 0.1.4(@types/node@20.17.24)
'@fastgpt/global':
specifier: workspace:*
version: link:../global
@@ -1973,8 +1973,8 @@ packages:
resolution: {integrity: sha512-d9zaMRSTIKDLhctzH12MtXvJKSSUhaHcjV+2Z+GK+EEY7XKpP5yR4x+N3TAcHTcu963nIr+TMcCb4DBCYX1z6Q==}
engines: {node: ^12.22.0 || ^14.17.0 || >=16.0.0}
'@fastgpt-sdk/plugin@0.1.2':
resolution: {integrity: sha512-z8C0TCCSFxJpF81+V654Xv6SVnLn+/yBNQW78EfZmHe3udZ/WZGmXUTACMac0YrdcT0Wi3TV2/+mmXfePA2CSw==}
'@fastgpt-sdk/plugin@0.1.4':
resolution: {integrity: sha512-/wDpUvof6f2Elher295D+Z7YDLKY8+PuMORmA7RT+IfQ1sq6OgmTFDYrACrsqxq3Y5mgU8bt4zd5og2U+SmgDQ==}
'@fastify/accept-negotiator@1.1.0':
resolution: {integrity: sha512-OIHZrb2ImZ7XG85HXOONLcJWGosv7sIvM2ifAPQVhg9Lv7qdmMBNVaai4QTdyuaqbKM5eO6sLSQOYI7wEQeCJQ==}
@@ -11208,7 +11208,7 @@ snapshots:
'@eslint/js@8.57.1': {}
'@fastgpt-sdk/plugin@0.1.2(@types/node@20.17.24)':
'@fastgpt-sdk/plugin@0.1.4(@types/node@20.17.24)':
dependencies:
'@fortaine/fetch-event-source': 3.0.6
'@ts-rest/core': 3.52.1(@types/node@20.17.24)(zod@3.25.51)
@@ -15098,7 +15098,7 @@
transitivePeerDependencies:
- supports-color
eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0):
eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0):
dependencies:
debug: 3.2.7
optionalDependencies:
@@ -15109,7 +15109,7 @@
transitivePeerDependencies:
- supports-color
eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1):
eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.57.1))(eslint@8.57.1):
dependencies:
debug: 3.2.7
optionalDependencies:
@@ -15131,7 +15131,7 @@
doctrine: 2.1.0
eslint: 8.56.0
eslint-import-resolver-node: 0.3.9
eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0)
eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0)
hasown: 2.0.2
is-core-module: 2.16.1
is-glob: 4.0.3
@@ -15160,7 +15160,7 @@
doctrine: 2.1.0
eslint: 8.57.1
eslint-import-resolver-node: 0.3.9
eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1)
eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.57.1))(eslint@8.57.1)
hasown: 2.0.2
is-core-module: 2.16.1
is-glob: 4.0.3

View File

@@ -1,6 +1,6 @@
{
"name": "app",
"version": "4.11.0",
"version": "4.11.1",
"private": false,
"scripts": {
"dev": "next dev",

View File

@@ -35,10 +35,10 @@ import DndDrag, {
type DraggableProvided,
type DraggableStateSnapshot
} from '@fastgpt/web/components/common/DndDrag';
import { workflowSystemVariables } from '@/web/core/app/utils';
import { getNanoid } from '@fastgpt/global/common/string/tools';
export const defaultVariable: VariableItemType = {
id: getNanoid(6),
key: '',
label: '',
type: VariableInputEnum.input,
@@ -47,12 +47,8 @@ export const defaultVariable: VariableItemType = {
valueType: WorkflowIOValueTypeEnum.string
};
type InputItemType = VariableItemType & {
list: { label: string; value: string }[];
};
export const addVariable = () => {
const newVariable = { ...defaultVariable, key: '', id: '', list: [{ value: '', label: '' }] };
const newVariable = { ...defaultVariable, list: [{ value: '', label: '' }] };
return newVariable;
};
@@ -103,23 +99,47 @@
});
}, [variables]);
/*
- New var: random key
- Update var: keep key
*/
const onSubmitSuccess = useCallback(
(data: InputItemType, action: 'confirm' | 'continue') => {
(data: VariableItemType, action: 'confirm' | 'continue') => {
data.label = data?.label?.trim();
if (!data.label) {
return toast({
status: 'warning',
title: t('app:variable_name_required')
});
}
const existingVariable = variables.find(
(item) => item.label === data.label && item.id !== data.id
);
// check if the variable already exists
const existingVariable = variables.find((item) => {
return item.key !== data.key && (data.label === item.label || data.label === item.key);
});
if (existingVariable) {
return toast({
status: 'warning',
title: t('app:variable_repeat')
});
}
// check if the variable is a system variable
if (
workflowSystemVariables.some(
(item) => item.key === data.label || t(item.label) === data.label
)
) {
toast({
status: 'warning',
title: t('common:core.module.variable.key already exists')
title: t('app:systemval_conflict_globalval')
});
return;
}
data.key = data.label;
data.enums = data.list;
if (data.type !== VariableInputEnum.select && data.list) {
delete data.list;
}
if (data.type === VariableInputEnum.custom) {
data.required = false;
@@ -127,16 +147,24 @@
data.valueType = inputTypeList.find((item) => item.value === data.type)?.defaultValueType;
}
const onChangeVariable = [...variables];
if (data.id) {
const index = variables.findIndex((item) => item.id === data.id);
onChangeVariable[index] = data;
} else {
onChangeVariable.push({
...data,
id: getNanoid(6)
});
}
const onChangeVariable = (() => {
if (data.key) {
return variables.map((item) => {
if (item.key === data.key) {
return data;
}
return item;
});
}
return [
...variables,
{
...data,
key: getNanoid(8)
}
];
})();
if (action === 'confirm') {
onChange(onChangeVariable);
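The replacement hunk above switches variable submission from id-based indexing to a key-based upsert: an existing key updates in place, an empty key appends with a freshly generated key. A minimal standalone sketch of that pattern — `VariableLike` and `genKey` are illustrative stand-ins (not the real `VariableItemType` or `getNanoid(8)`):

```typescript
// Minimal stand-in for VariableItemType; the real type has more fields.
type VariableLike = { key: string; label: string };

// Hypothetical deterministic key generator standing in for getNanoid(8).
let seq = 0;
const genKey = (): string => `k${(seq++).toString(36)}`;

// Mirror of the diff's logic: an existing key replaces its entry in place,
// an empty key appends a new variable carrying a generated key.
function upsertVariable(variables: VariableLike[], data: VariableLike): VariableLike[] {
  if (data.key) {
    return variables.map((item) => (item.key === data.key ? data : item));
  }
  return [...variables, { ...data, key: genKey() }];
}
```

Keying on `key` instead of `id` also lines up with the later hunks, where `Draggable` and the delete handler switch to `item.key`.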
@@ -226,7 +254,7 @@
{({ provided }) => (
<Tbody {...provided.droppableProps} ref={provided.innerRef}>
{formatVariables.map((item, index) => (
<Draggable key={item.id} draggableId={item.id} index={index}>
<Draggable key={item.key} draggableId={item.key} index={index}>
{(provided, snapshot) => (
<TableItem
provided={provided}
@@ -235,7 +263,7 @@
reset={reset}
onChange={onChange}
variables={variables}
key={item.id}
key={item.key}
/>
)}
</Draggable>
@@ -370,7 +398,7 @@
<Td fontWeight={'medium'}>
<Flex alignItems={'center'}>
<MyIcon name={item.icon as any} w={'16px'} color={'myGray.400'} mr={1} />
{item.key}
{item.label}
</Flex>
</Td>
<Td>
@@ -385,7 +413,10 @@
onClick={() => {
const formattedItem = {
...item,
list: item.enums?.map((item) => ({ label: item.value, value: item.value })) || []
list:
item.list ||
item.enums?.map((item) => ({ label: item.value, value: item.value })) ||
[]
};
reset(formattedItem);
}}
@@ -393,7 +424,7 @@
<MyIconButton
icon={'delete'}
hoverColor={'red.500'}
onClick={() => onChange(variables.filter((variable) => variable.id !== item.id))}
onClick={() => onChange(variables.filter((variable) => variable.key !== item.key))}
/>
</Flex>
</Td>

View File

@@ -34,13 +34,15 @@
placeholder,
inputType,
variablesForm,
showValueType,
...props
}: {
formKey: string;
label: string;
label: string | React.ReactNode;
required?: boolean;
placeholder?: string;
variablesForm: UseFormReturn<any>;
showValueType?: boolean;
} & SpecificProps &
BoxProps) => {
const { control } = variablesForm;
@@ -48,7 +50,7 @@
return (
<Box _notLast={{ mb: 4 }}>
<Flex alignItems={'center'} mb={1}>
<FormLabel required={required}>{label}</FormLabel>
{typeof label === 'string' ? <FormLabel required={required}>{label}</FormLabel> : label}
{placeholder && <QuestionTip ml={1} label={placeholder} />}
</Flex>

View File

@@ -11,7 +11,7 @@ import MultipleSelect, {
import JSONEditor from '@fastgpt/web/components/common/Textarea/JsonEditor';
import AIModelSelector from '../../../Select/AIModelSelector';
import FileSelector from '../../../Select/FileSelector';
import { useTranslation } from '@fastgpt/web/hooks/useSafeTranslation';
import { useSafeTranslation } from '@fastgpt/web/hooks/useSafeTranslation';
const InputRender = (props: InputRenderProps) => {
const {
@@ -28,7 +28,7 @@
return <>{customRender(props)}</>;
}
const { t } = useTranslation();
const { t } = useSafeTranslation();
const {
value: selectedValue,
setValue,
@@ -81,6 +81,7 @@
return (
<MyNumberInput
{...commonProps}
value={value ?? ''}
min={props.min}
max={props.max}
bg={undefined}

View File

@@ -42,6 +42,7 @@
};
const handleEnterKeyDown = (e: React.KeyboardEvent<HTMLInputElement>) => {
e.stopPropagation();
if (e.key.toLowerCase() !== 'enter') return;
handleSubmit(onSubmit, onError)();
};
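The guard added above makes the handler submit only when the pressed key is Enter. The predicate can be exercised on its own; `KeyEventLike` is a minimal stand-in for `React.KeyboardEvent`, not the real type:

```typescript
// Minimal stand-in for the part of React.KeyboardEvent used here.
type KeyEventLike = { key: string };

// Case-insensitive Enter check, matching the early return in the diff.
// e.key is 'Enter' for both the main and numpad Enter keys.
const isEnterKey = (e: KeyEventLike): boolean => e.key.toLowerCase() === 'enter';
```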

View File

@@ -7,7 +7,8 @@ export type GetAppChatLogsProps = {
dateStart: Date;
dateEnd: Date;
sources?: ChatSourceEnum[];
logTitle?: string;
tmbIds?: string[];
chatSearch?: string;
};
export type GetAppChatLogsParams = PaginationProps<GetAppChatLogsProps>;

View File

@@ -185,7 +185,6 @@ const ChannelLog = ({ Tab }: { Tab: React.ReactNode }) => {
<DateRangePicker
defaultDate={filterProps.dateRange}
dateRange={filterProps.dateRange}
position="bottom"
onSuccess={(e) => setFilterProps({ ...filterProps, dateRange: e })}
/>
</Box>

View File

@@ -374,7 +374,6 @@ const ModelDashboard = ({ Tab }: { Tab: React.ReactNode }) => {
<DateRangePicker
defaultDate={filterProps.dateRange}
dateRange={filterProps.dateRange}
position="bottom"
onSuccess={handleDateRangeChange}
/>
</Box>

View File

@@ -15,12 +15,12 @@ import { type UsageItemType } from '@fastgpt/global/support/wallet/usage/type.d'
import dayjs from 'dayjs';
import { UsageSourceMap } from '@fastgpt/global/support/wallet/usage/constants';
import MyModal from '@fastgpt/web/components/common/MyModal';
import { useTranslation } from 'next-i18next';
import { formatNumber } from '@fastgpt/global/common/math/tools';
import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel';
import { useSafeTranslation } from '@fastgpt/web/hooks/useSafeTranslation';
const UsageDetail = ({ usage, onClose }: { usage: UsageItemType; onClose: () => void }) => {
const { t } = useTranslation();
const { t } = useSafeTranslation();
const filterBillList = useMemo(
() => usage.list.filter((item) => item && item.moduleName),
[usage.list]

View File

@@ -27,6 +27,7 @@ import { type UsageFilterParams } from './type';
import PopoverConfirm from '@fastgpt/web/components/common/MyPopover/PopoverConfirm';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { downloadFetch } from '@/web/common/system/utils';
import { useSafeTranslation } from '@fastgpt/web/hooks/useSafeTranslation';
const UsageDetail = dynamic(() => import('./UsageDetail'));
@@ -39,7 +40,7 @@ const UsageTableList = ({
Selectors: React.ReactNode;
filterParams: UsageFilterParams;
}) => {
const { t } = useTranslation();
const { t } = useSafeTranslation();
const { dateRange, selectTmbIds, isSelectAllTmb, usageSources, isSelectAllSource, projectName } =
filterParams;
@@ -147,7 +148,9 @@
</Flex>
</Td>
<Td>{t(UsageSourceMap[item.source]?.label as any) || '-'}</Td>
<Td>{t(item.appName as any) || '-'}</Td>
<Td className="textEllipsis" maxW={'400px'} title={t(item.appName as any)}>
{t(item.appName as any) || '-'}
</Td>
<Td>{formatNumber(item.totalPoints) || 0}</Td>
<Td>
<Button

View File

@@ -134,6 +134,7 @@ const DetailLogsModal = ({ appId, chatId, onClose }: Props) => {
totalRecordsCount={totalRecordsCount}
title={title || ''}
chatModels={chatModels}
chatId={chatId}
/>
<Box flex={1} />
</>

View File

@@ -0,0 +1,156 @@
import { Box, Button, Flex } from '@chakra-ui/react';
import MyPopover from '@fastgpt/web/components/common/MyPopover';
import { useTranslation } from 'next-i18next';
import { AppLogKeysEnumMap } from '@fastgpt/global/core/app/logs/constants';
import type {
DraggableProvided,
DraggableStateSnapshot
} from '@fastgpt/web/components/common/DndDrag';
import DndDrag, { Draggable } from '@fastgpt/web/components/common/DndDrag';
import MyIcon from '@fastgpt/web/components/common/Icon';
import React from 'react';
import type { AppLogKeysType } from '@fastgpt/global/core/app/logs/type';
const LogKeysConfigPopover = ({
logKeysList,
setLogKeysList
}: {
logKeysList: AppLogKeysType[];
setLogKeysList: (logKeysList: AppLogKeysType[] | undefined) => void;
}) => {
const { t } = useTranslation();
return (
<MyPopover
placement="bottom-end"
w={'300px'}
closeOnBlur={true}
trigger="click"
Trigger={
<Button
size={'md'}
variant={'whiteBase'}
leftIcon={<MyIcon name={'common/setting'} w={'18px'} />}
>
{t('app:logs_key_config')}
</Button>
}
>
{({ onClose }) => {
return (
<Box p={4} overflowY={'auto'} maxH={['300px', '500px']}>
<DndDrag<AppLogKeysType>
onDragEndCb={setLogKeysList}
dataList={logKeysList}
renderClone={(provided, snapshot, rubric) => (
<DragItem
item={logKeysList[rubric.source.index]}
provided={provided}
snapshot={snapshot}
logKeys={logKeysList}
setLogKeys={setLogKeysList}
/>
)}
>
{({ provided }) => (
<Box {...provided.droppableProps} ref={provided.innerRef}>
{logKeysList.map((item, index) => (
<Draggable key={item.key} draggableId={item.key} index={index}>
{(provided, snapshot) => (
<>
<DragItem
item={item}
provided={provided}
snapshot={snapshot}
logKeys={logKeysList}
setLogKeys={setLogKeysList}
/>
{index !== logKeysList.length - 1 && <Box h={'1px'} bg={'myGray.200'} />}
</>
)}
</Draggable>
))}
</Box>
)}
</DndDrag>
</Box>
);
}}
</MyPopover>
);
};
export default LogKeysConfigPopover;
const DragItem = ({
item,
provided,
snapshot,
logKeys,
setLogKeys
}: {
item: AppLogKeysType;
provided: DraggableProvided;
snapshot: DraggableStateSnapshot;
logKeys: AppLogKeysType[];
setLogKeys: (logKeys: AppLogKeysType[]) => void;
}) => {
const { t } = useTranslation();
return (
<Flex
ref={provided.innerRef}
{...provided.draggableProps}
style={{
...provided.draggableProps.style,
opacity: snapshot.isDragging ? 0.8 : 1
}}
alignItems={'center'}
py={1}
>
<Box {...provided.dragHandleProps}>
<MyIcon
name={'drag'}
p={2}
borderRadius={'md'}
_hover={{ color: 'primary.600' }}
w={'12px'}
color={'myGray.600'}
/>
</Box>
<Box fontSize={'14px'} color={'myGray.900'}>
{t(AppLogKeysEnumMap[item.key])}
</Box>
<Box flex={1} />
{item.enable ? (
<MyIcon
name={'visible'}
borderRadius={'md'}
w={4}
p={1}
cursor={'pointer'}
color={'primary.600'}
_hover={{ bg: 'myGray.50' }}
onClick={() => {
setLogKeys(
logKeys.map((key) => (key.key === item.key ? { ...key, enable: false } : key))
);
}}
/>
) : (
<MyIcon
name={'invisible'}
borderRadius={'md'}
w={4}
p={1}
cursor={'pointer'}
_hover={{ bg: 'myGray.50' }}
onClick={() => {
setLogKeys(
logKeys.map((key) => (key.key === item.key ? { ...key, enable: true } : key))
);
}}
/>
)}
</Flex>
);
};

View File

@@ -0,0 +1,88 @@
import MyPopover from '@fastgpt/web/components/common/MyPopover';
import { Box, Button, Flex } from '@chakra-ui/react';
import MyIcon from '@fastgpt/web/components/common/Icon';
import { useTranslation } from 'next-i18next';
import React from 'react';
import type { updateLogKeysBody } from '@/pages/api/core/app/logs/updateLogKeys';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { updateLogKeys } from '@/web/core/app/api/log';
import { useContextSelector } from 'use-context-selector';
import { AppContext } from '../context';
import type { AppLogKeysType } from '@fastgpt/global/core/app/logs/type';
const SyncLogKeysPopover = ({
logKeys,
setLogKeys,
teamLogKeys,
fetchLogKeys
}: {
logKeys: AppLogKeysType[];
setLogKeys: (logKeys: AppLogKeysType[]) => void;
teamLogKeys: AppLogKeysType[];
fetchLogKeys: () => Promise<AppLogKeysType[]>;
}) => {
const { t } = useTranslation();
const appId = useContextSelector(AppContext, (v) => v.appId);
const { runAsync: updateList, loading: updateLoading } = useRequest2(
async (data: updateLogKeysBody) => {
await updateLogKeys(data);
},
{
manual: true,
onSuccess: async () => {
await fetchLogKeys();
}
}
);
return (
<MyPopover
placement="bottom-end"
w={'300px'}
closeOnBlur={true}
trigger="click"
Trigger={
<Flex alignItems={'center'} cursor={'pointer'}>
<MyIcon name="common/warn" w={4} color={'yellow.500'} />
</Flex>
}
>
{({ onClose }) => {
return (
<Box p={4}>
<Box mb={4}>{t('app:sync_log_keys_popover_text')}</Box>
<Flex justifyContent={'end'} gap={2}>
<Button
variant={'outline'}
size={'sm'}
onClick={() => {
setLogKeys(teamLogKeys);
onClose();
}}
>
{t('app:sync_team_app_log_keys')}
</Button>
<Button
size={'sm'}
isLoading={updateLoading}
onClick={async () => {
await updateList({
appId: appId,
logKeys
});
onClose();
}}
>
{t('app:save_team_app_log_keys')}
</Button>
</Flex>
</Box>
);
}}
</MyPopover>
);
};
export default SyncLogKeysPopover;

View File

@@ -1,4 +1,4 @@
import React, { useEffect, useMemo, useState } from 'react';
import React, { useMemo, useState } from 'react';
import {
Flex,
Box,
@@ -10,12 +10,13 @@
Td,
Tbody,
HStack,
Button
Button,
Input
} from '@chakra-ui/react';
import UserBox from '@fastgpt/web/components/common/UserBox';
import MyIcon from '@fastgpt/web/components/common/Icon';
import { useTranslation } from 'next-i18next';
import { getAppChatLogs } from '@/web/core/app/api';
import { getAppChatLogs } from '@/web/core/app/api/log';
import dayjs from 'dayjs';
import { ChatSourceEnum, ChatSourceMap } from '@fastgpt/global/core/chat/constants';
import { addDays } from 'date-fns';
@@ -27,16 +28,26 @@
import { useContextSelector } from 'use-context-selector';
import { AppContext } from '../context';
import { cardStyles } from '../constants';
import dynamic from 'next/dynamic';
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
import MultipleSelect, {
useMultipleSelect
} from '@fastgpt/web/components/common/MySelect/MultipleSelect';
import SearchInput from '@fastgpt/web/components/common/Input/SearchInput';
import PopoverConfirm from '@fastgpt/web/components/common/MyPopover/PopoverConfirm';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { downloadFetch } from '@/web/common/system/utils';
import LogKeysConfigPopover from './LogKeysConfigPopover';
import { getLogKeys } from '@/web/core/app/api/log';
import { AppLogKeysEnum } from '@fastgpt/global/core/app/logs/constants';
import { DefaultAppLogKeys } from '@fastgpt/global/core/app/logs/constants';
import { useScrollPagination } from '@fastgpt/web/hooks/useScrollPagination';
import { getTeamMembers } from '@/web/support/user/team/api';
import Avatar from '@fastgpt/web/components/common/Avatar';
import { useLocalStorageState } from 'ahooks';
import type { AppLogKeysType } from '@fastgpt/global/core/app/logs/type';
import type { AppLogsListItemType } from '@/types/app';
import SyncLogKeysPopover from './SyncLogKeysPopover';
import { isEqual } from 'lodash';
const DetailLogsModal = dynamic(() => import('./DetailLogsModal'));
@@ -51,7 +62,105 @@
});
const [detailLogsId, setDetailLogsId] = useState<string>();
const [logTitle, setLogTitle] = useState<string>();
const [tmbInputValue, setTmbInputValue] = useState('');
const [chatSearch, setChatSearch] = useState('');
const getCellRenderMap = (item: AppLogsListItemType) => ({
[AppLogKeysEnum.SOURCE]: (
<Td key={AppLogKeysEnum.SOURCE}>
{/* @ts-ignore */}
{item.sourceName || t(ChatSourceMap[item.source]?.name) || item.source}
</Td>
),
[AppLogKeysEnum.CREATED_TIME]: (
<Td key={AppLogKeysEnum.CREATED_TIME}>{dayjs(item.createTime).format('YYYY/MM/DD HH:mm')}</Td>
),
[AppLogKeysEnum.LAST_CONVERSATION_TIME]: (
<Td key={AppLogKeysEnum.LAST_CONVERSATION_TIME}>
{dayjs(item.updateTime).format('YYYY/MM/DD HH:mm')}
</Td>
),
[AppLogKeysEnum.USER]: (
<Td key={AppLogKeysEnum.USER}>
<Box>
{!!item.outLinkUid ? item.outLinkUid : <UserBox sourceMember={item.sourceMember} />}
</Box>
</Td>
),
[AppLogKeysEnum.TITLE]: (
<Td key={AppLogKeysEnum.TITLE} className="textEllipsis" maxW={'250px'}>
{item.customTitle || item.title}
</Td>
),
[AppLogKeysEnum.SESSION_ID]: (
<Td key={AppLogKeysEnum.SESSION_ID} className="textEllipsis" maxW={'200px'}>
{item.id || '-'}
</Td>
),
[AppLogKeysEnum.MESSAGE_COUNT]: <Td key={AppLogKeysEnum.MESSAGE_COUNT}>{item.messageCount}</Td>,
[AppLogKeysEnum.FEEDBACK]: (
<Td key={AppLogKeysEnum.FEEDBACK} w={'100px'}>
{!!item?.userGoodFeedbackCount && (
<Flex
mb={item?.userGoodFeedbackCount ? 1 : 0}
bg={'green.100'}
color={'green.600'}
px={3}
py={1}
alignItems={'center'}
justifyContent={'center'}
borderRadius={'md'}
fontWeight={'bold'}
>
<MyIcon mr={1} name={'core/chat/feedback/goodLight'} color={'green.600'} w={'14px'} />
{item.userGoodFeedbackCount}
</Flex>
)}
{!!item?.userBadFeedbackCount && (
<Flex
bg={'#FFF2EC'}
color={'#C96330'}
px={3}
py={1}
alignItems={'center'}
justifyContent={'center'}
borderRadius={'md'}
fontWeight={'bold'}
>
<MyIcon mr={1} name={'core/chat/feedback/badLight'} color={'#C96330'} w={'14px'} />
{item.userBadFeedbackCount}
</Flex>
)}
{!item?.userGoodFeedbackCount && !item?.userBadFeedbackCount && <>-</>}
</Td>
),
[AppLogKeysEnum.CUSTOM_FEEDBACK]: (
<Td key={AppLogKeysEnum.CUSTOM_FEEDBACK}>{item.customFeedbacksCount || '-'}</Td>
),
[AppLogKeysEnum.ANNOTATED_COUNT]: (
<Td key={AppLogKeysEnum.ANNOTATED_COUNT}>{item.markCount}</Td>
),
[AppLogKeysEnum.RESPONSE_TIME]: (
<Td key={AppLogKeysEnum.RESPONSE_TIME}>
{item.averageResponseTime ? `${item.averageResponseTime.toFixed(2)}s` : '-'}
</Td>
),
[AppLogKeysEnum.ERROR_COUNT]: (
<Td key={AppLogKeysEnum.ERROR_COUNT}>{item.errorCount || '-'}</Td>
),
[AppLogKeysEnum.POINTS]: (
<Td key={AppLogKeysEnum.POINTS}>
{item.totalPoints ? `${item.totalPoints.toFixed(2)}` : '-'}
</Td>
)
});
const {
value: selectTmbIds,
setValue: setSelectTmbIds,
isSelectAll: isSelectAllTmb,
setIsSelectAll: setIsSelectAllTmb
} = useMultipleSelect<string>([], true);
const {
value: chatSources,
@@ -75,9 +184,19 @@
dateStart: dateRange.from!,
dateEnd: dateRange.to!,
sources: isSelectAllSource ? undefined : chatSources,
logTitle
tmbIds: isSelectAllTmb ? undefined : selectTmbIds,
chatSearch
}),
[appId, chatSources, dateRange.from, dateRange.to, isSelectAllSource, logTitle]
[
appId,
chatSources,
dateRange.from,
dateRange.to,
isSelectAllSource,
selectTmbIds,
isSelectAllTmb,
chatSearch
]
);
const {
data: logs,
@@ -92,6 +211,66 @@
refreshDeps: [params]
});
const [logKeys = DefaultAppLogKeys, setLogKeys] = useLocalStorageState<AppLogKeysType[]>(
`app_log_keys_${appId}`
);
const { runAsync: fetchLogKeys, data: teamLogKeys = [] } = useRequest2(
async () => {
const res = await getLogKeys({ appId });
const keys = res.logKeys.length > 0 ? res.logKeys : DefaultAppLogKeys;
setLogKeys(keys);
return keys;
},
{
manual: false,
refreshDeps: [appId]
}
);
const HeaderRenderMap = useMemo(
() => ({
[AppLogKeysEnum.SOURCE]: <Th key={AppLogKeysEnum.SOURCE}>{t('app:logs_keys_source')}</Th>,
[AppLogKeysEnum.CREATED_TIME]: (
<Th key={AppLogKeysEnum.CREATED_TIME}>{t('app:logs_keys_createdTime')}</Th>
),
[AppLogKeysEnum.LAST_CONVERSATION_TIME]: (
<Th key={AppLogKeysEnum.LAST_CONVERSATION_TIME}>
{t('app:logs_keys_lastConversationTime')}
</Th>
),
[AppLogKeysEnum.USER]: <Th key={AppLogKeysEnum.USER}>{t('app:logs_chat_user')}</Th>,
[AppLogKeysEnum.TITLE]: <Th key={AppLogKeysEnum.TITLE}>{t('app:logs_title')}</Th>,
[AppLogKeysEnum.SESSION_ID]: (
<Th key={AppLogKeysEnum.SESSION_ID}>{t('app:logs_keys_sessionId')}</Th>
),
[AppLogKeysEnum.MESSAGE_COUNT]: (
<Th key={AppLogKeysEnum.MESSAGE_COUNT}>{t('app:logs_message_total')}</Th>
),
[AppLogKeysEnum.FEEDBACK]: <Th key={AppLogKeysEnum.FEEDBACK}>{t('app:feedback_count')}</Th>,
[AppLogKeysEnum.CUSTOM_FEEDBACK]: (
<Th key={AppLogKeysEnum.CUSTOM_FEEDBACK}>
{t('common:core.app.feedback.Custom feedback')}
</Th>
),
[AppLogKeysEnum.ANNOTATED_COUNT]: (
<Th key={AppLogKeysEnum.ANNOTATED_COUNT}>
<Flex gap={1} alignItems={'center'}>
{t('app:mark_count')}
<QuestionTip label={t('common:core.chat.Mark Description')} />
</Flex>
</Th>
),
[AppLogKeysEnum.RESPONSE_TIME]: (
<Th key={AppLogKeysEnum.RESPONSE_TIME}>{t('app:logs_response_time')}</Th>
),
[AppLogKeysEnum.ERROR_COUNT]: (
<Th key={AppLogKeysEnum.ERROR_COUNT}>{t('app:logs_error_count')}</Th>
),
[AppLogKeysEnum.POINTS]: <Th key={AppLogKeysEnum.POINTS}>{t('app:logs_points')}</Th>
}),
[t]
);
const { runAsync: exportLogs } = useRequest2(
async () => {
await downloadFetch({
@@ -102,7 +281,8 @@
dateStart: dateRange.from || new Date(),
dateEnd: addDays(dateRange.to || new Date(), 1),
sources: isSelectAllSource ? undefined : chatSources,
logTitle,
tmbIds: isSelectAllTmb ? undefined : selectTmbIds,
chatSearch,
title: t('app:logs_export_title'),
sourcesMap: Object.fromEntries(
@@ -117,10 +297,36 @@
});
},
{
refreshDeps: [chatSources, logTitle]
refreshDeps: [chatSources]
}
);
const { data: members, ScrollData: TmbScrollData } = useScrollPagination(getTeamMembers, {
params: { searchKey: tmbInputValue },
refreshDeps: [tmbInputValue]
});
const tmbList = useMemo(
() =>
members.map((item) => ({
label: (
<HStack spacing={1}>
<Avatar src={item.avatar} w={'1.2rem'} rounded={'full'} />
<Box color={'myGray.900'} className="textEllipsis">
{item.memberName}
</Box>
</HStack>
),
value: item.tmbId
})),
[members]
);
const showSyncPopover = useMemo(() => {
const teamLogKeysList = teamLogKeys.filter((item) => item.enable);
const personalLogKeysList = logKeys.filter((item) => item.enable);
return !isEqual(teamLogKeysList, personalLogKeysList);
}, [teamLogKeys, logKeys]);
return (
<Flex
flexDirection={'column'}
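The `showSyncPopover` memo above compares only the enabled entries of the personal and team log-key configs via lodash `isEqual`. A simplified, dependency-free equivalent — assuming order, membership, and count of the enabled keys are what matters:

```typescript
type AppLogKey = { key: string; enable: boolean };

// Enabled keys, in display order.
const enabledKeys = (keys: AppLogKey[]): string[] =>
  keys.filter((k) => k.enable).map((k) => k.key);

// True when the enabled subsets differ in count, membership, or order,
// i.e. when the sync-to-team popover should be shown.
function logKeysDiffer(teamKeys: AppLogKey[], personalKeys: AppLogKey[]): boolean {
  const a = enabledKeys(teamKeys);
  const b = enabledKeys(personalKeys);
  return a.length !== b.length || a.some((key, i) => key !== b[i]);
}
```

Unlike `isEqual` on the filtered objects, this sketch ignores any extra fields on the entries; for arrays of `{ key, enable: true }` the two checks agree.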
@@ -131,49 +337,115 @@
py={[4, 6]}
flex={'1 0 0'}
>
<Flex flexDir={['column', 'row']} alignItems={['flex-start', 'center']} gap={3}>
<Flex alignItems={'center'} gap={2}>
<Box fontSize={'mini'} fontWeight={'medium'} color={'myGray.900'}>
{t('app:logs_source')}
</Box>
<Box>
<MultipleSelect<ChatSourceEnum>
list={sourceList}
value={chatSources}
onSelect={setChatSources}
isSelectAll={isSelectAllSource}
setIsSelectAll={setIsSelectAllSource}
itemWrap={false}
height={'32px'}
bg={'myGray.50'}
w={'160px'}
/>
</Box>
<Flex alignItems={'center'} flexWrap={'wrap'} gap={3}>
<Flex>
<MultipleSelect<ChatSourceEnum>
list={sourceList}
value={chatSources}
onSelect={setChatSources}
isSelectAll={isSelectAllSource}
setIsSelectAll={setIsSelectAllSource}
h={9}
w={'226px'}
rounded={'8px'}
tagStyle={{
px: 1,
py: 1,
borderRadius: 'sm',
bg: 'myGray.100',
color: 'myGray.900'
}}
borderColor={'myGray.200'}
formLabel={t('app:logs_source')}
/>
</Flex>
<Flex>
<DateRangePicker
defaultDate={dateRange}
position="bottom"
onSuccess={(date) => {
setDateRange(date);
}}
bg={'white'}
h={9}
w={'240px'}
rounded={'8px'}
borderColor={'myGray.200'}
formLabel={t('app:logs_date')}
_hover={{
borderColor: 'primary.300'
}}
/>
</Flex>
<Flex>
<MultipleSelect<string>
list={tmbList}
value={selectTmbIds}
onSelect={(val) => {
setSelectTmbIds(val as string[]);
}}
ScrollData={TmbScrollData}
isSelectAll={isSelectAllTmb}
setIsSelectAll={setIsSelectAllTmb}
h={9}
w={'226px'}
rounded={'8px'}
formLabel={t('common:member')}
tagStyle={{
px: 1,
borderRadius: 'sm',
bg: 'myGray.100',
w: '76px'
}}
inputValue={tmbInputValue}
setInputValue={setTmbInputValue}
/>
</Flex>
<Flex
w={'226px'}
h={9}
alignItems={'center'}
rounded={'8px'}
border={'1px solid'}
borderColor={'myGray.200'}
_focusWithin={{
borderColor: 'primary.600',
boxShadow: '0 0 0 2.4px rgba(51, 112, 255, 0.15)'
}}
pl={3}
>
<Box rounded={'8px'} bg={'white'} fontSize={'sm'} border={'none'} whiteSpace={'nowrap'}>
{t('common:chat')}
</Box>
<Box w={'1px'} h={'12px'} bg={'myGray.200'} mx={2} />
<Input
placeholder={t('app:logs_search_chat')}
value={chatSearch}
onChange={(e) => setChatSearch(e.target.value)}
fontSize={'sm'}
border={'none'}
pl={0}
_focus={{
boxShadow: 'none'
}}
_placeholder={{
fontSize: 'sm'
}}
/>
</Flex>
<Box flex={'1'} />
{showSyncPopover && (
<SyncLogKeysPopover
logKeys={logKeys}
setLogKeys={setLogKeys}
teamLogKeys={teamLogKeys || []}
fetchLogKeys={fetchLogKeys}
/>
)}
<LogKeysConfigPopover
logKeysList={logKeys || DefaultAppLogKeys}
setLogKeysList={setLogKeys}
/>
<PopoverConfirm
Trigger={<Button size={'md'}>{t('common:Export')}</Button>}
showCancel
@@ -186,95 +458,28 @@ const Logs = () => {
<Table variant={'simple'} fontSize={'sm'}>
<Thead>
<Tr>
{(logKeys || DefaultAppLogKeys)
.filter((logKey) => logKey.enable)
.map((logKey) => HeaderRenderMap[logKey.key])}
</Tr>
</Thead>
<Tbody fontSize={'xs'}>
{logs.map((item) => {
const cellRenderMap = getCellRenderMap(item);
return (
<Tr
key={item._id}
_hover={{ bg: 'myWhite.600' }}
cursor={'pointer'}
title={t('common:core.view_chat_detail')}
onClick={() => setDetailLogsId(item.id)}
>
{(logKeys || DefaultAppLogKeys)
.filter((logKey) => logKey.enable)
.map((logKey) => cellRenderMap[logKey.key])}
</Tr>
);
})}
</Tbody>
</Table>
{logs.length === 0 && !isLoading && <EmptyTip text={t('app:logs_empty')}></EmptyTip>}
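The table refactor above replaces hard-coded `<Th>`/`<Td>` cells with header and cell render maps keyed by log-key name, so the visible columns follow the user's enabled-keys configuration. A simplified sketch of the pattern with strings standing in for JSX nodes (all names here are illustrative, not the component's real types):

```typescript
// Column configuration entry, as persisted per user/team.
type LogKey = { key: string; enable: boolean };
// Minimal stand-in for one log row.
type LogRow = { title: string; messageCount: number; markCount: number };

// Header map: one renderer per column key (the component returns <Th> nodes).
const headerRenderMap: Record<string, () => string> = {
  title: () => 'Title',
  messageCount: () => 'Messages',
  markCount: () => 'Marks'
};

// Per-row cell map, built once per item (the component returns <Td> nodes).
const cellRenderMap = (row: LogRow): Record<string, () => string> => ({
  title: () => row.title,
  messageCount: () => String(row.messageCount),
  markCount: () => String(row.markCount)
});

// Only enabled keys are rendered, in their configured order.
function renderRow(logKeys: LogKey[], row: LogRow): string[] {
  const cells = cellRenderMap(row);
  return logKeys.filter((k) => k.enable).map((k) => cells[k.key]());
}
```

Keeping headers and cells in parallel maps means adding a column is a data change (one entry in each map) rather than an edit to the table markup.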


@@ -4,20 +4,21 @@ import { useContextSelector } from 'use-context-selector';
import { AppContext } from '../context';
import ChatItemContextProvider from '@/web/core/chat/context/chatItemContext';
import ChatRecordContextProvider from '@/web/core/chat/context/chatRecordContext';
import { Box, Button, Flex, HStack } from '@chakra-ui/react';
import { cardStyles } from '../constants';
import { useTranslation } from 'react-i18next';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { useForm } from 'react-hook-form';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import Markdown from '@/components/Markdown';
import { postRunMCPTool } from '@/web/core/app/api/plugin';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
import InputRender from '@/components/core/app/formRender';
import { valueTypeToInputType } from '@/components/core/app/formRender/utils';
import { getNodeInputTypeFromSchemaInputType } from '@fastgpt/global/core/app/jsonschema';
import LabelAndFormRender from '@/components/core/app/formRender/LabelAndForm';
import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel';
import ValueTypeLabel from '../WorkflowComponents/Flow/nodes/render/ValueTypeLabel';
import { valueTypeFormat } from '@fastgpt/global/core/workflow/runtime/utils';
const ChatTest = ({
currentTool,
@@ -32,7 +33,8 @@ const ChatTest = ({
const [output, setOutput] = useState<string>('');
const form = useForm();
const { handleSubmit, reset } = form;
useEffect(() => {
reset({});
@@ -42,6 +44,20 @@
const { runAsync: runTool, loading: isRunning } = useRequest2(
async (data: Record<string, any>) => {
if (!currentTool) return;
// Format type
Object.entries(currentTool?.inputSchema.properties || {}).forEach(
([paramName, paramInfo]) => {
const valueType = getNodeInputTypeFromSchemaInputType({
type: paramInfo.type,
arrayItems: paramInfo.items
});
if (data[paramName] !== undefined) {
data[paramName] = valueTypeFormat(data[paramName], valueType);
}
}
);
return await postRunMCPTool({
params: data,
url,
@@ -89,48 +105,35 @@
</Box>
<Box border={'1px solid'} borderColor={'myGray.200'} borderRadius={'8px'} p={3}>
{Object.entries(currentTool?.inputSchema.properties || {}).map(
([paramName, paramInfo]) => {
const inputType = valueTypeToInputType(
getNodeInputTypeFromSchemaInputType({ type: paramInfo.type })
);
const required = currentTool?.inputSchema.required?.includes(paramName);
return (
<LabelAndFormRender
label={
<HStack spacing={0} mr={2}>
<FormLabel required={required}>{paramName}</FormLabel>
<ValueTypeLabel
valueType={getNodeInputTypeFromSchemaInputType({
type: paramInfo.type,
arrayItems: paramInfo.items
})}
h={'auto'}
/>
</HStack>
}
required={required}
key={paramName}
inputType={inputType}
formKey={paramName}
variablesForm={form}
placeholder={paramInfo.description}
/>
);
}
)}
</Box>
</>
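Before invoking the MCP tool, the hunk above coerces the values collected by react-hook-form with `valueTypeFormat`, using the value type derived from each parameter's JSON-schema type, since text inputs yield strings even for numeric or boolean parameters. A simplified sketch of that coercion (the helper names below are hypothetical; the real `valueTypeFormat` in `@fastgpt/global` handles more cases such as arrays and objects):

```typescript
// Illustrative JSON-schema property shape (subset of an MCP tool's inputSchema).
type SchemaProp = { type: string };

// Coerce a single raw form value to its schema-declared type.
function coerceParam(value: any, schemaType: string): any {
  if (value === undefined) return value;
  switch (schemaType) {
    case 'number':
      return typeof value === 'number' ? value : Number(value);
    case 'boolean':
      return typeof value === 'boolean' ? value : value === 'true';
    case 'string':
      return typeof value === 'string' ? value : JSON.stringify(value);
    default:
      return value;
  }
}

// Walk inputSchema.properties and coerce each submitted value in place,
// mirroring the loop run before postRunMCPTool above.
function coerceParams(
  data: Record<string, any>,
  props: Record<string, SchemaProp>
): Record<string, any> {
  for (const [name, info] of Object.entries(props)) {
    if (data[name] !== undefined) {
      data[name] = coerceParam(data[name], info.type);
    }
  }
  return data;
}
```

Parameters absent from the submitted data are skipped rather than defaulted, so optional tool inputs stay `undefined`.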


@@ -6,7 +6,6 @@ import { useContextSelector } from 'use-context-selector';
import { AppContext } from '../context';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { type McpToolConfigType } from '@fastgpt/global/core/app/type';
import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type';
const MCPTools = () => {
@@ -15,13 +14,15 @@ const MCPTools = () => {
const toolSetNode = appDetail.modules.find(
(item) => item.flowNodeType === FlowNodeTypeEnum.toolSet
);
return toolSetNode?.toolConfig?.mcpToolSet ?? toolSetNode?.inputs[0].value;
}, [appDetail.modules]);
const [url, setUrl] = useState(toolSetData?.url || '');
const [toolList, setToolList] = useState<McpToolConfigType[]>(toolSetData?.toolList || []);
const [headerSecret, setHeaderSecret] = useState<StoreSecretValueType>(
toolSetData?.headerSecret ?? {}
);
const [currentTool, setCurrentTool] = useState<McpToolConfigType>(toolSetData.toolList[0]);
return (
<Flex h={'100%'} flexDirection={'column'} px={[3, 0]} pr={[3, 3]}>


@ -22,18 +22,21 @@ import { useChatStore } from '@/web/core/chat/context/useChatStore';
import MyBox from '@fastgpt/web/components/common/MyBox';
import ChatQuoteList from '@/pageComponents/chat/ChatQuoteList';
import VariablePopover from '@/components/core/chat/ChatContainer/ChatBox/components/VariablePopover';
import { useCopyData } from '@fastgpt/web/hooks/useCopyData';
type Props = {
isOpen: boolean;
nodes?: StoreNodeItemType[];
edges?: StoreEdgeItemType[];
onClose: () => void;
chatId: string;
};
const ChatTest = ({ isOpen, nodes = [], edges = [], onClose, chatId }: Props) => {
const { t } = useTranslation();
const { appDetail } = useContextSelector(AppContext, (v) => v);
const isPlugin = appDetail.type === AppTypeEnum.plugin;
const { copyData } = useCopyData();
const { restartChat, ChatContainer, loading } = useChatTest({
nodes,
@@ -118,7 +121,16 @@ const ChatTest = ({ isOpen, nodes = [], edges = [], onClose }: Props) => {
>
<Flex fontSize={'16px'} fontWeight={'bold'} alignItems={'center'} mr={3}>
<MyIcon name={'common/paused'} w={'14px'} mr={2.5} />
<MyTooltip label={chatId ? t('common:chat_chatId', { chatId }) : ''}>
<Box
cursor={'pointer'}
onClick={() => {
copyData(chatId);
}}
>
{t('common:core.chat.Run test')}
</Box>
</MyTooltip>
</Flex>
{!isVariableVisible && <VariablePopover showExternalVariables />}
<Box flex={1} />
@@ -199,7 +211,7 @@ const Render = (Props: Props) => {
showNodeStatus
>
<ChatRecordContextProvider params={chatRecordProviderParams}>
<ChatTest {...Props} chatId={chatId} />
</ChatRecordContextProvider>
</ChatItemContextProvider>
);
