diff --git a/.claude/design/lexical-consistency-analysis.md b/.claude/design/lexical-consistency-analysis.md deleted file mode 100644 index bce6966f8..000000000 --- a/.claude/design/lexical-consistency-analysis.md +++ /dev/null @@ -1,384 +0,0 @@ -# Lexical Editor 文本解析一致性分析报告 - -## 执行摘要 - -通过对 `textToEditorState` 和 `editorStateToText` 函数的全面分析,发现了 **3 个确认的不一致性问题**,会导致用户保存后重新加载时看到与编辑器显示不同的内容。 - -### 严重问题总览 - -| 问题 | 严重性 | 影响 | 位置 | -|------|--------|------|------| -| 列表项尾部空格丢失 | 🔴 高 | 用户有意添加的空格被删除 | utils.ts:255 | -| 有序列表序号重置 | 🔴 高 | 自定义序号变成连续序号 | utils.ts:257 | -| 列表项内换行不对称 | 🟡 中 | 编辑器支持但无法往返 | processListItem | - ---- - -## 问题 1: 列表项尾部空格丢失 🔴(已处理) - -### 问题描述 - -在 `processListItem` 函数中使用了 `trim()` 处理列表项文本: - -```typescript -// utils.ts:255 -const itemTextString = itemText.join('').trim(); -``` - -### 不一致性演示 - -**用户输入:** -``` -- hello world -``` -(注意 "world" 后面有 2 个空格) - -**EditorState:** -```json -{ - "type": "listitem", - "children": [ - { "type": "text", "text": "hello world " } - ] -} -``` - -**输出文本:** -``` -- hello world -``` -(尾部空格被 trim 删除) - -**重新加载:** -``` -- hello world -``` -(用户的空格永久丢失) - -### 影响分析 - -- **用户体验**: 用户有意添加的尾部空格(可能用于格式对齐)会丢失 -- **数据完整性**: 每次保存/加载循环都会丢失尾部空格 -- **严重程度**: 高 - 直接影响用户输入的完整性 - -### 解决方案 - -**方案 1: 移除 trim()** -```typescript -const itemTextString = itemText.join(''); // 不使用 trim -``` - -**方案 2: 只移除前导空格** -```typescript -const itemTextString = itemText.join('').trimStart(); // 只移除开头空格 -``` - -**推荐**: 方案 1,完全保留用户输入的空格 - ---- - -## 问题 2: 有序列表序号重置 🔴 - -### 问题描述 - -在输出有序列表时,使用 `index + 1` 而不是列表项自身的 `value`: - -```typescript -// utils.ts:257 -const prefix = listType === 'bullet' ? '- ' : `${index + 1}. `; -``` - -但在解析时,`numberValue` 被正确提取并存储到 `listItem.value`。 - -### 不一致性演示 - -**用户输入:** -``` -1. first -2. second -5. fifth -10. 
tenth -``` - -**解析 (textToEditorState):** -```javascript -items = [ - { numberValue: 1, text: "first" }, - { numberValue: 2, text: "second" }, - { numberValue: 5, text: "fifth" }, - { numberValue: 10, text: "tenth" } -] -``` - -**EditorState:** -```json -[ - { "value": 1, "text": "first" }, - { "value": 2, "text": "second" }, - { "value": 5, "text": "fifth" }, - { "value": 10, "text": "tenth" } -] -``` - -**输出文本 (editorStateToText):** -``` -1. first (index=0, 0+1=1) ✓ -2. second (index=1, 1+1=2) ✓ -3. fifth (index=2, 2+1=3) ✗ 应该是 5 -4. tenth (index=3, 3+1=4) ✗ 应该是 10 -``` - -**重新加载:** -用户的自定义序号 5 和 10 永久丢失,变成连续的 3 和 4。 - -### 影响分析 - -- **用户体验**: 用户有意设置的序号被强制改为连续序号 -- **数据完整性**: 有序列表的语义丢失(如章节编号 1.1, 1.2, 2.1) -- **严重程度**: 高 - 改变了用户的语义表达 - -### 解决方案 - -```typescript -// utils.ts:257 -const prefix = listType === 'bullet' - ? '- ' - : `${listItem.value || index + 1}. `; -``` - -使用 `listItem.value` 而不是 `index + 1`,保留原始序号。 - ---- - -## 问题 3: 列表项内换行不对称 🟡 - -### 问题描述 - -Lexical 编辑器允许在列表项内插入换行符 (`linebreak` 节点),但 `textToEditorState` 无法将包含换行的文本重新解析为列表项内换行。 - -### 不一致性演示 - -**用户在编辑器中操作:** -``` -1. 输入: "- item1" -2. 按 Shift+Enter (插入软换行) -3. 继续输入: "continued content" -``` - -**EditorState:** -```json -{ - "type": "listitem", - "children": [ - { "type": "text", "text": "item1" }, - { "type": "linebreak" }, - { "type": "text", "text": "continued content" } - ] -} -``` - -**输出文本 (editorStateToText):** -``` -- item1 -continued content -``` - -**重新加载 (textToEditorState):** -``` -行1: "- item1" → 列表项 -行2: "continued content" → 段落 (不再是列表项的一部分!) 
-``` - -**最终结构变化:** -``` -原来: 1个列表项(包含换行) -现在: 1个列表项 + 1个段落 -``` - -### 影响分析 - -- **结构完整性**: 列表项的内部结构在保存/加载后改变 -- **语义丢失**: 原本属于列表项的内容变成了独立段落 -- **严重程度**: 中 - 影响文档结构,但可能符合 Markdown 语义 - -### 解决方案 - -**方案 1: 在输出时将换行转为空格** -```typescript -if (child.type === 'linebreak') { - itemText.push(' '); // 使用空格而不是 \n -} -``` - -**方案 2: 在编辑器中禁止列表项内换行** -- 配置 Lexical 不允许在列表项内插入 linebreak -- 用户只能通过创建新列表项来换行 - -**方案 3: 支持 Markdown 风格的列表项多行** -```typescript -// 识别缩进的行为列表项的继续内容 -parseTextLine: -if (line.startsWith(' ') && prevLine.wasListItem) { - // 作为列表项的继续内容 -} -``` - -**推荐**: 方案 1 (最简单) 或方案 2 (最明确) - ---- - -## 其他潜在问题 - -### 问题 4: 变量节点保存后变成普通文本 🟡 - -**现象**: -``` -EditorState: { type: 'variableLabel', variableKey: '{{var1}}' } - ↓ -输出文本: "{{var1}}" - ↓ -重新加载: { type: 'text', text: "{{var1}}" } -``` - -**影响**: 变量节点的功能性丢失 - -**分析**: 这可能是设计决策 - 变量只在编辑会话中有效,保存到文本后变成普通占位符。如果需要保持变量功能,应该使用其他存储格式(如 JSON)而不是纯文本。 - -### 问题 5: 非连续缩进级别可能导致结构错误 🟡 - -**现象**: -``` -输入: -- level 0 - - level 2 (跳过 level 1) - - level 1 -``` - -**问题**: `buildListStructure` 可能无法正确处理非连续的缩进级别 - -**影响**: 列表嵌套结构可能不符合预期 - -**建议**: 规范化缩进级别,或在文档中说明只支持连续缩进 - ---- - -## 正常运作的部分 ✅ - -经过分析,以下功能**正常运作**,不存在一致性问题: - -1. **空行处理** - 空行被正确保留和还原 -2. **段落前导空格** - 修复后完全保留 -3. **列表和段落边界** - 正确识别和分离 -4. **特殊字符在段落中** - 只有行首的 `- ` 和 `\d+. ` 被识别为列表 -5. **混合列表类型** - bullet 和 number 列表正确分离 -6. **列表缩进** - 使用 TabStr 统一为 2 个空格 - ---- - -## 测试用例建议 - -### 测试用例 1: 列表项尾部空格 -```typescript -const input = "- hello world "; // 2个尾部空格 -const state = textToEditorState(input, true); -const editor = createEditorWithState(state); -const output = editorStateToText(editor); -expect(output).toBe("- hello world "); // 应保留空格 -``` - -### 测试用例 2: 自定义列表序号 -```typescript -const input = "1. first\n5. fifth\n10. tenth"; -const state = textToEditorState(input, true); -const editor = createEditorWithState(state); -const output = editorStateToText(editor); -expect(output).toBe("1. first\n5. fifth\n10. 
tenth"); // 应保留序号 -``` - -### 测试用例 3: 列表项换行 -```typescript -// 在编辑器中创建列表项并插入 linebreak -const editor = createEditor(); -// ... 创建列表项 -// ... 插入 linebreak -const output = editorStateToText(editor); -const reloadedState = textToEditorState(output, true); -const reloadedEditor = createEditorWithState(reloadedState); -// 验证结构是否一致 -expect(getStructure(editor)).toEqual(getStructure(reloadedEditor)); -``` - -### 测试用例 4: 往返对称性 -```typescript -const testCases = [ - "simple text", - " indented text", - "- bullet list\n - nested", - "1. first\n2. second\n5. fifth", - "text\n\n\nwith\n\nempty\n\nlines", - "- item ", // 尾部空格 -]; - -testCases.forEach(input => { - const state = textToEditorState(input, true); - const editor = createEditorWithState(state); - const output = editorStateToText(editor); - expect(output).toBe(input); // 应完全一致 -}); -``` - ---- - -## 修复优先级建议 - -### P0 - 立即修复 -1. ✅ **列表项尾部空格丢失** - 影响数据完整性 -2. ✅ **有序列表序号重置** - 影响语义表达 - -### P1 - 高优先级 -3. ⚠️ **列表项内换行不对称** - 影响结构一致性 - -### P2 - 按需修复 -4. 📝 **变量节点** - 根据产品需求决定 -5. 📝 **非连续缩进** - 文档说明或规范化处理 - ---- - -## 代码修改建议 - -### 修改 1: 保留列表项空格 -```diff -// utils.ts:255 -- const itemTextString = itemText.join('').trim(); -+ const itemTextString = itemText.join(''); -``` - -### 修改 2: 使用原始列表序号 -```diff -// utils.ts:257 -- const prefix = listType === 'bullet' ? '- ' : `${index + 1}. `; -+ const prefix = listType === 'bullet' ? '- ' : `${listItem.value || index + 1}. `; -``` - -### 修改 3: 处理列表项换行(方案1) -```diff -// utils.ts:242 - if (child.type === 'linebreak') { -- itemText.push('\n'); -+ itemText.push(' '); // 转为空格而不是换行 - } -``` - ---- - -## 总结 - -通过全面分析,确认了 **3 个会导致编辑器显示与解析文本不一致的问题**: - -1. 🔴 列表项尾部空格丢失 → 修复: 移除 trim() -2. 🔴 有序列表序号重置 → 修复: 使用 listItem.value -3. 
🟡 列表项内换行不对称 → 修复: 转换为空格或禁止 - -其他方面(空行、前导空格、边界处理)都运作正常。 - -建议优先修复前两个 P0 问题,确保用户数据的完整性和语义准确性。 diff --git a/.claude/design/projects_app_performance_stability_analysis.md b/.claude/design/projects_app_performance_stability_analysis.md index 6ff0a0db9..574fad252 100644 --- a/.claude/design/projects_app_performance_stability_analysis.md +++ b/.claude/design/projects_app_performance_stability_analysis.md @@ -608,136 +608,6 @@ function App({ Component, pageProps }: AppPropsWithLayout) { --- -### 🔴 H9. instrumentation.ts 初始化失败未处理,导致静默失败 - -**位置**: `projects/app/src/instrumentation.ts:81-84` - -**问题描述**: -```typescript -} catch (error) { - console.log('Init system error', error); - exit(1); -} -``` -- 初始化失败直接退出进程 -- 部分初始化错误被 `.catch()` 吞没 -- 缺少初始化状态检查 - -**风险等级**: 🔴 高危 - -**影响**: -- 应用启动失败但无明确错误信息 -- 部分服务未初始化导致运行时错误 -- 调试困难 - -**建议方案**: -```typescript -// 1. 详细的初始化错误处理 -export async function register() { - const initSteps: Array<{ - name: string; - fn: () => Promise; - required: boolean; - }> = []; - - try { - if (process.env.NEXT_RUNTIME !== 'nodejs') { - return; - } - - const results = { - success: [] as string[], - failed: [] as Array<{ name: string; error: any }> - }; - - // 阶段 1: 基础连接 (必需) - try { - console.log('Connecting to MongoDB...'); - await connectMongo({ db: connectionMongo, url: MONGO_URL }); - results.success.push('MongoDB Main'); - } catch (error) { - console.error('Fatal: MongoDB connection failed', error); - throw error; - } - - try { - await connectMongo({ db: connectionLogMongo, url: MONGO_LOG_URL }); - results.success.push('MongoDB Log'); - } catch (error) { - console.warn('Non-fatal: MongoDB Log connection failed', error); - results.failed.push({ name: 'MongoDB Log', error }); - } - - // 阶段 2: 系统初始化 (必需) - try { - console.log('Initializing system config...'); - await Promise.all([ - getInitConfig(), - initVectorStore(), - initRootUser(), - loadSystemModels() - ]); - results.success.push('System Config'); - } catch (error) { - console.error('Fatal: 
System initialization failed', error); - throw error; - } - - // 阶段 3: 可选服务 - await Promise.allSettled([ - preLoadWorker().catch(e => { - console.warn('Worker preload failed (non-fatal)', e); - results.failed.push({ name: 'Worker Preload', error: e }); - }), - getSystemTools().catch(e => { - console.warn('System tools init failed (non-fatal)', e); - results.failed.push({ name: 'System Tools', error: e }); - }), - initSystemPluginGroups().catch(e => { - console.warn('Plugin groups init failed (non-fatal)', e); - results.failed.push({ name: 'Plugin Groups', error: e }); - }) - ]); - - // 阶段 4: 后台任务 - startCron(); - startTrainingQueue(true); - trackTimerProcess(); - - console.log('Init system success', { - success: results.success, - failed: results.failed.map(f => f.name) - }); - - } catch (error) { - console.error('Init system critical error', error); - console.error('Stack:', error.stack); - - // 发送告警通知 - if (process.env.ERROR_WEBHOOK_URL) { - try { - await fetch(process.env.ERROR_WEBHOOK_URL, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ - type: 'INIT_ERROR', - error: error.message, - stack: error.stack, - timestamp: new Date().toISOString() - }) - }); - } catch (webhookError) { - console.error('Failed to send error webhook', webhookError); - } - } - - exit(1); - } -} -``` - ---- - ## 二、中危问题 (Medium Priority) ### 🟡 M1. Next.js 未启用 SWC 编译优化完整特性 diff --git a/.claude/design/projects_app_performance_stability_deep_analysis.md b/.claude/design/projects_app_performance_stability_deep_analysis.md index 27f6cb80f..8379bd548 100644 --- a/.claude/design/projects_app_performance_stability_deep_analysis.md +++ b/.claude/design/projects_app_performance_stability_deep_analysis.md @@ -934,28 +934,6 @@ export const authCertWithCSRF = async (props: AuthModeType) => { ## 新增中危问题 (Additional Medium Priority) -### 🟡 M20. 
向量查询缓存策略过于激进 - -**位置**: `packages/service/common/vectorDB/controller.ts:29-35` - -**问题描述**: -```typescript -const onDelCache = throttle((teamId: string) => delRedisCache(getChcheKey(teamId)), 30000, { - leading: true, - trailing: true -}); -``` -- 删除操作使用 throttle,30 秒内只执行一次 -- 可能导致缓存计数不准确 -- 未考虑高频删除场景 - -**建议**: -- 删除操作直接更新缓存 -- 定期全量同步缓存和数据库 -- 添加缓存一致性校验 - ---- - ### 🟡 M21. 训练队列缺少优先级机制 **位置**: `packages/service/common/bullmq/index.ts:20-26` diff --git a/document/content/docs/upgrading/4-14/4144.mdx b/document/content/docs/upgrading/4-14/4144.mdx index e4c870e43..7bc11e15b 100644 --- a/document/content/docs/upgrading/4-14/4144.mdx +++ b/document/content/docs/upgrading/4-14/4144.mdx @@ -4,18 +4,47 @@ description: 'FastGPT V4.14.4 更新说明' --- +## 更新指南 + +### 1. 更新镜像: + + + +### 2. 执行升级脚本 + +从任意终端,发起 1 个 HTTP 请求。其中 `{{rootkey}}` 替换成环境变量里的 `rootkey`;`{{host}}` 替换成**FastGPT 域名**。 + +```bash +curl --location --request POST 'https://{{host}}/api/admin/initv4144' \ +--header 'rootkey: {{rootkey}}' \ +--header 'Content-Type: application/json' +``` + +将 4.14.3 中,遗留的 Dataset/local 接口上传的文件,也迁移到 S3 中。 + ## 🚀 新增内容 1. 工具调用支持配置流输出 +2. AI 积分告警通知。 +3. 对话日志支持展示 IP 地址归属地。 +4. 通过 API 上传本地文件至知识库,保存至 S3。同时将旧版 Gridfs 代码全部移除。 +5. 新版订阅套餐逻辑。 ## ⚙️ 优化 1. 增加 S3 上传文件超时时长为 5 分钟。 +2. 问题优化采用 JinaAI 的边际收益公式,获取最大边际收益的检索词。 +3. 用户通知,支持中英文,以及优化模板。 +4. 删除知识库采用队列异步删除模式。 ## 🐛 修复 1. 循环节点数组,取消过滤空内容。 2. 工作流工具,未传递自定义 DataId,导致测试运行时,查看知识库提示无权限。 +3. 对话 Agent 工具配置中,非必填的布尔和数字类型无法直接确认。 +4. 工作台卡片在名字过长时错位。 +5. 分享链接中url query 中携带全局变量时,前端 UI 不会加载该值。 +6. 
Windows 下判断 CSV 文件异常。 ## 插件 diff --git a/document/data/doc-last-modified.json index 1bcde0095..81bb72132 100644 --- a/document/data/doc-last-modified.json +++ b/document/data/doc-last-modified.json @@ -118,7 +118,7 @@ "document/content/docs/upgrading/4-14/4141.mdx": "2025-11-19T10:15:27+08:00", "document/content/docs/upgrading/4-14/4142.mdx": "2025-11-18T19:27:14+08:00", "document/content/docs/upgrading/4-14/4143.mdx": "2025-11-26T20:52:05+08:00", - "document/content/docs/upgrading/4-14/4144.mdx": "2025-12-01T21:46:30+08:00", + "document/content/docs/upgrading/4-14/4144.mdx": "2025-12-07T14:24:15+08:00", "document/content/docs/upgrading/4-8/40.mdx": "2025-08-02T19:38:37+08:00", "document/content/docs/upgrading/4-8/41.mdx": "2025-08-02T19:38:37+08:00", "document/content/docs/upgrading/4-8/42.mdx": "2025-08-02T19:38:37+08:00", diff --git a/packages/global/common/error/code/team.ts b/packages/global/common/error/code/team.ts index 0d42ee79a..4f8a891f9 100644 --- a/packages/global/common/error/code/team.ts +++ b/packages/global/common/error/code/team.ts @@ -11,6 +11,7 @@ export enum TeamErrEnum { datasetAmountNotEnough = 'datasetAmountNotEnough', appAmountNotEnough = 'appAmountNotEnough', pluginAmountNotEnough = 'pluginAmountNotEnough', + appFolderAmountNotEnough = 'appFolderAmountNotEnough', websiteSyncNotEnough = 'websiteSyncNotEnough', reRankNotEnough = 'reRankNotEnough', groupNameEmpty = 'groupNameEmpty', @@ -65,6 +66,10 @@ const teamErr = [ statusText: TeamErrEnum.pluginAmountNotEnough, message: i18nT('common:code_error.team_error.plugin_amount_not_enough') }, + { + statusText: TeamErrEnum.appFolderAmountNotEnough, + message: i18nT('common:code_error.team_error.app_folder_amount_not_enough') + }, { statusText: TeamErrEnum.websiteSyncNotEnough, message: i18nT('common:code_error.team_error.website_sync_not_enough') diff --git a/packages/global/common/file/constants.ts b/packages/global/common/file/constants.ts index 
ac48e3a3e..8e2149d76 100644 --- a/packages/global/common/file/constants.ts +++ b/packages/global/common/file/constants.ts @@ -1,20 +1,8 @@ -import { i18nT } from '../../../web/i18n/utils'; - /* mongo fs bucket */ export enum BucketNameEnum { dataset = 'dataset', chat = 'chat' } -export const bucketNameMap = { - [BucketNameEnum.dataset]: { - label: i18nT('file:bucket_file'), - previewExpireMinutes: 30 // 30 minutes - }, - [BucketNameEnum.chat]: { - label: i18nT('file:bucket_chat'), - previewExpireMinutes: 7 * 24 * 60 // 7 days - } -}; export const EndpointUrl = `${process.env.FILE_DOMAIN || process.env.FE_DOMAIN || ''}${process.env.NEXT_PUBLIC_BASE_URL || ''}`; export const ReadFileBaseUrl = `${EndpointUrl}/api/common/file/read`; diff --git a/packages/global/common/middle/tracks/constants.ts b/packages/global/common/middle/tracks/constants.ts index c7e6e76a3..8aec419f0 100644 --- a/packages/global/common/middle/tracks/constants.ts +++ b/packages/global/common/middle/tracks/constants.ts @@ -9,5 +9,6 @@ export enum TrackEnum { datasetSearch = 'datasetSearch', readSystemAnnouncement = 'readSystemAnnouncement', clickOperationalAd = 'clickOperationalAd', - closeOperationalAd = 'closeOperationalAd' + closeOperationalAd = 'closeOperationalAd', + teamChatQPM = 'teamChatQPM' } diff --git a/packages/global/common/string/tools.ts b/packages/global/common/string/tools.ts index 25f00da01..fd306e7ff 100644 --- a/packages/global/common/string/tools.ts +++ b/packages/global/common/string/tools.ts @@ -184,7 +184,7 @@ export const sliceStrStartEnd = (str: string, start: number, end: number) => { return `${startContent}${overSize ? `\n\n...[hide ${str.length - start - end} chars]...\n\n` : ''}${endContent}`; }; -/* +/* Parse file extension from url Test: 1. 
https://xxx.com/file.pdf?token=123 @@ -201,3 +201,32 @@ export const parseFileExtensionFromUrl = (url = '') => { const extension = fileName.split('.').pop(); return (extension || '').toLowerCase(); }; + +export const formatNumberWithUnit = (num: number, locale: string = 'zh-CN'): string => { + if (num === 0) return '0'; + if (!num || isNaN(num)) return '-'; + const absNum = Math.abs(num); + const isNegative = num < 0; + const prefix = isNegative ? '-' : ''; + + if (locale === 'zh-CN') { + if (absNum >= 10000) { + const value = absNum / 10000; + const formatted = Number(value.toFixed(2)).toString(); + return `${prefix}${formatted}万`; + } + return num.toLocaleString(locale); + } else { + if (absNum >= 1000000) { + const value = absNum / 1000000; + const formatted = Number(value.toFixed(2)).toString(); + return `${prefix}${formatted}M`; + } + if (absNum >= 1000) { + const value = absNum / 1000; + const formatted = Number(value.toFixed(2)).toString(); + return `${prefix}${formatted}K`; + } + return num.toLocaleString(locale); + } +}; diff --git a/packages/global/common/system/types/index.d.ts b/packages/global/common/system/types/index.d.ts index b90da87e1..589392ad1 100644 --- a/packages/global/common/system/types/index.d.ts +++ b/packages/global/common/system/types/index.d.ts @@ -65,6 +65,7 @@ export type FastGPTFeConfigsType = { show_compliance_copywriting?: boolean; show_aiproxy?: boolean; show_coupon?: boolean; + show_discount_coupon?: boolean; concatMd?: string; show_dataset_feishu?: boolean; diff --git a/packages/global/core/app/logs/constants.ts b/packages/global/core/app/logs/constants.ts index 8bce49ac9..176e73c15 100644 --- a/packages/global/core/app/logs/constants.ts +++ b/packages/global/core/app/logs/constants.ts @@ -13,7 +13,8 @@ export enum AppLogKeysEnum { ANNOTATED_COUNT = 'annotatedCount', POINTS = 'points', RESPONSE_TIME = 'responseTime', - ERROR_COUNT = 'errorCount' + ERROR_COUNT = 'errorCount', + REGION = 'region' } export const AppLogKeysEnumMap 
= { @@ -29,7 +30,8 @@ export const AppLogKeysEnumMap = { [AppLogKeysEnum.ANNOTATED_COUNT]: i18nT('app:logs_keys_annotatedCount'), [AppLogKeysEnum.POINTS]: i18nT('app:logs_keys_points'), [AppLogKeysEnum.RESPONSE_TIME]: i18nT('app:logs_keys_responseTime'), - [AppLogKeysEnum.ERROR_COUNT]: i18nT('app:logs_keys_errorCount') + [AppLogKeysEnum.ERROR_COUNT]: i18nT('app:logs_keys_errorCount'), + [AppLogKeysEnum.REGION]: i18nT('app:logs_keys_region') }; export const DefaultAppLogKeys = [ @@ -45,7 +47,8 @@ export const DefaultAppLogKeys = [ { key: AppLogKeysEnum.ANNOTATED_COUNT, enable: false }, { key: AppLogKeysEnum.POINTS, enable: false }, { key: AppLogKeysEnum.RESPONSE_TIME, enable: false }, - { key: AppLogKeysEnum.ERROR_COUNT, enable: false } + { key: AppLogKeysEnum.ERROR_COUNT, enable: false }, + { key: AppLogKeysEnum.REGION, enable: true } ]; export enum AppLogTimespanEnum { diff --git a/packages/global/core/chat/utils.ts b/packages/global/core/chat/utils.ts index 935b3de52..17d40ffdd 100644 --- a/packages/global/core/chat/utils.ts +++ b/packages/global/core/chat/utils.ts @@ -59,7 +59,6 @@ export const getHistoryPreview = ( return `![Input an image](${item.file.url.slice(0, 100)}...)`; return ''; }) - .filter(Boolean) .join('\n') || '' ); } else if (item.obj === ChatRoleEnum.AI) { diff --git a/packages/global/core/dataset/type.d.ts b/packages/global/core/dataset/type.d.ts index e3fee5781..de9b0adbd 100644 --- a/packages/global/core/dataset/type.d.ts +++ b/packages/global/core/dataset/type.d.ts @@ -83,6 +83,9 @@ export type DatasetSchemaType = { apiDatasetServer?: ApiDatasetServerType; + // 软删除字段 + deleteTime?: Date | null; + // abandon autoSync?: boolean; externalReadUrl?: string; diff --git a/packages/global/openapi/index.ts b/packages/global/openapi/index.ts index 323e3148f..770332266 100644 --- a/packages/global/openapi/index.ts +++ b/packages/global/openapi/index.ts @@ -3,6 +3,7 @@ import { ChatPath } from './core/chat'; import { ApiKeyPath } from 
'./support/openapi'; import { TagsMap } from './tag'; import { PluginPath } from './core/plugin'; +import { WalletPath } from './support/wallet'; export const openAPIDocument = createDocument({ openapi: '3.1.0', @@ -14,7 +15,8 @@ paths: { ...ChatPath, ...ApiKeyPath, - ...PluginPath + ...PluginPath, + ...WalletPath }, servers: [{ url: '/api' }], 'x-tagGroups': [ @@ -33,6 +35,10 @@ { name: 'ApiKey', tags: [TagsMap.apiKey] + }, + { + name: '支付', + tags: [TagsMap.walletBill, TagsMap.walletDiscountCoupon] } ] }); diff --git a/packages/global/openapi/support/wallet/bill/api.ts b/packages/global/openapi/support/wallet/bill/api.ts new file mode 100644 index 000000000..59e9e577f --- /dev/null +++ b/packages/global/openapi/support/wallet/bill/api.ts @@ -0,0 +1,96 @@ +import { z } from 'zod'; +import { ObjectIdSchema } from '../../../../common/type/mongo'; +import { + BillTypeEnum, + BillStatusEnum, + BillPayWayEnum +} from '../../../../support/wallet/bill/constants'; +import { StandardSubLevelEnum, SubModeEnum } from '../../../../support/wallet/sub/constants'; +import { PaginationSchema } from '../../../api'; +import { BillSchema } from '../../../../support/wallet/bill/type'; + +// Bill list +export const BillListQuerySchema = PaginationSchema.safeExtend({ + type: z.enum(BillTypeEnum).optional().meta({ description: '订单类型筛选' }) +}); +export type GetBillListQueryType = z.infer<typeof BillListQuerySchema>; +export const BillListResponseSchema = z.object({ + total: z.number(), + list: z.array(BillSchema) +}); +export type GetBillListResponseType = z.infer<typeof BillListResponseSchema>; + +// Create +export const CreateStandPlanBillSchema = z + .object({ + type: z.literal(BillTypeEnum.standSubPlan).meta({ description: '订单类型:标准订阅套餐' }), + level: z.enum(StandardSubLevelEnum).meta({ description: '标准订阅等级' }), + subMode: z.enum(SubModeEnum).meta({ description: '订阅周期' }), + discountCouponId: z.string().optional().meta({ description: '优惠券 ID' }) + 
}) + .meta({ description: '标准订阅套餐订单创建参数' }); +export const CreateExtractPointsBillSchema = z + .object({ + type: z.literal(BillTypeEnum.extraPoints).meta({ description: '订单类型:额外积分' }), + extraPoints: z.int().min(0).meta({ description: '额外积分数量' }), + month: z.int().min(1).max(12).meta({ description: '订阅月数' }), + discountCouponId: z.string().optional().meta({ description: '优惠券 ID(未使用)' }) + }) + .meta({ description: '额外积分订单创建参数' }); +export const CreateExtractDatasetBillSchema = z + .object({ + type: z.literal(BillTypeEnum.extraDatasetSub).meta({ description: '订单类型:额外数据集存储' }), + extraDatasetSize: z.int().min(0).meta({ description: '额外数据集大小' }), + month: z.int().min(1).max(12).meta({ description: '订阅月数' }), + discountCouponId: z.string().optional().meta({ description: '优惠券 ID(未使用)' }) + }) + .meta({ description: '额外数据集存储订单创建参数' }); +export const CreateBillPropsSchema = z.discriminatedUnion('type', [ + CreateStandPlanBillSchema, + CreateExtractPointsBillSchema, + CreateExtractDatasetBillSchema +]); +export type CreateBillPropsType = z.infer<typeof CreateBillPropsSchema>; + +export const UpdatePaymentPropsSchema = z.object({ + billId: ObjectIdSchema, + payWay: z.enum(BillPayWayEnum) +}); +export type UpdatePaymentPropsType = z.infer<typeof UpdatePaymentPropsSchema>; + +export const UpdateBillResponseSchema = z + .object({ + qrCode: z.string().optional().meta({ description: '支付二维码 URL' }), + iframeCode: z.string().optional().meta({ description: '支付 iframe 代码' }), + markdown: z.string().optional().meta({ description: 'Markdown 格式的支付信息' }) + }) + .refine((data) => data.qrCode || data.iframeCode || data.markdown, { + message: 'At least one of qrCode, iframeCode, or markdown must be provided' + }); +export type UpdateBillResponseType = z.infer<typeof UpdateBillResponseSchema>; + +export const CreateBillResponseSchema = UpdateBillResponseSchema.safeExtend({ + billId: z.string().meta({ description: '订单 ID' }), + readPrice: z.number().min(0).meta({ description: '实际支付价格' }), + payment: z.enum(BillPayWayEnum).meta({ description: '支付方式' }) +}); +export type 
CreateBillResponseType = z.infer<typeof CreateBillResponseSchema>; + +// Check pay result +export const CheckPayResultResponseSchema = z.object({ + status: z.enum(BillStatusEnum), + description: z.string().optional() +}); +export type CheckPayResultResponseType = z.infer<typeof CheckPayResultResponseSchema>; + +// Bill detail +export const BillDetailResponseSchema = BillSchema.safeExtend({ + couponName: z.string().optional() +}); +export type BillDetailResponseType = z.infer<typeof BillDetailResponseSchema>; + +// Cancel bill +export const CancelBillPropsSchema = z.object({ + billId: ObjectIdSchema.meta({ description: '订单 ID' }) +}); +export type CancelBillPropsType = z.infer<typeof CancelBillPropsSchema>; diff --git a/packages/global/openapi/support/wallet/bill/index.ts b/packages/global/openapi/support/wallet/bill/index.ts new file mode 100644 index 000000000..9356dc811 --- /dev/null +++ b/packages/global/openapi/support/wallet/bill/index.ts @@ -0,0 +1,164 @@ +import { z } from 'zod'; +import type { OpenAPIPath } from '../../../type'; +import { + CreateBillPropsSchema, + CreateBillResponseSchema, + UpdatePaymentPropsSchema, + UpdateBillResponseSchema, + CheckPayResultResponseSchema, + BillDetailResponseSchema, + BillListQuerySchema, + CancelBillPropsSchema +} from './api'; +import { TagsMap } from '../../../tag'; +import { ObjectIdSchema } from '../../../../common/type/mongo'; + +export const BillPath: OpenAPIPath = { + '/support/wallet/bill/create': { + post: { + summary: '创建订单', + description: '创建订单,支持标准订阅套餐、额外积分、额外数据集存储三种类型', + tags: [TagsMap.walletBill], + requestBody: { + content: { + 'application/json': { + schema: CreateBillPropsSchema + } + } + }, + responses: { + 200: { + description: '成功创建订单', + content: { + 'application/json': { + schema: CreateBillResponseSchema + } + } + } + } + } + }, + '/support/wallet/bill/pay/updatePayment': { + post: { + summary: '更新支付方式', + description: '为未支付的订单更新支付方式,返回新的支付二维码或链接', + tags: [TagsMap.walletBill], + requestBody: { + content: { + 'application/json': { + schema: UpdatePaymentPropsSchema + } + } + }, + responses: { + 200: { + description: 
'成功更新支付方式', + content: { + 'application/json': { + schema: UpdateBillResponseSchema + } + } + } + } + } + }, + '/support/wallet/bill/pay/checkPayResult': { + get: { + summary: '检查支付结果', + description: '检查订单的支付状态,用于轮询支付结果', + tags: [TagsMap.walletBill], + requestParams: { + query: z.object({ + payId: ObjectIdSchema.meta({ + description: '订单 ID' + }) + }) + }, + responses: { + 200: { + description: '成功获取支付结果', + content: { + 'application/json': { + schema: CheckPayResultResponseSchema + } + } + } + } + } + }, + '/support/wallet/bill/detail': { + get: { + summary: '获取订单详情', + description: '根据订单 ID 获取订单详细信息,包括优惠券名称等', + tags: [TagsMap.walletBill], + requestParams: { + query: z.object({ + billId: ObjectIdSchema.meta({ + description: '订单 ID' + }) + }) + }, + responses: { + 200: { + description: '成功获取订单详情', + content: { + 'application/json': { + schema: BillDetailResponseSchema.nullable() + } + } + } + } + } + }, + '/support/wallet/bill/list': { + post: { + summary: '获取订单列表', + description: '分页获取团队的订单列表,支持按类型筛选', + tags: [TagsMap.walletBill], + requestBody: { + content: { + 'application/json': { + schema: BillListQuerySchema + } + } + }, + responses: { + 200: { + description: '成功获取订单列表', + content: { + 'application/json': { + schema: z.object({ + list: z.array(BillDetailResponseSchema), + total: z.number().meta({ description: '总数' }) + }) + } + } + } + } + } + }, + '/support/wallet/bill/cancel': { + post: { + summary: '取消订单', + description: '取消未支付的订单,如果使用了优惠券会自动返还', + tags: [TagsMap.walletBill], + requestBody: { + content: { + 'application/json': { + schema: CancelBillPropsSchema + } + } + }, + responses: { + 200: { + description: '成功取消订单', + content: { + 'application/json': { + schema: z.null() + } + } + } + } + } + } +}; diff --git a/packages/global/openapi/support/wallet/discountCoupon/api.ts b/packages/global/openapi/support/wallet/discountCoupon/api.ts new file mode 100644 index 000000000..b62c1fdc5 --- /dev/null +++ 
b/packages/global/openapi/support/wallet/discountCoupon/api.ts @@ -0,0 +1,33 @@ +import { z } from 'zod'; +import { ObjectIdSchema } from '../../../../common/type/mongo'; +import { + DiscountCouponStatusEnum, + DiscountCouponTypeEnum +} from '../../../../support/wallet/sub/discountCoupon/constants'; + +export const DiscountCouponSchema = z.object({ + _id: ObjectIdSchema.meta({ description: '优惠券 ID' }), + teamId: ObjectIdSchema.meta({ description: '团队 ID' }), + type: z.enum(Object.values(DiscountCouponTypeEnum)).meta({ description: '优惠券类型' }), + startTime: z.coerce.date().optional().meta({ description: '生效时间' }), + expiredTime: z.coerce.date().meta({ description: '过期时间' }), + usedAt: z.coerce.date().optional().meta({ description: '使用时间' }), + createTime: z.coerce.date().meta({ description: '创建时间' }) +}); + +export type DiscountCouponSchemaType = z.infer<typeof DiscountCouponSchema>; + +export const DiscountCouponItemSchema = DiscountCouponSchema.extend({ + name: z.string().meta({ description: '优惠券名称' }), + description: z.string().meta({ description: '优惠券描述' }), + discount: z.number().min(0).max(1).meta({ description: '折扣率' }), + iconZh: z.string().meta({ description: '中文图标路径' }), + iconEn: z.string().meta({ description: '英文图标路径' }), + status: z.enum(DiscountCouponStatusEnum).meta({ description: '优惠券状态' }), + billId: ObjectIdSchema.optional().meta({ + description: '关联的订单 ID, 被使用后该值存在' + }) +}); +export const DiscountCouponListResponseSchema = z.array(DiscountCouponItemSchema); + +export type DiscountCouponListResponseType = z.infer<typeof DiscountCouponListResponseSchema>; diff --git a/packages/global/openapi/support/wallet/discountCoupon/index.ts b/packages/global/openapi/support/wallet/discountCoupon/index.ts new file mode 100644 index 000000000..2ffcf36b6 --- /dev/null +++ b/packages/global/openapi/support/wallet/discountCoupon/index.ts @@ -0,0 +1,23 @@ +import type { OpenAPIPath } from '../../../type'; +import { DiscountCouponListResponseSchema } from './api'; +import { TagsMap } from '../../../tag'; + +export const 
DiscountCouponPath: OpenAPIPath = { + '/support/wallet/discountCoupon/list': { + get: { + summary: '获取优惠券列表', + description: '获取团队的优惠券列表,包括优惠券状态、使用情况等信息', + tags: [TagsMap.walletDiscountCoupon], + responses: { + 200: { + description: '成功获取优惠券列表', + content: { + 'application/json': { + schema: DiscountCouponListResponseSchema + } + } + } + } + } + } +}; diff --git a/packages/global/openapi/support/wallet/index.ts b/packages/global/openapi/support/wallet/index.ts new file mode 100644 index 000000000..75bc3944b --- /dev/null +++ b/packages/global/openapi/support/wallet/index.ts @@ -0,0 +1,8 @@ +import type { OpenAPIPath } from '../../type'; +import { BillPath } from './bill'; +import { DiscountCouponPath } from './discountCoupon'; + +export const WalletPath: OpenAPIPath = { + ...BillPath, + ...DiscountCouponPath +}; diff --git a/packages/global/openapi/tag.ts b/packages/global/openapi/tag.ts index ddc4d8ce6..853894194 100644 --- a/packages/global/openapi/tag.ts +++ b/packages/global/openapi/tag.ts @@ -6,5 +6,7 @@ export const TagsMap = { pluginAdmin: '管理员插件管理', pluginToolAdmin: '管理员系统工具管理', pluginTeam: '团队插件管理', - apiKey: 'APIKey' + apiKey: 'APIKey', + walletBill: '订单', + walletDiscountCoupon: '优惠券' }; diff --git a/packages/global/support/user/api.d.ts b/packages/global/support/user/api.d.ts index 858df4f17..ce7c91916 100644 --- a/packages/global/support/user/api.d.ts +++ b/packages/global/support/user/api.d.ts @@ -1,21 +1,24 @@ -import type { MemberGroupSchemaType } from 'support/permission/memberGroup/type'; -import { MemberGroupListItemType } from 'support/permission/memberGroup/type'; +import type { MemberGroupSchemaType } from '../permission/memberGroup/type'; +import { MemberGroupListItemType } from '../permission/memberGroup/type'; import type { OAuthEnum } from './constant'; -import type { TrackRegisterParams } from './login/api'; import { TeamMemberStatusEnum } from './team/constant'; import type { OrgType } from './team/org/type'; import type { 
TeamMemberItemType } from './team/type'; +import type { LangEnum } from '../../common/i18n/type'; +import type { TrackRegisterParams } from '../marketing/type'; export type PostLoginProps = { username: string; password: string; code: string; + language?: `${LangEnum}`; }; export type OauthLoginProps = { type: `${OAuthEnum}`; callbackUrl: string; props: Record<string, any>; + language?: `${LangEnum}`; } & TrackRegisterParams; export type WxLoginProps = { diff --git a/packages/global/support/user/inform/constants.ts b/packages/global/support/user/inform/constants.ts index dfd19e6a4..206628a92 100644 --- a/packages/global/support/user/inform/constants.ts +++ b/packages/global/support/user/inform/constants.ts @@ -17,13 +17,22 @@ export const InformLevelMap = { }; export enum SendInformTemplateCodeEnum { - EXPIRE_SOON = 'EXPIRE_SOON', - EXPIRED = 'EXPIRED', - FREE_CLEAN = 'FREE_CLEAN', - REGISTER = 'REGISTER', - RESET_PASSWORD = 'RESET_PASSWORD', - BIND_NOTIFICATION = 'BIND_NOTIFICATION', - LACK_OF_POINTS = 'LACK_OF_POINTS', - CUSTOM = 'CUSTOM', - MANAGE_RENAME = 'MANAGE_RENAME' + REGISTER = 'REGISTER', // 注册 + RESET_PASSWORD = 'RESET_PASSWORD', // 重置密码 + BIND_NOTIFICATION = 'BIND_NOTIFICATION', // 绑定通知 + + EXPIRE_SOON = 'EXPIRE_SOON', // 即将过期 + EXPIRED = 'EXPIRED', // 已过期 + FREE_CLEAN = 'FREE_CLEAN', // 免费版清理 + + POINTS_THIRTY_PERCENT_REMAIN = 'POINTS_THIRTY_PERCENT_REMAIN', // 积分30%剩余 + POINTS_TEN_PERCENT_REMAIN = 'POINTS_TEN_PERCENT_REMAIN', // 积分10%剩余 + LACK_OF_POINTS = 'LACK_OF_POINTS', // 积分不足 + + DATASET_INDEX_NO_REMAIN = 'DATASET_INDEX_NO_REMAIN', // 数据集索引0剩余 + DATASET_INDEX_TEN_PERCENT_REMAIN = 'DATASET_INDEX_TEN_PERCENT_REMAIN', // 数据集索引10%剩余 + DATASET_INDEX_THIRTY_PERCENT_REMAIN = 'DATASET_INDEX_THIRTY_PERCENT_REMAIN', // 数据集索引30%剩余 + + MANAGE_RENAME = 'MANAGE_RENAME', // 管理员重命名 + CUSTOM = 'CUSTOM' // 自定义 } diff --git a/packages/global/support/user/login/api.d.ts b/packages/global/support/user/login/api.d.ts index 236827a83..6f2fe1028 100644 --- 
a/packages/global/support/user/login/api.d.ts +++ b/packages/global/support/user/login/api.d.ts @@ -1,3 +1,4 @@ +import type { LangEnum } from '../../../common/i18n/type'; import type { TrackRegisterParams } from '../../marketing/type'; export type GetWXLoginQRResponse = { @@ -9,4 +10,5 @@ export type AccountRegisterBody = { username: string; code: string; password: string; + language?: `${LangEnum}`; } & TrackRegisterParams; diff --git a/packages/global/support/user/type.d.ts b/packages/global/support/user/type.d.ts index 8690aab3b..96502dded 100644 --- a/packages/global/support/user/type.d.ts +++ b/packages/global/support/user/type.d.ts @@ -1,3 +1,4 @@ +import type { LangEnum } from '../common/i18n/type'; import type { TeamPermission } from '../permission/user/controller'; import type { UserStatusEnum } from './constant'; import type { TeamMemberStatusEnum } from './team/constant'; @@ -12,6 +13,7 @@ export type UserModelSchema = { openaiKey: string; createTime: number; timezone: string; + language: `${LangEnum}`; status: `${UserStatusEnum}`; lastLoginTmbId?: string; passwordUpdateTime?: Date; @@ -26,9 +28,9 @@ export type UserType = { username: string; avatar: string; // it should be team member's avatar after 4.8.18 timezone: string; + language?: `${LangEnum}`; promotionRate: UserModelSchema['promotionRate']; team: TeamTmbItemType; - notificationAccount?: string; permission: TeamPermission; contact?: string; }; diff --git a/packages/global/support/wallet/bill/api.d.ts b/packages/global/support/wallet/bill/api.d.ts deleted file mode 100644 index 9cec7c44c..000000000 --- a/packages/global/support/wallet/bill/api.d.ts +++ /dev/null @@ -1,44 +0,0 @@ -import type { StandardSubLevelEnum, SubModeEnum } from '../sub/constants'; -import type { BillTypeEnum, BillPayWayEnum } from './constants'; -import { DrawBillQRItem } from './constants'; - -export type CreateOrderResponse = { - qrCode?: string; - iframeCode?: string; - markdown?: string; -}; - -export type 
CreateStandPlanBill = { - type: BillTypeEnum.standSubPlan; - level: `${StandardSubLevelEnum}`; - subMode: `${SubModeEnum}`; -}; -type CreateExtractPointsBill = { - type: BillTypeEnum.extraPoints; - extraPoints: number; -}; -type CreateExtractDatasetBill = { - type: BillTypeEnum.extraDatasetSub; - extraDatasetSize: number; - month: number; -}; -export type CreateBillProps = - | CreateStandPlanBill - | CreateExtractPointsBill - | CreateExtractDatasetBill; - -export type CreateBillResponse = { - billId: string; - readPrice: number; - payment: BillPayWayEnum; -} & CreateOrderResponse; - -export type UpdatePaymentProps = { - billId: string; - payWay: BillPayWayEnum; -}; - -export type CheckPayResultResponse = { - status: BillStatusEnum; - description?: string; -}; diff --git a/packages/global/support/wallet/bill/tools.ts b/packages/global/support/wallet/bill/tools.ts index e4e09a734..c21ac7d5e 100644 --- a/packages/global/support/wallet/bill/tools.ts +++ b/packages/global/support/wallet/bill/tools.ts @@ -1,13 +1,5 @@ import type { PriceOption } from './type'; -// 根据积分获取月份 -export const getMonthByPoints = (points: number) => { - if (points >= 200) return 12; - if (points >= 100) return 6; - if (points >= 50) return 3; - return 1; -}; - // 根据月份获取积分下限 export const getMinPointsByMonth = (month: number): number => { switch (month) { diff --git a/packages/global/support/wallet/bill/type.d.ts b/packages/global/support/wallet/bill/type.d.ts deleted file mode 100644 index 2a73af374..000000000 --- a/packages/global/support/wallet/bill/type.d.ts +++ /dev/null @@ -1,65 +0,0 @@ -import type { StandardSubLevelEnum, SubModeEnum } from '../sub/constants'; -import { SubTypeEnum } from '../sub/constants'; -import type { BillPayWayEnum, BillStatusEnum, BillTypeEnum } from './constants'; -import type { TeamInvoiceHeaderType } from '../../user/team/type'; - -export type BillSchemaType = { - _id: string; - userId: string; - teamId: string; - tmbId: string; - createTime: Date; - orderId: 
string; - status: `${BillStatusEnum}`; - type: BillTypeEnum; - price: number; - hasInvoice: boolean; - metadata: { - payWay: `${BillPayWayEnum}`; - subMode?: `${SubModeEnum}`; - standSubLevel?: `${StandardSubLevelEnum}`; - month?: number; - datasetSize?: number; - extraPoints?: number; - }; - refundData?: { - amount: number; - refundId: string; - refundTime: Date; - }; -}; - -export type ChatNodeUsageType = { - inputTokens?: number; - outputTokens?: number; - totalPoints: number; - moduleName: string; - model?: string; -}; - -export type InvoiceType = { - amount: number; - billIdList: string[]; -} & TeamInvoiceHeaderType; - -export type InvoiceSchemaType = { - _id: string; - teamId: string; - status: 1 | 2; - createTime: Date; - finishTime?: Date; - file?: Buffer; -} & InvoiceType; - -export type AIPointsPriceOption = { - type: 'points'; - points: number; -}; - -export type DatasetPriceOption = { - type: 'dataset'; - size: number; - month: number; -}; - -export type PriceOption = AIPointsPriceOption | DatasetPriceOption; diff --git a/packages/global/support/wallet/bill/type.ts b/packages/global/support/wallet/bill/type.ts new file mode 100644 index 000000000..767b2f32b --- /dev/null +++ b/packages/global/support/wallet/bill/type.ts @@ -0,0 +1,76 @@ +import { StandardSubLevelEnum, SubModeEnum } from '../sub/constants'; +import { SubTypeEnum } from '../sub/constants'; +import { BillPayWayEnum, BillStatusEnum, BillTypeEnum } from './constants'; +import type { TeamInvoiceHeaderType } from '../../user/team/type'; +import { z } from 'zod'; +import { ObjectIdSchema } from '../../../common/type/mongo'; + +export const BillSchema = z.object({ + _id: ObjectIdSchema.meta({ description: '订单 ID' }), + userId: ObjectIdSchema.meta({ description: '用户 ID' }), + teamId: ObjectIdSchema.meta({ description: '团队 ID' }), + tmbId: ObjectIdSchema.meta({ description: '团队成员 ID' }), + createTime: z.coerce.date().meta({ description: '创建时间' }), + orderId: z.string().meta({ description: '订单 ID' 
}), + status: z.enum(BillStatusEnum).meta({ description: '订单状态' }), + type: z.enum(BillTypeEnum).meta({ description: '订单类型' }), + price: z.number().meta({ description: '价格' }), + couponId: ObjectIdSchema.optional().meta({ + description: '优惠券 ID' + }), + hasInvoice: z.boolean().meta({ description: '是否已开发票' }), + metadata: z + .object({ + payWay: z.enum(BillPayWayEnum).meta({ description: '支付方式' }), + subMode: z.enum(SubModeEnum).optional().meta({ description: '订阅周期' }), + standSubLevel: z.enum(StandardSubLevelEnum).optional().meta({ description: '订阅等级' }), + month: z.number().optional().meta({ description: '月数' }), + datasetSize: z.number().optional().meta({ description: '数据集大小' }), + extraPoints: z.number().optional().meta({ description: '额外积分' }) + }) + .meta({ description: '元数据' }), + refundData: z + .object({ + amount: z.number().meta({ description: '退款金额' }), + refundId: z.string().meta({ description: '退款 ID' }), + refundTime: z.date().meta({ description: '退款时间' }) + }) + .optional() + .meta({ description: '退款数据' }) +}); +export type BillSchemaType = z.infer<typeof BillSchema>; + +export type ChatNodeUsageType = { + inputTokens?: number; + outputTokens?: number; + totalPoints: number; + moduleName: string; + model?: string; +}; + +export type InvoiceType = { + amount: number; + billIdList: string[]; +} & TeamInvoiceHeaderType; + +export type InvoiceSchemaType = { + _id: string; + teamId: string; + status: 1 | 2; + createTime: Date; + finishTime?: Date; + file?: Buffer; +} & InvoiceType; + +export type AIPointsPriceOption = { + type: 'points'; + points: number; +}; + +export type DatasetPriceOption = { + type: 'dataset'; + size: number; + month: number; +}; + +export type PriceOption = AIPointsPriceOption | DatasetPriceOption; diff --git a/packages/global/support/wallet/sub/constants.ts b/packages/global/support/wallet/sub/constants.ts index 32c5a2efe..449fc4dc2 100644 --- a/packages/global/support/wallet/sub/constants.ts +++ b/packages/global/support/wallet/sub/constants.ts @@ 
-9,17 +9,17 @@ export enum SubTypeEnum { export const subTypeMap = { [SubTypeEnum.standard]: { - label: 'support.wallet.subscription.type.standard', + label: i18nT('common:support.wallet.subscription.type.standard'), icon: 'support/account/plans', orderType: BillTypeEnum.standSubPlan }, [SubTypeEnum.extraDatasetSize]: { - label: 'support.wallet.subscription.type.extraDatasetSize', + label: i18nT('common:support.wallet.subscription.type.extraDatasetSize'), icon: 'core/dataset/datasetLight', orderType: BillTypeEnum.extraDatasetSub }, [SubTypeEnum.extraPoints]: { - label: 'support.wallet.subscription.type.extraPoints', + label: i18nT('common:support.wallet.subscription.type.extraPoints'), icon: 'core/chat/chatLight', orderType: BillTypeEnum.extraPoints } @@ -31,12 +31,12 @@ export enum SubModeEnum { } export const subModeMap = { [SubModeEnum.month]: { - label: 'support.wallet.subscription.mode.Month', + label: i18nT('common:support.wallet.subscription.mode.Month'), durationMonth: 1, payMonth: 1 }, [SubModeEnum.year]: { - label: 'support.wallet.subscription.mode.Year', + label: i18nT('common:support.wallet.subscription.mode.Year'), durationMonth: 12, payMonth: 10 } @@ -44,17 +44,39 @@ export const subModeMap = { export enum StandardSubLevelEnum { free = 'free', + basic = 'basic', + advanced = 'advanced', + custom = 'custom', + + // @deprecated experience = 'experience', team = 'team', - enterprise = 'enterprise', - custom = 'custom' + enterprise = 'enterprise' } + export const standardSubLevelMap = { [StandardSubLevelEnum.free]: { label: i18nT('common:support.wallet.subscription.standardSubLevel.free'), desc: i18nT('common:support.wallet.subscription.standardSubLevel.free desc'), weight: 1 }, + [StandardSubLevelEnum.basic]: { + label: i18nT('common:support.wallet.subscription.standardSubLevel.basic'), + desc: i18nT('common:support.wallet.subscription.standardSubLevel.basic_desc'), + weight: 4 + }, + [StandardSubLevelEnum.advanced]: { + label: 
i18nT('common:support.wallet.subscription.standardSubLevel.advanced'), + desc: i18nT('common:support.wallet.subscription.standardSubLevel.advanced_desc'), + weight: 5 + }, + [StandardSubLevelEnum.custom]: { + label: i18nT('common:support.wallet.subscription.standardSubLevel.custom'), + desc: i18nT('common:support.wallet.subscription.standardSubLevel.custom_desc'), + weight: 7 + }, + + // deprecated [StandardSubLevelEnum.experience]: { label: i18nT('common:support.wallet.subscription.standardSubLevel.experience'), desc: i18nT('common:support.wallet.subscription.standardSubLevel.experience_desc'), @@ -68,11 +90,6 @@ export const standardSubLevelMap = { [StandardSubLevelEnum.enterprise]: { label: i18nT('common:support.wallet.subscription.standardSubLevel.enterprise'), desc: i18nT('common:support.wallet.subscription.standardSubLevel.enterprise_desc'), - weight: 4 - }, - [StandardSubLevelEnum.custom]: { - label: i18nT('common:support.wallet.subscription.standardSubLevel.custom'), - desc: '', - weight: 5 + weight: 6 } }; diff --git a/packages/global/support/wallet/sub/coupon/type.d.ts b/packages/global/support/wallet/sub/coupon/type.d.ts index 07e347cd0..0afcda182 100644 --- a/packages/global/support/wallet/sub/coupon/type.d.ts +++ b/packages/global/support/wallet/sub/coupon/type.d.ts @@ -1,12 +1,26 @@ import type { SubTypeEnum, StandardSubLevelEnum } from '../constants'; import type { CouponTypeEnum } from './constants'; +export type CustomSubConfig = { + requestsPerMinute: number; + maxTeamMember: number; + maxAppAmount: number; + maxDatasetAmount: number; + chatHistoryStoreDuration: number; + maxDatasetSize: number; + websiteSyncPerDataset: number; + appRegistrationCount: number; + auditLogStoreDuration: number; + ticketResponseTime: number; +}; + export type TeamCouponSub = { type: `${SubTypeEnum}`; // Sub type durationDay: number; // Duration day level?: `${StandardSubLevelEnum}`; // Standard sub level extraDatasetSize?: number; // Extra dataset size totalPoints?: 
number; // Total points(Extrapoints or Standard sub) + customConfig?: CustomSubConfig; // Custom config for custom level (only required when level=custom) }; export type TeamCouponSchema = { diff --git a/packages/global/support/wallet/sub/discountCoupon/constants.ts b/packages/global/support/wallet/sub/discountCoupon/constants.ts new file mode 100644 index 000000000..b03bb4fe2 --- /dev/null +++ b/packages/global/support/wallet/sub/discountCoupon/constants.ts @@ -0,0 +1,33 @@ +import { i18nT } from '../../../../../web/i18n/utils'; + +export enum DiscountCouponTypeEnum { + monthStandardDiscount70 = 'monthStandardDiscount70', + yearStandardDiscount90 = 'yearStandardDiscount90' +} + +export enum DiscountCouponStatusEnum { + active = 'active', + used = 'used', + expired = 'expired', + notStart = 'notStart' +} + +// Discount coupon type config table, modify to add or update types. +export const discountCouponTypeMap = { + [DiscountCouponTypeEnum.monthStandardDiscount70]: { + type: DiscountCouponTypeEnum.monthStandardDiscount70, + name: i18nT('common:old_user_month_discount_70'), + description: i18nT('common:old_user_month_discount_70_description'), + discount: 0.7, + iconZh: '/imgs/system/discount70CN.svg', + iconEn: '/imgs/system/discount70EN.svg' + }, + [DiscountCouponTypeEnum.yearStandardDiscount90]: { + type: DiscountCouponTypeEnum.yearStandardDiscount90, + name: i18nT('common:old_user_year_discount_90'), + description: i18nT('common:old_user_year_discount_90_description'), + discount: 0.9, + iconZh: '/imgs/system/discount90CN.svg', + iconEn: '/imgs/system/discount90EN.svg' + } +}; diff --git a/packages/global/support/wallet/sub/type.d.ts b/packages/global/support/wallet/sub/type.d.ts index 2f89b7c95..a575ddf7e 100644 --- a/packages/global/support/wallet/sub/type.d.ts +++ b/packages/global/support/wallet/sub/type.d.ts @@ -3,19 +3,28 @@ import type { StandardSubLevelEnum, SubModeEnum, SubTypeEnum } from './constants // Content of plan export type 
TeamStandardSubPlanItemType = { name?: string; + desc?: string; // Plan description price: number; // read price / month + pointPrice: number; // read price/ one thousand + totalPoints: number; // n maxTeamMember: number; maxAppAmount: number; // max app or plugin amount maxDatasetAmount: number; - chatHistoryStoreDuration: number; // n day maxDatasetSize: number; - trainingWeight: number; // 1~4 - permissionCustomApiKey: boolean; - permissionCustomCopyright: boolean; // feature - permissionWebsiteSync: boolean; - permissionTeamOperationLog: boolean; + + requestsPerMinute?: number; + appRegistrationCount?: number; + chatHistoryStoreDuration: number; // n day + websiteSyncPerDataset?: number; + auditLogStoreDuration?: number; + ticketResponseTime?: number; + + // Custom plan specific fields + priceDescription?: string; + customFormUrl?: string; + customDescriptions?: string[]; }; export type StandSubPlanLevelMapType = Record< @@ -23,14 +32,21 @@ export type StandSubPlanLevelMapType = Record< TeamStandardSubPlanItemType >; +export type PointsPackageItem = { + points: number; + month: number; + price: number; +}; + export type SubPlanType = { - [SubTypeEnum.standard]: StandSubPlanLevelMapType; + [SubTypeEnum.standard]?: StandSubPlanLevelMapType; planDescriptionUrl?: string; + appRegistrationUrl?: string; [SubTypeEnum.extraDatasetSize]: { price: number; }; [SubTypeEnum.extraPoints]: { - price: number; + packages: PointsPackageItem[]; }; }; @@ -49,6 +65,15 @@ export type TeamSubSchema = { maxApp?: number; maxDataset?: number; + // custom level configurations + requestsPerMinute?: number; + chatHistoryStoreDuration?: number; + maxDatasetSize?: number; + websiteSyncPerDataset?: number; + appRegistrationCount?: number; + auditLogStoreDuration?: number; + ticketResponseTime?: number; + totalPoints: number; surplusPoints: number; @@ -71,4 +96,5 @@ export type ClientTeamPlanStatusType = TeamPlanStatusType & { usedAppAmount: number; usedDatasetSize: number; 
usedDatasetIndexSize: number; + usedRegistrationCount: number; }; diff --git a/packages/service/common/buffer/rawText/controller.ts b/packages/service/common/buffer/rawText/controller.ts deleted file mode 100644 index d16c9c59e..000000000 --- a/packages/service/common/buffer/rawText/controller.ts +++ /dev/null @@ -1,180 +0,0 @@ -import { retryFn } from '@fastgpt/global/common/system/utils'; -import { connectionMongo, Types } from '../../mongo'; -import { MongoRawTextBufferSchema, bucketName } from './schema'; -import { addLog } from '../../system/log'; -import { setCron } from '../../system/cron'; -import { checkTimerLock } from '../../system/timerLock/utils'; -import { TimerIdEnum } from '../../system/timerLock/constants'; -import { gridFsStream2Buffer } from '../../file/gridfs/utils'; -import { readRawContentFromBuffer } from '../../../worker/function'; - -const getGridBucket = () => { - return new connectionMongo.mongo.GridFSBucket(connectionMongo.connection.db!, { - bucketName: bucketName - }); -}; - -export const addRawTextBuffer = async ({ - sourceId, - sourceName, - text, - expiredTime -}: { - sourceId: string; - sourceName: string; - text: string; - expiredTime: Date; -}) => { - const gridBucket = getGridBucket(); - const metadata = { - sourceId, - sourceName, - expiredTime - }; - - const buffer = Buffer.from(text); - - const fileSize = buffer.length; - // 单块大小:尽可能大,但不超过 14MB,不小于128KB - const chunkSizeBytes = (() => { - // 计算理想块大小:文件大小 ÷ 目标块数(10)。 并且每个块需要小于 14MB - const idealChunkSize = Math.min(Math.ceil(fileSize / 10), 14 * 1024 * 1024); - - // 确保块大小至少为128KB - const minChunkSize = 128 * 1024; // 128KB - - // 取理想块大小和最小块大小中的较大值 - let chunkSize = Math.max(idealChunkSize, minChunkSize); - - // 将块大小向上取整到最接近的64KB的倍数,使其更整齐 - chunkSize = Math.ceil(chunkSize / (64 * 1024)) * (64 * 1024); - - return chunkSize; - })(); - - const uploadStream = gridBucket.openUploadStream(sourceId, { - metadata, - chunkSizeBytes - }); - - return retryFn(async () => { - return new 
Promise((resolve, reject) => { - uploadStream.end(buffer); - uploadStream.on('finish', () => { - resolve(uploadStream.id); - }); - uploadStream.on('error', (error) => { - addLog.error('addRawTextBuffer error', error); - resolve(''); - }); - }); - }); -}; - -export const getRawTextBuffer = async (sourceId: string) => { - const gridBucket = getGridBucket(); - - return retryFn(async () => { - const bufferData = await MongoRawTextBufferSchema.findOne( - { - 'metadata.sourceId': sourceId - }, - '_id metadata' - ).lean(); - if (!bufferData) { - return null; - } - - // Read file content - const downloadStream = gridBucket.openDownloadStream(new Types.ObjectId(bufferData._id)); - - const fileBuffers = await gridFsStream2Buffer(downloadStream); - - const rawText = await (async () => { - if (fileBuffers.length < 10000000) { - return fileBuffers.toString('utf8'); - } else { - return ( - await readRawContentFromBuffer({ - extension: 'txt', - encoding: 'utf8', - buffer: fileBuffers - }) - ).rawText; - } - })(); - - return { - text: rawText, - sourceName: bufferData.metadata?.sourceName || '' - }; - }); -}; - -export const deleteRawTextBuffer = async (sourceId: string): Promise<boolean> => { - const gridBucket = getGridBucket(); - - return retryFn(async () => { - const buffer = await MongoRawTextBufferSchema.findOne({ 'metadata.sourceId': sourceId }); - if (!buffer) { - return false; - } - - await gridBucket.delete(new Types.ObjectId(buffer._id)); - return true; - }); -}; - -export const updateRawTextBufferExpiredTime = async ({ - sourceId, - expiredTime -}: { - sourceId: string; - expiredTime: Date; -}) => { - return retryFn(async () => { - return MongoRawTextBufferSchema.updateOne( - { 'metadata.sourceId': sourceId }, - { $set: { 'metadata.expiredTime': expiredTime } } - ); - }); -}; - -export const clearExpiredRawTextBufferCron = async () => { - const gridBucket = getGridBucket(); - - const clearExpiredRawTextBuffer = async () => { - addLog.debug('Clear expired raw text buffer 
start'); - - const data = await MongoRawTextBufferSchema.find( - { - 'metadata.expiredTime': { $lt: new Date() } - }, - '_id' - ).lean(); - - for (const item of data) { - try { - await gridBucket.delete(new Types.ObjectId(item._id)); - } catch (error) { - addLog.error('Delete expired raw text buffer error', error); - } - } - addLog.debug('Clear expired raw text buffer end'); - }; - - setCron('*/10 * * * *', async () => { - if ( - await checkTimerLock({ - timerId: TimerIdEnum.clearExpiredRawTextBuffer, - lockMinuted: 9 - }) - ) { - try { - await clearExpiredRawTextBuffer(); - } catch (error) { - addLog.error('clearExpiredRawTextBufferCron error', error); - } - } - }); -}; diff --git a/packages/service/common/buffer/rawText/schema.ts b/packages/service/common/buffer/rawText/schema.ts deleted file mode 100644 index f6e9ea580..000000000 --- a/packages/service/common/buffer/rawText/schema.ts +++ /dev/null @@ -1,22 +0,0 @@ -import { getMongoModel, type Types, Schema } from '../../mongo'; - -export const bucketName = 'buffer_rawtext'; - -const RawTextBufferSchema = new Schema({ - metadata: { - sourceId: { type: String, required: true }, - sourceName: { type: String, required: true }, - expiredTime: { type: Date, required: true } - } -}); -RawTextBufferSchema.index({ 'metadata.sourceId': 'hashed' }); -RawTextBufferSchema.index({ 'metadata.expiredTime': -1 }); - -export const MongoRawTextBufferSchema = getMongoModel<{ - _id: Types.ObjectId; - metadata: { - sourceId: string; - sourceName: string; - expiredTime: Date; - }; -}>(`${bucketName}.files`, RawTextBufferSchema); diff --git a/packages/service/common/bullmq/index.ts b/packages/service/common/bullmq/index.ts index 17c66bd0a..b0d3d1961 100644 --- a/packages/service/common/bullmq/index.ts +++ b/packages/service/common/bullmq/index.ts @@ -22,7 +22,10 @@ export enum QueueNames { datasetSync = 'datasetSync', evaluation = 'evaluation', s3FileDelete = 's3FileDelete', - // abondoned + + // Delete Queue + datasetDelete = 
'datasetDelete', + // @deprecated websiteSync = 'websiteSync' } diff --git a/packages/service/common/file/gridfs/controller.ts b/packages/service/common/file/gridfs/controller.ts index 9f99fbe2e..0ca698066 100644 --- a/packages/service/common/file/gridfs/controller.ts +++ b/packages/service/common/file/gridfs/controller.ts @@ -1,16 +1,6 @@ import { Types, connectionMongo, ReadPreference } from '../../mongo'; import type { BucketNameEnum } from '@fastgpt/global/common/file/constants'; -import fsp from 'fs/promises'; -import fs from 'fs'; -import { type DatasetFileSchema } from '@fastgpt/global/core/dataset/type'; import { MongoChatFileSchema, MongoDatasetFileSchema } from './schema'; -import { detectFileEncodingByPath } from '@fastgpt/global/common/file/tools'; -import { computeGridFsChunSize, stream2Encoding } from './utils'; -import { addLog } from '../../system/log'; -import { Readable } from 'stream'; -import { retryFn } from '@fastgpt/global/common/system/utils'; -import { getS3DatasetSource } from '../../s3/sources/dataset'; -import { isS3ObjectKey } from '../../s3/utils'; export function getGFSCollection(bucket: `${BucketNameEnum}`) { MongoDatasetFileSchema; @@ -18,6 +8,7 @@ export function getGFSCollection(bucket: `${BucketNameEnum}`) { return connectionMongo.connection.db!.collection(`${bucket}.files`); } + export function getGridBucket(bucket: `${BucketNameEnum}`) { return new connectionMongo.mongo.GridFSBucket(connectionMongo.connection.db!, { bucketName: bucket, @@ -26,106 +17,6 @@ export function getGridBucket(bucket: `${BucketNameEnum}`) { }); } -/* crud file */ -export async function uploadFile({ - bucketName, - teamId, - uid, - path, - filename, - contentType, - metadata = {} -}: { - bucketName: `${BucketNameEnum}`; - teamId: string; - uid: string; // tmbId / outLinkUId - path: string; - filename: string; - contentType?: string; - metadata?: Record<string, any>; -}) { - if (!path) return Promise.reject(`filePath is empty`); - if (!filename) return 
Promise.reject(`filename is empty`); - - const stats = await fsp.stat(path); - if (!stats.isFile()) return Promise.reject(`${path} is not a file`); - - const readStream = fs.createReadStream(path, { - highWaterMark: 256 * 1024 - }); - - // Add default metadata - metadata.teamId = teamId; - metadata.uid = uid; - metadata.encoding = await detectFileEncodingByPath(path); - - // create a gridfs bucket - const bucket = getGridBucket(bucketName); - - const chunkSizeBytes = computeGridFsChunSize(stats.size); - - const stream = bucket.openUploadStream(filename, { - metadata, - contentType, - chunkSizeBytes - }); - - // save to gridfs - await new Promise((resolve, reject) => { - readStream - .pipe(stream as any) - .on('finish', resolve) - .on('error', reject); - }).finally(() => { - readStream.destroy(); - }); - - return String(stream.id); -} - -export async function getFileById({ - bucketName, - fileId -}: { - bucketName: `${BucketNameEnum}`; - fileId: string; -}) { - const db = getGFSCollection(bucketName); - const file = await db.findOne({ - _id: new Types.ObjectId(fileId) - }); - - return file || undefined; -} - -export async function delFileByFileIdList({ - bucketName, - fileIdList -}: { - bucketName: `${BucketNameEnum}`; - fileIdList: string[]; -}): Promise<void> { - return retryFn(async () => { - const bucket = getGridBucket(bucketName); - - for await (const fileId of fileIdList) { - try { - if (isS3ObjectKey(fileId, 'dataset')) { - await getS3DatasetSource().deleteDatasetFileByKey(fileId); - } else { - await bucket.delete(new Types.ObjectId(String(fileId))); - } - } catch (error: any) { - if (typeof error?.message === 'string' && error.message.includes('File not found')) { - addLog.warn('File not found', { fileId }); - return; - } - return Promise.reject(error); - } - } - }); -} - export async function getDownloadStream({ bucketName, fileId diff --git a/packages/service/common/file/gridfs/utils.ts b/packages/service/common/file/gridfs/utils.ts index 691d85a4f..a02883eb1 
100644 --- a/packages/service/common/file/gridfs/utils.ts +++ b/packages/service/common/file/gridfs/utils.ts @@ -1,78 +1,5 @@ import { detectFileEncoding } from '@fastgpt/global/common/file/tools'; import { PassThrough } from 'stream'; -import { getGridBucket } from './controller'; -import { type BucketNameEnum } from '@fastgpt/global/common/file/constants'; -import { retryFn } from '@fastgpt/global/common/system/utils'; - -export const createFileFromText = async ({ - bucket, - filename, - text, - metadata -}: { - bucket: `${BucketNameEnum}`; - filename: string; - text: string; - metadata: Record<string, any>; -}) => { - const gridBucket = getGridBucket(bucket); - - const buffer = Buffer.from(text); - - const fileSize = buffer.length; - // 单块大小:尽可能大,但不超过 14MB,不小于128KB - const chunkSizeBytes = (() => { - // 计算理想块大小:文件大小 ÷ 目标块数(10)。 并且每个块需要小于 14MB - const idealChunkSize = Math.min(Math.ceil(fileSize / 10), 14 * 1024 * 1024); - - // 确保块大小至少为128KB - const minChunkSize = 128 * 1024; // 128KB - - // 取理想块大小和最小块大小中的较大值 - let chunkSize = Math.max(idealChunkSize, minChunkSize); - - // 将块大小向上取整到最接近的64KB的倍数,使其更整齐 - chunkSize = Math.ceil(chunkSize / (64 * 1024)) * (64 * 1024); - - return chunkSize; - })(); - - const uploadStream = gridBucket.openUploadStream(filename, { - metadata, - chunkSizeBytes - }); - - return retryFn(async () => { - return new Promise<{ fileId: string }>((resolve, reject) => { - uploadStream.end(buffer); - uploadStream.on('finish', () => { - resolve({ fileId: String(uploadStream.id) }); - }); - uploadStream.on('error', reject); - }); - }); -}; - -export const gridFsStream2Buffer = (stream: NodeJS.ReadableStream) => { - return new Promise<Buffer>((resolve, reject) => { - if (!stream.readable) { - return resolve(Buffer.from([])); - } - - const chunks: Uint8Array[] = []; - - stream.on('data', (chunk) => { - chunks.push(chunk); - }); - stream.on('end', () => { - const resultBuffer = Buffer.concat(chunks); // One-time splicing - resolve(resultBuffer); - }); - stream.on('error', 
(err) => { - reject(err); - }); - }); -}; export const stream2Encoding = async (stream: NodeJS.ReadableStream) => { const copyStream = stream.pipe(new PassThrough()); @@ -109,20 +36,3 @@ export const stream2Encoding = async (stream: NodeJS.ReadableStream) => { stream: copyStream }; }; - -// 单块大小:尽可能大,但不超过 14MB,不小于512KB -export const computeGridFsChunSize = (fileSize: number) => { - // 计算理想块大小:文件大小 ÷ 目标块数(10)。 并且每个块需要小于 14MB - const idealChunkSize = Math.min(Math.ceil(fileSize / 10), 14 * 1024 * 1024); - - // 确保块大小至少为512KB - const minChunkSize = 512 * 1024; // 512KB - - // 取理想块大小和最小块大小中的较大值 - let chunkSize = Math.max(idealChunkSize, minChunkSize); - - // 将块大小向上取整到最接近的64KB的倍数,使其更整齐 - chunkSize = Math.ceil(chunkSize / (64 * 1024)) * (64 * 1024); - - return chunkSize; -}; diff --git a/packages/service/common/file/image/controller.ts b/packages/service/common/file/image/controller.ts index 2ab09a516..6c6308548 100644 --- a/packages/service/common/file/image/controller.ts +++ b/packages/service/common/file/image/controller.ts @@ -78,9 +78,10 @@ export const copyAvatarImage = async ({ const avatarSource = getS3AvatarSource(); if (isS3ObjectKey(imageUrl?.slice(avatarSource.prefix.length), 'avatar')) { const filename = (() => { - const last = imageUrl.split('/').pop()?.split('-')[1]; + const last = imageUrl.split('/').pop(); if (!last) return getNanoid(6).concat(path.extname(imageUrl)); - return `${getNanoid(6)}-${last}`; + const firstDashIndex = last.indexOf('-'); + return `${getNanoid(6)}-${firstDashIndex === -1 ? 
last : last.slice(firstDashIndex + 1)}`; })(); const key = await getS3AvatarSource().copyAvatar({ key: imageUrl, diff --git a/packages/service/common/file/multer.ts b/packages/service/common/file/multer.ts index 237407656..9b2bfe41b 100644 --- a/packages/service/common/file/multer.ts +++ b/packages/service/common/file/multer.ts @@ -1,152 +1,138 @@ -import type { NextApiRequest, NextApiResponse } from 'next'; -import multer from 'multer'; -import path from 'path'; -import type { BucketNameEnum } from '@fastgpt/global/common/file/constants'; -import { bucketNameMap } from '@fastgpt/global/common/file/constants'; import { getNanoid } from '@fastgpt/global/common/string/tools'; -import { UserError } from '@fastgpt/global/common/error/utils'; +import m from 'multer'; +import type { NextApiRequest } from 'next'; +import path from 'path'; +import fs from 'node:fs'; -export type FileType = { - fieldname: string; - originalname: string; - encoding: string; - mimetype: string; - filename: string; - path: string; - size: number; -}; - -/* - maxSize: File max size (MB) -*/ -export const getUploadModel = ({ maxSize = 500 }: { maxSize?: number }) => { - maxSize *= 1024 * 1024; - - class UploadModel { - uploaderSingle = multer({ - limits: { - fieldSize: maxSize - }, - preservePath: true, - storage: multer.diskStorage({ - // destination: (_req, _file, cb) => { - // cb(null, tmpFileDirPath); - // }, - filename: (req, file, cb) => { - if (!file?.originalname) { - cb(new Error('File not found'), ''); - } else { - const { ext } = path.parse(decodeURIComponent(file.originalname)); - cb(null, `${getNanoid()}${ext}`); - } - } - }) - }).single('file'); - async getUploadFile( - req: NextApiRequest, - res: NextApiResponse, - originBucketName?: `${BucketNameEnum}` - ) { - return new Promise<{ - file: FileType; - metadata: Record; - data: T; - bucketName?: `${BucketNameEnum}`; - }>((resolve, reject) => { - // @ts-ignore - this.uploaderSingle(req, res, (error) => { - if (error) { - return 
reject(error); - } - - // check bucket name - const bucketName = (req.body?.bucketName || originBucketName) as `${BucketNameEnum}`; - if (bucketName && !bucketNameMap[bucketName]) { - return reject(new UserError('BucketName is invalid')); - } - - // @ts-ignore - const file = req.file as FileType; - - resolve({ - file: { - ...file, - originalname: decodeURIComponent(file.originalname) - }, - bucketName, - metadata: (() => { - if (!req.body?.metadata) return {}; - try { - return JSON.parse(req.body.metadata); - } catch (error) { - return {}; - } - })(), - data: (() => { - if (!req.body?.data) return {}; - try { - return JSON.parse(req.body.data); - } catch (error) { - return {}; - } - })() - }); - }); - }); +export const multer = { + _storage: m.diskStorage({ + filename: (_, file, cb) => { + if (!file?.originalname) { + cb(new Error('File not found'), ''); + } else { + const ext = path.extname(decodeURIComponent(file.originalname)); + cb(null, `${getNanoid()}${ext}`); + } } + }), - uploaderMultiple = multer({ + singleStore(maxFileSize: number = 500) { + const fileSize = maxFileSize * 1024 * 1024; + + return m({ limits: { - fieldSize: maxSize + fileSize }, preservePath: true, - storage: multer.diskStorage({ - // destination: (_req, _file, cb) => { - // cb(null, tmpFileDirPath); - // }, - filename: (req, file, cb) => { - if (!file?.originalname) { - cb(new Error('File not found'), ''); - } else { - const { ext } = path.parse(decodeURIComponent(file.originalname)); - cb(null, `${getNanoid()}${ext}`); - } - } - }) + storage: this._storage + }).single('file'); + }, + + multipleStore(maxFileSize: number = 500) { + const fileSize = maxFileSize * 1024 * 1024; + + return m({ + limits: { + fileSize + }, + preservePath: true, + storage: this._storage }).array('file', global.feConfigs?.uploadFileMaxSize); - async getUploadFiles(req: NextApiRequest, res: NextApiResponse) { - return new Promise<{ - files: FileType[]; - data: T; - }>((resolve, reject) => { - // @ts-ignore - 
this.uploaderMultiple(req, res, (error) => { - if (error) { - console.log(error); - return reject(error); + }, + + resolveFormData<T extends Record<string, any>>({ + request, + maxFileSize + }: { + request: NextApiRequest; + maxFileSize?: number; + }) { + return new Promise<{ + data: T; + fileMetadata: Express.Multer.File; + getBuffer: () => Buffer; + getReadStream: () => fs.ReadStream; + }>((resolve, reject) => { + const handler = this.singleStore(maxFileSize); + + // @ts-expect-error it can accept a NextApiRequest + handler(request, null, (error) => { + if (error) { + return reject(error); + } + + // @ts-expect-error `file` will be injected by multer + const file = request.file as Express.Multer.File; + + if (!file) { + return reject(new Error('File not found')); + } + + const data = (() => { + if (!request.body?.data) return {}; + try { + return JSON.parse(request.body.data); + } catch { + return {}; } + })(); - // @ts-ignore - const files = req.files as FileType[]; - - resolve({ - files: files.map((file) => ({ - ...file, - originalname: decodeURIComponent(file.originalname) - })), - data: (() => { - if (!req.body?.data) return {}; - try { - return JSON.parse(req.body.data); - } catch (error) { - return {}; - } - })() - }); + resolve({ + data, + fileMetadata: file, + getBuffer: () => fs.readFileSync(file.path), + getReadStream: () => fs.createReadStream(file.path) }); }); + }); + }, + + resolveMultipleFormData<T extends Record<string, any>>({ + request, + maxFileSize + }: { + request: NextApiRequest; + maxFileSize?: number; + }) { + return new Promise<{ + data: T; + fileMetadata: Array<Express.Multer.File>; + }>((resolve, reject) => { + const handler = this.multipleStore(maxFileSize); + + // @ts-expect-error it can accept a NextApiRequest + handler(request, null, (error) => { + if (error) { + return reject(error); + } + + // @ts-expect-error `files` will be injected by multer + const files = request.files as Array<Express.Multer.File>; + + if (!files || files.length === 0) { + return reject(new Error('File not found')); + } + + const data = (() => { + if
(!request.body?.data) return {}; + try { + return JSON.parse(request.body.data); + } catch { + return {}; + } + })(); + + resolve({ + data, + fileMetadata: files + }); + }); + }); + }, + + clearDiskTempFiles(filepaths: string[]) { + for (const filepath of filepaths) { + fs.rm(filepath, { force: true }, (_) => {}); } } - - return new UploadModel(); }; diff --git a/packages/service/common/geo/constants.ts b/packages/service/common/geo/constants.ts new file mode 100644 index 000000000..f10a148bc --- /dev/null +++ b/packages/service/common/geo/constants.ts @@ -0,0 +1,15 @@ +import path from 'node:path'; +import type { LocationName } from './type'; + +export const dbPath = path.join(process.cwd(), 'data/GeoLite2-City.mmdb'); + +export const privateOrOtherLocationName: LocationName = { + city: undefined, + country: { + en: 'Other', + zh: '其他' + }, + province: undefined +}; + +export const cleanupIntervalMs = 6 * 60 * 60 * 1000; // Run cleanup every 6 hours diff --git a/packages/service/common/geo/index.ts b/packages/service/common/geo/index.ts new file mode 100644 index 000000000..c7107089b --- /dev/null +++ b/packages/service/common/geo/index.ts @@ -0,0 +1,107 @@ +import fs from 'node:fs'; +import type { ReaderModel } from '@maxmind/geoip2-node'; +import { Reader } from '@maxmind/geoip2-node'; +import { cleanupIntervalMs, dbPath, privateOrOtherLocationName } from './constants'; +import type { I18nName, LocationName } from './type'; +import { extractLocationData } from './utils'; +import type { NextApiRequest } from 'next'; +import { getClientIp } from 'request-ip'; +import { addLog } from '../system/log'; + +let reader: ReaderModel | null = null; + +const locationIpMap = new Map(); + +function loadGeoDB() { + const dbBuffer = fs.readFileSync(dbPath); + reader = Reader.openBuffer(dbBuffer); + return reader; +} + +export function getGeoReader() { + if (!reader) { + return loadGeoDB(); + } + return reader; +} + +export function getLocationFromIp(ip?: string, locale: keyof 
I18nName = 'zh') { + if (!ip) { + return privateOrOtherLocationName.country?.[locale]; + } + const reader = getGeoReader(); + + let locationName = locationIpMap.get(ip); + if (locationName) { + return [ + locationName.country?.[locale], + locationName.province?.[locale], + locationName.city?.[locale] + ] + .filter(Boolean) + .join(locale === 'zh' ? ',' : ', '); + } + + try { + const response = reader.city(ip); + const data = extractLocationData(response); + locationName = { + city: { + en: data.city.en, + zh: data.city.zh + }, + country: { + en: data.country.en, + zh: data.country.zh + }, + province: { + en: data.province.en, + zh: data.province.zh + } + }; + locationIpMap.set(ip, locationName); + + return [ + locationName.country?.[locale], + locationName.province?.[locale], + locationName.city?.[locale] + ] + .filter(Boolean) + .join(locale === 'zh' ? ',' : ', '); + } catch (error) { + locationIpMap.set(ip, privateOrOtherLocationName); + return privateOrOtherLocationName.country?.[locale]; + } +} + +let cleanupInterval: NodeJS.Timeout | null = null; +function cleanupIpMap() { + locationIpMap.clear(); +} + +export function clearCleanupInterval() { + if (cleanupInterval) { + clearInterval(cleanupInterval); + cleanupInterval = null; + } +} + +export function initGeo() { + cleanupInterval = setInterval(cleanupIpMap, cleanupIntervalMs); + + try { + loadGeoDB(); + } catch (error) { + clearCleanupInterval(); + addLog.error(`Failed to load geo db`, error); + throw error; + } +} + +export function getIpFromRequest(request: NextApiRequest): string { + const ip = getClientIp(request); + if (!ip || ip === '::1') { + return '127.0.0.1'; + } + return ip; +} diff --git a/packages/service/common/geo/type.ts b/packages/service/common/geo/type.ts new file mode 100644 index 000000000..ca189f604 --- /dev/null +++ b/packages/service/common/geo/type.ts @@ -0,0 +1,10 @@ +export type I18nName = { + zh?: string; + en?: string; +}; + +export type LocationName = { + country?: I18nName; +
province?: I18nName; + city?: I18nName; +}; diff --git a/packages/service/common/geo/utils.ts b/packages/service/common/geo/utils.ts new file mode 100644 index 000000000..7c451eb5c --- /dev/null +++ b/packages/service/common/geo/utils.ts @@ -0,0 +1,21 @@ +import type { City } from '@maxmind/geoip2-node'; + +export function extractLocationData(response: City) { + return { + city: { + id: response.city?.geonameId, + en: response.city?.names.en, + zh: response.city?.names['zh-CN'] + }, + country: { + id: response.country?.geonameId, + en: response.country?.names.en, + zh: response.country?.names['zh-CN'] + }, + province: { + id: response.subdivisions?.[0]?.geonameId, + en: response.subdivisions?.[0]?.names.en, + zh: response.subdivisions?.[0]?.names['zh-CN'] + } + }; +} diff --git a/packages/service/common/middle/tracks/processor.ts b/packages/service/common/middle/tracks/processor.ts index 498f62dab..d7a517fe5 100644 --- a/packages/service/common/middle/tracks/processor.ts +++ b/packages/service/common/middle/tracks/processor.ts @@ -15,6 +15,13 @@ const getCurrentTenMinuteBoundary = () => { return boundary; }; +const getCurrentMinuteBoundary = () => { + const now = new Date(); + const boundary = new Date(now); + boundary.setSeconds(0, 0); + return boundary; +}; + export const trackTimerProcess = async () => { while (true) { await countTrackTimer(); @@ -31,7 +38,8 @@ export const countTrackTimer = async () => { global.countTrackQueue = new Map(); try { - const currentBoundary = getCurrentTenMinuteBoundary(); + const currentTenMinuteBoundary = getCurrentTenMinuteBoundary(); + const currentMinuteBoundary = getCurrentMinuteBoundary(); const bulkOps = queuedItems .map(({ event, count, data }) => { @@ -44,7 +52,7 @@ export const countTrackTimer = async () => { filter: { event, teamId, - createTime: currentBoundary, + createTime: currentTenMinuteBoundary, 'data.datasetId': datasetId }, update: [ @@ -52,7 +60,7 @@ export const countTrackTimer = async () => { $set: { event, 
teamId, - createTime: { $ifNull: ['$createTime', currentBoundary] }, + createTime: { $ifNull: ['$createTime', currentTenMinuteBoundary] }, data: { datasetId, count: { $add: [{ $ifNull: ['$data.count', 0] }, count] } @@ -65,6 +73,36 @@ } ]; } + + if (event === TrackEnum.teamChatQPM) { + const { teamId } = data; + + return [ + { + updateOne: { + filter: { + event, + teamId, + createTime: currentMinuteBoundary + }, + update: [ + { + $set: { + event, + teamId, + createTime: { $ifNull: ['$createTime', currentMinuteBoundary] }, + data: { + requestCount: { $add: [{ $ifNull: ['$data.requestCount', 0] }, count] } + } + } + } + ], + upsert: true + } + } + ]; + } + return []; }) .flat(); diff --git a/packages/service/common/middle/tracks/utils.ts b/packages/service/common/middle/tracks/utils.ts index 900ccd23f..1d5127fb8 100644 --- a/packages/service/common/middle/tracks/utils.ts +++ b/packages/service/common/middle/tracks/utils.ts @@ -146,5 +146,15 @@ export const pushTrack = { } }); }); + }, + teamChatQPM: (data: { teamId: string }) => { + if (!data.teamId) return; + pushCountTrack({ + event: TrackEnum.teamChatQPM, + key: `${TrackEnum.teamChatQPM}_${data.teamId}`, + data: { + teamId: data.teamId + } + }); } }; diff --git a/packages/service/common/mongo/init.ts b/packages/service/common/mongo/init.ts index c67a395e2..cd5184b34 100644 --- a/packages/service/common/mongo/init.ts +++ b/packages/service/common/mongo/init.ts @@ -64,8 +64,7 @@ export async function connectMongo(props: { maxIdleTimeMS: 300000, // Idle connection timeout: 5 minutes, prevents idle connections from holding resources for long periods retryWrites: true, // Retry writes: retry failed write operations retryReads: true, // Retry reads: retry failed read operations - serverSelectionTimeoutMS: 10000, // Server selection timeout: 10 seconds, prevents long blocking when the replica set fails - w: 'majority' // Write concern: return after a majority of nodes acknowledge, guaranteeing data safety + serverSelectionTimeoutMS: 10000 // Server selection timeout: 10 seconds, prevents long blocking when the replica set fails }); console.log('mongo connected'); diff --git a/packages/service/common/s3/buckets/base.ts b/packages/service/common/s3/buckets/base.ts index
ee5c153b8..2e2389b0b 100644 --- a/packages/service/common/s3/buckets/base.ts +++ b/packages/service/common/s3/buckets/base.ts @@ -160,6 +160,18 @@ export class S3BaseBucket { return this.client.statObject(this.name, ...params); } + async isObjectExists(key: string): Promise<boolean> { + try { + await this.client.statObject(this.name, key); + return true; + } catch (err) { + if (err instanceof S3Error && err.message === 'Not Found') { + return false; + } + return Promise.reject(err); + } + } + async fileStreamToBuffer(stream: Readable): Promise<Buffer> { const chunks: Buffer[] = []; for await (const chunk of stream) { @@ -183,10 +195,7 @@ const ext = path.extname(filename).toLowerCase(); const contentType = Mimes[ext as keyof typeof Mimes] ?? 'application/octet-stream'; - const key = (() => { - if ('rawKey' in params) return params.rawKey; - return [params.source, params.teamId, `${getNanoid(6)}-$(unknown)`].join('/'); - })(); + const key = params.rawKey; const policy = this.externalClient.newPostPolicy(); policy.setKey(key); @@ -230,9 +239,7 @@ const { key, expiredHours } = parsed; const expires = expiredHours ? expiredHours * 60 * 60 : 30 * 60; // expires is in seconds; defaults to 30 minutes - return await this.externalClient.presignedGetObject(this.name, key, expires, { - 'Content-Disposition': `attachment; filename="${path.basename(key)}"` - }); + return await this.externalClient.presignedGetObject(this.name, key, expires); } async createPreviewUrl(params: createPreviewUrlParams) { @@ -241,8 +248,6 @@ const { key, expiredHours } = parsed; const expires = expiredHours ?
expiredHours * 60 * 60 : 30 * 60; // expires is in seconds; defaults to 30 minutes - return await this.client.presignedGetObject(this.name, key, expires, { - 'Content-Disposition': `attachment; filename="${path.basename(key)}"` - }); + return await this.client.presignedGetObject(this.name, key, expires); } } diff --git a/packages/service/common/s3/constants.ts b/packages/service/common/s3/constants.ts index c6ca270bc..cc611596d 100644 --- a/packages/service/common/s3/constants.ts +++ b/packages/service/common/s3/constants.ts @@ -35,6 +35,7 @@ accessKey: process.env.S3_ACCESS_KEY || 'minioadmin', secretKey: process.env.S3_SECRET_KEY || 'minioadmin', port: process.env.S3_PORT ? parseInt(process.env.S3_PORT) : 9000, + pathStyle: process.env.S3_PATH_STYLE === 'false' ? false : true, transportAgent: process.env.HTTP_PROXY ? new HttpProxyAgent(process.env.HTTP_PROXY) : process.env.HTTPS_PROXY diff --git a/packages/service/common/s3/sources/avatar.ts b/packages/service/common/s3/sources/avatar.ts index 6aa15485e..bfa179e32 100644 --- a/packages/service/common/s3/sources/avatar.ts +++ b/packages/service/common/s3/sources/avatar.ts @@ -3,6 +3,7 @@ import { MongoS3TTL } from '../schema'; import { S3PublicBucket } from '../buckets/public'; import { imageBaseUrl } from '@fastgpt/global/common/file/image/constants'; import type { ClientSession } from 'mongoose'; +import { getFileS3Key } from '../utils'; class S3AvatarSource { private bucket: S3PublicBucket; @@ -29,8 +30,10 @@ teamId: string; autoExpired?: boolean; }) { + const { fileKey } = getFileS3Key.avatar({ teamId, filename }); + return this.bucket.createPostPresignedUrl( - { filename, teamId, source: S3Sources.avatar }, + { filename, rawKey: fileKey }, { expiredHours: autoExpired ?
1 : undefined, // 1 hour maxFileSize: 5 // 5MB diff --git a/packages/service/common/s3/sources/dataset/index.ts b/packages/service/common/s3/sources/dataset/index.ts index ea446ef54..e8367b769 100644 --- a/packages/service/common/s3/sources/dataset/index.ts +++ b/packages/service/common/s3/sources/dataset/index.ts @@ -2,6 +2,8 @@ import { S3Sources } from '../../type'; import { S3PrivateBucket } from '../../buckets/private'; import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools'; import { + type AddRawTextBufferParams, + AddRawTextBufferParamsSchema, type CreateGetDatasetFileURLParams, CreateGetDatasetFileURLParamsSchema, type CreateUploadDatasetFileParams, @@ -10,18 +12,20 @@ import { DeleteDatasetFilesByPrefixParamsSchema, type GetDatasetFileContentParams, GetDatasetFileContentParamsSchema, - type UploadDatasetFileByBufferParams, - UploadDatasetFileByBufferParamsSchema + type GetRawTextBufferParams, + type UploadParams, + UploadParamsSchema } from './type'; import { MongoS3TTL } from '../../schema'; import { addHours, addMinutes } from 'date-fns'; import { addLog } from '../../../system/log'; import { detectFileEncoding } from '@fastgpt/global/common/file/tools'; import { readS3FileContentByBuffer } from '../../../file/read/utils'; -import { addRawTextBuffer, getRawTextBuffer } from '../../../buffer/rawText/controller'; import path from 'node:path'; import { Mimes } from '../../constants'; import { getFileS3Key, truncateFilename } from '../../utils'; +import { createHash } from 'node:crypto'; +import { S3Error } from 'minio'; export class S3DatasetSource { public bucket: S3PrivateBucket; @@ -61,8 +65,8 @@ export class S3DatasetSource { * e.g. delete the images extracted from a parsed document by that document's prefix **/ deleteDatasetFilesByPrefix(params: DeleteDatasetFilesByPrefixParams) { - const { datasetId, rawPrefix } = DeleteDatasetFilesByPrefixParamsSchema.parse(params); - const prefix = rawPrefix || [S3Sources.dataset, datasetId].filter(Boolean).join('/'); + const { datasetId } =
DeleteDatasetFilesByPrefixParamsSchema.parse(params); + const prefix = [S3Sources.dataset, datasetId].filter(Boolean).join('/'); return this.bucket.addDeleteJob({ prefix }); } @@ -83,7 +87,14 @@ // Get file stat getDatasetFileStat(key: string) { - return this.bucket.statObject(key); + try { + return this.bucket.statObject(key); + } catch (error) { + if (error instanceof S3Error && error.message === 'Not Found') { + return null; + } + return Promise.reject(error); + } } // Get file metadata @@ -117,12 +128,11 @@ const { fileId, teamId, tmbId, customPdfParse, getFormatText, usageId } = GetDatasetFileContentParamsSchema.parse(params); - const bufferId = `${fileId}-${customPdfParse}`; - const fileBuffer = await getRawTextBuffer(bufferId); - if (fileBuffer) { + const rawTextBuffer = await this.getRawTextBuffer({ customPdfParse, sourceId: fileId }); + if (rawTextBuffer) { return { - rawText: fileBuffer.text, - filename: fileBuffer.sourceName + rawText: rawTextBuffer.text, + filename: rawTextBuffer.filename }; } @@ -154,11 +164,11 @@ } }); - addRawTextBuffer({ - sourceId: bufferId, + this.addRawTextBuffer({ + sourceId: fileId, sourceName: filename, text: rawText, - expiredTime: addMinutes(new Date(), 20) + customPdfParse }); return { @@ -168,25 +178,85 @@ } // Upload a file from a Buffer - async uploadDatasetFileByBuffer(params: UploadDatasetFileByBufferParams): Promise<string> { - const { datasetId, buffer, filename } = UploadDatasetFileByBufferParamsSchema.parse(params); + async upload(params: UploadParams): Promise<string> { + const { datasetId, filename, ...file } = UploadParamsSchema.parse(params); - // Truncate the filename to avoid an overly long S3 key + // Truncate the filename to avoid an overly long S3 key const truncatedFilename = truncateFilename(filename); const { fileKey: key } = getFileS3Key.dataset({ datasetId, filename: truncatedFilename }); - await this.bucket.putObject(key, buffer, buffer.length, { - 'content-type':
Mimes[path.extname(truncatedFilename) as keyof typeof Mimes], - 'upload-time': new Date().toISOString(), - 'origin-filename': encodeURIComponent(truncatedFilename) - }); + + const { stream, size } = (() => { + if ('buffer' in file) { + return { + stream: file.buffer, + size: file.buffer.length + }; + } + return { + stream: file.stream, + size: file.size + }; + })(); + await MongoS3TTL.create({ minioKey: key, bucketName: this.bucket.name, expiredTime: addHours(new Date(), 3) }); + + await this.bucket.putObject(key, stream, size, { + 'content-type': Mimes[path.extname(truncatedFilename) as keyof typeof Mimes], + 'upload-time': new Date().toISOString(), + 'origin-filename': encodeURIComponent(truncatedFilename) + }); + return key; } + + async addRawTextBuffer(params: AddRawTextBufferParams) { + const { sourceId, sourceName, text, customPdfParse } = + AddRawTextBufferParamsSchema.parse(params); + + // Since a key uniquely maps to one object, there is no need to hash the file content; hashing the key is enough + const hash = createHash('md5').update(sourceId).digest('hex'); + const key = getFileS3Key.rawText({ hash, customPdfParse }); + + await MongoS3TTL.create({ + minioKey: key, + bucketName: this.bucket.name, + expiredTime: addMinutes(new Date(), 20) + }); + + const buffer = Buffer.from(text); + await this.bucket.putObject(key, buffer, buffer.length, { + 'content-type': 'text/plain', + 'origin-filename': encodeURIComponent(sourceName), + 'upload-time': new Date().toISOString() + }); + + return key; + } + + async getRawTextBuffer(params: GetRawTextBufferParams) { + const { customPdfParse, sourceId } = params; + + const hash = createHash('md5').update(sourceId).digest('hex'); + const key = getFileS3Key.rawText({ hash, customPdfParse }); + + if (!(await this.bucket.isObjectExists(key))) return null; + + const [stream, metadata] = await Promise.all([ + this.bucket.getObject(key), + this.getFileMetadata(key) + ]); + + const buffer = await this.bucket.fileStreamToBuffer(stream); + + return { + text:
buffer.toString('utf-8'), + filename: metadata.filename + }; + } } export function getS3DatasetSource() { diff --git a/packages/service/common/s3/sources/dataset/type.ts b/packages/service/common/s3/sources/dataset/type.ts index 4f25aac03..79350d896 100644 --- a/packages/service/common/s3/sources/dataset/type.ts +++ b/packages/service/common/s3/sources/dataset/type.ts @@ -1,4 +1,5 @@ import { ObjectIdSchema } from '@fastgpt/global/common/type/mongo'; +import { ReadStream } from 'fs'; import { z } from 'zod'; export const CreateUploadDatasetFileParamsSchema = z.object({ @@ -15,8 +16,7 @@ export const CreateGetDatasetFileURLParamsSchema = z.object({ }); export type CreateGetDatasetFileURLParams = z.infer<typeof CreateGetDatasetFileURLParamsSchema>; export const DeleteDatasetFilesByPrefixParamsSchema = z.object({ - datasetId: ObjectIdSchema.optional(), - rawPrefix: z.string().nonempty().optional() + datasetId: ObjectIdSchema.optional() }); export type DeleteDatasetFilesByPrefixParams = z.infer< typeof DeleteDatasetFilesByPrefixParamsSchema >; @@ -44,9 +44,27 @@ export const ParsedFileContentS3KeyParamsSchema = z.object({ }); export type ParsedFileContentS3KeyParams = z.infer<typeof ParsedFileContentS3KeyParamsSchema>; -export const UploadDatasetFileByBufferParamsSchema = z.object({ - datasetId: ObjectIdSchema, - buffer: z.instanceof(Buffer), - filename: z.string().nonempty() +export const UploadParamsSchema = z.union([ + z.object({ + datasetId: ObjectIdSchema, + filename: z.string().nonempty(), + buffer: z.instanceof(Buffer) + }), + + z.object({ + datasetId: ObjectIdSchema, + filename: z.string().nonempty(), + stream: z.instanceof(ReadStream), + size: z.int().positive().optional() + }) +]); +export type UploadParams = z.input<typeof UploadParamsSchema>; + +export const AddRawTextBufferParamsSchema = z.object({ + customPdfParse: z.boolean().optional(), + sourceId: z.string().nonempty(), + sourceName: z.string().nonempty(), + text: z.string() }); -export type UploadDatasetFileByBufferParams = z.infer<typeof UploadDatasetFileByBufferParamsSchema>; +export type AddRawTextBufferParams = z.input<typeof AddRawTextBufferParamsSchema>; +export type GetRawTextBufferParams =
Pick<AddRawTextBufferParams, 'customPdfParse' | 'sourceId'>; diff --git a/packages/service/common/s3/type.ts b/packages/service/common/s3/type.ts index 3d3f34975..6d4af6398 100644 --- a/packages/service/common/s3/type.ts +++ b/packages/service/common/s3/type.ts @@ -17,25 +17,15 @@ export type ExtensionType = keyof typeof Mimes; export type S3OptionsType = typeof defaultS3Options; -export const S3SourcesSchema = z.enum(['avatar', 'chat', 'dataset', 'temp']); +export const S3SourcesSchema = z.enum(['avatar', 'chat', 'dataset', 'temp', 'rawText']); export const S3Sources = S3SourcesSchema.enum; export type S3SourceType = z.infer<typeof S3SourcesSchema>; -export const CreatePostPresignedUrlParamsSchema = z.union([ - // Option 1: Only rawKey - z.object({ - filename: z.string().min(1), - rawKey: z.string().min(1), - metadata: z.record(z.string(), z.string()).optional() - }), - // Option 2: filename with optional source and teamId - z.object({ - filename: z.string().min(1), - source: S3SourcesSchema.optional(), - teamId: z.string().length(16).optional(), - metadata: z.record(z.string(), z.string()).optional() - }) -]); +export const CreatePostPresignedUrlParamsSchema = z.object({ + filename: z.string().min(1), + rawKey: z.string().min(1), + metadata: z.record(z.string(), z.string()).optional() +}); export type CreatePostPresignedUrlParams = z.infer<typeof CreatePostPresignedUrlParamsSchema>; export const CreatePostPresignedUrlOptionsSchema = z.object({ diff --git a/packages/service/common/s3/utils.ts b/packages/service/common/s3/utils.ts index ecb706f9c..cdc68b766 100644 --- a/packages/service/common/s3/utils.ts +++ b/packages/service/common/s3/utils.ts @@ -11,6 +11,7 @@ import { getNanoid } from '@fastgpt/global/common/string/tools'; import path from 'node:path'; import type { ParsedFileContentS3KeyParams } from './sources/dataset/type'; import { EndpointUrl } from '@fastgpt/global/common/file/constants'; +import type { NextApiRequest } from 'next'; // Maximum S3 filename length export const S3_FILENAME_MAX_LENGTH = 50; @@ -173,6 +174,17 @@ }; }, + avatar: ({ teamId, filename
}: { teamId: string; filename?: string }) => { + const { formatedFilename, extension } = getFormatedFilename(filename); + return { + fileKey: [ + S3Sources.avatar, + teamId, + `${formatedFilename}${extension ? `.${extension}` : ''}` + ].join('/') + }; + }, + // Key for the images parsed out of files uploaded in chat chat: ({ appId, @@ -215,6 +227,10 @@ fileKey: key, fileParsedPrefix: prefix }; + }, + + rawText: ({ hash, customPdfParse }: { hash: string; customPdfParse?: boolean }) => { + return [S3Sources.rawText, `${hash}${customPdfParse ? '-true' : ''}`].join('/'); } }; @@ -230,3 +246,130 @@ ): key is `${T}/${string}` { return typeof key === 'string' && key.startsWith(`${S3Sources[source]}/`); } + +// export const multer = { +// _storage: multer.diskStorage({ +// filename: (_, file, cb) => { +// if (!file?.originalname) { +// cb(new Error('File not found'), ''); +// } else { +// const ext = path.extname(decodeURIComponent(file.originalname)); +// cb(null, `${getNanoid()}${ext}`); +// } +// } +// }), + +// singleStore(maxFileSize: number = 500) { +// const fileSize = maxFileSize * 1024 * 1024; + +// return multer({ +// limits: { +// fileSize +// }, +// preservePath: true, +// storage: this._storage +// }).single('file'); +// }, + +// multipleStore(maxFileSize: number = 500) { +// const fileSize = maxFileSize * 1024 * 1024; + +// return multer({ +// limits: { +// fileSize +// }, +// preservePath: true, +// storage: this._storage +// }).array('file', global.feConfigs?.uploadFileMaxSize); +// }, + +// resolveFormData({ request, maxFileSize }: { request: NextApiRequest; maxFileSize?: number }) { +// return new Promise<{ +// data: Record; +// fileMetadata: Express.Multer.File; +// getBuffer: () => Buffer; +// getReadStream: () => fs.ReadStream; +// }>((resolve, reject) => { +// const handler = this.singleStore(maxFileSize); + +// // @ts-expect-error it can accept a NextApiRequest +// handler(request, null, (error) => { +// if (error) { +//
return reject(error); +// } + +// // @ts-expect-error `file` will be injected by multer +// const file = request.file as Express.Multer.File; + +// if (!file) { +// return reject(new Error('File not found')); +// } + +// const data = (() => { +// if (!request.body?.data) return {}; +// try { +// return JSON.parse(request.body.data); +// } catch { +// return {}; +// } +// })(); + +// resolve({ +// data, +// fileMetadata: file, +// getBuffer: () => fs.readFileSync(file.path), +// getReadStream: () => fs.createReadStream(file.path) +// }); +// }); +// }); +// }, + +// resolveMultipleFormData({ +// request, +// maxFileSize +// }: { +// request: NextApiRequest; +// maxFileSize?: number; +// }) { +// return new Promise<{ +// data: Record; +// fileMetadata: Array; +// }>((resolve, reject) => { +// const handler = this.multipleStore(maxFileSize); + +// // @ts-expect-error it can accept a NextApiRequest +// handler(request, null, (error) => { +// if (error) { +// return reject(error); +// } + +// // @ts-expect-error `files` will be injected by multer +// const files = request.files as Array; + +// if (!files || files.length === 0) { +// return reject(new Error('File not found')); +// } + +// const data = (() => { +// if (!request.body?.data) return {}; +// try { +// return JSON.parse(request.body.data); +// } catch { +// return {}; +// } +// })(); + +// resolve({ +// data, +// fileMetadata: files +// }); +// }); +// }); +// }, + +// clearDiskTempFiles(filepaths: string[]) { +// for (const filepath of filepaths) { +// fs.unlink(filepath, (_) => {}); +// } +// } +// }; diff --git a/packages/service/common/system/countLimit/const.ts b/packages/service/common/system/countLimit/const.ts new file mode 100644 index 000000000..6ca48dae0 --- /dev/null +++ b/packages/service/common/system/countLimit/const.ts @@ -0,0 +1,29 @@ +import z from 'zod'; +import { CountLimitTypeEnum } from './type'; + +export const CountLimitConfigType = z.record( + CountLimitTypeEnum, + z.object({ maxCount: 
z.number() }) ); + +// Only n notifications will ever be sent; to send again automatically, the record must be cleared manually +export const CountLimitConfig = { + [CountLimitTypeEnum.enum['notice:30PercentPoints']]: { + maxCount: 3 + }, + [CountLimitTypeEnum.enum['notice:10PercentPoints']]: { + maxCount: 5 + }, + [CountLimitTypeEnum.enum['notice:LackOfPoints']]: { + maxCount: 5 + }, + [CountLimitTypeEnum.enum['notice:30PercentDatasetIndexes']]: { + maxCount: 3 + }, + [CountLimitTypeEnum.enum['notice:10PercentDatasetIndexes']]: { + maxCount: 5 + }, + [CountLimitTypeEnum.enum['notice:NoDatasetIndexes']]: { + maxCount: 5 + } +} satisfies z.infer<typeof CountLimitConfigType>; diff --git a/packages/service/common/system/countLimit/controller.ts b/packages/service/common/system/countLimit/controller.ts new file mode 100644 index 000000000..3bdb50f7a --- /dev/null +++ b/packages/service/common/system/countLimit/controller.ts @@ -0,0 +1,81 @@ +import type z from 'zod'; +import type { CountLimitTypeEnum } from './type'; +import { CountLimitConfig } from './const'; +import { MongoCountLimit } from './schema'; +import { mongoSessionRun } from '../../mongo/sessionRun'; + +/** + * Update the count limit for a specific type and key. + * @param param0 - The type, key, and update value. + * @returns The updated count limit information.
+ */ +export const updateCountLimit = async ({ + type, + key, + update +}: { + type: z.infer<typeof CountLimitTypeEnum>; + key: string; + update: number; +}) => + mongoSessionRun(async (session) => { + const maxCount = CountLimitConfig[type].maxCount; + const countLimit = await MongoCountLimit.findOne( + { + type, + key + }, + undefined, + { + session + } + ).lean(); + + if (!countLimit) { + // does not exist, create a new one + await MongoCountLimit.create( + [ + { + type, + key, + count: update // 0 + update + } + ], + { + session + } + ); + return { + maxCount, + nowCount: update, + remain: maxCount - update + }; + } + + if (countLimit.count >= maxCount) { + return Promise.reject(`Max Count Reached, type: ${type}, key: ${key}`); + } + + await MongoCountLimit.updateOne( + { + type, + key + }, + { + $inc: { count: update } + }, + { session } + ); + + return { + maxCount, + nowCount: countLimit.count + update, + remain: maxCount - (countLimit.count + update) + }; + }); + +/** Clean all count limit records for the given team (records are keyed by teamId) */ +export const cleanCountLimit = async ({ teamId }: { teamId: string }) => + MongoCountLimit.deleteMany({ + key: teamId + }); diff --git a/packages/service/common/system/countLimit/schema.ts b/packages/service/common/system/countLimit/schema.ts new file mode 100644 index 000000000..13e762afd --- /dev/null +++ b/packages/service/common/system/countLimit/schema.ts @@ -0,0 +1,34 @@ +import { connectionMongo, getMongoModel } from '../../../common/mongo'; +import type { CountLimitType } from './type'; + +const { Schema } = connectionMongo; + +const collectionName = 'system_count_limits'; +const CountLimitSchema = new Schema<CountLimitType>({ + key: { + type: String, + required: true + }, + type: { + type: String, + required: true + }, + count: { + type: Number, + required: true, + default: 0 + }, + createTime: { + type: Date, + default: () => new Date() + } +}); + +try { + CountLimitSchema.index({ type: 1, key: 1 }, { unique: true }); + CountLimitSchema.index({ createTime: 1 }, {
expireAfterSeconds: 60 * 60 * 24 * 30 }); // ttl 30天 +} catch (error) { + console.log(error); +} + +export const MongoCountLimit = getMongoModel(collectionName, CountLimitSchema); diff --git a/packages/service/common/system/countLimit/type.ts b/packages/service/common/system/countLimit/type.ts new file mode 100644 index 000000000..6412b9f81 --- /dev/null +++ b/packages/service/common/system/countLimit/type.ts @@ -0,0 +1,19 @@ +import z from 'zod'; + +export const CountLimitTypeEnum = z.enum([ + 'notice:30PercentPoints', + 'notice:10PercentPoints', + 'notice:LackOfPoints', + 'notice:30PercentDatasetIndexes', + 'notice:10PercentDatasetIndexes', + 'notice:NoDatasetIndexes' +]); + +export const CountLimitType = z.object({ + type: CountLimitTypeEnum, + key: z.string(), + count: z.number() +}); + +export type CountLimitType = z.infer; +export type CountLimitTypeEnum = z.infer; diff --git a/packages/service/common/system/timerLock/constants.ts b/packages/service/common/system/timerLock/constants.ts index 2768d0085..600b249ba 100644 --- a/packages/service/common/system/timerLock/constants.ts +++ b/packages/service/common/system/timerLock/constants.ts @@ -9,7 +9,8 @@ export enum TimerIdEnum { clearExpiredRawTextBuffer = 'clearExpiredRawTextBuffer', clearExpiredDatasetImage = 'clearExpiredDatasetImage', - clearExpiredMinioFiles = 'clearExpiredMinioFiles' + clearExpiredMinioFiles = 'clearExpiredMinioFiles', + recordTeamQPM = 'recordTeamQPM' } export enum LockNotificationEnum { diff --git a/packages/service/common/system/timerLock/schema.ts b/packages/service/common/system/timerLock/schema.ts index a6f487ae6..2d30b65a1 100644 --- a/packages/service/common/system/timerLock/schema.ts +++ b/packages/service/common/system/timerLock/schema.ts @@ -2,7 +2,7 @@ import { connectionMongo, getMongoModel } from '../../mongo'; const { Schema } = connectionMongo; import { type TimerLockSchemaType } from './type.d'; -export const collectionName = 'systemtimerlocks'; +export const 
collectionName = 'system_timer_locks'; const TimerLockSchema = new Schema({ timerId: { diff --git a/packages/service/common/system/timerLock/utils.ts b/packages/service/common/system/timerLock/utils.ts index 280953551..b3f1aeabe 100644 --- a/packages/service/common/system/timerLock/utils.ts +++ b/packages/service/common/system/timerLock/utils.ts @@ -2,7 +2,7 @@ import { type ClientSession } from '../../mongo'; import { MongoTimerLock } from './schema'; import { addMinutes } from 'date-fns'; -/* +/* 利用唯一健,使得同一时间只有一个任务在执行,后创建的锁,会因唯一健创建失败,从而无法继续执行任务 */ export const checkTimerLock = async ({ @@ -30,3 +30,19 @@ export const checkTimerLock = async ({ return false; } }; + +export const cleanTimerLock = async ({ + teamId, + session +}: { + teamId: string; + session?: ClientSession; +}) => { + // Match timerId pattern where lockId (last segment) equals teamId + await MongoTimerLock.deleteMany( + { + timerId: new RegExp(`--${teamId}$`) + }, + { session } + ); +}; diff --git a/packages/service/core/ai/functions/queryExtension.ts b/packages/service/core/ai/functions/queryExtension.ts index 165b53110..b70682a3f 100644 --- a/packages/service/core/ai/functions/queryExtension.ts +++ b/packages/service/core/ai/functions/queryExtension.ts @@ -6,16 +6,23 @@ import { addLog } from '../../../common/system/log'; import { filterGPTMessageByMaxContext } from '../llm/utils'; import json5 from 'json5'; import { createLLMResponse } from '../llm/request'; +import { useTextCosine } from '../hooks/useTextCosine'; -/* - query extension - 问题扩展 - 可以根据上下文,消除指代性问题以及扩展问题,利于检索。 +/* + Query Extension - Semantic Search Enhancement + This module can eliminate referential ambiguity and expand queries based on context to improve retrieval. 
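The `checkTimerLock` pattern above hinges on a unique index: the first worker to insert a lock document for a given `timerId` wins, and concurrent inserts fail on the duplicate key. The semantics can be modeled in memory in a few lines — an illustrative sketch only; `TimerLockModel` and `tryAcquire` are not names from the codebase:

```typescript
// In-memory model of the unique-key timer lock: the first caller to
// register a given timerId wins; later callers fail until the lock expires.
type LockRecord = { expireAt: number };

class TimerLockModel {
  private locks = new Map<string, LockRecord>();

  // Mirrors checkTimerLock: returns true only if this caller acquired the lock.
  tryAcquire(timerId: string, lockMinutes: number, now = Date.now()): boolean {
    const existing = this.locks.get(timerId);
    if (existing && existing.expireAt > now) {
      return false; // unique-key conflict: another task already holds the lock
    }
    this.locks.set(timerId, { expireAt: now + lockMinutes * 60_000 });
    return true;
  }
}
```

The real implementation gets the same exclusivity for free from Mongo's unique index: the losing insert throws, `checkTimerLock` catches it and returns `false`.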
+ Submodular Optimization Mode: Generate multiple candidate queries, then use submodular algorithm to select the optimal query combination */ - const title = global.feConfigs?.systemTitle || 'FastAI'; const defaultPrompt = `## 你的任务 -你作为一个向量检索助手,你的任务是结合历史记录,从不同角度,为“原问题”生成个不同版本的“检索词”,从而提高向量检索的语义丰富度,提高向量检索的精度。 -生成的问题要求指向对象清晰明确,并与“原问题语言相同”。 +你作为一个向量检索助手,你的任务是结合历史记录,为"原问题"生成{{count}}个不同版本的"检索词"。这些检索词应该从不同角度探索主题,以提高向量检索的语义丰富度和精度。 + +## 要求 +1. 每个检索词必须与原问题相关 +2. 检索词应该探索不同方面(例如:原因、影响、解决方案、示例、对比等) +3. 避免检索词之间的冗余 +4. 保持检索词简洁且可搜索 +5. 生成的问题要求指向对象清晰明确,并与"原问题语言相同" ## 参考示例 @@ -24,7 +31,7 @@ const defaultPrompt = `## 你的任务 null """ 原问题: 介绍下剧情。 -检索词: ["介绍下故事的背景。","故事的主题是什么?","介绍下故事的主要人物。"] +检索词: ["介绍下故事的背景。","故事的主题是什么?","介绍下故事的主要人物。","故事的转折点在哪里?","故事的结局如何?"] ---------------- 历史记录: """ @@ -32,7 +39,7 @@ user: 对话背景。 assistant: 当前对话是关于 Nginx 的介绍和使用等。 """ 原问题: 怎么下载 -检索词: ["Nginx 如何下载?","下载 Nginx 需要什么条件?","有哪些渠道可以下载 Nginx?"] +检索词: ["Nginx 如何下载?","下载 Nginx 需要什么条件?","有哪些渠道可以下载 Nginx?","Nginx 各版本的下载方式有什么区别?","如何选择合适的 Nginx 版本下载?"] ---------------- 历史记录: """ @@ -42,7 +49,7 @@ user: 报错 "no connection" assistant: 报错"no connection"可能是因为…… """ 原问题: 怎么解决 -检索词: ["Nginx报错"no connection"如何解决?","造成'no connection'报错的原因。","Nginx提示'no connection',要怎么办?"] +检索词: ["Nginx报错'no connection'如何解决?","造成'no connection'报错的原因。","Nginx提示'no connection',要怎么办?","'no connection'错误的常见解决步骤。","如何预防 Nginx 'no connection' 错误?"] ---------------- 历史记录: """ @@ -50,7 +57,7 @@ user: How long is the maternity leave? assistant: The number of days of maternity leave depends on the city in which the employee is located. Please provide your city so that I can answer your questions. 
""" 原问题: ShenYang -检索词: ["How many days is maternity leave in Shenyang?","Shenyang's maternity leave policy.","The standard of maternity leave in Shenyang."] +检索词: ["How many days is maternity leave in Shenyang?","Shenyang's maternity leave policy.","The standard of maternity leave in Shenyang.","What benefits are included in Shenyang's maternity leave?","How to apply for maternity leave in Shenyang?"] ---------------- 历史记录: """ @@ -58,7 +65,7 @@ user: 作者是谁? assistant: ${title} 的作者是 labring。 """ 原问题: Tell me about him -检索词: ["Introduce labring, the author of ${title}." ," Background information on author labring." "," Why does labring do ${title}?"] +检索词: ["Introduce labring, the author of ${title}." ,"Background information on author labring.","Why does labring do ${title}?","What other projects has labring worked on?","How did labring start ${title}?"] ---------------- 历史记录: """ @@ -74,7 +81,7 @@ user: ${title} 如何收费? assistant: ${title} 收费可以参考…… """ 原问题: 你知道 laf 么? -检索词: ["laf 的官网地址是多少?","laf 的使用教程。","laf 有什么特点和优势。"] +检索词: ["laf 的官网地址是多少?","laf 的使用教程。","laf 有什么特点和优势。","laf 的主要功能是什么?","laf 与其他类似产品的对比。"] ---------------- 历史记录: """ @@ -100,6 +107,7 @@ assistant: Laf 是一个云函数开发平台。 1. 输出格式为 JSON 数组,数组中每个元素为字符串。无需对输出进行任何解释。 2. 输出语言与原问题相同。原问题为中文则输出中文;原问题为英文则输出英文。 +3. 确保生成恰好 {{count}} 个检索词。 ## 开始任务 @@ -114,18 +122,24 @@ export const queryExtension = async ({ chatBg, query, histories = [], - model + llmModel, + embeddingModel, + generateCount = 10 // 生成优化问题集的数量,默认为10个 }: { chatBg?: string; query: string; histories: ChatItemType[]; - model: string; + llmModel: string; + embeddingModel: string; + generateCount?: number; }): Promise<{ rawQuery: string; extensionQueries: string[]; - model: string; + llmModel: string; + embeddingModel: string; inputTokens: number; outputTokens: number; + embeddingTokens: number; }> => { const systemFewShot = chatBg ? `user: 对话背景。 @@ -133,7 +147,8 @@ assistant: ${chatBg} ` : ''; - const modelData = getLLMModel(model); + // 1. 
Request model + const modelData = getLLMModel(llmModel); const filterHistories = await filterGPTMessageByMaxContext({ messages: chats2GPTMessages({ messages: histories, reserveId: false }), maxContext: modelData.maxContext - 1000 @@ -160,7 +175,8 @@ assistant: ${chatBg} role: 'user', content: replaceVariable(defaultPrompt, { query: `${query}`, - histories: concatFewShot || 'null' + histories: concatFewShot || 'null', + count: generateCount.toString() }) } ] as any; @@ -181,12 +197,15 @@ assistant: ${chatBg} return { rawQuery: query, extensionQueries: [], - model, + llmModel: modelData.model, + embeddingModel, inputTokens: inputTokens, - outputTokens: outputTokens + outputTokens: outputTokens, + embeddingTokens: 0 }; } + // 2. Parse answer const start = answer.indexOf('['); const end = answer.lastIndexOf(']'); if (start === -1 || end === -1) { @@ -196,9 +215,11 @@ assistant: ${chatBg} return { rawQuery: query, extensionQueries: [], - model, + llmModel: modelData.model, + embeddingModel, inputTokens: inputTokens, - outputTokens: outputTokens + outputTokens: outputTokens, + embeddingTokens: 0 }; } @@ -211,23 +232,51 @@ assistant: ${chatBg} try { const queries = json5.parse(jsonStr) as string[]; + if (!Array.isArray(queries) || queries.length === 0) { + return { + rawQuery: query, + extensionQueries: [], + llmModel: modelData.model, + embeddingModel, + inputTokens, + outputTokens, + embeddingTokens: 0 + }; + } + + // 3. 通过计算获取到最优的检索词 + const { lazyGreedyQuerySelection, embeddingModel: useEmbeddingModel } = useTextCosine({ + embeddingModel + }); + const { selectedData: selectedQueries, embeddingTokens } = await lazyGreedyQuerySelection({ + originalText: query, + candidates: queries, + k: Math.min(3, queries.length), // 至多 3 个 + alpha: 0.3 + }); + return { rawQuery: query, - extensionQueries: (Array.isArray(queries) ? 
queries : []).slice(0, 5),
-      model,
+      extensionQueries: selectedQueries,
+      llmModel: modelData.model,
+      embeddingModel: useEmbeddingModel,
       inputTokens,
-      outputTokens
+      outputTokens,
+      embeddingTokens
     };
   } catch (error) {
-    addLog.warn('Query extension failed, not a valid JSON', {
+    addLog.warn('Query extension failed', {
+      error,
       answer
     });
     return {
       rawQuery: query,
       extensionQueries: [],
-      model,
+      llmModel: modelData.model,
+      embeddingModel,
       inputTokens,
-      outputTokens
+      outputTokens,
+      embeddingTokens: 0
     };
   }
 };
diff --git a/packages/service/core/ai/hooks/useTextCosine.ts b/packages/service/core/ai/hooks/useTextCosine.ts
new file mode 100644
index 000000000..6e73f8157
--- /dev/null
+++ b/packages/service/core/ai/hooks/useTextCosine.ts
@@ -0,0 +1,154 @@
+/*
+  根据文本的余弦相似度,获取最大边际收益的检索词。
+  Reference: https://github.com/jina-ai/submodular-optimization
+*/
+
+import { getVectorsByText } from '../embedding';
+import { getEmbeddingModel } from '../model';
+
+class PriorityQueue<T> {
+  private heap: Array<{ item: T; priority: number }> = [];
+
+  enqueue(item: T, priority: number): void {
+    this.heap.push({ item, priority });
+    this.heap.sort((a, b) => b.priority - a.priority);
+  }
+
+  dequeue(): T | undefined {
+    return this.heap.shift()?.item;
+  }
+
+  isEmpty(): boolean {
+    return this.heap.length === 0;
+  }
+
+  size(): number {
+    return this.heap.length;
+  }
+}
+export const useTextCosine = ({ embeddingModel }: { embeddingModel: string }) => {
+  const vectorModel = getEmbeddingModel(embeddingModel);
+  // Calculate marginal gain
+  const computeMarginalGain = (
+    candidateEmbedding: number[],
+    selectedEmbeddings: number[][],
+    originalEmbedding: number[],
+    alpha: number = 0.3
+  ): number => {
+    // Calculate cosine similarity
+    const cosineSimilarity = (a: number[], b: number[]): number => {
+      if (a.length !== b.length) {
+        throw new Error('Vectors must have the same length');
+      }
+
+      let dotProduct = 0;
+      let normA = 0;
+      let normB = 0;
+
+      for (let i = 0; i < a.length;
i++) { + dotProduct += a[i] * b[i]; + normA += a[i] * a[i]; + normB += b[i] * b[i]; + } + + if (normA === 0 || normB === 0) return 0; + return dotProduct / (Math.sqrt(normA) * Math.sqrt(normB)); + }; + + if (selectedEmbeddings.length === 0) { + return alpha * cosineSimilarity(originalEmbedding, candidateEmbedding); + } + + let maxSimilarity = 0; + for (const selectedEmbedding of selectedEmbeddings) { + const similarity = cosineSimilarity(candidateEmbedding, selectedEmbedding); + maxSimilarity = Math.max(maxSimilarity, similarity); + } + + const relevance = alpha * cosineSimilarity(originalEmbedding, candidateEmbedding); + const diversity = 1 - maxSimilarity; + + return relevance + diversity; + }; + + // Lazy greedy query selection algorithm + const lazyGreedyQuerySelection = async ({ + originalText, + candidates, + k, + alpha = 0.3 + }: { + originalText: string; + candidates: string[]; // 候选文本 + k: number; + alpha?: number; + }) => { + const { tokens: embeddingTokens, vectors: embeddingVectors } = await getVectorsByText({ + model: vectorModel, + input: [originalText, ...candidates], + type: 'query' + }); + + const originalEmbedding = embeddingVectors[0]; + const candidateEmbeddings = embeddingVectors.slice(1); + + const n = candidates.length; + const selected: string[] = []; + const selectedEmbeddings: number[][] = []; + + // Initialize priority queue + const pq = new PriorityQueue<{ index: number; gain: number }>(); + + // Calculate initial marginal gain for all candidates + for (let i = 0; i < n; i++) { + const gain = computeMarginalGain( + candidateEmbeddings[i], + selectedEmbeddings, + originalEmbedding, + alpha + ); + pq.enqueue({ index: i, gain }, gain); + } + + // Greedy selection + for (let iteration = 0; iteration < k; iteration++) { + if (pq.isEmpty()) break; + + let bestCandidate: { index: number; gain: number } | undefined; + + // Find candidate with maximum marginal gain + while (!pq.isEmpty()) { + const candidate = pq.dequeue()!; + const currentGain = 
computeMarginalGain( + candidateEmbeddings[candidate.index], + selectedEmbeddings, + originalEmbedding, + alpha + ); + + if (currentGain >= candidate.gain) { + bestCandidate = { index: candidate.index, gain: currentGain }; + break; + } else { + // Create new object with updated gain to avoid infinite loop + pq.enqueue({ index: candidate.index, gain: currentGain }, currentGain); + } + } + + if (bestCandidate) { + selected.push(candidates[bestCandidate.index]); + selectedEmbeddings.push(candidateEmbeddings[bestCandidate.index]); + } + } + + return { + selectedData: selected, + embeddingTokens + }; + }; + + return { + lazyGreedyQuerySelection, + embeddingModel: vectorModel.model + }; +}; diff --git a/packages/service/core/app/controller.ts b/packages/service/core/app/controller.ts index bd8ec2238..b427ed38b 100644 --- a/packages/service/core/app/controller.ts +++ b/packages/service/core/app/controller.ts @@ -12,7 +12,6 @@ import { SystemToolSecretInputTypeEnum } from '@fastgpt/global/core/app/tool/sys import { type ClientSession } from '../../common/mongo'; import { MongoEvaluation } from './evaluation/evalSchema'; import { removeEvaluationJob } from './evaluation/mq'; -import { deleteChatFiles } from '../chat/controller'; import { MongoChatItem } from '../chat/chatItemSchema'; import { MongoChat } from '../chat/chatSchema'; import { MongoOutLink } from '../../support/outLink/schema'; @@ -224,7 +223,7 @@ export const onDelOneApp = async ({ // Delete chats for await (const app of apps) { const appId = String(app._id); - await deleteChatFiles({ appId }); + await getS3ChatSource().deleteChatFilesByPrefix({ appId }); await MongoChatItemResponse.deleteMany({ appId }); diff --git a/packages/service/core/chat/controller.ts b/packages/service/core/chat/controller.ts index c854e4c11..38062e95e 100644 --- a/packages/service/core/chat/controller.ts +++ b/packages/service/core/chat/controller.ts @@ -1,10 +1,6 @@ import type { ChatHistoryItemResType, ChatItemType } from 
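`computeMarginalGain` above scores a candidate as relevance (`alpha * cos(query, candidate)`) plus, once the selected set is non-empty, a diversity bonus (`1 - max cos(candidate, selected)`). A standalone re-implementation (for illustration — not the exported module) makes the arithmetic easy to check on toy unit vectors:

```typescript
// Standalone version of the scoring rule in computeMarginalGain.
const cosine = (a: number[], b: number[]): number => {
  let dot = 0;
  let na = 0;
  let nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / (Math.sqrt(na) * Math.sqrt(nb)) : 0;
};

const marginalGain = (
  candidate: number[],
  selected: number[][],
  query: number[],
  alpha = 0.3
): number => {
  const relevance = alpha * cosine(query, candidate);
  // With nothing selected yet, only relevance counts (the early-return branch).
  if (selected.length === 0) return relevance;
  const maxSim = Math.max(...selected.map((s) => cosine(candidate, s)));
  return relevance + (1 - maxSim);
};
```

A duplicate of an already-selected vector keeps only its relevance term (diversity 0), while an orthogonal candidate collects the full diversity bonus of 1 — this is what steers the greedy loop toward varied retrieval queries.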
'@fastgpt/global/core/chat/type'; import { MongoChatItem } from './chatItemSchema'; import { addLog } from '../../common/system/log'; -import { delFileByFileIdList, getGFSCollection } from '../../common/file/gridfs/controller'; -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; -import { MongoChat } from './chatSchema'; -import { UserError } from '@fastgpt/global/common/error/utils'; import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants'; import { ChatRoleEnum } from '@fastgpt/global/core/chat/constants'; import { MongoChatItemResponse } from './chatItemResponseSchema'; @@ -95,41 +91,3 @@ export const addCustomFeedbacks = async ({ addLog.error('addCustomFeedbacks error', error); } }; - -/* - Delete chat files - 1. ChatId: Delete one chat files - 2. AppId: Delete all the app's chat files -*/ -export const deleteChatFiles = async ({ - chatIdList, - appId -}: { - chatIdList?: string[]; - appId?: string; -}) => { - if (!appId && !chatIdList) - return Promise.reject(new UserError('appId or chatIdList is required')); - - const appChatIdList = await (async () => { - if (appId) { - const appChatIdList = await MongoChat.find({ appId }, { chatId: 1 }); - return appChatIdList.map((item) => String(item.chatId)); - } else if (chatIdList) { - return chatIdList; - } - return []; - })(); - - const collection = getGFSCollection(BucketNameEnum.chat); - const where = { - 'metadata.chatId': { $in: appChatIdList } - }; - - const files = await collection.find(where, { projection: { _id: 1 } }).toArray(); - - await delFileByFileIdList({ - bucketName: BucketNameEnum.chat, - fileIdList: files.map((item) => String(item._id)) - }); -}; diff --git a/packages/service/core/chat/saveChat.ts b/packages/service/core/chat/saveChat.ts index d5cf1948d..ada9ef0af 100644 --- a/packages/service/core/chat/saveChat.ts +++ b/packages/service/core/chat/saveChat.ts @@ -245,6 +245,7 @@ export async function saveChat(props: Props) { 
...chat?.metadata, ...metadata }; + const { welcomeText, variables: variableList } = getAppChatConfig({ chatConfig: appChatConfig, systemConfigNode: getGuideModule(nodes), diff --git a/packages/service/core/dataset/apiDataset/custom/api.ts b/packages/service/core/dataset/apiDataset/custom/api.ts index 1af2ffd81..a36534fa5 100644 --- a/packages/service/core/dataset/apiDataset/custom/api.ts +++ b/packages/service/core/dataset/apiDataset/custom/api.ts @@ -9,8 +9,7 @@ import { addLog } from '../../../../common/system/log'; import { readFileRawTextByUrl } from '../../read'; import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type'; import { type RequireOnlyOne } from '@fastgpt/global/common/type/utils'; -import { addRawTextBuffer, getRawTextBuffer } from '../../../../common/buffer/rawText/controller'; -import { addMinutes } from 'date-fns'; +import { getS3DatasetSource } from '../../../../common/s3/sources/dataset'; type ResponseDataType = { success: boolean; @@ -155,11 +154,14 @@ export const useApiDatasetRequest = ({ apiServer }: { apiServer: APIFileServer } } if (previewUrl) { // Get from buffer - const buffer = await getRawTextBuffer(previewUrl); - if (buffer) { + const rawTextBuffer = await getS3DatasetSource().getRawTextBuffer({ + sourceId: previewUrl, + customPdfParse + }); + if (rawTextBuffer) { return { title, - rawText: buffer.text + rawText: rawTextBuffer.text }; } @@ -173,11 +175,11 @@ export const useApiDatasetRequest = ({ apiServer }: { apiServer: APIFileServer } getFormatText: true }); - await addRawTextBuffer({ + getS3DatasetSource().addRawTextBuffer({ sourceId: previewUrl, sourceName: title || '', text: rawText, - expiredTime: addMinutes(new Date(), 30) + customPdfParse }); return { diff --git a/packages/service/core/dataset/collection/controller.ts b/packages/service/core/dataset/collection/controller.ts index 295400158..8a63ddfb8 100644 --- a/packages/service/core/dataset/collection/controller.ts +++ 
b/packages/service/core/dataset/collection/controller.ts @@ -12,8 +12,6 @@ import { MongoDatasetTraining } from '../training/schema'; import { MongoDatasetData } from '../data/schema'; import { delImgByRelatedId } from '../../../common/file/image/controller'; import { deleteDatasetDataVector } from '../../../common/vectorDB/controller'; -import { delFileByFileIdList } from '../../../common/file/gridfs/controller'; -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; import type { ClientSession } from '../../../common/mongo'; import { createOrGetCollectionTags } from './utils'; import { rawText2Chunks } from '../read'; @@ -33,9 +31,7 @@ import { getLLMMaxChunkSize } from '@fastgpt/global/core/dataset/training/utils'; import { DatasetDataIndexTypeEnum } from '@fastgpt/global/core/dataset/data/constants'; -import { clearCollectionImages } from '../image/utils'; -import { getS3DatasetSource, S3DatasetSource } from '../../../common/s3/sources/dataset'; -import path from 'node:path'; +import { getS3DatasetSource } from '../../../common/s3/sources/dataset'; import { removeS3TTL, isS3ObjectKey } from '../../../common/s3/utils'; export const createCollectionAndInsertData = async ({ @@ -326,18 +322,13 @@ export const delCollectionRelatedSource = async ({ if (!teamId) return Promise.reject('teamId is not exist'); - const fileIdList = collections.map((item) => item?.fileId || '').filter(Boolean); + // FIXME: 兼容旧解析图像删除 const relatedImageIds = collections .map((item) => item?.metadata?.relatedImgId || '') .filter(Boolean); // Delete files and images in parallel await Promise.all([ - // Delete files - delFileByFileIdList({ - bucketName: BucketNameEnum.dataset, - fileIdList - }), // Delete images delImgByRelatedId({ teamId, @@ -405,10 +396,8 @@ export async function delCollection({ datasetId: { $in: datasetIds }, collectionId: { $in: collectionIds } }), - // Delete dataset_images - clearCollectionImages(collectionIds), // Delete images if needed - ...(delImg + 
...(delImg // 兼容旧图像删除 ? [ delImgByRelatedId({ teamId, @@ -421,10 +410,9 @@ export async function delCollection({ // Delete files if needed ...(delFile ? [ - delFileByFileIdList({ - bucketName: BucketNameEnum.dataset, - fileIdList: collections.map((item) => item?.fileId || '').filter(Boolean) - }) + getS3DatasetSource().deleteDatasetFilesByKeys( + collections.map((item) => item?.fileId || '').filter(Boolean) + ) ] : []), // Delete vector data diff --git a/packages/service/core/dataset/controller.ts b/packages/service/core/dataset/controller.ts index b60e40a5e..beb0c52b0 100644 --- a/packages/service/core/dataset/controller.ts +++ b/packages/service/core/dataset/controller.ts @@ -9,11 +9,6 @@ import { deleteDatasetDataVector } from '../../common/vectorDB/controller'; import { MongoDatasetDataText } from './data/dataTextSchema'; import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset'; import { retryFn } from '@fastgpt/global/common/system/utils'; -import { clearDatasetImages } from './image/utils'; -import { MongoDatasetCollectionTags } from './tag/schema'; -import { removeDatasetSyncJobScheduler } from './datasetSync'; -import { mongoSessionRun } from '../../common/mongo/sessionRun'; -import { removeImageByPath } from '../../common/file/image/controller'; import { UserError } from '@fastgpt/global/common/error/utils'; import { getS3DatasetSource } from '../../common/s3/sources/dataset'; @@ -95,77 +90,39 @@ export async function delDatasetRelevantData({ '_id teamId datasetId fileId metadata' ).lean(); - await retryFn(async () => { - await Promise.all([ - // delete training data - MongoDatasetTraining.deleteMany({ - teamId, - datasetId: { $in: datasetIds } - }), - //Delete dataset_data_texts - MongoDatasetDataText.deleteMany({ - teamId, - datasetId: { $in: datasetIds } - }), - //delete dataset_datas - MongoDatasetData.deleteMany({ teamId, datasetId: { $in: datasetIds } }), - // Delete collection image and file - delCollectionRelatedSource({ 
collections }), - // Delete dataset Image - clearDatasetImages(datasetIds), - // Delete vector data - deleteDatasetDataVector({ teamId, datasetIds }) - ]); + // delete training data + await MongoDatasetTraining.deleteMany({ + teamId, + datasetId: { $in: datasetIds } }); + // Delete dataset_data_texts in batches by datasetId + for (const datasetId of datasetIds) { + await MongoDatasetDataText.deleteMany({ + teamId, + datasetId + }).maxTimeMS(300000); // Reduce timeout for single batch + } + // Delete dataset_datas in batches by datasetId + for (const datasetId of datasetIds) { + await MongoDatasetData.deleteMany({ + teamId, + datasetId + }).maxTimeMS(300000); + } + + await delCollectionRelatedSource({ collections }); + // Delete vector data + await deleteDatasetDataVector({ teamId, datasetIds }); + // delete collections await MongoDatasetCollection.deleteMany({ teamId, datasetId: { $in: datasetIds } }).session(session); + // Delete all dataset files for (const datasetId of datasetIds) { await getS3DatasetSource().deleteDatasetFilesByPrefix({ datasetId }); } } - -export const deleteDatasets = async ({ - teamId, - datasets -}: { - teamId: string; - datasets: DatasetSchemaType[]; -}) => { - const datasetIds = datasets.map((d) => d._id); - - // delete collection.tags - await MongoDatasetCollectionTags.deleteMany({ - teamId, - datasetId: { $in: datasetIds } - }); - - // Remove cron job - await Promise.all( - datasets.map((dataset) => { - return removeDatasetSyncJobScheduler(dataset._id); - }) - ); - - // delete all dataset.data and pg data - await mongoSessionRun(async (session) => { - // delete dataset data - await delDatasetRelevantData({ datasets, session }); - - // delete dataset - await MongoDataset.deleteMany( - { - _id: { $in: datasetIds } - }, - { session } - ); - - for await (const dataset of datasets) { - await removeImageByPath(dataset.avatar, session); - } - }); -}; diff --git a/packages/service/core/dataset/data/controller.ts 
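`delDatasetRelevantData` above replaces one large `$in` delete with one `deleteMany` per `datasetId`, each capped by `maxTimeMS`, so no single operation grows with the number of datasets. The underlying pattern — sequential per-id batches — can be sketched generically; the `deleteFn` callback here is a stand-in, not a real Mongo call:

```typescript
// Run one delete per id, strictly sequentially (not Promise.all), so each
// operation stays small and the database is never hit with one giant query.
const deleteInBatches = async (
  ids: string[],
  deleteFn: (id: string) => Promise<number> // returns deleted count
): Promise<number> => {
  let total = 0;
  for (const id of ids) {
    total += await deleteFn(id); // awaited one at a time
  }
  return total;
};
```

Sequential awaiting is deliberate: parallelizing the batches would reintroduce the load spike the per-id split is meant to avoid.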
b/packages/service/core/dataset/data/controller.ts index 97470f203..fc042d2aa 100644 --- a/packages/service/core/dataset/data/controller.ts +++ b/packages/service/core/dataset/data/controller.ts @@ -1,4 +1,4 @@ -import { replaceDatasetQuoteTextWithJWT } from '../../../core/dataset/utils'; +import { replaceS3KeyToPreviewUrl } from '../../../core/dataset/utils'; import { addEndpointToImageUrl } from '../../../common/file/image/utils'; import type { DatasetDataSchemaType } from '@fastgpt/global/core/dataset/type'; import { addDays } from 'date-fns'; @@ -53,8 +53,8 @@ export const formatDatasetDataValue = ({ if (!imageId) { return { - q: replaceDatasetQuoteTextWithJWT(q, addDays(new Date(), 90)), - a: a ? replaceDatasetQuoteTextWithJWT(a, addDays(new Date(), 90)) : undefined + q: replaceS3KeyToPreviewUrl(q, addDays(new Date(), 90)), + a: a ? replaceS3KeyToPreviewUrl(a, addDays(new Date(), 90)) : undefined }; } diff --git a/packages/service/core/dataset/data/schema.ts b/packages/service/core/dataset/data/schema.ts index 8d20378e6..d291f3cec 100644 --- a/packages/service/core/dataset/data/schema.ts +++ b/packages/service/core/dataset/data/schema.ts @@ -105,9 +105,6 @@ try { // rebuild data DatasetDataSchema.index({ rebuilding: 1, teamId: 1, datasetId: 1 }); - // 为查询 initJieba 字段不存在的数据添加索引 - DatasetDataSchema.index({ initJieba: 1, updateTime: 1 }); - // Cron clear invalid data DatasetDataSchema.index({ updateTime: 1 }); } catch (error) { diff --git a/packages/service/core/dataset/delete/index.ts b/packages/service/core/dataset/delete/index.ts new file mode 100644 index 000000000..6713ba329 --- /dev/null +++ b/packages/service/core/dataset/delete/index.ts @@ -0,0 +1,41 @@ +import { getQueue, getWorker, QueueNames } from '../../../common/bullmq'; +import { datasetDeleteProcessor } from './processor'; + +export type DatasetDeleteJobData = { + teamId: string; + datasetId: string; +}; + +// 创建工作进程 +export const initDatasetDeleteWorker = () => { + return 
getWorker(QueueNames.datasetDelete, datasetDeleteProcessor, {
+    concurrency: 1, // 确保同时只有1个删除任务
+    removeOnFail: {
+      age: 30 * 24 * 60 * 60 // 保留30天失败记录
+    }
+  });
+};
+
+// 添加删除任务
+export const addDatasetDeleteJob = (data: DatasetDeleteJobData) => {
+  // 创建删除队列
+  const datasetDeleteQueue = getQueue(QueueNames.datasetDelete, {
+    defaultJobOptions: {
+      attempts: 10,
+      backoff: {
+        type: 'exponential',
+        delay: 5000
+      },
+      removeOnComplete: true,
+      removeOnFail: { age: 30 * 24 * 60 * 60 } // 保留30天失败记录
+    }
+  });
+
+  const jobId = `${data.teamId}:${data.datasetId}`;
+
+  // 使用去重机制,避免重复删除
+  return datasetDeleteQueue.add(jobId, data, {
+    deduplication: { id: jobId },
+    delay: 1000 // 延迟1秒执行,确保API响应完成
+  });
+};
diff --git a/packages/service/core/dataset/delete/processor.ts b/packages/service/core/dataset/delete/processor.ts
new file mode 100644
index 000000000..b0d24280c
--- /dev/null
+++ b/packages/service/core/dataset/delete/processor.ts
@@ -0,0 +1,130 @@
+import type { Processor } from 'bullmq';
+import type { DatasetDeleteJobData } from './index';
+import { delDatasetRelevantData, findDatasetAndAllChildren } from '../controller';
+import { addLog } from '../../../common/system/log';
+import type { DatasetSchemaType } from '@fastgpt/global/core/dataset/type';
+import { MongoDatasetCollectionTags } from '../tag/schema';
+import { removeDatasetSyncJobScheduler } from '../datasetSync';
+import { mongoSessionRun } from '../../../common/mongo/sessionRun';
+import { MongoDataset } from '../schema';
+import { removeImageByPath } from '../../../common/file/image/controller';
+import { MongoDatasetTraining } from '../training/schema';
+
+export const deleteDatasetsImmediate = async ({
+  teamId,
+  datasets
+}: {
+  teamId: string;
+  datasets: DatasetSchemaType[];
+}) => {
+  const datasetIds = datasets.map((d) => d._id);
+
+  // delete training data
+  await MongoDatasetTraining.deleteMany({
+    teamId,
+    datasetId: { $in: datasetIds }
+  });
+
+  // Remove cron job
+  await Promise.all(
+
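The queue above retries failed deletions with `attempts: 10` and exponential backoff from a 5000 ms base. Assuming BullMQ's documented exponential formula (`delay * 2^(attempt - 1)`), the full retry schedule can be made explicit:

```typescript
// Delay before retry N (1-based) under exponential backoff,
// matching the { type: 'exponential', delay: 5000 } job options above.
const backoffDelayMs = (attempt: number, baseDelayMs = 5000): number =>
  baseDelayMs * 2 ** (attempt - 1);

// Full schedule for the configured queue: 10 attempts, 5 s base.
const schedule = Array.from({ length: 10 }, (_, i) => backoffDelayMs(i + 1));
```

The early retries wait 5 s, 10 s, 20 s …, and the final attempt waits about 42.7 minutes — so a transient outage has plenty of headroom before the job lands in the 30-day failed set.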
datasets.map((dataset) => { + // 只处理已标记删除的数据集 + if (datasetIds.includes(dataset._id)) { + return removeDatasetSyncJobScheduler(dataset._id); + } + }) + ); +}; + +export const deleteDatasets = async ({ + teamId, + datasets +}: { + teamId: string; + datasets: DatasetSchemaType[]; +}) => { + const datasetIds = datasets.map((d) => d._id); + + // delete collection.tags + await MongoDatasetCollectionTags.deleteMany({ + teamId, + datasetId: { $in: datasetIds } + }); + + // Delete dataset avatar + for await (const dataset of datasets) { + await removeImageByPath(dataset.avatar); + } + + // delete all dataset.data and pg data + await mongoSessionRun(async (session) => { + // delete dataset data + await delDatasetRelevantData({ + datasets, + session + }); + }); + + // delete dataset + await MongoDataset.deleteMany({ + _id: { $in: datasetIds } + }); +}; + +export const datasetDeleteProcessor: Processor = async (job) => { + const { teamId, datasetId } = job.data; + const startTime = Date.now(); + + addLog.info(`[Dataset Delete] Start deleting dataset: ${datasetId} for team: ${teamId}`); + + try { + // 1. 查找知识库及其所有子知识库 + const datasets = await findDatasetAndAllChildren({ + teamId, + datasetId + }); + + if (!datasets || datasets.length === 0) { + addLog.warn(`[Dataset Delete] Dataset not found: ${datasetId}`); + return; + } + + // 2. 安全检查:确保所有要删除的数据集都已标记为 deleteTime + const markedForDelete = await MongoDataset.find( + { + _id: { $in: datasets.map((d) => d._id) }, + teamId, + deleteTime: { $ne: null } + }, + { _id: 1 } + ).lean(); + + if (markedForDelete.length !== datasets.length) { + addLog.warn( + `[Dataset Delete] Safety check: ${markedForDelete.length}/${datasets.length} datasets marked for deletion`, + { + markedDatasetIds: markedForDelete.map((d) => d._id), + totalDatasetIds: datasets.map((d) => d._id) + } + ); + } + + // 3. 
执行真正的删除操作(只删除已经标记为 deleteTime 的数据) + await deleteDatasets({ + teamId, + datasets + }); + + addLog.info( + `[Dataset Delete] Successfully deleted dataset: ${datasetId} and ${datasets.length - 1} children`, + { + duration: Date.now() - startTime, + totalDatasets: datasets.length, + datasetIds: datasets.map((d) => d._id) + } + ); + } catch (error: any) { + addLog.error(`[Dataset Delete] Failed to delete dataset: ${datasetId}`, error); + throw error; + } +}; diff --git a/packages/service/core/dataset/image/controller.ts b/packages/service/core/dataset/image/controller.ts index 41a4bdd48..9cf75849c 100644 --- a/packages/service/core/dataset/image/controller.ts +++ b/packages/service/core/dataset/image/controller.ts @@ -1,17 +1,6 @@ -import { addMinutes } from 'date-fns'; import { bucketName, MongoDatasetImageSchema } from './schema'; import { connectionMongo, Types } from '../../../common/mongo'; -import fs from 'fs'; -import type { FileType } from '../../../common/file/multer'; -import fsp from 'fs/promises'; -import { computeGridFsChunSize } from '../../../common/file/gridfs/utils'; -import { setCron } from '../../../common/system/cron'; -import { checkTimerLock } from '../../../common/system/timerLock/utils'; -import { TimerIdEnum } from '../../../common/system/timerLock/constants'; -import { addLog } from '../../../common/system/log'; import { UserError } from '@fastgpt/global/common/error/utils'; -import { getS3DatasetSource, S3DatasetSource } from '../../../common/s3/sources/dataset'; -import { isS3ObjectKey } from '../../../common/s3/utils'; const getGridBucket = () => { return new connectionMongo.mongo.GridFSBucket(connectionMongo.connection.db!, { @@ -19,53 +8,6 @@ const getGridBucket = () => { }); }; -export const createDatasetImage = async ({ - teamId, - datasetId, - file, - expiredTime = addMinutes(new Date(), 30) -}: { - teamId: string; - datasetId: string; - file: FileType; - expiredTime?: Date; -}): Promise<{ imageId: string; previewUrl: string }> => { - 
const path = file.path; - const gridBucket = getGridBucket(); - const metadata = { - teamId: String(teamId), - datasetId: String(datasetId), - expiredTime - }; - - const stats = await fsp.stat(path); - if (!stats.isFile()) return Promise.reject(`${path} is not a file`); - - const readStream = fs.createReadStream(path, { - highWaterMark: 256 * 1024 - }); - const chunkSizeBytes = computeGridFsChunSize(stats.size); - - const stream = gridBucket.openUploadStream(file.originalname, { - metadata, - contentType: file.mimetype, - chunkSizeBytes - }); - - // save to gridfs - await new Promise((resolve, reject) => { - readStream - .pipe(stream as any) - .on('finish', resolve) - .on('error', reject); - }); - - return { - imageId: String(stream.id), - previewUrl: '' - }; -}; - export const getDatasetImageReadData = async (imageId: string) => { // Get file metadata to get contentType const fileInfo = await MongoDatasetImageSchema.findOne({ @@ -81,93 +23,3 @@ export const getDatasetImageReadData = async (imageId: string) => { fileInfo }; }; -export const getDatasetImageBase64 = async (imageId: string) => { - // Get file metadata to get contentType - const fileInfo = await MongoDatasetImageSchema.findOne({ - _id: new Types.ObjectId(imageId) - }).lean(); - if (!fileInfo) { - return Promise.reject(new UserError('Image not found')); - } - - // Get image stream from GridFS - const { stream } = await getDatasetImageReadData(imageId); - - // Convert stream to buffer - const chunks: Buffer[] = []; - - return new Promise((resolve, reject) => { - stream.on('data', (chunk: Buffer) => { - chunks.push(chunk); - }); - - stream.on('end', () => { - // Combine all chunks into a single buffer - const buffer = Buffer.concat(chunks); - // Convert buffer to base64 string - const base64 = buffer.toString('base64'); - const dataUrl = `data:${fileInfo.contentType || 'image/jpeg'};base64,${base64}`; - resolve(dataUrl); - }); - - stream.on('error', reject); - }); -}; - -export const deleteDatasetImage = 
async (imageId: string) => { - const gridBucket = getGridBucket(); - - try { - if (isS3ObjectKey(imageId, 'dataset')) { - await getS3DatasetSource().deleteDatasetFileByKey(imageId); - } else { - await gridBucket.delete(new Types.ObjectId(imageId)); - } - } catch (error: any) { - const msg = error?.message; - if (msg.includes('File not found')) { - addLog.warn('Delete dataset image error', error); - return; - } else { - return Promise.reject(error); - } - } -}; - -export const clearExpiredDatasetImageCron = async () => { - const gridBucket = getGridBucket(); - const clearExpiredDatasetImages = async () => { - addLog.debug('Clear expired dataset image start'); - - const data = await MongoDatasetImageSchema.find( - { - 'metadata.expiredTime': { $lt: new Date() } - }, - '_id' - ).lean(); - - for (const item of data) { - try { - await gridBucket.delete(new Types.ObjectId(item._id)); - } catch (error) { - addLog.error('Delete expired dataset image error', error); - } - } - addLog.debug('Clear expired dataset image end'); - }; - - setCron('*/10 * * * *', async () => { - if ( - await checkTimerLock({ - timerId: TimerIdEnum.clearExpiredDatasetImage, - lockMinuted: 9 - }) - ) { - try { - await clearExpiredDatasetImages(); - } catch (error) { - addLog.error('clearExpiredDatasetImageCron error', error); - } - } - }); -}; diff --git a/packages/service/core/dataset/image/utils.ts b/packages/service/core/dataset/image/utils.ts deleted file mode 100644 index f20c28122..000000000 --- a/packages/service/core/dataset/image/utils.ts +++ /dev/null @@ -1,107 +0,0 @@ -import { ERROR_ENUM } from '@fastgpt/global/common/error/errorCode'; -import { Types, type ClientSession } from '../../../common/mongo'; -import { deleteDatasetImage } from './controller'; -import { MongoDatasetImageSchema } from './schema'; -import { addMinutes } from 'date-fns'; -import jwt from 'jsonwebtoken'; -import { EndpointUrl } from '@fastgpt/global/common/file/constants'; - -export const 
removeDatasetImageExpiredTime = async ({ - ids = [], - collectionId, - session -}: { - ids?: string[]; - collectionId: string; - session?: ClientSession; -}) => { - if (ids.length === 0) return; - return MongoDatasetImageSchema.updateMany( - { - _id: { - $in: ids - .filter((id) => Types.ObjectId.isValid(id)) - .map((id) => (typeof id === 'string' ? new Types.ObjectId(id) : id)) - } - }, - { - $unset: { 'metadata.expiredTime': '' }, - $set: { - 'metadata.collectionId': String(collectionId) - } - }, - { session } - ); -}; - -export const getDatasetImagePreviewUrl = ({ - imageId, - teamId, - datasetId, - expiredMinutes -}: { - imageId: string; - teamId: string; - datasetId: string; - expiredMinutes: number; -}) => { - const expiredTime = Math.floor(addMinutes(new Date(), expiredMinutes).getTime() / 1000); - - const key = (process.env.FILE_TOKEN_KEY as string) ?? 'filetoken'; - const token = jwt.sign( - { - teamId: String(teamId), - datasetId: String(datasetId), - imageId: String(imageId), - exp: expiredTime - }, - key - ); - - return `${EndpointUrl}/api/file/datasetImg/${token}.jpeg`; -}; -export const authDatasetImagePreviewUrl = (token?: string) => - new Promise<{ - teamId: string; - datasetId: string; - imageId: string; - }>((resolve, reject) => { - if (!token) { - return reject(ERROR_ENUM.unAuthFile); - } - const key = (process.env.FILE_TOKEN_KEY as string) ?? 
'filetoken'; - - jwt.verify(token, key, (err, decoded: any) => { - if (err || !decoded?.teamId || !decoded?.datasetId) { - reject(ERROR_ENUM.unAuthFile); - return; - } - resolve({ - teamId: decoded.teamId, - datasetId: decoded.datasetId, - imageId: decoded.imageId - }); - }); - }); - -export const clearDatasetImages = async (datasetIds: string[]) => { - if (datasetIds.length === 0) return; - const images = await MongoDatasetImageSchema.find( - { - 'metadata.datasetId': { $in: datasetIds.map((item) => String(item)) } - }, - '_id' - ).lean(); - await Promise.all(images.map((image) => deleteDatasetImage(String(image._id)))); -}; - -export const clearCollectionImages = async (collectionIds: string[]) => { - if (collectionIds.length === 0) return; - const images = await MongoDatasetImageSchema.find( - { - 'metadata.collectionId': { $in: collectionIds.map((item) => String(item)) } - }, - '_id' - ).lean(); - await Promise.all(images.map((image) => deleteDatasetImage(String(image._id)))); -}; diff --git a/packages/service/core/dataset/migration/schema.ts b/packages/service/core/dataset/migration/schema.ts index 222682ddf..11d6195cf 100644 --- a/packages/service/core/dataset/migration/schema.ts +++ b/packages/service/core/dataset/migration/schema.ts @@ -11,7 +11,7 @@ export type DatasetMigrationLogSchemaType = { migrationVersion: string; // e.g. 'v4.14.3' // Resource type and identifier - resourceType: 'collection' | 'image' | 'chat_file'; // supports different types of file migration + resourceType: 'collection' | 'dataset_image'; // supports different types of file migration resourceId: string; // collection._id or image._id teamId: string; datasetId?: string; // present for collection; may be absent for image @@ -101,7 +101,7 @@ const DatasetMigrationLogSchema = new Schema({ // Resource type and identifier resourceType: { type: String, - enum: ['collection', 'image', 'chat_file'], + enum: ['collection', 'dataset_image'], required: true }, resourceId: { diff --git a/packages/service/core/dataset/schema.ts b/packages/service/core/dataset/schema.ts index 3749601db..452b009d5 100644 ---
a/packages/service/core/dataset/schema.ts +++ b/packages/service/core/dataset/schema.ts @@ -131,6 +131,12 @@ const DatasetSchema = new Schema({ apiDatasetServer: Object, + // Soft-delete marker field + deleteTime: { + type: Date, + default: null // null means not deleted; a value records the deletion time + }, + // abandoned autoSync: Boolean, externalReadUrl: String, @@ -143,6 +149,7 @@ const DatasetSchema = new Schema({ try { DatasetSchema.index({ teamId: 1 }); DatasetSchema.index({ type: 1 }); + DatasetSchema.index({ deleteTime: 1 }); // index the soft-delete field } catch (error) { console.log(error); } diff --git a/packages/service/core/dataset/search/controller.ts b/packages/service/core/dataset/search/controller.ts index aa40b8254..c15d12113 100644 --- a/packages/service/core/dataset/search/controller.ts +++ b/packages/service/core/dataset/search/controller.ts @@ -33,7 +33,7 @@ import { datasetSearchQueryExtension } from './utils'; import type { RerankModelItemType } from '@fastgpt/global/core/ai/model.d'; import { formatDatasetDataValue } from '../data/controller'; import { pushTrack } from '../../../common/middle/tracks/utils'; -import { replaceDatasetQuoteTextWithJWT } from '../../../core/dataset/utils'; +import { replaceS3KeyToPreviewUrl } from '../../../core/dataset/utils'; import { addDays, addHours } from 'date-fns'; export type SearchDatasetDataProps = { @@ -81,9 +81,11 @@ export type SearchDatasetDataResponse = { usingSimilarityFilter: boolean; queryExtensionResult?: { - model: string; + llmModel: string; + embeddingModel: string; inputTokens: number; outputTokens: number; + embeddingTokens: number; query: string; }; deepSearchResult?: { model: string; inputTokens: number; outputTokens: number }; @@ -906,7 +908,7 @@ export async function searchDatasetData( const filterMaxTokensResult = await filterDatasetDataByMaxTokens(scoreFilter, maxTokens); const finalResult = filterMaxTokensResult.map((item) => { - item.q = replaceDatasetQuoteTextWithJWT(item.q, addDays(new Date(), 90)); + item.q = replaceS3KeyToPreviewUrl(item.q,
addDays(new Date(), 90)); return item; }); @@ -938,32 +940,32 @@ export const defaultSearchDatasetData = async ({ const query = props.queries[0]; const histories = props.histories; - const extensionModel = datasetSearchUsingExtensionQuery - ? getLLMModel(datasetSearchExtensionModel) - : undefined; - - const { concatQueries, extensionQueries, rewriteQuery, aiExtensionResult } = - await datasetSearchQueryExtension({ - query, - extensionModel, - extensionBg: datasetSearchExtensionBg, - histories - }); + const { searchQueries, reRankQuery, aiExtensionResult } = await datasetSearchQueryExtension({ + query, + llmModel: datasetSearchUsingExtensionQuery + ? getLLMModel(datasetSearchExtensionModel).model + : undefined, + embeddingModel: props.model, + extensionBg: datasetSearchExtensionBg, + histories + }); const result = await searchDatasetData({ ...props, - reRankQuery: rewriteQuery, - queries: concatQueries + reRankQuery: reRankQuery, + queries: searchQueries }); return { ...result, queryExtensionResult: aiExtensionResult ? 
{ - model: aiExtensionResult.model, + llmModel: aiExtensionResult.llmModel, inputTokens: aiExtensionResult.inputTokens, outputTokens: aiExtensionResult.outputTokens, - query: extensionQueries.join('\n') + embeddingModel: aiExtensionResult.embeddingModel, + embeddingTokens: aiExtensionResult.embeddingTokens, + query: searchQueries.join('\n') } : undefined }; diff --git a/packages/service/core/dataset/search/utils.ts b/packages/service/core/dataset/search/utils.ts index 410409f3d..230d70ac1 100644 --- a/packages/service/core/dataset/search/utils.ts +++ b/packages/service/core/dataset/search/utils.ts @@ -1,17 +1,18 @@ -import { type LLMModelItemType } from '@fastgpt/global/core/ai/model.d'; import { queryExtension } from '../../ai/functions/queryExtension'; import { type ChatItemType } from '@fastgpt/global/core/chat/type'; import { hashStr } from '@fastgpt/global/common/string/tools'; -import { chatValue2RuntimePrompt } from '@fastgpt/global/core/chat/adapt'; +import { addLog } from '../../../common/system/log'; export const datasetSearchQueryExtension = async ({ query, - extensionModel, + llmModel, + embeddingModel, extensionBg = '', histories = [] }: { query: string; - extensionModel?: LLMModelItemType; + llmModel?: string; + embeddingModel?: string; extensionBg?: string; histories?: ChatItemType[]; }) => { @@ -28,19 +29,8 @@ export const datasetSearchQueryExtension = async ({ return filterSameQueries; }; - let { queries, rewriteQuery, alreadyExtension } = (() => { - // concat query - let rewriteQuery = - histories.length > 0 - ? 
`${histories - .map((item) => { - return `${item.obj}: ${chatValue2RuntimePrompt(item.value).text}`; - }) - .join('\n')} -Human: ${query} -` - : query; - + // Check whether the incoming query has already been extended + let { queries, reRankQuery, alreadyExtension } = (() => { /* if query already extension, direct parse */ try { const jsonParse = JSON.parse(query); @@ -48,41 +38,45 @@ Human: ${query} const alreadyExtension = Array.isArray(jsonParse); return { queries, - rewriteQuery: alreadyExtension ? queries.join('\n') : rewriteQuery, + reRankQuery: alreadyExtension ? queries.join('\n') : query, - alreadyExtension: alreadyExtension + alreadyExtension }; } catch (error) { return { queries: [query], - rewriteQuery, + reRankQuery: query, alreadyExtension: false }; } })(); - // ai extension + // Use LLM to generate extension queries const aiExtensionResult = await (async () => { - if (!extensionModel || alreadyExtension) return; - const result = await queryExtension({ - chatBg: extensionBg, - query, - histories, - model: extensionModel.model - }); - if (result.extensionQueries?.length === 0) return; - return result; + if (!llmModel || !embeddingModel || alreadyExtension) return; + + try { + const result = await queryExtension({ + chatBg: extensionBg, + query, + histories, + llmModel, + embeddingModel + }); + if (result.extensionQueries?.length === 0) return; + return result; + } catch (error) { + addLog.error('Failed to generate extension queries', error); + } })(); - const extensionQueries = filterSamQuery(aiExtensionResult?.extensionQueries || []); if (aiExtensionResult) { - queries = filterSamQuery(queries.concat(extensionQueries)); - rewriteQuery = queries.join('\n'); + queries = queries.concat(aiExtensionResult.extensionQueries); + reRankQuery = queries.join('\n'); } return { - extensionQueries, - concatQueries: queries, - rewriteQuery, + searchQueries: queries, + reRankQuery, aiExtensionResult }; }; diff --git a/packages/service/core/dataset/utils.ts b/packages/service/core/dataset/utils.ts index
1eb947d0e..47d81e4e4 100644 --- a/packages/service/core/dataset/utils.ts +++ b/packages/service/core/dataset/utils.ts @@ -44,12 +44,12 @@ export const filterDatasetsByTmbId = async ({ * * ```typescript * const datasetQuoteText = '![image.png](dataset/68fee42e1d416bb5ddc85b19/6901c3071ba2bea567e8d8db/aZos7D-214afce5-4d42-4356-9e05-8164d51c59ae.png)'; - * const replacedText = await replaceDatasetQuoteTextWithJWT(datasetQuoteText, addDays(new Date(), 90)) + * const replacedText = await replaceS3KeyToPreviewUrl(datasetQuoteText, addDays(new Date(), 90)) * console.log(replacedText) * // '![image.png](http://localhost:3000/api/system/file/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJvYmplY3RLZXkiOiJjaGF0LzY5MWFlMjlkNDA0ZDA0Njg3MTdkZDc0Ny82OGFkODVhNzQ2MzAwNmM5NjM3OTlhMDcvalhmWHk4eWZHQUZzOVdKcGNXUmJBaFYyL3BhcnNlZC85YTBmNGZlZC00ZWRmLTQ2MTMtYThkNi01MzNhZjVhZTUxZGMucG5nIiwiaWF0IjoxNzYzMzcwOTYwLCJleHAiOjk1MzkzNzA5NjB9.tMDWg0-ZWRnWPNp9Hakd0w1hhaO8jj2oD98SU0wAQYQ)' * ``` */ -export function replaceDatasetQuoteTextWithJWT(documentQuoteText: string, expiredTime: Date) { +export function replaceS3KeyToPreviewUrl(documentQuoteText: string, expiredTime: Date) { if (!documentQuoteText || typeof documentQuoteText !== 'string') return documentQuoteText as string; diff --git a/packages/service/core/workflow/dispatch/ai/chat.ts b/packages/service/core/workflow/dispatch/ai/chat.ts index 36bed273a..574e77461 100644 --- a/packages/service/core/workflow/dispatch/ai/chat.ts +++ b/packages/service/core/workflow/dispatch/ai/chat.ts @@ -41,9 +41,6 @@ import { i18nT } from '../../../../../web/i18n/utils'; import { postTextCensor } from '../../../chat/postTextCensor'; import { createLLMResponse } from '../../../ai/llm/request'; import { formatModelChars2Points } from '../../../../support/wallet/usage/utils'; -import { replaceDatasetQuoteTextWithJWT } from '../../../dataset/utils'; -import { getFileS3Key } from '../../../../common/s3/utils'; -import { addDays } from 'date-fns'; export type ChatProps = 
ModuleDispatchProps< AIChatNodeProps & { @@ -307,7 +304,6 @@ async function filterDatasetQuote({ : ''; return { - // datasetQuoteText: replaceDatasetQuoteTextWithJWT(datasetQuoteText, addDays(new Date(), 90)) datasetQuoteText }; } diff --git a/packages/service/core/workflow/dispatch/dataset/search.ts b/packages/service/core/workflow/dispatch/dataset/search.ts index 749ce620b..9d58e317b 100644 --- a/packages/service/core/workflow/dispatch/dataset/search.ts +++ b/packages/service/core/workflow/dispatch/dataset/search.ts @@ -1,7 +1,4 @@ -import { - type DispatchNodeResponseType, - type DispatchNodeResultType -} from '@fastgpt/global/core/workflow/runtime/type.d'; +import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type.d'; import { formatModelChars2Points } from '../../../../support/wallet/usage/utils'; import type { SelectedDatasetType } from '@fastgpt/global/core/workflow/type/io'; import type { SearchDataResponseItemType } from '@fastgpt/global/core/dataset/type'; @@ -117,7 +114,7 @@ export async function dispatchDatasetSearch( return emptyResult; } - // get vector + // Get vector model const vectorModel = getEmbeddingModel( (await MongoDataset.findById(datasets[0].datasetId, 'vectorModel').lean())?.vectorModel ); @@ -165,7 +162,7 @@ export async function dispatchDatasetSearch( // count bill results const nodeDispatchUsages: ChatNodeUsageType[] = []; - // vector + // 1. Search vector const { totalPoints: embeddingTotalPoints, modelName: embeddingModelName } = formatModelChars2Points({ model: vectorModel.model, @@ -177,96 +174,98 @@ export async function dispatchDatasetSearch( model: embeddingModelName, inputTokens: embeddingTokens }); - // Rerank - const { totalPoints: reRankTotalPoints, modelName: reRankModelName } = formatModelChars2Points({ - model: rerankModelData?.model, - inputTokens: reRankInputTokens - }); + // 2. 
Rerank if (usingReRank) { + const { totalPoints: reRankTotalPoints, modelName: reRankModelName } = + formatModelChars2Points({ + model: rerankModelData?.model, + inputTokens: reRankInputTokens + }); nodeDispatchUsages.push({ totalPoints: reRankTotalPoints, - moduleName: node.name, + moduleName: i18nT('account_usage:rerank'), model: reRankModelName, inputTokens: reRankInputTokens }); } - // Query extension - (() => { - if (queryExtensionResult) { - const { totalPoints, modelName } = formatModelChars2Points({ - model: queryExtensionResult.model, - inputTokens: queryExtensionResult.inputTokens, - outputTokens: queryExtensionResult.outputTokens - }); - nodeDispatchUsages.push({ - totalPoints, - moduleName: i18nT('common:core.module.template.Query extension'), - model: modelName, - inputTokens: queryExtensionResult.inputTokens, - outputTokens: queryExtensionResult.outputTokens - }); - return { - totalPoints - }; - } - return { - totalPoints: 0 - }; - })(); - // Deep search - (() => { - if (deepSearchResult) { - const { totalPoints, modelName } = formatModelChars2Points({ - model: deepSearchResult.model, - inputTokens: deepSearchResult.inputTokens, - outputTokens: deepSearchResult.outputTokens - }); - nodeDispatchUsages.push({ - totalPoints, - moduleName: i18nT('common:deep_rag_search'), - model: modelName, - inputTokens: deepSearchResult.inputTokens, - outputTokens: deepSearchResult.outputTokens - }); - return { - totalPoints - }; - } - return { - totalPoints: 0 - }; - })(); + // 3. 
Query extension + if (queryExtensionResult) { + const { totalPoints: llmPoints, modelName: llmModelName } = formatModelChars2Points({ + model: queryExtensionResult.llmModel, + inputTokens: queryExtensionResult.inputTokens, + outputTokens: queryExtensionResult.outputTokens + }); + nodeDispatchUsages.push({ + totalPoints: llmPoints, + moduleName: i18nT('common:core.module.template.Query extension'), + model: llmModelName, + inputTokens: queryExtensionResult.inputTokens, + outputTokens: queryExtensionResult.outputTokens + }); + const { totalPoints: embeddingPoints, modelName: embeddingModelName } = + formatModelChars2Points({ + model: queryExtensionResult.embeddingModel, + inputTokens: queryExtensionResult.embeddingTokens + }); + nodeDispatchUsages.push({ + totalPoints: embeddingPoints, + moduleName: `${i18nT('account_usage:ai.query_extension_embedding')}`, + model: embeddingModelName, + inputTokens: queryExtensionResult.embeddingTokens, + outputTokens: 0 + }); + } + // 4. Deep search + if (deepSearchResult) { + const { totalPoints, modelName } = formatModelChars2Points({ + model: deepSearchResult.model, + inputTokens: deepSearchResult.inputTokens, + outputTokens: deepSearchResult.outputTokens + }); + nodeDispatchUsages.push({ + totalPoints, + moduleName: i18nT('common:deep_rag_search'), + model: modelName, + inputTokens: deepSearchResult.inputTokens, + outputTokens: deepSearchResult.outputTokens + }); + } const totalPoints = nodeDispatchUsages.reduce((acc, item) => acc + item.totalPoints, 0); - const responseData: DispatchNodeResponseType & { totalPoints: number } = { - totalPoints, - query: userChatInput, - embeddingModel: vectorModel.name, - embeddingTokens, - similarity: usingSimilarityFilter ? similarity : undefined, - limit, - searchMode, - embeddingWeight: - searchMode === DatasetSearchModeEnum.mixedRecall ? 
embeddingWeight : undefined, - // Rerank - ...(searchUsingReRank && { - rerankModel: rerankModelData?.name, - rerankWeight: rerankWeight, - reRankInputTokens - }), - searchUsingReRank, - // Results - quoteList: searchRes, - queryExtensionResult, - deepSearchResult - }; - return { data: { quoteQA: searchRes }, - [DispatchNodeResponseKeyEnum.nodeResponse]: responseData, + [DispatchNodeResponseKeyEnum.nodeResponse]: { + totalPoints, + query: userChatInput, + embeddingModel: vectorModel.name, + embeddingTokens, + similarity: usingSimilarityFilter ? similarity : undefined, + limit, + searchMode, + embeddingWeight: + searchMode === DatasetSearchModeEnum.mixedRecall ? embeddingWeight : undefined, + // Rerank + ...(searchUsingReRank && { + rerankModel: rerankModelData?.name, + rerankWeight: rerankWeight, + reRankInputTokens + }), + searchUsingReRank, + queryExtensionResult: queryExtensionResult + ? { + model: queryExtensionResult.llmModel, + inputTokens: queryExtensionResult.inputTokens, + outputTokens: queryExtensionResult.outputTokens, + query: queryExtensionResult.query + } + : undefined, + deepSearchResult, + // Results + quoteList: searchRes + }, nodeDispatchUsages, [DispatchNodeResponseKeyEnum.toolResponses]: searchRes.length > 0 diff --git a/packages/service/core/workflow/dispatch/tools/queryExternsion.ts b/packages/service/core/workflow/dispatch/tools/queryExternsion.ts index c06a61c6a..bbed807cb 100644 --- a/packages/service/core/workflow/dispatch/tools/queryExternsion.ts +++ b/packages/service/core/workflow/dispatch/tools/queryExternsion.ts @@ -3,13 +3,12 @@ import type { ModuleDispatchProps } from '@fastgpt/global/core/workflow/runtime/ import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants'; import { NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants'; import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants'; -import { getLLMModel } from '../../../../core/ai/model'; +import { 
getLLMModel, getEmbeddingModel } from '../../../../core/ai/model'; import { formatModelChars2Points } from '../../../../support/wallet/usage/utils'; import { queryExtension } from '../../../../core/ai/functions/queryExtension'; import { getHistories } from '../utils'; import { hashStr } from '@fastgpt/global/common/string/tools'; import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type'; -import { ModelTypeEnum } from '@fastgpt/global/core/ai/model'; type Props = ModuleDispatchProps<{ [NodeInputKeyEnum.aiModel]: string; @@ -31,23 +30,39 @@ export const dispatchQueryExtension = async ({ } const queryExtensionModel = getLLMModel(model); + const embeddingModel = getEmbeddingModel(); const chatHistories = getHistories(history, histories); - const { extensionQueries, inputTokens, outputTokens } = await queryExtension({ + const { + extensionQueries, + inputTokens, + outputTokens, + embeddingTokens, + llmModel, + embeddingModel: useEmbeddingModel + } = await queryExtension({ chatBg: systemPrompt, query: userChatInput, histories: chatHistories, - model: queryExtensionModel.model + llmModel: queryExtensionModel.model, + embeddingModel: embeddingModel.model }); extensionQueries.unshift(userChatInput); - const { totalPoints, modelName } = formatModelChars2Points({ - model: queryExtensionModel.model, + const { totalPoints: llmPoints, modelName: llmModelName } = formatModelChars2Points({ + model: llmModel, inputTokens, outputTokens }); + const { totalPoints: embeddingPoints, modelName: embeddingModelName } = formatModelChars2Points({ + model: useEmbeddingModel, + inputTokens: embeddingTokens + }); + + const totalPoints = llmPoints + embeddingPoints; + const set = new Set(); const filterSameQueries = extensionQueries.filter((item) => { // Strip all punctuation, whitespace, etc. and compare the text only @@ -63,19 +78,27 @@ export const dispatchQueryExtension = async ({ }, [DispatchNodeResponseKeyEnum.nodeResponse]: { totalPoints, - model: modelName, + model: llmModelName, inputTokens,
outputTokens, + embeddingTokens, query: userChatInput, textOutput: JSON.stringify(filterSameQueries) }, [DispatchNodeResponseKeyEnum.nodeDispatchUsages]: [ { moduleName: node.name, - totalPoints, - model: modelName, + totalPoints: llmPoints, + model: llmModelName, inputTokens, outputTokens + }, + { + moduleName: `${node.name} - Embedding`, + totalPoints: embeddingPoints, + model: embeddingModelName, + inputTokens: embeddingTokens, + outputTokens: 0 } ] }; diff --git a/packages/service/core/workflow/dispatch/tools/readFiles.ts b/packages/service/core/workflow/dispatch/tools/readFiles.ts index f7dd497e4..62f79911d 100644 --- a/packages/service/core/workflow/dispatch/tools/readFiles.ts +++ b/packages/service/core/workflow/dispatch/tools/readFiles.ts @@ -10,18 +10,17 @@ import { detectFileEncoding, parseUrlToFileType } from '@fastgpt/global/common/f import { readS3FileContentByBuffer } from '../../../../common/file/read/utils'; import { ChatRoleEnum } from '@fastgpt/global/core/chat/constants'; import { type ChatItemType, type UserChatItemValueItemType } from '@fastgpt/global/core/chat/type'; -import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools'; import { addLog } from '../../../../common/system/log'; -import { addRawTextBuffer, getRawTextBuffer } from '../../../../common/buffer/rawText/controller'; -import { addDays, addMinutes } from 'date-fns'; +import { addDays } from 'date-fns'; import { getNodeErrResponse } from '../utils'; import { isInternalAddress } from '../../../../common/system/utils'; -import { replaceDatasetQuoteTextWithJWT } from '../../../dataset/utils'; +import { replaceS3KeyToPreviewUrl } from '../../../dataset/utils'; import { getFileS3Key } from '../../../../common/s3/utils'; import { S3ChatSource } from '../../../../common/s3/sources/chat'; -import path from 'path'; +import path from 'node:path'; import { S3Buckets } from '../../../../common/s3/constants'; import { S3Sources } from '../../../../common/s3/type'; +import { 
getS3DatasetSource } from '../../../../common/s3/sources/dataset'; type Props = ModuleDispatchProps<{ [NodeInputKeyEnum.fileUrlList]: string[]; @@ -176,12 +175,15 @@ export const getFileContentFromLinks = async ({ parseUrlList .map(async (url) => { // Get from buffer - const fileBuffer = await getRawTextBuffer(url); - if (fileBuffer) { + const rawTextBuffer = await getS3DatasetSource().getRawTextBuffer({ + sourceId: url, + customPdfParse + }); + if (rawTextBuffer) { return formatResponseObject({ - filename: fileBuffer.sourceName || url, + filename: rawTextBuffer.filename || url, url, - content: fileBuffer.text + content: rawTextBuffer.text }); } @@ -264,14 +266,14 @@ export const getFileContentFromLinks = async ({ usageId }); - const replacedText = replaceDatasetQuoteTextWithJWT(rawText, addDays(new Date(), 90)); + const replacedText = replaceS3KeyToPreviewUrl(rawText, addDays(new Date(), 90)); // Add to buffer - addRawTextBuffer({ + getS3DatasetSource().addRawTextBuffer({ sourceId: url, sourceName: filename, text: replacedText, - expiredTime: addMinutes(new Date(), 20) + customPdfParse }); return formatResponseObject({ filename, url, content: replacedText }); diff --git a/packages/service/package.json b/packages/service/package.json index 8aff5b760..e99883ef1 100644 --- a/packages/service/package.json +++ b/packages/service/package.json @@ -4,6 +4,7 @@ "type": "module", "dependencies": { "@fastgpt/global": "workspace:*", + "@maxmind/geoip2-node": "^6.3.4", "@modelcontextprotocol/sdk": "^1.24.0", "@node-rs/jieba": "2.0.1", "@opentelemetry/api": "^1.9.0", diff --git a/packages/service/support/appRegistration/schema.ts b/packages/service/support/appRegistration/schema.ts new file mode 100644 index 000000000..b77a166e8 --- /dev/null +++ b/packages/service/support/appRegistration/schema.ts @@ -0,0 +1,33 @@ +import { Schema, getMongoModel } from '../../common/mongo/index'; +import { AppCollectionName } from '../../core/app/schema'; +import { TeamMemberCollectionName } 
from '@fastgpt/global/support/user/team/constant'; +import { TeamCollectionName } from '@fastgpt/global/support/user/team/constant'; + +export const AppRegistrationCollectionName = 'app_registrations'; + +const AppRegistrationSchema = new Schema({ + teamId: { + type: Schema.Types.ObjectId, + ref: TeamCollectionName, + required: true + }, + tmbId: { + type: Schema.Types.ObjectId, + ref: TeamMemberCollectionName, + required: true + }, + appId: { + type: Schema.Types.ObjectId, + ref: AppCollectionName + }, + createdAt: { + type: Date + } +}); + +AppRegistrationSchema.index({ teamId: 1 }); + +export const MongoAppRegistration = getMongoModel( + AppRegistrationCollectionName, + AppRegistrationSchema +); diff --git a/packages/service/support/permission/auth/file.ts b/packages/service/support/permission/auth/file.ts index 9bc4a11fa..ac883a6fb 100644 --- a/packages/service/support/permission/auth/file.ts +++ b/packages/service/support/permission/auth/file.ts @@ -1,7 +1,4 @@ import { type AuthModeType, type AuthResponseType } from '../type'; -import { type DatasetFileSchema } from '@fastgpt/global/core/dataset/type'; -import { getFileById } from '../../../common/file/gridfs/controller'; -import { BucketNameEnum, bucketNameMap } from '@fastgpt/global/common/file/constants'; import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import { OwnerPermissionVal, ReadRoleVal } from '@fastgpt/global/support/permission/constant'; import { Permission } from '@fastgpt/global/support/permission/controller'; @@ -10,8 +7,7 @@ import { addMinutes } from 'date-fns'; import { parseHeaderCert } from './common'; import jwt from 'jsonwebtoken'; import { ERROR_ENUM } from '@fastgpt/global/common/error/errorCode'; -import { S3Sources } from '../../../common/s3/type'; -import { getS3DatasetSource, S3DatasetSource } from '../../../common/s3/sources/dataset'; +import { getS3DatasetSource } from '../../../common/s3/sources/dataset'; import { isS3ObjectKey } from 
'../../../common/s3/utils'; export const authCollectionFile = async ({ @@ -22,19 +18,12 @@ export const authCollectionFile = async ({ fileId: string; }): Promise => { const authRes = await parseHeaderCert(props); - const { teamId, tmbId } = authRes; if (isS3ObjectKey(fileId, 'dataset')) { const stat = await getS3DatasetSource().getDatasetFileStat(fileId); if (!stat) return Promise.reject(CommonErrEnum.fileNotFound); } else { - const file = await getFileById({ bucketName: BucketNameEnum.dataset, fileId }); - if (!file) { - return Promise.reject(CommonErrEnum.fileNotFound); - } - if (file.metadata?.teamId !== teamId) { - return Promise.reject(CommonErrEnum.unAuthFile); - } + return Promise.reject('Invalid dataset file key'); } const permission = new Permission({ role: ReadRoleVal, isOwner: true }); @@ -49,27 +38,6 @@ export const authCollectionFile = async ({ }; }; -/* file permission */ -export const createFileToken = (data: FileTokenQuery) => { - if (!process.env.FILE_TOKEN_KEY) { - return Promise.reject('System unset FILE_TOKEN_KEY'); - } - - const expireMinutes = - data.customExpireMinutes ?? bucketNameMap[data.bucketName].previewExpireMinutes; - const expiredTime = Math.floor(addMinutes(new Date(), expireMinutes).getTime() / 1000); - - const key = (process.env.FILE_TOKEN_KEY as string) ?? 
'filetoken'; - const token = jwt.sign( - { - ...data, - exp: expiredTime - }, - key - ); - return Promise.resolve(token); -}; - export const authFileToken = (token?: string) => new Promise((resolve, reject) => { if (!token) { diff --git a/packages/service/support/permission/dataset/auth.ts b/packages/service/support/permission/dataset/auth.ts index 18a54bc7f..b1891c76a 100644 --- a/packages/service/support/permission/dataset/auth.ts +++ b/packages/service/support/permission/dataset/auth.ts @@ -19,13 +19,11 @@ import { MongoDatasetData } from '../../../core/dataset/data/schema'; import { type AuthModeType, type AuthResponseType } from '../type'; import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants'; import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type'; -import { getDatasetImagePreviewUrl } from '../../../core/dataset/image/utils'; import { i18nT } from '../../../../web/i18n/utils'; import { parseHeaderCert } from '../auth/common'; import { sumPer } from '@fastgpt/global/support/permission/utils'; -import { getS3DatasetSource, S3DatasetSource } from '../../../common/s3/sources/dataset'; -import { isS3ObjectKey, jwtSignS3ObjectKey } from '../../../common/s3/utils'; -import { addHours } from 'date-fns'; +import { getS3DatasetSource } from '../../../common/s3/sources/dataset'; +import { isS3ObjectKey } from '../../../common/s3/utils'; export const authDatasetByTmbId = async ({ tmbId, @@ -174,55 +172,6 @@ export async function authDatasetCollection({ }; } -// export async function authDatasetFile({ -// fileId, -// per, -// ...props -// }: AuthModeType & { -// fileId: string; -// }): Promise< -// AuthResponseType & { -// file: DatasetFileSchema; -// } -// > { -// const { teamId, tmbId, isRoot } = await parseHeaderCert(props); - -// const [file, collection] = await Promise.all([ -// getFileById({ bucketName: BucketNameEnum.dataset, fileId }), -// MongoDatasetCollection.findOne({ -// teamId, -// fileId -// }) -// ]); - -// if 
(!file) { -// return Promise.reject(CommonErrEnum.fileNotFound); -// } - -// if (!collection) { -// return Promise.reject(DatasetErrEnum.unAuthDatasetFile); -// } - -// try { -// const { permission } = await authDatasetCollection({ -// ...props, -// collectionId: collection._id, -// per, -// isRoot -// }); - -// return { -// teamId, -// tmbId, -// file, -// permission, -// isRoot -// }; -// } catch (error) { -// return Promise.reject(DatasetErrEnum.unAuthDatasetFile); -// } -// } - /* DatasetData permission is inherited from collection. */ @@ -251,21 +200,14 @@ export async function authDatasetData({ q: datasetData.q, a: datasetData.a, imageId: datasetData.imageId, - imagePreivewUrl: datasetData.imageId - ? isS3ObjectKey(datasetData.imageId, 'dataset') - ? // jwtSignS3ObjectKey(datasetData.imageId, addHours(new Date(), 1)) - await getS3DatasetSource().createGetDatasetFileURL({ + imagePreivewUrl: + datasetData.imageId && isS3ObjectKey(datasetData.imageId, 'dataset') + ? await getS3DatasetSource().createGetDatasetFileURL({ key: datasetData.imageId, expiredHours: 1, external: true }) - : getDatasetImagePreviewUrl({ - imageId: datasetData.imageId, - teamId: datasetData.teamId, - datasetId: datasetData.datasetId, - expiredMinutes: 30 - }) - : undefined, + : undefined, chunkIndex: datasetData.chunkIndex, indexes: datasetData.indexes, datasetId: String(datasetData.datasetId), diff --git a/packages/service/support/permission/teamLimit.ts b/packages/service/support/permission/teamLimit.ts index 988dba322..6d026c9ba 100644 --- a/packages/service/support/permission/teamLimit.ts +++ b/packages/service/support/permission/teamLimit.ts @@ -4,7 +4,7 @@ import { MongoDataset } from '../../core/dataset/schema'; import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants'; import { TeamErrEnum } from '@fastgpt/global/common/error/code/team'; import { SystemErrEnum } from '@fastgpt/global/common/error/code/system'; -import { AppTypeEnum } from 
'@fastgpt/global/core/app/constants'; +import { AppTypeEnum, ToolTypeList, AppFolderTypeList } from '@fastgpt/global/core/app/constants'; import { MongoTeamMember } from '../user/team/teamMemberSchema'; import { TeamMemberStatusEnum } from '@fastgpt/global/support/user/team/constant'; import { getVectorCountByTeamId } from '../../common/vectorDB/controller'; @@ -40,41 +40,62 @@ export const checkTeamMemberLimit = async (teamId: string, newCount: number) => } }; -export const checkTeamAppLimit = async (teamId: string, amount = 1) => { - const [{ standardConstants }, appCount] = await Promise.all([ - getTeamStandPlan({ teamId }), - MongoApp.countDocuments({ +export const checkTeamAppTypeLimit = async ({ + teamId, + appCheckType, + amount = 1 +}: { + teamId: string; + appCheckType: 'app' | 'tool' | 'folder'; + amount?: number; +}) => { + if (appCheckType === 'app') { + const [{ standardConstants }, appCount] = await Promise.all([ + getTeamStandPlan({ teamId }), + MongoApp.countDocuments({ + teamId, + type: { + $in: [AppTypeEnum.simple, AppTypeEnum.workflow] + } + }) + ]); + + if (standardConstants && appCount + amount > standardConstants.maxAppAmount) { + return Promise.reject(TeamErrEnum.appAmountNotEnough); + } + + // System check + if (global?.licenseData?.maxApps && typeof global?.licenseData?.maxApps === 'number') { + const totalApps = await MongoApp.countDocuments({ + type: { + $in: [AppTypeEnum.simple, AppTypeEnum.workflow] + } + }); + if (totalApps > global.licenseData.maxApps) { + return Promise.reject(SystemErrEnum.licenseAppAmountLimit); + } + } + } else if (appCheckType === 'tool') { + const toolCount = await MongoApp.countDocuments({ teamId, type: { - $in: [ - AppTypeEnum.simple, - AppTypeEnum.workflow, - AppTypeEnum.workflowTool, - AppTypeEnum.mcpToolSet, - AppTypeEnum.httpToolSet - ] - } - }) - ]); - - if (standardConstants && appCount + amount >= standardConstants.maxAppAmount) { - return Promise.reject(TeamErrEnum.appAmountNotEnough); - } - - // 
System check - if (global?.licenseData?.maxApps && typeof global?.licenseData?.maxApps === 'number') { - const totalApps = await MongoApp.countDocuments({ - type: { - $in: [ - AppTypeEnum.simple, - AppTypeEnum.workflow, - AppTypeEnum.workflowTool, - AppTypeEnum.mcpToolSet - ] + $in: ToolTypeList } }); - if (totalApps >= global.licenseData.maxApps) { - return Promise.reject(SystemErrEnum.licenseAppAmountLimit); + const maxToolAmount = 1000; + if (toolCount + amount > maxToolAmount) { + return Promise.reject(TeamErrEnum.pluginAmountNotEnough); + } + } else if (appCheckType === 'folder') { + const folderCount = await MongoApp.countDocuments({ + teamId, + type: { + $in: AppFolderTypeList + } + }); + const maxAppFolderAmount = 1000; + if (folderCount + amount > maxAppFolderAmount) { + return Promise.reject(TeamErrEnum.appFolderAmountNotEnough); } } }; @@ -131,7 +152,7 @@ export const checkTeamDatasetSyncPermission = async (teamId: string) => { teamId }); - if (standardConstants && !standardConstants?.permissionWebsiteSync) { + if (standardConstants && !standardConstants?.websiteSyncPerDataset) { return Promise.reject(TeamErrEnum.websiteSyncNotEnough); } }; diff --git a/packages/service/support/user/audit/schema.ts b/packages/service/support/user/audit/schema.ts index 8d1dc6373..ff2247208 100644 --- a/packages/service/support/user/audit/schema.ts +++ b/packages/service/support/user/audit/schema.ts @@ -35,7 +35,6 @@ const OperationLogSchema = new Schema({ }); OperationLogSchema.index({ teamId: 1, tmbId: 1, event: 1 }); -OperationLogSchema.index({ timestamp: 1 }, { expireAfterSeconds: 14 * 24 * 60 * 60 }); // Auto delete after 14 days export const MongoOperationLog = getMongoLogModel( OperationLogCollectionName, diff --git a/packages/service/support/user/controller.ts b/packages/service/support/user/controller.ts index 58c0582f0..413bbff06 100644 --- a/packages/service/support/user/controller.ts +++ b/packages/service/support/user/controller.ts @@ -45,8 +45,8 @@ export 
async function getUserDetail({ timezone: user.timezone, promotionRate: user.promotionRate, team: tmb, - notificationAccount: tmb.notificationAccount, permission: tmb.permission, - contact: user.contact + contact: user.contact, + language: user.language }; } diff --git a/packages/service/support/user/schema.ts b/packages/service/support/user/schema.ts index 9d430182c..dcf92847c 100644 --- a/packages/service/support/user/schema.ts +++ b/packages/service/support/user/schema.ts @@ -4,6 +4,7 @@ import { hashStr } from '@fastgpt/global/common/string/tools'; import type { UserModelSchema } from '@fastgpt/global/support/user/type'; import { UserStatusEnum, userStatusMap } from '@fastgpt/global/support/user/constant'; import { TeamMemberCollectionName } from '@fastgpt/global/support/user/team/constant'; +import { LangEnum } from '@fastgpt/global/common/i18n/type'; export const userCollectionName = 'users'; @@ -19,7 +20,6 @@ const UserSchema = new Schema({ required: true, unique: true // 唯一 }, - phonePrefix: Number, password: { type: String, required: true, @@ -46,6 +46,10 @@ const UserSchema = new Schema({ type: String, default: 'Asia/Shanghai' }, + language: { + type: String, + default: LangEnum.zh_CN + }, lastLoginTmbId: { type: Schema.Types.ObjectId, ref: TeamMemberCollectionName @@ -58,6 +62,8 @@ const UserSchema = new Schema({ }, fastgpt_sem: Object, sourceDomain: String, + + phonePrefix: Number, contact: String, /** @deprecated */ diff --git a/packages/service/support/wallet/discountCoupon/schema.ts b/packages/service/support/wallet/discountCoupon/schema.ts new file mode 100644 index 000000000..3d3be5ec6 --- /dev/null +++ b/packages/service/support/wallet/discountCoupon/schema.ts @@ -0,0 +1,42 @@ +import { connectionMongo, getMongoModel } from '../../../common/mongo'; +const { Schema } = connectionMongo; +import { TeamCollectionName } from '@fastgpt/global/support/user/team/constant'; +import { DiscountCouponTypeEnum } from 
'@fastgpt/global/support/wallet/sub/discountCoupon/constants'; +import type { DiscountCouponSchemaType } from '@fastgpt/global/openapi/support/wallet/discountCoupon/api'; + +export const discountCouponCollectionName = 'team_discount_coupons'; + +const DiscountCouponSchema = new Schema({ + teamId: { + type: Schema.Types.ObjectId, + ref: TeamCollectionName, + required: true + }, + type: { + type: String, + required: true, + enum: Object.values(DiscountCouponTypeEnum) + }, + startTime: Date, + expiredTime: { + type: Date, + required: true + }, + usedAt: Date, + createTime: { + type: Date, + default: () => new Date() + } +}); + +try { + DiscountCouponSchema.index({ status: 1, type: 1 }); + DiscountCouponSchema.index({ teamId: 1, status: 1 }); +} catch (error) { + console.log(error); +} + +export const MongoDiscountCoupon = getMongoModel<DiscountCouponSchemaType>( + discountCouponCollectionName, + DiscountCouponSchema +); diff --git a/packages/service/support/wallet/sub/schema.ts b/packages/service/support/wallet/sub/schema.ts index 723199b75..8585ff636 100644 --- a/packages/service/support/wallet/sub/schema.ts +++ b/packages/service/support/wallet/sub/schema.ts @@ -56,6 +56,15 @@ const SubSchema = new Schema({ maxApp: Number, maxDataset: Number, + // custom level configurations + requestsPerMinute: Number, + chatHistoryStoreDuration: Number, + maxDatasetSize: Number, + websiteSyncPerDataset: Number, + appRegistrationCount: Number, + auditLogStoreDuration: Number, + ticketResponseTime: Number, + + // stand sub and extra points sub. 
Plan total points totalPoints: { type: Number diff --git a/packages/service/support/wallet/sub/utils.ts b/packages/service/support/wallet/sub/utils.ts index 853b20ffa..319ecd379 100644 --- a/packages/service/support/wallet/sub/utils.ts +++ b/packages/service/support/wallet/sub/utils.ts @@ -63,7 +63,18 @@ export const getTeamStandPlan = async ({ teamId }: { teamId: string }) => { ...standardConstants, maxTeamMember: standard?.maxTeamMember || standardConstants.maxTeamMember, maxAppAmount: standard?.maxApp || standardConstants.maxAppAmount, - maxDatasetAmount: standard?.maxDataset || standardConstants.maxDatasetAmount + maxDatasetAmount: standard?.maxDataset || standardConstants.maxDatasetAmount, + requestsPerMinute: standard?.requestsPerMinute || standardConstants.requestsPerMinute, + chatHistoryStoreDuration: + standard?.chatHistoryStoreDuration || standardConstants.chatHistoryStoreDuration, + maxDatasetSize: standard?.maxDatasetSize || standardConstants.maxDatasetSize, + websiteSyncPerDataset: + standard?.websiteSyncPerDataset || standardConstants.websiteSyncPerDataset, + appRegistrationCount: + standard?.appRegistrationCount || standardConstants.appRegistrationCount, + auditLogStoreDuration: + standard?.auditLogStoreDuration || standardConstants.auditLogStoreDuration, + ticketResponseTime: standard?.ticketResponseTime || standardConstants.ticketResponseTime } : undefined }; @@ -165,7 +176,9 @@ export const getTeamPlanStatus = async ({ const standardMaxDatasetSize = standardPlan?.currentSubLevel && standardPlans - ? standardPlans[standardPlan.currentSubLevel]?.maxDatasetSize || Infinity + ? 
standardPlans[standardPlan.currentSubLevel]?.maxDatasetSize || + standardPlan?.maxDatasetSize || + Infinity : Infinity; const totalDatasetSize = standardMaxDatasetSize + @@ -185,7 +198,19 @@ export const getTeamPlanStatus = async ({ ...standardConstants, maxTeamMember: standardPlan?.maxTeamMember || standardConstants.maxTeamMember, maxAppAmount: standardPlan?.maxApp || standardConstants.maxAppAmount, - maxDatasetAmount: standardPlan?.maxDataset || standardConstants.maxDatasetAmount + maxDatasetAmount: standardPlan?.maxDataset || standardConstants.maxDatasetAmount, + requestsPerMinute: standardPlan?.requestsPerMinute || standardConstants.requestsPerMinute, + chatHistoryStoreDuration: + standardPlan?.chatHistoryStoreDuration || standardConstants.chatHistoryStoreDuration, + maxDatasetSize: standardPlan?.maxDatasetSize || standardConstants.maxDatasetSize, + websiteSyncPerDataset: + standardPlan?.websiteSyncPerDataset || standardConstants.websiteSyncPerDataset, + appRegistrationCount: + standardPlan?.appRegistrationCount || standardConstants.appRegistrationCount, + auditLogStoreDuration: + standardPlan?.auditLogStoreDuration || standardConstants.auditLogStoreDuration, + ticketResponseTime: + standardPlan?.ticketResponseTime || standardConstants.ticketResponseTime } : undefined, diff --git a/packages/service/type/env.d.ts b/packages/service/type/env.d.ts index 0c26aefc2..f6e11abe7 100644 --- a/packages/service/type/env.d.ts +++ b/packages/service/type/env.d.ts @@ -28,6 +28,7 @@ declare global { CHECK_INTERNAL_IP?: string; ALLOWED_ORIGINS?: string; SHOW_COUPON?: string; + SHOW_DISCOUNT_COUPON?: string; CONFIG_JSON_PATH?: string; PASSWORD_LOGIN_LOCK_SECONDS?: string; PASSWORD_EXPIRED_MONTH?: string; diff --git a/packages/web/i18n/en/account.json b/packages/web/i18n/en/account.json index d982c3a78..cac044c84 100644 --- a/packages/web/i18n/en/account.json +++ b/packages/web/i18n/en/account.json @@ -83,5 +83,22 @@ "reset_default": "Restore the default configuration", "team": 
"Team", "third_party": "Third Party", - "usage_records": "Usage" + "usage_records": "Usage", + "bill_detail": "Bill details", + "order_number": "Order number", + "generation_time": "Generation time", + "order_type": "Order type", + "status": "Status", + "payment_method": "Payment method", + "support_wallet_amount": "Amount", + "yuan": "¥{{amount}}", + "has_invoice": "Invoiced", + "yes": "Yes", + "no": "No", + "subscription_period": "Subscription period", + "subscription_package": "Subscription package", + "subscription_mode_month": "Duration", + "month": "Month", + "extra_dataset_size": "Extra knowledge base capacity", + "extra_ai_points": "Extra AI points" } diff --git a/packages/web/i18n/en/account_bill.json b/packages/web/i18n/en/account_bill.json index f90e6b9c9..810c2a87c 100644 --- a/packages/web/i18n/en/account_bill.json +++ b/packages/web/i18n/en/account_bill.json @@ -4,7 +4,6 @@ "back": "return", "bank_account": "Account opening account", "bank_name": "Bank of deposit", - "bill_detail": "Bill details", "bill_record": "billing records", "click_to_download": "Click to download", "company_address": "Company address", @@ -17,37 +16,23 @@ "default_header": "Default header", "detail": "Details", "email_address": "Email address", - "extra_ai_points": "AI points calculation standard", - "extra_dataset_size": "Additional knowledge base capacity", - "generation_time": "Generation time", - "has_invoice": "Whether the invoice has been issued", "invoice_amount": "Invoice amount", "invoice_detail": "Invoice details", "invoice_sending_info": "The invoice will be sent to your mailbox within 3-7 working days, please be patient.", "mm": "mm", - "month": "moon", "need_special_invoice": "Do you need a special ticket?", - "no": "no", "no_invoice_record": "No bill record~", "no_invoice_record_tip": "No invoicing record yet", - "order_number": "Order number", - "order_type": "Order type", "organization_name": "Organization name", - 
"payment_method": "Payment method", "payway_coupon": "Redeem code", "rerank": "Rerank", "save": "save", "save_failed": "Save exception", "save_success": "Saved successfully", - "status": "state", "sub_mode_custom": "Customize", "submit_failed": "Submission failed", "submit_success": "Submission successful", "submitted": "Submitted", - "subscription_mode_month": "Duration", - "subscription_package": "Subscription package", - "subscription_period": "Subscription cycle", - "support_wallet_amount": "Amount", "support_wallet_apply_invoice": "Billable bills", "support_wallet_bill_tag_invoice": "bill invoice", "support_wallet_invoicing": "Invoicing", @@ -56,7 +41,5 @@ "type": "type", "unit_code": "unified credit code", "unit_code_void": "Unified credit code format error", - "update": "renew", - "yes": "yes", - "yuan": "¥{{amount}}" + "update": "renew" } diff --git a/packages/web/i18n/en/account_info.json b/packages/web/i18n/en/account_info.json index c420e057d..bbe372581 100644 --- a/packages/web/i18n/en/account_info.json +++ b/packages/web/i18n/en/account_info.json @@ -6,15 +6,18 @@ "ai_points_calculation_standard": "AI points", "ai_points_usage": "AI points", "ai_points_usage_tip": "Each time the AI ​​model is called, a certain amount of AI points will be consumed. 
\nFor specific calculation standards, please refer to the \"Billing Standards\" above.", - "app_amount": "App amount", + "app_amount": "Agent amount", + "app_registration_count": "Registered app count", + "apply_app_registration": "Apply for app registration", "avatar": "Avatar", - "avatar_selection_exception": "Abnormal avatar selection", - "avatar_can_only_select_one": "Avatar can only select one picture", "avatar_can_only_select_jpg_png": "Avatar can only select jpg or png format", + "avatar_can_only_select_one": "Avatar can only select one picture", + "avatar_selection_exception": "Abnormal avatar selection", "balance": "balance", "billing_standard": "Standards", "cancel": "Cancel", "change": "change", + "check_purchase_history": "View order", "choose_avatar": "Click to select avatar", "click_modify_nickname": "Click to modify nickname", "code_required": "Verification code cannot be empty", @@ -25,6 +28,7 @@ "current_package": "Current plan", "current_token_price": "Current points price", "dataset_amount": "Dataset amount", + "discount_coupon": "Coupon", "effective_time": "Effective time", "email_label": "Mail", "exchange": "Exchange", @@ -32,6 +36,7 @@ "exchange_success": "Redemption successful", "expiration_time": "Expiration time", "expired": "Expired", + "expired_tips": "It has expired~", "general_info": "General information", "group": "Group", "help_chatbot": "robot assistant", @@ -42,6 +47,7 @@ "member_name": "Name", "month": "moon", "new_password": "New Password", + "not_started_tips": "Not started", "notification_receiving": "Notify", "old_password": "Old Password", "package_and_usage": "Plans", @@ -76,6 +82,8 @@ "upgrade_package": "Upgrade", "usage_balance": "Use balance: Use balance", "usage_balance_notice": "Due to the system upgrade, the original \"automatic renewal and deduction from balance\" mode has been cancelled, and the balance recharge entrance has been closed. 
\nYour balance can be used to purchase points", + "used_time": "Used at", + "used_tips": "Already used", "user_account": "Username", "user_team_team_name": "Team", "verification_code": "Verification code", diff --git a/packages/web/i18n/en/account_usage.json b/packages/web/i18n/en/account_usage.json index dce1fa96a..3184c12fd 100644 --- a/packages/web/i18n/en/account_usage.json +++ b/packages/web/i18n/en/account_usage.json @@ -1,4 +1,5 @@ { + "ai.query_extension_embedding": "Query extension - embedding", "ai_model": "AI model", "all": "all", "app_name": "Application name", diff --git a/packages/web/i18n/en/app.json b/packages/web/i18n/en/app.json index 06b9a2d50..17b5698db 100644 --- a/packages/web/i18n/en/app.json +++ b/packages/web/i18n/en/app.json @@ -209,6 +209,7 @@ "logs_keys_lastConversationTime": "last conversation time", "logs_keys_messageCount": "Message Count", "logs_keys_points": "Points Consumed", + "logs_keys_region": "User IP", "logs_keys_responseTime": "Average Response Time", "logs_keys_sessionId": "Session ID", "logs_keys_source": "Source", diff --git a/packages/web/i18n/en/common.json b/packages/web/i18n/en/common.json index 2ce424a8e..9e6d832bd 100644 --- a/packages/web/i18n/en/common.json +++ b/packages/web/i18n/en/common.json @@ -33,13 +33,17 @@ "FAQ.dataset_compute_a": "1 knowledge base storage is equal to 1 knowledge base index. 
You can see \"n group indexes\" in a single knowledge base collection.", "FAQ.dataset_compute_q": "How is Dataset storage calculated?", "FAQ.dataset_index_a": "No, but if the Dataset index exceeds the limit, you cannot insert or update Dataset content.", - "FAQ.dataset_index_q": "Will the Dataset index be deleted if it exceeds the limit?", "FAQ.free_user_clean_a": "If a free team (free version and has not purchased additional packages) does not log in to the system for 30 consecutive days, the system will automatically clear all Dataset content under that team.", "FAQ.free_user_clean_q": "Will the data of the free version be cleared?", + "FAQ.index_del_a": "After you delete knowledge base content, the knowledge base index count decreases within 5 minutes.", + "FAQ.index_del_q": "When is the knowledge base index count reduced?", "FAQ.package_overlay_a": "Yes, each purchased resource pack is independent and will be used in an overlapping manner within its validity period. AI points will be deducted from the resource pack that expires first.", "FAQ.package_overlay_q": "Can additional resource packs be stacked?", + "FAQ.qpm_a": "QPM is the maximum number of Agent requests your team can make per minute; it is unrelated to the complexity of any single Agent. \nOther OpenAPI endpoints are subject to the same limit, and each endpoint is counted separately.", + "FAQ.qpm_q": "What is QPM?", "FAQ.switch_package_a": "The package usage rule is to prioritize the use of higher-level packages. 
Therefore, if the newly purchased package is higher than the current package, the new package will take effect immediately; otherwise, the current package will continue to be used.", "FAQ.switch_package_q": "Will the subscription package be switched?", + "FAQ.year_day_a": "A month is calculated as 30 days and a year is calculated as 360 days.", "File": "File", "Finish": "Finish", "Folder": "Folder", @@ -111,10 +115,13 @@ "base_config": "Basic Configuration", "bill_already_processed": "Order has been processed", "bill_expired": "Order expired", + "bill_not_found": "Order does not exist", "bill_not_pay_processed": "Non-online orders", "button.extra_dataset_size_tip": "You are purchasing [Extra Knowledge Base Capacity]", "button.extra_points_tip": "You are purchasing [Extra AI Points]", "can_copy_content_tip": "It is not possible to copy automatically using the browser, please manually copy the following content", + "cancel_bill": "Cancel", + "cancel_bill_confirm": "Are you sure you want to cancel this order?", "chart_mode_cumulative": "Cumulative", "chart_mode_incremental": "Incremental", "chat": "Session", @@ -169,6 +176,7 @@ "code_error.system_error.license_user_amount_limit": "Exceed the maximum number of users in the system", "code_error.team_error.ai_points_not_enough": "Insufficient AI Points", "code_error.team_error.app_amount_not_enough": "Application Limit Reached", + "code_error.team_error.app_folder_amount_not_enough": "Folder Limit Reached", "code_error.team_error.cannot_delete_default_group": "Cannot delete default group", "code_error.team_error.cannot_delete_non_empty_org": "Cannot delete non-empty organization", "code_error.team_error.cannot_modify_root_org": "Cannot modify root organization", @@ -756,12 +764,23 @@ "core.workflow.tool.Handle": "Tool Connector", "core.workflow.tool.Select Tool": "Select Tool", "core.workflow.variable": "Variable", + "coupon_expired": "Coupon has expired", + "coupon_not_exist": "Coupon does not exist", + 
"coupon_not_start": "Coupon is not yet effective", + "coupon_redeem_failed": "Coupon redemption failed", + "coupon_unavailable": "Coupon is unavailable", + "coupon_used": "Coupon has been used", "create": "Create", "create_app": "Create an application", "create_failed": "Create failed", "create_success": "Created Successfully", "create_time": "Creation Time", "cron_job_run_app": "Scheduled Task", + "custom_plan_price": "Custom Billing", + "custom_plan_feature_1": "Priority Deep Technical Support", + "custom_plan_feature_2": "Dedicated Account Manager", + "custom_plan_feature_3": "Flexible Resource Configuration", + "custom_plan_feature_4": "Secure and Controllable", "custom_title": "Custom Title", "data_index_custom": "Custom index", "data_index_default": "Default index", @@ -819,6 +838,7 @@ "delete_folder": "Delete Folder", "delete_success": "Deleted Successfully", "delete_warning": "Deletion Warning", + "discount_coupon_used": "Coupon used:", "embedding_model_not_config": "No index model is detected", "enable_auth": "Enable authentication", "error.Create failed": "Create failed", @@ -847,6 +867,7 @@ "exit_directly": "exit_directly", "expired_time": "Expiration Time", "export_to_json": "Export to JSON", + "extraPointsPrice": "¥ {{price}}", "extraction_results": "Extraction Results", "failed": "Failed", "field_name": "Field Name", @@ -916,6 +937,7 @@ "model.type.stt": "STT", "model.type.tts": "TTS", "month": "Month", + "month_text": "month", "move.confirm": "Confirm move", "move_success": "Moved Successfully", "move_to": "Move to", @@ -945,6 +967,10 @@ "not_support": "Not Supported", "not_support_wechat_image": "This is a WeChat picture", "not_yet_introduced": "No Introduction Yet", + "old_user_month_discount_70": "Benefits for old users - 30% off monthly fee", + "old_user_month_discount_70_description": "Enjoy a 30% discount when purchasing a monthly package, and the order will automatically take effect", + "old_user_year_discount_90": "Benefits for old users - 10% 
off annual fee", + "old_user_year_discount_90_description": "Enjoy a 10% discount when purchasing an annual package, and the order will automatically take effect", "open_folder": "Open Folder", "option": "Option", "page": "Page", @@ -959,6 +985,7 @@ "pay.wx_payment": "WeChat Payment", "pay.yuan": "{{amount}} Yuan", "pay_alipay_payment": "Alipay Payment", + "pay_bill": "Pay", "pay_corporate_payment": "Payment to the public", "pay_money": "Amount payable", "pay_success": "Payment successfully", @@ -1188,18 +1215,26 @@ "support.wallet.subscription.Update extra price": "Price", "support.wallet.subscription.Upgrade plan": "Upgrade Package", "support.wallet.subscription.ai_model": "AI Language Model", + "support.wallet.subscription.function.Audit log store duration": "{{amount}} days of team operation logs", "support.wallet.subscription.function.History store": "{{amount}} Days of Chat History Retention", - "support.wallet.subscription.function.Max app": "{{amount}} App limit", + "support.wallet.subscription.function.Max app": "{{amount}} Agent limit", "support.wallet.subscription.function.Max dataset": "{{amount}} Dataset limit", "support.wallet.subscription.function.Max dataset size": "{{amount}} Dataset Indexes", - "support.wallet.subscription.function.Max members": "{{amount}} Member limit", - "support.wallet.subscription.function.Points": "{{amount}} AI Points", + "support.wallet.subscription.function.Max members": "{{amount}} Members", + "support.wallet.subscription.function.Points": "{{amount}} points", + "support.wallet.subscription.function.Requests per minute": "{{amount}} QPM", + "support.wallet.subscription.function.Website sync per dataset": "{{amount}} synced web pages per knowledge base", "support.wallet.subscription.mode.Month": "Month", "support.wallet.subscription.mode.Period": "Subscription Period", "support.wallet.subscription.mode.Year": "Year", "support.wallet.subscription.mode.Year sale": "Two Months Free", 
"support.wallet.subscription.point": " Points", + "support.wallet.subscription.standardSubLevel.advanced": "Advanced", + "support.wallet.subscription.standardSubLevel.advanced_desc": "Enterprise-grade production tool", + "support.wallet.subscription.standardSubLevel.basic": "Basic", + "support.wallet.subscription.standardSubLevel.basic_desc": "Unlock the full functionality of FastGPT", "support.wallet.subscription.standardSubLevel.custom": "Custom", + "support.wallet.subscription.standardSubLevel.custom_desc": "No longer limited by packages, designed for your unique needs", "support.wallet.subscription.standardSubLevel.enterprise": "Enterprise", "support.wallet.subscription.standardSubLevel.enterprise_desc": "Suitable for small and medium-sized enterprises to build Dataset applications in production environments", "support.wallet.subscription.standardSubLevel.experience": "Experience", diff --git a/packages/web/i18n/en/file.json b/packages/web/i18n/en/file.json index 172b7b0bd..279f6182f 100644 --- a/packages/web/i18n/en/file.json +++ b/packages/web/i18n/en/file.json @@ -14,8 +14,6 @@ "Please select the image to upload": "Please select the image to upload", "Please wait for all files to upload": "Please wait for all files to be uploaded to complete", "common.upload_system_tools": "Upload system tools", - "bucket_chat": "Conversation Files", - "bucket_file": "Dataset Documents", "bucket_image": "picture", "click_to_view_raw_source": "Click to View Original Source", "common.Some images failed to process": "Some images failed to process", diff --git a/packages/web/i18n/utils.ts b/packages/web/i18n/utils.ts index 9b2aae584..3fc58f48c 100644 --- a/packages/web/i18n/utils.ts +++ b/packages/web/i18n/utils.ts @@ -1,3 +1,58 @@ import { type I18nKeyFunction } from './i18next'; +import { LangEnum } from '@fastgpt/global/common/i18n/type'; +import Cookies from 'js-cookie'; + +const LANG_KEY = 'NEXT_LOCALE'; + +const isInIframe = () => { + try { + return window.self !== 
window.top; + } catch (e) { + return true; + } +}; + +export const setLangToStorage = (value: string) => { + if (isInIframe()) { + localStorage.setItem(LANG_KEY, value); + } else { + // Not in an iframe: write to both the cookie and localStorage + Cookies.set(LANG_KEY, value, { expires: 30 }); + localStorage.setItem(LANG_KEY, value); + } +}; + +export const getLangFromStorage = () => { + return localStorage.getItem(LANG_KEY) || Cookies.get(LANG_KEY); +}; + +export const getLangMapping = (lng: string): string => { + const languageMap: Record<string, string> = { + zh: LangEnum.zh_CN, + 'zh-CN': LangEnum.zh_CN, + 'zh-Hans': LangEnum.zh_CN, + 'zh-HK': LangEnum.zh_Hant, + 'zh-TW': LangEnum.zh_Hant, + 'zh-Hant': LangEnum.zh_Hant, + en: LangEnum.en, + 'en-US': LangEnum.en + }; + + let lang = languageMap[lng]; + + // No direct mapping: fall back based on the language prefix + if (!lang) { + const langPrefix = lng.split('-')[0]; + // Chinese variants fall back to Simplified Chinese + if (langPrefix === 'zh') { + lang = LangEnum.zh_CN; + } + if (langPrefix === 'en') { + lang = LangEnum.en; + } + } + + return lang || LangEnum.zh_CN; +}; export const i18nT: I18nKeyFunction = (key) => key; diff --git a/packages/web/i18n/zh-CN/account.json b/packages/web/i18n/zh-CN/account.json index 43ba6e269..928113450 100644 --- a/packages/web/i18n/zh-CN/account.json +++ b/packages/web/i18n/zh-CN/account.json @@ -83,5 +83,22 @@ "reset_default": "恢复默认配置", "team": "团队管理", "third_party": "第三方账号", - "usage_records": "使用记录" + "usage_records": "使用记录", + "bill_detail": "账单详情", + "order_number": "订单号", + "generation_time": "生成时间", + "order_type": "订单类型", + "status": "状态", + "payment_method": "支付方式", + "support_wallet_amount": "金额", + "yuan": "{{amount}}元", + "has_invoice": "是否已开票", + "yes": "是", + "no": "否", + "subscription_period": "订阅周期", + "subscription_package": "订阅套餐", + "subscription_mode_month": "时长", + "month": "月", + "extra_dataset_size": "额外知识库容量", + "extra_ai_points": "额外 AI 积分" } diff --git a/packages/web/i18n/zh-CN/account_bill.json b/packages/web/i18n/zh-CN/account_bill.json index 
f067e2ae3..c9e6a09f5 100644 --- a/packages/web/i18n/zh-CN/account_bill.json +++ b/packages/web/i18n/zh-CN/account_bill.json @@ -4,7 +4,6 @@ "back": "返回", "bank_account": "开户账号", "bank_name": "开户银行", - "bill_detail": "账单详情", "bill_record": "账单记录", "click_to_download": "点击下载", "company_address": "公司地址", @@ -17,37 +16,23 @@ "default_header": "默认抬头", "detail": "详情", "email_address": "邮箱地址", - "extra_ai_points": "额外 AI 积分", - "extra_dataset_size": "额外知识库容量", - "generation_time": "生成时间", - "has_invoice": "是否已开票", "invoice_amount": "开票金额", "invoice_detail": "发票详情", "invoice_sending_info": "发票将在 3-7 个工作日内发送至邮箱,请耐心等待", "mm": "毫米", - "month": "月", "need_special_invoice": "是否需要专票", - "no": "否", "no_invoice_record": "无账单记录~", "no_invoice_record_tip": "暂无开票记录", - "order_number": "订单号", - "order_type": "订单类型", "organization_name": "组织名称", - "payment_method": "支付方式", "payway_coupon": "兑换码", "rerank": "结果重排", "save": "保存", "save_failed": "保存异常", "save_success": "保存成功", - "status": "状态", "sub_mode_custom": "自定义", "submit_failed": "提交失败", "submit_success": "提交成功", "submitted": "已提交", - "subscription_mode_month": "时长", - "subscription_package": "订阅套餐", - "subscription_period": "订阅周期", - "support_wallet_amount": "金额", "support_wallet_apply_invoice": "可开票账单", "support_wallet_bill_tag_invoice": "账单发票", "support_wallet_invoicing": "开票", @@ -56,7 +41,5 @@ "type": "类型", "unit_code": "统一信用代码", "unit_code_void": "统一信用代码格式错误", - "update": "更新", - "yes": "是", - "yuan": "{{amount}}元" + "update": "更新" } diff --git a/packages/web/i18n/zh-CN/account_info.json b/packages/web/i18n/zh-CN/account_info.json index 7822d8765..abb13b244 100644 --- a/packages/web/i18n/zh-CN/account_info.json +++ b/packages/web/i18n/zh-CN/account_info.json @@ -6,15 +6,18 @@ "ai_points_calculation_standard": "AI 积分", "ai_points_usage": "AI 积分使用量", "ai_points_usage_tip": "每次调用 AI 模型时,都会消耗一定的 AI 积分。具体的计算标准可参考上方的“计费标准”。", - "app_amount": "应用数量", + "app_amount": "Agent 数量", + "app_registration_count": "备案应用数量", + 
"apply_app_registration": "申请应用备案", "avatar": "头像", - "avatar_selection_exception": "头像选择异常", - "avatar_can_only_select_one": "头像只能选择一张图片", "avatar_can_only_select_jpg_png": "头像只能选择 jpg 或 png 格式", + "avatar_can_only_select_one": "头像只能选择一张图片", + "avatar_selection_exception": "头像选择异常", "balance": "余额", "billing_standard": "计费标准", "cancel": "取消", "change": "变更", + "check_purchase_history": "查看订单", "choose_avatar": "点击选择头像", "click_modify_nickname": "点击修改昵称", "code_required": "验证码不能为空", @@ -25,6 +28,7 @@ "current_package": "当前套餐", "current_token_price": "当前积分价格", "dataset_amount": "知识库数量", + "discount_coupon": "优惠券", "effective_time": "生效时间", "email_label": "邮箱", "exchange": "兑换", @@ -32,6 +36,7 @@ "exchange_success": "兑换成功", "expiration_time": "到期时间", "expired": "已过期", + "expired_tips": "已经过期了哦~", "general_info": "通用信息", "group": "组", "help_chatbot": "机器人助手", @@ -42,6 +47,7 @@ "member_name": "成员名", "month": "月", "new_password": "新密码", + "not_started_tips": "未开始", "notification_receiving": "通知接收", "old_password": "旧密码", "package_and_usage": "套餐与用量", @@ -76,6 +82,9 @@ "upgrade_package": "升级套餐", "usage_balance": "使用余额: 使用余额", "usage_balance_notice": "由于系统升级,原“自动续费从余额扣款”模式取消,余额充值入口关闭。您的余额可用于购买积分", + "use": "去使用", + "used_time": "使用时间", + "used_tips": "已使用", "user_account": "账号", "user_team_team_name": "团队", "verification_code": "验证码", diff --git a/packages/web/i18n/zh-CN/account_usage.json b/packages/web/i18n/zh-CN/account_usage.json index 93c2d518d..14396f742 100644 --- a/packages/web/i18n/zh-CN/account_usage.json +++ b/packages/web/i18n/zh-CN/account_usage.json @@ -1,10 +1,12 @@ { + "ai.query_extension_embedding": "问题优化-embedding", "ai_model": "AI 模型", "all": "所有", "answer_accuracy": "评测-回答准确性", "app_name": "应用名", "auto_index": "索引增强", "billing_module": "扣费模块", + "check_left_points": "查看剩余积分", "compress_llm_messages": "AI 历史记录压缩", "confirm_export": "共筛选出 {{total}} 条数据,是否确认导出?", "count": "运行次数", diff --git a/packages/web/i18n/zh-CN/app.json 
b/packages/web/i18n/zh-CN/app.json index 43e8b377e..1b1ed7918 100644 --- a/packages/web/i18n/zh-CN/app.json +++ b/packages/web/i18n/zh-CN/app.json @@ -214,6 +214,7 @@ "logs_keys_lastConversationTime": "最后对话时间", "logs_keys_messageCount": "消息总数", "logs_keys_points": "积分消耗", + "logs_keys_region": "使用者IP", "logs_keys_responseTime": "平均响应时长", "logs_keys_sessionId": "会话 ID", "logs_keys_source": "来源", diff --git a/packages/web/i18n/zh-CN/common.json b/packages/web/i18n/zh-CN/common.json index 28b2b9e93..8a75af271 100644 --- a/packages/web/i18n/zh-CN/common.json +++ b/packages/web/i18n/zh-CN/common.json @@ -33,13 +33,18 @@ "FAQ.dataset_compute_a": "1条知识库存储等于1条知识库索引。一条分块数据,通常对应多条索引,可以在单个知识库集合中看到\"n组索引\"", "FAQ.dataset_compute_q": "知识库存储怎么计算?", "FAQ.dataset_index_a": "不会。但知识库索引超出时,无法插入和更新知识库内容。", - "FAQ.dataset_index_q": "知识库索引超出会删除么?", "FAQ.free_user_clean_a": "免费版团队(免费版且未购买额外套餐)连续 30 天未登录系统,系统会自动清除该团队下所有知识库内容。", "FAQ.free_user_clean_q": "免费版数据会清除么?", + "FAQ.index_del_a": "当你删除知识库内容时,知识库索引数量会在 5 分钟内减少。", + "FAQ.index_del_q": "知识库索引什么时候会减少?", "FAQ.package_overlay_a": "可以的。每次购买的资源包都是独立的,在其有效期内将会叠加使用。AI积分会优先扣除最先过期的资源包。", "FAQ.package_overlay_q": "额外资源包可以叠加么?", + "FAQ.qpm_a": "主要指团队每分钟请求 Agent 的最大次数,与单个 Agent 复杂度无关。其他 OpenAPI 接口也受此影响,每个接口单独计算。", + "FAQ.qpm_q": "QPM 是什么?", "FAQ.switch_package_a": "套餐使用规则为优先使用更高级的套餐,因此,购买的新套餐若比当前套餐更高级,则新套餐立即生效:否则将继续使用当前套餐。", "FAQ.switch_package_q": "是否切换订阅套餐?", + "FAQ.year_day_a": "一个月按 30 天计算,一年按 360 天计算。", + "FAQ.year_day_q": "一个月,一年具体是多长时间?", "File": "文件", "Finish": "完成", "Folder": "文件夹", @@ -111,10 +116,13 @@ "base_config": "基础配置", "bill_already_processed": "订单已处理", "bill_expired": "订单已过期", + "bill_not_found": "订单不存在", "bill_not_pay_processed": "非在线订单", "button.extra_dataset_size_tip": "您正在购买【额外知识库容量】", "button.extra_points_tip": "您正在购买【额外 AI 积分】", "can_copy_content_tip": "无法使用浏览器自动复制,请手动复制下面内容", + "cancel_bill": "取消", + "cancel_bill_confirm": "确定要取消这个订单吗?", "chart_mode_cumulative": "累积", "chart_mode_incremental": "分时", "chat": "会话", @@ 
-169,6 +177,7 @@ "code_error.system_error.license_user_amount_limit": "超出系统最大用户数量", "code_error.team_error.ai_points_not_enough": "AI 积分不足", "code_error.team_error.app_amount_not_enough": "应用数量已达上限~", + "code_error.team_error.app_folder_amount_not_enough": "文件夹数量已达上限~", "code_error.team_error.cannot_delete_default_group": "不能删除默认群组", "code_error.team_error.cannot_delete_non_empty_org": "不能删除非空部门", "code_error.team_error.cannot_modify_root_org": "不能修改根部门", @@ -207,6 +216,7 @@ "confirm_logout": "确认退出登录?", "confirm_move": "移动到这", "confirm_update": "确认更新", + "contact_business": "联系商务", "contact_way": "通知接收", "contribute_app_template": "贡献模板", "copy_link": "复制链接", @@ -759,12 +769,23 @@ "core.workflow.tool.Handle": "工具连接器", "core.workflow.tool.Select Tool": "选择工具", "core.workflow.variable": "变量", + "coupon_expired": "优惠券已过期", + "coupon_not_exist": "优惠券不存在", + "coupon_not_start": "优惠券未生效", + "coupon_redeem_failed": "优惠券核销失败", + "coupon_unavailable": "优惠券不可用", + "coupon_used": "优惠券已使用", "create": "去创建", "create_app": "创建应用", "create_failed": "创建失败", "create_success": "创建成功", "create_time": "创建时间", "cron_job_run_app": "定时任务", + "custom_plan_price": "定制化计费", + "custom_plan_feature_1": "优先深度技术支持", + "custom_plan_feature_2": "专属客户经理", + "custom_plan_feature_3": "弹性资源配置", + "custom_plan_feature_4": "安全可控", "custom_title": "自定义标题", "data_index_custom": "自定义索引", "data_index_default": "默认索引", @@ -822,6 +843,7 @@ "delete_folder": "删除文件夹", "delete_success": "删除成功", "delete_warning": "删除警告", + "discount_coupon_used": "已使用优惠券:", "embedding_model_not_config": "检测到没有可用的索引模型", "enable_auth": "启用鉴权", "error.Create failed": "创建失败", @@ -850,6 +872,7 @@ "exit_directly": "直接退出", "expired_time": "过期时间", "export_to_json": "导出为 JSON", + "extraPointsPrice": "{{price}} 元", "extraction_results": "提取结果", "failed": "失败", "field_name": "字段名", @@ -919,6 +942,7 @@ "model.type.stt": "语音识别", "model.type.tts": "语音合成", "month": "月", + "month_text": "个月", "move.confirm": "确认移动", "move_success": "移动成功", 
"move_to": "移动到", @@ -948,6 +972,11 @@ "not_support": "不支持", "not_support_wechat_image": "这是一张微信图片", "not_yet_introduced": "暂无介绍", + "old_user_month_discount_70": "老用户福利-月费7折", + "old_user_month_discount_70_description": "购买月费套餐享受7折优惠,下单自动生效", + "old_user_year_discount_90": "老用户福利-年费9折", + "old_user_year_discount_90_description": "购买年费套餐享受9折优惠,下单自动生效", + "one_year": "1 年", "open_folder": "打开文件夹", "option": "选项", "page": "页", @@ -962,6 +991,7 @@ "pay.wx_payment": "微信支付", "pay.yuan": "{{amount}}元", "pay_alipay_payment": "支付宝支付", + "pay_bill": "支付", "pay_corporate_payment": "对公支付", "pay_money": "应付金额", "pay_success": "支付成功", @@ -1192,18 +1222,29 @@ "support.wallet.subscription.Upgrade plan": "升级套餐", "support.wallet.subscription.ai_model": "AI语言模型", "support.wallet.subscription.eval_items_count": "单次评测数据条数: {{count}} 条", + "support.wallet.subscription.function.App registration count": "{{amount}} 个应用备案", + "support.wallet.subscription.function.Audit log store duration": "{{amount}} 天团队操作日志记录", "support.wallet.subscription.function.History store": "{{amount}} 天对话记录保留", - "support.wallet.subscription.function.Max app": "{{amount}} 个应用上限", - "support.wallet.subscription.function.Max dataset": "{{amount}} 个知识库上限", + "support.wallet.subscription.function.Max app": "{{amount}} 个 Agent", + "support.wallet.subscription.function.Max dataset": "{{amount}} 个知识库", "support.wallet.subscription.function.Max dataset size": "{{amount}} 组知识库索引", - "support.wallet.subscription.function.Max members": "{{amount}} 个团队成员上限", - "support.wallet.subscription.function.Points": "{{amount}} AI 积分", + "support.wallet.subscription.function.Max members": "{{amount}} 个团队成员", + "support.wallet.subscription.function.Points": "{{amount}} 积分", + "support.wallet.subscription.function.Requests per minute": "{{amount}} QPM", + "support.wallet.subscription.function.Ticket response time": "{{amount}} 小时工单支持响应", + "support.wallet.subscription.function.Website sync per dataset": "站点同步最大 {{amount}} 页", + 
"support.wallet.subscription.function.qpm tip": "主要指团队每分钟请求 Agent 的最大次数,与单个 Agent 复杂度无关。其他 OpenAPI 接口也受此影响,每个接口单独计算", "support.wallet.subscription.mode.Month": "按月", "support.wallet.subscription.mode.Period": "订阅周期", "support.wallet.subscription.mode.Year": "按年", "support.wallet.subscription.mode.Year sale": "赠送两个月", "support.wallet.subscription.point": "积分", - "support.wallet.subscription.standardSubLevel.custom": "自定义版", + "support.wallet.subscription.standardSubLevel.advanced": "高级版", + "support.wallet.subscription.standardSubLevel.advanced_desc": "适合企业级的生产工具", + "support.wallet.subscription.standardSubLevel.basic": "基础版", + "support.wallet.subscription.standardSubLevel.basic_desc": "解锁 FastGPT 完整功能", + "support.wallet.subscription.standardSubLevel.custom": "定制版", + "support.wallet.subscription.standardSubLevel.custom_desc": "不再受限于套餐,为您的独特需求而生", "support.wallet.subscription.standardSubLevel.enterprise": "企业版", "support.wallet.subscription.standardSubLevel.enterprise_desc": "适合中小企业在生产环境构建知识库应用", "support.wallet.subscription.standardSubLevel.experience": "体验版", @@ -1273,6 +1314,7 @@ "upgrade": "升级", "upload_file": "上传文件", "upload_file_error": "上传文件失败", + "usage_records": "使用记录", "use_helper": "使用帮助", "user.Account": "账号", "user.Amount of earnings": "收益(¥)", diff --git a/packages/web/i18n/zh-CN/file.json b/packages/web/i18n/zh-CN/file.json index 02b6f7dcf..ba13cd772 100644 --- a/packages/web/i18n/zh-CN/file.json +++ b/packages/web/i18n/zh-CN/file.json @@ -13,8 +13,6 @@ "Only_support_uploading_one_image": "仅支持上传一张图片", "Please select the image to upload": "请选择要上传的图片", "Please wait for all files to upload": "请等待所有文件上传完成", - "bucket_chat": "对话文件", - "bucket_file": "知识库文件", "bucket_image": "图片", "click_to_view_raw_source": "点击查看来源", "common.Some images failed to process": "部分图片处理失败", diff --git a/packages/web/i18n/zh-Hant/account.json b/packages/web/i18n/zh-Hant/account.json index f8d7f0c52..cbf481c70 100644 --- a/packages/web/i18n/zh-Hant/account.json +++ 
b/packages/web/i18n/zh-Hant/account.json @@ -83,5 +83,22 @@ "reset_default": "恢復預設設定", "team": "團隊管理", "third_party": "第三方賬號", - "usage_records": "使用記錄" + "usage_records": "使用記錄", + "bill_detail": "帳單詳細資訊", + "order_number": "訂單編號", + "generation_time": "生成時間", + "order_type": "訂單類型", + "status": "狀態", + "payment_method": "支付方式", + "support_wallet_amount": "金額", + "yuan": "{{amount}}元", + "has_invoice": "是否已開票", + "yes": "是", + "no": "否", + "subscription_period": "訂閱週期", + "subscription_package": "訂閱套餐", + "subscription_mode_month": "時長", + "month": "月", + "extra_dataset_size": "額外知識庫容量", + "extra_ai_points": "AI 積分運算標準" } diff --git a/packages/web/i18n/zh-Hant/account_bill.json b/packages/web/i18n/zh-Hant/account_bill.json index aa3a92d61..3cea5ee45 100644 --- a/packages/web/i18n/zh-Hant/account_bill.json +++ b/packages/web/i18n/zh-Hant/account_bill.json @@ -4,7 +4,6 @@ "back": "返回", "bank_account": "開戶帳號", "bank_name": "開戶銀行", - "bill_detail": "帳單詳細資訊", "bill_record": "帳單記錄", "click_to_download": "點擊下載", "company_address": "公司地址", @@ -17,37 +16,23 @@ "default_header": "預設抬頭", "detail": "詳細資訊", "email_address": "郵件地址", - "extra_ai_points": "AI 積分運算標準", - "extra_dataset_size": "額外知識庫容量", - "generation_time": "生成時間", - "has_invoice": "是否已開票", "invoice_amount": "開票金額", "invoice_detail": "發票詳細資訊", "invoice_sending_info": "發票將在 3-7 個工作天內傳送至郵箱,請耐心等待", "mm": "毫米", - "month": "月", "need_special_invoice": "是否需要專票", - "no": "否", "no_invoice_record": "無帳單記錄~", "no_invoice_record_tip": "暫無開立發票紀錄", - "order_number": "訂單編號", - "order_type": "訂單類型", "organization_name": "組織名稱", - "payment_method": "支付方式", "payway_coupon": "兌換碼", "rerank": "結果重排", "save": "儲存", "save_failed": "儲存異常", "save_success": "儲存成功", - "status": "狀態", "sub_mode_custom": "自定義", "submit_failed": "提交失敗", "submit_success": "提交成功", "submitted": "已提交", - "subscription_mode_month": "時長", - "subscription_package": "訂閱套餐", - "subscription_period": "訂閱週期", - "support_wallet_amount": "金額", 
"support_wallet_apply_invoice": "可開立帳單", "support_wallet_bill_tag_invoice": "帳單發票", "support_wallet_invoicing": "開票", @@ -56,7 +41,5 @@ "type": "類型", "unit_code": "統一信用代碼", "unit_code_void": "統一信用代碼格式錯誤", - "update": "更新", - "yes": "是", - "yuan": "{{amount}}元" + "update": "更新" } diff --git a/packages/web/i18n/zh-Hant/account_info.json b/packages/web/i18n/zh-Hant/account_info.json index 5646df649..71f2dd72b 100644 --- a/packages/web/i18n/zh-Hant/account_info.json +++ b/packages/web/i18n/zh-Hant/account_info.json @@ -6,15 +6,18 @@ "ai_points_calculation_standard": "AI 積分", "ai_points_usage": "AI 積分使用量", "ai_points_usage_tip": "每次呼叫 AI 模型時,都會消耗一定的 AI 積分。\n具體的計算標準可參考上方的「計費標準」。", - "app_amount": "應用數量", + "app_amount": "Agent 數量", + "app_registration_count": "備案應用數量", + "apply_app_registration": "申請應用備案", "avatar": "頭像", - "avatar_selection_exception": "頭像選擇異常", - "avatar_can_only_select_one": "頭像只能選擇一張圖片", "avatar_can_only_select_jpg_png": "頭像只能選擇 jpg 或 png 格式", + "avatar_can_only_select_one": "頭像只能選擇一張圖片", + "avatar_selection_exception": "頭像選擇異常", "balance": "餘額", "billing_standard": "計費標準", "cancel": "取消", "change": "變更", + "check_purchase_history": "查看訂單", "choose_avatar": "點選選擇頭像", "click_modify_nickname": "點選修改暱稱", "code_required": "驗證碼不能為空", @@ -25,6 +28,7 @@ "current_package": "目前套餐", "current_token_price": "目前積分價格", "dataset_amount": "知識庫數量", + "discount_coupon": "優惠券", "effective_time": "生效時間", "email_label": "信箱", "exchange": "兌換", @@ -32,6 +36,7 @@ "exchange_success": "兌換成功", "expiration_time": "到期時間", "expired": "已過期", + "expired_tips": "已經過期了哦~", "general_info": "通用資訊", "group": "群組", "help_chatbot": "機器人助手", @@ -42,6 +47,7 @@ "member_name": "成員名", "month": "月", "new_password": "新密碼", + "not_started_tips": "未開始", "notification_receiving": "通知接收", "old_password": "舊密碼", "package_and_usage": "套餐與用量", @@ -76,6 +82,8 @@ "upgrade_package": "升級套餐", "usage_balance": "使用餘額:使用餘額", "usage_balance_notice": "由於系統升級,原「自動續費從餘額扣款」模式取消,餘額儲值入口關閉。\n您的餘額可用於購買積分", + 
"used_time": "使用時間", + "used_tips": "已使用", "user_account": "帳號", "user_team_team_name": "團隊", "verification_code": "驗證碼", diff --git a/packages/web/i18n/zh-Hant/account_usage.json b/packages/web/i18n/zh-Hant/account_usage.json index 66356e02f..df03f2afb 100644 --- a/packages/web/i18n/zh-Hant/account_usage.json +++ b/packages/web/i18n/zh-Hant/account_usage.json @@ -1,4 +1,5 @@ { + "ai.query_extension_embedding": "問題優化-embedding", "ai_model": "AI 模型", "all": "所有", "app_name": "應用程式名", diff --git a/packages/web/i18n/zh-Hant/common.json b/packages/web/i18n/zh-Hant/common.json index 281ae8aaf..5e619e30b 100644 --- a/packages/web/i18n/zh-Hant/common.json +++ b/packages/web/i18n/zh-Hant/common.json @@ -33,13 +33,17 @@ "FAQ.dataset_compute_a": "1條知識庫存儲等於1條知識庫索引。\n一條分塊數據,通常對應多條索引,可以在單個知識庫集合中看到\"n組索引\"", "FAQ.dataset_compute_q": "知識庫儲存如何計算?", "FAQ.dataset_index_a": "不會,但知識庫索引超出限制時,將無法插入和更新知識庫內容。", - "FAQ.dataset_index_q": "知識庫索引超出是否會被刪除?", "FAQ.free_user_clean_a": "若免費版團隊(免費版且未購買額外方案)連續 30 天未登入系統,系統會自動清除該團隊下所有知識庫內容。", "FAQ.free_user_clean_q": "免費版的資料會被清除嗎?", + "FAQ.index_del_a": "當你刪除知識庫內容時,知識庫索引數量會在 5 分鐘內減少。", + "FAQ.index_del_q": "知識庫索引什麼時候會減少?", "FAQ.package_overlay_a": "可以。每次購買的資源包都是獨立的,在其有效期內會重疊使用。AI 點數會優先從最早到期的資源包中扣除。", "FAQ.package_overlay_q": "額外資源包可以重疊使用嗎?", + "FAQ.qpm_a": "主要指團隊每分鐘請求 Agent 的最大次數,與單個 Agent 複雜度無關。\n其他 OpenAPI 接口也受此影響,每個接口單獨計算。", + "FAQ.qpm_q": "QPM 是什麼?", "FAQ.switch_package_a": "方案使用規則為優先使用較進階的方案。因此,若購買的新方案比目前方案更進階,則新方案會立即生效;否則將繼續使用目前方案。", "FAQ.switch_package_q": "是否會切換訂閱方案?", + "FAQ.year_day_a": "一個月按 30 天計算,一年按 360 天計算。", "File": "檔案", "Finish": "完成", "Folder": "資料夾", @@ -111,10 +115,13 @@ "base_config": "基本設定", "bill_already_processed": "訂單已處理", "bill_expired": "訂單已過期", + "bill_not_found": "訂單不存在", "bill_not_pay_processed": "非在線訂單", "button.extra_dataset_size_tip": "您正在購買【額外知識庫容量】", "button.extra_points_tip": "您正在購買【額外 AI 積分】", "can_copy_content_tip": "無法使用瀏覽器自動複製,請手動複製下面內容", + "cancel_bill": "取消", + "cancel_bill_confirm": "確定要取消這個訂單嗎?", 
"chart_mode_cumulative": "累積", "chart_mode_incremental": "分時", "chat": "會話", @@ -169,6 +176,7 @@ "code_error.system_error.license_user_amount_limit": "超出系統最大用戶數量", "code_error.team_error.ai_points_not_enough": "AI 點數不足", "code_error.team_error.app_amount_not_enough": "已達應用程式數量上限", + "code_error.team_error.app_folder_amount_not_enough": "已達資料夾數量上限", "code_error.team_error.cannot_delete_default_group": "無法刪除預設群組", "code_error.team_error.cannot_delete_non_empty_org": "無法刪除非空組織", "code_error.team_error.cannot_modify_root_org": "無法修改根組織", @@ -755,12 +763,23 @@ "core.workflow.tool.Handle": "工具聯結器", "core.workflow.tool.Select Tool": "選擇工具", "core.workflow.variable": "變數", + "coupon_expired": "優惠券已過期", + "coupon_not_exist": "優惠券不存在", + "coupon_not_start": "優惠券未生效", + "coupon_redeem_failed": "優惠券核銷失敗", + "coupon_unavailable": "優惠券不可用", + "coupon_used": "優惠券已使用", "create": "建立", "create_app": "創建應用", "create_failed": "建立失敗", "create_success": "建立成功", "create_time": "建立時間", "cron_job_run_app": "排程任務", + "custom_plan_price": "定制化計費", + "custom_plan_feature_1": "優先深度技術支援", + "custom_plan_feature_2": "專屬客戶經理", + "custom_plan_feature_3": "彈性資源配置", + "custom_plan_feature_4": "安全可控", "custom_title": "自訂標題", "data_index_custom": "自定義索引", "data_index_default": "預設索引", @@ -818,6 +837,7 @@ "delete_folder": "刪除資料夾", "delete_success": "刪除成功", "delete_warning": "刪除警告", + "discount_coupon_used": "已使用優惠券:", "embedding_model_not_config": "偵測到沒有可用的索引模型", "enable_auth": "啟用鑑權", "error.Create failed": "建立失敗", @@ -846,6 +866,7 @@ "exit_directly": "直接離開", "expired_time": "到期時間", "export_to_json": "匯出為 JSON", + "extraPointsPrice": "{{price}} 元", "extraction_results": "提取結果", "failed": "失敗", "field_name": "欄位名稱", @@ -943,6 +964,10 @@ "not_support": "不支援", "not_support_wechat_image": "這是一張微信圖片", "not_yet_introduced": "暫無介紹", + "old_user_month_discount_70": "老用戶福利-月費7折", + "old_user_month_discount_70_description": "購買月費套餐享受7折優惠,下單自動生效", + "old_user_year_discount_90": "老用戶福利-年費9折", + 
"old_user_year_discount_90_description": "購買年費套餐享受9折優惠,下單自動生效", "open_folder": "開啟資料夾", "option": "選項", "page": "頁", @@ -957,6 +982,7 @@ "pay.wx_payment": "微信支付", "pay.yuan": "{{amount}} 元", "pay_alipay_payment": "支付寶支付", + "pay_bill": "支付", "pay_corporate_payment": "對公支付", "pay_money": "應付金額", "pay_success": "支付成功", @@ -1186,18 +1212,26 @@ "support.wallet.subscription.Update extra price": "價格", "support.wallet.subscription.Upgrade plan": "升級方案", "support.wallet.subscription.ai_model": "AI 語言模型", + "support.wallet.subscription.function.Audit log store duration": "{{amount}} 天團隊操作日誌記錄", "support.wallet.subscription.function.History store": "{{amount}} 天對話紀錄保留", - "support.wallet.subscription.function.Max app": "{{amount}} 個應用上限", - "support.wallet.subscription.function.Max dataset": "{{amount}} 個知識庫上限", + "support.wallet.subscription.function.Max app": "{{amount}} 個 Agent", + "support.wallet.subscription.function.Max dataset": "{{amount}} 個知識庫", "support.wallet.subscription.function.Max dataset size": "{{amount}} 組知識庫索引", "support.wallet.subscription.function.Max members": "{{amount}} 個團隊成員", - "support.wallet.subscription.function.Points": "{{amount}} AI 點數", + "support.wallet.subscription.function.Points": "{{amount}} 積分", + "support.wallet.subscription.function.Requests per minute": "{{amount}} QPM", + "support.wallet.subscription.function.Website sync per dataset": "單知識庫 {{amount}} 個網頁同步", "support.wallet.subscription.mode.Month": "按月", "support.wallet.subscription.mode.Period": "訂閱週期", "support.wallet.subscription.mode.Year": "按年", "support.wallet.subscription.mode.Year sale": "贈送兩個月", "support.wallet.subscription.point": "點數", + "support.wallet.subscription.standardSubLevel.advanced": "高級版", + "support.wallet.subscription.standardSubLevel.advanced_desc": "適合企業級的生產工具", + "support.wallet.subscription.standardSubLevel.basic": "基礎版", + "support.wallet.subscription.standardSubLevel.basic_desc": "解鎖 FastGPT 完整功能", 
"support.wallet.subscription.standardSubLevel.custom": "客製版", + "support.wallet.subscription.standardSubLevel.custom_desc": "不再受限於套餐,為您的獨特需求而生", "support.wallet.subscription.standardSubLevel.enterprise": "企業版", "support.wallet.subscription.standardSubLevel.enterprise_desc": "適合中小企業在正式環境建構知識庫應用", "support.wallet.subscription.standardSubLevel.experience": "體驗版", diff --git a/packages/web/i18n/zh-Hant/file.json b/packages/web/i18n/zh-Hant/file.json index 577e454cb..61042ae84 100644 --- a/packages/web/i18n/zh-Hant/file.json +++ b/packages/web/i18n/zh-Hant/file.json @@ -14,8 +14,6 @@ "Please select the image to upload": "請選擇要上傳的圖片", "Please wait for all files to upload": "請等待所有文件上傳完成", "common.upload_system_tools": "上傳系統工具", - "bucket_chat": "對話檔案", - "bucket_file": "知識庫檔案", "bucket_image": "圖片", "click_to_view_raw_source": "點選檢視原始來源", "common.Some images failed to process": "部分圖片處理失敗", diff --git a/packages/web/package.json b/packages/web/package.json index 9f3e6ea21..f86120dd7 100644 --- a/packages/web/package.json +++ b/packages/web/package.json @@ -17,7 +17,7 @@ "@lexical/markdown": "0.12.6", "@lexical/react": "0.12.6", "@lexical/rich-text": "0.12.6", - "@lexical/selection": "^0.14.5", + "@lexical/selection": "0.12.6", "@lexical/text": "0.12.6", "@lexical/utils": "0.12.6", "@monaco-editor/react": "^4.6.0", diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 16b130026..d539800cd 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -135,6 +135,9 @@ importers: '@fastgpt/global': specifier: workspace:* version: link:../global + '@maxmind/geoip2-node': + specifier: ^6.3.4 + version: 6.3.4 '@modelcontextprotocol/sdk': specifier: ^1.24.0 version: 1.24.0(zod@4.1.12) @@ -372,16 +375,16 @@ importers: version: 0.12.6(lexical@0.12.6) '@lexical/markdown': specifier: 0.12.6 - version: 0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.14.5)(lexical@0.12.6) + version: 
0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.12.6(lexical@0.12.6))(lexical@0.12.6) '@lexical/react': specifier: 0.12.6 version: 0.12.6(lexical@0.12.6)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(yjs@13.6.24) '@lexical/rich-text': specifier: 0.12.6 - version: 0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.14.5)(@lexical/utils@0.12.6(lexical@0.12.6))(lexical@0.12.6) + version: 0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.12.6(lexical@0.12.6))(@lexical/utils@0.12.6(lexical@0.12.6))(lexical@0.12.6) '@lexical/selection': - specifier: ^0.14.5 - version: 0.14.5 + specifier: 0.12.6 + version: 0.12.6(lexical@0.12.6) '@lexical/text': specifier: 0.12.6 version: 0.12.6(lexical@0.12.6) @@ -569,6 +572,9 @@ importers: immer: specifier: ^9.0.19 version: 9.0.21 + ip2region.js: + specifier: ^3.1.6 + version: 3.1.6 js-yaml: specifier: ^4.1.1 version: 4.1.1 @@ -2808,9 +2814,6 @@ packages: peerDependencies: lexical: 0.12.6 - '@lexical/selection@0.14.5': - resolution: {integrity: sha512-uK4X1wOSnlq2xvIIludnPb6i+grtV4IR7Y1Dg7ZGFJfk1q5FWuS9iA3iVjZbSiehgbZef5nDCPRez9WN/F5krA==} - '@lexical/table@0.12.6': resolution: {integrity: sha512-rUh9/fN831T6UpNiPuzx0x6HNi/eQ7W5AQrVBwwzEwkbwAqnE0n28DP924AUbX72UsQNHtywgmDApMoEV7W2iQ==} peerDependencies: @@ -2874,6 +2877,9 @@ packages: '@marijn/find-cluster-break@1.0.2': resolution: {integrity: sha512-l0h88YhZFyKdXIFNfSWpyjStDjGHwZ/U7iobcK1cQQD8sejsONdQtTVU+1wVN1PBw40PiiHB1vA5S7VTfQiP9g==} + '@maxmind/geoip2-node@6.3.4': + resolution: {integrity: sha512-BTRFHCX7Uie4wVSPXsWQfg0EVl4eGZgLCts0BTKAP+Eiyt1zmF2UPyuUZkaj0R59XSDYO+84o1THAtaenUoQYg==} + '@microsoft/tsdoc@0.15.1': resolution: {integrity: sha512-4aErSrCR/On/e5G2hDP0wjooqDdauzEbIq8hIkIe5pXV0rtWJZvdCEKL0ykZxex+IxIwBp0eGeV48hQN07dXtw==} @@ -7222,6 +7228,9 @@ packages: resolution: {integrity: sha512-zHtQzGojZXTwZTHQqra+ETKd4Sn3vgi7uBmlPoXVWZqYvuKmtI0l/VZTjqGmJY9x88GGOaZ9+G9ES8hC4T4X8g==} engines: {node: '>= 12'} + 
ip2region.js@3.1.6: + resolution: {integrity: sha512-fXIzLreccnOCtpsIxu9qowp09qb+an5BJ2tY8E+tXlFI3CFb+GuC8Yyw/P7ZV/4LukQwcHSPUDxHORmpYtY7OQ==} + ipaddr.js@1.9.1: resolution: {integrity: sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==} engines: {node: '>= 0.10'} @@ -7889,9 +7898,6 @@ packages: lexical@0.12.6: resolution: {integrity: sha512-Nlfjc+k9cIWpOMv7XufF0Mv09TAXSemNAuAqFLaOwTcN+RvhvYTDtVLSp9D9r+5I097fYs1Vf/UYwH2xEpkFfQ==} - lexical@0.14.5: - resolution: {integrity: sha512-ouV7Gyr9+3WT3WTrCgRAD3iZnlJWfs2/kBl2x3J2Q3X9uCWJn/zn21fQ8G1EUHlu0dvXPBmdk9hXb/FjTClt6Q==} - lib0@0.2.114: resolution: {integrity: sha512-gcxmNFzA4hv8UYi8j43uPlQ7CGcyMJ2KQb5kZASw6SnAKAf10hK12i2fjrS3Cl/ugZa5Ui6WwIu1/6MIXiHttQ==} engines: {node: '>=16'} @@ -8176,6 +8182,10 @@ packages: resolution: {integrity: sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==} engines: {node: '>= 0.4'} + maxmind@5.0.1: + resolution: {integrity: sha512-hYxQxvHkBUlyF34f7IlQOb60rytezCi2oZ8H/BtZpcoodXTlcK1eLgf7kY2TofHqBC3o+Hqtvde9kS72gFQSDw==} + engines: {node: '>=12', npm: '>=6'} + mdast-util-find-and-replace@3.0.2: resolution: {integrity: sha512-Tmd1Vg/m3Xz43afeNxDIhWRtFZgM2VLyaf4vSTYwudTyeuTneoL3qtWMA5jeLyz/O1vDJmmV4QuScFCA2tBPwg==} @@ -8556,6 +8566,10 @@ packages: mlly@1.7.4: resolution: {integrity: sha512-qmdSIPC4bDJXgZTCR7XosJiNKySV7O215tsPtDN9iEO/7q/76b/ijtgRu/+epFXSJhijtTCCGp3DWS549P3xKw==} + mmdb-lib@3.0.1: + resolution: {integrity: sha512-dyAyMR+cRykZd1mw5altC9f4vKpCsuywPwo8l/L5fKqDay2zmqT0mF/BvUoXnQiqGn+nceO914rkPKJoyFnGxA==} + engines: {node: '>=10', npm: '>=6'} + mnemonist@0.39.6: resolution: {integrity: sha512-A/0v5Z59y63US00cRSLiloEIw3t5G+MiKz4BhX21FI+YBJXBOGW0ohFxTxO08dsOYlzxo87T7vGfZKYp2bcAWA==} @@ -8754,6 +8768,7 @@ packages: next@15.3.5: resolution: {integrity: sha512-RkazLBMMDJSJ4XZQ81kolSpwiCt907l0xcgcpF4xC2Vml6QVcPNXW0NQRwQ80FFtSn7UM52XN0anaw8TEJXaiw==} engines: {node: ^18.18.0 || ^19.8.0 || >= 20.0.0} 
+ deprecated: This version has a security vulnerability. Please upgrade to a patched version. See https://nextjs.org/blog/CVE-2025-66478 for more details. hasBin: true peerDependencies: '@opentelemetry/api': ^1.1.0 @@ -9509,7 +9524,7 @@ packages: react-dom@19.1.1: resolution: {integrity: sha512-Dlq/5LAZgF0Gaz6yiqZCf6VCcZs1ghAJyrsu84Q/GT0gV+mCxbfmKNoGRKBYMJ8IEdGPqu49YWXD02GCknEDkw==} peerDependencies: - react: ^19.1.1 + react: 18.3.1 react-error-boundary@3.1.4: resolution: {integrity: sha512-uM9uPzZJTF6wRQORmSrvOIgt4lJ9MC1sNgEOj2XGsDTRE4kmpWxg7ENK9EWNKJRMAOY9z0MuF4yIfl6gp4sotA==} @@ -10534,6 +10549,10 @@ packages: tiny-invariant@1.3.3: resolution: {integrity: sha512-+FbBPE1o9QAYvviau/qC5SE3caw21q3xkvWKBtja5vgqOWIHHJ3ioaq1VPfn/Szqctz2bU/oYeKd9/z5BL+PVg==} + tiny-lru@11.4.5: + resolution: {integrity: sha512-hkcz3FjNJfKXjV4mjQ1OrXSLAehg8Hw+cEZclOVT+5c/cWQWImQ9wolzTjth+dmmDe++p3bme3fTxz6Q4Etsqw==} + engines: {node: '>=12'} + tinybench@2.9.0: resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==} @@ -13628,19 +13647,6 @@ snapshots: - '@lexical/clipboard' - '@lexical/selection' - '@lexical/markdown@0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.14.5)(lexical@0.12.6)': - dependencies: - '@lexical/code': 0.12.6(lexical@0.12.6) - '@lexical/link': 0.12.6(lexical@0.12.6) - '@lexical/list': 0.12.6(lexical@0.12.6) - '@lexical/rich-text': 0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.14.5)(@lexical/utils@0.12.6(lexical@0.12.6))(lexical@0.12.6) - '@lexical/text': 0.12.6(lexical@0.12.6) - '@lexical/utils': 0.12.6(lexical@0.12.6) - lexical: 0.12.6 - transitivePeerDependencies: - - '@lexical/clipboard' - - '@lexical/selection' - '@lexical/offset@0.12.6(lexical@0.12.6)': dependencies: lexical: 0.12.6 @@ -13689,21 +13695,10 @@ snapshots: '@lexical/utils': 0.12.6(lexical@0.12.6) lexical: 0.12.6 - 
'@lexical/rich-text@0.12.6(@lexical/clipboard@0.12.6(lexical@0.12.6))(@lexical/selection@0.14.5)(@lexical/utils@0.12.6(lexical@0.12.6))(lexical@0.12.6)': - dependencies: - '@lexical/clipboard': 0.12.6(lexical@0.12.6) - '@lexical/selection': 0.14.5 - '@lexical/utils': 0.12.6(lexical@0.12.6) - lexical: 0.12.6 - '@lexical/selection@0.12.6(lexical@0.12.6)': dependencies: lexical: 0.12.6 - '@lexical/selection@0.14.5': - dependencies: - lexical: 0.14.5 - '@lexical/table@0.12.6(lexical@0.12.6)': dependencies: '@lexical/utils': 0.12.6(lexical@0.12.6) @@ -13782,6 +13777,10 @@ snapshots: '@marijn/find-cluster-break@1.0.2': {} + '@maxmind/geoip2-node@6.3.4': + dependencies: + maxmind: 5.0.1 + '@microsoft/tsdoc@0.15.1': {} '@mixmark-io/domino@2.2.0': {} @@ -17891,7 +17890,7 @@ snapshots: transitivePeerDependencies: - supports-color - eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0): + eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0): dependencies: debug: 3.2.7 optionalDependencies: @@ -17902,7 +17901,7 @@ snapshots: transitivePeerDependencies: - supports-color - eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1): + eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1): dependencies: debug: 3.2.7 optionalDependencies: @@ -17924,7 +17923,7 @@ snapshots: doctrine: 2.1.0 eslint: 8.56.0 
eslint-import-resolver-node: 0.3.9 - eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0) + eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0) hasown: 2.0.2 is-core-module: 2.16.1 is-glob: 4.0.3 @@ -17953,7 +17952,7 @@ snapshots: doctrine: 2.1.0 eslint: 8.57.1 eslint-import-resolver-node: 0.3.9 - eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1) + eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) hasown: 2.0.2 is-core-module: 2.16.1 is-glob: 4.0.3 @@ -19226,6 +19225,8 @@ snapshots: jsbn: 1.1.0 sprintf-js: 1.1.3 + ip2region.js@3.1.6: {} + ipaddr.js@1.9.1: {} ipaddr.js@2.2.0: {} @@ -20032,8 +20033,6 @@ snapshots: lexical@0.12.6: {} - lexical@0.14.5: {} - lib0@0.2.114: dependencies: isomorphic.js: 0.2.5 @@ -20324,6 +20323,11 @@ snapshots: math-intrinsics@1.1.0: {} + maxmind@5.0.1: + dependencies: + mmdb-lib: 3.0.1 + tiny-lru: 11.4.5 + mdast-util-find-and-replace@3.0.2: dependencies: '@types/mdast': 4.0.4 @@ -21023,6 +21027,8 @@ snapshots: pkg-types: 1.3.1 ufo: 1.5.4 + mmdb-lib@3.0.1: {} + mnemonist@0.39.6: dependencies: obliterator: 2.0.5 @@ -23464,6 +23470,8 @@ snapshots: tiny-invariant@1.3.3: {} + tiny-lru@11.4.5: {} + tinybench@2.9.0: {} tinyexec@0.3.2: {} diff --git a/projects/app/.env.template b/projects/app/.env.template index 6527eb4db..497cf9387 100644 --- 
a/projects/app/.env.template +++ b/projects/app/.env.template @@ -43,6 +43,7 @@ S3_ACCESS_KEY=minioadmin S3_SECRET_KEY=minioadmin S3_PUBLIC_BUCKET=fastgpt-public # 插件文件存储公开桶 S3_PRIVATE_BUCKET=fastgpt-private # 插件文件存储公开桶 +S3_PATH_STYLE=false # forcePathStyle 默认为 true, 当且仅当设置为 false 时关闭, 其他值都为 true # Redis URL REDIS_URL=redis://default:mypassword@127.0.0.1:6379 @@ -79,6 +80,9 @@ SIGNOZ_STORE_LEVEL=warn # 插件市场 MARKETPLACE_URL=https://marketplace.fastgpt.cn +# 申请应用备案地址 +APP_REGISTRATION_URL= + # 安全配置 # 启动 IP 限流(true),部分接口增加了 ip 限流策略,防止非正常请求操作。 USE_IP_LIMIT=false @@ -104,6 +108,8 @@ CHAT_MAX_QPM=5000 ALLOWED_ORIGINS= # 是否展示兑换码功能 SHOW_COUPON=false +# 是否展示优惠券功能 +SHOW_DISCOUNT_COUPON=false # 自定义 config.json 路径 CONFIG_JSON_PATH= diff --git a/projects/app/Dockerfile b/projects/app/Dockerfile index b97bba3f5..ce033e038 100644 --- a/projects/app/Dockerfile +++ b/projects/app/Dockerfile @@ -73,9 +73,11 @@ COPY --from=maindeps /app/node_modules/tiktoken ./node_modules/tiktoken RUN rm -rf ./node_modules/tiktoken/encoders COPY --from=maindeps /app/node_modules/@zilliz/milvus2-sdk-node ./node_modules/@zilliz/milvus2-sdk-node # copy package.json to version file -COPY --from=builder /app/projects/app/package.json ./package.json +COPY --from=builder /app/projects/app/package.json ./package.json # copy config COPY ./projects/app/data/config.json /app/data/config.json +# copy GeoLite2-City.mmdb +COPY ./projects/app/data/GeoLite2-City.mmdb /app/data/GeoLite2-City.mmdb RUN chown -R nextjs:nodejs /app/data @@ -92,4 +94,4 @@ USER nextjs ENV serverPath=./projects/app/server.js -ENTRYPOINT ["sh","-c","node --max-old-space-size=4096 ${serverPath}"] \ No newline at end of file +ENTRYPOINT ["sh","-c","node --max-old-space-size=4096 ${serverPath}"] diff --git a/projects/app/data/GeoLite2-City.mmdb b/projects/app/data/GeoLite2-City.mmdb new file mode 100644 index 000000000..70f64acea Binary files /dev/null and b/projects/app/data/GeoLite2-City.mmdb differ diff --git a/projects/app/package.json 
b/projects/app/package.json index ad380aeea..ff61f934f 100644 --- a/projects/app/package.json +++ b/projects/app/package.json @@ -1,6 +1,6 @@ { "name": "app", - "version": "4.14.3", + "version": "4.14.4", "private": false, "scripts": { "dev": "npm run build:workers && next dev", @@ -39,6 +39,7 @@ "hyperdown": "^2.4.29", "i18next": "23.16.8", "immer": "^9.0.19", + "ip2region.js": "^3.1.6", "js-yaml": "^4.1.1", "json5": "^2.2.3", "jsondiffpatch": "^0.7.2", diff --git a/projects/app/public/imgs/system/discount70CN.svg b/projects/app/public/imgs/system/discount70CN.svg new file mode 100644 index 000000000..6861cce68 --- /dev/null +++ b/projects/app/public/imgs/system/discount70CN.svg @@ -0,0 +1,22 @@ + + + + + + + + + + + + + + + + + + + + + + diff --git a/projects/app/public/imgs/system/discount70EN.svg b/projects/app/public/imgs/system/discount70EN.svg new file mode 100644 index 000000000..8753263d9 --- /dev/null +++ b/projects/app/public/imgs/system/discount70EN.svg @@ -0,0 +1,22 @@ + + + + + + + + + + + + + + + + + + + + + + diff --git a/projects/app/public/imgs/system/discount90CN.svg b/projects/app/public/imgs/system/discount90CN.svg new file mode 100644 index 000000000..f8334ef90 --- /dev/null +++ b/projects/app/public/imgs/system/discount90CN.svg @@ -0,0 +1,22 @@ + + + + + + + + + + + + + + + + + + + + + + diff --git a/projects/app/public/imgs/system/discount90EN.svg b/projects/app/public/imgs/system/discount90EN.svg new file mode 100644 index 000000000..e9943434a --- /dev/null +++ b/projects/app/public/imgs/system/discount90EN.svg @@ -0,0 +1,22 @@ + + + + + + + + + + + + + + + + + + + + + + diff --git a/projects/app/src/components/Layout/auth.tsx b/projects/app/src/components/Layout/auth.tsx index 5d717d045..d8f54e43e 100644 --- a/projects/app/src/components/Layout/auth.tsx +++ b/projects/app/src/components/Layout/auth.tsx @@ -35,10 +35,6 @@ const Auth = ({ children }: { children: JSX.Element | React.ReactNode }) => { { refetchInterval: 10 * 60 * 1000, 
onError(error) { - console.log('error->', error); - router.replace( - `/login?lastRoute=${encodeURIComponent(location.pathname + location.search)}` - ); toast({ status: 'warning', title: t('common:support.user.Need to login') diff --git a/projects/app/src/components/Select/I18nLngSelector.tsx b/projects/app/src/components/Select/I18nLngSelector.tsx index 552dfe521..533ce9441 100644 --- a/projects/app/src/components/Select/I18nLngSelector.tsx +++ b/projects/app/src/components/Select/I18nLngSelector.tsx @@ -2,13 +2,29 @@ import { Box, Flex } from '@chakra-ui/react'; import MySelect from '@fastgpt/web/components/common/MySelect'; import { useI18nLng } from '@fastgpt/web/hooks/useI18n'; import { useTranslation } from 'next-i18next'; -import { useMemo } from 'react'; +import { useCallback, useMemo } from 'react'; import MyIcon from '@fastgpt/web/components/common/Icon'; +import type { LangEnum } from '@fastgpt/global/common/i18n/type'; import { langMap } from '@fastgpt/global/common/i18n/type'; +import { useUserStore } from '@/web/support/user/useUserStore'; const I18nLngSelector = () => { const { i18n } = useTranslation(); - const { onChangeLng } = useI18nLng(); + const { onChangeLng: onChangeLngI18n } = useI18nLng(); + const { userInfo, updateUserInfo } = useUserStore(); + + const onChangeLng = useCallback( + async (lng: `${LangEnum}`) => { + if (userInfo?.username) { + // logined + await updateUserInfo({ + language: lng + }); + } + await onChangeLngI18n(lng); + }, + [userInfo?.username, onChangeLngI18n, updateUserInfo] + ); const list = useMemo(() => { return Object.entries(langMap).map(([key, lang]) => ({ diff --git a/projects/app/src/components/core/app/FileSelector/index.tsx b/projects/app/src/components/core/app/FileSelector/index.tsx index e48b94dc4..d550e99d4 100644 --- a/projects/app/src/components/core/app/FileSelector/index.tsx +++ b/projects/app/src/components/core/app/FileSelector/index.tsx @@ -122,9 +122,6 @@ const FileSelector = ({ 
Object.entries(fields).forEach(([k, v]) => formData.set(k, v)); formData.set('file', file.rawFile); await POST(url, formData, { - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if (!e.total) return; const percent = Math.round((e.loaded / e.total) * 100); diff --git a/projects/app/src/components/core/chat/ChatContainer/ChatBox/hooks/useFileUpload.tsx b/projects/app/src/components/core/chat/ChatContainer/ChatBox/hooks/useFileUpload.tsx index f9f530508..b60071a91 100644 --- a/projects/app/src/components/core/chat/ChatContainer/ChatBox/hooks/useFileUpload.tsx +++ b/projects/app/src/components/core/chat/ChatContainer/ChatBox/hooks/useFileUpload.tsx @@ -188,9 +188,6 @@ export const useFileUpload = (props: UseFileUploadOptions) => { Object.entries(fields).forEach(([k, v]) => formData.set(k, v)); formData.set('file', copyFile.rawFile); await POST(url, formData, { - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if (!e.total) return; const percent = Math.round((e.loaded / e.total) * 100); diff --git a/projects/app/src/components/core/dataset/SearchParamsTip.tsx b/projects/app/src/components/core/dataset/SearchParamsTip.tsx index e231ca963..b85ab81d7 100644 --- a/projects/app/src/components/core/dataset/SearchParamsTip.tsx +++ b/projects/app/src/components/core/dataset/SearchParamsTip.tsx @@ -5,7 +5,7 @@ import { DatasetSearchModeMap } from '@fastgpt/global/core/dataset/constants'; import { useTranslation } from 'next-i18next'; -import React, { useMemo } from 'react'; +import React, { useEffect, useMemo } from 'react'; import MyIcon from '@fastgpt/web/components/common/Icon'; import { getWebLLMModel } from '@/web/common/system/utils'; @@ -27,16 +27,15 @@ const SearchParamsTip = ({ queryExtensionModel?: string; }) => { const { t } = useTranslation(); - const { reRankModelList, llmModelList } = useSystemStore(); + const { reRankModelList } = useSystemStore(); const 
hasReRankModel = reRankModelList.length > 0; const hasEmptyResponseMode = responseEmptyText !== undefined; const hasSimilarityMode = usingReRank || searchMode === DatasetSearchModeEnum.embedding; const extensionModelName = useMemo( - () => - datasetSearchUsingExtensionQuery ? getWebLLMModel(queryExtensionModel)?.name : undefined, - [datasetSearchUsingExtensionQuery, queryExtensionModel, llmModelList] + () => getWebLLMModel(queryExtensionModel)?.name, + [queryExtensionModel] ); return ( diff --git a/projects/app/src/components/support/wallet/NotSufficientModal/index.tsx b/projects/app/src/components/support/wallet/NotSufficientModal/index.tsx index f0d3a7747..d6b3cfb62 100644 --- a/projects/app/src/components/support/wallet/NotSufficientModal/index.tsx +++ b/projects/app/src/components/support/wallet/NotSufficientModal/index.tsx @@ -11,6 +11,7 @@ import { useUserStore } from '@/web/support/user/useUserStore'; import { standardSubLevelMap } from '@fastgpt/global/support/wallet/sub/constants'; import { TeamErrEnum } from '@fastgpt/global/common/error/code/team'; import { useMount } from 'ahooks'; +import { useRouter } from 'next/router'; const NotSufficientModal = () => { const { t } = useTranslation(); @@ -62,7 +63,7 @@ const NotSufficientModal = () => { export default NotSufficientModal; -const RechargeModal = ({ +export const RechargeModal = ({ onClose, onPaySuccess }: { @@ -70,7 +71,9 @@ const RechargeModal = ({ onPaySuccess: () => void; }) => { const { t } = useTranslation(); + const router = useRouter(); const { teamPlanStatus, initTeamPlanStatus } = useUserStore(); + const { subPlans } = useSystemStore(); useMount(() => { initTeamPlanStatus(); @@ -78,8 +81,11 @@ const RechargeModal = ({ const planName = useMemo(() => { if (!teamPlanStatus?.standard?.currentSubLevel) return ''; - return standardSubLevelMap[teamPlanStatus.standard.currentSubLevel].label; - }, [teamPlanStatus?.standard?.currentSubLevel]); + return ( + 
subPlans?.standard?.[teamPlanStatus.standard.currentSubLevel]?.name || + t(standardSubLevelMap[teamPlanStatus.standard.currentSubLevel]?.label as any) + ); + }, [teamPlanStatus?.standard?.currentSubLevel, subPlans?.standard, t]); const [tab, setTab] = useState<'standard' | 'extra'>('standard'); @@ -96,28 +102,87 @@ const RechargeModal = ({ > - - {t('common:support.wallet.subscription.Current plan')} - - - {t(planName as any)} - + + + {t('common:support.wallet.subscription.Current plan')} + + + {t(planName as any)} + + + + - - - {t('common:info.resource')} - - - {`${t('common:support.user.team.Dataset usage')}:`} - {`${teamPlanStatus?.usedDatasetIndexSize} / ${teamPlanStatus?.datasetMaxSize || t('account_info:unlimited')}`} - {`${t('common:support.wallet.subscription.AI points usage')}:`} - {`${Math.round(teamPlanStatus?.usedPoints || 0)} / ${teamPlanStatus?.totalPoints || t('account_info:unlimited')}`} - + + + + + {t('common:support.wallet.subscription.AI points usage')} + + {`${teamPlanStatus?.usedPoints || 0} / ${teamPlanStatus?.totalPoints ?? t('common:Unlimited')}`} + + + + + + + + + {t('common:support.user.team.Dataset usage')} + + {`${teamPlanStatus?.usedDatasetIndexSize || 0} / ${teamPlanStatus?.datasetMaxSize ?? 
t('common:Unlimited')}`} + + + + + any }) => { + onSuccess, + discountCouponName, + onClose +}: QRPayProps & { + tip?: string; + onSuccess?: () => any; + onClose?: () => void; +}) => { const { t } = useTranslation(); const canvasRef = useRef(null); const toast = useToast(); @@ -161,13 +168,25 @@ const QRCodePayModal = ({ title={t('common:user.Pay')} iconSrc="/imgs/modal/wallet.svg" w={'600px'} + onClose={onClose} > {tip && } {t('common:pay_money')} - + ¥{readPrice.toFixed(2)} + {discountCouponName && ( + + {t('common:discount_coupon_used') + t(discountCouponName)} + + )} {renderPaymentContent()} diff --git a/projects/app/src/components/support/wallet/StandardPlanContentList.tsx b/projects/app/src/components/support/wallet/StandardPlanContentList.tsx index a3d1e1692..0bf5600a1 100644 --- a/projects/app/src/components/support/wallet/StandardPlanContentList.tsx +++ b/projects/app/src/components/support/wallet/StandardPlanContentList.tsx @@ -34,29 +34,66 @@ const StandardPlanContentList = ({ price: plan.price * (mode === SubModeEnum.month ? 1 : 10), level: level as `${StandardSubLevelEnum}`, ...standardSubLevelMap[level as `${StandardSubLevelEnum}`], + totalPoints: + standplan?.totalPoints || plan.totalPoints * (mode === SubModeEnum.month ? 1 : 12), + requestsPerMinute: standplan?.requestsPerMinute || plan.requestsPerMinute || 2000, maxTeamMember: standplan?.maxTeamMember || plan.maxTeamMember, maxAppAmount: standplan?.maxApp || plan.maxAppAmount, maxDatasetAmount: standplan?.maxDataset || plan.maxDatasetAmount, - chatHistoryStoreDuration: plan.chatHistoryStoreDuration, - maxDatasetSize: plan.maxDatasetSize, - permissionCustomApiKey: plan.permissionCustomApiKey, - permissionCustomCopyright: plan.permissionCustomCopyright, - trainingWeight: plan.trainingWeight, - totalPoints: plan.totalPoints * (mode === SubModeEnum.month ? 
1 : 12), - permissionWebsiteSync: plan.permissionWebsiteSync, - permissionTeamOperationLog: plan.permissionTeamOperationLog + maxDatasetSize: standplan?.maxDatasetSize || plan.maxDatasetSize, + websiteSyncPerDataset: standplan?.websiteSyncPerDataset || plan.websiteSyncPerDataset, + chatHistoryStoreDuration: + standplan?.chatHistoryStoreDuration || plan.chatHistoryStoreDuration, + auditLogStoreDuration: standplan?.auditLogStoreDuration || plan.auditLogStoreDuration, + appRegistrationCount: standplan?.appRegistrationCount || plan.appRegistrationCount, + ticketResponseTime: standplan?.ticketResponseTime || plan.ticketResponseTime }; }, [ subPlans?.standard, level, mode, + standplan?.totalPoints, + standplan?.requestsPerMinute, standplan?.maxTeamMember, standplan?.maxApp, - standplan?.maxDataset + standplan?.maxDataset, + standplan?.maxDatasetSize, + standplan?.websiteSyncPerDataset, + standplan?.chatHistoryStoreDuration, + standplan?.auditLogStoreDuration, + standplan?.appRegistrationCount, + standplan?.ticketResponseTime ]); return planContent ? 
( + + + + + {t('common:support.wallet.subscription.function.Points', { + amount: planContent.totalPoints + })} + + + {({ onOpen }) => ( + + )} + + + + + + + {t('common:support.wallet.subscription.function.Max dataset size', { + amount: planContent.maxDatasetSize + })} + + @@ -89,52 +126,52 @@ const StandardPlanContentList = ({ })} - - - - {t('common:support.wallet.subscription.function.Max dataset size', { - amount: planContent.maxDatasetSize - })} - - - - - - - {t('common:support.wallet.subscription.function.Points', { - amount: planContent.totalPoints - })} - - - {({ onOpen }) => ( - - )} - - - - - - - {t('common:support.wallet.subscription.Training weight', { - weight: planContent.trainingWeight - })} - - - {!!planContent.permissionWebsiteSync && ( - - - {t('common:support.wallet.subscription.web_site_sync')} - - )} - {!!planContent.permissionTeamOperationLog && ( + {!!planContent.auditLogStoreDuration && ( - {t('common:support.wallet.subscription.team_operation_log')} + {t('common:support.wallet.subscription.function.Audit log store duration', { + amount: planContent.auditLogStoreDuration + })} + + + )} + + + + {t('common:support.wallet.subscription.function.Requests per minute', { + amount: planContent.requestsPerMinute + })} + + + + {!!planContent.websiteSyncPerDataset && ( + + + + {t('common:support.wallet.subscription.function.Website sync per dataset', { + amount: planContent.websiteSyncPerDataset + })} + + + )} + {!!planContent.ticketResponseTime && ( + + + + {t('common:support.wallet.subscription.function.Ticket response time', { + amount: planContent.ticketResponseTime + })} + + + )} + {!!planContent.appRegistrationCount && ( + + + + {t('common:support.wallet.subscription.function.App registration count', { + amount: planContent.appRegistrationCount + })} )} diff --git a/projects/app/src/global/core/api/appReq.d.ts b/projects/app/src/global/core/api/appReq.d.ts index 355bdee56..6fe202f19 100644 --- a/projects/app/src/global/core/api/appReq.d.ts +++ 
b/projects/app/src/global/core/api/appReq.d.ts @@ -1,6 +1,7 @@ import type { ChatSourceEnum } from '@fastgpt/global/core/chat/constants'; import { UsageSourceEnum } from '@fastgpt/global/support/wallet/usage/constants'; import type { PaginationProps } from '@fastgpt/web/common/fetch/type'; +import type { I18nName } from '@fastgpt/service/common/geo/type'; export type GetAppChatLogsProps = { appId: string; @@ -9,6 +10,7 @@ export type GetAppChatLogsProps = { sources?: ChatSourceEnum[]; tmbIds?: string[]; chatSearch?: string; + locale?: keyof I18nName; }; export type GetAppChatLogsParams = PaginationProps; diff --git a/projects/app/src/instrumentation.ts b/projects/app/src/instrumentation.ts index b16acffa2..8df76a2b5 100644 --- a/projects/app/src/instrumentation.ts +++ b/projects/app/src/instrumentation.ts @@ -23,7 +23,8 @@ export async function register() { { getSystemTools }, { trackTimerProcess }, { initBullMQWorkers }, - { initS3Buckets } + { initS3Buckets }, + { initGeo } ] = await Promise.all([ import('@fastgpt/service/common/mongo/init'), import('@fastgpt/service/common/mongo/index'), @@ -40,7 +41,8 @@ export async function register() { import('@fastgpt/service/core/app/tool/controller'), import('@fastgpt/service/common/middle/tracks/processor'), import('@/service/common/bullmq'), - import('@fastgpt/service/common/s3') + import('@fastgpt/service/common/s3'), + import('@fastgpt/service/common/geo') ]); // connect to signoz @@ -53,6 +55,9 @@ export async function register() { // init s3 buckets initS3Buckets(); + // init geo + initGeo(); + // Connect to MongoDB await Promise.all([ connectMongo({ diff --git a/projects/app/src/pageComponents/account/bill/ApplyInvoiceModal.tsx b/projects/app/src/pageComponents/account/bill/ApplyInvoiceModal.tsx index cafc6221c..ff05ca8fe 100644 --- a/projects/app/src/pageComponents/account/bill/ApplyInvoiceModal.tsx +++ b/projects/app/src/pageComponents/account/bill/ApplyInvoiceModal.tsx @@ -131,7 +131,7 @@ const ApplyInvoiceModal 
= ({ onClose }: { onClose: () => void }) => { {t('account_bill:total_amount')} - {t('account_bill:yuan', { amount: formatStorePrice2Read(totalPrice) })} + {t('account:yuan', { amount: formatStorePrice2Read(totalPrice) })} @@ -205,7 +205,7 @@ const ApplyInvoiceModal = ({ onClose }: { onClose: () => void }) => { {t('account_bill:type')} {t('account_bill:time')} - {t('account_bill:support_wallet_amount')} + {t('account:support_wallet_amount')} @@ -233,9 +233,7 @@ const ApplyInvoiceModal = ({ onClose }: { onClose: () => void }) => { ? dayjs(item.createTime).format('YYYY/MM/DD HH:mm:ss') : '-'} - - {t('account_bill:yuan', { amount: formatStorePrice2Read(item.price) })} - + {t('account:yuan', { amount: formatStorePrice2Read(item.price) })} ))} diff --git a/projects/app/src/pageComponents/account/bill/BillDetailModal.tsx b/projects/app/src/pageComponents/account/bill/BillDetailModal.tsx new file mode 100644 index 000000000..484de813c --- /dev/null +++ b/projects/app/src/pageComponents/account/bill/BillDetailModal.tsx @@ -0,0 +1,123 @@ +import React from 'react'; +import MyModal from '@fastgpt/web/components/common/MyModal'; +import { Box, Flex, ModalBody } from '@chakra-ui/react'; +import { useTranslation } from 'next-i18next'; +import dayjs from 'dayjs'; +import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel'; +import { + billTypeMap, + billStatusMap, + billPayWayMap +} from '@fastgpt/global/support/wallet/bill/constants'; +import { formatStorePrice2Read } from '@fastgpt/global/support/wallet/usage/tools'; +import { standardSubLevelMap, subModeMap } from '@fastgpt/global/support/wallet/sub/constants'; +import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; +import { getBillDetail } from '@/web/support/wallet/bill/api'; + +type BillDetailModalProps = { + billId: string; + onClose: () => void; +}; + +const BillDetailModal = ({ billId, onClose }: BillDetailModalProps) => { + const { t } = useTranslation(); + + const { data: bill, loading } = 
useRequest2(() => getBillDetail(billId), { + refreshDeps: [billId], + manual: false + }); + + return ( + + + + {t('account:order_number')}: + {bill?.orderId} + + + {t('account:generation_time')}: + {dayjs(bill?.createTime).format('YYYY/MM/DD HH:mm:ss')} + + {bill?.type && ( + + {t('account:order_type')}: + {t(billTypeMap[bill.type]?.label as any)} + + )} + {bill?.status && ( + + {t('account:status')}: + {t(billStatusMap[bill.status]?.label as any)} + + )} + {!!bill?.couponName && ( + + {t('account_info:discount_coupon')}: + {t(bill?.couponName as any)} + + )} + {!!bill?.metadata?.payWay && ( + + {t('account:payment_method')}: + {t(billPayWayMap[bill?.metadata.payWay]?.label as any)} + + )} + {!!bill?.price && ( + + {t('account:support_wallet_amount')}: + {t('account:yuan', { amount: formatStorePrice2Read(bill?.price) })} + + )} + {bill?.metadata && !!bill?.price && ( + + {t('account:has_invoice')}: + {bill?.metadata.payWay === 'balance' ? ( + t('user:bill.not_need_invoice') + ) : ( + {bill.hasInvoice ? 
t('account:yes') : t('account:no')} + )} + + )} + {!!bill?.metadata?.subMode && ( + + {t('account:subscription_period')}: + {t(subModeMap[bill.metadata.subMode]?.label as any)} + + )} + {!!bill?.metadata?.standSubLevel && ( + + {t('account:subscription_package')}: + {t(standardSubLevelMap[bill.metadata.standSubLevel]?.label as any)} + + )} + {bill?.metadata?.month !== undefined && ( + + {t('account:subscription_mode_month')}: + {`${bill.metadata?.month} ${t('account:month')}`} + + )} + {bill?.metadata?.datasetSize !== undefined && ( + + {t('account:extra_dataset_size')}: + {bill.metadata?.datasetSize} + + )} + {bill?.metadata?.extraPoints !== undefined && ( + + {t('account:extra_ai_points')}: + {bill.metadata.extraPoints} + + )} + + + ); +}; + +export default BillDetailModal; diff --git a/projects/app/src/pageComponents/account/bill/BillTable.tsx b/projects/app/src/pageComponents/account/bill/BillTable.tsx index b12c73666..2bd276acb 100644 --- a/projects/app/src/pageComponents/account/bill/BillTable.tsx +++ b/projects/app/src/pageComponents/account/bill/BillTable.tsx @@ -9,36 +9,40 @@ import { Td, TableContainer, Flex, - Box, - ModalBody + Box } from '@chakra-ui/react'; -import { getBills, checkBalancePayResult } from '@/web/support/wallet/bill/api'; -import type { BillSchemaType } from '@fastgpt/global/support/wallet/bill/type.d'; +import { + getBills, + checkBalancePayResult, + cancelBill, + putUpdatePayment +} from '@/web/support/wallet/bill/api'; import dayjs from 'dayjs'; import { formatStorePrice2Read } from '@fastgpt/global/support/wallet/usage/tools'; import { useToast } from '@fastgpt/web/hooks/useToast'; import MyIcon from '@fastgpt/web/components/common/Icon'; import { useTranslation } from 'next-i18next'; -import type { BillTypeEnum } from '@fastgpt/global/support/wallet/bill/constants'; +import type { BillPayWayEnum, BillTypeEnum } from '@fastgpt/global/support/wallet/bill/constants'; import { BillStatusEnum, - billPayWayMap, billStatusMap, billTypeMap 
} from '@fastgpt/global/support/wallet/bill/constants'; import MyBox from '@fastgpt/web/components/common/MyBox'; import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; -import { standardSubLevelMap, subModeMap } from '@fastgpt/global/support/wallet/sub/constants'; import MySelect from '@fastgpt/web/components/common/MySelect'; -import MyModal from '@fastgpt/web/components/common/MyModal'; import { usePagination } from '@fastgpt/web/hooks/usePagination'; -import FormLabel from '@fastgpt/web/components/common/MyBox/FormLabel'; +import QRCodePayModal, { type QRPayProps } from '@/components/support/wallet/QRCodePayModal'; +import PopoverConfirm from '@fastgpt/web/components/common/MyPopover/PopoverConfirm'; +import BillDetailModal from './BillDetailModal'; +import type { BillSchemaType } from '@fastgpt/global/support/wallet/bill/type'; const BillTable = () => { const { t } = useTranslation(); const { toast } = useToast(); const [billType, setBillType] = useState(undefined); - const [billDetail, setBillDetail] = useState(); + const [billDetailId, setBillDetailId] = useState(); + const [qrPayData, setQRPayData] = useState(); const billTypeList = useMemo( () => @@ -72,13 +76,26 @@ const BillTable = () => { }); const { runAsync: handleRefreshPayOrder, loading: isRefreshing } = useRequest2( - async (payId: string) => { - const { status, description } = await checkBalancePayResult(payId); + async (bill: BillSchemaType) => { + const { status, description } = await checkBalancePayResult(bill._id); if (status === BillStatusEnum.SUCCESS) { toast({ title: t('common:pay_success'), status: 'success' }); + } else if (status === BillStatusEnum.NOTPAY) { + const payWay = bill.metadata?.payWay as BillPayWayEnum; + const paymentData = await putUpdatePayment({ + billId: bill._id, + payWay + }); + + setQRPayData({ + billId: bill._id, + readPrice: formatStorePrice2Read(bill.price), + payment: payWay, + ...paymentData + }); } else { toast({ title: t(description as any), @@ -92,6 
+109,18 @@ const BillTable = () => { } ); + const { runAsync: handleCancelBill, loading: isCancelling } = useRequest2( + async (billId: string) => { + await cancelBill({ billId }); + }, + { + manual: true, + onSuccess: () => { + getData(1); + } + } + ); + return ( @@ -111,8 +140,8 @@ const BillTable = () => { > {t('account_bill:time')} - {t('account_bill:support_wallet_amount')} - {t('account_bill:status')} + {t('account:support_wallet_amount')} + {t('account:status')} @@ -124,20 +153,37 @@ const BillTable = () => { {item.createTime ? dayjs(item.createTime).format('YYYY/MM/DD HH:mm:ss') : '-'} - {t('account_bill:yuan', { amount: formatStorePrice2Read(item.price) })} + {t('account:yuan', { amount: formatStorePrice2Read(item.price) })} {t(billStatusMap[item.status]?.label as any)} - + {item.status === 'NOTPAY' && ( - + <> + + handleCancelBill(item._id)} + Trigger={ + + } + /> + )} - @@ -165,96 +211,29 @@ const BillTable = () => { )} - {!!billDetail && ( - setBillDetail(undefined)} /> + {!!billDetailId && ( + setBillDetailId(undefined)} /> + )} + {!!qrPayData && ( + { + setQRPayData(undefined); + getData(1); + }} + discountCouponName={qrPayData.discountCouponName} + {...qrPayData} + onSuccess={() => { + setQRPayData(undefined); + toast({ + title: t('common:pay_success'), + status: 'success' + }); + getData(1); + }} + /> )} ); }; export default BillTable; - -function BillDetailModal({ bill, onClose }: { bill: BillSchemaType; onClose: () => void }) { - const { t } = useTranslation(); - - return ( - - - - {t('account_bill:order_number')}: - {bill.orderId} - - - {t('account_bill:generation_time')}: - {dayjs(bill.createTime).format('YYYY/MM/DD HH:mm:ss')} - - - {t('account_bill:order_type')}: - {t(billTypeMap[bill.type]?.label as any)} - - - {t('account_bill:status')}: - {t(billStatusMap[bill.status]?.label as any)} - - {!!bill.metadata?.payWay && ( - - {t('account_bill:payment_method')}: - {t(billPayWayMap[bill.metadata.payWay]?.label as any)} - - )} - {!!bill.price && ( 
- - {t('account_bill:support_wallet_amount')}: - {t('account_bill:yuan', { amount: formatStorePrice2Read(bill.price) })} - - )} - {bill.metadata && !!bill.price && ( - - {t('account_bill:has_invoice')}: - {bill.metadata.payWay === 'balance' ? ( - t('user:bill.not_need_invoice') - ) : ( - {bill.hasInvoice ? t('account_bill:yes') : t('account_bill:no')} - )} - - )} - {!!bill.metadata?.subMode && ( - - {t('account_bill:subscription_period')}: - {t(subModeMap[bill.metadata.subMode]?.label as any)} - - )} - {!!bill.metadata?.standSubLevel && ( - - {t('account_bill:subscription_package')}: - {t(standardSubLevelMap[bill.metadata.standSubLevel]?.label as any)} - - )} - {bill.metadata?.month !== undefined && ( - - {t('account_bill:subscription_mode_month')}: - {`${bill.metadata?.month} ${t('account_bill:month')}`} - - )} - {bill.metadata?.datasetSize !== undefined && ( - - {t('account_bill:extra_dataset_size')}: - {bill.metadata?.datasetSize} - - )} - {bill.metadata?.extraPoints !== undefined && ( - - {t('account_bill:extra_ai_points')}: - {bill.metadata.extraPoints} - - )} - - - ); -} diff --git a/projects/app/src/pageComponents/account/bill/InvoiceHeaderForm.tsx b/projects/app/src/pageComponents/account/bill/InvoiceHeaderForm.tsx index da87f3064..75cbef190 100644 --- a/projects/app/src/pageComponents/account/bill/InvoiceHeaderForm.tsx +++ b/projects/app/src/pageComponents/account/bill/InvoiceHeaderForm.tsx @@ -142,10 +142,10 @@ export const InvoiceHeaderSingleForm = ({ > - {t('account_bill:yes')} + {t('account:yes')} - {t('account_bill:no')} + {t('account:no')} diff --git a/projects/app/src/pageComponents/account/bill/InvoiceTable.tsx b/projects/app/src/pageComponents/account/bill/InvoiceTable.tsx index 9c9d9a690..4014228a4 100644 --- a/projects/app/src/pageComponents/account/bill/InvoiceTable.tsx +++ b/projects/app/src/pageComponents/account/bill/InvoiceTable.tsx @@ -46,8 +46,8 @@ const InvoiceTable = () => { # {t('account_bill:time')} - 
{t('account_bill:support_wallet_amount')} - {t('account_bill:status')} + {t('account:support_wallet_amount')} + {t('account:status')} @@ -58,7 +58,7 @@ const InvoiceTable = () => { {item.createTime ? dayjs(item.createTime).format('YYYY/MM/DD HH:mm:ss') : '-'} - {t('account_bill:yuan', { amount: formatStorePrice2Read(item.amount) })} + {t('account:yuan', { amount: formatStorePrice2Read(item.amount) })} @@ -172,7 +172,7 @@ function InvoiceDetailModal({ diff --git a/projects/app/src/pageComponents/account/info/DiscountCouponsModal.tsx b/projects/app/src/pageComponents/account/info/DiscountCouponsModal.tsx new file mode 100644 index 000000000..dd1919e10 --- /dev/null +++ b/projects/app/src/pageComponents/account/info/DiscountCouponsModal.tsx @@ -0,0 +1,230 @@ +import React, { useState } from 'react'; +import MyModal from '@fastgpt/web/components/common/MyModal'; +import { Box, Flex, Button, ModalBody } from '@chakra-ui/react'; +import { useTranslation } from 'next-i18next'; +import { useUserStore } from '@/web/support/user/useUserStore'; +import dayjs from 'dayjs'; +import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; +import { getDiscountCouponList } from '@/web/support/wallet/sub/discountCoupon/api'; +import EmptyTip from '@fastgpt/web/components/common/EmptyTip'; +import { useRouter } from 'next/router'; +import MyImage from '@fastgpt/web/components/common/Image/MyImage'; +import MyIcon from '@fastgpt/web/components/common/Icon'; +import BillDetailModal from '@/pageComponents/account/bill/BillDetailModal'; +import { DiscountCouponStatusEnum } from '@fastgpt/global/support/wallet/sub/discountCoupon/constants'; + +const DiscountCouponsModal = ({ onClose }: { onClose: () => void }) => { + const { t, i18n } = useTranslation(); + const { userInfo } = useUserStore(); + const router = useRouter(); + const isZh = i18n.language === 'zh-CN'; + const [billId, setBillId] = useState(); + const teamId = userInfo?.team?.teamId; + + const { data: coupons = [], loading } = 
useRequest2( + async () => { + if (!teamId) return []; + return getDiscountCouponList(teamId); + }, + { + manual: !teamId, + refreshDeps: [teamId] + } + ); + + const getStatusText = (status: DiscountCouponStatusEnum) => { + const statusTextMap = { + [DiscountCouponStatusEnum.active]: '', + [DiscountCouponStatusEnum.expired]: `(${t('account_info:expired')})`, + [DiscountCouponStatusEnum.notStart]: `(${t('account_info:not_started_tips')})`, + [DiscountCouponStatusEnum.used]: `(${t('account_info:used_tips')})` + }; + + return statusTextMap[status]; + }; + + return ( + + + {coupons.length > 0 ? ( + + {coupons.map((coupon) => { + return ( + + + + + + + + + + {`${getStatusText(coupon.status as DiscountCouponStatusEnum)} + ${t(coupon.name)}`} + + + {t(coupon.description)} + + + + + + {coupon.usedAt ? ( + + {`${t('account_info:used_time')}: `} + {dayjs(coupon.usedAt).format('YYYY-MM-DD')} + + ) : ( + + {`${t('account_info:expiration_time')}: `} + {dayjs(coupon.expiredTime).format('YYYY-MM-DD')} + + )} + {coupon.status === DiscountCouponStatusEnum.active ? ( + + ) : coupon.status === DiscountCouponStatusEnum.expired ? ( + + {t('account_info:expired_tips')} + + ) : coupon.status === DiscountCouponStatusEnum.notStart ? ( + + {t('account_info:not_started_tips')} + + ) : coupon.status === DiscountCouponStatusEnum.used ? 
( + { + router.push('/account/bill'); + }} + > + {t('account_info:check_purchase_history')} + + ) : null} + + + + ); + })} + + ) : ( + + )} + + + + + {!!billId && setBillId(undefined)} />} + + ); +}; + +export default DiscountCouponsModal; diff --git a/projects/app/src/pageComponents/account/info/standardDetailModal.tsx b/projects/app/src/pageComponents/account/info/standardDetailModal.tsx index f76d22d6a..1ab1b8dae 100644 --- a/projects/app/src/pageComponents/account/info/standardDetailModal.tsx +++ b/projects/app/src/pageComponents/account/info/standardDetailModal.tsx @@ -116,7 +116,7 @@ const StandDetailModal = ({ onClose }: { onClose: () => void }) => { {t(subTypeMap[type]?.label as any)} {currentSubLevel && - `(${t(standardSubLevelMap[currentSubLevel]?.label as any)})`} + `(${subPlans?.standard?.[currentSubLevel]?.name || t(standardSubLevelMap[currentSubLevel]?.label as any)})`} diff --git a/projects/app/src/pageComponents/account/usage/Dashboard.tsx b/projects/app/src/pageComponents/account/usage/Dashboard.tsx index 6411b2007..a01bc4fa0 100644 --- a/projects/app/src/pageComponents/account/usage/Dashboard.tsx +++ b/projects/app/src/pageComponents/account/usage/Dashboard.tsx @@ -1,5 +1,5 @@ import { getDashboardData } from '@/web/support/wallet/usage/api'; -import { Box } from '@chakra-ui/react'; +import { Box, Button, Flex, useDisclosure } from '@chakra-ui/react'; import MyBox from '@fastgpt/web/components/common/MyBox'; import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; import { addDays } from 'date-fns'; @@ -7,6 +7,8 @@ import React, { useMemo } from 'react'; import { type UsageFilterParams } from './type'; import dayjs from 'dayjs'; import dynamic from 'next/dynamic'; +import { RechargeModal } from '@/components/support/wallet/NotSufficientModal'; +import { useTranslation } from 'react-i18next'; const DashboardChart = dynamic(() => import('./DashboardChart'), { ssr: false @@ -21,6 +23,7 @@ const UsageDashboard = ({ Tabs: React.ReactNode; 
Selectors: React.ReactNode; }) => { + const { t } = useTranslation(); const { dateRange, selectTmbIds, usageSources, unit, isSelectAllSource, isSelectAllTmb } = filterParams; @@ -52,13 +55,31 @@ const UsageDashboard = ({ return totalPoints.reduce((acc, curr) => acc + curr.totalPoints, 0); }, [totalPoints]); + const { + isOpen: isOpenRecharge, + onOpen: onOpenRecharge, + onClose: onCloseRecharge + } = useDisclosure(); + return ( <> - {Tabs} + + {Tabs} + + + {Selectors} + {isOpenRecharge && } ); }; diff --git a/projects/app/src/pageComponents/account/usage/UsageTable.tsx b/projects/app/src/pageComponents/account/usage/UsageTable.tsx index 3dc5a8a09..dc9e5928b 100644 --- a/projects/app/src/pageComponents/account/usage/UsageTable.tsx +++ b/projects/app/src/pageComponents/account/usage/UsageTable.tsx @@ -8,7 +8,8 @@ import { Td, Th, Thead, - Tr + Tr, + useDisclosure } from '@chakra-ui/react'; import { formatNumber } from '@fastgpt/global/common/math/tools'; import { UsageSourceMap } from '@fastgpt/global/support/wallet/usage/constants'; @@ -16,7 +17,6 @@ import { type UsageListItemType } from '@fastgpt/global/support/wallet/usage/typ import EmptyTip from '@fastgpt/web/components/common/EmptyTip'; import MyBox from '@fastgpt/web/components/common/MyBox'; import dayjs from 'dayjs'; -import { useTranslation } from 'next-i18next'; import React, { useMemo, useState } from 'react'; import Avatar from '@fastgpt/web/components/common/Avatar'; import { usePagination } from '@fastgpt/web/hooks/usePagination'; @@ -30,6 +30,9 @@ import { downloadFetch } from '@/web/common/system/utils'; import { useSafeTranslation } from '@fastgpt/web/hooks/useSafeTranslation'; const UsageDetail = dynamic(() => import('./UsageDetail')); +const RechargeModal = dynamic(() => + import('@/components/support/wallet/NotSufficientModal/index').then((mod) => mod.RechargeModal) +); const UsageTableList = ({ filterParams, @@ -41,6 +44,11 @@ const UsageTableList = ({ filterParams: UsageFilterParams; }) => { 
const { t } = useSafeTranslation(); + const { + isOpen: isOpenRecharge, + onOpen: onOpenRecharge, + onClose: onCloseRecharge + } = useDisclosure(); const { dateRange, selectTmbIds, isSelectAllTmb, usageSources, isSelectAllSource, projectName } = filterParams; @@ -112,10 +120,22 @@ const UsageTableList = ({ return ( - {Tabs} + + {Tabs} + + + {Selectors} + {t('common:Export')}} showCancel @@ -170,6 +190,8 @@ const UsageTableList = ({ {!!usageDetail && ( setUsageDetail(undefined)} /> )} + + {isOpenRecharge && } ); }; diff --git a/projects/app/src/pageComponents/app/detail/Logs/LogTable.tsx b/projects/app/src/pageComponents/app/detail/Logs/LogTable.tsx index 368ad7a37..88e99ba73 100644 --- a/projects/app/src/pageComponents/app/detail/Logs/LogTable.tsx +++ b/projects/app/src/pageComponents/app/detail/Logs/LogTable.tsx @@ -50,6 +50,7 @@ import dynamic from 'next/dynamic'; import type { HeaderControlProps } from './LogChart'; import { useSystemStore } from '@/web/common/system/useSystemStore'; import MyBox from '@fastgpt/web/components/common/MyBox'; +import type { I18nName } from '@fastgpt/service/common/geo/type'; const DetailLogsModal = dynamic(() => import('./DetailLogsModal')); @@ -64,7 +65,7 @@ const LogTable = ({ showSourceSelector = true, px = [4, 8] }: HeaderControlProps) => { - const { t } = useTranslation(); + const { t, i18n } = useTranslation(); const { feConfigs } = useSystemStore(); const [detailLogsId, setDetailLogsId] = useState(); @@ -154,7 +155,7 @@ const LogTable = ({ sources: isSelectAllSource ? undefined : chatSources, tmbIds: isSelectAllTmb ? undefined : selectTmbIds, chatSearch, - + locale: i18n.language === 'zh-CN' ? 'zh' : 'en', title: `${headerTitle},${t('app:logs_keys_chatDetails')}`, logKeys: enabledKeys, sourcesMap: Object.fromEntries( @@ -179,7 +180,8 @@ const LogTable = ({ dateEnd: dateRange.to!, sources: isSelectAllSource ? undefined : chatSources, tmbIds: isSelectAllTmb ? 
undefined : selectTmbIds, - chatSearch + chatSearch, + locale: (i18n.language === 'zh-CN' ? 'zh' : 'en') as keyof I18nName }), [ appId, @@ -189,9 +191,11 @@ const LogTable = ({ isSelectAllSource, selectTmbIds, isSelectAllTmb, - chatSearch + chatSearch, + i18n.language ] ); + const { data: logs, isLoading, @@ -218,6 +222,7 @@ const LogTable = ({ ), [AppLogKeysEnum.USER]: {t('app:logs_chat_user')}, + [AppLogKeysEnum.REGION]: {t('app:logs_keys_region')}, [AppLogKeysEnum.TITLE]: {t('app:logs_title')}, [AppLogKeysEnum.SESSION_ID]: ( {t('app:logs_keys_sessionId')} @@ -272,6 +277,7 @@ const LogTable = ({ ), + [AppLogKeysEnum.REGION]: {item.region || '-'}, [AppLogKeysEnum.TITLE]: ( {item.customTitle || item.title} @@ -443,9 +449,11 @@ const LogTable = ({ }} /> + {showSyncPopover && ( {logKeys .filter((logKey) => logKey.enable) - .map((logKey) => cellRenderMap[logKey.key])} + .map((logKey) => cellRenderMap[logKey.key as AppLogKeysEnum])} ); })} diff --git a/projects/app/src/pageComponents/app/detail/Logs/SyncLogKeysPopover.tsx b/projects/app/src/pageComponents/app/detail/Logs/SyncLogKeysPopover.tsx index cf0733ccf..a1c6d29ea 100644 --- a/projects/app/src/pageComponents/app/detail/Logs/SyncLogKeysPopover.tsx +++ b/projects/app/src/pageComponents/app/detail/Logs/SyncLogKeysPopover.tsx @@ -6,8 +6,6 @@ import React from 'react'; import type { updateLogKeysBody } from '@/pages/api/core/app/logs/updateLogKeys'; import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; import { updateLogKeys } from '@/web/core/app/api/log'; -import { useContextSelector } from 'use-context-selector'; -import { AppContext } from '../context'; import type { AppLogKeysType } from '@fastgpt/global/core/app/logs/type'; import type { getLogKeysResponse } from '@/pages/api/core/app/logs/getLogKeys'; import type { SetState } from 'ahooks/lib/createUseStorageState'; @@ -16,15 +14,16 @@ const SyncLogKeysPopover = ({ logKeys, setLogKeys, teamLogKeys, - fetchLogKeys + fetchLogKeys, + appId }: { logKeys: 
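The export and list params above derive a two-letter locale from the current i18n language. A sketch of that mapping as a standalone helper (the function name is ours, not from the patch):

```typescript
// Only zh-CN maps to 'zh'; every other language tag falls back to 'en',
// matching the inline ternary `i18n.language === 'zh-CN' ? 'zh' : 'en'`.
type ExportLocale = 'zh' | 'en';

const toExportLocale = (language: string): ExportLocale =>
  language === 'zh-CN' ? 'zh' : 'en';
```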
AppLogKeysType[]; setLogKeys: (value: SetState) => void; teamLogKeys: AppLogKeysType[]; fetchLogKeys: () => Promise; + appId: string; }) => { const { t } = useTranslation(); - const appId = useContextSelector(AppContext, (v) => v.appId); const { runAsync: updateList, loading: updateLoading } = useRequest2( async (data: updateLogKeysBody) => { diff --git a/projects/app/src/pageComponents/app/detail/SimpleApp/EditForm.tsx b/projects/app/src/pageComponents/app/detail/SimpleApp/EditForm.tsx index e7341ef22..217a35ee6 100644 --- a/projects/app/src/pageComponents/app/detail/SimpleApp/EditForm.tsx +++ b/projects/app/src/pageComponents/app/detail/SimpleApp/EditForm.tsx @@ -32,6 +32,7 @@ import VariableTip from '@/components/common/Textarea/MyTextarea/VariableTip'; import { getWebLLMModel } from '@/web/common/system/utils'; import ToolSelect from './components/ToolSelect'; import OptimizerPopover from '@/components/common/PromptEditor/OptimizerPopover'; +import { useSystemStore } from '@/web/common/system/useSystemStore'; const DatasetSelectModal = dynamic(() => import('@/components/core/app/DatasetSelectModal')); const DatasetParamsModal = dynamic(() => import('@/components/core/app/DatasetParamsModal')); @@ -66,6 +67,7 @@ const EditForm = ({ const theme = useTheme(); const router = useRouter(); const { t } = useTranslation(); + const { defaultModels } = useSystemStore(); const { appDetail } = useContextSelector(AppContext, (v) => v); const selectDatasets = useMemo(() => appForm?.dataset?.datasets, [appForm]); @@ -127,6 +129,26 @@ const EditForm = ({ } }, [selectedModel, setAppForm]); + useEffect(() => { + if ( + appForm.dataset.datasetSearchUsingExtensionQuery && + !appForm.dataset.datasetSearchExtensionModel + ) { + setAppForm((state) => ({ + ...state, + dataset: { + ...state.dataset, + datasetSearchExtensionModel: defaultModels.llm?.model + } + })); + } + }, [ + appForm.dataset.datasetSearchUsingExtensionQuery, + appForm.dataset.datasetSearchExtensionModel, + 
defaultModels.llm?.model, + setAppForm + ]); + const OptimizerPopverComponent = useCallback( ({ iconButtonStyle }: { iconButtonStyle: Record }) => { return ( diff --git a/projects/app/src/pageComponents/app/detail/SimpleApp/components/ConfigToolModal.tsx b/projects/app/src/pageComponents/app/detail/SimpleApp/components/ConfigToolModal.tsx index 407df20de..7271ee480 100644 --- a/projects/app/src/pageComponents/app/detail/SimpleApp/components/ConfigToolModal.tsx +++ b/projects/app/src/pageComponents/app/detail/SimpleApp/components/ConfigToolModal.tsx @@ -151,13 +151,15 @@ const ConfigToolModal = ({ name={input.key} rules={{ validate: (value) => { - if (input.valueType === WorkflowIOValueTypeEnum.boolean) { - return value !== undefined; + if ( + input.valueType === WorkflowIOValueTypeEnum.boolean || + input.valueType === WorkflowIOValueTypeEnum.number + ) { + return true; } - if (input.required) { - return !!value; - } - return true; + if (!input.required) return true; + + return !!value; } }} render={({ field: { onChange, value }, fieldState: { error } }) => { diff --git a/projects/app/src/pageComponents/app/detail/WorkflowComponents/Flow/nodes/render/RenderInput/templates/Reference.tsx b/projects/app/src/pageComponents/app/detail/WorkflowComponents/Flow/nodes/render/RenderInput/templates/Reference.tsx index 2fb9c2df7..595bfaac2 100644 --- a/projects/app/src/pageComponents/app/detail/WorkflowComponents/Flow/nodes/render/RenderInput/templates/Reference.tsx +++ b/projects/app/src/pageComponents/app/detail/WorkflowComponents/Flow/nodes/render/RenderInput/templates/Reference.tsx @@ -194,6 +194,20 @@ const SingleReferenceSelector = ({ [list] ); + // Adapt array type from old version + useEffect(() => { + if ( + Array.isArray(value) && + // @ts-ignore + value.length === 1 && + Array.isArray(value[0]) && + value[0].length === 2 + ) { + // @ts-ignore + onSelect(value[0]); + } + }, [value, onSelect]); + const ItemSelector = useMemo(() => { const selectorVal = value as 
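The ConfigToolModal `validate` rewrite above can be read as a pure function: boolean and number inputs are always valid (so `false` and `0` are accepted), optional inputs pass, and required inputs must be truthy. A sketch of that rule in isolation:

```typescript
// Value types mirror WorkflowIOValueTypeEnum from the patch (subset shown).
type ValueType = 'boolean' | 'number' | 'string';

const validateInput = (valueType: ValueType, required: boolean, value: unknown): boolean => {
  // boolean/number: false and 0 are legitimate values, so never reject them
  if (valueType === 'boolean' || valueType === 'number') return true;
  if (!required) return true;
  return !!value;
};
```

This fixes the pre-patch behavior where a required boolean set to `false` (or a number set to `0`) would fail the `!!value` check.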
ReferenceItemValueType; const [nodeName, outputName] = getSelectValue(selectorVal); diff --git a/projects/app/src/pageComponents/app/detail/components/QuickCreateDatasetModal.tsx b/projects/app/src/pageComponents/app/detail/components/QuickCreateDatasetModal.tsx index 1701eee02..e55103fd4 100644 --- a/projects/app/src/pageComponents/app/detail/components/QuickCreateDatasetModal.tsx +++ b/projects/app/src/pageComponents/app/detail/components/QuickCreateDatasetModal.tsx @@ -91,9 +91,6 @@ const QuickCreateDatasetModal = ({ formData.set('file', file); await POST(url, formData, { - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if (!e.total) return; const percent = Math.round((e.loaded / e.total) * 100); diff --git a/projects/app/src/pageComponents/dashboard/TeamPlanStatusCard.tsx b/projects/app/src/pageComponents/dashboard/TeamPlanStatusCard.tsx index 9a01efcc3..15d3d3715 100644 --- a/projects/app/src/pageComponents/dashboard/TeamPlanStatusCard.tsx +++ b/projects/app/src/pageComponents/dashboard/TeamPlanStatusCard.tsx @@ -15,7 +15,7 @@ import { webPushTrack } from '@/web/common/middle/tracks/utils'; const TeamPlanStatusCard = () => { const { t } = useTranslation(); const { teamPlanStatus } = useUserStore(); - const { operationalAd, loadOperationalAd, feConfigs } = useSystemStore(); + const { operationalAd, loadOperationalAd, feConfigs, subPlans } = useSystemStore(); const router = useRouter(); // Load data @@ -42,8 +42,11 @@ const TeamPlanStatusCard = () => { const planName = useMemo(() => { if (!teamPlanStatus?.standard?.currentSubLevel) return ''; - return standardSubLevelMap[teamPlanStatus.standard.currentSubLevel].label; - }, [teamPlanStatus?.standard?.currentSubLevel]); + return ( + subPlans?.standard?.[teamPlanStatus.standard.currentSubLevel]?.name || + standardSubLevelMap[teamPlanStatus.standard.currentSubLevel]?.label + ); + }, [teamPlanStatus?.standard?.currentSubLevel, subPlans]); const aiPointsUsageMap = 
useMemo(() => { if (!teamPlanStatus) { diff --git a/projects/app/src/pageComponents/dashboard/agent/List.tsx b/projects/app/src/pageComponents/dashboard/agent/List.tsx index a219ef999..f718833ec 100644 --- a/projects/app/src/pageComponents/dashboard/agent/List.tsx +++ b/projects/app/src/pageComponents/dashboard/agent/List.tsx @@ -237,15 +237,15 @@ const List = () => { isFolder: app.type === AppTypeEnum.folder || app.type === AppTypeEnum.toolFolder })} > - + - - {app.name} + + {app.name} - + - + (v) => v ); - // dataset sync confirm - const { openConfirm: openDatasetSyncConfirm, ConfirmModal: ConfirmDatasetSyncModal } = useConfirm( - { - content: t('dataset:start_sync_dataset_tip') - } - ); - - const syncDataset = async () => { - if (datasetDetail.type === DatasetTypeEnum.websiteDataset) { - await checkTeamWebSyncLimit(); - } - - await postDatasetSync({ datasetId: datasetId }); - loadDatasetDetail(datasetId); - - // Show success message - toast({ - status: 'success', - title: t('dataset:collection.sync.submit') - }); - }; - - const { - isOpen: isOpenWebsiteModal, - onOpen: onOpenWebsiteModal, - onClose: onCloseWebsiteModal - } = useDisclosure(); - - const { runAsync: onUpdateDatasetWebsiteConfig } = useRequest2( - async (websiteConfig: WebsiteConfigFormType) => { - await updateDataset({ - id: datasetId, - websiteConfig: websiteConfig.websiteConfig, - chunkSettings: websiteConfig.chunkSettings - }); - await syncDataset(); - }, - { - onSuccess() { - onCloseWebsiteModal(); - } - } - ); - // collection list const [searchText, setSearchText] = useState(''); const [filterTags, setFilterTags] = useState([]); @@ -139,6 +95,52 @@ const CollectionPageContextProvider = ({ children }: { children: ReactNode }) => refreshDeps: [parentId, searchText, filterTags] }); + const syncDataset = async () => { + if (datasetDetail.type === DatasetTypeEnum.websiteDataset) { + await checkTeamWebSyncLimit(); + } + + await postDatasetSync({ datasetId: datasetId }); + 
loadDatasetDetail(datasetId); + + getData(pageNum); + + // Show success message + toast({ + status: 'success', + title: t('dataset:collection.sync.submit') + }); + }; + + // dataset sync confirm + const { openConfirm: openDatasetSyncConfirm, ConfirmModal: ConfirmDatasetSyncModal } = useConfirm( + { + content: t('dataset:start_sync_dataset_tip') + } + ); + + const { + isOpen: isOpenWebsiteModal, + onOpen: onOpenWebsiteModal, + onClose: onCloseWebsiteModal + } = useDisclosure(); + + const { runAsync: onUpdateDatasetWebsiteConfig } = useRequest2( + async (websiteConfig: WebsiteConfigFormType) => { + await updateDataset({ + id: datasetId, + websiteConfig: websiteConfig.websiteConfig, + chunkSettings: websiteConfig.chunkSettings + }); + await syncDataset(); + }, + { + onSuccess() { + onCloseWebsiteModal(); + } + } + ); + const contextValue: CollectionPageContextType = { openDatasetSyncConfirm: openDatasetSyncConfirm(syncDataset), onOpenWebsiteModal, diff --git a/projects/app/src/pageComponents/dataset/detail/Import/diffSource/FileLocal.tsx b/projects/app/src/pageComponents/dataset/detail/Import/diffSource/FileLocal.tsx index f8b7ca96a..e0a93f8af 100644 --- a/projects/app/src/pageComponents/dataset/detail/Import/diffSource/FileLocal.tsx +++ b/projects/app/src/pageComponents/dataset/detail/Import/diffSource/FileLocal.tsx @@ -78,9 +78,6 @@ const SelectFile = React.memo(function SelectFile() { Object.entries(fields).forEach(([k, v]) => formData.set(k, v)); formData.set('file', file); await POST(url, formData, { - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if (!e.total) return; const percent = Math.round((e.loaded / e.total) * 100); diff --git a/projects/app/src/pageComponents/login/LoginForm/LoginForm.tsx b/projects/app/src/pageComponents/login/LoginForm/LoginForm.tsx index bc07bb227..d5a1db64f 100644 --- a/projects/app/src/pageComponents/login/LoginForm/LoginForm.tsx +++ 
b/projects/app/src/pageComponents/login/LoginForm/LoginForm.tsx @@ -14,6 +14,7 @@ import { useSearchParams } from 'next/navigation'; import { UserErrEnum } from '@fastgpt/global/common/error/code/user'; import { useRouter } from 'next/router'; import { useMount } from 'ahooks'; +import type { LangEnum } from '@fastgpt/global/common/i18n/type'; interface Props { setPageType: Dispatch<`${LoginPageTypeEnum}`>; @@ -26,8 +27,7 @@ interface LoginFormType { } const LoginForm = ({ setPageType, loginSuccess }: Props) => { - const { t } = useTranslation(); - const { toast } = useToast(); + const { t, i18n } = useTranslation(); const { feConfigs } = useSystemStore(); const query = useSearchParams(); const router = useRouter(); @@ -45,7 +45,8 @@ const LoginForm = ({ setPageType, loginSuccess }: Props) => { await postLogin({ username, password, - code + code, + language: i18n.language as LangEnum }) ); }, diff --git a/projects/app/src/pageComponents/price/ExtraPlan.tsx b/projects/app/src/pageComponents/price/ExtraPlan.tsx index 429544992..c46ad55bb 100644 --- a/projects/app/src/pageComponents/price/ExtraPlan.tsx +++ b/projects/app/src/pageComponents/price/ExtraPlan.tsx @@ -11,14 +11,11 @@ import QRCodePayModal, { type QRPayProps } from '@/components/support/wallet/QRC import MyNumberInput from '@fastgpt/web/components/common/Input/NumberInput'; import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; import MySelect from '@fastgpt/web/components/common/MySelect'; -import { - getMonthByPoints, - getMinPointsByMonth, - calculatePrice -} from '@fastgpt/global/support/wallet/bill/tools'; +import { calculatePrice } from '@fastgpt/global/support/wallet/bill/tools'; +import { formatNumberWithUnit } from '@fastgpt/global/common/string/tools'; const ExtraPlan = ({ onPaySuccess }: { onPaySuccess?: () => void }) => { - const { t } = useTranslation(); + const { t, i18n } = useTranslation(); const { toast } = useToast(); const { subPlans } = useSystemStore(); const [qrPayData, 
setQRPayData] = useState(); @@ -69,46 +66,30 @@ const ExtraPlan = ({ onPaySuccess }: { onPaySuccess?: () => void }) => { } ); - // extra ai points const expireSelectorOptions: { label: string; value: number }[] = [ { label: t('common:date_1_month'), value: 1 }, { label: t('common:date_3_months'), value: 3 }, { label: t('common:date_6_months'), value: 6 }, { label: t('common:date_12_months'), value: 12 } ]; - const extraPointsPrice = subPlans?.extraPoints?.price || 0; - const { - watch: watchExtraPoints, - setValue: setValueExtraPoints, - getValues: getValuesExtraPoints - } = useForm({ - defaultValues: { - points: 1, - month: 1 - } - }); - // 监听积分和月份变化 - const watchedPoints = watchExtraPoints('points'); - const watchedMonth = watchExtraPoints('month'); + const extraPointsPackages = subPlans?.extraPoints?.packages || []; + const [selectedPackageIndex, setSelectedPackageIndex] = useState(0); + + const getMonthText = (month: number) => { + if (month < 12) return `${month} ${t('common:month_text')}`; + return t('common:one_year'); + }; const { runAsync: onclickBuyExtraPoints, loading: isLoadingBuyExtraPoints } = useRequest2( async ({ points, month }: { points: number; month: number }) => { points = Math.ceil(points); month = Math.ceil(month); - const payAmount = points * 1 * extraPointsPrice; - - if (payAmount === 0) { - return toast({ - status: 'warning', - title: t('common:support.wallet.amount_0') - }); - } - const res = await postCreatePayBill({ type: BillTypeEnum.extraPoints, - extraPoints: points + extraPoints: points, + month: month }); setQRPayData({ @@ -118,209 +99,185 @@ const ExtraPlan = ({ onPaySuccess }: { onPaySuccess?: () => void }) => { }, { manual: true, - refreshDeps: [extraPointsPrice] + refreshDeps: [extraPointsPackages] } ); return ( - {/* points */} - - - - - {t('common:support.wallet.subscription.Extra ai points')} - - - {`¥${extraPointsPrice}/1000` + t('common:support.wallet.subscription.point')} - - - {t('common:support.wallet.subscription.Extra 
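The `getMonthText` helper introduced above formats a package duration. A sketch with the `t()` calls replaced by hypothetical English strings (the real code returns `common:month_text` / `common:one_year` translations):

```typescript
// Durations under a year read as "N months"; 12 months collapses to "1 year".
const getMonthText = (month: number): string => {
  if (month < 12) return `${month} months`;
  return '1 year';
};
```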
ai points description')} - - - - - - - - {t('common:support.wallet.buy_ai_points')} - - - - {t('common:support.wallet.subscription.Points amount')} - + + {t('common:support.wallet.subscription.Extra ai points')} + + + {extraPointsPackages.map((pkg, index) => ( setSelectedPackageIndex(index)} + transition={'all 0.2s'} > - { - setValueExtraPoints('points', val as unknown as number); - }} - onBlur={(val) => { - const formatVal = val || 1; - setValueExtraPoints('points', formatVal); - - const expectedMonth = getMonthByPoints(formatVal); - if (expectedMonth !== watchedMonth) { - setValueExtraPoints('month', expectedMonth); - } - }} - /> - -  {`X 1000${t('common:support.wallet.subscription.point')}`} + + {formatNumberWithUnit(pkg.points, i18n.language)}{' '} + {t('common:support.wallet.subscription.point')} + + + {t('common:invalid_time') + ' '} + {getMonthText(pkg.month)} - - - - {t('common:invalid_time')} - - - { - setValueExtraPoints('month', val); - // 当用户选择月份时,设置积分为该月份的最小值 - const minPoints = getMinPointsByMonth(val); - setValueExtraPoints('points', minPoints); - }} - /> - - - - - {t('common:support.wallet.subscription.Update extra price')} - - - {`¥${(() => { - const price = calculatePrice(extraPointsPrice, { - type: 'points', - points: watchedPoints - }); - return Number.isNaN(price) ? 0 : price; - })()}`} - - - - - - - - - {t('common:support.wallet.subscription.Update extra ai points tips')} - - - - + {t('common:support.wallet.subscription.Update extra price')} + + + {selectedPackageIndex !== undefined && extraPointsPackages[selectedPackageIndex] + ? 
t('common:extraPointsPrice', { + price: extraPointsPackages[selectedPackageIndex].price + }) + : '--'} + + + + + + + + + {t('common:support.wallet.subscription.Update extra ai points tips')} + + + {/* dataset */} - - - + + + {t('common:support.wallet.subscription.Extra dataset size')} - + {`¥${extraDatasetPrice}/1000${t('common:support.wallet.subscription.Extra dataset unit')}`} - + {t('common:support.wallet.subscription.Extra dataset description')} - - + + - {t('common:support.wallet.buy_dataset_capacity')} + + {t('common:support.wallet.buy_dataset_capacity')} + + {t('common:support.wallet.subscription.Dataset size')} @@ -328,8 +285,8 @@ const ExtraPlan = ({ onPaySuccess }: { onPaySuccess?: () => void }) => { void }) => { + {t('common:invalid_time')} void }) => { /> - + + {t('common:support.wallet.subscription.Update extra price')} void }) => { - + + - - - + + + + {t('common:support.wallet.subscription.Update extra dataset tips')} - + - {!!qrPayData && } + {!!qrPayData && ( + setQRPayData(undefined)} + {...qrPayData} + /> + )} ); }; diff --git a/projects/app/src/pageComponents/price/FAQ.tsx b/projects/app/src/pageComponents/price/FAQ.tsx index 041723d41..6f6a51c77 100644 --- a/projects/app/src/pageComponents/price/FAQ.tsx +++ b/projects/app/src/pageComponents/price/FAQ.tsx @@ -9,6 +9,10 @@ const FAQ = () => { title: t('common:FAQ.switch_package_q'), desc: t('common:FAQ.switch_package_a') }, + { + title: t('common:FAQ.year_day_q'), + desc: t('common:FAQ.year_day_a') + }, { title: t('common:FAQ.check_subscription_q'), desc: t('common:FAQ.check_subscription_a') @@ -25,14 +29,19 @@ const FAQ = () => { title: t('common:FAQ.dataset_compute_q'), desc: t('common:FAQ.dataset_compute_a') }, + { - title: t('common:FAQ.dataset_index_q'), - desc: t('common:FAQ.dataset_index_a') + title: t('common:FAQ.index_del_q'), + desc: t('common:FAQ.index_del_a') }, { title: t('common:FAQ.package_overlay_q'), desc: t('common:FAQ.package_overlay_a') }, + { + title: t('common:FAQ.qpm_q'), + 
desc: t('common:FAQ.qpm_a') + }, { title: t('common:FAQ.free_user_clean_q'), desc: t('common:FAQ.free_user_clean_a') diff --git a/projects/app/src/pageComponents/price/Standard.tsx b/projects/app/src/pageComponents/price/Standard.tsx index d0de4fb07..762e61e12 100644 --- a/projects/app/src/pageComponents/price/Standard.tsx +++ b/projects/app/src/pageComponents/price/Standard.tsx @@ -1,6 +1,6 @@ import React, { useMemo, useState } from 'react'; import MyIcon from '@fastgpt/web/components/common/Icon'; -import { Box, Button, Flex, Grid, HStack } from '@chakra-ui/react'; +import { Box, Button, Flex, Grid } from '@chakra-ui/react'; import { useTranslation } from 'next-i18next'; import { StandardSubLevelEnum, SubModeEnum } from '@fastgpt/global/support/wallet/sub/constants'; import { useSystemStore } from '@/web/common/system/useSystemStore'; @@ -9,8 +9,14 @@ import { useRequest2 } from '@fastgpt/web/hooks/useRequest'; import { type TeamSubSchema } from '@fastgpt/global/support/wallet/sub/type'; import QRCodePayModal, { type QRPayProps } from '@/components/support/wallet/QRCodePayModal'; import { postCreatePayBill } from '@/web/support/wallet/bill/api'; +import { getDiscountCouponList } from '@/web/support/wallet/sub/discountCoupon/api'; import { BillTypeEnum } from '@fastgpt/global/support/wallet/bill/constants'; import StandardPlanContentList from '@/components/support/wallet/StandardPlanContentList'; +import MyBox from '@fastgpt/web/components/common/MyBox'; +import { + DiscountCouponStatusEnum, + DiscountCouponTypeEnum +} from '@fastgpt/global/support/wallet/sub/discountCoupon/constants'; export enum PackageChangeStatusEnum { buy = 'buy', @@ -37,27 +43,70 @@ const Standard = ({ const { subPlans, feConfigs } = useSystemStore(); const [selectSubMode, setSelectSubMode] = useState<`${SubModeEnum}`>(SubModeEnum.month); + const NEW_PLAN_LEVELS = [ + StandardSubLevelEnum.free, + StandardSubLevelEnum.basic, + StandardSubLevelEnum.advanced, + StandardSubLevelEnum.custom + ]; 
+ const { + data: coupons = [], + loading, + runAsync: getCoupons + } = useRequest2( + async () => { + if (!myStandardPlan?.teamId) return []; + return getDiscountCouponList(myStandardPlan.teamId); + }, + { + manual: !myStandardPlan?.teamId, + refreshDeps: [myStandardPlan?.teamId] + } + ); + + const matchedCoupon = useMemo(() => { + const targetType = + selectSubMode === SubModeEnum.month + ? DiscountCouponTypeEnum.monthStandardDiscount70 + : DiscountCouponTypeEnum.yearStandardDiscount90; + + return coupons.find( + (coupon) => coupon.type === targetType && coupon.status === DiscountCouponStatusEnum.active + ); + }, [coupons, selectSubMode]); + const standardSubList = useMemo(() => { return subPlans?.standard - ? Object.entries(subPlans.standard).map(([level, value]) => { - return { - price: value.price * (selectSubMode === SubModeEnum.month ? 1 : 10), - level: level as `${StandardSubLevelEnum}`, - ...standardSubLevelMap[level as `${StandardSubLevelEnum}`], - label: value.name || standardSubLevelMap[level as `${StandardSubLevelEnum}`].label, // custom label - maxTeamMember: value.maxTeamMember, - maxAppAmount: value.maxAppAmount, - maxDatasetAmount: value.maxDatasetAmount, - chatHistoryStoreDuration: value.chatHistoryStoreDuration, - maxDatasetSize: value.maxDatasetSize, - permissionCustomApiKey: value.permissionCustomApiKey, - permissionCustomCopyright: value.permissionCustomCopyright, - trainingWeight: value.trainingWeight, - totalPoints: value.totalPoints * (selectSubMode === SubModeEnum.month ? 1 : 12), - permissionWebsiteSync: value.permissionWebsiteSync, - permissionTeamOperationLog: value.permissionTeamOperationLog - }; - }) + ? 
Object.entries(subPlans.standard) + .filter(([level, value]) => { + if (!NEW_PLAN_LEVELS.includes(level as StandardSubLevelEnum)) { + return false; + } + if (level === StandardSubLevelEnum.custom && !value.customFormUrl) { + return false; + } + return true; + }) + .map(([level, value]) => { + return { + ...standardSubLevelMap[level as `${StandardSubLevelEnum}`], + ...(value.desc ? { desc: value.desc } : {}), + ...(value.name ? { label: value.name } : {}), + price: value.price * (selectSubMode === SubModeEnum.month ? 1 : 10), + level: level as `${StandardSubLevelEnum}`, + maxTeamMember: myStandardPlan?.maxTeamMember || value.maxTeamMember, + maxAppAmount: myStandardPlan?.maxApp || value.maxAppAmount, + maxDatasetAmount: myStandardPlan?.maxDataset || value.maxDatasetAmount, + chatHistoryStoreDuration: value.chatHistoryStoreDuration, + maxDatasetSize: value.maxDatasetSize, + totalPoints: value.totalPoints * (selectSubMode === SubModeEnum.month ? 1 : 12), + + // custom plan + priceDescription: value.priceDescription, + customDescriptions: value.customDescriptions, + customFormUrl: value.customFormUrl + }; + }) : []; }, [subPlans?.standard, selectSubMode]); @@ -116,7 +165,7 @@ const Standard = ({ {/* card */} {t(item.label as any)} - - ¥{item.price} - + + {item.level === StandardSubLevelEnum.custom ? ( + + {t('common:custom_plan_price')} + + ) : ( + + ¥ + {matchedCoupon?.discount && item.price > 0 + ? 
(matchedCoupon.discount * item.price).toFixed(1) + : item.price} + + )} + {item.level !== StandardSubLevelEnum.free && + item.level !== StandardSubLevelEnum.custom && + matchedCoupon && ( + + {`${(matchedCoupon.discount * 10).toFixed(0)} 折`} + + )} + {t(item.desc as any, { title: feConfigs?.systemTitle })} @@ -193,6 +271,23 @@ const Standard = ({ ); } + if (item.level === StandardSubLevelEnum.custom) { + return ( + + ); + } if (isCurrentPlan) { return ( )} + {userInfo?.permission.isOwner && feConfigs?.show_discount_coupon && ( + + )} { @@ -575,87 +604,89 @@ const PlanUsage = () => { - - - - {t('account_info:knowledge_base_capacity')} - - - {datasetIndexUsageMap.value}/{datasetIndexUsageMap.max} - - + + + {t('common:support.wallet.subscription.AI points usage')} + + + + {Math.round(teamPlanStatus?.usedPoints || 0)} / {aiPointsUsageMap.total} + - - + - + + - - - - {t('account_info:ai_points_usage')} - - - - {aiPointsUsageMap.value}/{aiPointsUsageMap.max} - - + + + {t('common:support.user.team.Dataset usage')} + + + {Math.round(teamPlanStatus?.usedDatasetIndexSize || 0)} / {datasetIndexUsageMap.total} + - - + - + {limitData.map((item) => { + const isAppRegistration = item.label === t('account_info:app_registration_count'); + return ( - - + + {item.label} - + {item.value}/{item.max} + {isAppRegistration && subPlans?.appRegistrationUrl && ( + + {t('account_info:apply_app_registration')} + + + )} - - + - + ); })} @@ -667,6 +698,7 @@ const PlanUsage = () => { onSuccess={() => initTeamPlanStatus()} /> )} + {isOpenDiscountCouponsModal && } ) : null; }; diff --git a/projects/app/src/pages/account/setting.tsx b/projects/app/src/pages/account/setting.tsx index 0547133ba..7b336fb2e 100644 --- a/projects/app/src/pages/account/setting.tsx +++ b/projects/app/src/pages/account/setting.tsx @@ -19,7 +19,7 @@ const Individuation = () => { const { toast } = useToast(); const { reset } = useForm({ - defaultValues: userInfo as UserType + defaultValues: userInfo! 
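The coupon pricing above applies a discount multiplier to the plan price and renders the discount as a "N 折" badge (Chinese convention: a 0.7 multiplier is "7 折"). A sketch of both calculations; the fallback branch when no coupon matches is our assumption:

```typescript
// Discounted price shown with one decimal place, matching `.toFixed(1)` in
// Standard.tsx; free plans (price 0) and missing coupons show the raw price.
const discountedPrice = (price: number, discount?: number): string =>
  discount && price > 0 ? (discount * price).toFixed(1) : String(price);

// Badge text: 0.7 => "7 折" (toFixed(0) absorbs float noise like 7.000...01).
const discountBadge = (discount: number): string => `${(discount * 10).toFixed(0)} 折`;
```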
}); const onclickSave = useCallback( diff --git a/projects/app/src/pages/account/team/index.tsx b/projects/app/src/pages/account/team/index.tsx index c19ff594b..ec51c98ac 100644 --- a/projects/app/src/pages/account/team/index.tsx +++ b/projects/app/src/pages/account/team/index.tsx @@ -57,10 +57,9 @@ const Team = () => { const { subPlans } = useSystemStore(); const planContent = useMemo(() => { const plan = level !== undefined ? subPlans?.standard?.[level] : undefined; - if (!plan) return; return { - permissionTeamOperationLog: plan.permissionTeamOperationLog + auditLogStoreDuration: plan?.auditLogStoreDuration }; }, [subPlans?.standard, level]); const { toast } = useToast(); @@ -82,7 +81,7 @@ const Team = () => { px={'1rem'} value={teamTab} onChange={(e) => { - if (e === TeamTabEnum.audit && planContent && !planContent?.permissionTeamOperationLog) { + if (e === TeamTabEnum.audit && planContent && !planContent?.auditLogStoreDuration) { toast({ status: 'warning', title: t('common:not_permission') diff --git a/projects/app/src/pages/api/admin/clearInvalidData.ts b/projects/app/src/pages/api/admin/clearInvalidData.ts index f75b6edf7..1c8d22bab 100644 --- a/projects/app/src/pages/api/admin/clearInvalidData.ts +++ b/projects/app/src/pages/api/admin/clearInvalidData.ts @@ -2,11 +2,7 @@ import type { NextApiRequest, NextApiResponse } from 'next'; import { jsonRes } from '@fastgpt/service/common/response'; import { authCert } from '@fastgpt/service/support/permission/auth/common'; import { addHours } from 'date-fns'; -import { - checkInvalidDatasetFiles, - checkInvalidDatasetData, - checkInvalidVector -} from '@/service/common/system/cronTask'; +import { checkInvalidDatasetData, checkInvalidVector } from '@/service/common/system/cronTask'; import dayjs from 'dayjs'; import { retryFn } from '@fastgpt/global/common/system/utils'; import { NextAPI } from '@/service/middleware/entry'; @@ -88,7 +84,6 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { )} to 
${dayjs(chunkEndTime).format('YYYY-MM-DD HH:mm')}` ); - await retryFn(() => checkInvalidDatasetFiles(chunkStartTime, chunkEndTime)); await retryFn(() => checkInvalidImg(chunkStartTime, chunkEndTime)); await retryFn(() => checkInvalidDatasetData(chunkStartTime, chunkEndTime)); await retryFn(() => checkInvalidVector(chunkStartTime, chunkEndTime)); diff --git a/projects/app/src/pages/api/admin/initv4143-2.ts b/projects/app/src/pages/api/admin/initv4143-2.ts new file mode 100644 index 000000000..004c27ab1 --- /dev/null +++ b/projects/app/src/pages/api/admin/initv4143-2.ts @@ -0,0 +1,553 @@ +import { NextAPI } from '@/service/middleware/entry'; +import { addLog } from '@fastgpt/service/common/system/log'; +import { authCert } from '@fastgpt/service/support/permission/auth/common'; +import { type NextApiRequest, type NextApiResponse } from 'next'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; +import type { getDownloadStream } from '@fastgpt/service/common/file/gridfs/controller'; +import { getGFSCollection } from '@fastgpt/service/common/file/gridfs/controller'; +import { MongoDatasetCollection } from '@fastgpt/service/core/dataset/collection/schema'; +import pLimit from 'p-limit'; +import { MongoDatasetMigrationLog } from '@fastgpt/service/core/dataset/migration/schema'; +import type { DatasetCollectionSchemaType } from '@fastgpt/global/core/dataset/type'; +import { randomUUID } from 'crypto'; +import { MongoDatasetData } from '@fastgpt/service/core/dataset/data/schema'; +import type { DatasetDataSchemaType } from '@fastgpt/global/core/dataset/type'; +import { + uploadImage2S3Bucket, + removeS3TTL, + getFileS3Key, + truncateFilename +} from '@fastgpt/service/common/s3/utils'; +import { connectionMongo, Types } from '@fastgpt/service/common/mongo'; + +// 将 GridFS 的流转换为 Buffer +async function gridFSStreamToBuffer( + stream: Awaited> +): Promise { + const chunks: Buffer[] = []; + + stream.on('data', (chunk) => chunks.push(chunk)); + + 
await new Promise((resolve, reject) => { + stream.on('end', resolve); + stream.on('error', reject); + }); + + return Buffer.concat(chunks); +} + +// ========== Dataset Image Migration Functions ========== + +// Get the GridFS bucket for dataset_image +function getDatasetImageGridBucket() { + return new connectionMongo.mongo.GridFSBucket(connectionMongo.connection.db!, { + bucketName: 'dataset_image' + }); +} + +// Get the GridFS files collection for dataset_image +function getDatasetImageGFSCollection() { + return connectionMongo.connection.db!.collection('dataset_image.files'); +} + +// Process one batch of images +async function processImageBatch({ + batchId, + migrationVersion, + offset, + limit, + concurrency +}: { + batchId: string; + migrationVersion: string; + offset: number; + limit: number; + concurrency: number; +}) { + // 1. Fetch this batch of image files from GridFS + const imageFiles = await getDatasetImageGFSCollection() + .find( + {}, + { + projection: { + _id: 1, + filename: 1, + contentType: 1, + length: 1, + metadata: 1 + } + } + ) + .skip(offset) + .limit(limit) + .toArray(); + + if (imageFiles.length === 0) { + return { processed: 0, succeeded: 0, failed: 0, skipped: 0 }; + } + + // 2. Collect all imageIds and look up the matching records in dataset_datas + const imageIds = imageFiles.map((file) => file._id.toString()); + const dataList = await MongoDatasetData.find( + { + teamId: { $in: Array.from(new Set(imageFiles.map((file) => file.metadata?.teamId))) }, + datasetId: { $in: Array.from(new Set(imageFiles.map((file) => file.metadata?.datasetId))) }, + collectionId: { + $in: Array.from(new Set(imageFiles.map((file) => file.metadata?.collectionId))) + }, + imageId: { $in: imageIds } + }, + '_id imageId teamId datasetId collectionId updateTime' + ).lean(); + + if (dataList.length === 0) { + return { processed: 0, succeeded: 0, failed: 0, skipped: 0 }; + } + + // 3. 
Filter out already-completed items + const completedMigrations = await MongoDatasetMigrationLog.find( + { + resourceType: 'data_image', + resourceId: { $in: dataList.map((d) => d._id) }, + status: 'completed' + }, + 'resourceId' + ).lean(); + + const completedIds = new Set(completedMigrations.map((m) => m.resourceId.toString())); + const pendingDataList = dataList.filter((d) => !completedIds.has(d._id.toString())); + + const skippedCount = dataList.length - pendingDataList.length; + + if (pendingDataList.length === 0) { + addLog.info( + `[Migration ${batchId}] Image batch all skipped. Total: ${dataList.length}, Skipped: ${skippedCount}` + ); + return { + processed: dataList.length, + succeeded: 0, + failed: 0, + skipped: skippedCount + }; + } + + addLog.info( + `[Migration ${batchId}] Processing ${pendingDataList.length} images (${skippedCount} skipped)` + ); + + // 4. Build an imageId-to-file map + const imageFileMap = new Map(imageFiles.map((file) => [file._id.toString(), file])); + + // 5. Pair each data record with its image file + const imageDataPairs = pendingDataList + .map((data) => { + const imageFile = imageFileMap.get(data.imageId!); + if (!imageFile) { + addLog.warn( + `[Migration ${batchId}] Image file not found for imageId: ${data.imageId}, dataId: ${data._id}` + ); + return null; + } + return { data, imageFile }; + }) + .filter((pair) => pair !== null); + + if (imageDataPairs.length === 0) { + return { processed: dataList.length, succeeded: 0, failed: 0, skipped: dataList.length }; + } + + // 6. 
创建迁移日志 + const imageMigrationLogs = imageDataPairs.map(({ data, imageFile }) => ({ + batchId, + migrationVersion, + resourceType: 'data_image' as const, + resourceId: data._id, + teamId: data.teamId, + datasetId: data.datasetId, + sourceStorage: { + type: 'gridfs' as const, + fileId: data.imageId, + bucketName: 'dataset_image' as any + }, + status: 'pending' as const, + attemptCount: 0, + maxAttempts: 3, + verified: false, + operations: [], + metadata: { + fileName: imageFile.filename, + originalUpdateTime: data.updateTime, + nodeEnv: process.env.NODE_ENV + } + })); + + if (imageMigrationLogs.length > 0) { + await MongoDatasetMigrationLog.insertMany(imageMigrationLogs, { ordered: false }); + } + + // 7. 执行迁移 + const limitFn = pLimit(concurrency); + let succeeded = 0; + let failed = 0; + + const tasks = imageDataPairs.map(({ data, imageFile }) => + limitFn(async () => { + try { + const { key, dataId } = await migrateDatasetImage({ batchId, data, imageFile }); + await updateDatasetDataImageId({ batchId, dataId, key }); + succeeded++; + } catch (error) { + failed++; + addLog.error(`[Migration ${batchId}] Failed to migrate image for data ${data._id}:`, error); + } + }) + ); + + await Promise.allSettled(tasks); + + return { + processed: dataList.length, + succeeded, + failed, + skipped: skippedCount + }; +} + +// 从 GridFS 迁移单个图片到 S3 +async function migrateDatasetImage({ + batchId, + data, + imageFile +}: { + batchId: string; + data: DatasetDataSchemaType; + imageFile: any; +}) { + const { imageId, datasetId, _id } = data; + const dataId = _id.toString(); + + try { + // 更新状态为处理中 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'processing', + startedAt: new Date(), + lastAttemptAt: new Date() + }, + $inc: { attemptCount: 1 } + } + ); + + // 阶段 1: 从 GridFS 下载 + const downloadStartTime = Date.now(); + let buffer: Buffer; + try { + const bucket = getDatasetImageGridBucket(); + const stream = bucket.openDownloadStream(new 
Types.ObjectId(imageId!)); + buffer = await gridFSStreamToBuffer(stream); + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $push: { + operations: { + action: 'download_from_gridfs', + timestamp: new Date(), + success: true, + duration: Date.now() - downloadStartTime, + details: { + fileSize: buffer.length, + filename: imageFile.filename + } + } + }, + $set: { + 'sourceStorage.fileSize': buffer.length + } + } + ); + } catch (error) { + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? error.message : String(error), + 'error.stack': error instanceof Error ? error.stack : undefined, + 'error.phase': 'download' + } + } + ); + throw error; + } + + // 阶段 2: 上传到 S3 + const uploadStartTime = Date.now(); + let key: string; + try { + // 从文件名中提取扩展名 + const mimetype = imageFile.contentType || 'image/png'; + const filename = imageFile.filename || 'image.png'; + + // 截断文件名以避免S3 key过长的问题 + const truncatedFilename = truncateFilename(filename); + + // 构造 S3 key + const { fileKey: s3Key } = getFileS3Key.dataset({ datasetId, filename: truncatedFilename }); + + // 使用 uploadImage2S3Bucket 上传图片(不设置过期时间) + key = await uploadImage2S3Bucket('private', { + base64Img: buffer.toString('base64'), + uploadKey: s3Key, + mimetype, + filename: truncatedFilename, + expiredTime: undefined // 不设置过期时间 + }); + + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $push: { + operations: { + action: 'upload_to_s3', + timestamp: new Date(), + success: true, + duration: Date.now() - uploadStartTime, + details: { + s3Key: key + } + } + }, + $set: { + 'targetStorage.key': key, + 'targetStorage.fileSize': buffer.length + } + } + ); + } catch (error) { + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? 
error.message : String(error), + 'error.stack': error instanceof Error ? error.stack : undefined, + 'error.phase': 'upload' + } + } + ); + throw error; + } + + return { + key, + dataId + }; + } catch (error) { + addLog.error(`[Migration ${batchId}] Failed to migrate image for data ${dataId}:`, error); + throw error; + } +} + +// 更新 dataset_datas 的 imageId 为 S3 的 key +async function updateDatasetDataImageId({ + batchId, + dataId, + key +}: { + batchId: string; + dataId: string; + key: string; +}) { + const updateStartTime = Date.now(); + + try { + // 更新 data imageId + await MongoDatasetData.updateOne({ _id: dataId }, { $set: { imageId: key } }); + + // 标记迁移为完成 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: dataId }, + { + $set: { + status: 'completed', + completedAt: new Date() + }, + $push: { + operations: { + action: 'update_data_imageId', + timestamp: new Date(), + success: true, + duration: Date.now() - updateStartTime, + details: { + newImageId: key + } + } + } + } + ); + + return { + dataId, + key + }; + } catch (error) { + // 标记迁移为失败 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: dataId }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? error.message : String(error), + 'error.stack': error instanceof Error ? 
error.stack : undefined, + 'error.phase': 'update_db' + } + } + ); + + addLog.error(`[Migration ${batchId}] Failed to update data ${dataId}:`, error); + throw error; + } +} + +// 批量删除已完成迁移的 S3 文件的 TTL +async function removeTTLForCompletedMigrations(batchId: string) { + try { + addLog.info(`[Migration ${batchId}] Removing TTL for completed migrations...`); + + // 分批删除,避免一次查询太多 + const BATCH_SIZE = 5000; + let offset = 0; + let totalRemoved = 0; + + while (true) { + const completedMigrations = await MongoDatasetMigrationLog.find( + { + batchId, + status: 'completed', + 'targetStorage.key': { $exists: true, $ne: null } + }, + 'targetStorage.key' + ) + .skip(offset) + .limit(BATCH_SIZE) + .lean(); + + if (completedMigrations.length === 0) break; + + const keys = completedMigrations + .map((log) => log.targetStorage?.key) + .filter(Boolean) as string[]; + + if (keys.length > 0) { + await removeS3TTL({ key: keys, bucketName: 'private' }); + totalRemoved += keys.length; + addLog.info(`[Migration ${batchId}] Removed TTL for ${totalRemoved} objects so far`); + } + + offset += BATCH_SIZE; + + if (completedMigrations.length < BATCH_SIZE) break; + } + + addLog.info(`[Migration ${batchId}] Total TTL removed: ${totalRemoved}`); + } catch (error) { + addLog.error(`[Migration ${batchId}] Failed to remove TTL:`, error); + // 不抛出错误,因为这不是致命问题 + } +} + +async function handler(req: NextApiRequest, _res: NextApiResponse) { + await authCert({ req, authRoot: true }); + + // 迁移配置 + const config = { + collectionBatchSize: 500, + collectionConcurrency: 10, + imageBatchSize: 500, + imageConcurrency: 5, + pauseBetweenBatches: 1000 // ms + }; + + // 生成唯一的批次 ID + const batchId = `migration_${Date.now()}_${randomUUID()}`; + const migrationVersion = 'v4.14.3'; + + addLog.info(`[Migration ${batchId}] Starting migration ${migrationVersion}`); + addLog.info( + `[Migration ${batchId}] Config: collectionBatch=${config.collectionBatchSize}, collectionConcurrency=${config.collectionConcurrency}, 
imageBatch=${config.imageBatchSize}, imageConcurrency=${config.imageConcurrency}` + ); + + // ========== Image Migration ========== + addLog.info(`[Migration ${batchId}] Starting image migration...`); + + const totalImageFiles = await getDatasetImageGFSCollection().countDocuments({}); + addLog.info(`[Migration ${batchId}] Total image files in GridFS: ${totalImageFiles}`); + + let imageStats = { + processed: 0, + succeeded: 0, + failed: 0, + skipped: 0 + }; + + // 分批处理 images + for (let offset = 0; offset < totalImageFiles; offset += config.imageBatchSize) { + const currentBatch = Math.floor(offset / config.imageBatchSize) + 1; + const totalBatches = Math.ceil(totalImageFiles / config.imageBatchSize); + + addLog.info( + `[Migration ${batchId}] Processing images batch ${currentBatch}/${totalBatches} (${offset}-${offset + config.imageBatchSize})` + ); + + const batchStats = await processImageBatch({ + batchId, + migrationVersion, + offset, + limit: config.imageBatchSize, + concurrency: config.imageConcurrency + }); + + imageStats.processed += batchStats.processed; + imageStats.succeeded += batchStats.succeeded; + imageStats.failed += batchStats.failed; + imageStats.skipped += batchStats.skipped; + + addLog.info( + `[Migration ${batchId}] Batch ${currentBatch}/${totalBatches} completed. Batch: +${batchStats.succeeded} succeeded, +${batchStats.failed} failed. 
Total progress: ${imageStats.succeeded}/${totalImageFiles}` + ); + + // 暂停一下 + if (offset + config.imageBatchSize < totalImageFiles) { + await new Promise((resolve) => setTimeout(resolve, config.pauseBetweenBatches)); + } + } + + // ========== 批量删除已完成迁移的 TTL ========== + await removeTTLForCompletedMigrations(batchId); + + // ========== 汇总统计 ========== + addLog.info(`[Migration ${batchId}] ========== Migration Summary ==========`); + + addLog.info( + `[Migration ${batchId}] Images - Total: ${totalImageFiles}, Succeeded: ${imageStats.succeeded}, Failed: ${imageStats.failed}, Skipped: ${imageStats.skipped}` + ); + addLog.info(`[Migration ${batchId}] =======================================`); + + return { + batchId, + migrationVersion, + summary: { + images: { + total: totalImageFiles, + processed: imageStats.processed, + succeeded: imageStats.succeeded, + failed: imageStats.failed, + skipped: imageStats.skipped + } + } + }; +} + +export default NextAPI(handler); diff --git a/projects/app/src/pages/api/admin/initv4143.ts b/projects/app/src/pages/api/admin/initv4143.ts index cb9f743b0..d1039a65f 100644 --- a/projects/app/src/pages/api/admin/initv4143.ts +++ b/projects/app/src/pages/api/admin/initv4143.ts @@ -169,7 +169,7 @@ async function migrateDatasetCollection({ const uploadStartTime = Date.now(); let key: string; try { - key = await getS3DatasetSource().uploadDatasetFileByBuffer({ + key = await getS3DatasetSource().upload({ buffer, datasetId, filename: name diff --git a/projects/app/src/pages/api/admin/initv4144.ts b/projects/app/src/pages/api/admin/initv4144.ts new file mode 100644 index 000000000..d60b37bbb --- /dev/null +++ b/projects/app/src/pages/api/admin/initv4144.ts @@ -0,0 +1,570 @@ +import { NextAPI } from '@/service/middleware/entry'; +import { addLog } from '@fastgpt/service/common/system/log'; +import { authCert } from '@fastgpt/service/support/permission/auth/common'; +import { type NextApiRequest, type NextApiResponse } from 'next'; +import { 
getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; +import { + getDownloadStream, + getGFSCollection +} from '@fastgpt/service/common/file/gridfs/controller'; +import { MongoDatasetCollection } from '@fastgpt/service/core/dataset/collection/schema'; +import pLimit from 'p-limit'; +import { MongoDatasetMigrationLog } from '@fastgpt/service/core/dataset/migration/schema'; +import type { DatasetCollectionSchemaType } from '@fastgpt/global/core/dataset/type'; +import { randomUUID } from 'crypto'; +import { MongoDatasetData } from '@fastgpt/service/core/dataset/data/schema'; +import type { DatasetDataSchemaType } from '@fastgpt/global/core/dataset/type'; +import { + uploadImage2S3Bucket, + removeS3TTL, + getFileS3Key, + truncateFilename +} from '@fastgpt/service/common/s3/utils'; +import { connectionMongo, Types } from '@fastgpt/service/common/mongo'; + +// 将 GridFS 的流转换为 Buffer +async function gridFSStreamToBuffer( + stream: Awaited> +): Promise { + const chunks: Buffer[] = []; + + stream.on('data', (chunk) => chunks.push(chunk)); + + await new Promise((resolve, reject) => { + stream.on('end', resolve); + stream.on('error', reject); + }); + + return Buffer.concat(chunks); +} + +// 将 MongoDB 中 ObjectId 类型的 fileId 转换为字符串 +async function convertFileIdToString(batchId: string) { + addLog.info(`[Migration ${batchId}] Converting ObjectId fileId to String in database...`); + + // 查找所有 fileId 存在且不为 null 的 collection + const collections = await MongoDatasetCollection.find( + { + fileId: { $exists: true, $ne: null } + }, + '_id fileId' + ).lean(); + + if (collections.length === 0) { + addLog.info(`[Migration ${batchId}] No collections with fileId found`); + return { converted: 0 }; + } + + addLog.info( + `[Migration ${batchId}] Found ${collections.length} collections with fileId, starting conversion...` + ); + + let convertedCount = 0; + const limit = pLimit(50); + + const tasks = collections.map((collection) => + limit(async () => { + try { + // 确保 fileId 
存在 + if (!collection.fileId) { + return; + } + + // 将 ObjectId 转换为字符串 + const fileIdStr = collection.fileId.toString(); + + // 更新为字符串类型 + await MongoDatasetCollection.updateOne( + { _id: collection._id }, + { $set: { fileId: fileIdStr } } + ); + + convertedCount++; + } catch (error) { + addLog.error( + `[Migration ${batchId}] Failed to convert fileId for collection ${collection._id}:`, + error + ); + } + }) + ); + + await Promise.all(tasks); + + addLog.info(`[Migration ${batchId}] Converted ${convertedCount} fileId fields to String type`); + + return { converted: convertedCount }; +} + +// 从 GridFS 上传到 S3 +async function migrateDatasetCollection({ + batchId, + collection +}: { + batchId: string; + collection: DatasetCollectionSchemaType; +}) { + const { fileId, datasetId, name, _id } = collection; + const collectionId = _id.toString(); + + try { + // 更新状态为处理中 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'processing', + startedAt: new Date(), + lastAttemptAt: new Date() + }, + $inc: { attemptCount: 1 } + } + ); + + // 阶段 1: 从 GridFS 下载 + const downloadStartTime = Date.now(); + let buffer: Buffer; + try { + const stream = await getDownloadStream({ + bucketName: 'dataset', + fileId: fileId! + }); + buffer = await gridFSStreamToBuffer(stream); + + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $push: { + operations: { + action: 'download_from_gridfs', + timestamp: new Date(), + success: true, + duration: Date.now() - downloadStartTime, + details: { + fileSize: buffer.length + } + } + }, + $set: { + 'sourceStorage.fileSize': buffer.length + } + } + ); + } catch (error) { + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? error.message : String(error), + 'error.stack': error instanceof Error ? 
error.stack : undefined, + 'error.phase': 'download' + } + } + ); + throw error; + } + + // 阶段 2: 上传到 S3 + const uploadStartTime = Date.now(); + let key: string; + try { + key = await getS3DatasetSource().upload({ + buffer, + datasetId, + filename: name + }); + + // 立即删除 TTL(uploadDatasetFileByBuffer 会创建 3 小时的 TTL) + await removeS3TTL({ key, bucketName: 'private' }); + + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $push: { + operations: { + action: 'upload_to_s3', + timestamp: new Date(), + success: true, + duration: Date.now() - uploadStartTime, + details: { + s3Key: key + } + } + }, + $set: { + 'targetStorage.key': key, + 'targetStorage.fileSize': buffer.length + } + } + ); + } catch (error) { + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: _id }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? error.message : String(error), + 'error.stack': error instanceof Error ? error.stack : undefined, + 'error.phase': 'upload' + } + } + ); + throw error; + } + + return { + key, + collectionId + }; + } catch (error) { + addLog.error(`[Migration ${batchId}] Failed to migrate collection ${collectionId}:`, error); + throw error; + } +} + +// 处理单批 collection +async function processCollectionBatch({ + batchId, + migrationVersion, + offset, + limit, + concurrency +}: { + batchId: string; + migrationVersion: string; + offset: number; + limit: number; + concurrency: number; +}) { + // 1. 获取这一批的 GridFS 文件 + const files = await getGFSCollection('dataset') + .find( + {}, + { + projection: { + _id: 1, + metadata: { teamId: 1 } + } + } + ) + .sort({ _id: -1 }) + .skip(offset) + .limit(limit) + .toArray(); + + if (files.length === 0) { + return { processed: 0, succeeded: 0, failed: 0, skipped: 0 }; + } + + // 2. 
查找对应的 collections + const fileIds = files.map((f) => f._id); + const collections = await MongoDatasetCollection.find( + { fileId: { $in: fileIds, $not: { $regex: /^dataset\// } } }, + '_id fileId teamId datasetId type parentId name updateTime' + ).lean(); + + if (collections.length === 0) { + return { processed: 0, succeeded: 0, failed: 0, skipped: 0 }; + } + + // 3. 过滤已完成的 + const completedMigrations = await MongoDatasetMigrationLog.find( + { + resourceType: 'collection', + resourceId: { $in: collections.map((c) => c._id) }, + status: 'completed' + }, + 'resourceId' + ).lean(); + + const completedIds = new Set(completedMigrations.map((m) => m.resourceId.toString())); + const pendingCollections = collections.filter((c) => !completedIds.has(c._id.toString())); + + const skippedCount = collections.length - pendingCollections.length; + + if (pendingCollections.length === 0) { + addLog.info( + `[Migration ${batchId}] Batch all skipped. Total: ${collections.length}, Skipped: ${skippedCount}` + ); + return { + processed: collections.length, + succeeded: 0, + failed: 0, + skipped: skippedCount + }; + } + + addLog.info( + `[Migration ${batchId}] Processing ${pendingCollections.length} collections (${skippedCount} skipped)` + ); + + // 4. 创建迁移日志 + const migrationLogs = pendingCollections.map((collection) => ({ + batchId, + migrationVersion, + resourceType: 'collection' as const, + resourceId: collection._id, + teamId: collection.teamId, + datasetId: collection.datasetId, + sourceStorage: { + type: 'gridfs' as const, + fileId: collection.fileId, + bucketName: 'dataset' + }, + status: 'pending' as const, + attemptCount: 0, + maxAttempts: 3, + verified: false, + operations: [], + metadata: { + fileName: collection.name, + originalUpdateTime: collection.updateTime, + nodeEnv: process.env.NODE_ENV + } + })); + + if (migrationLogs.length > 0) { + await MongoDatasetMigrationLog.insertMany(migrationLogs, { ordered: false }); + } + + // 5. 
执行迁移(降低并发) + const limitFn = pLimit(concurrency); + let succeeded = 0; + let failed = 0; + + const tasks = pendingCollections.map((collection) => + limitFn(async () => { + try { + const { key, collectionId } = await migrateDatasetCollection({ + batchId, + collection + }); + await updateDatasetCollectionFileId({ batchId, collectionId, key }); + succeeded++; + } catch (error) { + failed++; + addLog.error( + `[Migration ${batchId}] Failed to migrate collection ${collection._id}:`, + error + ); + } + }) + ); + + await Promise.allSettled(tasks); + + return { + processed: collections.length, + succeeded, + failed, + skipped: skippedCount + }; +} + +// 修改 dataset collection 的 fileId 为 S3 的 key +async function updateDatasetCollectionFileId({ + batchId, + collectionId, + key +}: { + batchId: string; + collectionId: string; + key: string; +}) { + const updateStartTime = Date.now(); + + try { + // 更新 collection fileId + await MongoDatasetCollection.updateOne({ _id: collectionId }, { $set: { fileId: key } }); + + // 标记迁移为完成 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: collectionId }, + { + $set: { + status: 'completed', + completedAt: new Date() + }, + $push: { + operations: { + action: 'update_collection_fileId', + timestamp: new Date(), + success: true, + duration: Date.now() - updateStartTime, + details: { + newFileId: key + } + } + } + } + ); + + return { + collectionId, + key + }; + } catch (error) { + // 标记迁移为失败 + await MongoDatasetMigrationLog.updateOne( + { batchId, resourceId: collectionId }, + { + $set: { + status: 'failed', + 'error.message': error instanceof Error ? error.message : String(error), + 'error.stack': error instanceof Error ? 
error.stack : undefined, + 'error.phase': 'update_db' + } + } + ); + + addLog.error(`[Migration ${batchId}] Failed to update collection ${collectionId}:`, error); + throw error; + } +} + +// 批量删除已完成迁移的 S3 文件的 TTL +async function removeTTLForCompletedMigrations(batchId: string) { + try { + addLog.info(`[Migration ${batchId}] Removing TTL for completed migrations...`); + + // 分批删除,避免一次查询太多 + const BATCH_SIZE = 5000; + let offset = 0; + let totalRemoved = 0; + + while (true) { + const completedMigrations = await MongoDatasetMigrationLog.find( + { + batchId, + status: 'completed', + 'targetStorage.key': { $exists: true, $ne: null } + }, + 'targetStorage.key' + ) + .skip(offset) + .limit(BATCH_SIZE) + .lean(); + + if (completedMigrations.length === 0) break; + + const keys = completedMigrations + .map((log) => log.targetStorage?.key) + .filter(Boolean) as string[]; + + if (keys.length > 0) { + await removeS3TTL({ key: keys, bucketName: 'private' }); + totalRemoved += keys.length; + addLog.info(`[Migration ${batchId}] Removed TTL for ${totalRemoved} objects so far`); + } + + offset += BATCH_SIZE; + + if (completedMigrations.length < BATCH_SIZE) break; + } + + addLog.info(`[Migration ${batchId}] Total TTL removed: ${totalRemoved}`); + } catch (error) { + addLog.error(`[Migration ${batchId}] Failed to remove TTL:`, error); + // 不抛出错误,因为这不是致命问题 + } +} + +async function handler(req: NextApiRequest, _res: NextApiResponse) { + await authCert({ req, authRoot: true }); + + // 迁移配置 + const config = { + collectionBatchSize: 500, + collectionConcurrency: 10, + imageBatchSize: 500, + imageConcurrency: 5, + pauseBetweenBatches: 1000 // ms + }; + + // 生成唯一的批次 ID + const batchId = `migration_${Date.now()}_${randomUUID()}`; + const migrationVersion = 'v4.14.4'; + + addLog.info(`[Migration ${batchId}] Starting migration ${migrationVersion}`); + addLog.info( + `[Migration ${batchId}] Config: collectionBatch=${config.collectionBatchSize}, 
collectionConcurrency=${config.collectionConcurrency}, imageBatch=${config.imageBatchSize}, imageConcurrency=${config.imageConcurrency}` + ); + + // 步骤 0: 将现有数据库中的 ObjectId 类型的 fileId 转换为 String + const { converted } = await convertFileIdToString(batchId); + + // ========== Collection 迁移 ========== + const totalCollectionFiles = await getGFSCollection('dataset').countDocuments({ + uploadDate: { $gte: new Date('2025-11-20') } + }); + addLog.info(`[Migration ${batchId}] Total collection files in GridFS: ${totalCollectionFiles}`); + + let collectionStats = { + processed: 0, + succeeded: 0, + failed: 0, + skipped: 0 + }; + + // 分批处理 collections + for (let offset = 0; offset < totalCollectionFiles; offset += config.collectionBatchSize) { + const currentBatch = Math.floor(offset / config.collectionBatchSize) + 1; + const totalBatches = Math.ceil(totalCollectionFiles / config.collectionBatchSize); + + addLog.info( + `[Migration ${batchId}] Processing collections batch ${currentBatch}/${totalBatches} (${offset}-${offset + config.collectionBatchSize})` + ); + + const batchStats = await processCollectionBatch({ + batchId, + migrationVersion, + offset, + limit: config.collectionBatchSize, + concurrency: config.collectionConcurrency + }); + + collectionStats.processed += batchStats.processed; + collectionStats.succeeded += batchStats.succeeded; + collectionStats.failed += batchStats.failed; + collectionStats.skipped += batchStats.skipped; + + addLog.info( + `[Migration ${batchId}] Batch ${currentBatch}/${totalBatches} completed. Batch: +${batchStats.succeeded} succeeded, +${batchStats.failed} failed. 
Total progress: ${collectionStats.succeeded}/${totalCollectionFiles}` + ); + + // 暂停一下 + if (offset + config.collectionBatchSize < totalCollectionFiles) { + await new Promise((resolve) => setTimeout(resolve, config.pauseBetweenBatches)); + } + } + + // ========== 批量删除已完成迁移的 TTL ========== + await removeTTLForCompletedMigrations(batchId); + + // ========== 汇总统计 ========== + addLog.info(`[Migration ${batchId}] ========== Migration Summary ==========`); + addLog.info( + `[Migration ${batchId}] Collections - Total: ${totalCollectionFiles}, Succeeded: ${collectionStats.succeeded}, Failed: ${collectionStats.failed}, Skipped: ${collectionStats.skipped}` + ); + + addLog.info(`[Migration ${batchId}] Converted fileId: ${converted}`); + addLog.info(`[Migration ${batchId}] =======================================`); + + return { + batchId, + migrationVersion, + summary: { + convertedFileIdCount: converted, + collections: { + total: totalCollectionFiles, + processed: collectionStats.processed, + succeeded: collectionStats.succeeded, + failed: collectionStats.failed, + skipped: collectionStats.skipped + } + } + }; +} + +export default NextAPI(handler); diff --git a/projects/app/src/pages/api/admin/support/appRegistration/create.ts b/projects/app/src/pages/api/admin/support/appRegistration/create.ts new file mode 100644 index 000000000..2e7692926 --- /dev/null +++ b/projects/app/src/pages/api/admin/support/appRegistration/create.ts @@ -0,0 +1,93 @@ +import type { NextApiRequest, NextApiResponse } from 'next'; +import { authCert } from '@fastgpt/service/support/permission/auth/common'; +import { NextAPI } from '@/service/middleware/entry'; +import { MongoAppRegistration } from '@fastgpt/service/support/appRegistration/schema'; +import { MongoApp } from '@fastgpt/service/core/app/schema'; +import type { ApiRequestProps } from '@fastgpt/service/type/next'; +import { getTeamPlanStatus } from '@fastgpt/service/support/wallet/sub/utils'; + +type CreateAppRegistrationBody = { + appId: 
string; +}; + +/** + * Admin API - Create an app registration record + * + * What it does: + * 1. Authenticate with rootkey (requires super-admin permission) + * 2. Check that the app exists + * 3. Check whether the app is already registered (to avoid duplicate registrations) + * 4. Check whether the team's registered-app quota is full + * 5. If quota remains, create the registration record + * + * Usage: + * curl -X POST http://localhost:3000/api/admin/support/appRegistration/create \ + * -H "Content-Type: application/json" \ + * -H "rootkey: your-root-key" \ + * -d '{"appId": "your-app-id"}' + * + */ +async function handler( + req: ApiRequestProps, + res: NextApiResponse +): Promise<{ success: boolean }> { + const { appId } = req.body; + + if (!appId) { + return Promise.reject('appId is required'); + } + + // Authenticate with rootkey + await authCert({ + req, + authRoot: true + }); + + // Look up the app to get its teamId and tmbId + const app = await MongoApp.findById(appId); + if (!app) { + return Promise.reject('App not found'); + } + + // Check whether a registration record already exists + const existingRegistration = await MongoAppRegistration.findOne({ + teamId: app.teamId, + appId: app._id + }); + if (existingRegistration) { + return Promise.reject('Registration already exists for this app'); + } + + // Get the team's plan info + const teamPlanStatus = await getTeamPlanStatus({ teamId: String(app.teamId) }); + const appRegistrationLimit = teamPlanStatus?.standardConstants?.appRegistrationCount; + + // Check whether a quota limit applies + if (appRegistrationLimit && appRegistrationLimit > 0) { + // Count the team's existing registration records + const currentRegistrationCount = await MongoAppRegistration.countDocuments({ + teamId: app.teamId + }); + + // Check whether the quota cap has been reached + if (currentRegistrationCount >= appRegistrationLimit) { + return Promise.reject( + `Registration quota exceeded. 
Current: ${currentRegistrationCount}, Limit: ${appRegistrationLimit}` + ); + } + } + + // Create the registration record + await MongoAppRegistration.create({ + teamId: app.teamId, + tmbId: app.tmbId, + appId: app._id, + createdAt: new Date() + }); + + return { + success: true + }; +} + +export default NextAPI(handler); diff --git a/projects/app/src/pages/api/common/file/read.ts b/projects/app/src/pages/api/common/file/read.ts deleted file mode 100644 index c81a4b0de..000000000 --- a/projects/app/src/pages/api/common/file/read.ts +++ /dev/null @@ -1,82 +0,0 @@ -import type { NextApiRequest, NextApiResponse } from 'next'; -import { jsonRes } from '@fastgpt/service/common/response'; -import { getDownloadStream, getFileById } from '@fastgpt/service/common/file/gridfs/controller'; -import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; -import { stream2Encoding } from '@fastgpt/service/common/file/gridfs/utils'; -import { authFileToken } from '@fastgpt/service/support/permission/auth/file'; - -const previewableExtensions = [ - 'jpg', - 'jpeg', - 'png', - 'gif', - 'bmp', - 'webp', - 'txt', - 'log', - 'csv', - 'md', - 'json' -]; - -// Abandoned, use: file/read/[filename].ts -export default async function handler(req: NextApiRequest, res: NextApiResponse) { - try { - const { token } = req.query as { token: string }; - - const { fileId, bucketName } = await authFileToken(token); - - if (!fileId) { - throw new Error('fileId is empty'); - } - - const [file, fileStream] = await Promise.all([ - getFileById({ bucketName, fileId }), - getDownloadStream({ bucketName, fileId }) - ]); - - if (!file) { - return Promise.reject(CommonErrEnum.fileNotFound); - } - - const { stream, encoding } = await (async () => { - if (file.metadata?.encoding) { - return { - stream: fileStream, - encoding: file.metadata.encoding - }; - } - return stream2Encoding(fileStream); - })(); - - const extension = file.filename.split('.').pop() || ''; - const disposition = previewableExtensions.includes(extension) ? 
'inline' : 'attachment'; - - res.setHeader('Content-Type', `${file.contentType}; charset=${encoding}`); - res.setHeader('Cache-Control', 'public, max-age=31536000'); - res.setHeader( - 'Content-Disposition', - `${disposition}; filename="${encodeURIComponent(file.filename)}"` - ); - res.setHeader('Content-Length', file.length); - - stream.pipe(res); - - stream.on('error', () => { - res.status(500).end(); - }); - stream.on('end', () => { - res.end(); - }); - } catch (error) { - jsonRes(res, { - code: 500, - error - }); - } -} -export const config = { - api: { - responseLimit: '100mb' - } -}; diff --git a/projects/app/src/pages/api/common/file/read/[filename].ts b/projects/app/src/pages/api/common/file/read/[filename].ts index 19b4118d0..13748b6e8 100644 --- a/projects/app/src/pages/api/common/file/read/[filename].ts +++ b/projects/app/src/pages/api/common/file/read/[filename].ts @@ -1,6 +1,5 @@ import type { NextApiRequest, NextApiResponse } from 'next'; import { jsonRes } from '@fastgpt/service/common/response'; -import { getDownloadStream, getFileById } from '@fastgpt/service/common/file/gridfs/controller'; import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import { stream2Encoding } from '@fastgpt/service/common/file/gridfs/utils'; import { authFileToken } from '@fastgpt/service/support/permission/auth/file'; @@ -24,41 +23,24 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse< try { const { token, filename } = req.query as { token: string; filename: string }; - const { fileId, bucketName } = await authFileToken(token); + const { fileId } = await authFileToken(token); - if (!fileId) { - throw new Error('fileId is empty'); + if (!fileId || !isS3ObjectKey(fileId, 'dataset')) { + throw new Error('Invalid fileId'); } - const [file, fileStream] = await (() => { - if (isS3ObjectKey(fileId, 'dataset')) { - return Promise.all([ - getS3DatasetSource().getFileMetadata(fileId), - 
getS3DatasetSource().getDatasetFileStream(fileId) - ]); - } - - return Promise.all([ - getFileById({ bucketName, fileId }), - getDownloadStream({ bucketName, fileId }) - ]); - })(); + const [file, fileStream] = await Promise.all([ + getS3DatasetSource().getFileMetadata(fileId), + getS3DatasetSource().getDatasetFileStream(fileId) + ]); if (!file) { return Promise.reject(CommonErrEnum.fileNotFound); } - const { stream, encoding } = await (async () => { - if ('metadata' in file && file.metadata?.encoding) { - return { - stream: fileStream, - encoding: file.metadata.encoding - }; - } - return stream2Encoding(fileStream); - })(); + const { stream, encoding } = await stream2Encoding(fileStream); - const extension = file.filename.split('.').pop() || ''; + const extension = file.extension; const disposition = previewableExtensions.includes(extension) ? 'inline' : 'attachment'; res.setHeader('Content-Type', `${file.contentType}; charset=${encoding}`); @@ -67,7 +49,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse< 'Content-Disposition', `${disposition}; filename="${encodeURIComponent(filename)}"` ); - res.setHeader('Content-Length', 'contentLength' in file ? 
file.contentLength : file.length); + res.setHeader('Content-Length', file.contentLength); stream.pipe(res); diff --git a/projects/app/src/pages/api/core/app/create.ts b/projects/app/src/pages/api/core/app/create.ts index 8d2e484b0..ea9265c5f 100644 --- a/projects/app/src/pages/api/core/app/create.ts +++ b/projects/app/src/pages/api/core/app/create.ts @@ -18,7 +18,7 @@ import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; import { MongoApp } from '@fastgpt/service/core/app/schema'; import { MongoAppVersion } from '@fastgpt/service/core/app/version/schema'; import { authApp } from '@fastgpt/service/support/permission/app/auth'; -import { checkTeamAppLimit } from '@fastgpt/service/support/permission/teamLimit'; +import { checkTeamAppTypeLimit } from '@fastgpt/service/support/permission/teamLimit'; import { authUserPer } from '@fastgpt/service/support/permission/user/auth'; import { MongoTeamMember } from '@fastgpt/service/support/user/team/teamMemberSchema'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; @@ -62,7 +62,11 @@ async function handler(req: ApiRequestProps) { : await authUserPer({ req, authToken: true, per: TeamAppCreatePermissionVal }); // Quota check - await checkTeamAppLimit(teamId); + await checkTeamAppTypeLimit({ + teamId, + appCheckType: type === AppTypeEnum.workflowTool ? 
'tool' : 'app' + }); + const tmb = await MongoTeamMember.findById({ _id: tmbId }, 'userId') .populate<{ user: { username: string }; @@ -168,13 +172,15 @@ export const onCreateApp = async ({ if (!template?.avatar) return avatar; const s3AvatarSource = getS3AvatarSource(); - if (!isS3ObjectKey(template.avatar?.slice(s3AvatarSource.prefix.length), 'avatar')) + if (!isS3ObjectKey(template.avatar?.slice(s3AvatarSource.prefix.length), 'avatar')) { return template.avatar; + } const filename = (() => { - const last = template.avatar.split('/').pop()?.split('-')[1]; + const last = template.avatar.split('/').pop(); if (!last) return getNanoid(6).concat(path.extname(template.avatar)); - return `${getNanoid(6)}-${last}`; + const firstDashIndex = last.indexOf('-'); + return `${getNanoid(6)}-${firstDashIndex === -1 ? last : last.slice(firstDashIndex + 1)}`; })(); return await s3AvatarSource.copyAvatar({ diff --git a/projects/app/src/pages/api/core/app/exportChatLogs.ts b/projects/app/src/pages/api/core/app/exportChatLogs.ts index 88ce75542..d75518fe3 100644 --- a/projects/app/src/pages/api/core/app/exportChatLogs.ts +++ b/projects/app/src/pages/api/core/app/exportChatLogs.ts @@ -1,13 +1,12 @@ import type { NextApiResponse } from 'next'; import { responseWriteController } from '@fastgpt/service/common/response'; -import { addDays } from 'date-fns'; import { readFromSecondary } from '@fastgpt/service/common/mongo/utils'; import { addLog } from '@fastgpt/service/common/system/log'; import dayjs from 'dayjs'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; import { replaceRegChars } from '@fastgpt/global/common/string/tools'; import { NextAPI } from '@/service/middleware/entry'; -import { type GetAppChatLogsProps } from '@/global/core/api/appReq'; +import type { GetAppChatLogsProps } from '@/global/core/api/appReq'; import { authApp } from '@fastgpt/service/support/permission/app/auth'; import { Types } from '@fastgpt/service/common/mongo'; import { MongoChat } 
from '@fastgpt/service/core/chat/chatSchema'; @@ -26,6 +25,8 @@ import { useIPFrequencyLimit } from '@fastgpt/service/common/middle/reqFrequency import { getAppLatestVersion } from '@fastgpt/service/core/app/version/controller'; import { VariableInputEnum } from '@fastgpt/global/core/workflow/constants'; import { getTimezoneCodeFromStr } from '@fastgpt/global/common/time/timezone'; +import { getLocationFromIp } from '@fastgpt/service/common/geo'; +import type { I18nName } from '@fastgpt/service/common/geo/type'; const formatJsonString = (data: any) => { if (data == null) return ''; @@ -39,6 +40,7 @@ export type ExportChatLogsBody = GetAppChatLogsProps & { title: string; sourcesMap: Record; logKeys: AppLogKeysEnum[]; + locale?: keyof I18nName; }; async function handler(req: ApiRequestProps, res: NextApiResponse) { @@ -49,6 +51,7 @@ async function handler(req: ApiRequestProps, res: NextAp sources, tmbIds, chatSearch, + locale = 'en', title, sourcesMap, logKeys = [] @@ -104,13 +107,15 @@ async function handler(req: ApiRequestProps, res: NextAp }, ...(sources && { source: { $in: sources } }), ...(tmbIds && { tmbId: { $in: tmbIds } }), - ...(chatSearch && { - $or: [ - { chatId: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, - { title: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, - { customTitle: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } } - ] - }) + ...(chatSearch + ? 
{ + $or: [ + { chatId: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, + { title: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, + { customTitle: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } } + ] + } + : undefined) }; res.setHeader('Content-Type', 'text/csv; charset=utf-8;'); @@ -364,7 +369,8 @@ async function handler(req: ApiRequestProps, res: NextAp customFeedbackItems: 1, markItems: 1, chatDetails: 1, - variables: 1 + variables: 1, + originIp: '$metadata.originIp' } } ], @@ -394,6 +400,7 @@ async function handler(req: ApiRequestProps, res: NextAp const tmbName = doc.outLinkUid ? doc.outLinkUid : teamMemberWithContact.find((member) => String(member.memberId) === String(doc.tmbId))?.name; + const region = getLocationFromIp(doc.originIp, locale); const valueMap: Record any> = { [AppLogKeysEnum.SOURCE]: () => source, @@ -432,6 +439,7 @@ async function handler(req: ApiRequestProps, res: NextAp doc.averageResponseTime ? Number(doc.averageResponseTime).toFixed(2) : 0, [AppLogKeysEnum.ERROR_COUNT]: () => doc.errorCount || 0, [AppLogKeysEnum.POINTS]: () => (doc.totalPoints ? 
Number(doc.totalPoints).toFixed(2) : 0), + [AppLogKeysEnum.REGION]: () => region, chatDetails: () => formatJsonString(doc.chatDetails || []) }; diff --git a/projects/app/src/pages/api/core/app/folder/create.ts b/projects/app/src/pages/api/core/app/folder/create.ts index 81fda4dd1..6f31f9c68 100644 --- a/projects/app/src/pages/api/core/app/folder/create.ts +++ b/projects/app/src/pages/api/core/app/folder/create.ts @@ -3,7 +3,7 @@ import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import { FolderImgUrl } from '@fastgpt/global/common/file/image/constants'; import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type'; import { parseParentIdInMongo } from '@fastgpt/global/common/parentFolder/utils'; -import { AppFolderTypeList, AppTypeEnum } from '@fastgpt/global/core/app/constants'; +import { AppTypeEnum } from '@fastgpt/global/core/app/constants'; import { PerResourceTypeEnum, WritePermissionVal @@ -17,6 +17,7 @@ import { authUserPer } from '@fastgpt/service/support/permission/user/auth'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; import { addAuditLog } from '@fastgpt/service/support/user/audit/util'; import { AuditEventEnum } from '@fastgpt/global/support/user/audit/constants'; +import { checkTeamAppTypeLimit } from '@fastgpt/service/support/permission/teamLimit'; export type CreateAppFolderBody = { parentId?: ParentIdType; name: string; @@ -40,6 +41,8 @@ async function handler(req: ApiRequestProps) { ? 
await authApp({ req, appId: parentId, per: WritePermissionVal, authToken: true }) : await authUserPer({ req, authToken: true, per: TeamAppCreatePermissionVal }); + await checkTeamAppTypeLimit({ teamId, appCheckType: 'folder' }); + // Create app await mongoSessionRun(async (session) => { const app = await MongoApp.create({ diff --git a/projects/app/src/pages/api/core/app/getChatLogs.ts b/projects/app/src/pages/api/core/app/getChatLogs.ts index c566f36f3..45047dbe2 100644 --- a/projects/app/src/pages/api/core/app/getChatLogs.ts +++ b/projects/app/src/pages/api/core/app/getChatLogs.ts @@ -15,6 +15,7 @@ import { parsePaginationRequest } from '@fastgpt/service/common/api/pagination'; import { type PaginationResponse } from '@fastgpt/web/common/fetch/type'; import { addSourceMember } from '@fastgpt/service/support/user/utils'; import { replaceRegChars } from '@fastgpt/global/common/string/tools'; +import { getLocationFromIp } from '@fastgpt/service/common/geo'; import { AppReadChatLogPerVal } from '@fastgpt/global/support/permission/app/constant'; import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import type { ApiRequestProps } from '@fastgpt/service/type/next'; @@ -23,7 +24,7 @@ async function handler( req: ApiRequestProps, _res: NextApiResponse ): Promise> { - const { appId, dateStart, dateEnd, sources, tmbIds, chatSearch } = req.body; + const { appId, dateStart, dateEnd, sources, tmbIds, chatSearch, locale = 'en' } = req.body; const { pageSize = 20, offset } = parsePaginationRequest(req); @@ -48,13 +49,15 @@ async function handler( $gte: new Date(dateStart), $lte: new Date(dateEnd) }, - ...(chatSearch && { - $or: [ - { chatId: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, - { title: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, - { customTitle: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } } - ] - }) + ...(chatSearch + ? 
{ + $or: [ + { chatId: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, + { title: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } }, + { customTitle: { $regex: new RegExp(`${replaceRegChars(chatSearch)}`, 'i') } } + ] + } + : undefined) }; const [list, total] = await Promise.all([ @@ -277,7 +280,8 @@ async function handler( errorCount: 1, totalPoints: 1, outLinkUid: 1, - tmbId: 1 + tmbId: 1, + region: '$metadata.originIp' } } ], @@ -288,11 +292,21 @@ async function handler( MongoChat.countDocuments(where, { ...readFromSecondary }) ]); - const listWithSourceMember = await addSourceMember({ - list + const listWithRegion = list.map((item) => { + const ip = item.region; + const region = getLocationFromIp(ip, locale); + + return { + ...item, + region: region || ip + }; }); - const listWithoutTmbId = list.filter((item) => !item.tmbId); + const listWithSourceMember = await addSourceMember({ + list: listWithRegion + }); + + const listWithoutTmbId = listWithRegion.filter((item) => !item.tmbId); return { list: listWithSourceMember.concat(listWithoutTmbId), diff --git a/projects/app/src/pages/api/core/app/httpTools/create.ts b/projects/app/src/pages/api/core/app/httpTools/create.ts index a16d3fe5f..4a38dae00 100644 --- a/projects/app/src/pages/api/core/app/httpTools/create.ts +++ b/projects/app/src/pages/api/core/app/httpTools/create.ts @@ -8,7 +8,7 @@ import { AppTypeEnum } from '@fastgpt/global/core/app/constants'; import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; import { authApp } from '@fastgpt/service/support/permission/app/auth'; import { TeamAppCreatePermissionVal } from '@fastgpt/global/support/permission/user/constant'; -import { checkTeamAppLimit } from '@fastgpt/service/support/permission/teamLimit'; +import { checkTeamAppTypeLimit } from '@fastgpt/service/support/permission/teamLimit'; import { getHTTPToolSetRuntimeNode } from '@fastgpt/global/core/app/tool/httpTool/utils'; export type createHttpToolsQuery = 
{}; @@ -27,7 +27,7 @@ async function handler( ? await authApp({ req, appId: parentId, per: TeamAppCreatePermissionVal, authToken: true }) : await authUserPer({ req, authToken: true, per: TeamAppCreatePermissionVal }); - await checkTeamAppLimit(teamId); + await checkTeamAppTypeLimit({ teamId, appCheckType: 'tool' }); const httpToolsetId = await mongoSessionRun(async (session) => { const httpToolsetId = await onCreateApp({ diff --git a/projects/app/src/pages/api/core/app/mcpTools/create.ts b/projects/app/src/pages/api/core/app/mcpTools/create.ts index 71ed6074a..a3bd48406 100644 --- a/projects/app/src/pages/api/core/app/mcpTools/create.ts +++ b/projects/app/src/pages/api/core/app/mcpTools/create.ts @@ -9,7 +9,7 @@ import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; import { AppTypeEnum } from '@fastgpt/global/core/app/constants'; import { getMCPToolSetRuntimeNode } from '@fastgpt/global/core/app/tool/mcpTool/utils'; import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; -import { checkTeamAppLimit } from '@fastgpt/service/support/permission/teamLimit'; +import { checkTeamAppTypeLimit } from '@fastgpt/service/support/permission/teamLimit'; import { WritePermissionVal } from '@fastgpt/global/support/permission/constant'; import { type StoreSecretValueType } from '@fastgpt/global/common/secret/type'; import { storeSecretValue } from '@fastgpt/service/common/secret/utils'; @@ -37,7 +37,7 @@ async function handler( ? 
await authApp({ req, appId: parentId, per: WritePermissionVal, authToken: true }) : await authUserPer({ req, authToken: true, per: TeamAppCreatePermissionVal }); - await checkTeamAppLimit(teamId); + await checkTeamAppTypeLimit({ teamId, appCheckType: 'tool' }); const formatedHeaderAuth = storeSecretValue(headerSecret); diff --git a/projects/app/src/pages/api/core/chat/chatTest.ts b/projects/app/src/pages/api/core/chat/chatTest.ts index e42f5aaa2..4d992f2ad 100644 --- a/projects/app/src/pages/api/core/chat/chatTest.ts +++ b/projects/app/src/pages/api/core/chat/chatTest.ts @@ -50,6 +50,8 @@ import { saveChat, updateInteractiveChat } from '@fastgpt/service/core/chat/save import { getLocale } from '@fastgpt/service/common/middle/i18n'; import { formatTime2YMDHM } from '@fastgpt/global/common/string/time'; import { LimitTypeEnum, teamFrequencyLimit } from '@fastgpt/service/common/api/frequencyLimit'; +import { getIpFromRequest } from '@fastgpt/service/common/geo'; +import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; export type Props = { messages: ChatCompletionMessageParam[]; @@ -82,6 +84,9 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { if (!Array.isArray(edges)) { throw new Error('Edges is not array'); } + + const originIp = getIpFromRequest(req); + const chatMessages = GPTMessages2Chats({ messages }); // console.log(JSON.stringify(chatMessages, null, 2), '====', chatMessages.length); @@ -103,6 +108,8 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { return; } + pushTrack.teamChatQPM({ teamId }); + const isPlugin = app.type === AppTypeEnum.workflowTool; const isTool = app.type === AppTypeEnum.tool; @@ -246,7 +253,10 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { source: ChatSourceEnum.test, userContent: userQuestion, aiContent: aiResponse, - durationSeconds + durationSeconds, + metadata: { + originIp + } }; if (isInteractiveRequest) { diff --git 
a/projects/app/src/pages/api/core/chat/clearHistories.ts b/projects/app/src/pages/api/core/chat/clearHistories.ts index f257fbf20..ec73e4f52 100644 --- a/projects/app/src/pages/api/core/chat/clearHistories.ts +++ b/projects/app/src/pages/api/core/chat/clearHistories.ts @@ -4,7 +4,6 @@ import { MongoChatItem } from '@fastgpt/service/core/chat/chatItemSchema'; import { type ClearHistoriesProps } from '@/global/core/chat/api'; import { ChatSourceEnum } from '@fastgpt/global/core/chat/constants'; import { NextAPI } from '@/service/middleware/entry'; -import { deleteChatFiles } from '@fastgpt/service/core/chat/controller'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; import { authChatCrud } from '@/service/support/permission/auth/chat'; import { MongoChatItemResponse } from '@fastgpt/service/core/chat/chatItemResponseSchema'; @@ -64,7 +63,6 @@ async function handler(req: ApiRequestProps<{}, ClearHistoriesProps>, res: NextA const list = await MongoChat.find(match, 'chatId').lean(); const idList = list.map((item) => item.chatId); - await deleteChatFiles({ chatIdList: idList }); await getS3ChatSource().deleteChatFilesByPrefix({ appId, uId: uid }); await MongoChatItemResponse.deleteMany({ diff --git a/projects/app/src/pages/api/core/chat/delHistory.ts b/projects/app/src/pages/api/core/chat/delHistory.ts index c50631ff6..c082ca095 100644 --- a/projects/app/src/pages/api/core/chat/delHistory.ts +++ b/projects/app/src/pages/api/core/chat/delHistory.ts @@ -6,7 +6,6 @@ import { authChatCrud } from '@/service/support/permission/auth/chat'; import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; import { NextAPI } from '@/service/middleware/entry'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; -import { deleteChatFiles } from '@fastgpt/service/core/chat/controller'; import { MongoChatItemResponse } from '@fastgpt/service/core/chat/chatItemResponseSchema'; import { getS3ChatSource } from 
'@fastgpt/service/common/s3/sources/chat'; @@ -21,7 +20,6 @@ async function handler(req: ApiRequestProps<{}, DelHistoryProps>, res: NextApiRe ...req.query }); - await deleteChatFiles({ chatIdList: [chatId] }); await mongoSessionRun(async (session) => { await MongoChatItemResponse.deleteMany({ appId, diff --git a/projects/app/src/pages/api/core/chat/quote/getCollectionQuote.ts b/projects/app/src/pages/api/core/chat/quote/getCollectionQuote.ts index e799a926a..b043d3dd8 100644 --- a/projects/app/src/pages/api/core/chat/quote/getCollectionQuote.ts +++ b/projects/app/src/pages/api/core/chat/quote/getCollectionQuote.ts @@ -17,8 +17,6 @@ import { ChatErrEnum } from '@fastgpt/global/common/error/code/chat'; import { getCollectionWithDataset } from '@fastgpt/service/core/dataset/controller'; import { getFormatDatasetCiteList } from '@fastgpt/service/core/dataset/data/controller'; import { MongoChatItem } from '@fastgpt/service/core/chat/chatItemSchema'; -import { replaceDatasetQuoteTextWithJWT } from '@fastgpt/service/core/dataset/utils'; -import { addDays } from 'date-fns'; export type GetCollectionQuoteProps = LinkedPaginationProps & { chatId: string; diff --git a/projects/app/src/pages/api/core/chat/quote/getQuote.ts b/projects/app/src/pages/api/core/chat/quote/getQuote.ts index 0852eaf08..eadc0c342 100644 --- a/projects/app/src/pages/api/core/chat/quote/getQuote.ts +++ b/projects/app/src/pages/api/core/chat/quote/getQuote.ts @@ -8,9 +8,6 @@ import { ChatErrEnum } from '@fastgpt/global/common/error/code/chat'; import { getFormatDatasetCiteList } from '@fastgpt/service/core/dataset/data/controller'; import type { DatasetCiteItemType } from '@fastgpt/global/core/dataset/type'; import { MongoChatItem } from '@fastgpt/service/core/chat/chatItemSchema'; -import { replaceDatasetQuoteTextWithJWT } from '@fastgpt/service/core/dataset/utils'; -import { addDays } from 'date-fns'; -import { isS3ObjectKey, jwtSignS3ObjectKey } from '@fastgpt/service/common/s3/utils'; export type 
GetQuoteProps = { datasetDataIdList: string[]; diff --git a/projects/app/src/pages/api/core/dataset/collection/create/backup.ts b/projects/app/src/pages/api/core/dataset/collection/create/backup.ts index beda0bb53..2e9c1d5fe 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/backup.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/backup.ts @@ -1,7 +1,5 @@ -import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next'; +import type { ApiRequestProps } from '@fastgpt/service/type/next'; import { NextAPI } from '@/service/middleware/entry'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; import { addLog } from '@fastgpt/service/common/system/log'; import { readRawTextByLocalFile } from '@fastgpt/service/common/file/read/utils'; import { authDataset } from '@fastgpt/service/support/permission/dataset/auth'; @@ -12,8 +10,9 @@ import { DatasetCollectionTypeEnum } from '@fastgpt/global/core/dataset/constants'; import { i18nT } from '@fastgpt/web/i18n/utils'; -import { uploadFile } from '@fastgpt/service/common/file/gridfs/controller'; import { isCSVFile } from '@fastgpt/global/common/file/utils'; +import { multer } from '@fastgpt/service/common/file/multer'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; export type backupQuery = {}; @@ -21,17 +20,17 @@ export type backupBody = {}; export type backupResponse = {}; -async function handler(req: ApiRequestProps, res: ApiResponseType) { - const filePaths: string[] = []; +async function handler(req: ApiRequestProps) { + const filepaths: string[] = []; try { - const upload = getUploadModel({ - maxSize: global.feConfigs?.uploadFileMaxSize + const result = await multer.resolveFormData({ + request: req, + maxFileSize: global.feConfigs?.uploadFileMaxSize }); - const { file, data } = await upload.getUploadFile<{ datasetId: string }>(req, res); - 
filePaths.push(file.path); + filepaths.push(result.fileMetadata.path); - if (!isCSVFile(file.originalname)) { + if (!isCSVFile(result.fileMetadata.originalname)) { return Promise.reject('File must be a CSV file'); } @@ -40,35 +39,28 @@ async function handler(req: ApiRequestProps, res: ApiRe authToken: true, authApiKey: true, per: WritePermissionVal, - datasetId: data.datasetId + datasetId: result.data.datasetId }); - // 1. Read const { rawText } = await readRawTextByLocalFile({ teamId, tmbId, - path: file.path, - encoding: file.encoding, + path: result.fileMetadata.path, + encoding: result.fileMetadata.encoding, getFormatText: false }); + if (!rawText.trim().startsWith('q,a,indexes')) { return Promise.reject(i18nT('dataset:backup_template_invalid')); } - // 2. Upload file - const fileId = await uploadFile({ - teamId, - uid: tmbId, - bucketName: 'dataset', - path: file.path, - filename: file.originalname, - contentType: file.mimetype + const fileId = await getS3DatasetSource().upload({ + datasetId: dataset._id, + stream: result.getReadStream(), + size: result.fileMetadata.size, + filename: result.fileMetadata.originalname }); - // 2. delete tmp file - removeFilesByPaths(filePaths); - - // 3. 
Create collection await createCollectionAndInsertData({ dataset, rawText, @@ -77,7 +69,7 @@ async function handler(req: ApiRequestProps, res: ApiRe teamId, tmbId, datasetId: dataset._id, - name: file.originalname, + name: result.fileMetadata.originalname, type: DatasetCollectionTypeEnum.file, fileId, trainingType: DatasetCollectionDataProcessModeEnum.backup @@ -87,8 +79,9 @@ async function handler(req: ApiRequestProps, res: ApiRe return {}; } catch (error) { addLog.error(`Backup dataset collection create error: ${error}`); - removeFilesByPaths(filePaths); return Promise.reject(error); + } finally { + multer.clearDiskTempFiles(filepaths); } } diff --git a/projects/app/src/pages/api/core/dataset/collection/create/fileId.ts b/projects/app/src/pages/api/core/dataset/collection/create/fileId.ts index a84897a7d..f3e1bacc6 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/fileId.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/fileId.ts @@ -1,9 +1,7 @@ -import { getFileById } from '@fastgpt/service/common/file/gridfs/controller'; import { authDataset } from '@fastgpt/service/support/permission/dataset/auth'; import { type FileIdCreateDatasetCollectionParams } from '@fastgpt/global/core/dataset/api'; import { createCollectionAndInsertData } from '@fastgpt/service/core/dataset/collection/controller'; import { DatasetCollectionTypeEnum } from '@fastgpt/global/core/dataset/constants'; -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; import { NextAPI } from '@/service/middleware/entry'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; import { WritePermissionVal } from '@fastgpt/global/support/permission/constant'; @@ -25,23 +23,14 @@ async function handler( datasetId: body.datasetId }); - const filename = await (async () => { - if (isS3ObjectKey(fileId, 'dataset')) { - const metadata = await getS3DatasetSource().getFileMetadata(fileId); - if (!metadata) return 
Promise.reject(CommonErrEnum.fileNotFound); - return metadata.filename; - } + if (!isS3ObjectKey(fileId, 'dataset')) { + return Promise.reject('Invalid dataset file key'); + } - const file = await getFileById({ - bucketName: BucketNameEnum.dataset, - fileId - }); - if (!file) { - return Promise.reject(CommonErrEnum.fileNotFound); - } - - return file.filename; - })(); + const metadata = await getS3DatasetSource().getFileMetadata(fileId); + if (!metadata) { + return Promise.reject(CommonErrEnum.fileNotFound); + } const { collectionId, insertResults } = await createCollectionAndInsertData({ dataset, @@ -50,7 +39,7 @@ async function handler( teamId, tmbId, type: DatasetCollectionTypeEnum.file, - name: filename, + name: metadata.filename, fileId, // ObjectId -> ObjectKey customPdfParse } diff --git a/projects/app/src/pages/api/core/dataset/collection/create/images.ts b/projects/app/src/pages/api/core/dataset/collection/create/images.ts index 8b6978549..7b1d9be1d 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/images.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/images.ts @@ -9,17 +9,13 @@ import { NextAPI } from '@/service/middleware/entry'; import { type ApiRequestProps } from '@fastgpt/service/type/next'; import { WritePermissionVal } from '@fastgpt/global/support/permission/constant'; import type { CreateCollectionResponse } from '@/global/core/dataset/api'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; -import type { NextApiResponse } from 'next'; import { i18nT } from '@fastgpt/web/i18n/utils'; import { authFrequencyLimit } from '@/service/common/frequencyLimit/api'; import { addDays, addSeconds } from 'date-fns'; -import { S3Sources } from '@fastgpt/service/common/s3/type'; -import { getNanoid } from '@fastgpt/global/common/string/tools'; -import fsp from 'node:fs/promises'; +import fs from 'node:fs'; import path from 
'node:path'; import { getFileS3Key, uploadImage2S3Bucket } from '@fastgpt/service/common/s3/utils'; +import { multer } from '@fastgpt/service/common/file/multer'; const authUploadLimit = (tmbId: string, num: number) => { if (!global.feConfigs.uploadFileMaxAmount) return; @@ -32,20 +28,17 @@ const authUploadLimit = (tmbId: string, num: number) => { }; async function handler( - req: ApiRequestProps, - res: NextApiResponse + req: ApiRequestProps ): CreateCollectionResponse { - const filePaths: string[] = []; + const filepaths: string[] = []; try { - const upload = getUploadModel({ - maxSize: global.feConfigs?.uploadFileMaxSize + const result = await multer.resolveMultipleFormData({ + request: req, + maxFileSize: global.feConfigs?.uploadFileMaxSize }); - const { - files, - data: { parentId, datasetId, collectionName } - } = await upload.getUploadFiles(req, res); - filePaths.push(...files.map((item) => item.path)); + filepaths.push(...result.fileMetadata.map((item) => item.path)); + const { parentId, datasetId, collectionName } = result.data; const { dataset, teamId, tmbId } = await authDataset({ datasetId, @@ -54,19 +47,19 @@ async function handler( authToken: true, authApiKey: true }); - await authUploadLimit(tmbId, files.length); + + await authUploadLimit(tmbId, result.fileMetadata.length); if (!dataset.vlmModel) { return Promise.reject(i18nT('file:Image_dataset_requires_VLM_model_to_be_configured')); } - // 1. Save image to S3 const imageIds = await Promise.all( - files.map(async (file) => { + result.fileMetadata.map(async (file) => { const filename = path.basename(file.filename); const { fileKey } = getFileS3Key.dataset({ datasetId, filename }); return uploadImage2S3Bucket('private', { - base64Img: (await fsp.readFile(file.path)).toString('base64'), + base64Img: (await fs.promises.readFile(file.path)).toString('base64'), uploadKey: fileKey, mimetype: file.mimetype, filename, @@ -75,7 +68,6 @@ async function handler( }) ); - // 2. 
Create collection const { collectionId, insertResults } = await createCollectionAndInsertData({ dataset, imageIds, @@ -97,7 +89,7 @@ async function handler( } catch (error) { return Promise.reject(error); } finally { - removeFilesByPaths(filePaths); + multer.clearDiskTempFiles(filepaths); } } diff --git a/projects/app/src/pages/api/core/dataset/collection/create/localFile.ts b/projects/app/src/pages/api/core/dataset/collection/create/localFile.ts index 8a5629e65..9bd79962f 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/localFile.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/localFile.ts @@ -1,67 +1,46 @@ import type { NextApiRequest, NextApiResponse } from 'next'; -import { uploadFile } from '@fastgpt/service/common/file/gridfs/controller'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; import { authDataset } from '@fastgpt/service/support/permission/dataset/auth'; -import { type FileCreateDatasetCollectionParams } from '@fastgpt/global/core/dataset/api'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; import { createCollectionAndInsertData } from '@fastgpt/service/core/dataset/collection/controller'; import { DatasetCollectionTypeEnum } from '@fastgpt/global/core/dataset/constants'; -import { getNanoid } from '@fastgpt/global/common/string/tools'; -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; import { NextAPI } from '@/service/middleware/entry'; import { WritePermissionVal } from '@fastgpt/global/support/permission/constant'; import { type CreateCollectionResponse } from '@/global/core/dataset/api'; +import { multer } from '@fastgpt/service/common/file/multer'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; -async function handler(req: NextApiRequest, res: NextApiResponse): CreateCollectionResponse { - let filePaths: string[] = []; +async function handler(req: NextApiRequest): CreateCollectionResponse { + 
const filepaths: string[] = []; try { - // Create multer uploader - const upload = getUploadModel({ - maxSize: global.feConfigs?.uploadFileMaxSize + const result = await multer.resolveFormData({ + request: req, + maxFileSize: global.feConfigs?.uploadFileMaxSize }); - const { file, data, bucketName } = - await upload.getUploadFile( - req, - res, - BucketNameEnum.dataset - ); - filePaths = [file.path]; - - if (!file || !bucketName) { - throw new Error('file is empty'); - } + filepaths.push(result.fileMetadata.path); const { teamId, tmbId, dataset } = await authDataset({ req, authToken: true, authApiKey: true, per: WritePermissionVal, - datasetId: data.datasetId + datasetId: result.data.datasetId }); - const { fileMetadata, collectionMetadata, ...collectionData } = data; - const collectionName = file.originalname; + const { fileMetadata, collectionMetadata, ...collectionData } = result.data; + const collectionName = result.fileMetadata.originalname; - // 1. upload file - const fileId = await uploadFile({ - teamId, - uid: tmbId, - bucketName, - path: file.path, - filename: file.originalname, - contentType: file.mimetype, - metadata: fileMetadata + const fileId = await getS3DatasetSource().upload({ + datasetId: dataset._id, + stream: result.getReadStream(), + size: result.fileMetadata.size, + filename: result.fileMetadata.originalname }); - // 2. delete tmp file - removeFilesByPaths(filePaths); - - // 3. 
Create collection const { collectionId, insertResults } = await createCollectionAndInsertData({ dataset, createCollectionParams: { ...collectionData, + datasetId: dataset._id, name: collectionName, teamId, tmbId, @@ -76,9 +55,9 @@ async function handler(req: NextApiRequest, res: NextApiResponse): CreateCo return { collectionId, results: insertResults }; } catch (error) { - removeFilesByPaths(filePaths); - return Promise.reject(error); + } finally { + multer.clearDiskTempFiles(filepaths); } } diff --git a/projects/app/src/pages/api/core/dataset/collection/create/template.ts b/projects/app/src/pages/api/core/dataset/collection/create/template.ts index 8b0d84ff4..9f1f4fa52 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/template.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/template.ts @@ -1,7 +1,5 @@ -import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next'; +import type { ApiRequestProps } from '@fastgpt/service/type/next'; import { NextAPI } from '@/service/middleware/entry'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; import { addLog } from '@fastgpt/service/common/system/log'; import { readRawTextByLocalFile } from '@fastgpt/service/common/file/read/utils'; import { authDataset } from '@fastgpt/service/support/permission/dataset/auth'; @@ -13,7 +11,8 @@ import { } from '@fastgpt/global/core/dataset/constants'; import { i18nT } from '@fastgpt/web/i18n/utils'; import { isCSVFile } from '@fastgpt/global/common/file/utils'; -import { uploadFile } from '@fastgpt/service/common/file/gridfs/controller'; +import { multer } from '@fastgpt/service/common/file/multer'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; export type templateImportQuery = {}; @@ -21,20 +20,17 @@ export type templateImportBody = { datasetId: string }; export type templateImportResponse = {}; 
-async function handler( - req: ApiRequestProps, - res: ApiResponseType -) { - const filePaths: string[] = []; +async function handler(req: ApiRequestProps) { + const filepaths: string[] = []; try { - const upload = getUploadModel({ - maxSize: global.feConfigs?.uploadFileMaxSize + const result = await multer.resolveFormData({ + request: req, + maxFileSize: global.feConfigs?.uploadFileMaxSize }); - const { file, data } = await upload.getUploadFile(req, res); - filePaths.push(file.path); + filepaths.push(result.fileMetadata.path); - if (!isCSVFile(file.originalname)) { + if (!isCSVFile(result.fileMetadata.originalname)) { return Promise.reject('File must be a CSV file'); } @@ -43,35 +39,28 @@ async function handler( authToken: true, authApiKey: true, per: WritePermissionVal, - datasetId: data.datasetId + datasetId: result.data.datasetId }); - // 1. Read const { rawText } = await readRawTextByLocalFile({ teamId, tmbId, - path: file.path, - encoding: file.encoding, + path: result.fileMetadata.path, + encoding: result.fileMetadata.encoding, getFormatText: false }); + if (!rawText.trim().startsWith('q,a,indexes')) { return Promise.reject(i18nT('dataset:template_file_invalid')); } - // 2. Upload file - const fileId = await uploadFile({ - teamId, - uid: tmbId, - bucketName: 'dataset', - path: file.path, - filename: file.originalname, - contentType: file.mimetype + const fileId = await getS3DatasetSource().upload({ + datasetId: dataset._id, + stream: result.getReadStream(), + size: result.fileMetadata.size, + filename: result.fileMetadata.originalname }); - // 3. delete tmp file - removeFilesByPaths(filePaths); - - // 4. 
Create collection await createCollectionAndInsertData({ dataset, rawText, @@ -80,7 +69,7 @@ async function handler( teamId, tmbId, datasetId: dataset._id, - name: file.originalname, + name: result.fileMetadata.originalname, type: DatasetCollectionTypeEnum.file, fileId, trainingType: DatasetCollectionDataProcessModeEnum.template @@ -90,8 +79,9 @@ async function handler( return {}; } catch (error) { addLog.error(`Backup dataset collection create error: ${error}`); - removeFilesByPaths(filePaths); return Promise.reject(error); + } finally { + multer.clearDiskTempFiles(filepaths); } } diff --git a/projects/app/src/pages/api/core/dataset/collection/create/text.ts b/projects/app/src/pages/api/core/dataset/collection/create/text.ts index 7d778eba0..21dec16d4 100644 --- a/projects/app/src/pages/api/core/dataset/collection/create/text.ts +++ b/projects/app/src/pages/api/core/dataset/collection/create/text.ts @@ -23,7 +23,7 @@ async function handler(req: NextApiRequest): CreateCollectionResponse { // 1. 
Create file from text const filename = `${name}.txt`; const s3DatasetSource = getS3DatasetSource(); - const key = await s3DatasetSource.uploadDatasetFileByBuffer({ + const key = await s3DatasetSource.upload({ datasetId: String(dataset._id), buffer: Buffer.from(text), filename diff --git a/projects/app/src/pages/api/core/dataset/collection/detail.ts b/projects/app/src/pages/api/core/dataset/collection/detail.ts index 6e645f4b3..5009db159 100644 --- a/projects/app/src/pages/api/core/dataset/collection/detail.ts +++ b/projects/app/src/pages/api/core/dataset/collection/detail.ts @@ -3,8 +3,6 @@ */ import type { NextApiRequest } from 'next'; import { authDatasetCollection } from '@fastgpt/service/support/permission/dataset/auth'; -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; -import { getFileById } from '@fastgpt/service/common/file/gridfs/controller'; import { getCollectionSourceData } from '@fastgpt/global/core/dataset/collection/utils'; import { NextAPI } from '@/service/middleware/entry'; import { ReadPermissionVal } from '@fastgpt/global/support/permission/constant'; @@ -34,15 +32,12 @@ async function handler(req: NextApiRequest): Promise }); const fileId = collection?.fileId; + if (fileId && !isS3ObjectKey(fileId, 'dataset')) { + return Promise.reject('Invalid dataset file key'); + } + const [file, indexAmount, errorCount] = await Promise.all([ - fileId - ? isS3ObjectKey(fileId, 'dataset') - ? getS3DatasetSource().getFileMetadata(fileId) - : (async () => { - const file = await getFileById({ bucketName: BucketNameEnum.dataset, fileId }); - return { filename: file?.filename, contentLength: file?.length }; - })() - : undefined, + fileId ? 
getS3DatasetSource().getFileMetadata(fileId) : undefined, getVectorCountByCollectionId(collection.teamId, collection.datasetId, collection._id), MongoDatasetTraining.countDocuments( { diff --git a/projects/app/src/pages/api/core/dataset/collection/read.ts b/projects/app/src/pages/api/core/dataset/collection/read.ts index ed510848a..9c9e03577 100644 --- a/projects/app/src/pages/api/core/dataset/collection/read.ts +++ b/projects/app/src/pages/api/core/dataset/collection/read.ts @@ -2,14 +2,14 @@ import type { ApiRequestProps } from '@fastgpt/service/type/next'; import { NextAPI } from '@/service/middleware/entry'; import { authDatasetCollection } from '@fastgpt/service/support/permission/dataset/auth'; import { DatasetCollectionTypeEnum } from '@fastgpt/global/core/dataset/constants'; -import { BucketNameEnum, ReadFileBaseUrl } from '@fastgpt/global/common/file/constants'; import { ReadPermissionVal } from '@fastgpt/global/support/permission/constant'; import { type OutLinkChatAuthProps } from '@fastgpt/global/support/permission/chat'; import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset'; import { authChatCrud, authCollectionInChat } from '@/service/support/permission/auth/chat'; import { getCollectionWithDataset } from '@fastgpt/service/core/dataset/controller'; import { getApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset'; -import { createFileToken } from '@fastgpt/service/support/permission/auth/file'; +import { isS3ObjectKey } from '@fastgpt/service/common/s3/utils'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; export type readCollectionSourceQuery = {}; @@ -32,12 +32,7 @@ async function handler( const { collectionId, appId, chatId, chatItemDataId, shareId, outLinkUid, teamId, teamToken } = req.body; - const { - collection, - teamId: userTeamId, - tmbId: uid, - authType - } = await (async () => { + const { collection } = await (async () => { if (!appId || !chatId || !chatItemDataId) { 
return authDatasetCollection({ req, @@ -79,16 +74,16 @@ async function handler( })(); const sourceUrl = await (async () => { - if (collection.type === DatasetCollectionTypeEnum.file && collection.fileId) { - const token = await createFileToken({ - bucketName: BucketNameEnum.dataset, - teamId: userTeamId, - uid, - fileId: collection.fileId, - customExpireMinutes: authType === 'outLink' ? 5 : undefined + if ( + collection.type === DatasetCollectionTypeEnum.file && + collection.fileId && + isS3ObjectKey(collection.fileId, 'dataset') + ) { + return getS3DatasetSource().createGetDatasetFileURL({ + key: collection.fileId, + expiredHours: 1, + external: true }); - - return `${ReadFileBaseUrl}/${collection.name}?token=${token}`; } if (collection.type === DatasetCollectionTypeEnum.link && collection.rawLink) { return collection.rawLink; diff --git a/projects/app/src/pages/api/core/dataset/data/insertImages.ts b/projects/app/src/pages/api/core/dataset/data/insertImages.ts index b9213fc5b..341a85627 100644 --- a/projects/app/src/pages/api/core/dataset/data/insertImages.ts +++ b/projects/app/src/pages/api/core/dataset/data/insertImages.ts @@ -1,9 +1,7 @@ -import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next'; +import type { ApiRequestProps } from '@fastgpt/service/type/next'; import { NextAPI } from '@/service/middleware/entry'; import { authFrequencyLimit } from '@/service/common/frequencyLimit/api'; import { addDays, addSeconds } from 'date-fns'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; import { authDatasetCollection } from '@fastgpt/service/support/permission/dataset/auth'; import { WritePermissionVal } from '@fastgpt/global/support/permission/constant'; import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; @@ -13,8 +11,9 @@ import { getEmbeddingModel, getLLMModel, getVlmModel } from '@fastgpt/service/co import { 
pushDataListToTrainingQueue } from '@fastgpt/service/core/dataset/training/controller'; import { TrainingModeEnum } from '@fastgpt/global/core/dataset/constants'; import path from 'node:path'; -import fsp from 'node:fs/promises'; +import fs from 'node:fs'; import { getFileS3Key, uploadImage2S3Bucket } from '@fastgpt/service/common/s3/utils'; +import { multer } from '@fastgpt/service/common/file/multer'; export type insertImagesQuery = {}; @@ -35,20 +34,17 @@ const authUploadLimit = (tmbId: string, num: number) => { }; async function handler( - req: ApiRequestProps, - res: ApiResponseType + req: ApiRequestProps ): Promise { - const filePaths: string[] = []; + const filepaths: string[] = []; try { - const upload = getUploadModel({ - maxSize: global.feConfigs?.uploadFileMaxSize + const result = await multer.resolveMultipleFormData({ + request: req, + maxFileSize: global.feConfigs?.uploadFileMaxSize }); - const { - files, - data: { collectionId } - } = await upload.getUploadFiles(req, res); - filePaths.push(...files.map((item) => item.path)); + filepaths.push(...result.fileMetadata.map((item) => item.path)); + const { collectionId } = result.data; const { collection, teamId, tmbId } = await authDatasetCollection({ collectionId, @@ -59,13 +55,12 @@ async function handler( }); const dataset = collection.dataset; - await authUploadLimit(tmbId, files.length); + await authUploadLimit(tmbId, result.fileMetadata.length); - // 1. Upload images to S3 const imageIds = await Promise.all( - files.map(async (file) => + result.fileMetadata.map(async (file) => uploadImage2S3Bucket('private', { - base64Img: (await fsp.readFile(file.path)).toString('base64'), + base64Img: (await fs.promises.readFile(file.path)).toString('base64'), uploadKey: getFileS3Key.dataset({ datasetId: dataset._id, filename: path.basename(file.filename) @@ -77,7 +72,6 @@ async function handler( ) ); - // 2. 
Insert images to training queue await mongoSessionRun(async (session) => { const traingBillId = await (async () => { const { usageId } = await createTrainingUsage({ @@ -114,7 +108,7 @@ async function handler( } catch (error) { return Promise.reject(error); } finally { - removeFilesByPaths(filePaths); + multer.clearDiskTempFiles(filepaths); } } diff --git a/projects/app/src/pages/api/core/dataset/data/v2/list.ts b/projects/app/src/pages/api/core/dataset/data/v2/list.ts index e29cb3b97..46f62d0b3 100644 --- a/projects/app/src/pages/api/core/dataset/data/v2/list.ts +++ b/projects/app/src/pages/api/core/dataset/data/v2/list.ts @@ -9,11 +9,10 @@ import type { PaginationProps, PaginationResponse } from '@fastgpt/web/common/fe import { parsePaginationRequest } from '@fastgpt/service/common/api/pagination'; import { MongoDatasetImageSchema } from '@fastgpt/service/core/dataset/image/schema'; import { readFromSecondary } from '@fastgpt/service/common/mongo/utils'; -import { getDatasetImagePreviewUrl } from '@fastgpt/service/core/dataset/image/utils'; -import { getS3DatasetSource, S3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; import { addHours } from 'date-fns'; import { jwtSignS3ObjectKey, isS3ObjectKey } from '@fastgpt/service/common/s3/utils'; -import { replaceDatasetQuoteTextWithJWT } from '@fastgpt/service/core/dataset/utils'; +import { replaceS3KeyToPreviewUrl } from '@fastgpt/service/core/dataset/utils'; export type GetDatasetDataListProps = PaginationProps & { searchText?: string; @@ -59,9 +58,9 @@ async function handler( ]); list.forEach((item) => { - item.q = replaceDatasetQuoteTextWithJWT(item.q, addHours(new Date(), 1)); + item.q = replaceS3KeyToPreviewUrl(item.q, addHours(new Date(), 1)); if (item.a) { - item.a = replaceDatasetQuoteTextWithJWT(item.a, addHours(new Date(), 1)); + item.a = replaceS3KeyToPreviewUrl(item.a, addHours(new Date(), 1)); } }); @@ 
-91,16 +90,10 @@ async function handler( list: await Promise.all( list.map(async (item) => { const imageSize = item.imageId ? imageSizeMap.get(String(item.imageId)) : undefined; - const imagePreviewUrl = item.imageId - ? isS3ObjectKey(item.imageId, 'dataset') + const imagePreviewUrl = + item.imageId && isS3ObjectKey(item.imageId, 'dataset') ? jwtSignS3ObjectKey(item.imageId, addHours(new Date(), 1)) - : getDatasetImagePreviewUrl({ - imageId: item.imageId, - teamId, - datasetId: collection.datasetId, - expiredMinutes: 30 - }) - : undefined; + : undefined; return { ...item, diff --git a/projects/app/src/pages/api/core/dataset/delete.ts b/projects/app/src/pages/api/core/dataset/delete.ts index 7efef6eda..94011b056 100644 --- a/projects/app/src/pages/api/core/dataset/delete.ts +++ b/projects/app/src/pages/api/core/dataset/delete.ts @@ -1,6 +1,6 @@ import type { NextApiRequest } from 'next'; import { authDataset } from '@fastgpt/service/support/permission/dataset/auth'; -import { deleteDatasets } from '@fastgpt/service/core/dataset/controller'; +import { MongoDataset } from '@fastgpt/service/core/dataset/schema'; import { findDatasetAndAllChildren } from '@fastgpt/service/core/dataset/controller'; import { NextAPI } from '@/service/middleware/entry'; import { OwnerPermissionVal } from '@fastgpt/global/support/permission/constant'; @@ -8,6 +8,9 @@ import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import { addAuditLog } from '@fastgpt/service/support/user/audit/util'; import { AuditEventEnum } from '@fastgpt/global/support/user/audit/constants'; import { getI18nDatasetType } from '@fastgpt/service/support/user/audit/util'; +import { addDatasetDeleteJob } from '@fastgpt/service/core/dataset/delete'; +import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun'; +import { deleteDatasetsImmediate } from '@fastgpt/service/core/dataset/delete/processor'; async function handler(req: NextApiRequest) { const { id: datasetId } = req.query as { 
@@ -27,16 +30,39 @@ async function handler(req: NextApiRequest) { per: OwnerPermissionVal }); - const datasets = await findDatasetAndAllChildren({ + const deleteDatasets = await findDatasetAndAllChildren({ teamId, datasetId }); - await deleteDatasets({ - teamId, - datasets + await mongoSessionRun(async (session) => { + // 1. Mark as deleted + await MongoDataset.updateMany( + { + _id: deleteDatasets.map((d) => d._id), + teamId + }, + { + deleteTime: new Date() + }, + { + session + } + ); + + await deleteDatasetsImmediate({ + teamId, + datasets: deleteDatasets + }); + + // 2. Add to delete queue + await addDatasetDeleteJob({ + teamId, + datasetId + }); }); + // 3. Add audit log (async () => { addAuditLog({ tmbId, diff --git a/projects/app/src/pages/api/core/dataset/file/getPreviewChunks.ts b/projects/app/src/pages/api/core/dataset/file/getPreviewChunks.ts index 26c129593..45c927420 100644 --- a/projects/app/src/pages/api/core/dataset/file/getPreviewChunks.ts +++ b/projects/app/src/pages/api/core/dataset/file/getPreviewChunks.ts @@ -15,7 +15,7 @@ import { import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; import { getEmbeddingModel, getLLMModel } from '@fastgpt/service/core/ai/model'; import type { ChunkSettingsType } from '@fastgpt/global/core/dataset/type'; -import { replaceDatasetQuoteTextWithJWT } from '@fastgpt/service/core/dataset/utils'; +import { replaceS3KeyToPreviewUrl } from '@fastgpt/service/core/dataset/utils'; import { addDays } from 'date-fns'; export type PostPreviewFilesChunksProps = ChunkSettingsType & { @@ -114,8 +114,8 @@ async function handler( }); const chunksWithJWT = chunks.slice(0, 10).map((chunk) => ({ - q: replaceDatasetQuoteTextWithJWT(chunk.q, addDays(new Date(), 1)), - a: replaceDatasetQuoteTextWithJWT(chunk.a, addDays(new Date(), 1)) + q: replaceS3KeyToPreviewUrl(chunk.q, addDays(new Date(), 1)), + a: replaceS3KeyToPreviewUrl(chunk.a, addDays(new Date(), 1)) })); return { diff --git 
a/projects/app/src/pages/api/core/dataset/image/[imageId].ts b/projects/app/src/pages/api/core/dataset/image/[imageId].ts deleted file mode 100644 index 96bd88842..000000000 --- a/projects/app/src/pages/api/core/dataset/image/[imageId].ts +++ /dev/null @@ -1,52 +0,0 @@ -import type { NextApiResponse } from 'next'; -import { jsonRes } from '@fastgpt/service/common/response'; -import type { ApiRequestProps } from '@fastgpt/service/type/next'; -import { authDatasetImagePreviewUrl } from '@fastgpt/service/core/dataset/image/utils'; -import { getDatasetImageReadData } from '@fastgpt/service/core/dataset/image/controller'; - -export default async function handler( - req: ApiRequestProps< - {}, - { - imageId: string; - token: string; - } - >, - res: NextApiResponse -) { - try { - const { imageId, token } = req.query; - - if (!imageId || !token) { - return jsonRes(res, { - code: 401, - error: 'ImageId not found' - }); - } - - // Verify token and permissions - await authDatasetImagePreviewUrl(token); - - const { fileInfo, stream } = await getDatasetImageReadData(imageId); - - // Set response headers - res.setHeader('Content-Type', fileInfo.contentType); - res.setHeader('Cache-Control', 'public, max-age=31536000'); - res.setHeader('Content-Length', fileInfo.length); - - stream.pipe(res); - stream.on('error', (error) => { - if (!res.headersSent) { - res.status(500).end(); - } - }); - stream.on('end', () => { - res.end(); - }); - } catch (error) { - return jsonRes(res, { - code: 500, - error - }); - } -} diff --git a/projects/app/src/pages/api/core/dataset/list.ts b/projects/app/src/pages/api/core/dataset/list.ts index e1cec5361..bcffc664b 100644 --- a/projects/app/src/pages/api/core/dataset/list.ts +++ b/projects/app/src/pages/api/core/dataset/list.ts @@ -104,6 +104,7 @@ async function handler(req: ApiRequestProps) { const data = { ...datasetPerQuery, teamId, + deleteTime: null, // Also filter out deleted datasets when searching ...searchMatch }; // @ts-ignore @@ -114,6 +115,7 @@ async function handler(req: 
ApiRequestProps) { return { ...datasetPerQuery, teamId, + deleteTime: null, // Key: only return datasets that have not been deleted ...(type ? (Array.isArray(type) ? { type: { $in: type } } : { type }) : {}), ...parseParentIdInMongo(parentId) }; diff --git a/projects/app/src/pages/api/core/dataset/searchTest.ts b/projects/app/src/pages/api/core/dataset/searchTest.ts index 4174cbd9a..4bdd99dc2 100644 --- a/projects/app/src/pages/api/core/dataset/searchTest.ts +++ b/projects/app/src/pages/api/core/dataset/searchTest.ts @@ -114,9 +114,11 @@ async function handler(req: ApiRequestProps): Promise): Promise, - res: NextApiResponse -) { - try { - const { token } = req.query; - - if (!token) { - return jsonRes(res, { - code: 401, - error: 'ImageId not found' - }); - } - - const formatToken = token.replace(/\.jpeg$/, ''); - - // Verify token and permissions - const { imageId } = await authDatasetImagePreviewUrl(formatToken); - - const { fileInfo, stream } = await getDatasetImageReadData(imageId); - - // Set response headers - res.setHeader('Content-Type', fileInfo.contentType); - res.setHeader('Cache-Control', 'public, max-age=31536000'); - res.setHeader('Content-Length', fileInfo.length); - - stream.pipe(res); - stream.on('error', (error) => { - if (!res.headersSent) { - res.status(500).end(); - } - }); - stream.on('end', () => { - res.end(); - }); - } catch (error) { - return jsonRes(res, { - code: 500, - error - }); - } -} diff --git a/projects/app/src/pages/api/support/user/account/loginByPassword.ts b/projects/app/src/pages/api/support/user/account/loginByPassword.ts index d2b11d9fa..b191ff344 100644 --- a/projects/app/src/pages/api/support/user/account/loginByPassword.ts +++ b/projects/app/src/pages/api/support/user/account/loginByPassword.ts @@ -17,7 +17,7 @@ import requestIp from 'request-ip'; import { setCookie } from '@fastgpt/service/support/permission/auth/common'; async function handler(req: NextApiRequest, res: NextApiResponse) { - const { username, password, code } = req.body as PostLoginProps; + 
const { username, password, code, language = 'zh-CN' } = req.body as PostLoginProps; if (!username || !password || !code) { return Promise.reject(CommonErrEnum.invalidParams); @@ -60,7 +60,8 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { }); MongoUser.findByIdAndUpdate(user._id, { - lastLoginTmbId: userDetail.team.tmbId + lastLoginTmbId: userDetail.team.tmbId, + language }); const token = await createUserSession({ diff --git a/projects/app/src/pages/api/support/user/account/tokenLogin.ts b/projects/app/src/pages/api/support/user/account/tokenLogin.ts index 2954c4880..37b8ab958 100644 --- a/projects/app/src/pages/api/support/user/account/tokenLogin.ts +++ b/projects/app/src/pages/api/support/user/account/tokenLogin.ts @@ -4,7 +4,6 @@ import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/nex import { NextAPI } from '@/service/middleware/entry'; import { type UserType } from '@fastgpt/global/support/user/type'; import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; -import { getGlobalRedisConnection } from '@fastgpt/service/common/redis'; export type TokenLoginQuery = {}; export type TokenLoginBody = {}; diff --git a/projects/app/src/pages/api/support/user/account/update.ts b/projects/app/src/pages/api/support/user/account/update.ts index 33cb8f85b..d859a8fb2 100644 --- a/projects/app/src/pages/api/support/user/account/update.ts +++ b/projects/app/src/pages/api/support/user/account/update.ts @@ -17,7 +17,7 @@ async function handler( req: ApiRequestProps, _res: ApiResponseType ): Promise { - const { avatar, timezone } = req.body; + const { avatar, timezone, language } = req.body; const { tmbId } = await authCert({ req, authToken: true }); // const user = await getUserDetail({ tmbId }); @@ -25,13 +25,14 @@ async function handler( // 更新对应的记录 await mongoSessionRun(async (session) => { const tmb = await MongoTeamMember.findById(tmbId).session(session); - if (timezone) { + if (timezone || language) { await 
MongoUser.updateOne( { _id: tmb?.userId }, { - timezone + ...(timezone && { timezone }), + ...(language && { language }) } ).session(session); } diff --git a/projects/app/src/pages/api/support/user/team/plan/getTeamPlanStatus.ts b/projects/app/src/pages/api/support/user/team/plan/getTeamPlanStatus.ts index 1d7c6b95a..b8740ae40 100644 --- a/projects/app/src/pages/api/support/user/team/plan/getTeamPlanStatus.ts +++ b/projects/app/src/pages/api/support/user/team/plan/getTeamPlanStatus.ts @@ -10,6 +10,7 @@ import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants'; import { getVectorCountByTeamId } from '@fastgpt/service/common/vectorDB/controller'; import { MongoTeamMember } from '@fastgpt/service/support/user/team/teamMemberSchema'; import { TeamMemberStatusEnum } from '@fastgpt/global/support/user/team/constant'; +import { MongoAppRegistration } from '@fastgpt/service/support/appRegistration/schema'; async function handler( req: NextApiRequest, @@ -21,39 +22,44 @@ async function handler( authToken: true }); - const [planStatus, usedMember, usedAppAmount, usedDatasetSize, usedDatasetIndexSize] = - await Promise.all([ - getTeamPlanStatus({ - teamId - }), - MongoTeamMember.countDocuments({ - teamId, - status: { $ne: TeamMemberStatusEnum.leave } - }), - MongoApp.countDocuments({ - teamId, - type: { - $in: [ - AppTypeEnum.simple, - AppTypeEnum.workflow, - AppTypeEnum.workflowTool, - AppTypeEnum.mcpToolSet - ] - } - }), - MongoDataset.countDocuments({ - teamId, - type: { $ne: DatasetTypeEnum.folder } - }), - getVectorCountByTeamId(teamId) - ]); + const [ + planStatus, + usedMember, + usedAppAmount, + usedDatasetSize, + usedDatasetIndexSize, + usedRegistrationCount + ] = await Promise.all([ + getTeamPlanStatus({ + teamId + }), + MongoTeamMember.countDocuments({ + teamId, + status: { $ne: TeamMemberStatusEnum.leave } + }), + MongoApp.countDocuments({ + teamId, + type: { + $in: [AppTypeEnum.simple, AppTypeEnum.workflow] + } + }), + MongoDataset.countDocuments({ 
+ teamId, + type: { $ne: DatasetTypeEnum.folder } + }), + getVectorCountByTeamId(teamId), + MongoAppRegistration.countDocuments({ + teamId + }) + ]); return { ...planStatus, usedMember, usedAppAmount, usedDatasetSize, - usedDatasetIndexSize + usedDatasetIndexSize, + usedRegistrationCount }; } catch (error) {} } diff --git a/projects/app/src/pages/api/v1/audio/transcriptions.ts b/projects/app/src/pages/api/v1/audio/transcriptions.ts index 17bddca80..aa1f470b3 100644 --- a/projects/app/src/pages/api/v1/audio/transcriptions.ts +++ b/projects/app/src/pages/api/v1/audio/transcriptions.ts @@ -1,33 +1,20 @@ import type { NextApiRequest, NextApiResponse } from 'next'; import { jsonRes } from '@fastgpt/service/common/response'; -import { getUploadModel } from '@fastgpt/service/common/file/multer'; -import { removeFilesByPaths } from '@fastgpt/service/common/file/utils'; -import fs from 'fs'; import { pushWhisperUsage } from '@/service/support/wallet/usage/push'; import { authChatCrud } from '@/service/support/permission/auth/chat'; -import { type OutLinkChatAuthProps } from '@fastgpt/global/support/permission/chat'; import { NextAPI } from '@/service/middleware/entry'; import { aiTranscriptions } from '@fastgpt/service/core/ai/audio/transcriptions'; import { useIPFrequencyLimit } from '@fastgpt/service/common/middle/reqFrequencyLimit'; import { getDefaultSTTModel } from '@fastgpt/service/core/ai/model'; +import { multer } from '@fastgpt/service/common/file/multer'; -const upload = getUploadModel({ - maxSize: 5 -}); - -async function handler(req: NextApiRequest, res: NextApiResponse) { - let filePaths: string[] = []; +async function handler(req: NextApiRequest, res: NextApiResponse) { + const filepaths: string[] = []; try { - let { - file, - data: { appId, duration, shareId, outLinkUid, teamId: spaceTeamId, teamToken } - } = await upload.getUploadFile< - OutLinkChatAuthProps & { - appId: string; - duration: number; - } - >(req, res); + const result = await 
multer.resolveFormData({ request: req }); + filepaths.push(result.fileMetadata.path); + let { appId, duration, shareId, outLinkUid, teamId: spaceTeamId, teamToken } = result.data; req.body.appId = appId; req.body.shareId = shareId; @@ -35,13 +22,11 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { req.body.teamId = spaceTeamId; req.body.teamToken = teamToken; - filePaths = [file.path]; - if (!getDefaultSTTModel()) { throw new Error('whisper model not found'); } - if (!file) { + if (!result.fileMetadata) { throw new Error('file not found'); } if (duration === undefined) { @@ -49,36 +34,34 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { } duration = duration < 1 ? 1 : duration; - // auth role const { teamId, tmbId } = await authChatCrud({ req, authToken: true, ...req.body }); - const result = await aiTranscriptions({ + const transcriptionsResult = await aiTranscriptions({ model: getDefaultSTTModel(), - fileStream: fs.createReadStream(file.path) + fileStream: result.getReadStream() }); pushWhisperUsage({ teamId, tmbId, - duration: result?.usage?.total_tokens || duration + duration: transcriptionsResult?.usage?.total_tokens || duration }); jsonRes(res, { - data: result.text + data: transcriptionsResult.text }); } catch (err) { - console.log(err); jsonRes(res, { code: 500, error: err }); + } finally { + multer.clearDiskTempFiles(filepaths); } - - removeFilesByPaths(filePaths); } export default NextAPI( diff --git a/projects/app/src/pages/api/v1/chat/completions.ts b/projects/app/src/pages/api/v1/chat/completions.ts index ab8e38100..d8957f162 100644 --- a/projects/app/src/pages/api/v1/chat/completions.ts +++ b/projects/app/src/pages/api/v1/chat/completions.ts @@ -22,7 +22,6 @@ import { saveChat, updateInteractiveChat } from '@fastgpt/service/core/chat/save import { responseWrite } from '@fastgpt/service/common/response'; import { authOutLinkChatStart } from '@/service/support/permission/auth/outLink'; import { 
pushResult2Remote, addOutLinkUsage } from '@fastgpt/service/support/outLink/tools'; -import requestIp from 'request-ip'; import { getUsageSourceByAuthType } from '@fastgpt/global/support/wallet/usage/tools'; import { authTeamSpaceToken } from '@/service/support/permission/auth/team'; import { @@ -61,6 +60,8 @@ import { UserError } from '@fastgpt/global/common/error/utils'; import { getLocale } from '@fastgpt/service/common/middle/i18n'; import { formatTime2YMDHM } from '@fastgpt/global/common/string/time'; import { LimitTypeEnum, teamFrequencyLimit } from '@fastgpt/service/common/api/frequencyLimit'; +import { getIpFromRequest } from '@fastgpt/service/common/geo'; +import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; type FastGptWebChatProps = { chatId?: string; // undefined: get histories from messages, '': new chat, 'xxxxx': get histories from db @@ -114,10 +115,10 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { metadata } = req.body as Props; - const originIp = requestIp.getClientIp(req); - const startTime = Date.now(); + const originIp = getIpFromRequest(req); + try { if (!Array.isArray(messages)) { throw new Error('messages is not array'); @@ -196,6 +197,8 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { return; } + pushTrack.teamChatQPM({ teamId }); + retainDatasetCite = retainDatasetCite && !!responseDetail; const isPlugin = app.type === AppTypeEnum.workflowTool; @@ -360,8 +363,8 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { userContent: userQuestion, aiContent: aiResponse, metadata: { - originIp, - ...metadata + ...metadata, + originIp }, durationSeconds }; diff --git a/projects/app/src/pages/api/v2/chat/completions.ts b/projects/app/src/pages/api/v2/chat/completions.ts index 5b6b83c06..20977a9c4 100644 --- a/projects/app/src/pages/api/v2/chat/completions.ts +++ b/projects/app/src/pages/api/v2/chat/completions.ts @@ -22,7 +22,6 @@ import { saveChat, 
updateInteractiveChat } from '@fastgpt/service/core/chat/save import { responseWrite } from '@fastgpt/service/common/response'; import { authOutLinkChatStart } from '@/service/support/permission/auth/outLink'; import { pushResult2Remote, addOutLinkUsage } from '@fastgpt/service/support/outLink/tools'; -import requestIp from 'request-ip'; import { getUsageSourceByAuthType } from '@fastgpt/global/support/wallet/usage/tools'; import { authTeamSpaceToken } from '@/service/support/permission/auth/team'; import { @@ -62,6 +61,8 @@ import { UserError } from '@fastgpt/global/common/error/utils'; import { getLocale } from '@fastgpt/service/common/middle/i18n'; import { formatTime2YMDHM } from '@fastgpt/global/common/string/time'; import { LimitTypeEnum, teamFrequencyLimit } from '@fastgpt/service/common/api/frequencyLimit'; +import { getIpFromRequest } from '@fastgpt/service/common/geo'; +import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; type FastGptWebChatProps = { chatId?: string; // undefined: get histories from messages, '': new chat, 'xxxxx': get histories from db @@ -115,7 +116,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { metadata } = req.body as Props; - const originIp = requestIp.getClientIp(req); + const originIp = getIpFromRequest(req); const startTime = Date.now(); @@ -197,6 +198,8 @@ async function handler(req: NextApiRequest, res: NextApiResponse) { return; } + pushTrack.teamChatQPM({ teamId }); + retainDatasetCite = retainDatasetCite && !!responseDetail; const isPlugin = app.type === AppTypeEnum.workflowTool; diff --git a/projects/app/src/pages/chat/share.tsx b/projects/app/src/pages/chat/share.tsx index 1d9166e24..18aca372c 100644 --- a/projects/app/src/pages/chat/share.tsx +++ b/projects/app/src/pages/chat/share.tsx @@ -81,8 +81,11 @@ const OutLink = (props: Props) => { const { isPc } = useSystem(); const { outLinkAuthData, appId, chatId } = useChatStore(); - const isOpenSlider = 
useContextSelector(ChatContext, (v) => v.isOpenSlider); - const onCloseSlider = useContextSelector(ChatContext, (v) => v.onCloseSlider); + // Remove empty value field + const formatedCustomVariables = useMemo(() => { + return Object.fromEntries(Object.entries(customVariables).filter(([_, value]) => value !== '')); + }, [customVariables]); + const forbidLoadChat = useContextSelector(ChatContext, (v) => v.forbidLoadChat); const onChangeChatId = useContextSelector(ChatContext, (v) => v.onChangeChatId); const onUpdateHistoryTitle = useContextSelector(ChatContext, (v) => v.onUpdateHistoryTitle); @@ -114,7 +117,10 @@ const OutLink = (props: Props) => { setChatBoxData(res); resetVariables({ - variables: res.variables, + variables: { + ...formatedCustomVariables, + ...res.variables + }, variableList: res.app?.chatConfig?.variables }); diff --git a/projects/app/src/pages/login/provider.tsx b/projects/app/src/pages/login/provider.tsx index 71360e1b2..ed4c2e6e7 100644 --- a/projects/app/src/pages/login/provider.tsx +++ b/projects/app/src/pages/login/provider.tsx @@ -21,11 +21,12 @@ import { } from '@/web/support/marketing/utils'; import { postAcceptInvitationLink } from '@/web/support/user/team/api'; import { retryFn } from '@fastgpt/global/common/system/utils'; +import type { LangEnum } from '@fastgpt/global/common/i18n/type'; let isOauthLogging = false; const provider = () => { - const { t } = useTranslation(); + const { t, i18n } = useTranslation(); const { initd, loginStore, setLoginStore } = useSystemStore(); const { setUserInfo } = useUserStore(); const router = useRouter(); @@ -79,7 +80,8 @@ const provider = () => { bd_vid: getBdVId(), msclkid: getMsclkid(), fastgpt_sem: getFastGPTSem(), - sourceDomain: getSourceDomain() + sourceDomain: getSourceDomain(), + language: i18n.language as LangEnum }); if (!res) { diff --git a/projects/app/src/pages/price/index.tsx b/projects/app/src/pages/price/index.tsx index 46614896b..331b8bfc3 100644 --- 
a/projects/app/src/pages/price/index.tsx +++ b/projects/app/src/pages/price/index.tsx @@ -36,7 +36,7 @@ const PriceBox = () => { }, { threshold: 0, - rootMargin: '0px' + rootMargin: '0px 0px -50px 0px' } ); @@ -48,6 +48,7 @@ const PriceBox = () => { if (backButtonRef.current) { observer.unobserve(backButtonRef.current); } + observer.disconnect(); }; }, []); @@ -69,7 +70,7 @@ const PriceBox = () => { backgroundSize={'cover'} backgroundRepeat={'no-repeat'} > - {userInfo && ( + {teamSubPlan?.standard?.teamId && ( )} - {!isButtonInView && userInfo && ( + {!isButtonInView && teamSubPlan?.standard?.teamId && ( { addLog.info('Init BullMQ Workers...'); initS3MQWorker(); + initDatasetDeleteWorker(); }; diff --git a/projects/app/src/service/common/system/cron.ts b/projects/app/src/service/common/system/cron.ts index 0ecf1441a..d14a3fdf4 100644 --- a/projects/app/src/service/common/system/cron.ts +++ b/projects/app/src/service/common/system/cron.ts @@ -1,18 +1,11 @@ import { setCron } from '@fastgpt/service/common/system/cron'; import { startTrainingQueue } from '@/service/core/dataset/training/utils'; import { clearTmpUploadFiles } from '@fastgpt/service/common/file/utils'; -import { - checkInvalidDatasetFiles, - checkInvalidDatasetData, - checkInvalidVector, - removeExpiredChatFiles -} from './cronTask'; +import { checkInvalidDatasetData, checkInvalidVector } from './cronTask'; import { checkTimerLock } from '@fastgpt/service/common/system/timerLock/utils'; import { TimerIdEnum } from '@fastgpt/service/common/system/timerLock/constants'; import { addHours } from 'date-fns'; import { getScheduleTriggerApp } from '@/service/core/app/utils'; -import { clearExpiredRawTextBufferCron } from '@fastgpt/service/common/buffer/rawText/controller'; -import { clearExpiredDatasetImageCron } from '@fastgpt/service/core/dataset/image/controller'; import { cronRefreshModels } from '@fastgpt/service/core/ai/config/utils'; import { clearExpiredS3FilesCron } from 
'@fastgpt/service/common/s3/controller'; @@ -32,19 +25,6 @@ const setClearTmpUploadFilesCron = () => { }; const clearInvalidDataCron = () => { - // Clear files - setCron('0 */1 * * *', async () => { - if ( - await checkTimerLock({ - timerId: TimerIdEnum.checkExpiredFiles, - lockMinuted: 59 - }) - ) { - await checkInvalidDatasetFiles(addHours(new Date(), -6), addHours(new Date(), -2)); - removeExpiredChatFiles(); - } - }); - setCron('10 */1 * * *', async () => { if ( await checkTimerLock({ @@ -88,8 +68,6 @@ export const startCron = () => { setClearTmpUploadFilesCron(); clearInvalidDataCron(); scheduleTriggerAppCron(); - clearExpiredRawTextBufferCron(); - clearExpiredDatasetImageCron(); cronRefreshModels(); clearExpiredS3FilesCron(); }; diff --git a/projects/app/src/service/common/system/cronTask.ts b/projects/app/src/service/common/system/cronTask.ts index 18507c994..a3245918d 100644 --- a/projects/app/src/service/common/system/cronTask.ts +++ b/projects/app/src/service/common/system/cronTask.ts @@ -1,9 +1,4 @@ -import { BucketNameEnum } from '@fastgpt/global/common/file/constants'; import { retryFn } from '@fastgpt/global/common/system/utils'; -import { - delFileByFileIdList, - getGFSCollection -} from '@fastgpt/service/common/file/gridfs/controller'; import { addLog } from '@fastgpt/service/common/system/log'; import { deleteDatasetDataVector, @@ -13,88 +8,6 @@ import { MongoDatasetCollection } from '@fastgpt/service/core/dataset/collection import { MongoDatasetDataText } from '@fastgpt/service/core/dataset/data/dataTextSchema'; import { MongoDatasetData } from '@fastgpt/service/core/dataset/data/schema'; import { MongoDatasetTraining } from '@fastgpt/service/core/dataset/training/schema'; -import { addDays } from 'date-fns'; - -/* - check dataset.files data. If there is no match in dataset.collections, delete it - Possible failure case: - 1.
A file was uploaded but the collection was never created -*/ -export async function checkInvalidDatasetFiles(start: Date, end: Date) { - let deleteFileAmount = 0; - const collection = getGFSCollection(BucketNameEnum.dataset); - const where = { - uploadDate: { $gte: start, $lte: end } - }; - - // 1. get all file _id - const files = await collection - .find(where, { - projection: { - metadata: 1, - _id: 1 - } - }) - .toArray(); - addLog.info(`Clear invalid dataset files, total files: ${files.length}`); - - let index = 0; - for await (const file of files) { - try { - // 2. find fileId in dataset.collections - const hasCollection = await MongoDatasetCollection.countDocuments({ - teamId: file.metadata.teamId, - fileId: file._id - }); - - // 3. if not found, delete file - if (hasCollection === 0) { - await delFileByFileIdList({ - bucketName: BucketNameEnum.dataset, - fileIdList: [String(file._id)] - }); - console.log('delete file', file._id); - deleteFileAmount++; - } - index++; - index % 100 === 0 && console.log(index); - } catch (error) { - console.log(error); - } - } - addLog.info(`Clear invalid dataset files finish, remove ${deleteFileAmount} files`); -} - -/* - Remove chat files older than 7 days -*/ -export const removeExpiredChatFiles = async () => { - let deleteFileAmount = 0; - const collection = getGFSCollection(BucketNameEnum.chat); - - const expireTime = Number(process.env.CHAT_FILE_EXPIRE_TIME || 7); - const where = { - uploadDate: { $lte: addDays(new Date(), -expireTime) } - }; - - // get all file _id - const files = await collection.find(where, { projection: { _id: 1 } }).toArray(); - - // Delete file one by one - for await (const file of files) { - try { - await delFileByFileIdList({ - bucketName: BucketNameEnum.chat, - fileIdList: [String(file._id)] - }); - deleteFileAmount++; - } catch (error) { - console.log(error); - } - } - - addLog.info(`Remove expired chat files finish, remove ${deleteFileAmount} files`); -}; /* Detect invalid Mongo data diff --git a/projects/app/src/service/common/system/index.ts 
b/projects/app/src/service/common/system/index.ts index 2aae46a09..a2e61968d 100644 --- a/projects/app/src/service/common/system/index.ts +++ b/projects/app/src/service/common/system/index.ts @@ -148,6 +148,7 @@ export async function initSystemConfig() { hideChatCopyrightSetting: process.env.HIDE_CHAT_COPYRIGHT_SETTING === 'true', show_aiproxy: !!process.env.AIPROXY_API_ENDPOINT, show_coupon: process.env.SHOW_COUPON === 'true', + show_discount_coupon: process.env.SHOW_DISCOUNT_COUPON === 'true', show_dataset_enhance: licenseData?.functions?.datasetEnhance, show_batch_eval: licenseData?.functions?.batchEval }, diff --git a/projects/app/src/service/core/dataset/data/controller.ts b/projects/app/src/service/core/dataset/data/controller.ts index 48e06d7a1..3c2e78d48 100644 --- a/projects/app/src/service/core/dataset/data/controller.ts +++ b/projects/app/src/service/core/dataset/data/controller.ts @@ -17,10 +17,9 @@ import { type ClientSession } from '@fastgpt/service/common/mongo'; import { MongoDatasetDataText } from '@fastgpt/service/core/dataset/data/dataTextSchema'; import { DatasetDataIndexTypeEnum } from '@fastgpt/global/core/dataset/data/constants'; import { countPromptTokens } from '@fastgpt/service/common/string/tiktoken'; -import { deleteDatasetImage } from '@fastgpt/service/core/dataset/image/controller'; import { isS3ObjectKey } from '@fastgpt/service/common/s3/utils'; import { text2Chunks } from '@fastgpt/service/worker/function'; -import { getS3DatasetSource, S3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; +import { getS3DatasetSource } from '@fastgpt/service/common/s3/sources/dataset'; import { removeS3TTL } from '@fastgpt/service/common/s3/utils'; const formatIndexes = async ({ @@ -422,18 +421,19 @@ export async function updateData2Dataset({ export const deleteDatasetData = async (data: DatasetDataItemType) => { await mongoSessionRun(async (session) => { + if (data.imageId && !isS3ObjectKey(data.imageId, 'dataset')) { + return 
Promise.reject('Invalid dataset image key'); + } + // 1. Delete MongoDB data await MongoDatasetData.deleteOne({ _id: data.id }, { session }); await MongoDatasetDataText.deleteMany({ dataId: data.id }, { session }); - // 2. If there are any image files, delete the image records and GridFS file. if (data.imageId) { - await deleteDatasetImage(data.imageId); + await getS3DatasetSource().deleteDatasetFileByKey(data.imageId); } - // Note: We don't delete parsed images from S3 here - they will be cleaned up when the collection is deleted - - // 3. Delete vector data + // 2. Delete vector data await deleteDatasetDataVector({ teamId: data.teamId, idList: data.indexes.map((item) => item.dataId) diff --git a/projects/app/src/service/support/wallet/usage/push.ts b/projects/app/src/service/support/wallet/usage/push.ts index f5afba3a9..df98a53a4 100644 --- a/projects/app/src/service/support/wallet/usage/push.ts +++ b/projects/app/src/service/support/wallet/usage/push.ts @@ -260,25 +260,41 @@ export const pushDatasetTestUsage = ({ model: string; inputTokens: number; outputTokens: number; + embeddingTokens: number; + embeddingModel: string; }; }) => { const list: UsageItemType[] = []; let points = 0; if (extensionUsage) { - const { totalPoints, modelName } = formatModelChars2Points({ + const { totalPoints: llmPoints, modelName: llmModelName } = formatModelChars2Points({ model: extensionUsage.model, inputTokens: extensionUsage.inputTokens, outputTokens: extensionUsage.outputTokens }); - points += totalPoints; + points += llmPoints; list.push({ moduleName: i18nT('common:core.module.template.Query extension'), - amount: totalPoints, - model: modelName, + amount: llmPoints, + model: llmModelName, inputTokens: extensionUsage.inputTokens, outputTokens: extensionUsage.outputTokens }); + + const { totalPoints: embeddingPoints, modelName: embeddingModelName } = formatModelChars2Points( + { + model: extensionUsage.embeddingModel, + inputTokens: extensionUsage.embeddingTokens + } + ); + 
points += embeddingPoints; + list.push({ + moduleName: `${i18nT('account_usage:ai.query_extension_embedding')}`, + amount: embeddingPoints, + model: embeddingModelName, + inputTokens: extensionUsage.embeddingTokens + }); } if (embUsage) { const { totalPoints, modelName } = formatModelChars2Points({ diff --git a/projects/app/src/types/app.d.ts b/projects/app/src/types/app.d.ts index c4f3955df..b667b02be 100644 --- a/projects/app/src/types/app.d.ts +++ b/projects/app/src/types/app.d.ts @@ -14,6 +14,7 @@ import type { AppSchema } from '@fastgpt/global/core/app/type'; import { ChatModelType } from '@/constants/model'; import { TeamMemberStatusEnum } from '@fastgpt/global/support/user/team/constant'; import type { SourceMember } from '@fastgpt/global/support/user/type'; +import type { LocationName } from '@fastgpt/service/common/geo/type'; export interface ShareAppItem { _id: string; @@ -52,4 +53,5 @@ export type AppLogsListItemType = { outLinkUid?: string; tmbId: string; sourceMember: SourceMember; + region?: string; }; diff --git a/projects/app/src/types/user.d.ts b/projects/app/src/types/user.d.ts index 8b252554c..b86587703 100644 --- a/projects/app/src/types/user.d.ts +++ b/projects/app/src/types/user.d.ts @@ -1,8 +1,11 @@ import { UsageSourceEnum } from '@fastgpt/global/support/wallet/usage/constants'; import type { UserModelSchema } from '@fastgpt/global/support/user/type'; +import type { UserType } from '@fastgpt/global/support/user/type'; export interface UserUpdateParams { - balance?: number; avatar?: string; timezone?: string; + language?: UserType['language']; + /** @deprecated */ + balance?: number; } diff --git a/projects/app/src/web/common/api/request.ts b/projects/app/src/web/common/api/request.ts index 6268863f0..2fae3fb3a 100644 --- a/projects/app/src/web/common/api/request.ts +++ b/projects/app/src/web/common/api/request.ts @@ -162,10 +162,7 @@ function responseError(err: any) { /* Create request instance */ const instance = axios.create({ - timeout: 60000, // request timeout - 
headers: { - 'content-type': 'application/json' - } + timeout: 60000 // request timeout }); /* Request interceptor */ diff --git a/projects/app/src/web/common/file/api.ts b/projects/app/src/web/common/file/api.ts index bf5b96afb..5800b9e68 100644 --- a/projects/app/src/web/common/file/api.ts +++ b/projects/app/src/web/common/file/api.ts @@ -10,9 +10,6 @@ export const postS3UploadFile = ( ) => POST(postURL, form, { timeout: 600000, - headers: { - 'Content-Type': 'multipart/form-data' - }, onUploadProgress }); diff --git a/projects/app/src/web/common/hooks/useSpeech.ts b/projects/app/src/web/common/hooks/useSpeech.ts index cdbd0a94b..ac6bbedfd 100644 --- a/projects/app/src/web/common/hooks/useSpeech.ts +++ b/projects/app/src/web/common/hooks/useSpeech.ts @@ -445,10 +445,7 @@ export const useSpeech = (props?: OutLinkChatAuthProps & { appId?: string }) => setIsTransCription(true); try { const result = await POST('/v1/audio/transcriptions', formData, { - timeout: 60000, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - } + timeout: 60000 }); onFinish(result); } catch (error) { diff --git a/projects/app/src/web/core/app/api/evaluation.ts b/projects/app/src/web/core/app/api/evaluation.ts index 5caf12f10..38b289f5b 100644 --- a/projects/app/src/web/core/app/api/evaluation.ts +++ b/projects/app/src/web/core/app/api/evaluation.ts @@ -32,9 +32,6 @@ export const postCreateEvaluation = ({ const percent = Math.round((e.loaded / e.total) * 100); percentListen?.(percent); - }, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' } }); }; diff --git a/projects/app/src/web/core/chat/context/chatItemContext.tsx b/projects/app/src/web/core/chat/context/chatItemContext.tsx index ecacb8a20..b56bb07d3 100644 --- a/projects/app/src/web/core/chat/context/chatItemContext.tsx +++ b/projects/app/src/web/core/chat/context/chatItemContext.tsx @@ -144,15 +144,16 @@ const ChatItemContextProvider = ({ (props?: { variables?: Record; variableList?: VariableItemType[] }) => { const {
variables = {}, variableList = [] } = props || {}; + const varValues: Record = {}; + variableList.forEach((item) => { - if (variables[item.key] === undefined) { - variables[item.key] = item.defaultValue; - } + varValues[item.key] = variables[item.key] ?? variables[item.label] ?? item.defaultValue; }); const values = variablesForm.getValues(); + variablesForm.reset({ ...values, - variables + variables: varValues }); }, [variablesForm] diff --git a/projects/app/src/web/core/dataset/api.ts b/projects/app/src/web/core/dataset/api.ts index 08df242d1..8aeec32a2 100644 --- a/projects/app/src/web/core/dataset/api.ts +++ b/projects/app/src/web/core/dataset/api.ts @@ -146,9 +146,6 @@ export const postBackupDatasetCollection = ({ const percent = Math.round((e.loaded / e.total) * 100); percentListen?.(percent); - }, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' } }); }; @@ -172,9 +169,6 @@ export const postTemplateDatasetCollection = ({ const percent = Math.round((e.loaded / e.total) * 100); percentListen?.(percent); - }, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' } }); }; diff --git a/projects/app/src/web/core/dataset/image/api.ts b/projects/app/src/web/core/dataset/image/api.ts index 483a72f5e..b437381dd 100644 --- a/projects/app/src/web/core/dataset/image/api.ts +++ b/projects/app/src/web/core/dataset/image/api.ts @@ -17,9 +17,6 @@ export const createImageDatasetCollection = async ({ return await POST<{ collectionId: string }>('/core/dataset/collection/create/images', formData, { timeout: 600000, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if (!onUploadProgress) return; if (!e.progress) { @@ -48,9 +45,6 @@ export const insertImagesToCollection = async ({ return await POST<{ collectionId: string }>('/core/dataset/data/insertImages', formData, { timeout: 600000, - headers: { - 'Content-Type': 'multipart/form-data; charset=utf-8' - }, onUploadProgress: (e) => { if 
(!onUploadProgress) return; if (!e.progress) { diff --git a/projects/app/src/web/support/user/api.ts b/projects/app/src/web/support/user/api.ts index 5b8af1df4..6cf3ee85a 100644 --- a/projects/app/src/web/support/user/api.ts +++ b/projects/app/src/web/support/user/api.ts @@ -16,12 +16,14 @@ import type { } from '@fastgpt/global/support/user/login/api.d'; import type { preLoginResponse } from '@/pages/api/support/user/account/preLogin'; import type { WxLoginProps } from '@fastgpt/global/support/user/api.d'; +import type { LangEnum } from '@fastgpt/global/common/i18n/type'; export const sendAuthCode = (data: { username: string; type: `${UserAuthTypeEnum}`; googleToken: string; captcha: string; + lang: `${LangEnum}`; }) => POST(`/proApi/support/user/inform/sendAuthCode`, data); export const getTokenLogin = () => diff --git a/projects/app/src/web/support/user/hooks/useSendCode.tsx b/projects/app/src/web/support/user/hooks/useSendCode.tsx index 1035bf0a8..d11c3d12c 100644 --- a/projects/app/src/web/support/user/hooks/useSendCode.tsx +++ b/projects/app/src/web/support/user/hooks/useSendCode.tsx @@ -8,10 +8,11 @@ import { Box, type BoxProps, useDisclosure } from '@chakra-ui/react'; import SendCodeAuthModal from '@/components/support/user/safe/SendCodeAuthModal'; import { useMemoizedFn } from 'ahooks'; import { useToast } from '@fastgpt/web/hooks/useToast'; +import type { LangEnum } from '@fastgpt/global/common/i18n/type'; let timer: NodeJS.Timeout; export const useSendCode = ({ type }: { type: `${UserAuthTypeEnum}` }) => { - const { t } = useTranslation(); + const { t, i18n } = useTranslation(); const { feConfigs } = useSystemStore(); const { toast } = useToast(); const [codeCountDown, setCodeCountDown] = useState(0); @@ -20,7 +21,7 @@ export const useSendCode = ({ type }: { type: `${UserAuthTypeEnum}` }) => { async ({ username, captcha }: { username: string; captcha: string }) => { if (codeCountDown > 0) return; const googleToken = await 
getClientToken(feConfigs.googleClientVerKey); - await sendAuthCode({ username, type, googleToken, captcha }); + await sendAuthCode({ username, type, googleToken, captcha, lang: i18n.language as LangEnum }); setCodeCountDown(60); diff --git a/projects/app/src/web/support/user/useUserStore.ts b/projects/app/src/web/support/user/useUserStore.ts index 91d588ac3..46f225635 100644 --- a/projects/app/src/web/support/user/useUserStore.ts +++ b/projects/app/src/web/support/user/useUserStore.ts @@ -6,6 +6,7 @@ import type { OrgType } from '@fastgpt/global/support/user/team/org/type'; import type { UserType } from '@fastgpt/global/support/user/type.d'; import type { ClientTeamPlanStatusType } from '@fastgpt/global/support/wallet/sub/type'; import { getTeamPlanStatus } from './team/api'; +import { setLangToStorage, getLangMapping } from '@fastgpt/web/i18n/utils'; type State = { systemMsgReadId: string; @@ -16,7 +17,7 @@ type State = { userInfo: UserType | null; isTeamAdmin: boolean; - initUserInfo: () => Promise; + initUserInfo: () => Promise; setUserInfo: (user: UserType | null) => void; updateUserInfo: (user: UserUpdateParams) => Promise; @@ -49,21 +50,30 @@ export const useUserStore = create()( async initUserInfo() { get().initTeamPlanStatus(); - const res = await getTokenLogin(); - get().setUserInfo(res); + try { + const res = await getTokenLogin(); + get().setUserInfo(res); - // set the html font-size - const html = document?.querySelector('html'); - if (html) { - // html.style.fontSize = '16px'; + // set the html font-size + const html = document?.querySelector('html'); + if (html) { + // html.style.fontSize = '16px'; + } + + return res; + } catch (error) { + console.log('[Init user] error', error); } - - return res; }, setUserInfo(user: UserType | null) { set((state) => { state.userInfo = user ? 
user : null; state.isTeamAdmin = !!user?.team?.permission?.hasManagePer; + const lang = user?.language; + if (lang) { + const mappedLang = getLangMapping(lang); + setLangToStorage(mappedLang); + } }); }, async updateUserInfo(user: UserUpdateParams) { diff --git a/projects/app/src/web/support/wallet/bill/api.ts b/projects/app/src/web/support/wallet/bill/api.ts index ae493cdca..6d4c48a19 100644 --- a/projects/app/src/web/support/wallet/bill/api.ts +++ b/projects/app/src/web/support/wallet/bill/api.ts @@ -1,27 +1,25 @@ import { GET, POST, PUT } from '@/web/common/api/request'; import type { - CheckPayResultResponse, - CreateBillProps, - CreateBillResponse, - CreateOrderResponse, - UpdatePaymentProps -} from '@fastgpt/global/support/wallet/bill/api'; -import type { BillTypeEnum } from '@fastgpt/global/support/wallet/bill/constants'; + CreateBillPropsType, + CreateBillResponseType, + GetBillListQueryType, + GetBillListResponseType, + CheckPayResultResponseType, + UpdatePaymentPropsType, + BillDetailResponseType, + CancelBillPropsType, + UpdateBillResponseType +} from '@fastgpt/global/openapi/support/wallet/bill/api'; import { BillStatusEnum } from '@fastgpt/global/support/wallet/bill/constants'; -import type { BillSchemaType } from '@fastgpt/global/support/wallet/bill/type.d'; -import type { PaginationProps, PaginationResponse } from '@fastgpt/web/common/fetch/type'; -export const getBills = ( - data: PaginationProps<{ - type?: BillTypeEnum; - }> -) => POST>(`/proApi/support/wallet/bill/list`, data); +export const getBills = (data: GetBillListQueryType) => + POST(`/proApi/support/wallet/bill/list`, data); -export const postCreatePayBill = (data: CreateBillProps) => - POST(`/proApi/support/wallet/bill/create`, data); +export const postCreatePayBill = (data: CreateBillPropsType) => + POST(`/proApi/support/wallet/bill/create`, data); -export const checkBalancePayResult = (payId: string) => - GET(`/proApi/support/wallet/bill/pay/checkPayResult`, { payId }).then( +export 
const checkBalancePayResult = (payId: string): Promise => + GET(`/proApi/support/wallet/bill/pay/checkPayResult`, { payId }).then( (data) => { try { if (data.status === BillStatusEnum.SUCCESS) { @@ -32,7 +30,13 @@ export const checkBalancePayResult = (payId: string) => } ); -export const putUpdatePayment = (data: UpdatePaymentProps) => - PUT(`/proApi/support/wallet/bill/pay/updatePayment`, data); +export const putUpdatePayment = (data: UpdatePaymentPropsType) => + PUT(`/proApi/support/wallet/bill/pay/updatePayment`, data); export const balanceConversion = () => GET(`/proApi/support/wallet/bill/balanceConversion`); + +export const cancelBill = (data: CancelBillPropsType) => + POST(`/proApi/support/wallet/bill/cancel`, data); + +export const getBillDetail = (billId: string) => + GET(`/proApi/support/wallet/bill/detail`, { billId }); diff --git a/projects/app/src/web/support/wallet/sub/discountCoupon/api.ts b/projects/app/src/web/support/wallet/sub/discountCoupon/api.ts new file mode 100644 index 000000000..4032bd4c2 --- /dev/null +++ b/projects/app/src/web/support/wallet/sub/discountCoupon/api.ts @@ -0,0 +1,5 @@ +import { GET } from '@/web/common/api/request'; +import type { DiscountCouponListResponseType } from '@fastgpt/global/openapi/support/wallet/discountCoupon/api'; + +export const getDiscountCouponList = (teamId: string) => + GET(`/proApi/support/wallet/discountCoupon/list`, { teamId }); diff --git a/projects/app/test/api/support/user/account/loginByPassword.test.ts b/projects/app/test/api/support/user/account/loginByPassword.test.ts new file mode 100644 index 000000000..7d1782ab8 --- /dev/null +++ b/projects/app/test/api/support/user/account/loginByPassword.test.ts @@ -0,0 +1,264 @@ +import { describe, it, expect, vi, beforeEach } from 'vitest'; +import * as loginApi from '@/pages/api/support/user/account/loginByPassword'; +import { MongoUser } from '@fastgpt/service/support/user/schema'; +import { UserStatusEnum } from 
'@fastgpt/global/support/user/constant'; +import { MongoTeam } from '@fastgpt/service/support/user/team/teamSchema'; +import { MongoTeamMember } from '@fastgpt/service/support/user/team/teamMemberSchema'; +import { authCode } from '@fastgpt/service/support/user/auth/controller'; +import { createUserSession } from '@fastgpt/service/support/user/session'; +import { setCookie } from '@fastgpt/service/support/permission/auth/common'; +import { pushTrack } from '@fastgpt/service/common/middle/tracks/utils'; +import { addAuditLog } from '@fastgpt/service/support/user/audit/util'; +import { CommonErrEnum } from '@fastgpt/global/common/error/code/common'; +import { UserErrEnum } from '@fastgpt/global/common/error/code/user'; +import { Call } from '@test/utils/request'; +import type { PostLoginProps } from '@fastgpt/global/support/user/api.d'; +import { initTeamFreePlan } from '@fastgpt/service/support/wallet/sub/utils'; + +describe('loginByPassword API', () => { + let testUser: any; + let testTeam: any; + let testTmb: any; + + beforeEach(async () => { + // Create test user and team + testUser = await MongoUser.create({ + username: 'testuser', + password: 'testpassword', + status: UserStatusEnum.active + }); + + testTeam = await MongoTeam.create({ + name: 'Test Team', + ownerId: testUser._id + }); + + await initTeamFreePlan({ + teamId: String(testTeam._id) + }); + + testTmb = await MongoTeamMember.create({ + teamId: testTeam._id, + userId: testUser._id, + status: 'active', + role: 'owner' + }); + + await MongoUser.findByIdAndUpdate(testUser._id, { + lastLoginTmbId: testTmb._id + }); + + // Reset mocks before each test + vi.clearAllMocks(); + }); + + it('should login successfully with valid credentials', async () => { + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'testpassword', + code: '123456', + language: 'zh-CN' + } + }); + + expect(res.code).toBe(200); + expect(res.error).toBeUndefined(); + expect(res.data).toBeDefined(); + 
expect(res.data.user).toBeDefined(); + expect(res.data.user.team).toBeDefined(); + expect(res.data.token).toBeDefined(); + expect(typeof res.data.token).toBe('string'); + expect(res.data.token.length).toBeGreaterThan(0); + + // Verify authCode was called + expect(authCode).toHaveBeenCalledWith({ + key: 'testuser', + code: '123456', + type: expect.any(String) + }); + + // Verify setCookie was called + expect(setCookie).toHaveBeenCalled(); + + // Verify tracking was called + expect(pushTrack.login).toHaveBeenCalledWith({ + type: 'password', + uid: testUser._id, + teamId: String(testTeam._id), + tmbId: String(testTmb._id) + }); + + // Verify audit log was called + expect(addAuditLog).toHaveBeenCalled(); + }); + + it('should reject login when username is missing', async () => { + const res = await Call(loginApi.default, { + body: { + username: '', + password: 'testpassword', + code: '123456' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe(CommonErrEnum.invalidParams); + }); + + it('should reject login when password is missing', async () => { + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: '', + code: '123456' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe(CommonErrEnum.invalidParams); + }); + + it('should reject login when code is missing', async () => { + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'testpassword', + code: '' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe(CommonErrEnum.invalidParams); + }); + + it('should reject login when auth code verification fails', async () => { + // Mock authCode to reject + vi.mocked(authCode).mockRejectedValueOnce(new Error('Invalid code')); + + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'testpassword', + code: 'wrongcode' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBeDefined(); + }); + + it('should reject login when 
user does not exist', async () => { + const res = await Call(loginApi.default, { + body: { + username: 'nonexistentuser', + password: 'testpassword', + code: '123456' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe(UserErrEnum.account_psw_error); + }); + + it('should reject login when user is forbidden', async () => { + // Update user status to forbidden + await MongoUser.findByIdAndUpdate(testUser._id, { + status: UserStatusEnum.forbidden + }); + + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'testpassword', + code: '123456' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe('Invalid account!'); + }); + + it('should reject login when password is incorrect', async () => { + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'wrongpassword', + code: '123456' + } + }); + + expect(res.code).toBe(500); + expect(res.error).toBe(UserErrEnum.account_psw_error); + }); + + it('should accept language parameter on successful login', async () => { + // Spy on findByIdAndUpdate to verify it's called with the language + const findByIdAndUpdateSpy = vi.spyOn(MongoUser, 'findByIdAndUpdate'); + + const res = await Call(loginApi.default, { + body: { + username: 'testuser', + password: 'testpassword', + code: '123456', + language: 'en' + } + }); + + expect(res.code).toBe(200); + expect(res.error).toBeUndefined(); + + // Verify findByIdAndUpdate was called with the language + expect(findByIdAndUpdateSpy).toHaveBeenCalledWith( + testUser._id, + expect.objectContaining({ + language: 'en' + }) + ); + + findByIdAndUpdateSpy.mockRestore(); + }); + + it('should handle root user login correctly', async () => { + // Create root user + const rootUser = await MongoUser.create({ + username: 'root', + password: 'rootpassword', + status: UserStatusEnum.active + }); + + const rootTeam = await MongoTeam.create({ + name: 'Root Team', + ownerId: rootUser._id + }); + + await 
initTeamFreePlan({ + teamId: String(rootTeam._id) + }); + + const rootTmb = await MongoTeamMember.create({ + teamId: rootTeam._id, + userId: rootUser._id, + status: 'active', + role: 'owner' + }); + + await MongoUser.findByIdAndUpdate(rootUser._id, { + lastLoginTmbId: rootTmb._id + }); + + const res = await Call(loginApi.default, { + body: { + username: 'root', + password: 'rootpassword', + code: '123456' + } + }); + + expect(res.code).toBe(200); + expect(res.error).toBeUndefined(); + expect(res.data).toBeDefined(); + expect(res.data.token).toBeDefined(); + expect(typeof res.data.token).toBe('string'); + }); +}); diff --git a/test/cases/service/common/geo.test.ts b/test/cases/service/common/geo.test.ts new file mode 100644 index 000000000..76810abe8 --- /dev/null +++ b/test/cases/service/common/geo.test.ts @@ -0,0 +1,61 @@ +import { getLocationFromIp } from '@fastgpt/service/common/geo'; +import { describe, expect, it } from 'vitest'; + +describe('Get Location From IP', () => { + it('should return the `Other` when the ip is loopback address', () => { + const ip = '::1'; + const locationEn = getLocationFromIp(ip, 'en'); + const locationZh = getLocationFromIp(ip, 'zh'); + + expect(locationEn).toBe('Other'); + expect(locationZh).toBe('其他'); + }); + + it('should return the `Other` when the ip is private address', () => { + const ip = '192.168.0.1'; + const locationEn = getLocationFromIp(ip, 'en'); + const locationZh = getLocationFromIp(ip, 'zh'); + + expect(locationEn).toBe('Other'); + expect(locationZh).toBe('其他'); + }); + + it('should return the `Other` when the ip is invalid', () => { + const ip = 'Invalid'; + const locationEn = getLocationFromIp(ip, 'en'); + const locationZh = getLocationFromIp(ip, 'zh'); + + expect(locationEn).toBe('Other'); + expect(locationZh).toBe('其他'); + }); + + it('should only return the country name', () => { + const ip = '8.8.8.8'; + const locationEn = getLocationFromIp(ip, 'en'); + const locationZh = getLocationFromIp(ip, 'zh'); + + const 
ipv6 = '2001:4860:4860::8888'; + const locationEnIpv6 = getLocationFromIp(ipv6, 'en'); + const locationZhIpv6 = getLocationFromIp(ipv6, 'zh'); + + expect(locationEn).toBe('United States'); + expect(locationZh).toBe('美国'); + expect(locationEnIpv6).toBe('United States'); + expect(locationZhIpv6).toBe('美国'); + }); + + it('should return full location name', () => { + const ip = '223.5.5.5'; + const locationEn = getLocationFromIp(ip, 'en'); + const locationZh = getLocationFromIp(ip, 'zh'); + + const ipv6 = '2400:3200:baba::1'; + const locationEnIpv6 = getLocationFromIp(ipv6, 'en'); + const locationZhIpv6 = getLocationFromIp(ipv6, 'zh'); + + expect(locationEn).toBe('China, Zhejiang, Hangzhou'); + expect(locationZh).toBe('中国,浙江,杭州'); + expect(locationEnIpv6).toBe('China, Zhejiang, Hangzhou'); + expect(locationZhIpv6).toBe('中国,浙江,杭州'); + }); +}); diff --git a/test/cases/service/common/vectorDB/controller.test.ts b/test/cases/service/common/vectorDB/controller.test.ts new file mode 100644 index 000000000..5e79e42ce --- /dev/null +++ b/test/cases/service/common/vectorDB/controller.test.ts @@ -0,0 +1,324 @@ +import { describe, expect, it, vi, beforeEach, afterEach } from 'vitest'; +import { + mockVectorInsert, + mockVectorDelete, + mockVectorEmbRecall, + mockVectorInit, + mockGetVectorDataByTime, + mockGetVectorCountByTeamId, + mockGetVectorCountByDatasetId, + mockGetVectorCountByCollectionId, + resetVectorMocks +} from '@test/mocks/common/vector'; +import { mockGetVectorsByText } from '@test/mocks/core/ai/embedding'; + +// Import controller functions after mocks are set up +import { + initVectorStore, + recallFromVectorStore, + getVectorDataByTime, + getVectorCountByTeamId, + getVectorCountByDatasetId, + getVectorCountByCollectionId, + insertDatasetDataVector, + deleteDatasetDataVector +} from '@fastgpt/service/common/vectorDB/controller'; + +// Mock redis cache functions +const mockGetRedisCache = vi.fn(); +const mockSetRedisCache = vi.fn(); +const mockDelRedisCache = 
vi.fn(); +const mockIncrValueToCache = vi.fn(); + +vi.mock('@fastgpt/service/common/redis/cache', () => ({ + setRedisCache: (...args: any[]) => mockSetRedisCache(...args), + getRedisCache: (...args: any[]) => mockGetRedisCache(...args), + delRedisCache: (...args: any[]) => mockDelRedisCache(...args), + incrValueToCache: (...args: any[]) => mockIncrValueToCache(...args), + CacheKeyEnum: { + team_vector_count: 'team_vector_count', + team_point_surplus: 'team_point_surplus', + team_point_total: 'team_point_total' + }, + CacheKeyEnumTime: { + team_vector_count: 1800, + team_point_surplus: 60, + team_point_total: 60 + } +})); + +describe('VectorDB Controller', () => { + beforeEach(() => { + resetVectorMocks(); + mockGetRedisCache.mockReset(); + mockSetRedisCache.mockReset(); + mockDelRedisCache.mockReset(); + mockIncrValueToCache.mockReset(); + mockGetVectorsByText.mockClear(); + }); + + describe('initVectorStore', () => { + it('should call Vector.init', async () => { + await initVectorStore(); + expect(mockVectorInit).toHaveBeenCalled(); + }); + }); + + describe('recallFromVectorStore', () => { + it('should call Vector.embRecall with correct props', async () => { + const props = { + teamId: 'team_123', + datasetIds: ['dataset_1', 'dataset_2'], + vector: [0.1, 0.2, 0.3], + limit: 10, + forbidCollectionIdList: ['col_forbidden'] + }; + + const result = await recallFromVectorStore(props); + + expect(mockVectorEmbRecall).toHaveBeenCalledWith(props); + expect(result).toEqual({ + results: [ + { id: '1', collectionId: 'col_1', score: 0.95 }, + { id: '2', collectionId: 'col_2', score: 0.85 } + ] + }); + }); + + it('should handle filterCollectionIdList', async () => { + const props = { + teamId: 'team_123', + datasetIds: ['dataset_1'], + vector: [0.1, 0.2], + limit: 5, + forbidCollectionIdList: [], + filterCollectionIdList: ['col_1', 'col_2'] + }; + + await recallFromVectorStore(props); + + expect(mockVectorEmbRecall).toHaveBeenCalledWith(props); + }); + }); + + 
describe('getVectorDataByTime', () => { + it('should call Vector.getVectorDataByTime with correct date range', async () => { + const start = new Date('2024-01-01'); + const end = new Date('2024-01-31'); + + const result = await getVectorDataByTime(start, end); + + expect(mockGetVectorDataByTime).toHaveBeenCalledWith(start, end); + expect(result).toEqual([ + { id: '1', teamId: 'team_1', datasetId: 'dataset_1' }, + { id: '2', teamId: 'team_1', datasetId: 'dataset_2' } + ]); + }); + }); + + describe('getVectorCountByTeamId', () => { + it('should return cached count if available', async () => { + mockGetRedisCache.mockResolvedValue('150'); + + const result = await getVectorCountByTeamId('team_123'); + + expect(result).toBe(150); + expect(mockGetVectorCountByTeamId).not.toHaveBeenCalled(); + }); + + it('should fetch from Vector and cache if no cache exists', async () => { + mockGetRedisCache.mockResolvedValue(null); + mockGetVectorCountByTeamId.mockResolvedValue(200); + + const result = await getVectorCountByTeamId('team_456'); + + expect(result).toBe(200); + expect(mockGetVectorCountByTeamId).toHaveBeenCalledWith('team_456'); + }); + + it('should handle undefined cache value', async () => { + mockGetRedisCache.mockResolvedValue(undefined); + mockGetVectorCountByTeamId.mockResolvedValue(50); + + const result = await getVectorCountByTeamId('team_789'); + + expect(result).toBe(50); + expect(mockGetVectorCountByTeamId).toHaveBeenCalled(); + }); + }); + + describe('getVectorCountByDatasetId', () => { + it('should call Vector.getVectorCountByDatasetId', async () => { + const result = await getVectorCountByDatasetId('team_1', 'dataset_1'); + + expect(mockGetVectorCountByDatasetId).toHaveBeenCalledWith('team_1', 'dataset_1'); + expect(result).toBe(50); + }); + }); + + describe('getVectorCountByCollectionId', () => { + it('should call Vector.getVectorCountByCollectionId', async () => { + const result = await getVectorCountByCollectionId('team_1', 'dataset_1', 'col_1'); + + 
expect(mockGetVectorCountByCollectionId).toHaveBeenCalledWith('team_1', 'dataset_1', 'col_1'); + expect(result).toBe(25); + }); + }); + + describe('insertDatasetDataVector', () => { + const mockModel = { + model: 'text-embedding-ada-002', + name: 'text-embedding-ada-002', + charsPointsPrice: 0, + maxToken: 8192, + weight: 100, + defaultToken: 512, + dbConfig: {}, + queryExtensionModel: '' + }; + + it('should generate embeddings and insert vectors', async () => { + const mockVectors = [ + [0.1, 0.2], + [0.3, 0.4] + ]; + mockGetVectorsByText.mockResolvedValue({ + tokens: 100, + vectors: mockVectors + }); + mockVectorInsert.mockResolvedValue({ + insertIds: ['id_1', 'id_2'] + }); + + const result = await insertDatasetDataVector({ + teamId: 'team_123', + datasetId: 'dataset_456', + collectionId: 'col_789', + inputs: ['hello world', 'test text'], + model: mockModel as any + }); + + expect(mockGetVectorsByText).toHaveBeenCalledWith({ + model: mockModel, + input: ['hello world', 'test text'], + type: 'db' + }); + expect(mockVectorInsert).toHaveBeenCalledWith({ + teamId: 'team_123', + datasetId: 'dataset_456', + collectionId: 'col_789', + vectors: mockVectors + }); + expect(result).toEqual({ + tokens: 100, + insertIds: ['id_1', 'id_2'] + }); + }); + + it('should increment team vector cache', async () => { + mockGetVectorsByText.mockResolvedValue({ + tokens: 50, + vectors: [[0.1]] + }); + mockVectorInsert.mockResolvedValue({ + insertIds: ['id_1'] + }); + + await insertDatasetDataVector({ + teamId: 'team_abc', + datasetId: 'dataset_def', + collectionId: 'col_ghi', + inputs: ['single input'], + model: mockModel as any + }); + + // Cache increment is called asynchronously + await new Promise((resolve) => setTimeout(resolve, 10)); + expect(mockIncrValueToCache).toHaveBeenCalled(); + }); + + it('should handle empty inputs', async () => { + mockGetVectorsByText.mockResolvedValue({ + tokens: 0, + vectors: [] + }); + mockVectorInsert.mockResolvedValue({ + insertIds: [] + }); + + 
const result = await insertDatasetDataVector({ + teamId: 'team_123', + datasetId: 'dataset_456', + collectionId: 'col_789', + inputs: [], + model: mockModel as any + }); + + expect(result).toEqual({ + tokens: 0, + insertIds: [] + }); + }); + }); + + describe('deleteDatasetDataVector', () => { + it('should delete by single id', async () => { + const props = { + teamId: 'team_123', + id: 'vector_id_1' + }; + + await deleteDatasetDataVector(props); + + expect(mockVectorDelete).toHaveBeenCalledWith(props); + }); + + it('should delete by datasetIds', async () => { + const props = { + teamId: 'team_123', + datasetIds: ['dataset_1', 'dataset_2'] + }; + + await deleteDatasetDataVector(props); + + expect(mockVectorDelete).toHaveBeenCalledWith(props); + }); + + it('should delete by datasetIds and collectionIds', async () => { + const props = { + teamId: 'team_123', + datasetIds: ['dataset_1'], + collectionIds: ['col_1', 'col_2'] + }; + + await deleteDatasetDataVector(props); + + expect(mockVectorDelete).toHaveBeenCalledWith(props); + }); + + it('should delete by idList', async () => { + const props = { + teamId: 'team_123', + idList: ['id_1', 'id_2', 'id_3'] + }; + + await deleteDatasetDataVector(props); + + expect(mockVectorDelete).toHaveBeenCalledWith(props); + }); + + it('should call delete and return result', async () => { + mockVectorDelete.mockResolvedValue({ deletedCount: 5 }); + + const props = { + teamId: 'team_cache_test', + id: 'some_id' + }; + + const result = await deleteDatasetDataVector(props); + + expect(mockVectorDelete).toHaveBeenCalledWith(props); + expect(result).toEqual({ deletedCount: 5 }); + }); + }); +}); diff --git a/test/cases/service/core/ai/hooks/useTextCosine.test.ts b/test/cases/service/core/ai/hooks/useTextCosine.test.ts new file mode 100644 index 000000000..6f4e09519 --- /dev/null +++ b/test/cases/service/core/ai/hooks/useTextCosine.test.ts @@ -0,0 +1,231 @@ +import { describe, expect, it, vi, beforeEach } from 'vitest'; +import { 
useTextCosine } from '@fastgpt/service/core/ai/hooks/useTextCosine'; +import { + generateMockEmbedding, + createMockVectorsResponse, + generateSimilarVector, + generateOrthogonalVector, + mockGetVectorsByText +} from '../../../../../mocks/core/ai/embedding'; + +describe('useTextCosine', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + describe('lazyGreedyQuerySelection', () => { + it('should return empty array when candidates is empty', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'test query', + candidates: [], + k: 3 + }); + + expect(result.selectedData).toEqual([]); + }); + + it('should select k candidates when k <= candidates.length', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original text', + candidates: ['candidate1', 'candidate2', 'candidate3'], + k: 2 + }); + + expect(result.selectedData.length).toBe(2); + }); + + it('should select all candidates when k > candidates.length', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original text', + candidates: ['candidate1', 'candidate2'], + k: 5 + }); + + expect(result.selectedData.length).toBe(2); + }); + + it('should select single candidate correctly', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original text', + candidates: ['only candidate'], + k: 1 + }); + + expect(result.selectedData).toEqual(['only candidate']); + }); + + it('should prefer candidates with higher relevance to original text', async () => { + const originalVector = 
generateMockEmbedding('original text'); + // Create a candidate very similar to original + const similarVector = generateSimilarVector(originalVector, 0.95); + // Create a candidate very different from original + const differentVector = generateOrthogonalVector(originalVector); + + mockGetVectorsByText.mockResolvedValueOnce({ + tokens: 30, + vectors: [originalVector, differentVector, similarVector] + }); + + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original text', + candidates: ['different', 'similar'], + k: 1, + alpha: 1.0 // Only consider relevance, not diversity + }); + + // Should select the similar candidate first when alpha=1.0 + expect(result.selectedData[0]).toBe('similar'); + }); + + it('should balance relevance and diversity with default alpha', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original text', + candidates: ['c1', 'c2', 'c3'], + k: 3, + alpha: 0.3 // Default alpha + }); + + expect(result.selectedData.length).toBe(3); + // All candidates should be selected + expect(result.selectedData).toContain('c1'); + expect(result.selectedData).toContain('c2'); + expect(result.selectedData).toContain('c3'); + }); + + it('should call getVectorsByText with correct parameters', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ embeddingModel: 'custom-model' }); + await lazyGreedyQuerySelection({ + originalText: 'test query', + candidates: ['candidate'], + k: 1 + }); + + expect(mockGetVectorsByText).toHaveBeenCalledWith({ + model: expect.anything(), + input: ['test query', 'candidate'], + type: 'query' + }); + }); + + it('should handle identical candidates correctly', async () => { + const originalVector = generateMockEmbedding('original'); + const identicalVector = 
generateMockEmbedding('same'); + + mockGetVectorsByText.mockResolvedValueOnce({ + tokens: 30, + vectors: [originalVector, identicalVector, identicalVector, identicalVector] + }); + + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original', + candidates: ['same1', 'same2', 'same3'], + k: 2 + }); + + expect(result.selectedData.length).toBe(2); + }); + + it('should respect alpha parameter for diversity weighting', async () => { + const originalVector = generateMockEmbedding('original'); + // Create vectors with known similarities + const similarVector = generateSimilarVector(originalVector, 0.9); + const differentVector = generateOrthogonalVector(originalVector); + + mockGetVectorsByText.mockResolvedValueOnce({ + tokens: 25, + vectors: [originalVector, similarVector, differentVector] + }); + + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + + // With high alpha (more relevance) + const resultHighAlpha = await lazyGreedyQuerySelection({ + originalText: 'original', + candidates: ['similar', 'different'], + k: 1, + alpha: 0.9 + }); + + expect(resultHighAlpha.selectedData.length).toBe(1); + }); + + it('should return correct embedding tokens', async () => { + const mockResponse = createMockVectorsResponse(['test', 'candidate']); + mockResponse.tokens = 12345; // Override tokens for specific test + + mockGetVectorsByText.mockResolvedValueOnce(mockResponse); + + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'test', + candidates: ['candidate'], + k: 1 + }); + + expect(result.embeddingTokens).toBe(12345); + }); + + it('should handle k=0 correctly', async () => { + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = 
await lazyGreedyQuerySelection({ + originalText: 'test', + candidates: ['candidate'], + k: 0 + }); + + expect(result.selectedData).toEqual([]); + }); + + it('should select diverse candidates when alpha is low', async () => { + const originalVector = generateMockEmbedding('original'); + // Create 3 candidates: 2 similar to each other, 1 different + const similar1 = generateSimilarVector(originalVector, 0.8); + const similar2 = generateSimilarVector(similar1, 0.95); // Very close to similar1 + const different = generateOrthogonalVector(originalVector); + + mockGetVectorsByText.mockResolvedValueOnce({ + tokens: 40, + vectors: [originalVector, similar1, similar2, different] + }); + + const { lazyGreedyQuerySelection } = useTextCosine({ + embeddingModel: 'text-embedding-ada-002' + }); + const result = await lazyGreedyQuerySelection({ + originalText: 'original', + candidates: ['similar1', 'similar2', 'different'], + k: 2, + alpha: 0.1 // Low alpha means more diversity + }); + + expect(result.selectedData.length).toBe(2); + }); + }); +}); diff --git a/test/mocks/common/bullmq.ts b/test/mocks/common/bullmq.ts new file mode 100644 index 000000000..1b82d0dd0 --- /dev/null +++ b/test/mocks/common/bullmq.ts @@ -0,0 +1,23 @@ +import { vi } from 'vitest'; + +// Mock BullMQ to prevent queue connection errors +vi.mock('@fastgpt/service/common/bullmq', async (importOriginal) => { + const actual = (await importOriginal()) as any; + + const mockQueue = { + add: vi.fn().mockResolvedValue({ id: '1' }), + close: vi.fn().mockResolvedValue(undefined), + on: vi.fn() + }; + + const mockWorker = { + close: vi.fn().mockResolvedValue(undefined), + on: vi.fn() + }; + + return { + ...actual, + getQueue: vi.fn(() => mockQueue), + getWorker: vi.fn(() => mockWorker) + }; +}); diff --git a/test/mocks/common/geo.ts b/test/mocks/common/geo.ts new file mode 100644 index 000000000..7599d6f27 --- /dev/null +++ b/test/mocks/common/geo.ts @@ -0,0 +1,10 @@ +import path from 'node:path'; +import { vi } from 
'vitest'; + +vi.mock('@fastgpt/service/common/geo/constants', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + dbPath: path.join(process.cwd(), 'projects/app/data/GeoLite2-City.mmdb') + }; +}); diff --git a/test/mocks/common/mongo.ts b/test/mocks/common/mongo.ts new file mode 100644 index 000000000..29212c61e --- /dev/null +++ b/test/mocks/common/mongo.ts @@ -0,0 +1,19 @@ +import { vi } from 'vitest'; +import { randomUUID } from 'crypto'; +import type { Mongoose } from '@fastgpt/service/common/mongo'; + +/** + * Mock MongoDB connection for testing + * Creates a unique database for each test run and drops it on connection + */ +vi.mock(import('@fastgpt/service/common/mongo/init'), async (importOriginal: any) => { + const mod = await importOriginal(); + return { + ...mod, + connectMongo: async (props: { db: Mongoose; url: string; connectedCb?: () => void }) => { + const { db, url } = props; + await db.connect(url, { dbName: randomUUID() }); + await db.connection.db?.dropDatabase(); + } + }; +}); diff --git a/test/mocks/common/redis.ts b/test/mocks/common/redis.ts new file mode 100644 index 000000000..8d1d3dccd --- /dev/null +++ b/test/mocks/common/redis.ts @@ -0,0 +1,81 @@ +import { vi } from 'vitest'; + +// Create a comprehensive mock Redis client factory +const createMockRedisClient = () => ({ + // Connection methods + on: vi.fn().mockReturnThis(), + connect: vi.fn().mockResolvedValue(undefined), + disconnect: vi.fn().mockResolvedValue(undefined), + quit: vi.fn().mockResolvedValue('OK'), + duplicate: vi.fn(function (this: any) { + return createMockRedisClient(); + }), + + // Key-value operations + get: vi.fn().mockResolvedValue(null), + set: vi.fn().mockResolvedValue('OK'), + del: vi.fn().mockResolvedValue(1), + exists: vi.fn().mockResolvedValue(0), + keys: vi.fn().mockResolvedValue([]), + + // Hash operations + hget: vi.fn().mockResolvedValue(null), + hset: vi.fn().mockResolvedValue(1), + hdel: 
vi.fn().mockResolvedValue(1), + hgetall: vi.fn().mockResolvedValue({}), + hmset: vi.fn().mockResolvedValue('OK'), + + // Expiry operations + expire: vi.fn().mockResolvedValue(1), + ttl: vi.fn().mockResolvedValue(-1), + expireat: vi.fn().mockResolvedValue(1), + + // Increment operations + incr: vi.fn().mockResolvedValue(1), + decr: vi.fn().mockResolvedValue(1), + incrby: vi.fn().mockResolvedValue(1), + decrby: vi.fn().mockResolvedValue(1), + incrbyfloat: vi.fn().mockResolvedValue(1), + + // Server commands + info: vi.fn().mockResolvedValue(''), + ping: vi.fn().mockResolvedValue('PONG'), + flushdb: vi.fn().mockResolvedValue('OK'), + + // List operations + lpush: vi.fn().mockResolvedValue(1), + rpush: vi.fn().mockResolvedValue(1), + lpop: vi.fn().mockResolvedValue(null), + rpop: vi.fn().mockResolvedValue(null), + llen: vi.fn().mockResolvedValue(0), + + // Set operations + sadd: vi.fn().mockResolvedValue(1), + srem: vi.fn().mockResolvedValue(1), + smembers: vi.fn().mockResolvedValue([]), + sismember: vi.fn().mockResolvedValue(0) +}); + +// Mock Redis connections to prevent connection errors in tests +vi.mock('@fastgpt/service/common/redis', async (importOriginal) => { + const actual = (await importOriginal()) as any; + + return { + ...actual, + newQueueRedisConnection: vi.fn(createMockRedisClient), + newWorkerRedisConnection: vi.fn(createMockRedisClient), + getGlobalRedisConnection: vi.fn(() => { + if (!global.mockRedisClient) { + global.mockRedisClient = createMockRedisClient(); + } + return global.mockRedisClient; + }), + initRedisClient: vi.fn().mockResolvedValue(createMockRedisClient()) + }; +}); + +// Initialize global.redisClient with mock before any module imports it +// This prevents getGlobalRedisConnection() from creating a real Redis client +if (!global.redisClient) { + global.redisClient = createMockRedisClient() as any; +} diff --git a/test/mocks/common/s3.ts b/test/mocks/common/s3.ts new file mode 100644 index 000000000..385a97a5f --- /dev/null +++ 
b/test/mocks/common/s3.ts @@ -0,0 +1,167 @@ +import { vi } from 'vitest'; + +// Create mock S3 bucket object for global use +const createMockS3Bucket = () => ({ + name: 'mock-bucket', + client: {}, + externalClient: {}, + exist: vi.fn().mockResolvedValue(true), + delete: vi.fn().mockResolvedValue(undefined), + putObject: vi.fn().mockResolvedValue(undefined), + getObject: vi.fn().mockResolvedValue(null), + statObject: vi.fn().mockResolvedValue({ size: 0, etag: 'mock-etag' }), + move: vi.fn().mockResolvedValue(undefined), + copy: vi.fn().mockResolvedValue(undefined), + addDeleteJob: vi.fn().mockResolvedValue(undefined), + createPostPresignedUrl: vi.fn().mockResolvedValue({ + url: 'http://localhost:9000/mock-bucket', + fields: { key: 'mock-key' }, + maxSize: 100 * 1024 * 1024 + }), + createExternalUrl: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-key'), + createGetPresignedUrl: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-key'), + createPublicUrl: vi.fn((key: string) => `http://localhost:9000/mock-bucket/${key}`) +}); + +// Initialize global s3BucketMap early to prevent any real S3 connections +const mockBucket = createMockS3Bucket(); +global.s3BucketMap = { + 'fastgpt-public': mockBucket, + 'fastgpt-private': mockBucket +} as any; + +// Mock minio Client to prevent real connections +const createMockMinioClient = vi.hoisted(() => { + return vi.fn().mockImplementation(() => ({ + bucketExists: vi.fn().mockResolvedValue(true), + makeBucket: vi.fn().mockResolvedValue(undefined), + setBucketPolicy: vi.fn().mockResolvedValue(undefined), + copyObject: vi.fn().mockResolvedValue(undefined), + removeObject: vi.fn().mockResolvedValue(undefined), + putObject: vi.fn().mockResolvedValue({ etag: 'mock-etag' }), + getObject: vi.fn().mockResolvedValue(null), + statObject: vi.fn().mockResolvedValue({ size: 0, etag: 'mock-etag' }), + presignedGetObject: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-object'), + 
presignedPostPolicy: vi.fn().mockResolvedValue({ + postURL: 'http://localhost:9000/mock-bucket', + formData: { key: 'mock-key' } + }), + newPostPolicy: vi.fn(() => ({ + setKey: vi.fn().mockReturnThis(), + setBucket: vi.fn().mockReturnThis(), + setContentType: vi.fn().mockReturnThis(), + setContentLengthRange: vi.fn().mockReturnThis(), + setExpires: vi.fn().mockReturnThis(), + setUserMetaData: vi.fn().mockReturnThis() + })) + })); +}); + +vi.mock('minio', () => ({ + Client: createMockMinioClient(), + S3Error: class S3Error extends Error {}, + CopyConditions: vi.fn() +})); + +// Simplified S3 bucket class mock +const createMockBucketClass = (defaultName: string) => { + return class MockS3Bucket { + public name: string; + public options: any; + public client = {}; + public externalClient = {}; + + constructor(bucket?: string, options?: any) { + this.name = bucket || defaultName; + this.options = options || {}; + } + + async exist() { + return true; + } + async delete() {} + async putObject() {} + async getObject() { + return null; + } + async statObject() { + return { size: 0, etag: 'mock-etag' }; + } + async move() {} + async copy() {} + async addDeleteJob() {} + async createPostPresignedUrl(params: any, options?: any) { + return { + url: 'http://localhost:9000/mock-bucket', + fields: { key: `mock/${params.teamId || 'test'}/${params.filename}` }, + maxSize: (options?.maxFileSize || 100) * 1024 * 1024 + }; + } + async createExternalUrl(params: any) { + return `http://localhost:9000/mock-bucket/${params.key}`; + } + async createGetPresignedUrl(params: any) { + return `http://localhost:9000/mock-bucket/${params.key}`; + } + createPublicUrl(objectKey: string) { + return `http://localhost:9000/mock-bucket/${objectKey}`; + } + }; +}; + +vi.mock('@fastgpt/service/common/s3/buckets/base', () => ({ + S3BaseBucket: createMockBucketClass('fastgpt-bucket') +})); + +vi.mock('@fastgpt/service/common/s3/buckets/public', () => ({ + S3PublicBucket: 
createMockBucketClass('fastgpt-public') +})); + +vi.mock('@fastgpt/service/common/s3/buckets/private', () => ({ + S3PrivateBucket: createMockBucketClass('fastgpt-private') +})); + +// Mock S3 source modules +vi.mock('@fastgpt/service/common/s3/sources/avatar', () => ({ + getS3AvatarSource: vi.fn(() => ({ + prefix: '/avatar/', + createUploadAvatarURL: vi.fn().mockResolvedValue({ + url: 'http://localhost:9000/mock-bucket', + fields: { key: 'mock-key' }, + maxSize: 5 * 1024 * 1024 + }), + createPublicUrl: vi.fn((key: string) => `http://localhost:9000/mock-bucket/${key}`), + removeAvatarTTL: vi.fn().mockResolvedValue(undefined), + deleteAvatar: vi.fn().mockResolvedValue(undefined), + refreshAvatar: vi.fn().mockResolvedValue(undefined), + copyAvatar: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-avatar') + })) +})); + +vi.mock('@fastgpt/service/common/s3/sources/dataset/index', () => ({ + getS3DatasetSource: vi.fn(() => ({ + createUploadDatasetFileURL: vi.fn().mockResolvedValue({ + url: 'http://localhost:9000/mock-bucket', + fields: { key: 'mock-key' }, + maxSize: 500 * 1024 * 1024 + }), + deleteDatasetFile: vi.fn().mockResolvedValue(undefined) + })), + S3DatasetSource: vi.fn() +})); + +vi.mock('@fastgpt/service/common/s3/sources/chat/index', () => ({ + S3ChatSource: vi.fn() +})); + +// Mock S3 initialization +vi.mock('@fastgpt/service/common/s3', () => ({ + initS3Buckets: vi.fn(() => { + const mockBucket = createMockS3Bucket(); + global.s3BucketMap = { + 'fastgpt-public': mockBucket, + 'fastgpt-private': mockBucket + } as any; + }), + initS3MQWorker: vi.fn().mockResolvedValue(undefined) +})); diff --git a/test/mocks/common/system.ts b/test/mocks/common/system.ts new file mode 100644 index 000000000..294cf1425 --- /dev/null +++ b/test/mocks/common/system.ts @@ -0,0 +1,33 @@ +import { vi } from 'vitest'; +import { readFileSync } from 'fs'; + +/** + * Mock system configuration for testing + */ +vi.mock(import('@/service/common/system'), async 
(importOriginal) => { + const mod = await importOriginal(); + return { + ...mod, + getSystemVersion: async () => { + return '0.0.0'; + }, + readConfigData: async () => { + return readFileSync('projects/app/data/config.json', 'utf-8'); + }, + initSystemConfig: async () => { + // read env from projects/app/.env + const str = readFileSync('projects/app/.env.local', 'utf-8'); + const lines = str.split('\n'); + const systemEnv: Record<string, string> = {}; + for (const line of lines) { + const [key, value] = line.split('='); + if (key && value) { + systemEnv[key] = value; + } + } + global.systemEnv = systemEnv as any; + + return; + } + }; +}); diff --git a/test/mocks/common/tracks.ts b/test/mocks/common/tracks.ts new file mode 100644 index 000000000..cde57a38c --- /dev/null +++ b/test/mocks/common/tracks.ts @@ -0,0 +1,19 @@ +import { vi } from 'vitest'; + +// Mock tracking utilities - automatically mock all methods +vi.mock('@fastgpt/service/common/middle/tracks/utils', async (importOriginal) => { + const actual = (await importOriginal()) as any; + + // Get all methods from original pushTrack and mock them + const mockedPushTrack: Record<string, any> = {}; + if (actual.pushTrack) { + Object.keys(actual.pushTrack).forEach((key) => { + mockedPushTrack[key] = vi.fn(); + }); + } + + return { + ...actual, + pushTrack: mockedPushTrack + }; +}); diff --git a/test/mocks/common/vector.ts b/test/mocks/common/vector.ts new file mode 100644 index 000000000..f1a8b8df6 --- /dev/null +++ b/test/mocks/common/vector.ts @@ -0,0 +1,79 @@ +import { vi } from 'vitest'; + +/** + * Mock Vector Controller for testing + */ + +export const mockVectorInsert = vi.fn().mockResolvedValue({ + insertIds: ['id_1', 'id_2', 'id_3'] +}); + +export const mockVectorDelete = vi.fn().mockResolvedValue(undefined); + +export const mockVectorEmbRecall = vi.fn().mockResolvedValue({ + results: [ + { id: '1', collectionId: 'col_1', score: 0.95 }, + { id: '2', collectionId: 'col_2', score: 0.85 } + ] +}); + +export const mockVectorInit =
vi.fn().mockResolvedValue(undefined); + +export const mockGetVectorDataByTime = vi.fn().mockResolvedValue([ + { id: '1', teamId: 'team_1', datasetId: 'dataset_1' }, + { id: '2', teamId: 'team_1', datasetId: 'dataset_2' } +]); + +export const mockGetVectorCountByTeamId = vi.fn().mockResolvedValue(100); + +export const mockGetVectorCountByDatasetId = vi.fn().mockResolvedValue(50); + +export const mockGetVectorCountByCollectionId = vi.fn().mockResolvedValue(25); + +const MockVectorCtrl = vi.fn().mockImplementation(() => ({ + init: mockVectorInit, + insert: mockVectorInsert, + delete: mockVectorDelete, + embRecall: mockVectorEmbRecall, + getVectorDataByTime: mockGetVectorDataByTime, + getVectorCountByTeamId: mockGetVectorCountByTeamId, + getVectorCountByDatasetId: mockGetVectorCountByDatasetId, + getVectorCountByCollectionId: mockGetVectorCountByCollectionId +})); + +// Mock PgVectorCtrl +vi.mock('@fastgpt/service/common/vectorDB/pg', () => ({ + PgVectorCtrl: MockVectorCtrl +})); + +// Mock ObVectorCtrl +vi.mock('@fastgpt/service/common/vectorDB/oceanbase', () => ({ + ObVectorCtrl: MockVectorCtrl +})); + +// Mock MilvusCtrl +vi.mock('@fastgpt/service/common/vectorDB/milvus', () => ({ + MilvusCtrl: MockVectorCtrl +})); + +// Mock constants - use PG_ADDRESS to ensure PgVectorCtrl is used +vi.mock('@fastgpt/service/common/vectorDB/constants', () => ({ + DatasetVectorDbName: 'fastgpt', + DatasetVectorTableName: 'modeldata', + PG_ADDRESS: 'mock://pg', + OCEANBASE_ADDRESS: undefined, + MILVUS_ADDRESS: undefined, + MILVUS_TOKEN: undefined +})); + +// Export mocks for test assertions +export const resetVectorMocks = () => { + mockVectorInsert.mockClear(); + mockVectorDelete.mockClear(); + mockVectorEmbRecall.mockClear(); + mockVectorInit.mockClear(); + mockGetVectorDataByTime.mockClear(); + mockGetVectorCountByTeamId.mockClear(); + mockGetVectorCountByDatasetId.mockClear(); + mockGetVectorCountByCollectionId.mockClear(); +}; diff --git a/test/mocks/core/ai/embedding.ts 
b/test/mocks/core/ai/embedding.ts new file mode 100644 index 000000000..896fc58ca --- /dev/null +++ b/test/mocks/core/ai/embedding.ts @@ -0,0 +1,131 @@ +import { vi } from 'vitest'; + +/** + * Mock embedding generation utilities for testing + */ + +/** + * Generate a deterministic normalized vector based on text content + * Uses a simple hash-based approach to ensure same text produces same vector + */ +export const generateMockEmbedding = (text: string, dimension: number = 1536): number[] => { + // Simple hash function to generate seed from text + let hash = 0; + for (let i = 0; i < text.length; i++) { + const char = text.charCodeAt(i); + hash = (hash << 5) - hash + char; + hash = hash & hash; // Convert to 32-bit integer + } + + // Generate vector using seeded random + const vector: number[] = []; + let seed = Math.abs(hash); + for (let i = 0; i < dimension; i++) { + // Linear congruential generator + seed = (seed * 1103515245 + 12345) & 0x7fffffff; + vector.push((seed / 0x7fffffff) * 2 - 1); // Range [-1, 1] + } + + // Normalize the vector (L2 norm = 1) + const norm = Math.sqrt(vector.reduce((sum, val) => sum + val * val, 0)); + return vector.map((val) => val / norm); +}; + +/** + * Generate multiple mock embeddings for a list of texts + */ +export const generateMockEmbeddings = (texts: string[], dimension: number = 1536): number[][] => { + return texts.map((text) => generateMockEmbedding(text, dimension)); +}; + +/** + * Create a mock response for getVectorsByText + */ +export const createMockVectorsResponse = ( + texts: string | string[], + dimension: number = 1536 +): { tokens: number; vectors: number[][] } => { + const textArray = Array.isArray(texts) ? 
texts : [texts]; + const vectors = generateMockEmbeddings(textArray, dimension); + + // Estimate tokens (roughly 1 token per 4 characters) + const tokens = textArray.reduce((sum, text) => sum + Math.ceil(text.length / 4), 0); + + return { tokens, vectors }; +}; + +/** + * Generate a vector similar to another vector with controlled similarity + * @param baseVector - The base vector to create similarity from + * @param similarity - Target cosine similarity (0-1), higher means more similar + */ +export const generateSimilarVector = (baseVector: number[], similarity: number = 0.9): number[] => { + const dimension = baseVector.length; + const noise = generateMockEmbedding(`noise_${Date.now()}_${Math.random()}`, dimension); + + // Interpolate between base vector and noise + const vector = baseVector.map((val, i) => val * similarity + noise[i] * (1 - similarity)); + + // Normalize + const norm = Math.sqrt(vector.reduce((sum, val) => sum + val * val, 0)); + return vector.map((val) => val / norm); +}; + +/** + * Generate a vector orthogonal (dissimilar) to the given vector + */ +export const generateOrthogonalVector = (baseVector: number[]): number[] => { + const dimension = baseVector.length; + const randomVector = generateMockEmbedding(`orthogonal_${Date.now()}`, dimension); + + // Gram-Schmidt orthogonalization + const dotProduct = baseVector.reduce((sum, val, i) => sum + val * randomVector[i], 0); + const vector = randomVector.map((val, i) => val - dotProduct * baseVector[i]); + + // Normalize + const norm = Math.sqrt(vector.reduce((sum, val) => sum + val * val, 0)); + return vector.map((val) => val / norm); +}; + +/** + * Mock implementation for getVectorsByText + * Automatically generates embeddings based on input text + */ +export const mockGetVectorsByText = vi.fn( + async ({ + input, + type + }: { + model: any; + input: string[] | string; + type?: string; + }): Promise<{ tokens: number; vectors: number[][] }> => { + const texts = Array.isArray(input) ? 
input : [input]; + return createMockVectorsResponse(texts); + } +); + +/** + * Setup global mock for embedding module + */ +vi.mock('@fastgpt/service/core/ai/embedding', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + getVectorsByText: mockGetVectorsByText + }; +}); + +/** + * Setup global mock for AI model module + */ +vi.mock('@fastgpt/service/core/ai/model', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + getEmbeddingModel: vi.fn().mockReturnValue({ + model: 'text-embedding-ada-002', + name: 'text-embedding-ada-002' + }) + }; +}); diff --git a/test/mocks/core/ai/llm.ts b/test/mocks/core/ai/llm.ts new file mode 100644 index 000000000..c72278d32 --- /dev/null +++ b/test/mocks/core/ai/llm.ts @@ -0,0 +1,139 @@ +import { vi } from 'vitest'; +import type { ChatCompletion } from '@fastgpt/global/core/ai/type'; + +/** + * Mock LLM response utilities for testing + */ + +/** + * Create a mock non-streaming response with reason and text + * This simulates a complete response from models that support reasoning (like o1) + */ +export const createMockCompleteResponseWithReason = (options?: { + content?: string; + reasoningContent?: string; + finishReason?: 'stop' | 'length' | 'content_filter'; + promptTokens?: number; + completionTokens?: number; +}): ChatCompletion => { + const { + content = 'This is the answer to your question.', + reasoningContent = 'First, I need to analyze the question...', + finishReason = 'stop', + promptTokens = 100, + completionTokens = 50 + } = options || {}; + + return { + id: `chatcmpl-${Date.now()}`, + object: 'chat.completion', + created: Math.floor(Date.now() / 1000), + model: 'gpt-4o', + choices: [ + { + index: 0, + message: { + role: 'assistant', + content, + reasoning_content: reasoningContent, + refusal: null + } as any, + logprobs: null, + finish_reason: finishReason + } + ], + usage: { + prompt_tokens: promptTokens, + 
completion_tokens: completionTokens, + total_tokens: promptTokens + completionTokens + }, + system_fingerprint: 'fp_test' + } as ChatCompletion; +}; + +/** + * Create a mock non-streaming response with tool calls + * This simulates a response where the model decides to call tools/functions + */ +export const createMockCompleteResponseWithTool = (options?: { + toolCalls?: Array<{ + id?: string; + name: string; + arguments: string | Record<string, any>; + }>; + finishReason?: 'tool_calls' | 'stop'; + promptTokens?: number; + completionTokens?: number; +}): ChatCompletion => { + const { + toolCalls = [ + { + id: 'call_test_001', + name: 'get_weather', + arguments: { location: 'Beijing', unit: 'celsius' } + } + ], + finishReason = 'tool_calls', + promptTokens = 120, + completionTokens = 30 + } = options || {}; + + return { + id: `chatcmpl-${Date.now()}`, + object: 'chat.completion', + created: Math.floor(Date.now() / 1000), + model: 'gpt-4o', + choices: [ + { + index: 0, + message: { + role: 'assistant', + content: null, + refusal: null, + tool_calls: toolCalls.map((call, index) => ({ + id: call.id || `call_${Date.now()}_${index}`, + type: 'function' as const, + function: { + name: call.name, + arguments: + typeof call.arguments === 'string' ?
call.arguments : JSON.stringify(call.arguments) + } + })) + }, + logprobs: null, + finish_reason: finishReason + } + ], + usage: { + prompt_tokens: promptTokens, + completion_tokens: completionTokens, + total_tokens: promptTokens + completionTokens + }, + system_fingerprint: 'fp_test' + } as ChatCompletion; +}; + +/** + * Mock implementation for createChatCompletion + * Can be configured to return different types of responses based on test needs + */ +export const mockCreateChatCompletion = vi.fn( + async (body: any, options?: any): Promise<ChatCompletion> => { + // Default: return response with text + if (body.tools && body.tools.length > 0) { + return createMockCompleteResponseWithTool(); + } + return createMockCompleteResponseWithReason(); + } +); + +/** + * Setup global mock for LLM request module + */ +vi.mock('@fastgpt/service/core/ai/llm/request', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + createChatCompletion: mockCreateChatCompletion + }; +}); diff --git a/test/mocks/index.ts b/test/mocks/index.ts index 6ba3abc9f..07426cc43 100644 --- a/test/mocks/index.ts +++ b/test/mocks/index.ts @@ -1,127 +1,13 @@ -import { vi } from 'vitest'; import './request'; - -vi.mock('@fastgpt/service/support/audit/util', async (importOriginal) => { - const actual = (await importOriginal()) as any; - return { - ...actual, - addAuditLog: vi.fn() - }; -}); - -// Mock Redis connections to prevent connection errors in tests -vi.mock('@fastgpt/service/common/redis', async (importOriginal) => { - const actual = (await importOriginal()) as any; - - // Create a mock Redis client - const mockRedisClient = { - on: vi.fn(), - connect: vi.fn().mockResolvedValue(undefined), - disconnect: vi.fn().mockResolvedValue(undefined), - keys: vi.fn().mockResolvedValue([]), - get: vi.fn().mockResolvedValue(null), - set: vi.fn().mockResolvedValue('OK'), - del: vi.fn().mockResolvedValue(1), - exists: vi.fn().mockResolvedValue(0), - expire:
vi.fn().mockResolvedValue(1), - ttl: vi.fn().mockResolvedValue(-1) - }; - - return { - ...actual, - newQueueRedisConnection: vi.fn(() => mockRedisClient), - newWorkerRedisConnection: vi.fn(() => mockRedisClient), - getGlobalRedisConnection: vi.fn(() => mockRedisClient) - }; -}); - -// Mock BullMQ to prevent queue connection errors -vi.mock('@fastgpt/service/common/bullmq', async (importOriginal) => { - const actual = (await importOriginal()) as any; - - const mockQueue = { - add: vi.fn().mockResolvedValue({ id: '1' }), - close: vi.fn().mockResolvedValue(undefined), - on: vi.fn() - }; - - const mockWorker = { - close: vi.fn().mockResolvedValue(undefined), - on: vi.fn() - }; - - return { - ...actual, - getQueue: vi.fn(() => mockQueue), - getWorker: vi.fn(() => mockWorker) - }; -}); - -vi.mock('@fastgpt/service/common/s3/buckets/base', async (importOriginal) => { - const actual = (await importOriginal()) as any; - - class MockS3BaseBucket { - private _bucket: string; - public options: any; - - constructor(bucket: string, afterInits?: any, options?: any) { - this._bucket = bucket; - this.options = options || {}; - } - - get name(): string { - return this._bucket; - } - - get client(): any { - return { - bucketExists: vi.fn().mockResolvedValue(true), - makeBucket: vi.fn().mockResolvedValue(undefined), - setBucketPolicy: vi.fn().mockResolvedValue(undefined), - copyObject: vi.fn().mockResolvedValue(undefined), - removeObject: vi.fn().mockResolvedValue(undefined), - presignedPostPolicy: vi.fn().mockResolvedValue({ - postURL: 'http://localhost:9000/mock-bucket', - formData: { key: 'mock-key' } - }), - newPostPolicy: vi.fn(() => ({ - setKey: vi.fn(), - setBucket: vi.fn(), - setContentType: vi.fn(), - setContentLengthRange: vi.fn(), - setExpires: vi.fn(), - setUserMetaData: vi.fn() - })) - }; - } - - move(src: string, dst: string, options?: any): Promise<void> { - return Promise.resolve(); - } - - copy(src: string, dst: string, options?: any): any { - return Promise.resolve(); - }
- - exist(): Promise<boolean> { - return Promise.resolve(true); - } - - delete(objectKey: string, options?: any): Promise<void> { - return Promise.resolve(); - } - - async createPostPresignedUrl(params: any, options?: any): Promise<any> { - const key = `mock/${params.teamId}/${params.filename}`; - return { - url: 'http://localhost:9000/mock-bucket', - fields: { key } - }; - } - } - - return { - ...actual, - S3BaseBucket: MockS3BaseBucket - }; -}); +import './common/geo'; +import './common/mongo'; +import './common/redis'; +import './common/bullmq'; +import './common/s3'; +import './common/system'; +import './common/vector'; +import './common/tracks'; +import './support/audit/utils'; +import './support/user/auth/controller'; +import './core/ai/embedding'; +import './core/ai/llm'; diff --git a/test/mocks/request.ts b/test/mocks/request.ts index 09574c203..f98e325a3 100644 --- a/test/mocks/request.ts +++ b/test/mocks/request.ts @@ -5,15 +5,8 @@ import { MongoGroupMemberModel } from '@fastgpt/service/support/permission/membe import { getTmbInfoByTmbId } from '@fastgpt/service/support/user/team/controller'; import { vi } from 'vitest'; -// vi.mock(import('@/service/middleware/entry'), async () => { -// const NextAPI = vi.fn((handler: any) => handler); -// return { -// NextAPI -// }; -// }); - -vi.mock(import('@fastgpt/service/common/middle/entry'), async (importOriginal) => { - const mod = await importOriginal(); +vi.mock('@fastgpt/service/common/middle/entry', async (importOriginal) => { + const mod = (await importOriginal()) as any; const NextEntry = vi.fn(({ beforeCallback = [] }: { beforeCallback?: Promise<any>[] }) => { return (...args: any) => { return async function api(req: any, res: any) { @@ -67,8 +60,8 @@ export type MockReqType = { [key: string]: any; }; -vi.mock(import('@fastgpt/service/support/permission/auth/common'), async (importOriginal) => { - const mod = await importOriginal(); +vi.mock('@fastgpt/service/support/permission/auth/common', async (importOriginal) => { + const
mod = (await importOriginal()) as any; const parseHeaderCert = vi.fn( ({ req, @@ -98,68 +91,69 @@ vi.mock(import('@fastgpt/service/support/permission/auth/common'), async (import canWrite: true }; }; + + const setCookie = vi.fn(); + return { ...mod, parseHeaderCert, - authCert + authCert, + setCookie }; }); -vi.mock( - import('@fastgpt/service/support/permission/memberGroup/controllers'), - async (importOriginal) => { - const mod = await importOriginal(); - const parseHeaderCert = vi.fn( - ({ - req, - authToken = false, - authRoot = false, - authApiKey = false - }: { - req: MockReqType; - authToken?: boolean; - authRoot?: boolean; - authApiKey?: boolean; - }) => { - const { auth } = req; - if (!auth) { - return Promise.reject(Error('unAuthorization(mock)')); - } - return Promise.resolve(auth); +vi.mock('@fastgpt/service/support/permission/memberGroup/controllers', async (importOriginal) => { + const mod = (await importOriginal()) as any; + const parseHeaderCert = vi.fn( + ({ + req, + authToken = false, + authRoot = false, + authApiKey = false + }: { + req: MockReqType; + authToken?: boolean; + authRoot?: boolean; + authApiKey?: boolean; + }) => { + const { auth } = req; + if (!auth) { + return Promise.reject(Error('unAuthorization(mock)')); } - ); - const authGroupMemberRole = vi.fn(async ({ groupId, role, ...props }: any) => { - const result = await parseHeaderCert(props); - const { teamId, tmbId, isRoot } = result; - if (isRoot) { - return { - ...result, - permission: new TeamPermission({ - isOwner: true - }), - teamId, - tmbId - }; - } - const [groupMember, tmb] = await Promise.all([ - MongoGroupMemberModel.findOne({ groupId, tmbId }), - getTmbInfoByTmbId({ tmbId }) - ]); + return Promise.resolve(auth); + } + ); + const authGroupMemberRole = vi.fn(async ({ groupId, role, ...props }: any) => { + const result = await parseHeaderCert(props); + const { teamId, tmbId, isRoot } = result; + if (isRoot) { + return { + ...result, + permission: new TeamPermission({ + 
isOwner: true + }), + teamId, + tmbId + }; + } + const [groupMember, tmb] = await Promise.all([ + MongoGroupMemberModel.findOne({ groupId, tmbId }), + getTmbInfoByTmbId({ tmbId }) + ]); - // Team admin or role check - if (tmb.permission.hasManagePer || (groupMember && role.includes(groupMember.role))) { - return { - ...result, - permission: tmb.permission, - teamId, - tmbId - }; - } - return Promise.reject(TeamErrEnum.unAuthTeam); - }); - return { - ...mod, - authGroupMemberRole - }; - } -); + // Team admin or role check + if (tmb.permission.hasManagePer || (groupMember && role.includes(groupMember.role))) { + return { + ...result, + permission: tmb.permission, + teamId, + tmbId + }; + } + return Promise.reject(TeamErrEnum.unAuthTeam); + }); + return { + ...mod, + authGroupMemberRole + }; +}); diff --git a/test/mocks/support/audit/utils.ts b/test/mocks/support/audit/utils.ts new file mode 100644 index 000000000..1f382339d --- /dev/null +++ b/test/mocks/support/audit/utils.ts @@ -0,0 +1,9 @@ +import { vi } from 'vitest'; + +vi.mock('@fastgpt/service/support/user/audit/util', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + addAuditLog: vi.fn() + }; +}); diff --git a/test/mocks/support/user/auth/controller.ts b/test/mocks/support/user/auth/controller.ts new file mode 100644 index 000000000..aa9f26f32 --- /dev/null +++ b/test/mocks/support/user/auth/controller.ts @@ -0,0 +1,10 @@ +import { vi } from 'vitest'; + +// Mock auth code validation +vi.mock('@fastgpt/service/support/user/auth/controller', async (importOriginal) => { + const actual = (await importOriginal()) as any; + return { + ...actual, + authCode: vi.fn().mockResolvedValue(true) + }; +}); diff --git a/test/setup.ts b/test/setup.ts index 16dd3df6f..3b22055d1 100644 --- a/test/setup.ts +++ b/test/setup.ts @@ -5,57 +5,15 @@ import { initGlobalVariables } from '@/service/common/system'; import { afterAll, beforeAll, beforeEach, inject, onTestFinished, vi } 
from 'vitest'; import setupModels from './setupModels'; import { clean } from './datas/users'; -import type { Mongoose } from '@fastgpt/service/common/mongo'; import { connectionLogMongo, connectionMongo } from '@fastgpt/service/common/mongo'; -import { randomUUID } from 'crypto'; import { delay } from '@fastgpt/global/common/system/utils'; vi.stubEnv('NODE_ENV', 'test'); -vi.mock(import('@fastgpt/service/common/mongo/init'), async (importOriginal: any) => { - const mod = await importOriginal(); - return { - ...mod, - connectMongo: async (db: Mongoose, url: string) => { - await db.connect(url, { dbName: randomUUID() }); - await db.connection.db?.dropDatabase(); - } - }; -}); - -vi.mock(import('@/service/common/system'), async (importOriginal) => { - const mod = await importOriginal(); - return { - ...mod, - getSystemVersion: async () => { - return '0.0.0'; - }, - readConfigData: async () => { - return readFileSync('projects/app/data/config.json', 'utf-8'); - }, - initSystemConfig: async () => { - // read env from projects/app/.env - - const str = readFileSync('projects/app/.env.local', 'utf-8'); - const lines = str.split('\n'); - const systemEnv: Record<string, string> = {}; - for (const line of lines) { - const [key, value] = line.split('='); - if (key && value) { - systemEnv[key] = value; - } - } - global.systemEnv = systemEnv as any; - - return; - } - }; -}); - beforeAll(async () => { vi.stubEnv('MONGODB_URI', inject('MONGODB_URI')); - await connectMongo(connectionMongo, inject('MONGODB_URI')); - await connectMongo(connectionLogMongo, inject('MONGODB_URI')); + await connectMongo({ db: connectionMongo, url: inject('MONGODB_URI') }); + await connectMongo({ db: connectionLogMongo, url: inject('MONGODB_URI') }); initGlobalVariables(); global.systemEnv = {} as any; @@ -86,16 +44,27 @@ afterAll(async () => { }); beforeEach(async () => { - await connectMongo(connectionMongo, inject('MONGODB_URI')); - await connectMongo(connectionLogMongo, inject('MONGODB_URI')); + await connectMongo({
db: connectionMongo, url: inject('MONGODB_URI') }); + await connectMongo({ db: connectionLogMongo, url: inject('MONGODB_URI') }); onTestFinished(async () => { clean(); - await delay(200); // wait for asynchronous operations to complete - await Promise.all([ - connectionMongo?.connection.db?.dropDatabase(), - connectionLogMongo?.connection.db?.dropDatabase() - ]); + // Wait for any ongoing transactions and operations to complete + await delay(500); + + // Ensure all sessions are closed before dropping database + try { + await Promise.all([ + connectionMongo?.connection.db?.dropDatabase(), + connectionLogMongo?.connection.db?.dropDatabase() + ]); + } catch (error) { + // Ignore errors during cleanup + console.warn('Error during test cleanup:', error); + } + + // Additional delay to prevent lock contention between tests + await delay(100); }); }); diff --git a/test/test.ts b/test/test.ts deleted file mode 100644 index 57b67c4ed..000000000 --- a/test/test.ts +++ /dev/null @@ -1,18 +0,0 @@ -import { MongoUser } from '@fastgpt/service/support/user/schema'; -import { it, expect } from 'vitest'; - -it('should be a test', async () => { - expect(1).toBe(1); -}); - -it('should be able to connect to mongo', async () => { - expect(global.mongodb).toBeDefined(); - expect(global.mongodb?.connection.readyState).toBe(1); - await MongoUser.create({ - username: 'test', - password: '123456' - }); - const user = await MongoUser.findOne({ username: 'test' }); - expect(user).toBeDefined(); - expect(user?.username).toBe('test'); -});
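The hash-seeded embedding helper added in `test/mocks/core/ai/embedding.ts` is self-contained, so its key properties — same text always yields the same vector, and every vector has unit L2 norm — can be checked in isolation. A minimal sketch (the function body is copied from the diff; the 8-dimension default and the sample inputs are illustrative only):

```typescript
// Copy of generateMockEmbedding from test/mocks/core/ai/embedding.ts,
// with the default dimension lowered from 1536 to 8 for the demo.
const generateMockEmbedding = (text: string, dimension: number = 8): number[] => {
  // Simple 32-bit hash of the text, used as the RNG seed
  let hash = 0;
  for (let i = 0; i < text.length; i++) {
    hash = (hash << 5) - hash + text.charCodeAt(i);
    hash = hash & hash; // force 32-bit integer semantics
  }

  // Linear congruential generator seeded by the hash
  const vector: number[] = [];
  let seed = Math.abs(hash);
  for (let i = 0; i < dimension; i++) {
    seed = (seed * 1103515245 + 12345) & 0x7fffffff;
    vector.push((seed / 0x7fffffff) * 2 - 1); // uniform in [-1, 1]
  }

  // Scale to unit L2 norm so dot products behave like cosine similarity
  const norm = Math.sqrt(vector.reduce((sum, val) => sum + val * val, 0));
  return vector.map((val) => val / norm);
};

const a = generateMockEmbedding('hello world');
const b = generateMockEmbedding('hello world');
const norm = Math.sqrt(a.reduce((s, v) => s + v * v, 0));
console.log(a.every((v, i) => v === b[i]), norm.toFixed(4)); // → true 1.0000
```

Because identical text always maps to the same unit vector, recall-style tests can assert exact-match behavior against the mocked vector DB without calling a live embedding model.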