Mirror of https://github.com/labring/FastGPT.git (synced 2025-12-25 20:02:47 +00:00)
V4.14.4 features (#6075)
* perf: faq
* index
* delete dataset
* delete dataset
* perf: delete dataset
* init
* fix: faq
* refresh
* empty tip
* perf: delete type
* fix: some bugs (#6071)
* fix: publish channel doc link
* fix: checkbox disable hover style
* fix: huggingface.svg missing; update doc
* chore: update doc
* fix: typo
* fix: export log dateend;feat: file selector render (#6072)
* fix: export log dateend
* feat: file selector render
* perf: s3 controller
* team qpm limit & plan tracks (#6066)
* team qpm limit & plan tracks
* api entry qpm
* perf: computed days
* Revert "api entry qpm"
This reverts commit 1210c07217.
* perf: code
* system qpm limit
* system qpm limit
---------
Co-authored-by: archer <545436317@qq.com>
* perf: track
* remove export chat test
* doc
* feat: global agent (#6057)
* feat: global agent
* fix: agent
* fix: order display
* chore
* feat: error page log
* fix: var update
---------
Co-authored-by: Finley Ge <32237950+FinleyGe@users.noreply.github.com>
Co-authored-by: heheer <heheer@sealos.io>
Co-authored-by: Roy <whoeverimf5@gmail.com>
parent ed1623bd7f · commit 2da73a6555
@@ -1,6 +1,6 @@
 # FastGPT 文档
 
-这是FastGPT的官方文档,采用fumadoc框架。
+这是FastGPT的官方文档,采用 fumadoc 框架。
 
 ## 运行项目
 
 要运行文档,首先需要进行环境变量配置,在文档的根目录下创建`.env.local`文件,填写以下环境变量:
 
@@ -12,7 +12,7 @@ FASTGPT_HOME_DOMAIN = #要跳转的FastGPT项目的域名,默认海外版
 你可以在FastGPT项目根目录下执行以下命令来运行文档。
 
 ```bash
-npm install #只能npm install,不能pnpm
+npm install # 只能 npm install,不能 pnpm
 npm run dev
 ```
 项目会默认跑在`http:localhost:3000`端口
@@ -4,7 +4,7 @@ title: 报错
 
 1. ### 当前分组上游负载已饱和,请稍后再试(request id:202407100753411462086782835521)
 
-是oneapi渠道的问题,可以换个模型用or换一家中转站
+是oneapi渠道的问题,可以换个模型用或者换一家中转站
 
 1. ### 使用API时在日志中报错Connection Error
 
@@ -20,8 +20,8 @@ FastGPT 云服务版自 v4.14.4 后支持配置自定义域名。
 1. 准备好您的域名。您的域名必须先经过备案,目前支持“阿里云”、“腾讯云”、“火山引擎”三家服务商的备案域名。
 2. 点击“编辑”按钮,进入编辑状态。
 3. 填入您的域名,例如 www.example.com
-4. 在域名服务商的域名解析处,添加界面中提示的 DNS 纪录,注意纪录类型为 CNAME。
-5. 添加解析纪录后,点击“保存”按钮。系统将自动检查 DNS 解析情况,一般情况下,在一分钟内就可以获取到解析纪录。如果长时间没有获取到纪录,可以重试一次。
+4. 在域名服务商的域名解析处,添加界面中提示的 DNS 记录,注意记录类型为 CNAME。
+5. 添加解析记录后,点击"保存"按钮。系统将自动检查 DNS 解析情况,一般情况下,在一分钟内就可以获取到解析记录。如果长时间没有获取到记录,可以重试一次。
 6. 待状态提示显示为“已生效”后,点击“确认”按钮即可。
 
 ![](/imgs/custom_domain1.png)
 
@@ -30,7 +30,7 @@ FastGPT 云服务版自 v4.14.4 后支持配置自定义域名。
 
 ## 域名解析失效
 
-系统会每天对 DNS 解析进行检查,如果发现 DNS 解析纪录失效,则会停用该自定义域名,可以在“自定义域名”管理界面中点击“编辑”进行重新解析。
+系统会每天对 DNS 解析进行检查,如果发现 DNS 解析记录失效,则会停用该自定义域名,可以在"自定义域名"管理界面中点击"编辑"进行重新解析。
 
 ![](/imgs/custom_domain2.png)
 
@@ -30,9 +30,11 @@ curl --location --request POST 'https://{{host}}/api/admin/initv4144' \
 4. 通过 API 上传本地文件至知识库,保存至 S3。同时将旧版 Gridfs 代码全部移除。
 5. 新版订阅套餐逻辑。
 6. 支持配置对话文件白名单。
-7. S3 支持 pathStyle 配置。
+7. S3 支持 pathStyle 和 region 配置。
 8. 支持通过 Sealos 来进行多租户自定义域名配置。
+9. 工作流中引用工具时,文件输入支持手动填写(原本只支持变量引用)。
+10. 支持网络代理(HTTP_PROXY,HTTPS_PROXY)
 
 ## ⚙️ 优化
 
 1. 增加 S3 上传文件超时时长为 5 分钟。
@@ -57,7 +59,12 @@ curl --location --request POST 'https://{{host}}/api/admin/initv4144' \
 10. http 节点使用值为空字符串的全局变量时,值会被替换为 null。
 11. 判断器节点折叠时,连线断开。
 12. 节点调试时,单选和多选类型的变量无法展示选项。
+13. 发布渠道文档链接定位错误。
+14. Checkbox 在禁用状态时,hover 样式错误。
+15. 模型头像缺失情况下,默认 huggingface.svg 图标显示错误。
+16. 日志导出时,结束时间会多出一天。
 
 ## 插件
 
 1. 新增 GLM4.6 与 DS3.2 系列模型预设。
 2. 修复 MinerU SaaS 插件模型版本不能选择 vlm 的问题
@@ -27,7 +27,7 @@ import { Alert } from '@/components/docs/Alert';
 背景知识中,引导模型调用工具去执行不通的操作。
 
 <Alert icon="🤗" context="success">
-**Tips:** 这里需要增加适当的上下文,方便模型结合历史纪录进行判断和决策~
+**Tips:** 这里需要增加适当的上下文,方便模型结合历史记录进行判断和决策~
 </Alert>
 
 ## 3. HTTP 模块
@@ -34,7 +34,7 @@ description: FastGPT 接入企微机器人教程
 
 ### 2.4 获取关键密钥
 
-随机生成或者手动输入 Token 和 Encoding-AESKey,并且纪录下来
+随机生成或者手动输入 Token 和 Encoding-AESKey,并且记录下来
 
 ![](/imgs/wecom-bot-4.png)
 
@@ -46,7 +46,7 @@ description: FastGPT 接入企微机器人教程
 
 ### 2.6 配置发布渠道信息
 
-配置该发布渠道的信息,需要填入 Token 和 AESKey,也就是第四步中纪录下来的 Token 和 Encoding-AESKey
+配置该发布渠道的信息,需要填入 Token 和 AESKey,也就是第四步中记录下来的 Token 和 Encoding-AESKey
 
 ![](/imgs/wecom-bot-6.png)
 
@@ -69,4 +69,4 @@ description: FastGPT 接入企微机器人教程
 1. 检查可信域名是否配置正确。
 2. 检查 Token 和 Encoding-AESKey 是否正确。
 3. 查看 FastGPT 对话日志,是否有对应的提问记录。
-4. 如果没记录,则可能是应用运行报错了,可以先试试最简单的机器人.
+4. 如果没记录,则可能是应用运行报错了,可以先试试最简单的机器人。
@@ -2,7 +2,7 @@
   "document/content/docs/faq/app.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/faq/chat.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/faq/dataset.mdx": "2025-08-02T19:38:37+08:00",
-  "document/content/docs/faq/error.mdx": "2025-08-02T19:38:37+08:00",
+  "document/content/docs/faq/error.mdx": "2025-12-10T13:24:24+08:00",
   "document/content/docs/faq/external_channel_integration.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/faq/index.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/faq/other.mdx": "2025-08-04T22:07:52+08:00",
@@ -89,7 +89,7 @@
   "document/content/docs/introduction/guide/plugins/google_search_plugin_guide.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/introduction/guide/plugins/searxng_plugin_guide.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/introduction/guide/plugins/upload_system_tool.mdx": "2025-11-04T16:58:12+08:00",
-  "document/content/docs/introduction/guide/team_permissions/customDomain.mdx": "2025-12-09T23:33:32+08:00",
+  "document/content/docs/introduction/guide/team_permissions/customDomain.mdx": "2025-12-10T13:24:24+08:00",
   "document/content/docs/introduction/guide/team_permissions/invitation_link.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/introduction/guide/team_permissions/team_roles_permissions.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/introduction/index.en.mdx": "2025-07-23T21:35:03+08:00",
@@ -119,7 +119,7 @@
   "document/content/docs/upgrading/4-14/4141.mdx": "2025-11-19T10:15:27+08:00",
   "document/content/docs/upgrading/4-14/4142.mdx": "2025-11-18T19:27:14+08:00",
   "document/content/docs/upgrading/4-14/4143.mdx": "2025-11-26T20:52:05+08:00",
-  "document/content/docs/upgrading/4-14/4144.mdx": "2025-12-09T23:33:32+08:00",
+  "document/content/docs/upgrading/4-14/4144.mdx": "2025-12-10T13:28:04+08:00",
   "document/content/docs/upgrading/4-8/40.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/upgrading/4-8/41.mdx": "2025-08-02T19:38:37+08:00",
   "document/content/docs/upgrading/4-8/42.mdx": "2025-08-02T19:38:37+08:00",
@@ -191,7 +191,7 @@
   "document/content/docs/use-cases/app-cases/feishu_webhook.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/use-cases/app-cases/fixingEvidence.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/use-cases/app-cases/google_search.mdx": "2025-07-23T21:35:03+08:00",
-  "document/content/docs/use-cases/app-cases/lab_appointment.mdx": "2025-07-23T21:35:03+08:00",
+  "document/content/docs/use-cases/app-cases/lab_appointment.mdx": "2025-12-10T13:24:24+08:00",
   "document/content/docs/use-cases/app-cases/multi_turn_translation_bot.mdx": "2025-07-23T21:35:03+08:00",
   "document/content/docs/use-cases/app-cases/submit_application_template.mdx": "2025-08-05T23:20:39+08:00",
   "document/content/docs/use-cases/app-cases/translate-subtitle-using-gpt.mdx": "2025-07-23T21:35:03+08:00",
@@ -199,6 +199,6 @@
   "document/content/docs/use-cases/external-integration/feishu.mdx": "2025-07-24T14:23:04+08:00",
   "document/content/docs/use-cases/external-integration/official_account.mdx": "2025-08-05T23:20:39+08:00",
   "document/content/docs/use-cases/external-integration/openapi.mdx": "2025-09-29T11:34:11+08:00",
-  "document/content/docs/use-cases/external-integration/wecom.mdx": "2025-12-09T23:33:32+08:00",
+  "document/content/docs/use-cases/external-integration/wecom.mdx": "2025-12-10T13:24:24+08:00",
   "document/content/docs/use-cases/index.mdx": "2025-07-24T14:23:04+08:00"
 }
@@ -16,3 +16,10 @@ export const parseI18nString = (str: I18nStringType | string = '', lang = 'en')
   // 最后回退到英文
   return str['en'] || '';
 };
+
+export const formatI18nLocationToZhEn = (locale: localeType = 'zh-CN'): 'zh' | 'en' => {
+  if (locale.toLocaleLowerCase().startsWith('zh')) {
+    return 'zh';
+  }
+  return 'en';
+};
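For reference, the new helper collapses any locale tag into the two buckets downstream lookups support. A minimal usage sketch (the import path is the one used elsewhere in this diff; sample tags are illustrative):

```ts
import { formatI18nLocationToZhEn } from '@fastgpt/global/common/i18n/utils';

// Any tag whose lowercase form starts with "zh" maps to 'zh'; everything else falls back to 'en'.
formatI18nLocationToZhEn('zh-CN'); // 'zh'
formatI18nLocationToZhEn('zh-Hant'); // 'zh'
formatI18nLocationToZhEn('en-US'); // 'en'
formatI18nLocationToZhEn('ja'); // 'en'
```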
@@ -10,5 +10,10 @@ export enum TrackEnum {
   readSystemAnnouncement = 'readSystemAnnouncement',
   clickOperationalAd = 'clickOperationalAd',
   closeOperationalAd = 'closeOperationalAd',
-  teamChatQPM = 'teamChatQPM'
+  teamChatQPM = 'teamChatQPM',
+  subscriptionDeleted = 'subscriptionDeleted',
+  freeAccountCleanup = 'freeAccountCleanup',
+
+  // web tracks
+  clientError = 'clientError'
 }
@@ -209,14 +209,15 @@ export type DispatchNodeResponseType = {
   headers?: Record<string, any>;
   httpResult?: Record<string, any>;
 
-  // plugin output
+  // Tool
+  toolInput?: Record<string, any>;
   pluginOutput?: Record<string, any>;
   pluginDetail?: ChatHistoryItemResType[];
 
   // if-else
   ifElseResult?: string;
 
-  // tool
+  // tool call
   toolCallInputTokens?: number;
   toolCallOutputTokens?: number;
   toolDetail?: ChatHistoryItemResType[];
@@ -225,9 +226,6 @@ export type DispatchNodeResponseType = {
   // code
   codeLog?: string;
 
-  // plugin
-  pluginOutput?: Record<string, any>;
-
   // read files
   readFilesResult?: string;
   readFiles?: ReadFileNodeResponse;
@@ -2,25 +2,50 @@
 import { getGlobalRedisConnection } from '../../common/redis';
 import { jsonRes } from '../../common/response';
 import type { NextApiResponse } from 'next';
+import { teamQPM } from '../../support/wallet/sub/utils';
+import z from 'zod';
 import { addLog } from '../system/log';
 
 export enum LimitTypeEnum {
   chat = 'chat'
 }
-const limitMap = {
-  [LimitTypeEnum.chat]: {
-    seconds: 60,
-    limit: Number(process.env.CHAT_MAX_QPM || 5000)
-  }
-};
 
-type FrequencyLimitOption = {
-  teamId: string;
-  type: LimitTypeEnum;
-  res: NextApiResponse;
-};
+const FrequencyLimitOptionSchema = z.union([
+  z.object({
+    type: z.literal(LimitTypeEnum.chat),
+    teamId: z.string()
+  })
+]);
+type FrequencyLimitOption = z.infer<typeof FrequencyLimitOptionSchema>;
+
+const getLimitData = async (data: FrequencyLimitOption) => {
+  if (data.type === LimitTypeEnum.chat) {
+    const qpm = await teamQPM.getTeamQPMLimit(data.teamId);
+
+    if (!qpm) return;
+
+    return {
+      limit: qpm,
+      seconds: 60
+    };
+  }
+  return;
+};
+
 /*
 true: 未达到限制
 false: 达到了限制
 */
-export const teamFrequencyLimit = async ({ teamId, type, res }: FrequencyLimitOption) => {
-  const { seconds, limit } = limitMap[type];
+export const teamFrequencyLimit = async ({
+  teamId,
+  type,
+  res
+}: FrequencyLimitOption & { res: NextApiResponse }) => {
+  const data = await getLimitData({ type, teamId });
+  if (!data) return true;
+
+  const { limit, seconds } = data;
+
   const redis = getGlobalRedisConnection();
   const key = `frequency:${type}:${teamId}`;
 
@@ -31,13 +56,16 @@ export const teamFrequencyLimit = async ({ teamId, type, res }: FrequencyLimitOp
     .exec();
 
   if (!result) {
-    return Promise.reject(new Error('Redis connection error'));
+    return true;
   }
 
   const currentCount = result[0][1] as number;
 
   if (currentCount > limit) {
     const remainingTime = await redis.ttl(key);
+    addLog.info(
+      `[Completion Limit] Team ${teamId} reached the limit of ${limit} requests per ${seconds} seconds. Remaining time: ${remainingTime} seconds.`
+    );
     jsonRes(res, {
       code: 429,
       error: `Rate limit exceeded. Maximum ${limit} requests per ${seconds} seconds for this team. Please try again in ${remainingTime} seconds.`
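For context, a sketch of how an API entry might gate a request with this helper. The handler shape and import path are hypothetical; the `teamFrequencyLimit` signature and its true/false contract come from the diff above (on a hit, the helper has already written the 429 response via `jsonRes`):

```ts
import type { NextApiRequest, NextApiResponse } from 'next';
// Hypothetical import path; the module location is not shown in this diff.
import { teamFrequencyLimit, LimitTypeEnum } from './frequencyLimit';

async function chatHandler(req: NextApiRequest, res: NextApiResponse, teamId: string) {
  // true: under the per-team QPM budget; false: limit hit and a 429 was already sent.
  const ok = await teamFrequencyLimit({ teamId, type: LimitTypeEnum.chat, res });
  if (!ok) return;

  // ...continue dispatching the chat request
}
```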
@@ -2,11 +2,13 @@ import fs from 'node:fs';
 import type { ReaderModel } from '@maxmind/geoip2-node';
 import { Reader } from '@maxmind/geoip2-node';
 import { cleanupIntervalMs, dbPath, privateOrOtherLocationName } from './constants';
-import type { I18nName, LocationName } from './type';
+import type { LocationName } from './type';
 import { extractLocationData } from './utils';
 import type { NextApiRequest } from 'next';
 import { getClientIp } from 'request-ip';
 import { addLog } from '../system/log';
+import type { localeType } from '@fastgpt/global/common/i18n/type';
+import { formatI18nLocationToZhEn } from '@fastgpt/global/common/i18n/utils';
 
 let reader: ReaderModel | null = null;
 
@@ -25,21 +27,23 @@ export function getGeoReader() {
   return reader;
 }
 
-export function getLocationFromIp(ip?: string, locale: keyof I18nName = 'zh') {
+export function getLocationFromIp(ip?: string, locale: localeType = 'zh-CN') {
+  const formatedLocale = formatI18nLocationToZhEn(locale);
+
   if (!ip) {
-    return privateOrOtherLocationName.country?.[locale];
+    return privateOrOtherLocationName.country?.[formatedLocale];
   }
   const reader = getGeoReader();
 
   let locationName = locationIpMap.get(ip);
   if (locationName) {
     return [
-      locationName.country?.[locale],
-      locationName.province?.[locale],
-      locationName.city?.[locale]
+      locationName.country?.[formatedLocale],
+      locationName.province?.[formatedLocale],
+      locationName.city?.[formatedLocale]
     ]
       .filter(Boolean)
-      .join(locale === 'zh' ? ',' : ',');
+      .join(formatedLocale === 'zh' ? ',' : ',');
   }
 
   try {
@@ -62,15 +66,15 @@ export function getLocationFromIp(ip?: string, locale: keyof I18nName = 'zh') {
     locationIpMap.set(ip, locationName);
 
     return [
-      locationName.country?.[locale],
-      locationName.province?.[locale],
-      locationName.city?.[locale]
+      locationName.country?.[formatedLocale],
+      locationName.province?.[formatedLocale],
+      locationName.city?.[formatedLocale]
     ]
       .filter(Boolean)
-      .join(locale === 'zh' ? ',' : ', ');
+      .join(formatedLocale === 'zh' ? ',' : ', ');
   } catch (error) {
     locationIpMap.set(ip, privateOrOtherLocationName);
-    return privateOrOtherLocationName.country?.[locale];
+    return privateOrOtherLocationName.country?.[formatedLocale];
   }
 }
@@ -8,6 +8,7 @@ import type { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
 import { getAppLatestVersion } from '../../../core/app/version/controller';
 import { type ShortUrlParams } from '@fastgpt/global/support/marketing/type';
 import { getRedisCache, setRedisCache } from '../../redis/cache';
+import { differenceInDays } from 'date-fns';
 
 const createTrack = ({ event, data }: { event: TrackEnum; data: Record<string, any> }) => {
   if (!global.feConfigs?.isPlus) return;
@@ -156,5 +157,33 @@ export const pushTrack = {
         teamId: data.teamId
       }
     });
-  }
+  },
+  subscriptionDeleted: (data: {
+    teamId: string;
+    subscriptionType: string;
+    totalPoints: number;
+    usedPoints: number;
+    startTime: Date;
+    expiredTime: Date;
+  }) => {
+    return createTrack({
+      event: TrackEnum.subscriptionDeleted,
+      data: {
+        teamId: data.teamId,
+        subscriptionType: data.subscriptionType,
+        totalPoints: data.totalPoints,
+        usedPoints: data.usedPoints,
+        activeDays: differenceInDays(data.expiredTime, data.startTime)
+      }
+    });
+  },
+  freeAccountCleanup: (data: { teamId: string; expiredTime: Date }) => {
+    return createTrack({
+      event: TrackEnum.freeAccountCleanup,
+      data: {
+        teamId: data.teamId,
+        expiredTime: data.expiredTime
+      }
+    });
+  }
 };
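A sketch of how the new track points might be reported from a subscription cleanup path. The call site and field values are illustrative; the method signatures are the ones added above, and `activeDays` is derived internally with `differenceInDays`:

```ts
// Hypothetical call site; pushTrack is the object extended in this diff.
pushTrack.subscriptionDeleted({
  teamId: 'team_id', // illustrative
  subscriptionType: 'standard',
  totalPoints: 10000,
  usedPoints: 4200,
  startTime: new Date('2025-11-01'),
  expiredTime: new Date('2025-12-01')
});

pushTrack.freeAccountCleanup({ teamId: 'team_id', expiredTime: new Date() });
```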
@@ -0,0 +1,19 @@
+import http from 'http';
+import https from 'https';
+import { HttpProxyAgent } from 'http-proxy-agent';
+import { HttpsProxyAgent } from 'https-proxy-agent';
+
+if (process.env.HTTP_PROXY || process.env.HTTPS_PROXY) {
+  const httpProxy = process.env.HTTP_PROXY;
+  const httpsProxy = process.env.HTTPS_PROXY;
+  if (httpProxy) {
+    http.globalAgent = new HttpProxyAgent(httpProxy);
+  }
+  if (httpsProxy) {
+    https.globalAgent = new HttpsProxyAgent(httpsProxy);
+  }
+
+  console.info(`Global Proxy enabled: ${httpProxy}, ${httpsProxy}`);
+} else {
+  console.info('Global Proxy disabled');
+}
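Note that this module applies the proxy purely as an import side effect, so it must be imported before any code that issues requests through Node's global agents. A hedged sketch of the wiring (entry file and path are hypothetical):

```ts
// Hypothetical service entry point.
// With HTTP_PROXY / HTTPS_PROXY set, this import replaces http.globalAgent /
// https.globalAgent before anything else opens a connection.
import './common/system/proxy'; // hypothetical path to the new file
```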
@@ -8,14 +8,16 @@ const getCacheKey = (key: string) => `${redisPrefix}${key}`;
 export enum CacheKeyEnum {
   team_vector_count = 'team_vector_count',
   team_point_surplus = 'team_point_surplus',
-  team_point_total = 'team_point_total'
+  team_point_total = 'team_point_total',
+  team_qpm_limit = 'team_qpm_limit'
 }
 
 // Seconds
 export enum CacheKeyEnumTime {
   team_vector_count = 30 * 60,
   team_point_surplus = 1 * 60,
-  team_point_total = 1 * 60
+  team_point_total = 1 * 60,
+  team_qpm_limit = 60 * 60
 }
 
 export const setRedisCache = async (
@@ -15,6 +15,7 @@ import { addLog } from '../../system/log';
 import { addS3DelJob } from '../mq';
 import { type Readable } from 'node:stream';
 import { type UploadFileByBufferParams, UploadFileByBufferSchema } from '../type';
+import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
 
 export class S3BaseBucket {
   private _client: Client;
@@ -26,7 +27,7 @@ export class S3BaseBucket {
   * @param options the options for the s3 client
   */
  constructor(
-    private readonly bucketName: string,
+    public readonly bucketName: string,
    public options: Partial<S3OptionsType> = defaultS3Options
  ) {
    options = { ...defaultS3Options, ...options };
@@ -51,25 +52,23 @@ export class S3BaseBucket {
      accessKey: options.accessKey,
      secretKey: options.secretKey,
      pathStyle: options.pathStyle,
-      transportAgent: options.transportAgent
+      region: options.region
    });
    }
 
    const init = async () => {
-      if (!(await this.exist())) {
+      // Not exists bucket, create it
+      if (!(await this.client.bucketExists(this.bucketName))) {
        await this.client.makeBucket(this.bucketName);
      }
      await this.options.afterInit?.();
-      console.log(`S3 init success: ${this.name}`);
+      console.log(`S3 init success: ${this.bucketName}`);
    };
    if (this.options.init) {
      init();
    }
  }
 
-  get name(): string {
-    return this.bucketName;
-  }
  get client(): Client {
    return this._client;
  }
@@ -110,21 +109,17 @@ export class S3BaseBucket {
      copyConditions?: CopyConditions;
    };
  }): ReturnType<Client['copyObject']> {
-    const bucket = this.name;
+    const bucket = this.bucketName;
    if (options?.temporary) {
      await MongoS3TTL.create({
        minioKey: to,
-        bucketName: this.name,
+        bucketName: this.bucketName,
        expiredTime: addHours(new Date(), 24)
      });
    }
    return this.client.copyObject(bucket, to, `${bucket}/${from}`, options?.copyConditions);
  }
 
-  exist(): Promise<boolean> {
-    return this.client.bucketExists(this.name);
-  }
-
  async delete(objectKey: string, options?: RemoveOptions): Promise<void> {
    try {
      if (!objectKey) return Promise.resolve();
@@ -133,11 +128,11 @@ export class S3BaseBucket {
      const fileParsedPrefix = `${path.dirname(objectKey)}/${path.basename(objectKey, path.extname(objectKey))}-parsed`;
      await this.addDeleteJob({ prefix: fileParsedPrefix });
 
-      return await this.client.removeObject(this.name, objectKey, options);
+      return await this.client.removeObject(this.bucketName, objectKey, options);
    } catch (error) {
      if (error instanceof S3Error) {
        if (error.code === 'InvalidObjectName') {
-          addLog.warn(`${this.name} delete object not found: ${objectKey}`, error);
+          addLog.warn(`${this.bucketName} delete object not found: ${objectKey}`, error);
          return Promise.resolve();
        }
      }
@@ -145,27 +140,43 @@ export class S3BaseBucket {
    }
  }
 
+  // 列出文件
  listObjectsV2(
    ...params: Parameters<Client['listObjectsV2']> extends [string, ...infer R] ? R : never
  ) {
-    return this.client.listObjectsV2(this.name, ...params);
+    return this.client.listObjectsV2(this.bucketName, ...params);
  }
 
+  // 上传文件
  putObject(...params: Parameters<Client['putObject']> extends [string, ...infer R] ? R : never) {
-    return this.client.putObject(this.name, ...params);
+    return this.client.putObject(this.bucketName, ...params);
  }
 
-  getObject(...params: Parameters<Client['getObject']> extends [string, ...infer R] ? R : never) {
-    return this.client.getObject(this.name, ...params);
-  }
+  // 获取文件流
+  getFileStream(
+    ...params: Parameters<Client['getObject']> extends [string, ...infer R] ? R : never
+  ) {
+    return this.client.getObject(this.bucketName, ...params);
+  }
 
-  statObject(...params: Parameters<Client['statObject']> extends [string, ...infer R] ? R : never) {
-    return this.client.statObject(this.name, ...params);
-  }
+  // 获取文件状态
+  async statObject(
+    ...params: Parameters<Client['statObject']> extends [string, ...infer R] ? R : never
+  ) {
+    try {
+      return await this.client.statObject(this.bucketName, ...params);
+    } catch (error) {
+      if (error instanceof S3Error && error.message === 'Not Found') {
+        return null;
+      }
+      return Promise.reject(error);
+    }
+  }
 
+  // 判断文件是否存在
  async isObjectExists(key: string): Promise<boolean> {
    try {
-      await this.client.statObject(this.name, key);
+      await this.client.statObject(this.bucketName, key);
      return true;
    } catch (err) {
      if (err instanceof S3Error && err.message === 'Not Found') {
@@ -175,6 +186,7 @@ export class S3BaseBucket {
    }
  }
 
+  // 将文件流转换为Buffer
  async fileStreamToBuffer(stream: Readable): Promise<Buffer> {
    const chunks: Buffer[] = [];
    for await (const chunk of stream) {
@@ -184,7 +196,7 @@ export class S3BaseBucket {
  }
 
  addDeleteJob(params: Omit<Parameters<typeof addS3DelJob>[0], 'bucketName'>) {
-    return addS3DelJob({ ...params, bucketName: this.name });
+    return addS3DelJob({ ...params, bucketName: this.bucketName });
  }
 
  async createPostPresignedUrl(
@@ -202,7 +214,7 @@ export class S3BaseBucket {
 
    const policy = this.externalClient.newPostPolicy();
    policy.setKey(key);
-    policy.setBucket(this.name);
+    policy.setBucket(this.bucketName);
    policy.setContentType(contentType);
    if (formatMaxFileSize) {
      policy.setContentLengthRange(1, formatMaxFileSize);
@@ -220,7 +232,7 @@ export class S3BaseBucket {
    if (expiredHours) {
      await MongoS3TTL.create({
        minioKey: key,
-        bucketName: this.name,
+        bucketName: this.bucketName,
        expiredTime: addHours(new Date(), expiredHours)
      });
    }
@@ -242,7 +254,7 @@ export class S3BaseBucket {
    const { key, expiredHours } = parsed;
    const expires = expiredHours ? expiredHours * 60 * 60 : 30 * 60; // expires 的单位是秒 默认 30 分钟
 
-    return await this.externalClient.presignedGetObject(this.name, key, expires);
+    return await this.externalClient.presignedGetObject(this.bucketName, key, expires);
  }
 
  async createPreviewUrl(params: createPreviewUrlParams) {
@@ -251,7 +263,7 @@ export class S3BaseBucket {
    const { key, expiredHours } = parsed;
    const expires = expiredHours ? expiredHours * 60 * 60 : 30 * 60; // expires 的单位是秒 默认 30 分钟
 
-    return await this.client.presignedGetObject(this.name, key, expires);
+    return await this.client.presignedGetObject(this.bucketName, key, expires);
  }
 
  async uploadFileByBuffer(params: UploadFileByBufferParams) {
@@ -259,7 +271,7 @@ export class S3BaseBucket {
 
    await MongoS3TTL.create({
      minioKey: key,
-      bucketName: this.name,
+      bucketName: this.bucketName,
      expiredTime: addHours(new Date(), 1)
    });
    await this.putObject(key, buffer, undefined, {
@@ -274,4 +286,22 @@ export class S3BaseBucket {
      })
    };
  }
+
+  // 对外包装的方法
+  // 获取文件元数据
+  async getFileMetadata(key: string) {
+    const stat = await this.statObject(key);
+    if (!stat) return;
+
+    const contentLength = stat.size;
+    const filename: string = decodeURIComponent(stat.metaData['origin-filename']);
+    const extension = parseFileExtensionFromUrl(filename);
+    const contentType: string = stat.metaData['content-type'];
+    return {
+      filename,
+      extension,
+      contentType,
+      contentLength
+    };
+  }
 }
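Since `statObject` now resolves to `null` on `Not Found` instead of throwing, callers can probe for optional objects without a try/catch. A small sketch against the refactored API (the helper and the key are illustrative; `S3BaseBucket` is the class defined in this diff):

```ts
// Hypothetical helper built on the new base-bucket methods.
async function describeObject(bucket: S3BaseBucket, key: string) {
  // getFileMetadata returns undefined when statObject resolves to null (missing object).
  const meta = await bucket.getFileMetadata(key);
  if (!meta) return null;

  const { filename, extension, contentType, contentLength } = meta;
  return `${filename} (${extension}, ${contentType}, ${contentLength} bytes)`;
}
```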
@@ -7,7 +7,7 @@ export class S3PublicBucket extends S3BaseBucket {
    super(S3Buckets.public, {
      ...options,
      afterInit: async () => {
-        const bucket = this.name;
+        const bucket = this.bucketName;
        const policy = JSON.stringify({
          Version: '2012-10-17',
          Statement: [
@@ -34,7 +34,7 @@ export class S3PublicBucket extends S3BaseBucket {
    const protocol = this.options.useSSL ? 'https' : 'http';
    const hostname = this.options.endPoint;
    const port = this.options.port;
-    const bucket = this.name;
+    const bucket = this.bucketName;
 
    const url = new URL(`${protocol}://${hostname}:${port}/${bucket}/${objectKey}`);
 
@@ -37,11 +37,7 @@ export const defaultS3Options: {
  secretKey: process.env.S3_SECRET_KEY || 'minioadmin',
  port: process.env.S3_PORT ? parseInt(process.env.S3_PORT) : 9000,
  pathStyle: process.env.S3_PATH_STYLE === 'false' ? false : true,
-  transportAgent: process.env.HTTP_PROXY
-    ? new HttpProxyAgent(process.env.HTTP_PROXY)
-    : process.env.HTTPS_PROXY
-      ? new HttpsProxyAgent(process.env.HTTPS_PROXY)
-      : undefined
+  region: process.env.S3_REGION || undefined
 };
 
 export const S3Buckets = {
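The per-client `transportAgent` wiring removed here is superseded by the global proxy module added earlier in this commit; `S3_REGION` becomes the only new knob. For clarity, a restatement of the parsing logic above (variable names are illustrative):

```ts
// S3_PATH_STYLE: only the literal string 'false' disables path-style addressing;
// any other value, including unset, keeps it enabled.
const pathStyle = process.env.S3_PATH_STYLE === 'false' ? false : true;
// S3_REGION: optional; when unset, region stays undefined and the minio Client
// falls back to its own default.
const region = process.env.S3_REGION || undefined;
```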
@@ -8,8 +8,8 @@ export function initS3Buckets() {
  const privateBucket = new S3PrivateBucket({ init: true });
 
  global.s3BucketMap = {
-    [publicBucket.name]: publicBucket,
-    [privateBucket.name]: privateBucket
+    [publicBucket.bucketName]: publicBucket,
+    [privateBucket.bucketName]: privateBucket
  };
 }
 
@@ -57,7 +57,7 @@ export const startS3DelWorker = async () => {
 
    const p = limit(() =>
      // 因为封装的 delete 方法里,包含前缀删除,这里不能再使用,避免循环。
-      retryFn(() => bucket.client.removeObject(bucket.name, file.name))
+      retryFn(() => bucket.client.removeObject(bucket.bucketName, file.name))
    );
    tasks.push(p);
  });
@@ -5,11 +5,9 @@ import { imageBaseUrl } from '@fastgpt/global/common/file/image/constants';
 import type { ClientSession } from 'mongoose';
 import { getFileS3Key } from '../utils';
 
-class S3AvatarSource {
-  private bucket: S3PublicBucket;
-
+class S3AvatarSource extends S3PublicBucket {
  constructor() {
-    this.bucket = new S3PublicBucket();
+    super();
  }
 
  get prefix(): string {
@@ -27,7 +25,7 @@ class S3AvatarSource {
  }) {
    const { fileKey } = getFileS3Key.avatar({ teamId, filename });
 
-    return this.bucket.createPostPresignedUrl(
+    return this.createPostPresignedUrl(
      { filename, rawKey: fileKey },
      {
        expiredHours: autoExpired ? 1 : undefined, // 1 Hours
@@ -36,19 +34,15 @@ class S3AvatarSource {
    );
  }
 
-  createPublicUrl(objectKey: string): string {
-    return this.bucket.createPublicUrl(objectKey);
-  }
-
  async removeAvatarTTL(avatar: string, session?: ClientSession): Promise<void> {
    const key = avatar.slice(this.prefix.length);
-    await MongoS3TTL.deleteOne({ minioKey: key, bucketName: this.bucket.name }, session);
+    await MongoS3TTL.deleteOne({ minioKey: key, bucketName: this.bucketName }, session);
  }
 
  async deleteAvatar(avatar: string, session?: ClientSession): Promise<void> {
    const key = avatar.slice(this.prefix.length);
-    await MongoS3TTL.deleteOne({ minioKey: key, bucketName: this.bucket.name }, session);
-    await this.bucket.delete(key);
+    await MongoS3TTL.deleteOne({ minioKey: key, bucketName: this.bucketName }, session);
+    await this.delete(key);
  }
 
  async refreshAvatar(newAvatar?: string, oldAvatar?: string, session?: ClientSession) {
@@ -78,7 +72,7 @@ class S3AvatarSource {
  }) {
    const from = key.slice(this.prefix.length);
    const to = `${S3Sources.avatar}/${teamId}/${filename}`;
-    await this.bucket.copy({ from, to, options: { temporary } });
+    await this.copy({ from, to, options: { temporary } });
    return this.prefix.concat(to);
  }
 }
@@ -1,4 +1,3 @@
-import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
 import { S3PrivateBucket } from '../../buckets/private';
 import { S3Sources } from '../../type';
 import {
@@ -14,11 +13,9 @@ import { S3Buckets } from '../../constants';
 import path from 'path';
 import { getFileS3Key } from '../../utils';
 
-export class S3ChatSource {
-  private bucket: S3PrivateBucket;
-
+export class S3ChatSource extends S3PrivateBucket {
  constructor() {
-    this.bucket = new S3PrivateBucket();
+    super();
  }
 
  static parseChatUrl(url: string | URL) {
@@ -51,46 +48,19 @@ export class S3ChatSource {
    }
  }
 
-  // 获取文件流
-  getChatFileStream(key: string) {
-    return this.bucket.getObject(key);
-  }
-
-  // 获取文件状态
-  getChatFileStat(key: string) {
-    return this.bucket.statObject(key);
-  }
-
-  // 获取文件元数据
-  async getFileMetadata(key: string) {
-    const stat = await this.getChatFileStat(key);
-    if (!stat) return { filename: '', extension: '', contentLength: 0, contentType: '' };
-
-    const contentLength = stat.size;
-    const filename: string = decodeURIComponent(stat.metaData['origin-filename']);
-    const extension = parseFileExtensionFromUrl(filename);
-    const contentType: string = stat.metaData['content-type'];
-    return {
-      filename,
-      extension,
-      contentType,
-      contentLength
-    };
-  }
-
  async createGetChatFileURL(params: { key: string; expiredHours?: number; external: boolean }) {
    const { key, expiredHours = 1, external = false } = params; // 默认一个小时
 
    if (external) {
-      return await this.bucket.createExternalUrl({ key, expiredHours });
+      return await this.createExternalUrl({ key, expiredHours });
    }
-    return await this.bucket.createPreviewUrl({ key, expiredHours });
+    return await this.createPreviewUrl({ key, expiredHours });
  }
 
  async createUploadChatFileURL(params: CheckChatFileKeys) {
    const { appId, chatId, uId, filename, expiredTime } = ChatFileUploadSchema.parse(params);
    const { fileKey } = getFileS3Key.chat({ appId, chatId, uId, filename });
-    return await this.bucket.createPostPresignedUrl(
+    return await this.createPostPresignedUrl(
      { rawKey: fileKey, filename },
      { expiredHours: expiredTime ? differenceInHours(expiredTime, new Date()) : 24 }
    );
@@ -100,11 +70,11 @@ export class S3ChatSource {
    const { appId, chatId, uId } = DelChatFileByPrefixSchema.parse(params);
 
    const prefix = [S3Sources.chat, appId, uId, chatId].filter(Boolean).join('/');
-    return this.bucket.addDeleteJob({ prefix });
+    return this.addDeleteJob({ prefix });
  }
 
  deleteChatFileByKey(key: string) {
-    return this.bucket.addDeleteJob({ key });
+    return this.addDeleteJob({ key });
  }
 
  async uploadChatFileByBuffer(params: UploadFileParams) {
@@ -117,7 +87,7 @@ export class S3ChatSource {
      filename
    });
 
-    return this.bucket.uploadFileByBuffer({
+    return this.uploadFileByBuffer({
      key: fileKey,
      buffer,
      contentType
@@ -2,8 +2,6 @@ import { S3Sources } from '../../type';
 import { S3PrivateBucket } from '../../buckets/private';
-import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
 import {
-  type AddRawTextBufferParams,
-  AddRawTextBufferParamsSchema,
   type CreateGetDatasetFileURLParams,
   CreateGetDatasetFileURLParamsSchema,
   type CreateUploadDatasetFileParams,
@@ -12,26 +10,26 @@ import {
  DeleteDatasetFilesByPrefixParamsSchema,
  type GetDatasetFileContentParams,
  GetDatasetFileContentParamsSchema,
-  type GetRawTextBufferParams,
  type UploadParams,
  UploadParamsSchema
 } from './type';
 import { MongoS3TTL } from '../../schema';
-import { addHours, addMinutes } from 'date-fns';
+import { addHours } from 'date-fns';
 import { addLog } from '../../../system/log';
 import { detectFileEncoding } from '@fastgpt/global/common/file/tools';
 import { readS3FileContentByBuffer } from '../../../file/read/utils';
 import path from 'node:path';
 import { Mimes } from '../../constants';
 import { getFileS3Key, truncateFilename } from '../../utils';
-import { createHash } from 'node:crypto';
-import { S3Error } from 'minio';
+import type { S3RawTextSource } from '../rawText';
+import { getS3RawTextSource } from '../rawText';
 
-export class S3DatasetSource {
-  public bucket: S3PrivateBucket;
+export class S3DatasetSource extends S3PrivateBucket {
+  private rawTextSource: S3RawTextSource;
 
  constructor() {
-    this.bucket = new S3PrivateBucket();
+    super();
+    this.rawTextSource = getS3RawTextSource();
  }
 
  // 下载链接
@@ -39,19 +37,26 @@ export class S3DatasetSource {
    const { key, expiredHours, external } = CreateGetDatasetFileURLParamsSchema.parse(params);
 
    if (external) {
-      return await this.bucket.createExternalUrl({ key, expiredHours });
+      return await this.createExternalUrl({ key, expiredHours });
    }
-    return await this.bucket.createPreviewUrl({ key, expiredHours });
+    return await this.createPreviewUrl({ key, expiredHours });
  }
 
  // 上传链接
  async createUploadDatasetFileURL(params: CreateUploadDatasetFileParams) {
    const { filename, datasetId } = CreateUploadDatasetFileParamsSchema.parse(params);
    const { fileKey } = getFileS3Key.dataset({ datasetId, filename });
-    return await this.bucket.createPostPresignedUrl(
-      { rawKey: fileKey, filename },
-      { expiredHours: 3 }
-    );
+    return await this.createPostPresignedUrl({ rawKey: fileKey, filename }, { expiredHours: 3 });
+  }
+
+  // 单个键删除
+  deleteDatasetFileByKey(key?: string) {
+    return this.addDeleteJob({ key });
+  }
+
+  // 多个键删除
+  deleteDatasetFilesByKeys(keys: string[]) {
+    return this.addDeleteJob({ keys });
  }
 
  /**
@@ -62,68 +67,27 @@ export class S3DatasetSource {
  deleteDatasetFilesByPrefix(params: DeleteDatasetFilesByPrefixParams) {
    const { datasetId } = DeleteDatasetFilesByPrefixParamsSchema.parse(params);
    const prefix = [S3Sources.dataset, datasetId].filter(Boolean).join('/');
-    return this.bucket.addDeleteJob({ prefix });
-  }
-
-  // 单个键删除
-  deleteDatasetFileByKey(key?: string) {
-    return this.bucket.addDeleteJob({ key });
-  }
-
-  // 多个键删除
-  deleteDatasetFilesByKeys(keys: string[]) {
-    return this.bucket.addDeleteJob({ keys });
-  }
-
-  // 获取文件流
-  getDatasetFileStream(key: string) {
-    return this.bucket.getObject(key);
-  }
-
-  // 获取文件状态
-  getDatasetFileStat(key: string) {
-    try {
-      return this.bucket.statObject(key);
-    } catch (error) {
-      if (error instanceof S3Error && error.message === 'Not Found') {
-        return null;
-      }
-      return Promise.reject(error);
-    }
-  }
-
-  // 获取文件元数据
-  async getFileMetadata(key: string) {
-    const stat = await this.getDatasetFileStat(key);
-    if (!stat) return { filename: '', extension: '', contentLength: 0, contentType: '' };
-
-    const contentLength = stat.size;
-    const filename: string = decodeURIComponent(stat.metaData['origin-filename']);
-    const extension = parseFileExtensionFromUrl(filename);
-    const contentType: string = stat.metaData['content-type'];
-    return {
-      filename,
-      extension,
-      contentType,
-      contentLength
-    };
+    return this.addDeleteJob({ prefix });
  }
 
  async getDatasetBase64Image(key: string): Promise<string> {
    const [stream, metadata] = await Promise.all([
-      this.getDatasetFileStream(key),
+      this.getFileStream(key),
      this.getFileMetadata(key)
    ]);
-    const buffer = await this.bucket.fileStreamToBuffer(stream);
+    const buffer = await this.fileStreamToBuffer(stream);
    const base64 = buffer.toString('base64');
-    return `data:${metadata.contentType || 'image/jpeg'};base64,${base64}`;
+    return `data:${metadata?.contentType || 'image/jpeg'};base64,${base64}`;
  }
 
  async getDatasetFileRawText(params: GetDatasetFileContentParams) {
    const { fileId, teamId, tmbId, customPdfParse, getFormatText, usageId } =
      GetDatasetFileContentParamsSchema.parse(params);
 
-    const rawTextBuffer = await this.getRawTextBuffer({ customPdfParse, sourceId: fileId });
+    const rawTextBuffer = await this.rawTextSource.getRawTextBuffer({
+      customPdfParse,
+      sourceId: fileId
+    });
    if (rawTextBuffer) {
      return {
        rawText: rawTextBuffer.text,
@@ -133,14 +97,14 @@ export class S3DatasetSource {
 
    const [metadata, stream] = await Promise.all([
      this.getFileMetadata(fileId),
-      this.getDatasetFileStream(fileId)
+      this.getFileStream(fileId)
    ]);
 
-    const extension = metadata.extension;
-    const filename: string = decodeURIComponent(metadata.filename);
+    const extension = metadata?.extension || '';
+    const filename: string = decodeURIComponent(metadata?.filename || '');
 
    const start = Date.now();
-    const buffer = await this.bucket.fileStreamToBuffer(stream);
+    const buffer = await this.fileStreamToBuffer(stream);
    addLog.debug('get dataset file buffer', { time: Date.now() - start });
 
    const encoding = detectFileEncoding(buffer);
@@ -159,7 +123,7 @@ export class S3DatasetSource {
      }
    });
 
-    this.addRawTextBuffer({
+    this.rawTextSource.addRawTextBuffer({
      sourceId: fileId,
      sourceName: filename,
      text: rawText,
@@ -195,11 +159,11 @@ export class S3DatasetSource {
 
    await MongoS3TTL.create({
      minioKey: key,
-      bucketName: this.bucket.name,
+      bucketName: this.bucketName,
      expiredTime: addHours(new Date(), 3)
    });
 
-    await this.bucket.putObject(key, stream, size, {
+    await this.putObject(key, stream, size, {
      'content-type': Mimes[path.extname(truncatedFilename) as keyof typeof Mimes],
      'upload-time': new Date().toISOString(),
      'origin-filename': encodeURIComponent(truncatedFilename)
@@ -207,51 +171,6 @@ export class S3DatasetSource {
 
    return key;
  }
-
-  async addRawTextBuffer(params: AddRawTextBufferParams) {
-    const { sourceId, sourceName, text, customPdfParse } =
-      AddRawTextBufferParamsSchema.parse(params);
-
-    // 因为 Key 唯一对应一个 Object 所以不需要根据文件内容计算 Hash 直接用 Key 计算 Hash 就行了
-    const hash = createHash('md5').update(sourceId).digest('hex');
-    const key = getFileS3Key.rawText({ hash, customPdfParse });
-
-    await MongoS3TTL.create({
-      minioKey: key,
-      bucketName: this.bucket.name,
-      expiredTime: addMinutes(new Date(), 20)
-    });
-
-    const buffer = Buffer.from(text);
-    await this.bucket.putObject(key, buffer, buffer.length, {
-      'content-type': 'text/plain',
-      'origin-filename': encodeURIComponent(sourceName),
-      'upload-time': new Date().toISOString()
-    });
-
-    return key;
-  }
-
-  async getRawTextBuffer(params: GetRawTextBufferParams) {
-    const { customPdfParse, sourceId } = params;
-
-    const hash = createHash('md5').update(sourceId).digest('hex');
-    const key = getFileS3Key.rawText({ hash, customPdfParse });
-
-    if (!(await this.bucket.isObjectExists(key))) return null;
-
-    const [stream, metadata] = await Promise.all([
-      this.bucket.getObject(key),
-      this.getFileMetadata(key)
-    ]);
-
-    const buffer = await this.bucket.fileStreamToBuffer(stream);
-
-    return {
-      text: buffer.toString('utf-8'),
-      filename: metadata.filename
-    };
-  }
 }
 
 export function getS3DatasetSource() {
@@ -59,12 +59,3 @@ export const UploadParamsSchema = z.union([
  })
 ]);
 export type UploadParams = z.input<typeof UploadParamsSchema>;
-
-export const AddRawTextBufferParamsSchema = z.object({
-  customPdfParse: z.boolean().optional(),
-  sourceId: z.string().nonempty(),
-  sourceName: z.string().nonempty(),
-  text: z.string()
-});
-export type AddRawTextBufferParams = z.input<typeof AddRawTextBufferParamsSchema>;
-export type GetRawTextBufferParams = Pick<AddRawTextBufferParams, 'customPdfParse' | 'sourceId'>;
@@ -0,0 +1,79 @@
+import { S3PrivateBucket } from '../../buckets/private';
+import {
+  type AddRawTextBufferParams,
+  AddRawTextBufferParamsSchema,
+  type GetRawTextBufferParams
+} from './type';
+import { MongoS3TTL } from '../../schema';
+import { addMinutes } from 'date-fns';
+import { getFileS3Key } from '../../utils';
+import { createHash } from 'node:crypto';
+
+export class S3RawTextSource extends S3PrivateBucket {
+  constructor() {
+    super();
+  }
+
+  // 获取文件元数据
+  async getFilename(key: string) {
+    const stat = await this.statObject(key);
+    if (!stat) return '';
+
+    const filename: string = decodeURIComponent(stat.metaData['origin-filename']);
+    return filename;
+  }
+
+  async addRawTextBuffer(params: AddRawTextBufferParams) {
+    const { sourceId, sourceName, text, customPdfParse } =
+      AddRawTextBufferParamsSchema.parse(params);
+
+    // 因为 Key 唯一对应一个 Object 所以不需要根据文件内容计算 Hash 直接用 Key 计算 Hash 就行了
+    const hash = createHash('md5').update(sourceId).digest('hex');
+    const key = getFileS3Key.rawText({ hash, customPdfParse });
+    const buffer = Buffer.from(text);
+
+    await MongoS3TTL.create({
+      minioKey: key,
+      bucketName: this.bucketName,
+      expiredTime: addMinutes(new Date(), 20)
+    });
+
+    await this.putObject(key, buffer, buffer.length, {
+      'content-type': 'text/plain',
+      'origin-filename': encodeURIComponent(sourceName),
+      'upload-time': new Date().toISOString()
+    });
+
+    return key;
+  }
+
+  async getRawTextBuffer(params: GetRawTextBufferParams) {
+    const { customPdfParse, sourceId } = params;
+
+    const hash = createHash('md5').update(sourceId).digest('hex');
+    const key = getFileS3Key.rawText({ hash, customPdfParse });
+
+    if (!(await this.isObjectExists(key))) return null;
+
+    const [stream, filename] = await Promise.all([this.getFileStream(key), this.getFilename(key)]);
+
+    const buffer = await this.fileStreamToBuffer(stream);
+
+    return {
+      text: buffer.toString('utf-8'),
+      filename
+    };
+  }
+}
+
+export function getS3RawTextSource() {
+  if (global.rawTextBucket) {
+    return global.rawTextBucket;
+  }
+  global.rawTextBucket = new S3RawTextSource();
+  return global.rawTextBucket;
+}
+
+declare global {
+  var rawTextBucket: S3RawTextSource;
+}
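The new source is a thin cache: keys are derived from an MD5 of the `sourceId`, and entries expire after 20 minutes via the Mongo TTL record. A round-trip sketch (import path, IDs, and text are illustrative):

```ts
import { getS3RawTextSource } from './rawText'; // hypothetical relative path

async function demo() {
  const rawText = getS3RawTextSource();

  // Cache the parsed text of a source; since the key is md5(sourceId),
  // the same sourceId always maps to the same object.
  await rawText.addRawTextBuffer({
    sourceId: 'dataset-file-key-or-url', // illustrative
    sourceName: 'report.pdf',
    text: 'parsed raw text...'
  });

  // Later reads hit the cache until the 20-minute TTL record expires.
  const hit = await rawText.getRawTextBuffer({ sourceId: 'dataset-file-key-or-url' });
  if (hit) {
    console.log(hit.filename, hit.text.length);
  }
}
```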
@@ -0,0 +1,10 @@
+import z from 'zod';
+
+export const AddRawTextBufferParamsSchema = z.object({
+  customPdfParse: z.boolean().optional(),
+  sourceId: z.string().nonempty(),
+  sourceName: z.string().nonempty(),
+  text: z.string()
+});
+export type AddRawTextBufferParams = z.input<typeof AddRawTextBufferParamsSchema>;
+export type GetRawTextBufferParams = Pick<AddRawTextBufferParams, 'customPdfParse' | 'sourceId'>;
@@ -131,7 +131,7 @@ export async function uploadImage2S3Bucket(
  if (expiredTime && isAfter(expiredTime, now)) {
    await MongoS3TTL.create({
      minioKey: uploadKey,
-      bucketName: bucket.name,
+      bucketName: bucket.bucketName,
      expiredTime: expiredTime
    });
  }
@@ -10,7 +10,9 @@ export enum TimerIdEnum {
  clearExpiredRawTextBuffer = 'clearExpiredRawTextBuffer',
  clearExpiredDatasetImage = 'clearExpiredDatasetImage',
  clearExpiredMinioFiles = 'clearExpiredMinioFiles',
-  recordTeamQPM = 'recordTeamQPM'
+  recordTeamQPM = 'recordTeamQPM',
+  auditLogCleanup = 'auditLogCleanup',
+  chatHistoryCleanup = 'chatHistoryCleanup'
 }
 
 export enum LockNotificationEnum {
@@ -9,7 +9,7 @@ import { addLog } from '../../../../common/system/log';
 import { readFileRawTextByUrl } from '../../read';
 import { type ParentIdType } from '@fastgpt/global/common/parentFolder/type';
 import { type RequireOnlyOne } from '@fastgpt/global/common/type/utils';
-import { getS3DatasetSource } from '../../../../common/s3/sources/dataset';
+import { getS3RawTextSource } from '../../../../common/s3/sources/rawText';
 
 type ResponseDataType = {
  success: boolean;
@@ -154,7 +154,7 @@ export const useApiDatasetRequest = ({ apiServer }: { apiServer: APIFileServer }
    }
    if (previewUrl) {
      // Get from buffer
-      const rawTextBuffer = await getS3DatasetSource().getRawTextBuffer({
+      const rawTextBuffer = await getS3RawTextSource().getRawTextBuffer({
        sourceId: previewUrl,
        customPdfParse
      });
@@ -175,7 +175,7 @@ export const useApiDatasetRequest = ({ apiServer }: { apiServer: APIFileServer }
      getFormatText: true
    });
 
-    getS3DatasetSource().addRawTextBuffer({
+    getS3RawTextSource().addRawTextBuffer({
      sourceId: previewUrl,
      sourceName: title || '',
      text: rawText,
@@ -96,6 +96,25 @@ export async function delDatasetRelevantData({
    datasetId: { $in: datasetIds }
  });
 
+  // Delete dataset_data_texts in batches by datasetId
+  for (const datasetId of datasetIds) {
+    await MongoDatasetDataText.deleteMany({
+      teamId,
+      datasetId
+    }).maxTimeMS(300000); // Reduce timeout for single batch
+  }
+  // Delete dataset_datas in batches by datasetId
+  for (const datasetId of datasetIds) {
+    await MongoDatasetData.deleteMany({
+      teamId,
+      datasetId
+    }).maxTimeMS(300000);
+  }
+
  await delCollectionRelatedSource({ collections });
  // Delete vector data
  await deleteDatasetDataVector({ teamId, datasetIds });
 
  for (const datasetId of datasetIds) {
    // Delete dataset_data_texts in batches by datasetId
    await MongoDatasetDataText.deleteMany({
@@ -54,6 +54,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
  } = props;
 
  const systemToolId = toolConfig?.systemTool?.toolId;
+  let toolInput: Record<string, any> = {};
 
  try {
    // run system tool
@@ -78,10 +79,11 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
        return dbPlugin?.inputListVal || {};
      }
    })();
+    toolInput = Object.fromEntries(
+      Object.entries(params).filter(([key]) => key !== NodeInputKeyEnum.systemInputConfig)
+    );
    const inputs = {
-      ...Object.fromEntries(
-        Object.entries(params).filter(([key]) => key !== NodeInputKeyEnum.systemInputConfig)
-      ),
+      ...toolInput,
      ...inputConfigParams
    };
 
@@ -132,6 +134,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
      return {
        data: res.error,
        [DispatchNodeResponseKeyEnum.nodeResponse]: {
+          toolInput,
          toolRes: res.error,
          moduleLogo: avatar
        },
@@ -148,6 +151,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
      return {
        error: res.error,
        [DispatchNodeResponseKeyEnum.nodeResponse]: {
+          toolInput,
          error: res.error,
          moduleLogo: avatar
        },
@@ -179,6 +183,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
      data: result,
      [DispatchNodeResponseKeyEnum.answerText]: answerText,
      [DispatchNodeResponseKeyEnum.nodeResponse]: {
+        toolInput,
        toolRes: result,
        moduleLogo: avatar,
        totalPoints: usagePoints
@@ -213,10 +218,12 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
      });
      props.mcpClientMemory[url] = mcpClient;
 
+      toolInput = params;
      const result = await mcpClient.toolCall({ toolName, params, closeConnection: false });
      return {
        data: { [NodeOutputKeyEnum.rawResponse]: result },
        [DispatchNodeResponseKeyEnum.nodeResponse]: {
+          toolInput,
          toolRes: result,
          moduleLogo: avatar
        },
@@ -241,6 +248,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
        throw new Error(`HTTP tool ${toolName} not found`);
      }
 
+      toolInput = params;
      const { data, errorMsg } = await runHTTPTool({
        baseUrl: baseUrl || '',
        toolPath: httpTool.path,
@@ -262,6 +270,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
        return {
          error: { [NodeOutputKeyEnum.errorText]: errorMsg },
          [DispatchNodeResponseKeyEnum.nodeResponse]: {
+            toolInput,
            toolRes: errorMsg,
            moduleLogo: avatar
          },
@@ -274,6 +283,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
      return {
        data: { [NodeOutputKeyEnum.rawResponse]: data, ...(typeof data === 'object' ? data : {}) },
        [DispatchNodeResponseKeyEnum.nodeResponse]: {
+          toolInput,
          toolRes: data,
          moduleLogo: avatar
        },
@@ -290,6 +300,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
          storeSecret: headerSecret
        })
      });
+      toolInput = restParams;
      const result = await mcpClient.toolCall({ toolName, params: restParams });
 
      return {
@@ -297,6 +308,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
          [NodeOutputKeyEnum.rawResponse]: result
        },
        [DispatchNodeResponseKeyEnum.nodeResponse]: {
+          toolInput,
          toolRes: result,
          moduleLogo: avatar
        },
@@ -318,6 +330,7 @@ export const dispatchRunTool = async (props: RunToolProps): Promise<RunToolRespo
    return getNodeErrResponse({
      error,
      customNodeResponse: {
+        toolInput,
        moduleLogo: avatar
      }
    });
@@ -105,6 +105,12 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
      let val = data[input.key] ?? input.value;
      if (input.renderTypeList.includes(FlowNodeInputTypeEnum.password)) {
        val = anyValueDecrypt(val);
+      } else if (
+        input.renderTypeList.includes(FlowNodeInputTypeEnum.fileSelect) &&
+        Array.isArray(val) &&
+        data[input.key]
+      ) {
+        data[input.key] = val.map((item) => item.url);
      }
 
      return {
@@ -172,6 +178,7 @@ export const dispatchRunPlugin = async (props: RunPluginProps): Promise<RunPlugi
    [DispatchNodeResponseKeyEnum.nodeResponse]: {
      moduleLogo: plugin.avatar,
      totalPoints: usagePoints,
+      toolInput: data,
      pluginOutput: output?.pluginOutput,
      pluginDetail: pluginData?.permission?.hasWritePer // Not system plugin
        ? flowResponses.filter((item) => {
@@ -56,6 +56,8 @@ export const dispatchPluginInput = async (
   return {
+    data: {
       ...params,
+
       // 旧版本适配
       [NodeOutputKeyEnum.userFiles]: files
         .map((item) => {
          return item?.url ?? '';
@@ -20,7 +20,7 @@ import { S3ChatSource } from '../../../../common/s3/sources/chat';
 import path from 'node:path';
 import { S3Buckets } from '../../../../common/s3/constants';
 import { S3Sources } from '../../../../common/s3/type';
-import { getS3DatasetSource } from '../../../../common/s3/sources/dataset';
+import { getS3RawTextSource } from '../../../../common/s3/sources/rawText';

 type Props = ModuleDispatchProps<{
   [NodeInputKeyEnum.fileUrlList]: string[];
@@ -175,17 +175,17 @@ export const getFileContentFromLinks = async ({
     parseUrlList
       .map(async (url) => {
         // Get from buffer
-        const rawTextBuffer = await getS3DatasetSource().getRawTextBuffer({
+        const rawTextBuffer = await getS3RawTextSource().getRawTextBuffer({
           sourceId: url,
           customPdfParse
         });
-        // if (rawTextBuffer) {
-        //   return formatResponseObject({
-        //     filename: rawTextBuffer.filename || url,
-        //     url,
-        //     content: rawTextBuffer.text
-        //   });
-        // }
+        if (rawTextBuffer) {
+          return formatResponseObject({
+            filename: rawTextBuffer.filename || url,
+            url,
+            content: rawTextBuffer.text
+          });
+        }

         try {
           if (isInternalAddress(url)) {
@@ -285,7 +285,7 @@ export const getFileContentFromLinks = async ({
     const replacedText = replaceS3KeyToPreviewUrl(rawText, addDays(new Date(), 90));

     // Add to buffer
-    getS3DatasetSource().addRawTextBuffer({
+    getS3RawTextSource().addRawTextBuffer({
      sourceId: url,
      sourceName: filename,
      text: replacedText,
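
The two hunks above reroute the raw-text buffer from the dataset source to a dedicated rawText source, and re-enable the previously commented-out buffer hit; the read-through pattern itself is unchanged. A compressed sketch of that pattern, with function names from the diff and the fetch/parse step purely illustrative:

```ts
// Read-through cache around getS3RawTextSource(); fetchAndParse is hypothetical.
async function readWithBuffer(url: string, customPdfParse?: boolean) {
  const hit = await getS3RawTextSource().getRawTextBuffer({ sourceId: url, customPdfParse });
  if (hit) return hit.text; // cache hit: skip the download entirely

  const text = await fetchAndParse(url); // assumed download + parse step
  getS3RawTextSource().addRawTextBuffer({ sourceId: url, sourceName: url, text });
  return text;
}
```
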
@@ -20,8 +20,8 @@ export const authCollectionFile = async ({
   const authRes = await parseHeaderCert(props);

   if (isS3ObjectKey(fileId, 'dataset')) {
-    const stat = await getS3DatasetSource().getDatasetFileStat(fileId);
-    if (!stat) return Promise.reject(CommonErrEnum.fileNotFound);
+    const exists = await getS3DatasetSource().isObjectExists(fileId);
+    if (!exists) return Promise.reject(CommonErrEnum.fileNotFound);
  } else {
    return Promise.reject('Invalid dataset file key');
  }
@@ -1,4 +1,4 @@
-import { getTeamPlanStatus, getTeamStandPlan, getTeamPoints } from '../../support/wallet/sub/utils';
+import { getTeamPlanStatus, getTeamStandPlan, teamPoint } from '../../support/wallet/sub/utils';
 import { MongoApp } from '../../core/app/schema';
 import { MongoDataset } from '../../core/dataset/schema';
 import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
@@ -12,7 +12,7 @@ import { getVectorCountByTeamId } from '../../common/vectorDB/controller';
 export const checkTeamAIPoints = async (teamId: string) => {
   if (!global.subPlans?.standard) return;

-  const { totalPoints, usedPoints } = await getTeamPoints({ teamId });
+  const { totalPoints, usedPoints } = await teamPoint.getTeamPoints({ teamId });

  if (usedPoints >= totalPoints) {
    return Promise.reject(TeamErrEnum.aiPointsNotEnough);
@@ -190,7 +190,7 @@ export const getTeamPlanStatus = async ({
     ? standardPlans[standardPlan.currentSubLevel]
     : undefined;

-  updateTeamPointsCache({ teamId, totalPoints, surplusPoints });
+  teamPoint.updateTeamPointsCache({ teamId, totalPoints, surplusPoints });

  return {
    [SubTypeEnum.standard]: standardPlan,
@@ -223,58 +223,99 @@ export const getTeamPlanStatus = async ({
   };
 };

-export const clearTeamPointsCache = async (teamId: string) => {
-  const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
-  const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
-
-  await Promise.all([delRedisCache(surplusCacheKey), delRedisCache(totalCacheKey)]);
-};
-
-export const incrTeamPointsCache = async ({ teamId, value }: { teamId: string; value: number }) => {
-  const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
-  await incrValueToCache(surplusCacheKey, value);
-};
-export const updateTeamPointsCache = async ({
-  teamId,
-  totalPoints,
-  surplusPoints
-}: {
-  teamId: string;
-  totalPoints: number;
-  surplusPoints: number;
-}) => {
-  const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
-  const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
-
-  await Promise.all([
-    setRedisCache(surplusCacheKey, surplusPoints, CacheKeyEnumTime.team_point_surplus),
-    setRedisCache(totalCacheKey, totalPoints, CacheKeyEnumTime.team_point_total)
-  ]);
-};
-
-export const getTeamPoints = async ({ teamId }: { teamId: string }) => {
-  const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
-  const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
-
-  const [surplusCacheStr, totalCacheStr] = await Promise.all([
-    getRedisCache(surplusCacheKey),
-    getRedisCache(totalCacheKey)
-  ]);
-
-  if (surplusCacheStr && totalCacheStr) {
-    const totalPoints = Number(totalCacheStr);
-    const surplusPoints = Number(surplusCacheStr);
-    return {
-      totalPoints,
-      surplusPoints,
-      usedPoints: totalPoints - surplusPoints
-    };
-  }
-
-  const planStatus = await getTeamPlanStatus({ teamId });
-  return {
-    totalPoints: planStatus.totalPoints,
-    surplusPoints: planStatus.totalPoints - planStatus.usedPoints,
-    usedPoints: planStatus.usedPoints
-  };
-};
+export const teamPoint = {
+  getTeamPoints: async ({ teamId }: { teamId: string }) => {
+    const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
+    const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
+
+    const [surplusCacheStr, totalCacheStr] = await Promise.all([
+      getRedisCache(surplusCacheKey),
+      getRedisCache(totalCacheKey)
+    ]);
+
+    if (surplusCacheStr && totalCacheStr) {
+      const totalPoints = Number(totalCacheStr);
+      const surplusPoints = Number(surplusCacheStr);
+      return {
+        totalPoints,
+        surplusPoints,
+        usedPoints: totalPoints - surplusPoints
+      };
+    }
+
+    const planStatus = await getTeamPlanStatus({ teamId });
+    return {
+      totalPoints: planStatus.totalPoints,
+      surplusPoints: planStatus.totalPoints - planStatus.usedPoints,
+      usedPoints: planStatus.usedPoints
+    };
+  },
+  incrTeamPointsCache: async ({ teamId, value }: { teamId: string; value: number }) => {
+    const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
+    await incrValueToCache(surplusCacheKey, value);
+  },
+  updateTeamPointsCache: async ({
+    teamId,
+    totalPoints,
+    surplusPoints
+  }: {
+    teamId: string;
+    totalPoints: number;
+    surplusPoints: number;
+  }) => {
+    const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
+    const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
+
+    await Promise.all([
+      setRedisCache(surplusCacheKey, surplusPoints, CacheKeyEnumTime.team_point_surplus),
+      setRedisCache(totalCacheKey, totalPoints, CacheKeyEnumTime.team_point_total)
+    ]);
+  },
+  clearTeamPointsCache: async (teamId: string) => {
+    const surplusCacheKey = `${CacheKeyEnum.team_point_surplus}:${teamId}`;
+    const totalCacheKey = `${CacheKeyEnum.team_point_total}:${teamId}`;
+
+    await Promise.all([delRedisCache(surplusCacheKey), delRedisCache(totalCacheKey)]);
+  }
+};
+export const teamQPM = {
+  getTeamQPMLimit: async (teamId: string): Promise<number | null> => {
+    // 1. 尝试从缓存中获取
+    const cacheKey = `${CacheKeyEnum.team_qpm_limit}:${teamId}`;
+    const cached = await getRedisCache(cacheKey);
+
+    if (cached) {
+      return Number(cached);
+    }
+
+    // 2. Computed
+    const teamPlanStatus = await getTeamPlanStatus({ teamId });
+    const limit =
+      teamPlanStatus[SubTypeEnum.standard]?.requestsPerMinute ??
+      teamPlanStatus.standardConstants?.requestsPerMinute;
+
+    if (!limit) {
+      if (process.env.CHAT_MAX_QPM) return Number(process.env.CHAT_MAX_QPM);
+      return null;
+    }
+
+    // 3. Set cache
+    await teamQPM.setCachedTeamQPMLimit(teamId, limit);
+
+    return limit;
+  },
+  setCachedTeamQPMLimit: async (teamId: string, limit: number): Promise<void> => {
+    const cacheKey = `${CacheKeyEnum.team_qpm_limit}:${teamId}`;
+    await setRedisCache(cacheKey, limit.toString(), CacheKeyEnumTime.team_qpm_limit);
+  },
+  clearTeamQPMLimitCache: async (teamId: string): Promise<void> => {
+    const cacheKey = `${CacheKeyEnum.team_qpm_limit}:${teamId}`;
+    await delRedisCache(cacheKey);
+  }
+};
+
+// controller
+export const clearTeamPlanCache = async (teamId: string) => {
+  await teamPoint.clearTeamPointsCache(teamId);
+  await teamQPM.clearTeamQPMLimitCache(teamId);
+};
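
For orientation, a hedged sketch of how the refactored helpers compose on a request path. `enforceTeamLimits` and the QPM counter are illustrative only; `checkTeamAIPoints` and `teamQPM` come from the hunks above:

```ts
// Illustrative wrapper, not part of this commit.
async function enforceTeamLimits(teamId: string, requestsThisMinute: number) {
  // Rejects with TeamErrEnum.aiPointsNotEnough when points are exhausted.
  await checkTeamAIPoints(teamId);

  // null means no plan or env limit applies.
  const limit = await teamQPM.getTeamQPMLimit(teamId);
  if (limit !== null && requestsThisMinute >= limit) {
    throw new Error('Team QPM limit exceeded'); // hypothetical error shape
  }
}
```
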

@@ -51,61 +51,56 @@ const NodeInputSelect = ({
     {
       type: FlowNodeInputTypeEnum.textarea,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.textarea].icon,
       title: t('common:core.workflow.inputType.Manual input')
     },
     {
       type: FlowNodeInputTypeEnum.JSONEditor,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.JSONEditor].icon,
       title: t('common:core.workflow.inputType.Manual input')
     },
     {
       type: FlowNodeInputTypeEnum.addInputParam,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.addInputParam].icon,
       title: t('common:core.workflow.inputType.dynamicTargetInput')
     },
     {
       type: FlowNodeInputTypeEnum.selectLLMModel,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.selectLLMModel].icon,
       title: t('common:core.workflow.inputType.Manual select')
     },
     {
       type: FlowNodeInputTypeEnum.settingLLMModel,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.settingLLMModel].icon,
       title: t('common:core.workflow.inputType.Manual select')
     },
     {
       type: FlowNodeInputTypeEnum.selectDataset,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.selectDataset].icon,
       title: t('common:core.workflow.inputType.Manual select')
     },
     {
       type: FlowNodeInputTypeEnum.selectDatasetParamsModal,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.selectDatasetParamsModal].icon,
       title: t('common:core.workflow.inputType.Manual select')
     },
     {
       type: FlowNodeInputTypeEnum.settingDatasetQuotePrompt,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.settingDatasetQuotePrompt].icon,
       title: t('common:core.workflow.inputType.Manual input')
     },
     {
       type: FlowNodeInputTypeEnum.hidden,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.hidden].icon,
       title: t('common:core.workflow.inputType.Manual input')
     },
     {
       type: FlowNodeInputTypeEnum.custom,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.custom].icon,
       title: t('common:core.workflow.inputType.Manual input')
     },
     {
       type: FlowNodeInputTypeEnum.fileSelect,
       icon: FlowNodeInputMap[FlowNodeInputTypeEnum.fileSelect].icon,
       title: t('common:core.workflow.inputType.Manual input')
     }
   ]);
@@ -122,7 +117,7 @@ const NodeInputSelect = ({
         onChange(input.type);
       }
     })),
-    [renderType]
+    [onChange, renderType]
  );

  const filterMenuList = useMemo(
@@ -145,6 +145,7 @@
   "expand_tool_create": "Expand MCP/Http create",
   "export_config_successful": "Configuration copied, some sensitive information automatically filtered. Please check for any remaining sensitive data.",
   "export_configs": "Export",
+  "export_log_filename": "{{name}} chat logs.csv",
   "fastgpt_marketplace": "FastGPT plug-in market",
   "feedback_count": "User Feedback",
   "file_quote_link": "Files",

@@ -14,7 +14,7 @@
   "citations": "{{num}} References",
   "clear_input_value": "Clear input",
   "click_contextual_preview": "Click to see contextual preview",
-  "click_to_add_url": "Click to add link",
+  "click_to_add_url": "Enter file link",
   "completion_finish_close": "Disconnection",
   "completion_finish_content_filter": "Trigger safe wind control",
   "completion_finish_function_call": "Function Calls",
@@ -51,6 +51,7 @@
   "home.no_available_tools": "No tools available",
   "home.select_tools": "Select Tool",
   "home.tools": "Tool: {{num}}",
+  "images_collection_not_supported": "Image collection is not supported open the original file",
   "in_progress": "In Progress",
   "input_guide": "Input Guide",
   "input_guide_lexicon": "Lexicon",
@@ -75,7 +76,6 @@
   "query_extension_result": "Problem optimization results",
   "question_tip": "From top to bottom, the response order of each module",
   "read_raw_source": "Open the original text",
-  "images_collection_not_supported": "Image collection is not supported open the original file",
   "reasoning_text": "Thinking process",
   "release_cancel": "Release Cancel",
   "release_send": "Release send, slide up to cancel",
@@ -167,6 +167,8 @@
   "start_chat": "Start",
   "stream_output": "Stream Output",
+  "task_has_continued": "Task has continued running",
+  "tool_input": "tool input",
   "tool_output": "Tool output",
   "unsupported_file_type": "Unsupported file types",
   "upload": "Upload",
   "variable_invisable_in_share": "External variables are not visible in login-free links",

@@ -427,7 +427,6 @@
   "core.chat.response.module query": "Question/Search Term",
   "core.chat.response.module similarity": "Similarity",
   "core.chat.response.module temperature": "Temperature",
   "core.chat.response.plugin output": "Plugin Output Value",
   "core.chat.response.search using reRank": "Result Re-Rank",
   "core.chat.response.text output": "Text Output",
   "core.chat.response.update_var_result": "Variable Update Result (Displays Multiple Variable Update Results in Order)",
@@ -975,6 +974,7 @@
   "option": "Option",
   "page": "Page",
   "page_center": "Page Center",
+  "page_error": "An uncaught exception occurred.\n\n1. For private deployment users, 90% of cases are caused by incorrect model configuration/model not enabled. \n.\n\n2. Some systems are not compatible with related APIs. \nMost of the time it's caused by Apple's Safari browser, you can try changing it to Chrome.\n\n3. Please turn off the browser translation function. Some translations may cause the page to crash.\n\n\nAfter eliminating 3, open the console to view the specific error information.\n\nIf it prompts xxx undefined, the model configuration is incorrect. Check:\n\n1. Please ensure that at least one model of each series is available in the system, which can be checked in [Account - Model Provider].\n\n2. Please ensure that there is at least one knowledge base file processing model (there is a switch in the language model), otherwise an error will be reported when creating the knowledge base.\n\n2. Check whether some \"object\" parameters in the model are abnormal (arrays and objects). If they are empty, you can try to give an empty array or empty object.",
  "pay.amount": "Amount",
  "pay.error_desc": "There was a problem when converting payment routes",
  "pay.noclose": "After payment is completed, please wait for the system to update automatically",
@@ -149,6 +149,7 @@
   "expand_tool_create": "展开MCP、Http创建",
   "export_config_successful": "已复制配置,自动过滤部分敏感信息,请注意检查是否仍有敏感数据",
   "export_configs": "导出配置",
+  "export_log_filename": "{{name}} 对话日志.csv",
   "fastgpt_marketplace": "FastGPT 插件市场",
   "feedback_count": "用户反馈",
   "file_quote_link": "文件链接",

@@ -14,7 +14,7 @@
   "citations": "{{num}}条引用",
   "clear_input_value": "清空输入",
   "click_contextual_preview": "点击查看上下文预览",
-  "click_to_add_url": "点击添加链接",
+  "click_to_add_url": "输入文件链接",
   "completion_finish_close": "连接断开",
   "completion_finish_content_filter": "触发安全风控",
   "completion_finish_function_call": "函数调用",
@@ -51,6 +51,7 @@
   "home.no_available_tools": "暂无可用工具",
   "home.select_tools": "选择工具",
   "home.tools": "工具:{{num}}",
+  "images_collection_not_supported": "图片数据集不支持打开原文",
   "in_progress": "进行中",
   "input_guide": "输入引导",
   "input_guide_lexicon": "词库",
@@ -75,7 +76,6 @@
   "query_extension_result": "问题优化结果",
   "question_tip": "从上到下,为各个模块的响应顺序",
   "read_raw_source": "打开原文",
-  "images_collection_not_supported": "图片数据集不支持打开原文",
   "reasoning_text": "思考过程",
   "release_cancel": "松开取消",
   "release_send": "松开发送,上滑取消",
@@ -170,6 +170,8 @@
   "start_chat": "开始对话",
   "stream_output": "流输出",
+  "task_has_continued": "任务已继续运行",
+  "tool_input": "工具输入",
   "tool_output": "工具输出",
   "unsupported_file_type": "不支持的文件类型",
   "upload": "上传",
   "variable_invisable_in_share": "外部变量在免登录链接中不可见",

@@ -430,7 +430,6 @@
   "core.chat.response.module query": "问题/检索词",
   "core.chat.response.module similarity": "相似度",
   "core.chat.response.module temperature": "温度",
   "core.chat.response.plugin output": "插件输出值",
   "core.chat.response.search using reRank": "结果重排",
   "core.chat.response.text output": "文本输出",
   "core.chat.response.update_var_result": "变量更新结果(按顺序展示多个变量更新结果)",
@@ -981,6 +980,7 @@
   "option": "选项",
   "page": "页",
   "page_center": "页面居中",
+  "page_error": "出现未捕获的异常。\n1. 私有部署用户,90%是由于模型配置不正确/模型未启用导致。。\n2. 部分系统不兼容相关API。大部分是苹果的safari 浏览器导致,可以尝试更换 chrome。\n3. 请关闭浏览器翻译功能,部分翻译导致页面崩溃。\n\n排除3后,打开控制台的 console 查看具体报错信息。\n如果提示 xxx undefined 的话,就是模型配置不正确,检查:\n1. 请确保系统内每个系列模型至少有一个可用,可以在【账号-模型提供商】中检查。\n2. 请确保至少有一个知识库文件处理模型(语言模型中有一个开关),否则知识库创建会报错。\n2. 检查模型中一些“对象”参数是否异常(数组和对象),如果为空,可以尝试给个空数组或空对象。",
  "pay.amount": "金额",
  "pay.error_desc": "转换支付途径时出现了问题",
  "pay.noclose": "支付完成后,请等待系统自动更新",
@@ -144,6 +144,7 @@
   "expand_tool_create": "展開 MCP、Http 創建",
   "export_config_successful": "已複製設定,自動過濾部分敏感資訊,請注意檢查是否仍有敏感資料",
   "export_configs": "匯出設定",
+  "export_log_filename": "{{name}} 對話日誌.csv",
   "fastgpt_marketplace": "FastGPT 插件市場",
   "feedback_count": "使用者回饋",
   "file_quote_link": "檔案連結",

@@ -14,7 +14,7 @@
   "citations": "{{num}} 筆引用",
   "clear_input_value": "清空輸入",
   "click_contextual_preview": "點選檢視上下文預覽",
-  "click_to_add_url": "點擊添加鏈接",
+  "click_to_add_url": "輸入文件鏈接",
   "completion_finish_close": "連接斷開",
   "completion_finish_content_filter": "觸發安全風控",
   "completion_finish_function_call": "函式呼叫",
@@ -51,6 +51,7 @@
   "home.no_available_tools": "暫無可用工具",
   "home.select_tools": "選擇工具",
   "home.tools": "工具:{{num}}",
+  "images_collection_not_supported": "圖片資料集不支持開啟原文",
   "in_progress": "進行中",
   "input_guide": "輸入導引",
   "input_guide_lexicon": "詞彙庫",
@@ -75,7 +76,6 @@
   "query_extension_result": "問題優化結果",
   "question_tip": "由上至下,各個模組的回應順序",
   "read_raw_source": "開啟原文",
-  "images_collection_not_supported": "圖片資料集不支持開啟原文",
   "reasoning_text": "思考過程",
   "release_cancel": "鬆開取消",
   "release_send": "鬆開傳送,上滑取消",
@@ -167,6 +167,8 @@
   "start_chat": "開始對話",
   "stream_output": "串流輸出",
+  "task_has_continued": "任務已繼續運行",
+  "tool_input": "工具輸入",
   "tool_output": "工具輸出",
   "unsupported_file_type": "不支援的檔案類型",
   "upload": "上傳",
   "variable_invisable_in_share": "外部變量在免登錄鏈接中不可見",

@@ -427,7 +427,6 @@
   "core.chat.response.module query": "問題/搜尋詞",
   "core.chat.response.module similarity": "相似度",
   "core.chat.response.module temperature": "溫度",
   "core.chat.response.plugin output": "外掛程式輸出值",
   "core.chat.response.search using reRank": "結果重新排名",
   "core.chat.response.text output": "文字輸出",
   "core.chat.response.update_var_result": "變數更新結果(依序顯示多個變數更新結果)",
@@ -972,6 +971,7 @@
   "option": "選項",
   "page": "頁",
   "page_center": "頁面置中",
+  "page_error": "出現未捕獲的異常。\n\n1. 私有部署用戶,90%是由於模型配置不正確/模型未啟用導致。 \n。\n\n2. 部分系統不兼容相關API。\n大部分是蘋果的safari 瀏覽器導致,可以嘗試更換 chrome。\n\n3. 請關閉瀏覽器翻譯功能,部分翻譯導致頁面崩潰。\n\n\n排除3後,打開控制台的 console 查看具體報錯信息。\n\n如果提示 xxx undefined 的話,就是模型配置不正確,檢查:\n1. 請確保系統內每個系列模型至少有一個可用,可以在【賬號-模型提供商】中檢查。\n\n2. 請確保至少有一個知識庫文件處理模型(語言模型中有一個開關),否則知識庫創建會報錯。\n\n2. 檢查模型中一些“對象”參數是否異常(數組和對象),如果為空,可以嘗試給個空數組或空對象。",
  "pay.amount": "金額",
  "pay.error_desc": "轉換支付途徑時出現了問題",
  "pay.noclose": "支付完成後,請等待系統自動更新",
@@ -556,11 +556,20 @@ const Checkbox = checkBoxMultiStyle({
         bg: 'myGray.100',
         borderColor: 'transparent',
         color: 'myGray.400',
-        outline: 'none'
+        outline: 'none',
+        _hover: {
+          bg: 'myGray.100',
+          borderColor: 'transparent'
+        }
       }
     },
     _hover: {
       borderColor: 'primary.400'
     },
+    _disabled: {
+      _hover: {
+        borderColor: 'inherit'
+      }
+    }
  }
 }
}),
@@ -8768,7 +8768,6 @@ packages:
   next@15.3.5:
     resolution: {integrity: sha512-RkazLBMMDJSJ4XZQ81kolSpwiCt907l0xcgcpF4xC2Vml6QVcPNXW0NQRwQ80FFtSn7UM52XN0anaw8TEJXaiw==}
     engines: {node: ^18.18.0 || ^19.8.0 || >= 20.0.0}
     deprecated: This version has a security vulnerability. Please upgrade to a patched version. See https://nextjs.org/blog/CVE-2025-66478 for more details.
     hasBin: true
     peerDependencies:
       '@opentelemetry/api': ^1.1.0
@@ -9524,7 +9523,7 @@ packages:
   react-dom@19.1.1:
     resolution: {integrity: sha512-Dlq/5LAZgF0Gaz6yiqZCf6VCcZs1ghAJyrsu84Q/GT0gV+mCxbfmKNoGRKBYMJ8IEdGPqu49YWXD02GCknEDkw==}
     peerDependencies:
-      react: 18.3.1
+      react: ^19.1.1

   react-error-boundary@3.1.4:
     resolution: {integrity: sha512-uM9uPzZJTF6wRQORmSrvOIgt4lJ9MC1sNgEOj2XGsDTRE4kmpWxg7ENK9EWNKJRMAOY9z0MuF4yIfl6gp4sotA==}
@@ -17840,8 +17839,8 @@ snapshots:
       '@typescript-eslint/parser': 6.21.0(eslint@8.57.1)(typescript@5.8.2)
       eslint: 8.57.1
       eslint-import-resolver-node: 0.3.9
-      eslint-import-resolver-typescript: 3.9.0(eslint-plugin-import@2.31.0)(eslint@8.57.1)
-      eslint-plugin-import: 2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1)
+      eslint-import-resolver-typescript: 3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1)
+      eslint-plugin-import: 2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1)
       eslint-plugin-jsx-a11y: 6.10.2(eslint@8.57.1)
       eslint-plugin-react: 7.37.4(eslint@8.57.1)
       eslint-plugin-react-hooks: 5.0.0-canary-7118f5dd7-20230705(eslint@8.57.1)
@@ -17860,6 +17859,21 @@ snapshots:
     transitivePeerDependencies:
       - supports-color

+  eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1):
+    dependencies:
+      '@nolyfill/is-core-module': 1.0.39
+      debug: 4.4.0
+      eslint: 8.57.1
+      get-tsconfig: 4.10.0
+      is-bun-module: 1.3.0
+      oxc-resolver: 5.0.0
+      stable-hash: 0.0.5
+      tinyglobby: 0.2.12
+    optionalDependencies:
+      eslint-plugin-import: 2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1)
+    transitivePeerDependencies:
+      - supports-color
+
   eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0):
     dependencies:
       '@nolyfill/is-core-module': 1.0.39
@@ -17890,7 +17904,7 @@ snapshots:
     transitivePeerDependencies:
       - supports-color

-  eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0):
+  eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0):
     dependencies:
       debug: 3.2.7
     optionalDependencies:
@@ -17902,6 +17916,17 @@ snapshots:
       - supports-color

+  eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1):
+    dependencies:
+      debug: 3.2.7
+    optionalDependencies:
+      '@typescript-eslint/parser': 6.21.0(eslint@8.57.1)(typescript@5.8.2)
+      eslint: 8.57.1
+      eslint-import-resolver-node: 0.3.9
+      eslint-import-resolver-typescript: 3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1)
+    transitivePeerDependencies:
+      - supports-color
+
   eslint-module-utils@2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1):
     dependencies:
       debug: 3.2.7
     optionalDependencies:
@@ -17923,7 +17948,7 @@ snapshots:
       doctrine: 2.1.0
       eslint: 8.56.0
       eslint-import-resolver-node: 0.3.9
-      eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0)(eslint@8.56.0))(eslint@8.56.0)
+      eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.56.0)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.56.0)
       hasown: 2.0.2
       is-core-module: 2.16.1
       is-glob: 4.0.3
@@ -17941,7 +17966,7 @@ snapshots:
       - eslint-import-resolver-webpack
       - supports-color

-  eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1):
+  eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0(eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1):
     dependencies:
       '@rtsao/scc': 1.1.0
       array-includes: 3.1.8
@@ -17970,6 +17995,35 @@ snapshots:
       - eslint-import-resolver-webpack
       - supports-color

+  eslint-plugin-import@2.31.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1):
+    dependencies:
+      '@rtsao/scc': 1.1.0
+      array-includes: 3.1.8
+      array.prototype.findlastindex: 1.2.6
+      array.prototype.flat: 1.3.3
+      array.prototype.flatmap: 1.3.3
+      debug: 3.2.7
+      doctrine: 2.1.0
+      eslint: 8.57.1
+      eslint-import-resolver-node: 0.3.9
+      eslint-module-utils: 2.12.0(@typescript-eslint/parser@6.21.0(eslint@8.57.1)(typescript@5.8.2))(eslint-import-resolver-node@0.3.9)(eslint-import-resolver-typescript@3.9.0)(eslint@8.57.1)
+      hasown: 2.0.2
+      is-core-module: 2.16.1
+      is-glob: 4.0.3
+      minimatch: 3.1.2
+      object.fromentries: 2.0.8
+      object.groupby: 1.0.3
+      object.values: 1.2.1
+      semver: 6.3.1
+      string.prototype.trimend: 1.0.9
+      tsconfig-paths: 3.15.0
+    optionalDependencies:
+      '@typescript-eslint/parser': 6.21.0(eslint@8.57.1)(typescript@5.8.2)
+    transitivePeerDependencies:
+      - eslint-import-resolver-typescript
+      - eslint-import-resolver-webpack
+      - supports-color
+
  eslint-plugin-jsx-a11y@6.10.2(eslint@8.56.0):
    dependencies:
      aria-query: 5.3.2
@@ -44,6 +44,7 @@ S3_SECRET_KEY=minioadmin
 S3_PUBLIC_BUCKET=fastgpt-public # 插件文件存储公开桶
 S3_PRIVATE_BUCKET=fastgpt-private # 插件文件存储私有桶
+S3_PATH_STYLE=true # forcePathStyle 默认为 true, 当且仅当设置为 false 时关闭, 其他值都为 true
 S3_REGION= # 如果是本地部署的 MinIO 等服务就不需要;如果是云服务就需要,比如 aws 的或者国内 oss 厂商

 # Redis URL
 REDIS_URL=redis://default:mypassword@127.0.0.1:6379
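
The S3_PATH_STYLE comment encodes a "default on, only a literal false disables" rule. One way to read it in code (a sketch, not the actual config loader):

```ts
// Any value other than the exact string 'false' keeps path-style addressing on.
const forcePathStyle = process.env.S3_PATH_STYLE !== 'false';
```
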
@@ -0,0 +1,15 @@
<svg t="1710841272884" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="2684"
  width="128" height="128">
  <path
    d="M511.968 959.936c298.688 0 447.968-200.576 447.968-448 0-247.36-149.28-447.936-448-447.936C213.28 64 64 264.576 64 511.968c0 247.392 149.248 447.968 447.968 447.968z"
    fill="#FFB02E" p-id="2685"></path>
  <path
    d="M103.936 586.912a31.936 31.936 0 0 0-7.584 25.568 32 32 0 0 0-37.152 51.84l9.344 8a32 32 0 0 0-24.992 56.256l63.52 52.928-4.032-1.984a35.712 35.712 0 0 0-36.672 60.896C107.712 869.76 163.008 908.64 192 928c48 32 102.72 42.944 160 0 32-24 72.48-97.984 29.92-171.712-8.064-13.952-15.296-28.64-18.304-44.48-13.152-69.76-32.8-141.216-75.616-119.808-23.2 11.584-21.184 31.584-18.304 60 1.088 10.784 2.304 22.784 2.304 36l-2.56 1.28-120.384-105.376a32 32 0 0 0-45.12 3.04zM920.096 586.912c6.368 7.296 8.832 16.64 7.584 25.568a32 32 0 0 1 37.12 51.84l-9.344 8a32 32 0 0 1 25.024 56.256l-63.52 52.928 4.032-1.984a35.712 35.712 0 0 1 36.672 60.896C916.32 869.76 861.024 908.64 832 928c-48 32-102.752 42.944-160 0-32-24-72.48-97.984-29.92-171.712 8.064-13.952 15.296-28.64 18.304-44.48 13.152-69.76 32.8-141.216 75.616-119.808 23.2 11.584 21.184 31.584 18.304 60-1.088 10.784-2.304 22.784-2.304 36l2.56 1.28 120.384-105.376a32 32 0 0 1 45.12 3.04z"
    fill="#FF822D" p-id="2686"></path>
  <path
    d="M224 464c0 44.16-28.64 80-64 80s-64-35.84-64-80 28.64-80 64-80 64 35.84 64 80zM928 464c0 44.16-28.64 80-64 80s-64-35.84-64-80 28.64-80 64-80 64 35.84 64 80z"
    fill="#FF6723" p-id="2687"></path>
  <path
    d="M299.168 333.184c-6.72 7.296-10.24 17.024-11.744 24.928a32 32 0 0 1-62.848-12.224c2.848-14.592 9.92-36.896 27.456-55.968C270.496 269.792 298.112 256 336 256c38.24 0 65.984 14.464 84.352 34.624 17.408 19.104 24.64 41.344 27.2 55.904a32 32 0 0 1-63.072 10.944 49.472 49.472 0 0 0-11.456-23.744C367.04 327.104 356.544 320 336 320c-20.896 0-31.104 6.944-36.832 13.184zM651.2 333.184c-6.72 7.296-10.24 17.024-11.776 24.928a32 32 0 0 1-62.816-12.224c2.816-14.592 9.92-36.896 27.424-55.968C622.496 269.792 650.112 256 688 256c38.272 0 65.984 14.464 84.352 34.624 17.408 19.104 24.64 41.344 27.2 55.904a32 32 0 0 1-63.072 10.944 49.44 49.44 0 0 0-11.456-23.744C719.04 327.104 708.544 320 688 320c-20.896 0-31.072 6.944-36.8 13.184zM313.6 492.8a32 32 0 1 0-51.2 38.4c22.464 29.952 96.256 92.8 249.6 92.8s227.136-62.848 249.6-92.8a32 32 0 0 0-51.2-38.4c-9.536 12.704-63.744 67.2-198.4 67.2s-188.864-54.496-198.4-67.2z"
    fill="#402A32" p-id="2688"></path>
</svg>
@@ -30,6 +30,7 @@ import { POST } from '@/web/common/api/request';
 import { getErrText } from '@fastgpt/global/common/error/utils';
 import { formatFileSize } from '@fastgpt/global/common/file/tools';
 import { WorkflowRuntimeContext } from '@/components/core/chat/ChatContainer/context/workflowRuntimeContext';
+import { useSafeTranslation } from '@fastgpt/web/hooks/useSafeTranslation';

 const FileSelector = ({
   value,
@@ -53,7 +54,7 @@ const FileSelector = ({
 }) => {
   const { feConfigs } = useSystemStore();
   const { toast } = useToast();
-  const { t } = useTranslation();
+  const { t } = useSafeTranslation();

   const appId = useContextSelector(WorkflowRuntimeContext, (v) => v.appId);
   const chatId = useContextSelector(WorkflowRuntimeContext, (v) => v.chatId);
@@ -491,7 +492,7 @@ const FileSelector = ({
         </HStack>
         {file?.error && (
           <Box mt={1} fontSize={'xs'} color={'red.600'}>
-            {file?.error}
+            {t(file.error)}
          </Box>
        )}
      </Box>
@@ -350,10 +350,8 @@ export const WholeResponseContent = ({
       </>
       {/* plugin */}
       <>
-        <Row
-          label={t('common:core.chat.response.plugin output')}
-          value={activeModule?.pluginOutput}
-        />
+        <Row label={t('chat:tool_input')} value={activeModule?.toolInput} />
+        <Row label={t('chat:tool_output')} value={activeModule?.pluginOutput} />
      </>
      {/* text output */}
      <Row label={t('common:core.chat.response.text output')} value={activeModule?.textOutput} />
@@ -10,7 +10,6 @@ export type GetAppChatLogsProps = {
   sources?: ChatSourceEnum[];
   tmbIds?: string[];
   chatSearch?: string;
-  locale?: keyof I18nName;
 };

 export type GetAppChatLogsParams = PaginationProps<GetAppChatLogsProps>;
@@ -6,6 +6,8 @@ import { exit } from 'process';
 export async function register() {
+  try {
     if (process.env.NEXT_RUNTIME === 'nodejs') {
+      await import('@fastgpt/service/common/proxy');

      // 基础系统初始化
      const [
        { connectMongo },
@@ -20,7 +20,6 @@ import MultipleSelect, {
 import React, { useMemo, useState } from 'react';
 import { useTranslation } from 'next-i18next';
 import DateRangePicker from '@fastgpt/web/components/common/DateRangePicker';
-import { addDays } from 'date-fns';
 import { useScrollPagination } from '@fastgpt/web/hooks/useScrollPagination';
 import { getTeamMembers } from '@/web/support/user/team/api';
 import Avatar from '@fastgpt/web/components/common/Avatar';
@@ -50,7 +49,8 @@ import dynamic from 'next/dynamic';
 import type { HeaderControlProps } from './LogChart';
 import { useSystemStore } from '@/web/common/system/useSystemStore';
 import MyBox from '@fastgpt/web/components/common/MyBox';
-import type { I18nName } from '@fastgpt/service/common/geo/type';
+import { useContextSelector } from 'use-context-selector';
+import { AppContext } from '../context';

 const DetailLogsModal = dynamic(() => import('./DetailLogsModal'));

@@ -65,10 +65,11 @@ const LogTable = ({
   showSourceSelector = true,
   px = [4, 8]
 }: HeaderControlProps) => {
-  const { t, i18n } = useTranslation();
+  const { t } = useTranslation();
   const { feConfigs } = useSystemStore();

   const [detailLogsId, setDetailLogsId] = useState<string>();
+  const appName = useContextSelector(AppContext, (v) => v.appDetail.name);

   // source
   const sourceList = useMemo(
@@ -147,15 +148,14 @@ const LogTable = ({
     const headerTitle = enabledKeys.map((k) => t(AppLogKeysEnumMap[k])).join(',');
     await downloadFetch({
       url: '/api/core/app/exportChatLogs',
-      filename: 'chat_logs.csv',
+      filename: t('app:export_log_filename', { name: appName }),
       body: {
         appId,
         dateStart: dayjs(dateRange.from || new Date()).format(),
-        dateEnd: dayjs(addDays(dateRange.to || new Date(), 1)).format(),
+        dateEnd: dayjs(dateRange.to || new Date()).format(),
         sources: isSelectAllSource ? undefined : chatSources,
         tmbIds: isSelectAllTmb ? undefined : selectTmbIds,
         chatSearch,
-        locale: i18n.language === 'zh-CN' ? 'zh' : 'en',
         title: `${headerTitle},${t('app:logs_keys_chatDetails')}`,
         logKeys: enabledKeys,
         sourcesMap: Object.fromEntries(
@@ -180,8 +180,7 @@ const LogTable = ({
       dateEnd: dateRange.to!,
       sources: isSelectAllSource ? undefined : chatSources,
       tmbIds: isSelectAllTmb ? undefined : selectTmbIds,
-      chatSearch,
-      locale: (i18n.language === 'zh-CN' ? 'zh' : 'en') as keyof I18nName
+      chatSearch
     }),
     [
       appId,
@@ -191,8 +190,7 @@ const LogTable = ({
     isSelectAllSource,
     selectTmbIds,
    isSelectAllTmb,
-    chatSearch,
-    i18n.language
+    chatSearch
  ]
);
@@ -90,10 +90,7 @@ const DingTalkEditModal = ({
             <Box color="myGray.600">{t('publish:dingtalk.api')}</Box>
             {feConfigs?.docUrl && (
               <Link
-                href={
-                  feConfigs.openAPIDocUrl ||
-                  getDocPath('/docs/use-cases/external-integration/dingtalk/')
-                }
+                href={getDocPath('/docs/use-cases/external-integration/dingtalk/')}
                 target={'_blank'}
                 ml={2}
                 color={'primary.500'}

@@ -74,10 +74,7 @@ const DingTalk = ({ appId }: { appId: string }) => {
           </Box>
           {feConfigs?.docUrl && (
             <Link
-              href={
-                feConfigs.openAPIDocUrl ||
-                getDocPath('/docs/use-cases/external-integration/dingtalk/')
-              }
+              href={getDocPath('/docs/use-cases/external-integration/dingtalk/')}
               target={'_blank'}
               color={'primary.500'}
               fontSize={'sm'}

@@ -90,10 +90,7 @@ const FeiShuEditModal = ({
             <Box color="myGray.600">{t('publish:feishu_api')}</Box>
             {feConfigs?.docUrl && (
               <Link
-                href={
-                  feConfigs.openAPIDocUrl ||
-                  getDocPath('/docs/use-cases/external-integration/feishu/')
-                }
+                href={getDocPath('/docs/use-cases/external-integration/feishu/')}
                 target={'_blank'}
                 ml={2}
                 color={'primary.500'}

@@ -73,10 +73,7 @@ const FeiShu = ({ appId }: { appId: string }) => {
           </Box>
           {feConfigs?.docUrl && (
             <Link
-              href={
-                feConfigs.openAPIDocUrl ||
-                getDocPath('/docs/use-cases/external-integration/feishu/')
-              }
+              href={getDocPath('/docs/use-cases/external-integration/feishu')}
               target={'_blank'}
               color={'primary.500'}
               fontSize={'sm'}

@@ -96,10 +96,7 @@ const OffiAccountEditModal = ({
             <Box color="myGray.600">{t('publish:official_account.params')}</Box>
             {feConfigs?.docUrl && (
               <Link
-                href={
-                  feConfigs.openAPIDocUrl ||
-                  getDocPath('/docs/use-cases/external-integration/official_account/')
-                }
+                href={getDocPath('/docs/use-cases/external-integration/official_account/')}
                 target={'_blank'}
                 ml={2}
                 color={'primary.500'}

@@ -75,10 +75,7 @@ const OffiAccount = ({ appId }: { appId: string }) => {

           {feConfigs?.docUrl && (
             <Link
-              href={
-                feConfigs.openAPIDocUrl ||
-                getDocPath('/docs/use-cases/external-integration/official_account/')
-              }
+              href={getDocPath('/docs/use-cases/external-integration/official_account')}
              target={'_blank'}
              ml={2}
              color={'primary.500'}
@@ -22,7 +22,7 @@ const RenderList: Record<
     Component: dynamic(() => import('./templates/Reference'))
   },
   [FlowNodeInputTypeEnum.fileSelect]: {
-    Component: dynamic(() => import('./templates/Reference'))
+    Component: dynamic(() => import('./templates/FileSelect'))
   },
   [FlowNodeInputTypeEnum.selectApp]: {
     Component: dynamic(() => import('./templates/SelectApp'))
@@ -135,6 +135,8 @@ const RenderInput = ({ flowInputList, nodeId, CustomComponent, mb = 5 }: Props)

   if (!RenderItem) return null;
+
+  console.log(renderType, input);

  return {
    Component: (
      <RenderItem.Component inputs={filterProInputs} item={input} nodeId={nodeId} />
@@ -0,0 +1,127 @@
import React, { useCallback, useMemo, useState } from 'react';
import type { RenderInputProps } from '../type';
import { Box, Button, HStack, Input, InputGroup, useDisclosure, VStack } from '@chakra-ui/react';
import type { SelectAppItemType } from '@fastgpt/global/core/workflow/template/system/abandoned/runApp/type';
import Avatar from '@fastgpt/web/components/common/Avatar';
import SelectAppModal from '../../../../SelectAppModal';
import { useTranslation } from 'next-i18next';
import { useContextSelector } from 'use-context-selector';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
import { getAppDetailById } from '@/web/core/app/api';
import { WorkflowActionsContext } from '@/pageComponents/app/detail/WorkflowComponents/context/workflowActionsContext';
import { AppContext } from '@/pageComponents/app/detail/context';
import MyIcon from '@fastgpt/web/components/common/Icon';
import MyDivider from '@fastgpt/web/components/common/MyDivider';
import { getFileIcon } from '@fastgpt/global/common/file/icon';
import MyAvatar from '@fastgpt/web/components/common/Avatar';
import IconButton from '@/pageComponents/account/team/OrgManage/IconButton';
import MyIconButton from '@fastgpt/web/components/common/Icon/button';
import MyTooltip from '@fastgpt/web/components/common/MyTooltip';

const FileSelectRender = ({ item, nodeId }: RenderInputProps) => {
  const { t } = useTranslation();
  const onChangeNode = useContextSelector(WorkflowActionsContext, (v) => v.onChangeNode);

  const [urlInput, setUrlInput] = useState('');
  const values = useMemo(() => {
    if (Array.isArray(item.value)) {
      return item.value;
    }
    return [];
  }, [item.value]);
  const maxSelectFiles = item.maxFiles || 10;
  const isMaxSelected = values.length >= maxSelectFiles;

  const handleAddUrl = useCallback(
    (value: string) => {
      if (!value.trim()) return;

      onChangeNode({
        nodeId,
        type: 'updateInput',
        key: item.key,
        value: {
          ...item,
          value: [value.trim(), ...values]
        }
      });
      setUrlInput('');
    },
    [item, nodeId, onChangeNode, values]
  );
  const handleDeleteUrl = useCallback(
    (index: number) => {
      onChangeNode({
        nodeId,
        type: 'updateInput',
        key: item.key,
        value: {
          ...item,
          value: values.filter((_, i) => i !== index)
        }
      });
    },
    [item, nodeId, onChangeNode, values]
  );

  return (
    <Box w={'500px'}>
      <Box w={'100%'}>
        <InputGroup display={'flex'} alignItems={'center'}>
          <MyIcon
            position={'absolute'}
            left={2.5}
            name="common/addLight"
            w={'1.2rem'}
            color={'primary.600'}
            zIndex={10}
          />
          <Input
            isDisabled={isMaxSelected}
            value={urlInput}
            onChange={(e) => setUrlInput(e.target.value)}
            onBlur={(e) => handleAddUrl(e.target.value)}
            border={'1.5px dashed'}
            borderColor={'myGray.250'}
            borderRadius={'md'}
            pl={8}
            py={1.5}
            placeholder={
              isMaxSelected ? t('file:reached_max_file_count') : t('chat:click_to_add_url')
            }
          />
        </InputGroup>
      </Box>
      {/* Render */}
      {values.length > 0 && (
        <>
          <MyDivider />
          <VStack>
            {values.map((url, index) => {
              const fileIcon = getFileIcon(url, 'common/link');
              return (
                <Box key={index} w={'full'}>
                  <HStack py={2} px={3} bg={'white'} borderRadius={'md'} border={'sm'}>
                    <MyAvatar src={fileIcon} w={'1.2rem'} />
                    <Box fontSize={'sm'} flex={'1 0 0'} title={url} className="textEllipsis">
                      {url}
                    </Box>
                    {/* Status icon */}
                    <MyIconButton
                      icon={'close'}
                      onClick={() => handleDeleteUrl(index)}
                      hoverColor="red.600"
                      hoverBg="red.50"
                    />
                  </HStack>
                </Box>
              );
            })}
          </VStack>
        </>
      )}
    </Box>
  );
};

export default React.memo(FileSelectRender);
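
Note that FileSelectRender stores bare URL strings in `item.value`, while chat-side fileSelect values are objects carrying a `url` field; the fileSelect branch added to dispatchRunPlugin earlier maps the latter down to strings. A small sketch of that normalization, with the union type as an assumption:

```ts
type FileSelectValue = string | { url: string };

// Normalize mixed fileSelect values to plain URL strings.
const toUrlList = (values: FileSelectValue[]): string[] =>
  values.map((v) => (typeof v === 'string' ? v : v.url));
```
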
@@ -17,6 +17,7 @@ import { type NextPage } from 'next';
 import { getWebReqUrl } from '@fastgpt/web/common/system/utils';
 import SystemStoreContextProvider from '@fastgpt/web/context/useSystem';
 import { useRouter } from 'next/router';
+import { errorLogger } from '@/web/common/utils/errorLogger';

 type NextPageWithLayout = NextPage & {
   setLayout?: (page: ReactElement) => JSX.Element;
@@ -45,6 +46,9 @@ function App({ Component, pageProps }: AppPropsWithLayout) {
       },
       { passive: false }
     );
+
+    // Initialize error logger
+    errorLogger.init();
  }, []);

  const setLayout = Component.setLayout || ((page) => <>{page}</>);
@@ -1,26 +1,25 @@
-import { useEffect } from 'react';
 import { useRouter } from 'next/router';
 import { serviceSideProps } from '@/web/common/i18n/utils';
 import { useSystemStore } from '@/web/common/system/useSystemStore';
 import { Box } from '@chakra-ui/react';
-import { TrackEventName } from '@/web/common/system/constants';
 import { useToast } from '@fastgpt/web/hooks/useToast';
+import { webPushTrack } from '@/web/common/middle/tracks/utils';
+import { useTranslation } from 'next-i18next';
+import { errorLogger } from '@/web/common/utils/errorLogger';
+import { useMount } from 'ahooks';

 function Error() {
+  const { t } = useTranslation();
   const router = useRouter();
   const { toast } = useToast();
   const { lastRoute, llmModelList, embeddingModelList } = useSystemStore();

-  useEffect(() => {
-    setTimeout(() => {
-      window.umami?.track(TrackEventName.pageError, {
-        userAgent: navigator.userAgent,
-        platform: navigator.platform,
-        appName: navigator.appName,
-        lastRoute,
-        route: router.asPath
-      });
-    }, 1000);
+  useMount(() => {
+    // Send track
+    webPushTrack.clientError({
+      route: lastRoute,
+      log: errorLogger.getLogs()
+    });

     let modelError = false;
     if (llmModelList.length === 0) {
@@ -51,23 +50,9 @@ function Error() {
         router.push('/dashboard/agent');
       }
     }, 2000);
-  }, []);
+  });

-  return (
-    <Box whiteSpace={'pre-wrap'}>
-      {`出现未捕获的异常。
-1. 私有部署用户,90%是由于模型配置不正确/模型未启用导致。。
-2. 部分系统不兼容相关API。大部分是苹果的safari 浏览器导致,可以尝试更换 chrome。
-3. 请关闭浏览器翻译功能,部分翻译导致页面崩溃。
-
-排除3后,打开控制台的 console 查看具体报错信息。
-如果提示 xxx undefined 的话,就是模型配置不正确,检查:
-1. 请确保系统内每个系列模型至少有一个可用,可以在【账号-模型提供商】中检查。
-2. 请确保至少有一个知识库文件处理模型(语言模型中有一个开关),否则知识库创建会报错。
-2. 检查模型中一些“对象”参数是否异常(数组和对象),如果为空,可以尝试给个空数组或空对象。
-`}
-    </Box>
-  );
+  return <Box whiteSpace={'pre-wrap'}>{t('common:page_error')}</Box>;
 }

 export async function getServerSideProps(context: any) {
@@ -67,7 +67,8 @@ const CustomDomain = () => {
   });

   const { ConfirmModal, openConfirm } = useConfirm({
-    content: t('account:custom_domain.delete_confirm')
+    content: t('account:custom_domain.delete_confirm'),
+    type: 'delete'
  });

  const [editDomain, setEditDomain] = useState<CustomDomainType | undefined>(undefined);
@@ -125,6 +125,9 @@ const InformTable = () => {
     },
+    '& p': {
+      my: 0
+    },
     '& ol, & ul': {
       paddingInlineStart: '1.25em'
     }
  }}
  noOfLines={6}
@@ -84,7 +84,7 @@ async function recoverCollectionFile({

   // 直接使用 bucket.putObject 上传到指定的 key
   const s3Source = getS3DatasetSource();
-  await s3Source.bucket.putObject(s3Key, buffer, buffer.length, {
+  await s3Source.client.putObject(s3Source.bucketName, s3Key, buffer, buffer.length, {
    'content-type': 'application/octet-stream',
    'upload-time': new Date().toISOString(),
    'origin-filename': encodeURIComponent(filename)
@@ -31,7 +31,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<

   const [file, fileStream] = await Promise.all([
     getS3DatasetSource().getFileMetadata(fileId),
-    getS3DatasetSource().getDatasetFileStream(fileId)
+    getS3DatasetSource().getFileStream(fileId)
  ]);

  if (!file) {
@@ -39,11 +39,3 @@ async function handler(
 }

 export default NextAPI(useIPFrequencyLimit({ id: 'push-tracks', seconds: 1, limit: 5 }), handler);
-
-export const config = {
-  api: {
-    bodyParser: {
-      sizeLimit: '5kb'
-    }
-  }
-};
@@ -26,7 +26,7 @@ import { getAppLatestVersion } from '@fastgpt/service/core/app/version/controlle
 import { VariableInputEnum } from '@fastgpt/global/core/workflow/constants';
 import { getTimezoneCodeFromStr } from '@fastgpt/global/common/time/timezone';
 import { getLocationFromIp } from '@fastgpt/service/common/geo';
-import type { I18nName } from '@fastgpt/service/common/geo/type';
+import { getLocale } from '@fastgpt/service/common/middle/i18n';

 const formatJsonString = (data: any) => {
   if (data == null) return '';
@@ -40,7 +40,6 @@ export type ExportChatLogsBody = GetAppChatLogsProps & {
   title: string;
   sourcesMap: Record<string, { label: string }>;
   logKeys: AppLogKeysEnum[];
-  locale?: keyof I18nName;
 };

 async function handler(req: ApiRequestProps<ExportChatLogsBody, {}>, res: NextApiResponse) {
@@ -51,7 +50,6 @@ async function handler(req: ApiRequestProps<ExportChatLogsBody, {}>, res: NextAp
     sources,
     tmbIds,
     chatSearch,
-    locale = 'en',
     title,
     sourcesMap,
     logKeys = []
@@ -61,6 +59,7 @@ async function handler(req: ApiRequestProps<ExportChatLogsBody, {}>, res: NextAp
     throw new Error('缺少参数');
   }

+  const locale = getLocale(req);
  const timezoneCode = getTimezoneCodeFromStr(dateStart);

  const { teamId, tmbId, app } = await authApp({
@@ -1,8 +1,7 @@
-import type { NextApiRequest, NextApiResponse } from 'next';
+import type { NextApiResponse } from 'next';
 import { MongoChat } from '@fastgpt/service/core/chat/chatSchema';
 import { type AppLogsListItemType } from '@/types/app';
 import { Types } from '@fastgpt/service/common/mongo';
-import { addDays } from 'date-fns';
 import type { GetAppChatLogsParams } from '@/global/core/api/appReq.d';
 import { authApp } from '@fastgpt/service/support/permission/app/auth';
 import {
@@ -19,12 +18,13 @@ import { getLocationFromIp } from '@fastgpt/service/common/geo';
 import { AppReadChatLogPerVal } from '@fastgpt/global/support/permission/app/constant';
 import { CommonErrEnum } from '@fastgpt/global/common/error/code/common';
 import type { ApiRequestProps } from '@fastgpt/service/type/next';
+import { getLocale } from '@fastgpt/service/common/middle/i18n';

 async function handler(
   req: ApiRequestProps<GetAppChatLogsParams>,
   _res: NextApiResponse
 ): Promise<PaginationResponse<AppLogsListItemType>> {
-  const { appId, dateStart, dateEnd, sources, tmbIds, chatSearch, locale = 'en' } = req.body;
+  const { appId, dateStart, dateEnd, sources, tmbIds, chatSearch } = req.body;

   const { pageSize = 20, offset } = parsePaginationRequest(req);

@@ -294,7 +294,7 @@ async function handler(

   const listWithRegion = list.map((item) => {
     const ip = item.region;
-    const region = getLocationFromIp(ip, locale);
+    const region = getLocationFromIp(ip, getLocale(req));

    return {
      ...item,
@@ -82,7 +82,10 @@ async function handler(

     const s3ImageIds = imageIds.filter((id) => isS3ObjectKey(id, 'dataset'));
     for (const id of s3ImageIds) {
-      imageSizeMap.set(id, (await getS3DatasetSource().getFileMetadata(id)).contentLength);
+      const metadata = await getS3DatasetSource().getFileMetadata(id);
+      if (metadata) {
+        imageSizeMap.set(id, metadata.contentLength);
+      }
    }
  }
@@ -1,7 +1,7 @@
 import type { NextApiRequest, NextApiResponse } from 'next';
 import { jsonRes } from '@fastgpt/service/common/response';

-import { request } from 'http';
+import { Agent, request } from 'http';
 import { FastGPTProUrl } from '@fastgpt/service/common/system/constants';

 export default async function handler(req: NextApiRequest, res: NextApiResponse) {
@@ -25,7 +25,8 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
     port: parsedUrl.port,
     path: requestPath,
     method: req.method,
-    headers: req.headers
+    headers: req.headers,
+    agent: new Agent()
  });
  req.pipe(requestResult);
@@ -20,21 +20,23 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
     (() => {
       if (isS3ObjectKey(objectKey, 'dataset')) {
         return [
-          s3DatasetSource.getDatasetFileStream(objectKey),
+          s3DatasetSource.getFileStream(objectKey),
           s3DatasetSource.getFileMetadata(objectKey)
         ];
       } else {
         return [
-          s3ChatSource.getChatFileStream(objectKey),
+          s3ChatSource.getFileStream(objectKey),
           s3ChatSource.getFileMetadata(objectKey)
         ];
       }
     })()
   );

-  res.setHeader('Content-Type', metadata.contentType);
+  if (metadata) {
+    res.setHeader('Content-Type', metadata.contentType);
+    res.setHeader('Content-Length', metadata.contentLength);
+  }
   res.setHeader('Cache-Control', 'public, max-age=31536000');
-  res.setHeader('Content-Length', metadata.contentLength);

  stream.pipe(res);
@@ -35,5 +35,11 @@ export const webPushTrack = {
       event: TrackEnum.closeOperationalAd,
       data
     });
-  }
+  },
+  clientError: (data: { route: string; log: string }) => {
+    return createTrack({
+      event: TrackEnum.clientError,
+      data
+    });
+  }
 };
@@ -0,0 +1,171 @@
+/**
+ * Browser error logger that keeps only the most recent errors (up to maxLogs)
+ */
+
+export interface ErrorLog {
+  timestamp: number;
+  type: 'console.error' | 'runtime.error' | 'unhandled.rejection';
+  message: string;
+  stack?: string;
+  url?: string;
+  line?: number;
+  column?: number;
+}
+
+class ErrorLogger {
+  private maxLogs = 20;
+  private logs: ErrorLog[] = [];
+  private isInitialized = false;
+  private originalConsoleError: typeof console.error;
+
+  constructor() {
+    this.originalConsoleError = console.error;
+  }
+
+  /**
+   * Initialize the error logger: override console.error and set up global error handlers
+   */
+  init() {
+    if (this.isInitialized || typeof window === 'undefined') return;
+    this.isInitialized = true;
+
+    // Override console.error
+    this.overrideConsoleError();
+
+    // Setup global error handlers
+    this.setupGlobalErrorHandlers();
+  }
+
+  /**
+   * Override console.error to capture error logs
+   */
+  private overrideConsoleError() {
+    const self = this;
+    console.error = function (...args: any[]) {
+      // Call original console.error
+      self.originalConsoleError.apply(console, args);
+
+      // Capture error log
+      try {
+        const message = args
+          .map((arg) => {
+            if (arg instanceof Error) {
+              return arg.message;
+            }
+            return typeof arg === 'object' ? JSON.stringify(arg) : String(arg);
+          })
+          .join(' ');
+
+        const stack = args.find((arg) => arg instanceof Error)?.stack;
+
+        self.addLog({
+          timestamp: Date.now(),
+          type: 'console.error',
+          message,
+          stack
+        });
+      } catch (e) {
+        // Silently fail to avoid an infinite loop
+      }
+    };
+  }
+
+  /**
+   * Set up global handlers for uncaught errors and unhandled promise rejections
+   */
+  private setupGlobalErrorHandlers() {
+    // Capture JavaScript runtime errors
+    window.addEventListener('error', (event) => {
+      this.addLog({
+        timestamp: Date.now(),
+        type: 'runtime.error',
+        message: event.message || 'Unknown error',
+        stack: event.error?.stack,
+        url: event.filename,
+        line: event.lineno,
+        column: event.colno
+      });
+    });
+
+    // Capture unhandled promise rejections
+    window.addEventListener('unhandledrejection', (event) => {
+      const reason = event.reason;
+      const message =
+        reason instanceof Error
+          ? reason.message
+          : typeof reason === 'string'
+            ? reason
+            : JSON.stringify(reason);
+      const stack = reason instanceof Error ? reason.stack : undefined;
+
+      this.addLog({
+        timestamp: Date.now(),
+        type: 'unhandled.rejection',
+        message: `Unhandled Promise Rejection: ${message}`,
+        stack
+      });
+    });
+  }
+
+  /**
+   * Add a new error log, keeping only the last maxLogs entries
+   */
+  private addLog(log: ErrorLog) {
+    this.logs.push(log);
+
+    // Keep only the last maxLogs entries
+    if (this.logs.length > this.maxLogs) {
+      this.logs.shift();
+    }
+  }
+
+  /**
+   * Get all error logs as a formatted string
+   */
+  getLogs(): string {
+    if (this.logs.length === 0) {
+      return 'No error logs';
+    }
+
+    return this.logs
+      .map((log, index) => {
+        const timestamp = new Date(log.timestamp).toLocaleString('zh-CN');
+        let logStr = `[${index + 1}] ${timestamp} - [${log.type}]\n`;
+        logStr += `Message: ${log.message}\n`;
+
+        if (log.url) {
+          logStr += `Location: ${log.url}`;
+          if (log.line !== undefined) logStr += `:${log.line}`;
+          if (log.column !== undefined) logStr += `:${log.column}`;
+          logStr += '\n';
+        }
+
+        if (log.stack) {
+          logStr += `Stack: ${log.stack}\n`;
+        }
+
+        return logStr;
+      })
+      .join('\n---\n\n');
+  }
+
+  /**
+   * Clear all error logs
+   */
+  clearLogs() {
+    this.logs = [];
+  }
+
+  /**
+   * Restore the original console.error (useful for cleanup)
+   */
+  restore() {
+    if (this.isInitialized) {
+      console.error = this.originalConsoleError;
+      this.isInitialized = false;
+    }
+  }
+}
+
+// Create singleton instance
+export const errorLogger = new ErrorLogger();
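A rough wiring sketch for the new logger together with the clientError track added above. The import paths and the error-page hook are assumptions for illustration; only errorLogger.init(), errorLogger.getLogs(), and webPushTrack.clientError({ route, log }) come from this commit:

// Hypothetical import paths, not part of this commit:
import { errorLogger } from '@/web/common/utils/errorLogger';
import { webPushTrack } from '@/web/support/marketing/utils';

// Initialize once on the client (e.g. in _app.tsx):
errorLogger.init();

// When an error page renders, attach the captured browser logs:
webPushTrack.clientError({
  route: window.location.pathname,
  log: errorLogger.getLogs()
});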
@@ -72,7 +72,7 @@ type ChatItemContextType = {
   pluginRunTab: PluginRunBoxTabEnum;
   setPluginRunTab: React.Dispatch<React.SetStateAction<PluginRunBoxTabEnum>>;
   resetVariables: (props?: {
-    variables?: Record<string, any>;
+    variables: Record<string, any> | undefined;
     variableList?: VariableItemType[];
   }) => void;
   clearChatRecords: () => void;
@@ -95,7 +95,7 @@ export const ChatItemContext = createContext<ChatItemContextType>({
     throw new Error('Function not implemented.');
   },
   resetVariables: function (props?: {
-    variables?: Record<string, any>;
+    variables: Record<string, any> | undefined;
     variableList?: VariableItemType[];
   }): void {
     throw new Error('Function not implemented.');
@@ -144,17 +144,24 @@ const ChatItemContextProvider = ({
     (props?: { variables?: Record<string, any>; variableList?: VariableItemType[] }) => {
       const { variables = {}, variableList = [] } = props || {};
 
-      const varValues: Record<string, any> = {};
-
-      variableList.forEach((item) => {
-        varValues[item.key] = variables[item.key] ?? variables[item.label] ?? item.defaultValue;
-      });
       const values = variablesForm.getValues();
 
-      variablesForm.reset({
-        ...values,
-        variables: varValues
-      });
+      if (variableList.length) {
+        const varValues: Record<string, any> = {};
+        variableList.forEach((item) => {
+          varValues[item.key] = variables[item.key] ?? variables[item.label] ?? item.defaultValue;
+        });
+
+        variablesForm.reset({
+          ...values,
+          variables: varValues
+        });
+      } else {
+        variablesForm.reset({
+          ...values,
+          variables
+        });
+      }
     },
     [variablesForm]
   );
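A small illustration of the behavioral change in resetVariables, assuming the provider above:

// Before: a call without variableList reset the form's variables to {}
// (the forEach over an empty list produced nothing). Now the raw object
// is written through unchanged.
resetVariables({ variables: { city: 'Berlin' }, variableList: undefined });
// variablesForm now holds { ...previousValues, variables: { city: 'Berlin' } }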
@@ -1,129 +0,0 @@
-import { describe, expect, it } from 'vitest';
-import { addDays } from 'date-fns';
-import { getFakeUsers } from '@test/datas/users';
-import { StreamCall } from '@test/utils/request';
-import { MongoApp } from '@fastgpt/service/core/app/schema';
-import { MongoChat } from '@fastgpt/service/core/chat/chatSchema';
-import { MongoChatItem } from '@fastgpt/service/core/chat/chatItemSchema';
-import { MongoResourcePermission } from '@fastgpt/service/support/permission/schema';
-import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
-import { ChatSourceEnum } from '@fastgpt/global/core/chat/constants';
-import { AppReadChatLogPerVal } from '@fastgpt/global/support/permission/app/constant';
-import { AppLogKeysEnum } from '@fastgpt/global/core/app/logs/constants';
-import * as exportChatLogsApi from '@/pages/api/core/app/exportChatLogs';
-import type { ExportChatLogsBody } from '@/pages/api/core/app/exportChatLogs';
-
-describe('exportChatLogs API', () => {
-  it('should export chat logs successfully, including the chatItemId of feedback data', async () => {
-    // Create test users and app
-    const users = await getFakeUsers(2);
-    const user = users.owner;
-
-    // Create test app
-    const app = await MongoApp.create({
-      name: 'Test App',
-      type: AppTypeEnum.simple,
-      teamId: user.teamId,
-      tmbId: user.tmbId,
-      modules: [],
-      version: 'v2',
-      chatConfig: {
-        variables: [{ key: 'var1', label: '变量1', type: 'input' }]
-      }
-    });
-
-    // Grant permission
-    await MongoResourcePermission.create({
-      resourceType: 'app',
-      teamId: user.teamId,
-      resourceId: String(app._id),
-      tmbId: user.tmbId,
-      permission: AppReadChatLogPerVal
-    });
-
-    // Create a test chat record
-    const chatId = 'test-chat-' + Date.now();
-    const chat = await MongoChat.create({
-      chatId,
-      userId: user.userId,
-      teamId: user.teamId,
-      tmbId: user.tmbId,
-      appId: app._id,
-      title: '测试对话',
-      source: ChatSourceEnum.online,
-      variables: { var1: 'value1' },
-      updateTime: new Date(),
-      createTime: new Date()
-    });
-
-    // Create a chat item with good feedback
-    const goodFeedbackItem = await MongoChatItem.create({
-      chatId,
-      userId: user.userId,
-      teamId: user.teamId,
-      tmbId: user.tmbId,
-      appId: app._id,
-      obj: 'AI',
-      value: [{ type: 'text', text: { content: '好回答' } }],
-      userGoodFeedback: '这个回答很好',
-      time: new Date()
-    });
-
-    // Create a chat item with bad feedback
-    const badFeedbackItem = await MongoChatItem.create({
-      chatId,
-      userId: user.userId,
-      teamId: user.teamId,
-      tmbId: user.tmbId,
-      appId: app._id,
-      obj: 'AI',
-      value: [{ type: 'text', text: { content: '差回答' } }],
-      userBadFeedback: '不够详细',
-      time: new Date()
-    });
-
-    // Run the export, using StreamCall to consume the streamed CSV response
-    const dateStart = addDays(new Date(), -1);
-    const dateEnd = addDays(new Date(), 1);
-
-    const result: any = await StreamCall<ExportChatLogsBody, {}, {}>(exportChatLogsApi.default, {
-      auth: user,
-      body: {
-        appId: String(app._id),
-        dateStart,
-        dateEnd,
-        title: '对话日志',
-        sourcesMap: {
-          [ChatSourceEnum.online]: { label: '在线对话' }
-        },
-        logKeys: [AppLogKeysEnum.SESSION_ID, AppLogKeysEnum.FEEDBACK]
-      }
-    });
-
-    // Verify the export succeeded
-    expect(result.error).toBeUndefined();
-    expect(result.code).toBe(200);
-
-    // Verify the CSV data
-    const csvData = result.raw as string;
-    expect(csvData).toBeDefined();
-    expect(csvData.length).toBeGreaterThan(0);
-
-    // Verify the response headers carry the correct Content-Type and Content-Disposition
-    expect(result.headers?.['Content-Type']).toContain('text/csv');
-    expect(result.headers?.['Content-Disposition']).toContain('attachment');
-
-    // Core check: the feedback data should carry the chatItemId
-    // Good feedback should contain the chatItemId and the feedback text
-    expect(csvData).toContain(String(goodFeedbackItem._id));
-    expect(csvData).toContain('这个回答很好');
-
-    // Bad feedback should contain the chatItemId and the feedback text
-    expect(csvData).toContain(String(badFeedbackItem._id));
-    expect(csvData).toContain('不够详细');
-
-    // Verify the feedback format (with CSV escaping, embedded quotes are doubled)
-    expect(csvData).toMatch(/good.*chatItemId/);
-    expect(csvData).toMatch(/bad.*chatItemId/);
-  });
-});
@@ -8,7 +8,7 @@ const createMockS3Bucket = () => ({
   exist: vi.fn().mockResolvedValue(true),
   delete: vi.fn().mockResolvedValue(undefined),
   putObject: vi.fn().mockResolvedValue(undefined),
-  getObject: vi.fn().mockResolvedValue(null),
+  getFileStream: vi.fn().mockResolvedValue(null),
   statObject: vi.fn().mockResolvedValue({ size: 0, etag: 'mock-etag' }),
   move: vi.fn().mockResolvedValue(undefined),
   copy: vi.fn().mockResolvedValue(undefined),
@@ -39,7 +39,7 @@ const createMockMinioClient = vi.hoisted(() => {
   copyObject: vi.fn().mockResolvedValue(undefined),
   removeObject: vi.fn().mockResolvedValue(undefined),
   putObject: vi.fn().mockResolvedValue({ etag: 'mock-etag' }),
-  getObject: vi.fn().mockResolvedValue(null),
+  getFileStream: vi.fn().mockResolvedValue(null),
   statObject: vi.fn().mockResolvedValue({ size: 0, etag: 'mock-etag' }),
   presignedGetObject: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-object'),
   presignedPostPolicy: vi.fn().mockResolvedValue({
@@ -81,7 +81,7 @@ const createMockBucketClass = (defaultName: string) => {
   }
   async delete() {}
   async putObject() {}
-  async getObject() {
+  async getFileStream() {
     return null;
   }
   async statObject() {