feature: V4.12.2 (#5525)

* feat: favorite apps & quick apps with their own configuration (#5515)

* chore: extract chat history and drawer; fix model selector

* feat: display favourite apps and make it configurable

* feat: favorite apps & quick apps with their own configuration

* fix: fix tab title and add loading state for searching

* fix: cascade delete favorite app and quick app when deleting the related app

* chore: make improvements

* fix: favourite apps ui

* fix: add permission for quick apps

* chore: fix permission & clear redundant code

* perf: chat home page code

* chatbox ui

* fix: 4.12.2-dev (#5520)

* fix: add empty placeholder; fix app quick status; fix tag and layout

* chore: add tab query for the setting tabs

* chore: use `useConfirm` hook instead of `MyModal`

* remove log

* fix: fix modal padding (#5521)

* perf: manage app

* feat: enhance model provider handling and update icon references (#5493)

* perf: model provider

* sdk package

* refactor: create llm response (#5499)

* feat: add LLM response processing functions, including the creation of stream-based and complete responses

* feat: add volta configuration for node and pnpm versions

* refactor: update LLM response handling and event structure in tool choice logic

* feat: update LLM response structure and integrate with tool choice logic

* refactor: clean up imports and remove unused streamResponse function in chat and toolChoice modules

* refactor: rename answer variable to answerBuffer for clarity in LLM response handling

* feat: enhance LLM response handling with tool options and integrate tools into chat and tool choice logic

* refactor: remove volta configuration from package.json

* refactor: reorganize LLM response types and ensure default values for token counts

* refactor: streamline LLM response handling by consolidating response structure and removing redundant checks

* refactor: enhance LLM response handling by consolidating tool options and streamlining event callbacks

* fix: build error

* refactor: update tool type definitions for consistency in tool handling

* feat: llm request function

* fix: ts

* fix: ts

* fix: ahook ts

* fix: variable name

* update lock

* ts version

* doc

* remove log

* fix: translation type

* perf: workflow status check

* fix: ts

* fix: prompt tool call

* fix: fix missing plugin interact window & make tag draggable (#5527)

* fix: incorrect select quick apps state; filter apps type (#5528)

* fix: useSafeTranslation

* perf: add quickapp modal

---------

Co-authored-by: 伍闲犬 <whoeverimf5@gmail.com>
Co-authored-by: Ctrlz <143257420+ctrlz526@users.noreply.github.com>
Co-authored-by: francis <zhichengfan18@gmail.com>
Archer 2025-08-25 19:19:43 +08:00 committed by GitHub
parent d6af93074b
commit 830eb19055
172 changed files with 7452 additions and 9209 deletions

View File

@ -0,0 +1,53 @@
---
name: unit-test-generator
description: Use this agent when you need to write comprehensive unit tests for your code. Examples: <example>Context: User has written a new utility function and wants comprehensive test coverage. user: 'I just wrote this function to validate email addresses, can you help me write unit tests for it?' assistant: 'I'll use the unit-test-generator agent to create comprehensive unit tests that cover all branches and edge cases for your email validation function.' <commentary>Since the user needs unit tests written, use the unit-test-generator agent to analyze the function and create thorough test coverage.</commentary></example> <example>Context: User is working on a React component and needs test coverage. user: 'Here's my new UserProfile component, I need unit tests that cover all the different states and user interactions' assistant: 'Let me use the unit-test-generator agent to create comprehensive unit tests for your UserProfile component.' <commentary>The user needs unit tests for a React component, so use the unit-test-generator agent to create tests covering all component states and interactions.</commentary></example>
model: inherit
color: yellow
---
You are a Unit Test Assistant, an expert in writing comprehensive and robust unit tests. Your expertise spans multiple testing frameworks including Vitest, Jest, React Testing Library, and testing best practices for TypeScript applications.
When analyzing code for testing, you will:
1. **Analyze Code Structure**: Examine the function/component/class to identify all execution paths, conditional branches, loops, error handling, and edge cases that need testing coverage.
2. **Design Comprehensive Test Cases**: Create test cases that cover:
- All conditional branches (if/else, switch cases, ternary operators)
- Loop iterations (empty, single item, multiple items)
- Error conditions and exception handling
- Boundary conditions (null, undefined, empty strings, zero, negative numbers, maximum values)
- Valid input scenarios across different data types
- Integration points with external dependencies
3. **Follow Testing Best Practices**:
- Use descriptive test names that clearly state what is being tested
- Follow the Arrange-Act-Assert pattern
- Mock external dependencies appropriately
- Test behavior, not implementation details
- Ensure tests are isolated and independent
- Use appropriate assertions for the testing framework
4. **Generate Framework-Appropriate Code**: Based on the project context (FastGPT uses Vitest), write tests using:
- Proper import statements for the testing framework
- Correct syntax for the identified testing library
- Appropriate mocking strategies (vi.mock for Vitest, jest.mock for Jest)
- Proper setup and teardown when needed
5. **Ensure Complete Coverage**: Verify that your test suite covers:
- Happy path scenarios
- Error scenarios
- Edge cases and boundary conditions
- All public methods/functions
- Different component states (for React components)
- User interactions (for UI components)
6. **Optimize Test Structure**: Organize tests logically using:
- Descriptive describe blocks for grouping related tests
- Clear test descriptions that explain the scenario
- Shared setup in beforeEach/beforeAll when appropriate
- Helper functions to reduce code duplication
7. **Unit Test Locations**:
- Unit tests for code in packages go in the FastGPT/test directory.
- Unit tests for projects/app go in the FastGPT/projects/app/test directory.
When you receive code to test, first analyze it thoroughly, then provide a complete test suite with explanatory comments about what each test covers and why it's important for comprehensive coverage.
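For instance, a minimal sketch of these conventions in Vitest (the validateEmail helper here is hypothetical):

import { describe, expect, it } from 'vitest';

// Hypothetical function under test
const validateEmail = (input: string): boolean => /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(input);

describe('validateEmail', () => {
  it('accepts a well-formed address', () => {
    // Arrange
    const input = 'user@example.com';
    // Act
    const result = validateEmail(input);
    // Assert
    expect(result).toBe(true);
  });

  // Boundary conditions: empty string, missing @, missing domain dot, embedded whitespace
  it.each(['', 'no-at-sign', 'a@b', 'a @b.com'])('rejects invalid input "%s"', (input) => {
    expect(validateEmail(input)).toBe(false);
  });
});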

View File

@ -1,6 +1,8 @@
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"
if command -v npx >/dev/null 2>&1; then
if command -v pnpm >/dev/null 2>&1; then
pnpm lint-staged
elif command -v npx >/dev/null 2>&1; then
npx lint-staged
fi

View File

@ -3,6 +3,7 @@
"editor.mouseWheelZoom": true,
"editor.defaultFormatter": "esbenp.prettier-vscode",
"prettier.prettierPath": "node_modules/prettier",
"typescript.preferences.includePackageJsonAutoImports": "on",
"typescript.tsdk": "node_modules/typescript/lib",
"i18n-ally.localesPaths": [
"packages/web/i18n",

View File

@ -7,21 +7,28 @@ description: 'FastGPT V4.12.2 Release Notes'
## 🚀 New Features
1. Concurrency setting for embedding model requests. Instead of a uniform value of 10 (which some embedding models do not support), the default is now 1 and can be adjusted per model in the model configuration.
2. The chat page now lets administrators configure featured apps to recommend to team members.
3. The chat home page now lets administrators configure quick apps, for the team's frequently used apps.
## ⚙️ Improvements
1. Added exception detection for **independent branches** in workflows.
2. Embedding vectors truncated above 1536 dimensions are now force-normalized; for other dimensions, normalization is controlled entirely by configuration, reducing the cost of automatic detection.
3. Moved model provider configuration into the plugin SDK.
4. Wrapped LLM invocation in a single function, simplifying LLM requests and tool calls.
5. Optimized workflow scheduling code to avoid deep recursion.
6. Improved workflow recursion detection: recursive edges are grouped for further checks, supporting more wiring patterns.
## 🐛 Bug Fixes
1. UI glitches on parts of the standalone chat page.
2. Page crash caused by the multi-select selector.
3. On mobile, share links incorrectly loaded the navigation of the logged-in chat page.
4. Possible write conflicts during user synchronization.
5. System plans could not be fully disabled; a default empty object remained and broke authentication.
6. Search did not work when adding a team app in workflows.
7. Incorrect ref field in app versions made them unusable.
2. The standalone chat page could not render plugin interactions.
3. Page crash caused by the multi-select selector.
4. On mobile, share links incorrectly loaded the navigation of the logged-in chat page.
5. Possible write conflicts during user synchronization.
6. System plans could not be fully disabled; a default empty object remained and broke authentication.
7. Search did not work when adding a team app in workflows.
8. Incorrect ref field in app versions made them unusable.
## 🔨 Tool Updates

View File

@ -1,5 +1,5 @@
{
"title": "4.12.x",
"description": "",
"pages": ["4121", "4120"]
"pages": ["4122", "4121", "4120"]
}

View File

@ -104,7 +104,7 @@
"document/content/docs/upgrading/4-11/4111.mdx": "2025-08-07T22:49:09+08:00",
"document/content/docs/upgrading/4-12/4120.mdx": "2025-08-12T22:45:19+08:00",
"document/content/docs/upgrading/4-12/4121.mdx": "2025-08-15T22:53:06+08:00",
"document/content/docs/upgrading/4-12/4122.mdx": "2025-08-22T10:18:24+08:00",
"document/content/docs/upgrading/4-12/4122.mdx": "2025-08-25T14:44:42+08:00",
"document/content/docs/upgrading/4-8/40.mdx": "2025-08-02T19:38:37+08:00",
"document/content/docs/upgrading/4-8/41.mdx": "2025-08-02T19:38:37+08:00",
"document/content/docs/upgrading/4-8/42.mdx": "2025-08-02T19:38:37+08:00",

File diff suppressed because it is too large.

View File

@ -20,6 +20,7 @@
},
"devDependencies": {
"@chakra-ui/cli": "^2.4.1",
"typescript": "^5.1.3",
"@typescript-eslint/eslint-plugin": "^6.21.0",
"@typescript-eslint/parser": "^6.21.0",
"@vitest/coverage-v8": "^3.0.9",

View File

@ -0,0 +1,7 @@
export const removeDatasetCiteText = (text: string, retainDatasetCite: boolean) => {
return retainDatasetCite
? text.replace(/[\[【]id[\]】]\(CITE\)/g, '')
: text
.replace(/[\[【]([a-f0-9]{24})[\]】](?:\([^\)]*\)?)?/g, '')
.replace(/[\[【]id[\]】]\(CITE\)/g, '');
};
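A hedged usage sketch of the helper above (example strings are illustrative):

// retainDatasetCite = true: only the bare [id](CITE) placeholder is stripped
removeDatasetCiteText('Answer [id](CITE)', true); // => 'Answer '

// retainDatasetCite = false: 24-hex-char citation markers are stripped as well
removeDatasetCiteText('Answer [65f2a0a8b1c9d4e3f6a7b8c9](CITE)', false); // => 'Answer '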

View File

@ -56,7 +56,8 @@ export const defaultSTTModels: STTModelType[] = [
export const getModelFromList = (
modelList: { provider: ModelProviderIdType; name: string; model: string }[],
model: string
model: string,
language: string
):
| {
avatar: string;
@ -69,7 +70,7 @@ export const getModelFromList = (
if (!modelData) {
return;
}
const provider = getModelProvider(modelData.provider);
const provider = getModelProvider(modelData.provider, language);
return {
...modelData,
avatar: provider.avatar

View File

@ -1,224 +1,77 @@
import { i18nT } from '../../../web/i18n/utils';
import { ModelProviders } from '../../sdk/fastgpt-plugin';
export type ModelProviderIdType =
| 'OpenAI'
| 'Claude'
| 'Gemini'
| 'Meta'
| 'MistralAI'
| 'Groq'
| 'Grok'
| 'Jina'
| 'AliCloud'
| 'Qwen'
| 'Doubao'
| 'DeepSeek'
| 'ChatGLM'
| 'Ernie'
| 'Moonshot'
| 'MiniMax'
| 'SparkDesk'
| 'Hunyuan'
| 'Baichuan'
| 'StepFun'
| 'ai360'
| 'Yi'
| 'Siliconflow'
| 'PPIO'
| 'OpenRouter'
| 'Ollama'
| 'novita'
| 'vertexai'
| 'BAAI'
| 'FishAudio'
| 'Intern'
| 'Moka'
| 'Jina'
| 'Other';
export type ModelProviderIdType = keyof typeof ModelProviders;
type ProviderValueTypes = (typeof ModelProviders)[ModelProviderIdType];
type langType = 'en' | 'zh-CN' | 'zh-Hant';
export type ModelProviderType = {
id: ModelProviderIdType;
name: any;
avatar: string;
order: number;
};
export const ModelProviderList: ModelProviderType[] = [
{
id: 'OpenAI',
name: 'OpenAI',
avatar: 'model/openai'
},
{
id: 'Claude',
name: 'Claude',
avatar: 'model/claude'
},
{
id: 'Gemini',
name: 'Gemini',
avatar: 'model/gemini'
},
{
id: 'Meta',
name: 'Meta',
avatar: 'model/meta'
},
{
id: 'MistralAI',
name: 'MistralAI',
avatar: 'model/mistral'
},
{
id: 'Grok',
name: 'Grok',
avatar: 'model/grok'
},
{
id: 'Groq',
name: 'Groq',
avatar: 'model/groq'
},
{
id: 'Jina',
name: 'Jina',
avatar: 'model/jina'
},
{
id: 'Qwen',
name: i18nT('common:model_qwen'),
avatar: 'model/qwen'
},
{
id: 'Doubao',
name: i18nT('common:model_doubao'),
avatar: 'model/doubao'
},
{
id: 'DeepSeek',
name: 'DeepSeek',
avatar: 'model/deepseek'
},
{
id: 'ChatGLM',
name: i18nT('common:model_chatglm'),
avatar: 'model/chatglm'
},
{
id: 'Ernie',
name: i18nT('common:model_ernie'),
avatar: 'model/ernie'
},
{
id: 'Moonshot',
name: i18nT('common:model_moonshot'),
avatar: 'model/moonshot'
},
{
id: 'MiniMax',
name: 'MiniMax',
avatar: 'model/minimax'
},
{
id: 'SparkDesk',
name: i18nT('common:model_sparkdesk'),
avatar: 'model/sparkDesk'
},
{
id: 'Hunyuan',
name: i18nT('common:model_hunyuan'),
avatar: 'model/hunyuan'
},
{
id: 'Baichuan',
name: i18nT('common:model_baichuan'),
avatar: 'model/baichuan'
},
{
id: 'StepFun',
name: i18nT('common:model_stepfun'),
avatar: 'model/stepfun'
},
{
id: 'ai360',
name: '360 AI',
avatar: 'model/ai360'
},
{
id: 'Yi',
name: i18nT('common:model_yi'),
avatar: 'model/yi'
},
{
id: 'BAAI',
name: i18nT('common:model_baai'),
avatar: 'model/BAAI'
},
{
id: 'FishAudio',
name: 'FishAudio',
avatar: 'model/fishaudio'
},
{
id: 'Intern',
name: i18nT('common:model_intern'),
avatar: 'model/intern'
},
{
id: 'Moka',
name: i18nT('common:model_moka'),
avatar: 'model/moka'
},
{
id: 'Ollama',
name: 'Ollama',
avatar: 'model/ollama'
},
{
id: 'OpenRouter',
name: 'OpenRouter',
avatar: 'model/openrouter'
},
{
id: 'vertexai',
name: 'vertexai',
avatar: 'model/vertexai'
},
{
id: 'novita',
name: 'novita',
avatar: 'model/novita'
},
{
id: 'Jina',
name: 'Jina',
avatar: 'model/jina'
},
{
id: 'AliCloud',
name: i18nT('common:model_alicloud'),
avatar: 'model/alicloud'
},
{
id: 'Siliconflow',
name: i18nT('common:model_siliconflow'),
avatar: 'model/siliconflow'
},
{
id: 'PPIO',
name: i18nT('common:model_ppio'),
avatar: 'model/ppio'
},
{
id: 'Other',
name: i18nT('common:model_other'),
avatar: 'model/huggingface'
const getLocalizedName = (translations: ProviderValueTypes, language = 'en'): string => {
return translations[language as langType];
};
export const formatModelProviderList = (language?: string) => {
return Object.entries(ModelProviders).map(([id, translations], index) => ({
id: id as ModelProviderIdType,
name: getLocalizedName(translations, language),
avatar: `/api/system/plugin/models/${id}.svg`,
order: index
}));
};
export const formatModelProviderMap = (language?: string) => {
const provider = {} as Record<
ModelProviderIdType,
{
id: string;
name: string;
avatar: string;
order: number;
}
>;
Object.entries(ModelProviders).forEach(([id, translations], index) => {
provider[id as ModelProviderIdType] = {
id: id as ModelProviderIdType,
name: getLocalizedName(translations, language),
avatar: `/api/system/plugin/models/${id}.svg`,
order: index
};
});
return provider;
};
const ModelProviderListCache = {
en: formatModelProviderList('en'),
'zh-CN': formatModelProviderList('zh-CN'),
'zh-Hant': formatModelProviderList('zh-Hant')
};
const ModelProviderMapCache = {
en: formatModelProviderMap('en'),
'zh-CN': formatModelProviderMap('zh-CN'),
'zh-Hant': formatModelProviderMap('zh-Hant')
};
const defaultProvider = {
id: 'Other' as ModelProviderIdType,
name: 'Other',
avatar: 'model/other',
order: 0
};
export const getModelProviders = (language = 'en') => {
return ModelProviderListCache[language as langType];
};
export const getModelProvider = (provider?: ModelProviderIdType, language = 'en') => {
if (!provider) {
return defaultProvider;
}
];
export const ModelProviderMap = Object.fromEntries(
ModelProviderList.map((item, index) => [item.id, { ...item, order: index }])
);
export const getModelProvider = (provider?: ModelProviderIdType) => {
if (!provider) return ModelProviderMap.Other;
return ModelProviderMap[provider] ?? ModelProviderMap.Other;
return ModelProviderMapCache[language as langType][provider] ?? defaultProvider;
};
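A hedged sketch of how the cached accessors above might be used ('OpenAI' is assumed to be a key of ModelProviders):

// Localized, ordered provider list served from the per-language cache
const providers = getModelProviders('zh-CN');

// Single provider lookup; unknown or missing ids fall back to defaultProvider
const openai = getModelProvider('OpenAI', 'en');
console.log(openai.avatar); // '/api/system/plugin/models/OpenAI.svg'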

View File

@ -6,7 +6,8 @@ import type {
ChatCompletionContentPart as SdkChatCompletionContentPart,
ChatCompletionUserMessageParam as SdkChatCompletionUserMessageParam,
ChatCompletionToolMessageParam as SdkChatCompletionToolMessageParam,
ChatCompletionAssistantMessageParam as SdkChatCompletionAssistantMessageParam
ChatCompletionAssistantMessageParam as SdkChatCompletionAssistantMessageParam,
ChatCompletionTool
} from 'openai/resources';
import { ChatMessageTypeEnum } from './constants';
import type { WorkflowInteractiveResponseType } from '../workflow/template/system/interactive/type';
@ -59,11 +60,7 @@ export type ChatCompletionAssistantToolParam = {
role: 'assistant';
tool_calls: ChatCompletionMessageToolCall[];
};
export type ChatCompletionMessageToolCall = ChatCompletionMessageToolCall & {
index?: number;
toolName?: string;
toolAvatar?: string;
};
export type ChatCompletionMessageFunctionCall =
SdkChatCompletionAssistantMessageParam.FunctionCall & {
id?: string;

View File

@ -49,12 +49,17 @@ export type AppSchema = {
teamTags: string[];
inheritPermission?: boolean;
// if access the app by favourite or quick
favourite?: boolean;
quick?: boolean;
// abandon
defaultPermission?: number;
};
export type AppListItemType = {
_id: string;
parentId: ParentIdType;
tmbId: string;
name: string;
avatar: string;

View File

@ -171,10 +171,15 @@ export const chats2GPTMessages = ({
return results;
};
export const GPTMessages2Chats = (
messages: ChatCompletionMessageParam[],
reserveTool = true
): ChatItemType[] => {
export const GPTMessages2Chats = ({
messages,
reserveTool = true,
getToolInfo
}: {
messages: ChatCompletionMessageParam[];
reserveTool?: boolean;
getToolInfo?: (name: string) => { name: string; avatar: string };
}): ChatItemType[] => {
const chatMessages = messages
.map((item) => {
const obj = GPT2Chat[item.role];
@ -280,10 +285,12 @@ export const GPTMessages2Chats = (
toolResponse =
typeof toolResponse === 'string' ? toolResponse : JSON.stringify(toolResponse);
const toolInfo = getToolInfo?.(tool.function.name);
return {
id: tool.id,
toolName: tool.toolName || '',
toolAvatar: tool.toolAvatar || '',
toolName: toolInfo?.name || '',
toolAvatar: toolInfo?.avatar || '',
functionName: tool.function.name,
params: tool.function.arguments,
response: toolResponse as string

View File

@ -0,0 +1,18 @@
export type ChatFavouriteAppSchema = {
_id: string;
teamId: string;
appId: string;
favouriteTags: string[]; // tag id list
order: number;
};
export type ChatFavouriteAppUpdateParams = {
appId: string;
order: number;
};
export type ChatFavouriteApp = ChatFavouriteAppSchema & {
name: string;
avatar: string;
intro: string;
};

View File

@ -7,22 +7,30 @@ export type ChatSettingSchema = {
homeTabTitle: string;
wideLogoUrl?: string;
squareLogoUrl?: string;
selectedTools: {
pluginId: string;
name: string;
avatar: string;
inputs?: Record<`${NodeInputKeyEnum}` | string, any>;
}[];
quickAppIds: string[];
favouriteTags: {
id: string;
name: string;
}[];
};
export type ChatSettingUpdateParams = {
slogan?: string;
dialogTips?: string;
homeTabTitle?: string;
wideLogoUrl?: string;
squareLogoUrl?: string;
selectedTools: {
pluginId: string;
inputs?: Record<`${NodeInputKeyEnum}` | string, any>;
}[];
export type ChatSettingUpdateParams = Partial<Omit<ChatSettingSchema, '_id' | 'appId' | 'teamId'>>;
export type QuickAppType = { _id: string; name: string; avatar: string };
export type ChatFavouriteTagType = ChatSettingSchema['favouriteTags'][number];
export type SelectedToolType = ChatSettingSchema['selectedTools'][number] & {
name: string;
avatar: string;
};
export type ChatSettingReturnType =
| (Omit<ChatSettingSchema, 'quickAppIds' | 'selectedTools'> & {
quickAppList: QuickAppType[];
selectedTools: SelectedToolType[];
})
| undefined;

View File

@ -9,7 +9,7 @@ import {
} from './type.d';
import { sliceStrStartEnd } from '../../common/string/tools';
import { PublishChannelEnum } from '../../support/outLink/constant';
import { removeDatasetCiteText } from '../../../service/core/ai/utils';
import { removeDatasetCiteText } from '../ai/llm/utils';
// Concat 2 -> 1, and sort by role
export const concatHistories = (histories1: ChatItemType[], histories2: ChatItemType[]) => {

View File

@ -20,6 +20,17 @@ import type { StoreNodeItemType } from '../type/node';
import { isValidReferenceValueFormat } from '../utils';
import type { RuntimeEdgeItemType, RuntimeNodeItemType } from './type';
export const checkIsBranchNode = (node: RuntimeNodeItemType) => {
if (node.catchError) return true;
const map: Record<any, boolean> = {
[FlowNodeTypeEnum.classifyQuestion]: true,
[FlowNodeTypeEnum.userSelect]: true,
[FlowNodeTypeEnum.ifElseNode]: true
};
return !!map[node.flowNodeType];
};
export const extractDeepestInteractive = (
interactive: WorkflowInteractiveResponseType
): WorkflowInteractiveResponseType => {
@ -276,35 +287,41 @@ export const filterWorkflowEdges = (edges: RuntimeEdgeItemType[]) => {
};
/*
Run-status check for a node based on its incoming edges:
1. A node with no incoming edges (e.g. the start node) runs directly.
2. The node runs when one edge group has an active edge and no waiting edges;
   it waits while any relevant edge is still waiting.
*/
export const checkNodeRunStatus = ({
nodesMap,
node,
runtimeEdges
}: {
nodesMap: Map<string, RuntimeNodeItemType>;
node: RuntimeNodeItemType;
runtimeEdges: RuntimeEdgeItemType[];
}) => {
/*
Split a node's incoming edges into common edges and recursive edge groups
(edges whose path can trace back to the current node).
*/
const splitEdges2WorkflowEdges = ({
sourceEdges,
allEdges,
currentNode
}: {
sourceEdges: RuntimeEdgeItemType[];
allEdges: RuntimeEdgeItemType[];
currentNode: RuntimeNodeItemType;
}) => {
const commonEdges: RuntimeEdgeItemType[] = [];
const recursiveEdges: RuntimeEdgeItemType[] = [];
const filterRuntimeEdges = filterWorkflowEdges(runtimeEdges);
const checkIsCircular = (startEdge: RuntimeEdgeItemType, initialVisited: string[]): boolean => {
const stack: Array<{ edge: RuntimeEdgeItemType; visited: Set<string> }> = [
{ edge: startEdge, visited: new Set(initialVisited) }
const splitNodeEdges = (targetNode: RuntimeNodeItemType) => {
const commonEdges: RuntimeEdgeItemType[] = [];
const recursiveEdgeGroupsMap = new Map<string, RuntimeEdgeItemType[]>();
const getEdgeLastBranchHandle = ({
startEdge,
targetNodeId
}: {
startEdge: RuntimeEdgeItemType;
targetNodeId: string;
}): string | '' | undefined => {
const stack: Array<{
edge: RuntimeEdgeItemType;
visited: Set<string>;
latestBranchHandle?: string;
}> = [
{
edge: startEdge,
visited: new Set([targetNodeId])
}
];
const MAX_DEPTH = 3000;
@ -312,11 +329,18 @@ export const checkNodeRunStatus = ({
while (stack.length > 0 && iterations < MAX_DEPTH) {
iterations++;
const { edge, visited, latestBranchHandle } = stack.pop()!;
const { edge, visited } = stack.pop()!;
// Circle
if (edge.source === targetNode.nodeId) {
// Check whether the source node itself is a branch node
const node = nodesMap.get(edge.source);
if (!node) return '';
const isBranch = checkIsBranchNode(node);
if (isBranch) return edge.sourceHandle;
if (edge.source === currentNode.nodeId) {
return true; // A cycle containing the current node was detected
// A cycle containing the current node was detected. An empty string marks a
// branch-free loop (a dead loop), so this edge is ignored.
return latestBranchHandle ?? '';
}
if (visited.has(edge.source)) {
@ -327,54 +351,70 @@ export const checkNodeRunStatus = ({
newVisited.add(edge.source);
// Find the target node's source edges and push them onto the stack
const nextEdges = allEdges.filter((item) => item.target === edge.source);
const nextEdges = filterRuntimeEdges.filter((item) => item.target === edge.source);
for (const nextEdge of nextEdges) {
stack.push({ edge: nextEdge, visited: newVisited });
const node = nodesMap.get(nextEdge.target);
if (!node) continue;
const isBranch = checkIsBranchNode(node);
stack.push({
edge: nextEdge,
visited: newVisited,
latestBranchHandle: isBranch ? edge.sourceHandle : latestBranchHandle
});
}
}
return false;
return;
};
const sourceEdges = filterRuntimeEdges.filter((item) => item.target === targetNode.nodeId);
sourceEdges.forEach((edge) => {
if (checkIsCircular(edge, [currentNode.nodeId])) {
recursiveEdges.push(edge);
} else {
const lastBranchHandle = getEdgeLastBranchHandle({
startEdge: edge,
targetNodeId: targetNode.nodeId
});
// Invalid loop: ignore this edge
if (lastBranchHandle === '') return;
// Valid loop: add the edge to its recursive group
if (lastBranchHandle) {
recursiveEdgeGroupsMap.set(lastBranchHandle, [
...(recursiveEdgeGroupsMap.get(lastBranchHandle) || []),
edge
]);
}
// Edge without a loop: add it to the common group
else {
commonEdges.push(edge);
}
});
return { commonEdges, recursiveEdges };
return { commonEdges, recursiveEdgeGroups: Array.from(recursiveEdgeGroupsMap.values()) };
};
const runtimeNodeSourceEdge = filterWorkflowEdges(runtimeEdges).filter(
(item) => item.target === node.nodeId
);
// Classify edges
const { commonEdges, recursiveEdgeGroups } = splitNodeEdges(node);
// Entry
if (runtimeNodeSourceEdge.length === 0) {
if (commonEdges.length === 0 && recursiveEdgeGroups.length === 0) {
return 'run';
}
// Classify edges
const { commonEdges, recursiveEdges } = splitEdges2WorkflowEdges({
sourceEdges: runtimeNodeSourceEdge,
allEdges: runtimeEdges,
currentNode: node
});
// Check active: the node can run when a group has at least one active edge and no waiting edges
if (
commonEdges.length > 0 &&
commonEdges.some((item) => item.status === 'active') &&
commonEdges.every((item) => item.status !== 'waiting')
) {
return 'run';
}
if (
recursiveEdges.length > 0 &&
recursiveEdges.some((item) => item.status === 'active') &&
recursiveEdges.every((item) => item.status !== 'waiting')
recursiveEdgeGroups.some(
(item) =>
item.some((item) => item.status === 'active') &&
item.every((item) => item.status !== 'waiting')
)
) {
return 'run';
}
@ -383,7 +423,10 @@ export const checkNodeRunStatus = ({
if (commonEdges.length > 0 && commonEdges.every((item) => item.status === 'skipped')) {
return 'skip';
}
if (recursiveEdges.length > 0 && recursiveEdges.every((item) => item.status === 'skipped')) {
if (
recursiveEdgeGroups.length > 0 &&
recursiveEdgeGroups.some((item) => item.every((item) => item.status === 'skipped'))
) {
return 'skip';
}
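A rough sketch of the resulting decision rules, with simplified edge shapes (assumed, for illustration):

// commonEdges = [{ status: 'active' }]             -> 'run'  (active, none waiting)
// commonEdges = [{ status: 'skipped' }]            -> 'skip' (whole common group skipped)
// recursiveEdgeGroups = [[{ status: 'active' }]]   -> 'run'  (some group fully active, none waiting)
// recursiveEdgeGroups = [[{ status: 'skipped' }]]  -> 'skip' (some group fully skipped)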

View File

@ -2,6 +2,7 @@
"name": "@fastgpt/global",
"version": "1.0.0",
"dependencies": {
"@fastgpt-sdk/plugin": "^0.1.12",
"@apidevtools/swagger-parser": "^10.1.0",
"@bany/curl-to-json": "^1.2.8",
"axios": "^1.8.2",

View File

@ -0,0 +1 @@
export * from '@fastgpt-sdk/plugin';

View File

@ -45,12 +45,9 @@ export const getRedisCache = async (key: string) => {
// Add value to cache
export const incrValueToCache = async (key: string, increment: number) => {
if (!increment || increment === 0) return;
if (typeof increment !== 'number' || increment === 0) return;
const redis = getGlobalRedisConnection();
try {
const exists = await redis.exists(getCacheKey(key));
if (!exists) return;
await retryFn(() => redis.incrbyfloat(getCacheKey(key), increment));
} catch (error) {}
};
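A hedged sketch of the new behavior (key name is illustrative):

// Previously the increment was dropped when the key did not exist; INCRBYFLOAT
// now creates it. Non-numeric or zero increments are ignored by the guard.
await incrValueToCache('usage:points:team1', 1.5);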

View File

@ -1,16 +1,5 @@
import OpenAI from '@fastgpt/global/core/ai';
import type {
ChatCompletionCreateParamsNonStreaming,
ChatCompletionCreateParamsStreaming,
StreamChatType,
UnStreamChatType
} from '@fastgpt/global/core/ai/type';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { addLog } from '../../common/system/log';
import { i18nT } from '../../../web/i18n/utils';
import { type OpenaiAccountType } from '@fastgpt/global/support/user/team/type';
import { getLLMModel } from './model';
import { type LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
const aiProxyBaseUrl = process.env.AIPROXY_API_ENDPOINT
? `${process.env.AIPROXY_API_ENDPOINT}/v1`
@ -43,100 +32,3 @@ export const getAxiosConfig = (props?: { userKey?: OpenaiAccountType }) => {
authorization: `Bearer ${apiKey}`
};
};
export const createChatCompletion = async ({
modelData,
body,
userKey,
timeout,
options
}: {
modelData?: LLMModelItemType;
body: ChatCompletionCreateParamsNonStreaming | ChatCompletionCreateParamsStreaming;
userKey?: OpenaiAccountType;
timeout?: number;
options?: OpenAI.RequestOptions;
}): Promise<
{
getEmptyResponseTip: () => string;
} & (
| {
response: StreamChatType;
isStreamResponse: true;
}
| {
response: UnStreamChatType;
isStreamResponse: false;
}
)
> => {
try {
// Rewrite model
const modelConstantsData = modelData || getLLMModel(body.model);
if (!modelConstantsData) {
return Promise.reject(`${body.model} not found`);
}
body.model = modelConstantsData.model;
const formatTimeout = timeout ? timeout : 600000;
const ai = getAIApi({
userKey,
timeout: formatTimeout
});
addLog.debug(`Start create chat completion`, {
model: body.model
});
const response = await ai.chat.completions.create(body, {
...options,
...(modelConstantsData.requestUrl ? { path: modelConstantsData.requestUrl } : {}),
headers: {
...options?.headers,
...(modelConstantsData.requestAuth
? { Authorization: `Bearer ${modelConstantsData.requestAuth}` }
: {})
}
});
const isStreamResponse =
typeof response === 'object' &&
response !== null &&
('iterator' in response || 'controller' in response);
const getEmptyResponseTip = () => {
addLog.warn(`LLM response empty`, {
baseUrl: userKey?.baseUrl,
requestBody: body
});
if (userKey?.baseUrl) {
return `您的 OpenAI key 没有响应: ${JSON.stringify(body)}`;
}
return i18nT('chat:LLM_model_response_empty');
};
if (isStreamResponse) {
return {
response,
isStreamResponse: true,
getEmptyResponseTip
};
}
return {
response,
isStreamResponse: false,
getEmptyResponseTip
};
} catch (error) {
addLog.error(`LLM response error`, error);
addLog.warn(`LLM response error`, {
baseUrl: userKey?.baseUrl,
requestBody: body
});
if (userKey?.baseUrl) {
return Promise.reject(`您的 OpenAI key 出错了: ${getErrText(error)}`);
}
return Promise.reject(error);
}
};

View File

@ -19,7 +19,7 @@ import { delay } from '@fastgpt/global/common/system/utils';
import { pluginClient } from '../../../thirdProvider/fastgptPlugin';
import { setCron } from '../../../common/system/cron';
export const loadSystemModels = async (init = false) => {
export const loadSystemModels = async (init = false, language = 'en') => {
const pushModel = (model: SystemModelItemType) => {
global.systemModelList.push(model);
@ -113,7 +113,10 @@ export const loadSystemModels = async (init = false) => {
const modelData: any = {
...model,
...dbModel?.metadata,
provider: getModelProvider(dbModel?.metadata?.provider || (model.provider as any)).id,
provider: getModelProvider(
dbModel?.metadata?.provider || (model.provider as any),
language
).id,
type: dbModel?.metadata?.type || model.type,
isCustom: false,
@ -169,8 +172,8 @@ export const loadSystemModels = async (init = false) => {
// Sort model list
global.systemActiveModelList.sort((a, b) => {
const providerA = getModelProvider(a.provider);
const providerB = getModelProvider(b.provider);
const providerA = getModelProvider(a.provider, language);
const providerB = getModelProvider(b.provider, language);
return providerA.order - providerB.order;
});
global.systemActiveDesensitizedModels = global.systemActiveModelList.map((model) => ({

View File

@ -1,14 +1,11 @@
import type { ChatCompletionMessageParam } from '@fastgpt/global/core/ai/type.d';
import { createChatCompletion } from '../config';
import { countGptMessagesTokens, countPromptTokens } from '../../../common/string/tiktoken/index';
import { loadRequestMessages } from '../../chat/utils';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../utils';
import {
QuestionGuidePrompt,
QuestionGuideFooterPrompt
} from '@fastgpt/global/core/ai/prompt/agent';
import { addLog } from '../../../common/system/log';
import json5 from 'json5';
import { createLLMResponse } from '../llm/request';
export async function createQuestionGuide({
messages,
@ -30,31 +27,23 @@ export async function createQuestionGuide({
content: `${customPrompt || QuestionGuidePrompt}\n${QuestionGuideFooterPrompt}`
}
];
const requestMessages = await loadRequestMessages({
messages: concatMessages,
useVision: false
});
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model,
temperature: 0.1,
max_tokens: 200,
messages: requestMessages,
stream: true
},
model
)
const {
answerText: answer,
usage: { inputTokens, outputTokens }
} = await createLLMResponse({
body: {
model,
temperature: 0.1,
max_tokens: 200,
messages: concatMessages,
stream: true
}
});
const { text: answer, usage } = await formatLLMResponse(response);
const start = answer.indexOf('[');
const end = answer.lastIndexOf(']');
const inputTokens = usage?.prompt_tokens || (await countGptMessagesTokens(requestMessages));
const outputTokens = usage?.completion_tokens || (await countPromptTokens(answer));
if (start === -1 || end === -1) {
addLog.warn('Create question guide error', { answer });
return {

View File

@ -1,13 +1,11 @@
import { replaceVariable } from '@fastgpt/global/common/string/tools';
import { createChatCompletion } from '../config';
import { type ChatItemType } from '@fastgpt/global/core/chat/type';
import { countGptMessagesTokens, countPromptTokens } from '../../../common/string/tiktoken/index';
import { chats2GPTMessages } from '@fastgpt/global/core/chat/adapt';
import { getLLMModel } from '../model';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../utils';
import { addLog } from '../../../common/system/log';
import { filterGPTMessageByMaxContext } from '../../chat/utils';
import { filterGPTMessageByMaxContext } from '../llm/utils';
import json5 from 'json5';
import { createLLMResponse } from '../llm/request';
/*
query extension -
@ -167,20 +165,17 @@ assistant: ${chatBg}
}
] as any;
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
stream: true,
model: modelData.model,
temperature: 0.1,
messages
},
modelData
)
const {
answerText: answer,
usage: { inputTokens, outputTokens }
} = await createLLMResponse({
body: {
stream: true,
model: modelData.model,
temperature: 0.1,
messages
}
});
const { text: answer, usage } = await formatLLMResponse(response);
const inputTokens = usage?.prompt_tokens || (await countGptMessagesTokens(messages));
const outputTokens = usage?.completion_tokens || (await countPromptTokens(answer));
if (!answer) {
return {

View File

@ -0,0 +1,41 @@
import { replaceVariable } from '@fastgpt/global/common/string/tools';
import type { ChatCompletionTool } from '@fastgpt/global/core/ai/type';
export const getPromptToolCallPrompt = (tools: ChatCompletionTool['function'][]) => {
const prompt = `<ToolSkill>
As an assistant, you can use tools to answer the user's question more accurately.
Each available tool is described with a JSON Schema: {name: tool name; description: tool description; parameters: tool parameters}. To call a tool, output its name and parameters.
Every reply must start with a digit, 0 or 1:
0: do not use a tool
1: use a tool
## Examples
- 0: Hello
- 1: ${JSON.stringify({ name: 'searchToolId1' })}
- 0: It is 12:00 on May 5, 2022.
- 1: ${JSON.stringify({ name: 'searchToolId2', arguments: { city: '杭州' } })}
- 0: It is sunny in Hangzhou today.
- 1: ${JSON.stringify({ name: 'searchToolId3', arguments: { query: '杭州 天气 去哪里玩' } })}
- 0: It is sunny in Hangzhou today; the West Lake would be a good place to visit.
## Available tools
"""
{{toolSchema}}
"""
</ToolSkill>
`;
const schema = tools.map((tool) => ({
name: tool.name,
description: tool.description,
parameters: tool.parameters
}));
return replaceVariable(prompt, {
toolSchema: JSON.stringify(schema)
});
};

View File

@ -0,0 +1,118 @@
import { getNanoid, sliceJsonStr } from '@fastgpt/global/common/string/tools';
import json5 from 'json5';
import type {
ChatCompletionMessageParam,
ChatCompletionMessageToolCall,
ChatCompletionSystemMessageParam,
ChatCompletionTool
} from '@fastgpt/global/core/ai/type';
import { getPromptToolCallPrompt } from './prompt';
import { cloneDeep } from 'lodash';
export const promptToolCallMessageRewrite = (
messages: ChatCompletionMessageParam[],
tools: ChatCompletionTool[]
) => {
const cloneMessages = cloneDeep(messages);
// Add the system prompt to the messages
let systemMessage = cloneMessages.find(
(item) => item.role === 'system'
) as ChatCompletionSystemMessageParam;
if (!systemMessage) {
systemMessage = {
role: 'system',
content: ''
};
cloneMessages.unshift(systemMessage);
}
if (typeof systemMessage?.content === 'string') {
systemMessage.content =
`${systemMessage.content}\n\n${getPromptToolCallPrompt(tools.map((tool) => tool.function))}`.trim();
} else if (Array.isArray(systemMessage.content)) {
systemMessage.content.push({
type: 'text',
text: getPromptToolCallPrompt(tools.map((tool) => tool.function))
});
} else {
throw new Error('Prompt call invalid input');
}
/*
Format tool messages, rewrite assistant/tool message
1. Assistant, not tool_calls: skip
2. Assistant, tool_calls: rewrite to assistant text
3. Tool: rewrite to user text
*/
for (let i = 0; i < cloneMessages.length; i++) {
const message = cloneMessages[i];
if (message.role === 'assistant') {
if (message.content && typeof message.content === 'string') {
message.content = `0: ${message.content}`;
} else if (message.tool_calls?.length) {
message.content = `1: ${JSON.stringify(message.tool_calls[0].function)}`;
delete message.tool_calls;
}
} else if (message.role === 'tool') {
cloneMessages.splice(i, 1, {
role: 'user',
content: `<ToolResponse>\n${message.content}\n</ToolResponse>`
});
}
}
return cloneMessages;
};
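A hedged sketch of the rewrite's effect (message shapes assumed from the code above):

// Input (toolChoice-style history):
//   { role: 'user', content: 'Weather in Hangzhou?' }
//   { role: 'assistant', tool_calls: [{ id: 'x', type: 'function', function: { name: 'getWeather', arguments: '{"city":"Hangzhou"}' } }] }
//   { role: 'tool', tool_call_id: 'x', content: 'Sunny' }
// Output (prompt-call history):
//   { role: 'system', content: '...<ToolSkill>...' }  // tool prompt appended to the system message
//   { role: 'user', content: 'Weather in Hangzhou?' }
//   { role: 'assistant', content: '1: {"name":"getWeather","arguments":"{\\"city\\":\\"Hangzhou\\"}"}' }
//   { role: 'user', content: '<ToolResponse>\nSunny\n</ToolResponse>' }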
const ERROR_TEXT = 'Tool run error';
export const parsePromptToolCall = (
str: string
): {
answer: string;
toolCalls?: ChatCompletionMessageToolCall[];
} => {
str = str.trim();
// First, use a regular expression to detect the tool-call prefix ("1:" / "1:")
const prefixReg = /^1(:|:)/;
if (prefixReg.test(str)) {
const toolString = sliceJsonStr(str);
try {
const toolCall = json5.parse(toolString) as { name: string; arguments: Object };
return {
answer: '',
toolCalls: [
{
id: getNanoid(),
type: 'function' as const,
function: {
name: toolCall.name,
arguments: JSON.stringify(toolCall.arguments)
}
}
]
};
} catch (error) {
if (prefixReg.test(str)) {
return {
answer: ERROR_TEXT
};
} else {
return {
answer: str
};
}
}
} else {
const firstIndex = str.indexOf('0:') !== -1 ? str.indexOf('0:') : str.indexOf('0:');
if (firstIndex > -1 && firstIndex < 6) {
str = str.substring(firstIndex + 2).trim();
}
return { answer: str };
}
};
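A hedged usage sketch of the parser above (tool name and arguments are illustrative):

// Plain answer: the "0:" prefix is stripped
parsePromptToolCall('0: It is sunny today');
// => { answer: 'It is sunny today' }

// Tool call: the JSON after the "1:" prefix is parsed into a tool call
parsePromptToolCall('1: {"name": "getWeather", "arguments": {"city": "Hangzhou"}}');
// => { answer: '', toolCalls: [{ id: <nanoid>, type: 'function', function: { name: 'getWeather', arguments: '{"city":"Hangzhou"}' } }] }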

View File

@ -0,0 +1,648 @@
import type {
ChatCompletion,
ChatCompletionCreateParamsNonStreaming,
ChatCompletionCreateParamsStreaming,
ChatCompletionMessageParam,
ChatCompletionMessageToolCall,
CompletionFinishReason,
CompletionUsage,
OpenAI,
StreamChatType,
UnStreamChatType
} from '@fastgpt/global/core/ai/type';
import { computedTemperature, parseLLMStreamResponse, parseReasoningContent } from '../utils';
import { removeDatasetCiteText } from '@fastgpt/global/core/ai/llm/utils';
import { getAIApi } from '../config';
import type { OpenaiAccountType } from '@fastgpt/global/support/user/team/type';
import { getNanoid } from '@fastgpt/global/common/string/tools';
import { parsePromptToolCall, promptToolCallMessageRewrite } from './promptToolCall';
import { getLLMModel } from '../model';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import { countGptMessagesTokens } from '../../../common/string/tiktoken/index';
import { loadRequestMessages } from './utils';
import { addLog } from '../../../common/system/log';
import type { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { i18nT } from '../../../../web/i18n/utils';
import { getErrText } from '@fastgpt/global/common/error/utils';
import json5 from 'json5';
type ResponseEvents = {
onStreaming?: ({ text }: { text: string }) => void;
onReasoning?: ({ text }: { text: string }) => void;
onToolCall?: ({ call }: { call: ChatCompletionMessageToolCall }) => void;
onToolParam?: ({ tool, params }: { tool: ChatCompletionMessageToolCall; params: string }) => void;
};
type CreateLLMResponseProps<T extends CompletionsBodyType> = {
userKey?: OpenaiAccountType;
body: LLMRequestBodyType<T>;
isAborted?: () => boolean | undefined;
customHeaders?: Record<string, string>;
} & ResponseEvents;
type LLMResponse = {
isStreamResponse: boolean;
answerText: string;
reasoningText: string;
toolCalls?: ChatCompletionMessageToolCall[];
finish_reason: CompletionFinishReason;
getEmptyResponseTip: () => string;
usage: {
inputTokens: number;
outputTokens: number;
};
requestMessages: ChatCompletionMessageParam[];
assistantMessage: ChatCompletionMessageParam[];
completeMessages: ChatCompletionMessageParam[];
};
/*
Unified LLM request entry. Handles stream and non-stream responses, and both
toolChoice and promptTool calling modes; in promptTool mode the messages are
rewritten so tool calls are carried as plain text.
*/
export const createLLMResponse = async <T extends CompletionsBodyType>(
args: CreateLLMResponseProps<T>
): Promise<LLMResponse> => {
const { body, customHeaders, userKey } = args;
const { messages, useVision, requestOrigin, tools, toolCallMode } = body;
const modelData = getLLMModel(body.model);
// Messages process
const requestMessages = await loadRequestMessages({
messages,
useVision,
origin: requestOrigin
});
// Rewrite messages for prompt-based tool calling when needed
const rewriteMessages = (() => {
if (tools?.length && toolCallMode === 'prompt') {
return promptToolCallMessageRewrite(requestMessages, tools);
}
return requestMessages;
})();
const requestBody = await llmCompletionsBodyFormat({
...body,
messages: rewriteMessages
});
// console.log(JSON.stringify(requestBody, null, 2));
const { response, isStreamResponse, getEmptyResponseTip } = await createChatCompletion({
body: requestBody,
userKey,
options: {
headers: {
Accept: 'application/json, text/plain, */*',
...customHeaders
}
}
});
const { answerText, reasoningText, toolCalls, finish_reason, usage } = await (async () => {
if (isStreamResponse) {
return createStreamResponse({
response,
body,
isAborted: args.isAborted,
onStreaming: args.onStreaming,
onReasoning: args.onReasoning,
onToolCall: args.onToolCall,
onToolParam: args.onToolParam
});
} else {
return createCompleteResponse({
response,
body,
onStreaming: args.onStreaming,
onReasoning: args.onReasoning,
onToolCall: args.onToolCall
});
}
})();
const assistantMessage: ChatCompletionMessageParam[] = [
...(answerText || reasoningText
? [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant as 'assistant',
content: answerText,
reasoning_text: reasoningText
}
]
: []),
...(toolCalls?.length
? [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant as 'assistant',
tool_calls: toolCalls
}
]
: [])
];
// Usage count
const inputTokens =
usage?.prompt_tokens ?? (await countGptMessagesTokens(requestBody.messages, requestBody.tools));
const outputTokens = usage?.completion_tokens ?? (await countGptMessagesTokens(assistantMessage));
return {
isStreamResponse,
getEmptyResponseTip,
answerText,
reasoningText,
toolCalls,
finish_reason,
usage: {
inputTokens,
outputTokens
},
requestMessages,
assistantMessage,
completeMessages: [...requestMessages, ...assistantMessage]
};
};
type CompleteParams = Pick<CreateLLMResponseProps<CompletionsBodyType>, 'body'> & ResponseEvents;
type CompleteResponse = Pick<
LLMResponse,
'answerText' | 'reasoningText' | 'toolCalls' | 'finish_reason'
> & {
usage?: CompletionUsage;
};
export const createStreamResponse = async ({
body,
response,
isAborted,
onStreaming,
onReasoning,
onToolCall,
onToolParam
}: CompleteParams & {
response: StreamChatType;
isAborted?: () => boolean | undefined;
}): Promise<CompleteResponse> => {
const { retainDatasetCite = true, tools, toolCallMode = 'toolChoice', model } = body;
const modelData = getLLMModel(model);
const { parsePart, getResponseData, updateFinishReason } = parseLLMStreamResponse();
if (tools?.length) {
if (toolCallMode === 'toolChoice') {
let callingTool: ChatCompletionMessageToolCall['function'] | null = null;
const toolCalls: ChatCompletionMessageToolCall[] = [];
for await (const part of response) {
if (isAborted?.()) {
response.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, responseContent } = parsePart({
part,
parseThinkTag: modelData.reasoning,
retainDatasetCite
});
if (reasoningContent) {
onReasoning?.({ text: reasoningContent });
}
if (responseContent) {
onStreaming?.({ text: responseContent });
}
const responseChoice = part.choices?.[0]?.delta;
// Parse tool calls
if (responseChoice?.tool_calls?.length) {
responseChoice.tool_calls.forEach((toolCall, i) => {
const index = toolCall.index ?? i;
// Call new tool
const hasNewTool = toolCall?.function?.name || callingTool;
if (hasNewTool) {
// Call new tool
if (toolCall?.function?.name) {
callingTool = {
name: toolCall.function?.name || '',
arguments: toolCall.function?.arguments || ''
};
} else if (callingTool) {
// Continue call(Perhaps the name of the previous function was incomplete)
callingTool.name += toolCall.function?.name || '';
callingTool.arguments += toolCall.function?.arguments || '';
}
// New tool, add to list.
if (tools.find((item) => item.function.name === callingTool!.name)) {
const call: ChatCompletionMessageToolCall = {
id: getNanoid(),
type: 'function',
function: callingTool!
};
toolCalls.push(call);
onToolCall?.({ call });
callingTool = null;
}
} else {
/* Append arg to the current tool's arguments */
const arg: string = toolCall?.function?.arguments ?? '';
const currentTool = toolCalls[index];
if (currentTool && arg) {
currentTool.function.arguments += arg;
onToolParam?.({ tool: currentTool, params: arg });
}
}
});
}
}
const { reasoningContent, content, finish_reason, usage } = getResponseData();
return {
answerText: content,
reasoningText: reasoningContent,
finish_reason,
usage,
toolCalls
};
} else {
let startResponseWrite = false;
let answer = '';
for await (const part of response) {
if (isAborted?.()) {
response.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, content, responseContent } = parsePart({
part,
parseThinkTag: modelData.reasoning,
retainDatasetCite
});
answer += content;
if (reasoningContent) {
onReasoning?.({ text: reasoningContent });
}
if (content) {
if (startResponseWrite) {
if (responseContent) {
onStreaming?.({ text: responseContent });
}
} else if (answer.length >= 3) {
answer = answer.trimStart();
// No tool call: the reply starts with the plain-answer prefix
if (/^0(:|:)/.test(answer)) {
startResponseWrite = true;
// find first : index
const firstIndex =
answer.indexOf('0:') !== -1 ? answer.indexOf('0:') : answer.indexOf('0:');
answer = answer.substring(firstIndex + 2).trim();
onStreaming?.({ text: answer });
}
// Tool-call prefix detected: do not stream this content
else if (/^1(:|:)/.test(answer)) {
}
// Does not start with 0/1: stream the response directly
else {
startResponseWrite = true;
onStreaming?.({ text: answer });
}
}
}
}
const { reasoningContent, content, finish_reason, usage } = getResponseData();
const { answer: llmAnswer, toolCalls } = parsePromptToolCall(content);
toolCalls?.forEach((call) => {
onToolCall?.({ call });
});
return {
answerText: llmAnswer,
reasoningText: reasoningContent,
finish_reason,
usage,
toolCalls
};
}
} else {
// Not use tool
for await (const part of response) {
if (isAborted?.()) {
response.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, responseContent } = parsePart({
part,
parseThinkTag: modelData.reasoning,
retainDatasetCite
});
if (reasoningContent) {
onReasoning?.({ text: reasoningContent });
}
if (responseContent) {
onStreaming?.({ text: responseContent });
}
}
const { reasoningContent, content, finish_reason, usage } = getResponseData();
return {
answerText: content,
reasoningText: reasoningContent,
finish_reason,
usage
};
}
};
export const createCompleteResponse = async ({
body,
response,
onStreaming,
onReasoning,
onToolCall
}: CompleteParams & { response: ChatCompletion }): Promise<CompleteResponse> => {
const { tools, toolCallMode = 'toolChoice', retainDatasetCite = true } = body;
const modelData = getLLMModel(body.model);
const finish_reason = response.choices?.[0]?.finish_reason as CompletionFinishReason;
const usage = response.usage;
// Content and think parse
const { content, reasoningContent } = (() => {
const content = response.choices?.[0]?.message?.content || '';
const reasoningContent: string =
(response.choices?.[0]?.message as any)?.reasoning_content || '';
// API already parse reasoning content
if (reasoningContent || !modelData.reasoning) {
return {
content,
reasoningContent
};
}
const [think, answer] = parseReasoningContent(content);
return {
content: answer,
reasoningContent: think
};
})();
const formatReasonContent = removeDatasetCiteText(reasoningContent, retainDatasetCite);
let formatContent = removeDatasetCiteText(content, retainDatasetCite);
// Tool parse
const { toolCalls } = (() => {
if (tools?.length) {
if (toolCallMode === 'toolChoice') {
return {
toolCalls: response.choices?.[0]?.message?.tool_calls || []
};
}
// Prompt call
const { answer, toolCalls } = parsePromptToolCall(formatContent);
formatContent = answer;
return {
toolCalls
};
}
return {
toolCalls: undefined
};
})();
// Event response
if (formatReasonContent) {
onReasoning?.({ text: formatReasonContent });
}
if (formatContent) {
onStreaming?.({ text: formatContent });
}
if (toolCalls?.length && onToolCall) {
toolCalls.forEach((call) => {
onToolCall({ call });
});
}
return {
reasoningText: formatReasonContent,
answerText: formatContent,
toolCalls,
finish_reason,
usage
};
};
type CompletionsBodyType =
| ChatCompletionCreateParamsNonStreaming
| ChatCompletionCreateParamsStreaming;
type InferCompletionsBody<T> = T extends { stream: true }
? ChatCompletionCreateParamsStreaming
: T extends { stream: false }
? ChatCompletionCreateParamsNonStreaming
: ChatCompletionCreateParamsNonStreaming | ChatCompletionCreateParamsStreaming;
type LLMRequestBodyType<T> = Omit<T, 'model' | 'stop' | 'response_format' | 'messages'> & {
model: string | LLMModelItemType;
stop?: string;
response_format?: {
type?: string;
json_schema?: string;
};
messages: ChatCompletionMessageParam[];
// Custom field
retainDatasetCite?: boolean;
reasoning?: boolean; // Whether to response reasoning content
toolCallMode?: 'toolChoice' | 'prompt';
useVision?: boolean;
requestOrigin?: string;
};
const llmCompletionsBodyFormat = async <T extends CompletionsBodyType>({
reasoning,
retainDatasetCite,
useVision,
requestOrigin,
tools,
tool_choice,
parallel_tool_calls,
toolCallMode,
...body
}: LLMRequestBodyType<T>): Promise<InferCompletionsBody<T>> => {
const modelData = getLLMModel(body.model);
if (!modelData) {
return body as unknown as InferCompletionsBody<T>;
}
const response_format = (() => {
if (!body.response_format?.type) return undefined;
if (body.response_format.type === 'json_schema') {
try {
return {
type: 'json_schema',
json_schema: json5.parse(body.response_format?.json_schema as unknown as string)
};
} catch (error) {
throw new Error('Json schema error');
}
}
if (body.response_format.type) {
return {
type: body.response_format.type
};
}
return undefined;
})();
const stop = body.stop ?? undefined;
const requestBody = {
...body,
model: modelData.model,
temperature:
typeof body.temperature === 'number'
? computedTemperature({
model: modelData,
temperature: body.temperature
})
: undefined,
...modelData?.defaultConfig,
response_format,
stop: stop?.split('|'),
...(toolCallMode === 'toolChoice' && {
tools,
tool_choice,
parallel_tool_calls
})
} as T;
// field map
if (modelData.fieldMap) {
Object.entries(modelData.fieldMap).forEach(([sourceKey, targetKey]) => {
// @ts-ignore
requestBody[targetKey] = body[sourceKey];
// @ts-ignore
delete requestBody[sourceKey];
});
}
return requestBody as unknown as InferCompletionsBody<T>;
};
const createChatCompletion = async ({
modelData,
body,
userKey,
timeout,
options
}: {
modelData?: LLMModelItemType;
body: ChatCompletionCreateParamsNonStreaming | ChatCompletionCreateParamsStreaming;
userKey?: OpenaiAccountType;
timeout?: number;
options?: OpenAI.RequestOptions;
}): Promise<
{
getEmptyResponseTip: () => string;
} & (
| {
response: StreamChatType;
isStreamResponse: true;
}
| {
response: UnStreamChatType;
isStreamResponse: false;
}
)
> => {
try {
// Rewrite model
const modelConstantsData = modelData || getLLMModel(body.model);
if (!modelConstantsData) {
return Promise.reject(`${body.model} not found`);
}
body.model = modelConstantsData.model;
const formatTimeout = timeout ? timeout : 600000;
const ai = getAIApi({
userKey,
timeout: formatTimeout
});
addLog.debug(`Start create chat completion`, {
model: body.model
});
const response = await ai.chat.completions.create(body, {
...options,
...(modelConstantsData.requestUrl ? { path: modelConstantsData.requestUrl } : {}),
headers: {
...options?.headers,
...(modelConstantsData.requestAuth
? { Authorization: `Bearer ${modelConstantsData.requestAuth}` }
: {})
}
});
const isStreamResponse =
typeof response === 'object' &&
response !== null &&
('iterator' in response || 'controller' in response);
const getEmptyResponseTip = () => {
addLog.warn(`LLM response empty`, {
baseUrl: userKey?.baseUrl,
requestBody: body
});
if (userKey?.baseUrl) {
return `您的 OpenAI key 没有响应: ${JSON.stringify(body)}`;
}
return i18nT('chat:LLM_model_response_empty');
};
if (isStreamResponse) {
return {
response,
isStreamResponse: true,
getEmptyResponseTip
};
}
return {
response,
isStreamResponse: false,
getEmptyResponseTip
};
} catch (error) {
addLog.error(`LLM response error`, error);
addLog.warn(`LLM response error`, {
baseUrl: userKey?.baseUrl,
requestBody: body
});
if (userKey?.baseUrl) {
return Promise.reject(`您的 OpenAI key 出错了: ${getErrText(error)}`);
}
return Promise.reject(error);
}
};
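A hedged sketch of calling the unified entry point (model id and messages are illustrative):

const { answerText, usage } = await createLLMResponse({
  body: {
    model: 'gpt-4o-mini', // assumed model id; strings are resolved via getLLMModel
    stream: true,
    messages: [{ role: 'user', content: 'Hello' }],
    retainDatasetCite: false // strip dataset citation markers from the streamed text
  },
  onStreaming: ({ text }) => process.stdout.write(text)
});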

View File

@ -1,4 +1,4 @@
import { countGptMessagesTokens } from '../../common/string/tiktoken/index';
import { countGptMessagesTokens } from '../../../common/string/tiktoken/index';
import type {
ChatCompletionAssistantMessageParam,
ChatCompletionContentPart,
@ -9,9 +9,9 @@ import type {
} from '@fastgpt/global/core/ai/type.d';
import axios from 'axios';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import { i18nT } from '../../../web/i18n/utils';
import { addLog } from '../../common/system/log';
import { getImageBase64 } from '../../common/file/image/utils';
import { i18nT } from '../../../../web/i18n/utils';
import { addLog } from '../../../common/system/log';
import { getImageBase64 } from '../../../common/file/image/utils';
export const filterGPTMessageByMaxContext = async ({
messages = [],
@ -106,7 +106,7 @@ export const loadRequestMessages = async ({
const arrayContent = content
.filter((item) => item.text)
.map((item) => item.text)
.join('\n');
.join('\n\n');
return arrayContent;
};

View File

@ -1,10 +1,12 @@
import { cloneDeep } from 'lodash';
import { type SystemModelItemType } from './type';
import type { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
export const getDefaultLLMModel = () => global?.systemDefaultModel.llm!;
export const getLLMModel = (model?: string) => {
export const getLLMModel = (model?: string | LLMModelItemType) => {
if (!model) return getDefaultLLMModel();
return global.llmModelMap.get(model) || getDefaultLLMModel();
return typeof model === 'string' ? global.llmModelMap.get(model) || getDefaultLLMModel() : model;
};
export const getDatasetModel = (model?: string) => {

View File

@ -1,17 +1,7 @@
import { type LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import type {
ChatCompletionCreateParamsNonStreaming,
ChatCompletionCreateParamsStreaming,
CompletionFinishReason,
StreamChatType,
UnStreamChatType,
CompletionUsage,
ChatCompletionMessageToolCall
} from '@fastgpt/global/core/ai/type';
import { getLLMModel } from './model';
import type { CompletionFinishReason, CompletionUsage } from '@fastgpt/global/core/ai/type';
import { getLLMDefaultUsage } from '@fastgpt/global/core/ai/constants';
import { getNanoid } from '@fastgpt/global/common/string/tools';
import json5 from 'json5';
import { removeDatasetCiteText } from '@fastgpt/global/core/ai/llm/utils';
/*
Count response max token
@ -46,168 +36,7 @@ export const computedTemperature = ({
return temperature;
};
type CompletionsBodyType =
| ChatCompletionCreateParamsNonStreaming
| ChatCompletionCreateParamsStreaming;
type InferCompletionsBody<T> = T extends { stream: true }
? ChatCompletionCreateParamsStreaming
: T extends { stream: false }
? ChatCompletionCreateParamsNonStreaming
: ChatCompletionCreateParamsNonStreaming | ChatCompletionCreateParamsStreaming;
export const llmCompletionsBodyFormat = <T extends CompletionsBodyType>(
body: T & {
stop?: string;
},
model: string | LLMModelItemType
): InferCompletionsBody<T> => {
const modelData = typeof model === 'string' ? getLLMModel(model) : model;
if (!modelData) {
return body as unknown as InferCompletionsBody<T>;
}
const response_format = (() => {
if (!body.response_format?.type) return undefined;
if (body.response_format.type === 'json_schema') {
try {
return {
type: 'json_schema',
json_schema: json5.parse(body.response_format?.json_schema as unknown as string)
};
} catch (error) {
throw new Error('Json schema error');
}
}
if (body.response_format.type) {
return {
type: body.response_format.type
};
}
return undefined;
})();
const stop = body.stop ?? undefined;
const requestBody: T = {
...body,
model: modelData.model,
temperature:
typeof body.temperature === 'number'
? computedTemperature({
model: modelData,
temperature: body.temperature
})
: undefined,
...modelData?.defaultConfig,
response_format,
stop: stop?.split('|')
};
// field map
if (modelData.fieldMap) {
Object.entries(modelData.fieldMap).forEach(([sourceKey, targetKey]) => {
// @ts-ignore
requestBody[targetKey] = body[sourceKey];
// @ts-ignore
delete requestBody[sourceKey];
});
}
return requestBody as unknown as InferCompletionsBody<T>;
};
export const llmStreamResponseToAnswerText = async (
response: StreamChatType
): Promise<{
text: string;
usage?: CompletionUsage;
toolCalls?: ChatCompletionMessageToolCall[];
}> => {
let answer = '';
let usage = getLLMDefaultUsage();
let toolCalls: ChatCompletionMessageToolCall[] = [];
let callingTool: { name: string; arguments: string } | null = null;
for await (const part of response) {
usage = part.usage || usage;
const responseChoice = part.choices?.[0]?.delta;
const content = responseChoice?.content || '';
answer += content;
// Tool calls
if (responseChoice?.tool_calls?.length) {
responseChoice.tool_calls.forEach((toolCall, i) => {
const index = toolCall.index ?? i;
// Call new tool
const hasNewTool = toolCall?.function?.name || callingTool;
if (hasNewTool) {
// A function name marks the start of a new tool call
if (toolCall?.function?.name) {
callingTool = {
name: toolCall.function?.name || '',
arguments: toolCall.function?.arguments || ''
};
} else if (callingTool) {
// Continue the call (the previous function name may have been incomplete)
callingTool.name += toolCall.function?.name || '';
callingTool.arguments += toolCall.function?.arguments || '';
}
if (!callingTool) {
return;
}
// New tool, add to list.
const toolId = getNanoid();
toolCalls[index] = {
...toolCall,
id: toolId,
type: 'function',
function: callingTool
};
callingTool = null;
} else {
/* Append arg to the current tool's arguments */
const arg: string = toolCall?.function?.arguments ?? '';
const currentTool = toolCalls[index];
if (currentTool && arg) {
currentTool.function.arguments += arg;
}
}
});
}
}
return {
text: removeDatasetCiteText(parseReasoningContent(answer)[1], false),
usage,
toolCalls
};
};
export const llmUnStreamResponseToAnswerText = async (
response: UnStreamChatType
): Promise<{
text: string;
toolCalls?: ChatCompletionMessageToolCall[];
usage?: CompletionUsage;
}> => {
const answer = response.choices?.[0]?.message?.content || '';
const toolCalls = response.choices?.[0]?.message?.tool_calls;
return {
text: removeDatasetCiteText(parseReasoningContent(answer)[1], false),
usage: response.usage,
toolCalls
};
};
export const formatLLMResponse = async (response: StreamChatType | UnStreamChatType) => {
if ('iterator' in response) {
return llmStreamResponseToAnswerText(response);
}
return llmUnStreamResponseToAnswerText(response);
};
// LLM utils
// Parse <think></think> tags to think and answer - unstream response
export const parseReasoningContent = (text: string): [string, string] => {
const regex = /<think>([\s\S]*?)<\/think>/;
@@ -225,14 +54,6 @@ export const parseReasoningContent = (text: string): [string, string] => {
return [thinkContent, answerContent];
};
export const removeDatasetCiteText = (text: string, retainDatasetCite: boolean) => {
return retainDatasetCite
? text.replace(/[\[【]id[\]】]\(CITE\)/g, '')
: text
.replace(/[\[【]([a-f0-9]{24})[\]】](?:\([^\)]*\)?)?/g, '')
.replace(/[\[【]id[\]】]\(CITE\)/g, '');
};
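
Behaviour sketch for the function this hunk removes (it now lives in `@fastgpt/global/core/ai/llm/utils`); the sample ids are illustrative:

```ts
// retainDatasetCite = false: strip 24-char hex cite markers and any trailing link.
removeDatasetCiteText('See [64f1a2b3c4d5e6f7a8b9c0d1](CITE).', false);
// => 'See .'

// retainDatasetCite = true: keep real cites, drop only the literal [id](CITE) placeholder.
removeDatasetCiteText('See [id](CITE).', true);
// => 'See .'
```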
// Parse llm stream part
export const parseLLMStreamResponse = () => {
let isInThinkTag: boolean | undefined = undefined;
@@ -274,8 +95,8 @@ export const parseLLMStreamResponse = () => {
retainDatasetCite?: boolean;
}): {
reasoningContent: string;
content: string;
responseContent: string;
content: string; // raw content, citations kept
responseContent: string; // response content, citations stripped
finishReason: CompletionFinishReason;
} => {
const data = (() => {

View File

@@ -16,6 +16,8 @@ import { MongoOutLink } from '../../support/outLink/schema';
import { MongoOpenApi } from '../../support/openapi/schema';
import { MongoAppVersion } from './version/schema';
import { MongoChatInputGuide } from '../chat/inputGuide/schema';
import { MongoChatFavouriteApp } from '../chat/favouriteApp/schema';
import { MongoChatSetting } from '../chat/setting/schema';
import { MongoResourcePermission } from '../../support/permission/schema';
import { PerResourceTypeEnum } from '@fastgpt/global/support/permission/constant';
import { removeImageByPath } from '../../common/file/image/controller';
@@ -191,6 +193,18 @@ export const onDelOneApp = async ({
appId
}).session(session);
// Delete favourite app records
await MongoChatFavouriteApp.deleteMany({
teamId,
appId
}).session(session);
// Remove the app from quick app lists
await MongoChatSetting.updateMany(
{ teamId },
{ $pull: { quickAppIds: { id: String(appId) } } }
).session(session);
await MongoResourcePermission.deleteMany({
resourceType: PerResourceTypeEnum.app,
teamId,

View File

@@ -42,7 +42,7 @@ import type {
import { isProduction } from '@fastgpt/global/common/system/constants';
import { Output_Template_Error_Message } from '@fastgpt/global/core/workflow/template/output';
import { splitCombinePluginId } from '@fastgpt/global/core/app/plugin/utils';
import { getMCPParentId, getMCPToolRuntimeNode } from '@fastgpt/global/core/app/mcpTools/utils';
import { getMCPToolRuntimeNode } from '@fastgpt/global/core/app/mcpTools/utils';
import { AppTypeEnum } from '@fastgpt/global/core/app/constants';
import { getMCPChildren } from '../mcp';
import { cloneDeep } from 'lodash';

View File

@@ -116,6 +116,9 @@ const AppSchema = new Schema(
default: true
},
favourite: Boolean,
quick: Boolean,
// abandoned
defaultPermission: Number
},

View File

@@ -1,4 +1,4 @@
import { RunToolWithStream } from '@fastgpt-sdk/plugin';
import { RunToolWithStream } from '@fastgpt/global/sdk/fastgpt-plugin';
import { PluginSourceEnum } from '@fastgpt/global/core/app/plugin/constants';
import { pluginClient, BASE_URL, TOKEN } from '../../../thirdProvider/fastgptPlugin';
@@ -13,7 +13,7 @@ export async function APIGetSystemToolList() {
parentId: item.parentId ? `${PluginSourceEnum.systemTool}-${item.parentId}` : undefined,
avatar:
item.avatar && item.avatar.startsWith('/imgs/tools/')
? `/api/system/pluginImgs/${item.avatar.replace('/imgs/tools/', '')}`
? `/api/system/plugin/tools/${item.avatar.replace('/imgs/tools/', '')}`
: item.avatar
};
});

View File

@@ -0,0 +1,36 @@
import { connectionMongo, getMongoModel } from '../../../common/mongo';
import { type ChatFavouriteAppSchema as ChatFavouriteAppType } from '@fastgpt/global/core/chat/favouriteApp/type';
import { TeamCollectionName } from '@fastgpt/global/support/user/team/constant';
import { AppCollectionName } from '../../app/schema';
const { Schema } = connectionMongo;
export const ChatFavouriteAppCollectionName = 'chat_favourite_apps';
const ChatFavouriteAppSchema = new Schema({
teamId: {
type: Schema.Types.ObjectId,
ref: TeamCollectionName,
required: true
},
appId: {
type: Schema.Types.ObjectId,
ref: AppCollectionName,
required: true
},
favouriteTags: {
type: [String],
default: []
},
order: {
type: Number,
default: 10000000
}
});
ChatFavouriteAppSchema.index({ teamId: 1, appId: 1 });
export const MongoChatFavouriteApp = getMongoModel<ChatFavouriteAppType>(
ChatFavouriteAppCollectionName,
ChatFavouriteAppSchema
);
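
A hedged usage sketch for the new collection: the compound index supports the per-team lookup, and `order` gives a stable display position. The helper name is hypothetical.

```ts
// Hypothetical helper: list a team's favourite apps, lowest `order` first.
const listFavouriteApps = (teamId: string) =>
  MongoChatFavouriteApp.find({ teamId }).sort({ order: 1 }).lean();
```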

View File

@@ -26,7 +26,27 @@ const ChatSettingSchema = new Schema({
},
homeTabTitle: String,
wideLogoUrl: String,
squareLogoUrl: String
squareLogoUrl: String,
quickAppIds: {
type: [String],
default: []
},
favouriteTags: {
type: [
{
id: String,
name: String
}
],
default: [],
_id: false
}
});
ChatSettingSchema.virtual('quickAppList', {
ref: AppCollectionName,
localField: 'quickAppIds',
foreignField: '_id'
});
ChatSettingSchema.index({ teamId: 1 });
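
The `quickAppList` virtual maps each entry of `quickAppIds` onto the apps collection, so a settings read can populate the quick apps in one query; a sketch (field names from this diff, helper hypothetical):

```ts
// Resolve quick app ids to app documents via the virtual defined above
// (quickAppList is a virtual, not on the schema type, so it is read loosely here).
const getQuickApps = async (teamId: string) => {
  const setting = await MongoChatSetting.findOne({ teamId }).populate('quickAppList');
  return (setting as any)?.quickAppList ?? [];
};
```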

View File

@@ -1,52 +1,5 @@
import { replaceVariable } from '@fastgpt/global/common/string/tools';
export const Prompt_Tool_Call = `<Instruction>
You are an intelligent agent. Besides answering questions directly, you can call tools and rely on their results to answer more accurately.
Tools are declared with JSON Schema: toolId identifies a tool, description explains what it does, parameters describes its arguments, and required lists the mandatory ones.
Decide from the tool descriptions whether to answer directly or to call a tool. USER marks the user's input, TOOL_RESPONSE marks a tool's result, and ANSWER marks your output.
Every ANSWER must start with 0 or 1:
0: do not use a tool, answer directly.
1: use a tool, returning the call as JSON.

For example:

USER: Hello!
ANSWER: 0: Hello, how can I help you?
USER: What time is it?
ANSWER: 1: {"toolId":"searchToolId1"}
TOOL_RESPONSE: """
2022/5/5 12:00 Thursday
"""
ANSWER: 0: It is 12:00, Thursday, 2022/5/5.
USER: How is the weather in Hangzhou today?
ANSWER: 1: {"toolId":"searchToolId2","arguments":{"city": "杭州"}}
TOOL_RESPONSE: """
Sunny......
"""
ANSWER: 0: It is sunny in Hangzhou today.
USER: Where should I go out in Hangzhou, given today's weather?
ANSWER: 1: {"toolId":"searchToolId3","arguments":{"query": "杭州 天气 去哪里玩"}}
TOOL_RESPONSE: """
Sunny. West Lake and other open-air spots are good options.
"""
ANSWER: 0: The weather in Hangzhou is sunny today; West Lake and other open-air spots are good choices.
</Instruction>
------
Here are the tools you can use this time:
"""
{{toolsPrompt}}
"""
USER: {{question}}
ANSWER: `;
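
Before its removal, the template was filled with `replaceVariable` (imported at the top of this file); the tool list and question below are illustrative:

```ts
const prompt = replaceVariable(Prompt_Tool_Call, {
  toolsPrompt: JSON.stringify([
    { toolId: 'searchToolId1', description: 'Get the current time', parameters: {} }
  ]),
  question: 'What time is it?'
});
```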
export const getMultiplePrompt = (obj: {
fileCount: number;
imgCount: number;

View File

@@ -1,655 +0,0 @@
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxContext, loadRequestMessages } from '../../../../chat/utils';
import type {
ChatCompletion,
StreamChatType,
ChatCompletionMessageParam,
ChatCompletionCreateParams,
ChatCompletionMessageFunctionCall,
ChatCompletionFunctionMessageParam,
ChatCompletionAssistantMessageParam,
CompletionFinishReason
} from '@fastgpt/global/core/ai/type.d';
import { type NextApiResponse } from 'next';
import { responseWriteController } from '../../../../../common/response';
import { SseResponseEventEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import {
ChatCompletionRequestMessageRoleEnum,
getLLMDefaultUsage
} from '@fastgpt/global/core/ai/constants';
import { dispatchWorkFlow } from '../../index';
import { type DispatchToolModuleProps, type RunToolResponse, type ToolNodeItemType } from './type';
import json5 from 'json5';
import { type DispatchFlowResponse, type WorkflowResponseType } from '../../type';
import { countGptMessagesTokens } from '../../../../../common/string/tiktoken/index';
import { getNanoid, sliceStrStartEnd } from '@fastgpt/global/common/string/tools';
import { type AIChatItemType } from '@fastgpt/global/core/chat/type';
import { GPTMessages2Chats } from '@fastgpt/global/core/chat/adapt';
import { formatToolResponse, initToolCallEdges, initToolNodes } from './utils';
import {
computedMaxToken,
llmCompletionsBodyFormat,
removeDatasetCiteText,
parseLLMStreamResponse
} from '../../../../ai/utils';
import { toolValueTypeList, valueTypeJsonSchemaMap } from '@fastgpt/global/core/workflow/constants';
import { type WorkflowInteractiveResponseType } from '@fastgpt/global/core/workflow/template/system/interactive/type';
import { ChatItemValueTypeEnum } from '@fastgpt/global/core/chat/constants';
type FunctionRunResponseType = {
toolRunResponse: DispatchFlowResponse;
functionCallMsg: ChatCompletionFunctionMessageParam;
}[];
export const runToolWithFunctionCall = async (
props: DispatchToolModuleProps,
response?: RunToolResponse
): Promise<RunToolResponse> => {
const { messages, toolNodes, toolModel, interactiveEntryToolParams, ...workflowProps } = props;
const {
res,
requestOrigin,
runtimeNodes,
runtimeEdges,
externalProvider,
stream,
retainDatasetCite = true,
workflowStreamResponse,
params: {
temperature,
maxToken,
aiChatVision,
aiChatTopP,
aiChatStopSign,
aiChatResponseFormat,
aiChatJsonSchema
}
} = workflowProps;
// Interactive
if (interactiveEntryToolParams) {
initToolNodes(runtimeNodes, interactiveEntryToolParams.entryNodeIds);
initToolCallEdges(runtimeEdges, interactiveEntryToolParams.entryNodeIds);
// Run entry tool
const toolRunResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolRunResponse.toolResponses);
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: interactiveEntryToolParams.toolCallId,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 5000, 5000)
}
}
});
// Check stop signal
const hasStopSignal = toolRunResponse.flowResponses?.some((item) => item.toolStop);
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponse = toolRunResponse.workflowInteractiveResponse;
const requestMessages = [
...messages,
...interactiveEntryToolParams.memoryMessages.map((item) =>
!workflowInteractiveResponse &&
item.role === 'function' &&
item.name === interactiveEntryToolParams.toolCallId
? {
...item,
content: stringToolResponse
}
: item
)
];
if (hasStopSignal || workflowInteractiveResponse) {
// Get interactive tool data
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: interactiveEntryToolParams.toolCallId,
memoryMessages: [...interactiveEntryToolParams.memoryMessages]
}
}
: undefined;
return {
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
completeMessages: requestMessages,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes,
toolWorkflowInteractiveResponse
};
}
return runToolWithFunctionCall(
{
...props,
interactiveEntryToolParams: undefined,
// Rewrite toolCall messages
messages: requestMessages
},
{
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes
}
);
}
// ------------------------------------------------------------
const assistantResponses = response?.assistantResponses || [];
const functions: ChatCompletionCreateParams.Function[] = toolNodes.map((item) => {
if (item.jsonSchema) {
return {
name: item.nodeId,
description: item.intro,
parameters: item.jsonSchema
};
}
const properties: Record<
string,
{
type: string;
description: string;
required?: boolean;
enum?: string[];
}
> = {};
item.toolParams.forEach((item) => {
const jsonSchema = item.valueType
? valueTypeJsonSchemaMap[item.valueType] || toolValueTypeList[0].jsonSchema
: toolValueTypeList[0].jsonSchema;
properties[item.key] = {
...jsonSchema,
description: item.toolDescription || '',
enum: item.enum?.split('\n').filter(Boolean) || []
};
});
return {
name: item.nodeId,
description: item.toolDescription || item.intro,
parameters: {
type: 'object',
properties,
required: item.toolParams.filter((item) => item.required).map((item) => item.key)
}
};
});
const max_tokens = computedMaxToken({
model: toolModel,
maxToken
});
const filterMessages = (
await filterGPTMessageByMaxContext({
messages,
maxContext: toolModel.maxContext - (max_tokens || 0) // context token budget; max_tokens is reserved for the response
})
).map((item) => {
if (item.role === ChatCompletionRequestMessageRoleEnum.Assistant && item.function_call) {
return {
...item,
function_call: {
name: item.function_call?.name,
arguments: item.function_call?.arguments
},
content: ''
};
}
return item;
});
const [requestMessages] = await Promise.all([
loadRequestMessages({
messages: filterMessages,
useVision: toolModel.vision && aiChatVision,
origin: requestOrigin
})
]);
const requestBody = llmCompletionsBodyFormat(
{
model: toolModel.model,
stream,
messages: requestMessages,
functions,
function_call: 'auto',
temperature,
max_tokens,
top_p: aiChatTopP,
stop: aiChatStopSign,
response_format: {
type: aiChatResponseFormat as any,
json_schema: aiChatJsonSchema
}
},
toolModel
);
// console.log(JSON.stringify(requestMessages, null, 2));
/* Run llm */
const {
response: aiResponse,
isStreamResponse,
getEmptyResponseTip
} = await createChatCompletion({
body: requestBody,
userKey: externalProvider.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
}
});
let { answer, functionCalls, inputTokens, outputTokens, finish_reason } = await (async () => {
if (isStreamResponse) {
if (!res || res.closed) {
return {
answer: '',
functionCalls: [],
inputTokens: 0,
outputTokens: 0,
finish_reason: 'close' as const
};
}
const result = await streamResponse({
res,
toolNodes,
stream: aiResponse,
workflowStreamResponse,
retainDatasetCite
});
return {
answer: result.answer,
functionCalls: result.functionCalls,
inputTokens: result.usage.prompt_tokens,
outputTokens: result.usage.completion_tokens,
finish_reason: result.finish_reason
};
} else {
const result = aiResponse as ChatCompletion;
const finish_reason = result.choices?.[0]?.finish_reason as CompletionFinishReason;
const function_call = result.choices?.[0]?.message?.function_call;
const usage = result.usage;
const toolNode = toolNodes.find((node) => node.nodeId === function_call?.name);
const toolCalls = function_call
? [
{
...function_call,
id: getNanoid(),
toolName: toolNode?.name,
toolAvatar: toolNode?.avatar
}
]
: [];
const answer = result.choices?.[0]?.message?.content || '';
if (answer) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: removeDatasetCiteText(answer, retainDatasetCite)
})
});
}
return {
answer,
functionCalls: toolCalls,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens,
finish_reason
};
}
})();
if (!answer && functionCalls.length === 0) {
return Promise.reject(getEmptyResponseTip());
}
// Run the selected tool.
const toolsRunResponse = (
await Promise.all(
functionCalls.map(async (tool) => {
if (!tool) return;
const toolNode = toolNodes.find((node) => node.nodeId === tool.name);
if (!toolNode) return;
const startParams = (() => {
try {
return json5.parse(tool.arguments);
} catch (error) {
return {};
}
})();
initToolNodes(runtimeNodes, [toolNode.nodeId], startParams);
const toolRunResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolRunResponse.toolResponses);
const functionCallMsg: ChatCompletionFunctionMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Function,
name: tool.name,
content: stringToolResponse
};
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: tool.id,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 500, 500)
}
}
});
return {
toolRunResponse,
functionCallMsg
};
})
)
).filter(Boolean) as FunctionRunResponseType;
const flatToolsResponseData = toolsRunResponse.map((item) => item.toolRunResponse).flat();
// concat tool responses
const dispatchFlowResponse = response
? response.dispatchFlowResponse.concat(flatToolsResponseData)
: flatToolsResponseData;
const functionCall = functionCalls[0];
if (functionCall) {
// Run the tool, combine its results, and perform another round of AI calls
const assistantToolMsgParams: ChatCompletionAssistantMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
function_call: functionCall
};
/*
...
user
assistant: tool data
*/
const concatToolMessages = [
...requestMessages,
assistantToolMsgParams
] as ChatCompletionMessageParam[];
// Only toolCall tokens are counted here; tool response tokens count towards the next reply
// const tokens = await countGptMessagesTokens(concatToolMessages, undefined, functions);
inputTokens =
inputTokens || (await countGptMessagesTokens(requestMessages, undefined, functions));
outputTokens = outputTokens || (await countGptMessagesTokens([assistantToolMsgParams]));
/*
...
user
assistant: tool data
tool: tool response
*/
const completeMessages = [
...concatToolMessages,
...toolsRunResponse.map((item) => item?.functionCallMsg)
];
/*
Get tool node assistant response
history assistant
current tool assistant
tool child assistant
*/
const toolNodeAssistant = GPTMessages2Chats([
assistantToolMsgParams,
...toolsRunResponse.map((item) => item?.functionCallMsg)
])[0] as AIChatItemType;
const toolChildAssistants = flatToolsResponseData
.map((item) => item.assistantResponses)
.flat()
.filter((item) => item.type !== ChatItemValueTypeEnum.interactive);
const toolNodeAssistants = [
...assistantResponses,
...toolNodeAssistant.value,
...toolChildAssistants
];
const runTimes =
(response?.runTimes || 0) +
flatToolsResponseData.reduce((sum, item) => sum + item.runTimes, 0);
const toolNodeInputTokens = response?.toolNodeInputTokens
? response.toolNodeInputTokens + inputTokens
: inputTokens;
const toolNodeOutputTokens = response?.toolNodeOutputTokens
? response.toolNodeOutputTokens + outputTokens
: outputTokens;
// Check stop signal
const hasStopSignal = flatToolsResponseData.some(
(item) => !!item.flowResponses?.find((item) => item.toolStop)
);
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponseItem = toolsRunResponse.find(
(item) => item.toolRunResponse.workflowInteractiveResponse
);
if (hasStopSignal || workflowInteractiveResponseItem) {
// Get interactive tool data
const workflowInteractiveResponse =
workflowInteractiveResponseItem?.toolRunResponse.workflowInteractiveResponse;
// Traverse completeMessages in reverse and keep only the messages after the last user message
const firstUserIndex = completeMessages.findLastIndex((item) => item.role === 'user');
const newMessages = completeMessages.slice(firstUserIndex + 1);
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: workflowInteractiveResponseItem?.functionCallMsg.name,
memoryMessages: newMessages
}
}
: undefined;
return {
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
completeMessages,
assistantResponses: toolNodeAssistants,
runTimes,
toolWorkflowInteractiveResponse,
finish_reason
};
}
return runToolWithFunctionCall(
{
...props,
messages: completeMessages
},
{
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
assistantResponses: toolNodeAssistants,
runTimes,
finish_reason
}
);
} else {
// No tool is invoked, indicating that the process is over
const gptAssistantResponse: ChatCompletionAssistantMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: answer
};
const completeMessages = filterMessages.concat(gptAssistantResponse);
inputTokens =
inputTokens || (await countGptMessagesTokens(requestMessages, undefined, functions));
outputTokens = outputTokens || (await countGptMessagesTokens([gptAssistantResponse]));
// console.log(tokens, 'response token');
// concat tool assistant
const toolNodeAssistant = GPTMessages2Chats([gptAssistantResponse])[0] as AIChatItemType;
return {
dispatchFlowResponse: response?.dispatchFlowResponse || [],
toolNodeInputTokens: response?.toolNodeInputTokens
? response.toolNodeInputTokens + inputTokens
: inputTokens,
toolNodeOutputTokens: response?.toolNodeOutputTokens
? response.toolNodeOutputTokens + outputTokens
: outputTokens,
completeMessages,
assistantResponses: [...assistantResponses, ...toolNodeAssistant.value],
runTimes: (response?.runTimes || 0) + 1,
finish_reason
};
}
};
async function streamResponse({
res,
toolNodes,
stream,
workflowStreamResponse,
retainDatasetCite
}: {
res: NextApiResponse;
toolNodes: ToolNodeItemType[];
stream: StreamChatType;
workflowStreamResponse?: WorkflowResponseType;
retainDatasetCite?: boolean;
}) {
const write = responseWriteController({
res,
readStream: stream
});
let functionCalls: ChatCompletionMessageFunctionCall[] = [];
let functionId = getNanoid();
const { parsePart, getResponseData, updateFinishReason } = parseLLMStreamResponse();
for await (const part of stream) {
if (res.closed) {
stream.controller?.abort();
updateFinishReason('close');
break;
}
const { responseContent } = parsePart({
part,
parseThinkTag: false,
retainDatasetCite
});
const responseChoice = part.choices?.[0]?.delta;
if (responseContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: responseContent
})
});
} else if (responseChoice?.function_call) {
const functionCall: {
arguments?: string;
name?: string;
} = responseChoice.function_call;
// In a stream response, only one function is returned at a time; a name field marks a newly triggered function
if (functionCall?.name) {
functionId = getNanoid();
const toolNode = toolNodes.find((item) => item.nodeId === functionCall?.name);
if (toolNode) {
functionCalls.push({
...functionCall,
arguments: functionCall.arguments || '',
id: functionId,
name: functionCall.name,
toolName: toolNode.name,
toolAvatar: toolNode.avatar
});
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: functionId,
toolName: toolNode.name,
toolAvatar: toolNode.avatar,
functionName: functionCall.name,
params: functionCall.arguments || '',
response: ''
}
}
});
}
continue;
}
/* Append arg to the last tool's arguments */
const arg: string = functionCall?.arguments || '';
const currentTool = functionCalls[functionCalls.length - 1];
if (currentTool) {
currentTool.arguments += arg;
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.toolParams,
data: {
tool: {
id: functionId,
toolName: '',
toolAvatar: '',
params: arg,
response: ''
}
}
});
}
}
}
const { content, finish_reason, usage } = getResponseData();
return { answer: content, functionCalls, finish_reason, usage };
}

View File

@@ -7,7 +7,7 @@ import type {
} from '@fastgpt/global/core/workflow/runtime/type';
import { getLLMModel } from '../../../../ai/model';
import { filterToolNodeIdByEdges, getNodeErrResponse, getHistories } from '../../utils';
import { runToolWithToolChoice } from './toolChoice';
import { runToolCall } from './toolCall';
import { type DispatchToolModuleProps, type ToolNodeItemType } from './type';
import { type ChatItemType, type UserChatItemValueItemType } from '@fastgpt/global/core/chat/type';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
@@ -20,15 +20,12 @@
} from '@fastgpt/global/core/chat/adapt';
import { formatModelChars2Points } from '../../../../../support/wallet/usage/utils';
import { getHistoryPreview } from '@fastgpt/global/core/chat/utils';
import { runToolWithFunctionCall } from './functionCall';
import { runToolWithPromptCall } from './promptCall';
import { replaceVariable } from '@fastgpt/global/common/string/tools';
import { getMultiplePrompt, Prompt_Tool_Call } from './constants';
import { getMultiplePrompt } from './constants';
import { filterToolResponseToPreview } from './utils';
import { getFileContentFromLinks, getHistoryFileLinks } from '../../tools/readFiles';
import { parseUrlToFileType } from '@fastgpt/global/common/file/tools';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { getDocumentQuotePrompt } from '@fastgpt/global/core/ai/prompt/AIChat';
import { postTextCensor } from '../../../../chat/postTextCensor';
import type { FlowNodeInputItemType } from '@fastgpt/global/core/workflow/type/io';
@@ -180,8 +177,8 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
const {
toolWorkflowInteractiveResponse,
dispatchFlowResponse, // tool flow response
toolNodeInputTokens,
toolNodeOutputTokens,
toolCallInputTokens,
toolCallOutputTokens,
completeMessages = [], // The actual message sent to AI(just save text)
assistantResponses = [], // FastGPT system store assistant.value response
runTimes,
@@ -201,64 +198,25 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
interactiveEntryToolParams: lastInteractive?.toolParams
};
if (toolModel.toolChoice) {
return runToolWithToolChoice({
...props,
...requestParams,
maxRunToolTimes: 30
});
}
if (toolModel.functionCall) {
return runToolWithFunctionCall({
...props,
...requestParams
});
}
const lastMessage = adaptMessages[adaptMessages.length - 1];
if (typeof lastMessage?.content === 'string') {
lastMessage.content = replaceVariable(Prompt_Tool_Call, {
question: lastMessage.content
});
} else if (Array.isArray(lastMessage.content)) {
// array, replace last element
const lastText = lastMessage.content[lastMessage.content.length - 1];
if (lastText.type === 'text') {
lastText.text = replaceVariable(Prompt_Tool_Call, {
question: lastText.text
});
} else {
return Promise.reject('Prompt call invalid input');
}
} else {
return Promise.reject('Prompt call invalid input');
}
return runToolWithPromptCall({
return runToolCall({
...props,
...requestParams
...requestParams,
maxRunToolTimes: 30
});
})();
const { totalPoints, modelName } = formatModelChars2Points({
const { totalPoints: modelTotalPoints, modelName } = formatModelChars2Points({
model,
inputTokens: toolNodeInputTokens,
outputTokens: toolNodeOutputTokens,
modelType: ModelTypeEnum.llm
inputTokens: toolCallInputTokens,
outputTokens: toolCallOutputTokens
});
const toolAIUsage = externalProvider.openaiAccount?.key ? 0 : totalPoints;
const modelUsage = externalProvider.openaiAccount?.key ? 0 : modelTotalPoints;
// flat child tool response
const childToolResponse = dispatchFlowResponse.map((item) => item.flowResponses).flat();
const toolUsages = dispatchFlowResponse.map((item) => item.flowUsages).flat();
const toolTotalPoints = toolUsages.reduce((sum, item) => sum + item.totalPoints, 0);
// concat tool usage
const totalPointsUsage =
toolAIUsage +
dispatchFlowResponse.reduce((sum, item) => {
const childrenTotal = item.flowUsages.reduce((sum, item) => sum + item.totalPoints, 0);
return sum + childrenTotal;
}, 0);
const flatUsages = dispatchFlowResponse.map((item) => item.flowUsages).flat();
const totalPointsUsage = modelUsage + toolTotalPoints;
const previewAssistantResponses = filterToolResponseToPreview(assistantResponses);
@@ -274,31 +232,31 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
[DispatchNodeResponseKeyEnum.nodeResponse]: {
// Points consumption shown to the user
totalPoints: totalPointsUsage,
toolCallInputTokens: toolNodeInputTokens,
toolCallOutputTokens: toolNodeOutputTokens,
childTotalPoints: flatUsages.reduce((sum, item) => sum + item.totalPoints, 0),
toolCallInputTokens: toolCallInputTokens,
toolCallOutputTokens: toolCallOutputTokens,
childTotalPoints: toolTotalPoints,
model: modelName,
query: userChatInput,
historyPreview: getHistoryPreview(
GPTMessages2Chats(completeMessages, false),
GPTMessages2Chats({ messages: completeMessages, reserveTool: false }),
10000,
useVision
),
toolDetail: childToolResponse,
toolDetail: dispatchFlowResponse.map((item) => item.flowResponses).flat(),
mergeSignId: nodeId,
finishReason: finish_reason
},
[DispatchNodeResponseKeyEnum.nodeDispatchUsages]: [
// Points consumed by the tool call itself
// Points consumed by the model itself
{
moduleName: name,
model: modelName,
totalPoints: toolAIUsage,
inputTokens: toolNodeInputTokens,
outputTokens: toolNodeOutputTokens
totalPoints: modelUsage,
inputTokens: toolCallInputTokens,
outputTokens: toolCallOutputTokens
},
// Tool consumption
...flatUsages
...toolUsages
],
[DispatchNodeResponseKeyEnum.interactive]: toolWorkflowInteractiveResponse
};
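
The reworked accounting in this hunk is simple arithmetic: model points and child-tool points are tracked separately and summed for the node total. A sketch with assumed numbers:

```ts
type Usage = { totalPoints: number };

const modelUsage = 3; // the LLM call itself (0 when the user's own key is used)
const toolUsages: Usage[] = [{ totalPoints: 1 }, { totalPoints: 2 }];

const toolTotalPoints = toolUsages.reduce((sum, item) => sum + item.totalPoints, 0); // 3
const totalPointsUsage = modelUsage + toolTotalPoints; // 6: reported as the node's totalPoints
```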

View File

@@ -1,709 +0,0 @@
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxContext, loadRequestMessages } from '../../../../chat/utils';
import {
type StreamChatType,
type ChatCompletionMessageParam,
type CompletionFinishReason
} from '@fastgpt/global/core/ai/type';
import { type NextApiResponse } from 'next';
import { responseWriteController } from '../../../../../common/response';
import { SseResponseEventEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import {
ChatCompletionRequestMessageRoleEnum,
getLLMDefaultUsage
} from '@fastgpt/global/core/ai/constants';
import { dispatchWorkFlow } from '../../index';
import { type DispatchToolModuleProps, type RunToolResponse, type ToolNodeItemType } from './type';
import json5 from 'json5';
import { countGptMessagesTokens } from '../../../../../common/string/tiktoken/index';
import {
getNanoid,
replaceVariable,
sliceJsonStr,
sliceStrStartEnd
} from '@fastgpt/global/common/string/tools';
import { type AIChatItemType } from '@fastgpt/global/core/chat/type';
import { GPTMessages2Chats } from '@fastgpt/global/core/chat/adapt';
import { formatToolResponse, initToolCallEdges, initToolNodes } from './utils';
import {
computedMaxToken,
llmCompletionsBodyFormat,
removeDatasetCiteText,
parseReasoningContent,
parseLLMStreamResponse
} from '../../../../ai/utils';
import { type WorkflowResponseType } from '../../type';
import { toolValueTypeList, valueTypeJsonSchemaMap } from '@fastgpt/global/core/workflow/constants';
import { type WorkflowInteractiveResponseType } from '@fastgpt/global/core/workflow/template/system/interactive/type';
import { ChatItemValueTypeEnum } from '@fastgpt/global/core/chat/constants';
type FunctionCallCompletion = {
id: string;
name: string;
arguments: string;
toolName?: string;
toolAvatar?: string;
};
const ERROR_TEXT = 'Tool run error';
const INTERACTIVE_STOP_SIGNAL = 'INTERACTIVE_STOP_SIGNAL';
export const runToolWithPromptCall = async (
props: DispatchToolModuleProps,
response?: RunToolResponse
): Promise<RunToolResponse> => {
const { messages, toolNodes, toolModel, interactiveEntryToolParams, ...workflowProps } = props;
const {
res,
requestOrigin,
runtimeNodes,
runtimeEdges,
externalProvider,
stream,
retainDatasetCite = true,
workflowStreamResponse,
params: {
temperature,
maxToken,
aiChatVision,
aiChatReasoning,
aiChatTopP,
aiChatStopSign,
aiChatResponseFormat,
aiChatJsonSchema
}
} = workflowProps;
if (interactiveEntryToolParams) {
initToolNodes(runtimeNodes, interactiveEntryToolParams.entryNodeIds);
initToolCallEdges(runtimeEdges, interactiveEntryToolParams.entryNodeIds);
// Run entry tool
const toolRunResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolRunResponse.toolResponses);
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: interactiveEntryToolParams.toolCallId,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 5000, 5000)
}
}
});
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponseItem = toolRunResponse?.workflowInteractiveResponse
? toolRunResponse
: undefined;
// Rewrite toolCall messages
const concatMessages = [...messages.slice(0, -1), ...interactiveEntryToolParams.memoryMessages];
const lastMessage = concatMessages[concatMessages.length - 1];
lastMessage.content = workflowInteractiveResponseItem
? lastMessage.content
: replaceVariable(lastMessage.content, {
[INTERACTIVE_STOP_SIGNAL]: stringToolResponse
});
// Check stop signal
const hasStopSignal = toolRunResponse.flowResponses.some((item) => !!item.toolStop);
if (hasStopSignal || workflowInteractiveResponseItem) {
// Get interactive tool data
const workflowInteractiveResponse =
workflowInteractiveResponseItem?.workflowInteractiveResponse;
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: '',
memoryMessages: [lastMessage]
}
}
: undefined;
return {
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
completeMessages: concatMessages,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes,
toolWorkflowInteractiveResponse
};
}
return runToolWithPromptCall(
{
...props,
interactiveEntryToolParams: undefined,
messages: concatMessages
},
{
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes
}
);
}
const assistantResponses = response?.assistantResponses || [];
const toolsPrompt = JSON.stringify(
toolNodes.map((item) => {
if (item.jsonSchema) {
return {
toolId: item.nodeId,
description: item.intro,
parameters: item.jsonSchema
};
}
const properties: Record<
string,
{
type: string;
description: string;
required?: boolean;
enum?: string[];
}
> = {};
item.toolParams.forEach((item) => {
const jsonSchema = item.valueType
? valueTypeJsonSchemaMap[item.valueType] || toolValueTypeList[0].jsonSchema
: toolValueTypeList[0].jsonSchema;
properties[item.key] = {
...jsonSchema,
description: item.toolDescription || '',
enum: item.enum?.split('\n').filter(Boolean) || []
};
});
return {
toolId: item.nodeId,
description: item.toolDescription || item.intro,
parameters: {
type: 'object',
properties,
required: item.toolParams.filter((item) => item.required).map((item) => item.key)
}
};
})
);
const lastMessage = messages[messages.length - 1];
if (typeof lastMessage.content === 'string') {
lastMessage.content = replaceVariable(lastMessage.content, {
toolsPrompt
});
} else if (Array.isArray(lastMessage.content)) {
// array, replace last element
const lastText = lastMessage.content[lastMessage.content.length - 1];
if (lastText.type === 'text') {
lastText.text = replaceVariable(lastText.text, {
toolsPrompt
});
} else {
return Promise.reject('Prompt call invalid input');
}
} else {
return Promise.reject('Prompt call invalid input');
}
const max_tokens = computedMaxToken({
model: toolModel,
maxToken,
min: 100
});
const filterMessages = await filterGPTMessageByMaxContext({
messages,
maxContext: toolModel.maxContext - (max_tokens || 0) // context token budget; max_tokens is reserved for the response
});
const [requestMessages] = await Promise.all([
loadRequestMessages({
messages: filterMessages,
useVision: aiChatVision,
origin: requestOrigin
})
]);
const requestBody = llmCompletionsBodyFormat(
{
model: toolModel.model,
stream,
messages: requestMessages,
temperature,
max_tokens,
top_p: aiChatTopP,
stop: aiChatStopSign,
response_format: {
type: aiChatResponseFormat as any,
json_schema: aiChatJsonSchema
}
},
toolModel
);
// console.log(JSON.stringify(requestMessages, null, 2));
/* Run llm */
const {
response: aiResponse,
isStreamResponse,
getEmptyResponseTip
} = await createChatCompletion({
body: requestBody,
userKey: externalProvider.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
}
});
let { answer, reasoning, finish_reason, inputTokens, outputTokens } = await (async () => {
if (isStreamResponse) {
if (!res || res.closed) {
return {
answer: '',
reasoning: '',
finish_reason: 'close' as const,
inputTokens: 0,
outputTokens: 0
};
}
const { answer, reasoning, finish_reason, usage } = await streamResponse({
res,
toolNodes,
stream: aiResponse,
workflowStreamResponse,
aiChatReasoning,
retainDatasetCite
});
return {
answer,
reasoning,
finish_reason,
inputTokens: usage.prompt_tokens,
outputTokens: usage.completion_tokens
};
} else {
const finish_reason = aiResponse.choices?.[0]?.finish_reason as CompletionFinishReason;
const content = aiResponse.choices?.[0]?.message?.content || '';
// @ts-ignore
const reasoningContent: string = aiResponse.choices?.[0]?.message?.reasoning_content || '';
const usage = aiResponse.usage;
const formatReasonContent = removeDatasetCiteText(reasoningContent, retainDatasetCite);
const formatContent = removeDatasetCiteText(content, retainDatasetCite);
// API already parse reasoning content
if (formatReasonContent || !aiChatReasoning) {
return {
answer: formatContent,
reasoning: formatReasonContent,
finish_reason,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens
};
}
const [think, answer] = parseReasoningContent(formatContent);
return {
answer,
reasoning: think,
finish_reason,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens
};
}
})();
if (stream && !isStreamResponse && aiChatReasoning && reasoning) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
reasoning_content: reasoning
})
});
}
const { answer: replaceAnswer, toolJson } = parseAnswer(answer);
if (!answer && !toolJson) {
return Promise.reject(getEmptyResponseTip());
}
// No tools
if (!toolJson) {
if (replaceAnswer === ERROR_TEXT) {
workflowStreamResponse?.({
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: replaceAnswer
})
});
}
// Simulated streaming response for models that do not support stream mode
if (stream && !isStreamResponse) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: removeDatasetCiteText(replaceAnswer, retainDatasetCite)
})
});
}
// No tool is invoked, indicating that the process is over
const gptAssistantResponse: ChatCompletionMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: replaceAnswer,
reasoning_text: reasoning
};
const completeMessages = filterMessages.concat({
...gptAssistantResponse,
reasoning_text: undefined
});
inputTokens = inputTokens || (await countGptMessagesTokens(requestMessages));
outputTokens = outputTokens || (await countGptMessagesTokens([gptAssistantResponse]));
// concat tool assistant
const toolNodeAssistant = GPTMessages2Chats([gptAssistantResponse])[0] as AIChatItemType;
return {
dispatchFlowResponse: response?.dispatchFlowResponse || [],
toolNodeInputTokens: response?.toolNodeInputTokens
? response.toolNodeInputTokens + inputTokens
: inputTokens,
toolNodeOutputTokens: response?.toolNodeOutputTokens
? response.toolNodeOutputTokens + outputTokens
: outputTokens,
completeMessages,
assistantResponses: [...assistantResponses, ...toolNodeAssistant.value],
runTimes: (response?.runTimes || 0) + 1
};
}
// Run the selected tool.
const toolsRunResponse = await (async () => {
const toolNode = toolNodes.find((item) => item.nodeId === toolJson.name);
if (!toolNode) return Promise.reject('tool not found');
toolJson.toolName = toolNode.name;
toolJson.toolAvatar = toolNode.avatar;
// run tool flow
const startParams = (() => {
try {
return json5.parse(toolJson.arguments);
} catch (error) {
return {};
}
})();
// SSE response to client
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: toolJson.id,
toolName: toolNode.name,
toolAvatar: toolNode.avatar,
functionName: toolJson.name,
params: toolJson.arguments,
response: ''
}
}
});
initToolNodes(runtimeNodes, [toolNode.nodeId], startParams);
const toolResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolResponse.toolResponses);
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: toolJson.id,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 500, 500)
}
}
});
return {
toolResponse,
toolResponsePrompt: stringToolResponse
};
})();
// Merge the tool call result and store it in functionCall format.
const assistantToolMsgParams: ChatCompletionMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
function_call: toolJson,
reasoning_text: reasoning
};
// Only toolCall tokens are counted here; tool response tokens count towards the next reply
inputTokens = inputTokens || (await countGptMessagesTokens(requestMessages));
outputTokens = outputTokens || (await countGptMessagesTokens([assistantToolMsgParams]));
/*
...
user
assistant: tool data
function: tool response
*/
const functionResponseMessage: ChatCompletionMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Function,
name: toolJson.name,
content: toolsRunResponse.toolResponsePrompt
};
// tool node assistant
const toolNodeAssistant = GPTMessages2Chats([
assistantToolMsgParams,
functionResponseMessage
])[0] as AIChatItemType;
const toolChildAssistants = toolsRunResponse.toolResponse.assistantResponses.filter(
(item) => item.type !== ChatItemValueTypeEnum.interactive
);
const toolNodeAssistants = [
...assistantResponses,
...toolNodeAssistant.value,
...toolChildAssistants
];
const dispatchFlowResponse = response
? [...response.dispatchFlowResponse, toolsRunResponse.toolResponse]
: [toolsRunResponse.toolResponse];
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponseItem = toolsRunResponse.toolResponse?.workflowInteractiveResponse
? toolsRunResponse.toolResponse
: undefined;
// get the next user prompt
if (typeof lastMessage.content === 'string') {
lastMessage.content += `${replaceAnswer}
TOOL_RESPONSE: """
${workflowInteractiveResponseItem ? `{{${INTERACTIVE_STOP_SIGNAL}}}` : toolsRunResponse.toolResponsePrompt}
"""
ANSWER: `;
} else if (Array.isArray(lastMessage.content)) {
// array, replace last element
const lastText = lastMessage.content[lastMessage.content.length - 1];
if (lastText.type === 'text') {
lastText.text += `${replaceAnswer}
TOOL_RESPONSE: """
${workflowInteractiveResponseItem ? `{{${INTERACTIVE_STOP_SIGNAL}}}` : toolsRunResponse.toolResponsePrompt}
"""
ANSWER: `;
} else {
return Promise.reject('Prompt call invalid input');
}
} else {
return Promise.reject('Prompt call invalid input');
}
const runTimes = (response?.runTimes || 0) + toolsRunResponse.toolResponse.runTimes;
const toolNodeInputTokens = response?.toolNodeInputTokens
? response.toolNodeInputTokens + inputTokens
: inputTokens;
const toolNodeOutputTokens = response?.toolNodeOutputTokens
? response.toolNodeOutputTokens + outputTokens
: outputTokens;
// Check stop signal
const hasStopSignal = toolsRunResponse.toolResponse.flowResponses.some((item) => !!item.toolStop);
if (hasStopSignal || workflowInteractiveResponseItem) {
// Get interactive tool data
const workflowInteractiveResponse =
workflowInteractiveResponseItem?.workflowInteractiveResponse;
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: '',
memoryMessages: [lastMessage]
}
}
: undefined;
return {
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
completeMessages: filterMessages,
assistantResponses: toolNodeAssistants,
runTimes,
toolWorkflowInteractiveResponse
};
}
return runToolWithPromptCall(
{
...props,
messages
},
{
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
assistantResponses: toolNodeAssistants,
runTimes,
finish_reason
}
);
};
async function streamResponse({
res,
stream,
workflowStreamResponse,
aiChatReasoning,
retainDatasetCite
}: {
res: NextApiResponse;
toolNodes: ToolNodeItemType[];
stream: StreamChatType;
workflowStreamResponse?: WorkflowResponseType;
aiChatReasoning?: boolean;
retainDatasetCite?: boolean;
}) {
const write = responseWriteController({
res,
readStream: stream
});
let startResponseWrite = false;
let answer = '';
const { parsePart, getResponseData, updateFinishReason } = parseLLMStreamResponse();
for await (const part of stream) {
if (res.closed) {
stream.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, content, responseContent } = parsePart({
part,
parseThinkTag: aiChatReasoning,
retainDatasetCite
});
answer += content;
// Reasoning response
if (aiChatReasoning && reasoningContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
reasoning_content: reasoningContent
})
});
}
if (content) {
if (startResponseWrite) {
if (responseContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: responseContent
})
});
}
} else if (answer.length >= 3) {
answer = answer.trimStart();
if (/0(:|:)/.test(answer)) {
startResponseWrite = true;
// find first : index
const firstIndex =
answer.indexOf('0:') !== -1 ? answer.indexOf('0:') : answer.indexOf('0:');
answer = answer.substring(firstIndex + 2).trim();
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: answer
})
});
}
}
}
}
const { reasoningContent, content, finish_reason, usage } = getResponseData();
return { answer: content, reasoning: reasoningContent, finish_reason, usage };
}
const parseAnswer = (
str: string
): {
answer: string;
toolJson?: FunctionCallCompletion;
} => {
str = str.trim();
// First extract TOOL_ID and TOOL_ARGUMENTS with a regular expression
const prefixReg = /1(:|:)/;
if (prefixReg.test(str)) {
const toolString = sliceJsonStr(str);
try {
const toolCall = json5.parse(toolString);
return {
answer: `1: ${toolString}`,
toolJson: {
id: getNanoid(),
name: toolCall.toolId,
arguments: JSON.stringify(toolCall.arguments || toolCall.parameters)
}
};
} catch (error) {
if (/^1(:|:)/.test(str)) {
return {
answer: ERROR_TEXT
};
} else {
return {
answer: str
};
}
}
} else {
const firstIndex = str.indexOf('0:') !== -1 ? str.indexOf('0:') : str.indexOf('0:');
const answer = str.substring(firstIndex + 2).trim();
return {
answer
};
}
};
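
For reference, the two paths through the removed `parseAnswer` (inputs illustrative):

```ts
parseAnswer('1: {"toolId":"searchToolId2","arguments":{"city":"杭州"}}');
// => { answer: '1: {...}', toolJson: { id: '<nanoid>', name: 'searchToolId2', arguments: '{"city":"杭州"}' } }

parseAnswer('0: It is sunny today.');
// => { answer: 'It is sunny today.' }
```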

View File

@@ -1,38 +1,27 @@
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxContext, loadRequestMessages } from '../../../../chat/utils';
import {
type ChatCompletion,
type ChatCompletionMessageToolCall,
type StreamChatType,
type ChatCompletionToolMessageParam,
type ChatCompletionMessageParam,
type ChatCompletionTool,
type CompletionFinishReason
import { filterGPTMessageByMaxContext } from '../../../../ai/llm/utils';
import type {
ChatCompletionToolMessageParam,
ChatCompletionMessageParam,
ChatCompletionTool
} from '@fastgpt/global/core/ai/type';
import { type NextApiResponse } from 'next';
import { responseWriteController } from '../../../../../common/response';
import { SseResponseEventEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import { dispatchWorkFlow } from '../../index';
import { type DispatchToolModuleProps, type RunToolResponse, type ToolNodeItemType } from './type';
import type { DispatchToolModuleProps, RunToolResponse, ToolNodeItemType } from './type';
import json5 from 'json5';
import { type DispatchFlowResponse, type WorkflowResponseType } from '../../type';
import { countGptMessagesTokens } from '../../../../../common/string/tiktoken/index';
import type { DispatchFlowResponse } from '../../type';
import { GPTMessages2Chats } from '@fastgpt/global/core/chat/adapt';
import { type AIChatItemType } from '@fastgpt/global/core/chat/type';
import type { AIChatItemType } from '@fastgpt/global/core/chat/type';
import { formatToolResponse, initToolCallEdges, initToolNodes } from './utils';
import {
computedMaxToken,
llmCompletionsBodyFormat,
removeDatasetCiteText,
parseLLMStreamResponse
} from '../../../../ai/utils';
import { getNanoid, sliceStrStartEnd } from '@fastgpt/global/common/string/tools';
import { toolValueTypeList, valueTypeJsonSchemaMap } from '@fastgpt/global/core/workflow/constants';
import { type WorkflowInteractiveResponseType } from '@fastgpt/global/core/workflow/template/system/interactive/type';
import { computedMaxToken } from '../../../../ai/utils';
import { sliceStrStartEnd } from '@fastgpt/global/common/string/tools';
import type { WorkflowInteractiveResponseType } from '@fastgpt/global/core/workflow/template/system/interactive/type';
import { ChatItemValueTypeEnum } from '@fastgpt/global/core/chat/constants';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { createLLMResponse } from '../../../../ai/llm/request';
import { toolValueTypeList, valueTypeJsonSchemaMap } from '@fastgpt/global/core/workflow/constants';
type ToolRunResponseType = {
toolRunResponse?: DispatchFlowResponse;
@@ -76,7 +65,7 @@ type ToolRunResponseType = {
3. Concat messages with assistant responses and tool responses
*/
export const runToolWithToolChoice = async (
export const runToolCall = async (
props: DispatchToolModuleProps & {
maxRunToolTimes: number;
},
@@ -110,7 +99,6 @@ export const runToolWithToolChoice = async (
aiChatReasoning
}
} = workflowProps;
aiChatReasoning = !!aiChatReasoning && !!toolModel.reasoning;
if (maxRunToolTimes <= 0 && response) {
return response;
@@ -175,8 +163,8 @@ export const runToolWithToolChoice = async (
return {
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
toolCallInputTokens: 0,
toolCallOutputTokens: 0,
completeMessages: requestMessages,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes,
@@ -184,7 +172,7 @@ export const runToolWithToolChoice = async (
};
}
return runToolWithToolChoice(
return runToolCall(
{
...props,
interactiveEntryToolParams: undefined,
@@ -194,8 +182,8 @@ export const runToolWithToolChoice = async (
},
{
dispatchFlowResponse: [toolRunResponse],
toolNodeInputTokens: 0,
toolNodeOutputTokens: 0,
toolCallInputTokens: 0,
toolCallOutputTokens: 0,
assistantResponses: toolRunResponse.assistantResponses,
runTimes: toolRunResponse.runTimes
}
@@ -206,7 +194,9 @@ export const runToolWithToolChoice = async (
const assistantResponses = response?.assistantResponses || [];
const toolNodesMap = new Map<string, ToolNodeItemType>();
const tools: ChatCompletionTool[] = toolNodes.map((item) => {
toolNodesMap.set(item.nodeId, item);
if (item.jsonSchema) {
return {
type: 'function',
@@ -282,20 +272,26 @@ export const runToolWithToolChoice = async (
return item;
});
const [requestMessages] = await Promise.all([
loadRequestMessages({
messages: filterMessages,
useVision: toolModel.vision && aiChatVision,
origin: requestOrigin
})
]);
const requestBody = llmCompletionsBodyFormat(
{
const write = res ? responseWriteController({ res, readStream: stream }) : undefined;
let {
reasoningText: reasoningContent,
answerText: answer,
toolCalls = [],
finish_reason,
usage,
getEmptyResponseTip,
assistantMessage,
completeMessages
} = await createLLMResponse({
body: {
model: toolModel.model,
stream,
messages: requestMessages,
tools,
reasoning: aiChatReasoning,
messages: filterMessages,
tool_choice: 'auto',
toolCallMode: toolModel.toolChoice ? 'toolChoice' : 'prompt',
tools,
parallel_tool_calls: true,
temperature,
max_tokens,
@@ -304,124 +300,67 @@ export const runToolWithToolChoice = async (
response_format: {
type: aiChatResponseFormat as any,
json_schema: aiChatJsonSchema
},
retainDatasetCite,
useVision: aiChatVision,
requestOrigin
},
isAborted: () => res?.closed,
userKey: externalProvider.openaiAccount,
onReasoning({ text }) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
reasoning_content: text
})
});
},
onStreaming({ text }) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text
})
});
},
onToolCall({ call }) {
const toolNode = toolNodesMap.get(call.function.name);
if (toolNode) {
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: call.id,
toolName: toolNode.name,
toolAvatar: toolNode.avatar,
functionName: call.function.name,
params: call.function.arguments ?? '',
response: ''
}
}
});
}
},
toolModel
);
// console.log(JSON.stringify(requestBody, null, 2), '==requestMessages');
/* Run llm */
const {
response: aiResponse,
isStreamResponse,
getEmptyResponseTip
} = await createChatCompletion({
body: requestBody,
userKey: externalProvider.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
onToolParam({ tool, params }) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.toolParams,
data: {
tool: {
id: tool.id,
toolName: '',
toolAvatar: '',
params,
response: ''
}
}
});
}
});
let { reasoningContent, answer, toolCalls, finish_reason, inputTokens, outputTokens } =
await (async () => {
if (isStreamResponse) {
if (!res || res.closed) {
return {
reasoningContent: '',
answer: '',
toolCalls: [],
finish_reason: 'close' as const,
inputTokens: 0,
outputTokens: 0
};
}
const result = await streamResponse({
res,
workflowStreamResponse,
toolNodes,
stream: aiResponse,
aiChatReasoning,
retainDatasetCite
});
return {
reasoningContent: result.reasoningContent,
answer: result.answer,
toolCalls: result.toolCalls,
finish_reason: result.finish_reason,
inputTokens: result.usage.prompt_tokens,
outputTokens: result.usage.completion_tokens
};
} else {
const result = aiResponse as ChatCompletion;
const finish_reason = result.choices?.[0]?.finish_reason as CompletionFinishReason;
const calls = result.choices?.[0]?.message?.tool_calls || [];
const answer = result.choices?.[0]?.message?.content || '';
// @ts-ignore
const reasoningContent = result.choices?.[0]?.message?.reasoning_content || '';
const usage = result.usage;
const formatReasoningContent = removeDatasetCiteText(reasoningContent, retainDatasetCite);
const formatAnswer = removeDatasetCiteText(answer, retainDatasetCite);
if (aiChatReasoning && reasoningContent) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
reasoning_content: formatReasoningContent
})
});
}
// Format toolCalls
const toolCalls = calls.map((tool) => {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
// For models that do not support stream mode, a toolCall response must be sent to the client here
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: tool.id,
toolName: toolNode?.name || '',
toolAvatar: toolNode?.avatar || '',
functionName: tool.function.name,
params: tool.function?.arguments ?? '',
response: ''
}
}
});
return {
...tool,
toolName: toolNode?.name || '',
toolAvatar: toolNode?.avatar || ''
};
});
if (answer) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: formatAnswer
})
});
}
return {
reasoningContent: formatReasoningContent,
answer: formatAnswer,
toolCalls: toolCalls,
finish_reason,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens
};
}
})();
if (!answer && !reasoningContent && toolCalls.length === 0) {
if (!answer && !reasoningContent && !toolCalls.length) {
return Promise.reject(getEmptyResponseTip());
}
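
The new `createLLMResponse` collapses the old stream/unstream branching into one call with event callbacks; a hedged sketch of the call shape (field names taken from this diff, handler bodies illustrative):

```ts
const { answerText, toolCalls = [], finish_reason, usage, getEmptyResponseTip } =
  await createLLMResponse({
    body: {
      model: toolModel.model,
      stream: true,
      messages: filterMessages,
      tool_choice: 'auto',
      toolCallMode: toolModel.toolChoice ? 'toolChoice' : 'prompt',
      tools
    },
    isAborted: () => res?.closed,
    onStreaming({ text }) {
      // push incremental answer text to the client
    },
    onToolCall({ call }) {
      // announce the requested tool (call.function.name) to the client
    }
  });

if (!answerText && !toolCalls.length) {
  throw new Error(getEmptyResponseTip());
}
```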
@@ -431,7 +370,7 @@ export const runToolWithToolChoice = async (
const toolsRunResponse: ToolRunResponseType = [];
for await (const tool of toolCalls) {
try {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
const toolNode = toolNodesMap.get(tool.function?.name);
if (!toolNode) continue;
@@ -511,64 +450,46 @@ export const runToolWithToolChoice = async (
? response.dispatchFlowResponse.concat(flatToolsResponseData)
: flatToolsResponseData;
const inputTokens = response
? response.toolCallInputTokens + usage.inputTokens
: usage.inputTokens;
const outputTokens = response
? response.toolCallOutputTokens + usage.outputTokens
: usage.outputTokens;
if (toolCalls.length > 0) {
// Run the tool, combine its results, and perform another round of AI calls
const assistantToolMsgParams: ChatCompletionMessageParam[] = [
...(answer || reasoningContent
? [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant as 'assistant',
content: answer,
reasoning_text: reasoningContent
}
]
: []),
{
role: ChatCompletionRequestMessageRoleEnum.Assistant,
tool_calls: toolCalls
}
];
/*
...
user
assistant: tool data
*/
const concatToolMessages = [
...requestMessages,
...assistantToolMsgParams
] as ChatCompletionMessageParam[];
// Only toolCall tokens are counted here; tool response tokens count towards the next reply
inputTokens = inputTokens || (await countGptMessagesTokens(requestMessages, tools));
outputTokens = outputTokens || (await countGptMessagesTokens(assistantToolMsgParams));
/*
...
user
assistant: tool data
tool: tool response
*/
const completeMessages = [
...concatToolMessages,
const nextRequestMessages: ChatCompletionMessageParam[] = [
...completeMessages,
...toolsRunResponse.map((item) => item?.toolMsgParams)
];
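// For orientation, the array above then has roughly this shape (values are
// hypothetical, not taken from the repository):
// [
//   ...history,
//   { role: 'user', content: 'What is the weather in Paris?' },
//   { role: 'assistant', tool_calls: [{ id: 'call_1', type: 'function',
//       function: { name: 'getWeather', arguments: '{"city":"Paris"}' } }] },
//   { role: 'tool', tool_call_id: 'call_1', content: '{"temperature":21}' }
// ]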
/*
Get tool node assistant response
- history assistant
- current tool assistant
- tool child assistant
*/
const toolNodeAssistant = GPTMessages2Chats([
...assistantToolMsgParams,
...toolsRunResponse.map((item) => item?.toolMsgParams)
])[0] as AIChatItemType;
const toolNodeAssistant = GPTMessages2Chats({
messages: [...assistantMessage, ...toolsRunResponse.map((item) => item?.toolMsgParams)],
getToolInfo: (id) => {
const toolNode = toolNodesMap.get(id);
return {
name: toolNode?.name || '',
avatar: toolNode?.avatar || ''
};
}
})[0] as AIChatItemType;
const toolChildAssistants = flatToolsResponseData
.map((item) => item.assistantResponses)
.flat()
.filter((item) => item.type !== ChatItemValueTypeEnum.interactive); // Interactive nodes are kept for the next recording round
const toolNodeAssistants = [
const concatAssistantResponses = [
...assistantResponses,
...toolNodeAssistant.value,
...toolChildAssistants
@@ -577,10 +498,6 @@ export const runToolWithToolChoice = async (
const runTimes =
(response?.runTimes || 0) +
flatToolsResponseData.reduce((sum, item) => sum + item.runTimes, 0);
const toolNodeInputTokens = response ? response.toolNodeInputTokens + inputTokens : inputTokens;
const toolNodeOutputTokens = response
? response.toolNodeOutputTokens + outputTokens
: outputTokens;
// Check stop signal
const hasStopSignal = flatToolsResponseData.some(
@@ -596,8 +513,8 @@ export const runToolWithToolChoice = async (
workflowInteractiveResponseItem?.toolRunResponse?.workflowInteractiveResponse;
// Traverse completeMessages in reverse and keep only the messages after the last user message
const firstUserIndex = completeMessages.findLastIndex((item) => item.role === 'user');
const newMessages = completeMessages.slice(firstUserIndex + 1);
const firstUserIndex = nextRequestMessages.findLastIndex((item) => item.role === 'user');
const newMessages = nextRequestMessages.slice(firstUserIndex + 1);
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
@@ -613,49 +530,41 @@ export const runToolWithToolChoice = async (
return {
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
completeMessages,
assistantResponses: toolNodeAssistants,
toolCallInputTokens: inputTokens,
toolCallOutputTokens: outputTokens,
completeMessages: nextRequestMessages,
assistantResponses: concatAssistantResponses,
toolWorkflowInteractiveResponse,
runTimes,
finish_reason
};
}
return runToolWithToolChoice(
return runToolCall(
{
...props,
maxRunToolTimes: maxRunToolTimes - 1,
messages: completeMessages
messages: nextRequestMessages
},
{
dispatchFlowResponse,
toolNodeInputTokens,
toolNodeOutputTokens,
assistantResponses: toolNodeAssistants,
toolCallInputTokens: inputTokens,
toolCallOutputTokens: outputTokens,
assistantResponses: concatAssistantResponses,
runTimes,
finish_reason
}
);
} else {
// No tool is invoked, indicating that the process is over
const gptAssistantResponse: ChatCompletionMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: answer,
reasoning_text: reasoningContent
};
const completeMessages = filterMessages.concat(gptAssistantResponse);
inputTokens = inputTokens || (await countGptMessagesTokens(requestMessages, tools));
outputTokens = outputTokens || (await countGptMessagesTokens([gptAssistantResponse]));
// concat tool assistant
const toolNodeAssistant = GPTMessages2Chats([gptAssistantResponse])[0] as AIChatItemType;
const toolNodeAssistant = GPTMessages2Chats({
messages: assistantMessage
})[0] as AIChatItemType;
return {
dispatchFlowResponse: response?.dispatchFlowResponse || [],
toolNodeInputTokens: response ? response.toolNodeInputTokens + inputTokens : inputTokens,
toolNodeOutputTokens: response ? response.toolNodeOutputTokens + outputTokens : outputTokens,
toolCallInputTokens: inputTokens,
toolCallOutputTokens: outputTokens,
completeMessages,
assistantResponses: [...assistantResponses, ...toolNodeAssistant.value],
@@ -664,152 +573,3 @@ export const runToolWithToolChoice = async (
};
}
};
async function streamResponse({
res,
toolNodes,
stream,
workflowStreamResponse,
aiChatReasoning,
retainDatasetCite
}: {
res: NextApiResponse;
toolNodes: ToolNodeItemType[];
stream: StreamChatType;
workflowStreamResponse?: WorkflowResponseType;
aiChatReasoning: boolean;
retainDatasetCite?: boolean;
}) {
const write = responseWriteController({
res,
readStream: stream
});
let callingTool: { name: string; arguments: string } | null = null;
let toolCalls: ChatCompletionMessageToolCall[] = [];
const { parsePart, getResponseData, updateFinishReason } = parseLLMStreamResponse();
for await (const part of stream) {
if (res.closed) {
stream.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, responseContent } = parsePart({
part,
parseThinkTag: true,
retainDatasetCite
});
const responseChoice = part.choices?.[0]?.delta;
// Reasoning response
if (aiChatReasoning && reasoningContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
reasoning_content: reasoningContent
})
});
}
if (responseContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: responseContent
})
});
}
// Parse tool calls
if (responseChoice?.tool_calls?.length) {
responseChoice.tool_calls.forEach((toolCall, i) => {
const index = toolCall.index ?? i;
// Call new tool
const hasNewTool = toolCall?.function?.name || callingTool;
if (hasNewTool) {
// A function name marks the start of a new tool call
if (toolCall?.function?.name) {
callingTool = {
name: toolCall.function?.name || '',
arguments: toolCall.function?.arguments || ''
};
} else if (callingTool) {
// Continuation of the call (the previous function name may have arrived incomplete)
callingTool.name += toolCall.function?.name || '';
callingTool.arguments += toolCall.function?.arguments || '';
}
if (!callingTool) {
return;
}
const toolNode = toolNodes.find((item) => item.nodeId === callingTool!.name);
if (toolNode) {
// New tool, add to list.
const toolId = getNanoid();
toolCalls[index] = {
...toolCall,
id: toolId,
type: 'function',
function: callingTool,
toolName: toolNode.name,
toolAvatar: toolNode.avatar
};
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: toolId,
toolName: toolNode.name,
toolAvatar: toolNode.avatar,
functionName: callingTool.name,
params: callingTool?.arguments ?? '',
response: ''
}
}
});
callingTool = null;
}
} else {
/* Append arg to the current tool's arguments */
const arg: string = toolCall?.function?.arguments ?? '';
const currentTool = toolCalls[index];
if (currentTool && arg) {
currentTool.function.arguments += arg;
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.toolParams,
data: {
tool: {
id: currentTool.id,
toolName: '',
toolAvatar: '',
params: arg,
response: ''
}
}
});
}
}
});
}
}
const { reasoningContent, content, finish_reason, usage } = getResponseData();
return {
reasoningContent,
answer: content,
toolCalls: toolCalls.filter(Boolean),
finish_reason,
usage
};
}
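For reference, the accumulation pattern this helper implements can be reduced to a small standalone sketch: OpenAI-style streams split each tool call into deltas that share an index, where the first delta carries the call id and function name and later deltas append argument fragments. The delta type and helper below are simplified assumptions for illustration, not the repository's actual types:

type ToolCallDelta = {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
};

// Fold streamed deltas into complete tool calls, keyed by their index
function accumulateToolCalls(deltas: ToolCallDelta[]) {
  const calls: { id: string; name: string; arguments: string }[] = [];
  for (const delta of deltas) {
    const current = calls[delta.index];
    if (!current) {
      // First delta of a call carries the id and (possibly partial) name
      calls[delta.index] = {
        id: delta.id ?? '',
        name: delta.function?.name ?? '',
        arguments: delta.function?.arguments ?? ''
      };
    } else {
      // Later deltas append name remainders and argument fragments
      current.name += delta.function?.name ?? '';
      current.arguments += delta.function?.arguments ?? '';
    }
  }
  return calls.filter(Boolean);
}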


@@ -42,8 +42,8 @@ export type DispatchToolModuleProps = ModuleDispatchProps<{
export type RunToolResponse = {
dispatchFlowResponse: DispatchFlowResponse[];
toolNodeInputTokens: number;
toolNodeOutputTokens: number;
toolCallInputTokens: number;
toolCallOutputTokens: number;
completeMessages?: ChatCompletionMessageParam[];
assistantResponses?: AIChatItemValueItemType[];
toolWorkflowInteractiveResponse?: WorkflowInteractiveResponseType;


@@ -1,28 +1,13 @@
import type { NextApiResponse } from 'next';
import { filterGPTMessageByMaxContext, loadRequestMessages } from '../../../chat/utils';
import { filterGPTMessageByMaxContext } from '../../../ai/llm/utils';
import type { ChatItemType, UserChatItemValueItemType } from '@fastgpt/global/core/chat/type.d';
import { ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { SseResponseEventEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import {
removeDatasetCiteText,
parseReasoningContent,
parseLLMStreamResponse
} from '../../../ai/utils';
import { createChatCompletion } from '../../../ai/config';
import type {
ChatCompletionMessageParam,
CompletionFinishReason,
StreamChatType
} from '@fastgpt/global/core/ai/type.d';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
import type { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import type {
ChatDispatchProps,
DispatchNodeResultType
} from '@fastgpt/global/core/workflow/runtime/type';
import { countGptMessagesTokens } from '../../../../common/string/tiktoken/index';
import {
chats2GPTMessages,
chatValue2RuntimePrompt,
@@ -47,16 +32,15 @@ import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runti
import { checkQuoteQAValue, getNodeErrResponse, getHistories } from '../utils';
import { filterSearchResultsByMaxChars } from '../../utils';
import { getHistoryPreview } from '@fastgpt/global/core/chat/utils';
import { computedMaxToken, llmCompletionsBodyFormat } from '../../../ai/utils';
import { type WorkflowResponseType } from '../type';
import { computedMaxToken } from '../../../ai/utils';
import { formatTime2YMDHM } from '@fastgpt/global/common/string/time';
import { type AiChatQuoteRoleType } from '@fastgpt/global/core/workflow/template/system/aiChat/type';
import type { AiChatQuoteRoleType } from '@fastgpt/global/core/workflow/template/system/aiChat/type';
import { getFileContentFromLinks, getHistoryFileLinks } from '../tools/readFiles';
import { parseUrlToFileType } from '@fastgpt/global/common/file/tools';
import { i18nT } from '../../../../../web/i18n/utils';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { postTextCensor } from '../../../chat/postTextCensor';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { createLLMResponse } from '../../../ai/llm/request';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
export type ChatProps = ModuleDispatchProps<
AIChatNodeProps & {
@@ -124,7 +108,6 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
try {
aiChatVision = modelConstantsData.vision && aiChatVision;
aiChatReasoning = !!aiChatReasoning && !!modelConstantsData.reasoning;
// Check fileLinks is reference variable
const fileUrlInput = inputs.find((item) => item.key === NodeInputKeyEnum.fileUrlList);
if (!fileUrlInput || !fileUrlInput.value || fileUrlInput.value.length === 0) {
@@ -188,165 +171,82 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
})()
]);
const requestMessages = await loadRequestMessages({
messages: filterMessages,
useVision: aiChatVision,
origin: requestOrigin
});
const write = res ? responseWriteController({ res, readStream: stream }) : undefined;
const requestBody = llmCompletionsBodyFormat(
{
const {
completeMessages,
reasoningText,
answerText,
finish_reason,
getEmptyResponseTip,
usage
} = await createLLMResponse({
body: {
model: modelConstantsData.model,
stream,
messages: requestMessages,
reasoning: aiChatReasoning,
messages: filterMessages,
temperature,
max_tokens,
top_p: aiChatTopP,
stop: aiChatStopSign,
response_format: {
type: aiChatResponseFormat as any,
type: aiChatResponseFormat,
json_schema: aiChatJsonSchema
}
},
retainDatasetCite,
useVision: aiChatVision,
requestOrigin
},
modelConstantsData
);
// console.log(JSON.stringify(requestBody, null, 2), '===');
const { response, isStreamResponse, getEmptyResponseTip } = await createChatCompletion({
body: requestBody,
userKey: externalProvider.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
isAborted: () => res?.closed,
onReasoning({ text }) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
reasoning_content: text
})
});
},
onStreaming({ text }) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text
})
});
}
});
let { answerText, reasoningText, finish_reason, inputTokens, outputTokens } =
await (async () => {
if (isStreamResponse) {
if (!res || res.closed) {
return {
answerText: '',
reasoningText: '',
finish_reason: 'close' as const,
inputTokens: 0,
outputTokens: 0
};
}
// sse response
const { answer, reasoning, finish_reason, usage } = await streamResponse({
res,
stream: response,
aiChatReasoning,
parseThinkTag: modelConstantsData.reasoning,
isResponseAnswerText,
workflowStreamResponse,
retainDatasetCite
});
return {
answerText: answer,
reasoningText: reasoning,
finish_reason,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens
};
} else {
const finish_reason = response.choices?.[0]?.finish_reason as CompletionFinishReason;
const usage = response.usage;
const { content, reasoningContent } = (() => {
const content = response.choices?.[0]?.message?.content || '';
const reasoningContent: string =
// @ts-ignore
response.choices?.[0]?.message?.reasoning_content || '';
// The API already parsed out the reasoning content
if (reasoningContent || !aiChatReasoning) {
return {
content,
reasoningContent
};
}
const [think, answer] = parseReasoningContent(content);
return {
content: answer,
reasoningContent: think
};
})();
const formatReasonContent = removeDatasetCiteText(reasoningContent, retainDatasetCite);
const formatContent = removeDatasetCiteText(content, retainDatasetCite);
// Some models do not support streaming
if (aiChatReasoning && reasoningContent) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
reasoning_content: formatReasonContent
})
});
}
if (isResponseAnswerText && content) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: formatContent
})
});
}
return {
reasoningText: formatReasonContent,
answerText: formatContent,
finish_reason,
inputTokens: usage?.prompt_tokens,
outputTokens: usage?.completion_tokens
};
}
})();
if (!answerText && !reasoningText) {
return getNodeErrResponse({ error: getEmptyResponseTip() });
}
const AIMessages: ChatCompletionMessageParam[] = [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: answerText,
reasoning_text: reasoningText // reasoning_text is only recorded for response, but not for request
}
];
const completeMessages = [...requestMessages, ...AIMessages];
const chatCompleteMessages = GPTMessages2Chats(completeMessages);
inputTokens = inputTokens || (await countGptMessagesTokens(requestMessages));
outputTokens = outputTokens || (await countGptMessagesTokens(AIMessages));
const { totalPoints, modelName } = formatModelChars2Points({
model,
inputTokens,
outputTokens,
modelType: ModelTypeEnum.llm
model: modelConstantsData.model,
inputTokens: usage.inputTokens,
outputTokens: usage.outputTokens
});
const points = externalProvider.openaiAccount?.key ? 0 : totalPoints;
const chatCompleteMessages = GPTMessages2Chats({ messages: completeMessages });
const trimAnswer = answerText.trim();
return {
data: {
answerText: trimAnswer,
answerText: answerText,
reasoningText,
history: chatCompleteMessages
},
[DispatchNodeResponseKeyEnum.answerText]: isResponseAnswerText ? trimAnswer : undefined,
[DispatchNodeResponseKeyEnum.answerText]: isResponseAnswerText ? answerText : undefined,
[DispatchNodeResponseKeyEnum.reasoningText]: aiChatReasoning ? reasoningText : undefined,
[DispatchNodeResponseKeyEnum.nodeResponse]: {
totalPoints: externalProvider.openaiAccount?.key ? 0 : totalPoints,
totalPoints: points,
model: modelName,
inputTokens: inputTokens,
outputTokens: outputTokens,
inputTokens: usage.inputTokens,
outputTokens: usage.outputTokens,
query: `${userChatInput}`,
maxToken: max_tokens,
reasoningText,
@@ -357,10 +257,10 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
[DispatchNodeResponseKeyEnum.nodeDispatchUsages]: [
{
moduleName: name,
totalPoints: externalProvider.openaiAccount?.key ? 0 : totalPoints,
totalPoints: points,
model: modelName,
inputTokens: inputTokens,
outputTokens: outputTokens
inputTokens: usage.inputTokens,
outputTokens: usage.outputTokens
}
],
[DispatchNodeResponseKeyEnum.toolResponses]: answerText
@@ -559,66 +459,3 @@ async function getChatMessages({
filterMessages
};
}
async function streamResponse({
res,
stream,
workflowStreamResponse,
aiChatReasoning,
parseThinkTag,
isResponseAnswerText,
retainDatasetCite = true
}: {
res: NextApiResponse;
stream: StreamChatType;
workflowStreamResponse?: WorkflowResponseType;
aiChatReasoning?: boolean;
parseThinkTag?: boolean;
isResponseAnswerText?: boolean;
retainDatasetCite: boolean;
}) {
const write = responseWriteController({
res,
readStream: stream
});
const { parsePart, getResponseData, updateFinishReason } = parseLLMStreamResponse();
for await (const part of stream) {
if (res.closed) {
stream.controller?.abort();
updateFinishReason('close');
break;
}
const { reasoningContent, responseContent } = parsePart({
part,
parseThinkTag,
retainDatasetCite
});
if (aiChatReasoning && reasoningContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
reasoning_content: reasoningContent
})
});
}
if (isResponseAnswerText && responseContent) {
workflowStreamResponse?.({
write,
event: SseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: responseContent
})
});
}
}
const { reasoningContent: reasoning, content: answer, finish_reason, usage } = getResponseData();
return { answer, reasoning, finish_reason, usage };
}
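Net effect of this file's refactor: the chat node no longer formats a request body, calls createChatCompletion, and post-processes the stream itself; createLLMResponse owns that pipeline and reports back through callbacks. A minimal usage sketch, hedged to the fields visible in this diff (the import path is abbreviated and the helper function is hypothetical):

import { createLLMResponse } from '../../../ai/llm/request';

async function askModel(question: string) {
  const { answerText, reasoningText, finish_reason, usage } = await createLLMResponse({
    body: {
      model: 'gpt-4o-mini', // hypothetical model id
      stream: true,
      messages: [{ role: 'user', content: question }],
      temperature: 0.1
    },
    onStreaming({ text }) {
      process.stdout.write(text); // forward tokens as they arrive
    }
  });
  return { answerText, reasoningText, finish_reason, tokens: usage };
}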


@@ -1,11 +1,6 @@
import { chats2GPTMessages } from '@fastgpt/global/core/chat/adapt';
import {
countGptMessagesTokens,
countPromptTokens
} from '../../../../common/string/tiktoken/index';
import type { ChatItemType } from '@fastgpt/global/core/chat/type.d';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { createChatCompletion } from '../../../ai/config';
import type { ClassifyQuestionAgentItemType } from '@fastgpt/global/core/workflow/template/system/classifyQuestion/type';
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
@@ -18,10 +13,9 @@ import { getHistories } from '../utils';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type';
import { getHandleId } from '@fastgpt/global/core/workflow/utils';
import { loadRequestMessages } from '../../../chat/utils';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../../../ai/utils';
import { addLog } from '../../../../common/system/log';
import { ModelTypeEnum } from '../../../../../global/core/ai/model';
import { createLLMResponse } from '../../../ai/llm/request';
type Props = ModuleDispatchProps<{
[NodeInputKeyEnum.aiModel]: string;
@@ -73,8 +67,7 @@ export const dispatchClassifyQuestion = async (props: Props): Promise<CQResponse
const { totalPoints, modelName } = formatModelChars2Points({
model: cqModel.model,
inputTokens: inputTokens,
outputTokens: outputTokens,
modelType: ModelTypeEnum.llm
outputTokens: outputTokens
});
return {
@@ -147,24 +140,19 @@ const completions = async ({
]
}
];
const requestMessages = await loadRequestMessages({
messages: chats2GPTMessages({ messages, reserveId: false }),
useVision: false
});
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: cqModel.model,
temperature: 0.01,
messages: requestMessages,
stream: true
},
cqModel
),
const {
answerText: answer,
usage: { inputTokens, outputTokens }
} = await createLLMResponse({
body: {
model: cqModel.model,
temperature: 0.01,
messages: chats2GPTMessages({ messages, reserveId: false }),
stream: true
},
userKey: externalProvider.openaiAccount
});
const { text: answer, usage } = await formatLLMResponse(response);
// console.log(JSON.stringify(chats2GPTMessages({ messages, reserveId: false }), null, 2));
@@ -178,8 +166,8 @@ const completions = async ({
}
return {
inputTokens: usage?.prompt_tokens || (await countGptMessagesTokens(requestMessages)),
outputTokens: usage?.completion_tokens || (await countPromptTokens(answer)),
inputTokens,
outputTokens,
arg: { type: id }
};
};


@@ -1,13 +1,7 @@
import { chats2GPTMessages } from '@fastgpt/global/core/chat/adapt';
import { filterGPTMessageByMaxContext, loadRequestMessages } from '../../../chat/utils';
import { filterGPTMessageByMaxContext } from '../../../ai/llm/utils';
import type { ChatItemType } from '@fastgpt/global/core/chat/type.d';
import {
countMessagesTokens,
countGptMessagesTokens,
countPromptTokens
} from '../../../../common/string/tiktoken/index';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { createChatCompletion } from '../../../ai/config';
import type { ContextExtractAgentItemType } from '@fastgpt/global/core/workflow/template/system/contextExtract/type';
import type { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import {
@@ -29,12 +23,12 @@
} from '@fastgpt/global/core/ai/type';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import { type DispatchNodeResultType } from '@fastgpt/global/core/workflow/runtime/type';
import { llmCompletionsBodyFormat, formatLLMResponse } from '../../../ai/utils';
import { ModelTypeEnum } from '../../../../../global/core/ai/model';
import {
getExtractJsonPrompt,
getExtractJsonToolPrompt
} from '@fastgpt/global/core/ai/prompt/agent';
import { createLLMResponse } from '../../../ai/llm/request';
type Props = ModuleDispatchProps<{
[NodeInputKeyEnum.history]?: ChatItemType[];
@@ -128,8 +122,7 @@ export async function dispatchContentExtract(props: Props): Promise<Response> {
const { totalPoints, modelName } = formatModelChars2Points({
model: extractModel.model,
inputTokens: inputTokens,
outputTokens: outputTokens,
modelType: ModelTypeEnum.llm
outputTokens: outputTokens
});
return {
@@ -231,10 +224,6 @@ const toolChoice = async (props: ActionProps) => {
messages: adaptMessages,
maxContext: extractModel.maxContext
});
const requestMessages = await loadRequestMessages({
messages: filterMessages,
useVision: false
});
const schema = getJsonSchema(props);
@@ -253,23 +242,22 @@ const toolChoice = async (props: ActionProps) => {
}
];
const body = llmCompletionsBodyFormat(
{
stream: true,
model: extractModel.model,
temperature: 0.01,
messages: requestMessages,
tools,
tool_choice: { type: 'function', function: { name: agentFunName } }
},
extractModel
);
const { response } = await createChatCompletion({
const body = {
stream: true,
model: extractModel.model,
temperature: 0.01,
messages: filterMessages,
tools,
tool_choice: { type: 'function', function: { name: agentFunName } }
} as const;
const {
answerText: text,
toolCalls,
usage: { inputTokens, outputTokens }
} = await createLLMResponse({
body,
userKey: externalProvider.openaiAccount
});
const { text, toolCalls, usage } = await formatLLMResponse(response);
const arg: Record<string, any> = (() => {
try {
@@ -289,8 +277,6 @@ const toolChoice = async (props: ActionProps) => {
}
];
const inputTokens = usage?.prompt_tokens || (await countGptMessagesTokens(filterMessages, tools));
const outputTokens = usage?.completion_tokens || (await countGptMessagesTokens(AIMessages));
return {
inputTokens,
outputTokens,
@@ -336,26 +322,19 @@ const completions = async (props: ActionProps) => {
]
}
];
const requestMessages = await loadRequestMessages({
messages: chats2GPTMessages({ messages, reserveId: false }),
useVision: false
});
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: extractModel.model,
temperature: 0.01,
messages: requestMessages,
stream: true
},
extractModel
),
const {
answerText: answer,
usage: { inputTokens, outputTokens }
} = await createLLMResponse({
body: {
model: extractModel.model,
temperature: 0.01,
messages: chats2GPTMessages({ messages, reserveId: false }),
stream: true
},
userKey: externalProvider.openaiAccount
});
const { text: answer, usage } = await formatLLMResponse(response);
const inputTokens = usage?.prompt_tokens || (await countMessagesTokens(messages));
const outputTokens = usage?.completion_tokens || (await countPromptTokens(answer));
// parse response
const jsonStr = sliceJsonStr(answer);


@@ -0,0 +1,75 @@
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { dispatchAppRequest } from './abandoned/runApp';
import { dispatchClassifyQuestion } from './ai/classifyQuestion';
import { dispatchContentExtract } from './ai/extract';
import { dispatchRunTools } from './ai/agent/index';
import { dispatchStopToolCall } from './ai/agent/stopTool';
import { dispatchToolParams } from './ai/agent/toolParams';
import { dispatchChatCompletion } from './ai/chat';
import { dispatchCodeSandbox } from './tools/codeSandbox';
import { dispatchDatasetConcat } from './dataset/concat';
import { dispatchDatasetSearch } from './dataset/search';
import { dispatchSystemConfig } from './init/systemConfig';
import { dispatchWorkflowStart } from './init/workflowStart';
import { dispatchFormInput } from './interactive/formInput';
import { dispatchUserSelect } from './interactive/userSelect';
import { dispatchLoop } from './loop/runLoop';
import { dispatchLoopEnd } from './loop/runLoopEnd';
import { dispatchLoopStart } from './loop/runLoopStart';
import { dispatchRunPlugin } from './plugin/run';
import { dispatchRunAppNode } from './child/runApp';
import { dispatchPluginInput } from './plugin/runInput';
import { dispatchPluginOutput } from './plugin/runOutput';
import { dispatchRunTool } from './child/runTool';
import { dispatchAnswer } from './tools/answer';
import { dispatchCustomFeedback } from './tools/customFeedback';
import { dispatchHttp468Request } from './tools/http468';
import { dispatchQueryExtension } from './tools/queryExternsion';
import { dispatchReadFiles } from './tools/readFiles';
import { dispatchIfElse } from './tools/runIfElse';
import { dispatchLafRequest } from './tools/runLaf';
import { dispatchUpdateVariable } from './tools/runUpdateVar';
import { dispatchTextEditor } from './tools/textEditor';
export const callbackMap: Record<FlowNodeTypeEnum, Function> = {
[FlowNodeTypeEnum.workflowStart]: dispatchWorkflowStart,
[FlowNodeTypeEnum.answerNode]: dispatchAnswer,
[FlowNodeTypeEnum.chatNode]: dispatchChatCompletion,
[FlowNodeTypeEnum.datasetSearchNode]: dispatchDatasetSearch,
[FlowNodeTypeEnum.datasetConcatNode]: dispatchDatasetConcat,
[FlowNodeTypeEnum.classifyQuestion]: dispatchClassifyQuestion,
[FlowNodeTypeEnum.contentExtract]: dispatchContentExtract,
[FlowNodeTypeEnum.httpRequest468]: dispatchHttp468Request,
[FlowNodeTypeEnum.appModule]: dispatchRunAppNode,
[FlowNodeTypeEnum.pluginModule]: dispatchRunPlugin,
[FlowNodeTypeEnum.pluginInput]: dispatchPluginInput,
[FlowNodeTypeEnum.pluginOutput]: dispatchPluginOutput,
[FlowNodeTypeEnum.queryExtension]: dispatchQueryExtension,
[FlowNodeTypeEnum.agent]: dispatchRunTools,
[FlowNodeTypeEnum.stopTool]: dispatchStopToolCall,
[FlowNodeTypeEnum.toolParams]: dispatchToolParams,
[FlowNodeTypeEnum.lafModule]: dispatchLafRequest,
[FlowNodeTypeEnum.ifElseNode]: dispatchIfElse,
[FlowNodeTypeEnum.variableUpdate]: dispatchUpdateVariable,
[FlowNodeTypeEnum.code]: dispatchCodeSandbox,
[FlowNodeTypeEnum.textEditor]: dispatchTextEditor,
[FlowNodeTypeEnum.customFeedback]: dispatchCustomFeedback,
[FlowNodeTypeEnum.readFiles]: dispatchReadFiles,
[FlowNodeTypeEnum.userSelect]: dispatchUserSelect,
[FlowNodeTypeEnum.loop]: dispatchLoop,
[FlowNodeTypeEnum.loopStart]: dispatchLoopStart,
[FlowNodeTypeEnum.loopEnd]: dispatchLoopEnd,
[FlowNodeTypeEnum.formInput]: dispatchFormInput,
[FlowNodeTypeEnum.tool]: dispatchRunTool,
// none
[FlowNodeTypeEnum.systemConfig]: dispatchSystemConfig,
[FlowNodeTypeEnum.pluginConfig]: () => Promise.resolve(),
[FlowNodeTypeEnum.emptyNode]: () => Promise.resolve(),
[FlowNodeTypeEnum.globalVariable]: () => Promise.resolve(),
[FlowNodeTypeEnum.comment]: () => Promise.resolve(),
[FlowNodeTypeEnum.toolSet]: () => Promise.resolve(),
// @deprecated
[FlowNodeTypeEnum.runApp]: dispatchAppRequest
};
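This registry gives the workflow engine a single lookup from node type to dispatch function. A sketch of the intended lookup (the runtime node shape and props here are illustrative, not the engine's real types):

async function dispatchNode(node: { flowNodeType: FlowNodeTypeEnum }, props: Record<string, any>) {
  const handler = callbackMap[node.flowNodeType];
  if (!handler) {
    return Promise.reject(new Error(`No dispatcher for node type: ${node.flowNodeType}`));
  }
  return handler(props);
}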


@@ -169,8 +169,7 @@ export async function dispatchDatasetSearch(
const { totalPoints: embeddingTotalPoints, modelName: embeddingModelName } =
formatModelChars2Points({
model: vectorModel.model,
inputTokens: embeddingTokens,
modelType: ModelTypeEnum.embedding
inputTokens: embeddingTokens
});
nodeDispatchUsages.push({
totalPoints: embeddingTotalPoints,
@@ -181,8 +180,7 @@
// Rerank
const { totalPoints: reRankTotalPoints, modelName: reRankModelName } = formatModelChars2Points({
model: rerankModelData?.model,
inputTokens: reRankInputTokens,
modelType: ModelTypeEnum.rerank
inputTokens: reRankInputTokens
});
if (usingReRank) {
nodeDispatchUsages.push({
@@ -198,8 +196,7 @@
const { totalPoints, modelName } = formatModelChars2Points({
model: queryExtensionResult.model,
inputTokens: queryExtensionResult.inputTokens,
outputTokens: queryExtensionResult.outputTokens,
modelType: ModelTypeEnum.llm
outputTokens: queryExtensionResult.outputTokens
});
nodeDispatchUsages.push({
totalPoints,
@@ -222,8 +219,7 @@
const { totalPoints, modelName } = formatModelChars2Points({
model: deepSearchResult.model,
inputTokens: deepSearchResult.inputTokens,
outputTokens: deepSearchResult.outputTokens,
modelType: ModelTypeEnum.llm
outputTokens: deepSearchResult.outputTokens
});
nodeDispatchUsages.push({
totalPoints,

File diff suppressed because it is too large

@@ -45,8 +45,7 @@ export const dispatchQueryExtension = async ({
const { totalPoints, modelName } = formatModelChars2Points({
model: queryExtensionModel.model,
inputTokens,
outputTokens,
modelType: ModelTypeEnum.llm
outputTokens
});
const set = new Set<string>();


@@ -3,7 +3,6 @@
"version": "1.0.0",
"type": "module",
"dependencies": {
"@fastgpt-sdk/plugin": "^0.1.9",
"@fastgpt/global": "workspace:*",
"@modelcontextprotocol/sdk": "^1.12.1",
"@node-rs/jieba": "2.0.1",


@@ -88,6 +88,13 @@ export const authAppByTmbId = async ({
};
}
if (app.favourite || app.quick) {
return {
...app,
permission: new AppPermission({ isOwner: false, role: ReadRoleVal })
};
}
const isOwner = tmbPer.isOwner || String(app.tmbId) === String(tmbId);
const { Per } = await (async () => {


@@ -218,7 +218,6 @@ export const pushLLMTrainingUsage = async ({
// Compute points
const { totalPoints } = formatModelChars2Points({
model,
modelType: ModelTypeEnum.llm,
inputTokens,
outputTokens
});


@@ -1,17 +1,14 @@
import { findAIModel } from '../../../core/ai/model';
import type { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
export const formatModelChars2Points = ({
model,
inputTokens = 0,
outputTokens = 0,
modelType,
multiple = 1000
}: {
model: string;
inputTokens?: number;
outputTokens?: number;
modelType: `${ModelTypeEnum}`;
multiple?: number;
}) => {
const modelData = findAIModel(model);
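With modelType removed from the signature, billing now derives everything from the model id via findAIModel. An illustrative call under that assumption (the model id is hypothetical):

const { totalPoints, modelName } = formatModelChars2Points({
  model: 'gpt-4o-mini', // hypothetical model id, resolved internally by findAIModel
  inputTokens: 1200,
  outputTokens: 300
  // multiple defaults to 1000
});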


@@ -86,11 +86,11 @@ export const useDoc2xServer = ({ apiKey }: { apiKey: string }) => {
const uid = preupload_data.uid;
// 2. Upload file to pre-signed URL with binary stream
const blob = new Blob([fileBuffer], { type: 'application/pdf' });
const response = await axios
.put(upload_url, blob, {
.put(upload_url, fileBuffer, {
headers: {
'Content-Type': 'application/pdf'
'Content-Type': 'application/pdf',
'Content-Length': fileBuffer.length.toString()
}
})
.catch((error) => {


@@ -1,4 +1,4 @@
import createClient from '@fastgpt-sdk/plugin';
import { createClient } from '@fastgpt/global/sdk/fastgpt-plugin';
export const BASE_URL = process.env.PLUGIN_BASE_URL || '';
export const TOKEN = process.env.PLUGIN_TOKEN || '';


@@ -23,9 +23,17 @@ type Props<T = any> = {
}) => ReactElement<HTMLElement, string>;
dataList: T[];
zoom?: number;
renderInnerPlaceholder?: boolean;
};
function DndDrag<T>({ children, renderClone, onDragEndCb, dataList, zoom = 1 }: Props<T>) {
function DndDrag<T>({
children,
renderClone,
onDragEndCb,
dataList,
zoom = 1,
renderInnerPlaceholder = true
}: Props<T>) {
const [draggingItemHeight, setDraggingItemHeight] = useState(0);
const onDragStart = (start: DragStart) => {
@@ -55,7 +63,9 @@ function DndDrag<T>({ children, renderClone, onDragEndCb, dataList, zoom = 1 }:
{(provided, snapshot) => (
<>
{children({ provided, snapshot })}
{snapshot.isDraggingOver && <Box height={`${draggingItemHeight / zoom}px`} />}
{snapshot.isDraggingOver && renderInnerPlaceholder && (
<Box height={`${draggingItemHeight / zoom}px`} />
)}
</>
)}
</Droppable>
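The new renderInnerPlaceholder prop lets a caller suppress the spacer Box that DndDrag injects while dragging, for lists that manage their own spacing. A hedged usage sketch (the item type, data, and handlers are illustrative):

<DndDrag<{ id: string; label: string }>
  dataList={items}
  onDragEndCb={(result) => handleReorder(result)}
  renderInnerPlaceholder={false} // the list renders its own placeholder spacing
>
  {({ provided }) => (
    <Box ref={provided.innerRef} {...provided.droppableProps}>
      {/* row rendering elided */}
    </Box>
  )}
</DndDrag>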


@@ -214,6 +214,7 @@ export const iconPaths = {
'core/chat/sidebar/home': () => import('./icons/core/chat/sidebar/home.svg'),
'core/chat/sidebar/logout': () => import('./icons/core/chat/sidebar/logout.svg'),
'core/chat/sidebar/menu': () => import('./icons/core/chat/sidebar/menu.svg'),
'core/chat/sidebar/star': () => import('./icons/core/chat/sidebar/star.svg'),
'core/chat/speaking': () => import('./icons/core/chat/speaking.svg'),
'core/chat/stopSpeech': () => import('./icons/core/chat/stopSpeech.svg'),
'core/chat/think': () => import('./icons/core/chat/think.svg'),
@@ -413,44 +414,11 @@ export const iconPaths = {
'modal/selectSource': () => import('./icons/modal/selectSource.svg'),
'modal/setting': () => import('./icons/modal/setting.svg'),
'modal/teamPlans': () => import('./icons/modal/teamPlans.svg'),
'model/BAAI': () => import('./icons/model/BAAI.svg'),
'model/ai360': () => import('./icons/model/ai360.svg'),
'model/alicloud': () => import('./icons/model/alicloud.svg'),
'model/aws': () => import('./icons/model/aws.svg'),
'model/azure': () => import('./icons/model/azure.svg'),
'model/baichuan': () => import('./icons/model/baichuan.svg'),
'model/chatglm': () => import('./icons/model/chatglm.svg'),
'model/claude': () => import('./icons/model/claude.svg'),
'model/cloudflare': () => import('./icons/model/cloudflare.svg'),
'model/cohere': () => import('./icons/model/cohere.svg'),
'model/coze': () => import('./icons/model/coze.svg'),
'model/deepseek': () => import('./icons/model/deepseek.svg'),
'model/doubao': () => import('./icons/model/doubao.svg'),
'model/ernie': () => import('./icons/model/ernie.svg'),
'model/fishaudio': () => import('./icons/model/fishaudio.svg'),
'model/gemini': () => import('./icons/model/gemini.svg'),
'model/grok': () => import('./icons/model/grok.svg'),
'model/groq': () => import('./icons/model/groq.svg'),
'model/huggingface': () => import('./icons/model/huggingface.svg'),
'model/hunyuan': () => import('./icons/model/hunyuan.svg'),
'model/intern': () => import('./icons/model/intern.svg'),
'model/jina': () => import('./icons/model/jina.svg'),
'model/meta': () => import('./icons/model/meta.svg'),
'model/minimax': () => import('./icons/model/minimax.svg'),
'model/mistral': () => import('./icons/model/mistral.svg'),
'model/moka': () => import('./icons/model/moka.svg'),
'model/moonshot': () => import('./icons/model/moonshot.svg'),
'model/novita': () => import('./icons/model/novita.svg'),
'model/ollama': () => import('./icons/model/ollama.svg'),
'model/openai': () => import('./icons/model/openai.svg'),
'model/openrouter': () => import('./icons/model/openrouter.svg'),
'model/ppio': () => import('./icons/model/ppio.svg'),
'model/qwen': () => import('./icons/model/qwen.svg'),
'model/siliconflow': () => import('./icons/model/siliconflow.svg'),
'model/sparkDesk': () => import('./icons/model/sparkDesk.svg'),
'model/stepfun': () => import('./icons/model/stepfun.svg'),
'model/vertexai': () => import('./icons/model/vertexai.svg'),
'model/yi': () => import('./icons/model/yi.svg'),
more: () => import('./icons/more.svg'),
moreLine: () => import('./icons/moreLine.svg'),
optimizer: () => import('./icons/optimizer.svg'),


@@ -0,0 +1,5 @@
<svg viewBox="0 0 20 20" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M9.82407 2.66888C10.195 1.64163 11.6478 1.64162 12.0187 2.66888L13.6696 7.24094L18.2167 8.90465C19.2378 9.27823 19.2378 10.7223 18.2167 11.0959L13.6696 12.7596L12.0187 17.3317C11.6478 18.3589 10.195 18.3589 9.82407 17.3317L8.17318 12.7596L3.62605 11.0959C2.60499 10.7223 2.60498 9.27824 3.62605 8.90465L8.17318 7.24094L9.82407 2.66888ZM10.9214 4.53733L9.66151 8.02652C9.54427 8.35121 9.28925 8.60731 8.96506 8.72593L5.48208 10.0003L8.96506 11.2746C9.28925 11.3933 9.54427 11.6494 9.66151 11.974L10.9214 15.4632L12.1813 11.974C12.2985 11.6494 12.5535 11.3933 12.8777 11.2746L16.3607 10.0003L12.8777 8.72593C12.5535 8.60731 12.2985 8.35121 12.1813 8.02652L10.9214 4.53733Z"/>
<path d="M3.49398 2.22273C3.60136 1.93252 4.01183 1.93252 4.11921 2.22273L4.45364 3.1265C4.4874 3.21774 4.55934 3.28968 4.65058 3.32344L5.55435 3.65786C5.84455 3.76525 5.84455 4.17571 5.55435 4.2831L4.65058 4.61752C4.55934 4.65128 4.4874 4.72322 4.45364 4.81446L4.11921 5.71823C4.01183 6.00844 3.60136 6.00844 3.49398 5.71823L3.15955 4.81446C3.12579 4.72322 3.05385 4.65129 2.96261 4.61752L2.05884 4.2831C1.76864 4.17571 1.76863 3.76525 2.05884 3.65786L2.96261 3.32344C3.05385 3.28968 3.12579 3.21774 3.15955 3.1265L3.49398 2.22273Z"/>
<path d="M3.49398 14.0651C3.60136 13.7749 4.01183 13.7749 4.11921 14.0651L4.45364 14.9689C4.4874 15.0601 4.55934 15.1321 4.65058 15.1658L5.55435 15.5002C5.84455 15.6076 5.84455 16.0181 5.55435 16.1255L4.65058 16.4599C4.55934 16.4937 4.4874 16.5656 4.45364 16.6568L4.11921 17.5606C4.01183 17.8508 3.60136 17.8508 3.49398 17.5606L3.15955 16.6568C3.12579 16.5656 3.05385 16.4937 2.96261 16.4599L2.05884 16.1255C1.76864 16.0181 1.76863 15.6076 2.05884 15.5002L2.96261 15.1658C3.05385 15.1321 3.12579 15.0601 3.15955 14.9689L3.49398 14.0651Z"/>
</svg>


@@ -1,3 +0,0 @@
<svg viewBox="0 0 48 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M17.8421 18.067L13.3433 29.38H17.1204L19.8133 21.8408L22.501 29.38H26.3705L21.8759 18.067H17.8421ZM43.9918 29.3552H47.6953V18.0419H43.9918V29.3552ZM10.4405 24.3003C10.0338 23.8233 9.568 23.4314 8.70827 23.2344C9.27395 23.0491 9.53849 22.8553 9.82969 22.5371C10.2706 22.0593 10.4939 21.4839 10.4939 20.8153C10.4939 20.0204 10.182 19.351 9.56515 18.8106C8.94862 18.272 8.06187 18 6.9088 18H0.000355556V20.4788H6.01067C6.01067 20.4788 6.94436 20.5244 6.93084 21.4074C6.9184 22.3645 5.984 22.3543 5.984 22.3543H0V24.8418H5.5936C5.99716 24.8418 6.63893 24.7581 6.97067 25.0086C7.23307 25.2078 7.33938 25.3655 7.33938 25.6998C7.33938 26.0607 7.20924 26.3484 6.95004 26.5629C6.68764 26.7759 6.44516 26.8276 5.60249 26.8385H0.000355556V29.313H6.36907C6.64107 29.313 7.19431 29.2584 8.02738 29.1564C8.6528 29.0799 9.12035 28.9576 9.43004 28.7861C9.93244 28.5162 10.326 28.144 10.614 27.6761C10.902 27.2092 11.0464 26.6787 11.0464 26.0938C11.0464 25.3731 10.8427 24.7763 10.4405 24.3003ZM35.1371 21.7752L37.824 29.313H41.6949L37.1993 18H33.1652L28.6674 29.313H32.4434L35.1371 21.7752Z" fill="#231F20"/>
</svg>


@@ -1 +0,0 @@
<svg height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>AI360</title><path clip-rule="evenodd" d="M12 0h.018c1.473-.002 2.88.261 4.179.754C20.755 2.456 24 6.85 24 12c0 6.627-5.373 12-12 12S0 18.627 0 12 5.373 0 12 0zm8.604 18.967A11.024 11.024 0 0023.07 12c0-1.717-.39-3.344-1.089-4.794a2.59 2.59 0 01-3.214.62 6.278 6.278 0 01-1.333-.992C16.283 5.73 15.109 4.66 13.696 3.9c-3.211-1.729-6.825-1.501-9.695.447A11.033 11.033 0 00.93 12c0 1.663.367 3.241 1.024 4.657.75-.973 2.131-1.346 3.232-.71.667.384 1.257.92 1.837 1.447l.176.16c1.365 1.234 2.794 2.355 4.558 2.965 3.053 1.053 6.356.437 8.847-1.552z" fill="url(#lobe-icons-ai360-fill-0)" fill-rule="evenodd"></path><path d="M5.643 10.312c-.83.11-1.401.766-1.408 1.618a1.715 1.715 0 001.45 1.72c.805.128 1.64-.426 1.87-1.26.046-.167.076-.338.106-.51.025-.14.05-.282.084-.42.318-1.317 1.237-1.95 2.788-1.93 1.086.013 1.318.271 1.68 1.855.017.076.043.151.07.226.26.714.976 1.17 1.67 1.065a1.647 1.647 0 001.38-1.438c.083-.729-.348-1.264-1.122-1.575-.34-.136-.664-.158-.995-.141-.726.037-1.121-.36-1.339-.977a3.359 3.359 0 01-.134-.65c-.014-.093-.027-.186-.043-.278-.156-.887-.835-1.51-1.669-1.532-.791-.02-1.464.551-1.665 1.418l-.06.27-.025.117c-.355 1.636-.974 2.205-2.638 2.422z" fill="url(#lobe-icons-ai360-fill-1)"></path><path d="M18.059 13.644c.989-.206 1.577-.838 1.592-1.697.015-.83-.624-1.582-1.46-1.724-.77-.13-1.599.383-1.844 1.18-.069.22-.117.448-.165.676-.06.29-.122.58-.225.854-.367.986-1.593 1.546-2.926 1.394-.824-.095-1.106-.446-1.342-1.674-.18-.938-.864-1.535-1.681-1.467-.85.07-1.515.829-1.468 1.673.05.892.678 1.44 1.705 1.489 1.375.064 1.75.396 1.926 1.787.067.531.267.967.685 1.288 1.02.783 2.407.208 2.66-1.108l.022-.114c.152-.796.3-1.577 1.04-2.101.36-.255.761-.326 1.166-.397.105-.019.21-.037.315-.06z" fill="url(#lobe-icons-ai360-fill-2)"></path><path d="M13.83 7.961a.755.755 0 11-1.51 0 .755.755 0 011.51 0z" fill="url(#lobe-icons-ai360-fill-3)"></path><path d="M10.809 16.678a.755.755 0 100-1.511.755.755 0 000 1.51z" fill="url(#lobe-icons-ai360-fill-4)"></path><defs><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-ai360-fill-0" x1="12" x2="12" y1="0" y2="24"><stop stop-color="#12B7FA"></stop><stop offset="1" stop-color="#006ffb"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-ai360-fill-1" x1="11.943" x2="11.943" y1="6.085" y2="17.778"><stop stop-color="#006ffb"></stop><stop offset="1" stop-color="#12B7FA"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-ai360-fill-2" x1="11.943" x2="11.943" y1="6.085" y2="17.778"><stop stop-color="#006ffb"></stop><stop offset="1" stop-color="#12B7FA"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-ai360-fill-3" x1="11.943" x2="11.943" y1="6.085" y2="17.778"><stop stop-color="#006ffb"></stop><stop offset="1" stop-color="#12B7FA"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-ai360-fill-4" x1="11.943" x2="11.943" y1="6.085" y2="17.778"><stop stop-color="#006ffb"></stop><stop offset="1" stop-color="#12B7FA"></stop></linearGradient></defs></svg>


@@ -1 +0,0 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1735795333775" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="4413" xmlns:xlink="http://www.w3.org/1999/xlink" width="64" height="64"><path d="M512 64a448 448 0 1 1 0 896A448 448 0 0 1 512 64z" fill="#FF6A00" p-id="4414"></path><path d="M324.8 602.624a26.752 26.752 0 0 1-21.312-25.92v-142.72a27.712 27.712 0 0 1 21.376-25.984l132.416-28.672 13.952-56.896H317.312a97.6 97.6 0 0 0-98.24 96.96v169.344c0.384 54.08 44.16 97.856 98.24 98.176h153.92l-13.888-56.512-132.544-27.776zM710.4 322.432c54.016 0.128 97.92 43.584 98.56 97.6v170.176a98.368 98.368 0 0 1-98.56 98.048H555.328l14.08-56.832 132.608-28.736a27.84 27.84 0 0 0 21.376-25.92v-142.72a26.88 26.88 0 0 0-21.376-25.984l-132.544-28.8-14.08-56.832zM570.368 497.92v13.952H457.28v-13.952h113.088z" fill="#FFFFFF" p-id="4415"></path></svg>


@@ -1,6 +0,0 @@
<svg t="1710841200339" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1550"
width="128" height="128">
<path
d="M441.08001533 510.85997782V108.53819562c0-20.25527787 0-20.30830215 20.57342359-20.33481431H649.86314777c16.91474774 0 17.5510392 0.42419429 17.55103921 17.07382061 0 178.50626422-0.05302429 357.03904059-0.13256073 535.6513534 0 90.06175249 0.47721859 180.12350499 0.66280361 270.15874534 0 20.52039931 0 20.62644787-21.60739721 20.65296002h-188.20970885c-16.57008987 0-16.88823559-0.29163359-16.88823562-16.17240772V510.80695353l-0.15907285 0.05302429zM34.78141155 936.19430083c8.56342244-16.96777203 15.05889769-30.3829168 22.00507936-43.63898869 21.95205507-42.63152723 44.38132873-85.1039816 65.77662877-128.00063028 3.60565156-7.26432739 4.2949673-16.33148058 4.32147945-24.57675731 0.26512144-147.03634954 0.26512144-294.01967477 0-440.94997572-0.15907286-16.11938343 3.44657869-32.07969399 10.49880894-46.68788524a10636.14185068 10636.14185068 0 0 0 71.2381304-152.33877828c3.92379728-8.56342244 9.17320175-12.32814687 18.98269496-12.11604972 33.45832548 0.39768215 66.91665095 0.13256073 100.37497644 0.21209715 17.20638133 0 17.57755134 0.34465787 17.57755134 16.96777204v468.81423886c0 56.12620844 0.45070644 112.25241685-0.23860931 168.37862529-0.21209715 10.79044253-2.65121438 21.44832433-7.2113031 31.28432968-22.58834652 49.04746604-46.39625165 97.59120133-69.35576818 146.50610666-5.30242877 11.26766112-12.30163473 15.85426198-25.87585235 15.69518912-61.45514934-0.53024288-122.96332295 0.21209715-184.39196015 0.47721859L34.78141155 936.19430083zM762.14207678 633.13398503V350.99175069c0-18.66454924 0-18.92967068 19.56596212-18.95618283 62.72773223-0.10604857 125.45546447-0.10604857 188.236221 0 18.31989136 0 18.87664639 0.79536432 18.87664639 18.55850067L989.21858845 913.73851504c0 4.02984586-0.26512144 8.05969171-0.42419431 12.11604971-0.26512144 6.8401331-3.60565156 10.02159036-11.02905181 10.02159035-68.13620958 0-136.27241915 0.10604857-204.35560443 0.31814573-7.34386383 0-10.92300324-2.78377509-10.73741824-9.96856606V633.13398503h-0.53024288zM874.68612722 260.82394962c-32.98110689 0-66.01523807-0.39768215-98.99634496 0.21209715-10.63136967 0.185585-13.65375406-3.44657869-13.60072977-13.25607191 0.31814573-47.58929812 0-95.20510839 0-142.82091866 0-15.85426198 0.90141289-16.6496263 16.83521131-16.70265059 65.59104377-0.13256073 131.18208753 0 196.79964344-0.34465788 9.80949321 0 13.09699904 3.15494512 13.04397477 12.61978046-0.185585 47.64232241 0.26512144 95.25813268 0.34465786 142.84743081 0 16.09287129-0.90141289 16.78218703-17.97523349 16.80869917h-96.50420346l0.07953644 0.63629145z"
fill="#E06639" p-id="1551"></path>
</svg>


@@ -1,9 +0,0 @@
<svg t="1710841195972" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1400"
width="128" height="128">
<path
d="M164.864 616.704c0-57.856 34.56-115.925333 102.186667-161.706667 67.413333-45.653333 163.285333-75.52 271.786666-75.52 108.416 0 204.373333 29.866667 271.701334 75.52 34.133333 23.04 59.818667 49.322667 76.928 77.056 20.778667-37.973333 29.738667-78.293333 24.917333-118.485333a18.346667 18.346667 0 0 1 0.725333-7.765333 412.714667 412.714667 0 0 0-44.586666-34.773334C781.952 312.32 665.173333 277.76 538.794667 277.76S295.68 312.32 209.109333 371.029333c-86.314667 58.453333-146.944 144.426667-146.944 245.674667s60.586667 187.264 146.944 245.76c86.528 58.581333 203.264 93.226667 329.685334 93.226667s243.2-34.645333 329.728-93.269334c86.314667-58.453333 146.901333-144.469333 146.901333-245.717333 0-64.384-24.490667-122.581333-64.512-171.477333-2.133333 45.098667-18.133333 88.405333-44.757333 127.786666 4.394667 14.464 6.613333 29.098667 6.613333 43.690667 0 57.898667-34.602667 115.968-102.229333 161.749333-67.370667 45.653333-163.285333 75.52-271.744 75.52s-204.373333-29.866667-271.744-75.52c-67.626667-45.781333-102.186667-103.850667-102.186667-161.706666z"
fill="#3762FF" p-id="1401"></path>
<path
d="M164.010667 498.517333c-27.392 77.013333-20.906667 146.432 14.72 196.266667 35.626667 49.834667 99.584 78.933333 182.016 78.933333 82.261333-0.042667 178.133333-29.610667 266.197333-91.392s148.053333-141.525333 175.402667-218.368c27.392-77.013333 20.906667-146.432-14.677334-196.266666-35.626667-49.834667-99.584-78.933333-182.016-78.933334-82.261333 0-178.133333 29.610667-266.24 91.392-88.021333 61.738667-148.053333 141.525333-175.36 218.368z m-98.474667-34.389333C100.864 364.8 175.488 268.373333 279.04 195.669333 382.72 123.008 499.328 85.333333 605.610667 85.333333c106.197333 0 206.890667 38.058667 267.306666 122.624 60.416 84.522667 63.232 191.274667 27.946667 290.432-35.328 99.328-109.952 195.754667-213.589333 268.416-103.594667 72.704-220.202667 110.378667-326.528 110.378667-106.154667 0-206.848-38.058667-267.264-122.581333-60.458667-84.522667-63.232-191.274667-27.946667-290.432z"
fill="#1041F3" p-id="1402"></path>
</svg>


@@ -1,7 +0,0 @@
<svg t="1710840533172" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1458"
width="128" height="128">
<path d="M512 512m-512 0a512 512 0 1 0 1024 0 512 512 0 1 0-1024 0Z" fill="#F3DFBC" p-id="1459"></path>
<path
d="M653.443072 286.424064h-97.913856l178.556928 451.150848H832L653.443072 286.424064z m-282.886144 0L192 737.575936h99.84512l36.514816-94.741504h186.805248l36.514816 94.741504h99.84512L472.968192 286.424064H370.556928z m-9.89696 272.622592l61.103104-158.55104 61.103104 158.55104H360.659968z"
fill="#20201C" p-id="1460"></path>
</svg>


@@ -1,6 +0,0 @@
<svg t="1719124967762" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1456"
width="200" height="200">
<path
d="M320.512 804.864C46.08 676.864 77.824 274.432 362.496 274.432c34.816 0 86.016-7.168 114.688-14.336 59.392-16.384 99.328-10.24 69.632 10.24-9.216 7.168-15.36 19.456-13.312 28.672 5.12 20.48 158.72 161.792 177.152 161.792 27.648 0 27.648-32.768 1.024-57.344-43.008-38.912-55.296-90.112-35.84-141.312l9.216-26.624 54.272 52.224c35.84 34.816 58.368 49.152 68.608 44.032 9.216-4.096 30.72-9.216 49.152-12.288 18.432-2.048 38.912-10.24 45.056-18.432 19.456-23.552 43.008-17.408 35.84 9.216-3.072 12.288-6.144 27.648-6.144 34.816 0 23.552-62.464 83.968-92.16 90.112-23.552 5.12-30.72 12.288-30.72 30.72 0 46.08-38.912 148.48-75.776 198.656l-37.888 51.2 36.864 15.36c56.32 23.552 40.96 41.984-37.888 43.008-43.008 1.024-75.776 7.168-92.16 18.432-68.608 45.056-198.656 50.176-281.6 12.288z m251.904-86.016c-24.576-27.648-66.56-79.872-93.184-117.76-69.632-98.304-158.72-150.528-256-150.528-37.888 0-38.912 1.024-38.912 34.816 0 94.208 99.328 240.64 175.104 257.024 38.912 9.216 59.392-7.168 39.936-29.696-7.168-9.216-10.24-23.552-6.144-31.744 5.12-14.336 9.216-14.336 38.912 1.024 18.432 9.216 50.176 29.696 69.632 45.056 35.84 27.648 58.368 37.888 96.256 39.936 14.336 1.024 9.216-10.24-25.6-48.128z m88.064-145.408c8.192-13.312-31.744-78.848-56.32-92.16-10.24-6.144-26.624-10.24-34.816-10.24-23.552 0-20.48 27.648 4.096 33.792 13.312 3.072 20.48 14.336 20.48 29.696 0 13.312 5.12 29.696 12.288 36.864 15.36 15.36 46.08 16.384 54.272 2.048z"
fill="#4D6BFE" p-id="1457"></path>
</svg>


@@ -1 +0,0 @@
<svg t="1734679654271" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="7734" width="64" height="64"><path d="M604.814264 81.402675a1319.065676 1319.065676 0 0 1 36.517088 237.361069c-88.649795-11.49071-173.856333 0.427554-255.619614 35.756315a427.093207 427.093207 0 0 0-59.340267 34.23477 450.161355 450.161355 0 0 0-66.947994 49.450223C170.040297 521.839834 103.600499 620.994377 60.10104 735.667162c-0.715126-140.51471 0.045646-281.003554 2.282318-421.468054C76.803043 215.078039 122.195305 133.675364 198.561664 69.991085c95.178746-71.853456 199.150502-87.575584 311.916791-47.167905 35.223774 13.789765 66.669551 33.315756 94.335809 58.579495z" fill="#A569FF" p-id="7735"></path><path d="M604.814264 81.402675a56768.782835 56768.782835 0 0 0 346.912333 352.998514c-94.145617-61.14482-197.102502-99.183453-308.8737-114.115899-1.014871 0-1.521545-0.506675-1.521545-1.521546a1319.065676 1319.065676 0 0 0-36.517088-237.361069z" fill="#2038FB" p-id="7736"></path><path d="M641.331352 318.763744c0 1.014871 0.506675 1.521545 1.521545 1.521546 7.105617 102.384785 3.301753 204.328321-11.411589 305.830609-13.789765 84.228184-40.416808 163.855215-79.88113 238.882615a499.523328 499.523328 0 0 1-103.465081 35.756315c-131.312404 17.313664-206.122223-39.744285-224.427935-171.173848-8.108315-103.991536 14.20819-200.863762 66.947994-290.615156a679.070241 679.070241 0 0 1 35.756315-50.210996 427.093207 427.093207 0 0 1 59.340267-34.23477c81.763281-35.328761 166.969819-47.247025 255.619614-35.756315z" fill="#FEFEFE" p-id="7737"></path><path d="M642.852897 320.28529c111.771198 14.932446 214.728083 52.971079 308.8737 114.115899 1.240059 2.034306 3.015703 3.555851 5.325409 4.564636 6.883471 14.958312 10.18066 30.681961 9.890045 47.167905-3.882984 72.767905-26.95874 138.70255-69.230313 197.800891-44.500636 59.716089-96.993949 111.196053-157.47994 154.43685-86.741777 63.043709-180.570912 113.508802-281.485884 151.393759 36.800095-37.296119 67.739198-78.886039 92.814264-124.766716 39.464321-75.0274 66.091364-154.654431 79.88113-238.882615 14.713343-101.502288 18.517207-203.445825 11.411589-305.830609z" fill="#37E0BE" p-id="7738"></path><path d="M326.371471 388.754829a679.070241 679.070241 0 0 0-35.756315 50.210996c-52.739804 89.751394-75.056309 186.62362-66.947994 290.615156 18.305712 131.429563 93.11553 188.487513 224.427935 171.173848a499.523328 499.523328 0 0 0 103.465081-35.756315c-25.075067 45.880678-56.014169 87.470597-92.814264 124.766716-39.120452 18.194639-80.202175 29.099554-123.245171 32.713225-87.99553 4.042746-161.790478-26.134062-221.384844-90.531947-45.209676-57.563103-63.215643-122.989551-54.014859-196.279346 43.499459-114.672785 109.939257-213.827328 199.322437-297.46211a450.161355 450.161355 0 0 1 66.947994-49.450223z" fill="#1F37FB" p-id="7739"></path></svg>


@@ -1,11 +0,0 @@
<svg t="1710923402682" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1121"
width="128" height="128">
<path
d="M68.48129212 644.59999627c-0.15710901-120.97393023-0.31421799-241.94786048-0.314218-362.92179071 0-15.86800901 8.16966803-28.27962006 21.83815104-36.29217909 48.5466811-28.43672906 97.09336219-56.87345811 145.9542613-84.83886017 75.88364715-43.67630209 151.9244033-87.03838618 227.96515945-130.55757925 7.85545002-4.39905201 15.55379103-8.95521301 22.93791404-14.13981003 20.73838803-14.29691903 41.63388507-17.12488105 64.10047214-3.77061602 29.22227406 17.43909904 58.60165712 34.40687107 87.98104017 51.37464313 93.95118219 54.3597141 188.05947338 108.56231921 282.01065555 162.7649243 10.52630302 6.127251 21.20971504 11.94028402 31.57890906 18.22464405 15.08246403 9.11232202 22.93791404 22.46658704 22.93791406 40.21990407 0 157.58032731 0.31421799 315.16065462-0.15710901 472.74098195 0 16.49644503-7.69834101 30.47914606-23.25213205 39.43435907-37.07772405 21.52393302-75.25521114 41.47677609-111.54739022 63.94336312-52.1601881 32.20734506-106.67701121 59.85852912-158.68009031 92.38009218-30.95047306 19.32440705-63.47203613 36.29217908-95.05094519 54.35971411-13.98270102 8.01255901-27.80829306 16.18222702-41.63388508 24.50900405-14.76824603 8.95521301-29.69360106 9.42654002-44.77606509 0.78554499-60.17274713-34.72108908-120.34549424-69.44217814-180.67535037-104.1632672-65.98578013-38.02037807-131.97156027-75.88364715-197.80023136-113.90402523-11.94028402-6.91279601-23.72345905-13.66848304-35.1924161-21.20971504-12.25450203-8.16966803-18.22464404-20.42417004-18.22464403-34.87819806-0.15710901-37.07772405 0.15710901-74.15544815 0.314218-111.23317222 1.72819901-0.628436 1.72819901-1.72819901-0.314218-2.827962z m380.36088975-240.53387947c2.35663501-1.885308 4.71327001-3.92772502 7.22701402-5.49881501 73.52701215-42.26232108 147.05402429-84.52464218 220.73814544-126.78696327 2.98507101-1.72819901 6.91279601-1.72819901 7.06990501-6.59857801 0.15710901-5.18459701-4.39905201-5.02748801-7.22701402-6.598578-43.36208409-25.13744005-87.03838618-49.96066209-130.24336125-75.56942917-17.91042605-10.68341202-35.03530708-9.11232202-51.6888611 0.62843601-49.01800811 28.27962006-97.56468919 57.18767612-146.4255883 85.78151417-34.09265308 19.95284304-68.49952413 39.59146809-102.74928619 59.38720211-19.63862503 11.31184801-29.37938307 27.02274804-29.06516507 50.90331611 0.94265401 95.52227219 0.628436 191.04454438 0.157109 286.56681656-0.15710901 21.52393302 9.89786702 35.82085206 27.02274805 46.0329371 33.46421707 19.95284304 67.39976113 39.59146809 101.4924142 58.60165712 11.62606602 6.441469 22.46658704 14.92535503 38.17748709 19.63862503V505.08720399c6.91279601 20.89549705 19.79573404 36.29217908 39.12014107 47.60402709 29.69360106 17.28199003 67.39976113 13.35426503 92.06587419-11.15473903 28.75094705-28.43672906 30.63625506-67.08554313 13.66848301-96.15070819-17.43909904-30.32203706-46.50426407-40.06279507-79.34004515-41.31966706z m87.98104018 458.2869539c15.23957303-4.08483399 27.65118406-14.92535503 41.47677608-22.78080505 43.67630209-24.82322205 86.72416817-50.90331611 130.71468825-75.25521115 28.75094705-16.02511803 56.40213111-33.77843506 86.09573217-48.2324631 17.91042605-8.64099502 31.57890905-24.03767705 31.57890908-46.34715509 0.15710901-99.1357792 0.31421799-198.1144494 0.157109-297.25022859 0-11.94028402-5.49881501-14.92535503-14.76824605-7.54123201-28.27962006 22.15236905-61.11540111 36.76350606-91.90876517 54.67393211-58.13033011 33.93554407-116.41776923 67.55687014-174.70520834 101.33530519-9.11232202 5.18459701-12.56872002 11.15473903-12.41161104 21.83815106 0.31421799 102.59217721 
0.15710901 205.3414634 0.15710901 307.9336406-0.31421799 3.770616-1.57108999 8.16966803 3.61350701 11.62606603z"
fill="#066AF3" p-id="1122"></path>
<path d="M68.48129212 644.59999627c2.042417 1.09976302 2.042417 2.19952601 0 3.29928901v-3.29928901z" fill="#4372E0"
p-id="1123"></path>
<path
d="M383.32772875 504.93009499c-5.81303301-47.28980909 14.76824603-79.02582716 65.51445312-100.86397819 32.99289006 1.25687199 61.90094612 10.99763002 79.34004515 41.16255808 16.81066303 29.22227406 15.08246403 67.71397913-13.66848301 96.15070819-24.82322205 24.50900404-62.37227313 28.43672906-92.06587419 11.15473902-19.32440705-11.31184801-32.36445406-26.70853004-39.12014107-47.6040271z"
fill="#002A9A" p-id="1124"></path>
</svg>

@@ -1,5 +0,0 @@
<svg viewBox="0 0 128 128" fill="none" xmlns="http://www.w3.org/2000/svg">
<rect width="128" height="128" fill="white"/>
<path opacity="0.35" fill-rule="evenodd" clip-rule="evenodd" d="M117.928 62.5487C119.067 62.5487 120 63.1367 120 63.8553V68.2933C120 69.012 119.067 69.6 117.928 69.6C116.789 69.6 115.856 69.012 115.856 68.2933V63.8507C115.856 63.1367 116.789 62.5487 117.928 62.5487ZM109.696 64.2847C110.835 64.2847 111.768 64.8727 111.768 65.5913V75.368C111.768 76.082 110.835 76.67 109.696 76.67C108.553 76.67 107.624 76.082 107.624 75.3633V65.5867C107.624 64.8727 108.557 64.2847 109.696 64.2847ZM101.459 66.31C102.598 66.31 103.531 66.8933 103.531 67.612V78.602C103.531 79.3207 102.598 79.9087 101.459 79.9087C100.321 79.9087 99.3873 79.3207 99.3873 78.602V67.612C99.3873 66.8933 100.321 66.31 101.459 66.31ZM93.2273 67.9853C94.366 67.9853 95.2993 68.5687 95.2993 69.2873V80.1607C95.2993 80.8747 94.366 81.4627 93.2273 81.4627C92.0886 81.4627 91.1553 80.8747 91.1553 80.156V69.2827C91.1553 68.5687 92.0886 67.9853 93.2273 67.9853ZM84.9906 69.46C86.134 69.46 87.0626 70.048 87.0626 70.7667V80.1793C87.0626 80.898 86.1293 81.486 84.9906 81.486C83.852 81.486 82.9186 80.898 82.9186 80.1793V70.7667C82.9186 70.048 83.852 69.46 84.9906 69.46ZM76.8006 75.466C77.9393 75.466 78.8726 76.0493 78.8726 76.7727V79.7593C78.8726 80.4733 77.9393 81.0567 76.8006 81.0567C75.662 81.0567 74.7286 80.4733 74.7286 79.7547V76.768C74.7286 76.0493 75.662 75.466 76.8006 75.466Z" fill="black"/>
<path fill-rule="evenodd" clip-rule="evenodd" d="M117.928 50.6067C119.067 50.6067 120 51.19 120 51.9087V59.2587C120 59.9773 119.067 60.5653 117.928 60.5653C116.789 60.5653 115.856 59.9773 115.856 59.2587V51.9087C115.856 51.19 116.789 50.602 117.928 50.602V50.6067ZM109.696 45.6693C110.835 45.6693 111.768 46.2527 111.768 46.976V60.0893C111.768 60.8033 110.835 61.3867 109.696 61.3867C108.553 61.3867 107.624 60.8033 107.624 60.0847V46.9713C107.624 46.2527 108.557 45.6693 109.696 45.6693ZM101.459 43.812C102.598 43.812 103.531 44.4 103.531 45.1187V62.2313C103.531 62.9453 102.598 63.538 101.459 63.538C100.321 63.538 99.3873 62.9453 99.3873 62.2313V45.114C99.3873 44.4 100.321 43.812 101.459 43.812ZM93.2273 43C94.366 43 95.2993 43.588 95.2993 44.3067V64.3267C95.2993 65.0453 94.366 65.6333 93.2273 65.6333C92.0887 65.6333 91.1553 65.0453 91.1553 64.3267V44.3067C91.1553 43.588 92.0887 43 93.2273 43ZM84.9907 43.5087C86.134 43.5087 87.0627 44.0967 87.0627 44.8153V65.1387C87.0627 65.8573 86.1293 66.4453 84.9907 66.4453C83.852 66.4453 82.9187 65.8573 82.9187 65.1387V44.8107C82.9187 44.0967 83.852 43.504 84.9907 43.504V43.5087ZM76.8007 44.7827C77.9393 44.7827 78.8727 45.366 78.8727 46.0893V71.448C78.8727 72.162 77.9393 72.7547 76.8007 72.7547C75.662 72.7547 74.7287 72.1667 74.7287 71.448V46.08C74.7287 45.3613 75.662 44.7733 76.8007 44.7733V44.7827ZM68.396 46.9433C69.5347 46.9433 70.468 47.5267 70.468 48.2453V77.0013C70.468 77.72 69.5347 78.308 68.396 78.308C67.2573 78.308 66.324 77.72 66.324 77.0013V48.2453C66.324 47.5267 67.2573 46.9387 68.396 46.9387V46.9433ZM60.164 50.602C61.3027 50.602 62.236 51.19 62.236 51.9087V79.5727C62.236 80.2913 61.3027 80.8793 60.164 80.8793C59.0207 80.8793 58.092 80.2913 58.092 79.5727V51.9087C58.092 51.19 59.0253 50.602 60.164 50.602ZM51.9273 55.3713C53.0707 55.3713 53.9993 55.9593 53.9993 56.678V80.7627C53.9993 81.4813 53.066 82.0693 51.9273 82.0693C50.7887 82.0693 49.8553 81.4813 49.8553 80.7627V56.6733C49.8553 55.9547 50.7887 55.3667 51.9273 55.3667V55.3713ZM43.6953 58.666C44.834 58.666 45.7673 59.254 45.7673 59.9727V83.698C45.7673 84.412 44.834 85 43.6953 85C42.5567 85 41.6233 84.412 41.6233 83.6933V59.9727C41.6233 59.254 42.5567 58.666 43.6953 58.666ZM35.5007 61.396C36.644 61.396 37.5727 61.984 37.5727 62.7027V75.326C37.5727 76.0447 36.6393 76.6327 35.5007 76.6327C34.362 76.6327 33.4287 76.0447 33.4287 75.326V62.698C33.4287 61.984 34.362 61.396 35.5007 61.396ZM27.0213 49.0853C28.16 49.0853 29.0933 49.6733 29.0933 50.392V67.99C29.0933 68.7087 28.16 69.2967 27.0213 69.2967C25.8827 69.2967 24.9493 68.7087 24.9493 67.99V50.392C24.9493 49.6733 25.8827 49.0853 27.0213 49.0853ZM18.556 50.238C19.6947 50.238 20.6233 50.826 20.6233 51.5447V56.958C20.6233 57.672 19.69 58.26 18.556 58.26C17.4127 58.26 16.484 57.672 16.484 56.9533V51.54C16.484 50.826 17.4173 50.238 18.556 50.238ZM10.072 49.51C11.2107 49.51 12.144 50.098 12.144 50.8167V53.2573C12.144 53.9713 11.2107 54.5593 10.072 54.5593C8.93333 54.5593 8 53.9713 8 53.2527V50.8167C8 50.098 8.93333 49.51 10.072 49.51ZM35.5053 47.4473C36.644 47.4473 37.5727 48.0353 37.5727 48.754V50.5833C37.5727 51.302 36.6393 51.89 35.5007 51.89C34.362 51.89 33.4287 51.302 33.4287 50.5833V48.7493C33.4287 48.0307 34.362 47.4427 35.5007 47.4427L35.5053 47.4473Z" fill="black"/>
</svg>

@@ -1,9 +0,0 @@
<svg style="flex:none;line-height:1" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg">
<defs>
<linearGradient id="lobe-icons-gemini-fill" x1="0%" x2="68.73%" y1="100%" y2="30.395%">
<stop offset="0%" stop-color="#1C7DFF"></stop>
<stop offset="52.021%" stop-color="#1C69FF"></stop>
<stop offset="100%" stop-color="#F0DCD6"></stop>
</linearGradient>
</defs>
<path d="M12 24A14.304 14.304 0 000 12 14.304 14.304 0 0012 0a14.305 14.305 0 0012 12 14.305 14.305 0 00-12 12" fill="url(#lobe-icons-gemini-fill)" fill-rule="nonzero"></path></svg>

@@ -1,3 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1000 1000">
<rect width="1000" height="1000" fill="#000"/>
<g><polygon fill="#fff" points="226.83 411.15 501.31 803.15 623.31 803.15 348.82 411.15 226.83 411.15"></polygon><polygon fill="#fff" points="348.72 628.87 226.69 803.15 348.77 803.15 409.76 716.05 348.72 628.87"></polygon><polygon fill="#fff" points="651.23 196.85 440.28 498.12 501.32 585.29 773.31 196.85 651.23 196.85"></polygon><polygon fill="#fff" points="673.31 383.25 673.31 803.15 773.31 803.15 773.31 240.44 673.31 383.25"></polygon></g></svg>

@@ -1,3 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="none">
<path d="M10.0289 1.21117C6.70605 1.18232 3.98312 3.85333 3.95427 7.17623C3.92543 10.4991 6.59644 13.2221 9.91933 13.2509H12.0077V10.9953H10.0289C7.95213 11.0183 6.24453 9.35688 6.22146 7.2743C6.19838 5.19172 7.85983 3.48989 9.94241 3.46682H10.0289C12.1058 3.46682 13.7903 5.15134 13.8018 7.22815V12.7721C13.8018 14.8316 12.1231 16.5103 10.0693 16.5334C9.08284 16.5276 8.14251 16.1296 7.45024 15.4316L5.85225 17.0295C6.95988 18.1429 8.4598 18.7775 10.0289 18.7891H10.1097C13.3922 18.7429 16.0286 16.0777 16.0459 12.7952V7.07816C15.9652 3.81872 13.2941 1.21694 10.0289 1.21117Z" fill="#F04E35"/>
</svg>

@@ -1,15 +0,0 @@
<svg t="1710841272884" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="2684"
width="128" height="128">
<path
d="M511.968 959.936c298.688 0 447.968-200.576 447.968-448 0-247.36-149.28-447.936-448-447.936C213.28 64 64 264.576 64 511.968c0 247.392 149.248 447.968 447.968 447.968z"
fill="#FFB02E" p-id="2685"></path>
<path
d="M103.936 586.912a31.936 31.936 0 0 0-7.584 25.568 32 32 0 0 0-37.152 51.84l9.344 8a32 32 0 0 0-24.992 56.256l63.52 52.928-4.032-1.984a35.712 35.712 0 0 0-36.672 60.896C107.712 869.76 163.008 908.64 192 928c48 32 102.72 42.944 160 0 32-24 72.48-97.984 29.92-171.712-8.064-13.952-15.296-28.64-18.304-44.48-13.152-69.76-32.8-141.216-75.616-119.808-23.2 11.584-21.184 31.584-18.304 60 1.088 10.784 2.304 22.784 2.304 36l-2.56 1.28-120.384-105.376a32 32 0 0 0-45.12 3.04zM920.096 586.912c6.368 7.296 8.832 16.64 7.584 25.568a32 32 0 0 1 37.12 51.84l-9.344 8a32 32 0 0 1 25.024 56.256l-63.52 52.928 4.032-1.984a35.712 35.712 0 0 1 36.672 60.896C916.32 869.76 861.024 908.64 832 928c-48 32-102.752 42.944-160 0-32-24-72.48-97.984-29.92-171.712 8.064-13.952 15.296-28.64 18.304-44.48 13.152-69.76 32.8-141.216 75.616-119.808 23.2 11.584 21.184 31.584 18.304 60-1.088 10.784-2.304 22.784-2.304 36l2.56 1.28 120.384-105.376a32 32 0 0 1 45.12 3.04z"
fill="#FF822D" p-id="2686"></path>
<path
d="M224 464c0 44.16-28.64 80-64 80s-64-35.84-64-80 28.64-80 64-80 64 35.84 64 80zM928 464c0 44.16-28.64 80-64 80s-64-35.84-64-80 28.64-80 64-80 64 35.84 64 80z"
fill="#FF6723" p-id="2687"></path>
<path
d="M299.168 333.184c-6.72 7.296-10.24 17.024-11.744 24.928a32 32 0 0 1-62.848-12.224c2.848-14.592 9.92-36.896 27.456-55.968C270.496 269.792 298.112 256 336 256c38.24 0 65.984 14.464 84.352 34.624 17.408 19.104 24.64 41.344 27.2 55.904a32 32 0 0 1-63.072 10.944 49.472 49.472 0 0 0-11.456-23.744C367.04 327.104 356.544 320 336 320c-20.896 0-31.104 6.944-36.832 13.184zM651.2 333.184c-6.72 7.296-10.24 17.024-11.776 24.928a32 32 0 0 1-62.816-12.224c2.816-14.592 9.92-36.896 27.424-55.968C622.496 269.792 650.112 256 688 256c38.272 0 65.984 14.464 84.352 34.624 17.408 19.104 24.64 41.344 27.2 55.904a32 32 0 0 1-63.072 10.944 49.44 49.44 0 0 0-11.456-23.744C719.04 327.104 708.544 320 688 320c-20.896 0-31.072 6.944-36.8 13.184zM313.6 492.8a32 32 0 1 0-51.2 38.4c22.464 29.952 96.256 92.8 249.6 92.8s227.136-62.848 249.6-92.8a32 32 0 0 0-51.2-38.4c-9.536 12.704-63.744 67.2-198.4 67.2s-188.864-54.496-198.4-67.2z"
fill="#402A32" p-id="2688"></path>
</svg>

@@ -1 +0,0 @@
<svg t="1735012417413" class="icon" viewBox="0 0 1032 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="4293" width="64" height="64"><path d="M3.796984 0h1018.143079v1018.143079h-1018.143079z" fill="#CCCCCC" fill-opacity="0" p-id="4294"></path><path d="M448.476277 1016.17573l-13.151015-498.465883c-26.514143-9.720085 80.597691-135.736502 124.563442-124.61647 128.917065-32.400282 269.060217-237.58793 13.204043-389.757898-774.159938 0-689.33059 951.857722-124.61647 1012.840251z" fill="#B3DDF2" p-id="4295"></path><path d="M446.021067 1015.942405c-230.673041-53.028285-456.425058-375.800853-180.29617-665.504981 63.421829-55.902418-63.262744-220.942351-167.039099-129.919299-275.725873 381.3264 86.908057 796.458332 347.335269 795.42428z" fill="#0055E9" p-id="4296"></path><path d="M408.827028 536.27505c0-2.651414-6.363394 4.576341 0 0z m0 0v39.771214c0 5.302829-7.42396 40.566638-10.605657 55.679699l39.771214 84.845257s26.514143 7.954243 29.165557 10.605657 15.643344 4.507404 21.211314 5.302829l55.6797 34.468385 92.799499-13.257071 119.313642-37.1198 111.359399-95.450914 34.468386-58.331113 29.165557-148.479199-5.302829-68.936771v-60.982529l-60.982528-111.359399-31.816971-39.771214-55.6797-50.376871-47.725457-29.165557-13.257071-7.954243-15.908486-5.302828-29.165557-10.605657-92.799499-23.862729c201.507484 66.285357 217.41597 270.444255 159.084856 368.546583s-184.713426 154.736537-307.564055 143.176371l-21.211314 18.5599z" fill="#00BCFF" p-id="4297"></path><path d="M461.828799 1016.207546c180.29617 47.698943 575.192508-124.425569 570.054068-493.163053 13.267677-164.414199-140.196181-491.471452-389.757898-503.768711 278.218202 56.660723 384.508097 567.720823 23.915757 694.670538-180.338593 45.074043-246.608041-92.799499-233.324456-190.901827-185.535365 9.375401-283.955862 407.209506 29.112529 493.163053z" fill="#0055DF" p-id="4298"></path></svg>

@@ -1,20 +0,0 @@
<svg viewBox="0 0 48 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M13.6719 37.9311C13.6719 37.9311 17.7467 37.4782 17.9732 37.4027C18.1997 37.3272 22.4255 37.5537 22.4255 37.5537L25.1414 38.2332L27.6308 39.1392L28.3102 39.3656L32.0076 38.0066L36.4599 37.3271L39.4777 37.4027L41.816 37.7047L42.7976 38.0066L42.722 42.0813L40.6836 41.5529L36.7598 41.2509L34.0439 41.4773L31.7056 42.0061L29.5937 42.6853L28.0837 43.2893L24.9149 42.0813L21.0666 41.3265H16.6142L13.6719 41.8549V37.9311Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M28.0723 43.9999C21.2124 40.2227 13.9268 42.4879 13.8534 42.5111L13.0312 42.7731V37.572L13.4674 37.4315C13.7841 37.3287 21.3298 34.9694 28.6783 39.0149L28.0723 40.1179C22.175 36.8716 15.9652 38.09 14.2896 38.5095V41.0891C16.479 40.5691 22.6447 39.5727 28.6783 42.8947L28.0723 43.9979V43.9999Z" fill="#1B3882"/>
<path d="M27.9869 44.0003L27.3809 42.8976C33.4145 39.5754 39.5802 40.5715 41.7696 41.0915V38.5122C40.096 38.0906 33.8842 36.8743 27.9869 40.1208L27.3809 39.0176C34.7294 34.9721 42.2752 37.3314 42.5916 37.4342L43.028 37.5747V42.7779L42.206 42.5115C42.1344 42.4883 34.8489 40.2235 27.9869 44.0003Z" fill="#1B3882"/>
<path d="M35.8849 32.1384C35.887 31.6833 35.7276 31.2932 35.4047 30.9535C35.0859 30.6305 34.7 30.4774 34.2554 30.49C33.8003 30.5215 33.4123 30.7039 33.0852 31.0227C32.7664 31.4085 32.6112 31.8112 32.6133 32.2684C32.6112 32.7067 32.7664 33.0968 33.0789 33.4177C33.3977 33.7574 33.7877 33.9126 34.2491 33.8811C34.6979 33.8832 35.0838 33.7008 35.4005 33.3485C35.7234 32.9961 35.8828 32.5914 35.887 32.1363L35.8849 32.1384Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M23.1787 33.3504C23.5016 32.9981 23.661 32.6122 23.6589 32.155C23.661 31.6999 23.5016 31.2763 23.1829 30.9198C22.841 30.6304 22.4593 30.4941 22.0315 30.5067C21.5617 30.5549 21.1737 30.7205 20.8613 31.0058C20.5425 31.3581 20.3831 31.7775 20.3789 32.2515C20.3789 32.7066 20.5404 33.0967 20.8591 33.4196C21.1675 33.7594 21.5554 33.9125 22.0147 33.8831C22.4887 33.8684 22.8725 33.7028 23.1765 33.3504H23.1787Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M27.9635 37.9078C26.7031 37.9078 25.4658 37.8029 24.5179 37.5596C21.9677 36.9053 19.9376 35.7246 18.3123 33.9483C16.6241 32.1007 16.5171 30.3202 16.4039 28.4348C16.3891 28.1936 16.3745 27.9482 16.3556 27.6987C16.2046 25.612 15.2189 22.3509 15.2085 22.3194L15.1309 22.0636L15.2881 21.8476C16.7394 19.872 18.5283 18.4606 20.7618 17.5294C22.8128 16.6738 25.2539 16.2271 28.2257 16.1621C33.1729 15.8727 38.5669 17.93 41.054 21.0569L41.2116 21.254L40.0852 26.5075C40.0728 26.7906 39.9888 28.2691 39.5819 31.2912C39.1037 34.8312 34.7311 36.8571 32.485 37.4317C31.3442 37.7253 29.6308 37.9078 27.9614 37.9078H27.9635ZM16.3095 22.267C16.5527 23.1038 17.2867 25.7441 17.4231 27.619C17.4419 27.8728 17.4566 28.1244 17.4713 28.3698C17.5804 30.2069 17.6684 31.6582 19.1008 33.2269C20.5814 34.8459 22.4395 35.9259 24.7821 36.5257C26.7681 37.0354 30.348 36.878 32.2208 36.3957C33.8335 35.9826 38.1097 34.2314 38.5249 31.1444C38.9633 27.8958 39.0199 26.4299 39.022 26.4152V26.3691L40.062 21.5246C38.4263 18.383 37.4554 17.7392 35.3749 16.9548C33.1813 16.1264 30.5074 17.0911 28.2781 17.2232H28.2592C22.5759 17.347 18.4087 15.5245 16.3074 22.2649L16.3095 22.267Z" fill="#1B3882"/>
<path d="M39.1368 21.9314C39.5416 22.9736 39.7136 23.4958 40.0932 24.4501C40.0932 24.4501 41.0976 22.8374 41.4144 21.6839C41.7332 20.5178 41.538 19.033 41.2884 17.6447C40.6616 15.3148 40.1748 15.1344 38.2938 13.7544C37.4591 13.1819 36.5153 12.6744 35.4479 12.2277C33.1389 11.3574 30.8194 10.875 28.4852 10.7743C26.172 10.7345 23.8379 11.0805 21.4995 11.8082C20.4132 12.1857 19.4443 12.6387 18.5865 13.163C16.632 14.4276 15.2709 16.2081 14.5138 18.4941C14.3104 19.3539 13.9874 20.1802 14.0671 20.9772C14.432 24.6325 16.8165 26.2369 16.6529 25.356C16.4411 24.2194 16.5774 23.5483 16.8689 22.4619C17.4226 20.4067 18.7333 18.8883 20.8032 17.9215H20.8095C21.2122 17.8083 23.2528 16.6486 25.9623 18.0348C26.6816 18.3431 27.94 19.1778 28.5691 19.1589C29.2004 19.14 30.1084 18.5318 30.769 18.1187C32.0106 17.3406 33.1871 17.0743 34.7411 17.5671C36.9998 18.278 38.386 19.8845 39.1368 21.9334V21.9314Z" fill="#1B3882"/>
<path d="M5.81212 20.1084L7.5108 20.6075L9.45912 20.5068L10.8579 19.5568L12.3574 18.2083L13.4563 16.7088L14.8552 15.2597L16.2036 14.4103L18.0512 14.0098L16.8014 16.2076L15.7528 17.2562L13.5549 20.3034L12.5566 21.8029L11.4074 22.6019L9.759 23.7008C9.759 23.7008 8.30984 24.0008 8.16096 24.0511C8.01204 24.1014 6.81248 24.3006 6.81248 24.3006L5.76388 24.2503L4.96484 24.0008L5.71352 22.5013L5.8142 21.4023V20.1042L5.81212 20.1084Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M10.1579 11.4685L10.0069 12.6157L9.60842 13.4168L9.10929 14.1655L7.85938 14.9142L9.35886 15.6629L11.0072 16.313L12.7059 16.6632L13.4547 16.7136L15.254 15.4637L17.152 13.7649L16.5019 13.5657L14.1551 13.3161L12.4061 12.666L11.1561 12.018L10.1579 11.4685Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M6.36581 24.7249C5.83729 24.7249 5.28153 24.6934 4.69853 24.6326L3.59961 24.5067L4.14069 24.1586L4.43009 23.8461C4.46993 23.7853 4.58529 23.584 4.77613 23.1604V23.1562C5.08441 22.4935 5.19137 21.3841 5.09701 19.8594L5.03197 18.8339L5.82261 19.4882C6.64893 20.1719 7.68493 20.3355 8.98729 19.9831C10.1512 19.6518 11.4179 18.5654 12.7475 16.7577C13.9702 14.9981 15.7297 13.8048 17.9821 13.2092L18.1939 13.1526L19.6179 14.262L18.3386 14.1697C18.2191 14.2851 18.0995 14.4088 17.9905 14.5451L17.9653 14.5766L17.9318 14.6059C17.1977 15.9104 16.147 17.1981 15.0964 18.6074L15.0817 18.6325C14.9202 18.9157 14.7545 19.2093 14.5742 19.5008V19.5155L14.5154 19.6098C13.4396 21.3589 12.2568 22.6696 10.9985 23.5085C9.84293 24.318 8.28893 24.7269 6.36581 24.7269V24.7249ZM5.45561 23.8063C7.62621 23.9468 9.31861 23.6029 10.493 22.7766L10.5014 22.7703C11.6402 22.0132 12.7244 20.8115 13.7248 19.2009C13.7437 19.1463 13.7709 19.1002 13.8003 19.0646C13.9786 18.7773 14.1463 18.4815 14.3099 18.1942V18.19C15.0208 16.8269 15.8388 15.6147 16.751 14.5766C15.39 15.1764 14.312 16.0677 13.4773 17.2694L13.471 17.2799C12.0009 19.2827 10.6126 20.4466 9.22845 20.8409C8.00581 21.1701 6.93205 21.1114 6.02185 20.6668C6.04073 21.9209 5.89813 22.8626 5.58985 23.5274C5.54369 23.6322 5.49965 23.7245 5.45981 23.8063H5.45561Z" fill="#1B3882"/>
<path d="M6.46611 24.7247C5.93763 24.7247 5.38187 24.6932 4.79883 24.6324L4.89111 23.7495C7.38255 24.0116 9.29939 23.6845 10.5913 22.7764L10.6017 22.7701C11.7405 22.013 12.8247 20.8113 13.8251 19.2007C13.8607 19.1021 13.9174 19.035 13.9677 18.991L14.5486 19.6621C14.6367 19.5845 14.6745 19.4796 14.6807 19.4062L14.6724 19.5132L14.6157 19.6076C13.5399 21.3566 12.3571 22.6674 11.0988 23.5062C9.94323 24.3157 8.38923 24.7247 6.46611 24.7247Z" fill="#1B3882"/>
<path d="M14.0572 16.2124C13.8685 16.225 13.6797 16.2355 13.4909 16.2418C11.9957 16.2942 10.3997 15.8769 8.73667 15.0024C10.1774 14.0712 10.578 13.01 10.5213 12.0537C10.8653 12.3263 11.2239 12.5654 11.5972 12.7689C14.0425 14.1467 16.6263 14.3648 21.0429 13.1296L20.743 12.2928C16.5465 13.4505 14.2732 13.2596 12.0271 11.9929C11.3896 11.6469 10.7982 11.1813 10.2697 10.6129L10.2089 10.5479L9.29871 9.56226L9.47067 10.829C9.49583 10.9338 10.018 13.4064 7.63355 14.6123L6.91211 14.9772L7.60839 15.3903C9.55875 16.5501 11.4504 17.1373 13.233 17.1373L14.0572 16.2145V16.2124Z" fill="#1B3882"/>
<path d="M25.2919 29.8378C25.0151 29.8378 24.7551 29.6658 24.6271 29.3764C24.0336 28.0216 23.5681 27.4574 22.8655 26.6039C22.1545 25.742 20.6404 24.2823 20.3908 24.0705C20.0574 23.7874 19.963 23.4015 20.1622 23.1352C20.2629 23.003 20.5607 22.7367 21.1731 23.1435C22.4146 23.9656 25.1703 26.4382 25.9148 28.4242C26.1874 29.1457 25.9148 29.6364 25.5499 29.7853C25.4639 29.821 25.3779 29.8356 25.2898 29.8356L25.2919 29.8378Z" fill="#1B3882"/>
<path d="M30.6502 29.8713C30.5621 29.8713 30.4719 29.8545 30.3839 29.8168C30.019 29.6637 29.7547 29.1688 30.0357 28.4515C30.8054 26.476 33.5925 24.039 34.8446 23.2316C35.4633 22.8332 35.7569 23.1016 35.8554 23.2358C36.0525 23.5042 35.954 23.888 35.6163 24.167C35.3647 24.3767 33.8317 25.8132 33.1102 26.6689C32.3972 27.5141 31.9253 28.0719 31.3129 29.4204C31.1829 29.7056 30.9249 29.8734 30.6523 29.8734L30.6502 29.8713Z" fill="#1B3882"/>
<path d="M33.3855 8.21144C33.4043 7.18172 32.9031 6.29461 31.8817 5.5522C30.8437 4.82448 29.6042 4.45748 28.1635 4.447C26.7227 4.43648 25.4875 4.79092 24.4703 5.51444C23.4469 6.24428 22.9247 7.12088 22.9121 8.14224C22.9163 8.461 22.9436 8.75253 22.9834 9.02513C26.8024 7.79621 30.2229 7.89685 33.2387 9.32925C33.3205 8.94753 33.3708 8.57632 33.3855 8.20932V8.21144Z" fill="#1B3882" fill-opacity="0.5"/>
<path d="M17.4187 28.9379L17.3998 28.8939C17.3495 28.7786 17.3055 28.6611 17.2698 28.5458L17.1629 28.2018L16.4561 28.2249L16.4016 28.8184C16.4058 28.8414 16.4183 28.8981 16.4309 28.9988V29.0197C16.4372 29.0554 16.4393 29.0847 16.4456 29.1141C16.4561 29.4287 16.3617 29.6782 16.1478 29.8984C15.9444 30.1144 15.6781 30.2109 15.3278 30.1962C14.8937 30.1794 14.749 30.0054 14.6777 29.8754C14.4869 29.5314 14.4743 29.3469 14.4848 29.2651C14.512 29.0407 14.5581 28.8037 14.6232 28.5646C14.7238 28.1913 14.6483 27.7572 14.3946 27.277L14.1345 26.7841C14.0359 26.5975 13.95 26.3626 13.8787 26.0858C13.8241 25.853 13.8388 25.574 13.9206 25.2532C13.994 24.97 14.1513 24.7478 14.4009 24.5716C14.6504 24.4038 14.9482 24.343 15.3027 24.3891L15.3299 24.3933H15.3572C15.7053 24.3954 15.9905 24.429 16.2087 24.4877C16.1478 24.148 16.1289 23.7935 16.1604 23.4328C15.9318 23.3364 15.7011 23.2692 15.4683 23.2378H15.4558L15.2838 23.2231C14.7658 23.1748 14.2603 23.2902 13.7822 23.567C13.2873 23.8564 12.9538 24.2843 12.7923 24.84C12.6539 25.3245 12.6371 25.8236 12.7399 26.3206C12.7525 26.3961 12.7693 26.4842 12.7986 26.5681C12.8762 26.8135 13.0041 27.0861 13.1803 27.3797C13.4487 27.8222 13.5305 27.9669 13.5536 28.0152L13.562 28.0319C13.5955 28.0928 13.6019 28.185 13.5788 28.3067L13.5704 28.3465C13.5452 28.4828 13.5201 28.6108 13.5096 28.659C13.2999 29.2022 13.2789 29.7034 13.4445 30.1417C13.6123 30.6073 13.908 30.947 14.3275 31.1568C14.5707 31.2763 14.8518 31.3518 15.1643 31.3832C15.3341 31.4 15.5124 31.4021 15.699 31.3916C16.3051 31.3602 16.7979 31.0728 17.123 30.5674C17.4418 30.0872 17.5445 29.5545 17.4334 28.9862L17.425 28.9379H17.4187Z" fill="#1B3882"/>
<path d="M43.4628 24.106C43.2992 23.5565 42.9592 23.1308 42.4516 22.8393H42.4496C41.9948 22.5792 41.508 22.4723 41.0004 22.5205C40.9816 22.5205 40.9628 22.5247 40.944 22.5268L40.776 22.5478H40.7616C40.4384 22.6002 40.1156 22.7219 39.801 22.9043C39.8031 22.9295 39.8073 22.9567 39.8115 22.9819C39.8408 23.3028 39.8262 23.6195 39.7716 23.9277L39.9541 23.8543C40.1868 23.76 40.5036 23.6824 40.9208 23.6195L40.944 23.6153C41.2628 23.5502 41.5376 23.609 41.806 23.804C42.0888 24.0095 42.28 24.2654 42.3888 24.5821C42.4792 24.8799 42.4916 25.1357 42.4288 25.3622C42.3512 25.6453 42.2632 25.897 42.1664 26.1088C42.0512 26.3521 41.9336 26.5891 41.8184 26.8051C41.6444 27.128 41.5816 27.4783 41.634 27.8432C41.678 28.1619 41.724 28.4262 41.7764 28.6527C41.8144 28.81 41.7912 29.0113 41.7092 29.2504L41.7052 29.2609C41.638 29.4748 41.2772 29.5901 41.0676 29.6405C40.5896 29.7537 40.3756 29.5797 40.2476 29.4223C40.0296 29.1413 39.9268 28.8373 39.9331 28.4954L39.9415 28.3968C39.952 28.2983 39.9625 28.2479 39.9667 28.2291L39.9519 27.6167L39.1676 27.6356L39.0838 27.9711C39.0544 28.0865 39.0166 28.1997 38.9684 28.3129L38.9474 28.3612L38.939 28.4136C38.8468 28.9715 38.9621 29.5021 39.2788 29.9928L39.2872 30.0033C39.6332 30.5024 40.128 30.7771 40.7176 30.7981C40.8748 30.8065 41.0276 30.8023 41.1724 30.7897C41.5208 30.7583 41.8312 30.666 42.0972 30.5171C42.4896 30.3011 42.7684 29.955 42.932 29.4874C43.0872 29.0281 43.0556 28.529 42.8396 28.0026L42.7872 27.6922L42.7832 27.6691C42.758 27.558 42.7684 27.4489 42.8104 27.3399C42.8924 27.1448 43.0032 26.9183 43.1396 26.6604C43.318 26.3458 43.4332 26.0795 43.496 25.8529C43.5276 25.7523 43.5488 25.6642 43.5548 25.5761C43.6368 25.0833 43.6052 24.5905 43.4584 24.1102L43.4628 24.106Z" fill="#1B3882"/>
<path d="M33.7641 9.18672C33.8249 8.8512 33.8606 8.53452 33.8731 8.21784C33.8941 7.0392 33.3258 6.02 32.1807 5.1874C31.0755 4.41144 29.7396 4.013 28.2108 4.0004C26.6777 3.98572 25.3481 4.3758 24.2554 5.15176C23.1125 5.96756 22.5253 6.97212 22.5106 8.13812V8.14864C22.5127 8.3814 22.5274 8.6142 22.5568 8.85748C22.4498 8.9854 22.3911 9.12592 22.3848 9.27692V9.29368C22.3848 9.68796 22.4414 10.0822 22.5609 10.4639L22.563 10.4744C22.7266 10.9568 23.0076 11.416 23.3998 11.8397L23.5131 11.9613L24.5449 12.0766C25.4257 12.5884 26.6022 12.8631 28.0493 12.8882C28.1122 12.8882 28.1751 12.8882 28.2359 12.8882C29.3034 12.8882 30.2366 12.7582 31.0126 12.5024L31.0336 12.494C31.3251 12.3808 31.6082 12.2444 31.8787 12.0872C32.7008 12.1647 33.3174 11.76 33.6341 10.9358L33.7117 10.8519L33.7075 10.7344C33.8291 10.3612 34.043 9.62924 33.762 9.18672H33.7641ZM24.7693 5.87528C25.7046 5.21048 26.8581 4.87912 28.2024 4.8896C29.5466 4.90008 30.7127 5.24612 31.6627 5.91092C32.5666 6.56736 32.9986 7.31816 32.9839 8.19268C32.9776 8.35416 32.965 8.51984 32.942 8.6918C31.4152 8.05216 29.7857 7.7334 28.0598 7.7334C26.577 7.7334 25.0251 7.97036 23.4103 8.44224C23.404 8.34156 23.3998 8.24092 23.3977 8.14232C23.4103 7.26572 23.8591 6.52332 24.7693 5.87528ZM32.9756 10.0445L32.6756 10.6925C32.6756 10.6925 32.2835 11.2588 31.8556 11.1874L31.69 11.1602L31.5474 11.2483C31.2873 11.4098 31.0105 11.5482 30.7232 11.6593C29.9996 11.8963 29.1042 12.0096 28.064 11.9949C26.7763 11.9697 25.7445 11.739 24.9958 11.3049C24.9727 11.2902 24.9496 11.2755 24.9286 11.265L24.8406 11.2105L23.9472 11.112C23.7773 10.9127 23.5361 10.4933 23.5361 10.4933L23.2908 9.78232C23.2908 9.78232 23.2782 9.48032 23.2761 9.41112C26.8916 8.27236 30.1674 8.36672 33.0133 9.69216C33.0175 9.7446 32.9756 10.0445 32.9756 10.0445ZM33.2419 10.7051V10.6988L33.2482 10.7051H33.2419Z" fill="#1B3882"/>
</svg>

@@ -1 +0,0 @@
<svg fill="currentColor" fill-rule="evenodd" height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Jina</title><path d="M6.608 21.416a4.608 4.608 0 100-9.217 4.608 4.608 0 000 9.217zM20.894 2.015c.614 0 1.106.492 1.106 1.106v9.002c0 5.13-4.148 9.309-9.217 9.37v-9.355l-.03-9.032c0-.614.491-1.106 1.106-1.106h7.158l-.123.015z"></path></svg>

@@ -1,74 +0,0 @@
<svg viewBox="0 0 48 48" fill="none" xmlns="http://www.w3.org/2000/svg">
<g >
<path d="M13.7936 8H13.7456L13.6836 13.23H13.7276C17.1576 13.23 19.8196 15.944 25.6076 25.722L25.9576 26.316L25.9816 26.356L29.2216 21.48L29.1976 21.442C28.4896 20.2828 27.7574 19.1386 27.0016 18.01C26.2575 16.8963 25.4736 15.8096 24.6516 14.752C20.8256 9.864 17.6256 8 13.7936 8Z" fill="url(#paint0_linear_28116_23027)"/>
<path d="M13.7455 8C9.89953 8.02 6.49353 10.516 4.03953 14.34C4.03284 14.3513 4.02617 14.3627 4.01953 14.374L8.52753 16.836L8.54953 16.802C9.98553 14.636 11.7695 13.254 13.6855 13.232H13.7275L13.7915 8H13.7455Z" fill="url(#paint1_linear_28116_23027)"/>
<path d="M4.03906 14.3401L4.01706 14.3741C2.40106 16.8941 1.19706 19.9901 0.549062 23.3281L0.539062 23.3721L5.60706 24.5721L5.61506 24.5281C6.15506 21.5941 7.18706 18.8721 8.52706 16.8381L8.54906 16.8041L4.03906 14.3401Z" fill="url(#paint2_linear_28116_23027)"/>
<path d="M5.614 24.5279L0.548 23.3279L0.538 23.3719C0.184 25.2079 0.004 27.0739 0 28.9439V28.9899L5.196 29.4559V29.4099C5.17709 27.7734 5.31776 26.139 5.616 24.5299L5.614 24.5279Z" fill="url(#paint3_linear_28116_23027)"/>
<path d="M5.35425 31.074C5.26107 30.5367 5.20825 29.9932 5.19625 29.448V29.404L0.000246211 28.936V28.984C-0.00558496 30.0921 0.0921736 31.1982 0.292246 32.288L5.36225 31.118C5.35954 31.1034 5.35687 31.0887 5.35425 31.074Z" fill="url(#paint4_linear_28116_23027)"/>
<path d="M6.54116 33.78C5.97316 33.16 5.57316 32.268 5.36316 31.124L5.35516 31.082L0.285156 32.252L0.293156 32.294C0.677156 34.314 1.42916 35.994 2.50516 37.268L2.53316 37.302L6.56916 33.812C6.5591 33.8014 6.5511 33.7907 6.54116 33.78Z" fill="url(#paint5_linear_28116_23027)"/>
<path d="M21.5599 19.3081C18.5039 24.0081 16.6519 26.9581 16.6519 26.9581C12.5819 33.3581 11.1739 34.7921 8.90991 34.7921C8.46467 34.8037 8.02219 34.7189 7.61277 34.5436C7.20336 34.3682 6.8367 34.1064 6.53791 33.7761L2.50391 37.2641L2.53191 37.2981C4.01991 39.0361 6.11591 40.0001 8.71191 40.0001C12.6379 40.0001 15.4599 38.1441 20.4799 29.3401L24.0119 23.0801C23.2283 21.8007 22.412 20.5429 21.5599 19.3081Z" fill="#0082FB"/>
<path d="M27.0043 11.8919L26.9723 11.9239C26.1723 12.7839 25.4003 13.7399 24.6523 14.7559C25.4083 15.7219 26.1883 16.8039 27.0023 18.0159C27.9623 16.5299 28.8583 15.3259 29.7363 14.4019L29.7683 14.3699L27.0043 11.8919Z" fill="url(#paint6_linear_28116_23027)"/>
<path d="M41.8367 11.426C39.7067 9.266 37.1667 8 34.4507 8C31.5867 8 29.1767 9.574 27.0047 11.888L26.9727 11.92L29.7367 14.4L29.7687 14.366C31.1987 12.872 32.5847 12.126 34.1207 12.126C35.7727 12.126 37.3207 12.906 38.6607 14.276L38.6907 14.308L41.8687 11.458L41.8367 11.426Z" fill="#0082FB"/>
<path d="M47.9962 28.25C47.8762 21.316 45.4562 15.118 41.8682 11.458L41.8362 11.426L38.6602 14.274L38.6902 14.306C41.3902 17.09 43.2442 22.266 43.4122 28.248V28.294H47.9962V28.25Z" fill="url(#paint7_linear_28116_23027)"/>
<path d="M47.996 28.2999V28.2539H43.412V28.2979C43.42 28.5779 43.424 28.8619 43.424 29.1459C43.424 30.7759 43.182 32.0939 42.688 33.0459L42.666 33.0899L46.082 36.6539L46.108 36.6139C47.348 34.6939 48 32.0279 48 28.7939C48 28.6279 48 28.4639 47.996 28.2999Z" fill="url(#paint8_linear_28116_23027)"/>
<path d="M42.688 33.04L42.666 33.08C42.238 33.884 41.628 34.42 40.832 34.654L42.388 39.578C42.6879 39.4767 42.9806 39.3551 43.264 39.214C44.3715 38.6542 45.3134 37.8144 45.996 36.778L46.084 36.648L46.108 36.608L42.688 33.04Z" fill="url(#paint9_linear_28116_23027)"/>
<path d="M39.8406 34.7861C39.3166 34.7861 38.8566 34.7081 38.4046 34.5061L36.8086 39.5501C37.7066 39.8561 38.6626 39.9941 39.7286 39.9941C40.7126 39.9941 41.6146 39.8481 42.4326 39.5641L40.8726 34.6401C40.5386 34.7401 40.1926 34.7901 39.8406 34.7861Z" fill="url(#paint10_linear_28116_23027)"/>
<path d="M36.6453 33.0679L36.6173 33.0339L32.9453 36.8619L32.9773 36.8959C34.2513 38.2599 35.4693 39.1059 36.8513 39.5699L38.4453 34.5299C37.8633 34.2799 37.2993 33.8239 36.6453 33.0679Z" fill="url(#paint11_linear_28116_23027)"/>
<path d="M36.619 33.03C35.519 31.746 34.155 29.606 32.013 26.15L29.221 21.478L29.199 21.438L25.959 26.314L25.983 26.354L27.961 29.69C29.879 32.91 31.441 35.238 32.947 36.86L32.979 36.892L36.647 33.064C36.6376 33.0527 36.6283 33.0414 36.619 33.03Z" fill="url(#paint12_linear_28116_23027)"/>
</g>
<defs>
<linearGradient id="paint0_linear_28116_23027" x1="1192.97" y1="1645.34" x2="229.443" y2="378.706" gradientUnits="userSpaceOnUse">
<stop offset="0.0006" stop-color="#0867DF"/>
<stop offset="0.4539" stop-color="#0668E1"/>
<stop offset="0.8591" stop-color="#0064E0"/>
</linearGradient>
<linearGradient id="paint1_linear_28116_23027" x1="215.779" y1="678.423" x2="903.276" y2="155.167" gradientUnits="userSpaceOnUse">
<stop offset="0.1323" stop-color="#0064DF"/>
<stop offset="0.9988" stop-color="#0064E0"/>
</linearGradient>
<linearGradient id="paint2_linear_28116_23027" x1="307.026" y1="926.288" x2="587.306" y2="218.601" gradientUnits="userSpaceOnUse">
<stop offset="0.0147" stop-color="#0072EC"/>
<stop offset="0.6881" stop-color="#0064DF"/>
</linearGradient>
<linearGradient id="paint3_linear_28116_23027" x1="264.132" y1="576.012" x2="298.323" y2="120.222" gradientUnits="userSpaceOnUse">
<stop offset="0.0731" stop-color="#007CF6"/>
<stop offset="0.9943" stop-color="#0072EC"/>
</linearGradient>
<linearGradient id="paint4_linear_28116_23027" x1="279.668" y1="224.361" x2="269.842" y2="151.011" gradientUnits="userSpaceOnUse">
<stop offset="0.0731" stop-color="#007FF9"/>
<stop offset="1" stop-color="#007CF6"/>
</linearGradient>
<linearGradient id="paint5_linear_28116_23027" x1="237.123" y1="108.844" x2="387.116" y2="428.013" gradientUnits="userSpaceOnUse">
<stop offset="0.0731" stop-color="#007FF9"/>
<stop offset="1" stop-color="#0082FB"/>
</linearGradient>
<linearGradient id="paint6_linear_28116_23027" x1="202.73" y1="433.72" x2="383.614" y2="182.812" gradientUnits="userSpaceOnUse">
<stop offset="0.2799" stop-color="#007FF8"/>
<stop offset="0.9141" stop-color="#0082FB"/>
</linearGradient>
<linearGradient id="paint7_linear_28116_23027" x1="447.222" y1="116.598" x2="849.003" y2="1599.3" gradientUnits="userSpaceOnUse">
<stop stop-color="#0082FB"/>
<stop offset="0.9995" stop-color="#0081FA"/>
</linearGradient>
<linearGradient id="paint8_linear_28116_23027" x1="362.999" y1="67.4063" x2="127.282" y2="545.316" gradientUnits="userSpaceOnUse">
<stop offset="0.0619" stop-color="#0081FA"/>
<stop offset="1" stop-color="#0080F9"/>
</linearGradient>
<linearGradient id="paint9_linear_28116_23027" x1="200.6" y1="420.874" x2="390.793" y2="290.929" gradientUnits="userSpaceOnUse">
<stop stop-color="#027AF3"/>
<stop offset="1" stop-color="#0080F9"/>
</linearGradient>
<linearGradient id="paint10_linear_28116_23027" x1="151.724" y1="308.912" x2="498.607" y2="308.912" gradientUnits="userSpaceOnUse">
<stop stop-color="#0377EF"/>
<stop offset="0.9994" stop-color="#0279F1"/>
</linearGradient>
<linearGradient id="paint11_linear_28116_23027" x1="254.612" y1="263.742" x2="449.915" y2="379.036" gradientUnits="userSpaceOnUse">
<stop offset="0.0019" stop-color="#0471E9"/>
<stop offset="1" stop-color="#0377EF"/>
</linearGradient>
<linearGradient id="paint12_linear_28116_23027" x1="370.69" y1="326.175" x2="1008.64" y2="1130.72" gradientUnits="userSpaceOnUse">
<stop offset="0.2765" stop-color="#0867DF"/>
<stop offset="1" stop-color="#0471E9"/>
</linearGradient>
</defs>
</svg>

@@ -1,4 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="none">
<path d="M8.33595 9.4104V14.8168C7.95355 16.6472 5.40635 16.1752 5.39515 14.8472C5.38315 13.1792 5.39515 11.5232 5.39515 9.8664V7.9472C5.39515 7.7072 5.34075 7.5176 5.11995 7.3664C4.69035 7.0608 4.17595 7.4208 4.16475 7.8624C4.14075 8.4544 4.15275 9.0344 4.14075 9.6192C4.14075 10.08 4.14075 10.5288 4.15275 10.9888C3.83515 13.0168 1.25835 12.68 1.19995 11.0008V9.596C1.19995 9.1352 2.14395 9.0272 2.09755 9.716C2.06715 10.0568 2.08635 10.4088 2.07435 10.7456C2.06315 11.3488 3.03035 11.7552 3.25835 10.7688C3.27035 9.956 3.27035 9.1432 3.27035 8.3224C3.27035 7.2936 3.57595 6.4576 4.71835 6.38C5.21355 6.3376 5.54235 6.5344 5.85915 6.876C5.97915 6.996 6.26555 7.3784 6.27755 7.796C6.27755 8.18 6.28875 8.5624 6.28875 8.9496C6.28875 9.716 6.27755 10.4856 6.27755 11.252C6.27755 11.748 6.28875 12.2392 6.28875 12.7232C6.28875 13.3384 6.28875 13.9616 6.27755 14.5768C6.26555 15.3776 7.28715 15.3664 7.45035 14.5648C7.45035 13.6208 7.46155 12.688 7.46155 11.744C7.46155 9.4176 7.45035 7.092 7.45035 4.7664C7.45035 4.5264 7.41515 3.8568 7.55035 3.6128C8.26235 1.9568 10.4264 2.68 10.4456 4.0504C10.488 6.86 10.4456 9.7008 10.4568 12.5176C10.4568 13.4936 9.62475 13.284 9.57835 12.9016C9.57835 9.9712 9.57835 7.0296 9.59035 4.104C9.54795 3.3688 8.40635 3.4696 8.33995 3.996C8.31675 4.6 8.32875 5.2152 8.31675 5.8184V9.4064H8.32875L8.33675 9.4104H8.33595Z" fill="#D4367A"/>
<path d="M11.68 9.36721V13.3376V3.96801C12.052 2.12561 14.5984 2.60561 14.6096 3.93361C14.6216 5.58961 14.6096 7.25761 14.6216 8.91361C14.6216 9.54881 14.6216 10.1872 14.6096 10.8216C14.6096 11.0736 14.6752 11.2512 14.896 11.4136C15.3144 11.712 15.8288 11.36 15.852 10.9112C15.8752 10.3304 15.8632 9.74961 15.8632 9.15361V7.79201C16.1808 5.76401 18.7512 6.10081 18.8048 7.78001V12.8688C18.8048 13.3296 17.872 13.4384 17.9144 12.7488C17.9384 12.3968 17.9144 8.37281 17.9264 8.02001C17.9496 7.42801 16.9704 7.00961 16.7424 8.00801V10.4424C16.7424 11.484 16.436 12.308 15.2832 12.3968C14.1952 12.4208 13.7584 11.704 13.7232 10.9808V6.06161C13.7232 5.43441 13.7232 4.82321 13.7352 4.20801C13.7464 3.40721 12.7248 3.40721 12.5512 4.22001V15.2336C12.5512 15.4736 12.5856 16.1432 12.4616 16.3872C11.7496 18.0432 9.57518 17.32 9.55518 15.9496V14.6456C9.60958 13.7672 10.3872 13.976 10.4224 14.3472V15.8952C10.4648 16.6312 11.6064 16.5416 11.6608 16.004C11.684 15.4 11.684 14.796 11.684 14.1808V9.36801H11.68V9.36721Z" fill="#ED6D48"/>
</svg>

@@ -1,12 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 20 20" fill="none">
<path d="M3.84003 3.39999H6.48003V6.03999H3.84003V3.39999ZM14.4 3.39999H17.04V6.03999H14.4V3.39999Z" fill="#FECE00"/>
<path d="M3.84003 6.03999H6.48003V8.67999H3.84003V6.03999Z" fill="#FFA301"/>
<path d="M3.84003 11.32H6.48003V13.96H3.84003V11.32ZM9.12003 11.32H11.76V13.96H9.12003V11.32ZM14.4 11.32H17.04V13.96H14.4V11.32Z" fill="#FE4900"/>
<path d="M11.76 6.03999H14.4V8.67999H11.76V6.03999Z" fill="#FFA301"/>
<path d="M3.84003 8.67999H6.48003V11.32H3.84003V8.67999Z" fill="#FF6F00"/>
<path d="M3.84003 13.96H6.48003V16.6H3.84003V13.96ZM14.4 13.96H17.04V16.6H14.4V13.96Z" fill="#FE0107"/>
<path d="M11.76 8.67999H14.4V11.32H11.76V8.67999Z" fill="#FF6F00"/>
<path d="M6.47998 6.03999H9.11998V8.67999H6.47998V6.03999ZM14.4 6.03999H17.04V8.67999H14.4V6.03999Z" fill="#FFA301"/>
<path d="M6.47998 8.67999H9.11998V11.32H6.47998V8.67999ZM14.4 8.67999H17.04V11.32H14.4V8.67999ZM9.11998 8.67999H11.76V11.32H9.11998V8.67999Z" fill="#FF6F00"/>
<path d="M2.96002 3.39999H3.84002V16.6H2.96002V3.39999ZM13.52 3.39999H14.4V6.03999H13.52V3.39999ZM10.88 6.03999H11.76V8.67999H10.88V6.03999ZM8.24002 11.32H9.12002V13.96H8.24002V11.32ZM13.52 11.32H14.4V16.6H13.52V11.32Z" fill="#191D1D"/>
</svg>

@@ -1,24 +0,0 @@
<svg viewBox="0 0 45 26" xmlns="http://www.w3.org/2000/svg" version="1.1">
<!-- Generator: Sketch 55 (78076) - https://sketchapp.com -->
<title>Group</title>
<desc>Created with Sketch.</desc>
<g>
<title>Layer 1</title>
<g fill-rule="evenodd" fill="none" id="组件">
<g id="页脚">
<g id="Group">
<g id="Page-1">
<path fill="#0C8CF6" id="Fill-1" d="m20.9587,24.53226l-4.85075,-7.2592l2.16824,-1.44214l4.128,6.1776l9.34013,-13.9776l2.16824,1.44214l-10.06287,15.0592c-0.68757,1.02916 -2.20343,1.02916 -2.89099,0"/>
<polygon points="68.33575105667114,19.651321411132812 64.82147645950317,11.577018737792969 64.11480379104614,23.955184936523438 59.74661684036255,23.955184936523438 60.87459993362427,3.1465225219726562 65.50557374954224,3.1465225219726562 70.11526155471802,13.803489685058594 74.72494173049927,3.1465225219726562 79.35287141799927,3.1465225219726562 80.4812970161438,23.955184936523438 76.11614656448364,23.955184936523438 75.40946626663208,11.577018737792969 71.89519929885864,19.651321411132812 " fill="#FFFFFF" id="Fill-3"/>
<path fill="#FFFFFF" id="Fill-5" d="m90.0524,20.24468c0.89258,0 1.59187,-0.29683 2.09745,-0.8905c0.50601,-0.59366 0.75879,-1.64233 0.75879,-3.14686c0,-1.50324 -0.25278,-2.5571 -0.75879,-3.16074c-0.50558,-0.60363 -1.20487,-0.90566 -2.09745,-0.90566c-0.87303,0 -1.57232,0.31156 -2.09788,0.9347c-0.52599,0.624 -0.78833,1.6679 -0.78833,3.1317c0,1.50453 0.25757,2.5532 0.77357,3.14686c0.51556,0.59367 1.22007,0.8905 2.11264,0.8905m0,-11.9626c1.03113,0 1.99364,0.14387 2.88621,0.4303c0.89258,0.28687 1.66093,0.74707 2.30593,1.38017c0.64414,0.63397 1.14971,1.4547 1.51717,2.46393c0.36702,1.00924 0.55074,2.22647 0.55074,3.65084c0,1.42523 -0.18372,2.64246 -0.55074,3.65126c-0.36746,1.00967 -0.87303,1.83084 -1.51717,2.46437c-0.645,0.63353 -1.41335,1.0933 -2.30593,1.38017c-0.89257,0.28643 -1.85508,0.4303 -2.88621,0.4303c-1.05198,0 -2.01839,-0.14387 -2.90098,-0.4303c-0.88302,-0.28687 -1.64659,-0.74664 -2.29116,-1.38017c-0.645,-0.63353 -1.15057,-1.4547 -1.5176,-2.46437c-0.36745,-1.0088 -0.55031,-2.22603 -0.55031,-3.65126c0,-1.40444 0.18286,-2.60694 0.55031,-3.6062c0.36703,-0.99927 0.8726,-1.82044 1.5176,-2.46394c0.64457,-0.64263 1.40814,-1.1128 2.29116,-1.41006c0.88259,-0.29684 1.849,-0.44504 2.90098,-0.44504"/>
<polygon points="100.08356142044067,23.955101013183594 104.57641267776489,23.955101013183594 104.57641267776489,2.682769775390625 100.08356142044067,2.682769775390625 " fill="#FFFFFF" id="Fill-7"/>
<polygon points="104.54623079299927,15.940773010253906 109.77919435501099,8.459709167480469 114.86709260940552,8.459709167480469 109.48557710647583,15.732772827148438 115.46258211135864,23.955272674560547 110.4037823677063,23.955272674560547 " fill="#FFFFFF" id="Fill-9"/>
<path fill="#FFFFFF" id="Fill-10" d="m124.83454,17.15783c-0.1985,-0.03986 -0.40698,-0.06413 -0.62459,-0.0741c-0.21891,-0.00996 -0.40698,-0.01516 -0.56595,-0.01516c-0.83307,0 -1.50717,0.12913 -2.02273,0.3861c-0.51644,0.25696 -0.774,0.70243 -0.774,1.33553c0,0.39607 0.09425,0.70807 0.28319,0.93513c0.18763,0.2275 0.42088,0.39607 0.69886,0.5044c0.27754,0.1092 0.56985,0.17334 0.8778,0.19284c0.30752,0.01993 0.56986,0.0299 0.78877,0.0299c0.39612,0 0.84263,-0.0494 1.33865,-0.14864l0,-3.146zm-1.54713,-3.26516c0.15853,0 0.37179,0.00476 0.63978,0.0143c0.26756,0.0104 0.56943,0.02556 0.90735,0.04506c-0.01998,-0.7722 -0.21457,-1.3247 -0.82743,-1.63063c-0.51296,-0.25567 -1.19618,-0.36703 -2.02925,-0.36703c-0.45606,0 -0.98162,0.0546 -1.57667,0.16336c-0.59505,0.10877 -1.24005,0.27214 -1.93413,0.48967l-0.4756,-1.63237c-0.05994,-0.21796 -0.14464,-0.53473 -0.25323,-0.94986c-0.10945,-0.416 -0.18372,-0.7423 -0.22325,-0.97977c0.99161,-0.3367 1.92936,-0.57893 2.81194,-0.72757c0.88259,-0.1482 1.68091,-0.2223 2.39497,-0.2223c2.10222,0 3.72884,0.50007 4.87985,1.4989c1.15014,0.9997 1.72565,2.75427 1.72565,4.95084l0,8.69743c-0.77357,0.2184 -1.66614,0.43073 -2.67773,0.6383c-1.01202,0.208 -2.10309,0.312 -3.27278,0.312c-1.032,0 -1.96931,-0.08927 -2.81237,-0.26737c-0.84306,-0.1781 -1.56668,-0.47493 -2.17172,-0.8905c-0.60548,-0.41556 -1.07109,-0.9594 -1.39859,-1.6328c-0.32706,-0.67253 -0.4908,-1.4937 -0.4908,-2.4635c0,-0.96936 0.20283,-1.781 0.60981,-2.43403c0.40655,-0.65347 0.9321,-1.17303 1.57711,-1.5587c0.64456,-0.3861 1.36861,-0.6578 2.17215,-0.8164c0.8031,-0.15773 1.61141,-0.23703 2.42494,-0.23703l0,0z"/>
<path fill="#0C8CF6" id="Fill-11" d="m11.28274,23.05203l-7.29219,0l13.20144,-19.75653l3.64631,5.45653l-9.55556,14.3zm32.60399,-0.0949l-14.82457,-22.18536c-0.688,-1.02917 -2.20343,-1.02917 -2.89143,0l-3.76662,5.6368l-3.76619,-5.6368c-0.688,-1.02917 -2.20343,-1.02917 -2.89143,0l-14.82457,22.18536c-0.76966,1.1518 0.05776,2.6949 1.44549,2.6949l9.38008,0c0.58115,0 1.12321,-0.28946 1.4455,-0.77176l14.42324,-21.58477l13.20187,19.75653l-7.29219,0l-4.34344,-6.5l-1.56624,2.34347l3.99943,5.98477c0.32185,0.4823 0.86435,0.77176 1.4455,0.77176l9.38008,0c1.38773,0 2.21515,-1.5431 1.44549,-2.6949l0,0z"/>
</g>
</g>
</g>
</g>
</g>
</svg>

@@ -1,31 +0,0 @@
<svg t="1710840553956" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="2078"
width="128" height="128">
<path d="M0.143417 0h1011.952942v1011.952941h-1011.952942v-1011.952941z" fill="#1A1D22" p-id="2079"></path>
<path
d="M554.308123 289.129412a1898.375529 1898.375529 0 0 0-289.129412-96.376471c175.911153-130.710588 352.617412-126.686871 530.070589 12.047059 46.116141 34.044988 78.2336 78.2336 96.37647 132.517647l-60.235294 24.094118a1077.320282 1077.320282 0 0 0-277.082353-72.282353z"
fill="#EDEDEE" p-id="2080"></path>
<path
d="M265.178711 216.847059c86.690635 52.814306 183.067106 76.908424 289.129412 72.282353 14.215529 15.709365 30.286306 31.756047 48.188236 48.188235-27.467294 22.287059-27.467294 42.381553 0 60.235294l337.317647 84.329412v96.376471a2918.207247 2918.207247 0 0 0-409.6-96.376471c-122.398118-52.043294-250.916141-92.184094-385.505883-120.470588-18.432-86.738824 21.7088-134.927059 120.470588-144.564706z"
fill="#EFEFF0" p-id="2081"></path>
<path
d="M265.178711 192.752941a1898.375529 1898.375529 0 0 1 289.129412 96.376471c-106.062306 4.626071-202.438776-19.468047-289.129412-72.282353v-24.094118z"
fill="#626568" p-id="2082"></path>
<path
d="M144.708123 385.505882a2590.021271 2590.021271 0 0 0 313.22353 96.376471c29.226165 5.951247 45.296941 22.022024 48.188235 48.188235 2.891294 26.166212 18.962071 42.236988 48.188235 48.188236a6047.623529 6047.623529 0 0 1 361.411765 108.423529c-17.492329 33.659482-45.586071 53.729882-84.329412 60.235294a2021.7856 2021.7856 0 0 1-301.17647-72.282353 3182.302871 3182.302871 0 0 0-457.788236-120.470588v-72.282353c-9.926776-59.584753 14.167341-91.702212 72.282353-96.376471z"
fill="#EEEEEF" p-id="2083"></path>
<path
d="M144.708123 361.411765c134.589741 28.286494 263.107765 68.427294 385.505883 120.470588 4.144188 23.901365-3.903247 39.948047-24.094118 48.188235-2.891294-26.166212-18.962071-42.236988-48.188235-48.188235a2590.021271 2590.021271 0 0 1-313.22353-96.376471v-24.094117z"
fill="#5C5E62" p-id="2084"></path>
<path
d="M72.42577 481.882353v72.282353a3182.302871 3182.302871 0 0 1 457.788236 120.470588c-16.070776 0-24.094118 8.023341-24.094118 24.094118a6126.772706 6126.772706 0 0 1-397.552941-120.470588 118.832188 118.832188 0 0 0-12.047059 72.282352c-49.007435-51.922824-57.030776-108.1344-24.094118-168.658823z"
fill="#4F5155" p-id="2085"></path>
<path
d="M506.119888 698.729412v72.282353c94.930824 9.637647 183.283953 37.755482 265.035294 84.329411-44.092235 66.499765-100.303812 78.546824-168.658823 36.141177a4261.285647 4261.285647 0 0 0-433.694118-120.470588c-42.164706-27.949176-66.258824-68.089976-72.282353-120.470589a118.832188 118.832188 0 0 1 12.047059-72.282352 6126.772706 6126.772706 0 0 0 397.552941 120.470588z"
fill="#F2F2F3" p-id="2086"></path>
<path
d="M168.802241 771.011765a4261.285647 4261.285647 0 0 1 433.694118 120.470588v24.094118a2109.391812 2109.391812 0 0 1-397.552942-120.470589c-9.613553 5.493459-13.613176 13.540894-12.047058 24.094118-20.190871-8.240188-28.238306-24.286871-24.094118-48.188235z"
fill="#5D5F62" p-id="2087"></path>
<path
d="M602.496359 915.576471c10.553224-1.566118 18.600659 2.433506 24.094117 12.047058-169.285271 59.994353-313.825882 23.853176-433.694117-108.423529-1.566118-10.553224 2.433506-18.600659 12.047058-24.094118a2109.391812 2109.391812 0 0 0 397.552942 120.470589z"
fill="#E7E7E8" p-id="2088"></path>
</svg>

@@ -1 +0,0 @@
<svg height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Novita AI</title><path clip-rule="evenodd" d="M9.167 4.17v5.665L0 19.003h9.167v-5.666l5.666 5.666H24L9.167 4.17z" fill="#23D57C" fill-rule="evenodd"></path></svg>

@@ -1,5 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="none">
<path d="M5.62346 1.55468C4.79534 2.05468 4.27971 3.85937 4.46721 5.57031L4.53753 6.25L4.1469 6.64062C2.88127 7.89062 2.56096 9.82812 3.35784 11.3906L3.56877 11.8047L3.37346 12.2891C2.86565 13.5625 2.90471 15.1016 3.4594 16.2109L3.68596 16.6562L3.54534 16.9453C3.22502 17.6016 3.06877 19.0469 3.25627 19.7031L3.3344 20H4.51409L4.45159 19.7422C4.41253 19.6094 4.38128 19.1406 4.38128 18.7109C4.38128 17.9687 4.3969 17.8984 4.67815 17.3125C4.8344 16.9766 4.96721 16.6406 4.96721 16.5703C4.96721 16.5 4.85784 16.2656 4.71721 16.0469C4.0219 14.9609 4.00627 13.6328 4.66252 12.3281C4.95159 11.75 4.94377 11.5703 4.62346 11.1797C4.18596 10.6641 3.97502 9.82812 4.10784 9.14062C4.29534 8.15625 4.8969 7.32812 5.7094 6.94531C6.10003 6.75781 6.27971 6.71875 6.77971 6.71875H7.38909L7.54534 6.40625C7.90471 5.69531 8.52971 5.20312 9.37346 4.96093C10.4594 4.64062 11.8031 5.25 12.3969 6.33593L12.5844 6.67968L13.2485 6.71875C14.3813 6.79687 15.0688 7.25 15.561 8.24218C16.061 9.25781 15.9985 10.3359 15.3969 11.125C15.061 11.5625 15.061 11.75 15.35 12.3281C16.0063 13.6328 15.9907 14.9609 15.2953 16.0469C15.1547 16.2656 15.0453 16.5 15.0453 16.5703C15.0453 16.6406 15.1782 16.9766 15.3344 17.3125C15.6157 17.8984 15.6313 17.9687 15.6313 18.7109C15.6313 19.1406 15.6 19.6094 15.561 19.7422L15.4985 20H16.6782L16.7563 19.7109C16.9438 19.0469 16.7875 17.6016 16.4672 16.9453L16.3266 16.6562L16.5532 16.2109C17.1078 15.1016 17.1469 13.5625 16.6313 12.2734L16.436 11.7812L16.6469 11.3516C17.1157 10.3828 17.1782 9.22656 16.8032 8.13281C16.5922 7.50781 16.311 7.0625 15.811 6.58593L15.475 6.25L15.5453 5.57031C15.686 4.28125 15.3735 2.67187 14.85 1.98437C14.436 1.4375 13.8813 1.27343 13.3032 1.51562C12.7719 1.73437 12.311 2.52343 12.0922 3.59375C11.9907 4.10156 11.936 4.21093 11.8422 4.17187C11.1782 3.86718 10.6625 3.75 9.99065 3.75C9.30315 3.75 8.95159 3.83593 8.17034 4.17187C8.07659 4.21093 8.0219 4.10156 7.92034 3.59375C7.70159 2.52343 7.24065 1.73437 6.7094 1.51562C6.35002 1.35937 5.90471 1.38281 5.62346 1.55468ZM6.51409 2.99218C6.82659 3.63281 7.04534 5.15625 6.85784 5.4375C6.82659 5.48437 6.59221 5.54687 6.3344 5.57812C6.07659 5.60937 5.80315 5.64843 5.73284 5.67187C5.60784 5.71093 5.59221 5.64062 5.59221 4.94531C5.59221 4.52343 5.6469 3.96093 5.7094 3.6875C5.85003 3.10156 6.11565 2.55468 6.24065 2.60156C6.28752 2.61718 6.41252 2.79687 6.51409 2.99218ZM14.0297 2.89062C14.2797 3.375 14.4203 4.11718 14.4203 4.95312C14.4203 5.55468 14.3969 5.71093 14.3188 5.67968C14.2563 5.65625 13.975 5.61718 13.6938 5.58593C13.0688 5.51562 13.0453 5.47656 13.1157 4.53125C13.186 3.6875 13.5688 2.57812 13.8032 2.57812C13.8422 2.57812 13.9438 2.71875 14.0297 2.89062Z" fill="black"/>
<path d="M8.85785 9.36718C7.98285 9.71874 7.3891 10.2969 7.18598 11C6.88129 12.0547 7.35004 12.9766 8.44379 13.4922C8.8891 13.6953 8.97504 13.7109 10.0063 13.7109C11.0375 13.7109 11.1235 13.6953 11.5688 13.4922C12.3735 13.1172 12.8188 12.5547 12.9125 11.8047C13.0141 10.8984 12.4438 9.99999 11.4672 9.51561C10.9907 9.27343 10.8813 9.2578 10.0844 9.24218C9.39691 9.22655 9.15473 9.24999 8.85785 9.36718ZM10.85 10.125C11.0454 10.1875 11.3657 10.3828 11.561 10.5547C12.2485 11.1641 12.3032 11.8906 11.6938 12.4453C11.2719 12.8281 10.9125 12.9297 10.0063 12.9297C9.10004 12.9297 8.74066 12.8281 8.31879 12.4453C7.70941 11.8906 7.7641 11.1641 8.4516 10.5547C8.64691 10.3828 8.9516 10.1953 9.13129 10.125C9.56879 9.97655 10.4047 9.96874 10.85 10.125Z" fill="black"/>
<path d="M9.38911 10.9141C9.27974 11.0234 9.35005 11.4063 9.49068 11.5C9.58443 11.5703 9.65474 11.7188 9.67036 11.9141C9.6938 12.2188 9.70161 12.2266 10.0063 12.2266C10.311 12.2266 10.3188 12.2188 10.3266 11.9297C10.3266 11.7344 10.3891 11.5859 10.4985 11.4844C10.686 11.3047 10.7251 11.0781 10.5766 10.9609C10.4672 10.875 9.46724 10.8438 9.38911 10.9141ZM6.09224 9.38282C5.73286 9.56251 5.54536 10.2578 5.77193 10.5547C5.91255 10.7344 6.17818 10.8594 6.43599 10.8594C6.77193 10.8594 7.15474 10.4063 7.15474 10.0156C7.15474 9.85938 7.10786 9.67188 7.05318 9.60157C6.83443 9.32032 6.42818 9.22657 6.09224 9.38282ZM13.2094 9.39063C12.9594 9.53126 12.8657 9.70313 12.8579 10.0156C12.8579 10.4063 13.2407 10.8594 13.5766 10.8594C14.0297 10.8594 14.3344 10.5859 14.3422 10.1719C14.3422 9.53126 13.7172 9.10157 13.2094 9.39063Z" fill="black"/>
</svg>

File diff suppressed because one or more lines are too long

@@ -1 +0,0 @@
<svg width="100%" height="100%" viewBox="0 0 512 512" xmlns="http://www.w3.org/2000/svg" class="size-4" fill="currentColor" stroke="currentColor" aria-label="Logo"><g clip-path="url(#clip0_205_3)"><path d="M3 248.945C18 248.945 76 236 106 219C136 202 136 202 198 158C276.497 102.293 332 120.945 423 120.945" stroke-width="90"></path><path d="M511 121.5L357.25 210.268L357.25 32.7324L511 121.5Z"></path><path d="M0 249C15 249 73 261.945 103 278.945C133 295.945 133 295.945 195 339.945C273.497 395.652 329 377 420 377" stroke-width="90"></path><path d="M508 376.445L354.25 287.678L354.25 465.213L508 376.445Z"></path></g><title style="display:none">OpenRouter</title><defs><clipPath id="clip0_205_3"><rect width="512" height="512" fill="white"></rect></clipPath></defs></svg>

@@ -1,3 +0,0 @@
<svg viewBox="0 0 100 100" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M49.6479 0.359833C22.2415 0.359833 0 22.5703 0 49.9767C0 63.4861 5.41303 75.7546 14.1918 84.7039V50.0233C14.1918 40.5621 17.8832 31.6283 24.568 24.9434C31.2839 18.2275 40.1867 14.5671 49.6634 14.5671H49.9581L49.6479 14.5981C69.2372 14.5981 85.1196 30.4805 85.1196 50.0543C85.1196 51.7604 84.9955 53.4355 84.7628 55.0951L64.7238 34.9939C60.7221 30.9923 55.3556 28.7744 49.6789 28.7744C44.0022 28.7744 38.6512 30.9923 34.6341 34.9939C30.6015 39.0266 28.399 44.3621 28.399 50.0543C28.399 55.7465 30.617 61.082 34.6341 65.1146C38.6357 69.1162 44.0022 71.3342 49.6789 71.3342C55.3556 71.3342 60.7066 69.1162 64.7238 65.1146C68.4617 61.3767 70.6176 56.491 70.9123 51.2641L82.669 63.0673C77.4731 76.2199 64.6617 85.5415 49.6634 85.5415C41.8929 85.5415 34.479 83.0598 28.3835 78.4533V94.863C34.8357 97.934 42.0324 99.6402 49.6324 99.6402C77.0388 99.6402 99.2803 77.4297 99.2803 50.0233C99.3113 22.5858 77.0853 0.375343 49.6634 0.375343L49.6479 0.359833Z" fill="#0062E2"/>
</svg>

@@ -1 +0,0 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1734679619961" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="4292" xmlns:xlink="http://www.w3.org/1999/xlink" width="64" height="64"><path d="M955 607.2L843.6 412.1l47.8-91.6c6.6-8.7 8.3-21.5 5.6-31.8L835.7 178c-4.7-7.3-12.8-11.8-21.5-12H586.3l-51-89.2c-3.6-7-10.5-11.8-18.3-12.7H396.7c-8.9 0.2-15.2 6.7-19.6 14.4l-1.9 3.1-113.8 195.1H152.2c-9-0.2-17.5 4.3-22.3 12L67 400.1c-4 8-4 17.5 0 25.5l113.1 196.7-51 89.2c-4 8-4 17.4 0 25.5l58.1 102c4.7 7.8 13.2 12.7 22.3 12.7h227.8l54.9 95.5c4.2 7.2 11.6 11.9 19.9 12.8h129c8.9-0.2 17.1-5 21.5-12.7L775 750.5h100.4c8.9-0.8 16.9-5.8 21.5-13.5L955 634.3c5.4-8.3 5.4-18.9 0-27.1zM814.1 620L756 512.5 517 933.7l-65.3-107.5H212.8l57.4-104.3H392L153 302.2h125L396.7 90.3l59.7 104.3-61.3 107.5H873l-60.5 106.8 120.3 211H814.1z" fill="#605BEC" p-id="4293"></path><path d="M511.4 660.5l149-238.9H361.7l149.7 238.9z m0 0" fill="#605BEC" p-id="4294"></path></svg>

@@ -1,3 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 120 120">
<path fill="#8358F6" fill-rule="evenodd" d="M100.74 12h-7.506c-24.021 0-37.867 15.347-37.867 38.867V54.9a30.862 30.862 0 0 0-8.507-1.196C29.816 53.703 16 67.52 16 84.563c0 17.044 13.816 30.86 30.86 30.86 17.044 0 30.86-13.816 30.86-30.86 0-2.073-.209-4.14-.623-6.172h23.643c6.225-.023 11.26-5.076 11.26-11.301 0-6.226-5.035-11.279-11.26-11.302H77.22v-5.922c0-9.008 6.505-15.513 16.014-15.513h7.506c6.107-.093 11.01-5.069 11.01-11.177 0-6.107-4.903-11.084-11.01-11.176zM56.035 84.563a9.175 9.175 0 1 0-18.35 0 9.175 9.175 0 0 0 18.35 0z"/>
</svg>

@@ -1,12 +0,0 @@
<svg t="1710840542445" class="icon" viewBox="0 0 1090 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1764"
width="128" height="128">
<path
d="M851.79730973 887.73599356c-22.09345317 31.07812413-50.66765259 55.38092261-81.15661796 77.32708609-89.84670956 64.80746262-190.74014569 91.17231675-300.6182528 82.48222515-178.6623913-13.99252034-341.71207568-148.32071561-387.96103764-322.26983687-46.39625165-174.09641097-5.89158752-326.68852752 121.66128212-454.8305559 62.89269668-63.18727606 126.96371088-125.04894493 190.59285601-187.49977256 3.24037312-3.24037312 5.89158752-7.51177407 12.07775439-9.27925034 2.79850407 8.10093283-1.17831749 15.75999659-2.06205562 23.12448099-10.31027814 90.4358683 16.49644503 171.29790689 63.33456575 246.85751673 33.87662819 54.79176385 76.0014789 102.95549177 125.93268306 143.16557653 61.71437919 49.78391447 68.48970482 131.23511182 32.551021 187.20519319-41.24111259 64.21830387-129.32034589 82.77680453-192.06575288 39.47363633-3.24037312-2.20934532-7.95364315-3.5349525-8.83738126-9.86840908 8.54280189 0.29457937 16.79102442 2.79850407 25.03924691 0.88373812 19.73681815-4.71327001 33.28746944-16.49644503 38.44260852-36.2332632 5.15513907-19.88410785-0.88373813-36.96971163-16.64373473-49.78391447-45.51251353-37.11700133-88.0792333-77.32708608-123.13417898-124.90165524-13.2560719-17.96934191-25.18653662-36.96971163-36.96971164-57.73755762-17.67476253 19.88410785-29.75251694 41.68298165-38.88447757 64.95475232-45.80709291 116.9480121-7.2171947 240.81863954 97.06390426 313.87432469 7.65906378 5.30242877 15.75999659 10.16298847 22.68261191 16.49644504 1.62018657 1.62018657 3.09308345 3.5349525 5.00784938 4.86055968 59.94690292 40.21008477 124.60707587 68.93157389 196.33715382 80.12559017 50.96223197 7.95364315 102.07175363 8.69009158 153.4758547 3.68224218 37.26429101-3.5349525 71.73007796-14.13981003 104.13380925-32.10915193z"
fill="#3EC7F8" p-id="1765"></path>
<path
d="M514.06205563-27.96399545c65.54391107 64.36559357 130.204084 127.84744899 194.71696726 191.18201476 38.73718787 38.00073946 77.76895517 76.0014789 116.21156367 114.29679774 70.84633982 70.55176044 112.82390085 155.83248969 128.28931806 254.3692908 0.58915875 3.82953187 0.88373813 7.51177407 1.03102783 11.34130596 0 1.17831749-0.58915875 2.35663501-0.88373814 3.53495249-3.5349525 0.29457937-4.27140094-2.35663501-5.15513907-4.41869062-24.45008817-50.52036291-64.95475232-82.18764578-116.94801211-100.59885675-27.39588193-9.72111939-55.97008134-15.31812754-83.95512203-22.82990161-70.10989139-18.55850066-133.29716746-49.04746604-179.39883976-107.3741824-37.4115807-47.42727948-57.44297824-101.48259488-57.14839883-162.46052565 0.29457937-53.90802574 0-107.66876177 0.14728966-161.5767875-0.1472897-4.56598032-2.35663501-10.16298847 3.09308346-15.46541722z"
fill="#E90302" p-id="1766"></path>
<path
d="M851.79730973 887.73599356c-32.40373131 17.96934191-66.86951826 28.42690974-103.83922989 32.10915193-51.40410106 5.0078494-102.5136227 4.27140094-153.47585469-3.68224218-71.73007796-11.19401627-136.39025089-39.9155054-196.33715381-80.12559017-1.91476594-1.3256072-3.38766283-3.24037312-5.00784939-4.86055968 5.74429781-1.91476594 10.01569877 1.91476594 14.72896877 3.97682157 161.13491844 70.25718107 343.18497255-38.44260851 358.06123104-213.717337 3.97682157-46.24896197-3.68224221-90.14128893-22.82990161-132.11884996-1.17831749-2.65121438-2.35663501-5.15513907-3.38766282-7.80635344-0.29457937-0.73644845 0.1472897-1.76747625 0.14728968-2.79850407 4.12411126-3.5349525 8.24822251-0.44186907 12.07775441 0.44186905 49.78391447 10.89943689 94.55997955 32.10915193 132.70800869 66.13306983 51.55139072 46.10167228 72.4665264 103.8392299 56.70652979 171.29790689-14.72896878 64.0710142-44.7760651 121.8085718-89.55213017 171.15061723z"
fill="#1753D7" p-id="1767"></path>
</svg>

[image diff not shown: deleted SVG icon, 3.8 KiB]

@@ -1,29 +0,0 @@
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 52 52"
style="shape-rendering:geometricPrecision;text-rendering:geometricPrecision;image-rendering:optimizeQuality;fill-rule:evenodd;clip-rule:evenodd"
>
<path style="opacity:.718" fill="#0080ff" d="M46.5-.5h3c0 1.333.667 2 2 2v3c-1.676.683-2.343 2.017-2 4h-3v-4c-.427-1.238-.427-2.238 0-3v-2z"/>
<path style="opacity:.596" fill="#0084ff" d="M46.5 1.5c-.427.762-.427 1.762 0 3h-4v-3h4z"/>
<path style="opacity:.605" fill="#009dff" d="M7.5 4.5a10.173 10.173 0 0 0-3 3v-4c1.291-.237 2.291.096 3 1z"/>
<path style="opacity:.749" fill="#09f" d="M7.5 4.5v10a5.728 5.728 0 0 1-3-1v-6a10.173 10.173 0 0 1 3-3z"/>
<path style="opacity:.741" fill="#0196ff" d="M15.5 6.5a303.97 303.97 0 0 1-4 3v-3h4z"/>
<path style="opacity:.938" fill="#0091ff" d="M15.5 6.5h8c-1.965 2.978-4.632 4.978-8 6v3h-4v-6a303.97 303.97 0 0 0 4-3z"/>
<path style="opacity:.963" fill="#008cff" d="M23.5 6.5h9a147.885 147.885 0 0 0-18.5 17c-.77-1.098-1.603-1.098-2.5 0v-8h4v-3c3.368-1.022 6.035-3.022 8-6z"/>
<path style="opacity:.932" fill="#0086ff" d="M32.5 6.5h10c-5.818 2.99-10.484 6.658-14 11h-6v5a109.794 109.794 0 0 1-11 9v-8c.897-1.098 1.73-1.098 2.5 0a147.885 147.885 0 0 1 18.5-17z"/>
<path style="opacity:.853" fill="#0080ff" d="M42.5 6.5h3v8a30.943 30.943 0 0 1-2-2 17.853 17.853 0 0 0-5 5h-10c3.516-4.342 8.182-8.01 14-11z"/>
<path style="opacity:.749" fill="#007aff" d="M45.5 14.5v3h-7a17.853 17.853 0 0 1 5-5c.682.743 1.349 1.41 2 2z"/>
<path style="opacity:.721" fill="#0092ff" d="M4.5 13.5c.891.61 1.891.943 3 1v7c-.617.11-1.117.444-1.5 1-.278-.916-.778-1.582-1.5-2v-7z"/>
<path style="opacity:.618" fill="#008dff" d="M4.5 20.5c.722.418 1.222 1.084 1.5 2 .383-.556.883-.89 1.5-1v6h-3v-7z"/>
<path style="opacity:.941" fill="#0080ff" d="M22.5 22.5v8A273.02 273.02 0 0 1 7.5 43c.556.383.89.883 1 1.5h-9v-3a19.582 19.582 0 0 1 6-4.5 18.492 18.492 0 0 0 2-3.5c1.983.343 3.317-.324 4-2a109.794 109.794 0 0 0 11-9z"/>
<path style="opacity:.693" fill="#0078ff" d="M35.5 27.5v1c-3.515.845-6.181 2.845-8 6v-7h8z"/>
<path style="opacity:.716" fill="#016eff" d="M35.5 28.5v-1h16v1c-1.527-.073-2.527.594-3 2h-8c.11-.617.444-1.117 1-1.5a18.436 18.436 0 0 0-6-.5z"/>
<path style="opacity:.967" fill="#0073ff" d="M35.5 28.5a18.436 18.436 0 0 1 6 .5c-.556.383-.89.883-1 1.5-.992-.172-1.658.162-2 1-3.85 2.76-7.517 5.76-11 9v-6c1.819-3.155 4.485-5.155 8-6z"/>
<path style="opacity:.874" fill="#0069ff" d="M51.5 28.5v2h-3c.473-1.406 1.473-2.073 3-2z"/>
<path style="opacity:.921" fill="#0079ff" d="M22.5 30.5v8c-2.938 1.122-4.938 3.122-6 6h-8c-.11-.617-.444-1.117-1-1.5a273.02 273.02 0 0 0 15-12.5z"/>
<path style="opacity:.924" fill="#006dff" d="M38.5 31.5v8a73.126 73.126 0 0 0-11 9v-8c3.483-3.24 7.15-6.24 11-9z"/>
<path style="opacity:.878" fill="#0085ff" d="M-.5 33.5h8a18.492 18.492 0 0 1-2 3.5 19.582 19.582 0 0 0-6 4.5v-8z"/>
<path style="opacity:.788" fill="#0074ff" d="M22.5 38.5v6h-6c1.062-2.878 3.062-4.878 6-6z"/>
<path style="opacity:.93" fill="#0067ff" d="M38.5 39.5v9c-2.212-.547-3.879.453-5 3h-6v-3a73.126 73.126 0 0 1 11-9z"/>
<path style="opacity:.883" fill="#0062ff" d="M38.5 48.5v3h-5c1.121-2.547 2.788-3.547 5-3z"/>
</svg>

[image diff not shown: deleted SVG icon, 3.1 KiB]

@@ -1 +0,0 @@
<svg height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>VertexAI</title><path d="M11.995 20.216a1.892 1.892 0 100 3.785 1.892 1.892 0 000-3.785zm0 2.806a.927.927 0 11.927-.914.914.914 0 01-.927.914z" fill="#4285F4"></path><path clip-rule="evenodd" d="M21.687 14.144c.237.038.452.16.605.344a.978.978 0 01-.18 1.3l-8.24 6.082a1.892 1.892 0 00-1.147-1.508l8.28-6.08a.991.991 0 01.682-.138z" fill="#669DF6" fill-rule="evenodd"></path><path clip-rule="evenodd" d="M10.122 21.842l-8.217-6.066a.952.952 0 01-.206-1.287.978.978 0 011.287-.206l8.28 6.08a1.893 1.893 0 00-1.144 1.479z" fill="#AECBFA" fill-rule="evenodd"></path><path d="M4.273 4.475a.978.978 0 01-.965-.965V1.09a.978.978 0 111.943 0v2.42a.978.978 0 01-.978.965zM4.247 13.034a.978.978 0 100-1.956.978.978 0 000 1.956zM4.247 10.19a.978.978 0 100-1.956.978.978 0 000 1.956zM4.247 7.332a.978.978 0 100-1.956.978.978 0 000 1.956z" fill="#AECBFA"></path><path d="M19.718 7.307a.978.978 0 01-.965-.979v-2.42a.965.965 0 011.93 0v2.42a.964.964 0 01-.965.979zM19.743 13.047a.978.978 0 100-1.956.978.978 0 000 1.956zM19.743 10.151a.978.978 0 100-1.956.978.978 0 000 1.956zM19.743 2.068a.978.978 0 100-1.956.978.978 0 000 1.956z" fill="#4285F4"></path><path d="M11.995 15.917a.978.978 0 01-.965-.965v-2.459a.978.978 0 011.943 0v2.433a.976.976 0 01-.978.991zM11.995 18.762a.978.978 0 100-1.956.978.978 0 000 1.956zM11.995 10.64a.978.978 0 100-1.956.978.978 0 000 1.956zM11.995 7.783a.978.978 0 100-1.956.978.978 0 000 1.956z" fill="#669DF6"></path><path d="M15.856 10.177a.978.978 0 01-.965-.965v-2.42a.977.977 0 011.702-.763.979.979 0 01.241.763v2.42a.978.978 0 01-.978.965zM15.869 4.913a.978.978 0 100-1.956.978.978 0 000 1.956zM15.869 15.853a.978.978 0 100-1.956.978.978 0 000 1.956zM15.869 12.996a.978.978 0 100-1.956.978.978 0 000 1.956z" fill="#4285F4"></path><path d="M8.121 15.853a.978.978 0 100-1.956.978.978 0 000 1.956zM8.121 7.783a.978.978 0 100-1.956.978.978 0 000 1.956zM8.121 4.913a.978.978 0 100-1.957.978.978 0 000 1.957zM8.134 12.996a.978.978 0 01-.978-.94V9.611a.965.965 0 011.93 0v2.445a.966.966 0 01-.952.94z" fill="#AECBFA"></path></svg>

[image diff not shown: deleted SVG icon, 2.1 KiB]

@@ -1,7 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="none">
<path d="M10 20C15.5229 20 20 15.5229 20 10C20 4.47712 15.5229 0 10 0C4.47712 0 0 4.47712 0 10C0 15.5229 4.47712 20 10 20Z" fill="#003425"/>
<path d="M15.0705 12.0408C15.0705 11.8535 14.9961 11.6738 14.8636 11.5413C14.7311 11.4088 14.5515 11.3344 14.3641 11.3344C14.1768 11.3344 13.9971 11.4088 13.8646 11.5413C13.7321 11.6738 13.6577 11.8535 13.6577 12.0408V15.6829C13.6577 15.8702 13.7321 16.0499 13.8646 16.1824C13.9971 16.3149 14.1768 16.3893 14.3641 16.3893C14.5515 16.3893 14.7311 16.3149 14.8636 16.1824C14.9961 16.0499 15.0705 15.8702 15.0705 15.6829V12.0408Z" fill="white"/>
<path d="M14.067 4.44512C13.9963 4.38501 13.9145 4.33941 13.8262 4.31092C13.7379 4.28243 13.6448 4.27162 13.5524 4.27909C13.4599 4.28656 13.3698 4.31218 13.2872 4.35448C13.2047 4.39677 13.1312 4.45492 13.0711 4.5256L9.12344 9.16752C9.02533 9.28249 8.96687 9.426 8.95672 9.5768C8.95112 9.61248 8.94824 9.6488 8.94824 9.68608V15.6202C8.94824 15.8075 9.02267 15.9872 9.15514 16.1197C9.28762 16.2521 9.46729 16.3266 9.65464 16.3266C9.84199 16.3266 10.0217 16.2521 10.1541 16.1197C10.2866 15.9872 10.361 15.8075 10.361 15.6202V9.8928L14.1474 5.44096C14.2076 5.37029 14.2532 5.28847 14.2816 5.20018C14.3101 5.11189 14.3209 5.01885 14.3135 4.92637C14.306 4.8339 14.2804 4.7438 14.2381 4.66123C14.1958 4.57865 14.1376 4.50522 14.067 4.44512Z" fill="white"/>
<path d="M5.33941 4.12558C5.2806 4.05384 5.20823 3.99438 5.12644 3.9506C5.04465 3.90683 4.95504 3.87959 4.86272 3.87045C4.77041 3.8613 4.67719 3.87043 4.58841 3.89731C4.49962 3.92419 4.41699 3.96829 4.34525 4.0271C4.2735 4.08591 4.21405 4.15828 4.17027 4.24007C4.12649 4.32186 4.09926 4.41147 4.09011 4.50379C4.08097 4.5961 4.09009 4.68932 4.11697 4.7781C4.14385 4.86689 4.18796 4.94952 4.24677 5.02126L7.04389 8.43326C7.16266 8.57818 7.33414 8.66997 7.5206 8.68846C7.70705 8.70694 7.89321 8.6506 8.03813 8.53182C8.18304 8.41305 8.27484 8.24157 8.29332 8.05511C8.31181 7.86866 8.25546 7.6825 8.13669 7.53758L5.33925 4.12558H5.33941Z" fill="white"/>
<path d="M15.3376 10.1726C15.5666 10.1726 15.7862 10.0817 15.9481 9.91972C16.11 9.7578 16.201 9.53818 16.201 9.30918C16.201 9.08018 16.11 8.86056 15.9481 8.69864C15.7862 8.53671 15.5666 8.44574 15.3376 8.44574C15.1086 8.44574 14.8889 8.53671 14.727 8.69864C14.5651 8.86056 14.4741 9.08018 14.4741 9.30918C14.4741 9.53818 14.5651 9.7578 14.727 9.91972C14.8889 10.0817 15.1086 10.1726 15.3376 10.1726Z" fill="#00FF25"/>
</svg>

[image diff not shown: deleted SVG icon, 2.4 KiB]

@@ -4,7 +4,9 @@ import { I18N_NAMESPACES_MAP } from '../i18n/constants';
 export function useSafeTranslation() {
   const { t: originalT, ...rest } = useNextTranslation();
-  const t = (key: string | undefined, ...args: any[]): string => {
+  const t = (key: any, ...args: any[]): string => {
+    if (key === null || key === undefined) return '';
+    if (typeof key !== 'string') return String(key);
     if (!key) return '';
     const ns = key.split(':')[0];
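
For context, a minimal self-contained sketch of the hardened hook. The `next-i18next` import alias and the delegation after the namespace split are assumptions; the diff above only shows the new guard lines:

import { useTranslation as useNextTranslation } from 'next-i18next';

export function useSafeTranslation() {
  const { t: originalT, ...rest } = useNextTranslation();

  // Tolerate non-string keys instead of letting `key.split` throw.
  const t = (key: any, ...args: any[]): string => {
    // The null/undefined guard must run before the typeof guard:
    // `typeof null === 'object'`, so null would otherwise render as "null".
    if (key === null || key === undefined) return '';
    if (typeof key !== 'string') return String(key); // e.g. a number slipped in from data
    if (!key) return ''; // empty-string key
    const ns = key.split(':')[0]; // namespace prefix, e.g. "chat" in "chat:sidebar.home"
    // Assumed delegation: the original presumably validates `ns` against
    // I18N_NAMESPACES_MAP before falling through to the wrapped t().
    return (originalT as any)(key, ...args);
  };

  return { t, ...rest };
}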


@@ -98,16 +98,49 @@
"setting.copyright.tips.square": "Suggested ratio 1:1",
"setting.copyright.title": "Copyright",
"setting.copyright.upload_fail": "File upload failed",
"setting.data_dashboard.title": "Data board",
"setting.data_dashboard.title": "Home data",
"setting.fastgpt_chat_diagram": "/imgs/chat/fastgpt_chat_diagram_en.png",
"setting.favourite.add_new_app": "Add an app",
"setting.favourite.cancel_button": "Cancel",
"setting.favourite.categories_modal.delete_cancel_button": "Cancel",
"setting.favourite.categories_modal.delete_confirm": "Confirm deletion {{name}}? \nApps under this category will be moved to default",
"setting.favourite.categories_modal.delete_confirm_button": "delete",
"setting.favourite.categories_modal.delete_confirm_title": "Confirm deletion",
"setting.favourite.categories_modal.title": "Total {{num}} categories",
"setting.favourite.category.no_data": "No selected applications available",
"setting.favourite.category_all": "All Categories",
"setting.favourite.category_placeholder": "Select a category",
"setting.favourite.category_tab.all": "All",
"setting.favourite.confirm_button": "Sure",
"setting.favourite.delete_app_cancel_button": "Cancel",
"setting.favourite.delete_app_confirm": "Are you sure you want to remove this featured app?",
"setting.favourite.delete_app_confirm_button": "Sure",
"setting.favourite.delete_app_title": "Delete the app",
"setting.favourite.manage_categories_button": "Manage Category",
"setting.favourite.save_category_for_app_button": "Save",
"setting.favourite.search_placeholder": "Search for apps",
"setting.favourite.selected_list": "Selected: {{num}}",
"setting.favourite.table_column_action": "Action",
"setting.favourite.table_column_category": "Category",
"setting.favourite.table_column_intro": "Introduce",
"setting.favourite.table_column_name": "Name",
"setting.favourite.tag.no_data": "No classification yet",
"setting.favourite.title": "Favourite App",
"setting.home.available_tools": "Available tools",
"setting.home.available_tools.add": "Add",
"setting.home.cancel_button": "Cancel",
"setting.home.commercial_version": "Commercial version",
"setting.home.confirm_button": "Sure",
"setting.home.diagram": "Schematic diagram",
"setting.home.dialogue_tips": "Dialog prompt text",
"setting.home.dialogue_tips.default": "You can ask me any questions",
"setting.home.dialogue_tips_placeholder": "Please enter the prompt text of the dialog box",
"setting.home.home_tab_title": "Home Page Title",
"setting.home.home_tab_title_placeholder": "Please enter the title of the homepage",
"setting.home.no_selected_app": "No selected App",
"setting.home.quick_apps": "Quick Apps",
"setting.home.quick_apps.add": "Configure quick applications",
"setting.home.quick_apps.placeholder": "Please select an application",
"setting.home.slogan": "Slogan",
"setting.home.slogan.default": "Hello 👋, I am FastGPT! Is there anything I can help you?",
"setting.home.slogan_placeholder": "Please enter Slogan",
@@ -115,9 +148,9 @@
"setting.incorrect_plan": "The current plan does not support this feature, please upgrade to the subscription plan",
"setting.incorrect_version": "This feature is not supported in the current version",
"setting.log_details.title": "Home Log",
"setting.logs.title": "Homepage log",
"setting.save": "Save",
"setting.save_success": "Save successfully",
"sidebar.favourite_apps": "Favourite Apps",
"sidebar.home": "Home",
"sidebar.team_apps": "Team Apps",
"source_cronJob": "Scheduled execution",


@@ -893,23 +893,7 @@
"model.type.reRank": "ReRank",
"model.type.stt": "STT",
"model.type.tts": "TTS",
"model_alicloud": "Ali Cloud",
"model_baai": "BAAI",
"model_baichuan": "Baichuan",
"model_chatglm": "ChatGLM",
"model_doubao": "Doubao",
"model_ernie": "Ernie",
"model_hunyuan": "Hunyuan",
"model_intern": "Intern",
"model_moka": "Moka-AI",
"model_moonshot": "Moonshot",
"model_other": "Other",
"model_ppio": "PPIO",
"model_qwen": "Qwen",
"model_siliconflow": "Siliconflow",
"model_sparkdesk": "SprkDesk",
"model_stepfun": "StepFun",
"model_yi": "Yi",
"month": "Month",
"move.confirm": "Confirm move",
"move_success": "Moved Successfully",


@@ -41,6 +41,7 @@
"file_input_tip": "可通过【插件开始】节点的“文件链接”获取对应文件的链接",
"history_slider.home.title": "聊天",
"home.chat_app": "首页聊天",
"home.chat_id": "会话ID",
"home.no_available_tools": "暂无可用工具",
"home.select_tools": "选择工具",
"home.tools": "工具:{{num}}",
@@ -97,8 +98,36 @@
"setting.copyright.tips.square": "建议比例 1:1",
"setting.copyright.title": "版权信息",
"setting.copyright.upload_fail": "文件上传失败",
"setting.data_dashboard.title": "数据看板",
"setting.data_dashboard.title": "首页数据",
"setting.fastgpt_chat_diagram": "/imgs/chat/fastgpt_chat_diagram.png",
"setting.favourite.add_new_app": "添加应用",
"setting.favourite.all_apps": "所有应用",
"setting.favourite.cancel_button": "取消",
"setting.favourite.categories_modal.delete_cancel_button": "取消",
"setting.favourite.categories_modal.delete_confirm": "确认删除 {{name}} ?该分类下的应用将被移动至默认",
"setting.favourite.categories_modal.delete_confirm_button": "删除",
"setting.favourite.categories_modal.delete_confirm_title": "确认删除",
"setting.favourite.categories_modal.title": "共 {{num}} 个分类",
"setting.favourite.category_all": "全部分类",
"setting.favourite.category_tab.all": "全部",
"setting.favourite.category.no_data": "暂无可用精选应用",
"setting.favourite.tag.no_data": "暂无分类",
"setting.favourite.category_placeholder": "选择分类",
"setting.favourite.confirm_button": "确定",
"setting.favourite.delete_app_cancel_button": "取消",
"setting.favourite.delete_app_confirm": "确定要移除该精选应用吗?",
"setting.favourite.delete_app_confirm_button": "确定",
"setting.favourite.delete_app_title": "删除应用",
"setting.favourite.manage_categories_button": "分类管理",
"setting.favourite.save_category_for_app_button": "保存",
"setting.favourite.search_placeholder": "搜索应用",
"setting.favourite.selected_list": "已选: {{num}}",
"setting.favourite.table_column_action": "操作",
"setting.favourite.table_column_category": "分类",
"setting.favourite.table_column_intro": "介绍",
"setting.favourite.table_column_name": "名称",
"setting.favourite.title": "精选应用",
"setting.favourite.goto_add": "去配置",
"setting.home.available_tools": "可用工具",
"setting.home.available_tools.add": "添加",
"setting.home.commercial_version": "商业版",
@@ -108,18 +137,23 @@
"setting.home.dialogue_tips_placeholder": "请输入对话框提示文字",
"setting.home.home_tab_title": "首页标题",
"setting.home.home_tab_title_placeholder": "请输入首页标题",
"setting.home.quick_apps": "快捷应用",
"setting.home.quick_apps.add": "配置快捷应用",
"setting.home.quick_apps.placeholder": "请选择应用",
"setting.home.slogan": "Slogan",
"setting.home.slogan.default": "你好👋,我是 FastGPT ! 请问有什么可以帮你?",
"setting.home.slogan_placeholder": "请输入 Slogan",
"setting.home.title": "首页配置",
"setting.home.cancel_button": "取消",
"setting.home.confirm_button": "确定",
"setting.home.no_selected_app": "未选择应用",
"setting.incorrect_plan": "当前套餐不支持该功能,请升级订阅套餐",
"setting.incorrect_version": "当前版本不支持该功能",
"setting.log_details.title": "首页日志",
"setting.logs.title": "首页日志",
"setting.save": "保存",
"setting.save_success": "保存成功",
"setting.share": "分享",
"home.chat_id": "会话ID",
"sidebar.favourite_apps": "精选应用",
"sidebar.home": "首页",
"sidebar.team_apps": "团队应用",
"source_cronJob": "定时执行",


@@ -893,23 +893,7 @@
"model.type.reRank": "重排模型",
"model.type.stt": "语音识别",
"model.type.tts": "语音合成",
"model_alicloud": "阿里云",
"model_baai": "智源",
"model_baichuan": "百川智能",
"model_chatglm": "ChatGLM",
"model_doubao": "豆包",
"model_ernie": "文心一言",
"model_hunyuan": "腾讯混元",
"model_intern": "书生",
"model_moka": "Moka-AI",
"model_moonshot": "月之暗面",
"model_other": "其他",
"model_ppio": "PPIO 派欧云",
"model_qwen": "阿里千问",
"model_siliconflow": "硅基流动",
"model_sparkdesk": "讯飞星火",
"model_stepfun": "阶跃星辰",
"model_yi": "零一万物",
"month": "月",
"move.confirm": "确认移动",
"move_success": "移动成功",


@@ -97,16 +97,49 @@
"setting.copyright.tips.square": "建議比例 1:1",
"setting.copyright.title": "版權信息",
"setting.copyright.upload_fail": "文件上傳失敗",
"setting.data_dashboard.title": "數據看板",
"setting.data_dashboard.title": "首頁數據",
"setting.fastgpt_chat_diagram": "/imgs/chat/fastgpt_chat_diagram_zh-Hant.png",
"setting.favourite.add_new_app": "添加應用",
"setting.favourite.cancel_button": "取消",
"setting.favourite.categories_modal.delete_cancel_button": "取消",
"setting.favourite.categories_modal.delete_confirm": "確認刪除 {{name}} \n該分類下的應用將被移動至默認",
"setting.favourite.categories_modal.delete_confirm_button": "刪除",
"setting.favourite.categories_modal.delete_confirm_title": "確認刪除",
"setting.favourite.categories_modal.title": "共 {{num}} 個分類",
"setting.favourite.category.no_data": "暫無可用精選應用",
"setting.favourite.category_all": "全部分類",
"setting.favourite.category_placeholder": "選擇分類",
"setting.favourite.category_tab.all": "全部",
"setting.favourite.confirm_button": "確定",
"setting.favourite.delete_app_cancel_button": "取消",
"setting.favourite.delete_app_confirm": "確定要移除該精選應用嗎?",
"setting.favourite.delete_app_confirm_button": "確定",
"setting.favourite.delete_app_title": "刪除應用",
"setting.favourite.manage_categories_button": "分類管理",
"setting.favourite.save_category_for_app_button": "保存",
"setting.favourite.search_placeholder": "搜索應用",
"setting.favourite.selected_list": "已選: {{num}}",
"setting.favourite.table_column_action": "操作",
"setting.favourite.table_column_category": "分類",
"setting.favourite.table_column_intro": "介紹",
"setting.favourite.table_column_name": "名稱",
"setting.favourite.tag.no_data": "暫無分類",
"setting.favourite.title": "精選應用",
"setting.home.available_tools": "可用工具",
"setting.home.available_tools.add": "添加",
"setting.home.cancel_button": "取消",
"setting.home.commercial_version": "商業版",
"setting.home.confirm_button": "確定",
"setting.home.diagram": "示意圖",
"setting.home.dialogue_tips": "對話框提示文字",
"setting.home.dialogue_tips.default": "你可以問我任何問題",
"setting.home.dialogue_tips_placeholder": "請輸入對話框提示文字",
"setting.home.home_tab_title": "首頁標題",
"setting.home.home_tab_title_placeholder": "請輸入首頁標題",
"setting.home.no_selected_app": "未選擇應用",
"setting.home.quick_apps": "快捷應用",
"setting.home.quick_apps.add": "配置快捷應用",
"setting.home.quick_apps.placeholder": "請選擇應用",
"setting.home.slogan": "Slogan",
"setting.home.slogan.default": "你好👋,我是 FastGPT ! 請問有什麼可以幫你?",
"setting.home.slogan_placeholder": "請輸入 Slogan",
@@ -117,6 +150,7 @@
"setting.logs.title": "首頁日誌",
"setting.save": "保存",
"setting.save_success": "保存成功",
"sidebar.favourite_apps": "精選應用",
"sidebar.home": "首頁",
"sidebar.team_apps": "團隊應用",
"source_cronJob": "定時執行",


@@ -892,23 +892,7 @@
"model.type.reRank": "重排模型",
"model.type.stt": "語音辨識",
"model.type.tts": "語音合成",
"model_alicloud": "阿里雲",
"model_baai": "智源",
"model_baichuan": "百川智能",
"model_chatglm": "ChatGLM",
"model_doubao": "豆包",
"model_ernie": "文心一言",
"model_hunyuan": "騰訊混元",
"model_intern": "書生",
"model_moka": "Moka-AI",
"model_moonshot": "月之暗面",
"model_other": "其他",
"model_ppio": "PPIO 派歐雲",
"model_qwen": "阿里千問",
"model_siliconflow": "矽基流動",
"model_sparkdesk": "訊飛星火",
"model_stepfun": "階躍星辰",
"model_yi": "零一萬物",
"month": "月",
"move.confirm": "確認移動",
"move_success": "移動成功",


@@ -18,7 +18,7 @@
"@lexical/utils": "0.12.6",
"@monaco-editor/react": "^4.6.0",
"@tanstack/react-query": "^4.24.10",
"ahooks": "^3.7.11",
"ahooks": "^3.9.4",
"date-fns": "2.30.0",
"dayjs": "^1.11.7",
"recharts": "^2.15.0",

Some files were not shown because too many files have changed in this diff.