MaxKB/apps/setting/models_provider
CaptainB 3c6b65baa1 fix: Remove vllm image cache
--bug=1052365 --user=刘瑞斌 [github#2353] Changing the max tokens of a vLLM vision model does not take effect https://www.tapd.cn/57709429/s/1657667
2025-02-24 16:30:13 +08:00
constants feat: support Tencent Cloud 2025-02-11 18:01:41 +08:00
impl fix: Remove vllm image cache 2025-02-24 16:30:13 +08:00
__init__.py refactor: check model use model_params 2024-12-25 17:05:57 +08:00
base_model_provider.py feat: support siliconCloud 2025-02-05 15:26:40 +08:00
tools.py feat: i18n (#2011) 2025-01-13 11:15:51 +08:00
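The layout above (constants/, impl/, base_model_provider.py, tools.py) suggests a plugin-style package in which each provider implementation is registered against a common base class and built with caller-supplied model parameters. The snippet below is a minimal, self-contained sketch of that kind of provider-registry pattern only; every name in it (ModelProvider, register_provider, get_provider, DummyVllmProvider) is hypothetical and is not taken from MaxKB's actual base_model_provider.py API.

```python
# Hypothetical registry sketch -- names are illustrative, not MaxKB's real API.
from abc import ABC, abstractmethod
from typing import Dict, Type


class ModelProvider(ABC):
    """Base class a provider package (e.g. one under impl/) would subclass."""

    provider_name: str = ""

    @abstractmethod
    def build_model(self, model_name: str, **model_params):
        """Create a model client, honouring caller-supplied parameters."""


_PROVIDERS: Dict[str, Type[ModelProvider]] = {}


def register_provider(cls: Type[ModelProvider]) -> Type[ModelProvider]:
    """Class decorator that records a provider under its provider_name."""
    _PROVIDERS[cls.provider_name] = cls
    return cls


def get_provider(name: str) -> ModelProvider:
    """Look up and instantiate a registered provider by name."""
    return _PROVIDERS[name]()


@register_provider
class DummyVllmProvider(ModelProvider):
    provider_name = "vllm"

    def build_model(self, model_name: str, **model_params):
        # A real implementation would return an API client; here the point is
        # that options such as max_tokens are passed through on every call
        # rather than cached, echoing the "Remove vllm image cache" fix above.
        return {"model": model_name, **model_params}


if __name__ == "__main__":
    provider = get_provider("vllm")
    print(provider.build_model("qwen2-vl", max_tokens=1024))
```

Under this pattern, adding a new provider means dropping a module into the implementation directory and applying the registration decorator, without touching the lookup code.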