MaxKB/apps/setting/models_provider
itaa ed8f8f8a3e
fix: ollama model provider can not set 'num_ctx' etc. parameter #2442 (#2444)
langchain-openai is not compatible with Ollama-specific parameter settings such as num_ctx, so model instances need to be created with langchain-ollama instead (see the sketch below).

(cherry picked from commit 42ae7b443d)
2025-02-28 17:45:48 +08:00
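
The commit description above says the fix is to create Ollama model instances with langchain-ollama so that Ollama-native options like num_ctx are actually applied. A minimal sketch of that approach, not the MaxKB provider code itself; the endpoint URL, model name, and parameter values below are assumptions for illustration:

```python
from langchain_ollama import ChatOllama

# Minimal sketch, assuming a locally running Ollama server.
# ChatOllama from langchain-ollama accepts Ollama-native options such as
# num_ctx directly, which is why the fix switches instance creation away
# from langchain-openai for the Ollama provider.
model = ChatOllama(
    base_url="http://localhost:11434",  # assumed local Ollama endpoint
    model="llama3",                     # hypothetical model name
    num_ctx=8192,                       # Ollama context-window option passed through to the server
    temperature=0.7,
)

print(model.invoke("Hello").content)
```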
constants feat: support Tencent Cloud 2025-02-11 18:01:41 +08:00
impl fix: ollama model provider can not set 'num_ctx' etc. parameter #2442 (#2444) 2025-02-28 17:45:48 +08:00
__init__.py refactor: check model use model_params 2024-12-25 17:05:57 +08:00
base_model_provider.py feat: support siliconCloud 2025-02-05 15:26:40 +08:00
tools.py feat: i18n (#2011) 2025-01-13 11:15:51 +08:00