feat: add Qiniu provider support #8025
JackChiang233 wants to merge 2 commits into AstrBotDevs:master from
Conversation
Hey - I've found 2 issues, and left some high level feedback:
- The hard-coded fallback model list in `ProviderQiniu.get_models` (`["deepseek-v3"]`) would be more flexible if sourced from `provider_config` or settings so deployments can adjust it without code changes.
- The new Qiniu-specific tests currently live in `tests/test_openai_source.py`; consider moving them into a dedicated `test_qiniu_source.py` to keep provider-specific behavior isolated and the test suite easier to navigate.
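The first point could be addressed with a small config lookup. A minimal sketch, assuming `provider_config` is a plain dict; the `model_fallbacks` key is hypothetical and would be replaced by whatever key AstrBot's config schema actually defines:

```python
# Sketch of a config-driven fallback list. The "model_fallbacks" key is
# hypothetical -- substitute the key the provider config schema defines.
DEFAULT_FALLBACK_MODELS = ["deepseek-v3"]


def resolve_fallback_models(provider_config: dict) -> list:
    """Return the configured fallback model list, or the built-in default."""
    configured = provider_config.get("model_fallbacks")
    if isinstance(configured, list) and configured:
        return configured
    return DEFAULT_FALLBACK_MODELS
```

With this shape, a deployment can override the placeholder list purely through configuration, without touching the adapter's control flow.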
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- The hard-coded fallback model list in `ProviderQiniu.get_models` (`["deepseek-v3"]`) would be more flexible if sourced from `provider_config` or settings so deployments can adjust it without code changes.
- The new Qiniu-specific tests currently live in `tests/test_openai_source.py`; consider moving them into a dedicated `test_qiniu_source.py` to keep provider-specific behavior isolated and the test suite easier to navigate.
## Individual Comments
### Comment 1
<location path="astrbot/core/provider/sources/qiniu_source.py" line_range="12-21" />
<code_context>
+ "Qiniu Chat Completion Provider Adapter",
+)
+class ProviderQiniu(ProviderOpenAIOfficial):
+ async def get_models(self):
+ try:
+ models = await super().get_models()
+ if models:
+ return models
+ except Exception as e:
+ logger.debug(
+ "Qiniu 列举模型不可用,退回占位列表: %s",
+ e,
+ exc_info=True,
+ )
+ return ["deepseek-v3"]
</code_context>
<issue_to_address>
**issue (bug_risk):** Catching `Exception` broadly here may hide real integration issues with Qiniu.
The blanket `except Exception` will turn any upstream failure (auth issues, API/HTTP errors, client bugs) into the placeholder list, making real Qiniu problems hard to detect. Please catch only the expected error types from the Qiniu/OpenAI client, and/or log at warning/error level when the fallback is used so operational issues are visible.
</issue_to_address>
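One way to narrow the handler, sketched with stand-in error types (`ConnectionError`/`TimeoutError`) so the snippet runs without the SDK installed; the real adapter would catch the OpenAI SDK's exception types instead, and log at WARNING so the fallback is visible in normal operation:

```python
import asyncio
import logging

logger = logging.getLogger(__name__)

# Stand-ins so the sketch is self-contained; the real adapter would list
# the OpenAI SDK's error types here instead.
CLIENT_ERRORS = (ConnectionError, TimeoutError)


class ProviderBase:
    """Stand-in for ProviderOpenAIOfficial; simulates an upstream failure."""

    async def get_models(self):
        raise ConnectionError("upstream unavailable")


class ProviderQiniu(ProviderBase):
    async def get_models(self):
        try:
            models = await super().get_models()
            if models:
                return models
        except CLIENT_ERRORS as e:
            # WARNING instead of DEBUG: the fallback now shows up in default
            # log output, so real Qiniu problems stay visible to operators.
            logger.warning("Qiniu model listing failed, using fallback: %s", e)
        return ["deepseek-v3"]


print(asyncio.run(ProviderQiniu().get_models()))  # ['deepseek-v3']
```

With the narrower tuple, an unexpected exception type (say, a client bug raising `RuntimeError`) now propagates instead of being silently converted into the placeholder list.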
### Comment 2
<location path="astrbot/core/provider/sources/qiniu_source.py" line_range="23" />
<code_context>
+ e,
+ exc_info=True,
+ )
+ return ["deepseek-v3"]
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Hard-coding the fallback model ID may make this adapter brittle if Qiniu’s default model changes.
Using a hard-coded "deepseek-v3" couples this adapter to a specific model name; if Qiniu renames or deprecates it, the UI will still offer it and calls may fail at runtime. Prefer deriving the fallback from configuration (e.g., the default provider config) or at least defining it as a module-level constant so it’s easy to update without touching control flow.
Suggested implementation:
```python
# Default fallback model for Qiniu when listing models fails or returns empty.
# Keep this in sync with Qiniu's documented defaults or provider configuration.
DEFAULT_QINIU_FALLBACK_MODEL = "deepseek-v3"


class ProviderQiniu(ProviderOpenAIOfficial):
    async def get_models(self):
        try:
            models = await super().get_models()
            if models:
                return models
        except Exception as e:
            logger.debug(
                "Qiniu 列举模型不可用,退回占位列表: %s",
                e,
                exc_info=True,
            )
        return [DEFAULT_QINIU_FALLBACK_MODEL]
```
If this module already defines constants or has a configuration-driven default model, you may want to:
1. Move `DEFAULT_QINIU_FALLBACK_MODEL` to that existing configuration/constant section instead of directly above this method.
2. Optionally derive `DEFAULT_QINIU_FALLBACK_MODEL` from provider configuration (e.g., `settings.qiniu.default_model`) and have this constant simply reference that value.
</issue_to_address>
```python
async def get_models(self):
    try:
        models = await super().get_models()
        if models:
            return models
    except Exception as e:
        logger.debug(
            "Qiniu 列举模型不可用,退回占位列表: %s",
            e,
            exc_info=True,
        )
    return ["deepseek-v3"]
```
Code Review
This pull request introduces support for the Qiniu Chat Completion provider. Key changes include adding Qiniu to the default configuration, implementing the ProviderQiniu class with a fallback mechanism for model listing, and updating the dashboard with the appropriate provider mapping and icon. Feedback suggests improving consistency by using the @latest tag for the Qiniu icon CDN URL and refactoring the new unit tests to use a helper function for provider setup to reduce code duplication.
```python
@pytest.mark.asyncio
async def test_qiniu_get_models_fallback_on_failure(monkeypatch):
    provider = ProviderQiniu(
        provider_config={
            "id": "test-qiniu",
            "type": "qiniu_chat_completion",
            "model": "deepseek-v3",
            "key": ["k"],
            "api_base": "https://api.qnaigc.com/v1",
        },
        provider_settings={},
    )
    try:

        async def fail_list():
            raise RuntimeError("unavailable")

        monkeypatch.setattr(provider.client.models, "list", fail_list)
        models = await provider.get_models()
        assert models == ["deepseek-v3"]
    finally:
        await provider.terminate()


@pytest.mark.asyncio
async def test_qiniu_get_models_fallback_when_empty(monkeypatch):
    provider = ProviderQiniu(
        provider_config={
            "id": "test-qiniu-empty",
            "type": "qiniu_chat_completion",
            "model": "deepseek-v3",
            "key": ["k"],
            "api_base": "https://api.qnaigc.com/v1",
        },
        provider_settings={},
    )
    try:

        async def empty_list():
            return SimpleNamespace(data=[])

        monkeypatch.setattr(provider.client.models, "list", empty_list)
        models = await provider.get_models()
        assert models == ["deepseek-v3"]
    finally:
        await provider.terminate()
```
There is some code duplication in the setup for `test_qiniu_get_models_fallback_on_failure` and `test_qiniu_get_models_fallback_when_empty`. To improve maintainability and follow the pattern used elsewhere in this file (like `_make_provider`), you could extract the provider creation logic into a new helper function (e.g., `_make_qiniu_provider`). This aligns with the repository rule to refactor similar functionality into shared helper functions to avoid code duplication.
References
- When implementing similar functionality for different cases (e.g., direct vs. quoted attachments), refactor the logic into a shared helper function to avoid code duplication.
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Summary
Added support for Qiniu AI (七牛云) as a new model provider.
Modifications
- `astrbot/core/config/default.py` - Adds the default Qiniu provider template in the global provider config schema.
- `astrbot/core/provider/manager.py` - Registers dynamic import handling for the `qiniu_chat_completion` provider type.
- `astrbot/core/provider/sources/qiniu_source.py` - Implements the Qiniu provider adapter with model-list fallback behavior.
- `dashboard/src/composables/useProviderSources.ts` - Wires provider-source UI logic to recognize and manage Qiniu source entries.
- `dashboard/src/utils/providerUtils.js` - Provides the Qiniu icon mapping used by provider-related dashboard views.
- `tests/test_openai_source.py` - Adds regression tests covering Qiniu model listing fallback scenarios.

This is NOT a breaking change.
Testing
Screenshots or Test Results / 运行截图或测试结果
Checklist
- 😊 If there are new features added in the PR, I have discussed it with the authors through issues/emails, etc.
- 👀 My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.
- 🤓 I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in `requirements.txt` and `pyproject.toml`.
- 😮 My changes do not introduce malicious code.
Summary by Sourcery
Add Qiniu as an OpenAI-compatible chat completion provider with configuration, runtime, and dashboard support.
New Features:
Enhancements:
Tests: